Video from Maker Faire

Many thanks to www.geek.com, who took this video (we got some too, but Eben prefers the angle they took theirs at, which doesn’t make him look as if the microphone is a really bad moustache). It’s pretty long at nearly 19 minutes, but we hope you watch at least a bit of it. It’s a demo presentation, where Eben shows Raspberry Pi running 1080p video, and the Quake 3 demo you’ve already seen here, alongside some talk about what the foundation’s doing, what the board does, and a Q&A session. It was really satisfying to see the Live Stage tent fill up for the talk to the point where there was standing room only.

I’ve got some more video from the day that I’ll put on the site later. A huge thank you to everyone who came to New York to see us, especially our board members and A, who came all the way from Pittsburgh just to have a look at a Raspberry Pi.

Most of all, I was really, really encouraged to meet some of the kids we’re aiming the device at – some of the day’s most intelligent and thoughtful questions came from the under-13s. There are some incredible ideas floating around out there for projects schools can do with the boards; we really want to see some of the younger people who are interested in Raspberry Pi relaying the news that we’re launching soon to the relevant teachers at their schools. I didn’t get everyone’s name, but I’d like to call out Andrew Katz in particular, an Arduino hacker with some very cool ideas, who had his own stand near ours, and who wants a Raspberry Pi for his birthday. We should be launching in time for Andrew to get one – I hope a hefty number of you will be buying one too!

If you were at Maker Faire and took pictures or video, please feel free to leave them in the comments here or email them to me at liz@raspberrypi.org. I’d love to see if anyone managed to get a picture of me with my mouth shut…

(ETA: there are some more pictures from the day here on the forum – thanks, @PaulDow!)

 

58 comments

Dustin avatar

So awesome. To think how far we’ve come over the decades, and at such a low price-point. Really wish I could get my hands on one of these, even if it were a prototype/beta :)

jamesh avatar

You just have to wait until the end of November and you can have a shiny production version fresh off the line.

Frogbert avatar

Shut up and take my money!

jamesh avatar

Wait until November and we will take everything you throw in our direction (money, that is, or compliments).

MarkSmith avatar

I’m going to be very disappointed if there isn’t one under the tree on Dec 25th. :)

MmmPi avatar

With the echo & background din I miss half of everything Eben says, and I really wanted to hear a few points. Would there be anyone out there who can hear through the noise & is bored or kind enough to do a transcript? There isn’t a recording straight off his mic, is there?

jamesh avatar

I think everything he covered can be found on this website, the forum or the wiki if you feel like looking, or perhaps Liz’s video (up in the next couple of days) will have better quality sound.

Paul Dow avatar

It could have been worse. They could have been put next to the MIDI-controlled drums that ran almost continuously.
I’m surprised that people stayed in their booths near that all day. I would have packed up to protect my hearing.
http://laughingsquid.com/midi-controlled-drum-machine-robots-by-tim-laursen/
They were much louder than the video on that page makes it sound.

Dennis Fisch avatar

Hey there, thanks for putting this up, very interesting indeed. Though, unfortunately, I do have to agree with MmmPi: I find this very difficult to understand; it must have been very noisy in that tent.

What I gathered is that the expected availability is sometime around the start of next year?

jamesh avatar

Should be before the end of this year, if all goes well.

harald avatar

Want! Definitely want! I already have a home project in mind for this, and I would like to take a couple of them to a country where nearly everybody has a television but only a few can afford a computer…

harald avatar

BTW: are there any compatible USB 3G modems available for this?

jamesh avatar

Well, I haven’t been able to try one as I don’t have one to test. One of the other devs may have done, but I haven’t heard anything on the grapevine.

Dennis Fisch avatar

As it runs the Linux kernel, it should work with all Android devices acting as modems. I’m pretty sure it should also work with most 3G sticks…

I’ve never had to install or compile any drivers since everything always worked out of the box.

chris avatar

I’ve never used a 3G modem on any of my Linux machines, but this device can be made to run any USB device that does not have any binary code in the driver.

Shimshon avatar

I am very impressed with this device and project, but I really have problems with the choice of a device with an extremely powerful, but totally proprietary, black box. If the goal is to make computing, hacking and experimenting cheap and approachable, I vehemently disagree with your choice. Your device has a very large “NO TRESPASSING” sign on part of it, perhaps the most compelling part of it. It makes me question if your design decisions are really to encourage exploration or to benefit Broadcom.

I hope you don’t retort that if that’s how I feel, I should design my own device. At least admit that this device, while extremely cool from a geek perspective, doesn’t measure up to your stated goals.

I don’t know how old the members of this project are. But I remember the Apple ][ (before the ][+). It came with a whimsical and very comprehensive manual.

All the devices of the time used completely off-the-shelf devices and the hardware could easily be cloned. They were generally well-documented too, being designed by, and for, geeks.

Don’t forget the PC, years later, that came with full BIOS source.

The C64, with its more custom, but documented hardware.

This is the environment that I grew up in. Maybe I’m wrong, but when I read about your goals, this is what comes to my mind, not some monstrously powerful, but utterly undocumented, CPU or GPU. I’d much rather have a less complicated device, even an MMU-less chip like a Cortex M3 (which could therefore only run a Linux variant like uCLinux), but with documented and at least moderately capable hardware, than your device.

jamesh avatar

You may disagree with the choice, but it’s the only choice possible. Look it up – try and find a comparable chip elsewhere. You won’t. There simply isn’t another chip that gives the same level of performance at this price.

Also, members of the team have a lot of experience with this chip (Eben was on the design team, I work on it, Gert designs bits of it), so it was brought up much more quickly than something from somewhere else would have been. It also means better support, even if there is a black box.

Having a black box does not affect the stated aim of the device at all. In fact, the device measures up to the stated goal perfectly: a cheap teaching device. At no point does having a black box GPU affect that aim. You can still learn programming, and you can learn 3D graphics using the standard OpenGL libraries etc. You cannot play around with the internals of the GPU, but that is a weird esoteric world of vector cores and hardware blocks, custom CPUs and odd instruction sets, and of no use whatsoever to people learning from first principles.
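
To put that in perspective (purely a hypothetical illustration on my part, nothing Raspberry Pi specific): the maths a learner actually needs for 3D graphics – the same kind of projection OpenGL sets up for you – is ordinary arithmetic you can write and run on any CPU, with no knowledge of the GPU internals at all. A rough sketch in plain C:

    #include <math.h>
    #include <stdio.h>

    /* Build an OpenGL-style perspective projection matrix (column-major,
       roughly what gluPerspective would give you). */
    static void perspective(float m[16], float fovy_deg, float aspect,
                            float znear, float zfar)
    {
        const float pi = 3.14159265f;
        float f = 1.0f / tanf(fovy_deg * pi / 360.0f);
        for (int i = 0; i < 16; i++) m[i] = 0.0f;
        m[0]  = f / aspect;
        m[5]  = f;
        m[10] = (zfar + znear) / (znear - zfar);
        m[11] = -1.0f;
        m[14] = (2.0f * zfar * znear) / (znear - zfar);
    }

    /* Multiply a 4-component vertex by a column-major 4x4 matrix. */
    static void transform(const float m[16], const float v[4], float out[4])
    {
        for (int r = 0; r < 4; r++)
            out[r] = m[r]*v[0] + m[r+4]*v[1] + m[r+8]*v[2] + m[r+12]*v[3];
    }

    int main(void)
    {
        float proj[16], clip[4];
        float vertex[4] = { 1.0f, 1.0f, -5.0f, 1.0f }; /* a point in front of the camera */

        perspective(proj, 60.0f, 16.0f / 9.0f, 0.1f, 100.0f);
        transform(proj, vertex, clip);

        /* The perspective divide gives normalised device coordinates,
           just as the graphics pipeline would. */
        printf("NDC: %f %f %f\n", clip[0]/clip[3], clip[1]/clip[3], clip[2]/clip[3]);
        return 0;
    }

Compile that with any C compiler (linking the maths library) and you get the same sort of normalised device coordinates the GPU would produce. That is the level a learner works at; the hardware behind it is irrelevant to the lesson.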

If you just want to use the ARM, you are perfectly able to do so – it’s standard, no black box there – and you don’t have to use the GPU at all if you don’t want to. That gives you your less complicated device: a 700MHz ARM, fully documented (by ARM). Still only $25. Up to you.

I started on an Apple II, btw. I hardly ever looked at the manual. I also used PCs for years – I never even looked at the BIOS source (can you get BIOS source for modern PCs?). I think you are seeing problems where none exist.

“It makes me question if your design decisions are really to encourage exploration or to benefit Broadcom.” Don’t make assertions like that without being able to back them up. Which you won’t be able to do. I’m sure Broadcom will benefit from this project in some ways, but this did not affect the design decision. That was made on the price point and capabilities of the device.

waiter avatar

I wonder how much it would cost to develop a board based on an
FPGA (with an open-source 32-bit soft-core ARM/SPARC/MIPS plus VGA inside) + 512M of RAM + CompactFlash (for storage).
It would most probably cost more than the Raspberry Pi, but it is an alternative that should be explored. That way, it would be possible to develop a 100% open-source board.

jamesh avatar

You are absolutely more than welcome to do that. We would encourage it. I’m not sure it would be worth the time though, as I am pretty sure it would cost more than the Raspi, and I’m not sure about the Arm performance running in an FPGA either.

But you would have a completely open source board.

I guess the fact that no-one has done it before (have they?) is an indicator of complexity and cost effectiveness.

waiter avatar

My error: I said “should” instead of “could”. I mean I was talking in general; it was not a “command” to the RasPi team. I just said it because I love to share ideas. Here https://wiki.nesl.ucla.edu/doku.php?id=opensparc_on_xilinx_xupv5 there is an example of a 64-bit soft-core SPARC running on an FPGA, and some people are also using soft-core ARMs. As far as I know, there are already boards available that do what I said at the beginning, but they are adaptations of FPGA demo boards, not totally new boards that can compete (in size, cost and watts) with the RasPi. Nevertheless, it could be a potential field to be explored by whoever wishes to and can.

chris avatar

I thought Eben said they were going to release an API for the GPU, sufficient for coding under Linux.
To me it sounds very open: from what I understand it’s unbrickable and you can run your own assembler code, and there is a lot of documentation for the ARM, so basic I/O should be fairly easy.

Shimshon avatar

I do retract my assertion, and I apologize. But I still think your decision is colored by your employer. You have this absolutely awesome chip available. So you use it, and continue to make excuses. It’s esoteric, it’s not necessary, and so forth. Then why promote it? Look! It plays Quake 3 and real-time 1080p! It’s awesome. And hey, it’s also a great teaching tool!

In college, I actually made extensive use of manuals and books on the lowest levels of PC architecture.

I was also interested in a variety of areas where having documentation for hardware like this, even if arcane and obtuse, would have been beneficial or, at the least, very interesting.

“I’d much rather have a less complicated device, even an MMU-less chip like a Cortex M3 (which could therefore only run a Linux variant like uCLinux), but with documented and at least moderately capable hardware, than your device.”

It’s not that your device is not powerful, astoundingly so for the price. You keep saying that over and over. It’s that you present this as: Check out this awesome device. It’s a GREAT learning tool. Oh, but the part that makes it so much more awesome than every other device out there? Don’t bother with that. It’s too complicated and esoteric.

If it’s “first principles” you’re after, I think something like the Arduino is a more appropriate model, even if the Arduino itself is quite tame compared to your device. Easy to dive into, not so hard to bypass the default environment, but (probably) a more capable base CPU. And, most importantly, NOT A SINGLE PART OF IT IS OFF LIMITS.

I remember what I was like as a kid. If the RasPi was available then, I would certainly have bought one. I would even buy one today (despite my criticism). But I know for a fact that I, even as a 10 year old, would have been interested in learning not just about ARM assembler and general low-level techniques, but also the 3D hardware. Even if I had to be coddled with a wrapper library for a while, eventually, I would have gotten books about 3D math and programming, and learned it on my own. AS A TEN YEAR OLD. It is not beyond clever, smart, and inquisitive 10 year old kids to learn this stuff. I know I would have been very pissed to be talked down to like you are to me, and everyone else, when you insist that this part is irrelevant to learning in an organic way.

You are being presumptuous by writing off this one aspect of the board, the one aspect you tout above all others in your demos, while claiming this is an awesome device to gain experience and knowledge with.

tnelsond avatar

Seriously? Everybody who buys this device is going to want to write their own graphics card driver? I don’t think so. That’s the only reason to look at the source. You can do 3D graphics without knowing line for line what the graphics driver does; that’s why we have drivers. Sure, the GPU is a black box, but it doesn’t affect anything but some utopian ideal of a perfectly open, cheap, powerful computer. It’s like the triangle: cheap, powerful, open – pick two. Well, the R-Pi foundation managed to up the power by making it a little less open; this is a good compromise. The computer works, it’s hackable, it’s cheap.

Shimshon avatar

First of all, I didn’t say “write drivers.” I said, learn about the hardware and experiment.

Second, “that’s why we have drivers.” Seriously? Who writes the drivers?! How did they LEARN to do so? Is this a serious argument?

If you are going to say the purpose of the device is to foster learning, then “cheap” and “open” are the two correct choices. “Powerful” is a (very) distant third.

There is NO QUESTION whatsoever that this device is awesome. And the price is incredible. And I’ll even buy one. But please stop promoting it as a device to foster learning, or stop touting the closed black box as the premier “selling” point.

tnelsond avatar

Managing to build a computer that’s cheap, powerful, and open is tough. And I think they got the best bang, price, and openness for their buck.

jamesh avatar

I’m not going to answer most of your points, since most of them have been answered elsewhere, but…

I’m not talking down to anyone. I am giving you the facts. The GPU and parts of its coding are beyond most people. I really mean that. I’m not saying it’s beyond you, or the occasional 10-year-old, but beyond MOST people. I’ve worked on it for three years, and I still don’t understand a lot of it, and there are only a few people IN THE WORLD who understand certain parts of the processor. There are millions of lines of code running on that GPU. Lots of vector assembler, and lots of stuff I’m not allowed to talk about that is very very VERY complicated. Just making a camera module work correctly (a pretty small part of the work the GPU does) requires a vast amount of knowledge, not just of the register set, but of how to use it properly, where the slightest error makes everything go pear-shaped.

Learning how the 3D hardware works would be purely an intellectual exercise (which is fair enough). You cannot use the specific knowledge anywhere else but on that GPU. Is that worth teaching or learning? I don’t think so, unless you want to work for Broadcom in Cambridge. The employment market for people with this sort of 3D hardware graphics skill is pretty damn small; it’s best taught at university level anyway, and it isn’t the target demographic for this board.

As to the decision being coloured by our employer: yes, I’m sure it was. We know the device and what it can do, and how to program it and how to make it work. That doesn’t make it the wrong decision. Nobody has yet come up with an alternative that gives us the price/performance ratio you can see on this device. More to the point, the product would not exist without that knowledge, and I’ve yet to see any other SoC manufacturers or other charities or companies coming out with their own versions of the Raspi.

liz avatar

I’m not going to get into this again, but I wrote a long response to someone with similar concerns to yours under the Hardware Q&A further down the page – please go and read it.

That multimedia performance is terrifically important. It’s a fantastic way to get penetration into living rooms and get non-technical kids to want to own a device which, we hope, they will realise they can do more with than just play games or watch TV. I want to see as many kids as possible owning and using a Raspi, because that way, the small proportion who end up hacking on it will be a much larger number than it would be if it were just a box that sat on a desk and went bing.

Most of us don’t work for Broadcom, by the way. And we started this project working with an ATMEL chip which didn’t do everything we needed it to. This isn’t presumption; it’s pragmatism. Much of our decision also has to do with cost, which we need to keep as low as possible; there is actually a tension between low pricing and open hardware which would have made it impossible to do what we want with Raspberry Pi without the BCM2835.

And I don’t think that it’s at all unreasonable to suggest that if the open source hardware community is that exercised about the issue, they should damn well go and tape out their own chip. I’ve spoken to a lot of open source hardware people in person over the last few months – all of them very polite and non-confrontational, which you might like to bear in mind (flies/vinegar/honey); they had questions but were perfectly cool about our position after sitting down and discussing it, and some are looking into starting chip design projects, which we’re very excited about. A much more reasonable approach than coming here and trying to start a fight with a charitable foundation which is trying to do a good thing in a way you don’t agree with. I should be developing a thicker skin about this, but I am absolutely sick to death of being called a traitor to open source, a secret Broadcom marketer, a liar with undisclosed reasons for working on RP and so forth. It’s uncalled for, it’s thoughtless, it’s frankly impolite, and it makes getting out of bed to work on Raspberry Pi for free every day for months on end much less fun than it ought to be.

SpaceHobo avatar

Speaking of flies/honey/vinegar I think that the R-Pi community’s response could dwell more on the positives such as work with open hardware developers. The defensive responses make it seem like the project is dismissing entirely the value of full disclosure in education (especially when they appear next to community members trying to claim that secrecy won’t affect education, or that graphics drivers are some kind of special case where nobody needs to ever learn the mechanisms driving things).

Opponents of the free graphics driver argument seem to all bandy about the word “pragmatism” as if to say “our fans in the free software community are unreasonable.” We’re not trying to be your enemies: we’re trying to become your customers and patrons.

If you have been unfairly criticized for this wart in your hardware platform, it strikes me that the most gracious response is to acknowledge the problem and present only the positive steps you are taking to overcome it. Arguing price/performance against principle is going to be a fruitless argument for both sides.

SpaceHobo avatar

And indeed, having just watched this video Eben’s reaction to the driver question was not the “Stop Wanting That” that you get here, but more of a “We’re Trying”. Kudos to Eben, and may more in the R-Pi community follow your lead.

jamesh avatar

Which is what everyone else commenting here has been trying to say, if only people would listen. We give you the facts, and have said that the open-sourcing of the Linux-side drivers is still being discussed. We give you the reasons why, at the moment, they are closed. We tell you why the SoC was chosen, and defend that decision. We have acknowledged that not having OSS libraries is not perfect. And even after all that, no-one, after repeated requests, has come up with an alternative.

Wart in the hardware? The fastest cheapest lowest power GPU on the market, the SoC that makes the Raspi possible, is a wart? Nice one.

No-one has even said, as you say in your post, that the OSS people are being unreasonable. Far from it – I have said a number of times that it would be better if the libs were OSS, but, and this is where the pragmatism comes in, that’s not currently possible, so the charity has taken the stance that it is better to have the product with some closed parts than to have no product at all. That is pragmatism.

liz avatar

Have you read the many, many other instantiations of this discussion on this site? You’ll see us *all* saying “we’re trying”. We are not saying “stop wanting that”. I will admit, though, to feeling a frisson of “stop asking the same damn question and go and look at where we’ve answered it before”.

Shimshon avatar

And yes, real people did examine the BIOS source back then, as well as hardware reference manuals. Out of sheer interest and love of learning about computers. I understand that the HARDWARE design data is, like with pretty much every CPU, proprietary, but why on earth does the spec on how to use it have to be proprietary? Don’t bother answering. There’s no point. Practically every major design firm is guilty of the same.

jamesh avatar

Just one answer.

You could write a game like Quake 3 on the Raspi, right now, using the standard libraries and APIs. So why do you need the low-level spec on how to use the GPU? “Out of interest and love of it” doesn’t cut the mustard. There’s simply too much to know (unlike a BIOS, which is pretty simple) and it’s all pretty boring. You learn a load of stuff that means you can now write a library… that already exists…

The low level spec is simply something you really don’t need to know about for 99.9999% of programming tasks, which covers, 100%, the teaching ethos behind the device.

Shimshon avatar

Now that you have finally explained the complexity, I suppose there is some merit to your argument (although I still think it’s spurious). But I would still be interested in the hardware for non-3D rendering applications. I have an interest in robotics and computer vision. This device would probably be awesome for computer vision and many other uses. I would love to be able to actually use the hardware for things like that. Will I be able to? Will some sort of GPGPU API be made available?

jamesh avatar

I would like to see OpenCL implemented on the device to give userland access to the GPU power (and it has a lot). However, that is a lot of work, and I am not sure Broadcom would fund it, as I cannot see (yet) a market for it that would repay the investment. The complexity of the device means it’s unlikely that third-party developers would be able to do the job effectively – it would have to be people who know the Videocore well (and OpenCL), and there ain’t many of those about.

Marcus V. avatar

I second the proposal for OpenCL. All this general purpose computing power in the GPU should not be wasted for playing movies or 3D games only. I for myself would like to make music with it. I know there is also a DSP on board, but that would be a very proprietary solution. The near future will be many cores and from an educational point of view users should be encouraged to get into parallel programming, where OpenCL is an emerging standard, supported by all major players.
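
To show what I mean about it being teachable: here, roughly, is what a complete data-parallel “hello world” looks like in OpenCL – adding two arrays element by element on whatever device the runtime offers. Purely an illustration on my part: no OpenCL implementation exists for the Pi, so this sketch (error handling omitted) only runs on platforms that already ship an OpenCL runtime.

    #include <stdio.h>
    #include <CL/cl.h>

    /* OpenCL C kernel: each work-item adds one element. */
    static const char *src =
        "__kernel void vadd(__global const float *a,\n"
        "                   __global const float *b,\n"
        "                   __global float *c)\n"
        "{\n"
        "    size_t i = get_global_id(0);\n"
        "    c[i] = a[i] + b[i];\n"
        "}\n";

    int main(void)
    {
        enum { N = 1024 };
        float a[N], b[N], c[N];
        for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

        /* Take the first platform/device the runtime offers. */
        cl_platform_id plat; cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* Compile the kernel source at run time. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "vadd", NULL);

        /* Copy inputs to the device, run N work-items, read the result back. */
        cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
        cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
        cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

        clSetKernelArg(k, 0, sizeof da, &da);
        clSetKernelArg(k, 1, sizeof db, &db);
        clSetKernelArg(k, 2, sizeof dc, &dc);

        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

        printf("c[10] = %f (expect 30)\n", c[10]);
        return 0;
    }

The parallel part is the dozen lines of kernel source; everything else is boilerplate that looks the same on every OpenCL device, which is exactly why it is attractive for teaching.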

JamesH avatar

Well, there isn’t a DSP as such; the term seems to have been used as a catch-all for the processors on the GPU (2 vector cores + some other stuff). OpenCL would be great, but the amount of work to implement it is colossal. I mean, really huge. Multiple man-years, and there simply isn’t the market for it to be cost-effective for Broadcom to spend that much time on it (let’s say an OpenCL implementation costs about $1M to implement and debug – you need to sell a lot of chips to make that investment back). I’ve been looking at trying to give a bit of access to the vector cores in other ways, but time is very short for me at the moment.

JamesH avatar

Oh, and “All this general purpose computing power in the GPU should not be wasted for playing movies or 3D games” is the very reason why this SoC and GPU were designed – that is exactly what they are for! The Raspi is using it outside its normal use case!

Motley avatar

If you’ve got some cool computer vision idea that the BCM2835 is a great fit for, then surely you should go talk to Broadcom to get all the data on their chip. I don’t see how this ends up being something that this small charity has to provide you with support for.

Shimshon avatar

I would like to say that I never called you traitors or betrayers to the open source “cause.” I thought I stuck strictly to the merits of the issue, and I retracted my insinuation, which was in poor taste and totally uncalled for. It was wrong and I apologize.

I’m not a hardware person by any stretch. I did take electronics in high school, but I never progressed beyond basic circuits. I make my living in software, and dabble in various interests on the side. Right now those side interests include robotics and UAVs, and I think the GPU on this device would be superb for those kinds of tasks.

In any case, I do think it’s a shame that this type of hardware (in general – as I said above, everyone does it) has such huge pieces completely closed up like this.

I am very sorry about taking a little spring out of your step on the way to work. I know that can be a drag.

liz avatar

I should apologise too – I didn’t mean to imply that you were the person saying those things. (I get quite a lot of spectacularly impolite emails on the subject, and we have a few folk posting stuff here too.) Thanks for popping back in; I do hope you stick around!

PurplePig avatar

Shimshon – I understand how great it would be if 100% of the hardware was completely open. But you have to appreciate the realities of the price point.

Can I suggest you look at it from this point of view?…

Pretend RaspPi never had a GPU. Still a 700MHz ARM11 though. The performance for the price would still be stunning and everything would be completely open and I think you would be very happy because you can hack away to your heart’s content at every single piece of hardware in the system.

Now pretend that a black-box GPU is added, completely free of charge, opening up the possibility of learning about OpenGL with a sensible level of performance.

That just makes me happier. Admittedly, I’d be even happier if it was an open GPU, but there’s only so much happiness an individual can take :-)

To the Raspberry Pi team:
Best of luck to you all, this is fantastic work. I’ll definitely be in the queue for your first production run.

jamesh avatar

Thanks PurplePig, much appreciated sense there, but I have one (devil’s advocate) question.

Why would having the GPU open make you happier?

It’s a question I have difficulty answering myself, perhaps because I know how some of the GPU works (and therefore know how little point there is in knowing it!), so I would be interested in others’ points of view.

PurplePig avatar

Only happier in a very literal sense. In the same way that 1.0001 > 1. I.e. a bit happier.

Being completely open I think would excite the demo community, but would be of little use beyond that. If the demo community had a new fixed platform to work on I’m sure they’d find new and entertaining ways to exploit the hardware, like they’re still doing with 8 bit machines now.

But that’s largely irrelevant to the core of what this project is about. As a machine to introduce computing to the ‘user’ generation, very low level GPU access beyond what is provided by the likes of OpenGL is just not required at all.

Actually, the demo community will probably be quite happy that it’s a black box – it’ll give them a whole new interesting challenge of reverse-engineering it :-)

Alien/ST-CNX avatar

I think your platform will be very compelling for computer vision and robotics. However, having coded a video decoder/player on an ARM chip (iPhone), I know how little CPU headroom is left. To use your platform for computer vision or machine learning projects, people will need access to more performance. The fact that the chip can encode H.264 video suggests very impressive computational capabilities, but they are locked away in the GPU.

I also think you may be underestimating how far young people used to hack their home computers in the 1980s and early 1990s, and I think you may be overestimating the level of support you’d need to provide. I’m just going to speak from my own experience. My first computer was a ZX81. The assembler sucked. So at the age of 10 I learnt the hex encoding of the Z80 instruction set off by heart and wrote code in hex. By the age of 12, I had also figured out how to make high-resolution graphics on a ZX81 using only software (it required some very crazy code, if you’re interested). By the time I got to the Atari ST, I was writing cutting-edge demos and writing articles in magazines about how the video hardware worked, having reverse-engineered it. Unsurprisingly, my first full-time job was at Cyrix (an x86 chip maker). I doubt very much that I would have learnt as much today as a teenager hacking in my bedroom in a world of APIs and “use the drivers”. So I hope you convince whomever it is at Broadcom to provide some access to the graphics units’ capabilities, if only a document on the instruction set and how to upload code to the GPU… writing an assembler isn’t hard, unlike, say, a compiler. Although even there one might be able to leverage LLVM.

jamesh avatar

Since I am one of those people who hacked their home computers in the ’80s (BBCs in my case), I can see where you are coming from, but as a Broadcom employee who works on Videocore I think you woefully underestimate the complexity of a modern GPU. I have a printout of the instruction set here (in fact, two instruction sets). It’s about 160 pages of instructions. Not too bad. Until you realise that there is another document of about 600 pages describing the register sets, which you also need to know in order to use the GPU. And those registers cover so many domains that no one person can understand them all. And you need to know HOW to use those registers, not just WHAT they do. That’s why Broadcom have a team of 200 or more engineers working on this GPU. Yes, 200.

That’s the complexity out of the way. You still need to persuade Broadcom to release lots of proprietary information that may give competitors some ammo in trying to catch up. That’s even more difficult!

That said, I am looking into providing translucent access to the vector instruction set from the ARM (a bit like a chopped-down OpenCL) to give a bit of extra oomph for the end user. Whether that extends to using the HW blocks I don’t know yet. At the moment I’m just looking to provide a library of fast memory operations.

Alien/ST-CNX avatar

@Jamesh: Thanks for your answer. FWIW, I actually do know what you’re talking about, complexity wise. At Cyrix, our unreleased (grumble, marketing decisions, grumble, state of the art at the time, grumble) MXi GPU was also a programmable RISC derivative and had some daunting but not always perfect internal documentation. I do recognize that there is a significant difference working at a chip company and working on the outside: one can always ask someone or fire up the verilog simulator to determine what the chip really is doing when one can’t figure it out from the software side…

But ultimately that’s not really the point. I’m glad to hear you are trying to get the vector instruction set out there. I’m encouraging you to make as much as possible available.

Obviously I don’t know how this chip is architected internally, but if you were clever you might, for instance, have merged functionality from disparate domains (3D and video decoding). For instance, YCrCb conversion, DCTs, x-y interpolation, texture bilinear filtering, etc. are all convolutions, and if you had a powerful convolution engine for which one can set the coefficients, it might be nice to use it for other things too…
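
To make the “it’s all convolutions” point concrete, here is a trivial C sketch – my own illustration, nothing to do with how the VideoCore actually implements any of this (that’s exactly the part we can’t see) – showing that a YCrCb-to-R conversion and a bilinear texture fetch are both just small weighted sums, i.e. the same multiply-accumulate pattern a coefficient-programmable engine could serve:

    #include <stdio.h>

    /* One "weighted sum" step: out = sum(coeff[i] * sample[i]).
       A coefficient-programmable MAC engine would run many of these in
       parallel; here it's just plain C for illustration. */
    static float mac(const float *coeff, const float *sample, int n)
    {
        float acc = 0.0f;
        for (int i = 0; i < n; i++)
            acc += coeff[i] * sample[i];
        return acc;
    }

    int main(void)
    {
        /* 1) BT.601-style YCrCb -> R, as a weighted sum of (Y, Cb, Cr, 1).
              Coefficients are approximate, for normalised [0,1] inputs. */
        float ycbcr[4]   = { 0.5f, 0.0f, 0.25f, 1.0f };        /* Y, Cb, Cr, constant */
        float r_coeff[4] = { 1.164f, 0.0f, 1.596f, -0.871f };
        printf("R = %f\n", mac(r_coeff, ycbcr, 4));

        /* 2) Bilinear filtering: the same operation, with coefficients
              derived from the fractional sample position (fx, fy). */
        float texels[4] = { 0.1f, 0.9f, 0.3f, 0.7f };           /* the 2x2 neighbourhood */
        float fx = 0.25f, fy = 0.75f;
        float w[4] = { (1 - fx) * (1 - fy), fx * (1 - fy),
                       (1 - fx) * fy,       fx * fy };
        printf("filtered texel = %f\n", mac(w, texels, 4));

        return 0;
    }

Different coefficients, same engine – which is why exposing even a little of that capability would be interesting for computer vision, audio and the rest.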

jamesh avatar

@Alien/ST-CNX

I couldn’t possibly comment on how we do the internals, mainly because I don’t understand most of it (I’ve only been working on it for three years – just about got the hang of the camera interface and a few other bits).

And of course, Broadcom would be really annoyed, in a “collect your P45 on the way out” type of way.

ukscone avatar

A few more videos from slightly different angles and we should have enough information to make a 3D model of Eben to insert into Quake 3.

MmmPi avatar

So that when we can’t hear him over the noise we can yell “Talk louder or we shoot you!”? I jest

ukscone avatar

Just think how quickly an army of Ebens could get the Raspberry Pi finished.

xxxstarmanxxx avatar

Personally – I don’t give a damn how much of the board is open/closed tech. What I care about most is that Raspi succeeds on its ideas and principles that it brings to the consumer. In some small way the Raspi foundation are in a position to change computing in a way Sinclair did back in the 80’s. The parallels are quite uncanny – We are all staring at the modern day equivalent of the ZX80 all over again……I think 2012 is going to be the new 1982……and I can’t wait for it :)

Yld avatar

Thank You ! +10 000

waiter avatar

“We are all staring at the modern day equivalent of the ZX80 all over again……I think 2012 is going to be the new 1982”

Oh, you bad guy, making me remember my “green years” when I was a 16-year-old Z80 asm coder! Where have you gone, teenage times? Just a grey ghost in my memory :( :( :(

James avatar

Where’s my Pi? Where’s my Pi? All I think about is Raspberry Pi and I am starting to get Pi-eyed. My mind has fallen to Pi~eces and every morning I have to Pi~ck myself up off the floor.

Marcel avatar

Hello,

Nice video, but sorry, you must work on your “Ähm” comments in your videos – it doesn’t sound very good. I can imagine that you are nervous, but less “Ähm” would make you a better salesman. Don’t be angry, that’s only my opinion.

BTW, I like your product, and I am a forex fan as well; I would like to use it as a little trading machine on my wall near my router, where it trades safely with Linux day by day :-).

Nice Day
Marcel

jamesh avatar

Well, it’s not a sales video – it was an unscripted presentation to an audience that happened to be filmed. And Eben isn’t a salesman anyway…

If you really want to complain about someone’s presentation skills, try any of Elon Musk’s presentations (on the SpaceX website and elsewhere) – it doesn’t seem to affect his companies much, though…

chris avatar

It provides an OpenGL library, which is more than sufficient to learn gfx coding.
Teaching people to hack away and create a graphical 3D library is pointless and backwards; you might as well also go and make your own programming language in the process.

Comments are closed