Libraries, codecs, OSS
Here’s a post from JamesH, one of the team working on the Raspberry Pi’s software. He works for Broadcom as his day job, and knows the BCM2835 as well as anybody; you will also have seen him posting in our forums and spanking trolls in our blog comments. Thanks JamesH!
There have been quite a few questions in the forums and on the comments about what libraries will be available, what codecs, what is open source etc. This short post will try and give people some idea of what will be available at or around launch time. It won’t be comprehensive – I am sure that for some it will generate more questions than answers, but I hope it will be of help.
Firstly, libraries. Any distribution will need to supply a set of closed source libraries that give access to the GPU acceleration features. The libraries that will be available are:
- OpenGL ES 2.0
OpenGL is a 3D library, very commonly used on desktops and embedded systems. It is defined by the Khronos Group.
- OpenVG
OpenVG is a 2D vector drawing library, also commonly used on desktops and embedded systems. Again, defined by the Khronos Group.
- EGL
EGL is an interface between Khronos rendering APIs such as OpenGL ES or OpenVG and the underlying native platform window system.
- OpenMAX IL
OpenMAX supplies a set of APIs that provide abstractions for routines used during audio, video, and still image processing. OpenMAX defines three layers; this is the IL layer, which provides an interface between a media framework such as GStreamer and a set of multimedia components (such as codecs).
The first three adhere to the standard Linux library APIs, so should be a straightforward swap-in for applications that use them. OpenMAX IL does not have a standard API at this stage, so it is a custom implementation. All these libraries are supplied by Broadcom, the SoC (System on Chip) provider.
There is loads of information on Wikipedia and the Khronos website about these APIs.
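As a concrete (and purely illustrative) example of where OpenMAX IL sits: a media framework like GStreamer can hand the decode step to an OpenMAX IL component via a wrapper plugin. The element names below are assumptions — they vary by distribution and plugin version, so check `gst-inspect` on your system before relying on them:

```shell
# Hypothetical pipeline: the demux runs on the ARM, while omxh264dec
# wraps an OpenMAX IL component that drives the GPU's H264 decoder.
# Element names are illustrative and depend on which OMX plugin ships.
gst-launch filesrc location=video.mp4 ! qtdemux ! omxh264dec ! autovideosink
```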
Two licensed codecs will be provided at launch: MPEG4 and H.264. Codec licences have quite an impact on the cost of the device, which is why there are only two at this stage. There are other codecs, such as MPEG2, VC1 etc., that we have not licensed, and for the moment they will not be accelerated by the GPU.
Dom adds: As an aside, the GPU can hardware decode H264, MPEG1/2/4, VC1, AVS, MJPG at 1080p30. It can software decode (but still vector accelerated) VP6, VP7, VP8, RV, Theora, WMV9 at DVD resolutions. We are restricted by licensing in what we can support. We should be able to support VP8, MJPG and Theora, as I believe they are licence free.
Open vs Closed Source
The Open/Closed source debate can become quite heated, as those perusing the comments and forums may have noticed. As stated above, the host side libraries for the graphics acceleration are closed source and are provided by the SoC supplier. The Foundation has no control over the closed nature of these libraries. Since the vast majority of people simply use libraries such as these, it was deemed a trade-off worth making to get the high graphics performance. It’s worth noting there are no other SoC devices with a similar graphics performance that are open source. There is no GPL issue here; these are user side libraries, not linked in any way to the kernel.
There are a few drivers for the SoC which are linked into the kernel; these are GPLed and hence OSS. One of these drivers is the interface from the user space libraries to the GPU. The user side libraries use this ‘driver’ to communicate with the GPU and tell it what to do.
Here’s a handy diagram that may help visualise what’s what.
Finally, for those of you wondering why we do not sell the device as a kit, one reason can be seen in the picture below. The picture shows, on the left, the PoP memory and its BGA array. This fits on top of the chip on the right, the BCM2835 SoC (underside showing). As you might imagine, it’s very difficult to position and then manually solder these small SoC devices. Each of those dots needs its own individual dot of solder, and the two packages have to be lined up perfectly. Some people in our forums say they have tried to solder BGA components and succeeded. We know of many, many more who have failed; and there is no recovering once you have failed. Usually, robots called Pick and Place machines do all the positioning, after the PCB has had solder paste applied to all the pads, which helps the parts stick to it. The board then goes off to a reflow oven where the solder is melted, completing the electrical connection.
So there you have it, a very quick introduction to the libraries, codecs, and even a quick explanation of how the PCBs are populated!
XBMC has traditionally handled video decoding itself, offloading the decoding to hardware using the APIs provided.
Will there be APIs provided so that it can take advantage of this, and hence do all of its decoding with its own code, allowing all of the traditional codec support we have come to know and love from XBMC?
Quote: “OpenMAX supplies a set of APIs that provide abstractions for routines used during audio, video, and still image processing. OpenMAX defines three layers; this is the IL layer, which provides an interface between a media framework such as GStreamer and a set of multimedia components (such as codecs).”
“The first three adhere to the standard Linux library APIs, so should be a straightforward swap-in for applications that use them. OpenMAX IL does not have a standard API at this stage, so it is a custom implementation. All these libraries are supplied by Broadcom, the SoC (System on Chip) provider.”
“There is loads of information on Wikipedia and the Khronos website about these APIs.”
You say that MPEG4 and H.264 will be supported. In fact, H.264 is MPEG4 Part 10. So what do you mean by MPEG4? Is it Part 2?
Also you say that there are only these two at this stage. Is it going to be possible to somehow upgrade the existing device? Is it just software thing?
Yes, it’s just a software thing. Hopefully more codecs will be added at a later date. There may even be a pay-for licensed codecs option – see Liz’s post below.
I’ve quoted the two codecs because they are often regarded as separate by end users, who see filenames like fred.mp4 or barney.h264; but you are correct, H264 is MPEG4 Part 10, and the MPEG4 here is Part 2 (H263).
Daniel M. Basso
Is there any support for FFT and/or MDCT in hardware? That would be so awesome for innumerable things!
Well, yes, there is HW support for stuff like that, but it’s on the GPU and inaccessible from the Arm. Sorry!
You can do both of these as GPGPU tasks with the normal OpenGL ES libraries*.
So while “in hardware” might be inaccessible, “on GPU” is definitely possible.
*Old-Skool GPGPU. The more modern approaches to GPGPU may come later.
Is there any chance that Hi10P support will be included? I know it’s a longshot, but that’s going to be the biggest hurdle in using Raspberry Pi’s as my media centers.
Nope, no chance whatsoever.
Why are you using that profile? I’m not sure of any commercially available consumer equipment that supports it, mainly because it’s a right PITA. It may be 10 bit, but processing internally needs to be done in 16, which is a massive increase in the performance requirement of the encoder.
haha, what a great answer!
The root cause of this issue is that some naive people (mostly Anime fans) have begun using the H264 Hi10P encoding profile for their video releases on the internet, and now all the unknowing people who download them are complaining that their videos do not work with any hardware players. They have to software decode them using raw CPU power, which no embedded SoC can do, so they have to use a very fast computer as an HTPC.
The real solution is of course for the people who encode such videos to stop using the H264 Hi10P encoding profile and go back to using the standard H264 L4.1 profile :P
They should then just wait for the newer HEVC (High Efficiency Video Coding) a.k.a. H265 codec to become popular and available in future hardware players and their SoCs :)
Galaxy S2 supports 10bit actually…
Yes, there does appear to be a misconception about hi10. It really is a waste of time. People would be better off just increasing the bitrate. Any improvement on current-gen TVs is pretty much imperceptible.
Galaxy S2 supports Hi10 – are you sure?
Galaxy S2 and whatever other good spec ARM device can decode hi10p at the software level. Not at the hardware level. Considering the specifications, Raspberry Pi should be able to decode hi10p h264 at the software level, if, and only if, you know what you are doing. Since most anime fans who bought into hi10p being superior (have fun enjoying trade-offs from banding removal yielding mosquito noise) are retards; that is not the case.
Actually, software decoding a hi10p video requires some serious ARM horsepower. Even an overclocked Tegra 3 (1.6GHz) with a video player that can use all four cores still chugs when decoding a 720p hi10p video file (granted, the anime encoders like to do non-standard stuff like 16 ref frames at 720p).
I can imagine at least a newer dual-core Cortex A15 at 2.0 or 2.5GHz could decode a hi10p video in software better. Also, libavcodec currently has no 10-bit NEON assembly support (unless that’s changed very recently).
You mention that MPEG-4 is one of two supported codecs. Like Limoto, I assume that you’re pointing to MPEG-4 (Advanced) Simple Profile (SP / ASP) with that. However, I don’t get why XviD is specifically excluded then, given the fact that XviD is just an MPEG-4 ASP implementation. Also, I’ve never heard of a “V7” codec (and I do work in the field of video compression for a living). And what about MPEG-2?
No MPEG2 – the licence cost was too great. You’ll have to transcode ripped MPEG2 video, I’m afraid.
MPEG 2 is used for television here (Australia); I was planning to use a Pi to handle live TV viewing (among other media functions).
Is it a possible option for an additional paid license?
I’d second that; MPEG2 (video+audio) is used for most SD digital television broadcasts around the world in most transmission formats (eg Terrestrial, Satellite, Cable and broadcaster-centric IPTV). Many HD channels are H264 encoded, so if they aren’t encrypted they should be feasible to watch on the Pi as long as an MPEG2 TS container format is accepted.
Liz mentions below that it’s an expensive licence. For the foundation to sell additional licences would be an administrative nightmare, but it might happen.
I believe HD Channels in Europe are H.264. In the U.S. (and Australia) all broadcast HD channels are MPEG-2, all cable HD channels are also MPEG-2. So, this is not “ripped” MPEG-2 content, this is the national standard for video broadcast in significant portions of the world.
Transcoding this content is not a practical option for HD MPEG2 content. It takes a very long time to convert a 1080i MPEG2 program to H.264.
From a market perspective, there are many cheap media devices that support H.264 (AppleTV, Roku, WD, etc.); providing an option for an open platform which can support MPEG-2 would be a big differentiator.
Like others have mentioned, paying an additional license fee is a fine option.
I have an Archos tablet, and they made a similar choice. The MPEG2/AC3 licence is a separate purchase / plugin download for 15.00 euros. For a device like the Raspberry Pi, the licence is disproportionately expensive, and in practice will rarely be needed. Those looking for an easy, very fast, free transcoder can check out HandBrake (Linux/Mac/Win):
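For reference, a typical HandBrake command-line transcode from MPEG2 to H.264 looks something like this (filenames are placeholders; check `HandBrakeCLI --help` for the options on your build):

```shell
# Transcode an MPEG2 recording to H.264 at constant quality 20.
HandBrakeCLI -i recording.mpg -o recording.mp4 -e x264 -q 20
# Do this on a desktop PC: software transcoding on the Pi's ARM
# would be far too slow, which is the point being made above.
```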
Seconded on Handbrake – nice bit of kit
Forget HandBrake – http://www.transcoder.org will be a better choice for anything memory limited where the platform is ARM. Mono/.NET runs into issues on ARM chips. Worse, Mono, like Java, runs into trouble in low memory environments; the Linux kernel does not handle JIT data that well. This will require selecting your applications carefully to get the most out of it.
Those who own a CUDA-capable Nvidia graphics card can use Badaboom, a very fast encoder – up to 20x more speed using only the graphics card’s GPU cores.
+1 for being able to pay for an MPEG2 licence. I was hoping to use the RaspPi to play the over-the-air HD here in the USA. The encoding process takes too long to be reasonable for my use.
Guys, don’t forget UPnP servers with live transcoding. They might be an option. Don’t you think?
No. Having to run another power-consuming PC to transcode kind of misses the whole point. Being able to build a fully functional 7-watt DVR that is cool to the touch would be… well, a dream come true.
Does the driver offer any support for partial offloading à la XvMC? If it did, there’s a reasonable chance the CPU would have enough grunt to decode the rest (at least for SD).
I thought maybe it was a reference to On2 VP7?
That’s a typo – sorry, should read VP8. My mistake.
I’ve fixed it. Sorry I didn’t spot it either!
Hernan Rodriguez Colmeiro
Probably this was asked somewhere in the forums before, but… as VP8 is an open format (WebM), why is it not accelerated by the GPU?
Do you need a licence or something like that from Broadcom to be able to create a driver that leverages the GPU?
I don’t know why it isn’t supplied. It’s possible the current implementation isn’t production ready.
You’re exactly right: Creating a video decoder driver for the VideoCore 4 would require technical documents about that core, and this might not be available from Broadcom at all (not even under NDA).
Furthermore, it should be noted that VP8 does some things very differently compared to H.264, MPEG-2, MPEG-4 and VC-1. Depending on how much the VideoCore 4 is specifically tailored to these codecs, it might not be possible to create an efficient implementation at all.
XviD is MPEG-4 ASP / MPEG-4 part 2 (H263), so if they say that it can decode MPEG-4 ASP / MPEG-4 part 2 (H263) then it can decode XviD. Hence XviD is not excluded here, it will be hardware accelerated too.
Would it be possible to purchase extra codecs separately? Like having a firmware update. I’ve seen it on some tablets (Archos): the device is shipped with some default codecs, and if you want others you need to purchase them separately.
This is an option we’re looking at exploring. If we do decide to go down that road, it won’t be for a few months; administratively, it’s hard to manage, we’d need to see a clear benefit for the charity, and it’ll require another round of talking to the MPEG LA (and another round of form filling; the back and forth took about a month last time).
I am just an enthusiast, but should you decide to pursue the MPEG2 opportunity, I’ll be glad to help in any way possible.
I, for one, am more than happy to pay significantly more for a version with more codec support (MPEG-2 in particular), or for the codecs separately.
(Suggestion) Why not sell a specific media player version, at a much higher price point (double, or triple – with a nice case), that can subsidise the ‘educational’ versions?
You may as well buy one of the already available media devices in that case – they are already only a bit more expensive than the Raspi.
I don’t know of any existing media players in that price bracket that will easily allow me to customise the software (excluding the few closed source pieces) to my liking.
@shiftyphil The Roku 2 XS (using BCM2835) can be hacked. Roku also provides an SDK and the kernel source are available. It’s not as open as the Raspberry Pi though.
Looks like Roku hasn’t licensed MPEG-2 either, so no advantage there.
I too would pay extra $ for the right to play other media formats.
Have you seen the CuBox?
I believe it supports MPEG2. It’s a little more expensive though. I for one will continue to wait for the Pi
This is a good idea!!
I hope this can be done in a not very far future…
Will be possible to have Flash on browsers?
No. Lots of discussion about this in the forums – pile in!
You don’t need to use flash to attain similar results in the browser anymore. Unless you specifically mean using flash as a games plugin, in which case you’d need the proprietary flash developer package, which doesn’t run on linux anyway. If you mean to use this as a youtube/vimeo watching computer, you will probably want to use their HTML5 versions instead. You don’t really need flash in the browser anymore.
What about Gnash?
I bet that if the Raspberry Pi Foundation just asked Adobe to make an Adobe Flash binary package for Linux on the Raspberry Pi, they would do so for free!
It is simply in Adobe’s interest to have Adobe Flash support in web browsers on the Raspberry Pi ;)
Flash is appallingly memory-hungry (and Flash for mobile is being discontinued, so Flash for our SoC isn’t really an option). JamesH mentioned the work that’d be involved, but even discounting that, I don’t think it’ll ever be happening on Raspberry Pi; it’s much, much less common online than it was even a year ago now Apple has stopped supporting it, and there are much better, less crashy alternatives available now.
The memory issue might be fixed if you use a swap area. A HyperX USB flash drive will be fast enough for swap use.
In fact, it is recommended if you will use the Raspberry 3.14 as a desktop computer.
VP8 codec, part of the WEBM container format, and an important part of the HTML5 language? Doesn’t it lend itself very well to the RPi world, since it is an open format?
In fact it doesn’t lend itself to the GPU H/W acceleration, which is designed around H264. Absolutely nothing to do with the open or closed nature of GPUs. The Arm itself might have enough grunt to decode it at SD sizes.
HTML5 doesn’t mandate any particular codec. H264 is supported by lots of people, VP8 is supported by lots of people. We are half way there.
Liz – I understand that for a lot of codecs, the owners require all the units of a particular model to be paid for. So it might mean having a more extensive model range (Model C, D, …)
However – if you have the time later in the year to come up with an expected Model Z that has ‘all’ the codecs, it would be interesting to hear what the cost is.
That’s absolutely true, and some also require that you buy, say, 100,000 licences in a block, or nothing. So you can see that it gets quite expensive! (This is also true of MAC addresses: you either buy a million – which I really hope we’ll get to use, but it’s a very optimistic number – or none. And they’re non-transferable; one of the trustees had hundreds of thousands of MAC addresses he wasn’t using on a product and wanted to give them to the foundation, but we found that wasn’t a supported operation.)
It might be worth talking to the Nanode (http://nanode.eu/) people.
They provide a MAC per board on the back of the board.
They obviously have some method of getting them inexpensively.
Oh – we’re all set with MAC addresses now. Bought them ages ago. :)
You can buy them from the IEEE in blocks of 16.7 million or 4096.
I have no idea what you do if you want fewer than 16.7 million but more than 4096 – which is presumably where the Foundation sits, as I don’t see them on the OUI list.
Call me silly, but – buy multiple 4096 blocks?
The small blocks are intended for prototyping and are priced accordingly – 3 x 4096 blocks costs slightly more than 1 x 16.7 million block, so that route means it’s cheaper to buy a full 16.7 million than 10,000.
I am a bit surprised not to see the Foundation on the OUI list though – how did you get them in the end?
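The pricing logic described above is easy to sanity-check with a toy calculation. The block sizes below match the IEEE's 2^24 and 2^12 allocations, but the prices are invented purely for illustration (the real fees change over time):

```python
import math

SMALL_BLOCK_ADDRESSES = 4096        # a small (IAB-sized) block
LARGE_BLOCK_ADDRESSES = 16_777_216  # a full OUI block (2**24 addresses)
SMALL_BLOCK_PRICE = 650.0           # assumed price, illustration only
LARGE_BLOCK_PRICE = 1900.0          # assumed price, illustration only

def cheapest_cost(addresses_needed):
    """Cheapest way to cover the need using only one block size."""
    small_blocks = math.ceil(addresses_needed / SMALL_BLOCK_ADDRESSES)
    small_cost = small_blocks * SMALL_BLOCK_PRICE
    return min(small_cost, LARGE_BLOCK_PRICE)

# With these prices, three small blocks (1950.0) already cost more than
# one big block, so needing ~10,000 addresses favours the full OUI:
print(cheapest_cost(10_000))
```

Under any price structure where a handful of small blocks costs more than one big block, the full 16.7 million block wins well before you actually need that many addresses.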
Gosh, didn’t realise how small that SoC is…
I’m still surprised at how small the board itself is when you actually have it in your hand. You’re going to love it. :)
fully understand the physical relationship between the Broadcom device and the memory. I guess Broadcom buy the memory device from another supplier… Samsung or Micron perhaps. JamesH, Liz, two pedantic questions:
1. Do Broadcom assemble the memory onto the BRCM2835 before supplying it to you?
2. Can you say who supplies the memory?
Broadcom don’t buy it – we do, and the factory assembles the two together. (Finding a factory that could do PoP was another of our UK manufacture problems.) It’s made by Hynix, as you can see if you look at closeups of the beta boards in previous posts.
Liz, thank you. Good to see Hynix have some larger memories in the pipeline :-)
Since this is the important type of article that should be put into the Wiki for long term referencing, could the lettering in the dark red block in the graphic be inverted to white so it can be read easier?
Sorry……I’ll re-render it when I get time – it is a bit difficult to read. That’s why I write software rather than being a graphic designer.
That’s why you are forgiven :)
Agreed, that is awful, and unreadable :(
Does the H.264 codec support GPU accelerated *encoding* as well as *decoding*? That would allow me to develop a low-cost and low-power H.264 transcoding application.
Not at the moment. We’ll be making a camera add-on board later in 2012, though, and when that’s released we’ll also be releasing a firmware update to allow encode.
I’d rather like to encode a file or a stream, not a video signal. I hope the OpenMAX lib offers encoding support, like Qualcomm does for their Snapdragon chipsets, for example. If not, then either I have to wait for the camera board launch or, hopefully, the OSS GPU kernel driver offers some clues for adding encoding functionality to the OpenMAX lib.
Don’t quote me on this – I think if you can write the appropriate OpenMAX code (and it’s not easy), you may be able to do encode of a file stream. Depends if it’s enabled on the GPU.
Note that you would NOT be able to decode&encode 1080p30 in real time, the GPU doesn’t have the horsepower. SD would be possible.
I won’t quote you :)
Ohh, “Depends if it’s enabled on the GPU” sounds frightening.
Thanks a lot for terrific information!
I am extremely curious about camera add-on board and HW accelerated h264 encoding also. Is there any decided release date so far ?
I don’t understand very well this accelerated video thing, but since the codecs provided by the foundation will only accelerate MPEG4 and h.264, will we be able to use OpenMAX IL API to accelerate XviD, for example?
No, not without the required Codec on the GPU. You could decode it on the Arm though – there may be enough horsepower to do SD.
Maybe… What I’m afraid of is that the CPU can’t handle an application like XBMC and decoding XviD both at the same time.
XVid is basically an opensource implementation of an MPEG4 codec (possibly with a custom container format), so a MPEG4 decoder should be able to accelerate the decode of it, provided the decoder supports the feature set of the encoded stream.
Another option for acceleration would be using the OpenMAX DL layer. Arm have released a reference implementation of this which would allow more optimal software decoders.
(Where IL is along the lines of accelerating the decode of an entire frame, the DL layer is about accelerating the individual decode functions such as DCT transforms etc.)
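To make the IL/DL split concrete: a DL-level primitive is as small as a single transform. A deliberately naive pure-Python 8-point DCT-II — the kind of inner-loop routine an OpenMAX DL implementation exists to replace with optimised vector code — might look like this (illustrative only, not the actual DL API):

```python
import math

def dct_8(block):
    """Naive 8-point DCT-II: the sort of hot-loop primitive that the
    OpenMAX DL layer hands off to optimised/vector implementations."""
    n = len(block)
    out = []
    for k in range(n):
        s = sum(x * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for i, x in enumerate(block))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out

# A flat block of pixels has all its energy in the DC coefficient;
# the AC coefficients come out as (numerically) zero.
coeffs = dct_8([128] * 8)
```

A software decoder built on DL would call thousands of such transforms per frame, which is why accelerating just these primitives (rather than the whole frame, as IL does) can still pay off.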
Yes, xvid is mpeg4, and so is supported by mpeg4 decoder.
I think MPEG is fairly evil. How much do you have to pay them?
I would prefer to buy a RPi without paying anything to MPEG.
To be fair, the companies who develop the standard have to put in quite a lot of development time and expense; standards don’t just spring into being magically. The MPEG-LA is a patent pool, so all the companies who allow their patents to be used in the standard receive a royalty for it. This means that companies who didn’t contribute to the standard are still able to put together products that adhere to the standard.
The license fees (when last I looked) are not that bad for someone making a commercial product as it can be rolled into the cost of production.
In addition the fee only becomes required when over a certain number of devices shipped (something like 10k if I remember correctly).
It is mostly a concern for something that is intended to be really low cost (like the rPi) but high volume.
Well, that would mean no H264 or MPEG4, so not much decoding at all. In fact the H264 licence is pretty sensibly priced – it’s things like Dolby with AAC that really extract the Michael.
Yes, it’s disappointing for me, too. I couldn’t care less about video playback (mine will be headless).
It’s a pity those scumbags are going to get some of my money. I don’t want to pay them!
I thought that licenses for h.264 were free for the first 100,000 units then US$0.20 per unit, MPEG4-2 free for the first 50,000 then US$0.25. The shock was MPEG2 at a stupendous $2.50 per unit! Don’t know if they’re current prices.
I could be wildly off with the figures above but they’re what seems to be what I remember.
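Taking those remembered figures at face value (they are a commenter's recollection, not confirmed MPEG LA pricing), the asymmetry is easy to see with a quick sum:

```python
def licence_bill(units, free_units, rate_per_unit):
    """Total royalty for a production run with a royalty-free threshold.
    All thresholds and rates below are the commenter's remembered
    figures, not confirmed pricing."""
    return max(0, units - free_units) * rate_per_unit

RUN = 150_000  # hypothetical production run

h264  = licence_bill(RUN, free_units=100_000, rate_per_unit=0.20)  # 10000.0
mpeg4 = licence_bill(RUN, free_units=50_000, rate_per_unit=0.25)   # 25000.0
mpeg2 = licence_bill(RUN, free_units=0, rate_per_unit=2.50)        # 375000.0
```

On those numbers, MPEG2 would cost over thirty times as much as H.264 for the same run, which fits with it being the one licence the Foundation passed on.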
Glad to know that BT are still making money out of the MPEG licensing.
AAC’s stupidly expensive too.
Interesting… I was under the original impression that the RAM was in the same package as the BCM chip. I figured you guys were sourcing 2 different models of the chip for the 2 different memory sizes for the A/B models.
Did the increased cost of the 256MB vs 128MB RAM really make enough difference (vs the volume discount of putting the same amount of RAM on both A+B models) to justify having to source 2 different RAM sizes? The only other major changes between the A+B are the ethernet chip, 2nd USB port and the ethernet connector. I can see those adding maybe an additional $5 USD to the cost, so I’m assuming the RAM size was the major contributor to the cost difference between the A and B models.
One reason to use open source codecs. Great post, and thanks for taking the time to make the diagrams to explain it better. This only makes me feel like we have more novices out there who will either abandon ship or really learn new things when they get this device. Now on to the real question: how many more days until the shipment arrives for you guys to sell?
Cool. Thanks for the info.
Great post, shame about the lack of MPEG-2. That kind of knocks on the head any aspirations for DVB TV, at least initially. Hopefully the future holds more fruit! ;-) Am I right in thinking that you pay royalties for being able to decode MPEG/H264 *compression* using hardware rather than paying royalties for a codec as such – for example, ffmpeg is an open source software codec that can decode both MPEG-2 and H264 compression. It looks like MPEG LA reduced the prices in 2009 to $2.00 for MPEG-2 decode+encode….
IANAL (you anal, we all anal) but distribution of ffmpeg without a licence is technically illegal in some places. Also, it’s worth noting that for some codecs software and hardware licences differ in cost.
Not wanting to start an OSS war; but Eben mentioned he would like to try to persuade Broadcom to release the source of the OpenGL state machine.
Any news on that front? I’d like to look into getting GL going on alternative OSes (or even bare metal – asking a lot, I know!)
Can’t wait for the release.
No news. We’ll let you know if/when there is any.
That flowchart had the binary blob and a kernel driver wrapper. Surely that bit isn’t hermetically sealed as a binary too? I was looking at some bare metal work too – my own system, for the sake of it, to try some ideas – which I really haven’t wanted to do with x86 architecture because it’s just not suitable. I don’t mind loading the proprietary binary blob as long as I can actually use it.
“Two licensed codecs will be provided at launch: MPEG4 and H.264. Codec licences have quite an impact on the cost of the device …. ”
Hmmm :/ things are getting too technical for me :( I thought the H.264 codec was hardwired in the BCM2835 … maybe it’s flash-rom-ish hard-wired :/
There is HW support in the GPU for H264. The Foundation pays a licence fee for Broadcom, the supplier of the SoC, to supply access to that hardware support.
Thank you James for the answer.
My question was if h264 support is hardwired in BCM2835 and the answer was “yes, it’s hw”. So basically you’re going for the “default” codec of the chip which is a good thing in order to keep the price as low as possible.
The licensing part is clear: if Broadcom implements a codec algorithm on its chip it should pay patent licensing fees to the due party(ies), which in turn adds to the cost of the chip (which the RaspPi Foundation buys to bake the Pie, which will in turn add to the price of the Pie).
Yes, it is pretty much the default codec. And to be honest H264 is much better than MPEG2, both in quality and pricing!
As usual (so far) you’ve made a sensible and good decision.
Anyway, if someone wants to do anything using MPEG-2, I think they can ultimately do it in software: if it’s a file they can convert it, and if it’s streaming – bearing in mind that this is a $25 computer – accept a little lower performance (e.g. DVD resolution) compared to GPU-crunched video output. If I’m not wrong, back in the day DVD player programs did it almost all in software. And let’s not forget that this device is first and foremost an educational device (if it does other things, all the better!); educational use is the stated and main purpose, and that’s what makes this project very noble.
I still dont get it, sorry :(
Could someone explain this in layman terms?
VDPAU for example can handle a lot of formats; why can’t “you” use some similar and/or open source libraries?
I don’t get _who_ should get extra money, and _why_, for us/you to be able to decode using the GPU….
Sorry if this is a noobish F-A-Q… I’m just so damn tired of all the licensing wherever you look. In my book, you pay for the GPU, you can use the GPU.
When you buy the SoC from the supplier, you pay a licence fee for whichever codecs you want supplied. You cannot get round that; if the supplier sold the SoC without charging the licence, they would be breaking a squillion or more patents and would have the living daylights sued out of them.
VDPAU is a lot of software implementations supplied by A.N.Other for free, which has nothing to do with the GPU; you don’t pay for it, even though you are using the patented algorithms. You probably SHOULD pay for it – I’m not sure of the legalities.
The licence fee in this case is on the hardware that is providing the acceleration of the decode process. So VDPAU is open source and free because you have already paid the licence fee when you bought your nVidia graphics card. In the case of the rPi, you haven’t bought a licence; as Liz and James have said, that may change in the future.
The ‘who’ is the pool of patent holders for the various patents involved in mpeg decode/encode.
Okay, think it’s more clear now. Thanks for taking the time explaining!
Think of it this way (James/dom/eben/liz, correct me if I’m wrong)…..
The GPU is a second processor. It too runs code, just like the ARM. However, we don’t have a compiler nor the specifications to make one. Broadcom likes to keep that information to themselves. (They think it’ll be harder to copy in China if the Chinese copiers don’t have this information.)
So, to run code on the GPU, we need to feed it an executable provided by Broadcom.
Broadcom is willing to provide the Raspberry Pi Foundation with an executable with the features that are paid for. If you pay for mpeg2, you’ll get mpeg2. If you don’t, you don’t.
Besides the cost of the software that Broadcom charges for the licence for an additional codec, Broadcom will also take care of third party patent licence fees for that additional codec.
Now in reality things may be a bit more complicated. For example, I get the impression that besides some software for the GPU, some of the codecs are partly implemented in hardware on the GPU. So when you pay for such a feature, Broadcom will give you only a “small” piece of software that allows you to use the hardware….
(Implementing a codec in hardware is quite difficult. So in general it is likely that only the time-consuming parts are offloaded into the hardware, while things like header parsing are still done in software. So in reality some real software will still be involved.)
I have the impression that the foundation members don’t even know whether the binary code provided for the GPU already supports the not-paid-for features, or whether a new binary would have to be shipped to those who pay for an extra codec.
Anyway, economy of scale means that at the moment we’re “stuck” with just “h264” and “mpeg4” support.
Well, Eben knows what the binaries in the GPU are (but he’s somewhere over middle America at the moment); I think Dom does too. I certainly don’t. :)
I agree with your comment. The binary blob is there only because Broadcom doesn’t want to give access to all of the hardware features of the GPU.
I see several options of how things will go in the future:
*Broadcom opens the blob
*the community reverse-engineers the blob
*a “full-featured” blob leaks to the net and everyone starts using that
*everything remains as it is, most of the stuff is rendered in software.
I hope for one of the first two variants!
Guilherme de Sousa
My question is: this Broadcom GPU is used in other systems that have all the codecs (I don’t know of a specific one, but there is certainly at least one), so the codecs can’t be too difficult to obtain… probably not legal, but certainly not difficult. Am I wrong?
Yes, the GPU IS used in other systems that use all the codecs. You could probably get hold of the GPU binary (Remember DMCA). But it won’t work on a Pi – the binaries are compiled for specific platforms, and won’t work on a different one because of the different hardware arrangement.
@Roger Wolff: Roger, I asked James and he kindly replied considering the load of work he has to do [the full-time Broadcom job, the foundation and now the forums!], so I really don’t want to overload him with additional questions, especially because it’s my general curiosity.
My question was: “is h.264 codec support hardwired in the BCM2835?” and James said “yes, it’s hw support”.
Now Roger, reading the posts, I have this impression of how things work in mind which I write from my layman’s point of view, please confirm if they are correct or which ones are correct if any. I’ll number them so it’s easy for you to answer each one.
1) Reading the posts, as I understand it you can pay for additional codecs and have the chip support them. One way to do that would be a flash memory in the BCM2835 chip which Broadcom loads with codec software according to the customer’s order before shipping the chips. That in turn would mean the codec part can be *electronically* added, changed or deleted, so it’s not hard-wired into the microcircuits (the way you can NOT change the MMX instructions in a Pentium CPU).
2) MPEG-2 support is also hard-wired in the BCM2835 (as hard-wired as MMX), but if you don’t pay for it you don’t get the “key” to use it (whatever that “key” might be) [or you don’t get the legal right to use it; not much difference in the outcome].
3) Broadcom advertises its BCM2835 with h.264 support. Do you think that the support for this particular codec is wholly at micro-circuit level (again, like MMX) on the chip?
Oops! Let me correct the part where I wrote “I’ll number them so it’s easy for you to answer each one”… not answering each one :)) … IT SHOULD BE … “I’ll number them so it’s easy for you to say which one (or none) is the case [or similar to it]”.
VDPAU can only handle the codecs supported by the hardware and its firmware. It is possible to implement VDPAU on top of OpenMAX, but it won’t get you better codec support. (I think VDPAU support would be pretty great, since it’s widely supported, but this is a lot of work – VDPAU does more than just video decoding).
Everybody else commenting here is wrong.
VDPAU is just an API. It doesn’t decode ANYTHING.
The idea here would be that every piece of software would make use of that same API, and that the API would have a specific implementation based on the hardware or software it is using to do the actual work.
So it’s like driving a car: the API is the steering wheel, gas pedal, and brake pedal. The back end is a gas ICE, diesel ICE, electric, hybrid electric/gas ICE, hybrid electric/diesel ICE, hydrogen fuel cell, or whatever. The user doesn’t have to care what is going on in the background, because they all have the same kind of controls. That’s what an API is for.
The licensing fee is to give you the right to use the back end.
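To make the analogy concrete, here’s a toy sketch in Python. Every name here is invented for illustration; it has nothing to do with the real VDPAU headers, it just shows the shape of an API with swappable back ends:

```python
# A fixed API (the "steering wheel") with interchangeable back ends.
# All class and method names are hypothetical, purely for illustration.

class DecoderAPI:
    """The interface every application codes against."""
    def decode(self, bitstream: bytes) -> str:
        raise NotImplementedError

class SoftwareBackend(DecoderAPI):
    """Does the work on the CPU -- needs no special hardware."""
    def decode(self, bitstream: bytes) -> str:
        return f"decoded {len(bitstream)} bytes on the CPU"

class HardwareBackend(DecoderAPI):
    """Hands the work to the GPU -- the part a licence fee would cover."""
    def decode(self, bitstream: bytes) -> str:
        return f"decoded {len(bitstream)} bytes on the GPU"

def play(decoder: DecoderAPI, bitstream: bytes) -> str:
    # The application neither knows nor cares which back end it got.
    return decoder.decode(bitstream)
```

The point is that `play` is identical whichever back end is plugged in, which is exactly why the API itself decodes nothing.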
How about on the pure audio side of things? FLAC, Vorbis and Speex should be no problem. Isn’t the MP3 decoder free as well?
Audio is all done on the Arm, so whatever codecs you can get your hands on.
mp3 is never “free”. You can get it on linux without paying any money with your OS if you live in a “free” country (i.e. not the USA), or if you lie and run the software anyway without an official licence.
Will there be access to a Development Kit / Compiler for the GPU,
so that someone would be able to implement his own HW-accelerated codec and put it on the GPU?
Also, any idea if/when some sort of GPGPU/OpenCL type API will be available to use? I know it was mentioned at some point that it was hoped to support something like this in the future. Given that the GPU is so powerful, it would be a great to be able to tap into that power to augment the ARM…
OpenCL – probably not – that is a hell of a lot of work, and for very little reward. GPGPU might be possible as that just uses the shaders (I think)
Old-School GPGPU works out of the box.
Just abuse the OpenGL ES 2.0 shader compiler.
It’s quite amazing what you can do with vertex and fragment shaders and a little sideways thinking.
– OpenCL just makes it easier to write (and more portable), it doesn’t really give you much else.
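To illustrate what “abusing the shader compiler” means: you encode your data as textures, run a per-pixel function (the fragment shader) over them, and read the result back as an image. Here that model is simulated in plain Python purely for illustration – a real version would upload the arrays as textures and run a GLSL fragment shader over a full-screen quad:

```python
# Simulate the GPGPU-via-fragment-shader model in Python.
# The "shader" is invoked once per output pixel and may only read the input
# textures and return a single output value -- just like a real fragment shader.

def run_fragment_shader(shader, width, height, *textures):
    """Mimic the GPU: call the shader for every pixel of the output."""
    return [[shader(x, y, *textures) for x in range(width)]
            for y in range(height)]

def add_shader(x, y, tex_a, tex_b):
    # Elementwise addition of two "textures" -- a classic GPGPU building block.
    return tex_a[y][x] + tex_b[y][x]

a = [[1, 2], [3, 4]]
b = [[10, 20], [30, 40]]
result = run_fragment_shader(add_shader, 2, 2, a, b)
# result == [[11, 22], [33, 44]]
```

The restrictions in the simulation (read-only inputs, one output per pixel, no communication between pixels) are precisely the limits the later comments complain about.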
OpenCL definitely gives you a lot. I can only speculate that you never used it. OpenGL ES is quite limited in features that are useful for GPGPU, and even more so GLSL. Compared to OpenGL, there’s no real support for integers, no PBOs, no local memory support, etc.
While OpenGL ES *might* be interesting for some GPGPU applications, especially image processing, in general it’s simply no good for GPGPU.
So if a texture isn’t a big block of integers, what is it?
The texture storage format is only a very small part of the whole picture. Texture sampling is always floating-point, and GLSL ES does not have proper integer support at all. I.e. the “int” type does not necessarily map to a real integer type and there aren’t any bitwise operations and the like available. On the other hand, if you want to process floating-point data, that doesn’t really work either, as there is no floating-point texture storage support.
Plus what I already mentioned: Compared to OpenGL, no PBOs (so no efficient, asynchronous data transfer between GPU and host) and no multiple render targets (so you need to always stuff output into a single texture, which is inefficient and complicated). Even full OpenGL is missing a very powerful tool, workgroups with shared memory (impossible to do various things without, and in other cases everything gets inefficient since you need to do multiple redundant texture reads).
My point is, OpenCL definitely gives you a LOT of features that make GPGPU computing
a) more efficient, more powerful and much more flexible
b) less error-prone and less complex
thanks, nice succinct informative post. I honestly can’t believe that there are folk who have manually soldered BGA components. Maybe it’s something to do with turning 50 and the old eyes not being what they were, but still…
I don’t believe it either, but there are people on the forums who swear they have done! Perhaps they have very, very tiny fingers. And matching tiny soldering equipment.
I think they use masks and reflow ovens made from toasters or heat guns, rather than soldering irons. I did a short course on soldering techniques and it was possible to do large-pitch BGAs with a heatproof mask and the right heating equipment, but I have no idea how you would manually do the PoP.
PoP involves dipping the device in flux and then placing it on top, there is no paste involved. Probably easier to DIY than the BGA underneath – after all you can even see most of the balls. Usually a special fluxing unit will be used to make sure you get just the right amount like this one http://www.europlacer.com/en/dip-fluxer.html . As a technology it is obviously quite focused on high density assembly such as you might find in a mobile phone which is why you will find it harder to find the capability in this country. In our factory for instance PoP is not a technology we would even consider investing in, we would never get the custom to pay for it.
Proof that stuff like this is possible. From the early days of the psp hacking community…
got it from here: http://hackaday.com/2011/03/13/reverse-engineering-the-psp/
BGAs are certainly do-able (although you’re likely to have several failed attempts on the way) – but those chips are a *lot* bigger than the ones here, and proportionally easier to deal with. Still not easy, though; it’s still damned impressive. PoP just adds another layer of impossible to the equation.
I did some pixel-measuring on the image above this post, and came to the conclusion that the pitch of the RAM is 0.50 mm, and the pitch of the broadcom processor is 0.65mm.
The first BGA chips were all 1.0mm pitch.
I did too, but first I zoomed in, and they were 10.0mm pitch! easy! :P
OK then, as I understand it, the GPU won’t do hardware MPEG2 decoding without a licence from Broadcom – presumably they have to knobble the chip in some way to prevent abuse.
However – with the right software and possibly ££ we may be able to decode MPEG2 on the ARM. Any estimates as to the maximum resolution/bit rate that can be achieved this way? Or would performance be so poor as to be unusable?
You won’t need the ££. I think there are OSS MPEG2 decoders. Speed is the issue.
Anyone got an estimate of whether software MPEG2 decoding will work?
I’d hoped to (amongst other things) try out the Pi for media – either to play TV via a USB DVB-T tuner or to act as a MythTV client via XBMC. AFAIK both of these require MPEG2, so that probably rules them out.
I don’t think either of those will work out for you – sorry.
What you’d have to do is:
have a big noisy box upstairs that transcodes the incoming mpeg2 into H264. Then you stream that to the R-PI.
All this is “easy” if you are going to watch shows that you’ve “taped” before, and are not going to watch them live.
There are some DVB-C/T usb sticks which support hardware h.264 encoding, but dunno if there are any w/ Linux/Arm drivers
Unless I’m mistaken, there are some DVB-T2 sticks that will do that, but only for the few HD channels that exist (at least in the UK).
A couple of these are working on Linux, and I’ve been assuming anything that has drivers in the linux kernel will work on ARM?
I’ll give standard DVB-T a go (assuming I get a Pi, I have the USB stick anyway), and can always explore encoding files on the server first to a pi friendly format. However, given the lack of flash (and therefore BBC iPlayer) I’m not sure how far people will get with the Pi as a general purpose media centre.
Of course, there are a million and one other things to do with the Pi, it’s just a little ironic that it’ll be able to play HD movies smoothly – something my linux box with a dual core Athlon and 2GB RAM couldn’t do until I got a new graphics card – but I can’t find a way to watch BBC1!
What about the very basic terminal display that is used on boot? That seems to be available before the kernel is loaded, so does it need driver support, or can it be accessed directly from code running on the “bare metal”? (I mean after the GPU blob has loaded, of course.)
Source for the framebuffer driver is in the set of GPL patches. I think it’s basically just a case of setting width, height, colour depth and a pointer to a framebuffer in some magic memory location. The VideoCore actually takes care of “booting” and loading its blob from the SD card before giving control to the ARM – so I’m hoping some funky framebuffer access should be possible bare metal with pretty minimal ARM code.
One of the things I want to play with!
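For the curious, once you have the framebuffer pointer the addressing side is trivial arithmetic. A sketch in Python (the mailbox call that actually hands you the base pointer and pitch is deliberately omitted – treat this purely as illustration of the layout):

```python
# A linear framebuffer is a flat block of memory: pixel (x, y) lives at a
# fixed byte offset from the base pointer.  This shows only the addressing;
# obtaining the base pointer and pitch from the GPU is platform-specific.

def pixel_offset(x, y, pitch, bytes_per_pixel):
    """Byte offset of pixel (x, y); pitch is the length of one row in bytes.

    Pitch may be larger than width * bytes_per_pixel if rows are padded.
    """
    return y * pitch + x * bytes_per_pixel

# Example: 640x480 at 16 bits per pixel, so each row is 640 * 2 = 1280 bytes.
off = pixel_offset(10, 2, pitch=1280, bytes_per_pixel=2)
# off == 2 * 1280 + 10 * 2 == 2580
```

On bare metal you would then write your pixel value to base + offset; everything else (double buffering, colour formats) is bookkeeping on top of this.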
Good info, especially with all the (dis)information floating around.
Will we have enough information to bypass the “closed source” layers and just talk directly with the open kernel driver layer and the “binary blob”? I am interested in running non-Linux systems on here, and while writing a replacement for the Linux kernel interface to said blob seems reasonable, interfacing with the closed “Open.*” layers does not.
(Please, please, please, no condemning anyone for closed bits (as I’ve seen happen while lurking in the forums). Everyone is doing the best they can.)
So what does this mean for me if I want regular OpenGL, not OpenGL ES? Is it possible for someone to write a wrapper or something?
The short answer is “no”. Firstly, OpenGL ES 1.0 (and 1.1) is very different from OpenGL ES 2.0. OpenGL ES 1.0 implements a subset of the full OpenGL 1.0 and adds a few things. OpenGL ES 2.0 implements a subset of the full OpenGL 2.0 but *removes* much of the older functionality of OpenGL 1.0 (called the fixed pipeline). Using OpenGL ES 2.0 (which is what the Pi is offering) you *have* to use vertex shaders and fragment shaders.
There’s no reason the fixed pipeline can’t be implemented via shaders in the driver, is there? Isn’t that what OpenGL drivers for modern desktop cards do these days anyway?
No reason at all, although it would be much more than a “wrapper”. I’m pretty much a noob when it comes to the open source community, so if anyone knows of a FLOSS full OpenGL 2.0 driver that interfaces to OpenGL ES 2.0 and can be compiled for the ARM11 then I would be interested.
I know someone made a OpenGL-OpenGLES wrapper to allow Blender to run on the Nokia N900. That particular code seems to have disappeared off the face of the earth though.
The fixed pipeline is a pretty trivial pair of shaders.
– IIRC, the ‘copy the fixed pipeline’ were the first shader examples in the “learn HLSL” book I read during my MSc.
– I really do need to find the time to learn GLSL.
Yes, if you want to emulate the most common transform and lighting, combiner and blending modes then that isn’t too hard, but emulating the full range of fixed-pipeline functionality would be a lot of work.
I put pre GPU 3d hardware through contortions to make it do “software” bump mapping, detail mapping, volumetric shadows, etc. It’s a hard task to make an emulation work in all situations.
Excellent post James – thanks.
I’m learning a whole bunch here and I haven’t even got my pi yet. This bodes so well for the future. What a fantastic idea you guys are turning into reality – thank you!
And just to lower the tone a tad – JamesH, Troll Spanker! I can’t help but think of you in a whole different light now – thanks Liz
I didn’t see that….hmmm. Troll Spanker. I think that might be a first.
I used to be a Troll Spanker … until I took an arrow to the knee.
(sorry I couldn’t resist)
Give that man the comment of the day award. :)
I knew you weren’t supposed to feed them – but I much prefer your approach. Could that be built into the educational message too?
– Can it play 1080p SMPTE VC-1 movies? Can the BCM2835 accelerate that on the GPU?
– What about the DTS, AC-3 and AAC codecs? Is the ARM strong enough to decode these?
– Could you post the accessible video/audio codecs (the complete Broadcom codec list)? Good to know for the future…
– Is it possible to upgrade the licences on these boards (by sw), or is it necessary to buy another board that can handle the other codecs?
Did you read anything?
Yes, do you think all of the answers were written before? Did you read all of the questions thoroughly?
Since most of the questions you asked have already been answered, did you read anything?
Yes, as I wrote before. You said ‘most’. Maybe you gave answers for some questions, but it was not as clear as you think it was. Therefore please give us answers for:
– VC-1 (it was written that there are licences for part 2 and part 10, but I asked: can the GPU accelerate it? Is it possible to buy a licence, or is it simply not done in the hw?)
– the available (not licensed) codecs for this SoC – I am sure that was mentioned nowhere.
– is this ARM strong enough for the above audio codecs? That was not addressed; the fact that audio will be done on the ARM is not the same as the ARM being able to do it.
psz, just a quick tip: if you REALLY want to get on the wrong side of the overworked/underpaid technical mind behind the hardware marvel, just ask your questions a third time. Or try insulting his family, it’s much quicker!
Simply posting, then repeating poorly-phrased, open-ended questions (that *have* been answered in some detail if you’d care to read this thread a little more carefully) is taking way too long.
(You’re most welcome!)
PLEASE don’t say you support “MPEG-4”. This is completely misleading and very ambiguous. MPEG-4 is a collection of almost 30 standards. MPEG-4 includes two video codecs, various audio codecs, subtitle formats, container formats and a lot more.
I know this was clarified a little bit in the comments, but I think this should also be updated in the article. Also, I’m still not quite sure what’s supported. Is it really H.263 like James stated? That’s not even part of MPEG-4. Is it MPEG-4 part 2? Since that is basically XViD/DivX, but it was stated that isn’t supported by the hardware decoder.
MPEG-4 part 2 is functionally the same as H263. That’s what we support, as well as MPEG4 part 10, which is functionally the same as H264. Perhaps I do need to clarify that a bit – and also that Xvid is indeed MPEG4 part 2, so it is supported.
H.263 is quite similar to MPEG-4 part 2, but not compatible to it. However if MPEG-4 part 2 is definitely supported that’s fine, H.263 is barely used anywhere.
Quote from Wikipedia…
“MPEG-4 Part 2 is H.263 compatible in the sense that a basic H.263 bitstream is correctly decoded by an MPEG-4 Video decoder. ”
H263 is used quite a bit in video conferencing, I think – I could be wrong. It’s not in the same league as H264, which is by far the best.
Yes, but this doesn’t work the other way around, and doesn’t work with the various variants (H.263+, etc.). E.g. if you can only accelerate H.263 in hardware, you won’t be able to decode MPEG-4 part 2 video with it.
Well, since we accelerate MPEG4p2, H263 comes as a bonus. Whatever, it still encodes/decodes h263.
Great information, thanks for sharing.
So, I’m putting together my own Arch based rootfs, from where can I download the binary blob and the non-OSS libraries ? So I’m ready to go from day one.
I don’t think they are available yet, still being finalised. More importantly you need to check out the kernel on github (see front page somewhere) to get the low level kernel drivers that are specific to Raspberry.
Did that last week, thanks. Kernel ready to go (both now in qemu and for the hardware).
Please announce on the front page when the libraries are available.
The binary blob and some binaries (VLL files?) are available in some images in the web (Cf Mer for Raspberry Pi). However, I’m not sure those available will be compatible with the kernel recently released.
Maybe through OpenGL shaders it will be possible to accelerate some codec operations (e.g. colour space conversions).
I think we will see a lot of clever ideas in this field.
Anyway, (IMHO) this is a demonstration that software patents work against innovation.
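As an example of the kind of per-pixel arithmetic a shader would handle, here is a full-range BT.601 YUV-to-RGB conversion sketched in Python. The coefficients are the standard BT.601 ones; on a GPU the same three multiply-adds would run once per pixel in a fragment shader:

```python
# Full-range BT.601 YUV -> 8-bit RGB, one pixel at a time.
# On a GPU this is a natural fragment shader: pure, independent per-pixel math.

def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV pixel to clamped 8-bit RGB."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda c: max(0, min(255, round(c)))
    return clamp(r), clamp(g), clamp(b)

# With neutral chroma (u == v == 128) the output is a grey of the same level.
grey = yuv_to_rgb(128, 128, 128)
```

Because each output pixel depends only on its own inputs, this is exactly the “embarrassingly parallel” shape that GPUs eat for breakfast.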
I think the issue here is that when the prevailing paradigm is one of hella-expensive hardware, the licence costs are such a small percentage of the cost of the whole machine that they go completely unnoticed. Once you start looking in the $25/$35 area, though, you’re suddenly looking at licences which are (in the case of MPEG2 and the Model A) 10% of the cost of the device, which is just…absurd, especially given that it’s so much less complicated, competent and clever than something like the much cheaper h.264.
If we’re successful in making a path for more companies to bring out devices at this sort of price, I suspect we’ll start to see a change in licensing costs.
MPEG-2 became a standard in 1995-1996, shouldn’t most of the patents (maybe all) expire in 2013-2014?
December 31, 2015 according to the license term agreement here: http://www.mpegla.com/main/programs/M2/Pages/Agreement.aspx
Oh well, only four years to go!
Regardless, none of the MPEG-2 patents are even valid in the UK, where the R-pi foundation is based. Grrr @ MPEG-LA!
Did you forget that, at least for laws related to technology and intellectual property, we are under US jurisdiction?
It depends on the country you are in, but in the US, it is 14-Feb-2018.
Colorspace conversion (and scaling) is usually directly supported by GPU hardware (and exposed through APIs like Xv or OpenGL) and can be done via shaders, but that’s hardly anything special. It is the norm to do accelerated colorspace conversion, and has been so for many years.
The GPU is capable of some fairly miraculous feats when it comes to scaling, colourspace conversion, rotation etc. It still amazes me (as someone who grew up with BBC micros) how fast these devices are at doing really complex operations on large datasets – for example the real time image pipeline. The sheer bandwidth of the memory also beggars belief.
Right, it’s pretty cool to get this done by the GPU “for free”, almost. Anyway, my point is, basically, it’s no use trying to accelerate video by doing colorspace conversion in hardware since it’s already done this way!
I hope I’m reading this wrong, but if the libraries to interface with the media capabilities of the chip are closed source, doesn’t this mean that there can be no GPL-licensed software which takes advantage of hardware acceleration?
If that’s the case, is there some kind of fallback software rendering method for running GPL applications (e.g., the GNOME desktop) on the Raspberry Pi?
You are reading this wrong. You can call in to libraries whether closed source of not – that doesn’t affect the GPL status of the code doing the call.
It’s no different to running GPL software under Microsoft Windows.
The whole of the MS Windows API libraries are closed-source.
You can still run GPL software under MS Windows (which obviously *must* call the Windows APIs) without breaking the terms of either the GPL or your licence with Microsoft.
Don’t worry about it. This is a non-issue.
If only Broadcom can compile stuff for the GPU, can’t they just compile free codecs so they are accelerated without having to pay licenses?
I have no idea. The Raspberry Pi Foundation /= Broadcom, and the Broadcom engineers who are very generously donating their time in the evenings to the project /= the executives and lawyers who get to make that sort of decision.
Nope. It’s not the code itself that you pay the licence for, it’s the algorithms. So if Broadcom compile some OSS code, they would still need to pay the licence fee as they still sell that on to the end user, and it still uses the patented algorithms.
‘1080p30 H.264 high-profile decodes’
Which profiles exactly? Does anybody know?
High profile is a specific profile within H.264.
H264 has a number of profiles; the most commonly used are Baseline, Main and High. Generally speaking, if you can decode High profile you can decode Main and Baseline, and if you can decode Main profile you can decode Baseline.
The differences are in the features used in the stream. Main and High profile streams can use B pictures and CABAC; High profile can additionally use 8×8 intra prediction and transform.
So by specifying high-profile it pretty much means it will handle any H.264 video you are likely to come across.
In addition to profiles you have levels to worry about. These define the size and bitrate of the streams that can be decoded. As with the profile, they are indicating 1080p30, which should cover most of the streams you are likely to encounter.
Level 5.1 has a maximum frame size of 36864 macroblocks, and a maximum macroblock rate of 983040 macroblocks per second.
1080p30 is 8160 macroblocks, and a rate of 244800 macroblocks per second. Level 5.1 is more than 4 times this. Most people are unlikely to be getting anywhere near these requirements.
Most people will be happy with level 4 which will cope with 1080p30 content.
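The figures quoted above are easy to verify with a few lines of Python. A macroblock is 16×16 pixels, and 1080-line video is actually coded as 1088 lines (the height gets rounded up to a multiple of 16):

```python
# Check the macroblock arithmetic quoted in the thread.
MB = 16  # a macroblock is 16x16 pixels

def mb_stats(width, height, fps):
    """Macroblocks per frame and per second; dimensions round up to 16."""
    per_frame = ((width + MB - 1) // MB) * ((height + MB - 1) // MB)
    return per_frame, per_frame * fps

frame_mbs, mb_rate = mb_stats(1920, 1080, 30)
# frame_mbs == 120 * 68 == 8160, mb_rate == 244800 -- matching the figures above.

# Level 5.1 allows 36864 MBs per frame and 983040 MBs per second:
headroom = 983040 / mb_rate  # a little over 4x the 1080p30 rate
```

So the “more than 4 times” claim checks out: 983040 / 244800 ≈ 4.02.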
Well, as I understand it, the GPU can handle a maximum of L4.x.
Thanks a lot
Good to see that OpenGL, OpenVG, and EGL are supported. That is exactly what I need for the GPU work I am intending to do.
(Don’t really care about video codecs, but I guess lots of other folks are interested)
Note that it’s OpenGL ES that’s supported, not OpenGL.
I don’t understand why people even want a kit at all. The assembled version is already ridiculously cheap due to high volume. There’s tons of surface mount parts that would be annoying to even package for people. Why ruin a perfectly good small form factor to make it a little easier for a few people who want to solder it themselves? Also, it would take you hours of your own time assembling it. Why don’t people spend the time designing their own hardware instead?
Another issue is that it is very hard to debug an assembled board. If one of the pads on the BGA doesn’t make contact it’s nearly impossible to diagnose. A power to ground short would be very difficult to locate. They can’t use their automated test jig to sort out defective parts or errors in assembly, etc. And then the manufacturer will be prompting tons of support requests by people. It really isn’t worth the effort.
Indeed – we have to use an x-ray machine with microscopy to ensure all the pads are connected properly. And NOBODY has one of those at home.
This post has just been Slashdotted. http://hardware.slashdot.org/story/12/01/31/203229/why-the-raspberry-pi-wont-ship-in-kit-form Plenty of commenters there appear downright insulted that we don’t think they’ve got ovens, masks, and an x-ray machine at home, along with the dexterity of a TINY TINY PIXIE. (They don’t have any of those things, but they’re still insulted.) Sometimes I really hate Slashdot.
It’s a symptom of the gigantification of confectionery.
Tangentially – you might be surprised. Chap in the village asked some advice over a pint a while ago about upgrading the software he uses to control his scanning electron microscope.
Well, it is a little insulting. The message you are sending is “don’t bother trying, you won’t be able to do it”. That neither fits in with the educational spirit, not the spirit of hobbyist hackers. You are denying opportunity rather than enabling it. At the same time, I don’t really believe you are that concerned about people struggling with tiny components. The statement “there is no recovering once you have failed” is just misleading.
As you mentioned elsewhere, the real reason is that you don’t want the admin overhead. That is not unreasonable, so why not say that rather than spin about how impossible BGA assembly is for anyone but professionals?
Add in the dubious closed source claims (“we had no choice!” really?), Raspberry Pi seems to be more about teaching kids to be consumers of corporate mass produced stuff than learning about computers. Probably not your intention, but it’s how it looks. I guess it’s inevitable that these sort of projects end up as corporate vehicles, like OLPC.
Wow – who peed in your coffee?
We’ve had to talk about this kit thing not because we ever thought it realistic, but because a few months ago a journalist heard the words “parts kit” and totally misunderstood it, writing an article which got syndicated all over the web about how you’d be able to buy the thing in kit form. We are explaining here the reasons – and they really are physical as well as administrative – that we’re not doing that, because the misapprehension has persisted.
Explaining like this is not insulting. What *is* insulting is rolling up and calling us…well, insulting, misleading, opportunity-denying corporate spinners. Keep it up and you’ll be banned; we pay for bandwidth here to inform and discuss, not to be trolled.
I’m pretty sympathetic to both sides here. While I believe that ideally the system would be entirely open and hackable through the whole stack, imitating 80s home computers, where the whole system could be understood by one person who’d spent enough of their time learning, this conflicts with requirements in terms of price and quality. I can see that the rPi foundation’s pragmatism is entirely appropriate in trying to produce something *now* rather than despairing it is impossible. Still, and I know after dealing with these types of messages for so long it may become hard to distinguish, but I don’t really see this Bob character as much of a troll as such, but more as a supersensitve being (http://randombios.blogspot.com/2008/11/emma-goldman-1869-1940.html).
What kind of distribution terms are there likely to be for the closed-source stuff? Will I be able to, say, distribute turnkey SD card images for the Pi containing all these drivers without jumping through horrible licensing hoops?
I believe so.
I think that software libraries that have ‘Open’ so prominently in their name, should actually consider being open. It’s a shame to see that an open hardware project being hindered so much by the software side, while you’d expect it to be the other way around.
Open refers to the API, not the source. For example, there is a version of OpenGL on Windows. The API is open; you can use it. You just don’t have the source code.
It’s somewhat disappointing that this post didn’t explain exactly what functionality will be present once a freedom-loving user scrubs all proprietary software from the device.
There have been rumors of a trivial poke of a magic number that will grant terminal-style console access, but it would be helpful to have full instructions for how to use the RasPi exclusively with free software, even if that requires having no graphics whatsoever.
I think that particular rumour is about as reliable as the one about the world ending this year. (Can we have a pointer to where you found it?)
I have to be frank here – we haven’t explored your scenario at all, because it’s so far away from the charity’s stated aims that…well, there’s no reason for us to spend any of our limited development time on it. I’m sure that folks like you will do their own experimentation and you’ll have the answers you’re looking for when the device is unleashed on the public, but this stuff really isn’t the Foundation’s job!
It has been said that the proprietary GPU blob on the SD card serves as a BIOS, it starts the ARM core and loads the kernel, so without that you’d probably own a very nice paperweight.
Well, yes. There’s that too.
I assume the “magic poke” stuff refers to using the framebuffer – there’s a GPL driver for that, so non-accelerated graphics would be OK without closed code.
Of course people have mentioned the GPU runs a binary blob, and you can also add to that list the firmware for the USB chip, ethernet chip, etc … If you have the slightly bizarre requirement that only everything that runs on the ARM must be open then you can probably still get most functionality from the open drivers in the github GPL patches (I guess).
Sorry to disappoint. To be honest, what you want written up doesn’t really coincide with what I wanted to write. I was interested in getting across information that would be of use to the vast majority of users – information that answers common questions being asked on the forum. Your request is useful to… er… you. I doubt anyone here would be interested in writing such a set of instructions, but we are happy for you to write your own – all the information you need is in the github sources, which are released to the public.
Although the device will not boot without the GPU binary blob, so it’s not really possible to use it with solely free software.
I’m a freedom loving user BTW, it’s just a different sort of API freedom.
Broadcom wouldn’t like this, but it might be possible to reverse engineer the binary blob, giving us an open source executable for the GPU that we could put on the SD card…
It runs on the proprietary GPU architecture for which there are no publicly-available specs or compilers. It’s going to stay closed; much like the firmware that runs on whatever is inside the USB or ethernet chip. Just see it as a black box that provides graphics, just one that always grabs its “firmware” from the SD on boot. The “blob” never runs on the ARM, so I don’t really see the reason for any fuss here.
Even with the compilers and datasheets, there’s probably at least 100 man-years of work required from a very experienced development team.
Just use the blob. It’s much easier.
Thank you for the partial answer, i.e. it is a brick without the binary GPU driver. I am curious how vital the closed source user libraries, such as EGL, are. Can you remove them and still boot the Raspberry Pi? How hard is it going to be to, say, port NetBSD to this?
Awesome info. Can’t wait to try this thing out. You are really doing an outstanding job. People who don’t have much money could really have the world of hacking and making opened up to them by devices like this. I wish I had bought several of these instead of one Nook.
DAMN, I thought they would sell the first batch of Pi’s after the maintenance :(
What about X drivers?
What have codecs to do with the foundation? I thought these are purely software related. As such, we should be able to install and buy them ourselves?
You can install them yourself, and they would run on the Arm chip not the GPU. Codecs that run on the GPU are not available to buy. Maybe one day.
The GPU is not just a “fast CPU”. It is a GPU. It requires special instructions to function properly/quickly.
Good post, thanks James H.
1. I assume that H.264 ENcoding is hardware/GPU accelerated, am I correct?
2. Can I add my voice to those asking for MPEG-2 support in hardware/GPU? MPEG-2 is ubiquitous. Almost every easily available source of video is MPEG-2. In fact, it would be a reasonable educational project to make a DVB-T receiver in the UK and many other countries. You have free-to-air MPEG-2 DVB-T broadcasts, you have free DVB standards (thank you EU) and you have very cheap USB DVB-T tuners.
3. Perhaps you could license MPEG-2 DEcoding only – MPEG-LA charge for encoders and decoders separately don’t they?
1. Correct. 1080p30 is pretty difficult without HW acceleration!
2. There is already HW support for MPEG2. But the license was too expensive so it’s not included. See other discussions re: a separate codec pack.
3. No idea.
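For a sense of scale behind that first answer, a quick back-of-envelope calculation (just arithmetic, not figures from the post) shows the raw throughput a software 1080p30 decoder would have to sustain:

```python
# Back-of-envelope arithmetic behind "1080p30 is pretty difficult
# without HW acceleration": the raw pixel rate a software decoder on
# the ARM would have to sustain.
width, height, fps = 1920, 1080, 30
pixels_per_second = width * height * fps
print(pixels_per_second)  # 62208000 pixels every second

# At ~1.5 bytes per pixel for 4:2:0 YUV output, the output writes alone:
bytes_per_second = int(pixels_per_second * 1.5)
print(bytes_per_second // 10**6, "MB/s")  # 93 MB/s, before any decode work
```

And that is only writing the decoded frames out; the entropy decoding, motion compensation and deblocking work comes on top.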
What about a DIY kit that is part assembled? aka all the really difficult parts done already…
Nope – sorry! It’d create an extra step in manufacturing, raising the price, and it’d also increase our already pretty intolerable administrative load.
A DIY kit was never, ever on the cards. A few months ago a journalist misunderstood what “parts kit” meant, and all the requests for a kit have come from that one interview (which was syndicated quite a lot). If you want a DIY kit, by all means desolder your RP and solder the bits back on; but we’re not going to sell you parts separately.
Nice to have clear, concise descriptions.
As usual, though, it generates more questions.
But to ask the right question, one needs to know most of the answer.
So I’m going out now to do some research. I may be some time.
James, Dom, Liz, thank you all so much for this excellent info. Really appreciated it ALL. (Unlike some people *cough*psz*cough* :))
We need a library/module or API for CEC support for the Raspberry Pi & XBMC.
CEC (Consumer Electronics Control) for HDMI will enable XBMC running on the Raspberry Pi to control your television and receiver (on/off/volume), as well as letting you control XBMC via your television’s remote control without a dedicated media center remote.
It would be great to have integrated open source support for the Raspberry Pi in libCEC.
libCEC is the open source library by Pulse-Eight that XBMC uses for CEC
libCEC also needs the following dependencies in order to work correctly:
* udev v151 or later
* cdc-acm support compiled into the kernel or available as module
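Those two dependencies can be checked from a script. This is a hedged sketch for a typical Linux distribution; the exact commands (udevadm, modinfo) and their output formats are assumptions that vary between systems:

```python
# Hedged sketch: checking libCEC's two stated dependencies on a Linux
# box. The commands used (udevadm --version, modinfo) and their output
# formats are assumptions and vary between distributions.
import shutil
import subprocess

def udev_new_enough(version_output, minimum=151):
    """udevadm --version is assumed to print a bare number, e.g. '151'."""
    return int(version_output.strip()) >= minimum

if __name__ == "__main__":
    if shutil.which("udevadm"):
        out = subprocess.run(["udevadm", "--version"],
                             capture_output=True, text=True).stdout
        try:
            print("udev >= 151:", udev_new_enough(out))
        except ValueError:
            print("could not parse udevadm version:", out.strip())
    # cdc-acm may be built into the kernel; modinfo covers the module case.
    if shutil.which("modinfo"):
        ok = subprocess.run(["modinfo", "cdc-acm"],
                            capture_output=True).returncode == 0
        print("cdc-acm module available:", ok)
```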
Does libCEC support any hardware other than Pulse-Eight’s USB dongle?
Not yet, but they intend to support the Pi as soon as they can.
“In the spirit of rPi we will update libCEC to talk directly with the built in CEC in the chip, we’ve already sent a request to rpi asking them for more information”
I wonder if the Raspberry Pi Foundation have replied to Pulse-Eight with this information yet?
They should post the requested CEC code and API to the public as soon as possible in any case, so everyone can access it; this should have very high priority, I think!
You may think it’s high priority, but compared with lots of other stuff that needs doing – not so much.
Even when Pulse-Eight said that they will do the work on libCEC themselves if you just provide the technical information from Broadcom? :P
Doesn’t sound like too much work on your part :(
You seem ill informed about the amount of work required to release technical information like that. The Raspberry Pi Foundation does not have that documentation – it is owned by Broadcom, the supplier of the SoC. So the Foundation needs to go to Broadcom to get it. If the current technical document is for internal use only (which it is), it needs to be adapted for outside use. It may even need to go past lawyers. It certainly needs to be proof read to ensure that it’s correct and doesn’t expose information that would normally require an NDA. There is also the possibility that accessing the CEC stuff requires code changes to part of the GPU, which would need to be done at Broadcom. Already we are talking thousands of dollars of work.
Once all that is done, then maybe Pulse8 would have enough information to adapt libCEC. Or maybe not.
On the other hand, there are probably already Broadcom people thinking about CEC support for Linux, so it may already be underway.
I really have to agree with this post. After all, it was the stated goal from the start of this project that the intended purpose was to allow children and young adults a way to control their TV with a remote control. Wait, that wasn’t the purpose?
Thanks for the idea. I will keep the advice, but some answers were posted by Dom after my questions…
What about FLAC audio decoding on XBMC? The majority of the movies I have in H.264 also have 6-channel FLAC audio ONLY. Will it have enough compute power to decode in real time? (Well, FLAC isn’t very CPU intensive anyway.)
Do the closed source libraries have any dependencies (like glibc)?
Mourad De Clerck
Great info, thanks.
Just a few details I’m still wondering about:
– Is that kernel driver a Kernel Modesetting-style driver? ie. will it (eventually) be possible to run Plymouth, Wayland, xf86-video-modesetting on it?
– Is the open source kernel component useful on its own? (is it upstreamable? I know there’s a binary blob, but that hasn’t stopped e.g. Radeon drivers from being included upstream)
– no GLX – will it be possible to run something like a (stripped down) Gnome-Shell? I know Clutter theoretically runs on OpenGL ES, but I have no idea if anyone actually tried it.
Despite mailing list,
every hour check shop.
Still only stickers.
8/10 – you’d have got 10 if you hadn’t missed a syllable in the second line. :p
I was reading hour as two syllables but that’s probably just my west-country accent.
8/10 is fair decent though
A rhotic accent
Makes ev-er-y difference
Your new score is ten!
Can you give an idea on how close we are to p-day. I’ve got 27 Euros burning a hole in my pocket here.
Hopefully a couple of weeks. It depends on when the factory sends the units back to us, and how long they take to get here – right now, they haven’t given us any firm idea when that’ll be.
I know that there’s a lot of hard work going on. I can’t wait for that e-mail to hit my inbox. Just hope there’s still pi left when I get to the shop.
There is a very big problem not covered. How often will the closed source libraries be updated.
I will point out a few critical things. WebKit/Chromium and Firefox/Gecko can directly expose OpenGL ES to the Internet. Using your own closed version of this library risks security issues if there are no updates.
Binary blobs are one thing; I can live with that. It’s a completely different thing to start providing closed source ABI interfaces that applications are expected to use. You never know exactly how they will be used, or whether people will be opening those interfaces up to the Internet. Once you start providing a closed ABI to applications, you are 100 percent responsible for providing updates.
A sane design would have made the ABI interfaces of OpenMAX, OpenGL ES, OpenVG and EGL open source, so that other parties can add filters as required to address security issues. Yes, all four might have internal binary blobs they call.
Also, how long will Broadcom’s support for these libraries last? Will the libraries become open source when Broadcom’s support ends?
Would I have to be asking these questions if Broadcom was not providing a closed source ABI to userspace and expecting userspace applications to use it? No, I would not.
There are lines in the sand you should not cross. A lot of companies making embedded systems are not evaluating where these lines are, and are not taking responsibility for providing updates to these libraries as they should, because updating these libraries costs extra man hours. If Broadcom does not have the man hours to maintain these libraries, they should not be closed source, basically.
So, the key question: will Broadcom be maintaining these libraries, and how often can we expect to see updates to them?
If I was using a Texas Instruments chip, even with the PowerVR, I wouldn’t have this issue. Yes, PowerVR still has a userspace binary blob, but it’s not a direct user ABI. Allwinner is better: the full set of required drivers is open source.
This is a broadcom issue that needs to be addressed.
I doubt those school kids (and grown-ups) learning how to install an operating system and how to do a bit of scratch and python to understand how computers work are particularly going to be worried by your trivial media center centric issues.
The entire project wouldn’t exist without the development and investment of commercial companies. In the real world open source can’t solve every problem and it doesn’t try to; everyone has to pay their bills somehow.
If you think the RasPi is so flawed you should find a supplier and device more suited to your needs as the RasPi Device/Project/Foundation are perfectly suited to plenty of other needs – especially the one that they are intended for.
This is not a trivial issue. We are talking about a device that could, if it goes well, sell in the millions. Any weak point that cannot be patched in the future could come back and annoy us.
“The entire project wouldn’t exist without the development and investment of commercial companies. In the real world open source can’t solve every problem and it doesn’t try to; everyone has to pay their bills somehow.”
True and false. Other commercial companies don’t have issues releasing source to the same ABI libraries. There is still money to be made: Allwinner and Texas Instruments make money selling you the chip. You must have software, so you will buy the chip. Basically Broadcom is doing something different to other chip makers, so yes, they do need to explain themselves on this point.
I would suspect that the libraries are mostly paperweights without the hardware. This is why the other two makers I use don’t care about keeping the source code closed: to use it you still have to buy the chip.
You would expect all the secrets hidden in the firmware. Anything in user-space you will be able to place a debugger on.
I do not see how Broadcom would lose any money releasing the libraries as open source. The firmware is securely loaded by the chip and never seen by the OS. Unless Broadcom is going to do something evil like making the Raspberry Pi libraries unique to the Raspberry Pi, so they cannot be loaded on other BCM2835 chips.
I cannot see a profit advantage here. Closed source, the libraries cannot be submitted to Coverity for free static checks for errors, and cannot have independent third-party review. So, more QA issues.
If I can see the profit-making logic I can possibly make peace with myself; basically, a reason for missing out on the freebies.
Alan, my problem is that I see this as Broadcom missing out, so the possible result is a poorer grade product with no gain for Broadcom. Of course I want Broadcom to recover their spend and make a profit out of this, so I don’t want Broadcom taking actions that will cost them extra in the long term. I want them to see doing stuff like this as a profitable action, so we get more products.
I am not really asking for much. Simply: how much support from Broadcom will the libraries get? What happens at the end of life of Broadcom’s support? Have they seriously considered what they are cutting themselves out of by not releasing the source, like the free Coverity scan paid for by the US military?
I will ask and see what the long term support for the libraries is. Since the head of the Foundation works there, along with quite a number of posters, I think even if official support lapses, you will still have unofficial support anyway.
That said, these libraries are in pretty good shape, and are unlikely to need much in the way of changes anyway.
These are the typical reasons why companies keep drivers closed-source:
1) They include 3rd party licensed source that they are not allowed to pass on except in binary form
2) There are remaining TODOs, known fringe bugs, and other quality issues that they do not want to reveal, and don’t want to be bothered with source-level support
3) The source code could potentially (or knowingly) infringe on patents (very easy to do in this field) and they do not want to showcase it
4) The source code reveals architectural decisions of the custom hardware that they consider trade secret
Excellent and accurate list! Thanks.
In fact, in the Raspi’s case the drivers in the kernel are required to be open source due to the GPL, and indeed they are. There are some libraries in user space that are closed.
Broadcom will be updating the libraries as and when necessary. How long that level of support will continue I don’t know, I doubt anyone does. However, these libraries are used in lots of other products using Broadcom chips, so I would expect for quite some time (years). BTW, I don’t think you are in a position to judge what is ‘sane’ or not when it comes to commercial decisions made by a company you don’t work for.
If you don’t want to use these libraries you don’t have to; you just won’t have access to the acceleration of the GPU. If you are unhappy with anything on the Raspi in this area, please feel free to choose an alternative supplier; you have suggested a couple already.
What I class as sane is future proof, and does not create a case of forced maintenance on the chip maker.
JamesH, so from what you can see the libraries will have at least 12 months of support, and you guess more. For the support lifetime of the hardware I don’t have to worry that much. A statement from Broadcom of how long support will last would be good in this regard, so we can cost the lifespan of the device.
Still the question remains: what is going to happen at the point where support for the chip used in the Raspberry Pi ends? Basically, is there an assurance that the source code of the interface libraries will eventually be released when Broadcom no longer sees the device as commercially viable to maintain? With that, a lot of the reason to reverse engineer the chip goes away.
I am guessing that once the raspberrypi is released, they will be reverse engineered. Since it is open source above and below, it will be possible to choose what inputs are run, and see what outputs are generated from the closed source module.
Good luck with that. Not an easy task. Millions of combinations.
Just noticed this MPEG-2 codec licensing issue. Why can’t libmpeg2, which is available under a GPL licence, be used? Is it only available for the x86 architecture, or maybe I don’t understand the problem at all? Are there some links I can read to understand this licensing issue in depth? I remember ages ago we had this problem (and still do with encoding into MPEG-2), but decoding all went away on Linux machines a while back when GPL-licensed codecs became available. Or maybe the MPEG-2 codec is in the Nvidia driver (on machines with Nvidia graphics chips) or something like that nowadays?
Because any codecs supplied by the Foundation would need to pay the licence fee. And the only way to get HD MPEG-2 support would be to provide it on the GPU, and it would therefore be provided by the Foundation. The fact that there is open source MPEG-2 is irrelevant; it’s not the code itself that attracts the licence fee, but the algorithms used.
You would want to run the mpeg2-decoder (mostly) on the GPU. There is no compiler for that! You’re free (provided the laws in your country allow you) to run the mpeg2-decoder on the ARM processor. However it won’t be able to perform well enough to provide full-hd resolution at a real-time pace.
My camera records video in MotionJPEG (MJPG), so it’s good that the Raspberry Pi can handle that in hardware. That gives me one more reason to pick one up.
Actually there is some argument in house (at Broadcom) about whether this IS supported. Not difficult to implement if necessary.
Once the board is out I am guessing that someone will post firmwares for free that enables everything that can be enabled in the GPU.
Is there a way to make sure that none of this non-free software is used, activated, or even present on the device? I once thought the Pi looked hopeful, but all the attention to such things makes me think twice about getting one. I hope there is a way to allow the Pi to become a device that encourages studying, examining, and modifying, especially for the younger generation
Well, the device will not boot without the GPU binary blob. But if you can put up with that (and most people can), then you can run the device without the closed user land libraries; you just won’t get any acceleration (e.g. 3D, 2D and video encode/decode).
If you are so set on having all free software, the Raspi is not for you. I personally take a more pragmatic approach – I have no need for the source of the libraries or the blob, I just want to use them. I also don’t believe there are any great benefits to be had by making them open – security is a strawman argument, as is the community being able to fix bugs in the code if the source were available. Bugs in the code get fixed anyway by the SoC supplier, as do security issues.
Well, it is the attitude of “just using” that I find problematic. Why is it OK not to be free to study one part, but to promote “open source” ideals everywhere else? In that case the non-free stuff looks like a lure, i.e. “even if you do not want to hack Python etc… it plays Quake 3!” I think there is enough value in free software, and in freedom itself, not to need to sign a deal for mere acceleration. In fact I think such deals have no place in the future, and in support of a free internet, culture and world I do not use them. The attitude that “someone else” will update and fix drivers relegates the wonder of computing to dependence on a corporation, and it seems to stick out here like a sore thumb. I think dependence on corporations should stop at purchasing the hardware.
To be honest, I think you are completely missing my point. For the majority of people for whom the board is intended (the education market), just using it is perfect. People who say you cannot cater for the educational market AND have these closed blobs are talking out of their arses. This board is designed to get people programming, and it should do that. As a bonus they get to use state of the art graphics and video codecs.
And are the foundation promoting open source ideals? I thought they were promoting a cheap computing device to introduce more people to programming.
If you feel so strongly about it, I suggest you build your own board, using entirely free software. Since there are no GPUs with free software, you will be limited to a simple Arm core, but that seems to fit your ideas. If you don’t want to do that, then I am afraid you are stuck with some closed code. Note that even the Allwinner A10 has closed code.
As to dependence on the corporate worlds to keep updates flowing – Welcome to the real world. You already rely on them for food, health, transport etc. Computers are no different.
If both chips had edge connectors you could plug them (with your thumb) into a hollow square column designed to connect them.
Then at any stage you could upgrade the memory or the SOC or both.
Yep, R-pi and the chip manufacturers may save a few cents this way, but you get to throw the whole board away if you find you need more memory or Broadcom brings out a more powerful (pin compatible) chip.
Sorry for the negativity, I think R-pi is great, but encouraging reuse or an upgrade path would be an extra feather in your cap.
I think you underestimate the cost of designing a brand new plug-in system for chips. How many tens of millions would you be willing to spend on a new socket design? The edge connector would also be very long – count the balls on the BGA. The socket itself, once designed, would cost multiple dollars, and the chip’s packaging costs would increase dramatically.
Just as an idea: you can buy sockets for BGAs used on development rigs. They cost > $1000 each.
The edge connectors would join the memory to the CPU while holding the memory chip in place – think of a clothes peg.
Now think of a clothes peg with lots of little vertical wires and plastic guides to position the wires.
Next, the connectors are on the bottom edge of the memory and the top edge of the CPU.
Now imagine two clothes pegs holding the chip, on all four sides (OK, that’s one weird clothes peg).
I don’t know how closely together you could position those wires, but as long as there are plastic guides to ensure the correct placement, I think it’s do-able.
Sure the ball grid array for the CPU is another deal, but they could be as simple as pins on (tiny) springs with about a millimetre of vertical freedom coming out of the PCB.
Those clothes pegs could then hold that in place too.
As for “They cost > $1000 each”?
I’m going out to buy some clothes pegs!
As long as the clothes pegs are 0.2mm long you should be OK.
Would 0.4mm be small enough?
Something that could work with the chips as they are would be to put connecting pins in a plastic sheet between the chips.
These pins have a spring to them allowing the connection to be made by the modest application of downward force, achieved by metal fasteners pushing down on the chips, like the fasteners you see securing RAM chips in a PC.
I call the plastic sheet the “moist maker” in honour of the TV series “friends”, just to make it easy to reference.
This would require four 1mm holes through the board at the corners for the metal fasteners to pull against from the other side of the board.
I would need the help of someone who can use blender or equivalent to describe the pins and the securing operation in detail (any volunteers?).
This would allow the R-pi(2?) to be sold as a (99% complete?) kit, suitable for those with a requirement for more memory, for example.
The only problem being the millions you would need to find to develop such a device, and the 10x cost of making each chip. I’m not even sure you could make any of these concepts small enough (I think the BGA is 0.3mm pitch).
Sorry for replying out-of-order.
I meant 0.4mm vertically – about the thickness of transparency film.
Can I play 720p/1080p MKV (Matroska) without any hitch?
MKV is the container format – whether it plays will depend on which codec is used for the video inside the container. MPEG4 and H.264 are supported, plus some free codecs. See the original blog post.
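As a rough illustration of the container-versus-codec distinction, the sketch below peeks at an MKV file's header for the plain-ASCII Matroska codec IDs. The codec ID strings are real Matroska identifiers, but treat the script as a crude guess; a proper tool would parse the EBML structure:

```python
# Crude sketch: guess whether an MKV's video track uses a codec the Pi
# accelerates. Matroska stores codec IDs as plain ASCII strings, so a
# byte scan of the header area is enough for a rough check; a real tool
# would parse the EBML tree properly.
ACCELERATED = {
    b"V_MPEG4/ISO/AVC": "H.264 (GPU accelerated)",
    b"V_MPEG4/ISO/ASP": "MPEG-4 (GPU accelerated)",
}
NOT_ACCELERATED = {
    b"V_MPEG2": "MPEG-2 (not licensed, so not accelerated)",
    b"V_MS/VFW/FOURCC": "VfW-wrapped (depends on the FourCC inside)",
}

def probe_mkv_header(header_bytes):
    """Return a description of the first recognised codec ID, if any."""
    for table in (ACCELERATED, NOT_ACCELERATED):
        for codec_id, description in table.items():
            if codec_id in header_bytes:
                return description
    return "unknown"

if __name__ == "__main__":
    import sys
    if len(sys.argv) > 1:
        with open(sys.argv[1], "rb") as f:
            print(probe_mkv_header(f.read(64 * 1024)))
```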
What about OGG codec ?
The OGG codec can be played directly over HTML5 without Flash !
It would be a good idea if you guys sold the Raspberry Pi as a kit.
Besides the practical knowledge, it would be a good exercise for programmers, and they could also improve the board.
The CPU and the memory would come populated with solder balls: just put them in place and treat with heated air flow (at 283 degrees Celsius) for 20 to 30 seconds.
This kind of kit would open the doors wide for hacks. Yummy!
I was excited for the Raspberry Pi until I saw this post. I can live with the proprietary GPU binary (though it’s a drag)… but forcing us to do business with MPEG-LA? Uhhh… and you say this device is meant for education? So you want my tax dollars to pay for these devices for kids to use in school? I am NOT OK with my tax dollars going to support MPEG-LA.
I only intend to use this device for Qt Quick hardware accelerated programming. Why do I need to do business with MPEG-LA all of a sudden? Keep the $0.25 and your licence (give it to someone else that DOES want H.264 decoding). Pool all the $0.25 you save from people like me and donate it to some other charity… or just use it to buy more Raspberry Pis (**WITHOUT** the licences) for children in schools. Every 99 units sold, you donate a Raspberry Pi to a child at no cost. I pulled $0.25 out of my ass, but the real figures are probably protected by a Non-Disclosure Agreement lol…
The cost to the Foundation of maintaining sales of two different products (with and without licensed codecs) would actually be more than the codec cost, so it’s really a non-starter to provide a device without the codec. I’m really not sure why you brought up the subject of education – this doesn’t seem to have anything to do with it. Remember, you have already paid MPEG LA if you have a graphics card in your desktop, or an LCD TV, or a DVD player, or an Xbox etc. It’s not like the Raspi is unique in this respect.
If you really cannot live with the codec cost, then I suggest you buy a different $25 machine that doesn’t have the codec cost. Except there isn’t one. Even with this codec cost, this is still far cheaper than any alternative. If you want to pay MORE for machines in education that have less features, you can go for it. But make sure that they haven’t hidden the cost of the codec licences in the price…
Yes, Xbox, my graphics card, etc etc all pay the license… as does any other proprietary product marketing themselves as such.
An ARM GNU/Linux box for $25. Take a byte!
An ARM GNU/Linux + Proprietary Stuff + MPEG-LA Licensing in a box for $25. Take a byte!
…just doesn’t have the same ring to it.
The issues of MPEG-LA and education don’t directly conflict… but with the Raspberry Pi aimed at the education market (as well as the technical market), I just can’t say I support giving MPEG-LA that much business. A product that sells millions (for schools, paid for by taxes) gives MPEG-LA a pretty penny.
“If you really cannot live with the codec cost, then I suggest you buy a different $25 machine that doesn’t have the codec cost. Except there isn’t one. Even with this codec cost, this is still far cheaper than any alternative. If you want to pay MORE for machines in education that have less features, you can go for it. But make sure that they haven’t hidden the cost of the codec licences in the price…”
It’s just a GPU. In a logical world, the codec would make the hardware cost more, not less. MPEG-LA owns us.
I have never, ever, seen any marketing of graphics cards, PCs, PC software, or DVD/Blu-ray players say that part of the money you pay for the product goes to a licence fee for the codecs involved (and because they implement more codecs, a lot more than you pay for codecs on the Raspi). Why should the Raspi be any different? Why must we put extra stuff in our ‘advertising’ that others don’t? After all, we are a proprietary product.
You don’t like MPEG LA. Neither do I. But selling a Raspi into schools gives MPEG LA less in licence fees than a normal desktop PC does (if I understand licensing correctly).
As to your last sentence, I’m not sure of your point. In a logical world, you pay for the codecs you want to have. Which is what the Foundation does. The GPU could support many more than are actually going to be made available. If the codec price were included in the GPU, you would probably end up paying a LOT more for the GPU, and getting loads of codecs you didn’t want.
great work! thanks!
I don’t get it. How is OpenGL, OpenVG or OpenMAX closed source?
The APIs are not closed, but the source code of the libraries implementing those APIs is closed. You can still use the binary libraries via the standard OpenXXX APIs; you just don’t have access to the source code. For almost anyone, this is not a problem.
Quick question: which of the H.264 audio codecs are supported? Think the following are the options:
MP3, AC3, DTS, LC-AAC/HE-AACv1/HE-AACv2, Vorbis, FLAC and PCM
Audio is decoded on the Arm, so anything you can get on there basically.
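To illustrate the point that audio stays in ordinary ARM-side software, even Python's standard library can write and read PCM audio with no GPU involvement; this round trip is only an illustration, not how XBMC actually does it:

```python
# Illustration that audio decode is ordinary ARM-side software: even
# Python's standard library handles PCM WAV with no GPU involvement.
import io
import math
import struct
import wave

# Write one second of a 440 Hz tone as 16-bit mono PCM at 8 kHz.
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(8000)
    frames = b"".join(
        struct.pack("<h", int(20000 * math.sin(2 * math.pi * 440 * n / 8000)))
        for n in range(8000))
    w.writeframes(frames)

# Read it back: "decoding" PCM is just parsing the container.
buf.seek(0)
reader = wave.open(buf, "rb")
print(reader.getnframes(), "frames at", reader.getframerate(), "Hz")
```

Compressed formats such as FLAC or Vorbis need a real decoder library, but the principle is the same: it is all CPU-side code.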
Do you know which ones have been tested – which ones were used in the examples we have seen. Thanks
Quick question: which of the H.264 audio codecs are supported? Think the following are the options:
MP3, AC3, DTS, LC-AAC/HE-AACv1/HE-AACv2, Vorbis, FLAC and PCM
Those aren’t “h264 audio codecs.” H264 is a /video/ compression format. The other things you mentioned are /audio/ compression formats. All you need to decode audio is software that handles it.
That’s not something this device needs to include, buy one, get source code for said decoding software, compile against device, enjoy musics.
Also anime makes you a retard.
Technically, H.264 is not a codec but a codec standard; x264 is a codec, as it is an implementation of the standard. So whilst MP3 isn’t an audio codec, LAME would be, as it is an implementation. I was unsure which audio codecs the Raspberry Pi would support or had been tested with. I do like to have sound in my media files, so when watching an H.264 video I was wondering what sound format I would place in the same container. When you encode a media file you have options for which audio codecs to use; I was simply asking which ones were supported.
Please, please, please give us a way to buy HW MPEG2 licenses. I’ve been following the r-pi news for a very long time and the thing I want to use it for most requires HW MPEG2. :-( I would happily pay quite a bit for the license pack.
It certainly is being thought about – but it’s quite difficult to ensure that only those who have paid for a licence use it (i.e. no sending your binary blob to a mate). It would require work on the software itself, and lots of administration headaches.
That’s understandable. Sad, but understandable. :-) Although it seems like for something that isn’t very expensive, people don’t tend to go out of the way looking for pirated copies of stuff. Good luck. I hope you find a solution. Great work on everything!
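One common way to bind a licence to a single board – a hypothetical sketch, not how the Foundation actually implemented anything – is to issue a key that is an HMAC of the board’s serial number under a secret held by the licence issuer. The firmware recomputes the HMAC and enables the codec only on a match, so a key copied to a mate’s board is useless:

```python
import hashlib
import hmac

# Held only by the licence issuer; the value here is obviously made up.
VENDOR_SECRET = b"not-the-real-secret"

def issue_licence(serial: str) -> str:
    """Issuer side: derive a licence key bound to one board's serial number."""
    return hmac.new(VENDOR_SECRET, serial.encode(), hashlib.sha256).hexdigest()

def licence_valid(serial: str, key: str) -> bool:
    """Device side: recompute the HMAC and compare in constant time."""
    return hmac.compare_digest(issue_licence(serial), key)

key = issue_licence("0000000012345678")
print(licence_valid("0000000012345678", key))  # True: key matches this board
print(licence_valid("00000000deadbeef", key))  # False: same key on another board
```

The administrative burden James mentions is the part this sketch hides: someone has to run the issuing service, take payments, and handle boards that are replaced or resold.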
As a lot of other people have already said, it would have been great if more codecs were possible. But they aren’t, and that’s what we need to deal with.
I would like to ask, however, whether the supplied codec can decode/encode all profiles within the H.264/MPEG-4 Part 10 AVC standard, and up to which level (I think it will be level 4)?
Besides that, I would like to add my vote for more codecs which can be bought per device. Technically it would be possible to handle it within a module and firmware file which checks a serial number or something like that to prevent abuse. I do understand, however, the administrative hell the foundation would end up in.
kind regards and please keep up the good work
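As to the profile/level question above: which profile and level an H.264 stream actually uses can be read straight from its sequence parameter set. The byte after the NAL header is profile_idc (66 = Baseline, 77 = Main, 100 = High) and the fourth byte is level_idc, the level times ten. A small sketch (the sample SPS bytes are illustrative):

```python
PROFILES = {66: "Baseline", 77: "Main", 88: "Extended", 100: "High"}

def h264_profile_level(sps: bytes) -> tuple:
    """Read profile_idc and level_idc from the start of an H.264 SPS NAL unit.

    Layout: byte 0 = NAL header (type 7 = SPS), byte 1 = profile_idc,
    byte 2 = constraint flags, byte 3 = level_idc (level * 10).
    """
    if sps[0] & 0x1F != 7:
        raise ValueError("not an SPS NAL unit")
    profile = PROFILES.get(sps[1], "unknown ({})".format(sps[1]))
    level = sps[3] / 10
    return profile, level

# Illustrative SPS prefix: profile_idc 100, level_idc 40
print(h264_profile_level(bytes([0x67, 0x64, 0x00, 0x28])))  # ('High', 4.0)
```

Checking your own files this way tells you whether they fall inside whatever profile/level envelope the hardware decoder turns out to support.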
“Is it all open yet?” Can’t wait till June when it will finally all be open.
I would gladly pay an additional amount for the missing codec licences; I would love to use my Pi as my primary XBMC device – it’s so small, it’s quiet, it can be mounted behind the TV – and it’s so much more fun!
Hope enough people request it and you work out some method for those of us who already have Pis.
Just wanted to chip in and say +1 for the separate licensing of the “imaginary property” required to fully use the GPU. If people are willing to pay, I think letting features go unused would be kind of a shame.
I hope you can pull it off. Good luck, and thanks for the device!
+1 for the GPU license pack.
I will happily pay some extra $ for this.
I am trying to use the Pi as an xbmc player. This is primarily to play the TV files I have archived off my Freesat PVR to my NAS. However the TV files are in .TS (MPEG-2) format, and so are unplayable :(
Transcoding these files seems like a pain, when they could be played direct.
How do you overlay graphics on top of video? Use OpenVG for the graphics and OpenMAX for the video? Is that possible?
Hi, just to ask, if I use Raspbmc, and purchase the 2 licenses, will I be able to play RMVB format videos?