Astrophotography with the camera board
Cristos Vasilas from Dash One, a lover of astronomy and electronics, has been trying out the Raspberry Pi camera board as an astrophotography tool. He’s captured some amazingly sharp, short video of the moon, and of Saturn, rings clearly visible, swinging across the sky.
Cristos used foam packing material to attach the camera board to the eyepiece of his telescope, and mounted the Pi on the barrel of the telescope with velcro.
He says: “A dedicated Celestron 5M pixel imager costs $200, and I doubt it is nearly as versatile as the rPi.” Since filming the images above, Cristos has also discovered that a group of telescope enthusiasts have released code enabling the Pi to drive Stellarium, the planetarium software that tells the telescope where to point, so he can also lose the laptop from the kit needed to take photos like this in the future. If you haven’t played with Stellarium yet, you really should; several of us here at the Foundation are big fans and use it regularly – you don’t need a telescope to enjoy it.
Cristos says he has more work to do on exposure, gain, contrast and so on, and we hope he’ll be posting the results on his blog.
Updated to add: Cristos took some more video and stills of Saturn, this time with a 6mm lens, making the pictures even larger. They’re amazing – you can very clearly see the gap in the rings, and the shadow cast on the planet by the rings. Check them out.
We’ve found that there’s enormous potential in bringing down the cost of amateur photography – of all kinds – as a hobby with the Pi, whether or not you’re using the camera board. Check out these earlier posts if you’re interested in finding out more.
Updated to add: Cristos’ blog seems to have lost itself within the depths of the internet. All links have therefore been removed. If you know of its new home, please let us know the URL in the comments below.
Amazing project! This is one of the targets of my own personal project.
I have been playing for a while with a Raspberry Pi and a telescope connected with Stellarium. The source code is on GitHub:
By the way, if you are using stills rather than video, you can access the RAW data, which gives you some more options in image processing (like skipping noise removal, which also kills some fine detail). I have made a start on this already – see my bealecorner.org webpage link above.
Very cool. Newtonian telescopes with very short focal lengths (called ‘fast’ Newtonians) have poor contrast because they have a rather large diagonal mirror in the field of view. If the rPi and camera were made small enough, one could place them in the tube in place of the secondary mirror, eliminating losses from the secondary and possibly increasing contrast (which is useful for planetary observation). Having the computer ride with the telescope can also reduce the cable-management issues that often occur when running data and power lines from an immobile observatory to a telescope that is always in motion.
Hmmm… I have a half-built Newtonian (based around a beautiful David Hinds 8.5″ mirror) that I’ve been meaning to finish for years. Might have a play with this — I’ve not built the spider yet so it’s ripe for hacking.
Great project BTW Cristos!
Might not need the RPi in the tube. The camera flex cable is thin, and edge-on it presents a fairly small obscuration. Remarkably, even a 4-metre-long camera cable apparently works: http://www.raspberrypi.org/phpBB3/viewtopic.php?f=43&t=43737#p362736
Gert van Loo
I am pretty sure the 4M cable is a spoof.
That’s a fantastic idea, will start hacking my f/5 right away! :)
Will the Foundation be releasing different versions of the camera? I have a long-exposure mod on my astro webcam – this, I think, would be an excellent addition to the Pi camera (maybe by software control) to allow for better astro imaging in HD quality.
Unless there is already a way to control the exposure? I have not got a Pi camera yet, but can’t wait to get hold of one! :)
Check out the forums – lots there on how to change the camera’s parameters.
Lots there about how you can’t change the parameters at the moment, and how there’s no time to implement them in the short term.
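For reference, here is a sketch of the kind of manual control being asked about, using raspistill’s documented flags. The values are purely illustrative, and which flags are available depends on your firmware and app version:

```shell
# Fix ISO and shutter speed instead of relying on auto-exposure.
# -ss takes the shutter speed in microseconds; note the maximum exposure
# is firmware-limited, so a true long-exposure mod needs more than this.
raspistill -ISO 100 -ss 20000 -ex off -o moon.jpg
```

For planetary work, video mode (raspivid) plus frame stacking tends to beat any single long exposure anyway, as the comment below points out.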
In that particular lunar video, a long shutter speed is clearly the opposite of what you need, because of atmospheric turbulence. Instead you need a lot of frames with short exposures, individually selected and aligned afterwards.
And, in fact, you can go beyond this: you can move the sensor diagonally, or align it so that the Earth’s rotation makes the telescope image drift across the sensor at an angle, and then use the known motion of the sensor (or interpolation techniques applied directly to the image data) to align the resulting stack of short-exposure images with sub-pixel accuracy.
That then gives you images that are not only long-exposure, but which have a genuinely higher resolution than your physical sensor! :) The overlaid, partially-overlapping pixels generate the sub-pixel data.
“Active” sub-pixel alignment would in theory let you produce gigapixel shots from a megapixel sensor, making use of natural or artificial camera shake (a tripod is counter-productive!) – the problem with the method (known as “drizzle”) is the amount of number-crunching needed.
If only someone made a cheap battery-powered computer board with a camera interface, that also had an onboard programmable graphics processing chip …
Also known as drift scanning – see here for details of how to do it; it just needs some Pi software to implement…
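The stacking described in the comments above can be sketched in a few lines of NumPy – a minimal integer-pixel shift-and-add via phase correlation, not full sub-pixel drizzle, and all the function names are illustrative:

```python
import numpy as np

def shift_estimate(ref, img):
    """Estimate the integer-pixel shift of img relative to ref by phase correlation."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    # Map wrapped peak positions to signed shifts.
    return (dy - h if dy > h // 2 else dy), (dx - w if dx > w // 2 else dx)

def stack(frames):
    """Align each frame to the first and average, boosting signal-to-noise."""
    ref = frames[0].astype(float)
    acc = np.zeros_like(ref)
    for f in frames:
        dy, dx = shift_estimate(ref, f)
        acc += np.roll(f.astype(float), (dy, dx), axis=(0, 1))
    return acc / len(frames)
```

Real stacking tools (RegiStax, AutoStakkert! and friends) also grade frames by sharpness and interpolate to sub-pixel accuracy before combining, which is where the extra resolution comes from.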
Just WOW! :)
Me too: WOW!
+1 – WOW
Impressive. Now, what if the camera was on a balloon…
Also, you wrote “from” twice.
I have flu. I think my writing *anything* should be applauded. (Thanks for the spot.)
This is awesome! :D I can’t wait to start a project on this!
In fact, I was thinking about this after seeing the post on time-lapse photography! Now it has become a reality! =T But the good thing is, I appreciate all sources of reference ;)
Message to Astrophoto enthusiasts, I’m porting qastrocam-g2 to raspberry pi/raspbian.
The package can be found here:
Congratulations! I have tried your build of qastrocam-g2 on my Raspberry Pi with my SP900 (LX-modded).
The webcam is detected by the software, but the image preview screen stays white, and when I take a picture it saves only the text file.
Could there be some other packages to install?
Note his June 4 update has a much better Saturn pic showing far more detail. Follow the link to his blog and look at it in full resolution – the photos and videos shown above are just his earlier ones at lower magnification.
How is it that the pictures are coming through so sharp? Everything I’m generating is coming out at near 2MP quality.
Have you updated your firmware?
More details here http://raspi.tv/2013/part-2-raspberry-pi-camera-stills-vs-video-old-vs-new-resolution-comparison
Sounds like these images were taken using the Afocal technique.
You get much better results when you use prime focus.
How hard is it to remove the lens from the camera module so it’s just the sensor?
Removing the lens and placing the camera module inside an old extension tube or even a film canister will allow you to use the actual sensor as an ‘eyepiece’ and the telescope as a lens, and with a pixel size of 1.4 µm that will give you about the same magnification as a 2 mm eyepiece.
With the ability to capture 60fps at 1280×720, you should be able to get some amazing images that I think will rival some top-of-the-line planetary cameras. (The $200 Celestron NexImage camera mentioned in the article is very much entry-level hardware and performs about the same as a $40 webcam.)
Here is an example of one of my images taken using a modified Microsoft Lifecam.
It was taken at 1280×720 @30fps.
I have a feeling this camera module will be able to produce even better images than this one.
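As a rough check on the sampling claim above: the standard plate-scale formula relates pixel size and focal length. The 1.4 µm pixel size is from the comment; the 1250 mm focal length is a hypothetical example, not any particular setup here:

```python
# Plate scale (arcseconds per pixel) for a sensor at prime focus:
#   scale = 206.265 * pixel_size_um / focal_length_mm
def plate_scale(pixel_um, focal_mm):
    return 206.265 * pixel_um / focal_mm

# Hypothetical 1250 mm focal length with the camera board's 1.4 um pixels:
print(round(plate_scale(1.4, 1250), 3))  # about 0.231 arcsec/pixel
```

At that scale Jupiter (roughly 45 arcsec across at opposition) would span nearly 200 pixels, which is why prime focus with such tiny pixels behaves like a very short eyepiece.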
I’m working on a RPi based astrophotography setup controlled by a smartphone or tablet. I’ve built an adapter for prime focus photography and the results look good. You can see my setup at: http://www.flickr.com/photos/robpettengill/sets/72157635483690850/
Please excuse any ignorance, as I have just come across this project from a FORA.tv presentation and didn’t know anything about Raspberry Pi before this.
My question is: are the photos/videos taken and the unit brought back to earth, or are they simply transmitted to earth and the unit lost in space?
Thank you in advance
Have just tried this myself using the Raspberry Pi camera at prime focus with the lens removed. Using a Celestron 130 SLT telescope and got this photo of Jupiter. I am very pleased since it is my first attempt at this. Things can only get better! :)