Quote Originally Posted by Redrobes View Post
Edit: Cool - read that now, very good stuff. I would check out Celestia the app because a) I believe it has some projection options in there, b) it's made for doing this type of thing, and c) there's a whole heap of stuff to project with it. The usual way to calibrate this sort of thing is to project the map onto a 3D model and then project the model. You put some calibration points down in the texture and look at where they end up on the globe. Then, through a process of iteration, you adjust the 3D model so the cal points sit where you want them on the real model. Once that's done, projecting the map onto the model should transform it to fit the real world. Your model looks like it's a pretty close fit already, though. Post back with updates, won't you? Kinda interested to know how you're going to do the touch feedback input from it. Are you going to use a webcam and decode the hand positions, or use the Kinect? I know Microsoft's projected table used infrared LEDs pushing light into the perspex top; where your fingers touched it, you got IR spots which a camera picked up and could then decode into movements.
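The iterate-until-the-cal-points-land-right process described above can be sketched in a few lines. This is only an illustration, not anyone's actual calibration code: the `measure` function here is a stand-in for "project the texture and observe where each calibration point ends up on the globe", and the update rule just moves each offset a fraction of the remaining error each pass.

```python
# Hedged sketch of the iterative calibration loop: measure where each
# calibration point lands, compare against its target, and nudge the
# model's per-point offsets until they converge. All names illustrative.

def measure(offsets, targets, distortion=0.1):
    """Stand-in for 'project and observe': simulates where each
    calibration point lands given the current model offsets."""
    return [t + distortion - o for t, o in zip(targets, offsets)]

def calibrate(targets, iterations=50, gain=0.5):
    offsets = [0.0] * len(targets)
    for _ in range(iterations):
        measured = measure(offsets, targets)
        # Move each offset a fraction of the remaining error.
        offsets = [o + gain * (m - t)
                   for o, m, t in zip(offsets, measured, targets)]
    return offsets

targets = [10.0, 20.0, 30.0]
offsets = calibrate(targets)
residual = max(abs(m - t)
               for m, t in zip(measure(offsets, targets), targets))
print(residual)  # shrinks toward zero as the cal points settle
```

In a real rig the "measure" step is a person (or a camera) checking the physical globe, so each iteration is slow; that's why you want few, well-placed calibration points.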
Celestia looks perfect for this. I had looked at Stellarium once before, but it seems like Celestia may be able to do the projection I need out of the box.

I'm still looking at how to do touch input, but IR LEDs are probably what I'll end up trying. I have a PS3 Eye modified for IR use already, though it doesn't have a fisheye lens. If I can't work out a good way to do it, I'll probably just interface a SpaceNavigator to it.
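For the IR-spot approach, the decoding step is basically: threshold the camera frame, group bright pixels into blobs, and treat each blob's centroid as a touch point. Here's a minimal, self-contained sketch of that logic; a real setup would pull frames from the modified PS3 Eye (e.g. via OpenCV's `VideoCapture`), but this uses a small synthetic frame so the blob-finding itself is clear. The threshold value and frame format are assumptions.

```python
# Hedged sketch of decoding IR touch points from a grayscale frame:
# threshold, flood-fill connected bright regions, report centroids.

def find_touches(frame, threshold=200):
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    touches = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill this bright blob, collecting its pixels.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy+1, cx), (cy-1, cx),
                                   (cy, cx+1), (cy, cx-1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Blob centroid = estimated fingertip position.
                mean_y = sum(p[0] for p in pixels) / len(pixels)
                mean_x = sum(p[1] for p in pixels) / len(pixels)
                touches.append((mean_x, mean_y))
    return touches

# Synthetic 8x8 IR frame with two bright spots ("fingers").
frame = [[0] * 8 for _ in range(8)]
for y, x in [(1, 1), (1, 2), (2, 1), (2, 2), (5, 6), (6, 6)]:
    frame[y][x] = 255
print(find_touches(frame))  # -> [(1.5, 1.5), (6.0, 5.5)]
```

Tracking centroids across frames (nearest-neighbour matching between consecutive frames) then gives you drag gestures rather than just taps.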