This project addresses the "ExoMars Rover is My Robot" challenge.

Description
As a base platform we used a four-wheel robot equipped with an x64 processor on a mini-ITX motherboard, running Ubuntu.
To orient itself on the planet, the robot uses a compass taken from a quadcopter flight controller.
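The compass logic boils down to turning raw magnetometer readings into a heading. A minimal sketch of that conversion, assuming a level-mounted sensor and using only the X/Y axes (the axis names and sign conventions depend on how the sensor is mounted, so treat them as illustrative):

```python
import math

def heading_degrees(mag_x, mag_y):
    """Convert raw magnetometer X/Y readings into a compass heading.

    0 = magnetic north, 90 = east. Sign conventions are assumptions
    for this sketch; a tilted sensor would also need accelerometer
    compensation, which is omitted here.
    """
    heading = math.degrees(math.atan2(mag_y, mag_x))
    return heading % 360.0

# A field vector along +Y reads as due east.
print(heading_degrees(0.0, 1.0))  # → 90.0
```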
To find 'interesting' targets we use a color photo from one of the cameras and detect them by color (we used pink targets). This should help the robot find the best direction for exploration and align its arm with a target (this part was not implemented within the hackathon scope). Once a target is detected, it is saved as the 'last seen target' and can be requested by the command center on Earth. This approach reduces the amount of data sent to Earth: only important photos are transmitted.
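The color-detection idea can be sketched in a few lines: threshold each pixel against a rough "pink" range and report the centroid of the matching pixels. The threshold values and the plain-list image representation below are illustrative assumptions, not the rover's actual pipeline (which would read camera frames, e.g. via OpenCV):

```python
def find_pink_target(image):
    """Return the (x, y) centroid of roughly pink pixels, or None.

    `image` is a list of rows of (r, g, b) tuples. The threshold is a
    crude illustrative 'pink' test, not the tuned values from the rover.
    """
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if r > 180 and b > 120 and g < r - 60:  # crude pink test
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# 3x3 black image with a single pink pixel in the centre.
img = [[(0, 0, 0)] * 3 for _ in range(3)]
img[1][1] = (255, 80, 180)
print(find_pink_target(img))  # → (1.0, 1.0)
```

The centroid gives the robot a direction to steer toward and, eventually, a point for arm alignment.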
To investigate objects we added an arm with an LED flash and a microscope (unfortunately, we did not implement the logic to take photos with it).
To detect obstacles we planned to use a PlayStation Eye stereo camera, but obstacle detection turned out to work very unreliably, so we decided not to include it in the final presentation.
We actually have two control interfaces: one was implemented to test the robot's functionality and lets us control it manually. The second represents the Earth control center; it lets anybody see the robot's last seen target and also send it commands (in the presentation you can see that only the command to orient by magnetic field is implemented so far).
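The Earth-side command handling described above can be sketched as a small dispatcher. The command names (`ORIENT`, `LAST_TARGET`) and the `Rover` class are hypothetical placeholders for illustration, not the project's actual protocol:

```python
class Rover:
    """Toy stand-in for the robot's state (hypothetical)."""

    def __init__(self):
        self.last_seen_target = None  # photo/metadata of last pink target
        self.heading = 0.0

    def orient_by_magnetic_field(self, target_heading):
        # On the real robot this would drive the wheels until the
        # compass reads target_heading; here we just record it.
        self.heading = target_heading
        return f"oriented to {target_heading}"

def handle_command(rover, line):
    """Dispatch one text command line from the control center."""
    cmd, *args = line.split()
    if cmd == "ORIENT":
        return rover.orient_by_magnetic_field(float(args[0]))
    if cmd == "LAST_TARGET":
        return rover.last_seen_target or "no target seen yet"
    return f"unknown command: {cmd}"

rover = Rover()
print(handle_command(rover, "ORIENT 90"))    # → oriented to 90.0
print(handle_command(rover, "LAST_TARGET")) # → no target seen yet
```

Keeping the command set text-based makes it easy to drive the rover from either the manual test interface or the Earth control center.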
We have a good hardware and software base on which to build an automated exploration robot. Unfortunately we did not finish all of our goals in these two days, but we are happy with what we have. For now the robot cannot make its own decisions about what and how to investigate, but it has good subsystems that can help with this work in the future.
License: GNU General Public License version 3.0 (GPL-3.0)
Source Code/Project URL: https://github.com/Mortido/Rusty
YouTube video: http://www.youtube.com/watch?v=lWSZKj6Bt4I