
Haiya Stephen, was this done at university level? Since I am a code learner (but a semiconductor person many years ago), I'll challenge other coders to write out the simple code for the robot control in your posting below.
- Colour detection = different wavelength = different waveform = different voltage. 50% of the job is done by the sensors.
- Collision detection = flat object = flat waveform = halt drive motors. The sensor does the work again!
- Identify sponge = colour-detection waveform.
- Pick sponge = look at the sensor waveform; each object has different pulse timings = run the action.
Where is the AI in this? Wooii... my words here are too harsh. Did the students programme the controller on waveforms, or did they actually build intelligent code to guide the robot? Rgds. On Fri, Jul 23, 2010 at 4:54 PM, ndungu stephen <ndungustephen@gmail.com> wrote:
This year's competition went like this:
1. Switch on the robot - the robot should find the white line (track) and stay within the track
2. Once the end of the track is reached, the robot comes across a tray full of sponges and stops
3. The robot should identify the red sponges (from an assortment of white, blue, yellow, etc.) on the rack, and, using an arm, reach out and pick the 'red' sponges only
4. Then the robot should go back along the track and drop the red sponges into a tray waiting at the other end
5. Then the robot should go back and pick out red sponges again
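The five steps above read naturally as a small state machine. Here is a minimal sketch in Python; the state names, event names, and transition table are my own invention for illustration, not the actual contest code:

```python
# Hypothetical state machine for the contest steps above.
# States and "events" (sensor outcomes) are invented for illustration.
TRANSITIONS = {
    ("follow_track", "end_of_track"):    "identify_sponge",   # step 2
    ("identify_sponge", "red_found"):    "pick_sponge",       # step 3
    ("identify_sponge", "no_red"):       "done",
    ("pick_sponge", "sponge_grabbed"):   "return_on_track",   # step 4
    ("return_on_track", "at_drop_tray"): "drop_sponge",
    ("drop_sponge", "dropped"):          "follow_track",      # step 5: go back for more
}

def run(events, state="follow_track"):
    """Feed a sequence of events through the machine; unknown
    (state, event) pairs leave the state unchanged."""
    for event in events:
        state = TRANSITIONS.get((state, event), state)
    return state
```

For example, one full pick-and-drop cycle ends back in `follow_track`, ready for another pass.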
That was the way the competition went - and as far as I recall, a private university won.
== winner is whoever has the most red sponges
The trick is that your robot's arm should not be clumsy - e.g. upsetting the tray of sponges or dropping the sponges while transporting them -
and the robot must also distinguish colors..
<this was done in Kenya :-) >
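To take up the challenge in the first reply: the pure waveform-threshold controller described there (colour = voltage band, collision = flat waveform, then run an action) could be sketched roughly as below. All voltage bands, the flatness tolerance, and the function names are my own made-up illustration values, not anything from the actual contest:

```python
# Hypothetical sketch of the sensor-threshold controller from the
# reply above. Every threshold here is invented for illustration.

def classify_colour(voltage):
    """Map a colour-sensor voltage (V) to a colour label.
    The bands are made-up example values."""
    if 0.5 <= voltage < 1.5:
        return "blue"
    if 1.5 <= voltage < 2.5:
        return "yellow"
    if 2.5 <= voltage < 3.5:
        return "red"
    return "unknown"

def is_collision(waveform, flatness_tol=0.05):
    """A 'flat' waveform (almost no variation) = we hit a flat object."""
    return max(waveform) - min(waveform) < flatness_tol

def step(colour_voltage, bumper_waveform):
    """One control tick: halt on collision, else pick red sponges."""
    if is_collision(bumper_waveform):
        return "halt_drive_motors"
    if classify_colour(colour_voltage) == "red":
        return "pick_sponge"
    return "keep_driving"
```

Whether this counts as "AI" or just sensor plumbing is exactly the question posed above - the logic is a handful of if-statements once the sensors have done their job.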