Project Log - Joel, Shengyang, Shao Xuan and Yu Hann

Last Updated by Sheng Yang on 7 June



Objectives:
  1. Tracking / Recognition / GPS / Autonomous
    • Installation of sensors (e.g. touch sensors, light sensors)
    • Mapping a designated arena and tasking the robot to complete tasks (e.g. flying and avoiding obstacles) within the area
    • May look into carrying of load
  2. Aerodynamics / Agility
    • Movement on land and water
    • Movement of the rotors (omniwheel)
    • Making the robot more organic (i.e. adding limbs)
    • Transformation of form
    • Explore changing the ways the robot takes off






Our progress thus far:

In General
  • Attended NI workshop on 23 May 2012
  • Picked up basic programming in NI LabVIEW and explored some possibilities for using LabVIEW in the AR.Drone and Intel Wireless projects with instructor Mr Kwok How

1-6 June 2012
  • Downloaded the NI LabVIEW software 30-day trial
  • Studied the AR.Drone LabVIEW toolkit and the thesis written on it (http://ardronelabviewtoolkit.wordpress.com/)
    Managed to create some programs in LabVIEW, though only in theory (untested on the drone)
  • [Problem] Could not upgrade the AR.Drone firmware to 1.4, so the AR.Drone could not connect to LabVIEW and the program could not be tested on the drone.
  • Dropped an email to NI (Mr Kwok How) to seek advice

7 June 2012
  • Went down to NI on 7 June 2012 to meet Mr Kwok How and discuss the AR.Drone
  • Updated the AR.Drone firmware from 1.2 to 1.4
  • Verified the connection between the computer and the AR.Drone (able to control the AR.Drone from software)
  • Talked about line following:
    • Orientation of the AR.Drone relative to the line (forwards/backwards is ambiguous because a line has two ends)
    • [Problem] Sharp turns and T-shaped junctions (which route should the drone take?)
  • Line/ object following:
    • Prerequisite: an obvious color difference between the background and the object/line
    • The AR.Drone first applies a 'color mask' that removes every color outside a chosen intensity range of, say, green (when tracking a green line or ball)
    • Every non-green pixel is then set to a value of 0, while green pixels get a value of 1, producing a binary image.
    • Next, a blob detector (http://en.wikipedia.org/wiki/Blob_detection) identifies the lump of green pixels (i.e. the ball), and its location is used as the point for the AR.Drone to follow.
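The mask-then-blob steps above can be sketched in Python with NumPy (the actual implementation is a graphical LabVIEW VI; the threshold values and function names here are made up for illustration):

```python
import numpy as np

def green_mask(rgb, g_min=100, other_max=80):
    """Binary mask: 1 where a pixel is 'green enough', 0 elsewhere.
    g_min/other_max are illustrative thresholds, not the toolkit's values."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return ((g >= g_min) & (r <= other_max) & (b <= other_max)).astype(np.uint8)

def blob_centroid(mask):
    """Centroid (row, col) of the masked pixels -- the point the drone steers toward."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None  # no green blob in view
    return float(ys.mean()), float(xs.mean())

# Tiny synthetic frame: a green 'ball' on an otherwise grey background.
frame = np.full((8, 8, 3), 60, dtype=np.uint8)
frame[2:4, 5:7] = (20, 200, 20)           # 2x2 green patch
m = green_mask(frame)
print(blob_centroid(m))                    # → (2.5, 5.5)
```

A real frame would need a proper blob detector (connected components) rather than a single global centroid, but for one well-separated target the centroid is enough to generate a steering point.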
  • Face detection:
    • The steps above are similar, except that chroma keying is used instead: the background is ignored (as with a green-screen backdrop for photos) and the AR.Drone focuses on red, the predominant color in human skin.
    • [Problem] Face detection can only be done in a controlled environment, to ensure that a bigger red 'blob' does not come into the AR.Drone's view.
    • [Problem] The camera has to be stable for faster image processing (a gyro-stabilized camera was considered).
    • A gyrocam is not feasible, as the camera cannot be isolated from the system; an 'eye' camera (a separate and detachable camera) would be more achievable.
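The red-dominance keying described above can be sketched the same way as the green mask (again a hypothetical NumPy illustration with guessed thresholds, not the LabVIEW code):

```python
import numpy as np

def skin_mask(rgb, margin=30, r_min=90):
    """1 where red clearly dominates green and blue (crude chroma key for skin).
    margin/r_min are illustrative tuning values."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return ((r >= r_min) & (r - g >= margin) & (r - b >= margin)).astype(np.uint8)

# A skin-toned 'face' patch on a green-screen style background.
frame = np.zeros((6, 6, 3), dtype=np.uint8)
frame[..., 1] = 200                        # green background everywhere
frame[1:4, 1:4] = (180, 120, 100)          # 3x3 skin-toned block
print(skin_mask(frame).sum())              # → 9
```

This also shows why the controlled-environment problem arises: any red object that passes the same dominance test produces an equally valid blob.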

13 June 2012
  • Succeeded in tracking colored objects
    • Moving up and down
    • Moving left and right
    • Yawing (circling the object)
    • Still working on distance perception for forward/backward movement (using the percentage of the screen occupied by the object to judge distance)
  • Working on facial recognition and tracking
  • Working on navigating a corridor (by detecting the vanishing point)
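The percentage-of-screen idea for forward/backward control could look something like this (a minimal sketch; the target fill fraction and deadband are invented tuning values, not numbers from the project):

```python
import numpy as np

def approach_command(mask, target_fill=0.05, deadband=0.01):
    """Crude forward/backward decision from the fraction of the frame the blob fills.
    target_fill and deadband are hypothetical tuning constants."""
    fill = mask.sum() / mask.size
    if fill < target_fill - deadband:
        return "forward"      # object looks small -> move closer
    if fill > target_fill + deadband:
        return "backward"     # object fills too much of the frame -> back off
    return "hover"            # within the deadband -> hold position

small = np.zeros((10, 10)); small[0, 0] = 1    # blob fills 1% of the frame
big = np.ones((10, 10))                        # blob fills 100% of the frame
print(approach_command(small), approach_command(big))  # → forward backward
```

The deadband keeps the drone from oscillating between forward and backward when the measured fill fraction hovers around the target.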