Engineering, Technology and Robotics

David Ball

(July 2008 - December 2011)

Introduction

My primary role has been to enable new science through collaboration, providing technology, systems, and mechatronics engineering leadership to the Thinking Systems team. In this role I work with the researchers to evaluate requirements and provide appropriate engineering planning and solutions. I also work on new mechatronics solutions for mobile robots, primarily focussed on building a new rat animat robot.

iRat (Intelligent Rat Animat Technology) (with Scott Heath, Janet Wiles)

The iRat (intelligent Rat animat Technology) is a rat animat robot designed as a tool for robotics and neuroscience teams studying navigation, embodied cognition, and neuroscience. The rat animat has capabilities comparable to the popular Pioneer DX robots but is an order of magnitude smaller in size and weight. The robot's volume is approximately 0.08 m², its mass is 0.5 kg, and it carries visual, proximity, and odometry sensors, a differential drive, a 1 GHz x86 computer, and an LCD navigation pad interface. To broaden the platform's value to a wider range of researchers, the robot supports the Player-Stage and Robot Operating System frameworks, and the C/C++, Python, and MATLAB APIs have been tested in real time. Several studies, described below, including two involving neural simulation for robot navigation, have confirmed the rat animat's capabilities.
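The iRat's wheel odometry can be consumed from any of the listed APIs. As a minimal, self-contained illustration of how differential-drive odometry is dead-reckoned into a pose (this is standard kinematics, not the iRat's actual driver code; all parameter values are illustrative):

```python
import math

def integrate_odometry(pose, d_left, d_right, wheel_base):
    """Dead-reckon a differential-drive pose from wheel displacements.

    pose: (x, y, theta) in metres and radians
    d_left, d_right: distance travelled by each wheel (metres)
    wheel_base: distance between the wheels (metres)
    """
    x, y, theta = pose
    d_centre = (d_left + d_right) / 2.0        # forward motion of the midpoint
    d_theta = (d_right - d_left) / wheel_base  # change in heading
    # Midpoint integration: adequate when updates are frequent.
    x += d_centre * math.cos(theta + d_theta / 2.0)
    y += d_centre * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + math.pi) % (2.0 * math.pi) - math.pi
    return (x, y, theta)

# Driving straight: both wheels advance 0.1 m, heading unchanged.
pose = integrate_odometry((0.0, 0.0, 0.0), 0.1, 0.1, 0.1)
```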

An industrial design company, Infinity Design, has styled the iRat’s cover to give a professional look suitable for commercialisation.

More information on the iRat.

Figure 1. (left) iRat prototype. (right) final iRat shown next to a computer mouse for scale

Three undergraduate thesis projects in 2010 and one in 2011 were based on the iRat.

  • Development of a whisker system to sense proximity and texture.
  • Development of a visual obstacle avoidance system.
  • Initial work on developing legs for a future quadruped version.
  • Development of a telerobot framework.

Ball, D., Heath, S., Wyeth, G.F., Wiles, J. (2010) iRat: Intelligent Rat Animat Technology, Proceedings of the 2010 Australasian Conference on Robotics and Automation (ACRA), Brisbane, Australia.

RatSLAM on the iRat (with Michael Milford and Gordon Wyeth)

This study ran RatSLAM on the iRat to map a figure-of-eight environment, demonstrating the capabilities of the iRat.

Figure 2. (left) RatSLAM experience map, (right) ‘non-conjunctive grid cells’ generated from the pose cells.

Ball, D., Heath, S., Milford, M.J., Wyeth, G.F., Wiles, J. (2010) A Navigating Rat Animat, 12th International Conference on the Synthesis and Simulation of Living Systems (Alife), Odense, Denmark.

Australian SLAM

In collaboration with UQ’s School of Journalism and Communication, an Australian road-movie set has been specially built. Using a new C++ version of the bio-inspired navigation system RatSLAM, the iRat builds a semi-metric topological map of the paths it travels around ‘Australia’. The robot uses the map to plan paths to reach goals.
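RatSLAM's experience map is a graph of experiences linked by learned transitions, so goal-directed planning reduces to a shortest-path search over that graph. A minimal sketch (Dijkstra over a toy graph; the waypoint names and transition costs are invented for illustration, and this is not the RatSLAM planner itself):

```python
import heapq

def plan_path(edges, start, goal):
    """Dijkstra shortest path over a weighted experience graph.

    edges: {experience: [(linked_experience, transition_cost), ...]}
    Returns the list of experiences from start to goal, or None.
    """
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, weight in edges.get(node, []):
            if neighbour not in visited:
                heapq.heappush(frontier, (cost + weight, neighbour, path + [neighbour]))
    return None

# Toy experience graph with two routes between invented waypoints.
graph = {
    'Brisbane': [('Sydney', 1.0), ('Adelaide', 5.0)],
    'Sydney': [('Melbourne', 1.0)],
    'Adelaide': [('Melbourne', 1.0)],
}
route = plan_path(graph, 'Brisbane', 'Melbourne')
```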

Figure 3. iRat’s view of the Australian environment.

Figure 4. (left) Australian map set with iRat near Melbourne. (Right) RatSLAM experience map.

A publication is in preparation for Robotics and Automation Magazine.

Spike Time Robotics (with Peter Stratton, Christopher Nolan)

In this study a spiking network controls the iRat in real time. The study demonstrates how the neural controller directs the rat animat’s movement towards temporal stimuli of the appropriate frequency using an approach based on Braitenberg Vehicles. The circuit responds robustly (after four cycles) when first detecting a light pulsing at 1 Hz, and rapidly (after one-to-three cycles) when primed by recent experiences with the same frequency. This study is the first to demonstrate a biologically-inspired spike-based robot that is both robust and rapid in detecting and responding to temporal dynamics in the environment. It provides the basis for further studies of biologically-inspired spike-based robotics.
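The crossed sensor-to-wheel wiring described above follows the classic Braitenberg scheme. As an illustration only, here is a rate-coded sketch of the crossed connections, with the spiking resonant circuits abstracted into scalar sensor responses (the gains and speeds are invented, not the study's parameters):

```python
def braitenberg_step(left_response, right_response, base_speed=0.1, gain=0.5):
    """Crossed excitatory wiring (Braitenberg vehicle 2b): each sensor's
    response drives the opposite wheel, steering the robot toward the
    stimulus. Responses stand in for the output of the resonant circuits."""
    left_wheel = base_speed + gain * right_response
    right_wheel = base_speed + gain * left_response
    return left_wheel, right_wheel

# Stimulus resonating mainly in the left circuit: the right wheel spins
# faster, so the robot turns left, toward the flashing light.
wheels = braitenberg_step(left_response=1.0, right_response=0.0)
```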

Figure 5. Rat animat location and spiking network output while tracking a 1Hz flashing stimulus. (top left) Rat animat showing two light sensors, their respective resonant circuits and crossed connections to the wheels. (top middle) Tracking camera view. (top right) Tracking data showing three trials, first with the robot directly facing the flashing stimulus, then rotated approximately 45° to the left and right. (bottom) Left and right sensor responses (see text for details).

Wiles, J., Ball, D., Heath, S., Nolan, C., Stratton, P. (2010) Spike-Time Robotics: A Rapid Response Circuit for a Robot that Seeks Temporally Varying Stimuli, In 17th International Conference on Neural Information Processing (ICONIP).

Blind Bayes

Real rodents can maintain localisation even when visually deprived. This study investigates maintaining localisation using only sensory information about walls in close proximity.
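The idea of correcting drifting odometry with wall contacts can be illustrated with a one-dimensional histogram (Bayes) filter. This is not the study's model, just the standard predict/correct cycle with a contact-only sensor (the cell count, slip probability, and false-positive rate are invented):

```python
def predict(belief, p_stay=0.2, p_step=0.8):
    """Motion update: the robot intends to move one cell toward the far wall,
    but slips and stays put with probability p_stay; the wall stops it."""
    n = len(belief)
    new = [0.0] * n
    for i, p in enumerate(belief):
        new[i] += p * p_stay
        new[min(i + 1, n - 1)] += p * p_step
    return new

def correct(belief, at_wall, false_rate=0.05):
    """Measurement update: the contact sensor should fire only at the far wall."""
    n = len(belief)
    weighted = [p * (1.0 if (i == n - 1) == at_wall else false_rate)
                for i, p in enumerate(belief)]
    total = sum(weighted)
    return [p / total for p in weighted]

belief = [1.0 / 5] * 5            # lights out: uniform over five cells
for _ in range(3):
    belief = predict(belief)      # odometry alone spreads and drifts the belief
belief = correct(belief, at_wall=True)   # a bump collapses it onto the wall cell
```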

Figure 6. iRat global pose tracked overhead (blue line), integrated wheel odometry (red line) and sensory wall measurements (green crosses). The goal is to maintain accurate global pose by resetting the iRat’s internal map position when colliding with the wall.

Cheung, A., Ball, D., Milford, M., Wyeth, G.F., Wiles, J.  Blind Bayes in a Box: Rat and Robot Navigation in the Dark (in preparation)

Telerobotics (with Scott Heath and Angus Cummings)

A telerobot is a telepresence device that allows a user to interact with a robot over the internet. There are few active public telerobots on the internet. In general, direct control of the robot is not appropriate because of communication lag and the ratio of users to robots. The RatSLAM navigation system instead allows the user to control the robot indirectly by setting navigation goals. A generic streaming and interaction module has been written for the Apache web server, which communicates with the robot and with an Adobe Flash client in the user’s browser. The current implementation lets the user set navigation goals while the robot’s camera view, internal (RatSLAM experience) map, and other state are streamed.

Figure 7. (left) The telerobot architecture showing the connection between the pioneer, the webserver and the user’s client. (middle) The flash client interface showing the robot’s camera view, map, battery charge level, etc. Users can click on the map to add navigation goals (shown as a green dot) (right) Final mobile phone version of the website.

Heath, S., Cummings, A., Wiles, J., Ball, D. (2011) A Rat in the Browser, Proceedings of the 2011 Australian Conference on Robotics and Automation.

RatSLAM – MATLAB and C++ version (with Scott Heath, Michael Milford)

There were several motivations for a lightweight version of RatSLAM:

  • one that could be released publicly,
  • one that SLAM researchers could use on their own datasets, and
  • one that serves as a tool for understanding how the RatSLAM algorithm works.

MATLAB was chosen for the publicly released version due to its built-in functions for matrix operations, graphing, and loading video. The result is a lightweight implementation that clearly shows the major functionality of RatSLAM, including View Templates, Pose Cells, and the Experience Map. The core parts of RatSLAM have also been written in C and are loaded as a DLL, which allows much faster processing. A module is also available to process overhead images to track global pose. The code supports the following inputs:

  • a combination of video and wheel odometry from files (such as recorded from a robot),
  • video alone, with visual odometry (such as recorded from a moving platform like a car), or
  • real-time closed-loop control of a robot using Player-Stage.
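Of the three components, the View Template mechanism is the simplest to sketch: the current (normalised) view is compared against stored templates, and a new template is learned when no match is close enough. A minimal illustration, not the released code (the threshold value and mean-absolute-difference comparison are representative choices, not the actual parameters):

```python
def match_view(current, templates, threshold=0.1):
    """Match the current normalised view against stored templates using mean
    absolute difference; learn a new template when nothing is close enough.
    Returns the index of the matched (or newly learned) template."""
    best_idx, best_err = None, float('inf')
    for i, template in enumerate(templates):
        err = sum(abs(a - b) for a, b in zip(current, template)) / len(current)
        if err < best_err:
            best_idx, best_err = i, err
    if best_err > threshold:            # novel scene
        templates.append(list(current))
        return len(templates) - 1
    return best_idx

templates = []
first = match_view([0.1, 0.2, 0.3], templates)   # learned as template 0
again = match_view([0.1, 0.2, 0.3], templates)   # recognised as template 0
```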

The code for the offline versions is online at http://ratslam.itee.uq.edu.au/ along with two datasets: a partial St Lucia suburb video and a partial Axon level 5 video.

A C++ version that runs significantly faster has been developed.

Figure 8. St Lucia (left) and Axon level 5 (right) mapped using visual odometry with the MATLAB version of RatSLAM.

The code is available online at http://code.google.com/p/ratslam/ under a GPL licence.

Omni-directional drive robot platform (with Chris Lehnert)

This project designed a highly mobile robot research platform that can traverse typical office environments and carries substantial onboard computational resources. A novel spherical drive mechanism has been designed and tested. Its advantages are continuous contact with the ground plane, which reduces vibration, and isotropic rotational characteristics that improve traversal. This year the goal is to incorporate the drive system into a complete robot platform.
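The spherical mechanism itself is mechanical, but the control side of any omnidirectional base is a mapping from a desired body velocity to individual drive speeds. A generic sketch for a conventional three-wheel omnidirectional layout (not the spherical drive's actual geometry; the radius and wheel angles are illustrative):

```python
import math

def omni_wheel_speeds(vx, vy, omega, radius=0.15,
                      wheel_angles=(0.0, 2 * math.pi / 3, 4 * math.pi / 3)):
    """Map a desired body velocity (vx, vy, omega) to drive speeds for a
    three-wheel omnidirectional base. Each wheel's drive axis is tangential
    to a circle of the given radius at its placement angle."""
    return [-math.sin(a) * vx + math.cos(a) * vy + radius * omega
            for a in wheel_angles]

# Pure rotation: all three wheels run at the same speed.
spin = omni_wheel_speeds(0.0, 0.0, 1.0)
```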

Figure 9. (left) Prototype omnidirectional drive robot. (right) A CAD drawing of the spherical mechanism.

Ball, D., Lehnert, C., Wyeth, G.F. (2010) A Practical Implementation of a Continuous Isotropic Spherical Omnidirectional Drive, Proceedings of the International Conference on Robotics and Automation (ICRA), Anchorage, Alaska.

Insect environment replication - Multi-monitor world (with Tien Luu, Allen Cheung, Gavin Taylor)

Neuroscientists investigate honeybee flight behaviour using visual stimuli. Initial investigations demonstrated that moving an object on an LCD past a bee changed the angle of the bee’s abdomen. This prompted the development of a setup that renders a virtual world on monitors surrounding the honeybee. My solution was a multithreaded C++ DirectX application that renders a viewport on each monitor. Objects can be rendered relative to world coordinates or to the camera. The system works with up to six monitors at 1920 × 1200 pixels and 60 Hz. A Python interface was added so that the biologists could adjust major settings such as the number of windows, the textures and planes, and the position and velocity of the camera and the platform. A LEGO platform was constructed that can raise and lower on command from the Python script.
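Rendering one viewport per monitor amounts to giving each monitor its own camera orientation around the bee. A small sketch of the per-monitor yaw calculation, assuming edge-to-edge monitors that each span one horizontal field of view (the 60° value is illustrative, not the actual rig's geometry):

```python
def viewport_yaws(n_monitors, fov_deg=60.0):
    """Yaw (degrees) of each monitor's virtual camera for a surround display,
    assuming the monitors sit edge-to-edge and each spans one field of view."""
    centre = (n_monitors - 1) / 2.0
    return [(i - centre) * fov_deg for i in range(n_monitors)]

# Four monitors: cameras face -90, -30, +30 and +90 degrees around the bee.
yaws = viewport_yaws(4)
```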

Figure 10. (left) Four monitors with a moving scene surrounding the bee tether and LEGO platform. (right) The LEGO platform from another angle.

Luu, T., Cheung, A., Ball, D., Srinivasan, M.V. (2011) Honeybee flight: A novel streamlining response, The Journal of Experimental Biology.

Rodent Electrophysiology
Digital Wireless Neural Telemetry Phase One (with Ryan Wong, Francois Windels)

Typically, electrophysiologists tether the rodent to their neural recording equipment. A wireless neural recording system would allow for: larger and more complex environments, social interaction studies, and outdoor recordings. Existing wireless rodent recording systems can be classified by the number of channels, recording rate, bit precision, continuous streaming versus spike only, size, weight, battery life, and cost.

A prototype has been developed that continuously records 16 channels digitally at 20 kHz with 8-bit precision, weighs less than 50 grams, and runs for over one hour. The prototype has recorded spikes from a freely behaving rodent with results comparable to a tethered system.
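The quoted figures imply the raw data rate the radio link must sustain; a quick calculation from the numbers above:

```python
def telemetry_bits_per_second(channels=16, sample_rate_hz=20_000, bits=8):
    """Raw throughput for continuous streaming: channels x rate x precision."""
    return channels * sample_rate_hz * bits

rate = telemetry_bits_per_second()           # 2,560,000 bit/s = 2.56 Mbit/s
megabytes_per_hour = rate * 3600 / 8 / 1e6   # ~1152 MB of raw data per hour
```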

The wireless module is enabling a novel experiment, which is currently in progress.

Figure 11. (left) Digital wireless neural recording system architecture. (right) Data recorded using the digital wireless neural recording system from a freely behaving rodent showing a neuron spike.

A publication is planned, dependent on results.

Digital Wireless Neural Telemetry Custom ICs (with Tara Hamilton, Bala Thanigaivelan)

Work has begun on the next-phase module, which uses a custom IC to replace the analogue components that make up over 50% of the module's area and power. The IC is a challenge in itself due to the high gain requirements. The fabricated ICs have amplifier issues; however, testing is ongoing.

In the meantime, a new version of the wireless module with all the electronics mounted on the rat's head is under development and will be finished in late December 2011.

Thanigaivelan, B., Ball, D., Wiles, J., Hamilton, T. (2010) An 8-channel neural recording system with programmable gain and bandwidth, 2nd Asia-Pacific Signal and Information Processing Association (APSIPA).

Invited Presentations and Workshops (with Janet Wiles)

  • Presentation – A Rat Animat at the James S. McDonnell Foundation Adult Neurogenesis Consortium Meeting, San Diego, USA in May 2010
  • Workshop – A Rat Animat at the Temporal Dynamics and Learning Centre, San Diego, USA in May 2010
  • Presentation - Thinking Systems: Engineering “collaborations between engineers and neuroscientists” at the James S. McDonnell Foundation Adult Neurogenesis Consortium Meeting, San Diego, USA in May 2009

Lecturing (with Ben Upcroft)

  • In semester 2, 2010 I was the course coordinator for METR4202 – Advanced Control and Robotics (65 students) and co-lecturer for METR3800 – Mechatronics Design Project (30 students).
  • In semester 2, 2011 I was the course coordinator for METR4202 – Advanced Control and Robotics (94 students).

PhD

An adaptive agent improves its performance by learning from experience. This paper describes an approach to adaptation based on modelling dynamic elements of the environment in order to make predictions of likely future state. This approach is akin to an elite sports player being able to “read the play”, allowing for decisions to be made based on predictions of likely future outcomes. Modelling of the agent’s likely future state is performed using Markov Chains and a technique called “Motion and Occupancy Grids”. The experiments in this paper compare the performance of the planning system with and without the use of this predictive model. The results of the study demonstrate a surprising decrease in performance when using the predictions of agent occupancy. The results are derived from statistical analysis of the agent’s performance in a high fidelity simulation of a world leading real robot soccer team.

Ball, D., Wyeth, G.F. (2008) Reading the Play – Adaptation by Prediction of Agent Motion, Proceedings of the 2008 Australasian Conference on Robotics and Automation (ACRA), Canberra, Australia.
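The "Motion and Occupancy Grid" idea can be sketched as repeatedly pushing an occupancy distribution through a Markov transition kernel. This is an illustration of the principle only, not the thesis code (the grid size and motion probabilities are invented):

```python
def predict_occupancy(grid, motion, steps=1):
    """Propagate an occupancy distribution for another agent through a
    Markov motion model.

    grid: occupancy probabilities along a 1-D strip of the field
    motion: {cell_offset: probability} transition kernel (sums to 1)
    """
    n = len(grid)
    for _ in range(steps):
        new = [0.0] * n
        for i, p in enumerate(grid):
            for offset, q in motion.items():
                j = min(max(i + offset, 0), n - 1)   # clamp at the field edges
                new[j] += p * q
        grid = new
    return grid

# Opponent observed at cell 2 of 5; it tends to advance one cell per step.
occupancy = predict_occupancy([0.0, 0.0, 1.0, 0.0, 0.0], {0: 0.3, 1: 0.7}, steps=2)
```

A planner can then weight candidate actions by the predicted occupancy of the cells they pass through, rather than by the agent's last observed position.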

Student Scholars

The scholarships allowed us to:

  • give the students the opportunity to experience a research environment,
  • mentor and evaluate students for future higher research with the Thinking Systems group, and
  • achieve research support outcomes.

Many students have gone on to do undergraduate theses, and some on to PhDs with us. The following lists the projects and students; those I supervised directly are underlined.

Summer 2008-2009

  • SS08-01 - Omni-directional drive robot research platform (Robotics) – Chris Lehnert
  • SS08-02 - Digital wireless telemetry for neural recording (Electronics, Signal Processing) – Ryan Wong
  • SS08-03 - Interactive web interface for robots (Robotics, Networking) – Scott Heath
  • SS08-04 - Environment replication for insects (Mechatronics, 3D Graphics) – Gavin Taylor
  • SS08-06 - Large Text Corpus (General) – Andrew Jones
  • SS08-09 - Aggressive Bee Model (Computational Modelling) (QBI) – Timothy Mews
  • SS08-10 - Enriching social interactions for robot language games (Software, agent-based modelling) – Elizabeth Alpert
  • SS08-12 – Rodent Neural Recordings (Electrophysiology) (QBI) – Robert Ninness

Winter 2009 Students

  • WS09-01 - Mobile Robot Sensor System (Robotics) – Ezra Zigenbine
  • WS09-06 - Modelling Spiking Neurons (Modelling) – Nabeelah Ali
  • WS09-07 - Deconstructing a Squiggle (Modelling) – Benjamin Sinclair
  • WS09-10 - Enriching social interactions for robot language games (Software, agent-based modelling) - Elizabeth Alpert

Summer 2010 Students

  • SS09-01 - Omni Robot - Docking and Power System (Robotics) – Kieran Wynn
  • SS09-02 - Omni Robot - Omni Vision System (Software) – Joel McGrath
  • SS09-03 - Omni Robot - Drive and Sensor System (Electrical) – Jessica Wrigley and Ezra Zigenbine
  • SS09-04 - Omni Robot - Navigation and Behaviours (Mechatronics or Software) – Hilton Bristow
  • SS09-06 - Robot Rat (Mechatronics) – Scott Heath
  • SS09-07 - Rodent Whisking (Mechatronics) – Nick Calver
  • SS09-09 - Automatic Behavioural Analysis (Software) – Tim Martin
  • SS09-10 - Navigation for a High Speed Outdoor Robot (Robotics) – Daniel Clarke
  • SS09-11 - Boxed In (Modelling) – Kieran McLean
  • SS09-18 - Adding objects to spatial language games (AI) - Elizabeth Alpert

Summer 2011 Students

  • SS10-01 – iRat on the Web – Telerobot + iRat + RatSLAM (software) – Angus Cummings
  • SS10-02 – Kangaroo Legs – Mechanical design (mechanical) – James Wall
  • SS10-03 – Kangaroo Legs – Force Control (electrical) – Ricky Chow
  • SS10-04 – Kangaroo Legs – Adaptive Learning (software) – Patrick Ross

Conversion

Scott Heath, Chris Lehnert, Gavin Taylor and (future) Patrick Ross have gone on to start postgraduate research with our group at either UQ or QUT.

Employment

I have employed a number of RAs and research scholars supporting the various postdocs and themes, as follows.

  • Robert Ninnes: Assists Francois Windels by building microdrives and other associated electrophysiology equipment.
  • Justin Cappadonna: Assists Tien Luu by performing Bee electrophysiology experiments.
  • Daniel Clarke: Assisted Oliver Baumann by adapting the fMRI barrel world to add interference.
  • Jack Valmadre: Is assisting Oliver Baumann by developing a new fMRI world to investigate head direction in humans.
  • Scott Heath: iRat software platform.

Future

I have accepted a position at the Queensland University of Technology (80% postdoctoral research, 20% academic), working with Peter Corke on robots for agriculture. This is part of a new, rapidly growing robotics group focussing on robots for the real world.