
Build an Autonomous Car with RPi, NAVIO2 and Tensorflow/Keras, Part II: The Software

In the previous post I outlined the hardware build of a “Robocar”, a simple autonomous car platform using monocular vision and Deep Learning, built from a small RC car with a few modifications. That post focused exclusively on the hardware. If you’ve followed its directions, you should be able to customize your RC car with a simple wooden or plastic platform, a Raspberry Pi, a camera and a PWM HAT [1] that can control a motor and a servo. For my build I also added an RC receiver, since my NAVIO2 HAT supports decoding of SBUS and PPM signals out of the box. However, this is optional, and there are many ways to control your car, depending on what you have available (WiFi, for instance).

Even though the hardware is essential to a functioning autonomous robocar, at its heart it is the software and the algorithms that enable autonomy. In this post we will focus on building a simple software stack on the Raspberry Pi that can control the steering of an autonomous vehicle using a Convolutional Neural Network (CNN).

Background and Aims

Let us elaborate on the background and our goals a bit. As mentioned earlier, the aim of this project is to build a car that can navigate itself around a course using vision, and vision alone. Not only that, but decision-making regarding steering and throttle happens entirely within a single CNN, which takes the image as input and outputs values corresponding to steering and throttle [2]. This type of decision-making is known in machine learning as an “end-to-end” approach: information comes in raw at the input, and the desired value is presented at the output. The neural network needs to infer suitable decision-making procedures as part of its training. End-to-end training is just one of a number of different approaches for autonomous vehicles. Another popular one is the so-called “robotics” approach, where a suite of different sensors (vision included) are “fused” together algorithmically to produce a map of the vehicle’s surroundings and localize the vehicle within it. Decision-making then takes place as a separate step, and sometimes consists of hand-coded conditions and actions.

This blog post is not the place to debate the merits of one approach versus the other; the truth may lie in a compositional approach, for all we know [3]. Taking into account, however, the simplicity of this project and its DIY roots, as well as the recent leaps in self-driving vehicles achieved by end-to-end neural network approaches, I feel it’s worth a try. And so did quite a few people, including the Donkey team, whose default CNN model and pieces of code we’ll be using in this build.
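
To give a concrete feel for what such a network looks like, here is a minimal Keras sketch in the spirit of the Donkey default model. The layer sizes, input resolution and loss below are my assumptions for illustration; consult the Donkey repo for the actual architecture.

    # A minimal end-to-end CNN sketch in Keras, loosely following the
    # Donkey default model. Layer sizes and the 120x160 input are
    # illustrative assumptions, not the exact Donkey configuration.
    from keras.models import Model
    from keras.layers import Input, Conv2D, Flatten, Dense, Dropout

    def build_model(input_shape=(120, 160, 3)):
        img_in = Input(shape=input_shape, name='img_in')
        x = Conv2D(24, (5, 5), strides=(2, 2), activation='relu')(img_in)
        x = Conv2D(32, (5, 5), strides=(2, 2), activation='relu')(x)
        x = Conv2D(64, (5, 5), strides=(2, 2), activation='relu')(x)
        x = Conv2D(64, (3, 3), strides=(2, 2), activation='relu')(x)
        x = Conv2D(64, (3, 3), strides=(1, 1), activation='relu')(x)
        x = Flatten()(x)
        x = Dense(100, activation='relu')(x)
        x = Dropout(0.1)(x)
        x = Dense(50, activation='relu')(x)
        x = Dropout(0.1)(x)
        # Two regression heads: one for steering angle, one for throttle.
        angle_out = Dense(1, name='angle_out')(x)
        throttle_out = Dense(1, name='throttle_out')(x)
        model = Model(inputs=[img_in], outputs=[angle_out, throttle_out])
        model.compile(optimizer='adam', loss='mse')
        return model

Training then amounts to feeding recorded camera frames as inputs and the human driver’s steering and throttle values as targets.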

Installation

This car build uses the Burro autonomous RC car software, freely available on Github. Burro is an adaptation of Donkey for the NAVIO2 HAT. While it borrows a lot of features from Donkey, Burro has a number of significant differences:

  • There is no separate server instance; all telemetry is served by an onboard web socket server
  • An RC receiver (SBUS) or a gamepad (Logitech F710) is used to control the car
  • It is adapted for use with (and requires) the NAVIO2 board’s RC decoder, PWM generator and IMU (gyroscope)

Currently Burro requires a Raspberry Pi 2 or 3 board with the NAVIO2 HAT. Before proceeding with the installation of Burro, you will need to have a working EMLID image installation; the latest version is strongly recommended. Please make sure you follow the instructions in the relevant EMLID docs.

Once this is complete, ssh to your RPi, which by default should be reachable at navio.local if you are using the EMLID image.

wget the Burro install script
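
The URL below is an assumption on my part (it assumes the script sits at the root of the Burro repository); double-check against the repo before running:

    # URL assumed; verify against the Burro GitHub repository
    wget https://raw.githubusercontent.com/yconst/burro/master/install-burro.sh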

change permissions and run it
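
For example:

    chmod +x install-burro.sh
    ./install-burro.sh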

This will install all required libraries, create a virtual environment, clone the Burro repo and set it up, and create symlinks for you. After successful completion, you end up with a fully working installation.

A warning: some steps of this script can take a significant amount of time, especially the numpy pip install, which is needed due to library incompatibilities with the apt-get versions. Total installation time should be around 30 minutes. To ensure that your installation is not interrupted midway, make sure that you power your Pi from either a supply that can deliver at least 5V/2A, or a fully charged power bank or LiPo of sufficient capacity.

Configuring

I am using the software with a Turnigy mini-trooper 1/16 RC car. If you have the same car, you only need to change your RC channels if necessary. The RC input channels are as follows: 0 – Yaw (i.e. steering), 2 – Throttle, 4 – Arm. Yaw and throttle are configurable via config.py, but Arm is hardwired to channel 4. Each time the RC controller is armed, a neutral point calibration is performed, so you only need to make sure that your sticks are centered before arming the car.

By default Burro outputs throttle on channel 2 of the NAVIO2 rail, and steering on channel 0. You may wish to change this.

You may also wish to configure the throttle threshold value above which images are recorded.
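
As a rough illustration, the relevant settings in config.py might look something like the following; the option names here are placeholders of mine, so check Burro’s actual config.py for the real keys:

    # Illustrative config.py values; option names are placeholders,
    # not necessarily Burro's actual keys.
    YAW_INPUT_CHANNEL = 0            # RC input channel for steering
    THROTTLE_INPUT_CHANNEL = 2       # RC input channel for throttle
    STEERING_OUTPUT_CHANNEL = 0      # NAVIO2 rail channel driving the servo
    THROTTLE_OUTPUT_CHANNEL = 2      # NAVIO2 rail channel driving the ESC
    RECORD_THROTTLE_THRESHOLD = 0.2  # record images above this throttle value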

See the Readme in the Burro repo for more instructions on how to edit your configuration.

Testing

After installation and configuration is complete, you should be able to drive your car around, either using the manual controls, or using a mix of CNN for steering and manual controls for throttle. Automatic throttle control is not yet available, but it will be in a future version.

To start a Burro instance, first ssh to your RPi, if you haven’t done so already:
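
If I recall correctly, the default user for the EMLID image is pi (password raspberry):

    ssh pi@navio.local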

Then type the following, from the place where your install-burro.sh script was located:
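
The exact entry point may differ between Burro versions; assuming the installer cloned the repo into a burro/ directory next to the script, something along these lines should start the vehicle loop (activate the installer’s virtualenv first, if needed):

    # Paths assumed; check the Burro README for the exact command
    cd burro
    python burro/drive.py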

Drive it!

Point your browser to your RPi address (http://navio.local by default for the EMLID image) and the telemetry interface will come up. Choose your driving mode based on your controller. The default uses the F710 gamepad for steering and throttle. There are options for RC, gamepad, and mixed RC+CNN and gamepad+CNN driving, where the CNN controls the steering and you control the throttle. Autonomous throttle control is not yet implemented in Burro.

Here’s a video from a Burro car running in Mixed mode:

Next Steps

I like to think of the Burro project as part of the lively Donkey community, since it was spun out of Donkey after all. As such, it is worth taking a look at the many resources created by the Donkey developers.

If you’re interested in the development of autonomous small-scale vehicles, you may wish to join the Donkey Slack community by requesting an invite.

Conclusion

This is the second post in a series on building a small-scale autonomous vehicle that uses vision alone and end-to-end machine learning for control and navigation; this installment focused on the software. The Burro software was briefly presented, together with installation instructions.

Autonomous vehicles are a very young and promising field of AI, and we will certainly be seeing very interesting competition in the near future.

  1. Like this one, for instance, used by Donkey
  2. The values can in fact be either continuous or categorical
  3. There is a quote from James McClelland suggesting that science is best served by pursuing integrated accounts that span multiple levels of analysis simultaneously.

7 Comments

  1. Pingback: An Experiment On Small Vehicle Transmission – Unmanned Build

  2. Krst
     

    Hi, many thanks for your fork with Navio2. Did initial trials with my 1:10 car and also my Mini-Z.

    I ran into a problem, however: the servo turns in the wrong direction on one of my cars (the controller thinks it turns left but the wheels go right, and vice versa), and I can't figure out how to change this in the code. Do you have any suggestions? I thought one could switch the max and min values in drive.py, but that had no effect...

    Thanks in advance.

    • yconst
       

      Hi,

      This is because apparently some servos are either mounted opposite with respect to the steering mechanism, or have different endpoints. Currently the easiest way to fix this is to edit the file drive.py inside the burro/ directory. Change line 111 to `pwm_val = 1.5 - value * 0.5`. Make sure your tabs are ok, and that should do it.

    • yconst
       

      There is also the chance that your RC controller may need the yaw channel to be reversed. I'm planning to provide configuration options for all these, btw.

      Oh, and if you have any videos with your setup, do share!

      • krstec83
         

        Thanks, that solved it of course. Just first steps today; I had some ESC issues with my car. I am not there yet: I'm getting some angle errors on boot, which is why I am now reinstalling the image from scratch.

        So while I can drive it and teach it, I have yet to get some decent autonomous drives on my circuit, which is tape on the floor.

        There is a picture of the car at the link https://ibb.co/bu3qnk; it is a custom rebuilt Mini-Z buggy with a separate ESC and servo board. Although it looks very cramped (it is), it drives very well and is 4WD. I'm getting some jitter on the servo, something strange with the Navio2 PWM.

        Is it possible, I assume, to load several car instances, i.e. change from the default setup? That way one could teach the car different things in different sessions. Where would one then change the vehicle in the code?

        I am rather a novice at this, as you can tell! Compared to the Donkey car, how much benefit is there to using the Navio's IMU, in your experience? I assume it is great to be able to counter drift, but have you seen any other remarkable benefits?

        • yconst
           

          Hey, looks nice. One suggestion I have: get the wide-angle camera. It's a $14-15 extra cost, but the default camera just has too narrow a field of view to capture nearby features (lanes etc.). If you use the default camera, you could try raising its position and angle.
          Not sure about your second point...
          Re. the gyro drift correction: this is just an adjustment happening post-steering, mainly to keep the car from slowly drifting due to miscalibration. I should add an integral (I) component as well to make this work better, but even now it works ok imo. It doesn't affect training or inference of the neural net though.

  3. Pingback: First Autonomous Indoor Laps – No Tracklines! – Unmanned Build
