F14: Collision Avoidance Car



Abstract

The inspiration behind the Collision Avoidance Car project comes from the state-of-the-art field of self-driving cars. All major automotive companies are investing heavily in autonomous car technology. One of the more prominent autonomous cars being developed is the Google Car, which uses Lidar and photo-imaging technology to implement autonomy. The goals of the self-driving car are to reduce gridlock, to eliminate traffic fatalities, and, most importantly, to eliminate the monotony of driving. This project explores the fundamentals of Lidar and how this technology is being used in cutting-edge products such as the Google Car.

Objective & Scope

The project objective was to use Lidar (a laser-based distance measurement technology), coupled with a toy car, to autonomously detect and avoid obstacles. The car operates in two modes: automatic and manual. In automatic mode, the car maneuvers around autonomously, avoiding obstacles to the front, rear, and sides. Whenever an obstacle is detected, the car maneuvers in the opposite direction, as long as that direction is also free of obstacles. In manual mode, the car's movements are controlled via a Bluetooth connection. In addition to direction controls, a user is able to adjust the speed of the car and the distance at which obstacles are avoided.

Team Members & Roles

Eduardo Espericueta - Lidar Unit Integration

Sanjay Maharaj - Hardware Integration & System Wiring

George Sebastian - Software Infrastructure (tasks / movement logic) & Bluetooth Integration & Lidar

Introduction

The RC car that was purchased allowed for easy modifications to accommodate all of the components that were added. It is important to choose an RC car with enough room to package the components so that they are not prone to damage from the car crashing into walls during testing. The Cadillac Escalade RC car purchased from Amazon was particularly easy to work with, as it had DC motors both for driving the rear wheels and for steering left or right. It also had LEDs, a switch, and a battery pack that could easily be repurposed for the project. All of these components were wired to a PCB, which was removed from the RC car once the wires were cut. Both DC motors worked well with a +/- 7 volt supply.

A Bluetooth module interfaced to the SJ-ONE board over UART allowed for serial terminal emulation through PC or smartphone applications. This enabled commands to be sent to the vehicle wirelessly and on the fly.

The LIDAR unit was purchased as a spare part on eBay and did not come with any documentation, since it is not designed to be used outside of the Neato vacuum it ships with. However, a website of resources dedicated to hacking the Neato XV-11 vacuum and its LIDAR unit can be found at: https://xv11hacking.wikispaces.com/
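
For reference, below is a minimal sketch of how one Lidar data frame might be decoded, following the packet layout documented on the xv11hacking wiki (a 0xFA start byte, an index byte covering four angles per 22-byte packet, and per-reading distance bytes carrying an "invalid data" flag). The lidar_read_byte() helper is a placeholder for the project's actual UART read, not part of any library.

    #include <cstdint>

    // Placeholder for a blocking read of one byte from the LIDAR's UART output.
    extern uint8_t lidar_read_byte();

    // Distance (in mm) for each of the 360 degrees; 0 means "no valid reading yet".
    volatile uint16_t distance_mm[360] = {0};

    // Decode one 22-byte XV-11 frame:
    // [0xFA][index][speed_lo][speed_hi][4 x 4-byte readings][checksum_lo][checksum_hi]
    // Each reading: dist_lo, flags/dist_hi (bit 7 = invalid, bit 6 = strength warning),
    // strength_lo, strength_hi.
    void lidar_decode_one_frame()
    {
        while (lidar_read_byte() != 0xFA) { }        // Wait for the start byte.

        uint8_t index = lidar_read_byte();           // 0xA0..0xF9 -> packet 0..89
        if (index < 0xA0 || index > 0xF9) {
            return;                                  // Out of sync; drop this frame.
        }
        uint16_t base_angle = (index - 0xA0) * 4;    // First of this packet's four angles.

        lidar_read_byte();                           // Motor speed LSB (unused here).
        lidar_read_byte();                           // Motor speed MSB (unused here).

        for (int i = 0; i < 4; ++i) {
            uint8_t b0 = lidar_read_byte();          // Distance bits [7:0]
            uint8_t b1 = lidar_read_byte();          // Flags + distance bits [13:8]
            lidar_read_byte();                       // Signal strength LSB (unused here).
            lidar_read_byte();                       // Signal strength MSB (unused here).

            if ((b1 & 0x80) == 0) {                  // Keep only readings without the invalid flag.
                distance_mm[base_angle + i] = ((b1 & 0x3F) << 8) | b0;
            }
        }

        lidar_read_byte();                           // Checksum LSB (not verified in this sketch).
        lidar_read_byte();                           // Checksum MSB.
    }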

Design Implementation

CMPE146 F14 LidarGroup SoftwareStateMachine2.png

The state machine above shows, at a high level, how the autonomous car works. The car's software infrastructure is made up of four tasks: the motor task, the steering task, the Lidar task, and the Bluetooth task.


The Bluetooth task is in charge of receiving commands from a computer or a phone to control the car; a minimal command-handling sketch is shown after the list below. The supported Bluetooth commands include:

  • Switch between manual and automatic driving mode.
  • Steering and movement (for use in manual mode).
  • Movement speed adjustments (faster or slower).
  • Steering speed adjustments (faster or slower).
  • Change threshold values used in logic for dynamic calibration.
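
As an illustration only, the sketch below shows one way such commands could be dispatched onto shared state read by the other tasks. The single-character command codes, the bt_read_char() helper, and the variable names are assumptions for this sketch, not the project's actual protocol.

    #include <cstdint>

    // Placeholder for a blocking read of one character from the HC-05 UART.
    extern char bt_read_char();

    // Shared state read by the motor and steering tasks (illustrative names).
    volatile bool automatic_mode     = true;
    volatile int  drive_command      = 0;     // +1 forward, -1 reverse, 0 stop
    volatile int  steer_command      = 0;     // +1 right, -1 left, 0 straight
    volatile int  speed_percent      = 50;    // PWM duty for the drive motor
    volatile int  obstacle_threshold = 300;   // mm; distance at which obstacles are avoided

    // One iteration of the Bluetooth task: map a received character onto the shared state.
    void bluetooth_task_step()
    {
        switch (bt_read_char()) {
            case 'a': automatic_mode = true;  break;   // switch to automatic mode
            case 'm': automatic_mode = false; break;   // switch to manual mode
            case 'f': drive_command = +1;     break;   // forward (manual mode)
            case 'b': drive_command = -1;     break;   // reverse (manual mode)
            case 'l': steer_command = -1;     break;   // steer left (manual mode)
            case 'r': steer_command = +1;     break;   // steer right (manual mode)
            case 's': drive_command = 0; steer_command = 0; break;              // stop
            case '+': if (speed_percent < 100) speed_percent += 10; break;      // faster
            case '-': if (speed_percent > 0)   speed_percent -= 10; break;      // slower
            case '>': obstacle_threshold += 50; break;                          // avoid later
            case '<': if (obstacle_threshold > 50) obstacle_threshold -= 50; break;  // avoid sooner
            default:  break;                           // ignore unknown characters
        }
    }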

The Lidar task runs continuously and updates a global array with obstacle distance information for every degree. The motor task moves the car forward or backward based on flags set by the Bluetooth task while in manual mode; the steering task behaves similarly in manual mode, except that it turns the car left and right. In automatic mode, both tasks compare threshold values against the global array populated by the Lidar task to decide how to move.
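
As an illustration of that automatic-mode decision, the sketch below checks a few degree sectors of the shared distance array against the configured threshold before choosing a direction. The sector ranges and function names are assumptions for the sketch, not the project's exact logic.

    #include <cstdint>

    extern volatile uint16_t distance_mm[360];   // Populated continuously by the Lidar task.
    extern volatile int obstacle_threshold;      // mm; adjustable over Bluetooth.

    // Returns true if every valid reading in [start_deg, end_deg] is farther than the threshold.
    static bool sector_is_clear(int start_deg, int end_deg)
    {
        for (int deg = start_deg; deg <= end_deg; ++deg) {
            uint16_t d = distance_mm[((deg % 360) + 360) % 360];
            if (d != 0 && d < (uint16_t)obstacle_threshold) {
                return false;                    // Something is closer than the threshold.
            }
        }
        return true;
    }

    // One automatic-mode decision: go forward if the front is clear, otherwise
    // steer toward whichever side is clear, otherwise back up.
    void automatic_mode_step()
    {
        bool front_clear = sector_is_clear(-20, 20);    // assumed "front" sector
        bool left_clear  = sector_is_clear(60, 120);    // assumed "left" sector
        bool right_clear = sector_is_clear(240, 300);   // assumed "right" sector

        if (front_clear) {
            // drive forward (in the real system, the motor task consumes these decisions)
        } else if (left_clear) {
            // steer left while moving slowly
        } else if (right_clear) {
            // steer right while moving slowly
        } else {
            // reverse until one of the sectors opens up
        }
    }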

System Integration

CMPE146 F14 LidarGroup Hardware.jpg

Bill Of Materials

Component                               Quantity   Cost (EA)   Source
Neato XV-11 LIDAR Unit                  1          $90.00      eBay
1/14th Scale Cadillac Escalade RC Car   1          $40.00      Amazon
SJ-ONE ARM Development Board            1          Provided    SJSU SCE
SJValley PWM Motor Controller           2          Provided    SJSU Parts Bin
Buck Converter for Voltage Conversion   2          $5.99       Amazon
LM7805CT 5V Voltage Regulator           1          $0.40       Mouser
HC-05 Bluetooth Module                  1          $7.99       Amazon
RadioShack 12V Battery Pack             1          $2.99       RadioShack

Verification

To further understand how the distance data compared to the physical distance of an object, the values at the 0, 90, 180, and 270 degree points were printed to the screen while an object was placed at various distances. While testing, it was found that the LIDAR's data is erroneous if the vehicle is moving too fast. This is most likely because the point from which the IR laser is emitted has already moved with the vehicle by the time the reflection is received and a distance is computed for that degree. After the vehicle was slowed down, the LIDAR unit returned consistent data and the vehicle maneuvered very consistently.
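
A sketch of the kind of debug loop used for this check is shown below. It reads the shared distance array from the Lidar task and prints the four bearings once per second; the delay_ms() helper stands in for the RTOS delay call.

    #include <cstdint>
    #include <cstdio>

    extern volatile uint16_t distance_mm[360];   // Populated by the Lidar task.
    extern void delay_ms(uint32_t ms);           // Placeholder for the RTOS delay call.

    // Print a few fixed bearings so the reported distance can be compared
    // against a tape-measured object in front of, beside, and behind the car.
    void lidar_verification_loop()
    {
        const int bearings[] = {0, 90, 180, 270};
        for (;;) {
            for (int b : bearings) {
                printf("deg %3d : %u mm\n", b, (unsigned)distance_mm[b]);
            }
            printf("----\n");
            delay_ms(1000);                      // Once per second keeps the console load low.
        }
    }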

Technical Challenges

1. The SJValley PWM motor controllers that were used require a minimum input voltage of 7V.
2. The LIDAR unit only outputs data consistently when its DC motor spins between 240 and 300 RPM, so a constant 3.3V supply for the motor is recommended.
3. Lidar values can be corrupted; corrupted values should be detected and discarded in software (see the sketch after this list).
4. The LIDAR's firmware can be interacted with over a serial connection, but it is not easy to make use of.
5. The printf function is costly when debugging at a 115200 baud rate.
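
With challenges 3 and 5 in mind, the sketch below shows one way to drop flagged Lidar readings while only printing a summary count periodically, so no printf sits in the per-byte data path. The counter and the reporting interval are illustrative choices, not the project's actual implementation.

    #include <cstdint>
    #include <cstdio>

    // Incremented by the Lidar decoding code whenever a reading arrives with its
    // "invalid data" flag set (challenge 3: corrupted values are discarded, not stored).
    static volatile uint32_t discarded_readings = 0;

    void note_discarded_reading()
    {
        ++discarded_readings;                // Cheap; safe to call from the data path.
    }

    // Called from a slow housekeeping loop (e.g. once per second) rather than from the
    // data path, since printf over a 115200-baud console is expensive (challenge 5).
    void report_discarded_readings()
    {
        static uint32_t last_reported = 0;
        uint32_t now = discarded_readings;
        printf("Lidar readings discarded since last report: %lu\n",
               (unsigned long)(now - last_reported));
        last_reported = now;
    }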

Future Enhancements

Currently the car only uses some of the available Lidar data when driving autonomously. Much more complex driving patterns and obstacle avoidance would be possible if more of the data points were used, which could potentially eliminate blind spots the car might have. Further enhancements could come in the form of:

  • GPS support so the car can autonomously navigate to a destination.
  • Some form of memory so the car can remember obstacles and build an intelligent routing mechanism.

Conclusion

References

xv11hacking LIDAR Resources: https://xv11hacking.wikispaces.com/