F17: Optimus

From Embedded Systems Learning Academy

Optimus left view
Optimus front view
Optimus right view

Optimus is an Android-app-controlled, self-navigating car powered by the SJOne (LPC1758) microcontroller. Optimus maneuvers through the selected routes using LIDAR and GPS sensors. This wiki page is the detailed report on how Optimus was built by Team Optimus.


Abstract

Embedded systems are omnipresent, and one of their unique yet powerful applications is the self-driving car. In this project we build a self-navigating car named Optimus that navigates from a source location to a selected destination while avoiding obstacles in its path.

The key features the system supports are

1. Android Application with Customized map and Dashboard Information.

2. LIDAR powered obstacle avoidance.

3. Route Calculation and Maneuvering to the selected destination.

4. Self-adjusting the speed of the car on ramps.

The system is built on FreeRTOS running on LPC1758 SJOne controllers, together with an Android application. The building blocks of Optimus are five controllers, each designed to handle dedicated tasks and communicating over a high-speed CAN network. The controllers integrate the various sensors used for navigation of the car.

     1. Master Controller -  handles the Route Maneuvering and Obstacle Avoidance 
     2. Sensor Controller -  detects the surrounding objects
     3. Geo Controller - provides current location
     4. Drive Controller - controls the ESC
     5. Bridge controller - Interfaces the system to Android app 
System Architecture
Android Application

Objectives & Introduction

Our objective is to build and integrate the functionality of these five controllers to develop a fully functioning self-driving system.

Sensor Controller: The Sensor controller uses the RPLIDAR to scan its 360-degree environment within a 6-meter range. It sends the scanned obstacle data to the Master and Bridge controllers.

Geo Controller: The Geo controller uses the NAZA GPS module, which provides the car's current GPS location and compass angle. It calculates the heading and bearing angles, which help the car turn toward the destination.

Drive Controller: Drive controller drives the motor based on the commands it receives from the Master.

Bridge Controller: The Bridge controller works as a gateway between the Android application and the self-driving car, passing information between them.

Master Controller: The Master controller coordinates all the other controllers and makes the driving decisions.

Android Application: The Android application communicates with the car through the Bridge controller. It sends the destination location to be reached to the Geo controller and also displays the car's debugging information, such as:

1. Obstacles information around the car

2. Car's turning angle

3. Compass value

4. Bearing angle

5. Car's GPS location

6. Destination reached status

7. Total checkpoints in the route

8. Current checkpoint indication

Team Members & Responsibilities

  • Master Controller:
    • Revathy

Project Schedule

Legend:

Major Feature milestone , CAN Master Controller , Sensor & IO Controller , Android Controller, Motor Controller , Geo , Testing, Ble controller, Team Goal

Week# Date Planned Task Actual Status
1 9/23/2017
  • Decide roles for each team member
  • Read FY16 project reports and understand requirements
  • Setup Gitlab project readme
  • Order CAN transceivers and get the R/C car
  • Team roles are decided and module owners are assigned
  • Gitlab project is set
  • Ordered CAN transceivers and got the R/C car
Complete.
2 9/30/2017
  • Design software architecture for each module and design signal interfaces between modules
  • Setup Wiki Project Report template
  • Design Hardware layout of system components
  • Create component checklist and order required components for individual modules.
  • Setup Gitlab project code for each modules
  • Overall project requirements are understood
  • Wiki Project report setup is done
  • Ordered components for Geo controller module
  • Initial commit of project base is done
Complete
3 10/14/2017
  • Major Feature: Implement Free run mode
    • Implement heartbeat messages and initial system bootup sync between modules
    • Interface the RPLidar to SJOne board via UART
    • Achieve basic communication such as obtaining the device and health info.
    • Study of Android Toolkit for Bluetooth Adapter connections and APIs
    • Study of HC-05 Bluetooth Module
    • Creating APIs for Start/ STOP button requests to write to output-Stream buffers
    • Creating RFComm SPP Connection socket and the rest of UI for basic operation of Pairing, Connection
    • Checking the AT Command sequence for Bluetooth Operation and Pairing
    • Automating the AT Command sequence for Bluetooth HC-05 operation and Android App
    • Run Motors via commands from SJOne Automatically
    • Order the RPM sensor module for the Drive Controller
    • Design and Order PCB
  • Major Feature: Implemented Free run mode
    • Added heartbeat messages from all controllers to master in can_db and implemented the handling functions in master controller
    • Implemented speed steer command CAN msg transmission and handling in Master controller. Master-Drive integration phase-I
    • Interfaced RPLidar to SJOne board and achieved basic communication via UART. Started obtaining data as well.
    • Android:Android API for Bluetooth Adapter connections studied.
    • Android:Learning of AT Command sequence for Bluetooth Operation and Pairing done.
    • Android:Created Start/Stop API's for button requests to be Sent to HC-05 IC.
    • Android:Basic Pairing Operation Working.
    • Motor: ESC Traxxas XL-5 (Electronic Speed Control) interfaced to SJOne board
    • Tested and identified the duty cycles required for different speeds; calibration and testing of the ESC done over an external switch at P0.1
    • Ordered RPM sensor
Complete
4 10/21/2017
  • Major Feature: Implement Basic Obstacle Avoidance in Free-run mode
    • Add all modules CAN messages to DBC file
    • Test steer and speed CAN commands between Master and Motor
    • Implement Obstacle avoidance algorithm
    • Obtain data from the lidar and process the data i.e. decide on the format in which the data has to be sent to the master
    • Write unit test cases for the lidar.
    • Interface compass module to SJOne board and calibrate the errors
    • find the heading and bearing angle based on mocked checkpoint
    • Test and verify GPS module outdoor to receive valid data and check for errors
    • Calibrate the GPS module error
    • Design and implement the DRIVE_CONTROLLER STEER/SPEED interface with Master (TDD)
    • Install the new RPM sensor module for the Drive Controller
    • Operating motors based on the CAN messages from the Master
  • Major Feature: Implemented Free-run mode w/o obstacle avoidance
    • Added all modules basic CAN messages in can_db
    • Implemented interface files in master controller to handle CAN messages from all nodes to master
    • Implemented Master-Drive controller Integration
    • Implemented Master-Bluetooth controller integration
    • Added all modules basic CAN messages in can_db
    • GPS integrated to SJONE board
    • Added all modules basic CAN messages in can_db
    • Wrote unit test cases for the LIDAR.
    • Wrote logic for dividing the information obtained from the lidar into sectors and tracks.
    • MASTER_SPEED_STEER_CMD was defined to use 8-bits for speed control (neutral, forward, and reverse); 9-bits for steer control (straight, left, and right)
    • Designed glue code: DriveManager and hardware interface code: DriveController using TDD (test code in _MOTOR/_cgreen_test/)
    • Got the Traxxas #6520 RPM sensor; installed the same with the slipper clutch; Observed the RPM sensor trigger over an oscilloscope and found the minimum distance of magnet to RPM sensor is not achievable with the stock slipper clutch. Ordered Traxxas #6878 new slipper clutch and ball-bearings
    • Master - Drive Controller Interface implemented and tested over CAN; Check "drive" terminal command on Master controller
complete
5 10/28/2017
  • Major Feature: Implement maneuvering in Master controller
    • Implement maneuvering algorithm to drive steering angle of the servo
    • Implement maneuvering algorithm to control ESC speed
    • Test and validate the information obtained from the sensor.
    • Send the Lidar data and heartbeat over CAN.
    • LIDAR should be fully working.
    • Identify the basic speed(s) at which the car shall move; the min, max and normal forward speeds, and the min and normal reverse speeds
    • Interface the RPM sensor over ADC and validate the readings
    • Writing PID Algorithm for Motor Control
    • Calibrating PID constants according to the Motors
    • Testing the Bluetooth Range and multiple pairing option to establish security of the Master device
    • Testing the accuracy of GPS while moving
    • Made the code modular and added the wrapper function for all the important modules
    • Worked on the Android app which will dump the latitude and longitude information for checkpoints
    • Test the accuracy of GPS while moving
    • Get the code review done and do the testing after that
    • Worked on the Android app that will dump the checkpoints into a file
    • Finish PCB design and place order
  • Major Feature: Implemented maneuvering in Master-Geo controller
  • Major Feature: Implemented Basic Obstacle Avoidance in Free-run mode
    • Implement maneuvering algorithm in android app is moved to next week schedule
    • Implemented maneuvering algorithm in Master to drive steering angle of the servo
    • Implement maneuvering algorithm in Master to control ESC speed
    • Unit Testing obstacle avoidance algorithm
    • Tested and validated the sensor data by plotting graphs in an EXCEL sheet.
    • Sending the obstacle information and heartbeat over CAN.
    • LIDAR fully working and sending obstacle information.
    • Identified basic speeds, slow, normal, and turbo for forward and reverse
    • Interfaced the RPM sensor over GPIO and validated; but the clutch gear with magnet was far apart from the RPM Sensor
    • Wrote the PID code keeping future integration in mind; Have pushed the code
    • Failed to use RPM sensor - new clutch gear also did not work (magnet is too far away - validated with Oscilloscope); Have to consider using IR sensor for feedback
    • Android:Tested successfully individual and multiple Device pairing.
    • Android:Android app updated with Navigation and Drawer Modules with Detecting NAV points.
    • Tested the accuracy of GPS while moving
    • Made the GPS and compass code modular and checked the functionality after the changes
    • Worked on the Android app that will dump the checkpoints into a file
    • Completed PCB Design
Complete
6 11/07/2017
  • Major Feature: Implement maneuvering with mocked GEO checkpoints
    • Collect mock checkpoints using the Android Data Collector application
    • Collect mock checkpoints using the GEO module and compare for any discrepancies
    • Identify on-board I/O display information; currently identified items are documented below:
    • Health status like GPS Lock status, etc.
    • Identify hardware to check battery-status and procure the same; update PCB as well
    • Display bluetooth pairing status
    • Test on-board I/O module for bluetooth pairing status
    • In case RPM installation/usage fail, Identify new mechanism for feedback and order components; Update PCB as well to include new hardware
    • Implement simple feature additions on steer control to handle reverse; basically steering rear-left and rear-right has to be practically implemented on motor/drive controller
    • Receive GEO Controller's Turning-angle message and compute target steer
    • Use GEO Controller's distance to next-checkpoint information to compute target speed
    • Mock checkpoint navigation testing using different possible obstacle heights and forms possible
    • Identify advertisement messages on the DBC file and add documentation in Wiki; Currently identified advertisements: a) current GEO location, b) SENSOR radar map
    • Shall define the BLE Controller to android message structure and message generation-intervals (classify on-demand advertisements and periodic advertisements)
    • Implement marker for current location display - which is an on-demand advertisement
    • Implement feature for the user to enter destination - a Google Map View shall be shown to the user to confirm route from source(current car location) to destination
    • Android app (once on the new device) shall download the entire offline map information of the SJSU campus and store it on a SQLite database
  • Major Feature: Implemented maneuvering with mocked GEO checkpoints
    • Provided Mock checkpoints and used the heading and bearing angle logic to get the turning angle
    • Collected mock checkpoints and check for the error with different places
    • Interfaced the Sparkfun Seven segment display with the SJOne Board.
    • Implemented interface method to receive GEO Controller's Turning-angle message and set target steer
    • Target speed is not changed between checkpoints, so Geo feedback for distance to destination is not used in the design
    • Destination Reached flag is tracked to stop the car on reaching destination
    • Checkpoint Id CAN signal is processed by Master to start the car once destination is selected
    • Android:Implemented Marker for current position Display.
    • Android:User entry for setting up destination on MAP done.
    • RPM Installation failed, but could get auxiliary hardware (motor pinion) from local shop and get it working
    • Implemented basic motor feedback using hall sensor (RPM sensor); tested working on ramps
    • Steer left and right on reverse now follows natural order; Could not finish literal reverse-left and reverse-right implementation; Moved this task forward; Had to test and implement motor feedback this week
    • Defined the BLE Controller messages to android in JSON message structure and message generation-intervals (classify on-demand advertisements and periodic advertisements)
    • On Demand Advertisement- Current Marker Location
    • Draggable Destination Marker for final destination and intermittent checkpoint transmission to GEO from Android via BLE
    • Marking the checkpoints with HUE_BLUE color to do better tracking of the navigation.
    • Added multi state BT options and Added restrictions on buttons like NAV usage dependency on BT Connection, Powerup button dependency on NAV setup before actually powering the car.
Complete
7 11/14/2017
  • Major Feature: Implementing maneuvering with Android app supplied GEO checkpoints with on-board I/O
    • Use mock data from file to compute: a) Heading b) Bearing -> use Haversine's algorithm to compute turning angle
    • Advertise distance to the next checkpoint (again using Haversine's algorithm)
    • Save the proper checkpoints for one route (Clark's to SU) to SDCARD on GEO Controller
    • Implement system start/stop triggers from different use cases
    • Turning angle offset of -10,10 is added to take right / left turn
    • Implement the battery-status DBC Message advertisement
    • Indicate checkpoint proximity using backlight indicators
    • Create 2 CAN messages for Diagnostic and I/O data to transmit to the BLE module
    • Receive the diagnostic CAN message and decode to transmit it to Android App
    • [Android I/O:] Design Android app views for visualizing Diagnostic and I/O data
    • Test and validate success/fail cases for on-board I/O display information(as defined above)
    • Update PWM pulses to match MASTER's target speed with proper feedback from the identified feedback-mechanism
    • Identify PID constants kp, ki, kd and evaluate performance against the basic feedback implementation
    • Finalize feedback algorithm and fine-tuning
  • Major Feature: Implemented maneuvering with Android app supplied GEO checkpoints with on-board I/O
    • [Geo:] Implemented mock data from file to compute: a) Heading b) Bearing -> used Haversine's algorithm to compute turning angle
    • [Geo:] Advertised distance to the next checkpoint (again using Haversine's algorithm)
    • [Geo:] Saving the checkpoints in SDCARD on GEO Controller
    • Implemented start-stop triggers from android and auto start on start of route navigation
    • Turning angle from geo is handled with offset
    • battery-status is optional feature. Planning for later
    • Indicate checkpoint proximity using backlight indicators
    • [BLE:] Created CAN messages for Telemetry data from all modules to BLe to send to Android
    • [BLE:] Received Telemetry messages are transmitted to Android App
    • [Android I/O:] Android app views created for visualizing Telemetry data
    • Test and validate success/fail cases for on-board I/O display information
    • Update PWM pulses to match MASTER's target speed with proper feedback from the identified feedback-mechanism
    • Finalize feedback algorithm and fine-tuning
Complete.
8 11/21/2017
  • Major Feature: Complete maneuvering implementation with Android app and Android I/O
    • [Android I/O:] Implement display of Sensor Obstacle Information on a RADAR map
    • [Android I/O:] Dynamically update car's Current location on the map's route path
    • [Android I/O:] BT Auto Connection and Pairing implemented
    • [Android I/O:] Health information from BLE Controller, namely battery, GPS lock status, and motor speed shall be updated
    • [Android I/O:] BT Auto connect implementation and re-connection on disconnection.
    • Test achievable target speeds with different possible obstacle heights and forms possible, and ground conditions
  • Major Feature: Completed maneuvering implementation with Android app
    • [Android I/O:] Sensor obstacle LIDAR information has been updated on the app
    • [Android I/O:] Dynamic update of Car's current location and intermittent checkpoints implemented.
    • [Android I/O:] Health information from BLE Controller, namely GPS lock status, and motor speed has been updated on the Dashboard of the app.
    • [Android I/O:] Completed BT Auto connect implementation and re-connection on disconnection.
Complete.
9 11/28/2017
  • Major Feature: Full feature integration test
    • Execute the test plan created above [Planned for 11/14] (check Testing documentation in Wiki)
    • Execute the test plan created above [Planned for 11/14]; Phase 1: Test all identified cases for ground-conditions (grass, inclines, etc)
    • Execute the test plan created above [Planned for 11/14]; Phase 2: Test all identified cases for GPS routes and obstacle forms
  • Major Feature: Full feature integration test
    • Integration testing with all controllers and Android App to select routes and send checkpoints from App to Ble.
Complete.
10 12/5/2017
  • Major Feature: Full feature integration test
    • Execute the test plan created above [Planned for 11/14]; Phase 3: Test all identified cases for speed levels and on-board I/O validation
    • Execute the test plan created above [Planned for 11/14]; Phase 4: Test all identified cases for [Android I/O] validation
  • Major Feature: Full feature integration test
    • Integration testing with Android App with Debug view/Dash board with sensor and GPS data
Complete
11 12/12/2017
  • Major Feature: Full feature integration test
    • Execute the test plan created above [Planned for 11/14]; Phase 5: Test all identified cases for desired Turbo mode(s)
  • Update Wiki Complete Report
  • Major Feature: Full feature integration test
complete

Parts List & Cost

The project bill of materials is listed in the table below.

SNo. Component Units Total Cost
General System Components
1 SJ One Board (LPC 1758) 5 $400
2 Traxxas RC Car 1 From Prof. Kaikai Liu
3 CAN Transceivers 15 $55
4 PCAN dongle 1 From Preet
5 PCB Manufacturing 5 $70
6 3D printing 2 From Marvin
7 General Hardware components (Connectors, standoffs, soldering kits) 1 $40
8 Power Bank 1 $41.50
9 LED Digital Display 1 From Preet
10 Acrylic Board 1 $12.53
Sensor/IO Controller Components
11 RP Lidar 1 $449
Geo Controller Components
12 GPS Module 1 $69
Bluetooth Bridge Controller Component
13 Bluetooth Module 1 $34.95
Drive Controller Component
14 RPM Sensor 1 $20

CAN Communication

The controllers are connected on a CAN bus running at 100 kbit/s. System Nodes: MASTER, MOTOR, BLE, SENSOR, GEO

SNo. Message ID Message from Source Node Receivers
Master Controller Message
1 2 System Stop command to stop motor Motor
2 17 Target Speed-Steer Signal to Motor Motor
3 194 Telemetry Message to Display it on Android BLE
Sensor Controller Message
4 3 LIDAR detections of obstacles in 360 degrees, grouped into sectors Master,BLE
5 36 Heartbeat Master
Geo Controller Message
6 195 Compass, Destination Reached flag, Checkpoint id signals Master,BLE
7 196 GPS Lock Master,BLE
8 4 Turning Angle Master,BLE
9 214 Current Coordinate Master,BLE
10 37 Heartbeat Master
Bluetooth Bridge Controller Message
11 1 System start/stop command Master
12 38 Heartbeat Master
13 213 Checkpoint Count from AndroidApp Geo
14 212 Checkpoints (Lat, Long) from Android App Geo
Drive Controller Message
15 193 Telemetry Message BLE
16 35 Heartbeat Master

DBC File

The CAN message IDs transmitted and received by all the controllers are assigned based on message priority. The priority levels are as follows:

Priority Level 1 - User Commands

Priority Level 2 - Sensor data

Priority Level 3 - Status Signals

Priority Level 4 - Heartbeat

Priority Level 5 - Telemetry signals to display in I/O

BU_: DBG DRIVER IO MOTOR SENSOR MASTER GEO BLE
BO_ 1 BLE_START_STOP_CMD: 1 BLE
SG_ BLE_START_STOP_CMD_start : 0|4@1+ (1,0) [0|1] "" MASTER
SG_ BLE_START_STOP_CMD_reset : 4|4@1+ (1,0) [0|1] "" MASTER
BO_ 2 MASTER_SYS_STOP_CMD: 1 MASTER
SG_ MASTER_SYS_STOP_CMD_stop : 0|8@1+ (1,0) [0|1] "" MOTOR
BO_ 212 BLE_GPS_DATA: 8 BLE
SG_ BLE_GPS_long : 0|32@1- (0.000001,0) [0|0] "" GEO
SG_ BLE_GPS_lat : 32|32@1- (0.000001,0) [0|0] "" GEO
BO_ 213 BLE_GPS_DATA_CNT: 1 BLE 
SG_ BLE_GPS_COUNT : 0|8@1+ (1,0) [0|0] "" GEO,SENSOR
BO_ 214 GEO_CURRENT_COORD: 8 GEO
SG_ GEO_CURRENT_COORD_LONG : 0|32@1- (0.000001,0) [0|0] "" MASTER,BLE
SG_ GEO_CURRENT_COORD_LAT : 32|32@1- (0.000001,0) [0|0] "" MASTER,BLE
BO_ 195 GEO_TELECOMPASS: 6 GEO
SG_ GEO_TELECOMPASS_compass : 0|12@1+ (0.1,0) [0|360.0] "" MASTER,BLE
SG_ GEO_TELECOMPASS_bearing_angle : 12|12@1+ (0.1,0) [0|360.0] "" MASTER,BLE
SG_ GEO_TELECOMPASS_distance : 24|12@1+ (0.1,0) [0|0] "" MASTER,BLE
SG_ GEO_TELECOMPASS_destination_reached : 36|1@1+ (1,0) [0|1] "" MASTER,BLE
SG_ GEO_TELECOMPASS_checkpoint_id : 37|8@1+ (1,0) [0|0] "" MASTER,BLE
BO_ 194 MASTER_TELEMETRY: 3 MASTER
SG_ MASTER_TELEMETRY_gps_mia : 0|1@1+ (1,0) [0|1] "" BLE
SG_ MASTER_TELEMETRY_sensor_mia : 1|1@1+ (1,0) [0|1] "" BLE
SG_ MASTER_TELEMETRY_sensor_heartbeat : 2|1@1+ (1,0) [0|1] "" BLE
SG_ MASTER_TELEMETRY_ble_heartbeat : 3|1@1+ (1,0) [0|1] "" BLE
SG_ MASTER_TELEMETRY_motor_heartbeat : 4|1@1+ (1,0) [0|1] "" BLE
SG_ MASTER_TELEMETRY_geo_heartbeat : 5|1@1+ (1,0) [0|1] "" BLE
SG_ MASTER_TELEMETRY_sys_status : 6|2@1+ (1,0) [0|3] "" BLE
SG_ MASTER_TELEMETRY_gps_tele_mia : 8|1@1+ (1,0) [0|1] "" BLE
BO_ 196 GEO_TELEMETRY_LOCK: 1 GEO
SG_ GEO_TELEMETRY_lock : 0|8@1+ (1,0) [0|0] "" MASTER,SENSOR,BLE

BO_ 3 SENSOR_LIDAR_OBSTACLE_INFO: 6 SENSOR
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR0 : 0|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR1 : 4|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR2 : 8|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR3 : 12|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR4 : 16|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR5 : 20|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR6 : 24|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR7 : 28|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR8 : 32|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR9 : 36|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR10 : 40|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR11 : 44|4@1+ (1,0) [0|12] "" MASTER,BLE
BO_ 4 GEO_TURNING_ANGLE: 2 GEO
SG_ GEO_TURNING_ANGLE_degree : 0|9@1- (1,0) [-180|180] "" MASTER,BLE


The CAN DBC is available at the Gitlab link below

https://gitlab.com/optimus_prime/optimus/blob/master/_can_dbc/243.dbc

CAN Bus Debugging

We used a PCAN dongle connected to the host PC to monitor the CAN bus traffic using the BusMaster tool. A screenshot of the BusMaster log is shown below.

BusMaster CAN Signal Log

Hardware & Software Architecture

Master Controller

Software Architecture Design

The Master Controller integrates the functionality of all the other controllers and acts as the central control unit of the self-navigating car. Two of the major functionalities handled by the Master Controller are obstacle avoidance and route maneuvering.

The overview of the Master Controller software architecture is shown in the figure below.

SW Architecture

As an analogy to human driving, the Master receives inputs from the sensors to determine the surroundings of the self-navigating car and takes decisions based on the environment and the car's current location. The inputs received and outputs sent by the Master are listed below:

Input to Master:

1. Lidar Object Detection information - To determine if there is an obstacle in the path of navigation

2. GPS and Compass Reading - To understand the Heading and Bearing angle to decide the direction of movement

3. User command from Android - To stop or Navigate to the Destination

Output from Master:

1. Motor control information - sends the target Speed and Steering direction to the Motor.

Software Implementation

The Master controller runs two major algorithms: route maneuvering and obstacle avoidance. System start/stop is handled by the Master based on specific commands. When the user selects the destination, the route is calculated and the checkpoints of the route are sent from the Android app through the Bridge controller to the Geo controller. Once the Geo controller has received a complete set of checkpoints, the Master controller starts the system based on the "Checkpoint ID" signal: if the ID is non-zero, the route has started and the Master controller runs the route maneuvering algorithm.
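The sketch below illustrates, in simplified form, how the Master could gate the route maneuvering algorithm on the Checkpoint ID and Destination Reached signals described above; the names (GeoStatus, MasterState) are hypothetical and this is not the project's actual code.

#include <cstdint>

struct GeoStatus {
    uint8_t checkpoint_id;       // 0 means no active route yet
    bool    destination_reached; // set by Geo when the last checkpoint is reached
};

class MasterState {
public:
    void on_geo_status(const GeoStatus& geo)
    {
        if (geo.destination_reached) {
            route_active_ = false;   // stop the car at the destination
        } else if (geo.checkpoint_id != 0) {
            route_active_ = true;    // checkpoints loaded: the route has started
        }
    }
    bool route_active() const { return route_active_; }
private:
    bool route_active_ = false;
};

In the periodic task the Master would then run the route maneuvering and obstacle avoidance algorithms only while route_active() returns true.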

The Overall control flow of master controller is shown in the below figure.

Process Flowchart

Unit Testing

The obstacle avoidance algorithm is unit tested using the Cgreen unit testing framework. The complete unit test code is available in the Git project.

#include <cgreen/cgreen.h>   // Cgreen unit testing framework
using namespace cgreen;

// The fixture objects (pmaster, psensor, obs_status, expected_steer, actual_detections)
// and the mock helpers are created in the test setup code, which is omitted here.
Ensure(test_obstacle_avoidance)
{
   // Obstacle avoidance: with the mocked detections below, the algorithm should
   // return the expected steer, keep driving forward, and drop to slow speed.
   pmaster->set_target_steer(MC::steer_right);
   mock_obstacle_detections(MC::steer_right, MC::steer_right, false, false, false, false, false, false, true);
   assert_that(pmaster->RunObstacleAvoidanceAlgo(obs_status), is_equal_to(expected_steer));
   assert_that(pmaster->get_forward(), is_equal_to(true));
   assert_that(pmaster->get_target_speed(), is_equal_to(MC::speed_slow));
}

Ensure(test_obstacle_detection)
{
   // Obstacle detection: feed mocked LIDAR sector/track values over CAN and
   // check that the detected sectors match the expected pattern.
   mock_CAN_Rx_Lidar_Info(2, 2, 6, 0, 2, 2, 4, 0, 2, 0, 5, 0);
   set_expected_detection(true, false, true, false, true, false, false);
   actual_detections = psensor->RunObstacleDetectionAlgo();
   assert_that(compare_detections(actual_detections), is_equal_to(7));
}

TestSuite* master_controller_suite()
{
   TestSuite* master_suite = create_test_suite();
   add_test(master_suite, test_obstacle_avoidance);
   add_test(master_suite, test_obstacle_detection);
   return master_suite;
}

On board debug indications

Sr.No LED Number Debug Signal
1 LED 1 Sensor Heartbeat, Sensor Data MIA
2 LED 2 Geo Heartbeat, Turning Angle Signal MIA
3 LED 3 Bridge Heartbeat MIA
4 LED 4 Motor Heartbeat MIA

Design Challenges

The critical part of the obstacle avoidance algorithm is its design in two stages: 1. detect obstacles, 2. avoid obstacles. Since we get a 360-degree view of obstacles, we group the readings into sectors and tracks to process the 360-degree detections and take decisions based on them (a simplified sketch is shown after the design figure below).

Obstacle Avoidance Design
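Below is an illustrative two-stage sketch (detect, then avoid) over 12 LIDAR sectors, following the sector/track grouping described above. The sector layout, blocking-track threshold, and steer choices are assumptions for illustration, not the project's actual implementation.

#include <array>
#include <cstdint>

enum class Steer { straight, left, right };

constexpr uint8_t kNumSectors    = 12;
constexpr uint8_t kBlockingTrack = 3;   // assumption: a track value <= 3 counts as "too close"

// Stage 1 (detect): convert per-sector track values into blocked/free flags.
std::array<bool, kNumSectors> detect(const std::array<uint8_t, kNumSectors>& tracks)
{
    std::array<bool, kNumSectors> blocked{};
    for (uint8_t s = 0; s < kNumSectors; ++s)
        blocked[s] = (tracks[s] != 0) && (tracks[s] <= kBlockingTrack);  // 0 = no obstacle
    return blocked;
}

// Stage 2 (avoid): pick a steer based on the front-facing sectors,
// assuming sector 0 is straight ahead, sector 1 front-right, sector 11 front-left.
Steer avoid(const std::array<bool, kNumSectors>& blocked)
{
    if (!blocked[0])  return Steer::straight;
    if (!blocked[1])  return Steer::right;
    if (!blocked[11]) return Steer::left;
    return Steer::right;   // all front sectors blocked: crawl right at slow speed
}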

Motor Controller

Design & Implementation

The Motor Controller is responsible for the movement and steering of the car. It includes two types of motors: a DC motor for movement and a servo motor for steering. The motor comes with an inbuilt driver called the ESC (Electronic Speed Control) circuit, used to manipulate the speed and steering of the car; it has a PWM input for both the servo motor and the DC motor. We are using an RPM sensor to take feedback from the motor and monitor the speed.

Hardware Design

Motor Hardware Schematics
SJOne Pin Diagram
Sr.No Pin Number Pin Function
1 P0.1 HEADLIGHTS
2 P1.19 BRAKELIGHTS
3 P1.20 LEFT INDICATORS
4 P1.22 RIGHT INDICATORS
5 P0.26 RPM SENSOR
6 P2.0 SERVO PWM
7 P2.1 MOTOR PWM




Hardware Specifications

  • 1. DC Motor, Servo and ESC

This is a Traxxas Titan 380 18-turn brushed motor. The DC motor comes with the Electronic Speed Control (ESC) module. The ESC module can control both the servo and the DC motor using Pulse Width Modulation (PWM). The ESC also requires an initial calibration and is operated using PWM signals. The DC motor command is expressed in the range -100% to +100%, where -100% means "reverse at full speed", +100% means "forward at full speed", and 0 means "stop or neutral". The servo can likewise be operated safely using PWM.

Servo motors are used where a controlled 0 to 180-degree motion is needed, for example in robot arms and humanoids. They are normal motors with a potentiometer connected to the shaft, which provides analog position feedback so that the servo adjusts its angle according to the given input signal.

How is it operated? A servo usually requires 5-6 V as VCC (industrial servos require more), ground, and a signal to set its position. The signal is a PWM waveform with a frequency of about 50-200 Hz (refer to the datasheet). At 50 Hz the period is 20 ms; if the on-time is 1 ms (and the off-time 19 ms) we generally get the 0-degree position, and as the pulse width increases from 1 ms to 2 ms the angle changes from 0 to 180 degrees. Where can it go wrong?

Servo Motor Operation

Power: We generally use higher-voltage batteries and regulate them down to 5 V for our applications. The servo must never be powered from the microcontroller, because it draws far more current than the microcontroller can supply.

Another way to burn the servo is when the supply is given directly from the battery; the microcontroller will not blow up, but if you feed the servo, say, 12 V, it will be destroyed.

PWM: The pulse width should strictly stay in the 1 ms to 2 ms range (refer to the datasheet). If the PWM goes out of range the servo will start jittering, heat up, and eventually burn itself out. This problem is easy to identify from the jitter noise; if you hear it when the servo is turned on, turn it off right away and fix the code.

Load: Hobby servos do not have high load-bearing capacities, yet by design they always try to adjust their angle according to the signal. If the load is too high the servo cannot move further while the signal keeps forcing it, so again it heats up and fails. To avoid this, load the servo only within its safety margin.

  • 2. RPM Sensor
RPM Sensor

We use a hall-effect sensor to measure the wheel's RPS (rotations per second). A "trigger magnet" is installed in the spur gear, and the hall-effect sensor installed in the gear cover reads the magnet once every revolution. The sensor sends a high pulse as the magnet's proximity changes the voltage at the transducer. The motor controller handles this pulse in an external interrupt handler on port P0.26. The pulse count is used periodically to calculate the RPS and thereby the speed of the vehicle. A simple algorithm maintains the speed according to the measured RPS in the 10 Hz task.
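A minimal sketch of this pulse counting and RPS computation is shown below; the interrupt hookup, wheel circumference, and gear ratio are placeholders, since the exact values depend on the car's drivetrain.

#include <cstdint>

static volatile uint32_t rpm_pulse_count = 0;

// Called from the rising-edge interrupt attached to P0.26
// (one pulse per revolution of the spur gear carrying the magnet).
void rpm_sensor_isr(void)
{
    ++rpm_pulse_count;
}

// Called from the 10 Hz task: pulses seen in the last 100 ms, scaled to per-second.
float compute_rps(void)
{
    const uint32_t pulses = rpm_pulse_count;  // copy then reset the shared counter
    rpm_pulse_count = 0;
    return pulses * 10.0f;                    // pulses per 100 ms * 10 = rotations per second
}

// Optional conversion to ground speed, assuming a wheel circumference (meters)
// and the gear ratio between the spur gear and the wheels.
float compute_speed_mps(float rps, float wheel_circumference_m, float gear_ratio)
{
    return rps * gear_ratio * wheel_circumference_m;
}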

The RPM sensor requires a specific installation procedure, shown in the images below:

CmpE243 F17 RPM install1.JPG
CmpE243 F17 RPM install2.JPG


Once the installation is done, the RPM can be read using the magnetic RPM sensor described above. It gives a high pulse at every rotation of the wheel. Hence, to calculate the RPM, the output of the sensor is fed to a GPIO pin of the SJOne board.

Motor Module Hardware Interface

The hardware connections of the motor module are shown in the schematic above. The motor controller receives command signals from the Master over the CAN bus, and the RPM-sensor feedback is sent back to the Master as the current speed of the car. The speed derived from the RPM sensor and sent over the CAN bus is also used by the I/O module and the BLE module to show the values on the LED display and in the Android app.

Software Design

The following diagram describes the flow of the software implementation for the motor driver and speed feedback mechanism.

Motor controller flowchart

Motor Module Implementation

The motor controller is operated based on the CAN messages received from the Master. The CAN messages for drive and steer commands are sent by the Master controller. The motor controller converts the values received from the Master (-100 to +100 for the drive-speed percentage and -100 to +100 for steer, mapped onto a 1 to 180-degree turn) into the specific PWM values required by the DC motor and the servo.
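A small sketch of this mapping is shown below, assuming the conventional 1.0 ms (full reverse / full left) to 2.0 ms (full forward / full right) pulse range with 1.5 ms neutral at a 20 ms period; the actual limits calibrated on the car may differ.

#include <cstdint>

// Map a -100..+100 command from the Master to an ESC/servo pulse width in ms.
float percent_to_pulse_ms(int8_t percent)
{
    if (percent >  100) percent =  100;
    if (percent < -100) percent = -100;
    return 1.5f + (percent / 100.0f) * 0.5f;   // 1.0 ms .. 2.0 ms, neutral at 1.5 ms
}

// Convert a pulse width to the duty-cycle percentage loaded into the PWM peripheral.
float pulse_ms_to_duty_percent(float pulse_ms, float period_ms = 20.0f)
{
    return (pulse_ms / period_ms) * 100.0f;    // e.g. 1.5 ms / 20 ms = 7.5 %
}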

  • Speed Regulation:

On an uphill slope the pulse frequency from the RPM sensor drops, which means the car is slowing down; in that case the car is accelerated (PWM increased) to maintain the required speed. Similarly, on a downhill slope the pulse frequency increases, which means the car is speeding up; brakes are applied (PWM reduced) to compensate for the increased speed.
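A simple proportional sketch of this compensation is shown below; the gain and pulse limits are illustrative assumptions rather than the tuned values used on the car.

// Adjust the drive pulse so the measured RPS tracks the target RPS:
// uphill (measured < target) increases the pulse, downhill decreases it.
float regulate_speed(float target_rps, float measured_rps, float current_pulse_ms)
{
    constexpr float kGainMsPerRps = 0.01f;   // assumed proportional gain
    constexpr float kMinPulseMs   = 1.5f;    // neutral
    constexpr float kMaxPulseMs   = 2.0f;    // full forward

    float pulse = current_pulse_ms + kGainMsPerRps * (target_rps - measured_rps);
    if (pulse < kMinPulseMs) pulse = kMinPulseMs;
    if (pulse > kMaxPulseMs) pulse = kMaxPulseMs;
    return pulse;
}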

Sensor Controller

The sensor controller is responsible for detecting obstacles so that they can be avoided. For this purpose we have used the RPLIDAR by SLAMTEC.

Introduction

The RPLIDAR A2 is a 360-degree 2D laser scanner (LIDAR) solution developed by SLAMTEC. It can take up to 4000 laser-ranging samples per second at a high rotation speed. Equipped with SLAMTEC's patented OPTMAG technology, it overcomes the lifetime limitation of traditional LIDAR systems and can work stably for a long time. The system can perform a 2D 360-degree scan within a 6-meter range. The generated 2D point-cloud data can be used in mapping, localization, and object/environment modeling. The typical scanning frequency of the RPLIDAR A2 is 10 Hz (600 rpm).
LIDAR System Composition

At this frequency the angular resolution is 0.9° (4000 samples/s at 10 rotations/s gives 400 samples per rotation). The actual scanning frequency can be freely adjusted within the 5-15 Hz range according to the user's requirements. The RPLIDAR A2 adopts the low-cost laser triangulation measurement system developed by SLAMTEC, which gives it excellent performance in all kinds of indoor environments and in outdoor environments without direct sunlight exposure.

This LIDAR consists of a range scanner core and a mechanical drive that rotates the core at high speed. When functioning normally, the scanner rotates and scans clockwise. Users obtain the range scan data via the RPLIDAR's communication interface (UART) and control the start, stop, and rotation speed of the motor via PWM.

A laser beam is sent out by the transmitter and the reflected beam is received back; from this measurement the distance to the obstacle is calculated. If there is no obstacle, the beam is not reflected back.

Hardware Implementation

Specifications of the LIDAR

The specifications of the LIDAR as mentioned in the datasheet are as follows:
Power Supply: 5V
Serial Communication Interface: UART
UART Baud Rate: 115200
UART Working Mode: 8N1
PWM Maximum Voltage: 5V (Typical 3.3V)
PWM Frequency: 25 kHz
PWM Duty Cycle: 60% - 100%

Connections to the SJOne Board

The LIDAR works with a UART interface and hence has been connected to the UART3 pins of the SJOne board, i.e. P4.28 and P4.29. As the LIDAR needs a 5V supply, it is provided from the PCB (which is powered through a power bank) instead of the SJOne board, which can supply only 3.3V. The connections can be seen in the figure below.

LIDAR Connections to SJOne Board

Software Implementation

Approach for obtaining the data from the LIDAR

The LIDAR senses all the obstacles around it (360 degrees, up to a range of 6 meters) one degree at a time. This means that for one rotation of the LIDAR we get 360 values, i.e. 360 angles with their corresponding obstacle information. It takes 180 ms for the LIDAR to complete one 360-degree scan. Since we do not need obstacle information for each and every angle, we group a few angles together into "sectors" and consider the nearest object present in a sector as the obstacle for that sector. To identify how far an obstacle is located, the distance values are grouped into "tracks", i.e. 0 cm to 50 cm is track 1, 50 cm to 100 cm is track 2, and so on. The car will take action depending on the track in which an obstacle is present.

LIDAR readings divided into sectors and tracks
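The sketch below shows one way to fold a single (angle, distance, quality) sample into this sector/track grid, keeping the nearest obstacle per sector; the 30-degree sectors and 50 cm tracks follow the description above and the 4-bit per-sector range in the DBC, and may differ from the exact firmware constants.

#include <array>
#include <cstdint>

constexpr uint8_t kNumSectors = 12;
constexpr float   kSectorDeg  = 360.0f / kNumSectors;   // 30 degrees per sector
constexpr float   kTrackCm    = 50.0f;                  // assumed track width
constexpr uint8_t kMaxTrack   = 12;                     // 12 * 50 cm = 6 m range

// sector_track[s] holds the nearest track with an obstacle in sector s (0 = clear).
void add_sample(std::array<uint8_t, kNumSectors>& sector_track,
                float angle_deg, float distance_cm, uint8_t quality)
{
    if (quality == 0) return;                                        // no reflection: no obstacle
    const uint8_t sector = static_cast<uint8_t>(angle_deg / kSectorDeg) % kNumSectors;
    const uint8_t track  = static_cast<uint8_t>(distance_cm / kTrackCm) + 1;  // 1-based
    if (track > kMaxTrack) return;                                   // beyond the usable range
    if (sector_track[sector] == 0 || track < sector_track[sector])
        sector_track[sector] = track;                                // keep the nearest obstacle
}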

Algorithm for interfacing LIDAR to SJOne board and obtaining the obstacle info

Step 1: Send a GET_HEALTH (0XA5 0X52) Request. If the receive times out it is a communication error.

The GET_HEALTH request and response packets

Step 2: Check if a ‘protection stop’ is happening. If it is, send a RESET (0XA5 0X40). Check for ‘protection stop’ again and, if it is still set, send another RESET. If ‘protection stop’ remains set even after sending RESET multiple times, there is a hardware defect. If there is no hardware defect, proceed to the next step.

The RESET request packet

Step 3: Send a START_SCAN (0XA5 0X20) request. Wait for the response header. If there is no timeout, read the measurement sample. Otherwise check HEALTH_STATUS and MOTOR_STATUS again. Send START_SCAN again.

The START_SCAN request and response packets

Step 4: Continuously read the measurement samples. The data sent from the LIDAR contains the start bit, angle, distance, and quality. The start bit is set to 1 at the start of every new 360-degree scan. The angle and distance give the measurement angle and the distance to the obstacle at that angle. The quality represents the strength of the reflected beam; if the quality is zero, there is no obstacle in that direction. This data is then processed to group the information into sectors and tracks.

The measurement response packet

Step 5: If we wish to stop the readings, send a STOP (0XA5 0X25) request. This is the end of operation.
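For reference, the sketch below shows the request bytes used in the steps above sent over UART3 (115200, 8N1); uart3_put() stands in for whatever UART driver call the firmware actually uses.

#include <cstdint>

void uart3_put(uint8_t byte);   // assumed UART3 transmit function

namespace rplidar {
constexpr uint8_t kSync      = 0xA5;
constexpr uint8_t kGetHealth = 0x52;
constexpr uint8_t kReset     = 0x40;
constexpr uint8_t kStartScan = 0x20;
constexpr uint8_t kStop      = 0x25;

// Every RPLIDAR request is the 0xA5 sync byte followed by the command byte.
inline void send_cmd(uint8_t cmd)
{
    uart3_put(kSync);
    uart3_put(cmd);
}
}  // namespace rplidar

// Following the flow above: send_cmd(kGetHealth) and check the health response,
// send_cmd(kReset) on a protection stop, send_cmd(kStartScan) to begin streaming
// measurement samples, and send_cmd(kStop) to end the scan.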

Flowchart for Communicating with the LIDAR and receiving obstacle information

The entire flowchart for communicating with the LIDAR and receiving data from it is shown in the figure below:

Flowchart for communicating with the LIDAR and receiving obstacle information from it

Testing the data obtained from the LIDAR

To perform the initial testing of the LIDAR and to check whether we are getting the correct obstacle info, we made a setup enclosing the LIDAR on all four sides. By plotting the distance info given by the LIDAR in Microsoft Excel we can visualize a map of the obstacles as detected by the LIDAR. The map plotted in Excel after enclosing almost all four sides of the LIDAR is shown in the figure below.

Data Obtained from the LIDAR plotted on an Excel sheet

CAN DBC messages sent from the Sensor Controller

The data received from the LIDAR is grouped into sectors and tracks and is sent over the CAN bus. The CAN DBC messages in the DBC file will be as follows
BO_ 3 SENSOR_LIDAR_OBSTACLE_INFO: 6 SENSOR

SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR0 : 0|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR1 : 4|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR2 : 8|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR3 : 12|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR4 : 16|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR5 : 20|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR6 : 24|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR7 : 28|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR8 : 32|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR9 : 36|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR10 : 40|4@1+ (1,0) [0|12] "" MASTER,BLE
SG_ SENSOR_LIDAR_OBSTACLE_INFO_SECTOR11 : 44|4@1+ (1,0) [0|12] "" MASTER,BLE

Android Application

Description

An Android mobile application to navigate and trigger power to the self-driving car "OPTIMUS".

The Optimus app plays an important role in the self-driving car system, as it integrates the UI along with RC controls such as "Power" and "Navigation" over a Bluetooth channel with the self-navigating car.
The app uses the RFCOMM Bluetooth communication protocol and a BLE transceiver to communicate with the car and exchange useful information at a baud rate of 9600 bps.
The Optimus mobile app needs to connect to a specific device address based on the BLE chip in use.

  • OPTIMUS HOME
Optimus App: OPTIMUS HOME

Features

1. BLUETOOTH

The Android mobile application includes support for the Bluetooth network stack, which allows a device to wirelessly exchange data with the HC-05 Bluetooth device.
The Android application framework provides access to the Bluetooth functionality through the Android Bluetooth APIs. These APIs let applications wirelessly connect to other Bluetooth devices, enabling point-to-point wireless features.

Using the Bluetooth APIs, the Android application performs the following:

  • Scan for other Bluetooth device HC-05 on RC Car [00:**:91:D9:14:**]
  • Query the local Bluetooth adapter for paired Bluetooth devices
  • Establish RFCOMM channels
  • Connect to other devices through service discovery
  • Transfer data to and from other devices
  • Manage multiple connections

2. MAPS

The OPTIMUS app uses Google Maps for setting up the routing map information, deciding on the next checkpoint for the car, and computing the appropriate shortest route from the checkpoints using an adjacency matrix and shortest-path algorithms.
Google Maps is used along with other features to improve the navigation experience, since plotting routes and mapping checkpoints on the winding paths around campus is difficult using the Google APIs alone.

  • MAPS :: ANDROID - BLE COMMUNICATION JSON SCHEMA

The app was also upgraded with a live car-location tracking feature: crossed markers are recolored with YELLOW_HUE to distinguish the original path from the path already traversed by the car.
As soon as the car crosses a checkpoint marker, the marker color is updated from its original BLUE to YELLOW to indicate that the checkpoint has been crossed.

Optimus App: LIVE CAR TRACKING

The Optimus app uses interpolation schemes to calculate intermediate routes and to set checkpoints, with a draggable-marker mechanism to set the destination and plot the route path to it.
The JSON format shown has various tags for extracting checkpoint information using a JSON reader and plotting the points on the map. The features of the JSON data packet are:

  • Feature Properties:
    * Name: Description of the route start point
    * Description [optional]: Custom description of the route
    * LineString: Signifies the route type, e.g. line plot
    * coordinates: List of lat-long coordinates up to the next major checkpoint
ROUTE INTERPOLATION DATA


3. DASHBOARD

The dashboard was designed to provide an at-a-glance view and to project a UI similar to a car dashboard in the app, showing compass values, bearing and heading angles, and a LIDAR map that mirrors the data obtained from the LIDAR. This also helps in debugging the features and the values being sent from the respective sensor modules.

  • OPTIMUS DASHBOARD
Optimus App: OPTIMUS DASHBOARD

  • DASHBOARD JSON SCHEMA
DASHBOARD DATA

  • Dashboard Information:
    * JSON_ID_GPS_LOCK_STAT: Signifies the current status of the GPS lock on the car
    * JSON_ID_COMPASS_HEADING: Signifies the current heading angle from the compass
    * JSON_ID_COMPASS_BEARING: Signifies the current bearing angle from the compass
    * JSON_ID_TURNING_ANGLE: Signifies the current turning angle from the compass
    * JSON_ID_DIST_TO_DEST: Signifies the distance from the current location to the destination (absolute displacement of the car relative to the destination checkpoint)
    * JSON_ID_DEST_REACHED: Signifies whether the car has reached the destination or not
  • LIDAR Information:
    * JSON_ID_SENSOR_LIDAR_OBSTACLE_INFO_SEC0: Signifies the track position of the obstacles detected in the various sectors by the LIDAR

For example, Track 9, Sector 1 means an obstacle is detected in Sector 1 at 450 centimeters (4.50 meters) from the current position of the car, at an angle range of 20-45 degrees from the LIDAR/car's front line of sight at that time instant.

LiDAR detection of Track 9 Sector 1 i.e. 4.50 mts.
Android: LIDAR PLOT
Optimus App: Lidar Obstacle Detection
Android: LIDAR PLOT

Bluetooth Controller

Hardware Implementation

Bluetooth Module Pin Configuration:

We are using the HC-05 Bluetooth module to send and receive data from our Android application.

Bluetooth Module
pin configuration



The Bridge controller is connected to the Bluetooth module through the UART serial interface (UART3) at 9600 baud, with 8 data bits and 1 stop bit.

Software Implementation

Pseudo code of Bridge controller:

1. Turn on the bridge controller.

2. Initialize the Bluetooth interface with UART3 settings.

3. Initialize the CAN bus at 100 kbps.

4. Handle incoming I/O messages received from the Geo and Sensor controllers over the CAN bus.

5. Send the received CAN data to the Android app over Bluetooth every second.

6. Send a heartbeat message to the Master controller every second.

7. Read the Bluetooth messages received from the Android app.

8. Forward the Android message to the GEO controller if it contains checkpoints; otherwise forward it to the Master.
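A condensed sketch of these steps is shown below; the CAN and Bluetooth driver calls (can_rx, can_tx, bt_read_line, bt_send_json) and the helper functions are placeholders, with message IDs matching the DBC fragment shown after the flowchart.

#include <cstdint>
#include <string>

struct can_msg { uint32_t id; uint8_t data[8]; uint8_t dlc; };

bool can_rx(can_msg& msg);                        // assumed non-blocking CAN receive
void can_tx(const can_msg& msg);                  // assumed CAN transmit
bool bt_read_line(std::string& line);             // assumed HC-05 UART receive
void bt_send_json(const std::string& json);       // assumed HC-05 UART transmit
std::string build_telemetry_json();               // assumed: packs cached CAN data as JSON
void cache_telemetry(const can_msg& msg);         // assumed: stores Geo/Sensor/Master data
bool is_checkpoint(const std::string& line);      // assumed: does the message carry lat/long?
can_msg make_gps_data_msg(const std::string&);    // assumed: builds BLE_GPS_DATA (212)
can_msg make_start_stop_msg(const std::string&);  // assumed: builds BLE_START_STOP_CMD (1)

void bridge_1hz_task()
{
    bt_send_json(build_telemetry_json());         // step 5: telemetry to Android every second
    can_tx({38 /* BLE_HEARTBEAT */, {0}, 1});     // step 6: heartbeat to Master every second
}

void bridge_rx_task()
{
    can_msg msg;
    while (can_rx(msg))
        cache_telemetry(msg);                     // step 4: collect Geo/Sensor data over CAN

    std::string line;
    while (bt_read_line(line)) {                  // step 7: message from the Android app
        if (is_checkpoint(line))
            can_tx(make_gps_data_msg(line));      // step 8: checkpoints go to GEO
        else
            can_tx(make_start_stop_msg(line));    //         commands go to MASTER
    }
}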

Process Flowchart

DBC format for messages sent from Bluetooth controller :

BO_ 1 BLE_START_STOP_CMD: 1 BLE
SG_ BLE_START_STOP_CMD_start : 0|4@1+ (1,0) [0|1] "" MASTER
SG_ BLE_START_STOP_CMD_reset : 4|4@1+ (1,0) [0|1] "" MASTER
BO_ 38 BLE_HEARTBEAT: 1 BLE
SG_ BLE_HEARTBEAT_signal : 0|8@1+ (1,0) [0|255] "" MASTER
BO_ 212 BLE_GPS_DATA: 8 BLE
SG_ BLE_GPS_long : 0|32@1- (0.000001,0) [0|0] "" GEO
SG_ BLE_GPS_lat : 32|32@1- (0.000001,0) [0|0] "" GEO
BO_ 213 BLE_GPS_DATA_CNT: 1 BLE 
SG_ BLE_GPS_COUNT : 0|8@1+ (1,0) [0|0] "" GEO,SENSOR

Geographical Controller

Design & Implementation

Hardware Connection

The Pin Configuration is as follows:

Block Diagram

GPS and Compass Module:

GPS:

GPS

GPS is a global navigation satellite system that provides geo location and time information to a GPS receiver anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites.

Compass:

Compass

A compass is an instrument used for navigation and orientation that shows direction relative to the geographic cardinal directions (or points). Usually, a diagram called a compass rose shows the directions north, south, east, and west on the compass face as abbreviated initials. When the compass is used, the rose can be aligned with the corresponding geographic directions; for example, the "N" mark on the rose really points northward. Compasses often display markings for angles in degrees in addition to (or sometimes instead of) the rose. North corresponds to 0°, and the angles increase clockwise, so east is 90°, south is 180°, and west is 270°. These numbers allow the compass to show azimuths or bearings, which are commonly stated in this notation. We are using DJI’s NAZA GPS/compass module to get the GPS coordinates and heading angle. The diagram of the module is as follows:

GPS and Compass Module

Message Structure:

  • GPS:

The 0x10 message contains GPS data. The message structure is as follows:

GPS Data


  • Compass:

The 0x20 message contains compass data. The structure of the message is as follows:

Compass Data
  • Calibration:

Why calibrate the compass?

Ferromagnetic substances placed on multi-rotor or around its working environment affect the reading of earth’s magnetic field for the digital compass. It also reduces the accuracy of the multi-rotor control, or even reads an incorrect heading. Calibration will eliminate such influences, and ensure MC system performs well in a non-ideal magnetic environment.

When to do it?

• The first time you install Naza compass.

• When the multi-rotor mechanical setup has changed.

• If the GPS/Compass module is re-positioned.

• If electronic devices are added/removed/re-positioned.

Software Design

Algorithm: Distance calculation:

We are using the ‘haversine’ formula to calculate the great-circle distance between two points, that is, the shortest distance over the earth’s surface.

Bearing Angle calculation:

The bearing of a point is the number of degrees in the angle measured in a clockwise direction from the north line to the line joining the centre of the compass with the point. A bearing is used to represent the direction of one point relative to another point. The bearing angle is calculated by using the following formula:

Angle Information
DBC Messages
Flowchart
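A compact sketch of both calculations, using the standard great-circle formulas, is shown below; the variable names are illustrative only.

#include <cmath>

constexpr double kEarthRadiusM = 6371000.0;
constexpr double kPi           = 3.14159265358979323846;
constexpr double kDegToRad     = kPi / 180.0;

// Haversine great-circle distance in meters between two lat/long points (degrees).
double haversine_distance_m(double lat1, double lon1, double lat2, double lon2)
{
    const double phi1 = lat1 * kDegToRad, phi2 = lat2 * kDegToRad;
    const double dphi = (lat2 - lat1) * kDegToRad;
    const double dlam = (lon2 - lon1) * kDegToRad;
    const double a = std::sin(dphi / 2) * std::sin(dphi / 2) +
                     std::cos(phi1) * std::cos(phi2) *
                     std::sin(dlam / 2) * std::sin(dlam / 2);
    return 2.0 * kEarthRadiusM * std::atan2(std::sqrt(a), std::sqrt(1.0 - a));
}

// Initial bearing in degrees (0 = north, increasing clockwise) from point 1 to point 2.
double bearing_deg(double lat1, double lon1, double lat2, double lon2)
{
    const double phi1 = lat1 * kDegToRad, phi2 = lat2 * kDegToRad;
    const double dlam = (lon2 - lon1) * kDegToRad;
    const double y = std::sin(dlam) * std::cos(phi2);
    const double x = std::cos(phi1) * std::sin(phi2) -
                     std::sin(phi1) * std::cos(phi2) * std::cos(dlam);
    const double brg = std::atan2(y, x) / kDegToRad;   // -180 .. +180
    return (brg < 0) ? brg + 360.0 : brg;              // normalize to 0 .. 360
}

The turning angle sent to the Master is then the difference between this bearing and the compass heading, wrapped into the -180 to +180 degree range.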

Package Design

PCB Design

PCB Complete Schematic for All 5 Control Interfaces

PCB Complete Board design for All 5 Control Interfaces

3D Printed Sensor Mounts

We designed 3D-printed mounts for holding the LiDAR sensor and the GPS using the OpenSCAD software.

LiDAR Mount

GPS Mount

Git Project Management

The GitLab project is managed by working on different branches for different controllers and restricting merge access to the master branch. To get easy notifications of all Git activity, we created a webhook that posts Git notifications to the CMPE243 Slack channel. Useful GitLab features such as the issues list and milestone tracking are used for easy project management.

GitLab WebHooks

The project git repository is below.

https://gitlab.com/optimus_prime/optimus/tree/master

Technical Challenges

Motor Technical Challenges

1) ESC Calibration
We messed up the calibration on the ESC.
The XL-5 has a long-press option to calibrate the ESC, where the ESC shall:
a) After the long press, glow green and start taking the PWM signal for neutral (1.5 ms).
b) Glow green once again, where we feed in the PWM signal for forward (2 ms).
c) Glow green twice, where we feed in the PWM signal for reverse (1 ms).
We wrote code to calibrate the ESC this way using an external interrupt (EINT3) on a switch at P0.1.

2) ESC Reverse
The ESC was not activating reverse when commanded directly as described in the forums (there is no formal datasheet; only the XL-5 forums, which mention a 1 ms pulse width at 50 Hz for reverse).
We figured out that reverse actually requires a sequence of steps:
a) goNeutral()
b) goReverse()
c) goNeutral()
d) goReverse()


3) RPM Sensor Installation:
After following the steps to install the RPM sensor (as above), the sensor was not detecting the rotation (magnet) of the wheel.
The reason was the machined-steel pinion gear and slipper clutch that came with the RC car: they were too large, which increased the distance between the magnet and the RPM sensor, so the wheel rotation could not be detected.
We verified this with a digital oscilloscope.
We then fitted a smaller machined-steel pinion gear and slipper clutch, reinstalled the RPM sensor, and it worked.

Android Issues Undergone

  • MAPS: Plotting Routes and Offline Check Points Calculation

With our initial implementation using the Google Android API we were able to route maps, but soon, during testing of the route navigation, we faced a couple of issues:

1. For straight-line routes the intermediate checkpoints were often not received, since the Google APIs generate checkpoints only at intersections where the route bends.
2. Due to this drawback it was hard to navigate straight routes, and interpolation was required to make sure the GEO controller has enough checkpoints to correct the heading angle before the car drifts too far from its intended straight path.
3. Google routes are calculated from any point on the ground to the nearest offset point on the pre-drawn custom Google polyline path; as a result, routes from certain locations ended up with sharp-edged segments rather than smooth curves, the routes were slightly longer, and the car ended up on sidewalks or in bushes while correcting its course to follow the main route.

Optimus App: Navigation and Route Selection

  • Application Compatibility

During implementation, one of the issues faced was the security features of Android applications and the permissions to use geo-location and app storage.
After every fresh app installation the permissions had to be revisited and enabled for the app to access them, something which can still be improved.

Testing and Procedures to Overcome Challenges

MAP DEBUGGING & ROUTE CALCULATION

For overcoming the problem of placing routes and calculating the shortest path we decided to interpolate routes in the university premises.
Steps involved:

  • Draw polyline routes over saved checkpoint coordinates by reading and parsing a JSON file in the app to get the next checkpoint coordinates.
  • Use Dijkstra's Algorithm to calculate shortest path between those routes.
  • For longer routes two approaches could be taken to calculate the intermediate checkpoints:
    • a. Straightline Approach
    • b. Geodesy Engineering Approach.
Source:wikipedia.org::Dijkstra's algorithm

The geodesy approach is more complex and can be implemented using the 'haversine' technique to calculate the intermediate points between two points along the geographic surface of the earth, but since the distances for the demo were not long enough to be significantly affected by the earth's curvature, we decided to go with the straight-line approach.

We used Vincenty's formula to compute interpolated points between two checkpoints: when the distance between them exceeds ~(10±5) meters, the algorithm interpolates the route to produce intermediate checkpoints, which are marked on the map with BLUE markers.
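The sketch below illustrates the interpolation idea in C++ using simple linear interpolation in latitude/longitude; the app itself runs on Android and uses Vincenty's formulae, so this is only an approximation that is adequate for short campus distances.

#include <cmath>
#include <vector>

struct LatLng { double lat, lng; };

double distance_m(const LatLng& a, const LatLng& b);   // e.g. the haversine function sketched earlier

// Insert evenly spaced intermediate checkpoints whenever two consecutive
// checkpoints are more than max_gap_m apart.
std::vector<LatLng> interpolate_route(const std::vector<LatLng>& checkpoints,
                                      double max_gap_m = 10.0)
{
    std::vector<LatLng> out;
    for (size_t i = 0; i + 1 < checkpoints.size(); ++i) {
        const LatLng& a = checkpoints[i];
        const LatLng& b = checkpoints[i + 1];
        out.push_back(a);
        const int segments = static_cast<int>(std::ceil(distance_m(a, b) / max_gap_m));
        for (int k = 1; k < segments; ++k) {           // intermediate BLUE markers
            const double t = static_cast<double>(k) / segments;
            out.push_back({a.lat + t * (b.lat - a.lat), a.lng + t * (b.lng - a.lng)});
        }
    }
    if (!checkpoints.empty()) out.push_back(checkpoints.back());
    return out;
}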

For a better user view we added a hybrid-type map to the app so that the user gets a 3D feel of the route.

MARKERS

  • We also added colored markers to denote the following:
    >>> START/STOP: Custom markers
    >>> CAR LOCATION: Yellow markers
    >>> INTERMEDIATE CHECKPOINTS: HUE_BLUE markers
Optimus App: Map Markers

Sensor Controller

1. The LIDAR is sometimes not able to detect black-colored objects, as the light from the laser is completely absorbed by the black surface and nothing is reflected back.

LIDAR doesn't detect black objects

2. LIDAR object detection happens only in the plane where it is mounted. So if an object's height is less than the height at which the LIDAR is mounted, the object will not be detected.

LIDAR doesn't detect objects lower than its mounting height

3. If there is a steep ramp, the ramp also comes into the plane of the LIDAR and is considered an obstacle.

4. Exposing the LIDAR to direct sunlight creates noise in the obstacle detection.

Geo Technical Challenges

The first and major issue we faced with the GEO module was selecting the proper hardware for the GPS and compass. We tried the Sparkfun, Adafruit, and u-blox GPS modules, but observed that the GPS took a long time to get a lock and the error was high. We then switched to the DJI Naza GPS and found that it was quite accurate and that the lock-up time was hardly a minute. The software issue we faced with the Naza GPS was that it did not have proper software documentation; we studied the message packets and went through forums to understand the message layout, after which we were able to integrate the module successfully.

The Naza GPS module comes with a built-in compass, which simplified our setup since we did not have to integrate two separate modules.

We faced one more hardware issue: the Rx pin of the GPS module was accidentally connected to the ground pin, and the GPS started to draw a lot of current. To avoid this kind of mistake we added a fuse to the GPS supply, so that even if excess current is drawn the fuse ensures it does not affect the entire system.

Also, the car was drifting toward the edges even when the path was along the middle of the road according to Google Maps. After developing an app to map the checkpoints, we found that the Google path was actually running inside the buildings, so we had to find a different solution. We then created a database of all the routes on campus and processed the routes through the Android app.


GPS Route

Project Videos

https://youtu.be/Os_0bN8rR10

Conclusion

As a team we were able to achieve the set of goals and requirements within the required time frame. Over the course of this project, we learnt cutting edge industry standards and techniques such as:

  • Team Work: Working in a team with so many people gave us a real sense of what happens in the industry when a large number of people work together.
  • Git: Our source code versioning, code review sessions, and test management were done using Git.
  • CAN: A simple and robust broadcast bus that works with a pair of differential signals. We were able to use the CAN bus to interconnect five LPC1758 microcontrollers running FreeRTOS.
  • Accountability: Dealing with both software and hardware is not an easy task and nothing can be taken for granted, especially the hardware.
  • Hardware issues:
    • Power Issues: Initially we were using a single port of the power bank to power everything (all the boards) including the LIDAR. This caused the LIDAR to stop working due to insufficient current, and it took us a while to figure this out.
    • GPS: Calibrating the GPS and getting accurate data from the GPS was a challenging task.
    • Android Application: Using Google Maps to obtain checkpoints did not work out, as Google Maps was returning only a single checkpoint. So we created a database of checkpoints for navigating the car across the SJSU campus.
    • Debugging: Connecting the PCAN dongle to the car and moving around with it is a cumbersome way to debug. Hence we created a dashboard in the Android application to view all the useful information on the tablet without any hassle.

To the teams that are designing their car:

  • If using a LIDAR for obstacle avoidance make sure to test it in all lighting conditions.
  • It is better to have PCB instead of soldering everything on a wire-wrapping board.
  • Start with the implementation for the Geo module early.

Project Source Code

The source code is available at the GitLab link below.

https://gitlab.com/optimus_prime/optimus

References

Acknowledgement

We are thankful for the guidance and support by

Professor

  • Preetpal Kang

ISA

  • Prashant Aithal
  • Saurabh Ravindra Deshmukh
  • Purvil Kamdar
  • Shruthi Narayan
  • Parth Pachchigar
  • Abhishek Singh

For 3D printing

  • Our sincere thanks to Marvin Flores <marvin.flores@sjsu.edu> for printing our 3D print models.

For Sponsoring R/C car

  • Professor Kaikai Liu