A hardware/software update of the vehicle, along with revisions to the indoor circuit.

Major revisions have been made to the vehicle. It has been stripped of the LIDAR, AHRS 9-DOF inertial measurement unit, stereo camera, AP hotspot and IPS beacon, all replaced with a single high-speed USB 3.0 camera. This camera outputs 50 FPS at HD resolution, which is pretty neat for such a small camera. Anyway, moving on, the vehicle performs better in AWD than RWD, because RWD tends to oversteer when braking at high speed. I am vaguely guessing here, but it goes at least 60 mph. There is also a lot of electromagnetic noise coming from the Electronic Speed Controller (ESC), which is becoming quite annoying. Something I plan to fix with an LC filter. Below is a diagram of my final hardware setup.

(Vehicle diagram painstakingly drawn in Microsoft Paint.)

On the software side, I am being lazy and currently using the LSTM/CNN-based steering script from Ilya Edrenkin, the winner of Udacity's second challenge. He generously open-sourced his code, which I use to correlate steering, throttle and brake commands with images. Training is done on a GTX 1080 Ti and deployment on a GTX 970M. The deployed model is around 2.8 GB and takes five consecutive 144 x 144 images as input.

My rosbags store the raw images from the camera, and they are getting quite heavy. The Ximea cameras I bought do not seem to have a ROS package where I can bin pixels together to lower the image resolution. An hour of data collection comes to about 500 GB, so I bought a 1 TB SSD. (The number checks out: a raw 1280 x 720 x 3-byte frame at 50 FPS is roughly 138 MB/s, or about 500 GB per hour.) Because my algorithm only uses 144 x 144 input, I could write a ROS package that bins pixels 5 x 5 on top of the Ximea xiAPI. That would downsize the resolution, increase the FPS and let me collect more images in the same amount of training/hard-drive space. Something to work on; there is a rough sketch of the binning at the end of this post.

Latency is another issue I have to work on. I am currently using the FFmpeg h.264 encoder with the ultrafast preset and zerolatency tune, but the university wifi network produces a lot of lag, so I ended up buying an ASUS router (RT-AC68U) and creating my own local wifi network. I've also been exploring 4G as an outdoor solution for wireless communication. I got a set of USB 4G dongles that didn't work out of the box, so I sent them straight back. I also noticed that the latency on 4G networks is too high for any robust real-time video streaming. Better to wait for 5G.

(Formula Fast Indoor Go Kart Track)

Thanks to the Formula Fast indoor go-kart venue in Milton Keynes, I was able to get some free circuit time on their 500m indoor track. It was a great experience because I learned about the limitations of wifi. Although the venue had indoor wifi, the speeds were too slow and the coverage too sparse to provide any meaningful bandwidth for streaming images or controlling the car via the Logitech steering wheel.

This experience opened up a whole new world of wifi debugging. I did some quick latency tests with iftop and ping, and realized that the simultaneous disconnects I had occasionally been seeing on both vehicles stopped on the private wifi network from the new ASUS router. That made sense: if the link to the host computer dropped for both vehicles at once, it had to be a network issue. I still get random disconnects on individual vehicles, and I suspect the cause is either the camera serial interface or ESC noise, which is exactly what that LC filter is for.
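For sizing that filter, the cutoff frequency of a series-L, shunt-C low-pass is f_c = 1 / (2*pi*sqrt(L*C)). Here is a quick sanity-check script; the 10 uH / 470 uF values are placeholders I have not validated against the ESC's actual noise spectrum:

```python
import math

def lc_cutoff_hz(L_henry: float, C_farad: float) -> float:
    """Cutoff frequency of a second-order LC low-pass: f_c = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_henry * C_farad))

# Placeholder component values, not measured from my ESC:
# 10 uH and 470 uF give a cutoff around 2.3 kHz, which should sit
# well below typical ESC switching frequencies.
print(f"{lc_cutoff_hz(10e-6, 470e-6):.0f} Hz")
```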
I ended up using Christmas LED lights to create a unique TRON-esque race track. However, the track was still unsuccessful because (I think) the neural network had trouble recognizing the overlapping lanes. It was cool to look at, but not functional.

(LED circuit. Tron-esque.)

Final thoughts: I think I have written enough code to start my own GitHub repository.
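Two of the sketches that would go into it. First, the pixel binning mentioned above. I haven't confirmed which xiAPI parameter exposes on-camera binning, so this is a hypothetical host-side version in numpy (the helper names are mine): averaging 5 x 5 blocks takes a 1280 x 720 frame down to 256 x 144, a 25x reduction in raw bytes on its own, which can then be centre-cropped to 144 x 144 for the network.

```python
import numpy as np

def bin_pixels(frame: np.ndarray, k: int = 5) -> np.ndarray:
    """Average k x k blocks of an (H, W, 3) frame; H and W must divide by k."""
    h, w, c = frame.shape
    binned = frame.reshape(h // k, k, w // k, k, c).mean(axis=(1, 3))
    return binned.astype(frame.dtype)  # back to the original dtype (e.g. uint8)

def center_crop(frame: np.ndarray, size: int = 144) -> np.ndarray:
    """Crop the central size x size window out of a frame."""
    h, w = frame.shape[:2]
    top, left = (h - size) // 2, (w - size) // 2
    return frame[top:top + size, left:left + size]

frame = np.zeros((720, 1280, 3), dtype=np.uint8)  # stand-in for a camera frame
small = center_crop(bin_pixels(frame))            # (144, 144, 3)
print(small.shape)
```

Wrapped in a ROS node subscribed to the camera topic, this would shrink the bags before any compression even enters the picture.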
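Second, the general shape of the steering model. This is not Ilya Edrenkin's actual architecture (his open-sourced code is the real reference), just a toy PyTorch sketch of the idea, with made-up layer sizes: a small CNN encodes each of the five consecutive 144 x 144 frames, an LSTM consumes the sequence, and a linear head outputs steering, throttle and brake.

```python
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    """Toy CNN+LSTM: 5 consecutive RGB frames in, (steering, throttle, brake) out."""
    def __init__(self, hidden: int = 128):
        super().__init__()
        self.cnn = nn.Sequential(                       # per-frame encoder
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),   # 144 -> 70
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),  # 70 -> 33
            nn.Conv2d(32, 64, 3, stride=2), nn.ReLU(),  # 33 -> 16
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),      # -> (B, 64)
        )
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 3)                # steering, throttle, brake

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, c, h, w = x.shape                         # (B, 5, 3, 144, 144)
        feats = self.cnn(x.reshape(b * t, c, h, w)).reshape(b, t, -1)
        out, _ = self.lstm(feats)                       # run over the 5-frame clip
        return self.head(out[:, -1])                    # predict from the last step

model = SteeringNet()
clip = torch.randn(1, 5, 3, 144, 144)                   # one 5-frame clip
print(model(clip).shape)                                # torch.Size([1, 3])
```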