
My First PiWars: The Plan

I decided, in the year of uncertainties, to apply to PiWars for the first time. To my surprise, I was accepted to participate. I think I may be a glutton for punishment, as this was combined with starting a new job, making my free time less existent than the UK’s world-leading COVID-19 track and trace system.

I’m currently a researcher in medical imaging at the University of Manchester (hence the team name). My main research interests are focused on computer vision and machine learning, so I thought I could combine this with creating a robot. I haven’t worked with robotics in any form before, so I think combining all the elements of building my robot will be a big challenge. I’ll be going it alone rather than in a team, partly for the challenge, but mostly because there’s no one I know who could help.

For the base of my robot, I initially chose the STS-Pi from Pimoroni; however, it quickly became clear that it was far too small to stand a chance in all the challenges (especially the obstacle course).

The STS-Pi is designed to work with an Explorer HAT, so I opted to get the Explorer HAT Pro. This was my first introduction to any form of motor controller.

When I got accepted to PiWars, I decided to get a bigger robot base that extras like sensors could easily be added to on the fly, so I went for the Devastator Tank from The Pi Hut. As you can see in the images below, there is quite a size difference.

The Devastator base has plenty of places to screw things down and motors decent enough to give me a chance at the obstacle course. At the same time I needed a motor control board, and this blog post by Michael Horne pretty much helped me decide on the RedBoard+ from RedRobotics. I must admit it was pretty intimidating given the number of possible connections, but I at least got the robot moving with a PS4 controller using the available library (on the RedRobotics GitHub page).
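Getting the tank moving from the PS4 sticks mostly comes down to mixing a throttle axis and a steering axis into left and right track speeds. Here is a minimal sketch of that mixing step (the function name and value ranges are my own, not from the RedRobotics library; the resulting percentages would then be handed to the RedBoard’s motor calls):

```python
def arcade_mix(throttle, steering, max_speed=100):
    """Mix a throttle axis and a steering axis (each -1.0 to 1.0)
    into (left, right) motor percentages in the range -100 to 100."""
    left = (throttle + steering) * max_speed
    right = (throttle - steering) * max_speed
    # Clamp so a full-diagonal stick position can't exceed the motor range
    clamp = lambda v: max(-max_speed, min(max_speed, v))
    return clamp(left), clamp(right)

# Straight ahead at full throttle -> both tracks forward
print(arcade_mix(1.0, 0.0))  # (100.0, 100.0)
# Full steering only -> tracks spin opposite ways, robot turns on the spot
print(arcade_mix(0.0, 1.0))  # (100.0, -100.0)
```

This “arcade drive” mixing is a common convention for tracked robots; the clamping matters because pushing the stick diagonally would otherwise ask a motor for more than 100%.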

I have some ideas of how to apply my computer vision/machine learning knowledge to the following challenges. These are my initial plans and I hope by the end of the competition, I can show the progress from how I believe I should tackle the challenges, to how I actually ended up tackling them. As of writing this, no algorithms have been created, only the vague ideas of what I think is possible.

Feed the Fish

Currently, I think I will be aiming for the simpler task of rolling the golf balls into the ‘bowl’. Getting something that can shoot feels too difficult for me. I do want to make it automated though, as will be the aim for all the challenges I attempt. As this will require some form of computer vision/machine learning, I got a Coral USB accelerator from The Pi Hut that will take a lot of the processing power required for these tasks away from the limited power of the Raspberry Pi.

The Google Coral USB Accelerator

So, I think it will be some form of object detection to find a golf ball, direct the robot towards it, and aim for the large bowl. The object detection isn’t too bad for me to do, but the aiming and pushing of the ball (and making sure the robot doesn’t chase after it once released) is still a mystery.
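Even before any machine learning, a crude colour-based detector gives the robot something to steer towards. A minimal NumPy sketch of the idea (the frame here is synthetic, and treating “bright in all channels” as a white golf ball is my assumption, not a robust detector):

```python
import numpy as np

def find_ball(frame_rgb, thresh=200):
    """Return the (x, y) centroid of bright, near-white pixels, or None.
    A crude stand-in for proper object detection of a white golf ball."""
    # A pixel counts as 'ball' only if all three channels are bright
    mask = (frame_rgb > thresh).all(axis=2)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no ball in view
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 frame with a white blob centred on (70, 40)
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[35:46, 65:76] = 255
print(find_ball(frame))  # (70.0, 40.0)
```

Comparing the returned x-coordinate to the image centre then gives a steering error: positive means turn right, negative means turn left.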

Tidy Up the Toys

I think this will be my favourite challenge. It will definitely be autonomous and use the Coral accelerator as before. The blocks will be coloured red, green and blue (I still need to source these or 3D print them) and will be picked up by the below robot gripper and servo from The Pi Hut.

A combination of object detection, colour segmentation and navigation will be required. Getting the coloured blocks to the correct area once picked up is the bit I have yet to figure out. I did get a couple more sensors recently to potentially help with the first two challenges: an Adafruit 9-DoF sensor (BNO055) to help determine where my robot is and track its local movements (I hope), and an Adafruit VL53L0X time-of-flight distance sensor to add extra data on how far the robot is from objects.
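I haven’t worked the navigation out yet, but the basic dead-reckoning step from a compass heading (as the BNO055 can report) and a distance travelled is just trigonometry. A sketch of that single step, with the actual sensor reads left out:

```python
import math

def dead_reckon(x, y, heading_deg, distance):
    """Advance an (x, y) position by `distance` along `heading_deg`,
    using the compass convention: 0 degrees = +y ('north'),
    angles increase clockwise."""
    rad = math.radians(heading_deg)
    return x + distance * math.sin(rad), y + distance * math.cos(rad)

# Drive 1 m due 'east' (90 degrees) from the origin
x, y = dead_reckon(0.0, 0.0, 90.0, 1.0)
print(round(x, 6), round(y, 6))  # 1.0 0.0
```

In practice the heading would come from the BNO055’s fused orientation and the distance from timing the motors (or the VL53L0X watching a fixed target), with the errors accumulating over time; this is only the geometry.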

Up the Garden Path

My initial thought for this would have been a line-following sensor. However, that seems too obvious and strays from my theme of computer vision. I recently saw an OpenCV-based line-following video here, which led me to this blog post by Helen Lynn. I would love to have a go at implementing this and seeing the results.
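The core idea in that kind of approach is: threshold the camera image, find where the line sits in a strip near the bottom of the frame, and steer towards it. A NumPy-only sketch of the steering step (the thresholds and the synthetic test frame are mine, not from the post):

```python
import numpy as np

def line_offset(gray_frame, dark_thresh=60):
    """Return the line's horizontal offset from centre, normalised to
    -1.0 (far left) .. +1.0 (far right), using only the bottom 20%
    of a greyscale frame. Returns None if no dark pixels are found."""
    h, w = gray_frame.shape
    strip = gray_frame[int(0.8 * h):, :]      # bottom 20% of the frame
    xs = np.nonzero(strip < dark_thresh)[1]   # column indices of dark pixels
    if xs.size == 0:
        return None                           # line lost
    centre = (w - 1) / 2
    return (xs.mean() - centre) / centre      # normalised steering error

# Synthetic 100x100 frame: white background, dark line down column 75
frame = np.full((100, 100), 255, dtype=np.uint8)
frame[:, 75] = 0
print(line_offset(frame))  # positive -> line is right of centre, steer right
```

Feeding that offset into a simple proportional controller (turn rate = gain × offset) is usually enough to follow a gentle path.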

DIY Obstacle Course

This is the only challenge where I may have to fall back on remote control. I would like to set up the course in my back garden, as I have a lot of room and varied terrain. Extra points are awarded for imagination, complexity and humour, so I will be making an extra effort here to grab points that I may miss if I can’t get autonomous mode working.

Technical and Artistic Merit

This might be a long way off depending on what I decide, but hopefully my computer vision/machine learning approaches will help with the technical points, and whatever I do to make the tank look friendlier should help on the artistic side.

I can’t wait to move out of my comfort zone and have fun with this competition. If anyone comes across my blog and has any advice or feedback, it would be more than welcome.


Overdue Build Update

As with my normal work, I have been procrastinating over writing the blog. Back to it with a robot build update.

I’ve added a few things to the main functionality of the robot: a large servo-controlled grabber, a Raspberry Pi camera and mount (model available here from Explaining Computers), and a proper connection of the Coral TPU accelerator with the object detection example working with the camera.

The Grabber

One big issue to overcome was fixing the grabber to the front of the robot. Unfortunately, the front of the chassis slopes, rendering the usual right-angle brackets useless for mounting the grabber. Currently, it’s attached by a single screw and balancing pretty loosely.

In short – it’s a bad solution.

So I have bought some cheap right-angle brackets and I’m just going to bend them to the correct angle. Hopefully that will make a much more stable attachment.

Raspberry Pi Camera and Mount

Mounting the camera was an initial issue too, but luckily Explaining Computers created a Raspberry Pi camera mount 3D model specifically for the Devastator chassis. I managed to print it and now have a decent mount for my camera. Shortly after, the hot end on my Anycubic Chiron 3D printer decided it no longer wanted to heat up properly, so printing the ‘Tidy Up the Toys’ cubes became impossible.

Object Detection with TPU Accelerator

By far the easiest thing to fix to the robot, thanks to the convenient mounting holes. It was a little tricky to plug into the USB port of the Raspberry Pi, but that’s what I get for mounting the Pi with its USB ports against the internal robot walls.

I have yet to start on any autonomous programming, so I should really get on with that.


Webots Robot Simulator

A few days ago, I was looking for a way to simulate my robot’s sensors and potential vision tasks. The first thing I came across was the Webots Robot Simulator, which became open source relatively recently in 2018.

I ran through the tutorials, which were pretty easy to follow and also include solution files in case you get stuck. They focus on the fundamentals of Webots, so don’t expect to be modelling your PiWars robot perfectly straight away. For example, below is a screenshot of my ‘4-Wheels Robot’ from the tutorial.

My ‘4-Wheels Robot’ and its PROTO node brother.

I would like to model my Devastator robot, but the shape is a little more complex than a box and uses tracks rather than wheels. Luckily, there is a Track node that I can use. My main focus would be to prototype the running of the robot with the various sensor nodes available.

While this appears to be a great piece of software (once you get through the steep learning curve), I may just be adding more to my to-do list and delaying the building and testing of my actual robot. If anyone has used this software, I’d love to hear your experiences.