Since certain situations require the robot to stop its general execution at any time, there were two choices for error handling: polling and interrupts. Polling consumes computational time checking the status of each error signal, even when no error is present. Interrupts, by contrast, only take effect when an error actually occurs, so they were much more suitable for this application.
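The difference can be illustrated with a minimal Python sketch. The robot itself would use hardware interrupts in firmware; here an OS signal plays the role of the hardware interrupt, which is an assumption made purely for illustration (this requires a Unix-like system for `SIGUSR1`).

```python
import os
import signal

# Illustrative stand-in: an OS signal plays the role of a hardware interrupt.
# The handler fires asynchronously, so the main loop spends no time polling.
error_flags = {"e_stop": False}

def on_error(signum, frame):
    # Runs only when the "interrupt" actually occurs.
    error_flags["e_stop"] = True

signal.signal(signal.SIGUSR1, on_error)

# Simulate an external error source raising the interrupt.
os.kill(os.getpid(), signal.SIGUSR1)
print(error_flags["e_stop"])  # True: the error was handled without a polling loop
```

A polling design would instead have to read every error line inside the main loop on every iteration, paying that cost even in the error-free case.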

As such, the error handling flowchart is shown in the figure below.

[Figure: Error handling flowchart]

Error sources currently include the e-stop, for safety purposes, as well as signals from the battery management system (BMS): high battery temperature, low battery level, and excessive current draw. In all cases, the robot suspends execution until the interrupt handler has processed the error. For an e-stop signal, execution remains suspended until a user manually clears the error via a separate external signal. The battery management signals are handled differently. If the battery level is detected to be too low, the robot shuts down so that the battery can recharge. If the current draw or battery temperature is too high, execution is suspended until the signal has been cleared both by the BMS and manually by a user.
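The clearing policy above can be summarized as a small state machine. The sketch below is hypothetical — the class and fault names are illustrative, not the actual firmware API — but it captures the rules: low battery forces a shutdown, an e-stop needs a manual clear, and temperature/current faults need both a BMS clear and a manual clear.

```python
# Hypothetical sketch of the error-handling policy; names are illustrative.
E_STOP, LOW_BATTERY, HIGH_TEMP, HIGH_CURRENT = "e_stop", "low_battery", "high_temp", "high_current"

class ErrorHandler:
    def __init__(self):
        self.suspended = False   # execution held until faults are cleared
        self.shutdown = False    # power down so the battery can recharge
        self.active = set()      # currently outstanding faults

    def raise_fault(self, fault):
        self.active.add(fault)
        if fault == LOW_BATTERY:
            self.shutdown = True
        else:
            self.suspended = True

    def clear_fault(self, fault, bms_cleared=False, user_cleared=False):
        # E-stop needs a manual clear; temp/current need both BMS and user.
        if fault == E_STOP and user_cleared:
            self.active.discard(fault)
        elif fault in (HIGH_TEMP, HIGH_CURRENT) and bms_cleared and user_cleared:
            self.active.discard(fault)
        if not self.active:
            self.suspended = False

handler = ErrorHandler()
handler.raise_fault(E_STOP)
print(handler.suspended)  # True
handler.clear_fault(E_STOP, user_cleared=True)
print(handler.suspended)  # False
```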

The main loop that the robot regularly executes is shown below.

[Figure: Main loop flowchart]

A quick proof-of-concept demo of the most critical part of the algorithm was created to evaluate the software design.

The constraint is to be able to locate the center of the flower. To test whether the robot can successfully locate the center of a flower, a quick proof-of-concept demo was written in Python.

The test image is shown below.

[Figure: Test image]

Our goal is to detect the centers of each flower in the image.

The first step was to segment the image into its yellow and white components. This segmentation is currently done in the RGB colour space, which makes the algorithm very susceptible to changes in exposure. For this reason, other colour spaces such as HSI, YCbCr, or CIELAB may be used in the future for a more robust implementation. This segmentation is shown in the figure below.
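A minimal version of this RGB thresholding can be sketched as follows. The specific threshold values are assumptions for illustration only — as noted above, fixed RGB thresholds are exposure-sensitive and would need tuning per scene.

```python
import numpy as np

def segment_rgb(img):
    """Split an RGB uint8 image into yellow and white boolean masks.

    The thresholds below are illustrative assumptions, not tuned values.
    """
    r = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    b = img[..., 2].astype(int)
    # Yellow: strong red and green, weak blue.
    yellow = (r > 150) & (g > 150) & (b < 100)
    # White: all three channels bright.
    white = (r > 200) & (g > 200) & (b > 200)
    return yellow, white

# Tiny synthetic image: one yellow pixel, one white pixel, one dark pixel.
img = np.array([[[255, 255, 0], [255, 255, 255], [10, 10, 10]]], dtype=np.uint8)
yellow, white = segment_rgb(img)
```

Switching to HSI or CIELAB would replace the per-channel comparisons with thresholds on hue/chroma, which vary far less with exposure.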

[Figure: Yellow and white segmentation of the test image]

The next step of the algorithm was to run blob detection on all the yellow pixels in the image. This was done using OpenCV. The results of the blob detection are shown in the image below.
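The demo itself used OpenCV's blob detector; the sketch below substitutes SciPy's connected-component labelling as a dependency-light stand-in for the same idea, returning the centroid of each blob in the yellow mask. The function name and parameters are illustrative.

```python
import numpy as np
from scipy import ndimage

# Stand-in for OpenCV blob detection: label connected components in the
# boolean mask, then return each component's centroid as (row, col).
def find_blob_centers(mask, min_size=1):
    labels, n = ndimage.label(mask)
    centers = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if ys.size >= min_size:  # min_size can drop tiny noise blobs
            centers.append((float(ys.mean()), float(xs.mean())))
    return centers

mask = np.zeros((5, 5), dtype=bool)
mask[1, 1] = True                # one single-pixel blob
mask[3, 3] = mask[3, 4] = True   # one two-pixel blob
print(find_blob_centers(mask))   # [(1.0, 1.0), (3.0, 3.5)]
```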

[Figure: Blob detection results on the yellow mask]

Each detected yellow blob is a potential flower candidate. However, we can see that there are noisy yellow blobs, along the edge of the pot for example, that are picked up as potential flower candidates. Therefore, some more filtering needs to be done in order to locate the centers of all the flowers in the image.

Fortunately, there is some prior knowledge about the image and its contents that can be exploited. The middle of a citrus flower is yellow, and its outer petals are white. This is not a common combination. Therefore, we can conclude that if a yellow blob is close to many white pixels, it is likely part of the center of a flower.

There are two options here. One is to count the number of white pixels around each yellow blob. If the percentage of white pixels around a yellow blob is above a certain threshold, then that blob can be considered part of the center of a flower.
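This first option can be sketched as follows: dilate the blob to obtain a ring of surrounding pixels, then measure what fraction of that ring is white. The ring width and threshold values here are assumptions, not values from the demo.

```python
import numpy as np
from scipy import ndimage

# Sketch of option one. The ring width and 50% threshold are assumptions.
def passes_white_test(blob_mask, white_mask, ring=2, threshold=0.5):
    # Pixels near the blob but not inside it.
    ring_mask = ndimage.binary_dilation(blob_mask, iterations=ring) & ~blob_mask
    if not ring_mask.any():
        return False
    # Fraction of the surrounding ring that is white.
    return bool(white_mask[ring_mask].mean() >= threshold)

blob = np.zeros((7, 7), dtype=bool)
blob[3, 3] = True
white = np.ones((7, 7), dtype=bool)    # blob fully surrounded by white
print(passes_white_test(blob, white))  # True
```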

Another option is to heavily blur the white mask; blurring spreads the white pixels outward, so proximity to white can be checked with a single lookup at the blob's location. If a yellow blob is close to a relatively large number of white pixels, we can presume that it is a yellow stamen or pistil next to some white petals, and we filter out any blobs that do not fit this criterion.
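A sketch of this second option is below, using a Gaussian blur on the white mask. The blur strength (`sigma`) and the acceptance threshold are assumptions for illustration.

```python
import numpy as np
from scipy import ndimage

# Sketch of option two. Sigma and threshold values are assumptions.
def passes_blur_test(center, white_mask, sigma=3.0, threshold=0.2):
    # Blurring spreads white outward; a high value at the blob center
    # means the blob sits close to a large amount of white.
    blurred = ndimage.gaussian_filter(white_mask.astype(float), sigma=sigma)
    y, x = center
    return bool(blurred[y, x] >= threshold)

white = np.zeros((21, 21), dtype=bool)
white[8:13, 8:13] = True                  # white petals near the image center
print(passes_blur_test((10, 10), white))  # True: blob center is amid white
print(passes_blur_test((0, 0), white))    # False: corner is far from any white
```

Compared with the ring-counting option, this precomputes one blurred image and then filters every blob with a constant-time lookup.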

The blobs which pass through this filter are shown in the image below.