When the annual science fair at my daughter’s school came around last year, we decided it was time to build SAM. SAM, or Serving Assistant Machine, is a waiter on wheels powered by an Arduino microcontroller and an NVIDIA Jetson Nano, along with a handful of sensors. A serving tray was attached to the top of the device, with drinks placed upon it; SAM would then roam the room, detecting faces and offering drinks.
With the Arduino situated at the base, along with the batteries, ultrasonic sensors, wheels, and motors, SAM would navigate around the room, using its ultrasonic sensors to detect when something was in front of it. Upon detecting an obstacle, the Arduino would stop SAM in its tracks and send a signal to the Jetson Nano, located higher up, just below the serving tray. The Nano would then turn on its camera and, using OpenCV, look for a face; if it found one, SAM would audibly ask if the person would like a drink. Beyond simply detecting faces, the Nano kept a database of the faces it had seen, and if it encountered a familiar face, SAM would adapt its question to say so. After a pause, SAM would then turn and start its routine over again.
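To give a flavor of the Arduino side, here is a minimal sketch of that detect-stop-signal loop. The pin assignments, the HC-SR04 ultrasonic sensor, the stop distance, and the single signal wire to the Nano are my assumptions for illustration; the actual build may well have been wired differently.

```cpp
// Minimal Arduino sketch: drive forward until the ultrasonic sensor
// sees an obstacle, then stop the motors and signal the Jetson Nano.
// Pin numbers and the HC-SR04 sensor are illustrative assumptions.

const int TRIG_PIN = 9;    // HC-SR04 trigger
const int ECHO_PIN = 10;   // HC-SR04 echo
const int MOTOR_EN = 5;    // motor controller enable (PWM)
const int NANO_PIN = 7;    // signal line up to the Jetson Nano
const long STOP_CM = 40;   // stop when something is this close

long readDistanceCm() {
  // Send a 10 us trigger pulse and time the echo;
  // sound takes roughly 58 us per cm for the round trip.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // 30 ms timeout
  if (duration == 0) return 999;                   // nothing in range
  return duration / 58;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(MOTOR_EN, OUTPUT);
  pinMode(NANO_PIN, OUTPUT);
}

void loop() {
  if (readDistanceCm() < STOP_CM) {
    analogWrite(MOTOR_EN, 0);        // stop in our tracks
    digitalWrite(NANO_PIN, HIGH);    // tell the Nano to look for a face
    delay(8000);                     // give SAM time to offer a drink
    digitalWrite(NANO_PIN, LOW);
    // A real build would turn here before setting off again.
  } else {
    analogWrite(MOTOR_EN, 180);      // cruise forward
  }
}
```

One nice property of this split is that the two boards stay loosely coupled: the Arduino only raises a line to say "I stopped at something," and never needs to know whether the Nano actually found a face.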
This project included designing and 3D printing two different enclosures: one for the lower section, housing the Arduino, motors, wheels, batteries, motor controller, and ultrasonic sensors, plus an extension to hold up the wooden pole that the top half of the machine sits on; and a second enclosure at the top, just below the serving tray, holding the Jetson Nano, webcam, and speaker.
Coding included C++ for the Arduino section, which essentially amounted to building a rudimentary self-driving vehicle. Python was used to program the Jetson Nano, with machine learning incorporated to build the model for face detection.
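And here is a minimal sketch of what the Nano-side Python might look like: wait for the Arduino's signal, grab a frame, detect a face with OpenCV, then check it against the running database of faces. The face_recognition library for the "have I seen you before?" check, the Jetson.GPIO pin, and espeak for speech output are all assumptions for illustration, not necessarily what SAM actually ran.

```python
# Minimal sketch of the Nano-side logic. The face_recognition library,
# the GPIO wiring, and espeak for speech are illustrative assumptions.
import os
import cv2
import face_recognition
import Jetson.GPIO as GPIO

SIGNAL_PIN = 18          # assumed GPIO pin wired to the Arduino
known_faces = []         # SAM's running "database" of face encodings

GPIO.setmode(GPIO.BCM)
GPIO.setup(SIGNAL_PIN, GPIO.IN)
camera = cv2.VideoCapture(0)
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def say(text):
    # espeak is one simple way to get speech out of the Nano
    os.system(f'espeak "{text}"')

while True:
    GPIO.wait_for_edge(SIGNAL_PIN, GPIO.RISING)   # Arduino saw an obstacle
    ok, frame = camera.read()
    if not ok:
        continue
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        continue                                   # obstacle wasn't a person
    x, y, w, h = faces[0]
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    # face_recognition expects (top, right, bottom, left) boxes
    encoding = face_recognition.face_encodings(rgb, [(y, x + w, y + h, x)])[0]
    if any(face_recognition.compare_faces(known_faces, encoding)):
        say("Welcome back! Would you like another drink?")
    else:
        known_faces.append(encoding)               # remember this new face
        say("Hello! Would you like a drink?")
```

Keeping the face database as a simple in-memory list of encodings is fine for an afternoon at a science fair; anything longer-lived would want to persist the encodings to disk.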