Brewing Coffee, Translating Words, and Moving Virtual Objects
Students in Embedded Systems in Robotics brought lessons learned in class to life through their first-quarter team projects.
Can a robot replace a barista?
A robot might not have the same types of conversations or face-to-face interactions as a human, but students in Northwestern Engineering's Master of Science in Robotics (MSR) program proved it can brew a pretty good cup of pour-over coffee.
The project was part of Embedded Systems in Robotics, a project-based course for students in their first quarter of the program. The class introduces students to computer vision, control, kinematics, localization, and path planning. Most importantly, the course provides students with the opportunity to use the Robot Operating System (ROS), a key framework emphasized throughout MSR and used by many robotics professionals.
The students who created "Botrista" used a seven-degree-of-freedom (7DOF) robot arm, an Intel RealSense camera, and OpenCV, an open-source computer vision library, to locate a mug and carry out the steps needed to brew and pour the coffee.
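For readers curious what that vision step might look like, here is a minimal, hypothetical sketch of locating a mug in a single color image with OpenCV. The locate_mug helper, the HSV thresholds, and the assumption of a distinctly colored (blue) mug are illustrative only, not the team's actual code.

```python
# Illustrative sketch only: assumes a color image from the camera and a blue mug.
import cv2
import numpy as np

def locate_mug(color_image: np.ndarray):
    """Return the (x, y) pixel centroid of the largest mug-colored blob, or None."""
    hsv = cv2.cvtColor(color_image, cv2.COLOR_BGR2HSV)
    # Hypothetical HSV range for a blue mug; real thresholds depend on lighting.
    mask = cv2.inRange(hsv, (100, 120, 70), (130, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```

In a full system, a pixel location like this would be combined with the camera's depth data to produce a 3D target for the arm's motion planner.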
"You're not creating a commercial product, but you're creating something close to it," said Anuj Natraj (MSR '24), one of five students who worked on the project. "You have to systematically figure out an entire programming structure as well as how to build a product. It's a lot of learning."
MSR codirector Matthew Elwin, who teaches the class, said that’s the course’s goal.
"This class teaches a lot of important lessons in robotics early, including that everything is going to be harder than you think it is," Elwin said. "We learn about all these things in theory, but this is where you have to take everything and put it together and put it into practice."
At the end of the quarter, the student teams demonstrated their projects for each other, as well as other students, faculty, and staff from MSR and the Center for Robotics and Biosystems.
One team incorporated ongoing research into its project, which focused on dexterous manipulation through virtual reality. The setup featured five robot arms, a virtual reality headset, haptic gloves, and two anthropomorphic robot hands. The system enabled a user to stack toy rings in a virtual environment, receive haptic feedback through the gloves, and then have the robot arms replicate the stacking.
Elwin said it was the first time he’d integrated a complex ongoing research project into this introductory course experience.
"I was impressed with the ability of students to jump into a large ongoing research project and immediately make contributions," he said. "It is a testament to their dedication to learning during what was an extremely fast-paced quarter."
For Kassidy Shedd (MSR '24), the opportunity to learn what ROS is and how it works was the most valuable part of the course. Her team created a robot that could read a word written on a whiteboard in any language, translate it to any other language, say it out loud, and then write the translation on the whiteboard.
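As a rough illustration of the stages such a system chains together, the sketch below uses pytesseract for the reading step; translate_text, speak, and write_on_whiteboard are hypothetical placeholders for the translation, speech, and arm-writing components, not the team's implementation.

```python
# Rough pipeline sketch; not the team's code. pytesseract handles the OCR step,
# while translate_text, speak, and write_on_whiteboard are hypothetical placeholders.
import cv2
import pytesseract

def read_word(image_path: str, lang: str = "eng") -> str:
    """OCR the word written on the whiteboard."""
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    return pytesseract.image_to_string(gray, lang=lang).strip()

def translate_text(word: str, target_lang: str) -> str:
    """Placeholder for a translation service or model."""
    raise NotImplementedError

def speak(text: str) -> None:
    """Placeholder for a text-to-speech engine."""
    raise NotImplementedError

def write_on_whiteboard(text: str) -> None:
    """Placeholder for planning and executing arm strokes that trace the text."""
    raise NotImplementedError

def run(image_path: str, target_lang: str) -> None:
    word = read_word(image_path)
    translation = translate_text(word, target_lang)
    speak(translation)
    write_on_whiteboard(translation)
```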
Seeing the robot do what it was supposed to do was an incredible experience, she said.
"Being able to see the robot in full action, that was the most rewarding part," Shedd said, "being able to understand what is making it do what it's doing, rather than just watching a robot move and saying, 'Oh, that's cool,' which is what I would have done before this class.
"This class taught me the best ways to structure my code and write good code. It taught me about persevering when things get hard. This class taught me how to think about things differently so I could get over the obstacles I was facing."