How Augmented Reality and Cobots Drive the Next Wave of Automation
17. November 2017 / By Universal Robots / 1 Comment
Modern Machine Shop did this video on ITAMCO’s AR solution with UR cobots
ITAMCO demonstrates prototyping and collaboration applications with cobot and Microsoft HoloLens
ITAMCO, an Indiana-based manufacturer of precision-machined components, recently demonstrated an augmented reality application using a UR robot and a Microsoft HoloLens headset, whose depth-sensing technology descends from the Xbox Kinect gaming sensor. The HoloLens is a wearable computer that overlays information on the operator's view of the robot's actions and lets the operator control the robot using hand gestures. Because the HoloLens has a camera, it can record both real-life and virtual images to share with other individuals; for example, an engineer or operator could demonstrate a robot setup to someone in another plant or department.
Joel Neidig, Business Development & Technology Manager at ITAMCO, commanding the UR5 cobot through HoloLens.
Joel Neidig, Business Development & Technology Manager at ITAMCO says, “I think it’s going to bring a lot of collaboration between operators and engineers, even going out to the point-of-use on the manufacturing floor, where the UR robot is being used every day. You can capture workflows and the motion of the robot, and people can record their setups and display some of the virtual models inside the machine before they actually manufacture it.”
Using a virtual environment prior to manufacturing could be especially valuable to experiment with setup for expensive parts, or to plan for parts that haven’t been manufactured yet. The AR system can show the user how the part will be loaded in the machine without having the actual parts on-hand. “It’s really important to have more prototype tools like this throughout the industry, and being able to rapidly prototype and test your design,” Neidig explains. This type of system will allow engineers, manufacturers, and operators to collaborate and to make changes so that when parts go to production, the processes are as efficient as possible.
Example of 3D graphics seen through ITAMCO’s HoloLens
Neidig looked specifically for a collaborative robot so that people using the AR system could safely stand next to the robot while it was in action, without being separated by a safety cage. The robot needed to be lightweight enough to be easily moved, and ease of integration was also key. Neidig says, “We chose the UR robot for this application because it’s an open platform. We can communicate with Python scripts and secure sockets, and it’s got a nice Ethernet port that’s already set up. UR brings it all together, and just being intuitive, it’s very easy to maneuver around, and we like the platform as a whole.”
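The open platform Neidig describes is the UR controller's script interface: the robot accepts URScript commands sent as plain text over a TCP socket on its built-in Ethernet port. As a minimal sketch (the IP address is a placeholder, and motion parameters are illustrative defaults, not ITAMCO's actual values):

```python
import socket

UR_SECONDARY_PORT = 30002  # UR controller's secondary client interface, accepts URScript

def movej_command(joints, a=1.4, v=1.05):
    """Build a URScript movej command for six joint angles in radians."""
    joint_str = ", ".join(f"{j:.4f}" for j in joints)
    return f"movej([{joint_str}], a={a}, v={v})\n"

def send_urscript(robot_ip, script, port=UR_SECONDARY_PORT):
    """Open a TCP connection to the controller and send one URScript line."""
    with socket.create_connection((robot_ip, port), timeout=2.0) as sock:
        sock.sendall(script.encode("utf-8"))

# Build a command for an example pose; sending requires a reachable robot,
# e.g. send_urscript("192.168.1.10", cmd)
cmd = movej_command([0.0, -1.5708, 0.0, -1.5708, 0.0, 0.0])
```

In an AR setup like ITAMCO's, the HoloLens application would translate the operator's hand gestures into joint targets and push commands like this over the network, which is exactly the kind of integration an open socket interface makes straightforward.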
At this year’s Automate show in Chicago, Kubica Corporation let attendees assemble the inside panel of a car door in an AR-assisted process
Kubica integrates Light Guide Systems projection for assembly assistance and QA
The combination of AR technology and cobots can also bring a whole new level of collaboration to the table. Kubica Corporation, a Michigan-based engineering firm, recently demonstrated an AR automobile door panel assembly program using a UR10 robot and a Light Guide Systems projection system. Carol Choma, Operations & New Business Development Manager at Kubica, explains the application, saying, “This is a great example of a true collaborative cell work environment where the UR10 robot from Universal Robots is working with the operator at an assembly cell for the automotive industry.”
In this application, the Light Guide system projects assembly instructions directly onto the work environment. The operator swipes his hand over the virtual “start button” projection and watches for additional directions as the robot begins its process. The projection highlights the robot’s actions and prompts the operator for his tasks. The projection guides the proper assembly process by lighting up and color-coding the path for the operator to install a wire harness accurately while the robot is working on another part of the assembly.
The operator continues to follow the instructions shown by the Light Guide system, interacting with it by swiping a hand over virtual “buttons.”
The operator can respond to quality control messages, such as a missing pin, and use the projection system’s virtual controls to instruct the robot on additional processes. Once all processes are complete, the projection system takes a picture of the assembly for traceability purposes.
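The cell described above is essentially a small state machine: the projector shows a step, the operator confirms it with a swipe, a failed quality check (such as a missing pin) loops back to rework, and a traceability photo closes the cycle. A hypothetical sketch, with step names invented for illustration and no relation to Light Guide Systems' actual software:

```python
# Illustrative states for one assembly cycle; all names are hypothetical.
STEPS = ["start", "install_harness", "qc_check", "capture_photo", "done"]

def next_step(current, qc_passed=True):
    """Advance the cell's state; a failed QC check loops back to rework."""
    if current == "qc_check" and not qc_passed:
        return "install_harness"  # e.g. re-seat a missing pin
    i = STEPS.index(current)
    return STEPS[min(i + 1, len(STEPS) - 1)]

# Walk one cycle in which the first QC check fails and the second passes.
state, history = "start", []
qc_results = iter([False, True])
while state != "done":
    history.append(state)
    passed = next(qc_results) if state == "qc_check" else True
    state = next_step(state, qc_passed=passed)
history.append(state)
```

The point of the sketch is the control flow, not the specifics: the robot works in parallel on its own portion of the assembly, while the projection system walks the operator through theirs and only lets the cycle finish once quality checks pass.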
AR research is still in its early stages, but promises to expand the use of robots into more complex applications, improve quality and consistency, and increase opportunities for collaboration with human workers.
Ready to get your team on board the automation train? Find out how in our free ebook.