TY - GEN
T1 - An Embedded Deep Learning System for Grasping and Classifying PVC Pieces in Cluttered Environments
AU - Gomez, Rolando Mendieta
AU - Guerrero, Sara
AU - Realpe, Miguel
AU - Añazco, Edwin Valarezo
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Robot grasping and manipulation in cluttered environments remain challenging tasks due to the need for multiple machine intelligence capabilities. In this research, we present a Deep Learning-driven machine vision system integrated with a robotic control framework to grasp and classify PVC pieces in stand-alone mode using a Niryo One robotic arm, an RGB-D camera, and a Jetson Nano. The Deep Learning-based algorithms were integrated using ROS to automate the object grasping, classification, and relocation (i.e., organization) tasks. Validation of the proposed system yielded a 94% success rate in the grasp-and-hold task, 90.5% accuracy in real-time object classification, and 86% accuracy in the overall object organization task. Additionally, the complete system was deployed on a Jetson Nano without relying on external computing resources. CPU, GPU, and RAM usage remained below 65%, demonstrating the feasibility of performing object organization on a computation-constrained board. These results establish a solid foundation for complex robotic manipulation systems used in collaborative or industrial applications.
UR - https://www.scopus.com/pages/publications/105018304489
U2 - 10.1109/CASE58245.2025.11164062
DO - 10.1109/CASE58245.2025.11164062
M3 - Conference contribution
AN - SCOPUS:105018304489
T3 - IEEE International Conference on Automation Science and Engineering
SP - 3518
EP - 3523
BT - 2025 IEEE 21st International Conference on Automation Science and Engineering, CASE 2025
PB - IEEE Computer Society
T2 - 21st IEEE International Conference on Automation Science and Engineering, CASE 2025
Y2 - 17 August 2025 through 21 August 2025
ER -