SIIA Público

Book title: Proceedings - 2010 IEEE Electronics, Robotics and Automotive Mechanics Conference, CERMA 2010
Chapter title: Real Time Object Recognition Methodology

UNAM authors:
MARGARITA CABRERA BRAVO; ROMAN VICTORIANO OSORIO COMPARAN; HUMBERTO GOMEZ NARANJO;
External authors:

Language:
English
Year of publication:
2010
Keywords:

3D object; ART networks; Descriptors; Fast learning; Fuzzy ARTMAP; Intelligent manufacturing cell; Learning techniques; Neuronal networks; On-line recognition; Real-time object recognition; Robotic tasks; Single stage; Step-by-step; Unstructured environments; Visual perception; Algorithms; Automobile electronic equipment; Automobile parts and equipment; Flexible manufacturing systems; Industrial robots; Intelligent robots; Mechanics; Neural networks; Object recognition; Robotics; Vectors; Robotic assembly


Abstract:

This paper presents a methodology for on-line recognition and classification of parts in robotic assembly tasks and its application in an intelligent manufacturing cell. The performance of industrial robots working in unstructured environments can be improved using visual perception and learning techniques. Object recognition is accomplished with a neural network based on the Fuzzy ARTMAP architecture, which receives as input a descriptor vector called CFD&POSE. This vector represents an innovative methodology for the classification and identification of parts in robotic tasks; every stage of the methodology is described step by step, and the proposed algorithms are explained. The vector compresses 3D object data from assembly parts and is invariant to scale, rotation, and orientation. The approach, combined with the fast learning capability of ART networks, demonstrates its suitability for industrial robot applications, as shown in the experimental results, and offers the possibility of concatenating additional information into the descriptor vector to achieve a more robust methodology.
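The abstract's pipeline (a descriptor vector classified by a Fuzzy ARTMAP network with fast learning) can be illustrated with a minimal sketch. This is a simplified, hypothetical implementation of the standard Fuzzy ART category-choice, vigilance, and fast-learning rules with a category-to-label map; the paper's actual CFD&POSE descriptor, match tracking, and parameter values are not specified here, so the descriptor vectors and labels below are placeholders.

```python
import numpy as np

class FuzzyARTMAPSketch:
    """Simplified Fuzzy-ARTMAP-style classifier: one Fuzzy ART module
    whose committed categories are mapped directly to class labels.
    Hypothetical sketch, not the authors' implementation."""

    def __init__(self, rho=0.75, alpha=0.001, beta=1.0):
        self.rho = rho      # vigilance: minimum match to accept a category
        self.alpha = alpha  # choice parameter
        self.beta = beta    # learning rate (beta = 1 gives fast learning)
        self.weights = []   # one weight vector per committed category
        self.labels = []    # class label associated with each category

    @staticmethod
    def _complement_code(x):
        # Complement coding keeps the input norm constant and is the
        # standard normalization used by Fuzzy ART (inputs in [0, 1]).
        x = np.asarray(x, dtype=float)
        return np.concatenate([x, 1.0 - x])

    def train(self, x, label):
        I = self._complement_code(x)
        # Rank categories by the choice function T_j = |I ^ w_j| / (alpha + |w_j|),
        # where ^ is the elementwise (fuzzy) minimum.
        order = sorted(
            range(len(self.weights)),
            key=lambda j: np.minimum(I, self.weights[j]).sum()
                          / (self.alpha + self.weights[j].sum()),
            reverse=True)
        for j in order:
            match = np.minimum(I, self.weights[j]).sum() / I.sum()
            if match >= self.rho and self.labels[j] == label:
                # Resonance: fast learning, w <- beta*(I ^ w) + (1 - beta)*w
                w = self.weights[j]
                self.weights[j] = (self.beta * np.minimum(I, w)
                                   + (1.0 - self.beta) * w)
                return
        # No acceptable category: commit a new one (one-shot learning).
        self.weights.append(I.copy())
        self.labels.append(label)

    def predict(self, x):
        I = self._complement_code(x)
        scores = [np.minimum(I, w).sum() / (self.alpha + w.sum())
                  for w in self.weights]
        return self.labels[int(np.argmax(scores))]

# Placeholder descriptor vectors standing in for CFD&POSE inputs.
net = FuzzyARTMAPSketch()
net.train([0.9, 0.1, 0.8], "bolt")
net.train([0.1, 0.9, 0.2], "nut")
print(net.predict([0.85, 0.15, 0.75]))  # -> bolt
```

Because fast learning commits a new category in a single presentation, the network can incorporate a new part class on-line without retraining, which is the property the abstract highlights for real-time robotic applications.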


UNAM entities cited: