Project number: CZ.02.1.01/0.0/0.0/17_049/0008425
Programme: Operational Programme Research, Development and Education
Period: 2018 - 2022
Name of applicant: VŠB – Technical University of Ostrava
Research organizations: VŠB – Technical University of Ostrava, Fraunhofer Institute for Machine Tools and Forming Technology (Chemnitz, Germany)
Partners, corporations: SIEMENS, s.r.o., Brose CZ spol. s.r.o., HELLA AUTOTECHNIK NOVA, s.r.o., VOP CZ, s.p., Moravskoslezské inovační centrum Ostrava, a.s., Moravskoslezský automobilový klastr, z.s., Varroc Lighting Systems, s.r.o., Continental Automotive Czech Republic s.r.o., Brano a.s.

Brief Project Description - Abstract

The aim of the proposed project "Industry Research Platform 4.0 and Robotics in the Ostrava Agglomeration" is to intensify long-term interdisciplinary cooperation by supporting the creation of new, and the development of existing, interdisciplinary partnerships between research organizations and the application sphere within jointly conducted research focused on Industry 4.0 and Robotics. During the implementation of the project, new instruments and forms of bi-directional transfer of the unique knowledge and experience of the involved entities (research organizations and industrial partners) will be created, and existing ones strengthened. The project plan also fulfils specific objective SO2 PA1 of OP RDE (capacity building and strengthening of the long-term cooperation of research organizations with the application sphere) by creating a network of research workplaces for joint interdisciplinary and intersectoral research.

These newly built capacities will form the platform infrastructure for research into control, information and communication technologies oriented towards Industry 4.0 and Robotics: a unique instrumental infrastructure and software (SW) resources enabling joint interdisciplinary and intersectoral research into new principles and verification of new concepts applicable in Industry 4.0 and Robotics. Within the project, positions will be created for researchers from research organizations who will use these capacities, their instrumental infrastructure and SW resources, to carry out joint research into new principles and concepts in collaboration with the existing researchers of the research institutes and industrial partners involved in the project. The results of the joint research, in the form of proven new concepts implemented in practice by the industrial partners, will strengthen the competitiveness and innovation potential of these partners, which is the main intended impact of the project.

Research Intents

1. BIG DATA ANALYSIS

2. AUTONOMOUS AND COLLABORATIVE ROBOTS (solved by the Department of Robotics)

2.1  Research of physical principles and technical and software resources for on-line optimization of the robot trajectory in a dynamically changing environment with obstacles.

2.1.1  Research of suitable physical principles of sensors for analysing the workspace and evaluating mutual object distances.
2.1.2  Finding new mathematical procedures for processing and interpreting spatial data to ensure collision-free movement of the robot arm, including a tool or other manipulated object, in environments with dynamically changing obstacles.
2.1.3  Research of appropriate principles of collision-free trajectory determination and of ensuring that the robot stops safely before colliding with an obstacle or any part of the operator's body.
2.1.4  Research of possibilities of integrating automatic trajectory correction into robot software. Implementation of a functional sample of the equipment and verification of its functionality with the industrial partners of the project.

2.2  Research of technical instruments for assisted assembly with a collaborative robot.

2.2.1  Research of technical and software instruments of the HMI for guiding collaborative robots via a contact or contactless interface.
2.2.2  Research of suitable physical principles of sensors for finding the exact position and orientation of the manipulated object in the gripper.
2.2.3  Integration of a manual guidance system with collision control into robot software. Implementation of a functional sample of the equipment and verification of its functionality with the industrial partners of the project.

2.3  Research of technical and software instruments for inspection of 3D shapes of components during production and manipulation.

2.3.1  Research of appropriate physical principles and sensory systems for inspecting the shapes of complex spatial components and checking the relative position and orientation of functional surfaces.
2.3.2  Research of mathematical methods of spatial point cloud processing with respect to the required evaluation speed during robotic manipulation of the object.
2.3.3  Research of suitable algorithms for comparing product shapes with their CAD models and for searching for 3D models based on their 3D point cloud data.

3. MODELLING METHODS AND TECHNIQUES: PIL, HIL, SIL, MODEL-BASED DESIGN

4. ARTIFICIAL INTELLIGENCE

5. SMART SYSTEMS AND COMPONENTS AND COMMUNICATION ON THE INTERNET OF THINGS

Leaders of intent 2 (Autonomous and Collaborative Robots)

  • prof. Dr. Ing. Petr Novák
  • doc. Ing. Dalibor Lukáš, Ph.D.
  • doc. Ing. Zdenko Bobovský, Ph.D.

2. Autonomous and Collaborative Robots - current state of research

2.1  Research of physical principles and technical and software resources for on-line optimization of the robot trajectory in a dynamically changing environment with obstacles

Fig 2.1-1: Automatic obstacle detection and avoidance by a collaborative robot

2.2  Research of technical instruments for assisted assembly with a collaborative robot

Fig 2.2-1: Getting the position of an object of manipulation by scanning (V-REP simulation)

One of the tasks addressed in this project is automatic verification of the precise position and orientation of a manipulated object in a gripper. The position of the object in the gripper will be determined by scanning the manipulated object. Two scans are taken: the first is a calibrating scan (more points, larger area), the second a working scan (fewer points, smaller area). Correction data (translation and orientation) of the manipulated object are then computed by registering the working scan to the calibrating scan with the ICP algorithm.
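
As an illustration, the correction step could look like the following minimal sketch, assuming the Open3D library and point clouds given as Nx3 NumPy arrays (the function name and the distance tolerance are ours, not the project's):

```python
# Minimal sketch of the two-scan pose correction (assumes Open3D).
import numpy as np
import open3d as o3d

def pose_correction(calibration_pts, working_pts, max_dist=0.005):
    """Align the working scan to the calibrating scan with ICP; the
    resulting rotation and translation are the correction data."""
    calib = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(calibration_pts))
    work = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(working_pts))
    result = o3d.pipelines.registration.registration_icp(
        work, calib, max_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    T = result.transformation           # 4x4 homogeneous transform
    return T[:3, :3], T[:3, 3]          # rotation matrix, translation vector
```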

To test this method without a real sensor, a scene with virtual sensors was created in the V-REP simulation system, together with our own application controlling the scene. To create a 3D scan, 2D scanners are used in combination with linear movement.
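
Composing the 3D scan from the 2D profiles is then straightforward, as in this sketch (a simplified illustration assuming the scanner advances along the y axis by a fixed step between profiles; names and geometry are assumptions):

```python
import numpy as np

def profiles_to_cloud(profiles, step):
    """Stack successive 2D scanner profiles (each an Mx2 array of (x, z)
    points in the scan plane) into one 3D point cloud; the scanner is
    assumed to move along the y axis by `step` between profiles."""
    cloud = []
    for i, prof in enumerate(profiles):
        y = np.full((prof.shape[0], 1), i * step)   # position on linear axis
        cloud.append(np.hstack([prof[:, :1], y, prof[:, 1:]]))
    return np.vstack(cloud)
```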

Fig 2.2-2: Detecting arm pose for the robot gesture interaction system

Another task related to this research intent is to create a system capable of controlling a collaborative robot by gestures. The proposed solution is based on a camera watching the human operator: the system detects arm postures, the locations of hands and palms, and the configuration of individual fingers. The robot can then be controlled and navigated by chosen gestures. The camera system can also be used to localize and identify other objects in the scene (bolts, nuts, holes for bolts, etc.); these entities are marked and labelled in augmented reality (projector, monitor, HMD device).
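
As an illustration of how detected keypoints can be turned into robot commands, a single gesture rule might look like this sketch (the keypoint indices, the threshold and the "stop" gesture are assumptions for this example, not the project's actual gesture set):

```python
import numpy as np

# Hypothetical indices of the right-arm joints in the keypoint array
# (e.g. as produced by a pose estimator such as OpenPose).
SHOULDER, ELBOW, WRIST = 2, 3, 4

def detect_stop_gesture(keypoints):
    """keypoints: Kx2 array of (x, y) pixel coordinates, y grows downward.
    Returns True when the wrist is raised above the shoulder."""
    shoulder, wrist = keypoints[SHOULDER], keypoints[WRIST]
    return wrist[1] < shoulder[1]

# A robot controller would evaluate such classifiers every frame and map
# each recognized gesture to a motion command (stop, approach, hand over).
```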

2.3  Research of technical and software instruments for inspection of 3D shapes of components during production and manipulation

Fig 2.3-1: A 3D model of a mechanical part and the corresponding virtual scan (0 % noise, 5 % noise)

Fig 2.3-2: Demonstration of the ability to simulate damaged parts

Physical mechanical components will be identified by making a 3D scan of the part and comparing the scan (a point cloud) with models in a database. To be able to develop and test the point cloud matching algorithm without a real 3D scanner, an application for virtual scanning was created. The application can load a 3D model in the STL file format and export point clouds representing simulated 3D scans.
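
A minimal sketch of the database comparison, assuming Open3D and using ICP fitness as the match score (a real pipeline would need a global pre-alignment before ICP; all names and thresholds here are ours):

```python
import numpy as np
import open3d as o3d

def identify_part(scan, model_files, n_points=5000, max_dist=0.01):
    """Return the STL file whose sampled surface best matches `scan`
    (an open3d.geometry.PointCloud), together with the ICP fitness."""
    best_file, best_fitness = None, -1.0
    for path in model_files:
        mesh = o3d.io.read_triangle_mesh(path)          # load the STL model
        ref = mesh.sample_points_uniformly(n_points)    # reference cloud
        reg = o3d.pipelines.registration.registration_icp(
            scan, ref, max_dist, np.eye(4),
            o3d.pipelines.registration.TransformationEstimationPointToPoint())
        # reg.fitness = fraction of scan points matched within max_dist
        if reg.fitness > best_fitness:
            best_file, best_fitness = path, reg.fitness
    return best_file, best_fitness
```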

There are two possible virtual scanning methods:

  • fixed scanner(s) around the model (one or more),
  • scanner(s) rotating around the model.

Each virtual scanner has the following parameters:

  • position,
  • orientation,
  • angle of view (vertical and horizontal),
  • resolution (vertical and horizontal),
  • accuracy (noise) [%].

The simulated scan can be randomly rotated to simulate different orientations of the part. The application can also generate a reference point cloud evenly covering the whole surface, which can be used in the matching algorithm, and it can simulate damaged parts (bending, slicing) to verify the robustness of the matching algorithm. Another feature is batch processing: virtual scans or reference point clouds can easily be generated for a large number of 3D model files (STL) at once.
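
A strongly simplified sketch of such a virtual scanner, assuming the trimesh library, a single fixed scanner looking straight down and noise applied as a percentage of the measured range (the real application supports the full parameter set listed above):

```python
import numpy as np
import trimesh

def virtual_scan(stl_path, origin, fov_deg=(40.0, 60.0),
                 res=(64, 96), noise_pct=0.0):
    """Cast a regular grid of rays from `origin` along -z through the
    given vertical/horizontal angle of view and return the hit points."""
    mesh = trimesh.load(stl_path)
    origin = np.asarray(origin, dtype=float)
    v = np.radians(np.linspace(-fov_deg[0] / 2, fov_deg[0] / 2, res[0]))
    h = np.radians(np.linspace(-fov_deg[1] / 2, fov_deg[1] / 2, res[1]))
    vv, hh = np.meshgrid(v, h, indexing="ij")
    dirs = np.stack([np.tan(hh), np.tan(vv), -np.ones_like(vv)], axis=-1)
    dirs = dirs.reshape(-1, 3)
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    origins = np.tile(origin, (dirs.shape[0], 1))
    hits, _, _ = mesh.ray.intersects_location(origins, dirs,
                                              multiple_hits=False)
    if noise_pct > 0:                       # accuracy as a % of the range
        rng = np.linalg.norm(hits - origin, axis=1, keepdims=True)
        hits = hits + np.random.normal(0.0, noise_pct / 100.0,
                                       hits.shape) * rng
    return hits
```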

As far as real scanning of physical parts is concerned, the photogrammetry scanning method has been tested. The method uses a series of snapshots to create a 3D model of the selected object. The surface of the object and the lighting conditions under which the pictures are taken play an important role. A testing device has been designed and built that automatically rotates the scanned object and captures the necessary snapshots at defined angle increments.

The snapshots are then turned into a point cloud or a 3D mesh by the Meshroom software tool. The time needed to obtain a 3D model depends on the number of frames and their quality and ranges from tens of minutes to several hours. The created 3D model does not correspond to the real dimensions of the selected object; however, the proportions of the object's dimensions are preserved.
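
Because the reconstruction lacks absolute scale, a common remedy is to rescale the cloud using one dimension measured on the physical part with a calliper; a minimal sketch (our names, not a Meshroom feature):

```python
import numpy as np

def rescale_cloud(points, measured, true_value):
    """Rescale a photogrammetry point cloud to real units: `measured` is
    one known dimension read from the model, `true_value` the same
    dimension measured on the physical part (e.g. both in mm)."""
    scale = true_value / measured
    centroid = points.mean(axis=0)
    return (points - centroid) * scale + centroid   # uniform scaling
```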

Another tested 3D scanning method is based on a camera and a pair of lasers. The lasers project line-shaped beams onto the object, and the camera records the object surface only along the laser lines. To obtain a complete 3D model, the object is placed on a rotary table. Creating a 3D model takes approximately 5 minutes, and the accuracy of the model is ±2 mm.
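
The underlying triangulation can be sketched as follows, under strongly simplified assumptions (laser plane passing through the rotation axis, known camera-laser angle and pixel pitch; all parameter values are illustrative, not the device's calibration):

```python
import numpy as np

def line_scan_to_points(pixel_rows, pixel_cols, table_angle_deg,
                        mm_per_px=0.2, cam_laser_angle_deg=30.0):
    """Convert detected laser-line pixels of one frame into 3D points.
    `pixel_cols` are lateral offsets from the rotation-axis column."""
    alpha = np.radians(cam_laser_angle_deg)
    r = pixel_cols * mm_per_px / np.sin(alpha)   # radius via triangulation
    z = pixel_rows * mm_per_px                   # height along the axis
    th = np.radians(table_angle_deg)             # rotary-table angle
    return np.stack([r * np.cos(th), r * np.sin(th), z], axis=-1)
```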

Fig 2.3-3: A testing device created for the photogrammetry scanning method

Fig 2.3-4: Demonstration of the scanning method based on two lasers and a camera