Summary: With the advent of the Industry 4.0 paradigm, manufacturing is shifting from mass production towards customisable production lines. While robots excel at executing repetitive tasks quickly, reliably, and precisely, they lack the versatility of humans that is now desired. Human-robot collaboration (HRC) seeks to address this issue by allowing human operators to work alongside robots in close proximity, leveraging the strengths of both agents to increase adaptability and productivity. Safety is critical to user acceptance and to the success of collaborative robots (cobots), and is therefore a focus of research. Typical approaches provide the cobot with information such as operator pose estimates or higher-level motion predictions to facilitate adaptive trajectory and action planning. Locating the operator in the shared workspace is thus a key capability. This dissertation seeks to kickstart the development of a human operator tracking system that provides a three-dimensional pose estimate and, in turn, ensures safety. State-of-the-art methods for human pose estimation in two-dimensional RGB images are tested and evaluated on a custom dataset. The results are then analysed with regard to real-time capability in the use case of a single operator performing industrial assembly tasks in a collaborative robotic cell equipped with a robotic arm. The resulting observations enable future work such as the fusion of depth information.