Analysis of masonry work activity recognition accuracy using a spatiotemporal graph convolutional network across different camera angles

  • Sangyoon Yun
  • Sungkook Hong
  • Sungjoo Hwang
  • Dongmin Lee
  • Hyunsoo Kim

Research output: Contribution to journal › Article › peer-review


Abstract

Human activity recognition (HAR) in construction has gained attention for its potential to improve safety and productivity. While HAR research has shifted toward vision-based approaches, most studies use data from a single camera angle, limiting understanding of how camera angle affects accuracy. This paper addresses this gap by using the AlphaPose and Spatial-Temporal Graph Convolutional Network (ST-GCN) algorithms to analyze the impact of camera angle on HAR accuracy in masonry work. Data were collected from seven angles (0° to 180°), with only the frontal view used for training. Results showed consistently high recognition accuracy (>80%) for side views, while accuracy decreased as the camera shifted toward rear views, especially directly behind the worker, due to occlusion. By quantifying HAR accuracy across angles, this study provides baseline data for predicting performance from various camera positions, improving camera placement strategies and enhancing the effectiveness of monitoring systems on construction sites.
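The pipeline the abstract describes — AlphaPose extracts 2-D skeleton keypoints from video, and an ST-GCN classifies activities by convolving features over the skeleton's joint graph — can be illustrated with a minimal sketch of the spatial aggregation step at the core of ST-GCN. This assumes the 17-keypoint COCO skeleton that AlphaPose outputs; the edge list and the simple degree-normalized mean aggregation below are illustrative simplifications of the learned ST-GCN layer, not the authors' implementation.

```python
# Edges of the 17-keypoint COCO skeleton (joint index pairs), as commonly
# used with AlphaPose output: 0 nose, 1-4 eyes/ears, 5-10 arms, 11-16 legs.
COCO_EDGES = [
    (0, 1), (0, 2), (1, 3), (2, 4),            # head
    (5, 6), (5, 7), (7, 9), (6, 8), (8, 10),   # shoulders and arms
    (5, 11), (6, 12), (11, 12),                # torso
    (11, 13), (13, 15), (12, 14), (14, 16),    # legs
]
NUM_JOINTS = 17

def neighbors(edges, n):
    """Adjacency list with self-loops, i.e. the A + I term of a GCN layer."""
    adj = {j: {j} for j in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    return adj

def spatial_aggregate(x, adj):
    """One degree-normalized spatial aggregation: each joint's feature
    becomes the mean over its skeletal neighborhood (including itself).
    x is a list of per-joint feature tuples, e.g. (x_coord, y_coord)."""
    out = []
    for j in range(len(x)):
        nbrs = adj[j]
        out.append(tuple(
            sum(x[k][c] for k in nbrs) / len(nbrs)
            for c in range(len(x[j]))
        ))
    return out
```

With joint coordinates as input features, each aggregation pass smooths a joint toward its skeletal neighbors; stacking such spatial layers with temporal convolutions over the frame sequence is what lets ST-GCN recognize activities from pose alone, which is also why rear-view occlusion of keypoints degrades accuracy.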

Original language: English
Article number: 106178
Journal: Automation in Construction
Volume: 175
DOIs
State: Published - Jul 2025

Bibliographical note

Publisher Copyright:
© 2024

Keywords

  • AlphaPose
  • Camera angle
  • Human Activity Recognition (HAR)
  • Spatial-Temporal Graph Convolutional Network (ST-GCN)
