Precise Top-Layer Fabric Segmentation for Fabric Destacking with Edge- and Shape-Aware Deep Networks

W. Dong1, 3, D. Bhattacharya1, 2, A. Kobayashi1, 2, A. Seino1, 2, F. Tokuda1, 2, X. Huang1, 3, K. Tang1, 3, N. C. Tien1, 2, K. Kosuge1, 2, 3

1. Centre for Transformative Garment Production, Units 1215 to 1220, 12/F, Building 19W, SPX1, Hong Kong Science Park, Pak Shek Kok, N. T., Hong Kong SAR
2. Department of Electrical and Electronic Engineering, The University of Hong Kong, Hong Kong SAR
3. JC STEM Lab of Robotics for Soft Materials, Department of Electrical and Electronic Engineering, Faculty of Engineering, The University of Hong Kong, Hong Kong SAR

Published in Proceedings of the 2025 IEEE International Conference on Mechatronics and Automation (ICMA), 2025

DOI: 10.1109/ICMA65362.2025.11120697

Download here.

Abstract
Fabric destacking requires precise segmentation of the topmost fabric layer, a task complicated by subtle fabric boundaries and the high visual similarity between fabric layers. Existing semantic and edge-based segmentation approaches often struggle with these complexities, limiting the performance of downstream robotic manipulation tasks. In this work, a novel segmentation training architecture tailored for top-layer fabric segmentation in stacked fabrics is proposed. The method extends the classical encoder-decoder framework by introducing two specialized branches, an edge-aware branch and a shape-aware branch, that supervise the backbone network for better tuning. The edge-aware branch enhances boundary delineation, while the shape-aware branch guides the network to capture and align the overall fabric shape with reference masks derived from Computer Aided Design (CAD) models. Experiments on a real-world fabric dataset demonstrate that the training approach outperforms established baselines, verifying the effectiveness of the multi-branch design through both quantitative results and ablation studies.
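The abstract describes a backbone supervised by two auxiliary branches during training. A minimal sketch of how such a multi-branch objective might be combined is shown below, using NumPy. All function names, the specific loss choices (cross-entropy for the segmentation and edge terms, a soft Dice term for shape alignment against the CAD-derived reference mask), and the branch weights are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    # Soft Dice loss: penalizes poor overlap between the predicted mask
    # and a reference shape mask (e.g. derived from a CAD model).
    inter = float((pred * target).sum())
    return 1.0 - (2.0 * inter + eps) / (float(pred.sum()) + float(target.sum()) + eps)

def bce_loss(pred, target, eps=1e-7):
    # Pixel-wise binary cross-entropy, averaged over the image.
    p = np.clip(pred, eps, 1.0 - eps)
    return float(-(target * np.log(p) + (1.0 - target) * np.log(1.0 - p)).mean())

def multi_branch_loss(seg_pred, seg_gt,
                      edge_pred, edge_gt,
                      shape_pred, shape_ref,
                      w_edge=0.5, w_shape=0.5):
    # Illustrative combined training objective: the main segmentation term
    # plus weighted auxiliary terms from the edge-aware and shape-aware
    # branches, all back-propagating into the shared backbone.
    return (bce_loss(seg_pred, seg_gt)
            + w_edge * bce_loss(edge_pred, edge_gt)
            + w_shape * dice_loss(shape_pred, shape_ref))
```

A near-perfect prediction on all three branches drives the combined loss toward zero, while a boundary error inflates the edge term even when the bulk of the mask is correct, which is the intuition behind supervising edges separately.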