Olugbade, Temitayo, Lin, Lili, Sansoni, Alice, Nihara, Warawita, Gan, Yuanze, Wei, Xijia, Petreca, Bruna, Boccignone, Giuseppe, Atkinson, Douglas (ORCID: https://orcid.org/0000-0001-7108-6225), Cho, Youngjun, Baurley, Sharon and Berthouze, Nadia (2024) FabricTouch: a multimodal fabric assessment touch gesture dataset to slow down fast fashion. In: ACII 2023: Affective Computing and Intelligent Interaction, 10-13 September 2023, Cambridge, MA, USA.
Accepted Version. Available under License Creative Commons Attribution. Document DOI: https://doi.org/10.23634/MMU.00632550.00329720
Abstract
Touch exploration of fabric is used to evaluate its properties, and it could further be leveraged to understand a consumer's sensory experience and preferences so as to support them in real time in making careful clothing purchase decisions. In this paper, we open up opportunities to explore the use of technology to provide such support with our FabricTouch dataset, a multimodal dataset of fabric assessment touch gestures. The dataset consists of bilateral forearm movement and muscle activity data captured while 15 people explored a total of 114 garments to evaluate them according to 5 properties (warmth, thickness, smoothness, softness, and flexibility). The dataset further includes subjective ratings of the garments with respect to each property and ratings of the pleasure experienced in exploring each garment through touch. We also report baseline work on automatic recognition. Our results suggest that it is possible to recognise which fabric property a consumer is exploring based on their touch behaviour: we obtained a mean F1 score of 0.61 for unseen garments across the 5 property types. The results further highlight the possibility of recognising the consumer's subjective rating of the fabric when the property being rated is known, with a mean F1 score of 0.97 for unseen subjects across 3 rating levels.
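To make the unseen-garment evaluation concrete, the sketch below shows one way such a baseline could be set up in Python. It is a minimal illustration, not the authors' pipeline: the feature file name (features.csv), the column names (garment_id, property), and the random-forest classifier are all assumptions for illustration; the dataset's actual format and the baseline model used in the paper may differ.

```python
# Minimal sketch (assumed setup, not the authors' pipeline):
# 5-way fabric-property recognition from touch-behaviour features,
# evaluated on unseen garments via leave-one-group-out splits so that
# each test fold contains only garments absent from training.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import LeaveOneGroupOut

# Hypothetical flat feature file: one row per touch-gesture window.
df = pd.read_csv("features.csv")
X = df.drop(columns=["garment_id", "property"]).values
y = df["property"].values        # warmth / thickness / smoothness / softness / flexibility
groups = df["garment_id"].values # hold out whole garments, mirroring the unseen-garment setup

scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    scores.append(f1_score(y[test_idx], pred, average="macro"))

print(f"Mean macro-F1 over held-out garments: {sum(scores) / len(scores):.2f}")
```

The same scaffold would apply to the 3-level rating task by swapping the label column and grouping by subject instead of garment, which is how an unseen-subjects result like the reported 0.97 F1 would typically be evaluated.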