Intelligent Clothing Design and Production Integrating CAD and Virtual Reality Technology

Xianyu Wang, Xiaoguang Sun

2023 · DOI: 10.14733/cadaps.2023.s13.111-123
Computer-Aided Design and Applications · 1 Citation

TLDR

The language decoding model based on double-layer LSTM can perform more nonlinear transformations on image and language information, thereby improving the semantic expression ability of the generated description statements.

Abstract

Clothing CAD (computer-aided clothing design) refers to the use of computers to assist designers in clothing design; it offers higher efficiency, better quality, and more accurate positioning than traditional manual design. In addition, many clothing CAD systems now integrate intelligent technology, elevating traditional clothing design to the level of intelligent assisted design. This project initiated applied research on intelligent visual technology based on deep learning (DL) in ethnic clothing culture and pattern design, and a DL-based pattern generation model is proposed. Instead of placing a simple single-hidden-layer multilayer perceptron on top of the decoder, as in the original language decoding model, the new model uses a two-layer LSTM (long short-term memory) network in the decoding part and performs the required mapping through sufficient depth and nonlinear transformation. Experiments show that the proposed algorithm greatly improves pattern matching accuracy under four-corner rotation changes, with an accuracy increase of 56.127%. In terms of the highest accuracy on a single pattern, the traditional algorithm reaches at most 70.066%, while for some ethnic patterns the improved algorithm reaches 100%. The language decoding model based on the double-layer LSTM can perform more nonlinear transformations on image and language information, thereby improving the semantic expressiveness of the generated description statements.
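The architectural change the abstract describes (stacking a second LSTM layer in the decoder so that each decoding step passes through two rounds of gated nonlinear transformation, rather than a single hidden-layer perceptron on top) can be sketched as follows. This is a minimal illustrative sketch in pure Python, not the paper's implementation: the hidden size, weight initialization, and the `two_layer_decoder_step` helper are all hypothetical stand-ins.

```python
import math
import random

HIDDEN = 4  # toy hidden size (hypothetical; the paper does not state dimensions)

def make_weights(in_dim, hid_dim, rng):
    """Random weights for one LSTM cell: four gates over the concat [x; h]."""
    def mat():
        return [[rng.uniform(-0.1, 0.1) for _ in range(in_dim + hid_dim)]
                for _ in range(hid_dim)]
    w = {name: mat() for name in ('Wi', 'Wf', 'Wo', 'Wg')}
    w.update({name: [0.0] * hid_dim for name in ('bi', 'bf', 'bo', 'bg')})
    return w

def lstm_step(x, h, c, W):
    """One LSTM cell step: gated nonlinear update of hidden and cell state."""
    concat = x + h  # concatenate input and previous hidden state
    def affine(Wg, bg):
        return [sum(wv * v for wv, v in zip(row, concat)) + b
                for row, b in zip(Wg, bg)]
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    i = [sig(v) for v in affine(W['Wi'], W['bi'])]            # input gate
    f = [sig(v) for v in affine(W['Wf'], W['bf'])]            # forget gate
    o = [sig(v) for v in affine(W['Wo'], W['bo'])]            # output gate
    g = [math.tanh(v) for v in affine(W['Wg'], W['bg'])]      # candidate cell
    c_new = [fv * cv + iv * gv for fv, cv, iv, gv in zip(f, c, i, g)]
    h_new = [ov * math.tanh(cv) for ov, cv in zip(o, c_new)]
    return h_new, c_new

def two_layer_decoder_step(x, state, weights):
    """One decoding step: layer 2 reads layer 1's hidden state, so the input
    feature passes through two stacked nonlinear transformations."""
    (h1, c1), (h2, c2) = state
    h1, c1 = lstm_step(x, h1, c1, weights[0])
    h2, c2 = lstm_step(h1, h2, c2, weights[1])
    return h2, [(h1, c1), (h2, c2)]

rng = random.Random(0)
weights = [make_weights(HIDDEN, HIDDEN, rng), make_weights(HIDDEN, HIDDEN, rng)]
state = [([0.0] * HIDDEN, [0.0] * HIDDEN), ([0.0] * HIDDEN, [0.0] * HIDDEN)]
x = [0.5] * HIDDEN  # stand-in for a fused image/word feature vector
out, state = two_layer_decoder_step(x, state, weights)
print(len(out))  # hidden vector from the top LSTM layer
```

In a full captioning model this step would be applied once per generated word, with `out` projected onto the vocabulary; the point of the sketch is only that the top layer's output is a composition of two gated nonlinearities rather than one.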