Thai Silk Patterns Classification with Deep Neural Networks


Nakharin Ingo
Budsarathip Phatichaikiart
Sakpod Tongleamnak
Thanaphon Tangchoopong


The art of silk weaving has been passed down through generations as part of folk wisdom, and each locality has its own distinctive silk pattern designs. Classifying silk patterns requires expertise and long familiarity with silk, so only a few experts can recognize a given pattern. This study implements a system that uses image processing technology to identify silk patterns from images. We collected silk pattern data from the Chonnabot district, Khon Kaen Province, selecting 15 silk patterns with a total of 2,156 images. We examined two convolutional neural networks (CNNs) that differed in feature extraction and in regularization via the dropout technique. In the experiments, CNN model 1 achieved an F1-score of 0.62, while CNN model 2, which added feature extraction using a pre-trained model, achieved an F1-score of 0.92, showing that the approach can help resolve confusion in silk pattern classification.
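The two models above are compared by F1-score. For readers unfamiliar with the metric, the following is a minimal sketch of how a macro-averaged F1-score can be computed for a multi-class problem such as this one; the three pattern labels are hypothetical placeholders, not the actual 15 classes used in the study.

```python
def macro_f1(y_true, y_pred, labels):
    """Macro-averaged F1: compute F1 per class, then average equally."""
    f1_scores = []
    for c in labels:
        # Count true positives, false positives, false negatives for class c.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if (precision + recall) else 0.0)
        f1_scores.append(f1)
    return sum(f1_scores) / len(f1_scores)

# Toy example with three hypothetical silk-pattern labels.
y_true = ["pattern_a", "pattern_a", "pattern_b", "pattern_c", "pattern_c", "pattern_c"]
y_pred = ["pattern_a", "pattern_b", "pattern_b", "pattern_c", "pattern_c", "pattern_a"]
score = macro_f1(y_true, y_pred, ["pattern_a", "pattern_b", "pattern_c"])
print(round(score, 3))  # prints 0.656
```

A macro average weights each of the 15 classes equally, which is a common choice when per-class image counts are imbalanced, as is typical for hand-collected pattern datasets.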

Article Details

How to Cite
Ingo, N., Phatichaikiart, B., Tongleamnak, S., & Tangchoopong, T. (2023). Thai Silk Patterns Classification with Deep Neural Networks. Journal of Applied Informatics and Technology, 5(2), 116–129.
Research Article

