
Automated vision system for fabric defect inspection using Gabor filters and PCNN

Abstract

In this study, an embedded machine vision system using Gabor filters and a Pulse Coupled Neural Network (PCNN) is developed to identify defects of warp-knitted fabrics automatically. The system consists of smart cameras and a Human Machine Interface (HMI) controller. A hybrid detection algorithm combining Gabor filters and PCNN runs on the SOC processor of the smart camera. First, Gabor filters are employed to enhance the contrast of images captured by a CMOS sensor. Second, defect areas are segmented by a PCNN with adaptive parameter setting. Third, the smart cameras notify the controller to stop the warp-knitting machine once defects are detected. Experimental results demonstrate that the hybrid method is superior to Gabor and wavelet methods in detection accuracy. Actual operation in a textile factory verifies the effectiveness of the inspection system.

Background

Defect detection is highly important to fabric quality control. Traditionally, defects are detected by human eyes. The efficiency of this manual method is low and the miss rate is high because of eye fatigue; in the best case, a quality control person cannot detect more than 60–70 % of the present defects (Çelik et al. 2014b). Hence, an automatic inspection system is necessary for the textile industry. In the literature, fabric defect detection methods have been categorized into six groups: statistical, spectral, model-based, learning, structural, and motif-based (Ngan et al. 2011). Spectral and structural methods, as well as defect classification by neural networks, are still popular topics in this field.

Spectral methods include the Fourier transform, the wavelet transform, the Gabor transform, and so on. The Fourier transform is the classic method for fabric analysis; however, in recent works it has usually been combined with other approaches (Schneider and Merh 2015; Hu et al. 2015; Mohamed et al. 2014; As et al. 2013). Schneider presented an automatic method for plain and twill fabric detection by combining Fourier analysis, template matching and fuzzy clustering (Schneider and Merh 2015); the system proved to be robust and versatile, as a 97 % detection accuracy could be achieved. An unsupervised approach for the inspection of periodic pattern fabric by applying Fourier analysis and wavelet shrinkage was proposed (Hu et al. 2015); the advantage of this method is that no reference image is needed. Wavelet transforms have attracted much attention in the fabric detection field because of their good local time–frequency characteristics (Zhu et al. 2015; Li et al. 2015; Hu et al. 2014; Wen et al. 2014). Wavelet methods perform well on defects with prominent edges, but poorly on flat defects with smooth grayscale variation. Gabor filters are suitable for emulating the biological features of human eyes and have been employed in fabric detection (Ibrahim et al. 2015; Hu 2015; Bissi et al. 2013; Jing et al. 2014). However, because Gabor filters operate at multiple scales and orientations, their computational complexity is high and real-time requirements are difficult to meet. To decrease the computational complexity, an optimal Gabor filter can be built via a genetic algorithm so that filtering is performed at only one scale and one orientation (Hu 2015; Jing et al. 2014).

In recent years, neural networks have also been utilized for fabric defect detection and classification (Çelik et al. 2014a, b; Furferi and Governi 2008). Furferi and Governi proposed an ANN-based method to detect and classify defects according to 23 parameters extracted from each acquired image (Furferi and Governi 2008); the advantage of this method is that no experimental thresholds are needed. In addition to the classic back-propagation (BP) networks, an emerging neural network, namely the Pulse Coupled Neural Network, has also been applied to identify defect areas on plain fabric (Song et al. 2008; Zhu and Hao 2013).

Many types of defects may exist in raw fabrics, such as loom fly, thin bar, broken end, etc. Furferi and Governi grouped these defects into three categories: dark and light area or point defects, dark and light localized defects, and other defects (Furferi and Governi 2008). The most common defects of warp-knitted fabrics are linear defects of vertical orientation caused by the broken end of warp yarns (Du et al. 2012), as shown in Fig. 1. These defects grow larger and larger if the warp-knitting machine is not stopped, so the goal of an online vision inspection system is to detect defects and stop the warp-knitting machine as early as possible once they appear on the fabric.

Fig. 1
figure 1

Common defects in warp-knitting fabrics

Though many researchers have focused on fabric defect detection in recent years, it is still difficult to inspect defects of warp-knitted fabrics in practice for the following reasons: (1) The quality of images captured by smart cameras in a factory is not as good as in the laboratory, because it is affected by lighting variation, machine vibration, dust, electromagnetic interference and other factors. Generally, the illumination device is essential to image acquisition quality in a machine vision system; however, to save cost and installation space, our system has no dedicated lighting units apart from the fluorescent lamps installed along the web. (2) The typical defect caused by a broken yarn is not obvious, especially with very thin yarns. All these factors make defect detection challenging in practice.

To deal with these difficulties, two issues are addressed in our study. First, Gabor filtering with specific parameters is performed on the images to enhance the image contrast. Second, a parameter-adaptive PCNN is employed to segment the images with high precision.

The remainder of this paper is structured as follows. The first section presents the system architecture. The second section proposes the fabric inspection algorithm. The third section focuses on the experimental results and discussion. The fourth section describes the actual operation. The final section concludes the research.

System architecture

The inspection system consists of smart cameras and an HMI controller. The smart cameras are powered by Power over Ethernet (PoE) and can be accessed from the HMI controller through a local area network. An alarm message is sent to the HMI once a camera detects a defect, and the HMI then triggers an AC contactor to stop the main motor of the warp-knitting machine. The system diagram is shown in Fig. 2.

Fig. 2
figure 2

Automatic inspection system diagram

In recent years, some machine vision systems have been developed to detect fabric defects automatically (Çelik et al. 2014b; Dorian et al. 2012, 2014). These systems have similar architectures, consisting of industrial cameras, a frame grabber, a lighting unit and a computer. In contrast to the existing PC-based systems, the advantages of the embedded system are obvious: small size, easy installation, low power consumption and low cost. In practice, multiple cameras are installed on the beam of the warp-knitting machine, and each smart camera covers about 900 mm of web width. A hypothetical sketch of the camera-to-HMI alarm path is given below.
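The paper does not specify how a camera notifies the HMI controller over the network; the following Python sketch only illustrates the idea of the alarm path described above. The address, port, message layout and the function name send_defect_alarm are all hypothetical assumptions.

import socket

HMI_ADDRESS = ("192.168.1.100", 9000)   # assumed HMI controller IP address and port

def send_defect_alarm(camera_id, frame_index):
    # Notify the HMI controller that a defect was detected; the HMI would then
    # trigger the AC contactor that stops the main motor of the machine.
    message = "ALARM,camera=%d,frame=%d\n" % (camera_id, frame_index)
    with socket.create_connection(HMI_ADDRESS, timeout=1.0) as sock:
        sock.sendall(message.encode("ascii"))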

The smart camera consists of a SOC processor, an FPGA-based image signal processing (ISP) module, DDR memory, FLASH memory and a GigE Ethernet interface. Figure 3 shows the hardware diagram of the smart camera.

Fig. 3
figure 3

Hardware diagram of smart cameras

  • ISP module: A low-cost CMOS image sensor with 1920 × 1080 resolution is employed, and the raw data output from the CMOS sensor are processed by an FPGA. After processing, the image data are transferred into the memory of the SOC in YUV422 format.

  • SOC processor: A TMS320DM6467 is chosen as the host processor, which has an ARM9 core and a 1 GHz DSP core. The detection algorithm runs on the DSP core, and other tasks are implemented on the ARM9 core.

  • GigE port: A Gigabit Ethernet port is included for communication between the smart cameras and the HMI controller.

  • Memory: The board carries 128 MB of DDR memory and 64 MB of FLASH memory.

Hybrid method for fabric defect inspection

The fabric inspection algorithm is the key component of the smart camera software. The algorithm consists of two phases: image enhancement and image segmentation. Image enhancement is implemented by Gabor filtering with optimal parameters, which makes the defects more obvious. In this paper, a parameter-adaptive PCNN is utilized to segment defects layer by layer.

Gabor filters

A bank of multi-scale, multi-orientation Gabor filters is well suited to characterizing the texture features of fabrics, so Gabor filters are widely used in fabric defect inspection. The real part of the 2-D Gabor function is defined as:

$$g(x,y) = \exp \left\{ { - \frac{1}{2}\left[ {\left( {\frac{{x^{\prime} }}{{\sigma_{x} }}} \right)^{2} + \left( {\frac{{y^{\prime} }}{{\sigma_{y} }}} \right)^{2} } \right]} \right\}\cos (2\pi fx^{\prime} )$$
(1)
$$\left[ {\begin{array}{*{20}c} {x^{\prime} } \\ {y^{\prime} } \\ \end{array} } \right] = \left[ {\begin{array}{*{20}c} {\cos \theta } & { - \sin \theta } \\ {\sin \theta } & {\cos \theta } \\ \end{array} } \right]\left[ {\begin{array}{*{20}c} x \\ y \\ \end{array} } \right]$$
(2)

where f is the sinusoidal wave frequency, \(\theta\) is the rotation orientation, and \(\sigma_{x}\) and \(\sigma_{y}\) are the standard deviations of the Gaussian envelope along the x-axis and y-axis, respectively.

Traditional Gabor filtering operates at multiple scales and orientations, which results in high computational complexity, so real-time requirements are difficult to meet on our system. To simplify the Gabor filter operation, we perform filtering only at a specific orientation and scale. In fact, the outputs of Gabor filters applied to warp-knitted fabrics are greatly affected by the parameter \(\theta\). Figure 4 gives a group of results corresponding to different orientations. Figure 4a is the raw fabric image; the orientations of (b), (c), (d), (e), (f) and (g) are \(0^{ \circ }\), \(30^{ \circ }\), \(60^{ \circ }\), \(90^{ \circ }\), \(120^{ \circ }\) and \(150^{ \circ }\), respectively. To demonstrate the effect of the Gabor filter, three handcrafted defects (horizontal, vertical and diagonal) were added to the raw fabric image manually. It is clearly seen that the vertical texture defect is enhanced significantly by Gabor filtering at the \(90^{ \circ }\) orientation, as shown in Fig. 4e. In contrast, the horizontal defect is enhanced by Gabor filtering at the \(0^{ \circ }\) orientation and greatly suppressed by Gabor filtering at the \(90^{ \circ }\) orientation. Since the defects of warp-knitted fabrics are usually vertical and linear, Gabor filtering is performed only at the \(90^{ \circ }\) orientation in our scheme.

Fig. 4
figure 4

Results of Gabor filtering at different orientations. a The raw fabric image, b Gabor filtering with \(0^{ \circ }\), c Gabor filtering with \(30^{ \circ }\), d Gabor filtering with \(60^{ \circ }\), e Gabor filtering with \(90^{ \circ }\), f Gabor filtering with \(120^{ \circ }\), g Gabor filtering with \(150^{ \circ }\)

We also found that the scale parameter does not greatly affect the texture features of warp-knitted fabrics, so only one scale is used in our method. In contrast to multi-scale, multi-orientation methods, our scheme decreases the computational complexity while retaining the advantages of Gabor filters. A minimal sketch of this single-orientation, single-scale filtering is given below.
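As an illustration, the following Python sketch builds the real-valued Gabor kernel of Eqs. (1) and (2) and applies it at a single orientation and scale. The kernel size, the normalization-free form and the use of SciPy's convolve2d are assumptions; the default parameter values are those reported later in the experiments, with frequency units as in the paper's own setting.

import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(sigma_x, sigma_y, theta, f, size=15):
    # Real part of the 2-D Gabor function, Eqs. (1)-(2); 'size' is an assumed kernel width
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    x_rot = x * np.cos(theta) - y * np.sin(theta)   # rotation of Eq. (2)
    y_rot = x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-0.5 * ((x_rot / sigma_x) ** 2 + (y_rot / sigma_y) ** 2))
    return envelope * np.cos(2.0 * np.pi * f * x_rot)  # Eq. (1)

def gabor_filter(image, sigma_x=1.0, sigma_y=1.7, theta=np.pi / 2, f=5.5):
    # Defaults follow the parameter values reported in the experiments;
    # 90-degree orientation enhances vertical (broken-end) defects.
    kernel = gabor_kernel(sigma_x, sigma_y, theta, f)
    return convolve2d(image, kernel, mode="same", boundary="symm")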

Pulse coupled neural network

The PCNN model, inspired by the synchronous dynamics of neuronal activity in the cat visual cortex, was developed by Johnson et al. on the basis of Eckhorn's cortical model (Eckhorn et al. 1990; Johnson and Padgett 1999). The PCNN has become one of the most promising methods in image processing (Chen et al. 2011). In this study, we use a simplified version of the PCNN (Zhu and Hao 2013) to decrease the computational complexity while retaining the advantages of the cortical model. The simplified PCNN model is shown in Fig. 5.

Fig. 5
figure 5

A simplified PCNN model

The PCNN is a single-layer 2-D network. When the PCNN is applied to image processing, each neuron corresponds to a pixel. Suppose the image to be processed is \(S_{ij}\) and n is the iteration number; the PCNN in Fig. 5 can then be described as follows:

$$F_{ij} (n) = S_{ij} (n)$$
(3)
$$L_{ij} (n) = \sum\limits_{k,l} {w_{ijkl} } Y_{i + k,j + l} (n - 1)$$
(4)
$$U_{ij} (n) = F_{ij} (n)(1 + \beta L_{ij} (n))$$
(5)
$$E_{ij} (n) = E_{ij} (n - 1) - \Delta t + V_{e} Y_{ij} (n - 1)$$
(6)
$$Y_{ij} (n) = \left\{ {\begin{array}{*{20}l} {1,} \hfill &\quad {U_{ij} (n) \ge E_{ij} (n)} \hfill \\ {0,} \hfill &\quad {otherwise} \hfill \\ \end{array} } \right.$$
(7)

where F(n) is the feeding input, W is the matrix of link weights that represents the influence of neighboring pixels, Y(n) is the binary state of each neuron, and L(n) is the linking input, a convolution of W and Y(n). The internal activity U(n) is the image pixel value modulated by the linking input, and E(n) is the dynamic threshold of the neurons. Once the internal activity U(n) exceeds E(n), a neuron fires and its state in Y(n) is set to '1'. After firing, the dynamic threshold E(n) increases suddenly, which prevents the neuron from firing again for a period.

There are four parameters in the simplified PCNN model: W, \(\beta\), \(\Delta t\) and \(V_{e}\). W is the matrix of link weights that represents the influence of neighboring pixels; it is usually defined as the inverse of the Euclidean distance between pixels. \(\beta\) is the linking strength in the linking input; a larger value of \(\beta\) means a neuron is affected more strongly by its neighboring neurons. \(\Delta t\) is the decay step, which governs the segmentation precision; a small value of \(\Delta t\) gives better segmentation precision. \(V_{e}\) is the amplitude of the dynamic threshold E(n). Parameter setting is crucial to the PCNN. Song proposed a learning method to determine optimal parameters from defect-free reference images (Song et al. 2008). Chen et al. attempted to build a relationship between the dynamic behavior of neurons and the static properties of the image, and proposed an automatic parameter setting method based on this relationship (Chen et al. 2011). Zhou proposed an automatic setting method based on the relationship between the dynamic threshold and the average of the firing area (Zhou et al. 2014). Herein, we present an adaptive setting method, described as follows:

$$V_{e} = \hbox{max} (S_{ij} )$$
(8)
$$V_{aver} = \frac{{\sum\nolimits_{i = 1}^{m} {\sum\nolimits_{j = 1}^{n} {S_{ij} } } }}{m \times n}$$
(9)
$$\Delta t = \frac{{V_{e} - V_{aver} }}{N}$$
(10)

where N is the total iteration number. In addition to these parameters, the iteration number \(N_{best}\) that gives the optimal segmentation should also be determined when using the PCNN for defect detection. Researchers have developed several criteria for choosing the optimal iteration number, such as maximum entropy, minimum cross entropy, maximum inter-class variance and minimum intra-class variance; however, these criteria did not yield satisfactory segmentation results in our case. We propose a novel criterion based on the firing time sequence, which is successfully applied to defect detection of warp-knitted fabrics. The firing time sequence T(n) is defined as the number of neurons fired at each iteration. The best iteration is the one just before T(n) increases suddenly, and the criterion is defined as:

$$N_{best} = n,\quad T(n + 1) - T(n) \ge 2T(n)$$
(11)

The defects can be segmented accurately from the fabric images by PCNN iterations. After the iterations we obtain N frames of intermediate binary images, and the \(N_{best}\)-th frame is chosen as the final segmentation result. Supposing S is the input image, the iteration procedure is described as follows (a Python sketch of this procedure is given after the pseudocode):

Initialize Y(0), E(0), W, β, Δt and V_e

for i = 1 to N do

   F(i) ← S

   L(i) ← W * Y(i − 1)

   U(i) ← F(i)(1 + βL(i))

   Y(i) ← U(i) ≥ E(i − 1)

   E(i) ← E(i − 1) − Δt + V_e Y(i)

   T(i) ← sum(Y(i))

end for

Determine N_best according to formula (11)
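The following Python sketch mirrors the iteration procedure above, together with the adaptive parameters of Eqs. (8)–(10) and the stopping criterion of Eq. (11). The initial threshold E(0) = V_e, the guard against an all-zero firing count, and the function name pcnn_segment are assumptions, not taken from the paper.

import numpy as np
from scipy.signal import convolve2d

def pcnn_segment(S, W=None, beta=0.1, N=20):
    # S: 2-D float image (e.g. the Gabor-filtered image)
    if W is None:
        W = np.array([[0.5, 1.0, 0.5],     # inverse-Euclidean-distance weights
                      [1.0, 0.0, 1.0],     # in a 3 x 3 neighbourhood
                      [0.5, 1.0, 0.5]])
    V_e = S.max()                          # Eq. (8)
    delta_t = (V_e - S.mean()) / N         # Eqs. (9)-(10)

    Y = np.zeros_like(S)
    E = np.full_like(S, V_e)               # assumed initial threshold E(0) = V_e
    frames, T = [], []

    for _ in range(N):
        F = S                                      # Eq. (3)
        L = convolve2d(Y, W, mode="same")          # Eq. (4)
        U = F * (1.0 + beta * L)                   # Eq. (5)
        Y = (U >= E).astype(float)                 # Eq. (7), using the previous threshold
        E = E - delta_t + V_e * Y                  # Eq. (6), as in the pseudocode above
        frames.append(Y.copy())
        T.append(int(Y.sum()))                     # firing time sequence T(n)

    # Eq. (11): choose the iteration just before T(n) jumps suddenly
    n_best = N - 1                                 # fall back to the last frame
    for n in range(N - 1):
        if T[n] > 0 and T[n + 1] - T[n] >= 2 * T[n]:
            n_best = n
            break
    return frames[n_best], n_best + 1, T           # mask, 1-based N_best, T(n)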

Hybrid inspection method

In this section, a hybrid detection method combining Gabor filters and PCNN is presented. The flowchart of the method is shown in Fig. 6.

Fig. 6
figure 6

Flowchart of hybrid detection algorithm

First, images captured by the smart cameras are enhanced by Gabor filtering at the \(90^{ \circ }\) orientation. Second, defect areas are segmented by the PCNN with adaptive parameter setting. Finally, the optimal segmentation is selected according to the firing time sequence, and noise is removed by morphological filtering. A sketch that strings these stages together is given below.
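A minimal sketch of this flow, reusing the gabor_filter and pcnn_segment sketches from the previous sections, might look as follows; the morphological opening and its 3 × 3 structuring element are assumptions standing in for the unspecified morphology filtering step.

import numpy as np
from scipy.ndimage import binary_opening

def detect_defects(image):
    enhanced = gabor_filter(np.asarray(image, dtype=float))    # 90-degree Gabor enhancement
    mask, n_best, _ = pcnn_segment(enhanced)                   # adaptive PCNN segmentation
    mask = binary_opening(mask.astype(bool),
                          structure=np.ones((3, 3)))           # assumed noise-removal step
    return mask, n_best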

Experimental results and discussion

To evaluate the performance of the proposed hybrid method, we compared it with Gabor and wavelet methods. The testing code was implemented in MATLAB R2012b. The testing computer was equipped with a 3.01 GHz AMD Athlon processor and 3.25 GB of memory.

The available detection area of the web on the warp-knitting machine is limited to a narrow rectangle. In the experiment, two images (labeled Image 1 and Image 2) captured by smart cameras are used as test pictures, shown in Fig. 7a. The vertical linear defects are very faint. As mentioned before, the defect area is enhanced by Gabor filtering at a specific orientation and scale. In this experiment, the parameters of the Gabor filter are set as follows: \(\sigma_{x} = 1.0,\sigma_{y} = 1.7\), \(\theta = \pi /2,f = 5.5\). The Gabor filtering results are shown in Fig. 7b, from which we can see that the defects are clearly enhanced.

Fig. 7
figure 7

The effect of Gabor filtering. a Test images, b results of Gabor filtering

Next, the processed image is segmented by the PCNN. \(V_{e}\) and \(\Delta t\) are determined according to (8)–(10). The total iteration number N = 20, \(W = \left[ {\begin{array}{*{20}c} {0.5} & 1 & {0.5} \\ 1 & 0 & 1 \\ {0.5} & 1 & {0.5} \\ \end{array} } \right]\), and \(\beta = 0.1\) (a usage sketch with these settings is given below). The image in Fig. 7b is segmented layer by layer, so we obtain 20 frames of binary images. Figure 8 shows part of the intermediate segmentation results of Image 1.
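For reference, a usage sketch with these experimental settings and the earlier gabor_filter and pcnn_segment sketches might look as follows; image loading is omitted, and `image` is assumed to be a 2-D grayscale array already read from the camera.

import numpy as np

W = np.array([[0.5, 1.0, 0.5],
              [1.0, 0.0, 1.0],
              [0.5, 1.0, 0.5]])
enhanced = gabor_filter(image, sigma_x=1.0, sigma_y=1.7, theta=np.pi / 2, f=5.5)
mask, n_best, T = pcnn_segment(enhanced, W=W, beta=0.1, N=20)
print("best iteration:", n_best, "fired neurons per iteration:", T)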

Fig. 8
figure 8

Intermediate segmentations of Image 1 by PCNN. a Result of the 10-th iteration, b result of the 11-th iteration, c result of the 12-th iteration, d result of the 13-th iteration

How do we choose the best result among the 20 iterations? Herein we use the criterion based on the firing time sequence of the PCNN. The firing time sequence is the number of neurons fired at each iteration, which carries the temporal information of the image segmentation. The firing time sequence of Image 1 is shown in Fig. 9a, from which the best iteration number of Image 1 is determined as 11 according to formula (11). Figure 8 shows the segmentation results of the 10th, 11th, 12th and 13th iterations, respectively. The 10th segmentation is incomplete, while the 12th and 13th results contain too much noise, so the 11th segmentation is the best result. The firing time sequence of Image 2 is shown in Fig. 9b, and its best iteration number is determined as 17; the intermediate binary results of Image 2 are omitted to save space.

Fig. 9
figure 9

Firing time sequences. a Results of Image 1, b results of Image 2

Figure 10 compares the wavelet method, the Gabor method and the proposed hybrid method. DB4 wavelets are employed to perform a 2-level decomposition of the fabric images. Because the broken-end defects are vertical and linear, the vertical sub-image at decomposition level 2 is used for defect detection. Figure 10c shows the binary thresholding and morphological filtering results of the wavelet sub-image. To demonstrate the significance of the proposed hybrid method, a Gabor-only method is also used for comparison; Fig. 10d shows the binary thresholding and morphological filtering results of the Gabor-filtered image. Figure 10e shows the binary images segmented by the proposed hybrid method. Compared with the ground truths in Fig. 10b, the detection results of the hybrid method are more accurate than those of the other methods.

Fig. 10
figure 10

Detection results comparisons of wavelet (wavelet + binary thresholding + morphology filtering), Gabor (Gabor + binary thresholding + morphology filtering) and the hybrid (Gabor + PCNN + morphology filtering) methods. a The original images, b the ground-truths labeled by manual, c segmentation results of wavelet method, d segmentation results of Gabor method, e segmentation results of hybrid method

Further, a group of metrics, including accuracy (ACC), true positive rate (TPR) and positive predictive value (PPV), is employed to quantify the detection accuracy (Li et al. 2016). The definitions of ACC, TPR and PPV are given in (12)–(14):

$$ACC = (TP + TN)/(TP + TN + FP + FN)$$
(12)
$$TPR = TP/(TP + FN)$$
(13)
$$PPV = TP/(TP + FP)$$
(14)

where true positive (TP), true negative (TN), false positive (FP) and false negative (FN) are defined as labeled in Fig. 11 (a sketch of these metrics on binary masks is given after Fig. 11).

Fig. 11
figure 11

Definitions of TN, FN, TP and FP
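As an illustration, the metrics of Eqs. (12)–(14) can be computed from a binary detection mask and the manually labeled ground truth as in the following sketch; treating TP, TN, FP and FN as pixel counts is an assumption based on Fig. 11.

import numpy as np

def detection_metrics(pred, truth):
    # pred, truth: boolean arrays of the same shape (detection mask, ground truth)
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    tp = int(np.sum(pred & truth))
    tn = int(np.sum(~pred & ~truth))
    fp = int(np.sum(pred & ~truth))
    fn = int(np.sum(~pred & truth))
    acc = (tp + tn) / (tp + tn + fp + fn)   # Eq. (12)
    tpr = tp / (tp + fn)                    # Eq. (13), assumes the ground truth has positives
    ppv = tp / (tp + fp)                    # Eq. (14), assumes the mask has positives
    return acc, tpr, ppv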

The ACC, TPR and PPV metrics of the three methods are listed in Table 1, with the best results marked in italics. The hybrid method is significantly better than the wavelet and Gabor methods, achieving the highest score on every test item.

Table 1 Detection accuracy comparison of three methods

Actual operations

The proposed hybrid method is suitable for an embedded system because of its low computational cost; there are no complex operations in the simplified PCNN model. We have developed a prototype system installed on a 210-inch warp-knitting machine. The system consists of six smart cameras and an HMI controller. The processing speed is about 5 frames per second, which meets the real-time detection demands of the warp-knitting machine. Actual operation showed the system to be effective, with a detection accuracy of 98.6 %.

Conclusions

This paper presents an automatic fabric inspection system that consists of smart cameras and an HMI controller. The system can be applied to defect inspection on warp-knitting machines. The key part of the system is the hybrid inspection algorithm combining Gabor filters and a PCNN with adaptive parameter setting. The performance of the system was successfully verified on a warp-knitting machine. Future work will investigate the effectiveness of the proposed method for defect inspection of more complex fabrics.

References

  • As M, Drean JY, Bigue L, Osselin JF (2013) Optimization of automated online fabric inspection by fast Fourier transform (FFT) and cross-correlation. Text Res J 83(3):256–268


  • Bissi L, Baruffa G, Placidi P, Ricci E, Scorzoni A, Valigi P (2013) Automated defect detection in uniform and structured fabrics using Gabor filters and PCA. J Vis Commun Image R 24(7):838–845


  • Çelik HI, Canan DL, Topalbekiroglu M (2014a) Fabric defect detection using linear filtering and morphological operations. Indian J Fibre Text Res 39(3):254–259


  • Çelik Hİ, Dülger LC, Topalbekiroğlu M (2014b) Development of a machine vision system: real-time fabric defect detection and classification with neural networks. J Textile Inst 105(6):575–585


  • Chen YL, Park SK, Ma YD, Ala R (2011) A new automatic parameter setting method of a simplified PCNN for image segmentation. IEEE Trans Neural Netw 22(6):880–892


  • Dorian S, Timm H, Florian N, Achim H, Til A, Thomas G (2012) A vision based system for high precision online fabric defect detection. In: Proceedings of the 2012 7th IEEE conference on industrial electronics and applications, pp 1494–1499

  • Dorian S, Timm H, Dorit M (2014) A traverse inspection system for high precision visual on-loom fabric defect detection. Mach Vis Appl 25(6):1585–1599


  • Du B, Bai R, Li Y, Chen W (2012) On-line vision-based fabric inspection algorithm. J Jiangnan Univ 11(1):19–22 (In Chinese)


  • Eckhorn R, Reitboeck HJ, Arndt M, Dicke PW (1990) Feature linking via synchronization among distributed assemblies: simulations of results from cat visual cortex. Neural Comput 2(3):293–307


  • Furferi R, Governi L (2008) Machine vision tool for real-time detection of defects on textile raw fabrics. J Text Inst 99(1):57–66


  • Hu GH (2015) Automated defect detection in textured surfaces using optimal elliptical Gabor filters. Optik 126(14):1331–1340


  • Hu GH, Zhang GH, Wang QH (2014) Automated defect detection in textured materials using wavelet-domain hidden Markov models. Opt Eng 53(9):093107


  • Hu G, Zhang Q, Zhang G (2015) Unsupervised defect detection in textiles based on Fourier analysis and wavelet shrinkage. Appl Opt 54(10):2963–2980


  • Ibrahim CH, Topalbekiroglu M, Canan DL (2015) Real-time denim fabric inspection using image analysis. Fibres Text East Europe 23(3):85–90


  • Jing JF, Yang PP, Li PF, Kang XJ (2014) Supervised defect detection on textile fabrics via optimal Gabor filter. J Ind Text 44:40–57


  • Johnson JL, Padgett M (1999) PCNN models and applications. IEEE Trans Neural Netw 10(3):480–498


  • Li PF, Zhang HH, Jing JF, Li RZ, Zhao J (2015) Fabric defect detection based on multi-scale wavelet transform and Gaussian mixture model method. J Text Inst 106(6):587–592


  • Li Y, Zhao W, Pan J (2016) Deformable patterned fabric defect detection with Fisher criterion based deep learning. IEEE Trans Autom Sci Eng. doi:10.1109/TASE.2016.2520955


  • Mohamed E, Mounir H, Khadijah Q, Ebraheem S (2014) Application of principal component analysis to boost the performance of an automated fabric fault detector and classifier. Fibres Text East Europe 22(4):51–57


  • Ngan HYT, Pang GKH, Yung NHC (2011) Automated fabric defect detection—a review. Image Vis Comput 29(7):442–458


  • Schneider D, Merh D (2015) Blind weave detection for woven fabrics. Pattern Anal Appl 18(3):725–737


  • Song YM, Yuan DL, Lu YF, Qiao GH (2008) Automated detection of fabric defects based on optimum PCNN model. Chin J Sci Instrum 29(4):888–891


  • Wen ZJ, Cao JJ, Liu XP, Ying SH (2014) Fabric defects detection using adaptive wavelets. Int J Cloth Sci Technol 26(3):202–211


  • Zhou DG, Gao C, Guo YC (2014) Adaptive simplified PCNN parameter setting for image segmentation. Acta Autom Sin 40(6):1191–1197


  • Zhu SW, Hao CH (2013) An approach for fabric defect image segmentation based on the improved conventional PCNN model. Acta Electron Sin 40(3):611–616


  • Zhu B, Liu J, Pan R, Gao W, Liu JL (2015) Seam detection of inhomogeneously textured fabrics based on wavelet transform. Text Res J 85(13):1381–1393



Authors’ contributions

YL drafted the manuscript. CZ participated in the design of the study and performed the experimental analysis. Both authors read and approved the final manuscript.

Acknowledgements

Yundong Li was supported by the Beijing Education Committee Science and Technology Project.

Competing interests

The authors declare that they have no competing interests.

Author information


Corresponding author

Correspondence to Yundong Li.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.



Cite this article

Li, Y., Zhang, C. Automated vision system for fabric defect inspection using Gabor filters and PCNN. SpringerPlus 5, 765 (2016). https://doi.org/10.1186/s40064-016-2452-6
