UC-Merced Image Classification with CNN Feature Reduction Using Wavelet Entropy Optimized with Genetic Algorithm


Fatih Özyurt, Engin Avcı, Eser Sert

Department of Software Engineering, Firat University, Elazig 23100, Turkey

Department of Computer Engineering, Malatya Turgut Ozal University, Malatya 44210, Turkey

Corresponding Author Email: eser.sert@ozal.edu.tr

Pages: 347-353 | DOI: https://doi.org/10.18280/ts.370301

Received: 2 February 2020 | Accepted: 26 April 2020 | Published: 30 June 2020

Abstract: 

The classification of high-resolution, remotely sensed terrain images with high accuracy is one of the greatest challenges in machine learning. In the present study, a novel CNN feature reduction using Wavelet Entropy Optimized with Genetic Algorithm (GA-WEE-CNN) method was used for remote sensing image classification. The optimal wavelet family and the optimal values of the parameters of the Wavelet Sure Entropy (WSE), Wavelet Norm Entropy (WNE), and Wavelet Threshold Entropy (WTE) were calculated and given to classifiers such as K-Nearest Neighbors (KNN) and Support Vector Machine (SVM). The efficiency of the proposed hybrid method was tested on the UC-Merced dataset. 80% of the data were used for training, and a performance rate of 98.8% was achieved with the SVM classifier, the highest rate reported on this dataset to date, obtained with only 18 features. These results prove the advantage of the proposed method.

Keywords: 

CNN, feature reduction, entropy, genetic algorithm, UC Merced dataset

1. Introduction

Thanks to rapid technological developments over the last 30 years, innovations in computer technology have led to the emergence of new fields for many disciplines as well as changes in various application fields. Geographic Information Systems (GIS) is one of the technologies that have emerged as a result of this development, and it has been used by many disciplines. Remote sensing data has recently come to the forefront as an important data source for GIS. In particular, commercialization has gradually made high-resolution satellite data a cost-efficient and up-to-date data source for geographical information systems [1, 2]. The integration of remote sensing and GIS technologies into solving location-based real-world problems and timely decision-making offers significant advantages both for the analysis of satellite images and for large-scale applications that require the analysis of geographic and spatial data.

Artificial neural networks form the basis of deep learning and offer architectures that enable better modeling. Since hardware constraints did not allow intensive matrix operations during the 1980s, deep learning could not be put into practice. However, in the late 1980s, Hinton and LeCun proposed the backpropagation algorithm [3]. In the existing literature, many studies were carried out using deep learning techniques, which yielded successful results [4-7]. Convolutional Neural Network (CNN), a specialized architecture of deep learning, is particularly successful in image processing.

Discrete Wavelet Transform (DWT) displays successful performances in different fields such as classification, feature extraction, feature reduction, and multi-resolution analysis of signals and images [8]. It was observed in the literature review that DWT had been used in various fields such as image steganography [9], brain MR (Magnetic Resonance) image classification [10], digital modulation recognition [11], expert target recognition system [12], texture classification [13], and noise removal and feature extraction [14].

Genetic Algorithm (GA) is a bio-inspired optimization technique based on Darwin’s theory of natural evolution [15]. The successful results obtained with bio-inspired techniques pioneered the design of further innovative optimization methods. It was observed in the literature review that GA had been employed in various fields such as classification [16-19], expert target recognition system [12], feature extraction [20], feature selection [21, 22], face recognition [23], and segmentation [24].

The most important step of classification studies is the feature extraction stage, because the features extracted from an image represent the image itself. In the present study, the AlexNet CNN architecture, one of the most widely studied deep learning architectures, was used as a feature extractor. The proposed method was used to develop a system capable of automatic classification by reducing the number of features.

In the present study, 4096 features belonging to each of 2100 remotely sensed aerial images in 21 classes were derived from the AlexNet architecture. These 4096 features were reduced to 18 features by calculating Wavelet Sure Entropy (WSE), Wavelet Norm Entropy (WNE), Wavelet Threshold Entropy (WTE), Shannon entropy, log energy entropy, and energy values over the 1 approximation and 4 detail coefficients of a 4-level DWT. The reduced features were then classified with machine learning algorithms, i.e. SVM and K-Nearest Neighbors (KNN). Thus, the recognition accuracy of the classification was increased, while the training time of the classification was decreased. The Genetic Algorithm (GA) optimization method was used to find the optimal wavelet family and the optimal Ep, p, and t parameter values of the WSE, WNE, and WTE used in this feature reduction.

Materials and methods are presented in Section 2. The application of the proposed GA-WEE-CNN method is given in Section 3. Experimental results and other state-of-the-art studies are described in detail in Section 4. The study is concluded with a summary of our method in Section 5.

2. Materials and Methods

An overview of the UC Merced dataset is given in Section 2.1. The proposed method consists of three components, namely transfer learning with the pre-trained AlexNet CNN architecture, DWT, and GA, which are described below.

2.1. Data sets

UC Merced Land Use [25] is a high-resolution dataset of aerial images (256×256 pixels) in 21 classes, with 100 images per class. It is widely used in various remote sensing applications. Some examples of this dataset are shown in Figure 1.

2.2. Transfer learning with pre-trained AlexNet

Since the ImageNet competition in 2012, the AlexNet CNN architecture has been used frequently in image processing applications. It was used in image classification, image recognition, and object tracking problems, and displayed high performance rates [26, 27]. A CNN typically has a layered structure consisting of an input layer, convolution, pooling, ReLU, and dropout layers, and fully connected layers. The final layer is the classification layer [28, 29].

Transfer learning is a research problem in machine learning which focuses on storing knowledge obtained from the solution of one problem and applying it to a different problem [30]. Given the huge resources needed to train deep learning models on big and challenging datasets, transfer learning is one of the most trending topics in deep learning. It works in deep learning only if the features learned on the initial task are general. In transfer learning, a base network is first trained on a base dataset and task, and the learned features are then redesigned or transferred to a second destination network trained on a destination dataset and task. Some examples of transfer learning with the AlexNet architecture are shown in Figure 2.

2.3. Discrete wavelet transform

DWT is a powerful technique to analyze non-stationary signals such as Phonocardiogram (PCG) signals [31]. The main advantage of the wavelet transform is its variable window size, which is adjusted to be wide for low frequencies and narrow for high frequencies. DWT allows all nodes in the tree structure to be split further at each decomposition level. A sample of the approximation and detail coefficients of a DWT signal is given in Figure 3. Here, a4 represents the 4-level DWT approximation coefficients, and d1, d2, d3, and d4 represent the 4-level detail coefficients of the DWT, respectively.
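The decomposition in Figure 3 can be illustrated with a minimal sketch. The study searches over 16 wavelet families (db4 proved optimal), but for a self-contained example the orthonormal Haar filters are used here, since one Haar analysis level reduces to pairwise scaled sums and differences; the function names are illustrative, not the authors' implementation.

```python
import numpy as np

def haar_step(x):
    """One analysis level of the orthonormal Haar DWT:
    pairwise scaled sums (low-pass) and differences (high-pass)."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # detail coefficients
    return a, d

def dwt4(x):
    """4-level DWT: returns the final approximation a4 and the
    detail coefficients d1..d4 (finest first), as in Figure 3."""
    details = []
    a = np.asarray(x, dtype=float)
    for _ in range(4):
        a, d = haar_step(a)
        details.append(d)
    return a, details

# A 4096-dimensional feature vector (as produced by the fc7 layer)
# splits into a4 (256) and d1 (2048), d2 (1024), d3 (512), d4 (256).
x = np.random.randn(4096)
a4, (d1, d2, d3, d4) = dwt4(x)
```

Because the Haar filters are orthonormal, the total energy of the coefficients equals that of the input signal, which is a handy sanity check for any DWT implementation.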

4096 features for each remote sensed land image were obtained from the AlexNet CNN architecture. Wavelet Sure Entropy (WSE), Wavelet Norm Entropy (WNE), Wavelet Threshold Entropy (WTE), Shannon Entropy, Log Energy Entropy, and Energy were then computed over the 4-level DWT of these features. In total, 18 features for the 1 approximation and 4 detail coefficients were obtained as follows:

1. Shannon Entropy,

2. Log Energy Entropy,

3. Energy,

4. Wavelet Threshold Entropy of the 1 approximation and 4 detail coefficients of the 4-level DWT,

5. Wavelet Norm Entropy of the 1 approximation and 4 detail coefficients of the 4-level DWT,

6. Wavelet Sure Entropy of the 1 approximation and 4 detail coefficients of the 4-level DWT.

Here, the Ep parameter of the WSE varies between 1 and 8 in steps of 0.5, the p parameter of the WNE varies between 1.0 and 1.9 in steps of 0.06, and the t parameter of the WTE varies between 0.1 and 0.9 in steps of 0.05. Therefore, the 2100×4096 feature matrix from the fc7 fully connected layer of the CNN architecture was reduced to 2100×18 using the 4-level DWT.
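The six measures above can be written down directly. The formulas below follow the common wentropy-style definitions (as in MATLAB's Wavelet Toolbox); the paper does not fully specify whether Shannon entropy, log energy entropy, and energy are computed per band or globally, so this sketch assumes they are computed once over the concatenated coefficients, giving 3 + 3×5 = 18 features.

```python
import numpy as np

def shannon_entropy(s):
    s2 = s[s != 0] ** 2            # convention: 0 * log(0) = 0
    return float(-np.sum(s2 * np.log(s2)))

def log_energy_entropy(s):
    s2 = s[s != 0] ** 2            # convention: log(0) = 0
    return float(np.sum(np.log(s2)))

def energy(s):
    return float(np.sum(s ** 2))

def threshold_entropy(s, t):
    # Number of coefficients whose magnitude exceeds the threshold t.
    return float(np.sum(np.abs(s) > t))

def norm_entropy(s, p):
    return float(np.sum(np.abs(s) ** p))

def sure_entropy(s, Ep):
    # SURE-style entropy: n - #{|s_i| <= Ep} + sum(min(s_i^2, Ep^2)).
    return float(len(s) - np.sum(np.abs(s) <= Ep)
                 + np.sum(np.minimum(s ** 2, Ep ** 2)))

def feature_vector(bands, Ep, p, t):
    """18 features from the 5 bands [a4, d1, d2, d3, d4]:
    WTE, WNE, WSE per band (15) plus Shannon, log energy,
    and energy over the concatenated coefficients (3)."""
    feats = []
    for s in bands:
        feats += [threshold_entropy(s, t), norm_entropy(s, p), sure_entropy(s, Ep)]
    allc = np.concatenate(bands)
    feats += [shannon_entropy(allc), log_energy_entropy(allc), energy(allc)]
    return np.array(feats)

# Illustrative band sizes matching a 4-level DWT of a 4096-vector.
bands = [np.random.randn(n) for n in (256, 2048, 1024, 512, 256)]
feats = feature_vector(bands, Ep=5.0, p=1.12, t=0.4)
```

The parameter values used in the call (Ep = 5, p = 1.12, t = 0.4) are the optimal ones reported later in Table 4.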

2.4. Genetic algorithm

Introduced by John Holland [32], genetic algorithms (GA) are evolutionary algorithms that optimize functions by modeling biological processes. GA parameters represent the genes in biology, while the aggregate set of parameters constitutes the chromosome. A GA population consists of individuals represented as chromosomes. The fitness of the population is maximized or minimized within certain rules. Each new generation is obtained by combining individuals that survive through random information exchange.

Its basic operations are population, reproduction, discard, modification, and evaluation [19, 33, 34]. The structure of the Genetic Algorithm contains these genetic operations, as shown in Figure 4.

In Genetic Algorithm, realized process stages can be listed as follows [19, 33, 34]:

Stage 1: In this stage, a random population of n individuals, representing candidate solutions to the problem, is created.

Stage 2: In this stage, the fitness function f(x) is calculated for each chromosome in the randomly generated population.

Stage 3: In this stage, crossover operators are applied. To this aim, parent chromosomes with the highest fitness values are selected from the population. Crossover operators are then applied to these parental chromosomes in order to form new chromosomes. If the crossover operator is not applied, the offspring are complete copies of the parents.

Stage 4: In this stage, mutation operations are realized by altering some genes on the chromosome according to the specified mutation rate.

Stage 5: In this stage, the new population is obtained by performing the crossover and mutation operations, respectively.

Stage 6: In this stage, GA is stopped if the termination conditions are satisfied. In that case, the current population yields the optimum result.

Stage 7: Otherwise, the new population replaces the previous one and the process returns to Stage 2.
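The stages above can be sketched as a minimal genetic algorithm. The sketch is illustrative, not the authors' implementation: it evolves 16-bit chromosomes (the layout used in Section 3) against a stand-in one-max fitness, whereas the paper's actual fitness is the classification accuracy obtained with the decoded wavelet and entropy parameters.

```python
import random

BITS = 16

def fitness(chrom):
    # Stand-in fitness (count of 1-bits); in the paper this would be
    # the classifier accuracy for the parameters the chromosome encodes.
    return sum(chrom)

def select(pop):
    # Tournament selection: the fitter of two random individuals survives.
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2, rate=0.8):
    # Single-point crossover; without it the child copies a parent.
    if random.random() < rate:
        cut = random.randint(1, BITS - 1)
        return p1[:cut] + p2[cut:]
    return p1[:]

def mutate(chrom, rate=0.05):
    # Flip each gene with a small probability (Stage 4).
    return [b ^ 1 if random.random() < rate else b for b in chrom]

def run_ga(pop_size=60, generations=40, seed=0):
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(pop_size)]
    for _ in range(generations):
        # Stages 2-5: evaluate, select, cross over, mutate, replace.
        pop = [mutate(crossover(select(pop), select(pop))) for _ in range(pop_size)]
    return max(pop, key=fitness)

best = run_ga()
```

The population size of 60 mirrors the initial population reported later in the paper; the crossover and mutation rates are assumed values for illustration only.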

The proposed GA-WEE-CNN method is shown in Figure 5.

Figure 1. Sample images of UCM data set in 21 land-cover classes

Figure 2. Transfer learning with Alexnet architecture

Figure 3. 4 level DWT tree with corresponding high-pass and low-pass filters

Figure 4. Components of a Genetic Algorithm

Figure 5. The block diagram of proposed GA-WEE-CNN method

3. Applications of Proposed GA-WEE-CNN Method

The application of the proposed GA-WEE-CNN involves four steps as follows: 

• Step-1: In this step, the UC-Merced database used in these applications was obtained.

• Step-2: In this step, 4096 features of remote sensed land images were obtained from Alexnet CNN architecture.

• Step-3: In this step, feature reduction was performed using the optimal wavelet family and the optimal Ep, p, t parameter values of the WSE, WNE, and WTE obtained with GA, together with the Shannon entropy, log energy entropy, and energy values. These values were calculated for each of the 1 approximation and 4 detail coefficients obtained from the 4-level DWT. In this way, the 4096 features of each image were reduced to 18 effective features. Thus, the recognition accuracy of the classification was increased, while the training time of the classification was decreased.

• Step-4: In this step, classification was realized using the 18 effective features obtained in the previous step for each of the 21 categories of 2100 remote sensed land images, with KNN and SVM classifiers.
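Step-4 can be illustrated with a plain nearest-neighbour classifier. The sketch below is a minimal, self-contained stand-in: it uses randomly generated 18-dimensional vectors in place of the actual reduced UC-Merced features (two well-separated synthetic classes instead of 21), and a hand-rolled KNN instead of a library classifier.

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    """Plain k-nearest-neighbours majority vote with Euclidean distance."""
    d = np.linalg.norm(train_X - query, axis=1)   # distance to each sample
    nearest = train_y[np.argsort(d)[:k]]          # labels of k closest
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]                # most common label

# Toy stand-in for the 2100x18 reduced feature matrix (2 classes here).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 18)),    # class 0 cluster
               rng.normal(5.0, 1.0, (50, 18))])   # class 1 cluster
y = np.array([0] * 50 + [1] * 50)

pred = knn_predict(X, y, np.full(18, 5.0))        # query near class 1
```

An SVM (the classifier that achieved the paper's 98.8%) would replace `knn_predict` in a full pipeline; KNN is used here only because it fits in a few self-contained lines.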

The efficiency of the hybrid GA-WEE-CNN method proposed in the present study was tested on the UC-Merced dataset. 80% of all images were used as training data, while 20% were used as testing data. Experimental results demonstrated that the proposed GA-WEE-CNN method yielded satisfactory results in terms of classification accuracy. The flow chart of the GA-WEE-CNN method for UC-Merced image classification is shown in Figure 6.

Properties of the bit capacity of the proposed method:

  • In the GA construction, a chromosome consists of a total of 16 bits.
  • The first four bits (1st, 2nd, 3rd and 4th bits) represent the 16 wavelet families: db2, db4, db5, db10, bior1.3, bior2.2, bior3.5, bior6.8, coif1, coif2, coif3, coif5, sym2, sym3, sym5, and sym8.
  • The second four bits (5th, 6th, 7th and 8th bits) represent the 16 possible Ep parameter values of the WSE.
  • The third four bits (9th, 10th, 11th and 12th bits) represent the 16 possible p parameter values of the WNE.
  • The fourth four bits (13th, 14th, 15th and 16th bits) represent the 16 possible t parameter values of the WTE.

Chromosomes for the first population are randomly selected. GA-WEE-CNN aims to achieve the highest performance for the classifier. The GA-WEE-CNN system used the 21 categories of 2100 remote sensed land UC-Merced images for training and testing. 80% of the images were used as training data, while 20% were used as testing data; this split was selected because it is the one generally used in previous studies on this dataset. The obtained results indicate that the proposed GA-WEE-CNN method yielded satisfactory results in terms of classification accuracy.

The optimal wavelet family and the optimal values of the parameters of the WSE, WNE, and WTE were calculated using the GA-WEE-CNN method, which suggested the optimum values. As a result, the GA-WEE-CNN method proves to be a very powerful and fast system for the classification of the 21 categories of 2100 remote sensed land UC-Merced images. In the present study, a three-fold cross-validation scheme was implemented using GA, and the mean values were calculated to determine the performance of the GA-WEE-CNN method.

60 chromosomes were randomly selected as the initial population. Each chromosome consists of a total of 16 bits. The coding values of the Ep, p, t parameters of the GA and the coding of the wavelet families are given in Table 1 and Table 2, respectively.
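Under the bit layout above, decoding a chromosome amounts to reading four 4-bit fields. The sketch below is an illustrative reconstruction using the uniform step sizes stated in the text (Ep from 1 in steps of 0.5, p from 1 in steps of 0.06, t from 0.1 in steps of 0.05); note that the last entries of Table 1 (Ep = 8 and t = 0.9 for code 1 1 1 1) deviate slightly from these uniform steps.

```python
WAVELETS = ["db2", "db4", "db5", "db10", "bior1.3", "bior2.2", "bior3.5",
            "bior6.8", "coif1", "coif2", "coif3", "coif5", "sym2", "sym3",
            "sym5", "sym8"]

def bits_to_int(bits):
    """Interpret a list of bits (most significant first) as an integer."""
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value

def decode(chrom):
    """Decode a 16-bit chromosome into (wavelet, Ep, p, t)
    per the field layout and step sizes described in the text."""
    w = WAVELETS[bits_to_int(chrom[0:4])]        # bits 1-4: wavelet family
    Ep = 1.0 + 0.5 * bits_to_int(chrom[4:8])     # bits 5-8: WSE parameter
    p = 1.0 + 0.06 * bits_to_int(chrom[8:12])    # bits 9-12: WNE parameter
    t = 0.1 + 0.05 * bits_to_int(chrom[12:16])   # bits 13-16: WTE parameter
    return w, Ep, round(p, 2), round(t, 2)
```

For example, the chromosome encoding the best solution reported later (db4, Ep = 5, p = 1.12, t = 0.4) is 0001 1000 0010 0110 under this layout.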

Figure 6. Block diagram of GA-WEE-CNN method for UC-merced images classification

Table 1. Coding for parameters of GA

| Values of Ep parameter | Values of p parameter | Values of t parameter | Coding |
|------------------------|-----------------------|-----------------------|---------|
| 1   | 1    | 0.1  | 0 0 0 0 |
| 1.5 | 1.06 | 0.15 | 0 0 0 1 |
| 2   | 1.12 | 0.2  | 0 0 1 0 |
| 2.5 | 1.18 | 0.25 | 0 0 1 1 |
| 3   | 1.24 | 0.3  | 0 1 0 0 |
| 3.5 | 1.3  | 0.35 | 0 1 0 1 |
| 4   | 1.36 | 0.4  | 0 1 1 0 |
| 4.5 | 1.42 | 0.45 | 0 1 1 1 |
| 5   | 1.48 | 0.5  | 1 0 0 0 |
| 5.5 | 1.54 | 0.55 | 1 0 0 1 |
| 6   | 1.6  | 0.6  | 1 0 1 0 |
| 6.5 | 1.66 | 0.65 | 1 0 1 1 |
| 7   | 1.72 | 0.7  | 1 1 0 0 |
| 7.5 | 1.78 | 0.75 | 1 1 0 1 |
| 8   | 1.84 | 0.8  | 1 1 1 0 |
| 8   | 1.9  | 0.9  | 1 1 1 1 |
Table 2. Coding of wavelet families

| Values of wavelet families | Coding |
|----------------------------|---------|
| 1  | 0 0 0 0 |
| 2  | 0 0 0 1 |
| 3  | 0 0 1 0 |
| 4  | 0 0 1 1 |
| 5  | 0 1 0 0 |
| 6  | 0 1 0 1 |
| 7  | 0 1 1 0 |
| 8  | 0 1 1 1 |
| 9  | 1 0 0 0 |
| 10 | 1 0 0 1 |
| 11 | 1 0 1 0 |
| 12 | 1 0 1 1 |
| 13 | 1 1 0 0 |
| 14 | 1 1 0 1 |
| 15 | 1 1 1 0 |
| 16 | 1 1 1 1 |

In the present study, the performance of GA-WEE-CNN method was calculated using classification accuracy. The classification accuracy rates for the data sets were calculated using Eqns. (1-2).

$\operatorname{accuracy}(C)=\frac{\sum_{k=1}^{|C|} \operatorname{assess}\left(c_{k}\right)}{|C|}, \quad c_{k} \in C$          (1)

$\operatorname{assess}(c)=\left\{\begin{array}{ll}1, & \text { if } \operatorname{classify}(c)=c . m \\ 0, & \text { otherwise }\end{array}\right.$          (2)

where, C is the set of classified samples, c.m is the true class of sample c, and classify(c) gives the classification of GA-WEE-CNN.
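Eqns. (1)-(2) amount to counting correct predictions; a direct transcription:

```python
def classification_accuracy(predictions, labels):
    """Eqns. (1)-(2): assess(c) is 1 when the predicted class matches
    the true class c.m; accuracy is the mean of assess over C."""
    assess = [1 if pred == true else 0 for pred, true in zip(predictions, labels)]
    return sum(assess) / len(assess)
```

For instance, three correct predictions out of four give an accuracy of 0.75.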

4. Results

The performance of the proposed GA-WEE-CNN is assessed using classification accuracy. The performances of classical KNN and SVM classifiers with the optimal wavelet families and optimal parameters of the WSE, WNE, and WTE were compared to evaluate the performance of the GA-WEE-CNN method. The best classification accuracy of the proposed GA-WEE-CNN was found to be 98.8%, with the corresponding values of the Ep, p, t parameters and the wavelet family calculated as 5, 1.12, 0.4, and db4, respectively. Table 4 presents the highest four performances and parameter values of the suggested GA-WEE-CNN.

In the present study, we ran many GA iterations to find the highest classification accuracy rates. The highest rates are reported in Table 4, and in Table 3 the classification accuracy of the GA-WEE-CNN based system is compared with previous methods [35-42]. It can be seen from Table 3 that the highest performance for the classification of the UC-Merced dataset was obtained using GA-WEE-CNN, with an accuracy rate of 98.8%.

As given in Table 3, the proposed method displayed the best performance rate, which shows its effectiveness: it produced state-of-the-art performance with an accuracy rate of 98.8% using just 18 features. In a study similar to the proposed one, ResNet-TP, a two-pathway convolutional network with context aggregation [42], was proposed; however, that study used a classical CNN architecture, and although its running time was not specified, the performance rate achieved was lower than that of the method proposed in the present study.

Table 3. The comparison of GA-WEE-CNN based system with the previous methods

| Method | Accuracy % | Reference |
|--------|------------|-----------|
| CaffeNet | 95.02±0.81 | Xia et al. [35] |
| GoogLeNet | 94.31±0.89 | Xia et al. [35] |
| VGG-VD-16 | 95.21±1.22 | Xia et al. [35] |
| CNN-ELM | 95.62 | Weng et al. [36] |
| salM3LBP-CLM | 95.75±0.80 | Bian et al. [37] |
| TEX-Net-LF | 96.62±0.49 | Anwer et al. [38] |
| Fusion by addition | 97.42±1.79 | Chaib et al. [39] |
| CNN fusion | 98.02±1.03 | Yu & Liu [40] |
| ResNet50 | 98.50±1.40 | Scott [41] |
| ResNet-TP-50 | 98.56 | Zhou [42] |
| GA-WEE-CNN | 98.8 | The Proposed Method |

5. Conclusion

In the present study, GA-WEE-CNN was proposed as an intelligent system for the classification of remote sensed land UC-Merced images. The direct use of the feature vector, rapid training and test times, and the ability to generalize over traditional classification methods are the main advantages of the GA-WEE-CNN classification system. In the present study, the GA-WEE feature reduction method applied to the obtained CNN features was used for the first time in the existing literature. 4096 features were obtained from the AlexNet-based CNN for each of these remote sensed land UC-Merced images, and were later reduced to 18 features by applying WSE, WNE, WTE, Shannon Entropy, Log Energy Entropy, and Energy values. Finally, these obtained features were given to classifiers such as KNN and SVM for classification.

Table 4. The best four performance and parameter values of the suggested GA-WEE-CNN.

| Used Method | Type of the Used Classifier | Value of Ep parameter | Value of p parameter | Value of t parameter | Used Wavelet Family | Accuracy (%) |
|-------------|-----------------------------|-----------------------|----------------------|----------------------|---------------------|--------------|
| GA-WEE-CNN | SVM | 5 | 1.12 | 0.4 | db4 | 98.8 |
| GA-WEE-CNN | SVM | 4 | 1.3 | 0.6 | sym3 | 96.67 |
| GA-WEE-CNN | SVM | 2 | 1.42 | 0.6 | db2 | 93.58 |
| GA-WEE-CNN | KNN | 6 | 1.48 | 0.2 | coif2 | 92.84 |

The performance of the suggested GA-WEE-CNN was assessed using classification accuracy. The performances of classical KNN and SVM classifiers with optimal wavelet families and optimal parameters of the WSE, WNE, and WTE were compared to evaluate the performance of the GA-WEE-CNN method. As shown in Table 4, the best classification accuracy of the suggested GA-WEE-CNN was found to be 98.8%. These results demonstrate the advantage of the proposed GA-WEE-CNN.

References

[1] Özyurt, F. (2019). Efficient deep feature selection for remote sensing image recognition with fused deep learning architectures. The Journal of Supercomputing, 1-19. https://doi.org/10.1007/s11227-019-03106-y

[2] Chen, Y., Zhu, L., Ghamisi, P., Jia, X., Li, G., Tang, L. (2017). Hyperspectral images classification with Gabor filtering and convolutional neural network. IEEE Geoscience and Remote Sensing Letters, 14(12): 2355-2359. https://doi.org/10.1109/LGRS.2017.2764915

[3] Lecun, Y. (1988). A theoretical framework for back-propagation. In: Touretzky, D., Hinton, G., Sejnowski, T. (eds) Proceedings of the 1988 Connectionist Models Summer School, CMU. Pittsburg, PA, Morgan Kaufmann.

[4] Doğantekin, A., Özyurt, F., Avcı, E., Koç, M. (2019). A novel approach for liver image classification: PH-C-ELM. Measurement, 137: 332-338. https://doi.org/10.1016/j.measurement.2019.01.060

[5] Neelapu, R., Devi, G.L., Rao, K.S. (2018). Deep learning based conventional neural network architecture for medical image classification. Traitement du Signal, 35(2): 169-182. https://doi.org/10.3166/TS.35.169-182

[6] Özyurt, F. (2019). A fused CNN model for WBC detection with MRMR feature selection and extreme learning machine. Soft Computing, 1-10. https://doi.org/10.1007/s00500-019-04383-8

[7] Xie, J.B., Li, R.T., Lv, S.W., Wang, Y.J., Wang, Q.Y., Vorotnitsky, Y.I. (2019). Chinese alt text writing based on deep learning. Traitement du Signal, 36(2): 161-170. https://doi.org/10.18280/ts.360206

[8] Avci, E., Turkoglu, I. (2003). Modelling of tunnel diode by adaptive-network-based fuzzy inference system. Int. J. of Computational Intelligence, 1(1): 231-233.

[9] Subhedar, M.S., Mankar, V.H. (2016). Image steganography using redundant discrete wavelet transform and QR factorization. Computers & Electrical Engineering, 54: 406-422. https://doi.org/10.1016/j.compeleceng.2016.04.017

[10] Nayak, D.R., Dash, R., Majhi, B. (2016). Brain MR image classification using two-dimensional discrete wavelet transform and Ada Boost with random forests. Neurocomputing, 177: 188-197. https://doi.org/10.1016/j.neucom.2015.11.034

[11] Avci, E., Hanbay, D., Varol, A. (2007). An expert discrete wavelet adaptive network based fuzzy inference system for digital modulation recognition. Expert Systems with Applications, 33: 582-589. https://doi.org/10.1016/j.eswa.2006.06.001

[12] Avci, E. (2013). A new method for expert target recognition system: Genetic wavelet extreme learning machine (GAWELM). Expert Systems with Applications, 40: 3984-3993. https://doi.org/10.1016/j.eswa.2013.01.011

[13] Krishnan, K.G., Vanathi, P.T. (2018). An efficient texture classification algorithm using integrated Discrete Wavelet Transform and local binary pattern features. Cognitive Systems Research, 52: 267-274. https://doi.org/10.1016/j.cogsys.2018.07.015

[14] Lin, H.Y., Liang, S.Y., Ho, Y.L., Lin, Y.H., Ma, H.P. (2014). Discrete-wavelet-transform-based noise removal and feature extraction for ECG signals. IRBM, 35(6): 351-361. https://doi.org/10.1016/j.irbm.2014.10.004

[15] Hemanth, D.J., Anitha, J. (2019). Modified genetic algorithm approaches for classification of abnormal magnetic resonance brain tumour images. Applied Soft Computing Journal, 75: 21-28. https://doi.org/10.1016/j.asoc.2018.10.054

[16] Ertam, F., Avcı, E. (2017). A new approach for internet traffic classification: GA-WK-ELM. Measurement, 95: 135-142. https://doi.org/10.1016/j.measurement.2016.10.001

[17] Avci, E. (2007). A new optimum feature extraction and classification method for speaker recognition: GWPNN. Expert Systems with Applications, 32: 485-498. https://doi.org/10.1016/j.eswa.2005.12.004

[18] Diker, A., Avci, D., Avci, E., Gedikpinar, M. (2019). A new technique for ECG signal classification genetic algorithm Wavelet Kernel extreme learning machine. Optik-International Journal for Light and Electron Optics, 180: 46-55. https://doi.org/10.1016/j.ijleo.2018.11.065

[19] Singh, A., Singh, K.K. (2017). Satellite image classification using Genetic Algorithm trained radial basis function neural network, application to the detection of flooded areas. J. Vis. Commun. Image R., 42: 173-182. https://doi.org/10.1016/j.jvcir.2016.11.017 

[20] Perez-Jimenez, A.J., Perez-Cortes, J.C. (2006). Genetic algorithms for linear feature extraction. Pattern Recognition Letters, 27: 1508-1514. https://doi.org/10.1016/j.patrec.2006.02.011

[21] Dong, H., Li, T., Ding, R., Sun, J. (2018). A novel hybrid genetic algorithm with granular information for feature selection and optimization. Applied Soft Computing, 65: 33-46. https://doi.org/10.1016/j.asoc.2017.12.048

[22] Das, A.K., Das, S., Ghosh, A. (2017). Ensemble feature selection using bi-objective genetic algorithm. Knowledge-Based Systems, 123: 116-127. https://doi.org/10.1016/j.knosys.2017.02.013

[23] Zhi, H., Liu, S. (2019). Face recognition based on genetic algorithm. J. Vis. Commun. Image R., 58: 495-502. https://doi.org/10.1016/j.jvcir.2018.12.012

[24] Janc, K., Tarasiuk, J., Bonnet, A.S., Lipinski, P. (2013). Genetic algorithms as a useful tool for trabecular and cortical bone segmentation. Computer Methods and Programs Biomedicine, 11(1): 72-83. https://doi.org/10.1016/j.cmpb.2013.03.012

[25] Yang, Y., Newsam, S. (2010). Bag-of-visual-words and spatial extensions for land-use classification. In Proceedings of the 18th SIGSPATIAL International Conference on Advances in Geographic Information Systems (GIS ’10). Association for Computing Machinery, New York, NY, USA, pp. 270-279.

[26] Özyurt, F., Tuncer, T., Avci, E., Koç, M., Serhatlioğlu, İ. (2019). A novel liver image classification method using perceptual hash-based convolutional neural network. Arabian Journal for Science and Engineering, 44(4): 3173-3182. https://doi.org/10.1007/s13369-018-3454-1

[27] https://medium.com/@RaghavPrabhu/understanding-of-convolutional-neural-network-cnn-deep-learning-99760835f148, accessed on 1 Jan. 2019.

[28] https://brohrer.github.io/how_convolutional_neural_networks_work.html, accessed on 3 Jan. 2019.

[29] Schmidhuber, J. (2015). Deep learning in neural networks: An overview. Neural networks, 61: 85-117. https://doi.org/10.1016/j.neunet.2014.09.003

[30] Kermany, D.S., Goldbaum, M., Cai, W., Valentim, C.C., Liang, H., Baxter, S.L., Dong, J. (2018). Identifying medical diagnoses and treatable diseases by image-based deep learning. Cell, 172(5): 1122-1131. https://doi.org/10.1016/j.cell.2018.02.010

[31] Safara, F., Doraisamy, S., Azman, A., Jantan, A., Ranga, S. (2012). Wavelet packet entropy for heart murmurs classification. Journal of Advances in Bioinformatics, 2012: 1-6. https://doi.org/10.1155/2012/327269

[32] Holland, J.H. (1992). Genetic algorithms. Scientific American, 267(1): 44-50. https://doi.org/10.1038/scientificamerican0792-66

[33] Alpman, E. (2018). Multiobjective aerodynamic optimization of a microscale ducted wind turbine using a genetic algorithm. Turkish Journal of Electrical Engineering & Computer Sciences, 26(1): 618-629. https://doi.org/10.3906/elk-1612-307

[34] Varjovi, M.H., Altun, S., Talu, M.F., Yeroğlu, C. (2018). Genetic algorithm based tree segmentation. In: 2018 International Conference on Artificial Intelligence and Data Processing, pp. 1-5.

[35] Xia, G.S., Hu, J., Hu, F., Shi, B., Bai, X., Zhong, Y., Lu, X. (2017). AID: A benchmark data set for performance evaluation of aerial scene classification. IEEE Transactions on Geoscience and Remote Sensing, 55(7): 3965-3981. https://doi.org/10.1109/TGRS.2017.2685945

[36] Weng, Q., Mao, Z., Lin, J., Guo, W. (2017). Land-use classification via extreme learning classifier based on deep convolutional features. IEEE Geoscience and Remote Sensing Letters, 14(5): 704-708. https://doi.org/10.1109/LGRS.2017.2672643

[37] Bian, X., Chen, C., Tian, L., Du, Q. (2017). Fusing local and global features for high-resolution scene classification. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 10(6): 2889-2901. https://doi.org/10.1109/JSTARS.2017.2683799

[38] Anwer, R.M., Khan, F.S., van de Weijer, J., Molinier, M., Laaksonen, J. (2018). Binary patterns encoded convolutional neural networks for texture recognition and remote sensing scene classification. ISPRS Journal of Photogrammetry and Remote Sensing, 138: 74-85. https://doi.org/10.1016/j.isprsjprs.2018.01.023

[39] Chaib, S., Liu, H., Gu, Y., Yao, H. (2017). Deep feature fusion for VHR remote sensing scene classification. IEEE Transactions on Geoscience and Remote Sensing, 55(8): 4775-4784. https://doi.org/10.1109/TGRS.2017.2700322

[40] Yu, Y., Liu, F. (2018). A two-stream deep fusion framework for high-resolution aerial scene classification. Computational Intelligence and Neuroscience, 2018: 1-13. https://doi.org/10.1155/2018/8639367

[41] Scott, G.J., England, M.R., Starms, W.A., Marcum, R.A., Davis, C.H. (2017). Training deep convolutional neural networks for land–cover classification of high-resolution imagery. IEEE Geoscience and Remote Sensing Letters, 14(4): 549-553. https://doi.org/10.1109/LGRS.2017.2657778

[42] Zhou, Z., Zheng, Y., Ye, H., Pu, J., Sun, G. (2018). Satellite image scene classification via convnet with context aggregation. In: Pacific Rim Conference on Multimedia, pp. 329-339.