Odor and Subject Identification Using Electroencephalography Reaction to Olfactory

Onder Aydemir  

Department of Electrical-Electronics Engineering, Faculty of Engineering, Karadeniz Technical University, Trabzon 61080, Turkey

Corresponding Author Email: onderaydemir@ktu.edu.tr

Page: 799-805 | DOI: https://doi.org/10.18280/ts.370512

Received: 23 April 2020 | Revised: 8 September 2020 | Accepted: 16 September 2020 | Available online: 25 November 2020

© 2020 IIETA. This article is published by IIETA and is licensed under the CC BY 4.0 license (http://creativecommons.org/licenses/by/4.0/).


Abstract: 

It is well known that the human brain responds to all kinds of inputs, such as touch, sound, light, and odor. However, to the best of our knowledge, few works have investigated the response of the human brain to different inputs, especially in eyes-open and eyes-closed (EO & EC) conditions. Due to its fine temporal resolution, portability, noninvasiveness, and low set-up costs, electroencephalography (EEG) is one of the most practical ways to evaluate the response of the brain to different inputs. In this study, the brain's reactions to olfactory stimuli were analyzed, and two identification tasks were performed: odor identification and subject identification. The brain reactions were captured by EEG from five healthy subjects during smelling of valerian, lotus flower, cheese, and rosewater odors in the EO & EC conditions. We tested band power, statistical data, Hjorth parameters, and autoregressive model features and achieved the highest average classification accuracy rates of 96.94% and 99.34% for odor and subject identification, respectively. The obtained results proved that the olfactory response of the human brain in the EO & EC conditions can be reliably used for odor and subject identification.

Keywords: 

electroencephalogram, brain response, odor, subject identification, multi-class classification, feature extraction

1. Introduction

The human brain controls daily functions, including motor control, sensation, learning, thoughts, and feelings, by interpreting the signals relayed from the sense organs to the sensory regions of the brain. These regions receive and process sensory information (including sight, touch, taste, smell, and hearing), which arrives at the brain as electrical impulses. For example, when an odor molecule binds to a receptor, it produces an electrical signal that travels from the sensory neurons to the olfactory bulbs, which contain neuron cell bodies that transmit the electrical signal along the olfactory tracts, extensions of the olfactory bulbs, toward the olfactory area of the cerebral cortex. Similarly, the other sensory receptors (sight, touch, taste, and hearing) convert physical stimuli into neural activity; the only difference is that the activity is sent to the corresponding sensory area of the cerebral cortex [1-4].

The neural response of the brain to the signals from the sense organs can be evaluated by electroencephalography (EEG) [5]. Because of its fine temporal resolution, noninvasiveness, portability, and low set-up costs, EEG is generally the most preferred technique for recording neural activity [6]. In addition to providing a great deal of information about the brain, EEG also helps to evaluate the function of the nose. For example, EEG signals acquired during smelling can be used to investigate a subject's loss of smelling ability or to measure the response of the brain to an odor. It is known that the loss of smelling ability is an early sign of Parkinson’s disease [1-4]. Hence, it is crucial to objectively determine the loss of smelling ability. However, there is limited research into the response of the human brain to different odors [7-9]. Additionally, studies vary in terms of experimental methodology and outcomes. For example, while some papers investigated the classification of EEG signals recorded during imagination of odors [9-11], others classified only pleasant and unpleasant odor-based EEG signals [12-15].

In a pleasant/unpleasant odor study, researchers acquired EEG signals from five subjects in the eyes-closed condition (ECC). Although participants were asked to smell four odors during the experiment, the researchers classified only the most unpleasant and the most pleasant odors among them and obtained a 79.91% classification accuracy (CA) rate [13]. In other research, Kroupi et al. acquired electrical brain activity from five participants in the eyes-open and eyes-closed (EO & EC) conditions [14]. As in the previous work, the researchers classified pleasant and unpleasant odors after asking participants to subjectively define their choices and calculated a 90% average CA rate. It can be argued that, in such pleasant/unpleasant classification problems, the brain is naturally expected to produce more discriminative responses. In a previous study, we classified only lotus flower (LF) and cheese (C) odors by representing the signals with wavelet transform features, which revealed 98.29% and 94.08% average CA rates in the EO & EC conditions, respectively [16]. In another study, we applied a wavelet-transform-feature-based method to the valerian (V), rosewater (R), LF, and C odors and achieved 87.50% and 94.12% average CA rates in the EO & EC conditions, respectively [17].

In addition to the aforementioned studies, neural activity has also been utilized for subject identification using EEG signals recorded during different tasks [18-20]. For example, Bajwa and Dantu applied their method to a motor imagery EEG data set recorded from seven subjects and obtained a CA rate of 98.46% [21]. In another EEG-based biometry study, Falzon et al. used steady-state visual-evoked potentials as biometric measures [22]. The authors tested their method on eight subjects and obtained an overall CA rate of 91.7%. Although different types of EEG signals, recorded under different conditions, have been reported in the literature, it is worthwhile to mention that the current study is the first attempt to use an olfactory-based EEG database for subject identification.

In this study, we improved our odor-identification method not only in terms of CA but also in terms of computational complexity and proposed a subject-specific method to recognize multiclass electrical neural activity acquired during smelling of V, R, LF, and C odors in EO & EC conditions. Moreover, we proposed a subject-identification method, which uses olfactory-based EEG signals for EO & EC conditions. The electrical neural activity was represented by band power (BP), statistical data, Hjorth parameters, and autoregressive (AR) model features. Then, extracted features were classified by the k-nearest neighbor (k-NN) and naïve Bayes (NB) algorithms to categorize the EEG trials as V, R, LF, and C. The obtained results showed that subject-specific parameters, including feature extraction and classification methods, provide much better performance than a subject-independent model for odor recognition. On the other hand, it was shown that EEG signals recorded during smelling can be used for biometry, regardless of which of the four odors were smelled.

Following this introduction, Section 2 describes the data set, feature extraction, and classification algorithms. In Section 3, the classification results are given. The last section concludes and discusses the results.

2. Materials and Methods

2.1 Data acquisition

The data set was acquired from five healthy subjects (Subject 1 to Subject 5) during smelling of four kinds of odors, namely V, LF, C, and R, in the EO & EC conditions. It should be noted that, while V, LF, and R were used in liquid form, C was used in solid form. The subjects were right-handed, between 26 and 32 years old, and had no clinical disease. Participants sat on a chair, with a screen set 1 m away.

Figure 1. Number of trials: (a) Number of eyes-open trials; (b) Number of eyes-closed trials

The experimenter randomly chose a bottle with an odor. The random selection approach was important for recording only the brain’s response to physically smelling an odor and avoiding activity caused by imagining the odor. Afterward, following a “smell” command, the experimenter placed the odor bottle under the participant’s nose for about 2 s. This process constituted a single EEG trial. The electrical brain activity was recorded at a 250 Hz sampling rate with 256 electrodes. The total number of trials for each odor was between 94 and 114. The exact number of trials for each subject is given in Figure 1, in which the dark and light colors, respectively, indicate the numbers of training and testing EEG trials, which were randomly selected [14]. This study has two main purposes: 1) to categorize the trials in the testing set into V, LF, C, and R odors, which is called odor identification; and 2) to identify the subjects as Subject 1, Subject 2, Subject 3, Subject 4, or Subject 5 using the V, LF, C, and R trials.
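For illustration, the sketch below shows how such trials could be organized, assuming each ~2 s trial is kept as a NumPy array of channels × samples; the synthetic data, variable names, and 50/50 split are hypothetical stand-ins for the actual recordings and the per-subject splits shown in Figure 1.

```python
import numpy as np

FS = 250            # sampling rate in Hz (as reported)
N_CHANNELS = 256    # number of electrodes (as reported)
N_SAMPLES = 2 * FS  # ~2 s smelling window -> 500 samples per trial

rng = np.random.default_rng(0)

# Placeholder for one subject's session: trials x channels x samples,
# with an integer odor label per trial (0=V, 1=LF, 2=C, 3=R).
n_trials = 100
trials = rng.standard_normal((n_trials, N_CHANNELS, N_SAMPLES))
odor_labels = rng.integers(0, 4, size=n_trials)

# Random train/test split, mirroring the randomly selected split of Figure 1.
idx = rng.permutation(n_trials)
train_idx, test_idx = idx[: n_trials // 2], idx[n_trials // 2:]
X_train, y_train = trials[train_idx], odor_labels[train_idx]
X_test, y_test = trials[test_idx], odor_labels[test_idx]
print(X_train.shape, X_test.shape)
```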

2.2 Feature extraction

Feature extraction is one of the most important stages in pattern recognition and machine learning studies, since its capability directly influences the performance of the classifier: capable feature extraction algorithms provide a discriminative feature space, which helps to increase CA performance. In this research, we used BP, statistical data, Hjorth parameters, and AR model features. The BP features were calculated by the fast Fourier transform (FFT). It is worthwhile to note that, because there is no single recommended frequency band interval for EEG signals, the band powers were calculated for the band intervals 0-30 Hz, 0-40 Hz, and 0-50 Hz [23, 24].

To obtain BP features of the trials, first we calculated the FFT coefficients as follows:

$X(k)=\frac{1}{N} \sum_{n=0}^{N-1} x_{n} e^{-j 2 \pi k n / N}, \quad k=0,1, \ldots, N-1$     (1)

where, N is the number of EEG samples taken for analysis, $x_n$ is the nth discrete-time sample of an EEG trial, and X(k) is the kth FFT coefficient. Then, BP was computed by:

$\left.B P\right|_{f_{L}} ^{f_{U}}=\sum\left\|\left.X(k)\right|_{f_{L}} ^{f_{U}}\right\|^{2}$       (2)

where, $\left.X(k)\right|_{f_{L}} ^{f_{U}}$ indicates the FFT coefficients between the lower cut-off frequency (fL) and the upper cut-off frequency (fU). The cut-off frequency values used in this study are presented in Table 1.
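As a minimal sketch of Eqs. (1)-(2), the snippet below computes the band power of a single-channel trial from its FFT coefficients; the function name and the synthetic test signal are illustrative and not part of the original implementation.

```python
import numpy as np

def band_power(x, fs=250.0, f_low=0.0, f_high=30.0):
    """Band power of a single-channel trial following Eqs. (1)-(2):
    the sum of squared magnitudes of the FFT coefficients between f_low and f_high."""
    n = len(x)
    X = np.fft.rfft(x) / n                  # scaled FFT coefficients, Eq. (1)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)  # frequency of each coefficient
    band = (freqs >= f_low) & (freqs <= f_high)
    return np.sum(np.abs(X[band]) ** 2)     # Eq. (2)

# Example: the three BP features (0-30, 0-40, 0-50 Hz) for one channel of one trial.
trial_channel = np.random.randn(500)        # hypothetical 2 s trial sampled at 250 Hz
bp_features = [band_power(trial_channel, 250.0, 0.0, fu) for fu in (30.0, 40.0, 50.0)]
```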

Table 1. Lower and upper cut-off frequency limits

Total Band | fL (Hz) | fU (Hz)
1 | 0 | 30
2 | 0 | 40
3 | 0 | 50

The statistical features, including kurtosis (K), skewness (S), standard deviation (SD), variance (V), and mean ($\bar{x}$), were calculated as follows:

$K=\frac{\frac{1}{N} \sum_{i=1}^{N}\left(x_{i}-\bar{x}\right)^{4}}{\left(\frac{1}{N} \sum_{i=1}^{N}\left(x_{i}-\bar{x}\right)^{2}\right)^{2}}$     (3)

$S=\frac{\frac{1}{N} \sum_{i=1}^{N}\left(x_{i}-\bar{x}\right)^{3}}{\left(\sqrt{\frac{1}{N} \sum_{i=1}^{N}\left(x_{i}-\bar{x}\right)^{2}}\right)^{3}}$     (4)

$S D=\sqrt{\frac{\sum\left(x_{i}-\bar{x}\right)^{2}}{N}}$     (5)

$V=S D^{2}$      (6)

$\bar{x}=\frac{\sum x_{i}}{N}$     (7)

where, $\bar{x}$ indicates the mean of the xi [25, 26]. Hjorth parameters are defined by three descriptors, i.e., activity (A), mobility (M), and complexity (C), which, respectively, are calculated as follows:

$A=\frac{\sum_{i=1}^{N}\left(x_{i}-\bar{x}\right)^{2}}{N}$     (8)

$M=\frac{\sqrt{\frac{\sum_{i=1}^{N}\left(\dot{x}_{i}-\overline{\dot{x}}\right)^{2}}{N}}}{\sqrt{A}}$    (9)

$C=\frac{M(\dot{x})}{M(x)}$      (10)

where, $\dot{x}$ and $\overline{\dot{x}}$ denote the first derivative of x and the mean of the first derivative of x, respectively [27].
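The sketch below illustrates how the statistical features of Eqs. (3)-(7) and the Hjorth descriptors of Eqs. (8)-(10) could be computed for a single-channel trial; the finite-difference approximation of the derivative and the function names are illustrative assumptions, not the original implementation.

```python
import numpy as np
from scipy.stats import kurtosis, skew

def statistical_features(x):
    """Kurtosis, skewness, standard deviation, variance, and mean (Eqs. 3-7)."""
    return np.array([
        kurtosis(x, fisher=False),  # Eq. (3), non-excess (Pearson) kurtosis
        skew(x),                    # Eq. (4)
        np.std(x),                  # Eq. (5), population SD (divides by N)
        np.var(x),                  # Eq. (6)
        np.mean(x),                 # Eq. (7)
    ])

def hjorth_features(x):
    """Hjorth activity, mobility, and complexity (Eqs. 8-10)."""
    dx = np.diff(x)            # first derivative approximated by finite differences
    activity = np.var(x)       # Eq. (8)
    mobility = np.sqrt(np.var(dx) / activity)                          # Eq. (9)
    complexity = np.sqrt(np.var(np.diff(dx)) / np.var(dx)) / mobility  # Eq. (10)
    return np.array([activity, mobility, complexity])
```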

A raw EEG trial, xi, can be written as an AR model in the following equation:

$x_{i}=A_{1} x_{(i-1)}+A_{2} x_{(i-2)}+\cdots+A_{p} x_{(i-p)}+e_{i}$     (11)

where, A1, A2, ..., Ap are the AR model parameters, p is the AR model order, i is an integer that represents the discrete time samples of the EEG signal, and ei is white noise with zero mean and constant variance [28]. The forward–backward method was used to calculate the AR model parameters. In this study, the estimated AR model parameters were selected as features. It is worthwhile to note that p was set to 5 based on a cross-validation process on the training set.
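For illustration, the sketch below estimates the AR(p) coefficients of one channel by ordinary least squares; the paper uses the forward–backward method, which is not reproduced here, so this is a simplified stand-in under that assumption rather than the actual implementation.

```python
import numpy as np

def ar_features(x, p=5):
    """Least-squares estimate of the AR(p) coefficients A_1, ..., A_p of Eq. (11)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Design matrix: row t holds the p previous samples [x_{t-1}, ..., x_{t-p}].
    lagged = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    targets = x[p:]
    coeffs, *_ = np.linalg.lstsq(lagged, targets, rcond=None)
    return coeffs  # used directly as the feature vector

# Example: AR(5) features for one hypothetical channel of one trial.
ar_vec = ar_features(np.random.randn(500), p=5)
```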

2.3 Classification

In this study, the proposed method was tested with the k-NN and NB classifiers. Although the k-NN classifier is called a “lazy algorithm,” it is also easy to implement, highly efficient, and useful in solving multilabel machine-learning problems [29]. It determines the label of a test trial based on a distance metric and the k parameter, which indicates the k nearest data points (neighbors) in the training feature space. A simple example for the k-NN classifier is illustrated in Figure 2, which shows a 2D sample feature space containing data from four classes. Let us consider the case k=5. The unlabeled test trial (*) would be assigned to “Class 4,” because three out of its five closest data points belong to “Class 4.” Note that the k parameter can be at most the number of trials in the class with the fewest trials. For example, in Figure 2, the k parameter can be up to 7, because Class 1 has the fewest trials (7).

Figure 2. Example for the k-NN classifier

It should be noted that we implemented the leave-one-out cross-validation technique in the training stage, because it makes full use of the training data set and avoids the problems of random selections. In addition, because the number of trials is limited, we used the leave-one-out cross-validation technique to reveal the most suitable value of k, i.e., the one that achieves the highest CA performance on the training set. It should be emphasized that, since the lowest number of trials in a class in the training set was equal to 8, the largest possible value of k was set to 8.
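A minimal sketch of this selection procedure is given below, assuming scikit-learn and a feature matrix of shape (trials × features); the function name and the search range are illustrative.

```python
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def select_k_and_fit(train_features, train_labels, k_max=8):
    """Pick the k (1..k_max) with the highest leave-one-out CA on the training
    set, then fit a k-NN classifier with that k."""
    loo = LeaveOneOut()
    scores = {
        k: cross_val_score(KNeighborsClassifier(n_neighbors=k),
                           train_features, train_labels, cv=loo).mean()
        for k in range(1, k_max + 1)
    }
    best_k = max(scores, key=scores.get)
    model = KNeighborsClassifier(n_neighbors=best_k).fit(train_features, train_labels)
    return model, best_k
```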

In addition to k-NN, we also applied an NB classifier, which is a probabilistic technique widely used in the literature. NB considers the training feature space and applies Bayes’ theorem with naïve independence assumptions [30]. The class of an unknown trial is calculated probabilistically by the NB classifier from the available training feature space. The most probable class FNB of an unknown trial with the feature conjunction M = m1, m2, ..., mn was calculated by:

$F_{N B}=\underset{f \in F}{\arg \max }\; p\left(f \mid m_{1}, m_{2}, \ldots, m_{n}\right)$    (12)

It should be noted that we calculated the CA metric to evaluate the performance of the classifiers as given in Eq. (13). In this equation, CCT and TNT indicate the correctly classified EEG trials and the total number of considered trials, respectively:

$C A=\frac{C C T}{T N T}$     (13)
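As an illustration, the sketch below pairs a Gaussian naive Bayes classifier with the CA metric of Eq. (13); the Gaussian variant is an assumption, since the paper does not state which NB model was used, and the function name is illustrative.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

def classify_and_score(train_X, train_y, test_X, test_y):
    """Gaussian NB classification (assumed variant) plus the CA metric of Eq. (13)."""
    nb = GaussianNB().fit(train_X, train_y)
    predictions = nb.predict(test_X)
    cct = np.sum(predictions == test_y)  # correctly classified trials (CCT)
    tnt = len(test_y)                    # total number of trials (TNT)
    return cct / tnt                     # CA, Eq. (13)
```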

3. Results

In this study, the brain reactions during smelling of V, LF, C, and R odors in the EO & EC conditions were classified for the purposes of odor and subject identification. In order to show the robustness of the proposed method, we ran it one hundred times by randomly splitting the training and testing data sets. Thus, we calculated an average CA and its standard deviation for each feature-extraction method.
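A minimal sketch of this repeated random-split evaluation is given below, assuming a user-supplied evaluate function (for example, the classify_and_score sketch above) that trains on the training split and returns the test CA; the 50% test fraction is illustrative, since the actual per-subject splits follow Figure 1.

```python
import numpy as np

def repeated_random_split_ca(features, labels, evaluate,
                             n_runs=100, test_fraction=0.5, seed=0):
    """Average CA and standard deviation over repeated random train/test splits."""
    rng = np.random.default_rng(seed)
    n = len(labels)
    cas = []
    for _ in range(n_runs):
        idx = rng.permutation(n)
        n_test = int(n * test_fraction)
        test_idx, train_idx = idx[:n_test], idx[n_test:]
        cas.append(evaluate(features[train_idx], labels[train_idx],
                            features[test_idx], labels[test_idx]))
    return np.mean(cas), np.std(cas)
```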

3.1 Results of odor identification

The average odor-identification test CA results and their standard deviations for the BP features are provided in Tables 2 and 3 for the EO & EC conditions, respectively. As seen from the tables, in the eyes-open condition (EOC), the best CA was obtained by the k-NN classifier for Subject 1 as 91.70% in the 0-30 Hz band. Similarly, the best CA for the ECC was again achieved for Subject 1, as 92.32% in the 0-40 Hz band. In terms of average CA over the subjects, the k-NN classifier proved more successful than the NB classifier: the highest average accuracies were obtained by the k-NN classifier as 76.19% and 77.89% in the EO & EC conditions, respectively.

For the EOC, the average test CA results and their standard deviations for the statistical data, Hjorth parameter, and AR model features are provided in Figure 3(a), Figure 3(b), and Figure 3(c), respectively; for the ECC, they are given in Figure 4(a), Figure 4(b), and Figure 4(c), respectively. Note that the lines above the bars represent the standard deviations; further, in each subfigure the last two bars show the average test CA values over the subjects. Based on the subfigures of Figure 3, it is clear that the statistical features achieved better performance in the EOC than the Hjorth parameter and AR model features. The highest CA was obtained for Subject 2 as 92.61% with the statistical features and an NB classifier. In terms of the average CA over the subjects, the NB classifier also achieved the highest performance with the statistical features, as 76.16%. On the other hand, the highest performances for the Hjorth parameter and AR model features were 50.77% (with k-NN) and 48.69% (with NB), respectively.

As shown in the subfigures of Figure 4, the AR model features achieved better performance in the ECC than the Hjorth parameter and statistical features. The highest CA was achieved for Subject 1 as 96.94% with the AR model features and a k-NN classifier. In terms of the average CA over the subjects, the k-NN classifier also achieved the highest performance with the AR model features, as 81.08%. On the other hand, the highest performances for the statistical and Hjorth parameter features were 75.75% (with k-NN) and 44.97% (with NB), respectively.

Based on the average CA values over the subjects, it can be concluded that, for the EOC, the BP features in 0-40 Hz with a k-NN classifier provided the highest performance, whereas for the ECC the AR model features achieved the best performance. In contrast, the worst performances in the EO & EC conditions were obtained with the Hjorth parameter features. The best performances for each individual subject are summarized in Table 4.

Figure 3. Results of statistical data, Hjorth parameters, and AR model features in the EOC. (a) Results of statistical features. (b) Results of Hjorth parameter features. (c) Results of AR model features

Figure 4. Results of statistical data, Hjorth parameters, and AR model features in the ECC. (a) Results of statistical features. (b) Results of Hjorth parameter features. (c) Results of AR model features

Table 2. Results of band power features in the EOC

Subject | 0-30 Hz k-NN | 0-30 Hz NB | 0-40 Hz k-NN | 0-40 Hz NB | 0-50 Hz k-NN | 0-50 Hz NB
S1 | 91.70±4.3 | 71.38±7.0 | 85.32±5.6 | 72.81±7.1 | 74.38±5.8 | 59.38±7.7
S2 | 74.31±5.9 | 70.64±6.3 | 77.03±4.2 | 70.86±5.9 | 67.64±5.3 | 59.83±5.4
S3 | 63.43±5.4 | 59.11±5.7 | 59.73±5.8 | 56.46±6.3 | 54.93±6.8 | 43.73±5.8
S4 | 68.93±6.3 | 61.68±6.5 | 77.67±5.0 | 64.29±7.9 | 84.91±4.6 | 64.18±8.7
S5 | 76.16±5.3 | 70.65±7.1 | 81.22±5.8 | 67.65±6.0 | 73.54±5.9 | 55.65±6.4
Avg. | 74.91±5.4 | 66.69±6.5 | 76.19±5.3 | 66.41±6.6 | 71.08±5.7 | 56.55±6.8

Table 3. Results of band power features in the ECC

Subject | 0-30 Hz k-NN | 0-30 Hz NB | 0-40 Hz k-NN | 0-40 Hz NB | 0-50 Hz k-NN | 0-50 Hz NB
S1 | 90.20±3.6 | 74.14±6.6 | 92.32±4.2 | 76.76±7.2 | 78.49±6.4 | 60.73±7.5
S2 | 82.67±4.5 | 75.90±5.2 | 79.49±5.0 | 75.02±6.2 | 69.85±4.8 | 64.95±6.4
S3 | 69.00±5.8 | 59.48±7.5 | 50.10±4.8 | 46.33±6.9 | 47.82±6.2 | 40.62±7.0
S4 | 76.02±4.5 | 62.55±8.9 | 77.25±4.5 | 64.75±5.4 | 71.20±5.4 | 56.07±6.9
S5 | 71.55±5.5 | 48.08±6.7 | 69.43±6.3 | 46.40±5.8 | 64.44±6.4 | 48.63±6.9
Avg. | 77.89±4.8 | 64.03±7.0 | 73.72±5.0 | 61.85±6.3 | 66.36±5.8 | 54.20±6.9

Table 4. Best performances of the proposed method

Subject | Eyes Open CA | Features | Classifier | Eyes Closed CA | Features | Classifier
S1 | 91.70±4.3 | BP | k-NN | 96.94±3.2 | AR | k-NN
S2 | 92.61±5.3 | Statistical | NB | 93.80±2.9 | Statistical | k-NN
S3 | 66.05±8.7 | Statistical | NB | 73.21±5.8 | Statistical | k-NN
S4 | 81.76±5.7 | Statistical | NB | 85.50±4.5 | AR | k-NN
S5 | 81.22±5.8 | BP | k-NN | 75.20±5.9 | AR | k-NN
Avg. | 82.67±6.0 | - | - | 84.93±4.5 | - | -

Table 5. Subject identification results

Feature | Eyes Open k-NN | Eyes Open NB | Eyes Closed k-NN | Eyes Closed NB
BP (0-30 Hz) | 97.48±1.21 | 69.04±3.26 | 98.57±0.87 | 81.45±2.88
BP (0-40 Hz) | 95.82±1.33 | 69.07±2.58 | 97.00±0.97 | 73.43±3.51
BP (0-50 Hz) | 93.75±1.53 | 93.42±1.66 | 72.70±2.48 | 68.63±3.47
AR | 61.47±3.35 | 59.53±3.04 | 97.72±1.17 | 88.09±2.00
Statistical | 75.75±2.68 | 74.78±3.01 | 99.34±0.54 | 92.79±1.72
Hjorth | 57.16±2.33 | 52.95±2.34 | 47.68±2.57 | 58.87±2.61

3.2 Results of subject identification

The average subject-identification test CA results and their standard deviations for each feature are provided in Table 5. For these biometry results, we achieved the highest CA value, 99.34%, with the statistical features and the k-NN classifier in the ECC. In this condition, the worst CA, 47.68%, was obtained with the Hjorth parameter features. Except for this result, the k-NN classifier clearly performed better than NB for all features in both the EOC and the ECC. The best CA in the EOC was obtained with the BP (0-30 Hz) features as 97.48%. For the best results in the EO & EC conditions, the standard deviations were 1.21 and 0.54, which verified the robustness of the proposed method.

4. Conclusion

In this paper, the responses of the brain to V, LF, C, and R odors were analyzed in terms of odor and subject identification. In order to show its performance and robustness, the proposed method was run one hundred times, and the average classification accuracies and their standard deviations were calculated. The higher CA and smaller standard deviation values proved that the proposed method was successful and stable.

In terms of the odor-identification approach, from the subject-independent point of view, it can be concluded that, while the band power features in 0-40 Hz with a k-NN classifier provided better performance in the EOC, the AR model features with a k-NN classifier achieved better performance in the ECC. Compared with the subject-independent results, it is clear that subject-specific parameters, including the feature extraction and classification methods, remarkably improved the CA performance. Moreover, based on the results of the BP features, it is worthwhile to note that electrical brain activity is nonstationary and its response to a stimulus generally differs from subject to subject. Therefore, each subject might have his or her own dominant frequency band for extracting discriminative attributes. Hence, in this study, we proposed a subject-specific model for classifying the V, LF, C, and R odors.

While the minimum difference between the EO & EC conditions was obtained for Subject 2 as 1.19%, the maximum difference was calculated for Subject 3 as 7.16%. On the other hand, since the EEG signals of Subject 1 were the most discriminative, it can be interpreted that this subject's motivation for the task was higher or his or her olfactory system works better; the opposite can be said for Subject 3. Based on the achieved average CA rates, we improved on the results of Aydemir [17] by 5.11% and 2.82% for the EO & EC conditions, respectively.

In terms of the performance of the feature extraction methods, it can be concluded that the statistical features generally provided higher performance than the other methods in many cases. Moreover, their computational cost is low: extracting the statistical features from a trial takes only 0.04 s. The AR model and BP features also achieved better performance in some cases, whereas the Hjorth parameter features did not yield satisfactory results. On the other hand, the obtained classification results proved that the performance of the k-NN classifier is better than that of the NB classifier.

The overall results verified that the olfactory response of the human brain in the EO & EC conditions can be reliably used for odor and subject identification. The odor-identification model offers a way to measure the response of the brain; furthermore, it can help in quantifying olfactory loss for clinical purposes. On the other hand, given increasing security and privacy demands, the proposed method is a potentially good alternative for EEG-based biometry for person identification and authentication, especially compared with conventional biometric measures such as fingerprint, palm, iris, or voice. We believe that, due to the unique nature of EEG and its portability, EEG-based biometric systems can be employed with highly reliable performance in real applications.

  References

[1] Haehner, A., Hummel, T., Hummel, C., Sommer, U., Junghanns, S., Reichmann, H. (2007). Olfactory loss may be a first sign of idiopathic Parkinson's disease. Movement Disorders, 22(6): 839-842. https://doi.org/10.1002/mds.21413

[2] Berendse, H.W., Booij, J., Francot, C.M., Bergmans, P.L., Hijman, R., Stoof, J.C., Wolters, E.C. (2001). Subclinical dopaminergic dysfunction in asymptomatic Parkinson's disease patients' relatives with a decreased sense of smell. Annals of Neurology, 50(1): 34-41. https://doi.org/10.1002/ana.1049

[3] Haehner, A., Masala, C., Walter, S., Reichmann, H., Hummel, T. (2019). Incidence of Parkinson’s disease in a large patient cohort with idiopathic smell and taste loss. Journal of Neurology, 266(2): 339-345. https://doi.org/10.1007/s00415-018-9135-x

[4] Haehner, A., Boesveldt, S., Berendse, H.W., Mackay-Sim, A., Fleischmann, J., Silburn, P.A., Johnston, A.N., Mellick, G.D., Herting, B, Reichmann, H., Hummel, T. (2009). Prevalence of smell loss in Parkinson's disease-a multicenter study. Parkinsonism & Related Disorders, 15(7): 490-494. https://doi.org/10.1016/j.parkreldis.2008.12.005

[5] Polomac, N., Leicht, G., Nolte, G., Andreou, C., Schneider, T.R., Steinmann, S., Engel, A.K., Mulert, C. (2015). Generators and connectivity of the early auditory evoked gamma band response. Brain Topography, 28(6): 865-878. https://doi.org/10.1007/s10548-015-0434-6

[6] Meinel, A., Castaño-Candamil, S., Blankertz, B., Lotte, F., Tangermann, M. (2019). Characterizing regularization techniques for spatial filter optimization in oscillatory EEG regression problems. Neuroinformatics, 17(2): 235-251. https://doi.org/10.1007/s12021-018-9396-7

[7] Hou, H.R., Zhang, X.N., Meng, Q.H. (2020). Odor-induced emotion recognition based on average frequency band division of EEG signals. Journal of Neuroscience Methods, 334: 108599. https://doi.org/10.1016/j.jneumeth.2020.108599

[8] Oleszkiewicz, A., Pellegrino, R., Guducu, C., Farschi, L., Warr, J., Hummel, T. (2019). Temporal encoding during unimodal and bimodal odor processing in the human brain. Chemosensory Perception, 12(1): 59-66. https://doi.org/10.1007/s12078-018-9251-0

[9] Placidi, G., Avola, D., Petracca, A., Sgallari, F., Spezialetti, M. (2015). Basis for the implementation of an EEG-based single-trial binary brain computer interface through the disgust produced by remembering unpleasant odors. Neurocomputing, 160: 308-318. https://doi.org/10.1016/j.neucom.2015.02.034

[10] Henkin, R.I., Levy, L.M. (2001). Lateralization of brain activation to imagination and smell of odors using functional magnetic resonance imaging (fMRI): Left hemispheric localization of pleasant and right hemispheric localization of unpleasant odors. Journal of Computer Assisted Tomography, 25(4): 493-514. https://doi.org/10.1097/00004728-200107000-00001

[11] Bensafi, M., Pouliot, S., Sobel, N. (2005). Odorant-specific patterns of sniffing during imagery distinguish ‘bad’ and ‘good’ olfactory imagers. Chemical Senses, 30(6): 521-529. https://doi.org/10.1093/chemse/bji045

[12] Lorig, T.S. (2000). The application of electroencephalographic techniques to the study of human olfaction: A review and tutorial. International Journal of Psychophysiology, 36(2): 91-104. https://doi.org/10.1016/S0167-8760(99)00104-X

[13] Yazdani, A., Kroupi, E., Vesin, J.M., Ebrahimi, T. (2012). Electroencephalogram alterations during perception of pleasant and unpleasant odors. IEEE 2012 Fourth International Workshop on Quality of Multimedia Experience, pp. 272-277. https://doi.org/10.1109/QoMEX.2012.6263860

[14] Kroupi, E., Yazdani, A., Vesin, J.M., Ebrahimi, T. (2014). EEG correlates of pleasant and unpleasant odor perception. ACM Transactions on Multimedia Computing, Communications, and Applications, 11(1s): 13. https://doi.org/10.1145/2637287

[15] Li, D.Y., Jia, J.F., Wang, X.C. (2020). Unpleasant food odors modulate the processing of facial expressions: An event-related potential study. Frontiers in Neuroscience, 14: 686. https://doi.org/10.3389/fnins.2020.00686

[16] Yavuz, E., Aydemir, O. (2016). Olfaction recognition by EEG analysis using wavelet transform features. 2016 International Symposium on Innovations in Intelligent Systems and Applications, pp. 1-4. https://doi.org/10.1109/INISTA.2016.7571827

[17] Aydemir, O. (2017). Olfactory recognition based on EEG gamma-band activity. Neural Computation, 29(6): 1667-1680. https://doi.org/10.1162/NECO_a_00966

[18] DelPozo-Banos, M., Travieso, C.M., Alonso, J.B., John, A. (2018). Evidence of a task-independent neural signature in the spectral shape of the electroencephalogram. International Journal of Neural Systems, 28(1): 1750035. https://doi.org/10.1142/S0129065717500356

[19] DelPozo-Banos, M., Travieso, C.M., Weidemann, C.T., Alonso, J.B. (2015). EEG biometric identification: A thorough exploration of the time-frequency domain. Journal of Neural Engineering, 12(5): 056019. https://doi.org/10.1088/1741-2560/12/5/056019

[20] Bidgoly, A.J., Bidgoly, H.J., Arezoumand, Z. (2020). A survey on methods and challenges in EEG based authentication. Computers & Security, 93: 101788. https://doi.org/10.1016/j.cose.2020.101788

[21] Bajwa, G., Dantu, R. (2016). Neurokey: Towards a new paradigm of cancelable biometrics-based key generation using electroencephalograms. Computers & Security, 62: 95-113. https://doi.org/10.1016/j.cose.2016.06.001

[22] Falzon, O., Zerafa, R., Camilleri, T., Camilleri, K.P. (2017). EEG-based biometry using steady state visual evoked potentials. In 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 4159-4162. 

[23] Kołodziej, M., Majkowski, A., Rak, R. (2011). Implementation of genetic algorithms to feature selection for the use of brain-computer interface. Przegląd Elektrotechniczny, 87(5): 71-73.

[24] Slobounov, S., Tutwiler, R., Slobounova, E., Rearick, M., Ray, W. (2000). Human oscillatory brain activity within gamma band (30–50 Hz) induced by visual recognition of non-stable postures. Cognitive Brain Research, 9(2): 177-192. https://doi.org/10.1016/S0926-6410(99)00055-5

[25] Nagabushanam, P., George, S.T., Radha, S. (2019). EEG signal classification using LSTM and improved neural network algorithms. Soft Computing, 24: 9981-10003. https://doi.org/10.1007/s00500-019-04515-0

[26] Kołodziej, M., Majkowski, A., Rak, R.J. (2011). A new method of EEG classification for BCI with feature extraction based on higher order statistics of wavelet components and selection with genetic algorithms. In: Dobnikar A., Lotrič U., Šter B. (eds) Adaptive and Natural Computing Algorithms. ICANNGA 2011. Lecture Notes in Computer Science, vol 6593. Springer, Berlin, Heidelberg, pp. 280-289. https://doi.org/10.1007/978-3-642-20282-7_29

[27] Oh, S.H., Lee, Y.R., Kim, H.N. (2014). A novel EEG feature extraction method using Hjorth parameter. International Journal of Electronics and Electrical Engineering, 2(2): 106-110. https://doi.org/10.12720/ijeee.2.2.106-110

[28] Rafik, D., Larbi, B. (2019). Autoregressive modeling based empirical mode decomposition (EMD) for epileptic seizures detection using EEG signals. Traitement du Signal, 36(3): 273-279. https://doi.org/10.18280/ts.360311

[29] Duda, R.O., Hart, P.E., Stork, D.G. (2001). Pattern Classification, 2nd edition, Wiley, NY.

[30] Gupta, N., Ahuja, N., Malhotra, S., Bala, A., Kaur, G. (2017). Intelligent heart disease prediction in cloud environment through ensembling. Expert Systems, 34(3): e12207. https://doi.org/10.1111/exsy.12207