Zhan Zhang’s research while affiliated with Chinese Academy of Sciences and other places


Publications (9)


Figures: the 25 joints captured by a Kinect sensor; the signal before and after Gaussian filtering; the number of features differing significantly between the two groups; the correlations between model-predicted and self-reported sleep quality scores.
Gait can reveal sleep quality with machine learning models
  • Article
  • Full-text available

September 2019 · 170 Reads · 19 Citations

Xingyun Liu · Bingli Sun · Zhan Zhang · [...] · Tingshao Zhu

Sleep quality is an important health indicator, but current measurements of sleep rely on questionnaires, polysomnography, and similar tools, which are intrusive, expensive, or time-consuming. A less intrusive, inexpensive, and more convenient method is therefore needed. This study examines whether gait patterns captured by a Kinect sensor can reveal a person's sleep quality. Fifty-nine healthy students without disabilities were recruited as participants. The Pittsburgh Sleep Quality Index (PSQI) and Kinect sensors were used to acquire sleep quality scores and gait data. After data preprocessing, gait features were extracted to train machine learning models that predicted sleep quality scores. A t-test indicated that the following joints carried stronger weight in the prediction: Head, Spine Shoulder, Wrist Left, Hand Right, Thumb Left, Thumb Right, Hand Tip Left, Hip Left, and Foot Left. For overall sleep quality prediction, the best result was achieved by Gaussian processes, with a correlation of 0.78 (p < 0.001); for the subscales, the best result was 0.51 for daytime dysfunction (p < 0.001), obtained by linear regression. Gait can thus reveal sleep quality quite well, and this method is a good supplement to existing methods, identifying sleep quality more ecologically and less intrusively.
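The modelling step described above (Gaussian process regression from gait features to a PSQI-style score, validated by correlating out-of-fold predictions with self-reports) can be sketched roughly as follows. The feature matrix and scores are synthetic stand-ins, and the kernel, noise level, and cross-validation setup are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(59, 5))                         # 59 participants, 5 synthetic gait features
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=59)   # stand-in for PSQI-like scores

# Gaussian process regression with an RBF kernel; alpha absorbs score noise.
model = GaussianProcessRegressor(kernel=RBF(), alpha=0.1, normalize_y=True)
pred = cross_val_predict(model, X, y, cv=5)          # out-of-fold predictions

r, p = pearsonr(pred, y)                             # criterion-validity check
print(f"correlation: {r:.2f}")
```

The correlation between predicted and "self-reported" scores plays the same role as the 0.78 figure reported in the abstract, though here it only reflects the synthetic data.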


See your mental state from your walk: Recognizing anxiety and depression through Kinect-recorded gait data

May 2019 · 356 Reads · 68 Citations

As mental health problems such as anxiety and depression pose a growing challenge today, more convenient, objective, real-time techniques for assessing mental state are needed. The Microsoft Kinect camera is one option for contactlessly capturing human gait, which may reflect the walker's mental state. We therefore propose a novel method for monitoring an individual's anxiety and depression based on Kinect-recorded gait patterns. In this study, after completing the 7-item Generalized Anxiety Disorder Scale (GAD-7) and the 9-item Patient Health Questionnaire (PHQ-9), 179 participants were asked to walk naturally on a footpath while being recorded by Kinect cameras. After preprocessing, fast Fourier transforms (FFT) were used to extract features from the Kinect-captured gait data, and different machine learning algorithms were used to train regression models recognizing anxiety and depression levels, as well as classification models detecting cases with specific depressive symptoms. The predictive accuracy of the regression models reached a medium-to-large level: the correlation coefficient between predicted and questionnaire scores reached 0.51 for anxiety (by epsilon-support vector regression, e-SVR) and 0.51 for depression (by Gaussian processes, GP). Accuracy was higher still, 0.74 for anxiety (GP) and 0.64 for depression (GP), when the models were trained and tested on the female sample. The classification models were also effective in detecting cases with certain symptoms. These results demonstrate that it is possible to recognize an individual's questionnaire-measured anxiety/depression levels and some depressive symptoms from Kinect-recorded gait data using machine learning. This approach shows potential for developing non-intrusive, low-cost methods for monitoring individuals' mental health in real time.
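The FFT feature-extraction step mentioned above can be illustrated on a synthetic walking-like signal: demean one joint's coordinate time series, take the real FFT, and keep low-frequency magnitudes as features. The 30 Hz frame rate and 1.8 Hz cadence are assumptions for illustration, not values taken from the study.

```python
import numpy as np

fs = 30.0                               # assumed Kinect skeleton frame rate (Hz)
t = np.arange(0, 10, 1 / fs)            # 10 s of walking
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * 1.8 * t) + 0.1 * rng.normal(size=t.size)

# Magnitude spectrum of the demeaned signal.
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)

features = spectrum[:16]                # low-frequency magnitudes as features
dominant = freqs[np.argmax(spectrum)]   # cadence-related spectral peak
print(f"dominant frequency: {dominant:.1f} Hz")
```

The dominant spectral peak recovers the simulated cadence; in a real pipeline each joint coordinate would contribute its own set of spectral features.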


Self-esteem recognition based on gait pattern using Kinect

September 2017 · 88 Reads · 69 Citations

Gait & Posture

Background: Self-esteem is an important aspect of an individual's mental health. When subjects are unable to complete a self-report questionnaire, behavioral assessment is a useful supplement. In this paper, we propose using gait data collected by Kinect as an indicator to recognize self-esteem. Methods: 178 graduate students without disabilities participated in our study. First, all participants completed the 10-item Rosenberg Self-Esteem Scale (RSS) to obtain a self-esteem score. After completing the RSS, each participant walked naturally for two minutes on a rectangular red carpet, and the gait data were recorded using a Kinect sensor. After data preprocessing, we extracted behavioral features and used them to build machine learning models to predict self-esteem. Results: For self-esteem prediction, the best correlation coefficient between predicted and self-reported scores was 0.45 (p < 0.001). Dividing the participants by gender, the correlation coefficient was 0.43 for males (p < 0.001) and 0.59 for females (p < 0.001). Conclusion: Using gait data captured by a Kinect sensor, we found that gait patterns can be used to recognize self-esteem with fairly good criterion validity. The gait-based predictive model can serve as a useful supplementary method for measuring self-esteem.
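The gender-split criterion-validity check described above amounts to correlating predicted and self-reported scores overall and within subgroups. A minimal sketch follows; all scores are synthetic, and the "predicted" values are simulated with additive noise rather than produced by a real gait model.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 178
gender = rng.integers(0, 2, size=n)                  # 0 = male, 1 = female (synthetic)
reported = rng.normal(30, 4, size=n)                 # RSS-like self-report scores
predicted = reported + rng.normal(0, 4, size=n)      # simulated imperfect model output

# Criterion validity: overall and per-gender correlations.
r_all, _ = pearsonr(predicted, reported)
r_male, _ = pearsonr(predicted[gender == 0], reported[gender == 0])
r_female, _ = pearsonr(predicted[gender == 1], reported[gender == 1])
print(f"all: {r_all:.2f}, male: {r_male:.2f}, female: {r_female:.2f}")
```

With equal noise in both subgroups the correlations come out similar here; the paper's male/female gap (0.43 vs. 0.59) reflects real model behavior that this sketch does not reproduce.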



Tables: classification performance for differentiating neutral vs. angry (Table 3), neutral vs. happy (Table 5), happy vs. angry (Table 7), and neutral vs. happy vs. angry (Table 9).
Emotion recognition based on customized smart bracelet with built-in accelerometer

July 2016 · 1,419 Reads · 84 Citations

Background: Recently, emotion recognition has become a hot topic in human-computer interaction. If computers could understand human emotions, they could interact better with their users. This paper proposes a novel method to recognize human emotions (neutral, happy, and angry) using a smart bracelet with a built-in accelerometer. Methods: In this study, a total of 123 participants were instructed to wear a customized smart bracelet with a built-in accelerometer that can track and record their movements. First, participants walked for two minutes as normal, which served as walking behavior in the neutral emotion condition. Participants then watched emotional film clips to elicit happy and angry emotions, with an interval of more than four hours between the two clips. After watching each film clip, they walked for one minute, which served as walking behavior in the happy or angry condition. We collected raw data from the bracelet and extracted features from them. Based on these features, we built classification models for the three types of emotions (neutral, happy, and angry). Results and discussion: For two-category classification, accuracy reached 91.3% (neutral vs. angry), 88.5% (neutral vs. happy), and 88.5% (happy vs. angry); for differentiation among all three emotions (neutral, happy, and angry), accuracy reached 81.2%. Conclusions: Using wearable devices, we found it is possible to recognize human emotions (neutral, happy, and angry) with fair accuracy. These results may be useful for improving the performance of human-computer interaction.
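Turning a raw accelerometer stream into per-window features is the first concrete step in a pipeline like the one above. The sketch below computes common summary statistics (per-axis mean, standard deviation, and signal energy) over one window; these specific features and the 10 Hz stream are assumptions for illustration, not necessarily the paper's exact feature set or sampling rate.

```python
import numpy as np

rng = np.random.default_rng(3)
acc = rng.normal(size=(3, 600))  # 3 axes, ~60 s at an assumed 10 Hz (synthetic)

def window_features(segment):
    """Per-axis mean, standard deviation, and mean signal energy for one window."""
    return np.concatenate([
        segment.mean(axis=1),
        segment.std(axis=1),
        (segment ** 2).mean(axis=1),
    ])

feats = window_features(acc[:, :100])   # feature vector for the first 10 s window
print(feats.shape)
```

Each window yields one fixed-length feature vector (here 3 statistics × 3 axes = 9 values), which is what the downstream classifiers consume.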



Emotion recognition based on customized smart bracelet with built-in accelerometer

January 2016 · 69 Reads

Background. In recent years, artificial intelligence (AI) has become an important field, in which making computers understand human thinking is a critical topic. If computers could perceive and respond to non-verbal cues such as emotions, communication between humans and computers would be friendlier and more natural. Accordingly, more and more researchers are working on recognizing everyday human emotions from wearable sensor signals, with applications in health care and human-computer interaction (HCI). Methods. In this paper, we propose an emotion recognition method based on a customized smart bracelet with a built-in accelerometer. The bracelets can be worn on the ankle and wrist. First, acceleration data from the ankle and wrist are collected while a person walks naturally. Because the raw acceleration data are noisy and variable, a moving average filter is used to eliminate noise. Moreover, since walking is a repetitive movement of the legs and arms, the collected acceleration data can be redundant. We therefore designed a sliding window to divide the data into slices of equal size, with neighboring slices partially overlapping; in subsequent steps, each data slice is treated as one sample. This both avoids the high computational cost of redundant data and expands the number of samples. For each data slice, 114 features relevant to emotion recognition are extracted. Principal component analysis (PCA) is then applied to select effective attributes. Finally, we built emotion recognition classifiers on the Weka software platform. Using the same attributes as input, we compared the emotion recognition performance of several classical classifiers, including support vector machine (SVM), decision tree, random tree, and random forest. Results.
Classification accuracy is used to evaluate the effectiveness of the proposed method. Overall, SVM outperforms the other classifiers. The two-category classification accuracies for neutral-anger, neutral-happiness, and happiness-anger are 91.3%, 88.5%, and 88.5%, respectively; the accuracy of multi-category classification among neutral, happiness, and anger is 81.2%. Discussion. In the comparative experiments, the recognition rates for the different emotional states are all above 81%. We conclude that gait can reveal a person's affective state.
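The pipeline described in the abstract (overlapping sliding windows, PCA for dimensionality reduction, then an SVM) can be sketched as below. The window size, 50% overlap, synthetic signals, and scikit-learn (in place of the Weka platform the authors used) are all illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)

def sliding_windows(signal, size=50, step=25):
    """Split a 1-D signal into windows of `size` samples with 50% overlap."""
    return np.array([signal[i:i + size]
                     for i in range(0, len(signal) - size + 1, step)])

# Two synthetic "emotion" recordings with different dominant frequencies.
t = np.arange(2000) / 50.0
calm = np.sin(2 * np.pi * 1.0 * t) + 0.3 * rng.normal(size=t.size)
angry = np.sin(2 * np.pi * 2.5 * t) + 0.3 * rng.normal(size=t.size)

X = np.vstack([sliding_windows(calm), sliding_windows(angry)])
y = np.array([0] * (len(X) // 2) + [1] * (len(X) - len(X) // 2))

# Standardize, reduce dimensionality with PCA, classify with an SVM.
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC())
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cv accuracy: {acc:.2f}")
```

The overlapping windows both multiply the number of training samples and keep each sample short, which mirrors the abstract's motivation for the sliding-window design.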


Emotion recognition based on customized smart bracelet with built-in accelerometer

January 2016 · 63 Reads · 1 Citation



Identifying Emotion from Natural Walking

August 2015 · 30 Reads · 2 Citations

To provide a means of identifying human emotion from walking, this paper analyzes the capability of walking activity to reveal a person's affective state. We obtained raw wrist and ankle accelerometer data; because high-dimensional data contain redundant information, we varied w (the moving average filter window size) and applied principal component analysis (PCA) to reduce dimensionality, then compared classification accuracy for the wrist and ankle across different w values with specific models. Emotion identification from the ankle performed better than from the wrist. The best accuracy for anger, 74%, was achieved by a decision tree with w = 5; the best accuracy for happiness was 85%; and the identification rate for anger vs. happiness was 85.34%. We conclude that walking gait can reveal and identify a person's emotional state while walking.
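The moving average filter with window size w, tuned above, is simple to state concretely. The sketch below uses w = 5 (the value the abstract reports as best for anger); the input series is made up for illustration.

```python
import numpy as np

def moving_average(x, w=5):
    """Smooth a 1-D series with a length-`w` moving average ('valid' mode)."""
    return np.convolve(x, np.ones(w) / w, mode="valid")

# A made-up oscillating accelerometer trace; smoothing pulls it toward the mean.
raw = np.array([1.0, 5.0, 1.0, 5.0, 1.0, 5.0, 1.0, 5.0, 1.0])
smooth = moving_average(raw, w=5)
print(smooth)
```

Larger w suppresses more noise but also blurs gait detail, which is why the study treats w as a parameter to tune rather than fixing it in advance.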

Citations (5)


... The Epworth Sleepiness Scale (ESS) is employed to gauge the severity of excessive daytime sleepiness symptoms 12 , while subjective sleep quality can be assessed by either the Pittsburgh Sleep Quality Index (PSQI) or the Insomnia Severity Index (ISI) 13 . However, a significant drawback of these assessment tools is their reliance on subjective reports, which may lack precision, accuracy, and reliability in certain contexts 14,15 . This shortfall has intensified the quest for more objective and precise diagnostic techniques to evaluate the severity of cognitive impairment, subjective sleep quality, and daytime sleepiness symptoms. ...

Reference:

MRI quantified perivascular space metrics as imaging biomarkers for assessing the severity of cognitive impairment and sleep disturbance in young adults with long-time mobile phone use through machine learning approaches
Gait can reveal sleep quality with machine learning models

... Judgments about another person's mood, motives, and characteristics are heavily influenced by nonverbal behavior (Ekman et al., 1980;Murphy, 2012). The influence of nonverbal behavior on person perception serves an adaptive purpose, as nonverbal expressions provide valid cues to emotional states and personality traits (Floyd, 2006;Zhao et al., 2019). ...

See your mental state from your walk: Recognizing anxiety and depression through Kinect-recorded gait data

... Lafreniere et al. [72] stated that people with high self-esteem develop a harmonious passion, whereas people with low self-esteem develop an obsessive passion. In the light of the existence of studies indicating that self-concept is related to identity and personality [73][74][75][76][77][78], it can be said that passion and identity are indirectly related and a harmonious passion can positively affect identity. One of the main characteristics for athletes to continue their sporting careers for a long time is passion. ...

Self-esteem recognition based on gait pattern using Kinect
  • Citing Article
  • September 2017

Gait & Posture

... The point numbers of our vision recognition were determined by the size and accuracy of the image. This also means that when performing similar computer vision recognition, the number of points identified can be changed to improve the fit of the route afterward if more detailed data are available [105,106]. However, mathematical models may have certain drawbacks. ...

Emotion Detection Using Kinect 3D Facial Points
  • Citing Conference Paper
  • October 2016

... Wearable sensors that measure motion using accelerometers have been used not only in movement recognition as surveyed in [35], but also in recognising emotion from movement (e.g., [36]). In [37], body movements were categorised based on their general characteristics (e.g., light, strong, or sustained walking). ...

Emotion recognition based on customized smart bracelet with built-in accelerometer