Volume 20, Issue 1
  • ISSN 1572-0373
  • E-ISSN 1572-0381

Abstract


Tactile emotion recognition provides valuable information for human-computer interaction and has strong application prospects in areas such as smart homes and medical care. This raises a question: how can a robot be made to recognize emotions quickly and accurately? In this work, we develop a lifelong learning algorithm based on efficient dictionary learning to tackle tactile emotion recognition across different tasks. To verify the efficiency of the proposed method, we applied it to two datasets: the Corpus of Social Touch (CoST) and our own dataset, which we collected with a 12×12 array sensor. The results show that the proposed lifelong learning algorithm achieves satisfactory performance.
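The abstract does not detail the algorithm, but the general shape of dictionary-based lifelong learning (in the spirit of ELLA) can be sketched as follows: each task's linear model is encoded as a sparse combination of atoms from a dictionary shared across tasks, and the dictionary is refined as new tasks arrive. Everything below (the class name, the ISTA sparse coder, the single-step dictionary update, and all hyperparameters) is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

def soft_threshold(x, lam):
    """Elementwise soft-thresholding operator used by ISTA."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_code(theta, L, lam, n_iter=200):
    """Find a sparse code s with theta ~= L @ s via ISTA (proximal gradient)."""
    s = np.zeros(L.shape[1])
    step = 1.0 / (np.linalg.norm(L, 2) ** 2 + 1e-8)  # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = L.T @ (L @ s - theta)
        s = soft_threshold(s - step * grad, lam * step)
    return s

class LifelongDictionaryLearner:
    """Toy dictionary-based lifelong learner: each task's linear model
    theta_t is represented as L @ s_t over a shared dictionary L, so
    knowledge accumulated on earlier tasks is reused on later ones."""

    def __init__(self, d, k, lam=0.01, eta=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.L = 0.1 * rng.standard_normal((d, k))  # shared basis: d features, k atoms
        self.lam, self.eta = lam, eta
        self.codes = {}  # per-task sparse codes s_t

    def learn_task(self, task_id, X, y):
        d = X.shape[1]
        # 1. fit an ordinary single-task ridge model for the new task
        theta = np.linalg.solve(X.T @ X + 1e-3 * np.eye(d), X.T @ y)
        # 2. encode it sparsely over the shared dictionary
        s = sparse_code(theta, self.L, self.lam)
        self.codes[task_id] = s
        # 3. one gradient step pulling the dictionary toward this task's model
        self.L += self.eta * np.outer(theta - self.L @ s, s)
        return self.L @ s  # the reconstructed task model used for prediction

    def predict(self, task_id, X):
        return X @ (self.L @ self.codes[task_id])
```

In a tactile setting, each 12×12 pressure frame would be flattened into a feature vector and each gesture or emotion category treated as a task; the shared dictionary is what lets the learner transfer structure from earlier tasks to new ones.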

https://doi.org/10.1075/is.18041.wei
2019-07-15
2019-08-21
  • Article Type: Research Article
Keyword(s): emotion recognition, lifelong learning, tactile interaction
