Publications (9 of 9)
Akalin, N., Kiselev, A., Kristoffersson, A. & Loutfi, A. (2018). Enhancing Social Human-Robot Interaction with Deep Reinforcement Learning. In: Proc. FAIM/ISCA Workshop on Artificial Intelligence for Multimodal Human Robot Interaction, 2018. Paper presented at FAIM/ISCA Workshop on Artificial Intelligence for Multimodal Human Robot Interaction (AI-MHRI), Stockholm, Sweden, 14-15 July 2018 (pp. 48-50). MHRI
Enhancing Social Human-Robot Interaction with Deep Reinforcement Learning
2018 (English). In: Proc. FAIM/ISCA Workshop on Artificial Intelligence for Multimodal Human Robot Interaction, 2018, MHRI, 2018, p. 48-50. Conference paper, Published paper (Refereed)
Abstract [en]

This research aims to develop an autonomous social robot for elderly individuals. The robot will learn from the interaction and change its behaviors in order to enhance the interaction and improve the user experience. For this purpose, we aim to use Deep Reinforcement Learning. The robot will observe the user's verbal and nonverbal social cues using its camera and microphone; the reward will be the user's positive valence and engagement.
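The learning loop the abstract describes can be illustrated with a minimal tabular stand-in for the deep variant. All state names, action names, and signal values below are hypothetical, not taken from the paper:

```python
# Minimal, illustrative sketch: the state is a discretized estimate of the
# user's social cues, the actions are robot behaviors, and the reward
# combines the user's positive valence with their engagement.

ACTIONS = ["wave", "nod", "speak", "wait"]

def reward(valence: float, engagement: float) -> float:
    # Reward = positive valence plus engagement, as the abstract states.
    return max(valence, 0.0) + engagement

def q_update(q, state, action, r, next_state, alpha=0.1, gamma=0.9):
    # One tabular Q-learning step; the paper targets a deep approximator,
    # but the update rule has the same shape.
    best_next = max(q.get((next_state, a), 0.0) for a in ACTIONS)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + alpha * (r + gamma * best_next - old)

q_table = {}
q_update(q_table, "disengaged", "wave", reward(0.8, 0.5), "engaged")
```

In a full system the Q-table would be replaced by a neural network fed with camera and microphone features; the sketch only fixes the reward structure and the update direction.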

Place, publisher, year, edition, pages
MHRI, 2018
National Category
Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-43376 (URN), 10.21437/AI-MHRI.2018-12 (DOI)
Conference
FAIM/ISCA Workshop on Artificial Intelligence for Multimodal Human Robot Interaction (AI-MHRI), Stockholm, Sweden 14-15 July, 2018
Projects
SOCRATES
Available from: 2019-05-09. Created: 2019-05-09. Last updated: 2019-05-09. Bibliographically approved
Akalin, N., Kiselev, A., Kristoffersson, A. & Loutfi, A. (2018). The Relevance of Social Cues in Assistive Training with a Social Robot. In: Ge, S. S., Cabibihan, J.-J., Salichs, M. A., Broadbent, E., He, H., Wagner, A., Castro-González, Á. (Eds.), 10th International Conference on Social Robotics, ICSR 2018, Proceedings. Paper presented at 10th International Conference on Social Robotics (ICSR 2018), Qingdao, China, November 28-30, 2018 (pp. 462-471). Springer
The Relevance of Social Cues in Assistive Training with a Social Robot
2018 (English). In: 10th International Conference on Social Robotics, ICSR 2018, Proceedings / [ed] Ge, S. S., Cabibihan, J.-J., Salichs, M. A., Broadbent, E., He, H., Wagner, A., Castro-González, Á., Springer, 2018, p. 462-471. Conference paper, Published paper (Refereed)
Abstract [en]

This paper examines whether social cues, such as facial expressions, can be used to adapt and tailor robot-assisted training in order to maximize performance and comfort. Specifically, this paper serves as a basis for determining whether key facial signals, including emotions and facial actions, are common among participants during a physical and cognitive training scenario. In the experiment, participants performed basic arm exercises with a social robot as a guide. We extracted facial features from video recordings of participants and applied a recursive feature elimination algorithm to select a subset of discriminating facial features. These features are correlated with the performance of the user and the difficulty level of the exercises. The long-term aim of this work, building upon the work presented here, is to develop an algorithm that can eventually be used in robot-assisted training to allow a robot to tailor a training program based on the physical capabilities as well as the social cues of the users.
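The recursive feature elimination step the abstract mentions can be sketched as follows. This is a simplified, generic RFE, not the paper's implementation: the scoring function (absolute Pearson correlation with the target) and the feature names are assumptions for illustration.

```python
# Sketch of recursive feature elimination (RFE): repeatedly score the
# remaining features and drop the weakest until `keep` features survive.

def pearson(xs, ys):
    # Plain Pearson correlation coefficient between two equal-length series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def rfe(features, target, keep):
    """features: dict of name -> value series; returns surviving names."""
    names = list(features)
    while len(names) > keep:
        # Score every remaining feature, then eliminate the weakest one.
        scores = {n: abs(pearson(features[n], target)) for n in names}
        names.remove(min(names, key=scores.get))
    return set(names)
```

In the paper the eliminated candidates are facial features and the target relates to user performance and exercise difficulty; any real use would substitute a model-based importance score for the correlation used here.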

Place, publisher, year, edition, pages
Springer, 2018
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 11357
Keywords
Social cues, Facial signals, Robot-assisted training
National Category
Computer Systems; Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-43377 (URN), 10.1007/978-3-030-05204-1_45 (DOI), 2-s2.0-85058342671 (Scopus ID), 978-3-030-05203-4 (ISBN), 978-3-030-05204-1 (ISBN)
Conference
10th International Conference on Social Robotics (ICSR 2018), Qingdao, China, November 28-30, 2018
Projects
SOCRATES
Funder
EU, Horizon 2020, 721619
Available from: 2018-12-19. Created: 2019-05-09. Bibliographically approved
Akalin, N., Kiselev, A., Kristoffersson, A. & Loutfi, A. (2017). An Evaluation Tool of the Effect of Robots in Eldercare on the Sense of Safety and Security. In: Kheddar, A.; Yoshida, E.; Ge, S. S.; Suzuki, K.; Cabibihan, J.-J.; Eyssel, F.; He, H. (Eds.), Social Robotics: 9th International Conference, ICSR 2017, Tsukuba, Japan, November 22-24, 2017, Proceedings. Paper presented at 9th International Conference on Social Robotics (ICSR 2017), Tsukuba, Japan, November 22-24, 2017 (pp. 628-637). Springer International Publishing
An Evaluation Tool of the Effect of Robots in Eldercare on the Sense of Safety and Security
2017 (English). In: Social Robotics: 9th International Conference, ICSR 2017, Tsukuba, Japan, November 22-24, 2017, Proceedings / [ed] Kheddar, A.; Yoshida, E.; Ge, S. S.; Suzuki, K.; Cabibihan, J.-J.; Eyssel, F.; He, H., Springer International Publishing, 2017, p. 628-637. Conference paper, Published paper (Refereed)
Abstract [en]

The aim of the study presented in this paper is to develop a quantitative evaluation tool of the sense of safety and security for robots in eldercare. By investigating the literature on the measurement of safety and security in human-robot interaction, we propose new evaluation tools. These tools are semantic differential scale questionnaires. In the experimental validation, we used the Pepper robot, programmed to exhibit social behaviors, and constructed four experimental conditions varying the degree of the robot's non-verbal behavior, from no gestures at all to full head and hand movements. The experimental results suggest that both questionnaires (for the sense of safety and the sense of security) have good internal consistency.
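The internal-consistency check mentioned at the end of the abstract is commonly done with Cronbach's alpha; the paper does not specify its statistic here, so treat this as an illustrative assumption. Rows are respondents, columns are questionnaire items:

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/total variance),
# where k is the number of items. Values near 1 indicate that the items
# measure the same underlying construct.

def variance(xs):
    # Sample variance (n - 1 denominator).
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    k = len(rows[0])  # number of scale items
    item_vars = [variance([r[i] for r in rows]) for i in range(k)]
    total_var = variance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

A rule of thumb is that alpha above roughly 0.7 counts as "good internal consistency" of the kind the abstract reports.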

Place, publisher, year, edition, pages
Springer International Publishing, 2017
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 10652
Keywords
Sense of safety, Sense of security, Eldercare, Video-based evaluation, Quantitative evaluation tool
National Category
Computer Systems; Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-43375 (URN), 10.1007/978-3-319-70022-9_62 (DOI), 000449941100062 (), 2-s2.0-85035814295 (Scopus ID), 978-3-319-70022-9 (ISBN), 978-3-319-70021-2 (ISBN)
Conference
9th International Conference on Social Robotics (ICSR 2017), Tsukuba, Japan, November 22-24, 2017
Projects
SOCRATES
Funder
EU, Horizon 2020, 721619
Available from: 2017-11-22. Created: 2019-05-09. Bibliographically approved
Orlandini, A., Kristoffersson, A., Almquist, L., Björkman, P., Cesta, A., Cortellessa, G., . . . Coradeschi, S. (2016). ExCITE Project: A Review of Forty-Two Months of Robotic Telepresence Technology. Presence - Teleoperators and Virtual Environments, 25(3), 204-221
ExCITE Project: A Review of Forty-Two Months of Robotic Telepresence Technology
2016 (English). In: Presence - Teleoperators and Virtual Environments, ISSN 1054-7460, E-ISSN 1531-3263, Vol. 25, no 3, p. 204-221. Article in journal (Refereed), Published
Abstract [en]

This article reports on the EU project ExCITE, with specific focus on the technical development of the telepresence platform over a period of 42 months. The aim of the project was to assess the robustness and validity of the mobile robotic telepresence (MRP) system Giraff as a means to support elderly people and to foster their social interaction and participation. Embracing the idea of user-centered product refinement, the robot was tested over long periods of time in real homes. As such, the system development was driven by a strong involvement of elderly people and their caregivers, but also by technical challenges associated with deploying the robot in real-world contexts. The result of the 42-month-long evaluation is a system suitable for use in homes rather than a generic system suitable for, for example, office environments.

Place, publisher, year, edition, pages
Cambridge, USA: MIT Press, 2016
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-43406 (URN), 10.1162/PRES_a_00262 (DOI), 000395117300002 (), 2-s2.0-85011585315 (Scopus ID)
Projects
ExCITE
Note
Funding Agency: EU AAL-2009-2-125
Available from: 2019-05-09. Created: 2019-05-09. Last updated: 2019-05-09. Bibliographically approved
Kiselev, A., Scherlund, M., Kristoffersson, A., Efremova, N. & Loutfi, A. (2015). Auditory immersion with stereo sound in a mobile robotic telepresence system. In: 10th ACM/IEEE International Conference on Human-Robot Interaction, 2015. Paper presented at 10th ACM/IEEE International Conference on Human-Robot Interaction, Portland, USA, March 2-5, 2015. Association for Computing Machinery (ACM)
Auditory immersion with stereo sound in a mobile robotic telepresence system
2015 (English). In: 10th ACM/IEEE International Conference on Human-Robot Interaction, 2015, Association for Computing Machinery (ACM), 2015. Conference paper, Published paper (Refereed)
Abstract [en]

Auditory immersion plays a significant role in generating a good feeling of presence for users driving a telepresence robot. In this paper, one of the key characteristics of auditory immersion, sound source localization (SSL), is studied from the perspective of those who operate telepresence robots from remote locations. A prototype capable of delivering a soundscape to the user through the Interaural Time Difference (ITD) and Interaural Level Difference (ILD), using the ORTF stereo recording technique, was developed. The prototype was evaluated in an experiment, and the results suggest that the developed method is sufficient for sound source localization tasks.
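The ITD cue the prototype relies on can be sketched as a cross-correlation problem: the lag that maximizes the correlation between the left and right channels is the arrival-time difference in samples (the ILD is the complementary level cue and is omitted here). The helper name and the signals are illustrative, not from the paper:

```python
# Estimate the interaural time difference (ITD) between two channels as
# the integer lag maximizing their cross-correlation.

def itd_samples(left, right, max_lag):
    """Return the lag (in samples) of `right` relative to `left`."""
    def xcorr(lag):
        # Correlate left[i] with right[i + lag] over the overlapping region.
        if lag >= 0:
            pairs = zip(left, right[lag:])
        else:
            pairs = zip(left[-lag:], right)
        return sum(a * b for a, b in pairs)
    return max(range(-max_lag, max_lag + 1), key=xcorr)
```

Dividing the returned lag by the sampling rate gives the ITD in seconds, which the listener's auditory system maps to a direction; an ORTF microphone pair encodes both this time difference and a level difference between its two capsules.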

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2015
Keywords
Human-Robot Interaction; Mobile Robotic Telepresence; Teleoperation; Sound Source Localization; Auditory Immersion; User Interfaces; ORTF Stereo
National Category
Human Computer Interaction; Computer Sciences; Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-43386 (URN)
Conference
10th ACM/IEEE International Conference on Human-Robot Interaction, Portland, USA, March 2-5, 2015
Available from: 2019-05-09. Created: 2019-05-09. Last updated: 2019-05-09. Bibliographically approved
Kiselev, A., Kristoffersson, A. & Loutfi, A. (2015). Combining Semi-autonomous Navigation with Manned Behaviour in a Cooperative Driving System for Mobile Robotic Telepresence. In: COMPUTER VISION - ECCV 2014 WORKSHOPS, PT IV. Paper presented at 13th European Conference on Computer Vision (ECCV), Zurich, Switzerland, September 6-12, 2014 (pp. 17-28). Berlin: Springer Berlin/Heidelberg
Combining Semi-autonomous Navigation with Manned Behaviour in a Cooperative Driving System for Mobile Robotic Telepresence
2015 (English). In: COMPUTER VISION - ECCV 2014 WORKSHOPS, PT IV, Berlin: Springer Berlin/Heidelberg, 2015, p. 17-28. Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents an image-based cooperative driving system for a telepresence robot, which allows safe operation in indoor environments and is meant to minimize the burden on novice users operating the robot. The paper focuses on one emerging type of telepresence robot, namely mobile remote presence systems for social interaction. Such systems bring new opportunities for applications in healthcare and elderly care by allowing caregivers to communicate with patients and the elderly from remote locations. However, using such systems can be a difficult task, particularly for caregivers without proper training. The paper presents a first implementation of a vision-based cooperative driving enhancement to a telepresence robot. A preliminary evaluation in a laboratory environment is presented.

Place, publisher, year, edition, pages
Berlin: Springer Berlin/Heidelberg, 2015
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 8928
Keywords
Human-robot interaction, Mobile robotic telepresence, Teleoperation, User interfaces
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-43382 (URN), 10.1007/978-3-319-16220-1_2 (DOI), 000361842800002 (), 978-3-319-16220-1 (ISBN), 978-3-319-16219-5 (ISBN)
Conference
13th European Conference on Computer Vision (ECCV), Zurich, Switzerland, September 6-12, 2014
Available from: 2019-05-09. Created: 2019-05-09. Last updated: 2019-05-09. Bibliographically approved
Kiselev, A., Kristoffersson, A., Melendez, F., Galindo, C., Loutfi, A., Gonzalez-Jimenez, J. & Coradeschi, S. (2015). Evaluation of using semi-autonomy features in mobile robotic telepresence systems. In: Proceedings of the 2015 7th IEEE International Conference on Cybernetics and Intelligent Systems, CIS 2015 and Robotics, Automation and Mechatronics, RAM 2015. Paper presented at 7th IEEE International Conference on Cybernetics and Intelligent Systems (CIS) and Robotics, Automation and Mechatronics (RAM), Angkor Wat, Cambodia, July 15-17, 2015 (pp. 147-152). New York, USA: IEEE conference proceedings
Evaluation of using semi-autonomy features in mobile robotic telepresence systems
2015 (English). In: Proceedings of the 2015 7th IEEE International Conference on Cybernetics and Intelligent Systems, CIS 2015 and Robotics, Automation and Mechatronics, RAM 2015, New York, USA: IEEE conference proceedings, 2015, p. 147-152. Conference paper, Published paper (Refereed)
Abstract [en]

Mobile robotic telepresence systems used for social interaction scenarios require that users steer robots in a remote environment. As a consequence, a heavy workload can be placed on users if they are unfamiliar with robotic telepresence units. One way to lessen this workload is to automate certain operations performed during a telepresence session in order to assist remote drivers in navigating the robot in new environments. Such operations include autonomous robot localization, navigation to certain points in the home, and automatic docking of the robot to the charging station. In this paper, we describe the implementation of such autonomous features along with a user evaluation study. The evaluation scenario focuses on novice users' first experience with the system. Importantly, the scenario taken in this study assumed that participants had as little prior information about the system as possible. Four different use cases were identified from the user behaviour analysis.

Place, publisher, year, edition, pages
New York, USA: IEEE conference proceedings, 2015
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-43384 (URN), 10.1109/ICCIS.2015.7274564 (DOI), 000380472300027 (), 2-s2.0-84960945361 (Scopus ID), 978-1-4673-7338-8 (ISBN)
Conference
7th IEEE International Conference on Cybernetics and Intelligent Systems (CIS) And Robotics, Automation and Mechatronics (RAM), Angkor Wat, Cambodia, July 15-17, 2015
Available from: 2016-09-20. Created: 2019-05-09. Bibliographically approved
Kiselev, A., Mosiello, G., Kristoffersson, A. & Loutfi, A. (2014). Semi-Autonomous Cooperative Driving for Mobile Robotic Telepresence Systems. In: Proceedings of the 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2014). Paper presented at 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2014) (pp. 104-104). IEEE conference proceedings
Semi-Autonomous Cooperative Driving for Mobile Robotic Telepresence Systems
2014 (English). In: Proceedings of the 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2014), IEEE conference proceedings, 2014, p. 104-104. Conference paper, Published paper (Refereed)
Abstract [en]

Mobile robotic telepresence (MRP) has been introduced to allow communication from remote locations. Modern MRP systems offer rich capabilities for human-human interaction. However, simply driving a telepresence robot can become a burden, especially for novice users, leaving no room for interaction at all. In this video, we introduce a project which aims to incorporate advanced robotic algorithms into manned telepresence robots in a natural way, allowing human-robot cooperation for safe driving. It also shows a first implementation of cooperative driving based on extracting a safe drivable area in real time from the image stream received from the robot.

Place, publisher, year, edition, pages
IEEE conference proceedings, 2014
Keywords
Human-Robot Interaction, Mobile Robotic Telepresence, Teleoperation, User Interfaces
National Category
Computer Sciences; Human Computer Interaction; Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science; Human-Computer Interaction; Computerized Image Analysis
Identifiers
urn:nbn:se:mdh:diva-43385 (URN), 10.1145/2559636.2559640 (DOI), 978-1-4503-2658-2 (ISBN)
Conference
9th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2014)
Available from: 2019-05-09. Created: 2019-05-09. Last updated: 2019-05-09. Bibliographically approved
Kiselev, A., Kristoffersson, A. & Loutfi, A. (2014). The Effect of Field of View on Social Interaction in Mobile Robotic Telepresence Systems. In: Proceedings of the 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2014). Paper presented at 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2014) (pp. 214-215). IEEE conference proceedings
The Effect of Field of View on Social Interaction in Mobile Robotic Telepresence Systems
2014 (English). In: Proceedings of the 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2014), IEEE conference proceedings, 2014, p. 214-215. Conference paper, Published paper (Refereed)
Abstract [en]

One goal of mobile robotic telepresence for social interaction is to design robotic units that are easy to operate for novice users and promote good interaction between people. This paper presents an exploratory study on the effect of camera orientation and field of view on the interaction between a remote and local user. Our findings suggest that limiting the width of the field of view can lead to better interaction quality as it encourages remote users to orient the robot towards local users.

Place, publisher, year, edition, pages
IEEE conference proceedings, 2014
Keywords
Human-Robot Interaction, Mobile Robotic Telepresence, Teleoperation, User Interfaces
National Category
Computer Sciences; Human Computer Interaction; Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science; Human-Computer Interaction
Identifiers
urn:nbn:se:mdh:diva-43383 (URN), 10.1145/2559636.2559799 (DOI), 978-1-4503-2658-2 (ISBN)
Conference
9th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2014)
Available from: 2019-05-09. Created: 2019-05-09. Last updated: 2019-05-09. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0002-0305-3728
