mdh.se Publications
Publications (10 of 30)
Wallmyr, M., Sitompul, T. A., Holstein, T. & Lindell, R. (2019). Evaluating Mixed Reality Notifications to Support Excavator Operator Awareness. In: 17th IFIP TC 13 International Conference, Paphos, Cyprus, September 2–6, 2019, Proceedings, Part I. Paper presented at 17th IFIP TC13 International Conference on Human-Computer Interaction, INTERACT 2019; Paphos; Cyprus; 2 September 2019 through 6 September 2019 (pp. 743-762). Cham: Springer, 11746
Evaluating Mixed Reality Notifications to Support Excavator Operator Awareness
2019 (English). In: 17th IFIP TC 13 International Conference, Paphos, Cyprus, September 2–6, 2019, Proceedings, Part I, Cham: Springer, 2019, Vol. 11746, p. 743-762. Conference paper, Published paper (Refereed)
Abstract [en]

Operating heavy vehicles, for instance an excavator, requires a high level of attention to the operation being done using the vehicle and awareness of the surroundings. Digital transformation in heavy vehicles aims to improve productivity and user experience, but it can also increase the operators’ mental load because of a higher demand for attention to instrumentation and controls, subsequently leading to reduced situation awareness. One way to mitigate this is to display information within the operators’ field of view, using mixed reality interfaces, which enhances information detectability through quick glances. This work explores two types of mixed reality visualizations and compares them to a traditional display setup in a simulated excavator environment. We utilized eye-tracking glasses to study users’ attention to the task, surrounding awareness, and interfaces, followed by a NASA-RTLX questionnaire to evaluate the users’ reported mental workload. The results indicate benefits for the mixed reality approaches, with lower workload ratings together with an improved detection rate for the presented information.

Place, publisher, year, edition, pages
Cham: Springer, 2019
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 11746
Keywords
Excavator, Head-up display, Heavy-vehicles, Human-vehicle interaction, Mixed reality, Situational awareness
National Category
Computer Systems
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-45583 (URN)10.1007/978-3-030-29381-9_44 (DOI)2-s2.0-85072870944 (Scopus ID)
Conference
17th IFIP TC13 International Conference on Human-Computer Interaction, INTERACT 2019; Paphos; Cyprus; 2 September 2019 through 6 September 2019
Projects
Immersive Visual Technologies for Safety-critical Applications (ImmerSAFE)ITS-EASY Post Graduate School for Embedded Software and Systems
Funder
EU, Horizon 2020, 764951
Available from: 2019-10-17. Created: 2019-10-17. Last updated: 2019-11-18. Bibliographically approved
Kade, D., Lindell, R., Ürey, H. & Özcan, O. (2018). Evaluation of a mixed reality head-mounted projection display to support motion capture acting. In: Lecture Notes in Computer Science. Paper presented at 14th International Conference on Advances in Computer Entertainment Technology, ACE 2017; London; United Kingdom; 14 December 2017 through 16 December 2017 (pp. 14-31). Springer Verlag, 10714
Evaluation of a mixed reality head-mounted projection display to support motion capture acting
2018 (English). In: Lecture Notes in Computer Science, Springer Verlag, 2018, Vol. 10714, p. 14-31. Conference paper, Published paper (Refereed)
Abstract [en]

Motion capture acting is a challenging task: it requires trained and experienced actors who can rely heavily on their acting and imagination skills to deliver believable performances. This is especially the case when preparation times are short and scenery needs to be imagined, as is commonly the case for shoots in the gaming industry. To support actors in such cases, we developed a mixed reality application that allows showing digital scenery and triggering emotions while performing. In this paper we tested our hypothesis that a mixed reality head-mounted projection display can support motion capture acting, with the help of experienced motion capture actors performing short acting scenes common for game productions. We evaluated our prototype with four motion capture actors and four motion capture experts. Both groups considered our application helpful, especially as a rehearsal tool to prepare performances before capturing the motions in a studio. Actors and experts indicated that our application could reduce the time needed to prepare performances and supports the setup of physical acting scenery.

Place, publisher, year, edition, pages
Springer Verlag, 2018
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 10714
National Category
Media and Communication Technology Computer Systems
Identifiers
urn:nbn:se:mdh:diva-38870 (URN)10.1007/978-3-319-76270-8_2 (DOI)000432607700002 ()2-s2.0-85043530133 (Scopus ID)9783319762692 (ISBN)
Conference
14th International Conference on Advances in Computer Entertainment Technology, ACE 2017; London; United Kingdom; 14 December 2017 through 16 December 2017
Available from: 2018-03-22. Created: 2018-03-22. Last updated: 2018-06-07. Bibliographically approved
Kade, D., Lindell, R., Ürey, H. & Özcan, O. (2018). Supporting motion capture acting through a mixed reality application. IGI Global
Supporting motion capture acting through a mixed reality application
2018 (English). Book (Other academic)
Abstract [en]

Current and future animations seek more human-like motions to create believable animations for computer games, animated movies and commercial spots. A widely used technology is motion capture, which captures actors’ movements to enrich digital avatars’ motions and emotions. However, a motion capture environment poses challenges to actors, such as short preparation times and the need to rely heavily on their acting and imagination skills. To support these actors, we developed a mixed reality application that shows digital environments while performing, letting actors see both the real and the virtual world. We tested our prototype with six traditionally trained theatre and TV actors. As a result, the actors indicated that our application supported them in getting into the demanded acting moods with fewer unwanted emotions. The acting scenario was also better understood, with less need for explanation than when just discussing the scenario, as is commonly done in theatre acting.

Place, publisher, year, edition, pages
IGI Global, 2018
National Category
Interaction Technologies Human Computer Interaction
Identifiers
urn:nbn:se:mdh:diva-39301 (URN)10.4018/978-1-5225-5469-1.ch084 (DOI)2-s2.0-85046610093 (Scopus ID)9781522554707 (ISBN)
Available from: 2018-05-24. Created: 2018-05-24. Last updated: 2018-05-24. Bibliographically approved
Lindell, R. & Kumlin, T. (2017). Augmented Embodied Performance. In: New Interfaces for Musical Expression NIME. Paper presented at New Interfaces for Musical Expression NIME, 15 May 2017, Copenhagen, Denmark. Copenhagen, Denmark
Augmented Embodied Performance
2017 (English). In: New Interfaces for Musical Expression NIME, Copenhagen, Denmark, 2017. Conference paper, Published paper (Refereed)
Abstract [en]

We explore the phenomenology of embodiment based on research through design and reflection on the design of artefacts for augmenting embodied performance. We present three designs for highly trained musicians; the designs rely on the musicians’ mastery acquired from years of practice. Through the knowledge of the living body, their instruments — saxophone, cello, and flute — are extensions of themselves; thus, we can explore technology with rich nuances and precision in corporeal schemas. With the help of Merleau-Ponty’s phenomenology of embodiment, we present three hypotheses for augmented embodied performance: the extended artistic room, the interactively enacted teacher, and the humanisation of technology.

Place, publisher, year, edition, pages
Copenhagen, Denmark, 2017
Keywords
Embodiment, Performance, Music, Bio-signals, Interaction Design
National Category
Interaction Technologies
Identifiers
urn:nbn:se:mdh:diva-37048 (URN)
Conference
New Interfaces for Musical Expression NIME, 15 May 2017, Copenhagen, Denmark
Available from: 2017-11-20. Created: 2017-11-20. Last updated: 2017-11-20. Bibliographically approved
Kumlin, T. & Lindell, R. (2017). Biosignal Augmented Embodied Performance. In: Proceedings of AudioMostly AM. Paper presented at AM ’17, London, United Kingdom, August 23–26, 2017. Article ID a19.
Biosignal Augmented Embodied Performance
2017 (English). In: Proceedings of AudioMostly AM, 2017, article id a19. Conference paper, Published paper (Refereed)
Abstract [en]

We explore the phenomenology of embodiment based on research through design and reflection on the design of artefacts for augmenting embodied performance. We present three designs for musicians and a dancer; the designs rely on the artists’ mastery acquired from years of practice. Through the knowledge of the living body, their instruments — cello, flute, and dance — are extensions of themselves; thus, we can explore technology with rich nuances and precision in corporeal schemas. With the help of Merleau-Ponty’s phenomenology of embodiment, we present two perspectives for augmented embodied performance: the interactively enacted teacher, and the humanisation of technology.

Keywords
Embodiment, Performance, Biosignals, Music, Dance, Interaction Design
National Category
Design Interaction Technologies
Identifiers
urn:nbn:se:mdh:diva-37049 (URN)10.1145/3123514.3123547 (DOI)2-s2.0-85038382634 (Scopus ID)978-1-4503-5373-1 (ISBN)
Conference
Proceedings of AM ’17, London, United Kingdom, August 23–26, 2017
Available from: 2017-11-20. Created: 2017-11-20. Last updated: 2017-12-28. Bibliographically approved
Kade, D., Lindell, R., Ürey, H. & Özcan, O. (2017). Supporting motion capture acting through a mixed reality application. In: Optimizing Human-Computer Interaction With Emerging Technologies (pp. 248-273). IGI Global
Supporting motion capture acting through a mixed reality application
2017 (English). In: Optimizing Human-Computer Interaction With Emerging Technologies, IGI Global, 2017, p. 248-273. Chapter in book (Other academic)
Abstract [en]

Current and future animations seek more human-like motions to create believable animations for computer games, animated movies and commercial spots. A widely used technology is motion capture, which captures actors' movements to enrich digital avatars' motions and emotions. However, a motion capture environment poses challenges to actors, such as short preparation times and the need to rely heavily on their acting and imagination skills. To support these actors, we developed a mixed reality application that shows digital environments while performing, letting actors see both the real and the virtual world. We tested our prototype with six traditionally trained theatre and TV actors. As a result, the actors indicated that our application supported them in getting into the demanded acting moods with fewer unwanted emotions. The acting scenario was also better understood, with less need for explanation than when just discussing the scenario, as is commonly done in theatre acting.

Place, publisher, year, edition, pages
IGI Global, 2017
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:mdh:diva-36298 (URN)10.4018/978-1-5225-2616-2.ch010 (DOI)2-s2.0-85027712029 (Scopus ID)9781522526179 (ISBN)1522526161 (ISBN)9781522526162 (ISBN)
Available from: 2017-08-31. Created: 2017-08-31. Last updated: 2018-01-13. Bibliographically approved
Lindell, R. (2016). Design in ecology of other artefacts. In: ACM International Conference Proceeding Series. Paper presented at Audio Mostly 2016, AM 2016, 4 October 2016 through 6 October 2016 (pp. 240-244).
Design in ecology of other artefacts
2016 (English). In: ACM International Conference Proceeding Series, 2016, p. 240-244. Conference paper, Published paper (Refereed)
Abstract [en]

Music software has traditionally been developed in ecosystems of technology that artists combine to create their tools for composition. The ability to connect artefacts has developed through the evolution of music technology. Today, the mobile platform has become an increasingly important ecosystem for music applications, combining traditional technology with platform-specific infrastructure. To make an artefact's design valid in the ecosystem, designers and developers need to support this infrastructure. This paper presents the design process of c3n loops, an iOS music app based on a zoomable user interface, in contrast with the platform's design idioms. The c3n loops design relies on its own design idioms while balancing support for platform-specific technologies such as AudioBus/Inter-app Audio, Ableton Link beat synchronisation, and MIDI.

Keywords
Experience design, Interaction design, System development, Audio acoustics, Ecology, Ecosystems, User interfaces, Design process, Mobile platform, Music applications, Music technologies, Zoomable user interfaces, Design
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:mdh:diva-34233 (URN)10.1145/2986416.2986439 (DOI)000390600700034 ()2-s2.0-84999217806 (Scopus ID)
Conference
Audio Mostly 2016, AM 2016, 4 October 2016 through 6 October 2016
Available from: 2016-12-15. Created: 2016-12-15. Last updated: 2017-01-19. Bibliographically approved
Schaeffer, J. & Lindell, R. (2016). Emotions in design. Information Design Journal, 22(1), 19-31
Emotions in design
2016 (English). In: Information Design Journal, ISSN 0142-5471, E-ISSN 1569-979X, Vol. 22, no 1, p. 19-31. Article in journal (Refereed), Published
Abstract [en]

Operators in highly automated control rooms are said to be constantly bored, and boredom is an emotional state that can have economic and environmental consequences. This article presents insights into users' emotions and their role in the design of control rooms. The study focused on the users' experience in two control rooms, where operators explored their emotions in relation to a situation, object, place, or action. Based on the results of the study and previous research, this article examines the control rooms' information design and makes recommendations on how it might be given a tangible and ambient form.

Keywords
Ambient interaction, Control room, Design, Emotion, Tangible interaction
National Category
Production Engineering, Human Work Science and Ergonomics
Identifiers
urn:nbn:se:mdh:diva-32535 (URN)10.1075/idj.22.1.03sch (DOI)2-s2.0-84979082796 (Scopus ID)
Available from: 2016-08-18. Created: 2016-08-18. Last updated: 2020-02-04. Bibliographically approved
Kade, D., Lindell, R., Ürey, H. & Özcan, O. (2016). Evaluation of a Mixed Reality Projection Display to Support Motion Capture Acting. In: 13th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services. Paper presented at MobiQuitous’16, 28 Nov – 1 Dec 2016, Hiroshima, Japan.
Evaluation of a Mixed Reality Projection Display to Support Motion Capture Acting
2016 (English). In: 13th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, 2016. Conference paper, Published paper (Refereed)
Abstract [en]

We present an evaluation of our mixed reality prototype for motion capture acting, tested with experienced motion capture actors. Motion capture acting requires trained and experienced actors who can rely heavily on their acting and imagination skills. This is especially the case when preparation times are short and scenery needs to be imagined. To support actors in such cases, we developed a mixed reality application that allows showing digital scenery and triggering emotions while performing. In this paper we tested our prototype with experienced motion capture actors performing short acting scenes. We also evaluated the prototype’s usefulness for motion capture with four actors and four motion capture experts. The actors and experts considered our application helpful, especially as a rehearsal tool to prepare performances before motion capture shoots. They indicated that our application could reduce the time to prepare performances and support the preparation of physical acting scenarios.

National Category
Human Computer Interaction Other Computer and Information Science
Identifiers
urn:nbn:se:mdh:diva-31724 (URN)
Conference
MobiQuitous’16, 28 Nov – 1 Dec 2016, Hiroshima, Japan
Available from: 2016-06-06. Created: 2016-06-06. Last updated: 2018-12-14. Bibliographically approved
Andersson, J. & Lindell, R. (2016). It Could Just as Well Have Been in Greek: Experiences from Introducing Code as a Design Material to Exhibition Design Students. In: TEI '16: Proceedings of the Tenth International Conference on Tangible, Embedded, and Embodied Interaction. Paper presented at The ACM International Conference on Tangible, Embedded and Embodied Interaction (TEI), 14 Feb 2016, Eindhoven, Netherlands (pp. 126-132). Eindhoven, Netherlands: ACM
It Could Just as Well Have Been in Greek: Experiences from Introducing Code as a Design Material to Exhibition Design Students
2016 (English). In: TEI '16: Proceedings of the Tenth International Conference on Tangible, Embedded, and Embodied Interaction, Eindhoven, Netherlands: ACM, 2016, p. 126-132. Conference paper, Published paper (Refereed)
Abstract [en]

This paper discusses the experience and learning from introducing programming in a museum exhibition design course. Thirty-seven information design students from Sweden, with no previous experience in programming, participated in the course in 2014 and 2015. The students' task was to create interactive exhibition stations at a county museum in five weeks. We introduced Arduino and Processing programming in the course to enlarge the information design students' repertoire and to find ways to develop the interactive aspects of the exhibition medium. We aim to identify and discuss challenges and strengths when introducing code as a design material in design education. The education of future exhibition designers is an important matter relevant to the TEI community.

Place, publisher, year, edition, pages
Eindhoven, Netherlands: ACM, 2016
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:mdh:diva-33738 (URN)10.1145/2839462.2839475 (DOI)000390588700018 ()2-s2.0-84964844205 (Scopus ID)978-1-4503-3582-9 (ISBN)
Conference
The ACM International Conference on Tangible, Embedded and Embodied Interaction (TEI), 14 Feb 2016, Eindhoven, Netherlands
Projects
Normkritiskt Utställningslaboratorium / Normcritical Exhibition Lab
Available from: 2016-11-21. Created: 2016-11-21. Last updated: 2017-01-19. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0003-3163-6039