mdh.se Publications
1 - 18 of 18
  • 1.
    Aksit, Kaan
    et al.
    Koç University, Turkey.
    Kade, Daniel
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Özcan, Oguzhan
    Koç University, Turkey.
    Ürey, Hakan
    Koç University, Turkey.
    Head-worn Mixed Reality Projection Display Application (2014). In: ACM International Conference Proceedings Series (ICPS), 2014. Conference paper (Refereed)
    Abstract [en]

    The main goal of this research is to develop a mixed reality (MR) application to support motion capture actors. This application allows seeing and exploring a digital environment without occluding the actor’s visual field. A prototype is built by combining a retro-reflective screen covering surrounding walls and a headband consisting of a laser scanning projector with a smartphone. Built-in sensors of a smartphone provide navigation capabilities in the digital world. The integrated system has some unique advantages, which are collectively demonstrated for the first time: (i) providing fixed field-of-view (50° in diagonal), fixed retinal images at full resolution, and distortion-free images that are independent of the screen distance and shape; (ii) presenting different perspectives to the users as they move around or tilt their heads; (iii) allowing a focus-free and calibration-free display even on non-flat surfaces using laser scanning technology; (iv) enabling multiple users to share the same screen without crosstalk due to the use of retro-reflectors; and (v) producing high-brightness pictures with a projector of only 15 lm, due to a high-gain retro-reflective screen. We demonstrated a lightweight, comfortable to wear and low-cost head-mounted projection display (HMPD) which acts as a stand-alone mobile system. Initial informal functionality tests have been performed successfully. The prototype can also be used as a 3D stereo system with the same hardware by additionally mounting polarized glasses and an active polarization rotator, while maintaining all of the advantages listed above.

  • 2.
    Dodig-Crnkovic, Gordana
    et al.
    Chalmers, Sweden.
    Kade, Daniel
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Wallmyr, Markus
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Holstein, Tobias
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Almér, Alexander
    University of Gothenburg, Sweden.
    Transdisciplinarity seen through Information, Communication, Computation, (Inter-)Action and Cognition (2017). In: INFORMATION STUDIES AND THE QUEST FOR TRANSDISCIPLINARITY / [ed] Mark Burgin and Wolfgang Hofkirchner, Sweden: World Scientific, 2017, p. 217-261. Chapter in book (Other academic)
    Abstract [en]

    This book is the second volume of a two-volume edition based on the International Society for Information Studies Summit Vienna 2015 on "The Information Society at the Crossroads. Response and Responsibility of the Sciences of Information" (see summit.is4is.org). The book gives an up-to-date, multi-aspect exposition of contemporary studies in the field of information and related areas. It presents the most recent achievements, ideas and opinions of leading researchers in this domain, reflecting their quest for advancing information science and technology. With the goal of building a better society, in which social and technological innovations help make information key to the flourishing of humanity, we dispense with the bleak view of the dark side of the information society. It is aimed at readers who conduct research into any aspect of information, information society and information technology, and who develop or implement social or technological applications. It is also for those who have an interest in participating in setting the goals for the sciences of information and the social applications of technological achievements and scientific results.

  • 3.
    Du, Jiaying
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Kade, Daniel
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems. Motion Control i Västerås AB, Västerås, Sweden.
    Gerdtman, Christer
    Motion Control i Västerås AB, Västerås, Sweden.
    Lindell, Rikard
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ozcan, Oguzhan
    Arçelik Research Center for Creative Industries, Koç University, Rumelifeneri, Sarıyer, İstanbul, Turkey.
    Lindén, Maria
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Perception of Delay in Computer Input Devices: Establishing a Baseline for Signal Processing of Motion Sensor Systems (2016). In: The 3rd EAI International Conference on IoT Technologies for HealthCare (HealthyIoT'16), Västerås, Sweden, 2016, Vol. 187, p. 107-112. Conference paper (Refereed)
    Abstract [en]

    New computer input devices in healthcare applications using small embedded sensors need firmware filters to run smoothly and to provide a better user experience. It therefore has to be investigated how much delay signal processing can introduce before users perceive a delay when using a computer input device. This paper aims to find a threshold of unperceived delay by performing user tests with 25 participants. A communication retarder was used to create delays from 0 to 100 ms between a receiving computer and three different USB-connected computer input devices. A wired mouse, a Wi-Fi mouse and a head-mounted mouse were used as input devices. The results of the user tests show that delays up to 50 ms could be tolerated and are not perceived as delay or, depending on the device used, are still perceived as acceptable.

  • 4.
    Du, Jiaying
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Kade, Daniel
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Gerdtman, Christer
    Motion Control, Västerås, Sweden.
    Özcan, Oguzhan
    Koç University, Turkey.
    Lindén, Maria
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    The effects of perceived USB-delay for sensor and embedded system development (2016). In: Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS), Vol. 2016, 2016, p. 2492-2495, article id 7591236. Conference paper (Refereed)
    Abstract [en]

    Perceiving delay in computer input devices is a problem which becomes even more prominent when the devices are used in healthcare applications and/or in small, embedded systems. Therefore, the amount of delay found acceptable when using computer input devices was investigated in this paper. A device was developed to perform a benchmark test for the perception of delay. The delay can be set from 0 to 999 milliseconds (ms) between a receiving computer and an available USB device. The USB device can be a mouse, a keyboard or some other type of USB-connected input device. Feedback from user tests performed with 36 people forms the basis for determining time limitations for the USB data processing in microprocessors and embedded systems without users noticing the delay. For this paper, tests were performed with a personal computer and a common computer mouse, testing the perception of delays between 0 and 500 ms. The results of our user tests show that perceived delays up to 150 ms were acceptable and delays larger than 300 ms were not acceptable at all.
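
    The two delay-perception abstracts above share one measurement idea: insert a controllable delay between an input device and the host, then find the largest delay users do not notice. A minimal software-only sketch of that idea follows; it is illustrative only — the papers used a dedicated hardware communication retarder on the USB link, and all names and constants below are our own, with the thresholds taken from the reported results.

```python
import time

# Acceptability bands reported in the user tests above (36 participants):
# delays up to ~150 ms were rated acceptable; ~300 ms and above were not.
ACCEPTABLE_MS = 150
UNACCEPTABLE_MS = 300

def classify_delay(delay_ms: float) -> str:
    """Map a measured input delay onto the acceptability bands reported above."""
    if delay_ms <= ACCEPTABLE_MS:
        return "acceptable"
    if delay_ms < UNACCEPTABLE_MS:
        return "borderline"
    return "not acceptable"

def forward_with_delay(event, handler, delay_ms: float):
    """Forward an input event to its handler after an artificial delay,
    mimicking in software what a hardware communication retarder does
    on the USB link between device and receiving computer."""
    time.sleep(delay_ms / 1000.0)
    return handler(event)
```

    In an actual perception study the delay would be swept across trials and participants asked whether they noticed lag, rather than classified mechanically as here.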

  • 5.
    Kade, Daniel
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ethics of Virtual Reality Applications in Computer Games Production (2015). In: Philosophies (MDPI), ISSN 2409-9287, Vol. 1, p. 73-86. Article in journal (Other academic)
    Abstract [en]

    The gaming industry is a multi-billion dollar business, constantly on the hunt for innovations. More realistic and believable looking animations are a current trend in this industry. Lately, motion capture techniques have been used to create realistic and persuasive animations. Immersive virtual environments are one of the technologies being developed to support motion capture actors with their work. The possibilities to use virtual environments as work environments have already been explored but no ethical analysis or applied ethical code has been provided for such situations.

    In this paper we investigate the ethical implications of introducing a highly immersive virtual environment within motion capture and discuss under which circumstances it is ethically justified to place an actor in such an environment. Moreover, we provide an overview of research in computer games ethics, virtual realities and acting, as well as an investigation of potential moral and ethical issues in motion capture. The discussion is intended to help establish an ethical consensus within the field of motion capture and for related situations.

  • 6.
    Kade, Daniel
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Head-mounted Projection Display to Support and Improve Motion Capture Acting (2016). Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Current and future animations seek realistic motions to create an illusion of authentic and believable animation. A technology widely used to support this process is motion capture. Motion capture actors are therefore used to enrich the movements of digital avatars with suitable and believable motions and emotions.

    Acting for motion capture, as it is performed today, is a challenging work environment for actors and directors. Short preparation times, minimalistic scenery, limited information about characters and the performance, as well as the need to memorize movements and spatial positions, require actors who are trained and able to rely heavily on their acting and imagination skills. In many cases these circumstances can lead to performances with unnatural motions, such as stiff-looking and emotionless movements, as well as less believable characters. To compensate for this, time-consuming repetitions of performances or post-processing of motion capture recordings are needed.

    To improve this, we explore the possibilities of acting support and immersion through an interactive system supporting motion capture actors during their performances. In this process, we use an approach that combines research methods from interaction design and computer science. For our research, we first identify the challenges actors face in motion capture and suggest possible concepts to support them. Thereafter, we explore initial prototypes built to support actors during their performance in a motion capture studio. The resulting insights from these initial prototypes led to the design exploration and development of a mixed reality head-mounted projection display that shows virtual scenery to the actors and provides real-time acting support. We then describe our developed mixed reality application and our findings on how hardware and software prototypes can be designed as acting support usable in a motion capture environment. A working prototype allowing us to evaluate actors' experiences and performances was built as a proof of concept.

    Additionally, we explored the possibility to use our developed mixed reality prototype in other fields and investigated its applicability for computer games and as an industrial simulator application.

    Finally, we conducted user studies with traditionally trained theatre and TV actors, experienced motion capture actors and experts, evaluating their experiences with our prototype. The results of these user studies indicate that our application makes it easier for motion capture actors to get into the demanded moods and to understand the acting scenario. Furthermore, we show a prototype that complies with the requirements of a motion capture environment, has the potential to improve motion capture acting results and supports actors with their performances.

  • 7.
    Kade, Daniel
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Towards Immersive Motion Capture Acting: Design, Exploration and Development of an Augmented System Solution (2014). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Current and future animations seek realistic motions to create a perception of authentic and human-like animation. A technology widely used for such purposes is motion capture. To create such human-like animations, motion capture actors enrich the movements of digital avatars with realistic and believable motions and emotions.

    Acting for motion capture, as it is performed today, does not provide a natural acting environment. This is mostly because motion capture actors do not see and feel the virtual environment they act for, while acting. In many cases this can result in unnatural motions such as stiff looking and emotionless movements.

    To investigate ways to solve this, we first identify the challenges actors are facing as well as concepts to support a motion capture actor. Furthermore, we discuss how the task of supporting motion capture actors was approached and which factors were discovered to provide support when designing and implementing a solution. Initial prototypes have been created to address the mentioned issues and to find suitable solutions to support and immerse motion capture actors during their performance. For this thesis, one goal was to conduct research by focusing on the question: What are the experiential qualities of immersion in an interactive system to create an immersive acting environment that supports motion capture actors?

    The developed application provides the flexibility to set up and modify digital assets and scenes quickly, with an easy-to-use interface. Furthermore, the prototype helps provide an understanding of how hardware and software prototypes can be designed and used to build an immersive motion capture environment. The built prototype allows investigating user experiences and user satisfaction through user tests, as well as their effects on motion capture acting.

  • 8.
    Kade, Daniel
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Aksit, Kaan
    Koc Univ, Dept Elect Engn, Istanbul, Turkey.
    Urey, Hakan
    Koc Univ, Dept Elect Engn, Istanbul, Turkey.
    Ozcan, Oguzhan
    Koç Univ, Arçelik Res Ctr Creat Ind, Istanbul, Turkey.
    Head-mounted mixed reality projection display for games production and entertainment (2015). In: Personal and Ubiquitous Computing, ISSN 1617-4909, E-ISSN 1617-4917, Vol. 19, no 3-4, p. 509-521. Article in journal (Refereed)
    Abstract [en]

    This research presents a mixed reality (MR) application that is designed to be usable during a motion capture shoot and supports actors with their task to perform. Through our application, we allow seeing and exploring a digital environment without occluding an actor's field of vision. A prototype was built by combining a retroreflective screen covering surrounding walls and a headband consisting of a laser scanning projector with a smartphone. Built-in sensors of a smartphone provide navigation capabilities in the digital world. The presented system was demonstrated in an earlier published paper. Here, we extend those research results with our advances and discuss the potential use of our prototype in gaming and entertainment applications. To explore this potential use case, we built a gaming application using our MR prototype and tested it with 45 participants. In these tests, we used head movements as rather unconventional game controls. According to the performed user tests and the participants' feedback, our prototype shows potential to be used for gaming applications as well. Our MR prototype could therefore become of special interest because it is lightweight, allows freedom of movement and is a low-cost, stand-alone mobile system. Moreover, the prototype also allows for 3D vision by mounting additional hardware.

  • 9.
    Kade, Daniel
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Lindell, Rikard
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ürey, H.
    Koç University, Rumelifeneri Yolu Sariyer, Istanbul, Turkey.
    Özcan, O.
    Koç University, Rumelifeneri Yolu Sariyer, Istanbul, Turkey.
    Evaluation of a mixed reality head-mounted projection display to support motion capture acting (2018). In: Lecture Notes in Computer Science, Springer Verlag, 2018, Vol. 714, p. 14-31. Conference paper (Refereed)
    Abstract [en]

    Motion capture acting is a challenging task; it requires trained and experienced actors who can rely heavily on their acting and imagination skills to deliver believable performances. This is especially the case when preparation times are short and scenery needs to be imagined, as is commonly the case for shoots in the gaming industry. To support actors in such cases, we developed a mixed reality application that allows showing digital scenery and triggering emotions while performing. In this paper we tested our hypothesis that a mixed reality head-mounted projection display can support motion capture acting, with the help of experienced motion capture actors performing short acting scenes common for game productions. We evaluated our prototype with four motion capture actors and four motion capture experts. Both groups considered our application helpful, especially as a rehearsal tool to prepare performances before capturing the motions in a studio. Actors and experts indicated that our application could reduce the time to prepare performances and supports the setup of physical acting scenery.

  • 10.
    Kade, Daniel
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Lindell, Rikard
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ürey, H.
    Koç University, Turkey.
    Özcan, O.
    Koç University, Turkey.
    Supporting motion capture acting through a mixed reality application (2017). In: Optimizing Human-Computer Interaction With Emerging Technologies, IGI Global, 2017, p. 248-273. Chapter in book (Other academic)
    Abstract [en]

    Current and future animations seek more human-like motions to create believable animations for computer games, animated movies and commercial spots. A widely used technology is motion capture, which captures actors' movements to enrich digital avatars' motions and emotions. However, a motion capture environment poses challenges to actors, such as short preparation times and the need to rely heavily on their acting and imagination skills. To support these actors, we developed a mixed reality application that allows showing digital environments while performing, letting actors see both the real and the virtual world. We tested our prototype with 6 traditionally trained theatre and TV actors. As a result, the actors indicated that our application supported them in getting into the demanded acting moods with fewer unwanted emotions. The acting scenario was also better understood, with less need for explanation than when just discussing the scenario, as is commonly done in theatre acting.

  • 11.
    Kade, Daniel
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Lindell, Rikard
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ürey, H.
    Koç University, Turkey.
    Özcan, O.
    Koç University, Turkey.
    Supporting motion capture acting through a mixed reality application (2018). Book (Other academic)
    Abstract [en]

    Current and future animations seek more human-like motions to create believable animations for computer games, animated movies and commercial spots. A widely used technology is motion capture, which captures actors' movements to enrich digital avatars' motions and emotions. However, a motion capture environment poses challenges to actors, such as short preparation times and the need to rely heavily on their acting and imagination skills. To support these actors, we developed a mixed reality application that allows showing digital environments while performing, letting actors see both the real and the virtual world. We tested our prototype with 6 traditionally trained theatre and TV actors. As a result, the actors indicated that our application supported them in getting into the demanded acting moods with fewer unwanted emotions. The acting scenario was also better understood, with less need for explanation than when just discussing the scenario, as is commonly done in theatre acting.

  • 12.
    Kade, Daniel
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Lindell, Rikard
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ürey, Hakan
    Koç University, Istanbul, Turkey.
    Ozcan, Oguzhan
    Koç University, Istanbul, Turkey.
    Acting 2.0: When Entertainment Technology Helps Actors to Perform (2015). In: ACM International Conference Proceeding Series, 2015, article id 15. Conference paper (Other academic)
    Abstract [en]

    Motion capture shoots involve a wide range of technology and entertainment production systems, such as motion capture cameras, tracking software and digital environments, to create entertainment applications. However, acting in this high-tech environment is still traditional and brings its own challenges to the actors. Good acting and imagination skills are essential for many motion capture shoots to deliver satisfying results. In our research, we are exploring how to support the actors, using a head-mounted projection display to create a mixed reality application that helps actors perform during motion capture shoots. This paper presents the latest enhancements of our head-mounted projection display application and discusses the use of this technology for motion capture acting as well as its potential use for entertainment purposes.

  • 13.
    Kade, Daniel
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Lindell, Rikard
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ürey, Hakan
    Özcan, Oguzhan
    Evaluation of a Mixed Reality Projection Display to Support Motion Capture Acting (2016). In: 13th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, 2016. Conference paper (Refereed)
    Abstract [en]

    We present an evaluation of our mixed reality prototype for motion capture acting, tested with experienced motion capture actors. Motion capture acting requires trained and experienced actors who can rely heavily on their acting and imagination skills. This is especially the case when preparation times are short and scenery needs to be imagined. To support actors in such cases, we developed a mixed reality application that allows showing digital scenery and triggering emotions while performing. In this paper we tested our prototype with experienced motion capture actors performing short acting scenes. We also evaluated the prototype’s usefulness for motion capture with four actors and four motion capture experts. The actors and experts considered our application helpful, especially as a rehearsal tool to prepare performances before motion capture shoots. They indicated that our application could reduce the time to prepare performances and support the preparation of physical acting scenarios.

  • 14.
    Kade, Daniel
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Lindell, Rikard
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ürey, Hakan
    Koç University, Rumeli Feneri Mh., 34450 Istanbul, Turkey.
    Özcan, Oguzhan
    Koç University, Rumeli Feneri Mh., 34450 Istanbul, Turkey.
    Supporting Acting Performances Through Mixed Reality and Virtual Environments (2016). In: Proceedings of SETECEC 2016, 2016. Conference paper (Refereed)
    Abstract [en]

    Motion capture actors need to deal with short preparation times and rely heavily on their acting and imagination skills. To support these actors, we developed a mixed reality application that allows showing digital acting environments while performing, and tested our prototype with 6 traditionally trained theatre and TV actors. As a result, these 6 actors indicated that our application supported them in getting into the demanded acting moods with fewer unwanted emotions. The acting scenario was also better understood, with less need for explanation than when just discussing the scenario, as is commonly done in theatre acting.

  • 15.
    Kade, Daniel
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Wallmyr, Markus
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Holstein, Tobias
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Lindell, Rikard
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ürey, Hakan
    Koç University, Istanbul, Turkey.
    Özcan, Oguzhan
    Koç University, Istanbul, Turkey.
    Low-cost mixed reality simulator for industrial vehicle environments (2016). In: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol. 9740, 2016, p. 597-608. Conference paper (Refereed)
    Abstract [en]

    High-end industrial vehicle simulators are generally expensive and aim at providing a high level of realism. Access to such simulators is often a limited resource for researchers and developers, who find themselves using a PC-based simulator instead. We challenge this approach by introducing a low-cost mixed reality simulator for industrial vehicles that allows testing new vehicle control concepts and design ideas in a rapid-prototyping manner. Our simulator prototype consists of a head-mounted projection display, a CAVE-like room covered with a retro-reflective cloth, and a rotatable chair with controls to steer an industrial vehicle. The created digital environment represents an obstacle course for an excavator; it can be controlled with a joystick or a keyboard and explored through natural head movements. User tests performed with 21 participants showed that the mixed reality simulator is perceived as more realistic and natural to use, and provides a more immersive experience, than a PC-based simulator with the same environment and controls.

  • 16.
    Kade, Daniel
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Özcan, Oguzhan
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Lindell, Rikard
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    An Immersive Motion Capture Environment (2013). Conference paper (Refereed)
    Abstract [en]

    Motion capturing technology has been used for quite a while and considerable research has been done within this area. Nevertheless, we discovered open issues within current motion capturing environments. In this paper we provide a state-of-the-art overview of the addressed research areas and show issues with current motion capturing environments. Observations, interviews and questionnaires have been used to reveal the challenges actors are currently facing in a motion capturing environment. Furthermore, the idea of creating a more immersive motion capturing environment to improve acting performances and motion capturing outcomes is introduced as a potential solution. The goal is to explain the identified open issues and the developed ideas, which serve as a basis for further research. Moreover, a methodology to address the interaction and systems design issues is proposed. A future outcome could be that motion capture actors are able to perform more naturally, especially if using a non-body-worn solution.

  • 17.
    Kade, Daniel
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Özcan, Oguzhan
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Lindell, Rikard
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Towards Stanislavski-based Principles for Motion Capture Acting in Animation and Computer Games (2013). Conference paper (Refereed)
    Abstract [en]

    Current and future animations call for realistic motions to create a perception of motion that is close to a realistic human-like performance. To create such human-like animations, motion capture actors enrich the movements of digital avatars with realistic and believable motions and emotions. Acting for motion capture, as it is performed today, implies certain challenges. In this paper we address these challenges and argue how to support motion capture actors, especially when acting for computer games. We discuss the nature of motion capture acting in view of Stanislavski’s acting principles and point out the actors’ skills and demands. We conclude that the developed principles should be ’Imagination’, ’Objectives’, ’Information & Visual References’, ’Magic if’, ’Adaptation’ and ’Relaxation’ to support motion capture actors with their work.

  • 18.
    Wallmyr, Markus
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Kade, Daniel
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Holstein, Tobias
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems. University of Applied Sciences, Darmstadt, Germany.
    360 Degree Mixed Reality Environment to Evaluate Interaction Design for Industrial Vehicles Including Head-Up and Head-Down Displays (2018). In: Lecture Notes in Computer Science, Vol. 10910, 2018, p. 377-391. Conference paper (Refereed)
    Abstract [en]

    Designing and testing new information and safety features for industrial vehicles does not need to involve the realization of high-fidelity and expensive simulators. We propose a low-cost mixed reality environment which allows for rapid development and rearrangement of a virtual and physical simulator setup for industrial vehicles. Our mixed reality simulator allows for safe testing of controls, information and safety features to support drivers of industrial vehicles. In this paper, we test the implications of showing extra digital information to excavator drivers through a virtual environment, an external head-up display as well as a head-down display. Through user tests we have seen first indications that information projected through our mixed reality system and content on a head-up display are perceived as more helpful and intuitive than head-down displays when controlling our industrial vehicle simulator. Moreover, we have seen that the fear of overlooking an obstacle or other important information is lower when using a head-up display, in comparison to the other tested visualization options.
