Head-worn Mixed Reality Projection Display Application
Koç University, Turkey.
Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems (Ubiquitous Computing Group). ORCID iD: 0000-0001-7722-5310
Koç University, Turkey. ORCID iD: 0000-0001-8828-1684
Koç University, Turkey.
2014 (English). In: ACM International Conference Proceedings Series (ICPS), 2014. Conference paper, Published paper (Refereed)
Abstract [en]

The main goal of this research is to develop a mixed reality (MR) application to support motion capture actors. This application allows seeing and exploring a digital environment without occluding the actor’s visual field. A prototype is built by combining a retro-reflective screen covering the surrounding walls with a headband consisting of a laser scanning projector and a smartphone. The built-in sensors of the smartphone provide navigation capabilities in the digital world. The integrated system has some unique advantages, which are collectively demonstrated for the first time: (i) providing a fixed field-of-view (50° diagonal), fixed retinal images at full resolution, and distortion-free images that are independent of the screen distance and shape; (ii) presenting different perspectives to the users as they move around or tilt their heads; (iii) allowing a focus-free and calibration-free display even on non-flat surfaces using laser scanning technology; (iv) enabling multiple users to share the same screen without crosstalk due to the use of retro-reflectors; and (v) producing high-brightness images with a projector of only 15 lm, thanks to a high-gain retro-reflective screen. We demonstrated a lightweight, comfortable-to-wear and low-cost head-mounted projection display (HMPD) which acts as a stand-alone mobile system. Initial informal functionality tests have been performed successfully. The prototype can also be used as a 3D stereo system with the same hardware by additionally mounting polarized glasses and an active polarization rotator, while maintaining all of the advantages listed above.

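The abstract describes how the smartphone's built-in sensors drive the perspective shown on the retro-reflective screen as the actor moves or tilts their head. As a rough illustration only (the record does not detail the authors' actual implementation, and all names and parameters below are hypothetical), the following sketch shows how head orientation angles read from such sensors could be turned into a view matrix for rendering the digital environment.

    # Hypothetical sketch, not the authors' implementation: map head orientation
    # (yaw/pitch/roll from smartphone sensors) and head position to a 4x4 view matrix.
    import numpy as np

    def rotation_from_euler(yaw, pitch, roll):
        """Build a 3x3 rotation matrix from yaw, pitch and roll angles in radians."""
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cr, sr = np.cos(roll), np.sin(roll)
        r_yaw = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])      # rotation about the vertical axis
        r_pitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])    # nodding the head up/down
        r_roll = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])     # tilting the head sideways
        return r_yaw @ r_pitch @ r_roll

    def view_matrix(yaw, pitch, roll, head_position):
        """4x4 view matrix: rotate the world opposite to the head orientation,
        then translate it opposite to the head position."""
        rot = rotation_from_euler(yaw, pitch, roll).T   # inverse of a rotation is its transpose
        view = np.eye(4)
        view[:3, :3] = rot
        view[:3, 3] = -rot @ np.asarray(head_position, dtype=float)
        return view

    # Example: the actor turns the head 30 degrees to the left while standing at the origin.
    v = view_matrix(np.radians(30), 0.0, 0.0, [0.0, 0.0, 0.0])

In practice the rendered frame would be regenerated from this matrix every time new sensor readings arrive, so the projected scene follows the actor's head movements.
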
Place, publisher, year, edition, pages
2014.
Keywords [en]
head-mounted projection display; mixed reality; motion capture; laser projector; immersive environments
National Category
Other Computer and Information Science
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:mdh:diva-25886
DOI: 10.1145/2663806.2663826
Scopus ID: 2-s2.0-84938328251
OAI: oai:DiVA.org:mdh-25886
DiVA, id: diva2:744807
Conference
11th Advances in Computer Entertainment Technology Conference, ACE 2014; Funchal, Madeira; Portugal; 11 November 2014 through 14 November 2014; Code 113147
Available from: 2014-09-08 Created: 2014-09-08 Last updated: 2018-01-11 Bibliographically approved
In thesis
1. Towards Immersive Motion Capture Acting: Design, Exploration and Development of an Augmented System Solution
2014 (English)Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

Current and future animations seek realistic motions to create a perception of authentic and human-like animation. A technology widely used for such purposes is motion capture. To create such human-like animations, motion capture actors enrich the movements of digital avatars with realistic and believable motions and emotions.

Acting for motion capture, as it is performed today, does not provide a natural acting environment. This is mostly because motion capture actors do not see or feel the virtual environment they act in while performing. In many cases this can result in unnatural motions, such as stiff-looking and emotionless movements.

To investigate ways to solve this, we first identify the challenges actors are facing as well as concepts to support a motion capture actor. Furthermore, we discuss how the task of supporting motion capture actors was approached and which factors were found to provide support when designing and implementing a solution. Initial prototypes have been created to address the mentioned issues and to find suitable solutions to support and immerse motion capture actors during their performance. For this thesis, one goal was to conduct research focusing on the question: What are the experiential qualities of immersion in an interactive system to create an immersive acting environment that supports motion capture actors?

The developed application provides the flexibility to set up and modify digital assets and scenes quickly, with an easy-to-use interface. Furthermore, the prototype helps provide an understanding of how hardware and software prototypes can be designed and used to build an immersive motion capture environment. The built prototype allows investigating user experiences, conducting user tests, and assessing user satisfaction and their effects on motion capture acting.

Place, publisher, year, edition, pages
Västerås: Mälardalen University, 2014
Series
Mälardalen University Press Licentiate Theses, ISSN 1651-9256 ; 181
National Category
Other Computer and Information Science
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-25888 (URN)
978-91-7485-161-8 (ISBN)
Presentation
2014-10-31, Kappa, Högskoleplan 1, Mälardalens högskola, Västerås, 13:15 (English)
Opponent
Supervisors
Available from: 2014-09-08 Created: 2014-09-08 Last updated: 2018-01-11 Bibliographically approved
2. Head-mounted Projection Display to Support and Improve Motion Capture Acting
2016 (English)Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Current and future animations seek realistic motions to create an illusion of authentic and believable animation. A technology widely used to support this process is motion capture. Therefore, motion capture actors are used to enrich the movements of digital avatars with suitable and believable motions and emotions.

Acting for motion capture, as it is performed today, presents a challenging work environment for actors and directors. Short preparation times, minimalistic scenery, limited information about the characters and the performance, as well as the need to memorize movements and spatial positions, require actors who are trained and able to rely heavily on their acting and imagination skills. In many cases these circumstances can lead to performances with unnatural motions, such as stiff-looking and emotionless movements, as well as less believable characters. To compensate for this, time-consuming repetitions of performances or post-processing of the motion capture recordings are needed.

To improve this, we explore the possibilities of acting support and immersion through an interactive system that supports motion capture actors during their performances. In this process, we use an approach that combines research methods from interaction design and computer science. For our research, we first identify the challenges actors are facing in motion capture and suggest possible concepts to support the actors. Thereafter, we explore initial prototypes built to support actors during their performance in a motion capture studio. The resulting insights from these initial prototypes led to the design exploration and development of a mixed reality head-mounted projection display that shows virtual scenery to the actors and provides real-time acting support. We then describe our developed mixed reality application and our findings on how hardware and software prototypes can be designed as acting support usable in a motion capture environment. A working prototype that allows evaluating actors' experiences and performances was built as a proof of concept.

Additionally, we explored the possibility of using our mixed reality prototype in other fields and investigated its applicability to computer games and as an industrial simulator application.

Finally, we conducted user studies with traditionally trained theatre and TV actors, experienced motion capture actors and experts, evaluating their experiences with our prototype. The results of these user studies indicate that our application makes it easier for motion capture actors to get into the demanded moods and to understand the acting scenario. Furthermore, we present a prototype that complies with the requirements of a motion capture environment, has the potential to improve motion capture acting results, and supports actors in their performances.

Place, publisher, year, edition, pages
Västerås: Mälardalen University, 2016
Series
Mälardalen University Press Dissertations, ISSN 1651-4238 ; 203
National Category
Human Computer Interaction; Other Computer and Information Science
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-31725 (URN)
978-91-7485-275-2 (ISBN)
Public defence
2016-08-30, Lambda, Mälardalens högskola, Västerås, 09:15 (English)
Opponent
Supervisors
Available from: 2016-06-10 Created: 2016-06-06 Last updated: 2018-01-10 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus
http://www.academia.edu/9483272/Head-worn_Mixed_Reality_Projection_Display_Application

Authority records

Kade, Daniel; Özcan, Oguzhan
