RESEARCH DIRECTIONS IN
CROSS REALITY INTERFACES
Mark Billinghurst
mark.billinghurst@unisa.edu.au
Summer School on Cross Reality
July 2nd 2024
Computer Interfaces
• Separation between real and digital worlds
• WIMP (Windows, Icons, Menus, Pointer) metaphor
Rekimoto, J. and Nagao, K. 1995. The world through the computer: computer augmented interaction with real world environments.
Making Interfaces Invisible
Internet of Things (IoT)
• Embed computing and sensing in the real world
• Smart objects, sensors, etc.
Virtual Reality (VR)
• Users immersed in a computer-generated environment
• HMD, gloves, 3D graphics, body tracking

Augmented Reality (AR)
• Virtual Images blended with the real world
• See-through HMD, handheld display, viewpoint tracking, etc.
From Reality to Virtual Reality
• A continuum from the Real World to the Virtual World: Internet of Things → Augmented Reality → Virtual Reality
Milgram’s Mixed Reality (MR) Continuum
• Spans from the Real World (Internet of Things) through Augmented Reality to the Virtual World (Virtual Reality)
• Mixed Reality: "...anywhere between the extrema of the virtuality continuum."
P. Milgram and A. F. Kishino (1994). A Taxonomy of Mixed Reality Visual Displays.

The MagicBook (2001)
• A transitional interface spanning Reality, Augmented Reality (AR), Augmented Virtuality (AV), and Virtuality
Billinghurst, M., Kato, H., & Poupyrev, I. (2001). The MagicBook: a transitional AR interface. Computers & Graphics, 25(5), 745-753.
MagicBook Demo
Features
• Seamless transition from Reality to Virtuality
• Reliance on the real world decreases as the virtual increases
• Supports egocentric and exocentric views
• User can pick the appropriate view
• Independent views
• Privacy, role division, scalability
• Collaboration on multiple levels:
• Physical object, AR object, immersive virtual space
• Egocentric + exocentric collaboration
• Multiple multi-scale users
Apple Vision Pro (2024)
• Transitioning from AR to VR
• Spatial Computing – an interface seamlessly blending with the real world

Cross Reality (CR) Systems
• Systems that facilitate:
• a smooth transition between systems using different degrees of virtuality, or
• collaboration between users using different systems with different degrees of virtuality
Simeone, Adalberto L., Mohamed Khamis, Augusto Esteves, Florian Daiber, Matjaž Kljun, Klen Čopič Pucihar,
Poika Isokoski, and Jan Gugenheimer. "International workshop on cross-reality (xr) interaction." In Companion
Proceedings of the 2020 Conference on Interactive Surfaces and Spaces, pp. 111-114. 2020.
Publications in Cross Reality
Increasing publications since 2019
Key CR Technologies
• Augmentation technologies that layer information onto our perception of the physical environment
• Simulation technologies that model reality
• Intimate technologies focused inwardly, on the identity and actions of the individual or object
• External technologies focused outwardly, towards the world at large
Taxonomy
• Four Key Components
• Virtual Worlds
• Augmented Reality
• Mirror Worlds
• Lifelogging

Mirror Worlds
• Simulations of external space/content
• Capturing and sharing surroundings
• Photorealistic content
• Digital twins
• Examples: Matterport, Deep Mirror, Google Street View, Soul Machines
Lifelogging
• Measuring user’s internal state
• Capturing physiological cues
• Recording everyday life
• Augmenting humans
• Examples: Apple, Fitbit, Shimmer, OpenBCI
Mixed Reality: Expanded Research Opportunities

Recommended for you

Augmented reality the evolution of human computer interaction
Augmented reality the evolution of human computer interactionAugmented reality the evolution of human computer interaction
Augmented reality the evolution of human computer interaction

Augmented Reality: An Evolution in Human-Computer Interaction This document discusses augmented reality (AR), which overlays digital information on the physical world. AR works by sensing the environment using cameras and sensors, augmenting it by identifying objects and context, and presenting additional information through displays. Common AR applications include mobile apps that provide information about objects by taking photos. The document also discusses how AR is used in education and training by bringing digital content into the real world through books, games and simulations. As AR systems expand, managing complex interactions and information across different tiers becomes important for scalability, interoperability and flexibility.

Calongne vr simulations games ctu doctoral july 2017
Calongne vr simulations games ctu doctoral july 2017Calongne vr simulations games ctu doctoral july 2017
Calongne vr simulations games ctu doctoral july 2017

Two virtual reality, virtual worlds, games and simulation research workshops at the Colorado Technical University Doctoral Symposium July 12-13, 2017 hosted by Dr. Cynthia Calongne, aka Lyr Lobo in the Metaverse.

virtual realityburning manvirtual worlds
The Metaverse: Are We There Yet?
The  Metaverse:    Are   We  There  Yet?The  Metaverse:    Are   We  There  Yet?
The Metaverse: Are We There Yet?

Keynote talk by Mark Billinghurst at the 9th XR-Metaverse conference in Busan, South Korea. The talk was given on May 20th, 2024. It talks about progress on achieving the Metaverse vision laid out in Neil Stephenson's book, Snowcrash.

metaverseaugmented realityvirtual reality
What is the Metaverse Research Landscape?
• Survey of Scopus papers (to June 2023)
• ~1900 papers found with Metaverse in abstract/keywords
• Further analysis
• Look for publications in AR, VR, Mirror Worlds (MW), Lifelogging (LL)
• Look for research across boundaries
• Application analysis
• Most popular application domains
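The quadrant analysis above can be sketched in code. This is a hypothetical illustration of the method only: the keyword lists, the `classify`/`tally` helpers, and the example abstracts are all assumptions for illustration, not the actual Scopus query or dataset.

```python
# Illustrative sketch: classify paper abstracts into the four Metaverse
# quadrants by keyword, then tally single- vs. cross-quadrant papers.
QUADRANTS = {
    "VR": ["virtual reality", "virtual world"],
    "AR": ["augmented reality"],
    "MW": ["mirror world", "digital twin"],
    "LL": ["lifelogging", "life logging"],
}

def classify(abstract: str) -> frozenset:
    """Return the set of quadrants whose keywords appear in the abstract."""
    text = abstract.lower()
    return frozenset(q for q, kws in QUADRANTS.items()
                     if any(kw in text for kw in kws))

def tally(abstracts):
    """Percentage of classified papers per quadrant combination."""
    counts = {}
    for a in abstracts:
        key = classify(a)
        if key:  # ignore papers matching no quadrant
            counts[key] = counts.get(key, 0) + 1
    total = sum(counts.values())
    return {tuple(sorted(k)): round(100 * v / total, 1)
            for k, v in counts.items()}

# Toy example data (not from the survey):
papers = [
    "A virtual reality training study",
    "Augmented reality overlays in a virtual reality headset",
    "Digital twin mirror worlds for cities",
]
print(tally(papers))
```

A real analysis would of course use the full Scopus export and more careful keyword matching, but the shape of the computation is the same: classify each paper, then count how many land in one quadrant versus crossing boundaries.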
Single Topic Research
[Quadrant chart: 36%, 10%, 12%, 2% of papers fall in a single quadrant]
Crossing Boundaries
[Quadrant chart: 16%, 11%, 2%, 2%, 1%, 0% of papers span two quadrants]
Crossing Corners
[Quadrant chart: 2%, 2%, 1%, 0% of papers span diagonal quadrant pairs]

Entire Quadrant: 2%
Lessons Learned
• Research Strengths
• Most Metaverse research is VR related (36%)
• Strong connections between AR/VR (16%)
• Strong connections between MW/VR (11%)
• Research Opportunities
• Opportunities across boundaries – 1% of papers in AR/LL, 0% in MW/LL
• Opportunities to combine > 2 quadrants – 0% in AR/MW/LL
• Opportunities for research combining all elements
• Broadening the application space – industry, finance, etc.
Possible Research Directions
• Lifelogging to VR
• Bringing real world actions into VR, VR to experience lifelogging data
• AR to Lifelogging
• Using AR to view lifelogging data in everyday life, Sharing physiological data
• Mirror Worlds to VR
• VR copy of the real world, Mirroring real world collaboration in VR
• AR to Mirror Worlds
• Visualizing the past in place, Asymmetric collaboration
• And more..
Example: Sharing Communication Cues
• Measuring non-verbal cues
• Gaze, face expression, heart rate
• Sharing in Augmented Reality
• Collaborative AR experiences

Empathy Glasses
• Combines eye-tracking, display, and face expression sensing
• Implicit cues – eye gaze, face expression
• Hardware: Pupil Labs (eye tracking) + Epson BT-200 (display) + AffectiveWear (face expression)
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the
34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
Remote Collaboration
• Eye gaze pointer and remote pointing
• Face expression display
• Implicit cues for remote collaboration
Research Directions
•Enhancing Communication Cues
•Avatar Representation
•AI Enhanced communication
•Scene Capture and Sharing
•Asynchronous CR systems
•Prototyping Tools
•Empathic Computing

ENHANCING COMMUNICATION CUES
Remote Communication
• Using AR/VR to share communication cues
• Gaze, gesture, head pose, body position
• Sharing same environment
• Virtual copy of real world
• Collaboration between AR/VR
• VR user appears in AR user’s space
Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues
in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
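One way awareness cues like these might travel between the AR and VR users is as small serialized messages. The `AwarenessCues` format below is an illustrative assumption, not the actual protocol of the system cited above.

```python
# Hypothetical sketch of sharing awareness cues (gaze, gesture, head pose)
# over a network as JSON messages. Field names are assumptions.
import json
from dataclasses import dataclass, asdict

@dataclass
class AwarenessCues:
    user_id: str
    head_pose: tuple   # position (x, y, z) + orientation quaternion (x, y, z, w)
    gaze_dir: tuple    # normalized gaze ray direction in world space
    gesture: str       # e.g. "point", "open_hand"

def encode(cues: AwarenessCues) -> str:
    """Serialize cues for transmission."""
    return json.dumps(asdict(cues))

def decode(payload: str) -> AwarenessCues:
    """Rebuild cues on the receiving side (JSON turns tuples into lists)."""
    d = json.loads(payload)
    d["head_pose"] = tuple(d["head_pose"])
    d["gaze_dir"] = tuple(d["gaze_dir"])
    return AwarenessCues(**d)
```

On each frame, the sender would encode its tracked cues and the receiver would decode them to drive the remote user's avatar or cue visualizations.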
Sharing Virtual Communication Cues (2019)
• AR/VR displays
• Gesture input (Leap Motion)
• Room scale tracking
• Conditions
• Baseline, FoV, Head-gaze, Eye-gaze

Results
• Predictions
• Eye/head pointing better than no cues
• Eye/head pointing could reduce the need for hand pointing
• Results
• No difference in task completion time
• Head-gaze/eye-gaze produced a greater mutual gaze rate
• Head-gaze gave greater ease of use than the baseline
• All cues provided higher co-presence than the baseline
• Pointing gestures were reduced in the cue conditions
• But
• No difference between head-gaze and eye-gaze
Enhancing Gaze Cues
How sharing gaze behavioural cues can improve remote collaboration in a Mixed Reality environment.
➔ Developed eyemR-Vis, a 360 panoramic Mixed Reality remote collaboration system
➔ Showed gaze behavioural cues as bi-directional spatial virtual visualisations shared
between a local host (AR) and a remote collaborator (VR).
Jing, A., May, K. W., Naeem, M., Lee, G., & Billinghurst, M. (2021). eyemR-Vis: Using Bi-Directional Gaze Behavioural Cues to Improve Mixed
Reality Remote Collaboration. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-7).
System Design
➔ 360 Panoramic Camera + Mixed Reality View
➔ Combination of HoloLens 2 + Vive Pro Eye
➔ 4 gaze behavioural visualisations:
browse, focus, mutual, fixated circle
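As a rough illustration of how a system might choose among the four gaze visualisations, here is a hedged sketch. The `gaze_visualisation` function and its dwell-time thresholds are assumptions for illustration, not the eyemR-Vis implementation.

```python
# Hypothetical sketch: pick a gaze visualisation from dwell time and whether
# both collaborators are looking at the same target. Thresholds are assumed.
def gaze_visualisation(dwell_ms: float, partner_dwell_ms: float,
                       same_target: bool,
                       focus_ms: float = 300, fixate_ms: float = 1000) -> str:
    if same_target and dwell_ms >= focus_ms and partner_dwell_ms >= focus_ms:
        return "mutual"          # both collaborators attend to the same object
    if dwell_ms >= fixate_ms:
        return "fixated circle"  # sustained fixation on one spot
    if dwell_ms >= focus_ms:
        return "focus"           # short deliberate look
    return "browse"              # gaze sweeping across the scene
```

Each gaze sample stream would be reduced to dwell times per target, and the returned state would select which visualisation to render for the remote collaborator.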

System Design
[Figure: the four gaze visualisations – Browse, Focus, Mutual, Fixated Circle]
Example: Multi-Scale Collaboration
• Changing the user’s virtual body scale
Piumsomboon, T., Lee, G. A., Irlitti, A., Ens, B., Thomas, B. H., & Billinghurst, M. (2019, May). On the shoulder of the giant: A
multi-scale mixed reality collaboration with 360 video sharing and tangible interaction. In Proceedings of the 2019 CHI
conference on human factors in computing systems (pp. 1-17).
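The body-scale change behind multi-scale collaboration can be sketched minimally. The `VirtualBody` class and the choice to scale eye height and inter-pupillary distance together are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch, assuming scale is applied uniformly to the user's virtual
# body: scaling eye height and IPD together makes the shared scene read as
# miniature (giant user) or enormous (miniature user).
from dataclasses import dataclass

@dataclass
class VirtualBody:
    eye_height_m: float = 1.7
    ipd_m: float = 0.064
    scale: float = 1.0

    def set_scale(self, factor: float) -> None:
        """Rescale the body relative to its current scale.
        A factor of 10 makes the user a 'giant' looking down on the scene."""
        ratio = factor / self.scale
        self.eye_height_m *= ratio
        self.ipd_m *= ratio
        self.scale = factor
```

A rendering engine would feed the scaled eye height and IPD into the camera rig each frame, so stereo depth cues stay consistent with the chosen body scale.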
Sharing: Separating Cues from Body
• What happens when you can’t see your colleague/agent?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April). Mini-me: An adaptive
avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1-13).
[Figure panels: Collaborating; Collaborator out of View]

Mini-Me Communication Cues in MR
• When the user loses sight of their collaborator, a Mini-Me avatar appears
• Miniature avatar in the real world
• Mini-Me points to shared objects and shows communication cues
• Redirected gaze, gestures
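A minimal sketch of the trigger logic, assuming a simple field-of-view cone test: the `in_fov` helper and the 60° horizontal FOV are hypothetical, not the method used in the Mini-Me paper.

```python
# Hypothetical sketch: show the miniature avatar only while the full-size
# collaborator falls outside the AR user's field of view.
import math

def in_fov(forward, to_partner, fov_deg: float = 60.0) -> bool:
    """True if the direction to the partner lies within the FOV cone
    around the user's forward vector (both given as 3D vectors)."""
    dot = sum(f * t for f, t in zip(forward, to_partner))
    norm = (math.sqrt(sum(f * f for f in forward))
            * math.sqrt(sum(t * t for t in to_partner)))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= fov_deg / 2

def show_mini_me(forward, to_partner) -> bool:
    """Mini-Me appears exactly when the collaborator is out of view."""
    return not in_fov(forward, to_partner)
```

In practice a head-mounted display's tracked head pose would supply `forward`, and `to_partner` would be the vector from the user's head to the collaborator's avatar.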
User Study (16 participants)
• Collaboration between user in AR, expert in VR
• Hololens, HTC Vive
• Two tasks:
• (1) asymmetric, (2) symmetric
• Key findings
• Mini-Me significantly improved performance time (task1) (20% faster)
• Mini-Me significantly improved Social Presence scores
• 63% (task 2) – 75% (task 1) of users preferred Mini-Me
“I feel like I am talking to my partner”
AVATAR REPRESENTATION

Avatar Representation for Social Presence
• What should avatars look like for social situations?
• Cartoon vs. realistic?
• Partial or full body?
• Impact on Social Presence?
Yoon, B., Kim, H. I., Lee, G. A., Billinghurst, M., & Woo, W. (2019, March). The effect of
avatar appearance on social presence in an augmented reality remote collaboration. In
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 547-556). IEEE.
Avatar Representations
• Cartoon vs. Realistic, Partial Body vs. Whole Body
• Realistic Head & Hands (RHH), Realistic Upper Body (RUB), Realistic Whole Body (RWB)
• Cartoon Head & Hands (CHH), Cartoon Upper Body (CUB), Cartoon Whole Body (CWB)
Experiment
• Within-subjects design (24 subjects)
• 6 conditions: RHH, RUB, RWB, CHH, CUB, CWB
• AR/VR interface
• Subject in AR interface, actor in VR
• Experiment measures
• Social Presence
• Networked Mind Measure of Social Presence survey
• Bailenson’s Social Presence survey
• Post Experiment Interview
• Tasks
• Study 1: Crossword puzzle (Face to Face discussion)
• Study 2: Furniture placement (virtual object placement)
AR user
VR user
Hypotheses
H1. Body Part Visibility will affect the user’s Social Presence in AR.
H2. The Whole-Body virtual avatars will have the highest Social Presence among the three levels of visibility.
H3. Head & Hands virtual avatars will have the lowest Social Presence among the three levels of visibility.
H4. The Character Style will affect the user’s Social Presence.
H5. Realistic avatars will have a higher Social Presence than Cartoon Style avatars in an AR remote collaboration.

Results
• Aggregated Presence Scores
• Rated from 1 (strongly disagree) to 7 (strongly agree)
User Comments
• ‘Whole Body’ Avatar Expression to Users
• “Presence was high with full body parts, because I could notice joints’ movement, behaviour, and reaction.”
• “I didn’t get the avatar’s intention of the movement, because it had only head and hands.”
• ‘Upper Body’ vs. ‘Whole Body’ Avatar
• “I preferred the one with whole body, but it didn’t really matter because I didn’t look at the legs much.”
• “I noticed the head and hands model immediately, but I didn’t feel the difference whether the avatar had a lower body or not.”
• ‘Realistic’ vs ‘Cartoon’ style Avatars
• “The character seemed more like a game than furniture placement in real. I felt that the realistic whole body was collaborating with me more.”
Key Lessons Learned
• Avatar Body Part visibility should be considered first when designing for AR remote collaboration, since it significantly affects Social Presence
• Body Part Visibility
• Whole Body & Upper Body: Whole body is preferred, but upper body is okay in some cases
• Head & Hands: Should be avoided
• Character Style
• No difference in Social Presence between Realistic and Cartoon avatars
• However, the majority of participants had a positive response towards the Realistic avatar
• Cartoon character for fun, Realistic avatar for professional meetings
Avatar Representation in Training
• Pilot study with recorded avatar
• Motorcycle engine assembly
• Avatar types
• (A1) Annotation: Computer-generated lines drawn in 3D space.
• (A2) Hand Gesture: Real hand gestures captured using stereoscopic cameras.
• (A3) Avatar: Virtual avatar reconstructed using inverse kinematics.
• (A4) Volumetric Playback: The movements of an expert are captured using three Kinect cameras and played back as a virtual avatar via a see-through headset.

Avatar Representation
• Remote pointer vs. Realistic hands
Representing Remote Users
• Virtual Avatar vs. Volumetric Avatar
Experiment Design (30 participants)
Performing motorbike assembly task under guidance
- Easy, Medium, Hard task
Hypotheses
- H1. Volumetric playback would give a better sense of social presence in a remote training system.
- H2. Volumetric playback would enable faster completion of tasks in a remote training system.
Measures
• NMM Social Presence Questionnaire, NASA TLX, SUS
Results
• Hand gesture and Annotation cues were significantly faster than the Avatar
• Volumetric playback induced the highest sense of co-presence
• Users preferred Volumetric or Annotation interface
Performance Time
Average Ranking

Results
Volumetric instruction cues increased co-presence and system usability while reducing mental workload and frustration.
Mental Load (NASA TLX)
System Usability Scale
User Feedback
• Annotations were easy to understand (faster performance)
• “Annotation is very clear and easy to spot in a 3D environment.”
• Volumetric playback creates a high degree of social presence (working with a person)
• “Seeing a real person demonstrate the task feels like being next to a person.”
• Recommendations
• Use Volumetric Playback to improve Social Presence and system usability
• A full-bodied avatar representation in a remote training system is not recommended unless it is well animated
• Simple annotation can significantly improve performance if social presence is not important
AI ENHANCED COMMUNICATION
Enhancing Emotion
• Using physiological and contextual cues to enhance emotion representation
• Show user’s real emotion, make it easier to understand user emotion, etc..
[Diagram: physiological cues (arousal/valence, positive/negative) and context cues from the real user mapped onto the avatar]
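The arousal/valence mapping above can be pictured as a quadrant lookup on the circumplex model of affect. This is a minimal sketch only; the 0.2 neutral dead-zone and the expression labels are illustrative assumptions, not the system's actual mapping.

```python
def avatar_expression(valence: float, arousal: float) -> str:
    """Map valence/arousal in [-1, 1] to a coarse avatar expression label.

    Quadrants of the circumplex model of affect; the neutral dead-zone
    and the label set are illustrative choices only.
    """
    if abs(valence) < 0.2 and abs(arousal) < 0.2:
        return "neutral"
    if valence >= 0:
        return "excited" if arousal >= 0 else "content"
    return "angry" if arousal >= 0 else "sad"
```

In a real system the valence/arousal estimates would come from the physiological and contextual cues listed above (e.g. heart rate, facial expression), smoothed over time before driving the avatar.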

System Design
Early Results
[Pipeline: Face Tracking → Positive Affect → Avatar Outcome]
Conversational agent
Intelligent Virtual Agents (IVAs)
Embodied in 2D Screen Embodied in 3D space

Photorealistic Characters
• Synthesia
• AI + ML to create videos
• Speech + image synthesis
• Supports >60 languages
• Personalized characters
https://www.youtube.com/watch?v=vifHh4WjEFE
Empathic Mixed Reality Agents
Intelligent Digital Humans
• Soul Machines
• AI digital brain
• Expressive digital humans
• Autonomous animation
• Able to see and hear
• Learn from users

Recommended for you

INDIAN AIR FORCE FIGHTER PLANES LIST.pdf
INDIAN AIR FORCE FIGHTER PLANES LIST.pdfINDIAN AIR FORCE FIGHTER PLANES LIST.pdf
INDIAN AIR FORCE FIGHTER PLANES LIST.pdf

These fighter aircraft have uses outside of traditional combat situations. They are essential in defending India's territorial integrity, averting dangers, and delivering aid to those in need during natural calamities. Additionally, the IAF improves its interoperability and fortifies international military alliances by working together and conducting joint exercises with other air forces.

air force fighter planebiggest submarinezambia port
find out more about the role of autonomous vehicles in facing global challenges
find out more about the role of autonomous vehicles in facing global challengesfind out more about the role of autonomous vehicles in facing global challenges
find out more about the role of autonomous vehicles in facing global challenges

accommodate the strengths, weaknesses, threats and opportunities of autonomous vehicles

automotive self-driving car technology
Towards Empathic Social Agents
• Goal: Using agents to create empathy between people
• Combine
• Scene capture
• Shared tele-presence
• Trust/emotion recognition
• Enhanced communication cues
• Separate cues from representation
• Facilitating brain synchronization
Trends..
• Increasing human touch over time: Voice menus → Chatbots → Photorealistic characters → Digital Humans → Empathic Agents
SCENE CAPTURE AND SHARING
Example: Connecting between Spaces
• Augmented Reality
• Bringing remote people into your real space
• Virtual Reality
• Bringing elements of the real world into VR
• AR/VR for sharing communication cues
• Sharing non-verbal communication cues


Shared Sphere – 360 Video Sharing
Live 360 Video shared between Host User and Guest User
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a
live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
3D Live Scene Capture
• Use cluster of RGBD sensors
• Fuse together 3D point cloud
Live 3D Scene Capture
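The capture pipeline above can be sketched in two steps: back-project each RGBD sensor's depth image with its pinhole intrinsics, then transform every camera's points into a shared world frame using pre-calibrated extrinsics. A minimal numpy sketch, assuming metric depth and known calibration (all parameter values here are illustrative):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into camera-space 3D points."""
    h, w = depth.shape
    v, u = np.mgrid[0:h, 0:w]                  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx                      # pinhole camera model
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                  # drop invalid zero-depth pixels

def fuse_clouds(clouds, extrinsics):
    """Concatenate per-camera clouds in one shared world frame.

    extrinsics[i] is a 4x4 camera-to-world matrix from offline calibration.
    """
    fused = []
    for pts, T in zip(clouds, extrinsics):
        homo = np.hstack([pts, np.ones((len(pts), 1))])
        fused.append((homo @ T.T)[:, :3])
    return np.vstack(fused)
```

A live system would additionally filter overlapping points and stream the fused cloud each frame, but the core fusion is just this back-projection plus rigid transform.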

Scene Capture and Sharing
Scene Reconstruction Remote Expert Local Worker
AR View Remote Expert View
3D Mixed Reality Remote Collaboration (2022)
Tian, H., Lee, G. A., Bai, H., & Billinghurst, M. (2023). Using Virtual Replicas to Improve Mixed
Reality Remote Collaboration. IEEE Transactions on Visualization and Computer Graphics.
View Sharing Evolution
• Increased immersion
• Improved scene understanding
• Better collaboration
2D → 360 → 3D

Switching between 360 and 3D views
• 360 video
• High quality visuals
• Poor spatial representation
• 3D reconstruction
• Poor visual quality
• High quality spatial representation
Swapping between 360 and 3D views
• Have pre-captured 3D model of real space
• Enable remote user to swap between live 360 video or 3D view
• Represent remote user as avatar
Teo, T., F. Hayati, A., A. Lee, G., Billinghurst, M., & Adcock, M. (2019). A technique for mixed reality remote collaboration using
360 panoramas in 3d reconstructed scenes. In 25th ACM Symposium on Virtual Reality Software and Technology (pp. 1-11).
Research Directions for Cross Reality Interfaces
SharedNeRF (2024)
• Combines 3D point cloud with NeRF rendering
• Uses the head-mounted camera view to create NeRF images, and the point cloud for fast-moving objects
Sakashita, M., et al. (2024, May). SharedNeRF: Leveraging Photorealistic and View-dependent Rendering for Real-time
and Remote Collaboration. In Proceedings of the CHI Conference on Human Factors in Computing Systems (pp. 1-14).

https://www.youtube.com/watch?v=h3InhMfKA58
ASYNCHRONOUS COMMUNICATION
Users can move along the Reality-Virtuality Continuum
Time Travellers - Motivation
• In a factory: an expert worker moves between the store room and the workbench
Cho, H., Yuan, B., Hart, J. D., Chang, Z., Cao, J., Chang, E., & Billinghurst, M. (2023, October). Time Travellers: An Asynchronous Cross Reality Collaborative System. In 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 848-853). IEEE.
Design Goals
• AR/VR: Recording user’s actions, MR Playback
• MR Headset (Magic Leap 2) with real object tracking

Design Goals
• AR/VR: WIM (World In Miniature), VR view manipulation
Design Goals
• AR/VR: Visual Annotation (AR mode and VR mode)
Design Goals
• AR/VR: Seamless Transition (AR → VR, VR → AR)
• Avatar and Virtual Replica
“Time Travellers” Overview
• Step 1: Recording an expert’s standard process
• 1st User wears an MR Headset (Magic Leap 2), with real object tracking
• Recording Data: spatial data of the 3D work space, avatar, and objects
• Step 2: Reviewing the recorded process through the hybrid cross-reality playback system
• 2nd User’s view: timeline manipulation, visual annotation, avatar interaction
• AR mode and VR mode: a cross reality asynchronous collaborative system

Pilot User Evaluation - User Study Design
• Six participants performed a task of reviewing and annotating recorded videos in both the AR and AR+VR (Cross Reality) conditions.
• Participants left a marker where an action begins, and an arrow where it ends.
[ AR condition ] [ AR+VR condition ]
Pilot User Evaluation - Measurements
• Objective Measurements
• Task Completion Time, Moving Trajectory, Timeline Manipulation Time
• Subjective Measurements
• NASA TLX
• System Usability Scale (SUS)
Pilot User Evaluation - Results and Lessons Learned
• Objective Measurements
[Charts: Task Completion Time (sec), Timeline Manipulation Time (sec), and Moving Trajectories (m), comparing AR vs. AR+VR]
Pilot User Evaluation - Results and Lessons Learned
• Subjective Measurements
• The Cross-Reality mode was rated as more useful for overall understanding of the collaboration process (faster task completion with a lower task load).

PROTOTYPING TOOLS
ShapesXR – www.shapesxr.com
https://www.youtube.com/watch?v=J7tS2GpwDUo
Challenges with Prototyping CR Systems
• Cross platform support
• Need for programming skills
• Building collaborative systems
• Need to build multiple different interfaces
• Connecting multiple devices/components
• Difficult to prototype hardware/display systems
Example: Secondsight
A prototyping platform for rapidly testing cross-device interfaces
• Enables an AR HMD to "extend" the screen of a smartphone
Key Features
• Can simulate a range of HMD Field of View
• Enables World-fixed or Device-fixed content placement
• Supports touch screen input, free-hand gestures, head-pose selection
Reichherzer, C., Fraser, J., Rompapas, D. C., & Billinghurst, M. (2021, May). Secondsight: A framework for cross-device augmented
reality interfaces. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-6).
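The world-fixed vs. device-fixed distinction above comes down to which pose the content offset is evaluated against. A minimal sketch of that idea (the pose helper, function names, and the 15 cm offset are made up for illustration; this is not Secondsight's actual API):

```python
import numpy as np

def pose(tx, ty, tz):
    """4x4 phone-to-world transform (translation only; rotation omitted)."""
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

def content_position(mode, current_pose, anchor_pose,
                     offset_local=(0.0, 0.15, 0.0)):
    """World position of an AR panel offset 15 cm above the phone screen.

    device-fixed: evaluate the offset against the phone's *current* pose
    every frame, so the panel follows the device.
    world-fixed: evaluate it against the pose captured at placement time
    (anchor_pose), so the panel stays put in the room.
    """
    T = current_pose if mode == "device-fixed" else anchor_pose
    return (T @ np.append(offset_local, 1.0))[:3]
```

Moving the phone one metre to the right carries a device-fixed panel with it, while a world-fixed panel remains at its anchor position.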

Content Placement
Input
Map application
Implementation
Hardware
• Meta 2 AR Glasses (82° FOV)
• Samsung Galaxy S8 phone
• OptiTrack motion capture system
Software
• Unity game engine
• Mirror networking library

Google - The Cross-device Toolkit - XDTK
• Open-source toolkit to enable
communication between Android
devices and Unity
• Handles
• Device discovery/communication
• Sensor data streaming
• ARCore pose information
• https://github.com/google/xdtk
Gonzalez, E. J., Patel, K., Ahuja, K., & Gonzalez-Franco, M. (2024, March). XDTK: A Cross-Device Toolkit for Input & Interaction
in XR. In 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW) (pp. 467-470). IEEE.
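XDTK handles the discovery and streaming between Android devices and Unity; its actual wire format is defined by the toolkit itself. As a stand-in, the receiving side can be pictured as a UDP listener parsing per-sensor messages. The JSON schema below (`device`, `sensor`, `values` fields) and the port number are illustrative assumptions, not XDTK's real protocol:

```python
import json
import socket

def parse_sensor_message(data: bytes) -> dict:
    """Decode one sensor datagram. The schema (device/sensor/values)
    is an illustrative stand-in, not XDTK's actual message format."""
    msg = json.loads(data.decode("utf-8"))
    return {"device": msg["device"],
            "sensor": msg["sensor"],
            "values": [float(v) for v in msg["values"]]}

def receive_sensor_stream(port=5555, max_msgs=10):
    """Blocking UDP listener that yields parsed sensor messages."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("0.0.0.0", port))
        for _ in range(max_msgs):
            data, _addr = sock.recvfrom(4096)
            yield parse_sensor_message(data)
```

On the Unity side the equivalent role is played by XDTK's manager components, which surface the streamed ARCore pose and sensor data as C# events.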
Prototyping tool classes, by skill & resources required vs. level of fidelity in AR/VR:
• Class 1: InVision, Sketch, XD, ...
• Class 2: DART, Proto.io, Montage, ...
• Class 3: ARToolKit, Vuforia/Lens/Spark AR Studio, ...
• Class 4: SketchUp, Blender, ...
• Class 5: A-Frame, Unity, Unreal Engine, ...
• On-device: ProtoAR, 360proto, XRDirector, ...
• Immersive Authoring: Tilt Brush, Blocks, Maquette, Pronto, ...
• ? — Research Needed
XR Prototyping Tools
On-device/Cross-device/Immersive Authoring
https://www.youtube.com/watch?v=CXdgTMKpP_o
Leiva, G., Nguyen, C., Kazi, R. H., & Asente, P. (2020, April). Pronto: Rapid augmented reality video prototyping using sketches
and enaction. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-13).
VRception Toolkit
• First multi-user and multi-environment rapid-prototyping toolkit for non-experts
• Designed for rapid prototyping of CR systems
• Supports prototyping in VR or in Unity3D
Gruenefeld, U., et al. (2022, April). Vrception: Rapid prototyping of cross-reality systems in virtual reality. In Proceedings of the
2022 CHI Conference on Human Factors in Computing Systems (pp. 1-15).

https://www.youtube.com/watch?v=EWzP9_FAtL8
EMPATHIC COMPUTING
Modern Communication Technology Trends
1. Improved Content Capture
• Move from sharing faces to sharing places
2. Increased Network Bandwidth
• Sharing natural communication cues
3. Implicit Understanding
• Recognizing behaviour and emotion
[Diagram: overlapping circles of Natural Collaboration, Implicit Understanding, and Experience Capture]

Empathic Computing sits at the intersection of Natural Collaboration, Implicit Understanding, and Experience Capture.
“Empathy is Seeing with the Eyes of another, Listening with the Ears of another, and Feeling with the Heart of another.” - Alfred Adler
Empathic Computing Research Focus
Can we develop systems that allow
us to share what we are seeing,
hearing and feeling with others?
Key Elements of Empathic Systems
•Understanding
• Emotion Recognition, physiological sensors
•Experiencing
• Content/Environment capture, VR
•Sharing
• Communication cues, AR

Example: NeuralDrum
• Using brain synchronicity to increase connection
• Collaborative VR drumming experience
• Measure brain activity using 3 EEG electrodes
• Use PLV to calculate synchronization
• More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain
Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
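The phase locking value (PLV) used above can be computed from two EEG channels by band-limiting each signal, extracting the instantaneous phase via the analytic signal, and averaging the unit phase-difference vectors. A numpy-only sketch, assuming an 8-12 Hz band of interest (NeuralDrum's actual band and filtering are not specified here):

```python
import numpy as np

def band_phase(x, fs, lo, hi):
    """Instantaneous phase of the band-limited analytic signal.

    Builds the analytic signal directly in the frequency domain
    (a brick-wall bandpass plus Hilbert transform in one step).
    """
    n = len(x)
    X = np.fft.fft(x)
    freqs = np.fft.fftfreq(n, d=1.0 / fs)
    A = np.zeros_like(X)
    keep = (freqs >= lo) & (freqs <= hi)   # positive in-band frequencies only
    A[keep] = 2.0 * X[keep]                # doubled -> analytic signal
    return np.angle(np.fft.ifft(A))

def plv(x, y, fs, band=(8.0, 12.0)):
    """Phase Locking Value in [0, 1]; 1 means perfectly locked phases."""
    dphi = band_phase(x, fs, *band) - band_phase(y, fs, *band)
    return float(np.abs(np.mean(np.exp(1j * dphi))))
```

A PLV near 1 indicates a stable phase relationship between the two players' signals; in NeuralDrum higher synchronization drives stronger graphics effects.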
Set Up
• HTC Vive HMD
• OpenBCI
• 3 EEG electrodes
Results
"It’s quite interesting, I actually felt like my
body was exchanged with my partner."
Poor Player Good Player

Technology Trends
• Advanced displays
• Wide FOV, high resolution
• Real time space capture
• 3D scanning, stitching, segmentation
• Natural gesture interaction
• Hand tracking, pose recognition
• Robust eye-tracking
• Gaze points, focus depth
• Emotion sensing/sharing
• Physiological sensing, emotion mapping
Sensor Enhanced HMDs
• HP Omnicept: eye tracking, heart rate, pupillometry, and face camera
• Project Galea: EEG, EMG, EDA, PPG, EOG, eye gaze, etc.
Multiple Physiological Sensors into HMD
• Incorporate range of sensors on HMD faceplate and over head
• EMG – muscle movement
• EOG – Eye movement
• EEG – Brain activity
• EDA, PPG – Heart rate

• Advanced displays
• Real time space capture
• Natural gesture interaction
• Robust eye-tracking
• Emotion sensing/sharing
Empathic Tele-Existence
• Based on Empathic Computing
• Creating shared understanding
• Covering the entire Metaverse
• AR, VR, Lifelogging, Mirror Worlds
• Transforming collaboration
• Observer to participant
• Feeling of doing things together
• Supporting Implicit collaboration
CONCLUSIONS
Summary
• Cross Reality systems transition across boundaries
• Mixed Reality continuum, Metaverse taxonomy
• Important research areas
• Enhancing Communication Cues, Asynchronous CR systems, Empathic Computing
• Scene Capture and Sharing, Avatar Representation, AI Enhanced Communication, Prototyping Tools
• New research opportunities available
• XR + AI + Sensing + ??

Research Directions for Cross Reality Interfaces
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au


Empathic Computing: New Approaches to Gaming
Empathic Computing: New Approaches to GamingEmpathic Computing: New Approaches to Gaming
Empathic Computing: New Approaches to Gaming
 
Beyond Reality (2027): The Future of Virtual and Augmented Reality
Beyond Reality (2027): The Future of Virtual and Augmented RealityBeyond Reality (2027): The Future of Virtual and Augmented Reality
Beyond Reality (2027): The Future of Virtual and Augmented Reality
 
Comp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRComp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VR
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the Metaverse
 
Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...
Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...
Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...
 
ISS2022 Keynote
ISS2022 KeynoteISS2022 Keynote
ISS2022 Keynote
 
Moving Beyond Questionnaires to Evaluate MR Experiences
Moving Beyond Questionnaires to Evaluate MR ExperiencesMoving Beyond Questionnaires to Evaluate MR Experiences
Moving Beyond Questionnaires to Evaluate MR Experiences
 
COMP 4010 Lecture12 Research Directions in AR
COMP 4010 Lecture12 Research Directions in ARCOMP 4010 Lecture12 Research Directions in AR
COMP 4010 Lecture12 Research Directions in AR
 
Comp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research DirectionsComp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research Directions
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR
 
Fifty Shades of Augmented Reality: Creating Connection Using AR
Fifty Shades of Augmented Reality: Creating Connection Using ARFifty Shades of Augmented Reality: Creating Connection Using AR
Fifty Shades of Augmented Reality: Creating Connection Using AR
 
COMP 4010 - Lecture11 - AR Applications
COMP 4010 - Lecture11 - AR ApplicationsCOMP 4010 - Lecture11 - AR Applications
COMP 4010 - Lecture11 - AR Applications
 
Empathic Mixed Reality
Empathic Mixed RealityEmpathic Mixed Reality
Empathic Mixed Reality
 
COMP 4010: Lecture 4 - 3D User Interfaces for VR
COMP 4010: Lecture 4 - 3D User Interfaces for VRCOMP 4010: Lecture 4 - 3D User Interfaces for VR
COMP 4010: Lecture 4 - 3D User Interfaces for VR
 
Mobile AR lecture 9 - Mobile AR Interface Design
Mobile AR lecture 9 - Mobile AR Interface DesignMobile AR lecture 9 - Mobile AR Interface Design
Mobile AR lecture 9 - Mobile AR Interface Design
 
COMP 4010 Lecture12 - Research Directions in AR and VR
COMP 4010 Lecture12 - Research Directions in AR and VRCOMP 4010 Lecture12 - Research Directions in AR and VR
COMP 4010 Lecture12 - Research Directions in AR and VR
 
2016 AR Summer School - Lecture 5
2016 AR Summer School - Lecture 52016 AR Summer School - Lecture 5
2016 AR Summer School - Lecture 5
 
Augmented reality the evolution of human computer interaction
Augmented reality the evolution of human computer interactionAugmented reality the evolution of human computer interaction
Augmented reality the evolution of human computer interaction
 
Calongne vr simulations games ctu doctoral july 2017
Calongne vr simulations games ctu doctoral july 2017Calongne vr simulations games ctu doctoral july 2017
Calongne vr simulations games ctu doctoral july 2017
 

More from Mark Billinghurst

The Metaverse: Are We There Yet?
The  Metaverse:    Are   We  There  Yet?The  Metaverse:    Are   We  There  Yet?
The Metaverse: Are We There Yet?
Mark Billinghurst
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
Mark Billinghurst
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
Mark Billinghurst
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
Mark Billinghurst
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems
Mark Billinghurst
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping
Mark Billinghurst
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction
Mark Billinghurst
 
2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology
Mark Billinghurst
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception
Mark Billinghurst
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR
Mark Billinghurst
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
Mark Billinghurst
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
Mark Billinghurst
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research Directions
Mark Billinghurst
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR Applications
Mark Billinghurst
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR Applications
Mark Billinghurst
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality
Mark Billinghurst
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface Design
Mark Billinghurst
 
Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and Systems
Mark Billinghurst
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
Mark Billinghurst
 

More from Mark Billinghurst (19)

The Metaverse: Are We There Yet?
The  Metaverse:    Are   We  There  Yet?The  Metaverse:    Are   We  There  Yet?
The Metaverse: Are We There Yet?
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction
 
2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research Directions
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR Applications
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR Applications
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface Design
 
Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and Systems
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 

Recently uploaded

Quality Patents: Patents That Stand the Test of Time
Quality Patents: Patents That Stand the Test of TimeQuality Patents: Patents That Stand the Test of Time
Quality Patents: Patents That Stand the Test of Time
Aurora Consulting
 
Implementations of Fused Deposition Modeling in real world
Implementations of Fused Deposition Modeling  in real worldImplementations of Fused Deposition Modeling  in real world
Implementations of Fused Deposition Modeling in real world
Emerging Tech
 
What's New in Copilot for Microsoft365 May 2024.pptx
What's New in Copilot for Microsoft365 May 2024.pptxWhat's New in Copilot for Microsoft365 May 2024.pptx
What's New in Copilot for Microsoft365 May 2024.pptx
Stephanie Beckett
 
Measuring the Impact of Network Latency at Twitter
Measuring the Impact of Network Latency at TwitterMeasuring the Impact of Network Latency at Twitter
Measuring the Impact of Network Latency at Twitter
ScyllaDB
 
UiPath Community Day Kraków: Devs4Devs Conference
UiPath Community Day Kraków: Devs4Devs ConferenceUiPath Community Day Kraków: Devs4Devs Conference
UiPath Community Day Kraków: Devs4Devs Conference
UiPathCommunity
 
Best Practices for Effectively Running dbt in Airflow.pdf
Best Practices for Effectively Running dbt in Airflow.pdfBest Practices for Effectively Running dbt in Airflow.pdf
Best Practices for Effectively Running dbt in Airflow.pdf
Tatiana Al-Chueyr
 
DealBook of Ukraine: 2024 edition
DealBook of Ukraine: 2024 editionDealBook of Ukraine: 2024 edition
DealBook of Ukraine: 2024 edition
Yevgen Sysoyev
 
7 Most Powerful Solar Storms in the History of Earth.pdf
7 Most Powerful Solar Storms in the History of Earth.pdf7 Most Powerful Solar Storms in the History of Earth.pdf
7 Most Powerful Solar Storms in the History of Earth.pdf
Enterprise Wired
 
Password Rotation in 2024 is still Relevant
Password Rotation in 2024 is still RelevantPassword Rotation in 2024 is still Relevant
Password Rotation in 2024 is still Relevant
Bert Blevins
 
20240705 QFM024 Irresponsible AI Reading List June 2024
20240705 QFM024 Irresponsible AI Reading List June 202420240705 QFM024 Irresponsible AI Reading List June 2024
20240705 QFM024 Irresponsible AI Reading List June 2024
Matthew Sinclair
 
[Talk] Moving Beyond Spaghetti Infrastructure [AOTB] 2024-07-04.pdf
[Talk] Moving Beyond Spaghetti Infrastructure [AOTB] 2024-07-04.pdf[Talk] Moving Beyond Spaghetti Infrastructure [AOTB] 2024-07-04.pdf
[Talk] Moving Beyond Spaghetti Infrastructure [AOTB] 2024-07-04.pdf
Kief Morris
 
Pigging Solutions Sustainability brochure.pdf
Pigging Solutions Sustainability brochure.pdfPigging Solutions Sustainability brochure.pdf
Pigging Solutions Sustainability brochure.pdf
Pigging Solutions
 
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
Chris Swan
 
Active Inference is a veryyyyyyyyyyyyyyyyyyyyyyyy
Active Inference is a veryyyyyyyyyyyyyyyyyyyyyyyyActive Inference is a veryyyyyyyyyyyyyyyyyyyyyyyy
Active Inference is a veryyyyyyyyyyyyyyyyyyyyyyyy
RaminGhanbari2
 
Cookies program to display the information though cookie creation
Cookies program to display the information though cookie creationCookies program to display the information though cookie creation
Cookies program to display the information though cookie creation
shanthidl1
 
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
Bert Blevins
 
20240702 Présentation Plateforme GenAI.pdf
20240702 Présentation Plateforme GenAI.pdf20240702 Présentation Plateforme GenAI.pdf
20240702 Présentation Plateforme GenAI.pdf
Sally Laouacheria
 
Transcript: Details of description part II: Describing images in practice - T...
Transcript: Details of description part II: Describing images in practice - T...Transcript: Details of description part II: Describing images in practice - T...
Transcript: Details of description part II: Describing images in practice - T...
BookNet Canada
 
INDIAN AIR FORCE FIGHTER PLANES LIST.pdf
INDIAN AIR FORCE FIGHTER PLANES LIST.pdfINDIAN AIR FORCE FIGHTER PLANES LIST.pdf
INDIAN AIR FORCE FIGHTER PLANES LIST.pdf
jackson110191
 
find out more about the role of autonomous vehicles in facing global challenges
find out more about the role of autonomous vehicles in facing global challengesfind out more about the role of autonomous vehicles in facing global challenges
find out more about the role of autonomous vehicles in facing global challenges
huseindihon
 

Recently uploaded (20)

Quality Patents: Patents That Stand the Test of Time
Quality Patents: Patents That Stand the Test of TimeQuality Patents: Patents That Stand the Test of Time
Quality Patents: Patents That Stand the Test of Time
 
Implementations of Fused Deposition Modeling in real world
Implementations of Fused Deposition Modeling  in real worldImplementations of Fused Deposition Modeling  in real world
Implementations of Fused Deposition Modeling in real world
 
What's New in Copilot for Microsoft365 May 2024.pptx
What's New in Copilot for Microsoft365 May 2024.pptxWhat's New in Copilot for Microsoft365 May 2024.pptx
What's New in Copilot for Microsoft365 May 2024.pptx
 
Measuring the Impact of Network Latency at Twitter
Measuring the Impact of Network Latency at TwitterMeasuring the Impact of Network Latency at Twitter
Measuring the Impact of Network Latency at Twitter
 
UiPath Community Day Kraków: Devs4Devs Conference
UiPath Community Day Kraków: Devs4Devs ConferenceUiPath Community Day Kraków: Devs4Devs Conference
UiPath Community Day Kraków: Devs4Devs Conference
 
Best Practices for Effectively Running dbt in Airflow.pdf
Best Practices for Effectively Running dbt in Airflow.pdfBest Practices for Effectively Running dbt in Airflow.pdf
Best Practices for Effectively Running dbt in Airflow.pdf
 
DealBook of Ukraine: 2024 edition
DealBook of Ukraine: 2024 editionDealBook of Ukraine: 2024 edition
DealBook of Ukraine: 2024 edition
 
7 Most Powerful Solar Storms in the History of Earth.pdf
7 Most Powerful Solar Storms in the History of Earth.pdf7 Most Powerful Solar Storms in the History of Earth.pdf
7 Most Powerful Solar Storms in the History of Earth.pdf
 
Password Rotation in 2024 is still Relevant
Password Rotation in 2024 is still RelevantPassword Rotation in 2024 is still Relevant
Password Rotation in 2024 is still Relevant
 
20240705 QFM024 Irresponsible AI Reading List June 2024
20240705 QFM024 Irresponsible AI Reading List June 202420240705 QFM024 Irresponsible AI Reading List June 2024
20240705 QFM024 Irresponsible AI Reading List June 2024
 
[Talk] Moving Beyond Spaghetti Infrastructure [AOTB] 2024-07-04.pdf
[Talk] Moving Beyond Spaghetti Infrastructure [AOTB] 2024-07-04.pdf[Talk] Moving Beyond Spaghetti Infrastructure [AOTB] 2024-07-04.pdf
[Talk] Moving Beyond Spaghetti Infrastructure [AOTB] 2024-07-04.pdf
 
Pigging Solutions Sustainability brochure.pdf
Pigging Solutions Sustainability brochure.pdfPigging Solutions Sustainability brochure.pdf
Pigging Solutions Sustainability brochure.pdf
 
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
 
Active Inference is a veryyyyyyyyyyyyyyyyyyyyyyyy
Active Inference is a veryyyyyyyyyyyyyyyyyyyyyyyyActive Inference is a veryyyyyyyyyyyyyyyyyyyyyyyy
Active Inference is a veryyyyyyyyyyyyyyyyyyyyyyyy
 
Cookies program to display the information though cookie creation
Cookies program to display the information though cookie creationCookies program to display the information though cookie creation
Cookies program to display the information though cookie creation
 
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
 
20240702 Présentation Plateforme GenAI.pdf
20240702 Présentation Plateforme GenAI.pdf20240702 Présentation Plateforme GenAI.pdf
20240702 Présentation Plateforme GenAI.pdf
 
Transcript: Details of description part II: Describing images in practice - T...
Transcript: Details of description part II: Describing images in practice - T...Transcript: Details of description part II: Describing images in practice - T...
Transcript: Details of description part II: Describing images in practice - T...
 
INDIAN AIR FORCE FIGHTER PLANES LIST.pdf
INDIAN AIR FORCE FIGHTER PLANES LIST.pdfINDIAN AIR FORCE FIGHTER PLANES LIST.pdf
INDIAN AIR FORCE FIGHTER PLANES LIST.pdf
 
find out more about the role of autonomous vehicles in facing global challenges
find out more about the role of autonomous vehicles in facing global challengesfind out more about the role of autonomous vehicles in facing global challenges
find out more about the role of autonomous vehicles in facing global challenges
 

Research Directions for Cross Reality Interfaces

  • 1. RESEARCH DIRECTIONS IN CROSS REALITY INTERFACES Mark Billinghurst mark.billinghurst@unisa.edu.au Summer School on Cross Reality July 2nd 2024
  • 5. Computer Interfaces • Separation between real and digital worlds • WIMP (Windows, Icons, Menus, Pointer) metaphor
  • 6. Making Interfaces Invisible • Rekimoto, J. and Nagao, K. (1995). The world through the computer: Computer augmented interaction with real world environments.
  • 7. Internet of Things (IoT) • Embed computing and sensing in the real world • Smart objects, sensors, etc.
  • 8. Virtual Reality (VR) • Users immersed in a computer-generated environment • HMD, gloves, 3D graphics, body tracking
  • 9. Augmented Reality (AR) • Virtual images blended with the real world • See-through HMD, handheld display, viewpoint tracking, etc.
  • 10. From Reality to Virtual Reality Internet of Things Augmented Reality Virtual Reality Real World Virtual World
  • 11. Milgram’s Mixed Reality (MR) Continuum Augmented Reality Virtual Reality Real World Virtual World Mixed Reality "...anywhere between the extrema of the virtuality continuum." P. Milgram and F. Kishino (1994). A Taxonomy of Mixed Reality Visual Displays. Internet of Things
  • 12. Milgram’s Mixed Reality (MR) Continuum Augmented Reality Virtual Reality Real World Virtual World Mixed Reality Internet of Things
  • 13. The MagicBook (2001) Reality Virtuality Augmented Reality (AR) Augmented Virtuality (AV) Billinghurst, M., Kato, H., & Poupyrev, I. (2001). The MagicBook: a transitional AR interface. Computers & Graphics, 25(5), 745-753.
  • 15. Features • Seamless transition from Reality to Virtuality • Reliance on real decreases as virtual increases • Supports egocentric and exocentric views • User can pick appropriate view • Independent Views • Privacy, role division, scalability • Collaboration on multiple levels: • Physical Object, AR Object, Immersive Virtual Space • Egocentric + exocentric collaboration • multiple multi-scale users
  • 16. Apple Vision Pro (2024) • Transitioning from AR to VR • Spatial Computing – interface seamlessly blending with real world
  • 17. Cross Reality (CR) Systems •Systems that facilitate: •a smooth transition between systems using different degrees of virtuality or •collaboration between users using different systems with different degrees of virtuality Simeone, Adalberto L., Mohamed Khamis, Augusto Esteves, Florian Daiber, Matjaž Kljun, Klen Čopič Pucihar, Poika Isokoski, and Jan Gugenheimer. "International workshop on cross-reality (xr) interaction." In Companion Proceedings of the 2020 Conference on Interactive Surfaces and Spaces, pp. 111-114. 2020.
  • 18. Publications in Cross Reality Increasing publications since 2019
  • 19. Key CR Technologies • Augmentation technologies that layer information onto our perception of the physical environment • Simulation technologies that model reality • Intimate technologies that focus inwardly, on the identity and actions of the individual or object • External technologies that focus outwardly, towards the world at large
  • 20. Taxonomy • Four Key Components • Virtual Worlds • Augmented Reality • Mirror Worlds • Lifelogging
  • 21. Mirror Worlds • Simulations of external space/content • Capturing and sharing surroundings • Photorealistic content • Digital twins Matterport Deep Mirror Google Street View Soul Machines
  • 22. Lifelogging • Measuring user’s internal state • Capturing physiological cues • Recording everyday life • Augmenting humans Apple Fitbit Shimmer OpenBCI
  • 25. What is the Metaverse Research Landscape? •Survey of Scopus papers (to June 2023) • ~1900 papers found with Metaverse in abstract/keywords •Further analysis • Look for publications in AR, VR, MirrorWorlds (MW), LifeLogging (LL) • Look for research across boundaries •Application analysis • Most popular application domains
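The quadrant analysis described above can be sketched as a simple keyword classifier. This is an illustrative toy, not the actual Scopus analysis pipeline; the keyword lists and sample abstracts are invented for the example.

```python
# Illustrative toy (not the authors' actual analysis pipeline): classify
# paper abstracts into the four Metaverse quadrants (AR, VR, Mirror Worlds,
# Lifelogging) by keyword matching, then count papers that span a boundary.

QUADRANT_KEYWORDS = {
    "AR": ["augmented reality", "mixed reality"],
    "VR": ["virtual reality", "immersive environment"],
    "MW": ["mirror world", "digital twin"],
    "LL": ["lifelogging", "physiological sensing"],
}

def classify(abstract):
    """Return the set of quadrants whose keywords appear in the abstract."""
    text = abstract.lower()
    return frozenset(q for q, kws in QUADRANT_KEYWORDS.items()
                     if any(kw in text for kw in kws))

def cross_boundary_counts(abstracts):
    """Count papers touching each combination of two or more quadrants."""
    counts = {}
    for a in abstracts:
        quads = classify(a)
        if len(quads) >= 2:  # paper crosses a quadrant boundary
            counts[quads] = counts.get(quads, 0) + 1
    return counts

# Invented sample abstracts, purely for demonstration
papers = [
    "A study of augmented reality and virtual reality collaboration.",
    "Digital twin mirror world rendering in virtual reality.",
    "Lifelogging with wearable cameras.",
]
print(cross_boundary_counts(papers))
```

Counting keyword co-occurrence like this is what surfaces the cross-boundary gaps (e.g. very few AR/LL papers) highlighted in the lessons learned.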
  • 30. Lessons Learned •Research Strengths • Most Metaverse research is VR related (36%) • Strong connections between AR/VR (16%) • Strong connections between MW/VR (11%) •Research Opportunities • Opportunities across boundaries - 1% of papers in AR/LL, 0% in MW/LL • Opportunities to combine > 2 quadrants – 0% in AR/MW/LL • Opportunities for research combining all elements • Broadening application space – industry, finance, etc.
  • 31. Possible Research Directions • Lifelogging to VR • Bringing real world actions into VR, VR to experience lifelogging data • AR to Lifelogging • Using AR to view lifelogging data in everyday life, Sharing physiological data • Mirror Worlds to VR • VR copy of the real world, Mirroring real world collaboration in VR • AR to Mirror Worlds • Visualizing the past in place, Asymmetric collaboration • And more..
  • 32. Example: Sharing Communication Cues • Measuring non-verbal cues • Gaze, face expression, heart rate • Sharing in Augmented Reality • Collaborative AR experiences
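As a rough sketch of how measured non-verbal cues might be packaged for sharing with a remote AR collaborator, the snippet below serializes one frame of gaze, expression, and heart-rate data. All field names are illustrative assumptions, not the actual protocol used in these systems.

```python
# Hedged sketch: bundle one frame of implicit communication cues
# (gaze point, expression label, heart rate) into a JSON message that a
# remote AR client could render. Field names are illustrative only.
import json
import time

def make_cue_message(gaze_xy, expression, heart_rate_bpm):
    """Serialize one frame of implicit communication cues for sharing."""
    return json.dumps({
        "timestamp": time.time(),
        "gaze": {"x": gaze_xy[0], "y": gaze_xy[1]},  # normalized 0..1 coords
        "expression": expression,                    # e.g. "smile", "neutral"
        "heart_rate": heart_rate_bpm,                # beats per minute
    })

# The receiving AR client would decode and overlay these cues
msg = make_cue_message((0.42, 0.57), "smile", 72)
decoded = json.loads(msg)
print(decoded["expression"], decoded["heart_rate"])
```

In a real system a stream of such messages, sent many times per second, would drive the gaze pointer and expression display described on the following slides.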
  • 33. Empathy Glasses • Combines eye-tracking, display, and face expression sensing • Implicit cues – eye gaze, face expression (Pupil Labs + Epson BT-200 + AffectiveWear) Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
  • 34. Remote Collaboration • Eye gaze pointer and remote pointing • Face expression display • Implicit cues for remote collaboration
  • 36. Research Directions •Enhancing Communication Cues •Avatar Representation •AI Enhanced communication •Scene Capture and Sharing •Asynchronous CR systems •Prototyping Tools •Empathic Computing
  • 39. • Using AR/VR to share communication cues • Gaze, gesture, head pose, body position • Sharing same environment • Virtual copy of real world • Collaboration between AR/VR • VR user appears in AR user’s space Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5. Sharing Virtual Communication Cues (2019)
  • 40. Sharing Virtual Communication Cues • AR/VR displays • Gesture input (Leap Motion) • Room scale tracking • Conditions • Baseline, FoV, Head-gaze, Eye-gaze
  • 42. Results • Predictions • Eye/Head pointing better than no cues • Eye/head pointing could reduce need for pointing • Results • No difference in task completion time • Head-gaze/eye-gaze greater mutual gaze rate • Using head-gaze greater ease of use than baseline • All cues provide higher co-presence than baseline • Pointing gestures reduced in cue conditions • But • No difference between head-gaze and eye-gaze
  • 43. Enhancing Gaze Cues How sharing gaze behavioural cues can improve remote collaboration in a Mixed Reality environment. ➔ Developed eyemR-Vis, a 360 panoramic Mixed Reality remote collaboration system ➔ Showed gaze behavioural cues as bi-directional spatial virtual visualisations shared between a local host (AR) and a remote collaborator (VR). Jing, A., May, K. W., Naeem, M., Lee, G., & Billinghurst, M. (2021). eyemR-Vis: Using Bi-Directional Gaze Behavioural Cues to Improve Mixed Reality Remote Collaboration. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-7).
  • 44. System Design ➔ 360 Panoramic Camera + Mixed Reality View ➔ Combination of HoloLens 2 + Vive Pro Eye ➔ 4 gaze behavioural visualisations: browse, focus, mutual, fixated circle
  • 46. Example: Multi-Scale Collaboration • Changing the user’s virtual body scale Piumsomboon, T., Lee, G. A., Irlitti, A., Ens, B., Thomas, B. H., & Billinghurst, M. (2019, May). On the shoulder of the giant: A multi-scale mixed reality collaboration with 360 video sharing and tangible interaction. In Proceedings of the 2019 CHI conference on human factors in computing systems (pp. 1-17).
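A minimal sketch of the core idea behind multi-scale collaboration: scaling the user's virtual body by a factor s also scales the camera separation and movement speed, so a "giant" collaborator perceives the shared space as a miniature. The class and field names here are hypothetical, not taken from the published system.

```python
# Hypothetical sketch of multi-scale collaboration: rescaling the virtual
# body rescales eye separation and locomotion speed by the same factor.
from dataclasses import dataclass

@dataclass
class VirtualBody:
    """Illustrative parameters tied to the user's virtual body scale."""
    height_m: float = 1.75
    ipd_m: float = 0.064        # interpupillary (camera) distance
    move_speed_mps: float = 1.4

    def scaled(self, s):
        """Rescale by factor s: s > 1 yields a 'giant', s < 1 a miniature."""
        return VirtualBody(self.height_m * s, self.ipd_m * s,
                           self.move_speed_mps * s)

# A 10x 'giant' collaborator: the larger eye separation makes the shared
# scene read as a miniature, and faster movement matches the new scale.
giant = VirtualBody().scaled(10.0)
print(giant.height_m, giant.move_speed_mps)
```

Scaling the interpupillary distance is what drives the perceptual size change: stereo depth cues at 10x separation make a room-sized scene appear toy-sized.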
  • 48. Sharing: Separating Cues from Body • What happens when you can’t see your colleague/agent? Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1-13). Collaborating Collaborator out of View
  • 49. Mini-Me Communication Cues in MR • When lose sight of collaborator a Mini-Me avatar appears • Miniature avatar in real world • Mini-Me points to shared objects, show communication cues • Redirected gaze, gestures
  • 51. User Study (16 participants) • Collaboration between user in AR, expert in VR • HoloLens, HTC Vive • Two tasks: • (1) asymmetric, (2) symmetric • Key findings • Mini-Me significantly improved performance time (task 1) (20% faster) • Mini-Me significantly improved Social Presence scores • 63% (task 2) – 75% (task 1) of users preferred Mini-Me “I feel like I am talking to my partner”
  • 53. Avatar Representation for Social Presence • What should avatars look like for social situations? • Cartoon vs. realistic? • Partial or full body? • Impact on Social Presence? Yoon, B., Kim, H. I., Lee, G. A., Billinghurst, M., & Woo, W. (2019, March). The effect of avatar appearance on social presence in an augmented reality remote collaboration. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 547-556). IEEE.
  • 54. Avatar Representations • Cartoon vs. Realistic, Part Body vs. Whole Body • Realistic Head & Hands (RHH), Realistic Upper Body (RUB), Realistic Whole Body (RWB), • Cartoon Head & Hands (CHH), Cartoon Upper Body (CUB), Cartoon Whole Body (CWB).
  • 55. Experiment • Within-subjects design (24 subjects) • 6 conditions: RHH, RUB, RWB, CHH, CUB, CWB • AR/VR interface • Subject in AR interface, actor in VR • Experiment measures • Social Presence • Networked Mind Measure of Social Presence survey • Bailenson’s Social Presence survey • Post Experiment Interview • Tasks • Study 1: Crossword puzzle (Face to Face discussion) • Study 2: Furniture placement (virtual object placement) AR user VR user
  • 56. Hypotheses H1. Body Part Visibility will affect the user’s Social Presence in AR. H2. The Whole-Body virtual avatars will have the highest Social Presence among the three levels of visibility. H3. Head & Hands virtual avatars will have the lowest Social Presence among the three levels of visibility. H4. The Character Style will affect the user’s Social Presence. H5. Realistic avatars will have a higher Social Presence than Cartoon Style avatars in an AR remote collaboration.
  • 57. Results • Aggregated Presence Scores • 1: strongly disagree - 7: strongly agree
  • 58. User Comments • ‘Whole Body’ Avatar Expression to Users • “Presence was high with full body parts, because I could notice joints’ movement, behaviour, and reaction.” • “I didn’t get the avatar’s intention of the movement, because it had only head and hands.” • ‘Upper Body’ vs. ‘Whole Body’ Avatar • “I preferred the one with whole body, but it didn’t really matter because I didn’t look at the legs much.”, • “I noticed head and hands model immediately, but I didn’t feel the difference whether the avatar had a lower body or not.” • ‘Realistic’ vs ‘Cartoon’ style Avatars • "The character seemed more like a game than furniture placement in real. I felt that realistic whole body was collaborating with me more.”
  • 59. Key Lessons Learned • Avatar Body Part visibility should be considered first when designing for AR remote collaboration since it significantly affects Social Presence • Body Part Visibility • Whole Body & Upper Body: Whole body is preferred, but upper body is okay in some cases • Head & Hands: Should be avoided • Character Style • No difference in Social Presence between Realistic and Cartoon avatars • However, the majority of participants had a positive response towards the Realistic avatar • Cartoon character for fun, Realistic avatar for professional meetings
  • 60. Avatar Representation in Training • Pilot study with recorded avatar • Motorcycle engine assembly • Avatar types • (A1) Annotation: Computer-generated lines drawn in 3D space. • (A2) Hand Gesture: Real hand gestures captured using stereoscopic cameras • (A3) Avatar: Virtual avatar reconstructed using inverse kinematics. • (A4) Volumetric Playback: Using three Kinect cameras, the movements of an expert are captured and played back as a virtual avatar via a see-through headset.
  • 62. Representing Remote Users Virtual Avatar Volumetric Avatar
  • 63. Experiment Design (30 participants) Performing motorbike assembly task under guidance - Easy, Medium, Hard task Hypotheses - H1. Volumetric playback would have a better sense of social presence in a remote training system. - H2. Volumetric playback would enable faster completion of tasks in a remote training system Measures • NMM Social Presence Questionnaire, NASA TLX, SUS
  • 64. Results • Hands, Annotation significantly faster than avatar • Volumetric playback induced the highest sense of co-presence • Users preferred Volumetric or Annotation interface Performance Time Average Ranking
  • 65. Results Volumetric instruction cues exhibit an increase in co-presence and system usability while reducing mental workload and frustration. Mental Load (NASA TLX) System Usability Scale
  • 66. User Feedback • Annotations easy to understand (faster performance) • “Annotation is very clear and easy to spot in a 3d environment”. • Volumetric creates high degree of social presence (working with person) • “Seeing a real person demonstrate the task, feels like being next to a person”. • Recommendations • Use Volumetric Playback to improve Social Presence and system usability • Using a full-bodied avatar representation in a remote training system is not recommended unless it is well animated • Using simple annotation can have significant improvement in performance if social presence is not of importance.
  • 68. Enhancing Emotion • Using physiological and contextual cues to enhance emotion representation • Show the user’s real emotion, make it easier to understand the user’s emotion, etc. [ Diagram: real user → physiological cues (arousal/valence, positive/negative) + context cues → avatar ]
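As a sketch of the arousal/valence mapping above: a minimal Python function (illustrative only, not the actual system's code) that maps a valence/arousal reading to a Russell-circumplex quadrant label an avatar could express:

```python
def emotion_quadrant(valence: float, arousal: float) -> str:
    """Map a valence/arousal reading (each in [-1, 1]) to a circumplex quadrant.

    Labels follow Russell's circumplex model; the zero thresholds are
    illustrative - a real system would calibrate them per user.
    """
    if valence >= 0:
        return "excited/happy" if arousal >= 0 else "calm/relaxed"
    return "angry/stressed" if arousal >= 0 else "sad/bored"
```

A downstream avatar controller could then select facial animations from the returned quadrant label.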
  • 70. Early Results [ Images: face tracking → positive-affect avatar outcome ]
  • 72. Conversational Agents • Intelligent Virtual Agents (IVAs) • Embodied on a 2D screen or in 3D space
  • 73. Photorealistic Characters • Synthesia • AI + ML to create videos • Speech + image synthesis • Supports >60 languages • Personalized characters
  • 76. Intelligent Digital Humans • Soul Machines • AI digital brain • Expressive digital humans • Autonomous animation • Able to see and hear • Learn from users
  • 77. Towards Empathic Social Agents • Goal: Using agents to create empathy between people • Combine • Scene capture • Shared tele-presence • Trust/emotion recognition • Enhanced communication cues • Separate cues from representation • Facilitating brain synchronization
  • 79. SCENE CAPTURE AND SHARING
  • 80. Example: Connecting between Spaces • Augmented Reality • Bringing remote people into your real space • Virtual Reality • Bringing elements of the real world into VR • AR/VR for sharing communication cues • Sharing non-verbal communication cues
  • 81. Shared Sphere – 360 Video Sharing Shared Live 360 Video Host User Guest User Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
  • 83. 3D Live Scene Capture • Use cluster of RGBD sensors • Fuse together 3D point cloud
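The fusion step above can be sketched in a few lines: assuming each RGBD sensor has a calibrated 4x4 sensor-to-world extrinsic, transform every sensor's point cloud into the shared frame and concatenate (function and variable names are illustrative, not from the actual system):

```python
import numpy as np

def fuse_point_clouds(clouds, extrinsics):
    """Transform each sensor's points into a shared world frame and merge.

    clouds     -- list of (N_i, 3) arrays, one per RGBD sensor
    extrinsics -- list of 4x4 sensor-to-world transforms (from calibration)
    """
    fused = []
    for pts, T in zip(clouds, extrinsics):
        homo = np.hstack([pts, np.ones((pts.shape[0], 1))])  # homogeneous coords
        fused.append((homo @ T.T)[:, :3])                    # apply rigid transform
    return np.vstack(fused)
```

A production pipeline would additionally filter overlapping points and fuse colour, but the core of multi-sensor capture is exactly this per-sensor rigid transform into one frame.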
  • 84. Live 3D Scene Capture
  • 85. Scene Capture and Sharing Scene Reconstruction Remote Expert Local Worker
  • 86. AR View Remote Expert View
  • 87. 3D Mixed Reality Remote Collaboration (2022) Tian, H., Lee, G. A., Bai, H., & Billinghurst, M. (2023). Using Virtual Replicas to Improve Mixed Reality Remote Collaboration. IEEE Transactions on Visualization and Computer Graphics.
  • 88. View Sharing Evolution • Increased immersion • Improved scene understanding • Better collaboration 2D 360 3D
  • 89. Switching between 360 and 3D views • 360 video • High quality visuals • Poor spatial representation • 3D reconstruction • Poor visual quality • Good spatial representation
  • 90. Swapping between 360 and 3D views • Have pre-captured 3D model of real space • Enable remote user to swap between live 360 video or 3D view • Represent remote user as avatar Teo, T., F. Hayati, A., A. Lee, G., Billinghurst, M., & Adcock, M. (2019). A technique for mixed reality remote collaboration using 360 panoramas in 3d reconstructed scenes. In 25th ACM Symposium on Virtual Reality Software and Technology (pp. 1-11).
  • 92. SharedNeRF (2024) • Combines a 3D point cloud with NeRF rendering • Uses the head-mounted camera view to create NeRF images, and the point cloud for fast-moving objects Sakashita, M., et al. (2024, May). SharedNeRF: Leveraging Photorealistic and View-dependent Rendering for Real-time and Remote Collaboration. In Proceedings of the CHI Conference on Human Factors in Computing Systems (pp. 1-14).
  • 95. Time Travellers – Motivation • User can move along the Reality-Virtuality Continuum • Scenario: an expert worker in a factory, moving between the store room and the workbench Cho, H., Yuan, B., Hart, J. D., Chang, Z., Cao, J., Chang, E., & Billinghurst, M. (2023, October). Time Travellers: An Asynchronous Cross Reality Collaborative System. In 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 848-853). IEEE.
  • 96. Design Goals • AR/VR recording of the user’s actions • MR playback • MR headset (Magic Leap 2) • Real object tracking
  • 97. Design Goals • AR and VR modes • WIM (World in Miniature) • VR view manipulation
  • 98. Design Goals • Visual annotation in AR and VR modes
  • 99. Design Goals • Seamless transition: AR → VR and VR → AR • Avatar, virtual replica
  • 100. “Time Travellers” Overview • Step 1: Recording an expert’s standard process • 1st user wears an MR headset (Magic Leap 2) with real object tracking • Step 2: Reviewing the recorded process through the hybrid cross-reality playback system • 2nd user: visual annotation, avatar interaction, timeline manipulation • Recording data: 3D workspace, avatar and object spatial data • AR and VR modes in a cross-reality asynchronous collaborative system
  • 101. Pilot User Evaluation – User Study Design • Six participants performed a task of reviewing and annotating recorded videos in both AR and AR+VR (Cross Reality) conditions • A marker is left when the action begins, and an arrow when it ends [ AR condition ] [ AR+VR condition ]
  • 102. Pilot User Evaluation – Measurements • Objective measurements • Task completion time, moving trajectory, timeline manipulation time • Subjective measurements • NASA TLX • System Usability Scale (SUS)
  • 103. Pilot User Evaluation – Results and Lessons Learned • Objective measurements (AR vs. AR+VR) [ Charts: task completion time (sec), moving trajectories (m), timeline manipulation time (sec) ]
  • 104. Pilot User Evaluation – Results and Lessons Learned • Subjective measurements • The Cross-Reality mode was rated as more useful for overall understanding of the collaboration process (faster task completion with a lower task load)
  • 107. Challenges with Prototyping CR Systems • Cross platform support • Need for programming skills • Building collaborative systems • Need to build multiple different interfaces • Connecting multiple devices/components • Difficult to prototype hardware/display systems
  • 108. Example: Secondsight A prototyping platform for rapidly testing cross-device interfaces • Enables an AR HMD to "extend" the screen of a smartphone Key Features • Can simulate a range of HMD Field of View • Enables World-fixed or Device-fixed content placement • Supports touch screen input, free-hand gestures, head-pose selection Reichherzer, C., Fraser, J., Rompapas, D. C., & Billinghurst, M. (2021, May). Secondsight: A framework for cross-device augmented reality interfaces. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-6).
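The world-fixed vs. device-fixed placement modes above reduce to a single pose composition; the sketch below is an illustrative reconstruction of the idea, not SecondSight's actual code:

```python
import numpy as np

def content_world_transform(mode, content_T, device_T=None):
    """Return the 4x4 world transform at which to render a content panel.

    mode      -- "world" (anchored in the room) or "device" (follows the phone)
    content_T -- for "world": absolute pose; for "device": offset in device frame
    device_T  -- current device-to-world pose (required for "device" mode)
    """
    if mode == "world":
        return content_T                 # stays put as the device moves
    if mode == "device":
        return device_T @ content_T      # re-anchored to the tracked device
    raise ValueError(f"unknown placement mode: {mode}")
```

Called once per frame with the latest tracked device pose, this is enough to make an AR panel either hover in the room or hang off the edge of the phone screen.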
  • 109. Content Placement © 2021 SIGGRAPH. All Rights Reserved.
  • 110. Input
  • 111. Map application © 2021 SIGGRAPH. All Rights Reserved.
  • 112. Implementation Hardware • Meta 2 AR glasses (82° FOV) • Samsung Galaxy S8 phone • OptiTrack motion capture system Software • Unity game engine • Mirror networking library
  • 113. Google - The Cross-device Toolkit - XDTK • Open-source toolkit to enable communication between Android devices and Unity • Handles • Device discovery/communication • Sensor data streaming • ARCore pose information • https://github.com/google/xdtk Gonzalez, E. J., Patel, K., Ahuja, K., & Gonzalez-Franco, M. (2024, March). XDTK: A Cross-Device Toolkit for Input & Interaction in XR. In 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW) (pp. 467-470). IEEE.
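XDTK's actual wire protocol isn't shown on this slide; purely as an illustration of the device-to-Unity streaming idea, a hypothetical JSON-over-UDP receiver for device sensor packets (all field names are assumptions) might look like:

```python
import json
import socket

def parse_device_message(raw: bytes) -> dict:
    """Decode one sensor packet; the field names here are illustrative only."""
    msg = json.loads(raw.decode("utf-8"))
    return {
        "device_id": msg["device_id"],
        "sensor": msg["sensor"],     # e.g. "touch", "imu", "arcore_pose"
        "values": msg["values"],     # sensor-specific payload
    }

def listen(port: int = 5555) -> None:
    """Blocking receive loop for streamed packets (host/Unity side)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        raw, addr = sock.recvfrom(4096)
        print(addr, parse_device_message(raw))

if __name__ == "__main__":
    listen()
```

The real toolkit also handles device discovery and ARCore pose streaming; see the linked GitHub repository for the actual message formats.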
  • 114. XR Prototyping Tools • Axes: skill & resources required vs. level of fidelity in AR/VR • Class 1: InVision, Sketch, XD, ... • Class 2: DART, Proto.io, Montage, ... • Class 3: ARToolKit, Vuforia/Lens/Spark AR Studio, ... • Class 4: SketchUp, Blender, ... • Class 5: A-Frame, Unity, Unreal Engine, ... • Immersive authoring: Tilt Brush, Blocks, Maquette, Pronto, ... • On-device/cross-device: ProtoAR, 360proto, XRDirector, ... • ? Research needed
  • 115. On-device/Cross-device/Immersive Authoring https://www.youtube.com/watch?v=CXdgTMKpP_o Leiva, G., Nguyen, C., Kazi, R. H., & Asente, P. (2020, April). Pronto: Rapid augmented reality video prototyping using sketches and enaction. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-13).
  • 116. VRception Toolkit • First multi-user and multi-environment rapid-prototyping toolkit for non-experts • Designed for rapid prototyping of CR systems • Supports prototyping in VR or in Unity3D Gruenefeld, U., et al. (2022, April). Vrception: Rapid prototyping of cross-reality systems in virtual reality. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (pp. 1-15).
  • 119. Modern Communication Technology Trends 1. Improved Content Capture • Move from sharing faces to sharing places 2. Increased Network Bandwidth • Sharing natural communication cues 3. Implicit Understanding • Recognizing behaviour and emotion
  • 122. “Empathy is seeing with the eyes of another, listening with the ears of another, and feeling with the heart of another.” Alfred Adler
  • 123. Empathic Computing Research Focus Can we develop systems that allow us to share what we are seeing, hearing and feeling with others?
  • 124. Key Elements of Empathic Systems •Understanding • Emotion Recognition, physiological sensors •Experiencing • Content/Environment capture, VR •Sharing • Communication cues, AR
  • 125. Example: NeuralDrum • Using brain synchronicity to increase connection • Collaborative VR drumming experience • Measure brain activity using 3 EEG electrodes • Use PLV (Phase Locking Value) to calculate synchronization • More synchronization increases graphics effects/immersion Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
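The PLV metric mentioned above can be computed with numpy alone: take each channel's instantaneous phase via an FFT-based Hilbert transform, then the magnitude of the mean phase-difference vector. This is a generic textbook sketch, not the NeuralDrum implementation:

```python
import numpy as np

def analytic_signal(x):
    """Hilbert transform via FFT: zero negative frequencies, double positive."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def plv(x, y):
    """Phase Locking Value of two equal-length signals (0 = none, 1 = locked)."""
    phase_diff = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * phase_diff)))
```

Two signals with a constant phase offset give a PLV near 1, while signals at different frequencies drift apart and score near 0, which is what lets the system drive graphics intensity from inter-brain synchrony.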
  • 126. Set Up • HTC Vive HMD • OpenBCI • 3 EEG electrodes
  • 128. Results "It’s quite interesting, I actually felt like my body was exchanged with my partner." Poor Player Good Player
  • 129. Technology Trends • Advanced displays • Wide FOV, high resolution • Real time space capture • 3D scanning, stitching, segmentation • Natural gesture interaction • Hand tracking, pose recognition • Robust eye-tracking • Gaze points, focus depth • Emotion sensing/sharing • Physiological sensing, emotion mapping
  • 130. Sensor Enhanced HMDs Eye tracking, heart rate, pupillometry, and face camera HP Omnicept Project Galea EEG, EMG, EDA, PPG, EOG, eye gaze, etc.
  • 131. Multiple Physiological Sensors into HMD • Incorporate range of sensors on HMD faceplate and over head • EMG – muscle movement • EOG – Eye movement • EEG – Brain activity • EDA, PPG – Heart rate
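As one example of turning a faceplate sensor stream into a usable signal, here is a deliberately simple (illustrative, not production-grade) heart-rate estimate from a PPG trace by peak counting:

```python
import numpy as np

def heart_rate_bpm(ppg, fs):
    """Estimate heart rate from a PPG trace by counting peaks.

    ppg -- 1-D signal (arbitrary units); fs -- sampling rate in Hz.
    A very simple detector: local maxima above the mean, at least 0.4 s
    apart (a refractory period capping the estimate near 150 bpm).
    """
    min_gap = int(0.4 * fs)
    thresh = ppg.mean()
    peaks = []
    for i in range(1, len(ppg) - 1):
        if ppg[i] > thresh and ppg[i] >= ppg[i - 1] and ppg[i] > ppg[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    if len(peaks) < 2:
        return 0.0
    mean_interval = np.mean(np.diff(peaks)) / fs  # seconds per beat
    return 60.0 / mean_interval
```

Real headsets apply band-pass filtering and motion-artifact rejection first; the point here is only that raw faceplate signals are a short computation away from cues (heart rate, arousal) that an empathic system can share.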
  • 133. • Advanced displays • Real time space capture • Natural gesture interaction • Robust eye-tracking • Emotion sensing/sharing Empathic Tele-Existence
  • 134. Empathic Tele-Existence • Based on Empathic Computing • Creating shared understanding • Covering the entire Metaverse • AR, VR, Lifelogging, Mirror Worlds • Transforming collaboration • Observer to participant • Feeling of doing things together • Supporting Implicit collaboration
  • 136. Summary • Cross Reality systems transition across boundaries • Mixed Reality continuum, Metaverse taxonomy • Important research areas • Enhancing Communication Cues, Asynchronous CR systems, Empathic Computing • Scene Capture and Sharing, Avatar Representation, AI Enhanced communication, Prototyping Tools • New research opportunities available • XR + AI + Sensing + ??