
Emission Vision

What if knowing the past can change the future?

Introduction

Emission Vision is a digital twin implemented in mixed reality, designed to provide an interactive and collaborative way of analyzing global CO2 emissions data and each country's contribution. Over the years, CO2 emissions have steadily climbed, and they continue to rise; it is therefore crucial for policymakers and environmental analysts to make accurate projections based on patterns in the data. By integrating real-world environmental data into a 3D visualization, Emission Vision enables users to explore, analyze, and compare the complex trends in CO2 emission ratios around the world over time. Users can easily compare countries' contributions, gain insights from the data, and collaborate in real time, allowing analysts to work together to address global environmental challenges.

Design Process

Emission Vision is a project designed for the Design for Complex and Dynamic Contexts course at Stockholm University. Our goal was to create a collaborative and interactive tool that could transform environmental data into useful insights with the help of mixed reality.

Brainstorming

In the initial brainstorming sessions, we focused on identifying areas that could be enhanced with digital twin and mixed reality technologies. We discussed the importance of analyzing CO2 emissions over the years and how this information could help address environmental challenges. We explored how to incorporate the three main layers of a digital twin: Physical Entity, Virtual Entity, and Data. For the physical entity, we chose the phenomenon of CO2 emissions on planet Earth; accordingly, for the virtual entity, we decided on a virtual globe. Initially, we aimed to use real data on the total amount of CO2 emitted per year by each country. However, we realized that using the CO2 emission ratio would be more insightful, as it accounts for population differences and provides a clearer picture of each country's impact.
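Concretely, the ratio normalizes a country's total emissions by its population (a per-capita measure is assumed here; the exact units depend on the dataset):

```math
\text{emission ratio} = \frac{\text{total annual CO}_2\ \text{emissions}}{\text{population}}
```

This lets countries with very different populations be compared on an equal footing.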

Wireframes and Prototypes

(Screenshots of the wireframes and prototypes.)

User Personas

Persona 1: Environmental Analyst

  • Name: Dr. Emma Green
  • Age: 45
  • Occupation: Senior Environmental Analyst
  • Goals: Analyze CO2 emission trends, make data-driven policy recommendations.
  • Needs: Accurate, easy-to-interpret data visualization, real-time collaboration with colleagues.
  • Pain Points: Difficulty in comparing CO2 data across different countries and years.

Persona 2: Policy Maker

  • Name: John Smith
  • Age: 50
  • Occupation: Government Policy Maker
  • Goals: Understand the impact of geographical location and policies on CO2 emissions and create effective environmental regulations.
  • Needs: Comprehensive insights from data, clear visual comparisons.
  • Pain Points: Lack of accessible and understandable data for decision-making.

User Journey

Since our project is multi-user and built around collaboration, we adopted a turn-taking approach: while one user is interacting with the virtual objects in the mixed-reality environment, the other user cannot interact with them; as soon as the first user stops interacting, the other may take over. Here is one potential scenario for the users' journey in this app:

(User journey storyboard screenshot.)
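As a minimal sketch of how such a turn-taking lock could sit on top of Photon Unity Networking, consider the component below; the class and method names are illustrative, not the project's actual code:

```csharp
using Photon.Pun;
using UnityEngine;

// Hypothetical turn-taking lock: only one user may interact at a time.
// Attach to the shared interactable; requires a PhotonView on the same object.
public class InteractionLock : MonoBehaviourPun
{
    // ActorNumber of the player currently interacting, or -1 when free.
    private int lockedBy = -1;

    public bool TryBeginInteraction()
    {
        if (lockedBy != -1) return false; // someone else is interacting

        photonView.RPC(nameof(SetLock), RpcTarget.AllBuffered,
                       PhotonNetwork.LocalPlayer.ActorNumber);
        return true;
    }

    public void EndInteraction()
    {
        if (lockedBy == PhotonNetwork.LocalPlayer.ActorNumber)
            photonView.RPC(nameof(SetLock), RpcTarget.AllBuffered, -1);
    }

    [PunRPC]
    private void SetLock(int actorNumber) => lockedBy = actorNumber;
}
```

Note that this naive version is not race-free (two users could grab the lock in the same network tick); PUN's ownership-transfer mechanism is one way to make it robust.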

Challenges and Solutions

One major challenge was integrating the data into Unity. Some values were in a format that Unity could not read directly. To address this, we conducted Exploratory Data Analysis (EDA) to identify and handle potential missing values and format inconsistencies before integrating the data into our project.
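The same defensive attitude can also be applied at load time. The sketch below shows how a cleaned CSV might be parsed inside Unity, skipping rows with missing values and parsing numbers culture-invariantly; the `country,year,ratio` column layout and the field names are assumptions for illustration:

```csharp
using System.Globalization;
using UnityEngine;

public class EmissionDataLoader : MonoBehaviour
{
    [SerializeField] private TextAsset emissionsCsv; // assumed asset reference

    void Awake()
    {
        string[] rows = emissionsCsv.text.Split('\n');
        for (int i = 1; i < rows.Length; i++) // skip the header row
        {
            string[] cols = rows[i].Trim().Split(',');
            if (cols.Length < 3) continue; // malformed or empty row

            // Invariant culture so "3.5" is never misread on machines
            // that use a decimal comma.
            if (!float.TryParse(cols[2], NumberStyles.Float,
                                CultureInfo.InvariantCulture, out float ratio))
                continue; // missing or non-numeric value: skip, don't crash

            Debug.Log($"{cols[0]} ({cols[1]}): {ratio}");
        }
    }
}
```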

System Description

Features

  • Mixed-reality Setup, showcasing a virtual Earth model in the physical space
  • Real Data Integration with Unity, displaying CO2 emission ratios for countries worldwide across different years, with color-coded countries on the globe (illustrated by the sketch after this list)
  • Interactive Controls, using the controllers to rotate the globe and explore the data by year
  • Real-time Collaboration using Photon Unity Networking (PUN), enabling analysts to collaborate and discuss data trends in real time
  • Compatibility with Meta Quest headsets
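One straightforward way to realize the color coding mentioned above is to map each country's emission ratio through a Unity `Gradient`. The component below is a sketch under the assumption that every country has its own `Renderer`; the field names and scale cap are illustrative:

```csharp
using UnityEngine;

// Illustrative color coding: tint a country's material by its emission ratio.
public class CountryColorizer : MonoBehaviour
{
    [SerializeField] private Gradient emissionGradient; // e.g. green -> red
    [SerializeField] private float maxRatio = 20f;      // assumed scale cap

    public void Apply(Renderer countryRenderer, float emissionRatio)
    {
        // Normalize the ratio into 0..1 and look up the gradient color.
        float t = Mathf.Clamp01(emissionRatio / maxRatio);
        countryRenderer.material.color = emissionGradient.Evaluate(t);
    }
}
```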

Installation

This section outlines the steps to set up your environment for developing Android VR applications using Unity 2022.3.

Step One: Setting Up Unity Hub

  1. Download and install Unity Hub from the Unity download page.
  2. Open Unity Hub after installation.

Step Two: Installing Unity Editor and Required Modules

  1. In Unity Hub, navigate to the Installs tab and click on the Add button to install a new version of the Unity Editor.
  2. Select Unity Editor LTS version 2022.3.Xf1 or higher from the list of available versions. You can find the recommended versions on the Unity releases page.
  3. During the installation setup, make sure to include the following modules: Microsoft Visual Studio (for code editing) and Android Build Support (the libraries necessary for building Android applications).
  4. Follow the instructions to complete the Unity Editor installation.

Note: If you are only working with the pre-built project and do not need to modify the code, you may not need to install Visual Studio. However, installing it is recommended in case you need to troubleshoot or make adjustments to the code.

Step Three: Downloading the Emission Vision Repository

  1. Download the zip file from this GitHub Repository.
  2. Extract the zip file to a desired location on your disk.
  3. In Unity Hub, navigate to the Projects tab and click on the Add button.
  4. Select the folder where you extracted the project and add it to Unity Hub.

Step Four: Building and Deploying the Project to Your Headset

1. Connect the Headset:

  • Connect your VR headset to your computer using the appropriate cable.

2. Open Build Settings:

  • In Unity, go to File > Build Settings.

3. Switch to Android Platform:

  • In the Build Settings window, if the platform is not already set to Android, select Android from the list and click Switch Platform.

4. Select Run Device:

  • In the Build Settings window, find the Run Device dropdown menu.
  • Select your connected VR headset from the dropdown list.
  • If your headset is not listed, click the Refresh button and try again.

5. Enter Keystore Passwords:

  • If prompted for a password during the build process, go to File > Build Settings > Player Settings.
  • In the Player Settings panel, navigate to Player and scroll down to find Publishing Settings.
  • Enter the password 123456 in both the Project Keystore and Project Key fields.

6. Build and Run:

  • Back in the Build Settings window, click the Build and Run button to start building the project.
  • Unity will build the project and deploy it to your connected VR headset.

7. Verify Deployment:

  • Once the build process is completed, verify that the application is running correctly on your VR headset.

By following these steps, you will be able to build and deploy your VR project to your headset directly from Unity. Make sure your headset is properly connected and recognized by Unity before starting the build process.

Usage

  • To enter the scene, hold the left-hand controller in front of you to see the Shared Spatial Anchor Samples menu and choose Anchor Sharing Demo by pointing to it and pressing the trigger button on the right-hand controller.
  • To create a room, hold the left-hand controller in front of you and choose Create New Room by pointing to it and pressing the trigger button on the right-hand controller.
  • To join a room, hold the left-hand controller in front of you and choose Join Room by pointing to it and pressing the trigger button on the right-hand controller. Then, choose the room you want to join in a similar way.
  • To create a new anchor, select Create New Anchor by pointing to it and pressing the trigger button on the right-hand controller.
  • Press the trigger button to place the anchor.
  • Select Share Anchor, then align to it by choosing Align to Anchor, or align to an already existing anchor in the same way.
  • Press A on the right-hand controller to spawn the globe and the necessary assets.
  • To rotate the globe horizontally, use the right joystick on the controller.
  • To select a year, point towards the slider's little globe icon, press the grip button on the right-hand controller, and drag it to the desired year.
  • To select a country, point to the country on the globe with the right-hand controller and press the trigger button to view the CO2 emission ratio data for that country (you can select up to three countries to display on the panel); a sketch of how such selection could work under the hood follows this list.
  • To close a panel, point to the Close button on it with the right-hand controller and press the trigger.
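Pointing and pressing the trigger is typically implemented as a raycast from the controller. The sketch below shows what the country selection described above could look like with the `OVRInput` API from the Meta XR SDK; the component, tag, and field names are assumptions, not the project's actual code:

```csharp
using UnityEngine;

// Hypothetical country picking: raycast from the right controller on trigger press.
public class CountrySelector : MonoBehaviour
{
    [SerializeField] private Transform rightController; // assumed controller anchor

    void Update()
    {
        if (!OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger,
                              OVRInput.Controller.RTouch))
            return;

        Ray ray = new Ray(rightController.position, rightController.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, 10f) &&
            hit.collider.CompareTag("Country")) // assumed tag on country meshes
        {
            Debug.Log($"Selected country: {hit.collider.name}");
            // A real implementation would open the data panel here,
            // respecting the three-panel limit described above.
        }
    }
}
```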

Basic Use Case Scenario

  1. Put on your Meta Quest headset and launch the Emission Vision app.
  2. Create a new room or join an existing room.
  3. Either create and share a spatial anchor for the other users to align to, or wait for a spatial anchor to appear and align to it.
  4. Read the instructions panel.
  5. Scroll through the available years using the slider.
  6. Select the desired year to update the globe with CO2 emissions data for that period.
  7. Rotate the globe to find your desired country.
  8. Select one country to view more detailed information.
  9. Select another country to compare its data side-by-side.
  10. Close the country panels to go back to the instructions panel, or select new countries to compare.

References

Unity Assets

Documentation

Portfolio Video

  1. Cooking Sunday - 오준성
  2. Divination - Saint Of Sin
  3. 171101 (Instrumental) - Chace
  4. Imagine (Original Mix) - MaHi

Contributors