
Unity Technologies 
MDM Program | Industry Project

Creation of a robust UX framework and an intuitive VR tool prototype for filmmaking.

Project Type

User Experience, Interaction Design, Virtual Reality

Keywords

UX and User Research, User Experience Design, Interaction Design, User Interface Design, Usability, Visual Development, Virtual Reality, Remote Collaboration, Agile Methodology.

Background

Timeline: Jan 2020 - Apr 2020.

A project developed for Unity Technologies, an industry client of the Centre for Digital Media.

Team: Santiago Sotomayor, Alessandra Huang, Irene Sasaki, Rainie Han, Ali Shafiei, Michal Korek.

Overview

*The majority of this project was developed remotely.


In January 2020, Unity Technologies approached us with the idea of building an intuitive virtual reality tool for filmmakers.

 

This tool gives film production teams the ability to visualize a set, place props, cameras, and markers, and block out shot sequences, essentially building the entire production set in virtual reality and resolving potential conflicts before the set is constructed in the real world.

My Roles

UX/UI and Interaction Designer

  • Designed and conducted qualitative and quantitative research, planned and ran user tests, and translated their results into actionable insights and design decisions.

  • Wrote user-test and design documentation, and maintained a routine of rapid iteration on the design system.

  • My biggest challenge was to translate the results of our research and user tests, plus the requirements and constraints of the tool itself, into ergonomic, intuitive, and visually innovative designs.

Visual Development Artist and 3D Modeller

I was also responsible for driving the project's visual development:

  • Based on the client's and team's expectations, I designed and modelled all 3D assets implemented in Unity.

  • Created low- to high-fidelity mock-ups of the tool, highlighting features and interactions.

The Project

Video Duration: 7 minutes

For a better experience, please watch with sound ;)

Problem Statement

In the film industry, scouting physical sets to plan scenes and shots can be extremely expensive for several reasons:

  • Physical sets need to be constructed before shots can be planned, which is a pipeline bottleneck and often results in expensive set changes.

  • Shots planned without a virtual set, or even a 3D set, don’t convey what shooting on the actual set will be like, and don’t allow the director to immersively move around the scene and control a camera.

  • The previz phase doesn’t require many people but is very long. 

Assumption

An intuitive VR tool can streamline the process of iterating and communicating complex cinematic shots.

Our Solution | Value Proposition

Development of a robust UX framework based on Unity Editor XR conventions and an intuitive VR camera tool prototype in Unity Engine, to be used for set scouting, shot planning, and pre-visualization. This tool will allow filmmakers, especially the Director of Photography (DP), to discover and convey their vision clearly to the rest of the stakeholders, saving time and money.

Discovery Phase

Broad research was conducted, focused on ways to visualize an environment, place markers and cameras, and block out shot sequences. This was followed by an investigation of how to represent a timeline feature, different cameras, and dolly-track placement in a virtual reality environment.

Film Production | Research

The goal was to get a general view of the production pipeline during filmmaking. The research looked deeply into the different roles on a film set, their departments, and their responsibilities. Once the roles were mapped, we focused on the Photography, Lighting, and Director departments and their workflows.


Existing Software & Tools | Research

This phase identified the main tools and software with purposes similar to the one we intended to develop, and compared their pros and cons. Two important aspects of these tools were analyzed: the overall user experience and the user interface.


Competitive analysis: camera workflows of Tvori and AnimVR.


The pros and cons of existing software were mapped, for comparison purposes.

Camera Operation | Research

The research goal was to understand how the most popular cameras on the market are used by professionals. It was important to identify the different interfaces of these cameras in order to develop an intuitive and innovative user experience that took industry conventions into consideration.


User Research

A survey aimed at professional filmmakers and filmmaking students was conducted. In parallel, 1-on-1 interviews were held to better understand the point of view of the target audience. The main goal was to pinpoint the most meaningful features that could be implemented in a VR environment.

Planning & Development

The research served as a foundation for shaping the project's broad vision. A big-vision framework was structured around the roles performed by the Photography, Lighting, and Director departments, and their specific workflows.

Based on the film tools and camera operation research, the core features were defined: camera, camera viewfinder, snapshot, and storyboard.


Camera Workflow and Loop

Once the big-vision framework was defined, the next step was to plan the feature infrastructure of the VR camera tool, the main focus of the project.

All users would have access to a Camera Workspace: similar to a workstation, containing the tools and features they would need to create and plan a film shot.

 

From the Camera Workspace, users are able to grab and place a Camera Tool in the environment, using as many Camera Tools as they need.

 

The Camera Tool functions like a real camera, allowing the user to move and tilt it to frame shots, take snapshots, and adjust its settings (lens and aspect ratio).


IA diagrams for the Camera workspace and the Camera tool.
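As a thought experiment, the IA described above can be sketched in code. This is a minimal, hypothetical model only; all class and field names are illustrative and do not come from the actual Unity implementation.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the Camera Workspace / Camera Tool IA.
# Names and defaults are illustrative, not from the real project.

@dataclass
class CameraSettings:
    lens_mm: int = 35          # focal length
    aspect_ratio: str = "16:9"

@dataclass
class CameraTool:
    name: str
    settings: CameraSettings = field(default_factory=CameraSettings)

class CameraWorkspace:
    """The 'workstation' holding every camera placed in the scene."""
    def __init__(self):
        self.cameras = []

    def place_camera(self, name):
        # Users can grab and place as many Camera Tools as they need
        cam = CameraTool(name)
        self.cameras.append(cam)
        return cam

    def select(self, name):
        # Remote selection: any placed camera is accessible from the workspace
        return next(c for c in self.cameras if c.name == name)

ws = CameraWorkspace()
ws.place_camera("Cam A")
ws.place_camera("Cam B")
ws.select("Cam B").settings.lens_mm = 50   # adjust a placed camera remotely
```

The sketch captures the one-to-many relationship between the workspace and its cameras, and the "remote control" idea: settings are edited through the workspace rather than at the camera itself.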

A camera workflow was created, aimed at an intuitive journey requiring the fewest possible steps for the user to complete (*compared to existing software with similar purposes, as researched in the discovery phase).


Design Evolution

Several iterations of the tool were made:

1. Clipboard / Call Sheet Workspace

As one of the target users for the VR Camera Tool is the director, one idea was to present users with a Call Sheet: a well-known element for professionals in the film industry. The Call Sheet, like the toolbox workspace, would contain all the elements and features for the user to interact with.


2. Toolbox Workspace

The toolbox iteration was designed around the three most relevant feature groups in filmmaking pre-production and pre-visualization:

a) Camera; b) Props; c) Lights.


3. Camera Workspace

The Camera Workspace was designed as a workstation containing all the tools the user would need to plan the shooting of a scene. 

 

It had three main functionalities:

 

a) Create and place Camera tools: users can grab and place one or more cameras in the environment.

b) Preview existing cameras placed in the environment: users can consult each camera's preview and settings on the large screen (viewfinder).

c) Remote control of all cameras placed in the environment: by selecting an existing camera, users can manipulate its settings from a distance.


  Camera Workspace + Settings  

User Tests

A series of user tests was conducted with three main goals:

  1. Validate assumptions about the team's design choices and find the experience's weak points, in order to iterate further.

  2. Confirm that the camera tool's designed workflow was intuitive enough for users.

  3. Confirm that users understood they could create and work with multiple cameras inside the virtual environment.


User Test Takeaways & Design Insights


Before:

1. The Camera tool's visuals did not resemble a real-life camera. Most users mistook the Camera's inactive mode for a Snapshot and, as a result, didn't explore the camera's features properly.


Camera tool (inactive mode)


Camera tool (selected & expanded mode)

2. The Camera tool's "move" feature wasn't clear. It was represented by a small camera icon that had to be selected and expanded to reveal the available movement options (move and tilt).


Camera "move" options: moving horizontally and vertically, and tilting the camera using a handle. 

After:

1. The Camera tool visuals were updated to resemble a real-life camera.

 

2. The Snapshot feature was incorporated inside the camera user interface, and a Record feature was added.

3. The Camera "move" option was redesigned as a Tripod and a Handle, incorporated as parts of the Camera Tool itself:

a) by grabbing the Handle, users could move the camera horizontally and vertically.

b) by using the Tripod, users could freely tilt the camera to find better shooting angles.


Camera tool (inactive mode)


Camera tool (selected & expanded mode)

Comparison:

The Camera tool's selected & expanded mode, before and after the design iterations.

Final Product

The team worked within an Agile environment and methodology, alternating quick sprints of development, user tests, and design iterations. This process resulted in the final design of the tool:

Camera Workspace:

The control center of the VR Camera Tool. By using the Camera Workspace, users are able to:

  • Place multiple cameras in the virtual environment

  • Select and access the multiple cameras placed in the environment

  • Manipulate the settings and parameters of placed cameras remotely


Camera Tool:

One of the main components of the VR Filmmaking Tool:

  • Contains all the settings and configurations needed during the film pre-visualization stage (in a retractable panel)

  • Features a tripod and handle that allow an experience very close to real life

  • Offers easy-to-use Record and Snapshot features, similar to a physical camera


Snapshot Feature:

Users can take as many Snapshots as needed to plan a scene or build a storyboard. Each snapshot displays the settings and configuration of the camera that took it, for further consultation if needed.


Storyboard Feature:

Users can use the taken snapshots to build a storyboard or rearrange an existing one. This feature allows users to quickly create and iterate on ideas, or simply refine shot-planning sequences in advance of a real-life shoot.

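The Snapshot/Storyboard relationship can be sketched the same way: each snapshot freezes the settings of the camera that took it, and a storyboard is just an ordered, rearrangeable sequence of snapshots. This is a hypothetical illustration; field names are not the project's actual data model.

```python
from dataclasses import dataclass

# Hypothetical sketch of the Snapshot/Storyboard features described above.

@dataclass(frozen=True)
class Snapshot:
    camera_name: str
    lens_mm: int
    aspect_ratio: str   # each snapshot keeps the settings of the camera that took it

class Storyboard:
    def __init__(self):
        self.frames = []

    def add(self, snap):
        self.frames.append(snap)

    def rearrange(self, src, dst):
        # Move a frame to a new position, as when dragging snapshots in VR
        self.frames.insert(dst, self.frames.pop(src))

board = Storyboard()
board.add(Snapshot("Cam A", 35, "16:9"))
board.add(Snapshot("Cam B", 50, "2.39:1"))
board.rearrange(1, 0)   # the Cam B shot now opens the sequence
```

Making snapshots immutable (frozen) mirrors the design intent: a snapshot is a record of a moment, while all the iteration happens at the storyboard level by reordering frames.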

And here's our Prototype:

Teamwork 

We worked hard, learned so much together and had lots of fun. 

It was an amazing opportunity to work for a client of such caliber as Unity Technologies, and to be part of an incredibly talented team of UX & interaction designers and developers. 
