IMG_9559.JPG

Mixed Reality

Mixed Cuts

Mixed Cuts is a collaborative video development tool combining the advantages of tangible interaction and mixed reality to aid exploration of video sequencing and editing in groups. The system utilises software tracking of image targets attached to physical tokens. The modular, physical interface is arranged on a table, tracked by a camera and augmented with a projector.

Users accomplish editing tasks by interacting with the tokens; the system maps these physical manipulations to digital commands. Functionality includes sequencing, temporal prolongation, volume adjustment and filtering. Our implementation improves upon alternative interfaces by pairing tangible, whole-body interaction in real-world contexts with the powerful flexibility of mixed reality.
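The mapping from physical manipulation to digital command can be sketched as a simple dispatch on the tracked token's pose. The sketch below is illustrative only, in Python rather than the project's C#, and all names, units and thresholds (slot width, rotation-to-gain mapping) are our own assumptions, not the project's actual values.

```python
# Illustrative sketch (hypothetical names and thresholds): mapping a
# tracked token's pose on the table to an editing command.

def token_to_command(token_id, x_on_table, rotation_deg):
    """Map a token's tracked pose to an editing command.

    - A clip token's horizontal position (metres along the table)
      selects its slot in the sequence (assumed 0.2 m per slot).
    - Rotating a volume token adjusts gain between 0.0 and 1.0.
    """
    if token_id == "clip":
        slot = int(x_on_table // 0.2)   # 0.2 m of table per sequence slot
        return ("sequence", slot)
    if token_id == "volume":
        gain = max(0.0, min(1.0, rotation_deg / 360.0))
        return ("set_volume", gain)
    return ("noop", None)
```

In the real system this dispatch runs every frame against the poses reported by the image-target tracker, so sliding or rotating a token updates the edit continuously.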

 

Prototyping Process

IMG_8239.JPG
3d.jpg

Step 1  Sketch

Initially, the tokens were designed to be much smaller. However, because the camera resolution limited how small an image target could be reliably tracked, the size of the image targets, and hence of the physical tokens, had to be increased. The token sizes are shown above.
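The relationship between camera resolution and minimum token size can be estimated with a back-of-the-envelope calculation. The figures below (table width, camera resolution, pixels required per target) are our own illustrative assumptions, not measurements from the project:

```python
# Rough sketch (hypothetical numbers): the smallest token the camera can
# track is the one that still covers min_target_px pixels when the camera
# frames the full table width.

def min_token_size_cm(table_width_cm, camera_px, min_target_px):
    """Smallest token width (cm) covering min_target_px pixels."""
    return table_width_cm * min_target_px / camera_px

# e.g. a 100 cm table imaged at 1280 px, needing ~100 px per target:
# min_token_size_cm(100, 1280, 100) ≈ 7.8 cm
```

A lower camera resolution therefore forces proportionally larger image targets, which is why the token designs had to grow from the initial sketches.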

Step 2  3D models

123D Design was used to create the 3D models and to calculate how much material was required. Two fabrication options were available: 3D printing and laser cutting. We tried both techniques and compared them according to robustness, ease of handling and reusability.

 
8240.jpg

Step 3  Fabrication

Ultimately, we found that laser cutting was superior as 3D printed tokens proved to be too fragile to manipulate. The material used for laser cutting, medium-density fibreboard (MDF), was thus not only easier to handle for the user and more robust but also easier to cut and process. Additionally, as we designed the tokens iteratively, MDF could be re-cut and reused whereas the plastic used for 3D printing could not.

IMG_8238.JPG

Step 4  Integration

Based on our evaluation, all the tokens were manufactured using laser cutting. Inkscape was used to define the trajectory of the laser. After cutting, the pieces were polished and then assembled using wood glue.

 
The Unity scene, including the clip (1), the clip slider (2), the edit button (3), the volume slider (4), a filter (5) and a reference point (6) used for the crop start and crop end points (7).

Step 5  Augmented reality development

The first Unity plug-in, Vuforia, provided the augmented-reality functionality: we created our graspable 3D objects in Unity and represented them as tracked image targets. A second plug-in, AVPro, provided basic media playback controls such as play, rewind, fast-forward and playback-speed adjustment. Additional functionality was implemented in custom scripts, written in C#, that can be attached to Unity 3D objects to manipulate their behaviour. The final version of the project can be seen in the figure above.
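The pattern of attaching behaviour scripts to objects, as Unity does with C# components on GameObjects, can be sketched as follows. This is an illustrative Python analogue with hypothetical class names, not the project's actual C# code:

```python
# Illustrative sketch (hypothetical names): a Unity-style component system
# where behaviour scripts are attached to objects and run each frame.

class Behaviour:
    """Base class for a script attached to an object (cf. MonoBehaviour)."""
    def __init__(self, obj):
        self.obj = obj

    def update(self, dt):
        pass

class FollowTrackedTarget(Behaviour):
    """Keep the virtual object's rotation in sync with its image target."""
    def update(self, dt):
        self.obj.rotation = self.obj.tracked_rotation

class TrackedObject:
    """A scene object whose pose is updated by the tracker each frame."""
    def __init__(self):
        self.rotation = 0.0
        self.tracked_rotation = 0.0
        self.behaviours = []

    def attach(self, behaviour_cls):
        behaviour = behaviour_cls(self)
        self.behaviours.append(behaviour)
        return behaviour

    def update(self, dt):
        # Called once per frame; each attached script gets to run.
        for behaviour in self.behaviours:
            behaviour.update(dt)
```

Attaching several small behaviours to one object keeps each piece of functionality (playback control, volume, filtering) in its own script, which is how the custom C# scripts were organised in the Unity project.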