Friday, March 11, 2016

Process Updates

Artefacts

The capabilities of the artefacts have been further defined with a view to achieving a functional prototype for testing in the next week. Each artefact will house an Arduino Nano microcontroller connected to a wireless transceiver so that it can communicate seamlessly with the projection system. The transmitted data will be derived from a combined gyroscope, magnetometer, and accelerometer, as well as a capacitive touch sensor that determines whether participants are in contact with each artefact.
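As a rough sketch of what each artefact might transmit, the sensor data above could be bundled into a fixed packet before going over the wireless link. The struct fields, scaling, and function names here are illustrative assumptions, not the project's actual protocol:

```cpp
#include <cstdint>
#include <cstring>
#include <cstddef>

// Hypothetical wire format for one artefact reading: 9-DOF IMU plus touch flag.
struct ArtefactPacket {
    uint8_t id;        // which artefact sent this reading
    int16_t gyro[3];   // raw gyroscope axes
    int16_t accel[3];  // raw accelerometer axes
    int16_t mag[3];    // raw magnetometer axes
    uint8_t touched;   // 1 if the capacitive sensor reports contact
};

// Serialize to a byte buffer for transmission over the wireless link.
size_t packPacket(const ArtefactPacket& p, uint8_t* buf) {
    std::memcpy(buf, &p, sizeof(p));
    return sizeof(p);
}

// Reconstruct the reading on the receiving side.
ArtefactPacket unpackPacket(const uint8_t* buf) {
    ArtefactPacket p;
    std::memcpy(&p, buf, sizeof(p));
    return p;
}
```

A fixed-size binary packet like this keeps the radio payload small and lets the receiver parse readings without any framing logic beyond the packet length.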

This technology will be embedded within wooden objects of various shapes and sizes. A copper inlay on the surface will define the capacitive touch areas and act as an antenna for wireless transmission.


Application

Development is continuing on the application that controls the projection system, built with openFrameworks. The current base model is a 3D environment with a video texture. Since no existing library can generate a spherical or cylindrical projection from the virtual camera, a solution has been devised using an array of cameras, each rotated slightly from the last. Each camera renders to a slice, and the slices are recombined into a single output that simulates a panoramic projection.

A physics simulation has been added to the vantage point of the cameras, allowing input to push the spatial representation around and give the sensation of motion.

A serial input has been added to allow an Arduino to relay the wireless transmissions received from the artefacts to the projection system.
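One simple way to structure that bridge is for the receiving Arduino to forward each packet as a comma-separated ASCII line, which the openFrameworks application then parses. The "id,touched" line format below is an assumption for illustration, not the project's actual serial protocol:

```cpp
#include <string>
#include <sstream>

// Hypothetical parsed event from one serial line of the form "id,touched".
struct ArtefactEvent {
    int id = -1;
    bool touched = false;
};

// Parse a line received over serial; returns false on malformed input
// so noise on the line can be safely skipped.
bool parseSerialLine(const std::string& line, ArtefactEvent& out) {
    std::istringstream ss(line);
    char comma;
    int id, touch;
    if ((ss >> id >> comma >> touch) && comma == ',') {
        out.id = id;
        out.touched = (touch != 0);
        return true;
    }
    return false;
}
```

A line-based ASCII format is easy to debug with a serial monitor during prototyping, at the cost of a few extra bytes per message compared to a binary framing.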

Space

A space has been confirmed for the exhibition of the project from May 27 - 28 in a warehouse location in Brooklyn. The thesis show will provide an opportunity to demonstrate the project before its final implementation.

One consequence of such a large space is that it will necessitate a surrounding experience to enhance the installation. To this end we are currently seeking to curate several other projects to exist alongside it.

Candidacy Review Slides