Sunday, February 28, 2016

Projection Mapping UTD ATEC Project 2 : Classroom Mapping
by:
Phillip Baker, Ayaz Ismail, Nathaniel Propp

Concept:

For this work we knew we wanted to focus on movement driven by sound. When considering designs for our geometric relief sculpture, the keyword we decided on was radiate. We created a central base form and several corresponding smaller pieces that not only radiate off the body but also direct the viewer's eye around the piece. In the previous assignment our design was driven by the shape of the human eye, and I believe that persisted through to this composition as well.

Once our structure was completed, we created several different presets to help us explore new tools and concepts.



Process:

The first step in this assignment was to create the relief sculpture. Phillip has extensive experience in building forms using Pepakura. We first started with some sketches, then created models in Maya and eventually Rhino. In 3D space we put our model through another iteration of design. We manipulated some points and faces to create a form that plays on the expectation of symmetry, but its asymmetrical realization really encourages the viewer's eye to move around the piece.

Part of the form is the offshooting segments that radiate and imply rotational movement. To get these just right, we used the projector and Photoshop to trace the outline of the body and pinpoint the center pivot on the wall where the relief sculpture would hang. We then only needed to trace one radiated segment, duplicate it, and rotate the outline in discrete steps to evenly space the consecutive radiated segments. Using nails, we were then able to mount the relief sculpture easily and with great accuracy.
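The duplicate-and-rotate step can be sketched in code. This is just an illustration of the geometry, not what we ran in Photoshop; the outline coordinates, pivot, and segment count below are hypothetical:

```python
import math

def rotate_point(point, pivot, angle_deg):
    """Rotate a 2D point around a pivot by the given angle in degrees."""
    angle = math.radians(angle_deg)
    dx, dy = point[0] - pivot[0], point[1] - pivot[1]
    return (pivot[0] + dx * math.cos(angle) - dy * math.sin(angle),
            pivot[1] + dx * math.sin(angle) + dy * math.cos(angle))

def radiate(outline, pivot, count):
    """Duplicate one traced segment outline into `count` copies,
    rotated in even steps around the central pivot."""
    step = 360.0 / count
    return [[rotate_point(p, pivot, i * step) for p in outline]
            for i in range(count)]

# One traced segment (hypothetical coordinates) duplicated six ways.
segments = radiate([(0.0, 100.0), (20.0, 140.0), (-20.0, 140.0)],
                   pivot=(0.0, 0.0), count=6)
```

Because every copy is a rotation of the same traced outline, the spacing between consecutive segments comes out exactly even, which is hard to achieve by eye.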

The next step was conducting a spatial scan of the sculpture using MadMapper and a DSLR camera. This gave us an ideal shot of the piece from the projector's point of view, so we could easily compose content at home, away from the actual piece.

UV unwrapping the model was simple enough and allowed us to use the UVs as the mask. This enabled effective automatic masking that saved time and was very accurate.

At this point we were able to start gathering resources. Utilizing MadMapper's presets, we were all able to create several different compositions and easily switch between them.

After some content was generated, we imported it directly into MadMapper. Other content was combined and composed in Modul8, which gave us a lot of freedom to move, loop, color, filter, and manipulate our content to a whole new degree. This was then fed into MadMapper through Syphon.

With Ayaz's expertise we were able to incorporate live sound. Using a microphone, we analyzed ambient sound or music and used it to drive some of our movement. Mapping speed and color to values such as bass and midtones really brought the piece to life.
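The idea behind driving motion from bass and midtones can be sketched as follows. This is not how MadMapper or Modul8 implements it internally; the band ranges, sample rate, test tone, and the speed mapping are illustrative assumptions, and a real setup would use a proper FFT library on the live microphone buffer:

```python
import cmath
import math

def band_levels(samples, sample_rate, bands):
    """Naive DFT: return the average magnitude within each (lo, hi) Hz band."""
    n = len(samples)
    spectrum = [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) / n
                for k in range(n // 2)]
    levels = []
    for lo, hi in bands:
        # Bin k corresponds to frequency k * sample_rate / n.
        ks = [k for k in range(n // 2) if lo <= k * sample_rate / n < hi]
        levels.append(sum(spectrum[k] for k in ks) / len(ks) if ks else 0.0)
    return levels

# Hypothetical input: a 110 Hz "bass" tone sampled at 8 kHz.
rate, n = 8000, 256
buf = [math.sin(2 * math.pi * 110 * t / rate) for t in range(n)]
bass, mid = band_levels(buf, rate, [(20, 250), (250, 2000)])

# Map the bass level to an animation speed and the mids to a color shift.
speed = 0.5 + 2.0 * bass
hue_shift = mid
```

With a low tone in the buffer, the bass band dominates the mid band, so the speed parameter rises while the color shift stays near its baseline; in the installation, those parameters drove the movement and coloring of the projected content.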

Finished Work




Additional Thoughts

We had a really difficult time processing live sound within Modul8, which was disappointing because so many of the elements could have been driven by the sound levels. In the future we hope to solve this problem to create dynamic performances.
