Wednesday, January 15, 2014

Mars needs models

Custom translator: NASA Mars Orbiter Laser Altimeter (MOLA) data to 3D printing data.
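The translator itself isn't posted here, but the gist is simple enough to sketch, assuming the MOLA elevations have already been exported as a plain-text grid of samples; the file name, resolution, and scale factors below are made up for illustration.

// Minimal sketch: turn a gridded elevation map (one float per sample, row-major,
// whitespace-separated) into an ASCII STL height-field mesh for 3D printing.
// "mola_grid.txt", the grid dimensions, and the scale factors are hypothetical.
#include <cstdio>
#include <vector>
#include <array>

int main() {
    const int W = 512, H = 512;          // assumed grid resolution
    const float xyScale = 0.2f;          // mm per sample (assumed)
    const float zScale  = 0.0005f;       // mm per meter of elevation (assumed)

    std::vector<float> z(W * H);
    FILE* in = std::fopen("mola_grid.txt", "r");
    if (!in) return 1;
    for (float& v : z) std::fscanf(in, "%f", &v);
    std::fclose(in);

    FILE* out = std::fopen("mola_tile.stl", "w");
    std::fprintf(out, "solid mola\n");
    // Normals are left at zero; most slicers recompute them anyway.
    auto tri = [&](std::array<float,3> a, std::array<float,3> b, std::array<float,3> c) {
        std::fprintf(out, " facet normal 0 0 0\n  outer loop\n");
        for (auto& p : {a, b, c})
            std::fprintf(out, "   vertex %f %f %f\n", p[0], p[1], p[2]);
        std::fprintf(out, "  endloop\n endfacet\n");
    };
    for (int y = 0; y + 1 < H; ++y)
        for (int x = 0; x + 1 < W; ++x) {
            auto P = [&](int i, int j) {
                return std::array<float,3>{i * xyScale, j * xyScale, z[j * W + i] * zScale};
            };
            tri(P(x, y), P(x + 1, y), P(x + 1, y + 1));   // two triangles per grid cell
            tri(P(x, y), P(x + 1, y + 1), P(x, y + 1));
        }
    std::fprintf(out, "endsolid mola\n");
    std::fclose(out);
    return 0;
}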




3D Prints 



Friday, January 10, 2014

Software-controlled DMX portrait lighting with Kinect-based automatic positioning and a 12-camera Nikon D800E satellite

The goal: to create a portable lighting system able to capture diffuse, specular, and normal texture maps with even, shadowless lighting, while also capturing a 3D model of the actor or subject, with or without augmented structured light from a satellite of five projectors. Add in enough software control of the lighting to create a mesoscopic normal map.

I spec'ed and designed the rig, and wrote the software that drives the cameras, projectors, and lighting, as well as an automated Kinect-based trigger that detects the moment when the subject is in position and holding still.
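The Kinect trigger amounts to a per-frame test along these lines. This is a sketch only: the capture volume, thresholds, and depth format (16-bit millimeter depths) are illustrative assumptions, and the actual Kinect acquisition and camera firing are left out.

// Sketch of the "in position and holding still" test, run once per Kinect
// depth frame. Thresholds and capture volume are illustrative, not the
// values used on the rig.
#include <vector>
#include <cstdint>
#include <cstdlib>

// Returns true once the subject has filled the capture volume and held still
// for enough consecutive frames. stillFrames is caller-owned state.
bool subjectReady(const std::vector<uint16_t>& prev,
                  const std::vector<uint16_t>& cur,
                  int& stillFrames) {
    const uint16_t NEAR_MM = 900, FAR_MM = 1400;   // assumed capture volume
    const long     MIN_PIXELS_IN_ZONE = 20000;     // "in position"
    const double   MAX_MEAN_DELTA_MM = 4.0;        // "holding still"
    const int      STILL_FRAMES_NEEDED = 30;       // about one second at 30 fps

    long inZone = 0, deltaCount = 0;
    double deltaSum = 0.0;
    for (size_t i = 0; i < cur.size(); ++i) {
        if (cur[i] < NEAR_MM || cur[i] > FAR_MM) continue;
        ++inZone;
        if (prev[i] >= NEAR_MM && prev[i] <= FAR_MM) {
            deltaSum += std::abs(int(cur[i]) - int(prev[i]));
            ++deltaCount;
        }
    }
    bool inPosition = inZone > MIN_PIXELS_IN_ZONE;
    bool still = deltaCount > 0 && (deltaSum / deltaCount) < MAX_MEAN_DELTA_MM;
    stillFrames = (inPosition && still) ? stillFrames + 1 : 0;
    return stillFrames >= STILL_FRAMES_NEEDED;
}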






Mastery of the Nikon SDK was needed to control the cameras, but USB timing limitations still required a separate microcontroller-based hardware trigger to perform the half- and full-press with precise timing.
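The firmware itself isn't published here, but the idea is small enough to sketch in Arduino-flavored C++, assuming the remote-release focus and shutter lines are each closed through an optocoupler; the pin numbers and delays are placeholders, not the values on the actual rig.

// Minimal Arduino-flavored sketch of a hardware half/full-press trigger.
const int FOCUS_PIN   = 2;   // optocoupler -> remote release "half press" line
const int SHUTTER_PIN = 3;   // optocoupler -> remote release "full press" line
const int GO_PIN      = 4;   // incoming "fire now" signal from the host software

void setup() {
  pinMode(FOCUS_PIN, OUTPUT);
  pinMode(SHUTTER_PIN, OUTPUT);
  pinMode(GO_PIN, INPUT);
}

void loop() {
  if (digitalRead(GO_PIN) == HIGH) {
    digitalWrite(FOCUS_PIN, HIGH);     // half press: wake, meter, focus
    delay(300);                        // give AF/metering time to settle
    digitalWrite(SHUTTER_PIN, HIGH);   // full press: release the shutter
    delay(100);                        // hold long enough to register
    digitalWrite(SHUTTER_PIN, LOW);
    digitalWrite(FOCUS_PIN, LOW);
    while (digitalRead(GO_PIN) == HIGH) {}  // wait for the host to drop the line
  }
}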



Thursday, January 9, 2014

Software dev - automated digital asset management (DAM)


The problem: even the fastest 3D data viewer is brought to its knees by dense scan data, making DAM difficult without a layer of indirection.
The solution: an automated preview generator that opens each asset in a "live" manipulable window, allowing rotation, scaling, and lighting changes before committing to a 2D bitmap version for use in the cataloging software.
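The preview tool itself isn't reproduced here, but the final "commit to a 2D bitmap" step boils down to something like this sketch: splat a decimated copy of the scan orthographically into a small grayscale PPM, with depth standing in for the lighting control. The Point struct, image size, and shading are placeholders.

// Sketch of the commit step: orthographically splat a (decimated) point cloud
// into a small grayscale PPM thumbnail for the cataloging software.
#include <cstdio>
#include <vector>
#include <algorithm>

struct Point { float x, y, z; };   // scan point (assumed already decimated)

void writePreview(const std::vector<Point>& pts, const char* path,
                  int w = 256, int h = 256) {
    // Fit the cloud into the image.
    float minX = 1e9f, maxX = -1e9f, minY = 1e9f, maxY = -1e9f,
          minZ = 1e9f, maxZ = -1e9f;
    for (const Point& p : pts) {
        minX = std::min(minX, p.x); maxX = std::max(maxX, p.x);
        minY = std::min(minY, p.y); maxY = std::max(maxY, p.y);
        minZ = std::min(minZ, p.z); maxZ = std::max(maxZ, p.z);
    }
    std::vector<unsigned char> img(w * h, 0);
    std::vector<float> depth(w * h, -1e9f);
    for (const Point& p : pts) {
        int u = int((p.x - minX) / (maxX - minX + 1e-6f) * (w - 1));
        int v = int((p.y - minY) / (maxY - minY + 1e-6f) * (h - 1));
        int i = (h - 1 - v) * w + u;
        if (p.z > depth[i]) {          // keep the nearest point per pixel
            depth[i] = p.z;
            // Simple depth-based shading stands in for the lighting control.
            img[i] = (unsigned char)(40 + 215 * (p.z - minZ) / (maxZ - minZ + 1e-6f));
        }
    }
    FILE* f = std::fopen(path, "wb");
    std::fprintf(f, "P5\n%d %d\n255\n", w, h);
    std::fwrite(img.data(), 1, img.size(), f);
    std::fclose(f);
}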


Avengers 2 cross polarization camera + filter control

A fast but stable (vibration-free) polarization angle switcher. The embedded microcontroller also fires the camera(s). By taking a continuous sequence of several shots, two frames that match closely can be selected.


Truth be told, I had about five hours to design and build this contraption before leaving LA for the Manhattan photo studio where the actors would converge the next day.


This allows capturing diffuse and specular skin reference images without using a beam splitter (and the halving of light intensity that comes with it).
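The build was a one-off, but the control loop amounts to something like this Arduino-flavored sketch, assuming a hobby servo rotates the analyzing filter and an optocoupler closes the cable release; pins, angles, and delays are placeholders.

// Rotate the analyzing polarizer between parallel and crossed positions,
// wait for vibration to settle, then fire the camera through an optocoupler.
#include <Servo.h>

const int SERVO_PIN    = 9;
const int SHUTTER_PIN  = 3;    // optocoupler to the cable-release port
const int PARALLEL_DEG = 0;    // filter aligned with the light's polarizer
const int CROSSED_DEG  = 90;   // filter crossed: specular reflection removed

Servo polarizer;

void fireShutter() {
  digitalWrite(SHUTTER_PIN, HIGH);
  delay(80);
  digitalWrite(SHUTTER_PIN, LOW);
}

void setup() {
  pinMode(SHUTTER_PIN, OUTPUT);
  polarizer.attach(SERVO_PIN);
}

void loop() {
  // Continuous alternating sequence; closely matched parallel/crossed pairs
  // are selected afterward.
  polarizer.write(PARALLEL_DEG);
  delay(400);                  // let the mount stop vibrating
  fireShutter();
  delay(300);                  // exposure + buffer time
  polarizer.write(CROSSED_DEG);
  delay(400);
  fireShutter();
  delay(300);
}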

Wednesday, January 8, 2014

Modeling the entire Enterprise... from the inside

When ILM needed to simulate artificial gravity going haywire on the Enterprise, I was tasked with modeling the Star Trek set from the transporter room to the bridge. I used a Leica LIDAR scanner, which rapidly captures an environment, placing the device at several dozen locations along the soundstage interior. The end result was one of the largest, most detailed movie models yet created.









Z buffer based 3D outputs

Working under Dr. Catmull's advice, I created a method of deriving 3D models from RenderMan Z-buffer output instead of the traditional polygon input. In this way, details that are only seen upon rendering, such as fur, displacement, subdivision, and procedurally generated features, would be included in the 3D printer output.
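The RenderMan side isn't reproduced here, but the downstream step reduces to something like this sketch, assuming the Z buffer has been dumped as raw 32-bit floats, row-major; the file names, resolution, and background threshold are placeholders.

// Triangulate a rendered Z buffer into an OBJ height-field mesh, skipping
// background pixels.
#include <cstdio>
#include <vector>

int main() {
    const int W = 640, H = 480;          // assumed render resolution
    const float BG = 1e6f;               // depths at/above this are background

    std::vector<float> z(W * H);
    FILE* in = std::fopen("frame.zbuffer.raw", "rb");
    if (!in) return 1;
    std::fread(z.data(), sizeof(float), z.size(), in);
    std::fclose(in);

    FILE* out = std::fopen("frame_relief.obj", "w");
    std::vector<int> vid(W * H, 0);      // 1-based OBJ vertex ids, 0 = no vertex
    int next = 1;
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            float d = z[y * W + x];
            if (d >= BG) continue;       // skip empty pixels
            std::fprintf(out, "v %f %f %f\n", (float)x, (float)(H - y), -d);
            vid[y * W + x] = next++;
        }
    // Two triangles per pixel quad whose four corners all hit geometry.
    for (int y = 0; y + 1 < H; ++y)
        for (int x = 0; x + 1 < W; ++x) {
            int a = vid[y * W + x],       b = vid[y * W + x + 1];
            int c = vid[(y + 1) * W + x], d = vid[(y + 1) * W + x + 1];
            if (a && b && c && d) {
                std::fprintf(out, "f %d %d %d\n", a, b, d);
                std::fprintf(out, "f %d %d %d\n", a, d, c);
            }
        }
    std::fclose(out);
    return 0;
}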











And the Stupid RAT Trick that sparked it all (no longer archived by Pixar):

Lucasfilm archives












ARM game development for Nintendo DS

With the notion of creating simple one-dollar apps for the Nintendo app store, I developed a how-to-draw series of games that would feature a variety of licensed characters. Drawing would be easier with Nintendo's stylus than with fingers on a phone screen. Unfortunately, Nintendo limits its store to large, established game developers.


Tuesday, January 7, 2014

Reverse engineering movies

For a Tech Museum of Innovation exhibit, I reverse-engineered the pod from 2001: A Space Odyssey, creating 3D blueprints that fed a Viper 3D printer.





Sunday, January 5, 2014

Slitscan as a reference tool?


A continuous bead of transcontinental freight trains laces directly through the middle of the city of Plano, Illinois, never dipping below 100 KPH, in seemingly contemptuous disregard for this tiny heartland town set in the cornfields of the Midwest. Since we were committed to capturing reference data of the town for a 2013 superhero movie, I had the spontaneous idea to record a westbound with a 5D Mark II and feed the footage into an excitedly self-coded program on the laptop I used for archiving reference and texture photography. The result is a very, very many-megapixel image of an entire train, shot with what is essentially a computational lens that is orthographic along the horizontal axis.
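The original program isn't posted here, but the idea reconstructs in a few lines of OpenCV (a sketch, not the original code): grab the same one-pixel-wide column from every frame of the clip and concatenate the strips side by side. The file name and slit position are placeholders, and depending on the train's direction the strip order may need reversing.

// Slitscan sketch: one column per video frame, concatenated into an image
// that is orthographic along the direction of motion.
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    cv::VideoCapture cap("train_westbound.mov");
    if (!cap.isOpened()) return 1;

    std::vector<cv::Mat> strips;
    cv::Mat frame;
    while (cap.read(frame)) {
        int slitX = frame.cols / 2;              // sample the center column
        strips.push_back(frame.col(slitX).clone());
    }
    if (strips.empty()) return 1;

    cv::Mat panorama;
    cv::hconcat(strips, panorama);               // one column per frame, side by side
    cv::imwrite("train_slitscan.png", panorama);
    return 0;
}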

Friday, January 3, 2014

Open Camera Controller

In my daily routine I often find myself having to do extensive photographic documentation of objects or locations - only there is often someone waiting to turn off the lights, or move props, or any of a myriad of possible interruptions. This is one reason why using a camera tethered to a laptop can be a pain. The computer is a burden to travel with, takes five minutes to boot, the batteries run low, the ten minutes you had to get the job done just became five… It occurred to me that if I could somehow tether a DSLR to an instant-on device like an Arduino microcontroller, I would have less weight to carry around and could get more work done. After mentally spec'ing out what I would need, I realized the solution was right in front of me - because I bring it with me for Mario Kart wireless races on long night jobs - the Nintendo DS is an instant-on handheld computer.




Thanks to devkitPro for providing an ARM9 C++ programming environment. Hooking the camera to the game cartridge slot took some figuring, but in the end it essentially involves strobing a pin connected through an optical isolation circuit to the camera's cable release port. This method sacrifices the aperture control that the Canon SDK allows when tethered to a laptop; however, aperture is the one setting that never changes during a reference shoot.
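In sketch form, the trigger primitive looks something like this. setReleasePin() and sleepMs() are hypothetical stand-ins for whatever actually drives the cartridge-slot pin and the platform delay; the hold-the-line variant is what makes bulb timing possible further down.

// Release-line primitives for the DS-DSLR (sketch, not the shipped firmware).
#include <cstdint>

void setReleasePin(bool closed);   // drive the opto-isolated release line (hypothetical)
void sleepMs(uint32_t ms);         // platform delay (hypothetical)

// A short pulse for normal exposures: the camera times the shutter itself.
void firePulse() {
    setReleasePin(true);
    sleepMs(80);
    setReleasePin(false);
}

// Hold the line closed for the whole exposure: with the camera in bulb mode,
// the DS becomes the shutter timer.
void fireBulb(uint32_t exposureMs) {
    setReleasePin(true);
    sleepMs(exposureMs);
    setReleasePin(false);
}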




A funny, cool thing happens once the camera is controlled by what is essentially an instant-on computer. Where the Canon can do a bracket of 3-9 shots, spread two stops apart, the “DS-DSLR” can do any number of shots, and if I don’t like the way it does it, I can rewrite the software to do it differently.

The Nintendo can run the camera in bulb mode as well, so I can do automated exposures of several minutes, beyond the thirty-second limit of tethered laptop software, and allow for sensor cooling between bursts.

It also acts as a very precise intervalometer, which, as Amazon kindly points out, is a $120 value. But beyond just spacing out shots, the timer can be set to run the bracketed exposure range at every interval instead.
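Put together, the flexible bracket plus intervalometer loop looks roughly like this sketch; fireBulb() and sleepMs() are the hypothetical release-line helpers from above, and every number here is illustrative.

// Any number of shots, any stop spacing, repeated at a fixed interval, with
// the DS timing bulb exposures and pausing between bursts for sensor cooling.
#include <cstdint>
#include <cmath>

void fireBulb(uint32_t exposureMs);   // hold the release line for the exposure (hypothetical)
void sleepMs(uint32_t ms);            // platform delay (hypothetical)

void runBracketedIntervalometer(uint32_t baseExposureMs, // middle exposure
                                int shots,               // any count, not just 3-9
                                float stopsApart,        // spacing in stops
                                uint32_t intervalMs,     // time between bursts
                                int bursts) {
    for (int b = 0; b < bursts; ++b) {
        for (int i = 0; i < shots; ++i) {
            // Center the bracket around the base exposure.
            float stops = (i - (shots - 1) / 2.0f) * stopsApart;
            uint32_t exposure = (uint32_t)(baseExposureMs * std::pow(2.0f, stops));
            fireBulb(exposure);
            sleepMs(2000);            // short gap for card write + sensor cooling
        }
        sleepMs(intervalMs);          // intervalometer spacing between bursts
    }
}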

The fun begins when you start to harness the play value of the DS in conjunction with the DSLR. An audio-based camera trigger function, for example, was trivial to add because of the DS's built-in microphone. One such device I found on the web sells for $350.
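In sketch form, the audio trigger is just a threshold on the mic level; readMicLevel() and firePulse() are hypothetical stand-ins for the microphone read and the release-line pulse above, and the threshold and re-arm delay are illustrative.

// Fire the camera when the DS microphone level crosses a threshold.
#include <cstdint>

uint16_t readMicLevel();              // current mic amplitude (hypothetical)
void firePulse();                     // short press on the release line (hypothetical)
void sleepMs(uint32_t ms);            // platform delay (hypothetical)

void audioTriggerLoop(uint16_t threshold) {
    for (;;) {
        if (readMicLevel() > threshold) {
            firePulse();
            sleepMs(1000);            // re-arm delay so one clap = one shot
        }
        sleepMs(1);                   // poll roughly every millisecond
    }
}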

Wednesday, January 1, 2014

Voxelization workflow

For a Pirates movie, I developed a voxelization workflow to create reference data from fine cloth and rope set details.
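The workflow itself isn't documented here, but the core binning step is simple enough to sketch: drop scan points into a regular occupancy grid keyed by voxel index, so fine cloth and rope detail can be resampled at a known resolution. The Point struct and the 5 mm default voxel size are placeholders.

// Bin scan points into a regular occupancy grid (one entry per occupied voxel).
#include <vector>
#include <cmath>
#include <unordered_set>
#include <cstdint>

struct Point { float x, y, z; };

// Pack integer voxel coordinates into one 64-bit key (21 bits per axis).
static uint64_t voxelKey(int ix, int iy, int iz) {
    const uint64_t B = 1u << 20;   // offset so negative indices stay positive
    return ((uint64_t)(ix + B) << 42) | ((uint64_t)(iy + B) << 21) | (uint64_t)(iz + B);
}

std::unordered_set<uint64_t> voxelize(const std::vector<Point>& pts, float voxelSize = 0.005f) {
    std::unordered_set<uint64_t> occupied;
    for (const Point& p : pts) {
        occupied.insert(voxelKey((int)std::floor(p.x / voxelSize),
                                 (int)std::floor(p.y / voxelSize),
                                 (int)std::floor(p.z / voxelSize)));
    }
    return occupied;
}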