The images below are from a project I did for Leonard Phillips of The Dickies.
I was tasked with blending 6 projectors seamlessly on a curved wall that had portholes cut into it. In addition, we needed to provide the 3D animator with the final resolution of the video and the locations of the portholes, as the animation was going to interact with them.
The first challenge was selecting the technology to handle the projector warping and blending. Because of the size of the room and our commitment to image quality, we wanted a server that could also handle the native resolution of the 6 combined projectors (6144×768). After days of researching and contacting various vendors, we finally selected the Pixel Warp server, provided by Pixel Wix. While their server at the time maxed out at 3 projectors, it could sync frame-accurately with other servers. And at the server's price point, I was able to recommend a 3rd server as a backup and still come in under budget.
The next step was warping and blending the projectors in order to map out where the portholes were. Because the Pixel Warp server supported live video, we were able to run an HDMI output from After Effects into the servers, which allowed us to draw masks and place markers on each individual projector in real time; these were then combined into one image for the animator. When the animation was rendered out, I split the video into 2 files and finalized the edge blending.
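To give a sense of what edge blending involves, here is a minimal sketch, assuming a typical 2.2 projector gamma (the Pixel Warp server handles this internally; the function below is purely illustrative): each projector fades its image across the overlap region with an alpha ramp that is pre-compensated for gamma, so the combined light output stays constant.

```python
def blend_ramp(width, gamma=2.2):
    """Alpha ramp for one side of a projector edge-blend overlap.

    Pre-compensates for projector gamma so that the *light* output of
    this ramp plus its mirrored counterpart on the adjacent projector
    sums to a constant across the overlap region.
    """
    return [(i / (width - 1)) ** (1.0 / gamma) for i in range(width)]

# Right projector ramps up; the left projector uses the same ramp reversed.
ramp_up = blend_ramp(256)
ramp_down = list(reversed(ramp_up))
```

Because light output is roughly `alpha ** gamma`, the two ramps sum to full brightness at every pixel column of the overlap, which is what makes the seam invisible.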
With a half-sphere dome spanning 8 feet in diameter, the total surface area, when corrected for curvature, is equivalent to a 170-inch display. Given the quality standards the designers at the Aquarium have always prided themselves on, at that size the more realistic experience that comes from high pixel density was absolutely essential. They selected a projector with a native resolution of 2560×1600, exceeding 2K standards, and it was up to me to find a flawless playback solution that could not only handle the data rate of uncompressed video at that resolution, but also sync to a secondary video stream carrying bilingual captions for the narrated 3D animation.
I looked into how best to push that much video data and settled on utilizing a custom presenter for Windows' Enhanced Video Renderer, and built the computer with a second video card to handle the status reporting. The final result is flawless playback, with no stuttering or image tearing.
This is another project built on my XML-driven synchronized content engine, which I call Hybrid Video. Videos with a large amount of repetitive graphic elements, like bullet boxes, can be produced much faster with this engine, and keeping the video layered like this allows a lot of slick dynamic content to be added. Plus, the “look” of the graphics comes from a separate file, much like CSS, so the video can be completely re-skinned quickly.
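To illustrate the idea (the actual Hybrid Video schema is not shown here; the tag and attribute names below are invented for the example), a content file and a separate skin file might combine like this, so that swapping the skin file re-styles every graphic without touching the content:

```python
import xml.etree.ElementTree as ET

# Hypothetical content XML: what appears and when.
CONTENT = """<video><bullet at="2.0">First point</bullet>
<bullet at="4.5">Second point</bullet></video>"""

# Hypothetical skin XML: how each element type looks (the CSS-like part).
SKIN = """<skin><bullet font="Helvetica" color="#FFCC00" size="32"/></skin>"""

def build_cues(content_xml, skin_xml):
    """Merge timed content elements with their look from the skin file."""
    style = {e.tag: e.attrib for e in ET.fromstring(skin_xml)}
    cues = []
    for elem in ET.fromstring(content_xml):
        cue = {"time": float(elem.get("at")), "text": elem.text}
        cue.update(style.get(elem.tag, {}))  # reskin = swap the SKIN file
        cues.append(cue)
    return cues
```

Because the timing/content data never references colors or fonts directly, replacing `SKIN` alone changes the whole look of the rendered video.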
And if you’d like to learn more about our Hybrid Video technology, click here.
Edited and composited on an Avid DS on site in St. Louis, MO. I also provided 2 After Effects workstations to assist in the rendering, as the time available from start to completion was only 2 days.
This was a small postcard design project I did for some friends who were fundraising for Team in Training to fight Leukemia. I came up with the initial pitch, oversaw the shoot, and did all of the Photoshop and graphic design.
The sister organization to the Monterey Bay Aquarium is the Monterey Bay Aquarium Research Institute, or MBARI. And when that connection was to be showcased in the Monterey Bay Aquarium, we stepped up to help in numerous ways: from programming the 3 video games visitors play to demonstrate various MBARI research projects, to providing our own uncompressed HD video player software (UnPeg) and hardware for a 3-channel video wall, to tying the whole system together so it was synchronized with the Aquarium’s show controller. The permanent exhibit has been open since 2007 and has required only one service visit from us (with a response time of under 12 hours).
For an exhibit on the International Space Station, Chabot had a bold challenge to solve: providing a way for visitors to take a photo of themselves in zero gravity. With a budget that ruled out commissioning a NASA-level contraption to accomplish this goal, Chabot decided to simplify the challenge to taking a photo of someone at the height of a jump into the air. From there, we set off to figure out how best to do it. Our solution was to utilize a commercial-grade floor mat, like the ones used for automatic doors at grocery stores. With that, our system recorded all the frames between the time someone leaves the mat and when they land back on it. From there, the three images closest to their highest point (when they are neutral to the pull of gravity) are presented for them to select and email to themselves and friends. And for Chabot, they get a monthly update to their mailing list of visitors who opted into receiving their newsletter.
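The frame selection above can be sketched as follows, assuming timestamped frames and the mat's leave/land times (the function and parameter names are illustrative, not the installation's actual code): since a jump is ballistic, the apex falls at the midpoint of the airtime, so the three frames nearest that midpoint are the ones to present.

```python
def apex_frames(frame_times, t_leave, t_land, count=3):
    """Return indices of the frames nearest the apex of a jump.

    For a ballistic jump, the highest point occurs at the midpoint of
    the airtime, so we pick the `count` airborne frames whose
    timestamps are closest to (t_leave + t_land) / 2.
    """
    t_apex = (t_leave + t_land) / 2
    airborne = [i for i, t in enumerate(frame_times) if t_leave <= t <= t_land]
    airborne.sort(key=lambda i: abs(frame_times[i] - t_apex))
    return sorted(airborne[:count])
```

For example, with frames captured every 0.1 s and a jump lasting from 0.2 s to 0.8 s, the apex is at 0.5 s and the frames around it are returned.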
[Images of people jumping.]
The final piece was hugely successful and is still one of the most popular installations at Chabot, with an average of 60,000 photos taken annually.
This installation presented the visitor with 5 objects to search for in the sky using spinner controls.