If we dared to open the “Way Back Machine”, here are some of the cool projects our staff have worked on in their careers:
Scenario Based Training App and Graphics Engine:
So, it may be hard to believe, but there was a time when you had to write your own graphics engine, that is, if you did not have $100,000 to $1 million lying around to license an existing one. This is where the SteamEngine came from.
We built this product for Northrop Grumman to use for scenario-based training of Army soldiers. We started from scratch and built the engine on top of OpenGL. It featured models, animations, Lua scripting, lighting (no realtime shadows, unfortunately), audio zones, animated textures, and embedded Flash movies.
AAAV 16-monitor Man-in-the-Loop Simulator:
Sigh, to be young again! The photo below shows a laboratory mockup of the Advanced Amphibious Assault Vehicle (AAAV) simulator we worked on, circa 2004.
The AAAV is basically a tank that floats, and can jet across the water at around 30 mph. The simulator had three crew stations: driver, gunner, and navigator. We built a full-up physics simulation of the craft’s buoyancy and wave reactions in the water, and of its tread-driven suspension system on land. Real controls from the AAAV were used at each operator station to drive the simulation.
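The buoyancy half of a floating-vehicle model like that can be sketched in miniature. This is a hypothetical 1-D toy, not the original AAAV simulation: a box bobbing on still water under gravity, Archimedes’ force, and some made-up linear damping so it settles.

```python
# Hypothetical 1-D buoyancy sketch (not the original AAAV model): a box of
# given mass bobs on still water under gravity and Archimedes' force.

GRAVITY = 9.81          # m/s^2
WATER_DENSITY = 1000.0  # kg/m^3

def buoyant_force(depth_submerged, footprint_area):
    """Archimedes: the weight of the displaced water, in newtons."""
    displaced_volume = max(0.0, depth_submerged) * footprint_area
    return WATER_DENSITY * GRAVITY * displaced_volume

def simulate_bobbing(mass, footprint_area, height, steps=2000, dt=0.01):
    """Integrate vertical motion; returns how deep the hull bottom settles."""
    depth = 0.0     # hull bottom below the waterline, in meters (positive down)
    velocity = 0.0
    for _ in range(steps):
        submerged = min(depth, height)
        net_down = mass * GRAVITY - buoyant_force(submerged, footprint_area)
        net_down -= 2000.0 * velocity  # crude damping so the bobbing dies out
        velocity += net_down / mass * dt
        depth += velocity * dt
    return depth

# At equilibrium the hull displaces its own weight of water, so the depth
# converges to mass / (density * area): 0.5 m for 500 kg on a 1 m^2 footprint.
print(round(simulate_bobbing(mass=500.0, footprint_area=1.0, height=2.0), 2))
```

The real simulator also had to handle wave shapes and the land-side suspension; this sketch only shows the core force balance.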
The final trainer ran on a million-dollar Silicon Graphics 16-GPU InfiniteReality system. This allowed us to drive 16 monitors scattered around a full mock-up of the vehicle interior: eight vision blocks for the gunner, three for the driver, and several displays of simulated control panels and readouts.
The graphics are pitiful by today’s standards, but it wowed the kids back then:
3DLabs and Sun Microsystems Demo App:
Below are some screenshots from a demonstration app we built for 3DLabs. 3DLabs used it in their booth at SIGGRAPH in 2005 and 2006 to demonstrate the capabilities of their cutting-edge graphics cards. It is hard to imagine now, but their cards, costing a few thousand dollars, were starting to give $250,000 machines like Silicon Graphics’ desktop units a run for their money.
3DLabs wanted a demo that showed how their cards worked across a wide range of applications, while highlighting their incredible rendering accuracy. The result was Pixi-E 3D!
It featured scenes such as:
A Pod of Whales swimming happily through caustic-dappled waters:
An interactive Dandelion Fan. You could grab the handle in 3D and rotate it to make the gear train spin. This then spun the fan, catching the dandelion seeds up in a swirling vortex of physically simulated wind. It demonstrated special effects like reflections, glossy textures, specular highlights on the chrome, and tight polygon accuracy where the gears meshed together.
The Virtual Head Slicer — i.e. CAT Scan Viewer — allowed users to slice through a 3D volumetric data set of a human brain. The simple user interface (at the bottom left) let you move the slice plane up and down, or tilt it left and right, to see different slices through the brain.
This was a very early demo of what realtime interactive 3D could do for Architectural Visualization. It used what was bleeding-edge technology at the time: baked shadow maps! Ooooh!!
Views of the Collection:
The Huntsville Museum of Art put on a show of rarely seen pieces from its collection. For this exhibit, we recorded video interviews with 20 high school students giving their interpretations of each of the items on display. We developed the simple viewer application below that allowed visitors to tap on a student and hear their video interview. The app was then placed in the museum on what was, at the time, an amazing invention: a touch-screen kiosk.
Citywide Cable Network Bandwidth Viewer:
This application tapped into the flow of network traffic over a citywide cable network and showed, in realtime, the bandwidth being used in regional sub-nodes. The nodes were geolocated, and each node’s height and color indicated how much bandwidth was being consumed in that part of the city.
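The height-and-color encoding described above can be sketched as a simple mapping. This is a hypothetical reconstruction, not the original code: utilization drives both the bar height of a node and a green-to-red color ramp.

```python
# Hypothetical sketch of the node display mapping: bandwidth utilization
# drives both the bar height and a green-to-red color for each node.

def node_bar(utilization, max_height=100.0):
    """Map 0..1 utilization to a bar height and an (r, g, b) color."""
    u = max(0.0, min(1.0, utilization))  # clamp bad readings
    height = u * max_height
    # Linear ramp: idle nodes render green, saturated nodes render red.
    color = (int(255 * u), int(255 * (1.0 - u)), 0)
    return height, color

# Hypothetical sub-node readings, as a fraction of each node's capacity:
nodes = {"downtown": 0.9, "suburb": 0.2}
for name, load in nodes.items():
    height, color = node_bar(load)
    print(name, height, color)
```

In the real viewer these bars sat at geolocated positions on a city map; the sketch only covers the per-node visual encoding.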
Baseball Swing Analyzer:
The idea was to use computer vision to capture and analyze a baseball player’s swing. We would track the bat as it travelled through space, then show its path in 3D along with speed plotted on a color scale from blue for slow to bright red for the fastest.
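The speed-and-color part of that display can be sketched in a few lines. This is a hedged reconstruction with made-up sample data, not the original analyzer: bat speed is recovered from tracked 3D positions by finite differences, then interpolated from blue (slow) to red (fast).

```python
# Hypothetical sketch: recover bat speed from tracked 3-D positions by
# finite differences, then map each speed to a blue-to-red color.
import math

def speeds_from_path(points, dt):
    """Speed between consecutive 3-D samples captured dt seconds apart."""
    return [math.dist(a, b) / dt for a, b in zip(points, points[1:])]

def speed_to_color(speed, max_speed):
    """(r, g, b): blue for slow, bright red at the fastest part of the swing."""
    t = max(0.0, min(1.0, speed / max_speed))
    return (int(255 * t), 0, int(255 * (1.0 - t)))

# Made-up bat-tip positions in meters, sampled every 10 ms:
path = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.4, 0.0, 0.0)]
speeds = speeds_from_path(path, dt=0.01)  # roughly 10 and 30 m/s
print([speed_to_color(s, max_speed=30.0) for s in speeds])
```

The real system also had to solve the harder problem of locating the bat in each camera frame; the sketch starts from already-tracked positions.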
Bird Identification Software:
The goal of this project was to develop automated bird identification software using machine vision techniques. It turned out the easiest approach was to start by searching for the eye of the bird, which then made it possible to search for key color and shape features around the eye, face, and beak. The images below show how blob parsing was used to isolate the eye and key banding patterns, resulting in a prediction of the known species:
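The blob-parsing step mentioned above can be illustrated with a toy example. This is a minimal sketch and not the original pipeline: a thresholded binary mask is flood-filled into 4-connected components, and the largest blob is kept as the candidate eye region.

```python
# Toy blob parsing: flood-fill 4-connected components of a binary mask
# and keep the largest blob as the candidate eye region.

def find_blobs(mask):
    """Return a list of blobs, each a list of (row, col) pixels set to 1."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:            # iterative flood fill
                    y, x = stack.pop()
                    blob.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append(blob)
    return blobs

# Tiny made-up mask: a 1-pixel speck of noise and a 4-pixel "eye" blob.
mask = [
    [1, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
blobs = find_blobs(mask)
eye = max(blobs, key=len)       # largest blob wins
print(len(blobs), len(eye))     # prints "2 4"
```

The real software then examined color and shape features around the located eye; this sketch covers only the component-isolation step.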