The HIPerWall system was a pretty impressive collection of hardware for 2005, with 50 processors (more were added later), 50 GB of RAM, more than 10 TB of storage (we got a gift of 5 TB worth of drives for our RAID system from Western Digital), and 50 of the nicest monitors available, but it was the software that really made it special. Remember that HIPerWall is an acronym for Highly Interactive Parallelized Display Wall. We took that interactivity seriously, so we didn’t just want to be able to show a 400 million pixel image of a rat brain, but we wanted to allow users to pan and zoom the image to visually explore the content. This user interactivity set the HIPerWall software apart from the other tiled display software available at the time and is still a major advantage over competing systems.
The original software was written by Sung-Jin Kim, a doctoral student at the time who was working on distributed rendering of large images. His software, TileViewer, was originally written to use Professor Kane Kim’s TMO distributed real-time middleware, but Sung-Jin ported it to Mac OS X and IP networking so it could work on HIPerWall. TileViewer ran on both the control node and the display nodes. The control node managed the origin and zoom level of the image, while TileViewer on the display nodes computed exactly where the display was in the overall pixel space, then loaded and rendered the appropriate portion of the image. We preprocessed the images into a hierarchical format so the right level and image tiles (hence the name) could be loaded efficiently. The images were replicated to the display nodes using Apple’s very powerful Remote Desktop software. TileViewer also allowed color manipulation of the image using Cg shaders, so we took advantage of the graphics cards’ power to filter and recolor images. TileViewer didn’t have much of a user interface beyond a few key presses, so Dr. Chris Knox, a postdoctoral scholar at the time, wrote a GTK-based GUI that allowed the user to select an image to explore and then provided buttons to zoom and pan the image on the HIPerWall. The picture below shows Dr. Chris Knox and Dr. Frank Wessel examining a TileViewer image on HIPerWall. The Macs are visible on the left of the image. The one below that shows Sung-Jin Kim in front of TileViewer on HIPerWall.
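The tile-selection logic each display node performed can be sketched roughly as follows. This is an illustrative reconstruction, not TileViewer's actual code: the function names, the 256-pixel tile size, and the zoom convention (screen pixels per image pixel) are all assumptions.

```python
import math

# Hypothetical sketch of a display node's tile selection. Assumes a
# power-of-two image pyramid: level 0 is full resolution, each higher
# level halves the image in both dimensions.
TILE_SIZE = 256  # assumed tile edge, in pixels, at every pyramid level

def visible_tiles(origin_x, origin_y, zoom, node_col, node_row,
                  screen_w=800, screen_h=600):
    """Given the wall-wide origin and zoom broadcast by the control node,
    return the pyramid level and tile indices this display needs.

    zoom scales image pixels to screen pixels; (node_col, node_row)
    locate this display within the tiled wall."""
    # This display's rectangle in wall (screen) coordinates
    left = node_col * screen_w
    top = node_row * screen_h
    # Map that rectangle back into the full image's pixel space
    img_left = origin_x + left / zoom
    img_top = origin_y + top / zoom
    img_right = img_left + screen_w / zoom
    img_bottom = img_top + screen_h / zoom
    # Pick the pyramid level whose resolution best matches the zoom
    level = max(0, int(math.floor(math.log2(1.0 / zoom)))) if zoom < 1 else 0
    scale = 2 ** level  # full-res pixels per pixel at this level
    # Tile indices at that level covering the visible rectangle
    c0 = int(img_left // (TILE_SIZE * scale))
    r0 = int(img_top // (TILE_SIZE * scale))
    c1 = int(img_right // (TILE_SIZE * scale))
    r1 = int(img_bottom // (TILE_SIZE * scale))
    return level, [(c, r) for r in range(r0, r1 + 1)
                          for c in range(c0, c1 + 1)]
```

Because every node derives its tile list independently from the same shared origin and zoom, no pixel data ever crosses the network during interaction, which is what made the panning and zooming feel instantaneous.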
The HIPerWall was built in the brand-new Calit2 building at UCI. We knew HIPerWall was coming, so Professor Falko Kuester, the HIPerWall PI, and I, as Co-PI, worked to get infrastructure in place in the visualization lab. Falko was on the planning committee for the building, so we hoped our needs would be met. The building had good networking in place, though no user-accessible patch panels, but power was “value engineered” out. We quickly determined (blowing a few breakers in the process) that HIPerWall would need a lot more power than was available in the visualization lab at the time. The Calit2/UCI director at the time, Professor Albert Yee, agreed and ordered new power circuits for the lab. Meanwhile, postdocs Kai-Uwe Doerr and Chris Knox were busy assembling the framing and installing monitors into the 11×5 frame designed by Greg Dawe of UCSD. We had a deadline, because the Calit2 Advisory Board was to meet in the new UCI Calit2 building and Director Larry Smarr wanted to show HIPerWall. Around 3:00 PM on the day before the meeting, the electricians finished installing the power behind the wall. At that point, we moved the racks into place, putting 5 PowerMac G5s on each rack, running Ethernet cables, and plugging the monitors and Macs into power. Once we booted the system, it turned out that TileViewer just worked. We were done making the system work by 6 PM, and it was a great surprise for Larry Smarr that HIPerWall was operational for the meeting the next morning.
Sung-Jin Kim then turned to distributed visualization of other things, like large datasets and movies, also in a highly interactive manner. The dataset he tackled first was Normalized Difference Vegetation Index data, so the new software was initially named NDVIviewer. This software allowed the import of raw data slabs that could then be color coded and rendered on the HIPerWall. In keeping with the “interactive” theme, each data object could be smoothly moved anywhere on the display wall and zoomed in or out as needed. Once again, the display node software figured out exactly what needed to be rendered where and did so very rapidly. The NDVI data comprised sets of 3D blocks of data that represented vegetation measured over a particular area over time, so each layer was a different timestep. The software allowed the user to navigate forward and backward among these timesteps in order to animate the change in vegetation. The picture below shows NDVIviewer running on HIPerWall showing an NDVI dataset.
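The color coding of raw data slabs worked along these lines. This is a simplified illustration, not NDVIviewer's actual code: the colormap endpoints and the NDVI value range of -1 to 1 are assumptions chosen for the example.

```python
# Illustrative sketch of mapping a raw scalar value (e.g. one sample
# from an NDVI data slab) to a display color. The brown-to-green ramp
# and the [-1, 1] input range are assumptions, not NDVIviewer's own.

def color_code(value, vmin=-1.0, vmax=1.0):
    """Map a raw value in [vmin, vmax] to an (r, g, b) triple:
    low values render brown (bare ground), high values green
    (dense vegetation)."""
    # Normalize and clamp to [0, 1]
    t = (value - vmin) / (vmax - vmin)
    t = max(0.0, min(1.0, t))
    # Linear blend from brown to green
    brown = (0.55, 0.27, 0.07)
    green = (0.0, 0.8, 0.0)
    return tuple((1 - t) * b + t * g for b, g in zip(brown, green))
```

In practice this kind of per-sample mapping ran in the display nodes' Cg shaders, so recoloring an entire timestep was effectively free.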
NDVIviewer was also able to show an amazing set of functional MRI (fMRI) brain scans. This 800 MB data set held fMRI brain image slices for 5 test subjects who were imaged on 10 different fMRI systems around the country to see whether machines with different calibration or from different manufacturers yield significantly different images (they sure seem to do so), for a total of 50 sets of brain scans. NDVIviewer allowed each scan to be moved anywhere on the HIPerWall, and the user could step through an individual brain by varying the depth, or through all of them simultaneously. In addition, the Cg shader image processing could be used to filter and highlight the images in real-time. Overall, this was an excellent use of the huge visualization space provided by HIPerWall and never failed to impress visitors.
NDVIviewer could do much more than just show data slices. It showed JPEG images with ease, smoothly sliding them anywhere on the wall. It could also show QuickTime movies, using the built-in QuickTime capability of the display node Macs to render the movies, then showing the right portions of the movies in the right place. While this capability had minimal scientific purpose, it was always impressive to visitors, because a playing movie could be resized and moved anywhere on the HIPerWall. The picture below shows a 720p QuickTime movie playing on HIPerWall.
Sung-Jin Kim added yet another powerful feature to NDVIviewer that allowed it to show very high-resolution 3D terrain models based on the SOAR engine. SOAR is extremely well suited for tiled display visualization, because it is a “level-of-detail” engine that renders as much as it can of the viewable area based on some desired level of detail (perhaps dependent on frame rate or user preferences). NDVIviewer’s implementation allowed the user to vary the level of detail in real-time, thus smoothing the terrain or rendering sharper detail. The movie below shows SOAR terrain rendering on HIPerWall.
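The frame-rate-driven side of that level-of-detail control can be sketched as a simple feedback loop. This is a hypothetical illustration of the general technique, not SOAR's or NDVIviewer's actual controller; the gain, bounds, and 30 fps target are assumptions.

```python
# Hypothetical sketch of frame-rate-driven level-of-detail control.
# A larger screen-space error threshold means coarser terrain and
# faster frames; a smaller one means sharper detail and slower frames.

def adjust_lod_threshold(threshold, frame_time, target_frame_time=1/30,
                         gain=0.5, min_t=0.5, max_t=16.0):
    """Nudge the screen-space error threshold (in pixels) toward a
    value that keeps rendering near the target frame time."""
    # Proportional feedback: slow frames coarsen, fast frames refine
    error = frame_time - target_frame_time
    threshold *= 1.0 + gain * (error / target_frame_time)
    # Clamp so one bad frame can't swing the detail level wildly
    return max(min_t, min(max_t, threshold))
```

A user preference maps onto the same mechanism: pinning the threshold low forces sharp detail at whatever frame rate results, while pinning it high keeps the terrain smooth and fast.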
Because of the power and capabilities of NDVIviewer, I started calling it MediaViewer, a name which stuck with almost everyone. Duy-Quoc Lai, an undergraduate student doing summer research, added streaming video capability to MediaViewer, so we could capture Firewire video from our Panasonic HD camera and stream it live to the HIPerWall. With the addition of streaming video in 2006, we began transitioning the software to use the SPDS_Messaging library that I had developed for parallel and distributed processing research in my Scalable Parallel and Distributed Systems laboratory.
In addition to TileViewer and MediaViewer, several other pieces of software were used to drive the HIPerWall. The SAGE engine from the University of Illinois at Chicago’s Electronic Visualization Lab was the tiled display environment for OptIPuter, so we ran it on HIPerWall occasionally. See the movie below for an example of SAGE on HIPerWall.
Dr. Chris Knox wrote a very ambitious viewer for climate data that could access and parse netCDF data for display on the HIPerWall. This allowed us to explore data sets from the UN Intergovernmental Panel on Climate Change (IPCC) on a massive scale. We could see data from many sites at once or many times at once, or both. This outstanding capability was a fine example of what HIPerWall was intended to do. The picture below shows one version of the IPCC viewer running on HIPerWall.
Doctoral student Tung-Ju Hsieh also modified the SOAR engine to run on HIPerWall. His software allowed whole-Earth visualization from high-res terrain data sets, as shown in the movie below. This project was built to explore earthquakes by showing hypocenters in 3D space and in relation to each other. As before, each display node rendered only the data needed for its displays, and only to the level of detail specified to meet the desired performance.
Doctoral student Zhiyu He modified MediaViewer to display genetic data in addition to brain imagery for a project with UCI Drs. Fallon and Potkin to explore genetic bases for schizophrenia. This research turned out to be very fruitful, as HIPerWall sped up the discovery process for Drs. Fallon and Potkin. The image below shows Dr. Fallon on the left and Dr. Potkin on the right in front of HIPerWall. Photo taken by Paul Kennedy for UCI.
Another software project started on HIPerWall is the Cross-Platform Cluster Graphics Library, CGLX. This powerful distributed graphics library makes it possible to port OpenGL applications nearly transparently to tiled displays, thus supporting 3D high-resolution visualization. Professor Falko Kuester and Dr. Kai-Uwe Doerr moved to UCSD at the end of 2006 and continued development of CGLX there. CGLX is now deployed on systems around the world.
In the next article, I will cover new research software from 2007 on when I took over leadership of the project at UCI. This new software forms the basis of the technology licensed to Hiperwall Inc., significantly advanced versions of which are available as part of Samsung UD systems and as products from Hiperwall Inc. In a future post, I will cover the wonderful content we have for HIPerWall (and Hiperwall) and how easy it is to make high-resolution content these days.