Data Projection Comes Full Circle
The UTS Data Arena takes the idea of an immersive experience to a whole new level.
Text: Derek Powell
To considerable industry and public acclaim, the University of Technology Sydney (UTS) has opened a world leading Data Arena facility. Most people would think of an arena as a place of spectacle and excitement – a Colosseum where the audience gets to experience the kind of entertainment that is just too big to be staged anywhere else.
While the UTS facility might be better suited to just thirty people (rather than 30,000), its 20,000 x 1200-pixel active 3D display can put on quite a show. Once you step inside the 10m-diameter arena, you are enclosed within a seamless 4m-high cylindrical screen. From high above, six perfectly matched, edge-blended projectors create a flawless 3D image, complemented by 14.2-channel surround sound. But you’re not there for entertainment.
Think of a data arena as an enveloping stage that allows colossal datasets to be brought to life and visualised in their entirety. Its bank of NVIDIA graphics processing units (GPUs) can generate an interactive visual representation of anything from microscopic germs invading the body to the movement of entire star systems. The unique perspective that the Data Arena gives allows scientists and engineers to fly through their data and make discoveries that aren’t possible any other way. It’s a serious research tool, though it has a strong movie industry pedigree, as we’ll discover.
Ben Simons, technical director of the UTS Data Arena, has an unusual CV for a researcher. His expertise is in CGI for cinema and, prior to joining UTS, he was head of visual effects on the movie Happy Feet II. He has discovered that the kind of skills he used to animate 3D models of, say, dancing penguins, are not dissimilar to those needed to visualise the movement of molecules during a chemical reaction, the propagation of cracks in underground pipes, or any one of a thousand scientific or engineering problems.
Ben explained the basics of turning data into 3D video. “It’s geometry,” he said. “There’s no one magic way but we use a program called Houdini that’s typically used for visual effects in feature films.” Ben has worked on 15 features in the last 10 years, so he’s learned a thing or two about bringing the impossible to screen. “To start, we really need to go back and think about what the data represents and how you want to see it. It might be topographical data, like a map; or financial data; or data from a microscope where we are looking at bacteria. If it is bacteria, do you need to see the shapes of the bacteria, or do you want to think about them as particles in motion and look at their speed? Each researcher will think about their data in a different way and what we’re trying to do is capture that idea, that visualisation, and actually generate it for real.”
NUMBERS TO VISUALS
To transform numbers to visuals, the streams of numbers that make up the raw data are organised into channels within the software. Each channel is then mapped to a geometric attribute such as position in 3D space (using the x, y and z axes), to a colour by assigning the numbers to RGB attributes, or even to 3D sound. The data sets that researchers use may have been gathered from many kinds of digital device. While data may be generated by scientific instruments such as microscopes or LIDAR mapping devices, other researchers may aggregate data from mobile phones, public transport cards or even fitness apps.
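The channel-mapping idea can be sketched in a few lines of code. This is not the Data Arena’s actual Houdini setup; the channel names (‘x’, ‘y’, ‘depth’, ‘temperature’) and the normalisation scheme are illustrative assumptions, but the principle is the same: each numeric channel drives either a geometric attribute or an RGB colour.

```python
# Illustrative sketch only: map raw data channels onto visual attributes.
# Channel names and normalisation are assumptions for this example.

def normalise(values):
    """Scale a list of numbers into the 0..1 range."""
    lo, hi = min(values), max(values)
    span = hi - lo or 1.0
    return [(v - lo) / span for v in values]

def map_channels(records):
    """Turn rows of raw readings into per-point geometry and colour.

    Each record is a dict of named channels; we assume channels called
    'x', 'y', 'depth' and 'temperature'.
    """
    temps = normalise([r["temperature"] for r in records])
    points = []
    for r, t in zip(records, temps):
        points.append({
            "position": (r["x"], r["y"], -r["depth"]),  # channels -> 3D position
            "colour": (t, 0.0, 1.0 - t),                # channel -> RGB (blue cold, red hot)
        })
    return points

sensors = [
    {"x": 0.0, "y": 1.0, "depth": 5.0, "temperature": 12.0},
    {"x": 2.0, "y": 3.0, "depth": 1.0, "temperature": 28.0},
]
print(map_channels(sensors)[1]["colour"])  # hottest reading maps to pure red
```

In a real pipeline the same mapping decisions are made interactively, per dataset, which is exactly the conversation Ben describes having with each researcher.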
There’s some serious IT behind the Data Arena display, and Ben and his team are breaking new ground in real-time rendering. A typical animated feature film would create 3D character models and then render each scene overnight into a movie sequence before it could be viewed. Backstage at the Data Arena, Ben uses a bank of NVIDIA K6000 GPUs to render and display his models in real time. There’s some serious graphics grunt with the equivalent of 26,000 CUDA cores on tap but, as you might expect, it is the software architecture that is crucial to the real-time processing.
“The two key things we are doing are parallel rendering and load balancing,” Ben confided. “We can give the graphics problem to the seven networked computers and they can divide up the calculations between the nine GPUs.”
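The load-balancing idea is easy to picture: carve the very wide frame into column ranges and hand one range to each renderer. The splitter below is an assumption for illustration only; the real system does this inside its parallel-rendering framework, but the arithmetic of sharing a frame evenly is the same.

```python
# Illustrative sketch of load balancing a wide frame across renderers.
# Not the Data Arena's actual code; the tile scheme is an assumption.

def split_frame(width, n_renderers):
    """Partition `width` pixel columns as evenly as possible."""
    base, extra = divmod(width, n_renderers)
    tiles, start = [], 0
    for i in range(n_renderers):
        w = base + (1 if i < extra else 0)  # spread the remainder columns
        tiles.append((start, start + w))
        start += w
    return tiles

# Nine GPUs sharing a 20,000-pixel-wide frame:
tiles = split_frame(20000, 9)
print(tiles[0], tiles[-1])
```

Each renderer then draws only its slice of the cylinder every frame, and the results are composited for display.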
Creating the pixels in real time is one huge step, but presenting them perfectly on screen is another complex challenge. After carefully researching the available immersive simulation platforms, UTS shrewdly decided to develop the physical environment for the Arena using the Advanced Visualisation Interactive Environment (AVIE) developed as part of the iCinema project at UNSW.
The audiovisual installation was overseen by Damian Leonard of Immersive Realisation and involved some serious new technology. Six projectors are mounted on a circular truss suspended above the massive 10m-diameter screen. Damian explained that while previous AVIE installations had used Projectiondesign F series DLP projectors, for the Data Arena, the specifications were more demanding. Barco, which had taken over the Projectiondesign product line, had just released the F85, which had the brightness and resolution to meet the demanding UTS requirements – but what about a lens?
“Wide angle lenses are essential to ensure that the light path of the projector is kept short and to minimise the casting of shadows from viewers standing in the main viewing region,” Damian noted. “Barco had just developed the EN29 wide angle lens which looked like it had the specs we needed – but we weren’t sure it could meet the focus requirements on the curved screen.”
Only four of the new lenses had actually been manufactured – but in a stroke of luck, they had all gone to a special project at the Melbourne Convention and Exhibition Centre. MCEC’s Paul Rumble provided one of the precious lenses for a test on the curved screen and it came up trumps. The project quickly locked in six of the Barco F85 AS3D WUXGA projectors along with 12 Barco WB1920 image processors for signal warping and edge blending. WUXGA signals are output from the NVIDIA Quadro K6000 cards as separate left and right signals at 60Hz and converted using the Barco DCC120 module on the F85 to an active 3D 120Hz signal for projection. Viewers within the arena use Optoma Active Shutter glasses that feature reliable RF synchronisation.
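The active 3D conversion is simple to visualise: two matched 60Hz streams become a single 120Hz stream by alternating left-eye and right-eye frames, with the shutter glasses opening the matching eye for each frame. The frame labels below are purely illustrative; the DCC120 does this in hardware.

```python
# Sketch of frame-sequential ('active') 3D. Frame labels are illustrative.

def interleave(left_frames, right_frames):
    """Merge matched 60Hz left/right sequences into one 120Hz sequence."""
    out = []
    for l, r in zip(left_frames, right_frames):
        out.extend([l, r])  # glasses open the matching eye per frame
    return out

print(interleave(["L0", "L1"], ["R0", "R1"]))  # ['L0', 'R0', 'L1', 'R1']
```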
Strapping a television to your head is an inherently isolating experience. The group experience of the Data Arena is completely different
BIG DATA, BIG SCREEN
The 4m-high micro-perforated screen, installed by Pollard Productions, is an engineering marvel. Most other simulation designs simply provide a gap for the audience to enter, which doesn’t allow the full 360° immersion. The Data Arena goes one step further. A steel ring anchors the one-piece screen to the floor and a special door covered in screen material is set into the screen framework. Once closed, it is virtually seamless. Ben Simons remarked: “Once you’ve been inside for five minutes and are looking all around, you can lose the exit completely!” The centre of the elastic screen surface actually bows inward toward the audience by 20 centimetres – like the inside surface of a doughnut, which makes the projector line-up process even more demanding. Fortunately, Ben reports that the geometry has remained rock solid so far.
INTERACTION & CONTROL
There’s even more going on with an array of 12 Optitrack Prime 17W cameras above the arena screen. These can track infra-red motion capture markers within the arena with a precision of one millimetre. Ben is already experimenting with the use of multiple 3D ‘mice’ that allow viewers to simultaneously and co-operatively take control of the display.
Ben Simons is very clear that the potential of the Data Arena lies in its ability to render the complex scenes it creates from raw data in real time. At any time, you can stop, go back, move around, filter out certain aspects of the data or look at things from a different viewpoint. Participants in a session can potentially interact with the data and the display using smartphones, 3D mice, PlayStation controllers, Virtual Reality Wands, or a range of other Wi-Fi, MIDI, OSC, USB, and Bluetooth input devices.
Ben pointed out that while there are other 3D viewers, like the Oculus Rift goggles, strapping a television to your head is an inherently isolating experience. The group experience of the Data Arena is completely different. “We are creating a collaborative space,” he explained, “where people can meet and discuss data and not just be a passenger.”
It is clear that UTS have caught the vision and are working to create new possibilities. Although located in the Faculty of Engineering & IT building, the Data Arena is a shared resource, like the University Library, reporting to the Deputy Vice Chancellor of Research. It is available to students and staff from all faculties, and UTS is also encouraging industry and government users to bring their data to the Arena.
Ben Simons is energised by the prospect of new applications for the massive display. He sees his role as helping to build bridges between the visual effects industry and high performance computing. “We’re just beginning,” he says. “The next year is going to be very exciting.”
Cluster IG configuration
Dell PCs each fitted with an NVIDIA Quadro K6000 card
Single IG configuration
Xenon PC fitted with 3 x NVIDIA Quadro K6000 cards
Barco WB1920 Image Processors
Gefen 32 x 24 DVI Matrix
Gefen FM 500 DVI over fibre extender / receivers
6 x Barco F85 AS3D WUXGA Projectors (3-Chip DLP 10,000 ANSI Lumens) fitted with EN29 Lens and DCC120 Xport modules
RME FireFace UFX
Behringer Ultragain ADA8200
14 x Genelec 8020C Powered Monitors
2 x Genelec 7060B Powered Subwoofers
Camera Tracking System:
12 x Optitrack Prime 17W cameras
Project Manager: Damian Leonard
System Architect: Ardrian Hardjono
System Engineer: Robin Chow
System Engineer: Rob Lawther (iCinema)
Project Manager: Hugh Cranswick
Technical Director: Ben Simons
Software Developer: Darren Lee
Project Manager/Head Rigger: Alex Griffith
Barco Systems Australia
Service Engineer: Peter Cito
Account Manager: Jason Coy
WHY WE NEED A DATA ARENA
Some things just have to be seen in their entirety to understand what is going on. You can’t really grasp the importance of the annual migration of wildlife through Africa’s Serengeti plains by simply following a single herd of gazelles on foot. But fly over the endless plains in a light aircraft and you could instantly see the importance and effect of the seasonal movement of millions of animals. Think of trying to watch a football grand final if the only view you had was a close up of a single player. You need wide shots to see the whole team so you can understand how the game is being played out.
Big data sets pose exactly the same problem. Let’s imagine a meteorologist who has a set of rainfall and wind speed readings taken from hundreds of observatories across the Pacific Ocean. Looking at a dozen observations from each station in turn, they might see that some kind of storm had passed over, but it might take hours, days or weeks of comparing climatic data to deduce what a particular weather system was and where it might be headed. However, if those readings could be represented as images of moving clouds – like a satellite photo from space – they could instantly see the rotating pattern of a cyclone, and be able to issue an immediate warning.
Sometimes, there is no equivalent to that ‘wide shot’ from space or from an aircraft. The software used at the Data Arena allows massive sets of data – think of hundreds of pages of spreadsheets – to be transformed into graphics as a snapshot. Combining individual graphics into an animation shows how each variable changes over time and where each data point is in relation to every other data point. Displaying data like this can provide a picture of things we can measure but never actually see, like how cancer cells travel through the body or how an infectious disease spreads across a city.
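That ‘snapshot’ step can be sketched in miniature: bin scattered station readings into a coarse 2D grid so the overall pattern can be seen at a glance. The station positions and values below are invented for illustration; a real pipeline would use far denser data and proper interpolation.

```python
# Toy version of the 'snapshot' idea: average scattered readings into
# a coarse 2D grid. Station data here is invented for illustration.

def to_grid(readings, nx, ny):
    """readings: (x, y, value) tuples with x, y in 0..1.
    Returns an ny-by-nx nested list of averaged values
    (None where no station falls in a cell)."""
    sums = [[0.0] * nx for _ in range(ny)]
    counts = [[0] * nx for _ in range(ny)]
    for x, y, v in readings:
        i = min(int(y * ny), ny - 1)  # row index from y coordinate
        j = min(int(x * nx), nx - 1)  # column index from x coordinate
        sums[i][j] += v
        counts[i][j] += 1
    return [[sums[i][j] / counts[i][j] if counts[i][j] else None
             for j in range(nx)] for i in range(ny)]

stations = [(0.1, 0.1, 30.0), (0.15, 0.12, 34.0), (0.9, 0.8, 5.0)]
grid = to_grid(stations, 4, 4)
print(grid[0][0])  # two nearby stations averaged into one cell
```

Colour each cell by its value and you have one frame; repeat per time step and the sequence becomes the animation described above.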