The History Channel’s The Universe, Ancient Mysteries Solved: Roman Engineering

History Channel at IDIA Lab

John Fillwalk, director of the Institute for Digital Intermedia Arts and senior director of the university’s Hybrid Design Technologies initiative, was interviewed for the History Channel’s “The Universe.” The episode, which aired at 9 p.m. May 23, featured the IDIA Lab’s digital recreations of ancient Roman structures.

For the season finale of its popular program “The Universe,” the History Channel is shining a spotlight on the work of virtual artists from Ball State University.

The season finale episode of the documentary series, which aired May 23, explores how Roman emperors built ancient structures to align with movements of the sun. To confirm experts’ theories about the religious, political, and cultural significance behind these phenomena, the cable network enlisted the help of Ball State’s Institute for Digital Intermedia Arts (IDIA).

Using 3-D computer animation, artists and designers from the IDIA Lab virtually recreated several monuments featured in the episode and accurately simulated the alignment of the sun. These structures include: the Pantheon, one of the best preserved buildings of ancient Rome; the Temple of Antinous, a complex that today lies in ruins within Hadrian’s Villa outside of Rome; and the Meridian of Augustus, a site containing an Egyptian obelisk brought to Rome for use as a giant sundial and calendar.

‘Getting things right’

IDIA Lab's Roccabruna reconstruction


The IDIA Lab digitally rebuilt ancient Roman structures to give viewers of the History Channel’s “The Universe” a vivid idea of what ruins looked like when they were inhabited.

James Grant Goldin, a writer and director for The History Channel, says IDIA’s animations were an essential part of the program’s second season finale. For an earlier episode of “The Universe,” which aired in 2014, the network contracted with IDIA to animate Stonehenge in a segment demonstrating how the prehistoric monument may have been used to track celestial movements.

This time around, the work of Ball State digital artists is featured throughout the Roman engineering episode.

“It’s an honor for us,” says John Fillwalk, director of IDIA and senior director of the university’s Hybrid Design Technologies initiative. “Ideally our relationship with the History Channel will continue long into the future.”

Goldin said the cable network is keen on employing the IDIA for future projects because Fillwalk and his team create beautifully rendered graphics backed by data and research.

“John was devoted to getting things right,” he said. “Many of the theories we discuss in the show remain controversial, and in nonfiction TV it’s always a good idea to present such things with a qualifying phrase in the narration—an ‘if’ or a ‘maybe.’ But the IDIA Lab combined their own research with that of experts, and I’m very happy with the results.”

Gaming software transforms history

IDIA Lab's Pantheon reconstruction


The Pantheon is one of the ancient structures the IDIA Lab rendered in 3-D for the History Channel’s “The Universe.”

Fillwalk has worked closely over the years with many prominent scholars of the ancient world, including Bernard Frischer, a Roman archaeologist at Indiana University, and Robert Hannah, dean of arts and social sciences at New Zealand’s University of Waikato, who advised on the archaeoastronomy of the project.

Hannah says he’s been astounded to see the way up-to-the-minute gaming software can bring to life the work of today’s historians and virtual archaeologists. “I’ve seen my son play games like ‘Halo,’ so I knew what was possible,” he said, “but I’d never seen it adapted to ancient world buildings.”

Phil Repp, Ball State’s vice president for information technology, says the relationship between the university and the cable network is a key example of how Ball State is distinguishing itself as a leading world provider of emerging media content.

“For the History Channel to want the continued help of John and his staff speaks to the quality of what our lab can produce,” he said.

Goldin’s praise for the IDIA supports Repp’s theory. “Bringing the past to life is a very challenging task,” he said. “The renderings Ball State artists have created represent the most accurate possible picture of something that happened almost 2,000 years ago.”

VIRTUAL INSTRUMENT 3D UI

ChromaChord combines the Oculus Rift headset with an attached Leap Motion controller to create an immersive 3D VMI (virtual musical instrument) environment. The Oculus Rift provides a three-paneled visual interface, rendered in Unity, that lets users access the instrument’s component windows simply by turning their head, while the Leap Motion controller lets them interact with that interface to musical effect (processed and generated in Max/MSP).

ChromaChord uses the Oculus Rift DK2, a virtual reality headset, to immerse the performer within the VMI. Using two cameras in the virtual environment, the device relays a stereoscopic image to the wearer, allowing for a highly dimensional experience. A gyroscope, accelerometer, and magnetometer detect real-world head rotation and apply it to the virtual viewpoint, while a peripheral infrared camera provides limited-range positional tracking. A Leap Motion controller is affixed to the Oculus Rift, providing infrared tracking of the hands. In our application, the Leap Motion tracks the user’s real-world hands and simulates them in the virtual scene, mimicking hand positions and finger movements; the user’s own hands are therefore the primary input mechanism.

Custom Software
  • Key note triggering and glow by Leap Motion hand touches
  • Particle generation triggered by Leap Motion hand touches
  • Button click and glow by Leap Motion hand touches
  • Particle destruction and orbit physics
  • OSC Server and Client configurator / Message Handler / Data Handler
Our graphical user interface is composed of three sections: left, center, and right panels. On the left, the user sees the modulation sphere. By touching this sphere, the user generates orbital particles; the number of these particles affects a live process on the notes, and the user can remove them by simply swiping them away. Turning to the right reveals the root note and scale type controls. To select a root note and scale type, the user presses the desired button, and the interface responds by highlighting the selected options. The center module contains the keys, a chromatic scale in two colored columns based on the Solfège; this is the core performance area of the UI. In addition, there are arrows facing up and down that the user can touch to raise or lower the octave. All buttons and keys respond to touches by glowing white.

3D Musical Instrument

The default window is the center palette, where the instrument itself is located. Within this window, there are two columns of colored keys (four per column) to the left and the right of the screen. To create sound, the user has only to hold their hands vertically in front of them and ‘touch’ the keys, which are placed both for ease of use and for the ability to contact multiple notes simultaneously to create chords. The notes themselves are stacked in intervals of a third from each other; consequently, neighboring notes on the same hand will create conventional harmonies. Between the columns, at their top and bottom-most extremities, are two arrows that, if touched, move the current note set up or down an octave.

When a note is touched, Unity sends a UDP message to Max/MSP indicating which note has been touched; the note remains active until contact ceases. Similarly, touching the octave buttons sends UDP messages to Max to reassign the MIDI note values. Within Max, MIDI values are then converted to their equivalent frequencies and assigned to their respective oscillators before being processed and output through Max’s audio out.

Looking to the right, the user accesses the settings window, where note assignment takes place. First, the user selects the root note of the scale they wish to be within. The second parameter allows the user to select the mode of the scale: major, natural minor, or harmonic minor. By default, the instrument opens with a root note of C in the major mode. The UDP messages sent from the settings window control a series of gates within Max that then send adjusted MIDI values to the instrument’s oscillators.

The performer accesses the modulation window to the left, which controls the frequency and Q of a filter, as well as the level of effect. By touching the button in the center, Unity spawns floating particles that surround the user’s hands. The longer the user touches the button, the more particles accumulate; they can be removed by making shaking or throwing motions. Without particles, the filter’s frequency setting is at its lowest audible setting, creating a much darker, mellow timbre; as more particles accumulate, the filter opens up, allowing higher frequencies to pass through and creating a brighter, richer timbre. Additionally, the particles move the dry/wet ratio for effects toward the ‘wet’ end of the spectrum, deepening the audibility of the effects.
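To make the Unity-to-Max message flow concrete, here is a minimal Python sketch of the note and octave traffic described above. The space-delimited message format, the port number, and the function names are illustrative assumptions, not the project’s actual protocol.

    import socket

    # Assumed destination: a Max/MSP patch listening on this local UDP port.
    MAX_HOST, MAX_PORT = "127.0.0.1", 7400

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send(message: str) -> None:
        """Send one space-delimited control message to Max/MSP."""
        sock.sendto(message.encode("ascii"), (MAX_HOST, MAX_PORT))

    def touch_note(index: int, active: bool) -> None:
        # e.g. "note 2 1" while a key is held, "note 2 0" on release;
        # the note stays active in Max until the release message arrives.
        send(f"note {index} {1 if active else 0}")

    def shift_octave(direction: int) -> None:
        # +1 or -1; Max reassigns the current MIDI note values accordingly.
        send(f"octave {direction}")

    touch_note(0, True)   # press one key
    touch_note(1, True)   # hold a neighboring key: keys sit a third apart, forming a chord
    shift_octave(1)       # move the current note set up an octave
    touch_note(0, False)  # release the first key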
The UDP message sent from the modulation window carries a single numeric value: the total number of active particles. This value is then scaled in Max/MSP to suit each parameter being altered.
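The scaling stage on the Max side can be illustrated the same way. In this sketch, the output ranges are assumptions chosen to match the described behavior (a low, dark cutoff with no particles that opens up and gets wetter as particles accumulate), not values measured from the patch.

    def scale(value: float, in_lo: float, in_hi: float,
              out_lo: float, out_hi: float) -> float:
        """Linear rescaling, analogous to Max/MSP's [scale] object."""
        t = (value - in_lo) / (in_hi - in_lo)
        return out_lo + t * (out_hi - out_lo)

    def apply_modulation(particle_count: int) -> dict:
        n = max(0, min(100, particle_count))  # particle count clamped to 0-100
        return {
            "filter_hz": scale(n, 0, 100, 80.0, 8000.0),  # darker -> brighter timbre
            "wet_ratio": scale(n, 0, 100, 0.0, 1.0),      # dry -> wet effect mix
        }

    print(apply_modulation(0))   # filter at its lowest setting, fully dry
    print(apply_modulation(50))  # cutoff at 4040 Hz, half wet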
Name         Type             Range  Default  Notes
Do           Key              0-1    0        Bottom of octave
Re           Key              0-1    0
Mi           Key              0-1    0
Fa           Key              0-1    0
Sol          Key              0-1    0
La           Key              0-1    0
Ti           Key              0-1    0
Do           Key              0-1    0        Top of octave
Up Octave    Octave Switcher  0-1    0        1 when clicked
Down Octave  Octave Switcher  0-1    0        1 when clicked
Modulation   Modulation       0-100  0        Number of particles
Root Note    Root Note        0-11   0
Scale Type   Scale Type       0-2    0
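Expressed as a data structure, the table above could drive the message handler’s validation step. The following is a sketch: the control names follow the table, but the handler logic itself is an assumption.

    # Key names follow the table, bottom to top of the octave.
    KEYS = ["Do", "Re", "Mi", "Fa", "Sol", "La", "Ti", "Do'"]

    CONTROLS = {name: {"type": "Key", "range": (0, 1)} for name in KEYS}
    CONTROLS.update({
        "Up Octave":   {"type": "Octave Switcher", "range": (0, 1)},    # 1 when clicked
        "Down Octave": {"type": "Octave Switcher", "range": (0, 1)},    # 1 when clicked
        "Modulation":  {"type": "Modulation",      "range": (0, 100)},  # number of particles
        "Root Note":   {"type": "Root Note",       "range": (0, 11)},   # twelve chromatic roots
        "Scale Type":  {"type": "Scale Type",      "range": (0, 2)},    # major / nat. minor / harm. minor
    })

    def is_valid(name: str, value: int) -> bool:
        """Reject out-of-range control values before they reach the oscillators."""
        lo, hi = CONTROLS[name]["range"]
        return lo <= value <= hi

    assert is_valid("Modulation", 42)
    assert not is_valid("Scale Type", 3)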

Halloween: Celestial Origins


The Charles W. Brown Planetarium’s newest production, “Halloween: Celestial Origins,” premiered the weekend of October 7 in collaboration with the IDIA Lab.

Learn the history of Halloween and how it fits into the seasons as a “cross-quarter day” during this new production. Explore the night sky and learn what planets, constellations, and stars will be out on Halloween. Created in collaboration with Ball State University’s Department of Physics and Astronomy, Department of Anthropology, and the IDIA Lab, the Institute for Digital Intermedia Arts at Ball State University. More information at http://bsu.edu/planetarium.


IDIA Lab Featured on WTHR News for Virtual Reality research


The future of home entertainment is taking a giant step forward into the world of virtual reality. More devices are hitting the market that transform the way we look at the world around us, and a group of developers at Ball State is on the cutting edge.

It’s something we experienced first-hand at Ball State University, where content for virtual reality devices is produced at the Institute for Digital Intermedia Arts. The lab is located in the architecture building on the Ball State campus, and its focus is on the exploration of the arts, science, and technology.

For the last ten years, Ball State has been developing a wide range of projects for museums, the History Channel and other clients.

John Fillwalk oversees the work done there.

“We basically explore 3D simulation and virtual reality,” he explained. “The skill sets from the staff are across computer science to design to animation.”

“We continue to further the emerging media brand aspect of this university, giving students professional opportunities to work and stay in Indiana for design technology jobs,” said John Fillwalk.

The virtual reality industry estimates more than 7 million headsets will ship this year, even though researchers have found that more than half of Americans either have no interest in VR or don’t know what it is. Developers foresee huge growth over the next six years as virtual reality social media apps arrive and prices for the technology fall.

http://bit.ly/29JHJAl

Solar System and Jupiter App


Explore the solar system by touch and augmented reality. A new app for iOS, Android, and Oculus Rift designed by BSU’s IDIA Lab allows users to learn about our sun and planets, and their moons. Physical museum objects trigger augmented experiences of each planet, providing deeper context and understanding. View the moons’ orbits and examine each moon, each planet’s atmosphere, and unique features such as Jupiter’s Great Red Spot. IDIA designed this augmented reality app as a prototype for the Boston Museum of Science Planetarium.

Virtual Buffalo Bill

Ball State University’s IDIA Lab (https://idialab.org) is developing a multiplayer virtual world that simulates Buffalo Bill’s Wild West show. This digital history project is built in Unity 3D using custom software created by IDIA Lab and is being produced for the Buffalo Bill Center of the West in Cody, WY. Scholars include Douglas Seefeldt and James Connolly at Ball State University. http://centerofthewest.org

Buffalo Bill's Wild West

Astym 3D Visualization

Trevor Danehy and Tom Anderson, students from the IDIA Immersion Seminar, designed and produced a 3D visualization of a therapeutic process developed by Dr. Tom Sevier of Performance Dynamics. The students worked with state-of-the-art software and a commercial anatomical 3D model with over 4,000 parts to realize a professional-quality, dynamic scientific illustration for an external client.

The ASTYM simulation was an animation used to visualize specific effects and treatments of tendinosis. Information was gathered from ASTYM researchers to turn a complex, unseen procedure into a visual explanation. Choosing a design style was a challenge because 3D animation in the medical field has a different goal than most common animations: there must be a balance between accuracy and simple, clean visuals. Various shader styles were considered to display the internal processes of the human arm accurately while avoiding anything realistically gruesome or any visual clutter that would be detrimental to explaining the complexity of the human body. Transparent yet visible objects were needed, so the students opted for a dynamic incidence shader, which shades the curvature and sides of objects while leaving the middle portions transparent enough to see what lies beneath them. Bright, glowing colors were used to highlight particular tendons and separate muscles from each other.
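The incidence shader described here is essentially a Fresnel-style falloff: opacity grows as a surface turns edge-on to the camera. The following is a small Python sketch of that term, illustrating the effect rather than reproducing the students’ actual shader code.

    def incidence_alpha(normal, view_dir, power=2.0):
        """Fresnel-style incidence term: transparent where the surface faces
        the viewer, opaque along silhouettes, so curvature reads clearly
        without hiding what lies beneath."""
        # Both vectors are assumed to be normalized.
        facing = abs(sum(n * v for n, v in zip(normal, view_dir)))
        return (1.0 - facing) ** power

    print(incidence_alpha((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # 0.0: transparent middle
    print(incidence_alpha((1.0, 0.0, 0.0), (0.0, 0.0, 1.0)))  # 1.0: opaque rim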

Links
Performance Dynamics