Virtual Hadrian’s Villa Launch at Harvard Center

IDIA Lab has designed a virtual simulation of the villa of the Roman Emperor Hadrian, which is a UNESCO World Heritage site located outside of Rome in Tivoli, Italy. This project has been produced in collaboration with the Virtual World Heritage Laboratory (VWHL) at Indiana University (IU), directed by Dr. Bernard Frischer and funded by the National Science Foundation. This large-scale recreation virtually interprets the entire villa complex in consultation with the world’s foremost Villa scholars. The project has been authored in the Unity game engine as a live virtual multi-user online learning environment that allows students and visitors to immerse themselves in all aspects of the simulated villa. The project launched at the Harvard Center for Hellenic Studies in Washington, DC on November 22, 2013. The webplayer versions of the Hadrian’s Villa project are funded through a grant from the Mellon Foundation.

The Launch of the Digital Hadrian’s Villa Project

The Center for Hellenic Studies, Ball State University, and Indiana University

Friday, November 22, 2013

Harvard Center for Hellenic Studies

Washington, DC

Speakers:
John Fillwalk, IDIA Lab, BSU
Bernard Frischer, VWHL, IU
Marina Sapelli Ragni

The presentations included previews of:
The Virtual World of Hadrian’s Villa
The Digital Hadrian’s Villa website

The project not only recreates the villa buildings but also includes a complete Roman avatar system, non-player characters with artificial intelligence, furniture, appropriate vegetation, a dynamic atmospheric system and a sophisticated user interface. The interface provides learning, navigation, reporting and assessment opportunities and also allows users to change the position of the sun to any date in 130 AD using data from the Horizons database at NASA JPL – testing theses of astro-alignments of architectural features during solstices and equinoxes. Learning communities are briefed on the culture and history of the villa and learn the virtual environment prior to immersing themselves within it. The avatar system allows visitors to enter the world by selecting class and gender, already aware of the customs and behavior of the Roman aristocrat, soldier, slave or politician.

Khan Academy Walkthrough of Virtual Hadrian’s Villa: http://youtu.be/Nu_6X04EGHk

Link to Virtual Hadrian’s Villa Walkthrough: http://youtu.be/tk7B012q7Eg

Khan Academy

The Khan Academy featured HDT / IDIA Lab and the Virtual World Heritage Laboratory’s Digital Hadrian’s Villa Project. Hosted by Beth Harris, Dean of Art and History at Khan Academy, a walkthrough with archaeologist and HDT research fellow Dr. Bernard Frischer showcases the project. A virtual tour of Hadrian’s Villa using a 3D digital model of the villa was created under the direction of Dr. Bernard Frischer in conjunction with the IDIA Lab. The ruins of Hadrian’s Villa, in the town of Tivoli, near Rome, are spread over an area of approximately 250 acres. Many of the structures were designed by the Emperor Hadrian, who ruled from 117 until his death in 138 C.E. This virtual rendering is based on current archaeological research and has been created in consultation with art historians, archaeologists, and museum curators with expertise in this area. Please note that a few features are necessarily assumptions based on the best available evidence. Dr. Bernard Frischer and Dr. Beth Harris are the speakers in this walkthrough, which was created by Beth Harris and Steven Zucker of Smarthistory.

NSF Funded Virtual Simulation of Hadrian’s Villa

IDIA Lab has designed a virtual simulation of the villa of the Roman Emperor Hadrian, which is a UNESCO World Heritage site located outside of Rome in Tivoli, Italy. This project is being produced in collaboration with the Virtual World Heritage Laboratory (VWHL) at the University of Virginia (UVA), directed by Dr. Bernard Frischer and funded by the National Science Foundation. This large-scale recreation virtually interprets the entire villa complex in consultation with the world’s foremost villa scholars and educators. The project has been authored in the Unity game engine as a live 3D multi-user online learning environment that allows students and visitors to immerse themselves in all aspects of the simulated villa.

The project will not only accurately recreate the villa buildings but also include a complete Roman avatar system, non-player characters with artificial intelligence, furniture, indigenous vegetation, a dynamic atmospheric system and a sophisticated user interface. The interface will not only provide learning, navigation, reporting and assessment opportunities but will also allow users to change the position of the sun to any date in 130 AD using data from the Horizons database at NASA JPL – testing theses of astro-alignments of architectural features during solstices and equinoxes. UVA students will be briefed on the culture and history of the villa as well as learn the virtual environment for five weeks prior to immersing themselves within it. The avatar system will allow them to enter the world by choosing class and gender, already aware of the customs and behavior of the Roman aristocrat, soldier, slave or politician. This project will be delivered to VWHL at UVA in early March.

The Digital Hadrian’s Villa Project:

Virtual World Technology as an Aid to Finding Alignments between

Built and Celestial Features

Bernard Frischer1

John Fillwalk2

1Director, Virtual World Heritage Laboratory, University of Virginia

2Director, IDIA Lab, Ball State University

Hadrian’s Villa is the best known and best preserved of the imperial villas built in the hinterland of Rome by emperors such as Nero, Domitian, and Trajan during the first and second centuries CE. A World Heritage site, Hadrian’s Villa covers at least 120 hectares and consists of ca. 30 major building complexes. Hadrian built this government retreat about 20 miles east of Rome between 117, when he became emperor, and 138 CE, the year he died. The site has been explored since the 15th century and in recent decades has been the object of intense study, excavation, and conservation (for a survey of recent work, see Mari 2010).

From 2006 to 2011, with the generous support of the National Science Foundation[1] and a private sponsor, the Virtual World Heritage Laboratory created a 3D restoration model of the entire site authored in 3DS Max. From January to April 2012, Ball State University’s Institute for Digital Intermedia Arts (IDIA Lab) converted the 3D model to Unity 3D, a virtual world (VW) platform, so that it could be explored interactively, be populated by avatars of members of the imperial court, and be published on the Internet along with a related 2D website that presents the documentation undergirding the 3D model.

The 3D restoration model and related VW were made in close collaboration with many of the scholars who have written the most recent studies on the villa.[2] Our goal was to ensure that all the main elements—from terrain, gardens, and buildings to furnishings and avatars—were evidence-based. Once finished, the model was used in two research projects.

The first project was an NSF-sponsored study of the usefulness of VW technology in archaeological education and research. We used the virtual villa in undergraduate classes at Xavier University and the University of Virginia to investigate the thesis of two recent studies by project advisors Michael Ytterberg and Federica Chiappetta about how this enormous built space was used by six different groups of ancient Romans, ranging from the Emperor and Empress to normal citizens and slaves (Ytterberg 2005; Chiappetta 2008). Avatars representing these groups have been created and are being operated by undergraduate students as a Problem‐Based Learning (PBL) experience. They are observed by subject experts, who are using the data generated to test and, if necessary, refine the initial theses about how circulation through the villa was handled. The results are still being evaluated. Preliminary indications are that the data show that the combination of VW technology and a PBL educational context is very effective in taking advantage of the known connection between the hippocampus and long-term learning, especially when the information to be mastered is spatial (Kandel 2007).

The second project involved use of the VW for some new archaeoastronomical studies. Most of our advisors’ publications, like the older work by archaeologists that preceded them, have concentrated on archaeological documentation, restoration, formal, and functional analysis. The latest research by advisor De Franceschini and her collaborator Veneziano (2011) combined formal and functional analysis: it considered the alignment of certain important parts of the villa in relation to the sun’s apparent path through the sky on significant dates such as the solstices. In their recent book they showed how two features of the villa are aligned with the solar solstices: the Temple of Apollo in the Accademia; and the Roccabruna. We used the VW to extend their research to other areas of the villa, taking advantage of 3D technology to restore the sun to the right place in the sky and also to restore the damage to the architecture of the villa, as De Franceschini and Veneziano had independently suggested be done before they learned about our digital model of the villa.

The work of De Franceschini and Veneziano is innovative. Archaeoastronomy has become an accepted field of study in recent decades, and a considerable amount of work has been done in Old and New World archaeology. In Roman archaeology, however, this approach is still rarely encountered. Significantly, one of the few compelling studies concerns the most famous Hadrianic building: the Pantheon in Rome. Hannah and Magli 2009 and Hannah 2011 have shown a number of solar alignments in the building, of which the most notable are the sun’s illumination of the entrance doorway at noon on April 21; and the view of the sunset silhouetting the statue of Hadrian as Sun god on a four-horse chariot atop the Mausoleum of Hadrian, as seen from the middle of the Pantheon’s plaza on the summer solstice. Like the summer solstice, April 21 is also a significant date: on it occurred the annual festival in Rome known as the Parilia (re-named the Romaia by Hadrian),[3] which celebrated the founding of Rome.

De Franceschini and Veneziano pursued an observation of Mangurian and Ray (2008) to document an impressive example of solar alignment at Hadrian’s Villa involving the tower known as Roccabruna at the western end of the villa. Originally a tower-like structure topped by a round temple, what remains today is the well-preserved, massive lower floor. The main entrance is located on the northwestern side to the right and gives access to a large circular hall covered by a dome. The dome is punctuated by an odd feature: five conduits that are wider on the outside than on the inside (figure 1).

What is the function of these unusual conduits? They have no known parallel in Roman architecture. After asking themselves this same question, the American architects Robert Mangurian and Mary Ann Ray went to Roccabruna at sunset on June 21, 1988, the day of the summer solstice, and discovered the extraordinary light phenomena that occur there. At sunset the Sun enters through the main door and illuminates the niche on the opposite side, something that happens on most summer days. But only on the days of the summer solstice does the Sun also penetrate the conduit located above that door: its rays emerge from the slot inside the dome, projecting a rectangular blade of light onto the opposite side of the dome. In June 2009, De Franceschini verified the findings of Mangurian and Ray. However, the apparent path of the Sun through the sky changes slightly each year, so that in the nearly 1,880 years separating us from Hadrian, the precise effect of the alignment has been lost. As De Franceschini and Veneziano noted, only a computer simulation can recreate the original experience of being in the lower sanctuary at Roccabruna at sunset on the summer solstice during the reign of Hadrian.

Once we had our 3D model of the site, we were able to obtain from NASA’s Horizons system[4] the correct azimuthal data for the year AD 130 and put the sun into the sky at sunset on the summer solstice. Following the lead of De Franceschini, who in the meantime had become a consultant to our project, we put into the niche one of the four statues of the Egyptian sky goddess Isis that were found at the Villa. De Franceschini chose Isis because, first, there is no question that a statue stood in this niche, so something had to be put there; second, the two flanking niches held candelabra whose bases are preserved and are decorated with Isiac iconography; and moreover, Isis’ festival in Rome fell on the summer solstice. So we scanned and digitally restored one of the several statues of Isis from the villa and put it into the central niche. Finally, for the dome, which we know from surviving paint was blue and therefore had the famous “dome of heaven” motif (Lehmann 1945), we followed De Franceschini in restoring a zodiac set up in such a way that the sign of Gemini is over the statue niche, since the last day of Gemini is the summer solstice. Our zodiac is adapted from the great Sun God mosaic in the Rheinisches Landesmuseum in Bonn, which kindly gave us permission to use it.

As can be seen in figure 2, when we restored the sun in the right position in the sky dome for sunset on the summer solstice (June 21) of 130 CE in our 3DS Max model of Roccabruna, the sunlight coming through the main doorway illuminated the statue of Isis in the statue niche, and the light entering through the conduit lit up the sign of Gemini painted on the cupola. So we were able to confirm the Mangurian-Ray thesis.
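
As a minimal illustration of this kind of test, the following sketch (not code from the project) checks whether the sun, at a given apparent azimuth and elevation, would shine through an opening of a given orientation and angular size. The numeric values are illustrative placeholders rather than measured dimensions of Roccabruna, and the geometry is deliberately reduced to an axis azimuth and an elevation band.

```python
# Hypothetical sketch of a built-celestial alignment test; the Opening values
# below are placeholders, not survey data from Roccabruna.
from dataclasses import dataclass

@dataclass
class Opening:
    axis_azimuth_deg: float       # direction the opening faces, degrees clockwise from north
    azimuth_halfwidth_deg: float  # angular half-width of the opening in azimuth
    min_elevation_deg: float      # elevation band the opening admits
    max_elevation_deg: float

def admits_sun(opening: Opening, sun_azimuth_deg: float, sun_elevation_deg: float) -> bool:
    """True if the sun, at the given apparent position, shines through the opening."""
    # Signed smallest angular difference between the sun and the opening's axis.
    d_az = (sun_azimuth_deg - opening.axis_azimuth_deg + 180.0) % 360.0 - 180.0
    return (abs(d_az) <= opening.azimuth_halfwidth_deg
            and opening.min_elevation_deg <= sun_elevation_deg <= opening.max_elevation_deg)

# Illustrative example: a doorway roughly facing the summer-solstice sunset.
doorway = Opening(axis_azimuth_deg=302.0, azimuth_halfwidth_deg=3.0,
                  min_elevation_deg=0.0, max_elevation_deg=4.0)
print(admits_sun(doorway, sun_azimuth_deg=302.5, sun_elevation_deg=1.0))  # True
```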

The approach we have taken in our Roccabruna project is deductive: Mangurian and Ray noted the strange feature of the conduits punctuating the cupola of Roccabruna and hypothesized a solar alignment. De Franceschini and Veneziano agreed and, for various reasons we don’t need to go into today, put a statue of Isis into the statue niche. We set up the conditions in which these hypotheses could be tested and were able to verify them.

But surely, if there is one such alignment at the villa of the same emperor who was responsible for the Pantheon, there may be others. But the villa is very big—covering over 100 hectares—and has 30 major building complexes, most larger than Roccabruna. Moreover, such alignments could just as easily involve astrological features such as the Moon and the planets. Faced with this level of complexity, the best methodological way forward in searching for new alignments is clearly inductive and empirical. This is one reason why we asked the Institute for Digital Intermedia Arts (IDIA Lab) of Ball State University to create a multi-user virtual world based in Unity 3D from our 3DS Max model.

The project of virtually interpreting a simulation on the scope and scale of Hadrian’s Villa was a daunting one – engaging layers of scholarly, technical and pedagogical challenges. The technical challenges were many – foremost, turning the Unity 3D game engine into an effective multi-user, avatar-based virtual world. An important factor was to create an environment that was straightforward and accessible via standard web browsers on both Mac and Windows, so we selected Unity 3D as the starting point for developing the platform. We required specific back-end administration tools to handle the accounts and server-side aspects of the project; for this we relied on SmartFoxServer, as it integrates with Unity 3D quite well. Our team took an approach that bridged and integrated disparate technologies, creating a robust virtual world platform to immersively augment both instructional and PBL processes. VW features available to the learning community included text-based communication, a live map showing current visitor positions, map-based teleportation, a managed voice channel, user-selected avatar gestures, a list of online users, paradata, photographs of the extant site, plan views, and integrated web links.

Key to the project was a varied system of avatars representing the imperial court, freemen, senators, scholars, soldiers, and slaves to the emperor. The avatar system provided several important functions for testing recent scholarly interpretations of circulation throughout the villa and the use of various spaces for typical court activities – meals, imperial audiences, bathing, worship, etc. Upon entering the simulation, the choice of avatar determined one’s social standing within the role-play of the world.

A gesture system was created via motion capture, providing each user with a unique set of actions and gestural responses for engaging in social interactions – including greetings, bowing and gestures specific to rank and class. Communication was also a critical element in the modes of problem-based learning engaged by the participants in the simulation. Specific technologies provided varied abilities such as public chat, private instant messaging and live multi-user voice channels.

A companion website was co-developed and integrated into the VW environment, providing learners with visual assets such as photographs and panoramas of the current site, site plans, elevations, and video interviews with Villa scholars. We also developed three-dimensional turntables of the interpreted and reconstructed models, overview information on each of the major Villa features, a bibliography and an expansive database of art attributed to the Villa site. This information can be accessed by learners directly from within the virtual world. The development team integrated the notion of paradata, introduced by the London Charter – making instantly transparent the scholarship and all underlying elements of the 3D model (from terrain to buildings, furnishings, costumes, and human behavior).

In support of a new research theme on celestial alignments by consultants De Franceschini and Veneziano, a major goal for the project was to develop an accurate simulation of the position of the sun. The solar tracking tool, or virtual heliodon, that we created in response to this research was envisioned as a bridge between the virtual environment and coordinates from an external database calculating solar positions. After investigating existing tools we decided to employ the Horizons database, created by NASA’s Jet Propulsion Laboratory as an on-line solar system data computation service that tracks celestial bodies in ephemerides from 9999 BCE to 9999 CE. In implementing solar tracking for the Villa project, in instances where we wanted to investigate potentially significant solar alignments, we entered the latitude, longitude and altitude of specific buildings from the Tivoli site to poll the Horizons data for the year 130 CE. The user was able to change the date and time of day, and quickly play the sun forward from specific moments, via the user interface. The system was correlated to both the Julian and Gregorian calendars and contained presets for the vernal and autumnal equinoxes as well as the summer and winter solstices.
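
To make the data flow concrete, here is a minimal sketch of the kind of Horizons query the heliodon depends on: polling the service for the Sun’s apparent azimuth and elevation as seen from the villa. The endpoint and parameter names are those of the current public Horizons web API, which may differ from the interface the project polled; the site coordinates are approximate and the date syntax for year 130 CE should be checked against the Horizons documentation.

```python
# Hedged sketch: query NASA JPL Horizons for the Sun's apparent azimuth/elevation
# at an approximate location for Hadrian's Villa. Parameter names follow the
# current public API and may not match the project's original integration.
import requests

HORIZONS_URL = "https://ssd.jpl.nasa.gov/api/horizons.api"

params = {
    "format": "text",
    "COMMAND": "'10'",                   # target body: the Sun
    "EPHEM_TYPE": "'OBSERVER'",          # observer-table ephemeris
    "CENTER": "'coord@399'",             # a user-specified site on Earth
    "COORD_TYPE": "'GEODETIC'",
    "SITE_COORD": "'12.77,41.94,0.09'",  # E longitude, latitude (deg), altitude (km), approximate
    "START_TIME": "'0130-06-21'",        # summer solstice, 130 CE (verify accepted date format)
    "STOP_TIME": "'0130-06-22'",
    "STEP_SIZE": "'10m'",
    "QUANTITIES": "'4'",                 # quantity 4 = apparent azimuth and elevation
}

response = requests.get(HORIZONS_URL, params=params, timeout=30)
response.raise_for_status()
# The ephemeris rows appear between the $$SOE and $$EOE markers in the returned text.
print(response.text)
```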

These tools allowed for the rapid discovery of potential alignments that might bear further investigation. The solar feature allows one to proceed empirically, in effect turning the clock back to 130 CE and running experiments in which the days and hours of the year are sped up by orders of magnitude, so that one can in a very short time find candidate alignments not yet hypothesized by scholars working in the traditional way of Mangurian and Ray.

As developers, our goal was to create the solar tool and let students and scholars use it to undertake their own empirical research. Our team was not intending to engage in this research ourselves, yet in the process of working within the environment daily we quickly began to notice curious solar phenomena. In a bit of empirical study of the very first component of the site we installed in the simulation – the Antinoeion, or newly discovered Temple of the Divine Antinous – we noticed an alignment of potential interest. The most likely alignment seemed at first glance to be along the main axis running from the entrance, through the obelisk in the central plaza, to the statue niche at the end of the axis. We ran the days and hours of the year and found that the sun and the shadow of the obelisk align at sunrise on July 20. We consulted our expert on the Egyptian calendar in the Roman period, Professor Christian Leitz of the University of Tuebingen, and he confirmed that this date has religious significance. It is, in fact, the date of the Egyptian New Year, as the Romans of Hadrian’s age clearly knew (cf. the Roman writer Censorinus, who states that the Egyptian New Year’s Day fell on July 20 in the Julian calendar in 139 CE, a date that coincided with the heliacal rising of Sirius in Egypt).
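
The scan itself can be reduced to a simple comparison, sketched below: given a table of sunrise azimuths for each day of the year (for example, parsed from a Horizons ephemeris like the one queried above), flag the dates whose azimuth falls within a tolerance of a building axis. The axis azimuth, tolerance, and the toy azimuth sweep are hypothetical placeholders, not surveyed values for the Antinoeion.

```python
# Hypothetical sketch of the "speed up the year" scan for candidate alignments.
from datetime import date, timedelta

def candidate_alignments(sunrise_azimuths, axis_azimuth_deg, tolerance_deg=1.0):
    """Return (date, azimuth) pairs whose sunrise azimuth is within tolerance of the axis."""
    hits = []
    for day, azimuth in sunrise_azimuths:
        delta = (azimuth - axis_azimuth_deg + 180.0) % 360.0 - 180.0
        if abs(delta) <= tolerance_deg:
            hits.append((day, azimuth))
    return hits

# Toy data: a crude linear sweep of sunrise azimuths standing in for real ephemeris values.
toy_year = [(date(130, 1, 1) + timedelta(days=i), 60.0 + 0.16 * i) for i in range(365)]
print(candidate_alignments(toy_year, axis_azimuth_deg=65.0))
```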

In the process of developing and subsequently utilizing the simulation tools we created for archaeoastronomical research, our conclusion has been that virtual world technologies can indeed take the inquiry into significant built-celestial alignments to a new level of insight.

Bibliography

Chiappetta, F. 2008. I percorsi antichi di Villa Adriana (Rome).

De Franceschini, M. and G. Veneziano. 2011. Villa Adriana. Architettura celeste. I segreti dei solstizi (Rome).

Hannah, R. 2008. Time in Antiquity (London).

Hannah, R. 2011. “The Role of the Sun in the Pantheon’s Design and Meaning,” Numen 58: 486-513.

Kandel, E. 2007. In Search of Memory: The Emergence of a New Science of Mind (W. W. Norton, New York). Kindle edition.

Lehmann, K. 1945. “The Dome of Heaven,” Art Bulletin 27: 1-27.

Lugli, G. 1940. “La Roccabruna di Villa Adriana,” Palladio, 4: 257-274

Mangurian, R. and M.A. Ray. 2008. “Re-drawing Hadrian’s Villa,” Yale Architectural Journal, 113-116.

Mari, Z. 2010. “Villa Adriana. Recenti scoperte e stato della ricerca,” Ephemeris Napocensis 20: 7-37.

Ytterberg, M. 2005. “The Perambulations of Hadrian. A Walk through Hadrian’s Villa,” Ph.D. dissertation, University of Pennsylvania.


[1] NSF grant # IIS-1018512.

[2] For a list of scholarly consultants, see http://idialabprojects.org/hvtest/mission.php.

[3] Athenaeus 8.361e-f.

[4] http://ssd.jpl.nasa.gov/?horizons.

Ball State University’s IDIA Lab is granted rights to Blue Mars virtual world technology from Avatar Reality

http://bit.ly/BSU_BlueMars

Ball State granted rights to develop $10 million Blue Mars virtual world technology (3/5/2012)

A digital laser scan of a centuries-old Buddha (above) and a simulation of the 1915 World’s Fair in San Francisco (below) are two examples of how Ball State has employed Blue Mars for cutting edge technology projects.

The future of virtual worlds has a new home at Ball State University, thanks to the generosity of Avatar Reality Inc., which granted the university rights to the multimillion-dollar 3-D virtual world platform, Blue Mars.

Blue Mars’ creator, Hawaii-based Avatar Reality, has provided expanded rights to the technology, valued at $10 million in research and development, to Ball State for 3-D simulation and research projects beyond the world of gaming.

“We are honored to have our institution selected to build upon the tremendous innovations within Avatar Reality’s Blue Mars platform,” said John Fillwalk, director of the Institute for Digital Intermedia Arts (IDIA) and senior director of Ball State’s Hybrid Design Technologies initiative. “This partnership between Avatar Reality and Ball State is an unprecedented gesture and a great distinction.”

Since 2009, Ball State’s IDIA has been a leading developer for Blue Mars, employing the virtual platform on projects such as digital laser scans of a centuries-old Buddha and the creation of the Virtual Middletown Project. The project, which ushers visitors via their computers into the world of the early 20th-century Ball Brothers Glass Manufacturing Co., is one example of the cultural heritage work possible with Blue Mars. Another is IDIA’s simulation of the 1915 World’s Fair in San Francisco.

“Avatar Reality designed in Blue Mars a next-generation 3-D platform that greatly advanced the graphic fidelity, functionality and scalability of virtual worlds — well beyond current standards,” Fillwalk said. “Its strategy connected social spaces across devices, from high-performance desktops to mobile phones. Our goal is to build upon both its technology and vision for the future of virtual worlds.”

Recognition speaks to reputation

Phil Repp, Ball State’s vice president for information technology, said the new agreement with Avatar Reality can be traced to Fillwalk’s commitment to advancing IDIA to the national level.

“The fact that other universities — and many of them of very high caliber — are working hard to gain expertise in hybrid worlds and that Ball State was identified as the university most likely to further develop this technology speaks volumes about our reputation, experience and abilities,” Repp said. “It’s an honor for Ball State and John to be singled out with this kind of recognition.”

Under Ball State’s terms of agreement with Avatar Reality, the university will begin to fully operate Blue Mars for noncommercial purposes, expand upon the source code, increase its research and academic initiatives, and enhance the community of Blue Mars. In addition, Ball State will deliver original content on Blue Mars as it has done in the past. Existing commercial activity will continue to be operated and supported by Avatar Reality, Inc.

“I am really excited about the future,” Repp said. “Through our division of Hybrid Design Technologies, Ball State will further our position as a national leader in hybrid worlds and their applications to mediated learning technologies. Our reputation in this discipline is such a perfect fit to our institutional history of innovation in teaching and learning.”

About Ball State: Located in Muncie, Ind., Ball State University is redefining education by providing relevant, immersive learning experiences that engage high-caliber students in intense interdisciplinary projects both in and out of the classroom. Its vibrant campus is home to about 22,000 undergraduate and graduate students from across the country and abroad. Learn more at www.bsu.edu.

About Blue Mars: Developed by Avatar Reality Inc., Blue Mars is a premium 3-D virtual world platform featuring unparalleled interactivity, fidelity, scalability and security, and it enables artists and developers to create and distribute interactive 3-D experiences to a global audience. Blue Mars launched in open beta in October 2009 and began selling virtual land to third party developers in January 2010. Blue Mars Mobile, an iOS app based on the content and technology from the virtual world, was introduced in February 2011.

IDIA Lab Featured on WTHR News for Virtual Reality research

The future of home entertainment is taking a giant step forward into the world of virtual reality. More devices are hitting the market that transform the way we look at the world around us, and a group of developers at Ball State is on the cutting edge.

It’s something we experienced first-hand at Ball State University, where they produce content for virtual reality devices at the Institute for Digital Intermedia Arts. The lab is located in the architecture building on the Ball State campus, where the focus is on the exploration of the arts, science and technology.

For the last ten years, Ball State has been developing a wide range of projects for museums, the History Channel and other clients.

John Fillwalk oversees the work done there.

“We basically explore 3D simulation and virtual reality,” he explained. “The skill sets from the staff are across computer science to design to animation.”

“We continue to further the emerging media brand aspect of this university, giving students professional opportunities to work and stay in Indiana for design technology jobs,” said John Fillwalk.

The virtual reality industry estimates more than 7 million headsets will ship this year, even though researchers have found more than half of Americans have no interest in VR and don’t know what it is. Developers foresee huge growth in the next six years as they develop virtual reality social media apps, combined with lower prices for the technology.

http://bit.ly/29JHJAl

Hero’s Horizons a Real-Time 3D Adventure Game

Hero’s Horizons is a real-time 3D adventure game where players can choose a character that shapes the outcomes of their own experience in learning about civic responsibility and the consequences and impact of their own actions. The game employs a graphic novel esthetic as it immerses players in the conflict and progress of improving the health of an urban community.

In order to design an environment which allowed mission-based gameplay as well as free-roaming exploration, the team at Ball State University’s IDIA lab employed Unity 3D alongside various modeling, animation, motion capture, scripting, painting and texturing packages to create the look and feel of the world. Characters came to life through cinematic and graphic novel techniques, allowing the story to flow as required by interactive missions, but yet allow free gameplay via exploration and mini-games. Players’ good deeds provided mementos from those characters whose lives they helped shape and change.

Ball State University and Creative Associates have developed and disseminated a highly successful and transformative life skills curriculum to help at-risk youth in Central America and the Caribbean. But Creative Associates’ life skills training has always happened face-to-face, making it difficult for organizations to scale it large enough to help the greatest number of at-risk youth. In 2013, Creative Associates and Ball State University joined forces to explore the possibilities of gaming and gamification to expand the reach and impact of the existing life skills curriculum. The result is Hero’s Horizons – an innovative game-based learning system designed to deliver critical life skills training to at-risk youth via mobile devices and the Web.

Creative Associates International and Ball State University have combined expertise to create a truly blended-learning game for change targeted at youth living in high crime areas of Central America and the Caribbean.  By bringing together a mobile-ready graphic novel game – Hero’s Horizons, built in Unity 3-D, and 20 interactive e-modules, Creative seeks to build the life skills of 10,000 youth living in gang affected communities. Youth can access the game and training through their mobile devices as well as in some of the 200 youth outreach centers Creative has set up across the region.

Hero’s Horizons allows the player to customize a character and then play the role of a young community organizer working in four sections of Capital City to help mobilize community youth and leaders to create a youth outreach center. Along the way he/she encounters apathy, political manipulation, and many other issues facing at-risk youth, and has the opportunity to choose how to respond while learning to overcome barriers at both the individual and community level.

The graphic novel game and accompanying e-modules help youth advance their life skills in 20 competencies, including conflict management, finding employment, and teamwork, among others. The game and the e-modules continually reinforce each other as part of a blended learning approach for long-term retention.

Youth’s achievements are rewarded through achievement badges as they successfully complete tasks and levels, as well as in the e-module section, where youth earn printable Certificates of Competency for completing modules and Mozilla badges for achieving learning paths.

The game is being rolled out and tested through Creative’s network of 200 youth outreach centers in the region and other youth serving organizations to measure its effectiveness and further adapt it to the realities that youth in these communities face.

www.creativeassociatesinternational.com

Download Android game at Google Play

https://play.google.com/store/apps/details?id=org.idialab.LifeSkills&hl=en

Wonders of Oz: iOS and Android app

Wonders of Oz is an augmented reality application that provides users an enhanced 3D experience with the film The Wizard of Oz. This demo app can be triggered during the film (or anywhere, anytime) to deliver animated content and sound to augment the screening and share events with a live audience. Flying monkeys, the witch on her burning broom, lollipops, the hot air balloon, Dorothy’s falling house, cows, chickens, rowboats, bicycles, rainbows and the wizard all appear with the press of a button! Users can use their devices to discover and track the virtual performers as they appear around them during the film. The original version contained more 3D content and was designed to work on a timer so the audience could share in the digital performers appearing on cue! This app can be adapted for any event, live theater, film concerts, art and cultural heritage events, etc. Note: This application requires the use of a gyroscope – not all Android devices possess one.

iOS: http://apple.co/1SmwI7Z

Google Play: http://bit.ly/1RRbNds

Buffalo Bill’s Wild West gets a virtual makeover

Digital artists from Ball State’s IDIA Lab built their virtual simulation of Buffalo Bill’s Wild West with the Unity 3-D gaming platform along with custom software created by the lab.

Cowboys and Indians captivated the country when Buffalo Bill’s Wild West rolled through America in the late 1800s and early 1900s. More than a century later, Ball State digital artists have re-created the legendary showman’s outdoor exhibition.

Working with staff from the Buffalo Bill Center of the West, artists and designers from Ball State’s Institute for Digital Intermedia Arts (IDIA) have crafted a computer-generated world that authentically simulates the Wild West show, dramatizing frontier life.

“The visual look and feel of the project is something we’re really proud of,” said John Fillwalk, IDIA director and senior director of the university’s Hybrid Design Technologies initiative.

Fillwalk collaborated on the project with Jeremy Johnston, curator of the center’s Buffalo Bill Museum, and Ball State historians James Connolly and Douglas Seefeldt.

As a senior digital editor of the Papers of William F. Cody, Seefeldt has worked closely with Johnston on several projects the National Endowment for the Humanities recently recognized as among the most significant it funded.

“When Doug introduced me to John, I was excited because all we had to visually represent the Wild West show at the Buffalo Bill Museum was this small architectural diorama,” said Johnston, who is also managing editor of the Papers of William F. Cody. “It gave our visitors an aerial overview of the show but lacked action.

“What the IDIA captured for us is the look and feel of the experience, right down to the sound effects of horses and the stage coach running through the arena.”

Buffalo Bill’s Muncie visit

Interactive Firearm Demonstration

IDIA-created augmented reality apps will feature objects in the museums’ collections, such as firearms.

The Virtual Buffalo Bill project offered a crossover research opportunity for Connolly, director of Ball State’s Center for Middletown Studies. The center is developing Virtual Middletown, a 3-D visualization of industrializing early 20th-century Muncie, and the Buffalo Bill simulation offered an opportunity to produce a module as part of that endeavor.

Connolly and Seefeldt provided Fillwalk with photographs and newspaper accounts of Buffalo Bill’s 1899 stop in Muncie. “He personified the Wild West for audiences in these small towns,” Connolly said.

Connolly’s and Seefeldt’s research, along with assets provided by the Buffalo Bill Center, allowed Fillwalk and his artists to create beautifully rendered graphics based on data and research, hallmarks that have distinguished IDIA’s work in emergent media design.

“The attack on the Deadwood Stage Coach is simulated down to representing John Y. Nelson, one of America’s original Mountain Men driving the coach,” Fillwalk explained. “And Cody himself—along with his wardrobe—was painstakingly researched and re-created. His appearance was based on specific clothing of Cody’s in the museum collection that we were allowed to photograph.”

Seefeldt said Fillwalk’s re-creations uniquely capture William F. “Buffalo Bill” Cody.

“His show had it all—buffalos, the Pony Express, Annie Oakley, re-enactments of iconic events in the history of the West. He was one of the most famous people in the country, a celebrity of that era, and it’s a thrill to see the way John has brought him back to life.”

Ball State-Center of the West partnership continues

Located in Cody, Wyoming, the Buffalo Bill Center of the West includes the Buffalo Bill, Draper Natural History, Whitney Western Art, Plains Indian and Cody Firearms museums, along with the McCracken Research Library.

The Origins of Buffalo Bill

Born in 1846, William F. Cody rode for the Pony Express, served as a military scout and earned his moniker “Buffalo Bill” while hunting the animals for the Kansas Pacific Railroad work crews. Beginning in 1883, he became one of the world’s best showmen with the launch of Buffalo Bill’s Wild West, which was staged for 30 years, touring America and Europe multiple times.

The IDIA Lab’s next project for the center will be a series of augmented reality apps featuring objects in each museum’s collection. By holding electronic devices over images like a grizzly bear or gun, users can learn more about them as 3-D models of the subjects pop up on screen.

“By using their phones or tablets, visitors can see museum exhibits come to life,” Fillwalk said. “All of our work is meant to give visitors a greater appreciation for these assets with the aid of our digital interpretations.”

Johnston said what he likes best about Fillwalk’s approach is the way “he puts technology in the users’ hands.”

“I’ve seen so many younger people walking through our museums with their heads down, glued to their iPhones and iPads. With John’s help, I’m excited that we’re taking something they’re so familiar with and using it in a way to get them to engage with our exhibits here.”

Funding for the Virtual Buffalo Bill project was provided by a grant from the Buffalo Bill Center of the West, which was matched by internal grant funding from Ball State.

http://cms.bsu.edu/news/articles/2015/10/ball-state-gives-buffalo-bills-wild-west-a-virtual-makeover

Temple of Artemis: Wonder of the Ancient World

IDIA Lab was contracted by digital humanities scholars at UCLA to design and build a virtual simulation of the Temple of Artemis, one of the Wonders of the Ancient World. This massive Greek temple, four times the size of the Parthenon, lies in ruin in present-day Turkey. The simulation incorporates our CelestialEngine, which accurately positions both the sun and moon using a site’s latitude, longitude, altitude and year via NASA JPL data. This particular simulation studies whether an opening in the temple’s portico allowed moonlight to illuminate the statue of Artemis on her feast day.
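
As an indication of how such a lunar study can draw on the same JPL data, the sketch below repeats the Horizons query pattern used for solar tracking, switched to the Moon (target ‘301’). The endpoint and parameter names are those of the current public Horizons API, and the coordinates and date are rough placeholders, not values from the UCLA study.

```python
# Hedged sketch: poll NASA JPL Horizons for the Moon's apparent azimuth/elevation
# near Ephesus; the coordinates and the date below are approximate placeholders.
import requests

params = {
    "format": "text",
    "COMMAND": "'301'",                  # target body: the Moon
    "EPHEM_TYPE": "'OBSERVER'",
    "CENTER": "'coord@399'",
    "COORD_TYPE": "'GEODETIC'",
    "SITE_COORD": "'27.36,37.95,0.02'",  # E longitude, latitude (deg), altitude (km), approximate
    "START_TIME": "'2024-05-23'",        # arbitrary placeholder date, not the ancient feast day
    "STOP_TIME": "'2024-05-24'",
    "STEP_SIZE": "'30m'",
    "QUANTITIES": "'4'",                 # apparent azimuth and elevation
}

print(requests.get("https://ssd.jpl.nasa.gov/api/horizons.api", params=params, timeout=30).text)
```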

IDIA Lab wins the Bronze Medal at iENA Exposition in Nuremberg, Germany

IDIA Lab’s Virtual Pantheon/Kinect project in Blue Mars and our occupational simulation for Alzheimer’s patients won the Bronze Medal at the iENA International Trade Fair for Ideas, Inventions and New Products in Nuremberg, Germany. http://www.iena.de/en/home.html

The Virtual Pantheon Project in Blue Mars is a multi-user simulation of the Roman Pantheon and its surroundings as they might have existed in 320 AD. The structure is the largest and most intact surviving example of ancient Roman architecture in existence. The building was commissioned by Marcus Agrippa during the reign of Augustus as a temple to all the gods of ancient Rome, and was rebuilt by the emperor Hadrian about 126 AD. The simulation was created by the Institute for Digital Intermedia Arts at Ball State University in consultation with the University of Otago, NZ and the Virtual World Heritage Laboratory at Indiana University. It leverages the significant graphic capabilities of the CryEngine to create a compelling immersive experience. IDIA Lab and Avatar Reality worked together to design an interactive solar tracker that allows the visitor to change the time of day around important celestial and celebratory events in ancient Rome. The project can also be interacted with via an MS Kinect sensor directly in Blue Mars. More detail here: http://bit.ly/18cfDKu
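To suggest how a solar tracker of this kind might drive scene lighting, the hedged sketch below converts a computed sun altitude/azimuth into a unit direction vector that a game engine’s directional light could point along. The axis convention and function name are illustrative assumptions, not the project’s code.

```python
# Illustrative sketch (not the project's code): turn a sun altitude/azimuth pair
# into a direction vector for a scene's directional light.
# Assumes a Y-up convention with north along +Z and east along +X.
import math

def sun_light_direction(alt_deg: float, az_deg: float):
    alt = math.radians(alt_deg)
    az = math.radians(az_deg)
    # Vector pointing from the scene toward the sun; negate it so the
    # light shines from the sun toward the scene.
    x = math.cos(alt) * math.sin(az)   # east component
    y = math.sin(alt)                  # up component
    z = math.cos(alt) * math.cos(az)   # north component
    return (-x, -y, -z)

# Sun due south at 45 degrees: light points downward and toward the north.
print(sun_light_direction(45.0, 180.0))
```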

Virtual Art Museum and Virtual Collaboration Center Projects in AvayaLive Engage

The IDIA Lab has developed two new simulations for AvayaLive Engage, a multi-user virtual world platform based on the Unreal Game Engine. The Virtual Collaboration Center is a multi-use teaching and learning facility prototype for large and small groups, containing rich media and collaboration tools including desktop sharing, presentations, a drop box, whiteboards, streaming video and webcam. The Virtual Art Museum is a demonstration of an interactive museum experience with links to companion webpages, including a database that allows for extremely close examination of the paintings. Both simulations work on Mac and PC within industry-standard web browsers and will launch in May 2013. You can visit the project here: https://wa11619.avayalive.com/11619/html/index.html Certain areas require administrative permission; we will be opening these up in the near future.

AvayaLive Engage
http://avayalive.com/Engage/Products.aspx
Unreal Game Engine UDK
http://www.unrealengine.com

Gaming in Virtual Worlds and on Mobile Devices

REAL TIME 3D ENVIRONMENTS

Virtual worlds are three-dimensional environments that can provide compelling shared experiences for their users. These media-rich immersive spaces can connect to social networks, web-services, and data to bridge into an entire universe of external content. The IDIA has developed scores of projects in virtual worlds and game environments including Blue Mars, Unity 3D, Second Life, Open Simulator, and Quest 3D.

The IDIA built and animated Odie’s character using Maya 3D, an industry-standard animation software package used in the feature film and visualization industries. The game, which IDIA designed with Paws characters, was developed in Unity 3D.

The Paws characters Odie and Garfield and the neighborhood were then brought into the Unity game engine, where the lighting, real-time shadows, and physics simulations were designed. Unity’s scripting capability was also used to control the Odie character’s animation and movement, trigger collision sounds, and animate Garfield as Odie moves by. There is a selection of objects for Odie to interact with, each with its own physical properties, including weight, friction, and bounciness.
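As a rough illustration of these per-object properties, the toy Python sketch below shows how weight, friction and bounciness values might shape a simple collision response. It is not Unity code, and the object names and numbers are invented for the example.

```python
# Toy sketch (not Unity code): per-object physical properties like those given
# to Odie's props, and how bounciness (restitution) and friction might shape a
# simplified collision response. All values are illustrative.
from dataclasses import dataclass

@dataclass
class PropObject:
    name: str
    mass: float        # kg
    friction: float    # 0 = ice, 1 = very grippy
    bounciness: float  # coefficient of restitution, 0-1

def bounce_response(obj: PropObject, vertical_speed: float, sliding_speed: float):
    """Return (vertical, sliding) speed after the object hits the ground."""
    rebound = vertical_speed * obj.bounciness           # rebound scaled by restitution
    slide = sliding_speed * (1.0 - obj.friction)        # friction bleeds off sliding speed
    return rebound, slide

ball = PropObject("beach_ball", mass=0.5, friction=0.2, bounciness=0.9)
trashcan = PropObject("trashcan", mass=8.0, friction=0.7, bounciness=0.1)

for obj in (ball, trashcan):
    print(obj.name, bounce_response(obj, vertical_speed=4.0, sliding_speed=2.0))
```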

http://unity3d.com

MOBILE ENTERTAINMENT

Mobile devices are increasingly used to help us navigate, communicate, stay informed and be entertained. Devices such as the Apple iPhone are capable of supporting rich and complex 3D experiences such as augmented reality and games. The IDIA has been developing numerous projects for PDA-enabled mobile phones, including applications for social networking, conference systems, augmented reality walking tours, live musical performance and games.

The prototype 3D game that IDIA built with Paws content has also been repurposed as an iPhone game, exhibiting the range and scalability of these authoring environments. The same content can also be realized as a Nintendo Wii game using this production workflow.

This authoring environment supports two and three-dimensional game spaces, interacting through clicks, tilts and touch. Users can walk Odie through his neighborhood, encountering trashcans, balls and Garfield himself!

http://unity3d.com/iPhone

Astym 3D Visualization

Trevor Danehy and Tom Anderson, students from the IDIA Immersion Seminar, designed and produced a 3D visualization of a therapeutic process developed by Dr. Tom Sevier of Performance Dynamics. The students worked with state-of-the-art software and a commercial anatomical 3D model with over 4,000 parts to realize a professional-quality, dynamic scientific illustration for an external client.

The ASTYM simulation was an animation used to visualize specific effects and treatments of tendinosis. Information was gathered from ASTYM researchers to turn a complex, unseen procedure into a visual explanation. Choosing a design style was a challenge because 3D animation in the medical field has a different goal than most common animation: it must balance accuracy with simple, clean visuals. Various shader styles were considered to display the internal processes of the human arm accurately while avoiding anything realistically gruesome or any visual clutter that would be detrimental to explaining the complexity of the human body. Transparent yet visible objects were needed, so the students opted for a dynamic incidence shader, which shades the curvature and sides of objects while leaving the middle portions transparent enough to see what lies under them. Bright, glowing colors were used to highlight particular tendons and to separate muscles from each other.
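The idea behind an incidence shader can be sketched in a few lines: opacity is driven by the angle between the surface normal and the view direction, so silhouettes read as solid while face-on regions stay see-through. The Python sketch below is not the students’ shader code; the opacity limits and falloff exponent are assumed values for illustration.

```python
# Sketch of the idea behind an incidence ("Fresnel-style") shader: surfaces
# facing the camera stay mostly transparent, while glancing-angle edges become
# opaque, outlining curvature. Parameters are illustrative only.
import numpy as np

def incidence_opacity(normal, view_dir, edge_opacity=0.9, face_opacity=0.1, power=2.0):
    n = np.array(normal, dtype=float)
    v = np.array(view_dir, dtype=float)
    n /= np.linalg.norm(n)
    v /= np.linalg.norm(v)
    facing = abs(float(np.dot(n, v)))        # 1 = facing the camera, 0 = edge-on
    edge_term = (1.0 - facing) ** power      # grows toward silhouettes
    return face_opacity + (edge_opacity - face_opacity) * edge_term

print(incidence_opacity([0, 0, 1], [0, 0, -1]))   # face-on: near 0.1 (see-through)
print(incidence_opacity([1, 0, 0], [0, 0, -1]))   # edge-on: near 0.9 (opaque rim)
```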

Links
Performance Dynamics

Biotown

IDIA, in collaboration with the College of Architecture and Planning, will be constructing a virtual tour of the town of Reynolds, Indiana for Governor Mitch Daniels’s Biotown Project. Reynolds will eventually be taken off the power grid and sustained by biologically based fuels. The virtual tour will show the town as it exists today and how it will appear after the project is completed.

News Items:

Biotown, USA

Augmented Reality Research

The IDIA is utilizing the open-source Augmented Reality toolkit to explore possible applications of AR in education and the arts. Of particular interest is the use of AR in architecture to evaluate students’ white models effectively in a group setting.

Dave Schultz demonstrates the use of AR to display 3D architectural models.

Dave Schultz demonstrates the mapping of video onto an AR object.
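The general marker-based workflow behind these demonstrations is to detect a printed marker in a camera frame and estimate its pose so a 3D model or video can be drawn on top of it. The sketch below uses OpenCV’s ArUco module rather than the toolkit named above, purely to illustrate the detect-then-pose pattern; the camera intrinsics are placeholders, and the function names follow the pre-4.7 opencv-contrib interface.

```python
# Rough sketch of the marker-based AR pattern: detect a fiducial marker and
# estimate its pose (the pose would position the 3D architectural model).
# Uses OpenCV's ArUco module as an illustration; intrinsics are placeholders.
import cv2
import numpy as np

camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])   # placeholder intrinsics
dist_coeffs = np.zeros(5)

dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)

frame = cv2.imread("webcam_frame.png")         # stand-in for a live capture
if frame is None:
    raise SystemExit("No input frame available")

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)

if ids is not None:
    # 0.05 m marker side length; returns rotation/translation per detected marker
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, 0.05, camera_matrix, dist_coeffs)
    for marker_id, rvec, tvec in zip(ids.flatten(), rvecs, tvecs):
        print(f"marker {marker_id}: rotation={rvec.ravel()}, translation={tvec.ravel()}")
```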

Virtual Middletown Living Museum in Blue Mars

The Virtual Middletown Living Museum project in Blue Mars is a simulation of the Ball Glass factory in early 20th-century Muncie, Indiana. Life and conditions in the factory were among the key elements examined by Robert S. and Helen Merrell Lynd in their landmark studies Middletown (1929) and Middletown in Transition (1937). These in-depth accounts of life in Muncie became classic sociological studies and established the community as a barometer of social trends in the United States. In the years since, scholars in a variety of fields have returned to Muncie to follow up on the Lynds’ work, making this small city among the most studied communities in the nation. The Center for Middletown Studies continues this tradition by sponsoring and promoting research on Muncie as Middletown, on small cities generally, and on the themes and issues the Lynds explored.

This simulation of industrial life, built as a prototype for a much larger project dealing with all aspects of the Lynd study, aims to create a virtual living-museum experience that expands the opportunities for both learning and interpretation. The approach to interactive design embeds learning and navigation experiences subtly into the project to maintain the sense of immersion. IDIA has prototyped several techniques to do this, including: interactive objects that allow for close-up inspection; objects that, when clicked, bring up web resources with related information, plans or photographs used in the interpretation; non-player character factory workers; a live interactive avatar of Frank C. Ball who greets visitors and introduces them to the factory; video and audio files of factory experts and archival films; an in-world interactive heads-up display (HUD) that provides deeper investigation and navigation through the factory; and a supporting webpage with complete documentation on all resources used in this interpretation.
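One way to picture the click-to-resource interaction described above is as a simple registry that maps clickable object IDs to interpretive resources. The sketch below is illustrative only; the object names, resource types and URLs are invented for the example and are not Virtual Middletown’s actual data.

```python
# Illustrative sketch (not Virtual Middletown's code): a registry mapping
# clickable objects to interpretive resources, the pattern behind
# "click an object, open its plan, photograph, or web page." IDs and URLs are invented.
RESOURCES = {
    "glass_furnace": {"kind": "web", "target": "https://example.org/furnace-history"},
    "blowing_station": {"kind": "photo", "target": "archive/blowing_station_1905.jpg"},
    "frank_c_ball_avatar": {"kind": "audio", "target": "media/ball_welcome.ogg"},
}

def on_object_clicked(object_id: str) -> str:
    """Return a description of the action the HUD should take for a click."""
    entry = RESOURCES.get(object_id)
    if entry is None:
        return f"No interpretive resource registered for '{object_id}'."
    return f"Open {entry['kind']} resource: {entry['target']}"

print(on_object_clicked("glass_furnace"))
print(on_object_clicked("unknown_crate"))
```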

To download the Blue Mars client, create an account and tour Virtual Middletown, please visit: http://blink.bluemars.com/City/IDIA_IDIALabExhibitions/

Follow these steps to get set up:

  1. Select “Download Client” and follow the instructions to install the BlueMars client on your PC (Windows desktop or laptop)
  2. Once you have successfully installed the BlueMars client, select “Go To City” to install the Virtual Middletown virtual world
  3. Register your account and confirm when you receive an email from BlueMars
  4. Modify your avatar (optional)
  5. Explore Virtual Middletown!

NOTE: If you are a Macintosh user (OS X), you can run the BlueMars client and the Virtual Middletown virtual world by installing Windows with Apple’s Boot Camp: http://www.apple.com/support/bootcamp/

Here are links for additional information on the project:

  1. Center for Middletown Studies project website: http://cms.bsu.edu/Academics/CentersandInstitutes/Middletown/Research/Virtual/Virtual.aspx
  2. IDIA Lab project website: https://idialab.org/virtual-middletown-living-museum-in-blue-mars/

Recommended settings: Very High graphics with a good graphics card; otherwise High or Low as needed. Screen resolution: minimum 1280 by 720 or higher. Sound levels should be up. Many objects in the world are interactive: anything that highlights blue can be clicked with the left mouse button and examined, or may open a reference web page. The heads-up display in the lower right-hand corner provides information and navigation to augment your visit.

Project partners: The Center for Middletown Studies, Library Services and the Emerging Media Initiative at Ball State University

Press release: http://www.bsu.edu/news/article/0,1370,7273-850-65401,00.html

IDIA Lab Simulations in Blue Mars

Ball State University’s Institute for Digital Intermedia Arts (IDIA) presence in Blue Mars is a large-scale virtual simulation that showcases re-creations of artifacts and artworks in their original historical contexts, including sculptures at the 1915 Panama-Pacific International Exposition in San Francisco and a centuries-old Buddha at a Japanese temple. Through the use of such emerging technologies, visitors to Blue Mars can virtually experience these important but long-lost sites.

IDIA’s simulations in Blue Mars present several examples of recent grant-supported research projects. IDIA is exploring various new ways that the platform might be used, from learning and the arts to gaming. Future IDIA initiatives in Blue Mars will include prototyping the 3D internet, shared media collaboration tools and the development of an open virtual campus for use by educators interested in engaging the platform as an environment for learning.

Videos on IDIA Lab Youtube Channel
http://www.youtube.com/user/IDIALab#p/u

Images on IDIA Lab Flickr Photostream
http://www.flickr.com/photos/idia_lab/

This release of IDIA Lab introduces the following new features and updates:

• Heads-up display system: a Flash-based HUD displaying dynamic information and map-based teleportation

• Automated avatar/bot greeter system

• A guided tour vehicle, launching from the orientation island

• The Snow Globe, a virtual winter simulation employing physics, environment and particle systems; teleports are placed throughout the City

• Depth of field, now enabled in most environments

• New navigation and orientation system

• New vegetation design: new plants with LOD

• High-fidelity statues using advanced mapping techniques

• High optimization, terrain painting and parallax mapping

• Please use Very High graphics settings to view


Blue Mars Builds


The IDIA Lab

THE INSTITUTE FOR DIGITAL INTERMEDIA ARTS is a collaborative research and design studio exploring the intersections between art, science and technology. We consult on, design and produce immersive experiences for virtual reality, hybrid worlds, simulations, visualizations, games and interfaces for a variety of platforms and devices. For more information, please visit IDIALAB.org. You can walk to all of the simulations or use the teleport columns you see at each location. Follow us on Twitter @IDIA_Lab for news and updates.

Panama-Pacific International Exposition Simulation

This is IDIA’s simulation of the 1915 San Francisco Panama-Pacific International Exposition. The project allows visitors to travel to the past and immersively tour a recreation of a historic environment that no longer exists. The exposition celebrated the opening of the Panama Canal, but also San Francisco’s recovery from the devastating earthquake of 1906. IDIA 3D laser-scanned two sculptures by artist Adolph Weinman that are included in this simulation and were originally installed atop tall columns in the Court of the Universe. A more detailed examination of the sculptures can be found in our Museum Simulator. Visitors can change the time of day using controls found near this sign and the Palace of Fine Arts to experience the lighting design of the exposition.

Palace of Fine Arts

The Palace of Fine Arts in the Marina District of San Francisco, California was originally constructed for the 1915 Panama-Pacific Exposition. One of a handful of surviving structures from the exposition, it is still situated on its original site. It was rebuilt in 1965, and renovation of the lagoon and walkways, along with a seismic retrofit, was completed in early 2009. IDIA Lab constructed this building as an addition to its Panama-Pacific Court of the Universe simulation.

Art Museum Simulation

Using blueprints, photographs, laser scanning and measurements, IDIA Lab simulated an actual museum gallery to a high degree of accuracy for exploration and study within a live 3D environment. The goals of this build were to provide visitors with a remote immersive experience of an actual exhibition as it was installed, closely replicating original factors such as the layout, framing and lighting of the physical space. Additional information is provided by touching each canvas or sculpture in the exhibition. Via a simulation such as this, curators can also spatially archive a show or prototype layout, lighting, and installation design.

Byodo-In Temple (Amida Hall)

Amida Hall, the most famous building in the Byodo-in temple complex, was constructed in 1053 and is the only remaining building from the original site. The additional buildings making up the compound were burnt down during a civil war in 1336. IDIA scanned an Amida Buddha and created this temple site to give visitors an understanding of a typical context in which the Buddha was observed. A replica of the temple was also constructed on the island of O’ahu in Hawai’i. Nearby, there are 360º immersion spheres of the Hawaiian temple that avatars can also explore.

Relic Temple

The Relic Temple is located in the Ji Le Temple complex in the Nangang District of Harbin, China. The Relic Temple simulation was built by IDIA Lab as part of our larger 3D Laser Scanning and Simulation Project, which also includes the 1915 Panama-Pacific Exposition in San Francisco, the Byodo-In Temple in Japan and the Bingling Si Buddhist cave site, all simulated here on IDIA Lab in Blue Mars.

Shafer Tower

Shafer Tower is a 150-foot-tall (46 m) bell tower with a chiming clock and carillon bells located in the middle of the campus of Ball State University. A small staircase in the tower leads to a control room of the carillon, which has 48 custom-made bells. This virtual tower chimes when an avatar approaches – as does the bell in a smaller temple at our Byodo-in build.
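A chime-on-approach behavior like this is typically a proximity trigger: the bell plays when an avatar enters a radius around the tower and re-arms once it leaves. The Python sketch below is a minimal, assumed version of that logic, not the build’s actual script; the radius and positions are placeholders.

```python
# Minimal sketch of a proximity trigger like the one described above: the bell
# chimes when an avatar moves inside a radius and re-arms once it leaves.
# Radius, positions and the chime action are placeholders.
import math

CHIME_RADIUS = 25.0  # meters, illustrative

class ChimeTrigger:
    def __init__(self, tower_position):
        self.tower = tower_position
        self.armed = True

    def update(self, avatar_position):
        dist = math.dist(self.tower, avatar_position)
        if dist <= CHIME_RADIUS and self.armed:
            self.armed = False
            print("Play carillon chime")
        elif dist > CHIME_RADIUS:
            self.armed = True

trigger = ChimeTrigger((0.0, 0.0, 0.0))
for pos in [(100.0, 0.0, 0.0), (20.0, 0.0, 0.0), (10.0, 0.0, 0.0), (40.0, 0.0, 0.0)]:
    trigger.update(pos)
```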

The Snow Globe

A winter-themed landscape simulation including environment, weather, particle and physics systems.