Virtual Hadrian’s Villa Launch at Harvard Center

IDIA Lab has designed a virtual simulation of the villa of the Roman Emperor Hadrian, which is a UNESCO World Heritage site located outside of Rome in Tivoli, Italy. This project has been produced in collaboration with the Virtual World Heritage Laboratory (VWHL) at Indiana University (IU), directed by Dr. Bernard Frischer and funded by the National Science Foundation. This large-scale recreation virtually interprets the entire villa complex in consultation with the world’s foremost Villa scholars. The project has been authored in the game engine of Unity as a live virtual multi-user online learning environment that allows students and visitors to immerse themselves in all aspects of the simulated villa. The project launched at the Harvard Center for Hellenic Studies in Washington, DC on November 22, 2013. The webplayer versions of the Hadrian’s Villa project are funded through a grant from the Mellon Foundation.

The Launch of the Digital Hadrian’s Villa Project

The Center for Hellenic Studies, Ball State University, and Indiana University

Friday, November 22, 2013

Harvard Center for Hellenic Studies

Washington, DC

Speakers:
John Fillwalk, IDIA Lab, BSU
Bernard Frischer, VWHL, IU
Marina Sapelli Ragni

The presentations included previews of:
The Virtual World of Hadrian’s Villa
The Digital Hadrian’s Villa website


The project not only recreates the villa buildings but also includes a complete Roman avatar system, non-player characters with artificial intelligence, furniture, appropriate vegetation, a dynamic atmospheric system and a sophisticated user interface. The interface provides learning, navigation, reporting and assessment opportunities and also allows users to change the position of the sun to any date in 130 AD using data from the Horizons database at NASA JPL – testing theses of astronomical alignments of architectural features during solstices and equinoxes. Learning communities are briefed on the culture and history of the villa and learn the virtual environment prior to immersing themselves within it. The avatar system allows visitors to enter the world by selecting class and gender – already aware of the customs and behavior of the Roman aristocrat, soldier, slave or politician.
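For readers curious about the mechanics, the sketch below shows one way to poll the public JPL Horizons API for the Sun’s apparent azimuth and elevation over Tivoli on the summer solstice of 130 AD. It is a minimal Python illustration: the site coordinates are approximate and the parameter values follow the documented Horizons web API, but this is not the project’s production code, which runs inside Unity.

```python
# Minimal sketch: query the JPL Horizons API for the Sun's apparent
# azimuth/elevation over Hadrian's Villa (Tivoli) in 130 AD.
import requests

HORIZONS = "https://ssd.jpl.nasa.gov/api/horizons.api"

params = {
    "format": "text",
    "COMMAND": "'10'",                     # 10 = the Sun
    "EPHEM_TYPE": "'OBSERVER'",
    "CENTER": "'coord@399'",               # topocentric site on Earth (body 399)
    "COORD_TYPE": "'GEODETIC'",
    "SITE_COORD": "'12.774,41.943,0.12'",  # approx. E lon, lat, altitude (km) at Tivoli
    "START_TIME": "'0130-06-21'",          # Horizons reads pre-1582 dates as Julian
    "STOP_TIME": "'0130-06-22'",
    "STEP_SIZE": "'15m'",
    "QUANTITIES": "'4'",                   # quantity 4 = apparent azimuth & elevation
}

response = requests.get(HORIZONS, params=params, timeout=30)
print(response.text)  # rows of date/time, AZ, EL used to drive the virtual sky
```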

Khan Academy Walkthrough of Virtual Hadrian’s Villa: http://youtu.be/Nu_6X04EGHk

Link to Virtual Hadrian’s Villa Walkthrough: http://youtu.be/tk7B012q7Eg


Mellon Foundation Humanities Virtual World Consortium


BSU’s IDIA Lab was contracted by the Mellon Foundation Humanities Virtual World Consortium to design and develop a major open source virtual world initiative for Digital Humanities and Cultural Heritage projects. The consortium – comprising King’s College London, UCLA, the University of Virginia and Trinity College Dublin – turned to the expertise of BSU’s IDIA Lab to create this innovative two-year hybrid simulation platform, which leverages new modes of teaching and learning in immersive environments.

http://virtualworlds.etc.ucla.edu

Concurrent with the development of the prototype infrastructure, members of the Consortium developed working relationships with Ball State University’s IDIA Lab, an internationally recognized academic leader in the development of virtual worlds, human-computer interaction, visualization and 3D simulation. Most important for the Consortium, the IDIA Lab is engaged in the development of scholarly, creative and pedagogical projects that explore the intersection between the arts, science and technology. The IDIA Lab is not merely a technical development team, but also an interdisciplinary design studio that integrates art and emergent technologies into every phase of development. After inviting John Fillwalk, the lab’s director, to attend a series of conference calls with the Consortium, a proposal for interface and in-world design was solicited. John Fillwalk is an internationally recognized artist and developer of virtual and hybrid environments. He serves as senior director of the Hybrid Design Technologies initiative [HDT], professor of Art, and director of the Institute for Digital Intermedia Arts [IDIA Lab] at Ball State University. Over the past 25 years, his interactive and virtual artworks have been exhibited internationally in numerous festivals, galleries and museums including SIGGRAPH, CYNETart, Synthèse, 404 Festival, Dutch Design Week, Boston Cyberarts, Virtual Broad Art Museum, ISEA, ASCI, Vidéoformes, the Indian Institute of Technology and the Beijing Science and Technology Museum.

The Consortium elected to partner with the IDIA Lab, since it offers design as well as technological expertise and a common interest in the theoretical implications of Virtual World technologies on research and pedagogy.

Development will be split between the two independent teams, with the IDIA Lab, in general, centering its work on the development of the Unity-based platform including avatar selection, navigation, the network controller, the user interface system, and back-end network hosting, while Tipodean develops the HTML and KML system and works with members of the Consortium to integrate the four individual projects into the shared platform. The development will not occur in isolation from the rest of the Consortium. The external development teams will offer scheduled monthly training sessions to the internal technical teams of the Consortium. We are employing a development model similar to the one successfully used during the Planning Phase of the HVWC, in which, through a collaborative effort of local staff and third-party developers, we implemented a prototype template and virtual world environment with a subset of features enabled. In addition, we plan to hire a graphic design independent contractor and a game design independent contractor to work with the PIs and our development teams on the look and feel of the Consortium’s web presence as well as the conceptualization of the interface design.

Our Mission

I. Project Summary

The 1990s saw the development of digital technologies supporting the 3D (three-dimensional) modeling of cultural heritage objects and environments. For the first time, humanists could digitally model and reconstruct the damaged or vanished monuments of the past. The results were typically 2D renderings or videos (“animations”). The decade of the 2000s saw the enhancement of 3D environments with avatars, making it possible for scholars to enter into the 3D world and to use the Internet to interact with the simulated environment while communicating with fellow humanists located anywhere on the planet. Such software platforms are called networked virtual worlds (NVWs). The Humanities Virtual World Consortium (HVWC) will explore how the unique characteristics of networked virtual worlds can enable and advance humanistic research while working towards creating a sustainable base for mainstreaming the technology in humanities scholarship. Our initial work is based upon a series of related scholarly initiatives that draw upon virtual world technology and which are meant to: a) advance the current state of research on the phenomenology of space and place, b) design visual and aural conventions to evoke the sensorial experience lost to us due to technological and evidentiary constraints, c) test the current capabilities of virtual worlds to explore chronotopic problems, previously inaccessible due to the limitations of prior technology, d) guide future development of humanities-driven virtual worlds, and e) produce works of exemplary digital scholarship, disseminated in formal, peer-reviewed publications, that solve specific research problems in particular disciplines and area studies. Our overall intent is to demonstrate how networked virtual worlds can uniquely enable important kinds of research inquiry, and thus contribute to the transformation of scholarly communication in the relevant subject fields and to the field of digital humanities. With this in mind, our projects have been chosen so that they span multiple disciplines—including Archaeology, Art History, Architectural History, Buddhist Studies, Classics, History, Irish Studies, Literary Studies, Tibetan Studies—and periods from the ancient past to contemporary times. While the projects explore discipline-specific research questions, they share common goals concerning humanities research and scholarly communication in a networked virtual world environment.

II. Expected Outcomes and Benefits of the Project

Project Deliverables: At the conclusion of the project we will release a networked virtual world platform template to be used in the Unity game development engine, the Consortium web site with documentation, four virtual world projects released as Unity3D builds, four draft in-world “articles” embedded within the virtual world project, and four articles submitted for peer review in journals devoted to specific scholarly domains.

The main outcomes of the proposed project will be (i) the consolidation of the Humanities Virtual World Consortium as a robust, fully functional academic organization that can persist and expand in the future; (ii) the development of a shared virtual world platform that is adapted for maximum utility for scholarly projects; (iii) the publication of four significant scholarly projects in that platform that exemplify the way in which virtual worlds offer unique affordances for scholarly research inquiries; (iv) individual articles based on the virtual world content submitted to journals in our traditionally defined, domain-specific fields. At the conclusion of the grant, a) the organization will be published on the web with bylaws and a means for others to join, b) the new collaboratively maintained virtual world research and publication environment will host four scholarly publication products, and c) the shared development template and accompanying documentation will be made available online to aid others in the development of virtual world content. The primary benefit of this project is that it constitutes a valuable and necessary step towards establishing the organizational and technical basis for a sustainable scholarly exploration of the unique ways in which networked virtual worlds can enable and advance humanistic research. While valuable and compelling individual projects have emerged over the past years, interoperability between them has not been supported. The barriers remain extremely high for individual scholars who wish to use this medium for their own research without major funding. For these reasons, to date these projects have failed to attain a wide scholarly audience. We propose to establish an organization, technology, and specific scholarly publication apparatus that would address these deficiencies in the subsequent years and beyond.

The broader, long-term, and more far-reaching benefits are that the initiative will establish a basis for the scholarly community to engage in a serious attempt at creating a sustainable, shared environment for scholarly research and communication to be done in a networked virtual world environment. These publications will show the viability and importance of such research, raise its profile in the scholarly community, and demonstrate in a compelling, virtually tangible, and accessible way the benefits of using a shared platform. Such publications will, it is envisioned, lead to an expansion of the Consortium and finally the development of a robust platform in which scholars can easily participate without the need for major grants to support further development of one-off technologies. Scholarly projects from disparate disciplines and geographical regions would be interoperable, and scholarly use would be widespread due to the consistency of interface and technology.

Mesa Verde Virtual Companion

Mesa Verde Virtual Companion

Ball State University’s IDIA Lab has been contracted by the US Department of the Interior to develop virtual visitor experiences for Mesa Verde National Park. The park preserves and interprets the archeological heritage of the Ancestral Pueblo people who made it their home for over 700 years, from 600 to 1300 CE. Today, the park protects nearly 5,000 known archeological sites, including 600 cliff dwellings.

The application will bring to life well-known museum dioramas and offer locative walking tours of the park. Augmented reality and interactive 3D experiences will help tell the stories of the UNESCO World Heritage site – including the transformation of static elements of the historic dioramas with animated figures and interactive elements. The application will be available on both Google Play and the App Store in 2021.

History Channel’s The Universe Features Virtual Stonehenge

IDIA Lab has developed a simulation of Stonehenge in Unity 3D which illustrates the various stages of construction and celestial alignments in an interactive virtual simulator. The project incorporates IDIA Lab’s CelestialEngine, which uses NASA JPL data to accurately position the sun, moon and visible planets – correcting for long-term changes in the Earth’s rotation and other effects – allowing for accurate observations of the night sky as it would have appeared thousands of years ago at Stonehenge.

The History Channel’s television series The Universe recently shot segments of an upcoming Stonehenge episode at Ball State University’s IDIA Lab and will feature the use of our simulator and rendered animations throughout the hour-long episode. http://www.history.com/shows/the-universe/episodes

Press release: https://www.insideindianabusiness.com/newsitem.asp?ID=63067

Simulator video feature walkthrough:  http://www.youtube.com/watch?v=fNK-cCIog1E&list=UUtQrLF1JPprEvP4AO-iSNvw&feature=share&index=2

The History Channel’s The Universe, Ancient Mysteries Solved: Stonehenge

Perhaps the most mysterious structure on Earth, Stonehenge has stood on a plain in Southern England for 5,000 years. Why is it there? In this episode we explore the possibility that this was a prehistoric astronomical observatory. Here ancient astronomer-priests may have divined the complex movements of the Sun and Moon, recognizing patterns that would not be discovered elsewhere for thousands of years. These early shamans may also have been the first astronomers to predict eclipses.

Stonehenge

Introduction

Stonehenge is one of the most famous prehistoric sites in the world – a ring of standing stones set within large earthworks. It sits at the center of the densest complex of Neolithic and Bronze Age monuments in England, including several hundred burial mounds.

Archaeologists believe the earliest phases were built around 3000 BC; radiocarbon dating in 2008 suggests that the first sarsen stones were raised between 2400 and 2200 BC and that the bluestones may have been raised at the site as early as 3000 BC.

The site and its surroundings were added to UNESCO’s list of World Heritage Sites in 1986 in a co-listing with Avebury Henge. It is a legally protected Scheduled Ancient Monument. Stonehenge is owned by the Crown and managed by English Heritage, while the surrounding land is owned by the National Trust.

Archaeological evidence found by the Stonehenge Riverside Project in 2008 indicates that Stonehenge could have been a burial ground from its earliest beginnings. The dating of cremated remains found on the site indicates that deposits contain human bone from as early as 3000 BC, when the ditch and bank were first constructed.

PHASES

Phase One 3000-2920 BC

The first monument was essentially an earthwork, consisting of a circular bank and ditch enclosure measuring about 110 meters (360 ft.) in diameter, with a large entrance to the northeast and a smaller one to the south. It stood in open grassland on a slightly sloping spot. Within the outer edge of the enclosed bank is a circle of 56 holes, each about a meter in diameter, known as the Aubrey holes after John Aubrey, the 17th-century antiquarian who is thought to have first identified them. Current thinking is that the Aubrey holes contained 56 bluestones during this phase of construction. Three heel stones are suspected to have stood during this era.

Phase Two 2620-2480 BC

Archaeological excavation has indicated that around 2600 BC the builders reimagined the monument entirely and began a massive phase of construction. During this period the sarsen ring with horizontal lintels was erected, along with the “U”-shaped cluster of five central trilithons. These huge stones, ten uprights and five lintels, weigh up to 50 tons each. They were linked using complex jointing adapted from woodworking. Also during this phase an inner ring of bluestones was constructed – most likely from the removal and relocation of the 56 Aubrey hole bluestones.

The earthwork was altered to create two barrows containing two of the station stones, with the remaining two outside the barrows, forming a rectangle. These station stones have both solar and lunar alignments. The heel stones were reduced to one, which stands somewhat angled today.

Phase Three 2480-2280 BC

In phase three the Avenue was constructed, a long roadway leading to the river Avon and on to other settlements and monuments. A bluestone circle was constructed inside the ring of trilithons. This phase also saw the appearance of three slaughter stones.

Phase Four 2280-2020 BC

The main alteration of the monument during this period was the reconstruction of the bluestone configuration within the sarsen ring. The stones were reworked into two distinct patterns: a central inner oval of 23 stones inside the trilithon ring, and a circle of 59 stones between the trilithons and the sarsen ring. The remnants of both patterns are visible today. Also, the slaughter stones were reduced to one, which remains in a fallen state.

Phase Five 1680-1520 BC

The site is essentially unchanged during this period, with the exception of the construction of the Y and Z holes. There are 30 Y holes and 29 Z holes; these are suspected to have had significance in the tracking of the lunar month. The Y and Z holes are the last known construction at Stonehenge, built about 1600 BC.

Present Day (1600 BC on)

Roman coins and medieval artifacts have been found in or around the monument, but it is unknown whether the monument was in continuous use throughout British prehistory and beyond, or exactly how it would have been used. The Romans are believed to have removed 4 of the 23 inner bluestones from the oval, leaving the remaining 19 stones and holes forming the horseshoe we see today. The site was known to scholars during the Middle Ages and since then it has been studied and adopted by numerous groups.

Throughout the 20th century, Stonehenge began to be revived as a place of religious significance, by adherents of Neopagan and New Age beliefs, particularly the Neo-druids.

In the late 1920s a nation-wide appeal was launched to save Stonehenge from the encroachment of the modern buildings that had begun to rise around it. By 1928 the land around the monument had been purchased with the appeal donations, and given to the National Trust to preserve. The buildings were removed (although the roads were not), and the land returned to agriculture. More recently the land has been part of a grassland reversion scheme, successfully returning the surrounding fields to native chalk grassland.

Celestial Alignments

Many astronomical alignments have been attributed to Stonehenge, a complex of megaliths and earthworks in the Salisbury Plain of England. The most famous of these is the midsummer alignment, where the Sun rises over the Heel Stone.

As well as solar alignments, there are proposed lunar alignments. The four station stones mark out a rectangle. The short sides point towards the midsummer sunrise and midwinter sunset. The long sides, viewed towards the southeast, face the most southerly rising of the moon.
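The geometry behind such rising and setting alignments is compact: ignoring refraction and assuming a flat horizon, a body with declination δ rises at an azimuth A east of north where cos A = sin δ / cos φ, with φ the site latitude. A small sketch with approximate values (around 2500 BC the obliquity, and hence the Sun’s solstitial declination, was roughly 24°):

```python
import math

def rising_azimuth(declination_deg, latitude_deg):
    """Azimuth (degrees east of north) at which a body with the given
    declination rises, assuming a flat horizon and no refraction."""
    d = math.radians(declination_deg)
    lat = math.radians(latitude_deg)
    return math.degrees(math.acos(math.sin(d) / math.cos(lat)))

# Stonehenge lies near latitude 51.18 N; around 2500 BC the Sun's
# solstitial declination was roughly +/- 24 degrees.
print(rising_azimuth(+24.0, 51.18))  # ~49.6 deg: the midsummer sunrise line
# Midwinter sunset lies directly opposite (~229.6 deg), which is why the
# short sides of the station-stone rectangle can serve both alignments.
```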

Gerald Hawkins, a professor and chair of the astronomy department at Boston University in the United States, published an analysis of Stonehenge in 1965 in which he proposed its purpose as an ancient astronomical observatory predicting movements of sun and stars. Archaeologists and other scholars have since demonstrated such sophisticated, complex planning and construction at other prehistoric earthwork sites across the globe.

Function and Construction

Stonehenge was produced by a culture that left no written records. Many aspects of Stonehenge remain subject to debate. There is little or no direct evidence for the construction techniques used by the Stonehenge builders. Proposed functions for the site include usage as an astronomical observatory or as a religious site.

Professor Michael Parker Pearson of Sheffield University has suggested that Stonehenge was part of a ritual landscape joined to Durrington Walls by their corresponding avenues and the River Avon. He suggests that the area around Durrington Walls Henge was a place of the living, whilst Stonehenge was a domain of the dead. A journey along the Avon to reach Stonehenge was part of a ritual passage from life to death, to celebrate past ancestors and the recently deceased. Whatever religious, mystical or spiritual elements were central to Stonehenge, its design includes a celestial observatory function, which might have allowed prediction of eclipses, solstices, equinoxes and other celestial events important to a contemporary religion.

IDIA Lab Virtual Stonehenge Simulator

IDIA Lab has developed a simulation of Stonehenge that illustrates the various stages of construction and celestial alignments in an interactive virtual simulator. The project incorporates IDIA Lab’s Celestial SimEngine, which uses NASA JPL data to accurately position the sun, moon and visible planets – correcting for long-term changes in the Earth’s rotation and other effects – allowing for accurate observations of the night sky as it would have appeared thousands of years ago at Stonehenge.  https://idialab.org/virtual-stonehenge/
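Among the “changes in the Earth’s rotation” such an engine must absorb is the secular slowing of the Earth’s spin, conventionally expressed as ΔT, the growing offset between uniform ephemeris time and Earth-rotation time. As a rough indication of its scale (the correction model inside the Celestial SimEngine itself is not documented here, so treat this as indicative only), the long-term parabolic fit of Morrison and Stephenson can be sketched as follows:

```python
def delta_t_seconds(year):
    """Approximate Delta-T (TT - UT) in seconds via the long-term
    parabolic fit of Morrison & Stephenson: -20 + 32*u^2, where u is
    centuries from 1820. Suitable only for rough, millennia-scale work."""
    u = (year - 1820.0) / 100.0
    return -20.0 + 32.0 * u * u

# Around 2500 BC the accumulated clock drift is on the order of 17 hours,
# which is why uncorrected modern ephemerides misplace ancient sky events.
print(delta_t_seconds(-2500) / 3600.0)  # ~16.6 hours
```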

________________________________________

BBC The Sky at Night Episode

This past summer, the BBC’s program The Sky at Night also included IDIA Lab’s Stonehenge simulation and animation in an episode about Stonehenge and the summer solstice. The Sky at Night is an astronomy program that has been running for over 50 years and airs on BBC One, Two, Three and Four.

BBC Sky at Night Programme website:  http://www.bbc.co.uk/programmes/b036r5nj

Stonehenge pan preview: Stonehenge Sunset Pan

NSF Funded Virtual Simulation of Hadrian’s Villa

IDIA Lab has designed a virtual simulation of the villa of the Roman Emperor Hadrian, which is a UNESCO World Heritage site located outside of Rome in Tivoli, Italy. This project is being produced in collaboration with the Virtual World Heritage Laboratory (VWHL) at the University of Virginia (UVA), directed by Dr. Bernard Frischer and funded by the National Science Foundation. This large-scale recreation virtually interprets the entire villa complex in consultation with the world’s foremost villa scholars and educators. The project has been authored in the game engine of Unity as a live 3D multi-user online learning environment that allows students and visitors to immerse themselves in all aspects of the simulated villa.

The project will not only accurately recreate the villa buildings but also include a complete Roman avatar system, non-player characters with artificial intelligence, furniture, indigenous vegetation, a dynamic atmospheric system and a sophisticated user interface. The interface will not only provide learning, navigation, reporting and assessment opportunities but will also allow users to change the position of the sun to any date in 130 AD using data from the Horizons database at NASA JPL – testing theses of astronomical alignments of architectural features during solstices and equinoxes. UVA students will be briefed on the culture and history of the villa and will learn the virtual environment for five weeks prior to immersing themselves within it. The avatar system will allow them to enter the world choosing class and gender – already aware of the customs and behavior of the Roman aristocrat, soldier, slave or politician. This project will be delivered to VWHL at UVA in early March.

The Digital Hadrian’s Villa Project:

Virtual World Technology as an Aid to Finding Alignments between

Built and Celestial Features

Bernard Frischer¹

John Fillwalk²

¹Director, Virtual World Heritage Laboratory, University of Virginia

²Director, IDIA Lab, Ball State University

Hadrian’s Villa is the best known and best preserved of the imperial villas built in the hinterland of Rome by emperors such as Nero, Domitian, and Trajan during the first and second centuries CE. A World Heritage site, Hadrian’s Villa covers at least 120 hectares and consists of ca. 30 major building complexes. Hadrian built this government retreat about 20 miles east of Rome between 117, when he became emperor, and 138 CE, the year he died. The site has been explored since the 15th century and in recent decades has been the object of intense study, excavation, and conservation (for a survey of recent work, see Mari 2010).

From 2006 to 2011, with the generous support of the National Science Foundation[1] and a private sponsor, the Virtual World Heritage Laboratory created a 3D restoration model of the entire site authored in 3DS Max. From January to April 2012, Ball State University’s Institute for Digital Intermedia Arts (IDIA Lab) converted the 3D model to Unity 3D, a virtual world (VW) platform, so that it could be explored interactively, be populated by avatars of members of the imperial court, and be published on the Internet along with a related 2D website that presents the documentation undergirding the 3D model.

The 3D restoration model and related VW were made in close collaboration with many of the scholars who have written the most recent studies on the villa.[2] Our goal was to ensure that all the main elements—from terrain, gardens, and buildings to furnishings and avatars—were evidence-based. Once finished, the model was used in two research projects.

The first project was an NSF-sponsored study of the usefulness of VW technology in archaeological education and research. We used the virtual villa in undergraduate classes at Xavier University and the University of Virginia to investigate the theses of two recent studies by project advisors Michael Ytterberg and Federica Chiappetta about how this enormous built space was used by six different groups of ancient Romans, ranging from the Emperor and Empress to ordinary citizens and slaves (Ytterberg 2005; Chiappetta 2008). Avatars representing these groups have been created and are being operated by undergraduate students as a Problem-Based Learning (PBL) experience. They are observed by subject experts, who are using the data generated to test and, if necessary, refine the initial theses about how circulation through the villa was handled. The results are still being evaluated. Preliminary indications are that the combination of VW technology and a PBL educational context is very effective in taking advantage of the known connection between the hippocampus and long-term learning, especially when the information to be mastered is spatial (Kandel 2007).

The second project involved use of the VW for some new archaeoastronomical studies. Most of our advisors’ publications, like the older work by archaeologists that preceded them, have concentrated on archaeological documentation, restoration, and formal and functional analysis. The latest research by advisor De Franceschini and her collaborator Veneziano (2011) combined formal and functional analysis: it considered the alignment of certain important parts of the villa in relation to the sun’s apparent path through the sky on significant dates such as the solstices. In their recent book they showed how two features of the villa are aligned with the solar solstices: the Temple of Apollo in the Accademia and Roccabruna. We used the VW to extend their research to other areas of the villa, taking advantage of 3D technology to restore the sun to the right place in the sky and also to repair the damage to the architecture of the villa, as De Franceschini and Veneziano had independently suggested be done before they learned about our digital model of the villa.

The work of De Franceschini and Veneziano is innovative. Archaeoastronomy has become an accepted field of study in recent decades, and a considerable amount of work has been done in Old and New World archaeology. In Roman archaeology, however, this approach is still rarely encountered. Significantly, one of the few compelling studies concerns the most famous Hadrianic building: the Pantheon in Rome. Hannah and Magli 2009 and Hannah 2011 have shown a number of solar alignments in the building, of which the most notable are the sun’s illumination of the entrance doorway at noon on April 21; and the view of sunset silhouetting the statue of Hadrian as Sun god on a four-horse chariot atop the Mausoleum of Hadrian, as seen from the middle of the Pantheon’s plaza on the summer solstice. Like the summer solstice, April 21 is also a significant date: on it occurred the annual festival in Rome known as the Parilia (re-named the Romaia by Hadrian),[3] which celebrated the founding of Rome.

De Franceschini and Veneziano pursued an observation of Mangurian and Ray (2008) to document an impressive example of solar alignment at Hadrian’s Villa involving the tower known as Roccabruna at the western end of the villa. Originally a tower-like structure topped by a round temple, Roccabruna survives today as its well-preserved, massive lower floor. The main entrance is located on the northwestern side to the right and gives access to a large circular hall covered by a dome. The dome is punctuated by an odd feature: five conduits that are wider on the outside than on the inside (figure 1).

What is the function of these unusual conduits? They have no known parallel in Roman architecture. After asking themselves this same question, the American architects Robert Mangurian and Mary Ann Ray went to Roccabruna at sunset on June 21, 1988, the day of the summer solstice, and discovered the extraordinary light phenomena that occur there. At sunset the Sun enters through the main door, illuminating the niche on the opposite side, something that happens on most summer days. But only on the days of the summer solstice does the Sun also penetrate the conduit located above that door: its rays emerge from the slot inside the dome, projecting a rectangular blade of light on the opposite side of the dome. In June 2009, De Franceschini verified the findings of Mangurian and Ray. However, the apparent path of the Sun through the sky shifts slightly over the centuries, so that in the nearly 1,880 years separating us from Hadrian the precise effect of the alignment has been lost. As they noted, only a computer simulation can recreate the original experience of being in the lower sanctuary at Roccabruna at sunset on the summer solstice during the reign of Hadrian.

Once we had our 3D model of the site, we were able to obtain from NASA’s Horizons system[4] the correct azimuthal data for the year AD 130 and put the sun into the sky at sunset on the summer solstice. Following the lead of De Franceschini, who in the meantime had become a consultant to our project, we put into the niche one of the four statues of the Egyptian sky goddess Isis that were found at the Villa. De Franceschini chose Isis because, first, there is no question that a statue stood in this niche, so something had to be put there; second, the two flanking niches had candelabra whose bases are preserved and are decorated with Isiac iconography; and moreover, Isis’ festival in Rome fell on the summer solstice. So we scanned and digitally restored one of the several statues of Isis from the villa and put it into the central niche. Finally, for the dome, which we know from surviving paint was blue and therefore had the famous “dome of heaven” motif (Lehmann 1945), we followed De Franceschini in restoring a zodiac set up in such a way that the sign of Gemini is over the statue niche, since the last day of Gemini is the summer solstice. Our zodiac is adapted from the great Sun God mosaic in the Rheinisches Landesmuseum in Bonn, which kindly gave us permission to use it.

As can be seen in figure 2, when we restored the sun to the right position in the sky dome for sunset on the summer solstice (June 21) of 130 CE in our 3DS Max model of Roccabruna, the sunlight coming through the main doorway illuminated the statue of Isis in the statue niche, and the light entering through the conduit lit up the sign of Gemini painted on the cupola. So we were able to confirm the Mangurian-Ray thesis.

The approach we have taken in our Roccabruna project is deductive: Mangurian and Ray noted the strange feature of the conduits punctuating the cupola of Roccabruna and hypothesized a solar alignment. De Franceschini and Veneziano agreed and, for various reasons we don’t need to go into today, put a statue of Isis into the statue niche. We set up the conditions in which these hypotheses could be tested and were able to verify them.

But surely, if there is one such alignment at the villa of the same emperor who was responsible for the Pantheon, there may be others. But the villa is very big—covering over 100 hectares—and has 30 major building complexes, most larger than Roccabruna. Moreover, such alignments could just as easily involve astrological features such as the Moon and the planets. Faced with this level of complexity, the best methodological way forward in searching for new alignments is clearly inductive and empirical. This is one reason why we asked the Institute for Digital Intermedia Arts (IDIA Lab) of Ball State University to create a multi-user virtual world based in Unity 3D from our 3DS Max model.

The project of virtually interpreting a simulation on the scope and scale of Hadrian’s Villa was a daunting one – engaging layers of scholarly, technical and pedagogical challenges. The technical challenges were many – foremost, to leverage the Unity 3D game engine to become an effective multi-user, avatar-based virtual world. An important factor was to create an environment that was straightforward and accessible via standard web browsers on both Mac and Windows, and we selected Unity 3D as the starting point for developing the platform. We required specific back-end administration tools to handle the accounts and server-side aspects of the project; for this we relied on SmartFoxServer, as it works with Unity 3D quite well. Our team took an approach that bridged and integrated disparate technologies, creating a robust virtual world platform to immersively augment both instructional and PBL processes. VW features available to the learning community included text-based communication, a live map showing current visitor positions, map-based teleportation, a managed voice channel, user-selected avatar gestures, a list of online users, paradata, photographs of the extant site, plan views, and integrated web links.

Key to the project was a varied system of avatars representing the imperial court, freemen, senators, scholars, soldiers, and slaves to the emperor. The avatar system provided several important functions for testing recent scholarly interpretations of circulation throughout the villa and of the use of various spaces for typical court activities – meals, imperial audiences, bathing, worship, etc. Upon entering the simulation, the choice of avatar would predicate one’s social standing within the role-play of the world.

A gesture system was created via motion capture, providing each user with a unique set of actions and gestural responses for engaging in social interactions – including greetings, bowing and gestures specific to rank and class. Communication was also a critical element in the modes of problem-based learning engaged by the participants in the simulation. Specific technologies provided varied abilities such as public chat, private instant messaging and live multi-user voice channels.

A companion website was co-developed and integrated into the VW environment, providing learners with visual assets such as photographs and panoramas of the current site, site plans, elevations, and video interviews with Villa scholars. We also developed three-dimensional turntables of the interpreted and reconstructed models, overview information on each of the major Villa features, a bibliography and an expansive database of art attributed to the Villa site. This information can be accessed by learners directly from within the virtual world. The development team integrated the notion of paradata, introduced by the London Charter – making instantly transparent the scholarship behind all underlying elements of the 3D model (from terrain to buildings, furnishings, costumes, and human behavior).

In support of a new research theme on celestial alignments by consultants De Franceschini and Veneziano, a major goal for the project was to develop an accurate simulation of the position of the sun. The solar tracking tool, or virtual heliodon, that we created in response to this research was envisioned as a simulation that would act as a bridge between the virtual environment and coordinates from an external database calculating solar positions. After investigating existing tools we decided to employ the Horizons database, created by NASA’s Jet Propulsion Laboratory as an online solar system data computation service tracking celestial bodies in ephemerides from 9999 BCE to 9999 CE. In implementing solar tracking for the Villa project, in instances where we wanted to investigate potentially significant solar alignments, we entered the latitude, longitude and altitude of specific buildings from the Tivoli site to poll the Horizons data for the year 130 CE. The user was able to change the date and time of day, and to quickly play the sun from specific moments via the user interface. The system was correlated to both the Julian and Gregorian calendars and contained presets for the vernal and autumnal equinoxes as well as the summer and winter solstices.
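At the core of that bridge is a small piece of geometry: each polled azimuth/elevation pair must become the direction vector used to aim the virtual sun. In the project this logic lives in Unity scripting; the sketch below shows the same math in Python, assuming a Y-up frame with +Z north and +X east (an axis convention chosen here for illustration):

```python
import math

def sun_direction(azimuth_deg, elevation_deg):
    """Unit vector pointing from the scene toward the Sun, in a Y-up
    frame with +Z = north and +X = east (a common engine convention)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (math.cos(el) * math.sin(az),   # east component
            math.sin(el),                  # up component
            math.cos(el) * math.cos(az))   # north component

# A directional light is then aimed along the negation of this vector;
# e.g. a low northwestern sun near a summer-solstice sunset:
print(sun_direction(300.0, 2.0))
```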

These tools allowed for the rapid discovery of potential alignments that might bear further investigation. The solar feature allows one to proceed empirically, in effect turning the clock back to 130 CE and running experiments in which the days and hours of the year are sped up by orders of magnitude, so that one can in a very short time find candidate alignments not yet hypothesized by scholars working in the traditional way of Mangurian and Ray.
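In outline, that empirical sweep reduces to stepping through an ephemeris and flagging moments when the rising or setting sun falls within a small tolerance of a candidate building axis. Below is a minimal sketch under stated assumptions: ephemeris rows of (timestamp, azimuth, elevation) such as those parsed from Horizons output, and a placeholder axis azimuth rather than a surveyed value.

```python
# Sketch of the inductive alignment sweep described above. `ephemeris` is
# assumed to be an iterable of (datetime, azimuth_deg, elevation_deg) rows,
# e.g. parsed from a Horizons ephemeris for 130 CE.

def candidate_alignments(ephemeris, axis_azimuth_deg, tol_deg=0.5,
                         max_elevation_deg=5.0):
    """Yield moments when the low sun sits within tol_deg of a building
    axis -- candidates for scholarly follow-up, not proof of intent."""
    for when, az, el in ephemeris:
        if 0.0 <= el <= max_elevation_deg:
            # smallest signed angular difference, safe across the 0/360 wrap
            diff = (az - axis_azimuth_deg + 180.0) % 360.0 - 180.0
            if abs(diff) <= tol_deg:
                yield when, az, el

# Example (axis azimuth is hypothetical):
# for hit in candidate_alignments(rows, axis_azimuth_deg=64.0):
#     print(hit)
```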

As developers, our goal was to create the solar tool and let students and scholars use it to undertake their own empirical research. Our team was not intending to engage in this research ourselves, yet in the process of working within the environment daily we quickly began to notice curious solar phenomena. In a bit of empirical study of the very first component of the site we installed in the simulation, the Antinoeion, or newly discovered Temple of the Divine Antinous, we noticed an alignment of potential interest. The most likely alignment seemed at first glance to be along the main axis running from the entrance, through the obelisk in the central plaza, to the statue niche at the end of the axis. We ran the days and hours of the year and found that the sun and the shadow of the obelisk align at sunrise on July 20. We consulted with our expert on the Egyptian calendar in the Roman period, Professor Christian Leitz of the University of Tuebingen, and he confirmed that this date has religious significance. It is, in fact, the date of the Egyptian New Year, as the Romans of Hadrian’s age clearly knew (cf. the Roman writer Censorinus, who states that the Egyptian New Year’s Day fell on July 20 in the Julian calendar in 139 CE, the date of a heliacal rising of Sirius in Egypt).

In the process of developing and subsequently utilizing the simulation tools we created for astro-archaeological research, we have concluded that virtual world technologies can indeed take the inquiry into significant built-celestial alignments to a new level of insight.

Bibliography

Chiappetta, F. 2008. I percorsi antichi di Villa Adriana (Rome).

De Franceschini, M. and G. Veneziano. 2011. Villa Adriana. Architettura celeste. I segreti dei solstizi (Rome).

Hannah, R. 2008. Time in Antiquity (London).

Hannah, R. 2011. “The Role of the Sun in the Pantheon’s Design and Meaning,” Numen 58: 486-513.

Kandel, E. 2007. In Search of Memory: The Emergence of a New Science of Mind (W. W. Norton, New York). Kindle edition.

Lehmann, K. 1945. “The Dome of Heaven,” Art Bulletin 27: 1-27.

Lugli, G. 1940. “La Roccabruna di Villa Adriana,” Palladio 4: 257-274.

Mangurian, R. and M.A. Ray. 2008. “Re-drawing Hadrian’s Villa,” Yale Architectural Journal, 113-116.

Mari, Z. 2010. “Villa Adriana. Recenti scoperte e stato della ricerca,” Ephemeris Napocensis 20: 7-37.

Ytterberg, M. 2005. “The Perambulations of Hadrian. A Walk through Hadrian’s Villa,” Ph.D. dissertation, University of Pennsylvania.


[1] NSF grant # IIS-1018512.

[2] For a list of scholarly consultants, see http://idialabprojects.org/hvtest/mission.php.

[3] Athenaeus 8.361e-f.

[4] http://ssd.jpl.nasa.gov/?horizons.

Final Wisdom I: Interactive Haptic Poetry Installation at SIGGRAPH 2010

Final Wisdom I is an interactive installation engaging viewers in a sensory exploration of temporal and spatialized poetry. Participants manipulate imagery, sound and language through their gesture, touch and proximity. The work is engaged through a framework of cameras and sensors that react to heat, position, and capacitance, presenting a shifting environment of reactive media and haptics. Final Wisdom I is the work of artists Hans Breder and John Fillwalk, with poetry by critic Donald Kuspit and music by composers Carlos Cuellar Brown and Jesse Allison. The project was produced through the Institute for Digital Intermedia Arts at Ball State University in collaboration with the Institute for Digital Fabrication, and was exhibited in the art gallery of SIGGRAPH 2010 in Los Angeles, CA. Special thanks to IDF/CAP students Matthew Wolak, Christopher Baile and Claire Matucheski, and Assistant Professor of Architecture Joshua Vermillion. http://www.i-m-a-d-e.org/

As an intermedia artist, John Fillwalk actively investigates emerging technologies that inform his work in a variety of media, including video installation, virtual art, and interactive forms. His perspective is rooted in the traditions of painting, cinematography, and sculpture, with a particular interest in spatialized works that can immerse and engage a viewer within an experience. Fillwalk positions his work to act as both a threshold and mediator between tangible and implied space, creating a conduit for the transformative extension of experience, and to pursue the realization of forms, sounds and images that afford interaction at its most fundamental level. In working with technology, he values the synergy of collaboration and regularly works with other artists and scientists on projects that could not be realized otherwise. Electronic media extend the range of traditional processes by establishing a palette of time, motion, interactivity, and extensions of presence. The ephemeral qualities of electronic and intermedia works, by their very nature, are inherently transformative, and the significance of the tangible becomes fleeting, shifting emphasis away from the object and toward the experience.

John Fillwalk is Director of the Institute for Digital Intermedia Arts (IDIA Lab) at Ball State University, an interdisciplinary and collaborative hybrid studio. An intermedia artist and Associate Professor of Electronic Art, Fillwalk investigates media in video installation, hybrid reality and interactive forms. He received his MFA from the University of Iowa in Intermedia and Video Art, and has since received numerous grants, awards, commissions and fellowships.

Donald Kuspit is an art critic, author and professor of art history and philosophy at the State University of New York at Stony Brook and lends his editorial expertise to several journals, including Art Criticism, Artforum, New Art Examiner, Sculpture and Centennial Review. Hans Breder was born in Herford, Germany, and trained as a painter in Hamburg, Germany. Attracted to the University of Iowa’s School of Art and Art History in 1966, Breder established the Intermedia Program. Carlos Cuellar Brown, a.k.a. ccbrown, is a composer, instrumentalist and music producer. Formally trained as a classical pianist, Cuellar specialized in experimental music and intermedia with the late American maverick composer Kenneth Gaburo. Jesse Allison is the Virtual Worlds Research Specialist, IDIA, and Assistant Professor of Music Technology, Ball State University. He is also President of Hardware Engineering with Electrotap, LLC, an innovative human-computer interface firm.

Leonardo: The International Society for the Arts, Science and Technology article published by The MIT Press. http://muse.jhu.edu/journals/leonardo/summary/v043/43.4.fillwalk.html

The History Channel’s The Universe, Ancient Mysteries Solved: Roman Engineering

History Channel at IDIA Lab

John Fillwalk, director of the Institute for Digital Intermedia Arts and senior director of the university’s Hybrid Design Technologies initiative, was interviewed for the History Channel’s “The Universe.” The episode, which aired at 9 p.m. May 23, featured the IDIA Lab’s digital recreation of ancient Roman structures.

For the season finale of its popular program “The Universe,” the History Channel is shining a spotlight on the work of virtual artists from Ball State University.

The season finale episode of the documentary series, which aired May 23, explores how Roman emperors built ancient structures to align with movements of the sun. To confirm experts’ theories about the religious, political, and cultural significance behind these phenomena, the cable network enlisted the help of Ball State’s Institute for Digital Intermedia Arts (IDIA).

Through the use of 3-D computer animation, artists and designers from the IDIA Lab virtually recreated several monuments featured in the episode and accurately simulated the alignment of the sun. These structures include the Pantheon, one of the best-preserved buildings of ancient Rome; the Temple of Antinous, a complex that today lies in ruins within Hadrian’s Villa outside of Rome; and the Meridian of Augustus, a site containing an Egyptian obelisk brought to Rome for use as a giant sundial and calendar.

‘Getting things right’

IDIA Lab's Roccabruna reconstruction


The IDIA Lab digitally rebuilt ancient Roman structures to give viewers of the History Channel’s “The Universe” a vivid idea of what ruins looked like when they were inhabited.

James Grant Goldin, a writer and director for The History Channel, says IDIA’s animations were an essential part of the program’s second season finale. For an earlier episode of “The Universe,” which aired in 2014, the network contracted with IDIA for an animation of Stonehenge in a segment demonstrating how the prehistoric monument may have been used to track celestial movements.

This time around, the work of Ball State digital artists is featured throughout the Roman engineering episode.

“It’s an honor for us,” says John Fillwalk, director of IDIA and senior director of the university’s Hybrid Design Technologies initiative. “Ideally our relationship with the History Channel will continue long into the future.”

Goldin said the cable network is keen on employing the IDIA for future projects because Fillwalk and his team create beautifully rendered graphics backed by data and research.

“John was devoted to getting things right,” he said. “Many of the theories we discuss in the show remain controversial, and in nonfiction TV it’s always a good idea to present such things with a qualifying phrase in the narration, an ‘if’ or a ‘maybe.’ But the IDIA Lab combined their own research with that of experts, and I’m very happy with the results.”

Gaming software transforms history

IDIA Lab's Pantheon reconstruction


The Pantheon is one of the ancient structures the IDIA Lab developed 3-D renderings of for the History Channel’s “The Universe.”

Fillwalk has worked closely over the years with many prominent scholars of the ancient world, including Bernard Frischer, a Roman archaeologist at Indiana University, and Robert Hannah, dean of arts and social sciences at New Zealand’s University of Waikato, who advised on the archaeo-astronomy of the project.

Hannah says he’s been astounded to see the way up-to-the-minute gaming software can bring to life the work of today’s historians and virtual archaeologists. “I’ve seen my son play games like ‘Halo,’ so I knew what was possible,” he said, “but I’d never seen it adapted to ancient world buildings.”

Phil Repp, Ball State’s vice president for information technology, says the relationship between the university and the cable network is a key example of how Ball State is distinguishing itself as a leading world provider of emerging media content.

“For the History Channel to want the continued help of John and his staff speaks to the quality of what our lab can produce,” he said.

Goldin’s praise for the IDIA supports Repp’s theory. “Bringing the past to life is a very challenging task,” he said. “The renderings Ball State artists have created represent the most accurate possible picture of something that happened almost 2,000 years ago.”

Ball State University’s IDIA Lab is granted rights to Blue Mars virtual world technology from Avatar Reality

Ball State granted rights to develop $10 million Blue Mars virtual world technology (3/5/2012): http://bit.ly/BSU_BlueMars

A digital laser scan of a centuries-old Buddha (above) and a simulation of the 1915 World’s Fair in San Francisco (below) are two examples of how Ball State has employed Blue Mars for cutting edge technology projects.

The future of virtual worlds has a new home at Ball State University, thanks to the generosity of Avatar Reality Inc., which granted the university rights to the multimillion-dollar 3-D virtual world platform, Blue Mars.

Blue Mars’ creator, Hawaii-based Avatar Reality, has provided expanded rights to the technology, valued at $10 million in research and development, to Ball State for 3-D simulation and research projects beyond the world of gaming.

“We are honored to have our institution selected to build upon the tremendous innovations within Avatar Reality’s Blue Mars platform,” said John Fillwalk, director of the Institute for Digital Intermedia Arts (IDIA) and senior director of Ball State’s Hybrid Design Technologies initiative. “This partnership between Avatar Reality and Ball State is an unprecedented gesture and a great distinction.”

Since 2009, Ball State’s IDIA has been a leading developer for Blue Mars, employing the virtual platform on projects such as digital laser scans of a centuries-old Buddha and the creation of the Virtual Middletown Project. The project, which ushers visitors via their computers into the world of the early 20th-century Ball Brothers Glass Manufacturing Co., is one example of the cultural heritage work possible with Blue Mars. Another is IDIA’s simulation of the 1915 World’s Fair in San Francisco.

“Avatar Reality designed in Blue Mars a next-generation 3-D platform that greatly advanced the graphic fidelity, functionality and scalability of virtual worlds — well beyond current standards,” Fillwalk said. “Its strategy connected social spaces across devices, from high-performance desktops to mobile phones. Our goal is to build upon both its technology and vision for the future of virtual worlds.”

Recognition speaks to reputation

Phil Repp, Ball State’s vice president for information technology, said the new agreement with Avatar Reality can be traced to Fillwalk’s commitment to advancing IDIA to the national level.

“The fact that other universities — and many of them of very high caliber — are working hard to gain expertise in hybrid worlds and that Ball State was identified as the university most likely to further develop this technology speaks volumes about our reputation, experience and abilities,” Repp said. “It’s an honor for Ball State and John to be singled out with this kind of recognition.”

Under Ball State’s terms of agreement with Avatar Reality, the university will begin to fully operate Blue Mars for noncommercial purposes, expand upon the source code, increase its research and academic initiatives, and enhance the community of Blue Mars. In addition, Ball State will deliver original content on Blue Mars as it has done in the past. Existing commercial activity will continue to be operated and supported by Avatar Reality, Inc.

“I am really excited about the future,” Repp said. “Through our division of Hybrid Design Technologies, Ball State will further our position as a national leader in hybrid worlds and their applications to mediated learning technologies. Our reputation in this discipline is such a perfect fit to our institutional history of innovation in teaching and learning.”

About Ball State: Located in Muncie, Ind., Ball State University is redefining education by providing relevant, immersive learning experiences that engage high-caliber students in intense interdisciplinary projects both in and out of the classroom. Its vibrant campus is home to about 22,000 undergraduate and graduate students from across the country and abroad. Learn more at www.bsu.edu.

About Blue Mars: Developed by Avatar Reality Inc., Blue Mars is a premium 3-D virtual world platform featuring unparalleled interactivity, fidelity, scalability and security, and it enables artists and developers to create and distribute interactive 3-D experiences to a global audience. Blue Mars launched in open beta in October 2009 and began selling virtual land to third party developers in January 2010. Blue Mars Mobile, an iOS app based on the content and technology from the virtual world, was introduced in February 2011.

ChromaChord: A Virtual Musical Instrument


IDIA Lab: Ball State University


ChromaChord utilizes a combination of the Oculus Rift headset and an attached Leap Motion controller to create an immersive 3D VMI environment. The Oculus Rift provides a three-paneled visual interface, through Unity, that allows the users to access different component windows of the instrument (simply by turning their head), while the Leap Motion Controller allows the user to interact with the visual interface to musical effect (processed and generated in Max/MSP).

ChromaChord uses the Oculus Rift DK2, a virtual reality headset, to immerse the performer within the VMI. Using two cameras in the virtual environment, the device relays a stereoscopic image to the wearer, creating a strong sense of depth. A gyroscope, accelerometer, and magnetometer detect real-world head rotation and translate it into its virtual equivalent, while a peripheral infrared camera captures limited-range positional tracking.

A Leap Motion controller is affixed to the front of the Oculus Rift, providing infrared tracking of the hands. In our application, the Leap Motion tracks the user’s real-world hands and simulates them, mimicking hand positions and finger movements; the user’s own hands are therefore the primary input mechanism.

Custom Software

  • Key note triggering and glow on Leap Motion hand touches
  • Particle generation triggered by Leap Motion hand touches
  • Button click and glow on Leap Motion hand touches
  • Particle destruction and orbit physics
  • OSC server and client configurator / message handler / data handler

Our graphical user interface is composed of three sections – left, center and right panels. On the left, the user can see the modulation sphere. By touching this sphere, the user generates orbital particles. The number of these particles affects a live process on the notes. The user can remove these particles by simply swiping them away. Turning to the right reveals the root note and scale type controls. To select a root note and scale type, the user must press the desired button. The interface responds by highlighting the selected options.

The center module contains the keys, a chromatic scale in two colored columns based on the Solfège. This is the core performance area of the UI. In addition there are arrows facing up and down that the user can touch to raise or lower the octave. All buttons and keys respond to touches by glowing white.
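
To make the head-turn panel switching concrete, here is a simplified sketch of how yaw might select the active panel. The panel names, the 30-degree threshold, and the class itself are assumptions for illustration, not ChromaChord’s actual code.

```csharp
using UnityEngine;

// Selects the active UI panel from head yaw: look left for modulation,
// right for settings, straight ahead for the keys.
public enum Panel { Modulation, Keys, Settings }

public class PanelSelector : MonoBehaviour
{
    public Transform head;           // the Rift-driven camera transform
    public float thresholdDeg = 30f; // assumed yaw threshold per side

    public Panel CurrentPanel()
    {
        // Yaw relative to the rig's forward direction, wrapped to [-180, 180).
        float yaw = Mathf.DeltaAngle(0f, head.localEulerAngles.y);
        if (yaw < -thresholdDeg) return Panel.Modulation; // left panel
        if (yaw > thresholdDeg) return Panel.Settings;    // right panel
        return Panel.Keys;                                // center palette
    }
}
```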

3D Musical Instrument

The default window is the center palette, where the instrument itself is located. Within this window, there are two columns of colored keys (four per column) to the left and the right of the screen. To create sound, the user has but to put their hands vertically in front of them and ‘touch’ the keys, which are placed for both ease of use and the ability to contact multiple notes simultaneously to create chords. The notes themselves are stacked in intervals of a third from each other; consequently, neighboring notes on the same hand will create conventional harmonies. Between the columns, at their top and bottom-most extremities, are two arrows that, if touched, move the current note set up or down an octave. When the user touches a note, Unity sends a UDP message to Max/MSP indicating which note has been touched; the note remains active until contact ceases. Similarly, touching the octave buttons sends UDP messages to Max to reassign the MIDI note values. Within Max, MIDI values are then converted to their equivalent frequencies and assigned to their respective oscillators before being processed and output through Max’s audio out.
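
The Unity-to-Max link can be pictured with a short sketch. The custom software list above mentions an OSC layer; for brevity this example sends a plain-text UDP datagram instead, and the port number and message format are assumptions rather than ChromaChord’s actual protocol.

```csharp
using System.Net.Sockets;
using System.Text;

// Minimal sketch: notify Max/MSP over UDP when a key is touched or released.
public class NoteSender
{
    private readonly UdpClient udp = new UdpClient();
    private const string MaxHost = "127.0.0.1"; // Max/MSP on the same machine
    private const int MaxPort = 7400;           // assumed [udpreceive 7400]

    // noteIndex: which of the eight keys was touched; isOn: touch began/ended.
    public void SendNote(int noteIndex, bool isOn)
    {
        byte[] msg = Encoding.ASCII.GetBytes(
            $"note {noteIndex} {(isOn ? 1 : 0)}");
        udp.Send(msg, msg.Length, MaxHost, MaxPort);
    }
}
```

On the Max side, the received note number would gate the matching oscillator, and the octave messages would offset the MIDI values, as described above.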

When looking to the right, the user accesses the settings window, where note assignment takes place. First, the user selects the root note of the scale they wish to be within. The second parameter allows the user to select what mode of scale they would like – the options being: major, natural minor, and harmonic minor. By default, the instrument opens with a root note of C and is in the major mode. The UDP messages sent from the settings window control a series of gates within Max that then send adjusted MIDI values to the instrument’s oscillators.

The performer accesses the modulation window to the left, which controls the frequency and Q of a filter, as well as the level of effect. By touching the button in the center, the user spawns floating particles that surround their hands. The longer the user touches the button, the more particles accumulate; they can be removed with shaking or throwing motions. Without particles, the filter’s frequency setting is at its lowest audible setting, creating a much darker, mellow timbre; as more particles accumulate, the filter opens up, allowing higher frequencies to pass through and creating a brighter, richer timbre. Additionally, the particles move the dry/wet ratio for effects toward the ‘wet’ end of the spectrum, deepening the audibility of the effects. The UDP message sent from the modulation window carries a single numeric value for the total number of particles active; this value is then scaled in Max/MSP to suit each parameter being altered.


ChromaChord’s control messages, as sent to Max/MSP (each value defaults to 0):

  • Do, Re, Mi, Fa, Sol, La, Ti, Do – type: key; range: 0-1; the first Do is the bottom of the octave and the last the top
  • Up Octave / Down Octave – type: octave switcher; range: 0-1; 1 when clicked
  • Modulation – range: 0-100; the number of active particles
  • Root Note – range: 0-11
  • Scale Type – range: 0-2

Virtual Companion: Indianapolis

Virtual Monument Circle
1910s-1930s

Virtual Monument Circle is a mobile application that uses historic photographs and maps of downtown Indianapolis, IN to create an immersive interpretation of various historic phases of the city center. This project is a prototype for a larger potential city-wide endeavor bringing to life significant neighborhoods and sites within the city. It is developed as a possible collaboration between the Institute for Digital Intermedia Arts at Ball State University, the Polis Center at Indiana University Purdue University Indianapolis and the Indiana Historical Society.

The app offers two experiences, onsite and offsite, with content dependent on the user’s proximity to the site. It will be available soon for both iOS and Android.

Virtual Monument Circle was designed and produced by the Institute for Digital Intermedia Arts at Ball State University. Special thanks to the Polis Center at IUPUI and the Indiana Historical Society.

For more information contact:  idialab@bsu.edu

Virtual Companion: Columbus

IDIA Lab, in collaboration with BSU Assistant Professor of Architecture Kristin Barry, has designed the cultural heritage mobile application Virtual Columbus Experience, allowing users to virtually tour the rich architectural history of Columbus, Indiana. This locative mobile app was produced with fellowship support from Ball State University’s Digital Scholarship Lab.

Though a relatively small city, Columbus has made itself a cradle of modern architecture, commissioning many buildings and public works since the middle of the 20th century. The number of landmark buildings and notable architects to have worked in the city has earned it the nickname “Athens on the Prairie.”

With data and artifacts gathered by Kristin Barry’s immersive class, users of the app can explore 3D models of key buildings commissioned over the years, survey timelines showing when architects were active or buildings were constructed, and meet Columbus’ famous architects – such as Eero Saarinen, I.M. Pei, and Harry Weese.

After its launch, Virtual Columbus Experience will seek further funding to expand the scope and depth of the project across the community.

Buffalo Bill Center of the West AR Diorama


The Buffalo Bill Center of the West has contracted with IDIA Lab to create an augmented reality (AR) diorama of Buffalo Bill’s Wild West show. Guests to the museum will be able to view the physical white model augmented with digital content. While IDIA Lab’s previous Buffalo Bill project focused on his Muncie show, this iteration explores the Ambrose Park show in Brooklyn, NY.

Adam Kobitz

“Constructing the physical portion of the Buffalo Bill diorama through 3D printing and laser cutting processes presented us with several challenges, the first of which was recreating our existing models within CAD software (Autodesk Fusion 360) to ensure that they were designed with these techniques in mind. This modelling process is quite different from those employed on the AR side of the piece, and entails ensuring that the sizes, thicknesses, etc. of each element correlates with the parameters of our machines. Additionally, given the overall scale of the final product, many elements throughout the diorama had to be broken down into smaller components and reassembled by hand. After the models had been 3D printed or laser cut, each of the elements needed to be assembled with adhesives, their seams hidden with a filler substance, filed and sanded, and painted to achieve the final aesthetic.”

Trevor Danehy

“I identified several 19th-century photos of William Cody, Iron Tail, and Nate Salsbury within a timeframe of a few years. Using these photos I constructed digital 3D models of them within programs like ZBrush and Maya. I also did extensive research on the 1894 Buffalo Bill’s Wild West show in Ambrose Park, Brooklyn, a long-forgotten location that has been all but wiped from history. There are very few details of Ambrose Park aside from a few photographs from the late nineteenth century. The location was demolished in the early 20th century and the Brooklyn Navy Terminal was paved over it. I was able to confirm structures and locations of the Wild West show by cross-referencing photographs that overlap the same locations. An interesting note is that we have an electrical layout map of the show in Ambrose Park. When comparing photographs to the electrical map, while it does confirm the exact locations of electrical poles throughout the campus, the positions of major tents have changed, possibly due to weather or other circumstance. Many informative photographs came from Robert Bracklow, a professional photographer who was heavily active in the New York area at the time. Many other photographers can be seen in the Wild West show, but his archive has survived for more than a century. It was his photographs that helped establish the location of the tents belonging to William Cody, Annie Oakley, and even John Burke’s cabin. They also helped piece together what parts of the exterior grandstand might have looked like.”

EVAC+ Virtual Disaster Simulator


IDIA Lab partnered with physicians at Seattle Children’s Hospital to develop a virtual simulator that trains staff for various disasters.

The EVAC+ Virtual Disaster Simulator leverages virtual reality to train health care providers to react quickly and effectively to unexpected events that threaten patient safety. It was developed after pilot live simulations, user focus groups, surveys, and a successful pilot of the initial EVAC system. After the pilot identified the need for more communication opportunities and variations in disaster responses, EVAC+ was created. EVAC+ retains the same ability to prepare patient equipment for an evacuation but adds significantly more features, allowing interaction with the hospital room, other staff, and family members. Upon entering the EVAC+ environment, providers are oriented to the virtual space, including navigation, interacting with objects, and how to interact verbally with non-player characters.

Once they are comfortable with the space, they can choose from a menu of disasters to experience, including earthquakes, fire, and an active-shooter event. Each simulation has unique objectives around patient protection, equipment packing, and communication with other staff and families. Learners practice how they would provide a status update to their charge nurse using the SBAR technique, use closed-loop communication, and respond to anxious family members.

Pre-determined safety behaviors, such as stopping and resolving a mission-critical issue, are prompted, and performance is recorded for feedback. After each scenario the user is guided through a self-debriefing on how well they performed on key disaster and communication behaviors. Information on how they managed and packed medical equipment is presented along with evidence-based information on how experts would recommend managing equipment. Users can choose to repeat any scenario to practice or refine their responses and can explore the scenarios in any order.

The Unity3D environment is accessible in web browsers and on head-mounted VR systems. Amazon Web Services (AWS) supports voice communication and real-time analytics. The EVAC+ system fills a unique need for accessible, interactive, sustainable disaster training for health care providers.

Photogrammetry and 3D Printing in “The Power of Place”

Long on the cutting edge of emerging technologies, IDIA Lab has gained a reputation for unique skill sets. Providing its photogrammetry and 3D printing services, IDIA Lab joined several departments in the production of a new exhibition at the David Owsley Museum of Art, “The Power of Place: 100 Years of Architecture at Ball State University.”

“The Power of Place: 100 Years of Architecture at Ball State University”

“The Power of Place: 100 Years of Architecture at Ball State University” showcases the ways Ball State’s campus architecture has changed and how the past still shines through.

“The exhibition embodies the idea that even though Ball State’s built environment has greatly changed over time by way of adding new buildings constructed with modern materials, our past is forever present,” says Rebecca Torsell, archivist for architectural records at Ball State University, who organized the exhibition.

Running from September 27 through December 21, 2018, the exhibition is part of a campus-wide celebration of Ball State’s Centennial.

“The museum is an ideal place to celebrate the University’s Centennial,” says Robert G. La France, director of the David Owsley Museum of Art. “For more than 80 years the museum’s building — the Fine Arts Building — has stood as Ball State’s palace to the arts. This show displays the University’s long-standing commitment to distinctive architecture and introduces visitors to its vast architectural archives and special collections.”

Photogrammetry

Part of the history of Ball State architecture includes unique tiles used in the Fine Arts Building itself. The museum wanted a way to display these ornate relief tiles without removing them from the wall. Using its photogrammetry techniques, IDIA Lab first recreated the tiles as virtual 3D models.

IDIA Lab Modeler, Trevor Danehy, explains the process, “We take at least one hundred photos of each tile, everything from straight on shots of the tile to sharp angles in an effort to expose the levels of depth in the crevices of the tile. The photogrammetry software algorithm requires that each photo has at least 50% overlap compared to the last photo taken in order to stitch together a mesh.”

“It can be a very trial-and-error process, so the more images taken, the better. That is why we prefer a DSLR camera that can take several images rapidly. Quantity over quality works best in photogrammetry. Once the software stitches together a crude mesh, we do a small amount of digital cleanup and export it for 3D printing.”

3D Printing

After the virtual mesh has been created, the next step is to recreate the tile in a physical medium. At IDIA Lab, this is accomplished using 3D printing. IDIA Lab’s Adam Kobitz, who manages 3D printing in the lab, describes the printing technique: “We printed the tiles using a Fused Deposition Modeling (FDM) printing process, in which the physical objects are built up by extruding layer after layer of melted plastic. This process allows us to produce accurate recreations of the tiles relatively quickly, permitting viewers of the exhibit to observe and interact with them in a manner they’re unable to with the original works.”

After the tiles are printed, they are ready to be displayed as a portable replica of part of Ball State’s architectural history.

Wonders of Oz is an augmented reality app


Wonders of Oz is an augmented reality application that provides users an enhanced 3D experience of the film The Wizard of Oz. This demo app can be triggered during a screening (or anywhere, anytime) to deliver animated content and sound that augment the film and share the event with a live audience. Flying monkeys, the witch on her burning broom, lollipops, the hot air balloon, Dorothy’s falling house, cows, chickens, rowboats, bicycles, rainbows and the wizard all appear with the press of a button! Users can point their devices to discover and track the virtual performers as they appear around them during the film. The original version contained more 3D content and was designed to work on a timer so the audience could share in the digital performers appearing on cue. This app can be adapted for any event: live theater, film concerts, art and cultural heritage events, etc. Note: this application requires a gyroscope, which not all Android devices possess.
iOS: http://apple.co/1SmwI7Z
Google Play: http://bit.ly/1RRbNds

Launch of Virtual Companion iOS app

Virtual Companion – custom augmented reality app employing LocusEngine, a geolocative process developed by IDIA Lab.
https://idialab.org/launch-of-virtual-companion-ios-app-by-…/

Halloween Celestial Origins


The Charles W. Brown Planetarium’s newest production, “Halloween: Celestial Origins,” premiered the weekend of October 7 in collaboration with IDIA Lab.

Learn the history of Halloween and how it fits into the seasons as a “cross-quarter day” during this new production. Explore the night sky and learn what planets, constellations, and stars will be out on Halloween. The show was created in collaboration with Ball State University’s Department of Physics and Astronomy, the Department of Anthropology, and IDIA Lab, the Institute for Digital Intermedia Arts at Ball State University. More information at http://bsu.edu/planetarium.


IDIA Lab Featured on WTHR News for Virtual Reality research


The future of home entertainment is taking a giant step forward into the world of virtual reality. More devices are hitting the market that transform the way we look at the world around us, and a group of developers at Ball State is on the cutting edge.

It’s something we experienced first-hand at Ball State University, where content for virtual reality devices is produced at the Institute for Digital Intermedia Arts. The lab is located in the architecture building on the Ball State campus, where the focus is on the exploration of the arts, science and technology.

For the last ten years, Ball State has been developing a wide range of projects for museums, the History Channel and other clients.

John Fillwalk oversees the work done there.

“We basically explore 3D simulation and virtual reality,” he explained. “The skill sets from the staff are across computer science to design to animation.”

“We continue to further the emerging media brand aspect of this university, giving students professional opportunities to work and stay in Indiana for design technology jobs,” said John Fillwalk.

The virtual reality industry estimates more than 7 million headsets will ship this year, even though researchers have found more than half of Americans either have no interest in VR or don’t know what it is. Developers foresee huge growth in the next six years as they develop virtual reality social media apps and as prices for the technology fall.

http://bit.ly/29JHJAl

Solar System and Jupiter App


Explore the solar system by touch and augmented reality. A new app for iOS, Android and Oculus Rift designed by BSU’s IDIA Lab allows users to learn about our sun and planets, and their moons. Physical museum objects trigger augmented experiences of each planet, providing deeper context and understanding. View the moons’ orbits, examine each moon, and explore each planet’s atmosphere and unique features, such as Jupiter’s Great Red Spot. IDIA designed this augmented reality app as a prototype for the Boston Museum of Science Planetarium.

Photogrammetry capture of the Pertev Paşa Mosque


IDIA Lab’s Chris Harrison presented on the topic of photogrammetry at a workshop in İzmit, Turkey, the results of which have recently been published in Mimarlık | Tasarım Kültürü Dergisi (Architecture | Design Culture Magazine).

View the 3D Model

https://skfb.ly/PKpx

The workshop

The workshop was a collaboration between Ball State University’s College of Architecture and Planning (group led by Jonathan Spodek, FAIA), the Politecnico di Milano (group led by Prof. Elisabetta Rosina), and the host Kocaeli University (group led by Dr. Emre Kishali).

While the workshop focused broadly on theories of historic preservation in relation to the city of İzmit as a whole, Chris was invited to the workshop to present photogrammetry as a cheap, fast, and accessible method of site documentation and preservation. Chris led a team comprised of one student from each participating university and used the mosque and its grounds as a hands-on experience to teach various capture techniques.

While other documentation methods such as laser scanning exist, they require very expensive equipment and considerable training. Photogrammetry requires only a digital camera, which many people now carry on their cell phones. Using a free version of Autodesk’s browser-based ReMake, Chris and the students were able to capture and process the mosque interior and several smaller exterior details over the course of a few days.

Post processing

After returning from the workshop, we decided to further refine the model and optimize it for a virtual reality experience. Using a variety of software packages, the model was simplified so that it could run as a live experience. This now enables visitors to experience a site located halfway across the world as if they were there.

Hero’s Horizons a Real-Time 3D Adventure Game


Hero’s Horizons is a real-time 3D adventure game in which players choose a character that shapes the outcomes of their own experience, learning about civic responsibility and the consequences and impact of their own actions. The game employs a graphic novel aesthetic as it immerses players in the conflict and progress of improving the health of an urban community.

In order to design an environment that allowed mission-based gameplay as well as free-roaming exploration, the team at Ball State University’s IDIA Lab employed Unity 3D alongside various modeling, animation, motion capture, scripting, painting and texturing packages to create the look and feel of the world. Characters came to life through cinematic and graphic novel techniques, allowing the story to flow as required by interactive missions while still allowing free gameplay via exploration and mini-games. Players’ good deeds earned them mementos from the characters whose lives they helped shape and change.

Ball State University and Creative Associates have developed and disseminated a highly successful and transformative life skills curriculum to help at-risk youth in Central America and the Caribbean. But Creative Associates’ life skills training has always happened face-to-face, making it difficult for organizations to scale it large enough to help the greatest number of at-risk youth. In 2013, Creative Associates and Ball State University joined forces to explore the possibilities of gaming and gamification to expand the reach and impact of the existing life skills curriculum. The result is Hero’s Horizons – an innovative game-based learning system designed to deliver critical life skills training to at-risk youth via mobile devices and the Web.

Creative Associates International and Ball State University have combined expertise to create a truly blended-learning game for change targeted at youth living in high-crime areas of Central America and the Caribbean. By bringing together a mobile-ready graphic novel game – Hero’s Horizons, built in Unity 3-D – and 20 interactive e-modules, Creative seeks to build the life skills of 10,000 youth living in gang-affected communities. Youth can access the game and training through their mobile devices as well as in some of the 200 youth outreach centers Creative has set up across the region.

Hero’s Horizons allows the player to customize a character and then play the role of a young community organizer working in four sections of Capital City to help mobilize community youth and leaders to create a youth outreach center. Along the way the player encounters apathy, political manipulation, and many other issues facing at-risk youth, and has the opportunity to choose how to respond while learning to overcome barriers at both the individual and community level.

The graphic novel game and accompanying e-modules help youth to advance their life skills in 20 competencies, including conflict management, finding employment, and teamwork, among others. The game and the e-modules continually reinforce each other as part of a blended learning approach to long-term retention.

Youth achievements are rewarded both through achievement badges earned by successfully completing tasks and levels, and through the e-module section, where youth earn printable Certificates of Competency for completing modules and Mozilla badges for achieving learning paths.

The game is being rolled out and tested through Creative’s network of 200 youth outreach centers in the region and other youth serving organizations to measure its effectiveness and further adapt it to the realities that youth in these communities face.


www.creativeassociatesinternational.com

Download Android game at Google Play

https://play.google.com/store/apps/details?id=org.idialab.LifeSkills&hl=en


Launch of Virtual Companion iOS app by IDIA Lab

Learn how modern technology can shape our understanding of the past during a special program at Mounds State Park on Saturday, Nov. 14.

Visitors to the 1 p.m. program will join park naturalist Kelley Morgan to learn about modern technologies that help archaeologists and historians bring the past to life. During the second half, director John Fillwalk and animator Neil Zehr of the Institute for Digital Intermedia Arts Laboratory at Ball State University will demonstrate how they use archaeological data to interpret the past to the public.

BSU’s IDIA Lab is premiering Virtual Companion – their custom augmented reality app employing LocusEngine, a geolocative process developed by IDIA Lab. Visitors to the park use the app to aid in learning and discovery while exploring the park’s Adena-Hopewell mounds. Using GPS data, the user’s position is geolocated in reference to the historical sites, allowing the app to display relevant content as a dynamic guide. This approach can be applied in cultural heritage, archeology, the sciences and the arts.

Interactive features, as well as the user’s current location in the park, are marked on a series of map options designed to provide multiple layers of locative information throughout the park. A GPS-driven trail map is available, allowing users to track their movement through the trails and important features. When an interactive feature is selected on the map, an augmented reality view using gyroscope and compass data is loaded, portraying native peoples and habitats from the Adena-Hopewell era. Archaeologists have proposed that the enclosures were used to track celestial alignments. Using solar data from NASA’s JPL Horizons database, the movements of the sun on the equinoxes and solstices during the Adena-Hopewell era can be viewed and tracked to search for important alignments.
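
The proximity test behind this kind of geolocative triggering can be sketched in a few lines. LocusEngine’s actual interface is not shown here; this is just the standard haversine distance check, with a hypothetical trigger radius.

```csharp
using System;

// Great-circle distance between two GPS fixes, used to decide whether the
// user is close enough to a point of interest to load its AR content.
public static class GeoTrigger
{
    const double EarthRadiusM = 6371000.0; // mean Earth radius in meters

    public static double DistanceMeters(
        double lat1, double lon1, double lat2, double lon2)
    {
        double dLat = ToRad(lat2 - lat1), dLon = ToRad(lon2 - lon1);
        double a = Math.Sin(dLat / 2) * Math.Sin(dLat / 2) +
                   Math.Cos(ToRad(lat1)) * Math.Cos(ToRad(lat2)) *
                   Math.Sin(dLon / 2) * Math.Sin(dLon / 2);
        return 2 * EarthRadiusM * Math.Asin(Math.Sqrt(a));
    }

    // radiusM is a placeholder; a real app would tune it per feature.
    public static bool InRange(double userLat, double userLon,
                               double poiLat, double poiLon,
                               double radiusM = 25.0)
        => DistanceMeters(userLat, userLon, poiLat, poiLon) <= radiusM;

    static double ToRad(double deg) => deg * Math.PI / 180.0;
}
```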

Standard park entry fees of $5 per in-state vehicle apply. Mounds State Park (stateparks.IN.gov/2977.htm) is at 4306 Mounds Road, Anderson, 46017.


Download app here: http://bit.ly/VC_Mounds

Buffalo Bill’s Wild West gets a virtual makeover


Digital artists from Ball State’s IDIA Lab built their virtual simulation of Buffalo Bill’s Wild West with the Unity 3-D gaming platform along with custom software created by the lab.

Cowboys and Indians captivated the country when Buffalo Bill’s Wild West rolled through America in the late 1800s and early 1900s. More than a century later, Ball State digital artists have re-created the legendary showman’s outdoor exhibition.

Working with staff from the Buffalo Bill Center of the West, artists and designers from Ball State’s Institute for Digital Intermedia Arts (IDIA) have crafted a computer-generated world that authentically simulates the Wild West show, dramatizing frontier life.

“The visual look and feel of the project is something we’re really proud of,” said John Fillwalk, IDIA director and senior director of the university’s Hybrid Design Technologies initiative.

Fillwalk collaborated on the project with Jeremy Johnston, curator of the center’s Buffalo Bill Museum, and Ball State historians James Connolly and Douglas Seefeldt.

As a senior digital editor of the Papers of William F. Cody, Seefeldt has worked closely with Johnston on several projects the National Endowment for the Humanities recently recognized as among the most significant it funded.

“When Doug introduced me to John, I was excited because all we had to visually represent the Wild West show at the Buffalo Bill Museum was this small architectural diorama,” said Johnston, who is also managing editor of the Papers of William F. Cody. “It gave our visitors an aerial overview of the show but lacked action.

“What the IDIA captured for us is the look and feel of the experience, right down to the sound effects of horses and the stage coach running through the arena.”

Buffalo Bill’s Muncie visit

Interactive Firearm Demonstration

IDIA-created augmented reality apps will feature objects in the museums’ collections, such as firearms.

The Virtual Buffalo Bill project offered a crossover research opportunity for Connolly, director of Ball State’s Center for Middletown Studies. The center is developing Virtual Middletown, a 3-D visualization of industrializing early 20th-century Muncie, and the Buffalo Bill simulation offered an opportunity to produce a module as part of that endeavor.

Connolly and Seefeldt provided Fillwalk with photographs and newspaper accounts of Buffalo Bill’s 1899 stop in Muncie. “He personified the Wild West for audiences in these small towns,” Connolly said.

Connolly’s and Seefeldt’s research, along with assets provided by the Buffalo Bill Center, allowed Fillwalk and his artists to create beautifully rendered graphics based on data and research, hallmarks that have distinguished IDIA’s work in emergent media design.

“The attack on the Deadwood Stage Coach is simulated down to representing John Y. Nelson, one of America’s original Mountain Men driving the coach,” Fillwalk explained. “And Cody himself—along with his wardrobe—was painstakingly researched and re-created. His appearance was based on specific clothing of Cody’s in the museum collection that we were allowed to photograph.”

Seefeldt said Fillwalk’s re-creations uniquely capture William F. “Buffalo Bill” Cody.

“His show had it all—buffalos, the Pony Express, Annie Oakley, re-enactments of iconic events in the history of the West. He was one of the most famous people in the country, a celebrity of that era, and it’s a thrill to see the way John has brought him back to life.”

Ball State-Center of the West partnership continues

Located in Cody, Wyoming, the Buffalo Bill Center of the West includes the Buffalo Bill, Draper Natural History, Whitney Western Art, Plains Indian and Cody Firearms museums, along with the McCracken Research Library.

The Origins of Buffalo Bill

Born in 1846, William F. Cody rode for the Pony Express, served as a military scout and earned his moniker “Buffalo Bill” while hunting the animals for the Kansas Pacific Railroad work crews. Beginning in 1883, he became one of the world’s best showmen with the launch of Buffalo Bill’s Wild West, which was staged for 30 years, touring America and Europe multiple times.

The IDIA Lab’s next project for the center will be a series of augmented reality apps featuring objects in each museum’s collection. By holding electronic devices over images like a grizzly bear or gun, users can learn more about them as 3-D models of the subjects pop up on screen.

“By using their phones or tablets, visitors can see museum exhibits come to life,” Fillwalk said. “All of our work is meant to give visitors a greater appreciation for these assets with the aid of our digital interpretations.”

Johnston said what he likes best about Fillwalk’s approach is the way “he puts technology in the users’ hands.”

“I’ve seen so many younger people walking through our museums with their heads down, glued to their iPhones and iPads. With John’s help, I’m excited that we’re taking something they’re so familiar with and using it in a way to get them to engage with our exhibits here.”

Funding for the Virtual Buffalo Bill project was provided by a grant from the Buffalo Bill Center of the West, which was matched by internal grant funding from Ball State.

http://cms.bsu.edu/news/articles/2015/10/ball-state-gives-buffalo-bills-wild-west-a-virtual-makeover

Virtual Buffalo Bill

Ball State University’s IDIA Lab (https://idialab.org) is developing a multiplayer virtual world that simulates Buffalo Bill’s Wild West show. This digital history project is built in Unity 3D using custom software created by IDIA Lab and is being produced for the Buffalo Bill Center of the West in Cody, WY. Scholars include Douglas Seefeldt and James Connolly at Ball State University. http://centerofthewest.org


Virtual Apollo 15

IDIA Lab’s Virtual Apollo 15 is a simulation of the mission, which launched on July 26, 1971 from the Kennedy Space Center at Cape Canaveral, Florida. During the launch, the S-IC first stage did not completely shut off until four seconds after staging, creating the possibility of the spent stage banging into the S-II engines, damaging them and forcing an abort (the S-II exhaust also struck a telemetry package on the S-IC and caused it to fail). Despite this, the third stage and spacecraft reached the planned Earth parking orbit. A couple of hours into the mission, the third stage reignited to propel the spacecraft out of Earth orbit and on to the Moon. The Apollo 15 command module is on display at the Smithsonian Air and Space Museum in Washington, D.C.


Visit Virtual Apollo 15 in Blue Mars

http://blink.bluemars.com/City/IDIA_Lunar/

Virtual Apollo 15 in Blue Mars Video Walkthrough

Temple of Artemis: Wonder of the Ancient World

IDIA Lab was contracted by digital humanities scholars at UCLA to design and build a virtual simulation of the Temple of Artemis, one of the Wonders of the Ancient World. This massive Greek temple, four times the size of the Parthenon, lies in ruin in present-day Turkey. The simulation incorporates our CelestialEngine, which accurately positions both the sun and moon using a site’s latitude, longitude, altitude and year via NASA JPL data. This particular simulation studies whether an opening in the temple’s portico allowed moonlight to illuminate the statue of Artemis on her feast day.

Dice+


Connecting Dice+ to a Cell Phone

The DicePlus (Dice+) can communicate with both iOS and Android devices and has an available SDK for Android, iOS, and Unity.

Its capabilities include reporting the number rolled, control of its LEDs, a magnetometer and accelerometer for orientation and movement, a capacitive sensor for proximity and touch, and a temperature sensor.

For more information regarding its use, check out www.dicepl.us.


Mobile Arduino Controller

The Mobile Arduino Controller allows one to access an Arduino board via a mobile-friendly website. A website designed for mobile devices was chosen over a native app for a few reasons:

1) Allows access from computers, not just smart phones.

2) Easier for the public to access the Arduino or multiple Arduinos for installations without searching for an app and then waiting for it to download.

3) Works on all smart phones.

The mobile website allows for near-real-time manipulation of anything connected to the Arduino board. It could also be used to relay sensor information back to the website or user. Right now it is connected only to control LEDs, but it could drive anything (servos, speakers, etc.).
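
From a client’s point of view, driving the board could look like the sketch below. The host name, /led path, and query parameters are invented for illustration; the prototype’s real endpoints are not documented here.

```csharp
using System.Net.Http;
using System.Threading.Tasks;

// Hypothetical client for the Arduino's web front end: one HTTP request
// toggles a pin, e.g. await ArduinoClient.SetLedAsync("http://arduino.local", 13, true);
public static class ArduinoClient
{
    private static readonly HttpClient http = new HttpClient();

    public static async Task SetLedAsync(string baseUrl, int pin, bool on)
    {
        string url = $"{baseUrl}/led?pin={pin}&state={(on ? 1 : 0)}";
        (await http.GetAsync(url)).EnsureSuccessStatusCode();
    }
}
```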

We hope to use this for more than just turning an LED on and off, but currently it is in a prototyping/research phase.

Unity Mobile Controller

The Unity Mobile Controller would allow us to have a Unity project running on a PC while users download an app that turns their smart phone into the controller. This allows the project running on the PC to show the player and the smart phone to display the controls and other information (such as maps, menus, etc.). It is still in a research/prototyping stage right now.
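
One plausible wiring for this, sketched under stated assumptions: the phone app sends joystick values as small UDP datagrams, and the PC-side Unity project polls for them each frame. The port and the “x y” message format are assumptions made for this example, not the prototype’s actual protocol.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// PC-side receiver: turns "x y" datagrams from the phone into a Vector2.
public class PhoneControllerReceiver : MonoBehaviour
{
    private UdpClient udp;
    public Vector2 Stick { get; private set; } // latest joystick reading

    void Start() => udp = new UdpClient(9050); // assumed port

    void Update()
    {
        // Drain every datagram that arrived since the last frame.
        while (udp.Available > 0)
        {
            var from = new IPEndPoint(IPAddress.Any, 0);
            string[] parts =
                Encoding.ASCII.GetString(udp.Receive(ref from)).Split(' ');
            if (parts.Length == 2 &&
                float.TryParse(parts[0], out float x) &&
                float.TryParse(parts[1], out float y))
                Stick = new Vector2(x, y);
        }
    }

    void OnDestroy() => udp?.Close();
}
```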


Virtual Roman Pantheon in Blue Mars / CryEngine


The Pantheon is the best-preserved architectural monument of ancient Rome. This simulation by BSU’s IDIA Lab represents the Pantheon and its surroundings as they may have appeared in 320 AD. Visitors to this Blue Mars / CryEngine simulation can tour the vicinity, learning about the history, function and solar alignments through an interactive heads-up display created for this project. The project opened in beta in late 2013 and will premiere publicly in February 2014; it includes new solar simulation software calibrated to the building’s location and year, an interactive HUD, a greeter-bot system and a new AI non-player-character system developed in partnership between IDIA Lab and Avatar Reality.

Originally built by Agrippa around 27 BC under the rule of Augustus, the Pantheon was destroyed by fire, then rebuilt and finally completed in its present form during Emperor Hadrian’s reign, around 128 AD. Agrippa completed the original construction, and the building bears his name above the portico. The Pantheon would have contained numerous marble statues representing the major Roman deities; the statues displayed in this simulation represent a possible configuration and were scanned via photogrammetry. The buildings surrounding the Pantheon were built and interpreted by IDIA based on the large-scale model of ancient Rome built by Italo Gismondi between 1935 and 1971, which resides in the Museo della Civiltà Romana, just outside Rome, Italy.

Video walkthrough of the Virtual Pantheon in Blue Mars:

To visit the Virtual Pantheon firsthand:

First, create an account: https://member.bluemars.com/game/WebRegistration.html/
Second, download the client: http://bluemars.com/BetaClientDownload/ClientDownload.html/
Last, visit the Virtual Pantheon: http://blink.bluemars.com/City/IDIA_IDIALabPantheon/

Advisors

The Solar Alignment Simulation of the Roman Pantheon in Blue Mars was developed in consultation with archaeo-astronomer Dr. Robert Hannah, Dean of Arts and Social Sciences at the University of Waikato, New Zealand, one of the world’s foremost scholars on Pantheon solar alignments, and archaeologist Dr. Bernard Frischer, Indiana University.

Background

The Pantheon that we can visit today is composed of a rectangular porch with three rows of granite columns in front of a circular building designed as a huge hemispherical dome (142 feet in diameter), built over a cylinder of the same diameter and as high as the radius. The ideal completion of the upper hemisphere by a hypothetical lower one would therefore touch the central point of the floor, directly under the unique source of natural light in the building. This light source is the so-called oculus, a circular opening over 27 feet wide at the top of the cupola. It is the only source of direct light, since no direct sunlight can enter from the door in the course of the whole year, owing to the northward orientation of the entrance doorway. Of the original embellishments, the coffered ceiling, part of the marble interiors, the bronze grille over the entrance and the great bronze doors have survived.

Interior Wall

The interior wall, although circular in plan, is organized into sixteen regularly spaced sectors: the northernmost one contains the entrance door, and then (proceeding in a clockwise direction) pedimented niches and columned recesses alternate with each other. Corresponding to these ground-level sectors are fourteen blind windows in the upper, attic course, just below the offset between the cylinder and the dome. It is likely that both the niches and the windows were meant for statues, which, however, have not survived.

Oculus

Direct sunlight penetrates the interior only through the large, 27-foot-wide oculus in the center of the domed roof. Otherwise, indirect sunlight can enter the building only through the large, north-facing doorway when it is open. The fall of direct sunlight through the oculus into the essentially spherical building invites comparison with a roofed sundial.

Celestial Alignments

A columned porch leads through a vestibule into the Pantheon’s huge, shadowy interior, over 142 feet in height and as much in diameter. The building’s form is essentially that of a sphere with its lower half transformed into a cylinder of the same radius; direct sunlight enters only through the oculus described above.

The shift from one half of the year to the other is marked by the passage of the sun at the equinoxes in March and September. At these points the noontime sun shines partially just below the dome, passing through the grille over the entrance doorway and falling on the floor of the porch outside. More significantly, however, the centre of this equinoctial, midday circle of sunlight lies on the interior architectural moulding that marks the base of the dome.

On April 21st, the midday sun shines directly on to visitors to the Pantheon when they are standing in the open doorway, dramatically highlighting them. This day is of particular significance, not just because this was when the sun entered Taurus, but more because it is the traditional Birthday of Rome, a festival preserved from antiquity right through to the present day. And it may be that when the building was officially commissioned in AD 128, the person expected to be standing in the open doorway was the emperor Hadrian himself.

The illustration indicates a section through the Pantheon, showing the fall of the noon sunlight at the winter solstice, when the sun is at altitude 24 degrees; noon sunlight, at both equinoxes at altitude 48 degrees; noon sunlight on April 21st, when the sun is at altitude 60 degrees; and finally, noon sunlight at the summer solstice, when the sun is at altitude 72 degrees.
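These altitudes follow from simple solar geometry. As a rough check (added here, and not part of Hannah’s text), the sun’s noon altitude h depends only on the site latitude φ and the solar declination δ:

    \[ h_{\text{noon}} = 90^{\circ} - \varphi + \delta \]

Taking Rome’s latitude as approximately 41.9° N: the winter solstice (δ ≈ −23.4°) gives h ≈ 24.7°; the equinoxes (δ = 0°) give h ≈ 48.1°; April 21st (δ ≈ +12°) gives h ≈ 60°; and the summer solstice (δ ≈ +23.4°) gives h ≈ 71.5° – matching the rounded values above (atmospheric refraction ignored).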

Meaning of Pantheon

The Pantheon is a building in Rome, Italy, commissioned by Marcus Agrippa during the reign of Augustus as a temple to all the gods of ancient Rome, and rebuilt by the emperor Hadrian about 126 AD.

Pantheon is an ancient Greek composite word meaning All Gods. Cassius Dio, a Roman senator who wrote in Greek, speculated that the name comes either from the statues of so many gods placed around this building, or from the resemblance of the dome to the heavens.

“Agrippa finished the construction of the building called the Pantheon. It has this name, perhaps because it received among the images which decorated it the statues of many gods, including Mars and Venus; but my own opinion of the name is that, because of its vaulted roof, it resembles the heavens.”

– Cassius Dio, Roman History 53.27.2

Augustus

Augustus was the founder of the Roman Empire and its first Emperor, ruling from 27 BC until his death in 14 AD.

The reign of Augustus initiated an era of relative peace known as the Pax Romana (The Roman Peace). Despite continuous wars of imperial expansion on the Empire’s frontiers and one year-long civil war over the imperial succession, the Roman world was largely free from large-scale conflict for more than two centuries. Augustus dramatically enlarged the Empire, annexing Egypt, Dalmatia, Pannonia, Noricum, and Raetia, expanded possessions in Africa, expanded into Germania, and completed the conquest of Hispania.

Beyond the frontiers, he secured the Empire with a buffer region of client states, and made peace with the Parthian Empire through diplomacy. He reformed the Roman system of taxation, developed networks of roads with an official courier system, established a standing army, established the Praetorian Guard, created official police and fire-fighting services for Rome, and rebuilt much of the city during his reign.

Augustus died in 14 AD at the age of 75. He may have died from natural causes, although there were unconfirmed rumors that his wife Livia poisoned him. He was succeeded as Emperor by his adopted son (also stepson and former son-in-law), Tiberius.

Agrippa

Marcus Vipsanius Agrippa (c. 23 October or November 64/63 BC – 12 BC) was a Roman statesman and general. He was a close friend, son-in-law, lieutenant and defense minister to Octavian, the future Emperor Caesar Augustus; he was also father-in-law of the Emperor Tiberius, maternal grandfather of the Emperor Caligula, and maternal great-grandfather of the Emperor Nero. He was responsible for most of Octavian’s military victories, most notably the naval Battle of Actium against the forces of Mark Antony and Cleopatra VII of Egypt.

In commemoration of the Battle of Actium, Agrippa built and dedicated the building that served as the Roman Pantheon before its destruction in 80 AD. Emperor Hadrian used Agrippa’s design to build his own Pantheon, which survives in Rome. The inscription on the later building, erected around 125 AD, preserves the text of the inscription from Agrippa’s building, dating it to his third consulship. Agrippa spent the years following his third consulship in Gaul, reforming the provincial administration and taxation system and building an effective road system and aqueducts.

Arch of Piety

The Arch of Piety is believed to have stood in the piazza immediately north of the Pantheon. Statements made in mediaeval documents imply, but do not specifically say, that the scene of Trajan and the widow was represented in a bas-relief on the Arch – narrating the story of the emperor and a widow while suppressing the emperor’s name. The probable source, the mediaeval guidebook of Rome known as the Mirabilia Romae, does not even state that the arch was built in commemoration of the event; it mentions the arch and then says that the incident happened there.

Giacomo Boni discusses the legend of Trajan, giving many interesting pictures which show how the story was used in medieval painting and sculpture. He found a bas-relief on the Arch of Constantine which he thinks may have given rise to the story. It shows a woman sitting, her right hand raised in supplication to a Roman figure who is surrounded by other men, some in military dress and two accompanied by horses. Boni suggests that the Middle Ages may have supposed this figure to be Trajan because of his reputation for justice.

Saepta Julia

The Saepta Julia was a building in Ancient Rome where citizens gathered to cast votes. The building was conceived by Julius Caesar and dedicated by Marcus Vipsanius Agrippa in 26 BC. The building was originally built as a place for the comitia tributa to gather to cast votes. It replaced an older structure, called the Ovile, which served the same function. The building did not always retain its original function. It was used for gladiatorial fights by Augustus and later as a market place.

The conception of the Saepta Julia began under Julius Caesar (d. 44 BC). Located in the Campus Martius, the Saepta Julia was built of marble and surrounded a huge rectangular space next to the Pantheon. According to a letter written by Cicero to his friend Atticus about the project, Caesar wanted the building to be made of marble and to have a mile-long portico. The quadriporticus (a four-sided portico, like the one enclosing the Saepta Julia) was an architectural feature made popular by Caesar.

After Caesar’s assassination in 44 BC, and amid the backlash of public support for the former ruler, men continued to work on projects that Caesar had set into motion. Marcus Aemilius Lepidus, who had supported Caesar and subsequently aligned himself with his successor Octavian, took over the Saepta Julia building project. The building was finally completed and dedicated by Marcus Vipsanius Agrippa in 26 BC. Agrippa also decorated the building with marble tablets and Greek paintings.

The Saepta Julia can be seen on the Forma Urbis Romae, a map of the city of Rome as it existed in the early 3rd century AD. Part of the original wall of the Saepta Julia can still be seen right next to the Pantheon.

– edited from Robert Hannah, “The Pantheon as Timekeeper”, 2009.
________

Simulation by the Institute for Digital Intermedia Arts at Ball State University
Project Director: John Fillwalk, Senior Director, IDIA Lab, BSU.
IDIA Staff: Neil Zehr, Trevor Danehy, David Rodriguez, Ina Marie Henning, Adam Kobitz

PROJECT ADVISORS:
Dr. Robert Hannah, University of Waikato, New Zealand
Dr. Bernard Frischer, Virtual World Heritage Laboratory, Indiana University, USA

SPECIAL THANKS:
Shinichi Soeda, Avatar Reality

Virtual Nursing Simulator

NursingSimLoginScreen

HDT and IDIA Lab have developed a next-generation, high-fidelity Virtual Nursing Simulator in the Unity 3D game engine for testing by the Ball State University School of Nursing. Students will practice interviews via role-playing, employing avatars with predefined health histories. The simulator introduces our new media tools within the environment, including video playback, webcam feeds, screen sharing, presentations and webpages. Additionally, the simulator allows for interdisciplinary immersion, providing a context for student nurses to role-play scenarios alongside doctors, health insurance representatives, administrators and other professionals. Desktop, web and tablet versions are in development.
Video preview here: http://www.youtube.com/watch?v=Asz92FuhEyc&feature=share&list=UUtQrLF1JPprEvP4AO-iSNvw

Izapa Group F Ballcourt Solar Simulation

Mayan Solar Simulator
Izapa Group F Ballcourt

IDIA_IzapaSimulator1stPerson

We have applied our Virtual Solar Simulator to a simple white model of the pre-Columbian ballcourt in Group F at the Izapa site in Chiapas, Mexico. Izapa is considered the birthplace of the Mayan Long Count, which ends its cycle today, December 21st, 2012 – the winter solstice. Viewed today, the site’s axis is aligned generally, but not precisely, with sunrise on the winter solstice and sunset on the summer solstice: in direct observation, the alignment along the axis of the ballcourt from Throne 2 to Stela 60 is off by approximately two degrees.

The simulator can be visited here:

https://projects.idialab.org/izapasolarsimulator.html

* The simulation requires Internet Explorer and the Unity plugin.

IDIA_IzapaSimulatorPlanView

The solar simulator developed by IDIA Lab at Ball State University polls data from the NASA/JPL Horizons database (http://ssd.jpl.nasa.gov/horizons.cgi), which calculates the positions of celestial objects (sun, moon, planets, comets, etc.). The database takes into account the Chandler wobble – a periodic deviation in the rotation of the Earth’s axis. Archeo-astronomical alignments viewed today at Izapa appear to be off-axis; however, when changes in time, rotation, position and the Chandler effect are compensated for via this simulation method, the solstice events return to their original orientation for the era of construction and can be observed to be in actual alignment.
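For illustration, a query for the sun’s apparent position might look like the Python sketch below. This is a minimal example under stated assumptions, not IDIA Lab’s production code: it uses the current public Horizons web API rather than the CGI form linked above, the Izapa site coordinates are approximate, and a 2012 date is used for simplicity (the production simulator targeted circa 300 BCE).

    import requests

    # Minimal sketch: ask JPL Horizons for the Sun's apparent azimuth and
    # elevation as seen from the Izapa site (coordinates approximate).
    params = {
        "format": "text",
        "COMMAND": "'10'",                    # 10 = the Sun
        "EPHEM_TYPE": "'OBSERVER'",           # observer-table ephemeris
        "CENTER": "'coord@399'",              # topocentric site on Earth (399)
        "COORD_TYPE": "'GEODETIC'",
        "SITE_COORD": "'-92.18,14.92,0.17'",  # E-lon(deg), lat(deg), alt(km) - approximate
        "START_TIME": "'2012-12-21 17:00'",   # winter solstice, in UT
        "STOP_TIME": "'2012-12-21 19:00'",
        "STEP_SIZE": "'10m'",
        "QUANTITIES": "'4'",                  # quantity 4 = apparent AZ & EL
    }
    resp = requests.get("https://ssd.jpl.nasa.gov/api/horizons.api", params=params)
    print(resp.text)  # plain-text ephemeris table of azimuth/elevation rows

The returned azimuth/elevation pairs can then drive the position of the simulated sun in the game engine.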

A b’ak’tun is 144,000 days in length – almost 400 years. In the Mayan Long Count it took 13 b’ak’tuns to progress through a full cycle of creation. Dec. 21, 2012, marks the end of the 13th b’ak’tun of this Mayan calendar, and after today’s solstice the next Long Count cycle begins again – another 5,128.8 years of 365 days each, or roughly 5,125 solar years.
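The arithmetic behind those figures (added here as a check):

    \[ 13 \times 144{,}000 = 1{,}872{,}000 \ \text{days}, \qquad \frac{1{,}872{,}000}{365} \approx 5{,}128.8, \qquad \frac{1{,}872{,}000}{365.2425} \approx 5{,}125.3 \]

so a full cycle is about 5,129 calendar years of 365 days, or roughly 5,125 solar years.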

Happy New Mayan Long Count!
– John
December 21st, 2012

Simulator:

https://projects.idialab.org/izapasolarsimulator.html

* The simulation requires Internet Explorer and the Unity web player plugin: http://unity3d.com/webplayer/

Note on display time and date

Dates are in the year 300 BCE/BC.
Time is represented in Universal Time (UT1/UTC/GMT).
To obtain local time for the Izapa site, subtract 6 hours from the UT time displayed in the simulator (see the snippet below).
The next release will provide solstice and equinox presets for the year 2012 for comparison.
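In code, the conversion to local time is a simple offset. A trivial sketch (the timestamp is an arbitrary example; Python’s datetime cannot represent BCE years, so a modern date stands in):

    from datetime import datetime, timedelta

    # Convert a simulator timestamp shown in UT to local time at Izapa (UTC-6).
    ut = datetime(2012, 12, 21, 17, 30)   # example reading from the display
    local = ut - timedelta(hours=6)
    print(local)                          # 2012-12-21 11:30:00 local time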

Camera control

To switch cameras, press 1 for the first-person view, 2 for the bird’s-eye view, or 3 for the top-down view.
To pan, use the WASD keys; to zoom, use the middle mouse button. To rotate the camera, use Ctrl-click or the right mouse button.

Background

The model is geolocated and accurately referenced in 3D space by latitude, longitude, orientation and elevation. The Horizons database is then scraped using these coordinates as well as the year of interest – in this case, 300 BCE/BC, roughly the height of the period during which the Izapa site was inhabited.

IDIA Lab developed the celestial simulation – which can also track the moon, the planets and other bodies – for use on various cultural heritage and archeo-astronomy projects in collaboration with international scholars. Other projects utilizing the Celestial Simulator include simulations of Hadrian’s Villa, the Roman Pantheon, the Solarium Augusti in Rome (the largest sun calendar of the ancient world), the Temple of Artemis in modern Turkey (one of the wonders of the ancient world) and Stonehenge.

Clarian Health VR Simulation

In collaboration with Clarian Health, IDIA has created a training video and interactive media to help facilitate the transition into the Ball Memorial Hospital New South Tower.

How can nurses train in a new hospital wing before it is constructed? Or after it’s complete and full of patients? For help addressing the situation, hospital officials turned to Ball State University and its emerging media experts. Rather than have the nurses don hardhats and run training seminars amidst saws and hammers, Ball State’s Institute for Digital Intermedia Arts (IDIA) worked to create a virtual training program. The result will allow nurses to interact with the new layout and better adapt to their new surroundings well before the physical construction is complete.

This could be extremely important in terms of ultimate patient care, as nurses will be adapting to a new facility as well as learning a new patient-based floor design. Rather than having a single nursing station surrounded by many rooms, the wing will have individual nursing stations ensconced between two rooms. “Our interactive training simulation showcases new, more efficient methods for working in a decentralized care unit as opposed to a centralized care unit,” said John Fillwalk, IDIA director. “Switching from a single hub to multiple nursing stations represents a culture shift for nurses, and we were able to help them work through that.”

Using Quest 3D, a virtual reality program, Fillwalk and his team developed “New Spaces, New Care,” a training simulation that works like a computer game, allowing nurses to virtually explore their new environment, sit at their new workstations, view and walk into patients’ rooms, examine charts and access medicine cabinets.

In the weeks prior to the facility’s opening, nurses assigned to the South Tower will complete the virtual training. By exploring the new wing before it’s complete, the nurses will be better acquainted with the amenities, which, once mastered, will give them more time for their patients, Fillwalk said. “By working directly with hospital officials and the nursing staff, we were able to create a program to more quickly acclimate staff members to their new environment and help them focus on the most important component of their jobs — tending to patients,” he added.

Links
Indianapolis WRTV Channel 6 Story

The Star Press: Energize ECI collaborating to lead an economic transformation

Gaming in Virtual Worlds and on Mobile Devices

REAL TIME 3D ENVIRONMENTS

Virtual worlds are three-dimensional environments that can provide compelling shared experiences for their users. These media-rich immersive spaces can connect to social networks, web-services, and data to bridge into an entire universe of external content. The IDIA has developed scores of projects in virtual worlds and game environments including Blue Mars, Unity 3D, Second Life, Open Simulator, and Quest 3D.

The IDIA built and animated Odie’s character using Maya, an industry-standard animation software package used in the feature film and visualization industries. This game, which IDIA designed with Paws characters, was developed in Unity 3D.

Paws characters Odie and Garfield and their neighborhood were then brought into the Unity game engine, where the lighting, real-time shadows and physics simulations were designed. Unity’s scripting capability was also utilized to control the Odie character’s animation and movement, trigger collision sounds, and animate Garfield as Odie moves by. There is a selection of objects for Odie to interact with, each with its own physical properties, including weight, friction and bounciness.

http://unity3d.com

MOBILE ENTERTAINMENT

Mobile devices are increasingly used to help us navigate, communicate, stay informed and be entertained. Devices such as the Apple iPhone are capable of supporting rich and complex 3D experiences such as augmented reality and games. The IDIA has been developing numerous projects for PDA enabled mobile phones including applications for social networking, conference systems, augmented reality walking tours, live musical performance and games.

The prototype 3D game that IDIA built with Paws content has also been repurposed here as an iPhone game – exhibiting the range and scalability of these authoring environments. The same content can also be realized as a Nintendo Wii game using this production workflow.

This authoring environment supports two and three-dimensional game spaces, interacting through clicks, tilts and touch. Users can walk Odie through his neighborhood, encountering trashcans, balls and Garfield himself!

http://unity3d.com/iPhone

Astym 3D Visualization

Trevor Danehy and Tom Anderson, students from the IDIA Immersion Seminar, designed and produced a 3D visualization of a therapeutic process developed by Dr. Tom Sevier of Performance Dynamics. The students worked with state-of-the-art software and a commercial anatomical 3D model with over 4,000 parts to produce a professional-quality, dynamic scientific illustration for an external client.

The ASTYM simulation was an animation used to visualize specific effects and treatments of tendinosis. Information was gathered from ASTYM researchers to turn a complex, unseen procedure into a visual explanation. Choosing a design style was a challenge because 3D animation in the medical field has a different goal than most common animation: a balance is needed between accuracy and simple, clean visuals. Various shader styles were considered to display the internal processes of the human arm accurately while avoiding anything realistically gruesome, or visual clutter that would be detrimental to explaining the complexity of the human body. Transparent yet visible objects were needed, so the students opted for a dynamic incidence shader, which shades the curvature and sides of objects while leaving the middle portions transparent enough to see what lies beneath them. Bright, glowing colors were used to highlight particular tendons and separate muscles from one another.

Links
Performance Dynamics

Museum Scanning Project

The IDIA Seminar’s current project focuses on virtually recreating the original settings of various sculptures found throughout the Ball State University Museum of Art, beginning with scanning five sculptures using a 3D laser scanner.

This project has allowed Ball State students to get involved with 3D scanning by learning the methods, techniques and limitations involved in accurate scanning procedures.
The museum scanning project was initially conceived not only to digitally archive a few select sculptures, but to place them in an animated video visualizing the art in its original context, before the works found their way to the Ball State University Museum of Art.

Sculptures such as Adolph Alexander Weinman’s “Descending Night” and “Rising Sun” were originally meant to be viewed at the Panama-Pacific International Exposition of 1915 in San Francisco. The students intend to visually replicate that scene, along with how the artist framed the works for viewing, with the help of a ZCorporation ZScanner 700 at high resolution. Scanning typically takes the class around six hours for a human-sized sculpture. The ZScanner can scan objects in the tightest spaces and do it in real time with one continuous scan. Once the scan data is acquired, it can be transferred to a wide variety of media, both digital and physical. Students then place the models within an animation using Autodesk Maya. A high-resolution viewer and interactive touch screens are also used to view the models. Students are also investigating 3D prototyping of the models into smaller, more reproducible copies.

Links:
3D Scanner Technology

Virtual Middletown Living Museum in Blue Mars

The Virtual Middletown Living Museum project in Blue Mars is a simulation of the Ball glass factory in early 20th-century Muncie, Indiana. Life and working conditions in the factory were key elements of Robert S. and Helen Merrell Lynd’s landmark studies Middletown (1929) and Middletown in Transition (1937). These in-depth accounts of life in Muncie, Indiana, became classic sociological studies and established the community as a barometer of social trends in the United States. In the years since, scholars in a variety of fields have returned to Muncie to follow up on the Lynds’ work, making this small city among the most studied communities in the nation. The Center for Middletown Studies continues this tradition by sponsoring and promoting research on Muncie as Middletown, on small cities generally, and on the themes and issues the Lynds explored.

This simulation of industrial life, built as a prototype for a much larger project dealing with all aspects of the Lynd study, aims to create a virtual living-museum experience that expands the opportunities for both learning and interpretation. The approach to interactive design embeds learning and navigation experiences subtly into the project to maintain the sense of immersion. IDIA has prototyped several techniques to do this, including: interactive objects that allow close-up inspection; objects that, when clicked, bring up web resources showing information, plans or photographs used in the interpretation; non-player-character factory workers; a live interactive avatar of Frank C. Ball who greets visitors and introduces them to the factory; video and audio files of factory experts and archival films; an in-world interactive heads-up display (HUD) that provides deeper investigation and navigation through the factory; and a supporting webpage with complete documentation of all resources used in this interpretation.

To download the Blue Mars client, create an account and tour Virtual Middletown, please visit: http://blink.bluemars.com/City/IDIA_IDIALabExhibitions/

Follow these steps to get set up:

  1. Select “Download Client” and follow the instructions to install the Blue Mars client on your PC (Windows desktop or laptop)
  2. Once you have successfully installed the BlueMars client, select “Go To City” to install the Virtual Middletown virtual world
  3. Register your account and confirm when you receive an email from BlueMars
  4. Modify your avatar (optional)
  5. Explore Virtual Middletown!

NOTE: If you are a Macintosh user (OS X), you may run the Blue Mars client and the Virtual Middletown virtual world under Windows installed via Apple’s Boot Camp: http://www.apple.com/support/bootcamp/

Here are links for additional information on the project:

  1. Center for Middletown Studies project website: http://cms.bsu.edu/Academics/CentersandInstitutes/Middletown/Research/Virtual/Virtual.aspx
  2. IDIA Lab project website: https://idialab.org/virtual-middletown-living-museum-in-blue-mars/

Recommended settings: Very High graphics with a good graphics card – otherwise High or Low as needed. Screen resolution: minimum of 1280 by 720. Sound levels should be up. Many objects in the world are interactive – anything that highlights blue can be clicked with the left mouse button and examined, or may open a reference web page. The heads-up display in the lower right-hand corner provides information and navigation to augment your visit.

Project partners: The Center for Middletown Studies, Library Services and the Emerging Media Initiative at Ball State University

Press release: http://www.bsu.edu/news/article/0,1370,7273-850-65401,00.html

3D Point Cloud Laser Scanning

From a seashell to an auditorium, laser scanning can capture the objects and spaces we inhabit and convert them into a digital model. The IDIA scans a range of small- and large-scale 3D objects using facilities and equipment shared with the SimLab in the College of Architecture and Planning and supported by the Office of Information Technology. Three-dimensional geometry is captured as millions of points that outline the scanned object; these points are then converted into polygons for use in the resulting animation workflow.
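As an illustration of that point-to-polygon step – a sketch under stated assumptions, not IDIA’s actual pipeline; it assumes the open-source Open3D library and a hypothetical scan file named scan.ply:

    import open3d as o3d

    # Load a scanned point cloud ("scan.ply" is a hypothetical scanner export).
    pcd = o3d.io.read_point_cloud("scan.ply")

    # Surface reconstruction needs consistent normals on the points.
    pcd.estimate_normals()
    pcd.orient_normals_consistent_tangent_plane(k=30)

    # Poisson reconstruction converts the points into a triangle mesh;
    # a higher depth preserves more detail at the cost of polygon count.
    mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=9
    )
    o3d.io.write_triangle_mesh("mesh.ply", mesh)  # ready for Maya or a game engine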

In 2009, the IDIA Immersion Seminar in Virtual Worlds worked to simulate the original settings of several sculptures in the collection of the Ball State University Museum of Art. Students were professionally trained to use the ZCorp 3D laser scanner using archival museum methods.

This project has allowed Ball State students to obtain skills in 3D scanning by learning the methodology, techniques and limitations involved in accurate scanning procedures. The museum scanning project was initially a way not only to digitally archive select sculptures, but to place them in a live, three-dimensional, multi-user virtual environment that immerses viewers in a simulation of the works in their original context.

The significance of this project was to address the fact that pre-modern sculpture was typically designed as an element of a large-scale architectural, public or natural setting. IDIA developed a workflow that is now also a service available to external clients in the sciences, industry and the arts – employing emerging digital tools to illuminate a viewer’s understanding of context.

FARO and Z Corporation’s scanning technologies are used in applications such as reverse engineering, accident reconstruction, forensic science, historic preservation, entertainment and virtual simulation.

Eco-Net

“Eco-Net” is an attempt to connect wireless network data with nature by visualizing that data with plant-like structures and organic motion. Each plant represents a computer connected to the network and each IP address is displayed above the corresponding plant. Collective network activity is displayed as websites are browsed and emails are sent. This piece represents our constantly connected state, simulated through plant and root structures, and the constant barrage of data that flows through the air all around us every day. This piece was created completely in Java, using the Processing API.

Links
Students win international award for work in the digital arts

Virtual World/Social Media Mashup: Flickr Gettr v4

IDIA Lab’s new Flickr Gettr launching soon – shared & mobile virtual world mashup artwork – preview Flickr pics here http://bit.ly/bCWQCL

Flickr Gettr: Shared and Mobile Media Mashup Artwork
The Institute for Digital Intermedia Arts (IDIA Lab) at Ball State University

IDIA Lab’s Flickr Gettr connects the social image repository Flickr to virtual worlds and mobile devices through an interactive 3D and sonic experience – immersing the viewer in a dimensional cloud of user-searched imagery and sound.

Background

Flickr Gettr in Second Life leverages the new SL shared-media plugin architecture to provide rich, spatially integrated web experiences. Participants enter queries that search public image folders via the Flickr web service; the queries pull in related images from Flickr and feed them back to be spatialized in an immersive visual and sonic environment. As each image spawns it generates a sound, so that the viewer is surrounded by their selected imagery while related tags are spoken in a synthesized voice.

Flickr Gettr positions virtual worlds as a platform to navigate media in the information metaverse. Virtual worlds have the potential to position the Internet as a three-dimensional information and communication platform where live data can flow in and out to visualize, contextualize, communicate, and to inspire.

In Flickr Gettr, an external web service is used as an intermediary to query Flickr, receive images and format them for delivery as textures. It then passes the aspect ratios and tags in a second query so that the textures can be mapped properly. To make these interactions more flexible, the intermediary web server collates and prepares information, retaining state that can be queried from external applications. The web application effectively serves as an intermediary between the virtual world and outside environments, providing the communications link and logic to assimilate the information. This can make the creation of Web 2.0 mash-ups much simpler, as the code for such queries and logic is already highly developed in Java, Ruby and Perl, for example. Flickr Gettr also triggers music files upon the rapid rezzing of objects to create a cumulative ambient effect.
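A minimal sketch of the Flickr side of such an intermediary, in Python (illustrative only: the API key is a placeholder, the helper names are invented here, and the production service also managed state, tags and aspect ratios):

    import requests

    API_KEY = "YOUR_FLICKR_API_KEY"  # placeholder for a real Flickr API key

    def search_photos(query, per_page=10):
        """Ask the Flickr REST API for public photos matching a search term."""
        resp = requests.get(
            "https://api.flickr.com/services/rest/",
            params={
                "method": "flickr.photos.search",
                "api_key": API_KEY,
                "text": query,
                "per_page": per_page,
                "format": "json",
                "nojsoncallback": 1,
            },
        )
        return resp.json()["photos"]["photo"]

    def photo_url(p):
        """Build a static image URL the virtual world can load as a texture."""
        return "https://live.staticflickr.com/{server}/{id}_{secret}.jpg".format(**p)

    for photo in search_photos("pantheon"):
        print(photo_url(photo))  # an in-world script would spawn a textured object here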

In 2010, IDIA Lab was invited by the Indian Institute of Technology in Mumbai, India to install Flickr Gettr at their annual technology festival – the largest of its kind in Asia – which hosted more than 65,000 attendees.