IDIA Lab exhibits AI Art at the National Museum of China

Wishing Well (许愿池)

Wishing Well (许愿池), by media artist John Fillwalk and IDIA Lab, is an Artificial Intelligence (AI) installation invited to the 5th Arts and Science International Exhibition and Symposium (TASIES 2019) at the National Museum of China in Beijing, on Tiananmen Square. The exhibition curated the work of international artists engaged in the pioneering use of AI in art-making. The Wishing Well installation creates an interactive environment where visitors’ spoken wishes shape the substance, meaning and character of the artwork – mediated through artificial intelligence.


Wishing Well (许愿池) engages both private and public spatial interactions in the creation of a simultaneously intimate and collective social experience. The tone of each wish aggregated by the system is processed anonymously and delivered into this social sculpture via synthesized whispered voices, spoken in Mandarin. The pool of water in the bowl, or wishing well, is vibrated by a speaker placed underneath – creating a three-dimensional, dynamic sound-wave representation shaped by the sound of the actual wish. Virtual colored shapes are emitted based on the tonal analysis of each wish and aggregated into a historical representation of previous wishes – launching skyward to become part of the larger collection of dreams.
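
The tonal analysis the piece describes can be sketched in a few lines of Python. This is a minimal illustration only: the installation’s actual pipeline is unpublished, so the librosa-based pitch analysis, the pitch range, and the pitch-to-hue mapping below are all assumptions.

```python
# Minimal sketch of a tone-to-color mapping, assuming a librosa-based
# pitch analysis; the installation's real pipeline is not published,
# so every threshold and mapping here is illustrative.
import colorsys
import librosa
import numpy as np

def wish_to_color(wav_path):
    """Map the median pitch of a recorded wish to an RGB color."""
    y, sr = librosa.load(wav_path, sr=22050, mono=True)
    f0 = librosa.yin(y, fmin=80, fmax=800, sr=sr)  # frame-wise pitch (Hz)
    f0 = f0[np.isfinite(f0)]
    if f0.size == 0:
        return (0.5, 0.5, 0.5)  # neutral gray for silent input
    # Normalize the median pitch onto [0, 1] (log scale) and use it as a hue.
    hue = (np.log2(np.median(f0)) - np.log2(80)) / (np.log2(800) - np.log2(80))
    return colorsys.hsv_to_rgb(float(np.clip(hue, 0.0, 1.0)), 0.8, 1.0)

print(wish_to_color("wish.wav"))  # e.g. a higher-pitched wish -> cooler hue
```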

BSU HOSTS XR CONFERENCE

IDIA Lab was invited by the Campus Alliance for Advanced Visualization (CAAV) to host the organization’s annual conference in Fall 2023 at the College of Architecture and Planning. The CAAV is a nonprofit organization that promotes the use of extended reality technologies – such as CAVE-style immersive VR systems, video walls, headsets, and advanced visualization software – in research, learning and public engagement.

CAAV’s purpose is to act as an information resource for higher education worldwide – focusing on immersive visualization platforms, advanced video screen formats, visualization software, and issues of sustainability and operation for high-tech visualization facilities.

VBAM: VIRTUAL BROAD ART MUSEUM COMMISSION

Press Release

View the VBAM project

VBAM Introductory Lecture

John Fillwalk and the Institute for Digital Intermedia Arts (IDIA Lab) at Ball State University were commissioned to design and build a virtual museum and artworks for the Eli and Edythe Broad Art Museum in East Lansing, MI. The physical museum was designed by architect Zaha Hadid and opened to the public in Fall 2012. Museum Director Michael Rush commissioned Fillwalk and IDIA Lab to envision and create a dynamic program of artworks for the Virtual Broad Art Museum project (VBAM). Fillwalk and his collaborators created an immersive multi-user environment and four commissioned artworks using Unity 3D – experienced entirely within a web browser.

“Engaging visitors with innovators at the leading edge of art and technology, both here at MSU and around the globe, is key to the Broad Art Museum’s mission,” said founding director Michael Rush. “With the Virtual Broad Art Museum, we have an opportunity to embrace the tremendous creative and connective possibilities that exist in the digital world.”

The mixed-world reception took place on Thursday, March 15, 2012, from 6:00 to 8:00 pm EST, simultaneously on the campus of Michigan State University and in the VBAM virtual worlds.

Please visit the VBAM project links below.

http://idialabprojects.org/vbam/proxy.html

http://idialabprojects.org/vbam/flickrgettr.html

http://idialabprojects.org/vbam/survey.html

http://idialabprojects.org/vbam/confluence.html

VBAM Exhibitions by John Fillwalk

Participants in con|FLUENCE create pathways based on reactions to both social and spatial relationships. Interactions have a virtual response and persistence: routes are drawn and sculpted in the three-dimensional environment, forming nodes created by visitors’ proximity.

con|FLUENCE. John Fillwalk. 2012.

Survey for VBAM is an immersive landscape simulation using real-time weather data from the location of the Broad Art Museum in East Lansing, Michigan. Representations of surveyor’s tape, flags, light, sky color, time of day, wind speed and even cloud density are driven by the actual physical weather conditions – informing the virtual landscape.

Survey. John Fillwalk. 2012.
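
The data flow behind Survey can be sketched as follows. The sketch assumes OpenWeatherMap as the weather feed and invents the scene-parameter names; the piece’s actual data source and mappings are not documented.

```python
# Sketch of weather-driven scene parameters, assuming the OpenWeatherMap
# current-weather API; the installation's actual feed and its mappings
# onto the virtual landscape are not documented.
import requests

API_KEY = "YOUR_KEY"  # hypothetical credential
URL = "https://api.openweathermap.org/data/2.5/weather"

def sample_weather(city="East Lansing,US"):
    data = requests.get(URL, params={"q": city, "appid": API_KEY,
                                     "units": "metric"}).json()
    return {
        "wind_speed": data["wind"]["speed"],   # could drive flag and tape motion
        "cloud_cover": data["clouds"]["all"],  # percent; could drive cloud density
        "is_day": data["sys"]["sunrise"] < data["dt"] < data["sys"]["sunset"],
    }

print(sample_weather())  # poll periodically and feed the values to the scene
```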

dis|PLACE is an interactive gesture-based performance that will be exhibited at the VBAM reception on the 15th of March. This installation links virtual space to physical reality in an exploration of agency and gesture. Employing a participant’s movement, the work can both navigate 3D space and directly control interactions with the digital performers via a Microsoft Kinect™ sensor.

dis|PLACE. John Fillwalk. 2012.

Proxy is a performative, interactive and site-specific virtual installation in which participants shape the evolution of a sculptural and sonic composition in response to the Museum. Interactions are painted, sculpted and sonified within this collaborative and evolving environment. The work progressively constructs itself in relation to the Museum – eventually transforming into an integrated structure, reflecting and becoming its environment.

Proxy. John Fillwalk. 2012.

Flickr™ Gettr for VBAM connects the Flickr™ social image web service to the virtual Museum environment, allowing visitors to create a dynamic cloud of spatial imagery by entering a search term of their choice – pulling tagged images from Flickr™ into the Museum environment. The search term is spoken by the software and tags from the image cloud are revealed, allowing for subsequent user transformations of the spatial imagery.

Flickr Gettr. John Fillwalk. 2012.
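
The image-retrieval step behind Flickr™ Gettr can be sketched against the public Flickr REST API (flickr.photos.search). The parameters and URL pattern below are standard Flickr API usage, not the installation’s own (unpublished) code.

```python
# Sketch of the Flickr search step using the public REST API method
# flickr.photos.search; the installation's own code is unpublished,
# so the parameters shown here are illustrative.
import requests

API_KEY = "YOUR_FLICKR_KEY"  # hypothetical credential

def search_images(term, count=20):
    """Return (title, tags, image_url) tuples for photos matching a term."""
    resp = requests.get("https://api.flickr.com/services/rest/", params={
        "method": "flickr.photos.search",
        "api_key": API_KEY,
        "text": term,
        "extras": "tags",
        "per_page": count,
        "format": "json",
        "nojsoncallback": 1,
    }).json()
    return [(p["title"], p["tags"],
             f"https://live.staticflickr.com/{p['server']}/{p['id']}_{p['secret']}.jpg")
            for p in resp["photos"]["photo"]]

for title, tags, url in search_images("wishing well"):
    print(title, "->", url)  # each URL would texture one quad in the cloud
```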

Credits

Proxy.
John Fillwalk with
Neil Zehr, Michael James Olson
2012

Survey for VBAM
John Fillwalk with
Neil Zehr, Keith Kothman, Charles Estell
2012

Flickr Gettr for VBAM
John Fillwalk with
Jesse Allison, Neil Zehr, Charles Estell
2012

con|FLUENCE
John Fillwalk with
Neil Zehr
2012

dis|PLACE (exhibited at the reception on the 15th of March as a live performance only)
John Fillwalk with
Neil Zehr, Jonathan Strong
2012

Michael Rush, Director of the Eli and Edythe Broad Art Museum, MSU
Adam Brown, Professor Intermedia and Electronic Art, MSU
John Fillwalk, Director of the Institute for Digital Intermedia Arts [IDIA Lab], BSU
Neil Zehr, IDIA Lab, BSU
Charles Estell, IDIA Lab, BSU
Michael James Olson, IDIA Lab, BSU
Ina-Marie Henning, IDIA Lab, BSU
Jonathan Strong, IDIA Lab, BSU
Jesse Allison, LSU
Keith Kothman, BSU
Jonathan Ross, BSU
Blake Boucher, BSU
Michael Davidson, MSU
Jayne Goedekke, MSU
Kyle Gomboy, Reaction Grid
Chris Hart, Reaction Grid

Special thank you to Neil Zehr, IDIA Lab, BSU

Mellon Foundation Humanities Virtual World Consortium


BSU’s IDIA Lab was contracted by the Mellon Foundation Humanities Virtual World Consortium to design and develop a major open-source virtual world initiative for Digital Humanities and Cultural Heritage projects. The consortium – comprising King’s College London; UCLA; the University of Virginia; and Trinity College, Dublin – turned to the expertise of BSU’s IDIA Lab to create this two-year innovative hybrid simulation platform that leverages new modes of teaching and learning in immersive environments.

http://virtualworlds.etc.ucla.edu

Concurrent with the development of the prototype infrastructure, members of the Consortium developed working relationships with Ball State University’s IDIA Lab, an internationally recognized academic leader in the development of virtual worlds, human-computer interaction, visualization and 3D simulation. Most important for the Consortium, the IDIA Lab is engaged in the development of scholarly, creative and pedagogical projects that explore the intersection between the arts, science and technology. The IDIA Lab is not merely a technical development team, but is also an interdisciplinary design studio that integrates art and emergent technologies into every phase of development. After inviting John Fillwalk, the lab’s director, to attend a series of conference calls with the Consortium, a proposal for interface and in-world design was solicited. John Fillwalk is an internationally recognized artist and developer of virtual and hybrid environments. He serves as senior director of the Hybrid Design Technologies initiative [HDT], professor of Art, and director of the Institute for Digital Intermedia Arts [IDIA Lab] at Ball State University. Over the past 25 years, his interactive and virtual artworks have been exhibited internationally in numerous festivals, galleries and museums including SIGGRAPH, CYNETart, Synthèse, 404 Festival, Dutch Design Week, Boston Cyberarts, Virtual Broad Art Museum, ISEA, ASCI, Vidéoformes, Indian Institute of Technology and the Beijing Science and Technology Museum.

The Consortium elected to partner with the IDIA Lab since it offers design as well as technological expertise, and shares a common interest in the theoretical implications of Virtual World technologies for research and pedagogy.

Development will be split between the two independent teams, with the IDIA Lab, in general, centering its work on the development of the Unity-based platform – including avatar selection, navigation, the network controller, the user interface system, and back-end network hosting – while Tipodean develops the HTML and KML system and works with members of the Consortium to integrate the four individual projects into the shared platform. The development will not occur in isolation from the rest of the Consortium. The external development teams will offer scheduled monthly training sessions to the internal technical teams of the Consortium. We are employing a development model similar to that successfully used during the Planning Phase of the HVWC, in which, through a collaborative effort of local staff and third-party developers, we implemented a prototype template and virtual world environment with a subset of features enabled. In addition, we plan to hire a graphic design independent contractor and a game design independent contractor to work with the PIs and our development teams on the look and feel of the Consortium’s web presence as well as the conceptualization of the interface design.

Our Mission

I. Project Summary

The 1990s saw the development of digital technologies supporting the 3D (three-dimensional) modeling of cultural heritage objects and environments. For the first time, humanists could digitally model and reconstruct the damaged or vanished monuments of the past. The results were typically 2D renderings or videos (“animations”). The decade of the 2000s saw the enhancement of 3D environments with avatars, making it possible for scholars to enter into the 3D world and to use the Internet to interact with the simulated environment while communicating with fellow humanists located anywhere on the planet. Such software platforms are called networked virtual worlds (NVWs). The Humanities Virtual World Consortium (HVWC) will explore how the unique characteristics of networked virtual worlds can enable and advance humanistic research while working towards creating a sustainable base for mainstreaming the technology in humanities scholarship. Our initial work is based upon a series of related scholarly initiatives that draw upon virtual world technology and which are meant to: a) advance the current state of research on the phenomenology of space and place, b) design visual and aural conventions to evoke the sensorial experience lost to us due to technological and evidentiary constraints, c) test the current capabilities of virtual worlds to explore chronotopic problems previously inaccessible due to the limitations of prior technology, d) guide future development of humanities-driven virtual worlds, and e) produce works of exemplary digital scholarship, disseminated in formal, peer-reviewed publications, that solve specific research problems in particular disciplines and area studies. Our overall intent is to demonstrate how networked virtual worlds can uniquely enable important kinds of research inquiry, and thus contribute to the transformation of scholarly communication in the relevant subject fields and to the field of digital humanities. With this in mind, our projects have been chosen so that they span multiple disciplines—including Archaeology, Art History, Architectural History, Buddhist Studies, Classics, History, Irish Studies, Literary Studies, Tibetan Studies—and periods from the ancient past to contemporary times. While the projects explore discipline-specific research questions, they share common goals concerning humanities research and scholarly communication in a networked virtual world environment.

II. Expected Outcomes and Benefits of the Project

Project Deliverables: At the conclusion of the project we will release a networked virtual world platform template to be used in the Unity game development engine, the Consortium web site with documentation, four virtual world projects released as Unity3D builds, four draft in-world “articles” embedded within the virtual world project, and four articles submitted for peer review in journals devoted to specific scholarly domains.

The main outcomes of the proposed project will be (i) the consolidation of the Humanities Virtual World Consortium as a robust, fully functional academic organization that can persist and expand in the future; (ii) the development of a shared virtual world platform that is adapted for maximum utility for scholarly projects; (iii) the publication of four significant scholarly projects on that platform that exemplify the way in which virtual worlds offer unique affordances for scholarly research inquiries; and (iv) individual articles based on the virtual world content submitted to journals in our traditionally defined, domain-specific fields. At the conclusion of the grant, a) the organization will be published on the web with bylaws and a means for others to join, b) the new collaboratively maintained virtual world research and publication environment will host four scholarly publication products, and c) the shared development template and accompanying documentation will be made available online to aid others in the development of Virtual World content. The primary benefit of this project is that it constitutes a valuable and necessary step towards establishing the organizational and technical basis for a sustainable scholarly exploration of the unique ways in which networked virtual worlds can enable and advance humanistic research. While valuable and compelling individual projects have emerged over the past years, interoperability between them has not been supported. The barriers remain extremely high for individual scholars who wish to use this medium for their own research without major funding. For these reasons, to date these projects have failed to attain a wide scholarly audience. We propose to establish an organization, technology, and specific scholarly publication apparatus that would address these deficiencies in the subsequent years and beyond.

The broader, long-term, and more far-reaching benefits are that the initiative will establish a basis for the scholarly community to engage in a serious attempt at creating a sustainable, shared environment for scholarly research and communication to be done in a networked virtual world environment. These publications will show the viability and importance of such research, raise its profile in the scholarly community, and demonstrate in a compelling, virtually tangible, and accessible way the benefits of using a shared platform. Such publications will, it is envisioned, lead to an expansion of the Consortium and finally the development of a robust platform in which scholars can easily participate without the need for major grants to support further development of one-off technologies. Scholarly projects from disparate disciplines and geographical regions would be interoperable, and scholarly use would be widespread due to the consistency of interface and technology.

Mesa Verde Virtual Companion


Ball State University’s IDIA Lab has been contracted by the US Department of the Interior to develop virtual visitor experiences for Mesa Verde National Park. The park preserves and interprets the archeological heritage of the Ancestral Pueblo people who made it their home for over 700 years, from 600 to 1300 CE. Today, the park protects nearly 5,000 known archeological sites, including 600 cliff dwellings.

The application will bring to life well-known museum dioramas and offer locative walking tours of the park. Augmented reality and interactive 3D experiences will help tell the stories of the UNESCO World Heritage site – including the transformation of static elements of the historic dioramas with animated figures and interactive elements. The application will be available on both Google Play and the App Store in 2021.

Khan Academy

The Khan Academy featured the HDT / IDIA Lab and Virtual World Heritage Laboratory’s Digital Hadrian’s Villa Project. Hosted by Beth Harris, Dean of Art and History at Khan Academy, a walkthrough with archeologist and HDT research fellow Dr. Bernard Frischer showcases the project. A virtual tour of Hadrian’s Villa using a 3D digital model of the villa was created under the direction of Dr. Bernard Frischer in conjunction with the IDIA Lab. The ruins of Hadrian’s Villa, in the town of Tivoli near Rome, are spread over an area of approximately 250 acres. Many of the structures were designed by the Emperor Hadrian, who ruled from 117 until his death in 138 C.E. This virtual rendering is based on current archeological research and has been created in consultation with art historians, archaeologists, and museum curators with expertise in this area. Please note, a few features are necessarily assumptions based on the best available evidence. Dr. Bernard Frischer and Dr. Beth Harris are the speakers in this walkthrough, which was created by Beth Harris and Steven Zucker of Smarthistory.

Meet3D

Meet3D is a virtual collaboration space that brings people together to share, learn, and work – wherever they are. Private or public auditoriums, classrooms, conference halls or offices can be joined by any group or audience on demand. Meet3D was developed by the Institute for Digital Intermedia Arts at Ball State University to bring remote participants together in a common space and facilitate a sense of physical presence. The platform is available in both browser-based and executable versions.

Interactive Art at the College of Architecture and Planning

The IDIA Lab is designing a series of extended reality artworks that will be installed in the atrium of the College of Architecture and Planning. The installation allows passers-by to interact using various modes including motion, voice, and gesture to shape the compositions. The works employ sensors, sound art, artificial intelligence, and other programming to create dynamic experiences.

NSF Virtual Teaching Simulator

The College of Architecture and Planning’s IDIA Lab creates technological innovation in the arts, sciences, and humanities. The Lab engages students, staff, scholars and clients in collaborative projects including 3D, virtual reality and mobile applications. This newsletter includes updates on current IDIA Lab projects, including an augmented reality visitor experience application for Mesa Verde National Park, a publication on our Roman archeology celestial alignment research, a multiuser simulation for understanding aspects of wellness in Amazon’s Lumberyard game engine, and a $3 million NSF grant in which IDIA Lab was selected by the Massachusetts Institute of Technology, Educational Testing Service and Indiana University to create simulations for elementary math and science classrooms.

While the COVID-19 pandemic has had a major effect on PK-12 classrooms, it has also affected opportunities for preservice teachers to gain practical teaching experience. Responding to this problem, School of Education faculty Meredith Park Rogers and Adam Maltese, along with Dionne Cross Francis of the University of North Carolina at Chapel Hill, an affiliate faculty member of IU, have begun work on a project with collaborators from ETS, Towson University and Massachusetts Institute of Technology, where they will design and study an online suite of practice-based teaching activities intended to support mathematics and science preservice teachers in honing their skills for facilitating argumentation-focused discussions.

This project, funded by a grant of over $3 million from the National Science Foundation (Grant 2037983), will run from 2020 to 2023. ETS serves as the lead organization for the project, titled Online Practice Suite (OPS): Practice Spaces, Simulations and Virtual Reality Environments for Preservice Teachers to Learn to Facilitate Argumentation Discussions in Mathematics and Science.

The OPS consists of a coordinated set of online practice-based teaching activities that include game-based practice spaces, small-group avatar-based simulations, and a virtual reality classroom. The activities are designed to be complemented with targeted feedback and support from teacher educators.

With the COVID-19 pandemic increasingly pushing K-12 schools and universities across the nation to adopt online approaches, there is an immediate need for finding effective methods for preservice teachers to practice teaching even if traditional school-based teaching experiences are less available.

Even without pandemic-related restrictions, learning how to teach effectively demands that preservice teachers have robust, authentic and consistent opportunities to engage in the work of teaching—ideally across different contexts with diverse student populations and for varied purposes.

Over the next three years, the OPS research team will work with mathematics and science teacher educators to design, test, and refine the full suite of online practice-based teaching activities. Findings will be used to understand the mechanisms that support preservice teachers’ learning within and across the OPS activities and to document an emergent set of best practices for supporting preservice teachers’ improvement in this ambitious teaching practice over time. The OPS will be accompanied by support materials to help teacher educators use and integrate the activities into teacher education courses, including online and face-to-face learning contexts. Both elementary and middle/secondary school mathematics and science scenarios will be developed for the OPS.

The IU faculty will work with the Institute for Digital Intermedia Arts (IDIA Lab) at Ball State University to develop an immersive virtual reality environment that will situate teachers in learning to navigate a whole-class discussion with students. This aspect of the OPS is referred to as the Virtual Teacher Simulator (VTS). In these simulations, preservice teachers will move beyond the one-on-one and small-group interactions of the other two OPS activities and begin to put their skills together to teach a full classroom of virtual student avatars. Each preservice teacher will have the opportunity to move around the classroom and engage with the virtual student avatars just as though they were in a real classroom. Additionally, they will receive feedback from a mathematics or science teacher educator (i.e., coach) in order to continue to improve their practice.

As Park Rogers explains, “The VTS context especially affords preservice teachers multiple opportunities to rehearse their teaching within a short time span, without disrupting learning in a real classroom, all while receiving immediate and expert feedback from specialized mathematics and science teacher educators. Furthermore, incorporating real-time feedback and opportunities for multiple rehearsals, all within an immersive whole class context, will allow preservice teachers to take risks in their practice and try out novel pedagogical moves that they may not feel secure enough, or have opportunities, to try in an actual classroom.”

“We must find a way to continue to prepare high-quality math and science teachers through this pandemic and beyond,” says Park Rogers. “Through this collaboration, we hope to offer teacher educators and their preservice student teachers a solution to the current dilemma the global pandemic of COVID-19 has created, and also alternative or supplementary methods for targeting core teaching practices even when we are able to safely return to classrooms to work with real children.”

This material is based upon work supported by the National Science Foundation under Grant 2037983. The opinions expressed are those of the author and do not necessarily represent views of the National Science Foundation.

By Catherine Winkler, Indiana University.

NSF FUNDED VIRTUAL SIMULATION OF HADRIAN’S VILLA


IDIA Lab has designed a virtual simulation of the villa of the Roman Emperor Hadrian, which is a UNESCO World Heritage site located outside of Rome in Tivoli, Italy. This project is being produced in collaboration with the Virtual World Heritage Laboratory (VWHL) at the University of Virginia (UVA), directed by Dr. Bernard Frischer and funded by the National Science Foundation. This large-scale recreation virtually interprets the entire villa complex in consultation with the world’s foremost villa scholars and educators. The project has been authored in the game engine of Unity as a live 3D multi-user online learning environment that allows students and visitors to immerse themselves in all aspects of the simulated villa.

The project will not only accurately recreate the villa buildings but also include a complete Roman avatar system, non-player characters with artificial intelligence, furniture, indigenous vegetation, a dynamic atmospheric system and a sophisticated user interface. The interface will not only provide learning, navigation, reporting and assessment opportunities but will also allow users to change the position of the sun to any date in 130 CE using data from the Horizons database at NASA’s Jet Propulsion Laboratory – testing theses of astro-alignments of architectural features during solstices and equinoxes. UVA students will be briefed on the culture and history of the villa and will learn the virtual environment for five weeks prior to immersing themselves within it. The avatar system will allow them to enter the world choosing class and gender – already aware of the customs and behavior of the Roman aristocrat, soldier, slave or politician. This project will be delivered to the VWHL at UVA in early March.

The Digital Hadrian’s Villa Project: Virtual World Technology as an Aid to Finding Alignments between Built and Celestial Features

Bernard Frischer, Director, Virtual World Heritage Laboratory, University of Virginia

John Fillwalk, Director, IDIA Lab, Ball State University

Hadrian’s Villa is the best known and best preserved of the imperial villas built in the hinterland of Rome by emperors such as Nero, Domitian, and Trajan during the first and second centuries CE. A World Heritage site, Hadrian’s Villa covers at least 120 hectares and consists of ca. 30 major building complexes. Hadrian built this government retreat about 20 miles east of Rome between 117, when he became emperor, and 138 CE, the year he died. The site has been explored since the 15th century and in recent decades has been the object of intense study, excavation, and conservation (for a survey of recent work, see Mari 2010).

From 2006 to 2011, with the generous support of the National Science Foundation[1] and a private sponsor, the Virtual World Heritage Laboratory created a 3D restoration model of the entire site authored in 3DS Max. From January to April 2012, Ball State University’s Institute for Digital Intermedia Arts (IDIA Lab) converted the 3D model to Unity 3D, a virtual world (VW) platform, so that it could be explored interactively, be populated by avatars of members of the imperial court, and be published on the Internet along with a related 2D website that presents the documentation undergirding the 3D model.

The 3D restoration model and related VW were made in close collaboration with many of the scholars who have written the most recent studies on the villa.[2] Our goal was to ensure that all the main elements—from terrain, gardens, and buildings to furnishings and avatars—were evidence-based. Once finished, the model was used in two research projects.

The first project was an NSF-sponsored study of the usefulness of VW technology in archaeological education and research. We used the virtual villa in undergraduate classes at Xavier University and the University of Virginia to investigate the theses of two recent studies by project advisors Michael Ytterberg and Federica Chiappetta about how this enormous built space was used by six different groups of ancient Romans, ranging from the Emperor and Empress to ordinary citizens and slaves (Ytterberg 2005; Chiappetta 2008). Avatars representing these groups have been created and are being operated by undergraduate students as a Problem-Based Learning (PBL) experience. They are observed by subject experts, who are using the data generated to test and, if necessary, refine the initial theses about how circulation through the villa was handled. The results are still being evaluated. Preliminary indications are that the combination of VW technology and a PBL educational context is very effective in taking advantage of the known connection between the hippocampus and long-term learning, especially when the information to be mastered is spatial (Kandel 2007).

The second project involved use of the VW for some new archaeoastronomical studies. Most of our advisors’ publications, like the older work by archaeologists that preceded them, have concentrated on archaeological documentation, restoration, and formal and functional analysis. The latest research by advisor De Franceschini and her collaborator Veneziano (2011) combined formal and functional analysis: it considered the alignment of certain important parts of the villa in relation to the sun’s apparent path through the sky on significant dates such as the solstices. In their recent book they showed how two features of the villa are aligned with the solar solstices: the Temple of Apollo in the Accademia, and the Roccabruna. We used the VW to extend their research to other areas of the villa, taking advantage of 3D technology to restore the sun to the right place in the sky and also to restore the damaged architecture of the villa, as De Franceschini and Veneziano had independently suggested be done before they learned about our digital model of the villa.

The work of De Franceschini and Veneziano is innovative. Archaeoastronomy has become an accepted field of study in recent decades, and a considerable amount of work has been done in Old and New World archaeology. In Roman archaeology, however, this approach is still rarely encountered. Significantly, one of the few compelling studies concerns the most famous Hadrianic building: the Pantheon in Rome. Hannah and Magli 2009 and Hannah 2011 have shown a number of solar alignments in the building, of which the most notable are the sun’s illumination of the entrance doorway at noon on April 21, and the view, from the middle of the Pantheon’s plaza at sunset on the summer solstice, of the setting sun silhouetting the statue of Hadrian as Sun god on a four-horse chariot atop the Mausoleum of Hadrian. Like the summer solstice, April 21 is also a significant date: on it occurred the annual festival in Rome known as the Parilia (renamed the Romaia by Hadrian),[3] which celebrated the founding of Rome.

De Franceschini and Veneziano pursued an observation of Mangurian and Ray (2008) to document an impressive example of solar alignment at Hadrian’s Villa involving the tower known as Roccabruna at the western end of the villa. Originally a tower-like structure topped by a round temple, what remains today is the massive, well-preserved lower floor. The main entrance is located on the northwestern side, to the right, and gives access to a large circular hall covered by a dome. The dome is punctuated by an odd feature: five conduits that are wider on the outside than on the inside (figure 1).

What is the function of these unusual conduits? They have no known parallel in Roman architecture. After asking themselves this same question, on June 21, 1988, the day of the summer solstice, the American architects Robert Mangurian and Mary Ann Ray went to Roccabruna at sunset and discovered the extraordinary light phenomena that occur there. At sunset the Sun enters through the main door, illuminating the niche on the opposite side – something that happens during most of the summer days. But only on the days of the summer solstice does the Sun also penetrate the conduit located above that door: its rays emerge from the slot inside the dome, projecting a rectangular blade of light on the opposite side of the dome. In June 2009, De Franceschini verified the findings of Mangurian and Ray. However, the apparent path of the Sun through the sky changes slightly each year, so that in the nearly 1,880 years separating us from Hadrian, the precise effect of the alignment has been lost. As they noted, only a computer simulation can recreate the original experience of being in the lower sanctuary at Roccabruna at sunset on the summer solstice during the reign of Hadrian.

Once we had our 3D model of the site, we were able to obtain from NASA’s Horizons system[4] the correct azimuthal data for the year 130 CE and put the sun into the sky at sunset on the summer solstice. Following the lead of De Franceschini, who in the meantime had become a consultant to our project, we put into the niche one of the four statues of the Egyptian sky goddess Isis that were found at the Villa. De Franceschini chose Isis because, first, there is no question that a statue stood in this niche, so something had to be put there; and the two flanking niches held candelabra, whose surviving bases are decorated with Isiac iconography. Moreover, Isis’ festival in Rome was on the summer solstice. So we scanned and digitally restored one of the several statues of Isis from the villa and put it into the central niche. Finally, for the dome, which we know from surviving paint was blue and therefore had the famous “dome of heaven” motif (Lehmann 1945), we followed De Franceschini in restoring a zodiac set up in such a way that the sign of Gemini is over the statue niche, since the last day of Gemini is the summer solstice. Our zodiac is adapted from the great Sun God mosaic in the Rheinisches Landesmuseum in Bonn, which kindly gave us permission to use it.

As can be seen in figure 2, when we restored the sun to the right position in the sky dome for sunset on the summer solstice (June 21) of 130 CE in our 3DS Max model of Roccabruna, the sunlight coming through the main doorway illuminated the statue of Isis in the statue niche, and the light entering through the conduit lit up the sign of Gemini painted on the cupola. So we were able to confirm the Mangurian-Ray thesis.

The approach we have taken in our Roccabruna project is deductive: Mangurian and Ray noted the strange feature of the conduits punctuating the cupola of Roccabruna; they hypothesized a solar alignment. De Franceschini and Veneziano agreed and for various reasons we don’t need to go into today, they put a statue of Isis into the statue niche. We set up the conditions in which these hypotheses could be tested and were able to verify them.

But surely, if there is one such alignment at the villa of the same emperor who was responsible for the Pantheon, there may be others. But the villa is very big—covering over 100 hectares—and has 30 major building complexes, most larger than Roccabruna. Moreover, such alignments could just as easily involve astrological features such as the Moon and the planets. Faced with this level of complexity, the best methodological way forward in searching for new alignments is clearly inductive and empirical. This is one reason why we asked the Institute for Digital Intermedia Arts (IDIA Lab) of Ball State University to create a multi-user virtual world based in Unity 3D from our 3DS Max model.

The project of virtually interpreting a simulation on the scope and scale of Hadrian’s Villa was a daunting one – engaging layers of scholarly, technical and pedagogical challenges. The technical challenges were many – foremost, to leverage the Unity 3D game engine as an effective multi-user, avatar-based virtual world. An important factor was to create an environment that was straightforward and accessible via standard web browsers on both Mac and Windows, and we selected Unity 3D as the starting point for developing the platform. We required specific back-end administration tools to handle the accounts and server-side aspects of the project – for this we relied on SmartFoxServer, as it manages Unity 3D quite well. Our team took an approach that bridged and integrated disparate technologies, creating a robust virtual world platform to immersively augment both instructional and PBL processes. VW features available to the learning community included text-based communication, a live map showing current visitor positions, map-based teleportation, a managed voice channel, user-selected avatar gestures, a list of online users, paradata, photographs of the extant site, plan views, and integrated web links.

Key to the project was a varied system of avatars representing the imperial court, freedmen, senators, scholars, soldiers, and slaves of the emperor. The avatar system served several important functions, testing recent scholarly interpretations of circulation throughout the villa and the use of various spaces for typical court activities – meals, imperial audiences, bathing, worship, etc. Upon entering the simulation, the choice of avatar determined one’s social standing within the role-play of the world.

A gesture system was created via motion capture, providing each user with a unique set of actions and gestural responses to engage social interactions – including greetings, bowing and gestures specific to rank and class. Communication was also a critical element in the modes of problem-based learning engaged by the participants in the simulation. Specific technologies provided varied abilities such as public chat, private instant messaging and live multi-user voice channels.

A companion website was co-developed and integrated into the VW environment, providing learners with visual assets such as photographs and panoramas of the current site, site plans, elevations, and video interviews with Villa scholars. We also developed three-dimensional turntables of the interpreted and reconstructed models, overview information on each of the major Villa features, a bibliography and an expansive database of art attributed to the Villa site. This information can be accessed by learners directly from within the virtual world. The development team integrated the notion of paradata, introduced by the London Charter – making instantly transparent the scholarship and all underlying elements of the 3D model (from terrain to buildings, furnishings, costumes, and human behavior).

In support of a new research theme on celestial alignments by consultants De Franceschini and Veneziano, a major goal for the project was to develop an accurate simulation of the position of the sun. The solar tracking system, or virtual heliodon, that we created in response to this research was envisioned as a bridge between the virtual environment and coordinates from an external database calculating solar positions. After investigating existing tools, we decided to employ the Horizons database, created by NASA’s Jet Propulsion Laboratory as an online solar system data computation service – tracking celestial bodies in ephemerides from 9999 BCE to 9999 CE. In implementing solar tracking for the Villa project, in instances where we wanted to investigate potentially significant solar alignments, we entered the latitude, longitude and altitude of specific buildings from the Tivoli site to poll the Horizons data for the year 130 CE. The user was able to change the date and time of day, and quickly play the sun from specific moments via the user interface. The system was correlated to both the Julian and Gregorian calendars and contained presets for the vernal and autumnal equinoxes as well as the summer and winter solstices.
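
The Horizons polling step can be sketched with astroquery’s jplhorizons module (the Lab’s own client is not specified); the Tivoli coordinates below are approximate, and Horizons date strings for years before 1000 CE may need era-qualified forms.

```python
# Sketch of polling JPL Horizons for the Sun's apparent position over
# Hadrian's Villa, via astroquery (the project's own client is not
# specified); site coordinates are approximate.
from astroquery.jplhorizons import Horizons

TIVOLI = {"lon": 12.774, "lat": 41.941, "elevation": 0.09}  # deg, deg, km

def sun_path(start, stop, step="10m"):
    """Topocentric azimuth/elevation of the Sun over a date range."""
    sun = Horizons(id="10", id_type=None,  # None = major-body lookup
                   location=TIVOLI,
                   epochs={"start": start, "stop": stop, "step": step})
    eph = sun.ephemerides()
    return eph["datetime_str", "AZ", "EL"]

# Sunset hours on the summer solstice of 130 CE; Horizons may require
# era-qualified dates (e.g. 'A.D. 130-Jun-21') for early years.
print(sun_path("0130-06-21 16:00", "0130-06-21 20:00"))
```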

These tools allowed for the rapid discovery of potential alignments that might bear further investigation. The solar feature allows one to proceed empirically, in effect turning the clock back to 130 CE and running experiments in which the days and hours of the year are sped up by orders of magnitude, so that one can in a very short time find candidate alignments not yet hypothesized by scholars working in the traditional way of Mangurian and Ray.

As developers, our goal was to create the solar tool and let students and scholars use it to undertake their own empirical research. Our team was not intending to engage in this research ourselves, yet in the process of working within the environment daily we quickly began to notice curious solar phenomena. In a bit of empirical study of the very first component of the site we installed in the simulation, the Antinoeion – the newly discovered Temple of the Divine Antinous – we noticed an alignment of potential interest. The most likely alignment seemed at first glance to be along the main axis running from the entrance, through the obelisk in the central plaza, to the statue niche at the end of the axis. We ran the days and hours of the year and found that the sun and the shadow of the obelisk align at sunrise on July 20. We consulted our expert on the Egyptian calendar in the Roman period, Professor Christian Leitz of the University of Tübingen, and he confirmed that this date has religious significance. It is, in fact, the date of the Egyptian New Year, as the Romans of Hadrian’s age clearly knew (cf. the Roman writer Censorinus, who states that the Egyptian New Year’s Day fell on July 20 in the Julian calendar in 139 CE, coinciding with a heliacal rising of Sirius in Egypt).
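
The kind of day-by-day scan described above can also be reproduced outside the virtual world, for instance with astropy: step through sunrises across a year and flag the days on which the Sun rises along a monument’s axis. The axis bearing below is a placeholder, not the Antinoeion’s surveyed value.

```python
# Sketch of an empirical alignment scan with astropy: find the days on
# which the rising Sun's azimuth matches a monument axis. The bearing
# used here is a placeholder, not the Antinoeion's surveyed value.
import numpy as np
import astropy.units as u
from astropy.coordinates import AltAz, EarthLocation, get_sun
from astropy.time import Time

SITE = EarthLocation(lat=41.941 * u.deg, lon=12.774 * u.deg, height=90 * u.m)
AXIS_AZ = 61.0  # degrees east of north; hypothetical axis bearing

def sunrise_azimuth(jd_midnight):
    """Azimuth of the Sun at the first 5-minute step above the horizon."""
    times = Time(jd_midnight, format="jd") + np.linspace(0, 1, 288) * u.day
    altaz = get_sun(times).transform_to(AltAz(obstime=times, location=SITE))
    up = np.where(altaz.alt.deg > 0)[0]
    return altaz.az.deg[up[0]] if up.size else None

jd0 = Time("0130-01-01 00:00", scale="utc").jd  # year 130 CE
for day in range(365):
    az = sunrise_azimuth(jd0 + day)
    if az is not None and abs(az - AXIS_AZ) < 0.5:
        print(f"candidate alignment on day {day}: sunrise azimuth {az:.2f} deg")
```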

In the process of developing and subsequently utilizing the simulation tools we created for astro-archaeological research, we have concluded that virtual world technologies can indeed take the inquiry into significant built-celestial alignments to a new level of insight.

Bibliography

Chiappetta, F. 2008. I percorsi antichi di Villa Adriana (Rome).

De Franceschini, M. and G. Veneziano. 2011. Villa Adriana. Architettura celeste. I segreti dei solstizi (Rome).

Hannah, R. 2008. Time in Antiquity (London).

Hannah, R. 2011. “The Role of the Sun in the Pantheon’s Design and Meaning,” Numen 58: 486-513.

Kandel, E. 2007. In Search of Memory: The Emergence of a New Science of Mind (W. W. Norton, New York). Kindle edition.

Lehmann, K. 1945. “The Dome of Heaven,” Art Bulletin 27: 1-27.

Lugli, G. 1940. “La Roccabruna di Villa Adriana,” Palladio 4: 257-274.

Mangurian, R. and M.A. Ray. 2008. “Re-drawing Hadrian’s Villa,” Yale Architectural Journal, 113-116.

Mari, Z. 2010. “Villa Adriana. Recenti scoperte e stato della ricerca,” Ephemeris Napocensis 20: 7-37.

Ytterberg, M. 2005. “The Perambulations of Hadrian. A Walk through Hadrian’s Villa,” Ph.D. dissertation, University of Pennsylvania.


[1] NSF grant # IIS-1018512.

[2] For a list of scholarly consultants, see http://idialabprojects.org/hvtest/mission.php.

[3] Athenaeus 8.361e-f.

[4] http://ssd.jpl.nasa.gov/?horizons.

Final Wisdom I: Interactive Haptic Poetry Installation at SIGGRAPH 2010

Final Wisdom I is an interactive installation engaging viewers in a sensory exploration of temporal and spatialized poetry. Participants manipulate imagery, sound and language through gesture, touch and proximity. The work is engaged through a framework of cameras and sensors that react to heat, position, and capacitance – presenting a shifting environment of reactive media and haptics. Final Wisdom I is the work of artists Hans Breder and John Fillwalk, with poetry by critic Donald Kuspit and music by composers Carlos Cuellar Brown and Jesse Allison. The project was produced through the Institute for Digital Intermedia Arts at Ball State University in collaboration with the Institute for Digital Fabrication. Final Wisdom I was exhibited in the art gallery of SIGGRAPH 2010 in Los Angeles, CA. Special thanks to IDF/CAP students Matthew Wolak, Christopher Baile and Claire Matucheski, and Assistant Professor of Architecture Joshua Vermillion. http://www.i-m-a-d-e.org/
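
As an illustration of the sensor side of such a work: if the heat and capacitance sensors report readings over a serial link (a common arrangement, though the installation’s actual hardware protocol is not documented), the mapping onto media parameters could look like this hypothetical pyserial sketch.

```python
# Hypothetical sensor-to-media mapping over a serial link (pyserial);
# the installation's actual hardware and protocol are not documented,
# so the 'capacitance,heat' line format here is invented.
import serial

def sensor_loop(port="/dev/ttyUSB0", baud=9600):
    """Read 'capacitance,heat' lines and map them to media parameters."""
    with serial.Serial(port, baud, timeout=1) as link:
        while True:
            line = link.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                cap, heat = (float(v) for v in line.split(","))
            except ValueError:
                continue  # skip malformed frames
            # Illustrative mappings: proximity fades text in, warmth slows it.
            text_opacity = min(cap / 1023.0, 1.0)
            playback_rate = max(0.25, 1.0 - heat / 100.0)
            print(f"opacity={text_opacity:.2f} rate={playback_rate:.2f}")

sensor_loop()
```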

As an intermedia artist, John Fillwalk actively investigates emerging technologies that inform his work in a variety of media, including video installation, virtual art, and interactive forms. His perspective is rooted in the traditions of painting, cinematography, and sculpture, with a particular interest in spatialized works that can immerse and engage a viewer within an experience. Fillwalk positions his work to act as both a threshold and mediator between tangible and implied space, creating a conduit for the transformative extension of experience, and to pursue the realization of forms, sounds and images that afford interaction at its most fundamental level. In working with technology, he values the synergy of collaboration and regularly works with other artists and scientists on projects that could not be realized otherwise. Electronic media extend the range of traditional processes by establishing a palette of time, motion, interactivity, and extensions of presence. The ephemeral qualities of electronic and intermedia works, by their very nature, are inherently transformative, and the significance of the tangible becomes fleeting, shifting emphasis away from the object and toward the experience.

John Fillwalk is Director of the Institute for Digital Intermedia Arts (IDIA Lab) at Ball State University, an interdisciplinary and collaborative hybrid studio. An intermedia artist and Associate Professor of Electronic Art, Fillwalk investigates media in video installation, hybrid reality and interactive forms. He received his MFA from the University of Iowa in Intermedia and Video Art, and has since received numerous grants, awards, commissions and fellowships.

Donald Kuspit is an art critic, author and professor of art history and philosophy at the State University of New York at Stony Brook and lends his editorial expertise to several journals, including Art Criticism, Artforum, New Art Examiner, Sculpture and Centennial Review. Hans Breder was born in Herford, Germany, and trained as a painter in Hamburg, Germany. Attracted to the University of Iowa’s School of Art and Art History in 1966, Breder established the Intermedia Program there. Carlos Cuellar Brown, a.k.a. ccbrown, is a composer, instrumentalist and music producer. Formally trained as a classical pianist, Cuellar specialized in experimental music and intermedia with the late American maverick composer Kenneth Gaburo. Jesse Allison is the Virtual Worlds Research Specialist at IDIA and Assistant Professor of Music Technology at Ball State University. He is also President of Hardware Engineering with Electrotap, LLC, an innovative human-computer interface firm.

Leonardo (The International Society for the Arts, Sciences and Technology) article published by The MIT Press. http://muse.jhu.edu/journals/leonardo/summary/v043/43.4.fillwalk.html

Engaging Technology II: Art + Science

The interplay of art and science—a concept at least as old as Leonardo da Vinci—is the focus of “Engaging Technology II,” an exhibition at Ball State University’s David Owsley Museum of Art that ran from September 28 through December 22, 2017. Museum patrons experienced a selection of work from internationally renowned artists whose explorations on the topic include installations, code art (artwork generated by computer programming), art/science, augmented reality and human-computer interaction.

The exhibition was curated by Ball State’s John Fillwalk, director of the Institute for Digital Intermedia Arts (IDIA Lab) in the College of Architecture and Planning, and is both a sequel and a new concept, according to Robert La France, the museum’s director. The exhibit builds on the success of a prior exhibition curated by Fillwalk that examined the emergence of the genre of Intermedia Art, with a particular focus on the intersection of technology and artmaking. The new show shifts the curatorial emphasis to STEAM – science, technology, engineering, arts and mathematics.

This exhibition continues to position Ball State as an international center of STEAM innovation. A goal of Engaging Technology II was to invite artists exploring the edges of art and science into discourse within our community—advancing our University’s conversation regarding interdisciplinary collaboration and innovation. This collection of artists examines various aspects of the physical and life sciences, from code and biology to chemistry and physics.

Among those individuals whose work is featured in the exhibition is Adam Brown, an associate professor at Michigan State University. His live biochemical installation, The Great Work of the Metal Lover, sits at the intersection of art, science and alchemy. “It uses microbiology as a technique to solve the mystery of the philosopher’s stone,” said Brown, who described the centuries-old “stone” as a legendary substance sought by alchemists who believed it capable of turning metals like mercury into gold and silver.

His installation uses custom lab equipment to introduce a “highly specialized” bacterium into an engineered atmosphere, turning toxic gold chloride into usable 24K gold over the course of the show. “By the end of the exhibition, the process will produce enough gold to put in the palm of your hand,” Brown said.

Other innovative contributors to “Engaging Technology II” include code artist Casey Reas; composer and visual artist Tristan Perich; and Evelina Domnitch and Dmitry Gelfand, European-Russian artists who create sensory immersion environments that merge physics, chemistry and computer science with uncanny philosophical practices.

“10000 Peacock Feathers in Foaming Acid” is Evelina Domnitch and Dmitry Gelfand’s live audiovisual performance, which was presented on the fulldome projection screen of Ball State’s Charles W. Brown Planetarium. The artwork uses a penetrating laser beam to scan the surface of soap bubbles, whose behavior models the unique properties of cell membranes.

Throughout the run of Engaging Technology II, a series of invited performances, lectures and workshops were scheduled on campus and in the Muncie community to enhance the exhibition’s reach. Classes, workshops, lectures and family events were offered for local and regional audiences.

By Gail Werner, BSU Media Strategist

Apollo 11 Simulation for International MoonBase Alliance

A 3D simulation of the Apollo 11 landing was produced by IDIA Lab for the International MoonBase Alliance and ImmersiveWorlds, LLC.

About IMA

The International MoonBase Alliance (IMA) is an association of leading scientists, educators, and entrepreneurs from space agencies and industries worldwide, formed to advance the development and implementation of an international base on the Moon.

Our priority is to create an action plan that will culminate with the building of MoonBase prototypes on Earth, followed by a sustainable settlement on the Moon. Building on discussions and recommendations from the Lunar Exploration and Analysis Group (LEAG), the European Lunar Symposium, the International Space Development Conference, the NewSpace Symposium, the International Astronautical Congress (IAC), and other worldwide space forums, we intend to formulate an integrated strategy for establishing a proposed multinational lunar base.

Toward this goal, the IMA hosted a summit in October 2017 on Hawaiʻi’s Big Island, inviting leading aerospace, engineering and space industry professionals to a series of conferences focused on identifying the essential resources and technological capabilities required to enable and sustain lunar-based operations. Through the promotion of international collaborations and public-private partnerships, we can reduce costs, enhance benefits, and accelerate timetables for lunar settlements.

moonbasealliance.com/

CAVE

CAVE is an immersive 270° panorama screen primarily used for presentations to large groups of students. OmniFocus projection maximizes the shadow-free zone in our installation of the Elumenati‘s 8m GeoDome Panorama, so up to 40 people can share the space. Without headsets, the shared space sets the stage for spontaneous interactions with the content and one another. It’s also logistically smoother than managing 40 headsets at once, especially post-pandemic.

Another use case for the CAVE is as part of our virtual production pipeline, using the Elumenati’s WorldViewer and Omnity plugin for the Unity game engine. This allows us to collaborate inside the 3D environment and demo content for clients.

A vendor and partner since 2018, the Elumenati is a design and engineering firm creating innovative solutions for immersive visualization. With nearly twenty years as leaders in the field, their mission is to develop elegant, effective systems and novel applications of immersion and interactivity, building VR domes and social, collaborative spaces for XR experiences.

Ball State University’s IDIA Lab is granted rights to Blue Mars virtual world technology from Avatar Reality

Ball State granted rights to develop $10 million Blue Mars virtual world technology (3/5/2012)

http://bit.ly/BSU_BlueMars

A digital laser scan of a centuries-old Buddha (above) and a simulation of the 1915 World’s Fair in San Francisco (below) are two examples of how Ball State has employed Blue Mars for cutting-edge technology projects.

The future of virtual worlds has a new home at Ball State University, thanks to the generosity of Avatar Reality Inc., which granted the university rights to the multimillion-dollar 3-D virtual world platform, Blue Mars.

Blue Mars’ creator, Hawaii-based Avatar Reality, has provided expanded rights to the technology, valued at $10 million in research and development, to Ball State for 3-D simulation and research projects beyond the world of gaming.

“We are honored to have our institution selected to build upon the tremendous innovations within Avatar Reality’s Blue Mars platform,” said John Fillwalk, director of the Institute for Digital Intermedia Arts (IDIA) and senior director of Ball State’s Hybrid Design Technologies initiative. “This partnership between Avatar Reality and Ball State is an unprecedented gesture and a great distinction.”

Since 2009, Ball State’s IDIA has been a leading developer for Blue Mars, employing the virtual platform on projects such as digital laser scans of a centuries-old Buddha and the creation of the Virtual Middletown Project. The project, which ushers visitors via their computers into the world of the early 20th-century Ball Brothers Glass Manufacturing Co., is one example of the cultural heritage work possible with Blue Mars. Another is IDIA’s simulation of the 1915 World’s Fair in San Francisco.

“Avatar Reality designed in Blue Mars a next-generation 3-D platform that greatly advanced the graphic fidelity, functionality and scalability of virtual worlds — well beyond current standards,” Fillwalk said. “Its strategy connected social spaces across devices, from high-performance desktops to mobile phones. Our goal is to build upon both its technology and vision for the future of virtual worlds.”

Recognition speaks to reputation

Phil Repp, Ball State’s vice president for information technology, said the new agreement with Avatar Reality can be traced to Fillwalk’s commitment to advancing IDIA to the national level.

“The fact that other universities — and many of them of very high caliber — are working hard to gain expertise in hybrid worlds and that Ball State was identified as the university most likely to further develop this technology speaks volumes about our reputation, experience and abilities,” Repp said. “It’s an honor for Ball State and John to be singled out with this kind of recognition.”

Under Ball State’s terms of agreement with Avatar Reality, the university will begin to fully operate Blue Mars for noncommercial purposes, expand upon the source code, increase its research and academic initiatives, and enhance the community of Blue Mars. In addition, Ball State will deliver original content on Blue Mars as it has done in the past. Existing commercial activity will continue to be operated and supported by Avatar Reality, Inc.

“I am really excited about the future,” Repp said. “Through our division of Hybrid Design Technologies, Ball State will further our position as a national leader in hybrid worlds and their applications to mediated learning technologies. Our reputation in this discipline is such a perfect fit to our institutional history of innovation in teaching and learning.”

About Ball State: Located in Muncie, Ind., Ball State University is redefining education by providing relevant, immersive learning experiences that engage high-caliber students in intense interdisciplinary projects both in and out of the classroom. Its vibrant campus is home to about 22,000 undergraduate and graduate students from across the country and abroad. Learn more at www.bsu.edu.

About Blue Mars: Developed by Avatar Reality Inc., Blue Mars is a premium 3-D virtual world platform featuring unparalleled interactivity, fidelity, scalability and security, and it enables artists and developers to create and distribute interactive 3-D experiences to a global audience. Blue Mars launched in open beta in October 2009 and began selling virtual land to third party developers in January 2010. Blue Mars Mobile, an iOS app based on the content and technology from the virtual world, was introduced in February 2011.

NATIONAL AWARD: VIRTUAL ESCAPE ROOM


The BSU College of Health and CAP’s IDIA Lab were honored with a national award for their collaborative virtual escape room project. The serious game, “The Case of the Kidnapped Healthcare Professionals: Virtual Escape Room,” was selected to receive the 2023 Creativity in Practice and Education Award from the National Academies of Practice (NAP). BSU’s Lori Porter accepted the award on behalf of the project team – Karrie Osborne, PhD, RN; Blair Mattern; John Fillwalk; IDIA staff; and Katherine Berryhill – for their work in developing the project.

Escape rooms employ innovative learning strategies such as problem-based immersive learning, role-play, and game principles to improve teamwork, communication and leadership. Nationally, health profession curricula are beginning to explore the use of escape room experiences as innovative pedagogy. Escape rooms challenge individuals to collaboratively explore new content, work as a team to discover clues, solve puzzles, and accomplish interactive tasks in a limited amount of time.

Flickr Gettr: IDIA performs at the International Annual Science and Technology Festival in India

John Fillwalk and Jesse Allison traveled to Mumbai, India, for the International Annual Science and Technology Festival, January 22-24, to exhibit the Institute for Digital Intermedia Arts’ (IDIA) “Flickr Gettr” installation. The festival, the largest of its kind in Asia, hosted more than 65,000 attendees. In addition to exhibitions such as “Flickr Gettr,” the festival featured keynote speakers including Lars Rasmussen, Co-Founder of Google Wave and Google Maps; Vic Hayes, father of Wi-Fi; Ajay Bhatt, Co-Inventor of USB; Jonathan Gardner, Senior NASA Scientist; and R. Chidambaram, Principal Scientific Adviser to the Government of India.

IDIA’s Flickr Gettr is an interactive sonic and visual experience, immersing the viewer in a three-dimensional cloud of user-searched Flickr images. The installation, completed in collaboration with the New Media Consortium, Linden Lab and the Institute for Digital Fabrication, was also highlighted in the festival’s promotional video.

For more information, please visit http://www.techfest.org/scintillations/flickrgettr/

Smithsonian and National Park Service Presentations

Vice President for Information Technology Phil Repp and IDIA Lab Director John Fillwalk traveled to Washington, D.C., to present Ball State projects in electronic field trips, virtual worlds and hybrid design to several federal organizations. Meetings included a presentation at the Smithsonian Institution Building on the National Mall to representatives from all the Smithsonian museums and institutes, as well as a separate presentation to the National Park Service. Conversations are ongoing regarding partnership and collaboration opportunities.
 

VIRTUAL COURTROOM SIMULATOR

IDIA is collaborating with BSU’s Brandy and Greg Rocheleau from the Department of Criminal Justice and Criminology on the production of a Virtual Courtroom. The project, funded by BSU’s Women of Beneficence, allows students in designated criminal justice classes to participate in courtroom simulations through the use of avatars – building knowledge and empathy while practicing decision-making skills in a courtroom setting.

One of the main purposes of this project is to give criminal justice students an opportunity to practice ethical decision-making skills in the field of criminal justice. It also helps students develop empathy for vulnerable populations so that they become socially responsible professionals. IDIA is designing a multiuser virtual simulation where students can engage via avatars and communicate by voice, chat, video or whiteboard. The facilities will include a courtroom, deposition room, classroom and even an accurate digital twin of the Supreme Court!

RUTGERS HEALTHCARE SIMULATION

IDIA is designing an interactive virtual training simulator for Rutgers University’s School of Health Professions. The project immerses students within TeamSTEPPS (Team Strategies and Tools to Enhance Performance and Patient Safety). Developed by the Agency for Healthcare Research and Quality (AHRQ), TeamSTEPPS is an interprofessional communication framework providing interdisciplinary collaboration and teamwork tools aimed at optimizing patient outcomes by improving communication among healthcare professionals. It is AHRQ’s signature curriculum for improving patient safety by training healthcare teams to communicate and practice effective teamwork. TeamSTEPPS trains staff in the teamwork skills needed to lead medical teamwork improvement initiatives in an organization, from initial concept development through sustainment of positive changes.

Students will learn best-practice interprofessional communication strategies and apply them in lab-based scenarios in a single-player simulator created in the Unity 3D game engine. The experience is designed with branched dialogue trees that provide decision-making opportunities to assess learned concepts as they are applied to various healthcare scenarios. The project was recently presented at the National Academies of Practice (NAP) annual conference.
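
As a rough illustration of the branching approach – not the project’s actual code – a dialogue node can pair each choice with a follow-up node and a score, letting the simulator assess decisions as they are made. The scenario text below is an invented placeholder:

```csharp
using System;
using System.Collections.Generic;

// Rough sketch of a branched dialogue tree; prompts, choices and scoring
// are invented placeholders, not content from the Rutgers project.
class DialogueNode
{
    public string Prompt;
    public List<(string Choice, DialogueNode Next, int Score)> Options =
        new List<(string, DialogueNode, int)>();
}

class DialogueDemo
{
    static void Main()
    {
        var debrief = new DialogueNode { Prompt = "Scenario complete." };
        var root = new DialogueNode
        {
            Prompt = "The patient's vitals are dropping. What do you do first?"
        };
        root.Options.Add(("Brief the attending using a structured handoff", debrief, 1));
        root.Options.Add(("Act alone without communicating", debrief, 0));

        var node = root;
        int score = 0;
        while (node.Options.Count > 0)
        {
            Console.WriteLine(node.Prompt);
            for (int i = 0; i < node.Options.Count; i++)
                Console.WriteLine($"  {i + 1}. {node.Options[i].Choice}");
            int pick = int.Parse(Console.ReadLine() ?? "1") - 1; // assumes valid input
            score += node.Options[pick].Score;
            node = node.Options[pick].Next;
        }
        Console.WriteLine($"{node.Prompt} Communication score: {score}");
    }
}
```

In a production simulator each choice would branch to a distinct node rather than a shared debrief, but the data structure is the same.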

GIS PAPER: DEEP MAPPING

BSU’s James Connolly, director of the Center for Middletown Studies, and John Fillwalk, director of IDIA Lab, presented their joint paper at the 30th ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems in Seattle, WA. The paper explored innovations in visualizing data in immersive 3D environments associated with the newly launched Deep Mapping Middletown project. Led by Connolly and Fillwalk, the project consists of a team of scholars and library scientists from several institutions producing deep maps based on the extensive archive created by a century of social research focused on Muncie, IN.

https://dl.acm.org/doi/abs/10.1145/3557919.3565815

Virtual Escape Room

IDIA Lab is working on a Virtual Escape Room project for the Ball State University College of Health; the project will launch in the fall semester. In this collaboration we are developing a virtual multi-user escape room. Escape rooms employ innovative learning strategies such as problem-based immersive learning, role-play, and game principles to improve teamwork, communication and leadership. Nationally, health profession curricula are beginning to explore the use of escape room experiences as innovative pedagogy. Escape rooms challenge individuals to collaboratively explore new content, work as a team to discover clues, solve puzzles, and accomplish interactive tasks in a limited amount of time. The multi-user virtual escape room will support teaching teamwork and communication curricula for health profession students and contribute to improved health outcomes.

Engaging Technology II: Art + Science – 10000 Peacock Feathers in Foaming Acid

Engaging Technology II: Art + Science premiered at the David Owsley Museum of Art on September 28th, 2017. The exhibition was curated by John Fillwalk, Director of the Institute for Digital Intermedia Arts at Ball State University. The opening reception was followed by a special performance by two of the featured artists – Evelina Domnitch and Dmitry Gelfand – at the Charles W. Brown Planetarium.

About the performance, from the artists: “In 10000 Peacock Feathers in Foaming Acid we use laser light to scan the surfaces of nucleating and dissipating soap bubble clusters. Unlike ordinary light, the laser’s focused beam is capable of crawling through the micro and nano structures within a bubble’s skin. When aimed at specific angles, this penetrating light generates a large-scale projection of molecular interactions as well as the phenomena of non-linear optics. Bubble behaviors viewed in such proximity evoke the dynamics of living cells (the lipid membranes of which are direct chemical descendants of soap films).”

http://www.portablepalace.com

Derek Bohm

Derek Bohm is a Graduate Assistant pursuing a Master of Architecture degree at Ball State University. He leads workshops on mixed reality software with a focus on architectural applications for students in CAP’s SimLab. He has an MA in Philosophy from the University of Durham and a BA in Theology from the University of Manchester. He is interested in experimental architecture as applied theory and its potential for positive social impact. Derek sees architecture as inhabitable ethics and embodied theology. Today’s XR technology provides a low-cost, low-risk laboratory for architectural experiments and explorations.

Natalie Yates

Natalie Yates is an Assistant Professor of Landscape Architecture at Ball State University. Her scholarly work is situated at the intersection of representation and analysis of landscape systems. Her research focuses on dynamic landscape illustration and simulation, sensing methods in design process (including DIY sensing technologies, UAVs [drones], and GIS), post-industrial landscape remediation, and urban agriculture. Natalie teaches design studios, design communication, time-based media/technology methodologies, and UAS technologies.


Natalie is Ball State University’s 2020-2021 University Design Research Fellow for Exhibit Columbus. Her design research proposal and installation entitled “Calibrate” will open during Exhibit Columbus in Fall 2021.
Additionally, Natalie is president of the Board of Directors of Farmished, a local nonprofit promoting a thriving, sustainable local food system. Farmished is currently transforming a former machine foundry site in south Muncie into an Urban Agriculture Training Farm.

Natalie has published research at the Association for Computer Aided Design in Architecture (ACADIA) national conference and co-authored the book Modeling the Environment: Techniques and Tools for the 3D Illustration of Dynamic Landscapes (John Wiley & Sons, Inc., 2012).

Kristin Barry

Architecture and Planning

Kristin Barry is an assistant professor of Architecture at Ball State, where she teaches courses in architectural history and theory. She was previously an instructor at Penn State University, where she taught graduate courses in architectural history/theory and survey courses in ancient art history and architecture. After receiving her Bachelor of Science in Architecture from the University of Cincinnati, Kristin began working as an archaeological architect and has worked in Greece, France, Israel, Egypt, and Turkey to document and interpret historical sites for a modern audience. Following her Master of Architecture from the University of Cincinnati, she was on the 2008 master-planning team at the Archaeological Site of Ancient Troy in Turkey, working to redesign the tourism site to accommodate modern needs. Her research and publications explore how architecture and design describe and interpret ancient remains, as well as change or affect historical understanding at some of the most popular archaeological sites in the world. Kristin is currently the site architect at the Hierakonpolis excavation in Egypt and volunteers with the PUP Global Heritage Consortium.

Virtual Companion: Indianapolis

Virtual Monument Circle
1910s-1930s

Virtual Monument Circle is a mobile application that uses historic photographs and maps of downtown Indianapolis, IN to create an immersive interpretation of various historic phases of the city center. This project is a prototype for a larger potential city-wide endeavor bringing to life significant neighborhoods and sites within the city. It was developed as a prospective collaboration between the Institute for Digital Intermedia Arts at Ball State University, the Polis Center at Indiana University-Purdue University Indianapolis and the Indiana Historical Society.

The app offers two experiences – onsite and offsite – with content dependent on the user’s location. It will be available soon for both iOS and Android.

Virtual Monument Circle was designed and produced by the Institute for Digital Intermedia Arts at Ball State University. Special thanks to the Polis Center at IUPUI and the Indiana Historical Society.

For more information contact:  idialab@bsu.edu

Virtual Companion: Columbus

IDIA Lab – in collaboration with BSU Assistant Professor of Architecture Kristin Barry – has designed the cultural heritage mobile application Virtual Columbus Experience, allowing users to virtually tour the rich architectural history of Columbus, Indiana. This locative mobile app was produced with fellowship support from Ball State University’s Digital Scholarship Lab.

Though a relatively small city, Columbus has made itself a cradle of modern architecture, commissioning many buildings and public works since the middle of the 20th century. The number of landmark buildings and notable architects to have worked in the city has earned it the nickname “Athens of the Prairie.”

With data and artifacts gathered by Kristin Barry’s immersive class, users of the app can explore 3D models of key buildings commissioned over the years, survey timelines showing when architects were active or buildings were constructed, and meet Columbus’ famous architects – such as Eero Saarinen, I.M. Pei, and Harry Weese.

After its launch, Virtual Columbus Experience will seek further funding to expand the scope and depth of the project across the community.

Buffalo Bill Center of the West AR Diorama

AR Diorama

Buffalo Bill Center of the West has contracted with IDIA Lab to create an augmented reality (AR) diorama of Buffalo Bill’s Wild West show. Museum guests will be able to view the physical white model. While IDIA Lab’s previous Buffalo Bill project focused on his Muncie show, this iteration explores the Ambrose Park show in Brooklyn, NY.

Adam Kobitz

“Constructing the physical portion of the Buffalo Bill diorama through 3D printing and laser cutting processes presented us with several challenges, the first of which was recreating our existing models within CAD software (Autodesk Fusion 360) to ensure that they were designed with these techniques in mind. This modelling process is quite different from those employed on the AR side of the piece, and entails ensuring that the sizes, thicknesses, etc. of each element correlates with the parameters of our machines. Additionally, given the overall scale of the final product, many elements throughout the diorama had to be broken down into smaller components and reassembled by hand. After the models had been 3D printed or laser cut, each of the elements needed to be assembled with adhesives, their seams hidden with a filler substance, filed and sanded, and painted to achieve the final aesthetic.”

Trevor Danehy

“I identified several 19th-century photos of William Cody, Iron Tail, and Nate Salsbury within a timeframe of a few years. Using these photos I constructed digital 3D models of them within programs like ZBrush and Maya. I also did extensive research on the 1894 Buffalo Bill Wild West show in Ambrose Park, Brooklyn – a long-forgotten location that has been all but wiped from history. There are very few details of Ambrose Park aside from a few photographs from the late nineteenth century. The location was demolished in the early 20th century and the Brooklyn Navy Terminal was paved over it. I was able to confirm structures and locations of the Wild West show by cross-referencing photographs that overlap the same locations. An interesting note is that we have an electrical layout map of the show in Ambrose Park. When comparing photographs to the electrical map, while it does confirm the exact locations of electrical poles throughout the campus, the positions of major tents have changed, possibly due to weather or other circumstances. Many informative photographs came from Robert Bracklow, a professional photographer who was highly active in the New York area at the time. Many other photographers can be seen in the Wild West show, but his archive survived for more than a century. It was his photographs that helped establish the location of the tents belonging to William Cody, Annie Oakley, and even John Burke’s cabin. They also helped piece together what parts of the exterior grandstand might have looked like.”

Virtual World Heritage Ohio

Aerial view of Newark Earthworks render

Ball State University and Ohio History Connection

Sponsored by the Office of Digital Humanities, National Endowment for the Humanities

The public will be able to explore the prehistoric Newark Earthworks in Ohio the way they appeared 2,000 years ago. Funded by the National Endowment for the Humanities’ Office of Digital Humanities, this interactive 3D simulation is under development at Ball State University.

Ball State’s Applied Anthropology Laboratories (AAL) and the Institute for Digital Intermedia Arts (IDIA Lab) are creating a web-based virtual world that interprets the earthworks in their era of construction. The project is led by Kevin Nolan, director and senior archaeologist at AAL, and project co-director John Fillwalk, senior director of IDIA Lab. Interactive features will include accurate celestial alignments: the sky will be simulated using data from NASA’s Jet Propulsion Laboratory, allowing users to view the stars, planets, moon, and sun as they appeared 2,000 years ago.
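
For readers curious how such ephemeris data can be obtained, the sketch below queries JPL’s public Horizons API for the Moon’s apparent azimuth and elevation from a site near Newark. Whether the project uses Horizons or another JPL product is not specified here, and the coordinates, date range and date format are illustrative assumptions:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

// Hedged sketch: querying NASA JPL's public Horizons API for the Moon's
// apparent azimuth and elevation as seen from a site near Newark, Ohio.
// Parameter names follow the Horizons API documentation; the coordinates,
// dates and date format here are illustrative assumptions only.
class HorizonsDemo
{
    static async Task Main()
    {
        string url = "https://ssd.jpl.nasa.gov/api/horizons.api?format=text"
            + "&COMMAND='301'"                  // 301 = the Moon
            + "&EPHEM_TYPE='OBSERVER'"
            + "&CENTER='coord@399'"             // observer on Earth (body 399)
            + "&SITE_COORD='-82.4,40.06,0.3'"   // approx. lon, lat, altitude (km)
            + "&START_TIME='0100-06-20'"
            + "&STOP_TIME='0100-06-21'"
            + "&STEP_SIZE='1%20h'"              // hourly steps ("1 h", URL-encoded)
            + "&QUANTITIES='4'";                // quantity 4 = apparent AZ & EL

        using var http = new HttpClient();
        Console.WriteLine(await http.GetStringAsync(url));
    }
}
```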

First person view of Newark Earthworks render

Already a National Historic Landmark, the Newark Earthworks were designated “the official prehistoric monument of the state” by Ohio in 2006. Spread across four miles in present-day Newark, Ohio, the mounds and walls were constructed to record significant celestial alignments on the landscape, including the 18.6-year lunar cycle. The earthworks created community for the Hopewell People and provided sacred spaces for religious rituals and ceremonies related to their society. The Newark Earthworks comprise the largest set of geometric earthen enclosures in the world, built by the Hopewell People between A.D. 1 and A.D. 400 to serve a variety of cultural and spiritual purposes.

The project is a collaboration between Ball State and the Ohio History Connection, with support and partnership from several federally recognized American Indian tribes, including the Eastern Shawnee Tribe of Oklahoma and the Shawnee Tribe.

Dr. Michael Rhoades

Dr. Michael Rhoades is a multimedia artist and researcher. He joined the IDIA Lab team in August 2019 as a digital audio researcher, creating generative algorithmic systems and configuring venues for multi-channel diffusion and stereoscopic projection.

In the spring of 2018 Michael received a BFA in Creative Technologies from the School of Visual Arts at Virginia Tech. Continuing there, in the summer of 2018 he began pursuing an interdisciplinary PhD in Computer Science, Musical Art, and Visual Art, which he completed in the fall of 2020.

The foci of his practice-based research reside in the areas of holograms, holophons, and supercomputing, each contributing toward the production of novel visual music compositions and the development of leading-edge theoretical perspectives. After completing his graduate coursework at Virginia Tech, Michael returned to Indiana to be closer to family, where he continues his research and creative practice remotely.

As a digital animator, painter, filmmaker, composer, computer scientist, theorist, and researcher, Michael’s multi-faceted artistic and academic endeavors continue to expand in an ever-broadening scope as the creative impulse inspires and requires. Examples of his music, visual art, visual music, and of his philosophical and technical writing are located at http://www.perceptionfactory.com.

Apollo 11 VR on Steam

Apollo 11 VR is a first-person experience of the interior of the 1969 Command Module. Explore the cockpit firsthand in an authentic model of the interior based on scans from the Smithsonian. This release marks the 50th anniversary of the historic landing, when Buzz Aldrin and Neil Armstrong walked on the surface of the Moon at a site named Tranquility Base.

The Spacecraft

The Apollo spacecraft had three parts: a command module (CM) with a cabin for the three astronauts, and the only part that returned to Earth; a service module (SM), which supported the command module with propulsion, electrical power, oxygen, and water; and a lunar module (LM) that had two stages – a descent stage for landing on the Moon, and an ascent stage to place the astronauts back into lunar orbit. The astronauts used Eagle’s ascent stage to lift off from the lunar surface and rejoin Collins in the command module. They jettisoned Eagle before they performed the maneuvers that propelled the ship out of the last of its 30 lunar orbits on a trajectory back to Earth. They returned to Earth and splashed down in the Pacific Ocean on July 24 after more than eight days in space.

Armstrong’s first step onto the lunar surface was broadcast on live TV to a worldwide audience. He described the event as “one small step for man, one giant leap for mankind.”

Wishing Well

Wishing Well creates an environment where visitors’ interactions shape the substance, meaning and character of the artwork – mediated through artificial intelligence. Upon engaging the installation, a participant speaks a wish – audible only to them – into a pool of water. The spoken wish is collected by an artificially intelligent set of software and hardware that begins to process the viewer’s hope. A technical document attached to the proposal details the specific flow and technologies, but in essence, once each wish is recorded it is spoken back, poetically transformed, by text-to-speech synthesis in a whispering tone. The individual viewer privately hears this once; their wish is then processed to become part of a larger collection of all wishes, which can be heard as synthesized voices through directional speakers. The tone of the collection of all wishes is analyzed and affects the processing of the spoken wishes, the ambient sound, and the color and brightness of the light produced above and below the pool.

Wishing Well engages both private and public spaces in the creation of a simultaneously intimate and collective social experience. The tone of the wishes aggregated by the system is processed anonymously and delivered into this social sculpture via synthesized whispered voices spoken in English. The pool of water, or wishing well, is vibrated by a speaker underneath, creating a three-dimensional dynamic sound-wave image of the actual wish. Virtual lanterns appear projected with each wish and launch skyward to become part of the larger collection of dreams.

AI

Speech To Text

IBM Watson’s Speech to Text service accepts an audio file or stream as input and returns a transcription. This service allows Wishing Well to transcribe user speech so its tone can be analyzed and the wish repeated back using text-to-speech.
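
A minimal sketch of calling the service over its REST interface might look like the following; the service URL and API key are placeholders for instance-specific credentials, and wish.wav stands in for a recorded wish:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

// Hedged sketch of a Watson Speech to Text request over REST. ServiceUrl
// and ApiKey are placeholders; wish.wav stands in for a recorded wish.
class WatsonSttDemo
{
    const string ServiceUrl =
        "https://api.us-south.speech-to-text.watson.cloud.ibm.com/instances/YOUR_INSTANCE_ID";
    const string ApiKey = "YOUR_API_KEY";

    static async Task Main()
    {
        using var http = new HttpClient();
        var auth = Convert.ToBase64String(Encoding.ASCII.GetBytes($"apikey:{ApiKey}"));
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", auth);

        var audio = new ByteArrayContent(await File.ReadAllBytesAsync("wish.wav"));
        audio.Headers.ContentType = new MediaTypeHeaderValue("audio/wav");

        // The transcription comes back as JSON: results[].alternatives[].transcript.
        var response = await http.PostAsync($"{ServiceUrl}/v1/recognize", audio);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```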

Tone Analyzer

IBM Watson’s Tone Analyzer service accepts text as input and returns a tone analysis that detects analytical attitude, inhibition, confidence, hope, sadness and joy across a spectrum. This data will be used to drive the color and intensity of the light of Wishing Well based on the aggregate tone of the wishes.
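
A comparable sketch for the Tone Analyzer, again with placeholder credentials and a version date the service accepted at the time of writing:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

// Hedged sketch of a Tone Analyzer REST call; ServiceUrl and ApiKey are
// placeholders for instance credentials, and the wish text is invented.
class ToneDemo
{
    const string ServiceUrl =
        "https://api.us-south.tone-analyzer.watson.cloud.ibm.com/instances/YOUR_INSTANCE_ID";
    const string ApiKey = "YOUR_API_KEY";

    static async Task Main()
    {
        using var http = new HttpClient();
        var auth = Convert.ToBase64String(Encoding.ASCII.GetBytes($"apikey:{ApiKey}"));
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", auth);

        var body = new StringContent(
            "{\"text\": \"I wish for my family to be healthy and happy.\"}",
            Encoding.UTF8, "application/json");

        // document_tone.tones[] in the JSON reply carries tone_id/score pairs
        // that can be mapped onto light color and intensity.
        var response = await http.PostAsync($"{ServiceUrl}/v3/tone?version=2017-09-21", body);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```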

Polly

Amazon Polly’s output can be modified with SSML tags, which can affect the timbre, loudness, and pace of the audio. Wishing Well will use this capability to whisper wishes back to the wisher.
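
Polly’s whispering is produced by the amazon:effect SSML tag; a small sketch using the AWS SDK for .NET follows, in which the voice, region and wish text are our assumptions:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Amazon;
using Amazon.Polly;
using Amazon.Polly.Model;

// Hedged sketch: rendering a wish as a whisper with Amazon Polly via the
// AWS SDK for .NET. The voice, region and wish text are assumptions.
class WhisperDemo
{
    static async Task Main()
    {
        using var polly = new AmazonPollyClient(RegionEndpoint.USEast1);
        var request = new SynthesizeSpeechRequest
        {
            TextType = TextType.Ssml,
            // Polly's whispering effect is driven by the amazon:effect SSML tag.
            Text = "<speak><amazon:effect name=\"whispered\">" +
                   "May this wish come true.</amazon:effect></speak>",
            VoiceId = VoiceId.Joanna,
            OutputFormat = OutputFormat.Mp3
        };
        var response = await polly.SynthesizeSpeechAsync(request);
        using var file = File.Create("whispered_wish.mp3");
        await response.AudioStream.CopyToAsync(file);
        Console.WriteLine("Saved whispered_wish.mp3");
    }
}
```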

Cymatics and Spouting Bowls

Cymatics, from the Ancient Greek κῦμα (“wave”), is a subset of modal vibrational phenomena. Typically the surface of a plate or vessel is vibrated, and regions of maximum and minimum displacement are made visible in a layer of fluid. Different patterns emerge depending on the amplitude and frequency of the vibration and the shape of the vessel.
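
The dependence on frequency and geometry can be illustrated with the common simplified model of Chladni figures on a square plate, where nodal lines fall where cos(nπx)cos(mπy) − cos(mπx)cos(nπy) = 0. The console sketch below renders one such mode; a water-filled bowl, as in Wishing Well, behaves far less ideally, so this is only a qualitative illustration:

```csharp
using System;

// Console sketch of the simplified Chladni-figure model for a square plate.
// The mode numbers (n, m) stand in for drive frequency; '#' marks points
// near a nodal line, where the standing-wave function is close to zero.
class ChladniDemo
{
    static void Main()
    {
        int n = 3, m = 5, size = 40;
        for (int row = 0; row < size; row++)
        {
            for (int col = 0; col < size; col++)
            {
                double x = (double)col / (size - 1);
                double y = (double)row / (size - 1);
                double f = Math.Cos(n * Math.PI * x) * Math.Cos(m * Math.PI * y)
                         - Math.Cos(m * Math.PI * x) * Math.Cos(n * Math.PI * y);
                Console.Write(Math.Abs(f) < 0.08 ? '#' : '.');
            }
            Console.WriteLine();
        }
    }
}
```

Changing n and m reshapes the pattern, just as changing the drive frequency reshapes the waves in the bowl.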

The excitation can be direct – as in the Chinese spouting bowl, in which copper handles are rubbed, causing the vessel to vibrate and agitate the water.

Standing bells originated in China. An early form called the nao took the shape of a stemmed goblet, mounted with its rim upright, and was struck on the outside with a mallet.

Wishing Well User Experience

  1. The visitor steps into a zone and triggers a proximity sensor that begins recording.
  2. The user speaks their wish into the microphone in the console.
  3. The recording stops.
  4. The audio is converted to text via the speech-to-text AI service.
  5. The text is sent to the IBM Language Translator.
  6. The Language Translator returns the translated text, which is passed to Amazon Polly.
  7. The text is sent to the Tone Analyzer.
  8. The Tone Analyzer returns a tone analysis.
  9. The tone analysis is sent to Unity and Max to affect light color and sound.
  10. The tone analysis drives visual and synthesized voice effects based on the mood of the wish.
  11. The synthesized voice audio is played through a hyper-directional speaker using ultrasound directed toward the visitor so only they can hear privately. http://hypersound.com/pro/products/
  12. Previous wishes are played at the same time through a randomly selected hyper-directional speaker surrounding the installation. Each visitor’s wish becomes part of the cloud of wishes and plays anonymously through an AI voice.
  13. The installation goes into an idle / attract state in which visual and auditory effects are influenced by the combined tone of the currently saved wishes.
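
Read as software, steps 4 through 11 amount to a short asynchronous pipeline. The stubs in this sketch are hypothetical stand-ins for the service calls and output devices described above, not project code:

```csharp
using System;
using System.Threading.Tasks;

// Hypothetical glue for steps 4-11 of the flow above. Each stub stands in
// for one of the services described in this document (Watson STT, Language
// Translator, Tone Analyzer, Amazon Polly) or an output device.
static class WishPipeline
{
    static async Task Main()
    {
        await ProcessWishAsync(new byte[0]);                          // dummy recording
    }

    static async Task ProcessWishAsync(byte[] recordedAudio)
    {
        string text = await SpeechToTextAsync(recordedAudio);         // step 4
        string english = await TranslateAsync(text);                  // steps 5-6
        double joy = await AnalyzeToneAsync(english);                 // steps 7-8
        Console.WriteLine($"tone(joy)={joy:F2} -> Unity/Max params"); // step 9
        byte[] whisper = await SynthesizeWhisperAsync(english);       // step 10
        Console.WriteLine($"play {whisper.Length} bytes privately");  // step 11
    }

    // Placeholder stubs; the real system would call the services shown earlier.
    static Task<string> SpeechToTextAsync(byte[] a) => Task.FromResult("I wish...");
    static Task<string> TranslateAsync(string t) => Task.FromResult(t);
    static Task<double> AnalyzeToneAsync(string t) => Task.FromResult(0.8);
    static Task<byte[]> SynthesizeWhisperAsync(string t) => Task.FromResult(Array.Empty<byte>());
}
```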

ATK Procedural Audio Library

ATK-Sharp

ATK (Audio Tool Kit) is a demo of ATK-Sharp, a procedural audio generation library. ATK contains four example sound scripts that developers can use as a reference to script their own custom sounds.

Download now from the Unity Asset Store!

Why use procedural audio?

Using procedurally-generated audio instead of pre-made audio files has two main benefits:

1. Build file size can be reduced. Audio files can take up megabytes of disk space, while procedural audio is an order of magnitude smaller, typically only requiring kilobytes to store its script’s algorithm.

2. Procedural audio is flexible. Audio files are essentially baked: you may be able to change the pitch or speed of a file, but that could never match the sample-by-sample control you have with procedural audio. With physics and logic as inputs, a given sound script may never produce the same output twice.

Features

– WindAudio.cs demonstrates the very basics of what a developer will need to do to start scripting procedural audio using the library.
– FireAudio.cs showcases how the developer can combine noise generators and a modifier to create the hiss, crackle, and pop of a campfire.
– ChimeAudio.cs exhibits how the developer can use physics as an input to the audio script.
– CricketAudio.cs models the chirp pattern of an actual species of cricket, known as the fall field cricket, to demonstrate how far the concept of procedural audio can be taken.
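
For orientation, procedural audio in Unity is typically written against the OnAudioFilterRead callback, which hands the script a sample buffer on the audio thread. The sketch below is in the spirit of WindAudio.cs but is not ATK’s actual source; it shapes white noise with a one-pole low-pass filter to suggest wind:

```csharp
using UnityEngine;

// Minimal sketch of wind-like procedural audio, in the spirit of WindAudio.cs
// but not ATK's actual source. Attach to a GameObject with a playing
// AudioSource; Unity then calls OnAudioFilterRead on the audio thread with
// a sample buffer to fill.
[RequireComponent(typeof(AudioSource))]
public class SimpleWind : MonoBehaviour
{
    [Range(0f, 1f)] public float gain = 0.2f;
    [Range(0.001f, 0.2f)] public float smoothing = 0.02f; // lower = deeper rumble

    private System.Random rng = new System.Random();
    private float lowPass;

    void OnAudioFilterRead(float[] data, int channels)
    {
        for (int i = 0; i < data.Length; i += channels)
        {
            float white = (float)(rng.NextDouble() * 2.0 - 1.0);
            lowPass += smoothing * (white - lowPass); // one-pole low-pass filter
            float sample = lowPass * gain;
            for (int c = 0; c < channels; c++)
                data[i + c] = sample;                 // same signal on all channels
        }
    }
}
```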

Dependencies

ATK-Sharp

Support

For support with ATK, please email us at idialab@bsu.edu.

Photogrammetry and 3D Printing in “The Power of Place”

Long on the cutting edge of emerging technologies, IDIA Lab has gained a reputation for unique skill sets. Providing photogrammetry and 3D printing services, IDIA Lab joined several departments in producing “The Power of Place: 100 Years of Architecture at Ball State University,” a new exhibition at the David Owsley Museum of Art.

“The Power of Place: 100 Years of Architecture at Ball State University”

“The Power of Place: 100 Years of Architecture at Ball State University” showcases the ways Ball State’s campus architecture has changed and how the past still shines through.

“The exhibition embodies the idea that even though Ball State’s built environment has greatly changed over time by way of adding new buildings constructed with modern materials, our past is forever present,” says Rebecca Torsell, archivist for architectural records at Ball State University, who organized the exhibition.

Running from September 27 through December 21, 2018, the exhibition is part of a campus-wide celebration of Ball State’s Centennial.

“The museum is an ideal place to celebrate the University’s Centennial,” says Robert G. La France, director of the David Owsley Museum of Art. “For more than 80 years the museum’s building — the Fine Arts Building — has stood as Ball State’s palace to the arts. This show displays the University’s long-standing commitment to distinctive architecture and introduces visitors to its vast architectural archives and special collections.”

Photogrammetry

Part of the history of Ball State architecture includes unique tiles used in the Fine Arts Building itself. The museum wanted a way to display these ornate, relief tiles without removing them from the wall. Using their photogrammetry techniques, IDIA Lab first recreated the tiles as a virtual 3D model.

IDIA Lab Modeler, Trevor Danehy, explains the process, “We take at least one hundred photos of each tile, everything from straight on shots of the tile to sharp angles in an effort to expose the levels of depth in the crevices of the tile. The photogrammetry software algorithm requires that each photo has at least 50% overlap compared to the last photo taken in order to stitch together a mesh.”

“It can be a very trial-and-error process, so the more images taken, the better. That is why we prefer a DSLR camera that can take several images rapidly. Quantity over quality works best in photogrammetry. Once the software stitches together a crude mesh, we do a small amount of digital cleanup and export it for 3D printing.”

3D Printing

After the virtual mesh has been created, the next step is to recreate the tile using a physical medium. At IDIA Lab, this is accomplished using 3D printing. IDIA Lab’s Adam Kobitz, who manages 3D printing in the lab, describes the printing technique. “We printed the tiles using a Fused Deposition Modeling (FDM) printing process, in which the physical objects are built up by extruding layer after layer of melted plastic. This process allows us to produce accurate recreations of the tiles relatively quickly, permitting viewers of the exhibit to observe and interact with them in a manner they’re unable to with the original works.”

After the tiles are printed, they are ready to be displayed as a portable replica of part of Ball State’s architectural history.

BALL STATE UNIVERSITY AND TU DORTMUND EXCHANGE PROGRAM IN INTERMEDIA ART

Ball State University and the Technische Universität Dortmund have partnered in creating an international exchange program for students studying the design and technology of Intermedia artforms.

This program will provide opportunities for students and faculty engaged in experiential, project-based approaches to the collaborative creation of new media art. The Hans Breder Foundation, a non-profit international Intermedia art organization, has contributed a significant body of historic artwork to the Museum Ostwall in Dortmund. The archive will serve as a catalyst for research and production of Intermedia-based projects initiated by the BSU and TU Dortmund exchange. John Fillwalk, also president of the Hans Breder Foundation, initiated the exchange program with the assistance of the Rinker Center for International Programs. He will be present at the formal signing ceremony on November 19th in Dortmund, Germany.

Oculus Rift + Leap Motion controller 3D printing

Using our MakerBot Replicator 2X 3D printer, IDIA’s Chris Harrison worked with David Rodriguez to create a family of 3 brackets with varying uses and advantages to mount a Leap Motion Controller to the front of an Oculus Rift DK2.

Finding that double sided tape was not doing a very effective job of holding the Leap Motion Controller in place, we looked around Makerbot’s Thingiverse, an online warehouse of 3D print-ready objects, for a solution.

We found a bracket which when printed didn’t quite meet the tolerances of the Leap’s dimensions, and so some slight modifications were made to better accommodate it. In addition, rather than the 2-piece configuration on the website, a new bracket was made to be printed in one single pass.

Finally, after realizing other potential uses for the Leap, 2 more brackets were designed and printed so that the Leap can be securely installed onto the Oculus in a total of 3 different configurations.

The brackets can be viewed and downloaded here:

Bracket 1 Straight bracket used for visual IR passthrough from Leap camera

Bracket 2 Straight bracket used to minimize Oculus IR emitter occlusion

Bracket 3 Angled bracket used to track hands at the best angle – if no passthrough is desired

View an Oculus / Leap project here: https://idialab.org/oculus-rift-and-leap-motion-demo/

Oculus Rift and Leap Motion Demo

The Oculus Rift allows users to be visually transported to virtual 3D worlds. When paired with the Leap Motion, users now have the ability to interact with a 3D virtual world using their own hands.

The visual displays on the Oculus Rift project the user into an immersive 3D environment. The Leap Motion then allows interactivity via infrared images of the user’s hands. Together, these two pieces of hardware create exciting new possibilities.

Blue Mars Promo

Blue Mars is a 3D massively multiplayer virtual world platform developed by Hawaii-based Avatar Reality – allowing third party developers to create virtual worlds, MMOG games, simulations, businesses, entertainment venues, clothing, custom avatars, furniture, virtual homes, and other items. It consists of four main parts: the client software, the Sandbox Editor SDK suite, the website, and the host servers.

Free Account and Download here: http://www.bluemars.com/BetaClientDow…

Blue Mars Client

The free Blue Mars Client is your window into our free to play massively multiplayer virtual world made up of an expanding set of independently operated Cities that feature unique themes, activities, and attractions. Through the Blue Mars Client, users can participate in a wide variety of entertainment, educational, and productivity experiences.

Blue Mars Sandbox Editors

The free Blue Mars Sandbox Editors are the bridge between your offline content and the online Blue Mars world. Our Editors work in concert with digital content creation tools like 3DSMax, Maya, Blender, and SketchUp to create immersive 3D spaces and upload them into the virtual world. Using the Blue Mars Sandbox Editors, developers can arrange geometry, script interactive features, define items for sale, manage shop and residence interiors, preview lighting conditions, adjust time of day, create shops, manage land, develop terrain and vegetation, choose textures, adjust shaders, and much more. Our editors are “What you see is what you play” so the view of your offline content in the Editor is exactly how it will appear in the online environment.

BlueMars.com

BlueMars.com is the central hub for discovery, community, and user and developer account management. Everything from currency balances, friend lists, messaging, business intelligence, inventory, virtual land, search, shopping, and account information will be managed through the web site.

Blue Mars Servers

The Blue Mars Servers distribute the content uploaded into the virtual world by developers, manage the global item and inventory database, and coordinate the movement, actions, and communication of the users exploring Blue Mars. Our advanced server architecture can support many thousands of users simultaneously in the same location.

Chris Harrison CAP Lecture

Chris Harrison gave a presentation during the annual CAP (College of Architecture and Planning) Faculty and Student Symposium about the use of quadcopters and drones in photogrammetry. Using a quadcopter and a GoPro Hero 3 camera, he was able to capture a subject and process the images in Autodesk ReCap to produce a virtual model.

The video is available on our YouTube channel.

Chichen Itza

Student Fellow Adam Kobitz demonstrates a recreation of Chichen Itza within the Unity software.

For the video, click on the link below.

YouTube Video Demonstration

Collaboration Screens in Unity Software

David takes us through an example of using collaboration screens within the Unity software.

YouTube Link.

REDgrid: BSU Virtual World for Teaching & Learning

REDgrid

IDIA is pleased to announce REDgrid – a 3D virtual campus to support the educational mission of faculty, staff and students. Ball State community members can use it for free as a virtual gathering space including classroom instruction, small group meetings, presentations, panel discussions, or performances. It is a secure environment hosted and managed solely by Ball State’s IDIA Lab. Virtual classrooms or field experiences can be customized as well as made private to suit your needs. REDgrid also offers a developmental platform for online instruction or community displays.

In this video, David gives us a preview of what you are able to do in REDgrid.

Ball State institute creates virtual world to be used for learning, connecting
The Daily News | Amanda Belcher Published 08/31/15 12:27pm Updated 09/01/15 5:15pm
By 2013, the Sims video game series had sold 175 million copies worldwide. Its users could create avatars, build houses—just like in reality.

Ball State’s own REDgrid uses a similar concept. Students can make an avatar, walk around a virtual Ball State campus and interact with other avatars via written messages or a headset in this open simulator.

Ball State’s Institute for Digital Intermedia Arts (IDIA) developed REDgrid.

“The server is physically housed at the BSU IDIA Lab and was installed in collaboration with Information Technology,” said John Fillwalk, the director of Ball State’s IDIA Lab.

When it comes to REDgrid, the possibilities can seem limitless—and some faculty members have already begun testing the simulator’s boundaries.

Mai Kuha, an assistant professor of English, used REDgrid in an honors class as a gender identity project. Students were assigned a gender and told to create an avatar of that gender. This enabled them to observe how people of opposite genders are treated differently.

Kuha isn’t the only professor to use REDgrid as a learning tool in the classroom. Ann Blakey, an associate professor of biology, used it in one of her courses for an environmental ethics activity.

Students were assigned a role, such as scientist or environmentalist, and were invited to explore the virtual environment and file a report. The activity gave students the opportunity to see the environment from different perspectives.

Fillwalk envisions even more opportunities for the platform.

“Ball State community members can use it for free as a virtual gathering space including classroom instruction, small group meetings, presentations, panel discussions or performances,” Fillwalk said.

The virtual classrooms and field experiences can be customized to fit each teacher’s needs.

This kind of creative teaching is what Stephen Gasior, an instructor of biology, is looking to expand upon. An enthusiastic voice for REDgrid, Gasior encourages professors and students to utilize this tool. He explains that it gives professors the ability to shape an environment and allows students to experience any number of events or situations.

REDgrid isn’t just for academic purposes either, Gasior said. Students can use the avatars for social experiments too.

“REDgrid represents Ball State’s campus, and international or online students get [the] feeling of being in the campus environment,” he said.

Fillwalk fully understands this aspect of REDgrid.

“We designed it to be a flexible platform connecting BSU faculty and students to international,” he said.

Connection is key with REDgrid. Gasior stressed that it can help build and connect communities—Ball State or otherwise.

Ball State is already working with faculties of other universities so the tool can be used on campuses other than just Ball State’s, Fillwalk said.

“The platform could certainly continue to be expanded,” he said.

Gasior has plans for the future—like researching developmental grants and expanding REDgrid. But IDIA staff can only do so much.

“People who come to REDgrid and have a passion for it will shape the road it will take,” Gasior said.

http://www.ballstatedaily.com/article/2015/08/redgrid-virtual-world

Trans|Duce Project

Trans|Duce Project Demo Video

The Trans|Duce uses capacitive sensing and an Arduino to sense when a person is near it. That information is sent to Max 5, where it affects a sound and/or filter; the result is sent to an amplifier that vibrates a transducer placed on the back of the metal sheet.

The metal sheet then acts as a speaker and its unconventional makeup creates an interesting timbre. Different gauges of metal can be used to better amplify certain ranges of the sound spectrum. From this, we can create different sounds depending on how close or far away we are from the metal sheet.

Be sure to check out the demo video above and see the transducer in action.

EEG Device

Using electroencephalography (EEG), we are developing applications that digitize analog electrical brainwaves. With this device from NeuroSky (www.neurosky.com), we can wear the headset and control applications on electronics such as cell phones and computers. Unlike most EEG devices on the market, this one is affordable and stylish. Using the provided development tools, we plan to implement our own work and create more applications for it. Using certain “hacks” and applying our own touches, we are able to unlock the ability to create new uses for this product. We look forward to continuing our research, but it is still in a development stage.

Dice+


Connecting Dice+ to a Cell Phone

The DicePlus (Dice+) can communicate with both iOS and Android devices and has an available SDK for Android, iOS, and Unity.

Its capabilities include: reporting the number rolled, control of its LEDs, a magnetometer and accelerometer for orientation and movement, a capacitive sensor for proximity and touch, and a temperature sensor.

For more information regarding its use, check out www.dicepl.us

Mobile Arduino Controller

The Mobile Arduino Controller allows one to access an Arduino board via a mobile-friendly website. It’s a website designed for mobile devices for a few reasons:

1) Allows access from computers, not just smart phones.

2) Easier for the public to access the Arduino or multiple Arduinos for installations without searching for an app and then waiting for it to download.

3) Works on all smart phones.

The mobile website allows near real-time manipulation of anything connected to the Arduino board. It could also be used to relay sensor information back to the website or user. Right now it is connected only to control LEDs, but it could drive anything (servos, speakers, etc.).

We hope to use this for more than just turning on/off an LED on the board, but currently it is in a prototyping/research phase.

Unity Mobile Controller

The Unity Mobile Controller would allow us to have a Unity project running on a PC while users download an app that turns their smartphone into the controller. This lets the project running on the PC show the player while the smartphone displays the controls and other information (such as maps, menus, etc.). It’s still in a research/prototyping stage right now.
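
One plausible transport for such a controller – an assumption on our part, not the prototype’s actual design – is a simple UDP link, with the phone sending input each frame and the PC project polling a listener:

```csharp
using System.Globalization;
using System.Net;
using System.Net.Sockets;
using System.Text;

// Hedged sketch of a phone-to-PC controller link over UDP. The port number
// and "x,y" message format are arbitrary choices for illustration.
public static class ControllerLink
{
    public const int Port = 9050;

    // Phone side: send the current joystick axes each frame.
    public static void SendInput(UdpClient client, string pcAddress, float x, float y)
    {
        string msg = x.ToString("F3", CultureInfo.InvariantCulture) + "," +
                     y.ToString("F3", CultureInfo.InvariantCulture);
        byte[] payload = Encoding.UTF8.GetBytes(msg);
        client.Send(payload, payload.Length, pcAddress, Port);
    }

    // PC side: blocking receive; call from a background thread so the
    // Unity main thread is never stalled waiting for a packet.
    public static (float x, float y) ReceiveInput(UdpClient listener)
    {
        var remote = new IPEndPoint(IPAddress.Any, Port);
        string[] parts = Encoding.UTF8.GetString(listener.Receive(ref remote)).Split(',');
        return (float.Parse(parts[0], CultureInfo.InvariantCulture),
                float.Parse(parts[1], CultureInfo.InvariantCulture));
    }
}
```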

 

Proserpine Captured with Autodesk ReCap

Photogrammetry scan of a marble bust of Proserpine by artist Hiram Powers. Proserpine, daughter of Ceres, became the goddess of the underworld when Pluto carried her away and made her his wife. IDIA Lab captured the scan using Autodesk ReCap software; the model is displayed in Sketchfab.
From the collection of the Ball State Museum of Art.