IDIA Lab’s chatbots provide interactions with virtual 3D animated non-player characters (NPCs) that can be created with customized persona profiles. Users can converse with NPCs via voice or text input; the virtual characters use artificial intelligence to assess the tone of the conversation and reflect simulated responses through facial animation, gesture, and vocal intonation. The lab integrates ChatGPT and other AI services into the Unity game engine to build a robust, real-time engagement system supporting a wide variety of customized NPC profiles – from historical characters to highly specific technical interactions.
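As a rough illustration of how such a pipeline can be structured, the hedged Python sketch below assembles a persona prompt, classifies the tone of a user utterance, and maps that tone to animation and voice cues. All names here (PersonaProfile, query_llm, the tone keywords, and the gesture labels) are hypothetical placeholders, not the lab’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class PersonaProfile:
    """Hypothetical customized persona for an NPC."""
    name: str
    backstory: str
    speaking_style: str

def build_system_prompt(p: PersonaProfile) -> str:
    # Flatten the persona profile into a system prompt for the language model.
    return (f"You are {p.name}. Backstory: {p.backstory}. "
            f"Always answer in this style: {p.speaking_style}.")

def classify_tone(text: str) -> str:
    # Toy keyword-based tone assessment; a production system would use a
    # sentiment/emotion model instead.
    lowered = text.lower()
    if any(w in lowered for w in ("thanks", "great", "wonderful")):
        return "friendly"
    if any(w in lowered for w in ("angry", "wrong", "terrible")):
        return "agitated"
    return "neutral"

# Tone -> animation/voice cues that a game engine could consume.
TONE_TO_CUES = {
    "friendly": {"facial": "smile", "gesture": "open_palms", "intonation": "warm"},
    "agitated": {"facial": "frown", "gesture": "step_back", "intonation": "calming"},
    "neutral":  {"facial": "rest",  "gesture": "idle",       "intonation": "even"},
}

def respond(profile: PersonaProfile, user_text: str, query_llm) -> dict:
    """query_llm is a placeholder for whatever chat-completion call is used."""
    reply = query_llm(build_system_prompt(profile), user_text)
    cues = TONE_TO_CUES[classify_tone(user_text)]
    return {"text": reply, **cues}
```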
The Maryknoll Virtual School, created by the IDIA Lab, is a digital twin simulation of a parochial elementary school in Honolulu, HI. The virtual school provides an environment for students and teachers to remotely engage in collaborative real-time lectures and conversations in a multiplayer, avatar-based teaching environment. The application also gives administrators the ability to monitor admission, registration, class assignments, and grading through an extensive management portal. The application allows for communication via chat, voice, live whiteboard, screen share, in-world video conferencing, and file sharing.
BSU IDIA Lab developed Virtual Milestones of Flight, an immersive head-mounted display prototype designed for the permanent exhibition at the Smithsonian National Air and Space Museum in Washington, DC.
Both the exhibition and the simulation celebrate a selection of the most significant airplanes, rockets, and spacecraft in history. Craft incorporated into the simulation include Charles Lindbergh’s Spirit of St. Louis; the X-15, a hypersonic rocket-powered USAF aircraft; the Wright brothers’ Flyer; Sputnik, the first artificial satellite; and NASA’s Apollo 11 Lunar Lander.
IDIA Lab has designed a virtual simulation of the villa of the Roman emperor Hadrian, a UNESCO World Heritage site located outside of Rome in Tivoli, Italy. This project is being produced in collaboration with the Virtual World Heritage Laboratory (VWHL) at the University of Virginia (UVA), directed by Dr. Bernard Frischer and funded by the National Science Foundation. This large-scale recreation virtually interprets the entire villa complex in consultation with the world’s foremost villa scholars and educators. The project has been authored in the Unity game engine as a live 3D, multi-user online learning environment that allows students and visitors to immerse themselves in all aspects of the simulated villa.
The project will not only accurately recreate the villa buildings but will also include a complete Roman avatar system, non-player characters with artificial intelligence, furniture, indigenous vegetation, a dynamic atmospheric system, and a sophisticated user interface. The interface will provide learning, navigation, reporting, and assessment opportunities, and will also allow users to set the position of the sun for any date in 130 AD using data from NASA JPL’s Horizons system – testing theses of astro-alignments of architectural features during solstices and equinoxes. UVA students will be briefed on the culture and history of the villa and will learn the virtual environment for five weeks prior to immersing themselves within it. The avatar system will allow them to enter the world choosing class and gender – already aware of the customs and behavior of the Roman aristocrat, soldier, slave, or politician. This project will be delivered to VWHL at UVA in early March.
The Digital Hadrian’s Villa Project: Virtual World Technology as an Aid to Finding Alignments between Built and Celestial Features
Bernard Frischer, Director, Virtual World Heritage Laboratory, University of Virginia
John Fillwalk, Director, IDIA Lab, Ball State University
Hadrian’s Villa is the best known and best preserved of the imperial villas built in the hinterland of Rome by emperors such as Nero, Domitian, and Trajan during the first and second centuries CE. A World Heritage site, Hadrian’s Villa covers at least 120 hectares and consists of ca. 30 major building complexes. Hadrian built this government retreat about 20 miles east of Rome between 117, when he became emperor, and 138 CE, the year he died. The site has been explored since the 15th century and in recent decades has been the object of intense study, excavation, and conservation (for a survey of recent work, see Mari 2010).
From 2006 to 2011, with the generous support of the National Science Foundation[1] and a private sponsor, the Virtual World Heritage Laboratory created a 3D restoration model of the entire site authored in 3DS Max. From January to April 2012, Ball State University’s Institute for Digital Intermedia Arts (IDIA Lab) converted the 3D model to Unity 3D, a virtual world (VW) platform, so that it could be explored interactively, be populated by avatars of members of the imperial court, and be published on the Internet along with a related 2D website that presents the documentation undergirding the 3D model.
The 3D restoration model and related VW were made in close collaboration with many of the scholars who have written the most recent studies on the villa.[2] Our goal was to ensure that all the main elements—from terrain, gardens, and buildings to furnishings and avatars—were evidence-based. Once finished, the model was used in two research projects.
The first project was an NSF-sponsored study of the usefulness of VW technology in archaeological education and research. We used the virtual villa in undergraduate classes at Xavier University and the University of Virginia to investigate the thesis of two recent studies by project advisors Michael Ytterberg and Federica Chiappetta about how this enormous built space was used by six different groups of ancient Romans, ranging from the Emperor and Empress to ordinary citizens and slaves (Ytterberg 2005; Chiappetta 2008). Avatars representing these groups have been created and are being operated by undergraduate students as a Problem-Based Learning (PBL) experience. They are observed by subject experts, who are using the data generated to test and, if necessary, refine the initial theses about how circulation through the villa was handled. The results are still being evaluated. Preliminary indications are that the combination of VW technology and a PBL educational context is very effective in taking advantage of the known connection between the hippocampus and long-term learning, especially when the information to be mastered is spatial (Kandel 2007).
The second project involved use of the VW for new archaeoastronomical studies. Most of our advisors’ publications, like the older work by archaeologists that preceded them, have concentrated on archaeological documentation, restoration, and formal and functional analysis. The latest research by advisor De Franceschini and her collaborator Veneziano (2011) combined formal and functional analysis with archaeoastronomy: it considered the alignment of certain important parts of the villa in relation to the sun’s apparent path through the sky on significant dates such as the solstices. In their recent book they showed how two features of the villa are aligned with the solar solstices: the Temple of Apollo in the Accademia and the Roccabruna. We used the VW to extend their research to other areas of the villa, taking advantage of 3D technology to restore the sun to the right place in the sky and also to repair the damage to the architecture of the villa, as De Franceschini and Veneziano had independently suggested be done before they learned about our digital model of the villa.
The work of De Franceschini and Veneziano is innovative. Archaeoastronomy has become an accepted field of study in recent decades, and a considerable amount of work has been done in Old and New World archaeology. In Roman archaeology, however, this approach is still rarely encountered. Significantly, one of the few compelling studies concerns the most famous Hadrianic building: the Pantheon in Rome. Hannah and Magli (2009) and Hannah (2011) have shown a number of solar alignments in the building, of which the most notable are the sun’s illumination of the entrance doorway at noon on April 21, and the view of the setting sun silhouetting the statue of Hadrian as Sun god on a four-horse chariot atop the Mausoleum of Hadrian, as seen from the middle of the Pantheon’s plaza on the summer solstice. Like the summer solstice, April 21 is a significant date: on it occurred the annual festival in Rome known as the Parilia (renamed the Romaia by Hadrian),[3] which celebrated the founding of Rome.
De Franceschini and Veneziano pursued an observation of Mangurian and Ray (2008) to document an impressive example of solar alignment at Hadrian’s Villa involving the tower known as Roccabruna at the western end of the villa. Originally a tower-like structure topped by a round temple, what remains today is the well-preserved, massive lower floor. The main entrance is located on the northwestern side, to the right, and gives access to a large circular hall covered by a dome. The dome is punctuated by an odd feature: five conduits that are wider on the outside than on the inside (figure 1).
What is the function of these unusual conduits? They have no known parallel in Roman architecture. After asking themselves this same question, on June 21st, 1988, the day of the summer solstice, the American architects Robert Mangurian and Mary Ann Ray went to Roccabruna at sunset and discovered the extraordinary light phenomena that occur there. At sunset the Sun enters through the main door and illuminates the niche on the opposite side, something that happens during most of the summer days. But only on the days of the summer solstice does the Sun also penetrate the conduit located above that door: its rays emerge from the slot inside the dome, projecting a rectangular blade of light on the opposite side of the dome. In June 2009, De Franceschini verified the findings of Mangurian and Ray. However, the apparent path of the Sun through the sky changes slightly over time, so that in the nearly 1,880 years separating us from Hadrian the precise effect of the alignment has been lost. As they noted, only a computer simulation can recreate the original experience of being in the lower sanctuary at Roccabruna at sunset on the summer solstice during the reign of Hadrian.
Once we had our 3D model of the site, we were able to obtain from NASA’s Horizons system[4] the correct azimuthal data for the year AD 130 and put the sun into the sky at sunset on the summer solstice. Following the lead of De Franceschini, who in the meantime had become a consultant to our project, we put into the niche one of the four statues of the Egyptian sky goddess Isis that were found at the villa. De Franceschini chose Isis because, first, there is no question that a statue stood in this niche, so something had to be placed there; and second, the two flanking niches held candelabra, whose bases are preserved and are decorated with Isiac iconography. Moreover, Isis’ festival in Rome was on the summer solstice. So we scanned and digitally restored one of the several statues of Isis from the villa and put it into the central niche. Finally, for the dome, which we know from surviving paint was blue and therefore had the famous “dome of heaven” motif (Lehmann 1945), we followed De Franceschini in restoring a zodiac set up in such a way that the sign of Gemini is over the statue niche, since the last day of Gemini is the summer solstice. Our zodiac is adapted from the great Sun God mosaic in the Rheinisches Landesmuseum in Bonn, which kindly gave us permission to use it.
As can be seen in figure 2, when we restored the sun in the right position in the sky dome for sunset on the summer solstice (June 21) of 130 CE in our 3DS Max model of Roccabruna, the sunlight coming through the main doorway illuminated the statue of Isis in the statue niche, and the light entering through the conduit lit up the sign of Gemini painted on the cupola. So we were able to confirm the Mangurian-Ray thesis.
The approach we have taken in our Roccabruna project is deductive: Mangurian and Ray noted the strange feature of the conduits punctuating the cupola of Roccabruna; they hypothesized a solar alignment. De Franceschini and Veneziano agreed and for various reasons we don’t need to go into today, they put a statue of Isis into the statue niche. We set up the conditions in which these hypotheses could be tested and were able to verify them.
But surely, if there is one such alignment at the villa of the same emperor who was responsible for the Pantheon, there may be others. The villa, however, is very big—covering over 100 hectares—and has 30 major building complexes, most larger than Roccabruna. Moreover, such alignments could just as easily involve astrological features such as the Moon and the planets. Faced with this level of complexity, the best methodological way forward in searching for new alignments is clearly inductive and empirical. This is one reason why we asked the Institute for Digital Intermedia Arts (IDIA Lab) of Ball State University to create a multi-user virtual world based in Unity 3D from our 3DS Max model.
The project of virtually interpreting a simulation on the scope and scale of Hadrian’s Villa was a daunting one – engaging layers of scholarly, technical, and pedagogical challenges. The technical challenges were many – foremost, to leverage the Unity 3D game engine as an effective multi-user, avatar-based virtual world. An important factor was to create an environment that was straightforward and accessible via standard web browsers on both Mac and Windows, and we selected Unity 3D as the starting point for developing the platform. We required specific back-end administration tools to handle the accounts and server-side aspects of the project; for this we relied on SmartFoxServer, as it integrates with Unity 3D quite well. Our team took an approach that bridged and integrated disparate technologies, creating a robust virtual world platform to immersively augment both instructional and PBL processes. VW features available to the learning community included text-based communication, a live map showing current visitor positions, map-based teleportation, a managed voice channel, user-selected avatar gestures, a list of online users, paradata, photographs of the extant site, plan views, and integrated web links.
Key to the project was a varied system of avatars representing the imperial court, freedmen, senators, scholars, soldiers, and slaves of the emperor. The avatar system served several important functions, testing recent scholarly interpretations of circulation throughout the villa and of the use of various spaces for typical court activities – meals, imperial audiences, bathing, worship, etc. Upon entering the simulation, the choice of avatar determined one’s social standing within the role-play of the world.
A gesture system was created via motion capture, providing each user with a unique set of actions and gestural responses to engage in social interactions – including greetings, bowing, and gestures specific to rank and class. Communication was also a critical element in the modes of problem-based learning engaged by the participants in the simulation. Specific technologies provided varied capabilities such as public chat, private instant messaging, and live multi-user voice channels.
A companion website was co-developed and integrated into the VW environment, providing learners with visual assets such as photographs and panoramas of the current site, site plans, elevations, and video interviews with villa scholars. We also developed three-dimensional turntables of the interpreted and reconstructed models, overview information on each of the major villa features, a bibliography, and an expansive database of art attributed to the villa site. This information can be accessed by learners directly from within the virtual world. The development team integrated the notion of paradata, introduced by the London Charter – making instantly transparent the scholarship and all underlying elements of the 3D model (from terrain to buildings, furnishings, costumes, and human behavior).
In support of the new research theme on celestial alignments by consultants De Franceschini and Veneziano, a major goal for the project was to develop an accurate simulation of the position of the sun. The solar tracking system, or virtual heliodon, that we created in response to this research was envisioned as a bridge between the virtual environment and coordinates from an external database calculating solar positions. After investigating existing tools, we decided to employ the Horizons database created by NASA’s Jet Propulsion Laboratory as an online solar system data computation service – tracking celestial bodies in ephemerides from 9999 BCE to 9999 CE. In implementing solar tracking for the villa project, in instances where we wanted to investigate potentially significant solar alignments, we entered the latitude, longitude, and altitude of specific buildings from the Tivoli site to poll the Horizons data for the year 130 CE. The user was able to change the date and time of day and to quickly play the sun forward from specific moments via the user interface. The system was correlated to both the Julian and Gregorian calendars and contained presets for the vernal and autumnal equinoxes as well as the summer and winter solstices.
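As a rough illustration of the kind of query involved, the sketch below polls the JPL Horizons service for topocentric solar azimuth and elevation near Tivoli around the summer solstice of 130 CE. It is a minimal sketch assuming the astroquery library’s jplhorizons module and approximate coordinates for the villa; it is not the lab’s Unity implementation, which applied these coordinates inside the game engine.

```python
# Minimal sketch: query JPL Horizons for the Sun's azimuth/elevation as seen
# from Hadrian's Villa (approximate coordinates) around the 130 CE solstice.
# Assumes the astroquery package; exact epoch formatting for ancient dates
# may need adjustment against the Horizons documentation.
from astroquery.jplhorizons import Horizons

villa = {"lon": 12.774, "lat": 41.941, "elevation": 0.09}  # deg, deg, km (approx.)

obj = Horizons(
    id="10",                 # Horizons ID for the Sun
    location=villa,          # topocentric observer site
    epochs={"start": "0130-06-20 16:00",
            "stop":  "0130-06-20 20:00",
            "step":  "10m"},
)

eph = obj.ephemerides()      # table that includes AZ/EL columns among others
for row in eph:
    print(row["datetime_str"], float(row["AZ"]), float(row["EL"]))
```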
These tools allowed for the rapid discovery of potential alignments that might bear further investigation. The solar feature allows one to proceed empirically, in effect turning the clock back to 130 CE and running experiments in which the days and hours of the year are sped up by orders of magnitude, so that one can in a very short time find candidate alignments not yet hypothesized by scholars working in the traditional way of Mangurian and Ray.
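The same empirical sweep can be expressed programmatically: given an ephemeris like the one above and the compass bearing of a built axis, step through the year and flag moments when the low sun lines up with that axis. The sketch below is hypothetical (the tolerance, axis azimuth, and record format are placeholders), but it captures the candidate-finding loop described here.

```python
# Hypothetical candidate-alignment scan: flag times when the Sun sits low on
# the horizon and its azimuth matches the bearing of a built axis.
def find_alignments(ephemeris, axis_azimuth_deg, az_tol=1.0, max_elevation=5.0):
    """ephemeris: iterable of (timestamp, azimuth_deg, elevation_deg) rows."""
    hits = []
    for timestamp, az, el in ephemeris:
        if 0.0 <= el <= max_elevation:                            # sun near the horizon
            diff = abs((az - axis_azimuth_deg + 180) % 360 - 180)  # wrap-safe difference
            if diff <= az_tol:
                hits.append((timestamp, az, el, diff))
    return hits

# Example with made-up rows and a made-up axis bearing of 62 degrees.
sample = [("0130-07-20 04:50", 61.4, 1.2),
          ("0130-07-20 05:10", 64.0, 4.1)]
print(find_alignments(sample, axis_azimuth_deg=62.0))
```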
As developers, our goal was to create the solar tool and let students and scholars use it to undertake their own empirical research. Our team did not intend to engage in this research ourselves, yet in the process of working within the environment daily we quickly began to notice curious solar phenomena. In a bit of empirical study of the very first component of the site we installed in the simulation, the Antinoeion, or newly discovered Temple of the Divine Antinous, we noticed an alignment of potential interest. The most likely alignment seemed at first glance to be along the main axis running from the entrance, through the obelisk in the central plaza, to the statue niche at the end of the axis. We ran through the days and hours of the year and found that the sun and the shadow of the obelisk align at sunrise on July 20. We consulted our expert on the Egyptian calendar in the Roman period, Professor Christian Leitz of the University of Tuebingen, and he confirmed that this date has religious significance. It is, in fact, the date of the Egyptian New Year, as the Romans of Hadrian’s age clearly knew (cf. the Roman writer Censorinus, who states that the Egyptian New Year’s Day fell on July 20 in the Julian calendar in 139 CE, the date of a heliacal rising of Sirius in Egypt).
In the process of developing and subsequently utilizing the simulation tools we created for archaeoastronomical research, we have concluded that virtual world technologies can indeed take the search for significant built-celestial alignments to a new level of insight.
Bibliography
Chiappetta, F. 2008. I percorsi antichi di Villa Adriana (Rome).
De Franceschini, M. and G. Veneziano. 2011. Villa Adriana. Architettura celeste. I segreti dei solstizi (Rome).
Hannah, R. 2008. Time in Antiquity (London).
Hannah, R. 2011. “The Role of the Sun in the Pantheon’s Design and Meaning,” Numen 58: 486-513.
Kandel, E. 2007. In Search of Memory: The Emergence of a New Science of Mind (W. W. Norton, New York). Kindle edition.
Lehmann, K. 1945. “The Dome of Heaven,” Art Bulletin 27: 1-27.
Lugli, G. 1940. “La Roccabruna di Villa Adriana,” Palladio 4: 257-274.
Mangurian, R. and M.A. Ray. 2008. “Re-drawing Hadrian’s Villa,” Yale Architectural Journal: 113-116.
Mari, Z. 2010. “Villa Adriana. Recenti scoperte e stato della ricerca,” Ephemeris Napocensis 20: 7-37.
Ytterberg, M. 2005. “The Perambulations of Hadrian. A Walk through Hadrian’s Villa,” Ph.D. dissertation, University of Pennsylvania.
Vice-President of Information Technology Phil Repp and IDIA Lab Director John Fillwalk traveled to Washington, D.C., to present Ball State projects in electronic field trips, virtual worlds, and hybrid design to several federal organizations. Meetings included a presentation at the Smithsonian Institution Building on the National Mall to representatives from all the Smithsonian museums and institutes, as well as a separate presentation to the National Park Service. Conversations are ongoing regarding partnership and collaboration opportunities.
IDIA is collaborating with BSU’s Brandy and Greg Rocheleau from the Department of Criminal Justice and Criminology on the production of a Virtual Courtroom. The project, funded by BSU’s Women of Beneficence, is designed to allow students in designated criminal justice classes to participate in courtroom simulations through the use of avatars, building knowledge and empathy and giving them practice in decision-making skills in a courtroom setting.
One of the main purposes of this project is to give criminal justice students an opportunity to practice ethical decision-making skills in the field of criminal justice. It also helps students develop empathy for vulnerable populations so that they become socially responsible professionals. IDIA is designing a multiuser virtual simulation where students can engage via avatars and communicate by voice, chat, video, or whiteboard. The facilities will include a courtroom, a deposition room, a classroom, and even an accurate digital twin of the Supreme Court!
IDIA Lab is working on a Virtual Escape Room project for the Ball State University College of Health. The project will launch in the fall semester. In this collaboration we are developing a virtual, multi-user escape room. Escape rooms employ innovative learning strategies such as problem-based and immersive learning, role play, and game principles to improve teamwork, communication, and leadership. Nationally, health profession curricula are beginning to explore the use of escape room experiences as innovative pedagogy. Escape rooms challenge individuals to collaboratively explore new content, work as a team to discover clues, solve puzzles, and accomplish interactive tasks in a limited amount of time. The multi-user virtual escape room will support teaching teamwork and communication curricula for health profession students and contribute to improved health outcomes.
IDIA Lab partnered with physicians at the Seattle Children’s Hospital to develop a virtual simulator to train staff in the case of various disasters.
The EVAC+ Virtual Disaster Simulator leverages virtual reality to train health care providers to react quickly and effectively to unexpected events that threaten patient safety. It was developed after pilot live simulations, user focus groups, surveys, and a successful pilot of the initial EVAC system. After the pilot identified the need for more communication opportunities and more variation in disaster responses, EVAC+ was created. EVAC+ retains the same ability to prepare patient equipment for an evacuation but adds significantly more features, allowing for interaction with the hospital room, other staff, and family members. Upon entering the EVAC+ environment, providers are oriented to the virtual space, including navigation, interacting with objects, and conversing verbally with non-player characters.
Once they are comfortable with the space, they can choose from a menu of disasters to experience, including an earthquake, a fire, and an active shooter event. Each simulation has unique objectives around patient protection, equipment packing, and communication with other staff and families. Learners practice how they would provide a status update to their charge nurse using the SBAR (Situation, Background, Assessment, Recommendation) technique, use closed-loop communication, and respond to anxious family members.
Pre-determined safety behaviors, such as stopping and resolving a mission-critical issue, are prompted and performance is recorded for feedback. After each scenario the user is guided through a self-debriefing on how well they performed on key disaster and communication behaviors. Information on how they managed and packed medical equipment is presented along with evidence-based information on how experts would recommend managing the equipment. Users can choose to repeat any scenario to practice or refine their responses and can explore the scenarios in any order.
The Unity3D environment is accessible in web browsers and on head-mounted VR systems. Amazon Web Services (AWS) supports voice communication and real-time analytics. The EVAC+ system fills a unique need for accessible, interactive, sustainable disaster training for healthcare providers.
Ball State University and the Technische Universität Dortmund have partnered in creating an international exchange program for students studying the design and technology of Intermedia artforms. This program will provide opportunities for students and faculty engaged in experiential, project-based approaches to the collaborative creation of new media art. The Hans Breder Foundation, a non-profit international Intermedia art organization, has contributed a significant body of historic artwork to the Museum Ostwall in Dortmund. The archive will serve as a catalyst for research and production of Intermedia-based projects initiated by the BSU and TU Dortmund exchange. John Fillwalk, also president of the Hans Breder Foundation, initiated the exchange program with the assistance of the Rinker Center for International Programs. He will be present at the formal signing ceremony on November 19th in Dortmund, Germany.
A presentation by Jo Ann Gora, President of Ball State University; Phil Repp, Vice-President of Information Technology; Linda Sweigart, BSU Nursing faculty; and John Fillwalk, IDIA Lab Director, on March 21, 2013, to the American Association of Colleges of Nursing.
The presentation examines how universities like Ball State University use emerging technologies to enrich and expand the utility of course materials at scale by leveraging the inherent production flexibility of digital tools. The advantages of digital technologies when designing virtual participation, collaboration, and interaction are discussed, as well as the pedagogical uses of un-assembling and re-assembling diverse sets of course materials. http://www.aacn.nche.edu/webinars/2013/03/21/designing-the-unbundling
The Las Americas Virtual Design Studio (LAVDS) is made possible through a collaboration among IDIA, the College of Architecture and Planning at Ball State University, the Las Americas Network, and the professional firm BSA LifeStructures. The project linked 11 international departments of architecture, over 100 students, and 11 faculty in this immersive collaboration environment. The students worked within their respective studios under the direction of their local instructors. At the same time, each student was assigned to an international virtual studio made up of students from different universities under the advisory of several virtual instructors. The roster of virtual instructors was made up of the instructors of all participating studios and a number of professional reviewers who did not have students at their particular location.
The LAVDS is a beacon that not only attracts visitors but encourages interaction on multiple levels: it is depicted as an organic system – a radial configuration of pods, or petals, providing spaces for small group use around a large group meeting place at the center, marked by a tall mast-like element. The structure relied on configurable interfaces to facilitate large and small interactions. For instance, it could be set up for a small critique or a large group lecture session.
On November 10, the Las Americas Network held the international grand opening of its Virtual Design Studio. “The nature of the tower’s architecture is specific to Second Life. There are no stairs, and dynamic pods and audio bubbles allow groups to meet together or ‘fly’ away to have private meetings,” said Guillermo Vasquez de Velasco, dean of CAP.
The project was created on a Second Life island at the Ball State University Instructional Campus. Constraints the designers faced included the 96′ x 96′ x 96′ cubic site, a limit of 40 avatars able to interact simultaneously, and a budget of 15,000 geometric primitives. The project was required to celebrate the spirit of the virtual design studio while leveraging technologies and capabilities for the interaction (work and play) of the LAVDS participants in ways that have not been possible outside of a virtual environment.
Students from Ball State University’s Institute for Digital Intermedia Arts created a music video for recording artist Ki: Theory. The entirely animated video for the song “Holiday Heart” is the artist’s first professionally produced video for broadcast on popular music television venues. This immersive learning experience connected students to an industry partner, challenged their production skills, and gave them a look at life after graduation.
The IDIA Seminar’s current project focuses on virtually recreating the original settings of various sculptures found throughout the Ball State University Museum of Art. The project involves scanning five different sculptures using a 3D laser scanner.
This project has allowed Ball State students to get involved with 3D scanning by learning the methods, techniques, and limitations involved in accurate scanning procedures.
The museum scanning project was initially conceived not only to digitally archive a few select sculptures, but also to place them in an animated video that visualizes the art in its original context, before the works found their way to the Ball State University Museum of Art.
Sculptures such as Adolph Alexander Weinman’s “Descending Night” and “Rising Sun” were originally meant to be viewed at the Panama-Pacific International Exposition of 1915 in San Francisco. The students intend to visually replicate that scene, along with how the artist framed the works for viewing, with the help of a Z Corporation ZScanner 700 used at high resolution. Scanning typically takes the class around six hours for a human-sized sculpture. The ZScanner can scan an object in tight spaces and do so in real time with one continuous scan. Once the scan data are acquired, they can be transferred to a wide variety of digital and physical media. Students then place the models within an animation using Autodesk Maya. A high-resolution viewer and interactive touch screens are also used to view the models. Students are also investigating a method of 3D prototyping the models as smaller, more reproducible copies.
IDIA, in partnership with the BSU School of Nursing, developed and launched a new Nursing Interview Simulator in the virtual world of Blue Mars. Blue Mars is a next-generation, high-fidelity virtual world built on the CryEngine game engine. Student nurses practice interviews via role-playing, using avatars with predefined health histories.
The IDIA has assembled an interdisciplinary group of students, faculty, and industry experts to build a significant interactive information portal for the Ball State University Museum of Art (BSUMA). The IDIA team is developing an innovative touch-based interface to navigate the collection, integrating Microsoft Surface with the Museum’s database. The Surface will afford Museum patrons a collaborative, participatory public platform through which to access metadata and media of the physical exhibition – as well as extending virtually into the permanent collection. Using the Digital Images Delivered Online (DIDO) database, the interface will make visible the interconnections between works in the collection queried on searchable parameters such as artist, medium, period, and subject. This two-semester immersive project, supported by the BSU Provost’s Immersion Initiative, has been team-taught and has recruited students from targeted disciplines across campus.
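As a simple illustration of this kind of faceted querying, the sketch below filters a small in-memory set of records by parameters like artist and medium and then finds related works that share a facet value. The record fields and data are hypothetical placeholders, not the DIDO schema.

```python
# Hypothetical faceted search over collection metadata (not the DIDO schema).
RECORDS = [
    {"id": 1, "title": "Descending Night", "artist": "Weinman", "medium": "bronze", "period": "20th c."},
    {"id": 2, "title": "Rising Sun",       "artist": "Weinman", "medium": "bronze", "period": "20th c."},
    {"id": 3, "title": "Untitled Study",   "artist": "Unknown", "medium": "oil",    "period": "19th c."},
]

def facet_search(records, **facets):
    """Return records matching every supplied facet, e.g. artist='Weinman'."""
    return [r for r in records if all(r.get(k) == v for k, v in facets.items())]

def related_works(records, work, keys=("artist", "medium", "period")):
    """Other records sharing at least one facet value with the given work."""
    return [r for r in records
            if r["id"] != work["id"] and any(r[k] == work[k] for k in keys)]

hits = facet_search(RECORDS, medium="bronze")
print([r["title"] for r in hits])
print([r["title"] for r in related_works(RECORDS, hits[0])])
```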
Microsoft Surface represents a fundamental change in the potential for interaction with digital content. The Surface is a 30” tabletop interface environment that allows several people to work independently or collaboratively – all without using a mouse or a keyboard. The Surface allows users to navigate information physically and manipulate it with natural gestures and touch: http://www.microsoft.com/SURFACE/product.html#section=The%20Product.
The Ball State University Museum of Art has a collection of nearly 11,000 works of art. Central to the Museum’s mission is the provision of educational programming that will further faculty, staff, student, and public utilization and understanding of the Museum and its collection, of museums in general, and of the arts. To accomplish this goal, the Museum offers tours, talks, materials for teachers, an education database, and artist demonstrations. The Museum’s education philosophy centers on bringing together the needs of visitors and the resources of the Museum, and on balancing responsibility to the university with service to the community. In facilitating communication between the work of art and the viewer, the Museum subscribes to a philosophy of learner-centered programming informed by a thorough knowledge of the collection and of methods and strategies for effective teaching.
Although the Museum’s collection numbers nearly 11,000 works, given the physical constraints of the facility, a mere 10% of them can be displayed at any given time. The incorporation of this hybrid Surface and database system will afford patrons virtual access to the entire collection – making visible those works currently in storage. More importantly, the system will allow patrons to visualize the interconnectivity of the works according to multiple facets – visually re-contextualizing the works in relation to specified search parameters. This form of innovative technology within a museum context would typically be installed only at major metropolitan institutions. Development of this interface at the Ball State University Museum of Art will not only benefit local community groups and BSUMA patrons, whose access to the collection will be significantly augmented, but also has the potential to influence other museums through the distribution of the project’s outcomes.
The Institute for Digital Intermedia Arts (IDIA) has a history of providing interdisciplinary immersion and new media experiences, employing pedagogy that supports students’ transformation into active, engaged learners. The Institute provides a dynamic exchange between instructor and students, where participants engage in collaborative, inquiry-based communities in an environment that fosters participatory learning. As opposed to a traditional model, where knowledge is imparted by the teacher to the student, participatory learning casts the learner in an active role. Participatory learning communities engage in a dynamic conversation centered on a subject, where analysis, comparison, and evaluation are core to the process of acquiring meaning and relationships. This environment allows students the opportunity to have a stake in the conditions of their learning, making decisions that direct their course of investigation. With a commitment to active inquiry, participants are freed to create personal and collective meaning.
BSU MUSEUM OF ART
COLLECTION PORTAL
SOFTWARE DESIGN
Carrie Arnold
Joel Happ
Garret Orth
Deekshita Reddy
Christopher Ross
Jonathan Strong
Austin Toombs
INTERFACE DESIGN
Anastasia Goryacheva
Steven Lanier
Jonathan Strong
ELECTRONICS
Eric Brockmeyer
Giovanni Rozzi
GRAPHIC DESIGN
Amy Goettemoeller
Ina-Marie Henning
FACULTY
Jesse Allison
John Fillwalk
Paul Gestwicki
PROJECT DIRECTOR
Jonathan Strong
A SPECIAL THANK YOU
UNIVERSITY LIBRARIES
James Bradley
John Straw
Budi Wibowo
SOUND
Rick Baker
Steven Lanier
Giovanni Rozzi
STRUCTURAL DESIGN
Eric Brockmeyer
Matthew Wolak
PROJECT MANAGEMENT
Ina-Marie Henning
Amruta Mhaiskar
Jennifer Weaver-Cotton
PRODUCER/PRINCIPAL INVESTIGATOR
John Fillwalk
BALL STATE UNIVERSITY MUSEUM OF ART
Peter Blume
Carl Schafer
Tania Said
Ball State University Museum of Art
College of Fine Arts
College of Architecture and Planning
Department of Architecture
Department of Art
Department of Computer Science
School of Music
Information Technology
Office of the Provost
University Libraries
and
Jeff Berg, IBM Interactive, IDIA Research Fellow
IDIA Presents: BSU MUSEUM OF ART COLLECTION PORTAL 3/20/11 at 2:30pm BSUMA
BSU MUSEUM OF ART COLLECTION PORTAL
Reception and Gallery Talk
Thursday, March 20 @ 2:30pm, BSU Museum of Art
The BSU Museum of Art Collection Portal was developed by students, faculty and industry research fellows in the Institute for Digital Intermedia Arts at Ball State University. The year-long project was developed in partnership with the Ball State University Museum of Art and sponsored by the Provost’s Immersive Learning Initiative.
This interdisciplinary team from art, computer science, architecture and music has developed an innovative, touch-based application to navigate the collection. The portal bridges the Microsoft Surface interface with the Museum’s collection database, the Digital Images Delivered Online (DIDO), hosted on the Digital Media Repository of Bracken Library. The Surface affords Museum visitors an interactive platform to individually or collaboratively make virtual connections between works of art both on display and in reserve – accessing information and media across the collection.
IDIA is creating a media-rich, interactive digital kiosk for the Digital Fabrication Institute’s MMFX Exhibit hosted by the Indianapolis Museum of Art. Students participating in the IDIA Immersion Seminar in Virtual Worlds are developing the interactive interface, which will act as a station within the exhibit where viewers can obtain additional biographical and portfolio information – employing a custom-designed, coded, and fabricated dynamic media experience.