Virtual Giza Necropolis

BSU IDIA Lab’s Virtual Giza Necropolis is a virtual simulation of the archaeological site on the Giza Plateau, on the outskirts of Cairo, Egypt. This complex of ancient monuments includes the three pyramids known as the Great Pyramids, the massive sculpture known as the Great Sphinx, several cemeteries, a workers’ village and an industrial complex. The site is located some 9 km (5 mi) inland into the desert from the old town of Giza on the Nile. The pyramids, which have historically loomed large as emblems of ancient Egypt in the Western imagination, were popularised in Hellenistic times, when the Great Pyramid was listed by Antipater of Sidon as one of the Seven Wonders of the World. It is by far the oldest of the ancient Wonders and the only one still in existence.

Visit Virtual Giza Necropolis in Blue Mars

http://blink.bluemars.com/City/IDIA_Giza/

Virtual Giza Necropolis in Blue Mars Video Walkthrough

Temple of Artemis: Wonder of the Ancient World

IDIA Lab was contracted by digital humanities scholars at UCLA to design and build a virtual simulation of the Temple of Artemis, one of the Wonders of the Ancient World. This massive Greek temple, four times the size of the Parthenon, lies in ruin in present-day Turkey. The simulation incorporates our CelestialEngine, which accurately positions both the sun and moon using a site’s latitude, longitude, altitude and year via NASA JPL data. This particular simulation studies whether an opening in the temple’s portico allowed moonlight to illuminate the statue of Artemis on her feast day.
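As an illustration of the kind of query such a celestial engine relies on, the sketch below builds a request URL for NASA JPL's public Horizons API. This is not the lab's actual code; the parameter names follow the published Horizons API, and the site coordinates below are illustrative values near Ephesus, not figures from the project itself.

```python
from urllib.parse import urlencode

HORIZONS_URL = "https://ssd.jpl.nasa.gov/api/horizons.api"

def horizons_query(body_id, lon, lat, alt_km, start, stop, step="10m"):
    """Return a Horizons API URL requesting apparent azimuth/elevation
    (QUANTITIES='4') of a body as seen from a geodetic site on Earth."""
    params = {
        "format": "text",
        "COMMAND": f"'{body_id}'",          # '10' = Sun, '301' = Moon
        "EPHEM_TYPE": "OBSERVER",
        "CENTER": "'coord@399'",            # observer on Earth (body 399)
        "COORD_TYPE": "GEODETIC",
        "SITE_COORD": f"'{lon},{lat},{alt_km}'",
        "START_TIME": f"'{start}'",
        "STOP_TIME": f"'{stop}'",
        "STEP_SIZE": f"'{step}'",
        "QUANTITIES": "'4'",                # apparent azimuth and elevation
    }
    return HORIZONS_URL + "?" + urlencode(params)

# Hypothetical example: the Moon as seen from a site near Ephesus.
url = horizons_query("301", 27.36, 37.95, 0.02, "2013-06-21", "2013-06-22")
```

The returned ephemeris table can then drive the positions of the virtual sun and moon for any date the simulation needs.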

REDgrid: BSU Virtual World for Teaching & Learning

REDgrid

IDIA is pleased to announce REDgrid – a 3D virtual campus to support the educational mission of faculty, staff and students. Ball State community members can use it for free as a virtual gathering space including classroom instruction, small group meetings, presentations, panel discussions, or performances. It is a secure environment hosted and managed solely by Ball State’s IDIA Lab. Virtual classrooms or field experiences can be customized as well as made private to suit your needs. REDgrid also offers a developmental platform for online instruction or community displays.

In this video, David gives us a preview of what you are able to do in REDgrid.

Ball State institute creates virtual world to be used for learning, connecting
The Daily News | Amanda Belcher Published 08/31/15 12:27pm Updated 09/01/15 5:15pm
By 2013, the Sims video game series had sold 175 million copies worldwide. Its users could create avatars, build houses—just like in reality.

Ball State’s own REDgrid uses a similar concept. Students can make an avatar, walk around a virtual Ball State campus and interact with other avatars via written messages or a headset in this open simulator.

Ball State’s Institute for Digital Intermedia Arts (IDIA) developed REDgrid.

“The server is physically housed at BSU IDIA Lab and was installed in collaboration with Information Technology,” said John Fillwalk, the director of Ball State’s IDIA Lab.

When it comes to REDgrid, the possibilities can seem limitless—and some faculty members have already begun testing the simulator’s boundaries.

Mai Kuha, an assistant professor of English, used REDgrid in an honors class as a gender identity project. Students were assigned a gender and told to create an avatar of that gender. This enabled them to observe how people of opposite genders are treated differently.

Kuha isn’t the only professor to use REDgrid as a learning tool in the classroom. Ann Blakey, an associate professor of biology, used it in one of her courses for an environmental ethics activity.

Students were assigned a role, such as scientist or environmentalist, and were invited to explore the virtual environment and file a report. The activity gave students the opportunity to see the environment from different perspectives.

Fillwalk envisions even more opportunities for the platform.

“Ball State community members can use it for free as a virtual gathering space including classroom instruction, small group meetings, presentations, panel discussions or performances,” Fillwalk said.

The virtual classrooms and field experiences can be customized to fit each teacher’s needs.

This kind of creative teaching is what Stephen Gasior, an instructor of biology, is looking to expand upon. An enthusiastic voice for REDgrid, Gasior encourages professors and students to utilize this tool. He explains that it gives professors the ability to shape an environment and allows students to experience any number of events or situations.

REDgrid isn’t just for academic purposes either, Gasior said. Students can use the avatars for social experiments too.

“REDgrid represents Ball State’s campus, and international or online students get [the] feeling of being in the campus environment,” he said.

Fillwalk fully understands this aspect of REDgrid.

“We designed it to be a flexible platform connecting BSU faculty and students to international,” he said.

Connection is key with REDgrid. Gasior stressed that it can help build and connect communities—Ball State or otherwise.

Ball State is already working with faculty at other universities so the tool can be used on campuses beyond Ball State’s, Fillwalk said.

“The platform could certainly continue to be expanded,” he said.

Gasior has plans for the future—like researching developmental grants and expanding REDgrid. But IDIA staff can only do so much.

“People who come to REDgrid and have a passion for it will shape the road it will take,” Gasior said.

http://www.ballstatedaily.com/article/2015/08/redgrid-virtual-world

IDIA Lab wins the Bronze Medal at iENA Exposition in Nuremberg, Germany

IDIA Lab’s Virtual Pantheon/Kinect project in Blue Mars and our occupational simulation for Alzheimer’s patients won the Bronze Medal at the iENA International Trade Fair for Ideas, Inventions and New Products in Nuremberg, Germany. http://www.iena.de/en/home.html

The Virtual Pantheon Project in Blue Mars is a multi-user simulation of the Roman Pantheon and its surroundings as it might have existed in AD 320. The structure is the largest and most intact surviving example of ancient Roman architecture. The building was commissioned by Marcus Agrippa during the reign of Augustus as a temple to all the gods of ancient Rome, and rebuilt by the emperor Hadrian about 126 AD. The simulation was created by the Institute for Digital Intermedia Arts at Ball State University in consultation with the University of Otago, NZ and the Virtual World Heritage Laboratory at Indiana University. It leverages the significant graphic capabilities of the CryEngine to create a compelling immersive experience. IDIA Lab and Avatar Reality worked together to design an interactive solar tracker that allows the visitor to change the time of day around important celestial and celebratory events in ancient Rome. More detail here: http://bit.ly/18cfDKu. The project can also be interacted with via an MS Kinect sensor directly in Blue Mars.

oneWAPIHANNE: White River data Visualization

oneWAPIHANNE is an interactive artwork visualizing live streaming data from water stations along the White River in Indiana. The stations are installed and monitored by the United States Geological Survey (USGS), reporting the current gauge height and water speed over time. The visualization graphically represents each station’s data in its entirety or individually – with overlays for station locations and record river levels. Wapihanne is the name for the White River in the Delaware language.

The chromatic tiles in the upper left represent the gauge stations along the White River. The monochromatic tiles in the upper right represent supplemental data overlays – displaying a line graph of the river height over time, a compressed map of the White River with the approximate location of the gauge stations, and the record levels for each station. In the bottom right of the screen is a gray tile that refreshes the data and restarts the program.

Each flow of station data represents river height and velocity over the course of the last thirty days. The top of the flow is the height, and the size of the flow is related to velocity. Touching and dragging a specific flow reveals detailed information about that particular reading – creating a sound with volume tied to river velocity.
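The mapping described above can be sketched as a small normalization step: gauge height drives each ribbon's vertical position, and velocity drives both its thickness and the volume of the sound it triggers. All function names, ranges and units here are hypothetical simplifications, not the artwork's actual code.

```python
def normalize(value, lo, hi):
    """Clamp value into [lo, hi] and scale it to the 0..1 range."""
    value = max(lo, min(hi, value))
    return (value - lo) / (hi - lo)

def flow_geometry(readings, record_height, max_velocity):
    """Map (gauge_height_ft, velocity_fps) pairs to (top, thickness, volume)
    tuples in normalized screen/audio units."""
    out = []
    for height, velocity in readings:
        top = normalize(height, 0.0, record_height)      # vertical position
        thickness = 0.2 + 0.8 * normalize(velocity, 0.0, max_velocity)
        volume = normalize(velocity, 0.0, max_velocity)  # sound on touch
        out.append((top, thickness, volume))
    return out

# Two illustrative readings against a 12 ft record level and 4 ft/s max flow.
sample = [(3.2, 1.1), (4.8, 2.6)]
shapes = flow_geometry(sample, record_height=12.0, max_velocity=4.0)
```

Normalizing against each station's record level is what lets the overlay of record river heights share the same vertical scale as the live flows.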

Virtual Roman Pantheon in Blue Mars / CryEngine

The Pantheon is the best-preserved architectural monument of ancient Rome. This simulation by BSU’s IDIA Lab represents the Pantheon and its surrounds as it may have appeared in 320 AD. Visitors to this Blue Mars / CryEngine simulation can tour the vicinity, learning about the history, function and solar alignments through an interactive heads-up display created for this project. The project opened in beta in late 2013 and premieres publicly in February 2014. It includes new solar simulation software calibrated to the building’s location and year, an interactive HUD, a greeter bot system and a new AI non-player character system developed in partnership between IDIA Lab and Avatar Reality.

Originally built by Agrippa around 27 BC under the rule of Augustus, the Pantheon was destroyed by fire, then rebuilt and finally completed in its present form during Emperor Hadrian’s reign, around 128 AD. Agrippa had completed the original building, and it is his name that appears above the portico. The Pantheon would have contained numerous marble statues representing the major Roman deities; the statues displayed in this simulation represent a possible configuration and were scanned via photogrammetry. The buildings surrounding the Pantheon were modeled and interpreted by IDIA based on the large-scale model of ancient Rome built by Italo Gismondi between 1935 and 1971, which resides in the Museo della Civiltà Romana, just outside of Rome, Italy.

Video walkthrough of the Virtual Pantheon in Blue Mars:

To visit the Virtual Pantheon firsthand:

First, create an account: https://member.bluemars.com/game/WebRegistration.html/
Second, download the client: http://bluemars.com/BetaClientDownload/ClientDownload.html/
Finally, visit the Virtual Pantheon: http://blink.bluemars.com/City/IDIA_IDIALabPantheon/

Advisors

The Solar Alignment Simulation of the Roman Pantheon in Blue Mars was developed under consultation with archeo-astronomer Dr. Robert Hannah, Dean of Arts and Social Sciences at the University of Waikato, New Zealand, one of the world’s foremost scholars on Pantheon solar alignments; and archaeologist Dr. Bernard Frischer, Indiana University.

Background

The Pantheon that we can visit today is composed of a rectangular porch with three rows of granite columns in front of a circular building capped by a huge hemispherical dome (142 feet in diameter), built over a cylinder of the same diameter and as high as the radius. The ideal completion of the upper hemisphere by a hypothetical lower one would therefore touch the central point of the floor, directly under the building’s unique source of natural light. This light source is the so-called oculus, a circular opening over 27 feet wide at the top of the cupola. It is the only source of direct light, since no direct sunlight can enter through the door in the course of the whole year, owing to the northward orientation of the entrance doorway. Of the building’s original embellishments, the coffered ceiling, part of the marble interior, the bronze grille over the entrance and the great bronze doors have survived.

Interior Wall

The interior wall, although circular in plan, is organized into sixteen regularly spaced sectors: the northernmost contains the entrance door, and then (proceeding in a clockwise direction) pedimented niches and columned recesses alternate with each other. Corresponding to these ground-level sectors are fourteen blind windows in the upper, attic course, just below the offset between the cylinder and the dome. It is likely that both the niches and the windows were meant for statues, which, however, have not survived.

Oculus

Direct sunlight penetrates the interior only through a large, 27-foot-wide oculus in the center of the domed roof. Otherwise, indirect sunlight can enter the building, but only through the large, north-facing doorway when it is open. The fall of direct sunlight through the oculus into the essentially spherical building invites comparison with a roofed sundial.

Celestial Alignments

A columned porch leads through a vestibule of the Pantheon into a huge, shadowy interior, over 142 feet in height and as much in diameter. The building’s form is essentially that of a sphere with its lower half transformed into a cylinder of the same radius. Direct sunlight penetrates the interior only through a large, 27-foot-wide oculus in the centre of the domed roof.

The shift from one semester to the other is marked by the passage of the sun at the equinoxes in March and September. At this point the noontime sun shines partially just below the dome, passing through the grill over the entrance doorway and falling on the floor of the porch outside. More significantly, however, the centre of this equinoctial, midday circle of sunlight lies on the interior architectural moulding, which marks the base of the dome.

On April 21st, the midday sun shines directly on to visitors to the Pantheon when they are standing in the open doorway, dramatically highlighting them. This day is of particular significance, not just because this was when the sun entered Taurus, but more because it is the traditional Birthday of Rome, a festival preserved from antiquity right through to the present day. And it may be that when the building was officially commissioned in AD 128, the person expected to be standing in the open doorway was the emperor Hadrian himself.

The illustration indicates a section through the Pantheon, showing the fall of the noon sunlight at the winter solstice, when the sun is at altitude 24 degrees; noon sunlight, at both equinoxes at altitude 48 degrees; noon sunlight on April 21st, when the sun is at altitude 60 degrees; and finally, noon sunlight at the summer solstice, when the sun is at altitude 72 degrees.
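The altitudes in the illustration follow from basic solar geometry: at local solar noon the sun's altitude is 90 degrees minus the site latitude plus the solar declination. A quick check using Rome's latitude (about 41.9° N) reproduces the quoted figures to within about a degree; the +12° declination used for April 21 is an approximation.

```python
def noon_altitude(latitude_deg, declination_deg):
    """Solar altitude above the horizon at local solar noon, in degrees:
    90 - latitude + declination (valid for a site north of the tropics)."""
    return 90.0 - latitude_deg + declination_deg

ROME_LATITUDE = 41.9  # degrees north (approximate)

winter = noon_altitude(ROME_LATITUDE, -23.44)  # winter solstice, ~24.7 deg
equinox = noon_altitude(ROME_LATITUDE, 0.0)    # equinoxes, ~48.1 deg
april21 = noon_altitude(ROME_LATITUDE, 12.0)   # ~April 21, ~60.1 deg
summer = noon_altitude(ROME_LATITUDE, 23.44)   # summer solstice, ~71.5 deg
```

The equinox value is the reason the midday circle of light falls exactly on the moulding at the base of the dome: the dome springs at 45° from the oculus, close to the 48° equinoctial altitude.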

Meaning of Pantheon

The Pantheon is a building in Rome, Italy commissioned by Marcus Agrippa during the reign of Augustus as a temple to all of the gods of ancient Rome, and rebuilt by the emperor Hadrian about 126 AD.

Pantheon is an ancient Greek composite word meaning All Gods. Cassius Dio, a Roman senator who wrote in Greek, speculated that the name comes either from the statues of so many gods placed around this building, or from the resemblance of the dome to the heavens.

“Agrippa finished the construction of the building called the Pantheon. It has this name, perhaps because it received among the images which decorated it the statues of many gods, including Mars and Venus; but my own opinion of the name is that, because of its vaulted roof, it resembles the heavens.”

-Cassius Dio History of Rome 53.27.2

Augustus

Augustus was the founder of the Roman Empire and its first Emperor, ruling from 27 BC until his death in 14 AD.

The reign of Augustus initiated an era of relative peace known as the Pax Romana (The Roman Peace). Despite continuous wars of imperial expansion on the Empire’s frontiers and one year-long civil war over the imperial succession, the Roman world was largely free from large-scale conflict for more than two centuries. Augustus dramatically enlarged the Empire, annexing Egypt, Dalmatia, Pannonia, Noricum, and Raetia, expanded possessions in Africa, expanded into Germania, and completed the conquest of Hispania.

Beyond the frontiers, he secured the Empire with a buffer region of client states, and made peace with the Parthian Empire through diplomacy. He reformed the Roman system of taxation, developed networks of roads with an official courier system, established a standing army, established the Praetorian Guard, created official police and fire-fighting services for Rome, and rebuilt much of the city during his reign.

Augustus died in 14 AD at the age of 75. He may have died from natural causes, although there were unconfirmed rumors that his wife Livia poisoned him. He was succeeded as Emperor by his adopted son (also stepson and former son-in-law), Tiberius.

Agrippa

Marcus Vipsanius Agrippa (c. 23 October or November 64/63 BC – 12 BC) was a Roman statesman and general. He was a close friend, son-in-law, lieutenant and defense minister to Octavian, the future Emperor Caesar Augustus; he was also father-in-law of the Emperor Tiberius, maternal grandfather of the Emperor Caligula, and maternal great-grandfather of the Emperor Nero. He was responsible for most of Octavian’s military victories, most notably the naval Battle of Actium against the forces of Mark Antony and Cleopatra VII of Egypt.

In commemoration of the Battle of Actium, Agrippa built and dedicated the building that served as the Roman Pantheon before its destruction in 80 AD. Emperor Hadrian used Agrippa’s design to build his own Pantheon, which survives in Rome. The inscription of the later building, which was built around 125, preserves the text of the inscription from Agrippa’s building during his third consulship. The years following his third consulship, Agrippa spent in Gaul, reforming the provincial administration and taxation system, along with building an effective road system and aqueducts.

Arch of Piety

The Arch of Piety is believed to have stood in the piazza immediately north of the Pantheon. Statements made in medieval documents imply, but do not specifically say, that the scene of Trajan and the widow was represented in a bas-relief on the arch – narrating the story of the emperor and a widow while suppressing the emperor’s name. The probable source, the medieval guidebook of Rome known as the Mirabilia Romae, does not even state that the arch was built in commemoration of the event; it mentions the arch and then says that the incident happened there.

Giacomo Boni discusses the legend of Trajan, giving many interesting pictures which show how the story was used in medieval painting and sculpture. He found a bas-relief on the Arch of Constantine which he thinks may have given rise to the story. It shows a woman sitting, her right hand raised in supplication to a Roman figure, who is surrounded by other men, some in military dress, and two accompanied by horses. Boni suggests that the Middle Ages may have supposed this figure to be Trajan because of his reputation for justice.

Saepta Julia

The Saepta Julia was a building in Ancient Rome where citizens gathered to cast votes. The building was conceived by Julius Caesar and dedicated by Marcus Vipsanius Agrippa in 26 BC. The building was originally built as a place for the comitia tributa to gather to cast votes. It replaced an older structure, called the Ovile, which served the same function. The building did not always retain its original function. It was used for gladiatorial fights by Augustus and later as a market place.

The conception of the Saepta Julia began during the reign of Julius Caesar (died 44 BC). Located in the Campus Martius, the Saepta Julia was built of marble and surrounded a huge rectangular space next to the Pantheon. According to a letter written by Cicero to his friend Atticus about the project, Caesar planned for the building to be made of marble and to have a mile-long portico. The quadriporticus (a four-sided portico, like the one used for the enclosure of the Saepta Julia) was an architectural feature made popular by Caesar.

After Caesar’s assassination in 44 BC, and in the backlash of public support for the former ruler, men continued to work on projects that Caesar had set into motion. Marcus Aemilius Lepidus, who used to support Caesar and subsequently aligned with his successor Octavian, took on the continuation of the Saepta Julia building project. The building was finally completed and dedicated by Marcus Vipsanius Agrippa in 26 BC. Agrippa also decorated the building with marble tablets and Greek paintings.

The Saepta Julia can be seen on the Forma Urbis Romae, a map of the city of Rome as it existed in the early 3rd century AD. Part of the original wall of the Saepta Julia can still be seen right next to the Pantheon.

– edited from Robert Hannah, “The Pantheon as Timekeeper”, 2009.
________

Simulation by the Institute for Digital Intermedia Arts at Ball State University
Project Director: John Fillwalk, Senior Director, IDIA Lab, BSU.
IDIA Staff: Neil Zehr, Trevor Danehy, David Rodriguez, Ina Marie Henning, Adam Kobitz

PROJECT ADVISORS:
Dr. Robert Hannah, University of Waikato, New Zealand
Dr. Bernard Frischer, Virtual World Heritage Laboratory, Indiana University, USA

SPECIAL THANKS:
Shinichi Soeda, Avatar Reality

Virtual Art Museum and Virtual Collaboration Center Projects in AvayaLive Engage

The IDIA Lab has developed two new simulations for AvayaLive Engage – a multi-user virtual world platform based on the Unreal Game Engine. The Virtual Collaboration Center is a multi-use teaching and learning facility prototype for use by large and small groups, containing rich media and collaboration tools including desktop sharing, presentations, a drop box, whiteboards, streaming video and webcam. The Virtual Art Museum is a demonstration of an interactive museum experience with links to companion webpages, including a database that allows for extremely close examination of the paintings. Both simulations work on Mac and PC within industry-standard web browsers and will launch in May 2013. You can visit the project here: https://wa11619.avayalive.com/11619/html/index.html Certain areas need administrative permission – we will be opening these up in the near future.

AvayaLive Engage
http://avayalive.com/Engage/Products.aspx
Unreal Game Engine UDK
http://www.unrealengine.com

Virtual Nursing Simulator

HDT and IDIA Lab have developed a next-generation, high-fidelity Virtual Nursing Simulator in the Unity 3D game engine for testing by the Ball State University School of Nursing. Students will practice interviews via role-playing – employing avatars with predefined health histories. The simulator introduces our new media tools within the environment, including the ability to play video, webcam, screen sharing, presentations and webpages. Additionally, the simulator allows for interdisciplinary immersion – providing a context for student nurses to role-play scenarios alongside doctors, health insurance representatives, administrators and other professionals. Desktop, web and tablet versions are in development.
Video preview here: http://www.youtube.com/watch?v=Asz92FuhEyc&feature=share&list=UUtQrLF1JPprEvP4AO-iSNvw

National Endowment for the Humanities: Advanced Topics in the Digital Humanities Summer Institute 2013

IDIA Lab Director John Fillwalk will be presenting at the National Endowment for the Humanities: Advanced Topics in the Digital Humanities Summer Institute 2013, sponsored by the Center for Digital Initiatives at Arkansas State University.

About & Schedule

This NEH Summer Institute for Advanced Topics in the Digital Humanities will take place from June 17, 2013 – July 6, 2013. The three-week institute will be hosted by the Center for Digital Initiatives (CDI) at Arkansas State University (ASU), Jonesboro campus in northeast Arkansas in the Mississippi Delta region, and the Center for Advanced Spatial Technologies (CAST) at the University of Arkansas, Fayetteville (UAF), located in northwest Arkansas. The first two weeks of the institute will be held at the CDI at Arkansas State University Jonesboro, AR campus. Participants will then travel as a group to the University of Arkansas in Fayetteville, AR for the final week of the institute.

This summer institute brings together twenty scholars working in the humanities who have research or teaching projects that would benefit from real-time visualization in a game engine, published as standalone applications, web players, or on mobile devices. In a three-week institute, participants will be provided with a conceptual road map to the difficult but intellectually productive issues that surround the academic use of game engines, including the balance of immersion with accuracy, strategies for storytelling and graphical user interfaces (GUIs) in “serious” games, and questions of power and appropriateness in using video game conventions to represent non-contemporary or non-Western cultures.  Participants will also receive hands-on training in the digital toolbox for creating game engine content, a basic workflow that they would be able to use in their own projects and bring back to their home institutions.  Training will include VSim, Unity 3D, online multi-user virtual environments, Google SketchUp, 3D Studio Max, Cinema 4D, as well as a broad range of open-source programs.  No prior knowledge or experience in 3D modeling will be assumed.

A unique feature of the institute is the breadth of cultural heritage content it incorporates.  This includes travel to ASU Heritage sites modeled by the CDI including the Lakeport Plantation in Lake Village, the boyhood home of Johnny Cash in Dyess, the Hemingway-Pfeiffer house and studio in Piggott, and the Japanese-American internment camp in Rohwer. At UAF participants will also participate in a Unity tutorial focused on the House of the Ara Massima in Pompeii. In bringing together an impressive group of lecturers who specialize in the use of 3D visualization and game engines as research tools in the digital humanities, the institute creates an important resource in the form of a community of scholars—which allows for future collaborations between individuals and universities.

“Designing the Unbundling of Courses” for the American Association of Colleges of Nursing

A presentation by Jo Ann Gora, President of Ball State University; Phil Repp, Vice President of Information Technology; Linda Sweigart, BSU Nursing Faculty; and John Fillwalk, IDIA Lab Director, on March 21, 2013 to the American Association of Colleges of Nursing.

The presentation examines how universities like Ball State use emerging technologies to enrich and expand the utility of course materials at scale by leveraging the inherent production flexibility of digital tools. It discusses the advantages of digital technologies when designing virtual participation, collaboration, and interaction, as well as the pedagogical uses of un-assembling and re-assembling diverse sets of course materials.
http://www.aacn.nche.edu/webinars/2013/03/21/designing-the-unbundling

Izapa Group F Ballcourt Solar Simulation

Mayan Solar Simulator
Izapa Group F Ballcourt

We have applied our Virtual Solar Simulator to a simple white model of the pre-Columbian ballcourt in Group F at the Izapa site in Chiapas, Mexico. Izapa is considered to be the birthplace of the Mayan Long Count, which ends its cycle today, December 21st, 2012 – the winter solstice. Viewed today, the site is oriented on an axis aligned generally, but not precisely, to sunrise on the winter solstice and to sunset on the summer solstice. In direct observation today, the alignment along the axis of the ballcourt from throne #2 to stela #60 is off-axis by approximately two degrees.

The simulator can be visited here:

https://projects.idialab.org/izapasolarsimulator.html

* The simulation requires Internet Explorer and the Unity plugin.

The solar simulator developed by IDIA Lab at Ball State University polls the NASA/JPL Horizons database (http://ssd.jpl.nasa.gov/horizons.cgi) to calculate the positions of celestial objects (sun, moon, planets, comets, etc.). The database accounts for the Chandler wobble – a periodic deviation in the rotation of the Earth’s axis. Archeo-astronomical alignments viewed today at Izapa appear to be off-axis; however, when compensated for changes in time, rotation, position and the Chandler effect via this simulation method, the solstice events return to their original orientation for the era of their construction and can be observed to be in actual alignment.
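Once an ephemeris service such as Horizons returns the sun's apparent azimuth and elevation for a site and date, a simulator still has to convert those angles into a light direction. A minimal sketch of that final step, assuming a y-up engine convention with z pointing to true north (the actual IDIA implementation is not published):

```python
import math

def azel_to_vector(azimuth_deg, elevation_deg):
    """Direction *toward* the sun as (east, up, north) unit-vector components.
    Azimuth is measured clockwise from north; elevation is above the horizon."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    east = math.cos(el) * math.sin(az)
    up = math.sin(el)
    north = math.cos(el) * math.cos(az)
    return (east, up, north)

# Sun due south (azimuth 180 deg) at 45 deg altitude:
# the vector points up and south, with no east component.
e, u, n = azel_to_vector(180.0, 45.0)
```

Negating this vector gives the direction a directional light should shine, which is how a time-of-day slider can sweep the virtual sun across a geolocated model.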

A b’ak’tun is 144,000 days in length – almost 400 years. In the Mayan Long Count it took 13 b’ak’tuns to progress through a full cycle of creation. Dec. 21, 2012, marks the end of the 13th b’ak’tun of this Mayan calendar, and after today’s solstice the next Long Count cycle begins again – another roughly 5,125 years.
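The calendar arithmetic is easy to verify:

```python
# Long Count units nest in base 20, except the tun (18 winals of 20 days).
DAYS_PER_BAKTUN = 20 * 20 * 18 * 20   # kin -> winal -> tun -> k'atun -> b'ak'tun
TROPICAL_YEAR = 365.2422              # mean solar days per year

baktun_years = DAYS_PER_BAKTUN / TROPICAL_YEAR        # ~394 years
cycle_years = 13 * DAYS_PER_BAKTUN / TROPICAL_YEAR    # ~5,125 years
```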

Happy New Mayan Long Count!
– John
December 21st, 2012

Simulator:

https://projects.idialab.org/izapasolarsimulator.html

* The simulation requires Internet Explorer and the Unity web player plugin (http://unity3d.com/webplayer/).

Note on display time and date

Dates are in the year 300 BCE/BC.
Time is represented in Universal Time (UT1/UTC/GMT).
To obtain local time for the Izapa site, subtract 6 hours from the UT time displayed in the simulator.
The next release will provide solstice and equinox presets for the year 2012 for comparison.

Camera control

To switch cameras, press 1 for the first-person view, 2 for the bird’s-eye view and 3 for the top-down view.
To pan, use the WASD keys; to zoom, use the middle mouse button. To rotate the camera, use Ctrl-click or the right mouse button.

Background

The model is geolocated and accurately referenced in 3D space by latitude, longitude, orientation and elevation. The Horizons database is then queried using these coordinates as well as the year of interest – in this case 300 BCE, chosen as a peak within the period in which the Izapa site was inhabited.

IDIA Lab has developed the celestial simulation – which can also track the moon, planets and other bodies – to support various cultural heritage and archeo-astronomy projects in collaboration with international scholars. Other projects utilizing the Celestial Simulator include simulations of Hadrian’s Villa, the Roman Pantheon, the Solarium Augusti in Rome (the largest sun calendar of the ancient world), the Temple of Artemis in modern Turkey (one of the ancient wonders) and Stonehenge.

Virtual Artworks

Virtual Artworks

http://www.idialabprojects.org/displacedresonance/virtual.html

 

PROXY

John Fillwalk

with Michael Olson, Composer and IDIA Lab. 2012.

In the virtual installation, PROXY, visitors shape the construction of a sculptural and sonic response to the virtual museum. The work progresses to form, eventually transforming to become structural support for the building. When multiple users are in the environment, their avatars interact with one another to create collaborative painting and sculpture.

 

FLICKR™ GETTR v6

John Fillwalk

with Jesse Allison, Composer and IDIA Lab. 2012.

FLICKR™ GETTR v6 connects the social image web service Flickr™ to the virtual museum environment, allowing visitors to create a dynamic cloud of spatial imagery by entering a search term of their choice that pulls related images from Flickr™ into the virtual environment.

 

SURVEY FOR BEIJING

John Fillwalk

with Keith Kothman, Composer and IDIA Lab. 2012.

SURVEY FOR BEIJING is an immersive landscape simulation using real time weather data from the physical location in Beijing, China. Representations of surveyor’s tape, flags, light, time of day, wind and clouds are superimposed onto the virtual landscape in accordance with real-life weather data.

 

con|FLUENCE

John Fillwalk

with Michael Olson, Composer and IDIA Lab. 2012.

Participants in con|FLUENCE create pathways based on reactions to both social and spatial relationships. There is a virtual response and persistence to interactions, as the routes are drawn and sculpted in the three-dimensional environment – forming nodes created via visitors’ proximity.

 

Third Art and Science International Exhibition

China Science and Technology Museum in Beijing

http://www.tasie.org.cn/index.asp

 

 

TASIE Press release

http://www.tasie.org.cn/content_e.asp?id=84

 

Ball State artists create “forest” of light on display in China and Internet

Muncie, Ind. — Ball State University electronic artists have created a “forest” of light and sound that will be on exhibit in Beijing, China through November, yet also accessible to visitors from Indiana or anywhere else in the world.

That’s possible because “Displaced Resonance,” as the interactive art exhibit is known, has both real-life and virtual components.

The physical portion has been installed in a gallery of the China Science and Technology Museum in Beijing. There, in-person visitors can negotiate a thicket of 16 interactive sculptures spaced 1.5 meters apart that will change colors and emit music as they approach.

A digital replica of the layout, meanwhile, resides on the Internet, accessible through the museum’s website. Online visitors can wander the virtual exhibit using an avatar, and the digital pillars will change colors and produce sounds, just like their physical counterparts.

But that’s not all — the two pieces interact with each other, says John Fillwalk, director of Ball State’s Institute for Digital Intermedia Arts (IDIA) and Hybrid Design Technologies (HDT), which created the work in collaboration with IDIA staff, students and composer Michael Pounds, BSU.

When an online avatar approaches a virtual pillar, the corresponding real-life column also will change colors, and vice versa. In-person and virtual visitors will produce different colors, however, allowing them to track each other through the exhibit.

“It’s what we call hybrid art,” says Fillwalk. “It’s negotiating between the physical world and the virtual. So it’s both sets of realities, and there’s a connection between the two.”

The physical pillars are two meters (or more than 6 feet, 6 inches) tall. They consist of a wooden base containing a sound system, a translucent pillar made of white corrugated plastic, and computer-controlled lighting.

A thermal camera mounted on the museum’s ceiling keeps track of visitors and feeds its data to a computer program that directs the columns to change color and broadcast sounds when someone draws near.

“It’s a sensory forest that you can navigate,” Fillwalk says.

Two final touches: a video screen mounted on a museum wall overlooking the exhibit allows in-person visitors to watch avatars move around the virtual version, while Internet patrons can keep tabs on the real-life display through a window on their computer screens.

“Displaced Resonance” is the centerpiece of Ball State’s contributions to the Beijing museum’s 3rd Art and Science International Exhibition and Symposium, a month-long celebration of technology and the arts. Ball State was invited to participate because museum curators discovered some of IDIA’s work and liked what they saw, Fillwalk said.

In addition to “Displaced Resonance,” IDIA contributed four other pieces of digital art that museum visitors can view at a kiosk.

Those pieces are:

· “Proxy,” in which visitors create, color and sculpt with floating 3D pixels.

· “Flickr Gettr,” in which visitors can surround themselves with photos from the Flickr web service that correspond to search terms they submit.

· “Confluence,” in which users create virtual sculptures by moving around the screen and leaving a path in their wake.

· “Survey for Beijing,” in which real-time weather data from Beijing is dynamically visualized in a virtual environment.

 

(Note to editors: For more information, contact John Fillwalk, director of the Institute for Digital Intermedia Arts, at 765-285-1045 or jfillwalk@bsu.edu; or Vic Caleca, media relations manager, at 765-285-5948 or vjcaleca@bsu.edu. For more stories, visit the Ball State University News Center at www.bsu.edu/news.)

Displaced Resonance v2

IDIA Lab was invited to exhibit their virtual and hybrid artworks at the Third Art and Science International Exhibition at the China Science and Technology Museum in Beijing, China.

http://www.idialabprojects.org/displacedresonance/

Displaced Resonance v2

John Fillwalk

Michael Pounds

IDIA Lab, Ball State University

Interactive installation, mixed media

2012

Displaced Resonance v2 is an interactive installation consisting of sixteen reactive forms networked in a grid of light and sound. Interaction within the sculptural field is based on a participant’s presence and proximity to each sculpture. The installation is connected to a mirrored instance of the field in a virtual environment – bridging physical and virtual visitors within a shared hybrid space. Visitors to the virtual space are represented by avatars and, through their proximity, affect the light and sound of each sculpture. Each participant is aware of the other in each space, uniting both instances within a singular hybrid environment.

A computer system using a thermal camera tracks the movement of visitors and responds by controlling the distribution of sound and dynamic RGB data to the LED lights within the sculptural forms. The installation utilizes custom live-processing software to transform these sources through the participants’ interaction – displacing the interaction from both its physical and virtual contexts into the processed framework of physical and virtual resonances. The two environments are linked to send and receive active responses from both sides of the installation via messaging, sensors, hardware and scripting.
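The proximity mapping such a system performs can be sketched as follows (the falloff radius and output ranges are assumptions for illustration, not the installation’s actual tuning):

```python
import math

def sculpture_response(visitor_xy, sculpture_xy, radius=1.5):
    """Map a tracked visitor's distance to a sculpture into an LED
    brightness (0-255) and a sound gain (0.0-1.0). Response falls off
    linearly and reaches zero at `radius` meters."""
    d = math.dist(visitor_xy, sculpture_xy)
    level = max(0.0, 1.0 - d / radius)
    return round(255 * level), level
```

Running this per frame for each of the sixteen sculptures, against both camera-tracked visitors and avatar positions, would produce the mirrored physical/virtual behavior described above.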

In prototyping this work, the process included both physical and virtual models to design and build the form and the interaction. The physical prototypes were interpreted in a virtual simulation environment to investigate the spatial interaction of the structure. The interactive functionality was tested through scripting before the form was brought into the camera-sensed physical version. After several virtual iterations, the form was re-interpreted and fabricated.

John Fillwalk

with Michael Pounds, David Rodriguez, Neil Zehr, Chris Harrison, Blake Boucher, Matthew Wolak, and Jesse Allison.

Third Art and Science International Exhibition

China Science and Technology Museum in Beijing

http://www.tasie.org.cn/index.asp

TASIE Press release

http://www.tasie.org.cn/content_e.asp?id=84


Virtual Middletown Living Museum in Blue Mars

The Virtual Middletown Living Museum project in Blue Mars is a simulation of the Ball Glass factory from early 20th-century Muncie, Indiana. Life and conditions in the factory were among the key elements of the studies by Robert S. and Helen Merrell Lynd in their landmark books Middletown (1929) and Middletown in Transition (1937). These in-depth accounts of life in Muncie, Indiana, became classic sociological studies and established the community as a barometer of social trends in the United States. In the years since, scholars in a variety of fields have returned to Muncie to follow up on the Lynds’ work, making this small city among the most studied communities in the nation. The Center for Middletown Studies continues this tradition by sponsoring and promoting research on Muncie as Middletown, on small cities generally, and on the themes and issues the Lynds explored.

This simulation of industrial life, built as a prototype for a much larger project dealing with all aspects of the Lynd study, has aimed to create a virtual living-museum experience, expanding the opportunities for both learning and interpretation. The approach to interactive design embeds learning and navigation experiences subtly into the project to maintain the sense of immersion. IDIA has prototyped several techniques to do this, including: interactive objects that allow for close-up inspection; objects that, when clicked, bring up web resources showing information, plans or photographs used in the interpretation; non-player-character factory workers; a live interactive avatar of Frank C. Ball who greets visitors and introduces them to the factory; video and audio files of factory experts and archival films; an in-world interactive heads-up display (HUD) that provides deeper investigation and navigation through the factory; and a supporting webpage with complete documentation of all resources used in this interpretation.

To download the Blue Mars client, create an account and tour Virtual Middletown, please visit: http://blink.bluemars.com/City/IDIA_IDIALabExhibitions/

Follow these steps to get set up:

  1. Select “Download Client” and follow instructions to install the BlueMars client on your PC (Windows desktop or laptop)
  2. Once you have successfully installed the BlueMars client, select “Go To City” to install the Virtual Middletown virtual world
  3. Register your account and confirm when you receive an email from BlueMars
  4. Modify your avatar (optional)
  5. Explore Virtual Middletown!

NOTE: If you are a Macintosh user (OS X) you may run the BlueMars client and the Virtual Middletown virtual world under Windows installed via Apple’s Boot Camp: http://www.apple.com/support/bootcamp/

Here are links for additional information on the project:

  1. Center for Middletown Studies project website: http://cms.bsu.edu/Academics/CentersandInstitutes/Middletown/Research/Virtual/Virtual.aspx
  2. IDIA Lab project website: https://idialab.org/virtual-middletown-living-museum-in-blue-mars/

Recommended settings: Very High graphics with a good graphics card – otherwise High or Low as needed. Screen resolution: minimum 1280 by 720 or higher. Sound levels should be up. Many objects in the world are interactive – anything that highlights blue can be clicked with the left mouse button and examined, or may reference a web page. The heads-up display in the lower right-hand corner provides information and navigation to augment your visit.

Project partners: The Center for Middletown Studies, Library Services and the Emerging Media Initiative at Ball State University

Press release: http://www.bsu.edu/news/article/0,1370,7273-850-65401,00.html

IDIA Lab Simulations in Blue Mars

Ball State University’s Institute for Digital Intermedia Arts (IDIA) presents in Blue Mars a large-scale virtual simulation that showcases re-creations of artifacts and artworks in their original historical contexts, including sculptures from the 1915 Panama-Pacific International Exposition in San Francisco and a centuries-old Buddha at a Japanese temple. Through such emerging technologies, visitors to Blue Mars can virtually experience these important but long-lost sites.

IDIA’s simulations in Blue Mars present several examples of recent grant-supported research projects. IDIA is exploring various new ways the platform might be used, from learning and the arts to gaming. Future IDIA initiatives in Blue Mars will include prototyping the 3D internet, shared media collaboration tools, and the development of an open virtual campus for use by educators interested in engaging the platform as an environment for learning.

Videos on IDIA Lab Youtube Channel
http://www.youtube.com/user/IDIALab#p/u

Images on IDIA Lab Flickr Photostream
http://www.flickr.com/photos/idia_lab/

This release of IDIA Lab introduces the following new features and updates:

• Heads-up display system – Flash-based HUD displaying dynamic information and map-based teleportation

• Automated avatar/bot greeter system

• A guided tour vehicle – launching from the orientation island

• The Snow Globe, a virtual winter simulation – employing physics, environment and particle systems. Teleports are placed throughout the City

• Depth of field – now enabled in most environments

• New navigation and orientation system

• New vegetation design – new plants with LOD

• High-fidelity statues using advanced mapping techniques

• High optimization, terrain painting, parallax mapping

• Please use Very High Graphics settings to view

 

Blue Mars Builds


The IDIA Lab

THE INSTITUTE FOR DIGITAL INTERMEDIA ARTS is a collaborative research and design studio exploring the intersections between art, science and technology. We consult, design and produce immersive experiences for virtual reality, hybrid worlds, simulations, visualizations, games and interfaces across various platforms and devices. For more information, please visit IDIALAB.org. You can walk to all simulations or use the teleport columns at each location. Follow us on Twitter @IDIA_Lab for news and updates.

Panama-Pacific International Exposition Simulation

IDIA’s simulation of the 1915 San Francisco Panama-Pacific Exposition allows visitors to travel to the past and immersively tour a recreation of a historic environment that no longer exists. The exposition celebrated the opening of the Panama Canal, and also San Francisco’s recovery from the devastating earthquake of 1906. IDIA 3D laser-scanned two sculptures by artist Adolph Weinman that are included in this simulation; they were originally installed atop tall columns in the Court of the Universe. A more detailed examination of the sculptures can be found in our Museum Simulator. Visitors can change the time of day using controls found near this sign and the Fine Arts Palace to experience the lighting design of the exposition.

Palace of Fine Arts

The Palace of Fine Arts in the Marina District of San Francisco, California was originally constructed for the 1915 Panama-Pacific Exposition. One of a handful of surviving structures from the Exposition, it is still situated on its original site. It was rebuilt in 1965, and renovation of the lagoon, walkways, and a seismic retrofit were completed in early 2009. IDIA Lab constructed this as an addition to its Panama-Pacific Court of the Universe simulation.

Art Museum Simulation

Using blueprints, photographs, laser scanning and measurements, IDIA Lab simulated an actual museum gallery to a high degree of accuracy for exploration and study within a live 3D environment. The goals of this build were to provide visitors with a remote immersive experience of an actual exhibition as it was installed – closely replicating original factors such as the layout, framing and lighting of the physical space. Additional information is provided by touching each canvas or sculpture in the exhibition. Via a simulation such as this, curators can also spatially archive a show or prototype layout, lighting, and installation design.

Byodo-In Temple (Amida Hall)

Amida Hall, the most famous building in the Byodo-in temple complex, was constructed in 1053 and is the only remaining building from the original site. Additional buildings making up the compound were burnt down during a civil war in 1336. IDIA scanned an Amida Buddha and created this temple site to give visitors an understanding of a typical context in which the Buddha was observed. A replica of the temple was also constructed on the island of O’ahu in Hawai’i. Nearby there are 360º immersion spheres of the Hawaiian temple that avatars can also explore.

Relic Temple

Relic Temple – located in the Ji Le Temple complex in the Nangang District of Harbin, China. The Relic Temple simulation was built by IDIA Lab as part of our larger 3D Laser Scanning and Simulation Project, which also includes the 1915 Panama-Pacific Exposition in San Francisco, Byodo-In Temple in Japan and the Bingling Si Buddhist cave site – all simulated here on IDIA Lab in Blue Mars.

Shafer Tower

Shafer Tower is a 150-foot-tall (46 m) bell tower with a chiming clock and carillon bells located in the middle of the campus of Ball State University. A small staircase in the tower leads to a control room of the carillon, which has 48 custom-made bells. This virtual tower chimes when an avatar approaches – as does the bell in a smaller temple at our Byodo-in build.

The Snow Globe

Winter themed landscape simulation including environment, weather, particle and physics systems.

3D Point Cloud Laser Scanning

From a seashell to an auditorium, laser scanning can capture the objects and spaces we inhabit and convert them into a digital model. The IDIA scans a range of small- and large-scale 3D objects using facilities and equipment shared with the SimLab in the College of Architecture and Planning and supported by the Office of Information Technology. Three-dimensional geometry is captured as millions of points that outline the scanned object; these points are then converted into polygons for use in the resulting animation workflow.
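The point-to-polygon conversion typically begins by thinning the raw cloud. A minimal grid-based (voxel) downsampling sketch of that first step (the cell size is an arbitrary example, not a parameter of the lab’s actual pipeline):

```python
from collections import defaultdict

def voxel_downsample(points, cell=0.05):
    """Reduce a dense scan to one averaged point per grid cell,
    a common pre-processing step before meshing a point cloud.

    `points` is an iterable of (x, y, z) tuples; `cell` is the
    grid spacing in the cloud's units (e.g. meters).
    """
    bins = defaultdict(list)
    for x, y, z in points:
        key = (int(x // cell), int(y // cell), int(z // cell))
        bins[key].append((x, y, z))
    out = []
    for pts in bins.values():
        n = len(pts)
        # Centroid of all points that fell into this cell
        out.append(tuple(sum(c) / n for c in zip(*pts)))
    return out
```

A surface-reconstruction step (for example Poisson or ball-pivoting meshing in dedicated software) would then turn the thinned cloud into polygons.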

In 2009, the IDIA Immersion Seminar in Virtual Worlds worked to simulate the original settings of several sculptures in the collection of the Ball State University Museum of Art. Students were professionally trained to use the ZCorp 3D laser scanner following archival museum methods.

This project has allowed Ball State students to obtain skills in 3D scanning by learning the methodology, techniques and limitations involved in accurate scanning procedures. The museum scanning project was initially a way not only to digitally archive select sculptures, but to place them in a live three-dimensional multi-user virtual environment that immerses viewers in a simulation of the works in their original context.

The significance of this project was to address the fact that pre-modern sculpture was typically designed as an element in a large-scale architectural, public or natural setting. IDIA developed a workflow that is now also offered as a service to external clients in the sciences, industry and the arts – employing emerging digital tools to illuminate a viewer’s understanding of context.

FARO and Z Corporation’s scanning technologies are used in applications such as reverse engineering, accident reconstruction, forensic science, historic preservation, entertainment and virtual simulation.

Microsoft Surface Museum Collection Portal

The IDIA has assembled an interdisciplinary group of students, faculty and industry experts to build a significant interactive information portal for the Ball State University Museum of Art (BSUMA). The IDIA team is developing an innovative touch-based interface to navigate the collection, integrating Microsoft Surface with the Museum’s database. The Surface will afford Museum patrons a collaborative, participatory public platform through which to access metadata and media of the physical exhibition – as well as extending virtually into the permanent collection. Using the Digital Images Delivered Online (DIDO) database, the interface will make visible the interconnections between works in the collection, queried on searchable parameters such as artist, medium, period and subject. This two-semester immersive project, supported by the BSU Provost’s Immersion Initiative, has been team-taught and has recruited students from targeted disciplines across campus.
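The faceted querying described above can be sketched as a simple metadata filter (the field names and records below are illustrative, not the actual DIDO schema):

```python
def search_collection(records, **facets):
    """Filter collection metadata by any combination of facets
    (artist, medium, period, subject, ...). A record matches only
    if every requested facet value matches exactly."""
    return [r for r in records
            if all(r.get(k) == v for k, v in facets.items())]
```

In the portal, the result set of such a query would drive the visualization of interconnections: works sharing a facet value are drawn as linked on the Surface.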

Complete Instruction Guide here: http://ilocker.bsu.edu/users/idiaa/world_shared/SurfaceInstructions.pdf

Microsoft Surface represents a fundamental change in the potential for interaction with digital content. The Surface is a 30” tabletop interface environment that allows several people to work independently or collaboratively – all without using a mouse or a keyboard. The Surface allows users to navigate information physically and manipulate it with natural gestures and touch: http://www.microsoft.com/SURFACE/product.html#section=The%20Product

The Ball State University Museum of Art has a collection of nearly 11,000 works of art. Central to the Museum’s mission is the provision of educational programming that will further faculty, staff, student, and public utilization and understanding of the Museum and its collection, of museums in general, and of the arts. To accomplish this goal, the Museum offers tours, talks, materials for teachers, an education database, and artist demonstrations. The Museum’s education philosophy centers on bringing together the needs of visitors and the resources of the Museum, and on balancing responsibility to the university with service to the community. In facilitating communication between the work of art and the viewer, the Museum subscribes to a philosophy of learner-centered programming informed by a thorough knowledge of the collection and of methods and strategies for effective teaching.

Although the Museum’s collection numbers nearly 11,000 works, given the physical constraints of the facility a mere 10% of them can be displayed at any given time. The incorporation of this hybrid Surface and database system will afford patrons virtual access to the entire collection – making visible those works currently in storage. More importantly, the system will allow patrons to visualize the interconnectivity of the works according to multiple facets – visually re-contextualizing the works in relation to specified search parameters. This form of innovative technology within a museum context would typically be installed at major metropolitan institutions. Development of this interface at the Ball State University Museum of Art will not only benefit local community groups and patrons of BSUMA, whose access to the collection will be significantly augmented, but also has the potential to influence other museums through the distribution of the project’s outcomes.

The Institute for Digital Intermedia Arts (IDIA) has a history of providing interdisciplinary immersion and new media experiences, employing pedagogy that supports students’ transformation as active, engaged learners. The Institute provides a dynamic exchange between instructor and students – where participants engage in collaborative, inquiry-based communities that provide an environment, which fosters participatory learning. As opposed to a traditional model, where knowledge is imparted by the teacher to the student, participatory learning can transform the learner into an active role. Participatory learning communities engage in a dynamic conversation centered around a subject – where analysis, comparison, and evaluation are core to the process of acquiring meaning and relationships. This environment allows students the opportunity to have a stake in the conditions of their learning, making decisions that direct their course of investigation. With a commitment to active inquiry, participants are freed to create personal and collective meaning.

 

BSU MUSEUM OF ART

COLLECTION PORTAL

SOFTWARE DESIGN

Carrie Arnold

Joel Happ

Garret Orth

Deekshita Reddy

Christopher Ross

Jonathan Strong

Austin Toombs

INTERFACE DESIGN

Anastasia Goryacheva

Steven Lanier

Jonathan Strong

ELECTRONICS

Eric Brockmeyer

Giovanni Rozzi

GRAPHIC DESIGN

Amy Goettemoeller

Ina-Marie Henning

FACULTY

Jesse Allison

John Fillwalk

Paul Gestwicki

PROJECT DIRECTOR

Jonathan Strong

A SPECIAL THANK YOU

UNIVERSITY LIBRARIES

James Bradley

John Straw

Budi Wibowo

SOUND

Rick Baker

Steven Lanier

Giovanni Rozzi

STRUCTURAL DESIGN

Eric Brockmeyer

Matthew Wolak

PROJECT MANAGEMENT

Ina-Marie Henning

Amruta Mhaiskar

Jennifer Weaver-Cotton

PRODUCER/PRINCIPAL INVESTIGATOR

John Fillwalk

BALL STATE UNIVERSITY MUSEUM OF ART

Peter Blume

Carl Schafer

Tania Said

Ball State University Museum of Art

College of Fine Arts

College of Architecture and Planning

Department of Architecture

Department of Art

Department of Computer Science

School of Music

Information Technology

Office of the Provost

University Libraries

and

Jeff Berg, IBM Interactive, IDIA Research Fellow

IDIA Presents: BSU MUSEUM OF ART COLLECTION PORTAL 3/20/11 at 2:30pm BSUMA

 

BSU MUSEUM OF ART COLLECTION PORTAL
Reception and Gallery Talk

Thursday, March 20 @ 2:30pm, BSU Museum of Art

The BSU Museum of Art Collection Portal was developed by students, faculty and industry research fellows in the Institute for Digital Intermedia Arts at Ball State University. The year-long project was developed in partnership with the Ball State University Museum of Art and sponsored by the Provost’s Immersive Learning Initiative.

This interdisciplinary team from art, computer science, architecture and music has developed an innovative, touch-based application to navigate the collection. The portal bridges the Microsoft Surface interface with the Museum’s collection database, the Digital Images Delivered Online (DIDO), hosted on the Digital Media Repository of Bracken Library. The Surface affords Museum visitors an interactive platform to individually or collaboratively make virtual connections between works of art both on display and in reserve – accessing information and media across the collection.

http://bit.ly/IDIA_Surface

Complete Instruction Guide here: http://ilocker.bsu.edu/users/idiaa/world_shared/SurfaceInstructions.pdf

trans|form

trans|form

An interactive intermedia installation

Michael Pounds

John Fillwalk

Jesse Allison

 

“Trans|form” is an interactive sound sculpture consisting of five metal plates with attached mechanical transducers that vibrate the metal according to audio signals sent from the computer. Signals sent from the computer are resonated according to the physical properties of the metal plates. Additionally, contact microphones attached to the plates can provide feedback to the computer, allowing the computer to gather information about the specific resonances of each metal plate. This information can be used to further exploit those resonances by adjusting the audio signals sent to the plates.

Interaction is provided through a capacitive sensing mechanism. When viewers move their hands near the plates or touch the plates, signals are sent to the computer, allowing the viewers to trigger sounds or modify the properties of sounds through their hand movements.

Five different metal sheets/plates are used, with varying physical dimensions and thicknesses. Different transducers are attached to each plate according to its physical properties, and each plate will resonate at different frequencies.
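The feedback analysis described above, finding a plate’s resonance from a contact-mic signal, can be sketched as a peak search over a direct discrete Fourier transform (a simplified stand-in for the installation’s actual live processing):

```python
import cmath
import math

def dominant_frequency(samples, sample_rate):
    """Find the strongest frequency (Hz) in a short recording via a
    direct DFT. O(n^2), so suitable only for small analysis windows;
    real-time systems would use an FFT instead."""
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):  # skip DC; ignore the mirrored half
        acc = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                  for t in range(n))
        mag = abs(acc)
        if mag > best_mag:
            best_bin, best_mag = k, mag
    return best_bin * sample_rate / n
```

The detected peak could then steer the audio sent back to that plate, reinforcing or avoiding its natural resonance.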

Institute for Digital Intermedia Arts [IDIA Lab] at Ball State University

https://idialab.org

RFID-Linked 3D Media Interface

The project allows the user to manipulate a digital world with wireless objects. Through the use of Radio Frequency Identification (RFID) tags, the virtual world can detect the presence of real-world objects and use them to manipulate its own attributes. In this project, physical cubes link the user with virtual cubes within the system and allow the user to call up media such as video and sound. The system uses Quest 3D for real-time VR rendering and interactive animation, and Max/MSP for harvesting and inputting RFID data. Additionally, users can navigate the 3D virtual world with a trackball.
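At its core, such a system maps detected tag IDs to media cues. A minimal sketch (the tag IDs and file names below are hypothetical; the actual IDs arrived from the reader via Max/MSP):

```python
# Hypothetical tag-to-media table; real tag IDs come from the RFID reader.
MEDIA_BY_TAG = {
    "04:A2:19:7F": {"video": "intro.mp4", "sound": "ambient.wav"},
    "04:B7:33:01": {"video": "tour.mp4", "sound": "bells.wav"},
}

def on_tag_detected(tag_id):
    """Return the media cue for a physical cube placed on the reader,
    or None for an unknown tag."""
    return MEDIA_BY_TAG.get(tag_id)
```

The returned cue would then be handed to the rendering layer to start the corresponding video and sound in the virtual scene.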

Students participating in IDIA’s Immersion Seminar won an Award of Excellence at the International Digital Media and Arts Association (iDMAa) 2007 National Conference. The award, which included a $250 cash prize, was presented to only two groups out of the twelve universities participating in the conference’s student showcase. iDMAa hosted the Ideas 07: Beyond Boundaries conference in Philadelphia on November 7-11, 2007.

Links
BSU Press Release
2007 iDMAa Conference Website
iDMAa Homepage

MMFX/IMA Interactive Project

IDIA is creating a media rich interactive digital kiosk for the Digital Fabrication Institute’s MMFX Exhibit hosted by the Indianapolis Museum of Art. Students participating in the IDIA Immersion Seminar in Virtual Worlds are developing the interactive interface, which will act as a station within the exhibit where viewers can attain additional biographical and portfolio information– employing a custom designed, coded and fabricated dynamic media experience.

[un]wired

[un]wired, by Jesse Allison, John Fillwalk and Keith Kothman, is a network visualizer that responds to interactions from personal radio-frequency devices such as mobile phones, WiFi signals, Bluetooth signals and car-key fobs. It tracks real-time statistical information from wireless access points (designed for seamless handoff of moving wireless traffic, like a cell-phone network), along with periodically updated information from hand-held devices and wireless access points. These interactions then appear as sound and shapes on the screen, so users interacting with the piece can see their activity live. The control information is collected from network services via a MySQL database and transferred into Max/MSP/Jitter. [un]wired was exhibited at SIGGRAPH Asia 2008 in Singapore.

http://www.siggraph.org/asia2008/attendees/art/20.php

Intermedia Artist Timeline

This device was designed to take viewers on a journey through the history of intermedia art. The kiosk, made by IDIA in collaboration with the Institute for Digital Fabrication at Ball State University, allows the user to move through the timeline using hand gestures. The work includes a short biography, image and description of works from 30 intermedia artists from the 1960s to the present. It was displayed in conjunction with the BSU Museum of Art’s Engaging Technology Exhibit. Jesse Allison programmed the piece with help from Mike Sayre; Dustin Headley and the IDF designed and constructed the kiosk.

interActivity: IDIA at Philips HighTech Campus during Dutch Design Week

TRAVERSAL FOR EINDHOVEN
during Dutch Design Week

Collaboration with IDIA Lab and MAD Emergent Art Center

IDIA Lab
John Fillwalk and Jesse Allison

MAD Emergent Art Center Rien Daamen and Réné Paré

October 17-25, 2009; live daily 15:00-21:00 SLT through the 25th

Virtual location
http://slurl.com/secondlife/Ball%20State%20University2/181/128/21

Physical location: Philips HighTech Campus and NatLab in Eindhoven, the Netherlands
http://www.hightechcampus.nl/

‘Traversal for Eindhoven’ is a live performance / hybrid-reality installation that bridges the physical and virtual worlds. It connects the Philips NatLab in Eindhoven to a virtual environment as part of the 2009 Dutch Design Week. Avatars will be able to play a physical synthesizer in the Philips NatLab via an interactive online virtual instrument built by IDIA, located near a 3D model of the Eindhoven High Tech Campus. The sound of the live synthesizer will be streamed back into Second Life, so the virtual and physical participants can hear their interactions with the instrument.
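The virtual-to-physical bridge described above can be sketched in a few lines: an avatar's touch on the in-world instrument becomes a control message for the physical synthesizer. The message format, key count and note mapping below are hypothetical, not the protocol IDIA actually used.

```python
def avatar_touch_to_message(key_index, velocity):
    """Convert an avatar's touch on a virtual key into a synth control message.

    key_index: which virtual key was touched (assumes a 61-key instrument)
    velocity:  normalized touch strength in 0.0..1.0
    """
    if not 0 <= key_index < 61:
        raise ValueError("key out of range")
    note = 36 + key_index                    # map key 0 to MIDI note C2 (36)
    vel = max(1, min(127, int(velocity * 127)))  # clamp into MIDI velocity range
    return ("/synth/note", note, vel)        # OSC-style address plus arguments
```

In practice a listener in the virtual world would forward each touch event over the network, and a small server beside the synthesizer would translate these tuples into actual MIDI or control voltages.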

DUTCH DESIGN WEEK From October 17 through October 25, 2009, Eindhoven presents the 8th edition of Dutch Design Week, the largest design event in the Netherlands. Around 1,500 designers from home and abroad will show their work in more than 60 locations all over the city of Eindhoven, spanning disciplines such as industrial design, concept design, graphic design, textile & fashion, spatial design, food design and design management & trends. Visitors will gain insight into the entire development process, from concept to product, in disciplines ranging from industrial design to applied arts. The participants include established bureaus, high-profile designers, talented newcomers and recently graduated designers, which is one reason Dutch Design Week is the perfect meeting place for designers, companies and the public. http://www.dutchdesignweek.nl/indexmap.php

MAD EMERGENT ART CENTER The Foundation MAD is a platform and workshop for emergent art: art that exists in the exciting space between cultural and cutting-edge technological developments. MAD emergent art center is a laboratory, platform and provider at the intersection of art, science and technology. MAD addresses artists, designers, scientists, public groups, institutions, governments and businesses at regional, national and international levels. http://www.mad.dse.nl/mad.html
More Information: http://tinyurl.com/yl2dq4l