Ball State University and the Technische Universität Dortmund have partnered in creating an international exchange program for students studying the design and technology of Intermedia artforms. This program will provide opportunities for students and faculty engaged in experiential project-based approaches to the collaborative creation of new media art. The Hans Breder Foundation, which is a non-profit international Intermedia art organization, has contributed a significant body of historic artwork to the Museum Ostwall in Dortmund – the largest art museum in Germany. The archive will serve as a catalyst for research and production of Intermedia based projects initiated by the BSU and TU Dortmund exchange. John Fillwalk, also president of the Hans Breder Foundation, initiated the exchange program with the assistance of the Rinker Center for International Programs. He will be present at the formal signing ceremony on November 19th in Dortmund, Germany.
The future of home entertainment is taking a giant step forward into the world of virtual reality. More devices are hitting the market that transform the way we look at the world around us, and a group of developers at Ball State is on the cutting edge.
It’s something we experienced first-hand at Ball State University, where they produce content for virtual reality devices at the Institute for Digital Intermedia Arts. The lab is located in the architecture building on the Ball State campus, where the focus is on the exploration of the arts, science and technology.
For the last ten years, Ball State has been developing a wide range of projects for museums, the History Channel and other clients.
John Fillwalk oversees the work done there.
“We basically explore 3D simulation and virtual reality,” he explained. “The skill sets from the staff are across computer science to design to animation.”
“We continue to further the emerging media brand aspect of this university, giving students professional opportunities to work and stay in Indiana for design technology jobs,” said John Fillwalk.
The virtual reality industry estimates more than 7 million headsets will ship this year, even though researchers have found more than half of Americans have no interest in VR and don’t know what it is. Developers foresee huge growth in the next six years as they develop virtual reality social media apps, combined with lower prices for the technology.
Explore the solar system by touch and augmented reality. A new app for iOS, Android and Oculus Rift designed by BSU’s IDIA Lab allows users to learn about our sun and planets – and their moons. Physical museum objects trigger augmented experiences of each planet, providing deeper context and understanding. View the orbits of each planet’s moons, examine individual moons, each planet’s atmosphere and unique features such as Jupiter’s Great Red Spot. IDIA designed this augmented reality app as a prototype for the Boston Museum of Science Planetarium.
Learn how modern technology can shape our understanding of the past during a special program at Mounds State Park on Saturday, Nov. 14.
Visitors to the 1 p.m. program will join park naturalist Kelley Morgan to learn about modern technologies that help archaeologists and historians bring the past to life. During the second half, director John Fillwalk and animator Neil Zehr of the Institute for Digital Intermedia Arts Laboratory at Ball State University will demonstrate how they use archaeological data to interpret the past to the public.
BSU’s IDIA Lab is premiering Virtual Companion – their custom augmented reality app employing LocusEngine, a geolocative process developed by IDIA Lab. Visitors to the park use the app to aid in learning and discovery while exploring the park’s Adena-Hopewell mounds. Using GPS data, the user’s position is geolocated in reference to the historical sites, allowing the app to display relevant content as a dynamic guide. This approach can be applied in cultural heritage, archeology, the sciences and the arts.
Interactive features, as well as the user’s current location in the park, are marked on a series of map options designed to provide multiple layers of locative information throughout the park. A GPS-driven trail map is available, allowing the user to track their movement through the trails and important features. When an interactive feature is selected on the map, an augmented reality view using gyroscope and compass data is loaded, portraying native peoples and habitats from the Adena-Hopewell era. Archaeologists have proposed that the enclosures were used to track celestial alignments. Using solar data from NASA’s JPL Horizons database, the movements of the sun on the equinoxes and solstices during the Adena-Hopewell era can be viewed and tracked to search for important alignments.
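For readers curious how geolocative triggering of this kind works in general, the sketch below illustrates the basic idea: compare the visitor's GPS fix against known feature coordinates and activate a feature's content when the visitor is inside its trigger radius. It is a minimal illustration only, not LocusEngine code; the feature names, coordinates and radii are hypothetical placeholders.

```python
# Minimal sketch (not IDIA Lab's LocusEngine) of GPS-based content triggering:
# compare the user's position against known feature locations and activate a
# feature's AR content when the user is within its trigger radius.

from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

# Hypothetical points of interest: (name, latitude, longitude, trigger radius in meters)
FEATURES = [
    ("Great Mound overlook", 40.0997, -85.6205, 25.0),
    ("Fiddleback Mound",     40.1009, -85.6188, 20.0),
]

def active_feature(user_lat, user_lon):
    """Return the nearest feature whose trigger radius contains the user, if any."""
    best = None
    for name, lat, lon, radius in FEATURES:
        d = haversine_m(user_lat, user_lon, lat, lon)
        if d <= radius and (best is None or d < best[1]):
            best = (name, d)
    return best

if __name__ == "__main__":
    hit = active_feature(40.0998, -85.6204)  # simulated GPS fix
    print(hit or "No feature in range")
```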
Standard park entry fees of $5 per in-state vehicle apply. Mounds State Park (stateparks.IN.gov/2977.htm) is at 4306 Mounds Road, Anderson, 46017.
Digital artists from Ball State’s IDIA Lab built their virtual simulation of Buffalo Bill’s Wild West with the Unity 3-D gaming platform along with custom software created by the lab.
Cowboys and Indians captivated the country when Buffalo Bill’s Wild West rolled through America in the late 1800s and early 1900s. More than a century later, Ball State digital artists have re-created the legendary showman’s outdoor exhibition.
Working with staff from the Buffalo Bill Center of the West, artists and designers from Ball State’s Institute for Digital Intermedia Arts (IDIA) have crafted a computer-generated world that authentically simulates the Wild West show dramatizing frontier life.
“The visual look and feel of the project is something we’re really proud of,” said John Fillwalk, IDIA director and senior director of the university’s Hybrid Design Technologies initiative.
Fillwalk collaborated on the project with Jeremy Johnston, curator of the center’s Buffalo Bill Museum, and Ball State historians James Connolly and Douglas Seefeldt.
As a senior digital editor of the Papers of William F. Cody, Seefeldt has worked closely with Johnston on several projects the National Endowment for the Humanities recently recognized as among the most significant it funded.
“When Doug introduced me to John, I was excited because all we had to visually represent the Wild West show at the Buffalo Bill Museum was this small architectural diorama,” said Johnston, who is also managing editor of the Papers of William F. Cody. “It gave our visitors an aerial overview of the show but lacked action.
“What the IDIA captured for us is the look and feel of the experience, right down to the sound effects of horses and the stage coach running through the arena.”
Buffalo Bill’s Muncie visit
IDIA-created augmented reality apps will feature objects in the museums’ collections, such as firearms.
The Virtual Buffalo Bill project offered a crossover research opportunity for Connolly, director of Ball State’s Center for Middletown Studies. The center is developing Virtual Middletown, a 3-D visualization of industrializing early 20th-century Muncie, and the Buffalo Bill simulation offered an opportunity to produce a module as part of that endeavor.
Connolly and Seefeldt provided Fillwalk with photographs and newspaper accounts of Buffalo Bill’s 1899 stop in Muncie. “He personified the Wild West for audiences in these small towns,” Connolly said.
Connolly’s and Seefeldt’s research, along with assets provided by the Buffalo Bill Center, allowed Fillwalk and his artists to create beautifully rendered graphics based on data and research, hallmarks that have distinguished IDIA’s work in emergent media design.
“The attack on the Deadwood Stage Coach is simulated down to representing John Y. Nelson, one of America’s original Mountain Men driving the coach,” Fillwalk explained. “And Cody himself—along with his wardrobe—was painstakingly researched and re-created. His appearance was based on specific clothing of Cody’s in the museum collection that we were allowed to photograph.”
Seefeldt said Fillwalk’s re-creations uniquely capture William F. “Buffalo Bill” Cody.
“His show had it all—buffalos, the Pony Express, Annie Oakley, re-enactments of iconic events in the history of the West. He was one of the most famous people in the country, a celebrity of that era, and it’s a thrill to see the way John has brought him back to life.”
Ball State-Center of the West partnership continues
Located in Cody, Wyoming, the Buffalo Bill Center of the West includes the Buffalo Bill, Draper Natural History, Whitney Western Art, Plains Indian and Cody Firearms museums, along with the McCracken Research Library.
The Origins of Buffalo Bill
Born in 1846, William F. Cody rode for the Pony Express, served as a military scout and earned his moniker “Buffalo Bill” while hunting the animals for the Kansas Pacific Railroad work crews. Beginning in 1883, he became one of the world’s best showmen with the launch of Buffalo Bill’s Wild West, which was staged for 30 years, touring America and Europe multiple times.
The IDIA Lab’s next project for the center will be a series of augmented reality apps featuring objects in each museum’s collection. By holding electronic devices over images like a grizzly bear or gun, users can learn more about them as 3-D models of the subjects pop up on screen.
“By using their phones or tablets, visitors can see museum exhibits come to life,” Fillwalk said. “All of our work is meant to give visitors a greater appreciation for these assets with the aid of our digital interpretations.”
Johnston said what he likes best about Fillwalk’s approach is the way “he puts technology in the users’ hands.”
“I’ve seen so many younger people walking through our museums with their heads down, glued to their iPhones and iPads. With John’s help, I’m excited that we’re taking something they’re so familiar with and using it in a way to get them to engage with our exhibits here.”
Funding for the Virtual Buffalo Bill project was provided by a grant from the Buffalo Bill Center of the West, which was matched by internal grant funding from Ball State.
CAVE is an immersive 270° panorama screen primarily used for presentations to large groups of students. OmniFocus projection maximizes the shadow-free zone in our installation of the Elumenati’s 8m GeoDome Panorama, so up to 40 people can share the space. Without headsets, the shared space sets the stage for spontaneous interactions with the content and one another. It’s also logistically smoother than managing 40 headsets at once, especially post-pandemic.
Another use case for the CAVE is as part of our virtual production pipeline, using the Elumenati’s WorldViewer and Omnity plugin for the Unity game engine. This allows us to collaborate inside the 3D environment and demo content for clients.
A vendor and partner since 2018, the Elumenati is a design and engineering firm creating innovative solutions for immersive visualization. With nearly twenty years as leaders in the field, their mission is to develop elegant, effective systems and novel applications of immersion and interactivity, building VR domes and social, collaborative spaces for XR experiences.
IDIA Lab staff Chris Harrison, Dan Eisinger, Blake Boucher, Trevor Danehy and Andy Hesick worked with Director John Fillwalk in designing and building a 3D-scanned and 3D-printed violin – playable by gesture using custom, self-contained electronics. Bid here: https://qtego.net/qlink/emens.php
MUNCIE – For anybody who has ever dreamed of playing the violin, but found themselves stymied by one slight problem — namely, they don’t know how — John Fillwalk has just the fiddle for you.
What’s more, he’ll be selling it Sept. 27.
In fact, Fillwalk’s will be one of 14 artistically rendered violins sold in a silent auction and dinner that night, ending a year-long observance of Emens Auditorium’s 50th anniversary.
“We’ve had some wonderful guest artists who’ve joined us for performances at Emens over the past year, helping us commemorate this milestone anniversary,” said Charles Sursa, who chaired the Emens 50th anniversary steering committee. “Now, with this dinner and auction, we look forward to bringing our celebration to a close with these exciting culminating events.”
Designs of the violins run from a metal bug to a Garfield-inspired piece to one made entirely of glass. The one made by Fillwalk, who is director of Ball State University’s IDIA Lab, was rendered with lots of assistance from his colleagues at the Institute for Digital Intermedia Arts (IDIA) Lab, and seems a particular mind-blower.
“Basically, we wanted to apply our digital processes to the fund-raising event,” he said, noting how they first scanned a violin with a 3-D laser.
“You basically paint it with a laser light,” Fillwalk said, explaining how it was not unlike a grocery store’s laser scanner. The 3-D recreation was then made with resin via a 3-D printer and fitted with sophisticated electronic, um, gizmos, including three sensors. All of that was then mated with a recorded violin concerto performed by a violin major.
After that? Voila!
“We chopped it up into smaller samples,” Fillwalk said, adding that the violin’s three sensors allow the most musically-impaired of users to play the samples, plus control their pitch and volume.
Pretty unique, too, we observed.
“Yeah, it’s a little different,” Fillwalk conceded.
Stories behind other violins are no less interesting. Painting with oils, Ann Johnson put two entwined swans on hers, the neck of one trailing up the neck of the violin. It re-creates a sight she saw on the White River through a window of her home; she then went to tell her longtime husband, Jack, who was ailing from cancer.
“I went over to his bed to tell him what I saw and he was gone,” she recalled, adding that music from “Swan Lake” was played at his funeral. “I never before had a particular attraction to swans, but I guess I do now.”
By the way, when an aging Bette Graham agreed to decorate a violin, she asked Johnson to touch it up if needed. When Graham died, Johnson examined the cat-decorated fiddle and barely touched it.
“That (violin) was the last thing Bette did,” Johnson said.
Other artists who contributed decorated violins to the auction include Susie Burns, Cassandra Copenhaver, Jennifer Halvorson, Nate Harmon, Tom Howard, Patricia Kreigh, Alfredo Marin, Jean McCauley, Pat Nelson and Aurora Robson. The university also commissioned Julie Borden, a nationally recognized instrument modifier, to paint a violin that has featured prominently in the promotion of the anniversary season.
Ball State will be auctioning off these artist rendition violins for Emens Auditorium’s 50th anniversary.
Tickets for the Sept. 27 dinner are $75. It is open to the public but limited seating is available; to inquire, contact Breanne Talbott at btalbott@bsu.edu. Bidding for the violins is through an online auction site, www.qtego.net/auc/emens. Visitors to the site must enter their mobile phone numbers and payment information to bid via their phone.
The auction is part of a university-sponsored fundraising effort to pay for future renovations. The violins will be set for public viewing in the main lobby of Minnetrista through Sept. 25.
Another unique one is BSU art professor Lynette Whitesell’s, a work she titled “Bariolage,” which is a special effect “obtained by playing in rapid alternation upon open and stopped strings.” A mixed-media specialist, she put a girl with a redwing blackbird on her shoulder — the bird standing for music — on the top and a girl wearing a blindfold on the back. The blindfold emphasizes the importance of hearing, Whitesell said.
“It’s a little ‘collage-y,'” the artist explained, describing what sounded like a complicated artistic process. What wasn’t complicated was her enthusiasm for both the project and Emens Auditorium.
Contributing to a place that has had such an impact on the community was also fulfilling, she said. Recalling taking her little boy, Logan, to shows there, she noted that only the day before she had said goodbye to her now-18-year-old as he flew off to Marine Corps boot camp.
For many of us, she hinted, Emens is a place of magic memories.
MaelStrom AR is an augmented reality installation designed by artist John Fillwalk in collaboration with IDIA Lab (Neil Zehr, Aaron Anderson, and David Rodriguez). This work visualizes live weather data from Washington DC to inform virtual wind direction and speed. This data affects the creation of virtual paper airplanes/messages that fall from above onto the ground, enveloping the viewer within a swirling circular mass of objects. In this multi-user networked mobile artwork, participants are assigned a unique color upon launching, through which they can paint nearby airplanes and also physically move them through walking and collision. The surrounding environment of the user is scanned by the software and is used to create physical interactions with the AR objects. In addition to hue, participants in MaelStrom AR are also assigned a unique tone that blends to form a chorus of sound in the space. In this work, the viewer becomes a participant in a constant stream of messaging and data – playfully affecting its design while collaborating with other users.
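As a rough illustration of how a live wind observation can drive virtual motion of this kind, the sketch below converts a reported wind speed and direction into a velocity vector for falling objects. It is not MaelStrom AR's code; the axis convention, fall rate and example values are assumptions.

```python
# Illustrative sketch (not MaelStrom AR's actual code) of turning a reported wind
# observation into a velocity vector for falling virtual objects. Meteorological
# wind direction states where the wind blows FROM, so the motion vector points
# the opposite way. Values below are placeholders.

import math

def wind_to_velocity(speed_mps, direction_deg, fall_rate_mps=0.6):
    """Map wind speed (m/s) and meteorological direction (degrees) to an (x, y, z)
    velocity, with +x = east, +y = up, +z = north and a constant downward drift."""
    # The direction the air moves toward is 180 degrees from the reported direction.
    to_rad = math.radians((direction_deg + 180.0) % 360.0)
    vx = speed_mps * math.sin(to_rad)   # east component
    vz = speed_mps * math.cos(to_rad)   # north component
    return (vx, -fall_rate_mps, vz)

# Example: a 4 m/s wind reported from 270 degrees (a west wind) pushes objects east.
print(wind_to_velocity(4.0, 270.0))
```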
IDIA Lab was contracted by digital humanities scholars at UCLA to design and build a virtual simulation of the Temple of Artemis, one of the Wonders of the Ancient World. This massive Greek temple, four times the size of the Parthenon, lies in ruin in present-day Turkey. The simulation incorporates our CelestialEngine, which accurately positions both the sun and moon using a site’s latitude, longitude, altitude and year via NASA JPL data. This particular simulation studies whether an opening in the temple’s portico allowed moonlight to illuminate the statue of Artemis on her feast day.
IDIA is pleased to announce REDgrid – a 3D virtual campus to support the educational mission of faculty, staff and students. Ball State community members can use it for free as a virtual gathering space including classroom instruction, small group meetings, presentations, panel discussions, or performances. It is a secure environment hosted and managed solely by Ball State’s IDIA Lab. Virtual classrooms or field experiences can be customized as well as made private to suit your needs. REDgrid also offers a developmental platform for online instruction or community displays.
In this video, David gives us a preview of what you are able to do in REDgrid.
Ball State institute creates virtual world to be used for learning, connecting
The Daily News | Amanda Belcher Published 08/31/15 12:27pm Updated 09/01/15 5:15pm
By 2013, the Sims video game series had sold 175 million copies worldwide. Its users could create avatars, build houses—just like in reality.
Ball State’s own REDgrid uses a similar concept. Students can make an avatar, walk around a virtual Ball State campus and interact with other avatars via written messages or a headset in this open simulator.
Ball State’s Institute for Digital Intermedia Arts (IDIA) developed REDgrid.
“The server is physically housed at the BSU IDIA Lab and was installed in collaboration with Information Technology,” said John Fillwalk, the director of Ball State’s IDIA Lab.
When it comes to REDgrid, the possibilities can seem limitless—and some faculty members have already begun testing the simulator’s boundaries.
Mai Kuha, an assistant professor of English, used REDgrid in an honors class as a gender identity project. Students were assigned a gender and told to create an avatar of that gender. This enabled them to observe how people of opposite genders are treated differently.
Kuha isn’t the only professor to use REDgrid as a learning tool in the classroom. Ann Blakey, an associate professor of biology, used it in one of her courses for an environmental ethics activity.
Students were assigned a role, such as scientist or environmentalist, and were invited to explore the virtual environment and file a report. The activity gave students the opportunity to see the environment from different perspectives.
Fillwalk envisions even more opportunities for the platform.
“Ball State community members can use it for free as a virtual gathering space including classroom instruction, small group meetings, presentations, panel discussions or performances,” Fillwalk said.
The virtual classrooms and field experiences can be customized to fit each teacher’s needs.
This kind of creative teaching is what Stephen Gasior, an instructor of biology, is looking to expand upon. An enthusiastic voice for REDgrid, Gasior encourages professors and students to utilize this tool. He explains that it gives professors the ability to shape an environment and allows students to experience any number of events or situations.
REDgrid isn’t just for academic purposes either, Gasior said. Students can use the avatars for social experiments too.
“REDgrid represents Ball State’s campus, and international or online students get [the] feeling of being in the campus environment,” he said.
Fillwalk fully understands this aspect of REDgrid.
“We designed it to be a flexible platform connecting BSU faculty and students to international,” he said.
Connection is key with REDgrid. Gasior stressed that it can help build and connect communities—Ball State or otherwise.
Ball State is already working with faculty at other universities so the tool can be used on campuses beyond Ball State’s, Fillwalk said.
“The platform could certainly continue to be expanded,” he said.
Gasior has plans for the future—like researching developmental grants and expanding REDgrid. But IDIA staff can only do so much.
“People who come to REDgrid and have a passion for it will shape the road it will take,” Gasior said.
IDIA Lab’s Virtual Pantheon/Kinect project in Blue Mars and our occupational simulation for Alzheimer’s patients won the Bronze Medal at the iENA International Trade Fair for Ideas, Inventions and New Products in Nuremberg, Germany. http://www.iena.de/en/home.html
The Virtual Pantheon Project in Blue Mars is a multi-user simulation of the Roman Pantheon and its surroundings as it might have existed in 320 AD. The structure is the largest and most intact surviving example of ancient Roman architecture in existence. The building was commissioned by Marcus Agrippa during the reign of Augustus as a temple to all the gods of ancient Rome, and rebuilt by the emperor Hadrian about 126 AD. The simulation was created by the Institute for Digital Intermedia Arts at Ball State University in consultation with the University of Otago, NZ and the Virtual World Heritage Laboratory at Indiana University. The simulation leverages the significant graphic capabilities of the CryEngine to create a compelling immersive experience. IDIA Lab and Avatar Reality worked together to design an interactive solar tracker that allows the visitor to change the time of day around important celestial and celebratory events in ancient Rome. More detail here: http://bit.ly/18cfDKu The project can also be interacted with via the MS Kinect sensor directly in Blue Mars.
The Pantheon is the best-preserved architectural monument of ancient Rome. This simulation by BSU’s IDIA Lab represents the Pantheon and its surrounds as it may have appeared in 320 AD. Visitors to this Blue Mars / CryEngine simulation can tour the vicinity, learning about the history, function and solar alignments through an interactive heads-up display created for this project. The project opened in beta in late 2013 and premieres publicly in February 2014; it includes new solar simulation software calibrated to the building’s location and year, an interactive HUD, a greeter bot system and a new AI non-player character system developed in partnership between IDIA Lab and Avatar Reality.
Originally built by Agrippa around 27 BC under the rule of Augustus, it was destroyed by fire, then rebuilt and finally completed in its present form during Emperor Hadrian’s reign, around 128 AD. Agrippa finished the construction of the building and it bears his name above the portico. The Pantheon would have contained numerous marble statues representing the major Roman deities. The statues displayed in this simulation represent a possible configuration and are scanned via photogrammetry. The buildings surrounding the Pantheon are built and interpreted by IDIA based on the large scale model of ancient Rome built by Italo Gismondi between 1935 and 1971. The model resides in the Museo della Civiltà Romana, just outside of Rome, Italy.
Video walkthrough of the Virtual Pantheon in Blue Mars:
The Solar Alignment Simulation of the Roman Pantheon in Blue Mars was developed under consultation with archeo-astronomer Dr. Robert Hannah, Dean of Arts and Social Sciences at the University of Waikato, New Zealand, one of the world’s foremost scholars on Pantheon solar alignments; and archaeologist Dr. Bernard Frischer, Indiana University.
Background
The Pantheon that we can visit today is composed of a rectangular porch with three rows of granite columns in front of a circular building designed as a huge hemispherical dome (142 feet in diameter), built over a cylinder of the same diameter and as high as the radius. Therefore, the ideal completion of the upper hemisphere by a hypothetical lower one touches the central point of the floor, directly under the unique source of natural light of the building. This light source is the so-called oculus, a circular opening over 27 feet wide on the top of the cupola. It is the only source of direct light since no direct sunlight can enter from the door in the course of the whole year, owing to the northward orientation of the entrance doorway. Of the building’s original embellishments, the coffered ceiling, part of the marble interior, the bronze grille over the entrance and the great bronze doors have survived.
Interior Wall
The interior wall, although circular in plan, is organized into sixteen regularly spaced sectors: the northernmost one contains the entrance door, and then (proceeding in a clockwise direction) pedimented niches and columned recesses alternate with each other. Corresponding to these ground-level sectors are fourteen blind windows in the upper, attic course, just below the offset between the cylinder and the dome. It is likely that both the niches and the windows were meant for statues, which, however, have not survived.
Oculus
Direct sunlight penetrates the interior only through a large, 27-foot-wide oculus in the center of the domed roof. Otherwise, indirect sunlight can enter the building, but only through the large, north-facing doorway, when it is open. The fall of direct sunlight through the oculus into the essentially spherical building leads to the comparison with a roofed sundial.
Celestial Alignments
A columned porch leads through a vestibule of the Pantheon into a huge, shadowy interior, over 142 feet in height and as much in diameter. The building’s form is essentially that of a sphere with its lower half transformed into a cylinder of the same radius. Direct sunlight penetrates the interior only through a large, 27-foot-wide oculus in the centre of the domed roof.
The shift from one half of the year to the other is marked by the passage of the sun at the equinoxes in March and September. At this point the noontime sun shines partially just below the dome, passing through the grille over the entrance doorway and falling on the floor of the porch outside. More significantly, however, the centre of this equinoctial, midday circle of sunlight lies on the interior architectural moulding, which marks the base of the dome.
On April 21st, the midday sun shines directly on to visitors to the Pantheon when they are standing in the open doorway, dramatically highlighting them. This day is of particular significance, not just because this was when the sun entered Taurus, but more because it is the traditional Birthday of Rome, a festival preserved from antiquity right through to the present day. And it may be that when the building was officially commissioned in AD 128, the person expected to be standing in the open doorway was the emperor Hadrian himself.
The illustration indicates a section through the Pantheon, showing the fall of the noon sunlight at the winter solstice, when the sun is at altitude 24 degrees; noon sunlight, at both equinoxes at altitude 48 degrees; noon sunlight on April 21st, when the sun is at altitude 60 degrees; and finally, noon sunlight at the summer solstice, when the sun is at altitude 72 degrees.
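These noon altitudes can be checked, to within roughly a degree, with the standard relation for solar altitude at local noon, taking Rome’s latitude as about 41.9° N and the sun’s approximate declination on each date:

```latex
% Solar altitude at local noon: h = 90 degrees - latitude + declination.
h_{\text{noon}} = 90^{\circ} - \varphi + \delta, \qquad \varphi \approx 41.9^{\circ}
% Winter solstice, \delta \approx -23.4:  h \approx 24.7 degrees  (quoted: 24)
% Equinoxes,       \delta \approx  0:     h \approx 48.1 degrees  (quoted: 48)
% 21 April,        \delta \approx +12:    h \approx 60.1 degrees  (quoted: 60)
% Summer solstice, \delta \approx +23.4:  h \approx 71.5 degrees  (quoted: 72)
```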
Meaning of Pantheon
The Pantheon is a building in Rome, Italy commissioned by Marcus Agrippa during the reign of Augustus as a temple to all of the gods of ancient Rome, and rebuilt by the emperor Hadrian about 126 AD.
Pantheon is an ancient Greek composite word meaning All Gods. Cassius Dio, a Roman senator who wrote in Greek, speculated that the name comes either from the statues of so many gods placed around this building, or from the resemblance of the dome to the heavens.
“Agrippa finished the construction of the building called the Pantheon. It has this name, perhaps because it received among the images which decorated it the statues of many gods, including Mars and Venus; but my own opinion of the name is that, because of its vaulted roof, it resembles the heavens.”
-Cassius Dio History of Rome 53.27.2
Augustus
Augustus was the founder of the Roman Empire and its first Emperor, ruling from 27 BC until his death in 14 AD.
The reign of Augustus initiated an era of relative peace known as the Pax Romana (The Roman Peace). Despite continuous wars of imperial expansion on the Empire’s frontiers and one year-long civil war over the imperial succession, the Roman world was largely free from large-scale conflict for more than two centuries. Augustus dramatically enlarged the Empire, annexing Egypt, Dalmatia, Pannonia, Noricum, and Raetia, expanded possessions in Africa, expanded into Germania, and completed the conquest of Hispania.
Beyond the frontiers, he secured the Empire with a buffer region of client states, and made peace with the Parthian Empire through diplomacy. He reformed the Roman system of taxation, developed networks of roads with an official courier system, established a standing army, established the Praetorian Guard, created official police and fire-fighting services for Rome, and rebuilt much of the city during his reign.
Augustus died in 14 AD at the age of 75. He may have died from natural causes, although there were unconfirmed rumors that his wife Livia poisoned him. He was succeeded as Emperor by his adopted son (also stepson and former son-in-law), Tiberius.
Agrippa
Marcus Vipsanius Agrippa (c. 23 October or November 64/63 BC – 12 BC) was a Roman statesman and general. He was a close friend, son-in-law, lieutenant and defense minister to Octavian, the future Emperor Caesar Augustus and father-in-law of the Emperor Tiberius, maternal grandfather of the Emperor Caligula, and maternal great-grandfather of the Emperor Nero. He was responsible for most of Octavian’s military victories, most notably winning the naval Battle of Actium against the forces of Mark Antony and Cleopatra VII of Egypt.
In commemoration of the Battle of Actium, Agrippa built and dedicated the building that served as the Roman Pantheon before its destruction in 80 AD. Emperor Hadrian used Agrippa’s design to build his own Pantheon, which survives in Rome. The inscription of the later building, which was built around 125, preserves the text of the inscription from Agrippa’s building during his third consulship. The years following his third consulship, Agrippa spent in Gaul, reforming the provincial administration and taxation system, along with building an effective road system and aqueducts.
Arch of Piety
The Arch of Piety is believed to have stood in the piazza to the immediate north of the Pantheon. Statements made in mediaeval documents imply, but do not specifically say, that the scene of Trajan and the widow was represented in a bas-relief on the Arch – narrating the story of the emperor and a widow while suppressing the emperor’s name. His probable source, the mediaeval guidebook of Rome known as Mirabilia Romae, does not even state that the arch was built in commemoration of the event. It mentions the arch and then says that the incident happened there.
Giacomo Boni discusses the legend of Trajan, giving many interesting pictures which show how the story was used in medieval painting and sculpture. He has found a bas-relief on the Arch of Constantine, which he thinks may have given rise to the story. It shows a woman sitting, her right hand raised in supplication to a Roman figure, who is surrounded by other men, some in military dress, and two accompanied by horses. Boni suggests that the Middle Ages may have supposed this figure to be Trajan because of his reputation for justice.
Saepta Julia
The Saepta Julia was a building in Ancient Rome where citizens gathered to cast votes. The building was conceived by Julius Caesar and dedicated by Marcus Vipsanius Agrippa in 26 BC. The building was originally built as a place for the comitia tributa to gather to cast votes. It replaced an older structure, called the Ovile, which served the same function. The building did not always retain its original function. It was used for gladiatorial fights by Augustus and later as a market place.
The conception of the Saepta Julia began during the reign of Julius Caesar (died 44 BC). Located in the Campus Martius, the Saepta Julia was built of marble and surrounded a huge rectangular space next to the Pantheon. The building was planned by Julius Caesar who wanted it to be built of marble and have a mile long portico according to a letter written by Cicero to his friend Atticus about the building project. The quadriporticus (four-sided portico, like the one used for the enclosure of the Saepta Julia) was an architectural feature made popular by Caesar.
After Caesar’s assassination in 44 BC, and in the backlash of public support for the former ruler, men continued to work on projects that Caesar had set into motion. Marcus Aemilius Lepidus, who used to support Caesar and subsequently aligned with his successor Octavian, took on the continuation of the Saepta Julia building project. The building was finally completed and dedicated by Marcus Vipsanius Agrippa in 26 BC. Agrippa also decorated the building with marble tablets and Greek paintings.
The Saepta Julia can be seen on the Forma Urbis Romae, a map of the city of Rome as it existed in the early 3rd century AD. Part of the original wall of the Saepta Julia can still be seen right next to the Pantheon.
– edited from Robert Hannah, “The Pantheon as Timekeeper”, 2009.
________
Simulation by the Institute for Digital Intermedia Arts at Ball State University
Project Director: John Fillwalk, Senior Director, IDIA Lab, BSU.
IDIA Staff: Neil Zehr, Trevor Danehy, David Rodriguez, Ina Marie Henning, Adam Kobitz
PROJECT ADVISORS:
Dr. Robert Hannah, University of Waikato, New Zealand
Dr. Bernard Frischer, Virtual World Heritage Laboratory, Indiana University, USA
The IDIA Lab has developed two new simulations for AvayaLive Engage – a multi-user virtual world platform based on the Unreal game engine. The Virtual Collaboration Center is a multi-use teaching and learning facility prototype for use by large and small groups, containing rich media and collaboration tools including desktop sharing, presentations, a drop box, whiteboards, streaming video and webcam. The Virtual Art Museum is a demonstration of an interactive museum experience with links to companion webpages, including a database that allows for extremely close examination of the paintings. Both simulations work on Mac and PC within industry-standard web browsers and will launch in May 2013. You can visit the project here: https://wa11619.avayalive.com/11619/html/index.html Certain areas need administrative permission – we will be opening these up in the near future.
HDT and IDIA Lab have developed a next generation, high-fidelity Virtual Nursing Simulator in the Unity 3D game engine for testing by the Ball State University School of Nursing. Students will practice interviews via role-playing, employing avatars with predefined health histories. The simulator introduces our new media tools within the environment, including the ability to play video, webcam, screen sharing, presentations and webpages. Additionally, the simulator allows for interdisciplinary immersion, providing a context for student nurses to role-play scenarios alongside doctors, health insurance representatives, administrators and other professionals. Desktop, web and tablet versions are in development.
Video preview here: http://www.youtube.com/watch?v=Asz92FuhEyc&feature=share&list=UUtQrLF1JPprEvP4AO-iSNvw
IDIA Lab Director John Fillwalk will be presenting at the National Endowment for the Humanities Summer Institute for Advanced Topics in the Digital Humanities 2013, sponsored by the Center for Digital Initiatives at Arkansas State University.
This NEH Summer Institute for Advanced Topics in the Digital Humanities will take place from June 17, 2013 – July 6, 2013. The three-week institute will be hosted by the Center for Digital Initiatives (CDI) at Arkansas State University (ASU), Jonesboro campus in northeast Arkansas in the Mississippi Delta region, and the Center for Advanced Spatial Technologies (CAST) at the University of Arkansas, Fayetteville (UAF), located in northwest Arkansas. The first two weeks of the institute will be held at the CDI at Arkansas State University Jonesboro, AR campus. Participants will then travel as a group to the University of Arkansas in Fayetteville, AR for the final week of the institute.
This summer institute brings together twenty scholars working in the humanities who have research or teaching projects that would benefit from real-time visualization in a game engine, published as standalone applications, web players, or on mobile devices. In a three-week institute, participants will be provided with a conceptual road map to the difficult but intellectually productive issues that surround the academic use of game engines, including the balance of immersion with accuracy, strategies for storytelling and graphical user interfaces (GUIs) in “serious” games, and questions of power and appropriateness in using video game conventions to represent non-contemporary or non-Western cultures. Participants will also receive hands-on training in the digital toolbox for creating game engine content, a basic workflow that they would be able to use in their own projects and bring back to their home institutions. Training will include VSim, Unity 3D, online multi-user virtual environments, Google SketchUp, 3D Studio Max, Cinema 4D, as well as a broad range of open-source programs. No prior knowledge or experience in 3D modeling will be assumed.
A unique feature of the institute is the breadth of cultural heritage content it incorporates. This includes travel to ASU Heritage sites modeled by the CDI including the Lakeport Plantation in Lake Village, the boyhood home of Johnny Cash in Dyess, the Hemingway-Pfeiffer house and studio in Piggott, and the Japanese-American internment camp in Rohwer. At UAF participants will also participate in a Unity tutorial focused on the House of the Ara Massima in Pompeii. In bringing together an impressive group of lecturers who specialize in the use of 3D visualization and game engines as research tools in the digital humanities, the institute creates an important resource in the form of a community of scholars—which allows for future collaborations between individuals and universities.
A presentation by Jo Ann Gora, President of Ball State University; Phil Repp, Vice-President of Information Technology; Linda Sweigart, BSU Nursing Faculty; and John Fillwalk, IDIA Lab Director on March 21st 2013 to the American Association of Colleges of Nursing.
A presentation that examines how universities, like Ball State University, use emerging technologies to enrich and expand the utility of course materials at scale by leveraging the inherent production flexibility of digital tools. The advantages of digital technologies when designing virtual participation, collaboration, and interaction will be discussed, as well as the pedagogical uses of un-assembling and re-assembling diverse sets of course materials. http://www.aacn.nche.edu/webinars/2013/03/21/designing-the-unbundling
We have applied our Virtual Solar Simulator to a simple white model build of the pre-Columbian ballcourt in Group F at the Izapa site in Chiapas, Mexico. Izapa is considered to be the birthplace of the Mayan Long Count, which ends its cycle today, December 21st, 2012 – the winter solstice. Viewed today, the site’s axis aligns generally, but not precisely, with sunrise on the winter solstice and sunset on the summer solstice. In direct observation today, the alignment along the axis of the ballcourt from throne #2 to stela #60 is off-axis by approximately two degrees.
* The simulation requires Internet Explorer and the Unity plugin.
The solar simulator developed by IDIA Lab at Ball State University polls data from the NASA/JPL Horizons database (http://ssd.jpl.nasa.gov/horizons.cgi) to calculate the positions of celestial objects (sun, moon, planets, comets, etc.). The database takes into account the Chandler wobble – a periodic deviation in the rotation of the Earth’s axis. Archaeo-astronomical alignments viewed today at Izapa appear to be off-axis; however, when changes in time, rotation, position and the Chandler effect are compensated for via this simulation method, the solstice events return to their original orientation for the era of their construction and can be observed to be in actual alignment.
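As an illustration of the kind of query such a workflow implies, the sketch below requests an observer-table ephemeris of the sun from the public JPL Horizons API for a ground site. It is not the IDIA Lab simulator’s code; the site coordinates are rough placeholders for the Izapa area, and the date range simply brackets local noon on the 2012 solstice.

```python
# Minimal sketch (not IDIA Lab's solar simulator) of requesting an observer-table
# ephemeris of the Sun from the JPL Horizons API for a specific ground site.
# QUANTITIES='4' asks for apparent azimuth and elevation.

import requests

HORIZONS_URL = "https://ssd.jpl.nasa.gov/api/horizons.api"

params = {
    "format": "text",
    "COMMAND": "'10'",                    # 10 = the Sun
    "EPHEM_TYPE": "OBSERVER",
    "CENTER": "'coord@399'",              # topocentric site on Earth (body 399)
    "COORD_TYPE": "GEODETIC",
    "SITE_COORD": "'267.82,14.93,0.2'",   # east lon (deg), lat (deg), alt (km) - placeholder near Izapa
    "START_TIME": "'2012-12-21 17:00'",
    "STOP_TIME": "'2012-12-21 19:00'",
    "STEP_SIZE": "'10 m'",
    "QUANTITIES": "'4'",                  # apparent azimuth and elevation
}

response = requests.get(HORIZONS_URL, params=params, timeout=30)
response.raise_for_status()
print(response.text)  # ephemeris table appears between the $$SOE and $$EOE markers
```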
A b’ak’tun is 144,000 days in length – almost 400 years. In the Mayan Long Count it took 13 b’ak’tuns to progress through a full cycle of creation. Dec. 21, 2012, marks the end of the 13th b’ak’tun of this Mayan calendar, and after today’s solstice the next Long Count cycle begins again – another 5128.8 years.
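Made explicit, the round numbers above work out as follows:

```latex
13 \times 144{,}000 = 1{,}872{,}000 \text{ days}, \qquad
1{,}872{,}000 \div 365 \approx 5128.8 \text{ calendar years}
\;(\approx 5125 \text{ years of } 365.2425 \text{ days}).
```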
Happy New Mayan Long Count!
– John
December 21st, 2012
Dates are in the year 300 BCE/BC.
Time is represented in Universal Time (UT1/UTC/GMT).
To obtain local time for the Izapa site, subtract 6 hours from the UT time displayed in the simulator.
The next release will provide solstice and equinox presets for the year 2012 for comparison.
Camera control
To switch cameras, press 1 for the first-person view, 2 for the bird’s-eye view and 3 for the top-down view.
To pan, use the WASD keys; to zoom, use the middle mouse button. To rotate the camera, use Ctrl-click or the right mouse button.
Background
The model is geolocated and accurately referenced in 3D space by latitude, longitude, orientation and elevation. The Horizons database is then queried using these coordinates as well as the year of interest – in this case we chose 300 BCE/BC, a high point of the period during which the Izapa site was inhabited.
IDIA Lab has developed the celestial simulation – which can also track the moon and planets, etc – to work on various Cultural Heritage and archeo-astronomy projects in collaboration with international scholars. Other projects utilizing the Celestial Simulator include simulations of Hadrian’s Villa, the Roman Pantheon, the Solarium Augusti in Rome (the largest sun calendar of the ancient world), the Temple of Artemis in modern Turkey (one of the ancient wonders) and Stonehenge.
In the virtual installation, PROXY, visitors shape the construction of a sculptural and sonic response to the virtual museum. The work progressively takes form, eventually becoming structural support for the building. When multiple users are in the environment, their avatars interact with one another to create collaborative painting and sculpture.
FLICKR™ GETTR v6
John Fillwalk
with Jesse Allison, Composer and IDIA Lab. 2012.
FLICKR™ GETTR v6 connects the social image web service of Flickr™ to the virtual museum environment, allowing visitors to create a dynamic cloud of spatial imagery by entering a search term of their choice, which pulls related images from Flickr™ into the virtual environment.
SURVEY FOR BEIJING
John Fillwalk
with Keith Kothman, Composer and IDIA Lab. 2012.
SURVEY FOR BEIJING is an immersive landscape simulation using real time weather data from the physical location in Beijing, China. Representations of surveyor’s tape, flags, light, time of day, wind and clouds are superimposed onto the virtual landscape in accordance with real-life weather data.
con|FLUENCE
John Fillwalk
with Michael Olson, Composer and IDIA Lab. 2012.
Participants in con|FLUENCE create pathways based on reactions to both social and spatial relationships. There is a virtual response and persistence to interactions, as the routes are drawn and sculpted in the three-dimensional environment – forming nodes created by visitors’ proximity.
Ball State artists create “forest” of light on display in China and Internet
Muncie, Ind. — Ball State University electronic artists have created a “forest” of light and sound that will be on exhibit in Beijing, China through November, yet also accessible to visitors from Indiana or anywhere else in the world.
That’s possible because “Displaced Resonance,” as the interactive art exhibit is known, has both real-life and virtual components.
The physical portion has been installed in a gallery of the China Science and Technology Museum in Beijing. There, in-person visitors can negotiate a thicket of 16 interactive sculptures spaced 1.5 meters apart that will change colors and emit music as they approach.
A digital replica of the layout, meanwhile, resides on the Internet, accessible through the museum’s website. Online visitors can wander the virtual exhibit using an avatar, and the digital pillars will change colors and produce sounds, just like their physical counterparts.
But that’s not all — the two pieces interact with each other, says John Fillwalk, director of Ball State’s Institute for Digital Intermedia Arts (IDIA) and Hybrid Design Technologies (HDT), which created the work in collaboration with IDIA staff, students and BSU composer Michael Pounds.
When an online avatar approaches a virtual pillar, the corresponding real-life column also will change colors, and vice versa. In-person and virtual visitors will produce different colors, however, allowing them to track each other through the exhibit.
“It’s what we call hybrid art,” says Fillwalk. “It’s negotiating between the physical world and the virtual. So it’s both sets of realities, and there’s a connection between the two.”
The physical pillars are two meters (more than 6 feet, 6 inches) tall. Each consists of a wooden base containing a sound system, a translucent pillar made of white corrugated plastic, and computer-controlled lighting.
A thermal camera mounted on the museum’s ceiling keeps track of visitors and feeds its data to a computer program that directs the columns to change color and broadcast sounds when someone draws near.
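A schematic of the control loop described above might look like the sketch below: tracked visitor positions activate the nearest pillar within a trigger radius, and the resulting pillar states are broadcast so the virtual copy of the exhibit can mirror them. This is an illustration only, not the installation’s software; the trigger radius and the broadcast mechanism are assumptions, while the 4 x 4 grid at 1.5-meter spacing follows the article.

```python
# Illustrative sketch (not the installation's software) of the control loop described
# above: positions from an overhead tracker activate the nearest pillar within range,
# and pillar states are broadcast so the virtual copy of the exhibit can mirror them.

import json
from math import hypot

# 16 pillars on a 4 x 4 grid spaced 1.5 meters apart, per the article.
PILLARS = [(col * 1.5, row * 1.5) for row in range(4) for col in range(4)]
TRIGGER_RADIUS = 0.75  # meters; assumed activation distance

def pillar_states(visitor_positions):
    """Return one boolean per pillar: True if any visitor is within the trigger radius."""
    states = []
    for px, py in PILLARS:
        active = any(hypot(px - vx, py - vy) <= TRIGGER_RADIUS
                     for vx, vy in visitor_positions)
        states.append(active)
    return states

def broadcast(states):
    """Stand-in for sending pillar states to the lights, sound and virtual exhibit."""
    print(json.dumps({"pillars": states}))

if __name__ == "__main__":
    tracked = [(1.6, 1.4), (4.4, 0.1)]   # example positions from the overhead tracker
    broadcast(pillar_states(tracked))
```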
“It’s a sensory forest that you can navigate,” Fillwalk says.
Two final touches: a video screen mounted on a museum wall overlooking the exhibit allows in-person visitors to watch avatars move around the virtual version, while Internet patrons can keep tabs on the real-life display through a window on their computer screens.
“Displaced Resonance” is the centerpiece of Ball State’s contributions to the Beijing museum’s 3rd Art and Science International Exhibition and Symposium, a month-long celebration of technology and the arts. Ball State was invited to participate because museum curators discovered some of IDIA’s work and liked what they saw, Fillwalk said.
In addition to “Displaced Resonance,” IDIA contributed four other pieces of digital art that museum visitors can view at a kiosk.
Those pieces are:
· “Proxy”, in which visitors create, color and sculpt with floating 3D pixels.
· “Flickr Gettr,” in which visitors can surround themselves with photos from the Flickr web service that correspond to search terms they submit.
· “Confluence,” in which users create virtual sculptures by moving around the screen and leaving a path in their wake.
· “Survey for Beijing,” in which real time weather data from Beijing is dynamically visualized in a virtual environment.
(Note to editors: For more information, contact John Fillwalk, director of the Institute for Digital Intermedia Arts, at 765-285-1045 or jfillwalk@bsu.edu; or Vic Caleca, media relations manager, at 765-285-5948 or vjcaleca@bsu.edu. For more stories, visit the Ball State University News Center at www.bsu.edu/news.)
IDIA, in partnership with the BSU School of Nursing, developed and launched a new Nursing Interview Simulator in the virtual world of Blue Mars. Blue Mars is a next generation and high fidelity virtual world that uses the CryEngine game engine. Student nurses practice interviews via role-playing – using avatars with predefined health histories.
The Virtual Middletown Living Museum project in Blue Mars is a simulation of the Ball Glass factory from early 20th century Muncie, Indiana. Life and conditions in the factory were among the key elements examined by Robert S. and Helen Merrell Lynd in their landmark studies Middletown (1929) and Middletown in Transition (1937). These in-depth accounts of life in Muncie, Indiana, became classic sociological studies and established the community as a barometer of social trends in the United States. In the years since, scholars in a variety of fields have returned to Muncie to follow up on the Lynds’ work, making this small city among the most studied communities in the nation. The Center for Middletown Studies continues this tradition by sponsoring and promoting research on Muncie as Middletown, on small cities generally, and on the themes and issues the Lynds explored.
This simulation of industrial life, built as a prototype for a much larger project dealing with all aspects of the Lynds’ study, has aimed to create a virtual living museum experience expanding the opportunities for both learning and interpretation. The approach to interactive design embeds learning and navigation experiences subtly into the project to maintain the sense of immersion. IDIA has prototyped several techniques to do this, including: interactive objects that allow for close-up inspection; objects that, when clicked, bring up web resources showing information, plans or photographs used in the interpretation; non-player character factory workers; a live interactive avatar of Frank C. Ball who greets visitors and introduces them to the factory; video and audio files of factory experts and archival films; an in-world interactive heads-up display (HUD) that provides deeper investigation and navigation through the factory; and a supporting webpage with complete documentation on all resources used in this interpretation.
Select “Download Client” and follow instructions to install the BlueMars client on your PC (Windows desktop or laptop)
Once you have successfully installed the BlueMars client, select “Go To City” to install the Virtual Middletown virtual world
Register your account and confirm when you receive an email from BlueMars
Modify your avatar (optional)
Explore Virtual Middletown!
NOTE: If you are a Macintosh user (OS X) you may run the BlueMars client and Virtual Middletown virtual world using Apple’s Boot Camp: http://www.apple.com/support/bootcamp/
Here are links for additional information on the project:
Recommended settings: Very High Graphics with good graphics cards – otherwise High or Low as needed. Screen resolution: minimum 1280 by 720 or higher. Sound levels should be turned up. Many objects in the world are interactive – anything that highlights blue can be clicked with the left mouse button and examined, or may open a reference web page. The heads-up display in the lower right-hand corner provides information and navigation to augment your visit.
Project partners: The Center for Middletown Studies, Library Services and the Emerging Media Initiative at Ball State University
Ball State University’s Institute for Digital Intermedia Arts (IDIA) has created in Blue Mars a large-scale virtual simulation that showcases re-creations of artifacts and artworks in their original historical contexts, including sculptures at the 1915 Panama-Pacific International Exposition in San Francisco and a centuries-old Buddha at a Japanese temple. Through the use of such emerging technologies, visitors to Blue Mars can virtually experience these important but long-lost sites.
IDIA’s simulations in Blue Mars present several examples of recent grant-supported research projects. IDIA is exploring various new ways the platform might be used, from learning and the arts to gaming. Future IDIA initiatives in Blue Mars will include prototyping the 3D internet, shared media collaboration tools and the development of an open virtual campus for use by educators interested in engaging the platform as an environment for learning.
This release of IDIA Lab introduces the following new features and updates:
• Heads up Display system – flash-based HUD system displaying dynamic information and map based teleportation
• Automated avatar/bot greeter system
• A guided tour vehicle – launching from the orientation island
• The Snow Globe, a virtual winter simulation – employing physics, environment and particle systems. Teleports are placed throughout the City
• Depth of Field – now enabled in most environments
• New navigation and orientation system
• New vegetation design – new plants with LOD
• High fidelity statues using advanced mapping techniques
• High optimization, terrain painting, parallax mapping
• Please use Very High Graphics settings to view
Blue Mars Builds
The IDIA Lab
THE INSTITUTE FOR DIGITAL INTERMEDIA ARTS is a collaborative research and design studio exploring the intersections between art, science and technology. We consult, design and produce immersive experiences for virtual reality, hybrid worlds, simulations, visualizations, games and interfaces for various platforms and devices. For more information, please visit IDIALAB.org. You can walk to all simulations or alternatively use the teleport columns you see at each location. Follow us on Twitter @IDIA_Lab for news and updates.
Panama-Pacific International Exposition Simulation
This IDIA simulation of the 1915 San Francisco Panama-Pacific Exposition allows visitors to travel to the past and immersively tour a recreation of a historic environment that no longer exists. The exposition celebrated the opening of the Panama Canal as well as San Francisco’s recovery from the devastating earthquake of 1906. IDIA 3D laser-scanned two sculptures by artist Adolph Weinman that have been included in this simulation; they were originally installed on top of tall columns in the Court of the Universe. A more detailed examination of the sculptures can be found in our Museum Simulator. Visitors can change the time of day using controls found near this sign and the Fine Arts Palace to experience the lighting design of the exposition.
Palace of Fine Arts
The Palace of Fine Arts in the Marina District of San Francisco, California was originally constructed for the 1915 Panama-Pacific Exposition. One of a handful of surviving structures from the Exposition, it is still situated on its original site. It was rebuilt in 1965, and renovation of the lagoon and walkways and a seismic retrofit were completed in early 2009. IDIA Lab constructed this as an addition to its Panama-Pacific Court of the Universe simulation.
Art Museum Simulation
Using blueprints, photographs, laser-scanning and measurements, IDIA Lab simulated an actual museum gallery to a high degree of accuracy for exploration and study within a live 3D environment. The goals of this build were to provide visitors with a remote immersive experience of an actual exhibition as it was installed – closely replicating original factors such as the layout, framing and lighting of the physical space. Additional information is provided by touching each canvas or sculpture in the exhibition. Via a simulation such as this, curators can also spatially archive a show or prototype layout, lighting and installation design.
Byodo-In Temple (Amida Hall)
Amida Hall, the most famous building in the Byodo-in temple complex, was constructed in 1053 and is the only remaining building from the original site. Additional buildings making up the compound were burnt down during a civil war in 1336. IDIA scanned an Amida Buddha and created this temple site to give visitors an understanding of a typical context in which the Buddha was observed. A replica of the temple was also constructed on the island of O’ahu in Hawai’i. Nearby, there are 360º immersion spheres of the Hawaiian temple that avatars can also explore.
Relic Temple
Relic Temple – located in the Ji Le Temple complex in the Nangang District of Harbin, China. The Relic Temple simulation was built by IDIA Lab as part of our larger 3D Laser Scanning and Simulation Project, which also includes the 1915 Panama-Pacific Exposition in San Francisco, the Byodo-In Temple in Japan and the Bingling Si Buddhist cave site – all simulated here on IDIA Lab in Blue Mars.
Shafer Tower
Shafer Tower is a 150-foot-tall (46 m) bell tower with a chiming clock and carillon bells located in the middle of the campus of Ball State University. A small staircase in the tower leads to a control room of the carillon, which has 48 custom-made bells. This virtual tower chimes when an avatar approaches – as does the bell in a smaller temple at our Byodo-in build.
The Snow Globe
Winter themed landscape simulation including environment, weather, particle and physics systems.
From a seashell to an auditorium, laser scanning can capture the objects and spaces we inhabit and convert them into a digital model. IDIA scans a range of small- and large-scale 3D objects using facilities and equipment shared with the SimLab in the College of Architecture and Planning and supported by the Office of Information Technology. Three-dimensional geometry is captured as millions of points that outline the scanned object. These points are converted into polygons for use in the resulting animation workflow.
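For readers curious about that point-to-polygon step, a minimal sketch follows. It assumes the open-source Open3D library and a hypothetical scan file name; the lab's production pipeline used ZCorp and FARO vendor software rather than this code.

```python
# Minimal sketch: converting a scanned point cloud into a polygon mesh.
# Assumes the open-source Open3D library and a hypothetical file "statue_scan.ply";
# the lab's actual pipeline used ZCorp/FARO vendor tools.
import open3d as o3d

# Load the raw scan (millions of points outlining the object).
pcd = o3d.io.read_point_cloud("statue_scan.ply")

# Estimate surface normals, required by Poisson surface reconstruction.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30))

# Reconstruct a triangle mesh from the points; higher depth captures more detail.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)

# Export for use in an animation or real-time engine workflow.
o3d.io.write_triangle_mesh("statue_mesh.obj", mesh)
```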
In 2009, the IDIA Immersion Seminar in Virtual Worlds worked to simulate the original settings of several sculptures in the collection of the Ball State University Museum of Art. Students were professionally trained to use the ZCorp 3D laser scanner while following archival museum methods.
This project has allowed Ball State students to gain skills in 3D scanning by learning the methodology, techniques and limitations involved in accurate scanning procedures. The museum scanning project was initially a way not only to digitally archive select sculptures, but to place them in a live, three-dimensional, multi-user virtual environment that immerses viewers in a simulation of the works in their original context.
The significance of this project was to address the fact that pre-modern sculpture was typically designed as an element in a large-scale architectural, public or natural setting. IDIA developed a workflow that is now also a service available to external clients in the sciences, industry and the arts – employing emerging digital tools to illuminate a viewer's understanding of context.
FARO and Z Corporation’s scanning technologies are used in applications such as reverse engineering, accident reconstruction, forensic science, historic preservation, entertainment and virtual simulation.
The IDIA has assembled an interdisciplinary group of students, faculty and industry experts to create a significant interactive information portal for the Ball State University Museum of Art (BSUMA). The IDIA team is developing an innovative touch-based interface to navigate the collection, integrating Microsoft Surface with the Museum's database. The Surface will afford Museum patrons a collaborative, participatory public platform through which to access metadata and media of the physical exhibition – as well as extending virtually into the permanent collection. Using the Digital Images Delivered Online (DIDO) database, the interface will make visible the interconnections between works in the collection, queried on searchable parameters such as artist, medium, period and subject. This two-semester immersive project, supported by the BSU Provost's Immersion Initiative, has been team-taught and has recruited students from targeted disciplines across campus.
Microsoft Surface represents a fundamental change in the potential for interaction with digital content. The Surface is a 30” tabletop interface environment that allows several people to work independently or collaboratively – all without using a mouse or keyboard. The Surface allows users to navigate and manipulate information physically, with natural gestures and touch. For more information, see http://www.microsoft.com/SURFACE/product.html#section=The%20Product.
The Ball State University Museum of Art has a collection of nearly 11,000 works of art. Central to the Museum's mission is the provision of educational programming that furthers faculty, staff, student and public utilization and understanding of the Museum and its collection, of museums in general, and of the arts. To accomplish this goal, the Museum offers tours, talks, materials for teachers, an education database and artist demonstrations. The Museum's education philosophy centers on bringing together the needs of visitors and the resources of the Museum, and on balancing responsibility to the university with service to the community. In facilitating communication between the work of art and the viewer, the Museum subscribes to a philosophy of learner-centered programming informed by a thorough knowledge of the collection and of methods and strategies for effective teaching.
Although the Museum's collection numbers nearly 11,000 works, the physical constraints of the facility mean that only about 10% of them can be displayed at any given time. The incorporation of this hybrid Surface and database system will afford patrons virtual access to the entire collection – making visible those works currently in storage. More importantly, the system will allow patrons to visualize the interconnectivity of the works according to multiple facets – visually re-contextualizing the works in relation to specified search parameters. Innovative technology of this kind within a museum context would typically be installed only at major metropolitan institutions. Development of this interface at the Ball State University Museum of Art will not only benefit local community groups and patrons of BSUMA, whose access to the collection will be significantly augmented, but also has the potential to influence other museums through the distribution of the project's outcomes.
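To illustrate the kind of faceted lookup the portal performs, here is a minimal sketch. The table and column names are hypothetical stand-ins for the DIDO metadata the portal consumes, and the code is an illustration of the idea rather than the portal's actual implementation.

```python
# Illustrative sketch of a faceted collection query.
# The database file, table and column names are hypothetical stand-ins for
# DIDO metadata; the real portal integrates Microsoft Surface with the
# museum's own database.
import sqlite3

conn = sqlite3.connect("collection.db")  # hypothetical local copy of collection metadata

def find_related(artist=None, medium=None, period=None):
    """Return works matching any supplied facet, to visualize interconnections."""
    clauses, params = [], []
    for column, value in (("artist", artist), ("medium", medium), ("period", period)):
        if value is not None:
            clauses.append(f"{column} = ?")
            params.append(value)
    where = " OR ".join(clauses) if clauses else "1=1"
    cur = conn.execute(
        f"SELECT id, title, artist, medium, period FROM works WHERE {where}", params)
    return cur.fetchall()

# Example: all works sharing either the artist or the medium of a selected piece.
related = find_related(artist="Adolph Weinman", medium="bronze")
```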
The Institute for Digital Intermedia Arts (IDIA) has a history of providing interdisciplinary immersion and new media experiences, employing pedagogy that supports students' transformation into active, engaged learners. The Institute provides a dynamic exchange between instructor and students, where participants engage in collaborative, inquiry-based communities that foster participatory learning. As opposed to a traditional model, where knowledge is imparted by the teacher to the student, participatory learning places the learner in an active role. Participatory learning communities engage in a dynamic conversation centered on a subject, where analysis, comparison and evaluation are core to the process of acquiring meaning and relationships. This environment allows students the opportunity to have a stake in the conditions of their learning, making decisions that direct their course of investigation. With a commitment to active inquiry, participants are freed to create personal and collective meaning.
BSU MUSEUM OF ART
COLLECTION PORTAL
SOFTWARE DESIGN
Carrie Arnold
Joel Happ
Garret Orth
Deekshita Reddy
Christopher Ross
Jonathan Strong
Austin Toombs
INTERFACE DESIGN
Anastasia Goryacheva
Steven Lanier
Jonathan Strong
ELECTRONICS
Eric Brockmeyer
Giovanni Rozzi
GRAPHIC DESIGN
Amy Goettemoeller
Ina-Marie Henning
FACULTY
Jesse Allison
John Fillwalk
Paul Gestwicki
PROJECT DIRECTOR
Jonathan Strong
A SPECIAL THANK YOU
UNIVERSITY LIBRARIES
James Bradley
John Straw
Budi Wibowo
SOUND
Rick Baker
Steven Lanier
Giovanni Rozzi
STRUCTURAL DESIGN
Eric Brockmeyer
Matthew Wolak
PROJECT MANAGEMENT
Ina-Marie Henning
Amruta Mhaiskar
Jennifer Weaver-Cotton
PRODUCER/PRINCIPAL INVESTIGATOR
John Fillwalk
BALL STATE UNIVERSITY MUSEUM OF ART
Peter Blume
Carl Schafer
Tania Said
Ball State University Museum of Art
College of Fine Arts
College of Architecture and Planning
Department of Architecture
Department of Art
Department of Computer Science
School of Music
Information Technology
Office of the Provost
University Libraries
and
Jeff Berg, IBM Interactive, IDIA Research Fellow
IDIA Presents: BSU MUSEUM OF ART COLLECTION PORTAL 3/20/11 at 2:30pm BSUMA
BSU MUSEUM OF ART COLLECTION PORTAL
Reception and Gallery Talk
Thursday, March 20 @ 2:30pm, BSU Museum of Art
The BSU Museum of Art Collection Portal was developed by students, faculty and industry research fellows in the Institute for Digital Intermedia Arts at Ball State University. The year-long project was developed in partnership with the Ball State University Museum of Art and sponsored by the Provost’s Immersive Learning Initiative.
This interdisciplinary team from art, computer science, architecture and music has developed an innovative, touch-based application to navigate the collection. The portal bridges the Microsoft Surface interface with the Museum's collection database, Digital Images Delivered Online (DIDO), hosted in the Digital Media Repository at Bracken Library. The Surface affords Museum visitors an interactive platform to individually or collaboratively make virtual connections between works of art both on display and in reserve – accessing information and media across the collection.
The project allows the user to manipulate a digital world with wireless objects. Through the use of radio-frequency identification (RFID) tags, the virtual world can detect the presence of real-world objects and use them to manipulate its own attributes. In this project, physical cubes link the user with virtual cubes within the system and allow the user to call up media such as video and sound. The system uses Quest 3D for real-time VR rendering and interactive animation, and Max/MSP for harvesting and inputting the RFID data. Additionally, users can navigate the 3D virtual world with a trackball.
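As a rough illustration of the object-to-media mapping described above, the sketch below assumes a hypothetical serial RFID reader, port and tag IDs and uses the pyserial library; the installation itself harvested RFID data in Max/MSP and rendered the world in Quest 3D.

```python
# Conceptual sketch of an RFID-to-media mapping like the one in the installation.
# The serial port, baud rate and tag IDs below are hypothetical; the real system
# used Max/MSP for data harvesting and Quest 3D for rendering.
import serial

# Map physical cube tag IDs to the media each cube should call up.
TAG_TO_MEDIA = {
    "04A1B2C3": "video_clip_01.mp4",
    "04D4E5F6": "ambient_sound_02.wav",
}

reader = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)  # hypothetical RFID reader

while True:
    tag = reader.readline().decode("ascii", errors="ignore").strip()
    if tag in TAG_TO_MEDIA:
        # In the installation this would trigger playback inside the virtual world.
        print("Cube detected:", tag, "-> play", TAG_TO_MEDIA[tag])
```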
Students participating in IDIA's Immersion Seminar received an Award of Excellence at the International Digital Media and Arts Association (iDMAa) 2007 National Conference. The award included a $250 cash prize and was presented to only two groups out of the twelve universities participating in the conference's student showcase. iDMAa hosted the Ideas 07: Beyond Boundaries conference in Philadelphia on November 7-11.
IDIA is creating a media-rich interactive digital kiosk for the Digital Fabrication Institute's MMFX Exhibit hosted by the Indianapolis Museum of Art. Students participating in the IDIA Immersion Seminar in Virtual Worlds are developing the interactive interface, which will act as a station within the exhibit where viewers can access additional biographical and portfolio information – employing a custom-designed, coded and fabricated dynamic media experience.
[un]wired by Jesse Allison, John Fillwalk and Keith Kothman is a network visualizer that processes and responds to interactions from personal radio-frequency devices such as mobile phones, WiFi signals, Bluetooth signals and car-key fobs. It tracks real-time statistical information from wireless access points (designed for seamless handoff of moving wireless traffic, like a cell phone network), along with periodically updated information from hand-held devices and wireless access points. These interactions appear as sound and shapes on the screen, so people interacting with the piece can see their activity live. The control information is collected from network services via a MySQL database and transferred into Max/MSP/Jitter. [un]wired was exhibited at SIGGRAPH Asia 2008 in Singapore.
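The data path described above – access-point statistics read from MySQL and handed to Max/MSP/Jitter – might be sketched roughly as follows. The database credentials, table layout and OSC address are assumptions for illustration only; the installation's own patchers and queries are not reproduced here.

```python
# Rough sketch of an [un]wired-style data path: MySQL access-point statistics
# forwarded toward Max/MSP/Jitter, here via OSC (Max would listen with a
# [udpreceive] object). Credentials, table layout and the OSC address are
# hypothetical.
import time
import mysql.connector
from pythonosc.udp_client import SimpleUDPClient

db = mysql.connector.connect(
    host="localhost", user="idia", password="example", database="wireless")
osc = SimpleUDPClient("127.0.0.1", 7400)  # port the Max patch listens on

while True:
    cur = db.cursor()
    cur.execute("SELECT ap_name, client_count, traffic_bytes FROM ap_stats")
    for ap_name, clients, traffic in cur.fetchall():
        # Each access point becomes an OSC message driving sound and shapes.
        osc.send_message("/unwired/ap", [ap_name, int(clients), int(traffic)])
    cur.close()
    time.sleep(5)  # poll periodically, as the piece updates over time
```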
This device was designed to take visitors on a journey through the history of intermedia art. The kiosk, made by the IDIA in collaboration with the Institute for Digital Fabrication at Ball State University, allows the user to move through the timeline with hand gestures. The work includes a short bio, an image and a description of works from 30 intermedia artists from the 1960s to the present. It was displayed in conjunction with the BSU Museum of Art's Engaging Technology Exhibit. Jesse Allison programmed the piece with help from Mike Sayre. Dustin Headley and the IDF designed and constructed the kiosk.
Jesse Allison and John Fillwalk exhibited their interactive virtual worlds installation Flickr Gettr at the IUPUI Intermedia Festival, held April 23-25 in Indianapolis. The festival featured over 100 international performers from across the arts. For more information, please visit http://music.iupui.edu/intermedia/
Physical location: Philips High Tech Campus and NatLab in Eindhoven, the Netherlands
http://www.hightechcampus.nl/
‘Traversal for Eindhoven’ is a live performance / hybrid-reality installation that bridges the physical and virtual worlds. It connects the Philips NatLab in Eindhoven to a virtual environment as part of the 2009 Dutch Design Week. Avatars will be able to play a physical synthesizer in the Philips NatLab via an interactive online virtual instrument built by IDIA, located near a 3D model of the Eindhoven High Tech Campus. The sound of the live synthesizer will be streamed back into Second Life, so that the virtual and physical participants can hear their interactions with the instrument.
DUTCH DESIGN WEEK From October 17 through October 25, 2009, Eindhoven presents the 8th edition of Dutch Design Week, the largest design event in the Netherlands. Around 1,500 designers from home and abroad will show their work at more than 60 locations across the city of Eindhoven, in design disciplines such as industrial design, concept design, graphic design, textile & fashion, spatial design, food design, and design management & trends. Visitors will be given insight into the entire development process from concept to product in disciplines ranging from industrial design to applied arts. The participants include established bureaus, high-profile designers, talented newcomers and recently graduated designers – one of the reasons why Dutch Design Week is the perfect meeting place for designers, companies and the public. http://www.dutchdesignweek.nl/indexmap.php
MAD EMERGENT ART CENTER The Foundation MAD is a platform and workshop for emergent art: art that exists in the exciting space between cultural and cutting-edge technological developments. MAD emergent art center is a laboratory, platform and provider at the intersection of art, science and technology. MAD addresses artists, designers, scientists, public groups, institutions, governments and businesses, at regional, national and international levels. http://www.mad.dse.nl/mad.html
More Information: http://tinyurl.com/yl2dq4l
The Oculus Rift allows users to be visually transported to virtual 3D worlds. When paired with the Leap Motion, users now have the ability to interact with a 3D virtual world using their own hands.
The visual displays on the Oculus Rift project the user into an immersive 3D environment. The Leap Motion then allows interactivity via infrared images of the user's hands. Together, these two pieces of hardware create exciting new possibilities.
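A minimal sketch of polling hand data from a head-mounted Leap is shown below, assuming the Leap Motion v2 SDK's Python bindings (the Leap module); the lab's demos were built with their own tooling rather than this snippet.

```python
# Minimal sketch: polling hand positions from a Leap Motion mounted on an Oculus Rift.
# Assumes the Leap Motion v2 SDK Python bindings ("Leap" module); not the lab's code.
import time
import Leap

controller = Leap.Controller()
# Tell the tracking service the device is head-mounted rather than on a desk.
controller.set_policy(Leap.Controller.POLICY_OPTIMIZE_HMD)

while True:
    frame = controller.frame()
    for hand in frame.hands:
        pos = hand.palm_position  # millimeters, relative to the Leap device
        side = "left" if hand.is_left else "right"
        print(side, "palm at", pos.x, pos.y, pos.z)
    time.sleep(0.05)
```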
IDIA Lab's Virtual Apollo 15 is a simulation of the mission that launched on July 26, 1971 from the Kennedy Space Center at Cape Canaveral, Florida. During the launch, the S-IC first stage did not shut down completely until four seconds after staging, creating the possibility of the spent stage striking the S-II engines, damaging them and forcing an abort (the S-II exhaust also struck a telemetry package on the S-IC and caused it to fail). Despite this, the third stage and spacecraft reached their planned Earth parking orbit. A couple of hours into the mission, the third stage reignited to propel the spacecraft out of Earth orbit and on to the Moon. The Apollo 15 command module is on display at the Smithsonian Air and Space Museum in Washington, D.C.
IDIA Lab's Kitty Hawk is a simulation of the moment recorded in the famous photograph of the flight of the Wright Flyer (often retrospectively referred to as Flyer I or the 1903 Flyer). The Flyer was the first successful powered aircraft, designed and built by the Wright brothers, who flew it four times on December 17, 1903, near Kill Devil Hills, about four miles south of Kitty Hawk, North Carolina. Today, the airplane is exhibited in the National Air and Space Museum in Washington, D.C.
IDIA Lab's Virtual Izapa is a simulation of the site considered to be the birthplace of the Mayan Long Count, which ended its cycle today, December 21st, 2012 – the winter solstice. The site is oriented on an axis aligned generally, but not precisely, to sunrise on the winter solstice and to sunset on the summer solstice. In direct observation today, the alignment along the axis of the ball court, from Throne 2 to Stela 60, is off by approximately two degrees.
BSU IDIA Lab’s Virtual Giza Necropolis is an archaeological site on the Giza Plateau, on the outskirts of Cairo, Egypt. This complex of ancient monuments includes the three pyramids known as the Great Pyramids, the massive sculpture known as the Great Sphinx, several cemeteries, a workers’ village and an industrial complex. It is located some 9 km (5 mi) inland into the desert from the old town of Giza on the Nile. The pyramids, which have historically loomed large as emblems of ancient Egypt in the Western imagination, were popularised in Hellenistic times, when the Great Pyramid was listed by Antipater of Sidon as one of the Seven Wonders of the World. It is by far the oldest of the ancient Wonders and the only one still in existence.
IDIA Lab's Virtual Chichen Itza is a simulation of the large pre-Columbian city built by the Maya people of the Post-Classic era. The archaeological site is located in the municipality of Tinum, in the Mexican state of Yucatán. The layout of the Chichen Itza core site developed during its earlier phase of occupation, between 750 and 900 AD. Its final layout was developed after 900 AD, and the 10th century saw the rise of the city as a regional capital controlling the area from central Yucatán to the north coast, with its power extending down the east and west coasts of the peninsula. The earliest hieroglyphic date discovered at Chichen Itza is equivalent to 832 AD, while the last known date was recorded in the Osario temple in 998.
Using our MakerBot Replicator 2X 3D printer, IDIA’s Chris Harrison worked with David Rodriguez to create a family of 3 brackets with varying uses and advantages to mount a Leap Motion Controller to the front of an Oculus Rift DK2.
Finding that double-sided tape was not holding the Leap Motion Controller in place effectively, we looked to MakerBot's Thingiverse, an online warehouse of 3D print-ready objects, for a solution.
We found a bracket that, when printed, didn't quite match the tolerances of the Leap's dimensions, so some slight modifications were made to better accommodate it. In addition, rather than the two-piece configuration on the website, a new bracket was designed to print in a single pass.
Finally, after realizing other potential uses for the Leap, 2 more brackets were designed and printed so that the Leap can be securely installed onto the Oculus in a total of 3 different configurations.
The brackets can be viewed and downloaded here:
Bracket 1 Straight bracket used for visual IR passthrough from Leap camera
Bracket 2 Straight bracket used to minimize Oculus IR emitter occlusion
Bracket 3 Angled bracket used to track hands at the best angle – if no passthrough is desired
View an Oculus / Leap project here: https://idialab.org/oculus-rift-and-leap-motion-demo/