River Data Visualization

oneWAPIHANNE is an interactive artwork visualizing live streaming data from water stations along the White River in Indiana. The stations are installed and monitored by the United States Geological Survey (USGS), reporting the current gauge height and water speed over time. The visualization graphically represents each station’s data, either collectively or individually, with overlays for station locations and record river levels. Wapihanne is the name for the White River in the Delaware language.

The chromatic tiles in the upper left represent the gauge stations along the White River. The monochromatic tiles in the upper right represent supplemental data overlays: a line graph of the river height over time, a compressed map of the White River with the approximate locations of the gauge stations, and the record levels for each station. The gray tile in the bottom right of the screen refreshes the data and restarts the program.

Each flow of station data represents river height and velocity over the last thirty days. The top of the flow marks the height, and the size of the flow corresponds to velocity. Touching and dragging a specific flow reveals detailed information about that particular reading and creates a sound whose volume is tied to river velocity.
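A minimal sketch of how one station reading might be mapped onto the visualization’s parameters (the function names, units, and scaling ranges here are illustrative assumptions, not the installation’s actual code):

```python
def normalize(value, lo, hi):
    """Clamp and scale a reading into the 0-1 range."""
    if hi <= lo:
        raise ValueError("hi must exceed lo")
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def flow_params(gauge_height_ft, velocity_fps,
                height_range=(0.0, 30.0), velocity_range=(0.0, 10.0)):
    """Map one station reading to drawing and sound parameters:
    the top of the flow tracks gauge height, while its thickness and
    the touch-triggered sound volume both track velocity."""
    h = normalize(gauge_height_ft, *height_range)
    v = normalize(velocity_fps, *velocity_range)
    return {
        "flow_top": h,      # vertical position of the flow's top edge
        "flow_size": v,     # thickness of the flow
        "touch_volume": v,  # sound volume when the flow is dragged
    }
```

The height and velocity ranges would in practice be derived from each station’s record levels rather than fixed constants.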


The Firefly Effect

A virtual social installation by John Fillwalk, IDIA Lab and Jody Naderi, CAP

This work, designed specifically for Nuit Blanche Toronto, activates a locative field in which participants engage in a dynamic social experience. Individual users of the app interact with a dynamic feedback system that allows them to find the cadence of a contemplative walk. Participants can view other app users on a live map, locating them spatially to join a unique group or virtual room and forming a dynamic choir of collaborative, synchronous light and music.

Nuit Blanche

https://nbto.com

Firefly at Nuit Blanche

https://nbto.com/project.html?project_id=114

IDIA Lab

https://www.idialab.org/firefly/

The History Channel’s The Universe, Ancient Mysteries Solved: Roman Engineering

History Channel at IDIA Lab

John Fillwalk, director of the Institute for Digital Intermedia Arts and senior director of the university’s Hybrid Design Technologies initiative, was interviewed for the History Channel’s “The Universe.” The episode, which aired at 9 p.m. May 23, featured the IDIA Lab’s digital recreation of ancient Roman structures.

For the season finale of its popular program “The Universe,” the History Channel is shining a spotlight on the work of virtual artists from Ball State University.

The season finale episode of the documentary series, which aired May 23, explores how Roman emperors built ancient structures to align with movements of the sun. To confirm experts’ theories about the religious, political, and cultural significance behind these phenomena, the cable network enlisted the help of Ball State’s Institute for Digital Intermedia Arts (IDIA).

Through the use of 3-D computer animation, artists and designers from the IDIA Lab virtually recreated several monuments featured in the episode, along with accurately simulating the alignment of the sun. These structures include: the Pantheon, one of the best preserved buildings of ancient Rome; the Temple of Antinous, a complex that today lies in ruins within Hadrian’s Villa outside of Rome; and the Meridian of Augustus, a site containing an Egyptian obelisk brought to Rome for use as a giant sundial and calendar.

‘Getting things right’

IDIA Lab's Roccabruna reconstruction

 

The IDIA Lab digitally rebuilt ancient Roman structures to give viewers of the History Channel’s “The Universe” a vivid idea of what ruins looked like when they were inhabited.

James Grant Goldin, a writer and director for the History Channel, says IDIA’s animations were an essential part of the program’s second season finale. For an earlier episode of “The Universe,” which aired in 2014, the network contracted with IDIA to animate Stonehenge in a segment demonstrating how the prehistoric monument may have been used to track celestial movements.

This time around, the work of Ball State digital artists is featured throughout the Roman engineering episode.

“It’s an honor for us,” says John Fillwalk, director of IDIA and senior director of the university’s Hybrid Design Technologies initiative. “Ideally our relationship with the History Channel will continue long into the future.”

Goldin said the cable network is keen on employing the IDIA for future projects because Fillwalk and his team create beautifully rendered graphics backed by data and research.

“John was devoted to getting things right,” he said. “Many of the theories we discuss in the show remain controversial, and in nonfiction TV it’s always a good idea to present such things with a qualifying phrase in the narration—an ‘if’ or a ‘maybe.’ But the IDIA Lab combined their own research with that of experts, and I’m very happy with the results.”

Gaming software transforms history

IDIA Lab's Pantheon reconstruction

 

The Pantheon is one of the ancient structures the IDIA Lab rendered in 3-D for the History Channel’s “The Universe.”

Fillwalk has worked closely over the years with many prominent scholars of the ancient world, including Bernard Frischer, a Roman archeologist at Indiana University, and Robert Hannah, dean of arts and social sciences at New Zealand’s University of Waikato, who advised on the archaeoastronomy of the project.

Hannah says he’s been astounded to see the way up-to-the-minute gaming software can bring to life the work of today’s historians and virtual archaeologists. “I’ve seen my son play games like ‘Halo,’ so I knew what was possible,” he said, “but I’d never seen it adapted to ancient world buildings.”

Phil Repp, Ball State’s vice president for information technology, says the relationship between the university and the cable network is a key example of how Ball State is distinguishing itself as a leading world provider of emerging media content.

“For the History Channel to want the continued help of John and his staff speaks to the quality of what our lab can produce,” he said.

Goldin’s praise for the IDIA supports Repp’s theory. “Bringing the past to life is a very challenging task,” he said. “The renderings Ball State artists have created represent the most accurate possible picture of something that happened almost 2,000 years ago.”

Old Planetarium + Cutting Edge Studio


OLD PLANETARIUM REVAMPED INTO CUTTING-EDGE STUDIO

Months after the opening of the Charles W. Brown Planetarium, Ball State has found a new use for its almost 50-year-old predecessor. The dome-shaped room in the basement of the Cooper Science Complex has become a production facility dedicated to developing new shows for planetariums nationwide.

“This is an exciting opportunity for us,” said planetarium director Ron Kaitchuck, “because the last thing we wanted was for the old planetarium to become a storeroom.”

To make use of the space, Kaitchuck and his staff first needed to outfit it with a new digital projector. Helping him secure funding for the device was John Fillwalk, director of Ball State’s Hybrid Design Technologies, a division of the Office of Information Technology that seeks out partnerships injecting the arts into STEM education. More commonly known as STEAM, the acronym stands for science, technology, engineering, arts and mathematics.

“We saw the dome as a valuable space where we could create new content — the kind that’s hard to test on flat-screen computer monitors,” said Fillwalk, who’s also director of the Institute for Digital Intermedia Arts (IDIA), a lab specializing in creating virtual reality and 3-D simulations for a broad range of clients including UNESCO, other universities and the History Channel.

Kaitchuck said Fillwalk’s expertise — as well as that of his student and professional designers — will let planetarium staff and students in the Department of Physics and Astronomy create shows they couldn’t otherwise. “For instance, we don’t know how to craft 3-D models, but John does.”

“And I’m not a scientist, but Ron is,” Fillwalk said. “So there’s a nice synergy at play where he advises us on the science side of these productions and we create the graphics.”

AN ENTREPRENEURIAL VENTURE IN THE MAKING

While the partnership is new this year, Dayna Thompson, assistant planetarium director, looks forward to seeing how the IDIA Lab helps improve shows Ball State is already creating.

One goal is to generate more revenue from the productions. Take, for example, “Saturn & Beyond,” which recently sold to a distribution company for $18,000. “Had we been able to work with IDIA on that, we could have included more graphics to help explain the science that the program talks about,” she said.

Assisting with the creation of “Saturn & Beyond” was physics and astronomy student Monique Gabb, a Florida native completing her master’s degree who wants to work for a government-backed science and engineering research lab like Argonne. “Astronomy is a great gateway to getting people interested in science, so I enjoyed helping create a show like this.”

Gabb said when it came time to test “Saturn” for audiences, it was a challenge to get large blocks of screen time inside the 52-foot domed Brown Planetarium. “There are so many shows going on over there, it was hard to work us in.” With the conversion of the old planetarium into a new production dome, “we no longer have to worry about that issue.”

Kaitchuck said the size of the old planetarium is also a major boon for its new use. “As far as I know, no other planetarium in the country has at its disposal a 30-foot dome for production purposes only,” he said, noting New York City’s Hayden Planetarium, inside the American Museum of Natural History, uses a production dome a fraction of that size.

IDIA TO CREATE NEW, INTERACTIVE CONTENT

Fillwalk has his own aspirations for the old planetarium, now that he’s helped transform it into a studio — one equipped with game-engine computers and a digital projector encased inside a protective fabricated display created by IDIA artist and Ball State alumnus Chris Harrison, ’10, MS ’12.

Most of Fillwalk’s plans involve using the space to help IDIA expand upon — and reshape — content the lab has created for past projects, including virtual simulations of Stonehenge and other ancient ruins in various celestial alignments. “It’s a great opportunity for us to test what we already do in another immersive environment. A dome gives you the same kind of view you’d see with a head-mounted, immersive display, but now we don’t have to wear goggles to get it.”

IDIA is also exploring development of dome-based content for new clients, including museums and aerospace companies. “My biggest interest is finding ways to allow audiences to be more interactive with these kinds of shows,” Fillwalk explained. “If you could hold your phone up as you’re watching, and have a planet or star jump out at you — I think kids would love that.”

OPPORTUNITIES FOR EDUCATION

Closer to home, Fillwalk and his team are working with Thompson on projects for the Brown Planetarium, which in its first year had about 20,000 visitors — more than double the number of people who visited the old planetarium in its last full year of operation.

For the planetarium’s show, “Black Holes, Worm Holes, and the Movies,” IDIA artists visualized travel through a worm hole, which scientists consider to be a shortcut connecting two distant points in space-time. And for an upcoming Halloween show, Fillwalk and his staff will create graphics illustrating the history of the holiday and its astronomical origins.

Kaitchuck wants to use the old planetarium for future entrepreneurial learning opportunities that will let students outside his department help create new planetarium shows. “We could use music media production students for the audio, creative writing majors to help with scripts — it could be a very diverse group,” he said. “And now, with John’s help, the kinds of work we’ll be able to do … well, the sky really is the limit.”

By Gail Werner

DOME Lab photographs courtesy of Ball State University, Division of Strategic Communication

http://bit.ly/2anJRzr

Dr. Bernard Frischer Lecture: Rome Reborn: 3D Reconstructions as Tools for Discovery

Bernard Frischer is a digital archaeologist who writes about virtual heritage, Classics, and the survival of the classical world. He received his Ph.D. in Classics from the University of Heidelberg. He followed his doctoral work in Germany with a two-year postdoc at the American Academy in Rome, where he studied Roman topography and archaeology. Dr. Frischer is currently a professor of Informatics at Indiana University.

The Rome Reborn project is an international initiative, launched in 1996, to create a 3D reconstruction of ancient Rome in AD 320. The model, which took 22 years to complete, makes it possible for newcomers to the subject of Roman topography and urban history to obtain a quick visualization of the monuments in their context in the city. The purpose of this talk is not to explore these instructional applications of the model but to draw out the uses of virtual reality as a tool of discovery.

• Sponsored by the Digital Scholarship Lab (DSL), the Institute for Digital Intermedia Arts (IDIA) and the Ancient Studies Program at Ball State University.

• Introduction by Provost Dr. Susana Rivera-Mills.

For more information, please email us at DSL@bsu.edu

Newton’s Cenotaph in VR

Appeared in Engadget, Variety and Motherboard

Linden Lab lifted the veil on Sansar, the long-awaited follow-up to Second Life, allowing the general public to explore VR environments that beta testers have been creating for the platform. Among the usual suspects of ritzy nightclubs and elven cities is something unexpected.

Over the summer, a group of digital artists from Ball State University used Sansar to construct a digital replica of Newton’s Cenotaph—one of the most awe-inspiring neoclassical structures to never have been built.

“The building is really at this preposterous and fantastical scale,” John Fillwalk, director of Ball State’s IDIA Lab, told me in a phone interview. IDIA Lab is a division within Ball State’s College of Architecture and Planning that explores the intersection of digital and physical design.

“With this technology, you can build the impossible, or at least the impractical,” said Fillwalk. It had been a long-time aspiration of his to digitally assemble Newton’s Cenotaph in some shape or form, and Sansar provided a convenient way to bring the unrealized work of architecture to life.

The Cenotaph is a great, big dome of a building, originally imagined by the French architect Étienne-Louis Boullée in the 18th century. But Boullée’s more grandiose designs tended to skirt the limits of feasibility, and thus rarely saw the light of day. The Cenotaph’s design, for instance, eclipses the height of the Great Pyramids at 455 feet.

“It would take an enormous amount of labor to do something like that in reality,” Fillwalk said. “And the engineering to pull it off would be an outstanding undertaking.”

Sansar made it easier. To begin with, Fillwalk got ahold of high-resolution scans of Boullée’s architectural prints. Following them as closely as possible, the group recreated them in the 3D modeling software Maya.

While the exterior of the unbuilt building is expansive, the interior is mechanically intricate. Boullée envisioned the building as a monument to Isaac Newton, who among other things, worked out mathematical proofs for heliocentrism, the idea that planets orbit around the sun.

In tribute, a great brass armillary sphere, representing the motion of the planets, was intended to rotate within the equally great dome.

One of VR’s greatest assets is giving users a sense of scale, so the medium was a natural fit for resurrecting impossible works of overambitious architecture, Fillwalk said. In fact, the Cenotaph may be too big.

“Because it takes so long to walk through it normally, we put in a teleport feature as a speedy way to get through it,” he said.

https://www.vice.com/en/article/zm4pe4/newtons-cenotaph-has-finally-been-built-but-in-vr

Visit Newton’s Cenotaph

VIRTUAL INSTRUMENT 3D UI

ChromaChord utilizes a combination of the Oculus Rift headset and an attached Leap Motion controller to create an immersive 3D VMI (virtual musical instrument) environment. The Oculus Rift provides a three-paneled visual interface, rendered in Unity, that allows users to access different component windows of the instrument simply by turning their head, while the Leap Motion controller allows the user to interact with the visual interface to musical effect (processed and generated in Max/MSP).

ChromaChord uses the Oculus Rift DK2, a virtual reality headset, to immerse the performer within the VMI. Using two cameras in the virtual environment, the device relays a stereoscopic image to the wearer, allowing for highly dimensional experiences. A gyroscope, accelerometer, and magnetometer detect real-world head rotation and apply it to the wearer’s virtual viewpoint, while a peripheral infrared camera captures limited-range positional tracking. A Leap Motion controller affixed to the Oculus Rift provides infrared tracking of the hands. In our application, the Leap Motion tracks the user’s real-world hands and simulates them, mimicking hand positions and finger movements; the user’s own hands are therefore the primary input mechanism.

Custom Software
  • Key note triggering and glow by Leap Motion hand touches
  • Particle generation triggered by Leap Motion hand touches
  • Button click and glow by Leap Motion hand touches
  • Particle destruction and orbit physics
  • OSC Server and Client configurator / Message Handler / Data Handler
Our graphical user interface is composed of three sections: left, center, and right panels. On the left, the user sees the modulation sphere. By touching this sphere, the user generates orbital particles; the number of these particles affects a live process on the notes. The user can remove particles by simply swiping them away. Turning to the right reveals the root note and scale type controls. To select a root note and scale type, the user presses the desired button, and the interface responds by highlighting the selected options. The center module contains the keys, a chromatic scale in two colored columns based on the Solfège; this is the core performance area of the UI. In addition, arrows facing up and down let the user raise or lower the octave. All buttons and keys respond to touches by glowing white.

3D Musical Instrument

The default window is the center palette, where the instrument itself is located. Within this window, there are two columns of colored keys (four per column) to the left and the right of the screen. To create sound, the user has but to put their hands vertically in front of them and ‘touch’ the keys, which are placed both for ease of use and for the ability to contact multiple notes simultaneously to create chords. The notes themselves are stacked in intervals of a third from each other; consequently, neighboring notes on the same hand will create conventional harmonies. Between the columns, at their top- and bottom-most extremities, are two arrows that, if touched, move the current note set up or down an octave. When a note is touched, Unity sends a UDP message to Max/MSP indicating which note has been touched; the note remains active until contact ceases. Similarly, touching the octave buttons sends UDP messages to Max to reassign the MIDI note values.
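The UDP note messages described above could be framed as OSC packets, which Max/MSP’s [udpreceive] object decodes natively. A minimal sketch follows; the /note address pattern, port number, and encoding details are assumptions for illustration, not ChromaChord’s actual implementation:

```python
import socket
import struct

def osc_message(address, value):
    """Encode a single-integer OSC message: an address string and the
    type tag ",i", each null-padded to a 4-byte boundary, followed by
    a big-endian int32 argument."""
    def pad(b):
        # OSC strings are null-terminated and padded to a multiple of 4
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",i") + struct.pack(">i", value)

def send_note(note, host="127.0.0.1", port=7400):
    """Fire a note message at a Max patch listening with [udpreceive 7400]."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message("/note", note), (host, port))
```

A matching note-off could be sent the same way when contact ceases, e.g. with a second address pattern or a zero argument.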
Within Max, MIDI values are converted to their equivalent frequencies and assigned to their respective oscillators before being processed and output through Max’s audio out.

Looking to the right, the user accesses the settings window, where note assignment takes place. First, the user selects the root note of the scale they wish to be within. The second parameter allows the user to select the mode of the scale: major, natural minor, or harmonic minor. By default, the instrument opens with a root note of C in the major mode. The UDP messages sent from the settings window control a series of gates within Max that send adjusted MIDI values to the instrument’s oscillators.

The performer accesses the modulation window to the left, which controls the frequency and Q of a filter, as well as the level of effect. By touching the button in the center, Unity spawns floating particles that surround the user’s hands. The longer the user touches the button, the more particles accumulate; they can be removed by making shaking or throwing motions. Without particles, the filter’s frequency setting sits at its lowest audible value, creating a much darker, mellower timbre; as more particles accumulate, the filter opens up, allowing higher frequencies to pass through and creating a brighter, richer timbre. The particles also move the dry/wet ratio for effects toward the ‘wet’ end of the spectrum, deepening the audibility of the effects. The UDP message sent from the modulation window carries a single numeric value for the total number of active particles; this value is then scaled in Max/MSP to suit each parameter being altered.
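The two conversions described above can be sketched as follows. Max provides the equivalent [mtof] and [scale] objects; the Python below only illustrates the math, not the actual patch:

```python
def midi_to_freq(note, a4=440.0):
    """Convert a MIDI note number to its frequency in Hz using
    twelve-tone equal temperament (A4 = MIDI note 69 = 440 Hz)."""
    return a4 * 2 ** ((note - 69) / 12)

def scale_value(value, in_lo, in_hi, out_lo, out_hi):
    """Linear rescale, like Max's [scale] object; used here to map the
    0-100 particle count onto a filter cutoff or dry/wet amount."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)
```

For example, a particle count of 50 maps to the midpoint of whatever cutoff range the patch exposes, and MIDI note 60 (middle C) yields roughly 261.63 Hz.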
Name         Type             Range   Default   Notes
Do           Key              0-1     0         Bottom of octave
Re           Key              0-1     0
Mi           Key              0-1     0
Fa           Key              0-1     0
Sol          Key              0-1     0
La           Key              0-1     0
Ti           Key              0-1     0
Do           Key              0-1     0         Top of octave
Up Octave    Octave Switcher  0-1     0         1 when clicked
Down Octave  Octave Switcher  0-1     0         1 when clicked
Modulation   Modulation       0-100   0         # of particles
Root Note    Root Note        0-11    0
Scale Type   Scale Type       0-2     0

Sinespace Gallery Designer

IDIA Lab has created a virtual gallery design system in partnership with virtual world creator Sinespace. Sinespace, a multi-user virtual world platform based in the UK, contacted IDIA to design this system for their new virtual environment. The Gallery Designer is an easy-to-use system for dynamically creating gallery and museum exhibitions on the fly from any web-hosted images. Users can easily pick gallery styles (classical or modern), exterior settings, frame styles, lighting, floors, wall colors, and ceiling treatments, customizing an experience that complements the exhibition theme. The gallery can then be easily published, and visitors from around the globe can attend a virtual reception.
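A gallery configuration of the kind described might be represented as simple structured data; the field names and option lists below are illustrative assumptions, not Sinespace’s actual schema:

```python
# Options a user can pick from, per the Gallery Designer description.
# These option lists are illustrative, not the platform's real schema.
GALLERY_OPTIONS = {
    "style": {"classical", "modern"},
    "frame": {"thin", "ornate", "none"},
}

def make_gallery(images, style="modern", frame="thin",
                 wall_color="#ffffff"):
    """Build a validated gallery description from web-hosted image URLs."""
    if style not in GALLERY_OPTIONS["style"]:
        raise ValueError(f"unknown style: {style}")
    if frame not in GALLERY_OPTIONS["frame"]:
        raise ValueError(f"unknown frame: {frame}")
    if not all(url.startswith(("http://", "https://")) for url in images):
        raise ValueError("images must be web-hosted URLs")
    return {"style": style, "frame": frame,
            "wall_color": wall_color, "images": list(images)}
```

Validating choices against a fixed option set mirrors how the designer constrains users to the gallery styles and treatments it can actually render.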

Wright Brothers Workshop

IDIA Lab has created a virtual simulation of the Wright brothers’ bicycle shop in Dayton, OH for the HTC Vive headset and SteamVR. The environment recreates the Wright Cycle Company building on South Williams Street in Dayton, which was built in 1886 and is a US National Historic Landmark listed on the National Register of Historic Places. The Wrights occupied this location from 1895 to 1897. IDIA Lab designed the experience as a virtual field trip in which a visitor can pick up and examine highly detailed objects from the shop, including letters, tools, and inventions created by the Wrights. This project will be published on SteamVR in the coming months.

EVAC+ Virtual Disaster Simulator

IDIA Lab partnered with physicians at the Seattle Children’s Hospital to develop a virtual simulator to train staff in the case of various disasters.

The EVAC+ Virtual Disaster Simulator leverages virtual reality to train health care providers to react quickly and effectively to unexpected events that threaten patient safety. It was developed after pilot live simulations, user focus groups, surveys, and a successful pilot of the initial EVAC system. After the pilot identified the need for more communication opportunities and for variation in disaster responses, EVAC+ was created. EVAC+ retains the ability to prepare patient equipment for an evacuation but adds significantly more features, allowing interaction with the hospital room, other staff, and family members. Upon entering the EVAC+ environment, providers are oriented to the virtual space, including navigation, interacting with objects, and how to interact verbally with non-player characters.

Once they are comfortable with the space, they can choose from a menu of disasters to experience, including earthquakes, fire, and an active shooter event. Each simulation has unique objectives around patient protection, equipment packing, and communication with other staff and families. Learners practice providing a status update to their charge nurse using the SBAR technique, using closed-loop communication, and responding to anxious family members.

Pre-determined safety behaviors, such as stopping and resolving a mission critical issue, are prompted and performance recorded for feedback. After each scenario the user is guided through a self-debriefing on how well they performed on key disaster and communication behaviors. Information on how they managed and packed medical equipment is presented along with evidence-based information on how experts would recommend managing equipment. Users can choose to repeat any scenario to practice or refine their responses and can explore the scenarios in any order.  

The Unity3D environment is accessible in web browsers and on head-mounted VR systems. Amazon Web Services (AWS) supports voice communication and real-time analytics. The EVAC+ system fills a unique need for accessible, interactive, sustainable disaster training for health care providers.

Calendar Reminders for Your Fitbit with Luvtaps

Luvtaps Gives You Google Calendar Reminders on Your Fitbit

IDIA Lab has released a new product called Luvtaps. Luvtaps makes it easy to receive a reminder whenever you have a calendar event coming up. While Google Calendar can notify you on your phone of an upcoming event, Fitbit does not let third-party apps trigger its vibration directly, so Google Calendar cannot use the device on its own. Luvtaps works around this by adding a silent alarm to your Fitbit for every Luvtap you create from a Google Calendar event.
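The translation from calendar event to silent alarm can be sketched as follows (the function names and default offset are illustrative assumptions, not Luvtaps’ actual code):

```python
from datetime import datetime, timedelta

def luvtap_alarm_time(event_start, minutes_before=10):
    """Compute when the Fitbit silent alarm should fire for a calendar
    event: the event's start time minus the reminder offset."""
    return event_start - timedelta(minutes=minutes_before)

def adjust_offset(minutes_before, step, increment=True):
    """Mirror the up/down buttons: increment or decrement the reminder
    offset, never dropping below zero."""
    delta = step if increment else -step
    return max(0, minutes_before + delta)
```

The resulting time would then be registered with the selected tracker as a one-shot silent alarm through Fitbit’s alarm API.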

Features

  • Google Calendar integration
  • Fitbit integration
  • Shows all Google Calendar events in the next 24 hours
  • Search for Google Calendar events
  • Support for multiple Fitbit devices

Start Receiving Calendar Reminders

Luvtaps is very easy to use. Navigate to http://luvtaps.org and follow these steps to get started:

  1. To begin, sign in with Google.
  2. Then, sign in with Fitbit.
  3. Once you have linked both accounts, select the Fitbit device to which you would like to add a Luvtap.
  4. Then, click the plus button on a Google Calendar event to create a Luvtap for it.
  5. Once you have created a Luvtap, use the up or down buttons to increment or decrement the Luvtap notification time.
  6. Finally, click the minus button on the Luvtap to delete it.

Start receiving calendar reminders today at http://luvtaps.org!



Designed for use with Google Calendar™ and FITBIT® data
Google and the Google logo are registered trademarks of Google Inc., used with permission.
Fitbit is a registered trademark and service mark of Fitbit, Inc. luvtaps is designed for use with the Fitbit platform. This product is neither produced nor endorsed by Fitbit, and Fitbit does not service or warrant the functionality of this product.

Hadrian’s Villa – Download!


Always a great pleasure working with Bernie!

Bernard Frischer

We recently posted a UNITY webplayer of some of the core buildings at Hadrian’s Villa. This can be used at no cost by people around the world who want to learn more about the villa or teach it to their students. The webplayer supports avatars, chatting, and can easily be used with Skype so that you can talk to your fellow avatars in world. We thank John Fillwalk and Ball State University’s IDIA Lab for the great work in making the webplayer possible. We also thank the Andrew W. Mellon Foundation and the Indiana School of Informatics for providing the financial support that made this project possible.

You must open it in Mozilla Firefox. Before doing so, I’d urge you to look at the README file with instructions. You can find that file and the UNITY webplayer at:

http://vwhl.soic.indiana.edu/villa/3dmodelsdownloads.php

I’d love to know what you think and be able to observe you using it with your students and friends. So please keep me in the loop. Enjoy!

Hadrian’s Villa – 3D Model Downloads
VWHL.SOIC.INDIANA.EDU

IDIA Lab part of Ball State University President’s Strategic Plan

The Institute for Digital Intermedia Arts continues its research and design of virtual worlds, animation, simulation, and augmented reality applications in several existing and new laboratories: the Human Computer Interface Lab (HCI Lab), creating tools and content for the exploration of next-generation input devices, electronics, and sensors; the Simulation Lab (SimLab), investigating immersive virtual simulation environments; and the Digital Dome Production Lab (DomeLab), developing the expertise to create science and arts digital dome masters for distribution to planetariums around the world.

#simulation #future http://centennialcommitment.bsu.edu/…/…/FY15AnnualReport.pdf

Wonders of Oz is an augmented reality app


Wonders of Oz is an augmented reality application that provides users an enhanced 3D experience with the film The Wizard of Oz. This demo app can be triggered during the film (or anywhere, anytime) to deliver animated content and sound that augment the screening and share the event with a live audience. Flying monkeys, the witch on her burning broom, lollipops, the hot air balloon, Dorothy’s falling house, cows, chickens, rowboats, bicycles, rainbows, and the wizard all appear with the press of a button! Users can point their devices to discover and track the virtual performers as they appear around them during the film. The original version contained more 3D content and was designed to work on a timer so the audience could share in the digital performers appearing on cue. The app can be adapted for any event: live theater, film concerts, art and cultural heritage events, and more. Note: this application requires a gyroscope, which not all Android devices possess.
iOS: http://apple.co/1SmwI7Z
Google Play: http://bit.ly/1RRbNds

Launch of Virtual Companion iOS app

Virtual Companion – custom augmented reality app employing LocusEngine, a geolocative process developed by IDIA Lab.
https://idialab.org/launch-of-virtual-companion-ios-app-by-…/

Halloween Celestial Origins


The Charles W. Brown Planetarium’s newest production, “Halloween: Celestial Origins,” premiered the weekend of October 7th in collaboration with IDIA Lab.

Learn the history of Halloween and how it fits into the seasons as a “cross-quarter day” in this new production. Explore the night sky and learn what planets, constellations, and stars will be out on Halloween. Created in collaboration with the Ball State University Department of Physics and Astronomy, the Department of Anthropology, and IDIA Lab, the Institute for Digital Intermedia Arts at Ball State University. More information at http://bsu.edu/planetarium.


IDIA Lab Featured on WTHR News for Virtual Reality research


The future of home entertainment is taking a giant step forward into the world of virtual reality. More devices are hitting the market that transform the way we look at the world around us, and a group of developers at Ball State is on the cutting edge.

It’s something we experienced first-hand at Ball State University, where content for virtual reality devices is produced at the Institute for Digital Intermedia Arts. The lab is located in the architecture building on the Ball State campus, where the focus is on the exploration of the arts, science, and technology.

For the last ten years, Ball State has been developing a wide range of projects for museums, the History Channel and other clients.

John Fillwalk oversees the work done there.

“We basically explore 3D simulation and virtual reality,” he explained. “The skill sets from the staff are across computer science to design to animation.”

“We continue to further the emerging media brand aspect of this university, giving students professional opportunities to work and stay in Indiana for design technology jobs,” said John Fillwalk.

The virtual reality industry estimates more than 7 million headsets will ship this year, even though researchers have found more than half of Americans have no interest in VR and don’t know what it is. Developers foresee huge growth in the next six years as they develop virtual reality social media apps, combined with lower prices for the technology.

http://bit.ly/29JHJAl

Solar System and Jupiter App


Explore the solar system by touch and augmented reality. A new app for iOS, Android and Oculus Rift designed by BSU’s IDIA Lab allows users to learn about our sun and planets – and their moons. Physical museum objects trigger augmented experiences of each planet, providing deeper context and understanding. View the moons’ orbits, examine each moon, each planet’s atmosphere, and unique features such as Jupiter’s Great Red Spot. IDIA designed this augmented reality app as a prototype for the Boston Museum of Science Planetarium.

Photogrammetry capture of the Pertev Paşa Mosque


IDIA Lab’s Chris Harrison presented on the topic of photogrammetry at a workshop in İzmit, Turkey, the results of which have recently been published in Mimarlık | Tasarım Kültürü Dergisi (Architecture | Design Culture Magazine).

View the 3D Model

https://skfb.ly/PKpx

The workshop

The workshop was a collaboration between Ball State University’s College of Architecture and Planning (group led by Jonathan Spodek, FAIA), the Politecnico di Milano (group led by Prof. Elisabetta Rosina), and the host Kocaeli University (group led by Dr. Emre Kishali).

While the workshop focused broadly on theories of historic preservation in relation to the city of İzmit as a whole, Chris was invited to the workshop to present photogrammetry as a cheap, fast, and accessible method of site documentation and preservation. Chris led a team comprised of one student from each participating university and used the mosque and its grounds as a hands-on experience to teach various capture techniques.

While other documentation methods such as laser scanning exist, they require very expensive equipment that takes considerable training. Photogrammetry only requires a digital camera, which many people have on their cell phones these days. Using a free version of Autodesk’s browser-based ReMake, Chris and the students were able to capture and process the mosque interior and several smaller exterior details over the course of a few days.

Post processing

After returning from the workshop, we decided to further refine the model and optimize it for a virtual reality experience. Using a variety of software tools, we simplified the model so that it could run as a live experience. This now enables visitors to experience a site located halfway across the world as if they were there.
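The specific tools used for the optimization aren’t named above, but the underlying idea of simplifying a dense photogrammetry mesh for real-time use can be illustrated with one common technique, vertex clustering: snap vertices to a coarse grid, merge those that land in the same cell, and drop the triangles that collapse. This is a minimal teaching sketch, not the lab’s actual pipeline.

```python
def decimate_by_clustering(vertices, triangles, cell=0.5):
    """Collapse vertices falling in the same grid cell, then drop
    triangles that become degenerate.
    vertices: list of (x, y, z); triangles: list of (i, j, k) indices."""
    cluster_of = {}    # grid cell -> new vertex index
    new_index = []     # old vertex index -> new vertex index
    new_vertices = []
    for x, y, z in vertices:
        key = (round(x / cell), round(y / cell), round(z / cell))
        if key not in cluster_of:
            cluster_of[key] = len(new_vertices)
            # place the merged vertex at the snapped grid position
            new_vertices.append((key[0] * cell, key[1] * cell, key[2] * cell))
        new_index.append(cluster_of[key])
    new_triangles = []
    for i, j, k in triangles:
        a, b, c = new_index[i], new_index[j], new_index[k]
        if a != b and b != c and a != c:  # keep only non-degenerate faces
            new_triangles.append((a, b, c))
    return new_vertices, new_triangles
```

Production tools typically use more sophisticated error-driven methods (such as quadric edge collapse) that preserve silhouettes and UVs far better, but the trade-off is the same: fewer triangles in exchange for geometric detail, tuned until the model holds a steady frame rate in VR.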

Kevin Nolan

Director, Applied Anthropology Laboratories

Kevin is an Ohio Valley archaeologist with a primary research interest in the Late Prehistoric period (ca. AD 1000 – 1600) of the Middle Ohio Valley, particularly how humans interact with the environment. He has published collections-based research, results of fieldwork, and theoretical models for the Middle Woodland (ca. 50 BC – AD 400) and Late Prehistoric periods. Other research interests include evolutionary approaches to human behavior, siteless survey and regional analysis, paleoenvironments, and systematics. Kevin also has an interest in public education and has regularly given presentations to grade school and high school classes about archaeology and science.

Recent Publications:

Hart, John P. and Kevin C. Nolan
2015 Comment on Cook and Comstock’s “Evaluating the Old Wood Problem in a Temperate Climate: A Fort Ancient Case Study”. American Antiquity 80(3):610-612.

Nolan, Kevin C., Samantha Blatt, Paul Sciulli, and Christine K. Thompson
2015 A Late Woodland Red Ochre Burial Cache in Madison County, Ohio. Manuscript accepted for North American Archaeologist 36(3).

Nolan, Kevin C. and Brian G. Redmond
2015 Geochemical and Geophysical Prospecting at Three Multicomponent Sites in the Southwestern Lake Erie Basin: A Pilot Study. Journal of Archaeological Science: Reports 2:94-105. DOI: 10.1016/j.jasrep.2015.01.002

Nolan, Kevin C. and Paul Sciulli
2014 Rejoinder to Sciulli and Purcell: Two Late Prehistoric Dogs from the Reinhardt Site (33PI880), Pickaway County, Ohio. Pennsylvania Archaeologist 84(2):65-73.

Nolan, Kevin C.
2014 Prehistoric Landscape Exploitation Strategies Through Time in Central Ohio: A GIS Analysis. Journal of Ohio Archaeology 3:12-37.

Nolan, Kevin C.
2014 Prospecting for Prehistoric Gardens: Results of a Pilot Study. Short Report accepted for Archaeological Prospection 21(2):147-154. DOI: 10.1002/arp.1465.

Nolan, Kevin C. and Robert A. Cook
2012 A Method for Multiple Cost Surface Evaluation of a Model of Fort Ancient Interaction. Manuscript prepared for Least Cost Analysis of Socionatural Landscapes: Archaeological Case Studies, edited by DA White, and S Surface-Evans, pp 67-93, plate 5. University of Utah Press.

Roos, Christopher I. and Kevin C. Nolan
2012 Phosphates, Plowzones, and Plazas: A Minimally Invasive Approach to Infer Settlement Structure of Plowed Village Sites in the Midwestern USA. Journal of Archaeological Science 39(1):23-32, doi:10.1016/j.jas.2011.06.033.

Nolan, Kevin C. and Robert A. Cook
2011 A Critique of Late Prehistoric Systematics in the Middle Ohio Valley. North American Archaeologist 32(4):293-325.

Nolan, Kevin C.
2011 Distributional Survey of a Fort Ancient Village in Pickaway County, Ohio: Summary of 2007 Reinhardt Site Survey. Midcontinental Journal of Archaeology 36(1):105-130.

Nolan, Kevin C. and Steven P. Howard
2010 Using Evolutionary Archaeology and Evolutionary Ecology to Explain Cultural Elaboration: The Case of Middle Ohio Valley Woodland Period Ceremonial Subsistence. North American Archaeologist 31(2):119-154.

Nolan, Kevin C. and Robert A. Cook
2010 Volatile Climate Conditions Cahokia: Comment on Benson, Pauketat and Cook 2009. American Antiquity 75(4):978-983.

Nolan, Kevin C. and Robert A. Cook
2010 An Evolutionary Model of Social Change in the Middle Ohio Valley: Was Social Complexity Impossible During the Late Woodland but Mandatory During the Late Prehistoric? Journal of Anthropological Archaeology 29:62-79; doi:10.1016/j.jaa.2009.10.004.

Hero’s Horizons a Real-Time 3D Adventure Game


Hero’s Horizons is a real-time 3D adventure game where players can choose a character that shapes the outcomes of their own experience in learning about civic responsibility and the consequences and impact of their own actions. The game employs a graphic novel esthetic as it immerses players in the conflict and progress of improving the health of an urban community.

In order to design an environment which allowed mission-based gameplay as well as free-roaming exploration, the team at Ball State University’s IDIA Lab employed Unity 3D alongside various modeling, animation, motion capture, scripting, painting and texturing packages to create the look and feel of the world. Characters came to life through cinematic and graphic novel techniques, allowing the story to flow as required by interactive missions while also allowing free gameplay via exploration and mini-games. Players’ good deeds earned them mementos from the characters whose lives they helped shape and change.

Ball State University and Creative Associates have developed and disseminated a highly successful and transformative life skills curriculum to help at-risk youth in Central America and the Caribbean. But Creative Associates’ life skills training has always happened face-to-face, making it difficult for organizations to scale it large enough to help the greatest number of at-risk youth. In 2013, Creative Associates and Ball State University joined forces to explore the possibilities of gaming and gamification to expand the reach and impact of the existing life skills curriculum. The result is Hero’s Horizons – an innovative game-based learning system designed to deliver critical life skills training to at-risk youth via mobile devices and the Web.

Creative Associates International and Ball State University have combined expertise to create a truly blended-learning game for change targeted at youth living in high crime areas of Central America and the Caribbean.  By bringing together a mobile-ready graphic novel game – Hero’s Horizons, built in Unity 3-D, and 20 interactive e-modules, Creative seeks to build the life skills of 10,000 youth living in gang affected communities. Youth can access the game and training through their mobile devices as well as in some of the 200 youth outreach centers Creative has set up across the region.

Hero’s Horizons allows the player to customize a character and then play the role of a young community organizer working in four sections of Capital City, mobilizing community youth and leaders to create a youth outreach center. Along the way, players encounter apathy, political manipulation, and many other issues facing at-risk youth, choosing how to respond as they learn to overcome barriers at both the individual and community level.

The graphic novel game and accompanying e-modules help youth advance their life skills in 20 competencies, including conflict management, finding employment, and teamwork, among others. The game and the e-modules continually reinforce each other as part of a blended learning approach for long-term retention.

Youth’s achievements are rewarded both through achievement badges as they successfully complete tasks and levels, as well as in the e-module section where youth earn printable Certificates of Competency for completing modules and Mozilla badges for achieving learning paths.

The game is being rolled out and tested through Creative’s network of 200 youth outreach centers in the region and other youth serving organizations to measure its effectiveness and further adapt it to the realities that youth in these communities face.

CAI

www.creativeassociatesinternational.com

Download Android game at Google Play

https://play.google.com/store/apps/details?id=org.idialab.LifeSkills&hl=en

Douglas Seefeldt

Douglas Seefeldt is a digital historian with teaching and research interests that focus on the intersections of history and memory in the American West. He arrived at Clemson University in the fall of 2020 after spending eight years in the Department of History at Ball State University in Muncie, Indiana, where he was founding Research Director of the Digital Scholarship Lab from the fall of 2016 through the spring of 2020, a 2014-15 Fellow at the Charles Warren Center for Studies in American History at Harvard University, where he participated in the Workshop on Multimedia History and Literature: New Directions in Scholarly Design, and an Emerging Media Fellow with the BSU Center for Media Design from 2012-14.

James Connolly

Professor of History

James Connolly is Professor of History and Director of the Center for Middletown Studies at Ball State University. He is the author of An Elusive Unity: Urban Democracy and Machine Politics in Industrializing America (Cornell University Press) and The Triumph of Ethnic Progressivism: Urban Political Culture in Boston, 1900-1925 (Harvard University Press). He also edited After the Factory: Reinventing America’s Industrial Small Cities (Lexington Books) and has published numerous articles and essays in edited volumes and journals such as Social Science History, the Journal of Urban History, and the Journal of the Gilded Age and Progressive Era. Connolly’s research focuses on American urban, political, and ethnic history in the 1870-1930 period. He is currently at work on “What Middletown Read,” a study of print culture and urban life at the turn of the twentieth century, in collaboration with Frank Felsenstein.

Wonders of Oz: iOS and Android app

Wonders of Oz is an augmented reality application that provides users an enhanced 3D experience with the film The Wizard of Oz. This demo app can be triggered during the film (or anywhere, anytime) to deliver animated content and sound that augment the screening and share events with a live audience. Flying monkeys, the witch on her burning broom, lollipops, the hot air balloon, Dorothy’s falling house, cows, chickens, rowboats, bicycles, rainbows and the wizard all appear with the press of a button! Users can use their devices to discover and track the virtual performers as they appear around them during the film. The original version contained more 3D content and was designed to work on a timer so the audience could share in the digital performers appearing on cue! This app can be adapted for any event: live theater, film concerts, art and cultural heritage events, etc. Note: This application requires a gyroscope – not all Android devices possess one.

iOS: http://apple.co/1SmwI7Z

Google Play: http://bit.ly/1RRbNds


Buffalo Bill’s Wild West gets a virtual makeover

Virtual Buffalo Bill's Wild West

Digital artists from Ball State’s IDIA Lab built their virtual simulation of Buffalo Bill’s Wild West with the Unity 3-D gaming platform along with custom software created by the lab.

Cowboys and Indians captivated the country when Buffalo Bill’s Wild West rolled through America in the late 1800s and early 1900s. More than a century later, Ball State digital artists have re-created the legendary showman’s outdoor exhibition.

Working with staff from the Buffalo Bill Center of the West, artists and designers from Ball State’s Institute for Digital Intermedia Arts (IDIA) have crafted a computer-generated world that authentically simulates the Wild West show, dramatizing frontier life.

“The visual look and feel of the project is something we’re really proud of,” said John Fillwalk, IDIA director and senior director of the university’s Hybrid Design Technologies initiative.

Fillwalk collaborated on the project with Jeremy Johnston, curator of the center’s Buffalo Bill Museum, and Ball State historians James Connolly and Douglas Seefeldt.

As a senior digital editor of the Papers of William F. Cody, Seefeldt has worked closely with Johnston on several projects the National Endowment for the Humanities recently recognized as among the most significant it funded.

“When Doug introduced me to John, I was excited because all we had to visually represent the Wild West show at the Buffalo Bill Museum was this small architectural diorama,” said Johnston, who is also managing editor of the Papers of William F. Cody. “It gave our visitors an aerial overview of the show but lacked action.

“What the IDIA captured for us is the look and feel of the experience, right down to the sound effects of horses and the stage coach running through the arena.”

Buffalo Bill’s Muncie visit

Interactive Firearm Demonstration

IDIA-created augmented reality apps will feature objects in the museums’ collections, such as firearms.

The Virtual Buffalo Bill project offered a crossover research opportunity for Connolly, director of Ball State’s Center for Middletown Studies. The center is developing Virtual Middletown, a 3-D visualization of industrializing early 20th-century Muncie, and the Buffalo Bill simulation offered an opportunity to produce a module as part of that endeavor.

Connolly and Seefeldt provided Fillwalk with photographs and newspaper accounts of Buffalo Bill’s 1899 stop in Muncie. “He personified the Wild West for audiences in these small towns,” Connolly said.

Connolly’s and Seefeldt’s research, along with assets provided by the Buffalo Bill Center, allowed Fillwalk and his artists to create beautifully rendered graphics based on data and research, hallmarks that have distinguished IDIA’s work in emergent media design.

“The attack on the Deadwood Stage Coach is simulated down to representing John Y. Nelson, one of America’s original Mountain Men driving the coach,” Fillwalk explained. “And Cody himself—along with his wardrobe—was painstakingly researched and re-created. His appearance was based on specific clothing of Cody’s in the museum collection that we were allowed to photograph.”

Seefeldt said Fillwalk’s re-creations uniquely capture William F. “Buffalo Bill” Cody.

“His show had it all—buffalos, the Pony Express, Annie Oakley, re-enactments of iconic events in the history of the West. He was one of the most famous people in the country, a celebrity of that era, and it’s a thrill to see the way John has brought him back to life.”

Ball State-Center of the West partnership continues

Located in Cody, Wyoming, the Buffalo Bill Center of the West includes the Buffalo Bill, Draper Natural History, Whitney Western Art, Plains Indian and Cody Firearms museums, along with the McCracken Research Library.

The Origins of Buffalo Bill

Born in 1846, William F. Cody rode for the Pony Express, served as a military scout and earned his moniker “Buffalo Bill” while hunting the animals for the Kansas Pacific Railroad work crews. Beginning in 1883, he became one of the world’s best showmen with the launch of Buffalo Bill’s Wild West, which was staged for 30 years, touring America and Europe multiple times.

The IDIA Lab’s next project for the center will be a series of augmented reality apps featuring objects in each museum’s collection. By holding electronic devices over images like a grizzly bear or gun, users can learn more about them as 3-D models of the subjects pop up on screen.

“By using their phones or tablets, visitors can see museum exhibits come to life,” Fillwalk said. “All of our work is meant to give visitors a greater appreciation for these assets with the aid of our digital interpretations.”

Johnston said what he likes best about Fillwalk’s approach is the way “he puts technology in the users’ hands.”

“I’ve seen so many younger people walking through our museums with their heads down, glued to their iPhones and iPads. With John’s help, I’m excited that we’re taking something they’re so familiar with and using it in a way to get them to engage with our exhibits here.”

Funding for the Virtual Buffalo Bill project was provided by a grant from the Buffalo Bill Center of the West, which was matched by internal grant funding from Ball State.

http://cms.bsu.edu/news/articles/2015/10/ball-state-gives-buffalo-bills-wild-west-a-virtual-makeover

History of Muncie gets Garfield treatment for upcoming ArtsWalk digital art event

MUNCIE, Ind. — Ball State University’s IDIA Lab, in partnership with Paws, Inc., has created a special digital art event in honor of Muncie’s sesquicentennial celebration. The multimedia event will take place during Muncie’s ArtsWalk at 8 p.m., Oct. 1, at the corner of East Charles and South Walnut streets in downtown Muncie.

Jim Davis’ Garfield will virtually host the event, which will feature historical imagery and clips of Muncie mentions in pop culture. “We wanted to create something fun and tongue-in-cheek, but also positive and optimistic about the city’s future,” said John Fillwalk, director of IDIA and senior director of the university’s Hybrid Design Technologies initiative.

“We were able to get a sneak peek at the community collaboration piece and I think ArtsWalk guests are in for a real treat,” said Cheryl Crowder, events coordinator for DWNTWN, the city’s downtown development campaign. “John and his team are so creative, and their ability to take an idea from conception to completion has been awesome.”

Other community partners for the digital art event include the Delaware County Historical Society, Muncie Arts and Culture Council, Center for Middletown Studies and Ball State University Libraries.

After the ArtsWalk premiere, Fillwalk said the piece will be made available to the public. “We consider it a work of community art. The more people who can view it and promote it for us, the better.”

For more about ArtsWalk, visit www.downtownmuncie.org/artswalk

Virtual Buffalo Bill

Ball State University’s IDIA Lab https://idialab.org is developing a multiplayer virtual world that simulates Buffalo Bill’s Wild West show. This digital history project is built in Unity 3D using custom software created by IDIA Lab and is being produced for the Buffalo Bill Center of the West in Cody, WY. Scholars include Douglas Seefeldt and James Connolly at Ball State University. http://centerofthewest.org


Adam Kobitz

Virtual Worlds Modeler and Animator

Adam graduated from Ball State with a Bachelor of Fine Arts in 2012, then returned to pursue his Master of Fine Arts in Animation the following year. After two years with IDIA Lab as a graduate assistant, Adam continued his time with the team as staff starting in 2014. As a 3D generalist, Adam’s skillset primarily includes modelling, texturing, lighting, rendering, and compositing. In addition to his 3D work, Adam is also a Part 107-certified sUAS/drone pilot and spends his free time travelling the country to compete in professional-level drone racing competitions.

Virtual Apollo 15

IDIA Lab’s Virtual Apollo 15 is a simulation of the mission which launched on July 26, 1971 from the Kennedy Space Center at Cape Canaveral, Florida. During the launch, the S-IC first stage did not completely shut off for four seconds following staging, creating the possibility of the spent stage banging into the S-II engines, damaging them and forcing an abort (the S-II exhaust also struck a telemetry package on the S-IC and caused it to fail). Despite this, the third stage and spacecraft reached their planned Earth parking orbit. A couple of hours into the mission, the third stage reignited to propel the spacecraft out of Earth orbit and on to the Moon. The Apollo 15 command module is on display at the Smithsonian Air and Space Museum in Washington, D.C.

Virtual Apollo 15

Visit Virtual Apollo 15 in Blue Mars

http://blink.bluemars.com/City/IDIA_Lunar/

Virtual Apollo 15 in Blue Mars Video Walkthrough

Virtual Kitty Hawk

IDIA Lab’s Kitty Hawk is a simulation of the moment recorded in the famous photograph of the flight of the Wright Flyer (often retrospectively referred to as Flyer I or 1903 Flyer). This flyer was the first successful powered aircraft, designed and built by the Wright brothers. They flew it four times on December 17, 1903, near Kill Devil Hills, about four miles south of Kitty Hawk, North Carolina, U.S. Today, the airplane is exhibited in the National Air and Space Museum in Washington D.C.

Virtual Kitty Hawk

Visit Virtual Kitty Hawk in Blue Mars

http://blink.bluemars.com/City/IDIA_WrightFlyer/

Virtual Kitty Hawk in Blue Mars Video Walkthrough

Virtual Izapa

IDIA Lab’s Virtual Izapa is a simulation of the site considered to be the birthplace of the Maya Long Count calendar, whose cycle ended on December 21, 2012 – the winter solstice. The site is oriented on an axis aligned generally, but not precisely, to sunrise on the winter solstice and to sunset on the summer solstice. In direct observation today, the alignment along the axis of the ball court from throne number 2 to stela number 60 is off-axis by approximately two degrees.
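The solstice alignment described above can be checked with a standard piece of spherical astronomy: on a flat horizon, the sunrise azimuth A (measured east of true north) satisfies cos A = sin(declination) / cos(latitude). The sketch below applies this to Izapa’s approximate latitude (about 14.9° N – an assumption for illustration); it ignores atmospheric refraction and local topography, which is part of why observed alignments differ from the geometric ideal.

```python
import math

def sunrise_azimuth(lat_deg, decl_deg):
    """Geometric sunrise azimuth in degrees east of true north on a flat
    horizon, ignoring refraction and terrain: cos(A) = sin(decl) / cos(lat)."""
    ratio = math.sin(math.radians(decl_deg)) / math.cos(math.radians(lat_deg))
    return math.degrees(math.acos(ratio))

# Solar declination is about -23.44 deg at the winter solstice
# and +23.44 deg at the summer solstice.
lat_izapa = 14.9  # approximate latitude, for illustration
print(round(sunrise_azimuth(lat_izapa, -23.44), 1))  # winter solstice, ~114 deg
print(round(sunrise_azimuth(lat_izapa, 23.44), 1))   # summer solstice, ~66 deg
```

A winter-solstice sunrise roughly 24° south of due east is what an observer at this latitude would see over an ideal horizon; mountains on the real horizon shift the apparent rising point, so a two-degree offset along the ball-court axis is plausible even for an intentional alignment.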

Virtual Izapa

Visit Virtual Izapa in Blue Mars

http://blink.bluemars.com/City/IDIA_Izapa/

Virtual Giza Necropolis

BSU IDIA Lab’s Virtual Giza Necropolis is an archaeological site on the Giza Plateau, on the outskirts of Cairo, Egypt. This complex of ancient monuments includes the three pyramids known as the Great Pyramids, the massive sculpture known as the Great Sphinx, several cemeteries, a workers’ village and an industrial complex. It is located some 9 km (5 mi) inland into the desert from the old town of Giza on the Nile. The pyramids, which have historically loomed large as emblems of ancient Egypt in the Western imagination, were popularised in Hellenistic times, when the Great Pyramid was listed by Antipater of Sidon as one of the Seven Wonders of the World. It is by far the oldest of the ancient Wonders and the only one still in existence.

Virtual Giza Necropolis

Visit Virtual Giza Necropolis in Blue Mars

http://blink.bluemars.com/City/IDIA_Giza/

Virtual Giza Necropolis in Blue Mars Video Walkthrough

Virtual Chichen Itza

IDIA Lab’s Virtual Chichen Itza is a simulation of the large pre-Columbian city built by the Maya people of the Post-Classic era. The archaeological site is located in the municipality of Tinum, in the Mexican state of Yucatán. The layout of the Chichen Itza core site developed during its earlier phase of occupation, between 750 and 900 AD. Its final layout was developed after 900 AD, and the 10th century saw the rise of the city as a regional capital controlling the area from central Yucatán to the north coast, with its power extending down the east and west coasts of the peninsula. The earliest hieroglyphic date discovered at Chichen Itza is equivalent to 832 AD, while the last known date was recorded in the Osario temple in 998.

Virtual Chichen Itza

Visit Virtual Chichen Itza in Blue Mars

http://blink.bluemars.com/City/IDIA_ChichenItza/

Virtual Chichen Itza in Blue Mars Video Walkthrough

https://www.youtube.com/watch?v=R2rDLu2zY8E