IDIA Lab wins the Bronze Medal at iENA Exposition in Nuremberg, Germany

IDIA Lab’s Virtual Pantheon/Kinect project in Blue Mars and our occupational simulation for Alzheimer’s patients won the Bronze Medal at the iENA International Trade Fair for Ideas, Inventions and New Products in Nuremberg, Germany. http://www.iena.de/en/home.html

The Virtual Pantheon Project in Blue Mars is a multi-user simulation of the Roman Pantheon and its surroundings as they might have existed in 320 AD. The structure is the largest and most intact surviving example of ancient Roman architecture. The building was commissioned by Marcus Agrippa during the reign of Augustus as a temple to all the gods of ancient Rome, and rebuilt by the emperor Hadrian about 126 AD. The simulation was created by the Institute for Digital Intermedia Arts at Ball State University in consultation with the University of Otago, NZ and the Virtual World Heritage Laboratory at Indiana University. It leverages the significant graphic capabilities of the CryEngine to create a compelling immersive experience. IDIA Lab and Avatar Reality worked together to design an interactive solar tracker that allows the visitor to change the time of day and view important celestial and celebratory events in ancient Rome. More detail here: http://bit.ly/18cfDKu. The project can also be interacted with via an MS Kinect sensor directly in Blue Mars.

Virtual Roman Pantheon in Blue Mars / CryEngine


The Pantheon is the best-preserved architectural monument of ancient Rome. This simulation by BSU’s IDIA Lab represents the Pantheon and its surroundings as they may have appeared in 320 AD. Visitors to this Blue Mars / CryEngine simulation can tour the vicinity, learning about the history, function and solar alignments through an interactive heads-up display created for this project. The project opened in beta in late 2013 and will premiere publicly in February 2014. It includes new solar simulation software calibrated to the building’s location and year, an interactive HUD, a greeter bot system and a new AI non-player character system developed in partnership between IDIA Lab and Avatar Reality.

Originally built by Agrippa around 27 BC under the rule of Augustus, the Pantheon was destroyed by fire, then rebuilt and finally completed in its present form during Emperor Hadrian’s reign, around 128 AD. The building still bears Agrippa’s name above the portico. The Pantheon would have contained numerous marble statues representing the major Roman deities. The statues displayed in this simulation represent one possible configuration and were captured via photogrammetry. The buildings surrounding the Pantheon were modeled and interpreted by IDIA based on the large-scale model of ancient Rome built by Italo Gismondi between 1935 and 1971, which resides in the Museo della Civiltà Romana, just outside of Rome, Italy.

Video walkthrough of the Virtual Pantheon in Blue Mars:

To visit the Virtual Pantheon firsthand:

First, create an account: https://member.bluemars.com/game/WebRegistration.html/
Second, download the client: http://bluemars.com/BetaClientDownload/ClientDownload.html/
Third, visit the Virtual Pantheon: http://blink.bluemars.com/City/IDIA_IDIALabPantheon/

Advisors

The Solar Alignment Simulation of the Roman Pantheon in Blue Mars was developed under consultation with archeo-astronomer Dr. Robert Hannah, Dean of Arts and Social Sciences at the University of Waikato, New Zealand, one of the world’s foremost scholars on Pantheon solar alignments; and archaeologist Dr. Bernard Frischer, Indiana University.

Background

The Pantheon that we can visit today consists of a rectangular porch with three rows of granite columns fronting a circular building capped by a huge hemispherical dome (142 feet in diameter), built over a cylinder of the same diameter and as high as the radius. The ideal completion of the upper hemisphere by a hypothetical lower one would therefore touch the central point of the floor, directly under the building’s unique source of natural light. This light source is the so-called oculus, a circular opening over 27 feet wide at the top of the cupola. It is the only source of direct light, since no direct sunlight can enter through the door at any point in the year, owing to the northward orientation of the entrance doorway. Of the building’s original embellishments, the coffered ceiling, part of the marble interior, the bronze grille over the entrance and the great bronze doors have survived.

Interior Wall

The interior wall, although circular in plan, is organized into sixteen regularly spaced sectors: the northernmost contains the entrance door, and then (proceeding clockwise) pedimented niches and columned recesses alternate with each other. Corresponding to these ground-level sectors are fourteen blind windows in the upper, attic course, just below the offset between the cylinder and the dome. It is likely that both the niches and the windows were meant for statues, which, however, have not survived.

Oculus

Direct sunlight penetrates the interior only through a large, 27-foot-wide oculus in the center of the domed roof. Indirect sunlight can otherwise enter the building only through the large, north-facing doorway, when it is open. The fall of direct sunlight through the oculus into the essentially spherical building invites comparison with a roofed sundial.

Celestial Alignments

A columned porch leads through a vestibule of the Pantheon into a huge, shadowy interior, over 142 feet in height and as much in diameter. The building’s form is essentially that of a sphere with its lower half transformed into a cylinder of the same radius. Direct sunlight penetrates the interior only through a large, 27-foot-wide oculus in the centre of the domed roof.

The shift from one half of the year to the other is marked by the passage of the sun at the equinoxes in March and September. At this point the noontime sun shines partially just below the dome, passing through the grille over the entrance doorway and falling on the floor of the porch outside. More significantly, however, the centre of this equinoctial, midday circle of sunlight lies on the interior architectural moulding which marks the base of the dome.
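Treating the rotunda as an ideal sphere with the oculus at its apex, the equinox observation above can be checked with elementary chord geometry. The sketch below is an idealization for illustration only (the real cornice sits near, not exactly at, the geometric springing line), not the Lab’s simulation code:

```python
# Idealized Pantheon rotunda: a sphere with the oculus at the very top.
# By the tangent-chord angle theorem, a sunbeam entering through the
# oculus at altitude `alt_deg` degrees meets the spherical interior at
# a point whose elevation angle, measured from the sphere's centre, is
# 90 - 2*alt_deg degrees. Zero marks the base of the dome (the
# springing line); positive values lie on the dome itself, negative
# values fall on the drum below it or, lower still, the floor.

def landing_elevation(alt_deg: float) -> float:
    """Elevation (degrees from the sphere's centre) where the beam lands."""
    return 90.0 - 2.0 * alt_deg

if __name__ == "__main__":
    for label, alt in [("winter solstice noon", 24.0),
                       ("equinox noon", 48.0),
                       ("summer solstice noon", 72.0)]:
        print(f"{label:>22}: {landing_elevation(alt):+5.1f} degrees")
```

At the equinox noon altitude of about 48 degrees the beam lands at roughly -6 degrees, essentially at the base of the dome, which is consistent with the observation that the equinoctial circle of light sits on the moulding marking the dome’s base; the low winter sun, by contrast, throws its light high onto the dome itself.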

On April 21st, the midday sun shines directly onto visitors to the Pantheon as they stand in the open doorway, dramatically highlighting them. This day is of particular significance, not just because this was when the sun entered Taurus, but more because it is the traditional birthday of Rome, a festival preserved from antiquity right through to the present day. And it may be that when the building was officially commissioned in AD 128, the person expected to be standing in the open doorway was the emperor Hadrian himself.

The illustration indicates a section through the Pantheon, showing the fall of noon sunlight at the winter solstice, when the sun is at altitude 24 degrees; at both equinoxes, when it is at altitude 48 degrees; on April 21st, when it is at altitude 60 degrees; and finally at the summer solstice, when it is at altitude 72 degrees.
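These four altitudes follow from the standard noon-altitude relation: altitude = 90° − latitude + solar declination. A quick check in Python — the latitude and declination values below are approximations supplied for illustration, not figures from the source:

```python
# Approximate noon solar altitude at the Pantheon (Rome, ~41.9 deg N).
# At local solar noon: altitude = 90 - latitude + solar declination.
LATITUDE = 41.9  # degrees north (approximate; not stated in the source)

def noon_altitude(declination_deg: float) -> float:
    """Sun's altitude above the horizon at local solar noon, in degrees."""
    return 90.0 - LATITUDE + declination_deg

if __name__ == "__main__":
    declinations = [
        ("winter solstice", -23.44),   # solar declination, degrees
        ("equinoxes", 0.0),
        ("April 21 (approx.)", 12.0),  # rough late-April declination
        ("summer solstice", 23.44),
    ]
    for name, dec in declinations:
        print(f"{name:>18}: {noon_altitude(dec):4.1f} degrees")
```

This yields roughly 24.7, 48.1, 60.1 and 71.5 degrees, matching the rounded values of 24, 48, 60 and 72 given above.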

Meaning of Pantheon

The Pantheon is a building in Rome, Italy commissioned by Marcus Agrippa during the reign of Augustus as a temple to all of the gods of ancient Rome, and rebuilt by the emperor Hadrian about 126 AD.

Pantheon is an ancient Greek composite word meaning All Gods. Cassius Dio, a Roman senator who wrote in Greek, speculated that the name comes either from the statues of so many gods placed around this building, or from the resemblance of the dome to the heavens.

“Agrippa finished the construction of the building called the Pantheon. It has this name, perhaps because it received among the images which decorated it the statues of many gods, including Mars and Venus; but my own opinion of the name is that, because of its vaulted roof, it resembles the heavens.”

– Cassius Dio, History of Rome 53.27.2

Augustus

Augustus was the founder of the Roman Empire and its first Emperor, ruling from 27 BC until his death in 14 AD.

The reign of Augustus initiated an era of relative peace known as the Pax Romana (The Roman Peace). Despite continuous wars of imperial expansion on the Empire’s frontiers and one year-long civil war over the imperial succession, the Roman world was largely free from large-scale conflict for more than two centuries. Augustus dramatically enlarged the Empire, annexing Egypt, Dalmatia, Pannonia, Noricum, and Raetia, expanded possessions in Africa, expanded into Germania, and completed the conquest of Hispania.

Beyond the frontiers, he secured the Empire with a buffer region of client states, and made peace with the Parthian Empire through diplomacy. He reformed the Roman system of taxation, developed networks of roads with an official courier system, established a standing army, established the Praetorian Guard, created official police and fire-fighting services for Rome, and rebuilt much of the city during his reign.

Augustus died in 14 AD at the age of 75. He may have died from natural causes, although there were unconfirmed rumors that his wife Livia poisoned him. He was succeeded as Emperor by his adopted son (also stepson and former son-in-law), Tiberius.

Agrippa

Marcus Vipsanius Agrippa (c. 23 October or November 64/63 BC – 12 BC) was a Roman statesman and general. He was a close friend, son-in-law, lieutenant and defense minister to Octavian, the future Emperor Caesar Augustus; he was also father-in-law of the Emperor Tiberius, maternal grandfather of the Emperor Caligula, and maternal great-grandfather of the Emperor Nero. He was responsible for most of Octavian’s military victories, most notably the naval Battle of Actium against the forces of Mark Antony and Cleopatra VII of Egypt.

In commemoration of the Battle of Actium, Agrippa built and dedicated the building that served as the Roman Pantheon before its destruction in 80 AD. Emperor Hadrian used Agrippa’s design to build his own Pantheon, which survives in Rome. The inscription of the later building, constructed around 125, preserves the text of the inscription from Agrippa’s building, dating from his third consulship. Agrippa spent the years following his third consulship in Gaul, reforming the provincial administration and taxation system, and building an effective road system and aqueducts.

Arch of Piety

The Arch of Piety is believed to have stood in the piazza immediately north of the Pantheon. Statements made in mediaeval documents imply, but do not specifically say, that the scene of Trajan and the widow was represented in a bas-relief on the arch, narrating the story of the emperor and a widow while suppressing the emperor’s name. The probable source, the mediaeval guidebook of Rome known as the Mirabilia Romae, does not even state that the arch was built in commemoration of the event. It mentions the arch and then says that the incident happened there.

Giacomo Boni discusses the legend of Trajan, giving many interesting pictures that show how the story was used in medieval painting and sculpture. He found a bas-relief on the Arch of Constantine which he thinks may have given rise to the story. It shows a woman sitting, her right hand raised in supplication to a Roman figure, who is surrounded by other men, some in military dress, and two accompanied by horses. Boni suggests that the Middle Ages may have supposed this figure to be Trajan because of his reputation for justice.

Saepta Julia

The Saepta Julia was a building in Ancient Rome where citizens gathered to cast votes. The building was conceived by Julius Caesar and dedicated by Marcus Vipsanius Agrippa in 26 BC. The building was originally built as a place for the comitia tributa to gather to cast votes. It replaced an older structure, called the Ovile, which served the same function. The building did not always retain its original function. It was used for gladiatorial fights by Augustus and later as a market place.

The conception of the Saepta Julia began during the reign of Julius Caesar (died 44 BC). Located in the Campus Martius, the Saepta Julia was built of marble and surrounded a huge rectangular space next to the Pantheon. According to a letter written by Cicero to his friend Atticus about the building project, Caesar planned for it to be built of marble and to have a mile-long portico. The quadriporticus (a four-sided portico, like the one used for the enclosure of the Saepta Julia) was an architectural feature made popular by Caesar.

After Caesar’s assassination in 44 BC, amid the backlash of public support for the former ruler, men continued to work on projects that Caesar had set into motion. Marcus Aemilius Lepidus, a former supporter of Caesar who subsequently aligned with his successor Octavian, took on the continuation of the Saepta Julia building project. The building was finally completed and dedicated by Marcus Vipsanius Agrippa in 26 BC. Agrippa also decorated the building with marble tablets and Greek paintings.

The Saepta Julia can be seen on the Forma Urbis Romae, a map of the city of Rome as it existed in the early 3rd century AD. Part of the original wall of the Saepta Julia can still be seen right next to the Pantheon.

– edited from Robert Hannah, “The Pantheon as Timekeeper”, 2009.
________

Simulation by the Institute for Digital Intermedia Arts at Ball State University
Project Director: John Fillwalk, Senior Director, IDIA Lab, BSU.
IDIA Staff: Neil Zehr, Trevor Danehy, David Rodriguez, Ina Marie Henning, Adam Kobitz

PROJECT ADVISORS:
Dr. Robert Hannah, University of Waikato, New Zealand
Dr. Bernard Frischer, Virtual World Heritage Laboratory, Indiana University, USA

SPECIAL THANKS:
Shinichi Soeda, Avatar Reality

Virtual Art Museum and Virtual Collaboration Center Projects in AvayaLive Engage

The IDIA Lab has developed two new simulations for AvayaLive Engage, a multi-user virtual world platform based on the Unreal game engine. The Virtual Collaboration Center is a multi-use teaching and learning facility prototype for use by large and small groups, containing rich media and collaboration tools including desktop sharing, presentations, a drop box, whiteboards, streaming video and webcam support. The Virtual Art Museum is a demonstration of an interactive museum experience with links to companion webpages, including a database that allows for extremely close examination of the paintings. Both simulations work on Mac and PC within industry-standard web browsers and will launch in May 2013. You can visit the project here: https://wa11619.avayalive.com/11619/html/index.html Certain areas need administrative permission; we will be opening these up in the near future.

AvayaLive Engage
http://avayalive.com/Engage/Products.aspx
Unreal Game Engine UDK
http://www.unrealengine.com

Mars and Ceres Photoscan 3D sculptures created for IDIA Lab Pantheon Project in Blue Mars


IDIA Lab Roman Pantheon in Blue Mars Project

http://idialab.org/virtual-roman-pantheon-in-blue-mars-cryengine/

Created with Agisoft PhotoScan Professional Edition

Agisoft PhotoScan Pro can generate high-resolution georeferenced orthophotos (up to 5 cm accuracy with GCP) and exceptionally detailed DEMs / textured polygonal models. http://www.agisoft.ru/products/photoscan/professional/

Blue Mars Client
The Blue Mars Client is your window into our free to play massively multiplayer virtual world made up of an expanding set of independently operated Cities that feature unique themes, activities, and attractions. Through the Blue Mars Client, users can participate in a wide variety of entertainment, educational, and productivity experiences.
Blue Mars PC, Lite and Mobile clients here: http://www.bluemars.com/

Virtual Nursing Simulator


HDT and IDIA Lab have developed a next-generation, high-fidelity Virtual Nursing Simulator in the Unity 3D game engine for testing by the Ball State University School of Nursing. Students will practice interviews via role-playing, employing avatars with predefined health histories. The simulator introduces our new media tools within the environment, including the ability to play video, webcam and screen sharing, presentations and webpages. Additionally, the simulator allows for interdisciplinary immersion, providing a context for student nurses to role-play scenarios alongside doctors, health insurance representatives, administrators and other professionals. Desktop, web and tablet versions are in development.
Video preview here: http://www.youtube.com/watch?v=Asz92FuhEyc&feature=share&list=UUtQrLF1JPprEvP4AO-iSNvw

National Endowment for the Humanities: Advanced Topics in the Digital Humanities Summer Institute 2013

IDIA Lab Director John Fillwalk will be presenting at the National Endowment for the Humanities Advanced Topics in the Digital Humanities Summer Institute 2013, sponsored by the Center for Digital Initiatives at Arkansas State University.

About & Schedule

This NEH Summer Institute for Advanced Topics in the Digital Humanities will take place from June 17, 2013 – July 6, 2013. The three-week institute will be hosted by the Center for Digital Initiatives (CDI) at Arkansas State University (ASU), Jonesboro campus in northeast Arkansas in the Mississippi Delta region, and the Center for Advanced Spatial Technologies (CAST) at the University of Arkansas, Fayetteville (UAF), located in northwest Arkansas. The first two weeks of the institute will be held at the CDI at Arkansas State University Jonesboro, AR campus. Participants will then travel as a group to the University of Arkansas in Fayetteville, AR for the final week of the institute.

This summer institute brings together twenty scholars working in the humanities who have research or teaching projects that would benefit from real-time visualization in a game engine, published as standalone applications, web players, or on mobile devices. In a three-week institute, participants will be provided with a conceptual road map to the difficult but intellectually productive issues that surround the academic use of game engines, including the balance of immersion with accuracy, strategies for storytelling and graphical user interfaces (GUIs) in “serious” games, and questions of power and appropriateness in using video game conventions to represent non-contemporary or non-Western cultures.  Participants will also receive hands-on training in the digital toolbox for creating game engine content, a basic workflow that they would be able to use in their own projects and bring back to their home institutions.  Training will include VSim, Unity 3D, online multi-user virtual environments, Google SketchUp, 3D Studio Max, Cinema 4D, as well as a broad range of open-source programs.  No prior knowledge or experience in 3D modeling will be assumed.

A unique feature of the institute is the breadth of cultural heritage content it incorporates.  This includes travel to ASU Heritage sites modeled by the CDI including the Lakeport Plantation in Lake Village, the boyhood home of Johnny Cash in Dyess, the Hemingway-Pfeiffer house and studio in Piggott, and the Japanese-American internment camp in Rohwer. At UAF participants will also participate in a Unity tutorial focused on the House of the Ara Massima in Pompeii. In bringing together an impressive group of lecturers who specialize in the use of 3D visualization and game engines as research tools in the digital humanities, the institute creates an important resource in the form of a community of scholars—which allows for future collaborations between individuals and universities.

“Designing the Unbundling of Courses” for the American Association of Colleges of Nursing

A presentation by Jo Ann Gora, President of Ball State University; Phil Repp, Vice-President of Information Technology; Linda Sweigart, BSU Nursing Faculty; and John Fillwalk, IDIA Lab Director, delivered on March 21st, 2013 to the American Association of Colleges of Nursing.

The presentation examines how universities like Ball State use emerging technologies to enrich and expand the utility of course materials at scale by leveraging the inherent production flexibility of digital tools. The advantages of digital technologies when designing virtual participation, collaboration, and interaction are discussed, as well as the pedagogical uses of un-assembling and re-assembling diverse sets of course materials.
http://www.aacn.nche.edu/webinars/2013/03/21/designing-the-unbundling

Ina-Marie Johnston

Design and Communications Manager

Ina-Marie Johnston is the Design and Communications Manager for Hybrid Design Technologies. She received her Bachelor of Fine Arts with a concentration in Visual Communications from Ball State University, as well as a Certificate in Web Applications. Her main interests include print design, web design, printmaking, bookmaking, and painting.

Neil Zehr

Neil earned his B.S. in the Digital Media Arts program at Huntington University in Huntington, IN. During his years there, he studied various traditional animation techniques, computer animation, and computer graphics while also working on several freelance projects and showcasing his films in Huntington’s annual media showcase. Following his graduation in the summer of 2008, he joined the IDIA at Ball State as a virtual worlds 3D modeler and animator.

IDIA Lab Workshops

IDIA Lab sponsors regular hands-on workshops on a variety of new media arts approaches. These workshops are taught by IDIA faculty fellows, graduate assistants and staff. Please look below for upcoming and past offerings.

Video/Intermedia Workshops Spring 2011

ART 300A-001/ART 515 Maura Jasper

“Live Art”: Action, Participation, and the Everyday

Explore the foundations of Performance Art and “Live Art” practices and

Create your own works in response.

Fluxus, actions, happenings, and socially engaged art practices will be emphasized. Participation and interactivity through the use of YouTube, social networks, Skype, and live web TV will be encouraged, but not required in your work.

Mon/Wed Jan 10- Feb 4

4-6:45 PM The Photo Lighting Studio (AJ)

Artists discussed: Marina Abromovic, Valie Export, Allan Kaprow, Joseph

Beuys, Carolee Schneemann, Dick Higgins, Rirkrit Tiravanija, Harrell Fletcher,Santiago Sierrra, Gillian Wearing

ART 300B-001/ART 515 CRN05668 Victoria Bradbury

“Collaborative Electronic Performance with Jitter”

Feb 7-Mar 4, 2011, MW 4pm-6:45pm

Room #023, Architecture Building

In this workshop, you will build “patches” or programs in Max/Msp/ Jitter

software: learning the basics then building an interactive video project.

This class will work toward a collaboration with the Dance

Department–creating patches which will function during a performance

situation. You will participate in rehearsals with a collaborative team.

ART 300C-001/ART 515 CRN05676 Victoria Bradbury

“Video Live!”

Mar 14-Apr 8, 2011, MW 4pm-6:45pm

Room #023, Architecture Building

In this workshop, you will create video content, attend rehearsals,

configure equipment, and perform video during an April 1 performance date.

You will be expected to function as part of a collaborative team.

Clarian Health VR Simulation

In collaboration with Clarian Health, IDIA has created a training video and interactive media to help facilitate the transition into the Ball Memorial Hospital New South Tower.

How can nurses train in a new hospital wing before it is constructed? Or after it’s complete and full of patients? For help addressing the situation, hospital officials turned to Ball State University and its emerging media experts. Rather than have the nurses don hardhats and run training seminars amidst saws and hammers, Ball State’s Institute for Digital Intermedia Arts (IDIA) worked to create a virtual training program. The result will allow nurses to interact with the new layout and better adapt to their new surroundings well before the physical construction is complete.

This could be extremely important in terms of ultimate patient care, as nurses will be adapting to a new facility as well as learning a new patient-based floor design. Rather than having a single nursing station surrounded by many rooms, the wing will have individual nursing stations ensconced between two rooms. “Our interactive training simulation showcases new, more efficient methods for working in a decentralized care unit as opposed to a centralized care unit,” said John Fillwalk, IDIA director. “Switching from a single hub to multiple nursing stations represents a culture shift for nurses, and we were able to help them work through that.”

Using Quest 3D, a virtual reality program, Fillwalk and his team developed “New Spaces, New Care,” a training simulation that works like a computer game, allowing nurses to virtually explore their new environment, sit at their new workstations, view and walk into patients’ rooms, examine charts and access medicine cabinets.

In the weeks prior to the facility’s opening, nurses assigned to the South Tower will complete the virtual training. By exploring the new wing before it’s complete, the nurses will be better acquainted with the amenities, which, once mastered, will give them more time for their patients, Fillwalk said. “By working directly with hospital officials and the nursing staff, we were able to create a program to more quickly acclimate staff members to their new environment and help them focus on the most important component of their jobs — tending to patients,” he added.

Links
Indianapolis WRTV Channel 6 Story

The Star Press: Energize ECI collaborating to lead an economic transformation

Gaming in Virtual Worlds and on Mobile Devices

REAL TIME 3D ENVIRONMENTS

Virtual worlds are three-dimensional environments that can provide compelling shared experiences for their users. These media-rich immersive spaces can connect to social networks, web-services, and data to bridge into an entire universe of external content. The IDIA has developed scores of projects in virtual worlds and game environments including Blue Mars, Unity 3D, Second Life, Open Simulator, and Quest 3D.

The IDIA built and animated Odie’s character using Maya 3D, an industry-standard animation software package used in the feature film and visualization industries. The game that IDIA designed with Paws characters was developed in Unity 3D.

Paws characters Odie, Garfield and the neighborhood were then brought into the Unity game engine, where the lighting, real-time shadows, and physics simulations were designed. Unity’s scripting capability was also utilized to control the Odie character’s animation and movement, trigger collision sounds, and animate Garfield as Odie moves by. There is a selection of objects for Odie to interact with, each with its own physical properties, including weight, friction, and bounciness.

http://unity3d.com

MOBILE ENTERTAINMENT

Mobile devices are increasingly used to help us navigate, communicate, stay informed and be entertained. Devices such as the Apple iPhone are capable of supporting rich and complex 3D experiences such as augmented reality and games. The IDIA has been developing numerous projects for PDA-enabled mobile phones, including applications for social networking, conference systems, augmented reality walking tours, live musical performance and games.

The prototype 3D game IDIA built with Paws content has also been repurposed as an iPhone game, exhibiting the range and scalability of these authoring environments. The same content can also be realized as a Nintendo Wii game using this production workflow.

This authoring environment supports two and three-dimensional game spaces, interacting through clicks, tilts and touch. Users can walk Odie through his neighborhood, encountering trashcans, balls and Garfield himself!

http://unity3d.com/iPhone

Astym 3D Visualization

Trevor Danehy and Tom Anderson, students from the IDIA Immersion Seminar, designed and produced a 3D visualization of a therapeutic process developed by Dr. Tom Sevier of Performance Dynamics. The students worked with state-of-the-art software and a commercial anatomical 3D model with over 4,000 parts to realize a professional-quality, dynamic scientific illustration for an external client.

The ASTYM simulation was an animation used to visualize specific effects and treatments of tendinosis. Information was gathered from ASTYM researchers to turn a complex, unseen procedure into a visual explanation. Choosing a design style was a challenge because 3D animation in the medical field has a different goal than most common animations: there needs to be a balance between accuracy and simple, clean visuals. Various shader styles were considered to display the internal processes of the human arm accurately while avoiding anything realistically gruesome, or visual clutter that would be detrimental to explaining the complexity of the human body. Transparent yet visible objects were needed, so the students opted for a dynamic incidence shader, which shades the curvature and sides of objects while leaving the middle portions transparent enough to see what lies beneath them. Bright, glowing colors were used to highlight particular tendons and separate muscles from each other.

Links
Performance Dynamics

Biotown

IDIA, in collaboration with the College of Architecture and Planning, will construct a virtual tour of the town of Reynolds, Indiana for Governor Mitch Daniels’s Biotown Project. Reynolds will eventually be removed from the power grid and sustained entirely by biological fuels. The virtual tour will show the town as it exists today and how it will appear after the project has been finished.

News Items:

Biotown, USA

Augmented Reality Research

The IDIA is utilizing the open source Augmented Reality toolkit to explore possible applications of AR in education and the arts. Of particular interest is the use of AR in architecture to effectively evaluate students’ white models in a group setting.

Dave Schultz demonstrates the use of AR to display 3D architectural models.

Dave Schultz demonstrates the mapping of video onto an AR object.

IDIA Blackboard Greenhouse Grant for Virtual Worlds

Ball State University’s Institute for Digital Intermedia Arts (IDIA) has produced a bridging toolset, linking the multi-user virtual environment of Second Life with Blackboard Learn – providing a unified, secure and fluid hybrid learning experience. This project is funded by the Blackboard Greenhouse Grant for Virtual Worlds and has produced an open-source Building Block.

IDIA at Ball State University (BSU) created a virtual and web-based software toolset to manage, administrate and facilitate any hybrid Second Life / Blackboard Learn instructional experience. These Blackboard Building Blocks and Second Life scripts were specifically developed for a “hands-on” cinematography course taught with a complete set of virtualized filmmaking equipment. This pilot course also served as a prototype for Blackboard courses from any discipline that uses Second Life – especially in the delivery of studio, laboratory or other hands-on modes of learning – extending the modes of typical distance-learning offerings. The pairing with Blackboard augments Second Life, which is not designed with the course management tools needed to operate effective, self-contained distance instruction.

Open Source Site

Blackboard World Presentation

Ball State University Press Release

Blackboard Press Release

Campus Technology 2008 Innovators Award

Video Demo

News Items:

IDIA Presents Blackboard/SL Greenhouse Grant for Virtual Worlds at Blackboard World 2009

Digital Cinema Arts in the Virtual World, The Aesthetic Camera Project

The Aesthetic Camera Project was designed to deliver digital cinema curriculum through Second Life – an online virtual world. This intensive research and development project was hatched through a partnership between the College of Fine Arts, the Center for Media Design, and Information Technology. Students learn the techniques of filmmaking in this 3D collaborative multi-user environment and are able to make virtual films using the project’s equipment. Student designers and programmers worked with a team of local and global professionals from across the arts and sciences.

The one-credit course focuses on cinematography instruction in both the physical world and the virtual world, where filmmaking is also known as “machinima.” It teaches cinematography techniques while allowing students to gain hands-on experience with equipment and resources that might not ordinarily be available to them. Within the virtual studio, they can check out cameras, dollies, light systems and more to create their own original movies. They can also film using avatars in costume in a wide variety of virtual set locations.

The project is set in an environment that stimulates creativity and interaction with fellow students and instructors. This environment includes virtual versions of all the necessary tools and some that do not exist in the real world, for example, a flexible shooting environment or holodeck.

The Institute has found measurable benefits for both instructors and researchers from this project, including the development and retention of an expert research team centered on simulation, visualization and interaction. Benefits for students include the ability of distance-education students to participate in communal learning and synchronous dialog with instructors and a peer group.

Most recently, the IDIA has earned a 2009 Institutional Excellence in Information Communications Technology Award from ACUTA, the Association for Information Communications Technology Professionals in Higher Education. The award recognizes IDIA for its Aesthetic Camera filmmaking course and integrated Blackboard custom software, offered through the online world of Second Life.

News Items:

Greenhouse Grant for Virtual Worlds
Second Life Press Conference
IDIA Receives ACUTA Institutional Excellence Award
IDIA presents to UCit Instructional Research Computing

Las Americas Virtual Design Studio

The Las Americas Virtual Design Studio is made possible through a collaboration between IDIA, the College of Architecture and Planning at Ball State University, the Las Americas Network, and the professional firm BSA LifeStructures. The project linked 11 international departments of architecture, over 100 students and 11 faculty in an immersive collaboration environment. The students worked within their respective studios under the direction of their local instructors. At the same time, each student was assigned to an international virtual studio composed of students from different universities, under the advisory of several virtual instructors. The roster of virtual instructors was drawn from the instructors of all participating studios, plus a number of professional reviewers who did not have students at their particular location.

The LAVDS is a beacon that not only attracts visitors but encourages interaction on multiple levels. It is depicted as an organic system – a radial configuration of pods or petals offering spaces for small-group use, a large group meeting place at the center, and a tall mast-like element. The structure relied on configurable interfaces to facilitate large and small interactions. For instance, it could be set up for a small critique or a large group lecture session.

On November 10, the Las Americas Network held the international grand opening of its Virtual Design Studio. “The nature of the tower’s architecture is specific to Second Life. There are no stairs, and dynamic pods and audio bubbles allow groups to meet together or ‘fly’ away to have private meetings,” said dean of CAP Guillermo Vasquez de Velasco.

The project was created on a Second Life island at the Ball State University Instructional Campus. Designers faced several constraints: a 96′ x 96′ x 96′ cubic site, a limit of 40 avatars interacting simultaneously, and a budget of 15,000 geometric primitives. The project was required to celebrate the spirit of the virtual design studio while leveraging technologies and capabilities for the interaction (work and play) of LAVDS participants in ways that have not been possible outside of a virtual environment.

Links
Indianapolis Wish TV 8 Story
AIArchitect: Architecture in Second Life Is a World All Its Own

News Items:

Wish TV covers Las Americas project
AIArchitect: Architecture In Second Life Is a World All Its Own
Las Americas Finalist for 2009 Mira Award
Design Communication Conference Presentation

Ki: Theory

Students from Ball State University’s Institute for Digital Intermedia Arts created a music video for recording artist Ki: Theory. The entirely animated video for the song “Holiday Heart” is the artist’s first professionally produced video for broadcast on popular music television venues. This immersive learning experience connected students to an industry partner, challenged their production skills, and gave them a look at life after graduation.

Links
Ki:Theory

News Items:
Music Video Accepted to International Festival

Museum Scanning Project

The IDIA Seminar’s current project focuses on virtually recreating the original settings of various sculptures found throughout the Ball State University Museum of Art. The project involves scanning five different sculptures using a 3D laser scanner.

This project has allowed Ball State students to get involved with 3D scanning by learning the methods, techniques and limitations involved in accurate scanning procedures.
The museum scanning project was conceived not only to digitally archive a few select sculptures, but also to place them in an animated video visualizing the works in their original context, before they found their way to the Ball State University Museum of Art.

Sculptures such as Adolph Alexander Weinman’s “Descending Night” and “Rising Sun” were originally meant to be viewed at the Panama-Pacific International Exposition of 1915 in San Francisco. The students intend to visually replicate that scene, along with how the artist framed the works for viewing, with the help of a ZCorporation ZScanner 700 at high resolution. Scanning typically takes the class around six hours for a human-sized sculpture. The ZScanner can capture an object in the tightest spaces in real time with one continuous scan. Once the scan data is acquired, it can be transferred to a wide variety of media, both digital and physical. Students then place the models within an animation using Autodesk Maya. A high-resolution viewer and interactive touch screens are also used to view the models. Students are also investigating a method of 3D prototyping the models into smaller, more reproducible copies.

Links:
3D Scanner Technology

Jeff Berg

Jeff Berg is a user experience expert at IBM Interactive. Jeff has more than 10 years of social computing development experience with expertise in rich internet application programming, community strategy, usability, user experience, 2D design, 3D content creation, and emerging technology adoption. Jeff focuses on community relations projects which use internet technologies to augment museum and cultural spaces. His interaction development experience includes MoMA New York, The Egyptian Museum in Cairo, The Palace Museum Beijing (The Forbidden City), The Smithsonian National Museum of African American History and Culture and more.

Jeff has leveraged his rich interaction design experience on commercial projects which include Zula Patrol (the children’s television show), Circuit City, Aviva, Blue Cross Blue Shield, Epcot, Volvo, LL Bean, Steelcase and PNC Bank.

As a consultant Jeff has built interface prototypes and design concepts for McDonalds, Mazda, Coca Cola, American Signature, Discover, Exelon, Sears and more.

Jeff collaborated as a primary author of the Beginning ActionScript 2 textbook in 2005, published through Wiley’s Wrox imprint. After graduating from the Massachusetts College of Art in the mid-nineties, he spent some time painting in Ireland and in Boston, MA. Jeff has created immersive artworks using 3D spaces in virtual worlds at Princeton University, the University of Southern California, the University of Kentucky, Linden Lab, and Ball State University.

Bernard Frischer

Bernard Frischer is a leading scholar in the application of digital technologies to humanities research and education. Frischer has overseen many significant projects, including virtual recreations of sites such as the city of Rome in the time of the emperor Constantine the Great. The works of Frischer and his institute have received international acclaim and have been featured on the Discovery Channel, the RAI, German Public Radio, the BBC, in Newsweek, Scientific American, Business Week, Computer Graphics World, Forbes, the New York Times and many other magazines and newspapers around the world (see www.frischerconsulting.com/rome_reborn/press.php#media_coverage). His Rome model was featured at SIGGRAPH 2008, held in August 2008 in the Los Angeles Convention Center.

Professor Frischer is the author or co-author of six printed books, three e-books and many articles on virtual heritage and on the Classical world and its survival. He is also the editor-in-chief of the Digital Roman Forum web site (http://dlib.etc.ucla.edu/projects/Forum), which was honored in 2008 by inclusion on EDSITEment, the National Endowment for the Humanities’ list of educationally approved websites. He taught Classics at UCLA from 1976 to 2004. Since then he has been Professor of Art History and Classics at the University of Virginia, where he is also the Director of the Virtual World Heritage Laboratory. He has been a guest professor at the University of Pennsylvania (1993), the University of Bologna (1994), and Beijing Normal University (2009), and held the post of Professor-in-Charge of the Intercollegiate Center for Classical Studies in Rome (2001-02). He was a Senior Fellow in the Zukunftskolleg of the University of Konstanz during the 2010-11 academic year.

Frischer’s research career reflects his interest in interdisciplinary approaches, and has included studies in the literature, philosophy, art history and archeology of Greece and Rome. He is the author of several books, including Shifting Paradigms: New Approaches to Horace’s Ars Poetica, and The Sculpted Word: Epicureanism and Philosophical Recruitment. Frischer directed the excavations of Horace’s Villa, a project sponsored by the American Academy in Rome and the Archeological Superintendency for Lazio of the Italian Ministry of Culture. The findings of this work were the subject of a two-volume report, Horace’s Villa Project 1997-2003 (Oxford: 2007), of which Frischer was editor-in-chief. He is founder and director of the Rome Reborn Project, an international initiative based at the University of Virginia, UCLA, and the Politecnico di Milano (for details, see www.romereborn.virginia.edu). The goal of the project is to create 3D digital models illustrating the urban development of ancient Rome from the first settlements in the late Bronze Age (ca. 1,000 BCE) to the early Middle Ages (ca. 550 CE). Rome Reborn 1.0 was premiered by Rome’s Mayor Walter Veltroni at an international press conference Frischer organized in June 2007. It was published in Google Earth in 2008. Rome Reborn 2.0 was the featured project at SIGGRAPH 2008, the leading industry and scientific conference held in the field of Computer Graphics. His current research includes a new 3D digital model of Hadrian’s Villa (Tivoli, Italy). He is also principal investigator of SAVE (Serving and Archiving Virtual Environments), a project sponsored by the National Science Foundation to create a database of 3D digital models of cultural heritage sites, monuments, and landscapes. Over the course of his career, Frischer has raised over $5 million in support of his various research projects.

Frischer received his B.A. (Wesleyan University, 1971) and Ph.D. (Heidelberg, 1975) degrees summa cum laude and is a member of Phi Beta Kappa (1970), a Fellow of the Michigan Society of Fellows, a Fellow (1974-76) and Resident (1996) of the American Academy in Rome, and he has won research fellowships from the American Council of Learned Societies (1981, 1996) and the Center for Advanced Study in the Visual Arts (1997).  From 1996 to 2003 he directed the excavations of Horace’s Villa sponsored by the American Academy in Rome, and from 1996 to 2004 he was founding director of the UCLA Cultural Virtual Reality Laboratory. The lab was one of the first in the world to use 3D computer modeling to reconstruct cultural heritage sites. In 2005 Bernard Frischer was given the Pioneer Award of the International Society on Virtual Systems and Multimedia. In 2009, he was the recipient of the Tartessus Prize of the Spanish Society for Virtual Archaeology. In 2010 he won a Senior Prize Fellowship from the Zukunftskolleg at the University of Konstanz.

 

Trevor Danehy

Virtual Worlds Modeler and Animator

3D Modeler and Animator Trevor Danehy graduated from Ball State in 2007 with a Bachelor of Fine Arts specializing in 3D animation. Trevor’s primary skill sets are high-detail 3D modeling and texturing, using programs such as Pixologic ZBrush and Adobe Substance Painter. He has also developed methods for quickly optimizing photogrammetry model generation. As a 3D generalist, he uses Autodesk Maya to tie these processes together. Trevor is versed in every aspect of 3D, from sculpting and retopology to high-definition rendering workflows.

John Fillwalk

History Channel at IDIA Lab

John Fillwalk is an Associate Professor of Electronic Art at Ball State University. Fillwalk works and instructs in a variety of time-based and digital media including video, installation, imaging, interactive art and animation. He received his MFA in Intermedia and Video Art from the University of Iowa in 1990 and has since received numerous grants, awards and fellowships. Most recently, he was appointed Director of the Intermedia and Animation Institute at Ball State University, created in part by a $20 million grant from the Eli Lilly Foundation.

Virtual Artworks

http://www.idialabprojects.org/displacedresonance/virtual.html

 

PROXY

John Fillwalk

with Michael Olson, Composer and IDIA Lab. 2012.

In the virtual installation PROXY, visitors shape the construction of a sculptural and sonic response to the virtual museum. The work progressively takes form, eventually transforming into structural support for the building. When multiple users are in the environment, their avatars interact with one another to create collaborative painting and sculpture.

 

FLICKR™ GETTR v6

John Fillwalk

with Jesse Allison, Composer and IDIA Lab. 2012.

FLICKR™ GETTR v6 connects the Flickr™ social image web service to the virtual museum environment, allowing visitors to create a dynamic cloud of spatial imagery: entering a search term of their choice pulls related images from Flickr™ into the virtual environment.
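A retrieval like this can be sketched against Flickr’s public REST API. The method name (`flickr.photos.search`) and the static-image URL pattern are Flickr’s documented conventions; the API key, search term and helper names below are placeholders for illustration, not the project’s actual code:

```python
from urllib.parse import urlencode

FLICKR_REST = "https://api.flickr.com/services/rest/"

def build_search_url(api_key: str, term: str, per_page: int = 20) -> str:
    """Build a flickr.photos.search request URL for a visitor's term."""
    params = {
        "method": "flickr.photos.search",
        "api_key": api_key,
        "text": term,
        "per_page": per_page,
        "format": "json",
        "nojsoncallback": 1,
    }
    return FLICKR_REST + "?" + urlencode(params)

def photo_url(photo: dict) -> str:
    """Compose a static-image URL from one photo record in the response."""
    return "https://live.staticflickr.com/{server}/{id}_{secret}.jpg".format(**photo)

# Placeholder key and term; fetching the URL returns JSON whose
# photos.photo entries carry the server/id/secret fields used above.
url = build_search_url("YOUR_API_KEY", "beijing")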

 

SURVEY FOR BEIJING

John Fillwalk

with Keith Kothman, Composer and IDIA Lab. 2012.

SURVEY FOR BEIJING is an immersive landscape simulation driven by real-time weather data from the physical location in Beijing, China. Representations of surveyor’s tape, flags, light, time of day, wind and clouds are superimposed onto the virtual landscape in accordance with the real-life weather data.
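The data-to-scene mapping can be illustrated with a minimal sketch; the observation fields, scaling constants and parameter names below are illustrative assumptions, not the installation’s actual code:

```python
def weather_to_scene(observation: dict) -> dict:
    """Map a weather observation (wind in m/s, cloud cover in percent)
    onto virtual-landscape parameters, each normalized to 0..1."""
    wind = observation.get("wind_speed", 0.0)
    clouds = observation.get("cloud_cover", 0.0)
    return {
        # Surveyor's flags whip faster as the real wind picks up.
        "flag_animation_speed": min(wind / 10.0, 1.0),
        # Virtual cloud density tracks the reported cover directly.
        "cloud_density": clouds / 100.0,
        # Dim ambient light under heavy overcast.
        "ambient_light": 1.0 - 0.5 * (clouds / 100.0),
    }

params = weather_to_scene({"wind_speed": 5.0, "cloud_cover": 80.0})
```

A polling loop would feed fresh observations into a function like this and push the resulting parameters to the rendering engine each cycle.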

 

con|FLUENCE

John Fillwalk

with Michael Olson, Composer and IDIA Lab. 2012.

Participants in con|FLUENCE create pathways based on reactions to both social and spatial relationships. There is a virtual response and persistence to interactions, as the routes are drawn and sculpted in the three-dimensional environment – forming nodes that are created via visitors’ proximity.

 

Third Art and Science International Exhibition

China Science and Technology Museum in Beijing

http://www.tasie.org.cn/index.asp

 

 

TASIE Press release

http://www.tasie.org.cn/content_e.asp?id=84

 

Ball State artists create “forest” of light on display in China and Internet

Muncie, Ind. — Ball State University electronic artists have created a “forest” of light and sound that will be on exhibit in Beijing, China through November, yet also accessible to visitors from Indiana or anywhere else in the world.

That’s possible because “Displaced Resonance,” as the interactive art exhibit is known, has both real-life and virtual components.

The physical portion has been installed in a gallery of the China Science and Technology Museum in Beijing. There, in-person visitors can negotiate a thicket of 16 interactive sculptures spaced 1.5 meters apart that will change colors and emit music as they approach.

A digital replica of the layout, meanwhile, resides on the Internet, accessible through the museum’s website. Online visitors can wander the virtual exhibit using an avatar, and the digital pillars will change colors and produce sounds, just like their physical counterparts.

But that’s not all — the two pieces interact with each other, says John Fillwalk, director of Ball State’s Institute for Digital Intermedia Arts (IDIA) and Hybrid Design Technologies (HDT), which created the work in collaboration with IDIA staff, students and BSU composer Michael Pounds.

When an online avatar approaches a virtual pillar, the corresponding real-life column also will change colors, and vice versa. In-person and virtual visitors will produce different colors, however, allowing them to track each other through the exhibit.

“It’s what we call hybrid art,” says Fillwalk. “It’s negotiating between the physical world and the virtual. So it’s both sets of realities, and there’s a connection between the two.”

The physical pillars are two meters (more than 6 feet, 6 inches) tall. They consist of a wooden base containing a sound system, a translucent pillar made of white corrugated plastic, and computer-controlled lighting.

A thermal camera mounted on the museum’s ceiling keeps track of visitors and feeds its data to a computer program that directs the columns to change color and broadcast sounds when someone draws near.

“It’s a sensory forest that you can navigate,” Fillwalk says.

Two final touches: a video screen mounted on a museum wall overlooking the exhibit allows in-person visitors to watch avatars move around the virtual version, while Internet patrons can keep tabs on the real-life display through a window on their computer screens.

“Displaced Resonance” is the centerpiece of Ball State’s contributions to the Beijing museum’s 3rd Art and Science International Exhibition and Symposium, a month-long celebration of technology and the arts. Ball State was invited to participate because museum curators discovered some of IDIA’s work and liked what they saw, Fillwalk said.

In addition to “Displaced Resonance,” IDIA contributed four other pieces of digital art that museum visitors can view at a kiosk.

Those pieces are:

· “Proxy,” in which visitors create, color and sculpt with floating 3D pixels.

· “Flickr Gettr,” in which visitors can surround themselves with photos from the Flickr web service that correspond to search terms they submit.

· “Confluence,” in which users create virtual sculptures by moving around the screen and leaving a path in their wake.

· “Survey for Beijing,” in which real time weather data from Beijing is dynamically visualized in a virtual environment.

 

(Note to editors: For more information, contact John Fillwalk, director of the Institute for Digital Intermedia Arts, at 765-285-1045 or jfillwalk@bsu.edu; or Vic Caleca, media relations manager, at 765-285-5948 or vjcaleca@bsu.edu. For more stories, visit the Ball State University News Center at www.bsu.edu/news.)

Displaced Resonance v2

IDIA Lab was invited to exhibit their virtual and hybrid artworks at the Third Art and Science International Exhibition at the China Science and Technology Museum in Beijing, China.

http://www.idialabprojects.org/displacedresonance/

Displaced Resonance v2

John Fillwalk

Michael Pounds

IDIA Lab, Ball State University

Interactive installation, mixed media

2012

Displaced Resonance v2 is an interactive installation consisting of sixteen reactive forms networked in a grid of light and sound. Interaction within the sculptural field is based on a participant’s presence and proximity to each sculpture. The Displaced Resonance installation is connected to a mirrored instance of the field in a virtual environment – bridging both physical and virtual visitors within a shared hybrid space. Visitors to the virtual space are represented by avatars and, through their proximity, affect the light and sound of each sculpture. Each participant is aware of the other in each space, uniting both instances within a singular hybrid environment.

A computer system using a thermal camera tracks the movement of visitors and responds by controlling the distribution of sound and dynamic RGB data to the LED lights within the sculptural forms. The installation utilizes custom live-processing software to transform these sources through the participants’ interaction – displacing the interaction from both its physical and virtual contexts into the processed framework of physical and virtual resonances. The two environments are linked to send and receive active responses from both sides of the installation via messaging, sensors, hardware and scripting.
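The proximity behavior can be sketched as follows. The sixteen-sculpture grid and 1.5 m spacing come from the piece itself; the falloff radius, hue values and function names are illustrative assumptions, standing in for the software that drives the actual LED and audio hardware:

```python
import math

# Sixteen sculptures in a 4 x 4 grid, spaced 1.5 m apart (as installed).
SCULPTURES = [(col * 1.5, row * 1.5) for row in range(4) for col in range(4)]

def activation(visitor, sculpture, radius=2.0):
    """Return 0..1 activation: 1 when the visitor reaches the sculpture,
    falling linearly to 0 at `radius` metres (radius is assumed)."""
    d = math.dist(visitor, sculpture)
    return max(0.0, 1.0 - d / radius)

def frame(visitors, physical=True):
    """Compute per-sculpture RGB levels for one tracking frame.
    Physical and virtual visitors get different hues (assumed values)
    so each group can track the other through the grid."""
    hue = (0, 120, 255) if physical else (255, 80, 0)
    levels = []
    for s in SCULPTURES:
        a = max((activation(v, s) for v in visitors), default=0.0)
        levels.append(tuple(int(a * c) for c in hue))
    return levels

lights = frame([(0.0, 0.0)])  # one physical visitor standing at sculpture 0
```

In the installation a computation like this, fed by the thermal-camera positions on one side and avatar positions on the other, would be pushed to the LED controllers and sound distribution every frame.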

In prototyping this work, the process included both physical and virtual models to design and build the form and the interaction. The physical prototypes were interpreted in a virtual simulation environment to investigate the spatial interaction of the structure. The interactive functionality was tested through scripting before the form was brought into the sensor-equipped, camera-based version. After several virtual iterations, the form was re-interpreted and fabricated.

John Fillwalk

with Michael Pounds, David Rodriguez, Neil Zehr, Chris Harrison, Blake Boucher, Matthew Wolak, and Jesse Allison.


Video Art at BSU Museum of Art

Renowned intermedia artist, Hans Breder, has donated a work of video art to the Ball State University Museum of Art. The work has been installed in a temporary new media gallery at the BSUMA. The reception for the exhibition was held on Thursday, September 15th and will be on display until November 27th, 2011. Breder’s MFA program in intermedia at the University of Iowa was the first of its kind in the nation.

Virtual Middletown Living Museum in Blue Mars

The Virtual Middletown Living Museum project in Blue Mars is a simulation of the Ball Glass factory in early 20th-century Muncie, Indiana. Life and conditions in the factory were among the key elements examined by Robert S. and Helen Merrell Lynd in their landmark studies Middletown (1929) and Middletown in Transition (1937). These in-depth accounts of life in Muncie, Indiana, became classic sociological studies and established the community as a barometer of social trends in the United States. In the years since, scholars in a variety of fields have returned to Muncie to follow up on the Lynds’ work, making this small city among the most studied communities in the nation. The center continues this tradition by sponsoring and promoting research on Muncie as Middletown, on small cities generally, and on the themes and issues the Lynds explored.

This simulation of industrial life, built as a prototype for a much larger project dealing with all aspects of the Lynd study, aims to create a virtual living-museum experience that expands the opportunities for both learning and interpretation. The approach to interactive design embeds learning and navigation experiences subtly into the project to maintain the sense of immersion. IDIA has prototyped several techniques to do this, including: interactive objects that allow for close-up inspection; objects that, when clicked, bring up web resources with information, plans or photographs used in the interpretation; non-player-character factory workers; a live interactive avatar of Frank C. Ball who greets visitors and introduces them to the factory; video and audio files of factory experts and archival films; an in-world interactive heads-up display (HUD) that provides deeper investigation and navigation through the factory; and a supporting webpage with complete documentation of all resources used in this interpretation.

To download the Blue Mars client, create an account and tour Virtual Middletown, please visit: http://blink.bluemars.com/City/IDIA_IDIALabExhibitions/

Follow these steps to get set up:

  1. Select “Download Client” and follow the instructions to install the BlueMars client on your PC (Windows desktop or laptop)
  2. Once you have successfully installed the BlueMars client, select “Go To City” to install the Virtual Middletown virtual world
  3. Register your account and confirm when you receive an email from BlueMars
  4. Modify your avatar (optional)
  5. Explore Virtual Middletown!

NOTE: If you are a Macintosh user (OS X), you may run the BlueMars client and the Virtual Middletown virtual world by using Apple’s Boot Camp to run Windows: http://www.apple.com/support/bootcamp/

Here are links for additional information on the project:

  1. Center for Middletown Studies project website: http://cms.bsu.edu/Academics/CentersandInstitutes/Middletown/Research/Virtual/Virtual.aspx
  2. IDIA Lab project website: https://idialab.org/virtual-middletown-living-museum-in-blue-mars/

Recommended settings: Very High graphics with a good graphics card – otherwise High or Low as needed. Screen resolution: 1280 by 720 or higher. Sound levels should be up. Many objects in the world are interactive – anything that highlights blue can be clicked with the left mouse button and examined, or may open a reference web page. The heads-up display in the lower right-hand corner provides information and navigation to augment your visit.

Project partners: The Center for Middletown Studies, Library Services and the Emerging Media Initiative at Ball State University

Press release: http://www.bsu.edu/news/article/0,1370,7273-850-65401,00.html

HYBRID DESIGN TECHNOLOGIES AT BSU

Hybrid Design Technologies (HDT) is a new initiative through the Office of Information Technology at Ball State University, supporting the design and production of virtual, immersive and interactive environments – advancing the University’s expertise in virtual cultural heritage, museums, arts, teaching and hybrid learning. John Fillwalk serves as the Senior Director of HDT, as well as directing the IDIA Lab.  www.bsu.edu/hdt

IDIA Lab Simulations in Blue Mars

Ball State University’s Institute for Digital Intermedia Arts (IDIA) presence in Blue Mars is a large-scale virtual simulation that showcases re-creations of artifacts and artworks in their original historical contexts, including sculptures at the 1915 Panama-Pacific International Exposition in San Francisco and a centuries-old Buddha at a Japanese temple. Through the use of such emerging technologies, visitors to Blue Mars can virtually experience these important but long-lost sites.

IDIA’s simulations in Blue Mars present several examples of recent grant-supported research projects. IDIA is exploring various new ways the platform might be used, from learning and the arts to gaming. Future IDIA initiatives in Blue Mars will include prototyping the 3D internet, shared media collaboration tools and the development of an open virtual campus for use by educators interested in engaging the platform as an environment for learning.

Videos on IDIA Lab Youtube Channel
http://www.youtube.com/user/IDIALab#p/u

Images on IDIA Lab Flickr Photostream
http://www.flickr.com/photos/idia_lab/

This release of IDIA Lab introduces the following new features and updates:

• Heads-up display system – Flash-based HUD displaying dynamic information and map-based teleportation

• Automated avatar/bot greeter system

• A guided tour vehicle – launching from the orientation island

• The Snow Globe, a virtual winter simulation – employing physics, environment and particle systems. Teleports are placed throughout the City

• Depth of field – now enabled in most environments

• New navigation and orientation system

• New vegetation design – new plants with LOD

• High-fidelity statues using advanced mapping techniques

• High optimization, terrain painting, parallax mapping

• Please use Very High Graphics settings to view

 

Blue Mars Builds


The IDIA Lab

THE INSTITUTE FOR DIGITAL INTERMEDIA ARTS is a collaborative research and design studio exploring the intersections between art, science and technology. We consult on, design and produce immersive experiences for virtual reality, hybrid worlds, simulations, visualizations, games and interfaces for various platforms and devices. For more information, please visit IDIALAB.org. You can walk to all simulations or alternatively use the teleport columns you see at each location. Follow us on Twitter @IDIA_Lab for news and updates.

Panama-Pacific International Exposition Simulation

IDIA’s simulation of the 1915 Panama-Pacific International Exposition in San Francisco allows visitors to travel to the past and immersively tour a recreation of a historic environment that no longer exists. The exposition celebrated both the opening of the Panama Canal and San Francisco’s recovery from the devastating earthquake of 1906. IDIA 3D laser-scanned two sculptures by artist Adolph Weinman that are included in this simulation and were originally installed atop tall columns in the Court of the Universe. A more detailed examination of the sculptures can be found in our Museum Simulator. Visitors can change the time of day using controls found near this sign and the Fine Arts Palace to experience the lighting design of the exposition.

Palace of Fine Arts

The Palace of Fine Arts in the Marina District of San Francisco, California was originally constructed for the 1915 Panama-Pacific Exposition. One of a handful of surviving structures from the Exposition, it is still situated on its original site. It was rebuilt in 1965, and renovation of the lagoon and walkways and a seismic retrofit were completed in early 2009. IDIA Lab constructed this as an addition to its Panama-Pacific Court of the Universe simulation.

Art Museum Simulation

Using blueprints, photographs, laser scanning and measurements, IDIA Lab simulated an actual museum gallery to a high degree of accuracy for exploration and study within a live 3D environment. The goals of this build were to provide visitors with a remote immersive experience of an actual exhibition as it was installed – closely replicating original factors such as the layout, framing and lighting of the physical space. Additional information is provided by touching each canvas or sculpture in the exhibition. Via a simulation such as this, curators can also spatially archive a show or prototype layout, lighting and installation design.

Byodo-In Temple (Amida Hall)

Amida Hall, the most famous building in the Byodo-in temple complex, was constructed in 1053 and is the only remaining building from the original site. The other buildings making up the compound were burnt down during a civil war in 1336. IDIA scanned an Amida Buddha and created this temple site to give visitors an understanding of a typical context in which the Buddha was observed. A replica of the temple was also constructed on the island of O’ahu in Hawai’i. Nearby there are 360º immersion spheres of the Hawaiian temple that avatars can also explore.

Relic Temple

The Relic Temple is located in the Ji Le Temple complex in the Nangang District of Harbin, China. The simulation was built by IDIA Lab as part of our larger 3D Laser Scanning and Simulation Project, which includes the 1915 Panama-Pacific Exposition in San Francisco, Byodo-In Temple in Japan and the Bingling Si Buddhist cave site – all simulated here on IDIA Lab in Blue Mars.

Shafer Tower

Shafer Tower is a 150-foot-tall (46 m) bell tower with a chiming clock and carillon bells located in the middle of the campus of Ball State University. A small staircase in the tower leads to a control room of the carillon, which has 48 custom-made bells. This virtual tower chimes when an avatar approaches – as does the bell in a smaller temple at our Byodo-in build.

The Snow Globe

Winter themed landscape simulation including environment, weather, particle and physics systems.

3D Point Cloud Laser Scanning

From a seashell to an auditorium, laser scanning can capture the objects and spaces we inhabit and convert them into a digital model. IDIA scans a range of small and large-scale 3D objects using facilities and equipment shared with the SimLab in the College of Architecture and Planning and supported by the Office of Information Technology. Three-dimensional geometry is captured as millions of points that outline the scanned object. These points are converted into polygons for use in the resulting animation workflow.
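The point-to-polygon workflow typically begins by thinning the raw point set before a surface is reconstructed. The sketch below illustrates one common preprocessing step, voxel-grid downsampling, in Python; it is an illustrative example only, not the lab's actual ZCorp/FARO pipeline, and the grid size is an assumed value.

```python
# Voxel-grid downsampling: a common first step when reducing millions of
# scanned points to a workable set before polygon reconstruction.
# Illustrative sketch only -- not the lab's actual ZCorp/FARO pipeline.
from collections import defaultdict

def downsample(points, voxel=0.05):
    """Average all points falling in each cubic voxel of side `voxel`."""
    bins = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        bins[key].append((x, y, z))
    out = []
    for pts in bins.values():
        n = len(pts)
        # replace each cluster of points by its centroid
        out.append(tuple(sum(c) / n for c in zip(*pts)))
    return out
```

A meshing step (e.g. Poisson or ball-pivoting reconstruction in dedicated software) would then convert the thinned cloud into polygons.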

In 2009, the IDIA Immersion Seminar in Virtual Worlds worked to simulate the original settings of several sculptures in the collection of the Ball State University Museum of Art. Students were professionally trained to use the ZCorp 3D laser scanner following archival museum methods.

This project has allowed Ball State students to gain skills in 3D scanning by learning the methodology, techniques and limitations involved in accurate scanning procedures. The museum scanning project was initially a way not only to digitally archive select sculptures, but to place them in a live three-dimensional multi-user virtual environment that immerses viewers in a simulation of the works in their original context.

The significance of this project was to address the fact that pre-modern sculpture was typically designed as an element in a large-scale architectural, public or natural setting. IDIA developed a workflow that is now also a service available to external clients in the sciences, industry and the arts – employing emerging digital tools to illuminate a viewer’s understanding of context.

FARO and Z Corporation’s scanning technologies are used in applications such as reverse engineering, accident reconstruction, forensic science, historic preservation, entertainment and virtual simulation.

Microsoft Surface Museum Collection Portal

The IDIA has assembled an interdisciplinary group of students, faculty and industry experts to create a significant interactive information portal for the Ball State University Museum of Art (BSUMA).  The IDIA team is developing an innovative touch-based interface to navigate the collection, integrating Microsoft Surface with the Museum’s database.  The Surface will afford Museum patrons a collaborative, participatory public platform through which to access metadata and media of the physical exhibition – as well as extending virtually into the permanent collection. Using the Digital Images Delivered Online (DIDO) database, the interface will make visible the interconnections between works in the collection, queried on searchable parameters such as artist, medium, period and subject. This two-semester immersive project, supported by the BSU Provost’s Immersion Initiative, has been team-taught and has recruited students from targeted disciplines across campus.
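The faceted interconnection the interface visualizes can be expressed as a simple query: given one work, find others sharing a value on any searchable facet. The Python sketch below is illustrative only – the field names and in-memory collection stand in for the actual DIDO database schema.

```python
# Illustrative sketch of faceted interconnection across a collection.
# Field names and sample data stand in for the actual DIDO database.
def related_works(collection, work_id, facets=("artist", "medium", "period")):
    """Return ids of works sharing at least one facet value with work_id,
    along with which facets matched -- the links the interface would draw."""
    target = collection[work_id]
    links = {}
    for wid, work in collection.items():
        if wid == work_id:
            continue
        shared = [f for f in facets
                  if target.get(f) and work.get(f) == target.get(f)]
        if shared:
            links[wid] = shared
    return links

collection = {
    "A": {"artist": "Weinman", "medium": "bronze", "period": "1915"},
    "B": {"artist": "Weinman", "medium": "marble", "period": "1910"},
    "C": {"artist": "Monet",   "medium": "oil",    "period": "1910"},
}
```

Querying work "B" here links it to "A" through the artist facet and to "C" through the period facet, the kind of interconnectivity the Surface renders visually.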

Complete Instruction Guide here: http://ilocker.bsu.edu/users/idiaa/world_shared/SurfaceInstructions.pdf

Microsoft Surface represents a fundamental change in the potential for interaction with digital content. The Surface is a 30” tabletop interface environment that allows several people to work independently or collaboratively – all without using a mouse or keyboard.  It allows users to navigate information physically and manipulate it with natural gestures and touch: http://www.microsoft.com/SURFACE/product.html#section=The%20Product

The Ball State University Museum of Art has a collection of nearly 11,000 works of art. Central to the Museum’s mission is the provision of educational programming that furthers faculty, staff, student and public utilization and understanding of the Museum and its collection, of museums in general, and of the arts. To accomplish this goal, the Museum offers tours, talks, materials for teachers, an education database and artist demonstrations. The Museum’s education philosophy centers on bringing together the needs of visitors and the resources of the Museum, and on balancing responsibility to the university with service to the community. In facilitating communication between the work of art and the viewer, the Museum subscribes to a philosophy of learner-centered programming informed by a thorough knowledge of the collection and of methods and strategies for effective teaching.

Although the Museum’s collection numbers nearly 11,000 works, given the physical constraints of the facility a mere 10% of them can be displayed at any given time.  The incorporation of this hybrid Surface and database system will afford patrons virtual access to the entire collection – making visible those works currently in storage.  More importantly, the system will allow patrons to visualize the interconnectivity of the works according to multiple facets – visually re-contextualizing the works in relation to specified search parameters.  This form of innovative technology within a museum context would typically be installed at major metropolitan institutions. Development of this interface at the Ball State University Museum of Art will not only benefit local community groups and BSUMA patrons, whose access to the collection will be significantly augmented, but also has the potential to influence other museums through the distribution of the project’s outcomes.

The Institute for Digital Intermedia Arts (IDIA) has a history of providing interdisciplinary immersion and new media experiences, employing pedagogy that supports students’ transformation as active, engaged learners. The Institute provides a dynamic exchange between instructor and students – where participants engage in collaborative, inquiry-based communities that provide an environment, which fosters participatory learning. As opposed to a traditional model, where knowledge is imparted by the teacher to the student, participatory learning can transform the learner into an active role. Participatory learning communities engage in a dynamic conversation centered around a subject – where analysis, comparison, and evaluation are core to the process of acquiring meaning and relationships. This environment allows students the opportunity to have a stake in the conditions of their learning, making decisions that direct their course of investigation. With a commitment to active inquiry, participants are freed to create personal and collective meaning.

 

BSU MUSEUM OF ART

COLLECTION PORTAL

SOFTWARE DESIGN

Carrie Arnold

Joel Happ

Garret Orth

Deekshita Reddy

Christopher Ross

Jonathan Strong

Austin Toombs

INTERFACE DESIGN

Anastasia Goryacheva

Steven Lanier

Jonathan Strong

ELECTRONICS

Eric Brockmeyer

Giovanni Rozzi

GRAPHIC DESIGN

Amy Goettemoeller

Ina-Marie Henning

FACULTY

Jesse Allison

John Fillwalk

Paul Gestwicki

PROJECT DIRECTOR

Jonathan Strong

A SPECIAL THANK YOU

UNIVERSITY LIBRARIES

James Bradley

John Straw

Budi Wibowo

SOUND

Rick Baker

Steven Lanier

Giovanni Rozzi

STRUCTURAL DESIGN

Eric Brockmeyer

Matthew Wolak

PROJECT MANAGEMENT

Ina-Marie Henning

Amruta Mhaiskar

Jennifer Weaver-Cotton

PRODUCER/PRINCIPAL INVESTIGATOR

John Fillwalk

BALL STATE UNIVERSITY MUSEUM OF ART

Peter Blume

Carl Schafer

Tania Said

Ball State University Museum of Art

College of Fine Arts

College of Architecture and Planning

Department of Architecture

Department of Art

Department of Computer Science

School of Music

Information Technology

Office of the Provost

University Libraries

and

Jeff Berg, IBM Interactive, IDIA Research Fellow

IDIA Presents: BSU MUSEUM OF ART COLLECTION PORTAL 3/20/11 at 2:30pm BSUMA

 

BSU MUSEUM OF ART COLLECTION PORTAL
Reception and Gallery Talk

Sunday, March 20 @ 2:30pm, BSU Museum of Art

The BSU Museum of Art Collection Portal was developed by students, faculty and industry research fellows in the Institute for Digital Intermedia Arts at Ball State University. The year-long project was developed in partnership with the Ball State University Museum of Art and sponsored by the Provost’s Immersive Learning Initiative.

This interdisciplinary team from art, computer science, architecture and music has developed an innovative, touch-based application to navigate the collection. The portal bridges the Microsoft Surface interface with the Museum’s collection database, the Digital Images Delivered Online (DIDO), hosted on the Digital Media Repository of Bracken Library. The Surface affords Museum visitors an interactive platform to individually or collaboratively make virtual connections between works of art both on display and in reserve – accessing information and media across the collection.

http://bit.ly/IDIA_Surface

Complete Instruction Guide here: http://ilocker.bsu.edu/users/idiaa/world_shared/SurfaceInstructions.pdf


IDIA Students Present Surface Project to IBM

As a follow-up to last week’s conference with IDIA Research Fellow Jeff Berg of IBM, John Fillwalk and IDIA Immersion Seminar students were invited to present our Microsoft Surface interface project to a 10-member IBM Interactive team of designers and programmers based in Chicago. A Provost’s Immersive Learning Grant funded the pursuit of commercialization of this product.

Flickr Gettr

At its essence, Flickr Gettr is a novel virtual interface bridging virtual worlds to the wealth of shared real-life imagery and information on Flickr.

Currently on Display at the New Media Consortium, amongst others: http://slurl.com/secondlife/NMC%20Campus%20West/137/100/21

USAGE

Participants enter queries that search public image folders via the Flickr web service. Each query pulls related images from Flickr and feeds them back to be spatialized in an immersive visual and sonic environment.  As each image spawns, it generates a chime-like sound that parallels the environment of imagery surrounding the viewer.

In Flickr Gettr II, a participant can touch any image and receive a list of the tags associated with it.  Touching the same image again initiates a random search on one of those tags, retrieving similar imagery.

In Flickr Gettr, an external web service was used as an intermediary to query Flickr, receive images and format them for delivery as textures.  It then passed the aspect ratios and tags in a second query so that the textures could be mapped properly.  To make these interactions more flexible, the intermediary web server was employed to collate and prepare information and to retain states that can be queried from external applications.  The web application effectively serves as an intermediary between the virtual world and outside environments, providing the communications link and logic to assimilate the information.  This can make the creation of Web 2.0 mash-ups much simpler, as the code for these sorts of queries is already highly developed in languages such as Java, Ruby and Perl.  Flickr Gettr also triggered music files upon the rapid rezzing of objects to create a cumulative ambient effect.
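The intermediary pattern described above can be sketched briefly. The Flickr REST method and parameter names (`flickr.photos.search`) and the static photo URL format are real; the record format the in-world script consumes, and the function names, are assumptions for illustration (Python here for brevity – the original intermediary was not necessarily Python).

```python
# Sketch of an intermediary service for a virtual-world client:
# build a Flickr search request, then collate the parsed results into
# texture records carrying the aspect ratio and tags the world needs.
# The record format and function names are illustrative assumptions.
from urllib.parse import urlencode

FLICKR_REST = "https://api.flickr.com/services/rest/"

def build_search_url(api_key, tag, per_page=10):
    """Compose a flickr.photos.search request for public photos with a tag."""
    params = {
        "method": "flickr.photos.search",
        "api_key": api_key,
        "tags": tag,
        "format": "json",
        "nojsoncallback": 1,
        "per_page": per_page,
    }
    return FLICKR_REST + "?" + urlencode(params)

def collate_for_world(photos):
    """Reduce a parsed photo list to what the in-world script needs:
    a texture URL plus aspect ratio and tags for correct mapping."""
    records = []
    for p in photos:
        url = "https://live.staticflickr.com/{server}/{id}_{secret}.jpg".format(**p)
        records.append({
            "texture": url,
            "aspect": p["width"] / p["height"],
            "tags": p.get("tags", []),
        })
    return records
```

Keeping the collation on the server side is what lets the virtual world stay thin: it only fetches textures and metadata that are already in the shape it can map.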

Twitter Cloud

The Twitter Cloud is a visualization of real time Tweets (messages) posted on Twitter.com. The system automatically scans specified user feeds, and visualizes new messages as they arrive. This piece was programmed in Java using the Processing API.

This project was also linked with the Tweet Station so users could be identified with RFID and allowed to post their own messages through a touchscreen kiosk. In the context below, conference attendees entered their Twitter feeds to be tracked so that others could read about their experiences as they posted from their laptops and phones. The Twitter Cloud visualizer has also been used within virtual worlds – both as a live event Tweet tracker, and to track and visualize avatars as they traveled to various locations within the world.
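The visualizer's core behavior is a poll-and-diff loop: fetch a user's recent messages, keep only those not yet seen, and hand the new ones to the display layer. Sketched below in Python for brevity (the piece itself was Java/Processing, and the message format is an illustrative assumption):

```python
# Poll-and-diff loop at the heart of a feed visualizer: only messages not
# yet seen are spawned into the cloud. Message format is illustrative;
# the original piece was written in Java using the Processing API.
def new_messages(feed, seen_ids):
    """Return unseen messages oldest-first and mark them as seen."""
    fresh = [m for m in feed if m["id"] not in seen_ids]
    seen_ids.update(m["id"] for m in fresh)
    return sorted(fresh, key=lambda m: m["id"])
```

Calling this on a timer against each tracked feed yields exactly the stream of "spawn events" the cloud animates, whether the display is a screen kiosk or an in-world object.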

Human-Computer Interaction in the Arts

The Human-Computer Interaction Lab housed in Music Technology is a learning and exploration ground in how electronics can interact with the arts. Students who take the MuMET 440 Special Topics course in HCI can use the lab to develop projects in augmented musical instruments, electronic installation art, analog audio processors, synthesizers, robotics, wireless sensor systems, and other electronic mayhem.  The course teaches equal parts basic electronics like power regulators, op amps, etc. and programming techniques for microcontrollers – the TI MSP430 and the Arduino. Most of all, the lab serves as a proving ground for ideas and a space for exploration.

Traversal

Traversal for Pipe Organ is a virtual performance exploring connectivity between worlds. As the name implies, it is a temporary link between the live pipe organ in a concert hall and a multi-user virtual world. “Physical” actions and events in the virtual realm compose the work in real time with the end product being a physical and aural realization of virtual performance.
As elements of the virtual instrument were engaged, their physical interactions – touch, physics, collision events – were used to create performance gestures on the organ in Sursa Hall. John Fillwalk, who created the performance environment and virtual instrument and collaboratively conceptualized the work, performed the live camera work that framed the performance for the live audience.  Jesse Allison composed the controllable sound events and managed the communications link between the virtual and physical during the performance.
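Conceptually, the communications link maps each virtual interaction to a performable gesture – for instance, collision height to pitch and impact strength to loudness. The mapping below is a hypothetical illustration of that idea, not the actual patch used in the Traversal performance; the ranges and field names are invented.

```python
# Hypothetical mapping from a virtual-world interaction event to a
# note gesture for a physical instrument. Ranges and field names are
# invented for illustration; this is not the actual Traversal patch.
def event_to_gesture(event, low_note=36, high_note=96):
    """Map collision height (0..1) to a MIDI-style pitch and impact
    strength (0..1) to velocity, clamping out-of-range input."""
    span = high_note - low_note
    height = max(0.0, min(1.0, event["height"]))
    impact = max(0.0, min(1.0, event["impact"]))
    return {"pitch": low_note + round(height * span),
            "velocity": round(impact * 127)}
```

In a deployment, gestures like these would be serialized and sent over the network link to the software driving the instrument.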

Excerpt from the premiere performance.

Eco-Net

“Eco-Net” is an attempt to connect wireless network data with nature by visualizing that data with plant-like structures and organic motion. Each plant represents a computer connected to the network and each IP address is displayed above the corresponding plant. Collective network activity is displayed as websites are browsed and emails are sent. This piece represents our constantly connected state, simulated through plant and root structures, and the constant barrage of data that flows through the air all around us every day. This piece was created completely in Java, using the Processing API.
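The plant metaphor reduces to a mapping from per-host traffic to growth parameters. A small sketch of one plausible mapping follows (Python here for brevity; the piece itself was Java/Processing, and the parameter names and scale factors are invented for illustration):

```python
import math

# Map a host's packet count to plant growth parameters -- stem height and
# branch count -- on a log scale so traffic bursts don't dwarf quiet hosts.
# Parameter names and scale factors are invented; the piece itself was
# written in Java using the Processing API.
def growth_params(packet_count, max_height=200.0):
    scaled = math.log1p(packet_count)          # compress heavy traffic
    height = min(max_height, scaled * 25.0)    # pixels of stem
    branches = 1 + int(scaled)                 # one branch per log-step
    return {"height": height, "branches": branches}
```

Each animation frame would then ease the rendered plant toward these target values, producing the organic motion the piece describes.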

Links
Students win international award for work in the digital arts