Displaced Resonance v2

IDIA Lab was invited to exhibit its virtual and hybrid artworks at the Third Art and Science International Exhibition at the China Science and Technology Museum in Beijing, China.

http://www.idialabprojects.org/displacedresonance/

Displaced Resonance v2

John Fillwalk

Michael Pounds

IDIA Lab, Ball State University

Interactive installation, mixed media

2012

Displaced Resonance v2 is an interactive installation consisting of sixteen reactive forms that are networked in a grid of light and sound. Interaction within the sculptural field is based on a participant’s presence and proximity to each sculpture. The Displaced Resonance installation is connected to a mirrored instance of the field in a virtual environment – bridging both physical and virtual visitors within a shared hybrid space. Visitors to the virtual space are represented by avatars and through their proximity affect the light and sound of each sculpture. Each participant is aware of the other in each space, uniting both instances within a singular hybrid environment.

A computer system using a thermal camera tracks the movement of visitors and responds by controlling the distribution of sound and dynamic RGB data to the LED lights within the sculptural forms. The installation utilizes custom live processing software to transform these sources through the participants’ interaction – displacing the interaction from both their physical and virtual contexts to the processed framework of physical and virtual resonances. The two environments are linked to send and receive active responses from both sides of the installation via messaging, sensors, hardware and scripting.
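The core proximity-to-light behavior can be pictured as a simple mapping from tracked visitor positions to per-sculpture color values. The sketch below is illustrative only: the grid coordinates, interaction radius, color ramp, and controller address are assumptions, not the installation’s actual custom software.

```python
import math

# Hypothetical layout: a 4x4 grid of sculptures spaced 1.5 m apart,
# matching the field described above.
GRID = [(x * 1.5, y * 1.5) for x in range(4) for y in range(4)]
RADIUS = 2.0  # assumed interaction radius in meters

def proximity(visitor, sculpture):
    """0.0 (out of range) .. 1.0 (touching), based on distance."""
    return max(0.0, 1.0 - math.dist(visitor, sculpture) / RADIUS)

def rgb_for(p):
    """Fade a column from idle blue toward active red as a visitor nears."""
    return (int(255 * p), 0, int(255 * (1.0 - p)))

def frame(visitor):
    """One control frame: an (index, r, g, b) tuple per sculpture."""
    return [(i,) + rgb_for(proximity(visitor, s)) for i, s in enumerate(GRID)]

# The real installation forwards such frames to the lighting hardware and
# to the virtual mirror over the network, e.g. (address hypothetical):
#   import socket
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   for msg in frame((2.0, 2.0)):
#       sock.sendto(bytes(msg), ("192.168.1.50", 9000))
```

The same frame, sent to both the physical controller and the virtual instance, is what keeps the two fields mirrored.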

In prototyping this work, the process included both physical and virtual models to design and build the form and the interaction. The physical prototypes were interpreted in a virtual simulation environment, investigating the spatial interaction of the structure. The interactive functionality was tested through scripting before the form was brought into the camera- and sensor-based physical version. After several virtual iterations, the form was re-interpreted and fabricated.

John Fillwalk

with Michael Pounds, David Rodriguez, Neil Zehr, Chris Harrison, Blake Boucher, Matthew Wolak, and Jesse Allison.

Third Art and Science International Exhibition

China Science and Technology Museum in Beijing

http://www.tasie.org.cn/index.asp

TASIE Press release

http://www.tasie.org.cn/content_e.asp?id=84

Ball State artists create “forest” of light on display in China and Internet

Muncie, Ind. — Ball State University electronic artists have created a “forest” of light and sound that will be on exhibit in Beijing, China through November, yet also accessible to visitors from Indiana or anywhere else in the world.

That’s possible because “Displaced Resonance,” as the interactive art exhibit is known, has both real-life and virtual components.

The physical portion has been installed in a gallery of the China Science and Technology Museum in Beijing. There, in-person visitors can negotiate a thicket of 16 interactive sculptures spaced 1.5 meters apart that will change colors and emit music as they approach.

A digital replica of the layout, meanwhile, resides on the Internet, accessible through the museum’s website. Online visitors can wander the virtual exhibit using an avatar, and the digital pillars will change colors and produce sounds, just like their physical counterparts.

But that’s not all — the two pieces interact with each other, says John Fillwalk, director of Ball State’s Institute for Digital Intermedia Arts (IDIA) and Hybrid Design Technologies (HDT), which created the work in collaboration with IDIA staff, students and Ball State composer Michael Pounds.

When an online avatar approaches a virtual pillar, the corresponding real-life column also will change colors, and vice versa. In-person and virtual visitors will produce different colors, however, allowing them to track each other through the exhibit.

“It’s what we call hybrid art,” says Fillwalk. “It’s negotiating between the physical world and the virtual. So it’s both sets of realities, and there’s a connection between the two.”

The physical pillars are two meters (or more than 6 feet, 6 inches) tall. They consist of a wooden base containing a sound system, a translucent pillar made of white corrugated plastic, and computer-controlled lighting.

A thermal camera mounted on the museum’s ceiling keeps track of visitors and feeds its data to a computer program that directs the columns to change color and broadcast sounds when someone draws near.

“It’s a sensory forest that you can navigate,” Fillwalk says.

A video screen mounted on a wall overlooking the exhibit allows museum visitors to watch avatars move around the virtual version, while Internet patrons can affect the physical counterpart.

“Displaced Resonance” is the centerpiece of Ball State’s contributions to the Beijing museum’s 3rd Art and Science International Exhibition and Symposium, a month-long celebration of technology and the arts. Ball State was invited to participate because museum curators discovered some of IDIA’s work and liked what they saw, Fillwalk said.

In addition to “Displaced Resonance,” IDIA contributed four other pieces of digital art that museum visitors can view at a kiosk.

Those pieces are:

· “Proxy”, in which visitors create, color and sculpt with floating 3D pixels.

· “Flickr Gettr,” in which visitors can surround themselves with photos from the Flickr web service that correspond to search terms they submit.

· “Confluence,” in which users create virtual sculptures by moving around the screen and leaving a path in their wake.

· “Survey for Beijing,” in which real time weather data from Beijing is dynamically visualized in a virtual environment.

(Note to editors: For more information, contact John Fillwalk, director of the Institute for Digital Intermedia Arts, at 765-285-1045 or jfillwalk@bsu.edu; or Vic Caleca, media relations manager, at 765-285-5948, or vjcaleca@bsu.edu. For more stories, visit the Ball State University News Center at www.bsu.edu/news).

Final Wisdom I: Interactive Haptic Poetry Installation at SIGGRAPH 2010

Final Wisdom I is an interactive installation engaging viewers in a sensory exploration of temporal and spatialized poetry. Participants manipulate imagery, sound and language through their gesture, touch and proximity. The work is engaged through a framework of cameras and sensors that react to heat, position, and capacitance – presenting a shifting environment of reactive media and haptics. Final Wisdom I is the work of artists Hans Breder and John Fillwalk, with poetry by critic Donald Kuspit and music by composers Carlos Cuellar Brown and Jesse Allison. This project is produced through the Institute for Digital Intermedia Arts at Ball State University in collaboration with the Institute for Digital Fabrication. Final Wisdom I was exhibited at the art gallery of SIGGRAPH 2010 in Los Angeles, CA. Special thanks to IDF/CAP students Matthew Wolak, Christopher Baile and Claire Matucheski, and Assistant Professor of Architecture Joshua Vermillion. http://www.i-m-a-d-e.org/

As an intermedia artist, John Fillwalk actively investigates emerging technologies that inform his work in a variety of media, including video installation, virtual art, and interactive forms. His perspective is rooted in the traditions of painting, cinematography, and sculpture, with a particular interest in spatialized works that can immerse and engage a viewer within an experience. Fillwalk positions his work to act as both a threshold and mediator between tangible and implied space, creating a conduit for the transformative extension of experience, and to pursue the realization of forms, sounds and images that afford interaction at its most fundamental level. In working with technology, he values the synergy of collaboration and regularly works with other artists and scientists on projects that could not be realized otherwise. Electronic media extend the range of traditional processes by establishing a palette of time, motion, interactivity, and extensions of presence. The ephemeral qualities of electronic and intermedia works, by their very nature, are inherently transformative, and the significance of the tangible becomes fleeting, shifting emphasis away from the object and toward the experience.

John Fillwalk is Director of the Institute for Digital Intermedia Arts (IDIA Lab) at Ball State University, an interdisciplinary and collaborative hybrid studio. An intermedia artist and Associate Professor of Electronic Art, Fillwalk investigates media in video installation, hybrid reality and interactive forms. He received his MFA from the University of Iowa in Intermedia and Video Art, and has since received numerous grants, awards, commissions and fellowships.

Donald Kuspit is an art critic, author and professor of art history and philosophy at State University of New York at Stony Brook and lends his editorial expertise to several journals, including Art Criticism, Artforum, New Art Examiner, Sculpture and Centennial Review. Hans Breder was born in Herford, Germany, and trained as a painter in Hamburg, Germany. Attracted to the University of Iowa’s School of Art and Art History in 1966, Breder established the Intermedia Program. Carlos Cuellar Brown, a.k.a. ccbrown, is a composer, instrumentalist and music producer. Formally trained as a classical pianist, Cuellar specialized in experimental music and intermedia with the late American maverick composer Kenneth Gaburo. Jesse Allison is the Virtual Worlds Research Specialist, IDIA, and Assistant Professor of Music Technology, Ball State University. He is also President of Hardware Engineering with Electrotap, LLC, an innovative human-computer interface firm.

Leonardo: The International Society for the Arts, Science and Technology article published by The MIT Press. http://muse.jhu.edu/journals/leonardo/summary/v043/43.4.fillwalk.html

VIRTUAL INSTRUMENT 3D UI

ChromaChord utilizes a combination of the Oculus Rift headset and an attached Leap Motion controller to create an immersive 3D VMI environment. The Oculus Rift provides a three-paneled visual interface, through Unity, that allows users to access different component windows of the instrument simply by turning their head, while the Leap Motion controller allows the user to interact with the visual interface to musical effect (processed and generated in Max/MSP). ChromaChord uses the Oculus Rift DK2, a virtual reality headset, to immerse the performer within the VMI. Using two cameras in the virtual environment, this device relays a stereoscopic image to the wearer, allowing for highly dimensional experiences. A gyroscope, accelerometer, and magnetometer detect real-world head rotation and apply it to its virtual equivalent, and a peripheral infrared camera captures limited-range positional tracking. A Leap Motion controller affixed to the Oculus Rift provides infrared tracking of the hands. In our application, the Leap Motion tracks the user’s real-world hands and simulates them, mimicking hand positions and finger movements; the user’s own hands are therefore the primary input mechanism.

Custom Software
  • Key note triggering and glow by Leap Motion hand touches
  • Particle generation triggered by Leap Motion hand touches
  • Button click and glow by Leap Motion hand touches
  • Particle destruction and orbit physics
  • OSC Server and Client configurator / Message Handler / Data Handler
Our graphical user interface is composed of three sections – left, center and right panels. On the left, the user can see the modulation sphere. Touching this sphere generates orbital particles; the number of particles affects a live process on the notes, and the user can remove them by simply swiping them away. Turning to the right reveals the root note and scale type controls. To select a root note and scale type, the user presses the desired button, and the interface responds by highlighting the selected options. The center module contains the keys, a chromatic scale in two colored columns based on the Solfège. This is the core performance area of the UI. In addition, there are arrows facing up and down that the user can touch to raise or lower the octave. All buttons and keys respond to touches by glowing white.

3D Musical Instrument

The default window is the center palette, where the instrument itself is located. Within this window, there are two columns of colored keys (four per column) to the left and the right of the screen. To create sound, the user has only to hold their hands vertically in front of them and ‘touch’ the keys, which are placed both for ease of use and for the ability to contact multiple notes simultaneously to create chords. The notes themselves are stacked in intervals of a third from each other; consequently, neighboring notes on the same hand will create conventional harmonies. Between the columns, at their topmost and bottommost extremities, are two arrows that, if touched, move the current note set up or down an octave. When the user touches a note, Unity sends a UDP message to Max/MSP indicating which note has been touched; the note remains active until contact ceases. Similarly, touching the octave buttons sends UDP messages to Max to reassign the MIDI note values.
Within Max, MIDI values are then converted to their equivalent frequencies and assigned to their respective oscillators before being processed and output through Max’s audio out. When looking to the right, the user accesses the settings window, where note assignment takes place. First, the user selects the root note of the scale they wish to be within. The second parameter allows the user to select what mode of scale they would like – the options being: major, natural minor, and harmonic minor. By default, the instrument opens with a root note of C and is in the major mode. The UDP messages sent from the settings window control a series of gates within Max that then send adjusted MIDI values to the instrument’s oscillators. The performer accesses the modulation window to the left, which controls the frequency and Q of a filter, as well as the level of effect. By touching the button in the center, Unity spawns floating particles that surround the user’s hands. The longer the user touches the button, the more particles accumulate; they can be removed by making shaking or throwing motions. Without particles, the filter’s frequency setting is at its lowest audible setting, creating a much darker, mellow timbre; alternatively, as more particles accumulate, the filter opens up, allowing higher frequencies to pass through and creating a brighter, richer timbre. Additionally, the particles move the dry/wet ratio for effects toward the ‘wet’ end of the spectrum, deepening the audibility of the effects. The UDP message sent from the modulation window carries a single numeric value for the total number of particles active; this value is then scaled in Max/MSP to best suit each parameter being altered.
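The note-to-frequency and particle-scaling steps happen inside Max/MSP, which is not shown here. An equivalent calculation, sketched in Python under the assumption of standard equal temperament (the scale-degree list and the filter range are illustrative stand-ins, not the actual patch values), looks like this:

```python
def midi_to_hz(note):
    """Equal-temperament conversion: A4 (MIDI 69) = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

# Scale degrees for the eight on-screen keys (major mode shown; the
# instrument also offers natural and harmonic minor).
MAJOR = [0, 2, 4, 5, 7, 9, 11, 12]

def key_notes(root=60, degrees=MAJOR, octave_shift=0):
    """MIDI notes for the key columns; default is the instrument's C major."""
    return [root + d + 12 * octave_shift for d in degrees]

def particles_to_cutoff(count, lo=80.0, hi=8000.0):
    """Map the modulation window's particle count (0-100) to a filter
    cutoff in Hz. The 80-8000 Hz range is a hypothetical stand-in for
    the scaling done in Max/MSP."""
    t = min(max(count, 0), 100) / 100
    return lo + (hi - lo) * t
```

With no particles the cutoff sits at its floor (the darker timbre described above); at the full particle count the filter is wide open.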
| Name | Type | Range | Initial value | Notes |
|------|------|-------|---------------|-------|
| Do, Re, Mi, Fa, Sol, La, Ti, Do | Key | 0–1 each | 0 | First Do is the bottom of the octave, last Do the top |
| Up Octave | Octave Switcher | 0–1 | 0 | 1 when clicked |
| Down Octave | Octave Switcher | 0–1 | 0 | 1 when clicked |
| Modulation | Modulation | 0–100 | 0 | Number of particles |
| Root Note | Root Note | 0–11 | 0 | |
| Scale Type | Scale Type | 0–2 | 0 | |

Microsoft Surface Museum Collection Portal

The IDIA has assembled an interdisciplinary group of students, faculty and industry experts to create a significant interactive information portal for the Ball State University Museum of Art (BSUMA).  The IDIA team is developing an innovative touch-based interface to navigate the collection, employing and integrating Microsoft Surface with the Museum’s database.  The Surface will afford Museum patrons a collaborative, participatory public platform through which to access metadata and media of the physical exhibition – as well as extending virtually into the permanent collection. Using the Digital Images Delivered Online (DIDO) database, the interface will make visible the interconnection between works in the collection queried on searchable parameters, e.g., artist, medium, period, or subject. This two-semester immersive project supported by the BSU Provost’s Immersion Initiative has been team-taught and has recruited students from targeted disciplines across campus.
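At its core, this kind of faceted navigation reduces to filtering records by metadata fields and linking works that share a facet. The Python sketch below illustrates the idea only; the records and field names are invented, and the actual portal queries the DIDO database rather than an in-memory list.

```python
# Hypothetical stand-ins for collection metadata records.
COLLECTION = [
    {"title": "Harbor Scene", "artist": "A. Marin", "medium": "oil", "period": "modern"},
    {"title": "Study in Blue", "artist": "A. Marin", "medium": "watercolor", "period": "modern"},
    {"title": "Torso", "artist": "B. Chen", "medium": "bronze", "period": "contemporary"},
]

def facet_search(records, **facets):
    """Records matching every given facet, e.g. facet_search(..., medium="oil")."""
    return [r for r in records if all(r.get(k) == v for k, v in facets.items())]

def related(records, work, keys=("artist", "medium", "period")):
    """Works sharing at least one facet with `work` -- the kind of
    interconnection the portal visualizes between pieces."""
    return [r for r in records
            if r is not work and any(r.get(k) == work.get(k) for k in keys)]
```

A touch on a work would trigger a `related` query, and the shared facets would be drawn as visual links on the Surface.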

Complete Instruction Guide here: http://ilocker.bsu.edu/users/idiaa/world_shared/SurfaceInstructions.pdf

Microsoft Surface represents a fundamental change in the potential for interaction with digital content. The Surface is a 30” tabletop interface environment that allows several people to work independently or collaboratively – all without using a mouse or a keyboard.  The Surface allows users to navigate information physically and manipulate it with natural gestures and touch: http://www.microsoft.com/SURFACE/product.html#section=The%20Product

The Ball State University Museum of Art has a collection of nearly 11,000 works of art. Central to the Ball State University Museum of Art’s mission is the provision of educational programming that will further faculty, staff, student, and public utilization and understanding of the Museum and its collection, of museums in general, and of the arts. To accomplish this goal, the Museum offers tours, talks, materials for teachers, an education database, and artist demonstrations. The Museum’s education philosophy centers on bringing together the needs of visitors and the resources of the Museum, and balancing responsibility to the university with service to the community. In facilitating communication between the work of art and the viewer, the Museum subscribes to a philosophy of learner-centered programming informed by a thorough knowledge of the collection and methods and strategies for effective teaching.

Although the Museum’s collection numbers nearly 11,000 works, given the physical constraints of the facility, a mere 10% of the collected works are currently displayed at any given time.  The incorporation of this hybrid Surface and database system will afford patrons virtual access to the entire collection – making visible those works currently stored.  More importantly, the system will allow patrons to visualize the interconnectivity of the works according to multiple facets – visually re-contextualizing the works in relation to specified search parameters.  This form of innovative technology within a museum context would typically be installed at major metropolitan institutions. Development of this interface at the Ball State University Museum of Art will not only benefit local community groups and patrons of BSUMA, whose access to the collection will be significantly augmented, but also has the potential to influence other museums through the distribution of the outcomes of this project.

The Institute for Digital Intermedia Arts (IDIA) has a history of providing interdisciplinary immersion and new media experiences, employing pedagogy that supports students’ transformation into active, engaged learners. The Institute provides a dynamic exchange between instructor and students – where participants engage in collaborative, inquiry-based communities that foster participatory learning. As opposed to a traditional model, where knowledge is imparted by the teacher to the student, participatory learning places the learner in an active role. Participatory learning communities engage in a dynamic conversation centered on a subject – where analysis, comparison, and evaluation are core to the process of acquiring meaning and relationships. This environment allows students the opportunity to have a stake in the conditions of their learning, making decisions that direct their course of investigation. With a commitment to active inquiry, participants are freed to create personal and collective meaning.

 

BSU MUSEUM OF ART

COLLECTION PORTAL

SOFTWARE DESIGN

Carrie Arnold

Joel Happ

Garret Orth

Deekshita Reddy

Christopher Ross

Jonathan Strong

Austin Toombs

INTERFACE DESIGN

Anastasia Goryacheva

Steven Lanier

Jonathan Strong

ELECTRONICS

Eric Brockmeyer

Giovanni Rozzi

GRAPHIC DESIGN

Amy Goettemoeller

Ina-Marie Henning

FACULTY

Jesse Allison

John Fillwalk

Paul Gestwicki

PROJECT DIRECTOR

Jonathan Strong

A SPECIAL THANK YOU

UNIVERSITY LIBRARIES

James Bradley

John Straw

Budi Wibowo

SOUND

Rick Baker

Steven Lanier

Giovanni Rozzi

STRUCTURAL DESIGN

Eric Brockmeyer

Matthew Wolak

PROJECT MANAGEMENT

Ina-Marie Henning

Amruta Mhaiskar

Jennifer Weaver-Cotton

PRODUCER/PRINCIPAL INVESTIGATOR

John Fillwalk

BALL STATE UNIVERSITY MUSEUM OF ART

Peter Blume

Carl Schafer

Tania Said

Ball State University Museum of Art

College of Fine Arts

College of Architecture and Planning

Department of Architecture

Department of Art

Department of Computer Science

School of Music

Information Technology

Office of the Provost

University Libraries

and

Jeff Berg, IBM Interactive, IDIA Research Fellow

IDIA Presents: BSU MUSEUM OF ART COLLECTION PORTAL 3/20/11 at 2:30pm BSUMA

 

BSU MUSEUM OF ART COLLECTION PORTAL
Reception and Gallery Talk

Thursday, March 20 @ 2:30pm, BSU Museum of Art

The BSU Museum of Art Collection Portal was developed by students, faculty and industry research fellows in the Institute for Digital Intermedia Arts at Ball State University. The year-long project was developed in partnership with the Ball State University Museum of Art and sponsored by the Provost’s Immersive Learning Initiative.

This interdisciplinary team from art, computer science, architecture and music has developed an innovative, touch-based application to navigate the collection. The portal bridges the Microsoft Surface interface with the Museum’s collection database, the Digital Images Delivered Online (DIDO), hosted on the Digital Media Repository of Bracken Library. The Surface affords Museum visitors an interactive platform to individually or collaboratively make virtual connections between works of art both on display and in reserve – accessing information and media across the collection.

http://bit.ly/IDIA_Surface

Complete Instruction Guide here: http://ilocker.bsu.edu/users/idiaa/world_shared/SurfaceInstructions.pdf

Human-Computer Interaction in the Arts

The Human-Computer Interaction Lab housed in Music Technology is a learning and exploration ground for how electronics can interact with the arts. Students who take the MuMET 440 Special Topics course in HCI can use the lab to develop projects in augmented musical instruments, electronic installation art, analog audio processors, synthesizers, robotics, wireless sensor systems, and other electronic mayhem.  The course teaches equal parts basic electronics (power regulators, op amps, and the like) and programming techniques for microcontrollers, namely the TI MSP430 and the Arduino. Most of all, the lab serves as a proving ground for ideas and a space for exploration.

[un]wired

[un]wired by Jesse Allison, John Fillwalk and Keith Kothman is a real-time network visualizer that responds to interactions from personal radio-frequency devices such as mobile phones, WiFi signals, Bluetooth signals and car-key fobs. It tracks real-time statistical information from wireless access points (designed for seamless handoff of moving wireless traffic, like a cell phone network), along with periodically updated information from hand-held devices. These interactions appear as sound and shapes on the screen, letting participants see their influence on the piece live. The control information is collected from network services via a MySQL database and transferred into Max/MSP/Jitter. [un]wired was exhibited at SIGGRAPH Asia 2008 in Singapore.
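The statistics-to-media step can be pictured as a normalization stage followed by parameter assignment. The sketch below is a loose Python illustration; the field names and ranges are invented, and the piece itself pulls live values from MySQL into Max/MSP/Jitter rather than computing them this way.

```python
def normalize(value, lo, hi):
    """Clamp and scale a raw network statistic into 0..1."""
    if hi <= lo:
        return 0.0
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def visual_params(ap_clients, max_clients=50):
    """Map per-access-point client counts to on-screen shape size and
    audio level. The pixel and client ranges are hypothetical."""
    return [{"ap": name,
             "size": 10 + 90 * normalize(count, 0, max_clients),  # pixels
             "amp": normalize(count, 0, max_clients)}             # 0..1 gain
            for name, count in sorted(ap_clients.items())]
```

A busier access point would thus draw a larger shape and sound a louder voice, which is the kind of live feedback the piece gives its participants.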

http://www.siggraph.org/asia2008/attendees/art/20.php

Intermedia Artist Timeline

This device was designed to take you on a journey through the history of intermedia art. The kiosk, made by the IDIA in collaboration with the Institute for Digital Fabrication at Ball State University, allows the user to interact using hand gestures to move through the timeline. The work includes a short bio, image and description of works from 30 intermedia artists of the 1960s to the present. It was displayed in conjunction with the BSU Museum of Art’s Engaging Technology Exhibit. Jesse Allison programmed the piece with help from Mike Sayre. Dustin Headley and the IDF designed and constructed the kiosk.