AI Art Installation Beijing

Wishing Well (许愿池), by media artist John Fillwalk and IDIA Lab, is an Artificial Intelligence (AI) installation invited into the 5th Arts and Science International Exhibition and Symposium (TASIES 2019) at the National Museum of China in Beijing, on Tiananmen Square. The exhibition curated the work of international artists engaged in the pioneering use of AI in art-making. The Wishing Well installation creates an interactive environment in which visitors’ spoken wishes shape the substance, meaning and character of the artwork – mediated through artificial intelligence.


Wishing Well (许愿池) engages both private and public spatial interactions in the creation of a simultaneously intimate and collective social experience. The tone of the wishes aggregated by the system is processed anonymously and delivered into this social sculpture via synthesized whispered voices, spoken in Mandarin. The pool of water in the bowl, or wishing well, is vibrated by a speaker placed underneath – creating a three-dimensional dynamic sound wave representation shaped by the sound of the actual wish. Virtual colored shapes are emitted based on the tonal analysis of each wish and are aggregated into a historical representation of previous wishes – launching skyward to become part of the larger collection of dreams.

Displaced Resonance v2

IDIA Lab was invited to exhibit their virtual and hybrid artworks at the Third Art and Science International Exhibition at the China Science and Technology Museum in Beijing, China.

http://www.idialabprojects.org/displacedresonance/

Displaced Resonance v2

John Fillwalk

Michael Pounds

IDIA Lab, Ball State University

Interactive installation, mixed media

2012

Displaced Resonance v2 is an interactive installation consisting of sixteen reactive forms that are networked in a grid of light and sound. Interaction within the sculptural field is based on a participant’s presence and proximity to each sculpture. The Displaced Resonance installation is connected to a mirrored instance of the field in a virtual environment – bridging both physical and virtual visitors within a shared hybrid space. Visitors to the virtual space are represented by avatars and through their proximity affect the light and sound of each sculpture. Each participant is aware of the other across spaces, uniting both instances within a singular hybrid environment.

A computer system using a thermal camera tracks the movement of visitors and responds by controlling the distribution of sound and dynamic RGB data to the LED lights within the sculptural forms. The installation utilizes custom live processing software to transform these sources through the participants’ interaction – displacing the interaction from both their physical and virtual contexts to the processed framework of physical and virtual resonances. The two environments are linked to send and receive active responses from both sides of the installation via messaging, sensors, hardware and scripting.
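As a rough illustration of the proximity-to-light mapping described above (not the installation's actual software), the sketch below maps a tracked visitor position onto per-sculpture RGB levels for a 4 × 4 grid. The response radius and color weighting are assumptions; only the 1.5 m grid spacing comes from the source.

```python
import math

# Sculpture positions form a 4x4 grid at 1.5 m spacing; a visitor position
# (as reported by the thermal camera tracker) drives each sculpture's RGB level.
GRID = [(x * 1.5, y * 1.5) for x in range(4) for y in range(4)]
RADIUS = 2.0  # metres within which a sculpture responds (assumed value)

def rgb_for_sculptures(visitor_x, visitor_y):
    """Return one (r, g, b) tuple per sculpture, brighter as the visitor nears."""
    frames = []
    for sx, sy in GRID:
        d = math.hypot(visitor_x - sx, visitor_y - sy)
        level = max(0.0, 1.0 - d / RADIUS)  # 1.0 at the sculpture, 0 beyond RADIUS
        frames.append((int(255 * level), int(128 * level), int(255 * level)))
    return frames

frames = rgb_for_sculptures(2.25, 2.25)  # visitor standing at the centre of the grid
```

The same frame data could equally be sent over the network to the mirrored virtual instance, which is how the two environments stay in step.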

In prototyping this work, the process included both physical and virtual models to design and build the form and the interaction. The physical prototypes were interpreted in a virtual simulation environment, investigating the spatial interaction of the structure. The interactive functionality was tested through scripting before the form was brought into the sensor- and camera-based version. After several virtual iterations, the form was re-interpreted and fabricated.

John Fillwalk

with Michael Pounds, David Rodriguez, Neil Zehr, Chris Harrison, Blake Boucher, Matthew Wolak, and Jesse Allison.

Third Art and Science International Exhibition

China Science and Technology Museum in Beijing

http://www.tasie.org.cn/index.asp

TASIE Press release

http://www.tasie.org.cn/content_e.asp?id=84

Ball State artists create “forest” of light on display in China and Internet

Muncie, Ind. — Ball State University electronic artists have created a “forest” of light and sound that will be on exhibit in Beijing, China through November, yet also accessible to visitors from Indiana or anywhere else in the world.

That’s possible because “Displaced Resonance,” as the interactive art exhibit is known, has both real-life and virtual components.

The physical portion has been installed in a gallery of the China Science and Technology Museum in Beijing. There, in-person visitors can negotiate a thicket of 16 interactive sculptures spaced 1.5 meters apart that will change colors and emit music as they approach.

A digital replica of the layout, meanwhile, resides on the Internet, accessible through the museum’s website. Online visitors can wander the virtual exhibit using an avatar, and the digital pillars will change colors and produce sounds, just like their physical counterparts.

But that’s not all — the two pieces interact with each other, says John Fillwalk, director of Ball State’s Institute for Digital Intermedia Arts (IDIA) and Hybrid Design Technologies (HDT), which created the work in collaboration with IDIA staff, students and Ball State composer Michael Pounds.

When an online avatar approaches a virtual pillar, the corresponding real-life column also will change colors, and vice versa. In-person and virtual visitors will produce different colors, however, allowing them to track each other through the exhibit.

“It’s what we call hybrid art,” says Fillwalk. “It’s negotiating between the physical world and the virtual. So it’s both sets of realities, and there’s a connection between the two.”

The physical pillars are two meters (or more than 6 feet, 6 inches) tall. They consist of a wooden base containing a sound system, a translucent pillar made of white corrugated plastic, and computer-controlled lighting.

A thermal camera mounted on the museum’s ceiling keeps track of visitors and feeds its data to a computer program that directs the columns to change color and broadcast sounds when someone draws near.

“It’s a sensory forest that you can navigate,” Fillwalk says.

A video screen mounted on a wall overlooking the exhibit allows museum visitors to watch avatars move around the virtual version, while Internet patrons can affect the physical counterpart.

“Displaced Resonance” is the centerpiece of Ball State’s contributions to the Beijing museum’s 3rd Art and Science International Exhibition and Symposium, a month-long celebration of technology and the arts. Ball State was invited to participate because museum curators discovered some of IDIA’s work and liked what they saw, Fillwalk said.

In addition to “Displaced Resonance,” IDIA contributed four other pieces of digital art that museum visitors can view at a kiosk.

Those pieces are:

· “Proxy”, in which visitors create, color and sculpt with floating 3D pixels.

· “Flickr Gettr,” in which visitors can surround themselves with photos from the Flickr web service that correspond to search terms they submit.

· “Confluence,” in which users create virtual sculptures by moving around the screen and leaving a path in their wake.

· “Survey for Beijing,” in which real time weather data from Beijing is dynamically visualized in a virtual environment.

(Note to editors: For more information, contact John Fillwalk, director of the Institute for Digital Intermedia Arts, at 765-285-1045 or jfillwalk@bsu.edu; or Vic Caleca, media relations manager, at 765-285-5948, or vjcaleca@bsu.edu. For more stories, visit the Ball State University News Center at www.bsu.edu/news).

VBAM: VIRTUAL BROAD ART MUSEUM COMMISSION

Press Release

View the VBAM project

VBAM Introductory Lecture

John Fillwalk and the Institute for Digital Intermedia Arts (IDIA Lab) at Ball State University were commissioned to design and build a virtual museum and artworks for the Eli and Edythe Broad Art Museum in East Lansing, MI. The physical museum was designed by architect Zaha Hadid and opened to the public in Fall 2012. Museum Director Michael Rush commissioned Fillwalk and IDIA Lab to envision and create a dynamic program of artworks for the Virtual Broad Art Museum project (VBAM). Fillwalk and his collaborators created an immersive multi-user environment and four commissioned artworks using Unity 3D – experienced entirely within a web browser.

“Engaging visitors with innovators at the leading edge of art and technology, both here at MSU and around the globe, is key to the Broad Art Museum’s mission,” said founding director Michael Rush. “With the Virtual Broad Art Museum, we have an opportunity to embrace the tremendous creative and connective possibilities that exist in the digital world.”

The mixed world reception took place on Thursday, March 15th 2012 from 6:00 – 8:00 pm EST simultaneously on the campus of Michigan State University and in the VBAM virtual worlds.

Please visit the VBAM project links below.

http://idialabprojects.org/vbam/proxy.html

http://idialabprojects.org/vbam/flickrgettr.html

http://idialabprojects.org/vbam/survey.html

http://idialabprojects.org/vbam/confluence.html

VBAM Exhibitions by John Fillwalk

Participants in con|FLUENCE can create pathways based on reactions to both social and spatial relationships. There is a virtual response and persistence to interactions, as the routes are drawn and sculpted in the three-dimensional environment – forming nodes that are created via visitors’ proximity.

con|FLUENCE. John Fillwalk. 2012.

Survey for VBAM is an immersive landscape simulation using real time weather data from the location of the Broad Art Museum in East Lansing, Michigan. Representations of surveyor’s tape, flags, light, sky color, time of day, wind speed and even cloud density are driven by the actual physical weather forces – informing the virtual landscape.

Survey. John Fillwalk. 2012.
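As a sketch of how such weather-to-scene mapping might work (the field names, caps, and scaling below are hypothetical, not the actual Survey code), live readings could be normalized into 0–1 parameters that drive the virtual landscape:

```python
# Hypothetical mapping of raw weather readings to normalised scene parameters.
# Input field names are assumptions for illustration only.

def scene_params(weather):
    """Map weather readings to 0.0-1.0 parameters for the virtual landscape."""
    return {
        # flag/tape sway driven by wind speed, capped at 30 m/s
        "flag_sway": min(weather["wind_speed_ms"], 30.0) / 30.0,
        # cloud density passed through directly (already a percentage)
        "cloud_density": weather["cloud_cover_pct"] / 100.0,
        # sky light keyed to local hour, brightest at noon
        "sky_light": 1.0 - abs(weather["hour_local"] - 12) / 12.0,
    }

params = scene_params({"wind_speed_ms": 6.0, "cloud_cover_pct": 40, "hour_local": 15})
```

In the installation itself these parameters would be re-applied each time a fresh forecast arrives, so the virtual landscape drifts with the real one.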

dis|PLACE is an interactive gesture-based performance that will be exhibited at the VBAM reception on the 15th of March. This installation links virtual space to physical reality in an exploration of agency and gesture. Employing a participant’s movement, the work can both navigate 3D space and control direct interactions with the digital performers via a Microsoft Kinect™ sensor.

dis|PLACE. John Fillwalk. 2012.

Proxy is a performative, interactive and site-specific virtual installation where participants shape the evolution of a sculptural and sonic composition as a response to the Museum. Interactions are painted, sculpted and sonified within this collaborative and evolving environment. The work progresses to construct itself in relation to the Museum – eventually transforming into an integrated structure – reflecting and becoming its environment.

Proxy. John Fillwalk. 2012.

Flickr™ Gettr for VBAM connects the social image web service of Flickr™ to the virtual Museum environment, allowing visitors to create a dynamic cloud of spatial imagery by entering a search term of their choice – providing tagged images from Flickr™ into the Museum environment. The search term is spoken by the software, and tags from the image cloud are revealed, allowing for subsequent user transformations of the spatial imagery.

Flickr Gettr. John Fillwalk. 2012.

Credits

Proxy.
John Fillwalk with
Neil Zehr, Michael James Olson
2012

Survey for VBAM
John Fillwalk with
Neil Zehr, Keith Kothman, Charles Estell
2012

Flickr Gettr for VBAM
John Fillwalk with
Jesse Allison, Neil Zehr, Charles Estell
2012

con|FLUENCE
John Fillwalk with
Neil Zehr
2012

dis|PLACE (exhibited at the reception on the 15th of March as a live performance only)
John Fillwalk with
Neil Zehr, Jonathan Strong
2012

Michael Rush, Director of the Broad Museum of Art, MSU
Adam Brown, Professor Intermedia and Electronic Art, MSU
John Fillwalk, Director of the Institute for Digital Intermedia Arts [IDIA Lab], BSU
Neil Zehr, IDIA Lab, BSU
Charles Estell, IDIA Lab, BSU
Michael James Olson, IDIA Lab, BSU
Ina-Marie Henning, IDIA Lab, BSU
Jonathan Strong, IDIA Lab, BSU
Jesse Allison, LSU
Keith Kothman, BSU
Jonathan Ross, BSU
Blake Boucher, BSU
Michael Davidson, MSU
Jayne Goedekke, MSU
Kyle Gomboy, Reaction Grid
Chris Hart, Reaction Grid

Special thank you to Neil Zehr, IDIA Lab, BSU

Interactive Art at the College of Architecture and Planning

The IDIA Lab is designing a series of extended reality artworks that will be installed in the atrium of the College of Architecture and Planning. The installation allows passers-by to interact using various modes including motion, voice, and gesture to shape the compositions. The works employ sensors, sound art, artificial intelligence, and other programming to create dynamic experiences.

The Firefly Effect

A virtual social installation by John Fillwalk, IDIA Lab and Jody Naderi, CAP

This work, designed specifically for Nuit Blanche Toronto, activates a locative field in which participants engage in a dynamic social experience. Individual users of the app interact with a dynamic feedback system allowing them to find the cadence of a contemplative walk. Participants can view other app users on a live map – locating them spatially to join in a unique group or virtual room – forming a dynamic choir of collaborative synchronous light and music.

Nuit Blanche

https://nbto.com

Firefly at Nuit Blanche

https://nbto.com/project.html?project_id=114

IDIA Lab

https://www.idialab.org/firefly/

Final Wisdom I: Interactive Haptic Poetry Installation at SIGGRAPH 2010

Final Wisdom I is an interactive installation engaging viewers in a sensory exploration of temporal and spatialized poetry. Participants manipulate imagery, sound and language through gesture, touch and proximity. The work is engaged through a framework of cameras and sensors that react to heat, position, and capacitance – presenting a shifting environment of reactive media and haptics. Final Wisdom I is the work of artists Hans Breder and John Fillwalk, with poetry by critic Donald Kuspit and music by composers Carlos Cuellar Brown and Jesse Allison. The project was produced through the Institute for Digital Intermedia Arts at Ball State University in collaboration with the Institute for Digital Fabrication, and was exhibited in the art gallery of SIGGRAPH 2010 in Los Angeles, CA. Special thanks to IDF/CAP students Matthew Wolak, Christopher Baile and Claire Matucheski, and Assistant Professor of Architecture Joshua Vermillion. http://www.i-m-a-d-e.org/

As an intermedia artist, John Fillwalk actively investigates emerging technologies that inform his work in a variety of media, including video installation, virtual art, and interactive forms. His perspective is rooted in the traditions of painting, cinematography, and sculpture, with a particular interest in spatialized works that can immerse and engage a viewer within an experience. Fillwalk positions his work to act as both a threshold and mediator between tangible and implied space, creating a conduit for the transformative extension of experience, and to pursue the realization of forms, sounds and images that afford interaction at its most fundamental level. In working with technology, he values the synergy of collaboration and regularly works with other artists and scientists on projects that could not be realized otherwise. Electronic media extend the range of traditional processes by establishing a palette of time, motion, interactivity, and extensions of presence. The ephemeral qualities of electronic and intermedia works, by their very nature, are inherently transformative, and the significance of the tangible becomes fleeting, shifting emphasis away from the object and toward the experience.

John Fillwalk is Director of the Institute for Digital Intermedia Arts (IDIA Lab) at Ball State University, an interdisciplinary and collaborative hybrid studio. An intermedia artist and Associate Professor of Electronic Art, Fillwalk investigates media in video installation, hybrid reality and interactive forms. He received his MFA from the University of Iowa in Intermedia and Video Art, and has since received numerous grants, awards, commissions and fellowships.

Donald Kuspit is an art critic, author and professor of art history and philosophy at State University of New York at Stony Brook and lends his editorial expertise to several journals, including Art Criticism, Artforum, New Art Examiner, Sculpture and Centennial Review. Hans Breder was born in Herford, Germany, and trained as a painter in Hamburg, Germany. Attracted to the University of Iowa’s School of Art and Art History in 1966, Breder established the Intermedia Program. Carlos Cuellar Brown, a.k.a ccbrown, is a composer, instrumentalist and music producer. Formally trained as a classical pianist, Cuellar specialized in experimental music and intermedia with the late American maverick composer Kenneth Gaburo. Jesse Allison is the Virtual Worlds Research Specialist, IDIA, Assistant Professor of Music Technology, Ball State University. He is also President of Hardware Engineering with Electrotap, LLC, an innovative human-computer interface firm.

Leonardo: The International Society for the Arts, Science and Technology article published by The MIT Press. http://muse.jhu.edu/journals/leonardo/summary/v043/43.4.fillwalk.html

Engaging Technology II: Art + Science

The interplay of art and science—a concept at least as old as Leonardo da Vinci—is the focus of “Engaging Technology II,” an exhibition at Ball State University’s David Owsley Museum of Art that ran September 28 through December 22, 2017. Museum patrons experienced a selection of work from internationally renowned artists whose explorations of the topic include installations, code art (artwork generated by computer programming), art/science, augmented reality and human-computer interaction.

The exhibition was curated by Ball State’s John Fillwalk, director of the Institute for Digital Intermedia Arts (IDIA Lab) in the College of Architecture and Planning, and is both a sequel and a new concept, according to Robert La France, the museum’s director. The exhibit builds on the success of a prior exhibition curated by Fillwalk that examined the emergence of the genre of Intermedia Art, with a particular focus on the intersection of technology and artmaking. The new show shifts the curatorial emphasis to STEAM – science, technology, engineering, arts and mathematics.

This exhibition continues to position Ball State as an international center of STEAM innovation. A goal of Engaging Technology II was to invite artists exploring the edges of art and science into discourse within our community—advancing our University’s conversation regarding interdisciplinary collaboration and innovation. This collection of artists examines various aspects of the physical and life sciences, from code and biology to chemistry and physics.

Among those individuals whose work is featured in the exhibition is Adam Brown, an associate professor at Michigan State University. His live biochemical installation, The Great Work of the Metal Lover is a work that sits at the intersection of art, science and alchemy. “It uses microbiology as a technique to solve the mystery of the philosopher’s stone,” said Brown, who described the centuries-old “stone” as a legendary substance sought by alchemists who believed it capable of turning metals like mercury into gold and silver. 

His installation uses custom lab equipment to introduce a “highly specialized” bacterium into an engineered atmosphere, turning toxic gold chloride into usable 24K gold over time during the course of the show. “By the end of the exhibition, the process will produce enough gold to put in the palm of your hand,” Brown said.

Other innovative contributors to “Engaging Technology II” include code artist Casey Reas; composer and visual artist Tristan Perich; and Evelina Domnitch and Dmitry Gelfand, European-Russian artists who create sensory immersion environments that merge physics, chemistry and computer science with uncanny philosophical practices.

“10000 Peacock Feathers in Foaming Acid” is Evelina Domnitch and Dmitry Gelfand’s live, audiovisual performance, which was presented on the fulldome projection screen of Ball State’s Charles W. Brown Planetarium. The artwork uses a penetrating laser beam to scan the surface of soap bubbles, the behavior of which models the unique properties of cell membranes.

Throughout the run of Engaging Technology II, a series of invited performances, lectures and workshops was scheduled on campus and in the Muncie community to enhance the exhibition’s reach. Classes, workshops, lectures and family events were offered for local and regional audiences.

By Gail Werner, BSU Media Strategist
Et al.

VIRTUAL INSTRUMENT 3D UI

ChromaChord utilizes a combination of the Oculus Rift headset and an attached Leap Motion controller to create an immersive 3D VMI (virtual musical instrument) environment. The Oculus Rift provides a three-paneled visual interface, built in Unity, that allows users to access different component windows of the instrument simply by turning their head, while the Leap Motion controller allows the user to interact with the visual interface to musical effect (processed and generated in Max/MSP).

ChromaChord uses the Oculus Rift DK2, a virtual reality headset, to immerse the performer within the VMI. Utilizing two cameras in the virtual environment, the device relays a stereoscopic image to the wearer, allowing for highly dimensional experiences. A gyroscope, accelerometer, and magnetometer detect real-world head rotation and transform it into its virtual equivalent, while a peripheral infrared camera captures limited-range positional tracking. A Leap Motion controller affixed to the Oculus Rift provides infrared tracking of the hands. In our application, the Leap Motion tracks the user’s real-world hands and simulates them, mimicking hand positions and finger movements; the user’s own hands are therefore the primary input mechanism.

Custom Software
  • Key note triggering and glow by Leap Motion hand touches
  • Particle generation triggered by Leap Motion hand touches
  • Button click and glow by Leap Motion hand touches
  • Particle destruction and orbit physics
  • OSC Server and Client configurator / Message Handler / Data Handler
Our graphical user interface is composed of three sections – left, center and right panels. On the left, the user can see the modulation sphere. By touching this sphere, the user generates orbital particles. The number of these particles affects a live process on the notes. The user can remove these particles by simply swiping them away. Turning to the right reveals the root note and scale type controls. To select a root note and scale type, the user must press the desired button. The interface responds by highlighting the selected options. The center module contains the keys, a chromatic scale in two colored columns based on the Solfège. This is the core performance area of the UI. In addition there are arrows facing up and down that the user can touch to raise or lower the octave. All buttons and keys respond to touches by glowing white.

3D Musical Instrument

The default window is the center palette, where the instrument itself is located. Within this window, there are two columns of colored keys (four per column) to the left and the right of the screen. To create sound, the user has but to put their hands vertically in front of them and ‘touch’ the keys, which are placed for both ease of use and the ability to contact multiple notes simultaneously to create chords. The notes themselves are stacked in intervals of a third from each other; consequently, neighboring notes on the same hand will create conventional harmonies. Between the columns, at the top- and bottom-most extremities, are two arrows that, if touched, move the current note set up or down an octave. When a note is touched, Unity sends a UDP message to Max/MSP indicating which note has been touched; the note remains active until contact has ceased. Similarly, touching the octave buttons sends UDP messages to Max to reassign the MIDI note values.
Within Max, MIDI values are then converted to their equivalent frequencies and assigned to their respective oscillators before being processed and output through Max’s audio out. When looking to the right, the user accesses the settings window, where note assignment takes place. First, the user selects the root note of the scale they wish to be within. The second parameter allows the user to select what mode of scale they would like – the options being major, natural minor, and harmonic minor. By default, the instrument opens with a root note of C in the major mode. The UDP messages sent from the settings window control a series of gates within Max that then send adjusted MIDI values to the instrument’s oscillators.

The performer accesses the modulation window to the left, which controls the frequency and Q of a filter, as well as the level of effect. By touching the button in the center, Unity spawns floating particles that surround the user’s hands. The longer the user touches the button, the more particles accumulate; they can be removed by making shaking or throwing motions. Without particles, the filter’s frequency setting is at its lowest audible setting, creating a much darker, mellow timbre; alternatively, as more particles accumulate, the filter opens up, allowing higher frequencies to pass through and creating a brighter, richer timbre. Additionally, the particles move the dry/wet ratio for effects toward the ‘wet’ end of the spectrum, deepening the audibility of the effects. The UDP message sent from the modulation window sends a single numeric value for the total number of active particles; this value is then scaled in Max/MSP to best suit each parameter being altered.
Control       Type             Range   Default  Notes
Do            Key              0-1     0        Bottom of octave
Re            Key              0-1     0
Mi            Key              0-1     0
Fa            Key              0-1     0
Sol           Key              0-1     0
La            Key              0-1     0
Ti            Key              0-1     0
Do            Key              0-1     0        Top of octave
Up Octave     Octave Switcher  0-1     0        1 when clicked
Down Octave   Octave Switcher  0-1     0        1 when clicked
Modulation    Modulation       0-100   0        # of particles
Root Note     Root Note        0-11    0
Scale Type    Scale Type       0-2     0
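The Unity-to-Max/MSP messaging described above can be sketched as follows. This is an illustrative reconstruction, not ChromaChord's actual protocol: the host, port, and message format are assumptions, as is the particle-count-to-cutoff scaling.

```python
import socket

# Hypothetical note on/off and modulation messages sent as plain UDP datagrams
# to a Max/MSP patch listening on an assumed local port.
MAX_HOST, MAX_PORT = "127.0.0.1", 7400
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_note(midi_note, active):
    """Tell Max/MSP a key was touched (active=True) or released (active=False)."""
    sock.sendto(f"note {midi_note} {int(active)}".encode(), (MAX_HOST, MAX_PORT))

def filter_cutoff(particle_count, lo=200.0, hi=8000.0, max_particles=100):
    """Scale the modulation window's particle count into a filter frequency (Hz)."""
    t = min(particle_count, max_particles) / max_particles
    return lo + t * (hi - lo)
```

On the Max side, the scaled value would feed the filter's frequency inlet, so an empty particle cloud leaves the timbre dark and a full cloud opens it up, as the text describes.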

Engaging Technology II: Art + Science – 10000 Peacock Feathers in Foaming Acid

Engaging Technology II: Art + Science premiered at the David Owsley Museum of Art on September 28th, 2017. The exhibition was curated by John Fillwalk, Director of the Institute for Digital Intermedia Art at Ball State University. The opening reception was followed by a special performance by two of the featured artists – Evelina Domnitch and Dimitry Gelfand at the Charles W. Brown Planetarium.

About the performance, from the artists: “In 10000 Peacock Feathers in Foaming Acid we use laser light to scan the surfaces of nucleating and dissipating soap bubble clusters. Unlike ordinary light, the laser’s focused beam is capable of crawling through the micro and nano structures within a bubble’s skin. When aimed at specific angles, this penetrating light generates a large-scale projection of molecular interactions as well as the phenomena of non-linear optics. Bubble behaviors viewed in such proximity evoke the dynamics of living cells (the lipid membranes of which, are direct chemical descendants of soap films).”

http://www.portablepalace.com

Sinespace Gallery Designer

IDIA Lab has created a virtual gallery design system in partnership with virtual world creator Sinespace. Sinespace, a multi-user virtual world platform based in the UK, contacted IDIA to design this system for their new virtual environment. The Gallery Designer is an easy-to-use system for dynamically creating gallery and museum exhibitions on the fly – building an exhibition from any web-hosted images reachable on the internet. Users can easily pick gallery styles (classical/modern), exterior settings, frame style, lighting, floors, wall color and ceiling treatments – customizing an experience that is complementary to the exhibition theme. The gallery can then be easily published, and users can host a virtual reception with visitors from around the globe.

Photogrammetry and 3D Printing in “The Power of Place”

Long on the cutting edge of emerging technologies, IDIA Lab has gained a reputation for unique skill sets. Providing its photogrammetry and 3D printing services, IDIA Lab joined several departments in the production of a new exhibition at the David Owsley Museum of Art, “The Power of Place: 100 Years of Architecture at Ball State University.”

“The Power of Place: 100 Years of Architecture at Ball State University”

“The Power of Place: 100 Years of Architecture at Ball State University” showcases the ways Ball State’s campus architecture has changed and how the past still shines through.

“The exhibition embodies the idea that even though Ball State’s built environment has greatly changed over time by way of adding new buildings constructed with modern materials, our past is forever present,” says Rebecca Torsell, archivist for architectural records at Ball State University, who organized the exhibition.

Running from September 27 through December 21, 2018, the exhibition is part of a campus-wide celebration of Ball State’s Centennial.

“The museum is an ideal place to celebrate the University’s Centennial,” says Robert G. La France, director of the David Owsley Museum of Art. “For more than 80 years the museum’s building — the Fine Arts Building — has stood as Ball State’s palace to the arts. This show displays the University’s long-standing commitment to distinctive architecture and introduces visitors to its vast architectural archives and special collections.”

Photogrammetry

Part of the history of Ball State architecture includes unique tiles used in the Fine Arts Building itself. The museum wanted a way to display these ornate, relief tiles without removing them from the wall. Using their photogrammetry techniques, IDIA Lab first recreated the tiles as a virtual 3D model.

IDIA Lab Modeler, Trevor Danehy, explains the process, “We take at least one hundred photos of each tile, everything from straight on shots of the tile to sharp angles in an effort to expose the levels of depth in the crevices of the tile. The photogrammetry software algorithm requires that each photo has at least 50% overlap compared to the last photo taken in order to stitch together a mesh.”

“It can be a very trial-and-error process, so the more images taken, the better. That is why we prefer a DSLR camera that can take several images rapidly. Quantity over quality works best in photogrammetry. Once the software stitches together a crude mesh, we do a small amount of digital cleanup and export it for 3D printing.”
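The 50% overlap rule has a simple arithmetic consequence: each new photo contributes only half its coverage of fresh surface, so the shot count roughly doubles. The sketch below illustrates this for one ring of photos around an object; the 30° per-photo coverage is a hypothetical figure, not one from the lab's workflow.

```python
import math

# If each photo covers `fov_deg` degrees of the orbit around an object and
# consecutive shots must overlap by `overlap`, each new photo advances only
# (1 - overlap) * fov_deg of new coverage.

def photos_for_orbit(fov_deg=30.0, overlap=0.5):
    """Minimum photos to circle an object once while keeping the stated overlap."""
    step = fov_deg * (1.0 - overlap)  # new angular coverage per shot
    return math.ceil(360.0 / step)

n = photos_for_orbit()  # one ring at 30° coverage with 50% overlap
```

With several rings at different heights and extra oblique shots for crevices, a count of at least one hundred photos per tile, as described above, follows naturally.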

3D Printing

After the virtual mesh has been created, the next step is to recreate the tile using a physical medium. At IDIA Lab, this is accomplished using 3D printing. IDIA Lab’s Adam Kobitz, who manages 3D printing in the lab, describes the printing technique. “We printed the tiles using a Fused Deposition Modeling (FDM) printing process, in which the physical objects are built up by extruding layer after layer of melted plastic. This process allows us to produce accurate recreations of the tiles relatively quickly, permitting viewers of the exhibit to observe and interact with them in a manner they’re unable to with the original works.”

After the tiles are printed, they are ready to be displayed as a portable replica of part of Ball State’s architectural history.

Augmented Reality Art

MaelStrom AR is an augmented reality installation designed by artist John Fillwalk in collaboration with IDIA Lab (Neil Zehr, Aaron Anderson, and David Rodriguez). This work visualizes live weather data from Washington DC to inform virtual wind direction and speed. This data affects the creation of virtual paper airplanes/messages that fall from above onto the ground, enveloping the viewer within a swirling circular mass of objects. In this multi-user networked mobile artwork, participants are assigned a unique color upon launching, through which they can paint nearby airplanes and also physically move them through walking and collision. The surrounding environment of the user is scanned by the software and is used to create physical interactions with the AR objects. In addition to hue, participants in MaelStrom AR are also assigned a unique tone that blends to form a chorus of sound in the space. In this work, the viewer becomes a participant in a constant stream of messaging and data – playfully affecting its design while collaborating with other users.
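As a sketch of the weather-to-motion mapping described above (the field names, damping factor, and conversion are assumptions, not MaelStrom AR's actual code), a wind reading could be converted into a per-frame drift applied to each virtual paper airplane:

```python
import math

# Hypothetical conversion of a wind reading (direction in degrees, speed in
# m/s) into an (x, z) ground-plane velocity for the falling paper airplanes.

def wind_velocity(direction_deg, speed_ms, scale=0.1):
    """Convert a wind reading into an (x, z) velocity vector for the scene."""
    rad = math.radians(direction_deg)
    # scale damps the real-world speed down to a gentle in-scene drift
    return (math.sin(rad) * speed_ms * scale, math.cos(rad) * speed_ms * scale)

vx, vz = wind_velocity(90.0, 5.0)  # wind along the x axis only
```

Re-querying the Washington DC feed at intervals and re-applying this vector is one plausible way the swirling mass would track live conditions.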

Virtual Artworks

http://www.idialabprojects.org/displacedresonance/virtual.html

 

PROXY

John Fillwalk

with Michael Olson, Composer and IDIA Lab. 2012.

In the virtual installation PROXY, visitors shape the construction of a sculptural and sonic response to the virtual museum. The work progressively takes form, eventually becoming structural support for the building. When multiple users are in the environment, their avatars interact with one another to create collaborative painting and sculpture.

 

FLICKR™ GETTR v6

John Fillwalk

with Jesse Allison, Composer and IDIA Lab. 2012.

FLICKR™ GETTR v6 connects the Flickr™ social image web service to the virtual museum environment, allowing visitors to create a dynamic cloud of spatial imagery: entering a search term of their choice pulls related images from Flickr™ into the virtual environment.

 

SURVEY FOR BEIJING

John Fillwalk

with Keith Kothman, Composer and IDIA Lab. 2012.

SURVEY FOR BEIJING is an immersive landscape simulation using real time weather data from the physical location in Beijing, China. Representations of surveyor’s tape, flags, light, time of day, wind and clouds are superimposed onto the virtual landscape in accordance with real-life weather data.

 

con|FLUENCE

John Fillwalk

with Michael Olson, Composer and IDIA Lab. 2012.

Participants in con|FLUENCE create pathways based on reactions to both social and spatial relationships. There is a virtual response and persistence to interactions, as the routes are drawn and sculpted in the three-dimensional environment – forming nodes that are created via visitors’ proximity.

 

Third Art and Science International Exhibition

China Science and Technology Museum in Beijing

http://www.tasie.org.cn/index.asp

 

 

TASIE Press release

http://www.tasie.org.cn/content_e.asp?id=84

 

Ball State artists create “forest” of light on display in China and Internet

Muncie, Ind. — Ball State University electronic artists have created a “forest” of light and sound that will be on exhibit in Beijing, China through November, yet also accessible to visitors from Indiana or anywhere else in the world.

That’s possible because “Displaced Resonance,” as the interactive art exhibit is known, has both real-life and virtual components.

The physical portion has been installed in a gallery of the China Science and Technology Museum in Beijing. There, in-person visitors can negotiate a thicket of 16 interactive sculptures spaced 1.5 meters apart that will change colors and emit music as they approach.

A digital replica of the layout, meanwhile, resides on the Internet, accessible through the museum’s website. Online visitors can wander the virtual exhibit using an avatar, and the digital pillars will change colors and produce sounds, just like their physical counterparts.

But that’s not all — the two pieces interact with each other, says John Fillwalk, director of Ball State’s Institute for Digital Intermedia Arts (IDIA) and Hybrid Design Technologies (HDT), which created the work in collaboration with IDIA staff, students and Ball State composer Michael Pounds.

When an online avatar approaches a virtual pillar, the corresponding real-life column also will change colors, and vice versa. In-person and virtual visitors will produce different colors, however, allowing them to track each other through the exhibit.

“It’s what we call hybrid art,” says Fillwalk. “It’s negotiating between the physical world and the virtual. So it’s both sets of realities, and there’s a connection between the two.”

The physical pillars are two meters (more than 6 feet, 6 inches) tall. Each consists of a wooden base containing a sound system, a translucent pillar made of white corrugated plastic, and computer-controlled lighting.

A thermal camera mounted on the museum’s ceiling keeps track of visitors and feeds its data to a computer program that directs the columns to change color and broadcast sounds when someone draws near.

“It’s a sensory forest that you can navigate,” Fillwalk says.

Two final touches: a video screen mounted on a museum wall overlooking the exhibit allows in-person visitors to watch avatars move around the virtual version, while Internet patrons can keep tabs on the real-life display through a window on their computer screens.

“Displaced Resonance” is the centerpiece of Ball State’s contributions to the Beijing museum’s 3rd Art and Science International Exhibition and Symposium, a month-long celebration of technology and the arts. Ball State was invited to participate because museum curators discovered some of IDIA’s work and liked what they saw, Fillwalk said.

In addition to “Displaced Resonance,” IDIA contributed four other pieces of digital art that museum visitors can view at a kiosk.

Those pieces are:

·      “Proxy”, in which visitors create, color and sculpt with floating 3D pixels.

·      “Flickr Gettr,” in which visitors can surround themselves with photos from the Flickr web service that correspond to search terms they submit.

·      “Confluence,” in which users create virtual sculptures by moving around the screen and leaving a path in their wake.

·      “Survey for Beijing,” in which real time weather data from Beijing is dynamically visualized in a virtual environment.

 

(Note to editors: For more information, contact John Fillwalk, director of the Institute for Digital Intermedia Arts, at 765-285-1045 or jfillwalk@bsu.edu; or Vic Caleca, media relations manager, at 765-285-5948 or vjcaleca@bsu.edu. For more stories, visit the Ball State University News Center at www.bsu.edu/news.)

Microsoft Surface Museum Collection Portal

The IDIA has assembled an interdisciplinary group of students, faculty and industry experts to build a significant interactive information portal for the Ball State University Museum of Art (BSUMA). The IDIA team is developing an innovative touch-based interface to navigate the collection, integrating Microsoft Surface with the Museum’s database. The Surface will afford Museum patrons a collaborative, participatory public platform through which to access metadata and media of the physical exhibition – as well as extending virtually into the permanent collection. Using the Digital Images Delivered Online (DIDO) database, the interface will visualize the interconnections between works in the collection, queried on searchable parameters such as artist, medium, period and subject. This two-semester immersive project, supported by the BSU Provost’s Immersion Initiative, has been team-taught and has recruited students from targeted disciplines across campus.

Complete Instruction Guide here: http://ilocker.bsu.edu/users/idiaa/world_shared/SurfaceInstructions.pdf

Microsoft Surface represents a fundamental change in the potential for interaction with digital content. The Surface is a 30” tabletop interface environment that allows several people to work independently or collaboratively – all without using a mouse or keyboard. The Surface allows users to navigate information physically and manipulate it with natural gestures and touch: http://www.microsoft.com/SURFACE/product.html#section=The%20Product.

The Ball State University Museum of Art has a collection of nearly 11,000 works of art. Central to the Museum’s mission is the provision of educational programming that will further faculty, staff, student, and public utilization and understanding of the Museum and its collection, of museums in general, and of the arts. To accomplish this goal, the Museum offers tours, talks, materials for teachers, an education database, and artist demonstrations. The Museum’s education philosophy centers on bringing together the needs of visitors and the resources of the Museum, and balancing responsibility to the university with service to the community. In facilitating communication between the work of art and the viewer, the Museum subscribes to a philosophy of learner-centered programming informed by a thorough knowledge of the collection and methods and strategies for effective teaching.

Although the Museum’s collection numbers nearly 11,000 works, given the physical constraints of the facility, a mere 10% of them are displayed at any given time. The incorporation of this hybrid Surface and database system will afford patrons virtual access to the entire collection – making visible those works currently in storage. More importantly, the system will allow patrons to visualize the interconnectivity of the works according to multiple facets – visually re-contextualizing the works in relation to specified search parameters. This form of innovative technology within a museum context would typically be installed at major metropolitan institutions. Development of this interface at the Ball State University Museum of Art will not only benefit local community groups and patrons of BSUMA, whose access to the collection will be significantly augmented, but also has the potential to influence other museums through the distribution of the project’s outcomes.

The Institute for Digital Intermedia Arts (IDIA) has a history of providing interdisciplinary immersion and new media experiences, employing pedagogy that supports students’ transformation as active, engaged learners. The Institute provides a dynamic exchange between instructor and students – where participants engage in collaborative, inquiry-based communities that provide an environment, which fosters participatory learning. As opposed to a traditional model, where knowledge is imparted by the teacher to the student, participatory learning can transform the learner into an active role. Participatory learning communities engage in a dynamic conversation centered around a subject – where analysis, comparison, and evaluation are core to the process of acquiring meaning and relationships. This environment allows students the opportunity to have a stake in the conditions of their learning, making decisions that direct their course of investigation. With a commitment to active inquiry, participants are freed to create personal and collective meaning.

 

BSU MUSEUM OF ART

COLLECTION PORTAL

SOFTWARE DESIGN

Carrie Arnold

Joel Happ

Garret Orth

Deekshita Reddy

Christopher Ross

Jonathan Strong

Austin Toombs

INTERFACE DESIGN

Anastasia Goryacheva

Steven Lanier

Jonathan Strong

ELECTRONICS

Eric Brockmeyer

Giovanni Rozzi

GRAPHIC DESIGN

Amy Goettemoeller

Ina-Marie Henning

FACULTY

Jesse Allison

John Fillwalk

Paul Gestwicki

PROJECT DIRECTOR

Jonathan Strong

A SPECIAL THANK YOU

UNIVERSITY LIBRARIES

James Bradley

John Straw

Budi Wibowo

SOUND

Rick Baker

Steven Lanier

Giovanni Rozzi

STRUCTURAL DESIGN

Eric Brockmeyer

Matthew Wolak

PROJECT MANAGEMENT

Ina-Marie Henning

Amruta Mhaiskar

Jennifer Weaver-Cotton

PRODUCER/PRINCIPAL INVESTIGATOR

John Fillwalk

BALL STATE UNIVERSITY MUSEUM OF ART

Peter Blume

Carl Schafer

Tania Said

Ball State University Museum of Art

College of Fine Arts

College of Architecture and Planning

Department of Architecture

Department of Art

Department of Computer Science

School of Music

Information Technology

Office of the Provost

University Libraries

and

Jeff Berg, IBM Interactive, IDIA Research Fellow

IDIA Presents: BSU MUSEUM OF ART COLLECTION PORTAL 3/20/11 at 2:30pm BSUMA

 

BSU MUSEUM OF ART COLLECTION PORTAL
Reception and Gallery Talk

Thursday, March 20 @ 2:30pm, BSU Museum of Art

The BSU Museum of Art Collection Portal was developed by students, faculty and industry research fellows in the Institute for Digital Intermedia Arts at Ball State University. The year-long project was developed in partnership with the Ball State University Museum of Art and sponsored by the Provost’s Immersive Learning Initiative.

This interdisciplinary team from art, computer science, architecture and music has developed an innovative, touch-based application to navigate the collection. The portal bridges the Microsoft Surface interface with the Museum’s collection database, the Digital Images Delivered Online (DIDO), hosted on the Digital Media Repository of Bracken Library. The Surface affords Museum visitors an interactive platform to individually or collaboratively make virtual connections between works of art both on display and in reserve – accessing information and media across the collection.

http://bit.ly/IDIA_Surface

Complete Instruction Guide here: http://ilocker.bsu.edu/users/idiaa/world_shared/SurfaceInstructions.pdf

Flickr Gettr

At its essence, Flickr Gettr is a novel virtual interface bridging virtual worlds to the wealth of shared real-life imagery and information on Flickr.

Currently on Display at the New Media Consortium, amongst others: http://slurl.com/secondlife/NMC%20Campus%20West/137/100/21

USAGE

Participants enter a query to search public image folders at the Flickr web service. Each query pulls related images from Flickr and feeds them back to be spatialized in an immersive visual and sonic environment. As each image spawns, it generates a chime-like sound, paralleling the environment of imagery that surrounds the viewer.

In Flickr Gettr II, a participant can touch any image and receive a list of the tags associated with it.  Touching the same image again initiates a random search for one of those tags thus retrieving similar imagery.

In Flickr Gettr, an external web service was used as an intermediary to query Flickr, receive images and format them for delivery as a texture.  It then passed the aspect ratios and tags in a second query to be able to map the textures properly.  To make these interactions more flexible, the intermediary web server was employed to collate and prepare information to retain states that can be queried from external applications.  The web application effectively serves as an intermediary between the virtual world and outside environments, providing the communications link and logic to assimilate the information.  This can make the creation of Web 2.0 mash-ups much simpler as the code for doing these sorts of queries and logic is already highly developed in Java, Ruby, and Perl for example.  Flickr Gettr also triggered music files upon the rapid rezzing of objects to create a cumulative ambient effect.
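The intermediary's request-building step can be sketched as follows. The URL formats follow Flickr's documented REST conventions of that era, but the class, method names, and api-key placeholder are illustrative assumptions, not the project's actual code.

```java
// Sketch: the intermediary turns a visitor's search term into a Flickr
// REST query, assembles a texture URL from a returned photo record, and
// passes the aspect ratio along so the in-world object can be scaled.
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class FlickrBridge {
    static final String REST = "https://api.flickr.com/services/rest/";

    // flickr.photos.search query for a user-entered tag.
    public static String searchUrl(String apiKey, String term) {
        String q = URLEncoder.encode(term, StandardCharsets.UTF_8);
        return REST + "?method=flickr.photos.search&api_key=" + apiKey
             + "&tags=" + q + "&format=json&nojsoncallback=1";
    }

    // Static image URL assembled from fields of one search result.
    public static String photoUrl(int farm, String server, String id, String secret) {
        return "https://farm" + farm + ".staticflickr.com/"
             + server + "/" + id + "_" + secret + ".jpg";
    }

    // Aspect ratio forwarded in the second query so textures map correctly.
    public static double aspect(int width, int height) {
        return (double) width / height;
    }
}
```

The virtual world then only needs to fetch `photoUrl(...)` as a texture and stretch its prim by `aspect(...)`, keeping all web-service logic on the intermediary server as the paragraph above describes.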

Twitter Cloud

The Twitter Cloud is a visualization of real time Tweets (messages) posted on Twitter.com. The system automatically scans specified user feeds, and visualizes new messages as they arrive. This piece was programmed in Java using the Processing API.

This project was also linked with the Tweet Station so users could be identified with RFID and allowed to post their own messages through a touchscreen kiosk. In the context below, conference attendees entered their Twitter feeds to be tracked so that others could read about their experiences as they posted from their laptops and phones. The Twitter Cloud visualizer has also been used within virtual worlds – both as a live event Tweet tracker, and to track and visualize avatars as they traveled to various locations within the world.
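The scanning loop described above amounts to polling feeds and visualizing only messages that have not been seen before. A minimal sketch of that dedup step, with hypothetical names (the original was a Processing sketch; this is plain Java for illustration):

```java
// Sketch: each poll of a feed yields a batch of (id, text) messages;
// only unseen ids are handed to the visualizer to spawn new text objects.
import java.util.*;

public class TweetScanner {
    private final Set<Long> seen = new HashSet<>();

    // Returns the messages new since the last scan, in batch order.
    public List<String> newMessages(Map<Long, String> batch) {
        List<String> fresh = new ArrayList<>();
        for (Map.Entry<Long, String> e : batch.entrySet()) {
            if (seen.add(e.getKey())) fresh.add(e.getValue());
        }
        return fresh;
    }
}
```

Calling `newMessages` on the same batch twice returns an empty list the second time, which is what keeps the cloud from re-spawning old tweets on every poll.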

Human-Computer Interaction in the Arts

The Human-Computer Interaction Lab housed in Music Technology is a learning and exploration ground for how electronics can interact with the arts. Students who take the MuMET 440 Special Topics course in HCI can use the lab to develop projects in augmented musical instruments, electronic installation art, analog audio processors, synthesizers, robotics, wireless sensor systems, and other electronic mayhem. The course teaches equal parts basic electronics – power regulators, op amps and the like – and programming techniques for microcontrollers, the TI MSP430 and the Arduino. Most of all, the lab serves as a proving ground for ideas and a space for exploration.

Traversal

Traversal for Pipe Organ is a virtual performance exploring connectivity between worlds. As the name implies, it is a temporary link between the live pipe organ in a concert hall and a multi-user virtual world. “Physical” actions and events in the virtual realm compose the work in real time with the end product being a physical and aural realization of virtual performance.
As elements of the virtual instrument were engaged, their physical interactions – touch, physics, collision events – were used to create performance gestures on the organ in Sursa Hall. John Fillwalk, who created the performance environment and virtual instrument and collaboratively conceptualized the work, performed the live camera view of the virtual environment for the concert audience. Jesse Allison composed the controllable sound events and managed the communications link between the virtual and physical during the performance.

Excerpt from the premiere performance.

Eco-Net

“Eco-Net” is an attempt to connect wireless network data with nature by visualizing that data with plant-like structures and organic motion. Each plant represents a computer connected to the network and each IP address is displayed above the corresponding plant. Collective network activity is displayed as websites are browsed and emails are sent. This piece represents our constantly connected state, simulated through plant and root structures, and the constant barrage of data that flows through the air all around us every day. This piece was created completely in Java, using the Processing API.
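The data-to-growth mapping this describes can be sketched simply: traffic observed per IP accumulates, and a compressive curve turns it into a bounded stalk height so a chatty host grows quickly at first and then levels off. The class name and scaling curve are illustrative assumptions, not the piece's actual code.

```java
// Sketch: accumulate per-IP traffic and map it to a plant stalk height.
import java.util.HashMap;
import java.util.Map;

public class PlantField {
    private final Map<String, Long> bytesByIp = new HashMap<>();

    // Called for each observed packet or flow on the network.
    public void record(String ip, long bytes) {
        bytesByIp.merge(ip, bytes, Long::sum);
    }

    // Logarithmic compression keeps heavy talkers from dwarfing the field;
    // an unseen IP has height 0, and height approaches maxHeight asymptotically.
    public double stalkHeight(String ip, double maxHeight) {
        long b = bytesByIp.getOrDefault(ip, 0L);
        return maxHeight * (1.0 - 1.0 / (1.0 + Math.log1p(b)));
    }
}
```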

Links
Students win international award for work in the digital arts

Engaging Technology Exhibition

ENGAGING TECHNOLOGY
A HISTORY AND FUTURE OF INTERMEDIA

BALL STATE UNIVERSITY MUSEUM OF ART
NOVEMBER 17, 2006 THROUGH MARCH 11, 2007

https://idialab.org/engagingtechnology/

Engaging Technology traces a range of both pioneering and contemporary works exploring the intersections of electronic media and various modes of art-making. The artists selected for this exhibition – Richard Bloes, Hans Breder, Adam Brown, Dick Higgins, Gary Hill, Jenny Holzer, Golan Levin, Nam June Paik, and Alan Rath – have contributed works encompassing a broad array of forms including video, sculpture, sound, music, electronic installation and interactive environments.

The opening of this exhibition coincides with the coining of the term intermedia forty years ago by the Fluxus artist Dick Higgins. Below, Higgins addresses a societal question central to artists working within technological media.

Higgins asks:

…For the last ten years or so, artists have changed their media to suit this situation, to the point where the media have broken down in their traditional forms, and have become merely puristic points of reference. The idea has arisen, as if by spontaneous combustion throughout the entire world, that these points are arbitrary and only useful as critical tools, in saying that such-and-such a work is basically musical, but also poetry. This is the intermedial approach, to emphasize the dialectic between the media… Does it not stand to reason, therefore, that having discovered the intermedia [which was, perhaps, only possible through approaching them by formal, even abstract means], the central problem is now not only the new formal one of learning to use them, but the new and more social one of what to use them for?
Dick Higgins
Statement on Intermedia
New York
August 3, 1966

Intermedial investigation is generally interdisciplinary and often collaborative in its exploration of center spaces between genres, media and established boundaries. These expanded intermedial approaches can find their center in potentially any discipline including visual art, music, engineering, performing arts, architecture, social theory and the sciences. Although intermedia art is not always necessarily technologically based, the artists represented in this exhibition explore electronic intermedia art in its various manifestations. Intermedia artists are often interested in the relationship between a viewer and a work of art. In encountering that threshold of engagement, the viewer is invited to enter into a partnership in shaping the direction of the work. The engagement and experience of the viewer is therefore essential to the completion of the work’s meaning.

Engaging Technology is supported by the Ball State University Museum of Art, the Institute for Digital Intermedia Arts and Animation, the Center for Media Design and Lilly Foundation, Inc. I would like to thank Peter Blume and Tania Said at the BSU Museum of Art for their support in organizing this exhibition and my mentor and friend, Hans Breder for his continued guidance and insight throughout the years.

John Fillwalk, Curator
Director
Institute for Digital Intermedia Arts
Ball State University

RFID-Linked 3D Media Interface

The project allows the user to manipulate a digital world with wireless objects. Through the use of Radio Frequency Identification (RFID) devices, the virtual world can detect the presence of real-world objects and use them to manipulate its own attributes. In this project, physical cubes link the user with virtual cubes within the system and allow the user to call up media such as video and sound. The system uses the software Quest 3D for real-time VR rendering and interactive animation, and Max/MSP for harvesting and inputting RFID data. Additionally, users can navigate the 3D virtual world with a trackball.
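The core lookup step is a table binding each physical cube's tag to a media cue in the virtual scene. A minimal sketch, with hypothetical tag IDs and cue names (the actual system routed this data through Max/MSP into Quest 3D):

```java
// Sketch: map RFID tag reads to media cues for the virtual world.
import java.util.HashMap;
import java.util.Map;

public class RfidMediaMap {
    private final Map<String, String> cues = new HashMap<>();

    // Register a physical cube's tag against a media asset.
    public void bind(String tagId, String mediaCue) {
        cues.put(tagId, mediaCue);
    }

    // Returns the cue to trigger when the reader reports this tag,
    // or null if the tag is unregistered.
    public String cueFor(String tagId) {
        return cues.get(tagId);
    }
}
```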

Students participating in IDIA’s Immersion Seminar won an Award of Excellence at the International Digital Media and Arts Association (iDMAa) 2007 National Conference. The award included a $250 cash prize and was presented to only two groups among the twelve universities participating in the conference’s student showcase. iDMAa hosted the Ideas 07: Beyond Boundaries conference in Philadelphia on November 7-11.

Links
BSU Press Release
2007 iDMAa Conference Website
iDMAa Homepage

[un]wired

[un]wired by Jesse Allison, John Fillwalk and Keith Kothman is a real-time network visualizer that responds to interactions from personal radio-frequency devices such as mobile phones, WiFi signals, Bluetooth signals and car-key fobs. It tracks real-time statistical information from wireless access points (designed for seamless handoff of moving wireless traffic, like a cell phone network), along with periodically updated information from hand-held devices and wireless access points. The interactions then appear as sound and shapes on the screen, allowing users to see their activity live as they interact with the piece. The control information is collected from network services via a MySQL database and transferred into Max/MSP/Jitter. [un]wired was exhibited at SIGGRAPH Asia 2008 in Singapore.
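The sonification step this implies — turning a per-access-point statistic into sound and shape parameters — can be sketched as below. The mapping constants (one octave above middle C, pixel radii) are purely illustrative assumptions; the actual piece performed this mapping in Max/MSP/Jitter.

```java
// Sketch: map an access point's client count to a pitch and a circle
// radius for the on-screen and sonic response. Constants are illustrative.
public class ApSonifier {
    // Clamp the count, then spread it across one octave above middle C (MIDI 60).
    public static int pitchFor(int clients) {
        int c = Math.max(0, Math.min(12, clients));
        return 60 + c;
    }

    // Busier access points draw as larger shapes.
    public static double radiusFor(int clients) {
        return 10.0 + 4.0 * Math.max(0, clients); // pixels
    }
}
```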

http://www.siggraph.org/asia2008/attendees/art/20.php

Intermedia Artist Timeline

This device was designed to take you on a journey through the history of intermedia art. The kiosk, made by the IDIA in collaboration with the Institute for Digital Fabrication at Ball State University, allows the user to interact using hand gestures to move through the timeline. The work includes a short bio, image and description of works from 30 intermedia artists of the 1960s to the present. It was displayed in conjunction with the BSU Museum of Art’s Engaging Technology Exhibit. Jesse Allison programmed the piece with help from Mike Sayre. Dustin Headley and the IDF designed and constructed the kiosk.

Virtual World/Social Media Mashup: Flickr Gettr v4

IDIA Lab’s new Flickr Gettr launching soon – shared & mobile virtual world mashup artwork – preview Flickr pics here http://bit.ly/bCWQCL

Flickr Gettr: Shared and Mobile Media Mashup Artwork
Virtual World/Social Media Mashup: Flickr Gettr v4
The Institute for Digital Intermedia Arts (IDIA Lab) at Ball State University

IDIA Lab’s Flickr Gettr connects the social image repository of Flickr to virtual worlds and mobile devices through an interactive 3D and sonic experience – immersing the viewer in a dimensional cloud of user-searched imagery and sound.

Background

Flickr Gettr in Second Life leverages the new SL shared-media plugin architecture to provide rich, spatially integrated web experiences. Participants enter a query to search public image folders at the Flickr web service. Each query pulls related images from Flickr and feeds them back to be spatialized in an immersive visual and sonic environment. As each image spawns, it generates a sound that parallels the surrounding environment of imagery, while the imagery’s related tags are spoken in a synthesized voice.

Flickr Gettr positions virtual worlds as a platform to navigate media in the information metaverse. Virtual worlds have the potential to position the Internet as a three-dimensional information and communication platform where live data can flow in and out to visualize, contextualize, communicate, and to inspire.


In 2010, IDIA Lab was invited by the Indian Institute of Technology in Mumbai, India to install Flickr Gettr at their annual technology festival – the largest of its kind in Asia – which hosted more than 65,000 attendees.

Traversal for Boston Cyberarts

Traversal for Boston Cyberarts was a live performance and hybrid-reality installation that bridged the physical and virtual worlds. It connected a virtual Faneuil Hall in Boston with a bell tower housing a MIDI-controlled carillon as part of the 2009 Boston Cyberarts Festival.

Virtual participants could perform on the physical bell tower via an interactive online virtual instrument built by IDIA and located in a 3D model of Faneuil Hall. Participants from around the world could play the actual French carillon atop Shafer Tower via their avatars. The ringing of the bells of Shafer Tower was streamed and simulcast back into Second Life, so virtual participants could see and hear their interactions. Participants at Shafer Tower could also connect to their virtual counterparts through technology set up at the base of the tower and play the bells locally through the same virtual interface.

IDIA collaboratively designed and built this interactive artwork for the Boston Cyberarts Festival.

 

Links

Boston Cyberarts

Boston Cyberarts Gallery

Virtual Reality Brings Interactive, Immersive Art to the 2009 Festival

AM Radio: The Red and The Wild at IDIA Labs in Second Life

AM Radio’s The Red and the Wild Opening Reception Sunday, April 26th 2009 7 pm SLT

IDIA is pleased to host artist AM Radio’s The Red and the Wild as the inaugural artist-in-residence exhibition at IDIA Labs – an exhibition and installation sim for virtual installation art and performance.

The Red and The Wild is an experimental shift in my work. It has its basis in earlier builds, notably Husk and Beneath the Tree that died. This time, a third house structure appears, based on a house that figures strongly in my childhood memories. The train that appears as a symbol in many of my works returns, but revived. A doorway represents an impossible, or at least implausible, path. Water towers dot the horizon, displaced in the context of an overabundance of water. A large red shape looms over the water and into or out of the house.

The title itself has its origins in a film and music artist friend from Atlanta. He had sent me a collection of his music experiments just as I was in the midst of trying to understand why I was feeling a need to bring abstract and maybe creepy shapes into my work. One of the tracks sampled audio from the movie Breakfast at Tiffany’s. The audio led me to watch the film again and read Capote’s novella. In the story, the main character Holly invents a world around her in reaction to her anxieties and fears, which she describes as the “mean reds.” Holly says, “But you can’t give your heart to a wild thing: the more you do, the stronger they get. Until they’re strong enough to run into the woods. Or fly into a tree. Then the sky. That’s how you’ll end up, Mr. Bell. If you let yourself love a wild thing. You’ll just end up looking at the sky.”

-AM Radio

THE INSTITUTE FOR DIGITAL INTERMEDIA ARTS is a hybrid art and design studio established as part of the Center for Media Design at Ball State University and funded by the Lilly Endowment, Inc. The institute’s interdisciplinary studio collaborative explores the intersections between arts and technology – employing virtual reality, visualization, simulation and human-computer interfaces.

Slurl: http://slurl.com/secondlife/IDIA%20Laboratories/15/134/21

Links:

SL Things To Do

DIP’s Dispatches from the Imagination Age