John Fillwalk and Jesse Allison travelled to Mumbai, India for the International Annual Science and Technology Festival, January 22-24, to exhibit the Institute for Digital Intermedia Arts’ (IDIA) “Flickr Getter” installation. The festival, the largest of its kind in Asia, hosted more than 65,000 attendees. In addition to exhibitions such as “Flickr Getter”, the festival featured keynote speakers including Lars Rasmussen, Co-Founder of Google Wave and Google Maps; Vic Hayes, father of Wi-Fi; Ajay Bhatt, Co-Inventor of USB; Jonathan Gardner, Senior NASA Scientist; and R. Chidambaram, Principal Scientific Adviser to the Government of India.
IDIA’s Flickr Getter is an interactive, sonic and visual experience, immersing the viewer in a three-dimensional cloud of user-searched Flickr images. The installation, completed in collaboration with the New Media Consortium, Linden Labs and the Institute for Digital Fabrication, was also highlighted in the festival’s promotional video.
For more information, please visit http://www.techfest.org/scintillations/flickrgettr/
Luvtaps Gives You Google Calendar Reminders on Your Fitbit
IDIA Lab has released a new product called Luvtaps. Luvtaps makes it easy to receive calendar reminders on your wrist whenever you have an event coming up. While Google Calendar can notify you of an upcoming event on your phone, Fitbit does not allow outside apps to trigger its vibration, so Google Calendar cannot use the tracker directly. Luvtaps works around this by adding a silent alarm to your Fitbit for every Luvtap you create from a Google Calendar event (see the sketch below).
Features
Google Calendar integration
Fitbit integration
Shows all Google Calendar events in the next 24 hours
Designed for use with Google Calendar™ and FITBIT® data. Google and the Google logo are registered trademarks of Google Inc., used with permission. Fitbit is a registered trademark and service mark of Fitbit, Inc. Luvtaps is designed for use with the Fitbit platform. This product is neither produced nor endorsed by Fitbit, and Fitbit does not service or warrant the functionality of this product.
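For the technically curious, here is a minimal sketch of the idea behind Luvtaps: list the next 24 hours of Google Calendar events and mirror each one as a silent alarm via the Fitbit Web API. The endpoints follow the public Google Calendar and Fitbit APIs, but the tokens, tracker id and time handling are placeholder assumptions for illustration, not Luvtaps’ actual code.

```python
# Sketch only: mirror upcoming Google Calendar events as Fitbit silent alarms.
# Tokens and tracker id below are hypothetical placeholders.
import datetime
import requests

GOOGLE_TOKEN = "google-oauth-access-token"
FITBIT_TOKEN = "fitbit-oauth-access-token"
TRACKER_ID = "123456"

def upcoming_events(hours=24):
    """Return Google Calendar events starting within the next `hours`."""
    now = datetime.datetime.utcnow()
    resp = requests.get(
        "https://www.googleapis.com/calendar/v3/calendars/primary/events",
        headers={"Authorization": f"Bearer {GOOGLE_TOKEN}"},
        params={
            "timeMin": now.isoformat() + "Z",
            "timeMax": (now + datetime.timedelta(hours=hours)).isoformat() + "Z",
            "singleEvents": "true",
            "orderBy": "startTime",
        },
    )
    resp.raise_for_status()
    return resp.json().get("items", [])

def add_silent_alarm(time_with_offset):
    """Add a one-off silent alarm on the tracker, e.g. '14:30-05:00'."""
    resp = requests.post(
        f"https://api.fitbit.com/1/user/-/devices/tracker/{TRACKER_ID}/alarms.json",
        headers={"Authorization": f"Bearer {FITBIT_TOKEN}"},
        data={"time": time_with_offset, "enabled": "true", "recurring": "false"},
    )
    resp.raise_for_status()

for event in upcoming_events():
    start = event.get("start", {}).get("dateTime")  # all-day events have no dateTime
    if start:
        # "2016-05-01T14:30:00-05:00" -> "14:30-05:00" (naive parsing for the sketch)
        add_silent_alarm(start[11:16] + start[19:])
```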
We recently posted a Unity webplayer of some of the core buildings at Hadrian’s Villa. This can be used at no cost by people around the world who want to learn more about the villa or teach it to their students. The webplayer supports avatars and chatting, and can easily be used with Skype so that you can talk to your fellow avatars in-world. We thank John Fillwalk and Ball State University’s IDIA Lab for the great work in making the webplayer possible. We also thank the Andrew W. Mellon Foundation and the Indiana School of Informatics for providing the financial support that made this project possible.
You must open it in Mozilla Firefox. Before doing so, I’d urge you to look at the README file with instructions. You can find that file and the Unity webplayer at:
IDIA Lab has been creating spherical full-dome masters for planetarium experiences. The first projects are based on our History Channel work with Stonehenge and Roman Solar Alignments – more to come!
IDIA Lab part of Ball State University President’s Strategic Plan:
The Institute for Digital Intermedia Arts continues its research and design of virtual worlds, animation, simulation, and augmented reality applications in several existing and new laboratories: the Human Computer Interface Lab (HCI Lab), creating tools and content for the exploration of next-generation input devices, electronics, and sensors; the Simulation Lab (SimLab), investigating immersive virtual simulation environments; and the Digital Dome Production Lab (DomeLab), developing the expertise to create science and arts digital dome masters for distribution to planetariums around the world. #simulation #future http://centennialcommitment.bsu.edu/…/…/FY15AnnualReport.pdf
Wonders of Oz is an augmented reality application that provides users an enhanced 3D experience with the film The Wizard of Oz. This demo app can be triggered during the film (or anywhere, anytime) to deliver animated content and sound that augment the screening and share events with a live audience. Flying monkeys, the witch on her burning broom, lollipops, the hot air balloon, Dorothy’s falling house, cows, chickens, rowboats, bicycles, rainbows and the wizard all appear with the press of a button! Users can use their devices to discover and track the virtual performers as they appear around them during the film. The original version contained more 3D content and was designed to work on a timer so the audience could share in the digital performers appearing on cue! This app can be adapted for any event: live theater, film concerts, art and cultural heritage events, etc. Note: This application requires a gyroscope – not all Android devices have one.
iOS: http://apple.co/1SmwI7Z
Google Play: http://bit.ly/1RRbNds
MUNCIE, Ind. — Ball State University’s IDIA Lab, in partnership with Paws, Inc., has created a special digital art event in honor of Muncie’s sesquicentennial celebration. The multimedia event will take place during Muncie’s ArtsWalk at 8 p.m., Oct. 1, at the corner of East Charles and South Walnut streets in downtown Muncie.
Jim Davis’ Garfield will virtually host the event, which will feature historical imagery and clips of Muncie mentions in pop culture. “We wanted to create something fun and tongue-in-cheek, but also positive and optimistic about the city’s future,” said John Fillwalk, director of IDIA and senior director of the university’s Hybrid Design Technologies initiative.
“We were able to get a sneak peek at the community collaboration piece and I think ArtsWalk guests are in for a real treat,” said Cheryl Crowder, events coordinator for DWNTWN, the city’s downtown development campaign. “John and his team are so creative, and their ability to take an idea from conception to completion has been awesome.”
Other community partners for the digital art event include the Delaware County Historical Society, Muncie Arts and Culture Council, Center for Middletown Studies and Ball State University Libraries.
After the ArtsWalk premiere, Fillwalk said the piece will be made available to the public. “We consider it a work of community art. The more people who can view it and promote it for us, the better.”
For more about ArtsWalk, visit www.downtownmuncie.org/artswalk
Experience Moonlight Movies: The Wizard of Oz in a whole new way! On Saturday, July 11th, you can watch the film in DWNTWN at Canan Commons with an enhanced 3D experience using a new app on your phone designed locally just for this event.
Flying monkeys, the witch on her burning broom, Dorothy’s falling house, cows, chickens, rowboats, bicycles and rainbows all appear on cue! You can use your device to explore the skies as they appear around you during the film. The app is synched to the timing of the movie with visual and sound effects to provide you with an immersive viewing experience.
Special thanks to the Ball State University Institute for Digital Intermedia Arts for developing this app here locally for DWNTWN.
Follow the Yellow Brick Road to the DWNTWN land of Oz! Beginning at 7:00pm, come dressed as your favorite Oz character, get your picture taken with the characters from the Spirit of Oz touring group, enjoy live stage performances, Auntie Em’s Kansas Petting Zoo, and Kids Sing-along. The Wizard himself will be there with an actual hot air balloon!
There is a Toto Look-alike Contest, with the winning Toto receiving a free dog carrier from Midwest Homes for Pets, an Emerson Dog Park membership from the Muncie Animal Shelter, and a t-shirt and a free membership to the Bark Park from ARF.
Canan Commons is located in the 500 block of south Walnut Street near the roundabout. Family pre-show activities begin at 7:00pm, with the movie starting at dusk. Guests are encouraged to bring a blanket or lawn chairs and bug spray. Vendors will be on site with light snacks.
Blue Mars is a 3D massively multiplayer virtual world platform developed by Hawaii-based Avatar Reality – allowing third party developers to create virtual worlds, MMOG games, simulations, businesses, entertainment venues, clothing, custom avatars, furniture, virtual homes, and other items. It consists of four main parts: the client software, the Sandbox Editor SDK suite, the website, and the host servers.
The free Blue Mars Client is your window into our free to play massively multiplayer virtual world made up of an expanding set of independently operated Cities that feature unique themes, activities, and attractions. Through the Blue Mars Client, users can participate in a wide variety of entertainment, educational, and productivity experiences.
Blue Mars Sandbox Editors
The free Blue Mars Sandbox Editors are the bridge between your offline content and the online Blue Mars world. Our Editors work in concert with digital content creation tools like 3DSMax, Maya, Blender, and SketchUp to create immersive 3D spaces and upload them into the virtual world. Using the Blue Mars Sandbox Editors, developers can arrange geometry, script interactive features, define items for sale, manage shop and residence interiors, preview lighting conditions, adjust time of day, create shops, manage land, develop terrain and vegetation, choose textures, adjust shaders, and much more. Our editors are “What you see is what you play” so the view of your offline content in the Editor is exactly how it will appear in the online environment.
BlueMars.com
BlueMars.com is the central hub for discovery, community, and user and developer account management. Currency balances, friend lists, messaging, business intelligence, inventory, virtual land, search, shopping, and account information will all be managed through the website.
Blue Mars Servers
The Blue Mars Servers distribute the content uploaded into the virtual world by developers, manage the global item and inventory database, and coordinate the movement, actions, and communication of the users exploring Blue Mars. Our advanced server architecture can support many thousands of users simultaneously in the same location.
Chris Harrison gave a presentation during the annual CAP (College of Architecture and Planning) Faculty and Student Symposium about the use of quadcopters and drones in photogrammetry. Using a quadcopter and a GoPro Hero 3 camera, he was able to capture a subject and process the footage in Autodesk ReCap to produce a virtual model.
IDIA is pleased to announce REDgrid – a 3D virtual campus to support the educational mission of faculty, staff and students. Ball State community members can use it for free as a virtual gathering space including classroom instruction, small group meetings, presentations, panel discussions, or performances. It is a secure environment hosted and managed solely by Ball State’s IDIA Lab. Virtual classrooms or field experiences can be customized as well as made private to suit your needs. REDgrid also offers a developmental platform for online instruction or community displays.
In this video, David gives us a preview of what you are able to do in REDgrid.
Ball State institute creates virtual world to be used for learning, connecting
The Daily News | Amanda Belcher Published 08/31/15 12:27pm Updated 09/01/15 5:15pm
By 2013, the Sims video game series had sold 175 million copies worldwide. Its users could create avatars, build houses—just like in reality.
Ball State’s own REDgrid uses a similar concept. Students can make an avatar, walk around a virtual Ball State campus and interact with other avatars via written messages or a headset in this open simulator.
Ball State’s Institute for Digital Intermedia Arts (IDIA) developed REDgrid.
“The server is physically housed at BSU IDIA Lab and was installed in collaboration with Information Technology,” said John Fillwalk, the director of Ball State’s IDIA Lab.
When it comes to REDgrid, the possibilities can seem limitless—and some faculty members have already begun testing the simulator’s boundaries.
Mai Kuha, an assistant professor of English, used REDgrid in an honors class as a gender identity project. Students were assigned a gender and told to create an avatar of that gender. This enabled them to observe how people of opposite genders are treated differently.
Kuha isn’t the only professor to use REDgrid as a learning tool in the classroom. Ann Blakey, an associate professor of biology, used it in one of her courses for an environmental ethics activity.
Students were assigned a role, such as scientist or environmentalist, and were invited to explore the virtual environment and file a report. The activity gave students the opportunity to see the environment from different perspectives.
Fillwalk envisions even more opportunities for the platform.
“Ball State community members can use it for free as a virtual gathering space including classroom instruction, small group meetings, presentations, panel discussions or performances,” Fillwalk said.
The virtual classrooms and field experiences can be customized to fit each teacher’s needs.
This kind of creative teaching is what Stephen Gasior, an instructor of biology, is looking to expand upon. An enthusiastic voice for REDgrid, Gasior encourages professors and students to utilize this tool. He explains that it gives professors the ability to shape an environment and allows students to experience any number of events or situations.
REDgrid isn’t just for academic purposes either, Gasior said. Students can use the avatars for social experiments too.
“REDgrid represents Ball State’s campus, and international or online students get [the] feeling of being in the campus environment,” he said.
Fillwalk fully understands this aspect of REDgrid.
“We designed it to be a flexible platform connecting BSU faculty and students to international,” he said.
Connection is key with REDgrid. Gasior stressed that it can help build and connect communities—Ball State or otherwise.
Ball State is already working with faculty at other universities so the tool can be used on campuses other than just Ball State’s, Fillwalk said.
“The platform could certainly continue to be expanded,” he said.
Gasior has plans for the future—like researching developmental grants and expanding REDgrid. But IDIA staff can only do so much.
“People who come to REDgrid and have a passion for it will shape the road it will take,” Gasior said.
The Trans|Duce uses capacitive sensing and an Arduino to detect when a person is near it. That information is sent to Max 5, where it modulates a sound and/or filter; the resulting signal is fed to an amplifier that drives a transducer mounted on the back of a metal sheet.
The metal sheet then acts as a speaker, and its unconventional makeup creates an interesting timbre. Different gauges of metal can be used to better amplify certain ranges of the sound spectrum. As a result, the installation produces different sounds depending on how close or far away we are from the metal sheet.
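To illustrate one way the sensing-to-sound chain could be wired up, the sketch below reads proximity values from the Arduino over a serial port and forwards them to a Max patch as OSC messages that could drive a filter cutoff. The serial port, OSC address, port number and scaling are assumptions for illustration; the installation itself may route the Arduino data into Max differently.

```python
# Hypothetical bridge between the Arduino's capacitive readings and Max 5:
# read one proximity value per line from the serial port, map it to a filter
# cutoff, and send it to Max over OSC (e.g., received with a [udpreceive 9000]).
import serial                                    # pyserial
from pythonosc.udp_client import SimpleUDPClient

arduino = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)  # adjust port as needed
max_patch = SimpleUDPClient("127.0.0.1", 9000)

while True:
    line = arduino.readline().strip()
    if not line:
        continue
    try:
        proximity = int(line)                    # raw capacitive-sense reading
    except ValueError:
        continue
    # Map the raw reading to a cutoff frequency; the 0-1000 range is a guess
    # and would be calibrated against the actual sensor.
    cutoff = 200 + min(proximity, 1000) / 1000 * 4000
    max_patch.send_message("/cutoff", cutoff)
```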
Be sure to check out the demo video above and see the transducer in action.
Using electroencephalography (EEG), we are developing applications that digitize analog electrical brainwaves. With this device from NeuroSky (www.neurosky.com) we can wear a headset and control applications on electronics such as cell phones and computers. Unlike most EEG devices on the market, this one is affordable and stylish. Using the provided development tools, we plan to build our own work on top of it and create more applications for it. With certain “hacks” and our own touches, we can unlock new uses for the product. We look forward to continuing our research on this, but it is still in a development stage.
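As a rough example of how an application might read the headset’s data, the sketch below connects to NeuroSky’s ThinkGear Connector, which publishes parsed readings as JSON over a local TCP socket, and prints the attention and meditation values. The host, port and field names follow NeuroSky’s published ThinkGear Socket Protocol, but treat the details as assumptions; our own prototypes may use the native SDKs instead.

```python
# Sketch: read eSense attention/meditation values from the ThinkGear Connector.
import json
import socket

HOST, PORT = "127.0.0.1", 13854   # ThinkGear Connector defaults

with socket.create_connection((HOST, PORT)) as sock:
    # Ask the connector for parsed JSON rather than raw binary packets.
    sock.sendall(json.dumps({"enableRawOutput": False, "format": "Json"}).encode())
    buffer = b""
    while True:
        buffer += sock.recv(4096)
        while b"\r" in buffer:                  # packets are terminated with \r
            line, buffer = buffer.split(b"\r", 1)
            try:
                packet = json.loads(line)
            except ValueError:
                continue
            esense = packet.get("eSense")
            if esense:
                print("attention:", esense.get("attention"),
                      "meditation:", esense.get("meditation"))
```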
The DicePlus (Dice+) can communicate with both iOS and Android devices and has an SDK available for Android, iOS, and Unity.
Its capabilities include: reporting the number rolled, control of its LEDs, a magnetometer and accelerometer for orientation and movement, a capacitive sensor for proximity and touch, and a temperature sensor.
For more information regarding its use check out www.dicepl.us
The Mobile Arduino Controller allows one to access an Arduino board via a mobile-friendly website. It’s a website designed for mobile devices for a couple of reasons:
1) Allows access from computers, not just smart phones.
2) Easier for the public to access the Arduino or multiple Arduinos for installations without searching for an app and then waiting for it to download.
3) Works on all smart phones.
The mobile website allows near-real-time manipulation of anything connected to the Arduino board. It could also be used to relay sensor information back to the website or user. Right now it is only wired up to control LEDs, but it could control anything (servos, speakers, etc.).
We hope to use this for more than just turning an LED on and off, but currently it is in a prototyping/research phase. A rough sketch of the approach appears below.
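The example below illustrates the general idea: a small Python web server relays commands from a phone’s browser to an Arduino over a serial connection. Flask, the serial port path and the one-character command protocol are assumptions for illustration, not the lab’s actual implementation.

```python
# Sketch of a web-to-Arduino relay: visiting /led/on or /led/off from any
# phone or computer on the network toggles an LED via the serial port.
import serial                      # pyserial
from flask import Flask

app = Flask(__name__)
arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # adjust port as needed

@app.route("/led/<state>")
def set_led(state):
    # The Arduino sketch on the board is assumed to read '1'/'0' and toggle the LED.
    arduino.write(b"1" if state == "on" else b"0")
    return f"LED turned {state}"

if __name__ == "__main__":
    # Listen on all interfaces so mobile devices on the same network can reach it.
    app.run(host="0.0.0.0", port=5000)
```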
The Unity Mobile Controller would allow us to have a Unity project running on a PC while users download an app that turns their smartphone into the controller. This lets the project running on the PC show the player while the smartphone displays the controls and other information (such as maps, menus, etc.). It’s still in a research/prototyping stage right now.
Photogrammetry scan of a marble bust by artist Hiram Powers of Proserpine, the daughter of Ceres who became goddess of the underworld when Pluto carried her away and made her his wife. IDIA Lab processed the scan with Autodesk ReCap software; the model is displayed in Sketchfab.
We have applied our Virtual Solar Simulator to a simple white-model build of the pre-Columbian ballcourt in Group F at the Izapa site in Chiapas, Mexico. Izapa is considered to be the birthplace of the Mayan Long Count, which ends its cycle today, December 21st, 2012 – the winter solstice. Viewed today, the site is oriented on an axis that aligns generally, but not precisely, with sunrise on the winter solstice and sunset on the summer solstice. In direct observation today, the alignment along the axis of the ballcourt from Throne 2 to Stela 60 is off-axis by approximately two degrees.
* The simulation requires Internet Explorer and the Unity plugin.
The solar simulator developed by IDIA Lab at Ball State University polls data from the NASA/JPL Horizons database (http://ssd.jpl.nasa.gov/horizons.cgi) to calculate the positions of celestial objects (sun, moon, planets, comets, etc.). The database takes into account the Chandler wobble – a periodic deviation in the rotation of the Earth’s axis. Archaeo-astronomical alignments viewed today at Izapa appear to be off-axis; however, when changes in time, rotation, position and the Chandler effect are compensated for via this simulation method, the solstice events return to their original orientation for the era of their construction and can be observed to be in actual alignment.
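For readers curious about the kind of query involved, here is a minimal sketch that asks the Horizons system for the Sun’s apparent azimuth and elevation as seen from approximate Izapa coordinates on the 2012 winter solstice, using the astroquery library. This is an illustration only; the simulator itself polls the Horizons web service directly and also handles ancient (300 BCE) dates.

```python
# Sketch: query JPL Horizons for the Sun's apparent azimuth/elevation at Izapa.
from astroquery.jplhorizons import Horizons

# Approximate site coordinates (degrees east, degrees north, elevation in km).
izapa = {"lon": -92.18, "lat": 14.92, "elevation": 0.2}

# Sun (Horizons id '10') as seen from the site, sampled every 10 minutes.
query = Horizons(id="10", location=izapa,
                 epochs={"start": "2012-12-21 00:00",
                         "stop": "2012-12-21 23:50",
                         "step": "10m"})
eph = query.ephemerides()
print(eph["datetime_str", "AZ", "EL"])   # apparent azimuth and elevation
```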
A b’ak’tun is 144,000 days in length – almost 400 years. In the Mayan Long Count it took 13 b’ak’tuns to progress through a full cycle of creation. Dec. 21, 2012, marks the end of the 13th b’ak’tun of this Mayan calendar, and after today’s solstice the next Long Count cycle begins again – another 5128.8 years.
Happy New Mayan Long Count!
– John
December 21st, 2012
Dates are in the year 300 BCE/BC.
Time is represented in Universal Time (UT1/UTC/GMT).
To obtain local time for the Izapa site, subtract 6 hours from the UT time displayed in the simulator.
The next release will provide solstice and equinox presets for the year 2012 for comparison.
Camera control
To switch cameras, press 1 for the first-person view, 2 for the bird’s-eye view, and 3 for the top-down view.
To pan, use the WASD keys; to zoom, use the middle mouse button. To rotate the camera, use Ctrl-click or the right mouse button.
Background
The model is geolocated and accurately referenced in 3D space by latitude, longitude, orientation and elevation. The Horizons database is then queried using these coordinates as well as the year of interest – in this case we chose 300 BCE/BC, near the peak of the period in which the Izapa site was inhabited.
IDIA Lab has developed the celestial simulation – which can also track the moon and planets, etc – to work on various Cultural Heritage and archeo-astronomy projects in collaboration with international scholars. Other projects utilizing the Celestial Simulator include simulations of Hadrian’s Villa, the Roman Pantheon, the Solarium Augusti in Rome (the largest sun calendar of the ancient world), the Temple of Artemis in modern Turkey (one of the ancient wonders) and Stonehenge.
Renowned intermedia artist, Hans Breder, has donated a work of video art to the Ball State University Museum of Art. The work has been installed in a temporary new media gallery at the BSUMA. The reception for the exhibition was held on Thursday, September 15th and will be on display until November 27th, 2011. Breder’s MFA program in intermedia at the University of Iowa was the first of its kind in the nation.
Hybrid Design Technologies (HDT) is a new initiative through the Office of Information Technology at Ball State University, supporting the design and production of virtual, immersive and interactive environments – advancing the University’s expertise in virtual cultural heritage, museums, arts, teaching and hybrid learning. John Fillwalk serves as the Senior Director of HDT, as well as directing the IDIA Lab. www.bsu.edu/hdt
IDIA Presents: BSU MUSEUM OF ART COLLECTION PORTAL 3/20/11 at 2:30pm BSUMA
BSU MUSEUM OF ART COLLECTION PORTAL
Reception and Gallery Talk
Sunday, March 20 @ 2:30pm, BSU Museum of Art
The BSU Museum of Art Collection Portal was developed by students, faculty and industry research fellows in the Institute for Digital Intermedia Arts at Ball State University. The year-long project was developed in partnership with the Ball State University Museum of Art and sponsored by the Provost’s Immersive Learning Initiative.
This interdisciplinary team from art, computer science, architecture and music has developed an innovative, touch-based application to navigate the collection. The portal bridges the Microsoft Surface interface with the Museum’s collection database, the Digital Images Delivered Online (DIDO), hosted on the Digital Media Repository of Bracken Library. The Surface affords Museum visitors an interactive platform to individually or collaboratively make virtual connections between works of art both on display and in reserve – accessing information and media across the collection. Special thanks to IDIA Research Fellow Jeff Berg from IBM Interactive!
Received an invitation to join the international advisory board of the Virtual World Heritage Laboratory. The laboratory is undertaking a Hadrian’s Villa simulation project funded by the NSF.
U Museum Ostwall in Dortmund, Germany opened its doors with a 7,000-person reception. The museum is the largest in Germany. Our International Intermedia Art Archive through the Hans Breder Foundation (J. Fillwalk, President) will open at this museum in Spring 2011. http://bit.ly/aDvNwu & http://bit.ly/dAXjvL
IDIA had a conference call with Avatar Reality in Hawaii – including their CEO, President, CTO, and CFO – on IDIA development in Blue Mars. Avatar Reality president Jim Sink is presenting IDIA simulation work (which will be launched on November 15th, 2010) to the Smithsonian Museum this week.
As a follow-up to last week’s conference with IDIA Research Fellow Jeff Berg of IBM, John Fillwalk and IDIA Immersion Seminar students were invited to present our Microsoft Surface interface project to a 10-member IBM Interactive team of designers and programmers based in Chicago. A Provost’s Immersive Learning Grant funded the pursuit of commercialization for this product.
John Fillwalk’s work Final Wisdom I, exhibited at SIGGRAPH in Los Angeles last month, is featured in an article in Leonardo, the journal of the International Society for the Arts, Sciences and Technology, published by The MIT Press. http://muse.jhu.edu/journals/leonardo/summary/v043/43.4.fillwalk.html
Jesse Allison and John Fillwalk exhibited their interactive virtual worlds installation, Flickr Gettr, at the IUPUI Intermedia Festival held April 23-25 in Indianapolis. The festival had over 100 international performers from across the arts. For more information, please visit http://music.iupui.edu/intermedia/
Partnering with CAP, IDIA modeling and animation staff will undergo training in March on the new FARO 3D laser scanner and Gexcel software.
Physical location: Philips High Tech Campus and NatLab in Eindhoven, the Netherlands
http://www.hightechcampus.nl/
‘Traversal for Eindhoven’ is a live performance / hybrid reality installation that bridges the physical and virtual worlds. It connects the Philips NatLab in Eindhoven to a virtual environment as part of the 2009 Dutch Design Week. Avatars will be able to perform a physical synthesizer in the Philips NatLab via an interactive online virtual instrument built by IDIA – located near a 3D model of the Eindhoven High Tech Campus. The sound of the live synthesizer will be streamed back into Second Life, so the virtual and physical participants can hear their interactions with the instrument.
DUTCH DESIGN WEEK From October 17 through October 25, 2009, Eindhoven presents the 8th edition of Dutch Design Week, the largest design event in the Netherlands. Around 1,500 designers from home and abroad will show their work in more than 60 locations all over the city of Eindhoven, representing design disciplines such as industrial design, concept design, graphic design, textile & fashion, spatial design, food design and design management & trends. Visitors will be given insight into the entire development process from concept to product in various disciplines ranging from industrial design to applied arts. The participants include established bureaus, high-profile designers, talented newcomers, and recently graduated designers – one of the reasons why this Dutch Design Week is the perfect meeting place for designers, companies, and the public. http://www.dutchdesignweek.nl/indexmap.php
MAD EMERGENT ART CENTER The Foundation MAD is a platform and workshop for emergent art: art that exists in the exciting space between cultural and cutting-edge technological developments. MAD emergent art center is a laboratory, platform and provider at the intersection of art, science and technology. MAD addresses artists, designers, scientists, public groups, institutions, governments and businesses, on a regional as well as national and international level. http://www.mad.dse.nl/mad.html
More Information: http://tinyurl.com/yl2dq4l
John Fillwalk and Jesse Allison present to the Ohio Learning Network, UCit Instructional & Research Computing at University of Cincinnati on their Blackboard / Second Life Building Block Project.
Virtual worlds provide a platform in which to construct compelling experiences not possible within the material and temporal constraints of the physical world. The virtual realm also has the potential to be united with and engaged by physicality – informing and transforming the audience’s experience of exhibition in profound ways. The Institute for Digital Intermedia Arts at Ball State University has been incorporating mixed-reality approaches into museum exhibitions, musical performances, installation art, and interface design over the last several years. This paper documents specific explorations of the opportunities the Second Life environment offers for mixed-reality experiences – analyzing approaches to bridging the worlds, such as media streaming, client-side interaction, and an external web server communication hub, as well as opportunities for human/computer interaction.
Blackboard Greenhouse Project for Virtual Worlds: The IDIA Second Life/Blackboard Building Block Project
BbWorld ’09
7/15/2009 12:00 PM – 12:45 PM
National Harbor 2
Theme
Expanding and Improving Online Learning, Strategic Initiatives
Special Topics
Featured Speaker(s), Openness and Extending Blackboard Software
Speakers
John Fillwalk, Director of the Institute for Digital Intermedia Arts, Ball State University
Session Description
Ball State University’s Institute for Digital Intermedia Arts (IDIA) has produced a bridging toolset, linking the multi-user virtual environment of Second Life with Blackboard Learn – providing a unified, secure and fluid hybrid learning experience. This project is funded by the Blackboard Greenhouse Grant for Virtual Worlds and has produced an open-source Building Block.
IDIA at Ball State University (BSU) created a virtual and web-based software toolset to manage, administer and facilitate any hybrid Second Life / Blackboard Learn instructional experience. These Blackboard Building Blocks and Second Life scripts were specifically developed for a “hands-on” cinematography course taught with a complete set of virtualized filmmaking equipment. This pilot course also served as a prototype for Blackboard courses from any discipline that uses Second Life – especially in the delivery of studio, laboratory or other hands-on modes of learning – extending the modes of typical distance-learning offerings. The pairing with Blackboard augments Second Life, which is not designed with the course management tools needed to operate effective, self-contained distance instruction.
Join us for a discussion of these tools, released by IDIA, which have resulted in the public availability of Blackboard Building Blocks that facilitate automated communication between Blackboard Learn and Second Life, and more generally, the conceptual template for using Blackboard Learn to manage virtual world creation, access, and activity in Second Life. For more information:
http://www.nmc.org/files/ACUTA_Article_SecondLife.pdf
http://www.bsu.edu/news/article/0,1370,-1019-61661,00.html
Blue Mars is a free-to-play massively multiplayer virtual world featuring stunning graphics, realistic characters, and endless social bonding activities. The Blue Mars virtual world is made up of an expanding set of independently operated cities that feature unique themes, activities, and attractions such as shopping, avatar customization, unique personal spaces, and games like dancing, racing, and golf. Cities on Blue Mars are tied together with a unified login system, a persistent global Avatar ID, and a platform-wide, participation-based reward system.
Andy Beane has formed a large summer class of students with different academic majors and skill sets who have come together to make a finished animated short. IDIA’s Trevor Danehy assisted the class in creating their workflow pipeline with the integration of motion capture. The class, which named themselves “Third Floor Studios” (because most of the work takes place on the third floor of the Arts and Journalism building), wanted to use motion capture to help animate characters given the rather small amount of time the summer session allots. Motion capture is the process of recording a live person’s physical movements and converting them into data that can be applied to an animated character. Ball State’s Biomechanics Laboratory provided the motion capture facilities and equipment to record the actor’s movements. Trevor Danehy helped streamline the workflow of binding the tracking-point data to an animated character’s digital skeleton, using programs such as Autodesk Maya and MotionBuilder. Keeping the software pipeline as minimal as possible is pivotal to creating an animated short in a set timeframe.
The Hans Breder Foundation Board of Directors meeting took place in Chicago on May 16th at the Drake Hotel. Board member John Fillwalk attended the annual meeting and planned activities for the upcoming year – including an exchange with the University of Dortmund.
Members of the IDIA spent the afternoon testing Ball State’s Shafer Tower in order to connect it to the virtual worlds. The testing is in preparation for an upcoming outdoor event allowing people to perform the Carillon bells via Second Life.
John Fillwalk and Jesse Allison will exhibit their hybrid reality artwork, Traversal for Boston, during the Boston Cyberarts Festival. The festival is the first and largest collaboration of new media artists working in North America.
Ball State University has won yet another real-life national honor for one of its Second Life initiatives.
The Institute for Digital Intermedia Arts (IDIA) has earned a 2009 Institutional Excellence in Information Communications Technology Award from ACUTA, the Association for Information Communications Technology Professionals in Higher Education.
The organization will formally present the honor to John Fillwalk, IDIAA director and associate professor of electronic art, at its annual conference April 19-22 in Atlanta. The award recognizes IDIAA for its Aesthetic Camera filmmaking course and integrated Blackboard custom software, offered through the online world of Second Life.
The course teaches cinematography techniques and allows students to gain hands-on experience with equipment and resources that might not ordinarily be available to them. Within the virtual studio, they can check out cameras, dollies, light systems and more to create their own original movies. They can also film using avatars in costume in a wide variety of virtual set locations.
“Ball State’s Aesthetic Camera is a superb example of the type of innovation that ACUTA’s Institutional Excellence Award is designed to recognize,” said Walt Magnussen, ACUTA’s immediate past president and awards committee chairman. “This blending of virtual learning and distance education leverages the best of both the IT and the telecommunications disciplines.”
When first launched, the Aesthetic Camera course received Blackboard Inc.’s inaugural Greenhouse Grant for Virtual Worlds for its work linking the interactive technology of Second Life and Blackboard, which produces Web-based, course management software. The award cited the innovative combination, which allows Ball State to extend virtualized studio and laboratory experiences to an online distance education audience.
The course’s equipment has been virtualized and scripted, which means that it is quite similar to its real-world counterparts, Fillwalk said.
“Preparing our students for a changing world through immersive learning opportunities is a top priority for Ball State,” he said. “The IDIAA is leveraging Ball State’s success in emerging media and creating innovative media art and design experiences for our students in both real and virtual environs.”
The Aesthetic Camera and other opportunities available at the IDIAA are part of Ball State’s Emerging Media Initiative, a planned $17.7 million investment to accelerate economic benefits to Indiana with media-savvy human capital.
Michael Pounds, Jesse Allison and John Fillwalk will exhibit an interactive installation completed with assistance from students in the DFAB institute. For a complete schedule and more information about the event, visit http://seamus.sweetwater.com.
The students of the 2009 spring seminar at the Virginia Ball Center launched their interactive children’s exhibit at the Minnetrista Center on April 17th. The exhibit is designed for elementary children to gain a better understanding of honeybees and the significant role they play in our world. Through the Human Computer Interface Lab sponsored by the IDIA, students were able to design and implement the interactive aspects of the exhibit, which will remain at Minnetrista through August.
TechPoint Releases Nominees for 2009 Mira Awards Presented by BKD
Record Number of Entries Proves Challenging for Independent Judging Panel
Winners and Finalists for the 2009 Mira Awards presented by BKD will be revealed on May 16, at the awards gala, but TechPoint has released the names of 83 companies, schools and individuals from around the state nominated in 11 different competitive categories.
According to TechPoint President and CEO Jim Jay, the Mira Awards planning committee received more nominations this year than in any other year since the program’s inception 10 years ago.
Three independent panels of judges made up of business and community leaders were tasked with evaluating the 83 nominations in 11 different categories. The finalists and winners will be revealed at the 2009 TechPoint Mira Awards Gala presented by BKD on May 16, which will be held at The Westin Hotel in downtown Indianapolis.