Publications

Please see our Publications page for a partial list of game design, simulation, and digital media publications.

Research

Drexel's RePlay Lab is interested in all Game and Simulation research leveraging Digital Media (DIGM) and Computer Science (CS) skills in cooperation with other departments and majors, including Mechanical Engineering, Electrical and Computer Engineering, Music Industry, and Screenwriting. Please contact us about sponsored research opportunities. Past and continuing projects include:

Virtual Reality/Augmented Reality

 

Space gARden Augmented Reality/Virtual Reality World

This one-term Master's class project investigated augmented and virtual reality using an Oculus Rift with mounted stereoscopic cameras and a mounted Leap Motion sensor, as well as QR code markers. Using a combination of hand gestures and marker tracking, a portal between the virtual world and the augmented real world allows users to see their garden grow in the future through the time/space portal.
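The portal mechanic reduces to a small state machine: a marker sighting anchors the portal in space, a recognized hand gesture toggles it open, and the garden is then rendered at a future time offset. A minimal sketch of that logic, with hypothetical names (`Portal`, `growth_stage`, the "open_palm" gesture label) that are illustrative rather than from the project's codebase:

```python
import math

class Portal:
    """Toggles between the live garden and a time-shifted view of it."""

    def __init__(self, years_ahead=5.0):
        self.anchored = False      # set when a QR marker is sighted
        self.open = False          # toggled by a recognized hand gesture
        self.years_ahead = years_ahead

    def on_marker_detected(self):
        self.anchored = True

    def on_gesture(self, gesture):
        # Hypothetical gesture label from the Leap Motion pipeline.
        if self.anchored and gesture == "open_palm":
            self.open = not self.open

    def render_time(self, now_years):
        """Time to render the garden at: the present, or the future view."""
        return now_years + self.years_ahead if self.open else now_years

def growth_stage(t_years):
    """Toy growth curve: saturating growth toward full size."""
    return 1.0 - math.exp(-t_years)

portal = Portal()
portal.on_marker_detected()
portal.on_gesture("open_palm")
print(f"growth seen through portal: {growth_stage(portal.render_time(1.0)):.2f}")
```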
Student Team

Ben Goldberg, Producer/Team Lead
Caroline Guevara, Scrum Master
Ariel Evans, Secretary
Jun Ma, Lead Playtester and Quality Assurance
Natalie Lyon, Lead Audio, Webmaster
Cathy Lu, Co-Tech Lead
Wenjie Wu, Co-Tech Lead
Ning Shao, Co-Art Lead
Shannon Sepelak, Co-Art Lead

  • Advisor: Dr. Paul Diefenbach


Medical Simulations

 

“Evaluating the Effectiveness of a Novel Neonatal Virtual Simulation Model in Training on Management of Neonatal Emergencies.” This project is investigating virtual simulation training for nurses responding to neonatal emergencies. The simulation allows the user to ask questions, perform diagnostic tests, order lab work, perform medical interventions, and see the infant's response over time across different medical cases. It also analyzes the user's performance and, at the end of the simulation, shows the user how their response compares to a "gold standard" treatment protocol.
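The end-of-case feedback amounts to comparing the user's logged actions against an ordered "gold standard" protocol. A minimal sketch of that comparison, assuming a simple ordered-checklist model (the step names and the actual scoring rubric are illustrative assumptions, not the project's):

```python
def score_against_protocol(user_actions, gold_standard):
    """Report which protocol steps were performed, missed, or out of order."""
    performed = [step for step in gold_standard if step in user_actions]
    missed = [step for step in gold_standard if step not in user_actions]
    # A step is "out of order" if the user performed it before a step
    # that the protocol places earlier.
    order = {step: i for i, step in enumerate(gold_standard)}
    user_indexed = [a for a in user_actions if a in order]
    out_of_order = [
        a for i, a in enumerate(user_indexed)
        if any(order[b] < order[a] for b in user_indexed[i + 1:])
    ]
    return {"performed": performed, "missed": missed, "out_of_order": out_of_order}

gold = ["warm and stimulate", "clear airway", "ventilate", "chest compressions"]
user = ["clear airway", "warm and stimulate", "ventilate"]
print(score_against_protocol(user, gold))
```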

Sponsor: Independence Blue Cross Medical Education Simulation Center

PI: Sharon Calaman, MD, (Pediatrics, COM)
Co-Investigators: Bruce Bernstein, PhD, (Pediatrics, COM)
Paul Diefenbach, PhD (Digital Media and Computer Science, DU)
Dave Mauriello, (Digital Media, DU)
Jane McGowan, MD (Pediatrics, COM)
Ogechukwu Menkiti, MD (Pediatrics, COM)
Student Developers: Tim Day, Cory Bergquist, Anna Nguyen, Don Xu, Katelyn Godfrey, Gabriel Valdiva, Patrick Steimach

  • Advisor: Dr. Paul Diefenbach


SmartVCS: Beyond Big Studio Production

In the current production environment, the ability to previsualize shots utilizing a virtual camera system requires expensive hardware and large motion capture spaces only available to large studio environments. With accessible hardware such as multitouch tablets and the latest video game motion controllers, there exists an opportunity to develop a new virtual cinematography platform utilizing only consumer technologies and openly accessible game engines. The SmartVCS platform is designed for directors, both amateur and professional, who wish to embrace the notion of Virtual Production for films and game cinematics without a big studio budget. SmartVCS has potential applications to other digital media areas including game level design, real-time compositing & post-production, and architectural visualization.
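At its core, a consumer virtual camera maps tracked controller motion into a virtual set, typically with a tunable motion scale so a few feet of physical movement can cover a large virtual stage. A minimal sketch under that assumption; the class and parameter names are illustrative, not from the SmartVCS codebase:

```python
class VirtualCamera:
    """Maps tracked physical motion into a virtual set with motion scaling."""

    def __init__(self, motion_scale=10.0):
        self.motion_scale = motion_scale   # 1 physical unit -> N virtual units
        self.position = [0.0, 0.0, 0.0]
        self.last_tracked = None

    def update(self, tracked_position):
        """Feed in the controller's tracked position each frame."""
        if self.last_tracked is not None:
            delta = [c - p for c, p in zip(tracked_position, self.last_tracked)]
            self.position = [
                pos + d * self.motion_scale for pos, d in zip(self.position, delta)
            ]
        self.last_tracked = tracked_position
        return self.position

cam = VirtualCamera(motion_scale=10.0)
cam.update([0.0, 1.5, 0.0])           # first sample establishes a reference
print(cam.update([0.2, 1.5, 0.1]))    # a small physical step pans the camera 10x
```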

  • Master's Student: Girish Balakrishnan
  • Advisor: Dr. Paul Diefenbach

Motion Platform Simulator Games

The Drexel Ride is a multi-discipline effort to build games and a game architecture for a 2.5-ton, 3-DOF motion platform simulator donated by The Ride Works.

The game fLight was created as a joint senior design project by a team of Digital Media and Computer Science students. In this game, five participants outside the vehicle play on iPads against the two passengers in the vehicle. One passenger drives the vehicle and the other uses a Wiimote to light up beacons. The five iPad players use their creature characters to try to stop the vehicle's occupants through various physical interactions, such as bumping the vehicle or temporarily blinding them.
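The asymmetric design boils down to routing interaction events from the iPad players into the vehicle simulation's state. A minimal event-handling sketch with hypothetical event names and effect magnitudes (the real game networks these events between devices):

```python
import collections

vehicle_state = {"speed": 1.0, "visibility": 1.0}
events = collections.deque()

def queue_event(kind, strength=1.0):
    """Called when an iPad player's creature interacts with the vehicle."""
    events.append((kind, strength))

def apply_events(state):
    """Drain queued creature interactions into the vehicle simulation."""
    while events:
        kind, strength = events.popleft()
        if kind == "bump":
            state["speed"] = max(0.0, state["speed"] - 0.2 * strength)
        elif kind == "blind":
            state["visibility"] = max(0.1, state["visibility"] - 0.5 * strength)
    return state

queue_event("bump")
queue_event("blind", strength=0.8)
print(apply_events(vehicle_state))
```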

  • Advisor: Dr. Paul Diefenbach

Operation Dino Museum Game

  • Master's Student Team Lead: Glenn Winters
  • Members: Nate Lapinski, Girish Balakrishnan, Sonia Havens, Kevin Gross, Yujie Zhu, Bobby Speck, Ali Hassanzadeh, Yiqun Shao, Ian Woskey, Tom Burdak
  • Museum Advisor: Jason Poole
  • Advisors: Dr. Paul Diefenbach, Dr. Jichen Zhu

With Operation Dino, we worked with the Academy of Natural Sciences to create an interactive educational experience around their dinosaur exhibition. Based on the museum's educational goals and the needs of its visitors, we developed a new interactive museum exhibit that takes place in Dinosaur Hall. Utilizing touch-screen tablets and QR tags, the exhibit is designed as an educational experience for children ages 8-12 that extends what is learned in the museum into a fun and interactive game. Unlike most museum games, which are localized, Operation Dino uses server technology to install multiple game stations throughout Dinosaur Hall. Each game station poses a unique multiple-choice question that encourages players to explore the museum to answer it correctly and collect that station's explorer badge. There are many badges to collect, each unlocking unique in-game animations and collectibles. The game itself features colorful, animated dinosaurs that guide players on their quest as an archaeologist to collect all the badges and become the Master Archaeologist. Future installations will add a live scoreboard at the museum to encourage competition among players, and the game could extend outside the museum through an online portal giving players access to educational content related to what they learned there.
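Because each station holds one multiple-choice question and grants one badge, the server-side model is essentially a badge ledger keyed by station. A minimal sketch assuming a simple in-memory model; the station, question, and unlock content here are invented for illustration:

```python
STATIONS = {
    "hadrosaur-hall": {
        "question": "What did Hadrosaurus foulkii most likely eat?",
        "choices": ["Plants", "Fish", "Other dinosaurs"],
        "answer": 0,
        "unlocks": "hadrosaur animation",
    },
}

class Player:
    def __init__(self, name):
        self.name = name
        self.badges = set()

    def answer(self, station_id, choice):
        station = STATIONS[station_id]
        if choice == station["answer"]:
            self.badges.add(station_id)
            return f"Badge earned! Unlocked: {station['unlocks']}"
        return "Not quite -- explore the hall and try again."

    def is_master_archaeologist(self):
        return self.badges == set(STATIONS)

p = Player("Ava")
print(p.answer("hadrosaur-hall", 0))
print(p.is_master_archaeologist())   # True -- only one station in this sketch
```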


"Kollect" Cerebral Palsy Physical Therapy with Kinect

Video: "Cerebral Palsy Physical Therapy with Kinect," ExCITe Center (Girish Balakrishnan on Vimeo).

Cerebral palsy (CP) is the most common physical disability of childhood, and children with CP often experience decreased physical activity, motor skills, and functional mobility. Rehabilitation for youth with CP is moving toward activity-based interventions to increase physical activity and fitness. Evidence suggests that virtual reality and gaming may provide opportunities for youth with CP to participate in discrete physical activity tasks and improve motor planning. However, current games do not support interventions that are fun and offer an opportunity to achieve recommended levels of moderate to vigorous physical activity. The primary aim of this project is to develop a prototype using the Microsoft Kinect that allows youth with CP to access games more quickly and to play for a variable amount of time, with no lower or upper limits, so that they may improve aerobic capacity, muscular endurance, and overall health-related fitness.
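A session with "no lower or upper limits" implies the activity loop tracks effort continuously rather than against a fixed clock. A minimal sketch of estimating per-session movement from Kinect skeleton samples; the joint names and units are illustrative assumptions:

```python
import math

def movement(sample_a, sample_b):
    """Total joint displacement between two skeleton samples."""
    return sum(
        math.dist(sample_a[j], sample_b[j]) for j in sample_a if j in sample_b
    )

def session_activity(samples):
    """Accumulate movement over an open-ended session of any length."""
    return sum(movement(a, b) for a, b in zip(samples, samples[1:]))

frames = [
    {"left_hand": (0.0, 1.0, 0.0), "right_hand": (0.5, 1.0, 0.0)},
    {"left_hand": (0.1, 1.2, 0.0), "right_hand": (0.4, 1.3, 0.0)},
    {"left_hand": (0.3, 1.0, 0.1), "right_hand": (0.6, 1.1, 0.0)},
]
print(f"session movement: {session_activity(frames):.2f} m")
```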


  • Team: Timothy Day, Kevin Gross, Nate Lapinski, Girish Balakrishnan
  • Advisors: Dr. Paul Diefenbach; Dr. Hasan Ayaz (School of Biomedical Engineering, Science & Health Systems); Dr. Maggie O'Neil (College of Nursing and Health Professions, Department of Physical Therapy and Rehabilitation Sciences); Dr. Trish Shewokis (College of Nursing and Health Professions, Department of Physical Therapy and Rehabilitation Sciences)

Music-Reactive Gaming — "Pulse" and "Pulse 2"

Download | View Trailer

Pulse 2 is a music-reactive game that uses the player's own music library, dynamically incorporating music analysis together with web-accessible lyrics, text, and images based on the metadata included in the song files. The student team produced a web-deliverable 2D side-scroller integrating these features that performs real-time analysis of songs in the user's music library to drive the gameplay, providing a novel form of game-music interaction. This work was presented at the IEEE Games Innovation Conference (ICE-GIC 2009) in London, England.
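Driving a side-scroller from a song usually means turning an energy envelope into game events: sharp rises in the envelope (onsets) become obstacles or beats to act on. A minimal sketch of that mapping, with a synthetic envelope standing in for real audio analysis and thresholds chosen for illustration:

```python
def detect_onsets(envelope, threshold=0.3):
    """Flag frames where energy jumps sharply -- crude onset detection."""
    return [
        i for i in range(1, len(envelope))
        if envelope[i] - envelope[i - 1] > threshold
    ]

def spawn_obstacles(onsets, scroll_speed=120.0, frame_rate=30.0):
    """Place an obstacle at the scroll position matching each onset's time."""
    return [(i / frame_rate) * scroll_speed for i in onsets]

energy = [0.1, 0.15, 0.6, 0.5, 0.2, 0.7, 0.65, 0.3]   # stand-in analysis output
onsets = detect_onsets(energy)
print("onset frames:", onsets)
print("obstacle x-positions:", spawn_obstacles(onsets))
```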


"Surge"

Surge expanded on this concept, using deeper analysis of the music to extract characteristics identifiable by the human ear, deforming worlds and generating obstacles and environments in a game world where each planet represents a song in the player's music library. This work was presented at SIGGRAPH Asia 2010.
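Deforming a planet per song suggests mapping per-band audio energies to surface displacement. A minimal sketch assuming a fixed set of band energies per song (the project's actual feature set is not specified here); each band drives one harmonic of the planet's silhouette:

```python
import math

def planet_radius(theta, band_energies, base_radius=1.0, amplitude=0.2):
    """Displace a circular cross-section by summing band-driven harmonics."""
    bump = sum(
        energy * math.sin((k + 1) * theta)
        for k, energy in enumerate(band_energies)
    )
    return base_radius + amplitude * bump

bands = [0.8, 0.3, 0.5]   # e.g., low/mid/high-band energy for one song
for step in range(4):
    theta = step * math.pi / 2
    print(f"theta={theta:.2f} radius={planet_radius(theta, bands):.3f}")
```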

  • "Surge" Master's Student: Christian Hahn
  • Advisors: Dr. Paul Diefenbach, Dr. Youngmoo Kim

Brain Interfaces

"Lazy Brains"

View Website | View Trailer

Lazybrains is a 3D game built on a bridge between a neuro-monitoring device and a gaming engine. The game explores the influence that new methods of interfacing may have on a user's immersion in an interactive 3D world, as well as the new elements of gameplay that may arise. It is an exploratory, level-based experience in which the user guides their character through different chambers, each offering a challenge that uses feedback from the player's brain as an additional input. Utilizing the fNIR device, the user's brain activity becomes an additional "controller": in addition to the standard keyboard-and-mouse controls, the user's biofeedback output manipulates factors in the game's environment. For example, the user's concentration level may change the height of platforms, slow the motion of obstacles, or change the color of virtual puzzle pieces to complete objectives. The game was designed and created through a collaboration between a Digital Media senior project team and a Biomedical Engineering Ph.D. student.
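Using brain activity as a controller requires normalizing and smoothing the raw signal before it can stably drive something like platform height. A minimal sketch of that mapping; the signal ranges and smoothing constant are illustrative, and the real fNIR calibration is more involved:

```python
class ConcentrationController:
    """Turns a noisy concentration signal into a stable game parameter."""

    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing
        self.level = 0.0    # smoothed, normalized to 0..1

    def update(self, raw, raw_min=0.0, raw_max=100.0):
        normalized = min(1.0, max(0.0, (raw - raw_min) / (raw_max - raw_min)))
        # Exponential smoothing keeps platforms from jittering with the signal.
        self.level = self.smoothing * self.level + (1 - self.smoothing) * normalized
        return self.level

def platform_height(level, min_h=0.0, max_h=5.0):
    return min_h + level * (max_h - min_h)

controller = ConcentrationController()
for reading in [40, 55, 70, 80]:          # stand-in fNIR readings
    h = platform_height(controller.update(reading))
print(f"platform height: {h:.2f}")
```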

  • Team Members: Jordan Santell (DIGM), James Borden (DIGM), Aaron Bohenick (DIGM), Zachary Brooks (DIGM), Keneth Oum (DIGM), and Hasan Ayaz of Biomedical Engineering
  • Advisor: Dr. Paul Diefenbach
  • Images: live demo of the fNIR working in Lazybrains; Morby moves a door with his mind; "Ok kid, let's see some beetlejuice!"; in-game start screen

"Mind Tactics"

Mind Tactics is a multi-player game where distractions are part of the gameplay. This game investigates meaningful BCI gaming, as well as data collection to supplement clinical trial studies.

  • "Mind Tactics" Master's Student:Keneth Oum (DIGM), and Hasay Ayaz of BioMedical Engineering
  • Advisor: Dr. Paul Diefenbach

Augmented Reality — "crAR"

crAR is an ongoing Augmented Reality research project investigating novel gameplay mixing physical and virtual objects.

  • Advisor: Dr. Paul Diefenbach

Behavioral Simulations — "Transition to Teaching"

Website

The TTT website hosts training material in the form of interactive 3D simulations that place the teacher in various classroom scenarios and permit them to safely explore alternative strategies for dealing with the situations presented. Numerous scenarios are presented, ranging from everyday situations such as inattentive students to extreme ones involving verbal and physical abuse.
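Scenario-based training like this is naturally a branching graph: each node is a classroom situation, and each edge is a strategy the teacher can try. A minimal sketch of that structure with an invented scenario; the site's actual content and branching are richer:

```python
SCENARIOS = {
    "inattentive": {
        "prompt": "A student in the back row has stopped following the lesson.",
        "choices": {
            "call out the student publicly": "escalation",
            "move closer and re-engage quietly": "resolved",
        },
    },
    "escalation": {
        "prompt": "The student responds defensively in front of the class.",
        "choices": {"pause and de-escalate privately": "resolved"},
    },
    "resolved": {"prompt": "The class returns to the lesson.", "choices": {}},
}

def run(node="inattentive"):
    while SCENARIOS[node]["choices"]:
        print(SCENARIOS[node]["prompt"])
        # In the 3D simulation the teacher chooses interactively;
        # here we simply take the first listed strategy.
        choice, node = next(iter(SCENARIOS[node]["choices"].items()))
        print(f"> {choice}")
    print(SCENARIOS[node]["prompt"])

run()
```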

  • Advisors: Dr. Paul Diefenbach, Dr. Fredericka Reisman

Educational Multitouch Games — "e-Vironments" and "FISH"

View Trailer

Practical Applications for Multitouch Gaming Technology in the Middle School Classroom. Continuing the research from Planet Diggum, this project adapts and builds upon our previous framework, resulting in a platform for creating multitouch applications for the classroom environment. By harnessing the current strengths of X3D and developing new extensions for the engine, we have created a modular system where educators can load learning simulations on the multitouch display, including an adaptation of Justin Dobies' FISH. Each simulation can communicate with a web portal, allowing greater access for students and ease of administration for educators.
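A modular system where educators load simulations implies a common interface each simulation implements plus a registry the display loads from. A minimal sketch of that plugin pattern; the interface names are illustrative and not the project's actual X3D extensions:

```python
class Simulation:
    """Interface every classroom simulation module implements."""
    name = "base"

    def start(self):
        raise NotImplementedError

    def report(self):
        """State to sync to the web portal for students and educators."""
        return {}

class FishTank(Simulation):
    name = "FISH"

    def __init__(self):
        self.population = 10

    def start(self):
        self.population += 2    # stand-in for the real ecosystem step

    def report(self):
        return {"population": self.population}

REGISTRY = {cls.name: cls for cls in (FishTank,)}

sim = REGISTRY["FISH"]()
sim.start()
print(sim.report())      # payload a web portal could poll
```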

  • Team Members: Will Muto, Justin Dobies
  • Advisor: Dr. Paul Diefenbach

Location-based Gaming — "Inversion"

View Trailer

Inversion is an Alternate Reality Game engine that allows users to play in both the real and virtual worlds in competitive, team-based games. A deterministic central server acts as a bridge linking GPS-enabled mobile devices to a 3D representation of the physical game space in which the mobile users are playing. Users playing in the physical world, called Agents, must accomplish location-based tasks given to them by their Operator (a user playing from the 3D interface) via in-game text messages. Domination is a specific game type developed for Inversion. In Domination, Agents are guided by their Operators to physical locations, which they must occupy for a brief period in order to capture for their team. Teams can capture a location currently held by another team, and locations become briefly unobtainable if opposing players attempt to occupy the same location for too long. Teams accumulate points by capturing and holding waypoints; the winning team is the first to reach the point threshold, or the team with the most points when time runs out.
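Domination's capture rules translate directly into per-waypoint state: an occupying team accumulates capture time, prolonged contested occupancy locks the point, and held points feed team scores. A minimal sketch of that logic with made-up timing constants:

```python
class Waypoint:
    CAPTURE_SECONDS = 10.0     # occupy this long to capture
    CONTEST_SECONDS = 5.0      # contested this long -> briefly unobtainable
    LOCKOUT_SECONDS = 15.0

    def __init__(self):
        self.owner = None
        self.capture_time = 0.0
        self.contest_time = 0.0
        self.locked_for = 0.0
        self.capturing = None

    def tick(self, occupying_teams, dt):
        if self.locked_for > 0:
            self.locked_for = max(0.0, self.locked_for - dt)
            return
        if len(occupying_teams) > 1:
            # Opposing players on the same point for too long lock it out.
            self.contest_time += dt
            if self.contest_time >= self.CONTEST_SECONDS:
                self.locked_for = self.LOCKOUT_SECONDS
                self.contest_time = self.capture_time = 0.0
                self.capturing = None
        elif len(occupying_teams) == 1:
            team = next(iter(occupying_teams))
            self.contest_time = 0.0
            if team != self.capturing:
                self.capturing, self.capture_time = team, 0.0
            self.capture_time += dt
            if self.capture_time >= self.CAPTURE_SECONDS:
                self.owner = team

wp = Waypoint()
for _ in range(10):
    wp.tick({"red"}, dt=1.0)
print("owner:", wp.owner)    # -> red after 10 uncontested seconds
```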

  • Advisors: Dr. Paul Diefenbach, Dr. Frank Lee

Multitouch Gaming — "Planet Diggums"

View Presentation

While devices such as the Wii and the PlayStation Eye (formerly the EyeToy) have expanded user interface possibilities in the gaming world, there is still a traditional disconnect between the user and the physical screen image. The advance of gesture computing and the new availability of multi-point touch screens permit a novel mechanism and universal vocabulary for direct interaction with a virtual world. Planet Diggum is a multi-user god-game which uses a novel combination of finger and hand-stroke gestures. The goal of this project is to create a testbed kiosk where multiple users can interact with the system without need of training. Planet Diggum uses a Drexel-built Frustrated Total Internal Reflection multi-point touch screen, standards-based software, and custom media assets, and brings together teams from Digital Media, Computer Science, and Engineering to provide a unique and fun user experience.
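A universal vocabulary of finger and hand strokes presupposes a classifier that can at least separate taps from strokes in the touch traces the FTIR screen produces. A minimal sketch of that first distinction; the thresholds and gesture meanings are illustrative assumptions:

```python
import math

def classify_touch(trace, tap_time=0.25, tap_travel=0.02):
    """trace: list of (t_seconds, x, y) samples for one touch contact."""
    (t0, x0, y0), (t1, x1, y1) = trace[0], trace[-1]
    duration = t1 - t0
    travel = math.hypot(x1 - x0, y1 - y0)
    if duration < tap_time and travel < tap_travel:
        return "tap"                       # e.g., select a Diggum
    return "stroke"                        # e.g., carve terrain

tap = [(0.00, 0.50, 0.50), (0.10, 0.50, 0.51)]
stroke = [(0.00, 0.20, 0.20), (0.40, 0.60, 0.25)]
print(classify_touch(tap), classify_touch(stroke))
```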

  • Team Members: Dr. Paul Diefenbach, William Muto, Matthew Smith, Chester Cunanan, Justin Dobies, Arvind Neelakantan, Sara Colucci, James Grow, Dr. Frank Lee, Louis Kratz, Dr. Ko Nishino, Craig Polakoff, Boris Block, Dan Hennessey, Zenko Klapko, David Millar, William Morgan; Electrical Engineering: Dr. Youngmoo Kim, Timothy Kurzweg, Vijay Balchandani, Eric Effinger, Jeevan Kotha, Pannha Prak, Joseph Romeo
  • Advisors: Dr. Paul Diefenbach, Dr. Youngmoo Kim
  • Images: a Diggum; Diggum instructions used in-game; another example of a Diggum; users playing a live demo of Planet Diggums

Humanoid Robots — "HUBO" & "Robonova"

There has historically been a disconnect between the engineering of robotic behaviors and the artistic creation of fluid, human-like motions. This work discusses an approach that leverages 3D tools originally created for computer graphics animators and repurposes them for humanoid robotics. Initial proof-of-concept work on a simple off-the-shelf humanoid built from basic servos and brackets provides a testbed prior to migrating the techniques to a larger, more complex, reactive biped robot, the HUBO.
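Repurposing animator tools for robots comes down to sampling the animation's joint curves at the servo update rate and clamping to each servo's limits. A minimal sketch of that resampling step with a toy keyframe curve; the real pipeline exports curves from a 3D animation package, and the rates and limits here are assumptions:

```python
def sample_curve(keyframes, t):
    """Linearly interpolate a joint angle from (time, degrees) keyframes."""
    for (t0, a0), (t1, a1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            return a0 + (a1 - a0) * (t - t0) / (t1 - t0)
    return keyframes[-1][1]

def to_servo_commands(keyframes, rate_hz=50, lo=-90.0, hi=90.0):
    """Resample an animation curve into clamped per-tick servo targets."""
    duration = keyframes[-1][0]
    ticks = int(duration * rate_hz) + 1
    return [
        max(lo, min(hi, sample_curve(keyframes, i / rate_hz)))
        for i in range(ticks)
    ]

elbow = [(0.0, 0.0), (0.5, 45.0), (1.0, 120.0)]   # keyframed swing, degrees
commands = to_servo_commands(elbow, rate_hz=4)     # coarse rate for display
print(commands)    # the 120-degree key clamps to the 90-degree servo limit
```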

  • Team Members: Daniel Letarte (DIGM), Joyce Tong (DIGM), Robert Ellenberg (Mechanical Engineering)
  • Advisors: Dr. Paul Diefenbach (DIGM), Christopher Redmann (DIGM), Dr. Paul Oh (Mechanical Engineering)