Research

Open Research Projects

Email kpietroszek aattt csumb ddoott edu if you are interested in any of the projects below.

  • Interactive Table – building and evaluating the world’s largest interactive tabletop.
  • AuTible — AUTomatic audIBLE.com — an auto-generated audiobook that uses previous audiobooks as interpretative input.
  • BrainScore – adapting a music score’s interpretation to the audience’s mood.
  • Brainformance – turning audience brainwaves into an audiovisual feast in VR.
  • Young Lady’s Illustrated Primer – a fairytale book that turns itself into a personal story (inspired by “The Diamond Age” by Neal Stephenson).
  • Speech2Face – real-time conversion of speech to expressive face animation in VR using deep neural networks.
  • SymphonyVR – conducting a virtual symphonic orchestra in VR.

Ongoing Projects


UniVResity: open-access remote classroom participation through mixed reality

UniVResity is a mixed reality education initiative that opens “in-person” class participation to remote classmates. Our hypothesis is that mixed reality participation may result in better learning outcomes because, as in face-to-face education, it supports social interactions and non-verbal communication. Effective modes of learning, such as peer learning and active learning, are possible in VR but are difficult to realize in traditional e-learning environments.


User Representation Across the Spectrum of Realities: Real, Augmented, Mixed, Virtual

In this research we study alternative representations of users in the context of mixed reality systems, where some end-nodes connect via 2D screens, some exist in ambient, ubiquitous computing environments, and others interact via traditional virtual reality hardware. We are particularly interested in exploring automatic translation of user representation and interaction for various end-node configurations, so that users can be represented non-uniformly across a networked system.


Low-cost mobile-based Surgery Simulator with Haptic Feedback

In collaboration with the Graduate School of Medicine, Kyoto University, we are designing a low-cost, mobile-based surgery simulator with haptic feedback. Our goal is to bring the benefits of virtual reality surgical training to developing countries and to universities that cannot afford larger, more expensive VR systems.


Phantom Limb Pain Attenuation using Mobile-based Virtual Reality

Between 50% and 80% of amputees experience phantom limb pain, often for many years after the surgery. The causes of the pain are not well understood, and the severity of the condition is often dramatic. VR-based therapy is known to effectively attenuate phantom limb pain, yet the therapy is available at only a small number of research institutions. In our research, we are developing a mobile app that attenuates phantom limb pain using a smartphone-based VR system.


BrainLego: improving proportional reasoning of children with ADHD

Using a low-cost electroencephalography (EEG)-based brain-computer interface, this project aims to improve the proportional reasoning of children with ADHD. In collaboration with the School of Public Health and Health Systems, University of Waterloo, we are developing a Lego-based game that teaches children proportional reasoning while adapting the difficulty of the reasoning tasks to the player’s attention level. Using this biofeedback, the game aims to prolong the state of flow in which the child is immersed in gameplay and thus focused on the task. Our hypothesis, based on previous research results, is that the intervention may improve the attention span of children with ADHD.
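
As an illustration of the adaptation loop only, the following minimal Python sketch (hypothetical; the flow band, difficulty range, and attention values are assumed, not taken from the project) shows how a normalized EEG attention index could drive task difficulty up or down to keep the player in flow:

    # Hypothetical sketch of the biofeedback loop: an EEG headset reports a
    # normalized attention index in [0, 1], and the game nudges task difficulty
    # up or down to keep the player inside an assumed "flow" band.

    FLOW_LOW, FLOW_HIGH = 0.4, 0.7   # assumed flow band, not measured values
    MIN_LEVEL, MAX_LEVEL = 1, 10     # difficulty range of the reasoning tasks

    def adapt_difficulty(level: int, attention: float) -> int:
        """Return the next difficulty level given the current attention index."""
        if attention > FLOW_HIGH and level < MAX_LEVEL:
            return level + 1         # sustained attention: raise the challenge
        if attention < FLOW_LOW and level > MIN_LEVEL:
            return level - 1         # attention dropping: ease off so the child re-engages
        return level                 # inside the flow band: keep the current level

    if __name__ == "__main__":
        level = 3
        for attention in (0.8, 0.75, 0.5, 0.3, 0.2):   # simulated EEG readings
            level = adapt_difficulty(level, attention)
            print(f"attention={attention:.2f} -> difficulty {level}")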


PrintArm: Low-cost 3D-printable EMG-based Robotic Prosthetic Arm

Worldwide, the number of amputees who have access to robotic prosthetics is small, especially in developing countries, because an average robotic prosthetic arm costs tens of thousands of dollars. In this project, we aim to design an open-source, low-cost robotic hand that can be 3D-printed to work as a robotic prosthetic arm. The arm is controlled by flex sensors, so it reacts to muscle movements of the remaining part of the hand. Once done, we plan to release the design of the arm as an open-source, free-to-print project and distribute it worldwide with the help of charity organizations.
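
The following minimal Python sketch illustrates the kind of sensor-to-servo mapping involved; it is hypothetical, and the ADC range, smoothing factor, and servo limits are assumed values rather than the arm’s actual calibration:

    # Hypothetical sketch of the sensor-to-servo mapping: a flex-sensor reading
    # from the residual limb is smoothed and mapped onto a finger-servo angle.
    # The ADC range, smoothing factor, and servo limits are illustrative values.

    SENSOR_MIN, SENSOR_MAX = 200, 800   # assumed raw ADC range of the flex sensor
    SERVO_OPEN, SERVO_CLOSED = 0, 180   # servo angles for an open and a closed grip
    ALPHA = 0.2                         # exponential-smoothing factor to suppress jitter

    def sensor_to_angle(raw: int, smoothed: float) -> tuple[float, float]:
        """Map a raw reading to a servo angle, returning (angle, new smoothed value)."""
        smoothed = ALPHA * raw + (1 - ALPHA) * smoothed
        t = (smoothed - SENSOR_MIN) / (SENSOR_MAX - SENSOR_MIN)
        t = min(max(t, 0.0), 1.0)                      # clamp to [0, 1]
        return SERVO_OPEN + t * (SERVO_CLOSED - SERVO_OPEN), smoothed

    if __name__ == "__main__":
        smoothed = float(SENSOR_MIN)
        for raw in (220, 400, 650, 790, 500):          # simulated muscle-flex readings
            angle, smoothed = sensor_to_angle(raw, smoothed)
            print(f"raw={raw:4d} -> servo angle {angle:6.1f} deg")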


Published Projects

Mobile Devices at the Cinema Theatre
The pre-show experience is a significant part of the movie industry. Over the last decade, the pre-movie in-theatre experience has grown into a $600 million industry. In this paper, we describe Paths, an innovative multiplayer real-time socially engaging game that we designed, developed, and evaluated. An iterative-refinement application development methodology was used to create the game. The game may be played on any smartphone, and group interactions are viewed on the large theatre screen. This paper also reports on the quasi-experimental, mixed-method study with repeated measures that was conducted to ascertain the effectiveness of this new game.

Publication: Edward R. Sykes, Dilip Muthukrishnan, Yousif Al-Yousifi, Darren Spriet, and Krzysztof Pietroszek. 2016. Mobile Devices at the Cinema Theatre. Entertainment Computing 15: 21-39.


TickTockRay: Smartwatch-based virtual pointing
TickTockRay is a smartwatch-based raycasting technique designed for smartphone-based head-mounted displays. It demonstrates that smartwatch-based raycasting can be reliably implemented on an off-the-shelf smartphone and may provide a feasible alternative to specialized input devices. We release TickTockRay to the research community as an open-source Unity plugin, along with an example application, a Minecraft VR game clone, that shows the utility of the technique for the placement and destruction of Minecraft blocks.
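
The plugin itself targets Unity; the Python sketch below is only a conceptual illustration of the underlying idea, with an assumed yaw/pitch convention and a toy voxel scene standing in for the Minecraft clone:

    # Conceptual sketch only (the released plugin targets Unity): the watch's
    # IMU yaw and pitch define a ray from the user's hand, and the ray is
    # marched through a voxel grid until it hits an occupied block, as in the
    # Minecraft-style demo. The coordinate convention and scene are toys.
    import math

    def ray_direction(yaw_deg, pitch_deg):
        """Convert watch yaw/pitch (degrees) into a unit direction vector."""
        yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
        return (math.cos(pitch) * math.sin(yaw),
                math.sin(pitch),
                math.cos(pitch) * math.cos(yaw))

    def first_block_hit(origin, direction, occupied, max_dist=20.0, step=0.05):
        """March along the ray and return the first occupied voxel, or None."""
        t = 0.0
        while t < max_dist:
            voxel = tuple(int(math.floor(origin[i] + t * direction[i])) for i in range(3))
            if voxel in occupied:
                return voxel
            t += step
        return None

    if __name__ == "__main__":
        blocks = {(0, 0, 5), (1, 0, 5), (2, 1, 7)}            # toy voxel scene
        d = ray_direction(yaw_deg=3.0, pitch_deg=2.0)         # simulated watch orientation
        print(first_block_hit((0.5, 0.5, 0.5), d, blocks))    # -> (0, 0, 5)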

Publication: Daniel Kharlamov, Brandon Woodard, Liudmila Tahai, and Krzysztof Pietroszek. 2016. TickTockRay: smartwatch-based 3D pointing for smartphone-based virtual reality. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology (VRST ’16). ACM, New York, NY, USA, 365-366.


Tiltcasting: 3D Interaction on Large Displays using a Mobile Device
We develop and formally evaluate a metaphor for smartphone interaction with 3D environments: Tiltcasting. Under the Tiltcasting metaphor, users interact within a rotatable 2D plane that is “cast” from their phone’s interactive display into 3D space. Through an empirical validation, we show that Tiltcasting supports efficient pointing, interaction with occluded objects, disambiguation between nearby objects, and object selection and manipulation in fully addressable 3D space. Our technique outperforms existing target-agnostic pointing implementations.
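
As a conceptual illustration of the metaphor, and not the evaluated implementation, the Python sketch below rotates the interaction plane by the phone’s pitch and yaw and maps a touch point (u, v) on the plane to a 3D cursor position:

    # Illustrative sketch with assumed coordinate conventions (not the evaluated
    # implementation): the phone's pitch and yaw rotate a 2D interaction plane
    # that is "cast" into the scene, and a touch point (u, v), expressed in
    # meters on that plane, maps to a 3D cursor position.
    import math

    def rotate_x(v, a):
        c, s = math.cos(a), math.sin(a)
        return (v[0], c * v[1] - s * v[2], s * v[1] + c * v[2])

    def rotate_y(v, a):
        c, s = math.cos(a), math.sin(a)
        return (c * v[0] + s * v[2], v[1], -s * v[0] + c * v[2])

    def cursor_on_cast_plane(anchor, pitch_deg, yaw_deg, u, v):
        """Return the 3D position of touch point (u, v) on the rotated plane."""
        pitch, yaw = math.radians(pitch_deg), math.radians(yaw_deg)
        right = rotate_y(rotate_x((1.0, 0.0, 0.0), pitch), yaw)   # plane's x axis
        up = rotate_y(rotate_x((0.0, 1.0, 0.0), pitch), yaw)      # plane's y axis
        return tuple(anchor[i] + u * right[i] + v * up[i] for i in range(3))

    if __name__ == "__main__":
        # Plane anchored 2 m in front of the display, phone tilted 30 degrees up.
        print(cursor_on_cast_plane((0.0, 1.5, 2.0), pitch_deg=30, yaw_deg=0, u=0.10, v=0.05))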

Publication: Krzysztof Pietroszek, James R. Wallace, and Edward Lank. 2015. Tiltcasting: 3D Interaction on Large Displays using a Mobile Device. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST ’15). ACM, New York, NY, USA, 57-62.


Watchpoint: Freehand Pointing with a Smartwatch in a Ubiquitous Display Environment
We describe the design and evaluation of a freehand, smartwatch-based, mid-air pointing and clicking interaction technique called Watchpoint. Watchpoint enables a user to point at a target on a nearby large display by moving their arm. It also enables target selection through a wrist-rotation gesture. We validate Watchpoint by comparing its performance with two existing techniques: Myopoint, which uses a specialized forearm-mounted motion sensor, and a camera-based (Vicon) motion capture system. Our work demonstrates that a commodity smartwatch can serve as an effective pointing device in ubiquitous display environments.
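
A hypothetical Python sketch of the idea follows; the display resolution, angular mapping, and wrist-rotation threshold are illustrative assumptions rather than the values used in the study:

    # Hypothetical sketch (display size, angular range, and the wrist-rotation
    # threshold are assumptions, not the study's values): the watch's yaw and
    # pitch steer a cursor on a nearby display, and a quick change in roll is
    # treated as a selection "click".

    DISPLAY_W, DISPLAY_H = 1920, 1080   # display resolution in pixels
    FOV_DEG = 40.0                      # assumed angular range mapped across the display
    CLICK_ROLL_DEG = 35.0               # wrist-rotation threshold for a click

    def cursor_position(yaw_deg, pitch_deg):
        """Map watch yaw/pitch (degrees) to pixel coordinates, clamped to the display."""
        nx = 0.5 + yaw_deg / FOV_DEG
        ny = 0.5 - pitch_deg / FOV_DEG
        x = min(max(int(nx * DISPLAY_W), 0), DISPLAY_W - 1)
        y = min(max(int(ny * DISPLAY_H), 0), DISPLAY_H - 1)
        return x, y

    def is_click(prev_roll_deg, roll_deg):
        """Treat a fast wrist rotation as a selection gesture."""
        return abs(roll_deg - prev_roll_deg) > CLICK_ROLL_DEG

    if __name__ == "__main__":
        print(cursor_position(yaw_deg=5.0, pitch_deg=-3.0))   # cursor right of and below center
        print(is_click(prev_roll_deg=0.0, roll_deg=50.0))     # True: the wrist was flicked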

Publication: Keiko Katsuragawa, Krzysztof Pietroszek, James R. Wallace, and Edward Lank. 2016. Watchpoint: Freehand Pointing with a Smartwatch in a Ubiquitous Display Environment. In Proceedings of the International Working Conference on Advanced Visual Interfaces (AVI ’16). ACM, New York, NY, USA, 128-135.