
OTHER PROJECTS

Human-Robot Dance Improvisation

This project explores the possibilities of physical Human-Robot Interaction (pHRI) in the context of an artistic task: improvising a dance piece. Inspired by the dance method of Contact Improvisation, one or more dancers and a robot interact through both motion and physical contact, creating a unique dance piece each time based on their "creative" choices. Initially a collaboration with Isabel Valverde and Ana Moura, the system evolved into a more autonomous closed-loop system in which perception, planning, control, and execution monitoring come together to ensure both the safety and the quality of the artistic task. A short film based on the robot-dancer interactions (shot and edited by Nuno Leite) won the Most Uncanny award at the 2015 Robot Film Festival. Technical details of the system can be found in this video, and an example of a full improvised dance piece can be found here (dancers Kate Rosenberg and Carson McCalley).
Collaborators: Nikhil Baheti, Paul Calhoun, Tiago Ribeiro, Letian Zhang.
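
The project's code is not reproduced here; the sketch below is only a rough Python illustration of the kind of closed-loop structure mentioned above, where a perception-planning-control cycle runs under execution monitoring. All class and method names (DanceImprovisationLoop, observe, improvise, is_safe) are hypothetical, not the system's actual API.

```python
# Hypothetical sketch of a perception-planning-control loop with execution
# monitoring; components are injected so the loop itself stays generic.
import time

class DanceImprovisationLoop:
    def __init__(self, perception, planner, controller, monitor, rate_hz=30):
        self.perception = perception    # tracks dancer pose, contact, robot state
        self.planner = planner          # chooses the robot's next "creative" motion
        self.controller = controller    # turns a plan into robot commands
        self.monitor = monitor          # execution monitoring / safety checks
        self.period = 1.0 / rate_hz

    def run(self):
        while not self.monitor.stop_requested():
            state = self.perception.observe()       # current dancer and robot state
            plan = self.planner.improvise(state)    # motion segment responding to the dancer
            if self.monitor.is_safe(state, plan):   # veto plans that risk unsafe contact
                self.controller.execute(plan)
            else:
                self.controller.hold()              # fall back to a safe posture
            time.sleep(self.period)
```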

Human-Robot Collaborative Painting

Work done in collaboration with Su Baykal, Yeliz Karadayi, and Robert Zacharias. In the first phase, we developed a collaborative system in which a human and a robot jointly create a painting. The robot, a haptic feedback device connected to a paintbrush, influences the person's drawing by applying force to the paintbrush, suggesting that they follow curves the human does not know ahead of time. In the second phase, we conducted a study to evaluate perceived creativity, willingness to collaborate, and credit assignment in two scenarios. In scenario 1, the human is told that they are collaborating with an interactive robot with which they will create a painting; in scenario 2, the human is told that they are collaborating with another human in a separate room and that their input is mediated through the robot in real time. In fact, the two scenarios are strictly identical, yet people in the two scenarios perceived the creative process differently. Our project was featured on Creator's Project, in the Guardian, the CMU Tartan newspaper, and the AMT Lab blog. Check out this video explaining how it works.
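
As a rough illustration of the guidance idea described above, the Python sketch below pulls the brush tip toward the nearest point of a hidden target curve with a capped, spring-like force. It is a toy example under assumed parameters, not the system's actual haptic rendering; the function guidance_force, the stiffness, and the force cap are made up for illustration.

```python
import numpy as np

def guidance_force(tip_xy, curve_xy, stiffness=40.0, max_force=3.0):
    """tip_xy: brush tip position (2,); curve_xy: sampled hidden curve (N, 2)."""
    diffs = curve_xy - tip_xy                 # vectors from the tip to each curve sample
    dists = np.linalg.norm(diffs, axis=1)
    nearest = diffs[np.argmin(dists)]         # offset to the closest point on the curve
    force = stiffness * nearest               # spring-like pull toward the curve
    norm = np.linalg.norm(force)
    if norm > max_force:                      # cap the force so the human stays in charge
        force *= max_force / norm
    return force

# Example: a gentle pull toward a circular target curve the person cannot see.
theta = np.linspace(0, 2 * np.pi, 200)
curve = np.stack([0.1 * np.cos(theta), 0.1 * np.sin(theta)], axis=1)
print(guidance_force(np.array([0.0, 0.0]), curve))
```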

Musical Lights

In this project, I brought to reality Mr. Walid Tamari's vision of a device that helps you learn the piano rigorously, yet in a simple, accessible, and flexible way. 'Musical Lights' consists of a strip of colored lights mountable on any piano, a pair of color-coded gloves, and a software application. The user selects the piece to be played and receives real-time instructions on which keys to strike, and with which hand and finger, thanks to the color coding. It is the first piano-technology device to provide simple instructions at this level of detail and is patent-protected.
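
To make the color-coding idea concrete, here is a small hypothetical Python sketch of how notes could be mapped to lights and colors: one LED per key and one color per (hand, finger) pair. The names, color scheme, and layout are illustrative assumptions, not the patented device's actual design.

```python
# One LED per key on an 88-key piano; MIDI note 21 (A0) maps to LED 0.
LOWEST_MIDI_NOTE = 21

# Color per (hand, finger): here the left hand uses cool colors, the right hand warm ones.
FINGER_COLORS = {
    ("left", 1): (0, 0, 255),    ("right", 1): (255, 0, 0),
    ("left", 2): (0, 128, 255),  ("right", 2): (255, 128, 0),
    ("left", 3): (0, 255, 255),  ("right", 3): (255, 255, 0),
    ("left", 4): (0, 255, 128),  ("right", 4): (255, 0, 128),
    ("left", 5): (0, 255, 0),    ("right", 5): (255, 0, 255),
}

def led_command(midi_note, hand, finger):
    """Return (led_index, rgb_color) telling the strip which light to turn on."""
    led_index = midi_note - LOWEST_MIDI_NOTE
    return led_index, FINGER_COLORS[(hand, finger)]

# Example: middle C (MIDI 60) played with the right thumb.
print(led_command(60, "right", 1))   # -> (39, (255, 0, 0))
```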

Explainable Markov Decision Processes

Coming soon.

Vehicular Ad-Hoc Networks (VANET)

Coming soon.

Simulating the Time Projection Chamber of the CERN ALICE detector

Document available here.

Heterogeneous Robot Scheduling

Coming soon.

Vehicle Conflict Resolution at Traffic Intersections

Coming soon.

Improving Robot Grasp Detection

Coming soon.
