Human Music Sequencer

Red Bull's "Hack the Hits" Hackathon, October 2017

The Hackathon

In October of 2017, I had the opportunity to attend an incredible 48-hour hackathon in San Francisco, CA, hosted by Red Bull called "Hack the Hits." The hackathon took place at a makerspace called the "Tech Shop," where Red Bull and the Tech Shop provided the hackers with plentiful cash, tools, and mentorship. The goal of the hackathon was to combine music and technology to create a new instrument or entertainment experience. The video below is my submission video to get accepted into the hackathon. Red Bull selected 15 students from across the nation to come together for these 48 hours and create amazing music technology.

Red Bull's "Hack the Hits" Hackathon Video Submission

10/15/2017

Human Music Sequencer

My team, "6AM Vibes," consisted of Jaye Sosa, a third-year studying computer science and music technology at New York University; Chris Woodle, a fourth-year studying computer engineering at the Florida Institute of Technology; and myself. Our project was called the "Human Music Sequencer," and our goal was to create a music beat sequencer. Rather than using a touch-screen interface (bottom left) or physical objects (bottom right) as the medium to represent the musical notes, we decided to use people to create the music. We wanted to create a collaborative experience that people of all skill levels could enjoy, so the implementation had to be as seamless and intuitive as possible.

Project Implementation

Our implementation consisted of a Microsoft Kinect camera, a Processing program, and a GUI built with web technologies. The Kinect camera captured each person's position, while the Processing program processed the Kinect video footage, created a 3x4 grid, and displayed the person's head (bottom right). Web technologies were used to create the graphical user interface (GUI, bottom left), which displayed the note being played on the music sequencer and played each note according to the person's position. The Processing program and the web GUI communicated through a local web server. The video below is the demo recorded during judging.
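The core of this pipeline is quantizing a tracked head position onto the 3x4 grid so each occupied cell can be reported to the web GUI as a sequencer step. A minimal sketch of that mapping is below; the frame size, function names, and constants are illustrative assumptions, not the actual hackathon code.

```python
# Sketch (assumed names/values): quantize a tracked head position, in pixels,
# onto the 3x4 sequencer grid described above.

FRAME_W, FRAME_H = 640, 480   # assumed Kinect video frame size
GRID_COLS, GRID_ROWS = 4, 3   # 3x4 grid: 4 beat columns, 3 note rows

def head_to_cell(x, y):
    """Map a head position (x, y) in pixels to a (row, col) grid cell."""
    col = min(int(x / (FRAME_W / GRID_COLS)), GRID_COLS - 1)
    row = min(int(y / (FRAME_H / GRID_ROWS)), GRID_ROWS - 1)
    return row, col

# A person standing near the center of the frame lands in a middle cell:
print(head_to_cell(320, 240))
```

Each detected head yields one active cell per frame; the set of active cells can then be serialized (e.g. as JSON) and sent over the local server for the GUI to light up and sonify.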

Red Bull's "Hack the Hits" Hackathon - Human Music Sequencer

October, 2017

Gallery

These photos were taken by Red Bull's photographer, Ryan Flemming.

Post Hackathon

Hacking at a dream: Third-year Gustavo Correa is bridging two disciplines into one

12/05/2017