CREATING A RESPONSIVE VISUALIZATION THAT REACTS WITH MUSIC IN REAL TIME: INTEGRATING ABLETON LIVE 9 AND CYCLING ’74 MAX FOR LIVE INTO A MUSICAL PERFORMANCE

Persistent Link:
http://hdl.handle.net/10150/612546
Title:
CREATING A RESPONSIVE VISUALIZATION THAT REACTS WITH MUSIC IN REAL TIME: INTEGRATING ABLETON LIVE 9 AND CYCLING ’74 MAX FOR LIVE INTO A MUSICAL PERFORMANCE
Author:
BARSETTI-NERLAND, DANIEL EDWARD
Issue Date:
2016
Publisher:
The University of Arizona.
Rights:
Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Abstract:
Over the past three semesters, I have been refining my Honors Thesis to match exactly what I wanted to study. When deciding what to focus on, it was obvious that I wanted to integrate a musical performance with a visual performance, similar to what a Video Disc Jockey (VDJ) might incorporate into a live set. Currently, there are quite a few percussion pieces that use Max patches to process the sounds, or that use pre-made visuals played along with the music. My goal with this Thesis was to find a way to pair the music with a patch whose visuals react to it in real time. What makes this different from other performances is that each performance will be slightly different from the one before or after it. To me, this is something very exciting about music and technology. Throughout this essay, I will explain how I approached composing a piece specifically for this purpose, as well as my learning process in integrating a Max patch with the music.
Type:
text; Electronic Thesis
Degree Name:
B.M.
Degree Level:
Bachelors
Degree Program:
Honors College; Music Education
Degree Grantor:
University of Arizona
Advisor:
Weinberg, Norman
