Exploration server on opera

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.


Internal identifier: 000300 (PascalFrancis/Checkpoint); previous: 000299; next: 000301

Unifying performer and accompaniment

Author: Lars Graugaard [Denmark]

Source:

RBID: Pascal:08-0036216

French descriptors

English descriptors

Abstract

A unique real time system for correlating a vocal, musical performance to an electronic accompaniment is presented. The system has been implemented and tested extensively in performance in the author's opera 'La Quintrala', and experience with its use in practice is presented. Furthermore, the system's functionality is outlined, it is put into current research perspective, and its possibilities for further development and other uses are discussed. The system correlates voice analysis to an underlying chord structure, stored in computer memory. This chord structure defines the primary supportive pitches, and links the notated and electronic score together, addressing the needs of the singer for tonal indicators at any given moment. A computer-generated note is initiated by a combination of the singer - by the onset of a note, or by some element in the continuous spectrum of the singing - and the computer through an accompaniment algorithm. The evolution of this relationship between singer and computer is predefined in the application according to the structural intentions of the score, and is affected by the musical and expressive efforts of the singer. The combination of singer and computer influencing the execution of the accompaniment creates a dynamic, musical interplay between singer and computer, and is a very fertile musical area for a composer's combined computer programming and score writing.
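
The abstract describes, at a high level, how the accompaniment is driven: voice analysis is matched against a chord structure stored in memory, and a note onset from the singer (or a feature of the vocal spectrum) triggers a computer-generated note chosen by an accompaniment algorithm. As a purely illustrative aid, the minimal Python sketch below shows one way such a mapping could look; it is not the author's implementation, and every name in it (ChordEvent, CHORD_STRUCTURE, accompaniment_note, the pitch values) is a hypothetical placeholder.

# Minimal, hypothetical sketch of the idea described in the abstract:
# a stored chord structure supplies "primary supportive pitches", and a
# detected note onset from the singer triggers an accompaniment note.
from dataclasses import dataclass
from typing import List

@dataclass
class ChordEvent:
    position: float      # score position (in beats) where the chord becomes active
    pitches: List[int]   # supportive pitches as MIDI note numbers

# Hypothetical chord structure kept "in computer memory".
CHORD_STRUCTURE = [
    ChordEvent(0.0, [48, 55, 64]),   # C3, G3, E4
    ChordEvent(4.0, [45, 52, 60]),   # A2, E3, C4
    ChordEvent(8.0, [50, 57, 65]),   # D3, A3, F4
]

def current_chord(position: float) -> ChordEvent:
    # Return the last chord whose position has been reached.
    active = CHORD_STRUCTURE[0]
    for event in CHORD_STRUCTURE:
        if event.position <= position:
            active = event
    return active

def accompaniment_note(sung_midi_pitch: float, position: float) -> int:
    # One possible rule: pick the supportive pitch closest to the sung pitch.
    chord = current_chord(position)
    return min(chord.pitches, key=lambda p: abs(p - sung_midi_pitch))

if __name__ == "__main__":
    # A note onset detected at beat 5.0 with an estimated pitch near E4 (64).
    print(accompaniment_note(64.2, 5.0))   # -> 60 (C4 from the second chord)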


The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Unifying performer and accompaniment</title>
<author>
<name sortKey="Graugaard, Lars" sort="Graugaard, Lars" uniqKey="Graugaard L" first="Lars" last="Graugaard">Lars Graugaard</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Aalborg University Esbjerg, Department of Software and Media Technology, Niels Bohrs Vej 6</s1>
<s2>6700 Esbjerg</s2>
<s3>DNK</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
<country>Danemark</country>
<wicri:noRegion>6700 Esbjerg</wicri:noRegion>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">08-0036216</idno>
<date when="2006">2006</date>
<idno type="stanalyst">PASCAL 08-0036216 INIST</idno>
<idno type="RBID">Pascal:08-0036216</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000288</idno>
<idno type="wicri:Area/PascalFrancis/Curation">000248</idno>
<idno type="wicri:Area/PascalFrancis/Checkpoint">000300</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Unifying performer and accompaniment</title>
<author>
<name sortKey="Graugaard, Lars" sort="Graugaard, Lars" uniqKey="Graugaard L" first="Lars" last="Graugaard">Lars Graugaard</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Aalborg University Esbjerg, Department of Software and Media Technology, Niels Bohrs Vej 6</s1>
<s2>6700 Esbjerg</s2>
<s3>DNK</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
<country>Danemark</country>
<wicri:noRegion>6700 Esbjerg</wicri:noRegion>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
<imprint>
<date when="2006">2006</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Audio acoustics</term>
<term>Computer structure</term>
<term>Experimental study</term>
<term>Hearing</term>
<term>Information retrieval</term>
<term>Information system</term>
<term>Intention</term>
<term>Music</term>
<term>Musical acoustics</term>
<term>Pitch(acoustics)</term>
<term>Real time system</term>
<term>Singer</term>
<term>Voice</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Acoustique audio</term>
<term>Recherche information</term>
<term>Musique</term>
<term>Système temps réel</term>
<term>Système information</term>
<term>Audition</term>
<term>Tonie</term>
<term>Chanteur</term>
<term>Acoustique musicale</term>
<term>Voix</term>
<term>Structure ordinateur</term>
<term>Intention</term>
<term>Etude expérimentale</term>
</keywords>
<keywords scheme="Wicri" type="topic" xml:lang="fr">
<term>Musique</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">A unique real time system for correlating a vocal, musical performance to an electronic accompaniment is presented. The system has been implemented and tested extensively in performance in the author's opera 'La Quintrala', and experience with its use in practice is presented. Furthermore, the system's functionality is outlined, it is put into current research perspective, and its possibilities for further development and other usages is discussed. The system correlates voice analysis to an underlying chord structure, stored in computer memory. This chord structure defines the primary supportive pitches, and links the notated and electronic score together, addressing the needs of the singer for tonal indicators' at any given moment. A computer-generated note is initiated by a combination of the singer - by the onset of a note, or by some element in the continuous spectrum of the singing - and the computer through an accompaniment algorithm. The evolution of this relationship between singer and computer is predefined in the application according to the structural intentions of the score, and is affected by the musical and expressive efforts of the singer. The combination of singer and computer influencing the execution of the accompaniment creates a dynamic, musical interplay between singer and computer, and is a very fertile musical area for a composer's combined computer programming and score writing.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0302-9743</s0>
</fA01>
<fA05>
<s2>3902</s2>
</fA05>
<fA08 i1="01" i2="1" l="ENG">
<s1>Unifying performer and accompaniment</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>Computer music modeling and retrieval : Texte imprimé : Third international symposium, CMMR 2005, Pisa, Italy, September 26-28, 2005 : revised papers</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>GRAUGAARD (Lars)</s1>
</fA11>
<fA12 i1="01" i2="1">
<s1>KRONLAND-MARTINET (Richard)</s1>
<s9>ed.</s9>
</fA12>
<fA12 i1="02" i2="1">
<s1>VOINIER (Thierry)</s1>
<s9>ed.</s9>
</fA12>
<fA12 i1="03" i2="1">
<s1>YSTAD (Sølvi)</s1>
<s9>ed.</s9>
</fA12>
<fA14 i1="01">
<s1>Aalborg University Esbjerg, Department of Software and Media Technology, Niels Bohrs Vej 6</s1>
<s2>6700 Esbjerg</s2>
<s3>DNK</s3>
<sZ>1 aut.</sZ>
</fA14>
<fA20>
<s1>169-184</s1>
</fA20>
<fA21>
<s1>2006</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA26 i1="01">
<s0>3-540-34027-0</s0>
</fA26>
<fA43 i1="01">
<s1>INIST</s1>
<s2>16343</s2>
<s5>354000153589310160</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2008 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>34 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>08-0036216</s0>
</fA47>
<fA60>
<s1>P</s1>
<s2>C</s2>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>Lecture notes in computer science</s0>
</fA64>
<fA66 i1="01">
<s0>DEU</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>A unique real time system for correlating a vocal, musical performance to an electronic accompaniment is presented. The system has been implemented and tested extensively in performance in the author's opera 'La Quintrala', and experience with its use in practice is presented. Furthermore, the system's functionality is outlined, it is put into current research perspective, and its possibilities for further development and other usages is discussed. The system correlates voice analysis to an underlying chord structure, stored in computer memory. This chord structure defines the primary supportive pitches, and links the notated and electronic score together, addressing the needs of the singer for tonal indicators' at any given moment. A computer-generated note is initiated by a combination of the singer - by the onset of a note, or by some element in the continuous spectrum of the singing - and the computer through an accompaniment algorithm. The evolution of this relationship between singer and computer is predefined in the application according to the structural intentions of the score, and is affected by the musical and expressive efforts of the singer. The combination of singer and computer influencing the execution of the accompaniment creates a dynamic, musical interplay between singer and computer, and is a very fertile musical area for a composer's combined computer programming and score writing.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001B40C75</s0>
</fC02>
<fC02 i1="02" i2="X">
<s0>001D02B04</s0>
</fC02>
<fC02 i1="03" i2="X">
<s0>001B40C38</s0>
</fC02>
<fC03 i1="01" i2="3" l="FRE">
<s0>Acoustique audio</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="3" l="ENG">
<s0>Audio acoustics</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Recherche information</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Information retrieval</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Búsqueda información</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Musique</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Music</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Música</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Système temps réel</s0>
<s5>06</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Real time system</s0>
<s5>06</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Sistema tiempo real</s0>
<s5>06</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE">
<s0>Système information</s0>
<s5>07</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG">
<s0>Information system</s0>
<s5>07</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA">
<s0>Sistema información</s0>
<s5>07</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Audition</s0>
<s5>08</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Hearing</s0>
<s5>08</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Audición</s0>
<s5>08</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Tonie</s0>
<s5>09</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Pitch(acoustics)</s0>
<s5>09</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Altura sonida</s0>
<s5>09</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Chanteur</s0>
<s5>10</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Singer</s0>
<s5>10</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Cantor</s0>
<s5>10</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Acoustique musicale</s0>
<s5>18</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Musical acoustics</s0>
<s5>18</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Acústica musical</s0>
<s5>18</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Voix</s0>
<s5>19</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Voice</s0>
<s5>19</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Voz</s0>
<s5>19</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Structure ordinateur</s0>
<s5>20</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Computer structure</s0>
<s5>20</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Estructura computadora</s0>
<s5>20</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE">
<s0>Intention</s0>
<s5>21</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG">
<s0>Intention</s0>
<s5>21</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA">
<s0>Intencíon</s0>
<s5>21</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE">
<s0>Etude expérimentale</s0>
<s5>33</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG">
<s0>Experimental study</s0>
<s5>33</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA">
<s0>Estudio experimental</s0>
<s5>33</s5>
</fC03>
<fN21>
<s1>052</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
<pR>
<fA30 i1="01" i2="1" l="ENG">
<s1>International Symposium on Computer Music Modeling and Retrieval</s1>
<s2>3</s2>
<s3>Pisa ITA</s3>
<s4>2005</s4>
</fA30>
</pR>
</standard>
</inist>
<affiliations>
<list>
<country>
<li>Danemark</li>
</country>
</list>
<tree>
<country name="Danemark">
<noRegion>
<name sortKey="Graugaard, Lars" sort="Graugaard, Lars" uniqKey="Graugaard L" first="Lars" last="Graugaard">Lars Graugaard</name>
</noRegion>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Wicri/Musique/explor/OperaV1/Data/PascalFrancis/Checkpoint
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000300 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Checkpoint/biblio.hfd -nk 000300 | SxmlIndent | more
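
For quick checks of a single record in a script rather than through SxmlIndent, the exported XML can also be inspected with Python's standard library (3.8+). The sketch below is only an illustration: it assumes the record has been saved to a file (the name 000300.xml is hypothetical) and that the export declares the inist and wicri namespace prefixes, which the rendering above omits; without those declarations the parser would reject the unbound prefixes.

import xml.etree.ElementTree as ET

# Parse one exported record (hypothetical file name).
root = ET.parse("000300.xml").getroot()

# Article title and author from the TEI header ({*} matches any or no namespace).
title = root.find(".//{*}titleStmt/{*}title")
author = root.find(".//{*}titleStmt/{*}author/{*}name")
print("Title :", title.text if title is not None else "?")
print("Author:", author.text if author is not None else "?")

# English descriptors (the keywords block with scheme="KwdEn").
for term in root.findall(".//{*}keywords[@scheme='KwdEn']/{*}term"):
    print("Descriptor:", term.text)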

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Wicri/Musique
   |area=    OperaV1
   |flux=    PascalFrancis
   |étape=   Checkpoint
   |type=    RBID
   |clé=     Pascal:08-0036216
   |texte=   Unifying performer and accompaniment
}}

Wicri

This area was generated with Dilib version V0.6.21.
Data generation: Thu Apr 14 14:59:05 2016. Site generation: Thu Jan 4 23:09:23 2024