
FRANK @ LAM
Davide Morelli, University of Pisa
David Plans Casal, University of East Anglia
19 December 2005
What?
Frank is / will be an Open Source framework for evolutionary music composition
Aim
• An extension of Todd and Werner's coevolutionary approach to algorithmic composition/improvisation
• Uses the Pure Data framework and custom C external objects
Future Aim
• Shift focus from the current work on rhythmic/melodic co-evolution to frequency spectra
• Depart from the current MIDI model
Background
• Biles (GenJam), Pachet (Continuator), Rowe (1993), ANN-based (Mozer, 1994), Impett (2001), Todd & Werner ("Frankensteinian Methods", 2001)
Where Frank fits
• Frank attempts to bridge 'musicological' and 'creative' approaches (Miranda)
• Uses co-evolution theory and the concept of musical cultural history within a GA framework
Current work
• Todd's critics implemented so far in Frank:
  • Local transition preference
  • Global transition preference (too CPU-intensive)
  • Surprise preference
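As a sketch of the idea behind the local transition preference critic (in Python rather than Frank's actual C externals; function names here are illustrative, not Frank's API): the critic builds a table of the mother melody's interval transitions and rewards candidates whose intervals are frequent in that table.

```python
from collections import Counter

def transition_table(melody):
    """Count how often each interval (in semitones) occurs in a melody."""
    return Counter(b - a for a, b in zip(melody, melody[1:]))

def local_transition_fitness(candidate, mother):
    """Score a candidate by how probable its intervals are under the
    mother's interval table (local-transition-preference style)."""
    table = transition_table(mother)
    total = sum(table.values())
    if total == 0:
        return 0.0
    probs = [table[b - a] / total for a, b in zip(candidate, candidate[1:])]
    return sum(probs) / len(probs)

# Mother: c d e d c d g c  ->  MIDI 60 62 64 62 60 62 67 60
mother = [60, 62, 64, 62, 60, 62, 67, 60]
repetitive = [60, 62, 60, 62, 60, 62, 60, 62]   # c d c d c d c d
random_ish = [60, 65, 71, 63, 68, 61, 66, 72]   # unrelated intervals

print(local_transition_fitness(repetitive, mother))  # high: reuses c-d steps
print(local_transition_fitness(random_ish, mother))  # low
```

Note how this critic already explains the Stage 1 result below: the melody that endlessly repeats the mother's most common interval wins.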
Stage 1
• Music space: MIDI world (argh)
• Winning phenotype: best fitness
• Using the source melody as mother
• The source melody will be live input
Stage 1 : Genes
• Melody genes are expressed by four tables:
  • chord_note
  • octave
  • passing_note
  • played
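One plausible reading of the four tables (a hedged sketch: the exact semantics in Frank may differ, and the decoding below is an assumption for illustration) is four parallel per-step arrays that together decode into MIDI notes or rests:

```python
# Hypothetical genome: four parallel tables, one entry per melody step.
genome = {
    "chord_note":   [0, 1, 2, 1],   # index into the current chord's notes
    "octave":       [4, 4, 5, 4],   # octave for each step
    "passing_note": [0, 0, 1, 0],   # 1 = alter by a semitone as a passing tone
    "played":       [1, 1, 1, 0],   # 0 = rest, the gene is silent this step
}

def decode(genome, chord=(0, 4, 7)):  # chord degrees in semitones (major triad)
    """Turn the four parallel tables into a list of MIDI notes (None = rest)."""
    steps = []
    for i in range(len(genome["played"])):
        if not genome["played"][i]:
            steps.append(None)               # rest
            continue
        degree = chord[genome["chord_note"][i] % len(chord)]
        note = 12 * (genome["octave"][i] + 1) + degree
        if genome["passing_note"][i]:
            note += 1                        # simple passing-tone alteration
        steps.append(note)
    return steps

print(decode(genome))  # [60, 64, 80, None]
```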
Stage 1 : Results
• Using a prevalence of the local transition preference method (Todd's first method):
  • After > 500 generations, the winning melody overuses the most common transition interval
  • e.g. source melody: c d e d c d g c
  • winner: c d c d c d c d ...
Stage 1 : Results
• Using a prevalence of the surprise preference method:
  • Semi-chaotic output that resembles the source melody in shape, but without context
Stage 2
• Using existing music as context
  • Ligeti's Piano Étude No. 10 (results: difficult to discern fitness right away)
  • Jarrett/Gustavsen/Hancock material (chord recognition is harder)
  • Shostakovich's Prelude No. 2 in A minor (results: simpler musical space, better overall reaction from critics)
    • the same melodic shape repeated (almost)
    • simple tonal context
Stage 2
• Current implementation
  • Wonderful World of C Tunes
• Next step: diversity
• Live implementation
  • Some mother preferences taken from live sampling
  • Negation of silence: NOT the 'Continuator'
Stage 2
• Need better, smarter critics
• Current fitness functions must be optimised
Stage 3 : seriously
• Intervals are important, but Todd's critics miss melodic shape:
  • 1) C D E D B C
  • 2) C E G F B C (shape)
  • 3) C D C D C D (local transition)
  • 4) E D B C D E (global transition)
• 1 and 2 are closer than 1 and 3 or 1 and 4
Stage 3
• Form
  • A single co-evolving GA creates no form
  • A multiple-GA system may solve multiple-gesture evolution
• We need a Form Manager
Stage 3
• Harmony: built an object implementing a short-term memory of played chord sequences: [chords_memory]
• It learns the probabilities of chord transitions while you play
• Once trained, it can be asked for "normal" or "strange" chord sequences
• We can ask things like:
  • "We are in C major tonality and the current chord is D minor: where did I usually go from here?"
  • "Build a walk 3 chords long from F major to A minor in C major tonality, using rarely used chord sequences"
[chords_memory]
• The memory is implemented as a directed graph, where the nodes are the possible chords in a given tonality and the arcs are the transitions from one chord to another
• Each arc has a weight: the probability of that transition in this style
• The weights are updated in real time each time a new chord is added to the memory
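The graph idea can be sketched in a few lines of Python (illustrative only; [chords_memory] is a C external and its messages differ): edges keyed by (from, to) chord pairs, counts incremented as chords arrive, and queries answered by picking the heaviest or lightest outgoing arc.

```python
from collections import defaultdict

class ChordsMemory:
    """Directed graph of chord transitions; arc weights count observed moves."""
    def __init__(self):
        self.weights = defaultdict(int)   # (from_chord, to_chord) -> count
        self.last = None

    def add_chord(self, chord):
        """Feed one chord (relative name, e.g. 'II min'); update arc weight."""
        if self.last is not None:
            self.weights[(self.last, chord)] += 1
        self.last = chord

    def next_from(self, chord, strange=False):
        """Most common (or, with strange=True, rarest observed) next chord."""
        options = {b: w for (a, b), w in self.weights.items() if a == chord}
        if not options:
            return None
        pick = min if strange else max
        return pick(options, key=options.get)

m = ChordsMemory()
for c in ["I maj", "II min", "I maj", "II min", "V maj", "I maj", "II min", "I maj"]:
    m.add_chord(c)

print(m.next_from("II min"))                # 'I maj' (used twice)
print(m.next_from("II min", strange=True))  # 'V maj' (used once)
```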
How it works
• Initially each arc has weight = 0 (transition never played)
• (Diagram: graph nodes I maj, II maj, III maj, I min, II min, etc., with all arcs at weight 0)
How it works
• Let's say we are in C major tonality, the last chord is D min, and we want to add C maj. First we translate them into relative names, II min and I maj, then we increment the arc's weight.
• (Diagram: arc from II min to I maj with weight 1)
How it works
• Next time we are on II min we will know that we used the transition from II min to I maj once.
• (Diagram: arc II min → I maj at weight 1; the other arcs from II min, e.g. to I min, II maj, III maj, still at weight 0)
[chords_memory]
• Current status: stable, usable and used:
  • In an installation at SMC05 (Salerno, November 2005)
  • In a performance at GA05 (Milano, December 2005)
• Needs improvement: modulations
Stage 3
• [chords_memory] was successful!
• Next we will apply the same principle to a rhythm-maker object (and then a melody-maker object)
• GAs have no memory of the played rhythms (nor themes)
• Using graphs we can store enough information to represent all the rhythmic (and thematic) material of a musical piece
[rhythms_memory]
• The idea:
  • Each note of a played rhythm is parsed into simple elements
  • Each rhythm is a linked list of simple elements
  • Each time a rhythm is heard, we match it against the rhythms in memory:
    • If it has some similarity, it is a variation
    • If it has too little similarity, it is a new rhythm
[rhythms_memory]
• This object can perform real-time rhythmic analysis of the played rhythms
• While you play, it builds a memory of your rhythms and labels each rhythm with a tag (a1, a2, b1, etc.)
How it works
• Let's say you first play this rhythm
• It can be expressed as a list of moments when each note starts (in musical notation):
  1. 0/1
  2. 3/16
  3. 3/8
  4. 1/2
  5. 11/16
  6. 3/4
How it works
• Then you play 2 variations of the rhythm:
  • 0/1, 3/16, 3/8, 1/2, 11/16, 3/4, 10/12, 11/12
  • 0/1, 3/16, 1/2, 11/16, 3/4
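A minimal sketch of the variation-vs-new-rhythm decision on these onset lists (Python, with an illustrative Jaccard similarity and a threshold of 0.5; the object's real matching over linked lists of elements may differ):

```python
from fractions import Fraction as F

def similarity(r1, r2):
    """Jaccard similarity between two rhythms given as lists of onset times."""
    a, b = set(r1), set(r2)
    return len(a & b) / len(a | b)

def classify(heard, memory, threshold=0.5):
    """Return the tag of the best-matching stored rhythm if it is similar
    enough (a variation); otherwise None (a new rhythm)."""
    best = max(memory, key=lambda tag: similarity(heard, memory[tag]), default=None)
    if best is not None and similarity(heard, memory[best]) >= threshold:
        return best
    return None

# The rhythm and its two variations from the slides
original = [F(0), F(3, 16), F(3, 8), F(1, 2), F(11, 16), F(3, 4)]
memory = {"a1": original}

var1 = [F(0), F(3, 16), F(3, 8), F(1, 2), F(11, 16), F(3, 4), F(10, 12), F(11, 12)]
var2 = [F(0), F(3, 16), F(1, 2), F(11, 16), F(3, 4)]
new_rhythm = [F(0), F(1, 4), F(1, 2), F(3, 4)]   # hypothetical unrelated rhythm

print(classify(var1, memory))        # 'a1' (similarity 0.75)
print(classify(var2, memory))        # 'a1' (similarity ~0.83)
print(classify(new_rhythm, memory))  # None (similarity ~0.43)
```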
How it works
• The variations can be stored as a graph:
How it works
• The most-played nodes are the nodes that make up the "root rhythm", the "kernel" of this group of variations:
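Sketched on the same onset lists (an illustrative majority rule, not necessarily the object's actual criterion): count how many variations contain each onset, and keep the onsets shared by most of them as the kernel.

```python
from collections import Counter
from fractions import Fraction as F

def root_rhythm(variations, min_share=0.5):
    """Onsets present in more than min_share of the variations = the kernel."""
    counts = Counter(onset for v in variations for onset in set(v))
    cutoff = min_share * len(variations)
    return sorted(t for t, n in counts.items() if n > cutoff)

v0 = [F(0), F(3, 16), F(3, 8), F(1, 2), F(11, 16), F(3, 4)]
v1 = [F(0), F(3, 16), F(3, 8), F(1, 2), F(11, 16), F(3, 4), F(10, 12), F(11, 12)]
v2 = [F(0), F(3, 16), F(1, 2), F(11, 16), F(3, 4)]

print(root_rhythm([v0, v1, v2]))  # recovers the first rhythm as the kernel
```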
[rhythms_memory]
• Current status: freshly implemented, still debugging, not yet usable in a performance
From rhythms to themes
• As soon as [rhythms_memory] is stable and running, we will apply the same principles to a melody-memory object
• Real-time recognition of variations / new themes
Considerations
• GAs are really good at evolving musical material (rhythms, melodies), but they have no memory, so it is not possible to build a "form"
• Graphs are a light and efficient tool for implementing a memory
• Graphs can also tell us how much new material we are introducing: "quantity of information" (this is important when considering form)
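One simple way to read "quantity of information" off such a memory (a hedged sketch with made-up data; the slides do not specify the measure): for each incoming event, report what fraction of its elements the graph has never seen before.

```python
def novelty_stream(events):
    """For each event, the fraction of so-far-unseen material it introduces:
    1.0 = completely new, 0.0 = entirely recycled."""
    seen = set()
    scores = []
    for event in events:
        parts = set(event)
        scores.append(len(parts - seen) / len(parts))
        seen |= parts
    return scores

# Hypothetical stream of rhythms, each a tuple of element labels
stream = [("a", "b", "c"), ("a", "b", "c"), ("a", "b", "d"), ("e", "f", "g")]
print(novelty_stream(stream))  # [1.0, 0.0, ~0.33, 1.0]
```

A form manager could watch this score to decide when the performance is repeating itself and when it is drifting into entirely new material.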
Stage 4: the future
• Finish [rhythms_memory]
• Build [melody_memory]
• Build a gluer: an object that takes two or more short melodies and glues them over a given chord sequence
• Build a form manager: an object that
  • Listens to the form recognised by the graph objects
  • Tells the various GAs or graphs which melody/rhythm/chord to use
Stage 4 : future
• Transition from MIDI to spectra
• Possible marriage of C externals and Ruby/GridFlow or Python (py/pyext)
• Probable use of Maxlib (Footils for Pure Data)
• FFT tables within a 3D matrix as the musical world
• Introduction of culture and agency (multiple GAs in an agent-driven framework)
  • (where 'agent' is a Russell/Norvig extended reflex agent in Ruby Pure Data object transactions)
Stage 4 : future
• Implementation and release of Frank
  • (Already in CVS) Set of Pure Data generic externals and libraries
  • Set of documentation and an open API (FLOSS project)
  • C bindings for Ruby/Python (more approachable for students)
  • Possible implementation of a GridFlow-based API for teaching
  • Inclusion in a Pure Data/Ruby 'music programming' course?
  • SWIG interfaces