Automated Discrimination Learning and Video-based Analysis of
Decision Making in Zebrafish (Danio rerio)
Bishen Singh¹, Luciano Zu³, Jacqueline Summers¹, Saman Asdjodi¹, Jared Giordano¹, Eric Glasgow² and Jagmeet S. Kanwal¹
¹Dept. of Neurology, ²Dept. of Tumor Biology, Georgetown University Med. Ctr., Washington, D.C.; ³Univ. degli Studi di Roma 'La Sapienza', Roma, Italy
INTRODUCTION
Directed swimming towards a target requires spatial memory, sensory
inputs and motivation. An animal may be trained to perform this task
via associative (classical or operant) conditioning.
In a stimulus-driven, operant conditioning paradigm, we present a pulse of light via LEDs and/or sounds via an underwater transducer. A webcam placed below a glass tank records fish swimming behavior. During operant conditioning, a fish must interrupt a light beam at one location to obtain a small food reward at the same or a different location. The timing-gated interrupt activates robotic-arm and feeder stepper motors via custom software controlling a microcontroller (Arduino). In this way, full automation of stimulus-triggered, place-sensitive conditioning is achieved. Precise multiday scheduling of training, including the timing, location and intensity of stimulus parameters, as well as feeder control, is accomplished via a user-friendly interface.
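The poster describes this control chain only at block level. Purely as a minimal sketch (not the authors' NEMOTRAINER/Ardulink code), the Python snippet below shows how a beam-break event reported by the Arduino over USB serial could trigger the feedback LED and the feeder stepper. The port name, baud rate, and the one-byte command protocol ('B', 'L', 'F') are illustrative assumptions.

```python
# Illustrative sketch only: the actual system drives the Arduino through a
# Java/Ardulink interface. Here we assume hypothetical firmware that sends
# "B\n" on a beam break and accepts one-byte commands 'L' (pulse the LED)
# and 'F' (advance the feeder stepper).
import time

import serial  # pyserial

PORT = "/dev/ttyACM0"    # assumed serial port of the Arduino
BAUD = 9600
REWARD_DELAY_S = 0.5     # user-defined delay between trigger and feeding

def run_operant_session(duration_s=600):
    with serial.Serial(PORT, BAUD, timeout=0.1) as ard:
        time.sleep(2.0)                            # let the board reset after opening
        t_end = time.time() + duration_s
        while time.time() < t_end:
            line = ard.readline().decode(errors="ignore").strip()
            if line == "B":                        # beam interrupted at the target location
                ard.write(b"L")                    # feedback LED pulse
                time.sleep(REWARD_DELAY_S)
                ard.write(b"F")                    # step the feeder motor: drop food
                print(time.strftime("%H:%M:%S"), "reward delivered")

if __name__ == "__main__":
    run_operant_session()
```

In the actual system, the equivalent event handling and reward delay are set through the user interface rather than hard-coded.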
Our training paradigm permits tracking of learning by monitoring
swimming, turning, location and response times of individual fish. This
facilitates comparison of performance within and across a cohort of
animals. We demonstrate the ability of zebrafish to discriminate
complex sounds using the newly developed methodology.
Current methods used for associative conditioning often involve human intervention, which is labor intensive, stressful to animals, and introduces noise in the data. Our flexible paradigm requires only a simple apparatus and minimal human intervention. Our scheduling and control software and apparatus (NEMOTRAINER) can be used to quickly and efficiently screen drugs and to test the effects of CRISPR-based and optogenetic modifications of neural circuits on sensation, locomotion, learning and memory.
RESULTS
METHODS: Setup, Training and Tracking
1 Associative Conditioning
Fig. 6. Box plots and jittered scattergrams showing distances of 6 fish from the correct (target) side after training, in response to the presentation of sounds within a single trial (4). Data were obtained from 1 s before (PRE) and during the second (POST2) and third (POST3) seconds post stimulus. Shorter distances from the target indicate better learning. On average, fish were closer to the target during POST2 and started to wander away during POST3. Data from 2 fish trapped behind dividers are not included in this plot.
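The distance measure summarized in Fig. 6 can be reproduced from frame-by-frame tracks. The sketch below is a minimal illustration, assuming positions are available as a NumPy array of per-frame (x, y) coordinates together with the frame rate, the sound-onset frame, and the x position of the trained side; the array layout and names are assumptions, not the authors' analysis code.

```python
# Sketch of the PRE / POST2 / POST3 distance summary in Fig. 6 (assumed data layout).
import numpy as np

def window_distances(xy, fps, onset_frame, target_x):
    """Mean distance from the target side in 1-s windows around sound onset.

    xy          : (n_frames, 2) array of tracked x, y positions for one fish
    fps         : video frame rate
    onset_frame : frame index of sound onset
    target_x    : x coordinate of the trained (correct) side
    """
    dist = np.abs(xy[:, 0] - target_x)                      # distance along the tank axis
    windows = {
        "PRE":   (onset_frame - fps, onset_frame),          # 1 s before onset
        "POST2": (onset_frame + fps, onset_frame + 2 * fps),  # 2nd second after onset
        "POST3": (onset_frame + 2 * fps, onset_frame + 3 * fps),
    }
    return {name: float(dist[a:b].mean()) for name, (a, b) in windows.items()}

# Example with synthetic data: a fish drifting toward the target after onset.
if __name__ == "__main__":
    fps, onset = 30, 90
    x = np.concatenate([np.full(90, 20.0), np.linspace(20.0, 2.0, 120)])
    xy = np.stack([x, np.zeros_like(x)], axis=1)
    print(window_distances(xy, fps, onset, target_x=0.0))
```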
Training Paradigm:
Fish are trained via a reward-based conditioning paradigm to associate a sound with a particular location (side) of the tank. For operant conditioning, zebrafish trigger motion sensors to obtain a small reward; the sensor is triggered by swimming to the correct side of the tank, which activates release of food into that chamber. For classical conditioning, a small food reward is delivered after a user-specified delay. Each trial extended over 5 days, with 6 possible repetitions (reps.), or chances to obtain food, in each of 6 runs per day (a configuration sketch of this schedule follows the day list):
Day 1: Free run: intermittent reward paradigm
Day 2: Delayed LEDs, alternating
Day 3: Delayed LEDs, randomized
Day 4: Delayed LEDs, randomized
Day 5: Sounds only
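As an illustration only (not NEMOTRAINER's actual configuration format), the 5-day paradigm above maps naturally onto a declarative schedule that the control software can step through; all field names below are assumptions.

```python
# Hypothetical schedule description for one 5-day trial (illustrative names only).
SCHEDULE = [
    {"day": 1, "mode": "free_run",    "reward": "intermittent"},
    {"day": 2, "mode": "delayed_led", "side_order": "alternating"},
    {"day": 3, "mode": "delayed_led", "side_order": "randomized"},
    {"day": 4, "mode": "delayed_led", "side_order": "randomized"},
    {"day": 5, "mode": "sound_only"},
]
RUNS_PER_DAY = 6   # runs per day, as described above
REPS_PER_RUN = 6   # chances to obtain food within each run

for day in SCHEDULE:
    print(f"Day {day['day']}: {day['mode']}")
```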
Fig. 1. Diagrammatic representation of the audiovisual training apparatus. Fish training/testing trials on
stimulus-directed swimming were conducted individually
and in groups within the apparatus.
Fig. 2. Tracking individual fish behavior. Tracks created using idTracker (Pérez-Escudero et al., 2014) from video recordings a) before, and b) during presentation of an upward FM. Tracks begin at locations 1 s pre and terminate 4 s post sound onset, showing fish moving towards the lower partition.
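A minimal sketch of the track windowing used for Fig. 2, assuming the idTracker trajectories have already been converted to a NumPy array of shape (n_frames, n_fish, 2); the conversion step and variable names depend on the idTracker version and are not shown here.

```python
# Sketch of extracting the 1 s pre / 4 s post sound-onset track segments (Fig. 2).
import numpy as np

def onset_window(tracks, fps, onset_frame, pre_s=1, post_s=4):
    """Return the slice of tracks from pre_s before to post_s after sound onset."""
    start = max(onset_frame - pre_s * fps, 0)
    stop = min(onset_frame + post_s * fps, tracks.shape[0])
    return tracks[start:stop]                     # (window_frames, n_fish, 2)

if __name__ == "__main__":
    fps, onset = 30, 300
    tracks = np.random.rand(600, 8, 2) * 100      # fake tracks for 8 fish
    seg = onset_window(tracks, fps, onset)
    print(seg.shape)                              # (150, 8, 2): 1 s pre + 4 s post
```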
METHODS: Software Development & Data Analysis
2 Interface Design
3 Conditioning Paradigm
4 Video-based Analysis
BACKGROUND
Fig. 7. Line plots showing individual variation in learned performance in response to the presentation of upward and downward FMs, as reflected in proximity to the correct (target) side on which the fish were trained. Performance for all 8 fish across 5 trials (n=40) is plotted, with averaged behavior superimposed. UFMs elicited better performance.
• Zebrafish are an excellent model organism for neurological studies.
• Commonly used in genetics, oncology, and developmental biology
• Genome completely sequenced
• Rich expression of innate behavior (e.g., dominant and submissive behavior, food-searching behavior, shoaling)
• Easily maintained and bred under laboratory conditions
• Hypothesis: Using an automated system of LEDs and auditory cues, zebrafish can be trained via classical and operant conditioning to trigger motion sensors and earn a food reward.
SUMMARY
• Goal: To create a fully programmable, user-friendly, low-cost system that can be easily replicated and expanded to allow for training multiple animals in parallel.
METHODS: Materials
• Zebrafish maintenance
• 14:10 light:dark cycle
• Fed daily with brine shrimp & dried flake food
• Habituated to housing 3 days prior to training
• Hardware
• Arduino microcontroller, stepper motors and LEDs.
• Desktop computer, webcam (Logitech d90), Debut video-recording software.
• Circular glass tank, aerator, temperature probe, amplifier, plastic tubing and BNC cables.
• Underwater sound transducer; digital sound files.
1. We demonstrate fully automated training and testing of freely swimming zebrafish within a stimulus-dependent directional memory task.
2. NEMOTRAINER can be used to test the ability of zebrafish to discriminate between different colored lights and frequency modulation (FM) in sounds.
3. Zebrafish are attracted to LEDs and appear to learn upward FMs better than downward FMs.
REFERENCES & SUPPORT
Fig. 3. Screen captures of the user interface (above) and training schedule (below) for monitoring sensors and control of associative conditioning. “Ardulink” (vers. 0.4.2; Zu, 2013), a Java library, allows simultaneous implementation of communication protocols with the Arduino. User-definable settings enable either classical or operant conditioning via customized multi-day scheduling and precise control of stimulus parameters for training.
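The interface itself is built on Ardulink (Java). Purely as an illustration of how user-definable settings could translate into commands on the serial link, the Python sketch below uses an assumed ASCII protocol; the setting names and command strings are hypothetical, not Ardulink's or NEMOTRAINER's actual protocol.

```python
# Hypothetical settings-to-command mapping (not the Ardulink protocol).
import time

import serial  # pyserial

SETTINGS = {
    "paradigm": "operant",    # or "classical"
    "led_side": "left",       # which side's LED is lit on this repetition
    "led_on_ms": 500,         # stimulus duration
    "reward_delay_ms": 1000,  # delay before feeding (classical conditioning)
}

def send_rep(ard, settings):
    """Issue one repetition's worth of commands to the board."""
    ard.write(f"LED {settings['led_side']} {settings['led_on_ms']}\n".encode())
    if settings["paradigm"] == "classical":
        # Classical conditioning: feed after a fixed, user-specified delay.
        time.sleep(settings["reward_delay_ms"] / 1000.0)
        ard.write(b"FEED\n")
    # Operant conditioning: feeding is instead triggered by the fish's own
    # beam-break response (see the event loop sketched in the Introduction).

if __name__ == "__main__":
    with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as ard:
        time.sleep(2.0)       # allow the Arduino to reset after the port opens
        send_rep(ard, SETTINGS)
```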
Fig. 4. Flow chart depicting the algorithm for automated
training. The training procedure assigns user-defined delays for
turning “on” and “off” light and sound or any other type of
stimulus as part of the setup (green). Stimulus repetition (blue)
provides multiple opportunities in close succession within each of
multiple daily runs (pink) for the animal to learn the task.
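Read as code, the flow chart in Fig. 4 amounts to nested loops over daily runs and within-run repetitions, with user-defined delays around each stimulus. The following is a schematic Python reading of that chart; all parameter names and the stubbed hardware calls are assumptions, not the authors' implementation.

```python
# Schematic of the Fig. 4 training loop (illustrative parameter names and stubs).
import time

RUNS_PER_DAY = 6
REPS_PER_RUN = 6
STIM_ON_DELAY_S = 2.0     # user-defined delay before the stimulus turns on
STIM_DURATION_S = 1.0     # how long the LED/sound stays on
INTER_REP_S = 30.0        # gap between repetitions within a run
INTER_RUN_S = 600.0       # gap between runs within a day

def present_stimulus():
    """Stub: LED pulse and/or sound playback via the Arduino."""
    pass

def check_response_and_reward():
    """Stub: poll the sensor and step the feeder if the fish responded."""
    pass

def run_day():
    for run in range(RUNS_PER_DAY):              # daily runs (pink in Fig. 4)
        for rep in range(REPS_PER_RUN):          # stimulus repetitions (blue)
            time.sleep(STIM_ON_DELAY_S)          # user-defined setup delays (green)
            present_stimulus()
            time.sleep(STIM_DURATION_S)
            check_response_and_reward()
            time.sleep(INTER_REP_S)
        time.sleep(INTER_RUN_S)
```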
Fig. 5. Screen captures of single video frames showing the location of free-swimming fish a) pre sound presentation; b) and c) post sound presentation. Collective decision making [5] likely determines the final location of all fish: towards the upper partition for upward FM (UFM) vs. the lower partition for downward FM (DFM).
1. Blaser RE, Vira DG. Experiments on learning in zebrafish (Danio rerio): a promising model of
neurocognitive function. Neurosci Biobehav Rev. 2014 May;42:224–31.
2. Higgs DM, Souza MJ, Wilkins HR, Presson JC, Popper AN. Age- and size-related changes in
the inner ear and hearing ability of the adult zebrafish (Danio rerio). JARO. 2002;3(2):174–84.
3. Manabe K, Dooling RJ, Takaku S. An automated device for appetitive conditioning in zebrafish
(Danio rerio). Zebrafish. 2013 Dec;10(4):518–23.
4. Miller N, Garnier S, Hartnett AT, Couzin ID. Both information and social cohesion determine
collective decisions in animal groups. Proc Natl Acad Sci U S A. 2013 Mar 26;110(13):5263–8.
5. Mueller KP, Neuhauss SC. Automated visual choice discrimination learning in zebrafish (Danio
rerio). J Integr Neurosci. 2012;11(01):73–85.
6. Pérez-Escudero A, Vicente-Page J, Hinz RC, Arganda S, de Polavieja GG. idTracker: tracking
individuals in a group by automatic identification of unmarked animals. Nat Methods. 2014
Jul;11(7):743–8.
ACKNOWLEDGEMENTS: Supported in part by BGRO, Georgetown University.