External Advisory Board Annual Meeting January 13, 2006


08.04.06
Sampling with Self-Organizing Feature Maps
Ignacio Zendejas
Jennifer Wang
More on Neural Networks
• Supervised Networks
– Train for desired outputs
• Unsupervised Networks
– Find patterns and relationships between inputs
– e.g. Kohonen’s Self-Organizing Maps
• Retain topology while classifying input vectors
SOM
• Two modes of operation
– Training process
• Input vectors presented
• Network self-organizes
• Map is built
– Mapping process
• Input vector placed + classified
• One winning neuron (closest)
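The two modes above (training, then mapping with one winning neuron) can be sketched in a few lines. The slides use Matlab's Neural Network Toolbox; the following is a language-neutral NumPy sketch of a 1-D Kohonen SOM, where the function names and the learning-rate/neighborhood choices are illustrative assumptions, not the presenters' settings.

```python
import numpy as np

def train_som(data, n_neurons=10, cycles=100, lr0=0.5, seed=0):
    """Training mode: present input vectors, let the network
    self-organize, and build the map (a 1-D Kohonen SOM here)."""
    rng = np.random.default_rng(seed)
    # Initialize neuron weight vectors randomly within the data range.
    weights = rng.uniform(data.min(axis=0), data.max(axis=0),
                          (n_neurons, data.shape[1]))
    for t in range(cycles):
        lr = lr0 * (1 - t / cycles)                  # decaying learning rate
        radius = max(1.0, (n_neurons / 2) * (1 - t / cycles))  # shrinking neighborhood
        for x in data[rng.permutation(len(data))]:
            # Winner: the neuron whose weights are closest to the input.
            winner = np.argmin(np.linalg.norm(weights - x, axis=1))
            grid_dist = np.abs(np.arange(n_neurons) - winner)
            h = np.exp(-(grid_dist / radius) ** 2)   # neighborhood function
            # Pull the winner (and, more weakly, its grid neighbors)
            # toward x: this is what retains topology while classifying.
            weights += lr * h[:, None] * (x - weights)
    return weights

def map_input(weights, x):
    """Mapping mode: place + classify an input vector;
    exactly one winning neuron (the closest) is returned."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))
```

After training on clustered data, inputs from well-separated clusters map to different winning neurons.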
Why Neural Networks?
• Good for pattern recognition → find where
to sample
• A different approach that can lead to different results
• No need for statistical assumptions
– e.g., Gaussian distributions or linearity
• Implemented in Matlab!
Neural Networks and Matlab!
• It’s built in
• Use the Neural Network Toolbox in Matlab
• Several different training algorithms
and learning functions (incl. Kohonen)
• Create and train networks in Matlab
pH Interpolation vs. SOM + Voronoi
Temp. Interpolation vs. SOM + Voronoi
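The "SOM + Voronoi" maps above can be formed by assigning each sample point to its nearest neuron after training; those assignments are the Voronoi sets discussed below. A hedged sketch, where `voronoi_sets` is a hypothetical helper rather than the presenters' code:

```python
import numpy as np

def voronoi_sets(weights, points):
    """Group sample points into Voronoi sets: each point joins the
    cell of its nearest neuron, i.e. the winner from the mapping step."""
    # Pairwise distances, shape (n_points, n_neurons).
    d = np.linalg.norm(points[:, None, :] - weights[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    return {k: points[labels == k] for k in range(len(weights))}
```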
Sparse vs. Dense
• Voronoi sets (the big clusters) are very similar after only 100 cycles
100 vs. 1000 Cycles
Time Constraints
• Time required for learning is directly
proportional to
– input size
– number of neurons (dimension has no effect)
– number of cycles
• i.e., doubling any one of these parameters
at most doubles the training time
• 1000 cycles, 10 neurons, and 263 rows:
– ~2.5 min on a 3 GHz Pentium 4 processor
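The linear scaling claimed above follows from a simple cost count: one cycle presents every input row and updates every neuron, so total work ≈ cycles × rows × neurons. A back-of-envelope model (the `training_cost` helper is hypothetical, not from the slides):

```python
def training_cost(cycles, n_rows, n_neurons):
    """One cycle presents every input row and updates every neuron,
    so work grows linearly (and independently) in each parameter;
    the dimension of each vector only changes the constant factor."""
    return cycles * n_rows * n_neurons
```

Doubling any single parameter exactly doubles this count, consistent with the "at most double the time" claim; the quoted run corresponds to `training_cost(1000, 263, 10)` elementary updates.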
What Next?
• Optimize the results using published
techniques
• Process temporal data
– Detect or predict events, not just clusters
• Implement sensor failure detection and data
reparation for improved robustness