Computational Intelligence
Dr. Garrison Greenwood, Dr. George Lendaris and Dr. Richard Tymerski
http://web.cecs.pdx.edu/~greenwd/
http://www.sysc.pdx.edu/faculty/Lendaris/lendaris.html
http://www.ece.pdx.edu/People/Tymerski.html
Research Interests in Swarm Algorithms
Dr. Tymerski works on particle swarm optimization (PSO) and its applications in financial
engineering. PSO is an optimization and problem-solving approach based on swarm
intelligence. The analogy is to a swarm of insects or a school of fish: when one member
finds a desirable direction (e.g., toward food or shelter), the rest of the swarm can
quickly follow from wherever they are. The search is modeled by particles moving in a
multidimensional space. Each particle has a position and a velocity, moves with some
randomness, and communicates its best discoveries to the other particles.
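The ingredients described above (positions, velocities, attraction to each particle's own best point and to the swarm's best point, plus some randomness) can be sketched as a basic global-best PSO. This is a minimal illustration, not the specific algorithms used in Dr. Tymerski's research; the coefficients `w`, `c1`, and `c2` are the standard inertia and attraction weights.

```python
import random

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5):
    """Minimize f over a box using a basic global-best PSO (illustrative sketch)."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # each particle's best-known position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm's best-known position

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()  # randomness of movement
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])  # pull toward own best
                             + c2 * r2 * (gbest[d] - pos[i][d]))    # pull toward swarm best
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:       # particle shares improved knowledge
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

On a simple convex test function such as the sphere function, the swarm converges quickly to the neighborhood of the minimum.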
Evolutionary Computation (genetic algorithms, etc.), fuzzy systems and
neural networks form the computational intelligence area. Active
research explores the design and analysis of complex adaptive
systems, reinforcement learning, intelligent control, robotics and
evolvable hardware. Research is conducted in the Evolvable Systems
Laboratory, Northwest Computational Intelligence Laboratory and in
the Intelligent Robotics Laboratory.
Students interested in studying computational intelligence must
enroll in:
ECE 559 (Genetic Algorithms)
ECE 555 (Neural Networks I)
ECE 556 (Neural Networks II)
ECE 578 (Intelligent Robotics I)
ECE 579 (Intelligent Robotics II)
A good mathematics background in graph theory and probability is also
highly recommended.
Research Interests in Evolutionary Computation
Many real-world optimization problems are difficult to solve, in part,
because they have an exponential number of possible solutions;
exhaustively examining each one to find the best solution is not
feasible. The class of NP-complete problems falls into this category.
We can, however, collect all possible solutions onto a hyper-surface,
where each point corresponds to a unique solution. We can then
design a computer algorithm that explores this hyper-surface,
searching for the globally best solution. Unfortunately, this search
process is not trivial because the hyper-surface often contains
numerous local optima. The only hope of finding a reasonably good
solution is to conduct a heuristic search.
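A random-restart hill climber is perhaps the simplest heuristic search over such a surface, and it makes the local-optima problem concrete: each climb greedily follows improving moves until it gets stuck at some optimum, so the search is repeated from many random starting points. This is an illustrative sketch, not one of the evolutionary algorithms studied here.

```python
import random

def hill_climb(f, x0, step=0.1, iters=1000):
    """Greedy local search: accept only improving moves; halts at a local optimum."""
    x, fx = x0, f(x0)
    for _ in range(iters):
        cand = x + random.uniform(-step, step)
        fc = f(cand)
        if fc < fx:              # minimize: keep the move only if it improves
            x, fx = cand, fc
    return x, fx

def restart_search(f, restarts=20, lo=-10.0, hi=10.0):
    """Heuristic global search: hill climb from many random starting points
    and return the best local optimum found."""
    return min((hill_climb(f, random.uniform(lo, hi)) for _ in range(restarts)),
               key=lambda t: t[1])
```

For example, f(x) = (x^2 - 1)^2 has two global minima at x = -1 and x = +1; a single climb reaches whichever one its starting point leads to, and the restarts ensure a good one is found.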
The field of evolutionary computation (EC) involves the study and
development of algorithms that simulate Darwinian evolution to
conduct a search. Three fundamental algorithms are the genetic
algorithm, the evolution strategy, and evolutionary programming.
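Of the three, the genetic algorithm is the most widely known; a minimal generational GA on bit strings, with tournament selection, one-point crossover, and bit-flip mutation, can be sketched as follows. This is a textbook illustration under assumed parameter values, not a specific algorithm from the research described here.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=50, gens=100,
                      p_cross=0.9, p_mut=0.01):
    """Minimal generational GA maximizing `fitness` over bit strings."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = [(fitness(ind), ind) for ind in pop]

        def tournament():
            # Pick two at random; the fitter one becomes a parent.
            a, b = random.sample(scored, 2)
            return (a if a[0] >= b[0] else b)[1]

        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            if random.random() < p_cross:        # one-point crossover
                cut = random.randrange(1, n_bits)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Bit-flip mutation: each bit flips independently with prob p_mut.
            child = [b ^ (random.random() < p_mut) for b in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

On the classic "OneMax" problem (fitness = number of 1 bits), the population rapidly evolves toward the all-ones string.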
Research Interests in Artificial Neural Networks
Dr. Lendaris's research interests include the development and application
of a massively parallel computation methodology known as neural
networks or connectionist networks. Methodology development focuses
on the idea of matching the structure/architecture of a network to
structural relations in the data of the problem context. This requires
developing a common mechanism for describing both the structure in the
data and the structure of a network so that a matching process is
possible. One approach is based
on a knowledge representation formalism known as conceptual
structures, and another is based on a structure representation formalism
called general systems methodology (GSM) notation. Applications being
pursued include pattern recognition and implementation of selected
database/expert system operations. In the planning stage are control
applications. Future work includes collaboration with other faculty in
developing analog/digital VLSI implementations of neural networks.
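The "massively parallel" character of these networks comes from the uniformity of the computation: every unit performs the same simple operation, a weighted sum of its inputs passed through a nonlinearity, so all units in a layer can run at once. The following forward pass through a small fully connected network is a generic illustration of that idea, not a model from this research program; the layer sizes and sigmoid activation are assumptions for the example.

```python
import math
import random

def forward(x, layers):
    """One pass through a fully connected network.
    `layers` is a list of (weights, biases); each unit computes a weighted
    sum of its inputs followed by a sigmoid. All units in a layer are
    independent, which is what makes the computation massively parallel."""
    a = x
    for W, b in layers:
        a = [1.0 / (1.0 + math.exp(-(sum(w * ai for w, ai in zip(row, a)) + bi)))
             for row, bi in zip(W, b)]
    return a

# Example: a tiny 2-3-1 network with random weights (illustrative only).
random.seed(0)
def rand_layer(n_in, n_out):
    W = [[random.uniform(-1.0, 1.0) for _ in range(n_in)] for _ in range(n_out)]
    return W, [0.0] * n_out

net = [rand_layer(2, 3), rand_layer(3, 1)]
y = forward([0.5, -0.2], net)
```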
Selected Publications in Evolutionary Computation
• G. Greenwood, “Density in accessibility digraphs,” Graph
Theory Notes of New York XLIX, 7-10, 2005
• G. Quan, G. Greenwood and X. Hu, “Searching for
multiobjective preventive maintenance schedules: combining
preferences with evolutionary algorithms,” European Journal of
Operations Research (to appear)
• G. W. Greenwood, “On the practicality of using intrinsic
reconfiguration for fault recovery,” IEEE Transactions on
Evolutionary Computation 9(4), 398-405, 2005
• G. W. Greenwood, “On the usefulness of accessibility graphs
with combinatorial optimization problems,” Journal of
Interdisciplinary Mathematics 8(2), 277-286, 2005
• G. Greenwood, “Intrinsic evolution of safe control strategies
for autonomous spacecraft,” IEEE Transactions on Aerospace &
Electronic Systems, 40(1), 236-246, 2004
• G.W. Greenwood, David Hunter and Edward Ramsden, “Fault
recovery in linear systems via intrinsic evolution,” Proceedings
2004 NASA/DOD Conference on Evolvable Hardware, 115-122,
2004
• G.W. Greenwood, “Differing mathematical perspectives of
genotype space in combinatorial problems: metric spaces vs
pretopological spaces,” Proceedings 2004 Congress on
Evolutionary Computation, 258-264, 2004
• G. W. Greenwood, “Adapting mutations in genetic algorithms
using gene flow principles,” Proceedings 2003 Congress on
Evolutionary Computation, 1392-1397, 2003
Selected Publications in Neural Networks
• R.A. Santiago, G. Lendaris, “Reinforcement Learning and the
Frame Problem,” Proc. IJCNN, 2005.
• L. Holmstrom, R.A. Santiago, “On-Line System Identification
Using Context Discernment,” Proceedings IJCNN, 2005.
• S. Matzner, T.T. Shannon, G. Lendaris, “Learning with
Binary-Valued Utility Using Derivative Adaptive Critic Methods,”
Proceedings IJCNN, 2004.
• G. Lendaris, J. Neidhoefer, “Guidance in the Use of Adaptive
Critics for Control,” Ch. 4 in Handbook of Learning and
Approximate Dynamic Programming, J. Si, A.G. Barto, W.B.
Powell, D. Wunsch, Eds., 97-124, 2004.
• R. Santiago, J. McNames, G. Lendaris, K.J. Burchiel,
“Automated Method for Neuronal Spike Source Identification,”
Neural Networks, Special Issue, 2003.
• G. Lendaris, R.A. Santiago, J. McCarthy, & M.S. Carroll,
“Controller Design via Adaptive Critic and Model Reference
Methods,” Proceedings of IJCNN’03, 2003.
• A.N. Al-Rabadi, G. Lendaris, “Artificial Neural Network
Implementation Using Many-Valued Quantum Computing,”
Proceedings of IJCNN, 2003.
• T.T. Shannon, R.A. Santiago, G. Lendaris, “Accelerated Critic
Learning in Approximate Dynamic Programming via Value
Templates and Perceptual Learning,” Proc. IJCNN, 2003.