A FUNCTIONAL APPROACH TO PARALLELISM AND
DISTRIBUTED PROCESSING IN UNDERGRADUATE
EDUCATION
Dr. Maury Eggen
Trinity University
San Antonio, Texas USA
Outline
• Motivation for the development of MPSCM
• Background
• Methodology
• Educational Ramifications
• Implementation
• Examples
• Higher Level Functions
• Performance
• Summary, Conclusions, Future Research
Motivation
• The future is parallel (David Patterson)
• A complete change is needed in the development of computer technology and program coding
• New coding techniques need to be developed
• New programming languages need to be developed – MPSCM is one such effort
• Beginning students need the fundamentals
• Parallelism must be infused throughout our entire curriculum
Background
• The MPSCM study began two years ago
• The MzScheme dialect was chosen because it provides (see the sketch after this list):
– full R5RS compliance
– TCP/IP networking
– threading
– synchronization
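The slides do not show the MPSCM internals, but as a rough illustration these are the kinds of MzScheme primitives behind the TCP/IP, threading and synchronization features listed above (the port number and the echo behaviour are invented for the example):

;; TCP/IP
(define listener (tcp-listen 5000))            ;; listen on an arbitrary port
(define (serve-one)
  (let-values (((in out) (tcp-accept listener)))
    ;; Threading: handle the connection in its own thread
    (thread (lambda ()
              (let ((msg (read in)))            ;; read one s-expression
                (write (list 'echo msg) out)    ;; echo it back
                (flush-output out))))))

;; Synchronization: a semaphore used as a simple lock
(define lock (make-semaphore 1))
(define (with-lock thunk)
  (semaphore-wait lock)
  (let ((result (thunk)))
    (semaphore-post lock)
    result))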
Methodology
• MPI (Message Passing Interface) is the de facto standard for message passing
• MPSCM has many MPI-like constructs (a rough correspondence is sketched below)
• MPSCM corresponds to a subset of MPI
• MPSCM fits our expectations of what a message passing environment should be
• The focus is on preserving the functional nature of Scheme
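A rough correspondence between core MPI operations and the MPSCM constructs used later in these slides (the send/receive argument order is inferred from those examples; buffers, communicators and status arguments are omitted on the MPI side):

MPI_Init / MPI_Finalize          ~  initialize / finalize
MPI_Comm_rank                    ~  MPSCM-myrank
MPI_Comm_size                    ~  size
MPI_Send(..., dest, tag, ...)    ~  (MPSCM-send dest value tag)
MPI_Recv(..., source, tag, ...)  ~  (MPSCM-recv source tag)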
Education
• Scheme is commonly used in educational environments
• Clear syntax and semantics
• Few constructs, easy to learn, expressive
Education
• MPSCM provides a technology-based educational environment for teaching parallelism, focused on an interface that gives students and educators access to fundamental parallel programming concepts while preserving the ease of use of Scheme
Implementation
• MPSCM is built in three layers (a sketch of the layering follows this list)
– Communication layer
– Evaluation layer
– Top level functions
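The internals of these layers are not shown in the slides; the following is only an illustrative sketch, under the assumption that messages are Scheme s-expressions moved over ports, of how the layers could fit together (comm-send, comm-recv and worker-loop are hypothetical names):

;; communication layer: move s-expressions between processes over ports
(define (comm-send out msg)
  (write msg out)
  (flush-output out))
(define (comm-recv in)
  (read in))

;; evaluation layer: a worker loop that applies a function to each received value
(define (worker-loop in out apply-fn)
  (let ((msg (comm-recv in)))
    (comm-send out (apply-fn msg))
    (worker-loop in out apply-fn)))

;; top level: functions such as MPSCM-map are built on the layers below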
Examples-Simple Scheme
;; sequential factorial
(define fact
  (lambda (n)
    (if (= n 0)
        1
        (* n (fact (- n 1))))))

;; worker snippet: receive a value from the master (rank 0, tag 0),
;; apply f to it, and send the result back to the master with tag 1
(let ((received-val (MPSCM-recv 0 0)))
  (MPSCM-send 0 (f received-val) 1))

;; master snippet: send value to the worker (rank 1, tag 0),
;; then wait for the result (tag 1)
(begin
  (MPSCM-send 1 value 0)
  (MPSCM-recv 1 1))

;; SIMD function: every process runs the same code and branches on its rank
(define mpi-fun
  (lambda (value)
    (if (= MPSCM-myrank 0)
        ;; master
        (begin
          (MPSCM-send 1 value 0)
          (MPSCM-recv 1 1))
        ;; worker: f is the function being applied remotely (e.g. fact)
        (let ((received-val (MPSCM-recv 0 0)))
          (MPSCM-send 0 (f received-val) 1)))))
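For concreteness, a possible way to drive the SIMD function above, assuming f has been bound to the function being applied (that binding is not shown in the slides):

(define f fact)               ;; assumption: f is the function applied by the worker
(define result (mpi-fun 5))
(when (= MPSCM-myrank 0)
  (display result))           ;; the master prints 120; rank 1 did the work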
Higher Level Examples
• (define lst (list 1 2 3 4 5 6 7 8 9))
• map …
• (map fact lst)
  (1 2 6 24 120 720 5040 40320 362880)
• (MPSCM-map fact lst) produces the same result, with the applications of fact distributed across the available processes (one way such a function could be expressed is sketched below)
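The implementation of MPSCM-map is not shown in these slides; the following is only a sketch of how a parallel map could be expressed with the send and receive constructs above, assuming rank 0 is the master and there is exactly one worker per list element (tiny-parallel-map is a hypothetical name):

;; sketch only (not the MPSCM-map source)
(define (tiny-parallel-map f lst)
  (if (= MPSCM-myrank 0)
      (begin
        ;; master: send element i to worker rank i+1 ...
        (let send-loop ((xs lst) (dest 1))
          (unless (null? xs)
            (MPSCM-send dest (car xs) 0)
            (send-loop (cdr xs) (+ dest 1))))
        ;; ... then gather the results back, in order
        (let recv-loop ((dest 1) (k (length lst)))
          (if (zero? k)
              '()
              (cons (MPSCM-recv dest 1)
                    (recv-loop (+ dest 1) (- k 1))))))
      ;; worker: receive one element, apply f, send the result to the master
      (MPSCM-send 0 (f (MPSCM-recv 0 0)) 1)))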
Performance
• The primary goal of MPSCM is not to compete with faster approaches to parallel programming, such as C/MPI, for high performance computing
• Its goal is to provide a highly expressive parallel programming environment in which students new to parallelism can learn and flourish, and it achieves this goal
Summary
• An easy-to-use environment for introducing parallelism and distributed processing
• Many interesting and useful characteristics
• A good starting point for further investigations
• The problem decomposition process is presented clearly in this approach
• The authors believe the approach will pay educational dividends for our students
Future Research
• Considerable parallel processing can be done with just six functions: initialize, finalize, size, rank, send and receive (a minimal sketch follows this list)
• MPI has over 100 functions and more are being added daily
• MPSCM has fewer than 100…
• More and varied higher level functions must be added
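To illustrate the first point, a complete program can be sketched with just those six operations. Only MPSCM-send, MPSCM-recv and MPSCM-myrank appear by name in these slides, so MPSCM-init, MPSCM-size and MPSCM-finalize below are placeholder names for the remaining three:

(MPSCM-init)                           ;; placeholder name: initialize
(if (= MPSCM-myrank 0)                 ;; rank
    ;; master: send each worker its rank and print each squared reply
    (let loop ((dest 1))
      (when (< dest MPSCM-size)        ;; placeholder name: size
        (MPSCM-send dest dest 0)       ;; send
        (display (MPSCM-recv dest 1))  ;; receive
        (newline)
        (loop (+ dest 1))))
    ;; worker: receive a number, square it, send it back to the master
    (let ((n (MPSCM-recv 0 0)))
      (MPSCM-send 0 (* n n) 1)))
(MPSCM-finalize)                       ;; placeholder name: finalize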
Co-Authors
Maury Eggen, Trinity University, San Antonio, Texas USA
[email protected]
Dr. Roger Eggen, University of North Florida, Jacksonville, Florida USA
[email protected]
Alexander Starche, Trinity University, San Antonio, Texas USA
[email protected]