Rules for Melanoma
Skin Cancer Diagnosis
Włodzisław Duch, K. Grąbczewski, R. Adamczak, K. Grudziński,
Department of Computer Methods,
Nicholas Copernicus University, Torun, Poland.
http://www.phys.uni.torun.pl/kmk
Zdzisław Hippe
Department of Computer Chemistry and Physical Chemistry
Rzeszów University of Technology,
[email protected]
_____KOSYR 2001______
Content:
Melanoma skin cancer data
5 methods: GTS, SSV, MLP2LN, FSM, SBL,
and their results.
Final comparison of results
Conclusions & future prospects
_____KOSYR 2001______
Skin cancer
Most common skin cancer:
Basal cell carcinoma (rak podstawnokomórkowy)
Squamous cell carcinoma (rak kolczystonabłonkowy)
Melanoma: uncontrolled growth of melanocytes, the
skin cells that produce the skin pigment melanin.
Caused by too much sun exposure and sunburn.
Melanoma accounts for 4% of skin cancers and is the most difficult to
control; 1 in 79 Americans will develop melanoma.
Almost a 2000 percent increase in incidence since 1930.
Survival is now 84%; with early detection, 95%.
_____KOSYR 2001______
Melanoma skin cancer data summary
Collected in the Outpatient Center of Dermatology in
Rzeszów, Poland.
Four diagnostic classes: benign nevus, blue nevus, suspicious, and
malignant.
250 cases, with almost equal class distribution.
Each record in the database has 13 attributes.
TDS (Total Dermatoscopy Score) - single index
26 new test cases.
Goal: understand the data, find simple description.
_____KOSYR 2001______
Melanoma AB attributes
Asymmetry:
symmetric-spot,
1-axial asymmetry,
and 2-axial asymmetry.
Border irregularity:
The edges are ragged,
notched, or blurred.
Integer, from 0 to 8.
_____KOSYR 2001______
Melanoma CD attributes
Color: white, blue, black, red, light
brown, and dark brown; several
colors are possible
simultaneously.
Diversity:
pigment globules, pigment dots,
pigment network,
branched streaks,
structureless areas.
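Together, the ABCD attributes give the 13 features of each record (1 asymmetry value, 1 border score, 6 color flags, 5 diversity flags). As an illustration only, with hypothetical field names rather than the actual column names of the Rzeszów database, one case might be encoded like this:

```python
# Hypothetical encoding of a single lesion record (field names are illustrative,
# not the actual column names of the Rzeszow database).
case = {
    "asymmetry": 1,      # 0 = symmetric spot, 1 = 1-axial, 2 = 2-axial asymmetry
    "border": 5,         # border irregularity score, integer 0..8
    # six color flags; several colors may be present simultaneously
    "colors": {"white": 0, "blue": 1, "black": 0,
               "red": 0, "light_brown": 1, "dark_brown": 1},
    # five structural diversity flags
    "diversities": {"globules": 1, "dots": 1, "network": 0,
                    "branched_streaks": 0, "structureless_areas": 1},
}
```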
_____KOSYR 2001______
Melanoma TDS index
Combine ABCD attributes to form one index:
TDS index ABCD formula:
TDS = 1.3 Asymmetry + 0.1 Border +
0.5 Σ{Colors} + 0.5 Σ{Diversities}
Coefficients from statistical analysis.
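A minimal sketch of the TDS computation, assuming the colors and diversities enter as counts of the 0/1 flags sketched above (the function name is mine):

```python
def total_dermatoscopy_score(asymmetry, border, n_colors, n_diversities):
    """TDS = 1.3*Asymmetry + 0.1*Border + 0.5*sum(Colors) + 0.5*sum(Diversities);
    n_colors and n_diversities are counts (sums of the 0/1 flags)."""
    return 1.3 * asymmetry + 0.1 * border + 0.5 * n_colors + 0.5 * n_diversities

# Example: 1-axial asymmetry, border score 5, 3 colors, 3 diversity structures:
# 1.3*1 + 0.1*5 + 0.5*3 + 0.5*3 = 4.8
print(total_dermatoscopy_score(1, 5, n_colors=3, n_diversities=3))
```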
_____KOSYR 2001______
Remarks on testing
Test: only 26 cases for 4 classes.
Expected statistical accuracy is estimated on the 276
training + test cases with 10-fold cross-validation.
Not done with most methods!
Risk matrices are desirable:
identifying a Blue nevus instead of a Benign nevus
carries no risk, but confusion with a malignant case carries great risk.
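Such a risk matrix could be sketched as follows; all cost values below are hypothetical placeholders, since the talk does not give actual risk weights:

```python
import numpy as np

classes = ["Benign-nevus", "Blue-nevus", "Suspicious", "Malignant"]

# Hypothetical misclassification costs (rows: true class, columns: predicted class).
# Confusing Blue nevus with Benign nevus costs nothing; missing a Malignant case costs most.
risk = np.array([
    [0, 0, 1, 2],   # true Benign-nevus
    [0, 0, 1, 2],   # true Blue-nevus
    [1, 1, 0, 1],   # true Suspicious
    [5, 5, 3, 0],   # true Malignant
])

def average_risk(confusion, risk):
    """Mean cost per case, given a confusion matrix in the same class order."""
    return float((confusion * risk).sum() / confusion.sum())
```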
_____KOSYR 2001______
Methods used: GTS
GTS covering algorithm (Hippe, 1997) + recursive
reduction of the number of decision rules.
Interactive: the user guides the development of the
learning model.
Selection of the combination of attributes generating the
learning model is based on Frequency and Ranking.
GTS makes it possible to create many different sets of rules.
In complex situations it may be rather difficult to use.
_____KOSYR 2001______
GTS results.
GTS generated a large number (198) of rules.
Experimentation allowed the important attributes to be identified.
Various sets of decision rules were generated:
TDS & C-blue & Asymmetry & Border (4 attributes,
based on the experience of medical doctors)
TDS & C-blue & D-structureless-areas (3 attributes)
TDS & C-Blue (2 attributes)
TDS (1 attribute) - poor results.
Models with 2-4 attributes give 81-85% accuracy.
Combining and generalizing these rules allowed the 4 best
simplified rules to be selected.
Overall: 6 errors on training, 0 errors on test set.
_____KOSYR 2001______
Methods used: SSV
Decision tree (Grąbczewski, Duch 1999)
Based on a separability criterion: maximize the
separability index over split values of a continuous
attribute or over subsets of discrete values.
Easily converted into a set of crisp logical rules.
Pruning used to ensure the simplest set of rules that
generalize well.
Fully automatic and very efficient; cross-validation tests
provide an estimate of statistical accuracy.
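A rough sketch of the split-selection idea behind the separability criterion: for a continuous attribute, prefer the threshold that separates as many pairs of different-class vectors as possible. This is a simplified surrogate only; the actual SSV index also handles subsets of discrete values and penalizes splitting same-class pairs:

```python
def separated_pairs(values, labels, threshold):
    """Count pairs of different-class samples that the threshold splits apart.
    Simplified surrogate for the SSV separability index."""
    left = [y for x, y in zip(values, labels) if x < threshold]
    right = [y for x, y in zip(values, labels) if x >= threshold]
    return sum(1 for a in left for b in right if a != b)

def best_split(values, labels):
    """Try midpoints between consecutive distinct values, keep the best one."""
    xs = sorted(set(values))
    candidates = [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    return max(candidates, key=lambda t: separated_pairs(values, labels, t))

# Example: TDS values with class labels; the best split is 5.0
print(best_split([4.1, 4.6, 4.8, 5.2, 5.6, 6.0],
                 ["Benign", "Benign", "Benign", "Suspicious", "Malignant", "Malignant"]))
```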
_____KOSYR 2001______
SSV results
Pruning degree is the only user-defined parameter.
Finds TDS, C-BLUE as most important.
Rules are easy to understand:
IF TDS < 4.85 AND C-BLUE is absent => Benign-nevus
IF TDS < 4.85 AND C-BLUE is present => Blue-nevus
IF 4.85 < TDS < 5.45 => Suspicious
IF TDS ≥ 5.45 => Malignant
98% accuracy on training, 100% test.
The 5 errors are vector pairs from classes C1/C2 with identical TDS & C-BLUE values.
10xCV on all data: 97.5±0.3%
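The four rules above translate directly into code; a minimal sketch, where the handling of cases exactly at 4.85 and 5.45 is my assumption:

```python
def classify(tds, c_blue_present):
    """Crisp SSV rules: thresholds 4.85 and 5.45 on TDS plus the blue-color flag."""
    if tds < 4.85:
        return "Blue-nevus" if c_blue_present else "Benign-nevus"
    elif tds < 5.45:
        return "Suspicious"
    return "Malignant"

print(classify(4.8, c_blue_present=False))   # -> Benign-nevus
print(classify(5.6, c_blue_present=True))    # -> Malignant
```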
_____KOSYR 2001______
Methods used: MLP2LN
Constructive constrained MLP algorithm, 0, ±1 weights
at the end of training.
MLP is converted into LN, network performing logical
function (Duch, Adamczak, Grąbczewski 1996)
Network function is written as a set of crisp logical rules.
Automatic determination of crisp and fuzzy "soft-trapezoidal" membership functions.
Tradeoff: simplicity vs. accuracy explored.
Tradeoff: confidence vs. rejection rate explored.
Almost fully automatic algorithm.
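The 0, ±1 constraint is usually obtained by adding a regularization penalty whose minima lie exactly at those values; the following is a sketch of that idea only, with made-up coefficients, not necessarily the exact cost function used in the cited work:

```python
import numpy as np

def mlp2ln_penalty(weights, lam1=1e-4, lam2=1e-3):
    """Regularizer pushing weights toward 0 and +/-1: the factor
    w^2 * (w-1)^2 * (w+1)^2 has its minima exactly at w in {-1, 0, +1}.
    The coefficients lam1, lam2 are made-up placeholders."""
    w = np.asarray(weights)
    return lam1 * np.sum(w**2) + lam2 * np.sum(w**2 * (w - 1)**2 * (w + 1)**2)

# Weights already at 0 or +/-1 incur only the small lam1 term;
# other weights (here 0.4) are pushed toward those values during training.
print(mlp2ln_penalty([0.0, 1.0, -1.0, 0.4]))
```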
_____KOSYR 2001______
MLP2LN results
Rules very similar to those found by SSV.
Confusion matrix (rows: calculated class, columns: original class):

               Benign-nevus   Blue-nevus   Malignant   Suspicious
Benign-nevus        62              5           0            0
Blue-nevus           0             59           0            0
Malignant            0              0          62            0
Suspicious           0              0           0           62
_____KOSYR 2001______
Methods used: FSM
Feature-Space Mapping (Duch 1994)
FSM estimates probability density of training data.
Neuro-fuzzy system, based on separable transfer
functions.
Constructive learning algorithm with feature selection
and network pruning.
Each transfer function component is a context-dependent membership function.
Crisp logic rules from rectangular functions.
Trapezoidal, triangular, Gaussian f. for fuzzy logic rules.
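Separability of the transfer functions means that a node's output factorizes into one membership value per feature, so a node with rectangular factors is literally a crisp rule. A minimal sketch of the function shapes only (the thresholds in the example are placeholders; FSM's training procedure is not shown):

```python
import numpy as np

def rectangular(x, lo, hi):
    """Crisp membership: 1 inside [lo, hi], 0 outside -> one premise of a C-rule."""
    return float(lo <= x <= hi)

def gaussian(x, center, width):
    """Fuzzy membership -> one premise of an F-rule."""
    return float(np.exp(-((x - center) / width) ** 2))

def node_activation(x, memberships):
    """Separable transfer function: product of one membership value per feature.
    `memberships` is a list of (function, parameters) pairs, one per feature of x."""
    out = 1.0
    for xi, (f, params) in zip(x, memberships):
        out *= f(xi, *params)
    return out

# Example node encoding the crisp condition "4.85 <= TDS <= 5.45 AND 0 <= asymmetry <= 2":
rule_node = [(rectangular, (4.85, 5.45)), (rectangular, (0, 2))]
print(node_activation([5.1, 1], rule_node))   # -> 1.0
```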
_____KOSYR 2001______
FSM results
Rectangular functions used for C-rules.
7 nodes (rules) created on average.
10xCV accuracy on training 95.5±1.0%, test 100%.
Committee of 20 FSM networks: 95.5±1.1%, test 92.6%.
F-rules, Gaussian membership functions:
15 fuzzy rules, lower accuracy.
The simplest solution should be strongly preferred.
_____KOSYR 2001______
Methods used: SBL
Similarity-Based Methods (SBM): many models based on
the evaluation of similarity.
Similarity-Based Learner (SBL): a software
implementation of SBM.
Various extensions of the k-nearest-neighbor algorithm.
S-rules, more general than C-rules and F-rules.
Small number of prototype cases used to explain the
data class structure.
_____KOSYR 2001______
SBL results
SBL optimized by performing 10xCV on the training set.
Manhattan distance, feature selection: TDS & C_Blue
97.4 ± 0.3% on training, 100% test.
S-rules of the form:
IF (X ∼ Pi) THEN C(X) = C(Pi), for example:
IF (|TDS(X) − TDS(Pi)| + |C_blue(X) − C_blue(Pi)|) < T(Pi)
THEN C(X) = C(Pi)
Prototype selection left 13 vectors (7 for the Benign-nevus
class, 2 for each of the other classes).
97.5% or 6 errors on training (237 vectors), 100% test
7 prototypes: 91.4% training (243 vectors), 100% test
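Read as code, the S-rule above is nearest-prototype classification with a per-prototype threshold; a minimal sketch under that reading (the prototype values and thresholds below are placeholders, not the 13 selected vectors):

```python
def manhattan(x, p, features=("TDS", "C_blue")):
    """Manhattan distance over the two selected features."""
    return sum(abs(x[f] - p[f]) for f in features)

# Placeholder prototypes; the real model kept 13 selected training vectors.
prototypes = [
    {"TDS": 4.2, "C_blue": 0, "threshold": 0.6, "label": "Benign-nevus"},
    {"TDS": 4.3, "C_blue": 1, "threshold": 0.6, "label": "Blue-nevus"},
    {"TDS": 5.1, "C_blue": 0, "threshold": 0.3, "label": "Suspicious"},
    {"TDS": 6.0, "C_blue": 0, "threshold": 0.6, "label": "Malignant"},
]

def s_rule_classify(x):
    """Apply IF dist(X, Pi) < T(Pi) THEN C(X) = C(Pi), nearest prototype first."""
    for p in sorted(prototypes, key=lambda p: manhattan(x, p)):
        if manhattan(x, p) < p["threshold"]:
            return p["label"]
    return None   # no prototype close enough

print(s_rule_classify({"TDS": 4.3, "C_blue": 1}))   # -> Blue-nevus
```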
_____KOSYR 2001______
Results - comparison
Method                          Rules   Training %   Test %
SSV Tree, crisp rules               4   97.5±0.3     100
MLP2LN, crisp rules                 4   98.0 all     100
GTS - final simplified              4   97.6 all     100
FSM, rectangular f.                 7   95.5±1.0     100±0.0
knn + prototype selection          13   97.5±0.0     100
FSM, Gaussian f.                   15   93.7±1.0     95±3.6
GTS initial rules                 198   85 all       84.6
knn k=1, Manhattan, 2 features    250   97.4±0.3     100
LERS, weighted rules               21   --           96.2
_____KOSYR 2001______
Conclusions:
TDS - most important; Color-blue second.
Without TDS - many rules.
Optimize TDS: automatic aggregation of features,
e.g. a 2-layer neural network.
Very simple and reliable rules have been found.
S-rules are being improved - prototypes obtained
from learning instead of selection.
The database is expanding; non-cancer data are needed.
_____KOSYR 2001______