L643: Evaluation of Information Systems
Semantic Web Application: Music Retrieval
Ying Ding
SLIS, IU
What is the Semantic Web?
“An
extension of the current Web in which
information is given well-defined meaning,
better enabling computers and people to
work in cooperation.”
Sir Tim Berners-Lee et al., Scientific American,
2001: tinyurl.com/i59p
Semantic Web -- Web 3.0
How to realize it:
machine-understandable semantics of information, and
millions of small, specialized reasoning services that support automated task completion based on the accessible information
The current (syntactic / structural) Web
Was the Web meant to be more?
Hyperlinks – typed hyperlinks
Documents – data
Ontology
The Semantic Web is essentially based on ontologies
Ontologies are formal and consensual specifications of conceptualizations...
providing a shared and common understanding of a domain that can be communicated across people and application systems
Metadata and Semantics
Semantic Web - Language tower
What is Semantic Web for?
Integrating - trying to solve the problem of data and service integration
Searching - providing better communication between humans and computers by adding machine-processable semantics to data
From keyword search to data search; from query to answer
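The keyword-to-data shift can be illustrated with a SPARQL sketch over Music Ontology terms (introduced later in these slides); mo:performance_of and the composer linkage here are assumptions for illustration only:

```sparql
PREFIX mo:   <http://purl.org/ontology/mo/>
PREFIX foaf: <http://xmlns.com/foaf/0.1/>

# Data search: performances of works by Schubert the person,
# not documents containing the string "Schubert".
SELECT ?performance WHERE {
  ?composer a mo:MusicArtist ;
            foaf:name "Franz Schubert" .
  ?work a mo:MusicalWork ;
        mo:composer ?composer .          # hypothetical shortcut property
  ?performance mo:performance_of ?work .
}
```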
What is the current Semantic Web effort?
Lifting the document Web to a data Web
Weaving the data Web through semantic links (typed hyperlinks)
Linked Data cloud ("bubbles") in April 2008:
>2 billion RDF triples
around 3 million RDF links
http://www.elec.qmul.ac.uk/easaier/
Enabling Access to Sound Archives through Integration, Enrichment and Retrieval
The EASAIER Project
EASAIER - Enabling Access to Sound
Archives through Integration, Enrichment and
Retrieval
EU-funded project, 30-month duration (started May 2006)
Partners:
EASAIER - Goals
Overcome problems many digital sound archives face with online access:
sound materials and related media are often separate
searching audio content is limited
EASAIER Framework
Integration of Sound Archives
Low level audio feature extraction (speech/music)
Intelligent User Interface
Enhanced Access Tools
looping, marking of audio
sound source separation
time and pitch scale modification
Semantic Search
Evaluation
Semantics in EASAIER
Description of metadata using an ontology
High-level metadata
e.g. title, author of an audio asset
sources: databases, files in e.g. DC, MARC
Low-level metadata
e.g. a speech event occurs at timestamp xyz
source: feature extractor tools
Semantic Search
Search across variety of metadata
Search across multiple archives
Similarity Search
Related content acquisition from the Web
The EASAIER System
Music Ontology Overview
Merges existing related ontologies
Developed by QMUL
Covers the major requirements
Widely adopted
Four core MO components:
FRBR
FOAF
Event
Timeline
http://musicontology.com/
The Music Ontology: Timeline Ontology
Expressing temporal information, e.g.
This performance happened on the 9th of March, 1984
This beat is occurring around sample 32480
The second verse is just before the second chorus
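The beat example might be written like this in Turtle (a sketch; tl:atInt for sample positions is my recollection of the Timeline ontology, and the URIs are invented):

```turtle
@prefix :  <http://example.org/> .
@prefix tl: <http://purl.org/NET/c4dm/timeline.owl#> .

:signalTimeline a tl:DiscreteTimeLine .   # a timeline counted in samples
:beat1 a tl:Instant ;
    tl:onTimeLine :signalTimeline ;
    tl:atInt 32480 .                      # "around sample 32480"
```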
The Music Ontology: Event Ontology
Event — An arbitrary classification of a
space/time region
This performance involved Glenn Gould playing the piano
This signal was recorded using a XXX microphone located at that particular place
This beat is occurring around sample 32480
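The Glenn Gould example, sketched with the Event ontology's participation properties (resource names invented):

```turtle
@prefix :      <http://example.org/> .
@prefix mo:    <http://purl.org/ontology/mo/> .
@prefix event: <http://purl.org/NET/c4dm/event.owl#> .

:gouldPerformance a mo:Performance ;
    event:agent  :GlennGould ;   # who took part
    event:factor :piano .        # what was used
```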
The Music Ontology: FRBR & FOAF
FRBR – Functional Requirements for
Bibliographic Records
Work — e.g. Franz Schubert's Trout Quintet
Manifestation — e.g. the "Nevermind" album
Item — e.g. my "Nevermind" copy
FOAF – Friend of a Friend
Person
Group
Organization
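A Turtle sketch of the FRBR levels and a FOAF-based agent as specialized by the Music Ontology (resource names are invented):

```turtle
@prefix :   <http://example.org/> .
@prefix mo: <http://purl.org/ontology/mo/> .

:TroutQuintet a mo:MusicalWork .   # FRBR Work
:Nevermind    a mo:Record .        # FRBR Manifestation
:myCopy       a mo:MusicalItem .   # FRBR Item
:Nirvana      a mo:MusicGroup .    # FOAF Group, specialized
```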
The Music Ontology – Music Production Concepts
On top of FRBR:
MusicalWork, MusicalManifestation (Record, Track, Playlist, etc.), MusicalItem
(Stream, AudioFile, Vinyl, etc.)
On top of FOAF:
MusicArtist, MusicGroup, Arranger, Engineer, Performer, Composer, etc. — all these are defined classes: every person involved in a performance is a performer...
On top of the Event Ontology:
Composition, Arrangement, Performance, Recording
Others :
Signal, Score, Genre, Instrument, ReleaseStatus, Lyrics, Libretto, etc.
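A sketch of how these classes chain together in the production workflow (mo:performance_of and mo:published_as are my best-guess property names; resource names are invented):

```turtle
@prefix :   <http://example.org/> .
@prefix mo: <http://purl.org/ontology/mo/> .

:composition a mo:Composition ;
    mo:produced_work :work .     # Composition event yields a MusicalWork
:performance a mo:Performance ;
    mo:performance_of :work ;
    mo:recorded_as :signal .     # as in the Searle example later
:signal a mo:Signal ;
    mo:published_as :track .     # Signal released as a Track
```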
The Music Ontology – Music Production Workflow
Metadata in RDF
Low-level metadata is output in RDF using Music
Ontology
Audio Feature extractor
Speech recognition service
Emotion detection service
High-level metadata import
DB Schema Mapping
Standardized Metadata import
e.g. D2R, Virtuoso RDF Views
DC, MARC, METS, ...
Linked Data?
DBPedia, Geonames, ...
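As a sketch of the D2R route, a minimal D2RQ-style mapping lifting a relational tracks table to Music Ontology terms (table, column, and map names are all hypothetical):

```turtle
@prefix map:  <#> .
@prefix d2rq: <http://www.wiwiss.fu-berlin.de/suhl/bizer/D2RQ/0.1#> .
@prefix mo:   <http://purl.org/ontology/mo/> .
@prefix dc:   <http://purl.org/dc/elements/1.1/> .

map:Track a d2rq:ClassMap ;
    d2rq:dataStorage map:database ;
    d2rq:uriPattern "track/@@tracks.id@@" ;
    d2rq:class mo:Track .

map:trackTitle a d2rq:PropertyBridge ;
    d2rq:belongsToClassMap map:Track ;
    d2rq:property dc:title ;
    d2rq:column "tracks.title" .
```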
Use Case: Archive Publication - HOTBED
[Diagram: publishing the Hotbed Database as Hotbed RDF; extending the Music Ontology with an Instruments Taxonomy; querying via the Semantic Archivist with feature extraction, visualization, a query interface, and sound access tools]
1) editing the ontology
using WSMT editor to extend the ontology
[Screenshots: Music Ontology graphical edit and text edit in WSMT]
2) performing tests on the new
extension
What are the instruments in my taxonomy?
Did I forget any kind of [pipe]?
3) mapping Scottish Instruments to a
general Instruments taxonomy
4) relating and publishing Hotbed
Relate tables from hotbed to concepts from the MO
Publish on the semantic web via the D2R tool
[Diagram: Hotbed Database mapped to the Music Ontology and published as RDF via the D2R tool]
The server offers a SPARQL endpoint for external apps
Mapping Metadata to the Music Ontologies
Title: File 2
Author: Oliver Iredale Searle
Performers: Katie Punter
Source Type: Audio
Source: File 2
Instrument: Flute
Instrument occurrence
timings: 0"-16"
Time Signature: 4/4
Beats per minute: 50
Tonality: Bb major
Searle Testbed
:music a mo:Signal ;
    dc:title "File 2" ;
    dc:author "Oliver Iredale Searle" .
:music-performance a mo:Performance ;
    mo:recorded_as :music ;
    mo:composer :OliverIredaleSearle ;
    mo:instrument mo:flute ;
    mo:performer :KatiePunter ;
    mo:bpm 50 ;
    mo:meter "4/4" ;
    mo:key :BFlatMajor .
:KatiePunter a foaf:Person .
:ss1 a af:PersonPlaying ;
    af:person :KatiePunter ;
    event:time [
        tl:onTimeLine :tl1234 ;
        tl:beginsAt "PT0S" ;
        tl:duration "PT16S" ;
    ] .
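Against the triples above, a query for performers and their instruments could look like this sketch:

```sparql
PREFIX mo: <http://purl.org/ontology/mo/>

SELECT ?performer ?instrument WHERE {
  ?perf a mo:Performance ;
        mo:performer  ?performer ;
        mo:instrument ?instrument .
}
```

On the Searle data this binds ?performer to :KatiePunter and ?instrument to mo:flute.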
Mapping Metadata to the Music Ontologies
ALL web service
output
<?xml version="1.0" encoding="UTF-8"?>
<speech_retrieveResult>
  <speech_descriptor word="power"
    audio_material="c:/hotbed/performance/1004.wav" position_sec="10"
    duration_sec="5" confidence="89" />
</speech_retrieveResult>
<http://www.myarchive.org/signal/1234/event/powerPT10S> a af:Text;
af:text "power";
af:confidence "89";
event:time [
a time:timeInterval;
tl:onTimeLine <http://www.myarchive.org/signal-timeline/1234>;
tl:beginsAtDuration "PT10S";
tl:durationXSD "PT5S";
].
Mapping Metadata to the Music Ontologies
Vamp Output
<metadata type="audio">
<category name="vamp">
<feature name="beats" type="variablerate" description="Detected Beats" unit="N/A">
<data>
<event idx="0" timestamp=" 0.0928" duration="0" label="224.69 bpm"/>
</data>
</feature>
</category>
</metadata>
event:time [
a time:Instant ;
tl:onTimeLine :tl898;
tl:at "PT0.0928S";
];
mo:bpm "224.69";
RDF Storage and Retrieval Component
Built on top of OpenRDF Sesame 2.0
Query interfaces
Web Service (Servlet)
HTTP SPARQL Endpoint
Web Service provides predefined SPARQL query templates
Themes
Music, Speech, Timeline, Related media, Similarity
Dynamic FILTER constructs
Results in SPARQL Query Results XML Format
Interface for RDF metadata import using the Archiver
application
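A speech-theme template with a dynamic FILTER might look as follows (a sketch using the af:/event:/tl: terms from the mapping slides; the word "power" is the runtime-injected part):

```sparql
SELECT ?event ?start WHERE {
  ?event a af:Text ;
         af:text ?word ;
         event:time ?t .
  ?t tl:beginsAtDuration ?start .
  FILTER regex(?word, "power", "i")   # substituted per request
}
```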
Enhanced Client
Web client
Related media
Double-click
Related media on the web (1)
[Screenshot: result of a search for author "Coltrane"; after track selection, a web related-media search is launched automatically according to the author's name]
Related media on the web (2)
Demo
http://www.elec.qmul.ac.uk/easaier/index3.html
http://easaier.deri.at/demo/
Demo
Time and Pitch Scale Modification (demo)
Sound source separation (demixing/remixing, noise reduction, etc.) (demo)
Video time stretching (to slow down or speed
up images while retaining optimal sound)
(demo)
Scenario 1 – Artist Search
Aggregation of music artist information from multiple web
sources
Ontology based search:
MusicBrainz data mapped to the MusicOntology
MusicBrainz Web Service:
retrieve artist URI by literal-based search
MusicBrainz RDF Dump:
retrieve RDF
use SPARQL to perform queries (e.g. resolve relationships)
Web2.0 Mashups:
Retrieve data (videos, images) from external sources
utilize RSS feeds, APIs etc. from YouTube, LyricWiki, Google
more accurate results using references from MusicBrainz RDF data
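Once the web service has resolved "Beatles" to an artist URI, relationship queries run over the RDF dump; a sketch (the artist URI is a placeholder, and mo:member is my assumption for the group-membership property):

```sparql
PREFIX mo:   <http://purl.org/ontology/mo/>
PREFIX foaf: <http://xmlns.com/foaf/0.1/>

SELECT ?name WHERE {
  <http://example.org/artist/the-beatles> mo:member ?member .
  ?member foaf:name ?name .
}
```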
Scenario 1 – Artist Search
<URI>
WS Interface
“Beatles”
<URI>
process data...
RDF Dump
Scenario 1 – Artist Search
Scenario 1 – Artist Search
Scenario 2 – Instrument Reasoning
Reasoning over HOTBED instrument scheme
Ontologize data from HOTBED (Scottish Music Archive)
Usage of D2R to lift data from legacy DBs to RDF
Ontologies:
MusicOntology
Instrument Ontology (domain related taxonomy)
Subsumption reasoning:
Retrieve instrument tree
Search for persons that play an instrument
Subclass relations: resolve persons playing more specific instruments
Example: Flute < Woodwind < Wind Instrument (a flute is a woodwind, which is a wind instrument)
Scenario 2 – Instrument Reasoning
Example:
Search for people playing instrument of type Woodwind
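With SPARQL 1.1 property paths (a later standard than the Sesame 2.0 store used in EASAIER, which needed explicit RDFS inference instead), the Woodwind search can be sketched as follows; the :Woodwind URI and the Performance-based modelling are assumptions:

```sparql
PREFIX mo:   <http://purl.org/ontology/mo/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX :     <http://example.org/instruments/>

# People playing :Woodwind or any more specific instrument (flute, pipe, ...)
SELECT DISTINCT ?person WHERE {
  ?perf a mo:Performance ;
        mo:performer  ?person ;
        mo:instrument ?instr .
  ?instr rdfs:subClassOf* :Woodwind .
}
```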
Demo 3 – Rules
Infer new knowledge with rules
Domain Rule
Sophisticated Query
Albums based on certain Band/Artist/Instrument
UseCase: The Velvet Underground discography
Available information:
Membership durations
Album release dates
"Founders" of the band?
A member is a founder if no other member's membership starts earlier:
exists _artist: <_band, hasMember, _artist>, <_artist, onDuration, _duration>
forall ?x: <_band, hasMember, ?x>, <?x, onDuration, ?time> => <?time, notBefore, _duration>
=> <_band, founder, _artist>
Albums & corresponding members
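Assuming a SPARQL 1.1 store (NOT EXISTS post-dates the Sesame 2.0 setup described earlier) and treating onDuration's value as comparable by membership start, the founder rule above could be sketched as a CONSTRUCT query; all property names are hypothetical, following the pseudo-triples in the slide:

```sparql
PREFIX : <http://example.org/>

# Infer :founder links: a member nobody else predates.
CONSTRUCT { ?band :founder ?artist }
WHERE {
  ?band   :hasMember  ?artist .
  ?artist :onDuration ?d0 .
  FILTER NOT EXISTS {
    ?band  :hasMember  ?other .
    ?other :onDuration ?d .
    FILTER (?d < ?d0)    # someone's membership starts strictly earlier
  }
}
```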
Demo 3 – Rules
Basic
Information
Band
Founder
Band Duration
(Members & Albums)
Album Tracks
Thanks
Contact
Ying Ding
LI029
(812) 855 5388
[email protected]