Resource Database Assembly:
The Next Generation
Part One
How We Got Here
2013 AIRS Conference sessions in Portland
A common thread emerged in discussions about best practices, potential metrics, staffing models, etc.
Opened the group to volunteers via the AIRS Networker Open Forum in June 2013
Put together a new survey for resource work (the last one was in 2008)
Today’s Presenters
John Allec, Findhelp Information Services, Toronto, ON
Sue Boes, United Way for Southeastern Michigan, Detroit, MI
Marioly Botero, United Way of Greater Atlanta, Atlanta, GA
Katie Conlon, Iowa Compass, Center for Disabilities and Development, Iowa City, IA
Cathleen Dwyer, CDK Consulting, New York, NY
Steve Eastwood, 2-1-1 Arizona, Community Information and Referral Services, Phoenix, AZ
Polly Fay-McDaniel, Institute for Human Services, 2-1-1 HELPLINE, Bath, NY
Lindsay Paulsen, 2-1-1, United Way of the Midlands, Omaha, NE
Edward Perry, 2-1-1 Tampa Bay Cares, Clearwater, FL
Additional Group Members
Matthew Finley, United Way Services, Cleveland, OH
Jan Johnson, Council of Community Services, 2-1-1 Virginia SW Region, Roanoke, VA
Clive Jones, AIRS
Vicki Lofton, Heart of Texas Council of Governments, Waco, TX
Tamara Moore, United Way of Central Maryland, First Call for Help, Baltimore, MD
Georgia Sales, 2-1-1 LA County, San Gabriel, CA
Survey Respondents
Agency Type
145 Agencies from 41 States & Provinces
Survey Respondents
Where We’re Going
The group's discussions over the past year + your feedback today =
Recommendations for:
Staffing
Metrics
Database update percentage requirements
Etc.
How Much Wood Could a Woodchuck Chuck if a Woodchuck Could Chuck Wood…
In other words…
Realistically, how many records can a resource specialist keep updated?
Record Complexity
Sue Boes
United Way for Southeastern Michigan,
Detroit, MI
Record Complexity
A way to measure the degree of difficulty as it relates to the time/cost required to manage a set number of records
Application of a consistent formula that “weights” individual database elements and scores them
A tool for determining where to most effectively apply staff time and resources
Record Complexity
Method
Assign points to database elements
Develop a scale
Determine average work hours
Consider variables
Create formula
Review possible outcomes
Record Complexity
Sample Scale (see the sketch below)
Simple entries: 10 points or fewer
Moderate entries: 11–20 points
Difficult entries: 21–40 points
Complex entries: 41 points or more
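To make the scale concrete, here is a minimal Python sketch of one way to encode these tiers. The point thresholds come from the scale above; the function name and lower-case tier labels are illustrative assumptions, not part of any AIRS standard.

```python
def classify_entry(points: int) -> str:
    """Map a record's complexity points to a tier, per the sample scale above."""
    if points <= 10:       # Simple: 10 points or fewer
        return "simple"
    if points <= 20:       # Moderate: 11-20 points
        return "moderate"
    if points <= 40:       # Difficult: 21-40 points
        return "difficult"
    return "complex"       # Complex: 41 points or more
```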
Record Complexity
Determine Average Work Hours
Simple entries: 1–5 hours (2.5 hours average)
Moderate entries: 6–10 hours (7.5 hours average)
Difficult entries: 11–20 hours (15 hours average)
Complex entries: 21–40+ hours (30 hours average)
Time should include research
Record Complexity
Variables
Skill set of data entry staff (learning curve)
Variance in point spread – additional agencies, sites or
services add time
Time required to research and validate information
Ability to verify information – agency cooperation,
returned phone calls, URL, etc…
Availability of standardized infrastructure to manage
consistent data entry parameters
Current implementation of best practice protocols and
AIRS standards
Record Complexity
Formula
Average hours per level of difficulty × the number of records at that level of difficulty.
Totaled across the levels of difficulty, this gives the sum of hours required to manage the database.
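A minimal sketch of this formula in Python, assuming the per-tier average hours from the "Determine Average Work Hours" slide and a dict of record counts per tier (the names and structure are illustrative assumptions):

```python
# Average work hours per tier, from the "Determine Average Work Hours" slide
AVG_HOURS = {"simple": 2.5, "moderate": 7.5, "difficult": 15, "complex": 30}

def required_hours(record_counts: dict) -> float:
    """Sum over tiers: records at that tier x average hours for that tier."""
    return sum(count * AVG_HOURS[tier] for tier, count in record_counts.items())

def required_fte(record_counts: dict, hours_per_fte: float = 1950) -> float:
    """FTE needed, assuming 1,950 work hours per FTE (37.5 hrs/week x 52 weeks)."""
    return required_hours(record_counts) / hours_per_fte
```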
Record Complexity
Database Composition 1
Record Complexity
Calculations
1,412 × 2.5 (average) = 3,530 hours
269 × 7.5 (average) ≈ 2,017 hours
102 × 15 (average) = 1,530 hours
43 × 30 (average) = 1,290 hours
Total hours for all tiers = 8,367
Record Complexity
Apply Formula
Sum of hours for all tiers of complexity = 8,367
8,367 hours are required to maintain a database of this size and complexity make-up
At 1,950 hours per FTE (37.5 hours per week × 52 weeks), that is about 4.3 FTE, rounded up to approx. 4.5 FTE for staffing
405 records per 1 FTE (1,826 records / 4.5)
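For illustration, plugging the Database Composition 1 counts into the earlier sketch (reusing required_hours and required_fte) reproduces these figures; exact division gives about 4.3 FTE, which the slide rounds up to 4.5 for staffing purposes:

```python
composition_1 = {"simple": 1412, "moderate": 269, "difficult": 102, "complex": 43}

total_hours = required_hours(composition_1)   # 8367.5, shown as 8,367 on the slide
fte_needed = required_fte(composition_1)      # ~4.29; rounded up to ~4.5 for staffing
records_per_fte = 1826 / 4.5                  # ~405 records per FTE
```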
Record Complexity
Practical Applications
Database Management projections
Parity projects
FTEs required to manage the database
Equitable assignments
Evaluate new initiatives
Record Complexity
Application to Staffing Plans
Define resource management tasks
Define “other than” resource management tasks
Account for percentage of staff time on both
Apply complexity formula to database
What percentage of staff time is required to meet established database management goals?
Record Complexity
Database Management Tasks
Formal and informal updates
New agency development
Style guide adherence
Application of AIRS standards and best practices
Taxonomy upkeep
Quality measures
Record Complexity
“Other” Tasks
Projects and meetings that support organizational agendas
Professional development (StrengthsFinder)
Outreach/presentations to community
Mailing of promotional materials
Vendor liaison
Availability to contact center (time on phones)
Data and reporting (quarterly and annual reports)
Volunteer management
Record Complexity
Job Task Analysis
http://airsnetworker.airs.org
Record Complexity
“Other” Tasks Survey Results
What other tasks does a resource specialist regularly perform at your agency? (Check all that are applicable)

Answer Options                                       Response Percent   Response Count
Training of I&R Specialists                          58.3%              84
Outreach                                             68.8%              99
Developing Call Reports                              50.7%              73
Publication Design                                   34.0%              49
Website Maintenance                                  36.8%              53
Answering I&R Calls                                  67.4%              97
Agency Administration Tasks (e.g. Human Resources)   26.4%              38
Results from survey as posted to the Networker by Clive Jones, 2/7/2014
Record Complexity
Sample Staffing Plan
Record Complexity
Food for Thought
Is there a pattern to the complexity of databases?
A very small sample indicates 2% Complex, 5% Difficult, 15% Moderate and 78% Simple
Could that pattern be used to help define a reasonable number of records per FTE as an industry standard?
Record Complexity
Database Survey Result Projection
Record Complexity
Apply Formula
25 × 30 = 750 hours (2% complex)
63 × 15 = 945 hours (5% difficult)
188 × 7.5 = 1,410 hours (15% moderate)
977 × 2.5 ≈ 2,442 hours (78% simple)
Total hours = 5,547
Translates to 2.85 FTE (5,547 / 1,950 hours), at 440 records per FTE
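The same sketch reproduces this projection; the counts below apply the 78/15/5/2 pattern to a hypothetical 1,253-record database, and small rounding differences against the slide are expected:

```python
survey_pattern = {"simple": 977, "moderate": 188, "difficult": 63, "complex": 25}

total_hours = required_hours(survey_pattern)  # 5547.5, shown as 5,547 on the slide
fte_needed = required_fte(survey_pattern)     # ~2.85 FTE
records_per_fte = 1253 / fte_needed           # ~440 records per FTE
```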
Update Percentages
Polly Fay-McDaniel
Institute for Human Services, 2-1-1
HELPLINE, Bath, NY
Update Percentages
IF... your I&R has an internal expectation of how many organization/agency records are maintained by a single resource specialist FTE, what is the number?

Answer Options               Response Average   Response Total   Response Count
Records maintained per FTE   784                19,596           25

answered question: 35
skipped question: 103
To calculate the averages, we removed the 5 largest and 5 smallest numbers
Results from survey as posted to the Networker by Clive Jones, 2/7/2014
Update Percentages
Standard 12: Database Maintenance
The I&R service has procedures to ensure that information in the resource database is accurate and complete. At a minimum, these include an annual survey of all organizations in the database and interim updates of records throughout the year as new information becomes available.
AIRS Standards and Quality Indicators for Professional Information & Referral, Version 7.0, Revised March, 2013
Update Percentages
Results from survey as posted to the Networker by Clive Jones, 2/7/2014
Update Percentages
Use materials submitted by the agency or gathered elsewhere:
Website
Questionnaire
Social Media
Pamphlets
Newspaper Articles
Telephone directories
AIRS Standards and Quality Indicators for Professional Information & Referral, Version 7.0, Revised March, 2013
Update Percentages
“Once the I&R service (that means YOU, as a trained resource specialist) is satisfied that it has obtained the best information possible… it is permissible to mark the agency as having had its annual review.”
AIRS Standards and Quality Indicators for Professional Information & Referral, Version 7.0, Revised March, 2013
Update Percentages
Process improvement – maybe your procedures aren't working?
Do you even have written procedures in place?
Are we expecting too much per FTE? (Have you looked at the complexity of your database to ensure you have enough FTEs in place to do the work?)
Are you including other tasks and not leaving enough time for database development and maintenance work?
Are additional benchmarks in place to evaluate the work of Resource Specialists? Are they doing their jobs?
Update Percentages
Update fatigue among agencies – the increased demand on our agencies to do more with less?
Does the percentage of those living below poverty within the geographic area served impact our work?
The credibility of the overarching agency, competing agendas?
Are we seeing changes in the way service providers share and exchange information that no longer look like the usual ways of networking?
Update Percentages
The current AIRS Standard suggests that 100% of database records should be updated annually. We realize that in the real world that rarely happens, as there are always some in process. If AIRS is considering a revision of that number, which of the following would you support?

Answer Options                                            Response Percent   Response Count
100% updated within 12 months should remain the target    24.6%              34
90% updated within 12 months and 100% within 15 months    34.1%              47
85% updated within 12 months and 100% within 18 months    39.9%              55
95% updated within 12 months and 100% within 14 months    1.4%               2
Additional comments                                                          30

answered question: 138
Results from survey as posted to the Networker by Clive Jones, 2/7/2014
Survey
https://www.surveymonkey.com/s/NS3Z9YD
Paper copies also available
Coming Up In Part Two
Answer:
This quiz show debuted 50 years ago, on March 30, 1964.
Resource Database Assembly:
The Next Generation
Part Two
Fun!
Answer:
This quiz show debuted 50 years ago, on March 30, 1964.
Question:
What is Jeopardy?
http://www.211arizona.org/jeopardy
A woodchuck would chuck as
much wood as a woodchuck
could chuck if a woodchuck
could chuck wood!
SO…
Are we doing as much as we possibly can, or are we
doing all we can and doing it correctly?
Resource Database Standards
Cathleen Dwyer
AIRS Database Reviewer
CRS, CIRS
CDK Consulting, New York, NY
Resource Database Standards
AIRS requires that 6 database standards be met for accreditation (Standards 7–12):
Inclusion/Exclusion Criteria
Database Elements
Classification System/Taxonomy
Content Management/Indexing
Database Search Methods
Database Maintenance
Resource Database Standards
Standard 8 – Data Elements
* Are all required data elements accommodated by your software?
* When your software does not include a required data element, have you developed a “workaround”?
Resource Database Standards
Standard 9 – Classification System / Taxonomy
* Are you using the AIRS/211 LA County Taxonomy?
* Do you use keywords?
* Are your keywords connected to Taxonomy terms?
Resource Database Standards
Standard 10 – Content Management and Indexing
Three parts to this requirement:
* Style Manual – are you following it?
* Indexing – best practices
* In-depth look at 8 complete agency profiles
Resource Database Standards
Standard 11 – Database Search Methods
* Does your software accommodate all required search methods?
* Does your software display Taxonomy definitions and “See Also”s?
Resource Database Standards
Standard 12 – Database Maintenance
* What is your system for pursuing annual updates?
* How old is your oldest update? How many are overdue?
* How do you collect information about new agencies and services?
* What is your process for handling interim changes and adding new agencies?
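These questions lend themselves to simple queries. A hedged sketch, assuming each record carries a last_update date field (the field name and record layout are assumptions, not anything prescribed by the Standards):

```python
from datetime import date, timedelta

def oldest_update(records) -> date:
    """Date of the least recently updated record in the database."""
    return min(r["last_update"] for r in records)

def overdue_count(records, window_days: int = 365) -> int:
    """Number of records not formally updated within the last year."""
    cutoff = date.today() - timedelta(days=window_days)
    return sum(1 for r in records if r["last_update"] < cutoff)
```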
Database Auditing
Steve Eastwood
2-1-1 Arizona / Community Information and Referral Services
Marioly Botero
United Way of Greater Atlanta
Database Auditing
How often are others auditing their data entry?
Auditing Practices
Database Auditing
Create data entry standards / a style guide
Make sure data for all of the AIRS required fields is being captured
Use the same format throughout the database
What are others currently using?
Tools built into software?
Auditing forms?
Nothing?
Database Auditing
Software Features
What is needed?
Should vendors be required to create auditing tools within the software?
Database Auditing
Report on these AIRS Required fields if left blank
Provider Name
Description
Hours
Fees
Intake Procedure
Eligibility
Languages
Geography Served
Taxonomy (at least one code assigned)
Database Auditing
Optional – Report on these fields if left blank (see the audit sketch after this list)
Physical Address
Mailing Address
Contact Person
Contact Title
Phone
Website
Email
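Putting the two lists together, here is one possible shape for such an auditing tool, sketched under the assumption that records are dicts keyed by field name; the field lists come from the two slides above.

```python
# AIRS-required fields, from the previous slide
REQUIRED_FIELDS = ["Provider Name", "Description", "Hours", "Fees",
                   "Intake Procedure", "Eligibility", "Languages",
                   "Geography Served", "Taxonomy"]
# Optional fields, from this slide
OPTIONAL_FIELDS = ["Physical Address", "Mailing Address", "Contact Person",
                   "Contact Title", "Phone", "Website", "Email"]

def blank_field_report(records: dict, fields=REQUIRED_FIELDS) -> dict:
    """Return {record id: [blank fields]} for every record missing a value."""
    report = {}
    for rec_id, record in records.items():
        blanks = [field for field in fields if not record.get(field)]
        if blanks:
            report[rec_id] = blanks
    return report
```

Running it once with REQUIRED_FIELDS and once with OPTIONAL_FIELDS yields the two reports described on these slides.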
Database Record Audit Form
Katie Conlon
Iowa Compass, Center for Disabilities and Development, Iowa City, IA
Database Record Audit Form
Method of assessment for:
Quality of individual records
Staff performance
Database Record Audit Form
Format: Microsoft Excel
Includes AIRS required and recommended
fields
Customizable for your own database
See “Instructions” tab for more information
Database Record Audit Form
The form will be provided to conference participants.
Keep in mind:
This is one component of a Resource Specialist review
Form is still in development
There are currently no benchmarks for what percentage counts as an acceptable score
Database Record Audit Form
As you use the form…
Please provide feedback!
Resource Metrics
Edward Perry
2-1-1 Tampa Bay Cares, Clearwater, FL
Resource Metrics
Objective
Discuss the proposed I&R database metrics
Take action on the proposed I&R database metrics
Discuss next steps
Resource Metrics
Agenda
Program Metrics
Database Quality Metrics
Staff Performance Metrics
Resource Metrics
Program Metrics
Provide a uniform national set of benchmarks for all I&R Resource Database initiatives
Measure the accomplishments of the I&R industry in improving all Resource Databases
Provide a set of goals for I&R Resource Databases to work toward
Resource Metrics
Program Metrics
Do you conduct an annual satisfaction survey of all the organizations listed in your resource database?
Results from survey as posted to the Networker by Clive Jones, 2/7/2014
Do you currently have a policy on responsiveness, in terms of answering questions or acknowledging receipt of information from either the public, listed agencies, or agencies interested in being listed?
Results from survey as posted to the Networker by Clive Jones, 2/7/2014
Resource Metrics
Database Quality Metrics
Provide a uniform national set of benchmarks for measuring I&R Resource Database quality.
Data Quality Metrics consist of 4 items (see the sketch after this list):
Accuracy
Completeness
Consistency
Timeliness
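As a hedged sketch, two of the four could be computed like this, reading "timeliness" as the share of records formally updated within the last 12 months (our interpretation of Standard 12) and assuming the same dict-based record layout as the earlier sketches:

```python
from datetime import date, timedelta

def timeliness(records, window_days: int = 365) -> float:
    """Share of records whose last formal update falls within the window."""
    cutoff = date.today() - timedelta(days=window_days)
    return sum(1 for r in records if r["last_update"] >= cutoff) / len(records)

def completeness(records, required_fields) -> float:
    """Share of records with no blank AIRS-required fields."""
    return sum(1 for r in records
               if all(r.get(f) for f in required_fields)) / len(records)
```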
Describe any quality performance indicators used by your I&R program for assessing resource database work. List as many as relevant.
Results from survey as posted to the Networker by Clive Jones, 2/7/2014
Resource Metrics
Staff Performance Metrics
Provide a uniform national set of benchmarks for I&R database staff to achieve regarding their work on the database.
This includes customer service metrics for I&R database staff.
Resource Metrics
Staff Performance Metrics
Discussion Points:
What should be the average number of records per FTE that can be updated annually?
Resource Metrics
Staff Performance Metrics
Discussion Points:
What should be the average number of hours to first reply?
Resource Metrics
Staff Performance Metrics
If there were to be a recommended response time as an initial target, which of the following do you think would work best?
(Select all applicable)
Results from survey as posted to the Networker by Clive Jones, 2/7/2014
Resource Metrics
All Proposed Metrics
Resource Metrics
Next Steps
Publish these changes to the I&R field for feedback
Gather all the feedback and present the final results to the AIRS Board
Publish the benchmarks for I&R centers
Next Steps
Posted to AIRS Networker soon
Forms and resources presented today
Discussion notes from today
Survey
https://www.surveymonkey.com/s/F9K5R9S
Paper copies also available
Today’s Presenters
John Allec, Findhelp Information Services, Toronto, ON
Sue Boes, United Way for Southeastern Michigan, Detroit, MI
Marioly Botero, United Way of Greater Atlanta, Atlanta, GA
Katie Conlon, Iowa Compass, Center for Disabilities and Development, Iowa City, IA
Cathleen Dwyer, CDK Consulting, New York, NY
Steve Eastwood, Community Information and Referral Services, Phoenix, AZ
Polly Fay-McDaniel, Institute for Human Services, 2-1-1 HELPLINE, Bath, NY
Lindsay Paulsen, 2-1-1, United Way of the Midlands, Omaha, NE
Edward Perry, 2-1-1 Tampa Bay Cares, Clearwater, FL
Additional Group Members
Matthew Finley, United Way Services, Cleveland, OH
Jan Johnson, Council of Community Services, 2-1-1 Virginia SW Region, Roanoke, VA
Clive Jones, AIRS
Vicki Lofton, Heart of Texas Council of Governments, Waco, TX
Tamara Moore, United Way of Central Maryland, First Call for Help, Baltimore, MD
Georgia Sales, 2-1-1 LA County, San Gabriel, CA
THANK YOU!!
Resource Database Open House
Following Up on the AIRS Conference
Polly Fay-McDaniel
Institute for Human Services, 2-1-1 HELPLINE, Bath, NY
Questions/Comments
G-1: Resource Database 101
G-2/G-3: Taming the Beast: Indexing with the AIRS/211 LA County Taxonomy
G-6: Don’t Let Your Agency Get Left in the Dust: Updating Database Info is a Must
G-7: Tools and Tricks for Improving Any Resource Database, or “How We Turned Our Database Around…While Tripling the Service Area”
G-8: So You’ve Got a Database, Now What?
Resource Database Assembly
Session 1
2 comments: 1. No comment; 2. In concept
Resource Database Assembly
Session 1
Yes, this would be useful to my organization.
No, we figure this number based on our own
agency’s needs.
No, we already figure this number based on
the use of a complexity score.
No, we will be implementing the use of a
complexity score based on today’s discussions
Resource Database Assembly
Session 1: Q5-Additional Comments
With information on how you figured it and how to adjust it if necessary
Not really sure how this would be incorporated in our day-to-day practices
Yes, as a single resource person
OR recommend complexity score as the standard. It would be great to have something sanctioned, so when I tell people how long data maintenance takes (and they are astounded) there's something to back me up.
But not at the expense of new agency/program development, and with acknowledgement of the variation/imprecision of the tool.
I think the standard should incorporate a formula that each agency can use rather than a specific number.
Resource Database Assembly
Session 1
Q6: We asked this on the original survey... what are your thoughts today?
Resource Database Assembly
Session 1: Q6-Additional Comments
We should strive for 100%
95% updated w/in 12 mo, 100% w/in 15 months: still high, but more realistic
Would like it to be made clear what the AIRS DB Reviewers want to see when they come, as far as update %. Last time, I had to inactivate about 5 listings that were not updated within 12 months before they came to visit. Hopefully they would be reasonable, but it's a little unclear.
What about 100% in 15 or 18 months?
Resource Database Assembly
Session 1: Q6-Additional Comments
85% is a more practical/realistic standard due to: 1 – update fatigue by agencies; 2 – internal standards that require personal contact with high-priority agencies; 3 – low funding at the agency (1 person for 1,500 agencies)
Maybe less
I think keep it at 100% but possibly change the wording... also I think AIRS should not punish but support those who are struggling to meet this standard, or hold their own behind-the-scenes standard of 85% within 12 months and 100% within 18
But I feel there should be some consideration for # of attempts to update.
Resource Database Assembly
Session 1: Final Comments
Complexity to measure time management is a wonderful idea
Our software puts the last time we were working on the record on the website. Additional comment to number 3: it is grounded in something that people can use and adjust as needed; we know how we got it and can change it if needed.
I hope the complexity score implemented by the software vendors will just tell you how many agencies have how many sites and services instead of doing the math. I know it does not take me an average of 2.5 hours to update a simple record each year, so I would want to set that information myself.
Resource Database Assembly
Session 1: Final Comments
Can we make this session be a series of webinars? Or have an opportunity for participation after the conference?
Undecided on #7
Really great – good to have tools to keep RMs from having unrealistic expectations from supervisors
We already display the date of last update; it is internally helpful to call specialists but not a good idea for the general public
Yes, we should date the last update
Need to see results of Standards Committee discussion
Resource Database Assembly
Session 1: Final Comments
"attempt 100% update" - define attempt and have a known
set of rules about what attempt means
Records should not have last update info
Date of last formal update?
Along with the above - perhaps the score attached to the
agencies (simple, complex, etc) could be incorporated to
illustrate variant challenges that go along with updating
and why percentages could be lower; i.e simple agencies
100% in 12 mos, complex 80% in 12 mos.
Resource Database Assembly
Session 1: Final Comments
Hard to say what's best about listing the formal or interim update online. We've had the interim listed, and that has led agencies to think we don't need a formal review. So we are favoring a formal date listed.
Last updated should be published online
Keep the target, but acknowledge margin of error. If standards already allow for 3rd-party validation in some cases, there's no need to relax the target #.
Maybe? We should just have some type of consideration for attempts
Resource Database Assembly
Session 2
7 of the other responses did not answer the question
Resource Database Assembly
Session 2: Q4-Other Comments
Agency-wide spell check
Spell check; find and replace
Capitalization check; blank space check (# of blank spaces between words)
Searching for misspellings throughout a listing (record), not just in each field, and use of a global replace feature
Resource Database Assembly
Session 2: Auditing Currently in Use
From our workshop…
Review of Directories
Individual Forms
Download field by field
Process in place
For single-person resource departments: ask co-workers, other departments, colleagues from other I&R agencies
Resource Database Assembly
Session 2: Q4-Other Comments
Global find & replace spell check
Enhanced spell-checking features
List of emailed updates that did not reply after three requests
Spell check and ability to search by address and contact fields
From our workshop…
Expanded features of spell check
Everyone should be able to attach target terms to service terms
10-15% of the database should be reviewed annually for reliable statistical info
Resource Database Assembly
Session 2
7 of the other responses did not answer the question
Resource Database Assembly
Session 2: Q5-Other Comments
what is this?
It's not that it isn't useful, but for a number of my smaller agencies who update by phone, it's irrelevant. I have spent the last few years adjusting to each agency.
Ideal of course, but unrealistic
not sure - concerned about ability to change software
I just don't know if we would get a response rate
Resource Database Assembly
Session 2
Acknowledged within 2 business hours
Acknowledged within 3 business hours
Acknowledged within one business day
5 of the other responses did not answer question
I don’t feel this quality measure would be useful
Resource Database Assembly
Session 2: Q6-Other Comments
90% of requests acknowledged within 3 business days
My agencies don't have priority over my callers. If a caller leaves a VM or email, it is returned within one business day; agencies should be the same. And our software request is delayed.
if automated
I reply within 2 hours
2 business days
Resource Database Assembly
Session 2
Completed within 48 hours
5 of the other responses did not answer question
I do not feel this quality measure would be useful
Resource Database Assembly
Session 2: Q7-Other Comments
80%
Excluding ones where I am having to wait for the agency to get back to me or if I have questions.
Ideal but unrealistic – while making plans, life happens ;)
I feel this really varies, as it depends on the response and cooperation of the agency contact.
750 records unrealistic. Like the idea of using the record complexity tool.
Maybe change to within 3 business days
Business days