“Notes from the Field”
Lessons learned developing a
distributed .NET application
Presenter
Darryl Pollock
Squirrel Consulting Pty Limited
Sydney SQL SIG 26 October 2004
Darryl Pollock
Squirrel Consulting Pty Limited
Developer for 18 years
DBA for 7 years
Consultant for 5 years including:
– Development / SQL and .NET architecture
– Clustering
– Performance Tuning
– Large DB implementations (up to 1 TB)
[email protected]
Agenda
Background of the application
Architecture
Data Modelling
Concurrency
Reporting Services Integration
Pearls of Wisdom
Demo
(There is a Q&A session, but questions throughout are preferred!)
Background
SMEs seem to have core applications based around 1 or 2 servers
These applications are business-critical and often need monitoring
BUT
SMEs can neither afford nor wish to manage large infrastructure monitoring systems
Background
Clients began asking for a monitoring system for their business-critical systems
The industry was researched; products were tested and evaluated over a 3-month period
No single product was found that met the following criteria:
Criteria
Lightweight – would not create performance issues
Platform independent, from NT upwards
Secure, yet firewall-independent
Not overly complex to use
Provided hierarchical security (portal)
Provided long-term (trend) reporting
Could work across multiple organisations
Low cost!
Architecture
[Architecture diagram: each monitored server runs a lightweight, platform-independent agent (RSM), configured via an XML file and a simple forms-based interface. Agents send performance/event data and licensing information over secure SOAP/XML to a central SQL Server; agent monitoring and reporting are delivered through pre-configured portal technology (simple).]
The Challenge!
4 Languages
Multiple Technologies/Multiple Platforms
Multiple applications competing for the same
resource:
– One SQL Server!
Building the Data Model
[Data model diagram: tblCompany, tblRSM, tblEvent, tblRSMViewPane, tblRSMViewCounter, tblPerformanceObject, tblPerformanceCounter, tblCounterInstance, tblCounterHistory (historical data – initially just the _Total instance), and user tables.]
Building the Data Model - Testing
tblCounterHistory:
– Initially this table would grow by 100-300 rows every interval (1-3 minutes)
– Performance, of course, was spectacular even with 15-20 agents running
– Aggregation of the data was performed at insertion by way of a trigger (avg, max, min)
Building the Data Model - Tuning
tblCounterHistory / tblCounterInstance:
– 300 rows per interval became 1,000 rows by allowing ALL instances!
– With just one agent running, the table reached 1 million rows within a week
– Once the table started exceeding 1 million rows, we were in trouble! The database was straining to view the data even with reasonable indexing
– Performance, of course, was abysmal even with 2 agents running
– Aggregation of the data was changed to occur hourly, and there was no archiving strategy in place: we had encountered our first major issue
– Why? Because our archiving strategy was not part of the design!! (Big mistake)
Building the Data Model – Re-factoring
tblCounterView (ID, LastDateTime, PerformanceCounterID) – populated by a trigger from tblCounterHistory; this became the "snapshot" of the current state
tblCounterHistory (ID, PerformanceCounterID, CounterInstanceID, RSMID, Active) – deletion and aggregation became hourly; only 1 hour of data was maintained
tblCounterHistoryDaily (ID, RSMID, CounterInstanceID, Value, LastDateTime, PerformanceCounterID) – deletion and aggregation nightly
tblCounterHistoryArchive (RSMID, CompanyID, MaxValue, MinValue, AvgValue, PerfCounter, PerfObject, CounterInstance, AggregateDate) – long-term aggregates for trend reporting
We not only adjusted the model, but built the first foundation of the report architecture.
In the final product, we removed aggregation from the DB completely and moved it to the agent. It took 7 weeks to get the agent to run efficiently, but it was worth it! A sketch of the idea follows.
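As a rough illustration of the agent-side aggregation idea, a minimal VB.NET sketch; the ArrayList-of-samples structure and the method names are assumptions, not the product's actual code:

Imports System.Collections

Public Class CounterAggregator
    ' Summarises raw counter samples at the agent, so only the
    ' min/max/avg travel over SOAP instead of every raw reading.
    ' The ArrayList of Double samples is an illustrative assumption.
    Public Sub Summarise(ByVal samples As ArrayList, _
            ByRef minValue As Double, ByRef maxValue As Double, _
            ByRef avgValue As Double)
        If samples.Count = 0 Then Return
        minValue = Double.MaxValue
        maxValue = Double.MinValue
        Dim total As Double = 0
        For Each sample As Double In samples
            If sample < minValue Then minValue = sample
            If sample > maxValue Then maxValue = sample
            total = total + sample
        Next
        avgValue = total / samples.Count
    End Sub
End Class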
The Pyramid of Error
One flaw at the base cascaded up through every layer:
Flaw in Data Model
→ Re-Engineering of Database
→ Recoding of Web Services
→ Recoding of SQL Agent
→ Recoding of Portal
Concurrency – the coward's way out?
We weren't building a large enterprise system, but a system that an SME would be running: one server to run probably everything. Which meant:
The most likely scenario was the portal retrieving transactional data from a highly transactional system. (Does this sound like a disaster in the making?)
Transactional data being sent to the presentation layer at the portal experienced MAJOR concurrency issues. The only solution that would work for us was:
SHOCK! HORROR!!! SELECT with (NOLOCK)
Our alerts were real-time, so there was no perceived delay; it was a choice that worked for this application, as the sketch below shows.
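A minimal sketch of the approach, assuming a hypothetical portal query against tblEvent (the column names are illustrative):

Imports System.Data
Imports System.Data.SqlClient

Public Class EventReader
    ' Reads the latest events without taking shared locks, so the portal
    ' neither blocks nor is blocked by the agents' inserts. The trade-off
    ' is the possibility of dirty reads - acceptable for monitoring data.
    Public Function GetRecentEvents(ByVal conn As SqlConnection) As DataTable
        Dim sql As String = _
            "SELECT TOP 100 EventID, RSMID, EventText, EventDate " & _
            "FROM tblEvent WITH (NOLOCK) ORDER BY EventDate DESC"
        Dim adapter As New SqlDataAdapter(sql, conn)
        Dim events As New DataTable
        adapter.Fill(events)
        Return events
    End Function
End Class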
However, this did not solve all of our concurrency issues, so…
Concurrency – letting SQL do its thing…
Connection re-use: as soon as we adjusted the Web Service to have each WebMethod use a single connection, our performance increased by about 20%. This can be achieved either through code or connection pooling.
This resulted in locks being held for shorter periods of time.
We replaced DataAdapters with SqlCommand.ExecuteScalar wherever possible, thereby reducing the amount of data we were retrieving (see the sketch below).
We adjusted all highly transactional tables to use row-level locking, but left the others alone.
We stopped specifying any type of lock to hold in our procs.
We then sat back and tested.
SQL held locks, but released them fast! This was the result we wanted.
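A minimal sketch of the single-connection WebMethod pattern combined with ExecuteScalar; the method, table, and connection-string names are illustrative assumptions:

Imports System.Configuration
Imports System.Data.SqlClient
Imports System.Web.Services

Public Class RSMService
    Inherits WebService

    ' One connection for the whole WebMethod, closed in Finally so any
    ' locks are released as quickly as possible.
    <WebMethod()> _
    Public Function GetActiveCounterCount(ByVal rsmID As Integer) As Integer
        Dim conn As New SqlConnection( _
            ConfigurationSettings.AppSettings("ConnectionString"))
        Try
            conn.Open()
            ' ExecuteScalar returns a single value rather than filling a
            ' DataSet through a DataAdapter - far less data retrieved.
            Dim cmd As New SqlCommand( _
                "SELECT COUNT(*) FROM tblCounterView WHERE RSMID = @RSMID", _
                conn)
            cmd.Parameters.Add("@RSMID", rsmID)
            Return CInt(cmd.ExecuteScalar())
        Finally
            conn.Close()
        End Try
    End Function
End Class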
Large Data Sets and what to do with them
Retrieving a large number of events from the database became a performance bottleneck. Instead of just relying on tuning, we not only tuned the indexes on the event table but also used .NET's built-in DataView object and its RowFilter.
The proc retrieved the entire dataset from a well-indexed table, and instead of going back to the database to retrieve a new, filtered rowset, we bound the original data to a DataView and applied a filter, with excellent results.
Eeek!!!!!!! Dot net code!!! (This little snippet increased performance by 100%!)
' DataFilter is a page property holding the current filter expression.
If Me.DataFilter <> "" Then
    ' Filter the rows we already have in memory via the table's
    ' DefaultView, instead of re-querying the database.
    Dim oDV As DataView = Me.RSMCounters.DefaultView
    oDV.RowFilter = Me.DataFilter
    Me.datagridRSMCounters.DataSource = oDV
    Me.datagridRSMCounters.DataBind()
Else
    ' No filter: bind the grid to its original data source.
    Me.datagridRSMCounters.DataBind()
End If
Reporting Services Integration
1. User selects a report in the portal
2. Retrieve the list of available reports from tblReports
3. Get the report server name from tblReportEnvironment
4. Build the URL based on parameters of where we were in the portal
5. Redirect the browser to the URL
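A minimal sketch of steps 4 and 5, using Reporting Services' standard URL access syntax (rs:Command=Render); the report path and parameter names are illustrative assumptions, and this would live in one of the portal's ASP.NET pages so that Response is in scope:

' Illustrative sketch of the redirect step; the report path and
' parameter names are assumptions, not the product's real ones.
Private Sub RedirectToReport(ByVal serverName As String, _
        ByVal companyID As Integer, ByVal rsmID As Integer)
    ' Standard Reporting Services URL access: report path, then
    ' rs:Command=Render, then the report parameters drawn from
    ' where we are in the portal.
    Dim url As String = "http://" & serverName & "/ReportServer?" & _
        "/RemoteSquirrel/CounterHistory" & _
        "&rs:Command=Render" & _
        "&CompanyID=" & companyID.ToString() & _
        "&RSMID=" & rsmID.ToString()
    Response.Redirect(url)
End Sub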
Collaboration
or
(How not to be bullied by developers!)
In most cases, you want your stored procedures to represent your logical model… however:
Sometimes it is best to let the application do the logic of assembling the data structure, versus retrieving it all in one "hit" in a complicated proc.
– Why? Because it is quicker!
For example, see the sketch below:
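As an illustration of the idea: two simple, fast procs called from the application and related in code, rather than one complicated multi-join proc. All proc, class, and column names here are hypothetical:

Imports System.Data
Imports System.Data.SqlClient

' Illustrative only - every name in this sketch is hypothetical.
Public Class CompanyLoader
    ' Runs one simple, fast proc and returns its rows.
    Private Function RunProc(ByVal conn As SqlConnection, _
            ByVal procName As String, ByVal companyID As Integer) As DataTable
        Dim cmd As New SqlCommand(procName, conn)
        cmd.CommandType = CommandType.StoredProcedure
        cmd.Parameters.Add("@CompanyID", companyID)
        Dim table As New DataTable
        Dim adapter As New SqlDataAdapter(cmd)
        adapter.Fill(table)
        Return table
    End Function

    ' Two cheap round-trips, related in application code, instead of
    ' one complicated multi-join proc.
    Public Sub LoadCompanyView(ByVal conn As SqlConnection, _
            ByVal companyID As Integer)
        Dim company As DataTable = RunProc(conn, "usp_GetCompany", companyID)
        Dim agents As DataTable = RunProc(conn, "usp_GetAgentsForCompany", companyID)
        For Each agentRow As DataRow In agents.Rows
            ' Application-side "join": relate each agent to its company
            ' while building the view.
            Console.WriteLine("{0}: {1}", _
                company.Rows(0)("CompanyName"), agentRow("RSMName"))
        Next
    End Sub
End Class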
Our database – the philosophy
Golden Rules:
– Don't put app logic in the db
– Don't put db logic in the app
– Use your db as a data storage device, nothing more
We only use a few cursors, just for initialisation, and they are run once. If you think you need a cursor to run more than once, forget Golden Rule #1!
Use .NET to loop through a data set and evaluate each row… it's quick and far more efficient (a sketch follows).
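A minimal sketch of that loop; the table shape and the threshold check are illustrative assumptions:

Imports System.Data

Public Class RowEvaluator
    ' Evaluates each row in memory - the work a T-SQL cursor would
    ' otherwise do inside the database. Column names and the 90%
    ' threshold are hypothetical.
    Public Sub CheckThresholds(ByVal history As DataTable)
        For Each row As DataRow In history.Rows
            Dim value As Double = Convert.ToDouble(row("Value"))
            If value > 90.0 Then
                Console.WriteLine("Alert: RSM {0} at {1}", _
                    row("RSMID"), value)
            End If
        Next
    End Sub
End Class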
DRI
If you choose to have your application manage referential integrity, then the integrity is only as strong as your worst developer.
The database is never wrong!
We let SQL generate the error and then handle it at the application layer (see the sketch below).
The application should be ignorant of the physical database, and only be concerned with how to read and write objects.
Use cascading deletes and updates – they can save hours of development time.
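A minimal sketch of letting SQL Server raise the integrity error and handling it in the application; error 547 is SQL Server's constraint-violation error, and the table name and message shown are illustrative:

Imports System.Data.SqlClient

Public Class CompanyWriter
    ' Attempts the delete and lets the database enforce integrity.
    ' If a foreign key (or cascade rule) rejects it, SQL Server raises
    ' error 547 and the app translates it for the user.
    Public Function DeleteCompany(ByVal conn As SqlConnection, _
            ByVal companyID As Integer) As Boolean
        Dim cmd As New SqlCommand( _
            "DELETE FROM tblCompany WHERE CompanyID = @CompanyID", conn)
        cmd.Parameters.Add("@CompanyID", companyID)
        Try
            cmd.ExecuteNonQuery()
            Return True
        Catch ex As SqlException When ex.Number = 547
            ' Constraint violation: the database said no, and it is right.
            Console.WriteLine("Company still has dependent records.")
            Return False
        End Try
    End Function
End Class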
The Future
2-way SOAP communication – creating SQL "queues" of commands at the agent
Wireless job summary and control agents
Yukon – executing all functionality via the CLR
Extending the architecture beyond monitoring:
– Retail sales summaries
– GPS and SOAP
Contact
More Info - www.squirrelconsulting.com.au/LearnMore.htm
Demo – www.remotesquirrel.com
Company www.squirrelconsulting.com.au
[email protected]