Jul 27, 2014
 

The other day I was troubleshooting 100% CPU utilization on a SQL Server 2008 database server. The server hosted 100 or so databases of varying sizes, none larger than a few hundred MB, and each database had a corresponding web site on a separate web server. Since the server hosted quite a few databases, the high CPU needed to be resolved quickly because it was causing issues for everyone. High CPU on a database server can often be symptomatic of an issue occurring outside the server, and in this case the real cause turned out to be a SQL Injection attack against one of the web sites.

Top Queries by Total CPU Time

The knee-jerk reaction when experiencing high CPU may be to stop it immediately, either by restarting services or recycling app pools, but letting it run temporarily will help you isolate the cause. SQL Server 2008 has some great built-in reports to help track down CPU utilization. On this occasion I used the Top Queries by Total CPU Time report. You can get to this report by right-clicking on the server name in SQL Server Management Studio and then selecting Reports.

[Screenshot: Reports menu in SQL Server Management Studio]

 

The Top Queries by Total CPU Time report takes a few minutes to run, but once it completes it provides a wealth of information: a Top 10 list clearly showing which queries and databases are consuming the most CPU on the server at that moment. Using this report I was able to see that one of the databases on the server had 4 different queries running that were contributing to the high CPU. Now I could focus my attention on this one problematic database and hopefully resolve the high CPU.

 

[Screenshot: Top Queries by Total CPU Time report]
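
The report is the quickest route, but if you prefer querying the server directly, a rough equivalent against the plan cache (a sketch only, and it will show only queries that are still cached) looks something like this:

-- Sketch: top 10 cached queries by total CPU time, roughly what the report shows
SELECT TOP 10
    qs.total_worker_time / 1000 AS total_cpu_ms,
    qs.execution_count,
    DB_NAME(st.dbid) AS database_name,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset END
          - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;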

 

SQL Profiler and Database Tuning Advisor

Now that I knew which database was causing the problems, I fired up SQL Profiler for just a few minutes to get a better understanding of the activity occurring within the database. Looking at the high number of Reads coming from the application named "Internet Information Services", I was starting to realize that web site activity was hitting the database pretty hard. I could also see plaintext data being inserted into the database, and it was clearly spam.

[Screenshot: SQL Profiler trace showing heavy reads and spam inserts from the web site]

 

Before I turned my attention to the web site, however, I wanted to see if there could be any performance improvement using the Database Engine Tuning Advisor, since I now had valuable profiler trace data. The DTA analyzes the database activity and provides a SQL script with optimizations using indexes, partitioning, and indexed views. Usually with DTA I'll see a 5-10% performance improvement, so I was excited to see a 97% improvement!

[Screenshot: Database Engine Tuning Advisor reporting a 97% improvement]

Preventing SQL Injection with IIS Request Filtering

After I applied the optimization script from the Database Engine Tuning Advisor, the CPU utilization on the database server improved considerably. However, I knew the web site was experiencing suspicious activity, so I used Log Parser to pull some reports from the site's traffic log. Using the query below I could see the most frequently used querystring values, and it was obvious the site was experiencing a SQL Injection attack.

logparser.exe -i:iisw3c "select top 20 count(*), cs-uri-query from ex140702.log group by cs-uri-query order by count(*) desc" -rtp:-1 > file.txt

 

With attacks like this a natural inclination is to start blocking IP addresses. Unfortunately, sophisticated attacks use a variety of IP addresses, so as soon as you block a few addresses, malicious requests from new ones take over. A better solution is to block the malicious requests with IIS Request Filtering, so I quickly added a few rules to block keywords I had seen in my Log Parser reports.

[Screenshot: IIS Request Filtering rules blocking the malicious query strings]
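
The same rules live in the site's web.config under the requestFiltering section, so they can also be added there directly. This is just a sketch; the sequences shown are placeholders for whatever keywords your own Log Parser reports turn up.

<system.webServer>
  <security>
    <requestFiltering>
      <denyQueryStringSequences>
        <!-- Placeholder keywords; replace with the strings seen in your logs -->
        <add sequence="DECLARE" />
        <add sequence="CAST(" />
        <add sequence="EXEC(" />
      </denyQueryStringSequences>
    </requestFiltering>
  </security>
</system.webServer>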

 

Implementing the IIS Request Filtering rules stymied the SQL Injection attack. Using the Log Parser query below I could see the HTTP status codes of all the requests hitting the site with the new rules in place.

SELECT STRCAT(TO_STRING(sc-status), STRCAT('.', TO_STRING(sc-substatus))) AS Status, COUNT(*) AS Total INTO TopStatusCodes.txt FROM w3svc.log GROUP BY Status ORDER BY Total DESC

 

Request Filtering uses the HTTP substatus 404.18 when a query string sequence is denied. Looking at the Log Parser report below, you can see that 50,039 requests were blocked by the new Request Filtering rules.

[Screenshot: Log Parser report of top status codes, with 50,039 requests returning 404.18]

An Ounce of Prevention…

The web site that had been attacked hosted free cooking recipes and allowed visitors to submit their own. Unfortunately, the owner's goodwill was easily exploited: there was no form field validation on the site's submission page, and new recipes were automatically displayed on the site without being approved. This is a dangerous site design and should never have been deployed without basic security measures in place.
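
Beyond validating the form fields, the insert itself should never be built by concatenating user input into a SQL string. A parameterized call, sketched below with a hypothetical dbo.Recipes table, keeps submitted text from ever being executed as SQL; the same idea applies to SqlParameter objects in the web application's data access code.

-- Sketch with a hypothetical dbo.Recipes table: user input travels as parameters, not as SQL text
DECLARE @title nvarchar(200) = N'Chocolate Chip Cookies';
DECLARE @body  nvarchar(max) = N'Cream the butter and sugar, then...';

EXEC sp_executesql
    N'INSERT INTO dbo.Recipes (Title, Body) VALUES (@Title, @Body);',
    N'@Title nvarchar(200), @Body nvarchar(max)',
    @Title = @title,
    @Body  = @body;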

I did a quick select count(*) on the recipe table in the database and was amused by all the "delicious" recipes I found.

[Screenshot: select count(*) result from the recipe table]

 

In Summary

SQL Server 2008 has several built-in reports, like Top Queries by Total CPU Time, to help investigate high CPU utilization. Running SQL Profiler provides a detailed view of database activity, and feeding the profiler output into the Database Engine Tuning Advisor can yield significant performance improvements for the database. IIS Request Filtering is a powerful tool for blocking SQL Injection attacks against a web site, but better still, SQL Injection can be easily mitigated up front with basic data validation. Thanks for reading.

Nov 06, 2012
 

Microsoft's SQL Server Express is a fantastic product for anyone needing a relational database on a limited budget, and by limited budget I'm talking free. Yes, SQL Server Express is free, but it comes with a few limitations: it will only use 1 GB of RAM, databases are limited to 10 GB, and it does not include SQL Profiler. For low-volume sites that do not need enterprise-level capabilities, this is a compelling solution. Here is a complete SQL Server feature comparison of all the SQL Server editions.

There is one other important limitation that needs to be mentioned: SQL Server Express does not include SQL Server Agent. This means you cannot schedule administrative tasks such as backing up your databases. As everyone knows, backing up data is critical, and with SQL Server Express you can still back up your databases, but you have to do it manually. So what do you do if you don't have the budget to license SQL Server but you need scheduled backups of your databases? Thanks to the folks at lazycoding.com there is a solution, and it's called SQL Scheduler.

SQL Scheduler is a simple, lightweight application that installs on your server and runs as a service. Using the application's GUI you can create multiple jobs to run your backups on a predefined schedule, and you can even configure it to send you an email notification when a job completes or fails to run properly.

After you download the program, unzip the archive and run the InstallService.bat file from the command prompt. The installation completes quickly.

[Screenshot: SQL Scheduler service installation]

Once you launch the program you just need to connect to the local SQL Server Express instance on your server. Here is how it will appear without any jobs configured.

[Screenshot: SQL Scheduler connected, with no jobs configured]

From the File menu create a new job. In this example we'll create one job that takes a full backup of each database configured on the server.

[Screenshot: creating a new job from the File menu]

Click on the Details tab and you'll be able to enter the SQL statements for whatever task you're trying to accomplish. Here is an example backup script you can use to run a full backup of each database on your server. Be sure to edit the script and change the path to where you want the database backups stored on your server.

[Screenshot: example backup script entered on the Details tab]
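
As a starting point, here is a minimal sketch of that kind of script: it loops through every user database and writes a dated full backup. The C:\Backups\ folder is only an assumption, so change it to wherever you want the backups stored; the script shown in the screenshot also stored each database's backups in a subfolder.

-- Sketch: full backup of every user database to a dated .bak file
-- The folder C:\Backups\ is an assumption; edit it to match your server
DECLARE @name sysname, @path nvarchar(400), @stamp nvarchar(8);
SET @stamp = CONVERT(nvarchar(8), GETDATE(), 112); -- yyyymmdd

DECLARE db_cursor CURSOR FOR
    SELECT name FROM sys.databases
    WHERE name NOT IN ('master', 'model', 'msdb', 'tempdb'); -- skip system databases

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @path = N'C:\Backups\' + @name + N'_' + @stamp + N'.bak';
    BACKUP DATABASE @name TO DISK = @path;
    FETCH NEXT FROM db_cursor INTO @name;
END

CLOSE db_cursor;
DEALLOCATE db_cursor;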

On the Schedule tab you can define the time and date your job will run.

[Screenshot: Schedule tab for the backup job]

To have the job send you an email upon completion or in the event of a failure, open the SQLScheduler.WindowsService.exe.config file and enter the mail server address along with the username and password for authentication.

<system.net>
  <mailSettings>
    <smtp deliveryMethod="Network" from="someone@somewhere.com">
      <network host="localhost" userName="x" password="x" />
    </smtp>
  </mailSettings>
</system.net>

 

Once the job runs, a full backup will be created for each database and stored in a subfolder of the path you specified in the script.
Now that your daily full backups are being created locally on the server, the next step would be to configure either Windows Server Backup or a 3rd party online backup solution.

As stated in the beginning, this only makes sense if you have a limited budget; otherwise you should license either SQL Server Standard Edition or SQL Server Web Edition. SQL Server Express is an ideal product for Cloud Server hosting. If you need Windows Cloud Server hosting, take a look at Orcsweb.com. I know them well, and I happen to work there.

Sep 07, 2012
 

Have you ever struggled to figure out why your SQL Server database is using so much space? If your database is hosted on a dedicated server, then perhaps it's not an issue for you. But what if you're on a server with limited storage? For example, if your web site is on a shared hosting plan, then chances are you have a limited amount of space for your SQL Server database. If your database grows over the plan allotment, you will probably be looking at an overage fee unless you can clean up some space within the database. Regardless of where your database is hosted, it's always a good idea to keep your database growth under control.

With a SQL Server database, the transaction log is usually the first place to look when you need to free up space. The transaction log is what allows the database to be restored, in the event of a system crash, to the very minute before the crash occurred. If your web site is not being used for ecommerce, then you probably do not need point-in-time recovery, and by switching the SQL Server recovery model to Simple you'll be able to free up a lot of the space being used by the transaction log.
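
Switching the recovery model is a one-line change (the database name below is hypothetical). After the switch, the log no longer has to retain everything needed for point-in-time restores, so its space can be reused instead of growing.

-- Hypothetical database name; switch to the Simple recovery model
ALTER DATABASE YourDatabase SET RECOVERY SIMPLE;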

So let's assume you've already switched the recovery model to Simple to minimize the size of your database, but your hosting provider is still telling you the database is over the allotment and you need to clean up more space. You could "shrink" the database, but that's a topic for another day, so instead we'll focus on the tables in the database and see if you can clean out some legacy data.

The logical question is: how does one determine which tables in a SQL Server database are using the most space? SQL Server has a stored procedure called sp_spaceused which can show you the total size of your database or, alternatively, how much space a single table is using. For example, you can check a table called products like this:

sp_spaceused products
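
Run with no arguments at all, the same procedure reports totals for the entire database, including the overall database size, unallocated space, and the space used by data and indexes:

sp_spaceused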

This is very handy, but not very practical if you have a lot of tables to check. Fortunately, thanks to Bill Graziano over at SQLTeam.com, this important but mundane task is now a breeze. He created a SQL script called bigtables.sql that you can run to list the 25 biggest tables in your database. It will quickly and clearly show you how much space each of your tables is using.

Let's take it a step further by creating a stored procedure to run the script, so the functionality is always ready when we need it. Here is a script that creates a stored procedure called sp_bigtables. When we run the stored procedure, the results clearly show which tables are using the most space, and the script displays every table in the database, not just the largest ones.
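
If you just want a quick approximation without downloading anything, a query along these lines (a sketch, not the SQLTeam script) sums reserved pages per table from sys.dm_db_partition_stats:

-- Sketch: tables ordered by reserved space (8 KB pages converted to KB)
SELECT
    t.name AS table_name,
    SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS row_count,
    SUM(ps.reserved_page_count) * 8 AS reserved_kb
FROM sys.dm_db_partition_stats AS ps
JOIN sys.tables AS t ON t.object_id = ps.object_id
GROUP BY t.name
ORDER BY reserved_kb DESC;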

Now that you've identified the biggest tables in your SQL Server database, you can focus on cleaning out some legacy data. Be sure to take a backup before deleting any data from your database.

One final tip: run DBCC UPDATEUSAGE ('databasename') against your database before running the script above so that you get the most accurate results.

[Screenshot: DBCC UPDATEUSAGE output]

 
