Aug 28, 2014
 

Depending on how many sites your Windows web server is hosting, maintaining IIS logs can be a challenge. IIS logs provide valuable insight into the traffic your sites are experiencing, as well as detailed SEO metrics and performance data. A typical web server has just enough free disk space for future growth needs but is ultimately limited by the capacity of the drives in the server. If left unmonitored, IIS logs can quickly fill any remaining disk space on the server. There are a few third-party tools that are good at compressing log files when they all sit under one parent folder, but when the log files are in different locations, such as on a WebsitePanel server I support, an alternative solution is needed. In this walkthrough I will demonstrate how I solved this challenge with a .NET console application and GZipStream.

 

What about Windows Folder Compression?

Enabling Windows folder compression should always be the first step when freeing up space used by IIS logs. I always redirect the site logs to a folder at the root level of the server so it’s easy to find them and monitor their growth. In the example below I saved 8.3 GB of disk space without even having to zip any files. However, Windows folder compression was not going to work for my WebsitePanel server.

Capture12
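If you would rather script this than click through the folder Properties dialog, the built-in compact.exe utility can enable NTFS compression from the command line. A minimal sketch, run from an elevated prompt; the path here is just an example:

```shell
rem Compress the IIS log folder and everything beneath it
rem /c = compress, /s = include subdirectories, /i = continue on errors, /q = quiet
compact /c /s:D:\IISLogs /i /q
```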

 

Finding WebsitePanel IIS Logs

WebsitePanel is a wonderful Windows control panel for multi-tenant web hosting, and the best part is that it’s free. If you haven’t discovered this incredible application yet, check it out. Once installed, every aspect of site creation and maintenance is completely automated. As shown in the picture below, customer sites are usually deployed with a directory structure like c:\HostingSpaces\customername\sitename. The IIS logs are always stored further down the tree in …\sitename\logs\W3SVCXX. As you can imagine, on a web server with hundreds of sites it would be quite a challenge to track down all the W3SVC folders and manually apply Windows compression to each one, and the problem only grows as new sites are added to the server.

Untitled-1 copy
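Before writing any code, you can get a quick inventory of just how many of these folders exist with a one-line directory search. A sketch, assuming the hosting spaces live under C:\HostingSpaces:

```shell
rem Recursively list every directory named W3SVC* under the hosting spaces tree
rem /s = recurse, /b = bare paths, /ad = directories only
dir /s /b /ad "C:\HostingSpaces\W3SVC*"
```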

 

Using Directory.EnumerateFiles and GZipStream

I knew I was going to do some coding to solve my challenge, but I also wanted to keep the solution easy to use. In years past I would have considered VBScript folder recursion and the PKZip command-line tool, but I wanted something more contemporary. So I opened Visual Studio 2013 and created a Console Application.

Capture9

 

image

 

The GZipStream class, introduced in .NET 2.0, would handle the necessary file compression. Since I would only be compressing individual files, I didn’t need a separate library such as DotNetZip.

public static void Compress(FileInfo fileToCompress)
{
    using (FileStream originalFileStream = fileToCompress.OpenRead())
    {
        if ((File.GetAttributes(fileToCompress.FullName) & 
            FileAttributes.Hidden) != FileAttributes.Hidden && fileToCompress.Extension != ".gz")
        {
            using (FileStream compressedFileStream = File.Create(fileToCompress.FullName + ".gz"))
            {
                using (GZipStream compressionStream = new GZipStream(compressedFileStream, CompressionMode.Compress))
                {
                    originalFileStream.CopyTo(compressionStream);
                    Console.WriteLine("Compressed {0} from {1} to {2} bytes.",
                        fileToCompress.Name, fileToCompress.Length, compressedFileStream.Length);
                }
            }
        }
    }
}

The Directory.EnumerateFiles method, introduced in .NET 4, would find all the IIS log files in the WebsitePanel hosting spaces directory tree. The SearchOption.AllDirectories overload, combined with a LINQ Where clause, let me search for .log files while limiting the results to paths containing “W3SVC”. In the snippet below I loop through the search results, compress any log file older than one day, and delete the original file. The last step is to write the name of the file that was just zipped to an audit log.

public static void zipFiles()
{
    try
    {
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log", 
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            if (fi.LastWriteTime < DateTime.Now.AddDays(-1))
            {
                Compress(fi);
                fi.Delete();
                activityLog("Zipping: " + file);
            }
        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}

I only needed 30 days’ worth of log files left on the server. In the snippet below I delete any zipped log files matching my date criteria and then write the activity to the audit log.

public static void deleteFiles()
{
    try
    {
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.gz", 
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            if (fi.LastWriteTime < DateTime.Now.AddDays(-daysToSave))
            {
                fi.Delete();
                activityLog("Deleting: " + file);
            }

        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}

Using an XML Config File

To help keep my application flexible I created a simple XML file to store the path to the WebsitePanel hosting spaces folder and the number of days of logs I wanted to keep on the server.

<?xml version="1.0" encoding="utf-8"?>
<MyConfig>
  <HomeDirectory>D:\HostingSpaces</HomeDirectory>
  <DaysToSave>30</DaysToSave>
</MyConfig>

 

Loading these values into my program is done by parsing the XML data using XmlDocument.

public static void readConfig()
{

    string path = System.IO.Path.GetFullPath(@"logCleanConfig.xml");
    XmlDocument xmlDoc = new XmlDocument();
    xmlDoc.Load(path);
    XmlNodeList HomeDirectory = xmlDoc.GetElementsByTagName("HomeDirectory");
    homeDirectory = HomeDirectory[0].InnerXml;

    XmlNodeList DaysToSave = xmlDoc.GetElementsByTagName("DaysToSave");
    daysToSave = Convert.ToInt32(DaysToSave[0].InnerXml);

}

 

Running the Program

Running the program from a command line, I can see the progress as files are compressed. Afterward I set up a scheduled task so it runs every day, compressing new IIS log files and purging the ones older than 30 days.

Capture7
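Creating that daily task can itself be scripted with schtasks. A sketch, assuming the console application was published to C:\Tools\LogClean\LogClean.exe; the path and task name are placeholders:

```shell
rem Run the log compression tool every night at 2 AM under the SYSTEM account
schtasks /create /tn "Compress IIS Logs" /tr "C:\Tools\LogClean\LogClean.exe" /sc daily /st 02:00 /ru SYSTEM
```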

 

A few days later I came across several sites on the server using CMS products with their own application logs. These log files were not being cleaned up by the site owners and were eating into free space on the server, so I decided to archive them as well. Since they were stored within each site in a subfolder called “Logs”, tweaking the Where clause applied to the Directory.EnumerateFiles results would catch them. With a quick rebuild of the program, it now compresses the CMS log files as well as the IIS log files.

foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log",
  SearchOption.AllDirectories).Where(d => d.Contains("W3SVC") || d.Contains("Logs")).ToArray())

In Summary

IIS logs contain a wealth of data about web site traffic and performance. Left unchecked, the logs will accumulate over time and eat up disk space on the server. Windows folder compression is a simple way to reclaim valuable disk space. If your log files are in different locations, then leveraging .NET with Directory.EnumerateFiles and GZipStream will do the trick. Thanks for reading.

Aug 09, 2014
 

One of the many benefits of virtual servers over physical servers is the ability to add resources such as CPU, RAM, and disk space on the fly without downtime. An additional drawback of a physical server is that you are often limited by its physical capacity; once those limits are reached, the server can’t be upgraded further. Adding resources also required powering off the server, which in turn meant coordinating with business owners and impacted users. Not all editions of Windows support hot-add, so be sure to confirm your server is supported before starting. In this walkthrough I’ll show how easy it is to add server resources using VMware’s vSphere client.

Logging into vSphere Client

After authenticating on my network with the VMware vSphere 5.5 client, I go to Hosts and Clusters under Inventory. From here I have access to all the virtual servers configured in our environment. After selecting the server to be upgraded, you will see the Getting Started tab. From here you have access to the usual administrative tasks such as starting, stopping, and restarting the server, as well as performance reporting and events about the server. Click Edit virtual machine settings to add resources.

Capture2

 

Enabling RAM and CPU Hotplug

Adding the additional resources is straightforward. However, when you begin you may find the CPU and Memory properties disabled. This indicates that the server has not previously been enabled for hot adding resources. In this instance the server will need to be shut down before you can upgrade these resources.

Capture3

 

Fortunately, fixing this for future upgrades is a simple matter. While the server is powered down, click on the Options tab of the Virtual Machine Properties. Under the Advanced settings, go to the Memory/CPU Hotplug properties. Click Enable memory hot add and Enable CPU hot add, then click OK to save the changes. After the server is powered back on, you will be able to add CPU and Memory without having to shut down the server first.

 

Untitled-1
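If you have many guests to update, the same two checkboxes can also be set with VMware PowerCLI while each VM is powered off. This is a rough sketch rather than the method used in this walkthrough; it assumes you have already run Connect-VIServer, and the VM name is a placeholder:

```shell
# PowerCLI sketch: enable CPU and memory hot add on a powered-off VM
$vm   = Get-VM -Name "MyServer"
$spec = New-Object VMware.Vim.VirtualMachineConfigSpec
$spec.CpuHotAddEnabled    = $true
$spec.MemoryHotAddEnabled = $true
$vm.ExtensionData.ReconfigVM($spec)
```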

 

To add additional virtual CPUs simply increase the Number of virtual sockets and click OK to save the changes.

core

 

To add additional Memory to the server adjust the Memory Configuration accordingly and click OK to save.

Untitled-2

 

 

Adding Additional Disk Space

In addition to CPU and Memory, during this maintenance window I am also going to add disk space, which is just as straightforward. In the Virtual Machine Properties, on the Hardware tab, go to the Hard disk settings. Increase the Provisioned Size by the new amount and click OK to save the changes. Windows will not automatically recognize the new space, so the final step of the upgrade is to log into the server and extend the server’s disk drive. This can be accomplished either through vSphere’s server console window or by connecting to the server with Remote Desktop.

Capture5

 

Extending Windows Disk Space

After logging into Windows, open the Computer Management snap-in. In the console tree, click on Disk Management under Storage. You may need to Rescan the disks before Windows sees that the new space is available.

Capture6
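The rescan and extend steps can also be done from an elevated command prompt with the built-in diskpart tool instead of the GUI. A sketch of the commands typed at the DISKPART> prompt; the volume number will vary on your server:

```shell
rem Rescan for the new space, then extend the volume into it
diskpart
rescan
list volume
select volume 2
extend
```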

 

Step through the Extend Volume Wizard to allocate the additional space on the existing volume.

Capture7

 

In Summary

VMware vSphere offers System Administrators complete control over virtual server properties. Adding CPU, RAM, and disk space is straightforward and in many cases can be performed without shutting down the server. To help minimize downtime in your next maintenance window, double-check that your edition of Windows supports hot-add and confirm the Memory/CPU Hotplug property has been enabled. Thanks for reading.

Dec 27, 2013
 

Thanks to Microsoft’s Web Platform Installer (Web PI), installing IIS has never been easier. Before Web PI was available, you had to use Server Manager to install the Web Server (IIS) role and then select the various Role Services you needed enabled. Depending on your level of expertise this could be a challenging task, with lots of scrolling back and forth and click upon click to get things just right, but now you can have IIS deployed with just three clicks of your mouse.

Install Web PI

If you’re not familiar with Web PI, it is a powerful tool that can be used to install not only IIS but also SQL Server Express, Visual Web Developer Express, PHP, WordPress, Umbraco, and many other third-party applications from the Windows Web Application Gallery. If you haven’t already done so, first download Web PI and install it. It’s free and has a small footprint of only 2 MB.

image

Select IIS Recommended Configuration

Once Web PI has been installed, launch the program. It will open to the Spotlight tab, so click on the Products tab and click Add next to IIS Recommended Configuration. If you don’t see it in the opening list, just search for it. All you need to do after this is click Install at the bottom of the window.

 

image

 

You may be curious as to what options are installed with the IIS Recommended Configuration. Here is what will be installed:

  • ASP.NET
  • Static Content
  • Default Document
  • Directory Browsing
  • HTTP Errors
  • HTTP Logging
  • Logging Tools
  • Request Monitor
  • .NET Extensibility
  • Request Filtering
  • Static Content Compression
  • ISAPI Extensions
  • ISAPI Filters
  • WAS Process Model
  • Management Console
  • WAS Configuration API
  • WAS .NET Environment
  • .NET 4.5 Extended with ASP.NET for Windows 8
  • .NET 3.5 for Windows 8

Before the installation starts you need to accept the license terms so just click I Accept.

image

 

The installation will run for a few minutes installing the essential features for IIS to work properly.

image

 

Once Web PI has completed installing IIS just click Finish.

image

 

Using IIS Manager

Your server is now ready for hosting web sites. Open IIS Manager and you’ll see the Default web site has been configured.

image

 

When you browse http://localhost you’ll see the familiar IIS Start Page.

image

This page is named iisstart.htm and appears in the Default Documents list above default.aspx, so once you upload your web site files be sure to delete this page.
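You can confirm the default document order from the command line with appcmd; a quick check, run from an elevated prompt (the site name is the default one created by the install):

```shell
rem List the defaultDocument section for the default site
%windir%\system32\inetsrv\appcmd list config "Default Web Site" /section:defaultDocument
```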

Next Steps?

Now that you have IIS installed, what’s next? You’ll want to go back to Web PI and at least install FTP Publishing. Once FTP Publishing is installed, look into configuring FTP User Isolation as well as using FTP over SSL for greater security when transferring content to and from your server. You may also want to install URL Rewrite 2.0 from Web PI. URL Rewrite offers many ways to rewrite URLs for SEO, perform 301 redirects, and block page requests.

Summary

The Web Platform Installer (Web PI) is a powerful tool for deploying a wide variety of third-party applications such as WordPress and other popular CMS products, but it can also be used to install IIS or even SQL Server Express on your server. Web PI offers unparalleled ease and convenience when installing applications on Windows servers. Thanks for reading.

Oct 07, 2013
 

When you need quick analysis of your traffic logs, you won’t find a better tool than Microsoft’s free Log Parser. With Log Parser you can read a variety of log files, including the Registry and Windows event logs. Its ease of use comes from running SQL queries against your log file. You can get your data even faster by putting multiple Log Parser queries in a batch file.

image

The other day I was helping someone who needed some “top 10” data from their site’s log. Since I had these queries in my trusty batch file, I could provide the text reports within seconds. However, I like to offer a little more pizzazz when possible, so this time I decided to use Log Parser’s native charting capability to output the results as some nice charts. As the saying goes, a picture is worth a thousand words.

Here’s the query I used to create the chart above:

logparser.exe -i:iisw3c "select top 10 cs-uri-stem, count(*)  into top10requests.gif 
from <file> group by cs-uri-stem order by count(*) desc" 
-o:CHART -chartType:pieexploded3d -categories:off -chartTitle:"Top 10 Requests"

 

Installing Office Web Components

Charting is a native feature of Log Parser; however, it depends on the Office 2003 Add-in: Office Web Components. Depending on where you are running Log Parser, the first time you try to output your query to a chart you may see this error message:

Error creating output format “CHART”: This output format requires a licensed Microsoft Office Chart Web Component to be installed on the local machine

If you didn’t see the error above then you’re all set, but if you did, you will need to install the Office Web Components before you can start outputting charts. Once you’ve downloaded the file, just accept the License Agreement and click Install.

image

The installation runs quickly. Click OK to close the window.

image

 

Example Log Parser Reports with Charts

Now you’re ready to start creating some colorful charts. The most useful parameters in my opinion are -chartType, -chartTitle, -categories, -values, and -legend. There are some 20+ chart types to choose from, including Pie, PieExploded, PieExploded3D, LineStacked, Line3D, BarClustered, ColumnClustered, and SmoothLine. The default chart type is Line. To see all the possible chart options, run this simple command:

LogParser -h -o:CHART

To take your charts to the highest level of customization, you can use an external configuration script written in JScript or VBScript. Take a look at the MSDN ChartSpace Object Model documentation for more information.

Here are a few different charts with various options.

image

logparser.exe -i:iisw3c "select top 10 cs-uri-stem, count(*)  into top10requests.gif 
from x.log group by cs-uri-stem order by count(*) desc" 
-o:CHART -chartType:pieexploded3d -categories:off -chartTitle:"Top 10 Requests"

 

 

image

logparser.exe -i:iisw3c "select top 10 sc-status, count(*)  into top10errorcodes.gif 
from x.log group by sc-status having sc-status not in ('200') order by count(*) desc" 
-o:CHART -chartType:column3d -categories:on -values:on -chartTitle:"Top Status Codes"

 

 

image

logparser.exe -i:iisw3c "select top 10 cs-uri-stem, count(*)  into top10_404.gif 
from x.log group by cs-uri-stem, sc-status having sc-status in ('404') order by count(*) desc" 
-o:CHART -chartType:BarClustered3D -values:on -categories:on -chartTitle:"Top 10 404 Status"

 

image

logparser.exe -i:iisw3c "select quantize(time, 60) as TimeGenerated, count(*) as Hits into 
hitsperminute.gif from %1 group by TimeGenerated" -o:chart -chartType:Line -chartTitle:"Hits per Minute"

 

 

 

image

 

logparser.exe -i:iisw3c "SELECT TOP 10 cs-uri-stem AS RequestedFile, COUNT(*) AS TotalHits, 
MAX(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime into slow.gif from x.log 
where EXTRACT_FILENAME(cs-uri-stem) not in('%begin%') GROUP BY cs-uri-stem ORDER BY MaxTime, TotalHits DESC" 
-o:CHART -chartType:barclustered3d -values:off -categories:on -chartTitle:"Top 10 Slowest Requests"

 

In Summary

Microsoft’s Log Parser is a powerful tool for log file analysis. You can use it to analyze text files, CSV files, Windows event logs, and even the Windows Registry. You can make boring reports come alive with colorful charts. There is a dependency on Office Web Components for charting to work, but that is easily solved. Thanks for reading.

Dec 09, 2012
 

The other day I was helping someone who was trying to configure a wildcard certificate on their Windows Cloud Server. Their server was running Windows 2008 R2 with IIS 7. They were technically savvy and knew how to configure sites on their own and install a regular SSL certificate, but they were stuck trying to get a wildcard certificate configured properly.

They had quite a few sites configured using subdomains such as support.domain.com, mail.domain.com, login.domain.com, etc. To tighten security they decided to use SSL to protect all these sites, so they bought a wildcard certificate for *.domain.com. They installed the new certificate on the first site correctly, but when they tried doing the same on the second site they couldn’t: IIS wouldn’t let them assign the certificate to the other sites using a shared IP address. Does this sound familiar? Here’s how you can solve it, and it’s easier than you think.

Here are 4 sites configured in IIS using host headers of the form xxx.domain.com with the same IP address.

image

 

After installing our wildcard SSL certificate we assign the binding on the first site.

image

 

Testing the site we see that the wildcard SSL certificate is working great.

image

 

Now we go to site #2 and try to assign our certificate. However we’re not able to enter a host name for site #2.

image

 

If we click OK and try to proceed, we get a warning about the dire consequences of our actions. As soon as we try to access site #2 using SSL, IIS will actually stop site #1, which causes all kinds of issues for our users.

image

 

Now that we’ve seen the problem, let’s get to the solution. According to my friend, former coworker, and IIS MVP Scott Forsyth, this is not a bug; the IIS team designed it to work this way. There are two ways to solve this particular issue when using a wildcard SSL certificate. One way is to execute the following appcmd command for each binding that you need.

appcmd set site /site.name:"<site name>" /+bindings.[protocol='https',bindingInformation='*:443:<host header>']

This certainly works; however, I tend to have a hard time remembering the syntax, which leads us to the second method, which in my opinion is far easier and has to do with how the wildcard SSL certificate was originally installed.

Remember back when you had just received the completed wildcard certificate from your SSL vendor? Like most IT people, you were probably in a hurry when you installed it. Do you remember what you entered when you were prompted for a Friendly name before saving it? Chances are you just entered “domain.com”, when what you should have specified is “*.domain.com”. Doh!

You can check this easily by looking at the certificate store in IIS Manager. If the Name column doesn’t show the *, then you need to change it before SSL bindings on multiple sites will work properly.

image

 

So how does one change the Friendly name after the certificate has already been installed? Open the MMC Certificates snap-in, right-click the certificate, and change the Friendly name to *.domain.com. Save the changes and close the MMC.

image
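If you prefer the command line to the MMC, the Friendly name can also be changed from an elevated PowerShell prompt. A sketch; the thumbprint is a placeholder you would copy from your own certificate:

```shell
# PowerShell sketch: update the Friendly name on an installed certificate
$cert = Get-ChildItem Cert:\LocalMachine\My\<thumbprint>
$cert.FriendlyName = "*.domain.com"
```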

 

Now that the Friendly name has been changed to *.domain.com, go back to IIS and try to add the SSL binding for site #2: now it works. Woohoo!

image

 

Now you can add your wildcard certificate to as many subdomain host-header sites as needed using only one IP address, and you don’t have to remember any command syntax. I hope this helps, and thanks for reading.

Sep 26, 2012
 

What’s that? Robocopy can send emails? Well, not exactly, but here’s how you can do it.

Robocopy is a free tool from Microsoft that was originally part of the Windows Resource Kit for Windows NT 4. It was then included as a standard feature with Windows Server 2003 and each subsequent release of Windows Server. It is a powerful tool for transferring data from one place to another. You can transfer data between two folders on the same server or between two servers, provided you use an account with permissions on both. There are extensive command switches available to control how the data is transferred, such as preserving ACL permissions, directory structures, and even empty folders. The basic syntax is as follows:

robocopy Source Destination [File[ ...]] [Options]

Recently I had to transfer nearly 50 GB of data from a legacy server to a new server. This migration involved two production mail servers, and there would be an impact to users, so downtime needed to be minimal. I had done thorough testing, so I knew it was going to take approximately 3 hours to transfer the data.

Since I was transferring the data from a legacy server to a new Windows 2008 R2 server, I planned to initiate robocopy from the new server. By doing this I was able to use the /MT multi-threading switch. I had done several tests using different values for this switch and settled on 32; increasing the value beyond 32 did not produce any noticeable difference in transfer speed. Without the switch I was consistently seeing 150 MB/min transfer speeds. Here is the syntax that I used:

robocopy.exe  "\\oldserver\bigfolder" "D:\newfolder" /LOG:d:\temp\log.txt /MT:32 /E /W:5 /Z /V /R:6

I knew that once I initiated the robocopy job there was nothing to do but wait for the data transfer to finish. Furthermore, I did not want to have to frequently check the destination server to know how the transfer was progressing. So I needed a way to count the folders as they showed up on the destination server and then send me an email.

The solution was to leverage the power of Windows Script Host, which is native to Windows 2008 servers. Using VBScript I could easily create a script to count the folders in a directory and then send myself an email. The second half of this challenge was to have that script run every 10 minutes, which I explain later in this post.

Using VBScript, I create a function that checks a folder and returns the count of folders it contains.

function countFolders(strPath)
    dim objShell
    dim objFolder
    dim folderCount

    set objShell = CreateObject("shell.application")
    set objFolder = objShell.NameSpace(strPath)

    if (not objFolder is nothing) then
        dim objFolderItems

        set objFolderItems = objFolder.Items

        if (not objFolderItems is nothing) then
            folderCount = objFolderItems.Count
        end if

        set objFolderItems = nothing
    end if

    set objFolder = nothing
    set objShell = nothing

    countFolders = folderCount
end function

Next I needed a function to send an email with the folder count. I format the message so that the pertinent details are in the subject line.

Function SendCount(strCount)

Set objMessage = CreateObject("CDO.Message")  
objMessage.Subject = strCount
objMessage.From = "user@mydomain.com"  
objMessage.To = "user@mydomain.com"  
objMessage.TextBody = strCount
 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2  
 
'Name or IP of Remote SMTP Server 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/smtpserver") = "mail.mydomain.com" 
 
'Server port (typically 25) 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25 
 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/smtpauthenticate") = 1 
 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/smtpusessl") = false 
 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/sendusername") = "user@mydomain.com" 
 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/sendpassword") = "xyz123" 
 
objMessage.Configuration.Fields.Update 

objMessage.Send  
End Function

Now that I’ve got my two functions, I need to wire them up. I create local variables for the destination path being monitored, the total number of folders being copied, and a counter.

Option Explicit

Dim destFolder 
Dim maxFolders
Dim folderCount

destFolder="d:\destinationfolder"
maxFolders=500
folderCount=countFolders(destFolder)

sendCount("F: " & folderCount & "  " & FormatPercent(folderCount/maxFolders))

At this point the script is ready to run. I knew there were 500 folders being copied to the new server, so each time the script ran it would send me an email with the number of folders copied and the percent complete. At the beginning of this post I indicated I needed this automated, so the next step is to log into the destination server and create a scheduled task to run the new script. Since I know from testing that this transfer will take approximately 3 hours, I schedule the task to end after 3 hours, running every 10 minutes until then.
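Creating that task from the command line might look like the following sketch, assuming the script was saved as C:\Scripts\countfolders.vbs; the start time is a placeholder, and cscript’s //B switch keeps the script from popping up any windows:

```shell
rem Run the folder-count script every 10 minutes for 3 hours starting at 9 PM
schtasks /create /tn "Copy Progress" /tr "cscript //B C:\Scripts\countfolders.vbs" /sc once /st 21:00 /ri 10 /du 0003:00
```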

Sep 15, 2012
 

If your web site is hosted on a dedicated server (cloud or physical), then chances are you have some internal processes which need to happen on a recurring basis. The Windows Task Scheduler is a wonderful built-in tool that fulfills this need. The location of this program changed between Windows Server 2003, where it was in the Control Panel, and Windows Server 2008, where it lives in Administrative Tools.

With the Windows Task Scheduler you can run any program on the server, including custom scripts, at any time with any recurring frequency. So this is great news for system admins, but what happens if you’re a web developer and you designed an admin page on your site to perform some internal housekeeping whenever the page is loaded? As you can imagine, you don’t want to sit at your desk all day hitting the refresh button.

So here’s where the power of Windows Task Scheduler comes into view. We just need to create a new scheduled task to visit the web site. Unfortunately, this is not directly possible: Task Scheduler is not able to browse sites (though that would be a cool feature for a future release). So are we done before we’ve started? What could be used to open a web site URL that we could then in turn schedule as a task? Look no further than Microsoft’s XMLHTTP object. I always say “there’s no school like old school”, and in this case it is absolutely true.

The following VBScript is all we need to open the web site URL programmatically.

On Error Resume Next

Dim objRequest
Dim URL

Set objRequest = CreateObject("Microsoft.XMLHTTP")
URL = "http://www.peterviola.com/testme.htm"

objRequest.open "GET", URL, false
objRequest.Send
Set objRequest = Nothing

Just cut and paste the snippet above into a simple .vbs text file on your server and it will be ready to run. If you run it manually it won’t open a browser, but the request completes. To verify it works, just check your web site logs. With this bit of code we have a way to programmatically call a web site URL from within our server without having to be logged into it. So, looking back at our original “task”, we now have all the pieces in place to get the job done.

The next step is to configure Windows Task Scheduler, and here again Microsoft makes it easy for us. When you open Task Scheduler, on the right side of your screen click “Create Basic Task” and the Create Basic Task Wizard will launch. Just follow the steps and complete the wizard.

You will be prompted to choose the program you want to run. Use the menu to find the .vbs file you created earlier.

After you complete the wizard, your task will be ready to run on the schedule you picked. However, in some cases you may want your task to run more frequently than once per day. Using the advanced properties, you can choose to repeat the task as frequently as every 5 minutes, forever.
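For reference, the wizard steps above boil down to something like this one-liner; it assumes the script from earlier was saved as C:\Scripts\hitpage.vbs (a placeholder path):

```shell
rem Call the site's housekeeping page every 5 minutes via the VBScript wrapper
schtasks /create /tn "Hit Test Page" /tr "cscript //B C:\Scripts\hitpage.vbs" /sc minute /mo 5
```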

As I mentioned above, you can confirm it works by checking the www logs for your site. Using the powerful findstr command as shown below, I can pull out just the requests for my test page:


findstr /S /I /P /c:"GET /testme.htm" C:\wwwlogs\W3SVC1\u_ex120915.log >testme.txt

Here are the results, which clearly show the scheduled task is working as expected, hitting my test page every 5 minutes.


2012-09-15 18:50:22 W3SVC74 ABC123 x.x.x.x GET /testme.htm - 80
2012-09-15 18:55:22 W3SVC74 ABC123 x.x.x.x GET /testme.htm - 80
2012-09-15 19:00:22 W3SVC74 ABC123 x.x.x.x GET /testme.htm - 80
2012-09-15 19:05:22 W3SVC74 ABC123 x.x.x.x GET /testme.htm - 80

This simple technique can be leveraged in so many powerful ways. Thanks for reading!
