Posts Tagged ‘iis’

IIS 8.5 Boncode Tomcat Connector

March 4th, 2016 No comments

This guide assumes you have already installed Tomcat 7/8. Tomcat offers a connector (mod_jk), but it’s very old and requires a lot of configuration, so I’ll explain how to install and configure the connector from Boncode that’s hosted on RiaForge. This specific example was for an installation of HP Service Manager running on Windows 2012 / IIS 8.5.


Download the latest version from RiaForge.
Extract the .zip file to a temp folder.


Run “dotnetfx35setup.exe” if .NET 3.5 is not yet installed on the server.

Execute “Connector_Setup.exe”.
The installer has copied the binaries and the configuration file to “C:\Windows”.


Open the file “BonCodeAJP13.settings” and modify the content as below. Feel free to change the path to LogFiles.
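The screenshot showing the modified file has not survived, so here is a minimal sketch of what BonCodeAJP13.settings typically contains. The element names and values below are assumptions — adjust Server and Port to match your Tomcat AJP connector (8009 is Tomcat’s default AJP port) and LogDir to your own log folder:

```xml
<Settings>
  <!-- host and AJP port of the Tomcat instance to forward to -->
  <Server>localhost</Server>
  <Port>8009</Port>
  <!-- 0 = no logging; raise for troubleshooting -->
  <LogLevel>1</LogLevel>
  <!-- the IIS Application Pool user needs write access here -->
  <LogDir>C:\Temp\BonCodeLogs</LogDir>
</Settings>
```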


Make sure the IIS Application Pool user has write access to the LogDir folder!
A folder named “BIN” has been created in the root of your website. It will be empty, so copy these two files into it:

  • C:\Windows\BonCodeAJP13.dll
  • C:\Windows\BonCodeIIS.dll
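Assuming the site root is C:\inetpub\wwwroot (substitute your own site path), the copy boils down to two commands from an elevated command prompt:

```bat
copy C:\Windows\BonCodeAJP13.dll C:\inetpub\wwwroot\BIN\
copy C:\Windows\BonCodeIIS.dll C:\inetpub\wwwroot\BIN\
```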


Now we need to tell IIS which folders we want to forward to Tomcat. For HPSM this will be “smsso”, so create a new folder named “smsso” in the root of the website.


Copy the “BIN” folder into “smsso”.


Now add a handler mapping in IIS. Go to the subfolder named “smsso” and add a new managed handler.


Request path: *
Type: [select BonCodeIIS…. from the dropdown]
Name: BonCodeAll
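Behind the scenes, adding the managed handler writes an entry to the web.config of the “smsso” folder. A sketch of the resulting fragment is below — the exact type string is whatever the BonCodeIIS entry in the dropdown resolves to, so the one shown here is an assumption:

```xml
<configuration>
  <system.webServer>
    <handlers>
      <!-- forward every request under this folder to the BonCode connector -->
      <add name="BonCodeAll" path="*" verb="*" type="BonCodeIIS.BonCodeCallHandler" />
    </handlers>
  </system.webServer>
</configuration>
```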


Run “iisreset” to reload the settings file.

Categories: IIS

IIS LogParser scripts

April 17th, 2012 No comments

Not everyone uses Google Analytics or another third-party logfile parser, so a customer asked me to create some reports based on their IIS logfiles. The web is full of examples, so one more or less won’t make a difference 🙂

First step is to download Logparser v2.2 and install it on a PC; it doesn’t have to be the server. The default installation path is “C:\Program Files\Log Parser 2.2”.
Queries can be run from the command line or via an external .sql file. I prefer the latter method, so I create three subfolders: “SQL”, “Logs” & “Results”. I copied the IIS logfiles to subfolders under “Logs”.

Mike Lichtenberg has a demo page with 50 examples to give you an idea of what’s possible. Below are two simple examples that my customer wanted to have.

1. number of hits per day

The customer (an internal webservice) has no stats tool, so he wanted to know how many hits this webservice received over a period of time. The query is below; I will explain it afterwards. Save this query in a text file and name it “RequestPerDay.sql”.

SELECT QUANTIZE(TO_TIMESTAMP(date, time), 86400) AS Day, COUNT(*) AS Total
FROM Logs\%cFolder%\u_ex*.log WHERE cs-uri-stem NOT LIKE '/loadbalancetestpage.html' GROUP BY Day ORDER BY Day

What we do above is convert the date/time to a timestamp and group by day (86400 seconds). If you replace 86400 with 3600 you get stats per hour, but the customer wanted to compare 6 months, so that would have produced far too much output. Notice the ‘%cFolder%’ variable: there are numerous webservices, and this way I can easily target a different subfolder. I also exclude a test page that is used by the hardware loadbalancer. Running this straight from the command line looks like this; notice how I pass the folder variable!

Logparser file:sql\RequestPerDay.sql?cFolder=N1-B

It works, but it’s not very handy. Luckily LogParser supports other output formats, like CSV export (don’t we all love that!). Let’s try the command above again with some extra parameters.

Logparser file:sql\RequestPerDay.sql?cFolder=N1-B -stats:off -o:CSV > results\CSV\ReqPerDay.csv

So, nicely formatted and we can now use this data to create graphs in Excel or in any other tool that supports CSV-data.
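As mentioned earlier, swapping 86400 for 3600 turns the daily report into an hourly one. The variant query (same folders and variable as RequestPerDay.sql) would look like this:

```sql
SELECT QUANTIZE(TO_TIMESTAMP(date, time), 3600) AS Hour, COUNT(*) AS Total
FROM Logs\%cFolder%\u_ex*.log
WHERE cs-uri-stem NOT LIKE '/loadbalancetestpage.html'
GROUP BY Hour ORDER BY Hour
```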

2. Hits for a specific page

You know by now how it works, so we’ll skip the blabla and go straight to the query and the output.

SELECT TO_TIMESTAMP(date, time) AS Day, cs-method AS Method,
STRCAT( cs-uri-stem,
  REPLACE_IF_NOT_NULL(cs-uri-query, STRCAT('?', cs-uri-query))
) AS Request,
STRCAT( TO_STRING(sc-status),
  STRCAT( '.',
    COALESCE(TO_STRING(sc-substatus), '?')
  )
) AS Status
FROM Logs\%cFolder%\u_ex*.log WHERE cs-uri-stem LIKE '/Pages/dymmyname.aspx' ORDER BY Day

Above we concatenate the URL with its query string, and the status with its substatus. The advantage is that you see 401.3 instead of just error 401.

Logparser file:sql\BO-Pickup.sql?cFolder=N1-B -stats:off -o:CSV > results\csv\BO-Pickup.csv

The possibilities are endless, so go ahead and try some queries of your own!

Categories: IIS

PowerShell: delete files older than x days

August 25th, 2010 1 comment

A classic script, and we all need it some day. Edit the folder path and adjust the number of days to your clean-up schedule. The script works recursively, so it will also clean up subfolders.

As always, make sure you can execute PowerShell scripts:

Set-ExecutionPolicy RemoteSigned

The Script…

$Now = Get-Date
$Days = 30
$TargetFolder = "C:\WINDOWS\system32\LogFiles"
$LastWrite = $Now.AddDays(-$Days)
$Files = Get-ChildItem $TargetFolder -Include *.log, *.txt -Recurse | Where-Object {$_.LastWriteTime -le $LastWrite}

foreach ($File in $Files){
	Write-Host "Deleting File $File" -ForegroundColor "Red"
	Remove-Item $File | Out-Null
}
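Before letting the script delete anything for real, you can preview which files it would touch by adding PowerShell’s standard -WhatIf switch to the Remove-Item call:

```powershell
# dry run: prints what would be removed without actually deleting it
Remove-Item $File -WhatIf
```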
Categories: Powershell