
# Slick Analytics with LogParser and SqlBulkCopy

What do you do when you need to add analytics to your server? Well, Google Analytics is a good option, for sure, but it doesn't capture important, juicy details like bandwidth usage, time to complete requests, cache hits, and so on. If you need data like this, the best source is going to be the web access logs from your HTTP server.

I've taken a look at a few 3rd-party tools for creating analytics from these log files, and there are downsides to every one of them. Cacti is a pretty good option; however, it requires a few things that may not be installed on a Windows server. So, based on the fact that most existing solutions aren't designed to fit seamlessly into a Windows environment, I decided to come up with my own solution. My solution is based on a couple of off-the-shelf Microsoft products that, if you are running on the Microsoft stack, should already be licensed to you:

- Log Parser 2.2 (for turning W3C logs into an easy-to-import format; it may also support Apache W3C logs, but I haven't tested it)
- SQL Server Reporting Services (or your favorite reporting suite)

So, what is the secret sauce? Well, SQL Server's ADO.NET provider exposes the SQL Server bulk import API through a class called SqlBulkCopy. We can easily craft a C# program to take the clean, consistent output of Log Parser and stream it into SQL Server at breakneck speeds.

## Log Parsing

First, let's get Log Parser humming. Microsoft's Log Parser accepts SQL-like commands from the command line and, depending on the particular command, can output to a variety of text formats, execute SQL statements, or create images. Here is an example of the type of SQL statement we are looking for:

```sql
SELECT * INTO log1.csv FROM C:\WINDOWS\system32\LogFiles\W3SVC1\ex110101.log
```

However, as you may be able to tell from the file path, we need to parameterize the source for different IIS sites and dates. Here is an example command script that can take care of those variables:

```bat
Set logparser="C:\Program Files\Log Parser 2.2\LogParser.exe"
%logparser% "SELECT * INTO log-w3svc%siteid%-ex%logdate%.csv FROM C:\WINDOWS\system32\LogFiles\W3SVC%siteid%\ex%logdate%.log"
```

If you run that, you should end up with a CSV file in the current directory containing the current day's logs for the default IIS website. The script also accepts the website ID number as a parameter on the command line, if you want to run this for more than one site.

Now, here I would like to note that Log Parser supports a mode of operation in which it remembers where it left off in a log file and skips there on subsequent runs. However, I have found that, since IIS does not flush its log files and since Windows caches writes, the log will sometimes end halfway through an entry. When a situation like this arises, Log Parser gets confused and completely fails to properly parse any further entries. In addition, its SQL support is useful, but it is unable to do the incremental loads that we would like to do.

We need a place to shove that data, but we have two masters to serve at this point. On one hand, we want to transfer as little data to the database server as possible, for obvious reasons. On the other, we don't want to have long-running transactions against the main IIS logs table, since it is really made for OLAP and the response time of queries is paramount. This all leans toward a scheme of staging and merging data, rather than cherry-picking new rows to import. And, even though we don't want to transfer tons of data, we should be OK with one day's worth in a staging table. After the data is staged, it can be incrementally moved into the main storage table to reduce the impact on the analytics. So, we will need two (very similar) tables during our import process: the staging table and the main storage table.

The main storage table should match the W3C log format, like so (abridged):

```sql
... bigint IDENTITY(1,1) NOT NULL PRIMARY KEY,
...
[datetime] AS (CONVERT(datetime2(0), date + CONVERT(datetime, time, 0), 0)) PERSISTED,
...
```

The staging table will be an almost identical temp table, with the single omission of the computed 'datetime' column.

So, now it is time to explore the power of the SqlBulkImporter class. This code is available on GitHub, so there is no need to follow along unless you are building a custom version. In light of this, I will be skimming over a few of the easier parts. Our importer needs to be able to read the CSV files and spit the data into SQL Server. Luckily, both of these are supported by Microsoft's ADO.NET providers.

First, we will use a standard SQL Server connection for the destination database. Next, we will use an ODBC connection for the CSV files, with a connection string of the form:

```text
Driver=…; Dbq=.
```

View otac0n on GitHub
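To make the read-CSV-then-stream-to-SQL flow concrete, here is a minimal Python sketch of the same batched streaming idea. It is an illustration only, not the article's C# code: the standard library's `csv` and `sqlite3` modules stand in for the ADO.NET providers, `executemany` plays the role of `SqlBulkCopy`'s batched writes, and the `bulk_import` function and table layout are hypothetical.

```python
import csv
import io
import sqlite3

def bulk_import(conn, csv_text, table, batch_size=1000):
    """Stream CSV rows into `table` in fixed-size batches, in the
    spirit of a bulk-copy API that consumes rows as a stream."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)  # first row names the columns
    placeholders = ",".join("?" * len(header))
    sql = f"INSERT INTO {table} VALUES ({placeholders})"
    batch, total = [], 0
    for row in reader:
        batch.append(row)
        if len(batch) >= batch_size:
            conn.executemany(sql, batch)  # flush a full batch
            total += len(batch)
            batch = []
    if batch:
        conn.executemany(sql, batch)  # flush the remainder
        total += len(batch)
    conn.commit()
    return total

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (date TEXT, uri TEXT, status INTEGER)")
rows = bulk_import(
    conn,
    "date,uri,status\n2011-01-01,/index.html,200\n2011-01-01,/missing,404\n",
    "log",
)
```

Batching keeps memory bounded no matter how large the day's log file is, which is the same property that makes the streaming bulk-copy approach fast.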
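The computed 'datetime' column folds the W3C log's separate date and time fields into a single timestamp. The same combination can be sketched in Python; `combine` is a hypothetical helper for illustration, not part of the importer.

```python
from datetime import datetime

def combine(date_field: str, time_field: str) -> datetime:
    """Fold the W3C log's separate date and time fields into one
    datetime value, as the computed column does on the SQL side."""
    return datetime.strptime(f"{date_field} {time_field}", "%Y-%m-%d %H:%M:%S")

dt = combine("2011-01-01", "13:45:30")
```

Doing this once, as a persisted computed column, means queries can filter and sort on a single indexed value instead of recombining the two fields on every read.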
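The staging-and-merge scheme can be sketched in Python with `sqlite3` standing in for SQL Server. This is an illustration of the shape of the scheme, not the article's actual T-SQL or C#; the table and function names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE main_log (date TEXT, uri TEXT, status INTEGER);
    CREATE TABLE staging  (date TEXT, uri TEXT, status INTEGER);
""")

def stage_and_merge(conn, rows):
    """Bulk-load one day's rows into the staging table, then move
    them into the main table in one short transaction and clear
    the staging table, keeping lock time on main_log minimal."""
    conn.executemany("INSERT INTO staging VALUES (?, ?, ?)", rows)
    with conn:  # a single short transaction for the merge step
        conn.execute("INSERT INTO main_log SELECT * FROM staging")
        conn.execute("DELETE FROM staging")

stage_and_merge(conn, [("2011-01-01", "/a", 200), ("2011-01-01", "/b", 304)])
```

The point of the split is visible in the code: the slow part (loading rows) touches only the staging table, while the main table is locked only for the brief set-based move.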
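The log file name encodes the site and the date: `ex110101.log` under `W3SVC1` is January 1, 2011 for site 1, which is what the script's `%siteid%` and `%logdate%` variables must expand to. As a quick sketch of that naming scheme in Python (the function name is hypothetical):

```python
from datetime import date

def iis_log_name(d: date, site_id: int) -> str:
    """Build the exYYMMDD-style IIS log file name for a site,
    e.g. W3SVC1/ex110101.log for site 1 on 2011-01-01."""
    return f"W3SVC{site_id}/ex{d:%y%m%d}.log"

name = iis_log_name(date(2011, 1, 1), 1)
```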
