Saturday, 31 December 2011

Happy New Year



As 2011 draws to a close and people look forward to an evening of partying and celebrating, I’d just like to wish everyone a very happy, healthy and prosperous New Year.

Monday, 19 December 2011

Contracting is business not employment: Read the small print

This post is about contracting in the UK. Some of the terms mentioned here will not mean a great deal to people who are unfamiliar with the mechanics of contracting or the UK tax system.

I recently got sent an email by a friend and fellow contractor that contained a link to the following news article:

Feel free to go away and have a read of the article, but the gist is this: a well-known global financial institution has decided that it is possibly paying over the odds for its contractors, and as such it is enforcing a 10% rate cut across the board in an effort to cut costs. It doesn’t want to lose the contractors, but it does want to reduce the amount it pays for them.

The feeling of the comments left on the post seems to be that people are outraged by such a move, while the survey results suggest opinion is split. Some people asked whether the rate cut is legal, or even possibly discrimination.

The answer is a simple one in my opinion. Note: I am not a qualified lawyer, nor am I giving legal advice; these are simply my views on the issue.
It’s very important to remember that contractors are not employees; contractors are providing a business service. If you are a contractor in the UK you are entering a business relationship, not an employment contract. You may be a small business; it might just be you employed by your company. You may be part of a larger organisation and there may be more of you, but the principle is the same. If you are in business for yourself and secure contracts with clients, the terms of the agreement represent a business relationship. The terms of your contract are governed by the business agreement that you have entered into. If the agreement says that a rate cut can legally take place, or at least that the rate can be renegotiated, and you signed up to it, then, like it or not, there is not much you can do.
Now, what’s happened here is what happened to me with a previous client. The line in the contract I had with the agent was worded something like this, and at the time it seemed inconspicuous: “If for any reason we can no longer do business with the client, we will provide you with four weeks notice of termination.” It doesn’t seem like much when you read it the first time, but the letter and explanation I received made it clear that this was the clause being invoked, and it went something like this: “We have been advised by the client that they can no longer do business with you at the current rate, so this is your 4 weeks notice…[I got the same nicety about it not being a reflection of my work etc.]…The client would like to keep you on, but the new offer will be at 10% less than your current rate. Please let us know by email by this date if you wish to extend or if you will be ending the contract at the end of the four weeks.” Having discussed the issue with several legal people who happen to be friends, as a director of my company and business I had a choice: take the new contract, or accept that the contract would conclude at the end of the agreement. I won’t say what I did here, as that is irrelevant to the discussion.

Now, they couldn’t treat employees like this, but you often see business contracts being renegotiated depending on the terms agreed initially. That is what this is: a renegotiation of the business deal. The fact that there is a clause in there that allows this, and that it has been executed, implies a business relationship exists… In my opinion it goes some way to supporting your IR35 status. Every cloud has a silver lining.
Contracting is business, not employment. Business can be tough, but the risks can be rewarding. I guess it’s up to you whether you want the risk of a business relationship or you want to be an employee.

What do you think? I’d be interested in any other views that people have to offer.
Contractor Calculator has some useful advice about avoiding rate cuts in these situations, although I guess that will depend on your ability to negotiate business deals.

Thursday, 15 December 2011

Execute Permissions

Written by Ian Treasure

Gethyn posted on the use of execute permissions a while ago. Briefly, this approach uses a role – db_executor – to which users are given membership. The execute permission is granted to the role.
I am using this approach to control access to stored procedures. Some developers started to complain that they could not execute their stored procedures. The first thing I did was check that their user was a member of the db_executor role. It was, which was a surprise. I then started to check that the role had execute permissions. This was not so easy to do.
After a little digging, I used sp_helprotect as follows.
In SQL Server Management Studio, I ran:

sp_helprotect 'execute'

This returns the following:
Msg 15330, Level 11, State 1, Procedure sp_helprotect, Line 291
There are no matching rows on which to report.
OK – now I know that the problem is that the database role does not have execute permissions. So all I need to do is run:
GRANT EXECUTE TO [db_executor]

And if I repeat sp_helprotect, I now see the following:
db_executor      dbo      Grant                Execute            .
which shows that db_executor has execute permissions.
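For reference, the whole pattern can be sketched in T-SQL. The database, role and user names below are illustrative, not the ones from the environment described above:

```sql
-- Sketch: create the role, grant EXECUTE at database scope,
-- then add a user to the role (names are illustrative)
USE MyDatabase;
GO
CREATE ROLE [db_executor];
GRANT EXECUTE TO [db_executor];

-- membership alone is not enough; the GRANT above is the key step
EXEC sp_addrolemember 'db_executor', 'SomeUser';

-- verify the grant
EXEC sp_helprotect 'execute';
```

The database-level GRANT EXECUTE covers all current and future stored procedures in the database, which is why a missing grant shows up as every developer losing execute rights at once.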

Wednesday, 14 December 2011

Contracting, SQL Server and

A while back I started posting some contractor-related articles. The articles were mainly for those people who wanted to start their own business and work for themselves as contractors. I work in the IT sector through my own company, with several clients based in the UK and across the pond in the US. I have been running my company and working for myself for almost five years now, and even through one of the most difficult economic times in modern history I have thoroughly enjoyed it; I wouldn’t change anything that has happened thus far.

A while back I started a new section that I intended to focus on contracting and some career-based themes. At the time I thought it would be useful to split the contractor/career posts out from the SQL Server content that usually gets published here. This week, for a number of reasons, I have decided that was not such a good idea.

My reasons for splitting at the time were simple: the two people who read my blog (my wife Lisa and my Mum – thanks both!) are expecting the content to be SQL Server related, and I didn’t want to mix the two. It turns out that I don’t write the career/contracting posts frequently enough to warrant splitting them into a blog of their own. I have cross-posted some of those posts and they have been quite well received on the main SQL Server blog. So I will be removing the contractor/career blog, and from now on all my posts, whatever the subject, will appear here.

So, as it’s December and a New Year is just around the corner, I have decided that the blog needs a bit of a tidy-up and spring clean. Over the next few weeks you will see some small changes in the look and feel. Nothing major: some old things will be removed, like the contracting section, and some new things added. I will hopefully have more posts and articles from both Ian and David; I do have a couple outstanding that I need to publish. There will be more SQL Server content too – with the release of SQL Server 2012 due by the end of June next year, hopefully we will have lots of posts showing off the new features.

So when will we get an RTM version of SQL Server 2012? Here are my thoughts. SQLBits, which is scheduled for the end of March 2012, is the official launch, and I believe the March/April SQL Connections in Las Vegas is also a launch event. Even though the launch is planned for the spring and we recently got RC0 (Release Candidate 0), the official line at the PASS keynotes was the first half of 2012. The launch events for SQL Server 2008 were before the RTM date too, which means it could be as late as June. In fact, TechEd is due to start June 25th; if I was a betting man, that is the date I would pick.

Thursday, 24 November 2011

Destroying a Windows Cluster

I haven’t posted for a while, but I thought I’d dust off the blog and get a post out. I have a pipeline of posts from David and Ian waiting to be scheduled, and they should be out before Christmas.

Warning – Do not do this on your production boxes

I have been doing some work, first with CTP3 and now RC0, on Availability Groups. For those of you that don’t know, you need a Windows cluster configured to use AlwaysOn Availability Groups. I wanted to remove the cluster from my virtual nodes so I could rebuild it (that’s a long story). I thought, before I started, that removing the cluster would be difficult. As it turns out, it is very simple in Windows 2008 R2.

To destroy your cluster from Failover Cluster Manager, right-click the name of your Windows cluster, scroll down to other tasks and select <Destroy Cluster>. You get a warning which, if you accept, goes away and removes the cluster from all nodes in your cluster… It’s gone!
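The same thing can be scripted from the command line on Windows 2008 R2. A sketch, assuming a cluster named MyCluster (an illustrative name); as above, never run this against a production cluster:

```powershell
# Option 1: the legacy cluster.exe utility
cluster.exe /cluster:MyCluster /destroy

# Option 2: the FailoverClusters PowerShell module that ships with 2008 R2
Import-Module FailoverClusters
Remove-Cluster -Cluster MyCluster -Force    # -Force suppresses the confirmation prompt
```

Either route does the same thing as the Failover Cluster Manager wizard: it evicts all nodes and removes the cluster configuration.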

As I said, don’t do this on your production boxes, but if you ever want to remove clustering from your Windows environment, it is quite simple to do.

Thursday, 6 October 2011

SQLBits Friday and Saturday


By David Postlethwaite

The last two days of SQLBits provide opportunities to hear short 50-minute presentations from experts
on a diverse range of SQL-related subjects.

On Friday I stuck to a DBA theme.
I heard Simon Sabin talk on performance tuning at the last SQLBits, but there’s no harm hearing it
again; there’s always something you missed last time. He certainly hates scalar functions.
David Morrison went through query plan operators; again, although you’ve heard them all before,
you pick up something new.
Both covered similar ground, and the overall view is that there are many ways to write a query to
achieve the same output. Exactly what code you finally use will depend on many factors, like the amount
of data, the cardinality of the data and the available indexes. You’ve just got to keep looking
at the query plan, understand it and change the query until you get acceptable results.
And finally, Klaus Aschenbrenner on SQL 2008 troubleshooting, looking at all those DMVs and how they
can help keep your SQL Server under control.

I would have gone to another talk on Friday but the hotel was so hot and stuffy that I had to get out for a
breather. England had the hottest September 30th in history and the air-con couldn’t cope.

Saturday: still hot, but slightly more bearable.
This time I went to more development-orientated talks.
Database Unit Testing using VS2010 with Anthony Brown: I’ve never tried a database project in Visual
Studio. I’m sure, like many people, I just create a database in SSMS and then create a web or Windows
project to write the application. Much better to use a database project, especially when VS can link to
source control. The unit testing ability of VS was pretty neat as well.

Moving on from VS to Project Juneau with Gert Drapers. This is the new developer tool in SQL Denali,
a replacement for BIDS I think. Looks rather cool. I’ve always wanted to be able to step through SQL
scripts like you can in C#, and it looks like this will do it, as well as lots more useful features.

Lightning talks can throw up all sorts of interesting things and this one was no exception. I heard about
someone using SQL spatial objects to record the number and type of dead animals seen by the side of
our roads to help understand our biodiversity, and another talk on a new feature in SQL Denali that links
a SQL database to a Windows folder and its file contents and allows you to run T-SQL queries on the
folder (like listing all files containing a certain word).

SQL Tuning from a .NET perspective by Bob Duffy had some interesting points for a .NET developer,
the main points being connection pooling and caching.

Two talks were very similar: T-SQL Performance Recommendations by Milos Radivojevic and T-SQL
Bad Habits to Kick by Aaron Bertrand. Both covered similar points to some of the tuning talks from the
previous day, but it was still very useful to hear them from a different direction.

And finally, if you are looking for a free tool to help you understand your execution plans, then take a
look at SQL Sentry Plan Explorer. It has loads of features, allows you to look at an execution plan in
several different ways and, best of all, it’s free without obligation.

Thursday, 29 September 2011

SQL Bits Training Day

By David Postlethwaite

Just spent a day with Allen White, a SQL MVP from Ohio, USA, learning about automating and managing SQL Server using PowerShell.

PowerShell is ubiquitous in Windows now and can be used to view and manage any part of the Windows system. It uses the .NET Framework which, for me, having written plenty of C#, made it much easier to understand. One of the things I liked about PowerShell was the way that objects like the registry and SQL Server can be browsed just like a disk drive.

The real winner, not just for PowerShell but for .NET as well, is SQL Server Management Objects, or SMO: a whole class library that can be used to interrogate SQL Server. I hadn’t come across this before, but I could see so many ways that I could use it for retrieving information and managing SQL Server. An example: how would you find all varchar fields in all databases across several instances and change them to varchar(max)? A nightmare with T-SQL, but with SMO quite easy.
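As a flavour of what SMO looks like from PowerShell, here is a minimal sketch that lists the varchar columns in every database on an instance. The instance name is illustrative, and the assembly-loading line assumes the SMO libraries are installed; actually changing a column would then use the Column object’s Alter method, which I have left out:

```powershell
# Sketch: list all varchar columns on an instance using SMO (instance name illustrative)
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
$server = New-Object Microsoft.SqlServer.Management.Smo.Server "localhost"

foreach ($db in $server.Databases) {
    foreach ($table in $db.Tables) {
        foreach ($column in $table.Columns) {
            if ($column.DataType.Name -eq "varchar") {
                "$($db.Name).$($table.Schema).$($table.Name).$($column.Name)"
            }
        }
    }
}
```

Wrapping the outer loop in a list of server names is all it takes to cover several instances.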

We also saw example scripts for retrieving performance data into a database.

For support and diagnostics of your own systems it’s definitely the way to go: much more flexible than T-SQL and much better for quick ad-hoc queries than trying to use .NET. And there are loads of example scripts online.

Can’t wait to give it a go back in the office

Wednesday, 28 September 2011

David’s SQLBITS Day 1–The Travel Day

Just had a pleasurable 5 hours watching the autumn countryside go by on a glorious sunny day (in fact it’s warmer at the end of September than it was for most of July and August).

Where am I heading, you ask? Liverpool: home of the Beatles, the Mersey Ferry and Nerys Hughes. I’m here for SQLBits, a three-day biannual conference held in the UK for everything SQL Server.

This is my 2nd time to SQLBits, the last being in March at the Grand Hotel in Brighton.

Thursday is a training day where one studies a particular topic in depth. Friday  has a choice of 30 different sessions. Saturday has another 30 sessions but this day is free to attendees.

On Thursday I’m looking forward to getting to grips with PowerShell. As well as being a SQL DBA I also write a lot of ASP.NET. I’ve written quite a comprehensive web site to audit and troubleshoot my SQL Server estate but I’ve wanted to learn PowerShell to allow me to run quick checks and automation commands just like I used to do with DOS in days gone by.

I’ve heard a lot about how useful PowerShell is, and I became enthused at last year’s Tech-Ed in Berlin, where I went to a talk by one of the people responsible for its creation. I could see the potential for helping not just the DBA but anyone supporting any part of Windows.

But, as is often the case, finding time to put these things into action isn’t easy. You come back enthused by these events, but the daily grind of work stops you from having time to experiment and take advantage of what you’ve learned, and soon it’s forgotten.

I am determined this year to use what I learn and give myself time to re-watch the recordings of the talks that I go to.

I’ll let you know tomorrow how I got on.

Wednesday, 31 August 2011

Sharon Dooley’s Lightning Talk and the WIT Luncheon

At the end of last week I received an email from my friend and colleague Sharon Dooley telling me her lightning talk had been accepted and she would be speaking at the Summit in October. Sharon has many years’ experience in the IT industry; she is a former SQL Server MVP, currently working as a trainer and course author specializing in SQL Server. She is a volunteer with PASS and is the Virtual Chapter Lead for the DBA virtual chapter. Sharon is also involved in the Women in Technology virtual chapter.

A couple of weeks ago Sharon and I were chatting about the PASS Summit, and even though she hadn’t submitted a full session she was considering submitting a lightning talk. Sharon doesn’t currently maintain a blog, so I said that if she did submit a talk and it was accepted I would promote it here. Sharon has a really awesome query that returns a user’s permissions, which she will be demonstrating during her talk. She has also promised to share the query with us after the Summit, so those of us who can’t get to Seattle can use it too. So we’ll have an excellent guest post to look forward to later in the year.

There will be more details on Sharon’s talk to follow, but Sharon has asked me to mention the PASS Women in Technology Luncheon, at which she is also on the panel. The 9th Women in Technology Luncheon and Panel Discussion will take place at the PASS Summit in Seattle on Thursday, October 13, 2011. This year’s topic is: Make Yourself Heard: How to Ask for What You Want at Work. In 2003 the first Women in Technology panel discussion was held at the PASS Community Summit. Since that inaugural panel, the WIT event has been a highlight of the Summit, with captivating speakers and lively discussion on topics relating to women in the field of technology. Participants at these panel discussions were eager to continue networking and learning from each other, which led to the launch of the WIT Virtual Chapter. You can find more information on the Women in Technology Luncheon and the virtual chapter in general on the PASS website. You can follow the chapter on Twitter @PASS_WIT or by following the #PASSWIT hash tag.

Tuesday, 30 August 2011

Using PowerShell to Monitor SQL Server

By Ian Treasure

Over the past few years, I have used a number of approaches to monitor SQL Server. For example, I have configured jobs to e-mail the DBA team if they fail. I have written programs to check that everything is OK. I have used a number of third-party tools to monitor my instances. Each approach has its own strengths and weaknesses. If the SQL Server Agent is not running, the jobs don’t run and I don’t get any messages telling me that they have failed. To stop this happening, I have used server monitoring tools, for example Big Brother, which can warn if a service has stopped. This sort of tool is pretty good, but you need to invest time and effort setting it up. The same applies to monitoring tools like Idera’s Diagnostic Manager or Red Gate’s SQL Monitor. These tools can also cost a lot of money and can themselves affect the performance of the server.
So I wanted to find a tool to monitor SQL Server that is independent of SQL Server, free and easy to use. PowerShell fits the first two requirements, and it is not too difficult to learn. I’ll suggest an approach to monitoring your backups that took me a few hours to implement, is independent of SQL Server, takes few resources and costs nothing apart from your time.
I have a production database in full recovery. A scheduled SQL Server Agent job backs up the log on the hour between 07:00 and 19:00. If the job to back up the log fails, it e-mails the SQL Server DBA group. However, if the SQL Server Agent fails, there is a danger that there will be no notification at all. To address this issue, I have written a PowerShell job that runs on a monitoring server (Windows 2003) and checks that the log backup file is in the backup directory. If the file is not present, the PowerShell script e-mails the DBA team.
The first thing I wrote is a PowerShell function library. Create a directory called C:\PSScripts, create a file in this directory called CalcFunctions.ps1 and paste the following code into it:

function SendMail($filename)
{
    $emailFrom = ""
    $emailTo = " "
    $subject = "Missing backup file - $filename does not exist"
    $body = "Missing backup file - $filename does not exist"
    $smtpServer = "mail1"
    $smtp = new-object Net.Mail.SmtpClient($smtpServer)
    $smtp.Send($emailFrom, $emailTo, $subject, $body)
}

function CheckBackupFile($filename)
{
    # list files matching $filename that were written to in the last 9 minutes
    $bck = get-childitem $filename | where {((get-date) - $_.LastWriteTime).minutes -lt 9}
    if (!$bck) {
        SendMail $filename
    }
}

There are two functions

The first function – SendMail – will send an e-mail to whoever you configure in your code. I am sending the email to a mail group called SQLDBAs. The function accepts a parameter – filename – which indicates the name of the missing log file and forms part of the message.
The second function – CheckBackupFile – is more interesting. It accepts as a parameter the name of a backup file. It then loads a list of files matching the parameter $filename (get-childitem $filename) which were written to in the last 9 minutes (where {((get-date) - $_.LastWriteTime).minutes -lt 9}). If this list is empty, the SendMail function is invoked with $filename as a parameter. Nine minutes is used because the job will be scheduled to run at 7 minutes past the hour, and there should always be a log file for the production database available at this time.
Now create a script to use these functions. Create a file called CheckLogBackups.ps1 and paste the following code into it:

param([string]$logFilename = "")

. c:\psscripts\CalcFunctions.ps1

# check for the backup file; e-mail the DBA team if it is missing
CheckBackupFile $logFilename

In this script, note that there is a space between the dot and the c:\ (". c:\") – originally I missed the space and it took me a long time to identify the problem.
Next, create a batch file RunPS1.bat as follows:

powershell.exe -command "& 'c:\psscripts\CheckLogBackups.ps1'" "\\sql1\r$\sqlbackup\log_prod*.sqb"

In the above file, the script CheckLogBackups.ps1 is run with a parameter pointing to the location of the backup log files. These are on disk at \\sql1\r$\sqlbackup. The wildcard (‘*’) is used because the backup files are named using a datetime stamp – for example log_prod20110801_09:00.sqb.
Finally, create a scheduled job with the Windows scheduler. This job must run at 7 minutes past each hour and run the batch file RunPS1.bat.
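The scheduled job can also be created from the command line with schtasks; a sketch (the task name is illustrative, and the exact time format varies between Windows versions, so check schtasks /? on your monitoring server first):

```batch
schtasks /create /tn "CheckLogBackups" /tr "c:\psscripts\RunPS1.bat" /sc hourly /st 00:07:00
```

With /sc hourly and a start time of 7 minutes past the hour, the task repeats every hour from that offset, matching the 9-minute window used in CheckBackupFile.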
This is a simple example. It can be extended to check for other log files on other servers, or overnight full backups. By creating a list of files to be checked, it is possible to be certain that your production databases are being backed up as expected, with little cost or effort.

Wednesday, 24 August 2011

VMFS LUN resizing options in vSphere


First off, many thanks to Gethyn for the opportunity to post on his excellent blog.

I’m Christian Dadswell and I am a Senior Infrastructure Analyst at the Intellectual Property Office. I have 15+ years’ cross-platform IT experience and currently support and maintain, amongst other things, the vSphere infrastructure, from the SAN (fabric and storage) through vSphere to the virtual machines.

In this article I intend to give you some options for resizing a VMFS LUN to allow you to increase the size of a Windows VM volume. It will also cover a scenario in which a VM has been frozen in operation due to lack of disk space.

Previous experience has shown me that VMs occasionally require more space for whatever reason. This happens more often than not with SQL Server VMs. Sorry, but you SQL DBA types love to eat SAN space. But seriously, if a VM needs more space for its OS or data volumes, something will need to be done.

To kick things off I will describe a couple of scenarios that you may well have come across in your day-to-day vSphere admin. I will then discuss how to remedy them.

The scenarios assume you are using vSphere 4.x or above with an IBM SVC. Other storage virtualisation vendors should also allow vdisk resizing. More on vdisk resizing later.

Scenario One:

A Windows 2008 SQL VM (SQL01) has been designated its very own VMFS LUN within vSphere; for argument’s sake, this VMFS LUN contains all the volumes that the VM will use.

VMFS LUN size = 1TB

VM volumes and sizes:

C: OS - 40GB



Total = 990GB

The DBA requires more OS space and the vSphere admin notes that an extra 20% of space should be available to allow for snapshots. Unfortunately, when the VM was created nearly all the available space was used up.

The SQL01 OS volume needs resizing as it’s critically low on disk space. As this is a Windows 2008 VM, it is very easy to resize the boot volume without even rebooting the VM. But the problem is that no extra space exists on the VMFS LUN.

Scenario One Options:

As the VMFS LUN has no extra space there are two options;

1. Extend the VMFS LUN from within vSphere with space from another LUN

2. Resize the underlying VDISK presented to vSphere from within the SVC console and then resize the VMFS LUN into the new space.

Option one will not be covered in this article, but it is a very straightforward task. Option two is a little more interesting, and I will now cover how to achieve it.

For the sake of this article we will need to assume that there is available space on the underlying VDISK that has been presented to vSphere. See the SVC wiki for the full skinny on VDISKs.

Scenario One Tasks:

● Resize the underlying VDISK via the SVC console

● Resize the VMFS LUN within vSphere

● Adjust the SQL01 VM Boot Volume to 60GB

● Resize SQL01’s OS volume

Resize the underlying VDISK via the SVC console

1. Log into the SVC console and select “Work with Virtual Disks”

2. Locate the VDISK you wish to re-size in the table and select it

3. From the drop down at the top of the table, select “Resize Virtual Disk”

4. Follow the wizard and enter the extra size that is required, in this example we’ll add 500GB.

5. Finish the wizard and log out of the SVC console

Resize the VMFS LUN within vSphere

1. Log into the vSphere client

2. Select a vSphere host and go to the Configuration tab and select Storage Adaptors

3. Select Rescan All

a. This will pick up the extra space given to the VMFS volume we just re-sized in the previous step

4. Once the rescan has completed (check the status in the bottom of the client) locate the VMFS LUN, right-click and select Properties

5. You should now see the new VMFS LUN size of 1.5TB

6. Click Next and you should now see that there is free space available (under the Capacity label)

7. Select Next and tick the Maximize capacity tick box

8. Confirm you want to resize by selecting Finish

9. The VMFS LUN is now 1.5TB
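If you manage your hosts with VMware PowerCLI, the rescan in step 3 can also be scripted; a sketch, assuming PowerCLI is installed, with illustrative vCenter and host names:

```powershell
# Sketch: rescan HBAs and VMFS volumes on a host using PowerCLI
Connect-VIServer vcenter01
Get-VMHost esx01 | Get-VMHostStorage -RescanAllHba -RescanVmfs
```

Scripting the rescan is handy when the same VDISK is presented to a whole cluster of hosts and each one needs to pick up the new size.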

Adjust the SQL01 VM Boot Volume to 60GB

NOTE: Ensure you have a backup before running any volume resizes.

1. Shut SQL01 down

2. Right-click the VM within the vSphere client and select Edit Settings

3. Select the hard disk that is designated as the boot drive

a. This should be the 40GB volume

4. Modify this to 60GB and OK the dialog

5. Power on SQL01

Resize SQL01’s OS volume

1. Log into the SQL01 VM as an Administrator

2. Open Computer Management and select Disk Management under Storage

3. Right-click the OS volume and select Extend Volume

4. Extend the volume by 20GB
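If you prefer the command line, the same extension can be done with diskpart from an elevated command prompt on SQL01; a sketch (the volume number will differ on your system):

```batch
diskpart
DISKPART> list volume
DISKPART> select volume 1
DISKPART> extend size=20480
```

The size is in MB (20480 MB = 20GB); omitting it extends the volume into all contiguous free space.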

At this point we have achieved what we set out to do. But what if the VM was Windows 2003 Server?

Resizing a Windows 2003 OS volume

NOTE: Ensure you have a backup before running any volume resizes.

In order to resize an OS volume in Windows 2003 you will need a third-party tool and downtime on the VM. One that I use more often than not is GParted Live. It comes in ISO/USB/PXE and HD formats; the ISO format is perfect for attaching to VM CD drives, so just chuck a copy in your ISO store.


Something that I have found is worth doing before running GParted on the OS volume is a ‘CHKDSK /f C:’ from the command prompt of the Windows server. This schedules a check for the next boot-up, which will require a reboot*, and fixes any errors.

*The reboot is required due to the requirement of locking the file system.

Back to the resize...

1. Take the VM down and attach the GParted ISO to the VM’s CD drive and ensure that “Power on boot” and “Connected” are ticked before pressing OK.

2. Power on the VM and click inside the console window as soon as you can and press ESC. You need to get into the boot menu and select CD.

3. Once inside the GParted interface select the boot volume and re-size into the extra space as required.

4. Apply the changes, and let it do its thing

5. Once completed, power off the VM, disconnect the ISO and power it back up

You have now re-sized the boot partition on a Windows 2003 Server.

Scenario Two:

This next scenario is very similar to the above, but the VM has been frozen by vSphere as the VMFS LUN has completely run out of space.

In this situation vSphere will freeze the VM at the point disk space ran out, and you will not be able to do any of the following:

● vMotion

● Storage vMotion

● Snapshot

● Edit machine settings

● Power off

● Power on

On the main screen for the VM within the vSphere client you will notice a yellow box is now advising you that you need to free up space to be able to run the VM again. Do not make a decision on this box yet.

This machine is now effectively offline so we need to free space up on the VMFS LUN in order for vSphere to be able to commence running the machine again.

Fortunately, we can just run through the two steps in the previous scenario in order to get this VM up and running again. Namely;

● Resize the underlying VDISK via the SVC console

● Resize the VMFS LUN within vSphere

Once you have performed these tasks you will be able to select ‘Retry’ on the yellow dialog on the VM’s general screen. The VM will now continue to run exactly where it left off!

Unfortunately, it won’t deal with the carnage that ensues after the service has been restored.

I hope you enjoyed this guest article. If you wish to get in touch, you may do so in the following ways;

Add me to your Circle - +Chris Dadswell

Follow me on Twitter - @chrisdadswell

Subscribe to my blog - Chris Dadswell

Connect with me on LinkedIn - LinkedIn profile

Monday, 22 August 2011

Get your AD information with .NET

.NET has a useful namespace called System.DirectoryServices for querying LDAP directories such as Active Directory.

Here is some example code to retrieve all the AD information for a user whose domain account name is passed in as FullName.


On a web site, the current user’s Active Directory name can be found using:


FullName = HttpContext.Current.Request.ServerVariables["AUTH_USER"];


(If you are not using AD then this will return a blank)


In this procedure we first write the results to a data table. We can then sort the results before appending them to a StringBuilder, which is then returned to the web page.


using System.Data;
using System.DirectoryServices;

public static String GetUserInfo(string Fullname)
{
    DataTable UserInfo = new DataTable();
    UserInfo.Columns.Add("Name", typeof(string));
    UserInfo.Columns.Add("Info", typeof(string));

    System.Text.StringBuilder sb = new System.Text.StringBuilder();

    //extract the user name from DomainName\UserName
    string username = Fullname.Substring(Fullname.IndexOf("\\") + 1);

    //create a DirectorySearcher
    DirectorySearcher ds = new DirectorySearcher(new DirectoryEntry("LDAP://"));

    //filter the search to just the user name
    ds.Filter = "(samaccountname=" + username + ")";
    ds.PageSize = 1000;

    //use FindOne rather than FindAll to return just this user's data
    SearchResult sr = ds.FindOne();

    //loop through the search result's properties, inserting into the data table
    foreach (string var in sr.Properties.PropertyNames)
    {
        UserInfo.Rows.Add(var, sr.Properties[var][0].ToString());
    }

    DataView dv = new DataView(UserInfo);
    dv.Sort = "Name";

    //write out the properties in name order to the stringbuilder
    foreach (DataRowView drv in dv)
    {
        sb.AppendLine(string.Format("{0} : {1}", drv[0], drv[1]));
    }

    return sb.ToString();
}
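For readers outside .NET, the gather-sort-format pattern used above can be sketched in a few lines of Python; the sample dictionary below is a hypothetical stand-in for the property names and values that FindOne would return.

```python
# Gather name/value pairs, sort them by name, then format one
# "Name : Value" line per property - the same shape as GetUserInfo.

def format_user_info(properties):
    lines = []
    for name in sorted(properties):          # sort by property name
        lines.append("{0} : {1}".format(name, properties[name]))
    return "\n".join(lines)

# Hypothetical sample data standing in for the AD search result
sample = {"samaccountname": "jsmith",
          "displayname": "John Smith",
          "mail": "jsmith@example.com"}
print(format_user_info(sample))
```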



Wednesday, 17 August 2011

Another New Contributor–Christian Dadswell

This is just a short post to introduce you all to my good friend Christian Dadswell, who has kindly agreed to write a guest post for the blog.

Christian has a wealth of experience in IT, especially in the field of virtualisation. I had the pleasure of working with Christian on some recent projects to virtualise a SQL Server estate.

Christian’s post will be focusing on resizing LUNs in vSphere… a subject in which we are all likely to have an interest.

As a hobby Christian writes and develops mobile apps for the Android market. I’m told that he is actually very good at it. His BBC Browse Cast application had in the region of 20,000 downloads the last time I checked with him… it is probably many more by now.

Chris’ post will be published in the next couple of weeks.

Tuesday, 16 August 2011

Auditing SQL Logins Addendum

One thing that has come out of auditing successful logins is the number of applications called ".Net SqlClient Data Provider". This is slightly annoying: auditing a generic name is no use, because you won't be able to pick out a rogue application.

This can be easily fixed by adding "Application Name=AppName" to the connection string of the .NET application:


connectionString="Data Source=Server1;Initial Catalog=master;Integrated Security=SSPI; Application Name=Application1;" providerName="System.Data.SqlClient"
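If you have a lot of existing connection strings to update, the change can be scripted. Here is a minimal sketch in Python (the helper name and sample string are my own, not part of the original post) that appends the Application Name keyword only when it is not already present:

```python
# Append "Application Name=<name>" to a connection string unless one
# is already set, so the app shows up under its own name in the audit
# instead of the generic ".Net SqlClient Data Provider".

def with_app_name(conn_str, app_name):
    keys = [p.split("=", 1)[0].strip().lower()
            for p in conn_str.split(";") if p.strip()]
    if "application name" in keys:
        return conn_str                     # already named, leave as-is
    sep = "" if conn_str.rstrip().endswith(";") else ";"
    return "{0}{1}Application Name={2};".format(conn_str, sep, app_name)

print(with_app_name("Data Source=Server1;Initial Catalog=master;", "Application1"))
```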

Monday, 15 August 2011

Auditing SQL Logins Part II

Written by David Postlethwaite

In a previous blog I talked about how to audit successful logins. So what about failed logins?

As mentioned before, logon auditing can be configured under server properties. Here you can audit failed logins.




This puts the audit records into the Windows event log but a more useful solution would be to have these failed logins written to a table.

It turns out to be quite simple using the master..xp_ReadErrorLog command, which reads the SQL Server error log (where failed login attempts are recorded).


So first let's create a table to store the results:


  CREATE TABLE [dbo].[lv_Failed_Logins](
      [Failed_Date] [datetime] NULL,
      [Failed_Userid] [varchar](50) NULL,
      [Failed_Hostname] [varchar](50) NULL,
      [Failed_ServerName] [varchar](100) NULL
  )


In this example we will run a scheduled job every night to write the failed logins into a table.

First we will cycle the error log using the following command. This makes the current log file an archive file and creates a new, empty log.

EXEC master..sp_cycle_errorlog
We will then run the following procedure. This will read the 1st archive file (the one that we've just created), finding every line that contains the words "failed" and "login", and write it to a temporary table.

We then cycle through the temporary table looking for the right substrings to extract the login name and host name, writing them to our failed_logins table.
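To make the substring arithmetic easier to follow, here is the same extraction sketched in Python. The user name sits between "user '" and "'.", and the client address between "CLIENT: " and "]"; the offsets 6 and 8 are simply the lengths of those two markers. The sample message is a typical (hypothetical) failed-login line.

```python
# Mirror of the charindex/substring logic in the stored procedure:
# pull the login name and client address out of an error-log message.

def parse_failed_login(message):
    user_start = message.index("user '") + len("user '")      # +6 in T-SQL
    user = message[user_start:message.index("'.", user_start)]
    host_start = message.index("CLIENT: ") + len("CLIENT: ")  # +8 in T-SQL
    host = message[host_start:message.index("]", host_start)]
    return user, host

msg = "Login failed for user 'sa'. Reason: Password did not match. [CLIENT: 10.1.2.3]"
print(parse_failed_login(msg))   # → ('sa', '10.1.2.3')
```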

create procedure [dbo].[sp_lv_Failed_logins]
as
begin

declare @cmdF varchar(max)
declare @search1 char(6)
declare @search2 char(2)
declare @search3 char(8)
declare @search4 char(1)

select @search1 = 'user '''
select @search2 = '''.'
select @search3 = 'CLIENT: '
select @search4 = ']'
select @cmdF = 'master..xp_ReadErrorLog 1, 1, ''failed'', ''login'''

set nocount on

create table #lv_FailLog(date datetime, processinfo varchar(30), message varchar(4000))
insert #lv_FailLog exec (@cmdF)

insert lv_monitoring..lv_failed_logins
select date,
  substring(message, charindex(@search1, message) + 6,
    (charindex(@search2, message)) - (charindex(@search1, message) + 6)),
  substring(message, charindex(@search3, message) + 8,
    (charindex(@search4, message)) - (charindex(@search3, message) + 8)),
  @@SERVERNAME
from #lv_FailLog where message like '%user%'

drop table #lv_FailLog

end




By cycling the error log each night we can ensure that we don't get any duplicates, but if the job fails for any reason then we would lose that day's data.

The major disadvantage is that we only get failed logins for yesterday, never for today.

If we wanted an instant alert of a failed login we would need to use a log reader, like MS Operations Manager (SCOM), to pick up the error and report on it.

Monday, 8 August 2011

Putting a Database into Single User Mode

I was recently discussing with a group of budding DBAs how you go about performing a restore of a database. They were trying the restore for themselves and a few of them got an error about not having exclusive access:

Msg 3101, Level 16, State 1, Line 5
Exclusive access could not be obtained because the database is in use.
Msg 3013, Level 16, State 1, Line 5
RESTORE DATABASE is terminating abnormally.

This was a test setup and obviously we had other sessions connecting to the databases. These connections were preventing the restore.

If you are restoring a database and you have other sessions connected, it can be quite simple to kill those connections in one go by putting the database into single user mode with the ‘ROLLBACK IMMEDIATE’ switch. This will roll back any open transactions and allow you to restore the database. I like to wrap this around the restore statement:

alter database TestDB set single_user with rollback immediate

-- your restore statement goes here

alter database TestDB set multi_user
Most of this information, including example code, is available in Books Online, which is a great resource for all SQL Server professionals to make use of.

Friday, 5 August 2011

Audit SQL Logins

Written by David Postlethwaite
There are several ways that you can audit who has logged in to your SQL Server. The easiest way is to turn on logon Auditing under server properties. Here you can audit failed logins, successful logins, or both.
This will put the audit records into the Windows event log. Auditing failed logins is pretty useful because there shouldn't be many of them. But auditing successful logins turns out not to be so helpful, because it records every single login, including the service account. It turns out this logs into the server every few seconds, so you end up with so much data in the event log that it's impossible to find anything else, especially the rogue user.
The Enterprise edition has a built-in auditing option which writes to a user-defined event log, but again this suffers the same problem in that it audits every login.
What would be better is a way of selectively auditing only those logins we want to see (or not recording those we aren't interested in).
One way this can be achieved is by using a logon trigger to write the login information to a table. We can easily collect the login information we want:
App_Name() will tell us the calling application
HOST_NAME() will tell us the calling computer
eventdata() can return the login name
The trigger can capture the login data and run a query to filter out the logins we don't want to record and then write the data to a table.
In this case I chose to filter on the application name rather than a login name. My thinking here was that a user who can use the application might be tempted to try and use Excel or Access to view the raw data. This worked well, but I found the table was filling up with records of me using SSMS. I didn't want to audit the DBA team doing their day-to-day job, but I definitely wanted to audit anyone else using SSMS. So I added a second filter to look for a combination of user name, application name and connecting host name.
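Before building the tables, it may help to see this two-branch exclusion test in miniature. The Python sketch below mirrors the logic the trigger performs (the rule data and helper names here are hypothetical, and the T-SQL % wildcard is translated to fnmatch's * form):

```python
from fnmatch import fnmatch

def matches(pattern, value):
    # T-SQL LIKE uses % as the wildcard; fnmatch uses *.
    # Lowercase both sides to mimic SQL Server's usual case-insensitivity.
    return fnmatch(value.lower(), pattern.lower().replace("%", "*"))

def should_audit(app, host, login, rules):
    # A login is skipped when a rule matches on app name alone
    # (host and login blank) or on the app/host/login combination.
    for r_app, r_host, r_login in rules:
        if matches(r_app, app) and r_host == "" and r_login == "":
            return False
        if matches(r_app, app) and r_host == host and r_login == login:
            return False
    return True

rules = [("SQLAgent%", "", ""),
         (".Net SqlClient Data Provider%", "Computer1", "User1")]
print(should_audit("SQLAgent - Job Manager", "Host9", "svc", rules))   # excluded
print(should_audit("Microsoft SQL Server Management Studio", "Host9", "dave", rules))  # audited
```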
1st, create a table to hold the login information:
CREATE TABLE [dbo].[Succ_Logins](
      [Succ_ServerName] [varchar](100) NULL,
      [Succ_EventTime] [datetime] NULL,
      [Succ_EventType] [varchar](100) NULL,
      [Succ_LoginName] [varchar](100) NULL,
      [Succ_IPAddress] [varchar](100) NULL,
      [Succ_HostName] [varchar](100) NULL,
      [Succ_AppName] [varchar](100) NULL,
      [Succ_EventData] [xml] NULL
)
2nd, create a table to hold the logins that we don't want to audit:
CREATE TABLE dbo.LoginsNotToAudit(
      [App_AppName] [varchar](100) NULL,
      [App_HostName] [varchar](100) NULL,
      [App_LoginName] [varchar](100) NULL
)
ALTER TABLE [dbo].[LoginsNotToAudit] ADD CONSTRAINT [DF_LoginsNotToAudit_App_HostName] DEFAULT ('') FOR [App_HostName]
ALTER TABLE [dbo].[LoginsNotToAudit] ADD CONSTRAINT [DF_LoginsNotToAudit_App_LoginName] DEFAULT ('') FOR [App_LoginName]
Add some records. Note that you can use a % wildcard in the values:
insert into dbo.LoginsNotToAudit (App_AppName) values ('Microsoft SQL Server')
insert into dbo.LoginsNotToAudit (App_AppName) values ('Report Server%')
insert into dbo.LoginsNotToAudit (App_AppName) values ('SQL Server Data Collector%')
insert into dbo.LoginsNotToAudit (App_AppName) values ('SQLAgent%')
insert into dbo.LoginsNotToAudit (App_AppName) values ('Data Collector%')
insert into dbo.LoginsNotToAudit (App_AppName) values ('DatabaseMail%')
insert into dbo.LoginsNotToAudit (App_AppName) values ('Microsoft SQL Server Management Studio - Transact-SQL IntelliSense')
insert into dbo.LoginsNotToAudit ([App_AppName],App_HostName,App_LoginName) values('.Net SqlClient Data Provider%','Computer1','User1')
And here is the trigger:
create trigger AuditSQlLogins
on all server
with execute as 'sa'
for logon
as
begin
  DECLARE @event XML
  SET @event = eventdata()
  IF (select count(App_AppName) from auditDB.dbo.LoginsNotToAudit
      where (App_Name() LIKE App_AppName AND App_HostName = '' AND App_LoginName = '')
         or (App_Name() LIKE App_AppName AND App_HostName = HOST_NAME()
             AND App_LoginName = CAST(@event.query('/EVENT_INSTANCE/LoginName/text()') AS VARCHAR(100)))) = 0
  BEGIN
    INSERT INTO auditDB.dbo.Succ_logins (Succ_EventTime,Succ_EventType,Succ_LoginName,Succ_IPAddress,Succ_AppName,Succ_HostName,Succ_ServerName,Succ_EventData)
    VALUES(CAST(CAST(@event.query('/EVENT_INSTANCE/PostTime/text()') AS VARCHAR(64)) AS DATETIME),
       CAST(@event.query('/EVENT_INSTANCE/EventType/text()') AS VARCHAR(100)),
       CAST(@event.query('/EVENT_INSTANCE/LoginName/text()') AS VARCHAR(100)),
       CAST(@event.query('/EVENT_INSTANCE/ClientHost/text()') AS VARCHAR(100)),
       APP_NAME(),
       HOST_NAME(),
       @@SERVERNAME,
       @event)
  END
  -- Note: wrapping the insert in TRY...CATCH will not stop the trigger from returning an error
end
I've run this on several 2008 servers without any noticeable degradation in performance.
But there is a major drawback to using a logon trigger.
If the code causes an error then the trigger will fail. If it fails then users can't log in: they will receive the error "Logon failed for login 'Domain\username' due to trigger execution."
No amount of TRY...CATCH, error trapping or transaction rollbacks will help; any error will cause the trigger to exit as failed. I found this out when a new application started using the server but wasn't added to the LoginsNotToAudit table: the database ran out of room, so the insert failed, the trigger failed, and no one could log in. I now have a daily job that counts the records in the table to check it's not filling up too quickly.
If the trigger is failing then the only way to login is to use the Dedicated Administrator Connection or DAC. The Login Trigger doesn't fire when you login using the DAC.
  • Only sysadmins can use DAC
  • There can be only one DAC connection at a time
  • To use DAC with SSMS, prefix the server name with ADMIN:. The important point with SSMS is that you can only open a query window with DAC; you can't use Object Explorer, since this uses more than one connection - you will get an error.
  • You can also use SQLCMD with the "-A" switch: SQLCMD -A -S [SQL Server Name] -U [User Name] -P [Password] -Q [query]
  • By default you can't connect remotely using the DAC, so make sure you can RDP to the SQL Server.
To allow remote connections use sp_configure, but note there are security implications to allowing remote DAC connections:
sp_configure 'remote admin connections', 1
reconfigure with override
Once you have your DAC connection you can disable the trigger using:
disable trigger AuditSQlLogins ON ALL SERVER
If you do need to audit logins to SQL Server this works very well, but it does require some administration. Make sure all the DBAs know how to disable the trigger in the event it does fail, and keep an eye on the number of records you are collecting. In my experience, once you have filtered out the DBAs and the application connections there aren't that many records, which makes looking for the rogue user a bit easier.
