PowerShell – Format SQL Disk Partition

When installing SQL Server you have to ensure that the DATA, LOG and TEMPDB disks are formatted with a 64K block size (allocation unit size). This reduces the number of physical page reads from disk when the SQL Server engine needs fresh data that is not already in cache. Since database files are almost certainly larger than 64K, this is the block size you want for your database files.
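Before formatting anything, you can verify the current allocation unit size of your volumes. A quick check, assuming Windows Server 2012 or newer (where the Storage module's Get-Volume is available):

```powershell
# List drive letter, label and current allocation unit (block) size per volume.
# 65536 = 64K, which is what we want for the DATA, LOG and TEMPDB disks.
Get-Volume |
    Where-Object { $_.DriveLetter } |
    Select-Object DriveLetter, FileSystemLabel, AllocationUnitSize
```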

As a result, I have implemented an unattended installation script in PowerShell that checks the block size of the DATA, LOG and TEMPDB disks. If a disk does not have a 64K block size, I format it immediately.

Here is the PowerShell function I use:

function FormatSQLDisk([string]$driveletter, [string]$drivelabel)
{
    Format-Volume `
        -DriveLetter $driveletter `
        -NewFileSystemLabel $drivelabel `
        -FileSystem NTFS `
        -AllocationUnitSize 65536 `
        -Force -Confirm:$false
}
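The function can then be called like this (drive letter and label are example values only):

```powershell
# Format the E: volume as a 64K-block DATA disk - example values.
FormatSQLDisk -driveletter "E" -drivelabel "DATA"
```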

Happy formatting…


Never set “AutoShrink = true”

In my last post, I wrote about how to detect fragmentation and how to fix it. This post describes a feature that many people in the SQL Server community hate. It is a single setting under “Database Properties” that might seem like a pretty good idea to someone who doesn’t know all the facts, but it will sooner or later result in decreased performance. In the worst case this feature will totally fragment your database – tables, indexes and even the files at the file system level.

You can run the following command to detect which databases have AutoShrink turned on:

select name, is_auto_shrink_on 
from sys.databases
where is_auto_shrink_on = 1

You can run the following command to turn off AutoShrink for a database:

alter database [MyDB]
set auto_shrink off

The AutoShrink feature fires every 30 minutes by default and severely fragments the database. Combined with AutoGrowth, it can fragment the entire database. AutoShrink starts at the end of the file and brute-force moves pages as close to the beginning of the file as possible, so the pages end up physically out of order inside the database. Even if you ran a full index rebuild the night before, AutoShrink can fragment the database all over again.


Check SQL Server fragmentation – rebuild/reorganize

If you have performance issues on your database, one of the first things you should check is fragmentation. If fragmentation is high, SQL Server gives you the option to either reorganize or rebuild indexes. You can detect index fragmentation with the Dynamic Management View (DMV) sys.dm_db_index_physical_stats by checking the avg_fragmentation_in_percent column.

Use [MyDB];
SELECT a.index_id, name, avg_fragmentation_in_percent 
FROM sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL, NULL, NULL) AS a
   JOIN sys.indexes AS b ON a.object_id = b.object_id 
AND a.index_id = b.index_id
ORDER BY avg_fragmentation_in_percent DESC;

If avg_fragmentation_in_percent is between 5% and 30%, you should reorganize your indexes, not perform a total rebuild. A rebuild should only be invoked if the fragmentation percentage is greater than 30%. No action should be taken if the fragmentation percentage is lower than 5%, which is a normal level of fragmentation.

The following command will rebuild all indexes with the default fill factor (the original snippet was truncated; the @command2 line completes it with a plain rebuild):

Use [MyDB]; 
EXEC sp_MSforeachtable 
  @command1="print '?'", 
  @command2="ALTER INDEX ALL ON ? REBUILD";
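The thresholds above can also be combined into a single maintenance sketch that picks REORGANIZE or REBUILD per index. This is an illustration only, assuming the 5%/30% limits discussed earlier; adjust it to your own environment:

```sql
-- Sketch: reorganize indexes between 5% and 30% fragmentation,
-- rebuild those above 30%, leave the rest alone.
DECLARE @sql nvarchar(max);

SELECT @sql = STUFF((
    SELECT ';' + CHAR(10) +
           'ALTER INDEX ' + QUOTENAME(i.name) + ' ON ' +
           QUOTENAME(OBJECT_SCHEMA_NAME(s.object_id)) + '.' +
           QUOTENAME(OBJECT_NAME(s.object_id)) +
           CASE WHEN s.avg_fragmentation_in_percent > 30
                THEN ' REBUILD' ELSE ' REORGANIZE' END
    FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
    JOIN sys.indexes AS i
      ON i.object_id = s.object_id AND i.index_id = s.index_id
    WHERE s.avg_fragmentation_in_percent > 5
      AND i.name IS NOT NULL   -- skip heaps
    FOR XML PATH('')), 1, 1, '');

IF @sql IS NOT NULL
    EXEC sp_executesql @sql;
```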


CleanBin – Cleaning up your bin and obj solution folders in Visual Studio

I’m pretty sure every programmer is lazy. What do you do when sending sample code to colleagues or customers by email? Just zip the Visual Studio solution and send it? Have you ever been stopped by attachment size limits on Exchange set by your IT department? Been there, done that… In addition, until recently I had poor upload capacity on my home broadband, and synchronization between local disk and SkyDrive/Dropbox was awful when there were many new and large files.

This is not a big issue when you have utilities like “Dropbox public links”, but I was fed up and created my own utility to clean unnecessary files out of my sample projects. BUT… as mentioned, programmers are lazy, and we probably don’t clean our solutions before we close them. So I had to create a utility that goes through all my source code locally and removes the files under the BIN and OBJ folders.

I called this utility CLEANBIN; its main objective is to loop through all my sample code on my local drive and remove all files from the BIN and OBJ folders in my Visual Studio solutions.
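The core of it can be sketched in a few lines of PowerShell (assuming PowerShell 3.0 or newer; the root folder below is a placeholder, so point it at your own source directory):

```powershell
# Sketch: delete all bin and obj folders under a source root.
# $root is an assumption - substitute your own code folder.
$root = "C:\Source\Samples"

Get-ChildItem -Path $root -Recurse -Directory |
    Where-Object { $_.Name -in @("bin", "obj") } |
    ForEach-Object {
        Write-Host "Removing" $_.FullName
        Remove-Item -Path $_.FullName -Recurse -Force
    }
```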


Happy coding…


MCM readiness videos

I just say: WOW!

SQL Server 2008 Microsoft Certified Master (MCM) Readiness Videos



Check SQL Server configuration with powershell

When you are working with many SQL Servers at once, you want to minimize the effort of validating configuration and settings. Normally you open Management Studio and run some commands in a query window or navigate through the GUI, but in newer versions (even PowerShell 2.0) you can access most of the SQL Server configuration properties and settings from PowerShell.

The following script connects to the server you are currently logged on to and writes out the typical configured properties you are looking for:

# Load SMO; the Version below is for SQL Server 2012 - adjust to your installed SMO.
Add-Type -AssemblyName "Microsoft.SqlServer.Smo, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"

# Connect to the local (default) instance.
$server = New-Object Microsoft.SqlServer.Management.Smo.Server($env:ComputerName)

Write-Host "---------------------------------------------"
Write-Host "##### check_sql_config, by sveroa       #####"
Write-Host "---------------------------------------------"

Write-Host "Server         : " $env:ComputerName
Write-Host "Network name   : " $server.Properties["NetName"].Value
Write-Host "Product        : " $server.Properties["Product"].Value
Write-Host "VersionString  : " $server.Properties["VersionString"].Value
Write-Host "Edition        : " $server.Properties["Edition"].Value;
Write-Host "ProductLevel   : " $server.Properties["ProductLevel"].Value
Write-Host "Processors     : " $server.Properties["Processors"].Value;
Write-Host "PhysicalMemory : " $server.Properties["PhysicalMemory"].Value;
Write-Host "Max Memory     : " $server.Configuration.MaxServerMemory.ConfigValue;
Write-Host "Data folder    : " $server.Properties["DefaultFile"].Value;
Write-Host "Log folder     : " $server.Properties["DefaultLog"].Value;
Write-Host "Backup folder  : " $server.Properties["BackupDirectory"].Value;
Write-Host "Collation      : " $server.Properties["Collation"].Value;

You can also run this script against a remote SQL Server with a few changes, described in this blog.

Happy coding…


Currency Collector

Edit (04-FEB-2014): new version of source code

I couldn’t keep away from programming during my long Christmas holiday, so I decided to “just code something useful” instead of starting something new. I remembered a quite simple and fun task from my previous employer – implementing a currency collector – and decided to rebuild it from scratch.

The national bank (Norges Bank) updates the currency rates every day at 15:15 (local time). The rates are published as a CSV file.


The file has the following format: the first column contains the date in the format DD-MMM-YY. The following columns contain data for all the available currencies – one column per currency. The column header contains both the unit and the currency code relative to Norwegian kroner (NOK). Each row contains the rates for all currencies on the date in column 1. The file contains all historical data back to 2001.


The main page for the currency rates is located here.

These are the official currency rates relative to NOK, but it might be more correct to collect data from different banks as well, to include the fees you pay when buying and selling currency.

The concept has a simple data model: one table storing information about the available currencies, one table for the collected rates, and a table for the rate sources. For the time being I have only implemented one source – NB, Norges Bank.


I have developed the solution in Visual Studio 2012 as a console application. The database is created/published with the project type “SQL Server Database Project”. This project type has evolved since I used it a few years back and has, in my opinion, become more mature.

For reading CSV files I always use the super-fast CSV reader by Sebastien Lorion. This lovely library is now also available via NuGet. I use log4net as the logging framework, and the database access is done with LINQ to SQL classes – simple and easy.

The program uses an input and an output directory. It stores all files retrieved from Norges Bank, stamped with a date, in the output directory, and the main program checks whether the file for the current day is already stored locally; if it exists, no action is taken.

If the currency file for today doesn’t exist, the program downloads the latest currency rates over HTTP from the URL described above. It then reads the currencies that are enabled in the database (Active = true), reads all new rates for those currencies from the downloaded file, and writes them to the database.
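The check-then-download step could look roughly like this in C# (the URL and folder are hypothetical placeholders, not the actual values from the project):

```csharp
using System;
using System.IO;
using System.Net;

class CurrencyDownloader
{
    static void Main()
    {
        // Placeholder values - substitute the real output folder and Norges Bank URL.
        string outputDir = @"C:\CurrencyCollector\Output";
        string url = "http://www.example.com/currency.csv";

        // One file per day; do nothing if today's file already exists.
        string localFile = Path.Combine(outputDir,
            "rates_" + DateTime.Today.ToString("yyyy-MM-dd") + ".csv");

        if (File.Exists(localFile))
        {
            Console.WriteLine("File for today already exists - nothing to do.");
            return;
        }

        Directory.CreateDirectory(outputDir);
        using (var client = new WebClient())
        {
            client.DownloadFile(url, localFile);
        }
        Console.WriteLine("Downloaded " + localFile);
    }
}
```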

This code is probably not a schoolbook example, but it works. I will refactor the code later **IF** I decide to implement a new currency rate source, such as DNB.

Full source code is located in my Dropbox.

Change history:

  • 04-FEB-2014: bugfix for a hardcoded folder error

Happy coding…


