In my last post, I wrote about how to detect fragmentation and how to fix it. This post describes a feature that many people in the SQL Server community hate: AutoShrink. It is a single setting under “Database Properties” that might seem like a pretty good idea to someone who doesn’t know all the facts, but it will sooner or later result in decreased performance. In the worst case, this feature will totally fragment your database – tables, indexes and even the files at the file system level.
You can run the following command to detect which databases have AutoShrink turned on:
SELECT name, is_auto_shrink_on FROM sys.databases WHERE is_auto_shrink_on = 1;
You can run the following command to turn off AutoShrink for a database:
ALTER DATABASE [MyDB] SET AUTO_SHRINK OFF;
The AutoShrink feature fires every 30 minutes by default, and it severely fragments the database. Combined with AutoGrowth, it can totally fragment the entire database. AutoShrink starts at the end of the file and brute-force moves the pages as close to the beginning of the file as possible, so the pages end up physically out of order inside the database. Even if you ran a full index rebuild the night before, AutoShrink can fragment the database all over again.
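Combining the two commands above, here is a small sketch that generates (but does not execute) the ALTER statement for every offending database, so you can review the list before running it:

```sql
-- Sketch: generate an ALTER DATABASE statement for every database
-- that currently has AutoShrink enabled. Copy the output into a
-- query window and run it after reviewing.
SELECT 'ALTER DATABASE ' + QUOTENAME(name) + ' SET AUTO_SHRINK OFF;'
FROM sys.databases
WHERE is_auto_shrink_on = 1;
```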
If you have performance issues on your database, one of the first things you should check is fragmentation. If the fragmentation is high, SQL Server gives you the option to either reorganize or rebuild the indexes. You can detect index fragmentation by using the Dynamic Management View (DMV) sys.dm_db_index_physical_stats and checking the avg_fragmentation_in_percent column.
USE [MyDB];
SELECT a.index_id, name, avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, NULL) AS a
JOIN sys.indexes AS b
  ON a.object_id = b.object_id AND a.index_id = b.index_id
ORDER BY avg_fragmentation_in_percent DESC;
If avg_fragmentation_in_percent is between 5% and 30%, you should reorganize your indexes, not do a total rebuild. A rebuild should only be invoked if the fragmentation percentage is greater than 30%. No action should be taken if the fragmentation percentage is lower than 5%, which is a normal level of fragmentation.
The following command will rebuild all indexes with the default fill factor:
Use [MyDB]; EXEC sp_MSforeachtable @command1="print '?'", @command2="ALTER INDEX ALL ON ? REBUILD WITH (ONLINE=OFF)"
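If you want to follow the 5%/30% guideline instead of rebuilding everything, a sketch like the following generates a suggested command per index (review the output before executing; [MyDB] is a placeholder):

```sql
-- Sketch: suggest REORGANIZE or REBUILD per index based on the
-- 5% / 30% fragmentation thresholds discussed above.
USE [MyDB];
SELECT QUOTENAME(OBJECT_NAME(a.object_id)) AS table_name,
       QUOTENAME(b.name)                   AS index_name,
       a.avg_fragmentation_in_percent,
       CASE
         WHEN a.avg_fragmentation_in_percent > 30
           THEN 'ALTER INDEX ' + QUOTENAME(b.name) + ' ON '
                + QUOTENAME(OBJECT_NAME(a.object_id)) + ' REBUILD;'
         WHEN a.avg_fragmentation_in_percent >= 5
           THEN 'ALTER INDEX ' + QUOTENAME(b.name) + ' ON '
                + QUOTENAME(OBJECT_NAME(a.object_id)) + ' REORGANIZE;'
         ELSE '-- no action needed'
       END AS suggested_command
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, NULL) AS a
JOIN sys.indexes AS b
  ON a.object_id = b.object_id AND a.index_id = b.index_id
WHERE b.name IS NOT NULL  -- skip heaps, which have no index name
ORDER BY a.avg_fragmentation_in_percent DESC;
```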
I just say: WOW!
SQL Server 2008 Microsoft Certified Master (MCM) Readiness Videos
When you are working with many SQL Servers at once, you want to minimize the effort spent validating configuration and settings. Normally, you open Management Studio and run some commands in a query window or navigate through the GUI. But in newer versions (even PowerShell 2.0) you can access most of the SQL Server configuration properties and settings through PowerShell commands.
The following script will connect to the server you are currently logged on to and print the typical configuration properties you are looking for:
Add-Type -AssemblyName "Microsoft.SqlServer.Smo, `
  Version=10.0.0.0, `
  Culture=neutral, `
  PublicKeyToken=89845dcd8080cc91"

$server = New-Object `
  Microsoft.SqlServer.Management.Smo.Server($env:ComputerName)

Write-Host "---------------------------------------------"
Write-Host "##### check_sql_config, by sveroa #####"
Write-Host "---------------------------------------------"
Write-Host "Server         : " $env:ComputerName
Write-Host "Network name   : " $server.Properties["NetName"].Value
Write-Host "Product        : " $server.Properties["Product"].Value
Write-Host "VersionString  : " $server.Properties["VersionString"].Value
Write-Host "Edition        : " $server.Properties["Edition"].Value
Write-Host "ProductLevel   : " $server.Properties["ProductLevel"].Value
Write-Host "Processors     : " $server.Properties["Processors"].Value
Write-Host "PhysicalMemory : " $server.Properties["PhysicalMemory"].Value
Write-Host "Max Memory     : " $server.Configuration.MaxServerMemory.ConfigValue
Write-Host "Data folder    : " $server.Properties["DefaultFile"].Value
Write-Host "Log folder     : " $server.Properties["DefaultLog"].Value
Write-Host "Backup folder  : " $server.Properties["BackupDirectory"].Value
Write-Host "Collation      : " $server.Properties["Collation"].Value
You can also run this script against a remote SQL Server with a few changes, described in this blog post.
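The main change for a remote server is passing the instance name instead of $env:ComputerName. A minimal sketch (the server name "SQLSRV01\MyInstance" is a placeholder, and your account needs rights on the remote instance):

```powershell
# Sketch: the same SMO approach against a remote instance.
# "SQLSRV01\MyInstance" is a placeholder - replace with your own server.
Add-Type -AssemblyName "Microsoft.SqlServer.Smo, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"

$server = New-Object Microsoft.SqlServer.Management.Smo.Server("SQLSRV01\MyInstance")
Write-Host "VersionString : " $server.Properties["VersionString"].Value
Write-Host "Edition       : " $server.Properties["Edition"].Value
```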
Edit (04-FEB-2014): new version of source code
I couldn’t keep away from programming during my long Christmas holiday, so I decided to “just code something useful” instead of starting something new. I remembered a quite simple and fun task from my previous employer – implementing a currency collector – and decided to rebuild it from scratch.
The Norwegian national bank (Norges Bank) updates the currency rates every day at 15:15 (local time). The rates are published as a CSV file.
The file has the format shown in the screenshot below. The first column contains the date in the format DD-MMM-YY. The following columns contain data for all the available currencies – one column per currency. The column headers contain both the unit and the currency code relative to Norwegian kroner (NOK). Each row contains the rates for all currencies for the date in column 1. The file contains all historical data back to 2001.
The main page for the currency rates is located here:
These are the official currency rates against NOK, but it might be more correct to also collect data from different banks, to include the fees you have to pay when buying and selling currency.
The concept has a simple data model: one table for information about the available currencies, one table for the collected rates, and one table for the rate sources. For the time being, I have only implemented one source – NB, Norges Bank.
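The source code contains the authoritative schema, but a minimal sketch of the three tables could look like this (names and types here are my assumptions, not the actual schema):

```sql
-- Hypothetical sketch of the data model; the real schema is in the source code.
CREATE TABLE CurrencySource (
    SourceId   INT IDENTITY PRIMARY KEY,
    Name       NVARCHAR(50) NOT NULL          -- e.g. 'NB' (Norges Bank)
);

CREATE TABLE Currency (
    CurrencyId INT IDENTITY PRIMARY KEY,
    Code       CHAR(3) NOT NULL,              -- ISO code, e.g. 'USD'
    Unit       INT NOT NULL DEFAULT 1,        -- 1 or 100, as in the CSV header
    Active     BIT NOT NULL DEFAULT 1         -- only active currencies are collected
);

CREATE TABLE CurrencyRate (
    CurrencyId INT NOT NULL REFERENCES Currency(CurrencyId),
    SourceId   INT NOT NULL REFERENCES CurrencySource(SourceId),
    RateDate   DATE NOT NULL,
    Rate       DECIMAL(18,6) NOT NULL,        -- value in NOK per Unit
    PRIMARY KEY (CurrencyId, SourceId, RateDate)
);
```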
I developed the solution in Visual Studio 2012 as a console application. The database is created/published with the project type “SQL Server Database Project”. This project type has evolved since I used it a few years back and has, in my opinion, become more mature.
For reading CSV files I always use the super-fast CSV reader by Sebastien Lorion. This lovely library is now also available via NuGet. I use log4net as the logging framework, and the database access is done with “LINQ to SQL” classes – simple and easy.
The program uses an input and an output directory. It stores all files retrieved from Norges Bank, stamped with a date, in the output directory, and the main program checks whether the file for the current day is already stored locally and takes no action if it exists.
If the currency file for today doesn’t exist, the program downloads the latest currency rates over HTTP from the URL described above. It then reads the currencies that are enabled in the database (Active = true), reads all new rates for those currencies from the downloaded file, and writes them to the database.
This code is probably not a textbook example, but it works. I will refactor the code later **IF** I decide to implement a new currency rate source, such as “DNB”.
Full source code is located at my dropbox.
- 04-FEB-2014: bug fix for a hard-coded folder error
The process of creating an unattended installation procedure for the newest “beta” release of SQL Server is the same as described in my previous post for SQL Server 2008 R2. But one of the major differences is that you should install SQL Server 2014 on Windows Server 2012, where we can use the newest version of PowerShell and take advantage of new functionality like disk management and other advanced OS configuration.
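As an example of that new OS-level functionality, here is a sketch that prepares a dedicated SQL data disk with the Storage cmdlets that shipped with Windows Server 2012 (the disk number and drive letter are assumptions for your environment):

```powershell
# Sketch: initialize and format a dedicated SQL Server data disk.
# Disk number 1 and drive letter S are placeholders.
Initialize-Disk -Number 1 -PartitionStyle GPT
New-Partition -DiskNumber 1 -UseMaximumSize -DriveLetter S
# A 64 KB allocation unit size is the common recommendation for SQL Server data volumes.
Format-Volume -DriveLetter S -FileSystem NTFS -AllocationUnitSize 65536 -Confirm:$false
```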
I created the configuration file as shown above, and added the following statements:
I also had to remove the UIMODE and QUIET statements, since they can’t be combined with the statement that optimizes the unattended install options:
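For reference, a sketch of how the relevant part of the configuration file can look (the exact parameter set depends on your edition and feature selection; license acceptance is typically passed on the setup.exe command line with /IAcceptSQLServerLicenseTerms rather than in the file):

```ini
; Sketch of the relevant ConfigurationFile.ini entries for an unattended install.
; QUIETSIMPLE runs setup unattended but shows progress; it cannot be combined
; with UIMODE or QUIET, which is why those statements were removed.
ACTION="Install"
QUIETSIMPLE="True"
```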
This is my first install of SQL Server 2014 CTP2, and I will later optimize the PowerShell installation script with a set of the most important pre- and post-installation tasks.