Azure Service Bus Queues

Lately I have been experimenting with Azure Service Bus, especially Queues, but Azure Service Bus also offers features such as Event Hubs, Topics and Relay. I have the most experience with the concept of queues, so I started my journey there. A queue is a FIFO (First-In-First-Out) data structure: the first element added to the queue is the first one to be removed. This ensures that elements come out in the same order they were put into the queue, which can be illustrated with the following figure:


In the middle, the queue receives messages (items) from one or more message senders. These messages reside in the queue until they are taken out by one or more receivers, or are automatically moved to the “dead letter” queue (the garbage bin). You can also look at the first element with a “peek” operation.
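The FIFO behaviour described above can be sketched with a plain in-memory queue. This is a minimal illustration only, not the Service Bus API; `collections.deque` stands in for the broker:

```python
from collections import deque

queue = deque()

# Senders enqueue messages at the tail of the queue.
queue.append("first message")
queue.append("second message")

# A receiver dequeues from the head, so arrival order is preserved.
assert queue.popleft() == "first message"
assert queue.popleft() == "second message"
```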

When you work with queues in Microsoft Azure, you first have to create a namespace, a container where all the queues are located and grouped together. A namespace can contain both queues and topics. The code below shows the basic code for writing a test message to an Azure Service Bus Queue.

const string qName = "SampleQueue";

// Create the queue if it does not already exist.
var nsmgr = NamespaceManager.Create();
if (!nsmgr.QueueExists(qName))
    nsmgr.CreateQueue(qName);

var qClient = QueueClient.Create(qName);

// Send a test message with an explicit MessageId.
var msg = new BrokeredMessage("test message");
msg.MessageId = "1";
qClient.Send(msg);

//var msg1 = qClient.Peek();
//var msg2 = qClient.Receive();


After I had written my first element to an Azure Queue, I thought: “How do I look at the element in the queue without writing the code for retrieving it?” Paolo Salvatori has written a nice Service Bus Explorer for this purpose. As you might see, I have commented out the Peek and Receive operations. A Receive also requires a Complete operation to remove the message from the queue.
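The difference between the two commented-out operations can be shown with a plain list-based queue (a generic sketch, not the Service Bus API): peek inspects the head without removing it, while receive takes the message off the queue.

```python
from collections import deque

queue = deque(["msg-1", "msg-2"])

# Peek: inspect the head of the queue without removing it.
peeked = queue[0]
assert peeked == "msg-1"
assert len(queue) == 2   # nothing was removed

# Receive: take the head message off the queue.
received = queue.popleft()
assert received == "msg-1"
assert len(queue) == 1   # the message is gone
```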

Happy coding….


Downloadable Azure Icon Set for PowerPoint & Visio

Here is a nice icon set from Microsoft that can be used in PowerPoint presentations and Visio drawings.


When is SQL Server restarted?

It has been a while since the last post, due to summer vacation and busy working hours.

In my post on simple database activity monitoring, I described a method for calculating periodic values for read and write operations on databases. These values are calculated from DMVs (Dynamic Management Views) that store performance counters. The problem is that most of these DMVs are reset when the SQL Server service or the server itself is restarted. Here are three methods for finding the last restart time:

Method 1

SELECT login_time FROM sysprocesses WHERE spid = 1

Method 2

SELECT sqlserver_start_time FROM sys.dm_os_sys_info

Method 3

SELECT [crdate] AS [LastStartupDate]
FROM [dbo].[sysdatabases]
WHERE [name] ='tempdb'
AND [crdate] < GETDATE()


Powershell – download all videos from a RSS feed

In these vacation times there is always a risk of tech abstinence and low bandwidth. We discussed this at work yesterday and started looking for ways to mass download videos from Channel 9. And of course, PowerShell is the solution to almost every problem :)

A colleague of mine found a nice post on the topic by Scott Hanselman. There were some errors in his script that I traced to a local path issue. I corrected the problem and modified it into a script with parameters for the URL and the local path where the videos should be stored.


param($url, $storageDir)

If (!(Test-Path $storageDir))
{
    Write-Host "Path '" $storageDir "' does not exist";
    return ;
}

$feed = [xml](New-Object System.Net.WebClient).DownloadString($url)

Write-Host "Downloading from : '" $feed.rss.channel.title "' to folder " $storageDir

foreach($i in $feed.rss.channel.item)
{
    $url = New-Object System.Uri($i.enclosure.url)
    $file = $storageDir + "\" + $url.Segments[-1]

    $started = Get-Date;
    Write-Host "Starting: " $started
    Write-Host "URL: " $url.ToString()
    Write-Host "Local file: " $file

    (New-Object System.Net.WebClient).DownloadFile($url, $file)

    $runtime = $(Get-Date) - $started
    $secs = $runtime.TotalSeconds.ToString("#0.0");
    $downloadtime = [string]::Format("{0} sec(s)", $secs)

    Write-Host "Completed in " $downloadtime

    Write-Host "--------------------"
}

Suggested “watch list”:

Happy vacation everybody…





GetDatabaseVersion – Stringify Compatibility Level

Are you working on lots of databases with different compatibility levels and different SQL Server versions? I face this problem almost every day. I have to make my code stable and robust enough to work on different SQL Server versions, and the code must use functionality supported by databases with different compatibility levels.

I have to check the database compatibility level before configuring the different performance counters and custom features for my database activity monitor, described in my previous post. I was pretty baffled when I discovered this logical error last week.

As a result, I created a helper function that I use in my day-to-day work to avoid mixing up the different compatibility levels.

CREATE FUNCTION [dbo].[GetDatabaseVersion]
(
	@dbname NVARCHAR(100)
)
RETURNS NVARCHAR(100)
AS
BEGIN
	DECLARE @returnvalue NVARCHAR(100) ;
	DECLARE @ver INT ;

	SELECT @ver = compatibility_level
	FROM sys.databases WHERE name = @dbname ;

	SET @returnvalue =
		CASE
			WHEN @ver = '80' THEN 'SQL Server 2000'
			WHEN @ver = '90' THEN 'SQL Server 2005'
			WHEN @ver = '100' THEN 'SQL Server 2008 (R2)'
			WHEN @ver = '110' THEN 'SQL Server 2012'
			WHEN @ver = '120' THEN 'SQL Server 2014'
		END ;

	RETURN @returnvalue ;
END
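If you need the same mapping on the client side (for example in a deployment or inventory script), it can be sketched in Python. The dictionary below simply mirrors the CASE branches of the T-SQL function; the function name and the "Unknown" fallback are my own additions:

```python
# Mirror of the CASE expression in dbo.GetDatabaseVersion.
COMPATIBILITY_LEVELS = {
    80: "SQL Server 2000",
    90: "SQL Server 2005",
    100: "SQL Server 2008 (R2)",
    110: "SQL Server 2012",
    120: "SQL Server 2014",
}

def get_database_version(compatibility_level: int) -> str:
    """Translate a compatibility_level value into a readable version name."""
    return COMPATIBILITY_LEVELS.get(compatibility_level, "Unknown")

assert get_database_version(110) == "SQL Server 2012"
```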




Monitoring and Baselining Database Activity

Have you ever wondered how much a SQL Server database has been used in the last day, week, month or year, or whether it is used at all? Two central questions are: (1) Do users write or update data in the database? (2) Is the database read-only, kept for historical usage?

These are two key questions for most migration projects, and they are important if you want to reduce the number of databases in your migration plans. The amount of INSERT/UPDATE (write) and SELECT (read) statements is vital when determining the importance of a database in the migration. If the database is used only during normal work hours, we can request a long downtime window during migration. If the database is used during both day and night, you might want to check whether a weekend migration is more suitable. If the database is part of a larger application ecosystem with lots of integrations, you might want to choose a longer downtime period or change the migration strategy.

I have always been interested in performance and database optimization, to be able to detect current bottlenecks or predict future performance issues. Last year I came across a very interesting blog post written by Erin Stellato on the baselining topic. In addition, Glenn Berry, a colleague of Erin at SQLskills, has an interesting blog series on SQL Server diagnostic information. This information can be used to gather, aggregate and store data about how the SQL Server is performing over time. This enables us to establish scenarios (read: baselines) for different periods, such as normal working hours, mornings, evenings, weekends, peak periods and so on. These baselines can be used to measure whether a server is performing as expected or not.

Several DMVs (Dynamic Management Views) contain information that can be used to monitor the activity:

  • sys.dm_db_index_usage_stats: contains counts of different types of index operations and the time when they were last performed. The most interesting columns are:
    • user_seeks – number of index seeks
    • user_scans – number of index scans
    • user_lookups – number of index lookups
    • user_updates – number of insert, update or delete operations
  • sys.dm_db_index_operational_stats: contains current low-level I/O, locking, latching, and access method activity for each partition of a table or index in the database. The most interesting columns for reading data are the following:
    • range_scan_count – cumulative count of range and table scans started on the index or heap.
    • singleton_lookup_count – cumulative count of single row retrievals from the index or heap.
  • sys.dm_os_performance_counters: contains cumulative and current values for different SQL Server performance counters. The most interesting counters for monitoring the number of transactions (INSERT/UPDATE operations) against user defined tables are the following:
    • “Write Transactions/sec” (SQL 2008 and newer)
    • “Transactions/sec” (SQL 2005)
    • “User Connections”

Based on the post from Erin, I have created a solution where I’m able to keep track of INSERT/UPDATE and different SELECT operations for the databases on a SQL Server. I have defined two tables and one view:

  • Counters (table): contains the definition of all counters and related information for the performance counters and custom “counters” used in the solution. The column Name is a constructed name based on the “object_name” and “counter_name” columns from the DMV sys.dm_os_performance_counters. A counter can be deactivated by setting the Active column to 0 (zero).
  • PerfmonData (table): all data based on DMVs or custom calculations. This stores either calculated delta values based on cumulative counter values, or snapshot values for the specified counter. The interval between calculations is decided by the schedules used by the agent job that is responsible for collecting data.
  • v_read_count (view): contains an aggregation of read access for all databases on the server. The data is based on sys.dm_db_index_operational_stats.
Do the following steps to build your own baseline database:

  1. Create a database named “BaselineLight”. Add the schema by running this script.
  2. Next, create the configuration with the initial counter values for the databases. Download and run the “Create Counters” script.
  3. Create a SQL Agent job for collecting data. Download and run the “Create Collector job” script.
  4. Create a test database – mine is called Roar2. Download the full script.
  5. Add test data with the following command:
     INSERT INTO [Roar2].[dbo].[RoarTest]([CreatedAt]) VALUES (getdate());
  6. Run some single lookups and range reads against the RoarTest table with the following commands:
     SELECT * FROM [Roar2].[dbo].[RoarTest] WHERE Id >= 5 AND Id < 10 ;
     SELECT * FROM [Roar2].[dbo].[RoarTest] WHERE Id = 5 ;

Based on the read/write data in the PerfmonData table, we can create a forecast of how much a database is used every day, all day long. The schedule for the CollectPerfmonData job can be adjusted to, for example, 5 or 15 minutes for a finer granularity of the calculated data.

Now we have data in the DMVs described above for the Roar2 database. When the job “CollectPerfmonData” runs the next time, the table “PerfmonData” will be populated with the delta values between (1) the run time of CollectPerfmonData and (2) the time when the initial counter values were recorded.
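The delta calculation itself is simple: subtract the previously recorded cumulative counter value from the current one. The sketch below illustrates the idea with hypothetical snapshot dictionaries standing in for rows from sys.dm_os_performance_counters; the reset handling reflects the point from the earlier post that these counters start over when SQL Server is restarted:

```python
def counter_deltas(previous: dict, current: dict) -> dict:
    """Compute per-counter deltas between two cumulative snapshots.

    A negative delta means the counter was reset (service restart),
    in which case the cumulative value itself is the activity since
    the restart.
    """
    deltas = {}
    for name, value in current.items():
        delta = value - previous.get(name, 0)
        deltas[name] = delta if delta >= 0 else value
    return deltas

# Two snapshots taken one collection interval apart.
prev = {"Write Transactions/sec": 1000}
curr = {"Write Transactions/sec": 1250}
assert counter_deltas(prev, curr) == {"Write Transactions/sec": 250}

# After a service restart the counter starts over from zero.
assert counter_deltas({"Write Transactions/sec": 1000},
                      {"Write Transactions/sec": 40}) == {"Write Transactions/sec": 40}
```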



Blog milestone – 25k

This blog reached a nice little milestone this week – 25K hits since October 2012. That is an average of 1200 hits/month since I first started blogging regularly, with at least 4-5 blog posts each month.

The blog is still used as my personal “notebook”, where I collect tricks and tips on the things I work with on a day-to-day basis. I hope you all enjoy reading it, and I will try my hardest to make the blog posts interesting for everyone else :)

I will still write posts on Microsoft Technologies like Windows Server, Azure, SQL Server and other things I find interesting.




