The T-SQL script below finds all tables and columns that reference a particular primary key column (set [pk-table] and [pk-column] in the WHERE clause). This script is pretty useful when you are working close to the database, manipulating data directly and so on.
SELECT
    FK_Table = FK.TABLE_NAME,
    FK_Column = CU.COLUMN_NAME,
    PK_Table = PK.TABLE_NAME,
    PK_Column = PT.COLUMN_NAME,
    Constraint_Name = C.CONSTRAINT_NAME
FROM INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS C
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS FK
    ON C.CONSTRAINT_NAME = FK.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS PK
    ON C.UNIQUE_CONSTRAINT_NAME = PK.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE CU
    ON C.CONSTRAINT_NAME = CU.CONSTRAINT_NAME
INNER JOIN (
    -- All primary key columns, used to filter on the PK side
    SELECT i1.TABLE_NAME, i2.COLUMN_NAME
    FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS i1
    INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE i2
        ON i1.CONSTRAINT_NAME = i2.CONSTRAINT_NAME
    WHERE i1.CONSTRAINT_TYPE = 'PRIMARY KEY'
) PT
    ON PT.TABLE_NAME = PK.TABLE_NAME
WHERE PK.TABLE_NAME = '[pk-table]'
    AND PT.COLUMN_NAME = '[pk-column]'
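For example, to find every table referencing a hypothetical [Customer] table through its [CustomerId] primary key column (both names are just placeholders for illustration), the WHERE clause would read:

WHERE PK.TABLE_NAME = 'Customer'
    AND PT.COLUMN_NAME = 'CustomerId'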
Microsoft has released three Azure Specialist exams over the last few months. I have been watching a lot of videos on Microsoft Virtual Academy, Channel9 and Pluralsight over the last few years, and very intensively since December 2014.
70-532 Developing Microsoft Azure Solutions. This is a developer exam for people who want to be able to design, program, implement, automate, and monitor Microsoft Azure solutions.
70-533 Implementing Microsoft Azure Infrastructure Solutions. This is an exam for IT pros and solution architects who want to implement an infrastructure solution in Microsoft Azure. Candidates have experience implementing and monitoring cloud and hybrid solutions as well as supporting application lifecycle management.
70-534 Architecting Microsoft Azure Solutions. This is an exam for solution architects who should know the features and capabilities of Azure services to be able to identify tradeoffs and make decisions when designing public and hybrid cloud solutions. Candidates who take this exam are expected to be able to define the appropriate infrastructure and platform solutions to meet the required functional, operational, and deployment requirements through the solution lifecycle.
- Channel9 – video series
Our department had an interesting challenge last week. We have an old on-premise “Team Foundation Server” (TFS) in our data room, with two VMware hosts containing a number of virtual machines. Due to new company policies we needed to move all domain-bound VMs to a new domain. As we feared, this caused a few problems due to old versions and incorrect editions of different software.
The first thing I did was to perform a “Get Latest” on all source code and take a VM snapshot before we started the actual migration process. We needed a “Plan B” if the migration failed. After a few days of migration failures, with loads of issues between SharePoint, Project Server, SQL Server and TFS, we decided to make a clean install and move the source code into new team projects. The problem now was that the old TFS server had about 60 team projects that needed to be recreated manually.
As a lazy programmer, I prefer a command-line utility to help me with this project creation. Luckily, 99% of all team projects didn’t use the SharePoint site, so for the moment I just have to migrate source code into the version control of the new TFS server.
The command-line tool needed for the team project creation is called “TFS Power Tools”, and it exists for the latest versions of Visual Studio (2012 and 2013). Here is the command template I have used for our team projects.
tfpt createteamproject /collection:"http://[IP or Hostname]:8080/tfs/DefaultCollection" /teamproject:"[project name]" /processtemplate:"Microsoft Visual Studio Scrum 2.2" /sourcecontrol:New /noreports /noportal
I got the list of team project directories by running “dir /b” at the command prompt, put this directory list into Excel, and generated one command for each project based on the command-line template above. All these commands were put into a command file (.cmd) and run, as shown below. When this is completed I will add all files from the projects from the “Source Control Explorer” in Visual Studio. Some manual work is still needed.
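The generated command file ends up as one tfpt line per project, something like this (the hostname “tfs01” and the project names are made up for illustration):

tfpt createteamproject /collection:"http://tfs01:8080/tfs/DefaultCollection" /teamproject:"ProjectA" /processtemplate:"Microsoft Visual Studio Scrum 2.2" /sourcecontrol:New /noreports /noportal
tfpt createteamproject /collection:"http://tfs01:8080/tfs/DefaultCollection" /teamproject:"ProjectB" /processtemplate:"Microsoft Visual Studio Scrum 2.2" /sourcecontrol:New /noreports /noportal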
I have a private project going on where I’m in the data cleaning and import phase. This weekend’s problem has been a lot of date columns stored as strings in two different formats (yyyymmdd and ddmmyyyy) and with different separator characters (‘-‘ and ‘.’) all over the place. I want these strings converted to the date data type. Being a lazy programmer, I made a String2Date function to use for my import stuff. Here is the first version of the function.
CREATE FUNCTION [dbo].[String2Date]
(
    @string NVARCHAR(100),
    @style SMALLINT = 104 -- default: dd.mm.yyyy
)
RETURNS date
AS
BEGIN
    -- A separator at position 5 means the string starts with a four-digit year
    IF (CHARINDEX('-', @string, 1) = 5 OR CHARINDEX('.', @string, 1) = 5)
    BEGIN
        SET @style = 102 ; -- yyyy.mm.dd
    END ;
    RETURN CONVERT(date, @string, @style) ;
END
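A few test calls against the formats from my import data, as a quick sanity check (note that T-SQL requires the DEFAULT keyword when you want a scalar function to use its default parameter value):

SELECT dbo.String2Date('20141231', DEFAULT) ;   -- unseparated yyyymmdd
SELECT dbo.String2Date('2014-12-31', DEFAULT) ; -- separator at position 5, switches to style 102
SELECT dbo.String2Date('31.12.2014', DEFAULT) ; -- dd.mm.yyyy, handled by the default style 104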
This function is not perfect, but it suits my purpose for the moment. It will probably throw an exception now and then, but I don’t care :)
Today I found a pretty nice way to delete duplicate rows in a table on SQL Server. I had a table with 25,000 rows, where 7,500 of them were duplicates of one or more other rows. I was not very eager to delete these duplicates manually, so I started googling for answers. I found many different approaches, and eventually found my answer at stackoverflow.com. That thread helped me write a simple SQL statement after I added an [Id] column to uniquely identify each row. The trick is to use a Common Table Expression (CTE) in SQL Server together with the ROW_NUMBER() OVER() window function to create a unique row number for all duplicates within the key expression (in my case the [Name] column), ordered by the [Id] column. I only want to keep the first row for each duplicate item, so I delete all rows with [rowno] greater than 1.
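The [Id] column only needs to uniquely identify each row; a minimal way to add one (a sketch, assuming an identity column is acceptable for the table) is:

ALTER TABLE [dbo].[fm2011] ADD [Id] INT IDENTITY(1,1) ;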
WITH cte_duplicates AS
(
    SELECT
        [Name],
        ROW_NUMBER() OVER (PARTITION BY [Name] ORDER BY [Id]) AS [rowno]
    FROM [dbo].[fm2011]
)
DELETE FROM cte_duplicates
WHERE [rowno] > 1
I have a goal for February to complete 3 challenges for “KiPi 2015”. It should be possible even if I’m going to sell the house in this period. The latest challenge lasts for 32.5 hours and covers many topics within hybrid cloud design and implementation. Designing private cloud data centers is not my current work focus, but it is very useful knowledge in my work as a Cloud Solution Architect.
This module consists of the following parts:
- Part 1: Modernizing Your Datacenter Jump Start (Level 200, 52 points)
- Part 2: Hybrid Cloud Workloads: Storage and Backup (Level 200, 52 points)
- Part 3: Hybrid Cloud Workloads: Disaster Recovery and High Availability (Level 200, 44 points)
- Part 4: Hybrid Cloud Workloads: Websites (Level 200, 50 points)
- Part 5: Hybrid Cloud Workloads: SQL (Level 200, 52 points)
- Part 6: Deploying Linux VMs on Microsoft Azure (Level 300, 29 points)
- Part 7: Migrating VMs from Amazon AWS to Microsoft Azure (Level 200, 36 points)
- Part 8: Automating the Cloud with Azure Automation (Level 300, 42 points)
I have just a few hours left of the first part of my KiPi 2015 Challenge for “Cloud Development“, and the next course path is “Mobile Development”. This path contains video sessions on C#, XAML, Universal app development, and how Xamarin and Visual Studio can be used for cross-platform development. These courses total about 32 hours of watching.
- Part 1: Programming in C# Jump Start (Level 200, 38 points)
- Part 2: Developing Universal Windows Apps with C# and XAML (Level 200, 118 points)
- Part 3: Cross-Platform Development with Xamarin & Visual Studio (Level 200, 47 points)