Finally, today I found the source of why the default MVC web application fails after upgrading the NuGet packages. This has been a thorn in my side for some months. I use the “ASP.NET Web Application” and “ASP.NET Core Web Application” templates a lot when I want to try out new things, and use them as a starting point.
But this winter these templates started to fail: missing formatting made the test websites ugly. The root cause is Bootstrap. The template is based on version 3.3.7, and the next step up is 4.0. If you upgrade the Bootstrap NuGet package to 4.0 or newer, your website will look like the following instead of the standard layout.
The work-around is not to include Bootstrap 4.0 or newer in your test project until Microsoft has fixed the project template. I just use “Select all packages”, uncheck Bootstrap, and update the rest. That should be safe, and things work as expected.
Of course, you can fix the code yourself, but this is not on my mind at this time 🙂
Bootstrap 4 is almost a complete rewrite, and if you need to migrate to Bootstrap 4, please follow the upgrade guide.
EDIT: March 2018. This is an unpublished blog post from winter 2016/2017, when I was working for EVRY and had a blast brainstorming around AI and chatbots.
After being interested in BOT technology for the past few years, I found a pretty perfect scenario for an application. I’m commuting between Fredrikstad and Oslo most weekdays, and some days I depend on the train being on time for me to pick up my daughter at school/SFO (after-school activities) before they close. So, when I’m required to pick up my daughter I need the train to be on time; otherwise I need to make a lot of phone calls to avoid unpleasant moments when I arrive in Fredrikstad 🙂
A few weeks back, my train was pretty delayed, but luckily I was not responsible for picking up my daughter at school. But as a developer at heart, I asked myself: “What if?” What could go wrong if your train or other transportation is delayed?
- You would probably get a fine for being late to pick up your child
- You might have to reschedule deliveries at your home
- You have to notify family about late arrival
- I want to order a taxi on the fly to pick up my daughter if my wife/ex is at work as well
- You have to reschedule or cancel other activities due to late arrival
- What about dinner?
- Errands that you HAVE to run?
As an “automate-it-all” developer, one of my questions was: “How can I solve this with as little interaction as possible?” This problem has buzzed in my head for the past few months. And when I was asked to suggest a project for summer interns, I found this to be the perfect scenario.
The idea is several BOTs communicating with each other when certain situations occur. The technology is ready to solve this kind of problem, but a communication protocol between the applications would have to be defined.
EDIT: March 2018. If I had implemented this BOT protocol, it would have saved me a lot. Maybe someone will take up the challenge?
My daughter didn’t like me commuting from Sarpsborg to Oslo almost every day, and the past few months have become worse on the train (NSB): fully packed with people and no good place to work. With 2.5 hours one way, I needed to be able to work on the train. Therefore, I decided to get a job back home in the Sarpsborg/Fredrikstad area again.
Good luck to my former colleagues at komplettbank.no and komplettbank.fi with the modernization of the platform for Kompis, CustomerDirect and PosFinans. It has been a fast-paced journey, but unfortunately it ends here for me.
I now work for CGI Norway, which has an office location in Sarpsborg. Worldwide, CGI Group Inc. has no fewer than 70,000 employees at over 400 locations, and is listed on the New York (NYSE) and Toronto (TSX) stock exchanges at around $74/share. I will work in a department that focuses on the energy business – grid and power supply companies and their customers. It is a business in fast change with new regulations, and with the potential to use new technology for automating the relationship between companies and their end users.
I will have to refresh my Oracle knowledge, which dates back to Oracle 8.1.7 in 2002 – and I proudly remembered the folder name of tnsnames 🙂
Now, I just need to get the MSDN Enterprise subscription and have a chat with IT to get rid of some stupid AD policies. Yesterday, I had to supply my username and password 17 times. Today the number was 12, while installing and upgrading the laptop with the proper software and configuration.
Since I have begun working in the banking industry, I will start a new learning project. I will rewrite my previous currency collector to use the OpenExchangeRate API for real-time data for a given set of currencies like Bitcoin, Ethereum, USD, NOK and EUR. The OER API supports a long list of currencies. At the current time, the developer plan costs only $12/month with 10K calls to the API – a good deal, if you ask me.
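A first building block for the collector could be a simple converter over the rate data. OER returns rates with USD as the base currency, so any pair can be converted via USD. A minimal Python sketch of the idea – the `sample_rates` values below are made-up sample data, not live API output:

```python
# Sketch of a converter over OpenExchangeRate-style data.
# OER's latest.json returns {"base": "USD", "rates": {...}};
# the rates below are invented sample values, not live data.

def convert(amount, from_ccy, to_ccy, usd_rates):
    """Convert an amount between two currencies via USD, the base of the rates dict."""
    usd = amount / usd_rates[from_ccy]   # step 1: into USD
    return usd * usd_rates[to_ccy]      # step 2: out of USD

sample_rates = {"USD": 1.0, "NOK": 8.0, "EUR": 0.9, "BTC": 0.0001}

print(round(convert(100, "USD", "NOK", sample_rates), 2))  # 800.0
```

The real collector would fetch the rates dict from the API (the `latest.json` endpoint with an `app_id`) and feed it into the same function.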
My new employer komplettbank.no offers consumer loans and credit cards at the moment, but in my learning project I will expand this to involve currency loans and maybe blockchain with Bitcoin and Ethereum.
I will create entities for customer, account, loan and currency, in addition to supporting entities for calculating risk and correlation. My goal also includes adding some machine learning into this for a start. In addition, I will create a BOT for asking for currency rates and currency conversions.
My goal will be to use Azure SQL, Web Apps, and Azure Functions and/or Azure Service Fabric for a microservice application architecture.
Happy coding and designing…
There have been a lot of blog articles about chat bots and robots that will invade our daily life. Don’t worry, they are already here. Microsoft Azure has a lot of services that can be used to create bots of different kinds. Azure has some powerful cognitive services that enable a chat bot to accept user input categorized as utterances – either as voice or as written text. Azure has Speech APIs and Translation APIs to convert back and forth between speech and text, in addition to translating between different languages.
In the past years, Microsoft Azure has released a lot of services around analytics and data management. Many of them are centered around what is called the Cortana Intelligence Suite (CIS), shown in the picture below:
This suite connects many services together and shows how Microsoft is defining analytics for the future. At the bottom right, the chat bots reside together with mobile and web applications. In short, CIS collects on-premises and cloud data from a variety of sources like IoT devices, external apps, APIs and similar. These data can be stored in Data Lake Store or SQL DW, which is the basis for analytical services such as Azure Machine Learning, Data Lake Analytics and Stream Analytics. The data from storage, or the results from the analytics tools, can be visualized with Power BI and queried from web and bot applications.
But this will be the topic of future blog posts – over and out!
In mid-February, I was discussing with some colleagues at EVRY about holding a hackathon soon. Immediately, I started to think of what I wanted to create, independently of the hackathon discussions. After a while I was fascinated by the idea of creating a coffee machine bot using Microsoft technologies to their full extent. The plan is to use this project to learn new things and put everything together around the “Coffee Machine Bot” idea.
I have multiple inspirational sources for this project:
- The fact that software developers consume huge amounts of coffee all day (and night)
- The great GitHub project “hacker-scripts” (separate commands in the bot)
- Homer’s fantastic kitchen machines
- Using cognitive APIs for voice and face recognition to identify persons
- Bot should understand multiple languages, such as English, Norwegian and Swedish
- Using advanced machine learning, analytics and cognitive services to suggest a drink based on drinking habits, time of day, weather, mood and emotion
- Rate drink
- As usual
- Add to favourites
- Recommend New
- Automatically order new ingredients based on consumption and the number of forthcoming workdays
- Schedule planned and predictive maintenance
- Using HR, IFS and SM9 systems to look up scheduled overtime and evening/night/weekend shifts, making sure participating employees have their drinks covered
This weekend I will do some research on the Face API and Emotion API available in Microsoft Azure.
The past few months I have had a few reinstalls of my local development environment due to hardware failure and a new work laptop, and I have installed a local SQL Server instance at least 4-5 times. Luckily, I created a silent install for SQL Server 2016 earlier this year, but I didn’t create a script for pre-install tasks like creating SQL Server service accounts. Today, I took the time to create a script for setting these up quickly.
In my local development environment, I normally install the database engine, SSIS and the agent. These separate accounts are not strictly necessary locally, but it is good practice. Change the password “XXX” to your own, and run the script from the command line.
NET USER svc-sql-db "XXX" /ADD /passwordchg:no /fullname:"SQL Engine service account"
NET LOCALGROUP "Administrators" "svc-sql-db" /add
NET USER svc-sql-ag "XXX" /ADD /passwordchg:no /fullname:"SQL Agent service account"
NET LOCALGROUP "Administrators" "svc-sql-ag" /add
NET USER svc-sql-is "XXX" /ADD /passwordchg:no /fullname:"SQL Integration service account"
NET LOCALGROUP "Administrators" "svc-sql-is" /add
The only thing I need to improve is to fix the “Password never expires” check box. NET USER has an “/expires:never” switch, but it doesn’t seem to work. This could be written in PowerShell, but I found these commands very easy.
In most cases where you have developed an on-premises application (console, service), it generates log files – either for debugging purposes, or for validating the day-to-day execution and checking whether there are errors or warnings.
I tend to use text files, where I keep them for at least 14 days. The following command will delete all files older than 14 days from the c:\temp\logging folder. I have this at the top of my startup.cmd script.
rem delete files older than 14 days
forfiles /p "C:\temp\Logging" /m "*.*" /c "cmd /c del @path" /D -14
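If the startup script ever moves off cmd, the same cleanup is easy to express in Python. A minimal sketch of the equivalent logic (the folder path and 14-day retention mirror the forfiles command above):

```python
import os
import time

def purge_old_files(folder, max_age_days=14):
    """Delete files in `folder` whose modification time is older than
    `max_age_days` days - the same effect as forfiles /D -14."""
    cutoff = time.time() - max_age_days * 86400  # 86400 seconds per day
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)

# Example (same folder as the forfiles command):
# purge_old_files(r"C:\temp\Logging")
```

Unlike forfiles, this only touches the top level of the folder; add `os.walk` if subfolders should be purged too.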
During my CRM solution import debugging yesterday, I also wanted to see which users had been logged in to CRM the last few days. After some googling and trying, I came up with this SQL statement for listing all users and the last time they accessed CRM during the last 3 days.
NB! You have to change “OrgName” to get this working on your CRM database server. It is tested on CRM 2011 and CRM 2016.
SELECT O.FriendlyName, SU.FullName as Name, SUO.LastAccessTime
FROM SystemUserOrganizations SUO
LEFT JOIN SystemUserAuthentication SUA ON SUO.UserId = SUA.UserId
AND LEFT(AuthInfo, 1)='C'
LEFT JOIN Organization O ON SUO.OrganizationId=O.Id
INNER JOIN OrgName_MSCRM.dbo.SystemUser SU ON SUO.CrmUserId = SU.systemuserid
WHERE LastAccessTime IS NOT NULL
AND O.FriendlyName = 'OrgName'
AND datediff(DAY,Lastaccesstime, getutcdate()) < 3
ORDER BY lastaccesstime
If you have problems with Dynamics CRM On-premise, you can enable tracing with PowerShell. In my case, I needed debug information on why my solution import was failing when moving it to a new organization.
Open the PowerShell prompt and use the Add-PSSnapin command shown in 1). Then you can list the trace settings with the command shown in 2). Before you start the tracing, you should determine the timeline for when the error occurs and enable it as close to the error as possible. Run the commands in 3) to start the tracing. You should stop the tracing immediately after the error has occurred, using the commands in 4).
# 1) add the CRM PowerShell snap-in
Add-PSSnapin Microsoft.Crm.PowerShell
# 2) get CRM trace settings
Get-CrmSetting TraceSettings
# 3) enable tracing
$Setting = Get-CrmSetting TraceSettings
$Setting.Enabled = $True
Set-CrmSetting $Setting
# 4) disable tracing
$Setting = Get-CrmSetting TraceSettings
$Setting.Enabled = $False
Set-CrmSetting $Setting
When you have tons of log files, the trace tool CRM Trace Reader is nice to use for searching and filtering.
After we found out that the SOTI Enterprise Mobility Management system didn’t fully support Windows 10 Store apps in “Kiosk Mode”, we had to rewrite our latest app using WPF technology instead.
In this process, I wanted a kind of watermark in my TextBox controls. After some googling, I found a pretty nice library called “Extended WPF Toolkit” on CodePlex (and NuGet).
How to create a watermark input textbox
- Add “Extended.Wpf.Toolkit” via Nuget
- Add the XML namespace xmlns:xctk="http://schemas.xceed.com/wpf/xaml/toolkit" at the top of the XAML file
- Use “xctk:WatermarkTextBox” instead of the “TextBox” control, with the Watermark attribute set to the help text
<xctk:WatermarkTextBox x:Name="txtSearch" Watermark="type search pattern" />
Blogging hasn’t been my first priority for the last year due to different circumstances. As a result, I have decided to set a goal of at least one blog post every month. We have to learn something new every day in this industry to keep up with the changes, so there should always be something to write about 🙂
So, what have I been doing for the last year (2015)?
It started with a competence boom at the KiPi 2015 (“Know It, Prove It”) Challenge at Microsoft Virtual Academy in February where I followed and completed Cloud Development, Mobile Development and Hybrid Cloud learning paths. This inspired me to look at the different Azure exams, but unfortunately, busy projects made it impossible to complete these.
Between March and New Year, I mainly worked on upgrade and migration projects for customers, and the next few blog posts will summarize my experience from these projects and describe what kind of knowledge from them I have put into my toolbox.
The T-SQL script below will find all tables and columns referencing a particular primary key column (specified in the WHERE clause as [pk-table].[pk-column]). This script is pretty useful when you are working close to the database, manipulating data directly and so on.
SELECT
    FK_Table = FK.TABLE_NAME,
    FK_Column = CU.COLUMN_NAME,
    PK_Table = PK.TABLE_NAME,
    PK_Column = PT.COLUMN_NAME,
    Constraint_Name = C.CONSTRAINT_NAME
FROM INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS C
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS FK
    ON C.CONSTRAINT_NAME = FK.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS PK
    ON C.UNIQUE_CONSTRAINT_NAME = PK.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE CU
    ON C.CONSTRAINT_NAME = CU.CONSTRAINT_NAME
INNER JOIN (
    SELECT i1.TABLE_NAME, i2.COLUMN_NAME
    FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS i1
    INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE i2
        ON i1.CONSTRAINT_NAME = i2.CONSTRAINT_NAME
    WHERE i1.CONSTRAINT_TYPE = 'PRIMARY KEY'
) PT
    ON PT.TABLE_NAME = PK.TABLE_NAME
WHERE PK.TABLE_NAME = '[pk-table]'
AND PT.COLUMN_NAME = '[pk-column]'
Microsoft has released three Azure specialist exams over the last few months. I have been watching a lot of videos on Microsoft Virtual Academy, Channel 9 and Pluralsight over the last few years, and very intensively since December 2014.
70-532 Developing Microsoft Azure Solutions. This is a developer exam for people who want to be able to design, program, implement, automate, and monitor Microsoft Azure solutions.
70-533 Implementing Microsoft Azure Infrastructure Solutions. This is an exam for IT pros and solution architects who want to implement infrastructure solutions in Microsoft Azure. Candidates have experience implementing and monitoring cloud and hybrid solutions, as well as supporting application lifecycle management.
70-534 Architecting Microsoft Azure Solutions. This is an exam for solution architects, who should know the features and capabilities of Azure services to be able to identify tradeoffs and make decisions when designing public and hybrid cloud solutions. Candidates who take this exam are expected to be able to define the appropriate infrastructure and platform solutions to meet the required functional, operational, and deployment requirements throughout the solution lifecycle.
Our department had an interesting challenge last week. We have an old local on-premises “Team Foundation Server” (TFS) in our data room with two VMware hosts containing a number of virtual machines. Due to new company policies, we needed to move all domain-bound VMs to a new domain. As we feared, this caused a few problems due to old versions and incorrect editions of different software.
The first thing I did was to perform a “Get Latest” on all source code, in addition to taking a VM snapshot before we started the actual migration process; we needed a “Plan B” in case the migration failed. After a few days of migration failures, with loads of issues between SharePoint, Project Server, SQL Server and TFS, we decided to make a clean install and move the source code into new team projects. The problem now was that the old TFS server had about 60 team projects that needed to be created manually.
As a lazy programmer, I prefer a command-line utility to help me with this project creation. Luckily, 99% of all team projects didn’t use the SharePoint site, so for the moment I just have to migrate the source code to the version control of the new TFS server.
The command-line tool needed for the team project creation is called “TFS Power Tools”, and exists for the latest versions of Visual Studio – 2012 and 2013. Here is the command template I have used for our team projects.
tfpt createteamproject /teamproject:"[ProjectName]"
/collection:"http://[IP or Hostname]:8080/tfs/DefaultCollection"
/processtemplate:"Microsoft Visual Studio Scrum 2.2"
I found the list of team project directories by using “dir /b” from the DOS prompt, put this directory list into Excel, and generated one command for each project based on the command line above. All these commands were put into a command file (cmd) and run. When this completed, I added all files for the projects from the “Source Control Explorer” in Visual Studio. Some manual work is needed.
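The Excel step could also be scripted. A minimal Python sketch that builds one command line per project folder name – the server URL, project names, and tfpt switches here are placeholders mirroring the template above, not our real values:

```python
def build_commands(project_names, collection_url,
                   template="Microsoft Visual Studio Scrum 2.2"):
    """Build one 'tfpt createteamproject' command line per project folder name."""
    return [
        f'tfpt createteamproject /teamproject:"{name}" '
        f'/collection:"{collection_url}" /processtemplate:"{template}"'
        for name in project_names
    ]

# Hypothetical folder names (in practice, read them from the 'dir /b' output)
cmds = build_commands(["ProjectA", "ProjectB"],
                      "http://tfs:8080/tfs/DefaultCollection")
print("\n".join(cmds))
```

Writing the resulting lines to a .cmd file gives the same batch file the Excel approach produced.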
I have a private project going on where I’m in the data cleaning and import phase. This weekend’s problem has been a lot of date columns stored as strings in two different formats (yyyymmdd and ddmmyyyy) with different separator characters (‘-’ and ‘.’) all over the place, and I want these strings as the date data type. I’m a lazy programmer, so I made a String2Date function to use for my import stuff. Here is the first version of the function.
CREATE FUNCTION [dbo].[String2Date]
(
    @string varchar(10),
    @style smallint = 104
)
RETURNS date
AS
BEGIN
    IF (CHARINDEX('-', @string, 1) = 5 OR CHARINDEX('.', @string, 1) = 5)
        SET @style = 102 ;
    RETURN CONVERT(date, @string, @style) ;
END
This function is not perfect, but suits my purpose for the moment. It would probably give some exceptions now and then, but I don’t care 🙂
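The same guess-the-format trick translates directly to other languages. A minimal Python sketch of the equivalent logic (a separator at 1-based position 5 means the year comes first, exactly as the CHARINDEX check above):

```python
from datetime import datetime

def string2date(s):
    """Parse a date string in yyyy-mm-dd / yyyy.mm.dd or dd-mm-yyyy / dd.mm.yyyy form."""
    sep = "-" if "-" in s else "."
    year_first = s.find(sep) == 4  # 0-based index 4 == CHARINDEX position 5
    fmt = f"%Y{sep}%m{sep}%d" if year_first else f"%d{sep}%m{sep}%Y"
    return datetime.strptime(s, fmt).date()

print(string2date("2015-03-14"))  # 2015-03-14
print(string2date("14.03.2015"))  # 2015-03-14
```

Like the T-SQL version, it is not bulletproof (an unexpected format raises an exception), but it covers both input formats in the data.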
Today I found a pretty nice way to delete duplicate rows in a table on SQL Server. I had a table with 25,000 rows, where 7,500 rows contained one or more duplicates. I was not very eager to delete these duplicates manually, so I started googling for answers. I found many different approaches, but then I found my answer at stackoverflow.com. That thread helped me write a simple SQL statement after I added an [Id] column to uniquely identify each row. The trick is to use a Common Table Expression (CTE) in SQL Server together with the ROW_NUMBER() OVER() function to create a unique row number for all duplicates within the key expression (in my case the [Name] column), ordered by the [Id] column. I only want to keep the first row for each duplicate item, so I delete all rows with [rowno] greater than 1.
WITH cte_duplicates AS
(
    SELECT [Id], [Name],
        ROW_NUMBER() OVER (
            PARTITION BY [Name] ORDER BY [Id]
        ) AS [rowno]
    FROM [MyTable]   -- replace with your own table name
)
DELETE FROM cte_duplicates WHERE [rowno] > 1
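The keep-the-first-row-per-key logic is easy to check outside SQL too. A minimal Python sketch over (Id, Name) rows – the sample rows are made up for illustration:

```python
def dedupe_keep_first(rows):
    """Keep the lowest-Id row per Name - the rows the CTE would keep (rowno = 1)."""
    seen = set()
    kept = []
    for row in sorted(rows, key=lambda r: r["Id"]):  # ORDER BY [Id]
        if row["Name"] not in seen:                  # first row in its partition
            seen.add(row["Name"])
            kept.append(row)
    return kept

rows = [
    {"Id": 1, "Name": "Alice"},
    {"Id": 2, "Name": "Alice"},  # duplicate, would get rowno = 2
    {"Id": 3, "Name": "Bob"},
]
print(dedupe_keep_first(rows))  # [{'Id': 1, 'Name': 'Alice'}, {'Id': 3, 'Name': 'Bob'}]
```

Everything this function drops is exactly what the DELETE statement removes, which makes it handy for verifying the expected row count before running the delete for real.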
I have a goal for February to complete 3 challenges for “KiPi 2015”. It should be possible even though I’m going to sell the house in this period. The last challenge lasts 32.5 hours and covers many topics within hybrid cloud design and implementation. Designing private cloud data centers is not my current work focus, but it is very useful knowledge in my work as a cloud solution architect.
This module consists of the following parts:
I have just a few hours left of the first part of my KiPi 2015 Challenge for “Cloud Development”, and the next course path is “Mobile Development”. This path contains video sessions on C#, XAML, and universal app development, and on how Xamarin and Visual Studio can be used for cross-platform development. These courses are about 32 hours of watching in total.
As I wrote in the previous post, I’m participating in the KiPi 2015 Challenge at Microsoft Virtual Academy (MVA). I have decided to focus 100% on Microsoft Azure (the cloud paths) during this challenge – cloud development, mobile development, hybrid cloud.
The first path, “Cloud Development”, consists of four different courses:
Currently, I have just completed “Part 2”, and am looking forward to the next two parts.