Tuesday 13 December 2016

Moving classic resources (within subscription)

Well – according to the documentation, you can use the portal to move classic resources around your resource groups:

  • “Virtual machines (classic) must be moved with the cloud service.
  • Cloud service can only be moved when the move includes all its virtual machines.”

“To move classic resources to a new resource group within the same subscription, use the standard move operations through the portal, Azure PowerShell, Azure CLI, or REST API. You use the same operations as you use for moving Resource Manager resources.”

It even has a nice little picture showing it being done.

I had a group with two resources – a VM and its cloud service – but the portal would not allow them to be moved.

So this nice little script does it:

# prompt for credentials etc.
Login-AzureRmAccount
# select the correct subscription
Get-AzureRmSubscription -SubscriptionName "Stiona Software General" | Select-AzureRmSubscription
# get the ResourceID property of the two resources - luckily they have the same name in my case
$resources = Get-AzureRmResource -ResourceName fusionreports -ResourceGroupName fusionreports | Select -ExpandProperty ResourceId
# issue a move to the new group
Move-AzureRmResource -DestinationResourceGroupName fusionazure -ResourceId $resources

The $resources variable contains this:

/subscriptions/b9fee249-e903-4c4a-ade7-42982be9a20f/resourceGroups/fusionreports/providers/Microsoft.ClassicCompute/domainNames/fusionreports
/subscriptions/b9fee249-e903-4c4a-ade7-42982be9a20f/resourceGroups/fusionreports/providers/Microsoft.ClassicCompute/virtualMachines/fusionreports

After a prompt to confirm the move, the old resource group was empty and the new one contained the VM and cloud service.

Nice hint from Stack Overflow on the use of the ExpandProperty parameter of Select-Object for getting an array of values from an array of objects.
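
For anyone who hasn't hit it before, the difference looks like this (Get-Process is just a stand-in example):

Get-Process | Select-Object Id                  # objects each with an Id property
Get-Process | Select-Object -ExpandProperty Id  # a plain array of the Id values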

Man – working with Azure classic sure is tough.

Monday 5 December 2016

Finding dependencies

Looking to find service references and database connections in web.config files on a server or a set of servers.

I started with this guy's script - http://www.markrainey.me/2013/03/finding-urls-in-text-files.html - but he was just interested in unique URLs, so I hacked it around a bit and also added another regex for the connection strings.

This is not perfect by any means but is a start for anyone else (or me in the future).

########################################################### 
# AUTHOR  : Mark Rainey  
# Amended : Stuart McLean 2016-05-12
# DATE    : 2013-03-13  
# COMMENT : Reads in a list of servers and searches
# a folder on each one for web.config files. When it
# finds them it searches for anything starting with 3 or
# more letters (tcp or http) followed by ://
# now also outputs connection string data and 
# looks for </value in urls
###########################################################

#ERROR REPORTING ALL
Set-StrictMode -Version latest

$scriptpath = Split-Path -parent $MyInvocation.MyCommand.Definition
# A friend at work helped me get this to output to Excel
$outputFile = $scriptpath + "\urls.csv"
$connectionFile = $scriptpath + "\connections.csv"
# This is a file with a FQDN of each server on a new line
$serverList = $scriptpath + "\serverlist.txt"
# Load server list
$servers = Get-Content $serverList


$URLS = New-Object System.Collections.ArrayList
$ConnectionStrings = New-Object System.Collections.ArrayList

$credential = Get-Credential

# Find the strings and save them to the arrays
Function getStringMatch
{
 Try {
  # Loop through all servers
  Foreach ($server In $servers) {
   # Map a UNC path to this server's D: drive
   $drive = "\\" + $server + "\d$"
   New-PSDrive -Root $drive -PSProvider FileSystem -Name z -Credential $credential
   # The path can be anywhere on your servers you want to search
   $path = "z:\AppWebSites"
   # Get the list of web.config files
   $files = Get-ChildItem $path web.config -Recurse | Where-Object { !($_.PSIsContainer) }
   # Search every config file found under the path
   Foreach ($file In $files)
   {
    $fullFileName = $file.FullName
    # find any urls - three or more letters, then ://, up to a closing < (e.g. </value>)
    $regex = '([a-zA-Z]{3,})(:\/\/.*)(?=<)'
    # Find all matches in the current file and add each one to the array
    Select-String -Path $fullFileName -Pattern $regex -AllMatches | % { $_.Matches } | % {
     $found = New-Object psobject -Property @{
      server = $server;
      urlRef = $_.Value;
      file   = $file.FullName
     }
     # [void] stops the ArrayList index leaking into the output
     [void]$URLS.Add($found)
    }

    # pull each connectionString attribute apart into source/catalog/user
    $connectionStringRegex = '(?<=connectionString=")(.*)(?=")'
    Select-String -Path $fullFileName -Pattern $connectionStringRegex -AllMatches | % { $_.Matches } | % {
     try {
      $connectionString = $_.Value

      $source  = ($connectionString | Select-String -Pattern '(?<=data source=)(.*?)(?=;)').Matches[0].Value
      $catalog = ($connectionString | Select-String -Pattern '(?<=Initial Catalog=)(.*?)(?=;)').Matches[0].Value
      $sqlUser = ($connectionString | Select-String -Pattern '(?<=User ID=|uid=)(.*?)(?=;)').Matches[0].Value
      $found = New-Object psobject -Property @{
       server  = $server;
       source  = $source;
       catalog = $catalog;
       sqluser = $sqlUser;
       file    = $file.FullName
      }

      [void]$ConnectionStrings.Add($found)
     }
     Catch {
      Write-Host "could not parse connection string $connectionString : $_"
     }
    }
   }

   Remove-PSDrive z

  }

 }
 Catch {
  Write-Host "Something failed $_"
 }
 Finally {
  "Finished"
 }

}
getStringMatch
$URLS.ToArray() | Export-Csv -Path $outputFile -NoTypeInformation
$ConnectionStrings.ToArray() | Export-Csv -Path $connectionFile -NoTypeInformation
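
To sanity-check the URL regex outside the script, you can feed it a made-up config line - the sample below is invented rather than taken from a real web.config:

'<value>http://services.example.com/MyService.svc</value>' |
    Select-String -Pattern '([a-zA-Z]{3,})(:\/\/.*)(?=<)' -AllMatches |
    % { $_.Matches } | % { $_.Value }
# outputs: http://services.example.com/MyService.svc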

Wednesday 30 November 2016

Powershell file compare.

Simple script for comparing files in two directories.

param (
    [Parameter(Mandatory=$true)][string]$originalDir,
    [Parameter(Mandatory=$true)][string]$newDir
)

$fso = Get-ChildItem -Recurse -Path $originalDir
foreach($file in $fso.Name) {
    Write-Output "Comparing $originalDir\$file to $newDir\$file"
    Compare-Object -ReferenceObject $(Get-Content $originalDir\$file) -DifferenceObject $(Get-Content $newDir\$file) -IncludeEqual
}

Will compare, line by line, the files in originalDir with the same-named files in newDir (usage sketch after the caveats below).

  • Probably breaks if the file isn’t in newDir
  • Doesn’t check for extra files in newDir.
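
Invoked something like this, assuming the script is saved as compare-dirs.ps1 (the paths are made up):

.\compare-dirs.ps1 -originalDir 'C:\deploy\v1' -newDir 'C:\deploy\v2'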

Tuesday 22 November 2016

Connection IP’s to my database

Thanks to @j_mcmullan

Apparently there's a lot more info in these tables:

select distinct
	  c.client_net_address
	  ,s.database_id
	  ,db_name(s.database_id)
from sys.dm_exec_connections c
    inner join sys.dm_exec_sessions s on c.session_id = s.session_id
where s.database_id = db_id('ASPState')
GO
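
A variant I find handy - the same two DMVs, but counting connections per client address rather than listing them:

select c.client_net_address
      ,count(*) as connections
from sys.dm_exec_connections c
    inner join sys.dm_exec_sessions s on c.session_id = s.session_id
where s.database_id = db_id('ASPState')
group by c.client_net_address
order by connections desc
GO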

(I believe it may be ironic that I was looking at sessions to a session database.)

Friday 18 November 2016

Sprint 4 – put it live.

Not quite as easy as deploying the SQL, as connecting locally I've been using my own credentials.  For an Azure connection I need to:

  1. Create a database role and grant exec on the stored procs to that role.
  2. Create a new database login
  3. Put the database login in the role.

Why don’t I just use the “sa” password?

Well – least privilege – a good security practice.
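
Roughly this, as a minimal T-SQL sketch - the role, login and user names here are all invented for illustration (and on Azure SQL the login is created in master while the user lives in the application database):

-- role that can only execute the stored procs
CREATE ROLE BrevetApp;
GRANT EXECUTE ON SCHEMA::dbo TO BrevetApp;

-- login for the application connection (run in master on Azure SQL; placeholder password)
CREATE LOGIN BrevetAppLogin WITH PASSWORD = '<strong password here>';

-- user for that login in the application database, added to the role
CREATE USER BrevetAppUser FOR LOGIN BrevetAppLogin;
ALTER ROLE BrevetApp ADD MEMBER BrevetAppUser;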

Sprints 2 and 3 retrospective – an hour really isn’t long.

So – sprint 2 took about an hour and a half of coding – so a lot closer than the first.

I then noticed that I still had some nasty inline SQL in my PowerShell – so I cracked out another hour and got another proc and the changes in.

Main lesson so far – an hour is a really short time – keep pebbleising the tasks.

Outcome

Ok – so the work got done and hopefully I’ll have a list of riders on production soon.

I’ve thought hard about it and it’s definitely a useful task towards the end goal, as the first “minimum” viable product is for riders to sign up.  I’m going to allow Google authentication/OAuth, which will strongly (enough for this) validate the user’s email, which I can then use to look up their details in the data I have.

 

 

A good start.

 

Some feedback on the tasks here – really putting this code here for anyone to look at until I work out if/what to open source.  Unless you particularly want to critique my SQL or look at how I use PowerShell to call stored procs, there's not much excitement.

 

“Work out how to deploy database project to local database and how to connect to it. “

This was pretty easy – just set the PowerShell script project to depend on the database project and then it deploys all your changes as it builds – pretty nifty stuff.

 

“Create stored procs for crud operations required by script – include find rider.”

CREATE PROCEDURE [dbo].[usp_RiderInsert]
    @vcrFirstName    NVARCHAR (100),
    @vcrSurname      NVARCHAR (100),
    @vcrAddressLine1 NVARCHAR (100),
    @vcrAddressLine2 NVARCHAR (100),
    @vcrTown         VARCHAR (100),
    @vcrCountryCode  VARCHAR (2),
    @vcrCountry      VARCHAR (100),
    @vcrPhoneNumber  VARCHAR (20),
    @vcrClub         VARCHAR (50),
    @vcrClubNumber   VARCHAR (20),
    @vcrECName       VARCHAR (100),
    @vcrECTelephone  VARCHAR (100)
AS
SET XACT_ABORT ON

BEGIN TRANSACTION
    INSERT INTO tblRider
    VALUES(
        @vcrFirstName,
        @vcrSurname,
        @vcrAddressLine1,
        @vcrAddressLine2,
        @vcrTown,
        @vcrCountryCode,
        @vcrCountry,
        @vcrPhoneNumber,
        @vcrClub,
        @vcrClubNumber,
        @vcrECName,
        @vcrECTelephone)

    -- return the new identity as an int
    SELECT CAST(SCOPE_IDENTITY() AS INT)
COMMIT TRANSACTION



CREATE PROCEDURE [dbo].[usp_EventParticipantInsert]
    @intRiderId INT,
    @intEventId INT,
    @decPaidSTG MONEY,
    @decPaidEUR MONEY
AS
SET XACT_ABORT ON
BEGIN TRANSACTION
    INSERT INTO tblEventParticipant
    VALUES(
        @intRiderId,
        @intEventId,
        @decPaidSTG,
        @decPaidEUR)

    -- return the new identity as an int
    SELECT CAST(SCOPE_IDENTITY() AS INT)
COMMIT

CREATE PROCEDURE [dbo].[usp_LicenceGetByRiderAndYear]
	@intRiderId int,
	@intYear int
AS
	SELECT * FROM dbo.tblLicense 
	WHERE
	intRiderId = @intRiderId
	AND intYear = @intYear
CREATE PROCEDURE [dbo].[usp_LicenseInsert]
    @intYear          INT,
    @intRiderId       INT,
    @vcrLicenceNumber NVARCHAR(20)

AS
SET XACT_ABORT ON
BEGIN TRANSACTION
INSERT INTO tblLicense 
VALUES(@intYear , @intRiderId, @vcrLicenceNumber)
COMMIT
CREATE PROCEDURE [dbo].[usp_RiderSearchByName]
	@FirstName nvarchar(100) ,
	@SurName nvarchar(100)
AS
	SELECT * FROM tblRider r
	WHERE (r.vcrFirstName like @FirstName + '%' OR @FirstName IS NULL)
	AND (r.vcrSurname like @SurName + '%' OR @SurName IS NULL)
	

 

 

    “Enhance script / tidy up for stored proc use and other comments in script below.”

param (
	[Parameter(Mandatory=$true)][string]$fileName,
	[Parameter(Mandatory=$true)][string]$connString,
	[Parameter(Mandatory=$true)][int]$intEventId,
	[Parameter(Mandatory=$true)][int]$intYear
)

# SQL wants [System.DBNull]::Value rather than PowerShell's $null
function ToDBNull($var) {
	if($null -eq $var) {return [System.DBNull]::Value}
	return $var;
}

#
# Script.ps1
#
function parseExcel([String] $fileName){
	$excel = New-Object -ComObject Excel.Application
	$wb = $excel.Workbooks.Open($fileName)
	$sh = $wb.Sheets.Item(1)
	$startrow = 2

	# Connection and query info
	# probably best to pass the database name etc. through as parameters too
	$conn = New-Object System.Data.SqlClient.SqlConnection $connString
	# TODO Should the open be in the try/catch?
	$conn.Open()
try{

	# Main loop: parse rows until the first blank cell in column 1
	for($i = $startrow;
		($null -ne $sh.Cells.Item($i,1).Value2) -and ($sh.Cells.Item($i,1).Value2.Trim() -ne "");
		$i++) {

		$Name    = $sh.Cells.Item($i,1).Value2
		$Surname = $sh.Cells.Item($i,2).Value2
		[String]$Address = $sh.Cells.Item($i,3).Value2
		$Tel     = $sh.Cells.Item($i,4).Value2
		$email   = $sh.Cells.Item($i,5).Value2
		$Club    = $sh.Cells.Item($i,6).Value2
		$CI      = $sh.Cells.Item($i,7).Value2
		$ClubNo  = $sh.Cells.Item($i,8).Value2  # Value2 is a property, not a method
		$EC      = $sh.Cells.Item($i,9).Value2
		$ECTel   = $sh.Cells.Item($i,10).Value2
		$PaidE   = $sh.Cells.Item($i,11).Value2
		$PaidS   = $sh.Cells.Item($i,12).Value2
		# TODO - do we need to close the command - check documentation
		# to see if it implements IDisposable - and is there a using equivalent in PowerShell
		[System.Data.SqlClient.SqlCommand]$Command = New-Object System.Data.SqlClient.SqlCommand
		$Command.Connection = $conn
		$Command.CommandType = [System.Data.CommandType]::StoredProcedure
		# lookup the rider first - or should this happen within the proc?
		$Command.CommandText = "[dbo].[usp_RiderSearchByName]"
		$Command.Parameters.Add("@FirstName",[System.Data.SqlDbType]::NVarChar).Value = $Name
		$Command.Parameters.Add("@SurName",[System.Data.SqlDbType]::NVarChar).Value = $Surname
		[int]$ID = 0
		[System.Data.SqlClient.SqlDataReader]$reader = $Command.ExecuteReader()
		try{
			if($reader.Read()){
				$ID = $reader["intRiderID"]
			}
		}
		finally{
			$reader.Close()
		}
		# if we didn't find a rider then create one
		if($ID -eq 0) {
			[System.Data.SqlClient.SqlCommand]$Command = New-Object System.Data.SqlClient.SqlCommand
			$Command.Connection = $conn
			$Command.CommandType = [System.Data.CommandType]::StoredProcedure
			$Command.CommandText = "[dbo].[usp_RiderInsert]"
			$Command.Parameters.Add("@vcrFirstName",[System.Data.SqlDbType]::NVarChar).Value = $Name
			$Command.Parameters.Add("@vcrSurname",[System.Data.SqlDbType]::NVarChar).Value = $Surname
			$Command.Parameters.Add("@vcrAddressLine1",[System.Data.SqlDbType]::NVarChar).Value = ToDBNull($Address.Split(',')[0])
			$Command.Parameters.Add("@vcrAddressLine2",[System.Data.SqlDbType]::NVarChar).Value = ToDBNull($Address.Split(',')[1])
			$Command.Parameters.Add("@vcrTown",[System.Data.SqlDbType]::NVarChar).Value = ToDBNull($Address.Split(',')[2])
			$Command.Parameters.Add("@vcrCountryCode",[System.Data.SqlDbType]::NVarChar).Value = ToDBNull($null)
			$Command.Parameters.Add("@vcrCountry",[System.Data.SqlDbType]::NVarChar).Value = ToDBNull($null)
			$Command.Parameters.Add("@vcrPhoneNumber",[System.Data.SqlDbType]::NVarChar).Value = ToDBNull($Tel)
			$Command.Parameters.Add("@vcrClub",[System.Data.SqlDbType]::NVarChar).Value = ToDBNull($Club)
			$Command.Parameters.Add("@vcrClubNumber",[System.Data.SqlDbType]::NVarChar).Value = ToDBNull($ClubNo)
			$Command.Parameters.Add("@vcrECName",[System.Data.SqlDbType]::NVarChar).Value = ToDBNull($EC)
			$Command.Parameters.Add("@vcrECTelephone",[System.Data.SqlDbType]::NVarChar).Value = ToDBNull($ECTel)

			[int]$ID = $Command.ExecuteScalar()
		}
	
		# record participation in this event
		[System.Data.SqlClient.SqlCommand]$Command = New-Object System.Data.SqlClient.SqlCommand
		$Command.Connection = $conn
		$Command.CommandType = [System.Data.CommandType]::StoredProcedure
		$Command.CommandText = "[dbo].[usp_EventParticipantInsert]"
		$Command.Parameters.Add("@intRiderId",[System.Data.SqlDbType]::Int).Value = $ID
		$Command.Parameters.Add("@intEventId",[System.Data.SqlDbType]::Int).Value = $intEventId
		# default missing payments to zero
		$paidStg = if ($null -ne $PaidS) { $PaidS } else { 0 }
		$paidEur = if ($null -ne $PaidE) { $PaidE } else { 0 }
		$Command.Parameters.Add("@decPaidSTG",[System.Data.SqlDbType]::Money).Value = $paidStg
		$Command.Parameters.Add("@decPaidEUR",[System.Data.SqlDbType]::Money).Value = $paidEur

		$Command.ExecuteScalar()
		# if we got a CI number, search for a licence for this year
		if($null -ne $CI -and $CI.Trim() -ne '') {
			[System.Data.SqlClient.SqlCommand]$Command = New-Object System.Data.SqlClient.SqlCommand
			$Command.Connection = $conn
			$Command.CommandType = [System.Data.CommandType]::StoredProcedure
			$Command.CommandText = "[dbo].[usp_LicenceGetByRiderAndYear]"
			$Command.Parameters.Add("@intRiderId",[System.Data.SqlDbType]::Int).Value = $ID
			$Command.Parameters.Add("@intYear",[System.Data.SqlDbType]::Int).Value = $intYear

			[int]$LID = 0
			[System.Data.SqlClient.SqlDataReader]$reader = $Command.ExecuteReader()
			try{
				if($reader.Read()){
					# any non-null column does here - we only need to know a row exists
					$LID = $reader["intRiderID"]
				}
			}
			finally{
				$reader.Close()
			}
			# no licence found - create one
			if($LID -eq 0){
				[System.Data.SqlClient.SqlCommand]$Command = New-Object System.Data.SqlClient.SqlCommand
				$Command.Connection = $conn
				$Command.CommandType = [System.Data.CommandType]::StoredProcedure
				$Command.CommandText = "[dbo].[usp_LicenseInsert]"
				$Command.Parameters.Add("@intRiderId",[System.Data.SqlDbType]::Int).Value = $ID
				$Command.Parameters.Add("@intYear",[System.Data.SqlDbType]::Int).Value = $intYear
				$Command.Parameters.Add("@vcrLicenceNumber",[System.Data.SqlDbType]::NVarChar).Value = $CI
				$Command.ExecuteScalar()
			}
		}

	}
}
Finally
{
	$conn.Close()
}
$wb.Close()
$excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel)
}

parseExcel($fileName)
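
For the record, the invocation looks something like this - the path and connection string here are placeholders:

.\Script.ps1 -fileName 'C:\temp\Entrants.xlsx' -connString 'Server=myserver.database.windows.net;Database=AudaxDB;uid=appuser;pwd=...' -intEventId 1 -intYear 2016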

Thursday 10 November 2016

Sprint 2

Goal

Refactor the import script so it is fit for purpose.

Tasks

  1. Work out how to deploy database project to local database and how to connect to it.
  2. Create stored procs for crud operations required by script – include find rider.
  3. Enhance script / tidy up for stored proc use and other comments in script below.

Outcome

I can parse all the sheets I have to load an initial dataset.  I have a set of re-usable stored procedures for the application.

 

The script to fix

#
# Script.ps1
# TODO - rename the bit above and document this method
function parseExcel([String] $fileName){
	$excel=new-object -com excel.application
	$wb=$excel.workbooks.open($fileName)
	$sh = $wb.Sheets.Item(1)
	# TODO de hard code teh endrow.
	$startrow = 2
	$endrow = 16#Connection and Query Info

# prpbably best to pass the database name etc. through as paramaters
$serverName="uj6arxo0jh.database.windows.net" 
$databaseName='AudaxDB' 
#TODO  And this really shouldn't be here!
$connString = "Server=$serverName;Database=$databaseName;" 
$conn = new-object System.Data.SqlClient.SqlConnection $connString 
	#TODO Should the open be in the try/catch?
$conn.Open()
try{

	# Main for loop for parsing sheet
	# TODO rather than hard coding rows do until no end.

	for($i = $startrow; $i -le $endrow; $i++) {
$Name = $sh.Cells.Item($i,1).Value2
$Surname = $sh.Cells.Item($i,2).Value2
[String]$Address = $sh.Cells.Item($i,3).Value2
$Tel = $sh.Cells.Item($i,4).Value2
$email = $sh.Cells.Item($i,5).Value2
$Club = $sh.Cells.Item($i,6).Value2
$CI = $sh.Cells.Item($i,7).Value2
$ClubNo = $sh.Cells.Item($i,8).Value2();
$EC = $sh.Cells.Item($i,9).Value2
$ECTel = $sh.Cells.Item($i,10).Value2
$PaidE = $sh.Cells.Item($i,11).Value2
$PaidS = $sh.Cells.Item($i,12).Value2 
		# TODO - do we need to close the command - check documentation 
		# to see if it implements IDisposable - and is there a using equivalent in PowerShell
$Command =  $Command = New-Object System.Data.SQLClient.SQLCommand

    $Command.Connection = $conn
		# TODO - convert to strongly typed stored procedure.
		# also - lookup first - do within proc?.
    $Command.CommandText = "INSERT INTO tblRider VALUES( '" + $Name  + "'" +
    ",'" + $Surname.Replace("'","''") + "'" + # notice the fancy replace here for the O' people
    ",'" + $Address.Split(',')[0] + "'" +
    ",'" + $Address.Split(',')[1] + "'" +
    ",'" + $Address.Split(',')[2] + "'" +
    ",NULL" + 
    ",NULL" +
    ",'" + $Tel + "'" +
    ",'" + $Club + "'" +
    ",'" + $ClubNo + "'" +
    ",'" + $EC + "'" +
    ",'" + $ECTEL + "'" +
        ")`n`r" + 
        "SELECT @@IDENTITY `n`r"
        
        
        $Command
        [int]$ID = $Command.ExecuteScalar()
        $ID
$Command =  $Command = New-Object System.Data.SQLClient.SQLCommand
		# TODO - remove not required
		#TODO create stronly typed proc.
    $Command.Connection = $conn
    $Command.CommandText = 
       "INSERT INTO tblEventParticipant VALUES( " +  $ID  +",1,'" + $PaidS + "','" + $PaidE +"')" 
    $Command
    $Command.ExecuteScalar()

		# TODO again - needs stored proc.
		# 
if($CI -ne $null -and $CI.Trim() -ne '') { 
$Command =  $Command = New-Object System.Data.SQLClient.SQLCommand

    $Command.Connection = $conn
    $Command.CommandText = 
       "INSERT INTO tblLicense  VALUES( 2016, " +  $ID  + ",'" + $CI +"')" 
    $Command
        $Command.ExecuteScalar()

    }

	}
}
Finally
{
$conn.Close()
}
$wb.Close()
$excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel)

	}

	parseExcel("C:\Users\stuartm\OneDrive\Documents\Audax\A300_2016\A30002016Entrants.xlsx")

Wednesday 9 November 2016

One hour sprinting - change in strategy – a sprint is a week (or shorter) but the code in an hour

This is a part time project and the thinking is taking a lot more time than the coding – which is good because I’ve a lot of time on the bike for thinking and not much time at the keyboard for coding.

Bit of a change in strategy – a sprint is going to take a week or less but will have just one hour of coding!  This should try and make me actually do some coding each week rather than continuously think.

Sprint 1 Retrospective 3 – a bit of direction, information security, plans and other musings

 

Not only did I massively overshoot my one hour development time, I also have doubts as to whether I was really adding value to the “product”.

OK – so I’ve uploaded a load of data – but I’m not sure I can actually use that data.  But, upon thinking about it, I might be able to implement user stories whereby I already have the rider data and just get them to confirm it.  Since they supplied the data for my events, I’m kind of assuming that they’ve allowed me to use it for my events, but there is obviously a massive legal bag of spanners to do with data sharing etc. that I’ll have to try and put something together on.

I guess the value added bit is that I’ve validated the schema.

Going forward the focus needs to be:


Epic – 4307 – “I want a rider to be able to enter all their details online when the event opens and to be able to prepare brevet cards for the event.”

QOTD - DEVOPS

“By seeing problems as they occur and swarming them until effective countermeasures are in place, we continually shorten and amplify our feedback loops, a core tenet of virtually all modern process improvement methodologies. This maximizes the opportunities for our organization to learn and improve.” - The DevOps handbook

Monday 7 November 2016

Sprint 1 Retrospective 2 – what did I achieve and why did I use PowerShell?

"I'm going to create a database - whoop whoop - to store:

Riders,

The Brevets,

Entrants to a brevet - i.e. the relationship between the Riders and the Brevets.

 

> All Done

 

Then I'm going to upload the data from the excel files I have.  This should give me a list of riders, existing entries and previous brevets.

I'm going to upload the data for one of the previous brevets.

This should validate my data model

 

> Done

 

I'm going to do all this in SQL Server management studio straight on the database

I'm going to use a database project to create the schema and any procedures etc,

> Done

I'm going to create my brevets for 2016 - because I have them.

 

> Not Done – but don’t think this should be there

 

I'm going to use ssms to put the data in and out.

 

!"

 

Reading back on this – I’m not sure at what point I decided to use PowerShell – not only did it cause a massive overrun (300%) but I’m pretty sure my comma separation of the address data has added technical debt – some of these are undoubtedly in the wrong columns now.

Sprint 1 Retrospective 1 – one hell of an overrun

Overview

I’ll cover all that happened in subsequent blog entries – but it took three hours to accomplish the one hour goal that I set in Sprint 1.
Even though I’d never used VS DB Projects before, I created the project and schema and got it deployed in about 25 minutes, which I thought was fair enough.  The DB project was pretty easy – as I expected it to be.  To be honest I just created the tables with the tool and then typed in the columns, constraints etc. in the SQL window!
It seems to create a local database too – I’ll have to work out where that is for future testing as I now have live data.
Then came the PowerShell task.
I decided fairly late on to parse the Excel rather than CSV as I thought it would get rid of any comma separation issues.  A couple of Google searches showed it was relatively easy.
It was – but using .Value instead of .Value2 (why?) cost me some time.  Then building SQL by concatenating strings was a stupid idea, and I still had to work out how to cope with special characters like the ‘ which is prevalent in O’Toole, O’Connor etc.

Lessons

  1. Really need to break down the user stories into tiny bits!  Not sure how to deliver useable features at the same time though.
  2. Do the right thing – I knew I should be creating stored procs
    1. they’d certainly add value down the line
    2. Strong typing is always the way to go
  3. Don’t do new stuff in a limited time – I’d never parsed Excel from PowerShell and was caught out by lack of experience.
  4. Check for extra features/gold plating.  I added in the Organiser table because it’s in my schema but had no data for it or defined columns.  I then had to work out how to insert a row into a table with just an identity column (one-liner below)!
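
For the record, the answer to that last one is DEFAULT VALUES - a one-liner, assuming a table where every column is generated (tblOrganiser is my guess at the name, following the schema's conventions):

INSERT INTO tblOrganiser DEFAULT VALUES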

The Script

Not much to publish – I’ll get the schema out – but here is the PowerShell that I used.  It’s probably full of holes!
#
# Script.ps1
#
function parseExcel([String] $fileName){
 $excel=new-object -com excel.application
 $wb=$excel.workbooks.open($fileName)
 $sh = $wb.Sheets.Item(1)
 $startrow = 2
 $endrow = 16#Connection and Query Info

$serverName="uj6arxo0jh.database.windows.net" 
$databaseName='AudaxDB' 
$connString = "Server=$serverName;Database=$databaseName;uid=stiona;pwd=2difficult4u2c#" 
$conn = new-object System.Data.SqlClient.SqlConnection $connString 
$conn.Open()
try{



 for($i = $startrow; $i -le $endrow; $i++) {
$Name = $sh.Cells.Item($i,1).Value2
$Surname = $sh.Cells.Item($i,2).Value2
[String]$Address = $sh.Cells.Item($i,3).Value2
$Tel = $sh.Cells.Item($i,4).Value2
$email = $sh.Cells.Item($i,5).Value2
$Club = $sh.Cells.Item($i,6).Value2
$CI = $sh.Cells.Item($i,7).Value2
$ClubNo = $sh.Cells.Item($i,8).Value2();
$EC = $sh.Cells.Item($i,9).Value2
$ECTel = $sh.Cells.Item($i,10).Value2
$PaidE = $sh.Cells.Item($i,11).Value2
$PaidS = $sh.Cells.Item($i,12).Value2 

$Command =  $Command = New-Object System.Data.SQLClient.SQLCommand

    $Command.Connection = $conn

    $Command.CommandText = "INSERT INTO tblRider VALUES( '" + $Name  + "'" +
    ",'" + $Surname.Replace("'","''") + "'" +
    ",'" + $Address.Split(',')[0] + "'" +
    ",'" + $Address.Split(',')[1] + "'" +
    ",'" + $Address.Split(',')[2] + "'" +
    ",NULL" + 
    ",NULL" +
    ",'" + $Tel + "'" +
    ",'" + $Club + "'" +
    ",'" + $ClubNo + "'" +
    ",'" + $EC + "'" +
    ",'" + $ECTEL + "'" +
        ")`n`r" + 
        "SELECT @@IDENTITY `n`r"
        
        
        $Command
        [int]$ID = $Command.ExecuteScalar()
        $ID
$Command =  $Command = New-Object System.Data.SQLClient.SQLCommand

    $Command.Connection = $conn
    $Command.CommandText = 
       "INSERT INTO tblEventParticipant VALUES( " +  $ID  +",1,'" + $PaidS + "','" + $PaidE +"')" 
    $Command
    $Command.ExecuteScalar()


if($CI -ne $null -and $CI.Trim() -ne '') { 
$Command =  $Command = New-Object System.Data.SQLClient.SQLCommand

    $Command.Connection = $conn
    $Command.CommandText = 
       "INSERT INTO tblLicense  VALUES( 2016, " +  $ID  + ",'" + $CI +"')" 
    $Command
        $Command.ExecuteScalar()

    }

 }
}
Finally
{
$conn.Close()
}
$wb.Close()
$excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel)

 }

 parseExcel("C:\Users\stuartm\OneDrive\Documents\Audax\A300_2016\A30002016Entrants.xlsx")

Saturday 5 November 2016

Sprint 1 - Sprint planning

Tasks

  1. Create database project.  I already have a VS solution with some work I did for audax before to publish/search the club numbers **
  2. Create schema and deploy to Azure
  3. Create PowerShell script that takes inputs (parameters) eventId, year (for license number) and filename, parses the file (CSV) and inserts the data.
  4. Create the route and event (manually using ssms)
  5. Call script above until it works!
** http://audax.azurewebsites.net/WebApp/ will look up an Audax Club Parisien (ACP) club number from name and country.  It's part of the  paperwork for ACP. 

Data

Headers

Name, Surname, Address, Tel, email, Club, CI license, Club No, EC Name, EC Tel, Paid EUR, Paid STG

Sample

Stuart MCLEAN 20 Ben Madigan Park South, Belfast   Audax Ireland 16LC0430 (M24) 01500    

What I'm not doing

I did think about handling the duplicates/matching riders etc. because I know I'll need this - but I've one hour!

Issues

Noticed there is some proprietary Stiona Software code in the current solution - will need to refactor if / when I open source.

Risks and mitigation

  • Never used DB projects before - not even a sample.  Could do a sample project but suspect this might mean my coding slips another week and I'll lose interest.
    Mitigation: re-read samples, have samples open, just do a script or use SSMS straight onto the database and reverse engineer later.
  • PowerShell - done this bit before but all my samples are on a client's computer and strictly speaking don't belong to me!
    Mitigation: open / find all the samples used for the last time I did this.  Let's face it - when it comes to script, Google is king.
  • CSV - there are ,'s in the address - need to look at a CSV sample with ,'s or go fixed-width or something before parsing.
    Mitigation: work this out before the hour!

Wednesday 2 November 2016

The EF Conversation with @shaydmusic

Pretty sure twitter will give you the complete story

Tuesday 1 November 2016

Because I like a bit of schema

Great sketch eh!

OK - a route can be used in an event many times - usually every year but maybe more often.  (The route itself may actually change but I don't think we're actually concerned with that in this model.)
Each event has a bunch of Participants who are made up of the riders.  A rider can participate in multiple events.
The rider's details are pretty much static except they get a new licence number each year.
Obviously they move and so on, but I don't see any particular need to track that against the events - just the current details should do.
Events have organisers - or maybe rides do - they can have more than one so I've just realised the model is wrong - need a join table (sketch below).
Did think that the organisers were against the ride - but they're not, as they sometimes change year to year as the route gets passed from person to person.
Riders and organisers are users of some sort - but the chances are I'll pick up this bit of schema from the standard ASP.NET authentication database - or whatever those clever people at Redmond use these days.
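
For what it's worth, a sketch of that join table - the names here are invented to follow the tbl/int prefix conventions above:

CREATE TABLE tblEventOrganiser
(
    intEventId     INT NOT NULL,
    intOrganiserId INT NOT NULL,
    CONSTRAINT PK_tblEventOrganiser PRIMARY KEY (intEventId, intOrganiserId)
)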

I can smell the code ...

Sprint One - Still sharpening the sword

Not a single line of code has been written and Abe Lincoln is fast approaching his time to get the axe out.

So, whilst cycling of course, I went over my previous brief:

As "Stuart McLean" I want to be able to get a csv list of entrants for a brevet.


"I'm going to create a database - whoop whoop - to store:
Riders,
The Brevets,
Entrants to a brevet - i.e. the relationship between the Riders and the Brevets.
Then I'm going to upload the data from the excel files I have.  This should give me a list of riders, existing entries and previous brevets.
This should validate my data model.
I'm going to do all this in SQL Server management studio straight on the database!"

A couple of worries - in one hour.
  1. I have duplicates in my spreadsheets - not sure I'll have time in the hour to sort out these duplicates.
  2. Should I be starting with a database project?
I guess a "database project" is one of my technical user stories - after all I said in my first blog on this that doing some coal face work is part of the aim.
But I've never used the DB projects in anger.  I know they've been around for ages but scripts have been around even longer.
In an hour! 

OK - so heres another breakdown - 

"I'm going to create a database - whoop whoop - to store:
Riders,
The Brevets,
Entrants to a brevet - i.e. the relationship between the Riders and the Brevets.
Then I'm going to upload the data from the excel files I have.  This should give me a list of riders, existing entries and previous brevets.
I'm going to upload the data for one of the previous brevets.
This should validate my data model.
I'm going to do all this in SQL Server management studio straight on the database
I'm going to use a database project to create the schema and any procedures etc,
I'm going to create my brevets for 2016 - because I have them.
I'm going to use ssms to put the data in and out.
!"

Big question here that I need to address - how the heck am I going to get testing (system/UAT) in within the hour?


Sunday 30 October 2016

Sprint One - As "Stuart McLean" I want to be able to get a csv list of entrants for a brevet.

The psych evals tell me I'm an implementer - which means I get cracking on things but, as many around me know, rarely finish.  You need a completer-finisher for that.
So, I thought a lot and put one thing on the backlog that I might be able to achieve in an hour.

Breaking down the user story

It looks pretty innocuous - right - but, believe it or not, there has been a fair amount of thought put into such a simple story.

Let's start with the actor.


"Stuart McLean" - why not a brevet organiser.


Well - there is a whole bunch of stuff that needs doing to secure the average online application - authentication, authorisation, deployment, hosting etc.  As "Stuart McLean" I can circumnavigate all of these and focus on the data!

Get a csv list of entrants

So as I said in my last blog - I already have a way of using Excel to do lots of the admin.  The big task is to get the data into a spreadsheet.

For a brevet

Well - I guess it's really for the next brevet!

What I have

Well - I guess I already have a list of previous people and brevets.
I also know what brevets are coming up.

Solution Overview

I'm going to create a database - whoop whoop - to store:
Riders,
The Brevets,
Entrants to a brevet - i.e. the relationship between the Riders and the Brevets.
Then I'm going to upload the data from the excel files I have.  This should give me a list of riders, existing entries and previous brevets.
This should validate my data model.
I'm going to do all this in SQL Server management studio straight on the database.

Can I Sprint in an hour?

ACP Brevet Cards

It seems the more I know, the less I get to do.  I'm currently a technical architect for 6 sprint teams - which is exciting and challenging but I'm getting less and less (read no) time at the coal face.
The pace of change in IT seems to be exponential and I like to keep my toes on the edge of the programming curve.
OK - so I'd love to get back into Unix, C++, Java and all that stuff - but it's been 15 years or so, so I'm going to stick with the Microsoft stack.

The Challenge

Deliver a user story in one hour.  That's one hour to code, test and release.  
"Give me six hours to chop down a tree and I will spend the first four sharpening the axe." 
Abraham Lincoln

I am going to allow - outside of the hour - time to study and plan.  Hopefully this will make the hour super productive.

The project

The project is simple - and combines programming with my other passion - cycling.  You can read about some of my cycling here.  Every year I organise a few "Brevets" - long distance cycle rides for Audax Ireland.  There is a bit of admin involved:

  1. Collect details of participants - currently they email or post a form with name, address etc.
  2. Collect money - I accept PayPal or cash on the day.  Not all organisers accept cash and some push the PayPal through the clubs.  Different events cost different amounts.  Payment can be made in € or £.
  3. Print "Brevet Cards" with the event details, riders details and control points.
  4. Sign on sheet for Cycling Ireland (CI) with list of riders to collect signature at start.
  5. One day licence form for riders who aren't members of cycling Ireland.
  6. On completion - fill in finisher details on a spreadsheet for Audax Club Parisien to homologate the rides and order medals where required.
  7. Submit accounts and pay the balance to Audax Ireland (AI).
  8. When homologation stickers and medals arrive, post with brevet cards to riders.
Currently I run all this using a spreadsheet to collect the rider info on.  I then have a word mail merge for the Brevet cards and cut and paste onto the other documents.
The biggest pain is actually getting the data onto the spreadsheet.  It is either keyed or cut and pasted depending on how the application arrives.  I can also sometimes cut and paste from an old spreadsheet - but the CI license numbers change every year and other details may change.

The tech

  • SQL Azure - assuming a relational database - might need other azure tech around the outside for large data or whatever.
  • C#.net for business object layer, processes etc.
  • MVC for serving HTML  to client
  • WEB API - for services
  • Angular
  • Visual Studio

Entity framework or ADO.NET + Strongly typed stored procedures.

It's a question that's been worrying me for about 8 years!
Currently I have projects working with ADO.NET and stored procs (most), EF and unit of work, and EF over stored procedures.  I'm still torn between development speed, security, maintenance and …

Strongly Typed Stored Procedures

  • Tick a big security box - giving least privilege access to the database and strong type checking - a string is a string, never a piece of SQL, thus adding a layer of defence against many forms of attack, SQL injection being the most obvious.  OK - I know EF will have vendor backing and is parameterised etc. etc. but believe me, when some external auditor comes in and asks what layers of defence you have against attack, the strongly typed stored procedure with least privilege (i.e. the connection can only execute stored procs) is king.
  •  Are fast
  • Do set based operations where they belong - in an engine that has been optimised over about twenty five years for performing set based operations!
  • Provide an encapsulation layer over your data model, allowing it to be changed and maintained independent(ish) of your code.
  • Allow your database to be normalised and your object model to be de-normalised, as it should be.
  • Allow your interactions with the database to be as they should i.e. full control over how much data is fetched when.
  • Are stable with a very long support life ahead and behind - ADO.NET.
  • Can lead to business logic being scattered across code and SQL depending on who wrote it!
  • Can result in stored procedure mountains.
  • Offer a great maintenance fix.

Entity Framework

  • Is quicker to develop - though I'd argue this might only be over the first few iterations of a project.
  • Supports more flexible query functionality for the application e.g. sorting, querying multiple parameters etc. without additional effort.
  • Can leave you fighting the framework.
  • Is not as stable
  • Is not as secure
  • Through web services - offers a consistent business layer for other applications e.g. reporting etc.
  • May lead to compromises in the data model (de-normalised) or the business object model (too normalised).
  • Requires a full code deploy.
I'm going to put this decision off a bit longer!

Project management and tools

OK - so it's agile, obviously.

Agile tool/work tracking

TBD - I have Visual Studio Online/Visual Studio Team System and this is the obvious choice, but I might look at a more "open" product - after all, I'm wearing shorts, right!
I guess initially it'll be this blog, as my Minimum Viable Product in an hour is going to be very minimum.

Source Code Repository

To open source or not to open source, that is the question - well - I really want to open source - not that I'm expecting a million IT pros to rock up and start working on this but, like I say, I'm wearing shorts.  One thing I need to be careful of is that ideally I'd use some of Stiona Software's utility and platform code, but this has IPR wrapped in it and I'm not wearing sandals.  Software is my living too.

Again - with my shorts on, Git - but frankly it's something else to learn at the moment.

So - probably CodePlex.  I have an account and it integrates with Visual Studio.

Tuesday 11 October 2016

Powershell delete all recycle bins

Get-ChildItem "D:\`$Recycle.bin\" -Recurse -Force | Remove-Item -Force -Recurse

Blatantly stolen from http://serverfault.com/questions/330776/clear-the-recycle-bin-for-all-users-in-windows-server-2008-r2

Monday 10 October 2016

Powershell Azure - when you have multiple "directories"/ tenants.

The documentation didn't really cover this too well, but when I tried to use PowerShell to move items between subscriptions I kept getting an invalid subscription ID.  I eventually deduced it was because I have two directories/tenants and was in the wrong one.

List subscriptions using

Get-AzureRmSubscription

then select one in the right tenant making sure you specify the TenantID too.

Select-AzureRmSubscription -SubscriptionId xxxxx -TenantId yyyyy

Basically it seems if you've multiple "Directories"/tenants you have to specify the tenant ID.

Once you've done that, moving subscriptions etc. works for that Tenant.
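
To see which tenant each subscription belongs to in one go, something like this should do it (the exact property names have moved around between AzureRM versions):

Get-AzureRmSubscription | Select-Object SubscriptionName, SubscriptionId, TenantId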

Wednesday 27 April 2016

Powershell - create copy in multiple directories

If found - creates a copy of ActiveDirectory.dll as ActiveDirectory.pre in each of the sub-directories of the current directory.

get-childitem -recurse -filter "ActiveDirectory.dll" | foreach-object {copy -path $_.FullName -Destination "$($_.Directory)\ActiveDirectory.pre" -Container}

Not sure if the container is needed.

Tuesday 22 March 2016

Powershell create directory structure

New-Item enables creation of a full directory structure in one go - none of that mkdir stuff.

New-Item -ItemType Directory -Force -Path \\web03b\D$\EAP-6.2.0\jboss-eap-6.2\standalone\log


JBOSS on windows–changing log directory (EAP-6.2.0)

1) Create directory structure D:\EAP-6.2.0\jboss-eap-6.2\standalone\log
2) Change registry entry HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Apache Software Foundation\ProcRun 2.0\JBossEAP 6.0\Parameters\Log\Path to structure D:\EAP-6.2.0\jboss-eap-6.2\standalone\log
3) Restart jboss eap service.
I found the combination of batch files, services, 32 bit WOW and java a bit disconcerting!

Friday 4 March 2016

DocumentDBServices now in codeplex.

So - I took the plunge and published the initial code that I wrote about here - Scalable Querying multiple Azure DocumentDB databases and collections - on CodePlex: https://documentdbservices.codeplex.com/.

Next steps are to create a simple web interface over the service.

Thursday 3 March 2016

Scalable Querying multiple Azure DocumentDB databases and collections - quick blog

Azure DocumentDB is highly scalable - it seems that every 10GB or so you need a new "Collection".  My guess is each Collection gets you a new VM, so you get scale!
So - if you want to run a query you need to make a request to every one of your collections (over HTTP) and then put all the answers together.
So - this service will do it for you
http://documentdbservices.azurewebsites.net/api/Document/?endpointUrl={YOUR ENDPOINT}&authorizationKey={YOUR KEY}&query={YOUR QUERY}

You must url encode the parameters.  I used this service - http://www.url-encode-decode.com/.
Actually - I only had to encode the authorizationKey - it contains a fair amount of special characters - everything else the browser was able to take care of.
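
If you're calling the service from PowerShell you can let .NET do the encoding for you - the endpoint, key and query below are placeholders:

$endpoint = [uri]::EscapeDataString('https://mydocdb.documents.azure.com:443/')
$key      = [uri]::EscapeDataString('YOUR-AUTHORIZATION-KEY==')
$query    = [uri]::EscapeDataString('SELECT * FROM c')
Invoke-RestMethod "http://documentdbservices.azurewebsites.net/api/Document/?endpointUrl=$endpoint&authorizationKey=$key&query=$query"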

I'll be publishing a proper web interface and maybe even putting the code in Git over the next couple of weeks, but for those who can't wait for the excitement - here's a brief look at how it works.

The crux of it

The class below does all the work.  
A couple of attributes store the URL and key for the connection, and I provide instance and static methods (the instance methods using the connection details).  This should allow a user of the class to cache instances/connections to reduce overhead.

The real parallelism happens around 

 foreach (Database item in databases)
            {
                IEnumerable<DocumentCollection> dcollections = client.CreateDocumentCollectionFeedReader(item.CollectionsLink);
                foreach (DocumentCollection dc in dcollections)
                {
                    QueryDocumentCollection(query, client, retStack, tasks, item, dc);
                }
            }
            Task.WaitAll(tasks.ToArray()); 
Which calls QueryDocumentCollection for each collection in all the databases and waits for all of the queries to complete.
QueryDocumentCollection calls ExecuteQuery, which creates the HTTP request, and then creates a "continuation" task to process the result.
The continuation task is added to the tasks that the method above is waiting on - the Task.WaitAll(tasks.ToArray()).

            Task<IQueryable<dynamic>> task = new Task<IQueryable<dynamic>>(() => ExecuteQuery(client,
                item,
                dc,
                query
                ));
            Task continuation = task.ContinueWith((prevTask) => AddCollection(retStack, prevTask));
            tasks.Add(continuation);
            task.Start();
So - each HTTP request is fired off asynchronously, allowing the query to run in parallel across all the collections.
The ConcurrentStack<Document> is used to put all the results back together in a thread-safe way.



=============== the full class =====================

    /// <summary>
    /// This class provides utilities to run queries across multiple databases
    /// and subscriptions.
    /// </summary>
    public class DistributedQueryUtils
    {
        public Uri EndpointUrl { get; set; }

        public String AuthorizationKey { get; set; }
        private DocumentClient GetDocumentClient()
        {
            var client = new DocumentClient(EndpointUrl, AuthorizationKey);
            return client;
        }

        static Dictionary<String, Microsoft.Azure.Documents.DocumentCollection> collections = new Dictionary<string, DocumentCollection>();

        public async Task<DocumentCollection> GetDocumentCollectionAsync(DocumentClient client, Database database, String collection)
        {
            DocumentCollection documentCollection = null;
            // get the week number
            // check to see if we've got it
            lock (collections)
            {
                if (collections.ContainsKey(collection))
                {
                    documentCollection = collections[collection];
                }
            }
            if (null == documentCollection)
            {
                documentCollection = client.CreateDocumentCollectionQuery("dbs/" + database.Id).Where(c => c.Id == collection).AsEnumerable().FirstOrDefault();
                // If the document collection does not exist, create a new collection
                if (documentCollection == null)
                {
                    documentCollection = await client.CreateDocumentCollectionAsync("dbs/" + database.Id,
                        new DocumentCollection
                        {
                            Id = collection
                        });

                }
                lock (collections) // lock the shared dictionary, not the local collection object
                {
                    if (collections.ContainsKey(collection))
                    {
                        collections[collection] = documentCollection;
                    }
                    else
                    {
                        collections.Add(collection, documentCollection);
                    }
                }
            }

            return documentCollection;

        }

        public IList<Document> QueryAllDatabase(String query)
        {
            IList<Document> docs = QueryAllDatabases(query, GetDocumentClient());

            return docs;
        }
        public static IList<Document>  QueryAllDatabases(String query,
            DocumentClient client)
        {
            IList<Document> docs = QueryAllDocumentCollections(null, query, client);
           
            return docs;
        }

        public IList<Document> QueryAllDocumentCollections(String databaseName,
           String query)
        {
            return QueryAllDocumentCollections(databaseName,
                query,
                GetDocumentClient());
        }

        /// <summary>
        /// Query all document collections in the database name passed in OR 
        /// null for all databases.
        /// </summary>
        /// <param name="databaseName"></param>
        /// <param name="query"></param>
        /// <param name="client"></param>
        /// <returns></returns>
        public static IList<Document> QueryAllDocumentCollections(String databaseName, 
            String query, 
            DocumentClient client)
        {

            // get the collection of documents to look at 
            IEnumerable<Database> databases;
            List<Document> ret = new List<Document>();
            // for thread safety
            ConcurrentStack<Document> retStack = new ConcurrentStack<Document>();
            // get the database matching the name or all
            if (null != databaseName)
            {
                databases = client.CreateDatabaseQuery().Where(db => db.Id == databaseName).AsEnumerable();
            }
            else
            {
                databases = client.CreateDatabaseQuery();
            }
            List<Task> tasks = new List<Task>();
            // create a query for each collection on each database
            foreach (Database item in databases)
            {

                IEnumerable<DocumentCollection> dcollections = client.CreateDocumentCollectionFeedReader(item.CollectionsLink);
                foreach (DocumentCollection dc in dcollections)
                {
                    QueryDocumentCollection(query, client, retStack, tasks, item, dc);

                }
            }
            Task.WaitAll(tasks.ToArray());
            ret.AddRange(retStack);
            return ret;
        }

        /// <summary>
        /// Query the document collection -
        /// fill the retStack with the Documents matching the query.
        /// The operation is not complete until the list of tasks has finished
        /// - you need to Task.WaitAll(tasks.ToArray());
        /// </summary>
        /// <param name="query"></param>
        /// <param name="client"></param>
        /// <param name="retStack"></param>
        /// <param name="tasks"></param>
        /// <param name="item"></param>
        /// <param name="dc"></param>
        private static void QueryDocumentCollection(String query,
            DocumentClient client,
            ConcurrentStack<Document> retStack,
            List<Task> tasks,
            Database item,
            DocumentCollection dc)
        {
            Task<IQueryable<dynamic>> task = new Task<IQueryable<dynamic>>(() => ExecuteQuery(client,
                item,
                dc,
                query
                ));
            Task continuation = task.ContinueWith((prevTask) => AddCollection(retStack, prevTask));
            tasks.Add(continuation);
            task.Start();
        }
        static IQueryable<dynamic> ExecuteQuery(DocumentClient client,
            Database db,
            DocumentCollection dc,
            String query)
        {
            return client.CreateDocumentQuery("dbs/" + db.Id + "/colls/" + dc.Id
                   , query);
        }

        static void AddCollection(ConcurrentStack<Document> retDocs,
            Task<IQueryable<dynamic>> docsToAdd)
        {
            if (!docsToAdd.IsFaulted)
            {
                foreach (Document item in docsToAdd.Result.AsEnumerable())
                {
                    retDocs.Push(item);

                }

            }
            else
            {
                throw docsToAdd.Exception;
            }
        }
    }

Web API code

     public HttpResponseMessage Get(String endpointUrl,
            String authorizationKey,
            String query)
        {
                DistributedQueryUtils dbQuery = new DistributedQueryUtils()
            {
                AuthorizationKey = authorizationKey,
                EndpointUrl = new Uri(endpointUrl)
            };
         
            IList<Document> res = dbQuery.QueryAllDatabase(query);
            StringBuilder builder = new StringBuilder();
         
                    foreach (Document d in res)
                    {
                        /*
                        d.SaveTo(ms,
                            SerializationFormattingPolicy.Indented);
                        */
                        builder.Append(d);
                    }

            var response = this.Request.CreateResponse(HttpStatusCode.OK);
            response.Content = new StringContent(builder.ToString(), Encoding.UTF8, "application/json");
     
            return response;
       
        }
Nothing too rocket science here - ideally I would have liked to write the Documents directly to the HTTP response stream rather than making a big string, but I gave up after a while.
Also - I will probably cache the DistributedQueryUtils instances to reduce overhead, and add in overrides to search just a specific database or container.

Wednesday 24 February 2016

TFS build server - don't run tests

Some tests will not run on your build servers because they haven't got connectivity etc.
You can prevent them running using a "TestCategory".
On the TFS build process tab
-> Automated tests
-> Test Source
-> Test Case Filter set to e.g. "TestCategory!=WebService"
Annotate your tests with -

 [TestMethod]
 [TestCategory("WebService")]
 public void AckExistingMessage_AckSucceeds_MessageRemovedFromQueue()
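
If you're running vstest directly, the same filter can be passed on the command line (MyTests.dll is a placeholder):

vstest.console.exe MyTests.dll /TestCaseFilter:"TestCategory!=WebService"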

Monday 22 February 2016

SQL Server log file size

dbcc sqlperf(logspace)

gives you the size and % used.
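
Related - sys.database_files gives a per-file breakdown for the current database (size is in 8KB pages, hence the divide by 128):

select name, type_desc, size/128.0 as size_mb
from sys.database_files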

Installing Azure scriptlets

I tried the instructions at https://azure.microsoft.com/en-us/documentation/articles/powershell-install-configure/ using Installing Azure PowerShell from the Gallery.
The second command, Install-AzureRM, failed:

"The 'Install-AzureRM' command was found in the module 'AzureRM', but the module could not be loaded."

So I ran

Import-Module -Name AzureRM

and then Install-AzureRM again and it worked.

(Actually I had to change my policy to allow scripts to run first - Set-ExecutionPolicy RemoteSigned.)

Stuart:1 – MS PowerShell: 0