Managing VSTS/TFS Release Definition Variables from PowerShell

A couple of days ago I was trying to provision my Release Definition variables in my VSTS/TFS projects via PowerShell. As this turned out not to be a trivial web request, and as some of the calls I discovered are not yet documented, I decided to share my findings with you.

In the following lines I’ll show you a couple of cmdlets that will allow you to manipulate all of the variables in your Release Definition: those at the definition level, environment-specific ones, and also variable groups.

For adding Release Definition variables, environment-level variables, and linking variable groups, I wrote the following cmdlet:

function Add-EnvironmentVariable()
{
    [CmdletBinding()]
    param
    (
        [string][parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)][Alias("name")]$VariableName,
        [string][parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)][Alias("value")]$VariableValue,
        [string][parameter(ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)][Alias("env")]$EnvironmentName,
        [bool][parameter(ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]$Secret,
        [int[]]$VariableGroups,
        [string][parameter(Mandatory = $true)]$ProjectUrl,
        [int][parameter(Mandatory = $true)]$DefinitionId,
        [string]$Comment,
        [switch]$Reset
    )
    BEGIN
    {
        Write-Verbose "Entering script $($MyInvocation.MyCommand.Name)"
        Write-Verbose "Parameter Values"
        $PSBoundParameters.Keys | ForEach-Object { if ($Secret -and $_ -eq "VariableValue") { Write-Verbose "VariableValue = *******" } else { Write-Verbose "$_ = '$($PSBoundParameters[$_])'" }}

        $ProjectUrl = $ProjectUrl.TrimEnd("/")

        $url = "$($ProjectUrl)/_apis/release/definitions/$($DefinitionId)?expand=Environments?api-version=3.0-preview"
        $definition = Invoke-RestMethod $url -UseDefaultCredentials

        if ($Reset)
        {
            foreach($environment in $definition.environments)
            {
                foreach($prop in $environment.variables.PSObject.Properties.Where{$_.MemberType -eq "NoteProperty"})
                {
                    $environment.variables.PSObject.Properties.Remove($prop.Name)
                }
            }

            foreach($prop in $definition.variables.PSObject.Properties.Where{$_.MemberType -eq "NoteProperty"})
            {
                $definition.variables.PSObject.Properties.Remove($prop.Name)
            }

            $definition.variableGroups = @()
        }
    }
    PROCESS
    {
        $value = @{value=$VariableValue}
    
        if ($Secret)
        {
            $value.Add("isSecret", $true)
        }

        if ($EnvironmentName)
        {
            $environment = $definition.environments.Where{$_.name -like $EnvironmentName}

            if($environment)
            {
                $environment.variables | Add-Member -Name $VariableName -MemberType NoteProperty -Value $value -Force
            }
            else
            {
                Write-Warning "Environment '$($environment.name)' not found in the given release"
            }
        }
        else
        {
            $definition.variables | Add-Member -Name $VariableName -MemberType NoteProperty -Value $value -Force
        }
    }
    END
    {
        $definition.source = "restApi"

        if ($Comment)
        {
            $definition | Add-Member -Name "comment" -MemberType NoteProperty -Value $Comment
        }
        
        if ($VariableGroups)
        {
            foreach($variable in $VariableGroups)
            {
                if ($definition.variableGroups -notcontains $variable)
                {
                    $definition.variableGroups += $variable
                }
            }
        }

        $body = $definition | ConvertTo-Json -Depth 10 -Compress

        Invoke-RestMethod "$($ProjectUrl)/_apis/release/definitions?api-version=3.0-preview" -Method Put -Body $body -ContentType 'application/json' -UseDefaultCredentials | Out-Null
    }
}

Don’t get scared by the number of parameters or the apparent complexity of the cmdlet. I’ll quickly explain the parameters, the usage, and the expected result.

Let’s start with some whys. As you can see, in the BEGIN block of my cmdlet (which is triggered once per pipeline invocation) I retrieve the given release definition, in the PROCESS block I add the desired variables (ideally coming from the pipeline), and in the END block I persist all of the changes.

If you are unfamiliar with the Windows PowerShell cmdlet lifecycle, please consult the article Windows PowerShell: The Advanced Function Lifecycle.

This is intentional, as I want a single call to the API for all of the added variables. This way, the history of the release definition will show a single entry for all of the variables we added, no matter how many there are. Otherwise, we would persist the changes for each variable separately and end up with a messy history containing one entry per added variable. This obviously matters only if you are adding multiple variables in one go.
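If you would like to see the block semantics in isolation, here is a minimal, self-contained sketch (Measure-Pipeline is just a throwaway helper of mine, unrelated to the REST calls above) showing that BEGIN and END run once per pipeline invocation while PROCESS runs once per piped item:

function Measure-Pipeline
{
    [CmdletBinding()]
    param([Parameter(ValueFromPipeline = $true)]$Item)
    BEGIN   { $count = 0; Write-Verbose "BEGIN runs once" }
    PROCESS { $count++ }                 # runs once per piped item
    END     { Write-Verbose "END runs once"; $count }
}

1..5 | Measure-Pipeline -Verbose         # PROCESS fires five times; the function returns 5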

The following is a simple invocation that adds a single variable to one of the environments defined in a release definition:

$Project = "https://my.tfs.local:8080/tfs/DefaultCollection/MyProject"
$DefinitionId = 23

Add-EnvironmentVariable -VariableName "Mario2" -VariableValue "1" -EnvironmentName "DEV" -Secret $true -ProjectUrl $Project -DefinitionId $DefinitionId -VariableGroups 25 -Comment "Added by PSScript"

The above command will add a secret variable named Mario2 with the value 1 to the DEV environment of the definition with ID 23. It will also link the variable group with ID 25.

The following would be the result:

In case you would like to add multiple variables in one go, create an array of PSCustomObject with the following properties:

$variables = @()
$variables += [PSCustomObject]@{ name="var1"; value="123"; secret=$false; env="" }
$variables += [PSCustomObject]@{ name="var2"; value="sdfd"; secret=$false; env="" }
$variables += [PSCustomObject]@{ name="var3"; value="5678"; secret=$false; env="DEV" }
$variables += [PSCustomObject]@{ name="var4"; value="ghjk"; secret=$true; env="DEV" }

$variables | Add-EnvironmentVariable -ProjectUrl $Project -DefinitionId $DefinitionId

This will add two variables to the environment called DEV in your Release Definition and two more variables at the Release Definition level. As you can guess, if we omit the environment name, the variables are added at the Release Definition level. The last variable, var4, is also marked as secret, meaning that once added its value will not be visible to the user. Also in this case we will have only a single entry in the change history, as a single call to the REST API is made.

Other options you can specify are:

  • Reset – When this switch is set, all variables currently present on the Release Definition are cleared before the new ones are added; the net effect is that any variable not passed in the invocation is removed (see the example after this list).
  • Comment – In case you want a custom message to be shown in the change history for this update, you can specify it here.
  • VariableGroups – An integer array with the IDs of the variable groups you wish to link to the Release Definition.
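For example, to re-provision the definition so that it ends up containing exactly the variables you pipe in, you could combine the pipeline input shown above with -Reset and a custom comment; a short sketch (the variable group IDs below are made up):

$variables | Add-EnvironmentVariable -ProjectUrl $Project -DefinitionId $DefinitionId `
    -Reset -Comment "Re-provisioned by PSScript" -VariableGroups 25, 26   # hypothetical group IDs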

In case you are using variable groups, you can create and populate them via the following cmdlet:

function Add-VariableGroupVariable()
{
    [CmdletBinding()]
    param
    (
        [string][parameter(Mandatory = $true)]$ProjectUrl,
        [string][parameter(Mandatory = $true)]$VariableGroupName,
        [string]$VariableGroupDescription,
        [string][parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)][Alias("name")]$VariableName,
        [string][parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)][Alias("value")]$VariableValue,
        [bool][parameter(ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]$Secret,
        [switch]$Reset,
        [switch]$Force
    )
    BEGIN
    {
        Write-Verbose "Entering script $($MyInvocation.MyCommand.Name)"
        Write-Verbose "Parameter Values"

        $PSBoundParameters.Keys | ForEach-Object { Write-Verbose "$_ = '$($PSBoundParameters[$_])'" }
        $method = "Post"
        $variableGroup = Get-VariableGroup $ProjectUrl $VariableGroupName

        if($variableGroup)
        {
            Write-Verbose "Variable group $VariableGroupName exists."

            if ($Reset)
            {
                Write-Verbose "Reset = $Reset : remove all variables."
                foreach($prop in $variableGroup.variables.PSObject.Properties.Where{$_.MemberType -eq "NoteProperty"})
                {
                    $variableGroup.variables.PSObject.Properties.Remove($prop.Name)
                }
            }

            $id = $variableGroup.id
            $restApi = "$($ProjectUrl)/_apis/distributedtask/variablegroups/$id"
            $method = "Put"
        }
        else
        {
            Write-Verbose "Variable group $VariableGroupName not found."
            if ($Force)
            {
                Write-Verbose "Create variable group $VariableGroupName."
                $variableGroup = @{name=$VariableGroupName;description=$VariableGroupDescription;variables=New-Object PSObject;}
                $restApi = "$($ProjectUrl)/_apis/distributedtask/variablegroups?api-version=3.2-preview.1"
            }
            else
            {
                throw "Cannot add variable to nonexisting variable group $VariableGroupName; use the -Force switch to create the variable group."
            }
        }
    }
    PROCESS
    {
        Write-Verbose "Adding $VariableName with value $VariableValue..."
        $variableGroup.variables | Add-Member -Name $VariableName -MemberType NoteProperty -Value @{value=$VariableValue;isSecret=$Secret} -Force
    }
    END
    {
        Write-Verbose "Persist variable group $VariableGroupName."
        $body = $variableGroup | ConvertTo-Json -Depth 10 -Compress
        $response = Invoke-RestMethod $restApi -Method $method -Body $body -ContentType 'application/json' -Header @{"Accept" = "application/json;api-version=3.2-preview.1"}  -UseDefaultCredentials
        
        return $response.id
    }
}

function Get-VariableGroup()
{
    [CmdletBinding()]
    param
    (
        [string][parameter(Mandatory = $true)]$ProjectUrl,
        [string][parameter(Mandatory = $true)]$Name
    )
    BEGIN
    {
        Write-Verbose "Entering script $($MyInvocation.MyCommand.Name)"
        Write-Verbose "Parameter Values"
        $PSBoundParameters.Keys | ForEach-Object { Write-Verbose "$_ = '$($PSBoundParameters[$_])'" }
    }
    PROCESS
    {
        $ProjectUrl = $ProjectUrl.TrimEnd("/")
        $url = "$($ProjectUrl)/_apis/distributedtask/variablegroups"
        $variableGroups = Invoke-RestMethod $url -UseDefaultCredentials
        
        foreach($variableGroup in $variableGroups.value){
            if ($variableGroup.name -like $Name){
                Write-Verbose "Variable group $Name found."
                return $variableGroup
            }
        }
        Write-Verbose "Variable group $Name not found."
        return $null
    }
    END { }
}

This cmdlet looks for the given group and, if it exists, updates it with the values you pass in. In case the variable group (matched by name) doesn’t exist and the -Force switch is specified, it creates a new group. The working principle is the same as for the Add-EnvironmentVariable cmdlet. At the end it returns the variable group ID, which you can later pass to the Add-EnvironmentVariable cmdlet to reference the group.

The following is an example invocation:

$ProjectUrl = "https://my.tfs.local:8080/tfs/DefaultCollection/MyProject"

Add-VariableGroupVariable -ProjectUrl $ProjectUrl -VariableGroupName "Mario Test" -Force -VariableName "var1" -VariableValue "1234"
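Since the cmdlet returns the ID of the created (or updated) variable group, you can capture it and feed it straight to Add-EnvironmentVariable to link the group to a Release Definition. A short sketch, using made-up variable names and the definition ID from the earlier example:

$groupId = Add-VariableGroupVariable -ProjectUrl $ProjectUrl -VariableGroupName "Mario Test" -Force `
    -VariableName "var1" -VariableValue "1234"

# Link the freshly created group to release definition 23 while adding a regular variable
Add-EnvironmentVariable -VariableName "var5" -VariableValue "abc" -ProjectUrl $ProjectUrl `
    -DefinitionId 23 -VariableGroups $groupId -Comment "Variable group linked by PSScript"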

That’s all, folks! You now have two new cmdlets that allow you to automate the management of your Release Definition variables. Use them wisely 🙂

Happy coding!

P.S.
A thank you goes to Ted van Haalen who, based on my input, wrote and tested the Add-VariableGroupVariable cmdlet (as you may already have noticed from the different coding style).

Using Windows Machine File Copy (WinRM) VSTS extension

The implementation of the original Windows Machine File Copy task is based on the net use command and Robocopy, which rely on SMB (Server Message Block) and the NetBIOS protocol on ports 139 or 445. Although this is usually available on intranets, network restrictions or security policies may make such a connection impossible, or you may be copying to a machine that is outside your local network. Recently I faced an issue copying files with the Windows Machine File Copy task due to SMB restrictions. This pushed me to recreate the same task, however with the transfer based on the WinRM protocol. I shared my work as an extension on the Visual Studio Marketplace; you can find it under the name WinRm File Copy.

The sources are available on GitHub in the mmajcica/win-rm-file-copy repository, while the original implementation is part of the Microsoft/vsts-tasks repository.

In this post I will not go into the implementation details; I will just illustrate the usage of the task itself.

Usage-wise, there are no differences from the original Microsoft task, which was also my main goal. Here is a screenshot of the task:

As you can see, all of the parameters are almost the same as for the original task.

Requirements-wise, PowerShell v5 is required both on the build server and on the destination machine. That is the only requirement, provided that WinRM is correctly set up.
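Both prerequisites are easy to verify from the agent with a couple of standard cmdlets; a quick sanity-check sketch (the host name and credentials are placeholders):

# PowerShell version on the agent (5.0 or higher is expected)
$PSVersionTable.PSVersion

# Does WinRM on the target machine answer? Add -UseSSL if you configured HTTPS transport
Test-WSMan -ComputerName "dbserver.fabrikam.com"

# Optionally check the PowerShell version on the target machine as well
$cred = Get-Credential
Invoke-Command -ComputerName "dbserver.fabrikam.com" -Credential $cred -ScriptBlock { $PSVersionTable.PSVersion }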

Let’s quickly see how to set up a file copy. As with the Microsoft task, you need to specify the following parameters:

  • Source: The source of the files. Using pre-defined system variables like $(Build.Repository.LocalPath) makes it easy to specify the location of the build on the build agent machine; the variables resolve to the working folder on the agent when the task runs. Wildcards like **/*.zip are not supported. Most likely you will copy something from the artifacts folder generated in the previous steps of your build/release, for example $(System.ArtifactsDirectory)\Something.
  • Machines: A comma-separated list of machine FQDNs or IP addresses, optionally including the port. For example dbserver.fabrikam.com, dbserver_int.fabrikam.com:5988, 192.168.34:5989.
  • Admin Login: A domain or local administrator of the target host, in the format <Domain or hostname>\<Admin User>.
  • Password: The password for the admin login. It can accept a variable defined in the build/release definition, e.g. ‘$(passwordVariable)’. You can mark the variable as ‘secret’ to secure it.
  • Destination Folder: The folder on the target Windows machines to which the files will be copied. An example of the destination folder is C:\FabrikamFibre\Web.
  • Use SSL: Flag this option in case you are using secure WinRM, i.e. HTTPS as transport.
  • Clean Target: Checking this option will clean the destination folder before the files are copied to it.
  • Copy Files in Parallel: Checking this option will copy files to all the target machines in parallel, which can speed up the copying process.

There is not much more to say. If you need to copy a file or a folder from your build agent to a target folder on a remote machine, using WinRM as the transfer medium, this is the way to go.
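If you are curious about the underlying idea, copying over WinRM can be reproduced with plain PowerShell 5 remoting. The following is only an illustration under assumptions of my own (host, credentials and paths are placeholders), not the actual implementation of the task:

$cred = Get-Credential
# -UseSSL matches the "Use SSL" option of the task
$session = New-PSSession -ComputerName "dbserver.fabrikam.com" -Credential $cred -UseSSL

# Equivalent of the "Clean Target" option
Invoke-Command -Session $session -ScriptBlock {
    Remove-Item "C:\FabrikamFibre\Web\*" -Recurse -Force -ErrorAction SilentlyContinue
}

# Copy-Item -ToSession is a PowerShell 5 feature, in line with the task's requirements
Copy-Item -Path "$env:SYSTEM_ARTIFACTSDIRECTORY\Something" -Destination "C:\FabrikamFibre\Web" `
    -ToSession $session -Recurse -Force

Remove-PSSession $session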

Happy copying!

XL Deploy in VSTS/TFS build/release pipelines

Introduction

About a year ago I showed you how to use the XebiaLabs Team Foundation Server 2015 XL Deploy plugin to create, import, and deploy XL Deploy Deployment Archives on TFS and VSTS. With a new version of the plugin, now renamed to VSTS extension for XL Deploy, XebiaLabs delivers two more build/release tasks. These tasks are supported by both build and release pipelines in VSTS and Team Foundation Server. This allows us to be more granular and choose to perform only certain operations at each stage of our pipelines. We can, for example, create our XL Deploy Deployment Archive during the build and later on, in our release definition, deploy the same artifact to different environments using XL Deploy.
In the following post, I’ll show you an example of how to achieve this by using the two new tasks: the Package with XL Deploy task and the Deploy with XL Deploy task.

Building the artifact

As in my previous post, we will keep our Visual Studio Build task; however, instead of using the XL Deploy build task, we will add the Package with XL Deploy task, which you can find in the Package category of the task catalog.

Once added, we need to point it to the right manifest path, choose whether we want the task to set the version of the created package based on the build number, and, in our case, publish the artifact to TFS. This way, once the package is created, it is uploaded to TFS and associated as an artifact with the current build. In the end the result should be similar to the following.

You can now run your build and, if it is successful, you should see the DAR package under the build artifacts.
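If you prefer checking from the command line instead of the web UI, a quick REST call against the build artifacts endpoint lists what got attached to the build (the project URL and build ID below are placeholders, and the api-version may differ on your TFS version):

$ProjectUrl = "https://my.tfs.local:8080/tfs/DefaultCollection/MyProject"
$buildId = 123   # hypothetical build ID

$artifacts = Invoke-RestMethod "$ProjectUrl/_apis/build/builds/$buildId/artifacts?api-version=2.0" -UseDefaultCredentials
$artifacts.value | Select-Object name, @{ Name = "type"; Expression = { $_.resource.type } }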

If you can see your artifact uploaded, we can move to the next step.

Deploying from the VSTS Release Management

We are now going to create a new release definition and start with an empty template. As the artifact source we are going to indicate a build and select, from the relevant drop-down, the build definition that we created in the previous step. You may also choose to select the continuous deployment option and a specific queue for this release.

Once the release definition is created, for your initial Environment, choose Add task and select Deploy with XL Deploy from the task catalog.

This task allows us to import the DAR package into XL Deploy and trigger the deployment to the desired environment. Bear in mind that it checks whether the package is already in XL Deploy and, if so, skips the import. This means that if you use it multiple times for different environment definitions, it behaves as efficiently as possible.

Now we need to select the XL Deploy endpoint, which indicates the instance of XL Deploy we are using for this deployment, and fill in the necessary information about the artifact location and the name of our target environment. As far as the XL Deploy endpoint is concerned, you can find further information in the Add an XL Deploy endpoint in Visual Studio Team Services document. The artifact can be picked up from the server, meaning it is made available by the release itself, or it can be retrieved from a file share. In our case it is delivered by the release, which gets it from the associated build.

That’s all! Now we can create a new release and let our task delegate the deployment to XL Deploy.

Conclusion

We saw how we can leverage the new build/release tasks to interact with XL Deploy in different phases of our build/release process.

You can find more VSTS-specific information at the following link, or more information about the VSTS extension for XL Deploy here.