Using the Windows Machine File Copy (WinRM) VSTS extension

Implementation of the original Windows Machine File Copy task is based on the net use command and Robocopy. These make use of SMB (Server Message Block) and the NetBIOS protocol on port 139 or 445. Although by default this should always be supported on an intranet, network restrictions or security policies may make such a connection impossible, or you may be copying to a machine that is outside your local network. I recently faced exactly that: the Windows Machine File Copy task failed for me due to SMB restrictions. This pushed me to recreate the same task, but with the transfer based on the WinRM protocol. I shared my work as an extension on the Visual Studio Team Services – Visual Studio Marketplace; you can find it here: WinRm File Copy.

Sources are available on GitHub in the repository mmajcica/win-rm-file-copy, while the original implementation is part of the Microsoft/vsts-tasks repository.

In this post I will not go into the implementation details, just illustrate the usage of the task itself.

Usage-wise, there are no differences from the original Microsoft task, which was also my main goal. Here is a screenshot of the task:

As you can see, all of the parameters are almost the same as for the original task.

Requirements-wise, PowerShell v5 is required both on the build server and on the destination machine. That is the only requirement, taking for granted that WinRM is correctly set up.
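If you are not sure whether WinRM is actually reachable on a target machine, a quick check before running the copy saves some head-scratching. A minimal sketch, assuming a hypothetical target host web01.fabrikam.com:

```powershell
# On the target machine (elevated prompt): enable PowerShell remoting over WinRM.
Enable-PSRemoting -Force

# From the build agent: verify that the WinRM service on the target responds.
# web01.fabrikam.com is a placeholder host name.
Test-WSMan -ComputerName web01.fabrikam.com
```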

Let’s quickly see how to set up a file copy. As with the Microsoft task, you need to specify the following parameters:

  • Source: The source of the files. Using pre-defined system variables like $(Build.Repository.LocalPath) makes it easy to point at the build location on the build agent machine; the variables resolve to the working folder on the agent when the task runs there. Wildcards like **/*.zip are not supported. Most likely you will copy something from your artifacts folder that was generated in previous steps of your build/release, for example $(System.ArtifactsDirectory)\Something
  • Machines: Specify a comma-separated list of machine FQDNs/IP addresses, optionally with a port. For example dbserver.fabrikam.com, dbserver_int.fabrikam.com:5988, 192.168.34:5989.
  • Admin Login: Domain/local administrator of the target host. Format: <Domain or hostname>\<Admin User>.
  • Password: Password for the admin login. It can accept a variable defined in build/release definitions, e.g. ‘$(passwordVariable)’. You may mark the variable as ‘secret’ to secure it.
  • Destination Folder: The folder in the Windows machines where the files will be copied to. An example of the destination folder is C:\FabrikamFibre\Web.
  • Use SSL: Flag this option if you are using secure WinRM, i.e. HTTPS for transport.
  • Clean Target: Checking this option will clean the destination folder prior to copying the files to it.
  • Copy Files in Parallel: Checking this option will copy files to all the target machines in parallel, which can speed up the copying process.
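Under the hood the transfer runs over PowerShell remoting, so if you ever want to reproduce or troubleshoot the copy manually, something along the following lines is a reasonable approximation. This is only a sketch: the host name, paths and credentials are placeholders, not the task’s actual implementation.

```powershell
# Placeholder values; substitute your own machine, credentials and paths.
$cred    = Get-Credential -Message 'Admin login for the target machine'
$session = New-PSSession -ComputerName web01.fabrikam.com -Credential $cred -UseSSL

# Copy-Item -ToSession requires PowerShell 5 on both ends, matching the task requirement.
Copy-Item -Path 'C:\agent\_work\r1\a\Something' -Destination 'C:\FabrikamFibre\Web' `
          -ToSession $session -Recurse -Force

Remove-PSSession $session
```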

There is not much more to say. If you need to copy a file or a folder from your build agent into a target folder on a remote machine, using WinRM as the transfer medium, this is the way to go.

Happy copying!

Chrome’s badidea

In case you misunderstood the title: NO, Chrome is not a bad idea. It has been my browser of choice for almost 10 years, and it’s a great choice. However, because of its security restrictions, you may bump into pages like this:

In case of an invalid certificate or some other similar issue, your browser will refuse to load the page of your choice. This is often a smart move; however, if you really know what you are doing, there is an easy way to bypass it. Until now, in certain cases you could expand the details view and continue on to the site, but in certain scenarios and for certain versions that is not possible anymore. If you google the error message you will find a bunch of suggestions that may or may not work, and I find all of them cumbersome.
There is, however, an easy trick, and I’ll write it down here as I continuously forget it and have to poke a friend of mine, Damir Varga, who initially introduced me to it (and apparently has a better memory than I do).

Now back to the trick.

In case you are presented with the above situation, click anywhere in the page and type ‘badidea’ on your keyboard. That’s all, it’s that simple. Your page will now load.

Top trick!

Happy browsing!

Persisting sensitive information with PowerShell

It often happens that I need to persist a password or some other sensitive string in a file or a database. When it does, I can never recall the exact command I used to do so in the past. That’s why I decided to encapsulate the two operations of encrypting and decrypting a string in a cmdlet, so that next time I can just check my blog post.

A small preface about the encryption operation. It is based on the ConvertTo-SecureString and ConvertFrom-SecureString cmdlets, which, when given a key, use the Advanced Encryption Standard (AES) algorithm. AES supports key lengths of 128, 192, or 256 bits, and the length used depends on the key you specify.
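The encrypting cmdlet itself boils down to very little code. Below is a minimal sketch of it; the function name Protect-String and its parameter names are my own choice for this post, not an established cmdlet:

```powershell
function Protect-String
{
    [CmdletBinding()]
    param
    (
        # The plain-text string to encrypt.
        [Parameter(Mandatory = $true)]
        [string]$String,

        # The encryption key: 8, 12 or 16 characters (128, 192 or 256 bits).
        [Parameter(Mandatory = $true)]
        [string]$Key
    )

    # Each PowerShell character is 16-bit Unicode, so the key becomes 16, 24 or 32 bytes.
    $keyBytes = [System.Text.Encoding]::Unicode.GetBytes($Key)

    $secure = ConvertTo-SecureString -String $String -AsPlainText -Force

    # With -Key the secure string is encrypted with AES instead of DPAPI.
    ConvertFrom-SecureString -SecureString $secure -Key $keyBytes
}
```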

As you can see, there are two required parameters: the string you are trying to encrypt and the key used to encrypt it. As mentioned above, the specified key must have a length of 128, 192, or 256 bits, which translates into a string of 8, 12, or 16 characters respectively. The calculation is simple: strings in PowerShell are represented as 16-bit Unicode, instances of .NET’s System.String class, thus 16 bits per character. Knowing this, the maths is easy.
For the record, if we hadn’t specified any key, the Windows Data Protection API (DPAPI) would be used to encrypt the standard string representation, and we wouldn’t be able to decrypt our string on a different computer.
After we invoke our cmdlet, we get back the encrypted string. We can then persist that information safely in, for example, our configuration file or a database field.

Once we need to read the value back, we can use the following cmdlet:
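Again a minimal sketch, with Unprotect-String chosen as a name for symmetry with the function above:

```powershell
function Unprotect-String
{
    [CmdletBinding()]
    param
    (
        # The encrypted string previously produced by Protect-String.
        [Parameter(Mandatory = $true)]
        [string]$EncryptedString,

        # The same key that was used for encryption.
        [Parameter(Mandatory = $true)]
        [string]$Key
    )

    $keyBytes = [System.Text.Encoding]::Unicode.GetBytes($Key)

    # Rebuild the SecureString from its AES-encrypted representation.
    $secure = ConvertTo-SecureString -String $EncryptedString -Key $keyBytes

    # Marshal the SecureString back into a plain .NET string.
    $bstr = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($secure)
    try     { [System.Runtime.InteropServices.Marshal]::PtrToStringBSTR($bstr) }
    finally { [System.Runtime.InteropServices.Marshal]::ZeroFreeBSTR($bstr) }
}
```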

Passing in the encrypted string and the key that should be used to decrypt the information, this cmdlet returns the decrypted string.

Following is an example:
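A round trip using the two sketched functions above; the key is an arbitrary illustrative value:

```powershell
$key = 'MySuperSecretKey'   # 16 characters -> 256-bit key (illustrative value only)

$encrypted = Protect-String -String 'My strong super password' -Key $key
# $encrypted can now be stored in a configuration file or a database field.

Unprotect-String -EncryptedString $encrypted -Key $key
```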

You can expect to see “My strong super password” printed to the console.

That’s all folks. Keep your sensitive information safe!