Visual Studio Code behind a proxy

This post is kind of a continuation of the Tough life behind a proxy series. This time it is the turn of Visual Studio Code. The application requires web access when it comes to installing extensions, and if you are behind a proxy, that is not the easiest thing to achieve.

To set the proxy in Visual Studio Code you need to edit the User Settings. You can open them from the Preferences menu:

user_settings

Once you open them, you will be presented with a screen showing the default settings alongside your user settings file, which overrides them:

override

Now you can enter the following:

// Place your settings in this file to overwrite the default settings
{
 "http.proxy": "http://my.proxy.address:8080",
 "https.proxy": "http://my.proxy.address:8080",
 "http.proxyStrictSSL": false
}

Some of this information is quite easy to find on the web; however, posts often do not mention the https proxy or disabling strict SSL. Both settings are necessary in order to install extensions.
Make sure you have the latest version of Visual Studio Code installed before testing this, as some older versions did not support these settings.

Save your settings and restart Visual Studio Code. Now try installing an extension; it should succeed.
If you are unsure how to install an extension in Visual Studio Code, you can find more about this topic in the blog post Announcing PowerShell language support for Visual Studio Code and more!, under the “Installing the extension” paragraph. To check the available extensions, please visit the Visual Studio Marketplace.
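
If the installation still fails, it may be worth knowing that Visual Studio Code also honors the standard proxy environment variables. As a fallback you could try launching it from a shell where they are set; a minimal sketch in PowerShell, assuming the same placeholder proxy address as above and that code is on your PATH:

# Set the standard proxy variables for the current session only
# (same placeholder proxy address as in the settings above)
$env:HTTP_PROXY = "http://my.proxy.address:8080"
$env:HTTPS_PROXY = "http://my.proxy.address:8080"
code    # launch Visual Studio Code from this session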

Cheers

Uploading artifacts into Nexus via PowerShell

It may not be the most common task, but it can happen that you need to upload an artifact to a Maven repository in Nexus via PowerShell. To achieve that, we will use the Nexus REST API, which for this task requires a multipart/form-data POST to the /service/local/artifact/maven/content resource. This is not a trivial task: handling the multipart/form-data standard in PowerShell is notoriously difficult, as I already described in a previous post, PowerShell tips and tricks – Multipart/form-data requests. As you can see there, this type of call is exactly what is needed to upload an artifact to your Nexus server. I will not go into the details of multipart/form-data requests here; if you are interested, check the post just mentioned.

For this purpose I created two cmdlets that let me upload an artifact either by supplying the GAV parameters directly or by passing the GAV definition in a POM file. A couple of simple functions support both of these cmdlets.

GAV definition in a POM file

As the heading suggests, this cmdlet will let you upload your artifact and specify the GAV parameters via a POM file.

function Import-ArtifactPOM()
{
	[CmdletBinding()]
	param
	(
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$EndpointUrl,
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$Repository,
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$PomFilePath,
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$PackagePath,
		[System.Management.Automation.PSCredential][parameter(Mandatory = $true)]$Credential
	)
	BEGIN
	{
		if (-not (Test-Path $PackagePath))
		{
			$errorMessage = ("Package file {0} missing or unable to read." -f $PackagePath)
			$exception =  New-Object System.Exception $errorMessage
			$errorRecord = New-Object System.Management.Automation.ErrorRecord $exception, 'ArtifactUpload', ([System.Management.Automation.ErrorCategory]::InvalidArgument), $PackagePath
			$PSCmdlet.ThrowTerminatingError($errorRecord)
		}
		
		if (-not (Test-Path $PomFilePath))
		{
			$errorMessage = ("POM file {0} missing or unable to read." -f $PomFilePath)
			$exception =  New-Object System.Exception $errorMessage
			$errorRecord = New-Object System.Management.Automation.ErrorRecord $exception, 'ArtifactUpload', ([System.Management.Automation.ErrorCategory]::InvalidArgument), $PomFilePath
			$PSCmdlet.ThrowTerminatingError($errorRecord)
		}

		Add-Type -AssemblyName System.Net.Http
	}
	PROCESS
	{
		# Build the multipart parts: repository id, hasPom flag, the POM itself and the artifact stream
		$repoContent = CreateStringContent "r" $Repository
		$hasPomContent = CreateStringContent "hasPom" "true"
		# -Raw reads the POM as a single string instead of an array of lines
		$pomContent = CreateStringContent "file" "$(Get-Content $PomFilePath -Raw)" ([System.IO.Path]::GetFileName($PomFilePath)) "text/xml"
		$streamContent = CreateStreamContent $PackagePath

		$content = New-Object -TypeName System.Net.Http.MultipartFormDataContent
		$content.Add($repoContent)
		$content.Add($hasPomContent)
		$content.Add($pomContent)
		$content.Add($streamContent)

		$httpClientHandler = GetHttpClientHandler $Credential

		return PostArtifact $EndpointUrl $httpClientHandler $content
	}
	END { }
}

In order to invoke this cmdlet you will need to supply the following parameters:

  • EndpointUrl – Address of your Nexus server
  • Repository – Name of your repository in Nexus
  • PomFilePath – Full, literal path pointing to your POM file
  • PackagePath – Full, literal path pointing to your Artifact
  • Credential – Credentials in the form of PSCredential object

I will create a POM file with the following content:

<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.mycompany.app</groupId>
  <artifactId>my-app</artifactId>
  <version>3</version>
</project>

And invoke my cmdlet in the following way:

$server = "http://nexus.maio.com"
$repoName = "maven"
$pomFile = "C:\Users\majcicam\Desktop\pom.xml"
$package = "C:\Users\majcicam\Desktop\junit-4.12.jar"
$credential = Get-Credential

Import-ArtifactPOM $server $repoName $pomFile $package $credential

If everything is set up correctly, you will first be asked to provide the credentials, then the upload will start. If it succeeds, you will receive back a string like this:

{"groupId":"com.mycompany.app","artifactId":"my-app","version":"3","packaging":"jar"}

It is a JSON representation of the information about the imported package.
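
Since the response is plain JSON, you can, if needed, turn it into an object directly in PowerShell (a small sketch; ConvertFrom-Json requires PowerShell 3.0 or later):

# Parse the returned JSON string into a PowerShell object
$result = Import-ArtifactPOM $server $repoName $pomFile $package $credential | ConvertFrom-Json
$result.version    # prints: 3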

Manually supplying GAV parameters

The other cmdlet is based on a similar principle; however, it doesn’t require a POM file to be passed in. Instead, it lets you provide the GAV values as parameters to the call.

function Import-ArtifactGAV()
{
	[CmdletBinding()]
	param
	(
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$EndpointUrl,
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$Repository,
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$Group,
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$Artifact,
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$Version,
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$Packaging,
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$PackagePath,
		[System.Management.Automation.PSCredential][parameter(Mandatory = $true)]$Credential
	)
	BEGIN
	{
		if (-not (Test-Path $PackagePath))
		{
			$errorMessage = ("Package file {0} missing or unable to read." -f $packagePath)
			$exception =  New-Object System.Exception $errorMessage
			$errorRecord = New-Object System.Management.Automation.ErrorRecord $exception, 'XLDPkgUpload', ([System.Management.Automation.ErrorCategory]::InvalidArgument), $packagePath
			$PSCmdlet.ThrowTerminatingError($errorRecord)
		}

		Add-Type -AssemblyName System.Net.Http
	}
	PROCESS
	{
		$repoContent = CreateStringContent "r" $Repository
		$groupContent = CreateStringContent "g" $Group
		$artifactContent = CreateStringContent "a" $Artifact
		$versionContent = CreateStringContent "v" $Version
		$packagingContent = CreateStringContent "p" $Packaging
		$streamContent = CreateStreamContent $PackagePath

		$content = New-Object -TypeName System.Net.Http.MultipartFormDataContent
		$content.Add($repoContent)
		$content.Add($groupContent)
		$content.Add($artifactContent)
		$content.Add($versionContent)
		$content.Add($packagingContent)
		$content.Add($streamContent)

		$httpClientHandler = GetHttpClientHandler $Credential

		return PostArtifact $EndpointUrl $httpClientHandler $content
	}
	END { }
}

In order to invoke this cmdlet you will need to supply the following parameters:

  • EndpointUrl – Address of your Nexus server
  • Repository – Name of your repository in Nexus
  • Group – Group Id
  • Artifact – Maven artifact Id
  • Version – Artifact version
  • Packaging – Packaging type (e.g. jar, war, ear, rar)
  • PackagePath – Full, literal path pointing to your Artifact
  • Credential – Credentials in the form of PSCredential object

An example of invocation:

$server = "http://nexus.maio.com"
$repoName = "maven"
$group = "com.test"
$artifact = "project"
$version = "2.4"
$packaging = "jar"
$package = "C:\Users\majcicam\Desktop\junit-4.12.jar"
$credential = Get-Credential

Import-ArtifactGAV $server $repoName $group $artifact $version $packaging $package $credential

If all goes as expected you will again receive a response confirming the imported values.

{"groupId":"com.test","artifactId":"project","version":"2.4","packaging":"jar"}

Supporting functions

As you could see, my cmdlets rely on a couple of supporting functions. They are essential in this process, so I will analyze them one by one.

function CreateStringContent()
{
	param
	(
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$Name,
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$Value,
		[string]$FileName,
		[string]$MediaTypeHeaderValue
	)
		# Build the content-disposition header for this form-data part
		$contentDispositionHeaderValue = New-Object -TypeName System.Net.Http.Headers.ContentDispositionHeaderValue -ArgumentList @("form-data")
		$contentDispositionHeaderValue.Name = $Name

		if ($FileName)
		{
			$contentDispositionHeaderValue.FileName = $FileName
		}
		
		$content = New-Object -TypeName System.Net.Http.StringContent -ArgumentList @($Value)
		$content.Headers.ContentDisposition = $contentDispositionHeaderValue

		if ($MediaTypeHeaderValue)
		{
			$content.Headers.ContentType = New-Object -TypeName System.Net.Http.Headers.MediaTypeHeaderValue $MediaTypeHeaderValue
		}

		return $content
}

function CreateStreamContent()
{
	param
	(
		[string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$PackagePath
	)
		# Open the artifact for reading; the stream is disposed together with the content
		$packageFileStream = New-Object -TypeName System.IO.FileStream -ArgumentList @($PackagePath, [System.IO.FileMode]::Open)

		# This part is named "file" and carries the original file name
		$contentDispositionHeaderValue = New-Object -TypeName System.Net.Http.Headers.ContentDispositionHeaderValue "form-data"
		$contentDispositionHeaderValue.Name = "file"
		$contentDispositionHeaderValue.FileName = Split-Path $PackagePath -Leaf

		$streamContent = New-Object -TypeName System.Net.Http.StreamContent $packageFileStream
		$streamContent.Headers.ContentDisposition = $contentDispositionHeaderValue
		$streamContent.Headers.ContentType = New-Object -TypeName System.Net.Http.Headers.MediaTypeHeaderValue "application/octet-stream"

		return $streamContent
}

The first two functions create the correct HTTP content. Each content object we create corresponds to a content-disposition header. The first function returns string-type values for that header, while CreateStreamContent returns a stream to be consumed later on, carrying an octet-stream representation of our artifact.
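
If you want to see what these functions produce, here is a quick check you can run once the functions above are dot-sourced (the repository name is just an example):

Add-Type -AssemblyName System.Net.Http

# Build a sample form-data part and inspect it
$part = CreateStringContent "r" "maven"
$part.Headers.ContentDisposition    # the content-disposition header carrying the field name
$part.ReadAsStringAsync().Result    # the raw value: maven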

The GetHttpClientHandler function is a helper that creates the HTTP client handler carrying the credentials to be used for our call.

function GetHttpClientHandler()
{
	param
	(
		[System.Management.Automation.PSCredential][parameter(Mandatory = $true)]$Credential
	)

	$networkCredential = New-Object -TypeName System.Net.NetworkCredential -ArgumentList @($Credential.UserName, $Credential.Password)
	$httpClientHandler = New-Object -TypeName System.Net.Http.HttpClientHandler
	$httpClientHandler.Credentials = $networkCredential

	return $httpClientHandler
}

Last but not least, the function that actually invokes the POST call to our server.

function PostArtifact()
{
	# CmdletBinding is required so that $PSCmdlet is available in the catch block below
	[CmdletBinding()]
	param
	(
		[string][parameter(Mandatory = $true)]$EndpointUrl,
		[System.Net.Http.HttpClientHandler][parameter(Mandatory = $true)]$Handler,
		[System.Net.Http.HttpContent][parameter(Mandatory = $true)]$Content
	)

	$httpClient = New-Object -TypeName System.Net.Http.HttpClient $Handler

	try
	{
		$response = $httpClient.PostAsync("$EndpointUrl/service/local/artifact/maven/content", $Content).Result

		if (!$response.IsSuccessStatusCode)
		{
			$responseBody = $response.Content.ReadAsStringAsync().Result
			$errorMessage = "Status code {0}. Reason {1}. Server reported the following message: {2}." -f $response.StatusCode, $response.ReasonPhrase, $responseBody

			throw [System.Net.Http.HttpRequestException] $errorMessage
		}

		return $response.Content.ReadAsStringAsync().Result
	}
	catch [Exception]
	{
		$PSCmdlet.ThrowTerminatingError($_)
	}
	finally
	{
		if($null -ne $httpClient)
		{
			$httpClient.Dispose()
		}

		if($null -ne $response)
		{
			$response.Dispose()
		}
	}
}

This is all that is necessary to upload an artifact to our Nexus server. You can find the scripts just shown also on GitHub, in the tfs-build-tasks repository. Further information about programmatically uploading artifacts into Nexus can be found in the following post: How can I programatically upload an artifact into Nexus?.

Securing Nexus Repository Manager OSS

Recently one of my articles got published on the Sonatype website. It is based on my blog post Nexus Repository Manager OSS as Nuget server which, after reviewing it for the Sonatype blog, somehow didn’t seem complete. Thus, I decided to add an extra chapter about security, which I would like to share with you here on my personal blog as well.

Security

I assume that your computers are part of an Active Directory domain. I will propose here a setup that makes your new Nexus instance talk to your domain controller when it comes to authenticating users. Why? Well, uploading packages to the hosted repository requires an API key, and it is assigned on a per-user basis. If you wish your users (and services) to be distinguished, and to limit some of them, you need to set up authentication and roles correctly. You could do this by simply using the Nexus internal database; however, as we all know, adding users manually and creating yet another identity store is not advisable from a convenience and maintenance perspective, not to speak of the security implications.

Setting up LDAP connection

In the left-hand main menu panel expand the security group and select the LDAP Configuration option. A new tab will open and you will be presented with a long list of parameters:

ldap-configuration

We are going to analyze them one by one and explain the values you need to enter in order to make this connection work.

Protocol
Here you have two choices: Lightweight Directory Access Protocol (ldap) and Lightweight Directory Access Protocol over SSL (ldaps). Most probably you are going to use the plain LDAP protocol and not the secure variant.

Hostname
The hostname or IP address of the LDAP server. Simply add the FQDN towards your domain controller.

Port
The port on which the LDAP server is listening. Port 389 is the default port for the ldap protocol, and port 636 is the default for ldaps.
If you have a multi-domain, distributed Active Directory forest, you should connect to Active Directory through port 3268 instead, as connecting directly to port 389 might lead to errors. Port 3268 is served by the Global Catalog, which exposes the distributed data. The SSL equivalent of this port is 3269.
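
Before going further, it may be worth verifying that the chosen port is actually reachable from the Nexus host. A small sketch using Test-NetConnection (available from PowerShell 4.0; the domain controller name below is a placeholder):

# Check basic TCP reachability of the LDAP and Global Catalog ports
Test-NetConnection -ComputerName dc01.maio.local -Port 389
Test-NetConnection -ComputerName dc01.maio.local -Port 3268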

Search Base
The search base is the Distinguished Name (DN) to be appended to the LDAP query. The search base usually corresponds to the domain name of an organization. For example, the search base on my test domain server could be dc=maio,dc=local.
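
If you are unsure about the right value, any domain-joined machine can tell you; the following one-liner reads the default naming context from the directory:

# Discover the search base via the RootDSE entry
([ADSI]"LDAP://RootDSE").defaultNamingContext    # e.g. DC=maio,DC=local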

Authentication Method
Simple authentication is what we are going to use in the first instance for our Active Directory authentication. Be aware that it is not recommended for production deployments that do not use the secure ldaps protocol, as it sends a clear-text password over the network. You can change the authentication method later, once you have verified that the rest of the configuration works.

Username
For the LDAP service to establish the communication with the AD server, a valid account is necessary. Usually you will create a service account in your AD for this purpose only.

Password
Password for the above-mentioned account.

This is the first part of the configuration. Now we need to check that what we entered is valid and that the connection can be established.
Click on the Check Authentication button and, if the configuration is valid, you should get the following message:

authentication-test

Base DN
You should indicate in which organizational unit Nexus should search for user accounts. By default it can be set to ‘cn=users’; however, if you store your user accounts in a different OU, please specify the full path. For example, if you created an OU at the root level called MyBusiness and, inside it, an OU called Users, you would specify ‘OU=Users,OU=MyBusiness’ as the Base DN.
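
If in doubt about the OU path, you can print the distinguished name of an existing account and read the Base DN off it (the account name jdoe below is a placeholder):

# Show where a known user object actually lives in the directory
$searcher = [ADSISearcher]"(sAMAccountName=jdoe)"
$searcher.FindOne().Properties["distinguishedname"]    # e.g. CN=John Doe,OU=Users,OU=MyBusiness,DC=maio,DC=local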

User Subtree
This needs to be checked only if you have other OUs, under the one specified in Base DN, in which users are organized. If that is not the case, you can leave it unchecked.

Object Class
For AD it needs to be set to ‘user’.

User ID Attribute
In the MS AD implementation this value equals sAMAccountName.

Real Name Attribute
You can set this to the ‘givenName’ attribute, or simply to ‘cn’ if givenName is not populated during user creation.

E-Mail Attribute
This attribute should map to ‘mail’.

Group Type
Group Type needs to be set to Dynamic Groups.

Member Of Attribute
In the case of MS AD this equals ‘memberOf’.
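
Before moving on, you can double-check that these attributes are actually populated for a test account (again, jdoe is a placeholder):

# Dump the attributes Nexus will map for a sample user
$result = ([ADSISearcher]"(sAMAccountName=jdoe)").FindOne()
"samaccountname", "givenname", "mail", "memberof" |
	ForEach-Object { "{0}: {1}" -f $_, ($result.Properties[$_] -join "; ") }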

Now that we have set all of the relevant values, we can ask Nexus OSS to give us a preview of the retrieved accounts and mappings, to make sure everything is set up correctly. Click on the Check User Mapping button and you should see a preview of the retrieved values and how they are mapped into Nexus OSS. If everything looks fine, save your configuration.

Once the configuration is persisted, the last thing to do is to let Nexus know that you would like to use this new setup for the authentication. This is done by enabling the LDAP Authentication Realm.

In the left-hand main menu panel, from the Administration group, choose Server. In the Nexus tab that opens, go to the security settings:

selected-realms

You will find the OSS LDAP Authentication Realm in the group shown on the right side, called Available Realms. Move it to Selected Realms (as shown in the picture above) and save these settings.

Configuring roles

Now that we are able to authenticate through LDAP (AD), we need to set who can do what. This is where roles come in handy. I usually create security groups and then associate them with roles, as this reduces the operation of granting access rights to merely assigning a user to a group.
You can still associate roles with users directly in more or less the same way; however, I’m going to show you my favorite approach.

From the left-hand main menu panel, in the Security group, select Roles. A new tab showing a list of roles will appear.

external-role-mapping

Choose Add and then the External Role Mapping option. You will then be presented with the following dialog:

map-external-role

As you can see from this screenshot, choose LDAP as the Realm and the role (an AD security group, to be precise) you wish to map. Be aware that in some cases not all of the groups will be listed; it is a well-known bug. If this is your case, there is a workaround: add a Nexus Role instead of the External Role Mapping and name the role exactly as your missing group is named. This should do the trick until the issue gets solved.

Once the mapping is added, you will need to assign roles (the actual rights) to the selected security group.

external-role-mapping

On the Role/Privilege Management bar, choose the Add button and select the following roles:

  • Nexus API-Key Access
  • Nexus Deployment Role
  • Repo: All NuGet Repositories (Full Control)
  • Artifact Upload

This set of rights will enable whoever is part of that group to log in to Nexus OSS, manually upload packages, get the API key, and generally control all NuGet repositories. In other words, it is a sort of developer role when it comes to NuGet.
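
As an illustration of what the API-Key Access right enables, a member of this group could push a NuGet package from the command line roughly as follows; the repository URL and the key are placeholders and depend on your Nexus layout:

# Hypothetical example: push a package using the NuGet API key obtained from Nexus
nuget push MyPackage.1.0.0.nupkg -Source "http://nexus.maio.com/service/local/nuget/MyNuGetRepo/" -ApiKey "your-api-key-here"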

When it comes to the administrator role, it is sufficient to assign only the Nexus Administrator Role.
Once you have finished selecting roles, do not forget to click the Save button to persist this configuration.

These two basic profiles should be sufficient to start using Nexus. If further profiles are necessary, check the managing roles guide.

You can now test the whole security setup. Make sure your AD user account is part of the administrative security group that you just mapped to the Nexus Administrator Role. Log out and log back in with your domain account. If you succeed in logging in, congratulations: LDAP is set up correctly.