Introduction

If you have ever needed to invoke a request that must comply with the multipart/form-data standard in PowerShell, you probably discovered quite quickly that none of the commonly used cmdlets support it.
Neither Invoke-WebRequest nor Invoke-RestMethod knows how to format the request body to comply with the multipart/form-data standard. You probably Googled the problem, and the best you could find were a couple of questions on Stack Overflow showing how to manually build the message body and then invoke your call with one of the above-mentioned cmdlets. I did the same, but I was not satisfied with the answers I found. Although the proposed code could be written slightly better, it still was not optimal. The first problem is its memory footprint: if you are transmitting a large amount of data, all of the objects are loaded into memory.
After trying this approach, I dug into the .NET Framework and found that everything we need is already there. True, it is only available in .NET 4.5, but if that is not an obstacle, I would advise you to follow that approach. HttpClient and its related classes have everything necessary to support the multipart/form-data standard. I initially created a prototype as a small C# application and, once it succeeded, translated it into PowerShell.

In this post I will show you both approaches: the pure PowerShell one, which formats the request body and transmits it via the Invoke-WebRequest cmdlet, and a less resource-intensive one based on the HttpClient .NET class.

Let's start.

Throw everything in

The multipart/form-data standard requires the message body to follow a certain structure. You can read more about it in RFC 2388.
Aside from the body structure, there is the concept of a boundary. A boundary is nothing more than a unique string used as a delimiter inside the message body. It is specified in the request header and used inside the message body to surround the files we intend to transmit.
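As an illustration, a request body carrying a single file part looks roughly like this (the boundary value and file name below are made up, and per the RFC each line must be terminated by CRLF):

```
--7d225ff5-3208-4c95-9a5e-3bee356e75cb
Content-Disposition: form-data; name="fileData"; filename="report.pdf"
Content-Type: application/pdf

(raw file content)
--7d225ff5-3208-4c95-9a5e-3bee356e75cb--
```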

As I mentioned earlier, this approach constructs the body as a string in a variable and, once it is ready, invokes the Invoke-WebRequest cmdlet to execute the call.

function Invoke-MultipartFormDataUpload
{
    [CmdletBinding()]
    PARAM
    (
        [string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$InFile,
        [string]$ContentType,
        [Uri][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$Uri,
        [System.Management.Automation.PSCredential]$Credential
    )
    BEGIN
    {
        if (-not (Test-Path $InFile))
        {
            $errorMessage = ("File {0} missing or unable to read." -f $InFile)
            $exception = New-Object System.Exception $errorMessage
            $errorRecord = New-Object System.Management.Automation.ErrorRecord $exception, 'MultipartFormDataUpload', ([System.Management.Automation.ErrorCategory]::InvalidArgument), $InFile
            $PSCmdlet.ThrowTerminatingError($errorRecord)
        }

        if (-not $ContentType)
        {
            Add-Type -AssemblyName System.Web

            $mimeType = [System.Web.MimeMapping]::GetMimeMapping($InFile)
            
            if ($mimeType)
            {
                $ContentType = $mimeType
            }
            else
            {
                $ContentType = "application/octet-stream"
            }
        }
    }
    PROCESS
    {
        $fileName = Split-Path $InFile -Leaf
        $boundary = [guid]::NewGuid().ToString()

        $fileBin = [System.IO.File]::ReadAllBytes($InFile)
        $enc = [System.Text.Encoding]::GetEncoding("iso-8859-1")

        $template = @'
--{0}
Content-Disposition: form-data; name="fileData"; filename="{1}"
Content-Type: {2}

{3}
--{0}--

'@

        $body = $template -f $boundary, $fileName, $ContentType, $enc.GetString($fileBin)

        try
        {
            return Invoke-WebRequest -Uri $Uri `
                                     -Method Post `
                                     -ContentType "multipart/form-data; boundary=$boundary" `
                                     -Body $body `
                                     -Credential $Credential
        }
        catch [Exception]
        {
            $PSCmdlet.ThrowTerminatingError($_)
        }
    }
    END { }
}

As you can see, we are preparing the message body “by hand” and executing Invoke-WebRequest to send our file. This is not efficient at all, as it encodes the whole file as a string and keeps it in memory. Consider transmitting large files and you will get an idea of the memory footprint of this approach. Aside from that, it is just not elegant!
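For completeness, a call to the function might look like the following; the URI and the file path are placeholders, not a real endpoint:

```powershell
# Hypothetical endpoint and file path; replace with your own values.
$credential = Get-Credential
$response = Invoke-MultipartFormDataUpload -InFile "C:\temp\package.zip" `
                                           -Uri "http://localhost/api/upload" `
                                           -Credential $credential
$response.StatusCode
```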

Wait a moment, what a waste

The previously shown method is, as said, not elegant and wastes resources. A better approach is to let the HttpClient class handle everything. It is far more efficient, as it uses streams to read our file and transfers it into an HTTP StreamContent. It is a less resource-intensive and more elegant solution.

Let’s see how our refactored Invoke-MultipartFormDataUpload cmdlet looks:

function Invoke-MultipartFormDataUpload
{
    [CmdletBinding()]
    PARAM
    (
        [string][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$InFile,
        [string]$ContentType,
        [Uri][parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]$Uri,
        [System.Management.Automation.PSCredential]$Credential
    )
    BEGIN
    {
        if (-not (Test-Path $InFile))
        {
            $errorMessage = ("File {0} missing or unable to read." -f $InFile)
            $exception = New-Object System.Exception $errorMessage
            $errorRecord = New-Object System.Management.Automation.ErrorRecord $exception, 'MultipartFormDataUpload', ([System.Management.Automation.ErrorCategory]::InvalidArgument), $InFile
            $PSCmdlet.ThrowTerminatingError($errorRecord)
        }

        if (-not $ContentType)
        {
            Add-Type -AssemblyName System.Web

            $mimeType = [System.Web.MimeMapping]::GetMimeMapping($InFile)
            
            if ($mimeType)
            {
                $ContentType = $mimeType
            }
            else
            {
                $ContentType = "application/octet-stream"
            }
        }
    }
    PROCESS
    {
        Add-Type -AssemblyName System.Net.Http

        $httpClientHandler = New-Object System.Net.Http.HttpClientHandler

        if ($Credential)
        {
            $networkCredential = New-Object System.Net.NetworkCredential @($Credential.UserName, $Credential.Password)
            $httpClientHandler.Credentials = $networkCredential
        }

        $httpClient = New-Object System.Net.Http.HttpClient $httpClientHandler

        $packageFileStream = New-Object System.IO.FileStream @($InFile, [System.IO.FileMode]::Open)

        $contentDispositionHeaderValue = New-Object System.Net.Http.Headers.ContentDispositionHeaderValue "form-data"
        $contentDispositionHeaderValue.Name = "fileData"
        $contentDispositionHeaderValue.FileName = (Split-Path $InFile -Leaf)

        $streamContent = New-Object System.Net.Http.StreamContent $packageFileStream
        $streamContent.Headers.ContentDisposition = $contentDispositionHeaderValue
        $streamContent.Headers.ContentType = New-Object System.Net.Http.Headers.MediaTypeHeaderValue $ContentType

        $content = New-Object System.Net.Http.MultipartFormDataContent
        $content.Add($streamContent)

        try
        {
            $response = $httpClient.PostAsync($Uri, $content).Result

            if (!$response.IsSuccessStatusCode)
            {
                $responseBody = $response.Content.ReadAsStringAsync().Result
                $errorMessage = "Status code {0}. Reason {1}. Server reported the following message: {2}." -f $response.StatusCode, $response.ReasonPhrase, $responseBody

                throw [System.Net.Http.HttpRequestException] $errorMessage
            }

            return $response.Content.ReadAsStringAsync().Result
        }
        catch [Exception]
        {
            $PSCmdlet.ThrowTerminatingError($_)
        }
        finally
        {
            # Dispose the file stream as well, so the input file is not left locked.
            if ($null -ne $packageFileStream)
            {
                $packageFileStream.Dispose()
            }

            if ($null -ne $httpClient)
            {
                $httpClient.Dispose()
            }

            if ($null -ne $response)
            {
                $response.Dispose()
            }
        }
    }
    END { }
}
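One caveat worth noting, which several readers hit with files over 1 GB: HttpClient's default Timeout is 100 seconds, so a large upload can fail before it completes. If that is a concern, you can raise the timeout right after the client is created; the 30-minute value here is just an example, not a recommendation:

```powershell
# HttpClient.Timeout takes a TimeSpan; pick a value that suits your file sizes and link speed.
$httpClient.Timeout = [TimeSpan]::FromMinutes(30)
```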

Conclusion

There is still room for improvement; however, in my case it is good enough and I will not continue extending this cmdlet. If you think it is not complete enough, please extend it and let me know. As an idea, you could make it possible to pass multiple files via the pipeline and execute the upload only once, in the END block. That way it would be more efficient and flexible. Supporting some other common parameters, such as -Proxy, would also benefit others. If you make any improvements, let me know; I would be happy to hear from you in the comments.

20 thoughts on “PowerShell tips and tricks – Multipart/form-data requests”
  1. Thank you for trying! Amazing issue you are trying to solve!

    When I tried your function, I received:
    Invoke-MultipartFormDataUpload : You cannot call a method on a null-valued expression.
    At line:1 char:1
    Also:
    – not sure about the credential part for basic auth
    – unsure of how to add more headers using this approach

    I understand you won’t work on it anymore, but you invested the time to write, so I will invest the time to provide some feedback. Maybe it will help, maybe not.

    Thank you!

      1. I have been banging my head over this for the past few days and thought this might be the lifesaver. Unfortunately, I too have the same issue as outlined above. I invoke the script in the same way as you indicate, such that:

        $response = Invoke-MultipartFormDataUpload -InFile $UpgradeFile -Uri $resource -Credential $Creds

        What is interesting is that if I set a break at line 68 (after the POST should have completed), $response is NULL. I believe this is caused by a timeout in the HTTP client. The files I am attempting to upload are over 1 GB in size and the default timeout is 1 min 40 seconds. I therefore added an additional line at 51 that says:

        $httpClient.Timeout = 18000000000

        Looks like this value is in ticks, and for me the above gives 30 minutes (1800 seconds, good enough for a reasonably slow link). I'm sure there is a better way to define this, but it will do for now.

        I also expect there is a better way to handle the case of a timeout.

        1. One other thing I noticed (from Fiddler), is that the file data appears to be sent twice when credentials need to be passed. So we spend a chunk of time uploading 1 GB of data, only to be told by the far side that authorisation is required (401 response), so we then start the whole process again.

          1. OK, to add one last point (and borrowing the idea from Paul Smith below), it would seem that you need to add your own (Basic) Authorization header to the $httpclient to stop the double upload. This means encoding the authorisation string as Base64 for the correct character set (iso-8859-1) and adding this manually as a header if Credentials are present. I would have thought that the initialisation of the $httpClientHandler with Credentials would have done this, and even setting $httpClientHandler.PreAuthenticate = $true at this stage does not resolve the issue (this only seems to work for subsequent connections to the same URI).

            Which means I need another IF statement after $httpClient is initialised, but we also need the password in plain text in order that it can be properly Base64 encoded for the Basic Authorization header 🙁

            Luckily, Mario has provided another function for us (see http://blog.majcica.com/2015/11/17/powershell-tips-and-tricks-decoding-securestring/).

            if ($Credential) {
            $password = Get-PlainText -SecureString $Credential.Password
            $Base64Auth = [Convert]::ToBase64String([Text.Encoding]::GetEncoding("iso-8859-1").GetBytes("$($Credential.UserName):$($password)"))
            $httpClient.DefaultRequestHeaders.Add("Authorization", "Basic $Base64Auth")
            }

            WOW!!!! What a Royal PITA.

            No wonder my Mac colleagues don't want to touch PowerShell with a barge pole. Well over 100 lines of code for what can be completed in two in Python:

            import requests

            with open('/path/to/upgrade.tar', 'rb') as upgradef:
                requests.post('https://1.2.3.4/api/', files = { 'package' : upgradef }, auth = ( 'username', 'password' ), verify = False, timeout = 1800)

            Surely there should be a simpler way 🙁

  2. Hi Mario,

    could you please explain how to add a custom header to the main HTTP command when using the .NET edition? 🙂
    Thanks!

  3. If you want to pass parameters as well as files to the webserver, you can do the following (added because I struggled a lot with it):

    $multiContent = New-Object System.Net.Http.MultipartFormDataContent #$boundary
    ForEach($keyvaluepair in $parameters.GetEnumerator())
    {
    $stringcontent = New-Object System.Net.Http.StringContent $keyvaluepair.value
    $name = '"{0}"' -f $keyvaluepair.key
    $multiContent.Add($stringcontent,$name);
    }
    $multiContent.Add($streamContent,'"File"','"test.txt"');

  4. Great, thanks for your help!
    For people who are trying to use the Tableau REST API in PowerShell, the below code snippet works if you have a twb file without a datasource. If you get "bad request" then most probably you have Data Sources in the twb file that cannot be found on the Tableau Server.

    Add-Type -AssemblyName System.Net.Http
    [System.Net.Http.Httpclient]$httpClient=New-Object -TypeName System.Net.Http.Httpclient
    $httpClient.DefaultRequestHeaders.Clear()
    $httpClient.DefaultRequestHeaders.Add("X-Tableau-Auth",$strToken)
    $httpclient.DefaultRequestHeaders.ExpectContinue = $false
    [System.Net.Http.MultipartFormDataContent]$MPcontent = New-Object -TypeName System.Net.Http.MultipartFormDataContent -ArgumentList ($guidBS.ToString())
    [System.Net.Http.StringContent]$stringContent=New-Object -TypeName System.Net.Http.StringContent -ArgumentList ($tmpXML.InnerXml.ToString())
    [System.Net.Http.Headers.ContentDispositionHeaderValue]$contentDispositionHeaderValue=New-Object -TypeName System.Net.Http.Headers.ContentDispositionHeaderValue -ArgumentList "form-data"
    $contentDispositionHeaderValue.Name="`"request_payload`""
    $stringContent.Headers.ContentDisposition = $contentDispositionHeaderValue
    $stringContent.Headers.ContentType=[System.Net.Http.Headers.MediaTypeHeaderValue]::Parse("text/xml")
    $MPcontent.Add($stringContent)
    [Byte[]]$BytArray=[System.IO.File]::ReadAllBytes($strFileName)
    [System.IO.memorystream]$memStream=New-Object -TypeName System.IO.MemoryStream -ArgumentList (,$BytArray)
    $memstream.Seek(0,[System.IO.SeekOrigin]::Begin)
    [System.Net.Http.StreamContent]$streamContent=New-Object -TypeName System.Net.Http.StreamContent -ArgumentList $memStream
    $streamContent.Headers.ContentDisposition=New-Object -TypeName System.Net.Http.Headers.ContentDispositionHeaderValue -ArgumentList "form-data"
    $streamContent.Headers.ContentDisposition.Name="`"tableau_workbook`""
    $streamContent.Headers.ContentDisposition.FileName="`""+(Split-Path -Path $strFileName -leaf)+"`""
    $streamContent.Headers.ContentType=[System.Net.Http.Headers.MediaTypeHeaderValue]::Parse("application/octet-stream")
    $MPcontent.Add($streamContent)
    $MPcontent.Headers.Clear()
    $MPcontent.Headers.TryAddWithoutValidation("Content-Type", "multipart/mixed; boundary=$($guidBS.ToString())")
    $resp=$httpClient.PostAsync("https://your.server/api/2.2/sites/d0356794-bb9d-4c5c-b43d-ec384a2baf5a/workbooks?overwrite=true",$MPcontent)
    $resp.Wait()
    Write-Host $resp.Result.ToString()

  5. How would you add custom data to the header? For example, when using an application where you receive an authentication token and are expected to include it in all subsequent requests' headers. An example curl command would be something like: curl --insecure -s -H "Content-Type: multipart/form-data" -H "Authorization: Bearer $token" https://$vRA/content-management-service/api/packages/validate -F "file=@DukesBankApp.zip" where the header content I want to add is the authorization token.

      1. Awesome, that worked like a champ. I added a new optional input parameter for $token and added this just under the $httpClient creation line.

        if ($token) {
        $httpClient.DefaultRequestHeaders.Add("Authorization", "Bearer " + $token)
        }

        FWIW I also had to change the content name, as the REST endpoint required it to be named "file":

        $contentDispositionHeaderValue.Name = "file"

  6. I think I need to look at applying this to a GET request as well. I have a normal Invoke-WebRequest with a GET method and -OutFile parameters, and whilst that does work, I am downloading large files (3-4 GB) and PowerShell goes crazy on memory use and takes an age to write the file.

    1. OK, I managed to get downloading working with a lot less effort using System.Net.WebClient. However, WebClient has no timeout property!! You need to extend the class. I did try with HttpClient similar to the above, but failed dismally as I could seemingly only stream to memory and then dump to file, thus consuming a large amount of memory. I'm sure there is a way to stream the remote resource directly to a file, I just couldn't figure it out.

      The extension of the WebClient class can be found here – https://koz.tv/setup-webclient-timeout-in-powershell/. There may be a way to do this with native PS classes.

      Oh, I also found a way to get the Base64-encoded username/password for the Authorisation header without needing to run the Get-PlainText function. Again, the authorisation header was needed for me (as above), otherwise you download the large file effectively twice.

      $encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::GetEncoding("iso-8859-1").GetBytes([String]::Format("{0}:{1}", $Credential.UserName, $Credential.GetNetworkCredential().Password)))

      $wc = New-Object ExtendedWebClient
      $wc.Credentials = New-Object System.Net.NetworkCredential($User, $Pass)
      $wc.Headers.Add("Authorization", "Basic $encoded")
      $wc.Timeout = 1800000 # milliseconds – should be 30 mins
      $wc.DownloadFile($resource, $DestinationFile)

      I hope this helps someone.

  7. Hi,

    By trying both versions, I’m getting 400 Bad request error from the server when trying to POST a json file.

    Any clues?

  8. It's interesting that the first one works for me, but the second, more 'efficient' one does not. I am thinking that since my file is a CSV, maybe it cannot read any content.

  9. Hi Mario, I had your code working like a charm in XLDeploy Server version 9.5.1, but since we upgraded to version 9.7.5 the code is not working anymore. I keep getting the following message:

    Invoke-MultipartFormDataUpload : You cannot call a method on a null-valued expression.
    At E:\_Beheer\XLDImportExport\upload_dar.ps1:207 char:21
    + $response = Invoke-MultipartFormDataUpload -InFile $PackagePath -Uri $ur …
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidOperation: (:) [Invoke-MultipartFormDataUpload], RuntimeException
    + FullyQualifiedErrorId : InvokeMethodOnNull,Invoke-MultipartFormDataUpload

    Any idea what could have been changes in the latest version of XLDeploy?

    Thanks, Ronald.

    1. Hi Ronald, I have no idea. Also, I do not have an available instance of XL Deploy to test it. How about poking XebiaLabs support for some help? If not, have you tried a curl call to see if that still works? Maybe they changed something in the API? Let me know.
