Recently, at work, I found myself in a situation where I needed to copy some files from my workstation to a jump box. Now of course, on Linux I'd just use rsync or scp. But our IT doesn't like provisioning Linux boxes and therefore uses Windows for jump servers too, so no luck there. Luckily, I could convince them to turn on and allow PowerShell Remoting, so with a couple of simple functions I can still easily copy files over without resorting to SMB and yet more back-and-forth with IT.
function Copy-LocalToRemote(
    [Parameter(Mandatory = $true)] $LocalPath,
    [Parameter(Mandatory = $true)] $RemotePath,
    $ComputerName = 'my.default.target.host'
) {
    # Read the local file as raw bytes and write them out on the remote side.
    Invoke-Command -ComputerName $ComputerName `
        {
            param($path, $content)
            Set-Content -Path $path -Value $content -AsByteStream
        } `
        -ArgumentList $RemotePath, (Get-Content $LocalPath -Raw -AsByteStream)
}
function Copy-RemoteToLocal(
    [Parameter(Mandatory = $true)] $RemotePath,
    [Parameter(Mandatory = $true)] $LocalPath,
    $ComputerName = 'my.default.source.host'
) {
    # Read the remote file as raw bytes and stream them into the local file.
    Invoke-Command -ComputerName $ComputerName `
        {
            param($path)
            Get-Content -Path $path -Raw -AsByteStream
        } `
        -ArgumentList $RemotePath |
        Set-Content -Path $LocalPath -AsByteStream
}
New-Alias -Name 'ltr' -Value 'Copy-LocalToRemote'
New-Alias -Name 'rtl' -Value 'Copy-RemoteToLocal'
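With the aliases in place, round-tripping a file looks like this (the paths and the second hostname are just examples):

# Push a local script to the default jump box
ltr -LocalPath .\deploy.ps1 -RemotePath 'C:\Scripts\deploy.ps1'

# Pull a config back from an explicitly named host
rtl -RemotePath 'C:\ProgramData\app\app.config' -LocalPath .\app.config -ComputerName my.other.host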
As you can see, this is quite simple. Obviously, the functions above can only copy one file at a time. Maybe in the future I'll build something that can copy entire file structures recursively. I also haven't spent any time looking at how efficient it is to pass streams this way; in fact, I wouldn't be surprised at all if this performed poorly for large files. But then again, I'm mostly pushing around scripts and config files, so this works just fine.
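If I ever do get around to recursive copies, here's a rough, untested sketch of how that could look by reusing Copy-LocalToRemote (the function name is made up, and opening a fresh connection per file would be slow):

function Copy-LocalTreeToRemote(
    [Parameter(Mandatory = $true)] $LocalPath,
    [Parameter(Mandatory = $true)] $RemotePath,
    $ComputerName = 'my.default.target.host'
) {
    $root = (Resolve-Path $LocalPath).Path
    Get-ChildItem -Path $root -Recurse -File | ForEach-Object {
        # Rebuild each file's relative path under the remote root
        $relative = $_.FullName.Substring($root.Length).TrimStart('\')
        $target = Join-Path $RemotePath $relative
        # Make sure the remote directory exists, then reuse the single-file copy.
        # Note: this opens two remoting connections per file, so it won't be fast.
        Invoke-Command -ComputerName $ComputerName {
            param($dir) New-Item -ItemType Directory -Path $dir -Force | Out-Null
        } -ArgumentList (Split-Path $target)
        Copy-LocalToRemote -LocalPath $_.FullName -RemotePath $target -ComputerName $ComputerName
    }
}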