I have a script from here; this is the job:
function CaptureWeight {
    Start-Job -Name WeightLog -ScriptBlock {
        filter timestamp {
            $sw.WriteLine("$(Get-Date -Format MM/dd/yyyy_HH:mm:ss) $_")
        }
        try {
            $sw = [System.IO.StreamWriter]::new("$using:LogDir\$FileName$(Get-Date -f MM-dd-yyyy).txt")
            & "$using:PlinkDir\plink.exe" -telnet $using:SerialIP -P $using:SerialPort | TimeStamp
        }
        finally {
            $sw.ForEach('Flush')
            $sw.ForEach('Dispose')
        }
    }
}
I'd like to get this to run against a list of IP addresses, while also having a name associated with each IP so I can set the file name for each file. I was thinking something like $Name = Myfilename and $Name.IP = 1.1.1.1, and using those in place of $FileName and $SerialIP, but I have yet to get anything close to working or to find an example close enough to what I'm trying to do.
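Something like this is roughly what I was picturing, though it's just a sketch of the idea and I never got it working:

$targets = @(
    [pscustomobject]@{ Name = 'Myfilename'; IP = '1.1.1.1' }
    [pscustomobject]@{ Name = 'AnotherName'; IP = '2.2.2.2' }
)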
Thanks
Here is one way you could do it with a hash table, as Theo mentioned in his helpful comment. Be aware that jobs don't have a ThrottleLimit parameter, as opposed to Start-ThreadJob or ForEach-Object -Parallel. Since jobs run in a separate process, as you have already commented, rather than in instances / runspaces, there is no built-in way to control how many jobs can run at the same time. If you want control over this, you would need to code it yourself (see the sketch after the example below).
# define IPs as Key and FileName as Value
$lookup = @{
'1.2.3.4' = 'FileNameForThisIP'
'192.168.1.15' = 'AnotherFileNameForThatIP'
}
# path to the directory containing plink.exe
$plink = 'path\to\plinkdirectory'
# path to log directory
$LogDir = 'path\to\logDirectory'
# serial port
$serialport = 123
$jobs = foreach ($i in $lookup.GetEnumerator()) {
    Start-Job -Name WeightLog -ScriptBlock {
        filter timestamp {
            $sw.WriteLine("$(Get-Date -Format MM/dd/yyyy_HH:mm:ss) $_")
        }
        try {
            $path = Join-Path $using:LogDir -ChildPath ('{0}{1}.txt' -f $using:i.Value, (Get-Date -f MM-dd-yyyy))
            $sw = [System.IO.StreamWriter]::new($path)
            $sw.AutoFlush = $true
            & "$using:plink\plink.exe" -telnet $using:i.Key -P $using:serialPort | TimeStamp
        }
        finally {
            $sw.ForEach('Dispose')
        }
    }
}
$jobs | Receive-Job -AutoRemoveJob -Wait
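If you ever do need to cap how many of these jobs run at once, one possible pattern (just a sketch, not something built into Start-Job) is to wait before launching the next job whenever the number of running jobs has reached your limit:

$throttleLimit = 5
$jobs = foreach ($i in $lookup.GetEnumerator()) {
    # wait while the number of running WeightLog jobs is at the limit
    while ((Get-Job -Name WeightLog -ErrorAction SilentlyContinue |
            Where-Object State -eq 'Running').Count -ge $throttleLimit) {
        Start-Sleep -Milliseconds 200
    }
    Start-Job -Name WeightLog -ScriptBlock {
        # same script block as in the example above
    }
}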
The other alternative to the hash table could be to use a Csv, either from a file with Import-Csv or hardcoded with ConvertFrom-Csv.
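For example, a hardcoded Csv could drive the same loop; this is only a sketch, and the IP / FileName column names are my own choice:

$lookup = @'
IP,FileName
1.2.3.4,FileNameForThisIP
192.168.1.15,AnotherFileNameForThatIP
'@ | ConvertFrom-Csv

$jobs = foreach ($i in $lookup) {
    # same Start-Job script block as above, referencing $using:i.IP and $using:i.FileName
    # instead of $using:i.Key and $using:i.Value
}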