Solved

powerVault memory management

Hello,


Consider this script:

 

$outfile = "C:\Users\me\Documents\VaultPS\compare.tab"
$outline = "Part Number" + ([char]0x0009) + "DESCR" + ([char]0x0009) + "Title" + ([char]0x0009) + "File Name"
$outline | Out-File $outfile
foreach($line in Get-Content C:\Users\me\Documents\VaultPS\parts.tab) {
    $im_key, $im_descr, $im_type = $line -Split ([char]0x0009)
    $files = Get-VaultFiles -Properties @{"Part Number"=$im_key}
    $found = $false
    Foreach($file in $files){
        $outline = $im_key + ([char]0x0009) + $im_descr + ([char]0x0009) + $file.Title + ([char]0x0009) + $file.'File Name'
        $outline | Out-File $outfile -Append
        $found = $true
    }
    if (-Not $found) { 
        $outline = $im_key + ([char]0x0009) + "Not found in Vault" + ([char]0x0009) + " " + ([char]0x0009) + " "
        $outline | Out-File $outfile -Append
    }
}

After $files = Get-VaultFiles -Properties @{"Part Number"=$im_key}, $files may contain anywhere from 2 to 10 files.

I notice that as this script runs, its memory usage grows to several GB.

I am querying ~200,000 files with this script.


Is there some memory management technique I could be using to make this more efficient?


Hi John, Get-VaultFiles is a super convenient cmdlet that makes it very simple to search for and collect files in Vault. The result is a complete list of the matching files, with all their properties, so you can start working with the result set right away and have everything you need at your fingertips. The price you pay for this comfort is performance and memory...


If you want to speed up the search and consume less memory, you can use the Vault API: $vault.DocumentService.FindFilesBySearchConditions(.....). This way you have more control over what is retrieved and how. On our blog we have some examples that show how this function can be used. The samples do not explain the function in detail, but you can understand it from the context. Here are the links (a small sketch follows them below):

https://blog.coolorange.com/2017/05/26/fixing-the-bom-blob/

https://blog.coolorange.com/2016/02/05/check-for-unique-part-number/

https://blog.coolorange.com/2015/09/18/unique-part-number-check/

There are other blog posts as well, so you might just search for FindFilesBySearchConditions.
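
To make this concrete, here is a minimal sketch of a single search call. It assumes a powerVault session where $vault is the connected WebServiceManager; the connection parameters and the part number "100-0001" are placeholders:

Import-Module powerVault
Open-VaultConnection -Server "localhost" -Vault "Vault" -User "Administrator" -Password ""

# Find the property definition ID for "Part Number" on file entities
$propDefs = $vault.PropertyService.GetPropertyDefinitionsByEntityClassId("FILE")
$partNumberDef = $propDefs | Where-Object { $_.DispName -eq "Part Number" }

# Build the search condition: "Part Number" is exactly the given value
$cond = New-Object Autodesk.Connectivity.WebServices.SrchCond
$cond.PropDefId = $partNumberDef.Id
$cond.PropTyp = [Autodesk.Connectivity.WebServices.PropertySearchType]::SingleProperty
$cond.SrchOper = 3   # 3 = "is exactly"
$cond.SrchRule = [Autodesk.Connectivity.WebServices.SearchRuleType]::Must
$cond.SrchTxt = "100-0001"

# A single call returns just one page of results (see the paging note below)
$bookmark = $null
$status = $null
$results = $vault.DocumentService.FindFilesBySearchConditions(@($cond), $null, $null, $false, $true, [ref]$bookmark, [ref]$status)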


Be aware that via the API you get back just a rudimentary file object, without properties. So, if you need the properties, you'll have to get the complete file object via Get-VaultFile -FileId, passing either the master ID (for the latest version) or the file ID (for an exact version).
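
For illustration, a minimal sketch of that second step; it assumes Get-VaultFile accepts a -FileId parameter as described above, and that $results holds the raw file objects returned by the API:

foreach ($rawFile in $results) {
    # Exact version via the file ID; use $rawFile.MasterId instead for the latest version
    $fullFile = Get-VaultFile -FileId $rawFile.Id
    # $fullFile now exposes the properties, e.g. $fullFile.Title and $fullFile.'File Name'
}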

Also, FindFilesBySearchConditions pulls only 1000 elements at a time, so in the examples on the blog you'll see the function called several times within a while loop.
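
Roughly, that paging pattern looks like this ($cond is the search condition from the sketch above; SrchStatus.TotalHits is the total number of matches):

# Collect all pages; each call advances the $bookmark cursor to the next page
$bookmark = $null
$status = $null
$allFiles = @()
while ($null -eq $status -or $allFiles.Count -lt $status.TotalHits) {
    $page = $vault.DocumentService.FindFilesBySearchConditions(@($cond), $null, $null, $false, $true, [ref]$bookmark, [ref]$status)
    if ($null -eq $page) { break }
    $allFiles += $page
}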


I hope this helps. If you need further assistance, let us know and we can provide a more detailed example.


ciao

marco

P.S.: remember to mark this as the answer if it solved your problem.



Hi,


Not sure how to mark your answer. I marked the thread as solved.

Thanks for explaining the performance trade-offs between the coolOrange cmdlets and the Vault API.

I was able to keep memory usage down in my outer loop by adding some garbage collection at the end of each iteration:

 

    # Release references and force a garbage collection at the end of each iteration
    $files = $null
    $file = $null
    [GC]::Collect()
}

 
