Getting Started with Your Load Project
This document guides you through the key steps of a successful load project using a PowerShell automation script that makes the results reproducible and error-free.
The main steps include: importing data into the powerLoad Database, transforming the data with SQL, and exporting a BCP package for import into the target Vault.
We also briefly cover how the powerLoad DB can be transferred between systems.
powerLoad Database
The powerLoad Database is the central interface for receiving data from the external system and preparing it for further high-performance processing.
It is a Microsoft SQL database hosted directly on your Autodesk Data Management Server and it can be extended to fit your specific needs (see the table schema for details).
You first load metadata from your source system into the powerLoad DB, whether that’s a Vault, a legacy PDM system, or an old file store.
The PowerShell cmdlet Import-BcpDatabase can be used to automatically create and populate a new powerLoad DB.
Once the powerLoad DB is filled, it must be completed and refined to fit the target Vault. This includes assigning Vault behaviors such as Category, LifecycleState, and LifecycleDefinition.
Thanks to referential integrity, even operations like moving files to different destination folders or deleting entities with dependencies are safe and straightforward.
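As a hypothetical sketch of such operations (the Files table and FolderPath column follow the schema used in the examples later in this guide, and $db is the connection object returned by Import-BcpDatabase; adjust names to your actual schema):

```powershell
# Hypothetical sketch - assumes the Files/FolderPath schema used elsewhere in
# this guide and a $db object returned by Import-BcpDatabase.

# Move files to a different destination folder by rewriting their folder path:
Invoke-Sqlcmd -ConnectionString $db.ConnectionString -Query @'
UPDATE Files
SET FolderPath = REPLACE(FolderPath, '$/Designs/Old/', '$/Archive/Old/')
WHERE FolderPath LIKE '$/Designs/Old/%'
'@

# Delete entities; cascading foreign keys remove dependent rows as well:
Invoke-Sqlcmd -ConnectionString $db.ConnectionString -Query @'
DELETE FROM Files
WHERE FolderPath LIKE '$/Temp/%'
'@
```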
After the data is correctly prepared, the powerLoad DB can be exported as a BCP package and then imported into the target Vault.
This approach offers significant advantages for large Vault environments: the exported package can even be imported into newer ADMS versions, eliminating the need for time-consuming Vault backups and subsequent upgrade procedures.
Start the PowerShell environment
To get started, we now create a new .ps1 file that automates all steps of the load project, making it reproducible and error-free.
Simply open any PowerShell IDE, such as Windows PowerShell ISE or Visual Studio Code. The powerLoad module now provides cmdlets to load a new powerLoad Database and export it as a BCP package.
Together with the preinstalled Microsoft module SQLPS, or alternatively the newer SqlServer module, you can run SQL commands, backups, and restores directly from PowerShell.
The modules are imported automatically as soon as one of their cmdlets is executed.
1. Importing to powerLoad Database
From Source Vault
When loading data from an existing Vault, use the Autodesk Data Transfer Utility (DTU) to extract Files, Folders, Items & BOMs, and CustomObjects, including all relationships and metadata:
function Export-VaultBcp($vault, $user = 'Administrator', $password = '', $path) {
$path += "$($vault)_$(get-date -Format 'yyyy-MM-dd')"
.'C:\Program Files\Autodesk\Vault Server 2026\ADMS Console\Vaultbcp.exe' EXPORT $path $user `"$password`" $vault /L"$path\export.log" /ID `
/M # <-- an ADMS update may be required for 2025 and 2024!
}
Export-VaultBcp -Vault 'MySourceVault' -Path 'C:\BCP\'
<#
************************************************************
Processing package: C:\BCP\MySourceVault_2025-10-17
************************************************************
http://AutodeskDM/Schemas/Design/CtntSrcPropertyProviders/01/04/2010/
http://AutodeskDM/Schemas/Design/CtntSrcPropertyProviders/01/04/2010/
Preparing database...
...
All finished. Press any key to continue...
#>
$db = Import-BCPDatabase -Package 'C:\BCP\MySourceVault_2025-10-17'
$db.ConnectionString #Returns: Data Source=.\AUTODESKVAULT;Initial Catalog=powerLoad_MySourceVault_2025-10-17;User ID=sa;Password=AutodeskVault@26200
You can download our VaultBcp 2024 if you need a sample BCP package.
Note: Configuration such as security settings and behaviors is not imported, and all bomBlob*.xml files are ignored.
Performance Tip: Autodesk DTU with /M parameter
Make sure the latest ADMS updates are installed on the source system.
The minimum ADMS versions and DTU update requirements for the /M parameter can be found here.
For very large Vaults, this option dramatically reduces export time, as it exports only metadata, not files.
2. Transforming Data
Once the powerLoad DB tables are populated, you can use SQL queries to define exactly how the final Vault should look. For example:
The folder/project structure
Which lifecycle and category each entity gets
How the revision schemes should be defined
Which UDPs are needed and what values they will contain
There is much more, but these are the core points. Once you have defined the desired result, you can look through your tables to identify example objects such as:
idw with category A
idw with category B
ipt
Content Center file
Folder with category A
Folder with category B
Project
…
Using the Microsoft cmdlet Invoke-Sqlcmd, you can efficiently perform bulk updates directly on the SQL Server.
For example, the following command updates the UDP Author for all Vault files located under $/Designs/Padlock (including all subfolders):
Invoke-Sqlcmd -ConnectionString $db.ConnectionString -Query @'
UPDATE Files
SET UDP_Author = 'Max Mustermann'
WHERE FolderPath LIKE '$/Designs/Padlock/%'
'@
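Behaviors for the example objects identified above can be assigned in bulk the same way. The column names below (Category, LifecycleDefinition, LifecycleState, FileName) are assumptions for illustration only; verify them against the actual powerLoad DB table schema:

```powershell
# Hypothetical sketch - column names are assumptions, check the table schema.
# Assign category and lifecycle behaviors to all Inventor drawings:
Invoke-Sqlcmd -ConnectionString $db.ConnectionString -Query @'
UPDATE Files
SET Category = 'Engineering',
    LifecycleDefinition = 'Flexible Release Process',
    LifecycleState = 'Released'
WHERE FileName LIKE '%.idw'
'@
```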
3. XML-Export & Import into Target Vault
If you are happy with the result in your powerLoad Database, you can export it to a new directory.
For testing, you can create a BCP package without links to the actual documents. This allows importing the package without the need for the source files at all.
Simply execute Export-BcpDatabase with or without the -NoSourceFiles argument:
Export-BcpDatabase -Path 'C:\BCP\powerLoad_MySourceVault_2025-10-17' -NoSourceFiles #-NoFSPath
function Import-VaultBcp($path, $sourceFileStore, $vault, $user = 'Administrator', $password = '') {
.'C:\Program Files\Autodesk\Vault Server 2026\ADMS Console\Vaultbcp.exe' IMPORT $path $user `"$password`" $vault /L"$path\import.log" /ID `
/FS:$sourceFileStore # for ADMS 2025 and 2024, this requires updates!
}
Import-VaultBcp -Path 'C:\BCP\powerLoad_MySourceVault_2025-10-17' -Vault 'NewTargetVault' -SourceFileStore 'C:\ProgramData\Autodesk\VaultServer\FileStore\MySourceVault'
<#
************************************************************
Processing package: C:\BCP\powerLoad_MySourceVault_2025-10-17
************************************************************
http://AutodeskDM/Schemas/Design/CtntSrcPropertyProviders/01/04/2010/
http://AutodeskDM/Schemas/Design/CtntSrcPropertyProviders/01/04/2010/
Preparing database...
...
All finished. Press any key to continue...
#>
After a successful import, the powerLoad DB can be dropped:
Invoke-Sqlcmd -ConnectionString $db.ConnectionString -Query @"
USE master
ALTER DATABASE [$($db.Database)] SET SINGLE_USER WITH ROLLBACK IMMEDIATE
DROP DATABASE [$($db.Database)]
"@
Performance Tip: Autodesk DTU with /FS parameter
Make sure the latest ADMS updates are installed on the target system.
The minimum ADMS versions and DTU update requirements for the /FS parameter can be found here.
For very large Vaults, this option greatly improves import performance!
Thanks to the powerLoad DB, it doesn’t matter whether the DTU export from the source Vault was created with or without the /M parameter.
The /FS parameter also works efficiently when the Files table uses absolute file paths.
The Assign Items function requires bomBlob data to exist for all CAD files in the target Vault:
For Files records that were loaded from a source Vault, the original bomBlob*.xml files can be copied into the newly exported BCP package and imported into the target Vault via DTU, since the export preserves the original file IDs.
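A minimal sketch of that copy step, reusing the package paths from the examples above (the relative layout of the bomBlob*.xml files is preserved so the original file IDs still line up):

```powershell
# Hypothetical sketch - paths taken from the examples above; adjust as needed.
$sourcePackage = 'C:\BCP\MySourceVault_2025-10-17'
$targetPackage = 'C:\BCP\powerLoad_MySourceVault_2025-10-17'

Get-ChildItem -Path $sourcePackage -Filter 'bomBlob*.xml' -Recurse | ForEach-Object {
    # Keep the relative folder structure so the original file IDs match
    $relativePath = $_.FullName.Substring($sourcePackage.Length).TrimStart('\')
    $destination  = Join-Path $targetPackage $relativePath
    New-Item -ItemType Directory -Path (Split-Path $destination) -Force | Out-Null
    Copy-Item -Path $_.FullName -Destination $destination
}
```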
However, we recommend regenerating the BOM data afterward using the built-in job Autodesk.Vault.ExtractBOM.Inventor, especially if mapped Vault UDPs have been modified.
Limitation: Incomplete BOM Data when using “Assign / Update Item…”
Even though the Bill of Materials (BOM) is displayed correctly in the “Open Item” dialog after the import, using “Assign / Update Item…” may result in an empty BOM.
To ensure the correct BOM is shown in the “Update Item” dialog, it is recommended to check out and re-check in the primary associated file of the item using Inventor.
In cases where a new file version has been created and no item data is available (e.g. no bomBlob*.xml files were imported), the Autodesk.Vault.ExtractBOM.Inventor job can also be used to regenerate the BOM data.
Developer Notes: Restoring a customer’s powerLoad DB locally
No matter which source system was used for the load, the customer’s powerLoad Database can easily be backed up using the following Microsoft cmdlet:
$db = Import-BcpDatabase ...
$backupFile = "C:\Temp\$($db.Database).bak"
Invoke-Sqlcmd -Query "BACKUP DATABASE [$($db.Database)] TO DISK='$backupFile'" `
-ConnectionString $db.ConnectionString
The resulting .bak file can then be restored on a local development system.
For more information, see the Connect-BcpDatabase example “Restoring customer DB locally”.
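As a sketch, the restore on a local development system could look like this (the instance name .\SQLEXPRESS and the file path are assumptions; refer to the Connect-BcpDatabase example for the supported workflow):

```powershell
# Hypothetical sketch - instance name and paths are examples only.
$backupFile = 'C:\Temp\powerLoad_MySourceVault_2025-10-17.bak'
$database   = [System.IO.Path]::GetFileNameWithoutExtension($backupFile)

Invoke-Sqlcmd -ServerInstance '.\SQLEXPRESS' `
    -Query "RESTORE DATABASE [$database] FROM DISK = '$backupFile' WITH RECOVERY"
```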