A few weeks ago we walked the last mile of setting up the CI/CD pipeline on my current project, including the ability to deploy Vlocity components. Someone mentioned that the experience could be useful to other Salesforce folks, particularly if you are not using one of the specialized Salesforce CI/CD tools available on the market.
SFI (Salesforce Industries, a.k.a. Vlocity) components are different from traditional Metadata components: they are represented as records and JSON files inside the platform. I am not sure if Salesforce plans to change that in the future, but right now, in order to deploy these components, you need either the Vlocity Build tool (CLI) or the Vlocity IDX tool (UI).
Assuming that you are already using a tool capable of running SFDX commands (e.g. CircleCI, Travis CI, Jenkins, ...), it is also safe to assume you have Node.js and npm incorporated into the pipeline. So you can easily create a script to install the Vlocity Build Tool. The following is a typical YAML job definition for a CI tool; the important piece is the npm command.
```yaml
deploy-vlocity:
  executor: sfdx/default
  steps:
    - checkout
    - sfdx/install
    - run:
        name: Install Dependencies
        command: |
          sudo npm install --global vlocity
```
Once the tool is installed, the next step is to make sure it is authenticated to your org so that it can modify records and metadata. If you already have an SFDX session, the easiest way is to reuse the same alias you are using to work with standard Salesforce components. For that, we will create a properties file that can be referenced in the upcoming scripts.
```properties
sfdx.username = DevHub
sf.instanceUrl = https://mycompany--vlocityorg.my.salesforce.com
```
As you can see, the file is very simple. You can create different property files for your different source and target environments. With that, we can start executing commands to retrieve and deploy the components using the "vlocity" keyword.
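As a sketch, the per-environment property files might look like the following; the aliases and instance URLs here are placeholders, not values from a real setup:

```properties
# build_source.properties (hypothetical values)
sfdx.username = SourceOrgAlias
sf.instanceUrl = https://mycompany--source.my.salesforce.com
```

```properties
# build_target.properties (hypothetical values)
sfdx.username = TargetOrgAlias
sf.instanceUrl = https://mycompany--target.my.salesforce.com
```

Each file points the tool at one org, so the same job definition can be run against source and target just by swapping the `-propertyfile` argument.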
The full list of commands can be found in the GitHub repo of the Vlocity Build tool, but usually you will end up writing a script like the following to be executed through your CI tool...
```shell
# Source Org
vlocity -propertyfile build_source.properties -job EPC.yaml cleanOrgData
vlocity -propertyfile build_source.properties -job EPC.yaml packExport
vlocity -propertyfile build_source.properties -job EPC.yaml packRetry  # If any errors

# Target Org
vlocity -propertyfile build_target.properties -job EPC.yaml cleanOrgData
vlocity -propertyfile build_target.properties -job EPC.yaml packDeploy
vlocity -propertyfile build_target.properties -job EPC.yaml packRetry  # If any errors
```
Let's break it down: cleanOrgData cleans up duplicate and stale datapack data in the org, packExport retrieves the components defined in the job file into the local project folder, packDeploy pushes that local folder to the target org, and packRetry re-runs only the items that failed in the previous command.
The last and probably the most important piece is the "EPC.yaml" file, where we will specify the components to be retrieved/deployed and some other important parameters.
```yaml
projectPath: ./vlocity
queries: # Queries to get vlocity datapacks
  - VlocityDataPackType: IntegrationProcedure
    query: SELECT Id, vlocity_ins__SubType__c, vlocity_ins__Type__c FROM vlocity_ins__OmniScript__c WHERE vlocity_ins__IsActive__c = true AND vlocity_ins__IsProcedure__c = true AND Name = 'My Integration Procedure'
manifestOnly: true
oauthConnection: true
activate: true
```
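If you prefer to pin exact components rather than query for them, the job file also supports a manifest section. The sketch below assumes a hypothetical Integration Procedure key in the Type_SubType format; check the key format your export actually produces before relying on it:

```yaml
projectPath: ./vlocity
manifest:
  # Hypothetical datapack key (Type_SubType of the Integration Procedure)
  - IntegrationProcedure/MyType_MySubType
activate: true
```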
One last thing worth mentioning is that you only need to include the top-level component (e.g. an Integration Procedure that encapsulates multiple DataRaptors); any dependencies will be pulled and pushed automatically between the source and target orgs.
In our case, developers tell the pipeline about new components to deploy between environments by adding queries to the EPC.yaml file. But as mentioned before, other options could work better depending on your workflow. For example, you can commit the JSON metadata and records directly into the ./vlocity folder and skip the "packExport" command.
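Under that workflow, a minimal deploy step could look like the sketch below; it assumes the exported datapack JSON files have already been committed under ./vlocity by the developers:

```shell
# Datapacks previously committed to the repo, e.g.:
#   ./vlocity/IntegrationProcedure/MyType_MySubType/...
# Deploy straight to the target org, skipping packExport:
vlocity -propertyfile build_target.properties -job EPC.yaml packDeploy
vlocity -propertyfile build_target.properties -job EPC.yaml packRetry  # If any errors
```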