How To Validate Lightning Component Forms In Flow Screens

If you are looking to learn how to validate custom Lightning component forms in a flow screen, hopefully this post sheds some light on what can be accomplished and how I went about solving it.

So far, I have found it straightforward to add a custom Lightning component to a flow screen: simply implement the lightning:availableForFlowScreens interface and the component becomes an available element in the flow screen.

It should then appear as an option under the Custom heading of the Screen Elements.

However, adding validation for the form inputs of a Lightning component in a flow screen is a tricky subject, given the limitations of the out-of-the-box functionality.

My requirements are the following:

  • capture user information
  • fields are displayed based on user inputs
  • fields are mandatory
  • only when mandatory fields are completed can you proceed to the next screen
  • ability to navigate back and update any fields

So far, my screen looks like the screenshot when the first question is answered Yes.

Hitting Next just goes straight to the next screen without any validation.

To allow us to do some sort of validation, we can use the Aura.Action attribute type. As per the documentation:

An Aura.Action is a reference to an action in the framework. If a child component has an Aura.Action attribute, a parent component can pass in an action handler when it instantiates the child component in its markup.

I can't find further documentation, but the way I understand it, in the context of flows the flow screen acts as the parent component, and by declaring the attribute we let the flow call a function in the Lightning component.

In flow screens, the function assigned to the Aura.Action attribute returns an object with the following properties:

isValid: Boolean – when false, stops the flow from going to the next screen.
errorMessage: String – a custom error message to display.

To make this work we also need to add an init action handler. The Next button on flow screens calls the action handler, but only via init. (Why only on init? I still don't fully understand.) Hitting Next forces the Lightning component to rerender and call init, and from init we can set the validate function.

From your controller.js, set the component's v.validate attribute.
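
Here is a minimal sketch of that pattern, assuming the component declares <aura:attribute name="validate" type="Aura.Action"/> in its markup and wires up an init handler (the aura:id emailInput is just an illustration):

({
    init : function(component, event, helper) {
        // The flow calls this function when the user hits Next
        component.set('v.validate', function() {
            var input = component.find('emailInput');
            if (input && input.get('v.value')) {
                return { isValid: true };
            }
            return { isValid: false, errorMessage: 'Please complete all required fields.' };
        });
    }
})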

Without any inputs, if I hit Next, I get the error message.

Yay! So far, validation works!

Let’s add some more validation logic.

When I answer some questions and leave others blank, I run into several issues with this validation:

  • when the component rerenders
    • any custom hide/show logic would be lost
    • any inputs are lost
  • the validation error is displayed at the bottom of the screen
  • you can only show one error at a time

To get around these limitations, I ended up building custom validation handling with a combination of flow variables to persist data, design attributes for input/output, sessionStorage to track missing field inputs, and setCustomValidity. If you have date fields or a combobox, you may also need to use afterRender calls.

First, update the .design file for the input fields that need to persist. To keep it simple, we use one attribute per field.

Next, create a flow variable for each attribute, with the "Available for input" and "Available for output" checkboxes set to true.

Here are the final attributes.

Next, assign each flow variable to the corresponding input and output of the custom Lightning component.

Now for the new validation logic. First, I created a mandatoryFields array; for each required field I add its aura:id to the array.

Next, I iterate over the array of fields and check whether each has been populated with a value. Every field with a missing value gets added to a new array called missingFields.

Next, I check the length of missingFields and store the information in sessionStorage.

sessionStorage is like localStorage, but it gets cleared when the session ends. (I will cover storing data locally in another post.)

Then I return isValid set to false and an empty errorMessage (this property is required for errors to be displayed on the page).
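
Put together, the validate function might look something like this sketch (the aura:ids and the sessionStorage key are illustrative):

component.set('v.validate', function() {
    // aura:ids of the required inputs
    var mandatoryFields = ['firstName', 'lastName', 'email'];
    var missingFields = [];

    mandatoryFields.forEach(function(fieldId) {
        var input = component.find(fieldId);
        if (input && !input.get('v.value')) {
            missingFields.push(fieldId);
        }
    });

    if (missingFields.length > 0) {
        // persist the missing fields so the rerendered component can flag them
        sessionStorage.setItem('missingFields', JSON.stringify(missingFields));
        return { isValid: false, errorMessage: '' };
    }

    sessionStorage.removeItem('missingFields');
    return { isValid: true };
});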

So when the Next action is called, it runs init and populates sessionStorage.

Note that I'm using the renderer. Adding a date field threw me off a little, as the DOM is not yet ready right after init when you try to access the inputs to check for missing values. The solution was to put the logic in afterRender. If you only have basic input fields, you can keep the code in init as well.
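
A minimal renderer sketch, assuming the helper method is called showMissingFieldErrors:

({
    // componentRenderer.js – afterRender fires once the DOM is ready
    afterRender : function(component, helper) {
        this.superAfterRender();
        helper.showMissingFieldErrors(component);
    }
})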

I then call a helper method that reads the missing fields from sessionStorage and displays the errors by setting a custom validity message on each input.
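
The helper might look roughly like this (the storage key matches the earlier sketch and the message text is arbitrary):

({
    showMissingFieldErrors : function(component) {
        var stored = sessionStorage.getItem('missingFields');
        if (!stored) {
            return;
        }
        JSON.parse(stored).forEach(function(fieldId) {
            var input = component.find(fieldId);
            if (input && input.setCustomValidity) {
                input.setCustomValidity('Complete this field.');
                input.reportValidity();
            }
        });
    }
})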

And my flow is looking sharper: I am prompted for incomplete fields, I can move to the next screen when everything is complete, and I can go back and update fields if I need to.

The code for this is posted on my GitHub repo.

I’ll be posting a video tutorial as well soon. Stay tuned.

Hope someone finds this post useful. Happy coding!

How To Use Decorators In Lightning Web Components

My goal is to upskill on Lightning Web Components (LWC) and create a course in the next few months. Aside from learning that kebab case is not a food, the other thing that made me scratch my head when I dabbled with LWC was decorators. I used them but didn't fully understand what they are.

Anyway, I'll be talking about decorators because doing this tutorial helps me get a deeper understanding of the topic.

What are decorators?

Decorators are essentially a wrapper around a function: they are used to enhance the functionality of a function without modifying the underlying function. (See higher-order functions.)

The ability to create decorators is part of ECMAScript. Decorators are very widely used in JavaScript through transpilers. For example, see the documentation for core-decorators, ember-decorators, Angular, Stencil, and MobX decorators.

These three decorators are unique to Lightning Web Components.

  • @api – makes a property public.
  • @track – makes a private property reactive
  • @wire – reads Salesforce data or metadata

What are Reactive Properties?

If the value of a reactive property changes, the component rerenders. To make a property reactive, we decorate it with one of these decorators.

Properties that are non-reactive will not cause the component to rerender when their values change.

When to use @api

  • when you want to make a property reactive and accessible outside of the component in which it is declared.
  • the parent (owner) component can set values for the child component's public properties.
  • when we want changes in the property value to rerender any content that references the property.

import syntax

course.js – decorating a property

course.html – template use

courseApp.html – parent component has access and can set child properties
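
As a minimal sketch (the courseName property is purely illustrative), course.js could expose a public property like this:

// course.js – @api makes courseName a public, reactive property
import { LightningElement, api } from 'lwc';

export default class Course extends LightningElement {
    @api courseName = 'Lightning Web Components Basics';
}

course.html can then render {courseName}, and a parent such as courseApp.html can set it through the kebab-case attribute, e.g. <c-course course-name="Advanced Apex">.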

When to use @track

  • when you want to make a property reactive but only within the component.
  • when we don't want the parent component to be able to change the child component's property values.

import syntax

course.js – added new track property

course.html – added new template

courseApp.html – parent setting the course-level to Intermediate has no effect
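
As a sketch of the idea (again with illustrative names), the child adds a private reactive property with @track, which the parent cannot set:

// course.js – courseLevel is reactive but private to this component
import { LightningElement, api, track } from 'lwc';

export default class Course extends LightningElement {
    @api courseName = 'Lightning Web Components Basics';
    @track courseLevel = 'Beginner'; // not public, so course-level="Intermediate" from the parent has no effect
}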

When to use @wire

  • used to read Salesforce data or metadata.
  • we want the component to rerender (be reactive) when the wire service provisions data.

To use the @wire decorator, you need to understand a bit about the wire service and calling Apex methods.

What is the wire service?

  • the wire service provisions an immutable stream of data to the component
    • immutable means it is read-only data. To mutate it you need to make a shallow copy of the object.
  • the wire service delegates control flow to the LWC engine. This is great for read operations, but if you want to do C-U-D (create, update, or delete) operations, we need to use the JavaScript API instead.
  • the wire service provisions data from the client cache if it exists, and fetches from the server if needed.

The wire service syntax and steps

  • import the wire adapter from its adapter module and create an identifier for it
  • decorate a property or function with @wire, passing the adapter identifier and an adapterConfig object
  • create a property or function that receives the stream of data

Importing objects, fields and relationships:

Use the lightning/ui*Api wire adapter modules and import references to objects and fields to make sure the object or field exists. Using the lightning/ui*Api wire adapter modules is recommended because they respect the user's CRUD, FLS, and sharing settings.

@wire decorating a property

@wire decorating a function
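
A minimal sketch using the getRecord adapter from lightning/uiRecordApi (the component name and the Account.Name field are illustrative):

import { LightningElement, api, wire } from 'lwc';
import { getRecord } from 'lightning/uiRecordApi';
import NAME_FIELD from '@salesforce/schema/Account.Name';

export default class AccountHeader extends LightningElement {
    @api recordId;

    // @wire decorating a property: results land on account.data / account.error
    @wire(getRecord, { recordId: '$recordId', fields: [NAME_FIELD] })
    account;

    // @wire decorating a function: the function receives the { data, error } stream instead
    // @wire(getRecord, { recordId: '$recordId', fields: [NAME_FIELD] })
    // wiredAccount({ data, error }) { /* handle the provisioned values */ }
}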

How to use wire service with Apex methods?

  • Apex methods can be called via @wire or imperatively. The syntax is similar, but instead of importing a wire adapter, we import the class method. The streamed data is then received either by a property or a function.
  • a wire adapter provisions new values when they become available, but data provisioned from Apex methods doesn't refresh automatically. To get the new values we need to call refreshApex().

import syntax

expose Apex methods as static and global or public, and annotate them with @AuraEnabled(cacheable=true) – caching improves runtime performance

import the apex method and wire the property or function which will receive the data stream

if you used @wire, data provisioned from an Apex method can get stale; use refreshApex() to get the latest values

importing syntax

usage example – refresh the data that the wire service provisioned via a button click
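
Putting those pieces together, here is a sketch (the CourseController.getCourses Apex method is hypothetical and assumed to be annotated with @AuraEnabled(cacheable=true)):

import { LightningElement, wire } from 'lwc';
import { refreshApex } from '@salesforce/apex';
import getCourses from '@salesforce/apex/CourseController.getCourses';

export default class CourseList extends LightningElement {
    courses;
    error;
    wiredCourses; // the full provisioned value, kept so refreshApex() can re-query it

    @wire(getCourses)
    wiredResult(result) {
        this.wiredCourses = result;
        if (result.data) {
            this.courses = result.data;
        } else if (result.error) {
            this.error = result.error;
        }
    }

    // e.g. wired to a lightning-button onclick in the template
    handleRefresh() {
        return refreshApex(this.wiredCourses);
    }
}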

Well, that wraps up the basics on decorators. Hope you learned something from this post and found it useful.

There are more details you can get from the documentation linked below.

https://developer.salesforce.com/docs/component-library/documentation/en/lwc/lwc.data

How To Set Up CICD On Bitbucket Pipelines With Salesforce DX And Delta Deployment

Learn how to set up CICD delta deployment with Salesforce DX, with tips and tricks for authorisation, setting up Node, and the basic git commands.

I'm revamping our CICD process with Salesforce DX and Bitbucket Pipelines, starting with the following initial setup, which will only do a delta deployment.

Authentication method – authorize an org and grab the sfdx auth URL to be stored as a repository variable in Bitbucket:

sfdx force:auth:web:login 
sfdx force:org:display --verbose

There are two possible token formats:

force://<refreshToken>@<instanceUrl> 
or 
force://<clientId>:<clientSecret>:<refreshToken>@<instanceUrl>

Copy the SFDX Auth URL, which will be in the second format. Create a repository variable AUTH_URL in Bitbucket and store the copied value.

Echo the AUTH_URL to a file, then authenticate with sfdxurl:store:

echo $AUTH_URL >> /tmp/sfdx_auth.txt
sfdx force:auth:sfdxurl:store -f /tmp/sfdx_auth.txt -s -a dxpipeline

Grab the latest sfdx CLI and install it:

wget https://developer.salesforce.com/media/salesforce-cli/sfdx-linux-amd64.tar.xz 
mkdir sfdx-cli 
tar xJf sfdx-linux-amd64.tar.xz -C sfdx-cli --strip-components 1 
./sfdx-cli/install

Next, to compare delta files – there is a Node tool available on GitHub that does a delta comparison between commit hashes or branches. Install the sfdx-git-delta package:

npm install sfdx-git-delta@latest -g

Finally, I incorporated these into my git workflow.

On a pull request – I want to run a delta comparison and do a check-only (validation) deploy, to confirm that my changed delta files are deployable and don't break any unit tests.

First, check out a temporary branch from the feature branch:

git checkout -b some-pr-branch

Next, run the tool to create a delta comparison from that branch to the target branch.

sgd --to some-pr-branch --from origin/staging --repo . --output .

The tool creates package.xml and destructiveChanges.xml files based on the diff, in their respective directories.

Next, convert the source format to mdapi format so we can run a transactional deploy:

sfdx force:source:convert --manifest=package/package.xml --outputdir=convert

After conversion, do a check-only deploy and run the local unit tests:

sfdx force:mdapi:deploy --deploydir=convert -c -l RunLocalTests -w 30

Below is the complete Pull Request script.

image: atlassian/default-image:2

pipelines:
  pull-requests:
    'feature/*': # Pull request from feature branch to Staging
      - step:
          name: "Staging Pull Request Validate Package"
          script:
            - echo "QA Pull Request Validation"
            - wget https://developer.salesforce.com/media/salesforce-cli/sfdx-linux-amd64.tar.xz
            - mkdir sfdx-cli
            - tar xJf sfdx-linux-amd64.tar.xz -C sfdx-cli --strip-components 1
            - ./sfdx-cli/install
            - echo $AUTH_URL >> /tmp/sfdx_auth.txt
            - sfdx force:auth:sfdxurl:store -f /tmp/sfdx_auth.txt -s -a dxpipeline
            - npm install sfdx-git-delta@latest -g
            - git checkout -b some-pr-branch          
            - git --no-pager diff --name-status some-pr-branch  origin/staging
            - sgd --to some-pr-branch  --from origin/staging --repo . --output .
            - echo "--- package.xml generated with added and modified metadata ---"
            - cat package/package.xml
            - sfdx force:source:convert --manifest=package/package.xml --outputdir=convert 
            - echo "---- Validating delta package  ----"
            - sfdx force:mdapi:deploy --deploydir=convert -c -l RunLocalTests -w 30

On a push to the branch – I run similar steps, with the only exceptions being that I compare the current branch to the staging branch and don't do a check-only deploy or run the test classes, as I already ran them during the pull request validation.

Below is the complete Push script.

image: atlassian/default-image:2

pipelines:
  branches:
    staging: 
      - step:
          name: "Deploy to Staging"
          script:
            - echo "Deploy to Staging"
            - wget https://developer.salesforce.com/media/salesforce-cli/sfdx-linux-amd64.tar.xz
            - mkdir sfdx-cli
            - tar xJf sfdx-linux-amd64.tar.xz -C sfdx-cli --strip-components 1
            - ./sfdx-cli/install
            - echo $AUTH_URL >> /tmp/sfdx_auth.txt
            - sfdx force:auth:sfdxurl:store -f /tmp/sfdx_auth.txt -s -a dxpipeline
            - npm install sfdx-git-delta@latest -g
            - git checkout -b dev          
            - git --no-pager diff --name-status dev origin/staging
            - sgd --to dev  --from origin/staging --repo . --output .
            - echo "--- package.xml generated with added and modified metadata ---"
            - cat package/package.xml
            - sfdx force:source:convert --manifest=package/package.xml --outputdir=convert 
            - echo "---- Deploying delta package ----"
            - sfdx force:mdapi:deploy --deploydir=convert -w 30

Hope you find this useful. Hit me up in the comments below if you have any questions.

How To Use Javascript Promises with Lightning Components

JavaScript Promises have been around for a while, but I only recently got the chance to use them on some of the Aura component pieces I started working on.

By analogy, you make a promise and you either fulfil it or break it.

In the JavaScript Promise context, these translate to "resolve", meaning the promise is fulfilled, or "reject", meaning the promise was broken, usually because of an error.

A good use case in Lightning is handling responses from asynchronous operations where you pass a callback function. That callback function can then make another asynchronous operation. Keep on nesting and you can easily end up with what is sometimes called callback hell, as your code becomes hard to manage.

Let's dive into creating a JavaScript Promise.

I defined this as a helper method. It calls an Apex method and, depending on the response state from getState(), I map it to either resolve or reject.
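
Here is a sketch of that helper (the Apex method getAccountList and the helper name are illustrative):

getAccounts : function(component) {
    return new Promise($A.getCallback(function(resolve, reject) {
        var action = component.get('c.getAccountList');
        action.setCallback(this, function(response) {
            var state = response.getState();
            if (state === 'SUCCESS') {
                resolve(response.getReturnValue());   // promise fulfilled
            } else {
                reject(new Error('Apex call failed with state: ' + state)); // promise broken
            }
        });
        $A.enqueueAction(action);
    }));
}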

In the JavaScript controller, I assign a variable to the returned Promise. The helper resolves with one parameter, the response, which I can access in the .then statement.

Here we call .then on the p variable, passing Aura's special callback wrapper, $A.getCallback. You can then chain another promise by calling and returning that promise; the next .then accepts the value resolved by the previous promise.
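
As a sketch of the controller side (the second helper method getContacts and the attribute names are illustrative):

doInit : function(component, event, helper) {
    var p = helper.getAccounts(component);

    p.then($A.getCallback(function(accounts) {
            component.set('v.accounts', accounts);
            // chain another promise by returning it
            return helper.getContacts(component);
        }))
        .then($A.getCallback(function(contacts) {
            // receives the value resolved by the previous promise
            component.set('v.contacts', contacts);
        }))
        .catch($A.getCallback(function(error) {
            console.error(error);
        }));
}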

With JavaScript Promises, this is more readable than nested callbacks and easier to manage.

I hope you find this basic tip useful. Hit the comments below if you have questions.

How To Use Map Object In Aura Lightning Component

According to the Salesforce documentation, you can define several collection types, including a Map.

A Map collection gives you key/value pairs where each key is unique. Declaring one is easy – just add the following:

<aura:attribute type="Map" name="fooMap" />

But in your controller, if you try to do any Map functions such as keys(), set(key, value), values(), you get an error such as:

set is not a function 

or

values is not a function.

What is happening in Lightning is that even if you declare the attribute as a Map, it is treated as a plain Object. It took me a while to figure this out.

To get around this "limitation", I manually assigned a JavaScript Map in the controller, and then I was able to use Map functions. You can do this either on init or before you first use the component's Map.
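
A minimal sketch of the workaround (the keys and values are just examples):

doInit : function(component, event, helper) {
    var fooMap = new Map();            // a real JavaScript Map, not a plain object
    fooMap.set('Beginner', 1);         // Map functions now work
    fooMap.set('Advanced', 2);

    component.set('v.fooMap', fooMap); // assign it back to the attribute

    console.log(fooMap.get('Beginner')); // 1
    console.log([...fooMap.keys()]);     // ['Beginner', 'Advanced']
}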

Hope you find this tip useful.

How To Replace Salesforce Metadata Before Deploying using Ant Scripts

My particular use case is Salesforce Ant deployment. I wanted to replace some metadata before deploying to the target org, which allows me to automate the process: I can fetch metadata from my sandbox org, and when it gets deployed to a target org like production, the values get updated.

As a prerequisite, you should have the latest ant-salesforce.jar. You can grab the latest Ant Migration Tool from here – https://help.salesforce.com/articleView?id=code_tools_ant_using.htm&type=5

My sample script has a conditional check before doing the replace logic on Custom Labels. This is how my build XML looks.

An additional library you need to perform the conditional check is the ant-contrib.jar file. You can grab the latest version from here – http://ant-contrib.sourceforge.net/

If you try to run the script without the library, you might end up with the error below.

Fix Ant Build Error: Problem: failed to create task or type if

On the build.xml, simply add the reference to the library.

If everything is in place (e.g. the properties file has the right credentials), running the following command should deploy your code and replace the values as per your Ant script.

ant -Denvironment=prod -buildfile build.xml deployMetadata

Source code available here – https://github.com/olopsman/salesforce-ant

How To Fix [Cannot read property ‘setParams’ of undefined] for Application Event in Aura Components

Component.cmp

    <aura:registerEvent name="CRMAppEvent" type="c:CRMAppEvent"/>

Controller.js

        var compEvent = $A.get("e.c:CRMAppEvent");
        compEvent.setParams({"type" : response});
        compEvent.fire();

Are you receiving "Cannot read property 'setParams' of undefined" after getting a reference to the event?

This indicates that $A.get("e.c:CRMAppEvent") cannot find the event, so any reference to the setParams method, or to any other property of the event, will be undefined.

To fix this, set your application event's access to global:

CRMAppEvent.evt

<aura:event type="APPLICATION" description="This is the generic Application Event" access="global">
    <aura:attribute name="type" type="String"/>
</aura:event>

How to Fix Salesforce Deployment Errors: InfoNot a valid enumeration for type

If your deployment strategy still revolves around Ant deployment rather than the new Salesforce CLI, you may encounter this error during deployment:

Failed to check the status for request ID=0Af0r00000ClbNgCAJ. Error: InfoNot a valid enumeration for type: class com.sforce.soap.metadata.DeployProblemType. Retrying…

While digging around, I found out this is a tooling issue. Chances are your ant-salesforce.jar library is outdated; you need to grab the latest version from here:

https://developer.salesforce.com/docs/atlas.en-us.daas.meta/daas/forcemigrationtool_install.htm

Update the library on your Ant build path. Check your build.xml for the classpath if you are unsure of the location:

<taskdef resource="com/salesforce/antlib.xml" uri="antlib:com.salesforce">
        <classpath>
            <pathelement location="${basedir}/ant-salesforce.jar" />
        </classpath>
</taskdef>

How to Fix Compilation LWC1010: Failed to resolve entry for module

This LWC1010 compilation error is caused by an invalid reference to a component name. Camel-case component folder names are mapped to kebab-case in the markup when you reference them from other Lightning Web Components. To resolve "Failed to resolve entry for module" errors, check the naming and hyphenation.

E.g. a component folder named myComponent is referenced in markup as my-component:

<my-component></my-component>

Consecutive capital letters are where Visual Studio Code autocomplete and intuition can trip you up. For a component named myCRMApp, you might write:

<c-my-crm-app></c-my-crm-app>

Each uppercase letter gets its own hyphen, so myCRMApp actually becomes my-c-r-m-app:

<c-my-c-r-m-app></c-my-c-r-m-app>

How to Fix Salesforce Deployment Errors: Platform Encryption cannot be enabled for standard fields when Portals are enabled.

Go to the Salesforce Classic UI, search under Setup for "Customer Portal", and disable "Login Enabled". You should move your portal to Communities, where Platform Encryption is supported.