Data Modelling Tips for Salesforce

What is a Data Model?

  • a way to represent what database tables look like in a form that makes sense to the business
  • made up of data entities – the objects or concepts we want to track data about
  • types of data model
    • conceptual – high-level business structures and concepts
    • logical – establish entities, their attributes and relationships
    • physical – internal schema database design

What are objects?

In Salesforce we think of database tables as objects and of rows as records – an abstraction.

Objects in Salesforce are containers for information, and they drive how the user interface is built as well.

  • Standard objects – objects included with Salesforce.
    • Business objects
      • Account
      • Contact
      • Lead
      • Opportunity
  • Custom Objects – specific objects created for your company or industry
  • External Objects
  • Platform Events – an event-bus abstraction (comparable to Kafka)
  • Big Objects

Custom Fields are the abstraction for columns or attributes.

  • each field has its own data type
    • checkbox
    • text
    • date or datetime
    • formula – automatically calculated based on a formula you write
    • etc.

Creating a record is the same as adding a new row.
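
A minimal sketch of this abstraction in plain JavaScript (object and field names are hypothetical):

```javascript
// An object is a table, a record is a row.
const accountObject = []; // the "Account" object, i.e. a database table

function createRecord(object, fields) {
  // Creating a record is the same as inserting a new row.
  const record = { Id: "00" + (object.length + 1), ...fields };
  object.push(record);
  return record;
}

const acme = createRecord(accountObject, { Name: "Acme", Industry: "Manufacturing" });
console.log(accountObject.length); // 1 row in the table
```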

What are object relationships?

A special field type that connects two objects together – an abstraction over primary and foreign keys. The relationship type determines how Salesforce handles data deletion, record sharing, and required fields on page layouts.

  • Master-detail
    • closely links objects together – the master record controls certain behaviour of the detail and subdetail records
    • deleting the detail keeps the master intact
    • deleting the master deletes the detail (child)
    • by default, detail records can’t be reparented unless ‘Allow reparenting’ is enabled on the field definition
    • the Owner field on the detail/subdetail is the master record’s owner
    • the detail and subdetail inherit security settings from the master
    • each custom object can have up to 2 master-detail relationships and up to 25 total relationships
    • broken permissions exist when the child has a permission the parent should also have
      • Salesforce updates the parent entity on first save for the profile (the master gets the permission)
  • Many-to-Many
    • implemented via a junction object
    • allows each record of one object to be linked to multiple records of another object, and vice versa
  • Lookup Relationships
    • can be one-to-one or one-to-many
    • can link an object to itself (self relationship), except for the User object, which uses the hierarchical relationship instead
    • similar to master-detail, except without sharing inheritance or roll-up summaries
    • limitation: can’t look up to the Campaign Member object
    • Lookup setup options:
      • make lookup field required
      • clear value of this field
      • don’t allow deletion of the lookup record that’s part of the lookup relationship
      • delete this record also
        • this can result in a cascade-delete that bypasses security and sharing settings
        • a user can delete the target record and cascade-delete the records related to it, even though they have no access to those records
        • contact Salesforce support to enable cascade-delete
      • deleting the parent record in a lookup relationship is not captured by field history tracking
      • a lookup object with more than 100k records can’t be deleted; first delete the appropriate child records
  • External Lookup
    • links a child standard, custom or external object to a parent external object
    • the external ID field on the parent external object is matched against the values of the child’s external lookup relationship field
  • Indirect Lookup
    • links a child external object to standard or custom objects
    • the parent object’s custom, unique external ID field must match the values of the child’s indirect lookup relationship field
  • Hierarchical
    • a special lookup relationship available only for the User object


  • limitation: a custom object can have up to two master-detail relationships and many lookup relationships
  • converting relationships
    • master-detail to lookup
      • only possible when there are no roll-up summary fields on the master
      • the child’s OWD changes to Public Read/Write
    • lookup to master-detail
      • only possible when all lookup fields contain a value
      • the child’s OWD changes to Controlled by Parent
  • self relationships
    • a single record can’t be linked directly to itself
    • a record can, however, indirectly relate to itself
    • you can’t create a many-to-many self-relationship
  • icons
    • the icon shown for a relationship is based on the related object’s tab style
  • master-detail relationships
    • creating multilevel master-detail relationships requires the “Customize Application” user permission
    • records by default can’t be reparented; only an admin can
    • up to 3 custom detail levels
    • standard objects can’t be on the detail side of a master-detail relationship with a custom object
    • multilevel master-detail relationships do not support division transfer
    • you can’t create a master-detail relationship if the object already contains records
      • you can create a lookup relationship first, then convert it to master-detail
    • roll-up summaries work from the detail, but the master can’t roll up the subdetail directly
      • create a roll-up on the detail that summarises the subdetail
    • junction objects cannot be the master
    • the deletion sequence has an impact
      • if you delete the detail and later delete the master, you cannot undelete the detail
    • a Metadata API deployment that includes a master-detail relationship deletes all detail records in the Recycle Bin in the following cases:
      • if you soft-delete detail records before deploying, the bin is emptied after the deploy
      • if you convert a lookup to master-detail, the detail records must reference a master or be soft-deleted; after the deploy the bin is emptied
    • don’t exceed 10k records in a master-detail relationship
  • many-to-many relationship
    • a junction object record is deleted when either associated master record is deleted
    • if both masters are deleted, the junction record can’t be restored
    • OWD and object permissions should match
      • with a Read/Write master OWD, the user must have Read/Write object permission on both objects
    • a user can’t delete a master record if there are more than 200 junction object records, or if the junction object has a roll-up summary to the other parent
    • the first master-detail relationship becomes the primary relationship
      • this affects the look and feel; record ownership is inherited from the primary
      • division is also inherited from the primary
    • the secondary relationship can become primary if the first primary is deleted or converted to a lookup
  • relationship to external objects
    • only support lookup, external lookup and indirect lookup relationship
    • relationships that involve external objects let users create child records; however, the relationship field on each new child isn’t automatically populated with the parent
    • syncing doesn’t create relationship fields on external objects; however, you can change the field type of a synced field to lookup, external lookup or indirect lookup
      • change the text field that identifies the foreign key (the parent’s primary key)
      • a relationship field is a type of custom field; it can be overwritten when you sync external objects – see the considerations
      • cascade-delete is not available for external object
      • Salesforce classic
        • indirect lookup relationship fields don’t display the expected record name (e.g. Goldman Sachs); instead they show the value of the target external ID field
        • the same applies to external lookups – list views show the value of the external ID
          • the detail page displays the name as expected, but only based on a previously retrieved parent record
      • lookup search isn’t available; enter the external ID manually
      • external lookup and indirect lookup relationships appear as links to the parent record. If the user has no access to the parent record, plain text appears instead of a link
      • lookup filters not available
      • indirect lookup relationships can only be created on external objects
      • only objects with a unique external ID field are available as parent objects in an indirect lookup
      • in case-sensitive scenarios, make sure case sensitivity also applies to the parent object field definition
    • impact of relationships on reports
      • lookup relationships allow data from two related objects to be joined in one report
      • master-detail relationships allow data from three objects to be joined in one report (master, detail, and another lookup)
      • many-to-many provide two standard report types
        • primary master with junction object and secondary master
        • secondary master with junction object and primary master
        • the order of the master objects is important
            • it determines the scope of records that can be displayed
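
The junction-object pattern above can be sketched in plain JavaScript. Objects and names here (Course, Student, enrollments) are hypothetical stand-ins for the three Salesforce objects involved:

```javascript
// A junction object links two masters, like Salesforce's many-to-many pattern
// (e.g. a CourseEnrollment object joining Course and Student).
const courses = [{ id: "c1", name: "Apex Basics" }];
const students = [
  { id: "s1", name: "Ada" },
  { id: "s2", name: "Lin" },
];

// Each junction record carries the two master-detail "foreign keys".
const enrollments = [
  { courseId: "c1", studentId: "s1" },
  { courseId: "c1", studentId: "s2" },
];

// Resolve the students on a course through the junction object.
function studentsForCourse(courseId) {
  return enrollments
    .filter((e) => e.courseId === courseId)
    .map((e) => students.find((s) => s.id === e.studentId).name);
}

console.log(studentsForCourse("c1")); // [ 'Ada', 'Lin' ]
```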

Best Practices

  • Use meaningful, unique names for objects and fields – it improves clarity
  • Help out your users – include descriptions for your custom objects and fields
  • Require fields when necessary – enforcing required fields helps keep data clean
  • Don’t exceed 10k child records for master-detail relationships

Journey to Salesforce CTA Starts Again Now

Back in 2018, while I was an employee, I started a journey to prepare for the Salesforce CTA. At that time I went through all the prerequisite certifications and smashed them one after the other. Then at the start of 2019 my focus changed and I gave up on the idea of becoming a Salesforce CTA. My focus shifted to growing assets for my company by building a suite of products: mobile apps, Amazon Merch t-shirt designs, vector illustrations, acquiring niche blogs and other projects, some still in the backlog. I did all of this while keeping a 40-hour work week as a Salesforce consultant.

Back to today, 2021. I learned a lot over the past few years. These assets now generate passive income that supplements my income as a Salesforce consultant. Now I feel I can focus on the journey to CTA again. Starting this week I will be sharing my experience on the journey to CTA. #JourneyToCTA

How To Validate Lightning Component forms In Flow Screens

If you are looking to learn how to validate custom Lightning component forms in a flow screen, then hopefully this post can shed some light on what can be accomplished and how to work around the limitations.

So far, I found it straightforward to add a custom Lightning component to a flow screen: simply declare that the component implements “lightning:availableForFlowScreens” and it becomes an available element in the flow screen.

The component should then appear as an option under the Custom heading of the Screen Elements.
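
For reference, the component markup might look roughly like this (the component name is hypothetical):

```xml
<!-- myFlowForm.cmp – hypothetical component name -->
<aura:component implements="lightning:availableForFlowScreens" access="global">
    <!-- form inputs go here -->
</aura:component>
```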

However, adding validation for the form inputs of a Lightning component on a flow screen is a tricky subject, given the limitations of the out-of-the-box functionality.

My requirements are the following:

  • capture user information
  • fields are displayed based on user inputs
  • some fields are mandatory
  • only when the mandatory fields are completed can you proceed to the next screen
  • ability to navigate back and update any fields.

So far my screen looks like the screenshot when the first question is answered Yes.

Hitting Next goes straight to the next screen without any validation.

To allow us to do some sort of validation, we use the Aura.Action attribute type. As per the documentation:

An Aura.Action is a reference to an action in the framework. If a child component has an Aura.Action attribute, a parent component can pass in an action handler when it instantiates the child component in its markup.

I can’t find further documentation, but the way I understand it, in the context of flows the flow screen acts as the parent component, and it can call a function in the Lightning component once we declare the attribute.

The Aura.Action type returns the following object:

isValid: Boolean – when false, stops the flow from going to the next screen.
errorMessage: String – a custom error message.

To make this work we also need to add an init action handler. The “Next” button on flow screens calls the action handler, but only on init. (Why only on init? I still don’t fully understand.) Hitting Next forces the Lightning component to rerender and call init, and from init we can set the validate object.

From your controller.js, set the component’s “v.validate” attribute.
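
Since the original code is shown as a screenshot, here is a sketch of the wiring pattern, with a stubbed `component` object so it runs outside Aura (inside a real component, the object literal is the Aura controller):

```javascript
// Stub for the Aura component's attribute store.
const attrs = {};
const component = {
  set: (name, value) => { attrs[name] = value; },
  get: (name) => attrs[name],
};

const controller = {
  init: function (cmp) {
    // The flow engine calls whatever function is stored in v.validate
    // when the user clicks Next.
    cmp.set("v.validate", function () {
      const allFilled = false; // placeholder for the real field checks
      return allFilled
        ? { isValid: true }
        : { isValid: false, errorMessage: "Please complete all required fields." };
    });
  },
};

controller.init(component);
console.log(component.get("v.validate")().isValid); // false
```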

Without any inputs, if I hit Next.

Yey! So far validation works!

Let’s add some more validation logic.

When I answer some questions and leave others blank, I run into several issues with this validation:

  • when the component rerenders
    • any custom hide/show logic would be lost
    • any inputs are lost
  • validation error location is at the bottom of the screen
  • you can only show one error at a time

To get around these limitations, I ended up building custom validation handling: a combination of flow variables to persist data, design attributes for input/output, sessionStorage to track missing field inputs, and fieldCustomValidity; and if you have date fields or a combobox, you may need afterRender calls.

First, update the .design file for the input fields that need to persist. To keep it easy we use one attribute per field.

Next, create flow variables for each attribute and tick the ‘Available for input’ and ‘Available for output’ checkboxes.

Here are the final attributes.

Next, assign each flow variable to the custom Lightning component’s input and output.

Now for the new validation logic. First I created an array called mandatoryFields. For each required field I add its aura:id to the array.

Next I iterate over the array of fields and check whether each has been populated with a value. Each missing value I add to a new array called missingFields.

Next I check the length of missingFields and store the information in sessionStorage.

sessionStorage is like localStorage, but it gets cleared when the session ends. (I’ll cover storing data locally in another post.)

Then I call the return statement, setting isValid to false and an empty errorMessage (this property is required to display errors on the page).
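
The steps above can be sketched as runnable plain JavaScript, with hypothetical aura:ids and stubs for `component.find()` and sessionStorage so it works outside Aura:

```javascript
// Stub for the browser's sessionStorage.
const sessionStorageStub = {
  store: {},
  setItem(key, value) { this.store[key] = value; },
  getItem(key) { return this.store[key]; },
};

const mandatoryFields = ["firstName", "lastName", "email"]; // aura:ids
const inputValues = { firstName: "Jane", lastName: "", email: "" }; // stubbed inputs
const find = (auraId) => ({ get: () => inputValues[auraId] }); // stub for component.find()

function validate() {
  // Collect every mandatory field without a value.
  const missingFields = mandatoryFields.filter((id) => !find(id).get("v.value"));
  sessionStorageStub.setItem("missingFields", JSON.stringify(missingFields));
  if (missingFields.length > 0) {
    // errorMessage must be present (even if empty) for errors to display
    return { isValid: false, errorMessage: "" };
  }
  return { isValid: true };
}

console.log(validate()); // { isValid: false, errorMessage: '' }
```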

So when the Next action is called, it runs init and sets the sessionStorage.

Note that I’m using the renderer. Adding a date field threw me off a little: the DOM is not yet ready at init, so the inputs can’t be accessed when checking for missing values. The solution was to put the check in afterRender. But if you only have basic input fields, you can keep the code in init.

I call a helper method which displays the errors by reading sessionStorage for the missing fields, and I set fieldCustomValidity.

And my flow is looking sharper: I am prompted for incomplete fields, I can move to the next screen when complete, and I can go back and update if I need to.

The code for this is posted on my GitHub repo.

I’ll be posting a video tutorial as well soon. Stay tuned.

Hope someone finds this post useful. Happy coding!

How To Use Decorators In Lightning Web Components

My goal is to upskill on Lightning Web Components (LWC) and create a course in the next few months. Aside from learning that kebab case is not a food, the other thing that made me scratch my head when I dabbled with LWC was decorators. I used them but didn’t fully understand what they are.

Anyway, I’ll be talking about decorators because I want to get a deeper understanding of the topic by writing this tutorial.

What are decorators?

Decorators are essentially wrappers around a function: they enhance its functionality without modifying the underlying function. (See Higher-Order Functions.)
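
A quick illustration of the idea as a plain higher-order function (names are made up):

```javascript
// "logged" decorates a function with logging without modifying it.
function logged(fn) {
  return function (...args) {
    const result = fn(...args);
    console.log(`${fn.name}(${args.join(", ")}) -> ${result}`);
    return result;
  };
}

function add(a, b) {
  return a + b;
}

const loggedAdd = logged(add); // the "decorated" function
loggedAdd(2, 3); // logs: add(2, 3) -> 5
```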

The ability to create decorators is part of an ECMAScript proposal, and decorators are already widely used in JavaScript through transpilers. For example, see the documentation of core-decorators, ember-decorators, Angular, Stencil, and MobX decorators.

These three decorators are unique to Lightning Web Components.

  • @api – makes a property public and reactive.
  • @track – makes a private property reactive.
  • @wire – reads Salesforce data or metadata.

What are Reactive Properties?

If the value of a reactive property changes, the component rerenders. To make a property reactive, we decorate it with one of these decorators.

Non-reactive properties are properties whose value changes do not cause the component to rerender.

When to use @api

  • when you want a property to be reactive and accessible outside the component it is declared in.
  • when the parent (owner) component should be able to set values on the child component’s public properties.
  • when we want changes in the property value to rerender any content that references the property.

import syntax

course.js – decorating a property

course.html – template use

courseApp.html – parent component has access and can set child properties
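
Since the original examples are screenshots, here is a rough, hypothetical reconstruction of the pattern:

```js
// course.js – a public (reactive) property
import { LightningElement, api } from 'lwc';

export default class Course extends LightningElement {
    @api courseLevel = 'Beginner';
}
```

```html
<!-- courseApp.html – the parent can set the child's public property -->
<c-course course-level="Advanced"></c-course>
```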

When to use @track

  • when you want a property to be reactive, but only within the component.
  • when we don’t want the parent component to change the child component’s property values.

import syntax

course.js – added new track property

course.html – added new template

courseApp.html – parent setting the course-level to Intermediate has no effect

When to use @wire

  • used to read Salesforce data or metadata.
  • we want the component to rerender (be reactive) when the wire service provisions data.

To use the @wire decorator you need to understand a bit about the wire service and calling Apex methods.

What is the wire service?

  • the wire service provisions an immutable stream of data to the component
    • immutable means the data is read-only; to mutate it, make a shallow copy of the object
  • the wire service delegates control flow to the LWC engine. This is great for read operations, but for C-U-D operations (create, update, delete) we need to use the JavaScript API imperatively instead
  • the wire service provisions data from the client cache if it exists, and fetches from the server when needed

The wire service syntax and steps

  • import the wire adapter from its adapter module and create an identifier for it
  • decorate a property or function with @wire, passing the adapter identifier and an adapterConfig object
  • the decorated property or function receives the stream of data

Importing objects, fields and relationships:

Use the lightning/ui*Api wire adapter modules and import references to objects and fields to make sure the object or field exists. The lightning/ui*Api wire adapter modules are recommended because they respect the user’s CRUD, FLS, and sharing access.

Importing objects, fields and relationships

@wire decorating a property

@wire decorating a function

How to use wire service with Apex methods?

  • Apex methods can be called via @wire or imperatively. The syntax is similar, but instead of importing a wire adapter, we import the class method. The streamed data is then received by either a property or a function.
  • a wire adapter provisions new values when they become available; data provisioned from Apex methods doesn’t refresh on its own. To get new values, call refreshApex().

import syntax

expose Apex methods as static and global or public, and annotate them with @AuraEnabled(cacheable=true) – caching improves runtime performance

import the Apex method and wire the property or function that will receive the data stream

if you used @wire, Apex-provisioned data can get stale; use refreshApex() to get the latest values

importing syntax

usage example – refresh the data that the wire service provisioned via a button click
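
A hypothetical sketch of this pattern (class, method and property names are made up):

```js
import { LightningElement, wire } from 'lwc';
import { refreshApex } from '@salesforce/apex';
import getCourses from '@salesforce/apex/CourseController.getCourses';

export default class CourseList extends LightningElement {
    wiredCourses; // hold the full provisioned value for refreshApex

    @wire(getCourses)
    wired(value) {
        this.wiredCourses = value;
        const { data, error } = value;
        if (data) { this.courses = data; }
    }

    handleRefresh() {
        // re-provisions the possibly stale Apex data
        return refreshApex(this.wiredCourses);
    }
}
```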

Well, that wraps up the basics on decorators. I hope you learned something from this post and found it useful.

There are more details that you can get out from the documentation linked below.

How To Set Up CICD On Bitbucket Pipelines With Salesforce DX And Delta Deployment

Learn how to set up CICD delta deployments with Salesforce DX, with tips and tricks for authorisation, setting up Node, and the basic git commands.

I’m revamping our CICD process with Salesforce DX and Bitbucket Pipelines. The initial setup below performs only a delta deployment.

Authentication method – authorise an org and grab the sfdxurl, which is stored as a repository variable in Bitbucket:

sfdx force:auth:web:login 
sfdx force:org:display --verbose

The output shows two token types.

Copy the Sfdx Auth Url, which is the second type. Create a repository variable AUTH_URL in Bitbucket and store the copied value.

Echo the AUTH_URL to a file, then authenticate with sfdxurl:store:

echo $AUTH_URL >> /tmp/sfdx_auth.txt
sfdx force:auth:sfdxurl:store -f /tmp/sfdx_auth.txt -s -a dxpipeline

Grab the latest sfdx CLI tool and install it:

mkdir sfdx-cli 
tar xJf sfdx-linux-amd64.tar.xz -C sfdx-cli --strip-components 1 

Next, to compare delta files – there is a Node tool available on GitHub that does a delta comparison between commit hashes or branches. Install the sfdx-git-delta package:

npm install sfdx-git-delta@latest -g

Finally, I incorporated these into my git workflow.

On a pull request – I want to run a delta comparison and do an empty check only, confirming that my delta file changes are deployable and don’t break any unit tests.

First, check out a temporary branch from the feature branch:

git checkout -b some-pr-branch

Next, run the tool to create a delta comparison from that branch to the target branch:

sgd --to some-pr-branch --from origin/staging --repo . --output .

The tool creates a package.xml and destructiveChanges.xml based on the diff, in their respective directories.

Next, convert the source format to mdapi format so we can run a transactional deploy:

sfdx force:source:convert --manifest=package/package.xml --outputdir=convert

After conversion, do an empty check deploy and run the unit tests:

sfdx force:mdapi:deploy --deploydir=convert -c -l RunLocalTests -w 30

Below is the complete Pull Request script.

image: atlassian/default-image:2

pipelines:
  pull-requests:
    'feature/*': # Pull request from feature branch to Staging
      - step:
          name: "Staging Pull Request Validate Package"
          script:
            - echo "QA Pull Request Validation"
            - wget
            - mkdir sfdx-cli
            - tar xJf sfdx-linux-amd64.tar.xz -C sfdx-cli --strip-components 1
            - ./sfdx-cli/install
            - echo $AUTH_URL >> /tmp/sfdx_auth.txt
            - sfdx force:auth:sfdxurl:store -f /tmp/sfdx_auth.txt -s -a dxpipeline
            - npm install sfdx-git-delta@latest -g
            - git checkout -b some-pr-branch          
            - git --no-pager diff --name-status some-pr-branch  origin/staging
            - sgd --to some-pr-branch  --from origin/staging --repo . --output .
            - echo "--- package.xml generated with added and modified metadata ---"
            - cat package/package.xml
            - sfdx force:source:convert --manifest=package/package.xml --outputdir=convert 
            - echo "---- Validating delta package  ----"
            - sfdx force:mdapi:deploy --deploydir=convert -c -l RunLocalTests -w 30

On a push to the branch – I run similar steps, with the exception that I compare the current branch to the staging branch, and I don’t do an empty check or run the test classes, as I already ran them.

Below is the complete Push script.

image: atlassian/default-image:2

pipelines:
  branches:
    dev: # branch name assumed from the checkout step below
      - step:
          name: "Deploy to Staging"
          script:
            - echo "Deploy to Staging"
            - wget
            - mkdir sfdx-cli
            - tar xJf sfdx-linux-amd64.tar.xz -C sfdx-cli --strip-components 1
            - ./sfdx-cli/install
            - echo $AUTH_URL >> /tmp/sfdx_auth.txt
            - sfdx force:auth:sfdxurl:store -f /tmp/sfdx_auth.txt -s -a dxpipeline
            - npm install sfdx-git-delta@latest -g
            - git checkout -b dev          
            - git --no-pager diff --name-status dev origin/staging
            - sgd --to dev  --from origin/staging --repo . --output .
            - echo "--- package.xml generated with added and modified metadata ---"
            - cat package/package.xml
            - sfdx force:source:convert --manifest=package/package.xml --outputdir=convert 
            - echo "---- Validating delta package  ----"
            - sfdx force:mdapi:deploy --deploydir=convert -w 30

Hope you find this useful. Hit me up on the comments below for any questions.

How To Use Javascript Promises with Lightning Components

JavaScript Promises have been around for a while, but I only recently got the chance to use them on some of the Aura component pieces I started working on.

By analogy: you make a promise, and you either fulfil it or break it.

In the JavaScript Promises context, these translate to “resolve”, meaning the promise is fulfilled, or “reject”, meaning the promise was broken, possibly by an error.

A good use case in Lightning is handling responses from asynchronous operations where you pass a callback function. That callback can then make another asynchronous operation; keep nesting and you can eventually end up with what is sometimes called callback hell, as your code becomes hard to manage.

Let’s dive into creating a JavaScript Promise.

I defined this as a helper method. It calls an Apex method and, depending on the response and getState, I map it to either resolve or reject.

Here I assigned a variable to the returned Promise in the JavaScript controller. The helper resolves with one parameter, the response, which I can access in the .then statement.

Here we call the p variable followed by a .then and the special callback wrapper for Aura, $A.getCallback. You can then chain another promise by calling and returning that promise. The next .then accepts a parameter from the previous promise.
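
Since the code is shown as screenshots, a sketch of the helper pattern might look like the following. The helper name and the fake action are my own; the fake stands in for `$A.enqueueAction` and the Aura response object so the sketch runs outside Aura:

```javascript
// Wrap a server action in a Promise, mapping getState to resolve/reject.
function serverSideCall(action) {
  return new Promise(function (resolve, reject) {
    action.setCallback(null, function (response) {
      var state = response.getState();
      if (state === "SUCCESS") {
        resolve(response.getReturnValue());
      } else {
        reject(new Error("Server call failed: " + state));
      }
    });
    // In a real Aura component this would be $A.enqueueAction(action);
    action.enqueue();
  });
}

// Minimal fake action so the helper can be exercised outside Aura.
function fakeAction(state, value) {
  var callback;
  return {
    setCallback: function (_scope, cb) { callback = cb; },
    enqueue: function () {
      callback({ getState: () => state, getReturnValue: () => value });
    },
  };
}

serverSideCall(fakeAction("SUCCESS", 42))
  .then((v) => console.log("resolved with", v)) // logs: resolved with 42
  .catch((e) => console.log(e.message));
```

In a real controller you would chain `.then($A.getCallback(...))` so the framework rerenders correctly after the asynchronous step.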

With JavaScript Promises, this is more readable than nested callbacks and easier to manage.

I hope you find this basic tip useful. Hit the comments below if you have questions.

How To Use Map Object In Aura Lightning Component

Per the Salesforce documentation, you can define several collection types, including a Map.

A Map collection allows you to have key/value pairs where each key is unique. Declaring one is easy – add the following:

<aura:attribute type="Map" name="fooMap" />

But in your controller, if you try to use any Map functions such as keys(), set(key, value) or values(), you get an error such as:

set is not a function 


values is not a function.

What is happening is that even though you declared the attribute as a Map, Lightning treats it as a plain Object. It took me a while to figure this out.

To get around this “limitation”, I manually assigned a Map in the controller, and then I was able to use the Map functions. You can do this either on init or before you use the component Map.
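
A sketch of the workaround in plain JavaScript (the attribute value and keys are hypothetical):

```javascript
// Lightning hands back a plain Object, so rebuild a real Map from it
// before calling Map functions.
const fromComponent = { AU: "Sydney", NZ: "Auckland" }; // what v.fooMap really holds

const fooMap = new Map(Object.entries(fromComponent));
fooMap.set("US", "Austin"); // Map functions now work

console.log([...fooMap.keys()]); // [ 'AU', 'NZ', 'US' ]
```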

Hope you find this tip useful.

How To Replace Salesforce Metadata Before Deploying using Ant Scripts

My particular use case is Salesforce Ant deployments. I wanted to replace some metadata before deploying to the target org, which allows me to automate the process: I can fetch metadata from my sandbox org, and when it gets deployed to a target org like production, the values are updated.

As a prerequisite, you should have the latest ant-salesforce.jar. You can grab the latest Ant Migration Tool from here –

My sample script performs a conditional check before doing the replace logic with Custom Labels. This is how my build XML looks.
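
A minimal sketch of what such a build.xml target might look like – property names, the label token and file paths are assumptions, not the author's actual script:

```xml
<project xmlns:sf="antlib:com.salesforce" default="deployMetadata" basedir=".">
    <!-- ant-contrib provides the <if>/<then> tasks -->
    <taskdef resource="net/sf/antcontrib/antlib.xml" classpath="${basedir}/ant-contrib.jar"/>
    <property file="build.properties"/>

    <target name="deployMetadata">
        <if>
            <equals arg1="${environment}" arg2="prod"/>
            <then>
                <!-- swap the sandbox value for the production one before deploying -->
                <replace file="src/labels/CustomLabels.labels"
                         token="SANDBOX_ENDPOINT"
                         value="PRODUCTION_ENDPOINT"/>
            </then>
        </if>
        <sf:deploy username="${sf.username}" password="${sf.password}"
                   serverurl="${sf.serverurl}" deployRoot="src"/>
    </target>
</project>
```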

An additional library you need to perform the conditional check is the ant-contrib.jar file. You can grab the latest library from here –

If you try to run the script without the library, you may end up with the error below.

Fix Ant Build Error: Problem: failed to create task or type if

In build.xml, simply add a reference to the library.

If everything is in place – for example, the properties file has the right credentials – running the following command should deploy your code and replace the values per your Ant script:

ant -Denvironment=prod -buildfile build.xml deployMetadata

Source code available here –

How To Fix [Cannot read property ‘setParams’ of undefined] for Application Event in Aura Components


    <aura:registerEvent name="CRMAppEvent" type="c:CRMAppEvent"/>


        var compEvent = $A.get("e.c:CRMAppEvent"); 
        compEvent.setParams({"type" : response});

Are you receiving “Cannot read property ‘setParams’ of undefined” after assigning an event?

This indicates that $A.get("e.c:CRMAppEvent") cannot find the event, so any reference to the setParams method or to any property of the event will be undefined.

To fix this, set your application event’s access to global:


<aura:event type="APPLICATION" description="This is the generic Application Event" access="global">
    <aura:attribute name="type" type="String"/>
</aura:event>

How to Fix Salesforce Deployment Errors: InfoNot a valid enumeration for type

If your deployment strategy still revolves around Ant deployments rather than the new Salesforce CLI, you may encounter this error on deployment:

Failed to check the status for request ID=0Af0r00000ClbNgCAJ. Error: InfoNot a valid enumeration for type: class com.sforce.soap.metadata.DeployProblemType. Retrying…

While digging around, I found out this is a tooling issue. Chances are your ant-salesforce.jar library is outdated; you need to grab the latest ant-salesforce version from here

Update the library on your Ant build path. Check the classpath in your build.xml if you are unsure of the path.

<taskdef resource="com/salesforce/antlib.xml" uri="antlib:com.salesforce">
    <classpath>
        <pathelement location="${basedir}/ant-salesforce.jar" />
    </classpath>
</taskdef>