How To Fix [Cannot read property ‘setParams’ of undefined] for Application Event in Aura Components

Component.cmp

    <aura:registerEvent name="CRMAppEvent" type="c:CRMAppEvent"/>

Controller.js

        var appEvent = $A.get("e.c:CRMAppEvent");
        appEvent.setParams({"type" : response});
        appEvent.fire();

Are you receiving “Cannot read property ‘setParams’ of undefined” after trying to fire an application event?

This indicates that $A.get(“e.c:CRMAppEvent”) cannot find the event definition, so the returned value is undefined and any reference to setParams (or any other property of the event) will fail.
To fix this, set your application event’s access to global.

CRMAppEvent.evt

<aura:event type="APPLICATION" description="This is the generic Application Event" access="global">
    <aura:attribute name="type" type="String"/>
</aura:event>

How to Fix Salesforce Deployment Errors: InfoNot a valid enumeration for type

If your deployment strategy still revolves around Ant deployments rather than the newer Salesforce CLI, you may encounter this error during a deployment.

Failed to check the status for request ID=0Af0r00000ClbNgCAJ. Error: InfoNot a valid enumeration for type: class com.sforce.soap.metadata.DeployProblemType. Retrying…

While digging around, I found out this is a tooling issue. Chances are your ant-salesforce jar library is outdated; you need to grab the latest salesforce-ant version from here

https://developer.salesforce.com/docs/atlas.en-us.daas.meta/daas/forcemigrationtool_install.htm

Update the library on your Ant build path. Check your build.xml for the classpath if you are unsure of the location.

<taskdef resource="com/salesforce/antlib.xml" uri="antlib:com.salesforce">
        <classpath>
            <pathelement location="${basedir}/ant-salesforce.jar" />
        </classpath>
</taskdef>
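
After swapping in the new jar, confirm that the location in build.xml points at the file you just downloaded, then re-run your deployment. The target name below is only a placeholder for whatever your own build.xml defines.

    # verify the jar on the classpath is the freshly downloaded one
    ls -l ant-salesforce.jar

    # re-run your existing deploy target (replace deployUnpackaged with your target name)
    ant deployUnpackaged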

How to Fix Compilation LWC1010: Failed to resolve entry for module

This LWC1010 compilation error is caused by an invalid reference to the component name. Camel-case component folder names are mapped to kebab-case in the markup when you call them from another Lightning web component. To resolve “Failed to resolve entry for module” errors, check the naming and hyphenation.

E.g. myComponent becomes my-component:

<c-my-component></c-my-component>

Visual Studio Code autocomplete will not catch a wrong guess for a name with consecutive capitals like myCRMApp, for example:

<c-my-crm-app></c-my-crm-app>

To fix this, myCRMApp becomes my-c-r-m-app:

<c-my-c-r-m-app></c-my-c-r-m-app>

How to Fix Salesforce Deployment Errors: Platform Encryption cannot be enabled for standard fields when Portals are enabled.

Go to the Salesforce Classic UI, search under Setup for “Customer Portal” and disable Login Enabled. You should move your portal to Communities, where Platform Encryption is supported.

How to Fix Salesforce Deployment Errors: You may not modify the permission Connect Org to Customer 360 Data Manager while editing a Standard Profile

You might get this error when deploying the Admin profile if you have Dev Hub enabled on your org. You need to edit the Admin profile and remove the following lines:

<userPermissions>
   <enabled>true</enabled>
   <name>ManageHubConnections</name>
</userPermissions>

Passed The Heroku Architecture Designer Exam (Part 1)

This is a late post and is part 1 of 2, as there is quite a bit of content to digest.

Background Story: One of my goals last year was to get certified in Heroku. I studied my butt off for 3-4 weeks to prepare for this exam. I have web development experience but hadn’t deployed applications to Heroku until just recently. With Salesforce offering this certification in mid-2019, I decided to take it before the end of the year (December 2019).

Here are my key takeaways from the exam and materials to focus on. (Please do not ask for exam dumps! If you wanna be really good, put in the effort and study please!)

Where does Heroku fall in the stack of cloud computing? It is a Platform as a Service (PaaS).

If you didn’t have Heroku, imagine what you would traditionally go through to get from an idea to a running app.

Heroku removes most of that decision-making and abstracts it away, so you focus on just developing the app instead of worrying about the infrastructure that supports it.

  • Try the Getting Started tutorials on Heroku in different languages (I tried PHP, NodeJS, Java, Python)
    • Check the common pattern each language follows and how apps get deployed to Heroku (the dependency mechanism, like requirements.txt for Python, pom.xml for Java, package.json for NodeJS, etc.)
    • For the framework you are using, know the commands that run the application and define them in the Procfile (see the sketch after this list).
  • Get in-depth with the Heroku Architecture
    • git push heroku master – what does this command actually do on the Heroku side?
      • what are buildpacks? – they are open-source sets of instructions/commands that take your application source code, dependencies and runtime to produce a slug.
      • what is a slug? – produced by the buildpack, this contains all your compiled application code, dependencies and runtime ready to run on a dyno with the Procfile.
      • what are config vars – these are environment specific configurations
      • what are dynos? – a lightweight Linux container that executes your slug. Dynos can be scaled horizontally by adding more dynos or vertically by using bigger dynos.
        • web – a process type that receives HTTP traffic
        • worker – a process type typically used for background jobs, queueing systems and timed(cron) jobs
        • one-off – a temporary, detached dyno that can run a command or an interactive terminal. Used for admin tasks, db migrations and console sessions.
      • different dyno types
        • free, hobby, standard and performance for common runtime
        • limitations of each dyno type
        • private dynos for private spaces
      • get familiar with dyno manager, redundancy and security
    • what are stacks – an operating system image (based on Ubuntu) curated and maintained by Heroku
    • how to add custom domains and subdomains
      • limitations of A record with your DNS provider
      • use of CNAME record
    • HTTP routing – routes the incoming request to your app to running dynos
    • HTTP Request ID headers
    • Session Affinity – in short, sticky sessions: requests from the same client keep being routed to the same dyno so in-memory session state is preserved
    • Logging and Monitoring – logs are treated as time-ordered streams of events
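
To make the build-and-run cycle concrete, here is a rough sketch of what it looks like from a terminal. The app name (my-app), the process commands in the Procfile, and the standard-2x size are made-up examples; the Heroku CLI commands are the standard ones, but double-check them against the current docs.

    # the Procfile at the root of the repo declares the process types, e.g.:
    #   web: node index.js          (web dynos receive HTTP traffic)
    #   worker: node jobs/worker.js (worker dynos run background jobs)

    # push the code: the buildpack resolves dependencies and produces a slug
    git push heroku master

    # scale horizontally (more dynos) or vertically (bigger dyno types)
    heroku ps:scale web=2 worker=1 -a my-app
    heroku ps:resize web=standard-2x -a my-app

    # run a one-off dyno for admin tasks, db migrations or a console session
    heroku run bash -a my-app

    # config vars are environment-specific settings exposed to every dyno
    heroku config:set SOME_API_URL=https://example.com -a my-app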

Heroku Add-ons

  • how to provision add-ons
    • from the Dashboard or the CLI (see the example after this list)
  • share add-ons between apps
  • what you can see in the Elements marketplace
    • add-ons, buttons, and buildpacks
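
For reference, provisioning and sharing add-ons from the CLI looks roughly like this. The app names (my-app, my-other-app) and the hobby-dev plan are only examples; check the Elements marketplace for current plan names.

    # provision an add-on against an app (plan name is an example)
    heroku addons:create heroku-postgresql:hobby-dev -a my-app

    # share an add-on: attach the database provisioned on one app to another app
    heroku addons:attach my-other-app::DATABASE -a my-app

    # list everything provisioned for the app
    heroku addons -a my-app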

Heroku Managed Add-ons (Data Management)

  • Heroku Postgres – I focused too much on this as I was scared that there was too much material about it, but on the exam, I realized that just covering the basics would have gone a long way by itself.
    • get the basics covered (provisioning, the different plans, primary and follower databases, forking, sharing add-ons between apps); see the CLI sketch after this list
    • follower – database replication serves many purposes
      • read throughput with leader-follower configuration
      • hot standby
      • reporting database
      • seamless migration and upgrade
    • forking – creates a snapshot of your current database (does not stay up to date like follower database)
      • a risk-free way to experiment with production data for testing, development or migration
    • data clips – SQL queries that you share, can be accessed through a browser and downloaded as CSV or JSON (30 requests per minute limit per IP)
      • can be public or draft, and shared with individuals or with teams you are a member of
      • can be revoked
      • can be integrated with Google Sheet (=IMPORTDATA(…))
      • 100k rows returned
    • how to troubleshoot performance issues
      • use CLI commands like pg:diagnose or Diagnose tab (not available on Hobby plans)
      • expensive queries – queries that run slowly or take up a significant amount of total execution time
        • Run EXPLAIN ANALYZE (via pg:psql)
        • identify used/unused indexes
        • upgrade to the latest database
    • how to do rollbacks from backups – you can roll the DB back to a certain point in time (similar in spirit to rolling back a release deployment with heroku releases:rollback)
      • follows the same pattern as follower and fork and does not affect the primary database
    • how does it relate to the Heroku Connect add-on? – More on this on Heroku Enterprise topic for syncing Salesforce records
  • Heroku Redis
    • Heroku’s managed key-value store as a service
    • create a Redis instance and attach it to an app
  • Heroku Kafka
    • I was not able to play with this add-on as it is paid, but an in-depth understanding of how the architecture works and its concepts is a must
    • Kafka is a distributed commit log providing fault-tolerant communication between producers and consumers using message-based topics.
    • Some use cases
      • elastic queueing – Kafka can accept large volumes of events, and consumers/downstream services pull those events when they are ready; this allows scaling and improves stability when volumes fluctuate
      • data pipelines and analytics – with Kafka’s immutable data streams, developers can build highly parallel data pipelines for ETL and aggregation of data
      • microservice coordination
    • Kafka concepts to master
      • Kafka is made up of a cluster of brokers (instances running Kafka) – the number of brokers in a cluster can be scaled to increase capacity, resilience, and parallelism
      • brokers manage the stream of messages sent to Kafka
      • producers are clients that write to brokers
      • consumers are clients that read from the broker
      • topics are made of a number of partitions
      • allocate more partitions if your consumers are slow compared to your producers
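
As a rough cheat sheet for the Postgres points above, the follower, fork, rollback and diagnostics workflows are all driven from the CLI. The app name, the standard-0 plan, DATABASE_URL and the timestamp below are placeholders; verify the exact flags against the Heroku Postgres docs.

    # follower: a replica that stays in sync with the leader
    heroku addons:create heroku-postgresql:standard-0 --follow DATABASE_URL -a my-app

    # fork: a point-in-time copy that does not stay up to date
    heroku addons:create heroku-postgresql:standard-0 --fork DATABASE_URL -a my-app

    # rollback: a new database restored to a point in time, leaving the primary untouched
    heroku addons:create heroku-postgresql:standard-0 --rollback DATABASE_URL --to '2020-01-15 09:00' -a my-app

    # performance troubleshooting: a report, plus an interactive session for EXPLAIN ANALYZE
    heroku pg:diagnose -a my-app
    heroku pg:psql -a my-app

    # Heroku Redis is provisioned the same way as any other add-on
    heroku addons:create heroku-redis:premium-0 -a my-app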

In the second part I’ll be sharing tips on the following:

  • Heroku Enterprise
  • User Management
  • Heroku Runtime
    • Common Runtime
    • Private Spaces
    • Shield Private Spaces
  • Dynos and Dyno Manager
  • Deployment
  • Etc.

How to Fix Salesforce Deployment Errors: source:push “You may not modify the permission Access Tracer for External Data Sources while editing a Standard Profile”

There are some permissions that are not applicable to a scratch org, and this is one of those weird ones. While pushing the standard Admin profile to my scratch org I encountered this error.

You may not modify the permission Access Tracer for External Data Sources while editing a Standard Profile.

To fix this, edit the profile and remove this specific permission. So far I cannot find any documentation about the TraceXdsQueries permission and what it actually does.

    <userPermissions>
        <enabled>true</enabled>
        <name>TraceXdsQueries</name>
    </userPermissions>

How To Enable SObject Intellisense in VSCode for Salesforce DX

Quick Tip.

Salesforce DX already comes with IntelliSense for Apex classes and the different primitive types, e.g. String.

For SObjects it’s a per-project step you need to run once your project is created. Open the Command Palette and choose SFDX: Refresh SObject Definitions.

Under the hood, the command creates a class definition for each SObject and stores them under the .sfdx/tools folder. These don’t get committed to version control, as the .sfdx folder is ignored by default.

Each generated class has properties that allow the IntelliSense to work.
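
If you are curious where these generated definitions end up, here is what I see on my projects; the exact paths may differ between versions of the extensions.

    # generated SObject class stubs live under the hidden .sfdx folder
    ls .sfdx/tools/sobjects/standardObjects
    ls .sfdx/tools/sobjects/customObjects

    # the default project .gitignore already excludes the .sfdx folder
    grep sfdx .gitignore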

Once enabled you are good to go.

If you want more tips and information on Salesforce DX, check out my YouTube playlist where I cover them in detail.

How To Get Started With Org Development Model With Salesforce DX

There are two development models you can follow with Salesforce DX.

The first is the package development model, where you develop against a scratch org and prepare all the components that need to be deployed; similar to change sets, but smarter, as it handles the dependencies for you. We will talk about this more in the future.

The other is the org development model, which you will be most familiar with if you have been developing on Salesforce for some time now (Force.com IDE/MavensMate/IntelliJ): you develop changes against a sandbox and move the metadata from org to org until you deploy to production.

With Salesforce DX you can still continue to develop against a sandbox. You do not need to enable Dev Hub for developing against a sandbox or DE org.

The requirements, of course, are that you have VS Code installed, the Salesforce Extension Pack for VS Code, and the Salesforce CLI.

Boot up VS Code, open the Command Palette and type SFDX: Create Project with Manifest.

The scaffolding created will contain a manifest folder with a package.xml.

The default package.xml includes the base metadata types: ApexClass, ApexComponent, ApexPage, ApexTestSuite, ApexTrigger, AuraDefinitionBundle and StaticResource.

Next, authorize the sandbox you want to work on: go to the Command Palette and choose SFDX: Authorize an Org.

Next, right-click on the package.xml and choose SFDX: Retrieve Source in Manifest from Org.

Once done, you can start modifying your code. Right-click on a file to deploy it to the source org, or enable the deploy-on-save setting.

That’s it, you should now be able to work with your existing sandbox.
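
If you prefer the terminal over the Command Palette, the same flow maps roughly to these Salesforce CLI commands; the project name, org alias and class path are placeholders.

    # create the project with a manifest folder and default package.xml
    sfdx force:project:create --projectname my-project --manifest
    cd my-project

    # authorize the sandbox (opens a browser login against the test instance)
    sfdx force:auth:web:login --instanceurl https://test.salesforce.com --setalias my-sandbox

    # retrieve everything listed in the manifest
    sfdx force:source:retrieve --manifest manifest/package.xml --targetusername my-sandbox

    # after editing, deploy a single file back to the sandbox
    sfdx force:source:deploy --sourcepath force-app/main/default/classes/MyClass.cls --targetusername my-sandbox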

Check out my video and subscribe if you want more tips and suggestions.

Fix for flowruntime:lookup Error With Salesforce Flows

If you recently used the lookup component while working on Salesforce Flows and got this error:

We can’t display component ‘flowruntime:lookup’, because it isn’t supported in Classic runtime. 

Chances are you are still running the flow in Classic. The component only works in the Lightning runtime. An easy fix is to enable the Lightning runtime for flows:

  1. From Setup, enter Process Automation Settings in the Quick Find box, then select Process Automation Settings.
  2. Select Enable Lightning runtime for Flows.
  3. Save your changes.