Introduction

Be the change that you wish to see in the world.
- Mohandas Gandhi
Goals of this book
| Cover about |
| --- |
| Best practices and tools for the trade |
| Methods for maintaining code quality |
| Options in DevOps |
| Details on change management and governance |
| Org management tools |
| Version control |
| Chrome Extension |
| VS Code Extension |
Request to the readers
Please plant fruit trees like King Ashoka did:
“Along roads I have had wells dug and trees planted for the benefit of humans and animals”.
“Along roads I have had banyan trees planted so that they can give shade to animals and men, and I have had mango groves planted.”
Images used are from the Chromecast source.
Data Governance
Cost of doing nothing
References
Change Management
- Optimizing change management is key to the success of any Salesforce implementation
- Changes are a regular affair when you are bringing new features and enhancements into your implementation
graph TD;
  A[Finding the Need for the Change] --> B[Define Change Scope];
  B --> C[Assign Priority to the Change you are bringing in.\n How it enhances/affects current implementation];
  C --> D[Submit the change to the Change Approval Board - CAB];
  D --> E[Change got approved by CAB];
  E --> F[Perform the change];
  F --> G[Regression test the change];
  G --> H[Change Manager approves the implemented change.\n Records that change was successful,\n timely, accurately estimated, within budget, and other details]
Using Atlassian Change Management
1. A project member requests a change. They provide details like the affected systems, possible risks, and expected implementation.
2. The change manager or peers determine if the change will be successful. They may ask for more information in this step.
3. After review, the team plans how to put the change in place. They record details about:
   - the expected outcomes
   - resources needed
   - timeline
   - testing requirements
   - ways to roll back the change
4. Depending on the type of change and risk, a change approval board (CAB) may need to review the plan.
5. The team works to implement the change, documenting their procedures and results.
6. The change manager reviews and closes the implemented change. They note whether it was successful, timely, accurately estimated, within budget, and other details.
Roles
Change Manager (CM)
- A Change Manager (CM) is a trusted advisor who addresses the people side of change.
- A CM's primary goal is to help define a clear vision for change and to design a plan that drives faster adoption and increases utilization of, and proficiency with, the changes that impact an organization.
Project Manager
- Primary point of contact
- Manages Project Plan, Organization, delivery within budget and schedule
- Escalation point for the project team and stakeholders
- Monitor project alignment and goals
Product Owner
- Primary point of contact for Development team
- Defines product roadmap and vision
- Manages product backlog
- Oversees and evaluates development stages and progress
References
Videos
Data Model
Topics
- Tools for extracting Org metadata
- SF-LAND Data Viz for viewing the metadata
- Getting ERD for given sObjects
Tools for getting the complete metadata of your org
Let us pick a scratch org to show the concepts
ALIAS USERNAME ORG ID EXPIRATION DATE
─ ───────── ───────────────────────────── ────────────────── ───────────────
treeprjSO test-uzsmfdqkhtk7@example.com 00DDM000003raPM2AY 2023-01-28
Let us get all the objects in the org into an object list text file:
~/treeprj [main] >sfdx mohanc:md:describeGlobal -u test-uzsmfdqkhtk7@example.com > treeprjSO_Objects.txt
You can get the list of object names in the org from this text file by:
cat treeprjSO_Objects.txt | sed 's/,/\n/g' | bat
───────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ STDIN
───────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ AIApplication
2 │ AIApplicationConfig
3 │ AIInsightAction
4 │ AIInsightFeedback
5 │ AIInsightReason
6 │ AIInsightValue
7 │ AIPredictionEvent
8 │ AIRecordInsight
9 │ AcceptedEventRelation
10 │ Account
11 │ AccountChangeEvent
12 │ AccountCleanInfo
13 │ AccountContactRole
14 │ AccountContactRoleChangeEvent
15 │ AccountFeed
16 │ AccountHistory
17 │ AccountPartner
18 │ AccountShare
19 │ ActionLinkGroupTemplate
20 │ ActionLinkTemplate
21 │ ActiveFeatureLicenseMetric
22 │ ActivePermSetLicenseMetric
23 │ ActiveProfileMetric
24 │ ActivityHistory
25 │ AdditionalNumber
26 │ Address
27 │ AggregateResult
28 │ AlternativePaymentMethod
29 │ AlternativePaymentMethodShare
30 │ Announcement
:
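As a quick sanity check, the same comma-to-newline split can tell you how many objects the describeGlobal output contains. A minimal sketch, using a hypothetical three-object list in place of the real file:

```shell
# Hypothetical stand-in for treeprjSO_Objects.txt (the real file holds hundreds of names)
echo 'Account,Contact,Case' > /tmp/objects_demo.txt
# Split the comma-separated list into one name per line, then count the lines
tr ',' '\n' < /tmp/objects_demo.txt | wc -l | tr -d ' '
# → 3
```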
Get the org metadata using this object list text file into treeprjSO_metadata.csv
- This operation may take a few minutes depending on your org's metadata size
sfdx mohanc:md:describe -u test-uzsmfdqkhtk7@example.com -i treeprjSO_Objects.txt > treeprjSO_metadata.csv
bat treeprjSO_metadata.csv
───────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ File: treeprjSO_metadata.csv
───────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ "sobjectName","sobjectLabel","name","label","type","length","nillable","referenceTo","relationshipName","unique","updateable","inlineHelpText"
│ ,"picklistValues","externalId"
2 │ "AIApplication","AI Application","Id","AI Application ID","id",18,false,"[]",,false,false,,"[]",false
3 │ "AIApplication","AI Application","IsDeleted","Deleted","boolean",0,false,"[]",,false,false,,"[]",false
4 │ "AIApplication","AI Application","DeveloperName","Name","string",80,false,"[]",,false,false,,"[]",false
5 │ "AIApplication","AI Application","Language","Master Language","picklist",40,false,"[]",,false,false,,"[""English"",""German"",""Spanish"",""Fr
│ ench"",""Italian"",""Japanese"",""Swedish"",""Korean"",""Chinese (Traditional)"",""Chinese (Simplified)"",""Portuguese (Brazil)"",""Dutch"",""
│ Danish"",""Thai"",""Finnish"",""Russian"",""Spanish (Mexico)"",""Norwegian""]",false
6 │ "AIApplication","AI Application","MasterLabel","Label","string",80,false,"[]",,false,false,,"[]",false
7 │ "AIApplication","AI Application","NamespacePrefix","Namespace Prefix","string",15,true,"[]",,false,false,,"[]",false
8 │ "AIApplication","AI Application","CreatedDate","Created Date","datetime",0,false,"[]",,false,false,,"[]",false
9 │ "AIApplication","AI Application","CreatedById","Created By ID","reference",18,false,"[""User""]","CreatedBy",false,false,,"[]",false
10 │ "AIApplication","AI Application","LastModifiedDate","Last Modified Date","datetime",0,false,"[]",,false,false,,"[]",false
11 │ "AIApplication","AI Application","LastModifiedById","Last Modified By ID","reference",18,false,"[""User""]","LastModifiedBy",false,false,,"[]"
│ ,false
12 │ "AIApplication","AI Application","SystemModstamp","System Modstamp","datetime",0,false,"[]",,false,false,,"[]",false
13 │ "AIApplication","AI Application","Status","Status","picklist",255,false,"[]",,false,false,,"[""3"",""2"",""1"",""0""]",false
14 │ "AIApplication","AI Application","Type","App Type","picklist",255,false,"[]",,false,false,,"[""5"",""14""]",false
15 │ "sobjectName","sobjectLabel","name","label","type","length","nillable","referenceTo","relationshipName","unique","updateable","inlineHelpText"
│ ,"picklistValues","externalId"
16 │ "AIApplicationConfig","AI Application config","Id","AI Application config Id","id",18,false,"[]",,false,false,,"[]",false
17 │ "AIApplicationConfig","AI Application config","IsDeleted","Deleted","boolean",0,false,"[]",,false,false,,"[]",false
18 │ "AIApplicationConfig","AI Application config","DeveloperName","Name","string",80,false,"[]",,false,false,,"[]",false
19 │ "AIApplicationConfig","AI Application config","Language","Master Language","picklist",40,false,"[]",,false,false,,"[""English"",""German"",""S
│ panish"",""French"",""Italian"",""Japanese"",""Swedish"",""Korean"",""Chinese (Traditional)"",""Chinese (Simplified)"",""Portuguese (Brazil)""
│ ,""Dutch"",""Danish"",""Thai"",""Finnish"",""Russian"",""Spanish (Mexico)"",""Norwegian""]",false
20 │ "AIApplicationConfig","AI Application config","MasterLabel","Label","string",80,false,"[]",,false,false,,"[]",false
21 │ "AIApplicationConfig","AI Application config","NamespacePrefix","Namespace Prefix","string",15,true,"[]",,false,false,,"[]",false
22 │ "AIApplicationConfig","AI Application config","CreatedDate","Created Date","datetime",0,false,"[]",,false,false,,"[]",false
23 │ "AIApplicationConfig","AI Application config","CreatedById","Created By ID","reference",18,false,"[""User""]","CreatedBy",false,false,,"[]",
Using SF-LAND Data Viz to view the model file
ERD
- Let us get the ERD for the Contact and Account objects
- Note: This also emits the metadata for these 2 objects in CSV - contact-account.csv
sfdx mohanc:md:describe -s Contact,Account -e contact-account.dot -u test-uzsmfdqkhtk7@example.com > contact-account.csv
Get the SVG for this ERD
sfdx mohanc:viz:graphviz:dot2svg -i contact-account.dot
SVG file is written to contact-account.dot.svg
open contact-account.dot.svg
- Click on the image to view the full version
Financial Services Cloud (FSC) Interactive Data Model
Sales Cloud Interactive Data Model
Tools for trade
- We will discuss tools used for maintaining the Salesforce org
Tools for the Salesforce Org
Topics
- Visualizing the org
- Comparing query results from 2 orgs
- Compare 2 Salesforce Orgs for a metadata type
- List Metadata item of the given Metadata type and compare it with local
- Export SObject Metadata from an Org
- ERD for the given list of Salesforce Objects
- Compare SObjects in 2 orgs
Visualizing the org
sfdx mohanc:org:viz -u test-uzsmfdqkhtk7@example.com --help
Visualize the Org
USAGE
$ sfdx mohanc org viz [-u <string>] [--apiversion <string>] [--json] [--loglevel
trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
FLAGS
-u, --targetusername=<value> username or alias for the target org; overrides default target org
--apiversion=<value> override the api version used for api requests made by this command
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for this command invocation
DESCRIPTION
Visualize the Org
EXAMPLES
Visualize the org
sfdx mohanc:org:viz
Demo of org:viz
sfdx mohanc:org:viz -u test-uzsmfdqkhtk7@example.com
=== Working on getOrgData ===
=== Working on getOrgLimits ===
=== Working on getCount:ApexClass ===
=== Working on getCount:ApexPage ===
=== Working on getCount:ConnectedApplication ===
=== Working on getCount:CustomPermission ===
=== Working on getCount:ExternalDataSource ===
=== Working on getCount:FieldPermissions ===
=== Working on getCount:FieldSecurityClassification ===
=== Working on getCount:Group ===
=== Working on getCount:GroupMember ===
=== Working on getCount:NamedCredential ===
=== Working on getCount:ObjectPermissions ===
=== Working on getCount:PackageLicense ===
=== Working on getCount:PermissionSet ===
=== Working on getCount:PermissionSetAssignment ===
=== Working on getCount:PermissionSetGroup ===
=== Working on getCount:PermissionSetLicense ===
=== Working on getCount:Profile ===
=== Working on getCount:QueueSobject ===
=== Working on getCount:SetupAssistantStep ===
=== Working on getCount:SetupEntityAccess ===
=== Working on getCount:User ===
=== Working on getCount:UserLicense ===
=== Working on getCount:UserPackageLicense ===
=== Working on getCount:UserRole ===
=== Working on getCount:Account ===
=== Working on getCount:AccountContactRole ===
=== Working on getCount:Asset ===
=== Working on getCount:Campaign ===
=== Working on getCount:CampaignMember ===
=== Working on getCount:Case ===
=== Working on getCount:CaseStatus ===
=== Working on getCount:Contact ===
=== Working on getCount:Contract ===
=== Working on getCount:ContractContactRole ===
=== Working on getCount:Lead ===
=== Working on getCount:Opportunity ===
=== Working on getCount:OpportunityCompetitor ===
=== Working on getCount:OpportunityContactRole ===
=== Working on getCount:OpportunityStage ===
=== Working on getCount:Order ===
=== Working on getCount:Partner ===
=== Working on getCount:PartnerRole ===
=== Working on getCount:Account ===
=== Working on getCount:Case ===
=== Working on getCount:CaseComment ===
=== Working on getCount:CaseHistory ===
=== Working on getCount:CaseSolution ===
=== Working on getCount:Contact ===
=== Working on getCount:Solution ===
=== Working on getPackageInfo ===
=== Working on EntityDefinitionSummary:Account ===
=== Working on EntityDefinitionSummary:Contact ===
=== Working on EntityDefinitionSummary:AccountContactRole ===
=== Working on EntityDefinitionSummary:Opportunity ===
=== Working on EntityDefinitionSummary:OpportunityContactRole ===
=== Working on EntityDefinitionSummary:OpportunityCompetitor ===
=== Working on EntityDefinitionSummary:Lead ===
=== Working on EntityDefinitionSummary:Case ===
=== Working on EntityDefinitionSummary:Campaign ===
=== Working on EntityDefinitionSummary:CampaignMember ===
=== Working on EntityDefinitionSummary:Asset ===
=== Working on EntityDefinitionSummary:Contract ===
=== Working on EntityDefinitionSummary:ContractContactRole ===
=== Working on EntityDefinitionSummary:Order ===
=== Working on getApexCodeCoverage ===
=== Working on getBusinessProcess ===
=== Working on getCertificate ===
=== Working on CspTrustedSite ===
=== Working on getOauthToken ===
=== Working on getSetupAuditTrail ===
=== Working on getInactiveUsers ===
=== Working on getUnusedProfiles ===
=== Working on getUsedProfiles ===
=== Working on getUsedPermissionSets ===
=== Working on getUnusedPermissionSets ===
=== Working on getUsedProfiles ===
=== Working on getUsedRoles ===
=== Working on getSysAdminUsers ===
=== Working on getProfileInfo:System Administrator ===
Error: {
"name": "INVALID_FIELD",
"errorCode": "INVALID_FIELD"
}
=== Working on getProfileInfo:Customer Community Login User ===
Error: {
"name": "INVALID_FIELD",
"errorCode": "INVALID_FIELD"
}
=== Working on getPermissionSetInfo:Sales_Ops ===
Error: {
"name": "INVALID_FIELD",
"errorCode": "INVALID_FIELD"
}
=== Working on getProfileForUserLicense:Guest ===
Error: {
"name": "INVALID_FIELD",
"errorCode": "INVALID_FIELD"
}
=== Working on getUserLicenseInfo ===
=== Working on getNetworkMemberInfo ===
Error: {
"name": "INVALID_TYPE",
"errorCode": "INVALID_TYPE"
}
=== Working on getHasEinsteinDataDiscovery ===
=== Working on getHasEDD ===
=== Working on getHasEinsteinDataDiscovery ===
=== Working on getCountTooling:flow ===
=== Working on getFlows ===
=== Working on getFlowMetadata ===
=== Writing Org JSON in file Org.json ...
=== Writing visualization in file Org.svg ...
Visualization done. "open Org.svg" in Chrome Browser to view the Visualization
Org SVG
Comparing query results from 2 orgs
sfdx mohanc:org:compare --help
Compare 2 orgs for the given metadata type query
USAGE
$ sfdx mohanc org compare -o <string> [-m <string>] [-q <string>] [--json] [--loglevel
trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
FLAGS
-m, --mtype=<value> metadata type, exmaple: profile
-o, --orgusernames=<value> (required) Comma separated orgUserName, example:
user1@email.com,user2@email.com
-q, --inputfilename=<value> soql file or url for soql
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for this command invocation
DESCRIPTION
Compare 2 orgs for the given metadata type query
EXAMPLES
Compare 2 orgs for the given metadata type query
sfdx mohanc:org:compare -o orgUserName1,orgUserName2 -m metadataType
when -m is used, query here: https://github.com/mohan-chinnappan-n/soql will be used
to refer ObjectPermissions
(https://raw.githubusercontent.com/mohan-chinnappan-n/soql/main/ObjectPermissions.soql)
use: sfdx mohanc:org:compare -o orgUserName1,orgUserName2 -m ObjectPermissions
sfdx mohanc:org:compare -o orgUserName1,orgUserName2 -i inputquery.soql
Note: inputquery.soql can be a local file or url
Example
sfdx mohanc:org:compare -o mohan.chinnappan.n.sel@gmail.com,mohan.chinnappan.n.sel2@gmail.com -m profile
sfdx mohanc:org:compare -o mohan.chinnappan.n.sel@gmail.com,mohan.chinnappan.n.sel2@gmail.com -q inputquery.soql
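Going by the help text above, the -q flag takes a plain text file holding the SOQL to run in both orgs. A hypothetical inputquery.soql (the query itself is an example; adjust it to your needs):

```shell
# Create a hypothetical inputquery.soql; the file simply holds the SOQL to run in both orgs
cat > inputquery.soql <<'EOF'
SELECT Id, Name, UserType FROM Profile ORDER BY Name
EOF
cat inputquery.soql
# → SELECT Id, Name, UserType FROM Profile ORDER BY Name
```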
Demo of Org query compare tool
Compare 2 Salesforce Orgs for a metadata type
sfdx mohanc:mdapi:helper:rc2 -m Profile -o mohan.chinnappan.n.sel@gmail.com,mohan.chinnappan.n.sel2@gmail.com -n "Admin" --help
Retrieve Metadata from 2 orgs and compares them
USAGE
$ sfdx mohanc mdapi helper rc2 -o <string> -m <string> -n <string> [-u <string>] [--apiversion <string>] [--json] [--loglevel
trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
FLAGS
-m, --mtype=<value> (required) Metadata type, example: Profile
-n, --name=<value> (required) Name of the metadata type
-o, --orgusernames=<value> (required) 2 Org usernames comma separated
-u, --targetusername=<value> username or alias for the target org; overrides default target org
--apiversion=<value> override the api version used for api requests made by this command
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for this command invocation
DESCRIPTION
Retrieve Metadata from 2 orgs and compares them
EXAMPLES
retrieve a metadata type from 2 orgs and compare them
sfdx mohanc:mdapi:helper:rc -m <metadata type> -o orgUserName1,orgUserName2 -n <name>
Example:
sfdx mohanc:mdapi:helper:rc2 -m PermissionSet -o orgUserName1,orgUserName2 -n PermissionSetName
Demo of Compare 2 Salesforce Orgs for a metadata type
List Metadata item of the given Metadata type and compare it with local
sfdx mohanc:mdapi:helper:list -m Profile -u mohan.chinnappan.n.sel@gmail.com -p force-app/main/default/ --help
List Metadata item of the given Metadata type and compare it with local
USAGE
$ sfdx mohanc mdapi helper list -m <string> -p <string> [-u <string>] [--apiversion <string>] [--json] [--loglevel
trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
FLAGS
-m, --mtype=<value> (required) Metadata type, example: Profile
-p, --prjfolder=<value> (required) DX Project folder path (example:
~/treeprj/force-app/main/default/
-u, --targetusername=<value> username or alias for the target org; overrides default target org
--apiversion=<value> override the api version used for api requests made by this command
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for this command invocation
DESCRIPTION
List Metadata item of the given Metadata type and compare it with local
EXAMPLES
List Metadata item of the given Metadata type and compare it with local
sfdx mohanc:mdapi:helper:list -m <metadata type> -p <dxProjectFolder>
Example:
sfdx mohanc:mdapi:helper:list -m Profile -p ~/treeprj/force-app/main/default/
sfdx mohanc:mdapi:helper:list -m Profile -u mohan.chinnappan.n.sel@gmail.com -p force-app/main/default/
force-app/main/default/profiles/SolutionManager.profile-meta.xml true
force-app/main/default/profiles/Analytics Cloud Security User.profile-meta.xml true
force-app/main/default/profiles/Partner App Subscription User.profile-meta.xml true
force-app/main/default/profiles/Guest License User.profile-meta.xml true
force-app/main/default/profiles/HighVolumePortal.profile-meta.xml true
force-app/main/default/profiles/Analytics Cloud Integration User.profile-meta.xml true
force-app/main/default/profiles/Identity User.profile-meta.xml true
force-app/main/default/profiles/Customer Community User.profile-meta.xml true
force-app/main/default/profiles/Authenticated Website.profile-meta.xml true
force-app/main/default/profiles/Customer Portal Manager Standard.profile-meta.xml true
force-app/main/default/profiles/External Identity User.profile-meta.xml true
force-app/main/default/profiles/Customer Community Plus Login User.profile-meta.xml true
force-app/main/default/profiles/Chatter External User.profile-meta.xml true
force-app/main/default/profiles/Customer Portal Manager Custom.profile-meta.xml true
force-app/main/default/profiles/Chatter Moderator User.profile-meta.xml true
force-app/main/default/profiles/Cross Org Data Proxy User.profile-meta.xml true
force-app/main/default/profiles/Custom%3A Marketing Profile.profile-meta.xml false
force-app/main/default/profiles/StandardAul.profile-meta.xml true
force-app/main/default/profiles/Custom%3A Support Profile.profile-meta.xml false
force-app/main/default/profiles/Force%2Ecom - App Subscription User.profile-meta.xml false
force-app/main/default/profiles/mohanc Profile.profile-meta.xml true
force-app/main/default/profiles/Read Only.profile-meta.xml true
force-app/main/default/profiles/Force%2Ecom - Free User.profile-meta.xml false
force-app/main/default/profiles/PlatformPortal.profile-meta.xml true
force-app/main/default/profiles/Chatter Free User.profile-meta.xml true
force-app/main/default/profiles/Gold Partner User.profile-meta.xml true
force-app/main/default/profiles/Partner Community User.profile-meta.xml true
force-app/main/default/profiles/Customer Community Login User.profile-meta.xml true
force-app/main/default/profiles/Minimum Access - Salesforce.profile-meta.xml true
force-app/main/default/profiles/Work%2Ecom Only User.profile-meta.xml false
force-app/main/default/profiles/Customer Community Plus User.profile-meta.xml true
force-app/main/default/profiles/Custom%3A Sales Profile.profile-meta.xml false
force-app/main/default/profiles/Admin.profile-meta.xml true
force-app/main/default/profiles/ContractManager.profile-meta.xml true
force-app/main/default/profiles/Silver Partner User.profile-meta.xml true
force-app/main/default/profiles/External Apps Login User.profile-meta.xml true
force-app/main/default/profiles/Partner Community Login User.profile-meta.xml true
force-app/main/default/profiles/Standard.profile-meta.xml true
force-app/main/default/profiles/MarketingProfile.profile-meta.xml true
force-app/main/default/profiles/High Volume Customer Portal User.profile-meta.xml true
Demo of List Metadata item of the given Metadata type and compare it with local
Export SObject Metadata from an Org
sfdx mohanc:md:describeGlobal -u mohan.chinnappan.n.sel2@gmail.com > objects.txt
sfdx mohanc:md:describe -u mohan.chinnappan.n.sel2@gmail.com -i objects.txt > org-md.csv
Making a package.xml for all the custom objects in the org
#!/bin/bash
# make_package.xml.sh
# reads object names (one per line) from stdin to create package.xml
# mchinnappan
API_VERSION=56.0
prefix=$(cat <<EOF
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<version>${API_VERSION}</version>
<types>
<name>CustomObject</name>
EOF
)
# quote the variable to preserve the heredoc's newlines
echo "${prefix}"
while read line; do
  echo "    <members>${line}</members>"
done
suffix=$(cat << EOF
</types>
</Package>
EOF
)
echo "${suffix}"
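If you prefer a single self-contained command, the same XML shape the script emits can be sketched with awk (object names on stdin; the two custom objects here are just sample input):

```shell
# Build a package.xml for two hypothetical custom objects read from stdin
printf 'Broker__c\nProperty__c\n' | awk '
BEGIN {
  print "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
  print "<Package xmlns=\"http://soap.sforce.com/2006/04/metadata\">"
  print "  <version>56.0</version>"
  print "  <types>"
  print "    <name>CustomObject</name>"
}
{ print "    <members>" $0 "</members>" }   # one <members> entry per input line
END {
  print "  </types>"
  print "</Package>"
}'
```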
# if you do not have bat on your *nix box, use cat
# get all custom objects
sfdx mohanc:md:describeGlobal -u mohan.chinnappan.n_ea2@gmail.com | tr ',' '\n' | grep -i "__c$" | bash ~/shell-scripts/bash/make_package.xml.sh > _package.xml; xmllint --format _package.xml > package.xml; bat package.xml
───────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ File: package.xml
───────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ <?xml version="1.0" encoding="UTF-8"?>
2 │ <Package xmlns="http://soap.sforce.com/2006/04/metadata">
3 │ <version>56.0</version>
4 │ <types>
5 │ <name>CustomObject</name>
6 │ <members>AcquiredAccount__c</members>
7 │ <members>Broker__c</members>
8 │ <members>GanttLink__c</members>
9 │ <members>GanttTask__c</members>
10 │ <members>MyFilter__c</members>
11 │ <members>OpportunityHistory__c</members>
12 │ <members>Property__c</members>
13 │ <members>Stock_Position__c</members>
14 │ <members>Teacher__c</members>
15 │ <members>TruncateTest__c</members>
...
Get the metadata for these assets in the package.xml
sfdx force:mdapi:retrieve -k ./package.xml -u mohan.chinnappan.n_ea2@gmail.com -r .
Retrieving v56.0 metadata from mohan.chinnappan.n_ea2@gmail.com using the v57.0 SOAP API
Retrieve ID: 09S3h000007GKGdEAO
Retrieving metadata from mohan.chinnappan.n_ea2@gmail.com... done
Wrote retrieve zip to /private/tmp/meta-export/mdexport/unpackaged.zip
jar tvf unpackaged.zip
16429 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Object_Activity__c.object
16058 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Pipeline_Stage__c.object
7087 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Pipeline__c.object
5616 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/MyFilter__c.object
8383 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Project__c.object
11274 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Remote_Change__c.object
11079 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Repository__c.object
7874 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Source_Member_Reference__c.object
11355 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Vcs_Event__c.object
5159 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/Teacher__c.object
5888 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/Stock_Position__c.object
7700 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/Broker__c.object
17198 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/Property__c.object
9307 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Async_Operation_Result__c.object
8572 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Back_Sync__c.object
10427 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Change_Bundle_Install__c.object
7122 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Change_Bundle__c.object
13690 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Change_Submission__c.object
13408 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Environment__c.object
7164 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Hidden_Remote_Change__c.object
12752 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Work_Item_Promote__c.object
7874 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/AcquiredAccount__c.object
26381 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Work_Item__c.object
10679 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/OpportunityHistory__c.object
6115 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/GanttLink__c.object
6390 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/GanttTask__c.object
7916 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Deploy_Component__c.object
7924 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Branch__c.object
2576 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/WrikeAccountSettings__c.object
1487 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/Wrike_API__c.object
1488 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/Wrike_Bindings__c.object
9129 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Submit_Component__c.object
7598 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Merge_Result__c.object
10249 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/sf_devops__Deployment_Result__c.object
5484 Tue Apr 04 17:52:02 EDT 2023 unpackaged/objects/TruncateTest__c.object
1976 Tue Apr 04 17:52:02 EDT 2023 unpackaged/package.xml
ERD for the given list of Salesforce Objects using the CLI
sfdx mohanc:md:describe -u mohan.chinnappan.n.sel2@gmail.com -s Contact,Account -e contact-account.dot > contact-account.csv
cat contact-account.dot | pbcopy
open "https://mohan-chinnappan-n.github.io/viz/viz.html?c=1"
Compare SObjects in 2 orgs
View the SObjects in the org
sfdx mohanc:md:describeGlobal -u mohan.chinnappan.n.sel@gmail.com | tr ',' '\n' | pbcopy; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
Using Diff app
- Run the org-sobjects-compare script
- Usage:
./org-sobjects-compare.sh
Org SObjects Compare tool - compares SObjects in the given two orgs
Usage: org-sobjects-compare.sh <username1> <username2>
- Run
org-sobjects-compare.sh mohan.chinnappan.n.sel@gmail.com mohan.chinnappan.n.sel2@gmail.com
Org SObjects Compare tool - compares SObjects in the given two orgs
=== Getting sobjects for the org with username: mohan.chinnappan.n.sel@gmail.com... ===
Paste the clipboard content into the left side of the diff app. Then press enter to continue to the next org...
=== Getting sobjects for the org with username: mohan.chinnappan.n.sel2@gmail.com... ===
=== Now you can paste the content in the clipboard into the right side of the diff app and press Compare button... ===
- Demo
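The paste-into-the-diff-app flow above can also be done entirely on the command line. A sketch using sample object lists in place of the two describeGlobal outputs (in practice, redirect the sfdx output into these files):

```shell
# Stand-ins for the two orgs' object lists
# (normally: sfdx mohanc:md:describeGlobal -u <username> | tr ',' '\n' | sort)
echo 'Account,Case,Lead' | tr ',' '\n' | sort > /tmp/org1_objects.txt
echo 'Account,Contact,Lead' | tr ',' '\n' | sort > /tmp/org2_objects.txt
# comm -3 hides common lines: column 1 = only in org1, column 2 (tab-indented) = only in org2
comm -3 /tmp/org1_objects.txt /tmp/org2_objects.txt
# → Case
# →     Contact
```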
Data Loading BulkAPI2
How to use the DX plugin for BulkAPI 2 data load?
Topics
How to use mohanc:data:bulkapi:load?
$ sfdx mohanc:data:bulkapi:load -h
Data Load using BulkAPI 2
USAGE
$ sfdx mohanc:data:bulkapi:load
OPTIONS
-e, --lineending=lineending Line Ending (LF or CRLF), default: LF
-f, --inputfile=inputfile CSV file to load, default: input.csv
-o, --sobject=sobject sObject to load into, default: Case
-u, --targetusername=targetusername username or alias for the target org; overrides default target org
--apiversion=apiversion override the api version used for api requests made by this command
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal) logging level for this command invocation
EXAMPLE
sfdx mohanc:bulkapi:load -u <username> -f input.csv -e LF -o Case
Example
$ sfdx mohanc:data:bulkapi:load -u mohan.chinnappan.n_ea2@gmail.com -f /tmp/input.csv -e LF -o Case
Input file
$ cat /tmp/input.csv
Subject,Priority
Engine cylinder has knocking,High
Wiper Blade needs replacement,Low
Output
=== CREATE JOB ===
{
id: '7503h000003pgNYAAY',
operation: 'insert',
object: 'Case',
createdById: '0053h000002xQ5sAAE',
createdDate: '2020-08-04T04:32:41.000+0000',
systemModstamp: '2020-08-04T04:32:41.000+0000',
state: 'Open',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 49,
contentUrl: 'services/data/v49.0/jobs/ingest/7503h000003pgNYAAY/batches',
lineEnding: 'LF',
columnDelimiter: 'COMMA'
}
jobId: 7503h000003pgNYAAY
=== JOB STATUS ===
=== JOB STATUS for job: 7503h000003pgNYAAY ===
{
id: '7503h000003pgNYAAY',
operation: 'insert',
object: 'Case',
createdById: '0053h000002xQ5sAAE',
createdDate: '2020-08-04T04:32:41.000+0000',
systemModstamp: '2020-08-04T04:32:41.000+0000',
state: 'Open',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 49,
jobType: 'V2Ingest',
contentUrl: 'services/data/v49.0/jobs/ingest/7503h000003pgNYAAY/batches',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
retries: 0,
totalProcessingTime: 0,
apiActiveProcessingTime: 0,
apexProcessingTime: 0
}
=== PUT DATA ===
result: status: 201, statusText: Created
=== JOB STATUS ===
=== JOB STATUS for job: 7503h000003pgNYAAY ===
{
id: '7503h000003pgNYAAY',
operation: 'insert',
object: 'Case',
createdById: '0053h000002xQ5sAAE',
createdDate: '2020-08-04T04:32:41.000+0000',
systemModstamp: '2020-08-04T04:32:41.000+0000',
state: 'Open',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 49,
jobType: 'V2Ingest',
contentUrl: 'services/data/v49.0/jobs/ingest/7503h000003pgNYAAY/batches',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 0,
numberRecordsFailed: 0,
retries: 0,
totalProcessingTime: 0,
apiActiveProcessingTime: 0,
apexProcessingTime: 0
}
=== PATCH STATE ===
{
id: '7503h000003pgNYAAY',
operation: 'insert',
object: 'Case',
createdById: '0053h000002xQ5sAAE',
createdDate: '2020-08-04T04:32:41.000+0000',
systemModstamp: '2020-08-04T04:32:41.000+0000',
state: 'UploadComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 49
}
=== JOB STATUS ===
=== JOB STATUS for job: 7503h000003pgNYAAY ===
jobStatus {
id: '7503h000003pgNYAAY',
operation: 'insert',
object: 'Case',
createdById: '0053h000002xQ5sAAE',
createdDate: '2020-08-04T04:32:41.000+0000',
systemModstamp: '2020-08-04T04:32:43.000+0000',
state: 'InProgress',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 49,
jobType: 'V2Ingest',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 0,
numberRecordsFailed: 0,
retries: 0,
totalProcessingTime: 0,
apiActiveProcessingTime: 0,
apexProcessingTime: 0
}
WAITING...
{
id: '7503h000003pgNYAAY',
operation: 'insert',
object: 'Case',
createdById: '0053h000002xQ5sAAE',
createdDate: '2020-08-04T04:32:41.000+0000',
systemModstamp: '2020-08-04T04:32:43.000+0000',
state: 'JobComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 49,
jobType: 'V2Ingest',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 2,
numberRecordsFailed: 0,
retries: 0,
totalProcessingTime: 152,
apiActiveProcessingTime: 57,
apexProcessingTime: 0
}
=== JOB Failure STATUS ===
=== JOB Failure STATUS for job: 7503h000003pgNYAAY ===
"sf__Id","sf__Error",Priority,Subject
=== JOB getUnprocessedRecords STATUS ===
=== JOB getUnprocessedRecords STATUS for job: 7503h000003pgNYAAY ===
Subject,Priority
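The job lifecycle shown in the log above (create job, upload CSV, close the job, poll the status) can be sketched as plain REST calls. This is a hedged sketch, not the plugin's actual implementation: `send` is a stand-in for whatever HTTP helper you use, and `instance_url` and `api_version` are assumptions you would take from your own org and auth step.

```python
import time

def ingest_csv(send, instance_url, object_name, csv_text, api_version="v49.0"):
    """Sketch of the Bulk API 2.0 ingest lifecycle seen in the log above.

    `send(method, url, **kwargs)` is any HTTP helper that returns parsed
    JSON; real calls also need an Authorization: Bearer <token> header.
    """
    base = f"{instance_url}/services/data/{api_version}/jobs/ingest"
    # === CREATE JOB ===
    job = send("POST", base, json={"object": object_name, "operation": "insert",
                                   "contentType": "CSV", "lineEnding": "LF"})
    # === PUT DATA === upload the CSV into the job's batches endpoint
    send("PUT", f"{base}/{job['id']}/batches", body=csv_text,
         content_type="text/csv")
    # === PATCH STATE === mark the upload complete so processing starts
    send("PATCH", f"{base}/{job['id']}", json={"state": "UploadComplete"})
    # === JOB STATUS === poll until the org finishes processing
    while True:
        status = send("GET", f"{base}/{job['id']}")
        if status["state"] in ("JobComplete", "Failed", "Aborted"):
            return status
        time.sleep(2)  # WAITING...
```

After `JobComplete`, the failure and unprocessed-records CSVs shown above come from additional GETs against the job's `failedResults` and `unprocessedrecords` resources.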
Loading Platform Events via Bulk API 2.0
Resources
$ cat ~/tmp/pe_msg.csv
message__c
Power Off
$ sfdx mohanc:data:bulkapi:load -u mohan.chinnappan.n_ea2@gmail.com -f ~/tmp/pe_msg.csv -e LF -o Notification__e
=== CREATE JOB ===
{
id: '7503h000009XNPkAAO',
operation: 'insert',
object: 'Notification__e',
createdById: '0053h000002xQ5sAAE',
createdDate: '2021-04-05T22:40:39.000+0000',
systemModstamp: '2021-04-05T22:40:39.000+0000',
state: 'Open',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 51,
contentUrl: 'services/data/v51.0/jobs/ingest/7503h000009XNPkAAO/batches',
lineEnding: 'LF',
columnDelimiter: 'COMMA'
}
jobId: 7503h000009XNPkAAO
=== JOB STATUS ===
=== JOB STATUS for job: 7503h000009XNPkAAO ===
{
id: '7503h000009XNPkAAO',
operation: 'insert',
object: 'Notification__e',
createdById: '0053h000002xQ5sAAE',
createdDate: '2021-04-05T22:40:39.000+0000',
systemModstamp: '2021-04-05T22:40:39.000+0000',
state: 'Open',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 51,
jobType: 'V2Ingest',
contentUrl: 'services/data/v51.0/jobs/ingest/7503h000009XNPkAAO/batches',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
retries: 0,
totalProcessingTime: 0,
apiActiveProcessingTime: 0,
apexProcessingTime: 0
}
=== PUT DATA ===
result: status: 201, statusText: Created
=== JOB STATUS ===
=== JOB STATUS for job: 7503h000009XNPkAAO ===
{
id: '7503h000009XNPkAAO',
operation: 'insert',
object: 'Notification__e',
createdById: '0053h000002xQ5sAAE',
createdDate: '2021-04-05T22:40:39.000+0000',
systemModstamp: '2021-04-05T22:40:39.000+0000',
state: 'Open',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 51,
jobType: 'V2Ingest',
contentUrl: 'services/data/v51.0/jobs/ingest/7503h000009XNPkAAO/batches',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 0,
numberRecordsFailed: 0,
retries: 0,
totalProcessingTime: 0,
apiActiveProcessingTime: 0,
apexProcessingTime: 0
}
=== PATCH STATE ===
{
id: '7503h000009XNPkAAO',
operation: 'insert',
object: 'Notification__e',
createdById: '0053h000002xQ5sAAE',
createdDate: '2021-04-05T22:40:39.000+0000',
systemModstamp: '2021-04-05T22:40:39.000+0000',
state: 'UploadComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 51
}
=== JOB STATUS ===
=== JOB STATUS for job: 7503h000009XNPkAAO ===
jobStatus {
id: '7503h000009XNPkAAO',
operation: 'insert',
object: 'Notification__e',
createdById: '0053h000002xQ5sAAE',
createdDate: '2021-04-05T22:40:39.000+0000',
systemModstamp: '2021-04-05T22:40:41.000+0000',
state: 'InProgress',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 51,
jobType: 'V2Ingest',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 1,
numberRecordsFailed: 0,
retries: 0,
totalProcessingTime: 61,
apiActiveProcessingTime: 12,
apexProcessingTime: 0
}
WAITING...
{
id: '7503h000009XNPkAAO',
operation: 'insert',
object: 'Notification__e',
createdById: '0053h000002xQ5sAAE',
createdDate: '2021-04-05T22:40:39.000+0000',
systemModstamp: '2021-04-05T22:40:42.000+0000',
state: 'JobComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 51,
jobType: 'V2Ingest',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 1,
numberRecordsFailed: 0,
retries: 0,
totalProcessingTime: 61,
apiActiveProcessingTime: 12,
apexProcessingTime: 0
}
=== JOB Failure STATUS ===
=== JOB Failure STATUS for job: 7503h000009XNPkAAO ===
"sf__Id","sf__Error",message__c
=== JOB getUnprocessedRecords STATUS ===
=== JOB getUnprocessedRecords STATUS for job: 7503h000009XNPkAAO ===
message__c
Results
User Loading
Table of contents
Using CLI for BulkAPI 2.0
1. Query the current users in the org
sfdx mohanc:data:bulkapi:query -u mohan.chinnappan.n.sel@gmail.com -q ~/.soql/users.soql; pbcopy < ~/.soql/users.soql.csv ; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
-- query used
SELECT Username, LastName, FirstName, Email, Alias,
LOCALESIDKEY,LANGUAGELOCALEKEY,EMAILENCODINGKEY,TIMEZONESIDKEY,PROFILEID,USERROLEID
FROM User
https://d4x000007rxogeaq-dev-ed.my.salesforce.com/services/data/v57.0/jobs/query
{
id: '7504x00000XkfyMAAR',
operation: 'query',
object: 'User',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-21T10:43:31.000+0000',
systemModstamp: '2023-03-21T10:43:31.000+0000',
state: 'UploadComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
lineEnding: 'LF',
columnDelimiter: 'COMMA'
}
=== JOB STATUS ===
=== JOB STATUS for job: 7504x00000XkfyMAAR ===
{
id: '7504x00000XkfyMAAR',
operation: 'query',
object: 'User',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-21T10:43:31.000+0000',
systemModstamp: '2023-03-21T10:43:31.000+0000',
state: 'InProgress',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
jobType: 'V2Query',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 0,
retries: 0,
totalProcessingTime: 0
}
WAITING...
{
id: '7504x00000XkfyMAAR',
operation: 'query',
object: 'User',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-21T10:43:31.000+0000',
systemModstamp: '2023-03-21T10:43:32.000+0000',
state: 'JobComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
jobType: 'V2Query',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 12,
retries: 0,
totalProcessingTime: 157
}
==== Job State: JobComplete ====
=== Total time taken to process the job : 157 milliseconds ===
=== Total records processed : 12 ===
https://d4x000007rxogeaq-dev-ed.my.salesforce.com/services/data/v57.0/jobs/query/7504x00000XkfyMAAR/results
==== Output CSV file written into : /Users/mchinnappan/.soql/users.soql.csv ===
==== View the output file : /Users/mchinnappan/.soql/users.soql.csv using:
cat /Users/mchinnappan/.soql/users.soql.csv ===
=== JOB Failure STATUS ===
=== JOB Failure STATUS for job: 7504x00000XkfyMAAR === "sf__Id","sf__Error","Username","LastName","FirstName","Email","Alias","LocaleSidKey","LanguageLocaleKey","EmailEncodingKey","TimeZoneSidKey","ProfileId","UserRoleId"
===
"sf__Id","sf__Error","Username","LastName","FirstName","Email","Alias","LocaleSidKey","LanguageLocaleKey","EmailEncodingKey","TimeZoneSidKey","ProfileId","UserRoleId"
2. View the users in the app that is automatically opened by command #1 above
3. Edit the query results in your favorite app to add the user(s) to load (I use vim; you can use apps like Excel)
- My data input file looks like this:
"Username","LastName","FirstName","Email","Alias","LocaleSidKey","LanguageLocaleKey","EmailEncodingKey","TimeZoneSidKey","ProfileId","UserRoleId"
"telsa.nik@unix.org.sel","Tesla","Nik","tesla@unix.org","teslan","en_US","en_US","UTF-8","America/Los_Angeles","00e4x000002b8InAAI",""
4. Let us load the data for the new user(s)
sfdx mohanc:data:bulkapi:load -u mohan.chinnappan.n.sel@gmail.com -f /Users/mchinnappan/.soql/users.soql.csv -e LF -o User
5. Query the user Object to check our loading
- Check the bulk data load jobs status for these jobs we created
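The query command in step 1 drives the Bulk API 2.0 query lifecycle: create a query job, poll its status, then download the result CSV. A minimal sketch under the same caveats as before (`send` is a stand-in HTTP helper, not part of the plugin):

```python
import time

def run_bulk_query(send, instance_url, soql, api_version="v57.0"):
    """Sketch of the Bulk API 2.0 query lifecycle shown in step 1.

    `send(method, url, **kwargs)` is a stand-in HTTP helper returning
    parsed JSON (or raw CSV text for the results call).
    """
    base = f"{instance_url}/services/data/{api_version}/jobs/query"
    job = send("POST", base, json={"operation": "query", "query": soql})
    while True:  # poll until JobComplete, as in the log above
        status = send("GET", f"{base}/{job['id']}")
        if status["state"] == "JobComplete":
            break
        if status["state"] in ("Failed", "Aborted"):
            raise RuntimeError(f"query job ended in state {status['state']}")
        time.sleep(2)
    # the results endpoint returns CSV text
    return send("GET", f"{base}/{job['id']}/results", accept="text/csv")
```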
Apex Way
List<User> users = new List<User>();
users.add( new User (
Username = 'abc@gnu.org.sel',
Email = 'abc@gnu.org.invalid',
FirstName = 'John',
LastName = 'Doe',
Alias = 'jdoe',
TimeZoneSidKey = 'America/New_York',
LocaleSidKey = 'en_US',
EmailEncodingKey = 'ISO-8859-1',
ProfileId = [SELECT Id FROM Profile WHERE Name ='Standard Platform User' LIMIT 1].Id,
LanguageLocaleKey = 'en_US'
));
// users.add( new User(...)
Insert users;
- running it
sfdx mohanc:tooling:execute -u mohan.chinnappan.n.sel@gmail.com -a ~/.apex/addUsers.cls
apexCode: //String profile = 'Standard Platform User';
//Id profileId = [SELECT Id FROM Profile WHERE Name =: profile LIMIT 1].Id;
List<User> users = new List<User>();
users.add( new User (
Username = 'abc@gnu.org.sel',
Email = 'abc@gnu.org.invalid',
FirstName = 'John',
LastName = 'Doe',
Alias = 'jdoe',
TimeZoneSidKey = 'America/New_York',
LocaleSidKey = 'en_US',
EmailEncodingKey = 'ISO-8859-1',
ProfileId = [SELECT Id FROM Profile WHERE Name ='Standard Platform User' LIMIT 1].Id,
LanguageLocaleKey = 'en_US'
));
// users.add( new User(...)
Insert users;
compiled?: true
executed?: true
{
line: -1,
column: -1,
compiled: true,
success: true,
compileProblem: null,
exceptionStackTrace: null,
exceptionMessage: null
}
Is there a script that can create this Apex code?
- Yes!
cat users.csv
"Username","LastName","FirstName","Email","Alias","LocaleSidKey","LanguageLocaleKey","EmailEncodingKey","TimeZoneSidKey","ProfileName"
"gfe@gnu.org.sel","Garderner","Joe","jg@gnu.org.invalid","jg","en_US","en_US","ISO-8859-1","America/New_York","Standard Platform User"
"xyz@gnu.org.sel","Smith","Joe","xyz@gnu.org.invalid","jsmith","en_US","en_US","ISO-8859-1","America/New_York","Standard Platform User"
- Download the script userApexgen.py
- Run the script
python3 userApexgen.py users.csv > useradd.cls
cat useradd.cls
List<User> users = new List<User>();
users.add( new User (
Username = 'gfe@gnu.org.sel',
Email = 'jg@gnu.org.invalid',
FirstName = 'Joe',
LastName = 'Garderner',
Alias = 'jg',
TimeZoneSidKey = 'America/New_York',
LocaleSidKey = 'en_US',
EmailEncodingKey = 'ISO-8859-1',
ProfileId = [SELECT Id FROM Profile WHERE Name ='Standard Platform User' LIMIT 1].Id,
LanguageLocaleKey = 'en_US'
));
users.add( new User (
Username = 'xyz@gnu.org.sel',
Email = 'xyz@gnu.org.invalid',
FirstName = 'Joe',
LastName = 'Smith',
Alias = 'jsmith',
TimeZoneSidKey = 'America/New_York',
LocaleSidKey = 'en_US',
EmailEncodingKey = 'ISO-8859-1',
ProfileId = [SELECT Id FROM Profile WHERE Name ='Standard Platform User' LIMIT 1].Id,
LanguageLocaleKey = 'en_US'
));
Insert users;
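The generator can be sketched in a few lines of Python. This is a hypothetical reimplementation, not the actual userApexgen.py; it assumes the CSV columns shown above, including ProfileName, which it turns into the inline Profile query.

```python
import csv, io

# Field order mirrors the generated Apex above; ProfileName becomes an
# inline SOQL lookup instead of a literal Id.
USER_TEMPLATE = """users.add( new User (
  Username = '{Username}',
  Email = '{Email}',
  FirstName = '{FirstName}',
  LastName = '{LastName}',
  Alias = '{Alias}',
  TimeZoneSidKey = '{TimeZoneSidKey}',
  LocaleSidKey = '{LocaleSidKey}',
  EmailEncodingKey = '{EmailEncodingKey}',
  ProfileId = [SELECT Id FROM Profile WHERE Name ='{ProfileName}' LIMIT 1].Id,
  LanguageLocaleKey = '{LanguageLocaleKey}'
));"""

def users_csv_to_apex(csv_text):
    """Turn a users CSV into an anonymous-Apex insert script."""
    lines = ["List<User> users = new List<User>();"]
    lines += [USER_TEMPLATE.format(**row)
              for row in csv.DictReader(io.StringIO(csv_text))]
    lines.append("insert users;")
    return "\n".join(lines)
```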
- Run the apex code to add the users
sfdx mohanc:tooling:execute -u mohan.chinnappan.n.sel@gmail.com -a useradd.cls
Mass assign permission sets to users
- Querying User, PermissionSet and PermissionSetAssignment
SELECT Id
,Username
FROM User
SELECT
Id
,Name
,NamespacePrefix
,Description
,HasActivationRequired
,IsCustom
,IsOwnedByProfile
,Label
,LicenseId
,PermissionSetGroupId
,ProfileId
,Type
FROM PermissionSet
SELECT
Id
,AssigneeId
,IsActive
,ExpirationDate
,PermissionSetId
,PermissionSetGroupId
FROM PermissionSetAssignment
- Users
Id,Username
0054x000007avznAAA,sel.dmr@unix.org.sel
0054x000007avdcAAA,sel.ken@unix.org.sel
- Permissionsets to assign
Id,Name
0PS4x000002QxRNGA0,B2BBuyer
- Assignments
AssigneeId,PermissionSetId
0054x000007avznAAA,0PS4x000002QxRNGA0
0054x000007avdcAAA,0PS4x000002QxRNGA0
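The assignments file above is simply the cross product of the user Ids and the permission set Ids. A small sketch that builds the PermissionSetAssignment load CSV from the two query results (the function name is mine, not part of any tool):

```python
import csv, io

def build_assignment_csv(users_csv, permsets_csv):
    """Cross-join every user Id with every permission set Id to produce
    the AssigneeId,PermissionSetId load file shown above."""
    user_ids = [r["Id"] for r in csv.DictReader(io.StringIO(users_csv))]
    permset_ids = [r["Id"] for r in csv.DictReader(io.StringIO(permsets_csv))]
    rows = ["AssigneeId,PermissionSetId"]
    rows += [f"{u},{p}" for u in user_ids for p in permset_ids]
    return "\n".join(rows)
```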
Data loader way
Product2 Loading and Updating
Topics
Loading
sfdx mohanc:data:bulkapi:query -u mohan.chinnappan.n.sel@gmail.com -q ~/.soql/product2.soql; pbcopy < ~/.soql/product2.soql.csv ; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
SELECT Name, ProductCode, Description, IsActive, Family, QuantityUnitOfMeasure,StockKeepingUnit
FROM Product2
https://d4x000007rxogeaq-dev-ed.my.salesforce.com/services/data/v57.0/jobs/query
{
id: '7504x00000Xkf9FAAR',
operation: 'query',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-21T12:36:50.000+0000',
systemModstamp: '2023-03-21T12:36:50.000+0000',
state: 'UploadComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
lineEnding: 'LF',
columnDelimiter: 'COMMA'
}
=== JOB STATUS ===
=== JOB STATUS for job: 7504x00000Xkf9FAAR ===
{
id: '7504x00000Xkf9FAAR',
operation: 'query',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-21T12:36:50.000+0000',
systemModstamp: '2023-03-21T12:36:50.000+0000',
state: 'InProgress',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
jobType: 'V2Query',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 0,
retries: 0,
totalProcessingTime: 0
}
WAITING...
{
id: '7504x00000Xkf9FAAR',
operation: 'query',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-21T12:36:50.000+0000',
systemModstamp: '2023-03-21T12:36:51.000+0000',
state: 'JobComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
jobType: 'V2Query',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 17,
retries: 0,
totalProcessingTime: 208
}
==== Job State: JobComplete ====
=== Total time taken to process the job : 208 milliseconds ===
=== Total records processed : 17 ===
https://d4x000007rxogeaq-dev-ed.my.salesforce.com/services/data/v57.0/jobs/query/7504x00000Xkf9FAAR/results
==== Output CSV file written into : /Users/mchinnappan/.soql/product2.soql.csv ===
==== View the output file : /Users/mchinnappan/.soql/product2.soql.csv using:
cat /Users/mchinnappan/.soql/product2.soql.csv ===
=== JOB Failure STATUS ===
=== JOB Failure STATUS for job: 7504x00000Xkf9FAAR === "sf__Id","sf__Error","Id","ProductCode","Description","IsActive","Family","QuantityUnitOfMeasure","StockKeepingUnit"
===
"sf__Id","sf__Error","Id","ProductCode","Description","IsActive","Family","QuantityUnitOfMeasure","StockKeepingUnit"
Input file
cat /Users/mchinnappan/.soql/product2.soql2.csv
"Name","ProductCode","Description","IsActive","Family","QuantityUnitOfMeasure","StockKeepingUnit"
"Autopilot","MC123456","","true","","",""
Load
sfdx mohanc:data:bulkapi:load -u mohan.chinnappan.n.sel@gmail.com -f /Users/mchinnappan/.soql/product2.soql2.csv -e LF -o Product2
Query the results
sfdx mohanc:data:bulkapi:query -u mohan.chinnappan.n.sel@gmail.com -q ~/.soql/product2.soql; pbcopy < ~/.soql/product2.soql.csv ; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
Update example
cat ~/.soql/product2-id.soql.csv
"Id","Description"
"01t4x000004kZPaAAM","Autopilot Description"
- Update
sfdx mohanc:data:bulkapi:update -u mohan.chinnappan.n.sel@gmail.com -f /Users/mchinnappan/.soql/product2-id.soql.csv -e LF -o Product2
- Query
sfdx mohanc:data:bulkapi:query -u mohan.chinnappan.n.sel@gmail.com -q ~/.soql/product2-id.soql; pbcopy < ~/.soql/product2-id.soql.csv ; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
Bulk data load status
Delete
sfdx mohanc:data:bulkapi:delete -u mohan.chinnappan.n.sel@gmail.com -f /Users/mchinnappan/.soql/product2-id.del.csv -e LF -o Product2
/Users/mchinnappan/.soql/product2-id.del.csv Product2 LF
=== CREATE JOB ===
{
id: '7504x00000Xl3OlAAJ',
operation: 'delete',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-23T08:11:55.000+0000',
systemModstamp: '2023-03-23T08:11:55.000+0000',
state: 'Open',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
contentUrl: 'services/data/v57.0/jobs/ingest/7504x00000Xl3OlAAJ/batches',
lineEnding: 'LF',
columnDelimiter: 'COMMA'
}
jobId: 7504x00000Xl3OlAAJ
=== JOB STATUS ===
=== JOB STATUS for job: 7504x00000Xl3OlAAJ ===
{
id: '7504x00000Xl3OlAAJ',
operation: 'delete',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-23T08:11:55.000+0000',
systemModstamp: '2023-03-23T08:11:55.000+0000',
state: 'Open',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
jobType: 'V2Ingest',
contentUrl: 'services/data/v57.0/jobs/ingest/7504x00000Xl3OlAAJ/batches',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
retries: 0,
totalProcessingTime: 0,
apiActiveProcessingTime: 0,
apexProcessingTime: 0
}
=== PUT DATA === SIZE: == 26 MB ==
result: status: 201, statusText: Created
=== JOB STATUS ===
=== JOB STATUS for job: 7504x00000Xl3OlAAJ ===
{
id: '7504x00000Xl3OlAAJ',
operation: 'delete',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-23T08:11:55.000+0000',
systemModstamp: '2023-03-23T08:11:55.000+0000',
state: 'Open',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
jobType: 'V2Ingest',
contentUrl: 'services/data/v57.0/jobs/ingest/7504x00000Xl3OlAAJ/batches',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 0,
numberRecordsFailed: 0,
retries: 0,
totalProcessingTime: 0,
apiActiveProcessingTime: 0,
apexProcessingTime: 0
}
=== PATCH STATE ===
{
id: '7504x00000Xl3OlAAJ',
operation: 'delete',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-23T08:11:55.000+0000',
systemModstamp: '2023-03-23T08:11:55.000+0000',
state: 'UploadComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57
}
=== JOB STATUS ===
=== JOB STATUS for job: 7504x00000Xl3OlAAJ ===
jobStatus {
id: '7504x00000Xl3OlAAJ',
operation: 'delete',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-23T08:11:55.000+0000',
systemModstamp: '2023-03-23T08:11:57.000+0000',
state: 'InProgress',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
jobType: 'V2Ingest',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 0,
numberRecordsFailed: 0,
retries: 0,
totalProcessingTime: 0,
apiActiveProcessingTime: 0,
apexProcessingTime: 0
}
WAITING...
{
id: '7504x00000Xl3OlAAJ',
operation: 'delete',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-23T08:11:55.000+0000',
systemModstamp: '2023-03-23T08:11:58.000+0000',
state: 'JobComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
jobType: 'V2Ingest',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 1,
numberRecordsFailed: 0,
retries: 0,
totalProcessingTime: 363,
apiActiveProcessingTime: 289,
apexProcessingTime: 0
}
=== JOB Failure STATUS ===
=== JOB Failure STATUS for job: 7504x00000Xl3OlAAJ ===
"sf__Id","sf__Error","Id"
=== JOB getUnprocessedRecords STATUS ===
=== JOB getUnprocessedRecords STATUS for job: 7504x00000Xl3OlAAJ ===
"Id"
Query After delete
sfdx mohanc:data:bulkapi:query -u mohan.chinnappan.n.sel@gmail.com -q ~/.soql/product2-id.soql; pbcopy < ~/.soql/product2-id.soql.csv ; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
SELECT Id,Name,ProductCode, Description, IsActive, Family, QuantityUnitOfMeasure,StockKeepingUnit
FROM Product2
https://d4x000007rxogeaq-dev-ed.my.salesforce.com/services/data/v57.0/jobs/query
{
id: '7504x00000Xl3O8AAJ',
operation: 'query',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-23T08:13:04.000+0000',
systemModstamp: '2023-03-23T08:13:04.000+0000',
state: 'UploadComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
lineEnding: 'LF',
columnDelimiter: 'COMMA'
}
=== JOB STATUS ===
=== JOB STATUS for job: 7504x00000Xl3O8AAJ ===
{
id: '7504x00000Xl3O8AAJ',
operation: 'query',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-23T08:13:04.000+0000',
systemModstamp: '2023-03-23T08:13:04.000+0000',
state: 'InProgress',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
jobType: 'V2Query',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 0,
retries: 0,
totalProcessingTime: 0
}
WAITING...
{
id: '7504x00000Xl3O8AAJ',
operation: 'query',
object: 'Product2',
createdById: '0054x000006Riv4AAC',
createdDate: '2023-03-23T08:13:04.000+0000',
systemModstamp: '2023-03-23T08:13:05.000+0000',
state: 'JobComplete',
concurrencyMode: 'Parallel',
contentType: 'CSV',
apiVersion: 57,
jobType: 'V2Query',
lineEnding: 'LF',
columnDelimiter: 'COMMA',
numberRecordsProcessed: 17,
retries: 0,
totalProcessingTime: 220
}
==== Job State: JobComplete ====
=== Total time taken to process the job : 220 milliseconds ===
=== Total records processed : 17 ===
https://d4x000007rxogeaq-dev-ed.my.salesforce.com/services/data/v57.0/jobs/query/7504x00000Xl3O8AAJ/results
==== Output CSV file written into : /Users/mchinnappan/.soql/product2-id.soql.csv ===
==== View the output file : /Users/mchinnappan/.soql/product2-id.soql.csv using:
cat /Users/mchinnappan/.soql/product2-id.soql.csv ===
=== JOB Failure STATUS ===
=== JOB Failure STATUS for job: 7504x00000Xl3O8AAJ === "sf__Id","sf__Error","Id","Name","ProductCode","Description","IsActive","Family","QuantityUnitOfMeasure","StockKeepingUnit"
===
"sf__Id","sf__Error","Id","Name","ProductCode","Description","IsActive","Family","QuantityUnitOfMeasure","StockKeepingUnit"
FlowInterview Query and Deletion using CLI (BulkAPI2.0)
Query
- Note:
  - The username used in the examples is mohan.chinnappan.n.sel@gmail.com
  - Replace it with your own username after authenticating with:
# For PROD and Developer Edition orgs
sfdx force:auth:web:login -r https://login.salesforce.com
# For sandboxes
sfdx force:auth:web:login -r https://test.salesforce.com
Query for the FlowInterview records with Error InterviewStatus
SELECT Id, CurrentElement, FlowVersionViewId,Guid,InterviewLabel,
InterviewStatus,Name,OwnerId,PauseLabel,
WasPausedFromScreen
FROM FlowInterview
WHERE InterviewStatus ='Error'
- Store the above SOQL in a file, say flowInterview.soql
- Execute the query
sfdx mohanc:data:bulkapi:query -u mohan.chinnappan.n.sel@gmail.com -q flowInterview.soql
- This will create a CSV output in flowInterview.soql.csv. Check the records for deletion.
- If you have decided to delete these records, you can run the following query
SELECT Id
FROM FlowInterview
WHERE InterviewStatus ='Error'
- Store the above SOQL in a file, say flowInterviewIds.soql
- Execute the query
sfdx mohanc:data:bulkapi:query -u mohan.chinnappan.n.sel@gmail.com -q flowInterviewIds.soql
Deletion
sfdx mohanc:data:bulkapi:delete -u mohan.chinnappan.n.sel@gmail.com -f flowInterviewIds.soql.csv -e LF -o FlowInterview
- This will delete all the FlowInterview records with Ids listed in flowInterviewIds.soql.csv
References
Prepare and Load for data tree import
Convert the given CSV to data:tree:import JSON and plan file
Using csv2json app
- Pass the URL parameter 'so' to set the SObject required for loading
Demo
Sample files
- pricebook.csv
"Name","IsActive","Description"
"US Price Book","true","Description for US Price Book"
"Canada Pricebook","false","Description for Canada Pricebook"
- pricebook.csv.json
{
"records": [
{
"Name": "US Price Book",
"IsActive": true,
"Description": "Description for US Price Book",
"attributes": {
"type": "PriceBook2",
"referenceId": "PriceBook2Ref0"
}
},
{
"Name": "Canada Pricebook",
"IsActive": false,
"Description": "Description for Canada Pricebook",
"attributes": {
"type": "PriceBook2",
"referenceId": "PriceBook2Ref1"
}
}
]
}
- plan-pricebook.csv.json
[
{
"sobject": "PriceBook2",
"saveRefs": true,
"resolveRefs": false,
"files": [
"pricebook.csv.json"
]
}
]
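The conversion the app performs can be sketched in Python. This mirrors the pricebook example above; the function names are mine, and the only non-obvious step is coercing the CSV strings "true"/"false" into real JSON booleans.

```python
import csv, io

def csv_to_tree_records(csv_text, sobject):
    """Build the data:tree:import records JSON from a CSV."""
    records = []
    for i, row in enumerate(csv.DictReader(io.StringIO(csv_text))):
        # coerce "true"/"false" strings into booleans; keep other values as-is
        rec = {k: {"true": True, "false": False}.get(v.lower(), v)
               for k, v in row.items()}
        # each record needs a type and a unique referenceId
        rec["attributes"] = {"type": sobject, "referenceId": f"{sobject}Ref{i}"}
        records.append(rec)
    return {"records": records}

def tree_plan(sobject, json_file):
    """Build the matching plan file."""
    return [{"sobject": sobject, "saveRefs": True,
             "resolveRefs": False, "files": [json_file]}]
```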
Use CSV to SF JSON and Plan file script
python csv2sf_json.py --csv-file input.csv --sobject=Account
Conversion Done.
Import data into your org with: sfdx data:tree:import -u $USERNAME -p input.csv.json-plan.json
Loading State and Country Picklist
- This tool addresses a customer need, since you can't use the Metadata API to create or delete states, countries, or territories
- This app fills that gap by creating new states, countries, or territories
- This app uses Chrome's new Recorder feature
Topics
Demo
Where I can find this app?
Using SF-LAND Data Viz to get All Countries data
- I prepared this data by web scraping Wikipedia; feel free to use it.
Visualizing State and Country - AddressSettings
- package.xml to retrieve the data
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
<members>Address</members>
<name>Settings</name>
</types>
<version>56.0</version>
</Package>
- Transform to get the visualization
xmlutil transform --xsl=addresssettings --xml=Address.settings --out=/tmp/address.html
References
Updating State and Country Picklist
- You can use the Metadata API to edit existing states, countries, and territories in state and country/territory picklists. You can't use the Metadata API to create or delete new states, countries, or territories.
- Once you have loaded the State and Country Picklist, you can update it, for example by selectively activating a country as your use case requires.
- The options described here explain how to activate a particular country or countries.
Topics
- Option-1 Activating using Recorder application
- Option-2 Using ONNU Chrome Extension
- Option-3 Using CLI mdapi
- Option-4 retrieve into SFDX project using source:retrieve
Option-1 Activating using Recorder application
Before
Replaying the recording
After
Option-2 Using ONNU Chrome Extension
Option-3 Using CLI mdapi
Retrieve
sfdx force:mdapi:retrieve -k unpackaged/package.xml -u mohan.chinnappan.n.sel@gmail.com -r .
Retrieving v56.0 metadata from mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Retrieve ID: 09S4x00000F0DGMEA3
Retrieving metadata from mohan.chinnappan.n.sel@gmail.com... done
Wrote retrieve zip to /Users/mchinnappan/Downloads/updates/unpackaged.zip
jar tvf unpackaged.zip
569563 Mon Mar 27 23:41:02 EDT 2023 unpackaged/settings/Address.settings
226 Mon Mar 27 23:41:02 EDT 2023 unpackaged/package.xml
- Edit Address.settings as in Option-2 (the ONNU option) to update the active tags, then zip it up as updatedSCPL.zip
Deploy the edited Address.settings file
sfdx force:mdapi:deploy -u mohan.chinnappan.n.sel@gmail.com -f updatedSCPL.zip -c --verbose --loglevel TRACE -w 1000
Deploying metadata to mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Deploy ID: 0Af4x00000YsXnoCAF
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
Type File Name Id
─────────────── ──────────────────────────────────── ─────────── ──
unpackaged/package.xml package.xml
AddressSettings unpackaged/settings/Address.settings Address
Option-4 retrieve into SFDX project using source:retrieve
cat package.xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
<members>Address</members>
<name>Settings</name>
</types>
<version>56.0</version>
</Package>
sfdx force:source:retrieve -u mohan.chinnappan.n.sel@gmail.com -x package.xml
Retrieving v56.0 metadata from mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Preparing retrieve request... done
=== Retrieved Source
FULL NAME TYPE PROJECT PATH
───────── ──────── ─────────────────────────────────────────────────────────
Address Settings force-app/main/default/settings/Address.settings-meta.xml
- Now you can commit this into your version control and make changes to Address.settings-meta.xml in your editor
vi force-app/main/default/settings/Address.settings-meta.xml
Deploy the updated Address.settings-meta.xml into the org
sfdx force:source:deploy -u mohan.chinnappan.n.sel@gmail.com -x package.xml
Deploying v56.0 metadata to mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Deploy ID: 0Af4x00000YsZyUCAV
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
FULL NAME TYPE PROJECT PATH
───────── ──────── ─────────────────────────────────────────────────────────
Address Settings force-app/main/default/settings/Address.settings-meta.xml
Deploy Succeeded.
Project Management
Topics
Semantic Doc Compare and Group
Project Estimates
Estimate | Comments |
---|---|
Planning estimates | A rough order of magnitude (ROM) estimate. ROM is an approximation strategy that helps project managers present important financial estimates to clients and consumers. It helps determine the feasibility of a project and is important for the decision-making process |
Backlog estimates | Provides data on the amount of work that isn't complete - work has started but is not yet finished |
Timebox estimates | Shows how long it might take to complete a project - an important part of Agile and Lean software development. Involves subdividing a project into smaller chunks of work and estimating each chunk separately. |
Order-of-magnitude estimates | Helps estimate the cost and effort to complete a project - lets you gauge how big something is without knowing the exact value. |
Feasibility estimates | Useful when determining whether a project or task is doable and how much it may cost |
Detailed estimates | Includes several line items or categories of work for a project. You can use it to estimate the time and resources necessary to complete a project |
Analogous estimates | Uses experience to predict the time and cost required for a project. For example, if you've completed similar projects in the past and know how long each one took, you can use this information to estimate future projects |
Top-down estimates | Starts with a high-level project scope, which you break down into smaller pieces. |
References
PERT - Program Evaluation and Review Technique
- A statistical tool used in project management, designed to analyze and represent the tasks involved in completing a given project.
- First developed by the United States Navy in 1958, it is commonly used in conjunction with the critical path method (CPM), which was introduced in 1957.
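PERT's three-point estimate is a weighted average in which the most likely duration counts four times as much as the optimistic and pessimistic extremes: E = (O + 4M + P) / 6, with standard deviation (P - O) / 6. A worked example:

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """Classic PERT three-point estimate and its standard deviation."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

# A task estimated at 2 days best case, 4 days most likely, 6 days worst
# case has an expected duration of (2 + 16 + 6) / 6 = 4.0 days.
```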
Event Monitoring
Viewing and Visualizing Event Log files using DX
Topics
Listing the Event Types
- Requires 0.0.115 version of the plugin
- sfdx-mohanc-plugins@0.0.115
- How to install the plugin
Usage
USAGE
$ sfdx mohanc:monitoring:em:types
OPTIONS
-u, --targetusername=targetusername username or alias for the target org; overrides default target org
-v, --targetdevhubusername=targetdevhubusername username or alias for the dev hub org; overrides default dev hub org
--apiversion=apiversion override the api version used for api requests made by this command
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal) logging level for this command invocation
EXAMPLE
** Show Event Types **
sfdx mohanc:monitoring:em:types -u <userName>
Demo
$ sfdx mohanc:monitoring:em:types -u mohan.chinnappan.n_ea2@gmail.com
API
ApexExecution
ApexTrigger
BulkApi
ContentTransfer
DocumentAttachmentDownloads
LightningInteraction
LightningPageView
LightningPerformance
Login
Logout
RestApi
URI
WaveChange
WaveInteraction
WavePerformance
Get the Event files for given Event Type
Usage
$ sfdx mohanc:monitoring:em:get -h
Event Files for the given Event Type
USAGE
$ sfdx mohanc:monitoring:em:get
OPTIONS
-e, --enddate=enddate End DateTime for the logs in the format YYYY-MM-DDThh:mm:ssZ, example:
2012-12-30T23:00:00Z
-o, --eventfileout=eventfileout Output CSV file name to write the events into
-s, --startdate=startdate Start DateTime for the logs in the format YYYY-MM-DDThh:mm:ssZ, example:
2012-12-30T00:10:00Z
-t, --eventtype=eventtype Event Type for which Event File is requested
-u, --targetusername=targetusername username or alias for the target org; overrides default target org
-v, --targetdevhubusername=targetdevhubusername username or alias for the dev hub org; overrides default dev hub org
--apiversion=apiversion override the api version used for api requests made by this command
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal) logging level for this command invocation
EXAMPLE
** Event Files for the given Event Type **
sfdx mohanc:monitoring:em:get -u <userName> -t <eventType> -o <outFileName> -s <startDateTime> -e <endDateTime>
Demo
$ sfdx mohanc:monitoring:em:get -t API -u mohan.chinnappan.n_ea2@gmail.com -o API2d.csv -s 2020-11-10T00:01:00Z -e 2020-11-13T00:02:00Z
=== Deleting old output file API2d.csv ...
=== Collecting logs for the EventType: API ...
=== Getting log file for LogDate: 2020-11-11T00:00:00.000+0000 of Size: 15113, Total: 15113 Bytes ...
=== Getting log file for LogDate: 2020-11-10T00:00:00.000+0000 of Size: 5590, Total: 20703 Bytes ...
=== Getting log file for LogDate: 2020-11-08T00:00:00.000+0000 of Size: 16036, Total: 36739 Bytes ...
=== Getting log file for LogDate: 2020-11-09T00:00:00.000+0000 of Size: 4159, Total: 40898 Bytes ...
Total Bytes written: 40898
=== Opening API2d.csv ...
Charts
$ sfdx mohanc:data:viz:barChart -d ./API2d.csv -e ./API.csv.encoding-2.json -m bar -o ./APIbarchart.html -p ./apiChart.json
=== Opening ./APIbarchart.html via Local Web Server on port :7020 ...
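Once downloaded, the event CSV can be post-processed with a few lines of Python. A minimal sketch using only the standard library; the column names (EVENT_TYPE, USER_ID, CLIENT_IP) and the rows are made up for illustration, since the actual columns vary by event type:

```python
import csv
from collections import Counter
from io import StringIO

def count_by_column(csv_text, column):
    """Count rows in an event log CSV, grouped by the given column."""
    reader = csv.DictReader(StringIO(csv_text))
    return Counter(row[column] for row in reader)

# Hypothetical sample rows in the shape of an API event log file
sample = """EVENT_TYPE,USER_ID,CLIENT_IP
API,005xx1,10.0.0.1
API,005xx2,10.0.0.2
API,005xx1,10.0.0.1
"""
print(count_by_column(sample, "USER_ID"))  # 005xx1 appears twice, 005xx2 once
```

The same helper works on the real file: `count_by_column(open("API2d.csv").read(), "USER_ID")`.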
Proactive Monitoring
- Part of Signature Success Plan
Signature Success Plan
Feature | Comments |
---|---|
Technical Account Manager (TAM) | Designated TAM |
Personalized Trends and Insights | |
Key Event Management | |
24/7 Proactive Monitoring | Predict and prevent disruptions before they occur. Includes customized monitoring plan including Smart Alerts to detect potential risks and root causes |
Continuous 24/7 monitoring
The Salesforce team watches your production org's limits day and night, armed with proprietary diagnostic tools and the insights that come with overseeing a broad universe of Salesforce orgs. Continuous 24/7 monitoring of your key Salesforce solutions helps you predict and prevent issues.
Earlier warning of critical issues featuring Smart Alert
Salesforce support teams can swing into action more quickly, solving issues faster and reducing the likelihood that issues become severe.
Technical guidance
The Salesforce team provides technical guidance from Salesforce monitoring experts, based on best practices learned from supporting more than 150,000 customers. This guidance can help the customer restore functionality more quickly and reduce the likelihood of future errors and system problems.
Quarterly trends and reports
The customer receives quarterly trends and reports, plus an annual health review from the Signature team.
Samples
- The Salesforce team can also initiate emergency measures quickly (like increasing login limits) to keep your business running smoothly while issues are being addressed.
- The customer chooses the automations and triggered sends to be monitored, and the Salesforce team tracks them 24/7.
- The Salesforce team continuously monitors CPU utilization, another indicator of when and where problems may arise, and is empowered to start reviewing custom code right away, identifying issues and recommending adjustments without delay. This service doesn't fix the code, but it diagnoses the problem and tells you what to do next to keep it from happening again.
- Batch processes can slow down the org if best practices are not followed. The Salesforce team can quickly help the customer align rogue batch processes (caused by custom code or third-party vendors) with best practices.
References
Real-Time Event Monitoring
- Monitor and detect standard events in Salesforce in near real-time
- Event data can be stored for auditing or reporting purposes
Insights
Insight | Comments |
---|---|
Who viewed what data and when? | |
Where data was accessed? | |
When a user changes a record using the UI? | |
Who is logging in and from where? | |
Who in your org is performing actions related to Platform Encryption administration? | |
Which admins logged in as another user and the actions the admin took as that user? | |
How long it takes a Lightning page to load? | |
Threats detected in your org, such as anomalies in how users view or export reports, session hijacking attacks, or credential stuffing attacks | |
Key terms
Term | Definition |
---|---|
Event | An event is anything that happens in Salesforce, including user clicks, record state changes, and measuring values. Events are immutable and timestamped. |
Event Channel | A stream of events on which an event producer sends event messages and event consumers read those messages. |
Event Subscriber | A subscriber to a channel that receives messages from the channel. For example, a security app is notified of new report downloads. |
Event Message | A message used to transmit data about the event. |
Event Publisher | The publisher of an event message over a channel, such as a security and auditing app. |
Setting up - Enable Access to Real-Time Event Monitoring
Via Profile or PermissionSet
View Real-Time Event Monitoring Data
Customize Application
Event Manager
- From Setup, in the Quick Find box, enter Events, then select Event Manager.
- Next to the event you want to enable or disable streaming for, click the dropdown menu.
- Select whether you want to enable or disable streaming or storing on the event.
Querying RealTimeEventSettings
cat ~/.soql/RealTimeEventSettings.soql
SELECT
Id
,DurableId
,Metadata
,FullName
FROM RealTimeEventSettings
sfdx mohanc:tooling:query -u mohan.chinnappan.n_ea2@gmail.com -q ~/.soql/RealTimeEventSettings.soql -f json
[
{
"attributes": {
"type": "RealTimeEventSettings",
"url": "/services/data/v57.0/tooling/sobjects/RealTimeEventSettings/bWRjLzBIRS9SZWFsVGltZUV2ZW50U2V0dGluZ3M%3D"
},
"Id": "000000000000000AAA",
"DurableId": "bWRjLzBIRS9SZWFsVGltZUV2ZW50U2V0dGluZ3M=",
"Metadata": {
"realTimeEvents": [
{
"entityName": "ApiEvent",
"isEnabled": false
},
{
"entityName": "ApiEventStream",
"isEnabled": false
},
{
"entityName": "ApiAnomalyEventStore",
"isEnabled": false
},
{
"entityName": "ApiAnomalyEvent",
"isEnabled": false
},
{
"entityName": "BulkApiResultEventStore",
"isEnabled": false
},
{
"entityName": "BulkApiResultEvent",
"isEnabled": false
},
{
"entityName": "ConcurLongRunApexErrEvent",
"isEnabled": false
},
{
"entityName": "CredentialStuffingEventStore",
"isEnabled": false
},
{
"entityName": "CredentialStuffingEvent",
"isEnabled": false
},
{
"entityName": "FileEventStore",
"isEnabled": false
},
{
"entityName": "FileEvent",
"isEnabled": false
},
{
"entityName": "IdentityVerificationEvent",
"isEnabled": false
},
{
"entityName": "IdentityProviderEventStore",
"isEnabled": false
},
{
"entityName": "LightningUriEvent",
"isEnabled": false
},
{
"entityName": "LightningUriEventStream",
"isEnabled": false
},
{
"entityName": "ListViewEvent",
"isEnabled": false
},
{
"entityName": "ListViewEventStream",
"isEnabled": false
},
{
"entityName": "LoginEvent",
"isEnabled": false
},
{
"entityName": "LoginEventStream",
"isEnabled": false
},
{
"entityName": "LoginAsEvent",
"isEnabled": false
},
{
"entityName": "LoginAsEventStream",
"isEnabled": false
},
{
"entityName": "LogoutEvent",
"isEnabled": false
},
{
"entityName": "LogoutEventStream",
"isEnabled": false
},
{
"entityName": "PermissionSetEventStore",
"isEnabled": false
},
{
"entityName": "PermissionSetEvent",
"isEnabled": false
},
{
"entityName": "ReportEvent",
"isEnabled": false
},
{
"entityName": "ReportEventStream",
"isEnabled": false
},
{
"entityName": "ReportAnomalyEventStore",
"isEnabled": false
},
{
"entityName": "ReportAnomalyEvent",
"isEnabled": false
},
{
"entityName": "SessionHijackingEventStore",
"isEnabled": false
},
{
"entityName": "SessionHijackingEvent",
"isEnabled": false
},
{
"entityName": "UriEvent",
"isEnabled": false
},
{
"entityName": "UriEventStream",
"isEnabled": false
}
],
"urls": null
},
"FullName": "RealTimeEvent"
}
]
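The Metadata payload returned by this query is easy to post-process. A small sketch that extracts the entityName of every enabled event from the JSON output (fed a trimmed-down sample here):

```python
import json

def enabled_events(settings_json):
    """Return the entityName of every enabled real-time event."""
    records = json.loads(settings_json)
    names = []
    for rec in records:
        for ev in rec["Metadata"]["realTimeEvents"]:
            if ev["isEnabled"]:
                names.append(ev["entityName"])
    return names

# Trimmed-down sample of the query result shown above
sample = json.dumps([{
    "Metadata": {"realTimeEvents": [
        {"entityName": "LoginEventStream", "isEnabled": True},
        {"entityName": "ApiEvent", "isEnabled": False},
    ]}
}])
print(enabled_events(sample))  # ['LoginEventStream']
```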
What are the standard events?
Requirements
- Accessing the following event objects requires either the Salesforce Shield or Salesforce Event Monitoring add-on subscription, and the View Real-Time Event Monitoring Data user permission.
Event | Description | Notes |
---|---|---|
ApiEvent | Tracks these user-initiated read-only API calls: query(), queryMore(), and count(). Captures API requests through SOAP API and Bulk API for the Enterprise and Partner WSDLs. Tooling API calls and API calls originating from a Salesforce mobile app aren’t captured. | |
ApiAnomalyEvent | Tracks anomalies in how users make API calls. | |
PermissionSetEventStore | Tracks changes to permission sets and permission set groups. This event initiates when a permission is added to, or removed from a permission set.This event also initiates when a permission set containing a critical permission is assigned or unassigned. | |
LoginAsEvent | Tracks when an admin logs in as another user in your org. In Real-Time Event Monitoring, it captures events for org admins and Experience Cloud sites only. | |
LoginEvent | LoginEvent tracks the login activity of users who log in to Salesforce. | |
LogoutEvent | Tracks user UI logouts. A logout event records a successful user logout from your org’s UI. | |
IdentityVerificationEvent | Tracks user identity verification events in your org. | |
IdentityProviderEventStore | Tracks problems and successes with inbound SAML or OpenID Connect authentication requests from another app provider. It also records outbound SAML responses when Salesforce is acting as an identity provider. | |
ListViewEvent | Tracks when users access data with list views using Lightning Experience, Salesforce Classic, or the API. It doesn’t track list views of Setup entities. | |
ReportEvent | Tracks when reports are run in your org. | |
Event and storage details
- With Real-Time Event Monitoring, you can store and query event data in Salesforce objects. Many of the storage events are Salesforce big objects, which are ideal for storing large volumes of data for up to six months. A big object stores the data natively in Salesforce so you can access it for reporting and other uses. Some storage events, such as for Threat Detection, are standard Salesforce objects.
Platform Event | Object for Event Storage | Can Be Used in a Transaction Security Policy? |
---|---|---|
ApiAnomalyEvent | ApiAnomalyEventStore | Yes |
ApiEventStream | ApiEvent | Yes |
BulkApiResultEvent | BulkApiResultEventStore | Yes |
ConcurLongRunApexErrEvent | Not Available | |
CredentialStuffingEvent | CredentialStuffingEventStore | Yes |
FileEvent (Beta) | FileEventStore (Beta) | |
Not Available | IdentityVerificationEvent | |
Not Available | IdentityProviderEventStore | |
LightningUriEventStream | LightningUriEvent | |
ListViewEventStream | ListViewEvent | Yes |
LoginAsEventStream | LoginAsEvent | |
LoginEventStream | LoginEvent | Yes |
LogoutEventStream | LogoutEvent | |
MobileEmailEvent | Not Available | |
MobileEnforcedPolicyEvent | Not Available | |
MobileScreenshotEvent | Not Available | |
MobileTelephonyEvent | Not Available | |
PermissionSetEvent | PermissionSetEventStore | Yes |
ReportAnomalyEvent | ReportAnomalyEventStore | Yes |
ReportEventStream | ReportEvent | Yes |
SessionHijackingEvent | SessionHijackingEventStore | Yes |
UriEventStream | UriEvent | |
LogoutEventStream
- When LogoutEventStream is enabled, Salesforce publishes logout events when users log out from the UI. You can add an Apex trigger to subscribe to those events. You can then implement custom logic during logout. For example, you can revoke all refresh tokens for a user at logout.
trigger LogoutEventTrigger on LogoutEventStream (after insert) {
    // The subscriber inserts a custom logout event record during logout.
    LogoutEventStream event = Trigger.new[0];
    LogoutEvent__c record = new LogoutEvent__c();
    record.EventIdentifier__c = event.EventIdentifier;
    record.UserId__c = event.UserId;
    record.Username__c = event.Username;
    record.EventDate__c = event.EventDate;
    record.RelatedEventIdentifier__c = event.RelatedEventIdentifier;
    record.SessionKey__c = event.SessionKey;
    record.LoginKey__c = event.LoginKey;
    insert record;
}
- Simpler one
trigger LogoutEventTrigger on LogoutEventStream (after insert) {
    LogoutEventStream event = Trigger.new[0];
    System.debug(event);
}
sfdx force:source:deploy -u mohan.chinnappan.n_ea2@gmail.com -p force-app/main/default/triggers/LogoutEventTrigger.trigger
Deploying v55.0 metadata to mohan.chinnappan.n_ea2@gmail.com using the v57.0 SOAP API
Deploy ID: 0Af3h00000SEzjlCAD
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
FULL NAME TYPE PROJECT PATH
────────────────── ─────────── ───────────────────────────────────────────────────────────────────
LogoutEventTrigger ApexTrigger force-app/main/default/triggers/LogoutEventTrigger.trigger
LogoutEventTrigger ApexTrigger force-app/main/default/triggers/LogoutEventTrigger.trigger-meta.xml
Deploy Succeeded.
Listing logs
sfdx force:apex:log:list -u mohan.chinnappan.n_ea2@gmail.com --json | sfdx mohanc:data:jq -q '.result' | pbcopy; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=json"
Use Async SOQL with Real-Time Event Monitoring
- Use case
Let’s say you’ve created a custom object called Patent__c that contains sensitive patent information. You want to know when users query this object using any API. Use the following Async SOQL query on the ApiEvent object to determine when Patent__c was last accessed, who accessed it, and what part of it was accessed. The WHERE clause uses the QueriedEntities field to narrow the results to just API queries of the Patent__c object.
Async SOQL schedules and runs queries asynchronously in the background, so it can run queries that normally time out with regular SOQL.
With Async SOQL, you can run multiple queries in the background while monitoring their completion status. Set up your queries and come back a few hours later to a dataset to work with. Async SOQL is the most efficient way to process the large amount of data in a storage event, especially for big objects.
- URI
https://yourInstance.salesforce.com/services/data/v56.0/async-queries/
- Payload (payload.json)
{
"query": "SELECT EventDate, EventIdentifier, QueriedEntities, SourceIp, Username, UserAgent FROM ApiEvent
WHERE QueriedEntities LIKE '%Patent__c%'",
"targetObject": "ApiTarget__c",
"targetFieldMap": {
"EventDate": "EventDate__c",
"EventIdentifier": "EventIdentifier__c",
"QueriedEntities": "QueriedEntities__c",
"SourceIp": "IPAddress__c",
"Username": "User__c",
"UserAgent": "UserAgent__c"
}
}
curl -H "Authorization: Bearer <accessToken>" -d @payload.json -X POST https://yourInstance.salesforce.com/services/data/v56.0/async-queries/
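The same POST can be sketched with Python's standard library. The instance URL and access token are placeholders, and only a subset of the payload fields is shown:

```python
import json
import urllib.request

def build_async_query_request(instance_url, access_token, payload):
    """Build (but do not send) the POST request for the Async SOQL endpoint."""
    return urllib.request.Request(
        url=f"{instance_url}/services/data/v56.0/async-queries/",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

payload = {
    "query": "SELECT EventDate, Username FROM ApiEvent WHERE QueriedEntities LIKE '%Patent__c%'",
    "targetObject": "ApiTarget__c",
    "targetFieldMap": {"EventDate": "EventDate__c", "Username": "User__c"},
}
req = build_async_query_request("https://yourInstance.salesforce.com", "<accessToken>", payload)
# urllib.request.urlopen(req) would submit it against a live org
print(req.get_method(), req.full_url)
```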
Let us look at ReportEvent
sfdx mohanc:md:describe -u mohan.chinnappan.n_ea2@gmail.com -s ReportEvent | pbcopy; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
# Fields in ReportEvent Object
cut -f 3 -d "," /tmp/reportevents.csv | sed 's/"//g' | sed 's/^/,/g'
SELECT
Id
,CreatedDate
,EventIdentifier
,UserId
,Username
,EventDate
,RelatedEventIdentifier
,LoginHistoryId
,RowsProcessed
,Operation
,QueriedEntities
,PolicyId
,PolicyOutcome
,EvaluationTime
,SessionKey
,LoginKey
,SessionLevel
,SourceIp
,DashboardId
,DashboardName
,EventSource
,ExecutionIdentifier
,ExportFileFormat
,IsScheduled
,NumberOfColumns
,ColumnHeaders
,GroupedColumnHeaders
,Description
,Format
,ReportId
,Records
,Name
,OwnerId
,Scope
,Sequence
,DisplayedFieldEntities
FROM reportEvent
sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q ~/.soql/reportEvent.soql | pbcopy; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
References
Auditing Scripts

Contents
- List unapproved users with given profile in the org
- List unapproved users having ModifyAllData
- List unapproved users having ViewAllData
- List Users with given Profile who have Role
- Check users with given PermissionSet
- Check SetupAuditTrail
- Operationalize Audit Scripts using github-actions
List unapproved users with given profile in the org
Sample approved_admins.txt content:
Ken Thompson
Dennis Ritchie
Niklaus Wirth
ActiveUsers.soql
SELECT Name FROM User
WHERE isActive = true AND Profile.Name = '$PROFILE'
disjoint-stdin.py
#!/usr/local/bin/python3
# print disjoint lines between stdin and input file
# mchinnappan
#--------------------------------------------------
import sys

usage = """
----------------------------------------------------------------
python3 disjoint-stdin.py ref_file.txt
- Checks and prints any line in the stdin not in the ref_file
----------------------------------------------------------------
"""

if len(sys.argv) != 2:
    print(f"Usage: {usage}")
    exit(0)

file2 = sys.argv[1]
file1_contents = [line.strip() for line in sys.stdin.readlines()]
set1 = set(file1_contents)

with open(file2, 'r') as f2:
    file2_contents = [line.strip() for line in f2.readlines()]
    set2 = set(file2_contents)

for item in set1:
    if item not in set2:
        print(item.strip())
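At its heart, disjoint-stdin.py computes a set difference. The same logic as a pure function (an illustrative sketch, not part of the original script):

```python
def disjoint(candidates, approved):
    """Return the candidate lines that do not appear in the approved list."""
    approved_set = {line.strip() for line in approved}
    return sorted(c.strip() for c in candidates if c.strip() not in approved_set)

admins = ["Ken Thompson", "Dennis Ritchie", "mohan chinnappan"]
approved = ["Ken Thompson", "Dennis Ritchie", "Niklaus Wirth"]
print(disjoint(admins, approved))  # ['mohan chinnappan']
```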
Script to find out the users not in the approved list
export PROFILE="System Administrator";
echo "SELECT Name FROM User WHERE isActive = true AND Profile.Name = '$PROFILE'" > _check.soql
sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check.soql -f json | jq '.[].Name' | sort | sed 's/"//g'| python3 py/disjoint-stdin.py data/approved-admins.txt
- The output of this script is the list of users whose profile is System Administrator but whose names are not in the data/approved-admins.txt file. Sample output:
mohan chinnappan
Operationalize Audit Scripts using github-actions
name: Auditor
run-name: on-demand-auditor
on: [workflow_dispatch]
jobs:
  setup-dx-py-audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: npm install sfdx-cli -g
      - run: echo "${{ secrets.SALESFORCE_JWT_SECRET_KEY }}" > server.key
      - run: sfdx force:auth:jwt:grant --clientid=${{ secrets.SALESFORCE_CONSUMER_KEY }} --jwtkeyfile=server.key --username=${{ secrets.SALESFORCE_DEVHUB_USERNAME }} --setdefaultdevhubusername
      - name: Install mohanc sfdx plugin
        run: |
          echo 'y' | sfdx plugins:install sfdx-mohanc-plugins
          sfdx plugins
      - name: Source DeployCheck Deploy
        run: |
          cd mc2Project
          sfdx force:source:deploy -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -p force-app -c --verbose
          sfdx mohanc:tooling:query -q ../.soql/deploymentStatus.soql -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -f json > ./deploymentStatus.json
          cat ./deploymentStatus.json
      - name: Setup Python
        uses: actions/setup-python@v3.1.3
        with:
          python-version: 3.10.10
      # List unapproved users with given profile in the org - data/approved-admins.txt
      - name: run-sys-admin-check
        run: |
          export PROFILE="System Administrator";
          echo "SELECT Name FROM User WHERE isActive = true AND Profile.Name = '$PROFILE'" > _check.soql
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check.soql -f json | jq '.[].Name' | sort | sed 's/"//g' | python3 py/disjoint-stdin.py data/approved-admins.txt
      # List unapproved users having ModifyAllData - data/approved-admins.txt
      - name: run-ModifyAllData-check
        run: |
          export PERM="PermissionsModifyAllData";
          echo "SELECT AssigneeId FROM PermissionSetAssignment WHERE PermissionSet.$PERM=true" > _check2.soql
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check2.soql -f json | jq '.[].AssigneeId' | sed 's/"//g' | python3 py/list2in.py | sed "s/{/SELECT Name FROM USER WHERE isActive=true AND Id IN (/g" | sed 's/}/)/g' > _psa_.soql;
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _psa_.soql -f json | jq '.[].Name' | sed 's/"//g' | python3 py/disjoint-stdin.py data/approved-admins.txt
      # List unapproved users having ViewAllData - data/approved-admins.txt
      - name: run-ViewAllData-check
        run: |
          export PERM="PermissionsViewAllData";
          echo "SELECT AssigneeId FROM PermissionSetAssignment WHERE PermissionSet.$PERM=true" > _check2.soql
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check2.soql -f json | jq '.[].AssigneeId' | sed 's/"//g' | python3 py/list2in.py | sed "s/{/SELECT Name FROM USER WHERE isActive=true AND Id IN (/g" | sed 's/}/)/g' > _psa_.soql;
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _psa_.soql -f json | jq '.[].Name' | sed 's/"//g' | python3 py/disjoint-stdin.py data/approved-admins.txt
      # List Users with given Profile who have Role
      - name: run-users-with-role-check
        run: |
          export PROFILE="System Administrator";
          echo "SELECT Name, UserRole.Name FROM User WHERE isActive = true AND Profile.Name = '$PROFILE' AND UserRole.Name != null" > _check3.soql
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check3.soql -f json | jq '.[] | .Name + "|" + .UserRole.Name' | sed 's/"//g'
      # Check users with given PermissionSet - EinsteinAnalyticsPlusAdmin
      - name: check-user-with-given-ps
        run: |
          export PERMSET="EinsteinAnalyticsPlusAdmin";
          echo "SELECT Id, PermissionSetId, PermissionSet.Name, PermissionSet.ProfileId, PermissionSet.Profile.Name, AssigneeId, Assignee.Name FROM PermissionSetAssignment WHERE PermissionSet.Name LIKE '%$PERMSET%'" > _check4.soql
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check4.soql -f json | jq '.[] | .Assignee.Name + "|" + .PermissionSet.Name' | sed 's/"//g' | sort -u
- You can use this script to check other profiles as well, for example the Developer profile:
export PROFILE="Developer";
echo "SELECT Name FROM User WHERE isActive = true AND Profile.Name = '$PROFILE'" > _check.soql
sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _check.soql -f json | jq '.[].Name' | sort | sed 's/"//g'| python3 disjoint-stdin.py approved_developers.txt
List unapproved users having ModifyAllData
- psAssignment_id.soql
SELECT AssigneeId
FROM PermissionSetAssignment
WHERE PermissionSet.$PERM=true
- list2in.py
import sys
file1_contents = [line.strip() for line in sys.stdin.readlines()]
print (set(file1_contents))
export PERM="PermissionsModifyAllData"; sed "s/\$PERM/$PERM/" psAssignment_id.soql > _checkPS.soql;
sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _checkPS.soql -f json | jq '.[].AssigneeId' | sed 's/"//g' | python3 list2in.py |sed "s/{/SELECT Name FROM USER WHERE isActive=true AND Id IN (/g" | sed 's/}/)/g' > _psa_.soql; sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _psa_.soql -f json | jq '.[].Name' | sed 's/"//g' | python3 disjoint-stdin.py approved_mad.txt
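list2in.py leans on Python's set repr plus sed to shape the IN clause. For illustration, the same transformation can be done directly in one function (a sketch, not part of the original pipeline):

```python
def ids_to_in_clause(ids):
    """Turn a list of record Ids into the SOQL query the sed pipeline produces."""
    unique = sorted(set(i.strip() for i in ids if i.strip()))
    quoted = ", ".join(f"'{i}'" for i in unique)
    return f"SELECT Name FROM USER WHERE isActive=true AND Id IN ({quoted})"

print(ids_to_in_clause(["005xx1", "005xx2", "005xx1"]))
# SELECT Name FROM USER WHERE isActive=true AND Id IN ('005xx1', '005xx2')
```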
List unapproved users having ViewAllData
export PERM="PermissionsViewAllData"; sed "s/\$PERM/$PERM/" psAssignment_id.soql > _checkPS.soql;
sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _checkPS.soql -f json | jq '.[].AssigneeId' | sed 's/"//g' | python3 list2in.py |sed "s/{/SELECT Name FROM USER WHERE isActive=true AND Id IN (/g" | sed 's/}/)/g' > _psa_.soql; sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _psa_.soql -f json | jq '.[].Name' | sed 's/"//g' | python3 disjoint-stdin.py approved_vad.txt
List Users with given Profile who have Role
cat usersWithRole.soql
SELECT Name, UserRole.Name FROM User
WHERE isActive = true AND Profile.Name = '$PROFILE' AND UserRole.Name != null
export PROFILE="System Administrator"; sed "s/\$PROFILE/$PROFILE/" usersWithRole.soql > _check.soql; \
sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _check.soql -f json | jq '.[] | .Name + "|" + .UserRole.Name' | sed 's/"//g'
- Sample output - pipe separated - Name|Role
Joe Carpenter|ROLE1
John Smith|ROLE2
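The same Name|Role pairing can be done in Python over the -f json output; the sample records below are made up to mirror the output above:

```python
import json

def name_role_pairs(query_json):
    """Produce 'Name|Role' lines from the User query's JSON result."""
    return [f"{r['Name']}|{r['UserRole']['Name']}" for r in json.loads(query_json)]

sample = json.dumps([
    {"Name": "Joe Carpenter", "UserRole": {"Name": "ROLE1"}},
    {"Name": "John Smith", "UserRole": {"Name": "ROLE2"}},
])
for line in name_role_pairs(sample):
    print(line)
```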
Check users with given PermissionSet
cat PermsetUsers.soql
SELECT Id, PermissionSetId, PermissionSet.Name,
PermissionSet.ProfileId, PermissionSet.Profile.Name,
AssigneeId, Assignee.Name
FROM PermissionSetAssignment
WHERE PermissionSet.Name LIKE '%$PERMSET%'
export PERMSET="EinsteinAnalyticsPlusAdmin"; sed "s/\$PERMSET/$PERMSET/" PermsetUsers.soql > _psUsers.soql; sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q _psUsers.soql -f json | jq '.[] | .Assignee.Name + "|" + .PermissionSet.Name' | sed 's/"//g' | sort -u
Output of this command:
Mohan Chinnappan|EinsteinAnalyticsPlusAdmin
If you would like to check against an approved list:
cat approved_crmaadmin.txt
John Smith
export PERMSET="EinsteinAnalyticsPlusAdmin"; sed "s/\$PERMSET/$PERMSET/" PermsetUsers.soql > _psUsers.soql; sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q _psUsers.soql -f json | jq '.[] | .Assignee.Name + "|" + .PermissionSet.Name' | sed 's/"//g' | sort -u | python3 disjoint-stdin.py approved_crmaadmin.txt
Output of this command:
Mohan Chinnappan
Check SetupAuditTrail
Using Datatable view
sfdx mohanc:data:bulkapi:query -u mohan.chinnappan.n.sel2@gmail.com -q setupAduit.soql; sed 's/CreatedBy.Name/CreatedByName/g' setupAduit.soql.csv | pbcopy; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
Using sqlite3
#!/usr/local/bin/python3
# SQL Query for CSV using pandas and sqlite
# mchinnappan
#--------------------------------------------------
import sys
import time

usage = """
----------------------------------------------------------------
python3 querycsv.py input_file query
# SQL Query for CSV using pandas and sqlite on SetupAuditTrail
----------------------------------------------------------------
"""

if len(sys.argv) != 3:
    print(f"Usage: {usage}")
    exit(0)

import pandas as pd
import sqlite3

input_file = sys.argv[1]
table_name = 't' + str(round(time.time() * 1000))
sql_string_in = sys.argv[2]  # Example for SetupAuditTrail: SELECT CreatedDate, "CreatedBy.Name" AS name, Display FROM $TBL WHERE Section = "Manage Users"
sql_string = sql_string_in.replace('$TBL', table_name)

# read the CSV file into a DataFrame
df = pd.read_csv(input_file)

conn = sqlite3.connect("_temp.db")  # if the db does not exist, it will be created
# store the table in the database
df.to_sql(table_name, conn)

# read a SQL query out of the database into a pandas DataFrame and emit it as CSV
df = pd.read_sql(sql_string, conn)
df.to_csv(sys.stdout)
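The same load-the-CSV-then-query idea can be illustrated with only the standard library; the audit rows below are made up:

```python
import csv
import sqlite3
from io import StringIO

# Made-up rows in the shape of a SetupAuditTrail export
csv_text = """CreatedDate,Name,Display,Section
2022-10-25,mohan chinnappan,Created permission set,Manage Users
2022-10-26,mohan chinnappan,Changed Apex Class X,Apex Class
"""

rows = list(csv.DictReader(StringIO(csv_text)))
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sat (CreatedDate TEXT, Name TEXT, Display TEXT, Section TEXT)")
conn.executemany(
    "INSERT INTO sat VALUES (:CreatedDate, :Name, :Display, :Section)", rows
)

# Filter to one Section, as in the $TBL example query
result = conn.execute(
    "SELECT CreatedDate, Name, Display FROM sat WHERE Section = 'Manage Users'"
).fetchall()
print(result)  # [('2022-10-25', 'mohan chinnappan', 'Created permission set')]
```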
cat setupAduit.soql
SELECT CreatedDate,
CreatedBy.Name,
CreatedByContext,
CreatedByIssuer,
Display,
Section,
DelegateUser
FROM SetupAuditTrail
sfdx mohanc:data:bulkapi:query -u mohan.chinnappan.n.sel@gmail.com -q setupAduit.soql ; python3 querycsv.py setupAduit.soql.csv ' SELECT CreatedDate, "CreatedBy.Name" AS name, Display FROM $TBL WHERE Section ="Manage Users"'
Output of this command:
,CreatedDate,name,Display
0,2022-10-25T20:36:52.000Z,mohan chinnappan,Created permission set DevOps Center: with no license
1,2022-10-25T20:36:53.000Z,mohan chinnappan,Created permission set DevOps Center Manager: with no license
2,2022-10-25T20:36:53.000Z,mohan chinnappan,Created permission set DevOps Center Release Manager: with no license
3,2022-10-25T20:39:25.000Z,,Created permission set sf_devops_NamedCredentials: with no license
4,2022-10-25T20:39:26.000Z,,Created permission set sf_devops_InitializeEnvironments: with no license
5,2022-11-10T09:30:51.000Z,mohan chinnappan,Set new password for user null
- You can change the SQL to query different sections (e.g. Apex Class, Company Profile, ...):
SELECT CreatedDate, "CreatedBy.Name" AS name, Display FROM $TBL WHERE Section ="Apex Class"
References
Comparing Proactive Monitoring, Real-Time-Event Monitoring and Auditing Scripts
Monitoring type | Description | Refer for Details |
---|---|---|
Proactive Monitoring | Part of Signature Success Plan. Assigns Designated Technical Account Manager (TAM). Provides Personalized Trends and Insights. Supports Key Event Management. 24/7 Proactive Monitoring to Predict and prevent disruptions before they occur. Includes customized monitoring plan including Smart Alerts to detect potential risks and root causes | Proactive Monitoring |
Real-Time-Event Monitoring | Helps the customer to monitor and detect standard events in Salesforce in near real-time.The customer can store the event data for auditing or reporting purposes. The customer can create transaction security policies using Condition Builder. | Real-time-event monitoring |
Auditing Scripts | Provides scripts for List unapproved users with given profile in the org. List unapproved users having ModifyAllData. List unapproved users having ViewAllData. List Users with given Profile who have Role. Check users with given PermissionSet. Check SetupAuditTrail Operationalize Audit Scripts using github-actions | Auditing script |
Instance Related
Topics
Instance data center location for a given list of instances
cat instance-loc.py
#!/usr/local/bin/python3
# Prints the data center location for list of Salesforce Instances given in the input file
# mchinnappan
#--------------------------------------------------
import sys

usage = """
----------------------------------------------------------------
python3 instance-loc.py file1
# Prints the data center location for list of Salesforce Instances given as input
----------------------------------------------------------------
"""

if len(sys.argv) != 2:
    print(f"Usage: {usage}")
    exit(0)

import pandas as pd

df = pd.read_csv('https://raw.githubusercontent.com/mohan-chinnappan-n/cli-dx/master/instances/sf-managed.csv')

file1 = sys.argv[1]
with open(file1, 'r') as f1:
    file1_contents = [line.strip() for line in f1.readlines()]
set1 = set(file1_contents)

for item in set1:
    print(df.query(f'SFManagedInstance == "{item}"'))
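The lookup itself is just a match against the instance-to-data-center table. A self-contained sketch with a made-up subset of sf-managed.csv:

```python
def instance_locations(instances, mapping):
    """Map each requested instance to its data center location (or 'unknown')."""
    return {i: mapping.get(i, "unknown") for i in instances}

# Tiny made-up subset of the SFManagedInstance -> DataCenters table
mapping = {
    "CS1": "Phoenix, USA / Washington DC, USA",
    "CS166": "Washington DC, USA (North)",
}
print(instance_locations(["CS1", "CS166", "CS999"], mapping))  # CS999 maps to 'unknown'
```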
Run
python3 instance-loc.py instances.txt | sed 's/SFManagedInstance//g' | sed 's/DataCenters//g' | sed '/^[[:space:]]*$/d'
Input
cat instances.txt
CS1
CS123
CS125
CS166
CS167
CS170
CS171
CS172
CS190
CS191
Output
241 CS166 Washington DC, USA (North) / Washington DC, US...
157 CS1 Phoenix, USA / Washington DC, USA
232 CS125 Washington DC, USA (North) / Washington DC, US...
230 CS123 Washington DC, USA (North) / Washington DC, US...
245 CS171 Washington DC, USA (North) / Washington DC, US...
246 CS172 Washington DC, USA (North) / Washington DC, US...
250 CS190 Washington DC, USA (North) / Washington DC, US...
242 CS167 Washington DC, USA (North) / Washington DC, US...
251 CS191 Washington DC, USA (North) / Washington DC, US...
244 CS170 Washington DC, USA (North) / Washington DC, US...
Org Metadata Snapshot
Here is a simple way to create an Org's metadata snapshot
- Build the package.xml using ONNU
cat ~/.pkg/package.xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
<members>*</members>
<name>LeadConvertSettings</name>
</types>
<types>
<members>*</members>
<name>ApprovalProcess</name>
</types>
<types>
<members>*</members>
<name>Audience</name>
</types>
<types>
<members>*</members>
<name>AuraDefinitionBundle</name>
</types>
<types>
<members>*</members>
<name>BrandingSet</name>
</types>
<types>
<members>*</members>
<name>ApexClass</name>
</types>
<types>
<members>*</members>
<name>ContentAsset</name>
</types>
<types>
<members>*</members>
<name>CustomMetadata</name>
</types>
<types>
<members>*</members>
<name>CustomPermission</name>
</types>
<types>
<members>*</members>
<name>Dashboard</name>
</types>
<types>
<members>*</members>
<name>DelegateGroup</name>
</types>
<types>
<members>*</members>
<name>DuplicateRule</name>
</types>
<types>
<members>*</members>
<name>ExperienceBundle</name>
</types>
<types>
<members>*</members>
<name>FlexiPage</name>
</types>
<types>
<members>*</members>
<name>FlowDefinition</name>
</types>
<types>
<members>*</members>
<name>Flow</name>
</types>
<types>
<members>*</members>
<name>GlobalValueSet</name>
</types>
<types>
<members>*</members>
<name>Group</name>
</types>
<types>
<members>*</members>
<name>CustomLabel</name>
</types>
<types>
<members>*</members>
<name>Layout</name>
</types>
<types>
<members>*</members>
<name>Letterhead</name>
</types>
<types>
<members>*</members>
<name>LightningExperienceTheme</name>
</types>
<types>
<members>*</members>
<name>LightningComponentBundle</name>
</types>
<types>
<members>*</members>
<name>MatchingRule</name>
</types>
<types>
<members>*</members>
<name>LightningMessageChannel</name>
</types>
<types>
<members>*</members>
<name>NavigationMenu</name>
</types>
<types>
<members>*</members>
<name>Network</name>
</types>
<types>
<members>*</members>
<name>NotificationTypeConfig</name>
</types>
<types>
<members>*</members>
<name>Translations</name>
</types>
<types>
<members>*</members>
<name>CustomObject</name>
</types>
<types>
<members>*</members>
<name>ApexPage</name>
</types>
<types>
<members>*</members>
<name>PathAssistant</name>
</types>
<types>
<members>*</members>
<name>PermissionSetGroup</name>
</types>
<types>
<members>*</members>
<name>PermissionSet</name>
</types>
<types>
<members>*</members>
<name>PlatformEventChannelMember</name>
</types>
<types>
<members>*</members>
<name>ProfilePasswordPolicy</name>
</types>
<types>
<members>*</members>
<name>ProfileSessionSetting</name>
</types>
<types>
<members>*</members>
<name>Profile</name>
</types>
<types>
<members>*</members>
<name>Queue</name>
</types>
<types>
<members>*</members>
<name>QuickAction</name>
</types>
<types>
<members>*</members>
<name>RecommendationStrategy</name>
</types>
<types>
<members>*</members>
<name>ReportType</name>
</types>
<types>
<members>*</members>
<name>Report</name>
</types>
<types>
<members>*</members>
<name>RestrictionRule</name>
</types>
<types>
<members>*</members>
<name>Role</name>
</types>
<types>
<members>*</members>
<name>Settings</name>
</types>
<types>
<members>*</members>
<name>SharingRules</name>
</types>
<types>
<members>*</members>
<name>SharingSet</name>
</types>
<types>
<members>*</members>
<name>StandardValueSet</name>
</types>
<types>
<members>*</members>
<name>StaticResource</name>
</types>
<types>
<members>*</members>
<name>CustomTab</name>
</types>
<types>
<members>*</members>
<name>Territory2Model</name>
</types>
<types>
<members>*</members>
<name>Territory2Type</name>
</types>
<types>
<members>*</members>
<name>ApexTrigger</name>
</types>
<types>
<members>*</members>
<name>WaveApplication</name>
</types>
<types>
<members>*</members>
<name>WaveComponent</name>
</types>
<types>
<members>*</members>
<name>WaveDashboard</name>
</types>
<types>
<members>*</members>
<name>WaveDataflow</name>
</types>
<types>
<members>*</members>
<name>WaveDataset</name>
</types>
<types>
<members>*</members>
<name>WaveLens</name>
</types>
<types>
<members>*</members>
<name>WaveRecipe</name>
</types>
<types>
<members>*</members>
<name>WaveTemplateBundle</name>
</types>
<types>
<members>*</members>
<name>WaveXmd</name>
</types>
<types>
<members>*</members>
<name>Workflow</name>
</types>
</Package>
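A manifest like the one above does not have to be written by hand. As a minimal sketch (not part of the book's tooling), a `package.xml` requesting all members of each listed type can be generated from a plain list of type names; the namespace and the 57.0 API version match the retrieve step that follows:

```python
# Sketch: build a Metadata API package.xml requesting all members (*)
# of each listed type. The type list here is a small illustrative subset.
TYPES = ["ApexClass", "ApexTrigger", "CustomObject", "Flow", "Layout"]

def build_package_xml(types, api_version="57.0"):
    # One <types> entry per metadata type, sorted for stable output
    entries = "\n".join(
        "    <types>\n"
        "        <members>*</members>\n"
        f"        <name>{t}</name>\n"
        "    </types>"
        for t in sorted(types)
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<Package xmlns="http://soap.sforce.com/2006/04/metadata">\n'
        f"{entries}\n"
        f"    <version>{api_version}</version>\n"
        "</Package>"
    )

print(build_package_xml(TYPES))
```

Keeping the type list in a separate file and regenerating the manifest makes it easy to version-control which metadata types a snapshot covers.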
Option 1 - Extract using ONNU
Option 2 - Extract using the Salesforce CLI
sfdx force:mdapi:retrieve -k ~/.pkg/package.xml -u mohan.chinnappan.n.sel@gmail.com -r .
Retrieving v57.0 metadata from mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Retrieve ID: 09S4x00000EyfKDEAZ
Retrieving metadata from mohan.chinnappan.n.sel@gmail.com... done
Wrote retrieve zip to /private/tmp/snapshot/unpackaged.zip
=== Retrieved Source Warnings
FILE NAME PROBLEM
────────────────────── ──────────────────────────────────────────────────────────────────────
unpackaged/package.xml Entity type 'Territory2Model' is not available in this organization
unpackaged/package.xml Entity type 'Territory2Type' is not available in this organization
unpackaged/package.xml Entity type 'WaveApplication' is not available in this organization
unpackaged/package.xml Entity type 'WaveComponent' is not available in this organization
unpackaged/package.xml Entity type 'WaveDashboard' is not available in this organization
unpackaged/package.xml Entity type 'WaveDataflow' is not available in this organization
unpackaged/package.xml Entity type 'WaveDataset' is not available in this organization
unpackaged/package.xml Entity type 'WaveLens' is not available in this organization
unpackaged/package.xml Entity type 'WaveRecipe' is not available in this organization
unpackaged/package.xml Entity type 'WaveTemplateBundle' is not available in this organization
unpackaged/package.xml Entity type 'WaveXmd' is not available in this organization
Extract the zip file
jar xvf unpackaged.zip
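If neither `jar` nor `unzip` is available, the archive can also be extracted with Python's standard `zipfile` module. This is a small sketch (the function name is ours, not from any library):

```python
import zipfile

def extract_archive(path, dest="."):
    """Extract a retrieve archive; equivalent to `jar xvf <path>`."""
    with zipfile.ZipFile(path) as zf:
        zf.extractall(dest)

# Usage, assuming the unpackaged.zip written by the retrieve step:
# extract_archive("unpackaged.zip")
```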
View the extracted contents
tree
.
├── unpackaged
│ ├── audience
│ │ └── Default_mohanc.audience
│ ├── aura
│ │ ├── forgotPassword
│ │ │ ├── forgotPassword.cmp
│ │ │ ├── forgotPassword.cmp-meta.xml
│ │ │ ├── forgotPassword.css
│ │ │ ├── forgotPassword.design
│ │ │ ├── forgotPasswordController.js
│ │ │ └── forgotPasswordHelper.js
│ │ ├── loginForm
│ │ │ ├── loginForm.cmp
│ │ │ ├── loginForm.cmp-meta.xml
│ │ │ ├── loginForm.css
│ │ │ ├── loginForm.design
│ │ │ ├── loginFormController.js
│ │ │ └── loginFormHelper.js
│ │ ├── selfRegister
│ │ │ ├── selfRegister.cmp
│ │ │ ├── selfRegister.cmp-meta.xml
│ │ │ ├── selfRegister.css
│ │ │ ├── selfRegister.design
│ │ │ ├── selfRegisterController.js
│ │ │ └── selfRegisterHelper.js
│ │ ├── setExpId
│ │ │ ├── setExpId.evt
│ │ │ └── setExpId.evt-meta.xml
│ │ └── setStartUrl
│ │ ├── setStartUrl.evt
│ │ └── setStartUrl.evt-meta.xml
│ ├── classes
│ │ ├── ChangePasswordController.cls
│ │ ├── ChangePasswordController.cls-meta.xml
│ │ ├── ChangePasswordControllerTest.cls
│ │ ├── ChangePasswordControllerTest.cls-meta.xml
│ │ ├── CommunitiesLandingController.cls
│ │ ├── CommunitiesLandingController.cls-meta.xml
│ │ ├── CommunitiesLandingControllerTest.cls
│ │ ├── CommunitiesLandingControllerTest.cls-meta.xml
│ │ ├── CommunitiesLoginController.cls
│ │ ├── CommunitiesLoginController.cls-meta.xml
│ │ ├── CommunitiesLoginControllerTest.cls
│ │ ├── CommunitiesLoginControllerTest.cls-meta.xml
│ │ ├── CommunitiesSelfRegConfirmController.cls
│ │ ├── CommunitiesSelfRegConfirmController.cls-meta.xml
│ │ ├── CommunitiesSelfRegConfirmControllerTest.cls
│ │ ├── CommunitiesSelfRegConfirmControllerTest.cls-meta.xml
│ │ ├── CommunitiesSelfRegController.cls
│ │ ├── CommunitiesSelfRegController.cls-meta.xml
│ │ ├── CommunitiesSelfRegControllerTest.cls
│ │ ├── CommunitiesSelfRegControllerTest.cls-meta.xml
│ │ ├── FieldAuditor.cls
│ │ ├── FieldAuditor.cls-meta.xml
│ │ ├── ForgotPasswordController.cls
│ │ ├── ForgotPasswordController.cls-meta.xml
│ │ ├── ForgotPasswordControllerTest.cls
│ │ ├── ForgotPasswordControllerTest.cls-meta.xml
│ │ ├── HelloPeach.cls
│ │ ├── HelloPeach.cls-meta.xml
│ │ ├── HelloWorld.cls
│ │ ├── HelloWorld.cls-meta.xml
│ │ ├── HelloWorld2.cls
│ │ ├── HelloWorld2.cls-meta.xml
│ │ ├── LightningForgotPasswordController.cls
│ │ ├── LightningForgotPasswordController.cls-meta.xml
│ │ ├── LightningForgotPasswordControllerTest.cls
│ │ ├── LightningForgotPasswordControllerTest.cls-meta.xml
│ │ ├── LightningLoginFormController.cls
│ │ ├── LightningLoginFormController.cls-meta.xml
│ │ ├── LightningLoginFormControllerTest.cls
│ │ ├── LightningLoginFormControllerTest.cls-meta.xml
│ │ ├── LightningSelfRegisterController.cls
│ │ ├── LightningSelfRegisterController.cls-meta.xml
│ │ ├── LightningSelfRegisterControllerTest.cls
│ │ ├── LightningSelfRegisterControllerTest.cls-meta.xml
│ │ ├── MicrobatchSelfRegController.cls
│ │ ├── MicrobatchSelfRegController.cls-meta.xml
│ │ ├── MicrobatchSelfRegControllerTest.cls
│ │ ├── MicrobatchSelfRegControllerTest.cls-meta.xml
│ │ ├── MyHelloWorld.cls
│ │ ├── MyHelloWorld.cls-meta.xml
│ │ ├── MyProfilePageController.cls
│ │ ├── MyProfilePageController.cls-meta.xml
│ │ ├── MyProfilePageControllerTest.cls
│ │ ├── MyProfilePageControllerTest.cls-meta.xml
│ │ ├── ObjectAuditor.cls
│ │ ├── ObjectAuditor.cls-meta.xml
│ │ ├── RecordTypeAuditor.cls
│ │ ├── RecordTypeAuditor.cls-meta.xml
│ │ ├── SiteLoginController.cls
│ │ ├── SiteLoginController.cls-meta.xml
│ │ ├── SiteLoginControllerTest.cls
│ │ ├── SiteLoginControllerTest.cls-meta.xml
│ │ ├── SiteRegisterController.cls
│ │ ├── SiteRegisterController.cls-meta.xml
│ │ ├── SiteRegisterControllerTest.cls
│ │ └── SiteRegisterControllerTest.cls-meta.xml
│ ├── duplicateRules
│ │ ├── Account.Standard_Account_Duplicate_Rule.duplicateRule
│ │ ├── Contact.Standard_Contact_Duplicate_Rule.duplicateRule
│ │ └── Lead.Standard_Lead_Duplicate_Rule.duplicateRule
│ ├── experiences
│ │ ├── mohanc1
│ │ │ ├── brandingSets
│ │ │ │ └── buildYourOwnLWR.json
│ │ │ ├── config
│ │ │ │ ├── languages.json
│ │ │ │ ├── mainAppPage.json
│ │ │ │ └── mohanc.json
│ │ │ ├── routes
│ │ │ │ ├── error.json
│ │ │ │ ├── home.json
│ │ │ │ ├── serviceNotAvailable.json
│ │ │ │ └── tooManyRequests.json
│ │ │ ├── themes
│ │ │ │ └── buildYourOwnLWR.json
│ │ │ └── views
│ │ │ ├── error.json
│ │ │ ├── home.json
│ │ │ ├── serviceNotAvailable.json
│ │ │ └── tooManyRequests.json
│ │ └── mohanc1.site-meta.xml
│ ├── flexipages
│ │ ├── Account_Record_Page.flexipage
│ │ ├── ContactCenter_UtilityBar.flexipage
│ │ ├── CrisisCenter_UtilityBar.flexipage
│ │ ├── LightningSalesConsole_UtilityBar.flexipage
│ │ ├── LightningSales_UtilityBar.flexipage
│ │ └── LightningService_UtilityBar.flexipage
│ ├── flowDefinitions
│ │ ├── OpptyAmountChange.flowDefinition
│ │ └── sample1.flowDefinition
│ ├── flows
│ │ ├── OpptyAmountChange.flow
│ │ └── sample1.flow
│ ├── globalValueSets
│ │ ├── Test.globalValueSet
│ │ └── plants.globalValueSet
│ ├── labels
│ │ └── CustomLabels.labels
│ ├── layouts
│ │ ├── Account-Account\ %28Marketing%29\ Layout.layout
│ │ ├── Account-Account\ %28Sales%29\ Layout.layout
│ │ ├── Account-Account\ %28Support%29\ Layout.layout
│ │ ├── Account-Account\ Layout.layout
│ │ ├── Account-Simple.layout
│ │ ├── AccountBrand-Account\ Brand\ Layout.layout
│ │ ├── AlternativePaymentMethod-Alternative\ Payment\ Method\ Layout.layout
│ │ ├── ApiAnomalyEventStore-API\ Anomaly\ Event\ Store\ Layout.layout
│ │ ├── AppointmentInvitation-Appointment\ Invitation\ Layout.layout
│ │ ├── Asset-Asset\ Layout.layout
│ │ ├── AssetAction-Asset\ Action\ Layout.layout
│ │ ├── AssetActionSource-Asset\ Action\ Source\ Layout.layout
│ │ ├── AssetRelationship-Asset\ Relationship\ Layout.layout
│ │ ├── AssetStatePeriod-Asset\ State\ Period\ Layout.layout
│ │ ├── AssignedResource-Assigned\ Resource\ Layout.layout
│ │ ├── AuthorizationForm-Authorization\ Form\ Layout.layout
│ │ ├── AuthorizationFormConsent-Authorization\ Form\ Consent\ Layout.layout
│ │ ├── AuthorizationFormDataUse-Authorization\ Form\ Data\ Use\ Layout.layout
│ │ ├── AuthorizationFormText-Authorization\ Form\ Text\ Layout.layout
│ │ ├── BusinessBrand-Business\ Brand\ Layout.layout
│ │ ├── Campaign-Campaign\ Layout.layout
│ │ ├── CampaignMember-Campaign\ Member\ Page\ Layout.layout
│ │ ├── CardPaymentMethod-Card\ Payment\ Method\ Layout.layout
│ │ ├── CartCheckoutSession-Cart\ Checkout\ Session\ Layout.layout
│ │ ├── CartDeliveryGroup-Cart\ Delivery\ Group\ Layout.layout
│ │ ├── CartItem-Cart\ Item\ Layout.layout
│ │ ├── CartTax-Cart\ Tax\ Layout.layout
│ │ ├── CartValidationOutput-Cart\ Validation\ Output\ Layout.layout
│ │ ├── Case-Case\ %28Marketing%29\ Layout.layout
│ │ ├── Case-Case\ %28Sales%29\ Layout.layout
│ │ ├── Case-Case\ %28Support%29\ Layout.layout
│ │ ├── Case-Case\ Layout.layout
│ │ ├── CaseClose-Close\ Case\ Layout.layout
│ │ ├── CaseMilestone-Case\ Milestone\ Layout.layout
│ │ ├── CodeBuilder__CodeBuilderSetting__mdt-CodeBuilder__CodeBuilderSetting\ Layout.layout
│ │ ├── CommunityMemberLayout-Community\ Member\ Layout.layout
│ │ ├── ConsumptionRate-Consumption\ Rate\ Layout.layout
│ │ ├── ConsumptionSchedule-Consumption\ Schedule\ Layout.layout
│ │ ├── Contact-Contact\ %28Marketing%29\ Layout.layout
│ │ ├── Contact-Contact\ %28Sales%29\ Layout.layout
│ │ ├── Contact-Contact\ %28Support%29\ Layout.layout
│ │ ├── Contact-Contact\ Layout.layout
│ │ ├── ContactPointAddress-Contact\ Point\ Address\ Layout.layout
│ │ ├── ContactPointEmail-Contact\ Point\ Email\ Layout.layout
│ │ ├── ContactPointPhone-Contact\ Point\ Phone\ Layout.layout
│ │ ├── ContactPointTypeConsent-Contact\ Point\ Type\ Consent\ Layout.layout
│ │ ├── ContentVersion-Content\ Version\ Layout.layout
│ │ ├── Contract-Contract\ Layout.layout
│ │ ├── ContractLineItem-Contract\ Line\ Item\ Layout.layout
│ │ ├── CredentialStuffingEventStore-Credential\ Stuffing\ Event\ Store\ Layout.layout
│ │ ├── CreditMemo-Credit\ Memo\ Layout.layout
│ │ ├── CreditMemoLine-Credit\ Memo\ Line\ Layout.layout
│ │ ├── Customer-Customer\ Layout.layout
│ │ ├── DandBCompany-D%26B\ Company\ Layout.layout
│ │ ├── DataUseLegalBasis-Data\ Use\ Legal\ Basis\ Layout.layout
│ │ ├── DataUsePurpose-Data\ Use\ Purpose\ Layout.layout
│ │ ├── DelegatedAccount-External\ Managed\ Account\ Layout.layout
│ │ ├── DigitalWallet-Digital\ Wallet\ Layout.layout
│ │ ├── DuplicateRecordSet-Duplicate\ Record\ Set\ Layout.layout
│ │ ├── EmailMessage-Email\ Message\ Layout.layout
│ │ ├── EngagementChannelType-Engagement\ Channel\ Type\ Layout.layout
│ │ ├── Entitlement-Entitlement\ Layout.layout
│ │ ├── EntityMilestone-Object\ Milestone\ Layout.layout
│ │ ├── Event-Event\ Layout.layout
│ │ ├── FeedItem-Feed\ Item\ Layout.layout
│ │ ├── FinanceBalanceSnapshot-Finance\ Balance\ Snapshot\ Layout.layout
│ │ ├── FinanceTransaction-Finance\ Transaction\ Layout.layout
│ │ ├── FruitTree__c-FruitTree\ Layout.layout
│ │ ├── Global-Global\ Layout.layout
│ │ ├── Idea-Idea\ Layout.layout
│ │ ├── Individual-Individual\ Layout.layout
│ │ ├── Invoice-Invoice\ Layout.layout
│ │ ├── InvoiceLine-Invoice\ Line\ Layout.layout
│ │ ├── Lead-Lead\ %28Marketing%29\ Layout.layout
│ │ ├── Lead-Lead\ %28Sales%29\ Layout.layout
│ │ ├── Lead-Lead\ %28Support%29\ Layout.layout
│ │ ├── Lead-Lead\ Layout.layout
│ │ ├── LegalEntity-Legal\ Entity\ Layout.layout
│ │ ├── Macro-Macro\ Layout.layout
│ │ ├── OperatingHours-Operating\ Hours\ Layout.layout
│ │ ├── Opportunity-Opportunity\ %28Marketing%29\ Layout.layout
│ │ ├── Opportunity-Opportunity\ %28Sales%29\ Layout.layout
│ │ ├── Opportunity-Opportunity\ %28Support%29\ Layout.layout
│ │ ├── Opportunity-Opportunity\ Layout.layout
│ │ ├── OpportunityLineItem-Opportunity\ Product\ Layout.layout
│ │ ├── OpportunityTeamMember-Opportunity\ Team\ Member\ Layout.layout
│ │ ├── Order-Order\ Layout.layout
│ │ ├── OrderItem-Order\ Product\ Layout.layout
│ │ ├── Payment-Payment\ Layout.layout
│ │ ├── PaymentAuthAdjustment-Payment\ Authorization\ Adjustment\ Layout.layout
│ │ ├── PaymentAuthorization-Payment\ Authorization\ Layout.layout
│ │ ├── PaymentGateway-Payment\ Gateway\ Layout.layout
│ │ ├── PaymentGatewayLog-Payment\ Gateway\ Log\ Layout.layout
│ │ ├── PaymentGroup-Payment\ Group\ Layout.layout
│ │ ├── PaymentLineInvoice-Payment\ Line\ Invoice\ Layout.layout
│ │ ├── Pricebook2-Price\ Book\ Layout.layout
│ │ ├── PricebookEntry-Price\ Book\ Entry\ Layout.layout
│ │ ├── ProcessException-Process\ Exception\ Layout.layout
│ │ ├── Product2-Product\ Layout.layout
│ │ ├── ProductConsumptionSchedule-Product\ Consumption\ Schedule\ Layout.layout
│ │ ├── QuickText-Quick\ Text\ Layout.layout
│ │ ├── Refund-Refund\ Layout.layout
│ │ ├── RefundLinePayment-Refund\ Line\ Payment\ Layout.layout
│ │ ├── ReportAnomalyEventStore-Report\ Anomaly\ Event\ Store\ Layout.layout
│ │ ├── ResourceAbsence-Resource\ Absence\ Layout.layout
│ │ ├── ResourcePreference-Resource\ Preference\ Layout.layout
│ │ ├── ReturnOrder-Return\ Order\ Layout.layout
│ │ ├── ReturnOrderItemAdjustment-Return\ Order\ Item\ Adjustment\ Layout.layout
│ │ ├── ReturnOrderItemTax-Return\ Order\ Item\ Tax\ Layout.layout
│ │ ├── ReturnOrderLineItem-Return\ Order\ Line\ Item\ Layout.layout
│ │ ├── Sales_Play__c-Sales\ Play\ Layout.layout
│ │ ├── Scorecard-Scorecard\ Layout.layout
│ │ ├── ScorecardAssociation-Scorecard\ Association\ Layout.layout
│ │ ├── ScorecardMetric-Scorecard\ Metric\ Layout.layout
│ │ ├── Seller-Seller\ Layout.layout
│ │ ├── ServiceAppointment-Service\ Appointment\ Layout.layout
│ │ ├── ServiceContract-Service\ Contract\ Layout.layout
│ │ ├── ServiceResource-Service\ Resource\ Layout.layout
│ │ ├── ServiceResourceSkill-Service\ Resource\ Skill\ Layout.layout
│ │ ├── ServiceTerritory-Service\ Territory\ Layout.layout
│ │ ├── ServiceTerritoryMember-Service\ Territory\ Member\ Layout.layout
│ │ ├── SessionHijackingEventStore-Session\ Hijacking\ Event\ Store\ Layout.layout
│ │ ├── Shift-Shift\ Layout.layout
│ │ ├── SkillRequirement-Skill\ Requirement\ Layout.layout
│ │ ├── SocialPersona-Social\ Persona\ Layout.layout
│ │ ├── Solution-Solution\ Layout.layout
│ │ ├── Task-Task\ Layout.layout
│ │ ├── ThreatDetectionFeedback-Threat\ Detection\ Feedback\ Layout.layout
│ │ ├── TimeSlot-Time\ Slot\ Layout.layout
│ │ ├── Tree__c-Tree\ Layout.layout
│ │ ├── User-User\ Layout.layout
│ │ ├── UserProvAccount-User\ Provisioning\ Account\ Layout.layout
│ │ ├── UserProvisioningLog-User\ Provisioning\ Log\ Layout.layout
│ │ ├── UserProvisioningRequest-User\ Provisioning\ Request\ Layout.layout
│ │ ├── WebCart-Cart\ Layout.layout
│ │ ├── WebCartAdjustmentBasis-Cart\ Adjustment\ Basis\ Layout.layout
│ │ ├── WebCartAdjustmentGroup-Cart\ Adjustment\ Group\ Layout.layout
│ │ ├── WebStore-Store\ Layout.layout
│ │ ├── WebStoreConfig-Web\ Store\ Configuration\ Layout.layout
│ │ ├── WorkOrder-Work\ Order\ Layout.layout
│ │ ├── WorkOrderLineItem-Work\ Order\ Line\ Item\ Layout.layout
│ │ ├── WorkPlan-Work\ Plan\ Layout.layout
│ │ ├── WorkPlanTemplate-Work\ Plan\ Template\ Layout.layout
│ │ ├── WorkPlanTemplateEntry-Work\ Plan\ Template\ Entry\ Layout.layout
│ │ ├── WorkStep-Work\ Step\ Layout.layout
│ │ ├── WorkStepTemplate-Work\ Step\ Template\ Layout.layout
│ │ ├── WorkType-Work\ Type\ Layout.layout
│ │ ├── WorkTypeGroup-Work\ Type\ Group\ Layout.layout
│ │ ├── WorkTypeGroupMember-Work\ Type\ Group\ Member\ Layout.layout
│ │ ├── sf_devops__App_Config__mdt-sf_devops__App\ Config\ Layout.layout
│ │ ├── sf_devops__Async_Operation_Result__c-sf_devops__Async\ Operation\ Result\ Layout.layout
│ │ ├── sf_devops__Back_Sync__c-sf_devops__Back\ Sync\ Layout.layout
│ │ ├── sf_devops__Branch__c-sf_devops__Branch\ Layout.layout
│ │ ├── sf_devops__Change_Bundle_Install__c-sf_devops__Change\ Bundle\ Install\ Layout.layout
│ │ ├── sf_devops__Change_Bundle__c-sf_devops__Change\ Bundle\ Layout.layout
│ │ ├── sf_devops__Change_Submission__c-sf_devops__Change\ Submission\ Layout.layout
│ │ ├── sf_devops__Deploy_Component__c-sf_devops__Deploy\ Component\ Layout.layout
│ │ ├── sf_devops__Deployment_Result__c-sf_devops__Deployment\ Result\ Layout.layout
│ │ ├── sf_devops__Environment__c-sf_devops__Environment\ Layout.layout
│ │ ├── sf_devops__Hidden_Remote_Change__c-sf_devops__Hidden\ Remote\ Change\ Layout.layout
│ │ ├── sf_devops__Merge_Result__c-sf_devops__Merge\ Result\ Layout.layout
│ │ ├── sf_devops__Object_Activity__c-sf_devops__Object\ Activity\ Layout.layout
│ │ ├── sf_devops__Pipeline_Stage__c-sf_devops__Pipeline\ Stage\ Layout.layout
│ │ ├── sf_devops__Pipeline__c-sf_devops__Pipeline\ Layout.layout
│ │ ├── sf_devops__Remote_Change__c-sf_devops__Remote\ Change\ Layout.layout
│ │ ├── sf_devops__Repository__c-sf_devops__Repository\ Layout.layout
│ │ ├── sf_devops__Service__mdt-sf_devops__Service\ Layout.layout
│ │ ├── sf_devops__Source_Member_Reference__c-sf_devops__Source\ Member\ Reference\ Layout.layout
│ │ ├── sf_devops__Submit_Component__c-sf_devops__Submit\ Component\ Layout.layout
│ │ ├── sf_devops__Vcs_Event__c-sf_devops__VCS\ Event\ Layout.layout
│ │ ├── sf_devops__Work_Item_Promote__c-sf_devops__Work\ Item\ Promote\ Layout.layout
│ │ └── sf_devops__Work_Item__c-sf_devops__Work\ Item\ Layout.layout
│ ├── lwc
│ │ ├── chart
│ │ │ ├── chart.html
│ │ │ ├── chart.js
│ │ │ └── chart.js-meta.xml
│ │ ├── hello
│ │ │ ├── hello.html
│ │ │ ├── hello.js
│ │ │ └── hello.js-meta.xml
│ │ └── uIRecordAPIAccount
│ │ ├── uIRecordAPIAccount.html
│ │ ├── uIRecordAPIAccount.js
│ │ └── uIRecordAPIAccount.js-meta.xml
│ ├── navigationMenus
│ │ └── SFDC_Default_Navigation_mohanc.navigationMenu
│ ├── networks
│ │ └── mohanc.network
│ ├── notificationTypeConfig
│ │ └── NotificationTypeConfig.config
│ ├── objects
│ │ ├── FruitTree__c.object
│ │ ├── Sales_Play__c.object
│ │ └── Tree__c.object
│ ├── package.xml
│ ├── pages
│ │ ├── AnswersHome.page
│ │ ├── AnswersHome.page-meta.xml
│ │ ├── BandwidthExceeded.page
│ │ ├── BandwidthExceeded.page-meta.xml
│ │ ├── ChangePassword.page
│ │ ├── ChangePassword.page-meta.xml
│ │ ├── CommunitiesLanding.page
│ │ ├── CommunitiesLanding.page-meta.xml
│ │ ├── CommunitiesLogin.page
│ │ ├── CommunitiesLogin.page-meta.xml
│ │ ├── CommunitiesSelfReg.page
│ │ ├── CommunitiesSelfReg.page-meta.xml
│ │ ├── CommunitiesSelfRegConfirm.page
│ │ ├── CommunitiesSelfRegConfirm.page-meta.xml
│ │ ├── CommunitiesTemplate.page
│ │ ├── CommunitiesTemplate.page-meta.xml
│ │ ├── Exception.page
│ │ ├── Exception.page-meta.xml
│ │ ├── FieldAuditView.page
│ │ ├── FieldAuditView.page-meta.xml
│ │ ├── FileNotFound.page
│ │ ├── FileNotFound.page-meta.xml
│ │ ├── ForgotPassword.page
│ │ ├── ForgotPassword.page-meta.xml
│ │ ├── ForgotPasswordConfirm.page
│ │ ├── ForgotPasswordConfirm.page-meta.xml
│ │ ├── IdeasHome.page
│ │ ├── IdeasHome.page-meta.xml
│ │ ├── InMaintenance.page
│ │ ├── InMaintenance.page-meta.xml
│ │ ├── MicrobatchSelfReg.page
│ │ ├── MicrobatchSelfReg.page-meta.xml
│ │ ├── MyProfilePage.page
│ │ ├── MyProfilePage.page-meta.xml
│ │ ├── SiteLogin.page
│ │ ├── SiteLogin.page-meta.xml
│ │ ├── SiteRegister.page
│ │ ├── SiteRegister.page-meta.xml
│ │ ├── SiteRegisterConfirm.page
│ │ ├── SiteRegisterConfirm.page-meta.xml
│ │ ├── SiteTemplate.page
│ │ ├── SiteTemplate.page-meta.xml
│ │ ├── StdExceptionTemplate.page
│ │ ├── StdExceptionTemplate.page-meta.xml
│ │ ├── Unauthorized.page
│ │ ├── Unauthorized.page-meta.xml
│ │ ├── UnderConstruction.page
│ │ └── UnderConstruction.page-meta.xml
│ ├── permissionsets
│ │ ├── Experience_Profile_Manager.permissionset
│ │ ├── Manage_Encryption_Keys.permissionset
│ │ ├── sf_devops_InitializeEnvironments.permissionset
│ │ ├── sf_devops_NamedCredentials.permissionset
│ │ └── test2.permissionset
│ ├── platformEventChannelMembers
│ │ ├── ChangeEvents_AccountChangeEvent.platformEventChannelMember
│ │ ├── ChangeEvents_CampaignChangeEvent.platformEventChannelMember
│ │ └── ChangeEvents_CaseChangeEvent.platformEventChannelMember
│ ├── profilePasswordPolicies
│ │ └── PT1_profilePasswordPolicy1672084508523.profilePasswordPolicy
│ ├── profileSessionSettings
│ │ └── PT1_profileSessionSetting1672084508612.profileSessionSetting
│ ├── profiles
│ │ ├── Admin.profile
│ │ ├── Analytics\ Cloud\ Integration\ User.profile
│ │ ├── Analytics\ Cloud\ Security\ User.profile
│ │ ├── Authenticated\ Website.profile
│ │ ├── Chatter\ External\ User.profile
│ │ ├── Chatter\ Free\ User.profile
│ │ ├── Chatter\ Moderator\ User.profile
│ │ ├── ContractManager.profile
│ │ ├── Cross\ Org\ Data\ Proxy\ User.profile
│ │ ├── Custom%3A\ Marketing\ Profile.profile
│ │ ├── Custom%3A\ Sales\ Profile.profile
│ │ ├── Custom%3A\ Support\ Profile.profile
│ │ ├── Customer\ Community\ Login\ User.profile
│ │ ├── Customer\ Community\ Plus\ Login\ User.profile
│ │ ├── Customer\ Community\ Plus\ User.profile
│ │ ├── Customer\ Community\ User.profile
│ │ ├── Customer\ Portal\ Manager\ Custom.profile
│ │ ├── Customer\ Portal\ Manager\ Standard.profile
│ │ ├── External\ Apps\ Login\ User.profile
│ │ ├── External\ Identity\ User.profile
│ │ ├── Force%2Ecom\ -\ App\ Subscription\ User.profile
│ │ ├── Force%2Ecom\ -\ Free\ User.profile
│ │ ├── Gold\ Partner\ User.profile
│ │ ├── Guest\ License\ User.profile
│ │ ├── High\ Volume\ Customer\ Portal\ User.profile
│ │ ├── HighVolumePortal.profile
│ │ ├── Identity\ User.profile
│ │ ├── MarketingProfile.profile
│ │ ├── Minimum\ Access\ -\ Salesforce.profile
│ │ ├── Partner\ App\ Subscription\ User.profile
│ │ ├── Partner\ Community\ Login\ User.profile
│ │ ├── Partner\ Community\ User.profile
│ │ ├── PlatformPortal.profile
│ │ ├── Read\ Only.profile
│ │ ├── Silver\ Partner\ User.profile
│ │ ├── SolutionManager.profile
│ │ ├── Standard.profile
│ │ ├── StandardAul.profile
│ │ ├── Work%2Ecom\ Only\ User.profile
│ │ └── mohanc\ Profile.profile
│ ├── quickActions
│ │ ├── Case.LogACall.quickAction
│ │ ├── Case.NewChildCase.quickAction
│ │ ├── CollaborationGroup.NewGroupMember.quickAction
│ │ ├── CollaborationGroup.NewGroupRecord.quickAction
│ │ ├── FeedItem.NewTaskFromFeedItem.quickAction
│ │ ├── GuestCommunityCase.quickAction
│ │ ├── LogACall.quickAction
│ │ ├── NewAccount.quickAction
│ │ ├── NewCase.quickAction
│ │ ├── NewCommunityCase.quickAction
│ │ ├── NewContact.quickAction
│ │ ├── NewEvent.quickAction
│ │ ├── NewGroup.quickAction
│ │ ├── NewLead.quickAction
│ │ ├── NewNote.quickAction
│ │ ├── NewOpportunity.quickAction
│ │ ├── NewTask.quickAction
│ │ ├── SendEmail.quickAction
│ │ ├── Task.Defer.quickAction
│ │ ├── Task.EditDescription.quickAction
│ │ ├── Task.UpdatePriority.quickAction
│ │ └── Task.UpdateStatus.quickAction
│ ├── reportTypes
│ │ └── screen_flows_prebuilt_crt.reportType
│ ├── roles
│ │ ├── CEO.role
│ │ ├── CFO.role
│ │ ├── COO.role
│ │ ├── ChannelSalesTeam.role
│ │ ├── CustomerSupportInternational.role
│ │ ├── CustomerSupportNorthAmerica.role
│ │ ├── DirectorChannelSales.role
│ │ ├── DirectorDirectSales.role
│ │ ├── EasternSalesTeam.role
│ │ ├── InstallationRepairServices.role
│ │ ├── MarketingTeam.role
│ │ ├── SVPCustomerServiceSupport.role
│ │ ├── SVPHumanResources.role
│ │ ├── SVPSalesMarketing.role
│ │ ├── VPInternationalSales.role
│ │ ├── VPMarketing.role
│ │ ├── VPNorthAmericanSales.role
│ │ └── WesternSalesTeam.role
│ ├── settings
│ │ ├── Account.settings
│ │ ├── AccountIntelligence.settings
│ │ ├── Actions.settings
│ │ ├── Activities.settings
│ │ ├── Address.settings
│ │ ├── Analytics.settings
│ │ ├── Apex.settings
│ │ ├── AppAnalytics.settings
│ │ ├── AppExperience.settings
│ │ ├── Archive.settings
│ │ ├── Blockchain.settings
│ │ ├── Bot.settings
│ │ ├── BusinessHours.settings
│ │ ├── Campaign.settings
│ │ ├── Case.settings
│ │ ├── Chatter.settings
│ │ ├── ChatterEmailsMD.settings
│ │ ├── Commerce.settings
│ │ ├── Communities.settings
│ │ ├── Company.settings
│ │ ├── ConnectedApp.settings
│ │ ├── Content.settings
│ │ ├── Contract.settings
│ │ ├── ConversationalIntelligence.settings
│ │ ├── Currency.settings
│ │ ├── CustomAddressField.settings
│ │ ├── CustomerDataPlatform.settings
│ │ ├── CustomizablePropensityScoring.settings
│ │ ├── DataDotCom.settings
│ │ ├── Deployment.settings
│ │ ├── DevHub.settings
│ │ ├── Discovery.settings
│ │ ├── DocumentChecklist.settings
│ │ ├── EAC.settings
│ │ ├── EinsteinDocumentCapture.settings
│ │ ├── EmailAdministration.settings
│ │ ├── EmailIntegration.settings
│ │ ├── EmailTemplate.settings
│ │ ├── EmployeeFieldAccess.settings
│ │ ├── EmployeeUser.settings
│ │ ├── EncryptionKey.settings
│ │ ├── EngagementMessaging.settings
│ │ ├── EnhancedNotes.settings
│ │ ├── Entitlement.settings
│ │ ├── Essentials.settings
│ │ ├── Event.settings
│ │ ├── ExperienceBundle.settings
│ │ ├── ExternalClientApp.settings
│ │ ├── FieldService.settings
│ │ ├── FileUploadAndDownloadSecurity.settings
│ │ ├── FilesConnect.settings
│ │ ├── Flow.settings
│ │ ├── Forecasting.settings
│ │ ├── Formula.settings
│ │ ├── GoogleApps.settings
│ │ ├── Ideas.settings
│ │ ├── IdentityProvider.settings
│ │ ├── IncidentMgmt.settings
│ │ ├── Industries.settings
│ │ ├── IndustriesEinsteinFeature.settings
│ │ ├── InvocableAction.settings
│ │ ├── Knowledge.settings
│ │ ├── Language.settings
│ │ ├── LeadConfig.settings
│ │ ├── LightningExperience.settings
│ │ ├── LiveAgent.settings
│ │ ├── LiveMessage.settings
│ │ ├── Macro.settings
│ │ ├── MailMerge.settings
│ │ ├── Meetings.settings
│ │ ├── Mobile.settings
│ │ ├── MyDomain.settings
│ │ ├── Name.settings
│ │ ├── Notifications.settings
│ │ ├── OauthOidc.settings
│ │ ├── ObjectLinking.settings
│ │ ├── OmniChannel.settings
│ │ ├── Opportunity.settings
│ │ ├── Order.settings
│ │ ├── OrderManagement.settings
│ │ ├── Org.settings
│ │ ├── Pardot.settings
│ │ ├── PardotEinstein.settings
│ │ ├── PartyDataModel.settings
│ │ ├── PathAssistant.settings
│ │ ├── PaymentsIngestEnabled.settings
│ │ ├── PaymentsManagementEnabled.settings
│ │ ├── Picklist.settings
│ │ ├── PlatformEncryption.settings
│ │ ├── Portals.settings
│ │ ├── PredictionBuilder.settings
│ │ ├── Privacy.settings
│ │ ├── Product.settings
│ │ ├── QuickText.settings
│ │ ├── Quote.settings
│ │ ├── RealTimeEvent.settings
│ │ ├── RecommendationBuilder.settings
│ │ ├── RecordPage.settings
│ │ ├── RetailExecution.settings
│ │ ├── Sandbox.settings
│ │ ├── Schema.settings
│ │ ├── Search.settings
│ │ ├── Security.settings
│ │ ├── ServiceCloudVoice.settings
│ │ ├── ServiceSetupAssistant.settings
│ │ ├── Sharing.settings
│ │ ├── Site.settings
│ │ ├── SocialProfile.settings
│ │ ├── SourceTracking.settings
│ │ ├── Survey.settings
│ │ ├── SystemNotification.settings
│ │ ├── Territory2.settings
│ │ ├── Trailhead.settings
│ │ ├── TrialOrg.settings
│ │ ├── UserEngagement.settings
│ │ ├── UserInterface.settings
│ │ ├── UserManagement.settings
│ │ ├── Voice.settings
│ │ ├── WebToX.settings
│ │ └── WorkDotCom.settings
│ ├── sharingRules
│ │ ├── Account.sharingRules
│ │ ├── AccountBrand.sharingRules
│ │ ├── ActionCadenceAsyncJob.sharingRules
│ │ ├── ActivityRecurrence2.sharingRules
│ │ ├── ActivityRecurrence2Exception.sharingRules
│ │ ├── AlternativePaymentMethod.sharingRules
│ │ ├── AppointmentInvitation.sharingRules
│ │ ├── Asset.sharingRules
│ │ ├── AssistantInteraction.sharingRules
│ │ ├── AssistantProgress.sharingRules
│ │ ├── AssistantRecommendation.sharingRules
│ │ ├── AuthorizationForm.sharingRules
│ │ ├── AuthorizationFormConsent.sharingRules
│ │ ├── AuthorizationFormDataUse.sharingRules
│ │ ├── BusinessBrand.sharingRules
│ │ ├── CalendarModel.sharingRules
│ │ ├── CalendarView.sharingRules
│ │ ├── Campaign.sharingRules
│ │ ├── Case.sharingRules
│ │ ├── ChangeRequest.sharingRules
│ │ ├── ChannelProgram.sharingRules
│ │ ├── CleanLeadBackup.sharingRules
│ │ ├── CommSubscription.sharingRules
│ │ ├── CommSubscriptionChannelType.sharingRules
│ │ ├── CommSubscriptionConsent.sharingRules
│ │ ├── ComponentResponseCache.sharingRules
│ │ ├── ConsumptionSchedule.sharingRules
│ │ ├── Contact.sharingRules
│ │ ├── ContactPointConsent.sharingRules
│ │ ├── ContactPointTypeConsent.sharingRules
│ │ ├── ContactRequest.sharingRules
│ │ ├── ConvCoachingRecommendation.sharingRules
│ │ ├── Coupon.sharingRules
│ │ ├── CreditMemo.sharingRules
│ │ ├── Customer.sharingRules
│ │ ├── DataPrepServiceLocator.sharingRules
│ │ ├── DataUseLegalBasis.sharingRules
│ │ ├── DataUsePurpose.sharingRules
│ │ ├── DelegatedAccount.sharingRules
│ │ ├── EngagementChannelType.sharingRules
│ │ ├── ExchangeUserMapping.sharingRules
│ │ ├── ExternalEventMapping.sharingRules
│ │ ├── FinanceBalanceSnapshot.sharingRules
│ │ ├── FinanceTransaction.sharingRules
│ │ ├── FlowInterview.sharingRules
│ │ ├── FlowInterviewLog.sharingRules
│ │ ├── FlowOrchestrationInstance.sharingRules
│ │ ├── FlowOrchestrationLog.sharingRules
│ │ ├── FlowOrchestrationStageInstance.sharingRules
│ │ ├── FlowOrchestrationStepInstance.sharingRules
│ │ ├── FlowOrchestrationWorkItem.sharingRules
│ │ ├── FlowTestResult.sharingRules
│ │ ├── FruitTree__c.sharingRules
│ │ ├── Image.sharingRules
│ │ ├── Incident.sharingRules
│ │ ├── Individual.sharingRules
│ │ ├── Invoice.sharingRules
│ │ ├── Lead.sharingRules
│ │ ├── LearningItem.sharingRules
│ │ ├── LegalEntity.sharingRules
│ │ ├── ListEmail.sharingRules
│ │ ├── LmaApiSession.sharingRules
│ │ ├── LocWaitlistMsgTemplate.sharingRules
│ │ ├── Location.sharingRules
│ │ ├── LocationGroup.sharingRules
│ │ ├── LocationWaitlist.sharingRules
│ │ ├── LocationWaitlistedParty.sharingRules
│ │ ├── Macro.sharingRules
│ │ ├── MacroUsage.sharingRules
│ │ ├── ManagedContentRelease.sharingRules
│ │ ├── MessagingEndUser.sharingRules
│ │ ├── MessagingSession.sharingRules
│ │ ├── MobileHomeConfiguration.sharingRules
│ │ ├── NetworkMemberChunk.sharingRules
│ │ ├── NotifDeliveryUserPref.sharingRules
│ │ ├── Opportunity.sharingRules
│ │ ├── Order.sharingRules
│ │ ├── OrgDeleteRequest.sharingRules
│ │ ├── PartnerFundAllocation.sharingRules
│ │ ├── PartnerFundClaim.sharingRules
│ │ ├── PartnerFundRequest.sharingRules
│ │ ├── PartnerMarketingBudget.sharingRules
│ │ ├── PartyConsent.sharingRules
│ │ ├── Problem.sharingRules
│ │ ├── ProcessException.sharingRules
│ │ ├── Product2.sharingRules
│ │ ├── Promotion.sharingRules
│ │ ├── PromotionSegment.sharingRules
│ │ ├── PromptAction.sharingRules
│ │ ├── PromptError.sharingRules
│ │ ├── QuickText.sharingRules
│ │ ├── QuickTextUsage.sharingRules
│ │ ├── RecordMergeHistory.sharingRules
│ │ ├── ReportResultBlob.sharingRules
│ │ ├── RequestsForAccessSIQ.sharingRules
│ │ ├── ReturnOrder.sharingRules
│ │ ├── Sales_Play__c.sharingRules
│ │ ├── Scorecard.sharingRules
│ │ ├── Seller.sharingRules
│ │ ├── ServiceAppointment.sharingRules
│ │ ├── ServiceContract.sharingRules
│ │ ├── ServiceResource.sharingRules
│ │ ├── ServiceTerritory.sharingRules
│ │ ├── Shift.sharingRules
│ │ ├── SiqUserBlacklist.sharingRules
│ │ ├── SiteUserViewMode.sharingRules
│ │ ├── StreamActivityAccess.sharingRules
│ │ ├── StreamingChannel.sharingRules
│ │ ├── TaskLabel.sharingRules
│ │ ├── TaskLabelRelation.sharingRules
│ │ ├── TodayGoal.sharingRules
│ │ ├── Tree__c.sharingRules
│ │ ├── User.sharingRules
│ │ ├── UserAppMenuCustomization.sharingRules
│ │ ├── UserEmailPreferredPerson.sharingRules
│ │ ├── UserExternalCredential.sharingRules
│ │ ├── UserProvisioningRequest.sharingRules
│ │ ├── VoiceCall.sharingRules
│ │ ├── WebCart.sharingRules
│ │ ├── WebCartDocument.sharingRules
│ │ ├── WebStore.sharingRules
│ │ ├── WorkOrder.sharingRules
│ │ ├── WorkPlan.sharingRules
│ │ ├── WorkPlanTemplate.sharingRules
│ │ ├── WorkStepTemplate.sharingRules
│ │ ├── WorkType.sharingRules
│ │ └── WorkTypeGroup.sharingRules
│ ├── staticresources
│ │ ├── SiteSamples.resource
│ │ ├── SiteSamples.resource-meta.xml
│ │ ├── chartJs.resource
│ │ └── chartJs.resource-meta.xml
│ ├── translations
│ │ ├── en_US.translation
│ │ ├── es.translation
│ │ └── fr.translation
│ ├── triggers
│ │ ├── Simple.trigger
│ │ └── Simple.trigger-meta.xml
│ └── workflows
│ └── Case.workflow
└── unpackaged.zip
46 directories, 704 files
Auditing Scripts

Contents
- List unapproved users with given profile in the org
- List unapproved users having ModifyAllData
- List unapproved users having ViewAllData
- List Users with given Profile who have Role
- Check users with given PermissionSet
- Check SetupAuditTrail
- Operationalize Audit Scripts using github-actions
List unapproved users with given profile in the org
Sample approved_admins.txt content:
Ken Thompson
Dennis Ritchie
Niklaus Wirth
ActiveUsers.soql
SELECT Name FROM User
WHERE isActive = true AND Profile.Name = '$PROFILE'
disjoint-stdin.py
#!/usr/local/bin/python3
# print disjoint lines between stdin and input file
# mchinnappan
#--------------------------------------------------
import sys
usage = """
----------------------------------------------------------------
python3 disjoint-stdin.py ref_file.txt
- Checks and prints any line in the stdin not in the ref_file
----------------------------------------------------------------
"""
if len(sys.argv) != 2:
    print(f"Usage: {usage}")
    exit(0)

file2 = sys.argv[1]
file1_contents = [line.strip() for line in sys.stdin.readlines()]
set1 = set(file1_contents)

with open(file2, 'r') as f2:
    file2_contents = [line.strip() for line in f2.readlines()]
    set2 = set(file2_contents)

for item in set1:
    if item not in set2:
        print(item)
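At its core, disjoint-stdin.py is just a set difference. A minimal sketch of that logic, using hypothetical user names:

```python
# Set-difference logic behind disjoint-stdin.py (names are illustrative).
approved = {"Ken Thompson", "Dennis Ritchie", "Niklaus Wirth"}   # ref file
active = {"Ken Thompson", "Grace Hopper", "Dennis Ritchie"}      # stdin

# Every active user missing from the approved list gets flagged.
unapproved = sorted(active - approved)
print(unapproved)  # ['Grace Hopper']
```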
Script to find out the users not in the approved list
export PROFILE="System Administrator";
echo "SELECT Name FROM User WHERE isActive = true AND Profile.Name = '$PROFILE'" > _check.soql
sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check.soql -f json | jq '.[].Name' | sort | sed 's/"//g'| python3 py/disjoint-stdin.py data/approved-admins.txt
- The output of this script is the list of active users whose profile is System Administrator but whose names are not in the approved-admins.txt file. Sample output:
mohan chinnappan
Operationalize Audit Scripts using github-actions
name: Auditor
run-name: on-demand-auditor
on: [workflow_dispatch]
jobs:
  setup-dx-py-audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: npm install sfdx-cli -g
      - run: echo "${{ secrets.SALESFORCE_JWT_SECRET_KEY }}" > server.key
      - run: sfdx force:auth:jwt:grant --clientid=${{ secrets.SALESFORCE_CONSUMER_KEY }} --jwtkeyfile=server.key --username=${{ secrets.SALESFORCE_DEVHUB_USERNAME }} --setdefaultdevhubusername
      - name: Install mohanc sfdx plugin
        run: |
          echo 'y' | sfdx plugins:install sfdx-mohanc-plugins
          sfdx plugins
      - name: Source DeployCheck Deploy
        run: |
          cd mc2Project
          sfdx force:source:deploy -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -p force-app -c --verbose
          sfdx mohanc:tooling:query -q ../.soql/deploymentStatus.soql -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -f json > ./deploymentStatus.json
          cat ./deploymentStatus.json
      - name: Setup Python
        uses: actions/setup-python@v3.1.3
        with:
          python-version: 3.10.10
      # List unapproved users with given profile in the org - data/approved-admins.txt
      - name: run-sys-admin-check
        run: |
          export PROFILE="System Administrator";
          echo "SELECT Name FROM User WHERE isActive = true AND Profile.Name = '$PROFILE'" > _check.soql
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check.soql -f json | jq '.[].Name' | sort | sed 's/"//g' | python3 py/disjoint-stdin.py data/approved-admins.txt
      # List unapproved users having ModifyAllData - data/approved-admins.txt
      - name: run-ModifyAllData-check
        run: |
          export PERM="PermissionsModifyAllData";
          echo "SELECT AssigneeId FROM PermissionSetAssignment WHERE PermissionSet.$PERM=true" > _check2.soql
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check2.soql -f json | jq '.[].AssigneeId' | sed 's/"//g' | python3 py/list2in.py | sed "s/{/SELECT Name FROM USER WHERE isActive=true AND Id IN (/g" | sed 's/}/)/g' > _psa_.soql;
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _psa_.soql -f json | jq '.[].Name' | sed 's/"//g' | python3 py/disjoint-stdin.py data/approved-admins.txt
      # List unapproved users having ViewAllData - data/approved-admins.txt
      - name: run-ViewAllData-check
        run: |
          export PERM="PermissionsViewAllData";
          echo "SELECT AssigneeId FROM PermissionSetAssignment WHERE PermissionSet.$PERM=true" > _check2.soql
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check2.soql -f json | jq '.[].AssigneeId' | sed 's/"//g' | python3 py/list2in.py | sed "s/{/SELECT Name FROM USER WHERE isActive=true AND Id IN (/g" | sed 's/}/)/g' > _psa_.soql;
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _psa_.soql -f json | jq '.[].Name' | sed 's/"//g' | python3 py/disjoint-stdin.py data/approved-admins.txt
      # List Users with given Profile who have Role
      - name: run-users-with-role-check
        run: |
          export PROFILE="System Administrator";
          echo "SELECT Name, UserRole.Name FROM User WHERE isActive = true AND Profile.Name = '$PROFILE' AND UserRole.Name != null" > _check3.soql
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check3.soql -f json | jq '.[].Name + "|" + .[].UserRole.Name' | sed 's/"//g'
      # Check users with given PermissionSet - EinsteinAnalyticsPlusAdmin
      - name: check-user-with-given-ps
        run: |
          export PERMSET="EinsteinAnalyticsPlusAdmin";
          echo "SELECT Id, PermissionSetId, PermissionSet.Name, PermissionSet.ProfileId, PermissionSet.Profile.Name, AssigneeId, Assignee.Name FROM PermissionSetAssignment WHERE PermissionSet.Name LIKE '%$PERMSET%'" > _check4.soql
          sfdx mohanc:data:query -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -q _check4.soql -f json | jq '.[].Assignee.Name + "|" + .[].PermissionSet.Name' | sed 's/"//g' | sort -u
- You can use this script to check other profiles as well, for example the Developer profile:
export PROFILE="Developer";
echo "SELECT Name FROM User WHERE isActive = true AND Profile.Name = '$PROFILE'" > _check.soql
sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _check.soql -f json | jq '.[].Name' | sort | sed 's/"//g'| python3 disjoint-stdin.py approved_developers.txt
List unapproved users having ModifyAllData
- psAssignment_id.soql
SELECT AssigneeId
FROM PermissionSetAssignment
WHERE PermissionSet.$PERM=true
- list2in.py
import sys
file1_contents = [line.strip() for line in sys.stdin.readlines()]
print (set(file1_contents))
export PERM="PermissionsModifyAllData"; sed "s/\$PERM/$PERM/" psAssignment_id.soql > _checkPS.soql;
sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _checkPS.soql -f json | jq '.[].AssigneeId' | sed 's/"//g' | python3 list2in.py |sed "s/{/SELECT Name FROM USER WHERE isActive=true AND Id IN (/g" | sed 's/}/)/g' > _psa_.soql; sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _psa_.soql -f json | jq '.[].Name' | sed 's/"//g' | python3 disjoint-stdin.py approved_mad.txt
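The reason list2in.py simply prints a Python set: the set's repr single-quotes each string, so once the surrounding sed commands swap { and } for the SOQL prefix and closing parenthesis, the result is a valid IN (...) clause. A sketch of that transformation, with hypothetical Ids:

```python
# Hypothetical AssigneeIds piped in from the first query (note the duplicate).
assignee_ids = ["005xx01", "005xx02", "005xx01"]

id_set = set(assignee_ids)  # de-duplicates; repr looks like {'005xx01', '005xx02'}

# What the sed commands in the pipeline do to the printed set:
in_clause = repr(id_set).replace("{", "(").replace("}", ")")
soql = "SELECT Name FROM User WHERE IsActive = true AND Id IN " + in_clause
print(soql)
```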
List unapproved users having ViewAllData
export PERM="PermissionsViewAllData"; sed "s/\$PERM/$PERM/" psAssignment_id.soql > _checkPS.soql;
sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _checkPS.soql -f json | jq '.[].AssigneeId' | sed 's/"//g' | python3 list2in.py |sed "s/{/SELECT Name FROM USER WHERE isActive=true AND Id IN (/g" | sed 's/}/)/g' > _psa_.soql; sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _psa_.soql -f json | jq '.[].Name' | sed 's/"//g' | python3 disjoint-stdin.py approved_vad.txt
List Users with given Profile who have Role
cat usersWithRole.soql
SELECT Name, UserRole.Name FROM User
WHERE isActive = true AND Profile.Name = '$PROFILE' AND UserRole.Name != null
export PROFILE="System Administrator"; sed "s/\$PROFILE/$PROFILE/" usersWithRole.soql > _check.soql; \
sfdx mohanc:data:query -u mohan.chinnappan.n.sel@gmail.com -q _check.soql -f json | jq '.[].Name + "|" + .[].UserRole.Name' | sed 's/"//g'
- Sample output - pipe separated - Name|Role
Joe Carpenter|ROLE1
John Smith|ROLE2
Check users with given PermissionSet
cat PermsetUsers.soql
SELECT Id, PermissionSetId, PermissionSet.Name,
PermissionSet.ProfileId, PermissionSet.Profile.Name,
AssigneeId, Assignee.Name
FROM PermissionSetAssignment
WHERE PermissionSet.Name LIKE '%$PERMSET%'
export PERMSET="EinsteinAnalyticsPlusAdmin"; sed "s/\$PERMSET/$PERMSET/" PermsetUsers.soql > _psUsers.soql; sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q _psUsers.soql -f json |jq '.[].Assignee.Name + "|" + .[].PermissionSet.Name' | sed 's/"//g' | sort -u
output of this command
Mohan Chinnappan|EinsteinAnalyticsPlusAdmin
If you would like to check against an approved list:
cat approved_crmaadmin.txt
John Smith
export PERMSET="EinsteinAnalyticsPlusAdmin"; sed "s/\$PERMSET/$PERMSET/" PermsetUsers.soql > _psUsers.soql; sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q _psUsers.soql -f json |jq '.[].Assignee.Name + "|" + .[].PermissionSet.Name' | sed 's/"//g' | sort -u | python3 disjoint-stdin.py approved_crmaadmin.txt
output of this command
Mohan Chinnappan
Check SetupAuditTrail
Using Datatable view
sfdx mohanc:data:bulkapi:query -u mohan.chinnappan.n.sel2@gmail.com -q setupAduit.soql; sed 's/CreatedBy.Name/CreatedByName/g' setupAduit.soql.csv | pbcopy; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
Using sqlite3
#!/usr/local/bin/python3
# SQL Query for CSV using pandas and sqlite
# mchinnappan
#--------------------------------------------------
import sys
import time
usage = """
----------------------------------------------------------------
python3 csvQuery input_file query
# SQL Query for CSV using pandas and sqlite on SetupAuditTrail
----------------------------------------------------------------
"""
if len(sys.argv) != 3:
    print(f"Usage: {usage}")
    exit(0)

import pandas as pd
import sqlite3
input_file = sys.argv[1]
table_name = 't' + str(round(time.time()*1000) )
sql_string_in = sys.argv[2] # Example for SetupAuditTrail: SELECT CreatedDate, "CreatedBy.Name" AS name, Display FROM sat WHERE Section ="Manage Users"
sql_string = sql_string_in.replace('$TBL', table_name)
# read CSV file into a DataFrame
df = pd.read_csv(input_file)
conn = sqlite3.connect("_temp.db") #if the db does not exist, it will be created
#store your table in the database:
df.to_sql(table_name, conn)
#read a SQL Query out of your database and into a pandas dataframe
df = pd.read_sql(sql_string, conn)
df.to_csv(sys.stdout)
cat setupAduit.soql
SELECT CreatedDate,
CreatedBy.Name,
CreatedByContext,
CreatedByIssuer,
Display,
Section,
DelegateUser
FROM SetupAuditTrail
sfdx mohanc:data:bulkapi:query -u mohan.chinnappan.n.sel@gmail.com -q setupAduit.soql ; python3 querycsv.py setupAduit.soql.csv ' SELECT CreatedDate, "CreatedBy.Name" AS name, Display FROM $TBL WHERE Section ="Manage Users"'
Output of this command:
,CreatedDate,name,Display
0,2022-10-25T20:36:52.000Z,mohan chinnappan,Created permission set DevOps Center: with no license
1,2022-10-25T20:36:53.000Z,mohan chinnappan,Created permission set DevOps Center Manager: with no license
2,2022-10-25T20:36:53.000Z,mohan chinnappan,Created permission set DevOps Center Release Manager: with no license
3,2022-10-25T20:39:25.000Z,,Created permission set sf_devops_NamedCredentials: with no license
4,2022-10-25T20:39:26.000Z,,Created permission set sf_devops_InitializeEnvironments: with no license
5,2022-11-10T09:30:51.000Z,mohan chinnappan,Set new password for user null
- You can change the SQL to query other sections (e.g. Apex Class, Company Profile, ...)
SELECT CreatedDate, "CreatedBy.Name" AS name, Display FROM $TBL WHERE Section ="Apex Class"
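The same load-the-CSV-into-SQLite-and-query pattern can be sketched with only the standard library (no pandas). The rows below are made up, and the CreatedBy.Name column is renamed to CreatedByName to keep the SQL simple, just as the sed step in the Datatable example does:

```python
import csv
import io
import sqlite3

# Hypothetical SetupAuditTrail export; real files come from the bulk query above.
rows = list(csv.DictReader(io.StringIO(
    "CreatedDate,CreatedByName,Display,Section\n"
    "2022-10-25T20:36:52.000Z,mohan chinnappan,Created permission set X,Manage Users\n"
    "2022-11-01T10:00:00.000Z,mohan chinnappan,Changed Apex Class Y,Apex Class\n"
)))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sat (CreatedDate, CreatedByName, Display, Section)")
conn.executemany(
    "INSERT INTO sat VALUES (:CreatedDate, :CreatedByName, :Display, :Section)", rows)

# Equivalent of: SELECT ... FROM $TBL WHERE Section = 'Apex Class'
result = conn.execute(
    "SELECT CreatedDate, CreatedByName, Display FROM sat WHERE Section = 'Apex Class'"
).fetchall()
print(result)
```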
References
Custom Metadata Types
- Records in a Custom Metadata Type are metadata, not data
- You can package Custom Metadata Type records
- Data Objects
  - Used by users
  - For the business
  - Standard
    - Account, Contact, Opportunity
  - Custom
    - Warranty, Invoice, Project, CarbonCredit...
- Metadata Types
  - Used by Admins
  - For the App
  - Standard
    - Validation Rules
    - Account Settings
    - Custom Object Translation
  - Custom
    - Stock Point Rule
    - Warranty Settings
- Custom Settings
  - Data in a Custom Setting is not packageable
CLI
# Creates a new custom metadata type in the current project.
force:cmdt:create
sfdx force:cmdt:create --typename MyCustomType
sfdx force:cmdt:create --typename MyCustomType --label "Custom Type" --plurallabel "Custom Types" --visibility Protected
# Generate a custom metadata field based on the field type provided.
force:cmdt:field:create
sfdx force:cmdt:field:create --fieldname MyField --fieldtype Checkbox
sfdx force:cmdt:field:create --fieldname MyField --fieldtype Picklist --picklistvalues "A,B,C"
sfdx force:cmdt:field:create --fieldname MyField --fieldtype Number --decimalplaces 2
# Generates a custom metadata type and all its records for the provided sObject.
force:cmdt:generate
sfdx force:cmdt:generate --devname MyCMDT --sobjectname MySourceObject__c
sfdx force:cmdt:generate --devname MyCMDT --sobjectname MySourceObject__c --ignoreunsupported --targetusername 'alias or user email of the org containing the source type'
# Create a new record for a given custom metadata type in the current project.
force:cmdt:record:create
sfdx force:cmdt:record:create --typename MyCMT__mdt --recordname MyRecord My_Custom_Field_1=Foo My_Custom_Field_2=Bar
# Create new custom metadata type records from a CSV file.
force:cmdt:record:insert
sfdx force:cmdt:record:insert --filepath path/to/my.csv --typename My_CMDT_Name
UUID
- UUIDs (Universally Unique Identifiers) are also known as GUIDs (Globally Unique Identifiers).
- A UUID is 128 bits long and can guarantee uniqueness across space and time.
- You can use a UUID wherever you need unique values for a field (say, an External ID or a Name).
- The spec for UUID is here
Apex example
// using ApexUuid
// https://github.com/jongpie/ApexUUID/blob/main/apex-uuid/classes/Uuid.cls
String uuid = new Uuid().getValue();
System.debug(uuid);

Account acct = new Account();
acct.Name = 'Green ' + uuid;
insert acct;
- Spring 24 release
// in Spring 24 release
UUID randomUuid = UUID.randomUUID();
// Prints the UUID string that was randomly generated
System.debug(randomUuid);
Python example
## simple_uuid.py
import uuid
print (uuid.uuid1())
print (uuid.uuid3(uuid.NAMESPACE_DNS, 'python.org'))
print (uuid.uuid4())
print (uuid.uuid5(uuid.NAMESPACE_DNS, 'python.org'))
- Running this :
python simple_uuid.py
e7bafaaa-bdc3-11ee-972e-acde48001122
6fa459ea-ee8a-3ca4-894e-db77e160355e
5b3418b4-d52e-458e-9ef5-a3279b476f97
886313e1-3b8a-5372-9b90-0c9aee199e5d
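If you need to validate UUID strings (for example, before using them as External IDs), the standard uuid module can do it. A small sketch:

```python
import uuid

def is_valid_uuid(value: str) -> bool:
    """Return True if value parses as a canonical UUID string."""
    try:
        # str(uuid.UUID(...)) is the lowercase, dashed canonical form.
        return str(uuid.UUID(value)) == value.lower()
    except ValueError:
        return False

print(is_valid_uuid("550e8400-e29b-41d4-a716-446655440000"))  # True
print(is_valid_uuid("not-a-uuid"))                            # False
```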
- Python version ApexUuid is here
# example.py
from Uuid4 import Uuid4

def example():
    try:
        # Create a new UUID instance
        uuid_instance = Uuid4()

        # Access the generated UUID
        uuid_value = uuid_instance.value
        print(f"Generated UUID: {uuid_value}")

        # Check if a UUID is valid
        is_valid = Uuid4.is_valid(uuid_value)
        print(f"Is Valid UUID: {is_valid}")

        # Format a UUID
        unformatted_uuid = "550e8400-e29b-41d4-a716-446655440000"
        formatted_uuid = Uuid4.format_value(unformatted_uuid)
        print(f"Formatted UUID: {formatted_uuid}")

        # Create a UUID instance from a string
        uuid_instance_from_string = Uuid4.value_of("550e8400-e29b-41d4-a716-446655440000")
        print(f"UUID from String: {uuid_instance_from_string.value}")
    except UuidException as e:
        print(f"UuidException: {e}")

if __name__ == "__main__":
    example()
python example.py
Generated UUID: 37cce53e-eb9e-4faf-8dcd-516fae0efafd
Is Valid UUID: True
Formatted UUID: 550e8400-e29b-41d4-a716-446655440000
UUID from String: 550e8400-e29b-41d4-a716-446655440000
Java version
- Uses this for Uuid4
// UuidExample.java
public class UuidExample {
    public static void main(String[] args) {
        try {
            // Create a new UUID instance
            Uuid4 uuidInstance = new Uuid4();

            // Access the generated UUID
            String uuidValue = uuidInstance.getValue();
            System.out.println("Generated UUID: " + uuidValue);

            // Check if a UUID is valid
            boolean isValid = Uuid4.isValid(uuidValue);
            System.out.println("Is Valid UUID: " + isValid);

            // Format a UUID
            String unformattedUuid = "550e8400-e29b-41d4-a716-446655440000";
            String formattedUuid = Uuid4.formatValue(unformattedUuid);
            System.out.println("Formatted UUID: " + formattedUuid);

            // Create a UUID instance from a string
            Uuid4 uuidInstanceFromString = Uuid4.valueOf("550e8400-e29b-41d4-a716-446655440000");
            System.out.println("UUID from String: " + uuidInstanceFromString.getValue());
        } catch (Uuid4.UuidException e) {
            System.out.println("UuidException: " + e.getMessage());
        }
    }
}
- Usage
javac Uuid4.java UuidExample.java
java UuidExample
Generated UUID: 7972fe1e-6ebb-4c27-9901-f7a025de7408
Is Valid UUID: true
Formatted UUID: 550e8400-e29b-41d4-a716-446655440000
UUID from String: 550e8400-e29b-41d4-a716-446655440000
References
- UUID
- Get Support for Randomly Generated UUID v4
- ApexUuid by our Jonathan Gillespie
- Python version of ApexUuid
- Java Uuid Example
Code Quality
Introduction
There is no such thing as "perfect" code — there is only better code!
Instead of seeking perfection, what we should seek is "continuous improvement"...
We rarely write tests for our tests — a human must ensure that tests are valid.
- Code Quality Is Everyone's Responsibility, including:
  - Developer
  - Tester
- Code quality aspects
Aspect | Comments |
---|---|
Reliability | Probability that a system will run without failure over a specific period of operation |
Maintainability | How easily the software can be maintained |
Testability | How well the software supports testing efforts |
Portability | How usable the same software is in different environments |
Reusability | Whether existing assets, such as code, can be used again |
- Code quality Metrics
Metric | Comments |
---|---|
Defect Metrics | Number of defects — and severity of those defects — are important metrics of overall quality. |
Complexity Metrics | Cyclomatic complexity for example |
Testability | How well the software supports testing efforts |
Portability | How usable the same software is in different environments |
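To make the cyclomatic complexity metric concrete, here is a hypothetical function with two decision points; analyzers such as PMD count each branch, giving a complexity of decision points + 1 = 3:

```python
def classify(n: int) -> str:
    # Two decision points (if, elif) + 1 entry path = cyclomatic complexity 3.
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    else:
        return "positive"

print(classify(-5), classify(0), classify(7))  # negative zero positive
```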
- Ways to Improve Code Quality
Way | Comments |
---|---|
Use a coding standard | Ensures everyone uses a consistent style |
Analyze code before code reviews | Analyze code as soon as it's written |
Follow code review best practices | Manual code reviews are still important for verifying the intent of the code |
Refactor legacy code (when necessary) | Helps you clean up your codebase and lower its complexity |
Write Maintainable, Testable, Good Code
- readable
- understandable
- testable
Good Code |
---|
Does what it should |
Follows a consistent style |
Is easy to understand |
Has been well-documented |
Can be tested |
Key Points |
---|
Avoid Code with global, mutable state - that any code can access, and any code can modify |
A class should ask for its dependencies through its constructor rather than acquiring them itself - dependency injection |
Optimize the code for the reader of the code, not the writer - Good Code Is Easily Read and Understood, in Part and in Whole |
DRY - Don't repeat yourself - it is easier to maintain the code if it only exists in one place |
Do the simplest, smallest thing you can do to add some value |
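The "ask for dependencies through the constructor" point can be sketched as follows; GeocodingService and FakeHttpClient are hypothetical names, not from a real library:

```python
class FakeHttpClient:
    """Test double: stands in for a real HTTP client."""
    def get(self, url: str) -> str:
        return f"stubbed response for {url}"

class GeocodingService:
    def __init__(self, client):
        # The dependency is injected, not constructed inside the class,
        # so tests can hand in a fake and avoid real network calls.
        self.client = client

    def lookup(self, address: str) -> str:
        return self.client.get("https://example.invalid/geocode?q=" + address)

service = GeocodingService(FakeHttpClient())
print(service.lookup("1 Market St"))
```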
Static Code Analysis
- Run static code analysis tools like PMD, which attempt to highlight possible violations and vulnerabilities within 'static' (non-running) source code by using techniques such as Taint Analysis and Data Flow Analysis
- Integrated with IDEs like VS Code, e.g. the VS Code Apex PMD extension
New PMD report
Demo (will be part of PMD 6.50.0)
Screenshot
ESLint
ESLinting
Lightning Lint Service
Inclusive Language
Code Review
- Peer Code Review
- Individual programmers are less than 50% efficient at finding bugs in their own software.
- Code Walk-thru
Goals |
---|
The overall code health of the codebase is improving over time |
The codebase stays consistent and maintainable |
- What to look for in a Code Review?
  - Developers should test their code well enough that it works correctly by the time it gets to code review.
Item | Comments |
---|---|
Design | Do the interactions in the code make sense? |
Functionality | Does the code do what the developer intended? |
Complexity | Is the code more complex than it should be? "Can't be understood quickly by code readers." Watch for over-engineering, where developers have made the code more generic than it needs to be |
Tests | Unit, integration, or end-to-end tests |
Naming | Did the developer pick good names for everything? |
Comments | Has the developer written clear comments in understandable language? A comment should explain why some code exists, not what the code is doing. Regular expressions and complex algorithms often benefit greatly from comments that explain what they're doing, for example. Note that comments are different from documentation of classes, modules, or functions, which should instead express the purpose of a piece of code, how it should be used, and how it behaves when used |
Style | Does the developer follow the published style for the given programming language? |
Documentation | READMEs and generated reference docs |
- Code Review Summary
Item |
---|
The code is well-designed. |
The functionality is good for the users of the code. |
Any UI changes are sensible and look good. |
Any parallel programming is done safely. |
The code isn’t more complex than it needs to be. |
The developer isn’t implementing things they might need in the future but don’t know they need now. |
Code has appropriate unit tests. |
Tests are well-designed. |
The developer used clear names for everything. |
Comments are clear and useful, and mostly explain why instead of what. |
Code is appropriately documented |
The code conforms to our style guides |
Note to the developers
Item | Comments |
---|---|
Don’t Take it Personally | The goal of review is to maintain the quality of the codebase and the products |
Fix the Code | 1. Clarify the code 2. Add a code comment that explains why the code is there (if required) 3. Modify the code |
Think Collaboratively | If you disagree with the reviewer, find ways to collaborate: ask for clarifications, discuss pros/cons, and provide explanations of why your method of doing things is better for the codebase |
Apex Best Practices
Practice |
---|
1. Bulkify your Code |
2. Avoid SOQL Queries or DML statements inside FOR Loops |
3. Bulkify your Helper Methods |
4. Using Collections, Streamlining Queries, and Efficient For Loops |
5. Streamlining Multiple Triggers on the Same Object |
6. Querying Large Data Sets |
7. Use of the Limits Apex Methods to Avoid Hitting Governor Limits |
8. Use @future Appropriately |
9. Writing Test Methods to Verify Large Datasets |
10. Avoid Hardcoding IDs |
Naming Conventions
- Based on Java Naming Conventions
Item | Notes | Example |
---|---|---|
Class Name | Should be unique and begin with an uppercase letter. Should not contain underscores or spaces (except for a prefix or suffix). Should be nouns in mixed case, with the first letter of each internal word capitalized | AccountController |
Variable Name | Should be in mixed case (camelCase) with a lowercase first letter. Internal words start with capital letters | geocodingUrl |
Method Name | Should be verbs, in mixed case with the first letter lowercase, with the first letter of each internal word capitalized - camelCase | getPagedPropertyList(...) |
Constants | Class constants should be all UPPERCASE with words separated by underscores | private static final String BASE_URL = 'https://nominatim.openstreetmap.org/search?format=json' |
Trigger |
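A quick, hypothetical way to spot-check these conventions is with regular expressions; the patterns below are illustrative only:

```python
import re

# Rough patterns for the conventions in the table above.
CLASS_NAME = re.compile(r"^[A-Z][A-Za-z0-9]*$")     # PascalCase, no underscores
MEMBER_NAME = re.compile(r"^[a-z][A-Za-z0-9]*$")    # camelCase
CONSTANT_NAME = re.compile(r"^[A-Z][A-Z0-9_]*$")    # UPPER_SNAKE_CASE

print(bool(CLASS_NAME.match("AccountController")))  # True
print(bool(MEMBER_NAME.match("geocodingUrl")))      # True
print(bool(CONSTANT_NAME.match("BASE_URL")))        # True
```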
Test Driven Development
- Test Driven Development (TDD) = Test First Development (TFD) + refactoring
- First, write the test code before the functional code.
- It is a way to get SelfTestingCode
Steps | Comments |
---|---|
Write a quick test | Basically just enough code to fail |
Run the test | It should fail |
Write functional code | Just enough to make the test pass |
Finally, run the test | To check whether it has passed |
Refactor | Keep tests passing while you continue with further development |
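The red-green cycle in the table can be sketched in plain Python (no test framework); add() is a hypothetical unit under test:

```python
def add(a, b):
    # "Green" step: the minimal functional code written after the test failed.
    return a + b

def test_add():
    # "Red" step: this test was written first, before add() existed.
    assert add(1, 2) == 3

test_add()  # rerun after every change; refactor only while it stays green
print("test passed")
```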
LWC Best Practices
SOQL Best Practices
Lightning Performance Best Practices
Large Data Volumes
LWC Test Coverage - Jest
Write Jest tests to: |
---|
Test a component in isolation |
Test a component’s public API (@api properties and methods, events) |
Test basic user interaction (clicks) |
Verify the DOM output of a component |
Verify that events fire when expected |
Simple Jest Demo
// app.js
function sum(a, b) {
return a + b;
}
module.exports = sum;
// app.test.js
const sum = require('./app');
test('adds 1 + 2 to equal 3', () => {
expect(sum(1, 2)).toBe(3);
});
yarn test
yarn test
yarn run v1.22.17
$ jest
PASS ./app.test.js
✓ adds 1 + 2 to equal 3 (1 ms)
Test Suites: 1 passed, 1 total
Tests: 1 passed, 1 total
Snapshots: 0 total
Time: 0.519 s, estimated 1 s
Ran all test suites.
✨ Done in 1.76s.
LWC Example (unedited)
Using Sa11y for 508 compliance
More info
Visualizing Code Coverage
- This demo shows how to visualize code coverage in VS Code
How to setup video - Enabling Apex Code Coverage Overlay in VS Code | Developer Quick Takes
LWC i18n
Demo showing how to use Custom Labels for localization
-
English (US)
-
French
-
Spanish
-
Example:
sfdx mohanc:mdapi:ls -u mohan.chinnappan.n.sel@gmail.com -i listCL.json -t CustomLabel
[ 'Greeting' ]
{
"fullName": "Greeting",
"language": "en_US",
"protected": "true",
"shortDescription": "Greeting",
"value": "Hello World From LWC"
}
Resources
Generate
sfdx mohanc:slides:gen -i cq.md -o cq.md.html -t 'Notes on Code Quality'
PMD Apex code scan using DX
- Interested in VS Code Extension for PMD Apex Code scan?
Topics
Make sure you have java installed
java --version
openjdk 11.0.14.1 2022-02-08
OpenJDK Runtime Environment Homebrew (build 11.0.14.1+0)
OpenJDK 64-Bit Server VM Homebrew (build 11.0.14.1+0, mixed mode)
Download and install pmd distribution - pmd-bin-6.47.0
- Download from pmd-bin-6.47.0
- Unzip the file pmd-bin-6.47.0.zip:
jar xvf pmd-bin-6.47.0.zip
- Go to the bin folder of the extracted pmd-bin-6.47.0.zip and make this copy:
cd pmd-bin-6.47.0/bin
cp run.sh pmd-run.sh
- Put the folder pmd-bin-6.47.0/bin in your PATH (~/.zshrc or ~/.bashrc). Example in ~/.zshrc:
# PMD env var holds the location of your PMD install
export PMD="/Users/mchinnappan/node-pmd/pmd-bin-6.47.0"
# let us put the bin folder of the PMD in path
export PATH="$PATH:$PMD/bin"
- make it executable
chmod +x pmd-run.sh
Extract Apex Classes from your Org
Login into the Org
sfdx force:auth:web:login
Successfully authorized mohan.chinnappan.n_ea2@gmail.com with org ID 00D3h000007R1LuEAK
Get the Apex Classes using DX
sfdx mohanc:mdapi:retrieve -u mohan.chinnappan.n_ea2@gmail.com -t "ApexClass"
cd /tmp/
/tmp >mkdir apex
/tmp >cd apex
/tmp/apex >sfdx mohanc:mdapi:checkRetrieveStatus -u mohan.chinnappan.n_ea2@gmail.com -i "09S3h000006DK3lEAG"
jar xvf 09S3h000006DK3lEAG.zip
Download the apex_ruleset.xml to your ~/.pmd folder
- create folder
mkdir ~/.pmd
- copy the apex_ruleset.xml from:
- apex_ruleset.xml
- to the folder ~/.pmd
Check for @SuppressWarning annotations in the code
grep -irn '@SuppressWarnings' unpackaged/classes
unpackaged/classes/GeocodingService.cls:1:@SuppressWarnings('PMD.ExcessiveParameterList')
Using VS Code
Run pmd csv output
- your apex class files are at: /tmp/apex/unpackaged/classes/
- csv output
pmd-run.sh pmd -R ~/.pmd/apex_ruleset.xml -d /tmp/apex/unpackaged/classes/ -f csv > /tmp/apex/pmd-results.csv
Script for CI/CD Pipeline cicd-pmd.sh
# Script for pipeline
# Bails out if count of P1,P2 and P3 are not zero
# -----------------------------------------
#----- configure the following to meet your needs ----
RULESET=~/.pmd/apex_ruleset.xml
MSG="Number of P1,P2 and P3 issues are"
THRESHOLD=4
CODE=/tmp/apex/unpackaged/classes/
#-----------------------------------------------------
# Run the pmd
echo "pmd-run.sh pmd -R $RULESET -d $CODE -f csv > results.csv"
pmd-run.sh pmd -R $RULESET -d $CODE -f csv > results.csv
# query the results using SQL
echo "SELECT COUNT(*) AS CNT FROM CSV(\"results.csv\", {headers:true}) WHERE Priority < $THRESHOLD" > q.sql
cat q.sql
sfdx mohanc:data:query:sql -q q.sql > out.json
# check for the errors
nerrors=`sfdx mohanc:data:jq -f '.[].CNT' -i out.json`
echo "nerrors: $nerrors"
if [ "$nerrors" != 0 ]
then
echo "$MSG: $nerrors. Stopping the deployment!"
exit 2
fi
echo "Continue the deployment..."
Sample Run
bash cicd-pmd.sh
Jan 10, 2023 9:42:14 PM net.sourceforge.pmd.PMD encourageToUseIncrementalAnalysis
WARNING: This analysis could be faster, please consider using Incremental Analysis: https://pmd.github.io/pmd-6.47.0/pmd_userdocs_incremental_analysis.html
nerrors: 0
Continue the deployment...
Sample Run with bailing out
Github Actions - PMD workflow bailing out when violations count is not zero
- uses: pmd/pmd-github-action@v1
id: pmd
with:
version: '6.40.0'
sourcePath: 'mc2Project/force-app/main/default/classes'
rulesets: 'pmd/rulesets/apex_ruleset.xml'
- name: Fail build if there are violations
if: steps.pmd.outputs.violations != 0
run: exit 1
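The gate used by cicd-pmd.sh and the workflow step above ("fail when the violation count is nonzero") can be sketched in Python, assuming PMD's CSV report format with a Priority column; the rows below are made up:

```python
import csv
import io

# Hypothetical PMD CSV report; a real one is produced by `-f csv` above.
report = io.StringIO(
    '"Problem","Package","File","Priority","Line","Description","Rule set","Rule"\n'
    '"1","","A.cls","1","10","Avoid SOQL inside loops","Performance","OperationWithLimitsInLoop"\n'
    '"2","","B.cls","4","3","Missing ApexDoc comment","Documentation","ApexDoc"\n'
)

THRESHOLD = 4  # same idea as cicd-pmd.sh: P1..P3 block the deployment
blocking = [row for row in csv.DictReader(report) if int(row["Priority"]) < THRESHOLD]
print(len(blocking))  # 1 -> the pipeline would bail out
```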
Open the results csv file
open /tmp/apex/pmd-results.csv
Run pmd html output
- your apex class files are at: /tmp/apex/unpackaged/classes/
- html output
pmd-run.sh pmd -R ~/.pmd/apex_ruleset.xml -d /tmp/apex/unpackaged/classes/ -f html > /tmp/apex/pmd-results.html
Open the results html file
open /tmp/apex/pmd-results.html
Run pmd enhanced html output
pmd-run.sh pmd -R ~/.pmd/apex_ruleset.xml -d ./unpackaged/classes -f xslt -property xsltFilename=html-report-v2.xslt > pmd-report.html
Screenshot
Demo
Rule set info
- P1 and P2 are from the PMD default rulesets:
  - As shown here: Default
- The ruleset we use is built on top of these default rulesets.
- The ruleset (which most PMD customers use):
Comments
Topics
Apex Code Comments
Class level
token | description |
---|---|
@author | the author of the class |
@date | the date the class was first implemented |
@group | a group to display this class under, in the menu hierarchy |
@group-content | a relative path to a static html file that provides content about the group |
@description | one or more lines that provide an overview of the class |
/*******************************************************************************************
* @name FruitService
* @author FirstName LastName <auth@email.com>
* @date 01/02/2022
* @group Fruit Services
* @description This class contains all methods related to managing Fruits in Salesforce.
*******************************************************************************************/
/*
MODIFICATION LOG
* Version Developer Date Description
*-------------------------------------------------------------------------------------------
* 1.0 FullName 01/02/2022 Initial Creation
******************************************************************************************
*/
Method level
token | description |
---|---|
@description | one or more lines that provide an overview of the method |
@param param name | a description of what the parameter does |
@return | a description of the return value from the method |
@example | Example code usage. This will be wrapped in tags to preserve whitespace |
/**************************************************************************************
* @description This method finds the count of the given Fruit and returns it
* @param String - name - Fruit name
* @return Integer - Value retrieved.
* @example
* FruitService.getFruitsCount('mango')
**************************************************************************************/
public static Integer getFruitsCount(String name) {
// ...
}
Tools to check for the comments in the repo
Check for comments in given folder with Apex classes
USAGE
$ sfdx mohanc:regex:apex:checkComments2 [-d <string>] [--json] [--loglevel
trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
OPTIONS
-d, --directoryname=directoryname Directory containing Apex Class
files
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for
this command invocation
EXAMPLE
** Check for comments in given folder with Apex classes **
$ sfdx mohanc:regex:apex:checkComments2 -d <directory containing source code>
- You can redirect the output to a file like this to open in your spreadsheet program
sfdx mohanc:regex:apex:checkComments2 -d ascent-develop/src/sales/channel-fundamentals/main/default/classes > REPO_comments2-status-jul-12-2022.csv
References
Apex Code Test Coverage
SELECT
Id
,IsDeleted
,CreatedDate
,CreatedById
,LastModifiedDate
,LastModifiedById
,SystemModstamp
,ApexTestClassId
,TestMethodName
,ApexClassOrTriggerId
,NumLinesCovered
,NumLinesUncovered
,Coverage
FROM ApexCodeCoverage
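The query returns raw covered/uncovered line counts; the coverage percentage itself has to be derived as NumLinesCovered / (NumLinesCovered + NumLinesUncovered). A minimal Node.js sketch over rows shaped like the fields selected above (`computeCoverage` is a hypothetical helper, not a Salesforce API):

```javascript
// Coverage % per row: NumLinesCovered / (NumLinesCovered + NumLinesUncovered)
function computeCoverage(rows) {
  return rows.map((r) => {
    const total = r.NumLinesCovered + r.NumLinesUncovered;
    return {
      classOrTrigger: r.ApexClassOrTriggerId,
      testMethod: r.TestMethodName,
      // percentage rounded to one decimal place; 0 when no lines were exercised
      pct: total === 0 ? 0 : Math.round((r.NumLinesCovered / total) * 1000) / 10,
    };
  });
}

console.log(
  computeCoverage([
    { ApexClassOrTriggerId: '01p000000000001', TestMethodName: 'testGetFruitsCount',
      NumLinesCovered: 45, NumLinesUncovered: 5 },
  ])
); // pct: 90
```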
Demo
LWC Linting using ESLint
- ESLint is a tool for identifying and reporting on patterns found in ECMAScript/JavaScript code, with the goal of making code more consistent and avoiding bugs.
- ES stands for ECMAScript
@lwc/eslint-plugin-lwc
$ npm install eslint @babel/core @babel/eslint-parser @lwc/eslint-plugin-lwc --save-dev
cat .eslintrc
{
"parser": "@babel/eslint-parser",
"parserOptions": {
"requireConfigFile": false,
"babelOptions": {
"parserOpts": {
"plugins": ["classProperties", ["decorators", { "decoratorsBeforeExport": false }]]
}
}
},
"plugins": ["@lwc/eslint-plugin-lwc"],
"rules": {
"@lwc/lwc/no-deprecated": "error",
"@lwc/lwc/valid-api": "error",
"@lwc/lwc/no-document-query": "error"
}
}
New SFDX Project
- The ESLint plugin is already included in a new SFDX project created with:
sfdx force:project:create -n ProjectName
- Run npm install to install the ESLint plugin and other dependencies
cat .eslintignore
**/lwc/**/*.css
**/lwc/**/*.html
**/lwc/**/*.json
**/lwc/**/*.svg
**/lwc/**/*.xml
**/aura/**/*.auradoc
**/aura/**/*.cmp
**/aura/**/*.css
**/aura/**/*.design
**/aura/**/*.evt
**/aura/**/*.json
**/aura/**/*.svg
**/aura/**/*.tokens
**/aura/**/*.xml
**/aura/**/*.app
.sfdx
- Create a lwc
sfdx force:lightning:component:create --type lwc -n HelloPeach
cat package.json
{
"name": "salesforce-app",
"private": true,
"version": "1.0.0",
"description": "Salesforce App",
"scripts": {
"lint": "eslint **/{aura,lwc}/**",
"test": "npm run test:unit",
"test:unit": "sfdx-lwc-jest",
"test:unit:watch": "sfdx-lwc-jest --watch",
"test:unit:debug": "sfdx-lwc-jest --debug",
"test:unit:coverage": "sfdx-lwc-jest --coverage",
"prettier": "prettier --write \"**/*.{cls,cmp,component,css,html,js,json,md,page,trigger,xml,yaml,yml}\"",
"prettier:verify": "prettier --list-different \"**/*.{cls,cmp,component,css,html,js,json,md,page,trigger,xml,yaml,yml}\"",
"postinstall": "husky install",
"precommit": "lint-staged"
},
"devDependencies": {
"@lwc/eslint-plugin-lwc": "^1.1.2",
"@prettier/plugin-xml": "^2.0.1",
"@salesforce/eslint-config-lwc": "^3.2.3",
"@salesforce/eslint-plugin-aura": "^2.0.0",
"@salesforce/eslint-plugin-lightning": "^1.0.0",
"@salesforce/sfdx-lwc-jest": "^1.1.0",
"eslint": "^8.11.0",
"eslint-plugin-import": "^2.25.4",
"eslint-plugin-jest": "^26.1.2",
"husky": "^7.0.4",
"lint-staged": "^12.3.7",
"prettier": "^2.6.0",
"prettier-plugin-apex": "^1.10.0"
},
"lint-staged": {
"**/*.{cls,cmp,component,css,html,js,json,md,page,trigger,xml,yaml,yml}": [
"prettier --write"
],
"**/{aura,lwc}/**": [
"eslint"
]
}
}
References
- Configuring ESLint
- Notes on Code Quality
- Salesforce VS Code extensions
- Official ESLint rules for Lightning Web Components (LWC)
eslint custom rules
Here we will explain how to set up a custom rule for ESLint, in two ways
Use case
- We want to warn developers about using alert() in their code
Writing the rule
- Prepare the offending code
alert('test')
- Use AST Explorer to find out the node details
- We need to check for node.callee.name === 'alert' as per the AST
- Our rule should be something like this:
  - meta: metadata about the rule
  - create: the action part of the rule
    - here we check for node.callee.name === 'alert' and report it to the developer with the message "Using alert() is not recommended", using the context object provided by ESLint
module.exports = {
meta: {
type: 'suggestion',
docs: {
description: 'Warn the alert function used in the code',
category: 'Best Practices',
recommended: true,
},
},
create: function (context) {
return {
CallExpression(node) {
if (node.callee.name === 'alert') {
context.report({
node,
message: 'Using alert() is not recommended',
});
}
},
};
},
};
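To watch this rule fire without setting up a project, you can hand it a fabricated CallExpression node and a stub context that just records reports. This is only a sketch of the rule contract — in real use, ESLint parses your source and invokes the visitor for you:

```javascript
// The rule from above
const rule = {
  meta: { type: 'suggestion' },
  create: function (context) {
    return {
      CallExpression(node) {
        if (node.callee.name === 'alert') {
          context.report({ node, message: 'Using alert() is not recommended' });
        }
      },
    };
  },
};

// Stub context that just collects messages (ESLint normally provides this object)
const reports = [];
const stubContext = { report: (r) => reports.push(r.message) };

// Fabricated AST node for alert('test'), shaped like the espree output
const node = {
  type: 'CallExpression',
  callee: { type: 'Identifier', name: 'alert' },
  arguments: [],
};

rule.create(stubContext).CallExpression(node);
console.log(reports); // [ 'Using alert() is not recommended' ]
```

For real rule testing, ESLint also ships a RuleTester utility (require('eslint').RuleTester) that parses source strings and checks the expected errors for you.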
Project Setup
Create project: custom-rules-demo
~/eslint >mkdir custom-rules-demo
~/eslint >cd custom-rules-demo
~/eslint/custom-rules-demo >npm init @eslint/config
✔ How would you like to use ESLint? · problems
✔ What type of modules does your project use? · esm
✔ Which framework does your project use? · none
✔ Does your project use TypeScript? · No / Yes
✔ Where does your code run? · browser
✔ What format do you want your config file to be in? · JavaScript
A config file was generated, but the config file itself may not follow your linting rules.
Successfully created .eslintrc.js file in /Users/mchinnappan/eslint/custom-rules-demo
Look at .eslintrc.js
~/eslint/custom-rules-demo >cat .eslintrc.js
module.exports = {
"env": {
"browser": true,
"es2021": true
},
"extends": "eslint:recommended",
"overrides": [
],
"parserOptions": {
"ecmaVersion": "latest",
"sourceType": "module"
},
"rules": {
}
}
Make a folder for rules
~/eslint/custom-rules-demo >mkdir rules
~/eslint/custom-rules-demo >touch rules/no-alert.js
Content of the rules/no-alert.js
~/eslint/custom-rules-demo >cat rules/no-alert.js
module.exports = {
meta: {
type: 'suggestion',
docs: {
description: 'Warn the alert function used in the code',
category: 'Best Practices',
recommended: true,
},
},
create: function (context) {
return {
CallExpression(node) {
if (node.callee.name === 'alert') {
context.report({
node,
message: 'Using alert() is not recommended',
});
}
},
};
},
};
Create app.js
~/eslint/custom-rules-demo >touch app.js
~/eslint/custom-rules-demo >cat app.js
console.log('hello world!');
alert('this is an alert');
Add an entry in the .eslintrc.js for our custom rule no-alert
"rules": {
"no-alert": ["warn"]
}
~/eslint/custom-rules-demo >vi .eslintrc.js
~/eslint/custom-rules-demo >cat .eslintrc.js
module.exports = {
"env": {
"browser": true,
"es2021": true
},
"extends": "eslint:recommended",
"overrides": [
],
"parserOptions": {
"ecmaVersion": "latest",
"sourceType": "module"
},
"rules": {
"no-alert": ["warn"]
}
}
Run the eslint with our custom rule
~/eslint/custom-rules-demo >eslint --rulesdir rules app.js
Results
/Users/mchinnappan/eslint/custom-rules-demo/app.js
1:28 error Extra semicolon semi
2:1 warning Using alert() is not recommended no-alert
2:26 error Extra semicolon semi
✖ 3 problems (2 errors, 1 warning)
2 errors and 0 warnings potentially fixable with the `--fix` option.
Fixing and running eslint
vi app.js
cat app.js
console.log('hello world!')
- Running the modified code
eslint --rulesdir rules app.js
Taking care of assertions like Jest's expect(1).toBe(1)
cat .eslintrc.js
module.exports = {
"env": {
"browser": true,
"es2021": true,
"jest": true
},
"extends": ["eslint:recommended"],
"overrides": [
],
"parserOptions": {
"ecmaVersion": "latest",
"sourceType": "module"
},
"plugins": ["jest"],
"rules": {
"no-alert": ["warn"],
"no-literal-expect": ["warn"]
}
}
cat rules/no-literal-expect.js
module.exports = {
meta: {
type: 'problem',
docs: {
description: 'disallow hardcoded expect statements',
category: 'Possible Errors',
recommended: true
},
schema: []
},
create: function (context) {
return {
CallExpression(node) {
if (
node.callee.type === "Identifier" &&
node.callee.name === "expect" &&
node.arguments.length === 1 &&
node.arguments[0].value === 1
) {
context.report({
node,
message: `${context.id}: Using expect(1).toBe(1) is not recommended`,
});
}
},
};
}
}
Testing
cat app.js
alert('hello')
expect(1).toBe(1)
const x = 100
expect(x).toBe(100)
eslint --rulesdir=rules app.js
1:1 warning Using alert() is not recommended no-alert
2:1 warning no-literal-expect: Using expect(1).toBe(1) is not recommended no-literal-expect
✖ 2 problems (0 errors, 2 warnings)
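Note that the rule above only fires when the argument to expect() is the literal 1, so expect(x).toBe(100) sails through. A slightly more general check — a sketch, not part of any published plugin — would flag expect() called with any literal value:

```javascript
// Generalized check: flag expect(<any literal>), e.g. expect(1), expect('a'), expect(true),
// but not expect(x) where x is a variable
function isLiteralExpect(node) {
  return (
    node.type === 'CallExpression' &&
    node.callee.type === 'Identifier' &&
    node.callee.name === 'expect' &&
    node.arguments.length === 1 &&
    node.arguments[0].type === 'Literal'
  );
}

// Fabricated nodes shaped like espree output
const literalCall = {
  type: 'CallExpression',
  callee: { type: 'Identifier', name: 'expect' },
  arguments: [{ type: 'Literal', value: 1 }],
};
const identifierCall = {
  type: 'CallExpression',
  callee: { type: 'Identifier', name: 'expect' },
  arguments: [{ type: 'Identifier', name: 'x' }],
};

console.log(isLiteralExpect(literalCall));    // true  — the assertion is hardcoded
console.log(isLiteralExpect(identifierCall)); // false — expect(x) is fine
```

Dropping this predicate into the CallExpression visitor in place of the `value === 1` comparison makes the rule catch hardcoded assertions in general.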
Using plugin eslint-plugin-mohanc-eslint-rules
Create a new project
- Note: In an existing project you do not need this step
mkdir eslint-testing
cd eslint-testing
npm init -y
Wrote to /private/tmp/eslint-testing/package.json:
{
"name": "eslint-testing",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC"
}
Install the plugin for the custom rules
- Note: This step is required for an existing project as well
npm i eslint-plugin-mohanc-eslint-rules
added 1 package, and audited 2 packages in 922ms
found 0 vulnerabilities
Look at the package.json
cat package.json
{
"name": "eslint-testing",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": {
"eslint-plugin-mohanc-eslint-rules": "^0.0.2"
}
}
Create .eslintrc
- NOTE: In an existing project just add/update the line specific to "plugins"
vi .eslintrc
cat .eslintrc
{
"env": {
"jest": true
},
"parserOptions": {
"ecmaVersion": 2021
},
"rules": {
"mohanc-eslint-rules/no-alert": "warn",
"mohanc-eslint-rules/no-literal-expect": "warn"
},
"plugins": ["mohanc-eslint-rules"]
}
Create the sample code (app.js) to test these custom rules
vi app.js
cat app.js
alert('hello');
expect(1).toBe(1);
const x = 100;
expect(x).toBe(100);
Run the eslint to lint this file (app.js)
/tmp/eslint-testing >npx eslint app.js
/private/tmp/eslint-testing/app.js
1:1 warning Using alert() is not recommended mohanc-eslint-rules/no-alert
2:1 warning mohanc-eslint-rules/no-literal-expect: Using expect(1).toBe(1) is not recommended mohanc-eslint-rules/no-literal-expect
✖ 2 problems (0 errors, 2 warnings)
How it shows in VS Code (click on the image to view it in full)
- Demo
Deployments
Tools to get the deployment log of an org
SELECT
Id
,SystemModstamp
,NumberComponentsDeployed
,NumberComponentErrors
,numberTestErrors
,numberTestsCompleted
,numberTestsTotal
,runTestsEnabled
,rollbackOnError
,CheckOnly
,completedDate
,createdDate
,ErrorStatusCode
,Status
,ErrorMessage
,ignoreWarnings
,startDate
,stateDetail
FROM DeployRequest
ORDER BY CompletedDate DESC
LIMIT 10
sfdx mohanc:query:tooling -q ~/.soql/deploymentStatus.soql -u mohan.chinnappan.n.sel@gmail.com
sfdx mohanc:query:tooling -q ~/.soql/deploymentStatus.soql -u mohan.chinnappan.n.sel@gmail.com | bat
───────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ STDIN
───────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ "attributes","Id","SystemModstamp","NumberComponentsDeployed","NumberComponentErrors","NumberTestErrors","NumberTestsCompleted","NumberTestsTo
│ tal","RunTestsEnabled","RollbackOnError","CheckOnly","CompletedDate","CreatedDate","ErrorStatusCode","Status","ErrorMessage","IgnoreWarnings",
│ "StartDate","StateDetail"
2 │ "{""type"":""DeployRequest"",""url"":""/services/data/v56.0/tooling/sobjects/DeployRequest/0Af4x00000Y402zCAB""}","0Af4x00000Y402zCAB","2023-0
│ 1-21T10:54:40.000+0000",1,0,0,0,0,false,true,false,"2023-01-21T10:54:40.000+0000","2023-01-21T10:51:06.000+0000",,"Succeeded",,false,"2023-01-
│ 21T10:51:06.000+0000",
3 │ "{""type"":""DeployRequest"",""url"":""/services/data/v56.0/tooling/sobjects/DeployRequest/0Af4x00000Y33AkCAJ""}","0Af4x00000Y33AkCAJ","2023-0
│ 1-18T17:39:36.000+0000",1,0,0,0,0,false,true,true,"2023-01-18T17:39:36.000+0000","2023-01-18T17:39:35.000+0000",,"Succeeded",,false,"2023-01-1
│ 8T17:39:35.000+0000",
4 │ "{""type"":""DeployRequest"",""url"":""/services/data/v56.0/tooling/sobjects/DeployRequest/0Af4x00000Y31jECAR""}","0Af4x00000Y31jECAR","2023-0
│ 1-18T17:31:31.000+0000",1,0,0,0,0,false,true,true,"2023-01-18T17:31:31.000+0000","2023-01-18T17:31:30.000+0000",,"Succeeded",,false,"2023-01-1
│ 8T17:31:30.000+0000",
5 │ "{""type"":""DeployRequest"",""url"":""/services/data/v56.0/tooling/sobjects/DeployRequest/0Af4x00000Y2wuRCAR""}","0Af4x00000Y2wuRCAR","2023-0
│ 1-18T14:45:47.000+0000",1,0,0,0,0,false,true,true,"2023-01-18T14:45:47.000+0000","2023-01-18T14:45:42.000+0000",,"Succeeded",,false,"2023-01-1
│ 8T14:45:42.000+0000",
6 │ "{""type"":""DeployRequest"",""url"":""/services/data/v56.0/tooling/sobjects/DeployRequest/0Af4x00000Y2wu7CAB""}","0Af4x00000Y2wu7CAB","2023-0
│ 1-18T14:31:43.000+0000",1,0,0,0,0,false,true,true,"2023-01-18T14:31:43.000+0000","2023-01-18T14:31:35.000+0000",,"Succeeded",,false,"2023-01-1
│ 8T14:31:37.000+0000",
7 │ "{""type"":""DeployRequest"",""url"":""/services/data/v56.0/tooling/sobjects/DeployRequest/0Af4x00000Y2wrhCAB""}","0Af4x00000Y2wrhCAB","2023-0
│ 1-18T14:30:13.000+0000",1,120,0,0,0,false,true,true,"2023-01-18T14:30:13.000+0000","2023-01-18T14:30:07.000+0000",,"Failed",,false,"2023-01-18
│ T14:30:07.000+0000",
8 │ "{""type"":""DeployRequest"",""url"":""/services/data/v56.0/tooling/sobjects/DeployRequest/0Af4x00000Y2wpRCAR""}","0Af4x00000Y2wpRCAR","2023-0
│ 1-18T14:27:40.000+0000",1,120,0,0,0,false,true,false,"2023-01-18T14:27:40.000+0000","2023-01-18T14:27:29.000+0000",,"Failed",,false,"2023-01-1
│ 8T14:27:35.000+0000",
9 │ "{""type"":""DeployRequest"",""url"":""/services/data/v56.0/tooling/sobjects/DeployRequest/0Af4x00000Y2wp7CAB""}","0Af4x00000Y2wp7CAB","2023-0
│ 1-18T14:24:49.000+0000",1,0,0,0,0,false,true,false,"2023-01-18T14:24:49.000+0000","2023-01-18T14:24:43.000+0000",,"Succeeded",,false,"2023-01-
│ 18T14:24:44.000+0000",
10 │ "{""type"":""DeployRequest"",""url"":""/services/data/v56.0/tooling/sobjects/DeployRequest/0Af4x00000Y2wmwCAB""}","0Af4x00000Y2wmwCAB","2023-0
│ 1-18T14:22:54.000+0000",1,120,0,0,0,false,true,false,"2023-01-18T14:22:54.000+0000","2023-01-18T14:22:44.000+0000",,"Failed",,false,"2023-01-1
│ 8T14:22:44.000+0000",
:
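Once exported (for example as JSON with -f json), these rows are easy to post-process. A Node.js sketch that splits check-only validations from real deployments and counts failures, using the field names selected in the query above (`summarizeDeploys` is a hypothetical helper):

```javascript
// Summarize DeployRequest rows: real deploys vs. check-only validations, and failures
function summarizeDeploys(rows) {
  const summary = { deploys: 0, validations: 0, failed: 0 };
  for (const r of rows) {
    if (r.CheckOnly) summary.validations += 1;
    else summary.deploys += 1;
    if (r.Status === 'Failed') summary.failed += 1;
  }
  return summary;
}

console.log(
  summarizeDeploys([
    { CheckOnly: false, Status: 'Succeeded' },
    { CheckOnly: true, Status: 'Succeeded' },
    { CheckOnly: true, Status: 'Failed' },
  ])
); // { deploys: 1, validations: 2, failed: 1 }
```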
Using SF-LAND Data Viz to view the deployment log
Delta Deployments
Topics
- Assumptions
- Comparing - Delta vs full deployment
- Delta deployment logs
- Recommendation
- Single Command to Delta Deployment
- Detailed Steps
- References and more tools
Assumptions
- Version control should be the single-source of truth for the source code
- Version control should be kept up-to-date at any point in time
Delta vs full deployment
Full Deployment | Delta |
---|---|
Adheres to the principles of container based deployments | You only deploy the delta - not deploying untouched code |
Takes more deployment time | Quicker Deployments |
Simple toolset | Requires smarter toolset |
Simple to keep track of deployments | Requires good documentation including commit-ids used for the deployments - Logs as shown below |
Delta Deployment Logs - example
Deployment Date | Approval Ref# | Environment | From Commit-Id SHA-1 | To Commit-Id SHA-1 | Comments |
---|---|---|---|---|---|
Dec-10-2022 | #123456 | UAT | 839861d | 0d306cc | 12 components |
Recommendation
- Each approach has a place in your deployment process. Start with a couple of full deployments and follow them with delta deployments
How do we prepare for delta deployment?
Single command to do the delta deployment
- Download the script from here
- deltaDeploy
Demo
~/deltatest [main] >bash deltaDeploy.sh
-----------------------------------
Delta Deployment
-----------------------------------
===== git log =====
29dff0a|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|cleanup
15ca632|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|delta image added
3456d9f|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|delta.md updated
474669c|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|Fruit Mango added
73ea22c|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|Fruit Mango added
171a81a|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|pineapple fruit classes removed
faa2d86|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|3 fruit classes added
fe12d89|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|init
-----------------------------------
FROM (Enter SHA1 for the commit FROM which we need to do the delta deployment, default: HEAD^): fe12d89
TO (Enter SHA1 for the commit TO which we need to do the delta deployment, default: HEAD): 29dff0a
----------------------------------- fe12d89 to 29dff0a -----------------------------------
----------------------------------- Preparing delta packages -----------------------------------
{
"error": null,
"output": ".",
"success": true,
"warnings": []
}
<---- SUCCESS: true
Check only? (Enter y/n, default: y): y
<---- Check only: -c
Run TestClasses only? (Enter y/n, default: y): n
<---- testlevel
Run deletion pre or post? (Enter pre or post, default: post):
post
<---- pre/post: post
----------------------------------- Deploying delta packages -----------------------------------
sfdx force:source:deploy -x package/package.xml --postdestructivechanges destructiveChanges/destructiveChanges.xml --verbose --loglevel TRACE
Deploying v55.0 metadata to mohan.chinnappan.n_ea2@gmail.com using the v56.0 SOAP API
Deploy ID: 0Af3h00000Rz0dNCAR
DEPLOY PROGRESS | ████████████████████████████████████████ | 3/3 Components
=== Deployed Source
FULL NAME TYPE PROJECT PATH
────────── ───────── ──────────────────────────────────────────────────────
HelloMango ApexClass force-app/main/default/classes/HelloMango.cls
HelloMango ApexClass force-app/main/default/classes/HelloMango.cls-meta.xml
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls-meta.xml
HelloPear ApexClass force-app/main/default/classes/HelloPear.cls
HelloPear ApexClass force-app/main/default/classes/HelloPear.cls-meta.xml
Deploy Succeeded.
Full steps
Let us create the DX project 'deltatest' to demonstrate the concepts
~ >sfdx force:project:create -n deltatest
target dir = /Users/mchinnappan
create deltatest/config/project-scratch-def.json
create deltatest/README.md
create deltatest/sfdx-project.json
create deltatest/.husky/pre-commit
create deltatest/.vscode/extensions.json
create deltatest/.vscode/launch.json
create deltatest/.vscode/settings.json
create deltatest/force-app/main/default/lwc/.eslintrc.json
create deltatest/force-app/main/default/aura/.eslintrc.json
create deltatest/scripts/soql/account.soql
create deltatest/scripts/apex/hello.apex
create deltatest/.eslintignore
create deltatest/.forceignore
create deltatest/.gitignore
create deltatest/.prettierignore
create deltatest/.prettierrc
create deltatest/jest.config.js
create deltatest/package.json
Commit into git
~ >cd deltatest
~/deltatest >git init .
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: git branch -m <name>
Initialized empty Git repository in /Users/mchinnappan/deltatest/.git/
~/deltatest [master] >git branch -m main
~/deltatest [main] >git add -A
~/deltatest [main] >git commit -m init
[main (root-commit) fe12d89] init
18 files changed, 255 insertions(+)
create mode 100644 .eslintignore
create mode 100755 .forceignore
create mode 100644 .gitignore
create mode 100755 .husky/pre-commit
create mode 100755 .prettierignore
create mode 100755 .prettierrc
create mode 100644 .vscode/extensions.json
create mode 100644 .vscode/launch.json
create mode 100644 .vscode/settings.json
create mode 100644 README.md
create mode 100644 config/project-scratch-def.json
create mode 100644 force-app/main/default/aura/.eslintrc.json
create mode 100644 force-app/main/default/lwc/.eslintrc.json
create mode 100644 jest.config.js
create mode 100644 package.json
create mode 100644 scripts/apex/hello.apex
create mode 100644 scripts/soql/account.soql
create mode 100644 sfdx-project.json
Let us see the git commit log
~/deltatest [main] >git lg2
* fe12d89 - Sun, 8 Jan 2023 12:59:54 -0500 (2 minutes ago) (HEAD -> main)
init - mohan-chinnappan-n
Let us create 3 classes HelloPeach HelloPear and HelloPineapple
~/deltatest [main] >sfdx force:apex:class:create -n HelloPeach -d force-app/main/default/classes
target dir = /Users/mchinnappan/deltatest/force-app/main/default/classes
create force-app/main/default/classes/HelloPeach.cls
create force-app/main/default/classes/HelloPeach.cls-meta.xml
~/deltatest [main] >sfdx force:apex:class:create -n HelloPear -d force-app/main/default/classes
target dir = /Users/mchinnappan/deltatest/force-app/main/default/classes
create force-app/main/default/classes/HelloPear.cls
create force-app/main/default/classes/HelloPear.cls-meta.xml
~/deltatest [main] >sfdx force:apex:class:create -n HelloPineapple -d force-app/main/default/classes
target dir = /Users/mchinnappan/deltatest/force-app/main/default/classes
create force-app/main/default/classes/HelloPineapple.cls
create force-app/main/default/classes/HelloPineapple.cls-meta.xml
Let us commit this change into git
~/deltatest [main] >git add -A
~/deltatest [main] >git commit -m '3 fruit classes added'
[main faa2d86] 3 fruit classes added
6 files changed, 46 insertions(+)
create mode 100644 force-app/main/default/classes/HelloPeach.cls
create mode 100644 force-app/main/default/classes/HelloPeach.cls-meta.xml
create mode 100644 force-app/main/default/classes/HelloPear.cls
create mode 100644 force-app/main/default/classes/HelloPear.cls-meta.xml
create mode 100644 force-app/main/default/classes/HelloPineapple.cls
create mode 100644 force-app/main/default/classes/HelloPineapple.cls-meta.xml
Let us see the git log for these changes
~/deltatest [main] >git lg3
* faa2d86 - Sun, 8 Jan 2023 13:08:07 -0500 (12 seconds ago) (committed: Sun, 8 Jan 2023 13:08:07 -0500) (HEAD -> main)
| 3 fruit classes added
* fe12d89 - Sun, 8 Jan 2023 12:59:54 -0500 (8 minutes ago) (committed: Sun, 8 Jan 2023 12:59:54 -0500)
init
Using git diff to view the diff between these 2 SHAs fe12d89 and faa2d86
~/deltatest [main] >git diff fe12d89...faa2d86
Let us remove the HelloPineapple class and commit it into git
~/deltatest [main] >git add -A
~/deltatest [main] >git commit -m 'pineapple fruit classes removed'
[main 171a81a] pineapple fruit classes removed
2 files changed, 10 deletions(-)
delete mode 100644 force-app/main/default/classes/HelloPineapple.cls
delete mode 100644 force-app/main/default/classes/HelloPineapple.cls-meta.xml
Let us see the git logs
~/deltatest [main] >git lg3
* 171a81a - Sun, 8 Jan 2023 13:10:14 -0500 (4 seconds ago) (committed: Sun, 8 Jan 2023 13:10:14 -0500) (HEAD -> main)
| pineapple fruit classes removed
* faa2d86 - Sun, 8 Jan 2023 13:08:07 -0500 (2 minutes ago) (committed: Sun, 8 Jan 2023 13:08:07 -0500)
| 3 fruit classes added
* fe12d89 - Sun, 8 Jan 2023 12:59:54 -0500 (10 minutes ago) (committed: Sun, 8 Jan 2023 12:59:54 -0500)
init
Let us view the diff between these SHAs faa2d86 and fe12d89
~/deltatest [main] >git diff faa2d86..fe12d89
~/deltatest [main] >git diff 171a81a..faa2d86
Let us add the new class HelloMango
~/deltatest [main] >sfdx force:apex:class:create -n HelloMango -d force-app/main/default/classes
target dir = /Users/mchinnappan/deltatest/force-app/main/default/classes
create force-app/main/default/classes/HelloMango.cls
create force-app/main/default/classes/HelloMango.cls-meta.xml
Commit it into git
~/deltatest [main] >git add -A
~/deltatest [main] >git commit -m 'Fruit Mango added'
[main 73ea22c] Fruit Mango added
2 files changed, 566 insertions(+)
create mode 100644 force-app/main/default/classes/HelloMango.cls
create mode 100644 force-app/main/default/classes/HelloMango.cls-meta.xml
Let us view the git log
~/deltatest [main] >git lg
* 474669c - (64 seconds ago) Fruit Mango added - mohan-chinnappan-n (HEAD -> main)
* 73ea22c - (2 minutes ago) Fruit Mango added - mohan-chinnappan-n
* 171a81a - (17 minutes ago) pineapple fruit classes removed - mohan-chinnappan-n
* faa2d86 - (19 minutes ago) 3 fruit classes added - mohan-chinnappan-n
* fe12d89 - (27 minutes ago) init - mohan-chinnappan-n
Let us run the sgd:source:delta to create package.xml and destructiveChanges.xml
- Install this plugin using
sfdx plugins:install sfdx-git-delta
- Check the plugin install
sfdx plugins
sfdx-git-delta 5.8.0 <------ from Sébastien Colladon
sfdx-mohanc-plugins 0.0.222
- This plugin generates the sfdx content in source format and the destructive changes from two git commits
- Our 'from' SHA is faa2d86, our 'to' SHA is 474669c, and the output dir is the current folder (.)
~/deltatest [main] >sfdx sgd:source:delta -f faa2d86 -t 474669c -d -o .
{
"error": null,
"output": ".",
"success": true,
"warnings": []
}
Additions
- Contains the metadata that has been added or changed and needs to be deployed to the org
~/deltatest [main] >cat package/package.xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
<members>HelloMango</members>
<name>ApexClass</name>
</types>
<version>55.0</version>
</Package>
- Deploy this by
sfdx force:source:deploy -x package/package.xml -c --verbose --testlevel RunLocalTests --loglevel DEBUG
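The manifest sgd produces is a standard package.xml; the essence of the transform — from changed member names to manifest XML — can be sketched as follows (a simplified illustration of the idea, not sfdx-git-delta's actual code):

```javascript
// Build a package.xml body from a map of metadata type -> changed members
function buildPackageXml(typesToMembers, apiVersion = '55.0') {
  const types = Object.entries(typesToMembers)
    .map(
      ([name, members]) =>
        '    <types>\n' +
        members.map((m) => `        <members>${m}</members>\n`).join('') +
        `        <name>${name}</name>\n` +
        '    </types>\n'
    )
    .join('');
  return (
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<Package xmlns="http://soap.sforce.com/2006/04/metadata">\n' +
    types +
    `    <version>${apiVersion}</version>\n` +
    '</Package>\n'
  );
}

// Reproduces the shape of the manifest shown above for the HelloMango change
console.log(buildPackageXml({ ApexClass: ['HelloMango'] }));
```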
Deletions
- Contains the metadata that has been removed and needs to be deleted from the org
~/deltatest [main] >cat destructiveChanges/destructiveChanges.xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
<members>HelloPineapple</members>
<name>ApexClass</name>
</types>
<version>55.0</version>
</Package>
- Perform deletion by
sfdx force:mdapi:deploy -d destructiveChanges
- Note: this package.xml is an empty package manifest required by the DX CLI for the deployment of destructive changes
tree destructiveChanges/
destructiveChanges/
├── destructiveChanges.xml
└── package.xml
my .gitconfig
[alias]
lg = lg1
lg1 = lg1-specific --all
lg2 = lg2-specific --all
lg3 = lg3-specific --all
lg1-specific = log --graph --abbrev-commit --decorate --format=format:'%C(bold blue)%h%C(reset) - %C(bold green)(%ar)%C(reset) %C(white)%s%C(reset) %C(dim white)- %an%C(reset)%C(auto)%d%C(reset)'
lg2-specific = log --graph --abbrev-commit --decorate --format=format:'%C(bold blue)%h%C(reset) - %C(bold cyan)%aD%C(reset) %C(bold green)(%ar)%C(reset)%C(auto)%d%C(reset)%n'' %C(white)%s%C(reset) %C(dim white)- %an%C(reset)'
lg3-specific = log --graph --abbrev-commit --decorate --format=format:'%C(bold blue)%h%C(reset) - %C(bold cyan)%aD%C(reset) %C(bold green)(%ar)%C(reset) %C(bold cyan)(committed: %cD)%C(reset) %C(auto)%d%C(reset)%n'' %C(white)%s%C(reset)%n'' %C(dim white)- %an <%ae> %C(reset) %C(dim white)(committer: %cn <%ce>)%C(reset)'
References
Delta deployment
Demo of Org query compare tool
Github Actions
Examples
Running org:viz using Github Actions
name: onDemand-org-viz
run-name: run org viz
on: [workflow_dispatch]
jobs:
run-org-viz:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: sfdx-actions/setup-sfdx@v1
- run: echo ${{ github.actor }} is running this...
- run: echo ${{ secrets.SEL_AUTH_JSON_BASE64 }} > ./mySecret.json.txt
- run: cat ./mySecret.json.txt | base64 -d > ./mySecret.json
- run: echo "SECRET_JSON=$(jq -c . < ./mySecret.json)" >> $GITHUB_ENV
- run: echo '${{ fromJson(env.SECRET_JSON).result.id }}'
- run: echo FN=slide__$RANDOM >> $GITHUB_ENV
- name: Install mohanc sfdx plugins
run: |
echo 'y' | sfdx plugins:install sfdx-mohanc-plugins
sfdx plugins
- name: Process secrets
run: |
echo ${{ fromJson(env.SECRET_JSON).result.sfdxAuthUrl }} > ./mySecret.url.txt
sfdx auth:sfdxurl:store -f ./mySecret.url.txt -s -a targetOrg
- name: Run Org Viz
run: sfdx mohanc:org:viz -u ${{ fromJson(env.SECRET_JSON).result.username }}
- name: setup git config
run: |
# setup the username and email. using 'GitHub Actions Bot' with no email by default
git config user.name "GitHub Actions Bot"
git config user.email "<>"
- name: Commit files
run: |
git pull origin HEAD
git add Org.json Org.svg
git commit -m "add org viz files"
git push origin HEAD
PMD workflow bailing out when violations count is not zero
- uses: pmd/pmd-github-action@v1
id: pmd
with:
version: '6.40.0'
sourcePath: 'mc2Project/force-app/main/default/classes'
rulesets: 'pmd/rulesets/apex_ruleset.xml'
- name: Fail build if there are violations
if: steps.pmd.outputs.violations != 0
run: exit 1
- Detailed
name: on-push-deployment-develop
run-name: Push Check Deploy - develop
on: [workflow_dispatch]
jobs:
check-deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- run: npm install sfdx-cli -g
- run: echo "${{ secrets.SALESFORCE_JWT_SECRET_KEY }}" > server.key
- run: sfdx force:auth:jwt:grant --clientid=${{ secrets.SALESFORCE_CONSUMER_KEY }} --jwtkeyfile=server.key --username=${{ secrets.SALESFORCE_DEVHUB_USERNAME }} --setdefaultdevhubusername
- name: Install mohanc sfdx plugin
run: |
echo 'y' | sfdx plugins:install sfdx-mohanc-plugins
sfdx plugins
- name: Run Org Viz
run: sfdx mohanc:org:viz -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }}
- uses: actions/setup-java@v3
with:
distribution: 'zulu' # See 'Supported distributions' for available options
java-version: '17'
- run: |
cd src
java HelloWorld.java
cd ..
- uses: pmd/pmd-github-action@v1
id: pmd
with:
version: '6.40.0'
sourcePath: 'mc2Project/force-app/main/default/classes'
rulesets: 'pmd/rulesets/apex_ruleset.xml'
- name: Fail build if there are violations
if: steps.pmd.outputs.violations != 0
run: exit 1
- name: Upload SARIF file
uses: github/codeql-action/upload-sarif@v2
with:
sarif_file: pmd-report.sarif
- name: Source DeployCheck Deploy
run: |
cd mc2Project
sfdx force:source:deploy -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -p force-app -c --verbose
sfdx mohanc:tooling:query -q ../.soql/deploymentStatus.soql -u ${{ secrets.SALESFORCE_DEVHUB_USERNAME }} -f json > ./deploymentStatus.json
cat ./deploymentStatus.json
- name: setup git config
run: |
# setup the username and email. using 'GitHub Actions Bot' with no email by default
git config user.name "GitHub Actions Bot"
git config user.email "<>"
- name: Commit files
run: |
git add Org.json Org.svg
git commit -m "updated Org Viz files"
git push origin HEAD
Validating changes
Deployment validations using force:source commands
Log in to the org
sfdx force:auth:web:login
Successfully authorized mohan.chinnappan.n_ea2@gmail.com with org ID 00D3h000007R1LuEAK
Create DX Project (say: validate1)
mkdir ~/.prj
cd ~/.prj
sfdx force:project:create -n validate1
ls ~/.prj/validate1
cd validate1
ls
README.md force-app package.json sfdx-project.json
config jest.config.js scripts
Retrieve (for example ApexClass)
sfdx force:source:retrieve -u mohan.chinnappan.n_ea2@gmail.com -m ApexClass
Preparing retrieve request... done
=== Retrieved Source
FULL NAME TYPE PROJECT PATH
─────────────────────────────────────── ───────── ───────────────────────────────────────────────────────────────────────────────────
AccessToken ApexClass force-app/main/default/classes/AccessToken.cls
...
List the retrieved
ls force-app/main/default/classes
AccessToken.cls LightningLoginFormController.cls-meta.xml
AccessToken.cls-meta.xml LightningLoginFormControllerTest.cls
...
Validate for quick deploy (check-only)
sfdx force:source:deploy -u mohan.chinnappan.n_ea2@gmail.com -m ApexClass -c
*** Deploying with SOAP API v55.0 ***
Deploy ID: 0Af3h00000QZnmZCAT
DEPLOY PROGRESS | ████████████████████████████████████████ | 105/105 Components
Successfully validated the deployment. 105 components deployed and 0 tests run.
Can I do it in a single script?
- Yes
Bash script
- Script
- make it executable
chmod +x validate.sh
#!/bin/bash
##--------------------------------------##
# validate.sh
# Purpose: Validate the given metadata type(s) from a source org against a target org
# - mohan chinnappan jul-19-2022 - creation
# - mohan chinnappan aug-12-2022 - added apiversion support
# - <your_name> date - <your contributions>
##-------------------------------------##

## Util functions
sf_login() {
    case $1 in
        SBX)
            echo "=== login into a Sandbox ... ==="
            sfdx force:auth:web:login -r https://test.salesforce.com
            ;;
        PROD)
            echo "=== login into a PROD or Dev Edition ... ==="
            sfdx force:auth:web:login -r https://login.salesforce.com
            ;;
    esac
}

help() {
    if [[ "$1" -ne 7 ]]
    then
        echo "=== Validate script ==="
        echo "usage: ./validate.sh <SRC PROD|SBX> <PRJ_Name> <SRC_UserName> <MetadataTypes - comma separated> <TARGET PROD|SBX> <TARGET_UserName> <apiversion>"
        echo "Example: ./validate.sh PROD validate2 mohan.chinnappan.n_ea2@gmail.com ApexClass PROD mohan.chinnappan.n_ea2@gmail.com 55.0"
        exit 0
    fi
}

help $#

mkdir ~/.prj
cd ~/.prj

# Read command-line args
sbx_prod=$1
project=$2
un=$3
mt=$4
tsbx_prod=$5
tun=$6
apiversion=$7

echo "=== Login into the Source org ==="
sf_login $sbx_prod

# Remove the project folder if it is already there
[ -e ~/.prj/$project ] && rm -fr ~/.prj/$project

echo "=== Creating the project $project ... ==="
sfdx force:project:create -n $project
cd $project
ls -l

echo "=== Retrieving source for the metadata type(s): $mt ... ==="
sfdx force:source:retrieve -u $un -m $mt
ls -l

echo "=== Login into the Target org ==="
sf_login $tsbx_prod

echo "=== Validating the source for the quick deployment ... ==="
sfdx force:source:deploy -u $tun -m $mt -c -l RunLocalTests --apiversion $apiversion

echo "=== Opening the org for checking the deployment status ... ==="
sfdx force:org:open -u $tun -p lightning/setup/DeployStatus/home
Demo
=== Validate script ===
usage: ./validate.sh <SRC PROD|SBX> <PRJ_Name> <SRC_UserName> <MetadataTypes - comma separated> <TARGET PROD|SBX> <TARGET_UserName>
Example: ./validate.sh PROD validate2 mohan.chinnappan.n_ea2@gmail.com ApexClass PROD mohan.chinnappan.n_ea2@gmail.com
./validate.sh PROD validate2 mohan.chinnappan.n_ea2@gmail.com ApexClass PROD mohan.chinnappan.n_ea2@gmail.com
mkdir: /Users/mchinnappan/.prj: File exists
=== Login into the Source org ===
Successfully authorized mohan.chinnappan.n_ea2@gmail.com with org ID 00D3h000007R1LuEAK
=== Creating the project validate2 ... ===
target dir = /Users/mchinnappan/.prj
identical validate2/config/project-scratch-def.json
identical validate2/README.md
identical validate2/sfdx-project.json
identical validate2/.husky/pre-commit
identical validate2/.vscode/extensions.json
identical validate2/.vscode/launch.json
identical validate2/.vscode/settings.json
identical validate2/force-app/main/default/lwc/.eslintrc.json
identical validate2/force-app/main/default/aura/.eslintrc.json
identical validate2/scripts/soql/account.soql
identical validate2/scripts/apex/hello.apex
identical validate2/.eslintignore
identical validate2/.forceignore
identical validate2/.gitignore
identical validate2/.prettierignore
identical validate2/.prettierrc
identical validate2/jest.config.js
identical validate2/package.json
total 32
-rw-r--r-- 1 mchinnappan staff 1251 Jul 19 17:07 README.md
drwxr-xr-x 3 mchinnappan staff 96 Jul 19 17:07 config
drwxr-xr-x 3 mchinnappan staff 96 Jul 19 17:07 force-app
-rw-r--r-- 1 mchinnappan staff 170 Jul 19 17:07 jest.config.js
-rw-r--r-- 1 mchinnappan staff 1386 Jul 19 17:07 package.json
drwxr-xr-x 4 mchinnappan staff 128 Jul 19 17:07 scripts
-rw-r--r-- 1 mchinnappan staff 217 Jul 19 17:07 sfdx-project.json
=== Retrieving source for the metadata type(s): ApexClass ... ===
Preparing retrieve request... done
=== Retrieved Source
FULL NAME TYPE PROJECT PATH
─────────────────────────────────────── ───────── ───────────────────────────────────────────────────────────────────────────────────
AccessToken ApexClass force-app/main/default/classes/AccessToken.cls
AccessToken ApexClass force-app/main/default/classes/AccessToken.cls-meta.xml
AccountController ApexClass force-app/main/default/classes/AccountController.cls
AccountController ApexClass force-app/main/default/classes/AccountController.cls-meta.xml
AccountRESTController ApexClass force-app/main/default/classes/AccountRESTController.cls
AccountRESTController ApexClass force-app/main/default/classes/AccountRESTController.cls-meta.xml
ApexClassForTests ApexClass force-app/main/default/classes/ApexClassForTests.cls
ApexClassForTests ApexClass force-app/main/default/classes/ApexClassForTests.cls-meta.xml
ApexParser ApexClass force-app/main/default/classes/ApexParser.cls
ApexParser ApexClass force-app/main/default/classes/ApexParser.cls-meta.xml
ApexStepData ApexClass force-app/main/default/classes/ApexStepData.cls
ApexStepData ApexClass force-app/main/default/classes/ApexStepData.cls-meta.xml
AsyncExecutionExample ApexClass force-app/main/default/classes/AsyncExecutionExample.cls
AsyncExecutionExample ApexClass force-app/main/default/classes/AsyncExecutionExample.cls-meta.xml
AutocreatedRegHandler1618428423355 ApexClass force-app/main/default/classes/AutocreatedRegHandler1618428423355.cls
AutocreatedRegHandler1618428423355 ApexClass force-app/main/default/classes/AutocreatedRegHandler1618428423355.cls-meta.xml
ChangePasswordController ApexClass force-app/main/default/classes/ChangePasswordController.cls
ChangePasswordController ApexClass force-app/main/default/classes/ChangePasswordController.cls-meta.xml
ChangePasswordControllerTest ApexClass force-app/main/default/classes/ChangePasswordControllerTest.cls
ChangePasswordControllerTest ApexClass force-app/main/default/classes/ChangePasswordControllerTest.cls-meta.xml
ClassDiagramCreatorCtrl ApexClass force-app/main/default/classes/ClassDiagramCreatorCtrl.cls
ClassDiagramCreatorCtrl ApexClass force-app/main/default/classes/ClassDiagramCreatorCtrl.cls-meta.xml
CodeMetrics ApexClass force-app/main/default/classes/CodeMetrics.cls
CodeMetrics ApexClass force-app/main/default/classes/CodeMetrics.cls-meta.xml
CommunitiesLandingController ApexClass force-app/main/default/classes/CommunitiesLandingController.cls
CommunitiesLandingController ApexClass force-app/main/default/classes/CommunitiesLandingController.cls-meta.xml
CommunitiesLandingControllerTest ApexClass force-app/main/default/classes/CommunitiesLandingControllerTest.cls
CommunitiesLandingControllerTest ApexClass force-app/main/default/classes/CommunitiesLandingControllerTest.cls-meta.xml
CommunitiesLoginController ApexClass force-app/main/default/classes/CommunitiesLoginController.cls
CommunitiesLoginController ApexClass force-app/main/default/classes/CommunitiesLoginController.cls-meta.xml
CommunitiesLoginControllerTest ApexClass force-app/main/default/classes/CommunitiesLoginControllerTest.cls
CommunitiesLoginControllerTest ApexClass force-app/main/default/classes/CommunitiesLoginControllerTest.cls-meta.xml
CommunitiesSelfRegConfirmController ApexClass force-app/main/default/classes/CommunitiesSelfRegConfirmController.cls
CommunitiesSelfRegConfirmController ApexClass force-app/main/default/classes/CommunitiesSelfRegConfirmController.cls-meta.xml
CommunitiesSelfRegConfirmControllerTest ApexClass force-app/main/default/classes/CommunitiesSelfRegConfirmControllerTest.cls
CommunitiesSelfRegConfirmControllerTest ApexClass force-app/main/default/classes/CommunitiesSelfRegConfirmControllerTest.cls-meta.xml
CommunitiesSelfRegController ApexClass force-app/main/default/classes/CommunitiesSelfRegController.cls
CommunitiesSelfRegController ApexClass force-app/main/default/classes/CommunitiesSelfRegController.cls-meta.xml
CommunitiesSelfRegControllerTest ApexClass force-app/main/default/classes/CommunitiesSelfRegControllerTest.cls
CommunitiesSelfRegControllerTest ApexClass force-app/main/default/classes/CommunitiesSelfRegControllerTest.cls-meta.xml
DataManager_Activity ApexClass force-app/main/default/classes/DataManager_Activity.cls
DataManager_Activity ApexClass force-app/main/default/classes/DataManager_Activity.cls-meta.xml
DataManager_ActivityTest ApexClass force-app/main/default/classes/DataManager_ActivityTest.cls
DataManager_ActivityTest ApexClass force-app/main/default/classes/DataManager_ActivityTest.cls-meta.xml
DataManager_CleanUp ApexClass force-app/main/default/classes/DataManager_CleanUp.cls
DataManager_CleanUp ApexClass force-app/main/default/classes/DataManager_CleanUp.cls-meta.xml
DataManager_CleanUpTest ApexClass force-app/main/default/classes/DataManager_CleanUpTest.cls
DataManager_CleanUpTest ApexClass force-app/main/default/classes/DataManager_CleanUpTest.cls-meta.xml
DataManager_Dataflow ApexClass force-app/main/default/classes/DataManager_Dataflow.cls
DataManager_Dataflow ApexClass force-app/main/default/classes/DataManager_Dataflow.cls-meta.xml
DataManager_DataflowTest ApexClass force-app/main/default/classes/DataManager_DataflowTest.cls
DataManager_DataflowTest ApexClass force-app/main/default/classes/DataManager_DataflowTest.cls-meta.xml
DataManager_Opportunity ApexClass force-app/main/default/classes/DataManager_Opportunity.cls
DataManager_Opportunity ApexClass force-app/main/default/classes/DataManager_Opportunity.cls-meta.xml
DataManager_OpportunityRelated ApexClass force-app/main/default/classes/DataManager_OpportunityRelated.cls
DataManager_OpportunityRelated ApexClass force-app/main/default/classes/DataManager_OpportunityRelated.cls-meta.xml
DataManager_OpportunityRelatedTest ApexClass force-app/main/default/classes/DataManager_OpportunityRelatedTest.cls
DataManager_OpportunityRelatedTest ApexClass force-app/main/default/classes/DataManager_OpportunityRelatedTest.cls-meta.xml
DataManager_OpportunityTest ApexClass force-app/main/default/classes/DataManager_OpportunityTest.cls
DataManager_OpportunityTest ApexClass force-app/main/default/classes/DataManager_OpportunityTest.cls-meta.xml
DataManager_Quota ApexClass force-app/main/default/classes/DataManager_Quota.cls
DataManager_Quota ApexClass force-app/main/default/classes/DataManager_Quota.cls-meta.xml
DataManager_QuotaTest ApexClass force-app/main/default/classes/DataManager_QuotaTest.cls
DataManager_QuotaTest ApexClass force-app/main/default/classes/DataManager_QuotaTest.cls-meta.xml
DataManager_TestUtils ApexClass force-app/main/default/classes/DataManager_TestUtils.cls
DataManager_TestUtils ApexClass force-app/main/default/classes/DataManager_TestUtils.cls-meta.xml
DataManager_Utils ApexClass force-app/main/default/classes/DataManager_Utils.cls
DataManager_Utils ApexClass force-app/main/default/classes/DataManager_Utils.cls-meta.xml
DataManager_UtilsTest ApexClass force-app/main/default/classes/DataManager_UtilsTest.cls
DataManager_UtilsTest ApexClass force-app/main/default/classes/DataManager_UtilsTest.cls-meta.xml
DataManager_controller ApexClass force-app/main/default/classes/DataManager_controller.cls
DataManager_controller ApexClass force-app/main/default/classes/DataManager_controller.cls-meta.xml
DataManager_controllerTest ApexClass force-app/main/default/classes/DataManager_controllerTest.cls
DataManager_controllerTest ApexClass force-app/main/default/classes/DataManager_controllerTest.cls-meta.xml
ERDCreatorCtrl ApexClass force-app/main/default/classes/ERDCreatorCtrl.cls
ERDCreatorCtrl ApexClass force-app/main/default/classes/ERDCreatorCtrl.cls-meta.xml
ForgotPasswordController ApexClass force-app/main/default/classes/ForgotPasswordController.cls
ForgotPasswordController ApexClass force-app/main/default/classes/ForgotPasswordController.cls-meta.xml
ForgotPasswordControllerTest ApexClass force-app/main/default/classes/ForgotPasswordControllerTest.cls
ForgotPasswordControllerTest ApexClass force-app/main/default/classes/ForgotPasswordControllerTest.cls-meta.xml
Fruit ApexClass force-app/main/default/classes/Fruit.cls
Fruit ApexClass force-app/main/default/classes/Fruit.cls-meta.xml
FuritsInv ApexClass force-app/main/default/classes/FuritsInv.cls
FuritsInv ApexClass force-app/main/default/classes/FuritsInv.cls-meta.xml
FuritsManger ApexClass force-app/main/default/classes/FuritsManger.cls
FuritsManger ApexClass force-app/main/default/classes/FuritsManger.cls-meta.xml
GanttChartCtrl ApexClass force-app/main/default/classes/GanttChartCtrl.cls
GanttChartCtrl ApexClass force-app/main/default/classes/GanttChartCtrl.cls-meta.xml
GanttData ApexClass force-app/main/default/classes/GanttData.cls
GanttData ApexClass force-app/main/default/classes/GanttData.cls-meta.xml
GeocodingService ApexClass force-app/main/default/classes/GeocodingService.cls
GeocodingService ApexClass force-app/main/default/classes/GeocodingService.cls-meta.xml
GeocodingServiceTest ApexClass force-app/main/default/classes/GeocodingServiceTest.cls
GeocodingServiceTest ApexClass force-app/main/default/classes/GeocodingServiceTest.cls-meta.xml
IdeaController ApexClass force-app/main/default/classes/IdeaController.cls
IdeaController ApexClass force-app/main/default/classes/IdeaController.cls-meta.xml
IdeaCtrl ApexClass force-app/main/default/classes/IdeaCtrl.cls
IdeaCtrl ApexClass force-app/main/default/classes/IdeaCtrl.cls-meta.xml
Jdt ApexClass force-app/main/default/classes/Jdt.cls
Jdt ApexClass force-app/main/default/classes/Jdt.cls-meta.xml
LightningForgotPasswordController ApexClass force-app/main/default/classes/LightningForgotPasswordController.cls
LightningForgotPasswordController ApexClass force-app/main/default/classes/LightningForgotPasswordController.cls-meta.xml
LightningForgotPasswordControllerTest ApexClass force-app/main/default/classes/LightningForgotPasswordControllerTest.cls
LightningForgotPasswordControllerTest ApexClass force-app/main/default/classes/LightningForgotPasswordControllerTest.cls-meta.xml
LightningLoginFormController ApexClass force-app/main/default/classes/LightningLoginFormController.cls
LightningLoginFormController ApexClass force-app/main/default/classes/LightningLoginFormController.cls-meta.xml
LightningLoginFormControllerTest ApexClass force-app/main/default/classes/LightningLoginFormControllerTest.cls
LightningLoginFormControllerTest ApexClass force-app/main/default/classes/LightningLoginFormControllerTest.cls-meta.xml
LightningSelfRegisterController ApexClass force-app/main/default/classes/LightningSelfRegisterController.cls
LightningSelfRegisterController ApexClass force-app/main/default/classes/LightningSelfRegisterController.cls-meta.xml
LightningSelfRegisterControllerTest ApexClass force-app/main/default/classes/LightningSelfRegisterControllerTest.cls
LightningSelfRegisterControllerTest ApexClass force-app/main/default/classes/LightningSelfRegisterControllerTest.cls-meta.xml
MetadataService ApexClass force-app/main/default/classes/MetadataService.cls
MetadataService ApexClass force-app/main/default/classes/MetadataService.cls-meta.xml
MetadataServiceTest ApexClass force-app/main/default/classes/MetadataServiceTest.cls
MetadataServiceTest ApexClass force-app/main/default/classes/MetadataServiceTest.cls-meta.xml
MicrobatchSelfRegController ApexClass force-app/main/default/classes/MicrobatchSelfRegController.cls
MicrobatchSelfRegController ApexClass force-app/main/default/classes/MicrobatchSelfRegController.cls-meta.xml
MicrobatchSelfRegControllerTest ApexClass force-app/main/default/classes/MicrobatchSelfRegControllerTest.cls
MicrobatchSelfRegControllerTest ApexClass force-app/main/default/classes/MicrobatchSelfRegControllerTest.cls-meta.xml
MultiselectController ApexClass force-app/main/default/classes/MultiselectController.cls
MultiselectController ApexClass force-app/main/default/classes/MultiselectController.cls-meta.xml
MultiselectControllerTest ApexClass force-app/main/default/classes/MultiselectControllerTest.cls
MultiselectControllerTest ApexClass force-app/main/default/classes/MultiselectControllerTest.cls-meta.xml
MyProfilePageController ApexClass force-app/main/default/classes/MyProfilePageController.cls
MyProfilePageController ApexClass force-app/main/default/classes/MyProfilePageController.cls-meta.xml
MyProfilePageControllerTest ApexClass force-app/main/default/classes/MyProfilePageControllerTest.cls
MyProfilePageControllerTest ApexClass force-app/main/default/classes/MyProfilePageControllerTest.cls-meta.xml
OpportunityController ApexClass force-app/main/default/classes/OpportunityController.cls
OpportunityController ApexClass force-app/main/default/classes/OpportunityController.cls-meta.xml
PagedResult ApexClass force-app/main/default/classes/PagedResult.cls
PagedResult ApexClass force-app/main/default/classes/PagedResult.cls-meta.xml
PlantUMLClassDiagram ApexClass force-app/main/default/classes/PlantUMLClassDiagram.cls
PlantUMLClassDiagram ApexClass force-app/main/default/classes/PlantUMLClassDiagram.cls-meta.xml
PlantUMLERDiagram ApexClass force-app/main/default/classes/PlantUMLERDiagram.cls
PlantUMLERDiagram ApexClass force-app/main/default/classes/PlantUMLERDiagram.cls-meta.xml
PropertyController ApexClass force-app/main/default/classes/PropertyController.cls
PropertyController ApexClass force-app/main/default/classes/PropertyController.cls-meta.xml
SOQLControllerInjection ApexClass force-app/main/default/classes/SOQLControllerInjection.cls
SOQLControllerInjection ApexClass force-app/main/default/classes/SOQLControllerInjection.cls-meta.xml
SampleDataController ApexClass force-app/main/default/classes/SampleDataController.cls
SampleDataController ApexClass force-app/main/default/classes/SampleDataController.cls-meta.xml
SelectOptionSorter ApexClass force-app/main/default/classes/SelectOptionSorter.cls
SelectOptionSorter ApexClass force-app/main/default/classes/SelectOptionSorter.cls-meta.xml
ServiceWaveConfigurationModifier ApexClass force-app/main/default/classes/ServiceWaveConfigurationModifier.cls
ServiceWaveConfigurationModifier ApexClass force-app/main/default/classes/ServiceWaveConfigurationModifier.cls-meta.xml
SiteLoginController ApexClass force-app/main/default/classes/SiteLoginController.cls
SiteLoginController ApexClass force-app/main/default/classes/SiteLoginController.cls-meta.xml
SiteLoginControllerTest ApexClass force-app/main/default/classes/SiteLoginControllerTest.cls
SiteLoginControllerTest ApexClass force-app/main/default/classes/SiteLoginControllerTest.cls-meta.xml
SiteRegisterController ApexClass force-app/main/default/classes/SiteRegisterController.cls
SiteRegisterController ApexClass force-app/main/default/classes/SiteRegisterController.cls-meta.xml
SiteRegisterControllerTest ApexClass force-app/main/default/classes/SiteRegisterControllerTest.cls
SiteRegisterControllerTest ApexClass force-app/main/default/classes/SiteRegisterControllerTest.cls-meta.xml
SoftphoneContactSearchController ApexClass force-app/main/default/classes/SoftphoneContactSearchController.cls
SoftphoneContactSearchController ApexClass force-app/main/default/classes/SoftphoneContactSearchController.cls-meta.xml
SoftphoneContactSearchControllerTest ApexClass force-app/main/default/classes/SoftphoneContactSearchControllerTest.cls
SoftphoneContactSearchControllerTest ApexClass force-app/main/default/classes/SoftphoneContactSearchControllerTest.cls-meta.xml
SoftphoneProviderController ApexClass force-app/main/default/classes/SoftphoneProviderController.cls
SoftphoneProviderController ApexClass force-app/main/default/classes/SoftphoneProviderController.cls-meta.xml
SoftphoneProviderControllerTest ApexClass force-app/main/default/classes/SoftphoneProviderControllerTest.cls
SoftphoneProviderControllerTest ApexClass force-app/main/default/classes/SoftphoneProviderControllerTest.cls-meta.xml
SoftphoneProviderHelper ApexClass force-app/main/default/classes/SoftphoneProviderHelper.cls
SoftphoneProviderHelper ApexClass force-app/main/default/classes/SoftphoneProviderHelper.cls-meta.xml
SoftphoneProviderHelperTest ApexClass force-app/main/default/classes/SoftphoneProviderHelperTest.cls
SoftphoneProviderHelperTest ApexClass force-app/main/default/classes/SoftphoneProviderHelperTest.cls-meta.xml
Tasks ApexClass force-app/main/default/classes/Tasks.cls
Tasks ApexClass force-app/main/default/classes/Tasks.cls-meta.xml
TestApex ApexClass force-app/main/default/classes/TestApex.cls
TestApex ApexClass force-app/main/default/classes/TestApex.cls-meta.xml
TestPropertyController ApexClass force-app/main/default/classes/TestPropertyController.cls
TestPropertyController ApexClass force-app/main/default/classes/TestPropertyController.cls-meta.xml
TestSampleDataController ApexClass force-app/main/default/classes/TestSampleDataController.cls
TestSampleDataController ApexClass force-app/main/default/classes/TestSampleDataController.cls-meta.xml
UploadEMUserNotFoundException ApexClass force-app/main/default/classes/UploadEMUserNotFoundException.cls
UploadEMUserNotFoundException ApexClass force-app/main/default/classes/UploadEMUserNotFoundException.cls-meta.xml
Upload_Init_EM ApexClass force-app/main/default/classes/Upload_Init_EM.cls
Upload_Init_EM ApexClass force-app/main/default/classes/Upload_Init_EM.cls-meta.xml
Upload_Init_EMTest ApexClass force-app/main/default/classes/Upload_Init_EMTest.cls
Upload_Init_EMTest ApexClass force-app/main/default/classes/Upload_Init_EMTest.cls-meta.xml
Upload_ProcessTest_EM ApexClass force-app/main/default/classes/Upload_ProcessTest_EM.cls
Upload_ProcessTest_EM ApexClass force-app/main/default/classes/Upload_ProcessTest_EM.cls-meta.xml
Upload_Process_EM ApexClass force-app/main/default/classes/Upload_Process_EM.cls
Upload_Process_EM ApexClass force-app/main/default/classes/Upload_Process_EM.cls-meta.xml
Upload_controller_EM ApexClass force-app/main/default/classes/Upload_controller_EM.cls
Upload_controller_EM ApexClass force-app/main/default/classes/Upload_controller_EM.cls-meta.xml
Upload_controller_EMTest ApexClass force-app/main/default/classes/Upload_controller_EMTest.cls
Upload_controller_EMTest ApexClass force-app/main/default/classes/Upload_controller_EMTest.cls-meta.xml
WrikeWebHook ApexClass force-app/main/default/classes/WrikeWebHook.cls
WrikeWebHook ApexClass force-app/main/default/classes/WrikeWebHook.cls-meta.xml
Wrike_AbstractSectionController ApexClass force-app/main/default/classes/Wrike_AbstractSectionController.cls
Wrike_AbstractSectionController ApexClass force-app/main/default/classes/Wrike_AbstractSectionController.cls-meta.xml
Wrike_AuthorizeController ApexClass force-app/main/default/classes/Wrike_AuthorizeController.cls
Wrike_AuthorizeController ApexClass force-app/main/default/classes/Wrike_AuthorizeController.cls-meta.xml
Wrike_FolderController ApexClass force-app/main/default/classes/Wrike_FolderController.cls
Wrike_FolderController ApexClass force-app/main/default/classes/Wrike_FolderController.cls-meta.xml
Wrike_PrivilegedUtils ApexClass force-app/main/default/classes/Wrike_PrivilegedUtils.cls
Wrike_PrivilegedUtils ApexClass force-app/main/default/classes/Wrike_PrivilegedUtils.cls-meta.xml
Wrike_RemoteSettingsController ApexClass force-app/main/default/classes/Wrike_RemoteSettingsController.cls
Wrike_RemoteSettingsController ApexClass force-app/main/default/classes/Wrike_RemoteSettingsController.cls-meta.xml
Wrike_SetupController ApexClass force-app/main/default/classes/Wrike_SetupController.cls
Wrike_SetupController ApexClass force-app/main/default/classes/Wrike_SetupController.cls-meta.xml
Wrike_TaskController ApexClass force-app/main/default/classes/Wrike_TaskController.cls
Wrike_TaskController ApexClass force-app/main/default/classes/Wrike_TaskController.cls-meta.xml
Wrike_Test ApexClass force-app/main/default/classes/Wrike_Test.cls
Wrike_Test ApexClass force-app/main/default/classes/Wrike_Test.cls-meta.xml
datatableController ApexClass force-app/main/default/classes/datatableController.cls
datatableController ApexClass force-app/main/default/classes/datatableController.cls-meta.xml
uploadController ApexClass force-app/main/default/classes/uploadController.cls
uploadController ApexClass force-app/main/default/classes/uploadController.cls-meta.xml
total 32
-rw-r--r-- 1 mchinnappan staff 1251 Jul 19 17:07 README.md
drwxr-xr-x 3 mchinnappan staff 96 Jul 19 17:07 config
drwxr-xr-x 3 mchinnappan staff 96 Jul 19 17:07 force-app
-rw-r--r-- 1 mchinnappan staff 170 Jul 19 17:07 jest.config.js
-rw-r--r-- 1 mchinnappan staff 1386 Jul 19 17:07 package.json
drwxr-xr-x 4 mchinnappan staff 128 Jul 19 17:07 scripts
-rw-r--r-- 1 mchinnappan staff 217 Jul 19 17:07 sfdx-project.json
=== Login into the Target org ===
Successfully authorized mohan.chinnappan.n_ea2@gmail.com with org ID 00D3h000007R1LuEAK
=== Validating the source for the quick deployment ... ===
*** Deploying with SOAP API v55.0 ***
Deploy ID: 0Af3h00000QZoegCAD
DEPLOY PROGRESS | ████████████████████████████████████████ | 174/174 Components
=== Component Failures [0]
Type Name Problem
──── ──── ───────
=== Test Results Summary
Passing: 69
Failing: 6
Total: 75
Time: 71194
ERROR running force:source:deploy: Deploy failed.
=== Opening the org for checking the deployment status ... ===
Opening org 00D3h000007R1LuEAK as user mohan.chinnappan.n_ea2@gmail.com
Waiting to resolve the Lightning Experience-enabled custom domain...... done
Deploy using this ID: 0Af3h00000QZoegCAD
sfdx force:source:deploy -q 0Af3h00000QZoegCAD -u mohan.chinnappan.n_ea2@gmail.com --apiversion 55.0
Taking care of Admin changes
Topics
- Use case
- Hotfix creation
- Using ONNU Chrome Extension
- Using CLI
- Retrieving Profile and Permission Set
- Viewing SetupAuditTrail
- package.xml Generator
Use case
- Admin wants to make changes (like adding reports, custom fields, list views...) to a non-source-tracked org like PROD
- Admin is not familiar with developer tools or version control
sequenceDiagram
    autonumber
    Admin->>ChangeControl: Create WorkItem
    ChangeControl->>Admin: Validate and approve
    Admin->>Org: Perform Changes
    Admin->>ChangeControl: Update WorkItem:<br/> Assign to right team member for hotfix branch creation
    ChangeControl->>TeamMember: Notify
    TeamMember->>VersionControl: Create hotfix branch based on the changes <br/> done by the Admin in the WorkItem
Create hotfix based on the changes done by the Admin in the WorkItem
- Options
- Using Chrome Extension ONNU
- Using CLI
- This assumes the TeamMember has the access to this Org
Using ONNU
- Assume the Admin has added a CustomObject FruitTree__c as part of the WorkItem WI-12345
- Zip file created:
tree unpackaged\ 68
unpackaged\ 68
├── objects
│ └── FruitTree__c.object
└── package.xml
~/Downloads/unpackaged 68 >cat objects/FruitTree__c.object
<?xml version="1.0" encoding="UTF-8"?>
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
<actionOverrides>
<actionName>Accept</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Accept</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Accept</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>CancelEdit</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>CancelEdit</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>CancelEdit</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Clone</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Clone</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Clone</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Delete</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Delete</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Delete</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Edit</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Edit</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Edit</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>List</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>List</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>List</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>New</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>New</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>New</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>SaveEdit</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>SaveEdit</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>SaveEdit</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Tab</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Tab</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Tab</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>View</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>View</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>View</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<allowInChatterGroups>false</allowInChatterGroups>
<compactLayoutAssignment>SYSTEM</compactLayoutAssignment>
<deploymentStatus>Deployed</deploymentStatus>
<enableActivities>false</enableActivities>
<enableBulkApi>true</enableBulkApi>
<enableFeeds>false</enableFeeds>
<enableHistory>false</enableHistory>
<enableLicensing>false</enableLicensing>
<enableReports>false</enableReports>
<enableSearch>false</enableSearch>
<enableSharing>true</enableSharing>
<enableStreamingApi>true</enableStreamingApi>
<externalSharingModel>Private</externalSharingModel>
<label>FruitTree</label>
<nameField>
<label>FruitTree Name</label>
<type>Text</type>
</nameField>
<pluralLabel>FruitTrees</pluralLabel>
<searchLayouts/>
<sharingModel>ReadWrite</sharingModel>
<visibility>Public</visibility>
</CustomObject>
- The TeamMember will create a hotfix branch, say hotfix-WI-12345, to take care of this, so that the change made by the Admin is committed back into version control
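The branch-creation step can be sketched as a small helper. This is an illustrative sketch, not part of the book's tooling: the function name backcommit_hotfix is mine, and it assumes the retrieved metadata files already sit inside the working copy of the repository.

```shell
#!/bin/bash
# backcommit_hotfix: put the Admin's retrieved change on a hotfix branch
# named after the WorkItem, so it can be reviewed and merged back.
backcommit_hotfix() {
  local wi="$1" path="$2"
  git checkout -q -b "hotfix-$wi"       # e.g. hotfix-WI-12345
  git add "$path"                       # e.g. unpackaged/objects/FruitTree__c.object
  git commit -q -m "$wi: back-commit Admin change in $path"
  # git push -u origin "hotfix-$wi"     # push once the branch looks right
}
```

For the example above, `backcommit_hotfix WI-12345 unpackaged/objects/FruitTree__c.object` creates hotfix-WI-12345 with the Admin's change committed on it.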
Using CLI
- Login
sfdx force:auth:web:login
Successfully authorized mohan.chinnappan.n.sel@gmail.com with org ID 00D4x000007rxoGEAQ
- Prepare package.xml
~/hotfix >cat hotfix_package.xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
<members>FruitTree__c</members>
<name>CustomObject</name>
</types>
<version>56.0</version>
</Package>
- Retrieve
sfdx force:mdapi:retrieve -k hotfix_package.xml -u mohan.chinnappan.n.sel@gmail.com -r .
Retrieving v56.0 metadata from mohan.chinnappan.n.sel@gmail.com using the v56.0 SOAP API
Retrieve ID: 09S4x00000EyAenEAF
Retrieving metadata from mohan.chinnappan.n.sel@gmail.com... done
Wrote retrieve zip to /Users/mchinnappan/hotfix/unpackaged.zip
- Unzip
~/hotfix >jar xvf unpackaged.zip
inflated: unpackaged/objects/FruitTree__c.object
inflated: unpackaged/package.xml
tree
.
├── hotfix_package.xml
├── unpackaged
│ ├── objects
│ │ └── FruitTree__c.object
│ └── package.xml
└── unpackaged.zip
2 directories, 4 files
~/hotfix >cat hotfix_package.xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
<members>FruitTree__c</members>
<name>CustomObject</name>
</types>
<version>56.0</version>
</Package>
~/hotfix >cat unpackaged/objects/FruitTree__c.object
<?xml version="1.0" encoding="UTF-8"?>
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
<actionOverrides>
<actionName>Accept</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Accept</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Accept</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>CancelEdit</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>CancelEdit</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>CancelEdit</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Clone</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Clone</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Clone</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Delete</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Delete</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Delete</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Edit</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Edit</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Edit</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>List</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>List</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>List</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>New</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>New</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>New</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>SaveEdit</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>SaveEdit</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>SaveEdit</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Tab</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Tab</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>Tab</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>View</actionName>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>View</actionName>
<formFactor>Large</formFactor>
<type>Default</type>
</actionOverrides>
<actionOverrides>
<actionName>View</actionName>
<formFactor>Small</formFactor>
<type>Default</type>
</actionOverrides>
<allowInChatterGroups>false</allowInChatterGroups>
<compactLayoutAssignment>SYSTEM</compactLayoutAssignment>
<deploymentStatus>Deployed</deploymentStatus>
<enableActivities>false</enableActivities>
<enableBulkApi>true</enableBulkApi>
<enableFeeds>false</enableFeeds>
<enableHistory>false</enableHistory>
<enableLicensing>false</enableLicensing>
<enableReports>false</enableReports>
<enableSearch>false</enableSearch>
<enableSharing>true</enableSharing>
<enableStreamingApi>true</enableStreamingApi>
<externalSharingModel>Private</externalSharingModel>
<label>FruitTree</label>
<nameField>
<label>FruitTree Name</label>
<type>Text</type>
</nameField>
<pluralLabel>FruitTrees</pluralLabel>
<searchLayouts/>
<sharingModel>ReadWrite</sharingModel>
<visibility>Public</visibility>
</CustomObject>
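With the metadata retrieved and unzipped, the hotfix branch mentioned earlier can carry the admin's change back into version control. A minimal, self-contained sketch using a throwaway repository (the paths, branch name hotfix-WI-12345, and commit message are illustrative):

```shell
set -e
repo=$(mktemp -d)          # stand-in for your local clone
cd "$repo"
git init -q
git checkout -q -b main
git config user.email "ci@example.com"
git config user.name "CI"
mkdir -p unpackaged/objects
echo '<CustomObject/>' > unpackaged/objects/FruitTree__c.object
git add -A
git commit -q -m "baseline"
# hotfix branch named after the work item
git checkout -q -b hotfix-WI-12345
echo '<CustomObject><label>FruitTree</label></CustomObject>' > unpackaged/objects/FruitTree__c.object
git add -A
git commit -q -m "WI-12345: back-commit admin hotfix for FruitTree__c"
git log --oneline -1
```

From here the branch would be pushed and a pull request opened so the change flows through the normal review pipeline.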
package.xml Generator
- You can use this tool to generate package.xml for your requirements
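If the generator tool is not handy, a package.xml like the ones above can also be produced with a short script. A minimal sketch in Python (the build_package_xml helper is illustrative, not part of any Salesforce tooling):

```python
from xml.sax.saxutils import escape

def build_package_xml(types: dict, api_version: str = "56.0") -> str:
    """Render a Metadata API package.xml from {metadata type: [members]}."""
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<Package xmlns="http://soap.sforce.com/2006/04/metadata">']
    for name, members in sorted(types.items()):
        parts.append("    <types>")
        for member in members:
            parts.append(f"        <members>{escape(member)}</members>")
        parts.append(f"        <name>{escape(name)}</name>")
        parts.append("    </types>")
    parts.append(f"    <version>{api_version}</version>")
    parts.append("</Package>")
    return "\n".join(parts)

print(build_package_xml({"CustomObject": ["FruitTree__c"]}))
```

The output can be fed directly to `sfdx force:mdapi:retrieve -k` as shown above.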
DevOps Center
DevOps Center Features
- DevOps Center brings best practices, through a click-based interface, to the entire development community around:
- source management
- change management
- release management
Feature | Comments |
---|---|
Changes are tracked automatically | |
The source control repository is your source of truth for project changes, so you always know that everyone is working with a consistent set of metadata | |
Source control branches are created and managed for you. You can move the changes through the release pipeline by clicks | |
Provides visibility into each other’s changes regardless of how they’re doing their work | |
Uses Salesforce DX under the hood, but you do not have to know the details | DX projects, Metadata API, version control |
How to release changes more efficiently using DevOps Center?
Persona | Currently Using | Skill set |
---|---|---|
Salesforce admin and declarative developer | Change Sets | Not familiar with version control |
Programmatic developer | DX tooling; would like more visibility into the admin’s changes | Familiar with using the CLI |
Release manager | Responsible for deploying changes for both teammates through the release pipeline | Familiar with version control, CLI and Change sets |
Work Item (WI) based
Flow for Salesforce admin and declarative developer
- Performs the following with clicks in DevOps Center
graph TD; A[Declarative developer creates a Work Item \n in DevOps Center]-->B[Performs her work in a Developer Sandbox]; B-->C[Pulls her changes from the Developer Sandbox,\n commits them into version control,\n and creates a review]; C-->D[Promotes her changes];
Flow for Programmatic developer
- Performs the following with clicks in DevOps Center
graph TD; A[Programmatic developer creates a Work Item \n in DevOps Center]-->B[Performs his work in the 'work item feature branch'\n using an IDE]; B-->C[Commits and merges those changes directly\n in version control];
Flow for Release Manager
- Performs the following with clicks in DevOps Center
graph TD; A[View the changes]-->B[Promote changes either from within DevOps Center or \nusing Salesforce CLI];
Key features
-
Work item
- Defines the requirement for the change to be made
- Tracks the associated metadata source files through the release lifecycle.
- Increases visibility regarding where changes reside in each pipeline stage.
-
Conflict management
- Identifies work items that have conflicts with other work items in the pipeline
- Provides information to help you resolve the conflict.
-
Dev environment synchronization
- Detects when a development environment is out-of-date so you can ensure that you’re working against the latest source of truth.
- Tells you what’s different, and enables you to synchronize it to avoid future conflicts.
-
Activity history
- Provides increased visibility, auditability, and error tracking in both work item and pipeline views.
-
Ability to delete metadata:
- Removes the component from the target stages, leveraging the Metadata API destructive changes functionality.
-
Validation-only deployments
- Enables you to run a validation-only deployment, and, at a later time, perform a quick deployment
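Under the hood, the metadata-deletion feature uses the Metadata API's destructive changes: the deployment carries a destructiveChanges.xml naming the components to delete, alongside a (possibly empty) package.xml. A sketch, with FruitTree__c as an illustrative member:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- destructiveChanges.xml: components to delete in the target org -->
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>FruitTree__c</members>
        <name>CustomObject</name>
    </types>
</Package>
```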
Installing DevOps Center
References
Videos
Jenkins
The best part is no part.
Physics is the law and everything else is a recommendation.
The machine that makes the machine!
- Elon Musk
Topics
Use Case
- Set up Jenkins master and worker nodes to run SFDX for the given org
- Make use of the JWT flow to log in to a Salesforce org
graph TB; JenkinsMaster-->socat; socat--> Worker1-->ORG; socat--> Worker2-->ORG; socat--> Worker3-->ORG;
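In the JWT flow, the CLI signs a short-lived JWT with the connected app's private key and exchanges it for an access token. A minimal sketch of the claim set involved (the jwt_unsigned helper is illustrative; the RS256 signing with server.key and the token exchange are omitted because `sfdx force:auth:jwt:grant` performs them for you):

```python
import base64
import json
import time

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def jwt_unsigned(consumer_key: str, username: str,
                 audience: str = "https://login.salesforce.com",
                 ttl_seconds: int = 300) -> str:
    """Build the header.claims portion of a Salesforce JWT bearer token."""
    header = {"alg": "RS256"}
    claims = {
        "iss": consumer_key,   # connected app consumer key
        "sub": username,       # Salesforce username to log in as
        "aud": audience,       # use https://test.salesforce.com for sandboxes
        "exp": int(time.time()) + ttl_seconds,
    }
    # The RS256 signature over this string (made with the connected app's
    # private key) is appended as a third segment; omitted in this sketch.
    return b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(claims).encode())
```

Because no interactive login is needed, this flow suits headless CI nodes like the Jenkins workers set up below.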
Setting up Jenkins Master
docker network create jenkins
37d113c8d79cd12a209822f0ccb12be7058b9c6ccdc62e2b298725537edd703d
docker network ls
NETWORK ID NAME DRIVER SCOPE
f8ca68549edb bridge bridge local
145498352d20 host host local
37d113c8d79c jenkins bridge local
3bdfaf921867 none null local
docker run --name jenkins-docker --rm --detach \
--privileged --network jenkins --network-alias docker \
--env DOCKER_TLS_CERTDIR=/certs \
--volume jenkins-docker-certs:/certs/client \
--volume jenkins-data:/var/jenkins_home \
--publish 2376:2376 \
docker:dind --storage-driver overlay2
Unable to find image 'docker:dind' locally
dind: Pulling from library/docker
63b65145d645: Pull complete
67fecbccebe0: Pull complete
2b3fda0e4f2a: Pull complete
30dc96e58339: Pull complete
8c007c9e333b: Pull complete
c31b9f3137a1: Pull complete
782206789a08: Pull complete
e415889ffe25: Pull complete
55134767123a: Pull complete
36af3abb7351: Pull complete
9e414f2e72f2: Pull complete
4ccabe0725ae: Pull complete
ca215d26ccb7: Pull complete
Digest: sha256:e4d776dd1e0580dfb670559d887300aa08b53b8a59f5df2d4eaace936ef4d0e9
Status: Downloaded newer image for docker:dind
58b1537f513652de9951efd03766a2c9ad853e45341a18584f39565bb4beae14
cat Dockerfile
FROM jenkins/jenkins:2.387.1
USER root
RUN apt-get update && apt-get install -y lsb-release
RUN curl -fsSLo /usr/share/keyrings/docker-archive-keyring.asc \
https://download.docker.com/linux/debian/gpg
RUN echo "deb [arch=$(dpkg --print-architecture) \
signed-by=/usr/share/keyrings/docker-archive-keyring.asc] \
https://download.docker.com/linux/debian \
$(lsb_release -cs) stable" > /etc/apt/sources.list.d/docker.list
RUN apt-get update && apt-get install -y docker-ce-cli
USER jenkins
RUN jenkins-plugin-cli --plugins "blueocean docker-workflow"
docker build -t myjenkins-blueocean:2.387.1-1 .
[+] Building 42.9s (10/10) FINISHED
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 602B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [internal] load metadata for docker.io/jenkins/jenkins:2.387.1 1.0s
=> [1/6] FROM docker.io/jenkins/jenkins:2.387.1@sha256:0944e18261a6547e89b700cec432949281a7419a6165a3906e78c97efde3bc86 11.1s
=> => resolve docker.io/jenkins/jenkins:2.387.1@sha256:0944e18261a6547e89b700cec432949281a7419a6165a3906e78c97efde3bc86 0.0s
=> => sha256:a56533012712c1db623da3e5e9c2d0276301c82db0a2e7a82debfb57e5d916f2 8.93MB / 8.93MB 0.3s
=> => sha256:c09d5e9e1188f3fff7a4f8c3c7c330fde5184cba1c6f0c92526b8b7bd0ac7c26 51.63MB / 51.63MB 3.4s
=> => sha256:0944e18261a6547e89b700cec432949281a7419a6165a3906e78c97efde3bc86 2.36kB / 2.36kB 0.0s
=> => sha256:32fb02163b6bb519a30f909008e852354dae10bdfd6b34190dbdfe8f15403ea0 55.05MB / 55.05MB 2.6s
=> => sha256:005fcb5c3017ef120d0d9d8d8925e9248ff6e2cf2b5e18b527b01459c7b2b3f4 2.77kB / 2.77kB 0.0s
=> => sha256:d5ed2ceef0ec08e9044ebb39812f211d64dbcdfce775cc6b0460ca289193416f 13.13kB / 13.13kB 0.0s
=> => sha256:7936e107ffe73b406a0d02edf9bb02b983534d803bb06fd03dc38dac4b6cfe2a 1.24kB / 1.24kB 0.4s
=> => sha256:3ca683058265b99b65bbc69b9e8fa4c46e830db35aad614706200e6cf0c30d8a 189B / 189B 0.5s
=> => sha256:c2ecd304b4b84ef6154bd85e13360f0b015e39057a329698617ce0a53ed6cf32 98.12MB / 98.12MB 4.7s
=> => extracting sha256:32fb02163b6bb519a30f909008e852354dae10bdfd6b34190dbdfe8f15403ea0 2.4s
=> => sha256:be3512d810d65f00f28af0885e2f30833263ee061528e20c7fee21664f1572b8 202B / 202B 2.8s
=> => sha256:56b37d7c2a7a3c93fd013cb1ad5652f8cea3910ecc0274d73064bba27ab57864 5.84MB / 5.84MB 3.4s
=> => sha256:99ed1e723e52507ce2d615e1682d673c3ffcec5f5b68c266db70d829ef4be208 76.93MB / 76.93MB 5.8s
=> => sha256:256db5485b1399ecbd58c2558388fc9a4ff1caaaeacbdfe23127c7f13b1ee98b 1.93kB / 1.93kB 3.5s
=> => sha256:ee8c7eaf5e6bd8c45e503756da48e80137def6765017148ce9f2af66ce97244b 1.17kB / 1.17kB 3.7s
=> => sha256:509f66c2f3174642f0eb3e3b2e8a70da698f613042ec65f525c476afe0b6b7d5 374B / 374B 3.9s
=> => sha256:820296a845d636be13276fdf3bdae7fdf2ac00d401182632d1ab450e26353674 271B / 271B 4.1s
=> => extracting sha256:c09d5e9e1188f3fff7a4f8c3c7c330fde5184cba1c6f0c92526b8b7bd0ac7c26 2.0s
=> => extracting sha256:a56533012712c1db623da3e5e9c2d0276301c82db0a2e7a82debfb57e5d916f2 0.2s
=> => extracting sha256:7936e107ffe73b406a0d02edf9bb02b983534d803bb06fd03dc38dac4b6cfe2a 0.0s
=> => extracting sha256:3ca683058265b99b65bbc69b9e8fa4c46e830db35aad614706200e6cf0c30d8a 0.0s
=> => extracting sha256:c2ecd304b4b84ef6154bd85e13360f0b015e39057a329698617ce0a53ed6cf32 0.6s
=> => extracting sha256:be3512d810d65f00f28af0885e2f30833263ee061528e20c7fee21664f1572b8 0.0s
=> => extracting sha256:56b37d7c2a7a3c93fd013cb1ad5652f8cea3910ecc0274d73064bba27ab57864 0.1s
=> => extracting sha256:99ed1e723e52507ce2d615e1682d673c3ffcec5f5b68c266db70d829ef4be208 1.6s
=> => extracting sha256:256db5485b1399ecbd58c2558388fc9a4ff1caaaeacbdfe23127c7f13b1ee98b 0.0s
=> => extracting sha256:ee8c7eaf5e6bd8c45e503756da48e80137def6765017148ce9f2af66ce97244b 0.0s
=> => extracting sha256:509f66c2f3174642f0eb3e3b2e8a70da698f613042ec65f525c476afe0b6b7d5 0.0s
=> => extracting sha256:820296a845d636be13276fdf3bdae7fdf2ac00d401182632d1ab450e26353674 0.0s
=> [2/6] RUN apt-get update && apt-get install -y lsb-release 6.7s
=> [3/6] RUN curl -fsSLo /usr/share/keyrings/docker-archive-keyring.asc https://download.docker.com/linux/debian/gpg 0.7s
=> [4/6] RUN echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.asc] https://download.docker 0.2s
=> [5/6] RUN apt-get update && apt-get install -y docker-ce-cli 8.9s
=> [6/6] RUN jenkins-plugin-cli --plugins "blueocean docker-workflow" 13.5s
=> exporting to image 0.8s
=> => exporting layers 0.8s
=> => writing image sha256:fd428091db2d8149ed7ce13d7fe3ccd7d412e87a121f4a6850ec0cfd0ed91ce2 0.0s
=> => naming to docker.io/library/myjenkins-blueocean:2.387.1-1 0.0s
Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
docker container ls
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
58b1537f5136 docker:dind "dockerd-entrypoint.…" 2 minutes ago Up 2 minutes 2375/tcp, 0.0.0.0:2376->2376/tcp jenkins-docker
docker run --name jenkins-blueocean --restart=on-failure --detach \
--network jenkins --env DOCKER_HOST=tcp://docker:2376 \
--env DOCKER_CERT_PATH=/certs/client --env DOCKER_TLS_VERIFY=1 \
--publish 8080:8080 --publish 50000:50000 \
--volume jenkins-data:/var/jenkins_home \
--volume jenkins-docker-certs:/certs/client:ro \
myjenkins-blueocean:2.387.1-1
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
c5ea234eee86 myjenkins-blueocean:2.387.1-1 "/usr/bin/tini -- /u…" 2 minutes ago Up 2 minutes 0.0.0.0:8080->8080/tcp, 0.0.0.0:50000->50000/tcp jenkins-blueocean
58b1537f5136 docker:dind "dockerd-entrypoint.…" 4 minutes ago Up 4 minutes 2375/tcp, 0.0.0.0:2376->2376/tcp jenkins-docker
Getting initialAdminPassword
docker exec c5ea234eee86 cat /var/jenkins_home/secrets/initialAdminPassword
600755197d9642a08d03aa294fb82344
Bash into the Docker container
docker exec -it c5ea234eee86 bash
jenkins@c5ea234eee86:~$ ls -l
total 112
-rw-r--r-- 1 jenkins jenkins 1663 Mar 12 13:29 config.xml
-rw-r--r-- 1 jenkins jenkins 5316 Mar 12 13:28 copy_reference_file.log
-rw-r--r-- 1 jenkins jenkins 156 Mar 12 13:28 hudson.model.UpdateCenter.xml
-rw-r--r-- 1 jenkins jenkins 370 Mar 12 13:28 hudson.plugins.git.GitTool.xml
-rw------- 1 jenkins jenkins 1680 Mar 12 13:28 identity.key.enc
-rw-r--r-- 1 jenkins jenkins 7 Mar 12 13:34 jenkins.install.InstallUtil.lastExecVersion
-rw-r--r-- 1 jenkins jenkins 7 Mar 12 13:34 jenkins.install.UpgradeWizard.state
-rw-r--r-- 1 jenkins jenkins 179 Mar 12 13:34 jenkins.model.JenkinsLocationConfiguration.xml
-rw-r--r-- 1 jenkins jenkins 171 Mar 12 13:28 jenkins.telemetry.Correlator.xml
drwxr-xr-x 3 jenkins jenkins 4096 Mar 12 13:51 jobs
drwxr-xr-x 4 jenkins jenkins 4096 Mar 12 14:33 logs
-rw-r--r-- 1 jenkins jenkins 907 Mar 12 13:28 nodeMonitors.xml
drwxr-xr-x 2 jenkins jenkins 4096 Mar 12 13:28 nodes
drwxr-xr-x 114 jenkins jenkins 24576 Mar 12 13:33 plugins
-rw-r--r-- 1 jenkins jenkins 129 Mar 12 13:54 queue.xml
-rw-r--r-- 1 jenkins jenkins 64 Mar 12 13:28 secret.key
-rw-r--r-- 1 jenkins jenkins 0 Mar 12 13:28 secret.key.not-so-secret
drwx------ 2 jenkins jenkins 4096 Mar 12 13:53 secrets
drwxr-xr-x 2 jenkins jenkins 4096 Mar 12 13:33 updates
drwxr-xr-x 2 jenkins jenkins 4096 Mar 12 13:28 userContent
drwxr-xr-x 3 jenkins jenkins 4096 Mar 12 13:33 users
drwxr-xr-x 11 jenkins jenkins 4096 Mar 12 13:28 war
drwxr-xr-x 3 jenkins jenkins 4096 Mar 12 13:53 workspace
jenkins@c5ea234eee86:~$ pwd
/var/jenkins_home
jenkins@c5ea234eee86:~$ python3
Python 3.9.2 (default, Feb 28 2021, 17:03:44)
[GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> exit()
socat container
-
socat proxies the connection between the Jenkins master container and the Docker daemon socket on the host
-
Steps
-
If you are running Jenkins as a container, you have to enter the Unix or TCP address of the Docker host in the Docker host URI field. But since Jenkins is running as a container, it cannot reach the Docker host's Unix socket
-
So we have to run another container that mediates between the Docker host and the Jenkins container. It publishes the Docker host's Unix socket as its TCP port. Follow the instructions at https://hub.docker.com/r/alpine/socat/ to create the socat container
-
After creating the socat container, you can go back to the Docker configuration in Jenkins and enter tcp://socat-container-ip:2375
- Test Connection should succeed now
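What socat does here can be shown in miniature: accept a TCP connection and relay the bytes to a Unix socket. A toy, single-connection sketch (a throwaway echo socket stands in for /var/run/docker.sock; all names are illustrative):

```python
import os
import socket
import tempfile
import threading

def unix_echo_server(path: str, ready: threading.Event) -> None:
    """Stand-in for the Docker daemon socket: echoes one message back."""
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as srv:
        srv.bind(path)
        srv.listen(1)
        ready.set()
        conn, _ = srv.accept()
        with conn:
            conn.sendall(conn.recv(4096))

def tcp_to_unix_proxy(tcp_sock: socket.socket, unix_path: str) -> None:
    """Like socat tcp-listen:2375 unix-connect:..., for one request/reply."""
    conn, _ = tcp_sock.accept()
    with conn, socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as upstream:
        upstream.connect(unix_path)
        upstream.sendall(conn.recv(4096))   # client -> unix socket
        conn.sendall(upstream.recv(4096))   # unix socket -> client

sock_path = os.path.join(tempfile.mkdtemp(), "docker.sock")
ready = threading.Event()
threading.Thread(target=unix_echo_server, args=(sock_path, ready), daemon=True).start()
ready.wait()

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))   # ephemeral port, like -p 127.0.0.1:2400
listener.listen(1)
port = listener.getsockname()[1]
threading.Thread(target=tcp_to_unix_proxy, args=(listener, sock_path), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"GET /_ping")
    reply = client.recv(4096)
print(reply)
```

The real socat command below does the same thing bidirectionally and forks per connection.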
docker run -d --restart=always -p 127.0.0.1:2400:2375 --network jenkins -v /var/run/docker.sock:/var/run/docker.sock alpine/socat tcp-listen:2375,fork,reuseaddr unix-connect:/var/run/docker.sock
b643f2a82b31e1f2ac9e5261badbb65d13447b94a92eaa0bb34202a9148f39b4
docker container ls
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
b643f2a82b31 alpine/socat "socat tcp-listen:23…" About a minute ago Up About a minute 127.0.0.1:2400->2375/tcp silly_hopper
c5ea234eee86 myjenkins-blueocean:2.387.1-1 "/usr/bin/tini -- /u…" 3 hours ago Up 12 minutes 0.0.0.0:8080->8080/tcp, 0.0.0.0:50000->50000/tcp jenkins-blueocean
58b1537f5136 docker:dind "dockerd-entrypoint.…" 3 hours ago Up 3 hours 2375/tcp, 0.0.0.0:2376->2376/tcp jenkins-docker
docker inspect b643f2a82b31e1f2ac9e5261badbb65d13447b94a92eaa0bb34202a9148f39b4 | grep IPAddress
"SecondaryIPAddresses": null,
"IPAddress": "",
"IPAddress": "172.18.0.4",
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
b643f2a82b31 alpine/socat "socat tcp-listen:23…" 2 minutes ago Up 2 minutes 127.0.0.1:2400->2375/tcp silly_hopper
c5ea234eee86 myjenkins-blueocean:2.387.1-1 "/usr/bin/tini -- /u…" 3 hours ago Up 13 minutes 0.0.0.0:8080->8080/tcp, 0.0.0.0:50000->50000/tcp jenkins-blueocean
58b1537f5136 docker:dind "dockerd-entrypoint.…" 3 hours ago Up 3 hours 2375/tcp, 0.0.0.0:2376->2376/tcp jenkins-docker
Inspect socat
docker inspect silly_hopper
[
{
"Id": "b643f2a82b31e1f2ac9e5261badbb65d13447b94a92eaa0bb34202a9148f39b4",
"Created": "2023-03-12T16:08:27.933374909Z",
"Path": "socat",
"Args": [
"tcp-listen:2375,fork,reuseaddr",
"unix-connect:/var/run/docker.sock"
],
"State": {
"Status": "running",
"Running": true,
"Paused": false,
"Restarting": false,
"OOMKilled": false,
"Dead": false,
"Pid": 8609,
"ExitCode": 0,
"Error": "",
"StartedAt": "2023-03-12T16:08:28.362760818Z",
"FinishedAt": "0001-01-01T00:00:00Z"
},
"Image": "sha256:f0bbf8a4f6a0e25e06fe69c284e6e0494948e8de21ef933d72732b7858da3110",
"ResolvConfPath": "/var/lib/docker/containers/b643f2a82b31e1f2ac9e5261badbb65d13447b94a92eaa0bb34202a9148f39b4/resolv.conf",
"HostnamePath": "/var/lib/docker/containers/b643f2a82b31e1f2ac9e5261badbb65d13447b94a92eaa0bb34202a9148f39b4/hostname",
"HostsPath": "/var/lib/docker/containers/b643f2a82b31e1f2ac9e5261badbb65d13447b94a92eaa0bb34202a9148f39b4/hosts",
"LogPath": "/var/lib/docker/containers/b643f2a82b31e1f2ac9e5261badbb65d13447b94a92eaa0bb34202a9148f39b4/b643f2a82b31e1f2ac9e5261badbb65d13447b94a92eaa0bb34202a9148f39b4-json.log",
"Name": "/silly_hopper",
"RestartCount": 0,
"Driver": "overlay2",
"Platform": "linux",
"MountLabel": "",
"ProcessLabel": "",
"AppArmorProfile": "",
"ExecIDs": null,
"HostConfig": {
"Binds": [
"/var/run/docker.sock:/var/run/docker.sock"
],
"ContainerIDFile": "",
"LogConfig": {
"Type": "json-file",
"Config": {}
},
"NetworkMode": "jenkins",
"PortBindings": {
"2375/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": "2400"
}
]
},
"RestartPolicy": {
"Name": "always",
"MaximumRetryCount": 0
},
"AutoRemove": false,
"VolumeDriver": "",
"VolumesFrom": null,
"CapAdd": null,
"CapDrop": null,
"CgroupnsMode": "private",
"Dns": [],
"DnsOptions": [],
"DnsSearch": [],
"ExtraHosts": null,
"GroupAdd": null,
"IpcMode": "private",
"Cgroup": "",
"Links": null,
"OomScoreAdj": 0,
"PidMode": "",
"Privileged": false,
"PublishAllPorts": false,
"ReadonlyRootfs": false,
"SecurityOpt": null,
"UTSMode": "",
"UsernsMode": "",
"ShmSize": 67108864,
"Runtime": "runc",
"ConsoleSize": [
0,
0
],
"Isolation": "",
"CpuShares": 0,
"Memory": 0,
"NanoCpus": 0,
"CgroupParent": "",
"BlkioWeight": 0,
"BlkioWeightDevice": [],
"BlkioDeviceReadBps": null,
"BlkioDeviceWriteBps": null,
"BlkioDeviceReadIOps": null,
"BlkioDeviceWriteIOps": null,
"CpuPeriod": 0,
"CpuQuota": 0,
"CpuRealtimePeriod": 0,
"CpuRealtimeRuntime": 0,
"CpusetCpus": "",
"CpusetMems": "",
"Devices": [],
"DeviceCgroupRules": null,
"DeviceRequests": null,
"KernelMemory": 0,
"KernelMemoryTCP": 0,
"MemoryReservation": 0,
"MemorySwap": 0,
"MemorySwappiness": null,
"OomKillDisable": null,
"PidsLimit": null,
"Ulimits": null,
"CpuCount": 0,
"CpuPercent": 0,
"IOMaximumIOps": 0,
"IOMaximumBandwidth": 0,
"MaskedPaths": [
"/proc/asound",
"/proc/acpi",
"/proc/kcore",
"/proc/keys",
"/proc/latency_stats",
"/proc/timer_list",
"/proc/timer_stats",
"/proc/sched_debug",
"/proc/scsi",
"/sys/firmware"
],
"ReadonlyPaths": [
"/proc/bus",
"/proc/fs",
"/proc/irq",
"/proc/sys",
"/proc/sysrq-trigger"
]
},
"GraphDriver": {
"Data": {
"LowerDir": "/var/lib/docker/overlay2/ac59c4bb642c204d52676ea5a3f38ab123aa55c28e7c4d32885c29798685bf6c-init/diff:/var/lib/docker/overlay2/33c335c0d0c0267bc8a0de4b181fa6a67212ef993d5b23cc2d932eccd87da979/diff:/var/lib/docker/overlay2/0b76f458f2092bfa7d73629e242e320299fb159448bf52a18647959f1637387c/diff",
"MergedDir": "/var/lib/docker/overlay2/ac59c4bb642c204d52676ea5a3f38ab123aa55c28e7c4d32885c29798685bf6c/merged",
"UpperDir": "/var/lib/docker/overlay2/ac59c4bb642c204d52676ea5a3f38ab123aa55c28e7c4d32885c29798685bf6c/diff",
"WorkDir": "/var/lib/docker/overlay2/ac59c4bb642c204d52676ea5a3f38ab123aa55c28e7c4d32885c29798685bf6c/work"
},
"Name": "overlay2"
},
"Mounts": [
{
"Type": "bind",
"Source": "/var/run/docker.sock",
"Destination": "/var/run/docker.sock",
"Mode": "",
"RW": true,
"Propagation": "rprivate"
}
],
"Config": {
"Hostname": "b643f2a82b31",
"Domainname": "",
"User": "",
"AttachStdin": false,
"AttachStdout": false,
"AttachStderr": false,
"ExposedPorts": {
"2375/tcp": {}
},
"Tty": false,
"OpenStdin": false,
"StdinOnce": false,
"Env": [
"PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
],
"Cmd": [
"tcp-listen:2375,fork,reuseaddr",
"unix-connect:/var/run/docker.sock"
],
"Image": "alpine/socat",
"Volumes": null,
"WorkingDir": "",
"Entrypoint": [
"socat"
],
"OnBuild": null,
"Labels": {}
},
"NetworkSettings": {
"Bridge": "",
"SandboxID": "a818d2616cbaced18798c777a02b95d53c16765842f392151d39272ed0e4be37",
"HairpinMode": false,
"LinkLocalIPv6Address": "",
"LinkLocalIPv6PrefixLen": 0,
"Ports": {
"2375/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": "2400"
}
]
},
"SandboxKey": "/var/run/docker/netns/a818d2616cba",
"SecondaryIPAddresses": null,
"SecondaryIPv6Addresses": null,
"EndpointID": "",
"Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"IPAddress": "",
"IPPrefixLen": 0,
"IPv6Gateway": "",
"MacAddress": "",
"Networks": {
"jenkins": {
"IPAMConfig": null,
"Links": null,
"Aliases": [
"b643f2a82b31"
],
"NetworkID": "37d113c8d79cd12a209822f0ccb12be7058b9c6ccdc62e2b298725537edd703d",
"EndpointID": "08bc7021dd48b6993f51b08a0d0e9333276c64311b4487361b02c476a3a1e3f9",
"Gateway": "172.18.0.1",
"IPAddress": "172.18.0.4",
"IPPrefixLen": 16,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:ac:12:00:04",
"DriverOpts": null
}
}
}
}
]
Worker node setup
Build Docker image
cat Dockerfile
FROM jenkins/agent:alpine-jdk11
# author: mchinnappan
USER root
# install python3
RUN apk add python3
RUN apk add py3-pip
# install vim
RUN apk add vim
# install nodejs and sfdx cli
RUN apk add nodejs
RUN apk add npm
RUN apk add yarn
ARG SALESFORCE_CLI_VERSION=latest
RUN npm install --global sfdx-cli@${SALESFORCE_CLI_VERSION}
RUN apk add jq
USER jenkins
# install required plugins
RUN echo 'y' | sfdx plugins:install sfdx-mohanc-plugins
RUN echo 'y' | sfdx plugins:install sfdx-git-delta
docker build -t docker-agent-sfdx .
[+] Building 160.6s (15/15) FINISHED
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 557B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [internal] load metadata for docker.io/jenkins/agent:alpine-jdk11 0.9s
=> [ 1/11] FROM docker.io/jenkins/agent:alpine-jdk11@sha256:fdf4d5e79b3645be5779130bf3239cf9344b5b1929d538b7d2980be8868550f6 0.0s
=> CACHED [ 2/11] RUN apk add python3 0.0s
=> CACHED [ 3/11] RUN apk add py3-pip 0.0s
=> [ 4/11] RUN apk add vim 1.4s
=> [ 5/11] RUN apk add nodejs 1.3s
=> [ 6/11] RUN apk add npm 1.0s
=> [ 7/11] RUN apk add yarn 0.8s
=> [ 8/11] RUN npm install --global sfdx-cli@latest 30.6s
=> [ 9/11] RUN apk add jq 0.9s
=> [10/11] RUN echo 'y' | sfdx plugins:install sfdx-mohanc-plugins 63.6s
=> [11/11] RUN echo 'y' | sfdx plugins:install sfdx-git-delta 38.8s
=> exporting to image 21.2s
=> => exporting layers 21.1s
=> => writing image sha256:5d988bfa62b447ffe80a0421417b9f6329656f4d78722ea65969d32cdc13b5d4 0.0s
=> => naming to docker.io/library/docker-agent-sfdx
Run the image
docker run -it --entrypoint bash --name docker-agent-sfdx-app docker-agent-sfdx
17ded65a422c:~$ sfdx plugins
sfdx-git-delta 5.13.3
sfdx-mohanc-plugins 0.0.343
exit
Publish this image: 5d988bfa62b447ffe80a0421417b9f6329656f4d78722ea65969d32cdc13b5d4
docker tag 5d988bfa62b447ffe80a0421417b9f6329656f4d78722ea65969d32cdc13b5d4 mohanchinnappan/mc-sfdx
docker images
REPOSITORY TAG IMAGE ID CREATED SIZE
mohanchinnappan/mc-sfdx latest 5d988bfa62b4 24 minutes ago 1.51GB
myjenkins-blueocean 2.387.1-1 fd428091db2d 4 hours ago 805MB
docker dind c365741dcfc2 4 days ago 311MB
alpine/socat latest f0bbf8a4f6a0 2 weeks ago 8.5MB
Push
docker push mohanchinnappan/mc-sfdx
Using default tag: latest
The push refers to repository [docker.io/mohanchinnappan/mc-sfdx]
3c3548e6efa9: Pushed
c991a1b7df1a: Pushed
018486e93789: Pushed
96d7750030cc: Pushed
17ec9d53bd65: Pushed
cc4f6cce9ed0: Pushed
e5fc35b10359: Pushed
26cde4efb06b: Pushed
a91d2cf08b0f: Pushed
0970af15e418: Pushed
5f70bf18a086: Mounted from mohanchinnappan/jupyter-rust
639d13e0e024: Mounted from jenkins/agent
b910852e976b: Mounted from jenkins/agent
bc4c636cd02f: Mounted from jenkins/agent
99b5df13238c: Mounted from jenkins/agent
10009e9661e3: Mounted from jenkins/agent
abc8eda4da3a: Mounted from jenkins/agent
591e5beaaae8: Mounted from jenkins/agent
a34e3a797aad: Mounted from jenkins/agent
7cd52847ad77: Mounted from jenkins/agent
latest: digest: sha256:cf4b1d93c8833298cec8fa5897d22ede339b53193a606c69448ac8ee76a0c90b size: 4532
Published Docker Image
Run it
docker run -it --entrypoint bash mohanchinnappan/mc-sfdx
9900abaf79d2:~$ sfdx plugins
sfdx-git-delta 5.13.3
sfdx-mohanc-plugins 0.0.343
python3
Python 3.10.10 (main, Feb 9 2023, 02:08:14) [GCC 12.2.1 20220924] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> exit()
9900abaf79d2:~$ pwd
/home/jenkins
9900abaf79d2:~$ vim
9900abaf79d2:~$ java --version
openjdk 11.0.18 2023-01-17
OpenJDK Runtime Environment Temurin-11.0.18+10 (build 11.0.18+10)
OpenJDK 64-Bit Server VM Temurin-11.0.18+10 (build 11.0.18+10, mixed mode)
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
bd6b56f8666f mohanchinnappan/mc-sfdx "bash" 36 seconds ago Up 35 seconds reverent_beaver
b643f2a82b31 alpine/socat "socat tcp-listen:23…" 2 hours ago Up 2 hours 127.0.0.1:2400->2375/tcp silly_hopper
c5ea234eee86 myjenkins-blueocean:2.387.1-1 "/usr/bin/tini -- /u…" 5 hours ago Up 2 hours 0.0.0.0:8080->8080/tcp, 0.0.0.0:50000->50000/tcp jenkins-blueocean
58b1537f5136 docker:dind "dockerd-entrypoint.…" 5 hours ago Up 5 hours 2375/tcp, 0.0.0.0:2376->2376/tcp jenkins-docker
Running a build on this worker
Started by user Mohan Chinnappan
Running as SYSTEM
Building remotely on docker-node-sfdx-00006mp66sdok on docker (docker-node-sfdx) in workspace /home/jenkins/workspace/HelloWorldPython
The recommended git tool is: NONE
No credentials specified
Cloning the remote Git repository
Cloning repository https://github.com/mohan-chinnappan-n/jenkins-cicd
> git init /home/jenkins/workspace/HelloWorldPython # timeout=10
Fetching upstream changes from https://github.com/mohan-chinnappan-n/jenkins-cicd
> git --version # timeout=10
> git --version # 'git version 2.38.4'
> git fetch --tags --force --progress -- https://github.com/mohan-chinnappan-n/jenkins-cicd +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/mohan-chinnappan-n/jenkins-cicd # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
> git rev-parse refs/remotes/origin/main^{commit} # timeout=10
Checking out Revision b34f09b31c3cc64d716a6e9ff613e7b71de5fe4c (refs/remotes/origin/main)
> git config core.sparsecheckout # timeout=10
> git checkout -f b34f09b31c3cc64d716a6e9ff613e7b71de5fe4c # timeout=10
Commit message: "init"
> git rev-list --no-walk b34f09b31c3cc64d716a6e9ff613e7b71de5fe4c # timeout=10
[HelloWorldPython] $ /bin/sh -xe /tmp/jenkins15310274698032643005.sh
+ python3 py/hw.py
Hello World!
Finished: SUCCESS
sfdx XML project
Started by user Mohan Chinnappan
Running as SYSTEM
Building remotely on docker-node-sfdx-00006t9wamykj on docker (docker-node-sfdx) in workspace /home/jenkins/workspace/sfdc-project-xml
The recommended git tool is: NONE
No credentials specified
Cloning the remote Git repository
Cloning repository https://github.com/mohan-chinnappan-n/jenkins-cicd
> git init /home/jenkins/workspace/sfdc-project-xml # timeout=10
Fetching upstream changes from https://github.com/mohan-chinnappan-n/jenkins-cicd
> git --version # timeout=10
> git --version # 'git version 2.38.4'
> git fetch --tags --force --progress -- https://github.com/mohan-chinnappan-n/jenkins-cicd +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/mohan-chinnappan-n/jenkins-cicd # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
> git rev-parse refs/remotes/origin/main^{commit} # timeout=10
Checking out Revision 12763086762b99f46569a109d06c03a4e8cfd65d (refs/remotes/origin/main)
> git config core.sparsecheckout # timeout=10
> git checkout -f 12763086762b99f46569a109d06c03a4e8cfd65d # timeout=10
Commit message: "xml added"
First time build. Skipping changelog.
[sfdc-project-xml] $ /bin/sh -xe /tmp/jenkins13699243142254465595.sh
+ sfdx mohanc:xml:transform -i xml/Admin.profile-meta.xml -m Profile
=== Writing the output into file : xml/Admin.profile-meta.xml.html ===
Finished: SUCCESS
Pipeline
Field | Format of valid values |
---|---|
Min | 0-59 |
Hour | 0-23 |
Day of the month | 1-31 |
Month | 1-12 (or JAN to DEC) |
Day of the week | 0-6 (or SUN to SAT; or 7 for Sunday) |
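The `pollSCM '*/5 * * * *'` trigger used in the script below follows this five-field format. As a sanity check (plain Python, not Jenkins code), a minimal expansion of a single cron field -- real cron parsers also handle ranges and month/day names:

```python
# Expand a single cron field such as "*/5" over its valid range.
def expand_field(field, lo, hi):
    if field == "*":
        return list(range(lo, hi + 1))
    if field.startswith("*/"):          # step syntax, e.g. every 5 minutes
        step = int(field[2:])
        return list(range(lo, hi + 1, step))
    return [int(v) for v in field.split(",")]  # explicit list, e.g. "1,15"

# "*/5" in the minute position (0-59) -> 0, 5, 10, ..., 55
minutes = expand_field("*/5", 0, 59)
print(minutes[:4], len(minutes))  # [0, 5, 10, 15] 12
```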
Script
pipeline {
agent {
node { label 'docker-node-sfdx' }
}
triggers {
pollSCM '*/5 * * * *'
}
environment {
SF_USERNAME = 'mohan.chinnappan.n.sel2@gmail.com'
SF_INSTANCE_URL = 'https://login.salesforce.com'
SF_CONSUMER_KEY = '3MVG9vp1McGXQBYGcnDR0UfAip.JjJVJUbccWajKPlMgbFIV.C9g3FdmKaKfOYlrJ6iY4QUQKnC1F8dzu8YAz'
SERVER_KEY_CREDENTALS_ID='cb689b37-ff89-490c-87b0-da6983e03854'
}
stages {
stage('Prepare') {
steps {
withCredentials([file(credentialsId: SERVER_KEY_CREDENTALS_ID, variable: 'server_key_file')]) {
echo "pip install"
sh '''
cd py
pip3 install -r requirements.txt
echo ${SF_USERNAME}
sfdx force:auth:jwt:grant -u ${SF_USERNAME} -f ${server_key_file} -i ${SF_CONSUMER_KEY}
sfdx force:data:soql:query --query "SELECT Id, Name, Account.Name FROM Contact" -u ${SF_USERNAME}
sfdx mohanc:org:viz -u ${SF_USERNAME}
wc -l Org.json
'''
}
}
}
stage('Test') {
steps {
echo "Testing it"
sh '''
cd py
python3 double.py double 27
'''
}
}
stage('Deploy') {
steps {
echo 'Deploy....'
sh '''
echo "Deploying..."
'''
}
}
}
}
Setup
Demo
Started by user Mohan Chinnappan
Obtained py/Jenkinsfile from git https://github.com/mohan-chinnappan-n/jenkins-cicd
[Pipeline] Start of Pipeline
[Pipeline] node
Still waiting to schedule task
‘Jenkins’ doesn’t have label ‘docker-node-sfdx’
Running on docker-node-sfdx-0000ejftdsk7i on docker in /home/jenkins/workspace/Pipeline-Python-CLI-Test
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Declarative: Checkout SCM)
[Pipeline] checkout
The recommended git tool is: git
No credentials specified
Cloning the remote Git repository
Cloning repository https://github.com/mohan-chinnappan-n/jenkins-cicd
> git init /home/jenkins/workspace/Pipeline-Python-CLI-Test # timeout=10
Fetching upstream changes from https://github.com/mohan-chinnappan-n/jenkins-cicd
> git --version # timeout=10
> git --version # 'git version 2.38.4'
> git fetch --tags --force --progress -- https://github.com/mohan-chinnappan-n/jenkins-cicd +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/mohan-chinnappan-n/jenkins-cicd # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
Checking out Revision a53c0b8dfdfa5617d000ee07a6b8e2172370a933 (refs/remotes/origin/main)
Commit message: "py fire test added"
> git rev-parse refs/remotes/origin/main^{commit} # timeout=10
> git config core.sparsecheckout # timeout=10
> git checkout -f a53c0b8dfdfa5617d000ee07a6b8e2172370a933 # timeout=10
> git rev-list --no-walk a53c0b8dfdfa5617d000ee07a6b8e2172370a933 # timeout=10
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Prepare)
[Pipeline] withCredentials
Masking supported pattern matches of $server_key_file
[Pipeline] {
[Pipeline] echo
pip install
[Pipeline] sh
+ cd py
+ pip3 install -r requirements.txt
Defaulting to user installation because normal site-packages is not writeable
Collecting fire==0.5.0
Downloading fire-0.5.0.tar.gz (88 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 88.3/88.3 kB 2.2 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Requirement already satisfied: six in /usr/lib/python3.10/site-packages (from fire==0.5.0->-r requirements.txt (line 1)) (1.16.0)
Collecting termcolor
Downloading termcolor-2.2.0-py3-none-any.whl (6.6 kB)
Installing collected packages: termcolor, fire
DEPRECATION: fire is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
Running setup.py install for fire: started
Running setup.py install for fire: finished with status 'done'
Successfully installed fire-0.5.0 termcolor-2.2.0
+ echo mohan.chinnappan.n.sel2@gmail.com
mohan.chinnappan.n.sel2@gmail.com
+ sfdx force:auth:jwt:grant -u mohan.chinnappan.n.sel2@gmail.com -f **** -i 3MVG9vp1McGXQBYGcnDR0UfAip.JjJVJUbccWajKPlMgbFIV.C9g3FdmKaKfOYlrJ6iY4QUQKnC1F8dzu8YAz
Warning: The "-u" flag has been deprecated. Use "--username" instead.
Successfully authorized mohan.chinnappan.n.sel2@gmail.com with org ID 00D8W000004LymuUAC
+ sfdx force:data:soql:query --query 'SELECT Id, Name, Account.Name FROM Contact' -u mohan.chinnappan.n.sel2@gmail.com
Warning: The "force data soql query" command has been deprecated. Use "data query" instead.
Warning: The "-u" flag has been deprecated. Use "--target-org" instead.
Querying Data...
ID NAME ACCOUNT.NAME
────────────────── ───────────────── ───────────────────────────────────
0038W00001ulzMpQAI Rose Gonzalez Edge Communications
0038W00001ulzMqQAI Sean Forbes Edge Communications
0038W00001ulzMrQAI Jack Rogers Burlington Textiles Corp of America
0038W00001ulzMsQAI Pat Stumuller Pyramid Construction Inc.
0038W00001ulzMtQAI Andy Young Dickenson plc
0038W00001ulzMuQAI Tim Barr Grand Hotels & Resorts Ltd
0038W00001ulzMvQAI John Bond Grand Hotels & Resorts Ltd
0038W00001ulzMwQAI Stella Pavlova United Oil & Gas Corp.
0038W00001ulzMxQAI Lauren Boyle United Oil & Gas Corp.
0038W00001ulzMyQAI Josh Davis Express Logistics and Transport
0038W00001ulzMzQAI Jane Grey University of Arizona
0038W00001ulzN0QAI Arthur Song United Oil & Gas Corp.
0038W00001ulzN1QAI Avi Green United Oil & Gas Corp.
0038W00001ulzN2QAI Babara Levy Express Logistics and Transport
0038W00001ulzN3QAI Ashley James United Oil & Gas, UK
0038W00001ulzN4QAI Tom Ripley United Oil & Gas, Singapore
0038W00001ulzN5QAI Liz D'Cruz United Oil & Gas, Singapore
0038W00001ulzN6QAI Edna Frank GenePoint
0038W00001ulzN7QAI Siddartha Nedaerk sForce
0038W00001ulzN8QAI Jake Llorrac sForce
Total number of records retrieved: 20.
Querying Data... done
+ sfdx mohanc:org:viz -u mohan.chinnappan.n.sel2@gmail.com
=== Working on getOrgData ===
=== Working on getOrgLimits ===
=== Working on getCount:ApexClass ===
=== Working on getCount:ApexPage ===
=== Working on getCount:ConnectedApplication ===
=== Working on getCount:CustomPermission ===
=== Working on getCount:ExternalDataSource ===
=== Working on getCount:FieldPermissions ===
=== Working on getCount:FieldSecurityClassification ===
=== Working on getCount:Group ===
=== Working on getCount:GroupMember ===
=== Working on getCount:NamedCredential ===
=== Working on getCount:ObjectPermissions ===
=== Working on getCount:PackageLicense ===
=== Working on getCount:PermissionSet ===
=== Working on getCount:PermissionSetAssignment ===
=== Working on getCount:PermissionSetGroup ===
=== Working on getCount:PermissionSetLicense ===
=== Working on getCount:Profile ===
=== Working on getCount:QueueSobject ===
=== Working on getCount:SetupAssistantStep ===
=== Working on getCount:SetupEntityAccess ===
=== Working on getCount:User ===
=== Working on getCount:UserLicense ===
=== Working on getCount:UserPackageLicense ===
=== Working on getCount:UserRole ===
=== Working on getCount:Account ===
=== Working on getCount:AccountContactRole ===
=== Working on getCount:Asset ===
=== Working on getCount:Campaign ===
=== Working on getCount:CampaignMember ===
=== Working on getCount:Case ===
=== Working on getCount:CaseStatus ===
=== Working on getCount:Contact ===
=== Working on getCount:Contract ===
=== Working on getCount:ContractContactRole ===
=== Working on getCount:Lead ===
=== Working on getCount:Opportunity ===
=== Working on getCount:OpportunityCompetitor ===
=== Working on getCount:OpportunityContactRole ===
=== Working on getCount:OpportunityStage ===
=== Working on getCount:Order ===
=== Working on getCount:Partner ===
=== Working on getCount:PartnerRole ===
=== Working on getCount:Account ===
=== Working on getCount:Case ===
=== Working on getCount:CaseComment ===
=== Working on getCount:CaseHistory ===
=== Working on getCount:CaseSolution ===
=== Working on getCount:Contact ===
=== Working on getCount:Solution ===
=== Working on getPackageInfo ===
=== Working on EntityDefinitionSummary:Account ===
=== Working on EntityDefinitionSummary:Contact ===
=== Working on EntityDefinitionSummary:AccountContactRole ===
=== Working on EntityDefinitionSummary:Opportunity ===
=== Working on EntityDefinitionSummary:OpportunityContactRole ===
=== Working on EntityDefinitionSummary:OpportunityCompetitor ===
=== Working on EntityDefinitionSummary:Lead ===
=== Working on EntityDefinitionSummary:Case ===
=== Working on EntityDefinitionSummary:Campaign ===
=== Working on EntityDefinitionSummary:CampaignMember ===
=== Working on EntityDefinitionSummary:Asset ===
=== Working on EntityDefinitionSummary:Contract ===
=== Working on EntityDefinitionSummary:ContractContactRole ===
=== Working on EntityDefinitionSummary:Order ===
=== Working on getApexCodeCoverage ===
=== Working on getBusinessProcess ===
=== Working on getCertificate ===
=== Working on CspTrustedSite ===
=== Working on getOauthToken ===
=== Working on getSetupAuditTrail ===
=== Working on getInactiveUsers ===
=== Working on getUnusedProfiles ===
=== Working on getUsedProfiles ===
=== Working on getUsedPermissionSets ===
=== Working on getUnusedPermissionSets ===
=== Working on getUsedProfiles ===
=== Working on getUsedRoles ===
=== Working on getSysAdminUsers ===
=== Working on getProfileInfo:System Administrator ===
Error: {
"name": "INVALID_FIELD",
"errorCode": "INVALID_FIELD"
}
=== Working on getProfileInfo:Customer Community Login User ===
Error: {
"name": "INVALID_FIELD",
"errorCode": "INVALID_FIELD"
}
=== Working on getPermissionSetInfo:Sales_Ops ===
Error: {
"name": "INVALID_FIELD",
"errorCode": "INVALID_FIELD"
}
=== Working on getProfileForUserLicense:Guest ===
Error: {
"name": "INVALID_FIELD",
"errorCode": "INVALID_FIELD"
}
=== Working on getUserLicenseInfo ===
=== Working on getNetworkMemberInfo ===
Error: {
"name": "INVALID_TYPE",
"errorCode": "INVALID_TYPE"
}
=== Working on getHasEinsteinDataDiscovery ===
=== Working on getHasEDD ===
=== Working on getHasEinsteinDataDiscovery ===
=== Working on getCountTooling:flow ===
=== Working on getFlows ===
=== Working on getFlowMetadata ===
=== Writing Org JSON in file Org.json ...
=== Writing visualization in file Org.svg ...
Visualization done. "open Org.svg" in Chrome Browser to view the Visualization.
+ wc -l Org.json
2726 Org.json
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] echo
Testing it
[Pipeline] sh
+ cd py
+ python3 double.py double 27
54
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Deploy)
[Pipeline] echo
Deploy....
[Pipeline] sh
+ echo Deploying...
Deploying...
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
Resources
- How to use OAuth2.0 based JWT flow to login to a Salesforce org using DX
- Sample Pipelines
- Configure Your Environment for Jenkins
- Learn Jenkins! Complete Jenkins Course - Zero to Hero
git
- A well-designed branching strategy is key to the success of your deployments
Topic
Basic Operations
Cheat Sheet
Merge Conflicts
Using git logs
my .gitconfig
[alias]
lg = lg1
lg1 = lg1-specific --all
lg2 = lg2-specific --all
lg3 = lg3-specific --all
lg1-specific = log --graph --abbrev-commit --decorate --format=format:'%C(bold blue)%h%C(reset) - %C(bold green)(%ar)%C(reset) %C(white)%s%C(reset) %C(dim white)- %an%C(reset)%C(auto)%d%C(reset)'
lg2-specific = log --graph --abbrev-commit --decorate --format=format:'%C(bold blue)%h%C(reset) - %C(bold cyan)%aD%C(reset) %C(bold green)(%ar)%C(reset)%C(auto)%d%C(reset)%n'' %C(white)%s%C(reset) %C(dim white)- %an%C(reset)'
lg3-specific = log --graph --abbrev-commit --decorate --format=format:'%C(bold blue)%h%C(reset) - %C(bold cyan)%aD%C(reset) %C(bold green)(%ar)%C(reset) %C(bold cyan)(committed: %cD)%C(reset) %C(auto)%d%C(reset)%n'' %C(white)%s%C(reset)%n'' %C(dim white)- %an <%ae> %C(reset) %C(dim white)(committer: %cn <%ce>)%C(reset)'
Let us run `git lg3` on the tensorflow repo
~/tf/tensorflow [master] >git lg3
* 0d4d75bc2b7 - Sun, 22 Jan 2023 01:31:56 -0800 (8 hours ago) (committed: Sun, 22 Jan 2023 01:36:16 -0800) (HEAD -> master, origin/master, origin/HEAD)
| Add no_pip tag for `pywrap_quantize_model_test`.
| - Dan Suh <dansuh@google.com> (committer: TensorFlower Gardener <gardener@tensorflow.org>)
* 0b9a85fa851 - Sun, 22 Jan 2023 01:01:52 -0800 (8 hours ago) (committed: Sun, 22 Jan 2023 01:12:11 -0800)
| compat: Update forward compatibility horizon to 2023-01-22
| - A. Unique TensorFlower <gardener@tensorflow.org> (committer: TensorFlower Gardener <gardener@tensorflow.org>)
* a35dac30a05 - Sun, 22 Jan 2023 01:01:52 -0800 (8 hours ago) (committed: Sun, 22 Jan 2023 01:06:31 -0800)
| Update GraphDef version to 1384.
| - A. Unique TensorFlower <gardener@tensorflow.org> (committer: TensorFlower Gardener <gardener@tensorflow.org>)
* 7a0e6f3a58c - Sat, 21 Jan 2023 14:45:12 -0800 (18 hours ago) (committed: Sat, 21 Jan 2023 14:49:53 -0800) (origin/nightly)
| Use tensor.extract_slice instead of gml_st.materialize.
| - Alexander Belyaev <pifon@google.com> (committer: TensorFlower Gardener <gardener@tensorflow.org>)
* 114579d598c - Sat, 21 Jan 2023 14:15:52 -0800 (19 hours ago) (committed: Sat, 21 Jan 2023 14:19:18 -0800)
| [StableHLO to MHLO] Handle bounds in the CholeskyOp shape function
| - Gunhyun Park <gunhyun@google.com> (committer: TensorFlower Gardener <gardener@tensorflow.org>)
* 5c880d3ebe3 - Sat, 21 Jan 2023 13:41:09 -0800 (20 hours ago) (committed: Sat, 21 Jan 2023 13:46:12 -0800)
| [StableHLO to MHLO] Handle bounds in the ReduceWindowOp shape function
| - Gunhyun Park <gunhyun@google.com> (committer: TensorFlower Gardener <gardener@tensorflow.org>)
* 421205cd514 - Sat, 21 Jan 2023 13:30:20 -0800 (20 hours ago) (committed: Sat, 21 Jan 2023 13:33:14 -0800)
| Hotfix for breakage
| - Johannes Reifferscheid <jreiffers@google.com> (committer: TensorFlower Gardener <gardener@tensorflow.org>)
* 6ca0c86111d - Sat, 21 Jan 2023 13:02:20 -0800 (20 hours ago) (committed: Sat, 21 Jan 2023 13:06:28 -0800)
| Replace 0l with int64_t{0}.
| - Johannes Reifferscheid <jreiffers@google.com> (committer: TensorFlower Gardener <gardener@tensorflow.org>)
* 5d9c9ba31d6 - Sat, 21 Jan 2023 12:31:25 -0800 (21 hours ago) (committed: Sat, 21 Jan 2023 12:40:01 -0800)
| [StableHLO to MHLO] Verify exponent and mantissa in ReducePrecisionOp
| - Gunhyun Park <gunhyun@google.com> (committer: TensorFlower Gardener <gardener@tensorflow.org>)
* 4a67ff1bfd4 - Sat, 21 Jan 2023 12:30:41 -0800 (21 hours ago) (committed: Sat, 21 Jan 2023 12:34:19 -0800)
| [StableHLO to MHLO] Handle bounds in the SelectAndScatterOp shape function
| - Gunhyun Park <gunhyun@google.com> (committer: TensorFlower Gardener <gardener@tensorflow.org>)
* 0c4eb3ace86 - Sat, 21 Jan 2023 11:18:46 -0800 (22 hours ago) (committed: Sat, 21 Jan 2023 11:21:55 -0800)
| [StableHLO to MHLO] Fix typos in infer_stablehlo.mlir tests
:
Script to extract git logs into a TSV file
# gitLogs.sh
# Script to produce git logs from the current branch
# Run this command when you are in the branch for which you need gitlogs
# mchinnappan
branch="$(git branch | sed -n -e 's/^\* \(.*\)/\1/p')"
echo "=== branch: ${branch} ==="
tsvoutput="${branch}-$(date +"%Y_%m_%d_%I_%M_%p").tsv"
echo "=== Output file: $tsvoutput ==="
git log --pretty='format:%h|%an|%ae|%s' | sed 's/|/\t/g' > "${tsvoutput}"
open "${tsvoutput}"
- Run it on Tensorflow master
bash gitLogs.sh
=== branch: master ===
=== Output file: master-2023_01_22_12_13_PM.tsv ===
wc -l master-2023_01_22_12_13_PM.tsv
141977 master-2023_01_22_12_13_PM.tsv
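Since the script emits tab-separated `hash  author  email  subject` rows, the resulting file can be loaded with Python's csv module. A minimal sketch (the sample row, including the email address, is illustrative):

```python
import csv
import io

# Each row produced by gitLogs.sh is: hash <TAB> author <TAB> email <TAB> subject
sample = "9fd2000\tmohan-chinnappan-n\tmohan@example.com\tWI-123456 adding new docs\n"

rows = list(csv.reader(io.StringIO(sample), delimiter="\t"))
commit_hash, author, email, subject = rows[0]
print(subject)  # WI-123456 adding new docs
```

Note that a `|` inside a commit subject would also be translated by the sed step in the script above; using `%x09` (a literal tab) directly in the `--pretty` format string avoids that.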
Word Cloud on Tensorflow git logs
Using SF-LAND Data Viz to visualize the tensorflow git logs
Using SF-LAND Data Viz to view the model file
References
git internals
Git is a content-addressable system
- It is a simple key-value data store
- You can insert any kind of content into a Git Repo.
- Git will hand you back a unique key you can use later to retrieve that content
git hash-object
- This command takes in some data and stores it in the .git/objects folder -- the object database -- and gives back a unique key which references this object
Let us play with git hash-object
~ >git init git-playground
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: git branch -m <name>
Initialized empty Git repository in /Users/mchinnappan/git-playground/.git/
~ >cd git-playground
~/git-playground [master] >tree
.
0 directories, 0 files
~/git-playground [master] >tree .git
.git
├── HEAD
├── config
├── description
├── hooks
│ ├── applypatch-msg.sample
│ ├── commit-msg.sample
│ ├── fsmonitor-watchman.sample
│ ├── post-update.sample
│ ├── pre-applypatch.sample
│ ├── pre-commit.sample
│ ├── pre-merge-commit.sample
│ ├── pre-push.sample
│ ├── pre-rebase.sample
│ ├── pre-receive.sample
│ ├── prepare-commit-msg.sample
│ ├── push-to-checkout.sample
│ └── update.sample
├── info
│ └── exclude
├── objects
│ ├── info
│ └── pack
└── refs
├── heads
└── tags
8 directories, 17 files
Let us add content to object database
~/git-playground [master] >echo 'test content' | git hash-object -w --stdin
d670460b4b4aece5915caf5c68d12f560a9fe3e4
~/git-playground [master] >tree
.
0 directories, 0 files
~/git-playground [master] >tree .git
.git
├── HEAD
├── config
├── description
├── hooks
│ ├── applypatch-msg.sample
│ ├── commit-msg.sample
│ ├── fsmonitor-watchman.sample
│ ├── post-update.sample
│ ├── pre-applypatch.sample
│ ├── pre-commit.sample
│ ├── pre-merge-commit.sample
│ ├── pre-push.sample
│ ├── pre-rebase.sample
│ ├── pre-receive.sample
│ ├── prepare-commit-msg.sample
│ ├── push-to-checkout.sample
│ └── update.sample
├── info
│ └── exclude
├── objects
│ ├── d6 <----- first 2 characters of the SHA-1
│ │ └── 70460b4b4aece5915caf5c68d12f560a9fe3e4 <------- SHA-1 checksum of the content and its header.
│ ├── info
│ └── pack
└── refs
├── heads
└── tags
9 directories, 18 files
~/git-playground [master] >
find .git/objects -type f
.git/objects/d6/70460b4b4aece5915caf5c68d12f560a9fe3e4
Let us examine the content of the object with git cat-file
git cat-file -p d670460b4b4aece5915caf5c68d12f560a9fe3e4
test content
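The key Git hands back is not random: it is the SHA-1 of the content prefixed with a header of the form `blob <size>\0`. A minimal Python sketch reproducing the hash above:

```python
import hashlib

content = b"test content\n"  # echo adds the trailing newline
header = b"blob %d\x00" % len(content)  # "blob 13" plus a NUL byte
sha1 = hashlib.sha1(header + content).hexdigest()
print(sha1)  # d670460b4b4aece5915caf5c68d12f560a9fe3e4
```

This is why identical content always maps to the same object key, no matter who stores it or when.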
Let us add a file to the object database
echo 'Version: 1' > test.txt
git hash-object -w test.txt
f8acaeb421037571491c643e96acc09faea03b24
# Let us see the object created by this:
find .git/objects -type f
.git/objects/d6/70460b4b4aece5915caf5c68d12f560a9fe3e4
.git/objects/f8/acaeb421037571491c643e96acc09faea03b24 <------
Let us add version 2 of the test.txt
echo 'Version: 2' > test.txt
git hash-object -w test.txt
939ba388fd33e17497577f5cfacd6aa84ef219b9
# Let us see the object created by this:
find .git/objects -type f
.git/objects/93/9ba388fd33e17497577f5cfacd6aa84ef219b9 <-----
.git/objects/d6/70460b4b4aece5915caf5c68d12f560a9fe3e4
.git/objects/f8/acaeb421037571491c643e96acc09faea03b24
- Now you can delete your local copy of that test.txt file, then use Git to retrieve, from the object database, either the first version or second version you saved
rm test.txt
ls -l
git cat-file -p 939ba388fd33e17497577f5cfacd6aa84ef219b9
Version: 2
# Version 1:
git cat-file -p f8acaeb421037571491c643e96acc09faea03b24
Version: 1
The type of this SHA-1 is a blob
git cat-file -t f8acaeb421037571491c643e96acc09faea03b24
blob
- All the content in git is stored as tree and blob objects
- with trees corresponding to UNIX directory entries
- blobs corresponding more or less to i-nodes or file contents
Tree Objects
- A tree object solves the problem of storing the filename and also allows you to store a group of files together
~/git-playground [master] >mkdir src
~/git-playground [master] >touch README.md
~/git-playground [master] >touch src/hello.js
tree
.
├── README.md --> blob
└── src --> tree
└── hello.js --> blob
~/git-playground [master] >git add -A
~/git-playground [master] >git commit -m init
[master (root-commit) 27c4b42] init
2 files changed, 0 insertions(+), 0 deletions(-)
create mode 100644 README.md
create mode 100644 src/hello.js
~/git-playground [master] >tree .git
.git
├── COMMIT_EDITMSG
├── HEAD
├── config
├── description
├── hooks
│ ├── applypatch-msg.sample
│ ├── commit-msg.sample
│ ├── fsmonitor-watchman.sample
│ ├── post-update.sample
│ ├── pre-applypatch.sample
│ ├── pre-commit.sample
│ ├── pre-merge-commit.sample
│ ├── pre-push.sample
│ ├── pre-rebase.sample
│ ├── pre-receive.sample
│ ├── prepare-commit-msg.sample
│ ├── push-to-checkout.sample
│ └── update.sample
├── index
├── info
│ └── exclude
├── logs
│ ├── HEAD
│ └── refs
│ └── heads
│ └── master
├── objects
│ ├── 27
│ │ └── c4b42327458637549fb9d00170bdc7b07ed77b
│ ├── 5a
│ │ └── 38b439d9a9e0a7d8d12f1e88cad5f881968a6b
│ ├── 93
│ │ └── 9ba388fd33e17497577f5cfacd6aa84ef219b9
│ ├── d6
│ │ └── 70460b4b4aece5915caf5c68d12f560a9fe3e4
│ ├── e1
│ │ └── 52c4c52532fda392dece4c503567d30a9ae059
│ ├── e6
│ │ └── 9de29bb2d1d6434b8b29ae775ad8c2e48c5391
│ ├── f8
│ │ └── acaeb421037571491c643e96acc09faea03b24
│ ├── info
│ └── pack
└── refs
├── heads
│ └── master
└── tags
18 directories, 29 files
The `master^{tree}` syntax specifies the tree object that is pointed to by the last commit on your `master` branch.
~/git-playground [master] >git cat-file -p master^{tree}
100644 blob e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 README.md
040000 tree e152c4c52532fda392dece4c503567d30a9ae059 src
~/git-playground [master] >vi src/hello.js
~/git-playground [master] >cat src/hello.js
console.log('hello');
git add -A
~/git-playground [master] >git commit -m 'hello added'
>tree
.
├── README.md
└── src
└── hello.js
References
Videos
git hooks
Good Projects have good git hooks!
sequenceDiagram
  autonumber
  Developer->>VSCode: VSCode extension: <br/> Validates code on-the-fly
  Developer->>git: Prepare for commit
  git->>Developer: Using git hooks: <br/>Validates commit message <br/>Performs code-scan<br/>
  AutoDeltaDeployer->>Org: Every 30 min perform:<br/>Code-scan<br/> Bails out if threshold is reached
Pre-requisites
- Make sure your PMD tools are installed according to this
- Check whether Python is already installed on your system by typing the following in the terminal:
python
Or:
python3
- If the above command succeeds, Python is already installed; if not, continue with the next steps.
- Python can be installed in one of these 2 ways:
- Install from python.org
- Install via Homebrew:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)"
- After Homebrew is installed, install Python:
brew install python
- Make sure the required SFDX plugin is installed:
sfdx plugins:install sfdx-mohanc-plugins
- All the hooks have to be made executable. For example:
chmod +x prepare-commit-msg pre-commit commit-msg
- You can verify this:
/treeprj/.git/hooks [patch1] >ls -l prepare-commit-msg pre-commit commit-msg
-rwxr-xr-x 1 mchinnappan staff 522 Jan 25 08:43 commit-msg
-rwxr-xr-x 1 mchinnappan staff 1050 Feb 8 03:47 pre-commit
-rwxr-xr-x 1 mchinnappan staff 177 Jan 25 05:54 prepare-commit-msg
- Here we explain how to use the power of git hooks to check/enforce:
  - commit comment quality
  - that the committed code is deployable
- Git hooks can run:
  - locally:
    - prepare-commit-msg
    - pre-commit
    - commit-msg
    - post-commit
  - on the git server:
    - pre-receive
    - update
    - post-receive
- Local hooks can only serve as guidance and should not be relied upon for enforcement:
  - They are not copied over to the new repository when you run git clone -- the .git/hooks directory isn't cloned with the rest of your project, nor is it under version control.
  - They can be altered by anybody with access to the repository.
  - They are easily disabled by the user -- by deleting the .git/hooks/ directory or by passing --no-verify on the git CLI.
  - We need to find a way to make sure hooks stay up-to-date amongst your team members.
  - You can't force developers to create commits that look a certain way; you can only encourage them to do so.
- Solution
  - A simple solution to these local problems is to:
    - Store your hooks in the actual project directory (above the .git directory). This lets you edit them like any other version-controlled file.
    - To install a hook, create a symlink to it in .git/hooks.
  - Git Templates Directory
    - Git's template directory mechanism makes it easier to install hooks automatically.
    - All of the files and directories contained in the template directory are copied into the .git directory every time you use git init or git clone.
- Server hooks are a better choice for enforcing rules.
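The symlink-install step described above can be scripted. A minimal sketch, assuming the version-controlled hooks live in a top-level `hooks/` directory (that directory name is a convention assumed here, not from the original):

```python
import os
import tempfile

def install_hooks(repo_root, hooks_dir="hooks"):
    """Symlink every file in <repo_root>/<hooks_dir> into <repo_root>/.git/hooks."""
    src_dir = os.path.join(repo_root, hooks_dir)
    dst_dir = os.path.join(repo_root, ".git", "hooks")
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        dst = os.path.join(dst_dir, name)
        if os.path.lexists(dst):
            os.remove(dst)  # replace a stale hook or a *.sample file
        os.symlink(src, dst)

# Demo against a throwaway directory layout (no real repo needed)
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "hooks"))
open(os.path.join(root, "hooks", "pre-commit"), "w").close()
install_hooks(root)
print(os.path.islink(os.path.join(root, ".git", "hooks", "pre-commit")))  # True
```

Run it once after `git clone`; pairing it with Git's `init.templateDir` setting automates the copy on every clone.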
Topics we will cover here:
Sample prepare-commit-msg
cat .git/hooks/prepare-commit-msg
#!/usr/bin/env python3
import sys
import subprocess
# Run the git command to get the current branch name
result = subprocess.run(['git', 'rev-parse', '--abbrev-ref', 'HEAD'], stdout=subprocess.PIPE)
# Extract the branch name from the command output
branch_name = result.stdout.decode('utf-8').strip()
#print(branch_name)
# read .git/COMMIT_EDITMSG
commit_msg_filepath = sys.argv[1]
with open(commit_msg_filepath, "w") as f:
# change the following as needed
f.write(f"{branch_name} your info\n#You must include Ticket reference in the first line")
prepare-commit-msg in action
git branch WI-123456
git checkout WI-123456
Switched to branch 'WI-123456'
touch r3.md
[WI-123456]- >git add .
[WI-123456]- >git commit
[WI-123456 e855e40] WI-123456 added to support training
1 file changed, 0 insertions(+), 0 deletions(-)
create mode 100644 r3.md
[WI-123456]- > git lg
* e855e40 - (3 minutes ago) WI-123456 added to support training - mohan-chinnappan-n (HEAD -> WI-123456)

commit-msg
- With commit-msg we can enforce the commit message format
#!/usr/bin/env python3
# python script check the format of the commit with a regex
# valid: WI-123456 adding new docs
# echo $? -> 0
# invalid: " adding more docs"
# echo $? -> 1
# TODO: change the regex to fit your needs
import re,sys
input = sys.argv[1]
print('INPUT', input)
prefix='==='
with open(input, 'r') as file:
data = file.read()
p = re.compile(r"^WI-[0-9]{1,6}\s[a-zA-Z0-9\s]{1,30}")  # at least one digit is required after WI-
print (f'{prefix} commit msg: {data} ')
if bool(p.match(data)):
print (f'{prefix} Valid commit message!{prefix}')
else:
print (f'{prefix} Invalid commit message! bailing out... {prefix}')
exit(1)
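To sanity-check the pattern outside the hook, it can be exercised directly. This sketch uses `^WI-[0-9]{1,6}\s`, a variant that requires at least one digit after `WI-` (a nested quantifier such as `([0-9]*){1,6}` would also accept zero digits):

```python
import re

# Requires: WI-, 1-6 digits, a space, then a short alphanumeric summary
PATTERN = re.compile(r"^WI-[0-9]{1,6}\s[a-zA-Z0-9\s]{1,30}")

def is_valid(msg):
    return bool(PATTERN.match(msg))

print(is_valid("WI-123456 adding new docs"))  # True
print(is_valid(" adding more docs"))          # False
print(is_valid("WI- adding docs"))            # False (no ticket number)
```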
Testing
cat ".git/COMMIT_EDITMSG"
WI-123456 adding new docs
~/treeprj [patch1] >python .git/hooks/commit-msg ".git/COMMIT_EDITMSG"
Invalid commit message! bailing out...
~/treeprj [patch1] >echo $?
1
~/treeprj [patch1] >python .git/hooks/commit-msg ".git/COMMIT_EDITMSG"
Valid commit message!
~/treeprj [patch1] >echo $?
0
Demo
git commit
Jan 25, 2023 8:45:02 AM net.sourceforge.pmd.PMD encourageToUseIncrementalAnalysis
WARNING: This analysis could be faster, please consider using Incremental Analysis: https://pmd.github.io/pmd-6.47.0/pmd_userdocs_incremental_analysis.html
SELECT COUNT(*) AS CNT FROM CSV("results.csv", {headers:true}) WHERE Priority < 3
nerrors: 0
Continue the deployment...
Deploying v55.0 metadata to test-uzsmfdqkhtk7@example.com using the v56.0 SOAP API
Deploy ID: 0AfDM000016PncS0AS
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
FULL NAME TYPE PROJECT PATH
────────── ───────── ──────────────────────────────────────────────────────
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls-meta.xml
Successfully validated the deployment.
('INPUT', '.git/COMMIT_EDITMSG')
Valid commit message!
[patch1 9fd2000] WI-123456 adding new docs
3 files changed, 3 insertions(+), 2 deletions(-)
create mode 100644 docs/12.md
~/treeprj [patch1] >git lg | head -n 1
* 9fd2000 - (53 seconds ago) WI-123456 adding new docs - mohan-chinnappan-n (HEAD -> patch1)
pre-commit hook
- Note that this will increase commit time, but you will be assured that what is committed is deployable
- Here the deploy check is shown for classes, but you can extend it as needed.
cat .git/hooks/pre-commit
#!/usr/bin/env bash
sfdx force:source:deploy -p /Users/mchinnappan/treeprj/force-app/main/default/classes/ -u test-uzsmfdqkhtk7@example.com -c --verbose
pre-commit hook in action
- Script used for this
~/treeprj [patch1] >
sfdx force:source:deploy -p /Users/mchinnappan/treeprj/force-app/main/default/classes/ -u test-uzsmfdqkhtk7@example.com -c --verbose
Deploying v55.0 metadata to test-uzsmfdqkhtk7@example.com using the v56.0 SOAP API
Deploy ID: 0AfDM000016PUr90AG
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
FULL NAME TYPE PROJECT PATH
────────── ───────── ──────────────────────────────────────────────────────
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls-meta.xml
Successfully validated the deployment.
~/treeprj [patch1] >echo $?
0
- Note that here a check-only (validation) deploy is run, via the -c flag, to make sure the code is deployable
~/treeprj [patch1] >vi docs/9.md
~/treeprj [patch1] >git add -A
~/treeprj [patch1] >git commit
Deploying v55.0 metadata to test-uzsmfdqkhtk7@example.com using the v56.0 SOAP API
Deploy ID: 0AfDM000016PUwn0AG
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
FULL NAME TYPE PROJECT PATH
────────── ───────── ──────────────────────────────────────────────────────
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls-meta.xml
Successfully validated the deployment.
[patch1 9f3ce34] PATCH1 WI-2345 - more docs added
1 file changed, 2 insertions(+)
create mode 100644 docs/9.md
Adding PMD code scan in pre-commit
~/treeprj [patch1] >vi docs/10.md
~/treeprj [patch1] >git add -A
~/treeprj [patch1] >git commit
Jan 25, 2023 6:41:18 AM net.sourceforge.pmd.PMD encourageToUseIncrementalAnalysis
WARNING: This analysis could be faster, please consider using Incremental Analysis: https://pmd.github.io/pmd-6.47.0/pmd_userdocs_incremental_analysis.html
SELECT COUNT(*) AS CNT FROM CSV("results.csv", {headers:true}) WHERE Priority < 4
nerrors: 3
Number of P1,P2 and P3 issues are: 3. Stopping the deployment!
cat .git/hooks/pre-commit
#!/usr/bin/env bash
# Script for pipeline
# Bails out if count of P1,P2 and P3 are not zero
# -----------------------------------------
#----- configure the following to meet your needs ----
RULESET=~/.pmd/apex_ruleset.xml
MSG="Number of P1,P2 and P3 issues are"
THRESHOLD=3  # counts findings with Priority < THRESHOLD; set to 4 to include P1, P2 and P3
CODE=/Users/mchinnappan/treeprj/force-app/main/default/classes/
#-----------------------------------------------------
PREFIX="==="
# Run the pmd
echo "$PREFIX Running PMD scan on ${CODE} ${PREFIX}..."
echo "Going to run: pmd-run.sh pmd -R $RULESET -d $CODE -f csv > results.csv ..."
pmd-run.sh pmd -R $RULESET -d $CODE -f csv > results.csv
# query the results using SQL
echo "SELECT COUNT(*) AS CNT FROM CSV(\"results.csv\", {headers:true}) WHERE Priority < $THRESHOLD" > q.sql
cat q.sql
sfdx mohanc:data:query:sql -q q.sql > out.json
# check for the errors
echo "$PREFIX Checking for PMD violations ... $PREFIX"
nerrors=`sfdx mohanc:data:jq -f '.[].CNT' -i out.json`
echo "nerrors: $nerrors"
if [ "$nerrors" -ne 0 ]
then
echo "$PREFIX $MSG: $nerrors. Stopping the deployment! $PREFIX "
exit 2
else
echo "$PREFIX No PMD violations! $PREFIX"
fi
echo "$PREFIX Continue the deployment...$PREFIX"
echo sfdx force:source:deploy -p $CODE -u test-uzsmfdqkhtk7@example.com -c --verbose
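The counting logic in the hook (PMD findings with Priority below a threshold) can be sketched in plain Python, without the sfdx plugins. The CSV rows below are hypothetical, with columns abbreviated from PMD's CSV renderer:

```python
import csv
import io

def count_violations(csv_text, threshold=4):
    """Count PMD findings whose Priority is below the threshold.

    With threshold=4, priorities 1-3 (P1, P2, P3) are counted,
    mirroring the `Priority < $THRESHOLD` check in the hook.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return sum(1 for row in reader if int(row["Priority"]) < threshold)

# Hypothetical PMD CSV output (columns abbreviated)
sample = """Problem,Package,File,Priority,Line,Description
1,,HelloPeach.cls,3,10,Avoid debug statements
2,,HelloPeach.cls,1,2,Missing ApexDoc
3,,HelloPeach.cls,5,7,Minor style issue
"""

print(count_violations(sample))  # → 2 (the P1 and P3 rows)
```

A hook can call a function like this and exit non-zero when the count is above zero, just as the bash version does.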
Demo of successful commit
~/treeprj [patch1] >git add -A
~/treeprj [patch1] >git commit
Jan 25, 2023 6:51:23 AM net.sourceforge.pmd.PMD encourageToUseIncrementalAnalysis
WARNING: This analysis could be faster, please consider using Incremental Analysis: https://pmd.github.io/pmd-6.47.0/pmd_userdocs_incremental_analysis.html
SELECT COUNT(*) AS CNT FROM CSV("results.csv", {headers:true}) WHERE Priority < 3
nerrors: 0
Continue the deployment...
Deploying v55.0 metadata to test-uzsmfdqkhtk7@example.com using the v56.0 SOAP API
Deploy ID: 0AfDM000016PX2w0AG
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
FULL NAME TYPE PROJECT PATH
────────── ───────── ──────────────────────────────────────────────────────
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls-meta.xml
Successfully validated the deployment.
[patch1 b1ebf5a] PATCH1 WI-4567 - add more docs
5 files changed, 9 insertions(+)
create mode 100644 docs/10.md
create mode 100644 docs/11.md
...
~/treeprj [patch1] >git lg
* b1ebf5a - (2 minutes ago) PATCH1 WI-4567 - add more docs - mohan-chinnappan-n (HEAD -> patch1)
References
git operations
- Refer to the git chapter before starting this.
Topics
init
~/git-playground [master] >cd
~ >mkdir git-demo
~ >cd git-demo
~/git-demo >ls
~/git-demo >git init .
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: git branch -m <name>
Initialized empty Git repository in /Users/mchinnappan/git-demo/.git/
~/git-demo [master] >git config --global init.defaultBranch
~/git-demo [master] >git config --global init.defaultBranch main
~/git-demo [master] >git branch -m main
~/git-demo [main] >code .
~/git-demo [main] >git status
On branch main
No commits yet
Untracked files:
(use "git add <file>..." to include in what will be committed)
greet.js
nothing added to commit but untracked files present (use "git add" to track)
stage
~/git-demo [main] >git add greet.js
status
~/git-demo [main] >git status
On branch main
No commits yet
Changes to be committed:
(use "git rm --cached <file>..." to unstage)
new file: greet.js
commit
~/git-demo [main] >git commit -m 'init'
[main (root-commit) 6738865] init
1 file changed, 2 insertions(+)
create mode 100644 greet.js
~/git-demo [main] >git status
On branch main
nothing to commit, working tree clean
logs
git log
commit 6738865ba8b8549b62b9fbadf4b4041ac891e36e (HEAD -> main)
Author: mohan-chinnappan-n <mohan.chinnappan.n@gmail.com>
Date: Fri Feb 3 07:27:03 2023 -0500
init
~/git-demo [main] >git lg
* 6738865 - (12 seconds ago) init - mohan-chinnappan-n (HEAD -> main)
~/git-demo [main] >
branch
~/git-demo [main] >git branch feature-0001
~/git-demo [main] >git branch
  feature-0001
* main
~/git-demo [main] >git checkout feature-0001
Switched to branch 'feature-0001'
~/git-demo [feature-0001] >git lg
* 6738865 - (24 minutes ago) init - mohan-chinnappan-n (HEAD -> feature-0001, main)
~/git-demo [feature-0001] >git branch
* feature-0001
main
work in feature-0001 branch
~/git-demo [feature-0001] >vi greet.js
~/git-demo [feature-0001] >cat greet.js
const msg = 'hello world';
console.log(msg);
~/git-demo [feature-0001] >node greet.js
hello world
commit the changes
~/git-demo [feature-0001] >git add greet.js
~/git-demo [feature-0001] >git commit -m 'msg updated'
[feature-0001 4f8fbaa] msg updated
1 file changed, 3 insertions(+), 2 deletions(-)
~/git-demo [feature-0001] >git lg
* 4f8fbaa - (9 seconds ago) msg updated - mohan-chinnappan-n (HEAD -> feature-0001)
* 6738865 - (25 minutes ago) init - mohan-chinnappan-n (main)
Keep working on feature-0001
~/git-demo [feature-0001] >vi greet.js
~/git-demo [feature-0001] >cat greet.js
const msg = process.argv[2] || 'hello world';
console.log(msg);
~/git-demo [feature-0001] >node greet.js HelloWorld
HelloWorld
~/git-demo [feature-0001] >node greet.js
hello world
~/git-demo [feature-0001] >git add greet.js
~/git-demo [feature-0001] >git commit -m 'added msg from commandline param'
[feature-0001 6aff16d] added msg from commandline param
1 file changed, 1 insertion(+), 1 deletion(-)
logs
~/git-demo [feature-0001] >git lg2
* 6aff16d - Fri, 3 Feb 2023 07:57:30 -0500 (69 seconds ago) (HEAD -> feature-0001)
| added msg from commandline param - mohan-chinnappan-n
* 4f8fbaa - Fri, 3 Feb 2023 07:52:23 -0500 (6 minutes ago)
| msg updated - mohan-chinnappan-n
* 6738865 - Fri, 3 Feb 2023 07:27:03 -0500 (32 minutes ago) (main)
init - mohan-chinnappan-n
Switch back to main, make a few updates, and commit
~/git-demo [feature-0001] >git checkout main
Switched to branch 'main'
~/git-demo [main] >vi greet.js
~/git-demo [main] >cat greet.js
const msg = 'Greetings';
console.log(msg);
~/git-demo [main] >node greet.js
Greetings
~/git-demo [main] >git add greet.js
~/git-demo [main] >git commit -m 'msg is changed'
[main 972c964] msg is changed
1 file changed, 2 insertions(+), 2 deletions(-)
~/git-demo [main] >git lg2
* 972c964 - Fri, 3 Feb 2023 08:00:54 -0500 (6 seconds ago) (HEAD -> main)
| msg is changed - mohan-chinnappan-n
| * 6aff16d - Fri, 3 Feb 2023 07:57:30 -0500 (4 minutes ago) (feature-0001)
| | added msg from commandline param - mohan-chinnappan-n
| * 4f8fbaa - Fri, 3 Feb 2023 07:52:23 -0500 (9 minutes ago)
|/ msg updated - mohan-chinnappan-n
* 6738865 - Fri, 3 Feb 2023 07:27:03 -0500 (34 minutes ago)
init - mohan-chinnappan-n
Let us merge feature-0001 into main to see the conflicts
~/git-demo [main] >git merge feature-0001
Auto-merging greet.js
CONFLICT (content): Merge conflict in greet.js
Automatic merge failed; fix conflicts and then commit the result.
Use the merge editor in VS Code to resolve the conflicts
~/git-demo (git)-[main|merge]- >git status
On branch main
All conflicts fixed but you are still merging.
(use "git commit" to conclude merge)
Changes to be committed:
modified: greet.js
~/git-demo (git)-[main|merge]- >git add greet.js
~/git-demo (git)-[main|merge]- >git commit -m 'merged with feature-0001'
[main 50fa27f] merged with feature-0001
~/git-demo [main] >git lg2
* 50fa27f - Fri, 3 Feb 2023 08:09:16 -0500 (6 seconds ago) (HEAD -> main)
|\ merged with feature-0001 - mohan-chinnappan-n
| * 6aff16d - Fri, 3 Feb 2023 07:57:30 -0500 (12 minutes ago) (feature-0001)
| | added msg from commandline param - mohan-chinnappan-n
| * 4f8fbaa - Fri, 3 Feb 2023 07:52:23 -0500 (17 minutes ago)
| | msg updated - mohan-chinnappan-n
* | 972c964 - Fri, 3 Feb 2023 08:00:54 -0500 (8 minutes ago)
|/ msg is changed - mohan-chinnappan-n
* 6738865 - Fri, 3 Feb 2023 07:27:03 -0500 (42 minutes ago)
init - mohan-chinnappan-n
~/git-demo [main] >cat greet.js
const msg = process.argv[2] || 'hello world';
console.log(msg);
Some more updates on feature-0001
~/git-demo [main] >git checkout feature-0001
M	greet.js
Switched to branch 'feature-0001'
~/git-demo [feature-0001] >vi greet.js
~/git-demo [feature-0001] >vi greet.js
~/git-demo [feature-0001] >node greet.js
hello world
30
~/git-demo [feature-0001] >cat greet.js
const msg = process.argv[2] || 'hello world';
console.log(msg);
const add = (a,b) => a+b;
console.log(add(10,20));
~/git-demo [feature-0001] >git status
On branch feature-0001
Changes not staged for commit:
(use "git add <file>..." to update what will be committed)
(use "git restore <file>..." to discard changes in working directory)
modified: greet.js
no changes added to commit (use "git add" and/or "git commit -a")
~/git-demo [feature-0001] >git add greet.js
~/git-demo [feature-0001] >git commit -m 'add function added'
[feature-0001 af3e5f4] add function added
1 file changed, 3 insertions(+)
~/git-demo [feature-0001] >git lg3
* af3e5f4 - Sat, 4 Feb 2023 04:40:59 -0500 (4 seconds ago) (committed: Sat, 4 Feb 2023 04:40:59 -0500) (HEAD -> feature-0001)
| add function added
| - mohan-chinnappan-n <mohan.chinnappan.n@gmail.com> (committer: mohan-chinnappan-n <mohan.chinnappan.n@gmail.com>)
| * 50fa27f - Fri, 3 Feb 2023 08:09:16 -0500 (21 hours ago) (committed: Fri, 3 Feb 2023 08:09:16 -0500) (main)
| |\ merged with feature-0001
| |/ - mohan-chinnappan-n <mohan.chinnappan.n@gmail.com> (committer: mohan-chinnappan-n <mohan.chinnappan.n@gmail.com>)
|/|
* | 6aff16d - Fri, 3 Feb 2023 07:57:30 -0500 (21 hours ago) (committed: Fri, 3 Feb 2023 07:57:30 -0500)
| | added msg from commandline param
| | - mohan-chinnappan-n <mohan.chinnappan.n@gmail.com> (committer: mohan-chinnappan-n <mohan.chinnappan.n@gmail.com>)
* | 4f8fbaa - Fri, 3 Feb 2023 07:52:23 -0500 (21 hours ago) (committed: Fri, 3 Feb 2023 07:52:23 -0500)
| | msg updated
| | - mohan-chinnappan-n <mohan.chinnappan.n@gmail.com> (committer: mohan-chinnappan-n <mohan.chinnappan.n@gmail.com>)
| * 972c964 - Fri, 3 Feb 2023 08:00:54 -0500 (21 hours ago) (committed: Fri, 3 Feb 2023 08:00:54 -0500)
|/ msg is changed
| - mohan-chinnappan-n <mohan.chinnappan.n@gmail.com> (committer: mohan-chinnappan-n <mohan.chinnappan.n@gmail.com>)
* 6738865 - Fri, 3 Feb 2023 07:27:03 -0500 (21 hours ago) (committed: Fri, 3 Feb 2023 07:27:03 -0500)
init
- mohan-chinnappan-n <mohan.chinnappan.n@gmail.com> (committer: mohan-chinnappan-n <mohan.chinnappan.n@gmail.com>)
~/git-demo [feature-0001] >vi greet.js
~/git-demo [feature-0001] >node greet.js
hello world
30
200
~/git-demo [feature-0001] >cat greet.js
const msg = process.argv[2] || 'hello world';
console.log(msg);
const add = (a,b) => a+b;
console.log(add(10,20));
const mul = (a,b) => a*b;
console.log(mul(10,20));
~/git-demo [feature-0001] >git add greet.js
~/git-demo [feature-0001] >git commit -m 'mul function added'
[feature-0001 d99c263] mul function added
1 file changed, 3 insertions(+)
~/git-demo [feature-0001] >git lg
* d99c263 - (36 seconds ago) mul function added - mohan-chinnappan-n (HEAD -> feature-0001)
* af3e5f4 - (3 minutes ago) add function added - mohan-chinnappan-n
| * 50fa27f - (21 hours ago) merged with feature-0001 - mohan-chinnappan-n (main)
| |\
| |/
|/|
* | 6aff16d - (21 hours ago) added msg from commandline param - mohan-chinnappan-n
* | 4f8fbaa - (21 hours ago) msg updated - mohan-chinnappan-n
| * 972c964 - (21 hours ago) msg is changed - mohan-chinnappan-n
|/
* 6738865 - (21 hours ago) init - mohan-chinnappan-n
~/git-demo [feature-0001] >git lg2
* d99c263 - Sat, 4 Feb 2023 04:43:06 -0500 (43 seconds ago) (HEAD -> feature-0001)
| mul function added - mohan-chinnappan-n
* af3e5f4 - Sat, 4 Feb 2023 04:40:59 -0500 (3 minutes ago)
| add function added - mohan-chinnappan-n
| * 50fa27f - Fri, 3 Feb 2023 08:09:16 -0500 (21 hours ago) (main)
| |\ merged with feature-0001 - mohan-chinnappan-n
| |/
|/|
* | 6aff16d - Fri, 3 Feb 2023 07:57:30 -0500 (21 hours ago)
| | added msg from commandline param - mohan-chinnappan-n
* | 4f8fbaa - Fri, 3 Feb 2023 07:52:23 -0500 (21 hours ago)
| | msg updated - mohan-chinnappan-n
| * 972c964 - Fri, 3 Feb 2023 08:00:54 -0500 (21 hours ago)
|/ msg is changed - mohan-chinnappan-n
* 6738865 - Fri, 3 Feb 2023 07:27:03 -0500 (21 hours ago)
init - mohan-chinnappan-n
~/git-demo [feature-0001] >git log --pretty='format:%h|%an|%ae|%s'
d99c263|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|mul function added
af3e5f4|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|add function added
6aff16d|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|added msg from commandline param
4f8fbaa|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|msg updated
6738865|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|init
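Pipe-delimited log output like the above is easy to post-process. A small sketch that turns each `%h|%an|%ae|%s` line into a record (the two sample lines are taken from the log above):

```python
def parse_log_lines(text):
    """Parse `git log --pretty='format:%h|%an|%ae|%s'` output into dicts.

    maxsplit=3 keeps a subject that itself contains '|' in one piece.
    """
    records = []
    for line in text.strip().splitlines():
        sha, author, email, subject = line.split("|", 3)
        records.append({"sha": sha, "author": author,
                        "email": email, "subject": subject})
    return records

log_text = (
    "d99c263|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|mul function added\n"
    "af3e5f4|mohan-chinnappan-n|mohan.chinnappan.n@gmail.com|add function added"
)

for rec in parse_log_lines(log_text):
    print(rec["sha"], rec["subject"])
```

Records like these can then be loaded into a CSV or a reporting tool.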
cherry-pick af3e5f4 into main
~/git-demo [feature-0001] >cat greet.js
const msg = process.argv[2] || 'hello world';
console.log(msg);
const add = (a,b) => a+b;
console.log(add(10,20));
const mul = (a,b) => a*b;
console.log(mul(10,20));
~/git-demo [feature-0001] >git checkout main
Switched to branch 'main'
~/git-demo [main] >cat greet.js
const msg = process.argv[2] || 'hello world';
console.log(msg);
~/git-demo [main] >git cherry-pick af3e5f4
[main befe251] add function added
Date: Sat Feb 4 04:40:59 2023 -0500
1 file changed, 3 insertions(+)
~/git-demo [main] >cat greet.js
const msg = process.argv[2] || 'hello world';
console.log(msg);
const add = (a,b) => a+b;
console.log(add(10,20));
~/git-demo [main] >node greet.js
hello world
30
~/git-demo [main] >git lg2
* befe251 - Sat, 4 Feb 2023 04:40:59 -0500 (10 minutes ago) (HEAD -> main)
| add function added - mohan-chinnappan-n
* 50fa27f - Fri, 3 Feb 2023 08:09:16 -0500 (21 hours ago)
|\ merged with feature-0001 - mohan-chinnappan-n
* | 972c964 - Fri, 3 Feb 2023 08:00:54 -0500 (21 hours ago)
| | msg is changed - mohan-chinnappan-n
| | * d99c263 - Sat, 4 Feb 2023 04:43:06 -0500 (7 minutes ago) (feature-0001)
| | | mul function added - mohan-chinnappan-n
| | * af3e5f4 - Sat, 4 Feb 2023 04:40:59 -0500 (10 minutes ago)<---------- cherry-picked
| |/ add function added - mohan-chinnappan-n
| * 6aff16d - Fri, 3 Feb 2023 07:57:30 -0500 (21 hours ago)
| | added msg from commandline param - mohan-chinnappan-n
| * 4f8fbaa - Fri, 3 Feb 2023 07:52:23 -0500 (21 hours ago)
|/ msg updated - mohan-chinnappan-n
* 6738865 - Fri, 3 Feb 2023 07:27:03 -0500 (21 hours ago)
init - mohan-chinnappan-n
References
pull request (PR) template
- All good projects have a PR template
Steps
- Create .github/pull_request_template.md in your project repo
- Here is a sample .github/pull_request_template.md:
- [ ] I completed my code review with my peers
- [ ] I completed code scan as explained [here](https://github.com/mohan-chinnappan-n/cli-dx/blob/master/mdapi/pmd-codescan.md)
- [ ] I validated the test code coverage for correctness and confirmed that **no** shortcuts to inflate test code coverage are present in the code
- [ ] I completed a checkOnly (-c) deployment and this code is deployable
Demo
- Here we can see how a pull request goes through this PR template

git diff

- Takes two input data sets and outputs the changes between them. Runs a diff function on Git data sources and displays the sections of the file that have changed.
- Our example:
- HEAD^ (one before HEAD):
Hello World
Hello Earth
- HEAD :
Hello World
Hello Earth
Hello Mars
Hello Venus
Git diff output line | Comments |
---|---|
git diff HEAD^..HEAD | Our command |
diff --git a/greet.txt b/greet.txt | Input sources of the diff |
index cb896b5..712612f 100644 | Internal Git metadata |
--- a/greet.txt | Marker for the old (a) version |
+++ b/greet.txt | Marker for the new (b) version |
@@ -1,2 +1,4 @@ | Hunk header: 2 lines starting at line 1 of the old version correspond to 4 lines starting at line 1 of the new version |
Hello World | Each changed line is prefixed with a + or - symbol indicating which version of the diff input it comes from |
Hello Earth | |
+Hello Mars | |
+Hello Venus | |
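The hunk header can also be decoded programmatically. A minimal sketch, following the unified-diff convention that a missing count defaults to 1:

```python
import re

HUNK_RE = re.compile(r"@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))? @@")

def parse_hunk_header(line):
    """Decode a unified-diff hunk header like `@@ -1,2 +1,4 @@`.

    Returns (old_start, old_count, new_start, new_count); a count
    omitted in the header defaults to 1.
    """
    m = HUNK_RE.match(line)
    if not m:
        raise ValueError(f"not a hunk header: {line!r}")
    old_start, old_count, new_start, new_count = m.groups()
    return (int(old_start), int(old_count or 1),
            int(new_start), int(new_count or 1))

print(parse_hunk_header("@@ -1,2 +1,4 @@"))  # → (1, 2, 1, 4)
```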
Using diff to compare HEAD^ and file
git diff HEAD^ greet.txt
diff --git a/greet.txt b/greet.txt
index cb896b5..712612f 100644
--- a/greet.txt
+++ b/greet.txt
@@ -1,2 +1,4 @@
Hello World
Hello Earth
+Hello Mars
+Hello Venus
Compare 2 branches
git diff main..featureBranch
Compare a file in 2 branches
~/git-experiments [main] >git branch feature
~/git-experiments [main] >git checkout feature
Switched to branch 'feature'
~/git-experiments [feature] >vi greet.txt
~/git-experiments [feature] >cat greet.txt
Hello World
Hello Earth
Hello Mars
Hello Venus
Hello Farms
~/git-experiments [feature] >git add -A
~/git-experiments [feature] >git commit -m 'farms added'
[feature b3359ea] farms added
1 file changed, 1 insertion(+)
~/git-experiments [feature] >git diff main feature greet.txt
diff --git a/greet.txt b/greet.txt
index 712612f..3cacf68 100644
--- a/greet.txt
+++ b/greet.txt
@@ -2,3 +2,4 @@ Hello World
Hello Earth
Hello Mars
Hello Venus
+Hello Farms
How to compare a file in 2 branches
~/git-experiments [feature] > git diff main greet.txt
diff --git a/greet.txt b/greet.txt
index 712612f..3cacf68 100644
--- a/greet.txt
+++ b/greet.txt
@@ -2,3 +2,4 @@ Hello World
Hello Earth
Hello Mars
Hello Venus
+Hello Farms
How to find the files that changed between 2 git commits
~/git-experiments [feature] >git diff --name-only HEAD..HEAD^
greet.txt
How to find deleted files between 2 commits
git diff --diff-filter=D commit1..commit2
More on --diff-filter
- The --diff-filter option is used with the git diff command to filter the differences that are displayed based on the type of change that has occurred. Here are some examples of how to use the --diff-filter option:
- git diff --diff-filter=M: Shows only the files that have been modified between two commits.
- git diff --diff-filter=A: Shows only the files that have been added between two commits.
- git diff --diff-filter=D: Shows only the files that have been deleted between two commits.
- git diff --diff-filter=R: Shows only the files that have been renamed between two commits.
- git diff --diff-filter=C: Shows only the files that have been copied between two commits.
- git diff --diff-filter=T: Shows only the files that have had their type (mode) changed between two commits.
- git diff --diff-filter=B: Shows only the files that have had their pairing broken (rewritten) between two commits.
- You can also combine the --diff-filter option with other options, such as --name-only or --name-status, to further refine the output of the git diff command.
- For example, git diff --name-only --diff-filter=M would show only the names of the files that have been modified between two commits. Similarly, git diff --name-status --diff-filter=D would show the names of the files that have been deleted, along with a status indicator.
--diff-filter values (a combination of these):
A Added
-- lists the filename(s) that were added between 2 commits
~/git-experiments [feature] >git diff --name-only --diff-filter=A HEAD..HEAD^
welcome.txt
C Copied
D Deleted
-- lists the files that were deleted between 2 commits
M Modified
R Renamed
T have their type (mode) changed
U Unmerged
X Unknown
B have had their pairing Broken
* All-or-none
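These status letters are the same ones that appear in `git diff --name-status` output. A small sketch that groups such lines by change type (the sample lines are hypothetical):

```python
def group_by_status(name_status_text):
    """Group `git diff --name-status` output lines by change type.

    Status letters match the --diff-filter values: A(dded),
    M(odified), D(eleted), R(enamed), C(opied), T(ype changed), ...
    """
    groups = {}
    for line in name_status_text.strip().splitlines():
        status, _, path = line.partition("\t")
        # Renames/copies carry a similarity score, e.g. `R100`;
        # keep just the leading letter. For renames, `path` holds
        # the old and new names separated by a tab.
        groups.setdefault(status[0], []).append(path)
    return groups

sample = "M\tgreet.txt\nA\twelcome.txt\nD\told.txt\nR100\ta.txt\tb.txt"
print(group_by_status(sample))
```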
Compare a file between the patch1 and master branches
~/treeprj [patch1] >git commit
=== Running PMD scan on /Users/mchinnappan/treeprj/force-app/main/default/classes/ ===...
Going to run: pmd-run.sh pmd -R /Users/mchinnappan/.pmd/apex_ruleset.xml -d /Users/mchinnappan/treeprj/force-app/main/default/classes/ -f csv > results.csv ...
Feb 26, 2023 1:17:01 PM net.sourceforge.pmd.PMD encourageToUseIncrementalAnalysis
WARNING: This analysis could be faster, please consider using Incremental Analysis: https://pmd.github.io/pmd-6.47.0/pmd_userdocs_incremental_analysis.html
SELECT COUNT(*) AS CNT FROM CSV("results.csv", {headers:true}) WHERE Priority < 3
=== Checking for PMD violations ... ===
nerrors: 0
=== No PMD violations! ===
=== Continue the deployment...===
sfdx force:source:deploy -p /Users/mchinnappan/treeprj/force-app/main/default/classes/ -u test-uzsmfdqkhtk7@example.com -c --verbose
INPUT .git/COMMIT_EDITMSG
=== commit msg: WI-123456 fixing the doc format
=== Valid commit message!===
[patch1 dd63730] WI-123456 fixing the doc format
1 file changed, 1 insertion(+), 1 deletion(-)
# list all the branches including remotes
~/treeprj [master] >git branch -a
* master
patch1
remotes/origin/master
# list the file content in both branches master and patch1
~/treeprj [master] >git checkout patch1
Switched to branch 'patch1'
~/treeprj [patch1] >cat docs/9.md
# adding more docs
doc9 added
~/treeprj [master] >cat docs/9.md
# adding more docs
doc9 added
add more line item here
# compare what is in patch1 (a) with what is in the master (b)
git diff patch1 docs/9.md
diff --git a/docs/9.md b/docs/9.md
index 1f8cc7c..89dd9f3 100644
--- a/docs/9.md
+++ b/docs/9.md
@@ -1,3 +1,4 @@
# adding more docs
doc9 added
+add more line item here
Python script to find changed and deleted files between two commits
import subprocess

# Replace these with the commit hashes or branch names you want to compare
old_commit = "HEAD"
new_commit = "HEAD^"

# Get a list of all files that were changed between the two commits
changed_files = subprocess.check_output(
    ["git", "diff", "--name-only", old_commit, new_commit]
).decode("utf-8").splitlines()

# Get a list of all files that were deleted between the two commits
deleted_files = subprocess.check_output(
    ["git", "diff", "--diff-filter=D", "--name-only", old_commit, new_commit]
).decode("utf-8").splitlines()

# Print the results
print("Changed files:")
for filename in changed_files:
    print(filename)

print("\nDeleted files:")
for filename in deleted_files:
    print(filename)
git merge
- The command used to integrate changes from one branch into another. Merging combines changes made in separate branches and creates a new commit that includes the changes from both branches.
- Merging is a powerful feature of git that allows multiple developers to work on the same codebase simultaneously and combine their changes in a controlled and efficient way.
Git Branch Merge Demo
Create Git Repo
#!/usr/bin/env bash
# git-branch-merge.sh: merge branch1 into branch2
merge_branches() {
local branch1="$1"
local branch2="$2"
# Validate branch names
if ! git rev-parse --quiet --verify "$branch1" >/dev/null; then
echo "Error: Branch '$branch1' does not exist."
return 1
fi
if ! git rev-parse --quiet --verify "$branch2" >/dev/null; then
echo "Error: Branch '$branch2' does not exist."
return 1
fi
# Checkout branch2
git checkout "$branch2"
# Merge branch1 into branch2
git merge "$branch1"
# Check if merge was successful
if [ $? -eq 0 ]; then
echo "Merge completed successfully."
else
echo "Error: Merge failed."
return 1
fi
}
merge_branches $1 $2
~/proj1 >git init
Initialized empty Git repository in /Users/mchinnappan/proj1/.git/
~/proj1 (git)-[main]- >vi notes.md
~/proj1 (git)-[main]- >git add -A
~/proj1 (git)-[main]- >git commit -m init
[main (root-commit) d2fdd7b] init
1 file changed, 2 insertions(+)
create mode 100644 notes.md
~/proj1 (git)-[main]- >git log
commit d2fdd7b25355531e93dca4fcc6c5457d6596f7f8 (HEAD -> main)
Author: mohan chinnappan <mohan.chinnappan.n@gmail.com>
Date: Tue May 16 09:29:44 2023 -0400
init
Create branch dev
~/proj1 (git)-[main]- >git branch dev
List the branches
~/proj1 (git)-[main]- >git branch -a
dev
* main
Edit notes.md in main branch
~/proj1 (git)-[main]- >vi notes.md
~/proj1 (git)-[main]- >git add -A
~/proj1 (git)-[main]- > cat notes.md
one
two
~/proj1 (git)-[main]- >git commit -m update_on_notes
[main 61fb145] update_on_notes
1 file changed, 1 insertion(+)
Checkout dev
~/proj1 (git)-[main]- >git checkout dev
Switched to branch 'dev'
~/proj1 (git)-[dev]- >cat notes.md
one
merge from main to dev
~/proj1 (git)-[dev]- >bash git-branch-merge.sh main dev
Already on 'dev'
Updating d2fdd7b..61fb145
Fast-forward
notes.md | 1 +
1 file changed, 1 insertion(+)
Merge completed successfully.
Check merge
~/proj1 (git)-[dev]- >cat notes.md
one
two
How to list the merges done on the branch by an author
git log --merges --author="Ken"
Example - list the merges done by the author: mohan
git log --merges --author="mohan"
commit e307a67fec2e3af51d9361ef17f5c2153cbaf83a
Merge: c5c7a82 e6881af
Author: mohan chinnappan <mohan.chinnappan.n@gmail.com>
Date: Fri May 5 18:10:43 2023 -0400
Merge branch 'feature-0002'
commit c5c7a82be50820c548353f731881ef562c2c3aa5
Merge: b7353b8 a78f1a5
Author: mohan chinnappan <mohan.chinnappan.n@gmail.com>
Date: Fri May 5 18:06:14 2023 -0400
Merge branch 'feature-0002'
commit 50fa27fed0f56d1de8588aadd42db00cf522b24c
Merge: 972c964 6aff16d
Author: mohan-chinnappan-n <mohan.chinnappan.n@gmail.com>
Date: Fri Feb 3 08:09:16 2023 -0500
merged with feature-0001
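Merge activity per author can also be tallied by printing only the author name. A small sketch over the output of `git log --merges --pretty='format:%an'` (the sample lines are hypothetical):

```python
from collections import Counter

def merges_per_author(author_lines):
    """Count merge commits per author from
    `git log --merges --pretty='format:%an'` output."""
    return Counter(line.strip() for line in author_lines.strip().splitlines())

# Hypothetical output: one author name per merge commit
sample = """mohan chinnappan
mohan chinnappan
mohan-chinnappan-n"""

print(merges_per_author(sample))
```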
Steps in performing merge
- git checkout
~/git-demo (git)-[feature-0002]- >git checkout main
Switched to branch 'main'
- git merge
git merge feature-0002
Auto-merging iris.csv
Merge made by the 'ort' strategy.
iris.csv | 1 +
1 file changed, 1 insertion(+)
head iris.csv
sepal_length,sepal_width,petal_length,petal_width,species
5.1,3.5,1.4,0.2,Iris-setosa
5.1,3.5,1.4,0.2,Iris-setosa
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3,1.4,0.2,Iris-setosa
4.9,3,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
5,3.6,1.4,0.2,Iris-setosa
5.4,3.9,1.7,0.4,Iris-setosa
- This command merges the changes from feature-0002 into the current branch (main).
- Before performing the merge, git checks whether there are any conflicts between the two branches. If there are conflicts, git will prompt you to resolve them manually.
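Conceptually, a three-way merge compares each side against their common ancestor: a line changed by only one side wins, and a line changed differently by both sides is a conflict. A toy line-wise sketch (real git first aligns lines with a diff algorithm, so files need not be the same length; this simplification assumes equal-length line lists):

```python
def merge3(base, ours, theirs):
    """Naive line-wise three-way merge for equal-length line lists.

    For each line: an unchanged side yields to the changed side,
    identical changes agree, and differing changes are a conflict.
    """
    merged, conflicts = [], []
    for i, (b, o, t) in enumerate(zip(base, ours, theirs)):
        if o == b:
            merged.append(t)        # only theirs changed (or nobody did)
        elif t == b or o == t:
            merged.append(o)        # only ours changed, or both agree
        else:
            merged.append(o)
            conflicts.append(i)     # both changed differently: conflict
    return merged, conflicts

base   = ["hello world", "line two", "line three"]
ours   = ["Greetings",   "line two", "line three"]
theirs = ["hello world", "line two", "line 3!"]
print(merge3(base, ours, theirs))  # → (['Greetings', 'line two', 'line 3!'], [])
```

When `conflicts` is non-empty, a real tool would write conflict markers and ask you to resolve them, just as git does.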
git log
Customizing the git log
- Since: last year
/lwc-recipes (git)-[main]- >
git logs-csv 'last year'
=== Getting logs for since:last year with the filter: ADM ===
=== 50 /tmp/_gitlog_.csv lines. Opening in the your web browser to view the logs ===
- Since: 2022-01-01
git logs-csv '2022-01-01'
=== Getting logs for since:2023-01-01 with the filter: ADM ===
=== 89 /tmp/_gitlog_.csv lines. Opening in the your web browser to view the logs ===
How to set this up - extending git with an alias
- Get the script git-logs-csv
- Store it in your local folder say
~/git-scripts
- Make it executable
chmod +x /path/to/git-logs-csv
- Configure it to run by git
git config --global alias.logs-csv '!/path/to/git-logs-csv'
git stash
- In git, a "stash" is a feature that allows you to temporarily save changes that are not yet ready to be committed to the repository.
- Stashing is useful when you need to switch branches or work on a different feature, but you don't want to commit your changes yet.
- When you stash changes in git, you create a "stash" object that contains all the changes you've made since your last commit.
- You can then switch to a different branch or work on a different feature, and when you're ready to resume work on the original changes, you can apply the stash and restore your changes.
Here's how to stash changes in git:
Suppose you have uncommitted local changes and a git pull fails because of them:
git pull
remote: Enumerating objects: 20, done.
remote: Counting objects: 100% (20/20), done.
remote: Compressing objects: 100% (3/3), done.
remote: Total 20 (delta 16), reused 20 (delta 16), pack-reused 0
Unpacking objects: 100% (20/20), 2.94 KiB | 100.00 KiB/s, done.
From :xxxxx
error: Your local changes to the following files would be overwritten by merge:
....
Please commit your changes or stash them before you merge.
Aborting
Run the following command:
git stash save "message"
Saved working directory and index state On develop: stashed
This will create a new stash entry with a message describing the changes you've stashed. (Newer git versions prefer git stash push -m "message" over the deprecated git stash save.)
You can now switch to a different branch or work on a different feature.
When you're ready to restore your stashed changes, run the following command:
git stash apply
error: Your local changes to the following files would be overwritten by merge:
...
Please commit your changes or stash them before you merge.
git pmd-scan
- Running PMD Scan as git alias
git pmd-scan ~/lwc-recipes/force-app/main/default/classes 4
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 9657 100 9657 0 0 80105 0 --:--:-- --:--:-- --:--:-- 84710
=== Running PMD scan on /Users/mchinnappan/lwc-recipes/force-app/main/default/classes === and THRESHOLD: 4 ...
Going to run: pmd-run.sh pmd -R /Users/mchinnappan/.pmd/apex_ruleset.xml -d /Users/mchinnappan/lwc-recipes/force-app/main/default/classes -f csv > /tmp/results.csv ...
Mar 19, 2023 7:26:23 AM net.sourceforge.pmd.PMD encourageToUseIncrementalAnalysis
WARNING: This analysis could be faster, please consider using Incremental Analysis: https://pmd.github.io/pmd-6.47.0/pmd_userdocs_incremental_analysis.html
Mar 19, 2023 7:26:24 AM net.sourceforge.pmd.PMD runPmd
WARNING: Some deprecated options were used on the command-line, including -property
Mar 19, 2023 7:26:24 AM net.sourceforge.pmd.PMD runPmd
WARNING: Consider replacing it with --property
Mar 19, 2023 7:26:24 AM net.sourceforge.pmd.PMD encourageToUseIncrementalAnalysis
WARNING: This analysis could be faster, please consider using Incremental Analysis: https://pmd.github.io/pmd-6.47.0/pmd_userdocs_incremental_analysis.html
SELECT COUNT(*) AS CNT FROM CSV("/tmp/results.csv", {headers:true}) WHERE Priority < 4
=== Checking for PMD violations ... ===
nerrors: 23
=== 24 /tmp/results.csv lines. Opening in the your web browser to view the logs ===
=== Number of P1,P2 and P3 issues are: 23. Can't continue! ===
How to set up git pmd-scan - extending git with an alias
- Get the script git-pmd
- Store it in your local folder say:
~/git-scripts
- Make it executable
chmod +x /path/to/git-pmd
- Configure it to run by git
git config --global alias.pmd-scan '!/Users/mchinnappan/git-scripts/git-pmd'
git cpd (Copy and Paste Detector)
- Get the script git-cpd
- Store it in your local folder say:
~/git-scripts
- Make it executable
chmod +x /path/to/git-cpd
- Configure it to run by git
git config --global alias.pmd-cpd '!/Users/mchinnappan/git-scripts/git-cpd'
Running it
code1=~/lwc-recipes/force-app/main/default/classes
git pmd-cpd ${code1} 100 > /tmp/cpd.xml
echo "=== xmlutil transform --xml=/tmp/cpd.xml --xsl=cpdhtml-v2 --out=/tmp/cpd.html"
xmlutil transform --xml=/tmp/cpd.xml --xsl=cpdhtml-v2 --out=/tmp/cpd.html
open /tmp/cpd.html
git revert
Reverting a git commit
git revert commitId
Log before revert
git revert 57f76d3
Log after the revert of 57f76d3
Reverting a revert
- To revert a revert in Git, we can use the git revert command again.
git revert 48cd92d
Log after reverting a revert
Reverting a git merge
git branch feature-0002
git checkout feature-0002
Switched to branch 'feature-0002'
~/git-demo (git)-[feature-0002]- >git add -A
~/git-demo (git)-[feature-0002]- >git commit -m 'iris.csv added new record 4'
git commit -m 'iris.csv added new record 4'
[feature-0002 a78f1a5] iris.csv added new record 4
1 file changed, 1 insertion(+)
Finding the diff in a file in 2 branches
~/git-demo (git)-[feature-0002]- >git diff feature-0002 main iris.csv
diff --git a/iris.csv b/iris.csv
index c51bae8..098e207 100644
--- a/iris.csv
+++ b/iris.csv
@@ -1,7 +1,7 @@
sepal_length,sepal_width,petal_length,petal_width,species
5.1,3.5,1.4,0.2,Iris-setosa
5.1,3.5,1.4,0.2,Iris-setosa
-4.9,3,1.4,0.2,Iris-setosa
+5.1,3.5,1.4,0.2,Iris-setosa^M
4.9,3,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
git merge
- The command used to integrate changes from one branch into another. Merging combines changes made in separate branches and creates a new commit that includes the changes from both branches.
- Merging is a powerful feature of git that allows multiple developers to work on the same codebase simultaneously and combine their changes in a controlled and efficient way.
Steps
- git checkout
~/git-demo (git)-[feature-0002]- >git checkout main
Switched to branch 'main'
- git merge
git merge feature-0002
Auto-merging iris.csv
Merge made by the 'ort' strategy.
iris.csv | 1 +
1 file changed, 1 insertion(+)
head iris.csv
sepal_length,sepal_width,petal_length,petal_width,species
5.1,3.5,1.4,0.2,Iris-setosa
5.1,3.5,1.4,0.2,Iris-setosa
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3,1.4,0.2,Iris-setosa
4.9,3,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
5,3.6,1.4,0.2,Iris-setosa
5.4,3.9,1.7,0.4,Iris-setosa
- This command merges the changes from feature-0002 into the current branch. Before performing the merge, git checks whether there are any conflicts between the two branches. If there are conflicts, git will prompt you to resolve them manually.
Reverting a Branch merge
git revert <merge-commit-hash> -m <parent-number>
-
The
-m 1
option selects the first parent of the merge commit as the mainline, which is typically the branch you were on before the merge. -
Specify
-m 2
instead only if you want to revert to the state of the second parent (the branch that was merged in).
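The -m 1 behavior can be verified in a scratch repository. A minimal sketch (branch and file names are illustrative; assumes git is on the PATH):

```python
# Demo: merge a feature branch, then revert the merge commit with -m 1,
# which restores the state of the first parent (the pre-merge mainline).
import pathlib, subprocess, tempfile

repo = pathlib.Path(tempfile.mkdtemp())

def git(*args):
    return subprocess.run(["git", "-C", str(repo), *args], check=True,
                          capture_output=True, text=True).stdout

git("init", "-q")
git("config", "user.email", "demo@example.com")
git("config", "user.name", "Demo User")
main = git("symbolic-ref", "--short", "HEAD").strip()   # default branch name

f = repo / "iris.csv"
f.write_text("base row\n")
git("add", "iris.csv"); git("commit", "-qm", "base")

git("checkout", "-q", "-b", "feature-0002")
f.write_text("base row\nfeature row\n")
git("commit", "-qam", "feature change")

git("checkout", "-q", main)
git("merge", "--no-ff", "--no-edit", "feature-0002")    # force a merge commit
assert f.read_text() == "base row\nfeature row\n"

merge_commit = git("rev-parse", "HEAD").strip()
git("revert", "--no-edit", "-m", "1", merge_commit)     # back to first parent's state
assert f.read_text() == "base row\n"
```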
git deleting branch
List branches
git branch -a
feature
feature-0001
* main
Delete the branch feature-0001 which is not fully merged
- If the branch has not been fully merged, git will display an error message and prevent the branch from being deleted.
git branch -d feature-0001
error: The branch 'feature-0001' is not fully merged.
If you are sure you want to delete it, run 'git branch -D feature-0001'.
- The following command forces git to delete the branch, regardless of whether it has been fully merged.
git branch -D feature-0001
Deleted branch feature-0001 (was d99c263).
Delete the branch feature which is fully merged
git branch -d feature
Deleted branch feature (was a97f348).
List the branches after the deletions
git branch -a
* main
git tree compare
Let us explain how to compare 2 branches using the ls-tree command
git ls-tree -r branch1 --name-only | sort | pbcopy; open "https://mohan-chinnappan-n5.github.io/delta/diff.html"
# paste now into the left side of the diff app
git ls-tree -r branch2 --name-only | sort | pbcopy
# go back to diff tool and paste now into right side and click [Compare] button
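If you prefer to diff locally instead of pasting into the web diff app, the same comparison can be sketched with Python's difflib (the file lists below are illustrative stand-ins for the output of git ls-tree -r <branch> --name-only | sort):

```python
# Diff two branches' sorted file lists, like comparing two ls-tree outputs.
import difflib

branch1_files = sorted(["classes/HelloPeach.cls", "objects/Tree__c.object",
                        "pages/Home.page"])
branch2_files = sorted(["classes/HelloPeach.cls", "classes/HelloMars.cls",
                        "objects/Tree__c.object"])

diff = list(difflib.unified_diff(branch1_files, branch2_files,
                                 fromfile="branch1", tofile="branch2",
                                 lineterm=""))
print("\n".join(diff))
```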
Apps from App Exchange
Using CLI to install App Exchange Apps
Silent (no-prompt mode)
sfdx force:package:install --wait 10 --publishwait 10 --package <packageId> --noprompt -u <username@email.com>
With Prompt mode
sfdx force:package:install --wait 10 --publishwait 10 --package <packageId> -u <username@email.com>
Source Tracking
- Used for Tracking Changes Between Your Project and Org
- Source Tracking can be enabled for Developer Sandbox and Developer Pro Sandboxes
- It is enabled in Scratch orgs
Content for your .sfdx folder in your DX Project
- You can create a DX project by
sfdx force:project:create -n treePrj
- The files in the .sfdx folder are necessary for the internal workings of the Salesforce CLI
~/treeprj [master] >tree .sfdx
.sfdx
├── indexes
│ └── lwc
│ └── custom-components.json
├── orgs
│ └── mohan.chinnappan.n_ea2@gmail.com
│ └── sourcePathInfos.json <---- This information is used to make sure conflicts between files in an org and your project folder can be properly detected.
├── tools
│ ├── sobjects
│ │ └── standardObjects
│ │ ├── Account.cls
│ │ ├── Attachment.cls
│ │ ├── Case.cls
│ │ ├── Contact.cls
│ │ ├── Contract.cls
│ │ ├── Lead.cls
│ │ ├── Note.cls
│ │ ├── Opportunity.cls
│ │ ├── Order.cls
│ │ ├── Pricebook2.cls
│ │ ├── PricebookEntry.cls
│ │ ├── Product2.cls
│ │ ├── RecordType.cls
│ │ ├── Report.cls
│ │ ├── Task.cls
│ │ └── User.cls
│ ├── soqlMetadata
│ │ ├── customObjects
│ │ ├── standardObjects
│ │ │ ├── Account.json
│ │ │ ├── Attachment.json
│ │ │ ├── Case.json
│ │ │ ├── Contact.json
│ │ │ ├── Contract.json
│ │ │ ├── Lead.json
│ │ │ ├── Note.json
│ │ │ ├── Opportunity.json
│ │ │ ├── Order.json
│ │ │ ├── Pricebook2.json
│ │ │ ├── PricebookEntry.json
│ │ │ ├── Product2.json
│ │ │ ├── RecordType.json
│ │ │ ├── Report.json
│ │ │ ├── Task.json
│ │ │ └── User.json
│ │ └── typeNames.json
│ └── testresults
│ └── apex
└── typings
└── lwc
├── apex.d.ts
├── engine.d.ts
├── lds.d.ts
├── messageservice.d.ts
├── schema.d.ts
├── sobjects
│ ├── Account.d.ts
│ ├── Attachment.d.ts
│ ├── Case.d.ts
│ ├── Contact.d.ts
│ ├── Contract.d.ts
│ ├── Lead.d.ts
│ ├── Note.d.ts
│ ├── Opportunity.d.ts
│ ├── Order.d.ts
│ ├── Pricebook2.d.ts
│ ├── PricebookEntry.d.ts
│ ├── Product2.d.ts
│ ├── RecordType.d.ts
│ ├── Report.d.ts
│ ├── Task.d.ts
│ └── User.d.ts
└── user.d.ts
15 directories, 57 files
What is under the orgs folder?
───────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ File: .sfdx/orgs/mohan.chinnappan.n_ea2@gmail.com/sourcePathInfos.json
───────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ {
2 │ "/Users/mchinnappan/treeprj/force-app": {
3 │ "sourcePath": "/Users/mchinnappan/treeprj/force-app",
4 │ "deferContentHash": false,
5 │ "isWorkspace": false,
6 │ "isArtifactRoot": true,
7 │ "state": "n",
8 │ "package": "force-app",
9 │ "isDirectory": true,
10 │ "isMetadataFile": false,
11 │ "size": 128,
12 │ "modifiedTime": 1673263339703,
13 │ "changeTime": 1673263339703,
14 │ "contentHash": "51d86f3b64a3d9a0048d192f02147b0e9e9594e4"
15 │ },
16 │ "/Users/mchinnappan/treeprj/force-app/main": {
17 │ "sourcePath": "/Users/mchinnappan/treeprj/force-app/main",
18 │ "deferContentHash": false,
19 │ "isWorkspace": false,
20 │ "isArtifactRoot": false,
21 │ "state": "n",
22 │ "package": "force-app",
23 │ "isDirectory": true,
24 │ "isMetadataFile": false,
25 │ "size": 128,
26 │ "modifiedTime": 1673263339699,
27 │ "changeTime": 1673263339699,
28 │ "contentHash": "ae41ed56706e4439a34cda97ac997d7988935c2f"
29 │ },
30 │ "/Users/mchinnappan/treeprj/force-app/main/default": {
31 │ "sourcePath": "/Users/mchinnappan/treeprj/force-app/main/default",
:
Querying SourceMember Object
- The SourceMember object is used to log any changes to the org's metadata and settings.
No entries
sfdx force:data:soql:query --query "SELECT ChangedBy, IsNameObsolete, MemberName, MemberType, RevisionCounter FROM SourceMember" --usetoolingapi -u test-uzsmfdqkhtk7@example.com
Total number of records retrieved: 0.
Querying Data... done
Let us deploy classes to get the entries
~/treeprj [master] >sfdx force:source:deploy -u test-uzsmfdqkhtk7@example.com -p force-app/main/default/classes
Deploying v55.0 metadata to test-uzsmfdqkhtk7@example.com using the v56.0 SOAP API
Deploy ID: 0AfDM000016M4060AC
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
FULL NAME TYPE PROJECT PATH
────────── ───────── ──────────────────────────────────────────────────────
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls-meta.xml
Deploy Succeeded.
~/treeprj [master] >sfdx force:data:soql:query --query "SELECT ChangedBy, IsNameObsolete, MemberName, MemberType, RevisionCounter FROM SourceMember" --usetoolingapi -u test-uzsmfdqkhtk7@example.com
CHANGEDBY ISNAMEOBSOLETE MEMBERNAME MEMBERTYPE REVISIONCOUNTER
─────────────── ────────────── ────────── ────────── ───────────────
005DM00000682Jy Admin Profile 2
005DM00000682Jy HelloPeach ApexClass 3
Total number of records retrieved: 2.
Querying Data... done
Listing local and remote changes
Local changes
sfdx force:source:status -u test-uzsmfdqkhtk7@example.com -l
Source Status
STATE FULL NAME TYPE PROJECT PATH
───────── ─────────────────────────────────── ───────────── ──────────────────────────────────────────────────────────────────────────────────────────────────
Local Add HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls
Local Add HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls-meta.xml
Local Add Tree__c.quantity__c CustomField force-app/main/default/objects/Tree__c/fields/quantity__c.field-meta.xml
Local Add Account_Plan__c CustomObject force-app/main/default/objects/Account_Plan__c/Account_Plan__c.object-meta.xml
Local Add Sales_Play__c CustomObject force-app/main/default/objects/Sales_Play__c/Sales_Play__c.object-meta.xml
Local Add Tree__c CustomObject force-app/main/default/objects/Tree__c/Tree__c.object-meta.xml
Local Add Experience_Profile_Manager PermissionSet force-app/main/default/permissionsets/Experience_Profile_Manager.permissionset-meta.xml
Local Add Manage_Encryption_Keys PermissionSet force-app/main/default/permissionsets/Manage_Encryption_Keys.permissionset-meta.xml
Local Add sf_devops_InitializeEnvironments PermissionSet force-app/main/default/permissionsets/sf_devops_InitializeEnvironments.permissionset-meta.xml
Local Add sf_devops_InitializeEnvironments PermissionSet force-app/main/default/permissionsets/sf_devops_InitializeEnvironments.permissionset-meta.xml.html
Local Add sf_devops_NamedCredentials PermissionSet force-app/main/default/permissionsets/sf_devops_NamedCredentials.permissionset-meta.xml
Local Add Admin Profile force-app/main/default/profiles/Admin.profile-meta.xml
Local Add Admin Profile force-app/main/default/profiles/Admin.profile-meta.xml.html
Local Add Admin Profile force-app/main/default/profiles/Admin.profile-meta.xml.json
Local Add Analytics Cloud Integration User Profile force-app/main/default/profiles/Analytics Cloud Integration User.profile-meta.xml
Local Add Analytics Cloud Security User Profile force-app/main/default/profiles/Analytics Cloud Security User.profile-meta.xml
Local Add Authenticated Website Profile force-app/main/default/profiles/Authenticated Website.profile-meta.xml
Local Add Chatter External User Profile force-app/main/default/profiles/Chatter External User.profile-meta.xml
Local Add Chatter Free User Profile force-app/main/default/profiles/Chatter Free User.profile-meta.xml
Local Add Chatter Moderator User Profile force-app/main/default/profiles/Chatter Moderator User.profile-meta.xml
Local Add ContractManager Profile force-app/main/default/profiles/ContractManager.profile-meta.xml
Local Add Cross Org Data Proxy User Profile force-app/main/default/profiles/Cross Org Data Proxy User.profile-meta.xml
Local Add Custom%3A Marketing Profile Profile force-app/main/default/profiles/Custom%3A Marketing Profile.profile-meta.xml
Local Add Custom%3A Sales Profile Profile force-app/main/default/profiles/Custom%3A Sales Profile.profile-meta.xml
Local Add Custom%3A Support Profile Profile force-app/main/default/profiles/Custom%3A Support Profile.profile-meta.xml
Local Add Customer Community Login User Profile force-app/main/default/profiles/Customer Community Login User.profile-meta.xml
Local Add Customer Community Plus Login User Profile force-app/main/default/profiles/Customer Community Plus Login User.profile-meta.xml
Local Add Customer Community Plus User Profile force-app/main/default/profiles/Customer Community Plus User.profile-meta.xml
Local Add Customer Community User Profile force-app/main/default/profiles/Customer Community User.profile-meta.xml
Local Add Customer Portal Manager Custom Profile force-app/main/default/profiles/Customer Portal Manager Custom.profile-meta.xml
Local Add Customer Portal Manager Standard Profile force-app/main/default/profiles/Customer Portal Manager Standard.profile-meta.xml
Local Add External Apps Login User Profile force-app/main/default/profiles/External Apps Login User.profile-meta.xml
Local Add External Identity User Profile force-app/main/default/profiles/External Identity User.profile-meta.xml
Local Add Force%2Ecom - App Subscription User Profile force-app/main/default/profiles/Force%2Ecom - App Subscription User.profile-meta.xml
Local Add Force%2Ecom - Free User Profile force-app/main/default/profiles/Force%2Ecom - Free User.profile-meta.xml
Local Add Gold Partner User Profile force-app/main/default/profiles/Gold Partner User.profile-meta.xml
Local Add Guest License User Profile force-app/main/default/profiles/Guest License User.profile-meta.xml
Local Add High Volume Customer Portal User Profile force-app/main/default/profiles/High Volume Customer Portal User.profile-meta.xml
Local Add HighVolumePortal Profile force-app/main/default/profiles/HighVolumePortal.profile-meta.xml
Local Add Identity User Profile force-app/main/default/profiles/Identity User.profile-meta.xml
Local Add MarketingProfile Profile force-app/main/default/profiles/MarketingProfile.profile-meta.xml
Local Add Minimum Access - Salesforce Profile force-app/main/default/profiles/Minimum Access - Salesforce.profile-meta.xml
Local Add mohanc Profile Profile force-app/main/default/profiles/mohanc Profile.profile-meta.xml
Local Add Partner App Subscription User Profile force-app/main/default/profiles/Partner App Subscription User.profile-meta.xml
Local Add Partner Community Login User Profile force-app/main/default/profiles/Partner Community Login User.profile-meta.xml
Local Add Partner Community User Profile force-app/main/default/profiles/Partner Community User.profile-meta.xml
Local Add PlatformPortal Profile force-app/main/default/profiles/PlatformPortal.profile-meta.xml
Local Add Read Only Profile force-app/main/default/profiles/Read Only.profile-meta.xml
Local Add Silver Partner User Profile force-app/main/default/profiles/Silver Partner User.profile-meta.xml
Local Add SolutionManager Profile force-app/main/default/profiles/SolutionManager.profile-meta.xml
Local Add Standard Profile force-app/main/default/profiles/Standard.profile-meta.xml
Local Add StandardAul Profile force-app/main/default/profiles/StandardAul.profile-meta.xml
Local Add Work%2Ecom Only User Profile force-app/main/default/profiles/Work%2Ecom Only User.profile-meta.xml
Remote changes
sfdx force:source:status -u test-uzsmfdqkhtk7@example.com -r
Source Status
STATE FULL NAME TYPE PROJECT PATH
────────────── ────────── ───────── ───────────────────────────────────────────────────────────
Remote Changed HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls-meta.xml
Remote Changed HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls
Remote Changed Admin Profile force-app/main/default/profiles/Admin.profile-meta.xml.json
How force:source:push works
- The CLI collects files that have changed in the project workspace since the last sync.
- The CLI queries for SourceMember records with a revision counter greater than the max revision counter stored in the maxrevision.json file.
- The CLI marks files as conflicts if a SourceMember record is found for them.
- If conflicts are detected, the CLI informs the developer about them. The developer needs to resolve them, or override them by pushing with the force flag.
- If no conflicts are detected, the CLI converts the source to Metadata API format with a package.xml and generates a zip artifact.
- The CLI deploys the changes to the Salesforce org using the Metadata API.
- The CLI updates the state and file contentHash in sourcePathInfos.json and maxrevision.json.
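Conceptually, the conflict and change detection boils down to comparing stored content hashes with freshly computed ones. The sketch below illustrates the idea only; it is not Salesforce's actual algorithm, and the paths, file contents, and choice of SHA-1 are assumptions:

```python
# Conceptual sketch: a file is "locally changed" when its current hash
# differs from the hash recorded at the last sync (as in sourcePathInfos.json).
import hashlib

def content_hash(data: bytes) -> str:
    return hashlib.sha1(data).hexdigest()

# Hashes recorded at the last sync (illustrative).
stored = {"force-app/main/default/classes/HelloPeach.cls":
          content_hash(b"public with sharing class HelloPeach {}")}

# Current workspace contents (illustrative: the class was edited).
current = {"force-app/main/default/classes/HelloPeach.cls":
           b"public with sharing class HelloPeach { /* edited */ }"}

changed = [path for path, data in current.items()
           if content_hash(data) != stored.get(path)]
print(changed)
```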
Viewing sfdx.log
# macOS
open -a TextEdit $HOME/.sfdx/sfdx.log
Resources
Dangling Fields
Topics
-
Dangling fields are fields that are present in assets like Profiles but are absent from the source asset, such as the Object. For example,
Account.carbonFootprint__c
may appear in Profiles/PermissionSets while the Account object does not have the field carbonFootprint__c
-
Having dangling fields in assets like Profiles makes the deployment task hard, so clean them up as soon as possible.
-
I coined the term dangling fields by analogy with dangling pointers in the memory model of languages like C and C++. I am open to your feedback on this naming...
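The idea behind dangling-field detection can be sketched as follows (the profile XML and the object's field list are illustrative, not a real org's metadata):

```python
# Flag profile fieldPermissions entries whose field the object does not define.
import xml.etree.ElementTree as ET

PROFILE_XML = """<?xml version="1.0" encoding="UTF-8"?>
<Profile xmlns="http://soap.sforce.com/2006/04/metadata">
  <fieldPermissions><field>Account.Industry</field></fieldPermissions>
  <fieldPermissions><field>Account.carbonFootprint__c</field></fieldPermissions>
</Profile>"""

NS = "{http://soap.sforce.com/2006/04/metadata}"

# Fields the object actually has (illustrative).
object_fields = {"Account": {"Industry", "Name"}}

root = ET.fromstring(PROFILE_XML)
dangling = []
for fp in root.iter(f"{NS}fieldPermissions"):
    obj, fld = fp.find(f"{NS}field").text.split(".", 1)
    if fld not in object_fields.get(obj, set()):
        dangling.append(f"{obj}.{fld}")

print(dangling)   # ['Account.carbonFootprint__c']
```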
Finding Dangling fields using CLI
List the profile or permission set files in JSON format
sfdx mohanc:mdapi:listFiles -p src -t permissionsets --help
List Dangling Fields
USAGE
$ sfdx mohanc mdapi listFiles [-p <string>] [-t <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
FLAGS
-p, --projectpath=<value> Project location (say src)
-t, --type=<value> Metadata type (e.g profiles)
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for this command invocation
DESCRIPTION
List Dangling Fields
EXAMPLES
List the files of given metadata type
sfdx mohanc:mdapi:listFiles -p <projectPath> -t <metadataType>
Example:
sfdx mohanc:mdapi:listFiles -p src -t <profiles | permissionsets >
sfdx mohanc:mdapi:listFiles -p /path/to/src/folder -t profiles > /tmp/profiles.json
Find the dangling fields for those profiles or permission sets
sfdx mohanc:mdapi:df -i /tmp/permissionsets.json -m PermissionSet --help
List Dangling Fields
USAGE
$ sfdx mohanc mdapi df -i <string> -m <string> [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
FLAGS
-i, --xmls=<value> (required) JSON file with profile xml files
-m, --type=<value> (required) [default: Profile] Profile or PermissionSet
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for this command invocation
DESCRIPTION
List Dangling Fields
EXAMPLES
List the dangling files when comparing with fields
sfdx mohanc:mdapi:df -i <XMLfiles.json> -m <Profile | PermissionSet>
sfdx mohanc:mdapi:df -i /tmp/profiles.json > dangling_fields_2.tsv
Open it in Numbers or Excel
open dangling_fields_2.tsv
cleanup
cd /path/to/src/profiles/folder
rm *.json
cd /path/to/src/permissionsets/folder
rm *.json
How Salesforce Manages profile items life cycle
- Let us explain this creating an Apex Class
HelloMars
- set the Security for that class
- verify the profile to have the entry for this classAccess
- delete the class in UI
- verify the profile that Salesforce has removed this classAccess entry in the profile
Create the class HelloMars
~/treeprj [patch1] >sfdx force:apex:class:create -n HelloMars -d force-app/main/default/classes
target dir = /Users/mchinnappan/treeprj/force-app/main/default/classes
create force-app/main/default/classes/HelloMars.cls
create force-app/main/default/classes/HelloMars.cls-meta.xml
~/treeprj [patch1] >tree force-app/main/default/classes
force-app/main/default/classes
├── HelloMars.cls
├── HelloMars.cls-meta.xml
├── HelloPeach.cls
└── HelloPeach.cls-meta.xml
0 directories, 4 files
~/treeprj [patch1] >sfdx force:source:deploy -u mohan.chinnappan.n.sel@gmail.com -p force-app/main/default/classes
Deploying v55.0 metadata to mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Deploy ID: 0Af4x00000YS5i5CAD
DEPLOY PROGRESS | ████████████████████████████████████████ | 2/2 Components
=== Deployed Source
FULL NAME TYPE PROJECT PATH
────────── ───────── ──────────────────────────────────────────────────────
HelloMars ApexClass force-app/main/default/classes/HelloMars.cls
HelloMars ApexClass force-app/main/default/classes/HelloMars.cls-meta.xml
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls-meta.xml
Deploy Succeeded.
~/treeprj [patch1] >bat force-app/main/default/classes/HelloMars.cls
───────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ File: force-app/main/default/classes/HelloMars.cls
───────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ public with sharing class HelloMars {
2 │ public HelloMars() {
3 │
4 │ }
5 │ }
───────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────
~/treeprj [patch1] >
- After deleting the Class in the UI
Let us repeat the above exercise; this time we will use the CLI to delete the class
~/treeprj [patch1] >sfdx force:apex:class:create -n HelloEarth -d force-app/main/default/classes
target dir = /Users/mchinnappan/treeprj/force-app/main/default/classes
create force-app/main/default/classes/HelloEarth.cls
create force-app/main/default/classes/HelloEarth.cls-meta.xml
~/treeprj [patch1] >sfdx force:source:deploy -u mohan.chinnappan.n.sel@gmail.com -p force-app/main/default/classes/HelloEarth.cls
Deploying v55.0 metadata to mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Deploy ID: 0Af4x00000YS5u1CAD
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
FULL NAME TYPE PROJECT PATH
────────── ───────── ──────────────────────────────────────────────────────
HelloEarth ApexClass force-app/main/default/classes/HelloEarth.cls
HelloEarth ApexClass force-app/main/default/classes/HelloEarth.cls-meta.xml
Deploy Succeeded.
Prepare destructiveChanges
~/treeprj [patch1] >cd destructiveChanges
~/treeprj/destructiveChanges [patch1] >ls
destructiveChanges.xml package.xml
~/treeprj/destructiveChanges [patch1] >vi destructiveChanges.xml
~/treeprj/destructiveChanges [patch1] >bat destructiveChanges.xml
───────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ File: destructiveChanges.xml
───────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ <?xml version="1.0" encoding="UTF-8"?>
2 │ <Package xmlns="http://soap.sforce.com/2006/04/metadata">
3 │ <version>55.0</version>
4 ~ │
5 ~ │ <types>
6 ~ │ <members>HelloEarth</members>
7 ~ │ <name>ApexClass</name>
8 ~ │ </types>
9 ~ │
10 ~ │ </Package>
───────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────
~/treeprj/destructiveChanges [patch1] >cd ..
~/treeprj [patch1] >vi package.json
~/treeprj [patch1] >vi package.xml
~/treeprj [patch1] >cp destructiveChanges/package.xml .
~/treeprj [patch1] >vi package.xml
~/treeprj [patch1] >bat package.xml
───────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ File: package.xml
───────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ <?xml version="1.0" encoding="UTF-8"?>
2 │ <Package xmlns="http://soap.sforce.com/2006/04/metadata">
3 │ <version>55.0</version>
4 │ </Package>
───────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────
~/treeprj [patch1] >sfdx force:source:deploy -x package.xml --postdestructivechanges destructiveChanges/destructiveChanges.xml --verbose --loglevel TRACE -u mohan.chinnappan.n.sel@gmail.com -c
Deploying v55.0 metadata to mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Deploy ID: 0Af4x00000YS5uBCAT
=== Deleted Source
FULL NAME TYPE PROJECT PATH
────────── ───────── ──────────────────────────────────────────────────────
HelloEarth ApexClass force-app/main/default/classes/HelloEarth.cls
HelloEarth ApexClass force-app/main/default/classes/HelloEarth.cls-meta.xml
Successfully validated the deployment.
~/treeprj [patch1] >sfdx force:source:deploy -x package.xml --postdestructivechanges destructiveChanges/destructiveChanges.xml --verbose --loglevel TRACE -u mohan.chinnappan.n.sel@gmail.com
Deploying v55.0 metadata to mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Deploy ID: 0Af4x00000YS5uLCAT
=== Deleted Source
FULL NAME TYPE PROJECT PATH
────────── ───────── ──────────────────────────────────────────────────────
HelloEarth ApexClass force-app/main/default/classes/HelloEarth.cls
HelloEarth ApexClass force-app/main/default/classes/HelloEarth.cls-meta.xml
Deploy Succeeded.
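A destructiveChanges.xml like the one above can also be generated programmatically. A minimal sketch (member names and API version are illustrative):

```python
# Build a destructiveChanges.xml (or package.xml) body from a mapping of
# metadata type -> member names.
def destructive_changes(members_by_type: dict, api_version: str = "55.0") -> str:
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<Package xmlns="http://soap.sforce.com/2006/04/metadata">',
             f'    <version>{api_version}</version>']
    for mtype, members in members_by_type.items():
        lines.append('    <types>')
        for m in members:
            lines.append(f'        <members>{m}</members>')
        lines.append(f'        <name>{mtype}</name>')
        lines.append('    </types>')
    lines.append('</Package>')
    return "\n".join(lines)

xml = destructive_changes({"ApexClass": ["HelloEarth"]})
print(xml)
```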
About class files in the local folder
- A destructiveChanges deployment does not change files in the local folder, as shown below:
tree force-app/main/default/classes
force-app/main/default/classes
├── HelloEarth.cls
├── HelloEarth.cls-meta.xml
├── HelloMars.cls
├── HelloMars.cls-meta.xml
├── HelloPeach.cls
└── HelloPeach.cls-meta.xml
0 directories, 6 files
~/treeprj [patch1] >sfdx force:source:retrieve -p force-app/main/default/classes -u mohan.chinnappan.n.sel@gmail.com
Retrieving v55.0 metadata from mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Preparing retrieve request... done
=== Retrieved Source
FULL NAME TYPE PROJECT PATH
────────── ───────── ──────────────────────────────────────────────────────
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls
HelloPeach ApexClass force-app/main/default/classes/HelloPeach.cls-meta.xml
=== Retrieved Source Warnings
FILE NAME PROBLEM
────────────────────── ─────────────────────────────────────────────────────────────
unpackaged/package.xml Entity of type 'ApexClass' named 'HelloEarth' cannot be found
unpackaged/package.xml Entity of type 'ApexClass' named 'HelloMars' cannot be found
~/treeprj [patch1] >tree force-app/main/default/classes
force-app/main/default/classes
├── HelloEarth.cls
├── HelloEarth.cls-meta.xml
├── HelloMars.cls
├── HelloMars.cls-meta.xml
├── HelloPeach.cls
└── HelloPeach.cls-meta.xml
0 directories, 6 files
- We need to manually delete them; otherwise the following command (wildcard based) will re-deploy the deleted classes back into the org
sfdx force:source:deploy -u mohan.chinnappan.n.sel@gmail.com -p force-app/main/default/classes
Sandbox Management
Sandbox Clone
Question | Answer |
---|---|
How is a Sandbox Clone different from a regular Sandbox | With Sandbox Clone, you can create a copy of an existing sandbox. That copy will contain all data that existed in the source sandbox at the time of the copy. A regular sandbox is a copy of the production org. On the Setup > Sandboxes page, a clone will have the Copied From field |
How to create or refresh a cloned sandbox | You create or refresh a cloned sandbox the same way as a regular sandbox; just choose your source sandbox rather than production in the Create From drop-down field. You can clone a sandbox that's on a different major Salesforce release version than your production organization. |
Things to know | A sandbox clone can only be of the same sandbox type as the source. For example, a Full Copy sandbox can only be cloned as another Full Copy sandbox. |
Sandbox Refresh
1. Create a Pre-refresh Checklist
2. Minimize downtime by choosing the right refresh window (taking care of developers developing new features, UAT, and training users)
- Make sure the refresh window aligns with the Salesforce release window
- For each sandbox:
- Salesforce sandbox refreshes take time; the duration is a function of the amount of configuration and metadata the sandbox contains
- Determine a window for running the sandbox refresh; this could take around 2 days
- During this window the sandbox won't be available
- The type of sandbox determines how often it can be refreshed:
  - Developer, Developer Pro: 1 day
  - Partial Copy: 5 days
  - Full Copy: 29 days
- Refreshing the sandbox will delete all metadata that is not yet committed into Production
3. Sandbox Refresh Communication
- Communicate early and often about the Sandbox Refresh
- Use custom metadata types (mdt) in production, so they're easily modified
- Use custom settings and/or custom metadata types to specify email addresses, integration endpoints, etc.
  - Modify these values post-refresh
  - Have 2 records per metadata type: one for production and one for sandboxes
  - Apex classes can then use the IsSandbox property to dynamically choose which record to use at runtime:
SELECT IsSandbox FROM Organization LIMIT 1
4. Create a Post-refresh Checklist
- After refreshing a sandbox, take care of removing production endpoints
- Implement the SandboxPostCopy interface and add methods to perform post-copy tasks, then specify the class during sandbox creation.
global class PrepareMySandbox implements SandboxPostCopy {
global PrepareMySandbox() {
//Implementations of SandboxPostCopy must have a no-arg constructor.
//This constructor is used during the sandbox copy process.
//You can also implement constructors with arguments, but be aware that
//they won’t be used by the sandbox copy process (unless as part of the
//no-arg constructor).
this(some_args);
}
global PrepareMySandbox(String some_args) {
//Logic for constructor.
}
global void runApexClass(SandboxContext context) {
System.debug('Org ID: ' + context.organizationId());
System.debug('Sandbox ID: ' + context.sandboxId());
System.debug('Sandbox Name: ' + context.sandboxName());
// Insert logic here to prepare the sandbox for use.
}
}
5. Take care of configuration data (not metadata)
- Import it using the Data Loader or Import Wizard.
- Some AppExchange apps use custom objects for their configuration. When we create a new sandbox that does not include data, these apps will not work after the sandbox refresh.
- For master-detail relationships, insert the parent records before any child records
References
Sandbox Preview Guide
Topics
To Refresh or Not to Refresh: De-mystifying the Sandbox Preview [SP]
Salesforce Sandbox Preview Guide App
References
- Salesforce Sandbox Preview Instructions
- Trust Status
- Modify the instance at the end of the URL to fit your instance
MetadataComponentDependency
SOQL used for this:
SELECT
MetadataComponentId,
MetadataComponentName, MetadataComponentType,RefMetadataComponentId,
RefMetadataComponentName,
RefMetadataComponentType
FROM MetadataComponentDependency
WHERE RefMetadataComponentType = 'ApexClass'
Using ONNU for this
YAML
YAML to JSON convertor
JSON Editor
JSON to YAML convertor
Installed Packages
How to Check Installed Packages in the org
Using ONNU Chrome Extension
SELECT Id, SubscriberPackageId, SubscriberPackage.NamespacePrefix,
SubscriberPackage.Name, SubscriberPackageVersion.Id,
SubscriberPackageVersion.Name, SubscriberPackageVersion.MajorVersion,
SubscriberPackageVersion.MinorVersion,
SubscriberPackageVersion.PatchVersion,
SubscriberPackageVersion.BuildNumber
FROM InstalledSubscriberPackage
ORDER BY SubscriberPackageId
Using CLI
sfdx force:package:installed:list -u mohan.chinnappan.n.sel2@gmail.com
ID Package ID Package Name Namespace Package Version ID Version Name Version
────────────────── ────────────────── ───────────────────────── ─────────── ────────────────── ──────────── ───────
0A38W000000hMfWSAU 03330000000wDAbAAM Salesforce Connected Apps sf_com_apps 04t30000001DUvrAAG Winter '16 1.7.0.1
CRMA tools
Dataset Load and Export
Export your Einstein Analytics datasets
Uploading datasets via CLI
Analysis
Get FieldUsage info for a given Dataflow using DX
- Requires 0.0.73 version of the plugin
- sfdx-mohanc-plugins@0.0.73
- How to install the plugin
Topics
- Demo
- Usage
- Steps
- Output CSV
- Output JSON
- Load Field Usage CSV into EA
- Field Usage Dashboard
- Dataflows without field usage info
Demo
Usage
$ sfdx mohanc:ea:dataflow:fieldUsage -h
Get Field Usage Info for the given dataflow
USAGE
$ sfdx mohanc:ea:dataflow:fieldUsage
OPTIONS
-d, --dataflowid=dataflowid Dataflow Id to analyzed
-u, --targetusername=targetusername username or alias for the target org; overrides default target org
--apiversion=apiversion override the api version used for api requests made by this command
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal) logging level for this command invocation
EXAMPLE
Get Field Usage data for the given dataflow
sfdx mohanc:ea:dataflow:fieldUsage -u <username> -d <dataflowId>
Creates output in both csv and json file
--------------------------------------------------
NOTE:
Use:
sfdx mohanc:ea:dataflow:list -u <username>
to list the dataflows in the org
Steps
List the dataflows in the org
$ sfdx mohanc:ea:dataflow:list -u mohan.chinnappan.n_ea2@gmail.com
Id,Label
02K3h000000Mu1oEAC,exportOppty2
02K3h000000Mu0vEAC,exportOppty
02K3h000000MtyuEAC,ExportCustomers
02K3h000000MrxWEAS,fruitsdf
02K3h000000Mr7JEAS,The_Motivator
02K3h000000Mr7KEAS,Default Salesforce Dataflow
Get the field usage info for this dataflow: 02K3h000000Mr7JEAS,The_Motivator
$ sfdx mohanc:ea:dataflow:fieldUsage -u mohan.chinnappan.n_ea2@gmail.com -d 02K3h000000Mr7JEAS
=== Looking at the dataflow: 02K3h000000Mr7JEAS, The_Motivator ...
=== Looking at the dataflow's sfdcRegister actions ...
_________________________________________________
=== Looking at the dataset: user ...
=== Looking at current version of the dataset: user ...
=== Looking at Security Predicate of the dataset: user ...
=== Looking at WHERE USED details of the dataset: user ...
=== Looking at Dashboards using this dataset: user ...
=== Looking at Lenses using this dataset: user ...
_________________________________________________
=== Looking at the dataset: activity ...
=== Looking at current version of the dataset: activity ...
=== Looking at Security Predicate of the dataset: activity ...
=== Looking at WHERE USED details of the dataset: activity ...
=== Looking at Dashboards using this dataset: activity ...
_________________________________________________
Writing output files in: The_Motivator_fldusage.json and The_Motivator_fldusage.csv
Done.
Output CSV
Load Field Usage CSV file into EA
$ cp ea/02K3h000000Mr7JEAS.csv /tmp/fieldusage.csv
$ sfdx mohanc:ea:dataset:load -u mohan.chinnappan.n_ea2@gmail.com -d /tmp/fieldusage.csv
[
{
fullyQualifiedName: 'fieldusage_csvDataset.type',
label: 'type',
name: 'type',
isSystemField: false,
isUniqueId: false,
isMultiValue: false,
multiValueSeparator: 'null',
type: 'Text'
},
{
fullyQualifiedName: 'fieldusage_csvDataset.name',
label: 'name',
name: 'name',
isSystemField: false,
isUniqueId: false,
isMultiValue: false,
multiValueSeparator: 'null',
type: 'Text'
},
{
fullyQualifiedName: 'fieldusage_csvDataset.dataflowName',
label: 'dataflowName',
name: 'dataflowName',
isSystemField: false,
isUniqueId: false,
isMultiValue: false,
multiValueSeparator: 'null',
type: 'Text'
},
{
fullyQualifiedName: 'fieldusage_csvDataset.datasetName',
label: 'datasetName',
name: 'datasetName',
isSystemField: false,
isUniqueId: false,
isMultiValue: false,
multiValueSeparator: 'null',
type: 'Text'
},
{
fullyQualifiedName: 'fieldusage_csvDataset.dashboardName',
label: 'dashboardName',
name: 'dashboardName',
isSystemField: false,
isUniqueId: false,
isMultiValue: false,
multiValueSeparator: 'null',
type: 'Text'
},
{
fullyQualifiedName: 'fieldusage_csvDataset.lensName',
label: 'lensName',
name: 'lensName',
isSystemField: false,
isUniqueId: false,
isMultiValue: false,
multiValueSeparator: 'null',
type: 'Text'
},
{
fullyQualifiedName: 'fieldusage_csvDataset.InSecurityPredicate',
label: 'InSecurityPredicate',
name: 'InSecurityPredicate',
isSystemField: false,
isUniqueId: false,
isMultiValue: false,
multiValueSeparator: 'null',
type: 'Text'
},
{
fullyQualifiedName: 'fieldusage_csvDataset.usageCount',
label: 'usageCount',
name: 'usageCount',
isSystemField: false,
isUniqueId: false,
type: 'Numeric',
defaultValue: '0',
precision: 18,
scale: 0
},
{
fullyQualifiedName: 'fieldusage_csvDataset.inUse',
label: 'inUse',
name: 'inUse',
isSystemField: false,
isUniqueId: false,
isMultiValue: false,
multiValueSeparator: 'null',
type: 'Text'
}
]
{ id: '06V3h0000005uEOEAY', success: true, errors: [] }
Load part: 1
{ id: '06W3h00000075pdEAA', success: true, errors: [] }
Going to process...
Done.
- Dataflow job created details
Dataflows without field usage info
$ sfdx mohanc:ea:dataflow:list -u mohan.chinnappan.n_ea2@gmail.com
Id,Label
02K3h000000Mu1oEAC,exportOppty2
02K3h000000Mu0vEAC,exportOppty
02K3h000000MtyuEAC,ExportCustomers
02K3h000000MrxWEAS,fruitsdf
02K3h000000Mr7JEAS,The_Motivator
02K3h000000Mr7KEAS,Default Salesforce Dataflow
# let us look at 02K3h000000MrxWEAS,fruitsdf
$ sfdx mohanc:ea:dataflow:fieldUsage -u mohan.chinnappan.n_ea2@gmail.com -d 02K3h000000MrxWEAS
=== Looking at the dataflow 02K3h000000MrxWEAS ...
=== Looking at the dataflow sfdcRegister Actions ...
_________________________________________________
=== Looking at the dataset regFruitYield ...
Error: getting dataset for the currentVersion!
No field usage can be obtained for this dataflow!
Dashboard
Field Usage Output JSON
Dataflow performance with field usage analysis
Export node details of a dataflow job
Binding
Deployment
Use case
- For the first time: move the wave assets from a source org into a target org
- Commit these wave assets into version control once they are verified
package.xml used in this use case
- You can customize it as needed
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
<members>*</members>
<name>WaveApplication</name>
</types>
<types>
<members>*</members>
<name>WaveComponent</name>
</types>
<types>
<members>*</members>
<name>WaveDashboard</name>
</types>
<types>
<members>*</members>
<name>WaveDataflow</name>
</types>
<types>
<members>*</members>
<name>WaveDataset</name>
</types>
<types>
<members>*</members>
<name>WaveLens</name>
</types>
<types>
<members>*</members>
<name>WaveRecipe</name>
</types>
<types>
<members>*</members>
<name>WaveTemplateBundle</name>
</types>
<types>
<members>*</members>
<name>WaveXmd</name>
</types>
<version>56.0</version>
</Package>
Option - 1
- Download CRMA package zip file
- using Chrome Extension ONNU
Changing the extension of the xmd and xds files
python3 change_extn.py
-------------------------------------------------------
Usage: python3 change_extn.py <from_extn> <to_extn>
Examples: python3 change_extn.py xmd xmd-meta.xml
: python3 change_extn.py xds xds-meta.xml
-------------------------------------------------------
- Download the script change_extn.py
import os
import sys
"""
change_extn.py
- changes the extension of the given file into required extension
- For example:
from: xmd
to: xmd-meta.xml
- mchinnappan
"""
usage = """
-------------------------------------------------------
Usage: python3 change_extn.py <from_extn> <to_extn>
Examples: python3 change_extn.py xmd xmd-meta.xml
: python3 change_extn.py xds xds-meta.xml
-------------------------------------------------------
"""
if len(sys.argv) < 3:
    print(usage)
    sys.exit(1)

inext = sys.argv[1]
outext = sys.argv[2]

for filename in os.listdir('.'):
    if filename.endswith(f'.{inext}'):
        os.rename(filename, filename[:-len(inext)] + outext)
- run it on the wave folder:
python3 change_extn.py xmd xmd-meta.xml
python3 change_extn.py xds xds-meta.xml
Option - 2
- Download CRMA package zip file
- using DX
sfdx force:mdapi:retrieve -k ~/.package/crma-package.xml -u mohan.chinnappan.n_ea2@gmail.com -r .
Retrieving v56.0 metadata from mohan.chinnappan.n_ea2@gmail.com using the v57.0 SOAP API
Retrieve ID: 09S3h000006la2aEAA
Retrieving metadata from mohan.chinnappan.n_ea2@gmail.com... done
Wrote retrieve zip to /private/tmp/CRMA/unpackaged.zip
- View/unzip the zip file
/tmp/CRMA >jar xvf unpackaged.zip
105 Thu Mar 09 11:55:00 EST 2023 unpackaged/waveTemplates/Service_Analytics_Flex/dataset_files/ServiceOmniUserPresence_XMD_20_template.json
40861 Thu Mar 09 11:55:00 EST 2023 unpackaged/waveTemplates/Trailhead_Template_PartOne/workflow/SalesAnalyticsDataflow.json
221 Thu Mar 09 11:55:00 EST 2023 unpackaged/waveTemplates/Trailhead_Template_PartTwo/folder.json
44649 Thu Mar 09 11:55:00 EST 2023 unpackaged/waveTemplates/Trailhead_Template_PartTwo/workflow/SalesAnalyticsDataflow.json
90993 Thu Mar 09 11:55:00 EST 2023 unpackaged/waveTemplates/Service_Analytics_Flex/dashboards/Service_Knowledge_Efficiency.json
101231 Thu Mar 09 11:55:00 EST 2023 unpackaged/waveTemplates/Service_Analytics_Flex/dashboards/Service_Channel_Review.json
120130 Thu Mar 09 11:55:00 EST 2023 unpackaged/waveTemplates/Service_Analytics_Flex/dashboards/Service_Live_Agent_Chat.json
100166 Thu Mar 09 11:55:00 EST 2023 unpackaged/waveTemplates/Service_Analytics_Flex/dashboards/Service_Telephony.json
...
Move this zip content (unpackaged/*) into the wave folder of your local SFDX project
# copy under force-app/main/default/wave of your sfdx project, for example:
/tmp/CRMA >
cp unpackaged/wave/* ~/treeprj/force-app/main/default/wave/
Deploy into the target org
- checkOnly (-c flag)
sfdx force:source:deploy -x ~/.package/crma-package.xml -c --verbose --loglevel TRACE
- Example
sfdx force:source:deploy -x ~/.package/crma-package.xml -c --verbose --loglevel TRACE -u mohan.chinnappan.n_ea2@gmail.com
Deploying v56.0 metadata to mohan.chinnappan.n_ea2@gmail.com using the v57.0 SOAP API
Deploy ID: 0Af3h00000SZ7WpCAL
DEPLOY PROGRESS | ████████████████████████████████████████ | 27/27 Components
=== Deployed Source
FULL NAME TYPE PROJECT PATH
──────────────────────────────────────────────────── ───────────── ───────────────────────────────────────────────────────────────────
Benchmark_dashboard WaveDashboard Benchmark_dashboard.wdash
Benchmark_dashboard WaveDashboard Benchmark_dashboard.wdash-meta.xml
Cars3DB WaveDashboard Cars3DB.wdash
Cars3DB WaveDashboard Cars3DB.wdash-meta.xml
DTC_Sales_SAMPLE WaveDashboard DTC_Sales_SAMPLE.wdash
DTC_Sales_SAMPLE WaveDashboard DTC_Sales_SAMPLE.wdash-meta.xml
FieldUsage WaveDashboard FieldUsage.wdash
FieldUsage WaveDashboard FieldUsage.wdash-meta.xml
Opportunity_Details WaveDashboard Opportunity_Details.wdash
Opportunity_Details WaveDashboard Opportunity_Details.wdash-meta.xml
Regional_Sales_SAMPLE WaveDashboard Regional_Sales_SAMPLE.wdash
Regional_Sales_SAMPLE WaveDashboard Regional_Sales_SAMPLE.wdash-meta.xml
Sales_Performance_with_Selectable_Measures_Trailhead WaveDashboard Sales_Performance_with_Selectable_Measures_Trailhead.wdash
Sales_Performance_with_Selectable_Measures_Trailhead WaveDashboard Sales_Performance_with_Selectable_Measures_Trailhead.wdash-meta.xml
The_Motivator_1 WaveDashboard The_Motivator_1.wdash
The_Motivator_1 WaveDashboard The_Motivator_1.wdash-meta.xml
The_Motivator_2 WaveDashboard The_Motivator_2.wdash
The_Motivator_2 WaveDashboard The_Motivator_2.wdash-meta.xml
Worldwide_Fundraising_Starter WaveDashboard Worldwide_Fundraising_Starter.wdash
Worldwide_Fundraising_Starter WaveDashboard Worldwide_Fundraising_Starter.wdash-meta.xml
ExportCustomers WaveDataflow ExportCustomers.wdf
ExportCustomers WaveDataflow ExportCustomers.wdf-meta.xml
FlattenExample WaveDataflow FlattenExample.wdf
FlattenExample WaveDataflow FlattenExample.wdf-meta.xml
JOINSTutorial WaveDataflow JOINSTutorial.wdf
JOINSTutorial WaveDataflow JOINSTutorial.wdf-meta.xml
SalesEdgeEltWorkflow WaveDataflow SalesEdgeEltWorkflow.wdf
SalesEdgeEltWorkflow WaveDataflow SalesEdgeEltWorkflow.wdf-meta.xml
The_Motivator WaveDataflow The_Motivator.wdf
The_Motivator WaveDataflow The_Motivator.wdf-meta.xml
exportOppty WaveDataflow exportOppty.wdf
exportOppty WaveDataflow exportOppty.wdf-meta.xml
exportOppty2 WaveDataflow exportOppty2.wdf
exportOppty2 WaveDataflow exportOppty2.wdf-meta.xml
filterFruits WaveDataflow filterFruits.wdf
filterFruits WaveDataflow filterFruits.wdf-meta.xml
fruitsdf WaveDataflow fruitsdf.wdf
fruitsdf WaveDataflow fruitsdf.wdf-meta.xml
Benchmark WaveLens Benchmark.wlens
Benchmark WaveLens Benchmark.wlens-meta.xml
myFriutsYield WaveLens myFriutsYield.wlens
myFriutsYield WaveLens myFriutsYield.wlens-meta.xml
FlattenTest WaveRecipe FlattenTest.wdpr
FlattenTest WaveRecipe FlattenTest.wdpr-meta.xml
JoinedOutput WaveRecipe JoinedOutput.wdpr
JoinedOutput WaveRecipe JoinedOutput.wdpr-meta.xml
Yield2_Recipe WaveRecipe Yield2_Recipe.wdpr
Yield2_Recipe WaveRecipe Yield2_Recipe.wdpr-meta.xml
ea_s3_r WaveRecipe ea_s3_r.wdpr
ea_s3_r WaveRecipe ea_s3_r.wdpr-meta.xml
fruits WaveRecipe fruits.wdpr
fruits WaveRecipe fruits.wdpr-meta.xml
yield2_r WaveRecipe yield2_r.wdpr
yield2_r WaveRecipe yield2_r.wdpr-meta.xml
Successfully validated the deployment.
Next Steps
- Once this validation is successful, use Quick Deploy to deploy these wave assets into the target org
- After verifying the wave assets in the target org, commit them into version control
- Keep the version control up to date with your CRMA development
Performance
Lex Performance
Performance testing
About Performance testing
Goals
- Assessing the application's ability to handle a large volume of users and data without experiencing slowdowns or crashes.
Steps
Step | Details |
---|---|
Identify the business processes and use cases | Determine the workload and user behavior scenarios that we will simulate during testing |
Set up a test environment | This should mirror the production environment. |
Use a load testing tool | To simulate user activity and workload on your Salesforce instance |
Monitor and measure KPIs | response times, throughput, and error rates |
Analyze the results | To identify any performance issues or bottlenecks that need to be addressed. |
Repeat | To ensure that the Salesforce instance can handle increasing levels of user activity and data volumes. |
Flow
graph TD; A[Setup Sandbox for Perf testing]-->B[Draw a system diagram \nto visualize current and future features, systems,\n and users that involve Salesforce]; B-->C[For each part of the system: \n estimate peak load levels,\n average load\n levels features used\nIdentify your projected growth pattern]; C-->D[Calculate the throughput of your system in Requests per Second - RPS]; D-->E[Estimate the size and shape of your data, including \n the number of accounts, users, feeds, groups, and other objects]; E-->F[Identify the key personas for your org and\n plan your tests around their page flows.\nDifferent personas have different data volumes and data visibility]; F-->G[Measure your browser octane score and network latency using \nthe same hardware and network conditions as your users.\n Resolve any performance issues before testing your org]; G-->H[For each test:\n Define the scope of your investigation:\n What components are involved in the test?\n What metrics you want to measure?\n Run your performance test multiple times to eliminate variance.\n Run your tests at regular intervals,\n and take note of any changes in response times and throughput. ]; H-->I[Coordinate testing with Salesforce:\nPerformance tests can generate large amounts of traffic. \nTo avoid having your tests flagged as malicious activity,\ngive Salesforce a heads up]; style A fill:#f9f,stroke:#333,stroke-width:4px style B fill:#ff9,stroke:#333,stroke-width:4px style C fill:#9ff,stroke:#333,stroke-width:4px style D fill:#f9f,stroke:#333,stroke-width:4px style E fill:#ff9,stroke:#333,stroke-width:4px style F fill:#9ff,stroke:#333,stroke-width:4px style G fill:#f9f,stroke:#333,stroke-width:4px style H fill:#ff9,stroke:#333,stroke-width:4px style I fill:#f9f,stroke:#333,stroke-width:4px
Now the details
- A good testing strategy evaluates both performance and scalability. Performance refers to the speed and effectiveness of a system under a given workload within a given time frame. Scalability is the ability of a system to meet its response time or throughput objectives under increasing application and system processing demands. Make your implementation performant and scalable.
- It is about establishing confidence by observing how a system performs while it’s being used, often under increasing amounts of utilization.
- The Salesforce Platform is multitenant. That means that the underlying resources that power your Salesforce experience are shared by other customers. Intentionally overloading those resources to find their breaking point isn’t neighborly and is not allowed.
- Process of performance testing
- Technical aspects: how to test
Goals
- Establish confidence that the solution can scale to meet user needs, performs at scale, and has no unwanted side effects.
- Do pages load in less than two seconds?
- Growing the system:
  - User base: is the number of users increasing?
  - Data volume: is the amount of data growing at a fast rate?
  - Requests per second: are requests running too long or even timing out?
Metrics and measurements
- Identify your projected growth pattern
- Common challenging growth patterns
- Changing by an order of magnitude or more - NX
- Growing exponentially
- Rapidly or repeatedly doubling
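To see why repeated doubling is a challenging growth pattern, a quick back-of-the-envelope calculation helps; the starting load and capacity numbers below are made up purely for illustration:

```python
def periods_until_capacity(current_rps, capacity_rps, growth_factor=2):
    """How many growth periods (e.g. quarters) until load exceeds capacity,
    if load multiplies by growth_factor each period."""
    periods = 0
    while current_rps <= capacity_rps:
        current_rps *= growth_factor
        periods += 1
    return periods

# Doubling every quarter: 10 RPS today against a 1000 RPS budget
print(periods_until_capacity(10, 1000))  # 7 quarters (10 -> 1280 RPS)
```

Even a modest starting load overruns a 100x headroom budget in under two years of quarterly doubling, which is why these patterns deserve early performance testing.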
Points to note
- The Salesforce Platform works hard to ensure that resources don’t become overwhelmed.
- To prevent customizations from overloading the system, a governor manages resource use and capacity. If you exceed the capacity, the governor enforces a limit. This might cause errors or, in some cases, throttling.
- Coordinate testing with Salesforce
  - Performance tests can generate large amounts of traffic. The Salesforce infrastructure teams work hard to make sure that systems are reliable and available. Part of that is identifying malicious traffic and unapproved system behavior.
  - To avoid having your tests flagged as malicious activity, give Salesforce a heads-up about your testing plans by filing a support ticket.
  - Performance tests must be submitted for approval at least two weeks in advance of the testing date. Requests without a two-week notice are denied unless they’re categorized as urgent.
  - Include the following in your request:
    - A copy of the test plan that you created in Performance Assistant
    - The time and date when the test starts and ends
    - The Organization ID for the sandbox where the test occurs. Performance tests can be conducted only in a sandbox.
    - The contact information of the person conducting the test, including name, phone number, and email
    - The business case or scenario to test, for example, a holiday shopping event or a new website launch
    - The business justification for testing
    - If you expect volumes over 40 requests per second, include the ramp schedule provided in Performance Assistant
Best Practices and recommendations
Setting up Sandbox
- We recommend using a sandbox that is a full copy of your production org. Make sure that your sandbox’s data model is similar to production.
- Sandbox and production orgs exist in different instances, have different hardware, and can differ in performance. These differences can be most noticeable in asynchronous processing and database caching. Don’t use sandbox performance as a benchmark for production performance. Likewise, don’t use production as a benchmark for sandbox performance.
Visualization and RPS
- Draw a system diagram to visualize current and future features, systems, and users that involve Salesforce.
- For each part of the system, estimate:
  - peak load levels
  - average load levels
  - features used
  - Consider user arrival rates, login rates, which pages are viewed, and page views per session. If available, use any existing site data as a starting point.
- Calculate the throughput of your system in Requests per Second (RPS).
  - RPS combines inbound XMLHttpRequests (XHRs) and API calls, both of which are supported by Event Monitoring.
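As an illustrative sketch (all workload numbers here are hypothetical, not from any real org), peak RPS can be roughed out from the session arrival rate and the requests each session issues:

```python
def estimate_rps(sessions_per_hour, requests_per_session):
    """Rough peak throughput: sessions arriving per hour times the
    XHR + API requests each session issues, spread over 3600 seconds."""
    return sessions_per_hour * requests_per_session / 3600

# 1,800 sessions/hour at peak, ~40 XHR/API requests per session
print(estimate_rps(1800, 40))  # 20.0 RPS
```

A figure like this feeds directly into the support ticket discussed later: volumes over 40 RPS require a ramp schedule.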
Size and Shape of data
- Estimate the size and shape of your data, including the number of accounts, users, feeds, groups, and other objects.
- In your sandbox org, include any complex relationships between your objects, role hierarchies, and sharing rules.
Key Personas and Site Map
- After your sandbox org is set up, identify the key personas for your org and plan your tests around their page flows.
- Different personas have different data volumes and data visibility.
- Performance for a persona with a wide view of your org’s data, like the VP of Sales, can be different from users with more specialized roles.
- Use your key personas to build a site map and identify likely page flows for each persona.
Set Up and Run Performance Tests
- Create tests that evaluate your networks, key personas, and data loading.
- Before measuring your org’s performance, measure your browser’s Octane score and network latency using the same hardware and network conditions as your users. Resolve any performance issues before testing your org.
- For each test:
  - Define the scope of your investigation: what components are involved in the test, and what metrics you want to measure.
  - Run your performance test multiple times to eliminate variance.
  - Run your tests at regular intervals, and take note of any changes in response times and throughput.
- Performance testing is an iterative process. Finding and solving issues uncovered by your tests can uncover more issues.
Performance Assistant
- Guides you through the journey of performance testing for scalability
- Helps you to generate your test plan and schedule a testing window
References
- Introduction to Performance Testing
- Performance Testing on the Lightning Platform
- Throttle is/was applied to my org
- Performance test FAQs
- Measure Performance for Your Salesforce Org
- Performance Assistant
Tools
Terms
P95
In load testing, P95 (also known as the 95th percentile) is a metric that indicates the response time that 95% of requests fall below. This means that only 5% of requests take longer than the P95 response time.
For example, let's say you're load testing a web application and you send 100 requests. If the P95 response time is 500ms, this means that 95 of those requests were completed in 500ms or less, while the remaining 5 requests took longer than 500ms.
The P95 metric is useful because it provides insight into the performance of your application for the majority of users. It can help you identify performance bottlenecks and ensure that your application is able to handle traffic from a large number of users.
P99
In load testing, P99 (also known as the 99th percentile) is a metric that indicates the response time that 99% of requests fall below. This means that only 1% of requests take longer than the P99 response time.
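The nearest-rank method below shows how these percentiles are derived from raw response times (a simplified sketch; load testing tools may interpolate between samples instead):

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile: the value below which pct% of samples fall."""
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))  # 1-based rank
    return ordered[rank - 1]

# 100 response times (ms): 95 fast ones (<= 476ms) and 5 slow outliers
times = [100 + i * 4 for i in range(95)] + [900, 1200, 1500, 2000, 2500]
print(percentile(times, 95))  # 476 -> 95% of requests finished by 476ms
print(percentile(times, 99))  # 2000 -> only 1% took longer
```

Note how the five outliers barely move P95 but dominate P99, which is why both metrics are usually reported together.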
Examples
Artillery
vusers.created_by_name.Dev account signup: .................. 10
vusers.created.total: ....................................... 10
vusers.completed: ........................................... 10
vusers.session_length:
min: ...................................................... 3884.2
max: ...................................................... 13846.2
median: ................................................... 12711.5
p95: ...................................................... 12968.3
p99: ...................................................... 12968.3
browser.page_domcontentloaded: ........................... 20
browser.response_time:
min: ...................................................... 0
max: ...................................................... 1778.8
median: ................................................... 37.7
p95: ...................................................... 3828.5
p99: ...................................................... 3828.5
browser.page_domcontentloaded.dominteractive:
min: ...................................................... 297
max: ...................................................... 2247
median: ................................................... 1002.4
p95: ...................................................... 1939.5
p99: ...................................................... 1939.5
browser.page_domcontentloaded.dominteractive.https://artillery.io/:
min: ...................................................... 427
max: ...................................................... 2247
median: ................................................... 1130.2
p95: ...................................................... 1939.5
p99: ...................................................... 1939.5
browser.page_domcontentloaded.dominteractive.https://artillery.io/pro/:
min: ...................................................... 297
max: ...................................................... 1927
median: ................................................... 596
p95: ...................................................... 1380.5
p99: ...................................................... 1380.5
K6
// brew install k6
// run: export K6_BROWSER_ENABLED=true; k6 run kptest.mjs
// kptest.mjs
import { chromium } from 'k6/experimental/browser';
import { check } from 'k6'
export default async function () {
const browser = chromium.launch({ headless: false });
const page = browser.newPage();
try {
await page.goto('https://test.k6.io/my_messages.php', { waitUntil: 'networkidle' });
page.locator('input[name="login"]').type('admin');
page.locator('input[name="password"]').type('123');
await Promise.all([
page.waitForNavigation(),
page.locator('input[type="submit"]').click(),
]);
check(page, {
'header': page.locator('h2').textContent() == 'Welcome, admin!',
});
} finally {
page.close();
browser.close();
}
}
/\ |‾‾| /‾‾/ /‾‾/
/\ / \ | |/ / / /
/ \/ \ | ( / ‾‾\
/ \ | |\ \ | (‾) |
/ __________ \ |__| \__\ \_____/ .io
execution: local
script: kptest.mjs
output: -
scenarios: (100.00%) 1 scenario, 1 max VUs, 10m30s max duration (incl. graceful stop):
* default: 1 iterations for each of 1 VUs (maxDuration: 10m0s, gracefulStop: 30s)
✓ header
browser_dom_content_loaded.......: avg=8.29ms min=129µs med=5.18ms max=19.58ms p(90)=16.7ms p(95)=18.14ms
browser_first_contentful_paint...: avg=37.43ms min=19.31ms med=37.43ms max=55.55ms p(90)=51.92ms p(95)=53.74ms
browser_first_meaningful_paint...: avg=55.55ms min=55.55ms med=55.55ms max=55.55ms p(90)=55.55ms p(95)=55.55ms
browser_first_paint..............: avg=37.43ms min=19.31ms med=37.43ms max=55.55ms p(90)=51.92ms p(95)=53.74ms
browser_loaded...................: avg=8.3ms min=1.36ms med=4.12ms max=19.41ms p(90)=16.35ms p(95)=17.88ms
checks...........................: 100.00% ✓ 1 ✗ 0
data_received....................: 5.8 kB 1.5 kB/s
data_sent........................: 2.6 kB 663 B/s
http_req_connecting..............: avg=12.2ms min=0s med=0s max=61ms p(90)=36.6ms p(95)=48.79ms
http_req_duration................: avg=44.55ms min=1.23ms med=28.14ms max=114.44ms p(90)=89.65ms p(95)=102.05ms
http_req_receiving...............: avg=25.6ms min=0s med=26ms max=51ms p(90)=41.4ms p(95)=46.2ms
http_req_sending.................: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
http_req_tls_handshaking.........: avg=6.6ms min=0s med=0s max=33ms p(90)=19.8ms p(95)=26.39ms
http_reqs........................: 5 1.288394/s
iteration_duration...............: avg=3.88s min=3.88s med=3.88s max=3.88s p(90)=3.88s p(95)=3.88s
iterations.......................: 1 0.257679/s
vus..............................: 1 min=1 max=1
vus_max..........................: 1 min=1 max=1
running (00m03.9s), 0/1 VUs, 1 complete and 0 interrupted iterations
default ✓ [======================================] 1 VUs 00m03.9s/10m0s 1/1 iters, 1 per VU
// test1.js
import http from "k6/http";
import { check, sleep } from "k6";
// Test configuration
export const options = {
thresholds: {
// Assert that 99% of requests finish within 3000ms.
http_req_duration: ["p(99) < 3000"],
},
// Ramp the number of virtual users up and down
stages: [
{ duration: "30s", target: 15 },
{ duration: "1m", target: 15 },
{ duration: "20s", target: 0 },
],
};
// Simulated user behavior
export default function () {
let res = http.get("https://test-api.k6.io/public/crocodiles/1/");
// Validate response status
check(res, { "status was 200": (r) => r.status == 200 });
sleep(1);
}
k6 run test1.js
/\ |‾‾| /‾‾/ /‾‾/
/\ / \ | |/ / / /
/ \/ \ | ( / ‾‾\
/ \ | |\ \ | (‾) |
/ __________ \ |__| \__\ \_____/ .io
execution: local
script: test1.js
output: -
scenarios: (100.00%) 1 scenario, 15 max VUs, 2m20s max duration (incl. graceful stop):
* default: Up to 15 looping VUs for 1m50s over 3 stages (gracefulRampDown: 30s, gracefulStop: 30s)
✓ status was 200
checks.........................: 100.00% ✓ 1218 ✗ 0
data_received..................: 451 kB 4.1 kB/s
data_sent......................: 155 kB 1.4 kB/s
http_req_blocked...............: avg=1.41ms min=1µs med=5µs max=423.53ms p(90)=7µs p(95)=8.14µs
http_req_connecting............: avg=823.57µs min=0s med=0s max=341.59ms p(90)=0s p(95)=0s
✓ http_req_duration..............: avg=59.04ms min=28.25ms med=36.12ms max=655.37ms p(90)=86.64ms p(95)=187.46ms
{ expected_response:true }...: avg=59.04ms min=28.25ms med=36.12ms max=655.37ms p(90)=86.64ms p(95)=187.46ms
http_req_failed................: 0.00% ✓ 0 ✗ 1218
http_req_receiving.............: avg=59.58µs min=18µs med=55µs max=167µs p(90)=94µs p(95)=98µs
http_req_sending...............: avg=21.4µs min=7µs med=19µs max=76µs p(90)=36µs p(95)=40µs
http_req_tls_handshaking.......: avg=562.16µs min=0s med=0s max=81.86ms p(90)=0s p(95)=0s
http_req_waiting...............: avg=58.96ms min=28.21ms med=36.04ms max=655.29ms p(90)=86.55ms p(95)=187.37ms
http_reqs......................: 1218 10.964403/s
iteration_duration.............: avg=1.06s min=1.02s med=1.03s max=1.65s p(90)=1.09s p(95)=1.18s
iterations.....................: 1218 10.964403/s
vus............................: 1 min=1 max=15
vus_max........................: 15 min=15 max=15
running (1m51.1s), 00/15 VUs, 1218 complete and 0 interrupted iterations
default ✓ [======================================] 00/15 VUs 1m50s
Playwright + K6
Steps
- Write a Playwright script: use the Playwright API to write a script that automates the interactions with your web application, such as clicking on buttons, filling out forms, and navigating to different pages. This script should simulate a realistic user scenario.
- Instrument the script: add K6's instrumentation to your Playwright script. This will allow you to measure performance metrics, such as response time and throughput, during the test. You can use the K6 script generator to generate the instrumented script.
- Run the test: use the K6 CLI to run the instrumented script. You can customize the number of virtual users and other test parameters to simulate realistic user traffic.
- Analyze the results: after the test is complete, K6 will generate a report with performance metrics and other details. You can use this report to identify any performance issues and optimize your web application.
Testing with Playwright
Performance testing with Artillery and Playwright
import { test, expect } from '@playwright/test';
test('test', async ({ page }) => {
await page.goto('https://www.google.com/');
await page.getByRole('button', { name: 'Google Search' }).click();
await page.getByRole('link', { name: 'Introducing Lightning Web Components - Salesforce Developers https://developer.salesforce.com › documentation › lwc' }).click();
await page.getByRole('link', { name: 'Work with Salesforce Data' }).click();
await page.getByRole('treeitem', { name: 'Use Components Outside Salesforce' }).locator('div').click();
await page.getByRole('link', { name: 'Test Lightning Web Components' }).click();
});
- Playwright is a modern end-to-end browser testing framework. Artillery integrates with Playwright to let you run load tests with real headless browsers.
- Use playwright codegen to generate test scripts quickly.
artillery version
___ __ _ ____
_____/ | _____/ /_(_) / /__ _______ __ ___
/____/ /| | / ___/ __/ / / / _ \/ ___/ / / /____/
/____/ ___ |/ / / /_/ / / / __/ / / /_/ /____/
/_/ |_/_/ \__/_/_/_/\___/_/ \__ /
/____/
VERSION INFO:
Artillery: 2.0.0-29
Node.js: v16.17.1
OS: darwin
- Artillery is centered around the concept of virtual users (VUs).
Config file (Test Definition)
- A test definition is written as a YAML file
- The artillery CLI is then used to run that test definition.
config:
target: https://www.artillery.io
# Enable the Playwright engine:
engines:
playwright: {}
processor: "./flows.js"
scenarios:
- engine: playwright
flowFunction: "helloFlow"
flow: []
cat flows.js
module.exports = { helloFlow };
async function helloFlow(page) {
//
// The code below is just a standard Playwright script:
//
// Go to https://artillery.io/
await page.goto('https://artillery.io/');
// Click text=Pricing
await page.click('text=Pricing');
// assert.equal(page.url(), 'https://artillery.io/pro/');
// Click text=Sign up
await page.click('text=Sign up');
}
- Run your tests from your own AWS account with no infra to set up or manage
- Load test with Playwright: use Playwright to load test with real browsers
- Test any stack: test HTTP, WebSocket, Socket.io, gRPC, Kafka, HLS, and more
- Virtual users (VUs) and scenarios
- Workload modeling
  - A load phase tells Artillery how many virtual users to create over a period of time.
  - Each virtual user picks one of the scenarios in the test definition and runs it to completion.
  - A test run ends when all virtual users finish running their scenarios.
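Load phases are declared in the config section of the test definition. A sketch of a single ramping phase (the target URL and numbers are placeholders):

```yaml
config:
  target: "https://example.com"
  phases:
    # Over 60 seconds, ramp from 5 to 20 new virtual users per second
    - duration: 60
      arrivalRate: 5
      rampTo: 20
      name: "Ramp up"
scenarios:
  - name: "Browse"
    flow:
      - get:
          url: "/"
```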
Load tests vs smoke tests
- Artillery can be used for both load testing and smoke testing (also known as synthetic checks, or synthetic tests). The only difference between a load test and a smoke test is the number of VUs created:
- A load test creates many VUs
- A smoke test creates just one VU
Runtime Platforms
By default Artillery will run the test from the local machine. Since v2.0.0-21 Artillery can also run tests on arbitrary platforms. Currently, AWS Lambda support is available in preview mode.
Examples
- Run a quick test which generates 20 virtual users, each sending 100 GET requests to the specified target:
artillery quick \
--count 20 \
--num 100 \
https://service-foo.preprod.acmecorp.digital/items
- Run a distributed test on AWS Lambda in the us-east-1 region using 25 workers:
artillery run \
--platform aws:lambda \
--platform-opt region=us-east-1 \
--count 25 \
hello-artillery.yml
References
More with Playwright
import { test, expect } from '@playwright/test';
import fs from 'fs';
const settings = JSON.parse(fs.readFileSync('/Users/mchinnappan/.playwright/sel2/settings.json', 'utf-8'));
test('test', async ({ page }) => {
await page.goto('https://login.salesforce.com/');
await page.getByLabel('Username').click();
await page.getByLabel('Username').fill(settings.username);
await page.getByLabel('Username').press('Tab');
await page.getByLabel('Password').click();
await page.getByLabel('Password').fill(settings.password);
await page.getByRole('button', { name: 'Log In' }).click();
await page.getByRole('link', { name: 'Setup' }).click();
await page.getByRole('link', { name: 'Expand - Customize - Level 1' }).click();
await page.getByRole('link', { name: 'Expand - Accounts - Level 2' }).click();
await page.getByRole('link', { name: 'Fields' }).click();
await page.getByRole('link', { name: 'NumLakes' }).click();
await page.getByRole('button', { name: 'Edit' }).click();
await page.getByLabel('*Field Name').click();
await page.getByLabel('*Field Name').fill('NumLakes7');
page.once('dialog', dialog => {
console.log(`Dialog message: ${dialog.message()}`);
dialog.dismiss().catch(() => {});
});
await page.getByRole('row', { name: 'Custom Field Definition Edit Change Field Type Save Cancel', exact: true }).getByRole('button', { name: 'Save' }).click();
});
- GitHub Actions
name: Playwright Tests
on:
push:
branches: [ main, master ]
pull_request:
branches: [ main, master ]
jobs:
test:
timeout-minutes: 60
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: 16
- name: Install dependencies
run: npm ci
- name: Install Playwright Browsers
run: npx playwright install --with-deps
- name: Run Playwright tests
run: npx playwright test
- uses: actions/upload-artifact@v3
if: always()
with:
name: playwright-report
path: playwright-report/
Recording with Playwright in VSCode
import { test, expect } from '@playwright/test';
test('test', async ({ page }) => {
await page.goto('https://mohan-chinnappan-n5.github.io/xml/sfl-xslt.html');
await page.getByPlaceholder('Search for XML Type...').click();
await page.getByPlaceholder('Search for XML Type...').fill('pr');
await page.getByRole('option', { name: 'profile3' }).click();
await expect(page.getByRole('button', { name: 'Transform' })).toHaveText('Transform');
});
Python version
from playwright.sync_api import Page, expect
def test_example(page: Page) -> None:
page.goto("https://mohan-chinnappan-n5.github.io/xml/sfl-xslt.html")
page.get_by_placeholder("Search for XML Type...").click()
page.get_by_placeholder("Search for XML Type...").fill("p")
page.get_by_role("option", name="profile3").click()
expect(page.get_by_role('button', name='Transform' )).to_have_text('Transform')
Salesforce Edge Network
How to enable Salesforce Edge Network?
- To find out the effect of Salesforce Edge Network:
- Run speedtest.jsp (https://instance_url/speedtest.jsp) with Edge disabled
- Repeat speedtest.jsp (https://instance_url/speedtest.jsp) with Edge enabled
- Compare the two results
Lightning Experience Queries
Terms
- Experienced Page Time (EPT) is a performance metric Salesforce uses in Lightning to measure page load time. EPT measures how long it takes for a page to load into a state that a user can meaningfully interact with.
The EPT is measured as the time from the page start to when no more activity occurs for at least two frames (~33 ms). The two extra frames help to avoid false positives due to asynchronous calls. These calls include any XHR activity, any storage activity, or any user interaction or client-side work of any kind in the main JavaScript thread.
Topics
LightningUsageByBrowserMetrics
- Represents Lightning Experience usage grouped by user’s browser
sfdx mohanc:data:query -q "https://raw.githubusercontent.com/mohan-chinnappan-n/cli-dx/master/soql/ept.soql" -u mohan.chinnappan.n.sel@gmail.com -f json | pbcopy ; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
LightningUsageByPageMetrics
sfdx mohanc:data:query -q "https://raw.githubusercontent.com/mohan-chinnappan-n/cli-dx/master/soql/pageEPT_top10.soql" -u mohan.chinnappan.n.sel@gmail.com -f json | pbcopy ; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
Real User Monitoring (RUM) and Synthetic Monitoring
RUM
- An approach to monitoring and providing insights into web performance
- Generally best suited for understanding long-term trends
- Measures the performance of a page from real users' machines.
How does it work?
- Generally, a third-party provider injects a script into each page to measure and report page-load data for every request made. This technique monitors an application's actual user interactions.
- The browsers of real users report back performance metrics experienced.
- Helps identify how an application is being used, including the geographic distribution of users and the impact of that distribution on the end user experience.
- As users interact with an application, all performance timings are captured, regardless of what actions are taken or pages viewed.
- Monitors actual use cases, not the synthetic, assumed use cases predefined by an engineer
- Helps the business better understand its users and identify the areas of its site that require the most attention.
- Helps you understand the geographic or channel distribution trends of your users. This can help you better define your business plan and, from a monitoring perspective, allows you to identify key areas to target for optimization and performance improvements.
Synthetic Monitoring
- Involves monitoring the performance of a page in a 'laboratory' environment
- Well suited to regression testing and mitigating shorter-term performance issues during development
- Typically performed with automation tooling in as consistent an environment as possible
- Involves deploying scripts to simulate the path an end user might take through a web application, reporting back the performance the simulator experiences
- The traffic measured is not from your actual users, but rather synthetically generated traffic collecting data on page performance
- Done in a controlled environment where variables like geography, network, device, browser, and cache status are predetermined
- Controlling for environmental variables is helpful in understanding where performance bottlenecks occur and in identifying the source of performance issues
- An important component of regression testing and production site monitoring
Synthetic | RUM |
---|---|
Well suited for catching regressions during development life cycles, especially with network throttling | Provides real metrics from real users using the site or application. |
Inexpensive, and great for spot-checking performance during development as an effective way to measure the effect of code changes, but it doesn't reflect what real users are experiencing and provides only a narrow view of performance. | More expensive and likely less convenient, it provides vital user experience data. |
Performance APIs
- The Performance interface provides access to performance-related information for the current page.
const navigationEntries = performance.getEntriesByType("navigation")[0];
navigationEntries
const resourceListEntries = performance.getEntriesByType("resource");
resourceListEntries
const dnsTime = navigationEntries.domainLookupEnd - navigationEntries.domainLookupStart;
dnsTime
0
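From a navigation entry you can derive the usual network phases. A small sketch (the helper name `phaseDurations` and the sample timing values are made up for illustration; the field names follow the Navigation Timing spec):

```javascript
// Derive common phase durations from a PerformanceNavigationTiming-like entry.
function phaseDurations(entry) {
  return {
    dns: entry.domainLookupEnd - entry.domainLookupStart,       // DNS lookup
    tcp: entry.connectEnd - entry.connectStart,                 // TCP + TLS connect
    ttfb: entry.responseStart - entry.requestStart,             // time to first byte
    download: entry.responseEnd - entry.responseStart,          // response download
    domProcessing: entry.domComplete - entry.domInteractive,    // DOM processing
  };
}

// In the browser you would pass the real entry:
//   const nav = performance.getEntriesByType('navigation')[0];
// Here we use made-up sample values (all in milliseconds):
const sample = {
  domainLookupStart: 5, domainLookupEnd: 20,
  connectStart: 20, connectEnd: 60,
  requestStart: 60, responseStart: 110,
  responseEnd: 130,
  domInteractive: 300, domComplete: 550,
};
console.log(phaseDurations(sample));
// { dns: 15, tcp: 40, ttfb: 50, download: 20, domProcessing: 250 }
```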
Using puppeteer to view performance.timing
#!/usr/bin/env node
const puppeteer = require("puppeteer");
const site = process.argv[2] || "https://google.com";
const type = process.argv[3] || "resource";
console.log( `=== Getting performanceTiming: ${type} for site: ${site}: ===`);
(async () => {
const browser = await puppeteer.launch();
const page = await browser.newPage();
await page.goto(site);
// Executes Navigation API within the page context
let performanceTiming;
if (type === "resource") {
performanceTiming = JSON.parse(
await page.evaluate(() =>
JSON.stringify(window.performance.getEntriesByType("resource"))
)
);
} else if (type === "all") {
performanceTiming = JSON.parse(
await page.evaluate(() => JSON.stringify(window.performance.timing))
);
}
console.log(
`performanceTiming: ${type} for site: ${site}:`,
performanceTiming
);
await browser.close();
})();
./webperf.js https://google.com all
=== Getting performanceTiming: all for site: https://google.com: ===
performanceTiming: all for site: https://google.com:
{
connectStart: 1677720916684,
navigationStart: 1677720916432,
secureConnectionStart: 1677720916709,
fetchStart: 1677720916682,
domContentLoadedEventStart: 1677720917232,
responseStart: 1677720916935,
domInteractive: 1677720917153,
domainLookupEnd: 1677720916684,
responseEnd: 1677720917010,
redirectStart: 0,
requestStart: 1677720916750,
unloadEventEnd: 0,
unloadEventStart: 0,
domLoading: 1677720917031,
domComplete: 1677720917640,
domainLookupStart: 1677720916683,
loadEventStart: 1677720917641,
domContentLoadedEventEnd: 1677720917232,
loadEventEnd: 1677720917645,
redirectEnd: 0,
connectEnd: 1677720916750
}
./webperf.js https://google.com resource
=== Getting performanceTiming: resource for site: https://google.com: ===
performanceTiming: resource for site: https://google.com:
[
{
name: 'https://www.google.com/xjs/_/js/k=xjs.s.en_US.y9vzKGrGO0c.O/am=AIHIJ1wBcAAAAgAAABARAAAAAABgAACGACAAAAgAoBAGQQwGwFIJAAAAGH0QAQAAMABAAQwcBRAAAAAA-QMA8AIgBhMWAAAAAAAAAEBwEgSDGyQoCAABAAAAAAAAQJVMXFxBCA/d=1/ed=1/dg=2/rs=ACT90oFPVvppotwAo2N5XMiB1X5DNHmb0g/m=cdos,cr,dpf,hsm,jsa,d,csi',
entryType: 'resource',
startTime: 994.6000000238419,
duration: 97.5,
initiatorType: 'script',
nextHopProtocol: 'h2',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 994.6000000238419,
domainLookupStart: 994.6000000238419,
domainLookupEnd: 994.6000000238419,
connectStart: 994.6000000238419,
secureConnectionStart: 994.6000000238419,
connectEnd: 994.6000000238419,
requestStart: 1015.2999999523163,
responseStart: 1035.1999999284744,
responseEnd: 1092.1000000238419,
transferSize: 314805,
encodedBodySize: 314505,
decodedBodySize: 897753,
responseStatus: 200,
serverTiming: []
},
]
- PerformanceNavigationTiming
- PerformanceResourceTiming
- PerformanceMark
- PerformanceMeasure
- PerformancePaintTiming
- First paint - FP
- Time when anything is rendered. Note that marking the first paint is optional; not all user agents (browsers) report it.
- First contentful paint - FCP
- when the browser renders the first bit of content from the DOM
- FCP timestamp is when the browser first rendered any text, image (including background images), video, canvas that had been drawn into, or non-empty SVG.
- Largest contentful paint - LCP
- Provides timing information about the largest image or text paint before user input on a web page.
- Helps you minimize the time that users have to wait before they can see the site's content start to appear
Sample instrumentation code embedded into a web page, and the payload received on the RUM server side
Instrumentation code
// Function to send performance data to the RUM server
let rumServer = 'http://localhost:1000/rum';
const pageName = 'Markdown Editor - Kalam';
console.log(rumServer);
function sendPerformanceDataToServer(data) {
fetch(rumServer, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify(data),
})
.then((response) => {
if (response.ok) {
console.log('Performance data sent to RUM server successfully');
} else {
console.error('Failed to send performance data to RUM server');
}
})
.catch((error) => {
console.error('Error:', error);
});
}
// Function to gather and send performance metrics
function gatherAndSendPerformanceMetrics() {
console.log('inside gatherAndSendPerformanceMetrics() ')
if ('performance' in window) {
const perf = window.performance;
const timing = perf.timing;
const resources = perf.getEntriesByType('resource');
const userTimings = perf.getEntriesByType('measure');
// Calculate page load time
const pageLoadTime = timing.loadEventEnd - timing.navigationStart;
// Ensure pageLoadTime is non-negative
const nonNegativePageLoadTime = Math.max(0, pageLoadTime);
// Gather performance data including page name
const performanceData = {
pageName: pageName, // Replace with the actual page name
pageLoadTime: nonNegativePageLoadTime,
timeToFirstByte: timing.responseStart - timing.navigationStart,
resources: resources,
userTimings: userTimings,
};
// Send performance data to the RUM server
sendPerformanceDataToServer(performanceData);
}
else {
console.log('could not find "performance" in the window object')
}
}
// Log performance metrics when the page is fully loaded
window.addEventListener('load', gatherAndSendPerformanceMetrics);
The payload received by the RUM Server
heroku logs -t # for heroku RUM Server
const rum = {
pageName: 'Markdown Editor - Kalam',
pageLoadTime: 114,
timeToFirstByte: 5,
resources: [
{
name: 'https://cdnjs.cloudflare.com/ajax/libs/showdown/2.1.0/showdown.min.js',
entryType: 'resource',
startTime: 12.200000002980232,
duration: 0,
initiatorType: 'script',
deliveryType: 'cache',
nextHopProtocol: 'h3',
renderBlockingStatus: 'blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 12.200000002980232,
domainLookupStart: 12.200000002980232,
domainLookupEnd: 12.200000002980232,
connectStart: 12.200000002980232,
secureConnectionStart: 12.200000002980232,
connectEnd: 12.200000002980232,
requestStart: 12.200000002980232,
responseStart: 12.200000002980232,
firstInterimResponseStart: 0,
responseEnd: 12.200000002980232,
transferSize: 0,
encodedBodySize: 20705,
decodedBodySize: 75209,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://mohan-chinnappan-n5.github.io/pkg/js/lib/autocomplete.min.js',
entryType: 'resource',
startTime: 12.400000005960464,
duration: 0,
initiatorType: 'script',
deliveryType: '',
nextHopProtocol: '',
renderBlockingStatus: 'blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 12.400000005960464,
domainLookupStart: 0,
domainLookupEnd: 0,
connectStart: 0,
secureConnectionStart: 0,
connectEnd: 0,
requestStart: 0,
responseStart: 0,
firstInterimResponseStart: 0,
responseEnd: 12.400000005960464,
transferSize: 0,
encodedBodySize: 0,
decodedBodySize: 0,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://mohan-chinnappan-n5.github.io/pkg/css/lib/autocomplete.css',
entryType: 'resource',
startTime: 12.5,
duration: 11.700000002980232,
initiatorType: 'link',
deliveryType: '',
nextHopProtocol: '',
renderBlockingStatus: 'blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 12.5,
domainLookupStart: 0,
domainLookupEnd: 0,
connectStart: 0,
secureConnectionStart: 0,
connectEnd: 0,
requestStart: 0,
responseStart: 0,
firstInterimResponseStart: 0,
responseEnd: 24.200000002980232,
transferSize: 0,
encodedBodySize: 0,
decodedBodySize: 0,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://mohan-chinnappan-n.github.io/sfdc/gs/js/split.js',
entryType: 'resource',
startTime: 12.700000002980232,
duration: 0,
initiatorType: 'script',
deliveryType: '',
nextHopProtocol: '',
renderBlockingStatus: 'blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 12.700000002980232,
domainLookupStart: 0,
domainLookupEnd: 0,
connectStart: 0,
secureConnectionStart: 0,
connectEnd: 0,
requestStart: 0,
responseStart: 0,
firstInterimResponseStart: 0,
responseEnd: 12.700000002980232,
transferSize: 0,
encodedBodySize: 0,
decodedBodySize: 0,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://mohan-chinnappan-n.github.io/sfdc/gs/css/split.css',
entryType: 'resource',
startTime: 12.799999997019768,
duration: 14.600000008940697,
initiatorType: 'link',
deliveryType: '',
nextHopProtocol: '',
renderBlockingStatus: 'blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 12.799999997019768,
domainLookupStart: 0,
domainLookupEnd: 0,
connectStart: 0,
secureConnectionStart: 0,
connectEnd: 0,
requestStart: 0,
responseStart: 0,
firstInterimResponseStart: 0,
responseEnd: 27.400000005960464,
transferSize: 0,
encodedBodySize: 0,
decodedBodySize: 0,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://d3js.org/d3.v7.min.js',
entryType: 'resource',
startTime: 13,
duration: 0,
initiatorType: 'script',
deliveryType: '',
nextHopProtocol: '',
renderBlockingStatus: 'blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 13,
domainLookupStart: 0,
domainLookupEnd: 0,
connectStart: 0,
secureConnectionStart: 0,
connectEnd: 0,
requestStart: 0,
responseStart: 0,
firstInterimResponseStart: 0,
responseEnd: 13,
transferSize: 0,
encodedBodySize: 0,
decodedBodySize: 0,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.24.0/min/vs/loader.js',
entryType: 'resource',
startTime: 13.099999994039536,
duration: 0,
initiatorType: 'script',
deliveryType: 'cache',
nextHopProtocol: 'h2',
renderBlockingStatus: 'blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 13.099999994039536,
domainLookupStart: 13.099999994039536,
domainLookupEnd: 13.099999994039536,
connectStart: 13.099999994039536,
secureConnectionStart: 13.099999994039536,
connectEnd: 13.099999994039536,
requestStart: 13.099999994039536,
responseStart: 13.099999994039536,
firstInterimResponseStart: 0,
responseEnd: 13.099999994039536,
transferSize: 0,
encodedBodySize: 8361,
decodedBodySize: 32170,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://cdn.jsdelivr.net/npm/bootstrap@4.5.3/dist/css/bootstrap.min.css',
entryType: 'resource',
startTime: 13.200000002980232,
duration: 15.899999991059303,
initiatorType: 'link',
deliveryType: 'cache',
nextHopProtocol: 'h3',
renderBlockingStatus: 'blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 13.200000002980232,
domainLookupStart: 13.200000002980232,
domainLookupEnd: 13.200000002980232,
connectStart: 13.200000002980232,
secureConnectionStart: 13.200000002980232,
connectEnd: 13.200000002980232,
requestStart: 25.299999997019768,
responseStart: 27.099999994039536,
firstInterimResponseStart: 27.099999994039536,
responseEnd: 29.099999994039536,
transferSize: 0,
encodedBodySize: 26099,
decodedBodySize: 160392,
responseStatus: 200,
serverTiming: []
},
{
name: 'https://cdnjs.cloudflare.com/ajax/libs/jquery/3.6.0/jquery.min.js',
entryType: 'resource',
startTime: 13.400000005960464,
duration: 0,
initiatorType: 'script',
deliveryType: 'cache',
nextHopProtocol: 'h2',
renderBlockingStatus: 'blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 13.400000005960464,
domainLookupStart: 13.400000005960464,
domainLookupEnd: 13.400000005960464,
connectStart: 13.400000005960464,
secureConnectionStart: 13.400000005960464,
connectEnd: 13.400000005960464,
requestStart: 13.400000005960464,
responseStart: 13.400000005960464,
firstInterimResponseStart: 0,
responseEnd: 13.400000005960464,
transferSize: 0,
encodedBodySize: 27938,
decodedBodySize: 89501,
responseStatus: 200,
serverTiming: []
},
{
name: 'https://cdn.jsdelivr.net/npm/popper.js@1.16.1/dist/umd/popper.min.js',
entryType: 'resource',
startTime: 13.599999994039536,
duration: 0,
initiatorType: 'script',
deliveryType: 'cache',
nextHopProtocol: 'h3',
renderBlockingStatus: 'blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 13.599999994039536,
domainLookupStart: 13.599999994039536,
domainLookupEnd: 13.599999994039536,
connectStart: 13.599999994039536,
secureConnectionStart: 13.599999994039536,
connectEnd: 13.599999994039536,
requestStart: 13.599999994039536,
responseStart: 13.599999994039536,
firstInterimResponseStart: 0,
responseEnd: 13.599999994039536,
transferSize: 0,
encodedBodySize: 7831,
decodedBodySize: 21233,
responseStatus: 200,
serverTiming: []
},
{
name: 'https://cdn.jsdelivr.net/npm/bootstrap@4.5.3/dist/js/bootstrap.min.js',
entryType: 'resource',
startTime: 13.700000002980232,
duration: 0,
initiatorType: 'script',
deliveryType: 'cache',
nextHopProtocol: 'h3',
renderBlockingStatus: 'blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 13.700000002980232,
domainLookupStart: 13.700000002980232,
domainLookupEnd: 13.700000002980232,
connectStart: 13.700000002980232,
secureConnectionStart: 13.700000002980232,
connectEnd: 13.700000002980232,
requestStart: 13.700000002980232,
responseStart: 13.700000002980232,
firstInterimResponseStart: 0,
responseEnd: 13.700000002980232,
transferSize: 0,
encodedBodySize: 16162,
decodedBodySize: 63240,
responseStatus: 200,
serverTiming: []
},
{
name: 'http://localhost:7010/github-mirror-2/mohan-chinnappan-n5.github.io/md/js/app.js?v=5',
entryType: 'resource',
startTime: 13.700000002980232,
duration: 16.299999997019768,
initiatorType: 'script',
deliveryType: '',
nextHopProtocol: 'http/1.0',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 13.700000002980232,
domainLookupStart: 26.200000002980232,
domainLookupEnd: 26.200000002980232,
connectStart: 26.200000002980232,
secureConnectionStart: 0,
connectEnd: 26.700000002980232,
requestStart: 27,
responseStart: 28.799999997019768,
firstInterimResponseStart: 0,
responseEnd: 30,
transferSize: 8980,
encodedBodySize: 8680,
decodedBodySize: 8680,
responseStatus: 200,
serverTiming: []
},
{
name: 'https://mohan-chinnappan-n.github.io/sfdc/gs/css/vertical.png',
entryType: 'resource',
startTime: 108.29999999701977,
duration: 0,
initiatorType: 'css',
deliveryType: '',
nextHopProtocol: '',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 108.29999999701977,
domainLookupStart: 0,
domainLookupEnd: 0,
connectStart: 0,
secureConnectionStart: 0,
connectEnd: 0,
requestStart: 0,
responseStart: 0,
firstInterimResponseStart: 0,
responseEnd: 108.29999999701977,
transferSize: 0,
encodedBodySize: 0,
decodedBodySize: 0,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://raw.githubusercontent.com/mohan-chinnappan-n/project-docs/main/sf/list.txt',
entryType: 'resource',
startTime: 108.70000000298023,
duration: 160.09999999403954,
initiatorType: 'fetch',
deliveryType: '',
nextHopProtocol: '',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 108.70000000298023,
domainLookupStart: 0,
domainLookupEnd: 0,
connectStart: 0,
secureConnectionStart: 0,
connectEnd: 0,
requestStart: 0,
responseStart: 0,
firstInterimResponseStart: 0,
responseEnd: 268.79999999701977,
transferSize: 0,
encodedBodySize: 0,
decodedBodySize: 0,
responseStatus: 200,
serverTiming: []
},
{
name: 'https://mohan-chinnappan-n5.github.io/editor/img/favicon.ico',
entryType: 'resource',
startTime: 119.5,
duration: 2.5999999940395355,
initiatorType: 'other',
deliveryType: '',
nextHopProtocol: '',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 119.5,
domainLookupStart: 0,
domainLookupEnd: 0,
connectStart: 0,
secureConnectionStart: 0,
connectEnd: 0,
requestStart: 0,
responseStart: 0,
firstInterimResponseStart: 0,
responseEnd: 122.09999999403954,
transferSize: 0,
encodedBodySize: 0,
decodedBodySize: 0,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.24.0/min/vs/editor/editor.main.js',
entryType: 'resource',
startTime: 270.5,
duration: 18.200000002980232,
initiatorType: 'script',
deliveryType: 'cache',
nextHopProtocol: 'h3',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 270.5,
domainLookupStart: 270.5,
domainLookupEnd: 270.5,
connectStart: 270.5,
secureConnectionStart: 270.5,
connectEnd: 270.5,
requestStart: 271.40000000596046,
responseStart: 272.29999999701977,
firstInterimResponseStart: 272.29999999701977,
responseEnd: 288.70000000298023,
transferSize: 0,
encodedBodySize: 501975,
decodedBodySize: 2568106,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://mohan-chinnappan-n5.github.io/pkg/css/lib/images/search.svg',
entryType: 'resource',
startTime: 272.09999999403954,
duration: 0,
initiatorType: 'css',
deliveryType: '',
nextHopProtocol: '',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 272.09999999403954,
domainLookupStart: 0,
domainLookupEnd: 0,
connectStart: 0,
secureConnectionStart: 0,
connectEnd: 0,
requestStart: 0,
responseStart: 0,
firstInterimResponseStart: 0,
responseEnd: 272.09999999403954,
transferSize: 0,
encodedBodySize: 0,
decodedBodySize: 0,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.24.0/min/vs/editor/editor.main.css',
entryType: 'resource',
startTime: 309.70000000298023,
duration: 2.5,
initiatorType: 'link',
deliveryType: 'cache',
nextHopProtocol: 'h3',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 309.70000000298023,
domainLookupStart: 309.70000000298023,
domainLookupEnd: 309.70000000298023,
connectStart: 309.70000000298023,
secureConnectionStart: 309.70000000298023,
connectEnd: 309.70000000298023,
requestStart: 310.29999999701977,
responseStart: 311.29999999701977,
firstInterimResponseStart: 311.29999999701977,
responseEnd: 312.20000000298023,
transferSize: 0,
encodedBodySize: 11145,
decodedBodySize: 70275,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.24.0/min/vs/editor/editor.main.nls.js',
entryType: 'resource',
startTime: 314.29999999701977,
duration: 2.4000000059604645,
initiatorType: 'script',
deliveryType: 'cache',
nextHopProtocol: 'h3',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 314.29999999701977,
domainLookupStart: 314.29999999701977,
domainLookupEnd: 314.29999999701977,
connectStart: 314.29999999701977,
secureConnectionStart: 314.29999999701977,
connectEnd: 314.29999999701977,
requestStart: 315,
responseStart: 315.90000000596046,
firstInterimResponseStart: 315.90000000596046,
responseEnd: 316.70000000298023,
transferSize: 0,
encodedBodySize: 12372,
decodedBodySize: 62955,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.24.0/min/vs/base/browser/ui/codicons/codicon/codicon.ttf',
entryType: 'resource',
startTime: 346.40000000596046,
duration: 0,
initiatorType: 'css',
deliveryType: 'cache',
nextHopProtocol: 'h3',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 346.40000000596046,
domainLookupStart: 346.40000000596046,
domainLookupEnd: 346.40000000596046,
connectStart: 346.40000000596046,
secureConnectionStart: 346.40000000596046,
connectEnd: 346.40000000596046,
requestStart: 346.40000000596046,
responseStart: 346.40000000596046,
firstInterimResponseStart: 0,
responseEnd: 346.40000000596046,
transferSize: 0,
encodedBodySize: 40989,
decodedBodySize: 70964,
responseStatus: 200,
serverTiming: []
},
{
name: 'https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.24.0/min/vs/basic-languages/markdown/markdown.js',
entryType: 'resource',
startTime: 473.70000000298023,
duration: 4.5,
initiatorType: 'script',
deliveryType: 'cache',
nextHopProtocol: 'h3',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 473.70000000298023,
domainLookupStart: 473.70000000298023,
domainLookupEnd: 473.70000000298023,
connectStart: 473.70000000298023,
secureConnectionStart: 473.70000000298023,
connectEnd: 473.70000000298023,
requestStart: 477,
responseStart: 477.90000000596046,
firstInterimResponseStart: 477.90000000596046,
responseEnd: 478.20000000298023,
transferSize: 0,
encodedBodySize: 1407,
decodedBodySize: 4216,
responseStatus: 0,
serverTiming: []
},
{
name: 'https://www.greensboroughvets.com.au/images/mule.jpg',
entryType: 'resource',
startTime: 511.29999999701977,
duration: 0,
initiatorType: 'img',
deliveryType: '',
nextHopProtocol: '',
renderBlockingStatus: 'non-blocking',
workerStart: 0,
redirectStart: 0,
redirectEnd: 0,
fetchStart: 511.29999999701977,
domainLookupStart: 0,
domainLookupEnd: 0,
connectStart: 0,
secureConnectionStart: 0,
connectEnd: 0,
requestStart: 0,
responseStart: 0,
firstInterimResponseStart: 0,
responseEnd: 511.29999999701977,
transferSize: 0,
encodedBodySize: 0,
decodedBodySize: 0,
responseStatus: 0,
serverTiming: []
}
],
userTimings: []
}
Lifecycle Hooks
Component Lifecycle
- Lightning web components have a lifecycle managed by the framework.
- The framework creates components, inserts them into the DOM, renders them, and removes them from the DOM.
- It also monitors components for property changes.
A lifecycle hook is a callback method triggered at a specific phase of a component instance’s lifecycle.
Hook | When Called |
---|---|
constructor() | Called when the component is created |
connectedCallback() | Called when the element is inserted into a document |
renderedCallback() | Called after every render of the component |
render() | Call this method to update the UI |
disconnectedCallback() | Called when the element is removed from a document |
errorCallback(error, stack) | Called when a descendant component throws an error |
- Component lifecycle from creation through render
constructor() - Called when the component is created
- This hook flows from parent to child, which means that it fires in the parent first.
- Run Code When a Component Is Created
connectedCallback() - Called when the element is inserted into a document
- This hook flows from parent to child. You can’t access child elements because they don’t exist yet.
- This is invoked with the initial properties passed to the component.
- This callback is invoked after all the public properties are set and can be used to set the state of the component.
renderedCallback() - Called after every render of the component
- This lifecycle hook is specific to Lightning Web Components; it isn’t from the HTML custom elements specification. This hook flows from child to parent.
- A component is rerendered when the value of a property changes and that property is used either directly in a component template or indirectly in the getter of a property that is used in a template.
render() - Call this method to update the UI
- It may be called before or after connectedCallback().
- The render() method is not technically a lifecycle hook. It is a protected method on the LightningElement class.
disconnectedCallback() - Called when the element is removed from a document
- This hook flows from parent to child.
- Use disconnectedCallback() to clean up work done in the connectedCallback(), like purging caches or removing event listeners.
errorCallback(error, stack) - Called when a descendant component throws an error
- The error argument is a JavaScript native error object, and the stack argument is a string.
- This lifecycle hook is specific to Lightning Web Components; it isn’t from the HTML custom elements specification.
- Component instance is removed from the DOM
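The hook ordering described above can be illustrated with a plain-JavaScript simulation (this is not real LWC code; the `FakeComponent` class and its `mount()` method are invented for illustration only):

```javascript
// Simulates LWC hook ordering: constructor and connectedCallback flow
// parent -> child, while renderedCallback flows child -> parent.
const log = [];

class FakeComponent {
  constructor(name, childFactories = []) {
    this.name = name;
    this.childFactories = childFactories;
    log.push(`${name}.constructor`);
  }
  mount() {
    log.push(`${this.name}.connectedCallback`);
    // Rendering the parent's template creates and mounts its children:
    this.childFactories.map((create) => create()).forEach((c) => c.mount());
    // renderedCallback fires only after all children have rendered:
    log.push(`${this.name}.renderedCallback`);
  }
}

new FakeComponent('parent', [() => new FakeComponent('child')]).mount();
console.log(log.join('\n'));
// parent.constructor
// parent.connectedCallback
// child.constructor
// child.connectedCallback
// child.renderedCallback
// parent.renderedCallback
```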
References
- MDN - Performance Monitoring: RUM vs. synthetic monitoring
- LWC - Lifecycle Hooks
- HTML: Custom elements spec
- Dog Breed Example Playground
Apex CPU time limit
Salesforce imposes a limit on the CPU time that can be consumed in a single transaction:
Timeout | Type |
---|---|
10 seconds | Synchronous Apex |
60 seconds | Asynchronous Apex |
- Note: these are hard limits and can't be increased!
- An Apex CPU timeout exception means that Apex code (or Process Builder/Flows, or both) is exceeding the CPU time limit during execution.
- Any DML executed within the transaction is rolled back if the limit is hit.
- Only events that require application-server CPU time are counted.
Not counted
- Things beyond your control as a programmer are not included
- Time spent in the database retrieving records won't count.
- Time spent waiting for a callout to return is also not included.
- Code compilation time isn't counted.
Counted
- All Apex code
- Library functions exposed in Apex
- Workflow execution
- If DML in your code encounters a validation rule with a formula, we'll count the time spent evaluating that formula.
- CPU time is calculated for all executions on the Salesforce application servers occurring in one Apex transaction
Suggestions
- Use a Map-based query
- Building a Map directly from a SOQL query saves CPU time compared to looping over a list to build the Map
// Fetch up to 50,000 Accounts into a Map keyed by Id
Map<Id, Account> aMap = new Map<Id, Account>([SELECT Id, Name FROM Account LIMIT 50000]);
- Explore whether your business allows you to do the operation asynchronously
- If the code can execute in @future (or another asynchronous context), this breaks the transaction context, and the CPU timeout limit for asynchronous processes is 60 seconds (6 times the synchronous limit)
- Use aggregate SOQL
- Since database time is not counted as CPU time, it is always better to explore aggregate SOQL for your business use case.
- Say you want the sum of a field across some records. If you use a loop in Apex to compute it, you have clearly spent CPU time. Instead, push the calculation to the database layer using aggregate functions such as SUM and AVG. Likewise, explore GROUP BY and filtering at the database layer. Pushing calculations down to the database reduces the chance of hitting the CPU timeout.
- Take only the necessary data into a loop
- Filter down to just the records you need before iterating over a list. Excessive looping increases CPU time, just as it did when the number-of-script-statements limit existed.
Resolution
Item | Resolution |
---|---|
Process Builder | Where viable, convert Process Builder processes to Apex Triggers using an asynchronous approach if possible. |
Custom Code | If you have custom code (e.g., a trigger) you will need to contact your developer or responsible organization to refactor your current custom solutions to ensure you do not exceed the CPU time limit. |
Batch Processing | If you are inserting or updating a large number of records, try reducing the batch size. |
Async Apex | Use @future calls and asynchronous processing to move non-critical business logic out of triggers and/or Process Builder processes. |
Best Practices | Consider Salesforce best practices for Triggers and Bulk Requests |
Managed Package involvement | If code from a managed package (e.g., Rollup Helper) is taking part in a transaction, such as posting a Billing, you will need to contact the authors of that package. |
- Check for unexpected situation where trigger logic is executed multiple times
- Check for nested loops in Apex code
Using the Developer Console to view the timeline
- Switch the perspective and use the timeline view
- The timeline shows how the items (Apex, Flows, ...) are executed in sequence and how much time each of them consumes.
References
- Trigger and Bulk Request Best Practices
- Proactive Alert Monitoring: Apex CPU Timeout Exceptions
- Code more efficiently to avoid 'Apex CPU time limit exceeded'
Diagram tools
UML Sequence diagram tool
Diagram Editor
CRMA Dataflow Viz
XML tools
Topics
- Check email IDs in a given folder of XML files
- Check Object IDs in a given folder of XML files
- XML Transformer
- Using the CLI to convert an XLF file into TSV
- Check whether XML files in a given folder are valid
- Find Object.Field from Report Types
Check email IDs in a given folder of XML files
- You can use this tool to find any hard-coded email IDs in the code
sfdx mohanc:xml:email --help
Checks emails in given folder of xml
USAGE
$ sfdx mohanc xml email [-p <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
FLAGS
-p, --path=<value> Path to the xml files folder
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for this command invocation
DESCRIPTION
Checks emails in given folder of xml
EXAMPLES
** Checks emails in given folder of xml **
sfdx mohanc:xml:email -p <path/to/xml_files>
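The same check can be sketched in a few lines of Python. This is an illustrative sketch, not the plugin's actual implementation; the `scan_for_emails` name and the regex are assumptions:

```python
import re
from pathlib import Path

# Simple pattern for spotting email-like strings; real-world matching may need more care.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def scan_for_emails(folder):
    """Return {file name: sorted unique emails} for XML files containing hard-coded emails."""
    hits = {}
    for path in Path(folder).glob("*.xml"):
        found = EMAIL_RE.findall(path.read_text(encoding="utf-8"))
        if found:
            hits[path.name] = sorted(set(found))
    return hits
```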
Check Object IDs in a given folder of XML files
- You can use this tool to find any hard-coded Object IDs in the code
sfdx mohanc:xml:objectId --help
Checks ObjectIds in given folder of xml
USAGE
$ sfdx mohanc xml objectId [-p <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
FLAGS
-p, --path=<value> Path to the xml files folder
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for this command invocation
DESCRIPTION
Checks ObjectIds in given folder of xml
EXAMPLES
** Checks ObjectIds in given folder of xml **
sfdx mohanc:xml:objectId -p <path/to/xml_files>
Convert given XML file into JSON
Convert given XML file into JSON
USAGE
$ sfdx mohanc xml xml2json [-i <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
FLAGS
-i, --inputfilename=<value> Data file input in XML format
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for this command invocation
DESCRIPTION
Convert given XML file into JSON
EXAMPLES
** Convert given XML file into JSON **
sfdx mohanc:xml:xml2json -i /tmp/cd.xml
Demo
sfdx mohanc:xml:xml2json -i force-app/main/default/profiles/Admin.profile-meta.xml
bat force-app/main/default/profiles/Admin.profile-meta.xml.json
───────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ File: force-app/main/default/profiles/Admin.profile-meta.xml.json
───────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ {
2 │ "?xml": "",
3 │ "Profile": {
4 │ "custom": false,
5 │ "objectPermissions": [
6 │ {
7 │ "allowCreate": true,
8 │ "allowDelete": true,
9 │ "allowEdit": true,
10 │ "allowRead": true,
11 │ "modifyAllRecords": true,
12 │ "object": "Account_Plan__c",
13 │ "viewAllRecords": true
14 │ },
15 │ {
16 │ "allowCreate": true,
17 │ "allowDelete": true,
18 │ "allowEdit": true,
19 │ "allowRead": true,
20 │ "modifyAllRecords": true,
21 │ "object": "Sales_Play__c",
22 │ "viewAllRecords": true
23 │ },
24 │ {
25 │ "allowCreate": true,
26 │ "allowDelete": true,
27 │ "allowEdit": true,
28 │ "allowRead": true,
29 │ "modifyAllRecords": true,
30 │ "object": "Tree__c",
31 │ "viewAllRecords": true
:
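The conversion can be approximated with the Python standard library. This is an illustrative sketch, not the plugin's code; the real tool's handling of attributes and typing may differ:

```python
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    """Recursively turn an XML element into dicts/lists/scalars, as in the JSON above."""
    children = list(elem)
    if not children:
        text = (elem.text or "").strip()
        return {"true": True, "false": False}.get(text, text)  # type boolean leaves
    out = {}
    for child in children:
        value = element_to_dict(child)
        if child.tag in out:  # repeated tags (e.g. objectPermissions) become a list
            if not isinstance(out[child.tag], list):
                out[child.tag] = [out[child.tag]]
            out[child.tag].append(value)
        else:
            out[child.tag] = value
    return out

xml = ("<Profile><custom>false</custom>"
       "<objectPermissions><allowRead>true</allowRead><object>Tree__c</object></objectPermissions>"
       "</Profile>")
profile = element_to_dict(ET.fromstring(xml))
print(json.dumps({"Profile": profile}, indent=2))
```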
XML Transformer
sfdx mohanc:xml:transform --help
Transform given xml using XSLT
USAGE
$ sfdx mohanc xml transform [-i <string>] [-t <string>] [-m <string>] [--json] [--loglevel
trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
FLAGS
-i, --inputfilename=<value> Data file in XML format you want to do the transformation
-m, --mtype=<value> metadata type: Supported: [package,profile, permset]
-t, --xsltfilename=<value> XSLT file to do the transformation
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for this command invocation
DESCRIPTION
Transform given xml using XSLT
EXAMPLES
** Transform given xml using XSLT **
sfdx mohanc:xml:transform -i /tmp/cd.xml -t /tmp/cd.xslt
Using the CLI to convert an XLF file into TSV
- stf2tsvHtml.xsl used here
sfdx mohanc:xml:transform -i img/exported-2.xlf -t xsl/stf2tsvHtml.xsl
The above command will open an HTML page with the TSV content loaded in a textarea. You can copy the content of the textarea to create a TSV file.
- If you want to view the XLIFF file as a TSV page:
sfdx mohanc:xml:transform -i img/exported-2.xlf -t xsl/stf2tsv.xsl
=== Writing the output into file : img/exported-2.xlf.out.xml.html ===
- If you want to view the XLIFF file as a HTML page:
sfdx mohanc:xml:transform -i img/exported-2.xlf -t xsl/stf2html.xsl
=== Writing the output into file : img/exported-2.xlf.out.xml.html ===
Check whether XML files in a given folder are valid
sfdx mohanc:xml:validate -p force-app/main/default/profiles/ --help
Checks given folder of xml is valid or not
USAGE
$ sfdx mohanc xml validate -p <string> [-i] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]
FLAGS
-i, --invalid Print only invalid
-p, --path=<value> (required) Path to the xml files folder
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL) [default: warn] logging level for this command invocation
DESCRIPTION
Checks given folder of xml is valid or not
EXAMPLES
** Checks given folder of xml is valid or not **
sfdx mohanc:xml:validate -p <path/to/xml_files> -i
- Lists whether each XML file in the given path is valid
- The -i flag lists only the invalid files
Demo
- You can redirect the output to a TSV file and view it in a spreadsheet program
sfdx mohanc:xml:validate -p force-app/main/default/profiles/
force-app/main/default/profiles/Admin.profile-meta.xml Admin.profile-meta.xml true
force-app/main/default/profiles/Analytics Cloud Integration User.profile-meta.xml Analytics Cloud Integration User.profile-meta.xml true
force-app/main/default/profiles/Analytics Cloud Security User.profile-meta.xml Analytics Cloud Security User.profile-meta.xml true
force-app/main/default/profiles/Authenticated Website.profile-meta.xml Authenticated Website.profile-meta.xml true
force-app/main/default/profiles/Chatter External User.profile-meta.xml Chatter External User.profile-meta.xml true
force-app/main/default/profiles/Chatter Free User.profile-meta.xml Chatter Free User.profile-meta.xml true
force-app/main/default/profiles/Chatter Moderator User.profile-meta.xml Chatter Moderator User.profile-meta.xml true
force-app/main/default/profiles/ContractManager.profile-meta.xml ContractManager.profile-meta.xml true
force-app/main/default/profiles/Cross Org Data Proxy User.profile-meta.xml Cross Org Data Proxy User.profile-meta.xml true
force-app/main/default/profiles/Custom%3A Marketing Profile.profile-meta.xml Custom%3A Marketing Profile.profile-meta.xml true
force-app/main/default/profiles/Custom%3A Sales Profile.profile-meta.xml Custom%3A Sales Profile.profile-meta.xml true
force-app/main/default/profiles/Custom%3A Support Profile.profile-meta.xml Custom%3A Support Profile.profile-meta.xml true
force-app/main/default/profiles/Customer Community Login User.profile-meta.xml Customer Community Login User.profile-meta.xml true
force-app/main/default/profiles/Customer Community Plus Login User.profile-meta.xml Customer Community Plus Login User.profile-meta.xml true
force-app/main/default/profiles/Customer Community Plus User.profile-meta.xml Customer Community Plus User.profile-meta.xml true
force-app/main/default/profiles/Customer Community User.profile-meta.xml Customer Community User.profile-meta.xml true
force-app/main/default/profiles/Customer Portal Manager Custom.profile-meta.xml Customer Portal Manager Custom.profile-meta.xml true
force-app/main/default/profiles/Customer Portal Manager Standard.profile-meta.xml Customer Portal Manager Standard.profile-meta.xml true
force-app/main/default/profiles/External Apps Login User.profile-meta.xml External Apps Login User.profile-meta.xml true
force-app/main/default/profiles/External Identity User.profile-meta.xml External Identity User.profile-meta.xml true
force-app/main/default/profiles/Force%2Ecom - App Subscription User.profile-meta.xml Force%2Ecom - App Subscription User.profile-meta.xml true
force-app/main/default/profiles/Force%2Ecom - Free User.profile-meta.xml Force%2Ecom - Free User.profile-meta.xml true
force-app/main/default/profiles/Gold Partner User.profile-meta.xml Gold Partner User.profile-meta.xml true
force-app/main/default/profiles/Guest License User.profile-meta.xml Guest License User.profile-meta.xml true
force-app/main/default/profiles/High Volume Customer Portal User.profile-meta.xml High Volume Customer Portal User.profile-meta.xml true
force-app/main/default/profiles/HighVolumePortal.profile-meta.xml HighVolumePortal.profile-meta.xml true
force-app/main/default/profiles/Identity User.profile-meta.xml Identity User.profile-meta.xml true
force-app/main/default/profiles/MarketingProfile.profile-meta.xml MarketingProfile.profile-meta.xml true
force-app/main/default/profiles/Minimum Access - Salesforce.profile-meta.xml Minimum Access - Salesforce.profile-meta.xml true
force-app/main/default/profiles/mohanc Profile.profile-meta.xml mohanc Profile.profile-meta.xml true
force-app/main/default/profiles/Partner App Subscription User.profile-meta.xml Partner App Subscription User.profile-meta.xml true
force-app/main/default/profiles/Partner Community Login User.profile-meta.xml Partner Community Login User.profile-meta.xml true
force-app/main/default/profiles/Partner Community User.profile-meta.xml Partner Community User.profile-meta.xml true
force-app/main/default/profiles/PlatformPortal.profile-meta.xml PlatformPortal.profile-meta.xml true
force-app/main/default/profiles/Read Only.profile-meta.xml Read Only.profile-meta.xml true
force-app/main/default/profiles/Silver Partner User.profile-meta.xml Silver Partner User.profile-meta.xml true
force-app/main/default/profiles/SolutionManager.profile-meta.xml SolutionManager.profile-meta.xml true
force-app/main/default/profiles/Standard.profile-meta.xml Standard.profile-meta.xml true
force-app/main/default/profiles/StandardAul.profile-meta.xml StandardAul.profile-meta.xml true
force-app/main/default/profiles/Work%2Ecom Only User.profile-meta.xml Work%2Ecom Only User.profile-meta.xml true
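A minimal well-formedness check along the same lines can be sketched in Python. Note this only checks that each file parses; the CLI may perform deeper validation, and the function name here is an assumption:

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def validate_folder(folder, invalid_only=False):
    """Return (file name, is_valid) for each XML file; optionally keep only invalid ones."""
    rows = []
    for path in sorted(Path(folder).glob("*.xml")):
        try:
            ET.parse(path)
            valid = True
        except ET.ParseError:
            valid = False
        if not (invalid_only and valid):
            rows.append((path.name, valid))
    return rows
```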
Find Object.Field from Report Types
# get the json file for the given xml file
sfdx mohanc:xml:xml2json -i MyReport.reportType-meta.xml
# get the Object.Fields from the reports
sfdx mohanc:data:jq -i MyReport.reportType-meta.xml.json -f '.ReportType.sections[].columns[] | .table + "." + .field'
"Account.Type"
"Account.RecordType"
"Account.Parent"
"Account.BillingAddress"
"Account.BillingStreet"
"Account.BillingCity"
"Account.BillingState"
"Account.BillingPostalCode"
"Account.BillingCountry"
"Account.BillingStateCode"
"Account.BillingCountryCode"
"Account.BillingLatitude"
"Account.BillingLongitude"
"Account.BillingGeocodeAccuracy"
...
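The jq filter above can also be expressed in plain Python over the converted JSON. The nested sections/columns shape is assumed from the output shown:

```python
def report_type_columns(report_type_json):
    """List Object.Field pairs, mirroring '.ReportType.sections[].columns[]'."""
    cols = []
    for section in report_type_json["ReportType"]["sections"]:
        for col in section["columns"]:
            cols.append(f'{col["table"]}.{col["field"]}')
    return cols

doc = {"ReportType": {"sections": [{"columns": [
    {"table": "Account", "field": "Type"},
    {"table": "Account", "field": "RecordType"},
]}]}}
print(report_type_columns(doc))  # ['Account.Type', 'Account.RecordType']
```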
XSLT Transformer
SF-Land XSLT Transformer
Demos
XML Delta Util
App Online
Notes
- Currently the delta tool does not support layoutAssignments
Demos
Data Processing tools
How to convert CSV file into TSV (Tab Separated Values) or PSV (Pipe Separated Values)
cat fruits.csv
name,qty,store
Mango,100,"NY's fruits"
Apple,200,"NY's fruits"
Peach,300,FL
CSV to TSV
sfdx mohanc:data:file:csv2 -i fruits.csv -d $'\t'
name qty store
Mango 100 NY's fruits
Apple 200 NY's fruits
Peach 300 FL
CSV to PSV
sfdx mohanc:data:file:csv2 -i fruits.csv -d $'|'
name|qty|store
Mango|100|NY's fruits
Apple|200|NY's fruits
Peach|300|FL
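Because of quoted fields (note "NY's fruits"), a naive replace of commas won't do: the conversion must respect CSV quoting. A sketch using Python's csv module (illustrative, not the plugin's implementation):

```python
import csv
import io

def csv_to_delimited(csv_text, delimiter):
    """Re-emit CSV content with a new delimiter, preserving quoted fields."""
    rows = csv.reader(io.StringIO(csv_text))
    out = io.StringIO()
    csv.writer(out, delimiter=delimiter, lineterminator="\n").writerows(rows)
    return out.getvalue()

csv_text = 'name,qty,store\nMango,100,"NY\'s fruits"\nPeach,300,FL\n'
print(csv_to_delimited(csv_text, "\t"))  # TSV
print(csv_to_delimited(csv_text, "|"))   # PSV
```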
Data Cloud
Key Ideas
- Traditionally we store our data in transactional databases
- Great for transactions, synchronous writes, ...
- We run into challenges in scaling
- Data scale (e.g., event data, data from sensors, ...)
- Since data is locked into transactional systems, what we need to do is:
- Pull all that data out into other systems like Analytics
- Pull all the data into a data warehouse
- To do Machine Learning (ML), pull all the data into data lakes and do the ML there
- Data leaves Salesforce for these other systems
- Security control is lost
- The business definitions made in Salesforce are lost, and the semantics are lost
- You end up with a proliferation of systems that again need to be integrated back into Salesforce
With Data Cloud
- We are bringing the power of the big data processing into Salesforce, so we can run all on the same platform
- where you can run Analytics, ML, and real-time decision systems
- It is not a silo solution, but it is built into Salesforce platform
- So we can use the power of the Salesforce metadata platform and the data-scale capabilities of Hyperforce
- We put all the metadata into Salesforce Core (user, dev-experience, ...), stored in tables in the Core
- All the data sits in Hyperforce
- At the storage level, we store the data in Apache Parquet format
- Parquet is industry standard columnar format
- Helps to perform queries at scale
- We integrate this data with partitioning so you can ingest at high scale
- One customer brought 3 trillion rows of data into the system in a week; the system is capable of scaling to that level
- We have tested up to 100K requests / second per tenant
- These are volumes and scale the system can support
- Traditional data lakehouses are built for batches
- File upload works great
- They do not work well for streaming data
- We contribute to the open-source community so we can handle streaming data: as the events come in, we can update at scale
- Thousands of requests coming in, and we update on disk live
- Apache Iceberg sits on top of the Apache Parquet layer, which is stored in AWS S3 buckets
- Iceberg provides the metadata: the table structure of the Parquet files
Layer | Notes |
---|---|
Parquet | Columnar file |
Iceberg | Tabular view of the Parquet file |
- Iceberg supports fast incremental (delta) updates; we also added Salesforce metadata to Iceberg
- So our partners like Snowflake can query our data live: no copy, no ETL
- All the security control stays with Salesforce platform Admins
- Super powerful
- You have not lost the security
- You have not lost the semantics that you have created in the Salesforce Platform
Ways to use data in Data Cloud (JDBC or File level)
- Zero copy way of looking at Salesforce data with full security control of the Salesforce Admin
- Via the JDBC connector: analytics systems can make use of it
- File-level access, at the Iceberg level: partners like Snowflake can make use of it
Reverse (Data-In)
- If you already have a DW, Data Cloud will allow you to mount those tables virtually in Data Cloud (work in progress)
- REST API
- SOQL (with limitations) on a single entity
- ANSI SQL: full scale
Best Practices
- Make the Canonical Model correct (Business scenario)
- What is the velocity of the data coming in: batch or streaming?
- What cost are you willing to bear? Pricing is consumption-based
- Figuring out what you will use is important
Questions to ask:
- What is your data landscape?
- Starting fresh with Data Cloud?
- Do you already have a DW?
- Do you have MDM?
- What is your action system?
- Sales Cloud
- Service Cloud
- Marketing Cloud
Terms used
What is Apache Iceberg?
Apache Iceberg is an open-source table format that adds data warehouse-level capabilities to a traditional data lake. One of the Apache Software Foundation’s open-source projects, the Iceberg table format enables more efficient petabyte-scale data processing by creating an abstracted metadata layer that describes the files in a data lake’s object storage.
What is Zero ETL?
Zero-ETL is a set of integrations that eliminates or minimizes the need to build ETL data pipelines. Extract, transform, and load (ETL) is the process of combining, cleaning, and normalizing data from different sources to get it ready for analytics, artificial intelligence (AI) and machine learning (ML) workloads. Traditional ETL processes are time-consuming and complex to develop, maintain, and scale. Instead, zero-ETL integrations facilitate point-to-point data movement without the need to create ETL data pipelines. Zero-ETL can also enable querying across data silos without the need for data movement.
Apache Parquet format
Lakehouse
A lakehouse is a new type of data platform architecture that combines the key benefits of data lakes and data warehouses. Data lakes handle both structured and unstructured data, often for advanced analytics. Lakehouses combine the two, offering analytics flexibility with diverse data types.
Data lakes are typically stored in file format with variable organization or hierarchy. Built on object storage, data lakes allow the flexibility to store data of all types from a wide variety of sources, and they provide multi-modal access: query them, use Spark, and so on.
Data warehouses tend to be more performant than data lakes, but they can be more expensive and limited in their ability to scale. Typically you query them with SQL.
References
Video
Translation Tools
Topics
- Install Kural CLI
- Using Kural CLI to translate
- How to prepare translation TSV from STF file
- Using the CLI to convert an XLF file into TSV
- Using CLI to fill-in Salesforce translation file (STF)
Install
# linux and mac
sudo npm i -g kural
# windows
npm i -g kural
Chrome Driver
- Download the Chrome Driver file and put it in your PATH
Using Kural CLI to translate
kural i18n:translate
- Translates a given sentence into various languages and optionally provides a Google Translate URL to verify the translation work
- Uses Google Translate
# Translate to French
kural i18n:translate -i 'Good Morning, Friends!' -f 'en' -t fr
Bonjour les amis!
# Translate to Tamil
kural i18n:translate -i 'Good Morning, Friends!' -f 'en' -t ta
காலை வணக்கம் நண்பர்களே!
# Translate to Telugu
kural i18n:translate -i 'Good Morning, Friends!' -f 'en' -t te
శుభోదయం మిత్రులారా!
# Translate to Marathi
kural i18n:translate -i 'Good Morning, Friends!' -f 'en' -t mr -g true
सुप्रभात, मित्रांनो!
# Translate to Japanese and launch google translate to verify the translation work
kural i18n:translate -i 'Good Morning, Friends!' -f 'en' -t ja -g true
おはようございます、友達!
- Translation verification
How to prepare translation TSV from STF file
- Demo showing how to do the export - PDF
- exported-2.stf is the input file you got from Translation workbench export
sfdx mohanc:i18n:stf:prep -i exported-2.stf > exported-2.stf.tsv
exported-2.stf.tsv
Using the CLI to convert an XLF file into TSV
- If you exported the XLF file in the Translation Workbench, you will have information about the maximum width/size of the translation allowed for each key.
- stf2tsvHtml.xsl used here
sfdx mohanc:xml:transform -i img/exported-2.xlf -t xsl/stf2tsvHtml.xsl
The above command will open an HTML page with the TSV content loaded in a textarea. You can copy the content of the textarea to create a TSV file.
- If you want to view the XLIFF file as a TSV page:
sfdx mohanc:xml:transform -i img/exported-2.xlf -t xsl/stf2tsv.xsl
=== Writing the output into file : img/exported-2.xlf.out.xml.html ===
- If you want to view the XLIFF file as a HTML page:
sfdx mohanc:xml:transform -i img/exported-2.xlf -t xsl/stf2html.xsl
=== Writing the output into file : img/exported-2.xlf.out.xml.html ===
Using CLI to fill-in Salesforce translation file (STF)
NOTES
The CLI allows progressive filling of the STF with translated files
STEPS
- Export the STF file for a bilingual language (using Translation Workbench); in our case it is French (FR)
- Our file, say: exported.stf
- You have translated the keys in this STF file and have translated-1.psv (a Pipe-Separated Value file)
- Create the filled value stf file batch-1.stf using this command:
sfdx mohanc:i18n:stf:fill -i exported.stf -t translated-1.psv > batch-1.stf
cat translated-1.psv
1|ButtonOrLink.Account.Billing|Billing|Facturation
2|CustomField.Account.CustomerPriority.FieldLabel|Customer Priority|Priorité du client
batch-1.stf
- Now another translation team has provided you translated-2.psv
- You can apply this translation on top of batch-1.stf, using this command
sfdx mohanc:i18n:stf:fill -i batch-1.stf -t translated-2.psv > batch-2.stf
cat translated-2.psv
1|ButtonOrLink.Opportunity.DeliveryStatus|Delivery Status|Statut de livraison
2|CustomField.Account.NumberofLocations.FieldLabel|Number of Locations|Nombre d'emplacements
batch-2.stf
- and so on...
- You can load (import) the final stf file to the org in translation workbench
- If you export again you will see the results of your translations:
exported-2.stf
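The progressive fill step can be pictured with a simplified model. Real STF files carry header sections and metadata that this sketch ignores; it assumes data rows of the form key&lt;TAB&gt;source&lt;TAB&gt;translation, with an empty translation column when untranslated, so treat it as an illustration of the idea only:

```python
def fill_stf(stf_rows, psv_rows):
    """Fill empty translation columns from PSV rows of the form index|key|source|translation."""
    translations = {}
    for row in psv_rows:
        parts = row.rstrip("\n").split("|")
        if len(parts) == 4:
            translations[parts[1]] = parts[3]
    filled = []
    for row in stf_rows:
        cols = row.rstrip("\n").split("\t")
        if len(cols) == 3 and not cols[2] and cols[0] in translations:
            cols[2] = translations[cols[0]]  # only fill rows still untranslated
        filled.append("\t".join(cols))
    return filled

stf = ["ButtonOrLink.Account.Billing\tBilling\t",
       "CustomField.Account.CustomerPriority.FieldLabel\tCustomer Priority\t"]
psv = ["1|ButtonOrLink.Account.Billing|Billing|Facturation"]
print(fill_stf(stf, psv))  # first row filled; second left for a later batch
```

Running the fill again with a second PSV file only touches the still-empty rows, mirroring the batch-1.stf / batch-2.stf workflow above.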
Media - Recording tools
- Optimized for Chromium based browsers
- Chrome
- MS Edge
- Opera
Screen Recording tool without audio
Screen Recording tool with audio
Photo Booth
Audio Recorder (use a headset)
Video Recorder (use a headset)
All Media tools
Image Processing
SVG to PNG converter
Code Generators
LWC Code Generator
NLP (Natural Language Processing)
Playing with Spacy
Apex Code Generation using NLP
Machine Learning
In this section we will share tools regarding:
Semantic Doc Compare and Group
Description
- Semantic Document Compare and Group using TensorFlow, the universal-sentence-encoder, and Plotly. Thanks to the 'Attention Is All You Need' paper by Ashish Vaswani and team, and thanks to Hong Jing for his articles on this subject!
Demo
Data Catalog for ML Work
- You can use these datasets for ML work; please reference this site in your work, which will encourage this effort!
openAI - ChatGPT
ChatGPT writing Apex code
ChatGPT writing Scripts
openAI
Writing a web app with openAI GPT-3 Codex (DaVinci-Codex Engine)
Chat with openAI GPT-3 engine!
Einstein Bot
Einstein Bot - Simple - Appointment Scheduler
Dialogflow
Playing with Dialogflow Knowledge Base feature
Transformers
Model success depends on how fast you can build it and how good it is at reducing the loss!
- It is a general-purpose differentiable computer.
- A neural network architecture developed at Google that relies on self-attention mechanisms to transform a sequence of input embeddings into a sequence of output embeddings without relying on convolutions or recurrent neural networks.
- A Transformer can be viewed as a stack of self-attention layers.
- Based on self-attention mechanisms, unlike traditional convolutional (CNN) or recurrent (RNN) neural networks.
- The Transformer outperforms both recurrent and convolutional models in language translation benchmarks.
Resilient
Remarkably, the Transformer architecture published by the Google team has been very resilient over time. The Transformer that came out in 2017 is basically the same as today's, except that some of the layer normalizations have been reshuffled.
Researchers are making training datasets and evaluations much larger, but they are keeping the architecture untouched, which is remarkable!
- Self-attention layers aggregate information from surrounding words to determine the meaning of a given bit of language in context.
- "A word is known by the company it keeps": just as a person resembles the people they choose to spend time with, a word's meaning is shaped by the words that surround it.
Example:
- Deciding on the most likely meaning and appropriate representation of the word bank in the sentence "I arrived at the bank after crossing the ..." requires knowing whether the sentence ends in "... road." or "... river."
Architecture | Notes |
---|---|
RNN | Can determine that "bank" is likely to refer to the bank of a river after reading each word between "bank" and "river" step by step. The more steps a decision requires, the harder it is for an RNN to learn how to make it. The sequential nature of RNNs also makes it more difficult to fully take advantage of modern fast computing devices such as TPUs and GPUs, which excel at parallel rather than sequential processing. |
CNN | Less sequential than RNNs, but the number of steps required to combine information from distant parts of the input still grows with increasing distance. |
Transformer | Performs a small, constant number of steps (chosen empirically). In each step, it applies a self-attention mechanism which directly models relationships between all words in a sentence, regardless of their respective position. |
Feature | Comments |
---|---|
Expressive | In the forward pass, nodes store vectors, and these nodes look at each other to see what's important for their computation in the other nodes. This message-passing-like architecture is general (i.e., completeness) and powerful (i.e., efficiency), able to cover many real-world algorithms in a small number of compute steps; an empirical finding. |
Optimizable | With Back-Propagation and Gradient Descent. With residual connections, layer normalization and softmax attention. Residual connections support a kind of ability to learn short algorithms (think low LOC) fast and first, then gradually extend them longer during training. |
Efficient | High parallelism compute graph. Can efficiently run on modern hardware (GPUs). Compute graph is shallow and wide, mapping significantly better to our high-parallelism compute architectures |
- It can become a single neural network architecture for multi-modal inputs (text, image, video, audio), thus converging all the other architectures into one.
- Transformers combine some of the benefits traditionally seen with convolutional neural networks (CNNs) and recurrent neural networks (RNNs).
Architecture | Feature | Transformer |
---|---|---|
CNN | learn structured representations of images | ability to process input in a hierarchical manner, with each layer of the model learning increasingly complex features |
CNN | process input in a fixed-length window | use self-attention mechanisms that allow the model to directly relate different input elements to each other, regardless of their distance in the sequence. Capture long-range dependencies between input elements, |
RNN | Cyclic. Ability to capture dependencies between elements in a sequence | captures these dependencies with self-attention instead of recurrence, enabling parallel processing |
Model Architecture
- The encoder and decoder are represented by six layers each (the number can be anything; for example, BERT-large has 24 encoder blocks)
graph LR; Inputs-->Embedding-->SelfAttention-->FeedForwardNN-->Outputs
Steps | Description |
---|---|
1 | The input data of the encoder first passes through the Self-Attention layer. This layer helps the encoder look at other words in the input sentence when it encodes a particular word. |
2 | The results of this Self-Attention layer are fed to the FeedForward NN |
3 | As the model processes each token (each word in the input sequence, say w), the Self-Attention layer allows it to search for hints in the other tokens (words) in the input sequence that may help improve the encoding of that word (w). |
4 | The FeedForward NN has no interaction with other words, and therefore the various chains can be executed in parallel as they pass through this layer. This allows the Transformer to process all words in the input text in parallel. |
The model constantly cares about what "she", "it", "the", or "that" refers to (pronouns).
I have never been to Mars. But it is on my wish list.
Here "it" refers to Mars.
Query/Key/Value vectors - retrieval system
graph LR; Query-->Keys-->ToGetValues
- We want to compute the self-attention for the word "it" in the above sentence
- Create matrices Q, K, and V for this input sentence
- q (a row in matrix Q) represents the query vector for the word "it"
- K is the key matrix for all the words in the sentence
- V is the value matrix for all the words in the sentence
- The self-attention for the word "it" is then computed as:
- the dot product of Q and the transpose of K, divided by the square root of the dimension of K, passed through a softmax, followed by a dot product with V.
\( \mathrm{softmax}\left(\frac{Q K^{T}}{\sqrt{d_k}}\right) V \)
The matrices \( (W^Q, W^K, W^V) \) are initially randomly initialized weight matrices.
- The vector \( x_n \) in this case would be the word embedding of the input token (or the output of the previous attention layer).
Example
Hello World
- Embeddings
- \( x_0 \) : Hello
- \( x_1 \) : World
- We have vectors \( q, k, v \) for each of these two words.
- These vectors are generated by multiplying the embedding vector of each word by the weight matrices \( W^Q, W^K, W^V \)
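The Hello World example above can be sketched with NumPy. The dimensions and the random weight matrices are illustrative; in a trained model the weights come from training:

```python
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention: softmax(Q K^T / sqrt(d_k)) V."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # token-to-token relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # one new embedding per token

rng = np.random.default_rng(0)
X = rng.normal(size=(2, 4))          # embeddings x0 ("Hello") and x1 ("World")
W_q, W_k, W_v = (rng.normal(size=(4, 4)) for _ in range(3))
print(self_attention(X, W_q, W_k, W_v).shape)  # (2, 4)
```

Each output row mixes information from both tokens, weighted by how strongly that token's query matches the other tokens' keys.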
Encoder and Decoder
An encoder transforms a sequence of embeddings into a new sequence of the same length.
- Includes N identical layers (Nx), each of which contains two sub-layers.
- These two sub-layers are applied at each position of the input embedding sequence, transforming each element of the sequence into a new embedding.
- The first encoder sub-layer aggregates information from across the input sequence (aggregation).
- The second encoder sub-layer transforms the aggregated information into an output embedding (transformation).
- Each encoder layer thus consists of two sub-layers: Self-Attention and a Feed-Forward Neural Network.
graph LR; inputSequence --> Aggregate --> Transform --> outputSequence
A decoder transforms a sequence of input embeddings into a sequence of output embeddings, possibly with a different length.
- Includes N identical layers with three sub-layers
- First and second of which are similar to the encoder sub-layers.
- The third decoder sub-layer takes the output of the encoder and applies the self-attention mechanism to gather information from it
graph LR; inputSequence --> Aggregate --> Transform --> ApplySelfAttention
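The aggregate-then-transform pattern shared by encoder and decoder layers can be sketched abstractly. The `aggregate` and `transform` functions below are hypothetical toy stand-ins for self-attention and the feed-forward network, not real implementations:

```python
def encoder_layer(seq, aggregate, transform):
    """One layer: sub-layer 1 aggregates across the sequence,
    sub-layer 2 transforms each position independently."""
    # sub-layer 1: each position gathers information from the whole sequence
    aggregated = [aggregate(x, seq) for x in seq]
    # sub-layer 2: position-wise transformation, parallelizable across positions
    return [transform(x) for x in aggregated]

# toy stand-ins (NOT real self-attention / feed-forward):
mean = lambda xs: sum(xs) / len(xs)
aggregate = lambda x, seq: (x + mean(seq)) / 2   # blend each position with the sequence mean
transform = lambda x: 2 * x                      # trivial position-wise transform

out = encoder_layer([1.0, 2.0, 3.0], aggregate, transform)
```

Note how the output sequence has the same length as the input, as the encoder description above requires.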
Self-attention
For this sentence:
The animal didn't cross the street because it was too tired.
- the self-attention layer's attention pattern for the pronoun it (the darkness of each line indicates how much each word contributes to the representation):
- The self-attention layer highlights words that are relevant to it. In this case, the attention layer has learned to highlight words that it might refer to, assigning the highest weight to animal.
- For a sequence of n tokens, self-attention transforms a sequence of embeddings n separate times, once at each position in the sequence.
In the example:
I arrived at the bank after crossing the river
- To determine that the word bank refers to the shore of a river and not a financial institution, the Transformer can learn to immediately attend to the word river and make this decision in a single step
- To compute the next representation for a given word - bank for example - the Transformer compares it to every other word in the sentence
- The result of these comparisons is an attention score for every other word in the sentence.
- These attention scores determine how much each of the other words should contribute to the next representation of bank
- The disambiguating river could receive a high attention score when computing a new representation for bank
- The attention scores are then used as weights for a weighted average of all words’ representations which is fed into a fully-connected network to generate a new representation for bank, reflecting that the sentence is talking about a river bank.
Neural networks for machine translation typically contain
- an encoder reading the input sentence and generating a representation of it
- a decoder that then generates the output sentence word by word while consulting the representation generated by the encoder
- The Transformer starts by generating initial representations, or embeddings, for each word.
- These are represented by the unfilled circles
- Then, using self-attention, it aggregates information from all of the other words, generating a new representation per word informed by the entire context, represented by the filled balls.
- This step is then repeated multiple times in parallel for all words, successively generating new representations.
- The decoder operates similarly, but generates one word at a time, from left to right. It attends not only to the previously generated words, but also to the final representations generated by the encoder.
- We can visualize what other parts of a sentence the network attends to when processing or translating a given word, thus gaining insight into how information travels through the network.
- In the first sentence pair it refers to the animal, and in the second to the street.
- In one of its steps, the Transformer clearly identified the two nouns it could refer to and the respective amount of attention reflects its choice in the different contexts.
Given this insight, it might not be that surprising that the Transformer also performs very well on the classic language analysis task of syntactic constituency parsing, a task the natural language processing community has attacked with highly specialized systems for decades.
Terms
Term | Description |
---|---|
Gradient descent | A mathematical technique to minimize loss. Gradient descent iteratively adjusts weights and biases, gradually finding the best combination to minimize loss |
Loss | The difference between the prediction and the label value is the loss |
Forward Pass | System processes a batch of examples to yield prediction(s). The system compares each prediction to each label value. The difference between the prediction and the label value is the loss for that example. The system aggregates the losses for all the examples to compute the total loss for the current batch. |
Backward pass (backpropagation) | System reduces loss by adjusting the weights of all the neurons in all the hidden layer(s). Backpropagation determines whether to increase or decrease the weights applied to particular neurons. |
Hidden layer | A layer in a neural network between the input layer (the features) and the output layer (the prediction). Each hidden layer consists of one or more neurons. |
Learning Rate | Multiplier that controls the degree to which each backward pass increases or decreases each weight. A large learning rate will increase or decrease each weight more than a small learning rate |
Chain rule | expresses the derivative of the composition of two differentiable functions f and g in terms of the derivatives of f and g |
Lagrange's notation | a prime mark denotes a derivative. If f is a function, then its derivative evaluated at x is written f′(x) |
Encode | In sequence-to-sequence tasks, encoder takes an input sequence and returns an internal state (a vector). Then, the decoder uses that internal state to predict the next sequence. |
Tensor | N-dimensional (where N could be very large) data structures, most commonly scalars, vectors, or matrices. The elements of a Tensor can hold integer, floating-point, or string values |
language understanding tasks | language modeling, machine translation and question answering |
Attention | Weighted sum over a set of inputs, where the weight for each input is computed by another part of the neural network |
self-attention | Each embedding in the output sequence is constructed by integrating information from the elements of the input sequence through an attention mechanism. The self part of self-attention refers to the sequence attending to itself rather than to some other context. A self-attention layer starts with a sequence of input representations, one for each word. For each word in an input sequence, the network scores the relevance of the word to every element in the whole sequence of words. The relevance scores determine how much the word's final representation incorporates the representations of other words. Self-attention and multi-head self-attention are the main building blocks of Transformers and use dictionary-lookup terminology such as “query”, “key”, and “value”. |
Masked language modelling | Encoder only architectures (like BERT) solve the task of predicting the masked word in a sentence. So the attention can see all the words before and after this masked word |
Predict next token | Predict the next token or set of tokens, i.e. given tokens [0, ..., n-1] predict n |
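Several of these terms (forward pass, loss, backward pass, learning rate, chain rule) can be illustrated with a minimal gradient-descent loop. The one-weight model and data below are made up purely for illustration:

```python
# Fit y = w * x to toy data with gradient descent on squared loss.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # labels follow y = 2 * x
w = 0.0              # initial weight
learning_rate = 0.05

for step in range(100):
    # forward pass: predictions for the batch and the aggregated loss
    loss = sum((w * x - y) ** 2 for x, y in data)
    # backward pass: d(loss)/dw, via the chain rule: 2 * (w*x - y) * x
    grad = sum(2 * (w * x - y) * x for x, y in data)
    # the learning rate scales how much each backward pass changes the weight
    w -= learning_rate * grad

# w converges toward 2.0, the slope that generated the labels
```

A larger learning rate would change w more per step but can overshoot; a smaller one converges more slowly.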
Sentence completion task
- This requires the decoder portion of the transformer architecture
- For translation tasks we need to encode the input text first
- and have a cross-attention layer with the translated text
Large Language Models (LLMs)
Huggingface using langchain
pip -q install langchain huggingface_hub transformers sentence_transformers accelerate bitsandbytes
import os
os.environ['HUGGINGFACEHUB_API_TOKEN'] = 'XYZ'
from langchain import PromptTemplate, HuggingFaceHub, LLMChain
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["question"])
llm_chain = LLMChain(prompt=prompt,
llm=HuggingFaceHub(repo_id="google/flan-t5-xl",
model_kwargs={"temperature":0,
"max_length":64}))
question = "What is the capital of England?"
print(llm_chain.run(question))
## prints:
### London is the capital of England. The final answer: London.
Using Large Language Models (LLMs) to Improve Team Productivity
Large language models (LLMs), such as GPT-4 and Bard, can be highly valuable in enhancing developer productivity across several areas.
-
Code Generation: Developers often spend a significant amount of time writing repetitive or boilerplate code. LLMs can assist by generating code snippets or complete functions based on given descriptions or requirements. Developers can provide prompts like function signatures or high-level descriptions to leverage the model, saving time and effort.
-
Documentation Assistance: Documentation is crucial for code maintainability and collaboration. LLMs can automate the process by generating descriptions or summaries for functions, classes, or entire projects. This feature alleviates the burden on developers and ensures well-documented codebases.
-
Bug Identification and Fixing: LLMs can help identify potential bugs by analyzing patterns and suggesting fixes. Developers can provide code snippets or error messages to the model, which can propose solutions or pinpoint potential issues. This capability is particularly valuable for debugging complex or hard-to-reproduce problems.
-
Natural Language Interfaces: LLMs can be employed to create natural language interfaces for development tools and APIs. Developers can interact with the model using plain English, asking questions or giving commands, and the model can interpret and execute those instructions. This simplifies the learning curve for new tools and facilitates interaction with complex APIs.
-
Code Completion and Auto-Suggestions: Integrated Development Environments (IDEs) and code editors can leverage LLMs to provide intelligent code completion suggestions. By analyzing the context of the code being written, the model can predict the next likely tokens, saving developers from manually typing long and repetitive code snippets. This feature significantly enhances developer productivity and reduces errors.
Additionally, LLMs can offer other productivity enhancements, such as generating test classes, providing baseline test plans, offering ideas for writing QA scripts, assisting with performance testing planning, aiding in sprint planning based on provided solution details, providing input to standard PERT and Gantt tools, generating deployment plans and release plans based on solution requirements.
It's important to recognize that while LLMs can provide valuable assistance, they should not replace human expertise. Instead, they should be viewed as tools that support and augment developers' work. Developers should exercise caution and validate the suggestions provided by these models to ensure the quality and correctness of the code produced.
Furthermore, most chat systems have a conversation-ID as a sticky bit, enabling them to remember the context throughout interactions with the system. This feature enhances the continuity and efficiency of communication with the LLM-based system.
Document management tools
OCR the paper docs via images
Design Notes
Chrome Extensions
Extensions covered
Chrome Extensions
ONNU
- ONNU is a Chrome extension that provides utilities for Salesforce Admins and Developers
- Available from the Chrome Web Store - Onnu - Salesforce Utils
- Once you have installed ONNU in your Chrome browser, log into your Salesforce org and switch to Classic mode to start:
Topics
- About Onnu
- How to get ERD of a SObject
- How to get CRMA (Einstein Analytics) Metadata Assets?
- Screenshots
- SetupAuditTrail Query
Videos
About ONNU
How to get ERD of a sObject?
How to get CRMA (Einstein Analytics) Metadata Assets?
Screenshots
- Compare queries from 2 orgs
SetupAuditTrail Query
SELECT CreatedDate,
CreatedBy.Name,
CreatedByContext,
CreatedByIssuer,
Display,
Section
FROM SetupAuditTrail
ORDER BY CreatedDate DESC
LIMIT 10
About the name ONNU
- ONNU to mean ON to New
- ONNU is number one in some Indian languages
VS Code Extensions
Extensions covered
SF Land
Welcome to the VS Code Extension - SF-LAND!
Install sf-land from Visual Studio MarketPlace
Features
Salesforce specific - Login required
Code Scan (Salesforce Login not required)
LWC Code Gen (Salesforce Login not required)
Provide datatable view for a JSON node in the editor
Provide datatable view for a CSV File in the editor
Render JSON in the Editor as Bar Chart
i18n
Word Cloud
Pie Chart
GraphViz
REST client
Crypto features
Doc features
More Demos
SFL: Login into Salesforce to get Auth File
- Login into Salesforce and create an Auth JSON file
- Uses SFDX for this operation
SFL: Login with Auth JSON File
- Login into the Salesforce Org with the given Auth JSON file
SFL: Run SOQL Query
- Runs a given SOQL query and provides the results output in HTML
- Select Tooling to use a Tooling query
- You can select a SOQL query from the standard list using SFL: Select SOQL Query
SFL: Run Anonymous Apex
- Runs the Anonymous Apex in the editor
SFL: Run Org Limits
- Display Limits info in your org
SFL: List Fields for SObject
- Field Usage Demo
SFL: Load Data Using BulkAPI 2.0
cat /tmp/case2.csv
Subject,Priority
Automobile having fuel issues,High
Automobile Auto-Pilot has input error,High
SFL: Delete Data Using BulkAPI 2.0
cat /tmp/case2_del.csv
Id
5004x00000KdRUWAA3
5004x00000KdRUXAA3
SFL: Run PMD Scan
- Runs PMD Scan
  - Set up the PMD executable as per this Doc
- You can also run the PMD scan on the current file in the editor
  - using SFL: Run PMD Scan for File in the Editor
- Demo
SFL: Google Translate
- Translate the current editor text using Google Translate
SFL: Draw Word Cloud
- Draws Word Cloud for the current editor text
SFL: Draw graphViz graph
- Draws graphViz graph for the current editor text
SFL: REST Client
- You can run SFL: Select REST Resource to select from the list of standard REST resources
- input
{
"method": "SFGET",
"url": "sobjects/Opportunity/describe"
}
- Response will be rendered in a VS Code window
SFL: Render Markdown in Editor as Webpage
- Renders a Markdown document in the editor to html page
SFL:SECURITY: Create JWT
- input
{
"payload": { "user": "Dennis Ritchie"},
"expiresIn": 3
}
- JWT
eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyIjoiRGVubmlzIFJpdGNoaWUiLCJpYXQiOjE2NjQ5MjUzODQsImV4cCI6MzMyOTg1MDc3MX0.dEjtntdJ5anLHh8A1w8n24UBkejVeMy-c9Wru-PsMNw
SFL:SECURITY: Verify JWT
- input
eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyIjoiRGVubmlzIFJpdGNoaWUiLCJpYXQiOjE2NjQ5MjUzODQsImV4cCI6MzMyOTg1MDc3MX0.dEjtntdJ5anLHh8A1w8n24UBkejVeMy-c9Wru-PsMNw
- output
{
"user": "Dennis Ritchie",
"iat": 1664925384,
"exp": 3329850771
}
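The JWT create/verify commands above can be mimicked with a stdlib-only HS256 sketch. The secret below is a made-up placeholder; the extension's actual signing key and claims handling may differ:

```python
import base64, hashlib, hmac, json, time

def b64url(data: bytes) -> str:
    # base64url without padding, as used in JWTs
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def create_jwt(payload: dict, secret: str, expires_in: int) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    now = int(time.time())
    claims = {**payload, "iat": now, "exp": now + expires_in}
    signing_input = b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(claims).encode())
    sig = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

def verify_jwt(token: str, secret: str) -> dict:
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(b64url(expected), sig):
        raise ValueError("bad signature")
    claims_b64 = signing_input.split(".")[1]
    claims_b64 += "=" * (-len(claims_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(claims_b64))

token = create_jwt({"user": "Dennis Ritchie"}, secret="example-secret", expires_in=3)
claims = verify_jwt(token, "example-secret")
```

Verification recomputes the HMAC over the header.payload portion and compares it to the signature, which is why a tampered token or wrong secret fails.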
LWC Code gen
- SFL: Generate LWC for SOQL Query
- For the Given Single Object based SOQL:
SELECT Id, Name, Amount FROM Opportunity
Draw Pie Chart
- SFL: Draw Pie Chart
- Sample data file in the editor
{
"title": "Fruits Yield",
"name": "Yield",
"data": [{ "Mango": 40 },
{ "Peach": 10 },
{ "Pear": 5 },
{ "Plum": 8 },
{ "Apple": 20 }
]
}
Render Datatable
- SFL: Render Selected JSON Node in Datatable"
- Sample data file in the editor
{
"title": "Fruits",
"data": [
{ "Name": "Mango", "qty": 2000, "goodQuality": true },
{ "Name": "Apple", "qty": 3000, "goodQuality": true },
{ "Name": "Pear", "qty": 500, "goodQuality": true },
{ "Name": "Peach", "qty": 4000, "goodQuality": true }
]
}
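Rendering a JSON node as a datatable essentially flattens the array of objects into a header row plus data rows. A minimal sketch of that transformation (the extension's real implementation may differ; the sample here is a subset of the file above):

```python
import json

# sample JSON as it would appear in the editor
sample = '''{
  "title": "Fruits",
  "data": [
    { "Name": "Mango", "qty": 2000, "goodQuality": true },
    { "Name": "Apple", "qty": 3000, "goodQuality": true }
  ]
}'''

doc = json.loads(sample)
rows = doc["data"]
header = list(rows[0].keys())                      # column names from the first row
table = [header] + [[row.get(col) for col in header] for row in rows]
```

`row.get(col)` yields None for objects missing a column, so ragged arrays still produce rectangular tables.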
Render CSV in datatable
Render JSON in the Editor as Bar Chart
Delete Metadata
More Demos
- Named Credentials
Stay tuned... More coming...
Enjoy!
- Built by Mohan Chinnappan with ♥
VSCode Apex PMD
From Visual Studio MarketPlace
Install it from Visual Studio MarketPlace
VSCode Extension for PMD code scan for Salesforce Apex
Allows you to run Apex Static Code Analysis directly in VSCode.
ANT task based PMD
If you are looking for ANT task based PMD for Apex, here it is.
Features Demo
Current actions Supported
Run analysis on:
- file open
- file save
- entire workspace
- single file
Installation
- Must have JDK >= 1.4 installed and in path
- Download a PMD "bin" release (>= 5.6)
- Unzip the zip file from step #2 to a location of choice and make a note of that location. This is the value for apexPMD.pmdPath
- In VSCode, open Preferences: User Settings and set apexPMD.pmdPath to the folder where PMD was unzipped in step 3, along with the other config items as shown below:
Sample Configuration -> VSCode::Preferences: User Settings
{
// ...
// absolute path to where PMD was installed
// following example in my case
"apexPMD.pmdPath":"/Users/mchinnappan/code-scan-tools/pmd/pmd-bin-5.8.1",
// Set to false to use your own ruleset (set path)
"apexPMD.useDefaultRuleset": "true",
// Absolute path to ruleset xml file. Must also set `useDefaultRuleset:false`.
"apexPMD.rulesetPath": "",
// Will run static analysis every time a file is opened
"apexPMD.runOnFileOpen": "true",
// Will run static analysis every time a file is saved
"apexPMD.runOnFileSave": "true",
// Determines at what priority level 'errors' will be added. Anything less will be a warning or hint
"apexPMD.priorityErrorThreshold": "1",
// Determines at what priority level 'warnings' will be added. Anything less will be a hint
"apexPMD.priorityWarnThreshold": "3"
}
Ruleset
Here is the sample ruleset used in this extension
From Visual Studio MarketPlace
Install it from Visual Studio MarketPlace
VSCode tips
How to comment out multiple lines
- Steps (macOS)
  - Select multiple lines
  - [Shift] + [Option] + A
- Steps (Linux)
  - Select multiple lines
  - [Shift] + [Ctrl] + A
- Steps (Windows)
  - Select multiple lines
  - [Shift] + [Alt] + A
Data Viz
Data Viz tools
Data Catalog
D3 Experiments
- Draw Bar Chart based on Clipboard data
- Copy a JSON file into your clipboard and click the above URL; change x to the field name for the x-axis and y to the field name for the y-axis
- Sample Data - copy this into clipboard
[
{
"db": "PostgreSQL",
"value": 72.08
},
{
"db": "Redis",
"value": 71.04
},
{
"db": "MongoDB",
"value": 60.51
},
{
"db": "SQLLite",
"value": 57.51
},
{
"db": "MSSQL",
"value": 54.95
},
{
"db": "MYSQL",
"value": 51.17
},
{
"db": "DynamoDB",
"value": 53.63
},
{
"db": "Oracle",
"value": 35.33
},
{
"db": "DB2",
"value": 26.0
}
]
- Click on Viz Url
D3 CodePens
Releases
Spring 23
- Calendar
References
REST Tester
- HTML markup is server-side-rendered in this app
URLs
- https://mohanc-rest.deno.dev
- https://mohanc-rest.deno.dev/users/1
- https://mohanc-rest.deno.dev/users/2
Verbs
GET
Date
curl -X GET https://mohanc-rest.deno.dev/date | jq
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 35 100 35 0 0 208 0 --:--:-- --:--:-- --:--:-- 207
{
"date": "2023-02-06T00:36:12.687Z"
}
GET a particular user
curl -X GET https://mohanc-rest.deno.dev/users/1 | jq
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 36 100 36 0 0 238 0 --:--:-- --:--:-- --:--:-- 238
{
"id": "1",
"username": "Ken Thompson"
}
POST
curl -X POST https://mohanc-rest.deno.dev/users/1000 -d '{"text":"Awesome user!"}' | jq
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 117 100 93 100 24 352 90 --:--:-- --:--:-- --:--:-- 443
{
"msg": "Received a POST HTTP method for user of id: 1000",
"payload": {
"text": "Awesome user!"
}
}
PUT
curl -X PUT https://mohanc-rest.deno.dev/users/1 | jq
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 54 100 54 0 0 308 0 --:--:-- --:--:-- --:--:-- 308
{
"msg": "Received a PUT HTTP method for user of id: 1"
}
DELETE
curl -X DELETE https://mohanc-rest.deno.dev/users/1 | jq
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 57 100 57 0 0 370 0 --:--:-- --:--:-- --:--:-- 367
{
"msg": "Received a DELETE HTTP method for user of id: 1"
}
Security
Exploit and prevention
AppExchange apps
- App:Authenticated and Guest User Access Report and Monitoring
- Demo: Authenticated and Guest User Access Report and Monitoring App Overview & Demo
CLI
Articles
- Many Public Salesforce Sites are Leaking Private Data
- Salesforce Lightning - An in-depth look at exploitation vectors for the everyday community
Pen Testing
- Pen Testing
- Vulnerability Assessment
- Security Testing and Evaluation (ST&E)
- Cybersecurity Threats and Threat Actors
- Security Assessment Agreement
Sharing Debugger
How to run sharing debugger for the given user ?
- Chrome Extension version
Demo
DX Version
- Requires version 0.0.144 of the plugin
- sfdx-mohanc-plugins@0.0.144
- How to install the plugin
- Login using sfdx force:auth:web:login
Usage
$ sfdx mohanc:sharing:debug -h
Run Sharing Debugger on the given org for the given user
USAGE
$ sfdx mohanc:sharing:debug
OPTIONS
-n, --username=username User Name to Debug
-u, --targetusername=targetusername username or alias for the target org; overrides default target org
-v, --targetdevhubusername=targetdevhubusername username or alias for the dev hub org; overrides default dev hub org
--apiversion=apiversion override the api version used for api requests made by this command
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal) logging level for this command invocation
EXAMPLE
** Run Sharing Debugger on the given org for the given user **
USAGE
sfdx mohanc:sharing:debug -u <username> -n <userNameToDebug>
writes userInfo.json and userInfo.svg files in the current directory
-----------
Demo
$ sfdx mohanc:sharing:debug -u mohan.chinnappan.n_ea2@gmail.com -n mohan.chinnappan.n_ea2@gmail.com
____________Profile____________
profileId : 00e3h000001kqJMAAY
____________Permissions____________
"Id","Name","PermissionsViewAllData","PermissionsModifyAllData","PermissionsManageDataIntegrations"
"00e3h000001kqJMAAY","System Administrator",true,true,true
____________PermSets____________
"Id","Label","PermissionsViewAllData","PermissionsModifyAllData","PermissionsManageDataIntegrations"
"0PS3h000002PwNZGA0","00ex00000018ozh_128_09_04_12_1",true,true,true
____________AccountShare____________
"Id","AccountId","AccountAccessLevel","OpportunityAccessLevel","CaseAccessLevel","ContactAccessLevel","RowCause"
"00r3h000033f9xzAAA","0013h00000FfgCMAAZ","All","Edit","Edit","Edit","Owner"
"00r3h000033fA4bAAE","0013h00000FfgCNAAZ","All","Edit","Edit","Edit","Owner"
____________OpportunityShare____________
=== None found ===
Running Sharing debugger for all users
bash process_sharing_debug.sh <login-user-name>
# Example
bash process_sharing_debug.sh mohan.chinnappan.n.sel@gmail.com
FAQ
- Why does this user have access to this Account record when, per the org security settings, they should not?
- Refer: Users can view records which are not shared to them
How to query AccountShare object
SELECT Id ,AccountId,
UserOrGroupId,AccountAccessLevel,OpportunityAccessLevel,CaseAccessLevel,ContactAccessLevel,RowCause
FROM AccountShare
WHERE AccountId = 'YOUR ACCOUNT ID'
How to query UserRecordAccess object
sfdx mohanc:data:query -q ~/.query/userRecordAccess.soql -u mohan.chinnappan.n_ea2@gmail.com -f json
[
{
"attributes": {
"type": "UserRecordAccess",
"url": "/services/data/v52.0/sobjects/UserRecordAccess/0013h00000FfgCMAAZ"
},
"RecordId": "0013h00000FfgCMAAZ",
"HasReadAccess": true,
"HasEditAccess": true,
"HasDeleteAccess": true,
"HasTransferAccess": true,
"HasAllAccess": true
}
]
cat ~/.query/userRecordAccess.soql
SELECT RecordId,
HasReadAccess,HasEditAccess,HasDeleteAccess,
HasTransferAccess,HasAllAccess
FROM UserRecordAccess
WHERE UserId='0053h000002xQ5sAAE' AND
RecordId = '0013h00000FfgCMAAZ'
Listing share sObjects
$ sfdx mohanc:ws:rest -f header_ea.json -r https://mohansun-ea-02-dev-ed.my.salesforce.com/services/data/v49.0/sobjects -m GET > sobjects.json
# sObjects with name ending with Share
$ sfdx mohanc:data:jq -i sobjects.json -f '.sobjects[].name | select ( contains("Share") )'
"AccountBrandShare"
"AccountShare"
"AcquiredAccount__Share"
"AssetShare"
"AuthorizationFormConsentShare"
"AuthorizationFormDataUseShare"
"AuthorizationFormShare"
"CalendarViewShare"
"CampaignShare"
"CaseShare"
"ChannelProgramLevelShare"
"ChannelProgramMemberShare"
"ChannelProgramShare"
"CommSubscriptionChannelTypeShare"
"CommSubscriptionConsentShare"
"CommSubscriptionShare"
"ConsumptionScheduleShare"
"ContactPointAddressShare"
"ContactPointConsentShare"
"ContactPointEmailShare"
"ContactPointPhoneShare"
"ContactPointTypeConsentShare"
"ContactRequestShare"
"ContactShare"
"DataUseLegalBasisShare"
"DataUsePurposeShare"
"EngagementChannelTypeShare"
"ExternalEventMappingShare"
"FlowInterviewLogShare"
"FlowInterviewShare"
"ImageShare"
"IndividualShare"
"LeadShare"
"ListEmailShare"
"MacroShare"
"MacroUsageShare"
"MyFilter__Share"
"OpportunityHistory__Share"
"OpportunityShare"
"OrderShare"
"OrgDeleteRequestShare"
"PartnerFundAllocationShare"
"PartnerFundClaimShare"
"PartnerFundRequestShare"
"PartnerMarketingBudgetShare"
"PartyConsentShare"
"PromptActionShare"
"QuickTextShare"
"QuickTextUsageShare"
"StreamingChannelShare"
"TodayGoalShare"
"UserAppMenuCustomizationShare"
"UserEmailPreferredPersonShare"
"UserProvisioningRequestShare"
"UserShare"
# __Share sObjects
$ sfdx mohanc:data:jq -i sobjects.json -f '.sobjects[].name | select ( contains("__Share") )'
"AcquiredAccount__Share"
"MyFilter__Share"
"OpportunityHistory__Share"
# list name,isCustomObject
$ sfdx mohanc:data:jq -i sobjects.json -f '.sobjects[] | .name + "," + (.custom | tostring)' | sed -e 's/"//g'
AIInsightAction,false
AIInsightFeedback,false
AIInsightReason,false
AIInsightValue,false
AIMetric,false
AIPredictionEvent,false
AIRecordInsight,false
AcceptedEventRelation,false
...
Querying the AccountShare
$ sfdx mohanc:data:query -q accountShare.soql -u mohan.chinnappan.n_ea2@gmail.com | sed -e 's/""/"/g' | sed -e 's/"//g'
attributes,Id,AccountId,UserOrGroupId,AccountAccessLevel,ContactAccessLevel,OpportunityAccessLevel,CaseAccessLevel
{type:AccountShare,url:/services/data/v49.0/sobjects/AccountShare/00r3h00000gJvnMAAS},00r3h00000gJvnMAAS,0013h00000FffwFAAR,00G3h000001rhfAEAQ,Read,Read,None,None
{type:AccountShare,url:/services/data/v49.0/sobjects/AccountShare/00r3h00000gJvnDAAS},00r3h00000gJvnDAAS,0013h00000FffwMAAR,00G3h000001rhfAEAQ,Read,Read,None,None
{type:AccountShare,url:/services/data/v49.0/sobjects/AccountShare/00r3h00000gJvryAAC},00r3h00000gJvryAAC,0013h00000FffwVAAR,00G3h000001rhfAEAQ,Read,Read,None,None
...
$ cat accountShare.soql
SELECT Id, AccountId, UserOrGroupId,
AccountAccessLevel,
ContactAccessLevel,
OpportunityAccessLevel,
CaseAccessLevel
FROM AccountShare
Getting name,label,type,length for a given sObject (say Group)
$ bash ./sh/getObjectFields.sh Group
Id,Group ID,id,18
Name,Name,string,40
DeveloperName,Developer Name,string,80
RelatedId,Related ID,reference,18
Type,Type,picklist,40
Email,Email,email,255
OwnerId,Owner ID,reference,18
DoesSendEmailToMembers,Send Email to Members,boolean,0
DoesIncludeBosses,Include Bosses,boolean,0
CreatedDate,Created Date,datetime,0
CreatedById,Created By ID,reference,18
LastModifiedDate,Last Modified Date,datetime,0
LastModifiedById,Last Modified By ID,reference,18
SystemModstamp,System Modstamp,datetime,0
Read Group sObject
$ sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q ./soql/group.soql | sed -e 's/""/"/g' | sed -e 's/"//g'
attributes,Id,Name,DeveloperName,RelatedId,Type,Email,OwnerId,DoesSendEmailToMembers,DoesIncludeBosses
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001rhf6EAA},00G3h000001rhf6EAA,,Sales_WW,00E3h000001JaqREAS,Role,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001rhf8EAA},00G3h000001rhf8EAA,,CEOA,00E3h000001JaqUEAS,Role,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001rhf9EAA},00G3h000001rhf9EAA,,CIO,00E3h000001JaqQEAS,Role,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001rhfAEAQ},00G3h000001rhfAEAQ,,Sales_AMER,00E3h000001JaqSEAS,Role,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001rhfBEAQ},00G3h000001rhfBEAQ,,Sales_EMEA,00E3h000001JaqTEAS,Role,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001rhfCEAQ},00G3h000001rhfCEAQ,,CIO,00E3h000001JaqQEAS,RoleAndSubordinates,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001rhfDEAQ},00G3h000001rhfDEAQ,,Sales_WW,00E3h000001JaqREAS,RoleAndSubordinates,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001rhfEEAQ},00G3h000001rhfEEAQ,,Sales_AMER,00E3h000001JaqSEAS,RoleAndSubordinates,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001rhfFEAQ},00G3h000001rhfFEAQ,,Sales_EMEA,00E3h000001JaqTEAS,RoleAndSubordinates,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001rhfGEAQ},00G3h000001rhfGEAQ,,CEOD,00E3h000001JaqUEAS,RoleAndSubordinates,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001HciTEAS},00G3h000001HciTEAS,,CEO,00E3h000001JaqUEAS,RoleAndSubordinatesInternal,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001HciUEAS},00G3h000001HciUEAS,,CIO,00E3h000001JaqQEAS,RoleAndSubordinatesInternal,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001HciVEAS},00G3h000001HciVEAS,,Sales_AMER,00E3h000001JaqSEAS,RoleAndSubordinatesInternal,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001HciWEAS},00G3h000001HciWEAS,,Sales_EMEA,00E3h000001JaqTEAS,RoleAndSubordinatesInternal,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001HciXEAS},00G3h000001HciXEAS,,Sales_WW,00E3h000001JaqREAS,RoleAndSubordinatesInternal,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001HciSEAS},00G3h000001HciSEAS,,AllPartnerUsers,,PRMOrganization,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001rhf7EAA},00G3h000001rhf7EAA,,AllInternalUsers,,Organization,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001HciYEAS},00G3h000001HciYEAS,,AllCustomerPortalUsers,,AllCustomerPortal,,00D3h000007R1LuEAK,false,true
{type:Group,url:/services/data/v49.0/sobjects/Group/00G3h000001HcicEAC},00G3h000001HcicEAC,selfservice,selfservice,0053h000002xqUhAAI,GuestUserGroup,,00D3h000007R1LuEAK,false,true
$ cat soql/group.soql
SELECT
Id,
Name,
DeveloperName,
RelatedId,
Type,
Email,
OwnerId,
DoesSendEmailToMembers,
DoesIncludeBosses
FROM Group
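The `sed -e 's/""/"/g' | sed -e 's/"//g'` pipeline used above strips the CSV quoting that the query tool emits, which makes the rows easier to scan (at the cost of breaking any field that legitimately contains a comma or quote). A quick way to see what the two filters do on a sample line:

```shell
# A sample row in the quoted form the query tool emits
line='"{""type"":""Group""}","00G3h000001rhf6EAA","Sales_WW"'

# First collapse the doubled quotes, then drop the remaining quotes,
# mirroring the two sed filters in the pipeline above
echo "$line" | sed -e 's/""/"/g' | sed -e 's/"//g'
# -> {type:Group},00G3h000001rhf6EAA,Sales_WW
```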
Queue
$ bash ./sh/getObjectFields.sh QueueSobject
Id,Queue Sobject ID,id,18
QueueId,Group ID,reference,18
SobjectType,Sobject Type,picklist,40
CreatedById,Created By ID,reference,18
SystemModstamp,System Modstamp,datetime,0
$ sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q ./soql/queue.soql | sed -e 's/""/"/g' | sed -e 's/"//g'
Organization
$ bash ./sh/getObjectFields.sh Organization
Id,Organization ID,id,18
Name,Name,string,80
Division,Division,string,80
Street,Street,textarea,255
City,City,string,40
State,State/Province,string,80
PostalCode,Zip/Postal Code,string,20
Country,Country,string,80
Latitude,Latitude,double,0
Longitude,Longitude,double,0
GeocodeAccuracy,Geocode Accuracy,picklist,40
Address,Address,address,0
Phone,Phone,phone,40
Fax,Fax,phone,40
PrimaryContact,Primary Contact,string,80
DefaultLocaleSidKey,Locale,picklist,40
TimeZoneSidKey,Time Zone,picklist,40
LanguageLocaleKey,Language,picklist,40
ReceivesInfoEmails,Info Emails,boolean,0
ReceivesAdminInfoEmails,Info Emails Admin,boolean,0
PreferencesRequireOpportunityProducts,RequireOpportunityProducts,boolean,0
PreferencesTransactionSecurityPolicy,TransactionSecurityPolicy,boolean,0
PreferencesTerminateOldestSession,TerminateOldestSession,boolean,0
PreferencesConsentManagementEnabled,ConsentManagementEnabled,boolean,0
PreferencesAutoSelectIndividualOnMerge,AutoSelectIndividualOnMerge,boolean,0
PreferencesLightningLoginEnabled,LightningLoginEnabled,boolean,0
PreferencesOnlyLLPermUserAllowed,OnlyLLPermUserAllowed,boolean,0
FiscalYearStartMonth,Fiscal Year Starts In,int,0
UsesStartDateAsFiscalYearName,Fiscal Year Name by Start,boolean,0
DefaultAccountAccess,Default Account Access,picklist,40
DefaultContactAccess,Default Contact Access,picklist,40
DefaultOpportunityAccess,Default Opportunity Access,picklist,40
DefaultLeadAccess,Default Lead Access,picklist,40
DefaultCaseAccess,Default Case Access,picklist,40
DefaultCalendarAccess,Default Calendar Access,picklist,40
DefaultPricebookAccess,Default Price Book Access,picklist,40
DefaultCampaignAccess,Default Campaign Access,picklist,40
SystemModstamp,System Modstamp,datetime,0
ComplianceBccEmail,Compliance BCC Email,email,80
UiSkin,UI Skin,picklist,40
SignupCountryIsoCode,Signup Country,string,2
TrialExpirationDate,Trial Expiration Date,datetime,0
NumKnowledgeService,Knowledge Licenses,int,0
OrganizationType,Edition,picklist,40
NamespacePrefix,Namespace Prefix,string,15
InstanceName,Instance Name,string,5
IsSandbox,Is Sandbox,boolean,0
WebToCaseDefaultOrigin,Web to Cases Default Origin,string,40
MonthlyPageViewsUsed,Monthly Page Views Used,int,0
MonthlyPageViewsEntitlement,Monthly Page Views Allowed,int,0
IsReadOnly,Is Read Only,boolean,0
CreatedDate,Created Date,datetime,0
CreatedById,Created By ID,reference,18
LastModifiedDate,Last Modified Date,datetime,0
LastModifiedById,Last Modified By ID,reference,18
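The Default*Access picklist fields above are where the org-wide sharing defaults (OWDs) live. A small SOQL file to pull just those values could be written like this (the file name owd.soql is an assumption, not a file used elsewhere in this section):

```shell
# Write a SOQL file selecting the org-wide default (OWD) sharing
# fields from Organization; owd.soql is an assumed file name
cat > /tmp/owd.soql <<'EOF'
SELECT DefaultAccountAccess,
       DefaultContactAccess,
       DefaultOpportunityAccess,
       DefaultLeadAccess,
       DefaultCaseAccess,
       DefaultCalendarAccess
FROM Organization
EOF
cat /tmp/owd.soql
```

The file can then be run with the same `sfdx mohanc:data:query -u <username> -q /tmp/owd.soql` invocation used throughout this section.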
User
$ bash getObjectFields.sh User
Id,User ID,id,18
Username,Username,string,80
LastName,Last Name,string,80
FirstName,First Name,string,40
Name,Full Name,string,121
CompanyName,Company Name,string,80
Division,Division,string,80
Department,Department,string,80
Title,Title,string,80
Street,Street,textarea,255
City,City,string,40
State,State/Province,string,80
PostalCode,Zip/Postal Code,string,20
Country,Country,string,80
Latitude,Latitude,double,0
Longitude,Longitude,double,0
GeocodeAccuracy,Geocode Accuracy,picklist,40
Address,Address,address,0
Email,Email,email,128
EmailPreferencesAutoBcc,AutoBcc,boolean,0
EmailPreferencesAutoBccStayInTouch,AutoBccStayInTouch,boolean,0
EmailPreferencesStayInTouchReminder,StayInTouchReminder,boolean,0
SenderEmail,Email Sender Address,email,80
SenderName,Email Sender Name,string,80
Signature,Email Signature,textarea,1333
StayInTouchSubject,Stay-in-Touch Email Subject,string,80
StayInTouchSignature,Stay-in-Touch Email Signature,textarea,512
StayInTouchNote,Stay-in-Touch Email Note,string,512
Phone,Phone,phone,40
Fax,Fax,phone,40
MobilePhone,Mobile,phone,40
Alias,Alias,string,8
CommunityNickname,Nickname,string,40
BadgeText,User Photo badge text overlay,string,80
IsActive,Active,boolean,0
TimeZoneSidKey,Time Zone,picklist,40
UserRoleId,Role ID,reference,18
LocaleSidKey,Locale,picklist,40
ReceivesInfoEmails,Info Emails,boolean,0
ReceivesAdminInfoEmails,Admin Info Emails,boolean,0
EmailEncodingKey,Email Encoding,picklist,40
ProfileId,Profile ID,reference,18
UserType,User Type,picklist,40
LanguageLocaleKey,Language,picklist,40
EmployeeNumber,Employee Number,string,20
DelegatedApproverId,Delegated Approver ID,reference,18
ManagerId,Manager ID,reference,18
LastLoginDate,Last Login,datetime,0
LastPasswordChangeDate,Last Password Change or Reset,datetime,0
CreatedDate,Created Date,datetime,0
CreatedById,Created By ID,reference,18
LastModifiedDate,Last Modified Date,datetime,0
LastModifiedById,Last Modified By ID,reference,18
SystemModstamp,System Modstamp,datetime,0
NumberOfFailedLogins,Failed Login Attempts,int,0
OfflineTrialExpirationDate,Offline Edition Trial Expiration Date,datetime,0
OfflinePdaTrialExpirationDate,Sales Anywhere Trial Expiration Date,datetime,0
UserPermissionsMarketingUser,Marketing User,boolean,0
UserPermissionsOfflineUser,Offline User,boolean,0
UserPermissionsCallCenterAutoLogin,Auto-login To Call Center,boolean,0
UserPermissionsSFContentUser,Salesforce CRM Content User,boolean,0
UserPermissionsKnowledgeUser,Knowledge User,boolean,0
UserPermissionsInteractionUser,Flow User,boolean,0
UserPermissionsSupportUser,Service Cloud User,boolean,0
UserPermissionsJigsawProspectingUser,Data.com User,boolean,0
UserPermissionsSiteforceContributorUser,Site.com Contributor User,boolean,0
UserPermissionsSiteforcePublisherUser,Site.com Publisher User,boolean,0
UserPermissionsWorkDotComUserFeature,WDC User,boolean,0
ForecastEnabled,Allow Forecasting,boolean,0
UserPreferencesActivityRemindersPopup,ActivityRemindersPopup,boolean,0
UserPreferencesEventRemindersCheckboxDefault,EventRemindersCheckboxDefault,boolean,0
UserPreferencesTaskRemindersCheckboxDefault,TaskRemindersCheckboxDefault,boolean,0
UserPreferencesReminderSoundOff,ReminderSoundOff,boolean,0
UserPreferencesDisableAllFeedsEmail,DisableAllFeedsEmail,boolean,0
UserPreferencesDisableFollowersEmail,DisableFollowersEmail,boolean,0
UserPreferencesDisableProfilePostEmail,DisableProfilePostEmail,boolean,0
UserPreferencesDisableChangeCommentEmail,DisableChangeCommentEmail,boolean,0
UserPreferencesDisableLaterCommentEmail,DisableLaterCommentEmail,boolean,0
UserPreferencesDisProfPostCommentEmail,DisProfPostCommentEmail,boolean,0
UserPreferencesContentNoEmail,ContentNoEmail,boolean,0
UserPreferencesContentEmailAsAndWhen,ContentEmailAsAndWhen,boolean,0
UserPreferencesApexPagesDeveloperMode,ApexPagesDeveloperMode,boolean,0
UserPreferencesReceiveNoNotificationsAsApprover,ReceiveNoNotificationsAsApprover,boolean,0
UserPreferencesReceiveNotificationsAsDelegatedApprover,ReceiveNotificationsAsDelegatedApprover,boolean,0
UserPreferencesHideCSNGetChatterMobileTask,HideCSNGetChatterMobileTask,boolean,0
UserPreferencesDisableMentionsPostEmail,DisableMentionsPostEmail,boolean,0
UserPreferencesDisMentionsCommentEmail,DisMentionsCommentEmail,boolean,0
UserPreferencesHideCSNDesktopTask,HideCSNDesktopTask,boolean,0
UserPreferencesHideChatterOnboardingSplash,HideChatterOnboardingSplash,boolean,0
UserPreferencesHideSecondChatterOnboardingSplash,HideSecondChatterOnboardingSplash,boolean,0
UserPreferencesDisCommentAfterLikeEmail,DisCommentAfterLikeEmail,boolean,0
UserPreferencesDisableLikeEmail,DisableLikeEmail,boolean,0
UserPreferencesSortFeedByComment,SortFeedByComment,boolean,0
UserPreferencesDisableMessageEmail,DisableMessageEmail,boolean,0
UserPreferencesHideLegacyRetirementModal,HideLegacyRetirementModal,boolean,0
UserPreferencesJigsawListUser,JigsawListUser,boolean,0
UserPreferencesDisableBookmarkEmail,DisableBookmarkEmail,boolean,0
UserPreferencesDisableSharePostEmail,DisableSharePostEmail,boolean,0
UserPreferencesEnableAutoSubForFeeds,EnableAutoSubForFeeds,boolean,0
UserPreferencesDisableFileShareNotificationsForApi,DisableFileShareNotificationsForApi,boolean,0
UserPreferencesShowTitleToExternalUsers,ShowTitleToExternalUsers,boolean,0
UserPreferencesShowManagerToExternalUsers,ShowManagerToExternalUsers,boolean,0
UserPreferencesShowEmailToExternalUsers,ShowEmailToExternalUsers,boolean,0
UserPreferencesShowWorkPhoneToExternalUsers,ShowWorkPhoneToExternalUsers,boolean,0
UserPreferencesShowMobilePhoneToExternalUsers,ShowMobilePhoneToExternalUsers,boolean,0
UserPreferencesShowFaxToExternalUsers,ShowFaxToExternalUsers,boolean,0
UserPreferencesShowStreetAddressToExternalUsers,ShowStreetAddressToExternalUsers,boolean,0
UserPreferencesShowCityToExternalUsers,ShowCityToExternalUsers,boolean,0
UserPreferencesShowStateToExternalUsers,ShowStateToExternalUsers,boolean,0
UserPreferencesShowPostalCodeToExternalUsers,ShowPostalCodeToExternalUsers,boolean,0
UserPreferencesShowCountryToExternalUsers,ShowCountryToExternalUsers,boolean,0
UserPreferencesShowProfilePicToGuestUsers,ShowProfilePicToGuestUsers,boolean,0
UserPreferencesShowTitleToGuestUsers,ShowTitleToGuestUsers,boolean,0
UserPreferencesShowCityToGuestUsers,ShowCityToGuestUsers,boolean,0
UserPreferencesShowStateToGuestUsers,ShowStateToGuestUsers,boolean,0
UserPreferencesShowPostalCodeToGuestUsers,ShowPostalCodeToGuestUsers,boolean,0
UserPreferencesShowCountryToGuestUsers,ShowCountryToGuestUsers,boolean,0
UserPreferencesDisableFeedbackEmail,DisableFeedbackEmail,boolean,0
UserPreferencesDisableWorkEmail,DisableWorkEmail,boolean,0
UserPreferencesPipelineViewHideHelpPopover,PipelineViewHideHelpPopover,boolean,0
UserPreferencesHideS1BrowserUI,HideS1BrowserUI,boolean,0
UserPreferencesDisableEndorsementEmail,DisableEndorsementEmail,boolean,0
UserPreferencesPathAssistantCollapsed,PathAssistantCollapsed,boolean,0
UserPreferencesCacheDiagnostics,CacheDiagnostics,boolean,0
UserPreferencesShowEmailToGuestUsers,ShowEmailToGuestUsers,boolean,0
UserPreferencesShowManagerToGuestUsers,ShowManagerToGuestUsers,boolean,0
UserPreferencesShowWorkPhoneToGuestUsers,ShowWorkPhoneToGuestUsers,boolean,0
UserPreferencesShowMobilePhoneToGuestUsers,ShowMobilePhoneToGuestUsers,boolean,0
UserPreferencesShowFaxToGuestUsers,ShowFaxToGuestUsers,boolean,0
UserPreferencesShowStreetAddressToGuestUsers,ShowStreetAddressToGuestUsers,boolean,0
UserPreferencesLightningExperiencePreferred,LightningExperiencePreferred,boolean,0
UserPreferencesPreviewLightning,PreviewLightning,boolean,0
UserPreferencesHideEndUserOnboardingAssistantModal,HideEndUserOnboardingAssistantModal,boolean,0
UserPreferencesHideLightningMigrationModal,HideLightningMigrationModal,boolean,0
UserPreferencesHideSfxWelcomeMat,HideSfxWelcomeMat,boolean,0
UserPreferencesHideBiggerPhotoCallout,HideBiggerPhotoCallout,boolean,0
UserPreferencesGlobalNavBarWTShown,GlobalNavBarWTShown,boolean,0
UserPreferencesGlobalNavGridMenuWTShown,GlobalNavGridMenuWTShown,boolean,0
UserPreferencesCreateLEXAppsWTShown,CreateLEXAppsWTShown,boolean,0
UserPreferencesFavoritesWTShown,FavoritesWTShown,boolean,0
UserPreferencesRecordHomeSectionCollapseWTShown,RecordHomeSectionCollapseWTShown,boolean,0
UserPreferencesRecordHomeReservedWTShown,RecordHomeReservedWTShown,boolean,0
UserPreferencesFavoritesShowTopFavorites,FavoritesShowTopFavorites,boolean,0
UserPreferencesExcludeMailAppAttachments,ExcludeMailAppAttachments,boolean,0
UserPreferencesSuppressTaskSFXReminders,SuppressTaskSFXReminders,boolean,0
UserPreferencesSuppressEventSFXReminders,SuppressEventSFXReminders,boolean,0
UserPreferencesPreviewCustomTheme,PreviewCustomTheme,boolean,0
UserPreferencesHasCelebrationBadge,HasCelebrationBadge,boolean,0
UserPreferencesUserDebugModePref,UserDebugModePref,boolean,0
UserPreferencesSRHOverrideActivities,SRHOverrideActivities,boolean,0
UserPreferencesNewLightningReportRunPageEnabled,NewLightningReportRunPageEnabled,boolean,0
UserPreferencesNativeEmailClient,NativeEmailClient,boolean,0
ContactId,Contact ID,reference,18
AccountId,Account ID,reference,18
CallCenterId,Call Center ID,reference,18
Extension,Extension,phone,40
PortalRole,Portal Role Level,picklist,40
IsPortalEnabled,Is Portal Enabled,boolean,0
FederationIdentifier,SAML Federation ID,string,512
AboutMe,About Me,textarea,1000
FullPhotoUrl,Url for full-sized Photo,url,1024
SmallPhotoUrl,Photo,url,1024
IsExtIndicatorVisible,Show external indicator,boolean,0
OutOfOfficeMessage,Out of office message,string,40
MediumPhotoUrl,Url for medium profile photo,url,1024
DigestFrequency,Chatter Email Highlights Frequency,picklist,40
DefaultGroupNotificationFrequency,Default Notification Frequency when Joining Groups,picklist,40
JigsawImportLimitOverride,Data.com Monthly Addition Limit,int,0
LastViewedDate,Last Viewed Date,datetime,0
LastReferencedDate,Last Referenced Date,datetime,0
BannerPhotoUrl,Url for banner photo,url,1024
SmallBannerPhotoUrl,Url for IOS banner photo,url,1024
MediumBannerPhotoUrl,Url for Android banner photo,url,1024
IsProfilePhotoActive,Has Profile Photo,boolean,0
IndividualId,Individual ID,reference,18
$ sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q ~/bulkapi/dx/soql/user.soql
"attributes","Id","Name"
"{""type"":""User"",""url"":""/services/data/v49.0/sobjects/User/0053h000003de6cAAA""}","0053h000003de6cAAA","Automated Process"
"{""type"":""User"",""url"":""/services/data/v49.0/sobjects/User/0053h000002xqUhAAI""}","0053h000002xqUhAAI","selfservice Site Guest User"
"{""type"":""User"",""url"":""/services/data/v49.0/sobjects/User/0053h000002xQ5sAAE""}","0053h000002xQ5sAAE","Mohan Chinnappan"
...
Object Permissions
$ bash getObjectFields.sh ObjectPermissions
Id,ObjectPermissions ID,id,18
ParentId,Parent ID,reference,18
SobjectType,SObject Type Name,picklist,40
PermissionsCreate,Create Records,boolean,0
PermissionsRead,Read Records,boolean,0
PermissionsEdit,Edit Records,boolean,0
PermissionsDelete,Delete Records,boolean,0
PermissionsViewAllRecords,Read All Records,boolean,0
PermissionsModifyAllRecords,Edit All Records,boolean,0
CreatedDate,Created Date,datetime,0
CreatedById,Created By ID,reference,18
LastModifiedDate,Last Modified Date,datetime,0
LastModifiedById,Last Modified By ID,reference,18
SystemModstamp,System Modstamp,datetime,0
$ bash soql/ObjectPermission.sh 0053h000002xQ5sAAE Account > soql/op_account.soql
$ cat soql/op_account.soql
SELECT Id,
SObjectType,
PermissionsCreate,
PermissionsRead,
PermissionsEdit,
PermissionsDelete,
PermissionsViewAllRecords,
PermissionsModifyAllRecords,
Parent.label,
Parent.IsOwnedByProfile
FROM ObjectPermissions
WHERE (ParentId IN (
SELECT PermissionSetId
FROM PermissionSetAssignment
WHERE Assignee.id = '0053h000002xQ5sAAE')
)
AND
(SobjectType = 'Account')
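The ObjectPermission.sh helper that generated this query is not listed here. A minimal sketch of it, written as a shell function that takes the user Id and the sObject type and prints the SOQL above (the function name and layout are assumptions), could look like:

```shell
# Sketch of an ObjectPermission.sh-style generator: prints an
# ObjectPermissions SOQL query for a given user Id and sObject type
object_permission_soql () {
  local userId="$1" sobject="$2"
  cat <<EOF
SELECT Id,
       SObjectType,
       PermissionsCreate,
       PermissionsRead,
       PermissionsEdit,
       PermissionsDelete,
       PermissionsViewAllRecords,
       PermissionsModifyAllRecords,
       Parent.Label,
       Parent.IsOwnedByProfile
FROM ObjectPermissions
WHERE (ParentId IN (
        SELECT PermissionSetId
        FROM PermissionSetAssignment
        WHERE Assignee.Id = '${userId}'))
  AND (SobjectType = '${sobject}')
EOF
}

# Example: regenerate the query shown above for one user and Account
object_permission_soql 0053h000002xQ5sAAE Account
```

Parameterizing the user Id and sObject type this way keeps one template reusable for any user/object pair, which is why the book pipes the generated file into `sfdx mohanc:data:query` rather than hand-editing the SOQL.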
$ sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q soql/op_account.soql | sed -e 's/""/"/g' | sed -e 's/"//g'
attributes,Id,SobjectType,PermissionsCreate,PermissionsRead,PermissionsEdit,PermissionsDelete,PermissionsViewAllRecords,PermissionsModifyAllRecords,Parent
{type:ObjectPermissions,url:/services/data/v49.0/sobjects/ObjectPermissions/1103h00001wg5TMAAY},1103h00001wg5TMAAY,Account,true,true,true,true,true,true,{attributes:{type:PermissionSet,url:/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0},Label:00ex00000018ozh_128_09_04_12_1,IsOwnedByProfile:true}
{type:ObjectPermissions,url:/services/data/v49.0/sobjects/ObjectPermissions/00000002PwNZ001AID},00000002PwNZ001AID,Account,true,true,true,true,true,true,{attributes:{type:PermissionSet,url:/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0},Label:00ex00000018ozh_128_09_04_12_1,IsOwnedByProfile:true}
# column formatting
$ sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q soql/op_account.soql | sed -e 's/""/"/g' | sed -e 's/"//g' | column -t -s, | less -S
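`column -t -s,` aligns the comma-separated output into fixed-width columns, and `less -S` lets you scroll the wide rows horizontally instead of wrapping them. A small standalone example of the column step:

```shell
# Align comma-separated rows into readable fixed-width columns
printf '%s\n' \
  'Id,SobjectType,PermissionsCreate' \
  '1103h00001wg5TMAAY,Account,true' | column -t -s,
```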
FieldPermissions
$ bash soql/FieldPermission.sh 0053h000002xQ5sAAE Account > soql/fp_account.soql
$ cat soql/fp_account.soql
SELECT Id,
SObjectType,
Field,
PermissionsRead,
PermissionsEdit,
Parent.label,
Parent.IsOwnedByProfile
FROM FieldPermissions
WHERE (ParentId IN
(SELECT PermissionSetId
FROM PermissionSetAssignment
WHERE Assignee.id = '0053h000002xQ5sAAE')
)
AND
(SobjectType = 'Account')
$ sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q soql/fp_account.soql | sed -e 's/""/"/g'
"attributes","Id","SobjectType","Field","PermissionsRead","PermissionsEdit","Parent"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu8132AQA"}","01k3h0000Iu8132AQA","Account","Account.YearStarted",true,false,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu8131AQA"}","01k3h0000Iu8131AQA","Account","Account.Website",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu8130AQA"}","01k3h0000Iu8130AQA","Account","Account.Type",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812zAQA"}","01k3h0000Iu812zAQA","Account","Account.Tradestyle",true,false,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812yAQA"}","01k3h0000Iu812yAQA","Account","Account.TickerSymbol",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812xAQA"}","01k3h0000Iu812xAQA","Account","Account.Site",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812wAQA"}","01k3h0000Iu812wAQA","Account","Account.SicDesc",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812vAQA"}","01k3h0000Iu812vAQA","Account","Account.Sic",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812uAQA"}","01k3h0000Iu812uAQA","Account","Account.ShippingAddress",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812tAQA"}","01k3h0000Iu812tAQA","Account","Account.Rating",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812sAQA"}","01k3h0000Iu812sAQA","Account","Account.Phone",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812rAQA"}","01k3h0000Iu812rAQA","Account","Account.Parent",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812qAQA"}","01k3h0000Iu812qAQA","Account","Account.Ownership",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812pAQA"}","01k3h0000Iu812pAQA","Account","Account.NumberOfEmployees",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812oAQA"}","01k3h0000Iu812oAQA","Account","Account.NaicsDesc",true,false,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812nAQA"}","01k3h0000Iu812nAQA","Account","Account.NaicsCode",true,false,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812mAQA"}","01k3h0000Iu812mAQA","Account","Account.Jigsaw",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812lAQA"}","01k3h0000Iu812lAQA","Account","Account.Industry",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812kAQA"}","01k3h0000Iu812kAQA","Account","Account.Fax",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812jAQA"}","01k3h0000Iu812jAQA","Account","Account.DunsNumber",true,false,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812iAQA"}","01k3h0000Iu812iAQA","Account","Account.Description",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812fAQA"}","01k3h0000Iu812fAQA","Account","Account.DandbCompany",true,false,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812eAQA"}","01k3h0000Iu812eAQA","Account","Account.CleanStatus",true,false,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812dAQA"}","01k3h0000Iu812dAQA","Account","Account.BillingAddress",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812cAQA"}","01k3h0000Iu812cAQA","Account","Account.AnnualRevenue",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812bAQA"}","01k3h0000Iu812bAQA","Account","Account.AccountSource",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812aAQA"}","01k3h0000Iu812aAQA","Account","Account.AccountNumber",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812SAQQ"}","01k3h0000Iu812SAQQ","Account","Account.Segment__c",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812ZAQQ"}","01k3h0000Iu812ZAQQ","Account","Account.UpsellOpportunity__c",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812YAQQ"}","01k3h0000Iu812YAQQ","Account","Account.SLASerialNumber__c",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812XAQQ"}","01k3h0000Iu812XAQQ","Account","Account.SLAExpirationDate__c",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812WAQQ"}","01k3h0000Iu812WAQQ","Account","Account.SLA__c",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812VAQQ"}","01k3h0000Iu812VAQQ","Account","Account.NumberofLocations__c",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812UAQQ"}","01k3h0000Iu812UAQQ","Account","Account.CustomerPriority__c",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
"{"type":"FieldPermissions","url":"/services/data/v49.0/sobjects/FieldPermissions/01k3h0000Iu812TAQQ"}","01k3h0000Iu812TAQQ","Account","Account.Active__c",true,true,"{"attributes":{"type":"PermissionSet","url":"/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwNZGA0"},"Label":"00ex00000018ozh_128_09_04_12_1","IsOwnedByProfile":true}"
Detailed User query
$ bash soql/userQuery.sh 0053h000002xQ5sAAE > soql/userQuery.soql
$ cat soql/userQuery.soql
SELECT Id,
IsActive,
ProfileId,
Profile.Name,
Profile.UserLicense.Name,
UserRoleId,
UserRole.DeveloperName,
Username,
Email,
Name,
LastLoginDate,
CreatedBy.Name,
CreatedDate,
LastModifiedBy.Name,
LastModifiedDate,
(SELECT PermissionSet.Name
FROM PermissionSetAssignments
WHERE PermissionSet.IsOwnedByProfile = FALSE)
FROM User WHERE Id = '0053h000002xQ5sAAE'
LIMIT 1
$ sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q soql/userQuery.soql | sed -e 's/""/"/g' | sed -e 's/"//g'
attributes,Id,IsActive,ProfileId,Profile,UserRoleId,UserRole,Username,Email,Name,LastLoginDate,CreatedBy,CreatedDate,LastModifiedBy,LastModifiedDate,PermissionSetAssignments
{type:User,url:/services/data/v49.0/sobjects/User/0053h000002xQ5sAAE},0053h000002xQ5sAAE,true,00e3h000001kqJMAAY,{attributes:{type:Profile,url:/services/data/v49.0/sobjects/Profile/00e3h000001kqJMAAY},Name:System Administrator,UserLicense:{attributes:{type:UserLicense,url:/services/data/v49.0/sobjects/UserLicense/1003h000001GeewAAC},Name:Salesforce}},00E3h000001JaqUEAS,{attributes:{type:UserRole,url:/services/data/v49.0/sobjects/UserRole/00E3h000001JaqUEAS},DeveloperName:CEO},mohan.chinnappan.n_ea2@gmail.com,mohan.chinnappan.n@gmail.com,Mohan Chinnappan,2020-08-27T10:25:03.000+0000,{attributes:{type:User,url:/services/data/v49.0/sobjects/User/0053h000002xQ5sAAE},Name:Mohan Chinnappan},2020-07-13T19:30:57.000+0000,{attributes:{type:User,url:/services/data/v49.0/sobjects/User/0053h000002xQ5sAAE},Name:Mohan Chinnappan},2020-08-23T22:09:49.000+0000,{totalSize:4,done:true,records:[{attributes:{type:PermissionSetAssignment,url:/services/data/v49.0/sobjects/PermissionSetAssignment/0Pa3h000002cLFBCA2},PermissionSet:{attributes:{type:PermissionSet,url:/services/data/v49.0/sobjects/PermissionSet/0PS3h000001i9aKGAQ},Name:EinsteinAnalyticsPlusAdmin}},{attributes:{type:PermissionSetAssignment,url:/services/data/v49.0/sobjects/PermissionSetAssignment/0Pa3h000001BwoBCAS},PermissionSet:{attributes:{type:PermissionSet,url:/services/data/v49.0/sobjects/PermissionSet/0PS3h000001iHPpGAM},Name:EA_Plus}},{attributes:{type:PermissionSetAssignment,url:/services/data/v49.0/sobjects/PermissionSetAssignment/0Pa3h000002cLF1CAM},PermissionSet:{attributes:{type:PermissionSet,url:/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwO7GAK},Name:Wave_Analytics_Trailhead_Admin}},{attributes:{type:PermissionSetAssignment,url:/services/data/v49.0/sobjects/PermissionSetAssignment/0Pa3h000002cLF3CAM},PermissionSet:{attributes:{type:PermissionSet,url:/services/data/v49.0/sobjects/PermissionSet/0PS3h000002PwO9GAK},Name:EventMonitoringWaveAdmin}}]}
Group
User ERD Group ERD GroupMember ERD
$ cat ~/tmp/group.soql
SELECT
Id,
DeveloperName,
RelatedId,
Type,
Email,
DoesSendEmailToMembers,
DoesIncludeBosses
FROM Group
$ sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q ~/tmp/group.soql -f json

[
{
"attributes": {
"type": "Group",
"url": "/services/data/v51.0/sobjects/Group/00G3h000001rhf6EAA"
},
"Id": "00G3h000001rhf6EAA",
"DeveloperName": "Sales_WW",
"RelatedId": "00E3h000001JaqREAS",
"Type": "Role",
"Email": null,
"DoesSendEmailToMembers": false,
"DoesIncludeBosses": true
},
...
]
GroupMember
$ cat ~/tmp/groupMember.soql
SELECT
Id,
GroupId,
UserOrGroupId
FROM GroupMember
$ sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q ~/tmp/groupMember.soql -f json
[
{
"attributes": {
"type": "GroupMember",
"url": "/services/data/v51.0/sobjects/GroupMember/0113h0000002627AAA"
},
"Id": "0113h0000002627AAA",
"GroupId": "00G3h000000OtiAEAS",
"UserOrGroupId": "00G3h000001rhf7EAA"
}
]
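A GroupMember's UserOrGroupId can point at a user or at another group (nested public groups), as in the sample above where the member is itself a group. A small sketch, assuming the usual key-prefix convention (005 for User, 00G for Group), that recursively expands a group into its user Ids; the sample Ids are made up:

```python
def expand_group(group_id, members_by_group, seen=None):
    """Recursively expand a Group's membership into a set of user Ids.
    UserOrGroupId values starting with '00G' are nested groups; others
    (e.g. '005' prefixes) are treated as users."""
    if seen is None:
        seen = set()
    users = set()
    if group_id in seen:          # guard against membership cycles
        return users
    seen.add(group_id)
    for member_id in members_by_group.get(group_id, []):
        if member_id.startswith("00G"):
            users |= expand_group(member_id, members_by_group, seen)
        else:
            users.add(member_id)
    return users

# index the GroupMember rows as GroupId -> [UserOrGroupId, ...]
members_by_group = {
    "00G000000000AAA": ["005000000000AAA", "00G000000000BBB"],
    "00G000000000BBB": ["005000000000BBB"],
}
users = expand_group("00G000000000AAA", members_by_group)
```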
Object Permissions
$ cat ~/tmp/obj-perm.soql
SELECT Id, ParentId, SObjectType,PermissionsCreate,PermissionsRead,PermissionsEdit,PermissionsDelete,PermissionsViewAllRecords,PermissionsModifyAllRecords,Parent.label, Parent.IsOwnedByProfile FROM ObjectPermissions
$ sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q ~/tmp/obj-perm.soql -f json
[
...
{
"attributes": {
"type": "ObjectPermissions",
"url": "/services/data/v51.0/sobjects/ObjectPermissions/00000001i9aL801AAC"
},
"Id": "00000001i9aL801AAC",
"ParentId": "0PS3h000001i9aLGAQ",
"SobjectType": "Order",
"PermissionsCreate": true,
"PermissionsRead": true,
"PermissionsEdit": true,
"PermissionsDelete": true,
"PermissionsViewAllRecords": true,
"PermissionsModifyAllRecords": true,
"Parent": {
"attributes": {
"type": "PermissionSet",
"url": "/services/data/v51.0/sobjects/PermissionSet/0PS3h000001i9aLGAQ"
},
"Label": "Salesforce CMS Integration Admin",
"IsOwnedByProfile": false
}
},
...
]
Field Permissions
$ cat ~/tmp/field-perm.soql
SELECT id,SobjectType, Field, PermissionsRead, PermissionsEdit FROM FieldPermissions
$ sfdx mohanc:data:query -u mohan.chinnappan.n_ea2@gmail.com -q ~/tmp/field-perm.soql -f json
[
...
{
"attributes": {
"type": "FieldPermissions",
"url": "/services/data/v51.0/sobjects/FieldPermissions/01k3h0000Iu7ytcAQA"
},
"Id": "01k3h0000Iu7ytcAQA",
"SobjectType": "Event",
"Field": "Event.Description",
"PermissionsRead": true,
"PermissionsEdit": true
},
...
]
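Rows like these can be pivoted client-side into a simple read/edit matrix per field, which is often easier to review than the raw JSON. A minimal sketch (helper name and sample rows are illustrative):

```python
from collections import defaultdict

def field_permission_matrix(rows):
    """Pivot FieldPermissions query rows into {field: {"read": bool, "edit": bool}},
    OR-ing access together when a field appears in multiple rows."""
    matrix = defaultdict(lambda: {"read": False, "edit": False})
    for r in rows:
        entry = matrix[r["Field"]]
        entry["read"] = entry["read"] or r["PermissionsRead"]
        entry["edit"] = entry["edit"] or r["PermissionsEdit"]
    return dict(matrix)

# sample rows shaped like the query output above
rows = [
    {"Field": "Event.Description", "PermissionsRead": True, "PermissionsEdit": True},
    {"Field": "Event.Location", "PermissionsRead": True, "PermissionsEdit": False},
]
m = field_permission_matrix(rows)
```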
Sharing Recalculation
By default, sharing rules and group membership are recalculated after certain updates to related features, such as sharing settings, roles, territories, groups, and users.
TYPE OF SHARING | CALCULATION OCCURS WHEN YOU |
---|---|
Sharing Rule | Change an organization's default sharing model |
 | Create, edit, or delete sharing rules |
 | Create or transfer any records |
 | Update public group members |
 | Create or activate a user |
 | Change users' roles or update the role hierarchy |
 | Add or remove users from territories |
 | Reparent territories |
 | Make changes to roles, territories, or public groups involved in sharing rules |
Group Membership | Change or reparent roles |
 | Add or remove users from territories |
 | Update public group members |
 | Update portal account ownership if the new owner has a different role |
About defer sharing recalculation
- Defer Sharing Rule Calculations
- If you’re making a large number of changes that affect sharing rules, including changes to sharing rules, roles, territories, or public groups, you can temporarily suspend automatic sharing rule calculations and resume them after you finish your updates. You must resume calculations and do a full sharing rule recalculation, or you can experience sharing inconsistencies in your records.
Can an admin defer sharing recalculation?
- Yes
- The defer sharing calculation feature isn't enabled by default. To enable it for your organization, contact Salesforce Customer Support.
What permissions are needed for this?
To defer (suspend and resume) and recalculate sharing rules:
Manage Users AND Manage Sharing Calculation Deferral
Note: Salesforce Customer Support will enable this permission for the admin via a support case
References
How to visualize Role Hierarchy in Salesforce org
Plugin
- Requires 0.0.171 version of the plugin
- sfdx-mohanc-plugins@0.0.171
- How to install the plugin
Usage
sfdx mohanc:security:rh -h
Render Role Hierarchy
USAGE
$ sfdx mohanc:security:rh
OPTIONS
-u, --targetusername=targetusername username or alias for the target org; overrides default target org
--apiversion=apiversion override the api version used for api requests made by this command
--json format output as json
--loglevel=(trace|debug|info|warn|error|fatal) logging level for this command invocation
EXAMPLE
** Render Role Hierarchy **
Draw Role Hierarchy
sfdx mohanc:security:rh -u <username>
Demo
sfdx mohanc:security:rh -u mohan.chinnappan.n_ea2@gmail.com
Writing visualization in file __RH__.svg ...
Visualization done. "open __RH__.svg" in Chrome Browser to view the Visualization.
Shield
Topics
Best Practices
- Take the time to identify the most likely threats to your org.
- Distinguish data that needs encryption from data that doesn't, so that you can encrypt only what you need to. Not all data is sensitive.
- Focus on information that requires encryption to meet your regulatory, security, compliance, and privacy requirements. Unnecessarily encrypting data impacts functionality and performance.
- Balance business-critical functionality against security and risk measures, and challenge your assumptions periodically.
- Back up your tenant secrets and keys, and be careful who you allow to manage them.
  - If your tenant secrets are destroyed, reimport them to access your data. You are solely responsible for making sure that your data and tenant secrets are backed up and stored in a safe place. Salesforce cannot help you with deleted, destroyed, or misplaced tenant secrets.
- To identify the threats that are most likely to affect your organization:
  - Walk through a formal threat modeling exercise.
  - Use your findings to create a data classification scheme, which can help you decide what data to encrypt.
- Test Shield Platform Encryption in a sandbox environment before deploying to a production environment. Encryption policy settings can be deployed using change sets.
- Before enabling encryption, fix any violations that you uncover.
  - For example, if you reference encrypted fields in a SOQL ORDER BY clause, a violation occurs. Fix the violation by removing references to the encrypted fields.
- If you use an app from the AppExchange, test how it interacts with encrypted data in your organization and evaluate whether its functionality is affected.
  - If you suspect Shield Platform Encryption could affect the functionality of an app, ask the provider for help with evaluation.
- Grant the Manage Encryption Keys user permission to authorized users only.
  - Users with the Manage Encryption Keys permission can generate, export, import, and destroy organization-specific keys. Monitor the key management activities of these users regularly with the setup audit trail.
- Synchronize your existing data with your active key material.
  - Existing field and file data is not automatically encrypted when you turn on Shield Platform Encryption.
  - To encrypt existing field data, update the records associated with the field data. This action triggers encryption for these records so that your existing data is encrypted at rest.
  - To encrypt existing files or get help updating other encrypted data, contact Salesforce. Salesforce can encrypt existing file data in the background to keep it aligned with the latest encryption policy and key material.
  - When you contact Salesforce support to request the background encryption service, allow at least a week before you need the background encryption completed. The time to complete the process varies based on the volume of data involved; it can take several days.
- Currency and Number fields can't be encrypted because encrypting them could have broad functional consequences across the platform, such as disruptions to roll-up summary reports, report timeframes, and calculations.
  - You can often keep private, sensitive, or regulated data of this variety safe in other encryption-supported field types.
- Before you enable Shield Platform Encryption in a production environment, communicate to your users how it affects your business solution.
- When you generate a new tenant secret, any new data is encrypted using this key. However, existing sensitive data remains encrypted using previous keys. In this situation, Salesforce strongly recommends re-encrypting these fields using the latest key. Contact Salesforce for help with re-encrypting your data.
- Use discretion when granting "login as" access to users or Salesforce Customer Support.
  - If you grant login access to a user who has field-level security access to an encrypted field, that user can view the data in that field in plaintext.
Key Management and Rotation
- Shield Platform Encryption lets you control and rotate the key material used to encrypt your data
- You can use Salesforce to generate a tenant secret for you
- This is then combined with a per-release master secret to derive a data encryption key.
tenant_secret + per_release_master_secret => data_encryption_key
- This derived data encryption key is used in encrypt and decrypt functions
- You can also use the Bring Your Own Key (BYOK) service to upload your own key material
- Or store key material outside of Salesforce and have the Cache-Only Key Service fetch your key material on demand.
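The derivation above can be illustrated with a generic key-derivation function. This is only a conceptual sketch: Salesforce's actual Shield key derivation runs on its key derivation servers, and the specific KDF and parameters below are illustrative, not Salesforce's implementation.

```python
import hashlib

def derive_data_encryption_key(tenant_secret: bytes, master_secret: bytes) -> bytes:
    """Conceptual sketch of tenant_secret + per_release_master_secret =>
    data_encryption_key, using PBKDF2-HMAC-SHA256 to derive a 256-bit key.
    (Illustrative only; not Salesforce's actual algorithm.)"""
    return hashlib.pbkdf2_hmac(
        "sha256", tenant_secret, master_secret, 100_000, dklen=32
    )

# the same inputs always derive the same key; rotating the tenant secret
# (or a new release's master secret) yields a different key
key = derive_data_encryption_key(b"tenant-secret", b"release-master-secret")
```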
Shield Setup
FAQ
1. How do I know a field in an object is encrypted at rest?
The Metadata API describe call reports the encrypted flag as true for encrypted fields, as shown below:
Account object encrypted fields:
Describe on Account Object showing encrypted flag for the encrypted fields:
Describe on Account Object:
Options for Verifying Encryption:
- At the request of an organization's admin, Tier 3 security agents can run a backend process to provide confirmation details of encrypted fields based on server logs, which can be shared with users. Please allow two business days for these requests to process due to their complexity.
- Organization admins can follow the steps below to use tenant key masking to verify field encryption at the record level:
  a. Export the current tenant secret used to encrypt the current data and store it in a safe place.
  b. Generate a new tenant secret.
  c. Destroy the initial key used to encrypt data.
  d. With the new key active, navigate to any record with encrypted fields; the data will show ?????, meaning the data is encrypted and masked.
Note: Once the admin or customer has reviewed the data masked with ?????, the old secret used to encrypt the data can be imported back into the org and the new key destroyed, as it was not used to encrypt any data.
How about Attachments encryption?
If an admin would like to verify encryption on files and attachments, the query below can be run in the Developer Console:
SELECT ContentType, Id, IsEncrypted FROM Attachment WHERE IsEncrypted = true
The query runs against attachments and shows all content where IsEncrypted is true; change the filter to false to list unencrypted attachments. The same check can be done for files by querying the files object (and removing ContentType) to determine whether they are encrypted based on the IsEncrypted flag.
For additional information related to platform encryption masking please review our Salesforce article entitled:
What Does My Encrypted Data Look Like?: https://help.salesforce.com/articleView?id=security_pe_masking.htm&type=0
Knowledge Article: 000247422 says: View Encrypted Data Permission Not Needed with Shield Platform Encryption Beginning Spring ‘17
Can you explain this with an example?
![KB-FLS](https://github.com/mohan-chinnappan-n/Shield-setup/raw/master/FAQ/img/account-describe-metadata.png) Reference to this Knowledge Article
Let us take an example: in our org, we have a user, Joe Simple.
Joe can see the encrypted field Account.Fax, but Joe can't see the encrypted field Account.Phone, as per the FLS for his profile:
Account.Fax:
Account.Phone:
If Joe uses the REST API, for example, to access the Account object, Joe will be denied access to the field Account.Phone as shown below, while Joe can access Account.Fax.
But another user, whose FLS allows read on both Account.Fax and Account.Phone, can access these two fields:
- Do I have to back up tenant secrets?
YES!!!
Tenant secrets are not like passwords.
Unlike passwords, you can’t reset a tenant secret. Salesforce can’t help with deleted, destroyed, or misplaced tenant secrets. Always back up tenant secrets!
- Is encrypted field data encrypted at rest?
YES!
You should use field-level access controls to limit who can access this sensitive data as shown above in question number: 2
What are the guidelines for selecting fields to encrypt?
Unnecessarily encrypting data can slow down performance and affect users' day-to-day activities. Based on your regulatory requirements, define the kinds of customer data that require extra security and apply Shield Platform Encryption only to those areas.
- About automatic encryption?
Field values are automatically encrypted only in records created or updated after you’ve enabled encryption. Salesforce recommends updating existing records to ensure that their field values are encrypted. For example, if you encrypt the Description field on the Case object, use the Data Loader to update all case records. Contact Salesforce if you need help with this.
Can you provide technical details about Platform Encryption?
Here is the video answering this question.
- About Sandbox Refresh: How Does Shield Platform Encryption Work In a Sandbox?
Refreshing a sandbox from a production organization creates an exact copy of the production organization. If Shield Platform Encryption is enabled on the production organization, all encryption settings are copied, including tenant secrets created in production.
Once a sandbox is refreshed, tenant secret changes are confined to your current organization. This means that when you rotate or destroy a tenant secret on sandbox, it doesn’t affect the production organization.
As a best practice, rotate tenant secrets on sandboxes after a refresh. Rotation ensures that production and sandbox use different tenant secrets. Destroying tenant secrets on a sandbox renders encrypted data unusable in cases of partial or full copies.
How Does Shield Platform Encryption Work in a Sandbox?
Resources for curious minds:
Rotate Your Encryption Tenant Secrets - based on your organization’s security policies
What’s the Difference Between Classic Encryption and Shield Platform Encryption?
How Shield Platform Encryption Works
Tradeoffs and Limitations of Shield Platform Encryption
-
View Encrypted Data Permission Not Needed with Shield Platform Encryption Beginning Spring ‘17
-
Tighten Your Security with Salesforce Shield Platform Encryption
References
Profile, PermissionSet
Topics
- Retrieving Profile from an Org
- Retrieving Permission from an Org
- Comparing a profile in 2 orgs
- Incremental/delta deployment of profile entries
- Comparing a permissionset/profile between 2 git branches
- Viewing Profile components
- Micro-delta on Profile and PermissionSet
- Queries
- EOL for permissions on profiles by Spring 26
- Visualize Profile, PermissionSet
How to retrieve Profile from an Org and verify its correctness?
Demo Video
- Explains:
  - How to retrieve a Profile from a Salesforce org, check its validity, commit it back into version control, and deploy it into another Salesforce org
sfdx mohanc:mdapi:helper:retrieve -m Profile -u mohan.chinnappan.n.sel@gmail.com -n "Admin"
==== Writing to ./43/unpackaged/profiles/Admin.profile-meta.xml... ====
==== Writing package.xml into ./43/unpackaged/package.xml... ====
==== Executing zip command: zip -r 43.zip unpackaged ====
==== RUN: cd 43; sfdx force:mdapi:deploy -f 43.zip -u mohan.chinnappan.n.sel@gmail.com -w 1000 --verbose -c ====
~/test2 >vi 43/unpackaged/profiles/Admin.profile-meta.xml
~/test2 >cd 43; sfdx force:mdapi:deploy -f 43.zip -u mohan.chinnappan.n.sel@gmail.com -w 1000 --verbose -c
Deploying metadata to mohan.chinnappan.n.sel@gmail.com using the v56.0 SOAP API
Deploy ID: 0Af4x00000YA7XwCAL
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
Type File Name Id
─────── ────────────────────────────────────────── ─────────── ──────────────────
unpackaged/package.xml package.xml
Profile unpackaged/profiles/Admin.profile-meta.xml Admin 00e4x000002b8IgAAI
~/test2/43 >cd ..
~/test2 >!v
vi 43/unpackaged/profiles/Admin.profile-meta.xml
- Note: you can pass a comma-separated list of Profile names to this command, like
sfdx mohanc:mdapi:helper:retrieve -m Profile -u mohan.chinnappan.n.sel@gmail.com -n "Profile1, Profile2"
Using sfdx-plugin-source-read
# install
sfdx plugins:install sfdx-plugin-source-read
# run this command in a SFDX project
sfdx force:source:read -m Profile:Admin -u <username>
How to retrieve Permission Set from an Org and verify its correctness?
Demo Video
~/test2 > sfdx mohanc:mdapi:helper:retrieve -m PermissionSet -u mohan.chinnappan.n.sel@gmail.com -n 'Experience_Profile_Manager'
==== Writing to ./41/unpackaged/permissionsets/Experience_Profile_Manager.permissionset-meta.xml... ====
==== Writing package.xml into ./41/unpackaged/package.xml... ====
==== Executing zip command: zip -r 41.zip unpackaged ====
==== RUN: cd 41; sfdx force:mdapi:deploy -f 41.zip -u mohan.chinnappan.n.sel@gmail.com -w 1000 --verbose -c ====
~/test2 >vi 41/unpackaged/permissionsets/Experience_Profile_Manager.permissionset-meta.xml
~/test2 >cd 41; sfdx force:mdapi:deploy -f 41.zip -u mohan.chinnappan.n.sel@gmail.com -w 1000 --verbose -c
Deploying metadata to mohan.chinnappan.n.sel@gmail.com using the v56.0 SOAP API
Deploy ID: 0Af4x00000YA7YGCA1
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
Type File Name Id
───────────── ─────────────────────────────────────────────────────────────────────────── ────────────────────────── ──────────────────
unpackaged/package.xml package.xml
PermissionSet unpackaged/permissionsets/Experience_Profile_Manager.permissionset-meta.xml Experience_Profile_Manager 0PS4x0000045OQEGA2
~/test2/41 >
- Note: you can pass a comma-separated list of PermissionSet API names to this command, like
sfdx mohanc:mdapi:helper:retrieve -m PermissionSet -u mohan.chinnappan.n.sel@gmail.com -n "Acct_View_All, Oppty_View_All"
Comparing a profile in 2 orgs
~/treeprj [patch1] > sfdx mohanc:mdapi:helper:rc2 -m Profile -o mohan.chinnappan.n.sel@gmail.com,mohan.chinnappan.n.sel2@gmail.com -n Admin
==== sfdx mohanc:mdapi:helper:retrieveSimple -u mohan.chinnappan.n.sel@gmail.com -m Profile -n "Admin" > /var/folders/ch/p19n6yj11wd1vp5c3svhpt800000gp/T/Profile-Admin.xml ====
==== sfdx mohanc:mdapi:helper:retrieveSimple -u mohan.chinnappan.n.sel2@gmail.com -m Profile -n "Admin" > /var/folders/ch/p19n6yj11wd1vp5c3svhpt800000gp/T/Profile-Admin.xml ====
~/treeprj [patch1] >
Package xml
- For Profile
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
<members>*</members>
<name>ApexClass</name>
</types>
<types>
<members>*</members>
<name>ApexPage</name>
</types>
<types>
<members>*</members>
<name>CustomApplication</name>
</types>
<!-- need to specify Object.Field
<types>
<members>*</members>
<name>CustomField</name>
</types>
-->
<types>
<members>*</members>
<name>CustomMetadata</name>
</types>
<types>
<members>*</members>
<name>CustomObject</name>
</types>
<types>
<members>*</members>
<name>CustomTab</name>
</types>
<types>
<members>*</members>
<name>ExternalDataSource</name>
</types>
<types>
<members>*</members>
<name>Layout</name>
</types>
<!-- needs specific value
<types>
<members>*</members>
<name>RecordType</name>
</types>
-->
<types>
<members>*</members>
<name>FlexiPage</name>
</types>
<types>
<members>*</members>
<name>Profile</name>
</types>
<version>56.0</version>
</Package>
- For Permission set
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
<members>*</members>
<name>ApexClass</name>
</types>
<types>
<members>*</members>
<name>ApexPage</name>
</types>
<types>
<members>*</members>
<name>CustomApplication</name>
</types>
<!-- need to specify Object.Field
<types>
<members>*</members>
<name>CustomField</name>
</types>
-->
<types>
<members>*</members>
<name>CustomMetadata</name>
</types>
<types>
<members>*</members>
<name>CustomObject</name>
</types>
<types>
<members>*</members>
<name>CustomTab</name>
</types>
<types>
<members>*</members>
<name>ExternalDataSource</name>
</types>
<!-- needs specific value
<types>
<members>*</members>
<name>RecordType</name>
</types>
-->
<types>
<members>*</members>
<name>PermissionSet</name>
</types>
<version>56.0</version>
</Package>
Incremental/delta deployment of profile entries
Use Case
The profile Read Only.profile currently has no pageAccesses entry for CommunitiesLanding.
- Current XML
- We would like to add this pageAccesses entry to the file Read Only.profile:
<?xml version="1.0" encoding="UTF-8"?>
<Profile xmlns="http://soap.sforce.com/2006/04/metadata">
<pageAccesses>
<apexPage>CommunitiesLanding</apexPage>
<enabled>true</enabled>
</pageAccesses>
</Profile>
in an incremental manner, instead of deploying the whole profile XML.
Steps
- Build the folders
mkdir 100
cd 100
mkdir unpackaged
cd unpackaged
mkdir profiles
- Build the package.xml using this tool:
  - Note: this tool puts the package.xml content into your clipboard
# Get the package.xml content from clipboard
pbpaste > package.xml
bat package.xml
───────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ File: package.xml
───────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ <?xml version="1.0" encoding="UTF-8"?>
2 │ <Package xmlns="http://soap.sforce.com/2006/04/metadata">
3 │ <version>56.0</version>
4 │
5 │ <types>
6 │ <members>Read Only</members>
7 │ <name>Profile</name>
8 │ </types>
9 │
10 │
11 │ </Package>
- Copy Read Only.profile to the profiles folder
/tmp/profile-exp/100/unpackaged >vi "profiles/Read Only.profile"
/tmp/profile-exp/100/unpackaged >bat "profiles/Read Only.profile"
───────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ File: profiles/Read Only.profile
───────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ <?xml version="1.0" encoding="UTF-8"?>
2 │ <Profile xmlns="http://soap.sforce.com/2006/04/metadata">
3 │ <pageAccesses>
4 │ <apexPage>CommunitiesLanding</apexPage>
5 │ <enabled>true</enabled>
6 │ </pageAccesses>
7 │ </Profile>
8 │
9 │
10 │
───────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────
- zip the package
/tmp/profile-exp/100/unpackaged >cd ..
/tmp/profile-exp/100 >pwd
/tmp/profile-exp/100
/tmp/profile-exp/100 >zip -r 100.zip unpackaged
adding: unpackaged/ (stored 0%)
adding: unpackaged/package.xml (deflated 24%)
adding: unpackaged/profiles/ (stored 0%)
adding: unpackaged/profiles/Read Only.profile (deflated 29%)
>tree
.
├── 100.zip
└── unpackaged
├── package.xml
└── profiles
└── Read\ Only.profile
- Deploy 100.zip after you have logged into the org (checkOnly)
/tmp/profile-exp/100 >sfdx force:mdapi:deploy -f 100.zip -u mohan.chinnappan.n.sel@gmail.com -w 1000 --verbose -c
Deploying metadata to mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Deploy ID: 0Af4x00000YS5fmCAD
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
Type File Name Id
─────── ───────────────────────────────────── ─────────── ──────────────────
unpackaged/package.xml package.xml
Profile unpackaged/profiles/Read Only.profile Read Only 00e4x000002b8JCAAY
- Now deploy for real (same command without the -c check-only flag)
sfdx force:mdapi:deploy -f 100.zip -u mohan.chinnappan.n.sel@gmail.com -w 1000 --verbose
Deploying metadata to mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Deploy ID: 0Af4x00000YS5hlCAD
DEPLOY PROGRESS | ████████████████████████████████████████ | 1/1 Components
=== Deployed Source
Type File Name Id
─────── ───────────────────────────────────── ─────────── ──────────────────
unpackaged/package.xml package.xml
Profile unpackaged/profiles/Read Only.profile Read Only 00e4x000002b8JCAAY
- Open the org to see the DeployStatus
sfdx force:org:open -p lightning/setup/DeployStatus/home -u mohan.chinnappan.n.sel@gmail.com
- Check this profile in the org
Viewing components in Profile/PermissionSet in the org
Get XML/JSON for the given Profile/PermissionSet in the org
sfdx mohanc:mdapi:helper:retrieveSimple -m Profile -u mohan.chinnappan.n.sel2@gmail.com -n Admin -f xml
sfdx mohanc:mdapi:helper:retrieveSimple -m Profile -u mohan.chinnappan.n.sel2@gmail.com -n Admin -f json
sfdx mohanc:mdapi:helper:retrieveSimple -m PermissionSet -u mohan.chinnappan.n.sel2@gmail.com -n "Experience_Profile_Manager" -f xml
sfdx mohanc:mdapi:helper:retrieveSimple -m PermissionSet -u mohan.chinnappan.n.sel2@gmail.com -n "Experience_Profile_Manager" -f json
sfdx mohanc:mdapi:helper:retrieveSimple -m PermissionSet -u mohan.chinnappan.n.sel2@gmail.com -n Experience_Profile_Manager -f dt -s objectPermissions
No data found!
Valid sections:
_________
applicationVisibilities
classAccesses
customPermissions
customMetadataTypeAccesses
fieldPermissions
layoutAssignments
objectPermissions
recordTypeVisibilities
tabVisibilities
userPermissions
________
PermissionSet
sfdx mohanc:mdapi:helper:retrieveSimple -m PermissionSet -u mohan.chinnappan.n.sel2@gmail.com -n Experience_Profile_Manager -f dt -s userPermissions
Profile
sfdx mohanc:mdapi:helper:retrieveSimple -m Profile -u mohan.chinnappan.n.sel2@gmail.com -n Admin -f dt -s userPermissions
Micro-delta on Profile and PermissionSet
Use case
- We would like to know the delta between a Profile or PermissionSet in version control (in our local folder) and the same Profile or PermissionSet in the org
- The tool should be able to produce a micro-delta Profile or PermissionSet XML file which we can use for our delta deployments
sfdx mohanc:mdapi:helper:retrieveSimple -m Profile -u mohan.chinnappan.n.sel2@gmail.com -n Admin -f jsonSection -l ~/treeprj/force-app/main/default/profiles/Admin.profile-meta.xml -s ALL
Flags:
- -m Profile : metadata type
- -u : username to log in to the org
- -n Admin : we are working with the Admin Profile
- -f jsonSection : compare the file section by section
- -l : the local file we got from version control
- -s ALL : include all the sections
Sections
{
"applicationVisibilities": "application",
"classAccesses": "apexClass",
"customPermissions":"name",
"customMetadataTypeAccesses":"name",
"fieldPermissions":"field",
"layoutAssignments":"layout",
"objectPermissions":"object",
"recordTypeVisibilities":"recordType",
"tabVisibilities":"tab",
"userPermissions": "name"
}
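The micro-delta computation can be sketched as follows: index each section's entries by its key field (from the mapping above), then emit entries that are new or changed in the local file. This simplified sketch works on already-parsed dicts rather than the profile XML the plugin actually reads, and the helper name is made up:

```python
# key field per section, taken from the mapping above (abbreviated)
SECTION_KEYS = {
    "applicationVisibilities": "application",
    "classAccesses": "apexClass",
    "fieldPermissions": "field",
    "userPermissions": "name",
}

def micro_delta(local, org, section):
    """Compare one profile/permission-set section and return
    (new_items, diff_items). Entries are dicts matched by the
    section's key field."""
    key = SECTION_KEYS[section]
    org_by_key = {e[key]: e for e in org.get(section, [])}
    new_items, diff_items = [], []
    for entry in local.get(section, []):
        org_entry = org_by_key.get(entry[key])
        if org_entry is None:
            new_items.append(entry)       # present locally, absent in the org
        elif org_entry != entry:
            diff_items.append(entry)      # present in both, values differ
    return new_items, diff_items

# usage mirroring the classAccesses items in the micro-delta XML above
local = {"classAccesses": [
    {"apexClass": "MyDomain", "enabled": False},
    {"apexClass": "Yourclass", "enabled": True},
]}
org = {"classAccesses": [{"apexClass": "MyDomain", "enabled": True}]}
new_items, diff_items = micro_delta(local, org, "classAccesses")
```

Only the new and diff items need to go into the micro-delta XML, which keeps the delta deployment small.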
Video about Micro-delta Profile file
cat /tmp/Admin_micro-delta.xml
<?xml version="1.0" encoding="UTF-8"?>
<Profile xmlns="http://soap.sforce.com/2006/04/metadata">
<!-- Diff item -->
<applicationVisibilities>
<application>standard__AllTabSet</application>
<default>false</default>
<visible>false</visible>
</applicationVisibilities>
<!-- New item -->
<applicationVisibilities>
<application>TESTAPP</application>
<default>false</default>
<visible>false</visible>
</applicationVisibilities>
<!-- New item -->
<applicationVisibilities>
<application>NEWAPP</application>
<default>false</default>
<visible>false</visible>
</applicationVisibilities>
<!-- Diff item -->
<classAccesses>
<apexClass>MyDomain</apexClass>
<enabled>false</enabled>
</classAccesses>
<!-- New item -->
<classAccesses>
<apexClass>Yourclass</apexClass>
<enabled>true</enabled>
</classAccesses>
<!-- Diff item -->
<fieldPermissions>
<editable>false</editable>
<field>Account.AccountNumber</field>
<readable>true</readable>
</fieldPermissions>
<!-- New item -->
<fieldPermissions>
<editable>true</editable>
<field>Account.testNumber__c</field>
<readable>true</readable>
</fieldPermissions>
<!-- New item -->
<fieldPermissions>
<editable>true</editable>
<field>Account.NewAccountNumber__c</field>
<readable>true</readable>
</fieldPermissions>
</Profile>
Permission Set
sfdx mohanc:mdapi:helper:retrieveSimple -m PermissionSet -u mohan.chinnappan.n.sel2@gmail.com -n "Experience_Profile_Manager" -f jsonSection -l ~/treeprj/force-app/main/default/permissionsets/Experience_Profile_Manager.permissionset-meta.xml -s ALL
<?xml version="1.0" encoding="UTF-8"?>
<PermissionSet xmlns="http://soap.sforce.com/2006/04/metadata">
<!-- Diff item -->
<userPermissions>
<enabled>false</enabled>
<name>AccessCMC</name>
</userPermissions>
How about getting a micro-delta between 2 local files (for example, coming from 2 branches)?
- With the -x flag you can provide a second file; this file will be treated as if it came from the org
sfdx mohanc:mdapi:helper:retrieveSimple -m PermissionSet -u mohan.chinnappan.n.sel2@gmail.com -n "Experience_Profile_Manager" -f jsonSection -l ~/treeprj/force-app/main/default/permissionsets/Experience_Profile_Manager.permissionset-meta.xml -s ALL -x /tmp/EPM.permissionset-meta.xml
<?xml version="1.0" encoding="UTF-8"?>
<PermissionSet xmlns="http://soap.sforce.com/2006/04/metadata">
<!-- Diff item -->
<userPermissions>
<enabled>false</enabled>
<name>AccessCMC</name>
</userPermissions>
<!-- Diff item -->
<userPermissions>
<enabled>true</enabled>
<name>CreateReportInLightning</name>
</userPermissions>
</PermissionSet>
Prepare the diff viewer
pbcopy < ~/treeprj/force-app/main/default/permissionsets/Experience_Profile_Manager.permissionset-meta.xml
# Put the clipboard content into left side of the https://mohan-chinnappan-n5.github.io/delta/diff.html
pbcopy < /tmp/EPM.permissionset-meta.xml
# Now put the clipboard content into the right side of the https://mohan-chinnappan-n5.github.io/delta/diff.html
Comparing a permissionset/profile between 2 git branches
Demo video
Commands used in this video
~/lwc/dreamhouse-lwc [feature-1] >git branch -a
* feature-1
main
remotes/origin/HEAD -> origin/main
remotes/origin/main
vi force-app/main/default/permissionsets/dreamhouse.permissionset-meta.xml
~/lwc/dreamhouse-lwc [feature-1] >git checkout main
Switched to branch 'main'
Your branch is up to date with 'origin/main'.
vi force-app/main/default/permissionsets/dreamhouse.permissionset-meta.xml
~/lwc/dreamhouse-lwc [main] >python3 ~/sf-git-utils/git_diff_file_two_branches.py --branch1=main --branch2=feature-1 --filepath="force-app/main/default/permissionsets/dreamhouse.permissionset-meta.xml" --metadataType='permissionset' --help
usage: git_diff_file_two_branches.py [-h] [--branch1 BRANCH1] [--branch2 BRANCH2] [--filepath FILEPATH]
[--metadataType METADATATYPE]
Get content for a file from given 2 branches
options:
-h, --help show this help message and exit
--branch1 BRANCH1 branch1 (Default: main)
--branch2 BRANCH2 branch2 (Default: develop)
--filepath FILEPATH File path for the file (Default: README.md)
--metadataType METADATATYPE
profile or permission set (Default: profile)
Get given file content in given 2 branches
~/lwc/dreamhouse-lwc [main] >python3 ~/sf-git-utils/git_diff_file_two_branches.py --branch1=main --branch2=feature-1 --filepath="force-app/main/default/permissionsets/dreamhouse.permissionset-meta.xml" --metadataType='permissionset'
=== Checking out main ===
Already on 'main'
Temporary file name: /var/folders/ch/p19n6yj11wd1vp5c3svhpt800000gp/T/tmpa701j416
=== Going to run xml transform...===
=== Writing the output into file : /var/folders/ch/p19n6yj11wd1vp5c3svhpt800000gp/T/tmpa701j416.html ===
=== Contents of /var/folders/ch/p19n6yj11wd1vp5c3svhpt800000gp/T/tmpa701j416 is copied into your clipboard. Opening the diff app... ===
=== ACTION REQUIRED: Paste the content into the diff app, once it opens up... and press [Enter] ===
=== Continuing with the program... ===
=== checking out feature-1 ===
Switched to branch 'feature-1'
Temporary file name: /var/folders/ch/p19n6yj11wd1vp5c3svhpt800000gp/T/tmp2rdylch1
going to run xml transform...
=== Writing the output into file : /var/folders/ch/p19n6yj11wd1vp5c3svhpt800000gp/T/tmp2rdylch1.html ===
=== Contents of /var/folders/ch/p19n6yj11wd1vp5c3svhpt800000gp/T/tmp2rdylch1 is copied into your clipboard. Goto the diff app... ===
=== ACTION REQUIRED: Paste the content into the diff app right side box ===
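The same two-branch comparison can be sketched in a few lines of Python using `git show`, which reads a file from a branch without checking it out. The helper names and file paths below are illustrative, not part of the sf-git-utils script:

```python
import difflib
import subprocess

def file_from_branch(branch: str, filepath: str, repo: str = ".") -> str:
    """Read a file's content from a branch without checking it out."""
    result = subprocess.run(
        ["git", "-C", repo, "show", f"{branch}:{filepath}"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

def html_diff(branch1: str, branch2: str, filepath: str, repo: str = ".") -> str:
    """Produce a side-by-side HTML diff of one file across two branches."""
    a = file_from_branch(branch1, filepath, repo).splitlines()
    b = file_from_branch(branch2, filepath, repo).splitlines()
    return difflib.HtmlDiff().make_file(a, b, fromdesc=branch1, todesc=branch2)

# Hypothetical usage against the dreamhouse repo:
# html = html_diff("main", "feature-1",
#     "force-app/main/default/permissionsets/dreamhouse.permissionset-meta.xml")
# open("/tmp/ps-diff.html", "w").write(html)
```

Because `git show` never switches branches, your working tree stays untouched while you compare.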
Tools
Queries
Query to check users assigned with a given permissionset
SELECT id,PermissionSetId, Assignee.Name, Assignee.id FROM PermissionSetAssignment
WHERE PermissionSet.Name = 'SalesUserPsl'
End of life (EOL) of permissions on profiles
- Permission sets are the future of user management
- End of life (EOL) of permissions on profiles is targeted for the Spring ’26 release.
What will remain on a profile?
- One-to-one relationships
- login hours/IP ranges
- Defaults—record types, apps
- Page layout assignment
- The future is App Builder/Dynamic Forms so Salesforce will not invest in bringing page layout assignment to permission sets.
What will be available only on permission sets after this EOL?
- User permissions (system and app permissions)
- Object permissions (object Create, Read, Update, and Delete - CRUD)
- Field permissions (field-level security FLS)
- Tabs
- Record types (not defaults)
- Apps (not defaults)
- Connected app access
- Apex classes
- Visualforce pages
- Custom permissions
Turning off profile permissions before EOL
- Turning off the ability to use permissions on profiles before the official EOL - targeted for the Spring ’24 release
References
Visualize Profile, PermissionSet
Profile
sfdx mohanc:xml:transform -i <ProfileXmlFile> -m Profile
sfdx mohanc:xml:transform -i force-app/main/default/profiles/Admin.profile-meta.xml -m Profile
PermissionSet
sfdx mohanc:xml:transform -i force-app/main/default/permissionsets/Experience_Profile_Manager.permissionset-meta.xml -m PermissionSet
Demos
Using xmlutil executable
- Download the tool
- Put the downloaded tool in your PATH
- You need to give permission to run this command in your macOS via System Settings > Privacy & Security
- Make it executable
chmod +x xmlutil
- Visualize Permission Set
xmlutil transform --xml=MyTest.permissionset-meta.xml --xsl=PermissionSet3 --out=/tmp/myTest.html
- Visualize Profile
xmlutil transform --xml=MyProfile.profile-meta.xml --xsl=Profile3 --out=/tmp/myProfile.html
- Visualize CustomLabels
xmlutil transform --xml=CustomLabels.labels-meta.xml --xsl=CustomLabel2 --out=/tmp/cl.html
How to get Visualization for all the profiles or permissionsets in the given folder
- Make sure that the xmlutil executable is in your PATH to make this work
- Download the scripts
bash process_profile.sh <profiles folder>
bash process_ps.sh <permissionsets folder>
- Example for permissionsets
bash process_ps.sh ~/treeprj/force-app/main/default/permissionsets
=== 1: Processing /Users/mchinnappan/treeprj/force-app/main/default/permissionsets/Experience_Profile_Manager.permissionset-meta.xml file...
=== Output is written into /tmp/_viz_PermissionSet/Experience_Profile_Manager.permissionset-meta.xml.html, view this file in your web browser: open /tmp/_viz_PermissionSet/Experience_Profile_Manager.permissionset-meta.xml.html ===
-------------------------------------------
=== 2: Processing /Users/mchinnappan/treeprj/force-app/main/default/permissionsets/Manage_Encryption_Keys.permissionset-meta.xml file...
=== Output is written into /tmp/_viz_PermissionSet/Manage_Encryption_Keys.permissionset-meta.xml.html, view this file in your web browser: open /tmp/_viz_PermissionSet/Manage_Encryption_Keys.permissionset-meta.xml.html ===
-------------------------------------------
=== 3: Processing /Users/mchinnappan/treeprj/force-app/main/default/permissionsets/sf_devops_InitializeEnvironments.permissionset-meta.xml file...
=== Output is written into /tmp/_viz_PermissionSet/sf_devops_InitializeEnvironments.permissionset-meta.xml.html, view this file in your web browser: open /tmp/_viz_PermissionSet/sf_devops_InitializeEnvironments.permissionset-meta.xml.html ===
-------------------------------------------
=== 4: Processing /Users/mchinnappan/treeprj/force-app/main/default/permissionsets/sf_devops_NamedCredentials.permissionset-meta.xml file...
=== Output is written into /tmp/_viz_PermissionSet/sf_devops_NamedCredentials.permissionset-meta.xml.html, view this file in your web browser: open /tmp/_viz_PermissionSet/sf_devops_NamedCredentials.permissionset-meta.xml.html ===
-------------------------------------------
=== 4 processed ===
adding: tmp/_viz_PermissionSet/Experience_Profile_Manager.permissionset-meta.xml.html (deflated 79%)
adding: tmp/_viz_PermissionSet/Manage_Encryption_Keys.permissionset-meta.xml.html (deflated 80%)
adding: tmp/_viz_PermissionSet/sf_devops_InitializeEnvironments.permissionset-meta.xml.html (deflated 79%)
adding: tmp/_viz_PermissionSet/sf_devops_NamedCredentials.permissionset-meta.xml.html (deflated 79%)
=== _viz_PermissionSet_files.zip created ===
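The per-file loop that these scripts run over a folder can be sketched in Python. `viz_commands` here is a hypothetical helper that only builds the `xmlutil` command lines without executing them; it assumes, as the scripts do, that `xmlutil` is on your PATH:

```python
from pathlib import Path

def viz_commands(folder: str, xsl: str = "PermissionSet3",
                 outdir: str = "/tmp/_viz_PermissionSet") -> list:
    """Build one 'xmlutil transform' command per permission set file in the folder.

    Commands are returned as strings, not executed, so you can review them
    before running (e.g. via subprocess.run or a shell pipeline)."""
    return [
        f"xmlutil transform --xml={f} --xsl={xsl} --out={outdir}/{f.name}.html"
        for f in sorted(Path(folder).glob("*.permissionset-meta.xml"))
    ]

# Hypothetical usage:
# for cmd in viz_commands("force-app/main/default/permissionsets"):
#     print(cmd)
```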
Package.xml for profile/permissionset covering sales objects
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<version>56.0</version>
<!--
usage:
sfdx force:mdapi:retrieve -k profiles-with-stdobjects-package.xml -u <username> -r .
-->
<types>
<!-- needs to be customizable CustomObjects -->
<members>Account</members>
<members>Contact</members>
<members>Opportunity</members>
<members>Campaign</members>
<members>Case</members>
<members>Asset</members>
<members>Order</members>
<members>Lead</members>
<members>CampaignMember</members>
<members>Contract</members>
<name>CustomObject</name>
</types>
<types>
<members>*</members>
<name>Profile</name>
</types>
</Package>
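If you maintain several such manifests, a package.xml like the one above can also be generated programmatically. This is a minimal sketch using Python's standard library; the `build_package_xml` helper is illustrative, not an existing tool, and it follows the element order of the sample above:

```python
import xml.etree.ElementTree as ET

MD_NS = "http://soap.sforce.com/2006/04/metadata"

def build_package_xml(objects, api_version="56.0"):
    """Build a package.xml retrieving the given objects plus all profiles."""
    ET.register_namespace("", MD_NS)  # emit the metadata namespace as default
    pkg = ET.Element(f"{{{MD_NS}}}Package")
    ET.SubElement(pkg, f"{{{MD_NS}}}version").text = api_version
    objs = ET.SubElement(pkg, f"{{{MD_NS}}}types")
    for obj in objects:
        ET.SubElement(objs, f"{{{MD_NS}}}members").text = obj
    ET.SubElement(objs, f"{{{MD_NS}}}name").text = "CustomObject"
    profiles = ET.SubElement(pkg, f"{{{MD_NS}}}types")
    ET.SubElement(profiles, f"{{{MD_NS}}}members").text = "*"
    ET.SubElement(profiles, f"{{{MD_NS}}}name").text = "Profile"
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(pkg, encoding="unicode"))

print(build_package_xml(["Account", "Contact", "Opportunity"]))
```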
SetupAuditTrail
Using Chrome Extension ONNU - Query Console to query SetupAuditTrail and filter the results easily
Data Mask
- Data Mask helps keep test environments secure by replacing sensitive data with realistic mock test data.
Features
How to
- Deleting data that is not needed for testing
- Replacing phone numbers with random numbers, or leaving fields blank where no masked value is needed
- Libraries for masking the fields
Data Mask Demo
FAQ
Guidelines for integration testing
- Make sure that records are masked the same way across the data sources in your solution
- Take care of the referential integrity of the multiple data sources in your solution: a given record must be masked the same way wherever it occurs (in Salesforce and in the other sub-systems of your integration landscape)
- Example: the customer Ken Thompson is masked into David Farmer in Salesforce as well as in, say, the SAP system you are integrating with. Likewise, Ken's SSN (or any PII) must be masked to the same value in both Salesforce and SAP.
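One common way to satisfy this referential-integrity guideline is deterministic masking: derive the masked value from a keyed hash of the original, so every system holding the same key produces the same replacement. A minimal sketch follows; the key and the name pools are made up for illustration, and Salesforce Data Mask ships its own masking libraries:

```python
import hashlib
import hmac

# Assumption: the same secret key is shared by every system in the landscape,
# so 'Ken Thompson' masks to the same value in Salesforce and in SAP.
MASK_KEY = b"shared-masking-key"

FIRST = ["David", "Alice", "Ravi", "Mei", "Omar"]
LAST = ["Farmer", "Stone", "Patel", "Chen", "Khan"]

def mask_name(full_name: str) -> str:
    """Deterministically map a real name to a mock name from fixed pools."""
    digest = hmac.new(MASK_KEY, full_name.encode(), hashlib.sha256).digest()
    return f"{FIRST[digest[0] % len(FIRST)]} {LAST[digest[1] % len(LAST)]}"

# The same input always yields the same masked output on any system with the key
print(mask_name("Ken Thompson") == mask_name("Ken Thompson"))  # True
```

The HMAC keeps the mapping stable across systems while remaining hard to reverse without the key.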
References
SAML - Security Assertion Markup Language.
sequenceDiagram title: Magic of Security Assertion Markup Language - Get into ExternalApps User->>Dashboard: Lands on Dashboard Dashboard->>IdP: Logs in using IdP CorporatePortal->>User: Presents External Apps available to the User User->>ExternalApp:Clicks into the ExternalApp ExternalApp->>IdP: Sends Security Assertion Markup Language <br/>request to authenticate User IdP->>ExternalApp: Verifies the session and sends the User back to ExternalApp <br/> with Security Assertion Markup Language Response User->>ExternalApp: Got access to ExternalApp
It is an XML-based open standard for transferring identity data between two parties: an identity provider (IdP) and a service provider (SP).
Identity Provider (IdP)
— Performs authentication and passes the user’s identity and authorization level to the service provider.
Service Provider (SP)
— Trusts the identity provider and authorizes the given user to access the requested resource.
Benefits
- Improved User Experience — Users only need to sign in one time to access multiple service providers.
- Increased Security — SAML provides a single point of authentication, which happens at a secured IdP.
- Then, SAML transfers the identity information to the service providers. This form of authentication ensures that credentials are only sent to the IdP directly.
- Loose Coupling of Directories — SAML doesn’t require user information to be maintained and synchronized between directories.
- Reduced Costs for Service Providers — With SAML, you don’t have to maintain account information across multiple SPs. The IdP bears this burden.
SAML Request from SP
This SAML request is a POST binding request, meaning that it will be sent via an HTTP POST request to the Identity Provider (IdP) endpoint specified in the Destination attribute.
<?xml version="1.0" encoding="UTF-8"?>
<samlp:AuthnRequest
xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol"
xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion"
ID="12345"
Version="2.0"
IssueInstant="2023-03-26T12:34:56Z"
ProtocolBinding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST"
AssertionConsumerServiceURL="https://sp.example.com/acs"
Destination="https://idp.example.com/idp/login">
<saml:Issuer>https://sp.example.com</saml:Issuer>
<samlp:NameIDPolicy
Format="urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified"
AllowCreate="true" />
</samlp:AuthnRequest>
Element | Description |
---|---|
samlp:AuthnRequest | The root element of the SAML request |
xmlns:samlp and xmlns:saml | The namespaces used in the SAML request |
ID | A unique identifier for the SAML request |
Version | The version of the SAML protocol being used (in this case, SAML 2.0). |
IssueInstant | The timestamp when the SAML request was issued. |
ProtocolBinding | The binding used for the SAML request. |
AssertionConsumerServiceURL | The URL where the SAML response should be sent. |
Destination | The endpoint URL of the IdP where the SAML request should be sent. |
saml:Issuer | The entity ID of the service provider sending the SAML request. |
samlp:NameIDPolicy | A policy that specifies how the NameID should be formatted in the SAML response. In this case, the format is unspecified and the IdP is allowed to create a new NameID if necessary. |
- Assertion Consumer Service (ACS): the service provider endpoint (the AssertionConsumerServiceURL above) that receives and processes the SAML response
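To make the table concrete, here is a small Python sketch that parses the AuthnRequest above with the standard library and pulls out the fields an IdP cares about:

```python
import xml.etree.ElementTree as ET

SAMLP = "urn:oasis:names:tc:SAML:2.0:protocol"
SAML = "urn:oasis:names:tc:SAML:2.0:assertion"

# The sample AuthnRequest from the section above (abbreviated)
authn_request = """<samlp:AuthnRequest
    xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol"
    xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion"
    ID="12345" Version="2.0" IssueInstant="2023-03-26T12:34:56Z"
    ProtocolBinding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST"
    AssertionConsumerServiceURL="https://sp.example.com/acs"
    Destination="https://idp.example.com/idp/login">
  <saml:Issuer>https://sp.example.com</saml:Issuer>
</samlp:AuthnRequest>"""

root = ET.fromstring(authn_request)
assert root.tag == f"{{{SAMLP}}}AuthnRequest"
print(root.get("Destination"))                   # IdP endpoint the request goes to
print(root.get("AssertionConsumerServiceURL"))   # SP endpoint the response returns to
print(root.find(f"{{{SAML}}}Issuer").text)       # the SP's entity ID
```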
SAML response from IdP
<?xml version="1.0" encoding="UTF-8"?>
<samlp:Response
xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol"
xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion"
ID="abcdefg"
Version="2.0"
IssueInstant="2023-03-26T12:34:56Z"
Destination="https://sp.example.com/acs"
InResponseTo="12345">
<saml:Issuer>https://idp.example.com</saml:Issuer>
<samlp:Status>
<samlp:StatusCode
Value="urn:oasis:names:tc:SAML:2.0:status:Success" />
</samlp:Status>
<saml:Assertion
ID="hijklmn"
Version="2.0"
IssueInstant="2023-03-26T12:34:56Z">
<saml:Issuer>https://idp.example.com</saml:Issuer>
<saml:Subject>
<saml:NameID>user@example.com</saml:NameID>
<saml:SubjectConfirmation
Method="urn:oasis:names:tc:SAML:2.0:cm:bearer">
<saml:SubjectConfirmationData
InResponseTo="12345"
NotOnOrAfter="2023-03-26T12:44:56Z"
Recipient="https://sp.example.com/acs" />
</saml:SubjectConfirmation>
</saml:Subject>
<saml:Conditions
NotBefore="2023-03-26T12:24:56Z"
NotOnOrAfter="2023-03-26T12:44:56Z">
<saml:AudienceRestriction>
<saml:Audience>https://sp.example.com</saml:Audience>
</saml:AudienceRestriction>
</saml:Conditions>
<saml:AuthnStatement
AuthnInstant="2023-03-26T12:34:56Z"
SessionIndex="12345">
<saml:AuthnContext>
<saml:AuthnContextClassRef>
urn:oasis:names:tc:SAML:2.0:ac:classes:PasswordProtectedTransport
</saml:AuthnContextClassRef>
</saml:AuthnContext>
</saml:AuthnStatement>
</saml:Assertion>
</samlp:Response>
Item | Description |
---|---|
Assertion | This is the primary component of the SAML response and contains the actual authentication and authorization information about the user. The assertion typically includes the user's identity information, such as their name and email address, as well as any relevant access control information. |
Status | This element indicates the success or failure of the authentication process. If the authentication was successful, the status will be set to "Success." If it failed, the status will be set to "Failure" and may include an error message. |
Signature | The SAML response may be digitally signed to ensure its authenticity and integrity. The signature is typically generated using the private key of the IdP and can be verified using the public key of the IdP. |
Destination | This element specifies the URL of the SP that the SAML response is intended for. |
IssueInstant | This element indicates the time when the SAML response was issued. |
Issuer | This element identifies the IdP that issued the SAML response |
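Before trusting the assertion, an SP validates the `Conditions`, `Audience`, and `InResponseTo` values shown above. Here is a minimal sketch of those checks, with timestamps and IDs taken from the sample response; a real SP must also verify the digital signature:

```python
from datetime import datetime, timezone

FMT = "%Y-%m-%dT%H:%M:%SZ"

# Values lifted from the sample SAML response above
conditions = {
    "NotBefore": "2023-03-26T12:24:56Z",
    "NotOnOrAfter": "2023-03-26T12:44:56Z",
    "Audience": "https://sp.example.com",
    "InResponseTo": "12345",
}

def validate(cond, now, expected_audience, request_id):
    """Checks an SP performs before trusting an assertion (signature check omitted)."""
    not_before = datetime.strptime(cond["NotBefore"], FMT).replace(tzinfo=timezone.utc)
    not_after = datetime.strptime(cond["NotOnOrAfter"], FMT).replace(tzinfo=timezone.utc)
    return (not_before <= now < not_after              # assertion validity window
            and cond["Audience"] == expected_audience  # assertion is meant for this SP
            and cond["InResponseTo"] == request_id)    # matches our AuthnRequest ID

now = datetime(2023, 3, 26, 12, 35, tzinfo=timezone.utc)
print(validate(conditions, now, "https://sp.example.com", "12345"))  # True
```

The `InResponseTo` check ties the response back to the request with ID 12345, which defeats replay of an unrelated assertion.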
sequenceDiagram title: Security Assertion Markup Language - Request and Response User->>Dashboard: Lands on Dashboard via Browser Dashboard->>IdP: POSTs Request IdP->>ExternalApp: Parses Request and authenticates User and sends Response via Browser User->>ExternalApp: Got access to ExternalApp
OpenID Concepts
Basic Concepts
- OpenID Connect is an open standard for authentication and authorization that provides an interoperable authentication layer between applications
- It is a protocol that helps applications of all types
- handle end-user authentication and verify the identities of these users
- OpenID Connect is based on a framework called OAuth 2.0, which handles delegated authorization
- OpenID Connect enables scenarios where one login can be used across multiple applications, also known as single sign-on (SSO).

authentication
- The process of confirming the identity of an entity (e.g., a user)
authorization
- The process of verifying what entities can access, or what actions they can perform.
For a concrete example, imagine a situation where you buy a ticket for a show. In this case, more often than not, the establishment will not be interested in your identity (i.e., who you are). What they care about is whether or not you are authorized to attend the show. To prove that you have the right to be there, instead of using an ID or a passport, you would use a ticket that contains no information about you.
- Authentication can lead to authorization, but the opposite is not true.
- Having authorization does not mean being authenticated or identified.
entity
- A thing that exists as an individual unit
identity
- A set of attributes that you can use to distinguish the entity within a context.
- Multiple identities can belong to an entity.
flowchart LR identity1 --> entity identity2 --> entity observer1 --> identity1 observer2 --> identity2 identity2 --> attributes2 identity1 --> attributes1
- Apps based on the OpenID Connect protocol rely on
- identity providers to handle authentication processes for them securely and to verify the identities (i.e., personal attributes) of their users
- Note: the OIDC spec uses the term Authorization Server to refer to the entity in charge of authenticating end-users
Imagine, for example, you have a restaurant application that allows authenticated users to book tables. By using OpenID Connect, instead of dealing with the credentials of these users, your app would offload the authentication process to an identity provider (for example, Google, Microsoft, or Auth0). More specifically, when visitors start this process, your application would redirect them to the identity provider of choice, where they would authenticate themselves to prove their identities. After the authentication process, your app would get this identity proof and would allow users to book their tables based on it.
sequenceDiagram title: Restaurant table booking User->>App: I need to book table App->>User: OK, can you prove your identity, I will send you to Google OIDC<br/> Since that is configured as your authProvider User->>GoogleOIDC: Login GoogleOIDC->>App: Here is the ID proof of the User App->>GoogleOIDC: Here is the artifacts for the authentication App->>User: You are logged-in, go ahead and book the table User->>App: Books the table
sequenceDiagram title: Salesforce Org Access User->>SFDC: I need to login into the Org SFDC->>User: OK, can you prove your identity, I will send you to Google OIDC,<br/> Since that is configured as your authProvider User->>GoogleOIDC: Login GoogleOIDC->>SFDC: Here is the ID proof of the User SFDC->>GoogleOIDC: Here is the artifacts for the authentication SFDC->>User: You are logged-in, go ahead and use the Org User->>SFDC: Performs work in the Org
- From the OIDC spec's point of view:
- Identity providers are free to decide if they handle user authentication through a set of credentials (e.g., username and password)
- if they enhance the security of the process by using features like multi-factor authentication (MFA),
- or even if they relay this process to other identity providers and other protocols.
History - how we landed here?
Password days
- It all started simply: if you had a password, you could access the computer
flowchart LR User-->Password-->Computer
Birth of Authentication Protocol like Kerberos
- With the advent of computer networks, we needed to avoid the issue of user accounts being duplicated all over the place
- Authentication protocols like Kerberos were created to solve these issues
- Then we decided to centralize authentication on domain controllers (servers that respond to security authentication requests like login, permission checking, etc.)
flowchart LR Apps<-->DomainController<-->User
- Single organization focused:
- The biggest problem with these domain controllers was that they were conceived to put all the entities (end-users, applications, and services) involved in the process under the control of a single organization.
- These protocols were not created to handle scenarios where users want to connect to third-party clients as in a typical cloud computing scenario, where the organization that handles the identity provider typically does not have knowledge or control over those clients.
SAML
- For exchanging authentication and authorization data between different security domains.
- A user that belonged to company A could consume a service on Company B in a reliable way and without requiring companies to duplicate user profile
- Heavily based on XML for flexibility, but this becomes a problem when you need to digitally sign identities:
- two elements listed in a different order can break signature verification.
- Technically speaking, SAML provides enough to cover similar scenarios to the ones OpenID Connect addresses
- But the protocol ended up being used mostly in the business world.
- In the consumer world (social networks, for example), there was another effort called OAuth (Open Authorization)
flowchart LR CompanyA <--> SAML <--> CompanyB
OAuth (Open Authorization) and OIDC
- OAuth created specifically to handle delegated authorization scenarios, like when you let a random application post something on Facebook as if it was you.
- OAuth got so much traction that developers started using it to do things it was not created to do, like handling end-user authentication.
- This led to an effort that resulted in OpenID Connect, a protocol that extends OAuth 2.0 to address authentication.
OAuth 2.0
- OAuth 2.0 is an authorization framework that enables clients to use resource servers on behalf of resource owners
- Delegated authorization
- The OAuth 2.0 framework defines an Authorization Server
- The Authorization Server provides a means for resource owners to decide whether or not they want to grant the client the power to do something on their behalf
- The OAuth 2.0 framework uses the term Access Token for the artifact that grants third-party applications (Clients) delegated authorization to act on behalf of users (Resource Owners).
- In other words, OAuth 2.0 enables applications (clients) to use APIs on behalf of users
- With this token, an API (Resource Server) will let an application access some part of it to perform some action
- Since the only thing OAuth 2.0 handles is authorization, the framework does not support end-user authentication by itself
sequenceDiagram title: How SFDX CLI uses OAuth 2.0? User->>SFDXCLI: Grants access (after giving the consent) <br/> User is the Resource Owner SFDXCLI->>SFOrg: Authorization <br/>(SFDXCLI is Client/Application) (SFOrg is Resource Server)
OIDC Use Cases
- OpenID Connect lets applications offload the authentication process burden to identity providers.
- Use OIDC to enable your users to reuse their accounts on an identity provider
- Instead of asking users to create yet another account – which would mean asking them to remember another set of credentials (or, even worse, reusing usernames and passwords) – you could take advantage of OIDC to integrate with an identity provider like Google or Microsoft to let them reuse existing accounts.
- The signup process would be smoother, resulting in fewer dropouts
- OIDC defines a set of profile data categories that you can use to acquire more information about your users
- Instead of making your application communicate with multiple providers, you can make it connect to a single one that works as a hub for the others. For example: the company Auth0 integrates with more than fifty identity providers
- Work as a proxy for other protocols
- For example, make an OpenID Connect identity provider work as a proxy for a more restrictive protocol like SAML
Key terms
Item | Description |
---|---|
Authorization Server | the place where, in a pure OAuth 2.0 authorization flow, end-users authorize third-party applications to act on their behalf; or, in an OpenID Connect flow, where end-users authenticate. |
ID Token | an artifact that carries personal information about end-users that authenticate on an OpenID Connect flow. |
Digital Signature | the mechanism that allows third-party applications to confirm the veracity of the information they find in ID Tokens. |
Delegated Authorization | a particular type of authorization where an entity enables another entity to act on its behalf. |
Access Tokens | the artifact that grants, to a third-party application, the authorization to act on behalf of a resource owner (e.g., an end-user) |
Implicit flow | tokens are handed to the applications directly by the authorization endpoint (i.e., when the authorization server redirects users back to the application) |
authorization code flow | Applications first get authorization codes (hence, the name); then, they have to exchange these codes for the tokens they need |
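The authorization code flow from the table can be sketched as two HTTP interactions: building the `/authorize` URL the browser is sent to, and the form body the application later POSTs to the token endpoint. All endpoint and client values below are hypothetical:

```python
from urllib.parse import urlencode

# Hypothetical client registration values
AUTHORIZE_URL = "https://idp.example.com/authorize"
TOKEN_URL = "https://idp.example.com/token"
CLIENT_ID = "my-client-id"
REDIRECT_URI = "https://app.example.com/callback"

# Step 1: redirect the user's browser to the authorization endpoint
auth_request = AUTHORIZE_URL + "?" + urlencode({
    "response_type": "code",          # ask for an authorization code
    "client_id": CLIENT_ID,
    "redirect_uri": REDIRECT_URI,
    "scope": "openid email profile",  # the 'openid' scope makes this an OIDC request
    "state": "xyz123",                # anti-CSRF value, echoed back unchanged
})

# Step 2: the IdP redirects back with ?code=...; the app exchanges the code
# (server to server, POSTed to TOKEN_URL) for the access and ID tokens
token_request_body = urlencode({
    "grant_type": "authorization_code",
    "code": "AUTH_CODE_FROM_REDIRECT",  # placeholder for the returned code
    "redirect_uri": REDIRECT_URI,
    "client_id": CLIENT_ID,
    "client_secret": "my-client-secret",
})
print(auth_request)
```

Because the code-for-token exchange happens server to server, the tokens never pass through the browser, which is why this flow is preferred over the implicit flow.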
Resources
OpenId Connect
How to use Google as OpenID Connect provider for Salesforce?
Topics
- Demo
- Auth. Provider Settings
- OpenID Provider client secret
- Auth.RegistrationHandler
- Auth.RegistrationHandler Log
- Using Metadata API
- Getting AccessToken for the Google using Auth.AuthToken and Getting userInfo
- References
Demo
Auth. Provider settings
Salesforce Configuration
URL | Link |
---|---|
Test-Only Initialization URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/auth/test/Google |
Single Sign-On Initialization URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/auth/sso/Google |
Existing User Linking URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/auth/link/Google |
OAuth-Only Initialization URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/auth/oauth/Google |
Callback URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/authcallback/Google |
Single Logout URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/auth/rp/oidc/logout |
Google OIDC Provider Configuration
URL | Link |
---|---|
Authorize Endpoint URL | https://accounts.google.com/o/oauth2/auth |
Token Endpoint URL | https://accounts.google.com/o/oauth2/token |
User Info Endpoint URL | https://www.googleapis.com/oauth2/v3/userinfo |
OIDC provider client secret details
- Note: Content is mangled here to protect the Identity
{
"web": {
"project_id": "oidtest-310720",
"client_id": "113225562349-fqktafcJUNK.apps.googleusercontent.com",
"client_secret": "6VIgVAz8NlJUNKIk4INu4",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"redirect_uris": [
"https://mohansun-ea-02-dev-ed.my.salesforce.com/services/authcallback/Google"
]
}
}
Sample Autocreated RegHandler class
// TODO:This autogenerated class includes the basics for a Registration
// Handler class. You will need to customize it to ensure it meets your needs and
// the data provided by the third party.
global class AutocreatedRegHandler1618428423355 implements Auth.RegistrationHandler {
global boolean canCreateUser(Auth.UserData data) {
//TODO: Check whether we want to allow creation of a user with this data
//Set<String> s = new Set<String>{'usernamea', 'usernameb', 'usernamec'};
//if(s.contains(data.username)) {
//return true;
//}
return false;
}
global User createUser(Id portalId, Auth.UserData data){
System.debug(data);
// Instead of creating a new user, link to the existing user whose
// Federation ID matches the email sent by the identity provider
User u = [Select Id From User WHERE federationidentifier = :data.email LIMIT 1];
return u;
}
global void updateUser(Id userId, Id portalId, Auth.UserData data){
}
}
Querying AuthProvider
SELECT Id,authorizeUrl,
consumerKey,consumerSecret,
customMetadataTypeRecord,
defaultScopes,
DeveloperName,
errorUrl,
friendlyName,
iconUrl,
idTokenIssuer,
LinkKickoffUrl,
logoutUrl,
OauthKickoffUrl,
providerType,
tokenUrl,
userInfoUrl
FROM AuthProvider
Log from the RegHandler
Metadata details of this AuthProvider
$ sfdx mohanc:mdapi:list -u mohan.chinnappan.n_ea2@gmail.com -t AuthProvider
{
result: [
{
createdById: '0053h000002xQ5sAAE',
createdByName: 'Mohan Chinnappan',
createdDate: 2020-12-11T17:06:31.000Z,
fileName: 'authproviders/Google.authprovider',
fullName: 'Google',
id: '0SO3h000000CwxxGAC',
lastModifiedById: '0053h000002xQ5sAAE',
lastModifiedByName: 'Mohan Chinnappan',
lastModifiedDate: 1970-01-01T00:00:00.000Z,
type: 'AuthProvider'
}
]
}
$ sfdx mohanc:mdapi:retrieve -u mohan.chinnappan.n_ea2@gmail.com -t AuthProvider
{
"RetrieveRequest": {
"apiVersion": "51.0",
"unpackaged": [
{
"types": {
"members": "*",
"name": "AuthProvider"
}
}
]
}
}
{ result: { done: false, id: '09S3h000005hIRGEA2', state: 'Queued' } }
$ sfdx mohanc:mdapi:checkRetrieveStatus -u mohan.chinnappan.n_ea2@gmail.com -i 09S3h000005hIRGEA2
[
{
createdById: '',
createdByName: '',
createdDate: 2020-12-11T17:06:31.000Z,
fileName: 'unpackaged/authproviders/Google.authprovider',
fullName: 'Google',
id: '0SO3h000000CwxxGAC',
lastModifiedById: '',
lastModifiedByName: '',
lastModifiedDate: 1970-01-01T00:00:00.000Z,
type: 'AuthProvider'
},
{
createdById: '0053h000002xQ5sAAE',
createdByName: 'Mohan Chinnappan',
createdDate: 2021-04-16T01:12:48.036Z,
fileName: 'unpackaged/package.xml',
fullName: 'unpackaged/package.xml',
id: '',
lastModifiedById: '0053h000002xQ5sAAE',
lastModifiedByName: 'Mohan Chinnappan',
lastModifiedDate: 2021-04-16T01:12:48.036Z,
manageableState: 'unmanaged',
type: 'Package'
}
]
=== Writing zipFile base64 content to 09S3h000005hIRGEA2.zip.txt ...
=== Writing zipFile binary content to 09S3h000005hIRGEA2.zip ...
$ jar tvf 09S3h000005hIRGEA2.zip
1054 Fri Apr 16 01:12:48 EDT 2021 unpackaged/authproviders/Google.authprovider
224 Fri Apr 16 01:12:48 EDT 2021 unpackaged/package.xml
$ unzip 09S3h000005hIRGEA2.zip
Archive: 09S3h000005hIRGEA2.zip
inflating: unpackaged/authproviders/Google.authprovider
inflating: unpackaged/package.xml
$ cat unpackaged/authproviders/Google.authprovider
<?xml version="1.0" encoding="UTF-8"?>
<AuthProvider xmlns="http://soap.sforce.com/2006/04/metadata">
<authorizeUrl>https://accounts.google.com/o/oauth2/auth</authorizeUrl>
<consumerKey>_junk_-fqktafcmesni74spapbj1ep0pvja4oor.apps.googleusercontent.com</consumerKey>
<consumerSecret>_____</consumerSecret>
<defaultScopes>email openid profile</defaultScopes>
<executionUser>mohan.chinnappan@email.com</executionUser>
<friendlyName>Google</friendlyName>
<includeOrgIdInIdentifier>false</includeOrgIdInIdentifier>
<providerType>Google</providerType>
<registrationHandler>AutocreatedRegHandler1618428423355</registrationHandler>
<sendAccessTokenInHeader>false</sendAccessTokenInHeader>
<sendClientCredentialsInHeader>false</sendClientCredentialsInHeader>
<sendSecretInApis>true</sendSecretInApis>
<tokenUrl>https://accounts.google.com/o/oauth2/token</tokenUrl>
<userInfoUrl>https://www.googleapis.com/oauth2/v3/userinfo</userInfoUrl>
</AuthProvider>
Test-Only Initialization URL
- https://mohansun-ea-02-dev-ed.my.salesforce.com/services/auth/test/Google
- Run Test Only
Getting AccessToken for the Google using Auth.AuthToken and UserInfo
// file: ~/.apex/oidc.cls
String authProviderId = '0SO3h000000Cwxx';
String providerName = 'Google';
String accessToken = Auth.AuthToken.getAccessToken(authProviderId, providerName);
System.debug('accessToken: ' + accessToken);
# you can also use developer console to execute above apex code instead of CLI
sfdx mohanc:tooling:execute -a ~/.apex/oidc.cls -u mohan.chinnappan.n_ea2@gmail.com
apexCode: String authProviderId = '0SO3h000000Cwxx';
String providerName = 'Google';
String accessToken = Auth.AuthToken.getAccessToken(authProviderId, providerName);
System.debug('accessToken: ' + accessToken);
compiled?: true
executed?: true
{
line: -1,
column: -1,
compiled: true,
success: true,
compileProblem: null,
exceptionStackTrace: null,
exceptionMessage: null
}
Check logs
sfdx force:apex:log:list -u mohan.chinnappan.n_ea2@gmail.com
APPLICATION DURATION (MS) ID LOCATION SIZE (B) LOG USER OPERATION REQUEST START TIME STATUS
─────────── ───────────── ────────────────── ────────── ──────── ──────────────── ───────────────────────────────────────────────────────── ─────────── ──────────────────────── ───────
Unknown 94 07L3h00000AvxqyEAB SystemLog 3640 Mohan Chinnappan /services/data/v57.0/tooling/executeAnonymous Api 2023-04-04T05:44:30+0000 Success
sfdx force:apex:log:get -i 07L3h00000AvxqyEAB -u mohan.chinnappan.n_ea2@gmail.com | grep -i Debug
57.0 APEX_CODE,FINEST;APEX_PROFILING,INFO;CALLOUT,INFO;DB,INFO;NBA,INFO;SYSTEM,DEBUG;VALIDATION,INFO;VISUALFORCE,FINER;WAVE,INFO;WORKFLOW,INFO
Execute Anonymous: System.debug('accessToken: ' + accessToken);
22:44:30.1 (7497609)|USER_DEBUG|[4]|DEBUG|accessToken: ya29.a0Ael9sCNXJZqO7O34pvUcb4sDkO9S0X3hbBTxtMnnZ1H00TJQq_mhaPAR5fXffydfasdfsarerrvSneMUdC8Wen9YOHYNn4rQ1sNtiHDFQqRQ6gtqGPBdrgHFz3kBsTEobU4OP59OJSIJJ1OO1kUkaCgYKAUYSARMSFQF4udJh3LjnChU-94GnI--qgC6QcA0166
If you have used Developer Console to execute the apex code, here is the view from the Developer Console:
Let us get the userInfo
# Note the accessToken used here is mangled to protect identity
curl https://www.googleapis.com/oauth2/v3/userinfo -H "Authorization: Bearer ya29.a0Ael9sCNXJZqO7O34pvUcb4sDkO9S0X3hbBTxtMnnZ1H00TJQq_mhadfdj3335fXffy4JsugBpVZSjY06npxOmlFJPvSneMUdC8Wen9YOHYNn4rQ1sNtiHDFQqRQ6gtqGPBdrgHFz3kBsTEobU4OP59OJSIJJ1OO1kUkaCgYKAUYSARMSFQF4udJh3LjnChU-94GnI--qgC6QcA0166" -X GET
{
"sub": "102467670031694452102",
"name": "Mohan Chinnappan",
"given_name": "Mohan",
"family_name": "Chinnappan",
"picture": "https://lh3.googleusercontent.com/a/AGNmyxbp471dpjPtNSeeeejyyNbh1fFJqlknZUaDqrmb\u003ds96-c",
"email": "mohan.chinnappan.n@gmail.com",
"email_verified": true,
"locale": "en"
}
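The same userinfo call can be made from Python's standard library. The token below is a placeholder, so the actual network call is left commented out:

```python
import urllib.request

# Placeholder token; in practice use the value returned by
# Auth.AuthToken.getAccessToken() as shown earlier
ACCESS_TOKEN = "ya29.EXAMPLE_TOKEN"

req = urllib.request.Request(
    "https://www.googleapis.com/oauth2/v3/userinfo",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
# With a real token, this performs the same call as the curl command above:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())  # the userinfo JSON
print(req.get_header("Authorization"))
```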
Viewing RegistrationHandlerUserUpdate log
sfdx force:apex:log:list -u mohan.chinnappan.n_ea2@gmail.com
APPLICATION DURATION (MS) ID LOCATION SIZE (B) LOG USER OPERATION REQUEST START TIME STATUS
─────────── ───────────── ────────────────── ────────── ──────── ──────────────── ───────────────────────────────────────────────────────── ─────────── ──────────────────────── ───────
Unknown 91 07L3h00000AvxrhEAB SystemLog 6559 Mohan Chinnappan RegistrationHandlerUserUpdate Application 2023-04-04T10:44:35+0000 Success
sfdx force:apex:log:get -i 07L3h00000AvxrhEAB -u mohan.chinnappan.n_ea2@gmail.com | grep -i Debug
03:44:35.5 (12059570)|USER_DEBUG|[25]|DEBUG|UserData:[attributeMap={email=mohan.chinnappan.n@gmail.com, email_verified=true, family_name=Chinnappan, given_name=Mohan, locale=en, name=Mohan Chinnappan, picture=https://lh3.googleusercontent.com/a/AGNmyxbp471dpjPtNSeeeejyyNbh1fFJqlknZUaDqrmb=s96-c, sub=102467670031694452102}, email=mohan.chinnappan.n@gmail.com, firstName=Mohan, fullName=Mohan Chinnappan, identifier=102467670031694452102, lastName=Chinnappan, link=null, locale=en, provider=Google, siteLoginUrl=null, username=null]
Resources
- Okta specific
- Videos
- Playgrounds
About OpenID Connect
- A simple identity layer on top of the OAuth 2.0 protocol
- Allows Clients (like service providers) to verify the identity of the End-User based on the authentication performed by an Authorization Server

OpenID Connect (OIDC) Provider
Coding
- Configure your provider by specifying:
  - the issuer URL
  - the client ID and secret
  - the scopes you want to use
- Implement the authorization endpoint (/authorize)
  - This endpoint handles user authentication and authorization
  - When a user attempts to log in, they are redirected to this endpoint
  - You will need to prompt the user to log in and authorize the requested scopes
Salesforce Configuration
URL | Link |
---|---|
Test-Only Initialization URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/auth/test/Google |
Single Sign-On Initialization URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/auth/sso/Google |
Existing User Linking URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/auth/link/Google |
OAuth-Only Initialization URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/auth/oauth/Google |
Callback URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/authcallback/Google |
Single Logout URL | https://mohansun-ea-02-dev-ed.my.salesforce.com/services/auth/rp/oidc/logout |
Google OIDC Provider Configuration
URL | Link |
---|---|
Authorize Endpoint URL | https://accounts.google.com/o/oauth2/auth |
Token Endpoint URL | https://accounts.google.com/o/oauth2/token |
User Info Endpoint URL | https://www.googleapis.com/oauth2/v3/userinfo |
Authorization endpoint
Digging deeper...
curl 'https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=113225562349-fqktafcmesni74spapbj1ep0pvja4oor.apps.googleusercontent.com&redirect_uri=https%3A%2F%2Fmohansun-ea-02-dev-ed.my.salesforce.com%2Fservices%2Fauthcallback%2FGoogle&scope=email+openid++profile&state=CAAAAYcyiOKSMDAwMDAwMDAwMDAwMDAwAAAA8muQ9vyTuOZ1uRL1-H30_n-abNbVuoLO32Tl1OlRdT-uaHJrpKY_02vYBvWlWtOxWsB0wODRqoyDZD6MyteiJcRmkeEQQFxjZXB-_lWFAjHW4brvhjPGWGoOCGyI1-NbUElsg_sbNRqfHBqKtmPCxR39uK8hAu88cAEKitoJUchOMhSa6TLbN1puVbOBQUQUY4gtawZ7wyNCfSQ5VDP0G6x7_EwYHpfCgbtv5zee_OGw0jSrRDGFh3hwN14bw6NJyw%3D%3D' \
-H 'authority: accounts.google.com' \
-H 'accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7' \
-H 'accept-language: en-US,en;q=0.9' \
-H 'cookie: SMSV=ADHTe-A4DYNhjtsQ9HsUTQtFvTNXHpLdZ04HxHbq7iwzxSqKK8K1AP3E3E1_jd0rItdNoqGd650_Sc-1cX0tvT8Cz7hqzClpq3a4YziVdbSO7lM61qvI4mI; LSOLH=_SVI_EPbFnLSe1PsCGA0iP01BRURIZl9xN1ZKemF1eGVDaHl1aGU2aVVZNk0zRVFESW1aSGpVWW1QVFNRVUV3dFVBVkwzWU5CNVh2NnBzQQ_:27829248:f334; ACCOUNT_CHOOSER=AFx_qI7V7VEYJej7Ykov04Vrzq5A4_lDKibmV59PmAF20VUie91gKs9wCHS1je3okc4e0Ash2J7IggG8rH_w9TIvFyLAmqCzLzReX9DHCyqfRWF9JW6Ay95sd9pW9Jj1chlVpYuldJuIsif1joMKWgghl4h5Y8PpLbmskYC_j4UcuLil6STA7rLWALpbBtClLyv8V4maNfl933_2cnloK8fgGF0OAXPHvPfOuX3i1eccntjfP2WnCVq1E5eO7PHG3WbS0ieW9HhAiBbAf7KPbQqVewLHt6S_o61MV-sfm1SxpYwXsd760WHCO-N2Wkt7od4dRoI_dM57nVf_LAJzp2Myl8WYTp27jU_c9rq4lVAkZO5EgxDo8xE; user_id=102467670031694452102; SID=Ugi1I85WSY9C5ab3s1crnWPFArrjJ9281xpuc75ZTh9aCJMbzZE_SzFdwcD9bTdPo9Bucw.; __Secure-1PSID=Ugi1I85WSY9C5ab3s1crnWPFArrjJ9281xpuc75ZTh9aCJMbIZaMWo88ze8emst2djLA6Q.; __Secure-3PSID=Ugi1I85WSY9C5ab3s1crnWPFArrjJ9281xpuc75ZTh9aCJMbfLml6pHozIpG1K4nIG1rFw.; LSID=o.calendar.google.com|o.chat.google.com|o.drive.google.com|o.groups.google.com|o.lens.google.com|o.mail.google.com|o.meet.google.com|o.play.google.com|o.store.google.com|s.blogger|s.youtube:Ugi1I8tfct6dQOO-qPyKgV4Ha29QOMbZr5ctdnf3Sf9HqxsPZd14eK8G30Swws7LSup9Ag.; __Host-1PLSID=o.calendar.google.com|o.chat.google.com|o.drive.google.com|o.groups.google.com|o.lens.google.com|o.mail.google.com|o.meet.google.com|o.play.google.com|o.store.google.com|s.blogger|s.youtube:Ugi1I8tfct6dQOO-qPyKgV4Ha29QOMbZr5ctdnf3Sf9HqxsPDhMdmIlVdR8c5dtYsdoWGg.; __Host-3PLSID=o.calendar.google.com|o.chat.google.com|o.drive.google.com|o.groups.google.com|o.lens.google.com|o.mail.google.com|o.meet.google.com|o.play.google.com|o.store.google.com|s.blogger|s.youtube:Ugi1I8tfct6dQOO-qPyKgV4Ha29QOMbZr5ctdnf3Sf9HqxsPOZdCNWhVBl79ZqQKnP6BAA.; HSID=AbpN7mtQ4ZrFiXUmT; SSID=A9My-LpWUK-E4r7AG; APISID=IbDmx8r0qLbrL-RR/AmOxvGGxLvSzxnsgI; SAPISID=eJ_nQyk1nKB5iKMM/AVN6ZTGMq7qWxaEVu; __Secure-1PAPISID=eJ_nQyk1nKB5iKMM/AVN6ZTGMq7qWxaEVu; 
__Secure-3PAPISID=eJ_nQyk1nKB5iKMM/AVN6ZTGMq7qWxaEVu; AEC=AUEFqZeU5ix63mgATKsT_B_8nJ5HMCvzMz865MhcSRFZkaImjtODFIogmw; NID=511=IjViYjqGfKANVt5V2z5b9pOeE6BacqtY7fuRlwNz-Oilmm3XhWFjSYUMG6YZJoRcIxPaQUilDoK8RDBz0LM0GkaCLVI8M1QffX4K-2N65AZ_gNUIfrwgP-KuRWGjoY-INvYwBfYjR7gESIj127KE9Ur6UkSVB3_qdfhdd37oHmQKbhmOckrzXwo8LiEtX08X-UA9yEbTHWjtQyRZUV3J2utDBl9omh1YFRteY4uss5AuL8J4NW5VUtUiVLZlcxuWJjjgarXg_EJz0OJ0F9q9jRPIyb2GHO9rnDpedHgun-z5MvuarhMgtdrk5jevwIRAN1WAh2HBpQoIYXMUpTaURroFSK7W1SuHG3Rk5FyoNpLFvJEdMNopCe3g4jLlIzMM; 1P_JAR=2023-03-30-12; __Host-GAPS=1:ZyvvyOwHAegzD3RyYvG5p98L6wA_zhvWdfbbqHx-lRSKP05LryST18rKEZfAxVCdMQTvucTa993UgRV_E9zZO1BTgKv4bw:IyvRFMgyJdq7vjgn; SIDCC=AFvIBn8xSu6nUAArPyQcTCvydpPlThjwVbbivL-8sNeWkRx5iHw_yvBJZH96HxHeyVDuwxY50F8R; __Secure-1PSIDCC=AFvIBn8H1BlHixXBGGbs5mqbhQjz3Tmc9KpJeZjEuyjEDbuTZd0dQm3OpMgTbNqAUz5QHbXWWs4; __Secure-3PSIDCC=AFvIBn_ZDNglnMEA17ZnzPuhdmQ8DwiVezhvI4B1iOTnRJMVfEYQjWwwHQbPRkk1OVu61guANIo' \
-H 'referer: https://mohansun-ea-02-dev-ed.my.salesforce.com/' \
-H 'sec-ch-ua: "Google Chrome";v="111", "Not(A:Brand";v="8", "Chromium";v="111"' \
-H 'sec-ch-ua-arch: "x86"' \
-H 'sec-ch-ua-bitness: "64"' \
-H 'sec-ch-ua-full-version: "111.0.5563.146"' \
-H 'sec-ch-ua-full-version-list: "Google Chrome";v="111.0.5563.146", "Not(A:Brand";v="8.0.0.0", "Chromium";v="111.0.5563.146"' \
-H 'sec-ch-ua-mobile: ?0' \
-H 'sec-ch-ua-model: ""' \
-H 'sec-ch-ua-platform: "macOS"' \
-H 'sec-ch-ua-platform-version: "13.2.1"' \
-H 'sec-ch-ua-wow64: ?0' \
-H 'sec-fetch-dest: document' \
-H 'sec-fetch-mode: navigate' \
-H 'sec-fetch-site: cross-site' \
-H 'sec-fetch-user: ?1' \
-H 'upgrade-insecure-requests: 1' \
-H 'user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36' \
-H 'x-chrome-id-consistency-request: version=1,client_id=77185425430.apps.googleusercontent.com,device_id=6774b99a-0361-4feb-8398-814743d0a935,signin_mode=all_accounts,signout_mode=show_confirmation' \
--compressed
url = 'https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=113225562349-fqktafcmesni74spapbj1ep0pvja4oor.apps.googleusercontent.com&redirect_uri=https%3A%2F%2Fmohansun-ea-02-dev-ed.my.salesforce.com%2Fservices%2Fauthcallback%2FGoogle&scope=email+openid++profile&state=CAAAAYcycWR5MDAwMDAwMDAwMDAwMDAwAAAA8vlkOR0tObD2YFyIGcyW5uNODCoVk-UbAFB67fKsPwsXLPrGFjlfWXpi9CQQgfXdm06to3jqNofBSOe4DvcoyDzCBtvnUrkrCTL9to_CblC3Qj4L48_E1vvIvjXlmVzjnSGRBiWYyu2GX8mups7UUBk9u8z7K1r-6VlBx3e6bt7ovlSqMeSz1-DUYuDKBI3pcBGTpgqmTbuCZE_6wi2LC7QY8Bo29ETSb4aakGKRGJb2Vx78NSMUODioVO4ZmGr9CA%3D%3D'
const urlParsed = new URL(url);
urlParsed
URL {
  hash: ""
  host: "accounts.google.com"
  hostname: "accounts.google.com"
  href: "https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=113225562349-fqktafcmesni74spapbj1ep0pvja4oor.apps.googleusercontent.com&redirect_uri=https%3A%2F%2Fmohansun-ea-02-dev-ed.my.salesforce.com%2Fservices%2Fauthcallback%2FGoogle&scope=email+openid++profile&state=CAAAAYcycWR5MDAwMDAwMDAwMDAwMDAwAAAA8vlkOR0tObD2YFyIGcyW5uNODCoVk-UbAFB67fKsPwsXLPrGFjlfWXpi9CQQgfXdm06to3jqNofBSOe4DvcoyDzCBtvnUrkrCTL9to_CblC3Qj4L48_E1vvIvjXlmVzjnSGRBiWYyu2GX8mups7UUBk9u8z7K1r-6VlBx3e6bt7ovlSqMeSz1-DUYuDKBI3pcBGTpgqmTbuCZE_6wi2LC7QY8Bo29ETSb4aakGKRGJb2Vx78NSMUODioVO4ZmGr9CA%3D%3D"
  origin: "https://accounts.google.com"
  password: ""
  pathname: "/o/oauth2/auth"
  port: ""
  protocol: "https:"
  search: "?response_type=code&client_id=113225562349-fqktafcmesni74spapbj1ep0pvja4oor.apps.googleusercontent.com&redirect_uri=https%3A%2F%2Fmohansun-ea-02-dev-ed.my.salesforce.com%2Fservices%2Fauthcallback%2FGoogle&scope=email+openid++profile&state=CAAAAYcycWR5MDAwMDAwMDAwMDAwMDAwAAAA8vlkOR0tObD2YFyIGcyW5uNODCoVk-UbAFB67fKsPwsXLPrGFjlfWXpi9CQQgfXdm06to3jqNofBSOe4DvcoyDzCBtvnUrkrCTL9to_CblC3Qj4L48_E1vvIvjXlmVzjnSGRBiWYyu2GX8mups7UUBk9u8z7K1r-6VlBx3e6bt7ovlSqMeSz1-DUYuDKBI3pcBGTpgqmTbuCZE_6wi2LC7QY8Bo29ETSb4aakGKRGJb2Vx78NSMUODioVO4ZmGr9CA%3D%3D"
  searchParams: URLSearchParams {}
  username: ""
}
urlParsed.search
'?response_type=code&client_id=113225562349-fqktafcmesni74spapbj1ep0pvja4oor.apps.googleusercontent.com&redirect_uri=https%3A%2F%2Fmohansun-ea-02-dev-ed.my.salesforce.com%2Fservices%2Fauthcallback%2FGoogle&scope=email+openid++profile&state=CAAAAYcycWR5MDAwMDAwMDAwMDAwMDAwAAAA8vlkOR0tObD2YFyIGcyW5uNODCoVk-UbAFB67fKsPwsXLPrGFjlfWXpi9CQQgfXdm06to3jqNofBSOe4DvcoyDzCBtvnUrkrCTL9to_CblC3Qj4L48_E1vvIvjXlmVzjnSGRBiWYyu2GX8mups7UUBk9u8z7K1r-6VlBx3e6bt7ovlSqMeSz1-DUYuDKBI3pcBGTpgqmTbuCZE_6wi2LC7QY8Bo29ETSb4aakGKRGJb2Vx78NSMUODioVO4ZmGr9CA%3D%3D'
urlParsed.search.split('&')
(5) ['?response_type=code', 'client_id=113225562349-fqktafcmesni74spapbj1ep0pvja4oor.apps.googleusercontent.com', 'redirect_uri=https%3A%2F%2Fmohansun-ea-02-dev-ed.m…salesforce.com%2Fservices%2Fauthcallback%2FGoogle', 'scope=email+openid++profile', 'state=CAAAAYcycWR5MDAwMDAwMDAwMDAwMDAwAAAA8vlkOR0t…Y8Bo29ETSb4aakGKRGJb2Vx78NSMUODioVO4ZmGr9CA%3D%3D']
Object.assign({}, urlParsed.search.split('&') );
{
0: '?response_type=code',
1: 'client_id=113225562349-fqktafcmesni74spapbj1ep0pvja4oor.apps.googleusercontent.com',
2: 'redirect_uri=https%3A%2F%2Fmohansun-ea-02-dev-ed.m…salesforce.com%2Fservices%2Fauthcallback%2FGoogle',
3: 'scope=email+openid++profile',
4: 'state=CAAAAYcycWR5MDAwMDAwMDAwMDAwMDAwAAAA8vlkOR0t…Y8Bo29ETSb4aakGKRGJb2Vx78NSMUODioVO4ZmGr9CA%3D%3D'
}
response_type: code
client_id: 113225562349-fqktafcmesni74spapbj1ep0pvja4oor.apps.googleusercontent.com
redirect_uri: https://mohansun-ea-02-dev-ed.my.salesforce.com/services/authcallback/Google
scope: email openid profile
state: CAAAAYcycWR5MDAwMDAwMDAwMDAwMDAwAAAA8vlkOR0tObD2YFyIGcyW5uNODCoVk-UbAFB67fKsPwsXLPrGFjlfWXpi9CQQgfXdm06to3jqNofBSOe4DvcoyDzCBtvnUrkrCTL9to_CblC3Qj4L48_E1vvIvjXlmVzjnSGRBiWYyu2GX8mups7UUBk9u8z7K1r-6VlBx3e6bt7ovlSqMeSz1-DUYuDKBI3pcBGTpgqmTbuCZE_6wi2LC7QY8Bo29ETSb4aakGKRGJb2Vx78NSMUODioVO4ZmGr9CA==
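The same decomposition can be done with a few lines of Python. This is an illustrative sketch: the client_id and state values below are shortened placeholders, not the full captured values.

```python
# Parse an OAuth 2.0 / OIDC authorization request URL into its parameters.
from urllib.parse import urlsplit, parse_qs

# Hypothetical, shortened example URL (real client_id/state values are much longer)
url = ("https://accounts.google.com/o/oauth2/auth"
       "?response_type=code"
       "&client_id=113225562349-...oor.apps.googleusercontent.com"
       "&redirect_uri=https%3A%2F%2Fmohansun-ea-02-dev-ed.my.salesforce.com%2Fservices%2Fauthcallback%2FGoogle"
       "&scope=email+openid+profile"
       "&state=CAAAAYcy...")

parts = urlsplit(url)
params = parse_qs(parts.query)  # values arrive percent-decoded, one list per name
for name, values in params.items():
    print(name, "=", values[0])
```

Note that parse_qs percent-decodes the redirect_uri and turns the `+` separators in scope back into spaces, matching the decoded parameter list above.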
- Consent
https://accounts.google.com/signin/oauth/consent?authuser=0&part=AJi8hAPPKVJYXZWoTwsAvFBuRYB7qFqGMagu55E253ysGRFqZgWZMdOjqEMP5YqvtEdf0B566yMiguqZSSZ4lbgsLdyErCN7Zmibhm9pjMTeJsy4kIibA-O8lK314NlUmbAVPEyLNZ8AbCQkVbBb_9qAdHxdEK7rkD0KJbOhKcFL0BO_SNbUbwQLmABvm3r7uorry7AuxtvDw1w1ok8CI6u08IDtVKXx5LnDSjFvn1U93GRtxT4DfklSsDbHjcPTB-a7Qj_98dbo0L9FojzSq1l8IltdnWL6YvT0GkXO_WVHb5Mh5Iu8vpDaBD1vaYCIlG5A
authuser: 0
part: AJi8hAPPKVJYXZWoTwsAvFBuRYB7qFqGMagu55E253ysGRFqZgWZMdOjqEMP5YqvtEdf0B566yMiguqZSSZ4lbgsLdyErCN7Zmibhm9pjMTeJsy4kIibA-O8lK314NlUmbAVPEyLNZ8AbCQkVbBb_9qAdHxdEK7rkD0KJbOhKcFL0BO_SNbUbwQLmABvm3r7uorry7AuxtvDw1w1ok8CI6u08IDtVKXx5LnDSjFvn1U93GRtxT4DfklSsDbHjcPTB-a7Qj_98dbo0L9FojzSq1l8IltdnWL6YvT0GkXO_WVHb5Mh5Iu8vpDaBD1vaYCIlG5AuPU6JRCQY1a5SGdDLribsqCPzuE9z9wrqbmU_iFHpcl8vhdGBnqLoEjkNgzmb3V6TlxWAkWEuti6r8y8dkWuduYd0rGcRersJ_pD3f8rtHT00IVTu0mB7tGgN6UEWb8rey9D4v6jxgK_el8fXIFiAByfO1ni_1pZj1DWvRvp4IDRYzowLa4oxU0GSPZNTmRdT9r-qCMC
as: S-1290867910:1680177605126845
client_id: 113225562349-fqktafcmesni74spapbj1ep0pvja4oor.apps.googleusercontent.com
pli: 1
rapt: AEjHL4PiKkA1sGph8xQqIaNrSv-lknPQyYaBt15Hc4GkzOB7AwQX-wQTo3W-dsAfETT82g0Ybf3TmqLglD1kasQo5QkGWw5mPA
https://mohansun-ea-02-dev-ed.my.salesforce.com/services/authcallback/Google?state=CAAAAYcycWR5MDAwMDAwMDAwMDAwMDAwAAAA8vlkOR0tObD2YFyIGcyW5uNODCoVk-UbAFB67fKsPwsXLPrGFjlfWXpi9CQQgfXdm06to3jqNofBSOe4DvcoyDzCBtvnUrkrCTL9to_CblC3Qj4L48_E1vvIvjXlmVzjnSGRBiWYyu2GX8mups7UUBk9u8z7K1r-6VlBx3e6bt7ovlSqMeSz1-DUYuDKBI3pcBGTpgqmTbuCZE_6wi2LC7QY8Bo29ETSb4aakGKRGJb2Vx78NSMUODioVO4ZmGr9CA==&code=4/0AVHEtk7MijYSTwo9rTHxlZksTZHx7ua6suRe3d9ISEVIPGhQNazKbJqPabgNko1gukZumw&scope=email+profile+https://www.googleapis.com/auth/userinfo.email+https://www.googleapis.com/auth/userinfo.profile+openid&authuser=0&prompt=none
state: CAAAAYcycWR5MDAwMDAwMDAwMDAwMDAwAAAA8vlkOR0tObD2YFyIGcyW5uNODCoVk-UbAFB67fKsPwsXLPrGFjlfWXpi9CQQgfXdm06to3jqNofBSOe4DvcoyDzCBtvnUrkrCTL9to_CblC3Qj4L48_E1vvIvjXlmVzjnSGRBiWYyu2GX8mups7UUBk9u8z7K1r-6VlBx3e6bt7ovlSqMeSz1-DUYuDKBI3pcBGTpgqmTbuCZE_6wi2LC7QY8Bo29ETSb4aakGKRGJb2Vx78NSMUODioVO4ZmGr9CA==
code: 4/0AVHEtk7MijYSTwo9rTHxlZksTZHx7ua6suRe3d9ISEVIPGhQNazKbJqPabgNko1gukZumw
scope: email profile https://www.googleapis.com/auth/userinfo.email https://www.googleapis.com/auth/userinfo.profile openid
authuser: 0
prompt: none
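The next step, which Salesforce performs server-side for the Auth. Provider, is to exchange this code for tokens at the token endpoint. A minimal sketch of that request; the code, client_id, and client_secret values are hypothetical placeholders, and build_token_request is a helper name of my own:

```python
# Build the authorization-code token exchange request (RFC 6749, section 4.1.3).
from urllib.parse import urlencode

def build_token_request(code: str, client_id: str, client_secret: str, redirect_uri: str):
    """Return (endpoint, form-encoded body) for the code-for-token exchange."""
    endpoint = "https://accounts.google.com/o/oauth2/token"
    body = urlencode({
        "grant_type": "authorization_code",
        "code": code,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
    })
    return endpoint, body

endpoint, body = build_token_request(
    code="4/0AVHEtk7...",                      # truncated placeholder for the callback code
    client_id="113225562349-...apps.googleusercontent.com",
    client_secret="<client-secret>",           # hypothetical
    redirect_uri="https://mohansun-ea-02-dev-ed.my.salesforce.com/services/authcallback/Google",
)
print(endpoint)
print(body)
# POST this body with Content-Type: application/x-www-form-urlencoded
```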
curl 'https://mohansun-ea-02-dev-ed.file.force.com/secur/contentDoor?startURL=https%3A%2F%2Fmohansun-ea-02-dev-ed.my.salesforce.com%2Fsetup%2FforcecomHomepage.apexp%3Fsetupid%3DForceCom&sid=00D3h000007R1Lu%21AR0AQMK0XqIiTcFHijiGI.5J303.dtTTgZqyYbnK6iu6_mBrIxFkmFzqShvE6c8FY4ELjJ1Z5cl5_NDV27FylouPaLDmbWKl&skipRedirect=1&lm=eyJlbmMiOiJBMjU2R0NNIiwiYXVkIjoiMDBEM2gwMDAwMDdSMUx1Iiwia2lkIjoie1widFwiOlwiMDBEM2gwMDAwMDdSMUx1XCIsXCJ2XCI6XCIwMkczaDAwMDAwMFFGWE9cIixcImFcIjpcImNvbnRlbnRkb29ydXNlcnRyYW5zaWVudGtleWVuY3J5cHRcIixcInVcIjpcIjAwNTNoMDAwMDAyeFE1c1wifSIsImNyaXQiOlsiaWF0Il0sImlhdCI6MTY4MDE3OTE3NjQ2NiwiZXhwIjowfQ%3D%3D..BEIS8BDm_kfKDL9I.IC5eP6YZO2nhu-WSWP56HA%3D%3D.cadev8_F5qhzIdAzzcv-vQ%3D%3D' \
-H 'authority: mohansun-ea-02-dev-ed.file.force.com' \
-H 'accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7' \
-H 'accept-language: en-US,en;q=0.9' \
-H $'cookie: _ce.s=v~521a935acbeb20c80c9fcd40ee0d1e47ceeb29fb~vpv~0; _ga_1HBYZJY0L5=GS1.1.1650657815.1.0.1650657821.0; _ga=GA1.2.731290433.1606657544; sid_Client=h000002xQ5sh000007R1Lu; clientSrc=24.188.37.149; BrowserId_sec=Ga1_6sSXEe26w5WqP0vh9g; CookieConsentPolicy=0:1; LSKey-c$CookieConsentPolicy=0:1; BrowserId=Pi75EcoLEe2nss1S-y4GIg; sid=00D3h000007R1Lu\u0021AR0AQPQxwTIi4..zhspyWG5FXJJqqH9TMtpFr2ZkqpQH5WaBlHkivD7aJpjJlD884mCseXoShUalu3IdLjOr7GBNOSxY5I2p; inst=APP_3h' \
-H 'referer: https://mohansun-ea-02-dev-ed.my.salesforce.com/' \
-H 'sec-ch-ua: "Google Chrome";v="111", "Not(A:Brand";v="8", "Chromium";v="111"' \
-H 'sec-ch-ua-mobile: ?0' \
-H 'sec-ch-ua-platform: "macOS"' \
-H 'sec-fetch-dest: document' \
-H 'sec-fetch-mode: navigate' \
-H 'sec-fetch-site: cross-site' \
-H 'upgrade-insecure-requests: 1' \
-H 'user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36' \
--compressed
oid: 00D3h000007R1Lu
retURL: https://mohansun-ea-02-dev-ed.file.force.com/secur/contentDoor?startURL=https%3A%2F%2Fmohansun-ea-02-dev-ed.my.salesforce.com%2Fsetup%2FforcecomHomepage.apexp%3Fsetupid%3DForceCom&sid=00D3h000007R1Lu%21AR0AQIXvv71zFeZY_eRCCFTHPs2.xrl69eBVSmaBJX.wDdf7_Q5gCtWf70K4Ao9VKDmd0k616oMgBIyR2_ISlLBGl1eim57i&skipRedirect=1&lm=eyJlbmMiOiJBMjU2R0NNIiwiYXVkIjoiMDBEM2gwMDAwMDdSMUx1Iiwia2lkIjoie1widFwiOlwiMDBEM2gwMDAwMDdSMUx1XCIsXCJ2XCI6XCIwMkczaDAwMDAwMFFGWE9cIixcImFcIjpcImNvbnRlbnRkb29ydXNlcnRyYW5zaWVudGtleWVuY3J5cHRcIixcInVcIjpcIjAwNTNoMDAwMDAyeFE1c1wifSIsImNyaXQiOlsiaWF0Il0sImlhdCI6MTY4MDE3ODQ4MjAyMywiZXhwIjowfQ%3D%3D..sxLY9IUn87nIGIuv.DylJd7gqisXaz0tZZn66uw%3D%3D.g8WTiSfkUkPNnp7NjManyA%3D%3D
startURL: https://mohansun-ea-02-dev-ed.my.salesforce.com/setup/forcecomHomepage.apexp?setupid=ForceCom
sid: 00D3h000007R1Lu!AR0AQMK0XqIiTcFHijiGI.5J303.dtTTgZqyYbnK6iu6_mBrIxFkmFzqShvE6c8FY4ELjJ1Z5cl5_NDV27FylouPaLDmbWKl
skipRedirect: 1
lm: eyJlbmMiOiJBMjU2R0NNIiwiYXVkIjoiMDBEM2gwMDAwMDdSMUx1Iiwia2lkIjoie1widFwiOlwiMDBEM2gwMDAwMDdSMUx1XCIsXCJ2XCI6XCIwMkczaDAwMDAwMFFGWE9cIixcImFcIjpcImNvbnRlbnRkb29ydXNlcnRyYW5zaWVudGtleWVuY3J5cHRcIixcInVcIjpcIjAwNTNoMDAwMDAyeFE1c1wifSIsImNyaXQiOlsiaWF0Il0sImlhdCI6MTY4MDE3OTE3NjQ2NiwiZXhwIjowfQ==..BEIS8BDm_kfKDL9I.IC5eP6YZO2nhu-WSWP56HA==.cadev8_F5qhzIdAzzcv-vQ==
jwt:
eyJlbmMiOiJBMjU2R0NNIiwiYXVkIjoiMDBEM2gwMDAwMDdSMUx1Iiwia2lkIjoie1widFwiOlwiMDBEM2gwMDAwMDdSMUx1XCIsXCJ2XCI6XCIwMkczaDAwMDAwMFFGWE9cIixcImFcIjpcImNvbnRlbnRkb29ydXNlcnRyYW5zaWVudGtleWVuY3J5cHRcIixcInVcIjpcIjAwNTNoMDAwMDAyeFE1c1wifSIsImNyaXQiOlsiaWF0Il0sImlhdCI6MTY4MDE3OTE3NjQ2NiwiZXhwIjowfQ==..BEIS8BDm_kfKDL9I.IC5eP6YZO2nhu-WSWP56HA==.cadev8_F5qhzIdAzzcv-vQ==
- header
{
"enc": "A256GCM",
"aud": "00D3h000007R1Lu",
"kid": "{\"t\":\"00D3h000007R1Lu\",\"v\":\"02G3h000000QFXO\",\"a\":\"contentdoorusertransientkeyencrypt\",\"u\":\"0053h000002xQ5s\"}",
"crit": [
"iat"
],
"iat": 1680179176466,
"exp": 0
}
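The lm value is a dot-separated token whose header segment is base64url-encoded and readable without any key, which is how the header above was recovered. A small Python sketch of that decoding step, using a hypothetical header value:

```python
# Base64url-decode the header segment (before the first dot) of a JWT/JWE-style token.
import base64
import json

def decode_segment(segment: str) -> dict:
    """Decode one dot-separated, base64url-encoded token segment into a dict."""
    padded = segment + "=" * (-len(segment) % 4)  # restore the stripped '=' padding
    return json.loads(base64.urlsafe_b64decode(padded))

# Build a hypothetical token: header.payload.signature (only the header matters here)
header = {"enc": "A256GCM"}
encoded = base64.urlsafe_b64encode(json.dumps(header).encode()).rstrip(b"=").decode()
token = encoded + ".payload.signature"

print(decode_segment(token.split(".")[0]))  # → {'enc': 'A256GCM'}
```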
Name | Requirement | Description |
---|---|---|
response_type | REQUIRED | code for Authorization Code Grant and token for Implicit Grant. |
client_id | REQUIRED | The client ID of the client application making the authorization request. A client ID is an opaque number or string issued by a service. |
redirect_uri | OPTIONAL | The URL to which the client application requests the result of the authorization request to be reported. If the client application has registered multiple redirect URIs or has not registered any redirect URI (this is allowed when the client type of the client application is “confidential”), this request parameter is REQUIRED. |
scope | OPTIONAL | A space-delimited list of scopes (permissions) that the client application requires. The service defines a set of valid scope values. Some implementations of OAuth 2.0 (e.g. Facebook) do not correctly implement the specification, requiring a comma-delimited list. |
state | RECOMMENDED | An opaque value that the client application may use. If this request parameter is contained in the authorization request, it is returned to the redirect URI as a query parameter. |
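Putting the table together, an authorization request URL can be assembled as below. This is a sketch: the client_id, redirect_uri, and state values are hypothetical placeholders.

```python
# Assemble an authorization request URL from the parameters in the table above.
from urllib.parse import urlencode

AUTHORIZE_ENDPOINT = "https://accounts.google.com/o/oauth2/auth"

params = {
    "response_type": "code",                  # REQUIRED: Authorization Code Grant
    "client_id": "<your-client-id>",          # REQUIRED: placeholder
    "redirect_uri": "https://example.my.salesforce.com/services/authcallback/Google",
    "scope": "openid email profile",          # OPTIONAL: space-delimited
    "state": "<opaque-anti-CSRF-value>",      # RECOMMENDED: echoed back on the redirect
}
url = AUTHORIZE_ENDPOINT + "?" + urlencode(params)
print(url)
```

urlencode takes care of percent-encoding; the spaces in scope become `+`, as seen in the captured request earlier.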
# Implement your authorization endpoint here
# https://www.authlete.com/developers/definitive_guide/authorization_endpoint_spec/
@app.route('/authorize')
def authorize():
    # ...
    pass
- Implement the token endpoint
/token
- This endpoint handles the exchange of an authorization code for an access token and ID token
# Implement your token endpoint here
@app.route('/token', methods=['POST'])
def token():
    # ...
    pass
- Implement the userinfo endpoint
/userinfo
- This endpoint returns information about the authenticated user
# Implement your userinfo endpoint here
@app.route('/userinfo')
def userinfo():
    # ...
    pass
- Implement the logout endpoint
- This endpoint handles the user logout process
# Implement your logout endpoint here
@app.route('/logout')
def logout():
    # ...
    pass
Code outline
from flask import Flask, redirect, url_for
from flask_oidc import OpenIDConnect

app = Flask(__name__)
app.config.update({
    'SECRET_KEY': 'secret',
    'OIDC_CLIENT_SECRETS': 'client_secrets.json',
    'OIDC_ID_TOKEN_COOKIE_SECURE': False,
    'OIDC_REQUIRE_VERIFIED_EMAIL': False,
    'OIDC_VALID_ISSUERS': 'https://accounts.example.com',
    'OIDC_SCOPES': ['openid', 'email', 'profile'],
})

oidc = OpenIDConnect(app)

@app.route('/')
@oidc.require_login
def index():
    user_info = oidc.user_getinfo(['sub', 'email', 'preferred_username'])
    return 'Hello, {} ({})'.format(user_info['preferred_username'], user_info['email'])

# Implement your authorization endpoint here
@app.route('/authorize')
def authorize():
    # ...
    pass

# Implement your token endpoint here
@app.route('/token', methods=['POST'])
def token():
    # ...
    pass

# Implement your userinfo endpoint here
@app.route('/userinfo')
def userinfo():
    # ...
    pass

# Implement your logout endpoint here
@app.route('/logout')
def logout():
    # ...
    pass

if __name__ == '__main__':
    app.run()
SECRET_KEY: A secret key for Flask to use for its session management.
OIDC_CLIENT_SECRETS: A path to a JSON file containing client secrets for your OIDC provider.
OIDC_ID_TOKEN_COOKIE_SECURE: Whether or not the ID token cookie should be marked as secure.
OIDC_REQUIRE_VERIFIED_EMAIL: Whether or not email verification should be required for OIDC logins.
OIDC_VALID_ISSUERS: A string or list of valid issuer URLs for OIDC logins.
OIDC_SCOPES: A list of scopes to request from the OIDC provider.
const { Issuer, Strategy } = require('openid-client');

Issuer.discover('https://accounts.example.com')
  .then((issuer) => {
    const client = new issuer.Client({
      client_id: 'client_id',
      client_secret: 'client_secret',
    });

    const params = {
      redirect_uri: 'https://example.com/callback',
      scope: 'openid email profile',
    };

    const strategy = new Strategy(
      {
        client,
        params,
        passReqToCallback: false,
        usePKCE: 'S256',
      },
      (tokenset, userinfo, done) => {
        // Handle successful authentication here
      }
    );

    // The routes below assume an Express `app` created elsewhere,
    // e.g. const app = require('express')();

    // Implement your authorization endpoint here
    app.get('/authorize', (req, res) => {
      // ...
    });

    // Implement your token endpoint here
    app.post('/token', (req, res) => {
      // ...
    });

    // Implement your userinfo endpoint here
    app.get('/userinfo', (req, res) => {
      // ...
    });

    // Implement your logout endpoint here
    app.post('/logout', (req, res) => {
      // ...
    });
  })
  .catch((err) => {
    console.error(err);
  });
References
- Authorization Server Implementation for Deno
- An Illustrated Guide to OAuth and OpenID Connect
- Configure an Authentication Provider Using OpenID Connect
- RegistrationHandler Interface
JWT (JSON Web Tokens)
sequenceDiagram
    title: JWT - JSON Web Token - GoogleOIDC
    User->>SFDC: I need to login into the Org
    SFDC->>User: OK, can you prove your identity? I will send you to Google OIDC,<br/> since that is configured as your authProvider
    User->>GoogleOIDC: Login
    GoogleOIDC->>SFDC: Here is the ID proof of the User, JWT
    SFDC->>User: You are logged-in, go ahead and use the Org
    User->>SFDC: Performs work in the Org: Authorization: Bearer JWT
Parts of JWT
Header.Payload.Signature
Part | Description |
---|---|
Header | {typ: "JWT", alg: "HS256"} |
Payload | |
> Registered Claims | {exp: 1679913812248, sub: "Subject here"} |
> Public Claims | {name: "Ken Thompson"} |
> Private Claim | {admin: true} |
Signature | HMACSHA256( base64URLencode(Header) + "." + base64URLencode(Payload), secret ) |
Signature
const signature = HMACSHA256( base64URLencode(Header) + "." + base64URLencode(Payload), secret );
- Verification: The server recalculates the signature from the data present in the token (JWT) and compares it with the signature that came with the token. If a malicious user tampers with anything in the header or payload, the server's calculated signature won't match the received signature.
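This tamper check (HS256 is HMAC-SHA256) can be sketched in a few lines of Python; the secret and claims below are made-up values, and the helper names are my own:

```python
# HS256 tamper check: recompute the HMAC-SHA256 signature and compare.
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without the trailing '=' padding, as JWT does."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(header: dict, payload: dict, secret: str) -> str:
    """Create header.payload.signature with an HMAC-SHA256 signature."""
    signing_input = b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(payload).encode())
    sig = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

def verify(token: str, secret: str) -> bool:
    """Recompute the signature over header.payload and compare in constant time."""
    signing_input, _, sig = token.rpartition(".")
    expected = b64url(hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest())
    return hmac.compare_digest(expected, sig)

token = sign({"typ": "JWT", "alg": "HS256"}, {"sub": "SUB1", "admin": True}, "mySecret")
print(verify(token, "mySecret"))  # True

# An attacker can re-encode a tampered payload, but cannot recompute the
# signature without the secret, so verification fails.
parts = token.split(".")
forged = parts[0] + "." + b64url(json.dumps({"sub": "SUB1", "admin": False}).encode()) + "." + parts[2]
print(verify(forged, "mySecret"))  # False
```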
Code
- jwt util module (jwt.ts)
// jwt.ts
// Exports:
//   createJWT
//   verifyJWT
import { encode, decode } from "https://deno.land/std/encoding/base64url.ts";
import { HmacSha256 } from "https://deno.land/std@0.90.0/hash/sha256.ts";

// HS256 = HMAC-SHA256: a keyed hash over header.payload
function sign(data: string, secret: string): string {
  return new HmacSha256(secret).update(data).toString();
}

// create JWT
export function createJWT(header: any, payload: any, secret: string): string {
  const eHeader = encode(new TextEncoder().encode(JSON.stringify(header)));
  const ePayload = encode(new TextEncoder().encode(JSON.stringify(payload)));
  const token = `${eHeader}.${ePayload}`;
  return token + "." + sign(token, secret);
}

// verification
export function verifyJWT(token: string, secret: string): any {
  // get the parts
  const parts = token.split(".");
  if (parts.length !== 3) return "not enough items";
  const [header, payload, signature] = parts;
  const calcSign = sign(`${header}.${payload}`, secret);
  if (calcSign !== signature) return "invalid signature";
  const payloadOut = JSON.parse(new TextDecoder().decode(decode(payload)));
  if (payloadOut.exp && Date.now() > payloadOut.exp) return `EXPIRED:${JSON.stringify(payloadOut)}`; // expired
  return payloadOut;
}
- using jwt.ts
// jwtApp.ts
// uses jwt.ts in the local
// run:
// . deno run jwtApp.ts
import {createJWT, verifyJWT} from './jwt.ts';
const secret = "mySecret";
const jwt1 = createJWT(
{ typ: "JWT", alg: "HS256" },
{ exp: 1679913812248, sub: "SUB1", name: "Ken Thompson", admin: true },
secret
);
console.log('JWT:', jwt1);
console.log('Verification:', verifyJWT(jwt1, secret));
- Testing
deno run jwtApp.ts
JWT: eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJleHAiOjE2Nzk5MTM4MTIyNDgsInN1YiI6IlNVQjEiLCJuYW1lIjoiS2VuIFRob21wc29uIiwiYWRtaW4iOnRydWV9.<hex-encoded HMAC-SHA256 signature>
Verification: EXPIRED:{"exp":1679913812248,"sub":"SUB1","name":"Ken Thompson","admin":true}
- Testing at jwt.io
Using CLI
- Create
cat payload.json
{ "exp": 1679913812248, "sub": "SUB1", "name": "Ken Thompson", "admin": true }
sfdx mohanc:security:jwt:create -i payload.json -s "mySecret"
eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE2Nzk5MTM4MTIyNDgsInN1YiI6IlNVQjEiLCJuYW1lIjoiS2VuIFRob21wc29uIiwiYWRtaW4iOnRydWUsImlhdCI6MTY3OTkxNzc0OX0.B0KltHsNOUBlPjA8YEdpXKY24ZsGpJI67GBzOW-jMr8
- Verify
sfdx mohanc:security:jwt:verify -s "mySecret" -j "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE2Nzk5MTM4MTIyNDgsInN1YiI6IlNVQjEiLCJuYW1lIjoiS2VuIFRob21wc29uIiwiYWRtaW4iOnRydWUsImlhdCI6MTY3OTkxODQ3NX0.KvYu4MvjmTsX3gfEow7eAHY7PspDV465RCb4tkIme2Y"
{
exp: 1679913812248,
sub: 'SUB1',
name: 'Ken Thompson',
admin: true,
iat: 1679918475
}
OpenSSL
- Secret Generation using OpenSSL
openssl rand -hex 32
bb5e5ae325f4cfb8a9215602ef3964efc79848232d7c8715e621923692a27669
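The same secret generation is available in Python's standard library, equivalent to openssl rand -hex 32:

```python
# Generate a 32-byte (256-bit) random secret, hex-encoded, suitable for HS256 signing.
import secrets

key = secrets.token_hex(32)  # 32 random bytes -> 64 hex characters
print(key)
```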
Penetration (pen) Testing
Penetration testing, or pen testing for short, is the process of testing a website or other digital system for vulnerabilities that could be exploited by attackers. It typically involves simulating various types of attacks to see how the system responds and to identify potential weaknesses.
Steps
- Note: Unauthorized pen testing is illegal and could result in criminal charges. Take appropriate precautions to avoid causing any harm or disruption to the website or its users during the testing process.
Step | Description |
---|---|
1. Define the scope of the test | Determine what parts of the website will be tested, what types of attacks will be simulated, and any limitations or constraints that need to be taken into consideration. |
2. Reconnaissance/Scouting | Conduct a thorough reconnaissance of the website and any associated systems to identify potential vulnerabilities or weaknesses. Ping probes, port scanning, traceroute, and fingerprinting are practical examples of active reconnaissance. |
3. Vulnerability scanning | Use automated tools to scan the website and associated systems for known vulnerabilities. Examples: SQL injection or cross-site scripting (XSS) vulnerabilities. |
4. Manual testing | Perform manual testing to identify any additional vulnerabilities or weaknesses that may not have been identified by automated tools. |
5. Exploitation | Attempt to exploit any vulnerabilities that have been identified to see if they can be used to gain unauthorized access or perform other malicious activities. |
6. Reporting | Document all findings and provide recommendations for remediation |
Reconnaissance/Scouting
The process of gathering information about a target system or network before launching a pen test. It is an important step since it helps the tester identify potential vulnerabilities and weaknesses in the target system, network, or application.
Step | Description |
---|---|
Passive reconnaissance | This involves gathering information about the target system, network or application without actively engaging with it. This includes searching for information on search engines, social media, and other publicly available sources. |
Active reconnaissance | This involves actively engaging with the target system, network or application to gather information. This includes techniques such as port scanning, network mapping, and fingerprinting. |
Social engineering | This involves using psychological manipulation to gather information from people associated with the target system, network or application. This includes techniques such as phishing, pre-texting, and baiting. |
Open-source intelligence (OSINT) | This involves gathering information from publicly available sources such as company websites, job postings, and news articles. |
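Active reconnaissance techniques such as port scanning come down to probing whether a service answers on a port. A minimal TCP connect check in Python illustrates the idea (port_is_open is a name of my own; real scanners like Nmap do far more). Run it only against systems you are explicitly authorized to test.

```python
# Minimal TCP connect "scan" of a single port (active reconnaissance).
# Only probe hosts you are explicitly authorized to test.
import socket

def port_is_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example against the local machine only
print(port_is_open("127.0.0.1", 22))
```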
Vulnerability scanning
Step | Description |
---|---|
1. Planning | Identify the scope of the scan, including the systems, networks, or applications to be scanned, and the types of vulnerabilities to be checked. |
2. Scanning | Use a vulnerability scanning tool to scan the target systems, networks, or applications to identify vulnerabilities, misconfigurations, or weak points that could be exploited. |
3. Analysis | Review the scan results to prioritize the vulnerabilities based on their severity, potential impact, and exploitability. |
4. Reporting | Generate a report that summarizes the vulnerabilities found, their severity, and recommendations for remediation |
5. Remediation | Address the vulnerabilities by applying patches, updating configurations, or implementing other measures to mitigate the identified risks |
Vulnerability scanning Tools for Linux
Tool | Description |
---|---|
OpenVAS | Can detect thousands of known vulnerabilities in Linux systems. It includes a web-based user interface and can be configured to run scheduled scans. |
Nikto | Can identify vulnerabilities in web servers and web applications running on Linux systems. It includes over 6,700 checks for potential vulnerabilities and misconfigurations. |
Nmap - Network Mapper | Network scanner that can identify open ports, running services, and potential vulnerabilities in Linux systems. It includes a wide range of features, including version detection and OS fingerprinting. |
Lynis | Security auditing tool that can identify vulnerabilities in Linux systems. It includes over 200 tests for security and compliance |
Sample Pen Test Report
Section | Description |
---|---|
Executive Summary | Provides a high-level overview of the penetration test results, including the scope, objectives, key findings, and recommendations for remediation. |
Scope and Methodology | Outlines the scope of the penetration test, including the systems, networks, or applications tested, and the methodology used to conduct the test. |
Vulnerabilities | Provides a detailed description of the vulnerabilities found during the test, including their severity, potential impact, and recommendations for remediation. |
Exploitation | Provides details on how the vulnerabilities were exploited during the test, including the tools and techniques used. |
Recommendations | Provides recommendations for remediation, including prioritization of vulnerabilities and specific steps to be taken to address the identified risks. |
Conclusion | Summarizes the key findings and recommendations of the penetration test, and emphasizes the importance of addressing the identified vulnerabilities to improve the overall security of the organization |
Appendices | Includes additional information such as network diagrams, screenshots, and raw data from the penetration test. |
Mobile
Simulator
iOS simulator
- Follow the instructions outlined here to download and set up your iOS Mobile Simulator
Demo - Salesforce Login using OpenID Connect (Google)
Reports and Dashboards
Topics
- Use Case covered
- View the reports in the org
- Getting the snapshot of the reports in the org
Use Case
- Get the snapshot of the reports (in the public folder) in your org on a regular frequency (say weekly or daily)
- You can store the snapshot in your version control (for example, a reports-snapshots repository) so that you can retrieve it from version control as needed
View the reports in the org
- Requires mohanc plugin
cat reports.soql
SELECT Id, Name, DeveloperName, FolderName
FROM report
sfdx mohanc:data:query -q reports.soql -u mohan.chinnappan.n.sel@gmail.com | \
pbcopy;\
open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
On Linux, define these aliases to stand in for pbcopy/pbpaste:
alias pbcopy='xsel --clipboard --input'
alias pbpaste='xsel --clipboard --output'
On Windows, use clip in place of pbcopy.
Getting the snapshot of the reports in the org
Option-1
sfdx force:data:query:soql -f reports.soql --json -u mohan.chinnappan.n.sel@gmail.com | \
jq -e '.result.records' >input.json ;\
node gen_report_packagexml.mjs > report_package.xml; \
sfdx force:mdapi:retrieve -r . -k report_package.xml -u mohan.chinnappan.n.sel@gmail.com ;\
jar tvf unpackaged.zip
Retrieving v56.0 metadata from mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Retrieve ID: 09S4x00000Eyq1vEAB
Retrieving metadata from mohan.chinnappan.n.sel@gmail.com... done
Wrote retrieve zip to /private/tmp/reports/unpackaged.zip
2814 Wed Feb 22 11:01:18 EST 2023 unpackaged/reports/unfiled$public/flow_screen_prebuilt_report.report
259 Wed Feb 22 11:01:18 EST 2023 unpackaged/package.xml
Option-2 - using the mohanc plugin
- Requires the mohanc sfdx plugin
sfdx mohanc:data:query -q reports.soql -u mohan.chinnappan.n.sel@gmail.com -f json >input.json ;\
node gen_report_packagexml.mjs > report_package.xml; \
sfdx force:mdapi:retrieve -r . -k report_package.xml -u mohan.chinnappan.n.sel@gmail.com ; \
jar tvf unpackaged.zip
Retrieving v56.0 metadata from mohan.chinnappan.n.sel@gmail.com using the v57.0 SOAP API
Retrieve ID: 09S4x00000EypzkEAB
Retrieving metadata from mohan.chinnappan.n.sel@gmail.com... done
Wrote retrieve zip to /private/tmp/reports/unpackaged.zip
2814 Wed Feb 22 10:20:56 EST 2023 unpackaged/reports/unfiled$public/flow_screen_prebuilt_report.report
259 Wed Feb 22 10:20:56 EST 2023 unpackaged/package.xml
Scripts used
cat gen_report_packagexml.mjs
// mchinnappan
// TODO: add error/exception handling as needed
import * as fs from 'fs';
let pkg =
`<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
`
const input = JSON.parse(fs.readFileSync('input.json', 'utf-8'));
pkg += '<types>\n';
for (const record of input){
let {DeveloperName, FolderName} = record;
pkg += `\t<members>${FolderName}/${DeveloperName}</members>\n`;
}
// members come before name; version is the last element in package.xml
pkg += '\t<name>Report</name>\n';
pkg += '</types>\n<version>56.0</version>\n</Package>';
console.log(pkg);
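For the single report seen in the retrieve output above, the generated report_package.xml contains one Report members entry per record, keyed as folder/DeveloperName; roughly:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
	<members>unfiled$public/flow_screen_prebuilt_report</members>
	<name>Report</name>
</types>
<version>56.0</version>
</Package>
```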
cat reports.soql
SELECT Id, Name, DeveloperName, FolderName
FROM report
Custom Labels
Topics
- Validate and find duplicate fullNames
- Find the duplicate label values
- HTML view of the Custom Labels file
Use case
- Validate the CustomLabels.labels-meta.xml
- Find any duplicate labels
- Fix these issues and commit back to version control
- Requires the mohanc sfdx plugin
Steps
- Validate the CustomLabels.labels-meta.xml using
sfdx mohanc:xml:validate -p src/sales/channel-fundamentals/main/default/labels/ -i
and fix any missing or invalid tags
- Find the duplicate labels
sfdx mohanc:xml:xml2json -i CustomLabels.labels-meta.xml ; \
sfdx mohanc:data:jq -f '.CustomLabels.labels[].fullName' -i CustomLabels.labels-meta.xml.json | sort | node dup.mjs
Label1 2
Label2 2
...
Scripts used
cat dup.mjs
//-------------------------------
// feb-2023 mchinnappan
// read the given file from the stdin and emit duplicate items
// cat input.txt | node dup.mjs
//-------------------------------
import readline from "readline";
const items = {};
const rl = readline.createInterface({
input: process.stdin,
output: process.stdout,
terminal: false,
});
rl.on("line", (line) => {
const item = line.replace(/"/g, ""); //remove quotes
if (items[item]) ++items[item];
else items[item] = 1;
if (items[item] !== 1) {
console.log(item, items[item]);
}
});
rl.once("close", () => {
//console.log(items);
});
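The counting logic in dup.mjs can be exercised against an in-memory list (the label names below are made up):

```javascript
// Same duplicate-counting approach as dup.mjs, run on an in-memory array
const lines = ['"Label1"', '"Label2"', '"Label1"', '"Label3"', '"Label2"'];
const items = {};
const dups = [];
for (const line of lines) {
  const item = line.replace(/"/g, ''); // strip quotes, as dup.mjs does
  items[item] = (items[item] || 0) + 1;
  // report an item every time it is seen beyond its first occurrence
  if (items[item] > 1) dups.push(`${item} ${items[item]}`);
}
console.log(dups.join('\n')); // → Label1 2, then Label2 2
```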
Use case
- Find out duplicate values (same value with different fullName)
- This will list the label values in sorted order
- Work with teams to make sure you reuse the label values
Option-1 - csv viewer
sfdx mohanc:xml:xml2json -i CustomLabels.labels-meta.xml ; \
sfdx mohanc:data:jq -f '.CustomLabels.labels[].value' -i CustomLabels.labels-meta.xml.json | sort > labelValues.csv; open labelValues.csv
Option-2 - datatable
sfdx mohanc:xml:xml2json -i CustomLabels.labels-meta.xml ; \
sfdx mohanc:data:jq -f '.CustomLabels.labels[].value' -i CustomLabels.labels-meta.xml.json | sort | pbcopy ; open "https://mohan-chinnappan-n5.github.io/viz/datatable/dt.html?c=csv"
List of duplicate label values
Account
Account
Accounts Assigned to
Action
Actions
Active Account Plan
Active
Active;Completed
Activities
Activity Category
Activity
Activity
Activity
HTML view of the Custom Labels file
bat CustomLabels.labels
───────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
│ File: CustomLabels.labels
───────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
1 │ <?xml version="1.0" encoding="UTF-8"?>
2 │ <CustomLabels xmlns="http://soap.sforce.com/2006/04/metadata">
3 │ <labels>
4 │ <fullName>Greeting</fullName>
5 │ <language>en_US</language>
6 │ <protected>true</protected>
7 │ <shortDescription>Greeting</shortDescription>
8 │ <value>Hello World From LWC</value>
9 │ </labels>
10 │ </CustomLabels>
───────┴──────────────────────
sfdx mohanc:xml:transform -i CustomLabels.labels -m customlabel
508 - Accessibility
Mobile accessibility checklist
Color contrast
- must comply with WCAG 2.1 AA level requirements:
- Contrast ratio of 4.5:1 for normal text (less than 18 point, or less than 14 point bold)
- Contrast ratio of 3:1 for large text (at least 18 point, or at least 14 point bold)
- Information conveyed via color must also be available by other means (underlined text for links, etc.)
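The 4.5:1 and 3:1 thresholds refer to the WCAG contrast ratio, which is computed from the relative luminance of the two colors. A minimal sketch of the formula (per WCAG 2.1, sRGB):

```javascript
// WCAG 2.1 contrast ratio between two sRGB colors given as [r, g, b] 0-255
function luminance([r, g, b]) {
  // convert each channel to linear light
  const lin = [r, g, b].map((c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * lin[0] + 0.7152 * lin[1] + 0.0722 * lin[2];
}

function contrastRatio(fg, bg) {
  // lighter luminance over darker, each offset by 0.05
  const [l1, l2] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (l1 + 0.05) / (l2 + 0.05);
}

console.log(contrastRatio([0, 0, 0], [255, 255, 255])); // → 21
```

Black on white yields the maximum ratio of 21:1; both WCAG thresholds are checked against this number.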
Visibility
- Content hiding techniques such as zero opacity, z-index order and off-screen placement must not be used exclusively to handle visibility.
- Everything other than the currently visible screen must be truly invisible (especially relevant for single page apps with multiple cards):
- Use the hidden attribute or visibility or display style properties.
- Unless absolutely unavoidable, aria-hidden attribute should not be used.
Text Equivalents
- Text equivalent must be provided for every non-strictly presentational non-text element within the app.
- Use alt and title where appropriate
<td>
<input type="text" aria-label="quantity of mangos">
</td>
- If the above attributes are not applicable, use appropriate ARIA States and Properties such as aria-label, aria-labelledby, or aria-describedby.
- Images of text must be avoided. Images of Text (Level AA)
- All form controls must have labels:
<form>
<div class="form-group">
<label for="exampleInputEmail1">Email address</label>
<input type="email" class="form-control" id="exampleInputEmail1" aria-describedby="emailHelp" placeholder="Enter email">
<small id="emailHelp" class="form-text text-muted">We'll never share your email with anyone else.</small>
</div>
<div class="form-group">
<label for="exampleInputPassword1">Password</label>
<input type="password" class="form-control" id="exampleInputPassword1" placeholder="Password">
</div>
<div class="form-check">
<input type="checkbox" class="form-check-input" id="exampleCheck1">
<label class="form-check-label" for="exampleCheck1">Check me out</label>
</div>
<button type="submit" class="btn btn-primary">Submit</button>
</form>
Handling state
- Content should not be restricted to a single orientation (portrait or landscape) unless essential. Orientation (Level AA)
General guidelines
- An app title must be provided.
- Headings must not break hierarchical structure
<h1>Top level heading</h1>
<h2>Secondary heading</h2>
<h2>Another secondary heading</h2>
<h3>Low level heading</h3>
- For touch events, at least one of the following must be true:
- The down-event should not be used to trigger any action
- The action is triggered on the up-event, and a mechanism is available to abort the action before completion or to undo it after completion
- The up-event will undo any action that was triggered on a down event
- It is essential to trigger the action on the down event. For example, playing a game or a piano application.
References
- MDN: Mobile accessibility checklist
- Using the HTML title attribute – updated March 2020
- 6.6 Definitions of States and Properties (all aria-* attributes)
- Target touch size
Tools
- Firefox Accessibility Inspector
Kural Install
# linux and mac
sudo npm i -g kural
# windows
npm i -g kural
Chrome Driver
- Download the Chrome Driver file and put this in your PATH
$ cat 508_sfdc.json
{
"testName": "508 - sfdc",
"name": "login",
"url": "https://login.salesforce.com",
"port": 8041,
"slowMo": 50,
"headless": false,
"reportFilename": "sfdc_lh.html",
"viewport": { "width": 1516, "height": 699 } ,
"onlyCategories": ["accessibility", "best-practices", "performance", "pwa", "seo" ],
"form": [
{
"locator": "#username",
"value": "mohan.chinnappan.n.sel@gmail.com",
"action": "sendKeys"
},
{
"locator": "#password",
"value": "PWD",
"action": "sendKeys"
},
{
"locator": "#Login"
}
]
}
$ cat 508.json
{
"description": "Running 508 checks on Salesforce org page",
"retURL" : "lightning/o/Account/list",
"headless": false,
"port": 8041,
"slowMo": 50,
"closeBrowser": true,
"reportFilename": "sfdc_lh.html",
"viewPort": { "width": 2040, "height": 699 },
"onlyCategories": ["accessibility"]
}
Run 508 Demo
Run 508 Demo-2
Sa11y
- Sa11y helps to write automated accessibility tests.
- Sa11y helps you detect machine-knowable static DOM accessibility issues.
- Sa11y is based on axe-core, an accessibility testing JavaScript engine.
- Sa11y provides support for Jest, WebdriverIO, and generic JavaScript tests.
- Sa11y is used internally by teams at Salesforce - it is open source
Sa11y ensures your DOM follows a set of accessibility guidelines (WCAG) and best practices. This is a list of some of the issues that sa11y will help you with, and some of the issues you will have to manually test:
Test for | @sa11y/jest | @sa11y/wdio | @sa11y/assert (generic JS) |
---|---|---|---|
Use of semantic structure of the DOM | Yes | Yes | Yes |
Utilization of ARIA attributes | Yes | Yes | Yes |
Offering text-based alternatives for images & icons | Yes | Yes | Yes |
Navigation aids for screen readers | Yes | Yes | Yes |
Not using deprecated elements that hurt a11y | Yes | Yes | Yes |
Appropriate use of colors, with enough contrast | No | Yes | No |
Audio, Video elements | No | Yes | No |
App support of keyboard navigation and appropriate focus implementation | No | No | No |
Accessibility guidelines that apply only to mobile, such as adequate use of gestures | No | No | No |
Note our base components, as well as SLDS, try to follow accessibility standards as much as possible.
Key points:
- The use of semantic structure of the DOM, and the utilization of ARIA attributes (if needed) to allow assistive technologies to correctly identify the markup meaning.
- The appropriate use of colors, with enough contrast and not relying on them as the only way to convey information.
- Making sure the app is fully accessible using the keyboard alone (i.e. no mouse!) with a correct implementation of focus.
Salesforce + a11y => Sa11y
- Install in your sfdx project
npm install -D @sa11y/jest
{
"name": "salesforce-app",
"private": true,
"version": "1.0.0",
"description": "Salesforce App",
"scripts": {
"lint": "eslint **/{aura,lwc}/**",
"test": "npm run test:unit",
"test:unit": "sfdx-lwc-jest",
"test:unit:watch": "sfdx-lwc-jest --watch",
"test:unit:debug": "sfdx-lwc-jest --debug",
"test:unit:coverage": "sfdx-lwc-jest --coverage",
"prettier": "prettier --write \"**/*.{cls,cmp,component,css,html,js,json,md,page,trigger,xml,yaml,yml}\"",
"prettier:verify": "prettier --list-different \"**/*.{cls,cmp,component,css,html,js,json,md,page,trigger,xml,yaml,yml}\"",
"postinstall": "husky install",
"precommit": "lint-staged"
},
"devDependencies": {
"@lwc/eslint-plugin-lwc": "^1.1.2",
"@prettier/plugin-xml": "^2.0.1",
"@sa11y/jest": "^5.2.0",
"@salesforce/eslint-config-lwc": "^3.2.3",
"@salesforce/eslint-plugin-aura": "^2.0.0",
"@salesforce/eslint-plugin-lightning": "^1.0.0",
"@salesforce/sfdx-lwc-jest": "^1.1.0",
"eslint": "^8.11.0",
"eslint-plugin-import": "^2.25.4",
"eslint-plugin-jest": "^26.1.2",
"husky": "^7.0.4",
"lint-staged": "^12.3.7",
"prettier": "^2.6.0",
"prettier-plugin-apex": "^1.10.0"
},
"lint-staged": {
"**/*.{cls,cmp,component,css,html,js,json,md,page,trigger,xml,yaml,yml}": [
"prettier --write"
],
"**/{aura,lwc}/**": [
"eslint"
]
}
}
Sample
- helloConditionalRendering.html
<template>
<template if:true={isDetailsAvailable}>
<p>Jackfruits are great</p>
</template>
</template>
- helloConditionalRendering.js
import { LightningElement, api } from 'lwc';
export default class HelloConditionalRendering extends LightningElement {
@api isDetailsAvailable;
}
Test
import { createElement } from 'lwc';
import HelloConditionalRendering from 'c/helloConditionalRendering';
describe('c-hello-conditional-rendering', () => {
afterEach(() => {
// The jsdom instance is shared across test cases in a single file so reset the DOM
while (document.body.firstChild) {
document.body.removeChild(document.body.firstChild);
}
});
// accessibility test
it('is it accessible', async () => {
// Arrange
const element = createElement('c-hello-conditional-rendering', {
is: HelloConditionalRendering
});
// Act
document.body.appendChild(element);
element.isDetailsAvailable = true;
// Assert
await expect(element).toBeAccessible();
});
});
Successful test
The following line makes it accessible to Sa11y:
element.isDetailsAvailable = true;
npm run test:unit
> salesforce-app@1.0.0 test:unit
> sfdx-lwc-jest
PASS force-app/main/default/lwc/helloConditionalRendering/__tests__/helloConditionalRendering.test.js
c-hello-conditional-rendering
✓ TODO: test case generated by CLI command, please fill in test logic (10 ms)
Test Suites: 1 passed, 1 total
Tests: 1 passed, 1 total
Snapshots: 0 total
Time: 1.976 s
Ran all test suites.
Failed test run
If you comment out this line:
element.isDetailsAvailable = true;
~/Sa11y-1/Sa11ytest >npm run test:unit
> salesforce-app@1.0.0 test:unit
> sfdx-lwc-jest
FAIL force-app/main/default/lwc/helloConditionalRendering/__tests__/helloConditionalRendering.test.js
c-hello-conditional-rendering
✕ is it accessible (8 ms)
● c-hello-conditional-rendering › is it accessible
TypeError: expect(...).toBeAccessible is not a function
23 | // const div = element.shadowRoot.querySelector('div');
24 |
> 25 | await expect(element).toBeAccessible();
| ^
26 | });
27 | });
at Object.toBeAccessible (force-app/main/default/lwc/helloConditionalRendering/__tests__/helloConditionalRendering.test.js:25:31)
Test Suites: 1 failed, 1 total
Tests: 1 failed, 1 total
Snapshots: 0 total
Time: 1.728 s
Ran all test suites.
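Note: a `TypeError: ... toBeAccessible is not a function` can also appear when the sa11y matcher was never registered with Jest. Per the @sa11y/jest documentation, the matcher is registered from a Jest setup file (filename here is illustrative), roughly:

```javascript
// jest-setup.js - registered via Jest's setupFilesAfterEach config option
import { setup } from '@sa11y/jest';
setup(); // registers the toBeAccessible matcher
```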
Video
Resources
- SLDS Accessibility Overview
- Get Started with Web Accessibility
- Salesforce Accessibility Automation Libraries
Process Automation
Flow
How to delete Obsolete Flow versions?
SOQL used
SELECT Id, Status, Definition.DeveloperName, VersionNumber
FROM Flow
WHERE Definition.NamespacePrefix = null
AND Status = 'Obsolete'
ORDER BY Definition.DeveloperName, VersionNumber
Download the script from here
bash build_destructiveChanges_pkg_for_obsolete_flows.sh -u username@email.com
--- username: username@email.com ---
--- Getting the SOQL query ---
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 195 100 195 0 0 2336 0 --:--:-- --:--:-- --:--:-- 2532
--- Running the query to get the JSON file ---
--- Forming destructiveChanges.xml ---
--- Completed writing destructiveChanges.xml --
Deploy command
sfdx force:source:deploy -u username -x destructiveChanges/package.xml --predestructivechanges destructiveChanges/destructiveChanges.xml -c --verbose --loglevel TRACE
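The core of the script, turning the query result into destructiveChanges.xml, can be sketched in Node much like the report package script earlier. This is an illustrative reimplementation, not the author's script; the sample records are made up, and it assumes obsolete flow versions are addressed as DeveloperName-VersionNumber members:

```javascript
// Sketch: build destructiveChanges.xml from the obsolete-flow query result
// (record shape assumed from the SOQL above; sample data is hypothetical)
const records = [
  { VersionNumber: 3, Definition: { DeveloperName: 'My_Flow' } },
  { VersionNumber: 4, Definition: { DeveloperName: 'My_Flow' } },
];

let xml = '<?xml version="1.0" encoding="UTF-8"?>\n' +
  '<Package xmlns="http://soap.sforce.com/2006/04/metadata">\n<types>\n';
for (const r of records) {
  // one members entry per obsolete flow version
  xml += `\t<members>${r.Definition.DeveloperName}-${r.VersionNumber}</members>\n`;
}
xml += '\t<name>Flow</name>\n</types>\n<version>56.0</version>\n</Package>';
console.log(xml);
```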
Great Folks

Ken Thompson - Father of Unix
Dennis Ritchie - Co-creator of Unix, author of C language
Jeff Dean of Google
Jeff Dean, Senior Fellow and SVP, Google Research and AI
Co-designer and co-implementor of five successive generations of Google’s crawling indexing and query retrieval systems for google.com, covering 100X to 10000X growth in number of documents searched, number of queries handled per second, and frequency of updates to the system.
Co-designer and implementor of the initial version of Google's advertising serving system.
Co-designer of LevelDB
Co-designer and implementor of MapReduce, a system for simplifying the development of large-scale data processing applications on large clusters of machines.
Co-designer and implementor of BigTable A Distributed Storage System for Structured Data, a large-scale semi-structured storage system used underneath approximately 50 Google products.
Designer and developer of the initial version of Google's AdSense for Content product involving both the production serving system as well as work on developing and improving the quality of ad selection algorithms based on the contents of pages.
Co-designer and implementor of many aspects of the production system design for Google Translate, a statistical machine translation system, especially a system that provides distributed high-speed access to very large language models (too large to fit in memory on a single machine).
Implemented many low-level libraries used throughout Google’s code base.
Developer of several internal tools for improving productivity in Google’s software development environment.
Designed and implemented initial production serving system for Google News.
Co-designer and implementor of Spanner, a large-scale globally distributed storage system used for several important products at Google.
Co-designer and implementor of system for large-scale distributed neural network training. System has been used to make significant improvements in image understanding, speech recognition and natural language processing applications.
Co-founder and leader of Google's deep learning research and engineering team.
Mozart - original GPT (Generative Pre-trained Transformer)
He wrote down music already finished in his head!!! Page after page of it...
As if he were taking dictation...
And music finished as no music is ever finished...
Here again was the very voice of God!
- He is very proud of his work!!!
---
Live Composing by Maestro Ilaiyaraaja (another original GPT)
Editor tips
vi or nvim
Form a python dictionary from the given list of words in the file
- Select the lines and run:
'<,'>s/^\(\w\+\)/"\1":"\1",/g
These lines in the file :
objects
pages
will be converted into:
"objects":"objects",
"pages":"pages",
Multiline comments in vim
- Press Ctrl-v to enter visual block mode.
- Use the arrow keys to select the lines you want to comment out.
- Press Shift-I to enter insert mode at the beginning of the selected lines.
- Type # (or your comment identifier) and then press Esc.
Multiline comments in VSCode
# select multi-lines to be commented out
shift + option + A
Delete empty lines
:g/^$/d
- The :g command performs an action on all lines that match a pattern.
- /^$/ is a regular expression that matches empty lines. The ^ and $ characters anchor the expression to the beginning and end of the line, respectively.
- The d command deletes the matched lines.
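Outside vim, the same cleanup is a one-liner; for instance in Node (illustrative):

```javascript
// Remove empty lines from a string (the :g/^$/d equivalent)
const text = 'objects\n\npages\n\n';
const cleaned = text.split('\n').filter((line) => line !== '').join('\n');
console.log(JSON.stringify(cleaned)); // → "objects\npages"
```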
Selecting non-empty lines and deleting others
%v/\S/d
- The % range applies the command to the entire file.
- The v command acts on all lines that do NOT match a pattern.
- /\S/ matches lines containing at least one non-whitespace character, that is, non-empty lines.
- The d command deletes the non-matching lines, so blank lines are removed and the non-empty lines are kept.
Converting a JS object literal into compliant JSON
:%s/\(\S*\):/"\1":/g
- From
{
type: "ObjectManager > Account > Fields&Relationships",
field: "AnnualRevenue",
attribute: "Help Text",
newValue: "New Value here"
}
- to JSON
{
"type": "ObjectManager > Account > Fields&Relationships",
"field": "AnnualRevenue",
"attribute": "Help Text",
"newValue": "New Value here"
}
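The same substitution can be scripted outside vim; a Node sketch (the regex carries the same caveat as the vim command: it would also mangle colons inside values, such as URLs):

```javascript
// Quote bare keys so a JS object literal parses as JSON
const jsText = `{
type: "ObjectManager > Account > Fields&Relationships",
field: "AnnualRevenue"
}`;
const jsonText = jsText.replace(/(\S+):/g, '"$1":');
console.log(JSON.parse(jsonText).field); // → AnnualRevenue
```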
Bash tips
Delete all other files except given file pattern
find . ! -name '*.py' -type f -exec rm -f {} +
List files matching patterns '* 2.*' and '* copy.*' and delete them
## go to top folder and run the following command
find . -name '* 2.*' 2>&1 | tee /tmp/2_dup_files.txt
find . -name '* copy.*' 2>&1 | tee /tmp/copy_dup_files.txt
# to delete these files, uncomment the next lines
# cat /tmp/2_dup_files.txt | sed 's/\(.*\)/"\1"/g' | xargs rm -f
# cat /tmp/copy_dup_files.txt | sed 's/\(.*\)/"\1"/g' | xargs rm -f
Language Design
Apex
Apex is a strongly typed, object-oriented programming language that allows developers to execute flow and transaction control statements on Salesforce servers in conjunction with calls to the API. Using syntax that looks like Java and acts like database stored procedures, Apex enables developers to add business logic to most system events, including button clicks, related record updates, and Visualforce pages. Apex code can be initiated by Web service requests and from triggers on objects.
When a developer writes and saves Apex code to the platform, the platform application server first compiles the code into an abstract set of instructions that can be understood by the Apex runtime interpreter, and then saves those instructions as metadata.
When an end user triggers the execution of Apex, the platform application server retrieves the compiled instructions from the metadata and sends them through the runtime interpreter before returning the result. The end user observes no differences in execution time from standard platform requests.
Apex can be used to:
- Create Web services.
- Create email services.
- Perform complex validation over multiple objects.
- Create complex business processes that are not supported by workflow.
- Create custom transactional logic (logic that occurs over the entire transaction, not just with a single record or object).
- Attach custom logic to another operation, such as saving a record, so that it occurs whenever the operation is executed, regardless of whether it originates in the user interface, a Visualforce page, or from SOAP API.
Apex cannot be used to:
- Render elements in the user interface other than error messages
- Change standard functionality
- Apex can only prevent the functionality from happening, or add additional functionality
- Create temporary files
- Spawn threads
Writing tests
Testing is the key to successful long-term development and is a critical component of the development process. We strongly recommend that you use a test-driven development process, that is, test development that occurs at the same time as code development.
Apex supports the creation and execution of unit tests.
- Unit tests are class methods that verify whether a particular piece of code is working properly.
- Unit test methods take no arguments, commit no data to the database, and send no emails.
- Such methods are flagged with the @IsTest annotation in the method definition.
- Unit test methods must be defined in test classes, that is, classes annotated with @IsTest.
- Unit tests must cover at least 75% of your Apex code, and all of those tests must complete successfully.
Notes |
---|
When deploying Apex to a production organization, each unit test in your organization namespace is executed by default. |
Calls to `System.debug` aren't counted as part of Apex code coverage. |
Test methods and test classes aren’t counted as part of Apex code coverage. |
While at least 75% of your Apex code must be covered by tests, don’t focus on the percentage of code that is covered. Instead, make sure that every use case of your application is covered, including positive and negative cases, as well as bulk and single records. This approach ensures that 75% or more of your code is covered by unit tests. |
- Example
public static void updateOlderAccounts() {
// Get the 2 oldest accounts and change their description to 'Old Account'
Account[] oldAccounts = [SELECT Id, Description FROM Account ORDER BY CreatedDate ASC LIMIT 2];
// loop through them and update the Description field
for (Account acct : oldAccounts) {
acct.Description = 'Old Account';
}
// save the change you made
update oldAccounts;
}
// run it
updateOlderAccounts();
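A unit test for the method above might look like the following sketch, assuming the method is placed in a class named UpdateOlderAccounts (class and method names here are illustrative). Test methods do not see org data by default, so the test inserts its own accounts:

```apex
@IsTest
private class UpdateOlderAccountsTest {
    @IsTest
    static void updatesTwoOldestAccounts() {
        // Arrange: create test data; tests are isolated from org data
        insert new List<Account>{
            new Account(Name = 'Oldest'),
            new Account(Name = 'Older')
        };

        // Act
        Test.startTest();
        UpdateOlderAccounts.updateOlderAccounts();
        Test.stopTest();

        // Assert: both accounts were stamped with the new description
        for (Account a : [SELECT Description FROM Account]) {
            System.assertEquals('Old Account', a.Description);
        }
    }
}
```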
Demo
- Run the query to find the old accounts
cat oldAccounts.soql
SELECT Id, Description FROM Account ORDER BY CreatedDate ASC LIMIT 2
sfdx mohanc:data:query -u mohan.chinnappan.n.sel2@gmail.com -q oldAccounts.soql -f json
[
{
"attributes": {
"type": "Account",
"url": "/services/data/v57.0/sobjects/Account/0018W000024fLocQAE"
},
"Id": "0018W000024fLocQAE",
"Description": "Genomics company engaged in mapping and sequencing of the human genome and developing gene-based drugs"
},
{
"attributes": {
"type": "Account",
"url": "/services/data/v57.0/sobjects/Account/0018W000024fLoaQAE"
},
"Id": "0018W000024fLoaQAE",
"Description": "TODO"
}
]
cat UpdateOlderAccounts.cls
public static void updateOlderAccounts() {
// Get the 2 oldest accounts and change their description to 'Old Account'
Account[] oldAccounts = [SELECT Id, Description FROM Account ORDER BY CreatedDate ASC LIMIT 2];
// loop through them and update the Description field
for (Account acct : oldAccounts) {
acct.Description = 'Old Account';
}
// save the change you made
update oldAccounts;
}
// run it
updateOlderAccounts();
- Execute
sfdx mohanc:tooling:execute -u mohan.chinnappan.n.sel2@gmail.com -a UpdateOlderAccounts.cls
apexCode: public static void updateOlderAccounts() {
// Get the 2 oldest accounts and change their description to 'Old Account'
Account[] oldAccounts = [SELECT Id, Description FROM Account ORDER BY CreatedDate ASC LIMIT 2];
// loop through them and update the Description field
for (Account acct : oldAccounts) {
acct.Description = 'Old Account';
}
// save the change you made
update oldAccounts;
}
// run it
updateOlderAccounts();
compiled?: true
executed?: true
{
line: -1,
column: -1,
compiled: true,
success: true,
compileProblem: null,
exceptionStackTrace: null,
exceptionMessage: null
}
- Check the results after execution
sfdx mohanc:data:query -u mohan.chinnappan.n.sel2@gmail.com -q oldAccounts.soql -f json
[
{
"attributes": {
"type": "Account",
"url": "/services/data/v57.0/sobjects/Account/0018W000024fLocQAE"
},
"Id": "0018W000024fLocQAE",
"Description": "Old Account"
},
{
"attributes": {
"type": "Account",
"url": "/services/data/v57.0/sobjects/Account/0018W000024fLoaQAE"
},
"Id": "0018W000024fLoaQAE",
"Description": "Old Account"
}
]
Grammar
Grammar for ANTLR
/*
[The "BSD licence"]
Copyright (c) 2013 Terence Parr, Sam Harwell
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. The name of the author may not be used to endorse or promote products
derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR
IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES
OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
/**
* An Apexcode grammar derived from Java 1.7 grammar for ANTLR v4.
* Uses ANTLR v4's left-recursive expression notation.
*
* @maintainer: Andrey Gavrikov
*
* You can test with
*
* $ antlr4 apex.g4
* $ javac *.java
* $ grun apex compilationUnit *.cls
*/
grammar apex;
// starting point for parsing a apexcode file
compilationUnit
: packageDeclaration? importDeclaration* typeDeclaration* EOF
;
packageDeclaration
: annotation* PACKAGE qualifiedName ';'
;
importDeclaration
: IMPORT STATIC? qualifiedName ('.' '*')? ';'
;
typeDeclaration
: classOrInterfaceModifier* classDeclaration
| classOrInterfaceModifier* enumDeclaration
| classOrInterfaceModifier* interfaceDeclaration
| classOrInterfaceModifier* annotationTypeDeclaration
| ';'
;
modifier
: classOrInterfaceModifier
| ( NATIVE
| SYNCHRONIZED
| TRANSIENT
)
;
classOrInterfaceModifier
: annotation // class or interface
| ( PUBLIC // class or interface
| PROTECTED // class or interface
| PRIVATE // class or interface
| STATIC // class or interface
| ABSTRACT // class or interface
| FINAL // class only -- does not apply to interfaces
| GLOBAL // class or interface
| WEBSERVICE // class only -- does not apply to interfaces
| OVERRIDE // method only
| VIRTUAL // method only
| TESTMETHOD // method only
| APEX_WITH_SHARING // class only
| APEX_WITHOUT_SHARING //class only
)
;
variableModifier
: FINAL
| annotation
;
classDeclaration
: CLASS Identifier typeParameters?
(EXTENDS type_)?
(IMPLEMENTS typeList)?
classBody
;
typeParameters
: '<' typeParameter (',' typeParameter)* '>'
;
typeParameter
: Identifier (EXTENDS typeBound)?
;
typeBound
: type_ ('&' type_)*
;
enumDeclaration
: ENUM Identifier (IMPLEMENTS typeList)?
'{' enumConstants? ','? enumBodyDeclarations? '}'
;
enumConstants
: enumConstant (',' enumConstant)*
;
enumConstant
: annotation* Identifier arguments? classBody?
;
enumBodyDeclarations
: ';' classBodyDeclaration*
;
interfaceDeclaration
: INTERFACE Identifier typeParameters? (EXTENDS typeList)? interfaceBody
;
typeList
: type_ (',' type_)*
;
classBody
: '{' classBodyDeclaration* '}'
;
interfaceBody
: '{' interfaceBodyDeclaration* '}'
;
classBodyDeclaration
: ';'
| STATIC? block
| modifier* memberDeclaration
;
memberDeclaration
: methodDeclaration
| genericMethodDeclaration
| fieldDeclaration
| constructorDeclaration
| genericConstructorDeclaration
| interfaceDeclaration
| annotationTypeDeclaration
| classDeclaration
| enumDeclaration
| propertyDeclaration
;
/* We use rule this even for void methods which cannot have [] after parameters.
This simplifies grammar and we can consider void to be a type, which
renders the [] matching as a context-sensitive issue or a semantic check
for invalid return type after parsing.
*/
methodDeclaration
: OVERRIDE? (type_|VOID) Identifier formalParameters ('[' ']')*
(THROWS qualifiedNameList)?
( methodBody
| ';'
)
;
genericMethodDeclaration
: typeParameters methodDeclaration
;
constructorDeclaration
: Identifier formalParameters (THROWS qualifiedNameList)?
constructorBody
;
genericConstructorDeclaration
: typeParameters constructorDeclaration
;
fieldDeclaration
: type_ variableDeclarators ';'
;
propertyDeclaration
: type_ variableDeclarators propertyBodyDeclaration
;
propertyBodyDeclaration
: '{' propertyBlock propertyBlock? '}'
;
interfaceBodyDeclaration
: modifier* interfaceMemberDeclaration
| ';'
;
interfaceMemberDeclaration
: constDeclaration
| interfaceMethodDeclaration
| genericInterfaceMethodDeclaration
| interfaceDeclaration
| annotationTypeDeclaration
| classDeclaration
| enumDeclaration
;
constDeclaration
: type_ constantDeclarator (',' constantDeclarator)* ';'
;
constantDeclarator
: Identifier ('[' ']')* '=' variableInitializer
;
// see matching of [] comment in methodDeclaratorRest
interfaceMethodDeclaration
: (type_|VOID) Identifier formalParameters ('[' ']')*
(THROWS qualifiedNameList)?
';'
;
genericInterfaceMethodDeclaration
: typeParameters interfaceMethodDeclaration
;
variableDeclarators
: variableDeclarator (',' variableDeclarator)*
;
variableDeclarator
: variableDeclaratorId ('=' variableInitializer)?
;
variableDeclaratorId
: Identifier ('[' ']')*
;
variableInitializer
: arrayInitializer
| expression
;
arrayInitializer
: '{' (variableInitializer (',' variableInitializer)* (',')? )? '}'
;
enumConstantName
: Identifier
;
type_
: classOrInterfaceType ('[' ']')*
| primitiveType ('[' ']')*
;
classOrInterfaceType
: Identifier typeArguments? ('.' Identifier typeArguments? )*
    | SET typeArguments // 'set <' has to be defined explicitly, otherwise it clashes with SET of property setter
;
primitiveType
: CHAR
| BYTE
| SHORT
| INT
| FLOAT
;
typeArguments
: '<' typeArgument (',' typeArgument)* '>'
;
typeArgument
: type_
| '?' ((EXTENDS | SUPER) type_)?
;
qualifiedNameList
: qualifiedName (',' qualifiedName)*
;
formalParameters
: '(' formalParameterList? ')'
;
formalParameterList
: formalParameter (',' formalParameter)* (',' lastFormalParameter)?
| lastFormalParameter
;
formalParameter
: variableModifier* type_ variableDeclaratorId
;
lastFormalParameter
: variableModifier* type_ '...' variableDeclaratorId
;
methodBody
: block
;
constructorBody
: block
;
qualifiedName
: Identifier ('.' Identifier)*
;
literal
: IntegerLiteral
| FloatingPointLiteral
| CharacterLiteral
| StringLiteral
| BooleanLiteral
| NullLiteral
;
// ANNOTATIONS
annotation
: '@' annotationName ( '(' ( elementValuePairs | elementValue )? ')' )?
;
annotationName : qualifiedName ;
elementValuePairs
: elementValuePair (','? elementValuePair)*
;
elementValuePair
: Identifier '=' elementValue
;
elementValue
: expression
| annotation
| elementValueArrayInitializer
;
elementValueArrayInitializer
: '{' (elementValue (',' elementValue)*)? (',')? '}'
;
annotationTypeDeclaration
: '@' INTERFACE Identifier annotationTypeBody
;
annotationTypeBody
: '{' (annotationTypeElementDeclaration)* '}'
;
annotationTypeElementDeclaration
: modifier* annotationTypeElementRest
| ';' // this is not allowed by the grammar, but apparently allowed by the actual compiler
;
annotationTypeElementRest
: type_ annotationMethodOrConstantRest ';'
| classDeclaration ';'?
| interfaceDeclaration ';'?
| enumDeclaration ';'?
| annotationTypeDeclaration ';'?
;
annotationMethodOrConstantRest
: annotationMethodRest
| annotationConstantRest
;
annotationMethodRest
: Identifier '(' ')' defaultValue?
;
annotationConstantRest
: variableDeclarators
;
defaultValue
: DEFAULT elementValue
;
// STATEMENTS / BLOCKS
block
: '{' blockStatement* '}'
;
blockStatement
: localVariableDeclarationStatement
| statement
| typeDeclaration
;
localVariableDeclarationStatement
: localVariableDeclaration ';'
;
localVariableDeclaration
: variableModifier* type_ variableDeclarators
;
statement
: block
| IF parExpression statement (ELSE statement)?
| FOR '(' forControl ')' statement
| WHILE parExpression statement
| DO statement WHILE parExpression ';'
| RUNAS '(' expression ')' statement
| TRY block (catchClause+ finallyBlock? | finallyBlock)
| TRY resourceSpecification block catchClause* finallyBlock?
| RETURN expression? ';'
| THROW expression ';'
| BREAK Identifier? ';'
| CONTINUE Identifier? ';'
| ';'
| statementExpression ';'
| Identifier ':' statement
| apexDbExpression ';'
;
propertyBlock
: modifier* (getter | setter)
;
getter
: GET (';' | methodBody)
;
setter
: SET (';' | methodBody)
;
catchClause
: CATCH '(' variableModifier* catchType Identifier ')' block
;
catchType
: qualifiedName ('|' qualifiedName)*
;
finallyBlock
: FINALLY block
;
resourceSpecification
: '(' resources ';'? ')'
;
resources
: resource (';' resource)*
;
resource
: variableModifier* classOrInterfaceType variableDeclaratorId '=' expression
;
forControl
: enhancedForControl
| forInit? ';' expression? ';' forUpdate?
;
forInit
: localVariableDeclaration
| expressionList
;
enhancedForControl
: variableModifier* type_ variableDeclaratorId ':' expression
;
forUpdate
: expressionList
;
// EXPRESSIONS
parExpression
: '(' expression ')'
;
expressionList
: expression (',' expression)*
;
statementExpression
: expression
;
constantExpression
: expression
;
apexDbUpsertExpression
: DB_UPSERT expression (expression)*
;
apexDbExpression
: (DB_INSERT | DB_UPDATE | DB_DELETE | DB_UNDELETE) expression
| apexDbUpsertExpression
;
expression
: primary
| expression '.' GET '(' expressionList? ')'
| expression '.' SET '(' expressionList? ')'
| expression '?'? '.' Identifier
| expression '.' THIS
| expression '.' NEW
| expression '.'
( DB_INSERT
| DB_UPSERT
| DB_UPDATE
| DB_DELETE
| DB_UNDELETE
)
| expression '.' SUPER superSuffix
| expression '.' explicitGenericInvocation
| expression '[' expression ']'
| expression '(' expressionList? ')'
| NEW creator
| '(' type_ ')' expression
| expression ('++' | '--')
| ('+'|'-'|'++'|'--') expression
| ('~'|'!') expression
| expression ('*'|'/'|'%') expression
| expression ('+'|'-') expression
| expression ('<' '<' | '>' '>' '>' | '>' '>') expression
| expression ('<=' | '>=' | '>' | '<') expression
| expression INSTANCEOF type_
| expression ('==' | '!=' | '<>') expression
| expression '&' expression
| expression '^' expression
| expression '|' expression
| expression '&&' expression
| expression '||' expression
| expression '?' expression ':' expression
| <assoc=right> expression
( '='
| '+='
| '-='
| '*='
| '/='
| '&='
| '|='
| '^='
| '>>='
| '>>>='
| '<<='
| '%='
)
expression
;
primary
: '(' expression ')'
| THIS
| SUPER
| literal
| Identifier
| type_ '.' CLASS
| VOID '.' CLASS
| nonWildcardTypeArguments (explicitGenericInvocationSuffix | THIS arguments)
| SoqlLiteral
;
creator
: nonWildcardTypeArguments createdName classCreatorRest
| createdName (arrayCreatorRest | classCreatorRest | mapCreatorRest | setCreatorRest)
;
createdName
: Identifier typeArgumentsOrDiamond? ('.' Identifier typeArgumentsOrDiamond?)*
| primitiveType
    | SET typeArgumentsOrDiamond // 'set <' has to be defined explicitly, otherwise it clashes with SET of property setter
;
innerCreator
: Identifier nonWildcardTypeArgumentsOrDiamond? classCreatorRest
;
arrayCreatorRest
: '['
( ']' ('[' ']')* arrayInitializer
| expression ']' ('[' expression ']')* ('[' ']')*
)
;
mapCreatorRest
: '{'
( '}'
| ( Identifier | expression ) '=>' ( literal | expression ) (',' (Identifier | expression) '=>' ( literal | expression ) )* '}'
)
;
setCreatorRest
: '{'
( '}'
| ( literal | expression ) (',' ( literal | expression ))* '}'
)
;
classCreatorRest
: arguments classBody?
;
explicitGenericInvocation
: nonWildcardTypeArguments explicitGenericInvocationSuffix
;
nonWildcardTypeArguments
: '<' typeList '>'
;
typeArgumentsOrDiamond
: '<' '>'
| typeArguments
;
nonWildcardTypeArgumentsOrDiamond
: '<' '>'
| nonWildcardTypeArguments
;
superSuffix
: arguments
| '.' Identifier arguments?
;
explicitGenericInvocationSuffix
: SUPER superSuffix
| Identifier arguments
;
arguments
: '(' expressionList? ')'
;
// Apex - SOQL literal
SoqlLiteral
: '[' WS* SELECT (SelectRestNoInnerBrackets | SelectRestAllowingInnerBrackets)*? ']'
;
fragment SelectRestAllowingInnerBrackets
: '[' ~']' .*? ']'
| ~'[' .*?
;
fragment SelectRestNoInnerBrackets
: ~'['
;
// LEXER
// §3.9 Keywords
OVERRIDE : O V E R R I D E;
VIRTUAL : V I R T U A L;
SET : S E T;
GET : G E T;
ABSTRACT : A B S T R A C T;
BREAK : B R E A K;
BYTE : B Y T E;
CATCH : C A T C H;
CHAR : C H A R;
CLASS : C L A S S;
CONST : C O N S T;
CONTINUE : C O N T I N U E;
DEFAULT : D E F A U L T;
DO : D O;
ELSE : E L S E;
ENUM : E N U M;
EXTENDS : E X T E N D S;
FINAL : F I N A L;
FINALLY : F I N A L L Y;
FLOAT : F L O A T;
FOR : F O R;
IF : I F;
GOTO : G O T O;
IMPLEMENTS : I M P L E M E N T S;
IMPORT : I M P O R T;
INSTANCEOF : I N S T A N C E O F;
INT : I N T;
INTERFACE : I N T E R F A C E;
NATIVE : N A T I V E;
NEW : N E W;
PACKAGE : P A C K A G E;
PRIVATE : P R I V A T E;
PROTECTED : P R O T E C T E D;
PUBLIC : P U B L I C;
RETURN : R E T U R N;
SHORT : S H O R T;
STATIC : S T A T I C;
SUPER : S U P E R;
SYNCHRONIZED : S Y N C H R O N I Z E D;
THIS : T H I S;
THROW : T H R O W;
THROWS : T H R O W S;
TRANSIENT : T R A N S I E N T;
TRY : T R Y;
VOID : V O I D;
VOLATILE : V O L A T I L E;
WHILE : W H I L E;
// Apexcode specific
GLOBAL : G L O B A L;
WEBSERVICE : W E B S E R V I C E;
APEX_WITH_SHARING : W I T H SPACE S H A R I N G;
APEX_WITHOUT_SHARING : W I T H O U T SPACE S H A R I N G;
SELECT : S E L E C T;
DB_INSERT : I N S E R T;
DB_UPSERT : U P S E R T;
DB_UPDATE : U P D A T E;
DB_DELETE : D E L E T E;
DB_UNDELETE : U N D E L E T E;
TESTMETHOD : T E S T M E T H O D;
RUNAS : S Y S T E M DOT R U N A S;
// §3.10.1 Integer Literals
IntegerLiteral
: DecimalIntegerLiteral
| HexIntegerLiteral
| OctalIntegerLiteral
| BinaryIntegerLiteral
;
fragment
DecimalIntegerLiteral
: DecimalNumeral IntegerTypeSuffix?
;
fragment
HexIntegerLiteral
: HexNumeral IntegerTypeSuffix?
;
fragment
OctalIntegerLiteral
: OctalNumeral IntegerTypeSuffix?
;
fragment
BinaryIntegerLiteral
: BinaryNumeral IntegerTypeSuffix?
;
fragment
IntegerTypeSuffix
: [lL]
;
fragment
DecimalNumeral
: '0' Digit?
| NonZeroDigit (Digits? | Underscores Digits)
;
fragment
Digits
: Digit (DigitOrUnderscore* Digit)?
;
fragment
Digit
: '0'
| NonZeroDigit
;
fragment
NonZeroDigit
: [1-9]
;
fragment
DigitOrUnderscore
: Digit
| '_'
;
fragment
Underscores
: '_'+
;
fragment
HexNumeral
: '0' [xX] HexDigits
;
fragment
HexDigits
: HexDigit (HexDigitOrUnderscore* HexDigit)?
;
fragment
HexDigit
: [0-9a-fA-F]
;
fragment
HexDigitOrUnderscore
: HexDigit
| '_'
;
fragment
OctalNumeral
: '0' Underscores? OctalDigits
;
fragment
OctalDigits
: OctalDigit (OctalDigitOrUnderscore* OctalDigit)?
;
fragment
OctalDigit
: [0-7]
;
fragment
OctalDigitOrUnderscore
: OctalDigit
| '_'
;
fragment
BinaryNumeral
: '0' [bB] BinaryDigits
;
fragment
BinaryDigits
: BinaryDigit (BinaryDigitOrUnderscore* BinaryDigit)?
;
fragment
BinaryDigit
: [01]
;
fragment
BinaryDigitOrUnderscore
: BinaryDigit
| '_'
;
// §3.10.2 Floating-Point Literals
FloatingPointLiteral
: DecimalFloatingPointLiteral
| HexadecimalFloatingPointLiteral
;
fragment
DecimalFloatingPointLiteral
: Digits '.' Digits? ExponentPart? FloatTypeSuffix?
| '.' Digits ExponentPart? FloatTypeSuffix?
| Digits ExponentPart FloatTypeSuffix?
| Digits FloatTypeSuffix
;
fragment
ExponentPart
: ExponentIndicator SignedInteger
;
fragment
ExponentIndicator
: [eE]
;
fragment
SignedInteger
: Sign? Digits
;
fragment
Sign
: [+-]
;
fragment
FloatTypeSuffix
: [fFdD]
;
fragment
HexadecimalFloatingPointLiteral
: HexSignificand BinaryExponent FloatTypeSuffix?
;
fragment
HexSignificand
: HexNumeral '.'?
| '0' [xX] HexDigits? '.' HexDigits
;
fragment
BinaryExponent
: BinaryExponentIndicator SignedInteger
;
fragment
BinaryExponentIndicator
: [pP]
;
// §3.10.3 Boolean Literals
BooleanLiteral
: 'true'
| 'false'
;
// §3.10.4 Character Literals
CharacterLiteral
: QUOTE SingleCharacter QUOTE
| QUOTE EscapeSequence QUOTE
;
fragment
SingleCharacter
: ~['\\]
;
// §3.10.5 String Literals
StringLiteral
: QUOTE StringCharacters? QUOTE
;
fragment
StringCharacters
: StringCharacter+
;
fragment
StringCharacter
: ~['\\]
| EscapeSequence
;
// §3.10.6 Escape Sequences for Character and String Literals
fragment
EscapeSequence
: '\\' [btnfr"'\\]
| OctalEscape
| UnicodeEscape
;
fragment
OctalEscape
: '\\' OctalDigit
| '\\' OctalDigit OctalDigit
| '\\' ZeroToThree OctalDigit OctalDigit
;
fragment
UnicodeEscape
: '\\' 'u' HexDigit HexDigit HexDigit HexDigit
;
fragment
ZeroToThree
: [0-3]
;
// §3.10.7 The Null Literal
NullLiteral : N U L L;
// §3.11 Separators
LPAREN : '(';
RPAREN : ')';
LBRACE : '{';
RBRACE : '}';
LBRACK : '[';
RBRACK : ']';
SEMI : ';';
COMMA : ',';
DOT : '.';
// §3.12 Operators
ASSIGN : '=';
GT : '>';
LT : '<';
BANG : '!';
TILDE : '~';
QUESTION : '?';
COLON : ':';
EQUAL : '==';
LE : '<=';
GE : '>=';
NOTEQUAL : '!=';
AND : '&&';
OR : '||';
INC : '++';
DEC : '--';
ADD : '+';
SUB : '-';
MUL : '*';
DIV : '/';
BITAND : '&';
BITOR : '|';
CARET : '^';
MOD : '%';
ADD_ASSIGN : '+=';
SUB_ASSIGN : '-=';
MUL_ASSIGN : '*=';
DIV_ASSIGN : '/=';
AND_ASSIGN : '&=';
OR_ASSIGN : '|=';
XOR_ASSIGN : '^=';
MOD_ASSIGN : '%=';
LSHIFT_ASSIGN : '<<=';
RSHIFT_ASSIGN : '>>=';
URSHIFT_ASSIGN : '>>>=';
LAMBDA_LIKE : '=>';
// §3.8 Identifiers (must appear after all keywords in the grammar)
Identifier
: JavaLetter JavaLetterOrDigit*
;
fragment
JavaLetter
: [a-zA-Z$_] // these are the "java letters" below 0xFF
| // covers all characters above 0xFF which are not a surrogate
~[\u0000-\u00FF\uD800-\uDBFF]
{Character.isJavaIdentifierStart(_input.LA(-1))}?
| // covers UTF-16 surrogate pairs encodings for U+10000 to U+10FFFF
[\uD800-\uDBFF] [\uDC00-\uDFFF]
{Character.isJavaIdentifierStart(Character.toCodePoint((char)_input.LA(-2), (char)_input.LA(-1)))}?
;
fragment
JavaLetterOrDigit
: [a-zA-Z0-9$_] // these are the "java letters or digits" below 0xFF
| // covers all characters above 0xFF which are not a surrogate
~[\u0000-\u00FF\uD800-\uDBFF]
{Character.isJavaIdentifierPart(_input.LA(-1))}?
| // covers UTF-16 surrogate pairs encodings for U+10000 to U+10FFFF
[\uD800-\uDBFF] [\uDC00-\uDFFF]
{Character.isJavaIdentifierPart(Character.toCodePoint((char)_input.LA(-2), (char)_input.LA(-1)))}?
;
//
// Additional symbols not defined in the lexical specification
//
AT : '@';
ELLIPSIS : '...';
//
// Whitespace and comments
//
WS : [ \t\r\n\u000C]+ -> skip
;
APEXDOC_COMMENT
: '/**' [\r\n] .*? '*/' -> skip
;
APEXDOC_COMMENT_START
: '/**' -> skip
;
COMMENT
: '/*' .*? '*/' -> skip
;
COMMENT_START
: '/*' -> skip
;
LINE_COMMENT
: '//' ~[\r\n]* -> skip
;
//
// Unexpected token for non-recognized elements
//
QUOTE : '\'' -> skip;
// characters
fragment A : [aA];
fragment B : [bB];
fragment C : [cC];
fragment D : [dD];
fragment E : [eE];
fragment F : [fF];
fragment G : [gG];
fragment H : [hH];
fragment I : [iI];
fragment J : [jJ];
fragment K : [kK];
fragment L : [lL];
fragment M : [mM];
fragment N : [nN];
fragment O : [oO];
fragment P : [pP];
fragment Q : [qQ];
fragment R : [rR];
fragment S : [sS];
fragment T : [tT];
fragment U : [uU];
fragment V : [vV];
fragment W : [wW];
fragment X : [xX];
fragment Y : [yY];
fragment Z : [zZ];
fragment SPACE : ' ';
antlr4 apex.g4
ls -l
total 1264
-rw-r--r-- 1 mchinnappan staff 23489 May 5 03:57 apex.g4
-rw-r--r-- 1 mchinnappan staff 51171 May 5 03:57 apex.interp
-rw-r--r-- 1 mchinnappan staff 1751 May 5 03:57 apex.tokens
-rw-r--r-- 1 mchinnappan staff 38466 May 5 03:57 apexBaseListener.java
-rw-r--r-- 1 mchinnappan staff 49689 May 5 03:57 apexLexer.interp
-rw-r--r-- 1 mchinnappan staff 78432 May 5 03:57 apexLexer.java
-rw-r--r-- 1 mchinnappan staff 1751 May 5 03:57 apexLexer.tokens
-rw-r--r-- 1 mchinnappan staff 39428 May 5 03:57 apexListener.java
-rw-r--r-- 1 mchinnappan staff 341615 May 5 03:57 apexParser.java
cat AccountCleanup.cls
public class AccountCleanup {
public static void updateAccounts() {
Account[] accounts = [SELECT Id, Description FROM Account WHERE Id ='0018W000024fLoaQAE'];
// loop through them and update the Description field
for (Account acct : accounts) {
acct.Description = 'TODO';
}
// save the change you made
update accounts;
}
}
antlr4-parse apex.g4 compilationUnit -gui
public class AccountCleanup {
public static void updateAccounts() {
Account[] accounts = [SELECT Id, Description FROM Account WHERE Id ='0018W000024fLoaQAE'];
// loop through them and update the Description field
for (Account acct : accounts) {
acct.Description = 'TODO';
}
// save the change you made
update accounts;
}
}
^D
References
Calendar Integrations
Calendar Integrations
Apple iCal, also known as Apple Calendar, is a calendar application developed by Apple Inc. It is available for macOS, iOS, and other Apple devices. iCal allows users to create, manage, and organize events, appointments, and tasks in a digital calendar format.
Here are some key features and functionalities of Apple iCal:
- Calendar Management: iCal provides a user-friendly interface for managing multiple calendars, including personal calendars, work calendars, and shared calendars. Users can create, edit, and delete events, set reminders, and categorize events with colors and tags.
- Synchronization: iCal supports synchronization with various calendar services and protocols, such as iCloud, Google Calendar, Microsoft Exchange, and CalDAV. This allows users to access and sync their calendar data across multiple devices and platforms.
- Sharing and Collaboration: iCal enables users to share calendars with others, granting them permission to view or edit events. This facilitates collaboration and coordination among teams, family members, or groups working on shared projects or events.
- Notifications and Reminders: iCal can send reminders and notifications for upcoming events, ensuring that users stay informed about their schedules and important appointments. Reminders can be set to trigger at specific times or intervals before an event.
- Integration with Other Apps: iCal integrates seamlessly with other Apple applications, such as Mail, Contacts, and Maps. It allows users to quickly create events from emails or addresses and provides location-based suggestions for event venues.
- Recurring Events: iCal supports the creation of recurring events, allowing users to set up events that repeat daily, weekly, monthly, or at custom intervals. This is useful for scheduling regular activities or reminders.
- Search and Filtering: iCal includes search and filtering capabilities to help users find specific events or navigate through their calendars more efficiently. Users can search for events by keywords, dates, locations, or participants.
- Travel Time: iCal offers the ability to estimate travel time for events, taking into account traffic conditions and providing alerts to ensure users arrive on time.
- Availability and Scheduling: iCal includes features for checking the availability of invitees when scheduling meetings or events. This helps users find the best time when all participants are free.
Overall, Apple iCal provides a comprehensive calendar management solution with a range of features designed to help users stay organized, schedule events, and manage their time effectively across Apple devices and platforms.
Html App
Python example
# pip install pyicloud
from pyicloud import PyiCloudService

api = PyiCloudService('your_apple_id', 'your_apple_password')
ical_service = api.calendar

# List events (pyicloud's calendar service is primarily read-oriented)
events = ical_service.events()
for event in events:
    print(event)

# NOTE: the write operations below are illustrative; depending on your
# pyicloud version, the calendar service may not expose these helpers.
# Create a new event:
new_event = ical_service.new_event()
new_event.title = 'My Event'
new_event.location = 'Event Location'
new_event.notes = 'Event Description'
new_event.start = '2023-06-15 10:00:00'
new_event.end = '2023-06-15 12:00:00'
new_event.save()

# Update an existing event:
event_id = 'your_event_id'
event = ical_service.event(event_id)
event.title = 'Updated Event Title'
event.save()

# Delete an event:
event_id = 'your_event_id'
event = ical_service.event(event_id)
event.delete()
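pyicloud's events() call also accepts an optional date range (from_dt, to_dt). As a minimal sketch, the week_range helper below is our own (not part of pyicloud) and computes a Monday-to-Monday window you could pass to that query:

```python
from datetime import date, datetime, timedelta

def week_range(day: date):
    """Return (start, end) datetimes covering the week containing `day`:
    Monday 00:00 through the following Monday 00:00."""
    start = datetime.combine(day - timedelta(days=day.weekday()),
                             datetime.min.time())
    return start, start + timedelta(days=7)

# Example: the week containing 2023-06-15 (a Thursday)
start, end = week_range(date(2023, 6, 15))
print(start, end)  # 2023-06-12 00:00:00 2023-06-19 00:00:00

# With an authenticated pyicloud session, the range can bound the query:
#   events = api.calendar.events(from_dt=start, to_dt=end)
```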
Territory Management
Enable Territory Management
Enterprise Territory Management allows you to organize your sales structure into separate territories, based on:
- geographic attributes
- named accounts
- industry/verticals, or
- any other criteria that suit your organization.
You can experiment with different territory models and activate the one that works best for your sales teams.
Deploying Territory Management includes these steps:
- Build your territory hierarchy.
- Assign users to territories and create rules for account assignment.
- Preview your account assignments.
- Activate your territory model.
Data Model
Territory
Represents a flexible collection of accounts and users where the users have at least read access to the accounts, regardless of who owns the accounts. Only available if territory management has been enabled for your organization.
Territories organize groups of accounts, leads, and the Salesforce users who work with them. Territories are created based on territory types.
The territory record shows assigned users, manually assigned records, the forecast manager, and rules that assign accounts automatically. A territory can have both parent and child territories. Depending on how your organization assigns records, a record can be assigned to more than one territory.
Territory Hierarchy
The territory hierarchy shows a model’s territory structure and serves as its main interaction point. Start from the hierarchy to create, edit, and delete territories; run assignment rules for territories, and navigate to territory detail pages for more information. From the hierarchy, you can also assign territories to opportunities, run assignment rules at the model level, and activate or archive the model. The forecasts hierarchy for territory forecasts is based on your territory hierarchy in the active territory model.
The name and state of the territory model appear at the top node of the hierarchy (1). Parent and child territories are nested beneath the model name. If a territory has a forecast manager assigned, the manager’s name appears here (2). Create or edit child territories, or run their assignment rules (3). Run the filter that assigns territories to opportunities (4). Run assignment rules for all territories in the model (5). Hover to learn when rules were last run (6). Set the model’s state to Active or Archive (7). Toggle to a sorted list view and back to tree view (8).
Territory2
Represents a sales territory. Available only if Enterprise Territory Management (ETM) has been enabled for your organization.
Territory Type
Territory types help you organize your territories by key characteristics important to your company. Every territory you create has a territory type. Territory types are used for organizing and creating territories only. They don’t appear on territory model hierarchies.
For example, you can create a territory type called Universal Telco Named Accounts, then from that type, create territories that include named accounts.
Or you can create a territory type called US Geographic, and from that type create territories based on states or regions.
When you create a territory type, make sure that the label and description clearly indicate what kinds of territories belong to that type.
Territory2Model
A territory model represents a complete territory management system for your company. Modeling lets you create and preview multiple territory structures and different account, lead, and user assignments before you activate the model that works best.
Each territory model features a graphical hierarchy. The hierarchy shows your model’s territory structure (including parent and child territories) and lets you create, edit, and delete territories and view their detail pages. Your Salesforce edition determines the number of territory models that you can create and modify at a time. Only one model can be active at any time. Activate the model you want to use to finalize its account, lead, and user assignments.
Territory2ModelHistory
Represents the history of changes to the values in the fields on a territory model.
Territory2AlignmentLog
Represents the start and end status of a territory assignment rule run job.
Territory2ObjectExclusion
Represents the objects that aren’t included in territory assignment rule runs, even when they meet assignment rule criteria.
Territory Type Priority
Specifying and managing territory type priority information helps you choose the appropriate territory type for territories you create or edit. You create your own priority scheme. For example, 001 can indicate that a territory type is the highest priority or the lowest. Make sure that your scheme can scale as you add more territory types. When you create a territory, you select a territory type from the Territory Type lookup field. From the lookup, you can view the territory type's assigned priority.
Territory2ObjSharingConfig
Represents the sharing access level of objects assigned to a particular territory.
Limits
Increase the "Maximum Number of Territories per Territory Model" limit in Enterprise Territory Management.
- By default, each territory model can have up to 1,000 territories. If you’re using Developer or Enterprise Edition, the maximum allocation is 1,000 territories per model. If you’re using Performance Edition or Unlimited Edition, you can add up to 20,000 territories to a territory model by contacting Salesforce Customer Support.
References
- Optimize Performance for Enterprise Territory Management
- Enterprise Territory Management Implementation Guide
Tips
Date to DateTime conversion steps
Use case
- Account has a field ingestDate__c
- You would like to have a field ingestDateTime__c instead of ingestDate__c, and replace ingestDate__c with ingestDateTime__c
cat ~/.soql/datetime.soql
SELECT Id, Name, ingestDate__c FROM Account WHERE ingestDate__c != NULL
sfdx mohanc:data:bulkapi:query -q ~/.soql/datetime.soql -u mohan.chinnappan.n.sel@gmail.com
Query output ~/.soql/datetime.soql.csv
"Id","Name","IngestDate__c"
"0014x00001JUIV3AAP","GenePoint","2023-07-27"
Change the above file header to IngestDateTime__c instead of IngestDate__c
"Id","Name","IngestDateTime__c"
"0014x00001JUIV3AAP","GenePoint","2023-07-27"
Load (update) the data
sfdx mohanc:data:bulkapi:update -f ~/.soql/datetime.soql.csv -u mohan.chinnappan.n.sel@gmail.com -o Account
- The above operation converts the date values from IngestDate__c into datetime values in IngestDateTime__c
Next steps
- Find where the field IngestDate__c is used and replace those references with IngestDateTime__c