Saturday, 16 July 2016

Force CLI Part 3 - Accessing Data




In part 2 of this series, Extracting Metadata, I covered how to extract configuration data from your Salesforce instance, which is something pretty much every other deployment tool offers. A key difference of the Force CLI is that you aren’t limited to configuration - you can retrieve and update your Salesforce data from the command line or a script. Very useful if you have to carry out some data migration as part of a deployment, for example.

NOTE: In the examples below the commands are split over multiple lines for readability, but should be entered on a single line if you are trying them out for yourself.

Executing SOQL

The command for executing SOQL is force query, followed by the SOQL string enclosed in quotes:

> force query "select id, Name, Industry from Account 
order by CreatedDate limit 5"

The default format is ascii table style:

 Id                 | Industry       | Name
 0018000000Q9v1VAAR | Construction   | Pyramid Construction Inc.
 0018000000Q9v1WAAR | Consulting     | Dickenson plc
 0018000000Q9v1XAAR | Hospitality    | Grand Hotels & Resorts Ltd
 0018000000Q9v1YAAR | Transportation | Express Logistics and Transport
 0018000000Q9v1ZAAR | Education      | University of Arizona
 (5 records)

which is fine for displaying data on the screen, but not that easy for processing. Luckily the output format can be changed via the -format switch.

CSV Formatting

Specifying csv as the value of the format switch outputs the results of the query in the familiar comma separated format, along with a list of headers:

> force query "select id, Name, Industry from Account 
order by CreatedDate limit 5" -format:csv
"0018000000Q9v1VAAR","Construction","Pyramid Construction Inc."
"0018000000Q9v1WAAR","Consulting","Dickenson plc"
"0018000000Q9v1XAAR","Hospitality","Grand Hotels & Resorts Ltd"
"0018000000Q9v1YAAR","Transportation","Express Logistics and Transport"
"0018000000Q9v1ZAAR","Education","University of Arizona"
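The CSV output pipes cleanly into standard text tools. As a sketch (the sample file below just mirrors two of the rows above - in practice you would pipe the force query command straight into awk), here is how to pull out a single column:

```shell
# Stand-in for saved `force query ... -format:csv` output
cat > /tmp/accounts.csv <<'EOF'
"0018000000Q9v1VAAR","Construction","Pyramid Construction Inc."
"0018000000Q9v1WAAR","Consulting","Dickenson plc"
EOF

# Split on the quoted-comma delimiter and strip the leftover quotes
# from the third column (the account name)
awk -F'","' '{ gsub(/"/, "", $3); print $3 }' /tmp/accounts.csv
```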

While this is more useful for processing, it doesn’t work well with subqueries:

> force query "select id, Name, (select id, FirstName from Contacts) 
from Account order by CreatedDate limit 5" -format:csv
"Contacts","Id","Name"
"[map[Id:0038000000asT8TAAU FirstName:Pat]]","0018000000Q9v1VAAR","Pyramid Construction Inc."
"[map[Id:0038000000asT8UAAU FirstName:Andy]]","0018000000Q9v1WAAR","Dickenson plc"
"[map[Id:0038000000asT8VAAU FirstName:Tim] map[Id:0038000000asT8WAAU FirstName:John]]","0018000000Q9v1XAAR","Grand Hotels & Resorts Ltd"
"[map[Id:0038000000asT8ZAAU FirstName:Babara] map[FirstName:Josh Id:0038000000asT8aAAE]]","0018000000Q9v1YAAR","Express Logistics and Transport"
"[map[Id:0038000000asT8bAAE FirstName:Jane]]","0018000000Q9v1ZAAR","University of Arizona"

JSON Formatting

Specifying json as the format switch outputs the result in JavaScript Object Notation:

> force query "select id, Name, (select id, FirstName from Contacts) 
from Account order by CreatedDate limit 5" -format:json
[{"Contacts":{"done":true,"records":[{"FirstName":"Pat","Id":"0038000000asT8TAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8TAAU"}}],"totalSize":1},"Id":"0018000000Q9v1VAAR","Name":"Pyramid Construction Inc.","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1VAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Andy","Id":"0038000000asT8UAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8UAAU"}}],"totalSize":1},"Id":"0018000000Q9v1WAAR","Name":"Dickenson plc","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1WAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Tim","Id":"0038000000asT8VAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8VAAU"}},{"FirstName":"John","Id":"0038000000asT8WAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8WAAU"}}],"totalSize":2},"Id":"0018000000Q9v1XAAR","Name":"Grand Hotels \u0026 Resorts Ltd","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1XAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Babara","Id":"0038000000asT8ZAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8ZAAU"}},{"FirstName":"Josh","Id":"0038000000asT8aAAE","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8aAAE"}}],"totalSize":2},"Id":"0018000000Q9v1YAAR","Name":"Express Logistics and Transport","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1YAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Jane","Id":"0038000000asT8bAAE","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8bAAE"}}],"totalSize":1},"Id":"0018000000Q9v1ZAAR","Name":"University of Arizona","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1ZAAR"}}]

This output is suitable for processing by a myriad of tools.
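As a sketch of that kind of processing (the sample file below is a trimmed, hypothetical stand-in for captured -format:json output), Python’s standard library can pull out the account names; jq would work just as well if you have it installed:

```shell
# Trimmed stand-in for saved `force query ... -format:json` output
cat > /tmp/accounts.json <<'EOF'
[{"Id":"0018000000Q9v1VAAR","Name":"Pyramid Construction Inc."},
 {"Id":"0018000000Q9v1WAAR","Name":"Dickenson plc"}]
EOF

# Print the Name field of each record
python3 -c 'import json; print("\n".join(r["Name"] for r in json.load(open("/tmp/accounts.json"))))'
```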

Prettified JSON

For the best of both worlds - JSON formatted for automated processing but laid out so that humans can read it, specify json-pretty as the format switch:

> force query "select id, Name, (select id, FirstName from Contacts) 
from Account order by CreatedDate limit 5" -format:json-pretty
[
  {
    "Contacts": {
      "done": true,
      "records": [
        {
          "FirstName": "Pat",
          "Id": "0038000000asT8TAAU",
          "attributes": {
            "type": "Contact",
            "url": "/services/data/v36.0/sobjects/Contact/0038000000asT8TAAU"
          }
        }
      ],
      "totalSize": 1
    },
    "Id": "0018000000Q9v1VAAR",
    "Name": "Pyramid Construction Inc.",
    "attributes": {
      "type": "Account",
      "url": "/services/data/v36.0/sobjects/Account/0018000000Q9v1VAAR"
    }
  }
<removed for clarity>

Working with Records

If you know the ID(s) of the records you need to work with, the force record command allows you to access and manipulate record instances.

Retrieving Records

To retrieve a record, execute force record get <type> <id> - this brings back all fields:

> force record get Contact 0038000000asT8TAAU
AccountId: 0018000000Q9v1VAAR
AssistantName: Jean Marie
AssistantPhone: (014) 427-4465
Birthdate:
CreatedById: 00580000001ju2CAAQ
CreatedDate: 2009-04-19T11:27:45.000+0000
Department: Finance
Description:
Fax: (014) 427-4428
FirstName: Pat

<removed for clarity>
SystemModstamp: 2013-01-10T16:17:39.000+0000
Title: SVP, Administration and Finance
attributes:
  type: Contact
  url: /services/data/v36.0/sobjects/Contact/0038000000asT8TAAU

Updating Records

To update a record, execute force record update <type> <id> <field>:<value>

> force record update Contact 0038000000asT8TAAU FirstName:"Patrick"
Record updated

I can then confirm that the record has been updated by retrieving it again and extracting the FirstName field:

> force record get Contact 0038000000asT8TAAU | grep FirstName
FirstName: Patrick

(The grep command extracts any lines in the output containing the search term - FirstName in this case.)
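If you want just the value rather than the whole line - say, to capture it in a script variable - splitting on the separator does the job. A quick sketch using sample output standing in for the piped command above:

```shell
# Stand-in for `force record get Contact <id> | grep FirstName`
printf 'FirstName: Patrick\n' | awk -F': ' '{ print $2 }'   # → Patrick
```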

Creating Records

You aren’t just limited to existing records with the Force CLI - you can also create them, either individually or en masse from a file.

To create a single record, execute force record create <type> <field> <field> …:

> force record create Contact FirstName:"Keir" LastName:"Bowden"
Record created: 00380000023TUCxAAO

> force record get Contact 00380000023TUCxAAO

<removed for clarity>
FirstName: Keir
LastModifiedById: 00580000001ju2CAAQ
LastModifiedDate: 2016-07-16T10:48:38.000+0000
LastName: Bowden
<removed for clarity>

To create multiple records execute force record create:bulk <type> <file> - the default format is CSV, but you can specify the content type of the file if it is different. Thus the following file, named contacts.csv:

"First","CLI Blog"
"Second","CLI Blog"
"Third","CLI Blog"

can be loaded as follows:

> force record create:bulk Contact contacts.csv

Batch 1 of 1 added with Id 75180000006NK0pAAG
Job created ( 75080000004fh7wAAA ) - for job status use
 force bulk batch 75080000004fh7wAAA 75180000006NK0pAAG

and I can verify the load using the query command introduced earlier:

> force query "select FirstName, LastName from Contact where LastName ='CLI Blog'"

 FirstName | LastName
 First     | CLI Blog
 Second    | CLI Blog
 Third     | CLI Blog
 (3 records)
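For anything beyond a handful of rows you wouldn’t type the CSV by hand. A quick sketch of generating a contacts.csv like the one above from a list of first names (the names and path are illustrative):

```shell
# Generate one quoted CSV row per name
for name in First Second Third; do
  printf '"%s","CLI Blog"\n' "$name"
done > /tmp/contacts.csv

cat /tmp/contacts.csv
```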



Sunday, 26 June 2016

Force CLI Part 2 - Extracting Metadata




In Part 1 of this series, Getting Started, I covered downloading and installing the Force CLI, and logging in to Salesforce. This post covers extracting metadata from your Salesforce org.

Why do I need the Metadata?

This is a very good question - a lot of the time the answer is that you don’t. If you do, it’s highly likely that your IDE of choice will handle this for you - pulling down the metadata components that you are working on and uploading the changes when you save or deploy. The classic use case for extracting metadata yourself is to make a backup of your implementation, typically on a regular schedule in an unattended fashion.


At first glance, this looks about as straightforward as it could be. Executing force help shows there is a command that looks tailor-made:

export    Export metadata to a local directory

By default, this command dumps the metadata to a subdirectory of your current directory named (fittingly enough) metadata, although you can supply a different location for the output as an additional parameter.

After logging in, I can export the metadata by executing the following command:

force export

However, at the time of writing this doesn’t quite bring everything back. Here’s the contents of my metadata directory - spot the deliberate mistake:

analyticSnapshots	homePageComponents	quickActions
applications		homePageLayouts		remoteSiteSettings
approvalProcesses	labels			reportTypes
assignmentRules		layouts			roles
autoResponseRules	networks		samlssoconfigs
callCenters		objectTranslations	scontrols
classes			objects			sharingRules
communities		package.xml		sites
components		pages			staticresources
connectedApps		permissionsets		tabs
dataSources		portals			translations
flows			profiles		triggers
groups			queues			workflows

Spotter’s Badge for those eagle-eyed readers that wondered where the Lightning Components metadata directory (aka aura) is.

There’s currently no way to influence the metadata elements that are included by the export command, but luckily there is a mechanism to extract individual metadata component types.

Fetch! Good boy!

The fetch subcommand provides granularity around metadata extraction, not just down to a single type of metadata, but as far as a single metadata component. I’m not going down to that level, but if you are interested then execute force help fetch which gives full details of the options.

The fetch command takes a -t switch to identify which metadata type you want to extract. For all types you can specify the name of the metadata type as detailed in the Metadata API Developer’s Guide, so to extract the Sites metadata, for example, you would execute:

force fetch -t CustomSite

The metadata name for Lightning Components is AuraDefinitionBundle, but you can also specify Aura as the type, which will extract the Lightning Component metadata via the REST API - I tend to use the latter version as I find it is slightly faster than the metadata route.


force fetch -t Aura

extracts the Lightning Components to the metadata directory, and the entire Salesforce configuration is now in one place:

analyticSnapshots	homePageComponents	remoteSiteSettings
applications		homePageLayouts		reportTypes
approvalProcesses	labels			roles
assignmentRules		layouts			samlssoconfigs
aura			networks		scontrols
autoResponseRules	objectTranslations	sharingRules
callCenters		objects			sites
classes			package.xml		staticresources
communities		pages			tabs
components		permissionsets		translations
connectedApps		portals			triggers
dataSources		profiles		workflows
flows			queues
groups			quickActions

Putting these commands together in a bash script, with a dash of compression, gives me a handy backup file containing my metadata:

#!/bin/bash
force login
force export
force fetch -t Aura
zip -r metadata.zip metadata

Note - if you run the above commands directly from the command line, ignore the first one. That tells the OS which command to use to execute the script, which makes no sense for individual commands.

Version Control to Major Tom

Where this can be particularly powerful is if you integrate with version control - you can use a script to periodically extract the latest metadata and merge it into your Git repository, for example.
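A minimal sketch of what such a script might look like - the path, commit message, and the assumption that force login has already been run are all illustrative. Since it needs a real org and an existing git clone, the sketch only syntax-checks the script rather than executing it:

```shell
# Write an illustrative backup script
cat > /tmp/sf-backup.sh <<'EOF'
#!/bin/bash
# Illustrative: assumes this runs inside an existing git clone and that
# `force login` has already been performed for the target org
force export
force fetch -t Aura
git add metadata
git commit -m "Metadata backup $(date +%F)"
EOF

# Validate the script without running it
bash -n /tmp/sf-backup.sh && echo "syntax OK"
```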



Sunday, 29 May 2016

Force CLI Part 1 - Getting Started




In one of my earlier blog posts around Lightning Components deployment (Deploying Lightning Components - Go Big or Go Missing) I made a passing reference to the fact that I use the Force CLI for automated deployments. A few people contacted me after this and said that while they were vaguely aware of the CLI, they’d never spent any time with it, so I decided to show this unsung hero some love with a series of blog posts.

A Command Line Interface to Salesforce!

As someone who spent 20 or so years developing on Unix and Linux, I’m pretty comfortable with command line tools, and in many cases (resolving git conflicts!) I actually prefer them to a GUI. So when the CLI was released at Dreamforce 13, I couldn’t wait to get hold of it.


The first thing to do is download the CLI from the Force CLI home page - this has pre-built binaries for Mac OSX, Windows and Debian/Ubuntu. I’m a Mac user so any instructions/examples are based on OSX.


Clicking on the ‘Download for Mac OS X 64’ link downloads the ‘force’ application to my ‘Downloads’ directory. I clear this directory out regularly (yeah right, regularly being when I run out of disk space) so the first thing I need to do is move this to a safe place. I create a tools directory underneath my home directory and drag the file into that:

Trying it Out

Via the terminal (if you aren’t familiar with the terminal then I’d recommend following this tutorial from MacWorld) I can then test out whether the application works correctly:

> cd ~/tools
> ./force

and of course it doesn’t:

-bash: ./force: Permission denied

This is because files are downloaded without execute permission. To rectify this, execute:

> chmod +x ./force
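The same mechanics shown on a scratch file, so you can see the effect of the execute bit without touching the real binary (the path is illustrative):

```shell
rm -f /tmp/force-demo
touch /tmp/force-demo      # a fresh file has no execute bit (given a typical umask)
chmod +x /tmp/force-demo   # add the execute bit
[ -x /tmp/force-demo ] && echo "now executable"
```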

Executing again now shows the (long) list of available options:

> ./force

Usage: force <command> [<args>]
Available commands:
   login     force login [-i=<instance>] [<-u=username> <-p=password>]
   logout    Log out from force.com
   logins    List logins used
   active    Show or set the active account
   whoami    Show information about the active account
   describe  Describe the object or list of available objects
   sobject   Manage standard & custom objects
   bigobject  Manage big objects
   field     Manage sobject fields
   record    Create, modify, or view records
   bulk      Load csv file use Bulk API
   fetch     Export specified artifact(s) to a local directory
   import    Import metadata from a local directory
   export    Export metadata to a local directory
   query     Execute a SOQL statement
   apex      Execute anonymous Apex code
   trace     Manage trace flags
   log       Fetch debug logs
   eventlogfile  List and fetch event log file
   oauth     Manage ConnectedApp credentials
   test      Run apex tests
   security  Displays the OLS and FLS for a given SObject
   version   Display current version
   update    Update to the latest version
   push      Deploy artifact from a local directory
   aura      force aura push -resourcepath=<filepath>
   password  See password status or reset password
   notify    Should notifications be used
   limits    Display current limits
   help      Show this help
   datapipe  Manage DataPipes
Run 'force help [command]' for details.

Access All Areas

By executing ‘./force’ I’m asking the operating system to execute the file named force in the current directory (the ./ part). This is going to be pretty unwieldy if I have to provide the location each time, so the last thing to do to get set up is to add the tools directory to my path - if you chose a different directory name to tools, just alter the following commands to use your name.

The default shell on OSX is bash (bourne again shell, as it was a replacement for the venerable Bourne shell - this is both a blog post and a history lesson!). When you login (or open a new terminal window on Mac OSX) the bash shell executes the .bash_profile script from your home directory, if one exists, so this is the easiest way to update your path. 

Using your favourite editor (mine is vim, but that’s a bit like saying the developer console is your favourite IDE), open the .bash_profile file from your home directory (/Users/<yourname>), or create one if it isn’t already there, and add the following at the bottom:

export PATH=$PATH:~/tools

Save the file, then execute the .bash_profile script to update your current context:

> . ~/.bash_profile
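Before going further you can sanity-check that the directory actually made it onto the path. This sketch just re-applies the export in the current shell and prints the final path entry, which should be the tools directory:

```shell
export PATH=$PATH:~/tools
# Show the last entry on the path - expect the expanded tools directory
echo "$PATH" | tr ':' '\n' | tail -1
```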

Next, change to your home directory and execute the force command without a path to check all is good:

> cd
> force

and once again you should see the full output.

Congratulations - now you can run the force executable from anywhere without worrying about where it has been installed.

Logging In

Now that everything is set up, you can login to Salesforce via the CLI by executing :

> force login

This opens up a browser window for you to enter your credentials and authorise access via oauth.

(Note that this process involves a Heroku app communicating with a local web server to return the oauth token to the command line, which may be problematic if you are behind a firewall that you can’t configure or using NAT. See the end of this post for an alternative, non-oauth approach).

This is the number 1 reason why I use the Force CLI for deployments - it removes the need to store usernames and passwords on disk, which has always been my issue with the Force Migration Tool.

If you have a number of orgs it’s quite easy to forget which one you have logged into, which is where the logins command comes in handy, as it will show you all the orgs that you have logged in to and which login is currently active:

> force logins


Non-OAuth Authentication

If you can’t use the oauth mechanism described above, you can always fall back on supplying the username and password to the CLI via the following command:

> force login -u=<username> -p=<password>

While this is slightly better than storing credentials on disk, it does mean that if someone were to compromise your machine they’d be able to see the username and password in your command history. For example, after executing:

> force login -u=fake@fake.fake -p=Faking1t

If I then look at my command history:

> history

This shows the credentials:

  685  force login -u=fake@fake.fake -p=Faking1t
  686  history

Luckily I can clean up afterwards and remove the command from my history. First, delete the entry:

> history -d 685

then write the history to disk:

> history -w

Viewing the history again shows that entry 685 has been replaced with the entry that was at 686:

685  history

Logging in to a Sandbox

The login command supports production orgs, sandboxes and custom domains. To view the options for this, or any other command, execute ‘force <command> -help’:

force login -help

Usage: force login
  force login [-i=<instance>] [<-u=username> <-p=password> <-v=apiversion>]
    force login
    force login -i=test
    force login -u=un -p=pw
    force login -i=test -u=un -p=pw
    force login -u=un -p=pw

That’s all Folks

This concludes getting started with the Force CLI. Assuming all went well you now have the CLI installed, in your path and you’ve been able to login to an org. In the next post I’ll take a look at interacting with your metadata.



Saturday, 7 May 2016

Salesforce - It's not just for Ladies




This week I (and many, many others - check the picture above!) attended the “An Evening with Ladies who Salesforce” event at Salesforce Tower in London, organised by the team behind the London Salesforce Women in Tech - Freya Crawshaw, Louise Lockie and Jodi Wagner. This was a vertical journey for me, as the BrightGen offices are on the 18th floor!


The keynote talk was from Anne-Marie Imafidon, Head Stemette. Stemettes are doing an amazing job of enabling the next generation of girls into STEM (Science, Technology, Engineering and Maths). Note that I’ve chosen the word enabling rather than encouraging deliberately - part of Anne-Marie’s slot included a video from the Outbox Incubator, and it’s clear that these girls already have an enormous amount of interest and enthusiasm (and energy!) for technology and development.

Panel Time

The second part of the evening was a panel discussion on career pathing in the Salesforce ecosystem, with an impressive panel lineup including Salesforce MVP Jenny Bamber, Certified Technical Architect Doina Figural (the first female CTA outside of the US and one of only three worldwide) and Salesforce EVP and GM of App Cloud, Adam Seligman.

I found this an enlightening discussion, as it covered a lot of areas that I’ve never really thought about before - planning where you want to be and structuring an approach to get there, for example. My own “career” (it has taken me a long time to accept that I have one, as opposed to a series of jobs) has tended to be more about following the tech that I’m interested in, so I’ve worked for a number of small companies that either haven’t made it or have turned into very different places from the one I joined. When that has happened I’ve moved on to the next one, usually taking a pay cut and dropping down from a senior to a mid-level position while I learn the ropes.

Only once or twice, when this has happened in the midst of a recession for example, have I actually worried about whether I’d be able to find another position, so it was very interesting to hear from people who have had to take a very different approach to their career, purely because through an accident of birth they happen to be a different gender.

One of the questions from the audience was around how companies can attract more women to apply for their open positions - something very close to my heart, as I’ve been on a long journey to attract the first female developer to BrightGen, which finally came to an end in early 2016. It wasn’t that we were interviewing females and rejecting them, but we were getting very few CVs, and understandably many of those that we interviewed weren’t keen on being the one and only female writing code. One panelist’s answer was around what you tell the recruitment agents in terms of the diversity that you want in the CVs that they send you. This resonated enormously, as it had never occurred to me that I have to do anything other than ask agents to send me their best CVs.

It’s not just for Ladies!

So what was I doing there? The clue is in the name - An Evening with Ladies who Salesforce, not for Ladies who Salesforce! I wasn’t the only man either - there were three or four others outside of Adam on the panel and various Salesforce representatives. It was unusual, shall we say, being at a Salesforce community event in London where I was very much in the minority, but how dull would life be if everything was always the same.

So I’d recommend these events to any men in the Salesforce ecosystem, especially those hoping to lead - it’s vital to understand things from the viewpoint of others, and the best way to achieve this is to listen to others. If you are worried about being the only man there, tough! We expect the women that attend the developer events to be able to handle being in the minority (and at times flying solo), so we should be able to handle this when the roles are reversed.


Saturday, 16 April 2016

The Hurt Locker




There’s a new Sheriff in Lightning Town and its name is the LightningLocker. The purpose of this is to tighten up security around Lightning Components, something that has been mentioned in the documentation, usually along the lines of “you shouldn’t do this, and while it might work at the moment, we may stop it in the future”. Well, the future starts in Summer 16, when the LightningLocker becomes available as a critical update for existing orgs and the default for new orgs. I’ve always suspected that Lightning Components present a real concern for the Salesforce security team, as our JavaScript co-exists with Salesforce JavaScript and has the same powers - well, not any more! This is something to be welcomed, not only from the perspective of stopping nefarious JavaScript from causing problems, but also because a lot of developers will be learning JavaScript on the job as they develop their components, and it’s pretty much a given there will be some unintended consequences, which may be severe.

But What if I Really, Really Need To ...

Now obviously I’m not Salesforce, so I can’t say for sure if they will make any exceptions, but I’d be pretty surprised if they do. The Locker isn’t closing down loopholes because they are worried the world may be blinded by the sheer awesomeness of your solution. Loopholes are being closed because they are a security risk, and whitelisting applications to punch holes in security is unlikely to make customers feel safe.

It’s So Unfair - I Hate You!

It isn’t really - we all knew this was coming. I view it in a similar vein to JavaScript in Home Page Components - if you were told not to do something and did it anyway, don’t expect any sympathy when the capability is taken away (and yes, I know I was wrong about Salesforce providing a replacement - that should have been customising standard pages with Lightning Components which has taken way longer to arrive than I expected - currently still in pilot for record home pages!). That said there are a couple of things that concern me:

  1. This should have been in place from the start
    It’s all well and good putting warnings in the docs, but out in the real world people have hard deadlines and requirements, and not everyone reads the docs before making a questionable decision. I’m sure there are some customers out there who have components where the original developer has since departed or was never on staff. If their components are using anything that the Locker takes issue with, they could lose key functionality without the in-house skills to replace it.

    If you think about it, this is like carrying out a Salesforce implementation and giving every user System Administrator privileges with a warning not to rely on them, then coming back after a year or two and changing them to a Standard User. While you can say that you warned them, and it’s not your fault if their business processes relied on their elevated privileges, you aren’t going to be popular.
  2. Should we expect breaking changes?
    I’m raising this as it’s something which I think I’m likely to fall foul of. To use inheritance in Lightning Components I’m making use of the component.getDef() method to access the ComponentDef and then executing the getHelper() method. However, ComponentDef has been removed from the Lightning documentation app, which suggests that it isn’t supported - per Skip Sauls’ reply in this thread:

    "JavaScript source is not documentation, and does not indicate support. Please refer to the docs for the official API. Use of unsupported components, events, or APIs is a violation of the terms of service, and your code may break in a release, including patch."

    But here’s the rub - this did appear in the documentation, and then it went away. So where does that leave me? I’ve raised this on Stack Exchange where I know a lot of the Lightning team lurk, so if I get an answer I’ll update this post.

  3. I Want Workarounds!
    Thus far I’ve seen mentions of tools, blogs, articles and Trailhead modules to help us get to grips with this. What I haven’t seen mention of is alternatives. If I’ve gone to the open source Aura project to figure out how to do something, and that turns out to be a private API, I’d like a workaround providing equivalent functionality rather than just being told I can’t use it any more. I only resort to the Aura source if the documentation app isn’t helpful (sadly this happens more than you’d expect, as there are still a number of entries that just have the method signature/component name and no additional information).

There’s an app for that 

If news of the Locker makes you fear for your code, there’s an app (or more accurately, a command line tool) that can give you some succour - the Lightning CLI. I’ve run this on some of my components and it has raised a few things that I need to take care of (although it doesn’t have a problem with the ComponentDef mentioned above, but I’m not reading too much into this at the moment).

The Hurt Locker? Seriously?

Like I could leave that alone!



Wednesday, 13 April 2016

The Case of the Missing Mascot




There's just over a day to go to help the Trailhead team in the search for missing mascot Astro, so if you haven't already pitched in, get yourself over to the Where’s Astro module, where you can earn the badge and even get the chance to win a prize*

Where’s Astro?

This is a somewhat different module to the others - rather than learning about a particular feature of Salesforce and then answering some questions or completing a challenge, you are instead helping to solve a mystery (where is Astro) by searching for clues in the Trailhead content.

The clues will direct you to other modules, so if you don't have the badges for those (really, you don't have them all?) it’s a perfect opportunity to improve your Salesforce knowledge while investigating. If I'd been setting this up we'd have had to complete the modules, but the Trailhead team were clearly overwhelmed with worry about their missing mascot and went easy on us.

Of course, it wouldn't be Trailhead if you didn't have to carry out some setup and configuration, so you can expect to track the progress of your investigation in a Salesforce Developer Edition, much like Sherlock Holmes would, if the modern version wasn't on the BBC and thus not allowed to advertise products :)

There's no coding involved, and the setup is pretty straightforward, so this is very much a module for all (or no!) skill sets - when a mascot's life hangs in the balance you need all the help that you can get, especially people that aren't averse to leaving their computer and going outside!

Keeping Everyone in the Loop

One aspect that I particularly liked was the tweet buttons at various points that posted out cryptic updates about progress -

a great way to get others interested and let the concerned population know that the search continues at full steam. Continuing with the Sherlock Holmes association, this is the modern equivalent of the agony columns that featured in many of the stories.

Leaving Everyone Hanging

The first rule of earning the Where’s Astro badge is that you don’t talk about the outcome of the investigation. If you don’t complete this module yourself you’ll have to wait until the Trailhead team spills the beans some time after the competition closes, and I hope you can live with yourself in the meantime.

As mentioned earlier, the search finishes on April 14th, so get your magnifying glass and deerstalker hat and join in while there is still time.

* Confession time - I didn't realise this was a competition until earlier today. What happened, as is always the case, is that I saw a new badge was available and I had to have it :)

Saturday, 9 April 2016

Fantasy Trailhead #1 - Maximum Damage

Fantasy Trailhead #1 - Maximum Damage

(Note - this is a humorous post - if you follow any of the advice below your experience will be sub-optimal, but also hilarious, so make sure to let me know just how bad things turned out)



After finishing up my 102nd Trailhead badge a couple of weeks ago, I started thinking about badges that I'd like to see in the future. Before long I'd moved on to thinking about badges that we're never likely to see - ones awarded for doing things badly as opposed to well. To that end, allow me to present my first Fantasy Trailhead - Maximum Damage


A Trail for Everyone

As everyone involved in implementing or maintaining a Salesforce instance has an opportunity to cause damage, this Trail covers the whole range of skills - Admin, Developer and Architect. The aim is to answer questions/implement challenges in the way that would cause maximum damage to an instance. 

Clicks not Solutions

Admins are every bit as powerful as developers when it comes to breaking Salesforce, and in the Maximum Damage trail they would be expected to understand not just the terrible impact of their changes on the Salesforce platform, but also the deleterious effects on their business as a whole. Here’s an example question:

“A user would like a mechanism to ensure that they cannot save a Contact record without populating the Other Phone number. No other users are expecting this change and in the majority of cases this information will not be available”.

In a normal universe this request would be rightly rejected, but in the Maximum Damage Trail we leap on it with gusto. The answer options are as follows, along with an explanation of why they are not the right (or the most wrong!) solution:

  1. Add a validation rule that checks if the Contact is being created by the particular user that requested the change; if it is, ensure the Other Phone field is populated.
    This is a reasonable solution to the problem, and therefore not what we are looking for in Maximum Damage - there are no additional unwanted features and other users won’t even know the change has been applied.
  2. Make the Other Phone field required on the Contact page layout
    This is better, as users who do not want this change are being affected by it and their working lives have been made a little more difficult. One downside to this is that no existing integrations have been broken, as this only affects the UI.
  3. Make the Other Phone required through Field Level Security
    Now we are starting to see some serious damage - integrations and automated loads of Contacts are likely to be impacted, but this is still relatively localised to Contact creation. Is there any more that can be done to cause problems for the wider business?

  4. Make the User a System Administrator so that they can make the Other Phone required through Field Level Security
    This is taking the long view with regard to ruining your implementation - all the downsides of answer 3, plus the user is now empowered to make whatever changes they like to any part of the system, hopefully without any idea of the impact of their changes. It might seem that we’ve taken this as far as we can, but things could be worse.
  5. Share your username and password with the user so that they can make the Other Phone required through Field Level Security
    This is the gold standard - all of the downsides of the previous options, plus a lack of accountability. When the user makes the inevitable badly thought out change, nobody will know it was them and it will look like you did it. This will muddy the waters nicely, as your co-administrators will assume you had a good reason and hopefully spend valuable time trying to understand why.

Future-Resistant Code

The Developer module in the Maximum Damage Trail is designed to test your ability to produce the worst possible coded solution to a problem that didn’t need code in the first place. Rather than taking the YAGNI (You Ain’t Gonna Need It) approach of not worrying about future requirements, solutions here use a YAGGI (You Ain’t Gonna Get It) approach, producing code that cannot be extended in the future.

This module is more challenge based, and looks for solutions with the following attributes:

  • Replicating standard platform functionality, badly
  • Use of hardcoded Ids wherever possible
  • Only capable of handling a single record, to ensure that data migration is as painful as possible
  • SOQL queries and DML statements inside loops (preferably loops that are nested for no good reason)
  • No unit test coverage
  • Any inputs other than those specified in the challenge cause the code to fail
  • Consuming as many governor limits as possible without breaching them, to ensure that the code cannot be added to an existing business process that uses code
  • No error checking or validation of inputs
  • Empty catch blocks liberally used to swallow any exceptions and plough on regardless
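To make the challenge concrete, here's a hypothetical trigger sketch (the Id and names are invented for illustration - please don't deploy this anywhere you care about) that manages to tick most of these boxes at once:

```apex
// "Maximum Damage" reference solution - every anti-pattern in one place
trigger MaximumDamage on Contact (before insert) {
    // Only handles a single record - bulk data loads will silently misbehave
    Contact con = Trigger.new[0];

    // Hardcoded Id - guaranteed to break in any other org or sandbox
    con.AccountId = '0018000000Q9v1VAAR';

    // 99 iterations - just under the 100 SOQL query governor limit,
    // leaving nothing for any other code in the transaction
    for (Integer i = 0; i < 99; i++) {
        try {
            // SOQL inside a loop, burning queries for no reason
            List<Account> accs = [select Id from Account limit 1];
            // DML inside the same loop for good measure
            update accs;
        } catch (Exception e) {
            // Empty catch block - swallow the exception and plough on regardless
        }
    }
}
```

Naturally there is no unit test coverage, so every deployment is an adventure.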

Technical Incompitecht

The architect module of Maximum Damage looks for decisions that will not only inhibit scale, but also maximise costs throughout the lifetime of the implementation. Fittingly, I’m not sure how well this would scale from a marking perspective - there would be some subjectivity here, so the answers would probably need manual marking. The Maximum Damage Trail also takes down the author!

Key features of the solution architecture include:

  • Overloading a standard object to provide custom functionality
    This would involve creating a number of new fields which have no relation to the standard object and removing any standard fields from the page layout. Record types must be avoided, as this provides an unwanted degree of separation. Using a standard object in this way maximises the license cost, and the standard object should preferably be one requiring an additional, paid, feature license.
  • Using a Community to do a Site’s job
    It goes without saying that named Partner Community licenses should be used, especially if high volumes are required as this will guarantee a scalability problem before too long.
  • Complex security and sharing
    Org wide defaults should always be Private. If public access is required this should be implemented by multiple sharing rules at the lowest possible level of the role hierarchy. Territory management is a must, unless the requirements indicate territory management, in which case it should be avoided at all costs.
  • Little or no Change Control
    An architect should not miss the opportunity to make the development lifecycle harder than it needs to be, so a change control process involving manual replication of configuration and hand-deployment of code via the IDE will score highly.
  • Centre of Negligence
    The antithesis of a Centre of Excellence, an architect is expected to identify a governance framework that leads to isolated decision making with no idea of the needs of other business units or the organisation as a whole.

Only Joking … Or Am I?

While I put this post together for a bit of fun, I think there is value in being able to identify the worst solution from a selection of bad options. Most test/exam questions have one correct answer and the others, while plausible, will be flawed in some way. This means that you can figure out the correct answer even if you don’t know it (I’m fond of quoting Sherlock Holmes in this regard - “When you have eliminated the impossible, whatever remains, however improbable, must be the truth”).

Being presented with an entire list of flawed options and having to choose the one which will cause the most problems both now and in the future requires an in-depth understanding of how customisations affect the Salesforce platform, and how seemingly minor decisions can cause major problems in the future.

I have a couple more ideas for Fantasy Trailhead, so stay tuned for more posts to set you on the path to failure.