Friday, 14 October 2016

Visualforce Development Cookbook Second Edition




As regular readers of this blog will know, I wrote a book three years ago - the Visualforce Development Cookbook - and I really haven't stopped banging on about it since. Well, things just got worse as I finished the second edition in September, and I'm obviously keen to shift as many copies as possible!

What’s a Second Edition?

A second edition isn't a new book. Instead it's an iteration on the previous version, minus content that is no longer relevant and including new features, so the first job is to identify what should be cut and what should be added.

One of the nice things about working on Salesforce is that over time new features are added that remove the need for custom code, so I was able to chop out a few recipes that are now handled natively by the platform. The next job is to update all existing code to the latest API version - something that typically happens several times, as Salesforce produces releases faster than I can write a book (even a second edition!).

Then it's back to the writing - for existing recipes, making sure that the text is still appropriate, and creating new recipes from scratch. This was pretty compressed this time around, as Packt proposed a pretty aggressive timescale to complete before Dreamforce, which I was more than happy to sign up to as I knew that event would take up all of my time through to mid-October. I wouldn't recommend this if you don't have some vacation time you can dedicate to writing though, as it would have been pretty tough to complete with only evenings and weekends available.

What’s changed?

The main changes in the second edition are:

  • The chapter on sites has been reworked to use the Salesforce Lightning Design System
  • The chapter on jQuery Mobile has been replaced with Salesforce1 recipes.
  • A new chapter on Troubleshooting has been added, covering debugging, managing the view state and advice on avoiding common problems.
  • All the retained recipes have been revised as appropriate.

You want me to buy it again?

Now I'm not like the music industry that expects you to buy the same music every time they think up a different format, and I can understand that for someone who already purchased the first edition, this might not represent enough of a change to justify buying the second edition (although it was for Fabrice Cathala, who is clearly going for the full set - thanks Fabrice!).

I raised this with the publishers and they gave me a couple of discount codes to use, both valid until 17th October, which you can use on the Packt website:

  • Use code VISDSE50 for 50% off the Ebook
  • Use code VISDSE15 for 15% off the print book

Note that you don't need to have purchased the first edition to use these codes, you can simply use them to scoop up a bargain.

Show me the code!

For those who just need the code and no explanation, it's available free of charge on GitHub at:

A word of advice though - best not come to me asking for help around adapting the recipes or producing unit tests if you've decided not to purchase the book - that would be asking me to cannibalize my own sales!



Friday, 23 September 2016

Force CLI Part 4 - Deploying Metadata




In previous posts in this series I’ve covered Getting Started, Extracting Metadata and Accessing Data via the Force CLI. In this post I’ll cover the functionality that I use probably 95% of the time that I use the Force CLI - deploying metadata from the local filesystem to a Salesforce instance.

The Building Blocks

Like other tools for deploying to Salesforce (e.g. the migration tool, the Eclipse IDE) there are two items required to deploy via the Force CLI:

Project Manifest (aka package.xml)

The package.xml file defines the metadata components that will be deployed - you can find out more about this in the Salesforce Metadata API Developer’s Guide

Metadata Components

The metadata components need to appear in the directory structure expected by the metadata API - e.g. Apex classes in a directory named 'classes', Visualforce pages in a directory named 'pages'.

I don’t make the rules (but I do make the deployment directory)

The rule when deploying is that the components specified in the package.xml manifest file need to match those present in the metadata components directory structure. If you specify components that are missing you will get an error, and if you provide additional components that aren’t mentioned in the package.xml, you’ll get a different flavour of error.

My metadata comes from GIT, so the metadata components structure of my entire project is defined there. However, I don't always want to carry out a full deployment, so I create a new structure in a temporary directory and create a package.xml file reflecting that. I also use a naming convention for my temporary directory so that I can look back at the order that I did things in - drop_<YYYYMMDDhhmmss>, where YYYYMMDD is the year/month/day of the date and hhmmss is the hours/minutes/seconds of the time. The reason I go with this naming convention is that when I use the ls command, my directories are listed in order of creation:

$ ls -l

drwxr-xr-x  6 kbowden  wheel    204 16 Sep 14:39 drop_20160916141152
drwxr-xr-x  6 kbowden  wheel    204 17 Sep 10:11 drop_20160916184125
drwxr-xr-x  6 kbowden  wheel    204 17 Sep 12:45 drop_20160917121232
drwxr-xr-x  6 kbowden  wheel    204 18 Sep 12:56 drop_20160918092957
drwxr-xr-x  6 kbowden  wheel    204 18 Sep 19:21 drop_20160918185148
drwxr-xr-x  6 kbowden  wheel    204 20 Sep 15:39 drop_20160920152837
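If you're creating these directories by hand, a one-liner along these lines generates a correctly-named one (a sketch - only the naming convention comes from above):

$ mkdir drop_$(date +%Y%m%d%H%M%S)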

To use a very simple example, here’s the directory structure to deploy just the User object. My starting directory is /tmp/drop_20160923080432:
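The original post showed this as a screenshot; as a sketch, the layout would be along the following lines (the objects directory name is mandated by the metadata API, while the User.object filename is my assumption based on deploying just the User object):

/tmp/drop_20160923080432
└── target
    ├── objects
    │   └── User.object
    └── package.xml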


My package.xml file just contains an entry to deploy the objects directory (the metadata type is called CustomObject, but it covers both standard and custom objects):

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>User</members>
        <name>CustomObject</name>
    </types>
    <version>36.0</version>
</Package>

Deploy all the Metadata

Once the directory structure and manifest are in place, use the force import command to deploy your metadata, using the -d switch to specify the directory containing the metadata:

force import -d /tmp/drop_20160923080432/target

The Force CLI will update you every five seconds on progress

Not done yet: Queued  Will check again in five seconds.
Not done yet: InProgress  Will check again in five seconds.
Not done yet: InProgress  Will check again in five seconds.
Not done yet: InProgress  Will check again in five seconds.
Not done yet: InProgress  Will check again in five seconds.
Not done yet: InProgress  Will check again in five seconds.

Once the deployment is complete, you'll get details of successes/failures - the default behaviour is simply to output the number of each:

Failures - 0

Successes - 24
Imported from /tmp/drop_20160923080432/target

While if you specify the -v switch, you'll get full details of each success/failure:

Successes - 24
	status: unchanged
	status: unchanged
	status: unchanged


Deploying to Production

As everyone in the world knows, in order to deploy metadata to production you need to run some unit tests. The Force CLI provides support for this via the -l switch, which takes the same options as the migration tool:

  • NoTestRun 
  • RunSpecifiedTests
  • RunLocalTests
  • RunAllTestsInOrg
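For example, a production deployment of the earlier directory that runs all local tests would look along these lines (a sketch based on the switches described above):

force import -d /tmp/drop_20160923080432/target -l RunLocalTests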

Verification Only

Another very useful switch is -c. This carries out a verification only deployment and rolls back all changes upon completion. I use this as part of our continuous integration environment to make sure I can successfully deploy to a clean development environment.
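Combining the switches, a verification-only deployment that also executes local tests would look something like this sketch:

force import -d /tmp/drop_20160923080432/target -l RunLocalTests -c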



Sunday, 28 August 2016

Stand and Deliver with Salesforce Geocoding



The Summer '16 release of Salesforce introduced Automatic Geocodes for Addresses which, as the name suggests, allows you to geocode records with addresses automatically. To enable this, just go to Setup, type 'Clean' into the search and click on Clean Rules from the resulting setup nodes. Then activate one or more of the clean rules and wait a few minutes. Much easier than setting up a batch job to integrate with Google and then finding yourself in breach of the license terms!

In conjunction with the Geolocation API, this new feature allows you to build some cool applications without too much effort. In this post I’ll show how to combine these two technologies to build an application that assigns a delivery to the nearest user (driver, bike messenger etc) to the collection location.

User Check In

The functionality to allow users to check in (record their location) is provided by a Lightning Component. The component includes the Lightning Design System (yeah, I know it's part of the platform now, but including it myself means that I can also use this component in Lightning Out) and, after the CSS has loaded, uses the geolocation API to capture the current location:

    doInit : function(cmp, ev) {
        var self=this;
        if (navigator.geolocation) {
            navigator.geolocation.getCurrentPosition(
                function(result) {self.gotGeoCode(result, cmp, self);},
                null,
                {maximumAge: 0,
                 timeout: 30000,  // value assumed - the original only shows the option name
                 enableHighAccuracy: true});
        }
    }

The parameters passed to the geolocation function are as follows:

  • maximumAge - the maximum age of a cached position. Setting this to zero ensures that a new position is captured
  • timeout - the maximum number of milliseconds to wait for a response
  • enableHighAccuracy - requests that the most accurate position is captured. A request specifying this typically takes longer and consumes more power, and may be denied by the hardware. It’s just a suggestion really.

Once captured, a server-side Apex action is invoked to store the position in a custom field on the user record, along with a timestamp of when it was captured:

    @AuraEnabled
    public static String StoreUserLocation(Decimal latitude, Decimal longitude) {
        String result='SUCCESS';
        try {
            User u=new User(id=UserInfo.getUserId(),
                            Location__Latitude__s=latitude, Location__Longitude__s=longitude,
                            Location_Timestamp__c=System.now());
            update u;
        }
        catch (Exception e) { result=e.getMessage(); }
        return result;
    }

Note that I don’t need to retrieve the user record from the database to update it, I can just instantiate a new User object in memory and specify the id based on the UserInfo global - always worth saving a SOQL call if possible.

The component in action is shown in the screen capture video below:

Delivery Request

The flip side of the user check in is the delivery request. This is where a customer requires something to be collected from one (their) account and delivered to another. As this isn’t something that is likely to be requested on the fly, this is handled by a Visualforce page, although still using the Lightning Design System to style a Visualforce form.

To create a new delivery, the customer (me in this case) specifies the From and To account and any specific delivery instructions. I’m sending a package to Universal Containers as they seem to have a new social media team and I am keen to engage with them:

[Screenshot: new delivery request form]

The package is being sent from Sea Palling Beach, where myself and the Jobs Admin have recently checked in:

[Screenshot: recent check-ins at Sea Palling Beach]

When the record is saved, the controller sets the owner to the user who has most recently checked in to a location within 50 miles - if no such user can be found, the owner will be left as the user that created the record:


    public PageReference save() {
        Account acc=[select ShippingLatitude, ShippingLongitude
                     from Account
                     where id=:delivery.From__c];
        List<User> users=[SELECT id, Location__c, Location_Timestamp__c
                          FROM User
                          WHERE DISTANCE(Location__c, GEOLOCATION(:acc.ShippingLatitude, :acc.ShippingLongitude), 'mi') < 50
                          ORDER BY Location_Timestamp__c DESC];
        if (users.size()>0) {
            delivery.OwnerId=users[0].id;
        }
        insert delivery;
        ApexPages.StandardController std=new ApexPages.StandardController(delivery);
        return std.view();
    }

So even though I created the delivery request, it was assigned to the Jobs Admin user as they checked in more recently:

[Screenshot: delivery record assigned to the Jobs Admin user]

That’s It?

That is indeed all there is to it - a couple of custom fields on the user object, a custom object, one Lightning component, one Visualforce page and two Apex classes. If you are so inclined, you can view the full code at the GitHub repository.


Saturday, 16 July 2016

Force CLI Part 3 - Accessing Data




In part 2 of this series, Extracting Metadata, I covered how to extract configuration data from your Salesforce instance, which is something pretty much every other deployment tool offers. A key difference of the Force CLI is that you aren't limited to configuration - you can retrieve and update your Salesforce data from the command line or a script. Very useful if you have to carry out some data migration as part of a deployment, for example.

NOTE: In the examples below the commands are split over multiple lines for readability, but should be entered on a single line if you are trying them out for yourself.

Executing SOQL

The command for executing SOQL is force query, followed by the SOQL string enclosed in quotes:

> force query "select id, Name, Industry from Account 
order by CreatedDate limit 5"

The default format is ascii table style:

 Id                 | Industry       | Name
 0018000000Q9v1VAAR | Construction   | Pyramid Construction Inc.
 0018000000Q9v1WAAR | Consulting     | Dickenson plc
 0018000000Q9v1XAAR | Hospitality    | Grand Hotels & Resorts Ltd
 0018000000Q9v1YAAR | Transportation | Express Logistics and Transport
 0018000000Q9v1ZAAR | Education      | University of Arizona
 (5 records)

which is fine for displaying data on the screen, but not that easy for processing. Luckily the output format can be changed via the -format switch.

CSV Formatting

Specifying csv as the value of the format switch outputs the results of the query in the familiar comma separated format, along with a list of headers:

> force query "select id, Name, Industry from Account 
order by CreatedDate limit 5" -format:csv
"0018000000Q9v1VAAR","Construction","Pyramid Construction Inc."
"0018000000Q9v1WAAR","Consulting","Dickenson plc"
"0018000000Q9v1XAAR","Hospitality","Grand Hotels & Resorts Ltd"
"0018000000Q9v1YAAR","Transportation","Express Logistics and Transport"
"0018000000Q9v1ZAAR","Education","University of Arizona"

While this is more useful for processing, it doesn't work well with subqueries:

> force query "select id, Name, (select id, FirstName from Contacts) 
from Account order by CreatedDate limit 5" -format:csv
"Contacts","Id","Name" "[map[Id:0038000000asT8TAAU FirstName:Pat]]","0018000000Q9v1VAAR","Pyramid Construction Inc." "[map[Id:0038000000asT8UAAU FirstName:Andy]]","0018000000Q9v1WAAR","Dickenson plc" "[map[Id:0038000000asT8VAAU FirstName:Tim] map[Id:0038000000asT8WAAU FirstName:John]]","0018000000Q9v1XAAR","Grand Hotels & Resorts Ltd" "[map[Id:0038000000asT8ZAAU FirstName:Babara] map[FirstName:Josh Id:0038000000asT8aAAE]]","0018000000Q9v1YAAR","Express Logistics and Transport" "[map[Id:0038000000asT8bAAE FirstName:Jane]]","0018000000Q9v1ZAAR","University of Arizona"

JSON Formatting

Specifying json as the format switch outputs the result in JavaScript Object Notation:

force query "select id, Name, (select id, FirstName from Contacts) 
from Account order by CreatedDate limit 5" -format:json
[{"Contacts":{"done":true,"records":[{"FirstName":"Pat","Id":"0038000000asT8TAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8TAAU"}}],"totalSize":1},"Id":"0018000000Q9v1VAAR","Name":"Pyramid Construction Inc.","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1VAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Andy","Id":"0038000000asT8UAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8UAAU"}}],"totalSize":1},"Id":"0018000000Q9v1WAAR","Name":"Dickenson plc","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1WAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Tim","Id":"0038000000asT8VAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8VAAU"}},{"FirstName":"John","Id":"0038000000asT8WAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8WAAU"}}],"totalSize":2},"Id":"0018000000Q9v1XAAR","Name":"Grand Hotels \u0026 Resorts Ltd","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1XAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Babara","Id":"0038000000asT8ZAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8ZAAU"}},{"FirstName":"Josh","Id":"0038000000asT8aAAE","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8aAAE"}}],"totalSize":2},"Id":"0018000000Q9v1YAAR","Name":"Express Logistics and Transport","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1YAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Jane","Id":"0038000000asT8bAAE","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8bAAE"}}],"totalSize":1},"Id":"0018000000Q9v1ZAAR","Name":"University of Arizona","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1ZAAR"}}]

suitable for processing by a myriad of tools.

Prettified JSON

For the best of both worlds - JSON formatted for automated processing but laid out so that humans can read it, specify json-pretty as the format switch:

> force query "select id, Name, (select id, FirstName from Contacts) 
from Account order by CreatedDate limit 5" -format:json-pretty
[ { "Contacts": { "done": true, "records": [ { "FirstName": "Pat", "Id": "0038000000asT8TAAU", "attributes": { "type": "Contact", "url": "/services/data/v36.0/sobjects/Contact/0038000000asT8TAAU" } } ], "totalSize": 1 }, "Id": "0018000000Q9v1VAAR", "Name": "Pyramid Construction Inc.", "attributes": { "type": "Account", "url": "/services/data/v36.0/sobjects/Account/0018000000Q9v1VAAR" } }
<removed for clarity>

Working with Records

If you know the ID(s) of the records you need to work with, the force record command allows you to access and manipulate record instances.

Retrieving Records

To retrieve a record, execute force record get <type> <id> - this brings back all fields:

> force record get Contact 0038000000asT8TAAU
AccountId: 0018000000Q9v1VAAR
AssistantName: Jean Marie
AssistantPhone: (014) 427-4465
Birthdate:
CreatedById: 00580000001ju2CAAQ
CreatedDate: 2009-04-19T11:27:45.000+0000
Department: Finance
Description:
Fax: (014) 427-4428
FirstName: Pat

<removed for clarity>

SystemModstamp: 2013-01-10T16:17:39.000+0000
Title: SVP, Administration and Finance
attributes:
  type: Contact
  url: /services/data/v36.0/sobjects/Contact/0038000000asT8TAAU

Updating Records

To update a record, execute force record update <type> <id> <field>:<value>

> force record update Contact 0038000000asT8TAAU FirstName:"Patrick"
Record updated

I can then confirm that the record has been updated by retrieving it again and extracting the FirstName field:

>force record get Contact 0038000000asT8TAAU | grep FirstName
FirstName: Patrick

(The grep command extracts any lines in the output containing the search term - FirstName in this case.)

Creating Records

You aren't just limited to existing records with the Force CLI - you can also create them, either individually or en masse from a file.

To create a single record execute force record create <type> <field> <field> … :

> force record create Contact FirstName:"Keir" LastName:"Bowden"
Record created: 00380000023TUCxAAO

> force record get Contact 00380000023TUCxAAO

<removed for clarity>
FirstName: Keir
LastModifiedById: 00580000001ju2CAAQ
LastModifiedDate: 2016-07-16T10:48:38.000+0000
LastName: Bowden
 <removed for clarity>

To create multiple records execute force record create:bulk <type> <file> - the default format is CSV, but you can specify the content type of the file if it is different. Thus the following file, named contacts.csv:

"First","CLI Blog"
"Second","CLI Blog"
"Third","CLI Blog"

can be loaded as follows:

> force record create:bulk Contact contacts.csv

Batch 1 of 1 added with Id 75180000006NK0pAAG
Job created ( 75080000004fh7wAAA ) - for job status use
 force bulk batch 75080000004fh7wAAA 75180000006NK0pAAG

and I can verify the load using the query command introduced earlier:

> force query "select FirstName, LastName from Contact where LastName ='CLI Blog'"

 FirstName | LastName
 First     | CLI Blog
 Second    | CLI Blog
 Third     | CLI Blog
 (3 records)



Sunday, 26 June 2016

Force CLI Part 2 - Extracting Metadata




In Part 1 of this series, Getting Started, I covered downloading and installing the Force CLI, and logging in to Salesforce. This post covers extracting metadata from your Salesforce org.

Why do I need the Metadata?

This is a very good question - a lot of the time the answer is that you don't. If you do, it's highly likely that your IDE of choice will handle this for you - pulling down the metadata components that you are working on and uploading the changes when you save or deploy. The classic use case for this is to make a backup of your implementation, typically on a regular schedule in an unattended fashion.
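For unattended use you'd typically drive this from the scheduler. As a sketch, assuming the backup script from later in this post lives at /Users/kbowden/tools/backup_metadata.sh (a hypothetical name and location), a crontab entry to run it at 2am every day would be:

0 2 * * * /Users/kbowden/tools/backup_metadata.sh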


At first glance, this looks about as straightforward as it could be. Executing force help shows there is a command that looks tailor made:

export    Export metadata to a local directory

By default, this command dumps the metadata to a subdirectory of your current directory named (fittingly enough) metadata, although you can supply a different location for the output as an additional parameter.
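For example, to write the export somewhere other than ./metadata, something like the following should do it (a sketch - the destination path is purely illustrative):

force export /tmp/mybackup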

After logging in, I can export the metadata by executing the following command:

force export

However, at the time of writing this doesn't quite bring everything back. Here's the contents of my metadata directory - spot the deliberate mistake:

analyticSnapshots	homePageComponents	quickActions
applications		homePageLayouts		remoteSiteSettings
approvalProcesses	labels			reportTypes
assignmentRules		layouts			roles
autoResponseRules	networks		samlssoconfigs
callCenters		objectTranslations	scontrols
classes			objects			sharingRules
communities		package.xml		sites
components		pages			staticresources
connectedApps		permissionsets		tabs
dataSources		portals			translations
flows			profiles		triggers
groups			queues			workflows

Spotter's Badge for those eagle-eyed readers who wondered where the Lightning Components metadata directory (aka aura) is.

There’s currently no way to influence the metadata elements that are included by the export command, but luckily there is a mechanism to extract individual metadata component types.

Fetch! Good boy!

The fetch subcommand provides granularity around metadata extraction, not just down to a single type of metadata, but as far as a single metadata component. I’m not going down to that level, but if you are interested then execute force help fetch which gives full details of the options.

The fetch command takes a -t switch to identify which metadata type you want to extract. For all types you can specify the name of the metadata type as detailed in the Metadata API Developer’s Guide, so to extract the Sites metadata, for example, you would execute:

force fetch -t CustomSite

The metadata name for Lightning Components is AuraDefinitionBundle, but you can also specify Aura as the type, which will extract the Lightning Component metadata via the REST API - I tend to use the latter version as I find it is slightly faster than the metadata route.


force fetch -t Aura

extracts the Lightning Components to the metadata directory, and the entire Salesforce configuration is now in one place:

analyticSnapshots	homePageComponents	remoteSiteSettings
applications		homePageLayouts		reportTypes
approvalProcesses	labels			roles
assignmentRules		layouts			samlssoconfigs
aura			networks		scontrols
autoResponseRules	objectTranslations	sharingRules
callCenters		objects			sites
classes			package.xml		staticresources
communities		pages			tabs
components		permissionsets		translations
connectedApps		portals			triggers
dataSources		profiles		workflows
flows			queues
groups			quickActions

Putting these commands together in a bash script, with a dash of compression, gives me a handy backup file containing my metadata:

#!/bin/bash
force login
force export
force fetch -t Aura
zip -r metadata.zip metadata

Note - if you run the above commands directly from the command line, ignore the first one. That tells the OS which command to use to execute the script, which makes no sense for individual commands.

Version Control to Major Tom

Where this can be particularly powerful is if you integrate with version control.  You can use a script to periodically extract the latest metadata and merge into your Git repository for example.
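A minimal sketch of such a script, assuming your repository working copy lives at ~/repo with the metadata under src (both assumptions), and using only commands introduced in this series:

#!/bin/bash
# extract the full metadata, plus Lightning Components (assumes an active force login)
force export
force fetch -t Aura
# copy into the repository working tree and commit - paths are assumptions
cp -r metadata/* ~/repo/src/
cd ~/repo
git add -A
git commit -m "Metadata backup $(date +%Y%m%d%H%M%S)"

You'd typically follow this with a git push to your remote, and schedule the whole thing via cron as sketched earlier.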



Sunday, 29 May 2016

Force CLI Part 1 - Getting Started




In one of my earlier blog posts around Lightning Components deployment (Deploying Lightning Components - Go Big or Go Missing) I made a passing reference to the fact that I use the Force CLI for automated deployments. A few people contacted me after this and said that while they were vaguely aware of the CLI, they’d never spent any time with it, so I decided to show this unsung hero some love with a series of blog posts.

A Command Line Interface to Force.com!

As someone who spent 20 or so years developing on Unix and Linux, I’m pretty comfortable with command line tools, and in many cases (resolving git conflicts!) I actually prefer them to a GUI. So when the CLI was released at Dreamforce 13, I couldn’t wait to get hold of it.


The first thing to do is download the CLI from its home page, which has pre-built binaries for Mac OSX, Windows and Debian/Ubuntu. I'm a Mac user, so any instructions/examples are based on OSX.

[Screenshot: Force CLI download page]

Clicking on the ‘Download for Mac OS X 64’ link downloads the ‘force’ application to my ‘Downloads’ directory. I clear this directory out regularly (yeah right, regularly being when I run out of disk space) so the first thing I need to do is move this to a safe place. I create a tools directory underneath my home directory and drag the file into that:

Trying it Out

Via the terminal (if you aren't familiar with the terminal then I'd recommend following this tutorial from MacWorld) I can then test out whether the application works correctly:

> cd ~/tools
> ./force

and of course it doesn’t:

-bash: ./force: Permission denied

This is because files are downloaded without execute permission. To rectify this, execute:

> chmod +x ./force

Executing again now shows the (long) list of available options:

> ./force

Usage: force <command> [<args>]
Available commands:
   login     force login [-i=<instance>] [<-u=username> <-p=password>]
   logout    Log out from Force.com
   logins    List logins used
   active    Show or set the active account
   whoami    Show information about the active account
   describe  Describe the object or list of available objects
   sobject   Manage standard & custom objects
   bigobject  Manage big objects
   field     Manage sobject fields
   record    Create, modify, or view records
   bulk      Load csv file use Bulk API
   fetch     Export specified artifact(s) to a local directory
   import    Import metadata from a local directory
   export    Export metadata to a local directory
   query     Execute a SOQL statement
   apex      Execute anonymous Apex code
   trace     Manage trace flags
   log       Fetch debug logs
   eventlogfile  List and fetch event log file
   oauth     Manage ConnectedApp credentials
   test      Run apex tests
   security  Displays the OLS and FLS for a give SObject
   version   Display current version
   update    Update to the latest version
   push      Deploy artifact from a local directory
   aura      force aura push -resourcepath=
   password  See password status or reset password
   notify    Should notifications be used
   limits    Display current limits
   help      Show this help
   datapipe  Manage DataPipes
Run 'force help [command]' for details.

Access All Areas

By executing ‘./force’ I’m asking the operating system to execute the script named force in the local directory (the ./ part). This is going to be pretty unwieldy if I have to provide the location each time, so the last thing to do to get set up is to add the tools directory to my path - if you chose a different directory name to tools, just alter the following commands to use your name.

The default shell on OSX is bash (bourne again shell, as it was a replacement for the venerable Bourne shell - this is both a blog post and a history lesson!). When you login (or open a new terminal window on Mac OSX) the bash shell executes the .bash_profile script from your home directory, if one exists, so this is the easiest way to update your path. 

Using your favourite editor (mine is vim, but that’s a bit like saying the developer console is your favourite IDE), open the .bash_profile file, from your home directory (/Users/<yourname>), or create one if it isn’t already there and add the following at the bottom:

export PATH=$PATH:~/tools

Save the file, then execute the .bash_profile script to update your current context:

> . ~/.bash_profile

Next, change to your home directory and execute the force command without a path to check all is good:

> cd
> force

and once again you should see the full output.

Congratulations - now you can run the force executable from anywhere without worrying about where it has been installed.

Logging In

Now that everything is set up, you can login to Salesforce via the CLI by executing :

> force login

This opens up a browser window for you to enter your credentials: 

[Screenshot: Salesforce login page]

and authorise access via OAuth:

[Screenshot: OAuth authorisation screen]

(Note that this process involves a Heroku app communicating with a local web server to return the oauth token to the command line, which may be problematic if you are behind a firewall that you can’t configure or using NAT. See the end of this post for an alternative, non-oauth approach).

This is the number 1 reason why I use the Force CLI for deployments - it removes the need to store usernames and passwords on disk, which has always been my issue with the Force Migration Tool.

If you have a number of orgs it's quite easy to forget which one you have logged into, which is where the logins command comes in handy, as it will show you all the orgs that you have logged in to and which login is currently active:

> force logins

[Screenshot: force logins output showing the active account]

Non-OAuth Authentication

If you can’t use the oauth mechanism described above, you can always fall back on supplying the username and password to the CLI via the following command:

> force login -u=<username> -p=<password>

While this is slightly better than storing credentials on disk, it does mean that if someone were to compromise your machine they’d be able to see the username and password in your command history. For example, after executing:

> force login -u=fake@fake.fake -p=Faking1t

If I then look at my command history

> history

This shows the credentials:

  685  force login -u=fake@fake.fake -p=Faking1t
  686  history

Luckily I can clean up afterwards and remove the command from my history. First, delete the entry:

> history -d 685

then write the history to disk:

> history -w

Viewing the history again shows that entry 685 has been replaced with the entry that was at 686:

685  history

Logging in to a Sandbox

The login command supports production orgs, sandboxes and custom domains. To view the options for this or any other command, execute 'force <command> -help':

force login -help

Usage: force login
  force login [-i=<instance>] [<-u=username> <-p=password> <-v=apiversion>]
    force login
    force login -i=test
    force login -u=un -p=pw
    force login -i=test -u=un -p=pw

That’s all Folks

This concludes getting started with the Force CLI. Assuming all went well you now have the CLI installed, in your path and you’ve been able to login to an org. In the next post I’ll take a look at interacting with your metadata.



Saturday, 7 May 2016

Salesforce - It's not just for Ladies




This week I (and many, many others - check the picture above!) attended the "An Evening with Ladies who Salesforce" event at Salesforce Tower in London, organised by the team behind the London Salesforce Women in Tech - Freya Crawshaw, Louise Lockie and Jodi Wagner. This was a vertical journey for me, as the BrightGen offices are on the 18th floor!


The keynote talk was from Anne-Marie Imafidon, Head Stemette. Stemettes are doing an amazing job of enabling the next generation of girls into STEM (Science, Technology, Engineering and Maths).  Note that I’ve chosen the word enabling rather than encouraging deliberately - part of Anne-Marie’s slot included a video from the Outbox Incubator, and its clear that these girls already have an enormous amount of interest and enthusiasm (and energy!) for technology and development. 

Panel Time

The second part of the evening was a panel discussion on career pathing in the Salesforce ecosystem, with an impressive panel lineup including Salesforce MVP Jenny Bamber, Certified Technical Architect Doina Figural (the first female CTA outside of the US and one of only three worldwide) and Salesforce EVP and GM of App Cloud, Adam Seligman.

I found this an enlightening discussion, as it covered a lot of areas that I’ve never really thought about before - planning where you want to be and structuring an approach to get there for example. My own “career” (which it has taken me a long time to accept that I have one, as opposed to a series of jobs) has tended to be more about following the tech that I’m interested in, so I’ve worked for a number of small companies that either haven’t made it or have turned into very different places from the one I’ve joined. When that has happened I’ve moved on to the next one, usually taking a pay cut and dropping down from a senior to a mid-level position while I learn the ropes.

Only once or twice, when this has happened in the midst of a recession for example, have I actually worried about whether I’d be able to find another position, so it was very interesting to hear from people who have had to take a very different approach to their career, purely because through an accident of birth they happen to be a different gender.

One of the questions from the audience was around how companies can attract more women to apply for their open positions - something very close to my heart, as I've been on a long journey to attract the first female developer to BrightGen, which finally came to an end in early 2016. It wasn't that we were interviewing females and rejecting them, but we were getting very few CVs, and understandably many of those that we interviewed weren't keen on being the one and only female writing code. One panelist's answer was around what you tell the recruitment agents in terms of the diversity that you want from the CVs that they send you. This resonated enormously, as it had never occurred to me that I have to do anything other than ask agents to send me their best CVs.

It's not just for Ladies!

So what was I doing there? The clue is in the name - An Evening with Ladies who Salesforce, not for Ladies who Salesforce! I wasn’t the only man either - there were three or four others outside of Adam on the panel and various Salesforce representatives. It was unusual, shall we say, being at a Salesforce community event in London where I was very much in the minority, but how dull would life be if everything was always the same.

So I'd recommend these events to any men in the Salesforce ecosystem, especially those hoping to lead - it's vital to understand things from the viewpoint of others, and the best way to achieve this is to listen to others. If you are worried about being the only man there, tough! We expect the women that attend the developer events to be able to handle being in the minority (and at times flying solo), so we should be able to handle this when the roles are reversed.