Productivity Tip: Quicker JavaScript and CSS Development Using Dropbox

I’ve been working on a JavaScript/CSS-heavy Visualforce page this week, and it can be a real pain tweaking the code, saving the library, zipping up the static resource, uploading it to Salesforce, testing the change and then repeating the process. Alternatively, I could put all my JS/CSS inside the VF page until I’m done, but that still makes for time-consuming saves up to the Salesforce cloud.

The following tip is a variation on a trick John Conners showed me, the difference being I’m using Dropbox as a temporary webserver instead of rolling my own.

A much quicker way of working is to take your JS and CSS libraries out of your static resource and put them in your public Dropbox folder while you’re working on them.

Once in your Dropbox folder you can right-click and get the public link:

[Screenshot: Dropbox public link example]

Next, simply comment out the references to your static resource and replace them with your Dropbox public links …

<apex:page>
<!--	<apex:includeScript value="{!URLFOR($Resource.MyStaticResource, '/MyStaticResource/web/js/myjslib.js')}"/> -->
	<apex:includeScript value="https://dl.dropboxusercontent.com/u/26FG9442D4A/testscripts/myjslib.js" />
<!-- 	<apex:stylesheet value="{!URLFOR($Resource.MyStaticResource, '/MyStaticResource/web/css/mycss.css')}"/> -->
	<apex:stylesheet value="https://dl.dropboxusercontent.com/u/26FG9442D4A/testscripts/mycss.css" />

...

</apex:page>

You can then code away in your favourite editor, save your changes directly to your Dropbox in seconds, reload your VF page and test your changes.

When you’re all set, pop your scripts back, zip up your static resource and don’t forget to replace your temporary Dropbox links with the static resource links.

Posted in code, productivity

Creating A Simple Batch Apex Process Queue

Working with the Salesforce platform presents many challenges. One of the most common is avoiding the platform’s many governor limits. As a developer, hitting governor limits on the platform is a sure-fire way to turn a good day into a bad day. If you are working on a complex code base it can be a very frustrating experience.

So, you start to look for ways to get more out of the platform for that complex business process you’re working on. You may get a bit further by pushing your process into an @future job, or you may resort to using staging tables and multiple scheduled batch jobs polling and updating every 15 minutes.

Trying to eke out a bit more processing power can often lead to code getting more and more complicated and less and less maintainable.

I recently saw an inspiring example of a Message Bus implemented on the platform by Matt Bingham and Neil Reid. You can check it out here: http://www.ratsaas.com/eda

Often, however, you don’t have the time to re-architect your whole code base and need something to plug in quickly to solve your immediate problem. The pattern I’m going to show you takes some ideas from Matt and Neil’s presentation and builds a simple stateful process queue that allows you to break your process up into steps and give each step its own request and therefore its own set of governors.

With this pattern you can:

  • Split your operation into smaller manageable process steps.
  • Sequence and execute your process steps in a specific order.
  • Pass state between process steps.
  • Decide whether a failing process step should halt the process chain.

This is achieved with little more than a Batch Apex class and a simple Interface.

We’ll begin with taking a look at the Interface …

All the process steps we will later define will implement a common Interface that looks like this:

public interface IProcessStep
{
	// Performs the work for this step; state is shared with other steps via stateMap
	void execute(Map<String, Object> stateMap);

	// Returns true if a failure in this step should halt processing of subsequent steps
	Boolean haltOnError();
}

You’ll see that the interface has 2 methods.

The execute() method takes a Map<String, Object> as an argument. This map is used to persist state between process steps. You’ll see how this works when we get to the implementation.

The haltOnError() method allows the process step to flag that a critical error has occurred and processing of further steps should be prevented.

The Batch Process …

public class ProcessStepsBatch
	implements Database.Batchable<Object>, Database.Stateful
{
	private List<IProcessStep> m_processSteps;		// List of Process Steps to Iterate
	private Map<String, Object> m_stateMap;			// State Map

	private Boolean m_halt;
	private List<String> m_messages;

	/**
	 * Constructor
	 *
	 * Arguments:	List<IProcessStep>		List of process steps to execute
	 *				Map<String, Object>		State Map to persist
	 */
	public ProcessStepsBatch(List<IProcessStep> processSteps, Map<String, Object> stateMap)
	{
		Messaging.reserveSingleEmailCapacity(1);
		
		m_processSteps = processSteps;
		m_stateMap = stateMap;

		m_messages = new List<String>();
		m_halt = false;
	}

	/**
	 * Start Method
	 */
	public Iterable<Object> start(Database.BatchableContext info)
	{
		return (Iterable<Object>)m_processSteps;
	}

	/**
	 * Execute Method
	 */
	public void execute(Database.BatchableContext info, Object scope)
	{
		List<Object> scopeList = (List<Object>)scope;

		// Check the batch size
		if (scopeList.size() > 1)
		{
			throw new ProcessException('Maximum batch size is 1.');
		}

		// Cast to our Interface
		IProcessStep processStep = (IProcessStep)scopeList[0];

		// if we called a halt to the process do not process the step, just report it
		if (m_halt == true)
		{
			m_messages.add( 'NOT PROCESSED: Not Processed Action: ' + processStep);
			return;
		}

		try
		{
			// Execute the Step
			processStep.execute(m_stateMap);
			
			// Report the Success
			m_messages.add( 'SUCCESS: Processed Action: ' + processStep);
		}
		catch (Exception e)
		{
			// Report the Failure
			m_messages.add( 'FAILED: Error Processing Step: ' + processStep + '; Exception: ' + e.getMessage());
			
			// Check whether to halt the process or continue
			if (processStep.haltOnError())
			{
				m_halt = true;
			}
		}
	}

	/**
	 * Finish method
	 */
	public void finish(Database.BatchableContext info)
	{
		// Send out an Email with a Report
		Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
		mail.setToAddresses(new String[] { UserInfo.getUserEmail() });
		mail.setSubject('Process Steps Batch - Report');

		String body = 'Report: \n\n';
		for (String m : m_messages)
		{
			body += '    ' + m + '\n';
		}

		mail.setPlainTextBody(body);

		Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
	}

	public class ProcessException extends Exception {}
}

Most important is that the Batch Apex class implements the Database.Stateful interface as well as the Database.Batchable interface; without Database.Stateful the member variables would not be persisted across each execute() call.

The constructor takes 2 arguments. The first is a list of Process Steps that implement the IProcessStep Interface and the second is the state map. Both of these are stored as member variables to ensure they are persisted across the entire batch.

The start() method returns the list of process steps (a List is itself an Iterable).

The execute() method first validates the scope passed to it to ensure it has a batch size of 1. You could, and should, experiment with larger batch sizes but the point of this exercise is to give maximum resource to each step. (A little modification will be required if you want to increase the batch size.) It then executes the step and handles any errors.
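If you did decide to experiment with a larger batch size, here is a minimal sketch of how execute() might change (my variation, not part of the original class), looping over the scope rather than enforcing a size of 1:

	/**
	 * Hypothetical execute() variation that processes every step in the scope.
	 * Each execute() still gets one set of governors, now shared by its steps.
	 */
	public void execute(Database.BatchableContext info, Object scope)
	{
		for (Object item : (List<Object>)scope)
		{
			IProcessStep processStep = (IProcessStep)item;

			if (m_halt)
			{
				m_messages.add('NOT PROCESSED: Not Processed Action: ' + processStep);
				continue;
			}

			try
			{
				processStep.execute(m_stateMap);
				m_messages.add('SUCCESS: Processed Action: ' + processStep);
			}
			catch (Exception e)
			{
				m_messages.add('FAILED: Error Processing Step: ' + processStep + '; Exception: ' + e.getMessage());

				if (processStep.haltOnError())
				{
					m_halt = true;
				}
			}
		}
	}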

The finish() method simply sends an email to report its progress.

Some simple process steps …

Next we can define some process step classes that implement our IProcessStep Interface.

For this example we’re going to use two simplistic process steps: one that creates an Account and another that creates an Opportunity.

The create Account process step class looks like this:

public without sharing class CreateAccountStep 
	implements IProcessStep
{
	// Member variables
	private String m_accountIdStateKey;
	private String m_accountName;

	/**
	 * Constructor
	 *
	 * Arguments:	String	Key to the State Map to store the Account Id
	 *				String	Account Name to create the Account with
	 */
	public CreateAccountStep(String accountIdStateKey, String accountName)
	{
		m_accountIdStateKey = accountIdStateKey;
		m_accountName = accountName;
	}

	/**
	 * Execute method (from Interface)
	 */
	public void execute(Map<String, Object> stateMap)
	{
		// Create the account
		Account a = new Account(
			Name = m_accountName
		);

		insert a;

		// Store the Id in the State Map using the supplied key
		stateMap.put(m_accountIdStateKey, a.Id);
	}

	/**
	 * Halt on Error? (from Interface)
	 */
	public Boolean haltOnError()
	{
		return true;
	}
}

You’ll see that the constructor takes 2 arguments.

The first is a key to the State Map. It uses this key to store the Id of the Account it has created, as we are going to need this in a later step when we create the Opportunity. We could equally pass any sort of object into this map, but bear in mind that any state we store is going to count towards our overall heap size, as we will be persisting this map in the batch apex implementation.

The second argument is the Name we want to use for the Account.

We store both these values as member variables; again, this will count towards our heap, so we want to give careful consideration to what we are storing.

The execute() method creates and inserts an Account and stores its Id in the state map using the specified key.

The haltOnError() method returns true as we don’t want to do anything else if it should fail.

In this example I’m arbitrarily setting it to true, but you could equally define logic that determines its value; a sketch of one option follows.
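For instance, a hedged variation (the m_haltOnError member and the extra constructor argument below are my additions, not part of the class above) lets the caller decide how critical the step is:

	// Hypothetical variation: the caller decides whether this step is critical
	private Boolean m_haltOnError;

	public CreateAccountStep(String accountIdStateKey, String accountName, Boolean haltOnError)
	{
		m_accountIdStateKey = accountIdStateKey;
		m_accountName = accountName;
		m_haltOnError = haltOnError;
	}

	public Boolean haltOnError()
	{
		return m_haltOnError;
	}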

The create Opportunity process step class looks like this:

public without sharing class CreateOpportunityStep 
	implements IProcessStep
{
	// Member variables
	private String m_accountIdStateKey;
	private String m_opportunityName;

	/**
	 * Constructor
	 *
	 * Arguments:	String	Key to the State Map holding the Account Id
	 *				String	Opportunity Name to create the Opportunity with
	 */
	public CreateOpportunityStep(String accountIdStateKey, String opportunityName)
	{
		m_accountIdStateKey = accountIdStateKey;
		m_opportunityName = opportunityName;
	}

	/**
	 * Execute method (from Interface)
	 */
	public void execute(Map<String, Object> stateMap)
	{
		// Get the Account Id from the stateMap
		Id accountId = (Id)stateMap.get(m_accountIdStateKey);

		if (accountId == null)
		{
			throw new ProcessStepException('No Account Id passed to process step.');
		}

		// Create the opportunity
		Opportunity o = new Opportunity(
			AccountId = accountId,
			Name = m_opportunityName,
			CloseDate = System.today(),
			StageName = 'Closed Won'
		);

		insert o;
	}

	/**
	 * Halt on Error? (from Interface)
	 */
	public Boolean haltOnError()
	{
		return false;
	}

	public class ProcessStepException extends Exception {}
}

Similarly to the CreateAccountStep class, the constructor takes 2 arguments.

Once again the first is a key to the state map.

The second is the name of the Opportunity to create.

In similar fashion it stores these arguments as member variables.

The execute() method uses the key to resolve the Id of the Account before creating the Opportunity and inserting it. If we wanted to do something with the Opportunity in a future step we would need to store its Id in the state map and add another key to the constructor arguments.
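A minimal sketch of that change (the m_opportunityIdStateKey member and its constructor argument are hypothetical names of mine, purely illustrative):

	// Hypothetical addition: a second key, supplied via the constructor,
	// under which execute() stores the new Opportunity Id for later steps.
	private String m_opportunityIdStateKey;

	// ... at the end of execute(), after 'insert o;':
	stateMap.put(m_opportunityIdStateKey, o.Id);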

The haltOnError() method returns false to indicate processing may continue in the event of a failure.

Putting it all together …

Now we have a couple of process steps, we can wrap them up in a process and send it to the batch queue. For this example I’m going to put it all in a small static method inside a standalone class, but you could integrate this into a service class with some logic behind it:

public without sharing class MyProcessSteps
{
	/**
	 * Static Submit Method to get things going
	 */
	public static void submit()
	{
		// Create a state map
		Map<String, Object> stateMap = new Map<String, Object>();

		// Create a list of Process Steps
		List<IProcessStep> processSteps = new List<IProcessStep>();

		// Add the process steps to execute
		processSteps.add(new CreateAccountStep('Acc1', 'Yorkshire Coffee'));
		processSteps.add(new CreateOpportunityStep('Acc1', 'The Great Teabag Swindle'));
		processSteps.add(new CreateAccountStep('Acc2', 'Yorkshire Toffee'));
		processSteps.add(new CreateOpportunityStep('Acc2', 'Curry and Chocolate Promotion'));

		// Instantiate the Batch Job
		ProcessStepsBatch batch = new ProcessStepsBatch(processSteps, stateMap);
		
		// Execute the Batch with a size of one
		Database.executeBatch(batch, 1);
	}
}

You’ll see we first instantiate the state map and the list of IProcessSteps, then we add the required process steps to the list.

In the above example we are creating 2 Accounts and 2 Opportunities. We pass a common key to the state map between the process steps that need to share information, so steps 1 and 2 both use the key ‘Acc1’ while steps 3 and 4 share the key ‘Acc2’.

Next we instantiate the batch apex class and pass in the process steps and state map.

Finally we execute the batch and let it process each step in turn, et voilà!
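To try it out, just call the static method from Anonymous Apex; the finish() method will email the step-by-step report to the running user:

	// Run from the Developer Console (Execute Anonymous)
	MyProcessSteps.submit();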

A couple of things to think about before we go …

  • Consider carefully what member variables you hold in your process step classes. It will all count towards your heap. Can you pass Ids instead of objects and requery them inside the execute method? (See the sketch after this list.)
  • You’re working outside of a single request. If a process step fails down the line, all the process steps before it will be committed. What are the implications of this? How will you handle errors?
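As an illustration of the first point, here is a hedged sketch (a hypothetical step, not one of the examples above) that persists only an Id and requeries the record inside execute():

/**
 * Hypothetical step that holds just an Id as member state and requeries
 * the record inside execute(), keeping the persisted heap footprint small.
 */
public without sharing class RenameAccountStep
	implements IProcessStep
{
	private Id m_accountId;

	public RenameAccountStep(Id accountId)
	{
		m_accountId = accountId;
	}

	public void execute(Map<String, Object> stateMap)
	{
		// Requery here rather than carrying the whole SObject between requests
		Account a = [Select Id, Name From Account Where Id = :m_accountId];
		a.Name = a.Name + ' (processed)';
		update a;
	}

	public Boolean haltOnError()
	{
		return false;
	}
}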

Thanks also to Phil Hawthorn for helping to inspire this approach. Hope you find it useful.

Posted in apex, code, force.com, salesforce

Trigger Pattern for Tidy, Streamlined, Bulkified Triggers Revisited

I thought I’d launch my blog by revisiting a post I wrote for the Force.com Cookbook. You can find the original post here:

Trigger Pattern for Tidy, Streamlined, Bulkified Triggers

The main aims of this pattern are to organise trigger code and follow best practices with regard to handling bulk operations and keeping trigger logic together. I find that trigger code for an object is easier to understand and maintain when it’s in one place, and this pattern helps enforce that, avoiding much of the head scratching that goes on when code in another trigger is the cause of the problem.

The pattern makes use of an interface to govern the order of events through a trigger in a logical way. Trigger logic is placed in a handler class that implements the interface. The interface makes it easy to cache data at the start of the trigger, perform actions against each object passed to the trigger and perform any post-processing at the end. A factory class is responsible for initialising and executing the handler logic.

The main difference between this revision and the original post is the dynamic instantiation of the handler. In the previous version the handlers were hard-coded into the factory class; in this version they are dictated by the trigger, which specifies the handler to use. This is made possible by the System.Type class Salesforce added in the Summer ’12 release. This approach has 2 advantages:

  • You don’t need to modify the factory class every time you want to create a new handler.
  • You have the option of creating multiple handlers for the same object. While this goes against the grain of keeping the trigger code in one place, you may need to deploy some temporary trigger code, or you may not want to refactor an existing handler. (See the sketch below.)
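As an illustration of the second point, a trigger could delegate to more than one handler using the TriggerFactory class described below; the TemporaryAccountFixHandler name here is hypothetical, standing in for any extra ITrigger implementation:

trigger AccountTrigger on Account (after delete, after insert, after update, before delete, before insert, before update)
{
	// The long-lived handler for this object
	TriggerFactory.createAndExecuteHandler(AccountHandler.class);

	// A hypothetical temporary handler deployed alongside it
	TriggerFactory.createAndExecuteHandler(TemporaryAccountFixHandler.class);
}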

So onto the code …

This pattern involves delegating work from the trigger to a structured Trigger Handler class. Each object will generally have one trigger handler and the trigger will specify the handler to use. The trigger itself has almost no code in it. We make use of a Factory class to instantiate the appropriate trigger handler. Whenever we create a new trigger all we need to do is create a new handler class and add a line of code to the trigger itself to delegate the work to the trigger factory. The trigger factory takes care of instantiating the handler and calling the handler methods in the right order.

We start by defining an Interface that provides the template for our trigger handlers. This interface contains the method signatures each handler must implement. See below:

/**
 * Interface containing methods Trigger Handlers must implement to enforce best practice
 * and bulkification of triggers.
 */
public interface ITrigger
{
	/**
	 * bulkBefore
	 *
	 * This method is called prior to execution of a BEFORE trigger. Use this to cache
	 * any data required into maps prior to execution of the trigger.
	 */
	void bulkBefore();

	/**
	 * bulkAfter
	 *
	 * This method is called prior to execution of an AFTER trigger. Use this to cache
	 * any data required into maps prior to execution of the trigger.
	 */
	void bulkAfter();

	/**
	 * beforeInsert
	 *
	 * This method is called iteratively for each record to be inserted during a BEFORE
	 * trigger. Never execute any SOQL/SOSL etc in this and other iterative methods.
	 */
	void beforeInsert(SObject so);

	/**
	 * beforeUpdate
	 *
	 * This method is called iteratively for each record to be updated during a BEFORE
	 * trigger.
	 */
	void beforeUpdate(SObject oldSo, SObject so);

	/**
	 * beforeDelete
	 *
	 * This method is called iteratively for each record to be deleted during a BEFORE
	 * trigger.
	 */
	void beforeDelete(SObject so);

	/**
	 * afterInsert
	 *
	 * This method is called iteratively for each record inserted during an AFTER
	 * trigger. Always put field validation in the 'After' methods in case another trigger
	 * has modified any values. The record is 'read only' by this point.
	 */
	void afterInsert(SObject so);

	/**
	 * afterUpdate
	 *
	 * This method is called iteratively for each record updated during an AFTER
	 * trigger.
	 */
	void afterUpdate(SObject oldSo, SObject so);

	/**
	 * afterDelete
	 *
	 * This method is called iteratively for each record deleted during an AFTER
	 * trigger.
	 */
	void afterDelete(SObject so);

	/**
	 * andFinally
	 *
	 * This method is called once all records have been processed by the trigger. Use this
	 * method to accomplish any final operations such as creation or updates of other records.
	 */
	void andFinally();
}

We can now add a handler class that implements this interface. In the example below we have a handler for the Account object that has some logic to make sure the Account isn’t referenced elsewhere before it can be deleted. It also writes a record away to a custom object called Audit__c for each Account deleted. (Supporting classes and details of the Audit__c object will be described at the end.)

In the example we make use of the bulkBefore method to cache all the in-use Account Ids passed to the trigger in a Set. As this method is only called once, we will not incur the overhead of multiple SOQL queries. The validation is done in the beforeDelete method. This method is called iteratively for every record passed to the before delete trigger. If we were validating field data we would do this in one of the after methods, since the values could be modified by another trigger or workflow. If the validation succeeds we add an Audit__c record to a list for handling later in the andFinally method. The andFinally method is executed once at the end of the trigger. In this case we use it to insert the Audit__c records.

You will also notice there is no SOQL in the class, this is delegated out to a Gateway class.

/**
 * Class AccountHandler
 *
 * Trigger Handler for the Account SObject. This class implements the ITrigger
 * interface to help ensure the trigger code is bulkified and all in one place.
 */
public without sharing class AccountHandler
	implements ITrigger
{
	// Member variable to hold the Id's of Accounts 'in use'
	private Set<Id> m_inUseIds = new Set<Id>();

	// Member variable to record Audit records
	private List<Audit__c> m_audits = new List<Audit__c>();

	// Constructor
	public AccountHandler()
	{
	}

	/**
	 * bulkBefore
	 *
	 * This method is called prior to execution of a BEFORE trigger. Use this to cache
	 * any data required into maps prior to execution of the trigger.
	 */
	public void bulkBefore()
	{
		// If this is a delete trigger, cache a list of Account Ids that are 'in use'
		if (Trigger.isDelete)
		{
			// Pre-load all the in-use Accounts passed to this trigger
			m_inUseIds = AccountGateway.findAccountIdsInUse(Trigger.oldMap.keySet());
		}
		}
	}

	public void bulkAfter()
	{
	}

	public void beforeInsert(SObject so)
	{
	}

	public void beforeUpdate(SObject oldSo, SObject so)
	{
	}

	/**
	 * beforeDelete
	 *
	 * This method is called iteratively for each record to be deleted during a BEFORE
	 * trigger.
	 */
	public void beforeDelete(SObject so)
	{
		// Cast the SObject to an Account
		Account myAccount = (Account)so;

		// Examine the Set and if the account is in use don't allow it to be deleted.
		if (m_inUseIds.contains(myAccount.Id))
		{
			// Add the error to the offending object
			so.addError('You cannot delete an Account that is in use.');
		}
		else
		{
			// Add an audit record to the list
			Audit__c myAudit = new Audit__c();
			myAudit.Description__c = 'Account Name: ' + myAccount.Name + ' (Id: ' + myAccount.Id + ') was deleted';

			m_audits.add(myAudit);
		}
	}

	public void afterInsert(SObject so)
	{
	}

	public void afterUpdate(SObject oldSo, SObject so)
	{
	}

	public void afterDelete(SObject so)
	{
	}

	/**
	 * andFinally
	 *
	 * This method is called once all records have been processed by the trigger. Use this
	 * method to accomplish any final operations such as creation or updates of other records.
	 */
	public void andFinally()
	{
		// insert any audit records
		if (!m_audits.isEmpty())
		{
			insert m_audits;
		}
	}
}

The factory class below instantiates the specified handler and executes the methods defined by the interface. It does this using the type of the handler class passed to the static method createAndExecuteHandler.

You will see from the execute method the order in which the interface methods are called and you will note the methods that are called iteratively passing in the relevant SObjects.

/**
 * Class TriggerFactory
 *
 * Used to instantiate and execute Trigger Handlers associated with sObjects.
 */
public with sharing class TriggerFactory
{
	/**
	 * Public static method to create and execute a trigger handler
	 *
	 * Arguments:	Type t - Type of handler to instantiate
	 *
	 * Throws a TriggerException if no handler has been found.
	 */
	public static void createAndExecuteHandler(Type t)
	{
		// Get a handler appropriate to the object being processed
		ITrigger handler = getHandler(t);
		
		// Make sure we have a handler; getHandler returns null if the type doesn't implement ITrigger.
		if (handler == null)
		{
			throw new TriggerException('No Trigger Handler found named: ' + t.getName());
		}
		
		// Execute the handler to fulfil the trigger
		execute(handler);
	}

	/**
	 * private static method to control the execution of the handler
	 *
	 * Arguments:	ITrigger handler - A Trigger Handler to execute
	 */
	private static void execute(ITrigger handler)
	{
		// Before Trigger
		if (Trigger.isBefore)
		{
			// Call the bulk before to handle any caching of data and enable bulkification
			handler.bulkBefore();

			// Iterate through the records to be deleted passing them to the handler.
			if (Trigger.isDelete)
			{
				for (SObject so : Trigger.old)
				{
					handler.beforeDelete(so);
				}
			}
			// Iterate through the records to be inserted passing them to the handler.
			else if (Trigger.isInsert)
			{
				for (SObject so : Trigger.new)
				{
					handler.beforeInsert(so);
				}
			}
			// Iterate through the records to be updated passing them to the handler.
			else if (Trigger.isUpdate)
			{
				for (SObject so : Trigger.old)
				{
					handler.beforeUpdate(so, Trigger.newMap.get(so.Id));
				}
			}
		}
		else
		{
			// Call the bulk after to handle any caching of data and enable bulkification
			handler.bulkAfter();

			// Iterate through the records deleted passing them to the handler.
			if (Trigger.isDelete)
			{
				for (SObject so : Trigger.old)
				{
					handler.afterDelete(so);
				}
			}
			// Iterate through the records inserted passing them to the handler.
			else if (Trigger.isInsert)
			{
				for (SObject so : Trigger.new)
				{
					handler.afterInsert(so);
				}
			}
			// Iterate through the records updated passing them to the handler.
			else if (Trigger.isUpdate)
			{
				for (SObject so : Trigger.old)
				{
					handler.afterUpdate(so, Trigger.newMap.get(so.Id));
				}
			}
		}

		// Perform any post processing
		handler.andFinally();
	}

	/**
	 * private static method to get the named handler.
	 *
	 * Arguments:	Type t - Class of handler to instantiate
	 *
	 * Returns:		ITrigger - A trigger handler if one exists or null.
	 */
	private static ITrigger getHandler(Type t)
	{
		// Instantiate the type
		Object o = t.newInstance();

		// if its not an instance of ITrigger return null
		if (!(o instanceof ITrigger))
		{
		    return null;
		}

		return (ITrigger)o;
	}

	public class TriggerException extends Exception {}
}

Now we need to wire the trigger up to the TriggerFactory class. This is easily accomplished with a single line of code in the trigger below. The trigger simply passes the handler class to use to the factory method. You will notice that the trigger handles all CRUD operations. I haven’t included undelete, but you could adapt the pattern to include it (see the sketch after the trigger).

trigger AccountTrigger on Account (after delete, after insert, after update, before delete, before insert, before update)
{
	TriggerFactory.createAndExecuteHandler(AccountHandler.class);
}
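If you do need undelete, a hedged sketch of the adaptation (assuming you add a matching afterUndelete(SObject so) method to ITrigger and its implementations, which the code above does not have) would add the event to the trigger and a branch to the factory:

trigger AccountTrigger on Account (after delete, after insert, after undelete, after update, before delete, before insert, before update)
{
	TriggerFactory.createAndExecuteHandler(AccountHandler.class);
}

	// Hypothetical addition to the after-trigger section of TriggerFactory.execute():
	else if (Trigger.isUndelete)
	{
		// Trigger.new holds the undeleted records in an after undelete trigger
		for (SObject so : Trigger.new)
		{
			handler.afterUndelete(so);
		}
	}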

The example requires the additional class below:

/**
 * Class AccountGateway
 *
 * Provides finder methods for accessing data in the Account object.
 */
public without sharing class AccountGateway
{
	/**
	 * Returns the subset of the supplied Ids whose Accounts are 'in use'.
	 *
	 * Arguments:	Set<Id> accIds - Set of Account Ids to examine
	 *
	 * Returns:		Set<Id> - Set of Account Ids that are 'in use'
	 */
	public static Set<Id> findAccountIdsInUse(Set<Id> accIds)
	{
		Set<Id> inUseIds = new Set<Id>();

		for (Account[] accounts : [Select p.Id, (Select Id From Opportunities Limit 1) From Account p where p.Id in : accIds])
		{
			for (Account acc : accounts)
			{
				if (acc.Opportunities.size() > 0)
				{
					inUseIds.add(acc.id);
				}
			}
		}

		return inUseIds;
	}
}

The Audit__c object is a simple custom object with an Auto Number Name field and a single text area field called Description__c.

A few things to note:

As with all triggers, care should be taken to stay inside governor limits, and be aware that your trigger may be dealing with up to 200 records at a time in a bulk operation.

  • Always add field validation to the after methods.
  • If you need data to persist across the before and after trigger, consider the use of static variables (see the sketch after this list).
  • Don’t put SOQL inside loops or any of the before and after methods.
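For the second point, a minimal sketch (a hypothetical holder class of mine) of sharing data between the before and after phases via static variables:

// Hypothetical holder: statics live for the duration of a single transaction,
// so values set in the before phase are visible in the after phase.
public class AccountTriggerState
{
	public static Set<Id> flaggedIds = new Set<Id>();
}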
Posted in apex, code, force.com, salesforce