Friday, July 18, 2014

First In, First Out Case Management

If your organization uses Cases and Queues, this little hack may help. The views that show cases can sort on most Case fields, such as the last update or creation timestamp.

Sorting on the creation timestamp is only useful if every Case stays open until it is resolved and is never reopened. Consider a case that was closed a few days ago. If that case is reopened now then, because it has an old creation date, it will appear BEFORE cases that arrived earlier in the day.

If you sort on the last update timestamp instead, then any touch to a Case, for any reason, moves its timestamp forward. If you are trying to process the oldest cases first, that old case is now pushed to the bottom of the list.

The solution is to sort based on the time the case was added to the Queue.

1. Create a custom Date/Time field on Case: QueuedAtTimestamp__c.
2. Create a before insert, before update trigger on Case.
3. Create a unit test.
4. Sort your Case view on the new timestamp.

Trigger code

When working with records in a trigger you can't read fields on a related object. In this 'case', the OwnerId field relates the Case to either a User or a Group. To test whether the owner is a Queue we need a second query to see if the owner id appears in the Group table with Type = 'Queue'; if it does, the owner is a queue.
But introducing a query inside a trigger loop can cause problems when the time comes to update hundreds of Case records at once. First I'll show you the code without worrying about these limits. To protect against bulk processing I chose to add a test that only runs the for loop if the size of the list is less than 100.

trigger CaseBeforeUpsertTrigger on Case (before insert, before update) {
  // Guard against bulk operations: skip large batches to stay under SOQL limits.
  if (Trigger.new.size() < 100) {
    for (Case theCase : Trigger.new) {
      if (Trigger.isUpdate) {
        Case beforeUpdate = Trigger.oldMap.get(theCase.Id);
        if (theCase.OwnerId != beforeUpdate.OwnerId) {
          List<Group> gp = [select Id, Type from Group
                            where Id = :theCase.OwnerId and Type = 'Queue'];
          if (!gp.isEmpty()) {
            System.debug(LoggingLevel.ERROR, 'isUpdate, owner is a queue');
            theCase.QueuedAtTimestamp__c = System.now();
          }
        }
      } else if (Trigger.isInsert) {
        List<Group> gp = [select Id, Type from Group
                          where Id = :theCase.OwnerId and Type = 'Queue'];
        if (!gp.isEmpty()) {
          System.debug(LoggingLevel.ERROR, 'isInsert, owner is a queue');
          theCase.QueuedAtTimestamp__c = System.now();
        }
      }
    }
  }
}

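The size guard keeps the per-record query under the governor limit, but the more scalable pattern is to move the SOQL out of the loop entirely: collect the owner ids first, then make one Group query for the whole batch. Here is a bulk-safe sketch of that approach (my rewrite, not the code from this post; the field and trigger names match the ones above):

```apex
trigger CaseBeforeUpsertTrigger on Case (before insert, before update) {
    // Collect the owner ids whose queue membership we need to check.
    Set<Id> ownerIds = new Set<Id>();
    for (Case theCase : Trigger.new) {
        if (Trigger.isInsert ||
            theCase.OwnerId != Trigger.oldMap.get(theCase.Id).OwnerId) {
            ownerIds.add(theCase.OwnerId);
        }
    }

    // One query for the whole batch instead of one per Case.
    Map<Id, Group> queues = new Map<Id, Group>(
        [select Id from Group where Id in :ownerIds and Type = 'Queue']);

    // Stamp only the cases whose (new or changed) owner is a queue.
    for (Case theCase : Trigger.new) {
        Boolean ownerChanged = Trigger.isInsert ||
            theCase.OwnerId != Trigger.oldMap.get(theCase.Id).OwnerId;
        if (ownerChanged && queues.containsKey(theCase.OwnerId)) {
            theCase.QueuedAtTimestamp__c = System.now();
        }
    }
}
```

With this shape the trigger issues exactly one query no matter how many Cases arrive in the batch, so the size guard is no longer needed.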
Unit test code:

static testMethod void CaseQueuedTimestampTest() {
  // Assumes at least one Queue exists in the org.
  Group g = [select Id, Name from Group where Type = 'Queue' limit 1];

  // Test 1. No queue owner, so no queue timestamp.
  Case testCase = new Case();
  testCase.Description = 'Case 1 created as part of unit test of Case Queued and Timestamped';
  insert testCase;
  String caseId = testCase.Id;

  // Test 2. Change the owner to a queue to set the timestamp.
  testCase = [select Id, OwnerId, QueuedAtTimestamp__c from Case where Id = :caseId];
  System.assertEquals(null, testCase.QueuedAtTimestamp__c);
  testCase.OwnerId = g.Id;
  update testCase;
  testCase = [select Id, OwnerId, QueuedAtTimestamp__c from Case where Id = :caseId];
  System.assertEquals(g.Id, testCase.OwnerId);
  System.assertNotEquals(null, testCase.QueuedAtTimestamp__c);

  // Test 3. Create with a queue as owner, so the timestamp is set on insert.
  testCase = new Case();
  testCase.Description = 'Case 2 created as part of unit test of Case Queued and Timestamped';
  testCase.OwnerId = g.Id;
  insert testCase;
  caseId = testCase.Id;
  testCase = [select Id, OwnerId, QueuedAtTimestamp__c from Case where Id = :caseId];
  System.assertEquals(g.Id, testCase.OwnerId);
  System.assertNotEquals(null, testCase.QueuedAtTimestamp__c);
}

Friday, March 21, 2014

ConnectionException is an indication of Partner version mismatch

The following error showed up and it took a while to figure out the cause. These notes are for anyone who hits the same problem.

Unexpected element. Parser was expecting element '' but found ''
    at com.sforce.soap.partner.Field.loadFields(
    at com.sforce.soap.partner.Field.load(
    at com.sforce.soap.partner.DescribeSObjectResult.loadFields(
    at com.sforce.soap.partner.DescribeSObjectResult.load(
    at com.sforce.soap.partner.DescribeSObjectResponse_element.loadFields(
    at com.sforce.soap.partner.DescribeSObjectResponse_element.load(
    at com.sforce.soap.partner.PartnerConnection.describeSObject(

During the updates the error message changed to: Unexpected element. Parser was expecting element '' but found ''

In both cases the solution was to sync up the versions of Java, the WSC jar file, and the partner jar file. Plus be careful to update the connection string.

The first error happened when one of the jars was compiled with Java 6; the second happened while using the wrong connection string.

In brief: rebuild your partner API jar file against the latest WSDL, get the latest wsc jar file, and update your connection string with the correct endpoint. See my other post about rebuilding the partner API.
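For quick reference, the rebuild boils down to compiling a fresh partner WSDL with the wsdlc tool that ships inside the WSC jar. A sketch of the commands (file names and version numbers here are placeholders for whatever you download; run with a full JDK):

```shell
# 1. Generate a fresh partner.wsdl from your org (Setup > Develop > API)
#    and save it next to the WSC jar.
# 2. Compile the WSDL into a stub jar with WSC's wsdlc tool.
java -classpath force-wsc-30.0.0.jar com.sforce.ws.tools.wsdlc partner.wsdl partner.jar
```

Then put the regenerated partner.jar and the matching wsc jar on your application's classpath together, so their API versions can't drift apart.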


Friday, January 3, 2014

Debug statements stopped working on large sections of APEX

Working away as normal building APEX code and testing it on a sandbox.  As usual our only way to debug this code is through debug statements.  Here is what I put at the beginning of every method:

System.debug(Logginglevel.ERROR,'MonitoringUtilities.testProcessUpsertMonitoringItem ');

Notice the Logginglevel.ERROR? That is to make sure this statement appears in the debug logs no matter what the filtering level is set to. I want to see my method-entry statement. (Yes, I know that the debug log also produces a METHOD_ENTRY log entry at every method entry point; I typically add some more information to my own entry statement.)

In any case, those debug statements are supposed to appear in the trace. BUT THEY STOPPED!?

Just to lay the groundwork here: to see debug logs in the Salesforce UI, you log on and go to Setup, then search for "Debug" in the Setup area. There is only one entry, called "Debug Logs", so select it. I always start from a clean slate, which means I delete any existing logs and monitored users (myself in this situation, as I am a solo administrator/developer for my company). I then add myself back as a monitored user. Next I run whatever process I'm working on; in my case, mainly an ANT build to push code back to the server (test instance first).

Then return to the Salesforce UI and refresh the page.

Normally, I just view the log in the browser and just search for my method entry point and then go from there.

Most debug statements went missing

Today, after a break from Salesforce, I came back and did some more development. All as usual, until I tried to see my debug statements. They were not there. Methods that ran both before and after my new code have log entries, but most entries in between are simply missing.

I've spent several hours (remember that APEX development is mainly about waiting for the load process to finish) trying various things and searching the web but without any success.

Debug Filtering

On the Debug Logs screen, beside each user name, you will see a link to set "Filters" for that user. This lets you capture more or less information. System.debug logs at the DEBUG level by default, and the default filter includes that level.

In the trace, the first line shows the current level for each log category.

Solution Found

There seems to be a hidden governor limit on debug log statements: something that allows ten or so at the beginning and a few at the end, but drops everything in the middle. The key is the size of the log file; once a log grows past the maximum size (2 MB at the time of writing), the middle is truncated.

Once I built a smaller workspace with just the classes and triggers under development, and modified the build file, the debug logs started to appear again.

Normally, when I make just lightweight edits, I retrieve, deploy, and test the full set. To get this smaller set to work I needed to add a runTest element for the class under development.
<sf:deploy deployRoot="${targetDir}" password="${sf.password}" serverurl="${sf.serverurl}" username="${sf.username}">
  <!-- hypothetical test class name; substitute your own -->
  <runTest>CaseQueuedTimestampTest</runTest>
</sf:deploy>

For more on the Ant migration toolkit, see my earlier posts.

Solution Only Partial Fix

So the above "works" in that the log file now shows just the test results from deploying and testing the smaller set of files. But none of the files actually got deployed to my sandbox, because the build failed with a long list of code coverage problems, like this:

Code coverage issue, class: OpportunityUpsertTrigger -- Test coverage of selected Apex Trigger is 0%, at least 1% test coverage is required

Now all those classes already have all the test coverage they need in the system. The deployment does not mention any of these classes. So why does Salesforce even look at them?

I don't have any more time for Salesforce just now. It is such a frustrating development environment. All I am trying to do is write a simple trigger to copy values from one table to another, and it has taken a long time to get even part way done. I am very tempted to just push this work outside Salesforce, like I do with so much else, and just use Java and the partner API. But I'm this deep into it, and doing this work in a trigger is the most logical solution from a DB design point of view.

Solution? Run all the tests again and forgo the debug statements in my test class! So my process is: (a) develop the code and use the deployment to run the tests; read the debug statements and adjust until things are working; then (b) deploy with the flag set to run all tests.

Here, for the record, is the Ant deploy target in my build file.

<target name="deploy" depends="setup">
  <!-- requires the ant-contrib if/then/else tasks -->
  <if>
    <equals arg1="${allTests}" arg2="true" />
    <then>
      <echo>Deploy with all tests and rollback on error true</echo>
      <sf:deploy username="${sf.username}" password="${sf.password}" serverurl="${sf.serverurl}"
                 deployRoot="${targetDir}" runAllTests="true" rollbackOnError="true" />
    </then>
    <else>
      <sf:deploy username="${sf.username}" password="${sf.password}" serverurl="${sf.serverurl}"
                 deployRoot="${targetDir}" />
    </else>
  </if>
</target>

Here is the step (a) command, followed by step (b):
ant -v -lib . -f build.xml -Dtarget=test -DtargetDir=workareaSmall -DallTests=false  deploy

ant -v -lib . -f build.xml -Dtarget=test -DtargetDir=workareaSmall -DallTests=true  deploy

I should mention that I also have to then:

  1. retrieve a fresh download from production which goes in the main/full directory
  2. copy the files with changes from the small workspace directory to the main/full development workarea
  3. retest with all the files on the sandbox, if tests pass then
  4. deploy onto Production.
I always grab the latest contents from Production in case I made any changes there while developing the code. I truly do not know what people do in the world of larger development teams. I wish them luck!