Friday, July 18, 2014

First in First Out Case Management

If your organization uses Cases and Queues, this little hack may help.  The views that show cases can sort on most Case fields, such as when the case was created or last updated.

Sorting on the creation timestamp is only useful if every Case stays open until it is resolved and is never reopened.  Consider a case that was closed a few days ago.  If that case is reopened now then, because it has an old creation date, it will appear BEFORE cases that arrived earlier in the day.

If you sort on the last-update timestamp instead, then touching a Case for any reason moves its timestamp forward.  If you are trying to process the oldest cases first, that old case is now pushed to the bottom of the list.

The solution is to sort based on the time the case was added to the Queue.

1. Create a custom Date/Time field, QueuedAtTimestamp__c, on Case.
2. Create a before insert, before update trigger on Case.
3. Create a unit test.
4. Sort your Case view on the new timestamp.
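The view itself is configured in the UI, but the equivalent SOQL makes the intended ordering explicit. This is just one way to express it (NULLS LAST keeps cases that were never queued out of the way):

```sql
SELECT Id, CaseNumber, Subject, QueuedAtTimestamp__c
FROM Case
WHERE IsClosed = false
ORDER BY QueuedAtTimestamp__c ASC NULLS LAST
```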

Trigger code

When working with objects in triggers you can't follow a relationship field to read data on the related record. In this 'case', the OwnerId field relates the Case to either a User or a Group. To test whether the owner is a Queue we need a second query to see if the owner id appears in the Group table with Type = 'Queue'. If it does, the owner is a queue.
But introducing a query inside a trigger loop causes problems when the time comes to update hundreds of Case objects at once, because each query counts against the governor limits. First I'll show you the code without worrying about these limits. To protect against bulk processing I chose to add a test that only runs the for loop if the size of the trigger.new list is less than 100.


trigger CaseBeforeUpsertTrigger on Case (before insert, before update) {
  for (Case theCase : Trigger.new)
  {
    if (Trigger.isUpdate) {
      Case beforeUpdate = Trigger.oldMap.get(theCase.Id);
      if (theCase.OwnerId != beforeUpdate.OwnerId)
      {
        List<Group> gp = [select Id, Type from Group where Id = :theCase.OwnerId and Type = 'Queue'];
        if (!gp.isEmpty()) {
          System.debug(LoggingLevel.ERROR, 'isUpdate is queue');
          theCase.QueuedAtTimestamp__c = System.now();
        }
      }
    } else if (Trigger.isInsert) {
      List<Group> gp = [select Id, Type from Group where Id = :theCase.OwnerId and Type = 'Queue'];
      if (!gp.isEmpty()) {
        System.debug(LoggingLevel.ERROR, 'isInsert is queue');
        theCase.QueuedAtTimestamp__c = System.now();
      }
    }
  }
}
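For completeness, here is one way to make the same trigger bulk-safe instead of guarding on list size. This is a sketch of an alternative, not the code I deployed: collect the candidate owner ids in one pass, then make a single Group query for the whole batch.

```apex
trigger CaseBeforeUpsertTrigger on Case (before insert, before update) {
  // Pass 1: collect owner ids that are new or have just changed.
  Set<Id> ownerIds = new Set<Id>();
  for (Case theCase : Trigger.new) {
    if (Trigger.isInsert ||
        theCase.OwnerId != Trigger.oldMap.get(theCase.Id).OwnerId) {
      ownerIds.add(theCase.OwnerId);
    }
  }
  // One query for the whole batch instead of one per Case.
  Set<Id> queueIds = new Map<Id, Group>(
      [select Id from Group where Id in :ownerIds and Type = 'Queue']).keySet();
  // Pass 2: stamp the cases whose new owner is a queue.
  for (Case theCase : Trigger.new) {
    Boolean ownerChanged = Trigger.isInsert ||
        theCase.OwnerId != Trigger.oldMap.get(theCase.Id).OwnerId;
    if (ownerChanged && queueIds.contains(theCase.OwnerId)) {
      theCase.QueuedAtTimestamp__c = System.now();
    }
  }
}
```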
Unit test code:

static testMethod void CaseQueuedTimestampTest() {
      
  Group g = [select Id, Name from Group where  Type = 'Queue' limit 1];

  // Test 1. no owner so no queue timestamp
  Case testCase = new Case();
  testCase.Description = 'Case 1 created as part of unit test of Case Queued and Timestamped';
  insert testCase;
  String caseId = testCase.Id;

  // Test 2 change owner to queue to set the timestamp
  testCase = [select Id, OwnerId, QueuedAtTimestamp__c from Case where id = :caseId];        
  System.assertEquals(true, testCase.QueuedAtTimestamp__c == null);
  testCase.OwnerId = g.Id;
  update testCase;
  testCase = [select Id, OwnerId, QueuedAtTimestamp__c from Case where id = :caseId];        
  System.assertEquals(true, testCase.OwnerId == g.Id);
  System.assertEquals(true, testCase.QueuedAtTimestamp__c != null);


  // Test 3 create with queue as owner, so has timestamp
  testCase = new Case();
  testCase.Description = 'Case 2 created as part of unit test of CaseQueued and Timestamped';
  testCase.OwnerId = g.Id;
  insert testCase;
  caseId = testCase.Id;
  testCase = [select Id, OwnerId, QueuedAtTimestamp__c from Case where id = :caseId];  
  System.assertEquals(true, testCase.OwnerId == g.Id);
  System.assertEquals(true, testCase.QueuedAtTimestamp__c != null);    
} 

Friday, March 21, 2014

ConnectionException is an indication of Partner version mismatch


The following error showed up and it took a while to figure out the cause.  These notes are for anyone who happens upon the same problem.



com.sforce.ws.ConnectionException: Unexpected element. Parser was expecting element 'urn:partner.soap.sforce.com:precision' but found 'urn:partner.soap.sforce.com:permissionable'
    at com.sforce.ws.bind.TypeMapper.verifyTag(TypeMapper.java:386)
    at com.sforce.ws.bind.TypeMapper.verifyElement(TypeMapper.java:415)
    at com.sforce.soap.partner.Field.loadFields(Field.java:1148)
    at com.sforce.soap.partner.Field.load(Field.java:1036)
    at com.sforce.ws.bind.TypeMapper.readSingle(TypeMapper.java:628)
    at com.sforce.ws.bind.TypeMapper.readArray(TypeMapper.java:528)
    at com.sforce.ws.bind.TypeMapper.readObject(TypeMapper.java:506)
    at com.sforce.soap.partner.DescribeSObjectResult.loadFields(DescribeSObjectResult.java:771)
    at com.sforce.soap.partner.DescribeSObjectResult.load(DescribeSObjectResult.java:730)
    at com.sforce.ws.bind.TypeMapper.readSingle(TypeMapper.java:628)
    at com.sforce.ws.bind.TypeMapper.readObject(TypeMapper.java:504)
    at com.sforce.soap.partner.DescribeSObjectResponse_element.loadFields(DescribeSObjectResponse_element.java:68)
    at com.sforce.soap.partner.DescribeSObjectResponse_element.load(DescribeSObjectResponse_element.java:59)
    at com.sforce.ws.bind.TypeMapper.readSingle(TypeMapper.java:628)
    at com.sforce.ws.bind.TypeMapper.readObject(TypeMapper.java:504)
    at com.sforce.ws.transport.SoapConnection.bind(SoapConnection.java:170)
    at com.sforce.ws.transport.SoapConnection.receive(SoapConnection.java:144)
    at com.sforce.ws.transport.SoapConnection.send(SoapConnection.java:98)
    at com.sforce.soap.partner.PartnerConnection.describeSObject(PartnerConnection.java:1003)
    at cmh.web.tools.sfdc.SalesforcePartnerConnector.describeObject(SalesforcePartnerConnector.java:183)



During the updates the error message changed to:

com.sforce.ws.ConnectionException: Unexpected element. Parser was expecting element 'urn:partner.soap.sforce.com:compactLayoutable' but found 'urn:partner.soap.sforce.com:createable'

In both cases the solution was to sync up the versions of Java, the WSC jar file and the partner jar file, and to be careful to update the connection string as well.

The first error happened when one of the jars was compiled with Java 6 and the second happened while using the wrong connection string.

In brief: rebuild your partner API jar file to the latest version, get the latest wsc jar file, and update your connection string with the correct end point.  See my other post about rebuilding the partner API.

PROD_END_POINT = "https://login.salesforce.com/services/Soap/u/29.0"; 
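For illustration, here is roughly where that endpoint lands in WSC client code. This is a sketch, with placeholder credential names; the key point is that the /u/29.0 suffix must match the API version of the rebuilt partner jar.

```java
import com.sforce.soap.partner.Connector;
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.ws.ConnectionException;
import com.sforce.ws.ConnectorConfig;

public class PartnerLogin {
    // Must match the API version the partner jar was generated from.
    static final String PROD_END_POINT = "https://login.salesforce.com/services/Soap/u/29.0";

    static PartnerConnection connect(String user, String passwordPlusToken)
            throws ConnectionException {
        ConnectorConfig config = new ConnectorConfig();
        config.setUsername(user);
        config.setPassword(passwordPlusToken);
        config.setAuthEndpoint(PROD_END_POINT);
        return Connector.newConnection(config); // performs the login call
    }
}
```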


Friday, January 3, 2014

Debug statements stopped working on large sections of APEX

Working away as normal building APEX code and testing it on a sandbox.  As usual our only way to debug this code is through debug statements.  Here is what I put at the beginning of every method:

System.debug(Logginglevel.ERROR,'MonitoringUtilities.testProcessUpsertMonitoringItem ');

Notice the Logginglevel.ERROR? That makes sure this statement appears in the debug logs no matter what the filtering level is set to.  I want to see my method entry statement.  (Yes, I know that the debug log also produces a METHOD_ENTRY log entry at every method entry point; I typically add some more information to my entry statement.)

In any case, those debug statements are supposed to appear in the trace. BUT THEY STOPPED!?

Just to lay the groundwork here: to see debug logs in the Salesforce UI, log on and go to Settings, then search for Debug in the settings area. There is only one entry, called "Debug Log", so select it.  I always start from a clean slate, which means I delete any existing logs and any monitored users (just myself in this situation, as I am a solo administrator/developer for my company).  I then add myself back as a monitored user.  Next I run whatever process I'm working on. In my case, that is mainly an ANT build that pushes code back to the server. (Test instance first.)

Then return to the Salesforce UI and refresh the page.


Normally, I just view the log in the browser and just search for my method entry point and then go from there.

Most debug statements went missing

Today, after a break from Salesforce, I came back and did some more development. All as usual until I tried to see my debug statements. They were not there.  Other methods that ran both before and after my new code have log entries, but most entries are simply missing.

I've spent several hours (remember that APEX development is mainly about waiting for the load process to finish) trying various things and searching the web but without any success.

Debug Filtering

On the Debug Logs screen beside the user name you will see a link to set "Filters" for a user.  This lets you get more or less information.  The default setting for System.debug is to log the entry at the Debug level and the default filter is to show this Debug level.

In the trace, on the first line you can see the current levels for each category.  Here is a sample from my log (showing some debug statements but not the ones around my new code.)

APEX_CODE,FINEST;APEX_PROFILING,INFO;CALLOUT,INFO;DB,INFO;SYSTEM,DEBUG;VALIDATION,INFO;VISUALFORCE,INFO;WORKFLOW,INFO

Solution Found

There seems to be a hidden governor limit on debug log statements: something that allows ten or so at the beginning and a few at the end, but everything in the middle is dropped.  The key is the maximum size of the log file, 2 MB.

Once I built a smaller workspace with just the classes and triggers under development and modified the build file, the debug logs started to appear again.

Normally, when I make just lightweight edits, I retrieve, deploy and test the full set.  To get this smaller set to work I needed to add a runTest element for the class under development.
  
<sf:deploy deployroot="${targetDir}" password="${sf.password}" serverurl="${sf.serverurl}" username="${sf.username}">
 <runTest>MyClass</runTest>
</sf:deploy>

For more on the ant migration toolkit see my earlier posts or
http://www.salesforce.com/us/developer/docs/daas/salesforce_migration_guide.pdf


Solution Only Partial Fix

So the above "works" in that the log file now shows just the test results from deploying and testing the smaller set of files.  But none of the files actually got deployed to my sandbox, because the build failed with a long list of code coverage problems like this:


Code coverage issue, class: OpportunityUpsertTrigger -- Test coverage of selected Apex Trigger is 0%, at least 1% test coverage is required


Now, all those classes already have all the test coverage they need in the system.  The deployment does not mention any of these classes.  So why does Salesforce even look at them?

I don't have any more time for Salesforce just now. It is such a frustrating development environment.  All I am trying to do is write a simple trigger to copy values from one table to another, and it has taken a long time to get even part way done.  I am very tempted to push this work outside Salesforce, like I do with so much else, and just use Java and the partner API.  But I'm this deep into it, and doing this work in a trigger is the most logical solution from a DB design point of view.

Solution?  Run all the tests again and forgo the debug statements in my test class!  So my process is: (a) develop the code, using the deployment to run the tests; read the debug statements and adjust until things are working; then (b) deploy with the flag set to run all tests.

Here, for the record, is the Ant deploy target in my build file.

<target name="deploy" depends="setup">
  <if> <equals arg1="${allTests}" arg2="true" />
    <then>
      <echo>Deploy with all tests and rollback on error true</echo>
      <sf:deploy username="${sf.username}" password="${sf.password}" serverurl="${sf.serverurl}" deployRoot="${targetDir}"
                 runAllTests="true" rollbackOnError="true" />
    </then>
    <else>
      <echo>Deploy</echo>
      <sf:deploy username="${sf.username}" password="${sf.password}" serverurl="${sf.serverurl}" deployRoot="${targetDir}">
        <runTest>MonitoringUtilities</runTest>
      </sf:deploy>
    </else>
  </if>
</target>

Here is the step (a) command, followed by step (b):
ant -v -lib . -f build.xml -Dtarget=test -DtargetDir=workareaSmall -DallTests=false  deploy

ant -v -lib . -f build.xml -Dtarget=test -DtargetDir=workareaSmall -DallTests=true  deploy


I should mention that I also have to then:

  1. retrieve a fresh download from production, which goes in the main/full directory
  2. copy the files with changes from the small workspace directory into the main/full development work area
  3. retest with all the files on the sandbox; if the tests pass then
  4. deploy onto Production.
I always grab the latest contents from Production in case I made any changes there while developing the code.  I truly do not know what people do in the world of larger development teams.  I wish them luck!



Wednesday, November 27, 2013

InvalidSObjectFault INVALID_TYPE

Today, I spent a while trying to figure out why my calls to get an object via the Partner API failed.

In code I composed a query, just like usual, and made the call, just like usual, but I got this error:

[InvalidSObjectFault [ApiQueryFault [ApiFault  
    exceptionCode='INVALID_TYPE' 
    exceptionMessage='Select ... FROM MyCustomObject__c ...'

    sObject type 'MyCustomObject__c' is not supported. If you are attempting to use a custom object, be sure to append the '__c' after the entity name. Please reference your WSDL or the describe call for the appropriate names.']

A search for help on "InvalidSObjectFault" is frustrating and didn't help. The object "MyCustomObject__c" is valid: I could run the same query in SoqlXplorer (http://www.pocketsoap.com/osx/soqlx), so I could easily verify the query was good. I retrieved the object class files and inspected them without finding any clue. Next I used my other posting about rebuilding the partner jar file to get the latest v29, just in case this was a versioning issue.

Nothing.

Turns out the solution is simple, yet the error message sends you completely off base.

Solution

Check the user's profile permissions on the object. If the user does not have permission then the API responds with InvalidSObjectFault.

(Why couldn't the description of this fault include the fact that the sObject may be valid yet the user lacks permission?)
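A quick way to confirm this diagnosis is to list the sObjects the API user can actually see. This is a sketch against the partner API, assuming you already have an authenticated PartnerConnection:

```java
import com.sforce.soap.partner.DescribeGlobalResult;
import com.sforce.soap.partner.DescribeGlobalSObjectResult;
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.ws.ConnectionException;

class DescribeCheck {
    // Print every sObject name visible to the logged-in user.
    static void listVisibleObjects(PartnerConnection connection)
            throws ConnectionException {
        DescribeGlobalResult dgr = connection.describeGlobal();
        for (DescribeGlobalSObjectResult sObj : dgr.getSobjects()) {
            System.out.println(sObj.getName());
        }
        // If MyCustomObject__c is absent from the list,
        // fix the profile, not the query.
    }
}
```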

Wednesday, April 3, 2013

Using the new Salesforce Geolocation Fields

I started with all the links I could find on Google
https://help.salesforce.com/HTSearchResults?qry=geolocation+soql
https://help.salesforce.com/HTViewSolution?id=000159817&language=en_US
https://help.salesforce.com/HTViewSolution?id=000171328&language=en_US
http://salesforce.stackexchange.com/questions/388/geolocation-searching
http://blogs.developerforce.com/engineering/2012/06/new-geolocation-features-and-mobile-apps.html

The best sample query I could find was from that last link:
SELECT name__c, phone__c
FROM restaurant__c
WHERE DISTANCE(loc__c, GEOLOCATION(37.794915,-122.394733), "mi") <= 1


But the above did not work.  The problem?  As documented elsewhere, the query only supports < or >, not <=.  The correct query is:

SELECT name__c, phone__c
FROM restaurant__c
WHERE DISTANCE(loc__c, GEOLOCATION(37.794915,-122.394733), 'mi') < 1



I later found another link that shows the correct query. See
http://www.salesforce.com/us/developer/docs/soql_sosl/

Testing Queries


Next I found that my copy of SoqlXplorer doesn't support these queries at all.  I'll have to update my Mac operating system to get the latest SoqlXplorer.  But we can test these queries using the Developer Console: log onto Salesforce, select your name, and in the drop-down select Developer Console.

In the console use the Query Editor.

Using Geolocation with Partner API Queries

Next I found that my older version of the partner API does not support these queries either.  That made me rebuild the partner API jar.  See the previous posting:
http://sforcehacks.blogspot.ca/2013/04/building-partnerjar-api-file.html

I'll see if this works and then come back and update this posting.
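Once the rebuilt jar is in place, the call itself should be the ordinary query method. A sketch, assuming an authenticated PartnerConnection and the restaurant object from the sample above:

```java
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.soap.partner.QueryResult;
import com.sforce.soap.partner.sobject.SObject;
import com.sforce.ws.ConnectionException;

class NearbyQuery {
    // DISTANCE is evaluated server side, so only the query string changes.
    static void findNearby(PartnerConnection connection)
            throws ConnectionException {
        QueryResult qr = connection.query(
            "SELECT Name__c, Phone__c FROM Restaurant__c " +
            "WHERE DISTANCE(Loc__c, GEOLOCATION(37.794915,-122.394733), 'mi') < 1");
        for (SObject record : qr.getRecords()) {
            System.out.println(record.getField("Name__c"));
        }
    }
}
```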

Building the Partner.jar API file

The SF documentation is missing some information needed to build the latest partner.jar API file.  Here is the link I found that gets you started
http://www.salesforce.com/us/developer/docs/api_asynch/Content/asynch_api_code_set_up_client.htm

The missing part is a dependency on a JavaScript library called Rhino.  If you follow the instructions given, you get:

~/Documents/workspace2/salesforce/carmanah_salesforce$ java -classpath force-wsc-27.0.0.jar com.sforce.ws.tools.wsdlc partner.wsdl partner-27.jar
[WSC][wsdlc.run:320]Created temp dir: /var/folders/D0/D0me7a5vFGmGh95S1IO9PU+++TI/-Tmp-/wsdlc-temp-3884699697501788635-dir
[WSC][wsdlc.<init>:81]Generating Java files from schema ...
Exception in thread "main" java.lang.NoClassDefFoundError: org/mozilla/javascript/Scriptable
    at com.sforce.ws.tools.TypeGenerator.generate(TypeGenerator.java:73)
    at com.sforce.ws.tools.wsdlc.generate(wsdlc.java:291)
    at com.sforce.ws.tools.wsdlc.generateTypes(wsdlc.java:278)
    at com.sforce.ws.tools.wsdlc.<init>(wsdlc.java:81)
    at com.sforce.ws.tools.wsdlc.run(wsdlc.java:320)
    at com.sforce.ws.tools.wsdlc.main(wsdlc.java:311)
Caused by: java.lang.ClassNotFoundException: org.mozilla.javascript.Scriptable
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 6 more

The solution is to get the latest Rhino library from http://mvnrepository.com/artifact/rhino
and add it to the classpath on the command line.

Just for the record, the WSC download is here http://mvnrepository.com/artifact/com.force.api/force-wsc
I'm working with v27 today,  file: "force-wsc-27.0.0.jar"
I also downloaded rhino jar file: "js-1.7R2.jar"

I am working on a Mac. I placed both downloaded jar files in my working directory and downloaded the partner.wsdl file as well.  Then this command line produces the partner jar file:

java -classpath force-wsc-27.0.0.jar:js-1.7R2.jar com.sforce.ws.tools.wsdlc partner.wsdl partner-27.jar

On Windows, the classpath separator is a semi-colon instead. Like this:
java -classpath force-wsc-27.0.0.jar;js-1.7R2.jar com.sforce.ws.tools.wsdlc partner.wsdl partner-27.jar



Thursday, February 28, 2013

Using the Migration Tool - Part 2

This post is a continuation of Working Sandbox and Production Using the Force Migration Tool
This post assumes we have made the build.xml file described in that posting.

Using the Ant Force.com Migration Tool:

  • I'll show you how to perform a typical cycle: retrieve from Production; deploy to Sandbox; fix; deploy to Sandbox and test; then deploy to Production.
  • Then I'll show you how to create a new APEX  Class.
  • Then I'll show you how to create a new APEX Trigger.

Typical workflow to fix something on Production.  

  1. Edit the workArea/package.xml file to specify which classes, triggers or objects I need to fix. See this post for more on how to do this.
  2. Run the ant tool to retrieve from production.  
    1. ant -v -lib . -f build.xml -Dtarget=prod retrieve
  3. Run the ant tool to deploy to the sandbox.   
    1. ant -v -lib . -f build.xml -Dtarget=test deploy
  4. Assuming the deploy worked, it is now time to fix what you need to fix.  Include test cases to verify your fix and provide the 75%+ test coverage required by Salesforce.  Strive for 90% coverage.
  5. Run the ant tool to deploy to the sandbox.  Test cases will run to validate your fix.
    1. ant -v -lib . -f build.xml -Dtarget=test deploy
  6. Run the ant tool to deploy to production.
    1. ant -v -lib . -f build.xml -Dtarget=prod deploy
Nice and clean.
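For step 1, the package.xml simply names what to retrieve and deploy. A minimal example; the class and trigger names here are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>MonitoringUtilities</members>
        <name>ApexClass</name>
    </types>
    <types>
        <members>OpportunityAfterUpsertTrigger</members>
        <name>ApexTrigger</name>
    </types>
    <version>26.0</version>
</Package>
```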


Here is the complete build file:

<project name="Salesforce Project"  default="work" basedir="." xmlns:sf="antlib:com.salesforce">
<taskdef resource="net/sf/antcontrib/antcontrib.properties"/>

<target name="setup" >
  <!-- on command line include "-DtargetDir=src" to change the target -->
  <property name="targetDir"  value="workarea" />
  <if>
   <equals arg1="${target}" arg2="test" />
   <then>
     <property file="buildTest.properties"/>
     <property name="system"  value="TEST" />
   </then>
   <elseif>
    <equals arg1="${target}" arg2="prod" />
    <then>
     <property file="buildProd.properties"/>
     <property name="system"  value="PROD" />
    </then>
   </elseif>
   <else>
     <echo message="The value of property target is not 'test' or 'prod'" level="error" />
     <fail/>
   </else>
  </if>
</target>

<target name="work"  depends="setup">
   <echo>The ${system} user name is ${sf.username}</echo>
</target>


<!-- Retrieve an unpackaged set of metadata from your org -->
<!-- The file ${targetDir}/package.xml lists what is to be retrieved -->
<target name="retrieve" depends="setup">
   <!-- Retrieve the contents into another directory -->
   <sf:retrieve username="${sf.username}" password="${sf.password}" serverurl="${sf.serverurl}" retrieveTarget="${targetDir}" unpackaged="${targetDir}/package.xml"/>
</target>

<!-- Deploy the unpackaged set of metadata retrieved with retrieveUnpackaged -->
<!-- The file ${targetDir}/package.xml lists what is to be deployed -->
<target name="deploy" depends="setup">
   <sf:deploy username="${sf.username}" password="${sf.password}" serverurl="${sf.serverurl}" deployRoot="${targetDir}"/>
</target>

</project>



Create APEX Class Using Ant

So here is a snippet to include in your build.xml file to generate a blank APEX class.

<!--
Sample: Create an APEX class named Foo
    ant -v -lib . -f build.xml -Dtarget=test -DclassName=Foo createClass
Results:
Two files:
Foo.cls
Foo.cls-meta.xml

-->
<target name="createClass" depends="setup" >
    <fail unless="className" message="Must provide className (-DclassName=Foo) "/>
    <mkdir dir="${targetDir}/classes"/>
    <echo file="${targetDir}/classes/${className}.cls" append="false">public with sharing class ${className} {
}</echo>
    <copy file="templates/classes.meta.xml"  tofile="${targetDir}/classes/${className}.cls-meta.xml" />
</target>
 


Before running this create a directory for the templates

~/Documents/salesforce/dev$ mkdir templates/

Then create the file classes.meta.xml  and insert the following:

<?xml version="1.0" encoding="UTF-8"?>
<ApexClass xmlns="http://soap.sforce.com/2006/04/metadata">
    <apiVersion>26.0</apiVersion>
    <status>Active</status>
</ApexClass>


Now when you run ant like this
ant -v -lib . -f build.xml -Dtarget=test -DclassName=Foo createClass

you get classes/Foo.cls with this content

public with sharing class Foo {
}


and you get classes/Foo.cls-meta.xml with the necessary meta definition.  When you deploy, this new class will be pushed to the server.




Create APEX Trigger Using Ant

I have a simple naming convention for my trigger files: Object + "Before" or "After" + "UpsertTrigger".  For example, OpportunityAfterUpsertTrigger works on Opportunity objects after insert or update.  I generally only build before or after upsert triggers, meaning I generally have triggers like this
trigger OpportunityAfterUpsertTrigger on Opportunity (after insert, after update) {
}

or this
trigger OpportunityBeforeUpsertTrigger on Opportunity (before insert, before update) {
}


Inside these triggers I generally collect the objects that need to be worked on and ship them over to another class that does the work and is easier to test.
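That delegation pattern looks something like this; the handler class name is illustrative:

```apex
trigger OpportunityAfterUpsertTrigger on Opportunity (after insert, after update) {
    // Hand the records to a plain class where the logic can be unit tested.
    OpportunityUpsertHandler.process(Trigger.new, Trigger.oldMap);
}
```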

So here is a snippet to include in your build.xml file to generate a blank trigger. 

<!--
Sample: Create an after upsert trigger on object Opportunity
    ant -v -lib . -f build.xml -Dtarget=test -Dobject=Opportunity -Dtype=after createTrigger
Results:
Two files:
OpportunityAfterUpsertTrigger.trigger
OpportunityAfterUpsertTrigger.trigger-meta.xml

-->
<target name="createTrigger" depends="setup">
  <fail unless="object" message="Must provide Object for the trigger"/>
  <fail unless="type" message="Must provide type 'before' or 'after' "/>
  <if>
    <equals arg1="${type}" arg2="before" />
    <then>
      <property name="triggerName" value="${object}BeforeUpsertTrigger" />
    </then>
    <elseif>
      <equals arg1="${type}" arg2="after" />
      <then>
        <property name="triggerName" value="${object}AfterUpsertTrigger" />
      </then>
    </elseif>
    <else>
      <echo message="The value of property type is not 'before' or 'after'" level="error" />
      <fail/>
    </else>
  </if>
  <mkdir dir="${targetDir}/triggers"/>
  <echo file="${targetDir}/triggers/${triggerName}.trigger" append="false">trigger ${triggerName} on ${object}</echo>
  <if> <equals arg1="${type}" arg2="before" />
    <then>
      <echo file="${targetDir}/triggers/${triggerName}.trigger" append="true"> (before insert, before update) {
}</echo>
    </then>
    <elseif> <equals arg1="${type}" arg2="after" />
      <then>
        <echo file="${targetDir}/triggers/${triggerName}.trigger" append="true"> (after insert, after update) {
}</echo>
      </then>
    </elseif>
  </if>

  <copy file="templates/trigger.meta.xml" tofile="${targetDir}/triggers/${triggerName}.trigger-meta.xml" />

</target>


Before running this create a directory for the templates

~/Documents/salesforce/dev$ mkdir templates/

Then create the file trigger.meta.xml  and insert the following:

<?xml version="1.0" encoding="UTF-8"?>
<ApexTrigger xmlns="http://soap.sforce.com/2006/04/metadata">
    <apiVersion>26.0</apiVersion>
    <status>Active</status>
</ApexTrigger>



Now when you run ant like this
ant -v -lib . -f build.xml -Dtarget=test -Dobject=Opportunity -Dtype=after createTrigger

you get triggers/OpportunityAfterUpsertTrigger.trigger  with this content

trigger OpportunityAfterUpsertTrigger on Opportunity (after insert, after update) {
}


and you get triggers/OpportunityAfterUpsertTrigger.trigger-meta.xml with the necessary meta definition.  When you deploy, this new trigger will be pushed to the server.