Lawson Process Automation

Lawson AGS Caching Pt 2

I previously posted about what Lawson AGS caching was here. After some questions and reviewing the post, I realize that I didn’t fully explain how to use it within the context of Process Automation or ProcessFlow Integrator. Hopefully, this post will answer those questions. Since we’re on Process Automation, my example will be from PA. The only difference between Process Automation and ProcessFlow Integrator here is that PA will return the XML from the Lawson Transaction node (the AGS call); in ProcessFlow Integrator you will have to use a WebRun instead. Neither product (to my knowledge) will return the _TRANSID as one of the fields from the AGS node.

In this example, I will inquire on GL10.1 with cache set to true and then update the Company name field. For those who are just joining us (and didn’t read part 1), by turning on caching, I can update just the company name field in the update node instead of having to pass in all of the fields again.

Here is my sample flow:
[Flow diagram: example _TRANSID flow]

  1. I have two start node variables: strXML and strTRANSID.
  2. Inquire on GL10.1. If you’re on ProcessFlow Integrator, use a WebRun here (see the sketch after the inquire call below).
  3. Fix the XML data. The flow’s XML parse (and the E4X property access that follows) cannot handle a period (.) in an element name, and unfortunately the AGS response names its form element after the form (GL10.1 in this case). We must remove the period or the parse will fail.
  4. Parse the XML string.
  5. Get the _TRANSID. This step is not strictly necessary; you could reference the parsed value directly in the update call, but a named variable keeps it readable.
  6. Update the company name on GL10.1, passing the cached _TRANSID.

GL10.1 Inquire

_PDL=PROD&_TKN=GL10.1&_EVT=CHG&_CACHE=TRUE&_LFN=ALL&FC=I&GLS-COMPANY=10
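
If you’re on ProcessFlow Integrator, the same string goes into a WebRun node instead of a Lawson Transaction node. Here is a minimal sketch, assuming the standard AGS servlet path (older environments use cgi-lawson/ags.exe) and approximate field labels – verify both against your own environment:

Web Program: servlet/Router/Transaction/Erp
Post String: _PDL=PROD&_TKN=GL10.1&_EVT=CHG&_CACHE=TRUE&_LFN=ALL&FC=I&GLS-COMPANY=10

The XML response then comes back in the WebRun node’s output data variable rather than from a Lawson Transaction node.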

Update XML string

strXML=AGSInquire_outputData.replace(/GL10\.1/g,"GL101"); //escape the dot so it matches a literal period, then strip it from the element name

Parse XML
The action is “Parse XML String” and the input value is the strXML variable built in the Assign node above

Get the TRANSID

strTRANSID=XMLParse_output.GL101[0]._TRANSID

GL10.1 Update

_PDL=PROD&_TKN=GL10.1&_EVT=CHG&_CACHE=TRUE&_LFN=ALL&FC=C&GLS-COMPANY=10&GLS-NAME=NewCompanyName&_TRANSID=<!strTRANSID>

HTH

Predicting the future with Process Automation or What Would Process Automation Do?

I hate surprises. I hate surprises to the point that I actually don’t care if I know what happens in a movie before I see it. Consequently, I spend a lot of time trying to figure out what is going to happen next, just so I’m not surprised. This got me thinking about predicting the future with Process Automation (yes, I realize this is an odd thing to think about, but you have to have a hobby, right?). It is fairly easy to figure out what happened in the past in a flow, either through a variable you have set or by viewing the output variables from a node, but is it so much harder to figure out what is going to happen next? As it turns out, not really.

I’m not sure that I have a business case for this; it’s just an interesting exercise, which is why you got the preamble about me and surprises. Backstory aside, Process Automation (and ProcessFlow – although slightly more complicated there) can be self-aware. By being self-aware, and given that a flow is a static set of instructions, you CAN predict the future – as in, you can figure out what will happen next within a flow. Before I lose everyone, I guess I should start getting to the good stuff.

It all starts with the fact that a flow is an XML document. With Process Automation you can read the XML that comprises the flow and search for the information that you need using E4X syntax. All of the information about the flow is in the XML, so you can effectively read ahead to what the flow will do next and thus predict the future.

Here is a simple flow that demonstrates reading ahead. I will attempt to determine the value in the “To” field of the email node at the end.
[Flow diagram: ReadNextNodeFlow]

In order:

  1. Run a Landmark query to retrieve the XML of the flow. If you’re using ProcessFlow Integrator, you would need to retrieve the XML from the file system.
  2. Parse the string into an XML document.
  3. Use JavaScript (E4X) to retrieve the To value of the email node.
  4. Email node – it just exists so we can read it.

Landmark Query
_dataArea="prod" & _module="pfi" & _objectName="PfiFlowDefinition" & _actionName="Find" & _actionOperator="NONE" & _actionType="SingleRecordQuery" & _pageSize="30" & PfiFlowDefinition="ReadFutureNode" & CurrentFlowXml
Note that this assumes you know the name of the flow and that it has been published to the server.

The XML document has the following basic structure:

<process>
  <processUserNode></processUserNode>
  <activities>
    <activity>
      <prop><anyData></anyData></prop>
      ...
      <prop><anyData></anyData></prop>
      <OnActivityError />
    </activity>
  </activities>
  <edges>
    <edge />
    ...
    <edge />
  </edges>
</process>
  • processUserNode – contains information set at the process level
  • activities – contains all of the activity nodes
  • activity – information about the node itself (type, location, etc)
  • prop – the settings in the node. There may be many prop nodes
  • OnActivityError – contains information on what to do on error
  • edges – container node
  • edge – indicates the “next” node

Retrieving the To value
I recommend this page as a reference for using E4X

//Find 'this' node in the edge list
var xmlThisNode = XMLParseFlow_output.edges[0].edge.(@from=='Assign8190'); 
//get the ID of the next node (email)
var strNextNodeID = xmlThisNode[0].@to;  
//Get the activity node for the email
var xmlNextNode = XMLParseFlow_output.activities[0].activity.(@id==strNextNodeID); 
//Pull the To property 
var xmlEmailToNode = xmlNextNode[0].prop.(@name=='to');  
//Get the value of To
strEmailTo = xmlEmailToNode.anyData; 
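
If you want to guard against the next node not actually being an email node (in which case the ‘to’ prop won’t exist and the E4X query returns an empty list), you could wrap that last line; a small sketch:

//Only read the To value if the next node actually has a 'to' prop
if (xmlEmailToNode.length() > 0) {
    strEmailTo = xmlEmailToNode[0].anyData;
} else {
    strEmailTo = "";  //next node is not an email node, or no To was set
}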

I suppose you could use this technique to retrieve the initial value of the Start node variables to see if they have changed during the flow. There might be other uses as well. I’m not sure if this is really something you would ever actually do in a production environment, but I feel better knowing I can predict the future.
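
If you wanted to try the Start-variable idea, here is a purely hypothetical sketch. It assumes the design-time values show up as prop elements under processUserNode – inspect your own flow’s CurrentFlowXml to confirm the actual element and attribute names before relying on it (strSomeVariable is a placeholder):

//Hypothetical: read the design-time value of a Start node variable.
//Assumes Start variables appear as <prop> elements under <processUserNode>;
//verify the structure against your own flow's XML.
var xmlStartProp = XMLParseFlow_output.processUserNode[0].prop.(@name == 'strSomeVariable');
var strOriginalValue = (xmlStartProp.length() > 0) ? String(xmlStartProp[0].anyData) : "";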

HTH

Calling Process Automation from the command line

I discussed this briefly during my presentations at Inforum this year. The basic business requirement is that there are occasions when you need to trigger a workunit from somewhere other than the application (S3/M3/LMK) or a schedule. In Process Automation you can use Channels to poll for new data (files, JMS, etc.), but there are times when you still need more flexibility.

Need some examples? Okay, how about as part of a script or set of processes? For us, a good example is our ACH process for Accounts Payable. Our bank (PNC) will not accept the files produced by the Lawson AP160 job, so we need to manipulate them. The reason we can’t use File Channels in Process Automation is that the bank requires us to send one file containing all of our bank accounts (currently 85), with separate batches for each account. The easiest way to accomplish this is to have a script that runs as the last step of a multistep job, after all of the AP160 steps. That script simply calls a Java program that triggers a process flow to process the files and send them to PNC. There are several other examples, such as extending the ability to do scripting. Imagine being able to call a script that can update the RM attributes on a user. Pretty nice option, eh?

Hopefully, I don’t really need to convince you this is a good idea. By being able to call a process flow at will, you can eliminate customizations in your application environment, which is a good thing. Below is Java code that will trigger a workunit. The code is fairly verbosely commented regarding what it’s doing, so you should be able to modify it to suit your needs without any more commentary from me. You will need to update the appropriate variables (host, user, password, etc.), create a process flow called “TestJavaProcess”, and upload it to your server. After you run the Java program, you can review the workunit that was created to see where the values from the program appear, so you know what to update.

/* ==============================================================================
 * TestCallLPA.java
 *
 * Description:
 *  Program is designed to call LPA flows from the command line
 *
 * ============================================================================== */
import com.lawson.bpm.eprocessserver.interfaces.ProcessRequest;
import com.lawson.bpm.eprocessserver.interfaces.ProcessResponse;
import com.lawson.bpm.eprocessserver.interfaces.ProcessVariable;
import com.lawson.bpm.eprocessserver.interfaces.ProcessFolder;
import com.lawson.bpm.eprocessserver.interfaces.LPSSession;


public class TestCallLPA
{
    public static LPSSession session;
    public static ProcessRequest request = new ProcessRequest();
    public static ProcessResponse response;
    //===================================================
    public static void main(String[] args) {
        try {
            String HostName = "gridhost"; //grid host name
            Integer PortNumber = 50005; //port number that grid is listening on
            String UserName = "user"; //a valid admin LMK user name
            String UserPwd = "password";  //password for UserName
            String LMKProd = "prod"; //note this is the LMK PL
            String ProcessName = "TestJavaProcess";
            String ProcessTitle = "Java API Workunit";
            String KeyString = "123456789"; //together with the key value, this needs to be a unique string
            String KeyValue = "KeyString"; //together with the key string, this needs to be a unique string
            Boolean textOutput = false; //set to true to print return values to screen - infocode, return message, outputdata with | separators
            Boolean returnData = false; //set to true to output data -- will need a return node
            Boolean Async = false; //set to true to trigger WU without waiting for response
            request.setSystem("Landmark");
            request.setSource("JavaAPI");
            request.setFlowName(ProcessName);
            request.setWorkTitle(ProcessTitle);
            request.setFilterKey("FILTERBY");
            request.setFilterValue("FILTERVALUE");
            request.setKeyString(KeyString,KeyValue);
            /* For demo purposes, this is commented out.  If you have a service, set here */
            //request.setService(setting[1]);
            //Criteria 1
            request.setBizCriteria(request.BIZ_CRITERIA_1,"CRITERION1");
            //Criteria 2
            request.setBizCriteria(request.BIZ_CRITERIA_2,"CRITERION2");
            //Criteria 3
            request.setBizCriteria(request.BIZ_CRITERIA_3,"CRITERION3");

            //Start adding variables
            //Boolean
            ProcessVariable variable = new ProcessVariable("BOOLEAN","true",ProcessVariable.TYPE_BOOLEAN);
            request.addVariable(variable);
            //Integer variable
            variable = new ProcessVariable("INTEGER","1",ProcessVariable.TYPE_INT);
            request.addVariable(variable);
            //Decimal variable
            variable = new ProcessVariable("DOUBLE","1.00",ProcessVariable.TYPE_DBL);
            request.addVariable(variable);
            //Date variable
            variable = new ProcessVariable("DATE","01/01/2013",ProcessVariable.TYPE_DATE);
            request.addVariable(variable);
            //Long Integer variable
            variable = new ProcessVariable("LONG","123456789",ProcessVariable.TYPE_LONG);
            request.addVariable(variable);
            //Object Variable -- not sure how to pass
            variable = new ProcessVariable("OBJECT","",ProcessVariable.TYPE_OBJECT);
            request.addVariable(variable);
            //Array variable -- not sure how to pass
            variable = new ProcessVariable("ARRAY","",7); //Array process type is not documented and is not TYPE_ARRAY
            request.addVariable(variable);
            //Add input data
            request.setInputData("Some input data");

            //Connect to grid and create a session
            session = LPSSession.createGridSession(HostName,PortNumber,UserName,UserPwd,LMKProd);
            //Create workunit
            //Pass in the built request from above and set Async value
            response = createWU(request,Async);
            //If user selected Async then createWU will exit and we won't get here
            int eRC = response.getErrorCode();

            //Deal with response
            if (textOutput) {
                System.out.println(response.getInformationCode()+"|"+response.getReturnMessage()+"|"+response.getOutputData());
            }
            if (returnData) {
                System.out.print(response.getOutputData());
            }

            //Cleanup and close out
            session.close();
            System.exit(eRC);
        } catch (Exception e) {
            e.printStackTrace();
            System.exit(1);
        }
    }

    //===================================================
    //Trigger workunits based on sync vs async
    public static ProcessResponse createWU(ProcessRequest request,Boolean Async) {
        try {
            if (Async) {
                response = session.createAndReleaseWorkunit(request);
                session.close();
                System.exit(response.getErrorCode());
            } else {
                response = session.runRequest(request,true);
            }
        } catch (Exception e) {
            e.printStackTrace();
            System.exit(1);
        }
        return response;
    }
}

In order to compile the code, you will need to make sure that the following files are in your classpath (and the directory you save the code file above to):

  • sec-client.jar
  • bpm-interfaces.jar
  • type.jar
  • security.jar
  • clientsecurity.jar
  • grid-client.jar
  • lawutil_logging.jar

A basic classpath setup, compile, and call (on Unix) would be as follows. Note the trailing “.” on the classpath so that java can find the compiled TestCallLPA class in the current directory:

$ export CLASSPATH=/lawson/lmrkstage/env/java/thirdParty/sec-client.jar:/lawson/lmrkstage/env/java/jar/bpm-interfaces.jar:/lawson/lmrkstage/env/java/jar/type.jar:/lawson/lmrkstage/env/java/jar/security.jar:/lawson/lmrkstage/env/java/jar/clientsecurity.jar:/lawson/lmrkstage/env/java/thirdParty/grid/grid-client.jar:/lawson/lmrkstage/env/java/jar/lawutil_logging.jar:.
# Compile 
$ javac TestCallLPA.java
# Call program
$ java TestCallLPA

HTH

Lawson approvals – do it your way

Lawson approvals my way? What does that even mean and why would I want to do it?

Customizing approvals means changing how workunits route for approval in Lawson ProcessFlow and Lawson Process Automation. This post is specifically about S3, but it should apply to Landmark and M3 as well. My company is on Lawson Process Automation, so that’s what will be in the examples, but the same concepts should apply to ProcessFlow.

Let’s first talk about how Lawson Process Automation (LPA) seeks approvals in the User Action nodes. (The rest of this post will assume that you are familiar with the concept of approvals in ProcessFlow or LPA. If you’re not, check out the documentation for your product before continuing, as it might be confusing otherwise.) User Action nodes use what is called “Category Filtering”. First, a task is assigned to a user. For that task, a filter category and value are assigned (you can assign multiple categories and values). Next, each User Action node in a flow will also have a task (or multiple tasks) assigned to it. Finally, when the workunit is created, it will have a filter category and filter value on it, based on data in the source application. The inbasket uses the filter category and filter value from the workunit to display only the relevant workunits to the user who is logged in.

Easy, right?

Okay, maybe not.

Keep these key words in mind:

  • Task = Inbasket. There are several philosophies on how to set these up, but they will basically equate to either the person or the type of approval they are performing. Examples might be Manager or Invoice Approver.
  • Filter Category = What kind of information is relevant for this workunit.
  • Filter Value = The actual value that will be filtered on for the workunit.

Here is an example setup:
User: Jdoe
Task: Invoice Approver
Filter Category: ACCTUNIT
Filter Value: 123456

In this scenario, Jdoe would only be able to see workunits in his inbasket that have BOTH the Filter Category of ACCTUNIT and the Filter Value of 123456.

Okay, now that that’s out of the way, let’s get to the fun stuff. Ultimately, the problem comes down to the fact that workunits created by the source applications (S3, M3, Landmark) rarely have Filter Categories that are of any use. Case in point, the Invoice Approval service uses Authority code, which has to be keyed at the time of the invoice (or defaulted from some place like Vendor). This creates a bit of an issue for us. With 2000 Managers responsible for their own accounting units, it means that AP would need to actually know which of the 2000 managers needed to approve the invoice. Not gonna happen. In a much larger context, it also limits us to basically one level of approval if we were to actually set up a code for each manager because we wouldn’t be able to route to another level easily like a Vice President. Each VP would need to have the same setup as each of the managers that might escalate to them instead of something that makes sense like Company. I’m not saying it’s impossible, it would just be extremely messy. If I’m a VP and there are three managers that approve for Accounting Units in my company, I would need to have each of their approval codes assigned to me. If the manager happens to also approve for an accounting unit in another company, the VP responsible for that company would also need that Approver code assigned to them. Add in the fact that Approval code is only 3 characters and we’re going to wind up with codes like 0X0 and 1CF that AP would probably never get right.

Truth be told, we don’t really want to get approvals by approval code anyway. Here is how our invoice approval works: if the invoice is part of a Capital project, route it to the Project Manager; if it’s for a Service, route it to the manager of the cost center receiving the service. So not only do we NOT want to use Approval Code, we actually want to use different approvals depending on the type of invoice.

The question is, how do we do that? The answer is that we modify the Filter Category and value. There are people thinking right now, “Okay, so we need to modify the 4GL of the library that creates the workunit to have the correct Filter Category and Filter Value, right?” You would be correct: you could do that. If you’re one of those people (or you’re a process flow person in an environment that thinks like that), I feel sorry for you. You’re doing it the hard way. Not only will you have a modified library that you have to be careful of when you apply patches, you have also created extra effort when you want to upgrade. Good for job security, but bad for your business.

So now you’re thinking, “Okay smarty, since you just insulted me, how do you propose that we do it?” I’m going to advocate that you dynamically change the Filter Category and value IN THE FLOW ITSELF. Here’s an interesting bit of information – the Filter Category and Filter Value on the workunit record are NOT the values used to create the inbasket task. What actually happens (as near as I can tell) is that these values are used to populate two flow variables (in LPA they are called oCatKey and oCatValue – I believe this is the same in ProcessFlow). It is these flow variables that are used in the creation of the inbasket task. All you have to do is add an Assign node to your flow before the User Action node and add a JavaScript expression to set these two values to whatever you want. Voila! The inbasket will now use the Filter Category and Filter Value that you set in your Assign node.

Here’s the code to change a workunit to match what I set up for Jdoe above:

oCatKey = "ACCTUNIT"; //Filter category
oCatValue = "123456"; //Filter Value

For practical purposes and debugging, we are in the habit of also making calls to update the workunit itself with the correct Filter Category and Filter Value; it makes workunits easier to find when doing research. The added bonus of dynamically changing the values is that you can change the Filter Category and Filter Value as many times as you need in a flow. Thinking outside the box (we don’t do this) – you could have an invoice approved by a manager for an accounting unit, then change the filtering to company and route to a VP responsible for that company if it’s over a certain dollar amount. In this case, you would only need to set up the number of companies that you have for VPs, instead of having to set up all of the accounting units for a company (which you would have to do if you went the 4GL route). You could change it again and send it to a treasury analyst based on the bank account it would pay out of, to make sure that funds were available.
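
To make the VP example concrete, here is a minimal sketch of the Assign node you might place ahead of a second, VP-level User Action node. The invoice amount and company variables (nInvoiceAmount, strCompany) are placeholders for whatever your own flow actually carries:

//Hypothetical second-level routing: once the accounting unit manager has approved,
//switch the filtering to company for invoices over a threshold.
//nInvoiceAmount and strCompany are placeholder variables.
if (nInvoiceAmount > 10000) {
    oCatKey = "COMPANY";      //Filter Category the VP task filters on
    oCatValue = strCompany;   //Filter Value, e.g. "0010"
}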

The possibilities are pretty much endless at that point.

HTH