Tuesday, December 28, 2010

Changing servers for business rules deployment

In our migration to webMethods 8, we changed server names, since we have 7 and 8 running concurrently in dev.

In order to deploy our business rules to the Integration Server, I loaded them into Blaze 6.8, but I didn't see a way to change the server. Searching the files in the repository while Blaze was closed revealed two places where the hostname and connection info are stored in plain text:
In the <Project Name> file:

   <instance ref='IS Host'>
    <instance ref='value'>
     <value>hostname</value>
    </instance>

In the <Project Name>.innovator_attbs file:

   managementProperty.IS\ Host=hostname
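
If you have several projects to repoint, a hedged sketch of the swap (file paths and hostnames are illustrative; keep Blaze closed and back up the repository first):

  import java.io.IOException;
  import java.nio.charset.StandardCharsets;
  import java.nio.file.Files;
  import java.nio.file.Path;
  import java.nio.file.Paths;

  // Swaps the IS hostname in the two Blaze repository files that store it.
  public final class SwapBlazeHost {
      public static void main(String[] args) throws IOException {
          Path[] files = {
              Paths.get("repository", "MyProject"),                 // <Project Name>
              Paths.get("repository", "MyProject.innovator_attbs")  // attributes file
          };
          for (Path file : files) {
              String text = new String(Files.readAllBytes(file), StandardCharsets.UTF_8);
              String updated = text.replace("old-host.example.com", "new-host.example.com");
              Files.write(file, updated.getBytes(StandardCharsets.UTF_8));
          }
      }
  }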


That worked; then I had to regenerate my business object model from IS. The object hadn't changed, but it would not sync up. After regenerating, it deployed fine.

Migration to webMethods 8 - more info

Found this guide while searching around:

http://www.scribd.com/doc/35935457/8-x-Web-Methods-Upgrade-Guide

Handling CSV files

We were interfacing with an external vendor who had a CSV-based RESTful interface. I had built a system for creating and parsing the XML files. Then my co-worker reminded me about the built-in flat file handling routines. Duh!

  1. Create a flat file dictionary with all the different CSV formats you will need.
  2. Create a flat file schema for each transaction (referencing the object in the dictionary).
  3. Create a document type from the schema.
  4. Send the CSV data to pub.flatFile:convertToValues, passing the appropriate schema in the ffSchema entry on the service input. (You can just copy the schema from the navigator and paste it into the value for ffSchema.) A sketch of this step follows the list.
  5. Map ffValues to a document reference list of the document type you created in step 3.
It was easy and straightforward, and it handles all the parsing for you.
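
Here's a hedged Java-service sketch of step 4. The ffData/ffSchema/ffValues pipeline names come from the WmFlatFile documentation; the schema name and the csvData input are illustrative:

  import com.wm.app.b2b.server.Service;
  import com.wm.data.IData;
  import com.wm.data.IDataCursor;
  import com.wm.data.IDataFactory;
  import com.wm.data.IDataUtil;
  import com.wm.lang.ns.NSName;

  public final class CsvParsing {
      // Parses raw CSV text against a flat file schema via pub.flatFile:convertToValues.
      public static final void parseCsv(IData pipeline) throws Exception {
          IDataCursor pc = pipeline.getCursor();
          String csv = IDataUtil.getString(pc, "csvData"); // illustrative input name
          pc.destroy();

          IData input = IDataFactory.create();
          IDataCursor ic = input.getCursor();
          IDataUtil.put(ic, "ffData", csv);
          IDataUtil.put(ic, "ffSchema", "MyApp.schemas:vendorOrderCSV"); // illustrative
          ic.destroy();

          IData output = Service.doInvoke(NSName.create("pub.flatFile", "convertToValues"), input);
          IDataCursor oc = output.getCursor();
          IData ffValues = IDataUtil.getIData(oc, "ffValues"); // the parsed records
          oc.destroy();

          pc = pipeline.getCursor();
          IDataUtil.put(pc, "ffValues", ffValues);
          pc.destroy();
      }
  }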

Monday, December 27, 2010

Migration to Webmethods 8

Based on errors we were having in 7.1.3 (we think our install was corrupt), we finally decided to migrate to 8.

Here's what we've done so far.
  1. Installed and configured IS 8 and MWS 8.
  2. Installed JDBC drivers.
  3. Exported all the IS packages we wanted to keep from 7, placed them in replicate/inbound on IS 8, and installed them from there.
  4. Copied properties files from 7 to 8.
  5. Set up all existing scheduled tasks.
  6. Synced all the publishable documents with the WM8 Broker.
  7. Set the permissions for all our web service connectors to Anonymous (they seemed to revert to the default when installed from the inbound directory).
  8. Created all the web service consumer aliases.

This got IS up and running, and it seems fine. We still need to regression test it, but all the packages passed basic tests.


Then I moved on to porting our process models from WM7 into WM8. When I opened the projects, I had to change the JRE from 1.5.0 to 1.6.0, and I also had to fix a couple of library issues. (We didn't want to try to deploy the process models directly into 8, as we have had numerous problems with them, so we wanted to start from scratch on them.)

I also created a new process, and it is working fine in MWS 8. The ported processes are getting a null pointer exception on the task interface; I'm still investigating that.

But so far the IS upgrade has been pretty smooth. The MWS upgrade is still taking some time; more on that later.

Wednesday, December 22, 2010

Sending high priority emails

Use PSUtilities.email:smtp (instead of pub.client:smtp) and set priority = 1.
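
A hedged sketch of invoking it from a Java service. The input names below mirror pub.client:smtp and may not match PSUtilities.email:smtp exactly, so check the actual signature in your PSUtilities package:

  import com.wm.app.b2b.server.Service;
  import com.wm.data.IData;
  import com.wm.data.IDataCursor;
  import com.wm.data.IDataFactory;
  import com.wm.data.IDataUtil;
  import com.wm.lang.ns.NSName;

  public final class HighPriorityMail {
      public static final void send(IData pipeline) throws Exception {
          IData in = IDataFactory.create();
          IDataCursor c = in.getCursor();
          IDataUtil.put(c, "to", "ops@example.com");          // illustrative addresses
          IDataUtil.put(c, "from", "is@example.com");
          IDataUtil.put(c, "subject", "ALERT: feed is down");
          IDataUtil.put(c, "body", "The vendor feed has been failing since 02:00.");
          IDataUtil.put(c, "mailhost", "smtp.example.com");
          IDataUtil.put(c, "priority", "1");                  // 1 = high priority
          c.destroy();
          Service.doInvoke(NSName.create("PSUtilities.email", "smtp"), in);
      }
  }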

Tuesday, December 21, 2010

Hashtable/Hashmap

Wow, you would think this would score higher in Google searches, but after several pages of results I finally found PSUtilities.hashtable. I was going to write something up, or try to find a clever way to build a document so it looked like a hashmap. Glad I didn't waste that time, but it sure wasn't easy to find!
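
For reference, the do-it-yourself route would have looked something like this pair of Java services, which park a java.util.HashMap in the pipeline as a plain Object (all names are illustrative):

  import com.wm.data.IData;
  import com.wm.data.IDataCursor;
  import com.wm.data.IDataUtil;
  import java.util.HashMap;
  import java.util.Map;

  public final class MapServices {
      // Creates an empty map and drops it into the pipeline as an Object.
      public static final void createMap(IData pipeline) {
          IDataCursor c = pipeline.getCursor();
          IDataUtil.put(c, "map", new HashMap<String, Object>());
          c.destroy();
      }

      // Adds the key/value pair from the pipeline to the map.
      @SuppressWarnings("unchecked")
      public static final void putEntry(IData pipeline) {
          IDataCursor c = pipeline.getCursor();
          Map<String, Object> map = (Map<String, Object>) IDataUtil.get(c, "map");
          map.put(IDataUtil.getString(c, "key"), IDataUtil.get(c, "value"));
          c.destroy();
      }
  }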

Friday, December 17, 2010

Changing connections for an existing JDBC adapter

So you have your adapter service set up with all 40 fields mapped and the two-table join with a nice complex where clause. Now you find out you need to use a different connection for whatever reason. At first glance it looks like you have to delete and recreate your adapter service; there isn't anywhere in the adapter to change the connection.

Enter the WmART package.
For each adapter service, call pub.art.service:setAdapterServiceNodeConnection, passing in the serviceName (copy and paste it from the navigator) and the new connection alias name. Now the adapter service will point to the new connection alias!
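
A hedged sketch of repointing a batch of adapter services from a one-off Java service. serviceName and connectionAlias are the documented inputs (verify against your WmART version); the adapter service and alias names are illustrative:

  import com.wm.app.b2b.server.Service;
  import com.wm.data.IData;
  import com.wm.data.IDataCursor;
  import com.wm.data.IDataFactory;
  import com.wm.data.IDataUtil;
  import com.wm.lang.ns.NSName;

  public final class RepointAdapters {
      public static final void repoint(IData pipeline) throws Exception {
          String[] adapterServices = {            // illustrative service names
              "MyApp.adapters:insertOrder",
              "MyApp.adapters:insertOrderLines"
          };
          for (String svc : adapterServices) {
              IData in = IDataFactory.create();
              IDataCursor c = in.getCursor();
              IDataUtil.put(c, "serviceName", svc);
              IDataUtil.put(c, "connectionAlias", "MyApp.connections:newJdbcConn");
              c.destroy();
              Service.doInvoke(
                      NSName.create("pub.art.service", "setAdapterServiceNodeConnection"), in);
          }
      }
  }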

Transactions from within a Process Flow

Using JDBC Adapter 6.5 (patch 23) on IS 7.1.3, we had an issue where a flow service that inserts data into three tables (each table insert via a JDBC adapter service) did not roll back properly on failure when called from within a process flow.

When run from within Developer, it worked fine and rolled back as you would expect. As soon as we called it from the "Save to Database" step in our process, it no longer rolled back.

Trying XA_TRANSACTION and LOCAL_TRANSACTION as the transaction type on the JDBC connection made no difference. Basically, we were getting an auto-commit on each insert when the flow service was invoked from a process model.

In reviewing the logs, we saw that:

  When the process starts, a "Beginning transaction" is logged.
  When the process finishes, a "Committing transaction" is logged.

The process always does a commit, even when my IS flow service exits $flow with FAILURE.


We were not able to do explicit transaction management because the JDBC adapter services were using a JDBC adapter connection that was already in the parent transaction context. I needed to create a new JDBC adapter connection (in IS) and change my JDBC adapter services to use the new connection. Then I could explicitly call startTransaction and commitTransaction/rollbackTransaction in my flow service.
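
A hedged Java sketch of that explicit transaction management. The nested input/output document names (startTransactionOutput, commitTransactionInput, rollbackTransactionInput) follow the JDBC Adapter documentation as I recall it, so verify them against your version; insertAll is a placeholder for the three adapter inserts:

  import com.wm.app.b2b.server.Service;
  import com.wm.data.IData;
  import com.wm.data.IDataCursor;
  import com.wm.data.IDataFactory;
  import com.wm.data.IDataUtil;
  import com.wm.lang.ns.NSName;

  public final class ExplicitTxSketch {
      public static final void saveToDatabase(IData pipeline) throws Exception {
          // Start an explicit transaction on the new (dedicated) connection.
          IData startOut = Service.doInvoke(
                  NSName.create("pub.art.transaction", "startTransaction"),
                  IDataFactory.create());
          IDataCursor c = startOut.getCursor();
          IData txOut = IDataUtil.getIData(c, "startTransactionOutput");
          c.destroy();
          c = txOut.getCursor();
          String txName = IDataUtil.getString(c, "transactionName");
          c.destroy();

          try {
              insertAll(pipeline); // placeholder: the three JDBC adapter inserts
              endTransaction("commitTransaction", "commitTransactionInput", txName);
          } catch (Exception e) {
              endTransaction("rollbackTransaction", "rollbackTransactionInput", txName);
              throw e;
          }
      }

      private static void endTransaction(String svc, String inputDoc, String txName)
              throws Exception {
          IData inner = IDataFactory.create();
          IDataCursor ic = inner.getCursor();
          IDataUtil.put(ic, "transactionName", txName);
          ic.destroy();

          IData in = IDataFactory.create();
          IDataCursor c = in.getCursor();
          IDataUtil.put(c, inputDoc, inner);
          c.destroy();
          Service.doInvoke(NSName.create("pub.art.transaction", svc), in);
      }

      private static void insertAll(IData pipeline) throws Exception {
          // Invoke your three JDBC adapter services here.
      }
  }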

Hope this helps others.  Thanks to Brian for figuring this out!

Wednesday, December 8, 2010

Process step delay

Why oh why doesn't webMethods have a delay step? It seems like a common thing to want.

So far the best solution I've seen on the forums is this:
  1. For a delay between step 1 and step 2, set up a dummy receive step R.
  2. Set up a join between step 1 and R with a join timeout of the desired delay.
  3. Set the transition from the join to step 2 with a condition of timeout.
It puts a technical workaround inside your process (which is not pretty), but you don't want to call Thread.sleep, as that would tie up threads while waiting.


Creating a sub-process can at least hide the complexity and technical details and make the process flow more readable.

Monday, December 6, 2010

KPIs not working? Things to check:

  1. Make sure the fields are logged. In Developer, on the Properties tab for the process step, select the "Logged Fields" tab and then choose which fields will be used for the KPI data and dimensions.
  2. Ensure the facts are being collected. In your process database, you should have FACT_ tables with data being added.
  3. If the new KPI is going against existing data, it may take a while to calculate and display.
  4. Check the event map for the process/KPI. Go to Administration->Analytics->KPIs->BusinessData. Below that there should be an event map that matches your process; it should be marked "deployed". If you open up the (+) sign, you should see your fact listed (which should match what you set up in Designer) and, below that, any dimensions you have configured.
  5. Check whether Monitoring->System-Wide->System Overview shows your event map. (You can search to narrow it down by name.)
  6. Check whether Monitoring->System-Wide->Problems has any entries that might explain your KPI problems.

Sorry I don't have the answers to every problem you might encounter here, but at least this might help you track down the root cause. I've had to call support to get KPIs working both times I've tried. The first time, I wasn't logging the fields (duh). The second time, my event map was not correct: the fact name didn't match the fact being collected, so it wasn't resolving.

Tuesday, November 23, 2010

Blaze rules - more

Even our (highly paid) consultant didn't know of a way to change the data structure without re-importing the data into Blaze. You can also create new templates for each field by hand, but in the end (unless you have tons of data/processes already set up) it was easier to delete and recreate.

The other thing that isn't easy to spot is that to get the built-in rules for >, <, >=, <=, etc. for numbers, your data structure needs to be a number type. I created a "rulesData" object whose variables are Java wrapper types (Doubles and Integers) and do the conversion before hitting the rules engine step. The rulesData object is the one I imported into Blaze.
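
As a purely hypothetical sketch of that kind of structure (the field names are invented; the point is that the numeric fields are Java wrapper types, not Strings):

  // Hypothetical reconstruction of a rulesData shape; field names invented.
  public class RulesData {
      public Double  orderTotal;
      public Integer itemCount;
      public Double  discountRate;
      public String  customerTier;
  }
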
He had suggested creating the object you need within Blaze itself (rather than importing it from webMethods). That worked, except that when you generate the webMethods deployment (Tools -> Generate webMethods Deployment), the input parameter is just an Object on the webMethods side with no visibility into what you need to send in. You then have to go back to Blaze, figure out the data structure, create the same thing in webMethods, and make sure you pass it in exactly the same way. Seemed rather overcomplicated to me.

Friday, November 19, 2010

Blaze rules

Getting the data pulled over from your Integration Server isn't intuitively obvious, and the documentation for Blaze is not specific to webMethods. I was going to write something up on how to do it, then I found this tutorial.

Blaze Quickstart Tutorial for webMethods

One of the key things is the Blaze implementation class, which you just have to know:
com.webmethods.blaze.bom.ISDocumentTypeProvider

Also, if your IS data changes, you can't refresh it; instead you have to delete and recreate it.

I still haven't figured out how to recreate the data entries in the decision tree without recreating the whole tree, but I'll get there.

Tuesday, November 9, 2010

Error publishing task interface to MWS server

[POP.003.0025] {0} cannot view the content of {1}.; nested exception is:
com.webmethods.portal.PortalException: [POP.003.0025] {0} cannot view the content of {1}.


I got the above error when trying to publish a task. I finally searched around and figured it out: you have to be logged in as SysAdmin to publish a task. I had lost the login when I rebuilt my workspace in Designer.

Wednesday, November 3, 2010

Log4j and Secure FTP in webmethods

Found a link with two nice packages for the Integration Server: one for Log4j logging through IS and another for secure FTP.

http://webmethodsarchitect.com/

Tuesday, November 2, 2010

Notifications from tasks

Built-in way:
  1. In the Solutions tab, go to Tasks->[Task Project Name]->[Task Name]->Task Notifications.
  2. Right-click and create a new notification. This is basically an email template; you can use any of the pipeline data that is input to the task in the email.
  3. Open the task overview screen for the task you want a notification on.
  4. Click on the Events tab
  5. Add a new Event and name it appropriately
  6. In the Event Actions section, add a new action and select the Send Notification action.  Select the notification that you created in step 2.
Now users need to subscribe to receive the notifications:
  1. Log in to My webMethods Server.
  2. Go to Monitoring->Business->Tasks->My Inbox.
  3. Right above the Export Table button is a Subscriptions link. Click it.
  4. Hit the Subscribe button.
  5. Select the notification and hit Next.
  6. Check the checkbox after task subscriptions and hit Save.
Note: As the Administrator, you can set up subscriptions for other users. Also, in 7.1.3 we could not get subscriptions to work when assigning a role to the subscription; we had to do individuals.


Integration Server Way:

  1. Create a flow service (or java service) that will send out your email notification.
  2. Register that service in the Bindings window in Designer in your task project.
  3. Open the task overview screen for the task you want a notification on.
  4. Click on the Events tab
  5. Add a new Event and name it appropriately
  6. In the Event Actions section, add a new action and select the Invoke Service action.
  7. Select the service you created in steps 1 and 2.
Note: With this method you have to get the users' email addresses manually, format the email, etc.
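
A hedged sketch of such a service using pub.client:smtp; the task fields (assigneeEmail, orderId) and the addresses are illustrative:

  import com.wm.app.b2b.server.Service;
  import com.wm.data.IData;
  import com.wm.data.IDataCursor;
  import com.wm.data.IDataFactory;
  import com.wm.data.IDataUtil;
  import com.wm.lang.ns.NSName;

  public final class TaskNotify {
      // Formats and sends a task notification email; wired up as the task's
      // Invoke Service action (steps 6 and 7 above).
      public static final void sendTaskEmail(IData pipeline) throws Exception {
          IDataCursor pc = pipeline.getCursor();
          String toAddr  = IDataUtil.getString(pc, "assigneeEmail"); // illustrative
          String orderId = IDataUtil.getString(pc, "orderId");       // illustrative
          pc.destroy();

          IData in = IDataFactory.create();
          IDataCursor c = in.getCursor();
          IDataUtil.put(c, "to", toAddr);
          IDataUtil.put(c, "from", "workflow@example.com");
          IDataUtil.put(c, "subject", "New task: order " + orderId);
          IDataUtil.put(c, "body", "Order " + orderId + " is waiting in your inbox.");
          IDataUtil.put(c, "mailhost", "smtp.example.com");
          c.destroy();
          Service.doInvoke(NSName.create("pub.client", "smtp"), in);
      }
  }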

Thursday, October 28, 2010

Starting a process from a web URL.

I couldn't find a way to have a task initiate a workflow process.

What I did instead was create a publish-event integration service that publishes an empty document to start the process. Then you can just have a web link to the invoke URL, like
 http://<server>:5555/invoke/DtMigrationChecklist.v1.flow:publishRequestMigrationEvent 

Then the first step of the process was the task for filling in the data. This indirectly starts the process "from a task". We used it for our migration process for getting code from development up to QA, where it is tested and approved.
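
A hedged sketch of what that publish service boils down to; the document type name is illustrative:

  import com.wm.app.b2b.server.Service;
  import com.wm.data.IData;
  import com.wm.data.IDataCursor;
  import com.wm.data.IDataFactory;
  import com.wm.data.IDataUtil;
  import com.wm.lang.ns.NSName;

  public final class StartProcess {
      // Publishes an empty instance of the request document; the trigger on the
      // process's receive step picks it up and starts the process.
      public static final void publishRequestMigrationEvent(IData pipeline) throws Exception {
          IData in = IDataFactory.create();
          IDataCursor c = in.getCursor();
          IDataUtil.put(c, "documentTypeName",
                  "DtMigrationChecklist.v1.docs:requestMigration"); // illustrative name
          IDataUtil.put(c, "document", IDataFactory.create());      // empty payload
          c.destroy();
          Service.doInvoke(NSName.create("pub.publish", "publish"), in);
      }
  }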

Removing a published document from the Broker

If you have a document set to infinite retry but it's never going to go through, you can remove it from the Broker. Here's how:
  1. Connect to the environment where you want to remove the document.
  2. Navigate to "Administration > Messaging > Broker Servers > Clients"
  3. Search for the subscribing Trigger (easiest way is to use the package name where the Trigger is located)
  4. Click on the Client ID link of the appropriate Trigger
  5. To remove all documents
    1. Click on the "Statistics" tab
    2. Click the "Clear Queue" button
  6. To remove a single document
    1. Click on the "Configuration" tab
    2. Click on the "Lock Queue" button (make sure to unlock queue when done)
    3. Click on the "Browse Queue" tab
    4. Search for the document you want to remove
    5. Select the document
    6. Click the "Delete" button
    7. Click back on the "Configuration" tab
    8. Click on the "Unlock Queue" button (make sure you do this!!!)

KPIs

I just could not get custom KPIs to work. I had found these steps easily enough:
  1. Administration -> Business -> Business Processes: turn on analysis for the process.
  2. Administration -> Business -> Tasks -> Task Engine Administration: turn on analysis for each task.
  3. Set up the KPI in Designer for the process.
I finally had to call support to get them working. Here is the step I was missing:
  1. Turn on logging for the fields used for the KPIs, in the Properties tab -> Advanced -> Logged Fields.
Once I did that, they immediately started working. You have to tell the process which data it cares about as far as KPIs are concerned. It makes sense, but it sure wasn't clear in the documentation.

Tuesday, October 26, 2010

Debugging a remotely triggered flow service.

To debug a service that is run from a trigger or business process in Integration Server 7:
  1. Add pub.flow:savePipelineToFile as the first step in the flow service and enter a filename.
  2. Execute the code that hits the trigger or business process.
  3. Look at the file in the IntegrationServer/pipeline directory to check the data.
  4. To debug, disable the savePipelineToFile step and add a pub.flow:restorePipelineFromFile with the same filename.
  5. Step through the flow service and you can now see exactly how it behaved and find your problem.
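
If the service you're debugging is itself a Java service rather than a flow, a hedged sketch of the save/restore equivalents (the filename is illustrative):

  import com.wm.app.b2b.server.Service;
  import com.wm.data.IData;
  import com.wm.data.IDataCursor;
  import com.wm.data.IDataUtil;
  import com.wm.lang.ns.NSName;

  public final class PipelineDebug {
      // Step 1 equivalent: dump the current pipeline to IntegrationServer/pipeline.
      public static final void dump(IData pipeline) throws Exception {
          IDataCursor c = pipeline.getCursor();
          IDataUtil.put(c, "fileName", "debugTrigger.xml");
          c.destroy();
          Service.doInvoke(NSName.create("pub.flow", "savePipelineToFile"), pipeline);
      }

      // Step 4 equivalent: reload that pipeline while debugging interactively.
      public static final void restore(IData pipeline) throws Exception {
          IDataCursor c = pipeline.getCursor();
          IDataUtil.put(c, "fileName", "debugTrigger.xml");
          c.destroy();
          Service.doInvoke(NSName.create("pub.flow", "restorePipelineFromFile"), pipeline);
      }
  }
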
And in Integration Server 8 (as mentioned in the comments) it gets much easier:
  1. Open the service and change the pipeline debug option to "Save". (You may have to click on the title bar of the service to get to the right properties screen.)
  2. Execute the code that hits the trigger or business process.
  3. The data is saved in the IntegrationServer/pipeline directory.
  4. Select the service again as in step 1, but change the pipeline debug option to one of the "Restore" options.
  5. Step through the flow service and it will start out with your restored pipeline.

Put a custom identifier on the task list to easily differentiate tasks

  1. Open the task overview page (either double-click it in the Solutions tree or open it from the process model).
  2. Switch to the Events tab (at the bottom).
  3. Add a new event with an Event Type of "Queued".
  4. Add a simple action and choose Set Task CustomID.
  5. Use the "..." button to select the field to use as the custom ID.
  6. Repeat this for every task that you want a custom ID on (and they can be different for each task if that makes more sense).
Now you need to add the custom ID to your inbox/task list.
  1. Log in to My webMethods.
  2. Go to the My Inbox tab.
  3. Click on the middle button above the Scheduled Delegations link. It is circular and looks like an options list.
  4. Move the Custom ID field from the Available Columns list to the Selected Columns list and use the up and down arrows to change the column order.
Now the custom ID will show up in your inbox list.

Do the same for the Task List Management view as well.

Remove the task folder from the task view

If you do not implement a custom inbox task portlet, then you also don't want the tab showing in the list. Open the WebContent/WEB-INF/tasks/<task id>/taskDefinition.xml file and change the isTaskFolder="true" value to "false" in the section that looks like this:

  <!-- Publish Task pages -->
  <CONTEXT alias="webm.apps.workflow">
    <folder name="Service Delay" description=""
            alias="BF21B6F4-8954CEC6-4057-571C9AF1FF66.task.app" isTaskFolder="false"/>
  </CONTEXT>

Purpose

I've been working with webMethods for about a year now, and I have found there are lots of things that are hard to figure out from the documentation, with very few posts on the web about them. I thought I'd post them here as I find them, to (hopefully) help out other people.