Sunday, September 11, 2016

Easy SOA releases with JGitFlow

If you use Git as your source control system and Maven as your build tool, the jgit-flow plugin is a massive time-saver, especially when you release a slightly large application with multiple modules (each with its own POM file). 

Two steps: 
 mvn clean external.atlassian.jgitflow:jgitflow-maven-plugin:release-start
 mvn clean external.atlassian.jgitflow:jgitflow-maven-plugin:release-finish

do the job. 

The above sequence basically updates the POM versions to a release version (e.g. from 1.0-SNAPSHOT to 1.0), merges the development branch changes into the master branch, and sets the POM versions in the development branch to the next snapshot (1.1-SNAPSHOT).

If you have an application with multiple projects/modules, all of them can be released in one go (such as my application here that contains two modules)

Of course, there are some peculiarities when SOA Composite projects are involved. 
e.g. the oracle-soa-plugin insists on 'deploying' the composite and running tests at the same time - so you need to keep a SOA server running and supply the serverUrl, username and password properties. Keep the property names different from jgitflow's username and password properties so the two don't clash (see the sar-common POM for example names). 
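For example, the relevant plugin configuration in the composite POM might look roughly like this (the version and the `soa.*` property names are illustrative, not the sample's exact values):

```xml
<plugin>
  <groupId>com.oracle.soa.plugin</groupId>
  <artifactId>oracle-soa-plugin</artifactId>
  <version>12.2.1-0-0</version>
  <configuration>
    <!-- distinct property names so they don't collide with
         jgitflow's own username/password properties -->
    <serverUrl>${soa.serverUrl}</serverUrl>
    <user>${soa.user}</user>
    <password>${soa.password}</password>
  </configuration>
</plugin>
```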

I avoid this by simply using a public/private key pair to interact with GitHub, which saves time and avoids the property name clash above. 

Of course, there are ways to stop the oracle-soa-plugin from insisting on deployment when creating a release, but that is a post for a later day! 

Saturday, September 03, 2016

Test Driven SOA - citrus for powerful SOA test coverage

Reading parts of "Test-Driven Development for Embedded C" by James W. Grenning inspired me to take another look at this area and look for something new, fresh and powerful for use in the SOA world. 

I don't think we need much convincing on the importance of automated test coverage (if someone does, please read the first chapter of the book mentioned above, especially the section on the "Physics of TDD", which tries to quantify the high long-term costs of "Debug-Later Programming" - the nemesis of TDD).

A very simple application with a SOA composite project and Tests project can be found here:

Although the test here is just a simple SOAP request, what I am interested in are the features citrus has to offer that can help create a solid battery of tests. 

  • Tests can be specified in Java or XML, or a combination of both
  • A number of test utilities are inbuilt - including database access, JMS, mock SOAP endpoints (static responses) and complex assertions - and these can be used to write complex setup and teardown routines
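As a flavour of the XML style, a citrus test might look roughly like this (the endpoint name, namespaces and payloads here are made up for illustration; in the sample they come from the citrus Spring context):

```xml
<spring:beans xmlns="http://www.citrusframework.org/schema/testcase"
              xmlns:spring="http://www.springframework.org/schema/beans">
  <testcase name="GreetingComposite_Ok_IT">
    <actions>
      <!-- send a SOAP request through a client endpoint bean -->
      <send endpoint="soaCompositeClient">
        <message>
          <data>
            <![CDATA[<ns:process xmlns:ns="http://xmlns.example.com/greeting">
                       <ns:input>World</ns:input>
                     </ns:process>]]>
          </data>
        </message>
      </send>
      <!-- receive and validate the response payload -->
      <receive endpoint="soaCompositeClient">
        <message>
          <data>
            <![CDATA[<ns:processResponse xmlns:ns="http://xmlns.example.com/greeting">
                       <ns:result>Hello World</ns:result>
                     </ns:processResponse>]]>
          </data>
        </message>
      </receive>
    </actions>
  </testcase>
</spring:beans>
```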

I will leave the reader to peruse the code on github but this shows the most important pieces of config in my test project:

  • To build+deploy+test, after making sure your SOA server is running, just run "mvn integration-test" from the application level (provide serverUrl, user and password in the SOAComposite POM or on the command line, e.g. -DserverUrl=http://soahost:port)
  • To run only the integration tests, just run "mvn integration-test" from the SOAApplication/SOACompositeTests level.

This is all neat and CI ready! 

Saturday, August 27, 2016

Maven builds for SOA 12c Composites with BPEL Java embedding and Java class

Environment: Oracle SOA Suite 12.2.1

Sample Application:
(Git clone or use the download option from here: )

 A BPEL component has a Java embedding that in turn calls a Java class method (under the usual SCA-INF/src)

This works and builds fine using JDeveloper, but the oracle-soa-plugin for maven seems to have a few known issues (see references for one of them) that cause builds for such composites to fail. 

 My Java embedding, referring to my class com.singhpora.samples.SOAApplication.SCAJava under SCA-INF/src can be seen here: 

When I build the SOA project using "mvn clean package" (from the SOAProject directory with the default POM), I get two distinct errors as shown below: 
a) It can't find my class under SCA-INF/src 
b) It can't even find the BPEL platform classes

The workaround(s) for the two issues above involve:
a) create a simple java pom file under SCA-INF 

b) Add SOA/SCA-INF as a module in the 'Application' level pom 

c) Workaround for the second issue, where it can't find BPEL's libraries:
Observe the use of the maven-dependency-plugin here (which essentially copies a BPEL platform dependency temporarily under SCA-INF/lib to keep the compiler happy):
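Roughly, the SCA-INF module POM from workarounds a) and c) might look like this (all coordinates and versions are illustrative - check the sample project and your own SOA install for the real ones):

```xml
<!-- SOA/SCA-INF/pom.xml: a plain jar module so SCA-INF/src gets compiled -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.singhpora.samples</groupId>
  <artifactId>SCA-INF</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>jar</packaging>

  <build>
    <sourceDirectory>src</sourceDirectory>
    <plugins>
      <!-- copy a BPEL platform jar under SCA-INF/lib before compiling,
           so the compiler can resolve the BPEL classes -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <executions>
          <execution>
            <id>copy-bpel-libs</id>
            <phase>generate-sources</phase>
            <goals><goal>copy</goal></goals>
            <configuration>
              <artifactItems>
                <artifactItem>
                  <groupId>com.oracle.soa</groupId>
                  <artifactId>orabpel</artifactId>
                  <version>12.2.1</version>
                </artifactItem>
              </artifactItems>
              <outputDirectory>lib</outputDirectory>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
```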

After using the above workarounds, if I now build my application using the application pom, it builds and deploys fine:
(run mvn clean pre-integration-test from the SOAApplication level). 
As you can see, it now builds the two modules and the application. 

At runtime, my Java code is invoked successfully:

References/Related links:
2)   Same as workaround c)
3)   Builds java classes under SCA-INF but with a slightly different approach:

Tuesday, June 07, 2016

#AMIS25 and The Oracle Cloud Shift : Insights from my first Holland trip

I would like to take this opportunity to wish AMIS Netherlands a very happy 25th birthday. In the context of Oracle SOA, the name AMIS often keeps popping up - they have contributed a lot to the knowledge available to the community around this and related Oracle technology. 

They chose to celebrate this occasion in uniquely signature style - by holding a global Oracle conference with an impressive line-up of speakers from all six continents, in an old aircraft hangar (commemorating their origins as Aircraft Management Information Systems). 
It was a pleasure to be invited by Lucas Jellema (@lucasjellema) so I decided to attend at least one day - the Friday, 3rd of June. The line-up of events though was fantastic on both days. 

I arrived in the Netherlands on Thursday, the 2nd (my first visit to the country, outside the airport that is) and decided to explore places nearby... More on this later!

All the speakers might upload their presentations as they see fit and of course, know the best about their subject matter. I'm going to write about the talks I attended and my observations on the main themes.

One thing that is quite apparent is that the mainstream Oracle world is now cloud. This is quite literally the realisation of the 'c' in 12c.

First, the conference day started for me with Simon Haslam's (@simon_haslam) talk on the Oracle Traffic Director. This was one of the aha moments when you realise a gap in existing technology that you vaguely knew was there but had always either ignored or worked around it!
OTD offers seriously advanced load-balancing, fit for globally distributed cloud applications that is also 'application aware' (both OOTB and with options to extend with custom programming)

In my second session, Matt Wright of Rubicon Red shared his company's insights and a roadmap for moving integrations to the cloud. 

Peter Ebell of AMIS presented a talk on new SOA paradigms ("Alien architectures" as he termed it) - the post RDBMS world. The premise was that traditionally, SOA service layers that directly perform DML on RDBMS databases are very prone to changes in the database. Perhaps new approaches might need to be explored - especially for the new world where data in general is more unstructured or semi-structured. 
He started with a typical 'napkin architecture' and then progressed on to explain how it would evolve for certain modern requirements. 
At first the speaker started the talk in Dutch and I thought it would be an interesting challenge to try and understand everything in Dutch! But he then switched to English.

Shay Shmeltzer (@JDevShay) introduced the Oracle Developer cloud - this is a boon for the developer community as with a few clicks, a developer can provision the basic development environment (Source control, wiki, issue tracker, build server) up and running for a whole team! 
As Shay reiterated "..A mature DevOps facilitates short and quick release cycles...." , which is precisely today's need and expectation from businesses. 

Lonneke Dikmans of eProseed and Lucas Jellema of AMIS introduced the various Oracle cloud offerings - PaaS offerings to be precise. Beyond the familiar SOA CS, ICS (Integration cloud service) and PCS (Process cloud service - with its BPM engine and BPM Workspace), the IoT and Big data cloud services are interesting new offerings. 
I noticed that both IoT and Big Data CS included 'analytics' - Lonneke clarified that these target different types of data (real-time data in flux versus static historic data). 
As I see it, the IoT cloud service adds value by "turning sensor data into sensible information" - that can subsequently be fed in to underlying data, integration and analytics services. Very compelling. 

Lucas described a realistic strategy for migration to the cloud by targeting 'edge systems' first. 

Bram Van Der Pelt of AMIS gave a session on Identity 3.0 and its possible application in the Oracle world. Identity 3.0 is a new proposal developed by the Jericho Forum, which essentially proposes a mechanism whereby an identity and its related attributes are maintained and shared by the authority that owns them (such as a national government or the individual themselves). The root of every identity is proposed as anonymous. These principles facilitate privacy. 
This is a major paradigm shift from the currently prevalent model, where every application stores local copies of user identities and lots of personal profile information. 

......Beyond technology, the conference also gave me the opportunity to see some nice parts of Holland. As I arrived at Amsterdam on the afternoon of Thursday, I started to make my way towards Katwijk. As the historic city of Leiden was on the way, I took the opportunity to explore the Leiden town centre a bit and also see the  Rijksmuseum van Oudheden - which is the national archaeological museum of the Netherlands. The collection is nice and includes artifacts from ancient Egypt, Persia and local archaeological finds from the regions in and around the Netherlands. 

"Why should we look to the past in order to prepare for the future?
Because there is nowhere else to look"

~James Burke (quoted at the Rijksmuseum van Oudheden)
An interesting fact about Leiden is that it's the birthplace of the famous Dutch painter Rembrandt. 

The day after the conference, I headed to Amsterdam (having stayed overnight at Den Haag/The Hague). I found a map of the city and started the day with walks along the canals from Central station to the Museum district. Eventually I settled on the Rijksmuseum, which I explored for most of the day, with its extensive collection of paintings by Rembrandt, Vermeer and other artists. 

"You have two eyes and but one mouth. Let this be a signal to pay heed, not to talk here, but to read"
(~Quoted on the walls of the Library at the Rijksmuseum, Amsterdam, pictured below)

Having spent hours at the Rijksmuseum, for the remainder of the day all I could do was to walk around the city some more, before it was time to catch my flight. A very fruitful first trip to Holland - not only for the information packed conference, but also because I got to sightsee and visit two main national museums of the country!

Sunday, June 29, 2014

My first take on SOA/BPM Suite 12c.

The Oracle Fusion Middleware circles have been abuzz this weekend with the launch of the latest and greatest release of BPM/SOA Suite 12c.

The 12c release announcement late Friday evening (GMT+1, UK time)  caught me pleasantly off guard as during the past 6 months I have been very focused on a client project.

Listing below are just a few of my initial notes on the features that caught my attention.

- Experience with initial installation
   Downloaded the generic quick start installation bundle for 64 bit JVM's. 
   After the download, it took 9 minutes to install and get JDev running with a blank BPM project (Others on the twitter hashtag #BPMSuite12c reported around 15-20 minutes so I think my lower time might be due to the solid state drive in my laptop). 
Configuration of the integrated domain and launching the server took longer though but was straightforward and smooth. 
  I really might be one of the first few  in the UK (maybe even the first) who reported installing 12c on the twitter hashtag #BPMSuite12c (1:30 AM UK Time on the 28th of June) after its public release.

- True convergence of BPM and BPA?
Sounds like marketing-speak, but that is the phrase that comes to mind when I see the new 'BA' circle that surrounds the familiar 'BPM loop'.
I felt the BPM Composer in 11g completely lacked BPA (business process analysis) support, but that seems to have changed now. 

- Feature: process comparator/ history tab in BPMN
I had noticed in earlier releases that the 'history' tab used to be missing in the JDeveloper BPMN studio's process designer view. A BPMN definition is essentially just another XML file (albeit a lot more complex to interpret without tool support, unlike BPEL), so this didn't make much sense. 
Seeing the 'history' tab with the 'process comparator' feature made me realise why it took this long. It's a fairly complex feature to have (and very useful for any collaborative development, or even for any to-and-fro between the Composer and JDev BPM Studio). They needed to get it right. 

- Collaboration features between BPM Composer and Studio are more robust
Just after exporting the BPM project to 'PAM' I could basically view it in the BPM Composer (so far nothing different from exporting it to BPM-MDS and loading it up on Composer in 11g). 

The additional useful thing to note is the automatic versioning using an inbuilt subversion server. After 'publish' of changes via Composer and then updating the version in Studio, the version numbers got updated and in one case I got conflicts pending resolution.

*That said, I wouldn't use simultaneous changes in Composer and Studio as a regular development practice. 

- Finally, we can debug BPEL and use breakpoints
The OSB has had this for a long time, and that is one feature (in addition to the slick refactoring capabilities) that made me envy eclipse users. 
Well, now we have - PROPER debugging, breakpoints and watch window for BPEL**
(Oracle ADF on the other hand has allowed breakpoints in nearly all of its XML-defined components: task flows, jspx pages etc.) 

**I did notice that you could set breakpoints in BPEL in JDev 11g too. They seemed to have no use. I did plan to one day run weblogic in debug mode and see - I guess that's no longer necessary!

- Some things never change
It does make you smile to see the breadcrumb trail on the WebLogic 12c console still stacking up indefinitely (it has almost started to seem like the right behaviour to me now!) 

Anyway, this was just a set of brief observations on the shiny new release. Nothing profound or in-depth. There's a lot to read and learn. 

Saturday, December 28, 2013

ADF UI with MongoDB for persistence

The sample application (JDeveloper 12c / 12.1.2) can be downloaded from here.
This is a simple ADF application whose underlying business service implementation uses MongoDB for persistence.
The application allows create & update operations on a one-to-many data model (Department-Employees)

As shown in the screenshot below, you can create a new department and then create a few employee records. The "Save all changes" button then calls the standard ADF POJO DataControl operations, which in turn delegate the persistence of these objects to MongoDB without using any SQL.

The data is saved in a database called "hrdb"  (see class MongoDAO in the Model project) in a collection called "Departments"
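For illustration, a department with its employees ends up as a single document in the "Departments" collection (the field names below are assumptions based on the usual Department-Employees model; employees are embedded, which is the idiomatic MongoDB shape for one-to-many). Viewed from the mongo console it would look something like:

```js
// db.Departments.findOne() in the "hrdb" database (mongo shell view)
{
  "_id" : ObjectId("..."),
  "DepartmentId" : 10,
  "DepartmentName" : "Administration",
  "Employees" : [
    { "EmployeeId" : 200, "FirstName" : "Jennifer", "LastName" : "Whalen" }
  ]
}
```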

The design I have used is very basic and shown below.
I have used simple POJOs extending ReflectionDBObject as my 'persistable' entities, but this can be anything else (such as EclipseLink) that supports MongoDB (either programmatically or out of the box).

1. Start an instance of MongoDB (v2.4.8) using mongod and all the default options.
If you enable authentication or the host/port is different, just change these in the MongoDAO class in the sample project
2. Just run the project.
3. Verify the data creates/updates either using the MongoDB console or the ReST interface via browser

Wednesday, December 18, 2013

Webservice interaction patterns - part 2 - Synchronous Webservice call timeouts

Goal: For reasons of performance and more importantly, good user experience, the client application (ADF) should be able to timeout gracefully instead of hanging indefinitely (and/or potentially causing STUCK threads on the ADF server) when invoking an external 'synchronous' web service.

At runtime
When I run the application,
If I enter all fields (remember to enter a unique Employee ID, email address etc), then the 'Commit' button calls the web service as usual.

However, if I create a new record with the FirstName field set to TIMEOUT, you will notice a timeout message on the screen after around 6 seconds.
 (Don't update an existing record, as in the sample app the update operation goes directly to the DB.)

If I had not performed the setup listed in the "Client setup" section below, this would have resulted in a screen that appears to hang for a long time (with STUCK threads observed on the ADF application server after a while depending on the settings there)

My setup

Download the sample application from here.
I developed my 'external' transactional service using the SOA suite (BPEL).

1) From the zip file, open Webservices/EmpDeptCrud application and deploy the DataService on a SOA server (on a domain called 'default').
2) Open the TestWSCreateTimeout ADF application.
If your SOA server doesn't run at the URL http://soabpm-vm:7001, or you don't deploy it to a domain called 'default', then change the location property in the wsdl file to reflect where your service is accessible. Then right-click on the wsdl and "Generate web service proxy".
3) Just for the purpose of this test, go to the SOA server EM console and navigate to the BPEL engine settings as shown below. Change the SyncMaxWaitTime property to 600 (sec) or more.

In my web service implementation (a BPEL process), I added a wait activity that waits for 5 minutes as shown in the screenshot below.

I increased the default timeout (SyncMaxWaitTime) in the SOA Suite's BPEL engine properties to 10 minutes (600 sec) just so that my wait activity is effective. In the Oracle SOA Suite this timeout is 45 seconds by default (in other technologies it may be different or non-existent - hence this post, which shows what should be done to handle this gracefully on the client/UI side).
As a web application developer, you probably wouldn't have access to these settings on the service side.

The sample application is just an enhanced version of the one created for the previous post in this series. Both the ADF UI application and the web service were created using JDeveloper.
(Although the ADF application could have been created in any other JDev release, as the two run independently on different servers.)

Client setup (The ADF application)

At the point I invoke the external web service, it's a matter of adding the appropriate parameters in the request context. The REQUEST_TIMEOUT property is most relevant in this particular scenario.  A reasonable CONNECT_TIMEOUT is also recommended.
Just look at the code for the imports and libraries used.
The setup in ADF code is shown below
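As a minimal sketch of that request-context setup (the string keys below are the JAX-WS RI property names - on your stack they may instead be exposed as JAXWSProperties.CONNECT_TIMEOUT / REQUEST_TIMEOUT constants, and the helper shape is mine, not the sample's exact code):

```java
import java.util.HashMap;
import java.util.Map;

public class WsTimeoutSetup {

    // JAX-WS RI property names (assumption: constant names differ by stack)
    static final String CONNECT_TIMEOUT = "com.sun.xml.ws.connect.timeout";
    static final String REQUEST_TIMEOUT = "com.sun.xml.ws.request.timeout";

    // In the ADF client you would pass ((BindingProvider) port).getRequestContext()
    // as 'ctx'; both values are in milliseconds.
    static Map<String, Object> applyTimeouts(Map<String, Object> ctx,
                                             int connectMs, int requestMs) {
        ctx.put(CONNECT_TIMEOUT, connectMs);
        ctx.put(REQUEST_TIMEOUT, requestMs);
        return ctx;
    }

    public static void main(String[] args) {
        Map<String, Object> ctx = applyTimeouts(new HashMap<>(), 3000, 6000);
        System.out.println(ctx);
    }
}
```

With a request timeout of around 6000 ms, the call gives up after roughly 6 seconds instead of blocking a server thread, which is what produces the graceful timeout message on screen.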

Friday, December 13, 2013

Webservice interaction patterns - part 1 - very basic DML

There are a number of ways you 'call' web services from an ADF application. The best method to choose would depend on the individual scenario.

This particular example is the most suitable option where:
- Our application is allowed to read from the database using its own business services - i.e. ADFbc (which are fairly easy to create and don't require a lot of 'plumbing' code compared to other technologies) but we need to delegate any transactional calls (such as create/update/deletes) to an external web service.

The reasons could be many: Most commonly, such business (web) services already exist with a number of person-years of effort, testing & polishing already gone into them and they already meet the requirements perfectly. So it makes perfect sense to not re-develop all that logic into ADFbc and just reuse the existing service.

What this post is not: If you have written all your business logic using ADF business components, then you can expose that as a web service (e.g. for consumption by the mobile version of your app). That is a different scenario altogether.

If ALL data access operations (both reads and writes) are based on web services, I would most likely choose a different approach. (which I will cover in the subsequent posts in this series).

The sample application (created in JDev) has three parts:
1)  the web service which provides the Create operation.
I simply created it using the SOA Suite - but that's not the important part - you can pick the wsdl and implement the service using any technology. I also exposed only the Create operation for now and that too, just for a single row. It's easy to expose all the other CRUD operations as well.

2) The ADF application (Application TestWSCreate in the zip file). Contains the Model and ViewController projects (the thing to note is the EmployeeImpl class and its doDML operation).
3) The web service proxy - which is just code generated from the wsdl. I have kept it in the Model project but it should probably be in its own project or application which you could call ExternalService clients etc.
Contents of my Model project are shown below:
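Schematically, the delegation in EmployeeImpl looks like this (pseudocode - the proxy and helper names are illustrative, not the exact sample source):

```
// EmployeeImpl (Model project), overriding EntityImpl.doDML
doDML(operation, transactionEvent):
    if operation == DML_INSERT:
        proxy = new DataServiceProxy()          // generated from the WSDL
        proxy.create(toEmployeePayload(this))   // delegate the insert to the web service
        // note: super.doDML is NOT called for inserts - no direct SQL
    else:
        super.doDML(operation, transactionEvent)
```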

Download the application from here.

1) On the server where you'll deploy the web service, create an XA data source pointing to the XE schema with the JNDI name jdbc/hrXADS
2) Create a connection pool on the DBAdapter with the JNDI name eis/db/hrXA and point it to the above XA data source

3) Open the application EmpDeptCRUD and deploy the project DataService on the SOA server.
I used the SOA-BPM VM and on my machine it runs at http://soabpm-vm:7001 . If your SOA server is running at a different address, just point the location inside the WSDL in the proxy to your server's location (see the screenshot of the Model project) and re-generate the proxy classes (right-click - Generate web service proxy...)

4) Open and run the adfc-config from ViewController in the application TestWSCreate
Clicking 'Create New' will create a new Employee record - you can enter data, perform validations.
Finally, when you click 'Commit' the web service is invoked.

Here is how/why it happens:

Coming up:
- Detecting and Handling timeouts
- Leveraging the SOA Suite features (if the implementation technology for your external webservices is the SOA suite or even if you have a SOA Suite license).
- Optimizing web service calls when performing creates in a table / collection

Also see:

Update: updated the step on what to do when the web service URL is changed.

Thursday, December 12, 2013

JDeveloper Productivity features - run current working set

In a well designed application ('the sum of parts'), we probably divide functionality into projects. 
At any given time individual developers normally work on one or two projects, which they need to run and test quickly using the integrated weblogic server. 
For years I found the working set feature very effective for this:

Particularly note:

4.3.6 How to Manage Working Sets
How to Run and Debug a Working Set

Basically, using working sets, JDeveloper will only build, deploy and run the projects you are specifically interested in or working on, saving you time.

Sunday, February 24, 2013

Minor observations on the ADF standards document

My original title to this post was: "Entity/Entities, Key abstractions and nouns". 

Just noticed this very useful document on "ADF naming and project layout guidelines" published recently. It represents a valuable and exhaustive resource that many projects and programmers (including myself) benefit from. 

I'm a fan of the 'convention over configuration' approach so prefer to leverage the framework/IDE features and default settings to the maximum (while attempting to keep myself aware of when and what to change)

On a few minor points, I have a slightly different preference [I don't suggest massive deviation from what the wizard creates - but these can add value in making names more meaningful and closer to how they model a business]:

1. Singular entity names

I prefer entity names to be singular**: Customer, Sale, Department, Employee, Location
So, for a table called SALES, the default entity name generated by the wizard would be Sales - my preference would be to change it to "Sale" (actually my preference would be for the wizard to remove the plural spelling as far as possible...)
**My reasoning: A table as a whole represents all sales. But a row in a table represents a single 'sale'. 
Similarly, an instance of an entity represents a Sale just like an instance of the Customer entity represents one customer and not all the customers. 

In the (currently) more uncommon forward engineered scenario, the domain model would get designed first (crash course follows at the end)

(Ref: [ADFng1-04004] )

2. Marking cardinality

Another worthwhile standard I enforce is marking of the association cardinality clearly: (one to many, one to one, one to zero or many, many to one, many to zero or one). 

What a one to many cardinality between Department and Employee entities represents is this: For one instance of Department, there can exist multiple associated Employee instances

3. Cardinality in association names

As a consequence of 2, a preferred association name might be AccountCustomers or DepartmentEmployees  (both one to many), DepartmentLocation (could either be 'one to one' or 'one to zero or one')
This sort of encodes the cardinality of the association within the name (one to many in both the examples above). 
The same goes for one to one and many to one associations but I can't think of many examples now. 
This just serves to make it clear to the client (such as the UI) to expect a collection type (such as RowIterator) and not a single instance type (Row)

------------------------------------------------------- Crash course for identifying entities -------------------------------------------------------

One of the things I have always tried to stress is that while one can 'reverse engineer' entities and associations  from a relational database, these components fundamentally represent the business domain model.  

We start by reading an over-simplified requirement statement such as this:
A customer walks into a bank branch, goes to the teller* and requests a new account. The teller logs in to the new system and enters the customer's details... and so on... 
(*Tech note: Teller = Employee? Role?)

The highlighted nouns (customer, bank branch, teller, account) form a fairly exhaustive initial entity model for such a problem statement. 
Side note: And verbs usually evolve into usecases (and TaskFlows in the ADF world)

Nothing original in this, but it is surprisingly identical to probably my first OOAD course in college!