Tuesday, June 07, 2016

#AMIS25 and The Oracle Cloud Shift : Insights from my first Holland trip

I would like to take this opportunity to wish AMIS Netherlands a very happy 25th birthday. In the context of Oracle SOA, the name AMIS keeps popping up - they have contributed a lot to the knowledge available to the community around this and related Oracle technology. 

They chose to celebrate the occasion in a uniquely signature style - by holding a global Oracle conference with an impressive lineup of speakers from all six continents, and by holding the event in an old aircraft hangar (commemorating their origins as Aircraft Management Information Systems). 
It was a pleasure to be invited by Lucas Jellema (@lucasjellema), so I decided to attend at least one day - Friday, the 3rd of June. The line-up of events was fantastic on both days, though. 

I arrived in the Netherlands on Thursday, the 2nd (my first visit to the country, outside the airport that is) and decided to explore places nearby... More on this later!

The speakers may upload their presentations as they see fit and, of course, know their subject matter best. I'm going to write about the talks I attended and my observations on the main themes.

One thing that is quite apparent is that the mainstream Oracle world is now cloud. This is quite the realisation of the 'c' in 12c.

First, the conference day started for me with Simon Haslam's (@simon_haslam) talk on Oracle Traffic Director. This was one of those aha moments when you realise a gap in existing technology that you vaguely knew was there but had always either ignored or worked around!
OTD offers seriously advanced load balancing, fit for globally distributed cloud applications, that is also 'application aware' (both OOTB and with options to extend with custom programming).

In my second session, Matt Wright of Rubicon Red shared his company's insights and a roadmap for moving integrations to the cloud. 

Peter Ebell of AMIS presented a talk on new SOA paradigms ("alien architectures", as he termed it) - the post-RDBMS world. The premise was that, traditionally, SOA service layers that directly perform DML on RDBMS databases are very sensitive to changes in the database. Perhaps new approaches need to be explored - especially for a new world where data in general is more unstructured or semi-structured. 
He started with a typical 'napkin architecture' and then progressed on to explain how it would evolve for certain modern requirements. 
The speaker started the talk in Dutch, and I thought it would be an interesting challenge to try and understand everything in Dutch! But he then switched to English.

Shay Shmeltzer (@JDevShay) introduced the Oracle Developer Cloud - a boon for the developer community: with a few clicks, a developer can get the basic development environment (source control, wiki, issue tracker, build server) up and running for a whole team! 
As Shay reiterated, "a mature DevOps facilitates short and quick release cycles" - which is precisely today's need and expectation from businesses. 

Lonneke Dikmans of eProseed and Lucas Jellema of AMIS introduced the various Oracle cloud offerings - PaaS offerings, to be precise. Beyond the familiar SOA CS, ICS (Integration Cloud Service) and PCS (Process Cloud Service, with its BPM engine and BPM Workspace), the IoT and Big Data cloud services are interesting new offerings. 
I noticed that both the IoT and Big Data cloud services included 'analytics' - Lonneke clarified that these target different types of data (real-time data in flux versus static, historic data). 
As I see it, the IoT cloud service adds value by "turning sensor data into sensible information" - that can subsequently be fed in to underlying data, integration and analytics services. Very compelling. 

Lucas described a realistic strategy for migration to the cloud by targeting 'edge systems' first. 

Bram van der Pelt of AMIS gave a session on Identity 3.0 and its possible application in the Oracle world. Identity 3.0 is a new proposal developed by the Jericho Forum, which essentially proposes a mechanism whereby an identity and its related attributes are maintained and shared by the authority that owns them (such as a national government or the individual themselves). The root of every identity is proposed to be anonymous. These principles facilitate privacy. 
This is a major paradigm shift from the currently prevalent model, where every application stores copies of user identities and lots of personal profile information locally. 

Beyond technology, the conference also gave me the opportunity to see some nice parts of Holland. As I arrived at Amsterdam on the afternoon of Thursday, I started to make my way towards Katwijk. As the historic city of Leiden was on the way, I took the opportunity to explore the Leiden town centre a bit and also see the Rijksmuseum van Oudheden - the national archaeological museum of the Netherlands. The collection is nice and includes artifacts from ancient Egypt and Persia, as well as local archaeological finds from the regions in and around the Netherlands. 

"Why should we look to the past in order to prepare for the future?
Because there is nowhere else to look"

~James Burke (quoted at the Rijksmuseum van Oudheden)
An interesting fact about Leiden is that it is the birthplace of the famous Dutch painter Rembrandt. 

The day after the conference, I headed to Amsterdam (having stayed overnight at Den Haag/The Hague). I found a map of the city and started the day with walks along the canals from Central Station to the Museum district. Eventually I decided on the Rijksmuseum, which I explored for most of the day, with its extensive collection of paintings by Rembrandt, Vermeer and other artists. 

"You have two eyes and but one mouth. Let this be a signal to pay heed, not to talk here, but to read"
(~Quoted on the walls of the Library at the Rijksmuseum, Amsterdam, pictured below)

Having spent hours at the Rijksmuseum, for the remainder of the day all I could do was to walk around the city some more, before it was time to catch my flight. A very fruitful first trip to Holland - not only for the information packed conference, but also because I got to sightsee and visit two main national museums of the country!

Sunday, June 29, 2014

My first take on SOA/BPM Suite 12c.

The Oracle Fusion Middleware circles have been abuzz this weekend with the launch of the latest and greatest release of BPM/SOA Suite 12c.

The 12c release announcement late Friday evening (GMT+1, UK time)  caught me pleasantly off guard as during the past 6 months I have been very focused on a client project.

Listed below are just a few of my initial notes on the features that caught my attention.

- Experience with initial installation
   I downloaded the generic quick start installation bundle for 64-bit JVMs. 
   After the download, it took 9 minutes to install and get JDev running with a blank BPM project (others on the twitter hashtag #BPMSuite12c reported around 15-20 minutes, so I think my lower time might be due to the solid state drive in my laptop). 
   Configuration of the integrated domain and launching the server took longer, but was straightforward and smooth. 
   I really might be one of the first few in the UK (maybe even the first) who reported installing 12c on the twitter hashtag #BPMSuite12c (1:30 AM UK time on the 28th of June) after its public release.

- True convergence of BPM and BPA?
Sounds like marketing speak, but that is the phrase that comes to mind when I see the new 'BA' circle that surrounds the familiar 'BPM loop'.
I felt the BPM Composer in 11g completely lacked BPA (business process analysis) support, but that seems to have changed now. 

- Feature: process comparator/ history tab in BPMN
I had noticed in earlier releases that the 'history' tab used to be missing in the JDeveloper BPMN studio's process designer view. A BPMN definition is essentially just another XML file (albeit, unlike BPEL, a lot more complex to interpret without tool support), so this didn't make much sense. 
Seeing the 'history' tab with the 'process comparator' feature made me realise why it took this long. It's a fairly complex feature to build (and very useful for any collaborative development, or even for any to-and-fro between the Composer and JDev BPM studio). They needed to get it right. 

- Collaboration features between BPM Composer and Studio are more robust
Just after exporting the BPM project to 'PAM' I could basically view it in the BPM Composer (so far nothing different from exporting it to BPM-MDS and loading it up on Composer in 11g). 

Another useful thing to note is the automatic versioning using an inbuilt Subversion server. After publishing changes via Composer and then updating the version in Studio, the version numbers got updated, and in one case I got conflicts pending resolution.

*That said, I wouldn't use simultaneous changes in Composer and Studio as a regular development practice. 

- Finally, we can debug BPEL and use breakpoints
The OSB has had this for a long time, and that is one feature (in addition to the slick refactoring capabilities) that made me envy Eclipse users. 
Well, now we have it - proper debugging, breakpoints and a watch window for BPEL**
(Oracle ADF on the other hand has allowed breakpoints in nearly all of its XML-defined components: task flows, jspx pages etc.) 

**I did notice that you could set breakpoints in BPEL in JDev 11g too, but they seemed to have no use. I did plan to one day run weblogic in debug mode and investigate - I guess that's no longer necessary!

- Some things never change
It does make you smile to see the breadcrumb trail on the WebLogic 12c console still stacking up indefinitely (it has almost started to seem like the right behaviour to me now!)

Anyway, this was just a set of brief observations on the shiny new release. Nothing profound or in-depth. There's a lot to read and learn. 

Saturday, December 28, 2013

ADF UI with MongoDB for persistence

The sample application (JDeveloper 12c / 12.1.2) can be downloaded from here.
This is a simple ADF application whose underlying business service implementation uses MongoDB for persistence.
The application allows create and update operations on a one-to-many data model (Department-Employees).

As shown in the screenshot below, you can create a new department and then create a few employee records. The "Save all changes" button then calls the standard ADF POJO DataControl operations, which in turn delegate the persistence of these objects to MongoDB without using any SQL.

The data is saved in a database called "hrdb" (see class MongoDAO in the Model project), in a collection called "Departments".

The design I have used is very basic and shown below.
I have used simple POJOs extending ReflectionDBObject as my 'persistable' entities, but this could be anything else that supports MongoDB (such as EclipseLink), either programmatically or out of the box.

1. Start an instance of MongoDB (v2.4.8) using mongod and all the default options.
   If you enable authentication or the host/port is different, just change these in the MongoDAO class in the sample project.
2. Just run the project.
3. Verify the data creates/updates either using the MongoDB console or the REST interface via a browser.
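To illustrate the design, here is a rough sketch of the document shape such a DAO would persist in the "Departments" collection of "hrdb": one Department document with its Employees embedded. Plain java.util maps stand in for the driver's DBObject, and the field names are my own illustration, not taken from the sample code.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of the one-to-many Department-Employees model as a single document.
// Plain maps stand in for the MongoDB driver's DBObject; field names are illustrative.
public class DeptDocumentSketch {

    static Map<String, Object> buildDocument() {
        Map<String, Object> dept = new LinkedHashMap<>();
        dept.put("_id", 10);                  // the department ID doubles as the key
        dept.put("departmentName", "IT");

        List<Map<String, Object>> employees = new ArrayList<>();
        Map<String, Object> emp = new LinkedHashMap<>();
        emp.put("employeeId", 100);
        emp.put("firstName", "Steven");
        employees.add(emp);
        dept.put("employees", employees);     // the 'many' side, embedded in the parent

        return dept;
    }

    public static void main(String[] args) {
        System.out.println(buildDocument());
    }
}
```

Embedding the employees avoids any joins: saving a department and all its employees becomes a single collection write, which is what lets the DataControl delegate "Save all changes" in one go.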

Wednesday, December 18, 2013

Webservice interaction patterns - part 2 - Synchronous Webservice call timeouts

Goal: For reasons of performance and more importantly, good user experience, the client application (ADF) should be able to timeout gracefully instead of hanging indefinitely (and/or potentially causing STUCK threads on the ADF server) when invoking an external 'synchronous' web service.

At runtime
When I run the application and enter all the fields (remember to enter a unique Employee ID, email address etc.), the 'Commit' button calls the web service as usual.

However, if I create a new record with the FirstName field set to TIMEOUT, a timeout message appears on the screen after around 6 seconds.
(Don't update an existing record: in the sample app, the update operation goes directly to the DB.)

If I had not performed the setup listed in the "Client setup" section below, this would have resulted in a screen that appears to hang for a long time (with STUCK threads observed on the ADF application server after a while depending on the settings there)

My setup

Download the sample application from here.
I developed my 'external' transactional service using the SOA suite (BPEL).

1) From the zip file, open Webservices/EmpDeptCrud application and deploy the DataService on a SOA server (on a domain called 'default').
2) Open the TestWSCreateTimeout ADF application.
If your SOA server doesn't run on the URL http://soabpm-vm:7001, or you don't deploy it to a domain called 'default', then change the location property in the WSDL file to reflect where your service is accessible. Then right-click on the WSDL and select "Generate web service proxy".
3) Just for the purpose of this test, go to the SOA server EM console and navigate to the BPEL engine settings as shown below. Change the SyncMaxWaitTime property to 600 seconds or more.

In my web service implementation (a BPEL process), I added a wait activity that waits for 5 minutes as shown in the screenshot below.

I increased the default timeout (SyncMaxWaitTime) set in the SOA Suite's BPEL engine properties to 10 minutes (600 sec), just so that my wait activity is effective. In Oracle SOA Suite this timeout is 45 seconds by default (in other technologies it may be different or non-existent - hence this post, which shows what should be done to handle this gracefully on the client/UI side).
As a web application developer, you probably wouldn't have access to these settings on the service side.

The sample application is just an enhanced version of the one created for the previous post in this series. Both the ADF UI application and the web service are created using JDeveloper
(Although the ADF application could have been created in any other JDev release as the two are run independently on different servers)

Client setup (The ADF application)

At the point I invoke the external web service, it's a matter of adding the appropriate parameters in the request context. The REQUEST_TIMEOUT property is most relevant in this particular scenario.  A reasonable CONNECT_TIMEOUT is also recommended.
Just look at the code for the imports and libraries used.
The setup in ADF code is shown below
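As a rough sketch of that setup: with a real generated proxy you would cast it to javax.xml.ws.BindingProvider and put the timeout entries into getRequestContext(). A plain map stands in for that context here, and the property-name strings are the JAX-WS RI ones - check the timeout constants for your particular stack, as they vary.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the client-side timeout setup for the web service proxy call.
// A plain map stands in for BindingProvider.getRequestContext(); the property
// names below are JAX-WS RI conventions and are assumptions for illustration.
public class TimeoutSetupSketch {
    static final String CONNECT_TIMEOUT = "com.sun.xml.ws.connect.timeout";
    static final String REQUEST_TIMEOUT = "com.sun.xml.ws.request.timeout";

    static Map<String, Object> requestContext() {
        Map<String, Object> ctx = new HashMap<>();
        ctx.put(CONNECT_TIMEOUT, 3000);   // ms allowed to establish the connection
        ctx.put(REQUEST_TIMEOUT, 6000);   // ms to wait for a response (~6 s, as in the demo)
        return ctx;
    }

    public static void main(String[] args) {
        System.out.println(requestContext());
    }
}
```

With REQUEST_TIMEOUT set, the proxy call throws a client-side exception once the limit passes, which the ADF layer can catch and turn into the friendly timeout message instead of a hung screen and STUCK threads.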

Friday, December 13, 2013

Webservice interaction patterns - part 1 - very basic DML

There are a number of ways to 'call' web services from an ADF application. The best method to choose depends on the individual scenario.

This particular example is the most suitable option where:
- Our application is allowed to read from the database using its own business services - i.e. ADFbc (which are fairly easy to create and don't require a lot of 'plumbing' code compared to other technologies) but we need to delegate any transactional calls (such as create/update/deletes) to an external web service.

The reasons could be many: most commonly, such business (web) services already exist, with a number of person-years of effort, testing and polishing already gone into them, and they already meet the requirements perfectly. So it makes perfect sense not to re-develop all that logic in ADFbc and to just reuse the existing service.

What this post is not: if you have written all your business logic using ADF Business Components, then you can expose that as a web service (e.g. for consumption by the mobile version of your app). That is a different scenario altogether.

If ALL data access operations (both reads and writes) were based on web services, I would most likely choose a different approach (which I will cover in subsequent posts in this series).

The sample application (created in JDev) has three parts:
1)  the web service which provides the Create operation.
I simply created it using the SOA Suite - but that's not the important part - you can pick the wsdl and implement the service using any technology. I also exposed only the Create operation for now and that too, just for a single row. It's easy to expose all the other CRUD operations as well.

2) The ADF application (Application TestWSCreate in the zip file). Contains the Model and ViewController projects (thing to note is the EmployeeImpl class and its doDML operation)
3) The web service proxy - which is just code generated from the WSDL. I have kept it in the Model project, but it should probably be in its own project or application, which you could call 'ExternalService clients' etc.
Contents of my Model project are shown below:

Download the application from here.
1) On the server where you'll deploy the web service, create an XA data source pointing to the XE schema with the JNDI name jdbc/hrXADS
2) Create a connection pool on the DBAdapter with the JNDI name eis/db/hrXA and point it to the above XA data source

3) Open the application EmpDeptCRUD and deploy the project DataService on the SOA server.
I used the SOA-BPM VM, and on my machine it runs at http://soabpm-vm:7001 . If your SOA server is running at a different address, just point the location inside the WSDL in the proxy to your server's location (see the screenshot of the Model project) and re-generate the proxy classes (right click - "Generate web service proxy").

4) Open and run the adfc-config from ViewController in the application TestWSCreate
Clicking 'Create New' will create a new Employee record - you can enter data, perform validations.
Finally, when you click 'Commit' the web service is invoked.

Here is how/why it happens:
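In essence, the entity's doDML method intercepts the insert and calls the proxy instead of issuing SQL. A stand-in sketch of that delegation follows - EmployeeService mimics the generated proxy's create operation and the DML_* constants mimic oracle.jbo.server.EntityImpl; all names here are illustrative, not the real ADF API.

```java
// Stand-in sketch of delegating inserts to an external web service from doDML.
// EmployeeService stands for the generated web service proxy (hypothetical name).
interface EmployeeService {
    void createEmployee(int employeeId, String firstName);
}

public class EmployeeImplSketch {
    static final int DML_INSERT = 0;   // mimics EntityImpl.DML_INSERT
    static final int DML_UPDATE = 1;   // mimics EntityImpl.DML_UPDATE

    private final EmployeeService proxy;

    EmployeeImplSketch(EmployeeService proxy) {
        this.proxy = proxy;
    }

    // In the real entity this is: protected void doDML(int operation, TransactionEvent e)
    String doDML(int operation, int employeeId, String firstName) {
        if (operation == DML_INSERT) {
            // delegate the create to the external web service instead of issuing SQL
            proxy.createEmployee(employeeId, firstName);
            return "insert-via-service";
        }
        // updates/deletes keep the default behaviour (super.doDML in real code)
        return "default-dml";
    }

    public static void main(String[] args) {
        EmployeeImplSketch entity = new EmployeeImplSketch(
                (id, name) -> System.out.println("CreateEmployee called for " + id));
        System.out.println(entity.doDML(DML_INSERT, 207, "Jane"));
    }
}
```

The attraction of hooking in at doDML is that everything upstream - validation, the UI bindings, the 'Commit' action - stays standard ADF; only the final write is rerouted.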

Coming up:
- Detecting and Handling timeouts
- Leveraging the SOA Suite features (if the implementation technology for your external webservices is the SOA suite or even if you have a SOA Suite license).
- Optimizing web service calls when performing creates in a table / collection

Also see:

Update: updated the step on what to do when the web service URL is changed.

Thursday, December 12, 2013

JDeveloper Productivity features - run current working set

In a well designed application ('the sum of parts'), we probably divide functionality into projects. 
At any given time individual developers normally work on one or two projects, which they need to run and test quickly using the integrated weblogic server. 
For years I found the working set feature very effective for this:


Particularly note:

4.3.6 How to Manage Working Sets / How to Run and Debug a Working Set

Basically, using working sets, JDeveloper will only build, deploy and run the projects you are specifically interested in or working on, saving you time.

Sunday, February 24, 2013

Minor observations on the ADF standards document

My original title to this post was: "Entity/Entities, Key abstractions and nouns". 

I just noticed this very useful document on "ADF naming and project layout guidelines", published recently. It is a valuable and exhaustive resource that many projects and programmers (including myself) benefit from. 

I'm a fan of the 'convention over configuration' approach, so I prefer to leverage the framework/IDE features and default settings to the maximum (while attempting to stay aware of when and what to change).

On a few minor points, I have a slightly different preference [I don't suggest massive deviation from what the wizard creates - but these can add value in making names more meaningful and closer to how they model a business]:

1. Singular entity names

I prefer entity names to be singular**: Customer, Sale, Department, Employee, Location.
So, for a table called SALES, the default entity name generated by the wizard would be Sales - my preference would be to change it to "Sale". (Actually, my preference would be for the wizard to remove the plural spelling as far as possible...)
**My reasoning: a table as a whole represents all sales, but a row in the table represents a single 'sale'. 
Similarly, an instance of an entity represents one Sale, just like an instance of the Customer entity represents one customer and not all the customers. 

In the (currently) less common forward-engineered scenario, the domain model would get designed first (a crash course follows at the end).

(Ref: [ADFng1-04004] )

2. Marking cardinality

Another worthwhile standard I enforce is marking of the association cardinality clearly: (one to many, one to one, one to zero or many, many to one, many to zero or one). 

What a one to many cardinality between Department and Employee entities represents is this: For one instance of Department, there can exist multiple associated Employee instances

3. Cardinality in association names

As a consequence of 2, a preferred association name might be AccountCustomers or DepartmentEmployees (both one to many), or DepartmentLocation (either 'one to one' or 'one to zero or one').
This encodes the cardinality of the association within the name (one to many in both the examples above). 
The same goes for one to one and many to one associations, but I can't think of many examples now. 
This just serves to make it clear to the client (such as the UI) whether to expect a collection type (such as RowIterator) or a single instance type (Row).
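A tiny sketch of what the naming convention promises to a client such as the UI - plain Java types stand in for ADF's RowIterator (collection) and Row (single instance), and the accessor names and data are made up for illustration:

```java
import java.util.Arrays;
import java.util.List;

// What a client can infer from association names alone: a plural-style name
// (DepartmentEmployees) implies a collection accessor, a singular-style name
// (DepartmentLocation) implies a single-instance accessor. Stand-in types.
public class CardinalitySketch {

    // DepartmentEmployees (one to many): the accessor hands back a collection
    static List<String> getDepartmentEmployees() {
        return Arrays.asList("King", "Kochhar");
    }

    // DepartmentLocation (one to one, or one to zero-or-one): a single instance
    static String getDepartmentLocation() {
        return "Seattle";
    }

    public static void main(String[] args) {
        System.out.println(getDepartmentEmployees());
        System.out.println(getDepartmentLocation());
    }
}
```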

------------------------------------------------------- Crash course for identifying entities -------------------------------------------------------

One of the things I have always tried to stress is that while one can 'reverse engineer' entities and associations  from a relational database, these components fundamentally represent the business domain model.  

We start by reading an over-simplified requirement statement such as this:
A customer walks into a bank branch, goes to the teller* and requests a new account. The teller logs in to the new system and enters the customer's details... and so on... 
(*Tech note: teller = employee? Role?)

The highlighted nouns (customer, bank branch, teller, account) are a fairly exhaustive initial entity model for such a problem statement. 
Side note: And verbs usually evolve into usecases (and TaskFlows in the ADF world)

Nothing original in this but this is surprisingly identical to probably my first OOAD course in college!

Monday, September 10, 2012

ADF & Event driven Integration with a BPMN process

The usecase for my sample application is:
A standalone application exists to create/update 'Departments'. Originally it was not intended to be part of an automated process (BPEL or BPMN), but now the business has decided that their process for the creation of a department is actually more than just data entry in the standalone application - it also involves an approval. 

1. Every interaction with a software application in a business environment is part of some process - the process may or may not have been analysed, modelled in a standard notation or automated, but it's a process nonetheless.
2. When we interact with software applications, we generate 'events' (Order fulfilled event, Loan approved event, etc.). 
3. The sample applications for this post can be downloaded from here and consist of: 
                  a) The SOA/BPM composite application - event listener 
                  b) The standalone ADF web application - event producer
 (Instructions to setup and run the sample are towards the end of this post)

From a technical standpoint:
Before BPM was implemented, the DepartmentCreated event was simply the insertion of a new row in a table. Now, it will lead to a longer process involving an approval (and possibly other human/automated tasks). In other words, we are really not looking to change an existing application such that it becomes tied to specific APIs. 
With an event-driven architecture, the SOA/BPM infrastructure listens to things that happen around your enterprise landscape and takes appropriate action when required (in this case, it needs to start a process when a department is created). 

My simple Process model:

1. Process roles
I have created three abstract roles for the process. The 'ADF application user' always existed, because people were always creating new rows in a table called 'Departments'. 
We now have a new 'Approver' role (for convenience, I have mapped the default 'weblogic' user to 'Approver'). 
The third is the automated handler for automated tasks.  

2. The activity 'Create a department' is meant to represent the creation of a department from the perspective of the business process. The thing to note is that it's represented as a manual task, not as an (initiator) human task.
As I mentioned before, it is JUST a standalone web application with a facility to create/update departments; from a technical standpoint, it is not even 'aware' that it is a participant in a business process.

3. Structure of the composite 
My process does not expose any SOAP endpoint (as its 'Start' event is not a 'message' but a 'signal'). 

4. The start signal. 
Its 'implementation type' is 'Signal' (not 'Message', which is the default) - and note the event that it listens to.

Something interesting to note here is that my process data object is based on the same model as the domain objects (entities) I created for the UI application - i.e. we do not need to create our domain model twice; we leverage the one we have. 

We could also reuse the definitions from ADFbc SDOs if we chose to, to avoid duplication. 

5.  Raising the event. 
All I had to do was configure the entity to raise the events I was interested in, and the payload I wanted to pack into the event. In my example, I chose to send the department ID and name. 
We can easily expose a getByID web service so that the process can query the whole department object when it needs to. Until then, we avoid the overhead of lugging huge DOM payloads around and the associated cost of transformations, marshalling/unmarshalling etc. 
End result: good performance (or at least you avoid performance problems specifically due to unnecessarily large payloads later on - provided other aspects of the code and configuration are right).
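For reference, the event that the entity raises is described by an .edl (event definition) file on the SOA side. A rough sketch of its shape follows - the namespaces, element name and file location here are made up for illustration; JDeveloper generates the real file when you configure the entity's business events:

```xml
<definitions xmlns="http://schemas.oracle.com/events/edl"
             targetNamespace="http://example.com/events/hr">
  <!-- XSD that defines the event payload (department ID and name) -->
  <schema-import namespace="http://example.com/hr/types"
                 location="xsd/DepartmentInfo.xsd"/>
  <event-definition name="DepartmentCreated">
    <content xmlns:hr="http://example.com/hr/types" element="hr:departmentInfo"/>
  </event-definition>
</definitions>
```

The BPMN process's start signal then simply references this event definition, which is what keeps the producer and the consumer decoupled.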

All SCA components (Mediator, BPMN and BPEL process engines) can similarly listen to events. 
The component that raises the events is just going about its business and doesn't know or care about who or what might be interested in these events. i.e. an existing application doesn't have to call specific API's or service end-points and is immune to changes in other components that listen to its events.
This allows a great deal of decoupling between applications to be integrated.

Concepts covered:
1. Manual activity
2. Business events
3. BPM Process data objects based on the existing domain model - I used the same XSD for my PDO as the one for my DepartmentCreated event.

When we talk integration, SOA mostly involves SOAP web services - but, just as in real life, a 'business process' doesn't necessarily need to be 'invoked' (via a web service or API call) to get initiated. We just model and implement processes to reflect real business scenarios.  

Where appropriate, an 'Initiator' human task could also initiate a process - i.e. when the first activity in a business process is a human task - although, in my example, I purposely have not implemented an initiator human workflow task.

Sample Application
1. My development environment is a SOA/BPM Suite installation on a single server instance with no managed server. The SOA and BPM domain was created in 'developer' mode. Both my SOA/BPM Suite and the standalone UI application are deployed on the same server (but they can be configured to be on different servers).
2. Create a data source jdbc/hrDS pointing to the sample HR schema.
3. Deploy the EmpDeptADFApplication and access it via http://your_server:your_port/adfedl/faces/home
4. Deploy the EmpBusinessProcess composite
5. When you create a new department (manually enter a unique departmentId) and commit it, an instance of the EmpBusinessProcess composite is created. For the approval human task, login to the BPM workspace as weblogic and just Approve or Reject the item in the worklist. 

1. https://blogs.oracle.com/soabpm/entry/event_delivery_network_chapter 
(An old post that introduces the EDN but that was before the BPM suite and BPMN processes became 'SCA-fied')
2. Start events for a business process (implementation type = 'Message' is just the default!! Just like in real life, not every process logically starts with a web service or an API call!): 

Saturday, August 25, 2012

Brief note on ADF 11gR2 installation

The certified application server and ADF combinations for ADF 11.1.2.x.x are available here.

We need to follow the Oracle Support installation document.

I'm of course only going to list a helpful link and not actually post the contents of that document, as it is secure. Briefly:
1. On your standalone WebLogic server, install the ADF runtime first. 
The ADF runtime is available from the usual ADF downloads page. 

2. Then install the two ADF patches available from Oracle Support and mentioned in the support document. 

It's important to read the patch README files, set the ORACLE_HOME environment variable correctly and use the correct OPatch version (the latest usually works) to avoid any gotchas. 

Friday, August 24, 2012

Drilldown on ADF DVT graphs

The sample workspace (JDev/ADF) for this post is available here.
Just run adfc-config from the ViewController project. 

One form of drilldown is where you provide an 'action' that can be a control flow case in the taskflow and which leads the user to another view/page, but that's not what I'm writing about. 

The drilldown I'm referring to is for the following usecase (based on the HR schema):

A graph displays Departments and total salaries of all the employees in that department. 
When you click on one of the departments (represented as a bar graph), you see the same bar graph with employees in that department. 
(You can extend this approach indefinitely, but it can get complex after a couple of levels of drilldown.) 

My approach uses plain ADF DVT components and not BI data controls**.
It also relies on the underlying view objects to provide the aggregate data (sum of salaries in this example) and doesn't use any of the aggregation functions of the DVT components. 
You might also want to consider the approach mentioned here to see what works best for your usecase: https://forums.oracle.com/forums/thread.jspa?messageID=10527052#10527052

Master Graph:

And when you click on one of the above bars, you get this (the employees under the department you clicked).
Also note that a 'Drill Up' button or icon can be provided to go back up one level... 

As of this writing, I didn't know of a declarative approach to achieve this, so the core of this functionality is the GraphHandler class (available in the attached sample application).

**I just read about another approach that might work on this forum thread and this developer guide link but I haven't tried that and I'm not sure it would work for multiple levels of drilldown or drill-up. 

My approach relies on programmatically replacing the binding of the dvt:graph component (screenshot of the code below):