Tuesday, December 06, 2016

Progress with the Oracle Integration Cloud Adapter SDK

In the past few days, I have been making some progress with using the ICS Cloud Adapter SDK. 
Today, I created my first shell adapter - the design time views can be seen below!

The journey so far: 
 * Installation of all the offline material [Check]
       Gotchas to note here: the step to install SDK patches wasn't required for 12.2.1 (the version I was on). 
 * Reading through the documentation [Ongoing]
 * Developing the empty adapter and deploying it for design time and runtime [Check]

There are a number of integration use-cases that we have identified. If all goes well, these will be available for a wider rollout, helping customers implement some complex integration use-cases with some important cloud services in "hours, not months" in keeping with the Oracle ICS philosophy (and in line with DRY software engineering)!

More coming soon...

Tuesday, October 25, 2016

WS Security - enabling passwordDigest authentication in an Oracle FMW environment

The goal: to have a basic level of authentication on web services (especially where there is no transport-layer security) without having to pass clear-text passwords in the WS-Security headers. 

The concepts are fairly generic, but this post is highly Oracle Fusion Middleware/SOA Suite specific. There can be a complex decision tree (see [1]) involved when selecting the 'appropriate' level of security for any system. As security involves trade-offs between cost, performance, usability and other variables, the 'appropriate' level of security can be highly specific to the environment, use case, system and people. But as developers, we can still perform some due diligence based on the tools and knowledge available to us.  

My rule of thumb when developing a traditional web service or microservice is: If it's reading from a secure database or some system that is accessible only via authentication, it must only expose a secure endpoint. 

Now, sites can differ considerably, and so does the definition of what "secure" is. 
When exposing an HTTP endpoint (SOAP or REST) hosted in the cloud or accessible over the Internet, one would as a minimum ensure that it is served over TLS and has authentication enabled. 

In an on-premise hosted solution, traditionally https has not been widespread within organisations and web service endpoints meant for internal consumption have most commonly only been exposed over http - hopefully accompanied by infrastructure level setup (firewalls, DMZs etc.) that ensures that the data or service is only accessible inside a 'trusted' network. 

Even in a trusted network without TLS, it is probably best if passwords weren't floating around in clear text (which is what the default UsernameToken with passwordText policies do).
With a few steps, one can enable passwordDigest authentication, which not only protects the password in transit but also provides protection against replay attacks (if the nonce and creation-time properties are set in the SOAP header as well).

  • Basic steps are listed in the Oracle documentation [2].
  • For step 9.3.3, what I do is create a new policy pair based on oracle/wss_username_token_service_policy.
This is done via /em -> WeblogicDomain -> Web Services -> WSM Policies

Search for oracle/wss_username_token_service_policy and copy to create a new one with the settings for passwordDigest applied (as per step 9.3.3 of the Oracle guide)

The one I created is singhpora/wss_UsernameToken_PasswordDigest_service_policy (with a client counterpart based on oracle/wss_username_token_client_policy), and I keep these source-controlled. (An additional benefit of putting them in source control is that developers can import them into their local JDeveloper policy store for design time, and also promote their initial versions or changes across environments from development through to production, much like any other artefact.)
  • Another associated step is to ensure the oracle.wsm.security map and basic.credentials key are present on the server that will use the client policy (you can use custom map and key names if required). This needs to contain the username(s) and password(s) of the users who are allowed to invoke the web services that use the username token policies. (You can follow the principle of least privilege when assigning a group to these users.)

The difference can clearly be seen when invoking the service via SoapUI. 
If your service uses a UsernameToken with PasswordDigest policy (like the one I shared above), SoapUI can be used to test it, as it can automatically set the required security headers. 
If you look at the SoapUI logs before applying the passwordDigest policy (e.g. when your service uses a username token with password-text based authentication policy, as in the default setup), this is how the password component of the username token is created:
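For illustration, a UsernameToken with password text carries the password literally inside the SOAP envelope. The values below are made up, but the shape follows the OASIS UsernameToken profile:

```xml
<!-- Illustrative only: a UsernameToken header with PasswordText.
     The password travels in clear text inside the SOAP envelope. -->
<wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
  <wsse:UsernameToken>
    <wsse:Username>testuser</wsse:Username>
    <wsse:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText">MySecretPassword</wsse:Password>
  </wsse:UsernameToken>
</wsse:Security>
```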

Unless your service is accessible only over SSL, this means you have passwords flowing around the network in clear text. Most corporate IT policies would, I believe, specifically forbid letting passwords float around like this, and yet this can often go unnoticed and unaddressed. 

After applying the username token with password digest policy, this is how the WS-Security headers get created:
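In the digest case, the header carries Base64( SHA-1( nonce + created + password ) ) rather than the password itself, per the OASIS UsernameToken profile. A sketch with made-up values (the digest and nonce below are placeholders, not real computed values):

```xml
<!-- Illustrative only: a UsernameToken header with PasswordDigest.
     Digest = Base64( SHA-1( nonce + created + password ) ), so the
     password itself never appears on the wire. -->
<wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd"
               xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">
  <wsse:UsernameToken>
    <wsse:Username>testuser</wsse:Username>
    <wsse:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordDigest">PLACEHOLDERdigestValue28ch=</wsse:Password>
    <wsse:Nonce EncodingType="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-soap-message-security-1.0#Base64Binary">QmFzZTY0RW5jb2RlZE5vbmNl</wsse:Nonce>
    <wsu:Created>2016-10-25T10:00:00Z</wsu:Created>
  </wsse:UsernameToken>
</wsse:Security>
```

The server recomputes the same hash from its stored password plus the transmitted nonce and creation time, which is also why the nonce allows it to reject replayed messages.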

Only the client and the server now know what the password is and no one in the middle can see this. 

* To enable digest authentication, the server has to store passwords in clear text, as per the documentation (for the default authenticator to work - if you have more stringent requirements, it is possible to write your own authenticator that reads passwords from an encrypted credential store). The reason is that with digests, the client (such as SoapUI in the above example) creates a hash of the actual password, creation time and nonce; the server on its side has to create the same hash for successful authentication, and this requires the server to know the clear-text password. 
But this is still acceptable, as it can be contained behind strict administrator control - far better than having clear-text passwords travelling over the network. 

Digest authentication on its own can help protect your password, and with the nonce, it can help prevent replay attacks [3][4]. 
But it is still vulnerable to man-in-the-middle (MITM) attacks [3], which means it is on the whole better to also enable TLS for web service endpoints when possible. 
(Although, if you suspect MITM attacks from inside an organisation's network, you might have other serious issues!) 

[1] Decisions and choices involved when selecting the appropriate security policy: https://docs.oracle.com/middleware/1221/owsm/security/choose-owsm-policy.htm#OWSMS3988

[2] Setup steps for enabling digest authentication: https://docs.oracle.com/middleware/1212/owsm/OWSMS/configure-owsm-authentication.htm#OWSMS5450

[3] Sections 3.2 & 3.3 : https://www.w3.org/Protocols/rfc2069/rfc2069

[4] https://www.oasis-open.org/committees/download.php/13392/wss-v1.1-spec-pr-UsernameTokenProfile-01.htm

16-May-2017: Note about possibility of custom authenticator in the Tradeoffs section (prompted by Jason Scarfe's comment)
1-Jul-2017: Added limitations section

Managing shared metadata (MDS) in a continuous integration environment

Goals and Summary:
* Package shared metadata in a SOA environment and make it widely distributable (SOA MDS [2], Servicebus, maven artifact repository) 
* Associated sample: https://github.com/jvsingh/SOATestingWithCitrus/tree/develop/shared-metadata  
* Key command (if you use the associated pom file) 
mvn deploy com.oracle.soa.plugin:oracle-soa-plugin:12.2.1-0-0:deploy -Dpassword=*****

Having worked on a wide range of projects, I came to the realisation that SOA can mean vastly different things in different places.
It can be about implementing the foundational service-oriented architectural principles, or it can simply be about using a tool or technology with SOA in its name - just like any other programming language.
In a mature SOA environment, the shared metadata contains valuable artefacts that provide the foundation – subject to design, it contains the canonical information model of the enterprise (in the form of business/entity objects) and the various organisational API interfaces (service interfaces and messages).
In the Fusion Middleware world, this pattern is easily implemented via the MDS - a set of services that allows storage and retrieval of many types of shared resources such as WSDLs, XSDs, DVMs, reusable XQuery or XSL transformations etc. Within SOA composites, these are then accessible via the oramds:/ prefix.
To take this one step further, we can also deploy the same copy of the shared artefacts into the Oracle Service Bus as a servicebus project, so even the OSB services can access these without requiring local copies scattered everywhere. A great benefit of deploying this content to the OSB is that you get some basic sanity checking of these artefacts for free (e.g. the OSB is a bit strict about unresolvable XSD imports in WSDLs - this kind of thing is highlighted at design time only if you use a professional XML editor and not regular JDeveloper [1], which is what most FMW developers commonly use).
There are some key principles here:
  • Within an organisation, are service callers and called services able to access the same copies of schemas and WSDLs? Or are there copies floating all over in every project? This kind of thing invariably leads to 'slightly' different copies of the same schema and is basically a recipe for mess.

(Of course, when consuming ‘external’ services, we probably do want to save a specific version of their interface locally as that forms our ‘contract’)
  • Are projects neat, self-contained units that interact with the 'external world' via well-defined interfaces, or is there a complex web of cross-dependencies, deeply nested dependencies and even circular dependencies with projects referencing each other? Shared metadata helps avoid these situations by providing both callers and implementers the same reference data model and interfaces.
  • Is there any form of assurance or validation of the shared artefacts? Are the WSDLs and XSDs well-formed and valid? To be specific, are any schema errors flagged up regularly as part of a continuous build (rather than being detected much later, when multiple such errors have accumulated)?
  • Is the MDS being built and deployed as a single, versioned unit, or do individuals simply zip up groups of "files" and promote them across environments?

On the last point, I think it is important to treat the shared metadata as a single deployable unit that can be version controlled, tagged, built with identifiable versions, validated, deployed and promoted, in the same way as a SOA composite or a ServiceBus project is a single deployable unit. (Yes, I know you can create an *.sbar archive with only the 'files' you changed within a project, but this kind of approach is completely contrary to practices that promote continuous integration and delivery. You essentially end up tracking individual files rather than treating a 'project' as a unit of deployment.)

Now, coming to the build and deployment of the MDS, we use the approach of zipping these up (note the build section and packaging in my MDS pom.xml) and then deploying the artefact using the oracle-soa-plugin for maven (specifying the sarFile property as apps.jar):
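A minimal sketch of such a pom (the coordinates are illustrative, not the exact file from the repo) - the key idea is that the MDS content is packaged as a plain jar (which is just a zip), with a final name matching the sarFile the oracle-soa-plugin is given:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.singhpora</groupId>      <!-- illustrative coordinates -->
  <artifactId>shared-metadata</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>jar</packaging>            <!-- a jar is just a zip of the MDS content -->
  <build>
    <finalName>apps</finalName>         <!-- so the bundle matches sarFile=apps.jar -->
  </build>
</project>
```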

mvn deploy com.oracle.soa.plugin:oracle-soa-plugin:12.2.1-0-0:deploy -Dpassword=


  • As seen above, the MDS bundle is deployed to the SOA runtime.
  • It is also deployed to the maven repository configured in the distributionManagement section (this could be any repository, such as Nexus).
Note that since I call the oracle-soa-plugin directly in the maven command, I don't need to explicitly configure it in the pom. (I would have to do that only if I were piggybacking the SOA deploy on top of one of the maven phases, but here I specifically want "mvn deploy" to validate and then deploy the artefact to my maven repo; I specifically want my MDS deployment to the runtime MDS to happen separately.) I have only configured some of the properties required by the oracle-soa-plugin in the pom, to keep my deploy command concise.
I further make it a point to ensure that the artefact produced by this last step is also deployed to the local and internal maven repositories (such as Nexus). For this example, I have used a simple distributionManagement section in my MDS pom that installs the shared-metadata bundle into my local maven repository. This simple step ensures that ANY other consumer in the organisation is able to consume this metadata (e.g. a standalone Java web service or application that needs to call an internal web service).
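For reference, a distributionManagement section looks roughly like this (the repository ids and the Nexus URL here are placeholders - substitute your organisation's repository):

```xml
<distributionManagement>
  <repository>
    <id>internal-releases</id>           <!-- placeholder id -->
    <url>http://nexus.example.com/repository/releases/</url>
  </repository>
  <snapshotRepository>
    <id>internal-snapshots</id>
    <url>http://nexus.example.com/repository/snapshots/</url>
  </snapshotRepository>
</distributionManagement>
```

With this in place, the standard "mvn deploy" phase pushes the same bundle that was validated and deployed to the runtime.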
In subsequent posts, I will add a Java consumer that can simply use the shared metadata as a dependency and consume the common repository of shared artefacts.
In the brave new polyglot world of the Oracle Application Container Cloud, this can in theory be ANY consumer - even PHP or Python!

Coming up:
* Adding more validation for shared-metadata in CI

References and footnotes

[1] Sometimes, unresolved types only come to light in JDeveloper if you ctrl+click on them. I think this flexibility might be by design, perhaps to keep things simple for beginners, but this is only an opinion. 
[2] Teams might use various approaches for this. Here is one of the earlier posts that also partly addresses MDS deployment via maven with a conceptually similar approach (create a zip, then deploy using oracle-soa-plugin): http://weblog.redrock-it.nl/?p=740
My approach, though, avoids the need for the assembly plugin and its associated XML assembly descriptor to create the zip beforehand. The benefit is that the primary artifact produced by the main build is what maven also automatically pushes to the distribution repo (such as nexus) in the 'deploy' phase.

Sunday, September 11, 2016

Easy SOA releases with JGitFlow

If you use Git as your source control system and you use maven, the jgit-flow plugin is a massive time-saver, especially when releasing a slightly large application with multiple modules (each with its own pom file). 

Two steps: 
 mvn clean external.atlassian.jgitflow:jgitflow-maven-plugin:release-start
 mvn clean external.atlassian.jgitflow:jgitflow-maven-plugin:release-finish

do the job. 

The above sequence basically updates the pom file versions to a release version (e.g. from 1.0-SNAPSHOT to 1.0), merges the development branch changes into the master branch, and sets the pom versions in the development branch to the next snapshot (e.g. 1.1-SNAPSHOT).

If you have an application with multiple projects/modules, all of them can be released in one go (such as my application here that contains two modules)

Of course, there are some peculiarities when SOA Composite projects are involved. 
e.g. the oracle-soa-plugin maven plugin insists on 'deploying' the composite and running tests at the same time - so you need to keep a SOA server running and supply the serverUrl, username and password properties (keep the property names different - see the sar-common pom for example names - so they don't clash with the jgitflow username and password properties). 

I avoid this by simply using a private/public key pair to interact with GitHub, which saves time and avoids the above property name clash. 
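As a sketch of how this can be wired up in the parent pom (the version and configuration flags below are ones I believe the plugin supports - check the plugin documentation for your version before relying on them):

```xml
<plugin>
  <groupId>external.atlassian.jgitflow</groupId>
  <artifactId>jgitflow-maven-plugin</artifactId>
  <version>1.0-m5.1</version>                            <!-- example version -->
  <configuration>
    <enableSshAgent>true</enableSshAgent>                <!-- use SSH keys; avoids the username/password property clash -->
    <autoVersionSubmodules>true</autoVersionSubmodules>  <!-- release all modules with one version -->
    <noDeploy>true</noDeploy>                            <!-- don't run 'deploy' as part of release-finish -->
  </configuration>
</plugin>
```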

Of course, there are ways to stop the oracle-soa-plugin insisting on deployment when creating a release, but that is a post for a later day! 

Saturday, September 03, 2016

Test Driven SOA - citrus for powerful SOA test coverage

Reading parts of "Test-Driven Development for Embedded C" by James W. Grenning inspired me to take another look at this area and look for something new, fresh and powerful for use in the SOA world. 

I don't think we need much convincing on the importance of automated test coverage (if someone does, please read the first chapter of the book mentioned above, especially the section on the "Physics of TDD", which tries to quantify the high long-term costs of "Debug-later programming" - the nemesis of TDD).

A very simple application with a SOA composite project and Tests project can be found here: https://github.com/jvsingh/SOATestingWithCitrus

Although the test in this is just a simple SOAP request, what I am interested in are the features that Citrus has to offer to help create a solid battery of tests. 

  • Tests can be specified in Java or XML or a combination of both
  • A number of test utilities are built in - including database access, JMS, mock SOAP endpoints (static responses) and complex assertions - and these can be used to write complex setup and teardown routines
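As a flavour of the XML dialect, a test case sending a SOAP payload and validating the response might look roughly like this (a hypothetical sketch - the endpoint name, namespaces and payload are made up, not the actual test from the repo):

```xml
<spring:beans xmlns="http://www.citrusframework.org/schema/testcase"
              xmlns:spring="http://www.springframework.org/schema/beans">
  <testcase name="HelloServiceIT">
    <actions>
      <!-- send a request via a SOAP client endpoint defined in the citrus context -->
      <send endpoint="helloSoapClient">
        <message>
          <data><![CDATA[<hel:HelloRequest xmlns:hel="http://singhpora.com/hello">
            <hel:name>Citrus</hel:name>
          </hel:HelloRequest>]]></data>
        </message>
      </send>
      <!-- validate the response payload -->
      <receive endpoint="helloSoapClient">
        <message>
          <data><![CDATA[<hel:HelloResponse xmlns:hel="http://singhpora.com/hello">
            <hel:greeting>Hello Citrus</hel:greeting>
          </hel:HelloResponse>]]></data>
        </message>
      </receive>
    </actions>
  </testcase>
</spring:beans>
```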

I will leave the reader to peruse the code on github but this shows the most important pieces of config in my test project:

  • To build + deploy + test, after making sure your SOA server is running, just run "mvn integration-test" from the application level (provide serverUrl, user and password in the SOAComposite pom or from the environment, e.g. -DserverUrl=http://soahost:port)
  • To only run the integration tests, just run "mvn integration-test" from the SOAApplication/SOACompositeTests level.

This is all neat and CI ready! 

Saturday, August 27, 2016

Maven builds for SOA 12c Composites with BPEL Java embedding and Java class

Environment: Oracle SOA Suite 12.2.1

Sample Application:  https://github.com/jvsingh/SOAAppWithJavaEmbedding/tree/develop/SOAApplication
(Git clone or use the download option from here: https://github.com/jvsingh/SOAAppWithJavaEmbedding/tree/develop )

 A BPEL component has a Java embedding that in turn calls a Java class method (under the usual SCA-INF/src)

This works and builds fine using JDeveloper, but the oracle-soa-plugin for maven seems to have a few known issues (see references for one of them) that cause builds for such composites to fail. 

 My Java embedding, referring to my class com.singhpora.samples.SOAApplication.SCAJava under SCA-INF/src, can be seen here: 

When I build the SOA project using "mvn clean package" (from the SOAProject directory with the default pom), I get two distinct errors, as shown below: 
a) It can't find my class from under SCA-INF/src 
b) It cannot find even the BPEL platform classes

The workarounds for the two issues above involve:
a) Creating a simple Java pom file under SCA-INF 

b) Adding SOA/SCA-INF as a module in the 'Application'-level pom 

c) Workaround for the second issue, where it can't find BPEL's libraries:
Observe the use of the maven-dependency-plugin here (which essentially copies a BPEL platform dependency temporarily under SCA-INF/lib to keep the compiler happy):
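The relevant fragment looks roughly like this - note that the BPEL artifact coordinates below are placeholders for illustration; use whatever your local Oracle maven repository actually provides:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-bpel-platform-jars</id>
      <phase>generate-sources</phase>   <!-- runs before compilation of SCA-INF/src -->
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>                <!-- placeholder coordinates for the BPEL platform jar -->
            <groupId>com.oracle.soa</groupId>
            <artifactId>orabpel</artifactId>
            <version>12.2.1-0-0</version>
          </artifactItem>
        </artifactItems>
        <outputDirectory>${project.basedir}/lib</outputDirectory> <!-- i.e. SCA-INF/lib -->
      </configuration>
    </execution>
  </executions>
</plugin>
```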

After using the above workarounds, if I now build my application using the application pom, it builds and deploys fine:
(run mvn clean pre-integration-test from the SOAApplication level). 
As you can see, it now builds the two modules and the application. 

At runtime, my Java code is invoked successfully:

References/Related links:
2)   Same as workaround c) http://www.esentri.com/blog/2016/04/07/unable-to-compile-a-composite-java-embedded-maven/
3)   Builds for Java classes under SCA-INF but with a slightly different approach:

Tuesday, June 07, 2016

#AMIS25 and The Oracle Cloud Shift : Insights from my first Holland trip

I would like to take this opportunity to wish AMIS Netherlands a very happy 25th birthday. In the context of Oracle SOA, the name AMIS often keeps popping up - they have contributed a lot to the knowledge available to the community around this and related Oracle technology. 

They chose to celebrate this occasion in uniquely signature style - by holding a global Oracle conference with an impressive line-up of speakers from all six continents, and by holding the event in an old aircraft hangar (commemorating their origins as Aircraft Management Information Systems).
It was a pleasure to be invited by Lucas Jellema (@lucasjellema), so I decided to attend at least one day - Friday, the 3rd of June. The line-up of events, though, was fantastic on both days. 

I arrived in the Netherlands on Thursday, the 2nd (my first visit to the country, outside the airport that is) and decided to explore places nearby... More on this later!

All the speakers might upload their presentations as they see fit and of course, know the best about their subject matter. I'm going to write about the talks I attended and my observations on the main themes.

One thing that is quite apparent is that the mainstream Oracle world is now cloud. This is quite the realisation of the 'c' of 12c.

First, the conference day started for me with Simon Haslam's (@simon_haslam) talk on the Oracle Traffic Director. This was one of those aha moments when you realise a gap in existing technology that you vaguely knew was there but had always either ignored or worked around!
OTD offers seriously advanced load balancing, fit for globally distributed cloud applications, that is also 'application aware' (both out of the box and with options to extend via custom programming).

In my second session, Matt Wright of Rubicon Red shared his company's insights and a roadmap for moving integrations to the cloud. 

Peter Ebell of AMIS presented a talk on new SOA paradigms ("Alien architectures" as he termed it) - the post RDBMS world. The premise was that traditionally, SOA service layers that directly perform DML on RDBMS databases are very prone to changes in the database. Perhaps new approaches might need to be explored - especially for the new world where data in general is more unstructured or semi-structured. 
He started with a typical 'napkin architecture' and then progressed on to explain how it would evolve for certain modern requirements. 
At first the speaker started the talk in Dutch and I thought it would be an interesting challenge to try and understand everything in Dutch! But he then switched to English.

Shay Shmeltzer (@JDevShay) introduced the Oracle Developer Cloud - this is a boon for the developer community, as with a few clicks a developer can get the basic development environment (source control, wiki, issue tracker, build server) up and running for a whole team! 
As Shay reiterated "..A mature DevOps facilitates short and quick release cycles...." , which is precisely today's need and expectation from businesses. 

Lonneke Dikmans of eProseed and Lucas Jellema of AMIS introduced the various Oracle cloud offerings - PaaS offerings, to be precise. Beyond the familiar SOA CS, ICS (Integration Cloud Service) and PCS (Process Cloud Service - with its BPM engine and BPM Workspace), the IoT and Big Data cloud services are interesting new offerings. 
I noticed that both the IoT and Big Data CS include 'analytics' - Lonneke clarified that these target different types of data (real-time data in flux versus static historic data). 
As I see it, the IoT cloud service adds value by "turning sensor data into sensible information" - that can subsequently be fed in to underlying data, integration and analytics services. Very compelling. 

Lucas described a realistic strategy for migration to the cloud by targeting 'edge systems' first. 

Bram Van Der Pelt of AMIS gave a session on Identity 3.0 and its possible application in the Oracle world. Identity 3.0 is a new proposal developed by the Jericho Forum, which essentially proposes a mechanism whereby an identity and its related attributes are maintained and shared by the authority that owns them (such as a national government, or the individuals themselves). The root of every identity is proposed as anonymous. These principles facilitate privacy. 
This is a major paradigm shift from the currently prevalent model, where every application stores local copies of user identities and lots of personal profile information. 

......Beyond technology, the conference also gave me the opportunity to see some nice parts of Holland. As I arrived at Amsterdam on the afternoon of Thursday, I started to make my way towards Katwijk. As the historic city of Leiden was on the way, I took the opportunity to explore the Leiden town centre a bit and also see the  Rijksmuseum van Oudheden - which is the national archaeological museum of the Netherlands. The collection is nice and includes artifacts from ancient Egypt, Persia and local archaeological finds from the regions in and around the Netherlands. 

"Why should we look to the past in order to prepare for the future?
Because there is nowhere else to look"

~James Burke (quoted at the Rijksmuseum van Oudheden)
An interesting fact about Leiden is that it is the birthplace of the famous Dutch painter Rembrandt. 

The day after the conference, I headed to Amsterdam (having stayed overnight in Den Haag/The Hague). I found a map of the city and started the day with walks along the canals from Central Station to the Museum district. Eventually I decided on the Rijksmuseum, where I spent most of the day exploring its extensive collection of paintings by Rembrandt, Vermeer and other artists. 

"You have two eyes and but one mouth. Let this be a signal to pay heed, not to talk here, but to read"
(~Quoted on the walls of the Library at the Rijksmuseum, Amsterdam, pictured below)

Having spent hours at the Rijksmuseum, for the remainder of the day all I could do was walk around the city some more before it was time to catch my flight. A very fruitful first trip to Holland - not only for the information-packed conference, but also because I got to sightsee and visit two of the main national museums of the country!