Oracle Integration Cloud - insights and analytics with Splunk

Sharing some interesting insights that I obtained from runtime information of integration flows in Oracle Integration Cloud, extracted into Splunk. My approach was to extract the activity trace already available in OIC, slightly reformat the data using python/shell scripts, ingest it into Splunk, and create these visualisations. I will not go into too many technical details here, but enough to show what's possible. I should add that Oracle Management Cloud (especially Log Analytics) might offer something similar, but as of this writing, I am not aware of anything pre-built specifically for Oracle Integration Cloud. OIC's own monitoring dashboard provides pre-configured insights at a high level, and of course, one can drill down into individual runtime instances to identify problems. But my objective here is to create something more flexible and dynamic that could be extended and, more importantly, used to analyse historical trends. This is valuable…
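To illustrate the "slightly reformat" step, here is a minimal sketch of turning one activity-trace line into Splunk-friendly key=value pairs. The field names and the CSV layout are assumptions for illustration, not the actual OIC export schema.

```python
# Hypothetical reformatter: converts an OIC activity-trace CSV line into
# key=value pairs, which Splunk extracts as fields automatically.
# FIELDS is an assumed schema, not the real OIC trace format.
import csv
import io

FIELDS = ["timestamp", "flow_name", "instance_id", "milestone", "status"]

def to_splunk_kv(csv_line: str) -> str:
    """Convert one CSV trace line into 'key="value"' pairs."""
    row = next(csv.reader(io.StringIO(csv_line)))
    return " ".join(f'{k}="{v}"' for k, v in zip(FIELDS, row))

line = "2018-03-01T10:15:00Z,OrderSync,abc123,START,SUCCESS"
print(to_splunk_kv(line))
# timestamp="2018-03-01T10:15:00Z" flow_name="OrderSync" ...
```

Each reformatted line then becomes one Splunk event, and the dashboards are built on top of the extracted fields.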

An article demystifying crypto-coins, and if it's even possible to value Bitcoin

This post has nothing to do with Oracle technology, as I haven't yet tried Oracle blockchain. I'm also aware that much has been said on the subject by others, and the inner workings of blockchain have been explained ad infinitum over the years.

But here's my analysis, which describes ideas like money, "value", price, some history of "ledgers", and how they evolved into "Distributed Ledger Technology".

I must add that many of the shortcomings around the scalability of Bitcoin (which is only one of the cryptocoins, albeit the pioneering one) are covered in the Bitcoin FAQs.

Abstract below:
The meteoric rise of the price of bitcoin, and its accompanying wild fluctuations, piqued the interest of many investors and speculators. Cryptocurrency has often been hyped as a gold-like replacement for fiat currencies by virtue of its “finite” supply. There have been many publicised stories of the early stage “miners” who “solved puzzles” (as they put it) on their computers to “mine” b…

Recursive calls in Oracle Integration Flows (Scenario: Paginated API calls for large Data Sets)

A number of use-cases can be implemented cleanly using a recursive approach. This post is not meant to debate the pros and cons of recursion versus looping, but to provide a simple approach to achieving it. For scenarios such as the ones listed below, and possibly more, this approach is efficient, concise, maintainable, and most importantly, highly scalable. It also leaves a smaller runtime footprint and a shorter execution time per instance than a looping flow instance. This also makes error handling easier, as I will describe later.
- Polling (continuously monitoring an FTP location, a database, or an API output)
- Paginated APIs (when the target system exposes an API with a paginated* interface, such as the eBay findProducts operation)
- Retryable flows
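The paginated-API case above can be sketched in plain Python. Here `fetch_page` is a stand-in for the target system's paginated API (e.g. eBay findProducts); in OIC, the recursive call corresponds to the integration invoking itself with the next page number.

```python
# Sketch of the recursive pagination pattern. fetch_page simulates a
# paginated API with 5 items and a page size of 2; both are assumptions.

def fetch_page(page: int, page_size: int = 2):
    """Fake paginated API: returns one page of items and a has-more flag."""
    data = ["item-%d" % i for i in range(1, 6)]
    start = (page - 1) * page_size
    chunk = data[start:start + page_size]
    has_more = start + page_size < len(data)
    return chunk, has_more

def process_all(page: int = 1, collected=None):
    """Process one page, then recurse while more pages remain."""
    collected = collected if collected is not None else []
    chunk, has_more = fetch_page(page)
    collected.extend(chunk)  # "process" this page's records
    if has_more:
        return process_all(page + 1, collected)
    return collected

print(process_all())  # ['item-1', 'item-2', 'item-3', 'item-4', 'item-5']
```

Each recursive step handles exactly one page, which is what keeps the per-instance footprint small.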

Fault tolerance in integration flows - handling target system availability problems

An important non-functional property of any software system is "Availability". In the ISO/IEC 25010:2011 product quality model, this is grouped under the overall category of "Reliability". Fault tolerance is a closely associated property, also grouped under "Reliability".
System downtimes could be either due to scheduled maintenance
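One common way to tolerate a temporarily unavailable target is retrying with backoff. Below is a minimal sketch of that idea in Python; the `TargetUnavailable` error and `call_target` function are illustrative assumptions, not OIC APIs.

```python
# Retry-with-exponential-backoff sketch for a target system that is
# briefly down (e.g. scheduled maintenance). All names are hypothetical.
import time

class TargetUnavailable(Exception):
    pass

def with_retries(call, attempts=3, base_delay=0.01):
    """Retry `call` with exponential backoff; re-raise after the last try."""
    for n in range(attempts):
        try:
            return call()
        except TargetUnavailable:
            if n == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** n))  # wait 1x, 2x, 4x...

# Simulated target that fails twice, then recovers.
state = {"calls": 0}
def call_target():
    state["calls"] += 1
    if state["calls"] < 3:
        raise TargetUnavailable("scheduled maintenance")
    return "OK"

print(with_retries(call_target))  # OK
```

The same pattern can be modelled inside an integration flow, with the retry implemented as a recursive self-invocation as described in the earlier post.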

Selective persistence of Oracle Diagnostic Logging (ODL) output

Background and Goal
In any application, logging is widely used for diagnostics and debugging.
Logging at various "checkpoints" (such as entering with the request, exiting with the response, or in an error handler) can provide a fairly reliable way to trace the execution path of the application, which a subsequent sweep or count can report on. When the logs are regularly analysed and reported on, anomalies can be flagged proactively and investigated further. Some examples
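The checkpoint-then-sweep idea can be sketched with Python's standard logging module; the checkpoint names and handler here are illustrative, not ODL conventions.

```python
# Log at entry, exit, and error "checkpoints", then count each kind -
# the counting step is the "subsequent sweep" over collected logs.
import logging
from collections import Counter

records = []

class ListHandler(logging.Handler):
    """Collects log messages so a sweep can count them later."""
    def emit(self, record):
        records.append(record.getMessage())

log = logging.getLogger("checkpoints")
log.addHandler(ListHandler())
log.setLevel(logging.INFO)

def handle_request(payload):
    log.info("CHECKPOINT entry")      # entering with request
    try:
        if not payload:
            raise ValueError("empty payload")
        log.info("CHECKPOINT exit")   # exiting with response
        return "response"
    except ValueError:
        log.info("CHECKPOINT error")  # error handler
        return None

handle_request({"id": 1})
handle_request(None)

# Sweep: two entries but only one exit flags that one request errored.
counts = Counter(m.split()[-1] for m in records)
print(counts)
```

A mismatch between entry and exit counts is exactly the kind of anomaly such a sweep would surface.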

Geographical clusters with the biggest concentration of web services

From a data set of approximately 145 million IP addresses running at least one publicly accessible web service (such as a website), I was able to determine these 20 geographic "clusters".

Raw results - countries list with total IP (IPv4) addresses


Presented below is a list of countries (country codes) and the total count of live IPv4 addresses where a public-facing service (such as a website) might be hosted, as counted from the scan data of 1st October 2017.

The reason these don't add up to anywhere near 4 billion (the total IPv4 address space) is that the data set I used only scans for hosts that run some public service exposed over a TCP port (e.g. a website running on port 80 or 443).
The numbers definitely look low, totalling only 145,430,195 - I will continue to investigate why, but they appear to be in proportion. It is likely that the scans are only able to gather data about IP addresses that were live at the time of the scan, as opposed to all allocated ones.

|Country|Live IPv4 count|
|     LT|         120718|
|     DZ|         362827|
|     MM|           3494|
|     CI|          18954|