Selective persistence of Oracle Diagnostic Logging (ODL) output

Background and Goal

In any application, logging is widely used for diagnostics and debugging. 

Logging at various "checkpoints" in an application (such as on entry with a request, on exit with a response, or in an error handler) provides a fairly reliable way to trace the execution path of the application - which a subsequent sweep or count can be used to report on. When the logs are regularly analysed and reported on, anomalies can be flagged proactively and investigated further. Some examples
include: a report on requests without a corresponding response, and a report on fault counts and fault codes.

For such reporting, one can write fairly complex scripts to extract the required data, or persist specific log entries to a database for querying with SQL tools or even a custom-designed user interface. This blog post shows an easy approach to achieve the latter. 

Oracle Diagnostic Logging

In Oracle Fusion Middleware, the Oracle Diagnostic Logging (ODL) framework is responsible for handling all application-level logging. The main configuration for ODL is stored in a file called logging.xml, a copy of which exists for each managed server and the Admin server (this configuration can also be edited via Fusion Middleware Control).
Briefly, ODL consists of a set of pre-defined "log handlers" (represented by log_handler elements in logging.xml), each of which uses an implementation class or interface and optional filters, and is configured using a set of nested property elements. 

Each application or sub-system is then identified by a logger element - where a logger can use one or more log handlers. Loggers form an inheritance hierarchy where sub-loggers inherit parent logger properties but can also override them. 
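As an illustrative sketch only (the element and attribute names follow logging.xml conventions, but the path value and levels below are assumptions, not the actual defaults), a handler plus a parent/child pair of loggers might look like this:

```xml
<!-- Sketch of a logging.xml fragment: path and level values are assumptions -->
<log_handlers>
  <log_handler name="odl-handler"
               class="oracle.core.ojdl.logging.ODLHandlerFactory">
    <property name="path"
              value="${domain.home}/servers/${weblogic.Name}/logs/${weblogic.Name}-diagnostic.log"/>
  </log_handler>
</log_handlers>
<loggers>
  <!-- Parent logger: sub-loggers under oracle.osb inherit its level and handlers -->
  <logger name="oracle.osb" level="NOTIFICATION:1">
    <handler name="odl-handler"/>
  </logger>
  <!-- Child logger: inherits from oracle.osb but overrides the level -->
  <logger name="oracle.osb.logging.pipeline" level="TRACE:1"/>
</loggers>
```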

The "console-handler" and "odl-handler" are two common log handlers - sending output to the server console and to a ${SERVER_NAME}-diagnostic.log file respectively. 

Oracle Service Bus (OSB) Pipeline logs

In the Oracle Service Bus, "Log" and "Report" are two actions that can be used in complementary ways. 
The Log action used in OSB pipelines specifically uses the ODL framework to write content to the configured log files. The logger used is oracle.osb.logging.pipeline; in the default setup, the output goes to the odl-handler and, in turn, to the server-specific diagnostic.log file. 

You can view the standard/default configuration of the odl-handler in logging.xml.

Custom Log Handler for OSB pipeline logs

I created a slight variant of the odl-handler and called it SBMessageFlowtraceHandler.
This is the configuration I created for it:
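The original configuration was shown as a screenshot, which is not reproduced here. As a sketch of what such a log_handler entry could look like (the format and rotationFrequency property names follow the standard ODL handler set; the path and file name pattern are assumptions):

```xml
<!-- Sketch only: the path/file name pattern values are assumptions -->
<log_handler name="SBMessageFlowtraceHandler"
             class="oracle.core.ojdl.logging.ODLHandlerFactory">
  <!-- XML output instead of the default text format -->
  <property name="format" value="ODL-XML"/>
  <!-- Specific file name pattern and location -->
  <property name="path" value="/u01/flowtrace/${weblogic.Name}-flowtrace.xml"/>
  <!-- Rotate every minute, in line with the post -->
  <property name="rotationFrequency" value="1"/>
</log_handler>
```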


The key differences from the configuration for odl-handler are:
* The output generated is an XML structure instead of the default text
* It uses a rotation frequency of 1 minute - I will come to the reason for this later
* It uses a specific file name pattern and location 

The log handler created above was then mapped to the OSB logger below (which is also a parent logger of oracle.osb.logging.pipeline):
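The logger configuration itself appeared as an image in the original post. A sketch of such a mapping (the logger name comes from the post; the level and useParentHandlers values are assumptions) could be:

```xml
<!-- oracle.osb.logging is a parent of oracle.osb.logging.pipeline,
     so pipeline Log actions inherit this handler -->
<logger name="oracle.osb.logging" level="NOTIFICATION:1" useParentHandlers="false">
  <handler name="SBMessageFlowtraceHandler"/>
</logger>
```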

The result of the above configuration is that any log actions used in the pipelines start getting routed to my new osb_managed_server-diagnostic.xml log file. One such entry can be seen below:

We can observe that each log entry is represented by a "msg" element. This includes the original content we logged within the "txt" element, but also a lot of interesting metadata:
1. Timestamp of the message in the time attribute
2. Host name / IP address where the OSB managed server was running
3. ECID - an invaluable identifier that can be used to search for related entries in all OSB logs across servers
4. Within the txt element, a 4-tuple structure enclosed in square brackets that includes the route name and whether it was the REQUEST, RESPONSE or ERROR handler pipeline. 
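Putting the metadata above together, an illustrative ODL-XML entry has roughly this shape (all attribute values below are fabricated placeholders, and the exact attribute set in your logs may differ):

```xml
<!-- Illustration only: every value here is a fabricated placeholder -->
<msg time="2019-05-20T10:15:03.123+01:00" comp_id="OSB"
     type="NOTIFICATION" level="1"
     host_id="osbhost1" host_addr="10.0.0.5"
     module="oracle.osb.logging.pipeline"
     ecid="f9e81ab2-0000-0000-0000-000000000001-00000001">
  <txt>[MyPipeline, MyPipeline_request, stage1, REQUEST] log summary: order received</txt>
</msg>
```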

To map these to design time, I have included this image of the relevant section of my pipeline:

Log extraction and Persistence

Once we have made the diagnostic logs available in a well-defined format and location, the solution is just like any other OSB project that uses a file-transport-based proxy service. 
We can use a simple file transport/poller, combined with an XSL transformation and the DBAdapter, to save this data into the database!
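As a sketch of the transformation step (the target element names depend entirely on the schema generated by your own DBAdapter wizard run, so PipelineLogInsert and its children are assumptions):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch: maps one ODL msg entry to a hypothetical DBAdapter insert element -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/msg">
    <PipelineLogInsert>
      <!-- time attribute of the msg element maps to the MessageDateTime column -->
      <MessageDateTime><xsl:value-of select="@time"/></MessageDateTime>
      <!-- logged content maps to the MessagePayload column -->
      <MessagePayload><xsl:value-of select="txt"/></MessagePayload>
    </PipelineLogInsert>
  </xsl:template>
</xsl:stylesheet>
```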
I have created a draft OSB project with JDev/OSB 12.2.1 that does precisely this here:

The database includes a simple table with these main columns:
1) MessageDateTime - timestamp of the original message, which we can pick from the logs as shown above
2) AuditDateTime - the timestamp when the data was inserted into the database (there is likely to be a slight delay between when the logs are written and when they are polled for persistence)
3) MessagePayload - stores the actual content of the message: either the full content of the 'txt' element or the part after "log summary:". In my example, this column is of type XML (SQL Server)
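A minimal sketch of such a table in SQL Server T-SQL (the table name, identity column and default are assumptions; the three main columns follow the list above):

```sql
-- Sketch only: table name and column details are assumptions
CREATE TABLE PipelineLog (
    LogId           INT IDENTITY(1,1) PRIMARY KEY,
    MessageDateTime DATETIME2 NOT NULL,                       -- from the msg time attribute
    AuditDateTime   DATETIME2 NOT NULL DEFAULT SYSDATETIME(), -- insert timestamp
    MessagePayload  XML NULL                                  -- content of the txt element
);
```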

You can of course create your own custom database schema and pick and choose the content and format of data you need inside it. You can choose to save all logged messages or filter based on a certain log level or even certain keywords in the "summary" content of the Log activity. 

Workarounds etc.

1) To prevent the SBAuditLogger from reading the same log file that the server writes to, I used the following "mask" in the file proxy: *flowtrace-*.xml

This means only 'rolled' files are read (the rotation frequency was set to 1 minute above, so rolling is frequent). The proxy can be configured to delete or archive the files it has read, just like any other file-polling proxy service. 

2) To avoid situations where the "txt" element comes in with nested CDATA sections, I wrote a small Java utility that is called in the SBAuditLogger pipeline via a Java callout. 
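The utility itself was linked rather than listed in the post. A minimal sketch of what such a helper could look like (the class and method names are my own; the real utility may handle the escaped split form of nested CDATA as well):

```java
// Hypothetical helper for the Java callout: flattens nested CDATA sections
// so the txt content can safely be re-wrapped in a single CDATA block.
public class CdataUtil {

    // Removes embedded CDATA start/end markers from the input string.
    public static String flattenCdata(String input) {
        if (input == null) {
            return null;
        }
        return input.replace("<![CDATA[", "").replace("]]>", "");
    }

    public static void main(String[] args) {
        String raw = "outer <![CDATA[inner payload]]> trailing";
        // Prints: outer inner payload trailing
        System.out.println(flattenCdata(raw));
    }
}
```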


References

1) Adding a log action in an OSB pipeline:

2) Configuring Logging and log files / Understanding Oracle Diagnostic Logging:


