IoT on #oow17 and in my life

Last Sunday I attended an interesting session at Oracle OpenWorld about how Federal Express, the American parcel delivery company, uses the electronics in its cars (tyre pressure sensors, a lot of engine parameters and the car's GPS) to collect huge amounts of data. With this data they could coach drivers to drive more smoothly, to save fuel and prevent excessive engine and tyre wear. They saved money because they optimized routes and minimized the time cars spent running idle. They could also tell the customer exactly when the parcel would arrive, because they could see live how well the trip was going: deviations, accidents and a very lucky drive were all noticed and used to update the estimated time of arrival.

I thought it was an inspiring session, because they used what is already there (the car electronics, a database) and combined it with a small device that sends all this information via the mobile data network to the organization. Add clever analytics software and they are much more in control, and can use and maintain their equipment much more efficiently. In the past a little buffer was built in for many things; that is no longer necessary, because they can see exactly when, for instance, the tyres of a car need to be replaced, since the exact mileage is known.

The next day Mark Hurd talked about predictions he had made, and showed that they were right. One of them was that many more devices in our lives will be connected with each other to make our lives easier. Until then I had the impression that these developments were not going as fast as he had predicted.

But suddenly I realized that even in my own life a lot of things are already being combined to make my life more comfortable: if I start up my Blu-ray player, my TV is switched on automatically; my 'Toon' (a smart thermostat and home automation controller) measures how much power I use and reports that to the power company, so I no longer have to submit an end-of-year meter reading manually. If I step into my car, my phone connects automatically, and the app 'Flitsmeister' starts up because it detects that the phone is connected to my car. If I google a restaurant on my phone, I can let Google Maps guide me there; if I hear a good song, I can let Shazam listen to it and find the song's title for me. And when I booked Oracle OpenWorld with my credit card, the bank called me to check whether that was a normal, intended transaction.

These are all examples of clever use of things that are already there, like phones and what's in them, databases, the data network, the data itself and analytics software, to make our lives better. I love it, and I am proud to be part of the industry that makes this happen!

OGh Tech Experience 2017 – recap

On June 15th and 16th 2017 the very first OGh Tech Experience was held. This 2-day conference was a new combination of the DBA Days and Fusion Middleware Tech Experience that were held in previous years. To summarize: OGh hit bullseye. It was two days packed with excellent in-depth technical sessions, good customer experiences and great networking opportunities.


The venue was well chosen. De Rijtuigenloods in Amersfoort is a former maintenance building of the Dutch Railways converted into a conference center. So the backdrop and even the location of some sessions were old train carriages.

Tech Experience Entrance


The first bombshell was dropped in the introduction by OGh Chairman Robin Buitenhuis. Starting July 1st, the OGh will be renamed to nlOUG, the Netherlands Oracle User Group. This will provide OGh with a more international allure and hopefully attract more foreign speakers and visitors to their events.

After the introduction it was time for the first keynote. Maria Colgan (@sqlmaria), the Oracle Database Product Manager, delivered a great talk about bridging the gap between developers and DBAs by letting them work together as DevOps teams. Even though I don't agree with her definition of DevOps (I agree more with Lucas Jellema, but more about that later in this blog), I do agree with her sentiments.

Maria Colgan on Oracle Database 12c and DevOps


During her talk a lot of new Database 12c features were shown that were interesting for both the DBA and the Fusion Middleware attendees. A few highlights:

  • leveraging REST from the database by using SQL Developer and ORDS (Oracle REST Data Services)
  • dbms_json and json_dataguide to work with JSON directly from/in the database
  • new Materialized View features like ENABLE ON QUERY COMPUTATION
  • ALTER TABLE ... INMEMORY for better performance
  • dbms_redact to redact data for security reasons

After the first keynote and some much-needed refreshments due to the tropical temperatures, the parallel sessions started. Because the Tech Experience was a combination of two events, there were 9 simultaneous tracks on many different subjects. With tracks ranging from DBA and PL/SQL to Integration & Process and Web & Mobile, choosing an interesting track was easy.

My first session was one by Jon Petter Hjulstad, 'Experiences from SOA 12.2 implementations'. Because my current customer is in a transition from 11g to 12c, this one gave me some insight into what to expect and what to avoid.

Tools of the trade

My second session was also in the Integration & Process track. This one was on 'Case Management in Process Cloud', delivered by Ralf Mueller. He showed the audience the path that Oracle is taking in expanding the Process Cloud with unstructured or dynamic processes, by introducing stages that can be invoked by rules, similar to the on-premise ACM implementation. He even did a demo to show the ease with which a new implementation can be built.
Later Ralf gave a glimpse into the future of Process Cloud by talking about Adaptive Processes. These will be supported by AI and machine learning, data-based rules and a deep learning algorithm. The stuff of sci-fi!

After this it was time for some good old SQL. Chris Saxon showed the audience some magic that can be achieved with some neat features hidden in the Oracle Database. Some examples are:

  • SQL Translation Framework
  • Edition-Based Redefinition
  • Index Organized Tables
  • Invisible Columns
Don't ask Tom

My next session was by Sandra Flores (@sandyfloresmx), an integration specialist from Mexico. She explained her vision on SOA, microservices and service orientation, and especially how they relate to each other. My main takeaway from this session was that microservices can be part of a SOA, but both are part of service orientation.

The Samurai Way


The last session of the first day that I visited was by Lonneke Dikmans and Ronald van Luttikhuizen. In this session they showed the audience how architects and integration specialists can start debates on the architecture of an integration implementation.

The day was concluded with a very good dinner and some much needed drinks.

A bit rusty

On the second day of the event, things started off with another keynote. This time it was Duncan Mills, who titled his session "How I learned to stop worrying and love the Cloud". In his keynote he explained how he started out as a young programmer and gradually grew into the person he is now, and what lessons he learned along the way.
It was a fun talk and it provided lots of insights for developers and managers alike.
Duncan explained how an organization can take its path towards the Cloud. In his opinion this path consists of four steps, but an organization should not go further up the path than it dares to go. If step 1 is enough, you don't have to go further to profit from the Cloud; if you take it too far, adoption might feel forced.

Duncan Mills - Path to the Cloud


Another great lesson was based on Richard Dawkins' book "The Selfish Gene", about how genes that want to propagate drive the evolution of organisms. Building on that, Duncan introduced us to the Selfish Developer. He explained that indulging developers in all their whims (like: "give me another environment to test stuff") will help an organization evolve.

The Selfish Developer

Finally, Duncan shared a list of things he learned along the way:

  • Be pragmatic
  • Don’t over-design
  • (Mostly) Don’t take anyone’s word for it
  • Strive for automation
  • Work in short iterations
  • Make mistakes!
I guess you had to be there to get this reference 🙂

Because the venue was an old railway complex, some sessions were held inside railway carriages, like my first parallel session of day two.
Luc Gorissen gave an inspiring presentation about fault handling in ACM and BPM. It all starts with the functional design: you have to think about the happy flow, but you should never forget what should happen when something technical (like a failing server) or something functional (an incorrect zip code or an unknown user) goes wrong.
Only when you combine your fault categories, fault strategy, layering model, design guidelines and implementation guidelines do you get a complete fault handling implementation.

Fault Handling

Next up was Xander van Rooijen of Rabobank. He showed an example of an API Management implementation.
Using Apiary for the API design and API Fortress for testing, they were able to create a full API implementation on Oracle's API Platform CS beta environment. Now they are waiting for Oracle so they can push it to production.

Another customer story was told by Froukje van der Wulp and Maarten Smeets of spir-it, the IT department of the Dutch Council of the Judiciary. They explained how spir-it has transformed its organization from a classic waterfall to an agile environment. This enabled them to tackle large performance issues in their complex applications for digitizing the judiciary.

Always faster

In the last parallel track I visited a session by Robert van Mölken on blockchain, for me a very new and unknown subject. His session gave me insight, and I learned that blockchains build on the techniques used by Bitcoin.

The final word was for Lucas Jellema. In the third and last keynote he wrapped up the conference.
With many salutes to the audience, speakers and organizers, he shared some valuable lessons. And he wasn't afraid to disagree with some of the other speakers. As I said at the beginning of this blog, his definition of DevOps differs from the one Maria Colgan showed. His description of a DevOps team is more practical: "You build it, you run it, you fix and evolve it".

We salute you!

Lucas' talk on bridging the gaps that exist on many levels left the audience inspired to come back again next year.

If this blog has made you curious, please visit the nlOUG website to download the handouts from the Tech Experience.

Building OSB 12c releases on resource level using Maven

A while ago I published a Whitebook (in Dutch) about building OSB 12c releases on resource level using Maven. In the Whitebook, you could read which functionality we were missing in the regular Oracle Service Bus plugin for Maven and how we were able to create this functionality using a custom Maven plugin.


I have added the code of this custom Maven plugin to the following public repository: https://bitbucket.org/whitehorsesbv/servicebusplugin

Used settings

The custom Maven plugin has been developed and tested in multiple environments, so we can confirm that it works with the following versions:

Application          Version
Java                 1.7.0_79, 1.8.0_101
Maven                3.3.9
Oracle Service Bus   12.1.3, 12.2.1

Installing the custom Maven plugin

First, install the custom Maven plugin to your local Maven repository. To do this, download both the JAR and the POM file from the download page of the repository. After you have downloaded both files, execute the following commands to install them to your local Maven repository:

mvn install:install-file -Dfile=servicebus-plugin-1.0.jar -DgroupId=nl.whitehorses.servicebus -DartifactId=servicebus-plugin -Dversion=1.0 -Dpackaging=jar
mvn install:install-file -Dfile=servicebus-plugin-1.0.pom -DgroupId=nl.whitehorses.servicebus -DartifactId=servicebus-plugin -Dversion=1.0 -Dpackaging=pom

Building a Service Bus project on resource level

Now that the custom Maven plugin is available in the Maven repository, we are able to build OSB projects using this plugin instead of the default Oracle plugin. To do so, open the pom.xml file in the OSB project directory. By default, the pom.xml file will look something like the below example:

pom.xml

<?xml version="1.0" encoding="UTF-8"?> 
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd" xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> 
	<modelVersion>4.0.0</modelVersion> 
	<parent> 
		<groupId>com.oracle.servicebus</groupId> 
		<artifactId>sbar-project-common</artifactId> 
		<version>12.1.3-0-0</version> 
	</parent> 
	<groupId>nl.whitehorses</groupId> 
	<artifactId>Employee</artifactId> 
	<version>1.0-SNAPSHOT</version> 
	<packaging>sbar</packaging> 
	<description/> 
</project>

If we package the project into an archive using the above pom.xml file, by executing mvn package via the command line, we will get an archive that can be used to deploy the OSB project on project level to the OSB. The built archive can be found in the .data/maven folder within the project directory.

If we change the contents of the pom.xml file to the below example and package the project using the same command, we will get an archive which can be used to deploy on resource level to the OSB. In the below example the following files will be added to the archive:

  • All files within the Business directory.
  • All files within the Pipeline directory, except UpdateEmployee.pipeline.

pom.xml

<?xml version="1.0" encoding="UTF-8"?> 
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd" xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> 
	<modelVersion>4.0.0</modelVersion> 
	<groupId>nl.whitehorses</groupId> 
	<artifactId>Employee</artifactId> 
	<version>1.0-SNAPSHOT</version> 
	<packaging>sbar</packaging> 
	<build> 
		<directory>${project.basedir}/.data/maven</directory> 
		<sourceDirectory>${project.basedir}</sourceDirectory> 
		<resources> 
			<resource> 
				<directory>${project.basedir}</directory> 
			</resource> 
		</resources> 
		<plugins> 
			<plugin> 
				<groupId>nl.whitehorses.servicebus</groupId> 
				<artifactId>servicebus-plugin</artifactId> 
				<version>1.0</version> 
				<extensions>true</extensions> 
				<configuration> 
					<!-- Configure the Oracle Home directory --> 
					<oracleHome>C:\Oracle\Middleware\1221\Oracle_Home</oracleHome> 
					<!-- Specify whether this is a system release --> 
					<system>false</system> 
					<!-- Configure the export level of the release, possible values are PROJECT and RESOURCE --> 
					<exportLevel>RESOURCE</exportLevel> 
					<!-- Optional parameter to specify which of the resources should be included into the archive --> 
					<includes> 
						<include>**/Business/*</include> 
						<include>**/Pipeline/*</include> 
					</includes> 
					<!-- Optional parameter to specify which of the resources should be excluded from the archive --> 
					<excludes> 
						<exclude>**/Pipeline/UpdateEmployee.pipeline</exclude> 
					</excludes> 
				</configuration> 
			</plugin> 
		</plugins> 
	</build> 
</project>

Instead of using the includes and excludes tags, we can also add a resources tag to the pom.xml file, which points towards a configuration file. This can be used to specify which files need to be included and excluded. For example, the below files will result in exactly the same archive as we would get using the includes and excludes tags: 

pom.xml

<?xml version="1.0" encoding="UTF-8"?> 
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd" xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> 
	<modelVersion>4.0.0</modelVersion> 
	<groupId>nl.whitehorses</groupId> 
	<artifactId>Employee</artifactId> 
	<version>1.0-SNAPSHOT</version> 
	<packaging>sbar</packaging> 
	<build> 
		<directory>${project.basedir}/.data/maven</directory> 
		<sourceDirectory>${project.basedir}</sourceDirectory> 
		<resources> 
			<resource> 
				<directory>${project.basedir}</directory> 
			</resource> 
		</resources> 
		<plugins> 
			<plugin> 
				<groupId>nl.whitehorses.servicebus</groupId> 
				<artifactId>servicebus-plugin</artifactId> 
				<version>1.0</version> 
				<extensions>true</extensions> 
				<configuration> 
					<!-- Configure the Oracle Home directory --> 
					<oracleHome>C:\Oracle\Middleware\1221\Oracle_Home</oracleHome> 
					<!-- Specify whether this is a system release --> 
					<system>false</system> 
					<!-- Configure the export level of the release, possible values are PROJECT and RESOURCE --> 
					<exportLevel>RESOURCE</exportLevel> 
					<!-- Optional parameter to specify which of the resources should be included into and excluded from the archive --> 
					<resources>C:\JDeveloper\mywork\Whitehorses\Employee\archiveResources.xml</resources> 
				</configuration> 
			</plugin> 
		</plugins> 
	</build> 
</project>

archiveResources.xml

<?xml version="1.0" encoding="UTF-8"?> 
<resources> 
	<!-- Optional parameter to specify which of the resources should be included into the archive --> 
	<includes> 
		<include>**/Business/*</include> 
		<include>**/Pipeline/*</include> 
	</includes> 
	<!-- Optional parameter to specify which of the resources should be excluded from the archive --> 
	<excludes> 
		<exclude>**/Pipeline/UpdateEmployee.pipeline</exclude> 
	</excludes> 
</resources>

Adding an Assembly project to add multiple projects to single archive

The final step in creating full OSB release archives on resource level was that we wanted to add multiple projects to a single archive. We achieved this by first building every OSB project separately, and then building an Assembly project which combines all projects into a single archive file. To do this, we use the parent pom.xml file to build all projects (including the assembly project). It is important that the assembly project is built last.

The parent pom.xml file, which will build every project, including the assembly project, will look something like the below example:

pom.xml

<?xml version="1.0" encoding="UTF-8"?> 
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd" xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> 
	<modelVersion>4.0.0</modelVersion> 
	<groupId>nl.whitehorses</groupId> 
	<artifactId>Whitehorses</artifactId> 
	<version>1.0-SNAPSHOT</version> 
	<packaging>pom</packaging> 
	<modules> 
		<!-- All OSB projects --> 
		<module>Clockwise</module> 
		<module>Employee</module> 
		<module>Office</module> 
		<!-- Assembly project --> 
		<module>Assembly</module> 
	</modules> 
</project>

The pom.xml file of the Assembly project will look like the below example. It is important that when you add new projects, you not only add them to the parent pom.xml file, but also add them as dependencies to the Assembly project.

pom.xml

<?xml version="1.0" encoding="UTF-8"?> 
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd" xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> 
	<modelVersion>4.0.0</modelVersion> 
	<groupId>nl.whitehorses</groupId> 
	<artifactId>Assembly</artifactId> 
	<version>1.0-SNAPSHOT</version> 
	<dependencies> 
		<dependency> 
			<groupId>nl.whitehorses</groupId> 
			<artifactId>Clockwise</artifactId> 
			<version>1.0-SNAPSHOT</version> 
			<type>sbar</type> 
		</dependency> 
		<dependency> 
			<groupId>nl.whitehorses</groupId> 
			<artifactId>Employee</artifactId> 
			<version>1.0-SNAPSHOT</version> 
			<type>sbar</type> 
		</dependency> 
		<dependency> 
			<groupId>nl.whitehorses</groupId> 
			<artifactId>Office</artifactId> 
			<version>1.0-SNAPSHOT</version> 
			<type>sbar</type> 
		</dependency> 
	</dependencies> 
	<build> 
		<finalName>sbconfig_${project.version}</finalName> 
		<plugins> 
			<plugin> 
				<artifactId>maven-jar-plugin</artifactId> 
				<version>2.6</version> 
				<executions> 
					<execution> 
						<id>default-jar</id> 
						<phase>never</phase> 
					</execution> 
				</executions> 
			</plugin> 
			<plugin> 
				<artifactId>maven-assembly-plugin</artifactId> 
				<version>2.6</version> 
				<configuration> 
					<appendAssemblyId>false</appendAssemblyId> 
					<descriptors> 
						<descriptor>${basedir}/src/main/assembly/assembly.xml</descriptor> 
					</descriptors> 
				</configuration> 
				<dependencies> 
					<dependency> 
						<groupId>nl.whitehorses.servicebus</groupId> 
						<artifactId>servicebus-plugin</artifactId> 
						<version>1.0</version> 
					</dependency> 
				</dependencies> 
				<executions> 
					<execution> 
						<id>make-assembly</id> 
						<phase>package</phase> 
						<goals> 
							<goal>single</goal> 
						</goals> 
					</execution> 
				</executions> 
			</plugin> 
		</plugins> 
	</build> 
</project>

This pom.xml file also points to an assembly.xml file, which contains some configuration about how the Assembly archive should be built by the Maven assembly plugin. This assembly.xml file should look like the below example:

assembly.xml

<assembly xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.3" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.3 http://maven.apache.org/xsd/assembly-1.1.3.xsd"> 
	<id>src</id> 
	<formats> 
		<format>sbar</format> 
	</formats> 
	<includeBaseDirectory>false</includeBaseDirectory> 
	<dependencySets> 
		<dependencySet> 
			<outputDirectory>/</outputDirectory> 
			<unpack>true</unpack> 
		</dependencySet> 
	</dependencySets> 
</assembly>

If you navigate to the directory which contains the parent pom.xml file and execute mvn install via the command line, the OSB release will be built.

For a working example of the above, please download: Building-OSB-12c-releases-on-resource-level-using-Maven.zip

Deploying the archive via Maven

It is also possible to deploy the generated (Assembly) archive directly to an OSB server via the Maven plugin. To achieve this, we should add the deploy-assembly goal of the custom Maven plugin to the desired Maven phase. We chose the pre-integration-test phase for deploying the archive to the OSB server. The pom.xml file in the Assembly project can be updated like the below example to connect the Maven pre-integration-test phase to the deploy-assembly goal of the Maven plugin:

<?xml version="1.0" encoding="UTF-8"?> 
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd" xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> 
	<modelVersion>4.0.0</modelVersion> 
	<groupId>nl.whitehorses</groupId> 
	<artifactId>Assembly</artifactId> 
	<version>1.0-SNAPSHOT</version> 
	<dependencies> 
		<dependency> 
			<groupId>nl.whitehorses</groupId> 
			<artifactId>Clockwise</artifactId> 
			<version>1.0-SNAPSHOT</version> 
			<type>sbar</type> 
		</dependency> 
		<dependency> 
			<groupId>nl.whitehorses</groupId> 
			<artifactId>Employee</artifactId> 
			<version>1.0-SNAPSHOT</version> 
			<type>sbar</type> 
		</dependency> 
		<dependency> 
			<groupId>nl.whitehorses</groupId> 
			<artifactId>Office</artifactId> 
			<version>1.0-SNAPSHOT</version> 
			<type>sbar</type> 
		</dependency> 
	</dependencies> 
	<build> 
		<finalName>sbconfig_${project.version}</finalName> 
		<plugins> 
			<plugin> 
				<artifactId>maven-jar-plugin</artifactId> 
				<version>2.6</version> 
				<executions> 
					<execution> 
						<id>default-jar</id> 
						<phase>never</phase> 
					</execution> 
				</executions> 
			</plugin> 
			<plugin> 
				<artifactId>maven-assembly-plugin</artifactId> 
				<version>2.6</version> 
				<configuration> 
					<appendAssemblyId>false</appendAssemblyId> 
					<descriptors> 
						<descriptor>${basedir}/src/main/assembly/assembly.xml</descriptor> 
					</descriptors> 
				</configuration> 
				<dependencies> 
					<dependency> 
						<groupId>nl.whitehorses.servicebus</groupId> 
						<artifactId>servicebus-plugin</artifactId> 
						<version>1.0</version> 
					</dependency> 
				</dependencies> 
				<executions> 
					<execution> 
						<id>make-assembly</id> 
						<phase>package</phase> 
						<goals> 
							<goal>single</goal> 
						</goals> 
					</execution> 
				</executions> 
			</plugin>
			<plugin>
				<groupId>nl.whitehorses.servicebus</groupId> 
				<artifactId>servicebus-plugin</artifactId> 
				<version>1.0</version> 
				<executions>
					<execution>
						<id>deploy-assembly</id>
						<phase>pre-integration-test</phase>
						<goals>
							<goal>deploy-assembly</goal>
						</goals>
					</execution>
				</executions>
			</plugin>
		</plugins> 
	</build> 
</project>

If we navigate to the directory containing the parent pom.xml file again, we will be able to deploy the release to the OSB server by executing mvn pre-integration-test. The following parameters can (and in some cases must) be added to this command:

Parameter                                       Required?   Default value
-Dserver.url                                    Yes         -
-Dserver.username                               Yes         -
-Dserver.password                               Yes         -
-Ddeployment.preserve.credentials               No          true
-Ddeployment.preserve.envValues                 No          true
-Ddeployment.preserve.operationalValues         No          true
-Ddeployment.preserve.securityAndPolicyConfig   No          true
-Ddeployment.preserve.accessControlPolicies     No          true
-Ddeployment.customization.file                 No          -
-Ddeployment.session.activate                   No          true
-Ddeployment.session.discardOnError             No          true

These settings represent the "Advanced Settings" you see when manually uploading an archive via the Service Bus console.
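
For illustration, a deployment call could then look something like the following; the server URL and credentials are placeholders for your own environment, and the optional parameters fall back to the defaults listed above when omitted:

mvn pre-integration-test -Dserver.url=t3://localhost:7001 -Dserver.username=weblogic -Dserver.password=welcome1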

Good luck building your OSB releases on resource level using Maven!

Using XPath functions in the BPEL Process manager

When using XPath expressions in XSLT, you may want to use one of the following functions:

  • getProperty(propertyName as string)
  • setCompositeInstanceTitle(titleElement)
  • getComponentInstanceID()
  • getComponentName()
  • getCompositeInstanceID()
  • getCompositeName()
  • getECID()

These functions are available in the Oracle Mediator XSLT mapper only, not in Oracle BPEL Process Manager. In Oracle BPEL Process Manager, you have to use an assign activity instead.

Oracle JavaOne 2016

This blog shows some of the highlights of JavaOne this year. The main keywords were: Internet of Things, microservices, Big Data and Java 9.

I spent most of my time following JavaOne sessions, because I am more a technical than a functional programmer.
As an integration developer my main interest was in microservices, and next to that in Java 9 and what is coming in that release.

JavaOne announcements:

  • Oracle has given NetBeans to the Apache Foundation. The transfer has not been completed yet.
    This gives NetBeans the opportunity to grow into a greater IDE for developers, with more plugins.
  • Java 9 has 58 new features. The biggest are Jigsaw and JShell.
  • Java EE 8 introduces four new modules: Security, JSON-B, Health Check and Configuration.
  • The launch of Go Java which, like the Oracle Developer Gateway, is a starting point for Java developers.
  • IBM is bringing its SDK to the open source community.

The things I'm excited about are:

  • Building a smaller JRE from an existing JDK 9, so you can run your software on a JRE with only the libraries that you need.
  • Using hypermedia with REST, you can keep the whole business logic on the server side.
  • Using Swagger you can develop contract-first REST, and front-end and back-end can be programmed in parallel using a mocked REST service based on the contract.
    Swagger can also generate boilerplate code from the contract. The contract can be written in YAML or JSON.
  • NetBeans going to the Apache Foundation gives developers the opportunity to join and contribute to the project, for example by building more plugins.
  • Native REST support in Java EE 7 with JSON-B.
  • Microservices and the concepts behind them: scalability, API management, and every service having one specific responsibility and its own dataset.
  • Mission Control and Flight Recorder, which will be delivered with JDK 9.

For more info on JavaOne, see the Oracle Java Blogs.

Oracle OpenWorld 2016

Last week I was at Oracle OpenWorld for the first time. It was a great experience to see how big Oracle really is.


Next to Oracle OpenWorld, Oracle organized JavaOne, the Java conference. I was there as well.

In this blog I want to focus on the highlights of Oracle OpenWorld; in my other blog I will show you the highlights of JavaOne.

All this information is covered by Oracle's "safe harbor statement".

The big keywords of this year were: lift and shift, 82% growth in the last year, the fastest-growing Cloud company, and the Cloud on premises.

  • Oracle has introduced the Oracle Developer Gateway, an entrance for developers to everything that Oracle has to offer them.
  • Oracle has announced the new version of the Oracle Database, 12c Release 2. This database is the same as the one in the Cloud, so you can easily lift and shift your on-premises database to the Cloud.
  • During its keynotes Oracle introduced a lower-cost Cloud solution, the Exadata Cloud Service, starting at 175 dollars a month; it runs on the same hardware and software.
  • If, for some reason, you need the Cloud solution on premises, Oracle can deliver it with the same hardware and software in your own datacenter.
  • Oracle now offers Cloud Containers, with which you can run your Docker instances directly on the Oracle Cloud.
  • Oracle offers a strategy for migrating your on-premises databases, Java EE applications and Weblogic clusters to the Cloud.
  • Oracle offers Big Data solutions and Machine Learning in the Cloud.
  • For security, Oracle offers the Security Monitoring Analytics Cloud service, with which Oracle investigates and monitors the behavior of internal and external traffic. Based on that, it will detect weaknesses and attacks on your Cloud.
  • Oracle showed in one of its keynotes that the Oracle Database can run everywhere, on the cloud solutions of Amazon and Microsoft alike.
  • Oracle showed that, based on benchmarks, its Cloud solutions run faster than those of its competitors, which means that with the pay-what-you-use payment model you can get more value for your money.

For more details on Oracle OpenWorld, watch the keynotes at Oracle OpenWorld 2016 Highlights.

Oracle Process Cloud Service advanced form validation and control

In my previous blogpost I explained how forms in PCS are created and what basic options are available. Now I would like to go deeper into the details of rules and business objects.


Let’s start with rules.

The form canvas has a row of tiny buttons in the top right. The left one switches between rules and the form itself. Click that and the canvas changes to the list of rules (if already created). A new rule can be created here too. The image below shows what you get to see.

.pcs-formrules

This might be different than you expected: just a name, a description and a text area for the rule. I hope you like JavaScript, because that is exactly what you are going to use to write the rules. I guess Oracle has embraced JavaScript as the language of choice for rules and expressions, as it was also introduced in SOA Suite 12.2.1 for BPEL assignments and Service Bus expressions.

Referencing the items on your page has been made easy. Click to open the Form Outline in the left menu, and expand the nodes to view which items you have available and their properties. You can also click the icon next to the attribute to copy the name to your clipboard.

Now back to our original idea; hide and show elements based on a checkbox value, and make sure at least one value in the contact info is entered.

It took me some time to understand the way the checkboxes are defined. The default value for “Options” (which describe the number of checkboxes and their respective values) is something like “Option_1=Yes”. Referring to the checkbox as cc_c[0]=”Yes” did not work though. I finally figured it out when replacing the Option field with “Yes”. PCS replaced it with “Yes=Yes” and then it worked for me. Ok.

pcs-checkbox-options

The definitive rule contains this JavaScript code:

if (cc_c[0].value == 'Yes') {
  CI.visible = true;
} else {
  CI.visible = false;
}

cc_c is the name of the checkbox. CI is the name of the group that contains both email and phone number fields. Easy! Although I really wonder if “citizen developers” will agree with me.

How can we add validations with these rules? PCS has chosen a rather rudimentary option. You can add a "Message" element that has different type options (info, warning, error, etc.) and contains the text of your choice. Let's say I create a Message element tplMsg with a descriptive text.

Enter the following JavaScript validation:

if ((cc_c[0].value == 'Yes') && ( Phone.value !== '' || YourEmail.value !== '' )) {
  tplMsg.visible = false;
} else {
  tplMsg.visible = true;
}

pcs-validationwarning

Next stop: creating business objects for dynamic control and service-based functionality in forms.

Creating and editing forms in Process Cloud Service

For whom is PCS?

Oracle Process Cloud Service is a PaaS (Platform as a Service) offering which centers around designing and managing (stand-alone) business processes in the cloud. The focus is once again on business analysts and other non-technical people automating business processes; Oracle has dubbed them "citizen developers". One of the features is the option to create forms that fulfill the "human task" interaction in the BPM process.

A walkthrough for creating a form

For the sake of ease, log in to the Process Cloud BPM Composer. Create a new app and choose "Quickstart App". Select the Travel Approval.

pcs-newprocess

The process is displayed. Drag a human task type “Submit” into the process.
pcs-bpmaddtask

Now click on the app and a blue “hamburger icon” appears. When you click on it the detail view is displayed:

pcs-createform

Now press the + icon to go to the Form page. Creating a new input form is childishly easy: drag and drop the elements of choice onto the canvas and enter the name, description and other basic properties. It's all very straightforward.

What is more interesting is the option to add more advanced validations and computations, populate dynamic dropdowns and add all kinds of dynamic validations.

As you can see, I've created a simple form with a text area, a True/False field (a simple checkbox), and a phone and email field. The idea is to hide the contact options when the checkbox is blank, and to validate the input fields to make sure either Phone, Email or both are entered. But let's start with simple validations.

pcs-formcanvas

In the left pane are a number of predefined form fields like Phone, Email or Date. When you select “Phone”, the settings show options for required, maximum length and pattern. Enter 12 in the Max Length and the field will not accept more than 12 characters. Enter a pattern and the field is validated against the pattern on losing focus. Apparently the “standard” regex patterns may be used here. The effect is that, when the pattern is not matched, an inline error message is displayed.
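
As an illustration (this is a hypothetical pattern, not something PCS prescribes), a phone field like the one above could use a pattern such as:

^[0-9+ ]{1,12}$

which accepts up to 12 characters consisting of digits, spaces and a plus sign.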

Another nice touch is that you can choose a "decorator": a small icon in the field that gives the user a visual clue about the field's purpose. The icons come from the Font Awesome icon library.

During the development phase, press the Preview button on the top right of the canvas (an icon of a running person), and a working preview is presented to you. Very helpful for testing the layout and validations you created.

Saving, deploying and testing

One more thing: the save button is located at the top of the page, in the center. An awkward position for a save button, if you ask me. Next to the save button is a "Publish" button, which makes the application available to other developers. You might say it's a push to the versioning system. Here you also find the option to create a "snapshot", probably useful for reverting back to in case you messed something up (which is bound to happen, as there is no undo!).

I was not able to deploy at first: I received a message that the credentials were not set up correctly. The fix was relatively easy: go to the workspace homepage, choose "Administration" in the top bar, and click on the green "Player" icon. Make sure you've entered your credentials there. It should work now.
Deploy your application by choosing "Test" in the top right and clicking "Deploy". Go get coffee. Click play, and a new page appears with your application. Double-click and a new tab opens with a visual representation of your process. Click the starting point (it has a cleverly concealed play button), and the process starts. The first human task form will display immediately. You can now walk through the process. Congratulations, you've successfully created your first PCS process!

There is much more to see, for instance the option to create complex validations with JavaScript and to model business objects for interaction with other services. More on that in a subsequent article.

Use Oracle JET to monitor Weblogic queues

At a customer we are working in an Oracle Fusion Middleware environment. In this environment we created queues, which are configured with an error queue for messages that could not be delivered. In this blog post I will show how to monitor these error queues from an Oracle JET application.

Because there are a number of error queues, monitoring them is a lot of work: you have to check each queue separately. Recently I ran into this blog post by Frank Munz (link). In it he talks about the Jolokia framework, which offers remote JMX access. More information about Jolokia can be found here.

With Jolokia you can request Weblogic MBean data and receive the response as JSON. So when I read this, I thought: this is the solution for monitoring the error queues.

What I did is the following:

First, install Jolokia as an application. From the website you can download a WAR, which can be deployed as an application in Weblogic. If you have an environment with managed servers and you want to read data from those servers, you must also target the WAR to the managed servers.

Then I created an Oracle JET application. In this application I made a jQuery.ajax call to the REST service for the error queue. This is the URL: http://<host>:<port>/jolokia/read/com.bea:JMSServerRuntime=<JmsServerName>,Name=<JMS ModuleName>!!<JmsServerName>@<QueueName>,ServerRuntime=<serverName>,Type=JMSDestinationRuntime/MessagesCurrentCount

In the response you can read the current number of messages on the queue via the MessagesCurrentCount attribute. Add the value to an array and you can use that array in a chart in the JET application.
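
As a minimal sketch of what that call could look like in the JET application (assuming jQuery is available; the host, port, JMS server, module and queue names below are placeholders):

var queueDepths = [];

function loadQueueDepth(queueLabel, jolokiaUrl) {
  $.ajax({
    url: jolokiaUrl,
    dataType: 'json'
  }).done(function (response) {
    // Jolokia wraps the requested MBean attribute in the 'value' property of the JSON response
    queueDepths.push({ name: queueLabel, count: response.value });
  }).fail(function () {
    console.log('Could not read queue depth for ' + queueLabel);
  });
}

// Example call; the URL follows the pattern described above
loadQueueDepth('ErrorQueue1',
  'http://myhost:7001/jolokia/read/com.bea:JMSServerRuntime=MyJmsServer,' +
  'Name=MyJmsModule!!MyJmsServer@ErrorQueue1,ServerRuntime=ms1,' +
  'Type=JMSDestinationRuntime/MessagesCurrentCount');

The resulting queueDepths array can then be bound to a JET chart component.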

You can see the result in the next image:

blog_queue

In this image you can see the number of messages on all the different queues in one view, instead of having to monitor each queue independently.

With the many MBeans available in Weblogic and the help of Jolokia, you can of course visualize almost anything in Oracle JET!

Creating FTP connection factories using WLST

Creating FTP connection factories using the Weblogic console can take a lot of time, mainly because there are multiple screens you have to click through every time. But creating these FTP connection factories can be done a lot easier and faster using a WLST script.


The following script can be used as a base for your script to generate your FTP connection factories. I have used this script on a Weblogic 12c environment.

# Import sys explicitly so sys.exc_info() is available in the error handlers below
import sys

# 
# Script settings
#
appName = 'FtpAdapter'

moduleOverrideName = appName + '.rar'
moduleDescriptorName = 'META-INF/weblogic-ra.xml'

soaHome = '/u01/Oracle/Products/Middleware/soa'
appPath = soaHome + '/soa/connectors/' + moduleOverrideName
planPath = soaHome + '/soa/FTPAdapterPlan.xml'

#
# Connect to Weblogic server
#
print('Connecting to local weblogic domain... ')
username = raw_input('Enter username: ')
password = raw_input('Enter password: ')
connect(username,password,'t3://localhost:7001')

#
# Method to insert variable to deployment plan
#
def makeDeploymentPlanVariable(wlstPlan, name, value, xpath, origin='planbased'):
    try:
        while wlstPlan.getVariableAssignment(name, moduleOverrideName, moduleDescriptorName):
            wlstPlan.destroyVariableAssignment(name, moduleOverrideName, moduleDescriptorName)
        variableAssignment = wlstPlan.createVariableAssignment(name, moduleOverrideName, moduleDescriptorName)
        variableAssignment.setXpath(xpath)
        variableAssignment.setOrigin(origin)
        wlstPlan.createVariable(name, value)
    except:
        print 'Error during makeDeploymentPlanVariable: ', sys.exc_info()[0]

#
# Update property for FTP adapter
#
def updatePropertyForFtpAdapter(deploymentPlan, jndiName, propertyName, propertyValue):
    try:
        shortJndiName = jndiName.split('/')[2]
        makeDeploymentPlanVariable(deploymentPlan, 'ConfigProperty_' + propertyName + '_Value_' + shortJndiName, propertyValue, '/weblogic-connector/outbound-resource-adapter/connection-definition-group/[connection-factory-interface="javax.resource.cci.ConnectionFactory"]/connection-instance/[jndi-name="' + jndiName + '"]/connection-properties/properties/property/[name="' + propertyName + '"]/value', moduleOverrideName)
    except:
        print 'Error during updatePropertyForFtpAdapter: ', sys.exc_info()[0]

#
# Method to create new FTP connection factory
#
def createFTPConnectionFactory(jndiName, type, host, port, securePort, username, password, walletLocation='', walletPassword=''):
    try:
        newPlan = loadApplication(appPath, planPath)
        makeDeploymentPlanVariable(newPlan, 'ConnectionInstance_' + jndiName + '_JNDIName', jndiName, '/weblogic-connector/outbound-resource-adapter/connection-definition-group/[connection-factory-interface="javax.resource.cci.ConnectionFactory"]/connection-instance/[jndi-name="' + jndiName + '"]/jndi-name', moduleOverrideName)
        
        if(type == 'sftp'):
            updatePropertyForFtpAdapter(newPlan, jndiName, 'UseFtps', 'false')
            updatePropertyForFtpAdapter(newPlan, jndiName, 'UseSftp', 'true')
        elif(type == 'ftps'):
            updatePropertyForFtpAdapter(newPlan, jndiName, 'UseFtps', 'true')
            updatePropertyForFtpAdapter(newPlan, jndiName, 'UseSftp', 'false')
        else:
            updatePropertyForFtpAdapter(newPlan, jndiName, 'UseFtps', 'false')
            updatePropertyForFtpAdapter(newPlan, jndiName, 'UseSftp', 'false')
        
        updatePropertyForFtpAdapter(newPlan, jndiName, 'Host', host)
        updatePropertyForFtpAdapter(newPlan, jndiName, 'Username', username)
        updatePropertyForFtpAdapter(newPlan, jndiName, 'Password', password)
        updatePropertyForFtpAdapter(newPlan, jndiName, 'Port', port)
        updatePropertyForFtpAdapter(newPlan, jndiName, 'SecurePort', securePort)
        updatePropertyForFtpAdapter(newPlan, jndiName, 'WalletLocation', walletLocation)
        updatePropertyForFtpAdapter(newPlan, jndiName, 'WalletPassword', walletPassword)
        
        newPlan.save();
        save();
    except:
        print 'Error during createFTPConnectionFactory: ', sys.exc_info()[0]

#
# Create FTP connection factories
#
try:
    edit()
    startEdit()
    
    createFTPConnectionFactory('eis/Ftp/ftp_server1', 'ftp', '127.0.0.1', '21', '21', 'username', 'password')
    createFTPConnectionFactory('eis/Ftp/sftp_server1', 'sftp', 'localhost', '22', '22', 'username', 'password')
    createFTPConnectionFactory('eis/Ftp/ftps_server1', 'ftps', 'localhost', '990', '990', 'username', 'password', '/location/to/pkcs/wallet/ftps_wallet.p12', 'wallet_password')
    
    print 'Updating and restarting application...'
    cd('/AppDeployments/FtpAdapter/Targets');
    updateApplication(appName, planPath);
    startApplication(appName)
    
    print 'Done with changes. Calling activate...'
    activate()
except:
    print 'Unexpected error: ', sys.exc_info()[0]
    dumpStack()
    raise

Save the script as create_ftp_connection_factories.py on the Weblogic server and execute the following statement:

$ORACLE_HOME/oracle_common/common/bin/wlst.sh create_ftp_connection_factories.py

At the moment I have only implemented the createFTPConnectionFactory method to configure the Host, Username, Password, Port, SecurePort, WalletLocation and WalletPassword properties. However, it is very easy to edit other properties via this script as well.

To view all properties of an existing FTP connection factory, open the Weblogic console and navigate to Deployments > FtpAdapter > Configuration > Outbound Connection Pools. Extend javax.resource.cci.ConnectionFactory and select the desired connection factory. You will immediately see 8 pages full of properties. With some minor modifications to the WLST script, it is very easy to edit any of these properties. We just have to add a call to updatePropertyForFtpAdapter within the createFTPConnectionFactory method, and provide the application plan, the JNDI name of the connection factory we want to edit, the property name (as shown in the Weblogic console) and the desired property value.

For example, if we want to use implicit SSL, we can see the property UseImplicitSSL via the Weblogic console, which is false by default. To make this property value true by default, we just have to add the following line before the newPlan.save() statement.

updatePropertyForFtpAdapter(newPlan, jndiName, 'UseImplicitSSL', 'true')

I hope this will save you a lot of time when configuring your FTP connection factories. If you have any problem with this script, please let me know.

Sources:

Oracle SOA Suite 11g – Creating Resource Adapter Connection factories through WLST for Database Adapter,MQ Adapter and FTP Adapter
