Doing more with the WebLogic Maven Plugin (Robot, Selenium and Sonar too!)

The ‘new and improved’ WebLogic Maven Plugin in WebLogic Server 12.1.2 provides a bunch of additional capabilities above and beyond your straight compile/package/deploy operations.  In this post I wanted to take you on a journey of some of these additional capabilities and explore how we might use them to automate testing of a project.

We are going to use a somewhat contrived example of a simple web application, so the capabilities we look at will probably seem like overkill for this project.  But they make a lot more sense for a larger project, and I am sure you can ‘scale up’ the sample and imagine how things would work.

What we are going to do is take an application, and set it up so that our build process does the following:

  • compile the code
  • run unit tests
  • package it into a WAR
  • install WebLogic Server
  • create a simple domain
  • start the Admin Server
  • deploy the application onto the Admin Server
  • run a set of integration tests against the application
  • stop the Admin Server
  • run quality checks on our application

The integration tests and quality checks are not actually going to be done with the WebLogic Maven Plugin directly, but I am including them anyway to make the example a bit more realistic.

Let’s get started by creating a simple web application from the WebLogic Basic Web Application Maven Archetype.  I am assuming that you already have a Maven repository that you have populated with the artifacts from your WebLogic Home.  If you don’t, see this post for details on how to do that.

Note: If you prefer, you can get the project by cloning my ci-samples git repository (git://java.net/ci4fmw~ci-samples) and looking at the my-webapp directory.

mvn archetype:generate
    -DarchetypeGroupId=com.oracle.weblogic.archetype
    -DarchetypeArtifactId=basic-webapp
    -DarchetypeVersion=12.1.2-0-0
    -DarchetypeRepository=local
    -DgroupId=com.redstack
    -DartifactId=my-webapp
    -Dversion=1.0-SNAPSHOT

Great!  Now let’s set up our POM file to carry out the steps we want. We are going to do this by adding ‘execution’ entries for the weblogic-maven-plugin to our POM’s ‘build’ section.
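
As a reminder, all of the execution entries we add below live inside a weblogic-maven-plugin declaration in the POM’s build section, along the lines of this sketch (the coordinates match the ones used later in this post):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>com.oracle.weblogic</groupId>
      <artifactId>weblogic-maven-plugin</artifactId>
      <version>12.1.2-0-0</version>
      <executions>
        <!-- the execution entries shown below go here -->
      </executions>
    </plugin>
  </plugins>
</build>
```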

Here’s what we need to add to the POM to install WebLogic Server.  Let’s assume that we have the installer (wls_121200.jar) available in a well-known location on our machine.

<!-- install weblogic -->
<execution>
  <id>install-wls</id>
  <phase>pre-integration-test</phase>
  <goals>
    <goal>install</goal>
  </goals>
  <configuration>
    <installDir>c:/dev/wlshome</installDir>
    <artifactLocation>c:/dev/wls_121200.jar</artifactLocation>
    <installCommand>@JAVA_HOME@/bin/java -Xms512m -Xmx1024m -jar @INSTALLER_FILE@ -silent -response ${basedir}/misc/wls_response.txt</installCommand>
  </configuration>
</execution>

So, this will execute the weblogic-maven-plugin:install goal during the pre-integration-test phase of our build, i.e. after we have compiled and packaged our web app.  We need to tell the weblogic-maven-plugin where to install WebLogic Server (installDir), where the installer is (artifactLocation) and, since we are using the JAR installer, we need to give it the installCommand as well, to tell it how we want WebLogic Server installed.  If we use the ZIP installer, we don’t need to give it installCommand – it knows how to install from the ZIP installer.
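
For comparison, a hypothetical ZIP-installer configuration would look like this – note there is no installCommand.  The ZIP file name shown here is an assumption, so adjust it to match your actual download:

```xml
<!-- install weblogic from the ZIP installer -->
<execution>
  <id>install-wls</id>
  <phase>pre-integration-test</phase>
  <goals>
    <goal>install</goal>
  </goals>
  <configuration>
    <installDir>c:/dev/wlshome</installDir>
    <!-- hypothetical ZIP installer file name -->
    <artifactLocation>c:/dev/wls1212_dev.zip</artifactLocation>
  </configuration>
</execution>
```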

In this example we are using a silent install with a response file. Here are the contents of the response file – note that it has the ORACLE_HOME in it:

[ENGINE]

#DO NOT CHANGE THIS.
Response File Version=1.0.0.0.0

[GENERIC]

#The oracle home location. This can be an existing Oracle Home or a new Oracle Home
ORACLE_HOME=C:\dev\wlshome

#Set this variable value to the Installation Type selected. e.g. WebLogic Server, Coherence, Complete with Examples.
INSTALL_TYPE=Complete with Examples

#Provide the My Oracle Support Username. If you wish to ignore Oracle Configuration Manager configuration provide empty string for user name.
MYORACLESUPPORT_USERNAME=

#Provide the My Oracle Support Password
MYORACLESUPPORT_PASSWORD=

#Set this to true if you wish to decline the security updates. Setting this to true and providing empty string for My Oracle Support username will ignore the Oracle Configuration Manager configuration
DECLINE_SECURITY_UPDATES=true

#Set this to true if My Oracle Support Password is specified
SECURITY_UPDATES_VIA_MYORACLESUPPORT=false

#Provide the Proxy Host
PROXY_HOST=

#Provide the Proxy Port
PROXY_PORT=

#Provide the Proxy Username
PROXY_USER=

#Provide the Proxy Password
PROXY_PWD=

#Type String (URL format) Indicates the OCM Repeater URL which should be of the format [scheme[Http/Https]]://[repeater host]:[repeater port]
COLLECTOR_SUPPORTHUB_URL=

This will get WebLogic Server installed for us.  The next thing we want to do is create a domain.  The weblogic-maven-plugin:create-domain goal will create a ‘simple’ domain for us.  If you want to create a more complex domain, you should use the wlst goal and write a WLST script to create the domain.

For the simple domain, all we need to tell it is where WebLogic Server is installed (middlewareHome), where we want the domain (domainHome) and the administration user and password for the new domain.

Again, we have this execute in the pre-integration-test phase.  When there are multiple executions in the same phase like this, Maven will execute them in the order they appear in the POM file, so we place this one after the install execution.

<!-- Create a domain -->
<execution>
  <id>wls-create-domain</id>
  <phase>pre-integration-test</phase>
  <goals>
    <goal>create-domain</goal>
  </goals>
  <configuration>
    <middlewareHome>c:/dev/wlshome</middlewareHome>
    <domainHome>${project.build.directory}/base_domain</domainHome>
    <user>weblogic</user>
    <password>welcome1</password>
  </configuration>
</execution>

Now we need to start up our server.  We can do this using the weblogic-maven-plugin:start-server goal, passing in the domainHome:

<!-- start a server -->
<execution>
  <id>wls-wlst-start-server</id>
  <phase>pre-integration-test</phase>
  <goals>
    <goal>start-server</goal>
  </goals>
  <configuration>
    <domainHome>${project.build.directory}/base_domain</domainHome>
  </configuration>
</execution>

Now that we have a running server, we are ready to deploy our application.  We do this using the weblogic-maven-plugin:deploy goal.  We need to give it the server details (adminurl defaults to t3://localhost:7001, so that will work for us) and the location of the application we want to deploy (source).

<!--Deploy the application to the server-->
<execution>
  <id>deploy</id>
  <phase>pre-integration-test</phase>
  <goals>
    <goal>deploy</goal>
  </goals>
  <configuration>
    <!--The admin URL where the app is deployed. Here use the plugin's default value t3://localhost:7001-->
    <!--adminurl>${oracleServerUrl}</adminurl-->
    <user>weblogic</user>
    <password>welcome1</password>
    <!--The location of the file or directory to be deployed-->
    <source>${project.build.directory}/${project.build.finalName}.${project.packaging}</source>
    <!--The target servers where the application is deployed. Here use the plugin's default value AdminServer-->
    <!--targets>${oracleServerName}</targets-->
    <verbose>true</verbose>
    <name>${project.build.finalName}</name>
  </configuration>
</execution>

Once we have our application deployed, we are ready to run our integration tests.  Those would run in the integration-test phase, but let’s come back to them in a moment.

First, let’s handle the server shutdown.  We will do this in the post-integration-test phase, after all of our tests have run.  We can use the weblogic-maven-plugin:stop-server goal to shut down the server.  We need to tell it the domainHome and the server details:

<!-- stop the server -->
<execution>
  <id>wls-wlst-stop-server</id>
  <phase>post-integration-test</phase>
  <goals>
    <goal>stop-server</goal>
  </goals>
  <configuration>
    <domainHome>${project.build.directory}/base_domain</domainHome>
    <user>weblogic</user>
    <password>welcome1</password>
    <adminurl>t3://localhost:7001</adminurl>
  </configuration>
</execution>

Ok, so now we have most of our steps implemented, but to make this more useful and realistic, let’s add some integration tests and quality checking to the project.  We are going to use Robot Framework and Selenium for the integration tests, and Sonar with PMD for the quality checks.

If you are not familiar with Robot Framework, you might like to read over this earlier post.

To add Robot Framework to our project, we just need to add a dependency and an execution to our POM.  Here is what we need:

<plugin>
  <groupId>org.robotframework</groupId>
  <artifactId>robotframework-maven-plugin</artifactId>
  <version>1.1</version>
  <executions>
    <execution>
      <id>robot-test</id>
      <phase>integration-test</phase>
      <goals>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
</plugin>

We will also need to create the directories to store our test cases and libraries.  We need the following:

project root
- src
  - test
    - robotframework
      - acceptance
    - resources
      - robotframework
        - libraries

We want to use the Selenium2Library.  Here is the dependency to add to your POM to get this library and the necessary dependencies:

<dependency>
  <groupId>com.github.markusbernhardt</groupId>
  <artifactId>robotframework-selenium2library-java</artifactId>
  <version>1.4.0.0</version>
  <classifier>jar-with-dependencies</classifier>
</dependency>

Now we are ready to write our test cases.  We will keep these in the acceptance directory.  To keep things simple, we will just write one test case that will open our application and enter some data into the form, press the button and check the output.

Here is what we need:

*** Settings ***
Library  Selenium2Library

*** Test Cases ***
Quick Test
   Open Browser                        http://localhost:7001/basicWebapp/index.xhtml
   Page Should Contain                 Please Enter Your Account Name and Amount
   Page Should Contain Textfield       j_idt10:name
   Page Should Contain Textfield       j_idt10:amount
   Page Should Contain Button          j_idt10:j_idt18
   Input Text                          j_idt10:name      Bob
   Input Text                          j_idt10:amount    25.00
   Click Button                        j_idt10:j_idt18
   Wait Until Page Contains            The money have been deposited to Bob, the balance of the account is 25.0
   Close Browser

Now for testing the code quality.  We will use Sonar for that.  You can download the community edition from here.

Once you have that downloaded, just unzip it, and then go start it up with this command (adjust for your OS):

bin\windows-x86-64\StartSonar

You can access the console at http://localhost:9000
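
The Sonar Maven integration reads the server location from the sonar.host.url property, which defaults to http://localhost:9000, so no POM changes are needed for a local install like ours.  If your Sonar server lived elsewhere, you could point the analysis at it with a property like this (the host name here is just an example):

```xml
<properties>
  <sonar.host.url>http://sonar.example.com:9000</sonar.host.url>
</properties>
```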

To run sonar on our project, you can just specify the sonar:sonar goal on the command line.  So to run the whole thing, we do the following:

mvn verify sonar:sonar

Now you can sit back and watch as WebLogic Server is installed, the domain is created, our application is built and deployed, tested, and checked for quality.

At the end of all this, we can look at the test report, which is located in target/robotframework-reports/report.html (open it with your browser).

[Screenshot: Robot Framework test report]

You can click on the Quick Test link to see details of the test.

[Screenshot: Quick Test details in the Robot Framework report]

And to check our quality, go to the sonar console (refresh it if you had it open already) and click on our project (my-webapp) to see details.  You will see there are two major problems.  Click on the Major link to see them.  Then click on the AccountBean to see the problems in the context of the actual code.

Here you can see we are using an implementation type HashMap when we should have been using an interface (Map).

[Screenshot: Sonar showing the HashMap issue in AccountBean]

Obviously, this is a very trivial example, but hopefully you get the idea, and see what you could do with a much larger code base!


Setting up a Maven Repository Manager with Artifactory

As you start to work more with Maven, you will undoubtedly reach the point where you feel the need for a repository manager – something that manages all of your repositories of artifacts for you.  You will likely have a few repositories for different types of artifacts that you care about.

There are a few repository managers out there, with varying capabilities, and some with commercial and free/community versions.  In this post, we will use the community version of Artifactory from JFrog.

So the scenario here is that we are going to set up a binary repository to do three things:

  • act as a mirror/proxy of Maven Central
  • hold our work in progress project builds (snapshots)
  • hold our finished products and other dependencies

We will set up Artifactory with three repositories for these three uses, named:

  • repo1
  • snapshot
  • internal

Let’s get started by downloading Artifactory from here.

You can just download the zip file, then unzip it, and start up Artifactory by running the command:

<ARTIFACTORY_DIRECTORY>/bin/artifactoryctl start

You will need to replace <ARTIFACTORY_DIRECTORY> with the path to the directory where you unzipped Artifactory.

Artifactory will start up and should be accessible at:

http://localhost:8081/artifactory

The first thing you will want to do is log in as the admin user (the password is password).  Then click on the Admin tab to go to the administration area.  If you have to go through an HTTP proxy to get to Maven Central, you will need to define the proxy server details.  You can do this in the Proxies page – click on Proxies in the left hand menu.

Then click on the New button, give the proxy a name (key) and fill out the server, port and any other details needed to use the proxy server.

Next, you need to go set the repo1 mirror to use this proxy server.  To do this, go to the Repositories page, find repo1 in the Remote Repositories section, highlight it, click on Edit in the popup menu, go to the Advanced Settings tab, and in the Network section, choose the proxy you just defined from the dropdown list.

If you do not need to use a proxy server, you can continue from this point.

If you are not already there, click on the Admin tab and then the Repositories option in the left menu.

In the Local Repositories section (at the top), click on the New button and create a new repository called internal.  Select the option to Handle Releases, but do not select Handle Snapshots.

Create another new local repository called snapshot, select the option to Handle Snapshots, but not Handle Releases.

So we have our repositories set up.  Now we are going to want to create a Maven settings file to use them.

First, let’s create a master password so that we can store encrypted passwords in our Maven settings file.  Run the following command:

mvn -emp some_password

Change ‘some_password’ to a password of your own choosing.  Copy the output of this command and put it into a new file in $HOME/.m2/settings-security.xml as follows:

<settingsSecurity>
  <master>{vF59g61kOG4H/CNqjBmDav77Ne3tm1MChkIWwffySSM=}</master>
</settingsSecurity>

Note that your encrypted password will be different from the one shown here.

Now you can encrypt passwords for accessing servers, like our Artifactory server.  You will need to authenticate to publish new artifacts into the internal and snapshot repositories.  By default, the admin user has the right to publish to these repositories, so you can encrypt the admin user’s password (which is password) as follows:

mvn -ep password

Copy the output of this command, we will need it later on.

Let’s take a look at the Maven settings file that lets us use our Artifactory server:

<settings>
  <profiles>
    <profile>
      <id>default</id>
      <repositories>
        <repository>
          <id>internal</id>
          <name>internal</name>
          <url>http://localhost:8081/artifactory/internal</url>
          <releases>
            <enabled>true</enabled>
          </releases>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
        </repository>
        <repository>
          <id>snapshot</id>
          <name>snapshot</name>
          <url>http://localhost:8081/artifactory/snapshot</url>
          <releases>
            <enabled>false</enabled>
          </releases>
          <snapshots>
            <enabled>true</enabled>
            <updatePolicy>always</updatePolicy>
          </snapshots>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>default</activeProfile>
  </activeProfiles>
  <mirrors>
    <mirror>
      <id>mirror</id>
      <name>mirror</name>
      <url>http://localhost:8081/artifactory/repo1</url>
      <mirrorOf>central</mirrorOf>
    </mirror>
  </mirrors>
  <servers>
    <server>
      <id>internal</id>
      <username>admin</username>
      <password>{+Ghh1+ORiAIHJ1XKXPEorNjGuMSWUgvXZJYgOaEEncY=}</password>
    </server>
    <server>
      <id>snapshot</id>
      <username>admin</username>
      <password>{+Ghh1+ORiAIHJ1XKXPEorNjGuMSWUgvXZJYgOaEEncY=}</password>
    </server>
  </servers>
</settings>

So let’s review what we have in this file.  First we have a profile, called default, which sets up the repositories and tells Maven which ones handle snapshots and releases.  Then we activate that profile.

Then we have a mirror entry to tell Maven where to look for any artifact it wants from central.

Finally, we have the servers section which is used to provide credentials for accessing servers.  In this case we are providing the encrypted admin user password so that we can publish to these repositories.

Save this file in $HOME/.m2/settings.xml and we are ready to test it out!

To test the mirror, you can use the following command:

mvn help:describe

Ultimately this command will produce an error message, because we have not provided enough arguments, but before that happens, it will need to go download a bunch of artifacts from central, and to do that, it will request them from your mirror.  After this command has started running, you can browse the repo1 repository in the Artifactory web interface and you will see it starting to fill up with the various artifacts that Maven needs to execute that command.

So, that tells us that our mirror is working.  Next, let’s try to publish a release artifact to the internal repository.  You could use the Maven Synchronization Plugin as an example.  This will be sitting in your WebLogic Server installation, under $ORACLE_HOME/oracle_common/plugins/maven/com/oracle/maven/oracle-maven-sync/12.1.3

You can install it into the internal repository using this command:

mvn deploy:deploy-file
    -Dfile=$ORACLE_HOME/oracle_common/plugins/maven/com/oracle/maven/oracle-maven-sync/12.1.3/oracle-maven-sync-12.1.3-0-0.jar
    -DpomFile=$ORACLE_HOME/oracle_common/plugins/maven/com/oracle/maven/oracle-maven-sync/12.1.3/oracle-maven-sync-12.1.3-0-0.pom
    -DrepositoryId=internal

After running this command, you can browse the internal repository in the Artifactory web interface and you will see the plugin in there.

Finally, let’s test our snapshot repository.  To do that, we need a new project.  Move to a new directory and create a project using one of the simple archetypes, like maven-archetype-quickstart for example. You can do this using this command:

mvn archetype:generate
    -DarchetypeGroupId=org.apache.maven.archetypes
    -DarchetypeArtifactId=maven-archetype-quickstart
    -DarchetypeVersion=1.1
    -DgroupId=com.test
    -DartifactId=project1
    -Dversion=1.0-SNAPSHOT

Then, go into your project1 directory, and add the following distributionManagement section to your pom.xml:

<distributionManagement>
  <repository>
    <id>internal</id>
    <name>internal</name>
    <url>http://localhost:8081/artifactory/internal</url>
  </repository>
  <snapshotRepository>
    <id>snapshot</id>
    <name>snapshot</name>
    <url>http://localhost:8081/artifactory/snapshot</url>
  </snapshotRepository>
</distributionManagement>

Now you can deploy the project!  Run this command:

mvn deploy

Then go browse your snapshot repository and you will see your project’s artifact in there!

So that completes the basic setup of a Maven repository manager using Artifactory and configuration of Maven to use it.  Enjoy!


Writing a Robot Remote Library

After reading the quick introduction to Robot Framework, you might have been asking yourself “so, what else can I do with this Robot thing?” – well there are, of course, a lot of things you can do with it.  Today, I want to take a look at writing Remote Libraries.

Robot Framework is extensible through a library mechanism.  You can write libraries in Java or Python, or you can write a remote library, which just has to be an XML-RPC server – the implementation language is not important.

One thing that I think we are going to want in the future, when we start provisioning environments using tools like Chef, is a way to check that we got what we wanted.  So, I decided to start writing a little Robot library that we could use to check that an environment is configured the way we think it is, or expect it to be.  Obviously there are other ways to do this, perhaps better ways, like exposing the configuration through Ohai for example, but I wanted to write a Robot Remote Library, and this seemed like a reasonable example.

So what will our library do?  Well, we will build it as a simple Web Application (WAR) that we will deploy on to our WebLogic Admin Server, and we will allow it to answer some simple questions like:

  • Does the server called ‘AdminServer’ exist?
  • Is the application called ‘Something’ in the state ‘RUNNING’?
  • Does the data source called ‘Something’ exist?
  • and so on…

You could imagine that we could go on extending this to let us ask questions about SOA, OSB, or anything running on WebLogic Server.  To answer the questions, we will just go read information out of MBeans, and since Fusion Middleware products conveniently put configuration information in MBeans, it is relatively straightforward for us to go get it.
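
Reading MBean attributes is plain JMX.  As a flavor of the calls involved, here is a minimal, self-contained sketch that reads an attribute from the JVM’s own platform MBean server; the real library would obtain an MBeanServerConnection to WebLogic’s Runtime MBean Server instead (for example via JNDI from inside the web application), which is an assumption beyond the scope of this snippet:

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServerConnection;
import javax.management.ObjectName;

public class MBeanPeek {

    // Read one attribute of a well-known MBean. Any MBeanServerConnection
    // works the same way; here we use the local platform MBean server so
    // the example runs without WebLogic.
    public static String readVmName() throws Exception {
        MBeanServerConnection conn = ManagementFactory.getPlatformMBeanServer();
        ObjectName runtime = new ObjectName("java.lang:type=Runtime");
        return (String) conn.getAttribute(runtime, "VmName");
    }

    public static void main(String[] args) throws Exception {
        System.out.println("VmName = " + readVmName());
    }
}
```

The keyword implementations follow the same pattern: build an ObjectName for the configuration or runtime MBean of interest, call getAttribute, and compare the value against what the test expects.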

And since we are building a Robot library, let’s also use Robot to test it!  We will build it with Maven, and use the Maven Robot Framework plugin to execute our tests.

So, before we get started, let’s take a quick look at how remote libraries work. You might want to take a quick look at some of the documentation here.

But what we need to know right now is the following:

  • Robot will make calls to the remote library using XML-RPC over HTTP
  • It expects (requires) there to be no prefix on the keywords
  • It will first call get_keyword_names to get a list of the keywords that the library supports
  • When you ask it to do something, it will call run_keyword to run the keyword you asked for

Keywords (you may recall) are used in the test case to express what must be done to execute the test.  So, let’s do a little test driven development, and start with our test case:

*** Settings ***
Library  Remote  http://localhost:7001/oracleRobot/xmlrpc

*** Test Cases ***

Ping Test
   ${pong} =                        ping              bob
   Should Be Equal as Strings       ${pong}           pong bob

Server State Test
   Server Status Should be          AdminServer       RUNNING

Application State Test
   Application Status Should Be     oracleRobot       ACTIVE

Data Source Existence Test
   Data Source Should Exist         myDataSource

You can see that it tells Robot the URL to call the XMLRPC Remote Library in the Settings section.  The keyword ‘Remote’ tells it that it is a remote (as opposed to local or built-in) library.

In the test cases you can see the keywords we are going to define, and also surmise their arguments and return values:

Keyword                        Arguments                  Return Value
ping                           a message                  the same message
Server Status Should Be        server name, state
Application Status Should Be   application name, state
Data Source Should Exist       data source name

Some of them don’t seem to have return values – that is because the protocol has a built-in set of return values to tell Robot if the test was successful or failed (plus some more information about failures – we’ll see this later on), so we don’t need to define our own output to indicate success.
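
To make the protocol concrete, here is a hedged sketch of what a service class behind these keywords could look like.  The real Robot class in the repository reads MBeans; this toy version (class name and all) is hypothetical and only implements ping, but the method names and result-map keys follow the remote library protocol described above:

```java
import java.util.HashMap;
import java.util.Map;

public class RobotSketch {

    // Robot calls this first to discover the available keywords.
    public String[] get_keyword_names() {
        return new String[] { "ping" };
    }

    // Robot then invokes every keyword through run_keyword. The result map
    // uses the protocol's built-in keys: "status" (PASS or FAIL), "return"
    // for the keyword's return value, and "error" for a failure message.
    public Map<String, Object> run_keyword(String name, Object[] args) {
        Map<String, Object> result = new HashMap<String, Object>();
        try {
            if ("ping".equals(name)) {
                result.put("return", "pong " + args[0]);
                result.put("status", "PASS");
            } else {
                result.put("status", "FAIL");
                result.put("error", "Unknown keyword: " + name);
            }
        } catch (Exception e) {
            result.put("status", "FAIL");
            result.put("error", String.valueOf(e.getMessage()));
        }
        return result;
    }

    public static void main(String[] args) {
        RobotSketch sketch = new RobotSketch();
        // prints "pong bob"
        System.out.println(sketch.run_keyword("ping", new Object[] { "bob" }).get("return"));
    }
}
```

A failing keyword just sets "status" to FAIL and puts the reason in "error"; Robot turns that into a test failure in the report.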

At this point, if you want to follow along with the code, you can grab it from java.net using git:

git clone git://java.net/ci4fmw~robot

Ok, let’s set about building our project.  First we will create a Maven POM to describe the project and handle all of the dependencies we need.

To start with, we are going to need three dependencies:

  • com.oracle.weblogic:weblogic-server-pom:12.1.2-0-0, type=pom, scope=provided
  • org.apache.xmlrpc:xmlrpc-server:3.1.3
  • org.apache.xmlrpc:xmlrpc-common:3.1.3

You will need to populate your Maven repository with the WebLogic Server artifacts using the Maven Synchronization Plugin.  You can find details in the documentation here, and an example in this earlier post/video.

Next, we can set up our build section.  We need to tell the maven-compiler-plugin to use Java 1.6, we need to configure the maven-war-plugin to create the WAR file, and we need to set up the weblogic-maven-plugin to deploy our WAR to our WebLogic Server instance for testing.

Finally, we need to tell Maven to grab the robotframework-maven-plugin and run the test suite.

Here is what our finished POM looks like:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.oracle.maven</groupId>
  <artifactId>oracle-robot</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>war</packaging>
  <name>oracleRobot</name>
  <properties>
    <serverUrl>t3://localhost:7001</serverUrl>
    <serverName>AdminServer</serverName>
  </properties>
  <dependencies>
    <dependency>
      <groupId>com.oracle.weblogic</groupId>
      <artifactId>weblogic-server-pom</artifactId>
      <version>12.1.2-0-0</version>
      <type>pom</type>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.xmlrpc</groupId>
      <artifactId>xmlrpc-server</artifactId>
      <version>3.1.3</version>
    </dependency>
    <dependency>
      <groupId>org.apache.xmlrpc</groupId>
      <artifactId>xmlrpc-common</artifactId>
      <version>3.1.3</version>
    </dependency>
  </dependencies>
  <build>
    <finalName>oracleRobot</finalName>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>2.3.2</version>
        <configuration>
          <source>1.6</source>
          <target>1.6</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-war-plugin</artifactId>
        <version>2.1.1</version>
        <configuration>
          <failOnMissingWebXml>false</failOnMissingWebXml>
        </configuration>
      </plugin>
      <plugin>
        <groupId>com.oracle.weblogic</groupId>
        <artifactId>weblogic-maven-plugin</artifactId>
        <version>12.1.2-0-0</version>
        <executions>
          <execution>
            <id>wls-deploy</id>
            <phase>pre-integration-test</phase>
            <goals>
              <goal>deploy</goal>
            </goals>
            <configuration>
              <adminurl>${serverUrl}</adminurl>
              <user>weblogic</user>
              <password>welcome1</password>
              <source>${project.build.directory}/${project.build.finalName}.${project.packaging}</source>
              <targets>${serverName}</targets>
              <verbose>true</verbose>
              <stage>true</stage>
              <upload>true</upload>
              <name>${project.build.finalName}</name>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.robotframework</groupId>
        <artifactId>robotframework-maven-plugin</artifactId>
        <version>1.1</version>
        <executions>
          <execution>
            <goals>
              <goal>run</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

</project>

Now, as we know from this earlier post, we place the test suite in a directory called src/test/robotframework/acceptance.  I called mine mytest.txt.  The full source for the test is just up above.

Next, we are going to need a web.xml for the Web Application, so let’s create that.  It goes in src/main/webapp/WEB-INF and here is what it looks like:

<?xml version="1.0" encoding="UTF-8"?>
<web-app version="3.0"
         xmlns="http://java.sun.com/xml/ns/javaee"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd" >

  <session-config>
    <session-timeout>30</session-timeout>
  </session-config>
  <welcome-file-list>
    <welcome-file>index.html</welcome-file>
  </welcome-file-list>
  <servlet>
    <servlet-name>XmlRpcServlet</servlet-name>
    <servlet-class>com.oracle.maven.MyXmlRpcServlet</servlet-class>
    <init-param>
      <param-name>enabledForExtensions</param-name>
      <param-value>true</param-value>
    </init-param>
  </servlet>
  <servlet-mapping>
    <servlet-name>XmlRpcServlet</servlet-name>
    <url-pattern>/xmlrpc</url-pattern>
  </servlet-mapping>
</web-app>

Here we are telling it the name of the servlet we will use to handle the XML-RPC requests and mapping it to the URL /xmlrpc.  Notice that we also set enabledForExtensions to true, because we are going to define our own handler – the library we are using assumes prefixes are required, but we actually do not want prefixes.

Let’s look at the servlet – here is the code:

package com.oracle.maven;

import org.apache.xmlrpc.XmlRpcException;
import org.apache.xmlrpc.server.XmlRpcHandlerMapping;
import org.apache.xmlrpc.webserver.XmlRpcServlet;

/**
 *  An XMLRPC servlet.
 *  The Robot Framework allows for the definition of a remote library
 *  using an XML RPC protocol. In this class, we implement an XMLRPC
 *  server as a servlet.  The Apache XML-RPC library that we are using
 *  creates services with a prefix, e.g. Robot.ping, however Robot
 *  expects there to be no prefix, e.g. just ping, so we need to add
 *  a little logic to remove the prefixes.
 */
public class MyXmlRpcServlet extends XmlRpcServlet  {

  /**
   *  Register a new <code>HandlerMapping</code> and remove prefixes.
   */
  @Override
  protected XmlRpcHandlerMapping newXmlRpcHandlerMapping() throws XmlRpcException {
    HandlerMapping mapping = new HandlerMapping();
    mapping.addHandler(com.oracle.maven.Robot.class.getName(), com.oracle.maven.Robot.class);
    mapping.removePrefixes();
    return mapping;
  }

}

Basically all we are doing here is registering our custom handler that removes the prefixes so that our XML-RPC services look the way Robot wants them to look.

Let’s take a look at that handler, here is the source:

Please note that this was adapted from here and is licensed under the Apache License, Version 2.0 – see the source file for more details.

package com.oracle.maven;

import java.util.HashMap;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;

import org.apache.xmlrpc.XmlRpcException;
import org.apache.xmlrpc.server.AbstractReflectiveHandlerMapping;

/**
 *  A <code>HandlerMapping</code> that removes the prefixes of service names.
 */
public class HandlerMapping extends AbstractReflectiveHandlerMapping {

    /**
     * Removes the prefixes from all keys in this handler mapping assuming a
     * String was used as the key and period was
     * used as a separator. Example: Robot.getInvoice -> getInvoice
     */
    @SuppressWarnings("unchecked")
    public void removePrefixes() {
      Map<String, Object> newHandlerMap = new HashMap<String, Object>();
      for (Entry<String, Object> entry : (Set<Entry<String, Object>>) this.handlerMap.entrySet()) {
        String newKey = (String) entry.getKey();
        if (entry.getKey() instanceof String) {
          String key = (String) entry.getKey();
          if (key.contains(".")) {
            newKey = key.substring(key.lastIndexOf(".") + 1);
          }
        }
        newHandlerMap.put(newKey, entry.getValue());
      }
      this.handlerMap = newHandlerMap;
    }

    /**
     * Adds handlers for the given object to the mapping. The handlers are built by invoking
     * {@link #registerPublicMethods(String, Class)}.
     *
     * @param pKey
     *            The class key, which is passed to {@link #registerPublicMethods(String, Class)}.
     * @param pClass
     *            Class, which is responsible for handling the request.
     */
    public void addHandler(String pKey, Class<?> pClass) throws XmlRpcException {
      registerPublicMethods(pKey, pClass);
    }
}

Basically this is just going to go through the service names, which will be something like Robot.MyService, and remove the prefix, so that it is just MyService.  We need to do this because that is the way Robot expects the services to be.  These two classes (the servlet and the handler) are actually generic – you should not need to make any changes to them (other than the class names in the servlet) to use them in another Robot Remote Library.

Now, we get to the class that does all the work.  The one that actually implements the keywords we will be using.  Here is the code:

package com.oracle.maven;

import java.util.Map;
import java.util.HashMap;

import javax.management.MBeanServer;
import javax.management.ObjectName;
import javax.management.MalformedObjectNameException;
import javax.naming.InitialContext;

/**
 *  A Remote Library for Robot Framework that defines keywords for Fusion Middleware.
 *  This is remote library for the Robot Framework, which can be deployed on a WebLogic
 *  Server and can be used in Robot Test Suites to obtain information about the
 *  WebLogic Domain.  This is mainly useful for 'infrastructure' level tests, for
 *  example, it will answer questions like:
 *  <p/>
 *  <ul>
 *  <li>Does a given server have a given state?</li>
 *  <li>Is a given application deployed?</li>
 *  <li>Does a given DataSource exist?</li>
 *  </ul>
 *  <p/>
 *  This remote library uses WebLogic MBeans, and so it must be deployed on the
 *  AdminServer, not a managed server.
 *  The Robot protocol uses xmlrpc with no prefixes.
 *  <p/>
 *  How to use:
 *  <p/>
 *  Suppose this robot is deployed on context root /oracleRobot.
 *  In your Test Suite, you access it by declaring a remote library in the Settings section
 *  then you may use keywords from this library in your Test Cases section.
 *  For example:
 *  <p/>
 *  <pre>
 *  *** Settings ***
 *  Library  Remote  http://server:port/oracleRobot
 *
 *  *** Test Cases ***
 *  Server Status Should Be   AdminServer    RUNNING
 *  Application Should Be     myWebApp       Active
 *  Data Source Should Exist  myDataSource
 *  </pre>
 *
 *  <p/>
 *  Robot looks for keywords by lowercasing the words and replacing spaces with
 *  underscores, so in the example above, when we say 'Server Status Should Be',
 *  it will call an XMLRPC method named 'server_status_should_be'.
 *  <p/>
 *  Note that Robot is a Python-based framework, and hence this class contains
 *  method names that follow the Python, rather than Java, naming conventions.
 */
public class Robot {

  private static final ObjectName domainRuntime;
  private static final ObjectName runtime;

  // string literals
  private static final String STATUS    = "status";
  private static final String ERROR     = "error";
  private static final String TRACEBACK = "traceback";
  private static final String RETURN    = "return";
  private static final String EMPTY     = "";
  private static final String PASS      = "PASS";
  private static final String FAIL      = "FAIL";

  private static final String NYI       = "Not Yet Implemented";

  // Initializing the object name for DomainRuntimeServiceMBean
  // so it can be used throughout the class.
  static {
    try {
      domainRuntime = new ObjectName(
          "com.bea:Name=DomainRuntimeService,Type=weblogic.management.mbeanservers.domainruntime.DomainRuntimeServiceMBean");
    } catch (MalformedObjectNameException e) {
      e.printStackTrace();
      throw new AssertionError(e.getMessage());
    }
  }

  // Initializing the object name for RuntimeServiceMBean
  // so it can be used throughout the class.
  static {
    try {
      runtime = new ObjectName(
          "com.bea:Name=RuntimeService,Type=weblogic.management.mbeanservers.runtime.RuntimeServiceMBean");
    } catch (MalformedObjectNameException e) {
      e.printStackTrace();
      throw new AssertionError(e.getMessage());
    }
  }

  //
  // these are the methods that implement Robot keywords
  //

  /**
   *  A Robot keyword that simply returns a message acknowledging its existence.
   *  Mainly to be used to check that the remote robot library is functioning.
   *
   *  @param args An <code>Object[1]</code> where <code>args[0]</code> contains
   *              a message to be echoed back.
   *  @return The robot protocol formatted return object containing the message.
   */
  public HashMap<String, Object> ping(Object args[]) {
    // prepare the result object
    HashMap<String, Object> result = new HashMap<String, Object>();
    // actual logic of the keyword goes here
    // TODO: check input is correct arity
    Object returnObj = "pong " + args[0];
    // create the response
    result.put(STATUS, PASS);
    result.put(ERROR, EMPTY);
    result.put(TRACEBACK, EMPTY);
    result.put(RETURN, returnObj);
    return result;
  }

  /**
   *  A Robot keyword that checks that a Data Source exists.
   *
   *  @param args An <code>Object[1]</code> where <code>args[0]</code> contains
   *              the name of the DataSource you want to confirm exists.
   *  @return The robot protocol formatted return object containing status of
   *          pass if the DataSource exists or fail otherwise.
   */
  public HashMap<String, Object> data_source_should_exist(Object args[]) {
    // prepare the result object
    HashMap<String, Object> result = new HashMap<String, Object>();
    // actual logic of the keyword goes here
    // TODO: check input is correct arity
    Object returnObj = EMPTY;
    // create the response
    result.put(STATUS, PASS);
    result.put(ERROR, NYI);
    result.put(TRACEBACK, EMPTY);
    result.put(RETURN, returnObj);
    return result;
  }

  /**
   *  A Robot keyword that checks the status of a server.
   *  This keyword allows you to check the status of a WebLogic Server against
   *  a state that you expect it to have.  If the server state matches your
   *  expectation, the keyword will pass, otherwise it will fail.
   *
   *  @param args An <code>Object[2]</code> where <code>args[0]</code> contains
   *              the name of the server, e.g. <code>AdminServer</code> and
   *              <code>args[1]</code> contains the desired state, e.g.
   *              <code>RUNNING</code>.
   *  @return The robot protocol formatted return object indicating pass/fail.
   */
  public HashMap<String, Object> server_status_should_be(Object args[]) {
    // prepare the result object
    HashMap<String, Object> result = new HashMap<String, Object>();
    // actual logic of the keyword goes here
    // TODO: check input is correct arity
    InitialContext ctx;
    boolean found = false;
    try {
      ctx = new InitialContext();
      MBeanServer server = (MBeanServer)ctx.lookup("java:comp/env/jmx/domainRuntime");
      ObjectName[] serverRuntimes = (ObjectName[]) server.getAttribute(domainRuntime, "ServerRuntimes");
      for (int i = 0; i < serverRuntimes.length; i++) {
        String serverName = (String) server.getAttribute(serverRuntimes[i], "Name");
        String serverState = (String) server.getAttribute(serverRuntimes[i], "State");
        if (((String)args[0]).compareTo(serverName) == 0) {
          // correct server
          if (((String)args[1]).compareTo(serverState) == 0 ) {
            // correct state
            found = true;
            break;
          }
        }
      }
    } catch (Exception e) {
      result.put(STATUS, FAIL);
      result.put(ERROR, e.getMessage());
      result.put(TRACEBACK, e.getStackTrace());
      result.put(RETURN, EMPTY);
      return result;
    }
    // create the response
    result.put(STATUS, found ? PASS : FAIL);
    result.put(ERROR, EMPTY);
    result.put(TRACEBACK, EMPTY);
    result.put(RETURN, EMPTY);
    return result;
  }

  /**
   *  A Robot keyword that checks the status of an application.
   *  This keyword allows you to check the status of an application against
   *  a state that you expect it to have.  If the application state matches your
   *  expectation, the keyword will pass, otherwise it will fail.
   *  <p/>
   *  Allowed states are ACTIVE, ACTIVE_ADMIN and INACTIVE.
   *
   *  @param args An <code>Object[2]</code> where <code>args[0]</code> contains
   *              the name of the application, e.g. <code>myWebApp</code> and
   *              <code>args[1]</code> contains the desired state, e.g.
   *              <code>ACTIVE</code>.
   *  @return The robot protocol formatted return object indicating pass/fail.
   */
  public HashMap<String, Object> application_status_should_be(Object args[]) {
    // prepare the result object
    HashMap<String, Object> result = new HashMap<String, Object>();
    // actual logic of the keyword goes here
    // TODO: check input is correct arity
    InitialContext ctx;
    boolean found = false;
    try {
      // decode state
      int targetState = -1;
      if (((String)args[1]).compareTo("INACTIVE") == 0 ) {
        targetState = 0;
      } else if (((String)args[1]).compareTo("ACTIVE_ADMIN") == 0 ) {
        targetState = 1;
      } else if (((String)args[1]).compareTo("ACTIVE") == 0 ) {
        targetState = 2;
      }
      if (targetState == -1) {
        throw new Exception("You provided an invalid state.  Valid states are ACTIVE, ACTIVE_ADMIN and INACTIVE");
      }
      // now look up the application state
      ctx = new InitialContext();
      MBeanServer server = (MBeanServer)ctx.lookup("java:comp/env/jmx/runtime");
      ObjectName serverRuntime = (ObjectName) server.getAttribute(runtime, "ServerRuntime");
      ObjectName[] appRuntimes = (ObjectName[]) server.getAttribute(serverRuntime, "ApplicationRuntimes");
      for (int i = 0; i < appRuntimes.length; i++) {
        String appName = (String) server.getAttribute(appRuntimes[i], "ApplicationName");
        int appState = (Integer) server.getAttribute(appRuntimes[i], "ActiveVersionState");
        if (((String)args[0]).compareTo(appName) == 0) {
          // correct application
          if (appState == targetState) {
            // correct state
            found = true;
            break;
          }
        }
      }
    } catch (Exception e) {
      e.printStackTrace(System.out);
      result.put(STATUS, FAIL);
      result.put(ERROR, e.getMessage());
      result.put(TRACEBACK, e.getStackTrace());
      result.put(RETURN, EMPTY);
      return result;
    }
    // create the response
    result.put(STATUS, found ? PASS : FAIL);
    result.put(ERROR, EMPTY);
    result.put(TRACEBACK, EMPTY);
    result.put(RETURN, EMPTY);
    return result;
  }

  //
  // methods required by the robotframework remote library protocol follow
  //

  /**
   *  Get a list of Robot keywords implemented in this remote library.
   *  This method is required by the Robot remote library protocol.
   *
   *  @return A <code>String[]</code> containing the names of the keywords
   *          that are implemented by this library.
   */
  public String[] get_keyword_names() {
    // TODO: change this to use reflection?
    return new String[] { "ping",
                          "server_status_should_be",
                          "application_status_should_be",
                          "data_source_should_exist"
                        };
  }

  /**
   *  Run the given keyword.
   *  This method is required by the Robot remote library protocol.
   *  <p/>
   *  The expected return object is a <code>Map</code> containing the
   *  following entries:
   *  <p/>
   *  <ul>
   *  <li><code>status</code>: Indicates if the keyword was successful, values
   *  are <code>PASS</code> or <code>FAIL</code>.</li>
   *  <li><code>return</code>: The (arbitrary) return object/message from the
   *  keyword.</li>
   *  <li><code>error</code>: The error message, if any.</li>
   *  <li><code>traceback</code>: Additional information, such as a stack trace,
   *  for diagnosing errors.</li>
   *  </ul>
   *
   *  @param keyword The name of the keyword to run.
   *  @param args The arguments for the keyword, in an <code>Object[]</code>.
   *  @return The result of executing the keyword, formatted as the Robot
   *          framework expects.
   */
  public HashMap<String, Object> run_keyword(String keyword, Object[] args) {
    HashMap<String, Object> kr = new HashMap<String, Object>();
    try {
      // run the right method
      // TODO: change this to use reflection perhaps?
      if (keyword.equalsIgnoreCase("ping")) {
        kr = ping(args);
      } else if (keyword.equalsIgnoreCase("server_status_should_be")) {
        kr = server_status_should_be(args);
      } else if (keyword.equalsIgnoreCase("application_status_should_be")) {
        kr = application_status_should_be(args);
      } else if (keyword.equalsIgnoreCase("data_source_should_exist")) {
        kr = data_source_should_exist(args);
      } else {
        kr.put(STATUS, FAIL);
        kr.put(RETURN, EMPTY);
        kr.put(ERROR, EMPTY);
        kr.put(TRACEBACK, EMPTY);
      }
    } catch (Exception e) {
      e.printStackTrace(System.out);
    }
    return kr;
  }

}

So let’s take a look at this class.  Right up top we define a few constants and some statics to refer to the roots of the DomainRuntimeService and RuntimeService JMX trees in WebLogic.

Then we move on to define our keywords.  The first one is a really simple ‘ping’:

  public HashMap<String, Object> ping(Object args[]) {
    // prepare the result object
    HashMap<String, Object> result = new HashMap<String, Object>();
    // actual logic of the keyword goes here
    // TODO: check input is correct arity
    Object returnObj = "pong " + args[0];
    // create the response
    result.put(STATUS, PASS);
    result.put(ERROR, EMPTY);
    result.put(TRACEBACK, EMPTY);
    result.put(RETURN, returnObj);
    return result;
  }

This shows us how the Robot protocol works.  First of all, we are going to receive the inputs in an array of Objects, and we are going to send the result in a HashMap<String, Object>.  That result hashmap has a number of entries in it:

  • status, which tells Robot whether the test passed or failed
  • error, which identifies the error (if any)
  • traceback, which provides more context for the error
  • return, which is our general purpose object for sending back any data we wish to send

In this example, we are setting the return entry to contain “pong ” plus whatever we got as input in the first parameter.  Now, in real life, you would obviously want to be a little more careful about checking that parameters were passed and about typecasting them, but this is an example :)
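
Just as a sketch of what that extra checking might look like (the class here is my own illustration, not part of the sample project; the constants mirror the ones in the Robot class), the keyword could validate arity and type before touching its argument:

```java
import java.util.HashMap;

public class SafePing {

  // Robot protocol keys, mirroring the constants in the Robot class
  private static final String STATUS    = "status";
  private static final String ERROR     = "error";
  private static final String TRACEBACK = "traceback";
  private static final String RETURN    = "return";
  private static final String EMPTY     = "";
  private static final String PASS      = "PASS";
  private static final String FAIL      = "FAIL";

  /** A ping keyword that checks arity and type before using its argument. */
  public static HashMap<String, Object> ping(Object[] args) {
    HashMap<String, Object> result = new HashMap<String, Object>();
    // fail cleanly, in protocol format, if the argument is missing or not a String
    if (args == null || args.length != 1 || !(args[0] instanceof String)) {
      result.put(STATUS, FAIL);
      result.put(ERROR, "ping expects exactly one String argument");
      result.put(TRACEBACK, EMPTY);
      result.put(RETURN, EMPTY);
      return result;
    }
    result.put(STATUS, PASS);
    result.put(ERROR, EMPTY);
    result.put(TRACEBACK, EMPTY);
    result.put(RETURN, "pong " + args[0]);
    return result;
  }
}
```

Because the failure is reported through the same status/error entries, Robot shows a clean test failure instead of a server-side ClassCastException.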

Let’s look at a slightly more interesting keyword.  This one allows us to check the state of a server.  Here is the code:

  public HashMap<String, Object> server_status_should_be(Object args[]) {
    // prepare the result object
    HashMap<String, Object> result = new HashMap<String, Object>();
    // actual logic of the keyword goes here
    // TODO: check input is correct arity
    InitialContext ctx;
    boolean found = false;
    try {
      ctx = new InitialContext();
      MBeanServer server = (MBeanServer)ctx.lookup("java:comp/env/jmx/domainRuntime");
      ObjectName[] serverRuntimes = (ObjectName[]) server.getAttribute(domainRuntime, "ServerRuntimes");
      for (int i = 0; i < serverRuntimes.length; i++) {
        String serverName = (String) server.getAttribute(serverRuntimes[i], "Name");
        String serverState = (String) server.getAttribute(serverRuntimes[i], "State");
        if (((String)args[0]).compareTo(serverName) == 0) {
          // correct server
          if (((String)args[1]).compareTo(serverState) == 0 ) {
            // correct state
            found = true;
            break;
          }
        }
      }
    } catch (Exception e) {
      result.put(STATUS, FAIL);
      result.put(ERROR, e.getMessage());
      result.put(TRACEBACK, e.getStackTrace());
      result.put(RETURN, EMPTY);
      return result;
    }
    // create the response
    result.put(STATUS, found ? PASS : FAIL);
    result.put(ERROR, EMPTY);
    result.put(TRACEBACK, EMPTY);
    result.put(RETURN, EMPTY);
    return result;
  }

You can see that the same basic protocol is implemented here.  The only difference is in what we actually do in the business logic to come up with an answer.  In this case, we look up the domainRuntime JMX MBeanServer, then go look for a server with the same name as the first parameter, and if we find it, we check whether its state matches the second parameter.  Then we return a result based on what we found.  Again, we should be more careful about checking our inputs’ existence and types, but you get the idea.

There is another keyword implemented that lets us check the state of an application.  The data source keyword is not implemented; it is sitting there with just the protocol wrapper but no logic.  I thought you might like to have a go at implementing it :)
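
If you would like a head start, here is one possible sketch.  Be warned that this is my reading of the WebLogic MBean tree, not tested code from the sample – the attribute names (DomainConfiguration, JDBCSystemResources, Name) should be verified against your own server, and the pure name-matching logic is pulled out into a helper so it can be tested without a running domain:

```java
import java.util.HashMap;

import javax.management.MBeanServer;
import javax.management.ObjectName;
import javax.naming.InitialContext;

public class DataSourceKeyword {

  /** Pure matching logic, separated out so it is testable without JMX. */
  public static boolean containsName(String[] names, String target) {
    for (String name : names) {
      if (target.equals(name)) {
        return true;
      }
    }
    return false;
  }

  /** A possible data_source_should_exist keyword, following the same protocol. */
  public HashMap<String, Object> data_source_should_exist(Object[] args) {
    HashMap<String, Object> result = new HashMap<String, Object>();
    try {
      // the same domain runtime root as the static field in the Robot class
      ObjectName domainRuntime = new ObjectName(
          "com.bea:Name=DomainRuntimeService,Type=weblogic.management.mbeanservers.domainruntime.DomainRuntimeServiceMBean");
      InitialContext ctx = new InitialContext();
      MBeanServer server = (MBeanServer) ctx.lookup("java:comp/env/jmx/domainRuntime");
      // walk DomainConfiguration -> JDBCSystemResources and collect the names
      ObjectName domainConfig = (ObjectName) server.getAttribute(domainRuntime, "DomainConfiguration");
      ObjectName[] dataSources = (ObjectName[]) server.getAttribute(domainConfig, "JDBCSystemResources");
      String[] names = new String[dataSources.length];
      for (int i = 0; i < dataSources.length; i++) {
        names[i] = (String) server.getAttribute(dataSources[i], "Name");
      }
      boolean found = containsName(names, (String) args[0]);
      result.put("status", found ? "PASS" : "FAIL");
      result.put("error", "");
      result.put("traceback", "");
      result.put("return", "");
    } catch (Exception e) {
      result.put("status", "FAIL");
      result.put("error", e.getMessage());
      result.put("traceback", e.getStackTrace());
      result.put("return", "");
    }
    return result;
  }
}
```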

Following the keyword implementations, there are two more methods that we need in order to implement the protocol.  Oh, by the way, we are not implementing the full protocol here – just the basics.  Let’s take a look at the get_keyword_names method:

  public String[] get_keyword_names() {
    // TODO: change this to use reflection?
    return new String[] { "ping",
                          "server_status_should_be",
                          "application_status_should_be",
                          "data_source_should_exist"
                        };
  }

Note that I am using the Python method naming conventions here, since Robot is a Python-based framework.  This method just needs to return an array with the names of the keywords that are implemented.  I am doing this the lazy way – it would probably be better to produce the list with some kind of introspection, so you don’t forget to update it.
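
For what it is worth, here is one way that introspection could look.  This is only a sketch (the class names are mine), and it assumes the convention that every public method returning a HashMap and taking a single Object[] is a keyword:

```java
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;

public class KeywordScanner {

  /** Collect keyword names by reflecting over the library class. */
  public static String[] keywordNames(Class<?> libraryClass) {
    List<String> names = new ArrayList<String>();
    for (Method m : libraryClass.getDeclaredMethods()) {
      // a keyword is any public method: HashMap xxx(Object[] args)
      if (Modifier.isPublic(m.getModifiers())
          && m.getReturnType() == HashMap.class
          && m.getParameterTypes().length == 1
          && m.getParameterTypes()[0] == Object[].class) {
        names.add(m.getName());
      }
    }
    Collections.sort(names);
    return names.toArray(new String[names.size()]);
  }

  // a tiny example library to scan
  public static class Demo {
    public HashMap<String, Object> ping(Object[] args) { return new HashMap<String, Object>(); }
    public HashMap<String, Object> data_source_should_exist(Object[] args) { return new HashMap<String, Object>(); }
    public String not_a_keyword() { return ""; }
  }
}
```

With this in place, adding a new keyword method to the class automatically adds it to the list that get_keyword_names returns.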

Finally, we need to implement the run_keyword method, which just routes the request to the right method to handle the keyword.  Again, I have implemented this the lazy way, and there are certainly better ways to do it. Here is the code:

  public HashMap<String, Object> run_keyword(String keyword, Object[] args) {
    HashMap<String, Object> kr = new HashMap<String, Object>();
    try {
      // run the right method
      // TODO: change this to use reflection perhaps?
      if (keyword.equalsIgnoreCase("ping")) {
        kr = ping(args);
      } else if (keyword.equalsIgnoreCase("server_status_should_be")) {
        kr = server_status_should_be(args);
      } else if (keyword.equalsIgnoreCase("application_status_should_be")) {
        kr = application_status_should_be(args);
      } else if (keyword.equalsIgnoreCase("data_source_should_exist")) {
        kr = data_source_should_exist(args);
      } else {
        kr.put(STATUS, FAIL);
        kr.put(RETURN, EMPTY);
        kr.put(ERROR, EMPTY);
        kr.put(TRACEBACK, EMPTY);
      }
    } catch (Exception e) {
      e.printStackTrace(System.out);
    }
    return kr;
  }
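
One of those better ways is to route the call through reflection, so that adding a keyword method does not require touching the dispatcher.  Again, just a sketch with class names of my own making; it expects keyword names in the lower-case, underscore form that Robot sends, and lower-cases them defensively:

```java
import java.lang.reflect.Method;
import java.util.HashMap;

public class ReflectiveDispatcher {

  private final Object library;

  public ReflectiveDispatcher(Object library) {
    this.library = library;
  }

  /** Dispatch a keyword to the library method with the same name. */
  @SuppressWarnings("unchecked")
  public HashMap<String, Object> runKeyword(String keyword, Object[] args) {
    HashMap<String, Object> result = new HashMap<String, Object>();
    try {
      // look up a public method with the keyword's name taking a single Object[]
      Method m = library.getClass().getMethod(keyword.toLowerCase(), Object[].class);
      return (HashMap<String, Object>) m.invoke(library, (Object) args);
    } catch (NoSuchMethodException e) {
      result.put("status", "FAIL");
      result.put("error", "unknown keyword: " + keyword);
      result.put("traceback", "");
      result.put("return", "");
    } catch (Exception e) {
      result.put("status", "FAIL");
      result.put("error", e.toString());
      result.put("traceback", "");
      result.put("return", "");
    }
    return result;
  }

  // a minimal library for demonstration
  public static class Demo {
    public HashMap<String, Object> ping(Object[] args) {
      HashMap<String, Object> r = new HashMap<String, Object>();
      r.put("status", "PASS");
      r.put("return", "pong " + args[0]);
      return r;
    }
  }
}
```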

So that completes our project – let’s go ahead and test it out.  You can do that by just running:

mvn verify

You should get some output like this:


C:\src\ci4fmw~robot\oracle-robot>mvn verify
 [INFO] Scanning for projects...
 [INFO]
 [INFO] ------------------------------------------------------------------------
 [INFO] Building oracleRobot 1.0-SNAPSHOT
 [INFO] ------------------------------------------------------------------------
 [INFO]
 [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ oracle-robot ---
 [WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
 [INFO] skip non existing resourceDirectory C:\src\ci4fmw~robot\oracle-robot\src\main\resources
 [INFO]
 [INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ oracle-robot ---
 [INFO] Nothing to compile - all classes are up to date
 [INFO]
 [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ oracle-robot ---
 [WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
 [INFO] skip non existing resourceDirectory C:\src\ci4fmw~robot\oracle-robot\src\test\resources
 [INFO]
 [INFO] --- maven-compiler-plugin:2.3.2:testCompile (default-testCompile) @ oracle-robot ---
 [INFO] No sources to compile
 [INFO]
 [INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ oracle-robot ---
 [INFO] No tests to run.
 [INFO]
 [INFO] --- maven-war-plugin:2.1.1:war (default-war) @ oracle-robot ---
 [INFO] Packaging webapp
 [INFO] Assembling webapp [oracle-robot] in [C:\src\ci4fmw~robot\oracle-robot\target\oracleRobot]
 [INFO] Processing war project
 [INFO] Copying webapp resources [C:\src\ci4fmw~robot\oracle-robot\src\main\webapp]
 [INFO] Webapp assembled in [171 msecs]
 [INFO] Building war: C:\src\ci4fmw~robot\oracle-robot\target\oracleRobot.war
 [WARNING] Warning: selected war files include a WEB-INF/web.xml which will be ignored
 (webxml attribute is missing from war task, or ignoreWebxml attribute is specified as 'true')
 [INFO]
 [INFO] --- weblogic-maven-plugin:12.1.2-0-0:deploy (wls-deploy) @ oracle-robot ---
 [INFO] Command flags are: -noexit -adminurl t3://localhost:7001 -deploy -user weblogic -password ******* -name oracleRobot -source C:\src\ci4fmw~robot\oracle-robot\target\oracleRobot.war -targets AdminServer -stage -upload -verbose
 weblogic.Deployer invoked with options:  -noexit -adminurl t3://localhost:7001 -deploy -user weblogic -name oracleRobot -source C:\src\ci4fmw~robot\oracle-robot\target\oracleRobot.war -targets AdminServer -stage -upload -verbose
 <22/10/2013 7:40:08 AM EST> <Info> <J2EE Deployment SPI> <BEA-260121> <Initiating deploy operation for application, oracleRobot [archive: C:\src\ci4fmw~robot\oracle-robot\target\oracleRobot.war], to AdminServer .>
 Task 3 initiated: [Deployer:149026]deploy application oracleRobot on AdminServer.
 Task 3 completed: [Deployer:149026]deploy application oracleRobot on AdminServer.
 Target state: deploy completed on Server AdminServer

Target Assignments:
 + oracleRobot  AdminServer
 [INFO]
 [INFO] --- robotframework-maven-plugin:1.1:run (default) @ oracle-robot ---
 ==============================================================================
 Acceptance
 ==============================================================================
 Acceptance.Mytest
 ==============================================================================
 Ping Test                                                             | PASS |
 ------------------------------------------------------------------------------
 Server State Test                                                     | PASS |
 ------------------------------------------------------------------------------
 Application State Test                                                | PASS |
 ------------------------------------------------------------------------------
 Data Source Existence Test                                            | PASS |
 ------------------------------------------------------------------------------
 Acceptance.Mytest                                                     | PASS |
 4 critical tests, 4 passed, 0 failed
 4 tests total, 4 passed, 0 failed
 ==============================================================================
 Acceptance                                                            | PASS |
 4 critical tests, 4 passed, 0 failed
 4 tests total, 4 passed, 0 failed
 ==============================================================================
 Output:  C:\src\ci4fmw~robot\oracle-robot\target\robotframework-reports\output.xml
 XUnit:   C:\src\ci4fmw~robot\oracle-robot\target\robotframework-reports\TEST-acceptance.xml
 Log:     C:\src\ci4fmw~robot\oracle-robot\target\robotframework-reports\log.html
 Report:  C:\src\ci4fmw~robot\oracle-robot\target\robotframework-reports\report.html
 [INFO] ------------------------------------------------------------------------
 [INFO] BUILD SUCCESS
 [INFO] ------------------------------------------------------------------------
 [INFO] Total time: 26.213s
 [INFO] Finished at: Tue Oct 22 07:40:23 EST 2013
 [INFO] Final Memory: 39M/776M
 [INFO] ------------------------------------------------------------------------
 C:\src\ci4fmw~robot\oracle-robot>

So we can see that it all worked.  And now we have a simple Robot Remote Library that we can extend as needed, and we can test it with Robot too.  Enjoy!


A Roadmap for SOA Development and Delivery

This post is part of a series on SOA Development and Delivery.

In this post I will present a roadmap and a target state for SOA Development and Delivery.  This will serve as the basis for an extended open ‘proof of concept’ that we will work through together, over a series of posts, to implement and prove this approach.

Let’s talk first about our target state – the goal we want to achieve, then we will come back and look at the steps on the journey.

The Vision – Continuous Delivery

Continuous Delivery is a set of practices, supported by tools and automation, that is focused on answering the question: ‘How much risk is associated with deploying this new change into production?’

It involves automation of everything that needs to happen between a developer committing a change and the release of the software containing that change into production, including:

  • building the software
  • testing the software, and
  • managing the configuration of the environments

The outcome that we want to achieve is that we can automate all of our:

  • build
  • component-level test
  • environment provisioning (for development, test and production environments), and
  • acceptance test

for any ‘application’ that consists of ‘projects’ that are targeted to Oracle SOA Suite runtimes (WebLogic Server, ADF, SOA, BPM, OSB, etc.) in a consistent way, using the same tools and techniques commonly applied for other styles of development.

Major Themes for the vision include:

  • simplification
  • standardization
  • adoption of convention over configuration
  • customization capability
  • intuitive/idiomatic (things work the way you think they would/should)

We want to improve quality, visibility, cycle time and repeatability.  We want to enable better testing and more reporting.  We want to reduce risk and cost.

How do we get there?  The Journey

The journey that we are going to take consists of a few major steps; some of these can be taken without the others, and some need to be done together.  Let’s talk about what we need to realise continuous delivery for SOA projects:

  • continuous integration
  • provisioning and configuration management
  • deployment pipelines
  • testing and quality

We will discuss each of these in detail in the following sections.

Continuous Integration

We have talked about this before, so let’s just recap.  Continuous Integration is a set of processes and practices aimed at improving software quality – essentially by testing early and often, to identify defects while they are small and easier to correct.

Common practices adopted in continuous integration include:

  • all developers committing to the trunk regularly (at least daily)
  • build the software on every commit
  • test the software on every build
  • building and testing are automated
  • feedback is given to developers (and other stakeholders) quickly

Let’s consider some of the enabling technologies.

Version Control

We have talked about this already too, so let’s recap.  Version Control systems allow us to store a history of changes over time.  Who changed what, when and why.  They provide the ability to isolate changes by developing in multiple streams (branches).  We can trigger our builds based on commits to the version control system.  They allow teams of developers to work on the application safely, and they permit the sharing of common artifacts.

The two most common and relevant version control systems, in my opinion, that we see in SOA environments today are Subversion and git.

Subversion is arguably the de facto standard for version control today.  We have good integration with development tools like JDeveloper, build automation tools like Maven, and continuous integration tools like Hudson.  It is widely understood and has excellent platform support.  It is a logical choice.

But Subversion does present some challenges.  Increasingly, we are seeing collaborative development approaches – organizations that are outsourcing some of the development to a third party (or multiple third parties) and having teams of developers, testers, build engineers and QA people in different locations.  We also see that it is much more common these days for people to work from home.  Subversion does force developers to have network connectivity to the server in order to do operations like committing and branching, and this has proven challenging in some environments.

Distributed version control systems address this new need.  Arguably the best, or at least the most widely used, of the new distributed version control systems is git, developed by Linus Torvalds and used by many open source projects.

I think that it is time for organizations doing SOA development to take a good hard look at git and what it offers above and beyond Subversion, and to give some consideration to when to migrate.  I know some people will not migrate, but I feel that they will be in the minority.  Just like those people who never migrated from CVS to Subversion, I think the world has moved on.

So what does git give us?  And let’s expand that to talk also about gitlab, which is a common private hosted (on premise) git environment.

git allows developers to commit, branch, etc. when they are offline.  A connection to the server is not required for these types of actions.  It also supports a lot of new workflows.  It does not enforce a single master ‘trunk’ but you can have many distributed repositories that have the ability to push and pull commits, tags, branches, etc. to/from each other.

The standard git workflow is undoubtedly more complex than the standard Subversion workflow, but I think it is worth the effort and pain to make the move.  I know from personal experience in many projects using Subversion, Perforce, and other proprietary version control systems, that git seems to provide (anecdotally) more flexibility, especially when things get messy.  The easy things are not quite as easy as in Subversion, but the hard things are easier and in some cases possible, where they would not be in Subversion or other non-distributed version control systems.

gitlab is a great tool that provides a web-based environment for interacting with git.  It lets you easily visualize projects, branches and merges, view and diff versions of artifacts, and it has integrated collaboration tools like a wiki, issue tracking, and so on.  In my opinion, it dramatically increases the value of git by providing much-needed visibility and visualization tools.

In our POC, we will use git and gitlab.

Dependency Management and Build Automation

Maven is the obvious choice for us here, as we have good support for it in the 12c products, it is possible to use it with the 11g products as well, and it is an area that is seeing more investment moving forward.  Maven allows us to define the dependencies between projects in a declarative way, and it manages the process of collecting those dependencies and simplifies the whole build process for us.
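For example, a project simply declares what it depends on in its pom.xml, and Maven takes care of resolving and collecting those artifacts from the repository.  The coordinates below are purely illustrative:

```xml
<!-- Hypothetical dependency on a shared library built by another team;
     the groupId, artifactId and version are placeholders. -->
<dependency>
  <groupId>com.redstack</groupId>
  <artifactId>common-utils</artifactId>
  <version>1.0.0</version>
</dependency>
```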

Most of the other tools that we want to use also have good integration with Maven – things like continuous integration, quality tools, binary management, and so on.

We will use Maven in the POC.

Binary Management

This is an important area that we will discuss in more detail in a later post.  Once we have built binaries in our continuous integration server (using Maven), we need to put them somewhere and manage them.  This is (like most things) a little more complicated than it seems, and this is why a set of tools has grown up around binary management – things like Archiva, Artifactory and Nexus for example.

One very important tenet of continuous delivery is that you only build the binary once.  That means that we build it, test it, check its quality, etc., and then we publish it to the binary repository.  Anything that happens later in the pipeline, like acceptance tests for example, just consumes the binary from the binary repository – we do not go back and rebuild it.  This gives us certainty that we are using the exact same code that was tested earlier in the pipeline.
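In Maven terms, that publish step is typically a `mvn deploy` at the end of a successful build, with the target repository declared in the pom.  The repository ids and URLs below are placeholders for your own Artifactory instance:

```xml
<!-- Sketch only: point deploys at an Artifactory instance.
     The ids and URLs are placeholders, not real endpoints. -->
<distributionManagement>
  <repository>
    <id>artifactory-releases</id>
    <url>http://artifactory.example.com/artifactory/libs-release-local</url>
  </repository>
  <snapshotRepository>
    <id>artifactory-snapshots</id>
    <url>http://artifactory.example.com/artifactory/libs-snapshot-local</url>
  </snapshotRepository>
</distributionManagement>
```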

In the POC, we will use Artifactory as our binary management tool.

Continuous Inspection

Now automation is great, but if the things we are building automatically are garbage, then automation is not going to help.  So it is important that we include a way to measure quality.  There are some excellent tools in the Java space (like PMD, Checkstyle, FindBugs for example) but not too much in the SOA space, so this is an area we are going to need to look at.  One thing we can do right now though is agree to use a framework that allows us to execute various quality tools and consolidate and report on the results and trends.  Sonar is one such framework, and that is what we will use in our POC.

Continuous Delivery

There are a number of tools that are starting to appear in this space, but it is arguably still going through its storming phase.  What we really care about in our POC is that we can build and execute flexible pipelines.  For the purposes of this activity, we are going to use Hudson, along with a couple of Hudson plugins and maybe even write a plugin or two of our own.

Provisioning

Arguably the two front runners in the DevOps provisioning space are Chef and Puppet.  Both offer a way to manage the configuration of a system over time, converging it to an approved model.  Both also use a (predominantly) pull-based model, where a client on the managed nodes connects back to a server periodically to check for updates to the model.  It is relatively straightforward to implement a push model as well, which is important to us because we are going to want to provision new environments on the fly and not have to wait around for them.  Some of the environments that we want to create (for our acceptance tests) are going to require more than one node (machine) – for example, we might want a cluster of SOA servers with the database on another node, and a load balancer/proxy in front.  Chef and Puppet are pretty good at handling the configuration of an individual node, but they do not provide everything we really need for multi-node environments.

There are a lot of tools in that multi-node space, and that space is also arguably well and truly in the storming phase of development.  There is really no clear front runner, though there is a tool called Marionette Collective (MCollective) that works with both Chef and Puppet.  It was acquired by the Puppet folks, and I believe they use it to power their dashboards in Puppet Enterprise 3.

I would argue that the choice between Chef and Puppet is less important than the choice to use one of them, or not.  So I am not going to go into a lot of detailed comparison.  When I started working in this space, it seemed that Chef had better support for using recipes across virtual, physical and cloud environments, and deeper support across Windows, Linux and Solaris (though I think Puppet has closed the gap now).  There were also some other folks that I had worked with who had a lot of experience using Chef in a pretty large cloud environment, so it seemed like a good choice.

For our POC we are going to use Chef and MCollective.
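To give a flavor of the model-driven approach, here is a minimal sketch of what a Chef recipe for preparing a node might look like.  The resources, paths and user name are purely illustrative – this is not a working WebLogic cookbook:

```ruby
# Illustrative Chef recipe sketch: we declare the state we want and
# Chef converges the node toward it.  All names are placeholders.
package 'unzip'            # ensure the unzip package is installed

user 'weblogic' do         # ensure an OS user for the server exists
  home  '/home/weblogic'
  shell '/bin/bash'
end

directory '/u01/domains' do   # ensure the domains directory exists
  owner     'weblogic'
  recursive true
end
```

The point is that running this recipe repeatedly is safe; Chef only makes changes when the node has drifted from the declared state.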

Acceptance Testing

First, let’s clear up our terminology.  I am going to use the term ‘acceptance test’ here to mean tests that we execute against artifacts that have been deployed to a (production-like) runtime environment, as opposed to tests that are executed against source code, object code, or undeployed binaries.

I am using this term on purpose for two reasons:

  1. I don’t think that the line can simply be drawn between unit tests and integration tests – while I would argue that all unit tests could and should be executed as part of the build process, i.e. in the continuous integration environment, I don’t think that integration testing is so clear cut.  Some integration tests could be executed as part of the build, but others are more complex, take longer to execute, or require more infrastructure than it is reasonable to provision (or wait for) in a build.  So I think that some integration tests should be executed in the acceptance test environment, i.e. after the binary has been built and published.
  2. There is a granularity mismatch between build and test.  We typically build small pieces like composites, web applications, etc. in our continuous integration environment.  But we want to test the system or the whole application – not just the pieces in isolation (we do, of course, test them in isolation during the build process).  So really our acceptance tests are running against a whole set of binaries that were produced by a group of jobs in our continuous integration environment.  And over the life of the project, we are going to find that more or fewer of the real artifacts actually exist – at the start of the project we may be using a lot of mocks for our acceptance tests, but later we may switch them out for binaries we have built, or for test instances of systems we need to interact with that have now been provisioned.

There are also a whole bunch of different test tools that we can use to test different kinds of artifacts, and I don’t think it makes sense to try to find a single test tool to execute all of our tests.  So what is important to us then, is to have a test framework that allows us to use any tool, and provides a way to consolidate the test results for reporting and trending.

Robot Framework is one such test framework.  It is backed by Nokia Siemens Networks, it has a Maven plugin, it is pretty easy to use, and it supports many of the kinds of test tools we would want to use out of the box.  So in our POC, we will use Robot as our test framework.

Customization and Configuration Management

Since we are going to build our binaries only once, customization is an important area for us to consider.  How are we going to target the binary to the actual environment we are deploying it into?  While we have tools like Java EE Deployment Plans and SOA Configuration Plans, we arguably need something more generic and extensible.

We also need it to integrate with whatever is managing our configuration.  For example, when we provision a set of nodes for our acceptance tests to run in, how do we get all of the topological information and use it to customize the deployment of the binaries?

In this space, for the purposes of this POC, we are going to design and create a custom metadata format for holding this information, a Hudson plugin that integrates with Chef to get the configuration information, and a tool to apply the customizations.

So let’s recap what tools we plan to use:

Summary of Toolsets

Category                      Tool
Version Control               git/gitlab
Build/Dependency Management   Maven
Binary Management             Artifactory
Continuous Integration        Hudson
Continuous Inspection         Sonar
Continuous Delivery           Hudson/custom
Provisioning                  Chef
Acceptance Test               Robot
Customization                 (custom)

Ok, we are ready to get started!  See you in the next post, and please feel free to send comments and feedback, we love to hear from you.


E-Business Suite 12.2 released

In case you missed it, E-Business Suite 12.2 has been released.  Check this post for details.


A first look at Robot Framework

As we start to assemble the tools we need to implement Continuous Delivery, one area we need to look at is selecting a testing framework.  We are going to want to use a range of different testing tools, depending on what we want to test and how, but ideally we would have some kind of framework that abstracts those tools and gives us a single place to go to execute tests.

In looking for a test framework, I have been considering the following:

  • Preferably it should have some kind of backing to ensure its longevity and continued development,
  • Ideally I should not have to learn a new language to use the tool – since most Fusion Middleware users are already using Java and Python (for WLST scripts) – those two seem like the best choices,
  • It should be extensible, so I can add new types of test tools if I want to,
  • It should integrate well with Maven and Hudson, since those are already part of our ecosystem,
  • It should allow me to use a wide range of testing tools to execute the actual tests.

So after a bit of research, what I have come up with is Robot Framework.  It is backed by Nokia Siemens Networks, it uses a simple text/table based test specification approach, Python and Java are supported for extension, there are Maven and Hudson plugins, and there are a wide range of plugins for many common testing tools like Selenium, Swing, iOS, Android, and support for OS integration (processes), XML, HTTP, SOAP, etc.

Seems like a good choice.  So let’s take a look at it!

In this earlier post, we created a basic web application from the WebLogic Server ‘Basic WebApp’ Maven archetype.  Let’s take that application and add some tests to it.  We will do this using the Robot Framework Maven Plugin and the Selenium Library.

So the first thing we want to do, is add the Robot Framework Maven Plugin to our pom.xml by adding a new plugin element to the build section.  Here is what we need to add:


<plugin>
  <groupId>org.robotframework</groupId>
  <artifactId>robotframework-maven-plugin</artifactId>
  <version>1.2</version>
  <executions>
    <execution>
      <id>robot</id>
      <phase>integration-test</phase>
      <goals>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
</plugin>

This tells Maven that we want to run the Robot Framework during the integration-test phase of the build.  That is all the configuration we need.  As with most things in Maven, there are a set of sensible defaults that we can follow to avoid any additional configuration burden.  In this case, we will take advantage of two – the Robot Framework plugin looks for libraries in src/test/resources/robotframework/libraries and tests in src/test/robotframework/acceptance.
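Setting up those default locations in the project is just a matter of creating the directories (run from the root of the web application project):

```shell
# Create the default directories the Robot Framework Maven plugin scans;
# run this from the root of the project (next to pom.xml).
mkdir -p src/test/resources/robotframework/libraries   # test libraries go here
mkdir -p src/test/robotframework/acceptance            # test case files go here
```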

We are going to use the SeleniumLibrary.  You can download the library from here.  Inside the download you will find a directory called robotframework-seleniumlibrary-2.9.1/src/SeleniumLibrary.  You need to copy that directory (i.e. SeleniumLibrary) into your project under src/test/resources/robotframework/libraries.

You can find documentation for this library here.

Now we can write a test.  To do this, we just need to create a file in src/test/robotframework/acceptance.  Let’s call it Basic_Test.txt.  Here is how we specify the test using Robot Framework:

*** Settings ***

Library                             SeleniumLibrary
Suite Setup                         Start Selenium Server
Suite Teardown                      Stop Selenium Server

*** Test Cases ***

Basic Test
   Open Browser                     http://localhost:7001/basicWebapp/
   Page Should Contain              Basic Webapp
   Page Should Contain Textfield    j_idt10:name
   Page Should Contain Textfield    j_idt10:amount
   Page Should Contain Button       j_idt10:j_idt18
   Input Text                       j_idt10:name                            Fred
   Input Text                       j_idt10:amount                          39.99
   Click Button                     j_idt10:j_idt18
   Wait Until Page Loaded
   Page Should Contain              The money have been deposited to Fred, the balance of the account is 39.99
   Close Browser

Let’s take a look at how this test is specified.  Firstly, you will see that the format is reasonably simple.  Sections are marked with headings which start and end with three asterisks.  This test has two sections:

  • Settings – tells Robot Framework about any libraries we want to use and any global actions to take.  In this case, we want to use the SeleniumLibrary and we want to start the Selenium server before running the tests, and then stop it after all tests.
  • Test Cases – this section lists all of the test cases.  In this example we only have one, called Basic Test, but we could have as many as we like listed here.

Test Cases are written using keywords.  The keywords are the first thing on each line, for example Open Browser, Page Should Contain, and so on.  After the keywords, there are (sometimes) parameters.

Robot tests are written in plain text files like this.  You do not actually have to line everything up in a table as shown here; the minimum is two spaces of indentation before the keyword and at least two spaces between the keyword and each parameter.

You can look up details of what each keyword does, and what others are available in the library documentation.

Keywords like Click Button take a parameter called a ‘locator’ – simply put, this can be the id or name attribute of the HTML element.
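For example, this (with a made-up locator value, not one from our application) would click a button whose markup carries that value in either its id or its name attribute:

```
Click Button    depositButton    # matches id="depositButton" or name="depositButton"
```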

So this test will do the following:

  • Open a browser and go to the application
  • Check that the page shows up correctly
  • Fill in some data and take an action
  • Check that the expected outcome occurred

To run the tests, all we need to do is execute Maven:

mvn verify

In the output you will see the tests running and output like this:


[INFO] --- robotframework-maven-plugin:1.2:run (robot) @ my-basic-webapp ---
==============================================================================
Acceptance
==============================================================================
Acceptance.Basic Test
==============================================================================
Basic Test                                                            | PASS |
------------------------------------------------------------------------------
Acceptance.Basic Test                                                 | PASS |
1 critical test, 1 passed, 0 failed
1 test total, 1 passed, 0 failed
==============================================================================
Acceptance                                                            | PASS |
1 critical test, 1 passed, 0 failed
1 test total, 1 passed, 0 failed
==============================================================================

Robot also produces a report which you can view in your browser:

[Screenshot: Robot Framework test report]

So that is a gentle introduction to Robot Framework.  In future posts we will see how we can expand its usage to orchestrate complex acceptance tests, and how we can write our own keywords and extensions.

Enjoy!


New (HotSpot) JDK adds JRMC and Flight Recorder

Oracle has just released JDK 7u40, which includes some pretty useful new features – most notable for me is the inclusion of Java Mission Control (the successor to JRockit Mission Control) and Flight Recorder, giving us a lot more tools for recording and analyzing performance information for the JDK, both live and after the fact.  You can read all about it here.
