A JMS topic is used to publish a message so that multiple subscribers can subscribe to the same topic.
But there can be a situation where multiple businesses are publishing messages to a single topic, and the requirement is that, based on some particular parameter, only one service should consume the messages published to the topic.
This kind of situation can be handled using the advanced JMS features in OSB, which allow us to define a message selector as shown below. The message selector is used to match messages against a condition.
It works in the following way.
Suppose we are publishing a message to a topic with message type='App1'.
We can define this in the mediator as shown below.
Now this publishes messages to the topic, and our requirement is that only the one service whose message type matches 'App1' should subscribe to them. In that case we go to the advanced configuration of the OSB JMS transport and define the same message type as the message selector, as shown below.
This configuration allows only this particular service to be invoked when a message is published to the topic.
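A JMS message selector is an SQL-92-style predicate that the broker evaluates against each message's properties, delivering a message only to subscribers whose selector matches. The following plain-Java sketch (class and property names are illustrative, not OSB or JMS API) simulates that filtering for the messageType='App1' case described above:

```java
import java.util.List;
import java.util.Map;

public class SelectorDemo {
    // Evaluate a simplified "property = 'value'" selector against message
    // properties, mimicking how a broker filters topic messages per subscriber.
    static boolean matches(String selector, Map<String, String> props) {
        String[] parts = selector.split("=");           // e.g. "messageType = 'App1'"
        String key = parts[0].trim();
        String expected = parts[1].trim().replace("'", "");
        return expected.equals(props.get(key));
    }

    public static void main(String[] args) {
        // Two messages published to the same topic with different property values
        List<Map<String, String>> topic = List.of(
                Map.of("messageType", "App1"),
                Map.of("messageType", "App2"));

        String subscriberSelector = "messageType = 'App1'";
        long delivered = topic.stream()
                .filter(m -> matches(subscriberSelector, m))
                .count();
        System.out.println("delivered: " + delivered);  // only the App1 message
    }
}
```

In a real setup the publisher would set the property with javax.jms.Message.setStringProperty, and the selector string would be supplied when the subscriber is created.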
The views expressed on this blog are my own and do not necessarily reflect the views of any organisations owning these products. I keep doing R&D with different products in and around the middleware stack, and these posts are a result of that. Most of the posts are results of my own experiments or of ideas taken from other blogs. If you feel any content is not right, you can comment to have that post removed. This blog uses the default features and cookies of blogspot.com.
Thursday, March 29, 2012
Wednesday, March 28, 2012
Version of Eclipse supported with different versions of OSB
For version 11.1.1.2 of OSB - Oracle Enterprise Pack for Eclipse (OEPE) oepe_11gR1PS2
Oracle Service Bus 11g Release 1 (11.1.1.4.0) uses the Oracle Enterprise Pack for Eclipse 11.1.1.5.0 Integrated Development Environment (IDE).
as per the documentation; however, it also supports OEPE 11.1.1.6.
Referring to various documents, I have listed the matching versions of OSB, WebLogic Server, and OEPE:
OSB         WebLogic    OEPE
11.1.1.6.0  10.3.6      11.1.1.8.0
11.1.1.5.0  10.3.5      11.1.1.7.2
11.1.1.4.0  10.3.4      11.1.1.6.1
11.1.1.3.0  10.3.3      11.1.1.5
Tuesday, March 13, 2012
Coherence set up in OSB
Oracle Service Bus has built-in functionality for caching data using Coherence.
So a prerequisite for result caching is that you have Coherence set up at your end.
The cache is enabled for business services only, not for proxy services.
Let's take an example to understand what it exactly does.
Suppose you have a business service that calls a standby database to query data. In a real-life scenario, you can take it as customer details.
If the same customer calls this service again and again, ideally it will invoke the business service each time; this involves calling the adapter and getting the result, which takes time in seconds. But we already know that this is passive data that is not going to change, so we can store the customer's records in a cache. The next time the customer tries to get its data, we don't have to call the adapter and run the procedure; instead, the data comes directly from the cache.
So there are two important things to notice here. First, our business service is invoked only once for a particular customer, and then the result is stored in the cache.
Second, accessing the data from the cache is much faster than calling the procedure to get it.
As I mentioned, the data is stored per customer, so the customer must have some unique attribute against which the cache can store the value. That particular value is the cache key.
So, as you can see, the cache key is used to store the value in the cache.
Now I will go to a real service and show you what exactly happens in a real-time scenario.
You can go to the service health page, where we have two statistics views for the proxy service:
Current Aggregation Interval and Since Last Reset.
We will use the second option to see how our process behaves. For that purpose, first of all reset the statistics from the OSB console.
Test the proxy service that calls this particular business service 10 times, and then check the statistics.
You will find that both the proxy service and the business service have been called 10 times in the stats. This is because we have not yet enabled the cache.
Now we will reset the statistics again and configure the result cache.
To configure the result cache:
Go to the business service.
Go to the advanced settings and tick Result Caching Supported.
Once set, it will ask for the cache key; it should be the unique key for the customer.
Let's say it is the customer id, so your cache key will look like this:
$body/CustomerUniqueKey
Now save the configuration and call the proxy service 10 times again.
This time, if you go to the statistics, you will find that the proxy service has been called 10 times but the business service only once.
Also note the timing difference: the aggregated time interval is much lower in this case because of result caching.
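The behaviour above can be sketched in plain Java (an illustrative model, not the Coherence or OSB API): a map keyed by the customer's unique value stands in for the result cache, and a counter shows the business service being hit only once for ten identical requests.

```java
import java.util.HashMap;
import java.util.Map;

public class ResultCacheDemo {
    static int backendCalls = 0;   // counts business-service invocations

    // Stand-in for the business service that queries the database
    static String fetchCustomer(String customerId) {
        backendCalls++;
        return "details-for-" + customerId;
    }

    static final Map<String, String> cache = new HashMap<>();

    // Proxy behaviour with result caching: the map key plays the role of
    // the $body/CustomerUniqueKey cache-key expression from the post.
    static String getCustomer(String cacheKey) {
        return cache.computeIfAbsent(cacheKey, ResultCacheDemo::fetchCustomer);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10; i++) {
            getCustomer("CUST-001");   // same customer calls 10 times
        }
        // the backend is hit only once; the other 9 calls are served from cache
        System.out.println("backend calls: " + backendCalls);
    }
}
```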
Tuesday, March 06, 2012
Transaction in SOA 11g
Executive Summary
Oracle SOA Suite is a product that provides complete functionality for implementing end-to-end integration scenarios.
This paper focuses on the importance and applicability of one of the most important concepts of SOA, i.e. transactions: how does the business flow behave, does it go in a single thread or is it processed in multiple threads?
In an ideal scenario, we expect that whenever we are making a transaction, if any error occurs in the middle of the process, the whole process should roll back to the point from which it started. Especially in the case of a monetary transaction, we want the whole transaction to be rolled back to the initial point.
In this paper we will see a demo scenario showing what errors occur when the process runs in different transactions, and how we can overcome them by making everything run in a single transaction. Transactions are mainly considered in the case of fault propagation.
Transactions in SOA Suite
A transaction enables an application to coordinate a group of messages for production and consumption, treating messages sent or received as an atomic unit.
When an application commits a transaction, all of the messages it received within the transaction are removed from the messaging system and the messages it sent within the transaction are actually delivered. If the application rolls back the transaction, the messages it received within the transaction are returned to the messaging system and messages it sent are discarded.
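That commit/rollback contract can be sketched in a few lines of plain Java (an illustrative model, not a JMS implementation): received messages are only permanently removed on commit, and sent messages are only delivered on commit.

```java
import java.util.ArrayList;
import java.util.List;

public class TransactedSessionDemo {
    final List<String> queue = new ArrayList<>();        // the messaging system
    final List<String> received = new ArrayList<>();     // consumed in this txn
    final List<String> pendingSends = new ArrayList<>(); // produced in this txn
    final List<String> delivered = new ArrayList<>();    // visible to consumers

    void receive() { received.add(queue.remove(0)); }
    void send(String m) { pendingSends.add(m); }

    // Commit: consumed messages stay removed, produced messages are delivered
    void commit() { delivered.addAll(pendingSends); received.clear(); pendingSends.clear(); }

    // Rollback: consumed messages return to the queue, produced ones are discarded
    void rollback() { queue.addAll(received); received.clear(); pendingSends.clear(); }

    public static void main(String[] args) {
        TransactedSessionDemo s = new TransactedSessionDemo();
        s.queue.add("order-1");
        s.receive();
        s.send("invoice-1");
        s.rollback();                    // failure mid-transaction
        System.out.println(s.queue);     // the consumed message is back
        System.out.println(s.delivered); // nothing was actually sent
    }
}
```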
In this document we will see how transaction semantics behave in Oracle BPEL Process Manager.
Need for transaction in SOA Suite
Oracle BPEL Process Manager by default creates a new transaction on a request basis. That is, if a transaction exists, it is suspended, and a new transaction is created. Upon completion of the child (new) transaction, the master (suspended) transaction resumes.
However, if the request is asynchronous (that is, one-way), the transaction is either:
• Inherited for insertion into the dehydration store (table dlv_message).
• Enlisted transparently into the transaction (if one exists).
There is no message loss. Either the invocation message is inserted into the dehydration store for processing or the consumer is notified through a fault.
In release 10.1.3.x, there were several properties to set on the consuming process (that is, on the partner link) and the providing process. This enabled you to chain an execution into a single global transaction. On the consuming side, you set transaction=participate on the partner link binding in the bpel.xml file. On the providing side, you set transaction=participate in the configuration section of bpel.xml.
In release 11g, you only need to set a single transaction property on the BPEL component being called (known as the callee process). You add bpel.config.transaction to the BPEL process service component section in the composite.xml file (note the required prefix bpel.config.). This property configures the transaction behavior for BPEL instances with initiating calls.
Additionally, we will see how the global inbound retry parameter for inbound adapters in BPEL Process Manager affects the flow of the process.
Problem definition
First of all, we will create a scenario to understand the problem when the process does not execute in a single transaction. In this sample I am replicating the scenario using the following logic.
There is an inbound database adapter which is being polled. Upon arrival of data in the table, the process is triggered; within the process (the business flow), some transformation happens as per the target system requirement, and this data is written to another table in the target database. Once the database is updated, in the next step of the business process a file is also written to the file system directory using a file adapter.
Now suppose there is a situation where you have put a record in the source table and the flow has started; the transformation happened, and then, while trying to write the record to the file system, some issue occurred (for simulation we will revoke the write privilege from the file location). Ideally, you would expect the process to roll back, that is, your data should not be written to the target table and should go back to the source table. But if the process runs in different transactions, you will find that the process throws an error at the stage where it tries to write the record to the file system, yet the record from the source is deleted and the target table is updated. Obviously, you don't want this. We will see how this happens using different retry parameters at different adapter levels to understand how the process behaves.
High-level solution
We will make the process work in a single transaction. Additionally, we will define a global inbound retry parameter which ensures that the inbound adapter keeps retrying the record until the whole process works fine.
Solution details
The following sections detail the setup required to recreate the problem scenario and then the solution steps.
Setups
SOA Suite 11.1.1.5
Oracle XE 11.2
RCU 11.1.1.5
Oracle WebLogic Server 10.3
JDeveloper 11.1.1.5
Business flow
Steps to recreate the issue.
1. We will create two tables, one for the source and one for the target. For demo purposes we will create simple tables with two columns.
Source Table:- create table source (name varchar(20), EmployeeId varchar(20));
Target Table:- create table target (EmpName varchar(20),Id varchar(20));
2. Creating the sub-process: create a simple synchronous BPEL process which takes two input parameters from the client and updates them in the target table of the database. My process looks something like this.
On top I have the composite, and at the bottom I have the BPEL process, which assigns the input data to the adapter to insert into the target database. Deploy this process to your application server to get a concrete WSDL URL for the sub-process.
3. Creating the connection pool and the data source
Create a data source from the Admin Console. This will contain the details of the database you want to connect to.
Create a new outbound connection pool. This provides a JNDI name to be associated with the data source; this JNDI name is used in the design-time configuration.
4. Creating the Main Process.
In the main process we will create an inbound DB adapter; then, using the partner link feature of BPEL, we will call the sub-process that we have already defined. We will then create a file adapter to write the file to the file system. My final process looks something like this.
Now, with write privileges on the folder, the process completes successfully and we get a completed instance as shown below.
Now, in order to recreate our issue, we will revoke the write privilege from the folder location where the file adapter is writing. This results in a faulted instance as shown below.
But as you can see, the first invoke has already updated the record and completed successfully, even though the whole flow is not complete. This means that our record has been taken from the source table and updated in the target table, but still our process is not complete. So if you check the end systems, you will see this result.
The expected behavior is that if the file cannot be written, the whole process should roll back to the initial point; that is, the target should not be updated and the record should not be deleted from the source. This happens because the target database is being updated in a separate transaction.
Recovery
In order to recover from this situation, we need to make the process work in a single transaction, so we will see how we can do that.
We will add the following property to the composite.xml of our sub-process, inside the BPEL process service component section:
<property name="bpel.config.transaction" many="false" type="xs:string">required</property>
Now we need to redeploy the process for these changes to take effect. If you now test the process, you will find that the target table is not updated, but the record is still deleted from the source. So if you check the records, you will get the following result.
This time, as you can see, the target is not updated, but at the same time the record is deleted from the source table. This happens because we have designed our process as an asynchronous process, which by default does not retry at the inbound adapter. So it just tried once and then faulted. We need to make our process synchronous in order to make retries happen at the inbound level.
As we know, in a synchronous process the input and output lie in the same operation. We will make a change in our inbound WSDL file and add one more output tag. Initially my WSDL had the following entry.
I will add one more output message type to the operation to make it a synchronous process. So now my WSDL has the following entry.
Further, we need to change our process as well, since it is now a synchronous process; hence I will include a reply activity at the end of my process and send a reply to the client. My new process with the changes appears like this.
Now we need to take a few more things into consideration.
The following properties are provided by default for the inbound DB adapter in composite.xml:
<property name="jca.retry.count">2147483647</property>
<property name="jca.retry.interval">1</property>
<property name="jca.retry.backoff">2</property>
<property name="jca.retry.maxInterval">120</property>
So by default the inbound adapter retries 2147483647 times. Please note that this is the default value generated by the wizard when we created the adapter. It practically means that the adapter retries an infinite number of times.
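Under the usual exponential-backoff reading of these values (an assumption; check the adapter documentation for the exact algorithm), the wait between retries starts at jca.retry.interval seconds, is multiplied by jca.retry.backoff after each attempt, and is capped at jca.retry.maxInterval. A small sketch of the resulting schedule:

```java
public class RetryBackoffDemo {
    // Successive retry delays for a given interval, backoff multiplier,
    // and maximum interval (all in seconds)
    static int[] delays(int interval, int backoff, int maxInterval, int attempts) {
        int[] out = new int[attempts];
        int delay = interval;
        for (int i = 0; i < attempts; i++) {
            out[i] = delay;
            delay = Math.min(delay * backoff, maxInterval);
        }
        return out;
    }

    public static void main(String[] args) {
        // Defaults from the post: jca.retry.interval=1, jca.retry.backoff=2,
        // jca.retry.maxInterval=120 (jca.retry.count is effectively infinite)
        for (int d : delays(1, 2, 120, 10)) System.out.print(d + "s ");
        System.out.println(); // 1s 2s 4s 8s 16s 32s 64s 120s 120s 120s
    }
}
```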
Now we will test our process again. This time you will find that neither the target table is updated nor the record deleted from the source table. This happens because the process is now in a single transaction, so each time it starts by polling the database as per the polling interval and goes through to the last step to verify whether the transaction can reach the end. If it finds a problem at any point, it will not commit, and the process rolls back to the initial point; i.e., our record is not deleted from the source table, which is the functionality we wanted to achieve. So our records will show the following values.
When retry parameters are not provided, the BPEL engine takes this value from the server configuration, which is again set to infinite by default.
This value is called GlobalInboundJcaRetryCount.
You can set this value from your EM console in case you don't want to use the retry feature in your composite.
You can set it from the following location in your EM console:
Go to SOA-Infra
Administration->System Mbean Browser
Here go to oracle.as.soainfra.config
Then your server, then AdapterConfig, and then adapter.
Now on the right-hand side you will find a property called
GlobalInboundJcaRetryCount
You can set this value to -1.
The diagram below shows the location and the definition for the parameter.
Results
When the main process and the sub-process are in the same transaction, the process is forced to run as a single unit of work, resulting in a rollback to the initial point if any error occurs in between.
Further Scope
Transactions are mainly considered in the case of fault propagation; we can further verify the following scenarios using the same process.
Main process calls a sub-process that has bpel.config.transaction set to requiresNew:
• If the sub-process replies with a fault (using a reply activity), the sub-process transaction is saved, and the main process gets the fault and can catch it.
• If the sub-process throws a fault that is not handled (using a throw activity), the sub-process transaction is rolled back, and the main process gets the fault and can catch it.
• If the sub-process replies with a fault (FaultOne) and then throws a fault (FaultTwo), the sub-process transaction is rolled back, and the main process gets FaultTwo.
• If the sub-process throws a bpelx:rollback fault, the sub-process transaction is rolled back, and the main process gets a remote fault.
Main process calls a sub-process that has bpel.config.transaction set to required:
• If the sub-process replies with a fault (using a reply activity), the main process gets the fault and can catch it. The caller owns the transaction; therefore, if it catches the fault, the transaction is committed. If the caller does not handle it, a global rollback occurs.
• If the sub-process throws a fault (using a throw activity), the main process gets the fault and can catch it.
• If the sub-process replies with a fault (FaultOne) and then throws a fault (FaultTwo), the main process gets FaultTwo.
• If the sub-process throws a bpelx:rollback fault, the main process gets its transaction rolled back; there is no way to catch it. This fault cannot be handled.
Behavior of the bpel.config.transaction parameter for a process:
• Request/response (initiating) invocations: with required, the caller's transaction is joined (if there is one) or a new transaction is created (if there is not one); with requiresNew, a new transaction is always created and an existing transaction (if there is one) is suspended.
• One-way initiating invocations in which bpel.config.oneWayDeliveryPolicy is set to sync: with required, invoked messages are processed using the same thread in the same transaction; with requiresNew, a new transaction is always created and an existing transaction (if there is one) is suspended.
The transaction concept is also applicable to mediator in SOA Suite.
In an inbound scenario, the Mediator either participates in an existing transaction or, if a transaction is not present, starts a new one. In general, when a mediator is invoked via binding.ws it creates a new transaction, and in other cases it joins the existing transaction.
In the case of sync routing rules, they are all executed in the same transaction, and if an error occurs, a rollback is issued.
Note: in this case, even a fault policy that is configured against a mediator fault will NOT trigger.
In the case of async routing rules, each rule is executed in a new transaction; if an error occurs, a rollback happens, and a fault policy can be configured to react to it.
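The sync-versus-async difference can be sketched in plain Java (an illustrative model, not the Mediator API): sync rules share one transaction, so one failure undoes everything; async rules each get their own, so one failure leaves the others committed.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class RoutingRulesDemo {
    // Each "rule" appends its effect to a transaction buffer.
    static List<String> runSync(List<Consumer<List<String>>> rules) {
        List<String> txn = new ArrayList<>();   // one shared transaction
        try {
            for (Consumer<List<String>> rule : rules) rule.accept(txn);
            return txn;                          // commit all effects together
        } catch (RuntimeException e) {
            return List.of();                    // rollback: every effect is undone
        }
    }

    static List<String> runAsync(List<Consumer<List<String>>> rules) {
        List<String> committed = new ArrayList<>();
        for (Consumer<List<String>> rule : rules) {
            List<String> txn = new ArrayList<>(); // a new transaction per rule
            try {
                rule.accept(txn);
                committed.addAll(txn);            // commit only this rule
            } catch (RuntimeException e) {
                // rollback affects this rule alone; a fault policy could react here
            }
        }
        return committed;
    }

    public static void main(String[] args) {
        List<Consumer<List<String>>> rules = List.of(
                t -> t.add("rule-1"),
                t -> { throw new RuntimeException("boom"); },
                t -> t.add("rule-3"));
        System.out.println("sync : " + runSync(rules));   // [] -- all rolled back
        System.out.println("async: " + runAsync(rules));  // [rule-1, rule-3]
    }
}
```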
Business benefits
This ensures that only when the process completes is the status changed. If an error occurs in between, the process rolls back to the initial point. No transaction is lost, and there is no need to resubmit the record.
Drawbacks
If you set the retry parameter to infinite, this causes unnecessary logging, so you should design the retry parameters as per the requirements of the project.
Conclusion
Data will not be lost, and the user will not need to resubmit it. This concept is very important, especially when you are doing a monetary transaction over the internet.
Reproduce issue in local Machine
1. Run the scripts to create the required tables.
2. Import the two projects into JDeveloper.
3. Create the data sources required for interaction with the database.
4. Change the folder location where the data has to be written.
5. Deploy the process and test it.
(You can verify the behavior by removing the bpel.config.transaction parameter from the sub-process and redeploying, to notice the changes yourself.)
Oracle SOA Suite is a product which provides a complete functionality for implementing end to end scenario.
This paper focuses on the importance and applicability of one of the most important concepts of SOA i.e. Transactions-How the business behave, Does it go in a single thread or it is processed in multiple threads.
In an ideal scenario we expect that when ever we are making a transaction if there occurs any error in the middle of the process the whole process should roll back to the initial point from where it is started.Specially in case of monetary transaction we want that the whole transaction should be rolled back to the initial point.
In this paper we will see a demo scenario where in we will see what error occurs when the process runs in different transcations and how we can overcome them by making them run in a single transaction.Transactions are mainly considered in case of fault propagation.
Transactions in SOA Suite
A transaction enables an application to coordinate a group of messages for production and consumption, treating messages sent or received as an atomic unit.
When an application commits a transaction, all of the messages it received within the transaction are removed from the messaging system and the messages it sent within the transaction are actually delivered. If the application rolls back the transaction, the messages it received within the transaction are returned to the messaging system and messages it sent are discarded.
In the document we will see how transaction semantics behave in Oracle BPEL Process Manager.
Need for transaction in SOA Suite
Oracle BPEL Process Manager by default creates a new transaction on a request basis. That is, if a transaction exists, it is suspended, and a new transaction is created. Upon completion of the child (new) transaction, the master (suspended) transaction resumes.
However, if the request is asynchronous (that is, one-way), the transaction is either:
• Inherited for insertion into the dehydration store (table dlv_message).
• Enlisted transparently into the transaction (if one exists).
There is no message loss. Either the invocation message is inserted into the dehydration store for processing or the consumer is notified through a fault.
In release 10.1.3.x, there were several properties to set on the consuming process (that is, on the partner link) and the providing process. This enabled you to chain an execution into a single global transaction. On the consuming side, you set transaction=participate on the partner link binding in the bpel.xml file. On the providing side, you set transaction=participate in the
In release 11g, you only must set a new transaction property on the BPEL component being called (known as the callee process). You add bpel.config.transaction into a BPEL process service component section in the composite.xml file (note the required prefix of bpel.config.). This property configures the transaction behavior for BPEL instances with initiating calls.
Additionally we will see how the global inbond parameter for inbound adapter in BPEL process manager affects the flow of process.
Problem definition
We will create a scenario first of all to understand what the problem is when the process do not executes in a single transaction.In this sample I am replicating this scenario by using following logic.
There is an inbound database adapater which is being polled.Upon arrival of data in the table the process is triggered and then within the process (business flow) some transformation happens as per the target system requirement and again this data is write to another table in target database.Once database is update in the next flow of business process a file is also write to the system file directory using a file adapter.
Now Let suppose there is a situation where in you have put a record in the source table and the flow started, transformation happened and then while trying to write the record in file system some issue happened(for simulation we will revoke the write privilege from the file location.).So ideally you will expect the process to roll back that is your data should not be written to target table and should go back to the source table.But if the process is in different transaction you will find that the process will throw an error at stage when it is trying to write a record in file system,The record from the source will be deleted and the target table will be updated.Obviously you don’t want this.We will see how this happens using different retry parameters at different adapter level to understand how the process behave.
High-level solution
We will make the process work in a single transaction; Additionally we will define a global inbound parameter which will ensure that the inbound adapter keeps on polling the record until the whole process works fine.
Solution details
The following section details out the setups required to recreate the problem scenario and then the solution steps.
Setups
SOA Suite 11.1.1.5
Oracle XE 11.2
RCU 11.1.1.5
Oracle WebLogic Server 10.3
Jdeveloper 11.1.1.5
Business flow
Steps to recreate the issue.
1. We will create two tables one for source and one for target.For demo purpose we will create a simple table with two columns.
Source Table:- create table source (name varchar(20), EmployeeId varchar(20));
Target Table:- create table target (EmpName varchar(20),Id varchar(20));
2>Creating Sub Process-Create a simple synchronous BPEL process which will take two input parameter from client and will update it in the target table of the database.My process will look something like this.
On top I have the composite and in bottom I have the BPEL process which is assigning the in put data to the adapter to insert in to the target database.Deploy this process to your application server to get a concrete wsdl url for the Sub-Process.
3. Creating the Connection pool and the data Source
Create a data-Source from Admin console.This will contain the details of the database you want to connect.
Create a new Outbound Connection Pool,This will provide a Jndi name to be associated with the data source using the jndi name in design time configuration.
4. Creating the Main Process.
In the main process we will create an inbound db adapter then using the partnerlink feature of BPEL We will call the SubProcess that we have already defined.We will then create a file Adapter to write the file to a file system.My Final process will look something like this.
Now with write privilege in the folder the process will complete successfully and we will get a completed scenario as shown below.
Now in order to create our issue we will revoke the write privilege from the folder location where file adapter is writing.This will result in a faulted instance as shown below.
But as you can see the first invoke has already updated the record and is completed successfully however the whole flow is not complete.This mean that our record is taken from the source table and is update in the target table but still our process is not complete.So if you will check the end system this result will come.
The expected behavior is if file is not writing the whole process should roll back to the initial point.That is the target should not be updated and the record should not be deleted from source.This happens because the Target database is getting updated in a separate transaction.
Recovery
In order to recover from this situation we need to make the process work in a single transaction so we will see how we can do that.
We will add the following property in the composite of our Sub process.
Now we need to redeploy the process to make these changes take effect.Now if you will test this process you will find that the target table is not updated but the record is deleted from the source. So now if you will check the records you will get the following result.
This time as you can see you can see that the target is not update but at the same time the record is deleted from the source table. This happens because we have designed our process as a asynchronous process which by default doesn’t retry for inbound adapter.So it just checked once and then faulted.So we need to make our process a synchronous process in order to make retry happens at inbound level.
As we know that in a synchronous process the input and output lies in the same operation.We will make a change in our inbound WSDL file and will add one more output tag so initially my WSDL was having following entry.
I will add one more output message type in the operation to make it a synchronous process.So now my WSDL will have following entry
Further we need to change our process also as this is now a synchronous process ,hence I will include a reply activity at the end of my process and will send a reply to client.So My new process with the changes will appear like this.
Now further we need to take in to consideration few things.
Following property is by default provided for the inbound dbadapter.
So by default the inbound adapter retries 2147483647 times. Note that this is the default value generated by the wizard when the adapter is created; in practice it means the adapter retries an effectively infinite number of times.
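For reference, a sketch of how such a retry property might appear on the inbound service binding in composite.xml (the service and .jca file names are assumptions; verify the exact property names against your own generated composite):

```xml
<service name="PollDbService">
  <binding.jca config="PollDbService_db.jca">
    <!-- default generated value: effectively infinite retries -->
    <property name="jca.retry.count" type="xs:int" many="false" override="may">2147483647</property>
  </binding.jca>
</service>
```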
Now test the process again. This time you will find that neither the target table is updated nor the record is deleted from the source table. Because the process now runs in a single transaction, every polling cycle starts from the database poll and runs through to the last step to verify that the transaction can complete end to end. If a problem is found at any point, nothing is committed and the process rolls back to its initial point, i.e. the record is not deleted from the source table, which is the behavior we wanted to achieve. The record will show the following values.
When retry parameters are not provided, the BPEL engine takes the value from the server configuration, which is again set to infinite by default.
This setting is called GlobalInboundJcaRetryCount.
You can set this value from your em console in case you do not want to use the retry feature in your composite.
You can set it from the following location in your em console:
Go to SOA-Infra.
Choose Administration -> System MBean Browser.
Navigate to oracle.as.soainfra.config, then your server, then AdapterConfig, and then adapter.
On the right-hand side you will find a property called GlobalInboundJcaRetryCount.
You can set this value to -1
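The same MBean attribute can also be changed from a script. Below is a hypothetical WLST (Jython) sketch; the admin URL, credentials, and especially the MBean ObjectName are assumptions for illustration and will vary by install, so check the actual ObjectName in the System MBean Browser first:

```
# WLST sketch: connect to the admin server and update the AdapterConfig MBean
# (URL, credentials and ObjectName below are placeholders, not verified values)
connect('weblogic', 'welcome1', 't3://localhost:7001')
custom()  # switch to the custom MBean tree
cd('oracle.as.soainfra.config')
cd('oracle.as.soainfra.config:name=adapter,type=AdapterConfig,Application=soa-infra')
set('GlobalInboundJcaRetryCount', -1)
disconnect()
```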
The diagram below shows the location and the definition for the parameter.
Results
When the main process and the SubProcess are in the same transaction, the process runs in a single thread, so any error in between rolls the whole process back to its initial point.
Further Scope
Transactions mainly matter in the context of fault propagation; we can further verify the following scenarios using the same process.
Main Process Calls SubProcess That Has bpel.config.transaction Set to requiresNew
If the SubProcess... then the SubProcess transaction... and the Main Process...
1> Replies with a fault (that is, it uses <reply>).
2> Throws a fault that is not handled (that is, it uses <throw>).
3> Replies back with a fault (FaultOne), and then throws a fault (FaultTwo): the SubProcess transaction is rolled back and the Main Process gets FaultTwo.
4> Throws a bpelx:rollback fault (that is, it uses <throw>).
MainProcess Calls SubProcess That Has bpel.config.transaction Set to required
If the SubProcess... then the MainProcess...
1> Replies with a fault (that is, it uses <reply>).
2> Throws a fault (that is, it uses <throw>).
3> Replies back with a fault (FaultOne), and then throws a fault (FaultTwo): the MainProcess gets FaultTwo.
4> Throws a bpelx:rollback fault (that is, it uses <throw>).
Behavior of bpel.config.transaction parameter for a Process
Request/response (initiating) invocations:
- With bpel.config.transaction set to required: the caller's transaction is joined (if there is one) or a new transaction is created (if there is not one).
- With bpel.config.transaction set to requiresNew: a new transaction is always created and an existing transaction (if there is one) is suspended.
One-way initiating invocations in which bpel.config.oneWayDeliveryPolicy is set to sync:
- With bpel.config.transaction set to required: invoked messages are processed using the same thread in the same transaction.
- With bpel.config.transaction set to requiresNew: a new transaction is always created and an existing transaction (if there is one) is suspended.
The transaction concept is also applicable to mediator in SOA Suite.
In an inbound scenario the Mediator either participates in an existing transaction or, if none is present, starts a new one. In general, when a Mediator is invoked via binding.ws it creates a new transaction; in other cases it joins the existing transaction.
In case of sync routing rules, they will be all executed in the same transaction, and if an error occurs, a rollback will be issued.
Note: in this case even a fault policy configured against a Mediator fault will NOT trigger.
In case of async routing rules, each rule will be executed in a new transaction, and if errors occur, a rollback will happen, and a fault policy can be configured to react to it.
Business benefits
The status is changed only when the whole process completes. If an error occurs in between, the process rolls back to its initial point; no transaction is lost and there is no need to resubmit the record.
Drawbacks
If you set the retry parameter to infinite it will generate unnecessary logging, so the retry parameters should be designed according to the requirements of the project.
Conclusion
Data is not lost and the user does not need to resubmit it. This concept is especially important when you are performing a monetary transaction over the internet.
Reproduce issue in local Machine
1>Run the scripts to create the required tables.
2>Import the two projects into JDeveloper.
3>Create the data sources required for interaction with the database.
4>Change the folder location where the data has to be written.
5>Deploy the process and test it.
(You can check the process by removing the bpel.config.transaction parameter from Subprocess and redeploying to notice the changes yourself.)
Fault Handling Strategies
Executive Summary
Oracle SOA Suite provides a generic fault management framework for handling faults in BPEL processes. If a fault occurs during runtime in an invoke activity in a process, the framework catches the fault and performs a user-specified action defined in a fault policy file associated with the activity. If a fault results in a condition in which human intervention is the prescribed action, you perform recovery actions from Oracle Enterprise Manager Fusion Middleware Control. The fault management framework provides an alternative to designing a BPEL process with catch activities in scope activities.
Fault Handling in SOA Suite
There are two categories of Fault in SOA Suite.
1>Runtime Fault
Runtime faults are the result of logic errors in programming (such as an endless loop); a Simple Object Access Protocol (SOAP) fault occurs in a SOAP call, an exception is thrown by the server, and so on.
2>Business Fault
Business faults are application-specific faults that are generated when there is a problem with the information being processed (for example, when a social security number is not found in the database).
The fault handling framework uses a fault policies file (fault-policies.xml) and a fault policy bindings file (fault-bindings.xml) to handle faults in SOA Suite. These files include condition and action sections for performing specific tasks. They can be stored on the file system or in the Metadata repository (MDS). How these fault policy files work will become clear with an example.
Need for Fault Handling in SOA Suite
When you define a business process you expect it to work smoothly, and ideally it does, but there are situations where unexpected faults occur: a sudden machine shutdown, network issues, and many more. Good programming practice is to define up front how such situations are handled; otherwise they may cause a fatal failure. We cannot predict every problem that may occur, but we can define common error conditions and remedies for them, and for the unpredictable rest we can define a general policy as a remedy. Think of performing an online transaction when the page suddenly shows an error saying you cannot connect. You would prefer a message saying the server is currently under maintenance rather than a raw error on the page. That is why fault handling is required, to deal with situations that are not ideal.
Problem definition
We will create a scenario first of all to understand what happens when we do not handle the fault and then we will try to handle the fault using the fault policy.
We will create a simple process where in we will intentionally throw a fault using throw activity in the Sub Process. This process will be called by the main process. We will see how this fault will be handled in the Business process when there is a Fault handling Framework defined to handle that error.
We will define the two policy files as per our requirement and will design them for one specific error.
High-level solution
We will define two policy files, fault-policies.xml and fault-bindings.xml. These files contain the conditions under which a fault occurs and the action to be taken in case of a fault.
A typical fault policies file has a format like this:
<?xml version="1.0" encoding="UTF-8"?>
<faultPolicies xmlns="http://schemas.oracle.com/bpel/faultpolicy"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<faultPolicy version="0.0.1" id="FusionMidFaults"
xmlns:env="http://schemas.xmlsoap.org/soap/envelope/"
xmlns:xs="http://www.w3.org/2001/XMLSchema"
xmlns="http://schemas.oracle.com/bpel/faultpolicy"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<Conditions>
<faultName xmlns:medns="http://schemas.oracle.com/mediator/faults"
name="medns:mediatorFault">
<condition>
<action ref="MediatorJavaAction"/>
</condition>
</faultName>
<faultName xmlns:bpelx="http://schemas.oracle.com/bpel/extension"
name="bpelx:remoteFault">
<condition>
<action ref="BPELJavaAction"/>
</condition>
</faultName>
<faultName xmlns:bpelx="http://schemas.oracle.com/bpel/extension"
name="bpelx:bindingFault">
<condition>
<action ref="BPELJavaAction"/>
</condition>
</faultName>
<faultName xmlns:bpelx="http://schemas.oracle.com/bpel/extension"
name="bpelx:runtimeFault">
<condition>
<action ref="BPELJavaAction"/>
</condition>
</faultName>
</Conditions>
<Actions>
<!-- Generics -->
<Action id="default-terminate">
<abort/>
</Action>
<Action id="default-replay-scope">
<replayScope/>
</Action>
<Action id="default-rethrow-fault">
<rethrowFault/>
</Action>
<Action id="default-human-intervention">
<humanIntervention/>
</Action>
<Action id="MediatorJavaAction">
<!-- this is user provided class-->
<javaAction className="MediatorJavaAction.myClass"
defaultAction="default-terminate">
<returnValue value="MANUAL" ref="default-human-intervention"/>
</javaAction>
</Action>
<Action id="BPELJavaAction">
<!-- this is user provided class-->
<javaAction className="BPELJavaAction.myAnotherClass"
defaultAction="default-terminate">
<returnValue value="MANUAL" ref="default-human-intervention"/>
</javaAction>
</Action>
</Actions>
</faultPolicy>
</faultPolicies>
Similarly, a typical fault-bindings.xml file has the following format.
<?xml version="1.0" encoding="UTF-8" ?>
<faultPolicyBindings version="0.0.1"
xmlns="http://schemas.oracle.com/bpel/faultpolicy"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<composite faultPolicy="FusionMidFaults"/>
<!--<composite faultPolicy="ServiceExceptionFaults"/>-->
<!--<composite faultPolicy="GenericSystemFaults"/>-->
</faultPolicyBindings>
Solution details
The following section details the setup required to recreate the problem scenario, followed by the solution steps.
Setups
SOA Suite 11.1.1.5
Oracle XE 11.2
RCU 11.1.1.5
Oracle WebLogic Server 10.3
Jdeveloper 11.1.1.5
Business flow
Steps to recreate the issue.
1. Create a Synchronous BPEL process in Jdeveloper.
Assign it some Logical name
Now drag and drop a bpel process in the composite panel as shown
Call this process SubProcess and make it a synchronous process as well.
Join the two processes
2. Now open your SubProcess and drag and drop a Throw activity between the Receive and Reply activities. Double-click the Throw activity and select Remote Fault from the list of System Faults as shown.
Create a fault variable as well. Once you finish the wizard you will find that a WSDL file called RuntimeFault.wsdl has been created with the following content.
<?xml version="1.0" encoding="UTF-8"?>
<definitions name="RuntimeFault"
targetNamespace="http://schemas.oracle.com/bpel/extension"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns="http://schemas.xmlsoap.org/wsdl/">
<message name="RuntimeFaultMessage">
<part name="code" type="xsd:string"/>
<part name="summary" type="xsd:string"/>
<part name="detail" type="xsd:string"/>
</message>
</definitions>
Now drag and drop an Assign activity between the Receive activity and the Throw activity and assign some sample values to the fault variable.
So your SubProcess should look something like this.
3. Go to your MainProcess now.
Drag and drop an Invoke activity and call the SubProcess. Create the corresponding input and output variables for calling the SubProcess.
Now drag and drop an Assign activity after the Invoke. This assigns the output of the SubProcess to the replyOutput of the MainProcess. This step is done intentionally in order to pass the fault to the client of the MainProcess.
So the overall MainProcess will look something like this.
Now that our fault process is ready, we will deploy it to see what behavior we get.
On testing, the process errored out because of the fault.
As you can see, the process is faulted and there is no human intervention available to correct this fault in the process.
Recovery
In order to recover from this situation we will create the fault policy to handle this situation manually through human intervention.
My fault-bindings.xml will look something like this.
<?xml version="1.0" encoding="UTF-8"?>
<faultPolicyBindings version="2.0.1"
xmlns="http://schemas.oracle.com/bpel/faultpolicy"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<component faultPolicy="bpelFaultHandling">
<name>CatchFault</name>
</component>
</faultPolicyBindings>
And my Fault-policies.xml will look like this.
<?xml version="1.0" encoding="UTF-8"?>
<faultPolicies xmlns="http://schemas.oracle.com/bpel/faultpolicy">
<faultPolicy version="2.0.1" id="bpelFaultHandling"
xmlns:env="http://schemas.xmlsoap.org/soap/envelope/"
xmlns:xs="http://www.w3.org/2001/XMLSchema"
xmlns="http://schemas.oracle.com/bpel/faultpolicy"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<Conditions>
<faultName xmlns:bpelx="http://schemas.oracle.com/bpel/extension"
name="bpelx:remoteFault">
<condition>
<action ref="ora-human-intervention"/>
</condition>
</faultName>
</Conditions>
<Actions>
<!-- This action will mark the work item as "pending recovery from console" -->
<Action id="ora-human-intervention">
<humanIntervention/>
</Action>
</Actions>
</faultPolicy>
</faultPolicies>
There are two important things to note.
1> In your fault-bindings.xml you can see an entry like
component faultPolicy="bpelFaultHandling"
This is a reference id that is matched against fault-policies.xml, which has the corresponding entry
faultPolicy version="2.0.1" id="bpelFaultHandling"
The important point is that this reference id must match across the two files.
In my case it is bpelFaultHandling in both.
2> You can also see in your fault-bindings.xml an entry like this:
<name>CatchFault</name>
CatchFault is the name of the BPEL component that catches the fault. In my case the name of the process
catching the fault is CatchFault.
Now we need to make two more additions to our composite to make it aware that fault policies have been added:
<property name="oracle.composite.faultPolicyFile">fault-policies.xml</property>
<property name="oracle.composite.faultBindingFile">fault-bindings.xml</property>
In my demo case I have used human intervention as the action for fault handling.
The two files, fault-policies.xml and fault-bindings.xml, should be stored in the same location as composite.xml, so the file structure will look like this.
And the Composite.xml will have entry like this.
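A sketch of how the two properties might sit in composite.xml when bound at the composite level (the composite name follows this demo; the services, components, and wires are abbreviated):

```xml
<composite name="CatchFault"
           xmlns="http://xmlns.oracle.com/sca/1.0">
  <!-- point the runtime at the two policy files stored next to composite.xml -->
  <property name="oracle.composite.faultPolicyFile">fault-policies.xml</property>
  <property name="oracle.composite.faultBindingFile">fault-bindings.xml</property>
  <!-- services, components and wires omitted -->
</composite>
```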
Now we will test this scenario again by redeploying the process.
Results
Now if you test the process with fault policies in place you will get the following output in case of error.
As you can see now we have one recovery option coming in the result. Click on the recovery activity and you will get the following page.
Here you have multiple options: you can replay, rethrow, abort, or continue with another input as per your business requirement. This allows human intervention to take the necessary action in case of a fault.
Further Scope
We have applied the fault handling concept for human intervention. The same concept can be used for other widely used practices, such as custom Java actions for error handling and the rejection handler.
Business benefits
Your business process will be ready for unexpected errors, so the end user is not caught off guard by a process failure. The same fault policies can be applied to multiple processes if we store them in the Metadata repository.
Conclusion
Fault handling, if defined properly, can be used to handle any kind of error that may come up in any business process.
Reproduce issue in local Machine
1>Download and unzip the file in your local machine.
2>Open JDeveloper and import the .jpr file.
3>Deploy and test the project.
Dynamic Adapter
Executive Summary
Most real-time business processes integrate with external systems: the file system, external databases, external web services, and so on. In most cases we have an endpoint that may vary as per requirement. For example, if you are currently using database A for storage of your data, down the line you may want to use database B instead. At that point you would need to change your code and redeploy the process, which is a lengthy procedure. Ideally an administrator should be able to make such a change from the console without redeploying the process. Hence the idea: if we design our process so that the endpoint of an adapter can be changed dynamically, our purpose is served.
This article describes how to dynamically change the end point of an adapter taking the case of a file adapter.
Dynamic EndPoints in SOA Suite
Oracle SOA Suite provides the JCA architecture for technology adapters. Oracle JCA Adapters expose the properties specific to underlying back-end operations as header elements and allow the manipulation of these elements within a business process.
These properties can be changed from the em console of SOA Suite.
It supports the following adapters: Oracle File and FTP Adapters, Oracle Socket Adapter, Oracle AQ Adapter, Oracle JMS Adapter, Oracle Database Adapter, and Oracle MQ Series Adapter.
Need for Changing the Endpoint of Adapter Dynamically in SOA Suite
Technology changes day by day. We cannot expect the same inputs and outputs to fulfill consumer requirements forever; with time, changes must be made to existing business processes to meet current criteria. Hence the need for dynamically changing the endpoint. Suppose we have a business process that writes records to a certain file location on machine A, and machine A needs to be upgraded because it is outdated. You cannot stop your live business process just because one end system is being upgraded, and at the same time you need to keep all the records written by the adapter. One option is to change the code of your business process to write to a different location and redeploy it, but that requires a scheduled outage of your live business process, which is not acceptable. In that scenario the best option is to change the endpoint of the adapter dynamically: it can be done through the console itself and does not require the business process to be redeployed.
High-level solution
The Oracle File Adapter provides the following JCA properties in its header metadata for an inbound service.
jca.file.FileName: file name
jca.file.Directory: directory name
jca.file.Batch: a unique name for a batch in case of debatching
jca.file.BatchIndex: the batch index for each message within the batch for debatching
jca.file.Size: the file size
jca.file.LastModifiedTime: the last modified time for the file.
In our current scenario we will use the following header property “jca.file.Directory: directory name” to change the value of output directory dynamically in our Business Service.We will create a temporary variable and will assing the value of this variable to the header property “jca.file.Directory: directory name”.Once this process will be deployed,this property will come as a variable in em console and we can change the value dynamically from em console.
Solution details
The following section details out the setups required to recreate the problem scenario and then the solution steps.
Setups
SOA Suite 11.1.1.5
Oracle XE 11.2
RCU 11.1.1.5
Oracle WebLogic Server 10.3
Jdeveloper 11.1.1.5
Business flow
Steps to desing the process.
1. Create an OneWay BPEL process in Jdeveloper.
Assign it some Logical name
Now drag and drop a File Adapter in the right hand side of the Composite swinlane.This will cause a File configuration wizard to come up.
Call this adapter AdapterEndPoint
Let the default interface be there
Choose Write operation to write file
Provide some physical directory in your local machine and specify the name for the file.
For the schema of the message choose the input wsdl file itself
Say Next and Finish and your adapter configuration are complete.
Now join the BPEL process and the File adapter so that you should have a binding created for your adapter.IT should look like this.
Now go iniside your BPEL process drag and drop an invoke activity and make a connection between the invoke and the File adapter.
Drop an assign activity in between the receiveInput and the Invoke activity and and assign the input variable of the client to the input variable of the input variable of File adapter.
After completing this process your BPEL now should look like this.
Now our process is complete.We will deploy it to the server to check if it is working fine i.e. it is writing in to the specific file location.Once it is verified we will go ahead and make changes in our process so that we can change the end points of adapter dynamically.If the flow is working fine you should get following flow instance
Recovery
Now we will modify our process in order to change the adapter endpoint dynamically in the process.Go back to your Jdeveloper and go to the BPEL process that you have designed.
Create a variable with some logical name and make it a string variable.
Now assign some other file location to this new variable you have created.
Now go to your invoke activity,go to properties tab and then search for a property file called jca.file.Directory and in the value assign the new variable which we have created.
These are all the changes that we need to do in our process.Now we will rebuild and redeploy the process to the server.
Results
Now if you test the process with the setting specified above the file will now be written to the new file location that we have specified in our new variable.This means that the adapter properties have a preference over the hardcoded values in the process.We can desing our process so that the new variable can accept values from the client and then we can change the values dynamically.We can also have a rules/decision table where in we can already store the values of differenct location corresponding to some input values and thus can change the values dynamically.
Other than this we also have the feature of changing the value from em console.This can be done in following ways.First of all go back to your BPEL process and remove the changes that we have done for dynamically changing the value (Using a variable and then assigning its value to jca.file.Directory property)
Now Go to your em console.
Click on the process.In the right hand side in dashboard naviage till the bottom of the page,there you will get a property called a AdapterEndPoint as shown below click on that.
Now move to properties tab you will get the following properties file
As you can see we have one property PhysicalDirectory(Write).
You can specify the new location where you want to write your file and save the changes.Make sure the provided directory name already exist and save the changes.Now all the new instances which will get created will be written in the new location which you will specify in this property.
Further Scope
The Dynamic Adapter concept is limited to only technology adapter however for legacy and other adapters we can not use the same concept there we need to use another stricking feature of SOA Suite that is Dynamic PartnerLink.
Business benefits
You need not schedule and outage for the redeployment of your process, the end point can be changed dynamically at run time.
Conclusion
SOA Suite provide properties which can be used to change the end points of an adapter dynamically
Attachment:
IT contains the .jpr file for the project along with all the required artifacts.
Reproduce issue in local Machine
1>Download and unzip the file in your local machine.
2>Open your Jdeveloper and import the .jpr file in to jdeveloper.
3>Change the location of the file as per your local server file structure.
4>Deploy and test the process.
Most real-world business processes integrate with external systems. These integrations may include interaction with a file system, an external database, external web services, and so on. In most cases there is an endpoint, and that endpoint may change as requirements change. For example, if you currently use database A to store your data, down the line you may want to use database B instead. At that point you would have to change your code and redeploy the process, which is a lengthy exercise. Ideally an administrator should be able to make such a change from the console without redeploying the process. Hence the idea: if we design our process so that the endpoint of an adapter can be changed dynamically, our purpose is served.
This article describes how to dynamically change the endpoint of an adapter, taking a file adapter as the example.
Dynamic EndPoints in SOA Suite
Oracle SOA Suite provides the JCA architecture for technology adapters. Oracle JCA Adapters expose the properties specific to underlying back-end operations as header elements and allow the manipulation of these elements within a business process.
These properties can also be changed from the EM console of SOA Suite.
This support covers the following adapters:
Oracle File and FTP Adapters, Oracle Socket Adapter, Oracle AQ Adapter, Oracle JMS Adapter, Oracle Database Adapter, and Oracle MQ Series Adapter.
Need for Changing the Endpoint of Adapter Dynamically in SOA Suite
Technology changes day by day. We cannot expect the same inputs and outputs to satisfy consumers forever; over time an existing business process must change to meet current requirements. That is where dynamically changing the endpoint comes in. Suppose we have a business process that writes records to a certain file location on machine A, and machine A now needs to be upgraded because it is outdated. You cannot stop a live business process just because one of the end systems is being upgraded, yet you still need to keep every record the adapter writes. One option is to change the code of the business process to write to a different location and redeploy it, but that requires a scheduled outage, which is not acceptable for a live process. In that scenario the best option is to change the adapter's endpoint dynamically. This can be done from the console itself and does not require the business process to be redeployed.
High-level solution
The Oracle File adapter exposes the following JCA properties in the header metadata of an inbound service:
jca.file.FileName: the file name
jca.file.Directory: the directory name
jca.file.Batch: a unique name for a batch in case of debatching
jca.file.BatchIndex: the batch index of each message within the batch, for debatching
jca.file.Size: the file size
jca.file.LastModifiedTime: the last modified time of the file
In our scenario we will use the jca.file.Directory header property to change the output directory dynamically. We will create a temporary variable and assign its value to jca.file.Directory. Once the process is deployed, this property appears as an adapter property in the EM console, so the value can also be changed from there.
Solution details
The following sections detail the setup required to recreate the scenario, followed by the solution steps.
Setup
SOA Suite 11.1.1.5
Oracle XE 11.2
RCU 11.1.1.5
Oracle WebLogic Server 10.3
JDeveloper 11.1.1.5
Business flow
Steps to design the process:
1. Create a one-way BPEL process in JDeveloper.
Assign it some logical name.
Now drag and drop a File Adapter into the right-hand swimlane of the composite. This brings up the file adapter configuration wizard.
Call this adapter AdapterEndPoint.
Keep the default interface.
Choose the Write operation to write a file.
Provide some physical directory on your local machine and specify the name for the file.
For the message schema, choose the input WSDL file itself.
Click Next and then Finish, and your adapter configuration is complete.
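At this point the wizard generates a JCA binding file for the adapter. A minimal sketch of what the generated AdapterEndPoint_file.jca might contain is shown below; the directory path and file naming convention are placeholder values, so verify the element names against the file the wizard actually produces in your project.

```xml
<!-- Sketch of the wizard-generated JCA file for an outbound (Write) file
     adapter; PhysicalDirectory and FileNamingConvention values are placeholders. -->
<adapter-config name="AdapterEndPoint" adapter="File Adapter"
                xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
  <connection-factory location="eis/FileAdapter"/>
  <endpoint-interaction portType="Write_ptt" operation="Write">
    <interaction-spec className="oracle.tip.adapter.file.outbound.FileInteractionSpec">
      <property name="PhysicalDirectory" value="C:\output"/>
      <property name="FileNamingConvention" value="output_%SEQ%.txt"/>
      <property name="Append" value="false"/>
    </interaction-spec>
  </endpoint-interaction>
</adapter-config>
```

The PhysicalDirectory value here is the design-time default; the rest of the article shows how to override it at run time.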
Now wire the BPEL process to the File adapter so that a binding is created for your adapter. It should look like this.
Now go inside your BPEL process, drag and drop an Invoke activity, and connect the Invoke to the File adapter.
Drop an Assign activity between the receiveInput and the Invoke activity, and assign the client's input variable to the input variable of the File adapter.
After completing these steps your BPEL process should look like this.
Our process is now complete. We will deploy it to the server to check that it works, i.e. that it writes to the specified file location. Once that is verified we will go ahead and change the process so that the adapter endpoint can be changed dynamically. If the flow is working fine you should get the following flow instance.
Recovery
Now we will modify the process so that the adapter endpoint can be changed dynamically. Go back to JDeveloper and open the BPEL process you designed.
Create a variable with some logical name and make it a string variable.
Now assign some other file location to this new variable.
Now open the Invoke activity, go to its Properties tab, search for the property jca.file.Directory, and assign the new variable as its value.
These are all the changes we need to make in the process. Now rebuild and redeploy the process to the server.
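In the .bpel source, this property assignment shows up on the invoke as a bpelx:inputProperty entry. A sketch, assuming the new string variable is named OutputDirVar (the activity and variable names here are placeholders; the bpelx namespace is declared on the process element):

```xml
<!-- Sketch: jca.file.Directory is taken from the OutputDirVar variable
     at run time instead of the design-time PhysicalDirectory. -->
<invoke name="Invoke_WriteFile" partnerLink="AdapterEndPoint"
        portType="ns1:Write_ptt" operation="Write"
        inputVariable="Invoke_WriteFile_InputVariable">
  <bpelx:inputProperty name="jca.file.Directory" variable="OutputDirVar"/>
</invoke>
```

Setting the value through the Invoke activity's Properties tab, as described above, generates an entry of this shape for you.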
Results
If you now test the process with the settings above, the file is written to the new location specified in the new variable. This shows that the adapter properties take precedence over the values hardcoded in the process. We can design the process so that the new variable accepts its value from the client, letting us change the location dynamically per request. We could also use a rules/decision table that maps input values to different locations and drives the value from there.
Besides this, we can also change the value from the EM console. First go back to the BPEL process and remove the changes made for dynamic behavior (the variable and its assignment to the jca.file.Directory property).
Now go to your EM console.
Click on the process. On the right-hand side of the dashboard, navigate to the bottom of the page; there you will find a service called AdapterEndPoint as shown below. Click on it.
Now move to the Properties tab; you will see the following properties.
As you can see, there is a property PhysicalDirectory (Write).
Specify the new location where you want the file to be written and save the changes. Make sure the directory already exists. All new instances will now be written to the new location specified in this property.
Further Scope
The dynamic adapter concept is limited to technology adapters. For legacy and other adapters we cannot use the same approach; there we need another striking feature of SOA Suite, the dynamic partner link.
Business benefits
You need not schedule an outage for redeployment of your process; the endpoint can be changed dynamically at run time.
Conclusion
SOA Suite provides properties that can be used to change the endpoints of an adapter dynamically.
Attachment:
It contains the .jpr file for the project along with all the required artifacts.
Reproduce the scenario on a local machine
1> Download and unzip the file on your local machine.
2> Open JDeveloper and import the .jpr file.
3> Change the file location as per your local server file structure.
4> Deploy and test the process.
oracle.wsm.policymanager.PolicyManagerException: WSM-02106 : Cannot retrieve policy oracle/wss_username_token_service_policy
I have secured my OSB service with the OWSM policy oracle/wss_username_token_service_policy.
Today I started the system, checked the proxy service, and got the following error.
An unexpected error occured accessing information about the WSDL of the service:
com.bea.wli.config.component.NotFoundException: Can not compute effective WSDL for : ProxyService CustDetails/AmwCustDetailsPS
I went to the proxy service definition, opened the policy page, and found the following error there.
oracle.wsm.policymanager.PolicyManagerException: WSM-02106 : Cannot retrieve policy oracle/wss_username_token_service_policy. [Possible Cause : MDS-01329: unable to load element "persistence-config"
MDS-01370: MetadataStore configuration for metadata-store-usage "OWSM_TargetRepos" is invalid.
MDS-00922: The ConnectionManager "oracle.mds.internal.persistence.db.JNDIConnectionManagerImpl" cannot be instantiated.
MDS-00001: exception in Metadata Services layer
weblogic.common.resourcepool.ResourceDeadException: 0:weblogic.common.ResourceException: Could not create pool connection. The DBMS driver exception was: Listener refused the connection with the following error:
ORA-12514, TNS:listener does not currently know of service requested in connect descriptor
I realised that the database was not up and running. Since my OWSM schema is installed in that database, I started the database.
However, I got the following error while starting the database from the command console.
ERROR:
ORA-01034: ORACLE not available
ORA-27101: shared memory realm does not exist
Process ID: 0
Session ID: 0 Serial number: 0
TNS-12560: TNS:protocol adapter error
TNS-00530: Protocol adapter error
I then started the database from Windows services instead, and was able to log in to the DEV_MDS schema.
Once I could log in to the DEV_MDS schema, I logged into my OSB console and was able to call the service successfully.
Friday, March 02, 2012
java.lang.RuntimeException: error in finding weblogic.Home at weblogic.ant.taskdefs.management.WLSTTask.execute
You are trying to run an Ant script to load the data into your OSB console and you get the following error:
java.lang.RuntimeException: error in finding weblogic.Home
at weblogic.ant.taskdefs.management.WLSTTask.execute(WLSTTask.java:168)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:269)
at org.apache.tools.ant.Task.perform(Task.java:364)
at org.apache.tools.ant.Target.execute(Target.java:301)
at org.apache.tools.ant.Target.performTasks(Target.java:328)
at org.apache.tools.ant.Project.executeTarget(Project.java:1215)
at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:383)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:269)
at org.apache.tools.ant.Task.perform(Task.java:364)
at org.apache.tools.ant.Target.execute(Target.java:301)
at org.apache.tools.ant.Target.performTasks(Target.java:328)
This happens because your classpath is not set correctly.
If you open your build.xml file you will find an entry like this:
<path id="class.path">
<pathelement path="${bea.home}/wlserver_10.3/server/lib/weblogic.jar"/>
<pathelement path="${bea.home}/osb_10.3/lib/sb-kernel-api.jar"/>
<pathelement path="${bea.home}/modules/com.bea.common.configfwk_1.2.0.0.jar"/>
</path>
This error occurs if the classpath entry for com.bea.common.configfwk_1.2.0.0.jar is not correct.
The version of this jar varies with the version of the server, so make sure the file name in the classpath matches your installation. Also set your BEA_HOME and JAVA_HOME and rerun the script; the error should be resolved.
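One way to avoid hardcoding the jar version entirely is to match it with a wildcard, since Ant allows a fileset inside a path-like structure. A sketch, assuming the same ${bea.home} layout as the snippet above:

```xml
<path id="class.path">
  <pathelement path="${bea.home}/wlserver_10.3/server/lib/weblogic.jar"/>
  <pathelement path="${bea.home}/osb_10.3/lib/sb-kernel-api.jar"/>
  <!-- picks up com.bea.common.configfwk_*.jar whatever version is installed -->
  <fileset dir="${bea.home}/modules">
    <include name="com.bea.common.configfwk_*.jar"/>
  </fileset>
</path>
```

With this in place the script survives server upgrades that bump the configfwk version.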
Using Ant Script to Deploy the project to OSB Console
Extract the IMPORT-EXPORT.zip file onto your local machine.
It will contain the following files.
Export.py and Import.py are the Python scripts written to export and import the files, respectively, from the OSB console.
Import.properties and Export.properties are the property files that we need to change in order to point to our local server.
To test the import (deploy to OSB console) scenario with the given project,
I made the following changes in my import.properties file:
Adminurl - changed it to my local host
importUser - my OSB username
importPassword - my OSB password
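For reference, the edited import.properties ends up along these lines. All values below are placeholders for a local setup, and the exact property names depend on what the downloaded Import.py script reads, so check yours against the script:

```properties
## Sketch of import.properties - placeholder values for a local install
adminUrl=t3://localhost:7001
importUser=weblogic
importPassword=welcome1
importJar=sbconfig.jar
project=default
```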
Now open a command console.
Set your JAVA_HOME=C:\OSB\jdk160_18
Set your BEA_HOME=C:\OSB
Set your ANT_HOME=C:\OSB\modules\org.apache.ant_1.7.1
Navigate to the folder that contains the build.xml file.
Run setDomainEnv.cmd to set the property values:
C:\OSB\user_projects\domains\base_domain\bin\setDomainEnv.cmd
Now run the command: ant import
This will import your code into the OSB console.
Now log into your OSB console and verify whether the project is there.
Since I specified the project as default in the import.properties file, my code is deployed in that project folder.
=================================================================================
Now let us check the same thing using a customization file. I have an exported jar file (sbconfig.jar) and a customization file (ALSBCustomization.xml) - ExportedFile.zip.
To verify whether customization works, first import the project without the customization file. To do that, copy sbconfig.jar into the directory where you have all the other files and change import.properties as follows.
Notice that I have commented out the project, passphrase, and customization file options.
Now go to the command console again and run ant import.
It will import the new project into the console.
Go to the OSB console and verify whether the new project is created.
Now expand the service, go to any of the business services, and test it.
You will get an error like this.
This is because we have not used a customization file to change the port address.
My server is running on port 7001, while this project by default calls port 7021.
Now delete this project; we will redeploy it to the OSB console using the customization file.
Copy the customization file ALSBCustomization.xml into the same folder where you have build.xml
and rename it OSBCustomizationFile.xml.
Now go to the import.properties file and include the customization file as shown below.
This customization file contains the changed port for the OSB service.
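For illustration, a customization file that retargets a business service URI looks roughly like the abridged sketch below. The service path and URI are placeholders, and in practice you should generate the real file from the OSB console (System Administration > Create Customization File) rather than writing it by hand, as the schema details vary by release:

```xml
<!-- Abridged sketch of an OSB customization file; paths and URI are placeholders -->
<cus:Customizations xmlns:cus="http://www.bea.com/wli/config/customizations"
                    xmlns:xt="http://www.bea.com/wli/config/xmltypes"
                    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <cus:customization xsi:type="cus:EnvValueCustomizationType">
    <cus:description/>
    <cus:envValueAssignments>
      <xt:envValueType>Service URI</xt:envValueType>
      <xt:location xsi:nil="true"/>
      <xt:owner>
        <xt:type>BusinessService</xt:type>
        <xt:path>default/MyBusinessService</xt:path>
      </xt:owner>
      <xt:value xsi:type="xs:string"
                xmlns:xs="http://www.w3.org/2001/XMLSchema">http://localhost:7001/SomeService</xt:value>
    </cus:envValueAssignments>
  </cus:customization>
</cus:Customizations>
```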
Now go to the command console again and run ant import.
Make sure the customization file is loaded.
Now go back to your OSB console and test the business service.
This time the business service test should succeed.
===================================================================================
Thursday, March 01, 2012
Secure OSB proxy Service
Set up Oracle XE
Download the RCU for the required version of OSB
Run RCU.bat
Extend your domain to include EM and OWSM.
Run config.cmd
C:\OSB\wlserver_10.3\common\bin\config.cmd
Since I already have a domain I will extend it for OWSM
===========
Now start the WebLogic server.
Now log in to the admin console, sb console, and em console, and verify that you can access all of them.
Now log in to your osb console.
Create a new session, go to any proxy service, and click on the policy tab.
Choose the policy binding and click Add.
Choose the following policy and click Submit.
It should get updated in the list of policies.
Update the policy, activate the service, and test it. This time you will get the following error.
Now we will configure a user and attach it to a keystore.
Log into your em console.
Select your domain and then choose Security Provider Configuration.
WebLogic uses a default keystore, which we will use at our end.
Click the Configure button next to the keystore.
Provide the following details and click OK.
Now go to the OSB console, go to the Security Configuration tab, and click Users.
Create a new user - specify a user name and password.
Now again go to the em console.
Select your domain and go to Security Credentials.
Create a key.
Provide some key name and the user and password that you created in the osb console.
Save it and restart your server.
Now start your server again and test the service; you will get a screen like this.
First test the service without passing any value.
It will fail with the following error message.
Now test it again, passing the Override value as "SecureKey".
Now test the service; this time it should call the service successfully.
Unfortunately this did not work at my end, as I had created my own keystore, default-keystore.jks.
However, the basic functionality of calling this web service from outside was achieved by making the following changes.
Go back to your proxy service, go to Security, and enable Process WS-Security Header.
Now test the web service from the em console using the WSDL.
Initially I test it without any security header,
so it fails with the following error.
I test it once again, passing the header,
and this time I am able to invoke the service successfully.
To fix the original issue:
1> Configure the keystore and then restart the server.
2> Create a key in the em console using the same user and password as you created in the osb console.
A small hint on creating the keystore:
keytool -genkeypair -keyalg RSA -alias orakey -keypass welcome1 -keystore default-keystore.jks -storepass welcome1 -validity 3600
Don't change the default keystore name.
http://tim.blackamber.org.uk/?p=825
http://niallcblogs.blogspot.in/2010/07/osb-11g-and-wsm.html