"Publish-Subscribe" versus "Read" in Siebel Integration
By Graham Nicol [Dec-07]
When making data stored in the Siebel application available to other applications in an Enterprise, one of two basic patterns is usually followed:
Publish-Subscribe is a mechanism whereby one system (the publisher) makes data available to other systems when the data is changed or updated;
other systems (the subscribers) register that they want to receive notification of changes to this data and so are informed of the data event.
The "Read" mechanism is really an amalgam of a number of other patterns: Event-Driven Consumer, Point-to-Point, Selective Consumer, Transactional Client, etc.
Any pattern that fits into the "Consumer requests data, Provider sends data" generalization can be considered as part of this superset.
This pattern, unlike Publish-Subscribe, does not require the use of some form of middleware (or excessive coding in the Siebel application); in
those cases it would definitely be considered point-to-point.
Each of these patterns has its own advantages and disadvantages, and each makes its own demands and assumptions about the specific integration, and about
the overall enterprise integration architecture. The demands and constraints from the Siebel (or any other application) standpoint can often be very different from,
and sometimes counter-intuitive to, those of the middleware platform.
The fundamental concept of the publish-subscribe pattern is that the subscribers receive updates to the data set; this implies that each and every subscriber must
have a copy of the data that it is to work with. So for an enterprise using the publish-subscribe pattern with n subscribers there will be n × the number of records.
For a small data set this number will not be too extreme, but for a large data set, say quotes or orders in a sales situation, this number can quite rapidly impact the
capacity planning of the whole enterprise.
Another consideration with the publish-subscribe model is that of master data, or "How much do you trust replicants?"; perhaps it should be the "Blade Runner"
question! If, by using publish-subscribe, there are multiple copies of data in an enterprise, then where does every system get its data? The correct answer, of
course, is from the middleware and hence from the master system for that data, but what happens if a downstream system directly accesses one of the replicants?
What happens if that particular replicant doesn't actually have an up-to-date copy of the data? Oops. So care needs to be taken when using publish-subscribe with master data.
So what is publish-subscribe good for from an application point of view? Well, for reference data sets with a relatively low volatility, for example Contacts or
Accounts, publish-subscribe can be a good way to replicate data out to downstream systems to allow them to work with the data set and keep up to date in a
"near real time" fashion (I just love that phrase "near real time"; things are either real-time or they aren't... "near" is meaningless)!
The base difference between the Read and Publish-Subscribe patterns is efficiency. If an Account (let's say) is updated under publish-subscribe and that
change is to be propagated to 10 downstream systems, then one message will be passed from Siebel to the middleware and then 10 messages will go further out,
one to each subscribing system. In the Read pattern, if the same 10 downstream systems need to access the account, then a total of 40 messages will be needed,
each time that the data is requested.
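That arithmetic can be sketched in a few lines (a plain JavaScript illustration, not Siebel code; the 4-messages-per-read figure is an assumption that each read is a request/response pair crossing the middleware in both directions):

```javascript
// Messages needed to propagate one update via publish-subscribe:
// one message from Siebel to the middleware, then one per subscriber.
function pubSubMessages(subscribers) {
  return 1 + subscribers;
}

// Messages needed for the same systems to fetch the data via Read:
// assuming each read is a request/response pair that crosses the
// middleware in both directions, i.e. 4 messages per consumer per request.
function readMessages(consumers, requestsEach) {
  return consumers * requestsEach * 4;
}

console.log(pubSubMessages(10));  // 11 messages for one update
console.log(readMessages(10, 1)); // 40 messages for one read each
```

Note that the publish-subscribe count is paid once per update, while the Read count is paid again on every request, which is why volatility of the data set matters so much to the comparison.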
This "inefficiency" of data transmission is the number one reason that integration architects prefer publish-subscribe, but the Read pattern does have its uses.
The areas where publish-subscribe has issues are, conversely, areas where Read is strong: as data need not be replicated, issues of data integrity across the
enterprise are negated, and the potential capacity "complications" across subscriber systems are circumvented. For high-volatility, transactional data, the
Read pattern is often a good choice.
So, in summary, there's no one "right" way to perform integration; each customer and scenario is different, and at every stage it's important to consider both the
technical and organisational drivers before coming up with a coherent strategy. From a personal viewpoint, I think it's always essential to try and consider the
wider implications of the decisions that are taken; the days are past when a Siebel integration, or even an Oracle deployment, sat in isolation.
Siebel Import WSDL tool doesn't understand my BPEL-generated WSDL
By Nishit Rao and Markus Zirn [Dec-07]
Why do you need to know? Well, if you ever want a Siebel Workflow to call out to an external web service (such as a BPEL process), one of the steps
you will typically need to perform is to generate a WSDL for the endpoint (the BPEL process in this case) and consume it in Siebel Tools via the Siebel Import
WSDL tool. If that WSDL is auto-generated by BPEL, that's exactly where you will run into our little trouble.
Let's back up. Why do you need to integrate Siebel Workflow with BPEL in the first place? Doesn't Siebel Workflow take care of business processes?
Well, workflows are the process component within Siebel, and have worked well for processing within Siebel. However, for cases where your business
process spans Siebel and other applications, a standards based Business Process engine such as BPEL is best suited (the plug ends here).
Getting back to our little trouble (remember, the fact that the Siebel Import WSDL tool won't understand the BPEL-generated WSDL)? To get to the
matter fast, let's stick with Sales Order integration, using SOAP over HTTP in a synchronous interaction. Granted, there are many other architectural topics
regarding outbound web services integration from Siebel to an external system, such as identifying the data structures, interaction primitives (sync, async, one-way),
and transport (HTTP, JMS, file), to name just a few. But that's for another time...
Let's work through the regular sequence until we hit our little trouble - bear with me for a second, please: Having identified Order as the data, your first step to
implement this outbound web service integration is to use Siebel's Schema Generation Wizard (part of Siebel Tools) to generate the Order XML message type
(note here that we picked an existing Order Integration Object and did not create a new integration object). By choosing the XSD Schema Generator and No
Envelope options, you should get a ListOfOrder that looks like this: ListOfOrder.xsd.
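As a rough illustration only (the field elements below are hypothetical; your generated schema will reflect the integration components and fields of your actual Order Integration Object), the generated XSD has a shape along these lines:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <xsd:element name="ListOfOrder">
    <xsd:complexType>
      <xsd:sequence>
        <xsd:element name="Order" minOccurs="0" maxOccurs="unbounded">
          <xsd:complexType>
            <xsd:sequence>
              <!-- one element per field of the Order integration component -->
              <xsd:element name="OrderNumber" type="xsd:string" minOccurs="0"/>
              <xsd:element name="Status" type="xsd:string" minOccurs="0"/>
            </xsd:sequence>
          </xsd:complexType>
        </xsd:element>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>
```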
With this done, let's create the BPEL process that our outbound Siebel web service will call - choose sync process options and set the ListOfOrder.xsd
as both the input and the output of the process. Again, you could have chosen different input and output elements, but let's stay with the simple approach for now.
Now let's save the BPEL process, deploy it and get its WSDL. Here is a sample SiebelOrderProcess.wsdl. Remember: every BPEL process is a web service by
default, and it has a WSDL automatically generated for you.
The rest is simple, you say: let's just take this automatically generated WSDL and consume it in Siebel Tools. That's exactly where we hit our little trouble:
Unfortunately, the Siebel Import WSDL tool does not like "import" statements, which are included in the WSDL that BPEL auto-generates.
The good news is that the fix is relatively painless: Replace the import statements with inline message types as shown below.
What you have to do is open the ListOfOrder.xsd file and copy all but the very first line of the schema definition. Leave out the line that starts with
<?xml version="1.0"?> and copy everything after that line. Paste the full schema definition in between the <types> and </types> tags. At this point, your import should
go through. Our little trouble is fixed!
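Schematically, the edit replaces the schema import inside the <types> section with the schema contents themselves (the namespace URI below is a placeholder, and the layout is only indicative of what BPEL auto-generates):

```xml
<!-- Before: the auto-generated WSDL imports the schema, which the
     Siebel Import WSDL tool cannot resolve -->
<types>
  <schema xmlns="http://www.w3.org/2001/XMLSchema">
    <import namespace="http://example.com/order"
            schemaLocation="ListOfOrder.xsd"/>
  </schema>
</types>

<!-- After: the full schema definition is pasted inline -->
<types>
  <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
              targetNamespace="http://example.com/order">
    <!-- contents of ListOfOrder.xsd, minus its first <?xml ...?> line -->
  </xsd:schema>
</types>
```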
Once imported, the BPEL WSDL generates, among other artifacts, a web services proxy in Siebel (in reality a business service) which invokes the BPEL
process. Tie this proxy to internal Siebel elements such as applet events, workflows, etc., and order information will get sent out to BPEL as planned.
Hungry for more details? We will be posting the complete end-to-end tutorial for such an outbound integration soon. Stay tuned.
Interfacing Siebel OnDemand to Siebel OnPremise
By Bram Hoebeke [Dec-07]
For a recent customer engagement, I was asked to design and develop an interface from Siebel OnDemand to Siebel 7.8. They also suggested Siebel EIM
(Enterprise Integration Manager, a bulk data transfer tool), since it is easy, quickly developed, and provides a stable solution. This should work for the initial
load as well as in batch mode and on request.
The data that needed to be transferred was Account data and all related child objects (contacts, addresses, account-relationships, ...) in a first phase.
Clear as a bell! Until they realized 'Leads' were also needed. Now, mapping account data from OnDemand to OnPremise may seem pretty straightforward.
But leads? Should they be transferred to unqualified opportunities, prospects, or maybe lists?
EIM works out fine for bulk loads, and in this case the data model was rather simple. But creating an automated process that extracts data from OnDemand,
validates, extends and loads it into EIM tables, and then starts an EIM job was practically impossible (did I mention they wanted the analysis,
design and implementation done in about 15 days?). The main reason to use web services here, though, was the object model:
the logical layers of OnDemand and OnPremise are almost identical.
Siebel OnDemand provides web services for all the data entities you work with (even custom objects), and Siebel OnPremise provides a
whole set of tools to consume these web services, transform the data, map it to integration components and commit it, all in a secure, flexible way.
Since the OnDemand web services all have the same 'layout', they are extremely well suited to building a reusable, transparent interface: one that is
fit to run in batch mode, and whose mappings can be changed post-rollout with minimal customization.
In my next blog, I'll describe step by step how to set up such a solution and what EAI business services and other objects I used to accomplish this.
Consuming an OnDemand WebService in Siebel OnPremise through EAI Services
By Bram Hoebeke [Sep-08]
The following steps describe the core process to consume an OnDemand Web service by Siebel OnPremise.
A good practice is to create a workflow process to control these steps and use properties to easily pass input and output arguments. In addition, logging and
error handling should be added to monitor this process. Some steps require custom scripting. Put these methods in a Business Service.
1. Connect to the OD web service by calling the Login method: Use the EAI HTTP Transport Service.
As an input, pass the OnDemand URL, username and password of an OnDemand account.
Store the SessionId as a shared global. OnDemand web services use session-based security; therefore, we need to pass this SessionId on every
following call to OnDemand. At the end of our process, we call the LogOff method to explicitly end our session.
2. Create the soap request to send to the Webservice:
Important parameters are the StartRowNum and the PageSize. Our SoapRequest is actually a Query by example, meaning that we send an xml structure
over to the webservice, containing the fields we want to retrieve and the search specifications defining our query. The Pagesize property defines the number of
rows to return and the StartRowNum the row number to start with. In this example we query for all leads modified after a certain date (the SSE variable in the
XML); this request should return the first 100 records, starting at row number sStartRowNum.
var output_string = "<?xml version=\"1.0\" encoding=\"UTF-8\"?><SOAP-ENV:Envelope xmlns:SOAP-ENV=\"http://schemas.xmlsoap.org/soap/envelope/\">";
// note: the SOAPSDK1 and SOAPSDK2 namespace prefixes used below must also be declared on the envelope
// mappings for all field data at Lead level to be pulled back must go here
output_string += "<SOAP-ENV:Body><SOAPSDK1:LeadWS_LeadQueryPage_Input><PageSize>100</PageSize><StartRowNum>" + sStartRowNum + "</StartRowNum><SOAPSDK2:ListOfLead>";
output_string += "<Lead>";
output_string += "<AccountExternalSystemId></AccountExternalSystemId>";
output_string += "<AnnualRevenues></AnnualRevenues>";
output_string += "<IntegrationId></IntegrationId>";
output_string += "<JobTitle></JobTitle>";
output_string += "<LastUpdated>>= '" + SSE + "'</LastUpdated>";
output_string += "<LeadEmail></LeadEmail>";
output_string += "<LeadFirstName></LeadFirstName>";
output_string += "<LeadId></LeadId>";
output_string += "<LeadLastName></LeadLastName>";
output_string += "<NumberEmployees></NumberEmployees>";
// end of lead and soap envelope
output_string += "</Lead></SOAPSDK2:ListOfLead></SOAPSDK1:LeadWS_LeadQueryPage_Input></SOAP-ENV:Body></SOAP-ENV:Envelope>";
The output_string now contains the whole soap request to pass to the webservice.
3. Pull Data from OnDemand
Again using the EAI HTTP Transport service, call the webservice and pass the soap envelope
var bs = TheApplication().GetService("EAI HTTP Transport");
var inputs = TheApplication().NewPropertySet();
var outputs = TheApplication().NewPropertySet();
var sURL = TheApplication().GetSharedGlobal("URL");
var sSessionId = TheApplication().GetSharedGlobal("SessionId");
sURL += "/Services/Integration/Lead;jsessionid=" + sSessionId;
inputs.SetProperty("HTTPRequestURLTemplate", sURL);
inputs.SetProperty("HTTPRequestMethod", "POST");
inputs.SetProperty("HTTPRequestBodyTemplate", output_string); // the soap request built in step 2
bs.InvokeMethod("SendReceive", inputs, outputs);
Check the LastPage element of the XML returned: if there were more than 100 records to be returned by our query, we need to send more soap
requests to retrieve all the data, each time increasing our start row number by our page size.
var sString = outputs.GetValue();
var LastPage = "Y";
if (sString.indexOf("<ns:LastPage>false</ns:LastPage>") > 0)
    LastPage = "N";
while (LastPage == "N")
{
    sStartRowNum = sStartRowNum + 100;
    // rebuild the soap request with the new StartRowNum (step 2), call
    // SendReceive again (above), append the rows returned, and re-check
    // the LastPage element, setting LastPage = "Y" when it reads true
}
You might need to trim the xml a little to get rid of headers you don’t need and add new headers to match the layout of an
Integration Object that represents the structure of the data returned by the web service. Let’s call this the External Integration Object.
NOTE: The Integration Object will be created automatically by importing the .wsdl file you downloaded from OnDemand in the Siebel Tools WSDL Import Wizard (New Object Wizard: EAI: Web Service)
At the end you write the xml to a file.
var sHeader = "<LeadWS_LeadQueryPage_Output><ListOfLead>";
var sFooter = "</ListOfLead></LeadWS_LeadQueryPage_Output>";
var sCompleteXML = sHeader + sTrimmedXML + sFooter;
var file = Clib.fopen(sDataFile, "wu+");
Clib.fputs(sCompleteXML, file);
Clib.fclose(file);
4. Load the xml into a Siebel XMLHierarchy
Call the ReadXMLHier method of the ‘EAI XML Read from File’ Business Service.
As an input argument we pass the xml file and the service returns an XML Hierarchy.
5. Convert the XML Hierarchy to an Integration Object Hierarchy
Call the XMLHierToIntObjHier method of the ‘EAI Integration Object to XML Hierarchy Converter’ Business Service.
As input arguments we pass the XML Hierarchy and the name of our External Integration Object (IntObjectName). The output is a SiebelMessage.
6. Use Data Maps to map the external data to the Siebel OnPremise data model.
The data maps can be created in Siebel Client: Administration – Integration: Data Maps
For this we need to have 2 Integration Objects, one representing the external data structure and one based on the OnPremise data structure (our internal Integration Object).
Then we call the ‘Execute’ method of the ‘EAI Data Transformation Engine’ business service to convert our SiebelMessage
into the internal format.
7. Upsert the data in OnPremise
Call the ‘EAI Siebel Adapter’ Upsert method to insert or update data in OnPremise (based on the Integration Component Keys
defined on our internal Integration Object’s components)
NOTE: When the upsert fails because, for example, a mandatory field is empty, all records will fail. Therefore a good practice is to
split the SiebelMessage before passing it to the EAI Siebel Adapter. This way the valid records will pass, and we can do some extended error
handling for the invalid ones.
Using BPEL to invoke Siebel OnDemand Web Services requires you to pass a session ID that is returned by posting a URL with the username and password.
You can accomplish the logon process by using embedded Java in your BPEL process. Before you do so, define a variable to store the sessionID, which will
be returned after logon.
Next, include a Java Embedding process activity in your BPEL process to post the URL containing the username and password.
The following sample code illustrates how this can be accomplished:
String sessionString = "FAIL";
String wsLocation = "https://yourinstance.crmondemand.com/Services/Integration";
try {
    // create an HTTPS connection to the OnDemand web services
    URL wsURL = new URL(wsLocation + "?command=login");
    HttpURLConnection wsConnection = (HttpURLConnection) wsURL.openConnection();
    // disable caching
    wsConnection.setUseCaches(false);
    // set some http headers to indicate the username and password we are using to logon
    wsConnection.setRequestProperty("UserName", "<your username>");
    wsConnection.setRequestProperty("Password", "<your password>");
    // see if we got a successful response
    if (wsConnection.getResponseCode() == HttpURLConnection.HTTP_OK) {
        // get the session id from the cookie setting
        for (int i = 0; ; i++) {
            String headerName = wsConnection.getHeaderFieldKey(i);
            if (headerName == null && wsConnection.getHeaderField(i) == null)
                break; // no more headers
            if (headerName != null && headerName.equals("Set-Cookie")) {
                // found the Set-Cookie header (code assumes only one cookie is being set)
                sessionString = wsConnection.getHeaderField(i);
                break;
            }
        }
    }
    if (!sessionString.equals("FAIL")) {
        String formattedID = sessionString.substring(sessionString.indexOf("=") + 1, sessionString.indexOf(";"));
        //System.out.println("Session ID: " + formattedID);
    }
} catch (Exception e) {
    System.out.println("Logon Exception generated :: " + e);
}
Note that the code assumes a single cookie is being set and parses the header to retrieve it into the SessionID variable that was defined earlier.
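The substring step can be tried in isolation; the following plain JavaScript mirrors the Java parsing logic above (the cookie name and value shown are made up for illustration):

```javascript
// Extract the session ID from a Set-Cookie header value of the form
// "NAME=VALUE; attr1; attr2" -- the same substring logic as the Java
// snippet above: everything between the first "=" and the first ";".
function extractSessionId(setCookieHeader) {
  return setCookieHeader.substring(
    setCookieHeader.indexOf("=") + 1,
    setCookieHeader.indexOf(";")
  );
}

// a made-up example header, shaped like a typical Set-Cookie value
const header = "JSESSIONID=abc123xyz; path=/Services; secure";
console.log(extractSessionId(header)); // abc123xyz
```

Note that, like the Java version, this assumes the header contains at least one "=" followed by a ";"; a header without a trailing attribute would need a guard on indexOf(";").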
Next, define a partnerlink using the appropriate WSDL for the Siebel OnDemand web service you would like to invoke (for example, the Account web service).
To obtain the WSDL, you need to log in to the Siebel OnDemand administration portal using the right credentials and navigate to the Web Services
Administration page under the Admin section of the portal.
Now, in your BPEL process, use the Assign process activity to copy an endpoint expression to the partnerlink. This effectively replaces the default
EndpointReference, so that it now includes the SessionID that was retrieved by the embedded Java code above.
Using this approach you can invoke any of the Siebel CRM OnDemand Web Services.
Invoking A Secured Siebel Web Service with Trust Token from BPEL
By Chitra Somasundaram and Sanjay Anavekar [Mar-08]
Secured Siebel web services can be invoked from BPEL by enabling a Trust Token in the Siebel web service and calling it through a BPEL PartnerLink activity.
This approach is fully secured and integrates successfully with Oracle BPEL and Siebel. We have chosen this security option, though WS-Security (WSSE)
would be another option.
A small workaround is needed with this approach: with the secured user token, Siebel expects custom SOAP header tags in the custom Siebel namespace,
and these custom header tags are not generated as part of the Siebel WSDL. They therefore need to be manually added to the Siebel WSDL file
before importing it into the BPEL process. This can be achieved as shown in the following steps:
1. Create and configure a Siebel Inbound Web Service; any business service or workflow can be published as an Inbound Web Service. The following diagram
shows an example of a Siebel Inbound Web Service.
2. Ensure that the Transport layer is HTTP and Binding is SOAP_RPC_LITERAL.
3. Ensure that the URL has the trust token setup in the URL as WS-SOAP=1.
4. Generate WSDL, as shown in the following diagram. See a sample generated WSDL file here.
5. As Siebel does not generate the header tags as part of WSDL export, edit the default Siebel WSDL and add the custom Siebel Header tags manually, as
shown in the following diagrams. See a sample file with custom Siebel Header tags added here.
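For orientation, the runtime SOAP header that Siebel expects (and whose parts the edited WSDL must declare) generally follows the pattern below. This is a sketch based on the standard Siebel web services namespace; the element names and values should be verified against the documentation for your Siebel version:

```xml
<soap:Header>
  <!-- Siebel user name, in the Siebel web services namespace -->
  <UsernameToken xmlns="http://siebel.com/webservices">SADMIN</UsernameToken>
  <!-- password or trust token value, depending on the configured security -->
  <PasswordText xmlns="http://siebel.com/webservices">trust-token-value</PasswordText>
  <!-- session management mode, e.g. None for sessionless calls -->
  <SessionType xmlns="http://siebel.com/webservices">None</SessionType>
</soap:Header>
```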
6. Import the edited WSDL into the BPEL Process and invoke the WSDL with a Partner Link activity.
7. Edit the PartnerLink to pass the Siebel Header as an Adapter variable, as shown in the following diagram.
The Header Variables can be populated in an Assign or Transform activity, and the actual values such as username/password can be read from a file,
Database or Deployment Descriptor.