Integrate 2014 Starting Up

By Nick Hauenstein

Tomorrow morning, the Integrate 2014 conference revs up here in Redmond, WA with Microsoft presenting the vision for the next versions of BizTalk Server and Microsoft Azure BizTalk Services.

I’m getting pretty excited looking over the first-day agenda, with sessions like Business Process Management and Enterprise Application Integration placed in between talks about BizTalk Services and On-Premise BizTalk Server. The second-day agenda has me even more excited, with talks like Business Rules Improvements and, especially, Building Connectors and Activities.

What’s Keeping Me Busy

In anticipation of this event, I have been building out a hybrid cloud integration sample application that leverages some of the ideas laid out in my previous post on cloud integration patterns. It will also provide the content for the long-overdue next piece in my new features series for BizTalk Server 2013 R2.

In the sample, I’m running requests through MABS for content-based routing based on an external SQL lookup. The requests target either an on-premise SQL Server database or a BizTalk Server instance for further processing that BizTalk Services can’t currently do on its own (namely Business Rules Evaluation and Process Orchestration).

I’m hoping that after the conference this week, I will be able to tear the sample apart and build out most (if not all) of the elements in the cloud.

Best Bridge Between Azure and On-Prem?

Along the way, I’ve been playing further with Service Bus Topics/Subscriptions as the Message Box of the cloud. At the same time, they make a nice bridge between BizTalk Services and On-Premise BizTalk Server.

Consider the itinerary below:

[Image: First-draft itinerary showing requests from a Service Bus Topic routed by content either to an on-premise database or through a Service Bus Relay to BizTalk Server 2013 R2]

This was the first draft from prototyping the application. It represents an application receiving requests from client applications, published to a Service Bus Topic. As requests are received, they are routed based on content and sent either to a database to be recorded (as an example of an on-premise storage target) or, through a Service Bus Relay, to BizTalk Server 2013 R2 for further processing.

Given the scenario, maybe a relay is appropriate: the latency is lower, and there is no durability requirement (durability was already sacrificed by running the message through an initial bridge).

However, maybe we want to take a more holistic approach, and assume that the cloud is giving us a free public endpoint and some quite powerful content-based routing, translation, and even publish-subscribe capability when we bring Azure Service Bus to the mix. Let’s further assume that we view these capabilities as simply items in our toolbox alongside everything that BizTalk Server 2013 R2 is already providing us.

Let’s make it more concrete. What if the on-premise processing ultimately ends up sending the message back to the TradesDb? Is there a potential waste in building out that portion of the process in both locations?

Service Bus is the New Hybrid Integration MessageBox

Let’s try this instead:

[Image: Revised itinerary publishing to a Service Bus Topic instead of calling BizTalk Server through a relay]

Here, instead of using a relay, we’re using a Service Bus Topic to represent the call out to BizTalk Server 2013 R2.

Why would we do this? While it introduces slightly more latency (in theory, though I haven’t properly tested that theory), it becomes pure loosely coupled pub-sub. I’m going to go out on a limb here (and risk being called an architecture astronaut) and say that not only is that not a bad thing, it might even be a good idea. Feeding a topic rather than submitting directly to BizTalk Server via relay lets us swap out the rules-processing mechanism at any time, even if that means transforming the message and submitting it over a completely different transport. Maybe one day we will be able to replace this step with a call to a Rules Engine capability within MABS (crossing my fingers here), should such a capability arrive.
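To make that concrete, below is a minimal sketch of how the topic and the BizTalk-facing subscription might be provisioned using the Service Bus client library of the day (Microsoft.ServiceBus). The topic, subscription, and property names are placeholders of my own, not from the actual sample:

[sourcecode language="csharp"]
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

class TopicSetup
{
    static void Main()
    {
        // Placeholder connection string; pull this from configuration in real life
        string connectionString =
            "Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...";

        var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

        // The topic plays the role of the MessageBox in the cloud
        if (!namespaceManager.TopicExists("trades"))
        {
            namespaceManager.CreateTopic("trades");
        }

        // BizTalk Server 2013 R2 consumes this subscription through its SB-Messaging
        // adapter; the SqlFilter means it sees only messages needing rules processing
        if (!namespaceManager.SubscriptionExists("trades", "RulesProcessing"))
        {
            namespaceManager.CreateSubscription("trades", "RulesProcessing",
                new SqlFilter("RequiresRulesProcessing = true"));
        }
    }
}
[/sourcecode]

Replacing the rules processor later then amounts to repointing (or replacing) that one subscription; publishers to the topic never have to know.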

Further, we have broken out the logging of trades to the database into its own separate miniature process alongside the rest. It might be fed by messages generated by BizTalk Server 2013 R2 on-premise or by the earlier processing in MABS, leaving only a single implementation of the interface to manage and change.
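Feeding that logging process (from either source) is then just a matter of publishing to the topic. A hypothetical sketch, with the same caveat that the names and the routing property are mine:

[sourcecode language="csharp"]
using System.IO;
using System.Text;
using Microsoft.ServiceBus.Messaging;

class TradePublisher
{
    // Hypothetical publish path; the routing property drives each SqlFilter
    public static void Publish(string connectionString, string tradeXml, bool needsRules)
    {
        var client = TopicClient.CreateFromConnectionString(connectionString, "trades");

        // Send the body as a raw stream, which keeps it friendly to the
        // SB-Messaging adapter on the BizTalk side
        var message = new BrokeredMessage(
            new MemoryStream(Encoding.UTF8.GetBytes(tradeXml)), ownsStream: true);

        // Subscriptions key their SqlFilters off of this property
        message.Properties["RequiresRulesProcessing"] = needsRules;

        client.Send(message);
    }
}
[/sourcecode]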

Is It The Right Way™?

I don’t know. It feels kind of right. I can see the benefits, but we are potentially paying for a loosely coupled architecture by introducing latency. Call me out in the comments if that makes you unhappy. Ultimately, it simply becomes a consideration to weigh when deciding whether or not to do things this way.

What Could Make it Even Better?

Imagine a time when we could build out a hybrid integration using a design surface that makes it seamless, one where I could quickly discover the rest of the story. Right now quite a bit happens on-premise into which the itinerary gives us no visibility, and we have very limited visibility even within the orchestration, since most of the logic lives in Business Rules and my maps run just before/after port processing.

Tomorrow

Tomorrow I will be writing a follow-up blog post with a recap of the first day of the Integrate 2014 conference. Additionally, I will keep turning this application over in my mind to see how the things announced this week change the possibilities.

If you’re going to be there tomorrow, be sure to stop by the QuickLearn Training table to sign up for a chance to win some fun prizes. You can also just stop by to talk about our classes, or about any of the ideas I’ve thrown out here; I welcome positive and negative/nitpicky feedback alike.

Also, make sure you’re following @QuickLearnTrain on Twitter. We’ll be announcing an event that you won’t want to miss sometime in the next few days.

See you at the conference!

– Nick Hauenstein

Decoding JSON Data Using the BizTalk Server 2013 R2 JSONDecode Pipeline Component

By Nick Hauenstein

This is the fourth in a series of posts exploring What’s New in BizTalk Server 2013 R2. It is also the second in a series of three posts covering the enhancements to BizTalk Server’s support for RESTful services in the 2013 R2 release.

In my last post, I wrote about the support for JSON Schemas in BizTalk Server 2013 R2. I started out with a small project that included a schema generated from the Yahoo Finance API and a unit test to verify the schema model. I was going to put together a follow-up article last week, but spent the week traveling, in the hospital, and then recovering from being sick.

However, I am back and ready to tear apart the next installment, which already hit the GitHub repo a few days back.

Pipeline Support for JSON Documents

In BizTalk Server 2013 R2, Microsoft approached the problem of dealing with JSON content in a way fairly similar to the approach we used in the previous version with custom components: performing the JSON conversion as an operation in the Decode stage of the pipeline, thus requiring the Disassemble stage to include an XML Disassembler component for property promotion.

The official component, Microsoft.BizTalk.Component.JsonDecoder, takes two properties, Root Node and Root Node Namespace, which determine how the XML message will be created.
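As a rough mental model for what those properties do, here’s a hand-rolled sketch of the same kind of conversion using Json.NET. This is an analogy rather than the component’s actual internals, and since DeserializeXmlNode takes no namespace parameter, that half of the story is left as a comment:

[sourcecode language="csharp"]
using System;
using Newtonsoft.Json;  // Json.NET

class JsonToXmlSketch
{
    static void Main()
    {
        // Hypothetical JSON fragment standing in for a service response
        string json = @"{ ""query"": { ""count"": 1 } }";

        // Wrap the converted content in a root element named "ServiceResponse",
        // loosely analogous to the JsonDecoder's Root Node property
        var doc = JsonConvert.DeserializeXmlNode(json, "ServiceResponse");

        // The real component also stamps the configured Root Node Namespace onto
        // the root element; DeserializeXmlNode offers no such parameter, so that
        // step would have to happen separately
        Console.WriteLine(doc.OuterXml);
        // <ServiceResponse><query><count>1</count></query></ServiceResponse>
    }
}
[/sourcecode]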

Note that there isn’t a JSONReceive pipeline included in BizTalk Server 2013 R2; only the pipeline component is provided. In other words, in order to work with JSON, you will need a custom pipeline.

[Image: the JsonDecoder component shown among the available pipeline components]

Creating a Pipeline for Receiving JSON Messages

Ultimately, I would like to create a pipeline that is reusable, so that I don’t have to create a new pipeline for each and every message that will be received. Since BizTalk message types are all about the target namespace and root node name, it isn’t reasonable to set those values to be the same for every message, despite having different message bodies and content. As a result, it might be best to leave the values blank at design time and set them per instance instead.

This is also an interesting constraint, because if we are receiving this message as something other than just a service response, we might need a fairly flexible schema (i.e., one with a lot more choice groups) depending on the variety of inputs/responses that may be received. That won’t be explored within this blog post, but it would make an excellent discussion during one of QuickLearn’s BizTalk Server Developer Deep Dive classes.

In order to make the pipeline behave in a way that will be consistent with typical BizTalk Server message processing, I decided to essentially take what we have in the XMLReceive pipeline and simply add a JsonDecoder in the Decode stage, with none of its properties set at design time.

[Image: the JSONReceive pipeline design, mirroring XMLReceive with a JsonDecoder added to the Decode stage]

Testing the JSONReceive Pipeline

In the same vein as my last post, I will be creating automated tests for the pipeline to verify its functionality. However, we cannot use the built-in support for testing pipelines in this case, because the pipeline’s properties were left blank and the TestablePipelineBase class does not support per-instance configuration. Luckily, the Winterdom PipelineTesting library does support per-instance configuration, and it has had a nifty NuGet package since June.

Unfortunately, the per-instance configuration is not exactly pretty. It requires an XML configuration file that resembles the guts of a bindings file (specifically, the section dedicated to the same purpose). In other words, it’s not as easy as setting properties on a class instance in code. To get around that to some degree, and to be able to reuse the configuration file with different property values, I put together a template with tokens in place of the actual property values.

[Image: the per-instance pipeline configuration template, with tokens standing in for the Root Node and Root Node Namespace values]

NOTE: If you’re copying this approach for some other pipeline components, the vt attribute is actually very important in ensuring your properties will be read correctly. See KB884981 for details.
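For a sense of the shape involved, here’s a sketch of what such a template might look like. I’m reconstructing this from memory rather than copying the project file, so treat the stage CategoryId and the token names as placeholders to verify rather than gospel:

[sourcecode language="xml"]
<Root xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Stages>
    <!-- CategoryId should match the Decode stage of the pipeline under test -->
    <Stage CategoryId="9d0e4103-4cce-4536-83fa-4a5040674ad6">
      <Components>
        <Component Name="Microsoft.BizTalk.Component.JsonDecoder">
          <Properties>
            <!-- vt="8" marks each value as a string (see KB884981) -->
            <RootNode vt="8">$RootNode$</RootNode>
            <RootNodeNamespace vt="8">$RootNodeNamespace$</RootNodeNamespace>
          </Properties>
        </Component>
      </Components>
    </Stage>
  </Stages>
</Root>
[/sourcecode]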

From there, the per-instance configuration is a matter of XML manipulation and use of the ReceivePipelineWrapper class’ ApplyInstanceConfig method:

[sourcecode language="csharp"]
private void configureJSONReceivePipeline(ReceivePipelineWrapper pipeline, string rootNode, string namespaceUri)
{
    // Load the token-based template deployed alongside the test
    string configPath = Path.Combine(TestContext.DeploymentDirectory, "pipelineconfig.xml");
    var configDoc = XDocument.Load(configPath);

    // Swap the tokens for the actual property values
    configDoc.Descendants("RootNode").First().SetValue(rootNode);
    configDoc.Descendants("RootNodeNamespace").First().SetValue(namespaceUri);
    configDoc.Save(configPath);

    // Apply the resulting per-instance configuration to the pipeline under test
    pipeline.ApplyInstanceConfig(configPath);
}
[/sourcecode]

The final test code includes a validation of the output against the schema from last week’s post. As a result, we’re really dealing with an integration test here rather than a unit test, but it’s a test nonetheless.

[sourcecode language="csharp"]
[TestMethod]
[DeploymentItem(@"Messages\sample.json")]
[DeploymentItem(@"Configuration\pipelineconfig.xml")]
public void JSONReceive_JSONMessage_CorrectValidXMLReturned()
{
    string rootNode = "ServiceResponse";
    string namespaceUri = "http://schemas.finance.yahoo.com/API/2014/08/";

    string sourceDoc = Path.Combine(TestContext.DeploymentDirectory, "sample.json");
    string outputDoc = Path.Combine(TestContext.DeploymentDirectory, "JSONReceive.out");

    // Create the pipeline and apply the per-instance configuration
    var pipeline = PipelineFactory.CreateReceivePipeline(typeof(JSONReceive));
    configureJSONReceivePipeline(pipeline, rootNode, namespaceUri);

    using (var inputStream = File.OpenRead(sourceDoc))
    {
        // Register the schema so the disassembler can resolve the message type
        pipeline.AddDocSpec(typeof(ServiceResponse));
        var result = pipeline.Execute(MessageHelper.CreateFromStream(inputStream));

        Assert.IsTrue(result.Count > 0, "No messages returned from pipeline.");

        // Persist the output for validation below
        using (var outputFile = File.OpenWrite(outputDoc))
        {
            result[0].BodyPart.GetOriginalDataStream().CopyTo(outputFile);
            outputFile.Flush();
        }
    }

    // Validate the generated XML against the schema from last week's post
    ServiceResponse schema = new ServiceResponse();
    Assert.IsTrue(schema.ValidateInstance(outputDoc, Microsoft.BizTalk.TestTools.Schema.OutputInstanceType.XML),
        "Output message failed validation against the schema");

    // Spot-check a value from the sample message (expected value goes first)
    Assert.AreEqual("44.97", XDocument.Load(outputDoc).Descendants("Bid").First().Value,
        "Incorrect Bid amount in output file");
}
[/sourcecode]

After giving it a run, it looks like we have a winner.

[Image: test results showing the JSONReceive pipeline test passing]

Coming Up in the Next Installment

In the next installment of this series, I will put what we have here to use and build out a more complete integration, one that also sends JSON messages using the new JsonEncoder component.

Take care until then!

If you would like to access sample code for this blog post, you can find it on GitHub.