Wednesday, December 27, 2006

SHOBU Music Studio

It took me longer than it should have to assemble the necessary bits to be able to record my guitar directly into my PC.

After a couple of days of trying different methods, I finally assembled my very first cheap ass recording studio, which consists of:
  • low end PC (AMD 2600 512MB RAM, Windows XP SP2, Creative SB Live!, 120GB HD, CD burner)
  • music equipment (electric guitar, amp, effects)
  • Logitech combination headphone/microphone
  • Audacity recording and audio editing software
  • Drum Drops drum beats
  • Windows Media Player 11.0 for playback and burning
Not counting the sunk costs from pre-existing components, my net expense for being able to record my own tunes is $0.00 USD. Yup, absolutely free! Booyah!

Of course, getting there was something of an ordeal... Mac lovers, turn away, this isn't for the faint of heart.

My Kingdom for a Microphone
It all started in search of a microphone. I consider myself a techie, but even better, I'm a techie pack rat. I've got all sorts of PC peripherals and devices, collected over a decade, stocked in closets and under beds for just such an occasion as this. It took two days just to make sure I hit all my storage spots and I still hadn't turned up a microphone! Unfreakin' believable!

Just call me Dr. Frankenstein
As luck would have it, I remembered my digital camcorder and discovered that I could record video of me playing (incidentally capturing the sound!), rip that to the PC, and save it as MPEG. This left me with a 50MB MPEG that I still couldn't listen to in the car, or on an iPod. After a brief search, I found Audio Tools Factory's most excellent Video to Audio Converter, which allowed me to rip only the audio portion of the video to MP3, skinnying it up to about 4MB.

Ah, now I could enjoy the *ahem* bliss of listening to music that I created.

The only problem was, my music wasn't quite so blissful. Yeah, I'll say it. I suck. I'm still practicing though, and it's definitely a work in progress.

The Hunt Continues
This led to the desire for more recording sessions, yet it was quite a workout to get everything set up and running each time. Oh, and my recording 'studio' doesn't have a PC powerful enough for the video software that came with my camera, so I have to go into another room to use the software on that machine. Needless to say, gratification was less than instant.

Since I wanted to do this on a regular basis, maybe even publish some stuff and get feedback from you guys (nudge! nudge! wink! wink!), I had to find an easier method! So, I started checking out all sorts of somewhat pricey gear on various sites, reading various 'whitepapers' on 'home studio recording', etc. I really thought I was going to have to spend some serious cash to start recording (anything over $200 is serious to me; it's about perspective and principle).

The Last Mile
While surfing for just such information at Sally's computer, I happened to look up and notice, hanging off the corner of her desk, a set of Logitech headphones with a mike that I had overlooked. [These have to be placed just right to capture sound well, but it's not hard to find that sweet spot.]

At the same time, I happened to be surfing a site for the open source sound recording and editing software Audacity. Talk about serendipity!

One is the Loneliest Number
All the parts were in place! I can record at will and it doesn't sound half bad, even though I know audiophiles will probably keel over deaf, dumb and blind if they ever heard it.

However, since all I have are these guitar tracks, they sound kinda lonely. What to do? A quick search turns up some really cool beats from Drum Drops who have released several drum tracks into the public domain. Sweet!

Couple those drum tracks with a rhythm track and a lead track in Audacity, and I'm in business!
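For the curious, what Audacity does when it mixes those tracks down essentially boils down to summing samples. Here's a toy sketch of the idea in Python; the sample data and the `mix` helper are invented for illustration, and real WAV handling is left to Audacity:

```python
def mix(*tracks, scale=None):
    """Mix equal-length tracks of audio samples by summing them,
    scaling the result to avoid clipping (defaults to 1/track-count)."""
    if scale is None:
        scale = 1.0 / len(tracks)
    return [sum(samples) * scale for samples in zip(*tracks)]

# Pretend sample data standing in for real audio:
drums  = [0.0, 0.5, -0.5, 0.25]
rhythm = [0.1, 0.1,  0.1, 0.10]
lead   = [0.3, -0.3, 0.3, -0.30]

mixed = mix(drums, rhythm, lead)
```

Each output sample is just the average of the corresponding input samples, which is why an overloud track can drown out the others unless you adjust its gain first.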

Sunday, December 24, 2006

Gil's All Fright Diner

Gil's All Fright Diner
A. Lee Martinez
Tor Books

The co-worker who gave this to me called it "a real hoot" and he was dead on! Ok. Sorry for the pun. Nah...not really.

You know you're in for a great ride when the two main characters are introduced as Duke, the Duke of Werewolves, and Earl, the Earl of Vampires. Toss in a greasy all-night diner that just happens to be a gateway for "the old gods" (Cthulhu, anyone?) and a nubile teenager, in possession of Lovecraft's Necronomicon, who wants to rule the world, and you kinda have to just sit back and hang on as the story unfolds!

Luckily it's short, because I couldn't put it down! This book is a highly entertaining read. If you're looking to catch your breath before jumping back into the next Robert Jordan, George R. R. Martin, or Steven Erikson, this is your book!

Debugging Microsoft.NET 2.0 Applications

Debugging Microsoft.NET 2.0 Applications
John Robbins, Wintellect
Microsoft Press

Chapters 1-4 deal with best practices and guidance on making sure your development team has the necessary infrastructure in place to eliminate many of the problems they are likely to face before those problems ever reach a production environment, i.e. during development. Take these chapters and combine them with Steve McConnell's "Code Complete 2" as a must-read for your team.

Chapters 5-6 help prepare you for when things have gone south and you likely need a greater intimacy with debugging tools, including the VS2005 debugger and Windbg, than you ever planned on. Much of the material is either in the help files, or online, but it never hurts to have a paper copy! When in full-on debug mode, a reference to John's debugging process should be taped somewhere in sight.

"This is my debugger. There are many like it, but this one is mine. My debugger is my best friend. It is my life. I must master it as I must master my life. Without me, my debugger is useless. Without my debugger, I am useless..."

Chapters 7-8 deal with the subject of extensibility: VS Macros and Code Analysis Rules. I've never coded a macro, or code analysis rule; not sure I'm going to start now. However, if I did, I'd start by downloading those made available through the book.

Speaking of which, the source code to many libraries referenced in the book is fully available for download, including (ta da!) SUPERASSERT.NET.

Ultimately, the message of the book is that debugging is not some mystic black art, propagated by cryptic commands like '~*kb 50'. There is a method to the madness and you don't have to be mad to see it.

By the way, Wintellect offers a "Mastering .NET Debugging" course hosted by the Godfather of the Debugger himself, John Robbins. I was lucky enough to attend a 3-day session at the Microsoft Las Colinas office last year and I can highly recommend it. Some might think the topic of debugging would be horribly dry; nothing could be further from the truth! It also helps that John is one of the most humble, unassuming developers I've ever met.

Thanks for the book!

Saturday, December 23, 2006

WoF: World of Firefly?

Lance turned me on to Interview with Multiverse co-founder Corey Bridges.

"Burn the consoles and my PC, you can't take this game from me..." is a great parody of the Firefly main theme (original theme by Joss Whedon), but it doesn't quite ring true with me.

Don't get me wrong, I love all things Firefly. But, Firefly as a MMO, using the Multiverse platform? I dunno... I loved World of Warcraft [WoW], but I wasn't likely their target consumer. I played, maybe, a total of 20 hours across three months before canceling any services and tossing the discs. Hell, I even gave up my XBox Live subscription after the first year, just post XBox 360 launch, right when things were getting interesting there.

Looking past a licensing move to drum up interest, what holds the most interest to me is that there are companies providing the assets for me, Joe Consumer, to make my own games. For example, Microsoft's XNA Game Studio Express allows Joe Consumer to build games that target both the PC and the XBox 360. How cool is that? There is even a marketing channel to release your XNA content on XBox Live for others to download.

Now, it looks like Multiverse is doing something similar in that they will provide the platform, or runtime and design tools, for Joe Consumer to develop complete MMO worlds that plug in to the Multiverse, allowing players to choose which world to enter. Sounds cool. It also sounds a lot like a path others have tread with limited success: Neverwinter Nights. I say limited only because it took the NWN aftermarket content so long to take off that I had long since tossed those discs, too. I understand Bioware is still getting some serious mileage out of their efforts, though.

Since Joe Consumer is not exactly your average Blizzard, Bioware, or Bungie, your platform and tools end up with their reputation hanging on the quality of the content produced by consumers of said tools.

A great analogy is the Windows operating system. The Microsoft Windows OS is a remarkably solid platform for delivering content, e.g. applications. Microsoft produces some truly phenomenal tools (Visual Studio) that can be used to produce content. Where Microsoft gets a bum rap is that they've made it so easy to develop applications for their platform that almost everyone can do it, although not everyone should!

Another example might be what Wizards of the Coast did in introducing the d20 "game engine", which defines the core rules that their different game systems are based on. This makes it very easy for the player to ease into a fantasy, sci-fi, or even modern setting without the mechanics of the game play changing too much.
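To make the d20 idea concrete, here's a minimal sketch of the core mechanic in Python (the names and numbers are mine, not WotC's): roll a twenty-sided die, add a modifier, and compare against a Difficulty Class. Everything else in a d20 setting is flavor layered on top of this one check.

```python
import random

def d20_check(modifier, dc, rng=random):
    """Core d20 mechanic: 1d20 + modifier vs. a Difficulty Class (DC).

    Works the same whether the modifier comes from a fantasy rogue's
    Dexterity or a sci-fi pilot's skill ranks; `rng` is injectable
    so the check can be tested with fixed rolls."""
    roll = rng.randint(1, 20)
    return roll + modifier >= dc

# A skilled character (+5) attempting a moderate (DC 15) task:
success = d20_check(5, 15)
```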

Anyway, opportunities abound to flex your creative muscles so get out there and write some code and create some content to help get these efforts off the ground! If you need some inspiration, try checking out some of the Multiverse trailers of existing worlds!

Friday, December 22, 2006

JDE.INI File Settings for Clients and Servers

Found a handy link to an online PDF describing the available configuration sections, along with their values, for J.D. Edwards. It's evidently a chapter from the McGraw-Hill book, "J.D. Edwards OneWorld: The Complete Reference".

The performance of J.D. Edwards can be tuned dramatically by configuring the JDE.INI file appropriately. Of course, consult with your friendly neighborhood CNC before changing any of these values yourself!

BizTalk InvalidPropertyTypeException Followup

This is a follow-up post to BizTalk InvalidPropertyTypeException.

In order to get around receiving these exceptions in our environments, we added a post-deployment SQL script that enables tracking only for those schemas and properties which are part of our BizTalk project and don't actually belong to a BizTalk property schema.

SQL Script to Enable Tracking by Assembly


DECLARE @AssemblyId int;
DECLARE @CLRNamespace nvarchar(256);

SET @CLRNamespace = 'SupplyChain.BizTalk.Schemas';

-- Look up the assembly id for our deployed schema assembly
SET @AssemblyId = (SELECT TOP 1 assemblyid
                   FROM [BizTalkMgmtDb].[dbo].[bt_documentspec]
                   WHERE clr_namespace = @CLRNamespace);

-- Enable tracking for the document schemas in our namespace
UPDATE [BizTalkMgmtDb].[dbo].[bt_documentspec]
SET is_tracked = 1
WHERE clr_namespace = @CLRNamespace
AND is_property_schema = 0;

-- Enable tracking for the properties belonging to that assembly
UPDATE [BizTalkMgmtDb].[dbo].[bt_properties]
SET is_tracked = 1
WHERE nassemblyid = @AssemblyId;


I heard back from Lee Graber that the behavior described does appear to be an issue in BizTalk 2004, but he is unable to reproduce it under BizTalk 2006. Since the impact is relatively low, and there is a workaround available, the likelihood of any fix making it into BizTalk 2004 is very small.

Tuesday, November 28, 2006


Lance sent me this link (thanks!) to the GamerBUS, which is somewhat reminiscent of the now defunct SHOBU GAMES.

I'm still very much interested in a business revolving around games. All the research we did initially has shown the gaming industry to be going through a significant growth period, without much sign of slowing.

I think this is a great idea and hope they do well! It will only make things easier for the rest of us dreamers!

Thursday, November 16, 2006

Sysinternals Suite

Thank you Buck Hodges for bringing this to my attention!

It took Mark moving to Microsoft to make this happen! Our server team has been doing a great job of shielding me from the fact that all the Sysinternals utilities are typically individual downloads. That is, until a newer version of one of them is available (typically Process Monitor) and I get ahead of the server team in updating our servers!

Sysinternals Suite

There was a brief moment where all was right with the world...

TechNet Webcast: Windows Hang and Crash Dump Analysis (Level 400)

TechNet Webcast: Windows Hang and Crash Dump Analysis (Level 400)

This has been released for a while, but has been below my radar. Since I'm on a campaign to help bring awareness to our development and operational teams on how easy it is to identify the source of a process crash if you have just a little bit of knowledge, I thought I'd throw it out there.

The presentation was given by Mark Russinovich, of Sysinternals fame (before being assimilated by Microsoft), during TechEd 2006. I was actually at TechEd 2006, but missed Mark's talk this time around. I did have the opportunity to spend a full day with him and David Solomon during the Microsoft PDC 2005. They give an incredible presentation on the topic of Windows internals and if you haven't seen it, I highly recommend it!

Since Mark has made it up to Seattle, Microsoft has released their Windows Sysinternals website, which is the new home of

Sunday, November 12, 2006

Shanghai, China - Day 7

This turned into a marathon day...

Had the morning to ourselves, so we hit Nanjing Road to do some sight seeing and shopping.

Found our way to Shanghai Pearl City off of Nanjing road. It was pretty interesting. We definitely bought a few items, but don't believe we actually got the best deal we could have.

Wandered around for a bit before meeting back up at the hotel for lunch.

After that, it was a quick trip to the airport and off to the USA!

Of course, our flight inbound to Chicago O'Hare was asked to hold due to inclement weather, then diverted to Minneapolis for fuel, before making it back to Chicago.

Thankfully, a co-worker had the presence of mind (even while under the influence of sleeping pills) to recommend we call ahead to check the flight status of our connection to Dallas. After we discovered that our flight had been canceled, we made arrangements at a local hotel.

After an early morning hop to Dallas, I dropped a co-worker off at his house and then headed to Allen (home). Once there, I enjoyed the family briefly, showered, and then packed everyone up and headed back out to Six Flags in Arlington for our annual company picnic, where we shut the place down around 6PM.

What a day... *whew*

Shanghai, China - Day 6

Today, we really focused on test cases, fixed a couple of bugs, and started applying polish to the client side implementation.

It's clear our partner has the technology savvy to implement the integration; what really remains now is for us to continue the development cycle, striving for project milestones.

There are a couple of undefined areas, mainly around support and escalation of issues, but we believe we understand them and now just need to get management buy-in on a plan for the process to put around them.

The confidence level from both teams is very high right now; we've established a necessary amount of trust in each other to get the job done.

I think we've done a great job, from all teams, on getting through this week. What remains to be seen is how much of it 'sticks' after everyone has gone back to their respective regular jobs, and only time will tell that.

After such a successful week, it obviously calls for a celebration party!

We met up with the business folks and headed to a restaurant named 1221. This restaurant has a slightly more modern menu when it comes to Chinese food and a good atmosphere. It's a little hard to find, but well worth it!

After 1221, it was a quick cab ride over to the 'Port-a-man'. This is actually the Portman Ritz-Carlton, but you have to speak cabbie lingo to get there! Our final destination was Malone's, but we couldn't get our cabbie to understand that.

At Malone's, we enjoyed drinks and entertainment by the band Art 6. Although, 'enjoyed' and 'entertainment' are always subject to each individual's taste!

Shanghai, China - Day 5

Our partner was up late last night making necessary changes to their current architecture to support integration.

This primarily consists of identifying the intermediate storage for incoming and outgoing data (queuing). Once we defined where the data was going to live, we covered how to process it and, furthermore, how to do it all in a recoverable fashion.

At the end of today, we had a data model to play with, test scripts to populate the data model, and a Windows service that would process the data.
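The recoverable-processing pattern described above is worth sketching. The example below is my own illustration in Python with SQLite (the table, column names, and payloads are invented; the real implementation was a Windows service against the partner's data model). The idea: each queued message carries a status that only ever moves forward, so a crash mid-batch leaves rows behind to be retried rather than lost.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE outbound_queue (
                    id      INTEGER PRIMARY KEY,
                    payload TEXT,
                    status  TEXT DEFAULT 'pending')""")
conn.executemany("INSERT INTO outbound_queue (payload) VALUES (?)",
                 [("order-1",), ("order-2",)])

def process_pending(conn, handler):
    """Claim each pending row, process it, then mark it done.

    If `handler` raises, the row is left in 'processing' so a restart
    can detect and retry it -- the 'recoverable' part."""
    rows = conn.execute("SELECT id, payload FROM outbound_queue "
                        "WHERE status = 'pending'").fetchall()
    for row_id, payload in rows:
        conn.execute("UPDATE outbound_queue SET status = 'processing' "
                     "WHERE id = ?", (row_id,))
        handler(payload)
        conn.execute("UPDATE outbound_queue SET status = 'done' "
                     "WHERE id = ?", (row_id,))
    conn.commit()

process_pending(conn, handler=lambda payload: None)
```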

The teams worked really well together considering the language barrier.

One caveat to this is that our partner sandbox actually lives in Dallas, TX. Our partner's office network was a 10Mb connection that was nearly fully utilized. What this meant for our test efforts is that the environment is fine for a functional integrated test, which is its intent.

However, we quickly identified that the connection is next to useless for remote administration of our Dallas sandbox. We discovered this when we needed to make minor contract changes to our inventory service: it took nearly 2 hours to get the very minor changes published, most of which was due to access via Citrix to our MK network. Luckily, this isn't considered 'normal' use, but it's definitely something to consider!

The partner's data center in Shanghai, which is where their code base typically runs, has a much better connection. In the event that the office network becomes completely saturated, they can publish their code to their 'test' environment in their data center for additional, or more complete, testing.

In the meantime, to expedite our process, I wound up hosting our MK Dallas sandbox environment on my local laptop, which made it much easier to set up various test conditions and to make changes.

Shanghai, China - Day 4

Another hugely productive day with the China teams. Today mainly focused on reviewing best practices, performance and scaling, and error reporting and escalation. Throw in the odd discussion about date/time handling and data precision and we were done.

Since some significant modifications need to happen to our partner's systems to complete the integration, we've pretty much accomplished all we can do.

However, our original goal of prototyping the order life cycle is still on the table. Instead of mocking the backend systems, we all agreed that it's time to roll up our sleeves and begin designing and planning the necessary changes. This is predominantly the responsibility of our partner; however, both MK USA and MK China have been invited to participate in the discussions tomorrow.

We were treated to lunch at Moon River Diner, an American-style diner located in the heart of the Pudong District. Had a great blue cheese and caramelized onion burger, with a Coke Light.

Dinner back at the hotel, then a night cap at the Jade on 36 bar.

Shanghai, China - Day 3

Our first day working with our China partner. After a short kick-off meeting, the technical team breaks off and the whiteboarding begins.

We made a HUGE amount of progress today! Language was only a barrier in that it took time for translations to occur during discussions. Mainly we worked through the technical aspects of our service architecture: how to get clients connected, data flows, etc.

We all 'spoke' the same language (C#) when it came time to code. Having everyone from each team in the same room, as opposed to on conference calls, was a huge confidence builder. Now I believe that when we are forced to communicate through conference calls, they will be much more productive.

Our MK USA team celebrated the day's progress with dinner at the Grand Hyatt Shanghai on the 56th floor. Most of us had the tenderloin, which was superb!

This hotel was amazing! I'll post pictures of it later; it had incredible views of Shanghai! Turns out, this was one of our first choices for a residence, but it was booked.

We travelled all the way up to the 85th floor, trying to get to the observation deck, only to find that we had to go back to the first floor to buy tickets. Full of food and wine, and somewhat disoriented from leaning over the rails to look up/down the open interior of the hotel, we decided to head for home once we made it back to the first floor.

Shanghai, China - Day 2

Wake up call (Ray) at 7:30am. Showered and down for breakfast by 8:00am; great breakfast buffet! For ~$60.00 USD, you can get breakfast for two. *ouch*

Since the rest of the party was either under the weather, or arrived later than we did, we set off to wander around the block or two within the hotel.

We found our way down to The Bund. Of course, we were on the wrong side of the river from the official Bund, but after seeing both sides, all I can say is they looked the same to me.

We are within walking distance of the Pearl Tower and its restaurant; it's prominently featured in every skyline photo of Shanghai.

The hotel is around the corner from a new mall: The Super Brand Mall. Yup, you guessed it. It features name brand department stores from both China and the US.

With the humidity, we were soaked in sweat by the time we got back. Another shower.

We met the rest of our party to strike out in search of lunch and took a cab ride to the official Bund. After walking around, we found a restaurant that was part German beer garden and part Chinese restaurant. The food was pretty good, all local cuisine. At least we found something with some spice (calamari something), which was hard for me to find last time through!

While out and about, we found probably another half dozen monuments that I failed to catch the names of.

Since they closed 'the market', we did our shopping at the Yu Gardens while we were out, since we didn't know how much time we'd have later.

A couple of drinks while discussing goals for tomorrow, dinner at a hotel restaurant, another shower, and then turning in.

One truism that I'd like to share: I've stayed at 5-star hotels throughout the US and now a couple in China, and the room service food always sucks, but you can always find Kellogg's Brown Sugar and Cinnamon Pop Tarts at any convenience store! Yumm....

Shanghai, China - Day 1

Our party arrived at Pudong Airport (PVG) in Shanghai, China, on Saturday, more than a little weary from our 17-some-odd hours of travelling from Dallas (DFW) through Chicago (ORD).

I was lucky enough to sit next to a co-worker for the Chicago to Pudong segment. A couple of things we figured out (and no, it doesn't include world peace):

  • Gin is a great way to pass the time. We played 6 hours of it. At the end, I was only 6 points ahead. Of course, that still makes me the winner! :)
  • There should be a version of Survivor based on long haul international flights in coach. There would be things like immunity challenges won by making it to the bathroom without waking sleeping travellers, or who can 'hold it' the longest. There is all the drama of the popular show; personality clashes with the other passengers sitting behind you in coach, near physical confrontation over the slightest perceived insults! Maybe Jeff Probst will pick up our idea!

Well, after surviving the tedious boredom of the flight and managing the hurdles at the airport (ignore those images of OJ Simpson dashing through the airport that just popped into your mind, it's just a figure of speech), we checked into the Pudong Shangri-La in Shanghai. By far the nicest hotel I've ever stayed in.

There is nothing quite like exiting the baggage area at the Pudong airport: you've got a throng of people all excited to see you, photo flashes going off in your eyes! There is a red stripe you're supposed to follow that looks remarkably like a 'red carpet', which terminates with a white-gloved attendant holding a sign with your name on it, who quickly whisks you into your own personal car. So this is what [insert your favorite Hollywood star here] feels like!

BTW, if you search for Shangrila in MSN search, you'll receive *ahem* questionable results. Evidently Shangrila has the same meaning in many countries!

Wednesday, November 01, 2006

Snap vs. Dump

It never occurred to me that I might need to expand a little on why, in debugging lingo, I refer to 'snapping' as opposed to 'dumping'. See this post for an example.

Imagine the following typical conversation, with a Microsoft Escalation Engineer. For those of you who don't know, a Microsoft Escalation Engineer is at the top of the food chain when it comes to getting to the root of a problem quickly, and reliably, on the Windows platform. They have something akin to super powers when it comes to debugging and I have the utmost respect for anyone in this position! Which makes it that much more bizarre...

"So, you know how to take a dump?", says the Microsoft Escalation Engineer.

"Sure! I've been taking dumps as long as I can remember. You want me to take one now?", I reply.

"Yes. Go ahead and take a dump for me, then upload it so I can review it. Once I get a chance to sift through the results I'll get back to you", explains the Microsoft Escalation Engineer.

We carry on for a few more minutes on the subject of dumping. Seriously... It's only in retrospect that I cringe at how someone might take the conversation out of context.

Now, I'm not above a little snickering (even as writing this), but we've got to draw the line somewhere don't we?

It was after one such conversation that I began referring to them as 'snaps'. I probably even read it somewhere, but it sounds so much more, er...professional, than 'dump'.

Debug Tools for Windows - Snapping a Process

This is a modified email that went out to our development and operation teams yesterday that I thought I'd share.

The focus of the post is to help educate our operators and administrators on what to do when they witness behavior in a production system that needs to be reviewed.

In this particular case, memory utilization of a process grows until the process stops responding, which affects our user base, which is when the phone starts ringing! To keep things 'moving along' the process is typically recycled.

Under the pressure of attempting to right a system that's gone belly up, we want it to become second nature to inject an additional step into the current process which will aid in diagnosing the problem.

Installing Debug Tools for Windows

The latest version can be found here. Check your servers to make sure they have a recent version! Recent being defined as a version that was released this year.

I've installed this on production servers without problems. One thing to note: there are two ways to get it 'installed' on a server in our environments.

  1. Run the installation by clicking the setup.exe - or -
  2. Run the installation on one server and then xcopy the directory to the production server. The only thing you lose is access to the tools from the Start menu. This is how the server team typically installs it, to c:\tools\debugging tools for windows. You'll note that the version on the servers is likely out of date unless you've installed/upgraded it yourself.

To take a memory snap
(sometimes referred to as a ‘dump’; ‘snap’ just sounds better):

  1. Open a command prompt
  2. Change to the Debugging Tools for Windows directory
  3. Execute adplus -hang -p xxxx (where xxxx is the process id of the process you want to snap); an option you may need is '-o', which redirects the dump to a directory other than the current working directory.

In the default configuration, meaning you’ve installed per procedure above, you will receive a warning dialog about missing environment variable symbol paths. It’s safe to ignore this dialog.

If it’s suspected that a process is ‘hung’, or ‘locked’, take additional snaps of the process, but at least a minute apart. Usually 2-3 will suffice. This will allow the reviewer to confirm a hung process or not.
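If you'd rather not count the minutes by hand, the repeated-snap procedure can be scripted. The sketch below is only an illustration (the `take_snaps` helper, output path, and PID are invented, and `runner` is injectable so the sketch can be exercised where adplus isn't installed); it builds the same adplus command line as step 3 above:

```python
import subprocess
import time

def take_snaps(pid, count=3, interval=60, out_dir=r"c:\snaps",
               runner=subprocess.run):
    """Take `count` hang-mode snaps of process `pid`, waiting
    `interval` seconds between them, mirroring:
        adplus -hang -p <pid> -o <out_dir>"""
    commands = []
    for i in range(count):
        cmd = ["adplus", "-hang", "-p", str(pid), "-o", out_dir]
        commands.append(cmd)
        runner(cmd)  # each run suspends, snaps, and resumes the target
        if i < count - 1:
            time.sleep(interval)
    return commands

# Example: take_snaps(1234) would snap PID 1234 three times, a minute apart.
```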

A snap will take up as much space on disk as the process consumes in memory, so it can fill a disk quickly. Because the threads of the process are temporarily suspended during a snap, I do not recommend pointing the output (-o) to a file share when you actually take the snap. This adds much latency to the operation. It's much faster, and less intrusive, to snap to a local disk and then xcopy to a network share, or a workstation, for review.

There are additional options, all of which are available from adplus.vbs; just open it up in your favorite text editor for review. There is also a significant amount of information online, just try Googling ‘adplus’, ‘debugging windows’, etc.

Back in China beginning 11/4/2006

We are headed back to China for a week on Friday. This time around, instead of Powerpoints, we are going to be prototyping the implementation between us and a third party logistics (3PL) vendor.

The prototype will focus on the sales order life cycle and is expected to take 1-2 weeks.

Our team is mainly focused on providing consulting and best practices to the vendor to help enable both parties to reach a mutual goal of successful integration. At the end of the engagement, if the vendor has a solid understanding of the process involved, it will be up to them to implement the remaining interfaces.

Today begins the downhill slide into Friday and the checking and rechecking to make sure we aren't forgetting anything that we might need while on site.

Sunday, October 29, 2006

BizTalk 2004 SP2 Available

BizTalk 2004 SP2 is finally available for download as of last week. If you're still running BizTalk 2004 (like we are), SP2 includes several cumulative fixes for things that have likely been plaguing you!

Tuesday, October 24, 2006

Calendar Available

The Google Calendar is one of those things that I think is just plain cool. Being able to access my calendar from anywhere is extremely valuable. For now, I'm just noodling with how I can make some things public vs. private, etc..

If you're interested in subscribing to the calendar (e.g. you participate in Game Night, etc.) try this link:

Monday, October 23, 2006

Dallas area BizTalk User Group

If you’re looking to get into a little community involvement with BizTalk Server, you’ve probably seen where they are starting a user group here in Dallas. This is just a reminder that they have a meeting coming up; details for which can be found on their website:

They have a couple of notable charter members:

I Wanna be a Rock n Roll Star

ok..maybe not. Playing for my own enjoyment and relaxation would be enough.

When my son received his first guitar back in July, a little "First Act" acoustic, I tried to ignore it. I really did. I'd just find myself with these conflicting emotions: either smash the guitar (it was so out of tune, and my son's strumming so incessant) or pick it up and try to make something with it.

Late at night, after everyone was in bed, it would be there silently whispering my name :'zaaaaaach'.

With disdain for such a cheap feeling instrument written all over my face, I finally caved, and picked up the guitar and started noodling with it.

I used to play guitar, you know. Back in the day. Then Life happened and the thousands I had invested in guitar equipment over the years turned out to be worth only hundreds when liquidated.

First things first. I had to somehow get it into tune before I gave into the dark side and smashed the damn thing. Sending out feelers through the 'net I found a gem of a site called

There I found all I needed to get started, even a downloadable recording of low E to use as a reference note when tuning. In hindsight, I could have used our piano's E and been in tune with it.

In addition to 'How to Tune a Guitar', I found other valuable refresher lessons. The only chords I could remember were D, G, A, and E. Before long, I had trebled my chord repertoire and regained some basic scale knowledge (I used to be a big scale hound).

A little more sifting through digital goo turned up another gem: Within minutes I had tab for hundreds, if not thousands, of songs!

The 'First Act' was a good starter, and I'll keep it indefinitely to cover my acoustic needs (when in tune, it sounds remarkably good); however, I just ordered an Ibanez ARX300CRS from If you're like me, and the name doesn't mean anything, here ya go! It's what I believe will be the Ibanez offering against a Gibson Les Paul, or Epiphone. We'll see... I'm not completely sold on the body style, but there's a 30-day guarantee I can take advantage of if I decide I don't like it.

Here it is Monday morning and, lo and behold, I just discovered I used The Computer this weekend only to search for guitar-related info. I barely read email, read no feeds, and certainly never fired up Visual Studio!

What did I accomplish? I rekindled a love affair with music; I'll try to keep the flame from burning out this time! I rediscovered an old classic jam song, Led Zeppelin's "Bring It On Home"; with a little practice I was back at it. I also rediscovered the joy of the burning sensation in my fingertips, because I don't have any calluses yet... right now they feel like I touched a hot skillet!

Ahhhh, but it's worth it... :)

Friday, October 20, 2006

BizTalk InvalidPropertyTypeException


We were receiving first-chance exceptions of type Microsoft.XLANGs.RuntimeTypes.InvalidPropertyTypeException while processing messages through BizTalk 2004.

The exception was obviously handled somewhere in the runtime, as we never saw it in any of our exception handlers or the event log. Since we were never aware of the issue, it's likely been occurring in our environments for the last year.

With help from Mike (MSFT), we began taking ADPlus crash dumps on first-chance CLR exceptions. What we confirmed was that BizTalk was attempting to track properties that didn't exist in some documents, due to a misconfiguration on our part.

Historical Interlude

Early on in our BizTalk development, we found that if we just turned on tracking for our property schema, it wasn't necessary to go enable tracking for all of the properties on all of our schemas. What a great shortcut! It just worked... This property schema tracking was enabled by going to HAT | Configuration | Messages and locating our property schema (SupplyChain.BizTalk.Schemas.PropertySchema) in the list and checking the 'Track' box for all entries.

Little did we know that we were silently throwing these exceptions in the background!

Since BizTalk was attempting to track all the properties in the property schema, for each message, an InvalidPropertyTypeException could be thrown and handled by the runtime many times per message.


Unchecking tracking on our custom property schemas and enabling tracking on the message properties themselves cleared up the InvalidPropertyTypeExceptions.

In hindsight, another early warning indicator would have been to monitor the Perfmon counter

.NET CLR Exceptions\# of Exceps Thrown/sec

during our initial analysis.

Many thanks to Mike S. (MSFT) and Michael E. (MSFT) for their insight and assistance in resolving this issue.

Original newsgroup post can be found here.


Friday, October 13, 2006

Debugging BizTalk Memory Leaks

While running down a suspected memory leak inside a host process, it became necessary to identify what objects were on the heap before our messages flowed through the BizTalk system, force a garbage collection, and then review what was on the heap afterward.

This was to verify that resources used during the processing of the messages were cleaned up as expected.

There are three components to making this work.

  1. a message schema
  2. an orchestration
  3. a .NET component that can be called by the orchestration from an expression shape

Typically, after a recycle of the BizTalk host instance that our resources are running in, take a memory snapshot using ADPlus from the Debugging Tools for Windows. This is the baseline for comparison after running the messages in question through the system. It's a good idea to have known test data available; you don't want just any garbage running through, as it will make identifying out-of-place resources more difficult.

Once you've run the test data through the system, and have verified with BizTalk Health and Activity Tracking (HAT) that those messages and service instances have processed to completion, take another memory snapshot to see what's on the heap. Again, this is mainly for later comparison analysis.

Drop in the GCMessage instance, which will force a garbage collection to occur, presumably collecting all the objects that were in use by your BizTalk artifacts. Typically we just want a generation 2 collection, as that also collects everything in the younger generations. If you know you have objects that require finalization, then drop the message in a second time (snapping memory in between messages).

Finally, snap the memory one last time and see what's left on the heap. This can give you a solid indication of whether or not your objects and resources are being cleaned up appropriately.

Here is a summary of the solution moving parts:

We use a message instance to control when we want the garbage collections to occur. The schema itself is simplistic; here is a sample instance:

<GCMessage xmlns="http://shobu.schemas/1.0/GC"><heap>2</heap></GCMessage>

The orchestration is not represented here, but it is necessary to use a distinguished property for the heap element so that its value can be accessed from an expression shape inside the BizTalk orchestration designer. The expression shape is used to invoke a method on the helper class, GCWrapper (below).
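Inside that expression shape, the call is a single line. A sketch (msgGC and the exact property name are illustrative; they depend on how the message variable and distinguished field are named in your solution):

```csharp
// BizTalk expression shape contents (illustrative names):
// msgGC is the received GCMessage; heap is its distinguished property.
// If the element is typed as a string in the schema, wrap the value
// in System.Convert.ToInt32 before passing it along.
SHOBU.BizTalk.GCWrapper.InvokeGC(msgGC.heap);
```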

Here is the sample .NET component that is used to force a GC from within the orchestration. This is packaged in a class library and must be signed with a strong name so that it can be placed in the global assembly cache (GAC).


using System;

namespace SHOBU.BizTalk
{
    /// <summary>
    /// Utility class to wrap GC functions. Enables the invoking of .NET
    /// code from a BizTalk orchestration.
    /// </summary>
    public class GCWrapper
    {
        public GCWrapper()
        {
        }

        /// <summary>
        /// Forces a garbage collection of a specific generation heap.
        /// heap values:
        /// 0 - generation 0
        /// 1 - generation 1
        /// 2 - generation 2
        /// Passing a value of 2 will force a garbage collection of all
        /// managed heaps.
        /// </summary>
        /// <param name="heap">The generation heap to collect.</param>
        public static void InvokeGC(int heap)
        {
            // Collect the requested generation, let any pending
            // finalizers run, then collect again so that finalized
            // objects are actually reclaimed.
            GC.Collect(heap);
            GC.WaitForPendingFinalizers();
            GC.Collect(heap);
        }
    }
}

Of course, the solution needs to be built and deployed to the BizTalk server in order for it to work.


Monday, October 09, 2006

Sunday, October 08, 2006

Google Reader

A friend of mine posted about Google Reader over on his blog so I decided to check it out. I haven't even scratched the surface of all the features it offers, but it does give me a centralized view for the RSS feeds I subscribe to.

The problem I've traditionally had is that I use Mozilla suite at home (Thunderbird, Firefox) but use Microsoft suite at work (Outlook, IE). This made it difficult to always get the same views of my feeds. Now I don't have to worry about it.


A great-looking adaptation of Frank Miller's 300 is in production for release next March. Check out the teaser here.

Saturday, October 07, 2006

BizTalk SOA and Business Process - Day (4)

Wow... it's been a week. I have to bow out of the final day's sessions in order to catch a plane back to Dallas. This is a shame, because several sessions look promising:
  • Communication, Flow, Rules and Logic by Steve Swartz and Clemens Vasters (Microsoft)
  • BizTalk Web Services: The Next Generation by Aaron Skonnard (Pluralsight) and Gruia Pitigo-Aron (Microsoft)
  • Applying Maximum Sustainable Throughput to a Management Operations Strategy (Scott Colestock)
  • BizTalk WCF Adapters In-Depth by Aaron Skonnard (Pluralsight) and Gruia Pitigo-Aron (Microsoft)
  • Developing & Maintaining Business Rule Solutions by Richard Seroter (Microsoft)
  • The Future of the Microsoft Application Server Platform by Steve Swartz and Clemens Vasters (Microsoft)
Overall, the conference was informative, but to truly get full value an organization needed cross-functional representation available; at the very least, representation from the Developer/Architect and Business Owner/Manager roles.

BizTalk SOA and Business Process - Day (3)

Literally a roundtable: a discussion of several topics among approximately eight attendees around the breakfast table who comprised our 'panel'.

BizTalk and Virtualization
7 out of 8 use BizTalk in virtualized non-production environments. That means all development, staging and functional test environments were virtualized.

7 out of 8 use Microsoft's Virtual Server product over VMWare.

Can you guess who the lone holdout was that doesn't use BizTalk in a virtualized environment? I'll give you a hint: it's the same person who uses VMware. No one had anything against VMware; I got the feeling some panelists would have preferred it, given a choice. However, all voted with confidence that virtualization was a great thing, with the caveat being SQL Server. With a high-end SAN backing it up, I wonder if the sentiment would still have been the same.

Virtualized Development Environments
We spoke about the need to have such a wide range of tools and technologies available on the desktop, tools that are not necessarily compatible with one another. Brian Prince (a consultant and speaker at TechEd 2006) spoke in detail about how their entire development environment is virtualized, even on the desktop.

Brian's team, being consultants, may be working on as many as two or three projects at once, and each project could have a technology stack incompatible with the others. In such a scenario, virtualization saved the day. For each project a developer works on, the developer receives a VHD representing each tier, e.g. development workstation, BizTalk (or other application server), and SQL. One of the keys is evidently giving each VM a new SID and, with BizTalk, installing but not configuring the instance until ready. So it's effectively a manual post-build step after the scripts create the virtual environment.

Sunny, from EDS, explained how they script most of the servers for a virtual environment, from domain controllers and databases through application servers.

Being able to quickly turn out new 'environments' seemed to be a huge productivity gain, giving faster project startup times and consistent developer workstation builds.

Continuous Integration, Repeatable Builds and Deployment
All the talk of virtualization and scripting consistent environment builds quickly turned into a discussion of how to turn out builds quickly. The opinion seemed to be that for library and ASP.NET (including web service) builds, continuous integration worked fine. This not only includes component builds, but deployments to an integrated environment. Brian had the most advanced process, which also included notifying the test team of the availability of the new build, along with the change sets that were deployed.

With BizTalk projects, a daily or regularly scheduled build was more appropriate; the general consensus was that this is due to the complexity of the deployment. However, all agreed that getting the BizTalk build and deployment down was critical to the forward motion of a BizTalk project.

It was worth noting that 7 of 8 panelists were now using Team Foundation Server to some degree and MSBuild to build their projects.


Access and Identity Management by Steve Swartz & Clemens Vasters (Microsoft)
I don't know how much of it was about Identity Management; and here, 'Access' meant data access, not access as in authorization. So while the session wasn't exactly what I expected, I enjoyed it nonetheless. These guys work very well as a speaking team.

ACID transactions in data access are extremely costly and don't scale. Consider using compensation strategies instead.

Key takeaways were to consider how the data in the application (any application) is to be accessed. "Correct data architecture is the ultimate performance optimization"; in those times when I've truly encountered performance problems, they were traced to data access.

High Availability, Fault Tolerance and Scalability with BizTalk Server 2006 by Jay Lee (eBI Solutions)
I believe this talk was given by a different person at TechEd 2006; at least it had the same 'Failover and Recover BizTalk in less than 5 minutes' demo. It's impressive, and makes you want to double-check that it works in your own environment.

The configurations for BizTalk to achieve high availability were right out of BizTalk documentation you can find online here.

In the slide deck, it appears that Jay had the master secret server installed on the SQL boxes.

MSI is the prescribed method for deployment; reminder to import the msi from one server and then run the msi on the remaining servers in the server farm. This can be automated.

Again, I saw BizTalk in a virtual environment: one Active Directory VM, one SQL VM, and two BizTalk VMs, all on a laptop. Just what kind of laptop was he running?!?

There was a recurring theme throughout the conference, reinforced here, that BizTalk Developer and BizTalk Operations were two distinct roles. Essentially the Developer role is responsible for development, building, packaging and configuring bindings. The Operational role is responsible for deployment, health, performance and scalability of the system.

Lunch with Covast: Newest Developments in B2B
Interesting that Covast is moving more into the partner space rather than competing against BizTalk Server. They also discussed an integration appliance for B2B scenarios using EDI formats.

Effective Techniques for Handling Large Messages in Service Oriented Solutions by Thomas Abraham (Digineer)
Briefly talked about MTOM and streaming methods before jumping into a custom solution he had developed to address a specific business case. Essentially he had to transport PDF files, and these PDF files could be large. Sucking them through BizTalk as base64-encoded data performed fairly poorly. Thomas's custom solution was to split the data on the way in, stream the PDF data to a file share, and then correlate the message identifier back to the PDF on disk at the end of the process. Kudos for being creative. One audience member brought up disaster recovery, and it's true: this should work fine as long as the backups of the database and the file share are in sync. Otherwise there might be some data loss, but it should be relatively easy to reconcile, depending on the volume of data.

Here is a quick chart of message sizes and their categories.

Message Size        Size Category
< 10KB              Ideal
10KB - < 100KB      Small
100KB - < 1MB       Medium
1MB - < 5MB         Large
> 5MB               Very Large

Configuring, Building and Deploying BizTalk Applications in a Distributed Environment by Paul Gomez (ThinkBox Solutions)
Again, something of an identity crisis. ThinkBox Solutions shouldn't be confused with ThinkBox educational software (though they do have a Development Services group).

Paul had the hard luck of following a session that had already covered much of the material he was presenting. He discussed creating highly available BizTalk configurations, which had already been covered pretty well.

The client solution that they had developed sounds fairly large (8 BizTalk Servers) and that definitely means that they've got issues around deployment.

Currently, ThinkBox uses NAnt (shelling out to BTSTask) to meet their BizTalk deployment needs. To help keep things streamlined, they deploy all bits to all boxes in the farm. This keeps the server builds consistent, but allows each node in the farm to be configured for its specific function, or role (Processor, Sender, Receiver).

Their security account management seemed overly complex with an account for each server role type (Processor, Sender, Receiver).

Prior to deployment to any integrated environment, they use the concept of a deployment staging server to configure bindings. This is the server the bindings will be exported from for use at deploy time.

Interestingly enough, they created their own direct submit adapter, which simply submits a message to the message box. One odd thing was that they had a pair of servers dedicated to running the host instances for this adapter. Something else to note: in order to submit to the message box, each developer workstation had to be a member of the BizTalk workgroup.

Again, it was stressed that the Developer role is to develop and package the build (including bindings) and the Operational role to deploy, monitor, and tune. ThinkBox, like many organizations with a large deployment, has a dedicated deployment guru.

Where the discussion was weak was around the physical organization of the project structure for ThinkBox solutions, how they handled versioning, etc..

BPM Q&A Panel
Little to no discussion of Business Process Management; it was essentially a continuation of the developer/architect Q&A on products and platforms. This could, and should, have been moderated better.

One problem might have been that the majority of the 'business people' seemed to have disappeared. I don't know if they bugged out early or were just attending different sessions. I would have sincerely liked to find a couple of them to engage in discussions of what's important to them in their business.

BizTalk SOA and Business Process - Day (2)


Customer Panel: SOA Success Stories
Panel: Ryan Garner (JetBlue Airways), Justin Myrick (Clear Channel Communications) & Rodney Turpin (Hewlett Packard)

Ryan Garner, JetBlue Airways
JetBlue's SOA evolved out of a business driver to standardize partner integration and take advantage of additional inventory channels.

Part of their success was attributed to creating a standards body for service creation. He stressed the importance of tackling areas such as:
  • Security
  • Health Checking
  • Monitoring
  • Logging
While JetBlue uses BizTalk, it is not heavily used in their SOA implementation. They do use it to orchestrate the composition of multiple services to expose a single service.

They use a purchased product called Service Manager from SOA Software (yes, agreeably confusing) for their web service monitoring and Service Level Agreement (SLA) compliance.

They were also the only member of the panel to discuss setting up a forum for partner discussion and feedback, using Community Server. While they seemed to have more partner integration than others on the panel, there was no discussion of partner management (they ran out of time). If I can catch Ryan later at the conference, I will try to get that question in front of him.

They followed a contract-first, or schema-first, approach to their service creation and credit that as a success factor.

Justin Myrick, Clear Channel Communications
Their SOA also evolved out of a business requirement. Evidently there was a proliferation of ERP systems, essentially one per market... and Clear Channel is in a lot of markets.

Stated Microsoft Technologies:
  • BizTalk 2006
  • Microsoft .NET 2.0
  • Sharepoint
  • Reporting Services
  • SQL Server 2005
  • K2
  • Peoplesoft
  • UDDI
They were also given the direction to "portalize Peoplesoft" by the CIO, effectively targeting a view, or views, of work and business functions to specific users and user groups. Justin never came out and stated it, but this is really composition, or aggregation, of multiple services to present a view to a user. It reminds me a lot of Microsoft's efforts around the Composite Application Block (CAB). This portal was developed and rendered using Sharepoint technologies.

Clear Channel used a WSE filter to log message request/response data to a database that was used as a debugging and operational aid.

Rodney Turpin, Hewlett Packard
While the other two panelists had much more recent SOA implementations (less than a year), HP has had an ongoing SOA initiative that has been evolving for several years. HP has something on the order of ~11,000 disparate applications that needed to be integrated, or their functionality composed into services easily consumed by new systems.

HP is the only panelist to use the Business Activity Monitoring (BAM) feature in BizTalk for monitoring business processes, but its use was admittedly "light". Rodney had high regard for Microsoft Operations Manager (MOM) for BizTalk application server monitoring.

There was an initiative to collapse data centers and their SOA implementation needed to support that.

Stated Microsoft Technologies:

  • BizTalk 2004 + BAM
  • Microsoft Operations Manager
  • Visual Studio Team System (all features: Work Item tracking, version control, etc).
  • UDDI
As for BAM, all panelists plan on incorporating BAM support into their BizTalk implementations. When asked why they didn't implement it as part of the first release cycle, they all stated that time constraints prevented it.

Avoiding 3 Common Pitfalls in Service Contract Design by Tim Ewald, Foliage Software Systems
Tim had a pretty good session, and it's unlikely that I can capture it all here. There was significant information presented, especially around versioning strategies, that might take some time to digest.

While approving of a canonical data model, he discussed why canonical data formats for a business typically fail. They have three major problems, which typically manifest as:
  1. Too much required data
  2. No realistic versioning
  3. No system level extensibility
Too much required data
Typically a standards body is formed to define the canonical data format. This standards body spends time ([n] weeks or months) interviewing business consumers, identifying what data is required. This required data makes the schema, and the systems that consume it, brittle.

No realistic versioning
Schemas are typically versioned through the namespace. This typically breaks any implementation that consumes the schema (effectively creating a brand new entity that just happens to look a lot like the entity it is a revision of).

No system level extensibility
While the standards body is out interviewing different business teams and defining the model, by the time they arrive at a standard, it's likely that a business team will already need a revision. However, because it takes the standards body time to review a change request, there needs to be a way for a system to extend the schema without breaking it. His recommendation is to include an Extension element containing a ##any wildcard, effectively allowing a system to stuff anything it needs there. Of course, the receiving system will need to know what to do with it. The thought is that this allows systems to stay 'agile' until the standards body can get through the review process on whether or not to standardize the extension.
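A minimal sketch of what such a schema fragment might look like (the element names and structure here are invented for illustration; only the Extension/##any shape is the point):

```xml
<!-- Illustrative only: names are made up -->
<xs:element name="Item">
  <xs:complexType>
    <xs:sequence>
      <xs:element name="Sku" type="xs:string"/>
      <xs:element name="Extension" minOccurs="0">
        <xs:complexType>
          <xs:sequence>
            <!-- ##any lets a system add elements without invalidating
                 instances against the canonical schema -->
            <xs:any namespace="##any" processContents="lax"
                    minOccurs="0" maxOccurs="unbounded"/>
          </xs:sequence>
        </xs:complexType>
      </xs:element>
    </xs:sequence>
  </xs:complexType>
</xs:element>
```

With processContents="lax", a validating consumer checks the extension content only if it recognizes it, which is what keeps the original systems from breaking.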
This raises the question of what to do with the maintenance issue that has just been created. The assumption is that an extension a system adds might be so useful that it is adopted into the canonical data model. Now any new work can take advantage of the new data model. But what about the implementations already baked using the extension? Theoretically, they can continue to use the extension as-is, because it will 'always' be supported. I can imagine you would eventually want to sync up with the standard, though, which will cause rework on now-legacy interfaces. Taking an example from our organization:
  1. ERP produces an item update that needs to be published to consumers
  2. BizTalk maps the ERP Item into an SCS Item
  3. BizTalk then routes to consumers (e.g. Order Entry and WMS)
  4. Mary Kay then onboards a new Manufacturing system that needs the SCS Item, but it also needs some additional data that wasn't part of the original SCS Item specification
  5. The ERP system is modified to produce the item update with the additional data
  6. BizTalk maps the ERP Item into the SCS Item, the addition is mapped into an Extension element
  7. BizTalk then routes to consumers; original systems continue to use the SCS Item as normal (remember, the ##any was already part of the schema, and the new data is an addition; if a system is validating, the instance is still valid and the new data is ignored by the original systems)
  8. BizTalk will map from the SCS Item to the MFG Item, taking the new additions from the Extension element. The problem here is that the BizTalk mapper does not work with ##any elements; however, custom XSLT can be used to map the remaining additional elements.

Any change must be communicated to the clients.

Tim had a great table for when to version a schema and/or service. I'll try to reproduce it here when I have more time; I'm betting it's probably available online somewhere.

Advanced Routing and Correlation with BizTalk Orchestrations by Lee Graber, Microsoft
Lee is the Lead Developer for the BizTalk Server product. His talks are usually chock full of BizTalk goodness.

The BizTalk runtime uses the WasPromoted property to determine whether or not to 're-promote' properties for routing.

Correlating subscriptions are deleted non-deterministically. That is, the runtime cleans them up after they have been satisfied, but we don't know exactly when.

A Listen shape with a Delay shape == 'Zombie Factory'. A 'zombie' message occurs when there is a problem, either at the transport or with the logic of what the process is trying to accomplish. In the case of the Listen with the Delay, a zombie occurs when the delay timespan expires, yet the message is received either at that exact moment or before the correlating subscription can be cleaned up. This results in a 'completed but with discarded messages' message.

Lee also gave a great demonstration of a Resequencer pattern using orchestration. The sample is available for download; when I find it, I'll post a link. It was incredibly simple and powerful. I'm also wondering if we can do some sort of batching with BizTalk using a similar pattern.

Large messages are fragmented into smaller chunks when stored in the database. This occurs at the Message Agent layer. Only the first chunk is loaded and sent to a service instance.

It's key to use a distinguished property when accessing message data in an orchestration, so as to avoid reparsing the message. Using the xpath construct within the orchestration causes the message data to be read in order to evaluate the xpath.
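To make the difference concrete, here is an illustrative pair of expression-shape statements (msgOrder, OrderId, and the xpath string are invented names for the sketch):

```csharp
// Distinguished property: resolved from the message context,
// no reparse of the message body.
orderId = msgOrder.OrderId;

// xpath construct: forces the message body to be read and parsed
// in order to evaluate the expression.
orderId = xpath(msgOrder, "string(//*[local-name()='OrderId'])");
```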
[Saw Marc Berry, a consultant who has given Deep Dive training on BizTalk!]
Understanding the Microsoft Application Platform Infrastructure Optimization Campaign by Pearson Cummings, Microsoft
Ah, man... this was almost a complete waste of time for a customer. I would love to understand my organization's position in the maturity model, but I got almost nothing from the presentation other than where to go for additional information: Microsoft Infrastructure Optimization Partner Kit

Developer Q&A Panel
Dominated by BizTalk questions... good stuff. This was a chance to bounce techie questions off members of the development teams for BizTalk Server, WCF, and WF.

A QFE for versioning of MessageType is available if required. I'm not exactly sure what the context of the question was here.

There is a comprehensive WCF UDDI sample available as part of the WCF runtime.

Requests for a dynamic orchestration filter were common; this is likely a vNext feature.

A couple of people had implemented the JDE OneWorld XE BizTalk adapter and were expressing how painful it was, e.g.:
  • 2 months to implement with both Microsoft and iWay help
  • JDE running on AS400
  • Required .jar compilation leads to a very fragile implementation
They are working on the documentation for the BizTalk LOB adapters. Jay Lee, a CNC consultant, is putting together a whitepaper on the adapters that should be available soon.

The new adapter (R2?) for BizTalk PeopleSoft will be based on the 8.96 toolset and will use thinnet through the COM API, not the Java bridge.

Microsoft has signed a partnership with Oracle to assist in the development and support of these LOB adapters.

BizTalk SOA and Business Process - Day (1)

Quick statistic:

This is year 5 of the conference. Year 1 had ~50 attendees; Year 5 had ~800. There were roughly ~120 invited to the Day 0 events.

SOA, BPM and Microsoft: A Pragmatic View by David Chappell (Principal, Chappell Associates)
Turns out there are three David Chappells, and two of the three have appeared on TV. Given that, it might be easy to understand the initial confusion: which David Chappell are we talking about? Is it David Chappell, David Chappell, or David Chappell?

Dynamic speaker, has great stories and a good sense of humor. Poked fun at both Gartner and Forrester definitions of an Enterprise Service Bus (ESB).

Discussed the upcoming Windows Communication Foundation (WCF) as this generation's TCP/IP standard for the Microsoft platform. Also indicated that the Service Component Architecture is the Java world's answer to WCF. There was some discussion that Sun technologies were fragmenting and that the platform might be left in chaos; this statement was based on three technologies competing for share on the Java platform: Open SOA, Service Component Architecture, and Open Source.

Noted that there is no queuing standard with WCF for application-to-application communication. I had incorrectly assumed that WS-ReliableMessaging (WS-RM) fulfilled this requirement. That's not the case: the message is not durable (a la MSMQ); however, delivery is guaranteed, much like a TCP/IP connection. He indicated that this is a target area for future development in WCF.

Real World SOA by John deVadoss (Director of Architecture Strategy, Microsoft Corporation)
Somewhat drier material. The biggest message seemed to be damage control on the negative SOA spin in the media lately. He kept referring to the "hype cycle" typical of industry buzzwords and urged us not to "throw out the good with the bad".


The Architecture of SOA by John Evdemon (Microsoft)
Again, dry technology material. The biggest message is that SOA is not technology- or platform-specific.

Pointed out that Readings in Service Orientation is a free online book containing some best practices for composing an SOA.

Choosing the Right Technology for Exposing and Composing Services by Kris Horrocks (Microsoft)
I've caught talks delivered by Kris at TechEd 2006, and he does a good job of keeping the information at 'just the right level'. The biggest topics were around deciding when to use SSIS, BizTalk, or Windows Workflow Foundation (WF).

There might be an opportunity to replace the proprietary SCSWS SQL queue with a SQL Server Service Broker implementation.

Selling BizTalk-Based Engagements by David Chappell (Principal, Chappell Associates)
I went into this session thinking it would be about selling BizTalk as a solution to integration opportunities within an organization, and for the most part it was. However, it was definitely partner-focused, helping you move from the Sales Call through The Close. Much of it was still applicable to stakeholders within an organization.

Partner Expo

The partner expo was definitely no comparison to what you might see at a PDC or TechEd. There was little to no swag to be had. The only thing worthwhile was a copy of the book BizTalk 2006 Recipes. However, they weren't giving them away; they were just for display. When I asked how I could get one, they joked that several of the authors were there, and if I could get them to tell the staff to give me one, they would. Well, holding down an author and forcing him to give me a free copy of his book wasn't high on my list of priorities (I already have a copy of the book on the way). But grabbing a bite to eat and getting back to the hotel for a 6 AM conference call with Dallas was. The snacks were just that... snacks.

Since the snacks didn't fill me up, I went ahead and ordered room service: vegetable lasagna. It was the worst dish I've had in a while. There were raw carrots in the middle of it. Not little carrots, mind you, but the big fat kind. Blech.

"Did I say that?" is a case study I participated in that has been translated to Japanese. [hint: look for the quotes]

The original case study can be found here.

BizTalk SOA and Business Process - Day (0)

First off, let me say how beautiful Bellevue is. It's a gorgeous town, with great shopping, dining, and bars. I'll definitely have to drag Sally along if I ever make it back!

There were many personalities here that I've met in the past: Doug Girard, Kris Horrocks, and Scott Colestock (just to drop a few names). Both Doug and Kris are part of the Connected Systems Division, which now includes the BizTalk Server product, and were part of a group, along with Eddie Churchill, that visited our facilities some time ago. Scott developed the deployment framework we use to deploy our BizTalk 2004 solution.

For the keynotes, we heard from Oliver Sharp (General Manager, BizTalk Server), Janelle Hill (VP, Business IT of Research Division - Gartner) and Steve Guggenheimer (General Manager, Application Platform Marketing). We also heard from Ed Barrie (Sr. Treasury Product Manager) discussing a case study of using BizTalk Server 2006 and its BPM features for the treasury implementation at Microsoft.

The keynotes were fairly repetitive of material presented as part of TechEd 2006, but it doesn't hurt to hear it again. A fairly significant message from the keynotes was the need for Business Process Management (BPM), along with two suggested approaches to it:


A) Wait for the domain offerings to mature.
B) Identify existing skillsets within the organization and find a complementary technology solution to match. Make a 'tactical' (2-year) decision to 'roll your own' in as vanilla a way as possible, limiting the scope of custom code.

There was also a warning about relying on a vendor's (Oracle/SAP) ability to provide an explicit BPM solution, largely due to vendors' inability to provide a blanket solution for every industry/market.

An Integration Competency Center (ICC) plays an important role in an organization's BPM. A competency center is always a good idea when there is a requirement to share resources/information across an organization. With eBiz, for example, an ICC would have established the standards for integration so that there would have been less guesswork when it came time for them to actually integrate. This adds value to the organization by reusing experience and knowledge when defining integration patterns and operational strategies across the enterprise. The ICC would also provide the standards to partners.

There are a couple of areas to consider for the future:

  • BizTalk Accelerator for SWIFT
    • SWIFT is a standards body for banking transactions and includes:
      • Banking solutions for payments and cash management
      • Capital market solutions for trading, treasury, custody and corporate actions
      • Corporate solutions for payments (including supply chain transactions), receipts, cash management, treasury
      • We know that we've had a couple of requirements for integration with banking systems that might be fulfilled by adopting a standard for integration with financial institutions.
  • Industry standard for representing supply chain data; e.g. RosettaNet

Saw Richard Orr, who I believe is an ex-Manhattan Associates employee.

Had two great breakout discussions with peers attending the Day-0 event. These discussions covered deployment, configuration, management, and monitoring of BizTalk solutions, and allowed us to provide some significant input to the BizTalk Server team.

The biggest pain points across the groups were:
  1. Configuration management across multiple environments, e.g. bindings. BizTalk 2006 should mitigate some of the pain we feel today, but it could be easier.
  2. Versioning; e.g. dependency management for schemas, components, orchestrations.
  3. Monitoring business process impact on performance; e.g. finding correlations between executing business processes and system performance.
  4. Sourcing event data through a common API to a central store, with a central portal offering views of the data based on role. Much can be accomplished today through BAM; however, BAM introduces many different complexities.

Spoke with many different people and no longer believe Mary Kay is alone in its implementation. Many attendees represented organizations with installations much larger than ours. Also, many attendees have business processes (orchestrations) that are truly long running, 40+ days in one instance.

That's the brain dump for now...the regular conference starts tomorrow.