Wednesday 9 May 2012

Simple Service Enterprise - part 4


Today we'll take a REST from REST, and I'll touch upon one of the issues I ran into today: the two main types of data there are. REST assured, however, that at least a few of the next posts will be about yesterday's topic, as it has led to fierce debates here and there over the course of the day. Yes, puns intended.

There are two main types of data: Master Data and Transactional Data, and both have very different CRUD models, requirements and needs.

Master Data is what you could call slow-moving data. It does change, but over a very long period of time. Think of parts or people: building blocks of a larger product. It is depicted above as the thick, organic line slowly making its way up from the data layer.
Transactional Data is fast-moving; to help you visualise that, it shouldn't even hit the ground. It is depicted as the thin red line shooting up and out.

The most apt example of Master Data is payroll data: employees, people, with their flat structures and simple relations, static and predictable, changing only at certain points in time: salary changes, promotions, organisational changes, save the odd marriage, divorce and moving house of course.
The best example of Transactional Data is supply chain data: orders and inventory statuses, with their multi-level and deeply nested complex structures, continuously changing until the very last cut-off moment.
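To make that structural difference concrete, here's a minimal sketch in Python; the field names and the nesting are illustrative assumptions, not any particular system's schema.

    # Master Data: a flat record with simple relations (illustrative fields)
    employee = {
        "id": "E-1001",
        "name": "J. Doe",
        "salary": 3200.00,
        "department_id": "D-42",   # a simple foreign-key style relation
    }

    # Transactional Data: multi-level, deeply nested (illustrative fields)
    order = {
        "id": "O-2012-0509",
        "status": "open",          # keeps changing until the cut-off moment
        "lines": [
            {"item_id": "I-7",
             "quantity": 10,
             "shipments": [{"warehouse": "W-3", "quantity": 6},
                           {"warehouse": "W-9", "quantity": 4}]},
        ],
    }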

You could even divide the two into batch and event-driven / online processing. For payroll data there's a monthly cut-off time; for transactional data it's daily, or even a few times a day.
That also means that the level of automation required is much higher for transactional data than it is for master data.
On top of that, payroll data probably brings images of people-processes to mind as well (being harassed by secretaries to fill in or fix your timesheets), where transactional data doesn't.
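A minimal sketch of that divide, assuming a hypothetical monthly payroll run and a hypothetical order event handler; none of this is a real system's API.

    from datetime import date

    # Batch: Master Data is processed per cut-off, e.g. one monthly payroll run
    def monthly_payroll_run(pending_changes, cutoff):
        # everything collected since the previous cut-off is applied in one go
        applied = [c for c in pending_changes if c["effective"] <= cutoff]
        print(f"payroll run at {cutoff}: applied {len(applied)} change(s)")

    # Event-driven: Transactional Data is handled the moment it arrives
    def on_order_event(order):
        # no cut-off to wait for: validate and forward immediately
        print(f"forwarding order {order['id']} right away")

    monthly_payroll_run([{"effective": date(2012, 4, 30)}], date(2012, 4, 30))
    on_order_event({"id": "O-2012-0509"})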

But in terms of read and write, or simply I/O, the difference generally comes down to this: Master Data is written once, read many times, and rewritten and read many times after that. Transactional Data is written once, read once, and that's pretty much it.
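One practical consequence, sketched under the assumption of a hypothetical fetch_from_database helper: read-many Master Data rewards a cache, while read-once Transactional Data makes one pointless.

    import functools

    def fetch_from_database(table, key):
        # stand-in for a real query; illustrative only
        return {"table": table, "key": key}

    # Master Data: written once, read many times -> caching pays off
    @functools.lru_cache(maxsize=1024)
    def get_employee(employee_id):
        return fetch_from_database("employee", employee_id)

    # Transactional Data: written once, read once -> a cache would only hold
    # entries nobody ever asks for again, so read it straight through
    def get_order(order_id):
        return fetch_from_database("order", order_id)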

Master Data is the basis of a company: it forms the building blocks that allow you to do business, i.e. construct Transactional Data. It is the single largest cause of failure for B2B messages: orders that get sent out while containing items that haven't been "propagated" to the various suppliers yet.
Again, it's the people-process there on top of the Item-Master interface that keeps it from being synchronised in real time - not the technology.
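That failure mode is easy to sketch; assume a hypothetical set of item ids the supplier has already received through the Item-Master interface.

    def unpropagated_items(order, supplier_item_master):
        # items the supplier doesn't know yet will make the B2B message fail
        return [line["item_id"] for line in order["lines"]
                if line["item_id"] not in supplier_item_master]

    supplier_item_master = {"I-1", "I-2", "I-3"}          # hypothetical supplier side
    order = {"id": "O-1", "lines": [{"item_id": "I-7"}]}  # I-7 was never propagated

    missing = unpropagated_items(order, supplier_item_master)
    if missing:
        print(f"order {order['id']} would fail at the supplier: {missing}")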

These are the two main types of data. Of course there are shades of gray to both of them.
Employee data is thick, but organisational data is even thicker. Trying to automate that by building interfaces for it is ridiculous. But believe me, I know customers who insisted on doing so - so much for business cases in companies where money means nothing.
Supply chain orders are thin, but auction or stock data is even thinner. Buying, selling and delivering sometimes all happen within less than a second.
Whereas automating CRUD for e.g. organisational changes is useless because the frequency is too low and the requirements are never fixed, auction and stock markets have a frequency that is too high to even attempt CRUD.

This world is one of many shades of gray. It always has been, and it always will be. It lies within our nature to swing from one end of the scale to the other (outer) end, and like a dying metronome slowly swing back to a little bit before the other end, and so on, and so forth, until ending in the middle: aurea mediocritas!

Remember, though, that replacing a bad architecture or implementation (e.g. SOAP) with a slightly less bad or even better one (e.g. REST) doesn't mean that you are doing your best, or even achieving your customer's best - nor does it honour the fact that you can't have a one-size-fits-all solution for a many-shades-of-gray challenge.
