In my role as a solution architect, I often get pulled into meetings to discuss “the state of automation” or to help provide estimates for automating existing manual processes. The business case for automation in Information Technology is pretty clear-cut, or so it would seem. Business cases can be made a few different ways, but typically they are built along the following dimensions:
- Cost reduction – which inevitably means a reduction in headcount or a redeployment of people to more strategic areas of the business.
- Increased revenue and scale – e.g. take more orders, support more customers, activate more services or devices, etc.
- Customer satisfaction – an often overlooked key performance indicator. Happy customers are repeat customers, and social media gives them a platform to share their experience with a company’s brand, driving or diminishing demand.
When analyzing an opportunity for automation, my first step is to observe the existing process. I ask questions such as:
- What key performance indicators does management look at, at what frequency, and why?
- What volumes does the current process support (e.g. calls, orders, payments, etc.), and what will the business require in 6, 12, and 18 months?
- Is the existing process well understood by the user community, and is it documented?
- How are users typically trained on the process and their role?
- What are the information needs of the users to execute effectively?
- How are exceptions to the process triaged and managed?
The answers to these questions “set the stage” and provide key insights; in later stages they will supply the data to substantiate the business case. The next step is to understand the IT systems and infrastructure currently employed. Whether existing systems can be reused or new ones are required will drive the cost side of the business case in terms of IT spend.
Now for the hidden and often unconsidered adversary of automation – THE STATE of DATA STANDARDIZATION. Inevitably, unless you are in a startup or Minimum Viable Product (MVP) situation, there will be existing, yet-to-be-integrated supporting IT systems. In larger companies these systems may have come into being via greenfield build-outs, implementations of commercial off-the-shelf (COTS) products, and/or systems inherited through the acquisition of departments or entire companies. Architects love to debate the presence of a canonical model – i.e. canon, from Old English, French, or Latin for “church law.” From Wikipedia:
> A canonical model is a design pattern used to communicate between different data formats. A form of enterprise application integration, it is intended to reduce costs and standardize on agreed data definitions associated with integrating business systems. A canonical model is any model that is canonical in nature, i.e. a model which is in the simplest form possible based on a standard, enterprise application integration (EAI) solution.
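To make the pattern concrete, here is a minimal sketch – system names, record fields, and adapter functions are all hypothetical – of two source systems’ customer records being mapped into one agreed canonical shape, so that downstream automation only ever deals with a single format:

```python
# Hypothetical example: a billing system and a CRM represent the same
# customer differently; a canonical model defines one agreed shape,
# and each system gets a single adapter into it.

def from_billing(record):
    """Adapter: billing-system record -> canonical customer."""
    return {
        "customer_id": record["CUST_NO"],
        "full_name": record["NAME"],
        "region": record["RGN_CD"],
    }

def from_crm(record):
    """Adapter: CRM record -> canonical customer."""
    return {
        "customer_id": record["id"],
        "full_name": f'{record["first"]} {record["last"]}',
        "region": record["territory"],
    }

billing_rec = {"CUST_NO": "10042", "NAME": "Ada Lovelace", "RGN_CD": "EMEA"}
crm_rec = {"id": "10042", "first": "Ada", "last": "Lovelace", "territory": "EMEA"}

# Both adapters converge on the same canonical record.
assert from_billing(billing_rec) == from_crm(crm_rec)
```

The payoff is in the integration math: with N systems, point-to-point translation needs on the order of N × (N − 1) mappings, while a canonical model needs only one adapter per system.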
Yes, to architects, the existence of and conformance to a canonical model is a religious debate. Personally, I see these debates as a waste of time. I do take the state of data standardization seriously, though, because the lack thereof can quickly erode the profit margins of a given project at best, and at worst it can completely compromise the business case. Think about it this way: the more dimensions that have to be uniquely considered (i.e. to whom, how, for how much, where, when, and for how long, …):
- Customer segments
- Unique product instances, pseudo products or equivalents, product lines, product packages, add-ons or upsell products, …
- Geographies offered (which often drive regionalized taxes, fees, tariffs, etc.)
- Price points, promotions (packages * price), discounts
- Sales channels
- Legal terms & conditions and temporal aspects
– the more complex the business logic that will have to be employed. These decisions drive the level of effort required for the initial build-out and the required testing. Often overlooked is the ongoing impact on an organization’s ability to launch new market offers in an agile manner. Don’t underestimate the impact to existing offers and the regression testing required to ensure that something existing is not compromised or broken as a result of launching something new. Oh, and then there’s the frustration created for a long-time customer who doesn’t qualify for a “new” promotion.
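A rough back-of-the-envelope sketch of why this matters – the dimension counts below are made up purely for illustration – shows how even modest numbers per dimension multiply into a large space of cases that business logic and regression tests may have to cover:

```python
from math import prod

# Hypothetical dimension counts for a market offer (illustrative only).
dimensions = {
    "customer_segments": 4,
    "products": 25,
    "geographies": 10,
    "price_points_promotions": 6,
    "sales_channels": 3,
}

# Worst case, every unique combination is a case the business logic
# (and the regression test suite) may have to handle.
combinations = prod(dimensions.values())
print(combinations)  # 4 * 25 * 10 * 6 * 3 = 18000

# Trimming even one dimension shrinks the space multiplicatively:
trimmed = dict(dimensions, sales_channels=1)
print(prod(trimmed.values()))  # 6000
```

The growth is multiplicative, not additive – which is exactly why adding “just one more” segment, geography, or channel quietly balloons the testing and maintenance burden.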
So what’s the moral of the story? Apply the KISS principle – yes, to the extent possible, Keep It Simple, Stupid. Scrutinize your dimensions, and when adding on, always attempt to minimize them. Keep market offers simple. If you can’t explain your offer on a billboard that a potential customer can read and understand as they drive by, redesign your go-to-market strategy. There’s a real cost to complexity and to a lack of data standardization.
What are your thoughts on this topic?