Creating a plug ‘n’ play enterprise

In the last 12 months we’ve seen a big push from the new media marketers and enterprise social vendors towards promoting both the connected enterprise and the connected customer.

Mobile and social paradigms are shaping the industry landscape at a phenomenal pace, and businesses are seriously struggling to keep up. But there’s a massive disconnect between it all, a vital missing component in the conceptual stack of ideas that could actually be bigger than all the purported game changers.

Looking at industry ‘shapers’:

  • Cloud
  • Mobile
  • Social
  • Big and In-Memory Data
  • Internet of Things/ Everything/ Something
  • BYOD
  • BYOP (Bring/ Build Your Own Process)

In April last year, Forrester wrote about a convergence of Big Data, Cloud and BPM and called it Big Process. That’s all fine in theory, and arguably a buzzword created to take on Gartner’s iBPMS stance on the enterprise industry, but what all of these concepts lack is cohesion over and above convergence.

On their own they are powerful propositions, from business to consumer, but they lack cohesion and the ability to work together. There is little interoperability, no single defining protocol that both hardware and software vendors have signed up to that would ease a lot of IT pain and failed business expectation. So what’s the big problem here, and why can’t we learn from other industries?

Home is where the heart is

Take a look at the digital home market for example.

The Digital Living Network Alliance (DLNA) was founded in 2003 by a collection of global companies with a vision to easily connect and enjoy photos, music and video amongst networked consumer electronics, PC and mobile devices.

DLNA has certified more than 15,000 device models from the world’s leading manufacturers, including: TVs, storage devices, mobile phones, software, cameras, printers, game consoles, PCs, photo frames, media adapters, set-top boxes, AV receivers, Blu-ray disc players, tablet computers and many other products. DLNA Certified® devices take the guesswork out of selecting products that work together and facilitate better, easier sharing of digital video, photos and music throughout the home.

In order to achieve the vision of a digitally connected home, DLNA published industry design guidelines that allow OEMs to be involved in a networked device market, leading to “more innovation, simplicity and value for consumers.” According to the Alliance, this ultimately meant that industry collaboration and standards-based interoperability produced compelling products.

Building compelling enterprise products

So you have to question why this can’t be achieved at an enterprise scale as well. If a protocol and set of guidelines already exist for the consumer market, and there is a big push towards BYOD (Bring Your Own Device) at work, then the potential already exists to leverage interoperability between these digitally connected devices within the enterprise, and to build upon, extend and innovate those guidelines to include device and data compliance in the organization.

When IT tells the business they have this box called an ESB that talks to lots of other boxes to make things work, it’s a stretched truth, because there is very little natural interoperability between those ‘boxes’. With so many ‘standards’ and messaging protocols in play, is it any wonder that large software platform vendors are really just about connectivity, with workflow built on top to please the business?

Plug me in

And so to the true plug and play enterprise. In order to create a PnP vision, hardware, software and service providers need to form another digital alliance and come up with a set of guidelines and protocols that literally allow businesses, organizational resources and consumers to plug in and play without heavy integration costs. If something doesn’t work, swap it out for something else. If a process doesn’t work, allow consumers or internal resources to bring their own processes that work, on their own devices that they feel comfortable engaging with.
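
In software terms, that vision amounts to vendors coding against a shared contract rather than against each other. A toy sketch in Python (every name here is invented purely to illustrate the idea, not a real standard):

    from typing import Protocol

    class StorageService(Protocol):
        """A hypothetical contract that any vendor's component agrees to honor."""
        def put(self, key: str, data: bytes) -> None: ...
        def get(self, key: str) -> bytes: ...

    class VendorAStorage:
        """One vendor's implementation; a VendorBStorage could replace it wholesale."""
        def __init__(self):
            self._store = {}
        def put(self, key, data):
            self._store[key] = data
        def get(self, key):
            return self._store[key]

    # Business code depends only on the contract, not the vendor, so swapping
    # components is a one-line change where the component is constructed.
    def archive_invoice(storage: StorageService, invoice_id: str, pdf: bytes):
        storage.put("invoices/" + invoice_id, pdf)

    archive_invoice(VendorAStorage(), "INV-001", b"%PDF-...")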

The Cloud has such promise and yet it’s really just another clunky service that requires a lot of data compliance and integration.

Big Data wants to show off its plumage, promising predictive analytics that can determine consumer behavior and trends before they happen, at an extremely targeted level. And yet, like a leaky tap, all the complex plumbing required to piece the systems together will eventually lead to meaningful data being lost.

All these trends are being designed and delivered in isolation, and nobody is really thinking about bringing it all together in a way similar to DLNA, one that will, according to the alliance, lead to more innovation, simplicity and value for consumers (in this case, the client).

Don’t talk big about a connected future. Build one.

And for that to happen you might need to connect with the people you least expected to. Your competitors.


Responses to “Creating a plug ‘n’ play enterprise”

  1. January 22, 2013 at 8:12 pm

    Very thoughtful piece, Theo. There is certainly an enormous amount of fragmentation happening now, with on-premise under pressure and cloud facing plenty of questions about performance and integration. Something has to give.

  2. January 22, 2013 at 10:19 pm

    Well, the healthcare industry finally woke up to the need for interoperability.

    They did not do it on their own, it was basically mandated as part of federal legislation called “Meaningful Use”.

    Some positive outcomes are already appearing and hold real promise for reducing the cost of healthcare services delivery (e.g. reducing the need for duplicative testing) as well as improving the quality of care (e.g. reduction of medical errors).

    Civerex has one client in CA that is building an e-clinical Hub where members of a data exchange can receive up-to-the-minute consolidations of clinic, hospital and lab test result information simply by looking up a patient at the e-hub and clicking on a download button.

    The interesting thing about this implementation is the range of different EMRs that the participating members use and the number of “standard” data transport protocols that are in use.

    The challenge is making all of the interconnections seamless.

    • January 23, 2013 at 6:41 am

      Karl, that is a very interesting post and I’m glad to see there are some positives coming from “Meaningful Use”. Most of what we hear in the general media isn’t very positive about those mandates. I’m hopeful more systems are able to adapt like Civerex and build systems for their members.

      What do you feel the implications are between systems, though? Inevitably, a member will change systems or provider networks due to insurance or an emergent visit. How will networks provide inter-network access to data or are these systems just building larger “stove pipes”?

      It used to be that one facility’s system wouldn’t talk to another facility’s. Now they are solving that to make sure all facilities in the same network can talk, which is great. But are we just moving the problem up in scale? There is still no universal standard for sharing medical data, is there?

      • January 23, 2013 at 9:01 am

        It’s all very complicated and I could go on for days here.

        What not to do is get everyone on ONE rigid system where all of the database fields are aligned (same names, same data types, same size).

        The real-life scenario, the one we have been working with for several years, is one where each publisher/subscriber uses their own native data element naming conventions, and you set up, for each subscriber, a one-time name map from the publisher’s data element name to the subscriber’s name for that same data element. i.e. I ship abc=”John Doe”, you read it as def=”John Doe”, a second subscriber reads it as ghi=”John Doe”.
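
        A minimal sketch of that one-time name map in Python (the field names and subscriber identifiers are invented for illustration):

            # Hypothetical one-time name maps: publisher field name -> subscriber field name.
            # Each subscriber registers its map once; after that, translation is mechanical.
            SUBSCRIBER_NAME_MAPS = {
                "subscriber_a": {"abc": "def"},  # subscriber A calls the patient-name field "def"
                "subscriber_b": {"abc": "ghi"},  # subscriber B calls the same field "ghi"
            }

            def translate(record, subscriber):
                """Rename the publisher's native field names to a subscriber's own names."""
                name_map = SUBSCRIBER_NAME_MAPS[subscriber]
                return {name_map.get(field, field): value for field, value in record.items()}

            published = {"abc": "John Doe"}
            print(translate(published, "subscriber_a"))  # {'def': 'John Doe'}
            print(translate(published, "subscriber_b"))  # {'ghi': 'John Doe'}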

        Next, the data has to be formatted such that the subscriber can read/import it.

        This gets tedious, so we went a step further where we now ship a “bucket” comprising (for healthcare) a combination of CCDs (continuity of care documents), HL7 messages, XML files, doc files, PDFs, spreadsheets, images, even videos, all in one encapsulated, protected file.

        The presumption for the new way is that a subscriber will be able to read all of the items in the “bucket”. We found some cannot read HL7 even though they have MU software, so we now parse and format HL7 to HTML and ship both.
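
        As a rough sketch of the “bucket” idea (the container layout, file names and manifest below are assumptions for illustration; the encapsulation/protection of the file is omitted):

            import json
            import zipfile

            def build_bucket(path, items):
                """Pack mixed-format items (CCD, HL7, PDF, images, ...) into one
                container file, with a manifest listing what's inside."""
                with zipfile.ZipFile(path, "w") as bucket:
                    manifest = {name: len(data) for name, data in items.items()}
                    bucket.writestr("manifest.json", json.dumps(manifest, indent=2))
                    for name, data in items.items():
                        bucket.writestr(name, data)

            # Ship the raw HL7 plus an HTML rendering side by side, since some
            # subscribers cannot parse HL7 directly.
            items = {
                "results.hl7": b"MSH|^~\\&|LAB|CLINIC|...",
                "results.html": b"<html><body><pre>...rendered HL7...</pre></body></html>",
                "summary.ccd.xml": b"<ClinicalDocument>...</ClinicalDocument>",
            }
            build_bucket("patient_bucket.zip", items)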

  3. January 23, 2013 at 6:36 am

    Theo, you probably know what my two cents will cover for this topic. A standard framework for business processes.

    This was the very reason we created the APQC Process Classification Framework (PCF). We would take one organization to benchmark against, or learn from, another organization, and there wasn’t a common language or standard for business processes.

    One organization called it a “call center”, another organization called it a “contact center” that handled more than just calls, and yet another called it “customer service”.

    We developed the PCF as an open standard that any organization could use to better understand business processes. It takes the approach of outlining process categories, process groups, processes, and activities. We want it to be adopted broadly, and it is our most downloaded tool.
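
    To make those four levels concrete, here is a toy fragment in Python (the entries are made up for illustration, not actual PCF content):

        # Category -> process group -> process -> activities.
        pcf_fragment = {
            "Manage Customer Service": {                        # process category
                "Plan and manage customer service contacts": {  # process group
                    "Manage customer service requests": [       # process
                        "Receive customer requests",            # activities
                        "Route customer requests",
                        "Resolve and close customer requests",
                    ],
                },
            },
        }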

    It is not perfect, but we take feedback and update it on a regular schedule. The organizations that use it report a lot more clarity on “getting everyone on the same page” regarding executing work within a given area.

    To your point, though, there is a great deal of work and effort required to develop a standard for how a process should work, but our hope was to create a tool that is a starting point for organizations. And trust me, some look at parts of the PCF and realize it won’t work for them, and they BYOP for that area. But at least they know how their new BYOP fits into the other organizational processes.

    • Maureen Gervais
      January 28, 2013 at 5:23 am

      Ron, I am in total agreement with the need for a standard framework, be it the APQC Framework or another tool — granted I am partial to the APQC Framework.

      The number of solutions being provided today is mind-boggling. Many of the solutions I have reviewed are ‘bigger and faster’ and entice the end customer with a plethora of data to be stored on-premise or in the cloud for future analysis. While ‘bigger and faster’ is alluring, it is not always what an enterprise needs to grow its processes and become more efficient and effective.

      For me, before a solution is considered, a framework should be put in place. I say this because if an organization lacks a basic understanding of its processes, then whether it is using pen and paper or the newest solution out there, it will be a case of garbage in, garbage out. An organization needs to determine how it will categorize its processes (which framework), identify metrics and KPIs, and be able to answer the question “Why are we undertaking this work?” With a framework properly implemented, an organization will have reviewed and validated its processes, identified the related metrics/KPIs and can answer the question of why this work is being conducted. If any one of these key efforts has not been completed, then what will the organization do with the plethora of data?

      To be successful, an organization needs to have a strategy that can be effectively communicated via a robust change management plan. It needs to ensure that employees have a solid understanding of the importance of the process work effort and their criticality in the process. If the change management plan is lacking, the organization will likely fail to obtain the support it needs to grow its processes, and hence its effectiveness, efficiency and customer satisfaction.

      I am not saying that you cannot begin the process journey with a systems solution – that is always an option. However, one should keep in mind that the change management around identifying, categorizing, and reviewing the current processes is a monumental task, hence the reason why a framework is so beneficial. Though the work still needs to be completed, you have a framework to work from. The change management plan will need to include a plan for dealing with people’s fears of job reduction and the discomfort most people have with both ‘process’ and ‘change’, let alone “changing a process” that they may have been doing for ten, fifteen, or twenty-plus years. If one couples that with the implementation of a new system solution, you have, at the very least, doubled your change management requirements.

      The old saying ‘sometimes less is more’ is still relevant and can be the correct approach – it all depends on where your organization is on its process journey. If nothing else, this is a very interesting age to be in process work.
