Of late, more questions have been coming out into the open regarding the effectiveness of RPA. It appears that the magic bullet which promised to empty back offices and robotize everything has misfired. A recent write-up, ‘RPA is Dead’ (published by HfS), along with a number of profiled ‘failures’ associated with its deployment, is making waves, while service providers, Shared Services, Global In-house Centers (GICs), Global Capability Centers (GCCs), Global Business Services/Solutions (GBS) and vendors all try to come to terms with reality. The industry is steadily waking up to the realization that the promise of RPA replacing human workers has been dented, if not undone.
I have been fortunate enough to watch RPA play out in my professional work, and one question keeps coming back to me: has the absence of standards hurt RPA? There is plenty of published material on RPA deployment and on case studies of its implementation. However, there is near-total silence when it comes to standards pertaining to RPA. [Editor’s note: see footnote regarding recent publication of Intelligent Process Automation Guide to Terms and Concepts, as well as Standards for the Assessment of Robotic and Intelligent Process Automation.]
Currently, the RPA landscape is quite muddled across its entire lifecycle. There are multiple models floating around:
- enlisting an RPA vendor, or outsourcing the complete deployment to one;
- building an in-house team and developing capability;
- partnering an in-house team with RPA vendors;
- building in-house platforms mirroring RPA solutions;
- bundling as an as-a-service offering;
- hiring consulting firms; etc.
Implementation approaches are just as dispersed:
- Lean Six Sigma;
- leveraging a framework offered by vendors;
- developing and using an in-house framework;
- adopting bits and pieces of the Software Development Life Cycle;
- establishing customized RPA Target Operating Models; etc.
Notably, there is no commonality in reporting lines, nor in how leaders and teams are aligned, across all these approaches. In fact, each phase of these approaches varies significantly from one organization or vendor to the next.
The roles of those who execute or are otherwise associated with RPA have become commonplace within the span of just a few years, which raises the question: can anyone and everyone become an RPA expert with only a bare minimum of understanding?
A growing crop of certifications offered by RPA vendors (both classroom and online) and by some organizations, each claiming to be top-notch, adds to the confusion. Other components such as coding, architecture, access management, infrastructure and service level agreements also vary between organizations and RPA vendors. With new RPA vendors entering the market and promising new features, the RPA lifecycle is likely to be challenged even further.
In this melee, the question that needs answering is whether standards would benefit RPA. Standards foster a common understanding of products and services. They deliver safety, quality and value for money to customers, while also promoting better governance.
For example, Leslie Norris, in The Pros and Cons of Sector-Specific Standards published by the American Society for Quality, notes that sector-specific quality standards are requirements developed by a particular industry to address its specific needs. Other software engineering and process standards, such as Capability Maturity Model Integration (CMMI) and ISO/IEC 15504 (known as Software Process Improvement and Capability Determination, or SPICE), document and establish common practices and training programs, and define the development process, thereby ironing out contradictions.
The same is true of well-known global industry standards in sectors such as food, automotive, aviation, supply chain management, accounting/finance and insurance.
Given the inherent contradictions in RPA today, it is difficult to arrive at a common understanding of tools, practices, training, products, lifecycle, frameworks, operating models, capability, user experience and so on. This makes it hard to compare the service offerings of RPA vendors and of the organizations that have deployed these solutions.
Today, there are no global standards to assess the performance, reusability, safety, governance, controls or pricing of bots. This has pushed up development costs, leading many organizations to scrutinize the cost-benefit ratio before signing up for RPA. The same equation applies to Shared Services Centers, GICs, GCCs, GBS and service providers that are investing, or have invested, in building in-house RPA teams.
In this context, the Institute of Electrical and Electronics Engineers (IEEE) has recently (see below) embarked on a path to establish standards for RPA. A larger, concerted effort would bring industry leaders, organizations and RPA vendors together on one platform for discussion. With the development of a global standard, transparency in costs, technology, quality, infrastructure and user experience is bound to increase, benefiting all players and customers. Until then, RPA may remain at a crossroads in cementing its credibility and business case in the industry.
Editor’s note:
As Amarpreet mentions, the IEEE has indeed recognized the need for standards in this fast-growing sector, and a working party has been engaged in developing both terminology and standards for the past two years.
In fact, the IEEE Standards Association has just published a Standard to guide the Assessment, Evaluation, Comparison and Selection of Robotic and Intelligent Process Automation Products and Features (IEEE 2755.1-2019 Guide for Taxonomy for Intelligent Process Automation Product Features and Functionality). This detailed guide objectively assesses the features and functionality of intelligent automation products, defines more than 140 product features and functions, details their importance, and provides guidance on the assessment process. All the functionality is grouped into six feature-set categories.
Leveraging the terminology established in IEEE 2755™-2017 – IEEE Guide for Terms and Concepts in Intelligent Process Automation – this standard was developed over a period of two years with direct contributions from Another Monday, Ascension, Automation Anywhere, Blue Prism, CognitiveScale, ISG, KPMG, NTT Data, Pega Systems, SSON Analytics, Symphony Ventures, Thoughtonomy, UL and Workfusion. [SSON is a non-working member of the working party.]
This is a highly relevant and much needed development, as Amarpreet points out. The IEEE Working Group on Standards in Intelligent Automation is headed by Lee Coulter, previously CEO of Ascension Shared Services, and Chief IA Officer and advisor to SSON.