The Blame Game, Robot Edition

10-second speed read

  • Blame is…you
  • Automation means money
  • RPA is not for everything
  • Brainstorming Culprit ex Machina
  • The accountability concept and no fatalities

I was talking to an acquaintance about an IT system installation project, and she identified one of the most important elements of the process as being that no one got blamed for delays. As system implementations go, this one had taken longer than planned and cost much more. Did the consultants take the aviation “no blame” concept and adapt it to an IT implementation because of the sector in question?

It seems they kept the bigger picture in view (ending up with a tool rather than without one). “Blaming…is more than just a process of allocating fault. It is often a …[means]…of shaming others and searching for something wrong with them…[providing]…an early and artificial solution to a complex problem….a simplistic view of a complex reality: I know what the problem is, and you’re it.”

The “no blame” concept did originate in high reliability organizations, where a high survival rate is essential to the company’s very existence; one such organization was the subject of our discussion. However, it is not easy to implement even in most such environments – just review the search results for “implementing a no-blame culture in medical services.”

One indirect way to remove blame is to replace humans with automation – it’s hard to argue with an algorithm that has no instructions on how to answer you. As aviation-centered high reliability organisations demonstrate, robotisation delivers results:

[Figure: automation makes money. Source: Flintsch, 2000]

Removing humans does not have to be a “silver bullet”, though. The American engineer W. Edwards Deming argued that 95% of the variation in the performance of a system (organization) is caused by the processes within it, and only 5% by the people working in it. As autopilots and autothrottles show, for the time being robots can take over some mundane tasks: “RPA works best when application interfaces are static, processes don’t change, and data formats also remain stable – a combination that is increasingly rare.” In fact, only about one in twelve implementation managers is redesigning processes to be managed by those same robots (in another discussion, this time on robotic process automation, an acquaintance suggested it is only appropriate in 30 to 40% of cases).
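To make the quoted stability criteria a little more concrete, here is a minimal sketch of how one might screen processes for RPA candidacy. The ProcessProfile fields, the sample processes, and the “all three criteria must hold” rule are illustrative assumptions of mine, not any vendor’s methodology:

```python
# Illustrative sketch only: a toy screen for RPA candidacy based on the three
# stability criteria quoted above (static interfaces, unchanging processes,
# stable data formats). Fields, sample processes and the all-three rule are
# assumptions for illustration.
from dataclasses import dataclass

@dataclass
class ProcessProfile:
    name: str
    interface_is_static: bool      # screens/APIs the bot drives do not change
    process_is_stable: bool        # steps and business rules stay fixed
    data_format_is_stable: bool    # inputs arrive in a predictable layout

def is_rpa_candidate(p: ProcessProfile) -> bool:
    """A process is a reasonable RPA candidate only if all three criteria hold."""
    return p.interface_is_static and p.process_is_stable and p.data_format_is_stable

processes = [
    ProcessProfile("invoice data entry", True, True, True),
    ProcessProfile("ad-hoc customer email triage", False, False, False),
    ProcessProfile("monthly flight-hours report", True, True, False),
]

candidates = [p.name for p in processes if is_rpa_candidate(p)]
print(f"{len(candidates)}/{len(processes)} suitable: {candidates}")
# On this invented sample roughly a third qualify - at least consistent with
# the 30 to 40% figure mentioned above.
```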

While “no blame” is not used much even during the IT implementations that lowered the fatality rate, a fashionable alternative concept has emerged – “accountability”: “viewing…[errors and shortfalls]…as opportunities for learning and growth.” One of the first areas where this can be applied is in programming the robots (just ask any engineer in charge of software loadable parts). There are still ways to find a culprit ex machina:

  • Programming bias – “the machine we build reflects how we see the world…consciously or not”
  • You get what the data gives you – your training sample will teach only what it has, not more (see the sketch after this list)
  • Following on from that, once the automation/robot has made a decision, whose responsibility are the results of that choice? Drones, for example, require a human commander – for now.
  • In that vein, will robots have to get together and invent a set of governing laws for themselves?
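As a minimal sketch of the “you get what the data gives you” point: train even a trivial most-frequent-label model on a deliberately skewed sample and it will confidently repeat that skew on every new case. The incident labels and counts below are invented purely for illustration:

```python
# Toy illustration of "your training sample will teach only what it has":
# a trivial most-frequent-label "model" trained on a skewed, invented sample.
from collections import Counter

# 95 reports blame the pilot, 5 blame the system - an invented, biased sample.
training_labels = ["pilot error"] * 95 + ["system fault"] * 5

def train_majority_model(labels):
    """Return a 'model' that always predicts the most common training label."""
    most_common_label, _ = Counter(labels).most_common(1)[0]
    return lambda new_case: most_common_label

model = train_majority_model(training_labels)
print(model("unfamiliar new incident"))  # -> "pilot error", regardless of the facts
```

However sophisticated the real model, the principle is the same: the verdicts it hands out can only come from whatever its training data already contained.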

Will using robots be the ultimate no-blame implementation because it will lead to a 0% fatality rate?

FLIGHT OPERATIONS CHALLENGE…

accepted.