Why you shouldn’t fall for the tech 4.0 buzzword mania

By Risk Performance and Technology group

Originally published on Finance et Investissement – https://www.finance-investissement.com/

AI, blockchain, IoT, IoE, 4.0, agility, the “cloud” and Big Data. All these things are so 2017. In 2018, our new catchphrases are microservice architecture, self-service analytics, quantum computing, serverless architecture, dark data, mobile-first… and the list goes on.

Most of them are probably on your own list. Whenever you are called on to pitch a project, I’ll bet you not only structure it just right but also throw in a few of these key terms to make sure you garner the support of sponsors within your company.

It may sound like I’m against these new technologies, but I’m not… quite the contrary!

However, it is often worth taking the time to ask whether a single tree is hiding the forest (or rather, the Amazon rainforest to the power of 10).

My point is that when a breakthrough technology or a new technological concept finds its way into the media (I’m not saying “is born” or “comes into being,” because today’s free-access cloud architecture has existed in theory since the 1970s), the technology in question becomes THE new revolution that suddenly needs to be discussed, funded and implemented.

More and more these days, I’ve been noticing a glaring lack of critical thinking, of anyone asking “Is this really necessary? Is the added value real? Are we mature enough to make full use of it? Can we backtrack if there’s a major problem?” In fact, such reflexive questioning is an integral component of risk management.

There’s a reason these technologies are built into organizations: they sell! “It looks modern, innovative… the leading edge!” I literally roll my eyes when I read stuff like that.

Must our business processes systematically undergo a makeover every time a new technology comes along, just to stay efficient?

Am I against progress? Of course not. But I am against passing fads. My personal analogy is Facebook™, which has been described as a societal paradigm shift. Be that as it may, it doesn’t mean that society as a whole has agreed to an evolution that serves a purely private interest and treats its users as the product. Otherwise, Mr. Zuckerberg would not have been asked to testify before the U.S. Congress. In this particular case, it’s not progress; it’s a trend effect whose purpose is to skew outcomes towards a target.

My analogy doesn’t end there. The other major risk of all these buzzwords lies in the complexity of the underlying technologies and their capacity for self-regulation. Let me be clear: despite Isaac Asimov’s three laws of robotics, artificial intelligence is known to have intrinsic conceptual vulnerabilities which, if not well understood, could have quasi-exponential and even potentially decisive repercussions on the very existence of the natural or legal person making use of it. Artificial intelligence is all well and good, but it is still in its adolescence, and its decision-making can easily be blurred or skewed.

You may say “No big deal. We’ll fix it as we go along.” Sure, that’s what we always do!

But my point is that our process for adopting and integrating these new technologies has become too immediate, systematic, global, overwhelming and intrusive for our natural maturation process to keep pace. After all, we are still human. We are changing faster than ever, there’s no denying it, but not fast enough for our own learning curve to catch up with and absorb these buzzwords.

I mentioned self-regulation earlier, so what does that mean exactly?

Well, it happens to be the meeting point between the maturation curve (trial, error, correction, i.e. a simplified Deming cycle) and the general uptake curve of our critical processes. Examples of these processes are driving, bank transactions, contractual transactions, personal identification, health monitoring, societal collaboration and communication, and maintaining confidentiality and privacy.

Normally, I would list some clear, simple solutions. In this case, unfortunately, I can merely insist on the importance of keeping your cool, your sense of critical appraisal and some notions of rationalization, continuity and conservatism.

When the machine goes off the rails and everyone is rushing headlong at a problem, let’s take the time to clearly understand the complexity of these technologies. After all, though very complex, they are innovative and decidedly promising. We need to exercise due diligence by asking ourselves: “If one day these technologies are no longer of any use, will we readily be able to stop using them? What would that involve?”

Shouldn’t we wait for a clear answer to these two essential questions before we phase breakthrough technologies deeply into our life cycles?