If you are a fan of the American space television series The Mandalorian (season 2), then you’ve seen the ominous Dark Troopers: a droid legion that operates in full synchrony, generations beyond the human stormtroopers. When Alan Turing designed his cryptanalytic wonder, the Bombe, to crack the German Enigma in World War II, he devised a machine that could do far more with far less, running through candidate cipher settings faster than any room of human clerks. Later, Bletchley Park’s Colossus machines automated the cryptanalysis of the German Lorenz cipher, which the British codenamed Tunny.
From Alan Turing’s machines to the Dark Troopers, automation was conceived and continues to evolve. The essence of automation is recreating or replicating tasks according to a defined order or logic; today, and into the future, automation is being redefined as doing a whole lot more with a lot less, and doing it quickly.
We rarely realize how much automation is involved in our everyday living and working; speech recognition, for example, is a highly evolved form of pure automation. The majority of software and hardware systems are predicated on self-repeating steps.
The mechanical process of creating and recreating tasks rests on four key factors.
1. Speed: how fast the work gets done. As a defined process of tasks grows in complexity, the time needed to complete it grows as well. We rely on mathematical finesse to overcome latent speed, designing or choosing algorithms that solve the problem quickly and completely. Speed implies and contributes to efficiency, but it does not guarantee it.
2. Time: the total time it takes to complete a task, including its repetitions. For example, compute a random walk between two points, store the results, and rerun the process iteratively or as prescribed (a code sketch follows this list).
3. Computational Space: speed depends on the processing capacity a machine, or group of machines, can supply. This is particularly critical in advanced programming and learning, including the training of artificial intelligence models.
4. Algorithmic Design: the mathematical noggin of the machine, so to speak. Solving a problem, particularly a complex task, requires mathematical principles that govern how to interpret the data, parse it, calculate with it, and more.
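To ground these four factors, here is a minimal Python sketch of the random-walk example from factor 2; the function name, parameters, and the 10,000-step budget are illustrative assumptions, not from any particular library. Speed and time show up in the stopwatch, computational space in the stored paths, and algorithmic design in the decision to bound the walk with a step budget.

```python
import random
import time

def random_walk(start: int, end: int, max_steps: int = 10_000) -> list:
    """Take unit steps left or right from start until end is reached
    or the step budget runs out; return the full path."""
    position = start
    path = [position]
    for _ in range(max_steps):
        if position == end:
            break
        position += random.choice((-1, 1))
        path.append(position)
    return path

# Time: total wall-clock time for the whole iterative process.
t0 = time.perf_counter()
results = [random_walk(0, 10) for _ in range(100)]  # rerun as prescribed
elapsed = time.perf_counter() - t0

# Computational space: every stored path occupies memory.
longest = max(len(path) for path in results)
print(f"100 walks in {elapsed:.3f}s; longest stored path: {longest} steps")
```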
Every brilliant technological product, past and future, is built on these four ingredients.
From solution architects to security chiefs, those who assess technology analyze it from different perspectives, and new product ideas face the same scrutiny. For technologists working anywhere from database management to data centers, the question can come down to, “Why is this solution composed of so many services?” Services are essentially components. Like a Lego arrangement of various pieces and colors, how will it all make sense? And does it, given speed, time, computational space, and design, solve the problem?
Automation can be either an elaborate scheme of multiple components or a highly simplified one. Even when it meets every need well, one factor considered too late, or not considered at all, is error generation. Complexity results in error generation, and that perception is generally true. However, simplified arrangements may also lead to mistakes or deviations from the main objective, creating new problems beyond the one presumably solved. When resources are fully utilized and overworked, errors begin.
From factory floors to business processes to speech-to-text services, systemic automation will produce mistakes, and addressing them is often delayed. For example, a speech recognition engine may be designed to dump errors into a separate process for computational and human analysis, or the errors may be ignored altogether. The thinking is that learning will simply improve over time through more data ingestion, eliminating the error “eventually.”
Errors are not all created equal; they differ categorically. Some solutions classify errors as non-critical (not important enough to act on now) or highly critical. The delay between recognizing errors and handling them, so that they stop recurring or begin contributing constructively, is often considerable.
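That triage can be made concrete in a short sketch; every name here (Severity, TranscriptionError, ErrorBin) is hypothetical, not from an actual product. Critical errors interrupt processing immediately, while non-critical ones land in a bin that only helps if someone, or something, eventually reviews it.

```python
from dataclasses import dataclass, field
from enum import Enum

class Severity(Enum):
    NON_CRITICAL = "non_critical"  # deferred for later review
    CRITICAL = "critical"          # must interrupt the pipeline

@dataclass
class TranscriptionError:
    message: str
    severity: Severity

@dataclass
class ErrorBin:
    """Holding area for deferred errors; this is where the delay
    described above tends to accumulate."""
    deferred: list = field(default_factory=list)

    def triage(self, error: TranscriptionError) -> None:
        if error.severity is Severity.CRITICAL:
            raise RuntimeError(f"critical error: {error.message}")
        self.deferred.append(error)  # waits for human or machine review

error_bin = ErrorBin()
error_bin.triage(TranscriptionError("low-confidence word", Severity.NON_CRITICAL))
print(len(error_bin.deferred), "error(s) awaiting review")
```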
When deviations are recognized, the first human response is to do whatever it takes to fix them: spending more time and working harder to get it done sooner rather than later, which results in eventual overwork.
Human and machine systems can exhibit overwork in the following ways (a monitoring sketch follows the list):
1. Data processing delays
2. The phantom error returns
3. No one is checking the error bins
4. No one, and nothing, is processing the feedback
5. Clients complain or switch to someone else
6. System storage reaches its capacity quickly
7. System processing can no longer handle the typical request
8. Follow-through is decreasing while work requests increase
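Several of these symptoms are measurable long before a client complains. The sketch below shows one possible lightweight health check; the metric names and thresholds are made-up assumptions, not tuned values.

```python
def overwork_signals(metrics: dict) -> list:
    """Return the overwork symptoms a metrics snapshot exhibits."""
    checks = {
        "data processing is delayed":    metrics["queue_latency_s"] > 60,
        "error bin is going unreviewed": metrics["unreviewed_errors"] > 100,
        "storage is nearing capacity":   metrics["disk_used_pct"] > 90,
        "processing cannot keep up":     metrics["cpu_used_pct"] > 95,
    }
    return [symptom for symptom, firing in checks.items() if firing]

snapshot = {
    "queue_latency_s": 75.0,
    "unreviewed_errors": 240,
    "disk_used_pct": 88.0,
    "cpu_used_pct": 97.0,
}
for symptom in overwork_signals(snapshot):
    print("WARNING:", symptom)
```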
Automation has a front-row seat in the making of a future we both fear and embrace. “Will my job be replaced?” and “Am I just replaceable?” will persist through the halls of philosophical and existential debate. New jobs will be created, while others will be repurposed. It is critical to understand how jobs will respond to automation.
As human beings, we are experts in what we do; we alone know what we know best. We have accumulated years of experience, which has aided our learning and decision making. The best strategy may be coupling the human and the autonomous artificial agent in the way we design future systems: the human is the strategic arm and decision-maker, weighing the analysis and work generated by the autonomous system.
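A minimal sketch of that coupling, assuming a hypothetical speech-to-text engine that reports its own confidence: the autonomous system handles the routine volume, and anything below a chosen threshold is routed to the human decision-maker. The threshold value and all function names are illustrative.

```python
CONFIDENCE_THRESHOLD = 0.90  # illustrative cutoff, not a tuned value

def machine_transcribe(audio_file: str) -> tuple:
    """Stand-in for an autonomous speech-to-text engine that returns
    a draft transcript and its own confidence score."""
    return f"draft transcript of {audio_file}", 0.82

def human_review(draft: str) -> str:
    """Stand-in for the strategic human arm, which corrects and approves."""
    return draft

def transcribe(audio_file: str) -> str:
    draft, confidence = machine_transcribe(audio_file)
    if confidence < CONFIDENCE_THRESHOLD:
        return human_review(draft)  # human decides the uncertain cases
    return draft  # machine handles the routine volume

print(transcribe("patient-note-001.wav"))
```

The design choice here is that the machine’s confidence, not its output alone, determines when the human steps in.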
Athreon’s expertise spans decades in converting speech to text and in technology. Regardless of industry, if you’d like a one-time complimentary consultation about your business processes, technology designs, and speech-to-text needs, from a current or future perspective, contact us today.