Title: Self-Service Data Analytics and Governance for Managers
Author: Nathan E. Myers
Publisher: John Wiley & Sons Limited
Genre: Corporate Culture
ISBN: 9781119773306
Robotic Process Automation
One of the tools that has gained prominence is robotic process automation (RPA). Software robots, or Bots for short, can be used to automate routine processing steps that were previously performed by humans. RPA is most appropriate for highly routinized or transactional processes, or for the routinized portions of more complex processes. The obvious benefits of RPA can be measured in three ways:
1 In many cases, the cost of the software licenses required to maintain a Bot can be less than the cost of maintaining the number of employees it can replace. This is often, but not always, true.
2 Given that Bots, by definition, structure processes that were previously unstructured and manually performed by operators, they can lead to increased control and process stability.
3 Bots can perform processes at speeds unrivaled by humans, when appropriately configured. This means that work which previously required a full day to perform (or many equivalent workdays to perform, in cases where an entire team performed the task in the legacy environment) can be accomplished in minutes – or even seconds.
Data entry is often a prime candidate for this technology, when the target source data for input can be found in a consistent array. Imagine for a moment that a receivables clerk routinely captures an extract of received cash from a bank statement, traces the amount back to a receivables balance, and then formulates a cross-asset journal entry in the general ledger to debit cash and credit accounts receivable. If we assume that there is connectivity between the Bot software and all component systems related to the process, that there is a high-quality mapping of customer static data allowing the enterprise name on the bank statement to be related to the accounts in the receivables master, and that adequate controls can be built around the process to ensure it is rigorous and stable, this process may be a great candidate for a Bot.
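To make the shape of such a Bot concrete, here is a minimal sketch in Python of the cash-application logic described above. The data structures and names (the statement extract, the customer static data mapping, the receivables master) are illustrative assumptions for this example, not any vendor's actual RPA API:

```python
# Illustrative cash-application Bot logic (hypothetical data sources and
# field names; a real RPA platform wraps equivalent steps in its own workflow).

def apply_cash_receipts(bank_statement, customer_map, receivables):
    """Match received cash to open receivables and build journal entries."""
    journal_entries, exceptions = [], []
    for receipt in bank_statement:
        # Translate the enterprise name on the statement to an account
        # number using the customer static data mapping.
        account = customer_map.get(receipt["payer_name"])
        open_balance = receivables.get(account)
        if account is None or open_balance is None:
            exceptions.append(receipt)      # unmapped: surface for human review
            continue
        if receipt["amount"] <= open_balance:
            journal_entries.append({
                "debit": ("Cash", receipt["amount"]),
                "credit": ("Accounts Receivable", receipt["amount"]),
                "reference": receipt["reference"],
            })
        else:
            exceptions.append(receipt)      # overpayment requires judgment
    return journal_entries, exceptions

receipts = [{"payer_name": "Acme Corp", "amount": 1200.0, "reference": "BS-0457"}]
entries, review_queue = apply_cash_receipts(
    receipts,
    customer_map={"Acme Corp": "AR-1001"},
    receivables={"AR-1001": 1500.0},
)
print(entries, review_queue)
```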
Today, readers may be interacting with Bots without even being aware. Individuals may engage Chatter Bots (Chat Bots) on any number of platforms. Chat Bots are NLP-intensive applications programmed to conduct human-like online chat conversations. Customers interact with them in much the same way as they would, were there a live operator on the other end of the line. The software responds to social and conversational cues like "Hello" and responds in kind ("Hello, Reader!"). It also performs a classification to understand what information is being requested and what operations are required to respond most appropriately. These applications are heavily used by support teams and can multiply the bandwidth of existing staff. Of course, there are limitations. The Bots must be explicitly programmed to respond to written cues, meaning that each response must have been explicitly provided for as the program was developed. The Turing Test was developed to test the ability of a machine to interact in a way that is indistinguishable from a human. On this scale, many Chat Bots today fail to convincingly resemble humans, but they can be used to great advantage when requests are highly predictable and standardized.
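A minimal sketch of what that explicit programming can look like, assuming a simple keyword-rule design; the intents, keywords, and canned responses are invented for illustration, and anything not provided for falls through to a human:

```python
# Rule-based Chat Bot sketch: every recognizable intent must be explicitly
# provided for, which is exactly the limitation noted above.

INTENT_RULES = {
    "greeting": ["hello", "hi", "good morning"],
    "balance_inquiry": ["balance", "how much do i owe"],
    "hours": ["opening hours", "when are you open"],
}

RESPONSES = {
    "greeting": "Hello, Reader!",
    "balance_inquiry": "Let me look up your balance.",
    "hours": "We are open 9am-5pm, Monday to Friday.",
}

def classify(message: str) -> str:
    """Match the message against keyword rules; unknown if nothing matches."""
    text = message.lower()
    for intent, keywords in INTENT_RULES.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return "unknown"

def respond(message: str) -> str:
    intent = classify(message)
    return RESPONSES.get(intent, "Let me connect you with an agent.")

print(respond("Hello"))                 # -> "Hello, Reader!"
print(respond("What is my balance?"))   # -> "Let me look up your balance."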
Complex software installations can be performed by a Bot, given that the number and order of steps are discrete, finite, and well understood. Reconciliations, which occupy many in accounting, finance, and operations, can readily be performed by Bots. Activities that require merging data from multiple systems, departments, or processing outputs can also benefit from a Bot, though in many cases there are ready-made ETL tools that are better suited to the task.
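As an illustration of the reconciliation use case, the sketch below uses pandas (an ETL-style tool of the kind just mentioned) to compare two hypothetical system extracts and isolate the breaks; the column names and figures are made up for the example:

```python
# Two-way reconciliation sketch: merge a general ledger extract against a
# trade blotter extract and keep only the breaks.
import pandas as pd

ledger = pd.DataFrame({"trade_id": [1, 2, 3], "gl_amount": [100.0, 250.0, 75.0]})
blotter = pd.DataFrame({"trade_id": [1, 2, 4], "blotter_amount": [100.0, 240.0, 60.0]})

recon = ledger.merge(blotter, on="trade_id", how="outer", indicator=True)
breaks = recon[
    (recon["_merge"] != "both")                        # one-sided items
    | (recon["gl_amount"] != recon["blotter_amount"])  # amount mismatches
]
print(breaks)  # trades 2 (mismatch), 3 and 4 (one-sided) surface for review
```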
There are general rules of thumb that allow reviewers to confirm that a process is viable for RPA. As a starting point, the target data attribute needs to be mastered consistently, such that it appears in the same format, placement, or field on a page, file, or screen. If it is being captured for entry elsewhere, the Bot must be able to readily interface with the destination application and navigate to the target field for entry. More complex processes can represent a challenge for Bots, where instability of input formats and locations is introduced. Any process that requires qualitative discernment would likely be out of reach for independent Bot processing. However, Bots in a first pass could wade through processing queues, process only qualifying transactions where prerequisite values and criteria are met, and surface all nonconforming transactions that warrant scrutiny and discernment to an exception queue for a human operator. Sure, it is cherry picking, but does it add up to appreciable savings? In many cases, the answer is yes.
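That first-pass pattern might be sketched as follows; the eligibility criteria and transaction fields are hypothetical stand-ins for whatever prerequisite values a real process would check:

```python
# First-pass triage sketch: the Bot auto-processes only transactions that
# meet every prerequisite criterion and routes everything else to an
# exception queue for a human operator.

def triage(queue, criteria, process):
    """Auto-process conforming items; return the rest for human review."""
    exceptions = []
    for txn in queue:
        if all(check(txn) for check in criteria):
            process(txn)               # straight-through processing
        else:
            exceptions.append(txn)     # needs scrutiny and discernment
    return exceptions

criteria = [
    lambda t: t["amount"] < 10_000,        # below a review threshold
    lambda t: t["currency"] == "USD",      # single-currency scope
    lambda t: t["counterparty_verified"],  # static data is complete
]
queue = [
    {"id": 1, "amount": 500, "currency": "USD", "counterparty_verified": True},
    {"id": 2, "amount": 50_000, "currency": "EUR", "counterparty_verified": True},
]
human_queue = triage(queue, criteria, process=lambda t: print("processed", t["id"]))
print("for review:", [t["id"] for t in human_queue])
```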
Another way to deal with complex processes is to break them down into a series of the most basic discrete steps. For the simpler steps in the processing chain that are viable, Bots can be deployed for rapid execution, leaving the more complicated processing steps in the value chain to an operator (who now enjoys a bit more time to perform them). A much-repeated folly is to promise your stakeholders that an entire process can be automated from A to Z. Very often it is simply a matter of time until a time-consuming challenge is encountered that endangers the delivery as a whole, or at least perceptions of the delivery. In reality, virtually all automation projects leave a residual manual tail of work unautomated.
One alternate approach is to embrace this fact from the start and to employ a modular approach to automation, narrowly defining the scope of Bots within a process chain. The scope of a Bot could be defined as only a very narrow sliver of the overall process. As an example, when a product controller performs a reconciliation of trade blotter positions to general ledger positions on T+1, they may perform 27 steps across several systems, many of which involve discernment, judgment, and the benefit of their considerable experience and expertise. Within the overall process, only several individual steps are well suited for RPA. For example, two steps that are critical and must be performed without exception are: (1) download an extract of yesterday's trade activity from the risk management system, and (2) launch the general ledger application and set the value date to the prior day. These narrower-scope processes can be successfully automated with a Bot, even though the remainder of the processing steps may be overly complex or unsuitable for automation. By modularly defining the scope of Bots and deploying them for limited use, time-to-market can be hastened, versus spinning your wheels attempting to automate the process from head to toe (chasing a unicorn). The individual savings on any given day may be small, but the benefits can be considerable if the same functionality can be reused with only minor customization. (You have now built an application-launch Bot for the general ledger. Need a Bot to launch any other application? Simply rinse, adapt, and repeat.) The benefits add up when there is wide applicability and abundant opportunity for modular functionality to be redeployed to the benefit of hundreds or even thousands of employees across an enterprise.
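A minimal sketch of the two modular steps just described, written as small reusable Python functions; the system names, file path, and executable are hypothetical, and a live Bot would invoke the RPA platform's own connectors rather than simply printing:

```python
# Modular Bot steps as small, reusable functions. Names and paths are
# hypothetical; a real deployment would call the platform's connectors or
# subprocess.Popen to actually launch programs.
from datetime import date, timedelta

def download_trade_extract(risk_system, destination):
    """Step 1: pull yesterday's trade activity from the risk management system."""
    prior_day = date.today() - timedelta(days=1)
    # Placeholder for the system-specific export call.
    print(f"exporting trades for {prior_day} from {risk_system} to {destination}")

def launch_application(executable, **params):
    """Step 2: a generic application-launch Bot; rinse, adapt, and repeat."""
    command = [executable] + [f"--{key}={value}" for key, value in params.items()]
    print("launching:", " ".join(command))  # a live Bot would spawn the process here

# Reuse the same launcher for the general ledger or any other application.
download_trade_extract("risk_mgmt_system", "/tmp/trades_t1.csv")
launch_application(
    "general_ledger",
    value_date=(date.today() - timedelta(days=1)).isoformat(),
)
```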
At the time of this writing, the robotics platforms of four companies dominate, although this is a moving target. The industry-leading platforms are Automation Anywhere, Blue Prism, UiPath, and NICE. These names are less important than gaining an appreciation for the underlying technology and the appropriate use cases to which they are best applied. For now, remember that Bots are best deployed for stable and repetitive processes that exhibit very little variance.
Machine Learning
Machine learning (ML) is the subset of artificial intelligence (AI) focused on the study of computer algorithms that improve automatically through experience. Machine learning algorithms build a mathematical model based on samples of data observations, or training data, to make decisions or predictions without being explicitly programmed to do so. Above, we introduced RPA, which relies on very regimented coding.
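To make the contrast concrete, the toy example below fits a scikit-learn decision tree to a handful of labeled observations and lets the model infer the decision rule, rather than having a programmer code it explicitly; the features and labels are invented for illustration:

```python
# Contrast with RPA's explicit rules: here the decision boundary is learned
# from labeled training samples rather than hand-coded.
from sklearn.tree import DecisionTreeClassifier

# Toy training data: [invoice_amount, days_outstanding] -> 1 if the
# receivable was ultimately written off, 0 if it was collected.
X_train = [[100, 5], [250, 10], [5000, 90], [7500, 120], [300, 15], [9000, 200]]
y_train = [0, 0, 1, 1, 0, 1]

model = DecisionTreeClassifier().fit(X_train, y_train)
print(model.predict([[400, 20], [8000, 150]]))  # expected: [0 1]
```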