Frequently asked questions
Below are the most frequently asked questions about Blue Prism. If you require further assistance, please contact us.
Robotic automation refers to a style of automation in which a machine, or computer, mimics a human’s actions in completing rules-based tasks.
In the domain of back office administration, Robotic Automation refers to automation where a computer drives existing enterprise application software in the same way that a user does. This means that unlike traditional application software, Robotic Automation is a tool or platform that operates and orchestrates other application software through the existing application’s user interface and in this sense is not “integrated”.
• No IT infrastructure changes are required – there is no integration requirement – the robots interface with any application through the user interface in the same way a user does.
• No integration costs – robots drive existing applications.
• Robots are “trained” by their users by being “shown” how to complete a task. This is akin to training a new employee.
• Once trained, a robot can be scaled across any number of other robots.
• Robot knowledge is extended and re-used over time.
• A robot is trained in the live environment, making projects less expensive and much faster than traditional IT.
• Multiple robots applied to a task can be synchronised to deliver large-scale robotic platforms.
No, clerical Robotic Automation is a generation on from old technologies like screen scraping or macros. The major differences are:
• Robots are universal application orchestrators – any application that can be used by a person can be used by a modern robot, whether mainframe, legacy, bespoke application, web service enabled or even a closed 3rd party API hosted service.
• Applications are “read” by the robot, either through dedicated APIs where they exist, through the OS prior to application display, or through the screen in the context of the native application. In this last case the modern robot “reads” an application screen in context and in the same way a user does. As part of the robot training it is shown how to read the application’s display much like a user is shown.
• Robots collect procedural knowledge which, over time, builds into a shared library that can be re-used by any other robot or device (in the same way objects are built in traditional software engineering).
• A robot is trained through a flow chart of the procedure. This flow-chart is managed and audited to document the procedure.
• Management information is gathered automatically as the robot operates. All processes generate statistical profiles as a by-product of doing the action. This allows tuning and development of a process in light of real data.
• Modern robot systems come with failover and recovery built in as core capabilities. This means that if changes take place, or downstream failures occur, a “smart” response can be trained into the overall system.
• Modern robot systems have full audit and security authorisation, meaning that all changes and all access are recorded and regulated. Back-up process steps, roll-back and recovery, as well as process change-highlighting, are all managed and automatically captured by the robot platform.
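The flow-chart idea above can be pictured as named action and decision steps that a runtime walks for each work item, recording every step as it goes (which is how management information falls out as a by-product). This is a hypothetical sketch only; the node names and structure are illustrative and are not Blue Prism's actual process format.

```python
# Hypothetical sketch: a rules-based procedure captured as a flow chart.
# Action nodes name a step and its successor; decision nodes branch.
flow = {
    "start":           ("action", "open_account_screen", "check_balance"),
    "check_balance":   ("decision", lambda case: case["balance"] >= 0,
                        "close_record", "flag_for_review"),
    "flag_for_review": ("action", "queue_exception", "close_record"),
    "close_record":    ("action", "save_and_exit", None),
}

def run(flow, case):
    """Walk the flow chart for one work item, recording each action taken."""
    trail, node = [], "start"
    while node is not None:
        if flow[node][0] == "action":
            _, action, nxt = flow[node]
            trail.append(action)          # the audit trail builds itself
            node = nxt
        else:
            _, test, yes, no = flow[node]
            node = yes if test(case) else no
    return trail

print(run(flow, {"balance": -50}))
# → ['open_account_screen', 'queue_exception', 'save_and_exit']
```

Because every execution produces a trail like this, step counts and timings can be aggregated without any extra instrumentation, which is the point made about management information above.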
• BPMS is principally aimed at improving IT architecture to allow greater flexibility in automation and process management capability. Most often its aim is to support agent productivity through desktop acceleration, application connectivity and workflow management. As such, BPMS is part of the core IT tool set, and adjustments to a solution outside of its configurable parameters normally require a traditional IT change program. Most often, connectivity between applications, and design work on how applications should be integrated against business requirements, are key skills required to operate BPMS effectively.
• Robotic Automation is principally aimed at clerical staff replacement, as opposed to the clerical staff acceleration offered by BPMS. The philosophy of the approach is therefore to target routine, repetitive, rules-based tasks (procedures as sub-tasks within a larger business process). Such tasks can often tie clerical staff down for long stretches of time. Very often such tasks are small, possibly involving 5-10 people, and so do not justify large IT, or even BPMS, projects to automate. The difference with robotic automation is that no IT change is required, and business users can “show” the robot what to do. The capability is therefore distributed to operations staff, so as to divide and conquer the many mid-to-small automation initiatives that would otherwise require people.
No, Robotic automation extends and complements BPMS and SOA initiatives, which attack the automation challenge from a different, top-down, IT-driven angle. Robotic automation is aimed at small-to-mid-size automation initiatives. Where speed, size and agility are major factors, robotic automation is often the fastest and most efficient approach. Where larger initiatives with a fuller “Business Process” character are required, BPMS may be better suited.
This difference in scale is illustrated by the so-called Long Tail of Automation Requirements. This says that core IT deals with the high-volume bulk processing requirements an organisation may have. Typically these are core ERP systems, mainframe accounting and core databases. As we move towards the middle of the graph, requirements become more specialist and diverse. This is where an organisation often differentiates its product and service offerings. Typical technologies here are workflow, desktop integration, BPMS and agent acceleration. These are large IT programs that serve to offer a platform for automation and work management.
Finally, we have the third section of the Long Tail – tasks characterised by their diversity. Often they are too diverse to make an IT change program worthwhile, and may be too small to justify IT project costs. Here, traditional approaches have been to outsource or offshore, in order to adjust labour rates and make the task more competitive. Robotic automation offers an alternative to offshoring or outsourcing, presenting a new cost band of labour based on robots.
No – Robotic automation actually addresses rogue IT (i.e., disparate initiatives across the business that may create risks to business standards, continuity and brand quality). Robotic automation addresses this issue on a number of levels:
• Robotic Automation is normally housed, monitored, licensed and controlled by IT, or at the very least by a centralized governance body. This group enforces a central usage policy configured within all robots.
• Robotic processes are accretive – objects are built and are then available for re-use across the business. This allows disparate groups to share and build common resources, supporting much greater resource re-use than many software environments.
• No new data – best practice robotic automation discourages, or even forbids, the creation of new data. Systems are used by robots as they appear to users, so as to coordinate and streamline enterprise governance.
• Robotic FTEs are a third of the price of off-shored FTEs and can work 24/7.
• Speed to automation – clerical procedures can be automated in days or weeks.
• “Self build” – no need for specialist IT; robots are trained by end-users.
• Robots are trained to do repetitive clerical tasks and drive existing applications, so no costly integration or expensive process re-design expertise is needed.
• A small specialist team from business operations works alongside the robots to train them, manage exceptions and continually improve the robots’ operational performance.
• MI is automatically captured across all procedures operated.
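To make the cost and availability claims above concrete, here is a back-of-the-envelope comparison. The only figures taken from the text are the one-third price ratio and 24/7 availability; the annual cost and working-hours numbers below are illustrative assumptions, not published Blue Prism pricing.

```python
# Illustrative only: the 30,000 annual cost and the shift pattern are
# assumptions; the 1/3 ratio and 24/7 availability come from the text.
offshore_fte_annual = 30_000                 # assumed offshore FTE cost per year
robot_fte_annual = offshore_fte_annual / 3   # "a third of the price"

offshore_hours = 8 * 5 * 48                  # one shift, five days, 48 weeks
robot_hours = 24 * 7 * 52                    # "can work 24/7"

offshore_rate = offshore_fte_annual / offshore_hours
robot_rate = robot_fte_annual / robot_hours

print(f"offshore: {offshore_rate:.2f}/hour, robot: {robot_rate:.2f}/hour")
```

On these assumed numbers the robot's effective hourly rate is well under a tenth of the offshore rate, because the lower price compounds with the much longer working year.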
Blue Prism has incorporated many years of experience of integration and numerous technologies into its software. The technologies used are secure, reliable and robust. Instead of creating new adaptors for each unique application we have developed technology adaptors for all the technologies employed at the presentation layer, Java, Windows, Web, Green Screen/Mainframe and even Citrix.
This, coupled with a wide range of dedicated tools that have been developed, means that we are confident of being able to link to any system with the click of a button. This proven application orchestration capability ensures that new processes can be rapidly designed, built and tested without any impact on existing systems.
It is a key design point of Blue Prism that we don’t change any of your underlying systems, as this is often complex and expensive to perform.
Blue Prism gathers data and integrates processes at an abstracted level, using a variety of techniques and interfaces that ensure underlying systems are not impacted.
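One way to picture the technology-adaptor approach described above is a single generic presentation-layer interface with one implementation per technology, so the same process logic can drive a web page, a Windows application or a green-screen session unchanged. The class and method names below are hypothetical illustrations for this sketch, not Blue Prism's API.

```python
from abc import ABC, abstractmethod

class PresentationAdaptor(ABC):
    """Drives an application through its presentation layer, as a user would."""
    @abstractmethod
    def read(self, element: str) -> str: ...
    @abstractmethod
    def write(self, element: str, value: str) -> None: ...

class WebAdaptor(PresentationAdaptor):
    """Stand-in for a browser-based application; a dict plays the UI here."""
    def __init__(self):
        self.fields = {}
    def read(self, element: str) -> str:
        return self.fields.get(element, "")
    def write(self, element: str, value: str) -> None:
        self.fields[element] = value

class GreenScreenAdaptor(WebAdaptor):
    """Stand-in for a terminal-emulator session; same interface, different tech."""
    pass

def copy_field(src: PresentationAdaptor, dst: PresentationAdaptor, element: str):
    """Process logic written once, runnable against any adaptor."""
    dst.write(element, src.read(element))

web, host = WebAdaptor(), GreenScreenAdaptor()
web.write("account_no", "12345")
copy_field(web, host, "account_no")   # the same logic drives both "technologies"
print(host.read("account_no"))
```

The design point is that process definitions depend only on the abstract interface, so no underlying system needs to change when a new technology adaptor is added.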
Security and auditability are built into the Blue Prism robotic automation platform at several levels. The runtime environment is completely separate to the process editing environment.
Permissions to design, create, edit and run processes and business objects are specific to each authorised user.
A full audit trail of changes to any process is kept, and comparisons of the before and after effect of changes are provided.
The log created at run-time for each process provides a detailed, time-stamped history of every action and decision taken within an automated process.
Our clients tend to find that running a process with Blue Prism gives them far more control than a manual process and, from a compliance point of view, ensures that processes are run consistently, in line with the process definition.
Blue Prism can easily track and report changes to processes.
Our audit viewer allows users to track the details of who, when, why and exactly how a process was changed.
In addition, detailed logs are held of every step taken during execution, providing a robust and detailed audit trail.
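A time-stamped run-time log of the kind described above might look like the following sketch. The field names are illustrative assumptions for this example, not Blue Prism's actual log schema.

```python
from datetime import datetime, timezone

def log_step(log, process, step, decision=None):
    """Append one time-stamped entry for an action or decision in a process."""
    log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "process": process,
        "step": step,
        "decision": decision,
    })

log = []
log_step(log, "Close Account", "open_customer_record")
log_step(log, "Close Account", "check_balance", decision="balance_zero")
log_step(log, "Close Account", "archive_record")

for entry in log:
    print(entry["timestamp"], entry["process"], entry["step"],
          entry["decision"] or "-")
```

Because every action and decision lands in the log as it happens, the full history of a run can be replayed for audit without any extra effort from the process designer.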
A “fully loaded” office robot is approximately a third of the cost of a globally sourced agent. The flexibility and ease of deployment mean that this comparison is easy to maintain, and to judge the best approach to a given task.