Robotic Process Automation: 6 common misconceptions - TechNative
- Robotic Process Automation: 6 common misconceptions - TechNative
- Cloudhead Games' VR sleight of hand - GamesIndustry.biz
- PBS to air documentary on EGOT winner Rita Moreno in 2020 - Idaho News
Robotic Process Automation: 6 common misconceptions - TechNative

Posted: 29 Jul 2019 05:03 AM PDT

What false expectations are raised by using RPA in companies?

The advantage of Robotic Process Automation (RPA) is that it automates repetitive, menial tasks and frees employees to work on higher-value tasks. But many companies believe RPA will enable them to automate even the most complex Business Process Management (BPM) activities, although there are much more suitable solutions available. The following overview shows which other misconceptions companies frequently bring to RPA solutions.

Misconception #1: RPA fully automates processes from A to Z

It's the right decision to automate structured, repetitive tasks with RPA, as it's the best tool for this purpose. It shows its strength especially in short, repetitive activities that usually take only a few minutes – for example, retrieving data from one system and storing it in another. RPA is best for activities that require multiple repetitions of the same sequence and can be conducted in parallel to create greater efficiencies. B2B companies, for instance, often have to check several portals or suppliers in order to buy inventory at the best rate. An employee would have to work through all the steps in each portal sequentially. But with RPA, the software robots act as "digital colleagues": they monitor product prices and regularly inform employees about changes, retrieving figures from all portals simultaneously, as the sketch below illustrates.
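As a rough illustration of this parallel pattern, here is a minimal Python sketch. The portal URLs, the fetch_price helper, and the JSON payload shape are hypothetical placeholders; in practice an RPA platform would be configured to do this rather than the logic being hand-written.

```python
# Minimal sketch of the "digital colleague" pattern: polling several
# supplier portals in parallel instead of sequentially. The portal URLs
# and the {"price": ...} payload are hypothetical placeholders.
from concurrent.futures import ThreadPoolExecutor

import requests

PORTALS = {
    "supplier_a": "https://portal-a.example.com/api/price?sku=12345",
    "supplier_b": "https://portal-b.example.com/api/price?sku=12345",
    "supplier_c": "https://portal-c.example.com/api/price?sku=12345",
}

def fetch_price(item):
    """Retrieve the current price from one portal; return (portal, price)."""
    name, url = item
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return name, response.json()["price"]  # assumes a {"price": ...} payload

# Query every portal at the same time instead of one after another.
with ThreadPoolExecutor(max_workers=len(PORTALS)) as pool:
    prices = dict(pool.map(fetch_price, PORTALS.items()))

best = min(prices, key=prices.get)
print(f"Best rate: {prices[best]} from {best}")
```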
Unlike BPM platforms, RPA isn't capable of managing processes end-to-end over a longer period of time. An example: a customer wants to order something, complain, or obtain information. Accordingly, a process is triggered in the company, and it can sometimes take up to 14 days to complete the request. Although the digital colleague can support the employee by retrieving data on the customer, decisions are still made by the individual. That's why a BPM solution is the much better choice, because the system can integrate employees into the process depending on availability and skills. A combined solution of BPM and RPA proves particularly efficient: the BPM system takes over the administration and directs employees and digital colleagues as required.

Misconception #2: RPA completely replaces other solutions

Conceptually, RPA can be used in almost all processes. However, sometimes commercial off-the-shelf (COTS) software is a better choice. For example, companies have been using invoice processing software for many years. Over time, these solutions have improved greatly as feedback and process expertise are incorporated. Replacing these tools isn't practical: it would take a lot of time to map all existing functions with RPA, and it wouldn't help organisations reach their goal of end-to-end automation. Nevertheless, many companies still need to connect RPA to these legacy solutions so that software robots can fill the systems with the data they need or match it with other applications. However, this only works if companies select an RPA solution that offers the appropriate application programming interfaces (APIs), so they can easily connect systems.

Misconception #3: RPA automates processes best via user interfaces

Although in many cases it makes sense to keep existing tools rather than simply replace them with RPA, RPA definitely has its raison d'être: integrating different systems with each other. This is a major challenge facing many companies. Integration is time-consuming and expensive because some client, web browser, or legacy applications are difficult to connect to modern technologies. RPA makes it easy to set up an integration between systems very quickly via the user interface (UI). Still, direct interfaces such as APIs or web services are generally preferred, since they are usually faster and don't change as frequently as user interfaces. If the UI changes due to an update, some software robots can no longer perform their service and an adjustment becomes necessary. Good RPA solutions therefore work in both directions: they allow the simple integration of direct interfaces as well as the use of user interfaces, without having to write even one line of code. The sketch below contrasts the two routes.
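To make the contrast concrete, here is a minimal sketch of the two integration routes. The CRM endpoint and the UI element IDs are hypothetical placeholders, and commercial RPA tools wire this up visually rather than in code; the snippet only illustrates what such a tool automates under the hood.

```python
# Two routes to the same customer record. The endpoint, page URL, and
# element IDs below are hypothetical placeholders.
import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

CUSTOMER_ID = "C-1001"

# Route 1: a direct interface (API) -- generally preferred, since it is
# faster and changes less often than a user interface.
customer = requests.get(
    f"https://crm.example.com/api/customers/{CUSTOMER_ID}", timeout=10
).json()

# Route 2: the user interface -- the fallback when a legacy application
# offers no API. A UI update can break the selectors below, which is why
# UI-driven robots sometimes need readjusting.
driver = webdriver.Chrome()
driver.get("https://legacy-crm.example.com/search")
driver.find_element(By.ID, "customer-id").send_keys(CUSTOMER_ID)
driver.find_element(By.ID, "search-button").click()
customer_name = driver.find_element(By.ID, "result-name").text
driver.quit()
```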
Misconception #4: RPA will replace employees

The fear that employees will be replaced by an automation tool like RPA is growing all the time, especially as intelligent automation platforms become the talk of the town. RPA on its own can't process unstructured data, such as documents, because it lacks the necessary "intelligence". But intelligent automation combines RPA with artificial intelligence capabilities, enabling it to recognise the difference between an invoice and a complaint letter without consulting a human. Although software is becoming more intelligent, that doesn't mean it will replace people. On the contrary, intelligent automation relieves employees of monotonous tasks and gives them back valuable time. In fact, according to a global Forbes Insights survey, 92 percent of organisations say their employees are significantly more satisfied once RPA is implemented, because it makes them more efficient. Rather than spend time answering routine customer complaints, they can focus on higher-value work such as resolving complex customer issues.

Misconception #5: RPA in the back office has no influence on customer satisfaction

As RPA is predominantly used for simpler activities, one might believe it has no impact on customer satisfaction. But that's not true. When digital colleagues take over repetitive tasks, not only are requests processed more quickly, but employees have more time to build relationships with customers.

Another benefit of RPA is that it improves data quality. According to estimates, about 5 percent of an organisation's data is incorrect, either because information is lost during copying or because errors are made during data entry. For example, an employee of an energy supplier might read the electricity meter and then accidentally assign the reading to another customer when entering it into the system. When the customer receives their bill, the amount due could be significantly higher or lower than before. Customer satisfaction takes a hit, because now the customer has to complain to the utility, which can require a great deal of time and effort. When a robot is tasked with copying or storing the data, however, such errors are virtually eliminated. A robot can also quickly and easily check whether the entered data is valid – another way it improves data quality. A minimal version of such a check is sketched below.
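Here is a minimal sketch of such a plausibility check, staying with the meter-reading example. The thresholds and the sample values are hypothetical.

```python
# Before storing a meter reading, the robot checks it for plausibility.
# The 10,000 kWh threshold and the sample readings are hypothetical.
def validate_reading(previous_kwh: int, new_kwh: int) -> list[str]:
    """Return a list of problems found; an empty list means the reading passes."""
    problems = []
    if new_kwh < previous_kwh:
        problems.append("Reading is lower than the last one -- possible customer mix-up.")
    elif new_kwh - previous_kwh > 10_000:  # implausible jump for one billing period
        problems.append("Consumption jump is implausibly large -- flag for review.")
    return problems

# A clean reading is stored automatically; a suspicious one goes to an employee.
issues = validate_reading(previous_kwh=41_200, new_kwh=14_200)
if issues:
    print("Escalating to an employee:", *issues, sep="\n  ")
```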
Misconception #6: RPA can be used for complex processes and immediately throughout the entire company

Companies recognise that RPA offers enormous optimisation potential. It's possible to deploy RPA across the enterprise and even use it for some complex processes. However, companies sometimes make the mistake of thinking RPA alone can help them achieve end-to-end automation; RPA is better suited to automating tasks rather than entire processes.

The best way to approach end-to-end automation is to establish a "Centre of Excellence". It's especially important to include business owners on this project team: they're closest to the processes, understand what each entails, and are particularly good at estimating how much time employees spend on tasks. It's advisable to start your RPA initiative by automating smaller tasks first and gaining experience with them. Companies often begin with the finance department, where the robots take over the very time-consuming creation of reports. After that, they can extend RPA to more areas and automate increasingly complex activities. With this approach, companies reduce risk and can even deliver a short-term ROI as small tasks are automated. As they gain more experience, they can scale their RPA efforts and begin working like tomorrow, today.

About the Author

Daniel Schmidt is Senior Product Marketing Manager at Kofax.
Cloudhead Games' VR sleight of hand - GamesIndustry.biz

Posted: 29 Jul 2019 04:27 AM PDT

Since the early days of the Oculus, VR studio Cloudhead Games has been a dab hand at... well, hands. Co-founder Denny Unger was dabbling in VR around the end of 2012 when he met Oculus Rift creator Palmer Luckey on an online forum and began experimenting with the headset tech; he even designed the first Oculus logo. In the process, Cloudhead Games was formed and began working with the headset prototypes, but always with an emphasis on hand interaction.

Cloudhead designer Antony Stevens tells GamesIndustry.biz that, back then, the headsets they were working with were limited to just head motions. That required extra equipment like the Razer Hydra for hand presence, something people getting in on the ground floor of VR might not necessarily have on hand. But the limited technology and audience did not deter Cloudhead.

"Hands were a big thing because they were the next step," Stevens says. "The endgame of VR was never going to be just our head. And hands are such a tactile part of the human experience in general and in gaming, too. You game with gamepads and controllers and control sticks, so the fine dexterity you get with hands, the manipulation, is all super important for experiences, and VR is trying to emulate that."
Cameron Oltman, Cloudhead Games senior programmer, adds: "The very first VR headsets out of the recent wave of VR stuff gave you the ability to stick a thing on your head and look around and look at an environment in every direction, but you didn't get that thing you get in real life where you can lean around and look under stuff. You could look in any direction, but it was basically stereoscopic 3D video.

"There's this thing that completely changes the nature of the experience when you get into room-scale, where you move your head and the world feels like it stays still around you, just like the real world. That's hard to describe if you haven't been in it, but it's a very transformative experience. And as soon as you're doing that in an environment, you want to reach out with your hands and grab stuff. You can't not do it, basically."

Cloudhead Games' fascination with hands in VR paid off fairly early on, when Valve took notice of its work. Valve had been working with HTC and its Vive headset, offering support through SteamVR, but it was also internally developing its own technology that just recently came to market in the form of the Valve Index. Key to the Index is its controllers, which not only track motion in VR space like most modern VR hand controllers, but can also detect individual finger motion and grip, allowing players to let go of and pick up objects.

[Image: Aperture Hand Labs has the player go through a series of practice hand gestures, guided by friendly (or not-so-friendly) Aperture Science robots]

That technology was a perfect fit for Cloudhead, which has been involved since long before the Index was officially announced, when its controllers were still referred to as "Knuckles" controllers. Cloudhead was approached by Valve in 2016 to create a demo for the Knuckles after its work on a cinematic fantasy adventure game called The Gallery, which prominently featured hand interactions.

"Lots of VR games in 2016 focused on what Owlchemy Labs [Job Simulator, Vacation Simulator] calls 'tomato presence', which is where your controller turns into the object you're holding, rather than your hand holding an object," says Stevens. "We went in the opposite direction -- we had hands, and hand animations, and all that stuff, and Valve was working on these hand controllers, so we used The Gallery as the demo for the original Knuckles controllers, and we've been involved with Valve as the Knuckles evolved and as Index evolved."

Stevens says that the pitch for Aperture Hand Labs was essentially a "big book of ideas for Valve" that spanned multiple properties, but the Aperture theme was what took. The result is a five-minute tech demo for the Valve Index that takes the player through multiple hand gestures that make specific use of the Valve Index controllers and their individual finger tracking.
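To give a rough sense of how per-finger tracking can be turned into a recognised gesture, here is a toy sketch. The curl values and thresholds are invented; a real title would read finger data from the VR runtime rather than from hard-coded numbers.

```python
# Toy illustration of interpreting per-finger data as a pose, in the
# spirit of the Index controllers' finger tracking. Curl values
# (0.0 = fully extended, 1.0 = fully curled) are hypothetical inputs
# standing in for what the VR runtime would report.
def classify_gesture(thumb, index, middle, ring, pinky):
    extended = [curl < 0.3 for curl in (thumb, index, middle, ring, pinky)]

    if not any(extended):
        return "rock"      # closed fist
    if all(extended):
        return "paper"     # open hand
    if extended == [False, True, True, False, False]:
        return "scissors"  # index and middle finger out
    return "unknown pose"

print(classify_gesture(0.9, 0.1, 0.1, 0.8, 0.9))  # -> scissors
```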
The demo clearly takes place at Portal's Aperture Science lab, and it's guided by various robot personalities reminiscent of some of the characters from the Portal games. And though Portal writer Erik Wolpaw and Portal 2 writer Jay Pinkerton put their "signature twist" on the writing and dialogue, Stevens says Cloudhead had a fair amount of freedom to build what it wanted from the ground up -- a brief experience that gives just a taste of what the Valve Index's motion-sensing technology might one day be used for in a full game.

"On previous controllers, your gross movements are basically floating in space and [your controllers] are your hands," says Oltman. "All the interactions beyond that were a button press or a direct translation of something you do on a computer already.

"But the Index controllers were very different from that. There were all those buttons, but the compelling thing about them was the interpretation of hand pose and intent. Part of the difference for the experience of building for that was there was this large space to explore that didn't involve the sort of things you normally think of as game interactions. Most of the interaction you do [in Hand Labs] is not what you think of as a normal interaction mechanic.

"There was a general shift in thinking that was freeing and a challenge at the same time, because you have to come up with new ways to think of what an interaction is."

[Image: Because the Valve Index can track individual fingers, it's possible to play rock-paper-scissors, flash a peace sign, or flip the bird in VR]

Both Stevens and Oltman say they feel the Valve Index is a meaningful step forward, not just because of its controllers, but also because of its display clarity. Stevens recalls a time when Valve internally referred to the Index as the next generation of VR, but he thinks the final product is more of a "1.5". From both him and Oltman, that's a compliment. They do acknowledge that the high cost of the headset ($1,000 for two base stations, the headset, and controllers) is a bit prohibitive for developers, but the existence of a high-end, high-cost product like the Index shouldn't surprise anyone.
"A lot of the thing that keeps coming back is, what is the gameplay of these Index controllers?" Stevens says. "And we try to show an example of what that will be with Hand Labs, and you haven't really seen too much of that with other developers, and that's likely because to get into the system is a bit more expensive than usual." Oltman adds: "This is one of those things that happens with any kind of market fragmentation, where the majority of what people develop for is the common set of capabilities that a broad install base can make use of. Anything that leads the pack in any direction is going to have a smaller development base just by nature of how many people there are that are going to have it." Because Cloudhead has a history of pushing forward on VR technology and capabilities, I asked the two what they saw as the next step for the kind of work they do with hands and immersive VR experiences in general. Stevens suggests that new narrative strides might be made in the coming years as AI improves, which could combine with motion-sensing technology for interesting and unique interactions with NPCs and characters in VR. Oltman's suggestion is, unsurprisingly, very hand-specific. Valve has said that a new 'flagship title' for the Index will launch later this year, and has slyly winked at (but not confirmed) rumormills suggesting it may be a Half-Life game "There isn't anything on the broad market right now that gives any kind of meaningful, physical, tactile feedback," he says. "Very fine motor control interactions are still basically all fakery. If you think about the things you do without even thinking about it in your day-to-day life, there are all kinds of very subtle, physical cues that are still impossible in VR, and we just kind of fake around the edges of them.
"For example, there's this cup on the table in front of me. I can reach out and grab it quickly, and I'm not going to slap it across the room unless I'm really distracted, because I get these very subtle cues about exactly where it is and when my fingers are touching it. In VR, you don't have that. That's one of the things that would be really cool." With the support of Valve and other big names in VR, Cloudhead has seen success in a tricky segment of the industry that's still struggling to win the same kind of mainstream support as consoles, PC, or other devices. Oltman maintains that cost has been a huge factor in this, but that systems such as the PSVR and newly-released Oculus Quest may soon break down the barrier. For Stevens, it all comes down to content. "I personally think there's an issue with how [VR] content is distributed, in terms of how it is curated," he says. "Sometimes it's overly curated, sometimes it's not curated enough. I don't think we've hit that sweet spot where we are with marketplaces like the PlayStation store or the Xbox store. "We're seeing a lot of big-budget experiences coming from existing 2D teams who aren't looking at VR from the ground up. They're looking at it from their existing genres and ethos and games and IPs and that kind of thing. And then we're seeing lots of amazing experiences from tiny, tiny teams, but those aren't going to be big enough experiences to sustain the ecosystem. "Collectively, there's a big issue with the middle tier in VR in terms of companies like that getting funded. There are some exceptions, but largely there are not too many of your average AA game in VR. There's lots of indies, a few AAA, but the AA is what's going to sustain the platform I think, and we don't have a solid system there yet." |
PBS to air documentary on EGOT winner Rita Moreno in 2020 - Idaho News

Posted: 29 Jul 2019 09:58 AM PDT

[unable to retrieve full-text content]

FILE - In this Aug. 25, 2018, file photo, Rita Moreno arrives at the 33rd annual Imagen Awards in Los Angeles. PBS announced Monday, July 29, 2019, it will air ...


