Appian – where AI meets RPA and Process Mining – Diginomica


Last week, Appian held its annual user conference in San Diego. The event reflected Appian's growing size and a number of changes (notably AI) impacting buyers AND sellers of software.

Attendance was pegged at over 1,500 people on-site, with many more attending virtually. The company, by the way, is approximately twice as big as it was in 2019. The last Appian World I attended in person was in May 2013, and it was a fraction of the size (and number of sessions) of this event. See this event writeup of that show for contrast and technology themes.

The name badges this year were interesting to read. I saw lots of developers and partners. But there were also people with Knowledge Management, Process Improvement and other job titles that are often scarce at ERP shows. I also met with a number of partner firm personnel. Appian, by dint of its large customers and their complex processes, is apparently a vendor that large integrators, consultants, process improvement specialists, etc. find attractive.

Complexity is something Appian customers possess. There were numerous demonstrations and breakout sessions whose speakers came from federal government agencies, insurance firms, life sciences and other entities with long, complex processes. These Appian customers and prospects are not satisfied with inordinately slow, potentially error-prone and partially manual processes. These businesses want better processes to improve the customer (or user) experience, reduce costs, increase productivity, and/or gain competitive advantage.

This was also a show where Appian developers and super-users at customer firms are highly revered. There was a $10,000 prize for the super-developer who won Appian's version of a hackathon (i.e., the Live Build Challenge). Even the PR team felt I needed to meet the lead cheerleader of the developer nation at Appian (it was a pleasure to do so, by the way).

Appian is now branding itself as the End to End Process Platform. This branding reflects its use of new AI/ML/large language model technologies with RPA (robotic process automation) and process mining to solve complex business process challenges quicker than ever. Watching demonstrations of these combined capabilities made me think that Appian's toolset is really a Business Productivity Generator. I'm not trying to start a branding war with that remark, but I think it's important for prospective Appian customers to see the company as more than a tools provider and more as a firm that helps move organizations to a new level of performance. In my experience, people buy outcomes, not tools.

AI as my co-pilot came up in numerous executive one-on-ones, keynotes, etc. Appian executives wanted their entire developer community to realize that newer AI, low-code, process mapping and similar tools will not lead to mass layoffs of programmers/developers. We saw several demonstrations making exactly that point.

I believe that humans will want to, and need to, work with these tools. I'm good with that concept. What I wonder, though, is what next year's Appian World event is going to look like. I suspect that in the next 12 months Appian and its customers will create a far larger number of process insights, automations, etc., and there won't be nearly enough time at the show to highlight even a small percentage of what will likely be some outstanding new creations.

The Data Fabric is what Appian calls its ability not just to stitch together disparate data but also to see how this information is (or could be) used in a process. Process performance and usage statistics are identified via process mining technology. Data is analyzed to see what information is being used in different processes and process steps. The Data Fabric helps companies in several ways.

Process HQ is Appian's long-range vision of process mining. The technology provides a nice graphical view of a process, statistics from process mining technology, workflow logic, process mining insights, etc. The software then shows where new effort/programming is needed to improve the business rules and outcomes.

One of the more interesting aspects of this capability is that users can see before and after process results/performance statistics to see if bottlenecks, throughput, etc. actually improved and are now at acceptable levels.

Earlier in my career, the process documentation tools I used (not the advanced process automation tools of today) were quite limited in what they could do. In fact, most of them were static documentation tools. Process HQ's power comes from harnessing the data within, and generated by, several Appian technologies to rapidly focus process experts on potential improvements and complete those improvements in short order.

Process Automation/Robotic Process Automation generally includes a number of tools to identify process workflows, exception logic/rules, approvals, etc. An RPA outcome can be a highly automated process where a number of routing, processing, decisions and other actions are occurring automatically. Done well, these tools can dramatically reduce human effort and errors.

There were a number of breakouts where customers, partner firms and/or Appian team members stepped attendees through the effort required to light up their government procurement, insurance underwriting or other complex processes. But complexity was only one factor common to many of these presentations. Some processes also involve a significant amount of regulation, lots of changes over time, rapidly evolving products, etc.

Where one customer might cite dealing with frequently changing regulations as the key driver for using these tools, another might list improving customer experience as the top goal. The variability in the kinds of processes being automated was quite noticeable, but you could see how each was a critical issue for the company to improve.

In the end, though, the sessions offered and executive comments made it obvious that Appian has had a lot of success within the government sector. This makes sense, as US federal agencies are large, highly regulated entities that would benefit from these tools. Insurance is another sector with similar challenges. These market and process realities shape Appian's go-to-market efforts and reflect the kinds of organizations it targets for new deals.

Private AI is not a military person, but it was a term used frequently by Appian executives. Appian has delineated all of the new AI/ML/LLM capabilities into two camps (i.e., public and private) based on where the training data and processing logic for these tools live. For example, if you want to translate all of your English-language support documentation into Castilian Spanish via a large language model, you might use one of the externally available AI tools to do so. Doing so would mean using a public AI tool and exposing your data to the third party's tool. That third-party tool will get smarter because of its intake of your firm's proprietary data.

If the proprietary data is something of a competitive advantage for your firm, or something that should be held to a high degree of confidentiality, the firm would be better off using its own Private AI tool. There may be other reasons to use a Private AI solution. For example, a planning tool might better understand some financial results if it only uses your own firm's sales data. Your firm's sales may be countercyclical to those of some of your competitors. Since these tools look for patterns within the chosen datasets, getting great data will help ensure better results. Alternatively, poor, confusing data will simply generate low-value results (i.e., garbage in, garbage out).

Appian executives stated their intent to back Private AI solutions for the foreseeable future. That stance, while conservative, will help protect their customers' data, confidences, etc. While some public AI use cases were mentioned, it was always with the caveat that data security would have to be ensured, the risks to people/companies were minimal, and Appian had time to thoroughly vet the solution.

New AI Use Cases were popping up throughout the show and featured in many of the sessions and demos.

Integrator/Partner interest was keen at this show. Most major global service firms had some presence. Accenture was noted for its buildout of innovation factories using Appian technology. RSM was acknowledged for doubling its Appian staff complement over the last year. Many customer presenters were either introduced by or shared speaking duties with their implementation/development partner.

Unique Solutions are the stars of this show.

Shows like this, especially when tech is undergoing a shockwave of introspection and change, are fascinating to attend. All kinds of new ideas and concepts are flying about with varying degrees of stickiness. Some attendees are real short-term thinkers who, regardless of the new AI buzz, are simply looking for an incremental tool to take home with them. Others are looking for the long-range structural changes on the horizon and how these will affect their industry (and not just their firm or a single process). Both kinds of users/buyers were in attendance.

This show also highlighted for me how much both Appian and the process technology space have changed in the last few years. A few years ago, the focus was largely on workflow technology and low-code solutions. This show was about process mining, generative AI and more. Evolution is an interesting animal to study.

Finally, the mood at the show was notable. Energy and enthusiasm among the customers, Appian team members, partners, etc. was high. It'll be interesting to see if Appian can maintain this at next year's event in Washington D.C.

See also: Appian Platform for Process Automation – Low-Code – Process Mining, from May 2021

