For some years I joked about how natural-language BPMN modeling would work in a process-workshop setting where many people meet to model a workflow collaboratively. I imagined how you could shout at the computer if you disagreed with what your colleague modeled, and it would revert the change. Imagine the dynamics that develop if you have to persuade the computer that your ideas are the better ones. We even planned an April Fools' Day joke about this once, though we never followed up on it. We did, however, present WASDENN just this year. Money quote:
[…] We actively explore voice recognition for future versions, giving you more freedom to interact with the assistant. You can look forward to vocally disagreeing with WASDENN’s proposals, sharing your excitement, or command it to your desire on the voice line.
Jokes aside, this highlights some of the challenges NLP needs to overcome in the BPMN modeling domain before it can replace regular editor interactions:
(1) You need to be as clear and efficient when modeling your diagram in natural language as you are with a proper tool (i.e. the bpmn.io tooling). And which language exactly? Many people do not even know the names of the symbols they are drawing.
(2) Natural language is ambiguous (take the word "or", for example), and one point of BPMN is to formalize things and get rid of most of that ambiguity. It is unclear how NLP will address this. Trying to fix errors, being precise, and still not being understood by an AI can be a smash-your-screen moment. There have been too many puns about this one.
(3) All kinds of assistance and error recovery built into a good editor must be rebuilt in an NL interface. And it has to be done well, or you fail (cf. (2)).
(4) Misunderstandings need to be accounted for (cf. (1), (2)). The potential is for the editor to identify ambiguity and ask for clarification, or to suggest what could be done.
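To make point (4) concrete, here is a minimal, hypothetical sketch of what such clarification could look like: scanning a spoken or typed process description for a known-ambiguous connective ("or") and asking which BPMN gateway the modeler actually means. The function name and word list are illustrative assumptions, not part of any real tool.

```python
import re

# Hypothetical map of ambiguous natural-language connectives to the
# clarification question an NL-driven editor could ask. In BPMN, "or"
# can mean an exclusive (XOR) or an inclusive (OR) gateway.
AMBIGUOUS_WORDS = {
    "or": (
        "Did you mean an exclusive choice (XOR gateway: exactly one path) "
        "or an inclusive choice (OR gateway: one or more paths)?"
    ),
}

def clarifications(utterance: str) -> list[str]:
    """Return clarification questions for ambiguous words in the utterance."""
    questions = []
    for word, question in AMBIGUOUS_WORDS.items():
        # match whole words only, case-insensitively
        if re.search(rf"\b{word}\b", utterance, flags=re.IGNORECASE):
            questions.append(question)
    return questions

for q in clarifications("Ship the order or send a reminder"):
    print(q)
```

A real implementation would of course need far more than keyword spotting, but the interaction pattern (detect ambiguity, ask back) is the part that matters.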
From the editor-builder perspective, I firmly believe that we are very, very early in the NLP technology adoption cycle. I also believe NLP in modeling will change the way we build our editors substantially (cf. (1) and (4)).
I do see NLP having a place where people cannot interact with the tool in the normal manner, e.g. because they are impaired (visually and/or input-wise). I recently saw an inspiring tweet on how a blind person uses the iPhone. For blind people specifically, you would need language output, too, which is a challenge of its own. Thinking in this direction, NLP could open the door to BPMN editing for more people.
I think it will take a long time until NLP replaces keyboard and mouse interactions in a well-crafted BPMN editor. I’m happy to be proven wrong, of course.