
Chatbot security, privacy caveats creating gaps in industry adoption

Lauren Kunze, CEO of Pandorabots, a platform for building and deploying chatbots, predicts that chatbots will eventually touch every vertical. But organizations in industries such as healthcare and finance first have to overcome significant chatbot security and privacy hurdles before moving forward with this nascent technology. At the recent Chatbots & Virtual Assistants for the Enterprise event in San Francisco, Kunze discussed how customer oversharing and data usage questions are challenging early chatbot adopters -- and making it a tough sell for certain industries.

What chatbot security and privacy concerns are organizations dealing with today?

Lauren Kunze: I think the top three industries [dealing with chatbot security and privacy concerns] are healthcare, fintech, and anything having to do with children or Children's Online Privacy Protection Act regulations -- which is just a very long process. In any large organization, you're often going through exhaustive security reviews as part of the measurement system analysis process. That's true for any large company, just in terms of the development timeline and sales cycles.


In those organizations and sectors specifically, one of the big problems is just the type of things that people volunteer in chat. If you look at all of our data -- and we're talking about six billion messages -- regardless of the use case, people will volunteer highly personal information and personally identifiable information. That includes their address and their credit card numbers, even in cases where a transaction isn't material to that particular topic.
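Because users volunteer payment card numbers unprompted, chatbot platforms often scrub such data from transcripts before storage. As an illustrative sketch only (the names and regex here are assumptions, not Pandorabots' actual pipeline), a minimal masking pass might look like:

```python
import re

# Matches 13-16 digit runs, optionally separated by spaces or hyphens,
# which is roughly the shape of a payment card number. A real pipeline
# would use far more robust PII detection than a single regex.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def mask_card_numbers(message: str) -> str:
    """Replace anything resembling a payment card number with a placeholder
    before the chat message is logged or stored."""
    return CARD_PATTERN.sub("[REDACTED CARD]", message)

print(mask_card_numbers("My card is 4111 1111 1111 1111, can you bill that?"))
```

In practice this kind of redaction would run before persistence, so that retention policies (such as the few-month deletion windows Kunze mentions) apply only to already-scrubbed transcripts.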

I think we're going to see a lot of emerging dialogue around privacy and security that we have not seen to date, just in terms of how that data gets stored. With larger enterprises that we work with, sometimes we carve out specific parameters around how we can use the data -- do we delete the data after a few months, or do we have to run on HIPAA-compliant servers in some cases? It's something to be mindful of that's going to be a big issue going forward, just because it's a new paradigm and conversation is the most intimate sphere of how we interact.

Do you think those security and privacy concerns will slow the adoption of chatbots in the industries that you mentioned?

Kunze: In terms of the professional services projects that we take on, we actually have shied away from the verticals that I mentioned -- healthcare, fintech, anything involving children -- for security and privacy reasons, because they're so highly regulated and can be very litigious. We've been focused on the most mature part of the market, which I think is conversational commerce and marketing, just in terms of where the budgets are, where the appetites are ripe to develop and where the real business results are today. That said, the security and privacy concerns will not stop the chatbot industry from touching every vertical. All the largest companies in the world are investing in this; they have huge AI teams and conversational user interface teams. The trend is definitely coming, and it will herald a whole new set of complications and concerns on the chatbot security and privacy front -- ones that we will be forced to navigate.
