Johns Creek, GA gives residents easy data access with Amazon's Alexa
The city wants to "democratize data" by enabling residents to ask Alexa more than 200 questions regarding city operations, services and common inquiries.
"Alexa, ask the City of Johns Creek if I need a building permit."
That command and more than 200 others are now available to help Johns Creek, GA residents easily understand the troves of data that the city collects and wants to make publicly accessible.
The city, with the help of resources from Esri, made the Alexa skill earlier this year as part of its entry for the Amazon Web Services (AWS) City on a Cloud Innovation Challenge. The project won a "best practices" award and received $25,000 in AWS credits, which will allow the city to "do more services or prototypes ... or expand the skill, [and] we don't have to worry about investing any taxpayer money, at least for the year we have those credits," said Johns Creek Chief Data Officer Nick O'Day. "It was pretty exciting when we found out we won."
O'Day spoke with Smart Cities Dive about the Alexa skill and Johns Creek's work to provide residents with access to municipal data.
The following interview has been edited for brevity and clarity.
SMART CITIES DIVE: How did you get the idea for this Alexa skill?
NICK O'DAY: We are a small city in Georgia and we have a lot of data that we've curated for a really long time. Not just geographic, like streets and addresses and zoning and things, but all kinds of other data that we collect as part of regular city operations. That stuff ranges from building permits to police calls to fire calls — anything you can think of that a city would do — we have information that tracks when it happened, where it happened, that kind of stuff.
Two years ago ... we were seeing that other cities were releasing that to the public through open data portals. We followed suit and built our open data portal, the DataHub. Inside of the DataHub we have all kinds of different dashboards and ways of looking at the data so that people can get an understanding. But what we realized is the only people who were really able to get any kind of meaning out of it easily are people who are used to dealing with data and have an understanding of what open data really means.
I was at a conference put on by Esri and during a segment this guy from Esri's R&D arm was talking about building Alexa skills that tied in to open data ... They said nobody has used this yet but this is where they're thinking things will go. I saw that and thought that would be an excellent way to put data into the hands of more people. Not force people to go through a dashboard or map or some kind of website, but to make it as easy as asking a question and getting an answer and moving on.
How does this skill work from the city's perspective?
O'DAY: It's a three-step process. There's the skill itself, backed by Alexa's text-to-speech and the underlying AI. There's the Lambda function, which is really the code we built that interacts between the open data portal and Alexa. And then there's the open data portal.
Our open data portal has an API, which gives tools and developers a variety of ways to connect to the data itself in an automated way. At least once a week, we at the city update the data in the portal on DataHub ... and there's some code that lives on AWS inside what's called a Lambda function.
All the code in the Lambda function does is translate between the skill and the open data portal. It's the brains: when somebody asks a question, Alexa figures out what you mean and sends that to the function. The function then goes, "OK, this is the question the person is asking, this is the layer in the open data portal that I need to find and query." Once it gets the answer, it sends that back to Alexa, which translates it into something a human can understand.
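The translation layer O'Day describes can be sketched as a small Lambda handler: map an intent slot to an open data portal query, then turn the result into a spoken answer. The endpoint, layer, field and intent names below are hypothetical stand-ins, not Johns Creek's actual schema or code.

```python
# Sketch of a Lambda function that translates between an Alexa skill
# and an open data portal. Endpoint and field names are illustrative.
from urllib.parse import urlencode

# Hypothetical open data portal query endpoint (e.g. an ArcGIS feature layer).
PORTAL_URL = "https://data.example.gov/api/zoning/query"

def build_query(address):
    """Translate an intent slot value into an open data portal query URL."""
    params = {
        "where": f"ADDRESS = '{address}'",
        "outFields": "ZONING_CODE",
        "f": "json",
    }
    return PORTAL_URL + "?" + urlencode(params)

def to_speech(record):
    """Translate a raw portal record into something a human can understand."""
    if not record:
        return "I couldn't find a zoning record for that address."
    return f"That property is zoned {record['ZONING_CODE']}."

def lambda_handler(event, context, fetch=None):
    """Entry point Alexa invokes; `fetch` is injectable so the
    portal call can be stubbed out for local testing."""
    slots = event["request"]["intent"]["slots"]
    address = slots["Address"]["value"]
    record = fetch(build_query(address)) if fetch else None
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": to_speech(record)},
            "shouldEndSession": True,
        },
    }
```

In a real deployment, `fetch` would be an HTTP call to the portal's API; keeping it injectable makes the translation logic testable without a network connection.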
Do you have to anticipate all the questions people will ask and build it into the Alexa function?
O'DAY: Yes, we had to go through and anticipate what people were going to ask and build in those questions. You build in what's called an intent, which is the Alexa framework's way of representing a question. Underneath the intent you have utterances, which are different ways of saying the same thing, kind of like keywords.
[For example], we've got a layer all about zoning. Take a couple of different ways that people ask about zoning cases. One way is, "What's the zoning of X-Y-Z property?" Another would be, "Tell me what the zoning is of the property located at X-Y-Z address." We had to anticipate what people were going to ask, program those questions in along with the different ways somebody could ask them, and then Alexa takes over from there.
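In the Alexa framework, an intent and its utterances are declared in the skill's interaction model. A sketch of what the zoning intent above might look like, expressed here as a Python dict mirroring the JSON Alexa expects (the intent and slot names are illustrative, not the skill's actual model):

```python
# One intent from a hypothetical Alexa interaction model. The {Address}
# slot is filled in by Alexa from whatever the user says in its place.
zoning_intent = {
    "name": "GetZoningIntent",
    "slots": [
        {"name": "Address", "type": "AMAZON.PostalAddress"},
    ],
    # Utterances: different ways of asking the same question.
    "samples": [
        "what's the zoning of {Address}",
        "tell me the zoning of the property located at {Address}",
        "what is {Address} zoned",
    ],
}
```

Each sample utterance maps the same underlying question to the one intent, which is what lets Alexa hand a single, normalized request to the Lambda function regardless of phrasing.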
The way we figured out what questions to build into the skill was a combination of asking our receptionists, the people who actually handle most of the calls that come in from the public, and also the analytics from our website, what pages people search for and what pages are hit the most.
How easy is it for the public to use this skill?
O'DAY: It's really simple. They have an Echo device, a smart speaker or something like that, and all they have to do is enable the skill, which is basically like downloading an app to your phone. Once the skill is enabled on your device, you can just say, "Alexa, ask the City of Johns Creek..." and follow up with whatever question you want answered.
By saying "Alexa," that awakens the device, and then by saying, "Ask the City of Johns Creek," it tells Alexa which skill you want her to use. That's it. It's super simple on the user side. They just have to have that beginning part of the question so Alexa knows to listen to the user to use to connect the question with a response.
What value does this skill bring to residents?
O'DAY: Since it's so simple to ask a question and get an answer, the barrier to understanding open data has dropped to zero. There's nothing that you have to know about using a web application or how the skill is constructed. You just need to know how to ask Alexa a question and you're done.
Our skill has a little over 200 different questions you can ask. About 50 of those are tied directly to open data, and the other 140 are from the FAQs on our website. You can ask things about data but you can also ask, "Where is Newtown Park?" Or, "Do I need a building permit?" We've also tied it into our calendar and our HR system. So you can ask about common things, but also when is the next city council meeting, what activities are planned for this weekend and what positions are open — that's one of the big ones people search our website for.
"It's about trying to meet people where they are instead of trying to force them into learning some kind of interface."
Nick O'Day, Chief Data Officer, Johns Creek, GA
Since the skill is always on and always available, when City Hall is closed overnight from 5 p.m. to 8 a.m., if you have a question you won't have to leave a voicemail ... and wait for somebody to call back. You can just ask the Alexa skill.
It's about trying to meet people where they are instead of trying to force them into learning some kind of interface. If you're a data scientist or IT person, looking at a website or navigating an API is commonplace. But for the vast majority of people that's a challenge, and it takes some serious time investment. We're hoping that through releasing these types of technologies and others we're looking into, we can reach more people and help folks more quickly and easily.
We're really trying to democratize the data as much as we can, to absolutely make it as simple as possible for people to ask questions and get answers based on real data rather than people assuming things or using their own intuition.
Do you have plans to expand the skill?
O'DAY: Absolutely ... We're looking into other AWS technology with automated call centers where you can connect a lot of the data into that flow. So when folks call in overnight when nobody is here, hopefully even if they don't have an Echo device, they'll be able to interact with the skill and get more answers. It's a matter of us finding ways of pushing the data we have through the skill to more and more people.
There's a lot of different workflows where you could potentially deploy these kinds of intelligent tools. It's not a matter of trying to cut humans out of the loop or deal with citizens in an adversarial way; it's more about figuring out the best resource to deploy so that when a citizen calls in, we have them on the phone for the shortest amount of time needed to get their answer. There's nothing more annoying than calling a city government and being on the phone for half an hour, bounced around from department to department.
The beauty of the Alexa skill is once you build a question and connect it to the right database, it basically knows anything. It can be a subject matter expert in a bunch of different departments whereas there aren't that many people, even a receptionist, who can keep track of all that kind of information.
We're still experimenting with a lot of this stuff because it's new in the government space. ... And if we didn't have the skeleton of the project from [Esri and Amazon], we wouldn't have been able to do it on our own.
What we're seeing over time is that we're saving about 10 hours of staff time every month on answering people's questions through the skill rather than forcing them to call in. For a small city like us, of 88,000, that's not half bad. We're still trying to get the word out that the skill is there and people can use it, so we anticipate that the [staff hours saved] will go up in the future.
Follow Katie Pyzyk on Twitter