Earlier this month, I took part in what was at least my 11th TIG conference - now renamed LSC Innovations in Technology Conference or ITCon (but it will always be "TIG" to me). During one session, I talked about Illinois Legal Aid Online's Victims of Crime portal and why we framed it around survivor stories. The stories tested incredibly well in user studies we conducted with crime victims. Why? Because the stories put a face to the individual survivor experience - they helped crime victims to feel less isolated and offered them validation and even hope.
It got me thinking about how so many of the sessions I went to at the conference were really user experience (UX) sessions, just disguised under topics and titles on artificial intelligence, conversational interfaces, structured data, and mobile-first design.
On artificial intelligence and conversational interfaces...
I attended IV Ashton and Abhijeet Chavan’s session on artificial intelligence (AI) and participated in the pre-conference hackathon, working on a project to integrate IV’s Houston.AI problem classifier with Twilio. I’ve been fascinated by the idea of AI since I was a kid, when WarGames came out and Matthew Broderick had to get a computer to learn that the only winning move in nuclear war (or tic-tac-toe) is not to play. Now, so many years later, what strikes me is not how a computer can learn, but how that smart computer can change the way we interact with machines.
I remember a session at last year’s TIG conference where Joyce Raby talked about user trust in systems - how fully online systems rated better than in-person and hybrid systems, and how what users really want is to feel that they were heard, regardless of the outcome. Filling out a web form that makes you pick your legal issue does not lend itself well to letting the user tell their story. A freeform text box that lets someone tell their story in their own words - with a system behind it that ferrets out the actual (or most likely) legal issue - does exactly that.
When they started talking about conversational interfaces … can there be anything that feels more natural than that yet still involves a machine? Instead of a big empty text box on a web page, conversational interfaces, backed by natural language processing, can allow a person to tell their story in their own words to something (a chatbot, for instance) that is actively listening, all the while translating that story into the bits and bytes a computer needs to understand.
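To make the "ferreting out" step concrete, here is a toy sketch of mapping a freeform story to a legal issue. This is not how Houston.AI or any real classifier works - production systems use trained NLP models - it is just a keyword-matching illustration, and the issue categories and keywords are made up for the example:

```python
# Toy issue classifier: counts keyword hits per category.
# Real systems use trained NLP models; this is only an illustration.

ISSUE_KEYWORDS = {
    "housing": ["evict", "landlord", "lease", "rent"],
    "family": ["divorce", "custody", "child support"],
    "consumer": ["debt", "collector", "repossess"],
}

def classify_story(story: str) -> str:
    """Return the most likely legal issue for a freeform story."""
    text = story.lower()
    scores = {
        issue: sum(text.count(word) for word in words)
        for issue, words in ISSUE_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(classify_story("My landlord is trying to evict me over late rent."))
# → housing
```

The point of the sketch is the shape of the interaction: the user writes freely, and the mapping to a legal taxonomy happens behind the scenes instead of being forced on the user up front.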
On mobile-first design...
I also attended the mobile-first design session. It almost feels outdated to me, given how ingrained mobile use is, and how much of our website’s traffic comes from mobile (about 52%). Except it’s not. There are still plenty of websites that aren’t mobile-friendly. And it’s easy, as I sit in front of my 32-inch 4K monitor prototyping new website features, to forget to consider how a new feature will work on mobile, just expecting it to translate well within our existing theme. Occasionally that works, but, more often than not, something’s not quite right. Improving our design so that it delivers an excellent user experience for both desktop and mobile users is something we need to keep working on.
Also at that session, we talked a bit about accessibility. It was pointed out that statewide legal websites aren’t required to comply with Section 508, and that full compliance with WCAG 2.0 may be too difficult. My take is that we need to be as compliant with WCAG as possible. It’s true that the AAA standards - the highest - are difficult to meet in some areas, like video, but others, such as ensuring a sufficient contrast ratio between text and background colors, are easier to meet.
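The contrast-ratio check is a good example of how mechanical some of these standards are. The formula and thresholds below come straight from WCAG 2.0 (4.5:1 for AA, 7:1 for AAA, for normal-size text); the color values are just examples:

```python
# WCAG 2.0 contrast ratio between two sRGB colors.
# AA requires >= 4.5:1 and AAA requires >= 7:1 for normal-size text.

def _linearize(channel: int) -> float:
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    lighter = max(relative_luminance(fg), relative_luminance(bg))
    darker = min(relative_luminance(fg), relative_luminance(bg))
    return (lighter + 0.05) / (darker + 0.05)

ratio = contrast_ratio((0, 0, 0), (255, 255, 255))  # black on white
print(round(ratio, 1))  # → 21.0, well above the 7:1 AAA threshold
```

A check like this can run in an automated test suite, which is exactly why contrast is one of the easier standards to meet and keep meeting.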
ILAO is compliant with all A standards, most AA standards, and some AAA standards. OpenAdvocate is compliant with all A and AA standards (per Abhijeet Chavan). A and AA standards are not difficult to achieve, and we owe it to our users to strive for full accessibility. And it’s not just about people with disabilities. As Laura Kalbag points out in her article “Why bother with accessibility?”, users in low light, in bright light, in the rain, with intermittent internet connections, and those who are easily distracted or have cognitive impairments all benefit from every accessibility standard we meet.
On structured data...
I also attended Margaret Hagan and Abhijeet’s session on structured data. Structured data has been around for a while; it is hidden markup that human users don’t see but that machines do (like those at Google, Facebook, and Twitter). Doesn’t sound that exciting, right? But it improves the user experience by making online information more findable and more consistent. Sharing content from IllinoisLegalAid.org to Facebook or Twitter results in a consistent look and feel because it includes tags to tell those sites where our title, description, and images are for each article.
Looking at Schema.org, which provides extended metadata for specific types of information, there are opportunities to make our information more accessible across platforms. Today, if you type a disease like COPD into Google, you get instant answers. Imagine doing that for your legal problems, too.
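Schema.org metadata is usually embedded as a JSON-LD block in the page. The sketch below uses the real Schema.org `Article` type and its standard properties, but the headline, topic, and publisher values are placeholders I invented, not actual ILAO markup:

```python
import json

# Sketch: Schema.org JSON-LD for a legal self-help article.
# "Article" and its properties are real Schema.org vocabulary;
# the values are illustrative placeholders.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to respond to an eviction notice",
    "about": {"@type": "Thing", "name": "Eviction"},
    "publisher": {
        "@type": "Organization",
        "name": "Illinois Legal Aid Online",
    },
}

# A page would embed this inside <script type="application/ld+json">.
print(json.dumps(article, indent=2))
```

Search engines read blocks like this to understand what a page is about, which is the machinery that could someday power instant answers for legal questions the way it already does for medical ones.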
I left the conference with my head swimming in new thoughts and more things to add to my never-ending to-do list. At the top of that list is to take a step back from all of the grant commitments, work in progress, and user stories that are cluttering up both my planner and my mind, and do a better job of embracing UX by learning more about our users - and then better institutionalizing that knowledge within ILAO and within our product management processes. I’ll share back on that in a future blog post.