Artificial intelligence (AI) has been the talk of 2023. Within the NDIA community, we have early adopters, technologists who dove in headfirst, those who have only touched the tip of the AI iceberg, people who have read hundreds of ethics concerns and are fearful, and a lot of us who are in between or might not even know where to start.
Wherever you are in your AI journey, it’s time for a digital inclusion community discussion. What we know for sure is that AI will either create a new digital divide or deepen the one we already have.
As a community, we know technology alone will not solve the digital divide. Humans are essential to digital inclusion, to help introduce emerging technologies and guide the use of current technologies.
The NDIA community also knows, intimately, the harms that technology can cause and the harms of not having or not understanding technology. That’s what we want to avoid.
To jump-start the discussion, I have some overarching thoughts about AI and digital inclusion:
1. Digital Skills Training Programs Are Essential for People to Safely and Confidently Use AI
Teaching how to use AI tools and how to be safe around AI is a natural addition for digital inclusion skills programs. If an NDIA Affiliate is teaching someone how to use a browser, they’ll also teach them how to use a generative AI tool, such as ChatGPT or Google’s Bard. Just as digital inclusion instructors teach community members how to be safe by identifying bogus websites and phishing, they’ll also have an essential role in teaching others how to verify content provided by a generative AI tool.
2. Grassroots-Based Advocacy Supports Responsible, Meaningful Regulation
Digital inclusion practitioners are experts who should be guiding tech equity policy impacting their communities. They can represent community voices by advocating for local AI use that will benefit their communities, with special attention on avoiding potential bias and harm from AI tools. The more confident the NDIA community is with AI, the more confident they will be to engage in AI policy discussions.
3. Successful Digital Navigation Services Are Built on Trust
AI tools are already being created to support digital navigators. These tools will only be successful if used alongside human digital navigators. Future digital navigation AI tools will need to be developed carefully to guide human digital navigators ethically and accurately. All of that depends on deep involvement from community digital inclusion practitioners and digital navigators – the experts on community needs. These humans – experts from the digital inclusion community – need to be part of the whole process, from development, to equity and inclusion testing, to feedback loops for continuous improvement, to ongoing user support.
Continuing the AI Conversation with You – the Digital Inclusion Community
NDIA will help guide the community’s learning, start the conversation, and gather your thoughts about AI and digital inclusion. Here are some upcoming opportunities to join us:
- View the recording from a Special Webinar: AI 101 & a Conversation About AI in Digital Inclusion Work – Tuesday, January 9, 2-3:30 p.m. EST – Ernst & Young will present information about AI – what is it, how can it be useful, and how can it cause harm? NDIA will then facilitate a conversation around how AI might be integrated into digital inclusion programs.
- Discussion about AI at Net Inclusion in Philadelphia – Wednesday, February 14, 2 p.m. EST – AI & Digital Equity: The Good, the Bad, & the Ugly, a breakout session led by Angela Siefer.