Could techcomm help bridge the accountability gap in AI?

As artificial intelligence continues to move into our everyday lives, Firehead will be focusing this year on the challenges ahead for the workers of the future. What skills will be needed? How is the jobs market changing? What does it mean for those working in the field of digital communications, techcomm and content?

Last year it seemed everyone was talking about AI, machine learning, neural networks and data science. In technical communication, the conversation focused on adapting to AI and its changing information needs through Information 4.0.

The talking will continue in 2019, but more and more of our clients will be looking into the practicalities of implementing enterprise AI. It should be said that, although the world’s biggest companies are already starting to exploit AI’s potential, it is still early days for the vast majority of businesses. There are still many barriers to entry, including cost, skills, market and platform readiness, and ethical and accountability challenges. And, of course, there is the huge talent gap in the field of AI.

For those already working in digital communication, we’ve always said that technical communicators are among those ideally positioned to transition into AI-related jobs, helping people to understand the technical elements of products and services, and especially the human-machine interface.

At this point in AI’s development and use, we can see that techcomm professionals could also be employed to bridge the accountability gap by providing technical documentation for AI systems. Transparency and accountability will help make AI’s automation more socially and politically acceptable, and bring it into the mainstream. This is a big step, not to be underestimated by keen early adopters.

For example, according to a report published last month by the AI Now Institute in New York, accountability is one of the biggest challenges to resolve:

  • Who is responsible when AI systems harm us?
  • How do we understand and remedy these harms?
  • Where are the points of intervention?
  • What research/regulation is needed to make intervention effective?

“The lack of meaningful accountability and oversight – including basic safeguards of responsibility, liability, and due process – is an increasingly urgent concern,” it says. “Companies need to ensure that their AI infrastructures can be understood from ‘nose to tail’, including their ultimate application and use.” This is even more important for those businesses that create AI and automated decision systems for use in the public sector, where the ‘black box effect’ of secret or opaque systems makes assessing bias or contesting decisions and errors impossible.

At university level, the AI Now report recommends that AI programmes should expand beyond computer science and engineering disciplines: “Expanding the disciplinary orientation of AI research will ensure deeper attention to social contexts, and more focus on potential hazards when these systems are applied to human populations.”

At Firehead, we as content people and information providers recognise the vital need for involvement by other disciplines. We are currently working with several European universities to develop and deepen their curricula to prepare students for the new challenges of machine-human content development. We’re calling this “a new information delivery model for modern contexts, making both legacy and new content machine-readable data for human consumption”. (As you can see, we’re in dire need of a sexier description.)

We’ll be posting more on the jobs market as it relates to AI, machine learning, neural networks, data science and Information 4.0 in the coming months.

You can read the 2018 AI Now Institute report and its list of 10 recommendations for governments, researchers, and industry practitioners here: After a Year of Tech Scandals, Our 10 Recommendations for AI.


CJ Walker
