Abstract:
Interest in Artificial Intelligence (AI) has never been greater – while we’re at a
likely tipping point in the adoption of AI into mainstream industry, we’re still
grappling with effective ways to manage the ethical concerns that AI surfaces
on a regular basis. The purpose of this study is to shed light on the
types of AI applications being developed in the media industry in South Africa,
to investigate how AI ethics tensions surface and are managed when building these
applications, and to provide recommendations for the management of AI ethics
tensions in media organisations.
Data was collected from respondents on the types of AI applications being developed,
as well as the nature and characteristics of these projects, including the roles
required to staff them, project duration, focus and business objectives,
project outcomes, technologies used, and the source of those technologies.
The study reviewed recent literature on AI ethics, and specifically research into
the roles played by both individuals and the organisations they work for in
managing AI ethics considerations. Using insights from the literature, together
with data collected during the study via a cross-sectional survey,
analysis was performed to determine associations between actions taken in the
management of AI ethics tensions and their perceived outcomes and effectiveness.
Several statistically significant associations, with weak to medium effect sizes,
were noted between the way AI ethics tensions were discovered and managed
during projects and the perceived outcomes and effectiveness of these actions.
These associations potentially have implications for media organisations
that are implementing AI solutions and seeking to manage AI ethics
tensions effectively. Based on these analyses, recommendations are provided
to inform the creation of effective frameworks for managing AI ethics tensions
at media organisations developing AI solutions. Limitations of the study and
areas requiring further research are also discussed.