A sociotechnical perspective for the future of AI: narratives, inequalities, and human control


Humans have minds that interpret external reality, beyond the mere ability to follow instructions. With a ‘mindful brain’ (Edelman & Mountcastle, 1978) that software based on algorithms cannot have, they also make sense of reality with the aid of frames through which they organize social experience. Social frameworks provide the implicit knowledge needed to understand and move in and out of different situations while ignoring inconsistencies and contradictions, as shown in the innovative sociological work of ethnomethodologists Garfinkel (1967) and Goffman (1974). Not without critics, computer scientists (Marvin Minsky, as well as Roger Schank and Robert Abelson with their work on frames and knowledge structures in the late Seventies) tried to organize ‘social life’ for AI systems. Scripts, plans, and goals are shortcuts for understanding a social situation, providing the implicit background an AI system needs in order to ‘understand’ a ‘social setting’ (Schwartz, 1989).

Far from fully explaining the entanglement of concepts such as society, institutions, agency, and intelligence, it suffices to say that there is a strong need to develop a sociology for AI. Its usefulness lies in the opportunity to examine both the sociological origins and the social impacts of AI.

….

We have identified three core challenges, which we review in this section: (i) the opaque nature of machines; (ii) guaranteeing respect for human agency and control over our autonomous artefacts; and (iii) the link to inequalities, both as input to and output of AI systems.

….

This black-box nature of intelligent systems makes interaction limited and uninformative for the end user. Lacking sufficient information about a system's emergent behaviour, users build inaccurate mental models of it (Wortham et al., 2017), which in turn may lead them to place too much or too little trust in the system (Lee & See, 2004).