Digital designers and developers tend to have little interest in attempting to fix any social, political or environmental problems that might arise from the implementation of their products. As Cameron Tonkinwise (2018) puts it, “designers rarely take responsibility for the end-lives of what they do design”. Indeed, from a design perspective, taking time out to consider these issues is seen as an additional burden – something that would incur more time, additional costs and fundamentally “slow innovation down” (Tonkinwise 2018).

So how might things be different? One suggestion brewing within the field of Human Computer Interaction (HCI) is adding a societal ‘redesign’ dimension to the remit of digital designers and developers – extending the accepted logics of debugging technical problems that arise after a technology has begun to be used. While debugging, patching and updating usually relate to resolving coding errors within computer software and systems, Vishal Sharma and colleagues (2023) propose that software developers and HCI practitioners might take responsibility for developing incremental ‘patches’ to any digital technology – small adjustments and amendments that seek to remedy (or at least address) social and environmental harms as and when they arise. These patches might be technical (e.g. involving new coding) or regulatory (e.g. involving the development of new policies, protocols, regulations, or terms of service):

“Patching is an effort to continuously adapt and reassess interventions to enhance democratic, transparent, and accessible practices. Problems can be ‘re-solved over and over again’ as we keep re-learning the implications of our technologies. Redesigning better technologies is also HCI” (Sharma et al. 2023).

Sharma and colleagues propose that this form of retrospective ‘social patching’ would involve reframing hardware and software development as a process that continues long beyond the launch of any technology. Technology designers and developers would remain responsible for stewarding the social, political and environmental consequences of a product over its lifetime. As Sharma and colleagues note, this is a bold inversion of current IT industry mantras of ‘move fast and break things’ and ‘fail fast, fail often’.

Of course, the tech industry is a long way from taking responsibility for the social, political and environmental harms that arise from its work. However, we might also consider the more radical proposition of encouraging a culture of ‘undesigning’ technologies – what James Pierce (2012) describes as “negating technology by design”. This would involve expanding the job of digital designers and developers to take on a number of different forms of post-production stewardship of existing technologies – thereby shifting the traditional professional mindset that ‘innovation’ occurs solely through the design of new products.

For example, one ‘undesigning’ priority would be working to completely eliminate technologies that are agreed to be wholly destructive and harmful to our collective good – in short, working to design egregious hardware and software out of existence. It might be possible, for example, to design built environments that feature awkward lines of sight or lighting levels that render facial recognition and other smart surveillance technologies unusable. It might be possible to develop computer servers that run in ways that significantly disrupt wasteful digital practices such as Bitcoin mining. Pierce (2012) refers to this as designed erasure – i.e. technology designers and developers working to eliminate particular digital technologies from existence altogether.

Alternatively, technology designers and developers might also work to design limitations and constraints into generally undesirable technologies – imposing caps on data use, decelerating processing speeds, and generally restricting technologies to levels of consumption that are sustainable and acceptable. Pierce (2012) refers to this as designed inhibition – i.e. design that aims to hinder or prevent the use of technology in particular ways that are considered harmful, and in contexts that are especially vulnerable to these harms.

These are not wholly unfamiliar ways of working within the design community (see Papanek’s 1995 calls for ecologically responsible design). In particular, such approaches hark back to 1990s ideas of ‘elimination design’, where designers worked to eliminate unsustainable technologies such as ‘gas-guzzling’ vehicles or toxic building materials such as asbestos (see Fry 2005). Reviving the same approach to stamp out socially undesirable and environmentally harmful digital technologies seems eminently sensible.

Of course, encouraging the ‘undesign’ of digital technologies would require a complete shake-up of current tech industry practices. For example, technology developers would have to become comfortable with making regular decisions to not design potentially lucrative new products and accepting that current instances of a technology are perfectly adequate and in no need of upgrade or marginal improvement. 

This would also involve a renewed focus on what Tonkinwise (2018) terms ‘de-progressive design’ – designers and developers working to reclaim, rediscover, and reappropriate past ways of doing things that were previously pushed aside (or even destroyed) by the ‘progressive’ design of digital technologies. This might involve the revitalisation of ‘standalone’ digital products that are not networked and not reliant on the cloud. It might also involve developing new versions of previous analogue products and processes that are considerably less resource-intensive and socially harmful than their digital equivalents.

All told, in a new era of digital degrowth, these ideas of redesign and undesign point to how technology development and design might be reimagined as a process of slowing down and defanging the digital excesses of the past few decades. Rather than being put out of a job, a transition to digital degrowth would mean that there would still be plenty of work for technology developers to be getting on with.

REFERENCES

Fry, T. (2005). Elimination by design. Design Philosophy Papers, 3(2), 145-147

Papanek, V. (1995). The green imperative: ecology and ethics in design and architecture. Thames & Hudson

Pierce, J. (2012). Undesigning technology: considering the negation of design by design. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 957-966)

Sharma, V., Kumar, N. and Nardi, B. (2023). Post-growth Human–Computer Interaction. ACM Trans. Comput.-Hum. Interact. https://doi.org/10.1145/3624981

Tonkinwise, C. (2018). I prefer not to: anti-progressive designing. In Coombs, G., McNamara, A. & Sade, G. (Eds.) (2019) Undesign: critical practices at the intersection of art and design (pp. 74-84). Routledge