Whatever your political persuasion, in recent times it seems apparent that things are not quite how they should be. Things are never perfect, of course, but no matter your views, there appears to be a groundswell of opinion that things need to change. And with social media turbocharging our keyboard warrior tendencies, we also need to blame somebody for it. 

But do we need to blame someone… or something? A new book has reopened the debate about complex systems, and how we used to be on the right track to understanding them before getting sidetracked into neoliberal economics, populist politics, and worrying about AI. It turns out, however, that AI might just be the solution we have been looking for all these years. 

A short history of cybernetics 

The book in question is The Unaccountability Machine by Dan Davies, an economist, former banker, and now journalist who has studied economic systems in great detail. The study of systems in general, and of how communication within and beyond them can be used to control them, is known as cybernetics, and Davies’ book goes into considerable detail about how this pursuit began, chiefly through the work of the excellently named British polymath Stafford Beer. 

In the book, Davies identifies the crisis in modern organizations as one where there is no one to blame anymore, as “accountability sinks” have been created to diffuse individual responsibility. This has been a result, according to Davies, of the rise of shareholder-value doctrine, financialization, outsourcing, and organizational complexity, which has stripped decision-makers of any discretion, thus turning the market itself into an accountability sink. 

One stark example of this is our own scholarly communications industry, which Davies says “is an industry where academics compete against one another for the privilege of providing free labor for a profit-making company, which then sells the results back to them at monopoly prices.” He notes that academics hate this, yet the model has persisted. Why? Because of the ‘publish or perish’ culture that has developed, publishers have effectively discovered an accountability sink: publications, and the citations that follow, form an arm’s-length means of performance management, which suits the universities. 

Systems failure 

The lack of accountability here, as in many other organizations, is highly problematic, and Davies points to the global financial collapse of 2008, among other failures, as a consequence of it. The answer, in Davies’ eyes, could be cybernetics, which puts a governing system, feedback loops, and control levers all in play so that when things go wrong, systems can course correct rather than fail. In an age of fast-developing technology powered by unimaginably fast, complex computer chips, it makes sense to take a look at some of these ideas. 
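The core cybernetic idea of course correction can be illustrated with a toy negative-feedback loop. This is a minimal sketch, not anything from Davies or Beer: it assumes a simple proportional controller, and all names and numbers are illustrative.

```python
def feedback_loop(setpoint, state, gain=0.5, steps=20):
    """Toy negative-feedback loop (illustrative only).

    Each cycle, the controller measures the error between the desired
    setpoint and the current state, then applies a proportional
    correction. The system steers back toward the setpoint instead of
    drifting or failing outright.
    """
    history = [state]
    for _ in range(steps):
        error = setpoint - state   # measure the deviation
        state += gain * error      # apply a corrective control action
        history.append(state)
    return history

# Starting far from the target, the state converges toward it.
trajectory = feedback_loop(setpoint=100.0, state=0.0)
```

The point of the sketch is the loop itself: as long as the system can sense its error and act on it, deviations shrink over time, which is exactly the self-correcting behavior cybernetics seeks in organizations.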

The quid pro quo of this solution is that this governing system might be in the form of a black box, the insides of which are too complex to understand, but inputs and outputs can be judged. Sound familiar? Yes, the logical conclusion for academic publishing seems to be that, far from worrying about AI doing a bit of peer review here and there to help editors, it should control the whole academic research enterprise, pointing out to us how the world should be researched and managed. While this might seem to be the stuff of dystopian novels, the result might be a whole new research paradigm where the most important research has the best outcomes for society, with further research encouraged in those areas. Suddenly, that doesn’t seem so bad, does it? 
