The Clockwork Penguin

Daniel Binns is a media theorist and filmmaker tinkering with the weird edges of technology, storytelling, and screen culture. He is the author of Material Media-Making in the Digital Age and currently writes about posthuman poetics, glitchy machines, and speculative media worlds.

Tag: history

  • This algorithmic moment

    Generated by Leonardo AI; prompts by me.

    So much of what I’m being fed at the moment concerns the recent wave of AI. While we are seeing something of a plateauing of the hype cycle, I think (/hope), it’s still very present as an issue, a question, an opportunity, a hope, a fear, a concept. I’ll resist my usual impulse to historicise this last year or two of innovation within the context of AI research, which for decades was popularly mocked and institutionally underfunded; I’ll also resist the even stronger impulse to situate AI within the broader milieu of technology, history, media, and society, which is, apparently, my actual day job.

    What I’ll do instead is drop the phrase algorithmic moment, which is what I’ve been trying to explore, define, and work through over the last 18 months. I’m heading back to work next week after an extended period of leave, so this seems as good a way as any of getting my head back into some of the research I left to one side for a while.

    The algorithmic moment is what we’re in at the moment. It’s the current AI bubble, hype cycle, growth spurt, whatever you want to call this wave (some have dubbed it the AI spring or boom, to distinguish it from the various AI winters of the last century¹). In trying to bracket it off with concrete dates, I’ve settled more or less on the emergence of the GPT-3 beta in 2020. Of course, OpenAI and other AI innovators predated this, but it was GPT-3 and its children, ChatGPT and DALL-E 2, that really propelled discussions of AI and its possibilities and challenges into the mainstream.

    This also means that much of this moment is swept up with the COVID pandemic. While online life had bled into the real world in interesting ways pre-2020, it was really that year, during urban lockdowns, family Zooms, working from home, and a deeply felt global trauma, that online and off felt one and the same. AI innovators capitalised on the moment, seizing capital (financial and cultural) in order to promise a remote revolution built on AI and its now-shunned siblings in discourse, web3 and NFTs.

    How AI plugs into the web as a system is a further consideration — prior to this current boom, AI datasets in research were often closed. But OpenAI and its contemporaries used the internet itself as their dataset. All of humanity’s knowledge, writing, ideas, artistic output, fears, hopes, dreams, scraped and plugged into an algorithm, to then be analysed, searched, filtered, reworked at will by anyone.

    The downfall of FTX and the trial of Sam Bankman-Fried more or less sounded the death knell of NFTs as the Next Big Thing, if not of web3 as a broader notion to be deployed across open-source, federated applications. And as NFTs slowly left the tech conversation, as that hype cycle started falling, the AI boom filled the void, such that one can hardly log on to a tech news site or half of the most popular Substacks without seeing a diatribe or puff piece (not unlike this very blog post) about the latest development.

    ChatGPT has become a hit productivity tool, as well as a boon to students, authors, copywriters and content creators the world over. AI is a headache for many teachers and academics, who often fail to grasp not only its actual power and operations, but also how to usefully and constructively implement the technology in class activities and assessment. DALL-E, Midjourney and the like remain controversial phenomena in art and creative communities, where some hail them as invaluable aids, and others debate their ethics and value.

    As with all previous revolutions, the dust will settle on this one too. The research and innovation will continue as they always have, but out of the limelight and away from the headlines. It feels currently like we cannot keep up, that it’s all happening too fast, that if only we slowed down and thought about things, we could try and understand how we’ll be impacted, how everything might change. At the risk of historicising, exactly like I said I wouldn’t, people thought the same of the printing press, the aeroplane, and the computer. In 2003, Andrew Murphie and John Potts were trying to capture the flux and flow and tension and release of culture and technology. They were grappling in particular with the widespread adoption of the internet, and with how to bring that into line with other systems and theories of community and communication. Jean-François Lyotard had said that new communications networks functioned largely on “language games” between machines and humans. Building on this idea, Murphie and Potts suggested that the information economy “needs us to make unexpected ‘moves’ in these games or it will wind down through a kind of natural attrition. [The information economy] feeds on new patterns and in the process sets up a kind of freedom of movement within it in order to gain access to the new.”²

    The information economy has given way, now, to the platform economy. It might be easy, then, to think that the internet is dead and decaying or, at least, kind of withering or atrophying. Similarly, it can be even easier to think that in this locked-down, walled-off, platform- and app-based existence where online and offline are more or less congruent, we are without control. I’ve been dropping breadcrumbs over these last few posts as to how we might resist in some small way, if not to the detriment of the system, then at least to the benefit of our own mental states; and I hope to keep doing this in future posts (and over on Mastodon).

    For me, the above thoughts have been gestating for a long time, but they remain immature, unpolished, unfiltered, which, in its own way, is a form of resistance to the popular image of the opaque black box of algorithmic systems. I am still trying to figure out what to do with them: whether to develop them further into a series of academic articles or a monograph, to just keep posting random bits and bobs here on this site, or to seed them into a creative piece, be it a film, book, or something else entirely. Maybe a little of everything, but I’m in no rush.

    As a postscript, I’m also publishing this here to resist another system, that of academic publishing, which is monolithic, glacial, frustrating, and usually hidden behind a paywall. Anyway, I’m not expecting anyone to read this, much less use or cite it in their work, but better it be here if someone needs it than reserved for a privileged few.

    As a bookend for the AI-generated image that opened the post, I asked Bard for “a cool sign-off for my blog posts about technology, history, and culture” and it offered the following, so here you go…

    Signing off before the robots take over. (Just kidding… maybe.)


    Notes

    1. For an excellent history of AI up to around 1990, I can’t recommend enough AI: The Tumultuous History of the Search for Artificial Intelligence by Daniel Crevier. Crevier has made the book available for download via ResearchGate. ↩︎
    2. Murphie, Andrew, and John Potts. 2003. Culture and Technology. London: Macmillan Education UK, p. 208. https://doi.org/10.1007/978-1-137-08938-0. ↩︎
  • Dunkirk (2017)

    I feel you, er, Jimmy, Timmy, Bobby, whatever your name is.

    David Cox’s no doubt controversial take on Christopher Nolan’s Dunkirk has spurred me to write, and not, as you might think, to leap to the film’s defence.

    I saw Dunkirk on Saturday morning: not in IMAX, as Nolan would have everyone do if he legally could, but at my local cinema, on a normal screen.

    I’d heard similar things to Cox: that Nolan had crafted a perfect war film, a stark and bleak story of suspense and survival. But I, too, left the cinema feeling wholly underwhelmed.

    We had a protracted discussion in my studio yesterday about Interstellar and confection. I avoided discussing Dunkirk too much so as not to ruin my students’ gut reactions, which I consider just as good a mark of a film’s impact as any well-crafted review. But in Interstellar, as in much of Nolan’s recent work, the score is solely responsible for imbuing the image with any meaning. We watched a 15-minute clip where the protagonist leaves his family and launches into space. Apart from the opening tears and family stuff, the rest of the scene is highly procedural, with McConaughey, Hathaway, Bentley et al. floating about the spaceship, flicking switches and checking systems. There are moments of banter, but nothing hugely affecting. The scene is ‘confected’ (an excellent word used by my co-teacher) to feel massively emotional purely through the score. Humorous lines are given a push by lilting string phrases. Little barbs about home are sent into the realm of epic pathos by a booming bass note. It just feels entirely artificial and wrong (comparisons were made to 2001: A Space Odyssey, which pulls off suspense, emotion, and awe much more effectively).

    Dunkirk is equally confected, but in a very different way. Cox mentions the lack of character development and backstory. When I read the early reviews, I came down on Nolan’s side, arguing naively that perhaps there is room both for war films with well-developed characters and for those that are more visceral. After seeing the film, though, I can’t help but agree with those critics, and I would add a couple of criticisms of my own.

    Compared to the over-exposition of previous Nolan stuff, this film has virtually no dialogue. What dialogue there is comes mumbled, or hastily whispered from cover. The dialogue does nothing to explain the characters’ decisions or motivations (exposition), nor does it give a more rounded insight into the characters’ personalities (abstraction/expression). The words just sort of sit there as awkward observations about the characters’ surroundings which, with the film being shot in IMAX, the audience could probably see and figure out for themselves.

    The supposedly ever-present threat from the unnamed enemy comes off as wholly artificial. The enemy is basically represented by a Shepard tone and, because Nolan doesn’t want you to forget that you’re under threat, the Shepard tone never ends. The result is a suspense that is driven not by empathy, or by a feeling of anticipation or fear, but by sheer audience discomfort in the cinema.

    The third and final criticism is that this is a blinkered story. The protagonist, if there is one, is played by Fionn Whitehead, who does an admirable job of injecting some affect into the lifeless husk with which the audience is meant to sympathise. If you are going to focus on someone, though, if the audience is indeed meant to feel what a protagonist is feeling, we need something more than an innocent-looking face. At least for Whitehead’s character — Bobby? Jimmy? What gratingly archetypal British name did they give him in the credits? — the audience needs some hint of a story of home, a family or partner waiting for him, loving parents. We get more of that for the sailor’s son, who dies purely accidentally (‘He always wanted to be in the paper’), than we do for the protagonist. We also get next to no sense of the scale of the evacuation, nor of the role that non-British countries played, both in terms of being evacuated and of fighting off the German threat. Apart from two French soldiers holding the line at the beginning, one blink-and-you-miss-him Dutchman, and the other French soldier pretending to be British, you wouldn’t know that there were not only British, French, and Dutch troops at Dunkirk, but also Canadians, Poles, and Belgians.

    330,000 people made it off that beach. That’s the story of Dunkirk. I’m all for visceral cinematic experiences, for switching off my own life in favour of immersion in a story or experience. But in the case of Dunkirk, to ignore the scale of what may well have become ‘the greatest military disaster in our long history’ — to steal Churchill’s words — does something of a disservice to everyone, of all nations, both home and abroad, who somehow got most of those men and women home. To do that, we need to get to know the people on a level greater than pure affect. We also need to see how great, how enormous, this military achievement was. Somehow Nolan, of all people, failed on both counts.