Appetite for disruption?

When I'm looking for examples of how technology has upended a specific industry, the full-scale disruption of publishing in the 1980s offers up a distinct before-and-after tale.
In the beforetimes there was a host of precise manual skills and procedures involved in the production of any printed publication; typesetting was an art and a science.
In the aftertimes there was the automation of a swathe of specialist processes unleashed by the Apple Mac, Aldus PageMaker and PostScript.
The transition was seismic and brutal and spelled the literal end of hot metal typesetting, for 100 or so years the dominant method of mass-market printing.
The Wapping dispute, which caused widespread demonstrations and unrest during the mid-eighties, exemplified the struggle as people pushed back against looming digitisation and job losses.
At the point I started using PageMaker in the early 1990s such disputes were already fading into memory. I was one of a newish wave of digital natives who only ever knew how to design pages or assemble artwork on the screen of a Mac. Any suggestion that you would use another method seemed peculiar, antiquated, a step back in time.
After I graduated from university I attended job interviews at both The Scotsman and The List magazine and found my skills were already too advanced for their specific and still largely manual set-ups: I simply couldn't do the design and layout roles they were advertising for.
I'm not pitching myself as some wunderkind, but it was evident those publications – both of which survive today in mainly digital form – were going to have to evolve to keep pace with a rapidly transforming landscape.
Whether you bemoan the painful demise of a whole generation of expertise or cheer the 'democratisation' of an industry ripe for disruption, it's hard to argue against technology's specific role in ushering in sizeable, tangible change.
Efficiency gains, shmefficiency shmains
I often find myself yearning for such tangibility when hearing (ad nauseam, on repeat) about the disruptive impact of Generative AI.
When I'm sitting in auditoria, or reading articles, or scrolling through responses to social media posts where I'm being told my entire organisation needs to pivot, or I must adopt an AI-first mindset, or to brace myself for the fourth industrial revolution, I will either glaze over or feel my skin start to prickle – neither a particularly positive physical reaction.
It would be woefully naive to assert that AI isn't going to have an impact on the jobs market and emerging skills requirements. Brian Merchant, author of the superlative Blood in the Machine book (and newsletter of the same name), has been covering the very real consequences being felt as companies make a beeline for the AI promised land.

At the same time, it seems perfectly fair to question the extent, pace-of-change, and types of jobs and skills we're actually talking about.
As Merchant duly notes:
> Too often the pitch for AI is used as an opportunity to roll out bland aphorisms and astonishing-yet-unspecific claims.
Much hubbub surrounds the case of Klarna, for instance, which very publicly went all-in on AI and now seems to be taking a more subdued stance.
Occasionally I'm pleasantly surprised when I learn about a niche example, a Generative AI solution that showcases a genuine step change, but mostly it still feels like a grift. And so many of the promised benefits fall into a bucket of vague efficiency gains and productivity improvements.
Anyone who's used the growing range of large language or multimodal models will likely recognise the potential to improve workflows, automate tasks, and generally cut down on menial chores.
Aside from the odd interesting individual case, how much of it is actually working in practice though? And how much of it scales, and interoperates, and plays nice with the rest of the tools that intertwine with our day-to-day lives?
Efficiency gains are what software vendors and software-as-a-service models have been promising for years, nay decades. It drives me slightly bonkers that practically every technology supplier and platform is pitching efficiency gains made through AI as the win. [Screams into void]: "Haven't we been paying our licence fees for you to have already sussed this out?"
When I attempt to turn on my out-of-office in Outlook, a frustratingly clunky task if ever there was one, I have a wry little smile to myself. Thanks for all the Copilot shizzle Microsoft, couldn't you fix some of your core freakin' functionality first?
A final point on efficiency gains: admittedly it's easy to cherry-pick evidence; however, there are reputable studies suggesting that the opposite of efficiency might be happening. TL;DR: AI isn't "moving the needle" and may even be increasing workload.
Pin the tail
Back to tangibility. Maybe if there wasn't such a gold rush mentality and lots of people high on buzzwords, a clearer picture might materialise. Where are the desktop-publishing-style studies to build the case for transformation? Are we meant to just blindly accept that this stuff is going to live up to its promise?

At the moment, choosing an AI solution feels like a game of pin the tail on the donkey: close your eyes, move approximately in the right direction, and hope for the best. As much as the pressure may be on to leap headlong in, there’s still a huge amount of uncertainty to contend with.
As an organisation you may need to take a punt on one of thousands of emergent solutions – AI startups are big business – while at the same time finding yourself at the mercy of a clutch of big tech players. Alphabet, OpenAI, Meta, Anthropic and Microsoft are either furiously undercutting or trying to outdo one another to woo customers and stake their claim as the dominant AI provider.
While the venture capital money is flowing and the good times roll, doing something on the relative cheap, using technology in a fairly flexible way, is still possible. That won't last though – it never, ever, does.
Anil Dash recently wrote a characteristically excellent post on the phoniness of "AI first", and this paragraph stood out for me:
Usefulness and utility, eh? How novel. If there was more clarity around the actual value AI brings and specific outcomes it delivers, and less overhyped-solutions-looking-for-problems, maybe all this would be a bit more palatable.
But that doesn't feel like it's coming anytime soon. Contrast the vagaries and banality of Jony Ive and Sam Altman's love-in earlier this month with the sharp, focused precision on display at the launch of Jony's masterpiece in 2007. God help us.🙄
Behold the slopfest
One area where you can draw comparisons between publishing's digital revolution and AI is the slop it discharges on an unsuspecting world.
Where PageMaker led, others followed, and soon there was an array of WYSIWYG (what you see is what you get) tools available to every desktop-computer-owner-cum-amateur-designer.
While not as prolific as the AI outputs now flooding every corner of the Internet, the onslaught of printed publications, posters and leaflets unleashed by software with a low barrier to entry (staring hard at you, Microsoft Publisher) ignored the most basic rules of typography and design.
As much as my younger self was incredibly snobby about this, I now gaze back fondly at a more innocent era. Personally, I'd rather look at a terribly designed flyer displaying Comic Sans at a jaunty angle overlaid on an awful piece of clipart than some hyperrealistic AI-generated image or video, devoid of any personality, created purely to generate likes.
Guess I'm still a bit snobby.
Further reading/action:
- This wonderful post by Mike Monteiro just popped up in my inbox: How to survive the weight of an entire industry trying to convince you that you're inadequate
- I really like this Careful Industries blog post and the last two sections ('Got to have fAIth' and 'AI Obscura') are both relevant to, and far pithier than, what I've written above.
- Sign up for this free event featuring Emily M. Bender, Alex Hanna and Karen Hao: Challenging AI Hype and Tech Industry Power
🖥️ Thank you for reading.