The internet did a commendable job of mocking NFTs to death, or at least into remission – major game developers like Ubisoft, who initially expressed interest, have graciously stopped bringing them up – and now some are hoping the same tactics can be used to thwart another trend: the rapidly advancing AI image generators that produce flattering, false portraits of our friends and stills from imaginary David Lynch Warhammer movies.
I think they will be disappointed. AI “art” isn’t going anywhere.
In a sense, NFTs and AI art are polar opposites: NFTs promise that every piece of digital artwork can be a unique and valuable asset, while AI art promises to eradicate the value of digital art by flooding the internet with an endless supply of it. If Jimmy Fallon wants to hoard all those stupid NFT monkey pictures, I don’t think most people would care, but the cheap, fast generation of AI images has made it hard not to see more and more of them. If you’ve used social media in the past year, you’ve seen AI-generated imagery.
And I highly doubt it’s a temporary fad. Where blockchain investing is criticized as pointlessly wasteful, AI art is deplored because it threatens the jobs of illustrators – yet everyone can see the value of a machine that converts words into images. It’s hard to resist giving it a try, even if you don’t like it on principle. If someone tells you they have a machine that can make a picture of anything, how can you not test that claim at least once?
The way we interact with these machine learning algorithms reminds me of the way people tease babies, welcoming any response to new stimuli and pointing to anything that could be taken as a sign that they’ve understood us. When an image generator seems to “get” what we’ve asked for, there’s a pleasantly creepy feeling – it’s hard to believe that a computer program has successfully translated a complex idea like “John Oliver looks fondly at his cabbage and realizes he’s falling in love” into an image, but there it is, unmistakably on the screen before us.
And that’s really what makes AI art so offensive to so many, I think. It’s not just the automation of work, but the automation of creative work, that feels so obscene. Something considered deeply human has turned into a party trick.
The good news and bad news for humanity is that the sleight of hand is easy to spot: image generators won’t do anything unless they’ve been trained on piles of human-made artwork and photographs, and in some cases that’s been done without the permission of the artists whose work was used. Indeed, the popular Lensa AI portrait maker has regularly reproduced illegible signatures: the mutilated remnants of the real artists’ autographs that were fed to it.
An early attempt to save AI art from this criticism is easily dismissed, if you ask me. The claim goes that by scraping online artist portfolios for training material, AI art generators are “just doing what human artists do” by “learning” from existing artworks. Sure, people learn in part by imitating and building on the work of others, but I don’t buy the casual anthropomorphizing of algorithms that sift through millions of images, as if they were living creatures just going through art school really fast. It’s completely premature to attribute human nature to silicon chips just because they can now spit out pictures of cats on demand, even if those pictures occasionally look like they were made by humans.
“I’m cropping this for privacy reasons/because I’m not trying to name anyone personally. These are all Lensa portraits where the mutilated remains of an artist’s signature are still visible. Those are the remnants of the autograph of one of the many artists it stole from.” – December 6, 2022
More than flattering portraits
What’s interesting to me about AI-generated images is that they usually don’t look human-made. One way the inhumanity of machine learning manifests itself is in its lack of self-awareness. AI art generators don’t tear up their failures, or get bored or frustrated by their inability to render hands that could exist in Euclidean space. They can’t judge their own work, at least not in any way a human being can relate to, and that fearlessness leads to startling images: images we’ve never seen before, which some artists use as inspiration.
Rick and Morty creator Justin Roiland toyed with AI art generation when making High on Life, for example, telling Sky News that it helped the development team “come up with weird, funny ideas” and “makes the world feel like a strange alternate universe to our world.”
Image generation is just one way machine learning is used in games, which are already packed with procedural systems like level generators and dynamic animations. A young company called Anything World, for example, uses machine learning to animate 3D animals and other models on the fly. What would a game like No Man’s Sky, whose procedurally generated planets and wildlife no longer feel new after so many galaxy jumps, look like after another decade of machine learning research? What will it be like to play games where NPCs can behave in really unpredictable ways, like “writing” unique songs about our adventures? I think we’ll probably find out. After all, our favorite RPG of 2021 was a “procedural storytelling” game.
“I don’t want Epic to be a company that nips innovation in the bud. Been there too many times on the wrong side. Apple says ‘you can’t make a payment system’ and ‘you can’t make a browser engine’. I don’t want to be the ‘you can’t use AI’ company or the ‘you can’t make AI’ company.” – December 25, 2022
As valid as the ethical concerns may be, the expansion of machine learning into the arts – and everything else humans do – currently looks a bit like the ship crashing into the island at the end of Speed 2: Cruise Control.
Users of art portfolio host ArtStation, which was recently bought by Unreal Engine and Fortnite maker Epic Games, have protested the unauthorized use of their work to train AI algorithms, and Epic added a “NoAI” tag that artists can use to disallow “use of the content by AI systems.” But that doesn’t mean Epic is generally against AI art. According to Epic Games CEO Tim Sweeney, some of his own artists view the technology as “revolutionary,” just as Photoshop once was.
“I don’t want to be the ‘you can’t use AI’ company or the ‘you can’t make AI’ company,” Sweeney said on Twitter. “Many Epic artists are experimenting with AI tools in their hobby projects and see it as revolutionary in the same way as previous stuff like Photoshop, Z-Brush, Substance and Nanite. Hopefully the industry will guide it into a clearer role that supports artists.”
It is of course possible to train these algorithms without gobbling up artwork taken without permission. Maybe there’s a world where artists are paid to train machine learning models, though I don’t know how many artists would think that’s better. All sorts of other fears arise from the widespread use of AI. What biases might popular algorithms have, and how might they influence our perception of the world? How will schools and leagues adapt to the presence of AI-laundered plagiarism?
Machine learning is used in all sorts of other fields, from graphics technology like Nvidia DLSS to self-driving cars and nuclear fusion research, and will only get more powerful from here. Unlike the blockchain revolution we keep waiting for, machine learning represents a real change in the way we understand and interact with computers. The ethical, legal and philosophical quagmire has only just begun to open up: it gets deeper and swampier from here. And our friends’ profile pictures will only get more flattering.