Creativity · 5 min read

GNOME, Bowie, and why your creative sausage needs a recipe

Creativity works best when you understand the ingredients rather than just generating the results. A few examples from other fields show why.

Image created with AI (ChatGPT). Sorry about the weird-looking sausage.

What do you listen to when you want to go to sleep? White noise? Soothing music? Maybe a calm narrator reading you a bedtime story?

I listen to podcasts about Linux. Puts me right out.

New rules on AI from GNOME

On a recent episode of the Untitled Linux Show (a great podcast that I do fall asleep to sometimes, but always end up re-listening to the next day), the hosts discussed a new rule from GNOME that prohibits the use of AI-generated code in extensions.

GNOME is the desktop environment that ships by default with many major Linux distributions (like Ubuntu and Fedora). It features extensions, submitted by third-party developers, that modify how the desktop behaves, similar to the extensions you might use in Chrome or Safari.

Here is the new AI rule, straight from the…GNOME’s…mouth:

Extensions must not be AI-generated
While it is not prohibited to use AI as a learning aid or a development tool (i.e. code completions), extension developers should be able to justify and explain the code they submit, within reason.
Submissions with large amounts of unnecessary code, inconsistent code style, imaginary API usage, comments serving as LLM prompts, or other indications of AI-generated output will be rejected.

FrontPage flashbacks

One of the podcast hosts noted that this move reminded him of websites made with Microsoft FrontPage.

Obi-Wan Kenobi saying: "Now that's a name I've not heard in a long time."
FrontPage: Now that's a name I've not heard in a long time.

If you were around for the early web in the 1990s, you’ll remember making pages with plain old HTML.

Aside: In a corporate strategy class I took in college in the late 90s, I got my pick of project teams because I knew how to create and use HTML. Did I parlay that brief popularity into something bigger? Better? Profitable? I sure did not.

The early internet era of the late ’90s brought applications that promised to help you make a website via a WYSIWYG (What You See Is What You Get) interface. FrontPage was the king of these at the time. It did what it advertised, but if you took a closer look you’d find a mess of extra code and "digital junk" that made webpages heavy, slow, and inefficient.

Later apps like Dreamweaver improved on this by allowing graphical design while producing cleaner code. 

The problem with your sausage (creatively speaking)

The FrontPage story has a clear parallel in the way AI has upended the world of creativity today. Technology evangelists tell us that generative AI will let you create anything without worrying about the details. If you can imagine it, you can make it.

Is that actually a good thing?

Last year, record producer Rick Beato analyzed the waveforms of some AI-generated music. He tried to separate the tracks to see how the song was built, looking for the individual vocal, drum, or guitar lines. He couldn't quite do it. The music wasn't "built"; it was a digitally generated "blob" of information. The tracks existed, but they weren’t clear and distinct: there was plenty of bleed-over between them, some were decent while others were barely recognizable, and there was a lot of extraneous sound that didn’t quite fit. When it all came together, it sounded...alright.

To the average Spotify listener who just wants “cool beats and sick grooves”, this may be irrelevant. "If it sounds alright, it is alright." I can’t really argue with that. The sausage gets made, it’s delicious, and you don’t ask what spare parts and filler went into it.

But say you’re a chef who wants to cook with that sausage. If the mix is wrong or there’s too much filler, it could impact the resulting dish. Likewise, if you outsource the entire creative process to AI, letting your work go into the “black box” and come out finished, your creation loses three vital things:

  1. Flexibility: You can’t easily remix or remaster a "blob."
  2. Portability: It’s hard to take an idea generated by one AI and move it into a professional workflow elsewhere if you don't understand the underlying structure.
  3. Influence: This is the most important one, and we should summon David Bowie to address it.

The Bowie method

I believe that creativity is a synthesis. There are no entirely original ideas. Creativity comes from how we take what inspires us, combine it with our own experience, and mold it into something new.

David Bowie was a master at this. He freely admitted to borrowing from The Velvet Underground, Kabuki theater, books, and German cinema. But Bowie never just copied outright (with maybe a notable "Somewhere Over the Rainbow"-like exception or two). Instead, he took specific elements, filtered them through his current setting (London, New York, Berlin), and combined them with new collaborators and conspirators (Mick Ronson, Lou Reed, Iggy Pop, Brian Eno, Robert Fripp).

Bowie’s "ingredients" were intentional. Every person he worked with and every inspiration he drew on was chosen with purpose, and that’s the foundation on which his creativity became legendary.

Be able to explain and justify your work

Circling back to the GNOME policy, here’s the part that I think is most relevant:  

While it is not prohibited to use AI as a learning aid or a development tool (i.e. code completions), extension developers should be able to justify and explain the code they submit, within reason. 

GNOME is a massive open source initiative with many chefs. They’re not anti-AI in general, but they need to know how the metaphorical sausage is made to ensure security, efficiency, interoperability, consistency, and a good user experience.

Releasing a product full of code that isn’t properly documented, and that future developers may not be able to decipher or rework, renders the product inflexible, less portable, and very hard to support. Will it always be this way? Perhaps not. AI coding is getting better and better, and the day may soon come when this isn't the case. But from what I understand from listening to developers who work with AI, we're not quite there yet.

We should treat our creative work the same way.

3 lessons for creativity with AI

  • Keep AI small and surgical: use generative AI to enhance your work, but use it as support, not replacement. Use it to solve a specific problem, then step back in.
  • Know your ingredients: be able to identify exactly where you used AI so that it can be changed, reworked, or defended in the future. If you don't know why something's in your work, is it yours?
  • Reimagine creativity: don't think of the creative process as some holistic thing bestowed upon you from the heavens by your favorite deity (mine is David Bowie). Make it a synthesis of the things that inspire you, passed through the filter of constraints, context, and lived experience.

If we look at AI through this lens, we just might make things that are useful, beautiful, and durable, rather than adding more stuff to the pile.

May Bowie be with you.
