Jul 15, 2025

BLURRED LINES: WHO OWNS AI-GENERATED CULTURE

The internet has always thrived on remix culture — from Tumblr moodboards to SoundCloud samples to TikTok audios looped into infinity. But with the rise of generative AI, the question of who owns culture has become messier than ever. If an algorithm trained on billions of images creates a new aesthetic, is it art, theft, or something in between?

Author
NOISE

READ
5 MINS

Category
FUTURE OF CREATIVE

THE COLLAPSE OF AUTHORSHIP


Traditionally, culture had at least a traceable lineage. A trend could be tracked back to a subculture, a designer, a movement. AI collapses that lineage. When Midjourney or Stable Diffusion spits out an image of “cyber-Y2K maximalism,” it isn’t citing references — it’s synthesizing them into something that feels new, even if every pixel is borrowed.


The result is an unsettling question: who gets credit? The creator of the tool? The user prompting it? Or the countless artists whose works were scraped, uncredited, to feed the machine? In the past, sampling meant paying royalties. In the age of AI, ownership blurs into invisibility.


CULTURE WITHOUT CONSENT


This isn’t just an academic debate. For working artists, illustrators, and photographers, the stakes are material. Their style — once their signature, their brand — can now be replicated at scale without permission. Lawsuits are mounting: Getty Images is suing Stability AI for using its licensed library as training data without consent. Independent artists are organizing around tools like Spawning’s “Have I Been Trained”, which lets creators see if their work has been swept into datasets without approval.


The U.S. Copyright Office has also drawn a line: in 2023, it ruled that works generated entirely by AI are not eligible for copyright protection, since copyright requires human authorship. That ruling doesn’t end the debate, but it highlights the legal vacuum — AI outputs exist in a gray zone, free to be reproduced and reused without the protections (or the responsibilities) that govern human work.


THE NEW GATEKEEPERS


In many ways, AI platforms themselves have become the new cultural gatekeepers. The dataset is the canon. Whatever goes in shapes what comes out. This is less a democratization of creativity than a redistribution of power — away from communities and toward the corporations training the models. Culture is no longer just performed for the algorithm; it is authored by it.


RESOLUTION: REFERENCE, DON’T REPLICATE


And yet, dismissing AI outright misses its potential. AI is powerful not because it can “invent” culture, but because it can reference it — surfacing patterns across decades, reviving dormant aesthetics, and refracting them into the present. Think of the way Y2K fashion has reappeared, mutated by Gen Z into new forms. AI, used thoughtfully, can accelerate this process of cultural recontextualization: not replacing creativity, but offering a lens through which old ideas are made strange — and therefore new — again.


This is where Noise’s approach matters. Instead of claiming authorship, Noise Kits act as filters: curated, AI-powered forecasts that frame cultural signals into usable, forward-looking narratives. They don’t claim to own the culture they draw from — they contextualize it, translate it, and make it actionable.


Replication without consent is theft. Recontextualization with intent is commentary. AI at its best doesn’t flatten originality — it amplifies the ways culture repeats, mutates, and resurfaces.


THE HARD TRUTH


AI will shape culture — that much is inevitable. But whether it becomes a parasite that drains creativity or a tool that deepens it depends on how we use it.

The hard truth is this: culture belongs to people. AI can remix it, accelerate it, even predict it — but it cannot own it. What it can do is help us see cultural loops with sharper clarity, and reimagine them for the moment we’re in. That’s not erasure — it’s evolution.


//FAQ

Frequently Asked Questions

01 What is Noise?
02 Who is Noise for?
03 What makes Noise different from stock libraries or AI tools?
04 Can I use Noise visuals for commercial projects?
05 What if the image links to another site?
06 What models do you use to create AI images?
07 Do I need to use AI to use Noise?
08 What do I need to get started?
09 What if I just want to license one image?
10 Are there paid plans?


What's All The Noise

BASED IN Toronto, Canada

Creative Studio + Trend Developer

Join the Noise waitlist.

Trend reports, curated inspiration, and exclusive kits — before anyone else sees them.
