The news of The New York Times filing a lawsuit against Microsoft and OpenAI for copyright infringement struck the internet like a rogue comet, sparking a fierce debate about the ‘inherent dangers’ of AI.
One thing is clear: there has been no shortage of lawsuits contesting the legality or fair use of creative work in the digital space, with intellectual property at the heart of the dispute. Meanwhile, concerns over copyright infringement in AI have recently intensified, highlighting the need to protect the ownership of AI-generated content.
Ringfence, a web3-based platform for Generative AI creators, is building the infrastructure for creators to protect and monetize various media, including photos, images, videos, documents, and music. In this interview, Ringfence CEO Whitney Gibbs offered valuable insights into Artificial Intelligence Generated Content (AIGC) and the need to incentivize creators.
My journey in web3 began in 2017, buying the top of the bull market. Over the next year, I researched different projects and communities and built relationships in the space. In 2018, I found my way to GPU mining and all things Proof of Work. That year, I began what would evolve into the HASHR8 Podcast (later Compass Mining Podcast, now The Mining Pod).
For the next two years, I focused on providing the crypto mining community with valuable insights, and in 2020, the podcast was recognized by Forbes as one of crypto’s most important podcasts.
Around this time, I started Compass Mining with two friends. As CEO from 2020 to 2022, I led Compass’s growth from zero to becoming the world’s largest Bitcoin mining marketplace for hardware and hosting.
Compass was (and still is) an amazing business doing hundreds of millions of dollars in annual sales, and we built a great team of nearly 100 people around the world. The coolest thing about Compass was that we created the “retail mining” category, opening Bitcoin mining up to people who would have otherwise been excluded.
In June 2022, I stepped down as CEO of Compass Mining to begin researching new market opportunities to see how I could best impact the global adoption of web3. It was clear to me that artificial intelligence was the most important technological advancement in our world since the internet, so all research was focused on where blockchain technologies and AI might intersect.
The most immediate, clear intersection is the establishment of digital provenance and attribution in the training of neural networks. There are millions of people around the world currently providing their data to large companies like OpenAI and Stability AI FOR FREE. While I don't think this is a malicious action on the part of those companies, there is a better way to do things. This is ultimately what led to the formation of Ringfence.
Thank you! The New York Times’ lawsuit is a landmark legal case—there’s little doubt its outcomes will shape the future landscape of generative AI. The way some see it, The New York Times is fighting on behalf of journalists and authors worldwide to get answers. The way we see it, it’s a major case among many where creators—both companies and individuals—are fighting for their creative rights. But you can’t ignore that it’s incredibly nuanced.
In my opinion, the New York Times case has merit: these companies train their models on billions and billions of data points, including content from media outlets, and their chatbots are perfectly capable of regenerating information strikingly similar to what a journalist produces daily, even articles behind paywalls.
But NYT is also alleging that these chatbots can regurgitate articles word-for-word. It remains to be seen exactly how these tools (ChatGPT specifically, in this case) were able to do this, and whether the process relied more on external sources accessed via web search than on the internal training set. This will be an important point in the case: if it’s the former, it might be fair to say it comes under Fair Use. And if it’s the latter? The case might just raise more questions than it answers.
Ultimately, I think what’s most troubling to creators is that some of these AI companies have claimed their output is sufficiently transformative of the original training data to fall under Fair Use. It seems unfair that they can take creators’ work en masse, train an AI tool on it, and then use that tool for commercial purposes without giving those creators due credit or compensation.
This is what Ringfence is built to solve. But whatever the case and its outcome, I think the New York Times, like other creators before and after it, has every right to fight to regain creative and financial control over its intellectual property.
Copyright has long been a burning issue when it comes to Intellectual Property rights and digital ownership. Generative AI has only made it easier to produce digital content, so it’s no surprise that the issue has been exacerbated. The output of these models depends on the immense amount of information used to train the Large Language Model behind the Generative AI, gathered without consent from or compensation to the original creators.
There are countless cases, going back to the inception of publicly available Generative AI models, of artists and authors having their work scraped from the internet and used for training. More recently, we have seen AI-generated music imitating certain artists and even AI films in the style of particular directors.
As these tools have become more advanced, they’ve been able to reproduce artistically similar images, text, and now even video and audio. With their commercialization, the copyright vs. Fair Use debate has only grown murkier.
Blockchain is fundamental to Ringfence, providing the infrastructure for digital authentication and monetization. In our case, creations uploaded to the cloud can be tokenized, with their metadata also stored on-chain. This serves a few purposes. Firstly, blockchain provides a transparent and immutable ledger for tracking the digital provenance of each creation, giving a clear record of ownership. Secondly, tokenization also confers digital ownership itself, facilitating peer-to-peer transfer.
We also make use of smart contracts to facilitate enforceable end-user agreements, enabling fair and seamless monetization. With these, creators can set clear rules for how AI gets to use their work, whether it can be personalized or used to train models, and how they prefer to be compensated when their work is used to create AI derivatives.
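To make that idea concrete, here is a minimal illustrative sketch in TypeScript of how a tokenized creation’s metadata and licensing terms might be modeled and checked. Ringfence has not published its contract interfaces, so every name below (LicenseTerms, CreationRecord, evaluateUse, and so on) is a hypothetical stand-in rather than the platform’s actual API.

```typescript
// Hypothetical, simplified model of a tokenized creation and its license terms.
// None of these names come from Ringfence; they only illustrate the general idea
// of pairing on-chain provenance metadata with enforceable usage rules.

interface LicenseTerms {
  allowPersonalization: boolean; // may the work be remixed/personalized by AI?
  allowModelTraining: boolean;   // may the work be used to train models?
  royaltyBps: number;            // creator royalty in basis points (e.g. 500 = 5%)
}

interface CreationRecord {
  tokenId: string;               // identifier of the token minted for the creation
  contentHash: string;           // hash of the uploaded media, anchoring provenance
  creator: string;               // creator's wallet address
  license: LicenseTerms;
}

// Check whether a proposed AI use is allowed, and what the creator would be owed.
function evaluateUse(
  record: CreationRecord,
  use: "personalization" | "training",
  grossRevenue: number
): { allowed: boolean; royaltyOwed: number } {
  const allowed =
    use === "personalization"
      ? record.license.allowPersonalization
      : record.license.allowModelTraining;
  const royaltyOwed = allowed
    ? (grossRevenue * record.license.royaltyBps) / 10_000
    : 0;
  return { allowed, royaltyOwed };
}

// Example: a creation licensed for model training at a 5% royalty.
const record: CreationRecord = {
  tokenId: "1",
  contentHash: "0xabc123",       // placeholder hash for illustration
  creator: "0xCreatorWalletAddress",
  license: { allowPersonalization: false, allowModelTraining: true, royaltyBps: 500 },
};

console.log(evaluateUse(record, "training", 1_000)); // { allowed: true, royaltyOwed: 50 }
```

In a production system these rules would be encoded and enforced in the smart contract itself; the sketch only shows the shape of the data and the kind of check a licensing agreement would automate.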
Generative AI can be a tool to enable hyperproductivity. Artists, authors, and creatives of any kind should be able to make use of it in a way that respects and rewards their creativity. Creators should be able to use their work to rapidly iterate and create derivatives, but they should also be able to license their work to train AI models—and get rewarded.
Data is the most valuable commodity in the world, yet rarely are those supplying it the same ones benefiting from it.
At present, this is most true with AI. Companies like OpenAI, Midjourney, and Stability AI garner billion-dollar valuations and generate million-dollar monthly profits while paying out NOTHING to those who have supplied the key data points without which they would cease to function.
For Ringfence, we believe that content monetization is a critical first step toward the monetization of all personal data. Through content monetization and the neural network needed to support it, we intend to answer important questions around attribution, compensation, and true fair use.
Raising capital in 2023 was no small task, even with a compelling idea and sound business plan. The end of the era of zero-interest rates, coupled with global economic uncertainty, changed the venture landscape and made raising capital hard for many. Ringfence was fortunate to secure funding from a great cohort of investors who believe in our vision for the future and were excited about backing our experienced team.
AI is a tool or technology like any other: it can have an immensely positive impact, but it can also be incredibly detrimental to creators unless we take a step back and assess it closely.
Ultimately, creatives shouldn't have to compromise or buckle under the threat of AI taking their work. They should be able to create freely, working alongside AI to improve their productivity while being fairly rewarded.
At the risk of sounding a little too sentimental, I hope we can all work towards a shared future where creatives collaborate with AI fairly, ushering in a new age of creativity while innovation continues.