OpenAI’s DALL-E 2 is big tech’s equivalent of ‘Soylent Green’.

This article contains spoilers for the 1973 film Soylent Green.

Here’s to a hot AI summer for anyone with even the slightest interest in putting the “art” into artificial intelligence. I’m talking about DALL-E 2 and OpenAI’s announcement that its amazing text-to-art generator is entering beta.

Most exciting of all: a million more people are getting access to DALL-E 2. Woohoo! Let’s do a cartwheel.

On the flip side, there won’t be any cartwheels over at TNW’s Neural offices today.

DALL-E 2 is a scam, in this humble editor’s opinion. But that’s nothing new in the world of technology. Facebook is a scam. Google is a scam. Microsoft is a scam. They all make their money from something that has little to do with what they appear to sell.

If you lay bricks, you’re a builder. If you write programs, you’re a programmer. But if you sell ads in 2022, you’re probably a search giant or a social media company. It doesn’t have to make sense; it just has to turn a profit.

For the most part, big tech makes its profits by convincing you to become its product. Why try to sell something to everyone on Earth when it’s easier and more profitable to find a TV manufacturer that will pay you to advertise to everyone on Earth?

The data we generate powers the products these companies sell. The deal we make with them is pretty simple: you can use our data to make the products we use better. And, in practice, that means as long as we keep playing Candy Crush for free, we’re meta-okay with advertisers using our browsing data to target us.

Background: DALL-E 2 is an offshoot of OpenAI’s transformer models, the best known of which is GPT-3. It takes text prompts and turns them into pictures. It’s awesome and will absolutely, 100%, revolutionize the world of content creation.
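
For a sense of what “text prompt in, picture out” looks like in practice, here’s a minimal sketch using OpenAI’s Python client. The DALL-E 2 beta itself was accessed through a web interface, so the prompt, parameters, and placeholder key below are assumptions used purely for illustration.

    # Minimal sketch: text prompt in, image out.
    # Assumes the openai Python package (pre-1.0 SDK) and a valid API key;
    # the DALL-E 2 beta was used through a web interface, so treat this
    # only as an illustration of the text-to-image idea.
    import openai

    openai.api_key = "sk-..."  # placeholder, not a real key

    response = openai.Image.create(
        prompt="an oil painting of a lighthouse at sunset",
        n=1,               # how many images to generate
        size="1024x1024",  # output resolution
    )

    print(response["data"][0]["url"])  # URL of the generated image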

Let me be very clear here: I love it. I think it’s amazing. It’s a technological triumph and the whole world deserves it (with the huge caveat that OpenAI needs to keep it safe).

Selling access to it, however, is dangerous. It’s not just unethical; it takes the level of exploitation big tech already gets away with and threatens to turn us into the digital equivalent of Soylent Green.

Spoiler alert: Soylent Green is an old movie, but it’s a classic. I’m about to spoil the film’s giant twist, so if that bothers you, you should stop reading now.

In the film, humanity is on the brink of starvation in the future. To survive, the government issues food rations, and that’s all most people eat. One day, a new flavor comes out called “Soylent Green.” Everyone loves it and, luckily, it’s mass-produced. Unfortunately, Soylent Green is made of people. To solve the hunger problem, the government started processing the dead and feeding them back to the living.

So, two things:

  1. As I recall, Soylent Green was actually free – I haven’t watched the film in a while, so I’m sure a reader will correct me if I’m wrong.
  2. It’s impossible to eat a Soylent Green bar made out of your own corpse.

I bring all that up because it explains my problem with OpenAI selling access to DALL-E 2.

My point: giving people access to DALL-E 2, with certain safeguards in place, created an environment where anyone could explore what I like to call the library of human imagery, as interpreted by a transformer model.

But the moment OpenAI turned DALL-E 2 into a monetized service (OpenAI allows you to buy credits for use beyond your beta limit), it turned our data into a product and started selling it back to us.

It went from being an art library that we could all (potentially) enjoy to a for-profit business. OpenAI even makes it clear (in the post linked above) that people with beta access own any artwork the model produces from their prompts.

Who authorized OpenAI to sell ownership of the images our data helped create?

It’s easy to compare the model to a writer who has read many books and uses that inspiration to create a new one.

But that’s not what DALL-E 2 does. Don’t get hung up on the “art” aspect. If we shrink the model down to its simplest form, DALL-E 2 takes a picture, holds onto it, and gives it to you when you ask for it.

If it has eight pictures of the ocean and you ask it for one, it will hand one over. But because DALL-E 2 exists only in the digital world, it can do things that aren’t possible in the physical one.

It can smash those eight pictures together and turn them into a single new picture. And it can even be fine-tuned to decide whether eight is the right number of pictures to smash together, or whether six will do. Maybe ten?

Eventually, OpenAI will get to the point where the model is crunching together millions or billions of images. It’s still the same trick.
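
To make that metaphor concrete, here’s a toy sketch of “smashing pictures together” as a simple per-pixel average. This is emphatically not how DALL-E 2 actually works (it generates images with a text-guided diffusion model); the file names and the blending approach are assumptions used only to illustrate the simplified framing above.

    # Toy illustration of the "smash pictures together" framing above.
    # NOT how DALL-E 2 works -- it uses a text-guided diffusion model --
    # this just blends a few source images into one by averaging pixels.
    # Assumes Pillow and NumPy are installed and the listed files exist.
    from PIL import Image
    import numpy as np

    paths = ["ocean1.jpg", "ocean2.jpg", "ocean3.jpg"]  # hypothetical files

    # Load each picture, force a common size, and convert to float arrays.
    arrays = [
        np.asarray(Image.open(p).convert("RGB").resize((512, 512)), dtype=np.float32)
        for p in paths
    ]

    # "Smashing together": a per-pixel average of all the source pictures.
    blended = np.mean(arrays, axis=0).astype(np.uint8)

    Image.fromarray(blended).save("blended_ocean.jpg")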

In summary: data is data and output is output. It doesn’t matter whether it’s art, writing, or advertising. When Facebook collects your data, it sells that data to advertisers so you can keep playing Candy Crush for free.

When OpenAI collects your data, it doesn’t have your consent. You’ve never checked a box or signed up for a website whose terms clearly state that any images you post online will be used to train OpenAI’s for-profit AI systems.

And that means, whether it’s as fun as Candy Crush or as convenient as facial recognition, OpenAI is exploiting you and your data every time someone pays for access to the DALL-E 2 model.

You don’t need to be a futurist to see that DALL-E will eventually become as important to design as Photoshop is now – all because the company took your data and used it to train a really good model.

The difference between Adobe and OpenAI, however, is that Adobe asks for permission before it trains its AI on your data.

No one ever asked me whether I wanted my face in Clearview AI’s database or OpenAI’s, and I feel the same way about both of them monetizing it.

This sets a dark and dangerous precedent for the use of scraped data going forward.
