OpenAI has made DALL-E, the artificial intelligence (AI) image generator, available in beta and also unveiled the tool’s pricing structure.
The company has announced that the AI system, which can create realistic images and art from text prompts, will soon be available to everyone.
Millions of people have signed up for early access, and OpenAI, the company behind DALL-E, will invite one million people from that waiting list in the coming weeks.
Invited users will receive 50 free credits to use in the first month, then 15 each month thereafter. Each credit buys four images generated from an original prompt, or three if the user submits an edit or variation prompt. If the free credits aren't enough to satisfy a user's demands, a pack of 115 credits is available for purchase for $15. OpenAI says artists who need financial assistance will be able to apply for subsidized access.
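The credit math above can be sketched in a few lines. This is purely illustrative: the constants come from the article, and the function names are hypothetical, not part of any OpenAI API.

```python
# Illustrative sketch of the DALL-E beta credit math described in the article.
# All figures come from the article; the helper names are hypothetical.

FREE_CREDITS_FIRST_MONTH = 50
FREE_CREDITS_PER_MONTH = 15
PACK_CREDITS = 115
PACK_PRICE_USD = 15

IMAGES_PER_GENERATION = 4   # images returned for an original prompt
IMAGES_PER_VARIATION = 3    # images returned for an edit/variation prompt

def max_images(credits: int, variations: bool = False) -> int:
    """Upper bound on images obtainable from a credit balance."""
    per_credit = IMAGES_PER_VARIATION if variations else IMAGES_PER_GENERATION
    return credits * per_credit

def price_per_image_usd() -> float:
    """Effective cost per image when buying the 115-credit pack,
    assuming every credit is spent on original-prompt generations."""
    return PACK_PRICE_USD / (PACK_CREDITS * IMAGES_PER_GENERATION)

print(max_images(FREE_CREDITS_FIRST_MONTH))  # 200 images in the first free month
print(round(price_per_image_usd(), 4))       # roughly $0.0326 per image
```

Under these numbers, the $15 pack works out to a little over three cents per generated image.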
The beta version also allows users to put the images they generate to commercial use. For example, printing the images on shirts or selling merchandise featuring the AI images will be allowed. However, OpenAI will reject image uploads that include realistic faces or explicit content, because the company is concerned that bad actors could use its technology to create misinformation, deepfakes and other harmful content.
DALL-E 2, the successor to DALL-E, was announced in April and already has 100,000 users. OpenAI says wider access has been made possible by new approaches to mitigating bias and toxicity in DALL-E 2's outputs, as well as new policies governing the images created by the system.
DALL-E 2 was trained on a filtered dataset from which overtly violent, sexual or hateful images had been removed. However, such filtering is no guarantee against misuse. Google recently said it won't release Imagen, an image generation model it developed, because of misuse risks. Meanwhile, Meta has limited access to Make-A-Scene, its art-focused image generation system, to "prominent AI artists".
OpenAI points out that the hosted DALL-E 2 incorporates other safeguards, including "automated and human monitoring systems" to prevent the model from memorizing faces that appear frequently on the internet. Still, the company admits there is work left to be done.
"Expanding access is an important part of our responsible deployment of AI systems, as it allows us to learn more about real-world usage and continue to iterate on our security systems," OpenAI writes in its blog post. "We continue to research how AI systems, like DALL-E, might reflect biases in its training data and different ways to address them."