London-based CelAction, a provider of 2D animation software for the industry, has announced its pledge to use only ethically trained AI in its products.
“We, like most other content-creation software developers, have been exploring the possibilities of integrating AI into our products, but it has become clear to us that without clear guidelines designed to protect creators, AI could do far more damage than the benefits it would bring,” said CelAction CEO Andy Blazdell. “Creators are quite rightly scared that their work is being repurposed without their knowledge or consent, and it is the responsibility of the software developers to allay those fears and protect the hard work of artists and programmers.”
The company describes its pledge in two parts:
Firstly, CelAction will not use generative AI without a chain of ownership to create its products or marketing materials. That means no automatic code-completion tools and no generative AI sound, voices, or images unless it can be proven that the AI was trained on material knowingly provided by the original creators.
Secondly, in any AI tools that CelAction may implement in its products, the AI will be trained ethically, either on material knowingly provided by the original creators or on material provided by the user of the product. Users can let it learn from their own data or use data that CelAction has responsibly sourced.
“Fortunately, most of our clients so far are only interested in their own unique creations and reject the idea of diluting the originality of their work with homogenous input from AI,” added Blazdell. “But we call on all serious software companies to join us in our pledge to protect the community that we owe our livelihoods to.”
Source: CelAction
Debbie Diamond Sarto is news editor at Animation World Network.