What would you think if models were bundled with a second model, a "copyright filter"? Just as humans know to keep their creations sufficiently far from copyrighted material, you could distribute models that are trained on copyrighted material but know well enough not to produce anything so close to something copyrighted that it infringes.
This would prevent anybody from accidentally infringing when using these tools. Does that seem like a reasonable solution, or is your concern greater than accidental infringement?
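For concreteness, here's a rough sketch of the kind of bundling I mean. Both `generate` and `infringement_score` are hypothetical placeholders: in a real system the first would be the generative model itself and the second would be the proposed filter model scoring similarity to known copyrighted works.

```python
import random

# Hypothetical stand-in for the generative model.
def generate(prompt: str) -> str:
    """Pretend call into a language/image model."""
    return f"Generated output #{random.randint(0, 9)} for: {prompt}"

# Hypothetical stand-in for the bundled "copyright filter" model.
def infringement_score(output: str) -> float:
    """Pretend similarity score against copyrighted works (0 = novel, 1 = near-verbatim copy)."""
    return random.random()

def filtered_generate(prompt: str, threshold: float = 0.8, max_attempts: int = 5):
    """Generate, but suppress any candidate the filter flags as too close to copyrighted material."""
    for _ in range(max_attempts):
        candidate = generate(prompt)
        if infringement_score(candidate) < threshold:
            return candidate
    # Refuse rather than risk emitting something infringing.
    return None

if __name__ == "__main__":
    result = filtered_generate("a short fantasy story")
    print(result if result is not None else "Declined: could not produce a sufficiently original output.")
```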