>Copilot introduces what we might call a more selfish interface to open-source software: just give me what I want! With Copilot, open-source users never have to know who made their software. They never have to interact with a community. They never have to contribute.
>Meanwhile, we open-source authors have to watch as our work is stashed in a big code library in the sky called Copilot. The user feedback & contributions we were getting? Soon, all gone.
I don't see how you square the above complaint with this:
> First, the objection here is not to AI-assisted coding tools generally, but to Microsoft’s specific choices with Copilot. We can easily imagine a version of Copilot that’s friendlier to open-source developers—for instance, where participation is voluntary, or where coders are paid to contribute to the training corpus.
Is an AI that was trained on opt-in or paid-for training data any less damaging? How would these choices have alleviated the problems described above?