
I have written a whole project [1] for making amateur video clips [2] of AI-generated music, using GPT and other LLMs. 10,000 to 12,000 lines of code were written exclusively by AIs.

I didn't know at the start what needed to be done, and the code ended up with a lot of duplication, but I refactored it heavily and it is now on the order of 4,000 lines.

I could make some screencasts of the development process, but it is very simple. I ask it to write some code, and I always provide some relevant context. When it is a method of a struct, I give it the struct and/or a similar method, and I describe what I want the method to do. Sometimes I also give it the type signature of the method/function it has to write. When it has to return an error, I provide the error enum.
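To make that concrete, here is roughly what such a prompt looks like. The names below are made up for this example, not copied from the repo: the prompt contains the struct, the error enum, and the signature of the method to be written, plus a one-line description of the behaviour, and the LLM fills in the body.

    use std::path::PathBuf;

    pub struct Clipper {
        pub audio_path: PathBuf,
        pub output_dir: PathBuf,
    }

    #[derive(Debug)]
    pub enum ClipperError {
        MissingAudio(PathBuf),
        FfmpegFailed(String),
    }

    impl Clipper {
        // Prompt: "Write this method; return MissingAudio if the file is absent."
        pub fn validate_audio(&self) -> Result<(), ClipperError> {
            if self.audio_path.exists() {
                Ok(())
            } else {
                Err(ClipperError::MissingAudio(self.audio_path.clone()))
            }
        }
    }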

In other words, by providing that context I never use it zero-shot; I always aim for few-shot answers. When I want it to use another function, I just provide the type signature, almost never the whole function.
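For example (again with made-up names, reusing the ClipperError enum from the sketch above), if the new code needs an existing helper, only the helper's signature goes into the prompt, never its body. The stub below stands in for that helper.

    use std::path::{Path, PathBuf};

    // Only this signature is pasted into the prompt; the body never is.
    fn extract_frames(_video: &Path, _fps: u32) -> Result<Vec<PathBuf>, ClipperError> {
        unimplemented!("body not shown to the LLM")
    }

    // The LLM writes the new function against that signature alone.
    fn build_thumbnails(video: &Path) -> Result<Vec<PathBuf>, ClipperError> {
        let frames = extract_frames(video, 1)?;
        Ok(frames.into_iter().take(10).collect())
    }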

One more detail: I keep the context as minimal as possible, around 100 tokens per query, sometimes 200, and 300 at the very most. After each task I delete the context and provide fresh context for the next one. I never use more than a few hundred tokens per query; even 300 is pushing it.

That's about it! Never use LLMs zero-shot, always few-shot, and never use more than a few hundred tokens of context per query.

[1] https://github.com/pramatias/fxp_videoclipper/tree/main
[2] https://www.youtube.com/watch?v=RmmoMPu091Y


