
The JSON schema not counting toward token usage is huge; that would really help reduce costs.


> Under the hood, functions are injected into the system message in a syntax the model has been trained on. This means functions count against the model's context limit and are billed as input tokens. If running into context limits, we suggest limiting the number of functions or the length of documentation you provide for function parameters.
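To make the quoted docs concrete: a function is described by a name, a description, and a JSON Schema `parameters` object, and it is that serialized text (after whatever transformation the model was trained on) that lands in the system message and gets billed. This is a minimal sketch; `get_weather` and all its fields are made-up examples, not anything from the docs above.

```python
import json

# Hypothetical function definition in roughly the shape the API expects:
# a name, a description, and a JSON Schema "parameters" object.
get_weather = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name, e.g. Paris"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["city"],
    },
}

# The definition is injected into the system message, so its serialized
# text is what counts against the context limit and input-token billing.
serialized = json.dumps(get_weather)
print(len(serialized), "characters of schema text")
```

The practical upshot of the docs' advice: trimming field descriptions or dropping rarely-used functions directly shrinks this serialized blob.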


I believe functions do count in some way toward token usage, but apparently in a more efficient way than pasting raw JSON schemas into the prompt. Either way, the token usage seems to be far lower than previous alternatives, which is awesome!


But it does count toward token usage. And they picked JSON Schema, which is something like 6x more verbose than TypeScript for defining the shape of JSON.
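The verbosity gap is easy to eyeball by writing the same shape both ways. This is a rough sketch comparing character counts (not tokens, and the ratio depends heavily on the schema); both snippets are invented examples:

```python
import json

# The same two-field object shape, expressed as JSON Schema...
json_schema = json.dumps({
    "type": "object",
    "properties": {
        "city": {"type": "string"},
        "days": {"type": "number"},
    },
    "required": ["city"],
})

# ...and as a TypeScript-style type literal.
typescript = "{ city: string; days?: number }"

ratio = len(json_schema) / len(typescript)
print(f"JSON Schema form is ~{ratio:.1f}x longer by character count")
```

Tokenizers don't map one-to-one onto characters, so the billed ratio will differ, but the structural overhead of JSON Schema (`"type"`, `"properties"`, `"required"`, quoting) is visible even in this tiny case.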


That is up in the air and needs more testing. Field descriptions, for example, are important but extraneous input that would be tokenized and count toward the cost.

At least for ChatGPT, input token costs were cut by 25%, so it roughly evens out.



