Loved it, it's amazing.
A few questions:
How many developers built it?
How much time did it take?
Have you started earning?
How are you advertising?
Why wouldn't people just call from their phone for international calls?
voip.ms requires setting up SIP credentials and configuring a softphone (or their app). Great for technical users who want maximum control and lowest rates. Voklit is for people who wish to download an app / use the browser and start calling immediately without touching any settings.
Google Voice is available everywhere with a little effort. I've used it outside of the US for over 10 years.
Even without GV there are other lower-cost options. The actual (wholesale) cost of these services is around what GV charges or less; a 5+ times markup is pretty rich.
Thanks! There isn’t a static knowledge base — questions and explanations are generated on the fly based on the user’s goal, with the system controlling difficulty and structure.
The application has an AI model in the backend that does object detection on the notes and responds with the notation coordinates to the desktop application. Once the coordinates are sent back, the note is deleted from the server.
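Since the comment doesn't spell out the wire format, here's a minimal sketch of what parsing such a detection response might look like on the desktop side. The JSON shape, field names, and the "quarter_note" label are all invented for illustration, not the app's actual API.

```python
import json
from dataclasses import dataclass

@dataclass
class Notation:
    """One detected notation with its bounding box (hypothetical schema)."""
    label: str
    x: int
    y: int
    w: int
    h: int

def parse_detection_response(payload: str) -> list[Notation]:
    """Parse the (assumed) JSON the backend returns after object detection."""
    data = json.loads(payload)
    return [
        Notation(d["label"], d["x"], d["y"], d["w"], d["h"])
        for d in data["detections"]
    ]

# Example response shape, purely illustrative:
resp = '{"detections": [{"label": "quarter_note", "x": 120, "y": 48, "w": 14, "h": 30}]}'
notes = parse_detection_response(resp)
print(notes[0].label, notes[0].x)  # quarter_note 120
```

After the client consumes the coordinates, the server-side copy of the note would be deleted, matching the flow described above.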
This feels broadly useful beyond translation — e.g., prompt sanitization for support agents or RAG pipelines. Have you experimented with feeding the enriched tags directly into LLM prompts (vs MT engines) and how they behave?
I haven't done as much testing as I'd like to confidently answer this in general terms. In our own environment we have the benefit of defining the system prompt for translation, so we can introduce the logic of the tags to the LLM explicitly. That said, in our limited general-purpose testing we've seen that the flagship models definitely capture the logic of the tags and their semantic properties reliably without 'explanation'.
I'm currently exploring a general-purpose prompt sanitizer, and potentially even a browser plugin for behind-the-scenes sanitization in ChatGPT and other end-user interfaces.
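For readers unfamiliar with the idea, a toy sketch of tag-based sanitization might look like the following. The regex, the `<EMAIL_n>` tag scheme, and the restore step are my assumptions for illustration; the actual enriched tags described above carry more semantic properties than this.

```python
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def sanitize(text: str) -> tuple[str, dict[str, str]]:
    """Swap emails for numbered placeholder tags; return text plus a restore map."""
    mapping: dict[str, str] = {}

    def repl(m: re.Match) -> str:
        tag = f"<EMAIL_{len(mapping) + 1}>"
        mapping[tag] = m.group(0)
        return tag

    return EMAIL.sub(repl, text), mapping

def restore(text: str, mapping: dict[str, str]) -> str:
    """Put the original values back into the model's output."""
    for tag, original in mapping.items():
        text = text.replace(tag, original)
    return text

clean, mapping = sanitize("Contact alice@example.com about the refund.")
print(clean)  # Contact <EMAIL_1> about the refund.
```

The sanitized text goes to the LLM (or MT engine), and `restore` maps the tags back afterwards, so the sensitive value never leaves the local environment.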
This is a neat use of git worktrees. Isolating Claude Code per branch avoids a lot of context loss when juggling bugfixes and features. The .env copy + auto-launch touches are thoughtful.
Very nice exploration of URL-as-state. The approach is elegant, but the mobile crashes highlight how hostile real-world URL handling still is once links leave the browser.
Really nice results on MSVC. The idea that tail calls effectively reset compiler heuristics and unblock inlining is pretty convincing. One thing that worries me though is the reliance on undocumented MSVC behavior — if this becomes widely shipped, CPython could end up depending on optimizer guarantees that aren’t actually stable. Curious how you’re thinking about long-term maintainability and the impact on debugging/profiling.
Thanks for reading! For now, we maintain all 3 of the interpreters in CPython. We don't plan to remove the other interpreters anytime soon, probably never. If MSVC breaks the tail calling interpreter, we'll just go back to building and distributing the switch-case interpreter. Windows binaries will be slower again, but such is life :(.
Also the interpreter loop's dispatch is autogenerated and can be selected via configure flags. So there's almost no additional maintenance overhead. The main burden is the MSVC-specific changes we needed to get this working (amounting to a few hundred lines of code).
> Impact on debugging/profiling
I don't think there should be any, at least for Windows. Though I can't say for certain.
That makes sense, thanks for the detailed clarification. Having the switch-case interpreter as a fallback and keeping the dispatch autogenerated definitely reduces the long-term risk.