
Good question!

PyXL today is aimed more at embedded and real-time systems.

For server-class use, I'd need to mature heap management, add basic concurrency, a simple network stack, and gather real-world benchmarks (like requests/sec).

That said, I wouldn’t try to fully replicate CPython for servers — that's a very competitive space with a huge surface area.

I'd rather focus on specific use cases where deterministic, low-latency Python execution could offer a real advantage — like real-time data preprocessing or lightweight event-driven backends.

When I originally started this project, I was actually thinking about machine learning feature generation workloads — pure Python code (branches, loops, dynamic types) without heavy SIMD needs. PyXL is very well suited for that kind of structured, control-flow-heavy workload.
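To make that concrete, here's a toy example (not PyXL-specific, just illustrative) of the kind of feature code I have in mind: lots of branching, per-element looping, and dynamic types, with no wide vector math to speak of.

    def session_features(events):
        """Per-session features from a list of (timestamp, kind, value) tuples."""
        clicks = 0
        gaps = []
        last_ts = None
        for ts, kind, value in events:
            if kind == "click":
                clicks += 1
            elif kind == "purchase" and value > 100:
                clicks = 0  # reset the click streak after a large purchase
            if last_ts is not None:
                gaps.append(ts - last_ts)
            last_ts = ts
        avg_gap = sum(gaps) / len(gaps) if gaps else 0.0
        return {"clicks": clicks, "avg_gap": avg_gap, "n_events": len(events)}

    print(session_features([(0.0, "click", 1), (0.5, "click", 1), (1.2, "purchase", 150)]))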

If I wanted to pitch PyXL to VCs, I wouldn’t aim for general-purpose servers right away. I'd first find a specific, focused use case where PyXL's strengths matter, and iterate on that to prove value before expanding more broadly.



I need to bit-bang the RHS2116 at 25 MHz: https://intantech.com/files/Intan_RHS2116_datasheet.pdf

Right now I'm doing this with a DSL on an FPGA talking to a computer.

Does your Python implementation let you run at speeds like that?

If yes, is there any headroom left for DSP - preferably FP-based?
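
(For context, the per-word inner loop I'm bit-banging is roughly the sketch below; gpio_write/gpio_read are hypothetical helpers, it's just to show the per-edge work that has to fit into a 40 ns clock period at 25 MHz.)

    # Rough sketch of one bit-banged 32-bit RHS2116 SPI word (hypothetical
    # gpio_write/gpio_read helpers, not a real API). At a 25 MHz SCLK each
    # loop iteration has a 40 ns budget, roughly 20 ns per clock edge.
    def rhs2116_xfer(gpio_write, gpio_read, command):
        gpio_write("CS", 0)                    # assert chip select (active low)
        result = 0
        for bit in range(31, -1, -1):          # shift out MSB first
            gpio_write("MOSI", (command >> bit) & 1)
            gpio_write("SCLK", 1)              # rising edge: device samples MOSI
            result = (result << 1) | gpio_read("MISO")
            gpio_write("SCLK", 0)
        gpio_write("CS", 1)                    # deassert chip select
        return result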



