
I tried to open a 90GB file once (on a system with 16GB RAM), which it didn't like very much. It didn't actually crash; I just gave up before it finished loading (I suspect it may eventually have worked by utilising swap, but I imagine it would have been unusably slow). I think for such large files you really need an editor that doesn't buffer the entire file into RAM and loads chunks in from disk on demand.

That said, if your file is more reasonably sized (I've opened 7GB files no problem) then the performance and feature set you get are impressive. You get instant scrolling to anywhere in the file, full syntax highlighting, find and replace, multiple cursors, etc., all with reasonable performance. I find it does lag a little when making edits, especially with multiple cursors, but it's still quite usable.



> I think for such large files you really need an editor that doesn't buffer the entire file into RAM and loads chunks in from disk on demand.

That's what mmap does: it maps the entire file into the process's address space while letting the kernel load pages as they are touched. I assume that would improve performance significantly, but I'm not sure if anyone's tried it.
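
Roughly, a minimal sketch of the idea in C (illustrative only, not anything Sublime actually does):

    /* Map a file read-only and let the kernel page it in on demand. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(int argc, char **argv) {
        if (argc < 2) return 1;
        int fd = open(argv[1], O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }
        struct stat st;
        fstat(fd, &st);

        /* No 90GB read() here: pages are faulted in only as touched. */
        const char *data = mmap(NULL, st.st_size, PROT_READ,
                                MAP_PRIVATE, fd, 0);
        if (data == MAP_FAILED) { perror("mmap"); return 1; }

        /* Touching offset N loads only the page containing N. */
        if (st.st_size > 12345)
            printf("byte at offset 12345: %c\n", data[12345]);

        munmap((void *)data, st.st_size);
        close(fd);
        return 0;
    }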


That would be read-only though, so not an ideal solution for Sublime, although one could mmap the file and then reference the original from a structure describing how it was modified.
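
Something like a piece table, where the mmapped original stays untouched and edits live in a separate append buffer (all names here are made up for illustration):

    #include <stddef.h>

    enum source { ORIGINAL, ADDED };

    struct piece {
        enum source src;   /* which buffer this span lives in  */
        size_t offset;     /* start of the span in that buffer */
        size_t length;     /* span length in bytes             */
    };

    struct document {
        const char *original;  /* the read-only mmap             */
        char *added;           /* append-only buffer for edits   */
        size_t added_len;
        struct piece *pieces;  /* in order, these spell the text */
        size_t npieces;
    };

An insertion then just splits one piece into left/new/right entries; the mapping itself is never written to.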

However, mmapping files which can be modified concurrently by another process is tricky: if the file is truncated underneath you, touching a now-missing page can get the process killed with SIGBUS.
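
One mitigation is to trap SIGBUS around accesses to the mapping, e.g. (single-threaded sketch under those assumptions; a real editor would need to be much more careful):

    #include <setjmp.h>
    #include <signal.h>
    #include <stddef.h>

    static sigjmp_buf jump_env;

    static void on_sigbus(int sig) {
        (void)sig;
        siglongjmp(jump_env, 1);
    }

    /* Returns 0 on success, -1 if the page vanished (e.g. the
       backing file was truncated by another process). */
    int read_mapped_byte(const char *map, size_t off, char *out) {
        struct sigaction sa;
        sa.sa_handler = on_sigbus;
        sigemptyset(&sa.sa_mask);
        sa.sa_flags = 0;
        sigaction(SIGBUS, &sa, NULL);

        if (sigsetjmp(jump_env, 1))
            return -1;          /* jumped here from the handler */

        *out = map[off];        /* may fault if the file shrank */
        return 0;
    }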


>> I think for such large files you really need an editor that doesn't buffer the entire file into RAM and loads chunks in from disk on demand

It seems vim could work this way, since you're only ever dealing with a small section of the text at a time. However, that doesn't appear to be how it actually works.



