I was involved in some work about five years ago where I had to keep a
small number of files (about 5) open all the time for examination as
needed. Each was 100-500 GB.
I opened them all in vim instances running in the background at login time,
and then I read email until they were all loaded. I then put the process
with the file I needed into the foreground, examined it, and returned to
the shell with :stop (which all command-line editors should have) when I
was done. Worked like a charm.
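For concreteness, that workflow with ordinary shell job control looks
roughly like this (file names invented for illustration):

    $ vim /data/huge1.log &     # one vim per file, started at login
    $ vim /data/huge2.log &
      ... read email while they load ...
    $ fg %1                     # foreground the vim holding huge1.log
      ... examine the file, then suspend vim with :stop ...
    $ fg %2                     # and on to the next file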
On Fri, Jul 18, 2025, 7:46 PM Larry McVoy <lm(a)mcvoy.com> wrote:
On Fri, Jul 18, 2025 at 06:57:03PM -0400, Jeff Johnson wrote:
> Actually, Xvi is now maintained at
> https://codeberg.org/martinwguy/xvi.
> I'll bow out of the editor discussion now, since I think we're pretty
> far from the original topic.
I can pull it back to something potentially useful. As I mentioned,
years and years ago, on small-memory machines, I used to vi
$BIG_ASS_LOGFILE, and because editors tend to malloc each line, the vi
session got really slow (it started swapping) once the file was bigger
than roughly half of memory.
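The conventional pattern looks roughly like this (a generic sketch of
the shape, not any particular vi's code; error checks elided):

    /* Conventional editor loading: every line gets its own malloc'd,
     * NUL-terminated copy, so memory use is roughly file size plus
     * per-allocation overhead -- which is what starts the swapping
     * once the file approaches memory size. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    char **load_lines(FILE *fp, size_t *nlines)
    {
        char **lines = NULL;
        size_t cap = 0, n = 0;
        char buf[8192];     /* long lines get split; fine for a sketch */

        while (fgets(buf, sizeof buf, fp) != NULL) {
            if (n == cap) {                 /* grow the line table */
                cap = cap ? cap * 2 : 1024;
                lines = realloc(lines, cap * sizeof *lines);
            }
            lines[n++] = strdup(buf);       /* one malloc per line */
        }
        *nlines = n;
        return lines;
    }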
My changes were to teach the string library, and whatever else operated
on a line of the file, to treat \n the same way you would treat \0.
Then you change the code that read in the file to just mmap() it.
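A minimal sketch of those two pieces (a reconstruction of the idea, not
my actual diff; the names are made up): map the whole file read-only,
and give the line routines a length helper that stops at \n the way
strlen() stops at \0, so lines are just pointers into the mapping and
nothing is ever copied:

    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <fcntl.h>
    #include <stddef.h>
    #include <unistd.h>

    /* Like strlen(), but a "string" ends at '\n' (or end of buffer). */
    size_t linelen(const char *p, const char *end)
    {
        const char *q = p;
        while (q < end && *q != '\n')
            q++;
        return (size_t)(q - p);
    }

    /* Map a file read-only; returns the base address, NULL on error. */
    const char *map_file(const char *path, size_t *len)
    {
        struct stat st;
        void *p;
        int fd = open(path, O_RDONLY);

        if (fd < 0)
            return NULL;
        if (fstat(fd, &st) < 0) {
            close(fd);
            return NULL;
        }
        p = mmap(NULL, (size_t)st.st_size, PROT_READ, MAP_PRIVATE,
                 fd, 0);
        close(fd);              /* the mapping survives the close */
        if (p == MAP_FAILED)
            return NULL;
        *len = (size_t)st.st_size;
        return p;
    }

The kernel then pages the file in on demand, so "opening" a 500 GB log
is effectively instant and you only pay for the pages you touch.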
I don't think I went so far as to make changes to the file work; I'm
not sure, it may have just worked, but I was looking at log files and I
don't think I modified them.
I'm pretty sure the answer is no; the laptop I'm typing on has 64 GB
and I suspect everyone else's is the same. But can anyone imagine a use
case where having a vi that could read really large files (and quickly,
with no parsing or mallocing of each line) would be useful?
I'm pretty done with programming, but if someone said "here is an
important use case where that would help" I'd go find those changes and
see if I can port them forward.
--
Larry McVoy                  Retired to fishing
http://www.mcvoy.com/lm/boat