How to edit multi-gigabyte text files? Vim doesn't work =( [closed]


Are there any editors that can edit multi-gigabyte text files, perhaps by only loading small portions into memory at once? It doesn't seem like Vim can handle it =(

If you are on *nix (and assuming you only have to modify parts of the file, and rarely), you can split the file (using the split command), edit the pieces individually (using awk, sed, or something similar), and concatenate them when you are done:
cat file2 file3 >> file1 
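A minimal sketch of that whole workflow (file names, chunk size, and the edit itself are illustrative; `sed -i` is the GNU form, on macOS you would need `sed -i ''`):

```shell
# Split a big file into 4000-line chunks, edit one chunk in place,
# then concatenate the chunks back together in order.
seq 1 10000 > bigfile.txt                 # stand-in for the big file
split -l 4000 bigfile.txt chunk_          # makes chunk_aa, chunk_ab, chunk_ac
sed -i 's/^42$/FORTY-TWO/' chunk_aa       # GNU sed in-place edit of one chunk
cat chunk_* > bigfile.edited.txt          # chunk names sort in order
```

Because the chunk suffixes (`aa`, `ab`, `ac`, ...) sort lexicographically, the glob reassembles the file in the original order.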



Ctrl-C will stop file load.

If the file is small enough, you may have been lucky enough to load all the contents and only kill the post-load steps.

Verify that the whole file has been loaded when using this tip.
Vim can handle large files pretty well.

I just edited a 3.4GB file, deleting lines, etc.

Three things to keep in mind:
  1. Press Ctrl-C: Vim tries to read in the whole file initially, to do things like syntax highlighting, counting the lines in the file, etc.

    Ctrl-C will cancel this enumeration (and the syntax highlighting), and it will only load what's needed to display on your screen.
  2. Readonly: Vim will likely start in read-only mode when the file is too big for it to make a file copy to perform the edits on.

    I had to :w! to save the file, and that's when it took the most time.
  3. Go to line: Typing :115355 will take you directly to line 115355, which is much faster than paging through such large files.

    Vim seems to start scanning from the beginning every time it loads a buffer of lines, and holding down Ctrl-F to scan through the file seems to get really slow near the end of it.
Note: if your Vim instance is read-only because you hit Ctrl-C, it is possible that Vim did not load the entire file into the buffer.

If that happens, saving it will only save what is in the buffer, not the entire file.

You might quickly check with a G to skip to the end to make sure all the lines in your file are there.
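One quick way to double-check from outside Vim is to compare line counts: count the lines on disk, then press G in Vim (or run :echo line('$')) and see that the last line number matches. A sketch with an illustrative file:

```shell
# Count the lines the file has on disk; compare this number with the
# line number Vim shows when you press G at the end of the buffer.
seq 1 5000 > sample.log        # stand-in for the big file
wc -l < sample.log
```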


It may be plugins (syntax highlighting, folds, etc.) that are causing it to choke.

You can run vim without plugins:

vim -u "NONE" hugefile.log 
It's minimalist, but it will at least give you the vi motions you're used to.

syntax off 
is another obvious one.

Prune your install down and source what you need.

You'll find out what it's capable of and if you need to accomplish a task via other means.



You might want to check out this VIM plugin, which disables certain vim features in the interest of speed when loading large files.


I've tried to do that, mostly with files around 1 GB when I needed to make some small change to an SQL dump.

I'm on Windows, which makes it a major pain.

It's seriously difficult.

The obvious question is "why do you need to?" From experience, having had to try this more than once, you probably really want to find another way.

So how do you do it? There are a few ways I've done it.

Sometimes I can get vim or nano to open the file, and I can use them.

That's a really tough pain, but it works. When that doesn't work (as in your case), you only have a few options.

You can write a little program to make the changes you need (for example, search & replaces).

You could use a command-line program that may be able to do it (maybe it could be accomplished with sed/awk/grep/etc.?).

If those don't work, you can always split the file into chunks (something like split being the obvious choice, but you could use head/tail to get the part you want), edit the part(s) that need it, and recombine later.

Trust me though: try to find another way.
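The advantage of sed here is that it streams: it reads and writes one line at a time, so memory use stays flat no matter how big the file is. A sketch (file names and the pattern are made up for illustration):

```shell
# Stream-edit a large file without ever loading it into memory:
# sed processes one line at a time and writes the result to a new file.
seq 1 100000 > dump.sql.txt                       # stand-in for a big SQL dump
sed 's/^99999$/REPLACED/' dump.sql.txt > dump.fixed.txt
```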


A slight improvement on the answer given by @Al pachio with the split + vim solution: you can read the files in with a glob, effectively using file chunks as a buffer, e.g.
$ split -l 5000 myBigFile    # produces xaa, xab, xac, ...
$ vim xa*                    # edit the files
:nw                          # skip forward and write
:n!                          # skip forward and don't save
:Nw                          # skip back and write
:N!                          # skip back and don't save
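One thing worth noting with this approach: split's default output names (xaa, xab, ...) sort lexicographically, so a glob reassembles the pieces in the original order once you're done editing. A quick sketch of the round trip, with an illustrative file:

```shell
# Split a file into 5000-line pieces (default names xaa, xab, ...),
# then reassemble with a glob; the rebuilt file is byte-identical.
seq 1 12000 > myBigFile.demo
split -l 5000 myBigFile.demo
cat xa* > myBigFile.rebuilt
```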


Wow, I never managed to get vim to choke, even with a GB or two.

I've heard that UltraEdit (on Windows) and BBEdit (on Macs) are even better suited to larger files, but I have no personal experience.


In the past I have opened files of up to 3 GB with this tool.


I have used TextPad for large log files; it doesn't have an upper limit.


I'm using vim 7.3.3 on Win7 x64 with the LargeFile plugin by Charles Campbell to handle multi-gigabyte plain text files.

It works really well. I hope you come right.


I think it is reasonably common for hex editors to handle huge files.

On Windows, I use HxD, which claims to handle files up to 8 EB (8 billion gigabytes).


Personally, I like UltraEdit.

Here is their little spiel on large files.


I've used FAR Commander's built-in editor/viewer for super-large log files.


The only thing I've been able to use for something like that is my favorite Mac hex editor, 0XED.

However, that was with files that I considered large at tens of megabytes.

I'm not sure how far it will go.

I'm pretty sure it only loads parts of the file into memory at once, though.


In the past I've successfully used a split/edit/join approach when files get very large.

For this to work you have to know approximately where the to-be-edited text is in the original file.
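A sketch of that split/edit/join approach using head and tail, assuming the text to edit sits near line 10000 (all file names and line numbers are illustrative):

```shell
# Splice an edit into a known region without opening the whole file:
# everything before and after the window is copied through untouched.
seq 1 20000 > big.txt                                # stand-in for a huge file
head -n 9999 big.txt > part1                         # lines 1..9999
sed -n '10000,10010p' big.txt \
  | sed 's/^10005$/PATCHED/' > middle                # edit only the window
tail -n +10011 big.txt > part3                       # lines 10011..end
cat part1 middle part3 > big.patched.txt
```

Only the 11-line window ever gets touched by the edit step; the join with cat restores the original line count.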
