Description
https://community.notepad-plus-plus.org/topic/25604/big-files-cannot-opened-in-v8-6-4-x84
Steps to Reproduce (STR)
https://community.notepad-plus-plus.org/post/93959
- prepare a test file bigger than 2GB but smaller than 4GB
- in N++ Preferences > Performance > "Define Large File Size:", set a size bigger than the file size (e.g. for a 3GB test file, set the maximum of 4096 MB)
- try to open that test file
Expected Behavior
The test file is opened.
Actual Behavior
N++ shows a FileLoadingException message box with error code 1 (Scintilla).
Debug Info
Tested with N++ v8.6.5 x64.
Because of the setting described above, N++ does not use the SC_DOCUMENTOPTION_TEXT_LARGE option, which is required for any 2GB+ file.
This leads to Scintilla throwing in CellBuffer::Allocate:
https://community.notepad-plus-plus.org/post/93963
Side note 1
Maybe rephrase the Preferences label "Define Large File Size:" to "Define Large File Size Threshold:".
Side note 2
It is probably time to hardcode SC_DOCUMENTOPTION_TEXT_LARGE for any file size in the x64 N++ builds.
(In practice, 32-bit x86 N++ will not open files larger than approx. 850 MB, so the option is irrelevant there.)
Here is my previous proposal #11047 (comment), which was rejected #11047 (comment).
I did a quick memory-consumption test for N++ v8.6.5 x86 & x64 Release builds.
PB = private bytes, WS = working set; each cell shows the values for an empty "new 1" tab and after loading the test file.
| Notepad++ v8.6.5 | vanilla (PB/WS) | hardcoded SC_DOCUMENTOPTION_TEXT_LARGE (PB/WS) | file |
|---|---|---|---|
| x64 Release | 63.2/67.6 MB - 2.57/2.59 GB | 57.1/61.1 MB - 2.57/2.58 GB | empty "new 1" - 2046 MB txt-file |
| x86 Release | 40.9/53.3 MB - 914.0/939.2 MB | 41.0/53.4 MB - 914.0/939.4 MB | empty "new 1" - 781 MB txt-file |
So there seems to be no reason to leave this potentially buggy situation in the N++ code: one can open a smaller, non-large file and then paste data into it that crosses the 2GB threshold, which will lead to an N++ crash later. From the Community reports, it is clear that users are already routinely working with such large files nowadays.