Large 35 MB EDI text file not displaying correctly and data gets messed up after HEX conversion
-
I have a large EDI text file that is about 35 MB. When I open the file, the hidden delimiter is displayed as “FS” with a black background. As I scroll about 3/4 of the way down, the “FS” changes to “ FS”, with a new space appearing before the delimiter. This causes the data to be displayed incorrectly. When I convert the data to HEX, the delimiter “1C” is valid and I see no other special characters, so typically I would replace the “1C” hex with “0D 0A” hex (CR/LF) to make the file more readable, since it is EDI data. But when I do so, the data gets messed up. I’m running 64-bit Windows 7, and when I use an editor like UltraEdit I have no issue, so I know the data itself is good. There must be a bug here for large files. I don’t have the problem with smaller EDI files (1-10 MB).
What could be causing this? -
Some buffer is obviously overflowing… I have observed similar, sporadically reproducible behavior myself…
I’d suggest opening a bug issue and posting a link to the file:
https://github.com/notepad-plus-plus/notepad-plus-plus/issues
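In the meantime, to rule out the data itself, you could do the 1C → 0D 0A replacement outside the editor, e.g. with a small Python script. This is just a sketch: the file names are placeholders, and it assumes 0x1C is the only segment terminator in the file.

    # Replace every FS byte (0x1C) with CR/LF (0x0D 0x0A), working in binary
    # so no text-mode or encoding layer can alter the bytes.
    with open("large_edi_file.txt", "rb") as src:          # placeholder input name
        data = src.read()                                   # ~35 MB fits in memory
    count = data.count(b"\x1C")
    with open("large_edi_file_crlf.txt", "wb") as dst:      # placeholder output name
        dst.write(data.replace(b"\x1C", b"\r\n"))           # FS -> CR/LF
    print(f"Replaced {count} FS bytes with CR/LF")

If the converted copy then displays fine, that further points at a display/replace problem in the editor with large files rather than at the data.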