Large TXT files

Discussion in 'Computing and Networks' started by nitko12, Oct 28, 2015.

  1. nitko12

    Thread Starter New Member

    Oct 14, 2015
    12
    0
    I calculated sqrt(5) using y-cruncher, and now I can't find any program to open the result (it's about 50 GB of text).
    Does anyone know a program that can open such massive TXT files?
     
  2. joeyd999

    AAC Fanatic!

    Jun 6, 2011
    2,692
    2,756
    For what purpose?

    Most text editors load the whole file into memory, so the size is limited either by the editor's code or by your system RAM.

    You can write your own C code to extract the characters sequentially from the file for viewing (i.e., print them to the display).

    But I cannot think of a single instance where, or why, one would want to view (and potentially edit) 50GB of text data at one time.
     
  3. joeyd999

    AAC Fanatic!

    Jun 6, 2011
    2,692
    2,756
    Also, if you don't mind working at the command line, you should be able to 'type' (MS-DOS) or 'cat' (Unix/Linux) the file (I don't know whether MS-DOS's 'type' command has a size limit).
     
  4. nitko12

    Thread Starter New Member

    Oct 14, 2015
    12
    0
    I don't understand a word you said, but is there any program that can open it simply by clicking on it? I don't know a thing about C code or anything to do with programming.
     
  5. Alec_t

    AAC Fanatic!

    Sep 17, 2013
    5,804
    1,105
    It's going to take an awfully long time to view the whole contents of the file! :). Just don't try printing it out ;).
     
  6. joeyd999

    AAC Fanatic!

    Jun 6, 2011
    2,692
    2,756
    You still haven't said what you want to do with the data. We are talking 53,687,091,200 characters. Do you just want to look at them (do you have enough days left in your life?), do you want to select a reasonable subset of them, or do you want to process them somehow?

    Processing a text file this size is not trivial.
     
  7. DerStrom8

    Well-Known Member

    Feb 20, 2011
    2,428
    1,328
    Very roughly speaking, 5,000 characters in a text file take about 5 kB (4.88 KiB, to be precise). There are 1,048,576 KiB in one GiB, so your 50 GB text file holds more than 52,000,000,000 characters. How many of those do you really need?

    EDIT: @joeyd999 you beat me to it!
     
  8. joeyd999

    AAC Fanatic!

    Jun 6, 2011
    2,692
    2,756
    Linux/Unix also has a 'split' command:

    split --bytes=1M /path/to/file/textfile.txt /path/to/file/prefixForfiles

    This will split the large file into roughly 51,200 individual 1 MiB files that you could then view one at a time in any capable text editor. Some editors are limited to under 64 kB, so:

    split --bytes=50K /path/to/file/textfile.txt /path/to/file/prefixForfiles

    would give you just over a million individual 50 KiB files.

    Have fun.
     
  9. joeyd999

    AAC Fanatic!

    Jun 6, 2011
    2,692
    2,756
    He could send them to a random overseas fax...
     
  10. Papabravo

    Expert

    Feb 24, 2006
    10,157
    1,795
    Not one that I know of or can verify will work. Until recently we never even had hard drives that large, let alone a text file that big. My best guess is that Notepad++ might be able to do the job.

    https://notepad-plus-plus.org/
     
  11. dl324

    Distinguished Member

    Mar 30, 2015
    3,250
    626
    Is the data one line with 50G characters?

    If not, 'sed' was designed to operate on arbitrarily large files one line at a time.

    If it's one very long line, you could try 'dd' and specify a number of bytes to read at a time; each subsequent block would need a byte offset. That could be very slow without preprocessing the data. You could also use 'dd' to partition the data into files of a more manageable size...
     
  12. BR-549

    Well-Known Member

    Sep 22, 2013
    2,004
    394
    Comprehensive tax reform.

    It's not meant to be opened.
     