Reading large (10+GB) files in PHP Command Line

William Attwood wattwood at
Tue Sep 1 17:43:03 MDT 2009

  A few times I've needed to load a very large text file and split it on
newlines.  I didn't want to read the file in fixed-size chunks and end up
cutting strings mid-line, only to have to stitch them back together with the
next chunk of input.  After some searching, I came up with the following,
which may help some of you out there:

ini_set('memory_limit', '1000M'); // Large file to process

# Stream into PHP from STDIN
$inputArray = array();
do {
        do {
                $selection = trim(fgets(STDIN));
        } while ($selection == '' && !feof(STDIN));
        $selection = str_replace('\'', '', $selection);
        $inputArray[] = $selection;
        if (count($inputArray) == 8000) {
                processBatch($inputArray); // stands in for your DB-insert function
                $inputArray = array();
        }
} while ($selection != 'q' && !feof(STDIN));

This takes in each line from STDIN, adds it to an array, and when the array
hits 8000 entries (my memory limit at the time) it sends the array to a PHP
function that processes it and inserts it into the DB I'm using.
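For reference, here's the same batching idea as a runnable sketch against an
in-memory stream (so you can try it without a real log file).  flushBatch()
is a hypothetical stand-in for whatever DB-insert routine you use; here it
just records how many rows each flush would insert:

```php
<?php
// Hypothetical stand-in for the real DB-insert routine; real code would
// run one multi-row INSERT per batch instead of recording counts.
$flushed = array();
function flushBatch(array $batch) {
    global $flushed;
    $flushed[] = count($batch);
}

// Any stream works here: STDIN, a file handle, or (for this demo) an
// in-memory stream holding 25 lines.
$fh = fopen('php://memory', 'r+');
fwrite($fh, implode("\n", range(1, 25)) . "\n");
rewind($fh);

$batchSize  = 10;       // 8000 in the original post
$inputArray = array();
while (($line = fgets($fh)) !== false) {
    $line = trim($line);
    if ($line === '') {
        continue;       // skip blank lines
    }
    $inputArray[] = str_replace('\'', '', $line);
    if (count($inputArray) == $batchSize) {
        flushBatch($inputArray);
        $inputArray = array();
    }
}
if ($inputArray) {
    flushBatch($inputArray);  // don't lose the leftover partial batch
}
fclose($fh);
```

With 25 lines and a batch size of 10, that flushes two full batches and one
leftover batch of 5.  The final partial-batch flush is worth keeping in mind;
without it, anything after the last full 8000 never reaches the DB.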

Just in case anyone needs to process large files, stream them in:

# more file.log | php process.php

Take care,
William Attwood
Idea Extraordinaire
wattwood at

"I'm willing to admit that I may not always be right, but I am never wrong."
- Samuel Goldwyn

More information about the PLUG mailing list