I have some huge files, for example 3-9 GB each. My problem is that when I use
$size = filesize($file);
then I get really high I/O usage. It almost kills my Apache server. Is there any other method for this?
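(For reference, filesize() itself is only a stat() call and never reads the file's contents, so the heavy I/O almost certainly comes from the piece-hashing loop below rather than from this one line. The real filesize() pitfall with files this big is that 32-bit PHP builds overflow above 2 GB. A minimal sketch of a large-file-safe lookup, assuming a Unix host with GNU stat; the helper name large_filesize is made up:)

// Sketch: large-file-safe size lookup. Assumes a Unix host with GNU
// stat on the PATH; the function name is illustrative, not a standard API.
function large_filesize( $file ) {
    $size = @filesize( $file );
    if ( $size !== false && $size >= 0 )
        return $size; // fits in a native PHP integer
    // 32-bit fallback for files over 2 GB: ask the OS directly and keep
    // the result as a numeric string so it cannot overflow.
    $out = trim( shell_exec( 'stat -c%s ' . escapeshellarg( $file ) ) );
    return ctype_digit( $out ) ? $out : false;
}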
private function pieces ( $handle, $piece_length, $last = true ) {
    // $piece and $length are static so a partially filled buffer can
    // carry over between calls when hashing a multi-file torrent.
    static $piece, $length;
    if ( empty( $length ) )
        $length = $piece_length;
    $pieces = null;
    while ( ! feof( $handle ) ) {
        // Append up to $length bytes to the buffer; once a full piece is
        // buffered, hash it (self::pack() is expected to SHA-1 the piece
        // and reset the buffer, as in the Torrent class this is based on).
        if ( ( $length = strlen( $piece .= fread( $handle, $length ) ) ) == $piece_length )
            $pieces .= self::pack( $piece );
        elseif ( ( $length = $piece_length - $length ) < 0 )
            return self::set_error( new Exception( 'Invalid piece length!' ) );
        sleep( 1 ); // crude throttle: a full one-second pause after every read
    }
    fclose( $handle );
    // Hash whatever is left in the buffer as the final, shorter piece.
    return $pieces . ( $last && $piece ? self::pack( $piece ) : null );
}
This is my code that tries to get the file size and build the piece hashes. I run a file-sharing website and would like to generate a torrent file for each uploaded file, but when I generate the torrent this function eats all my HDD bandwidth. (I would like to use torrents because a browser download can be interrupted, whereas a torrent download can be resumed.)
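One direction that might reduce the load (a sketch under assumptions, not the Torrent class's own code): read in full-piece chunks, hash each piece with sha1(), and sleep only as long as needed to stay under a target read rate, instead of a flat sleep(1) after every fread(). The function name hash_pieces_throttled and the 10 MB/s default budget are made-up examples:

// Sketch: piece hashing with a soft read-rate cap. The name and the
// default 10 MB/s budget are illustrative assumptions.
function hash_pieces_throttled( $path, $piece_length, $bytes_per_sec = 10485760 ) {
    $handle = fopen( $path, 'rb' );
    if ( $handle === false )
        return false;
    $pieces = '';
    $piece  = '';
    while ( ! feof( $handle ) ) {
        $start = microtime( true );
        $chunk = fread( $handle, $piece_length - strlen( $piece ) );
        if ( $chunk === false )
            break;
        $piece .= $chunk;
        if ( strlen( $piece ) == $piece_length ) {
            $pieces .= sha1( $piece, true ); // raw 20-byte digest per piece
            $piece   = '';
        }
        // Sleep just long enough to keep reads under the byte budget.
        $quota   = strlen( $chunk ) / $bytes_per_sec;
        $elapsed = microtime( true ) - $start;
        if ( $elapsed < $quota )
            usleep( (int) ( ( $quota - $elapsed ) * 1000000 ) );
    }
    fclose( $handle );
    if ( $piece !== '' )
        $pieces .= sha1( $piece, true ); // final, shorter piece
    return $pieces;
}

Even throttled, hashing a 9 GB file takes minutes, so moving this into a background CLI job (for example started with nice and ionice -c3, which puts it in the idle I/O class) instead of running it inside the Apache request is probably the bigger win.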