This is my init.php, which is loaded across the whole website:
$suid = 0;
// Note: $_SERVER['HTTP_HOST'] is client-supplied; a hard-coded domain
// would be safer as the cookie domain
session_set_cookie_params(60, '/', '.' . $_SERVER['HTTP_HOST'], true);
session_save_path(getcwd() . '/a/');

// $_SESSION only exists after session_start(), so start first,
// then initialize the session data once
session_start(['cookie_lifetime' => 60]);
if (!isset($_SESSION['id'])) {
    $_SESSION['id'] = session_id();
    $_SESSION['start'] = date('d_m_Y_H_i');
    $_SESSION['ip'] = $_SERVER['REMOTE_ADDR'];
} elseif (isset($_SESSION['uid'])) {
    $suid = $_SESSION['uid'];
}
I'm currently testing PHP sessions, so I set the lifetime to just 60 seconds for now.
I was wondering why sessions were being created at all, since nobody knows the domain yet, so I started saving the client IP. I looked one of those IPs up and it turned out to be the Google crawler bot. Since there are more search engines and bots out there, I don't want to store these crawls in my session files and fill up my webspace with them.
So my questions are:
1) Even after the test lifetime (60 seconds) has expired, the session file remains in the custom directory. I read this happens because I set a custom directory. Is this true?
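From what I've read so far, expired files are only removed by PHP's session garbage collector, which runs probabilistically (session.gc_probability / session.gc_divisor) on session_start() and honors session.gc_maxlifetime; the 60 seconds in my code only affect the cookie, not the file. If that is right, something like this should at least make old files in the custom directory collectable (the values are just my test settings):

```php
// Assumption: these must be set before session_start().
// Files older than gc_maxlifetime become eligible for deletion the
// next time the gc "lottery" triggers (here: 1 in 100 requests).
ini_set('session.gc_maxlifetime', 60);  // seconds, matching my test value
ini_set('session.gc_probability', 1);
ini_set('session.gc_divisor', 100);
session_save_path(getcwd() . '/a/');
session_start();
```

One caveat I came across: Debian/Ubuntu packages ship with gc_probability set to 0 and clean the default save path via a system cron, which would never touch my custom directory.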
2) What would be an efficient way to delete all unused/expired session files? Should I add a $_SESSION['last_activity'] timestamp and have a cronjob scan my custom directory, read the session file data, and delete the expired sessions?
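If a cronjob is the way to go, my idea would be a sketch like the following, relying on the file modification time instead of a last_activity value inside the session data (PHP rewrites the file on every session write, so mtime should already reflect the last activity). The path and threshold are just my test values, and the script name is made up:

```php
<?php
// Hypothetical cleanup script (cleanup_sessions.php), run via cron.
// Deletes session files in my custom dir whose last modification
// is older than the lifetime.
$dir = __DIR__ . '/a';   // my custom session_save_path
$maxLifetime = 60;       // seconds, matching my test value

foreach (glob($dir . '/sess_*') as $file) {
    if (filemtime($file) < time() - $maxLifetime) {
        unlink($file);
    }
}
```

That way the cron wouldn't even need to parse the session data.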
3) Should I avoid saving those unneeded sessions from crawlers by looking for the string "bot" inside $_SERVER['HTTP_USER_AGENT'] (not HTTP_HOST, which only holds the requested domain), or is there a better way to identify "non-human visitors"/crawlers?
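My current idea for 3) would be to check the User-Agent before starting a session at all. This is only a sketch; the pattern list is my own guess, and since the header is client-supplied it can only catch crawlers that identify themselves honestly:

```php
// Rough check: skip session creation for self-identified crawlers.
// HTTP_USER_AGENT is client-supplied, so this only catches honest bots.
function isProbablyBot(): bool
{
    $ua = $_SERVER['HTTP_USER_AGENT'] ?? '';
    return $ua === ''
        || preg_match('/bot|crawl|spider|slurp/i', $ua) === 1;
}

if (!isProbablyBot()) {
    session_start(['cookie_lifetime' => 60]);
}
```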
I also appreciate any improvements/suggestions to my code at the top. I previously caused some Internal Server Errors because session_start() was called too often, as far as I can tell from the php-fpm slow logs.
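Regarding the repeated session_start() calls, I guess guarding with session_status() would at least prevent a double start if init.php ever gets included more than once:

```php
// Only start a session if none is active yet; safe when init.php
// is included multiple times in one request.
if (session_status() === PHP_SESSION_NONE) {
    session_start(['cookie_lifetime' => 60]);
}
```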