My site uses a PHP function to resize images server-side so the browser doesn't have to do it:
function resizeImg($img, $newW, $newH, $rotateTrue) {
    // image resize code, then
    return base64_encode($image_data);
}
where the function is called like this:
<img src="data:image/png;base64,<?= resizeImg('myImg', $newWidth, $newHeight, 1); ?>" alt="pic">
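For context, here is a minimal sketch of what such a function might look like using GD. This is an assumption about the implementation, not my actual code: the PNG input, the filename convention, and the fixed 90° rotation are all placeholders.

```php
function resizeImg($img, $newW, $newH, $rotate) {
    // Load the source image (assuming a PNG next to the script)
    $src = imagecreatefrompng($img . '.png');

    // Resample onto a canvas of the requested dimensions
    $dst = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0,
                       $newW, $newH, imagesx($src), imagesy($src));

    // Optional rotation (assumed to be a fixed 90° here)
    if ($rotate) {
        $dst = imagerotate($dst, 90, 0);
    }

    // Capture the PNG bytes and base64-encode them for the data URI
    ob_start();
    imagepng($dst);
    $image_data = ob_get_clean();

    imagedestroy($src);
    imagedestroy($dst);
    return base64_encode($image_data);
}
```

The important part for this question is the last step: whatever the resize library, the binary output ends up base64-encoded inline in the HTML rather than served as a separate, cacheable image file.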
No problems so far. However, the returned image data is sizeable: when I inspect the element in Chrome, each resized image contributes more than 1 kB of base64-encoded data to the page.
This is fine for a scattered image here and there, but what about a huge table? For example, one table on my site has more than 1,500 rows with one image per row. Each image is currently resized by the browser, which I know is not ideal. However, if I resized every one of those 1,500 images with my server-side function, I would add roughly this much encoded data to the table:

1500 images × ~1.1 kB of encoded data per image = 1650 kB (about 1.6 MB)
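The figure above is just a straight multiplication; as a sanity check:

```php
// Back-of-envelope estimate of the extra inline payload for the table
$rows = 1500;       // images, one per row
$perImageKB = 1.1;  // observed base64 size per resized image
$totalKB = $rows * $perImageKB;
echo round($totalKB) . " kB\n"; // 1650 kB, i.e. about 1.6 MB of markup
```

And that 1.6 MB is pure markup, downloaded on every page load unless the HTML itself is cached.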
For a high-performance site, which should I give more weight to: bandwidth savings (i.e., resize client-side) or saving the browser the resize work (i.e., resize server-side)?