Achieving "extreme" bandwidth savings for web browsing with a compression proxy

I have a network connection where I pay per megabyte, so I'm interested in reducing my bandwidth usage as far as possible while still having a reasonably good browsing experience. I use this wonderful extension (https://bandwidth-hero.com/). This extension runs an image-compression proxy on my Heroku account that accepts image URLs and returns a low-quality version of those images. This reduces bandwidth usage by 30-40% when images are loaded.

To further reduce usage, I typically browse with both JavaScript and images disabled (there are various extensions for doing this in firefox/firefox-esr/google-chrome). This has the added bonus of blocking most ads (since they usually need JavaScript to run).

For daily browsing, the most efficient solution is using a text-mode browser such as elinks/lynx/links2 in a virtual console, running over ssh (with zlib compression) on a VPS. But sometimes JavaScript becomes necessary, as sites will not render without it. Elinks is the only text-mode browser that even tries to support JavaScript, and even that support is quite rudimentary. When I have to go back to using firefox/chrome, I find my bandwidth usage shooting up. I would like to avoid this.

I find that bandwidth is used partially to get the 'raw' HTML files of the sites I'm browsing, but more often for the associated .js/.css files. These are typically highly compressible. On my local workstation, html+css+javascript files typically compress by a factor of more than 10x when using lzma(2) compression.
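That kind of ratio is easy to sanity-check with Python's stdlib `lzma` module; the repeated snippet below is just a stand-in for a typical repetitive .js asset, not a real file from my machine:

```python
import lzma

# A stand-in for a typical .js asset: real JS/CSS is similarly repetitive.
js_text = ("function add(a, b) { return a + b; }\n" * 500).encode()

compressed = lzma.compress(js_text, preset=9)
ratio = len(js_text) / len(compressed)
print(f"{len(js_text)} -> {len(compressed)} bytes ({ratio:.1f}x)")
```

On highly repetitive text like this the ratio comes out well above 10x; real pages vary, but the order of magnitude matches what I see locally.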

It seems to me that one way to drastically reduce bandwidth consumption would be to use the same template as the bandwidth-hero extension, i.e. run a compression proxy either on a VPS or on my Heroku account, but do so for text content (.html/.js/.css).

Ideally, I would like to run a compression proxy on my local machine. When I open a site (say www.stackoverflow.com), the browser should send a request to this local proxy. This local proxy then sends a request to a back-end running on heroku/vps. The heroku/vps back-end actually fetches all the content, and compresses it (lzma/bzip/gzip). The compressed content is sent back to my local proxy. The local proxy decompresses the content and finally gives it to the browser.
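A minimal sketch of the local half of this in Python, assuming a hypothetical backend endpoint that fetches `?url=...` and returns the body lzma-compressed, flagged with a made-up `X-Backend-Encoding: lzma` header (both the endpoint and the header name are illustration only, not an existing service):

```python
import lzma
import urllib.parse
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical backend that fetches ?url=... and returns lzma-compressed bytes.
BACKEND = "https://my-compressor.example.herokuapp.com/fetch"

def decode_backend_response(payload: bytes, headers: dict) -> bytes:
    """Undo the backend's compression before handing the bytes to the browser."""
    if headers.get("X-Backend-Encoding") == "lzma":
        return lzma.decompress(payload)
    return payload  # backend sent it uncompressed

class LocalProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # The browser is configured to use this proxy, so self.path holds the
        # target URL; pass it to the backend as a query parameter.
        target = urllib.parse.quote(self.path, safe="")
        with urllib.request.urlopen(f"{BACKEND}?url={target}") as resp:
            body = decode_backend_response(resp.read(), dict(resp.headers))
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run locally: HTTPServer(("127.0.0.1", 8080), LocalProxy).serve_forever()
```

A real version would also need to forward response content types, handle POST, and probably speak the proxy protocol properly (CONNECT for https), but this is the shape of the idea.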

There is something like this mentioned in this answer (https://stackoverflow.com/a/42505732/10690958) for node.js. I am thinking of doing the same in Python.

From what Google searches show, HTTP clients can "automatically" ask for gzip versions of pages (via the Accept-Encoding request header). But does this also apply to the associated files that are loaded by JavaScript, and to the CSS files? Perhaps what I am thinking about is already implemented by default?
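The negotiation can be demonstrated end to end with the stdlib alone; this is a self-contained sketch using a throwaway local server that gzips the body only when the client sends Accept-Encoding: gzip:

```python
import gzip
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"body { color: #333; font-family: sans-serif; }\n" * 200  # fake .css

class GzipAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Compress only if the client opted in via Accept-Encoding.
        body = PAGE
        self.send_response(200)
        if "gzip" in self.headers.get("Accept-Encoding", ""):
            body = gzip.compress(PAGE)
            self.send_header("Content-Encoding", "gzip")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), GzipAwareHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/style.css"

# urllib sends "Accept-Encoding: identity" by default -> full-size body.
plain = urllib.request.urlopen(url).read()

# Opt in to gzip explicitly; urllib does NOT auto-decompress, so do it by hand.
req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(req) as resp:
    raw = resp.read()
    body = gzip.decompress(raw) if resp.headers.get("Content-Encoding") == "gzip" else raw

server.shutdown()
print(f"identity: {len(plain)} bytes, gzip on the wire: {len(raw)} bytes")
```

As for the question itself: browsers send Accept-Encoding (gzip, deflate, and nowadays often br) on every request, including .css fetches and requests triggered by JavaScript, so for servers that support it this part is indeed already done by default. A proxy could still do better than gzip with lzma/brotli on its own leg.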

Any pointers would be welcome. I was thinking of writing a local proxy in Python, as I am reasonably fluent in it. But I know little about Heroku or the intricacies of HTTP.

Thanks.

Update: I found a possible solution here: https://github.com/barnacs/compy, which does almost exactly what I need (minify + compress with brotli/gzip + transcode jpeg/gif/png). It uses Go instead of Python, but that does not really matter. It also has a Docker image here: https://hub.docker.com/r/andrewgaul/compy/ . Since I'm not very familiar with Heroku, I can't figure out how to use this to run the compression proxy service on my account. The Heroku docs also weren't of much help to me. Any pointers would be welcome.

dsjk3214 (nearly 2 years ago):
Hmm, interesting. Even though it hurts speed, lzma(2) still has one advantage over gzip. Also, the files needed to display a page (e.g. www.stackoverflow.com) usually have a lot of 'correlation' between them: some text strings will be repeated across several .js/.css files. I'm not sure, but I think compressing 'all' the files a page needs together should give a smaller output than compressing each file separately.
douzi6992 (nearly 2 years ago):
If you look at the 'Network' tab while browsing, you've probably already seen Content-Encoding: gzip on the .css and .js files (I don't see it on images, though, and wonder what's going on with them).
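dsjk3214's point about compressing a page's files together rather than separately is easy to check; in this sketch the three strings are stand-ins for assets from one site that share many substrings:

```python
import lzma

# Fake assets from one page that repeat many of the same strings,
# as a site's html/css/js usually do.
js = b"document.querySelector('.header').classList.add('active');\n" * 50
css = b".header .active { color: #333; font-family: sans-serif; }\n" * 50
html = b"<div class='header active'>loading...</div>\n" * 50

separately = sum(len(lzma.compress(f)) for f in (js, css, html))
together = len(lzma.compress(js + css + html))
print(f"separate streams: {separately} bytes, one solid stream: {together} bytes")
```

One solid stream pays the container/header overhead once and can reuse matches across file boundaries, so it comes out smaller than three separate streams.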