I am building a video surveillance app, for learning purposes.
At the moment I have a very basic prototype working.
I am capturing webcam frames and sending them to a browser over a WebSocket.
Just for some context on how basic the implementation is, on the client side I have:
```html
<img id="frame" src="">
```
I am appending the frames to it via JavaScript.
The app and the server are written in Go.
Although what I have works, I know sending individual frames to a browser is never going to cut it if I want to view the stream from outside my home network.
I have to optimize the process.
So far I have only thought of compressing the cam frames before sending them to the client. I have also considered encoding the frames to video and implementing a proper video player on the client side.
What else can be done?
I know the question is broad, but I assume there must be a standard optimization pattern for video streaming.