weixin_39674190 2020-11-21 01:33

Local messages store

This is considered a priority-one missing major feature:

- Email providers supported by the app get DDoSed quite often, especially recently. You then can't send or receive new messages, but it would be nice to at least have access to your already locally stored messages.
- You don't want to be vendor-locked. You want to be able, at any time, to at least process your messages on your own, for example by exporting them all into the unencrypted .eml format.
- With email messages stored locally, we could enhance an email provider's feature set by adding things it doesn't support yet, for example full-text search for ProtonMail, or batch export.

There are potentially many options for getting email message content, including at least:

- Using Service Workers. Considered too specific and limited an approach.
- Listening for email view actions happening in the email provider's UI. Also a very limited option. The good part is that it would work in a fully passive mode, producing no additional requests to the email provider's backend. But in JavaScript it's not always possible to listen for/intercept all the needed actions, especially if the code is implemented with the intention of preventing interception. Besides, with this option we can only get the content of email messages that have been explicitly opened/viewed by the user. It is the simplest option to implement, though.
- Calling backend endpoints directly (REST, WebSocket, etc.). The most flexible option. The email providers' endpoint APIs are not yet well documented, but I believe they can be researched to a sufficient extent. A good side benefit of this option is that, over time, the message-fetching logic can be moved out of the app code into an individual module, like has been done with the modules published here, so developers could use that module to build their own programs.
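
To illustrate the "individual module" idea from the last option, here is a hedged sketch of a provider-agnostic contract such an extracted module could expose. All names (`EmailProviderClient`, `fetchMessagePortion`, `RawMessage`) are hypothetical, not the real ProtonMail/Tutanota endpoints; the in-memory class exists only to demonstrate the contract.

```typescript
// Hypothetical provider-agnostic client interface; names are illustrative,
// not the actual ProtonMail/Tutanota API.
export interface RawMessage {
    id: string;
    createTime: number;
    raw: string; // original message blob as returned by the provider backend
}

export interface EmailProviderClient {
    // Returns up to `portionSize` messages created strictly after `afterCreateTime`,
    // sorted by creation time ascending.
    fetchMessagePortion(afterCreateTime: number, portionSize: number): Promise<RawMessage[]>;
}

// In-memory stand-in used here only to demonstrate the contract.
export class FakeProviderClient implements EmailProviderClient {
    constructor(private readonly messages: RawMessage[]) {}

    async fetchMessagePortion(afterCreateTime: number, portionSize: number): Promise<RawMessage[]> {
        return this.messages
            .filter((m) => m.createTime > afterCreateTime)
            .sort((a, b) => a.createTime - b.createTime)
            .slice(0, portionSize);
    }
}
```

A real implementation behind the same interface would do the authenticated REST/WebSocket calls, while consumers of the module stay unaware of the specific provider.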

At the initial stage it's not going to be a comprehensive bridge-like thing, but more of a thing supporting the email provider's web UI. The initial implementation is not going to keep locally stored messages in sync with the server/actual state. That means it would be a one-off put-to-local-cache action, with no further message state updating (message became unread, got removed, changed folder/label, etc.).

A brief description of the workflow steps:

- The local messages cache can be enabled per account. It is disabled by default.
- With it enabled, the app starts a background message-fetching job that runs at some interval, let's say every 60 minutes. Additionally, if the backend puts a rate limiter on top of the endpoints, the app delays individual requests accordingly.
- When producing a fetch request to the backend, the job adds at least a lastFetchedEmailItemCreateTime-like parameter, a portion size, and a sort order parameter, depending on the email provider's API endpoint format. The lastFetchedEmailItemCreateTime parameter would be sent as the portion-start request parameter, so the app doesn't fetch the same messages twice. If we went with fetching all the messages during each job iteration, we could keep locally cached messages in sync with the server/actual state, but a more optimal approach would be a message-state-change queue: the app, as a client, gets notified about a message state change and patches the locally saved message accordingly. That can't be implemented without control over the backend, but it looks like at least Tutanota has it implemented already (see EntityEventBatchTypeRef entities fetching).
- The local store would be built using some encrypted embedded SQLite-like database. The encryption key would be automatically generated and stored in the already used encrypted settings.bin file.
- Email providers differ in supported features, but at this stage I'd prefer to keep a single set of local-messages-store table columns. So it would be a single table with a compound primary key = (id, emailProviderType = "protonmail | tutanota", login). The app would also store, in raw form, the original message blob provided by the email provider's backend.
- If needed, getting and storing attachments can be added later as an individual background job.
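
One iteration of the fetch job described above could be sketched as follows. This is a minimal sketch under assumed names: `fetchPortion`, `saveToLocalStore`, and the `requestDelayMs` knob are placeholders, not actual app functions, and the real job would also handle errors and persist `lastFetchedEmailItemCreateTime` between runs.

```typescript
interface FetchedMessage {
    id: string;
    createTime: number;
    raw: string;
}

type FetchPortion = (afterCreateTime: number, portionSize: number) => Promise<FetchedMessage[]>;

// Runs a single job iteration: pages through messages newer than the last
// fetched one, in portions, saving each portion and advancing the cursor so
// the same messages are never fetched twice. Returns the number of messages fetched.
async function runFetchIteration(
    fetchPortion: FetchPortion,
    saveToLocalStore: (messages: FetchedMessage[]) => Promise<void>,
    state: { lastFetchedEmailItemCreateTime: number },
    portionSize = 100,
    requestDelayMs = 0, // honor a backend rate limiter, if any
): Promise<number> {
    let fetched = 0;
    for (;;) {
        const portion = await fetchPortion(state.lastFetchedEmailItemCreateTime, portionSize);
        if (!portion.length) break;
        await saveToLocalStore(portion);
        // advance the cursor to the newest message of this portion
        state.lastFetchedEmailItemCreateTime = portion[portion.length - 1].createTime;
        fetched += portion.length;
        if (portion.length < portionSize) break; // no more messages on the server
        if (requestDelayMs) await new Promise((resolve) => setTimeout(resolve, requestDelayMs));
    }
    return fetched;
}
```

The outer scheduling (the "each 60 minutes" interval) would simply call `runFetchIteration` from a timer.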

This question comes from the open-source project vladimiry/ElectronMail.


12 replies

  • weixin_39674190 2020-11-21 01:33

    I was exploring existing Node.js-compatible embedded databases with built-in encryption support and have not really found solutions that keep metadata encrypted as well. That means it's possible to encrypt the values of specific columns/fields, but the metadata remains unencrypted. Metadata is, for example, information like how many rows the database has, what the column set is, which cells are empty/filled, etc. In some cases such information can also be considered sensitive.

    So I'm considering the following approach. The app keeps all the messages in memory. The app flushes these messages to an encrypted file at some interval and probably on certain triggers. This file would be a single brick of bytes, fully encrypted including metadata, similar to how settings.bin is currently saved (it's a very small file, though). When the app starts, it loads the saved file into memory and the process continues as described above. At this stage attachments are not going to be saved, so it's not going to be a very memory-consuming thing. Later it would be possible to introduce an encrypted binary file storage and fetch files into it in a scheduled manner.
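
    The in-memory store with a serialize/restore round-trip could look roughly like this. A minimal sketch with illustrative names: the encryption of the serialized bytes is deliberately left to the caller, so the file on disk stays one opaque blob with no readable metadata.

```typescript
// Sketch of the in-memory messages store; the caller encrypts the bytes
// returned by serialize() before flushing them to disk, and decrypts the
// file contents before passing them to deserialize() on app start.
class InMemoryMessageStore {
    private readonly messages = new Map<string, object>();

    put(id: string, message: object): void {
        this.messages.set(id, message);
    }

    get(id: string): object | undefined {
        return this.messages.get(id);
    }

    // Serialize the full state (rows AND structure) into one byte blob.
    serialize(): Buffer {
        return Buffer.from(JSON.stringify([...this.messages]), "utf8");
    }

    static deserialize(blob: Buffer): InMemoryMessageStore {
        const store = new InMemoryMessageStore();
        for (const [id, message] of JSON.parse(blob.toString("utf8"))) {
            store.messages.set(id, message);
        }
        return store;
    }
}
```

    Because the whole state is serialized as one unit, nothing about row counts or column sets is observable in the encrypted file.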

    The database encryption key would be generated once and stored in settings.bin. A feature for changing this key can be added later, along with re-encrypting the database with the new key. The encryption algorithm is going to be AES-256-CBC with a randomly generated IV on every file saving.
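
    The described file encryption can be sketched with Node's built-in crypto module: AES-256-CBC, a fresh random 16-byte IV on every save, IV prepended to the ciphertext. The function names are illustrative; in the real app the 32-byte key would come from settings.bin.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

// Encrypt the serialized store; a new random IV is generated on every save
// and prepended to the ciphertext so decryption can recover it.
function encryptBlob(key: Buffer, plaintext: Buffer): Buffer {
    const iv = randomBytes(16);
    const cipher = createCipheriv("aes-256-cbc", key, iv);
    return Buffer.concat([iv, cipher.update(plaintext), cipher.final()]);
}

function decryptBlob(key: Buffer, blob: Buffer): Buffer {
    const iv = blob.subarray(0, 16); // IV stored in the first 16 bytes
    const decipher = createDecipheriv("aes-256-cbc", key, iv);
    return Buffer.concat([decipher.update(blob.subarray(16)), decipher.final()]);
}
```

    Since the IV is random per save, encrypting the same store contents twice yields different files, so an observer can't even tell whether the data changed between flushes.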

