Let me give you a bit of context: we are building a mailing client that will have many connected Nylas accounts. A large number of connected accounts might trigger the API rate limits, and we want to avoid hitting them since they would make the experience worse for users.
Question 1: Is it a good idea to store the emails and other data for a user in a database like MongoDB (say, the last 30-90 days when the user onboards) and have it serve all fetch requests directly? Would it be a good idea to build this as a wrapper service that communicates with the rest of our services and provides the information?
Question 2: Should we be using MongoDB or Postgres? Is there any difference in terms of performance?
Hi @mayank - welcome to the Nylas forums, and pumped to help you!
A few items we need to clarify:
Regarding the rate limits on the API: does this refer to the Nylas Email API, the underlying provider APIs (e.g. the Gmail API), or both? Have a look at the rate limits section of the Nylas docs to see whether it helps clarify your question further.
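Independent of which layer is rate limiting you, a common client-side way to soften the impact is to retry rate-limited (HTTP 429) responses with exponential backoff and jitter. Here's a minimal, generic sketch of the delay schedule — not Nylas-specific; the function name and defaults are illustrative assumptions:

```python
import random

def backoff_delays(max_retries=5, base=1.0, cap=60.0):
    """Yield delays (seconds) to sleep between retries of a request
    that came back rate limited (HTTP 429).

    Uses "full jitter": each delay is a random value between 0 and the
    capped exponential, which spreads retries from many accounts so
    they don't all hammer the API at the same moment.
    """
    for attempt in range(max_retries):
        yield random.uniform(0, min(cap, base * (2 ** attempt)))
```

A caller would loop over `backoff_delays()`, sleeping for each delay before retrying, and give up once the generator is exhausted. If the API response includes a `Retry-After` header, honoring that value directly is usually better than the computed delay.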
Storage really depends on the application's use case, and on whether there is a need to keep the information (e.g. parts of an email or the entire email) for extended periods of time. Do keep in mind that storing the data takes extra effort, since it will need to be kept in sync via webhooks or API calls. It's worth exploring what type of information you actually need to store and why.
In terms of the database, this is also an application decision. Our data should work in either scenario: both Postgres and MongoDB can store JSON as-is, if you are referring to storing the API responses unmodified.
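For example, MongoDB stores a response dict directly as a document, while Postgres can hold it in a `jsonb` column and still let you index and query inside it. A hypothetical table and row-mapping sketch (the schema, column names, and the `grant_id` field are assumptions for illustration):

```python
import json

# Hypothetical Postgres schema for caching raw API responses.
# A jsonb column keeps the full payload queryable without committing
# to a fixed relational schema up front.
CREATE_TABLE = """
CREATE TABLE IF NOT EXISTS cached_messages (
    id       text PRIMARY KEY,
    account  text NOT NULL,
    payload  jsonb NOT NULL
);
"""

def to_row(response: dict) -> tuple:
    """Map a raw API response dict onto an insertable (id, account, payload) row."""
    return (response["id"], response["grant_id"], json.dumps(response))
```

Performance-wise, both handle this pattern well at typical mailbox scale; the bigger differentiator is usually operational (what your team already runs, indexing needs, transactional guarantees) rather than raw speed.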