MySQL vs PostgreSQL -- Choose the Right Database for Your Project

Vovk

Your process should accumulate multiple records into a temporary buffer and run a single query with all records at once on regular intervals.
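A minimal sketch of that buffering pattern in Python, using the stdlib `sqlite3` module as a stand-in for a MySQL/Postgres driver; the table name, record shape, and flush threshold are invented for the demo:

```python
import sqlite3

# sqlite3 stands in for a MySQL/Postgres connection here; the idea is the
# same: buffer records in the process, write them in one multi-row query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE api_log (endpoint TEXT, status INTEGER)")

buffer = []       # temporary in-process buffer
FLUSH_SIZE = 500  # flush threshold (tune for your workload)

def record(endpoint, status):
    """Accumulate one record; flush when the buffer is full."""
    buffer.append((endpoint, status))
    if len(buffer) >= FLUSH_SIZE:
        flush()

def flush():
    """Write all buffered records in a single executemany call."""
    if buffer:
        conn.executemany("INSERT INTO api_log VALUES (?, ?)", buffer)
        conn.commit()
        buffer.clear()

for _ in range(1200):
    record("/users", 200)
flush()  # flush the remainder (in production, also flush on a timer)

print(conn.execute("SELECT COUNT(*) FROM api_log").fetchone()[0])  # 1200
```

In a real service the final `flush()` would also run on a regular interval (and on shutdown), so records never sit in the buffer for long.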

Raja K Thaw

Should I prefer PostgreSQL-XL (a free download), since it offers mixed workloads plus columnar storage, over proprietary MPP databases? I am not sure how hard it would be to convert everything to columnar in PostgreSQL. I think for MySQL we would need to resort to MariaDB ColumnStore.

Thor-x86

Personally, I use either of those databases depending on the framework currently in use. For example:
Laravel (PHP) => MySQL
Django (Python) => PostgreSQL

MindOpener

1,000 records/sec is nothing for either MySQL or Postgres. Latency and response time will start to change depending on the size of the rows you’re inserting, how many indexes you have, and whether you store text or JSON.
MySQL (using LOAD DATA INFILE) can load 100,000+ rows per second.
It all depends on what your write needs are (referential integrity, critical transactions). If you need to store data for audit purposes, MySQL and Postgres might not be the best solution; I would look into MemSQL, which was built for fast ingestion and analytical queries.
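To make the bulk-load point concrete: MySQL’s LOAD DATA INFILE (and Postgres’s COPY) stream a whole file into a table in one statement instead of one INSERT per row. A rough sketch of the same pattern in Python, with stdlib `sqlite3` standing in for a real server and an in-memory CSV instead of a file on disk (the table and data are made up):

```python
import csv
import io
import sqlite3

# An in-memory CSV stands in for the input file of LOAD DATA INFILE / COPY.
csv_data = io.StringIO("alice,30\nbob,25\ncarol,41\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")

# One executemany call per file beats one round-trip per row, for the same
# reason LOAD DATA INFILE beats row-at-a-time inserts.
conn.executemany("INSERT INTO people VALUES (?, ?)", csv.reader(csv_data))
conn.commit()

print(conn.execute("SELECT COUNT(*), MAX(age) FROM people").fetchone())  # (3, 41)
```

With a real MySQL or Postgres driver you would reach for the native fast path (LOAD DATA INFILE or COPY) rather than executemany when the file is large.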

I’m sure Uber did their homework. You don’t switch from one DB to another just because one guy prefers Postgres.
Research and testing (benchmarks) are a must. There are lots of biased benchmarks out there that probably don’t match your workload and table structure.

Matthew Kennedy

Rails => PostgreSQL

Peter Labos

If I understand “logging API queries” properly, this should avoid the DB completely. Indexing this stuff in real time is not really a good choice. The best option is to save it to the filesystem and then install and use some log processing/searching software or a SaaS API: send the files to a different company to process them, and they give you all the search options, even full-text ones.
If it is something with more sensitive data and less volume, use some message queue system.
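The “write logs to the filesystem, index them later” advice can be sketched as appending one JSON object per line, which log shippers and search tools (or a SaaS ingestion API) can tail and index offline. A minimal Python version; the logger name, fields, and temp-file path are invented for the demo:

```python
import json
import logging
import tempfile

# Placeholder path for the demo; a real service would use a fixed log
# directory with rotation (e.g. logging.handlers.RotatingFileHandler).
log_path = tempfile.NamedTemporaryFile(suffix=".log", delete=False).name

logger = logging.getLogger("api")
logger.setLevel(logging.INFO)
logger.addHandler(logging.FileHandler(log_path))

def log_query(method, path, status):
    # JSON Lines: trivially appendable now, trivially parseable later.
    logger.info(json.dumps({"method": method, "path": path, "status": status}))

log_query("GET", "/users/42", 200)
log_query("POST", "/orders", 201)

with open(log_path) as f:
    lines = [json.loads(line) for line in f]
print(len(lines), lines[0]["path"])  # 2 /users/42
```

Nothing here touches the database on the hot path; the indexing cost is deferred to whatever tool later consumes the files.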

Peter Labos

Haven’t checked those two services. But I personally would also use some “browser-local DB/storage” to move some workload to the customers’ computers, like saving steps (for undo actions). To limit requests, you could save their data to the server at intervals, or only when they switch to a different file in the project, or only on demand (when they hit the save button).

And about that “hundreds of thousands of concurrent users”: start with 100, then 1,000 users, and only selected registered testers, before you shut down your server with that number of users, or before you pay hundreds or thousands of dollars to your server provider for auto-scaling servers while not earning a penny on it.

And good luck with the project.

Saif Ul Islam

Very interesting article. My friends and I were just discussing this today, so this article was definitely very helpful. Thank you!

Deepak Bharti

Try to do batch operations.

ReverseAds

Very comprehensive article. I have to admit that many times I have started a project and then, once the project reached its potential, started to think: why didn’t I select a different database from the beginning and speed things up?

milan sonkar

One of the major disadvantages of Postgres, apart from what is mentioned here, is that it does not support Transparent Data Encryption (TDE), which is a must-have feature for storing sensitive information.