Is MySQL good for large databases?
I work for a company and we are always accessing an external site for information. The site was developed by an antiquated software development company that doesn't even have a website of its own. They pretty much have a monopoly in my state, since the content provider for the database only uses this extremely dysfunctional site to upload their data. The problem with this website is that it is so slow it's not even functional.
Having controlled for things like connection speed and browser type, it is clear that the problem lies within the website itself. So, I am thinking about redoing the site and then offering it to the content provider as a means for uploading their data. Basically, this project requires a very large database to store hundreds of thousands of names, addresses, and other types of data.
My only experience with databases is MySQL, and really my only experience with dynamic content is PHP. So, yeah, I'm trying to figure out whether the old PHP + MySQL combination is suitable for storing and presenting large amounts of data. I have only done this on small projects, but I think the usual approach of HTML templates with placeholders for the dynamic content would work fine.
Of course, I truly don't know why this website is so slow. Maybe it's not the DB at all. Maybe it's the server or something else. But the key thing I am trying to accomplish is to improve the speed and functionality of this site. I have no experience with other types of databases, so any tips or advice you can offer for a project like this would be greatly appreciated. Also helpful would be any tips on how to make a fast, functional site that has to present dynamic data from an extremely large database.
Edit: I am learning Python, so if you think that would be a better server-side scripting language, I can certainly try to implement something different from the initial plan above.
With a good design you can have very large databases in MySQL (this other question may help you). Define your keys and indexes properly, optimize your queries (EXPLAIN is your friend here), and select a good storage engine.
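As a minimal sketch of that advice (the table, column names, and connection details are assumptions, not taken from the site in question), adding an index and checking it with EXPLAIN from PHP/PDO might look like this:

```php
<?php
// Hypothetical "people" table holding the names and addresses mentioned in the question.
$pdo = new PDO('mysql:host=localhost;dbname=directory;charset=utf8mb4', 'user', 'pass');

// Index the column you search on most often (last_name is an assumed example).
$pdo->exec('ALTER TABLE people ADD INDEX idx_last_name (last_name)');

// EXPLAIN shows whether MySQL actually uses that index for a given query.
$plan = $pdo->query("EXPLAIN SELECT id, first_name, last_name
                     FROM people WHERE last_name = 'Smith'")
            ->fetchAll(PDO::FETCH_ASSOC);
print_r($plan); // check the "key" and "rows" columns of the plan
```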
There are a lot of things to do to get the best from MySQL.
EDIT: some more ideas...
The way you structure your data and tables is also quite important: design for fast writes, for fast reads, or find a compromise between the two, depending on the use case.
If the data grows, another option is to use a cluster, or to partition your data across several MySQL databases on several machines (using a master-slave scheme, for instance).
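One way to apply the master-slave idea from PHP, assuming one master and one read replica (the hostnames, credentials, and table are made up for this sketch), is to route writes and reads to different connections:

```php
<?php
// Writes go to the master; replication copies them to the slave(s).
$master  = new PDO('mysql:host=db-master;dbname=directory', 'user', 'pass');
$replica = new PDO('mysql:host=db-replica;dbname=directory', 'user', 'pass');

$stmt = $master->prepare('INSERT INTO people (first_name, last_name) VALUES (?, ?)');
$stmt->execute(['Jane', 'Doe']);

// Heavy read traffic (searches, listings) is served from the replica.
$rows = $replica->query("SELECT * FROM people WHERE last_name = 'Doe'")
                ->fetchAll(PDO::FETCH_ASSOC);
```

Keep in mind that a replica can lag slightly behind the master, so reads that must see the latest write should still go to the master.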
To improve performance, you should also think about using some kind of cache for repetitive queries.
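As a rough sketch of that idea, assuming the APCu extension is available (the key scheme and TTL are arbitrary), a repetitive query can be cached in memory like this:

```php
<?php
// Return cached rows for a query if we have them; otherwise hit MySQL and cache the result.
function cachedQuery(PDO $pdo, string $sql, int $ttl = 300): array
{
    $key  = 'q:' . md5($sql);
    $rows = apcu_fetch($key, $found);
    if ($found) {
        return $rows;                          // served from memory, no database round trip
    }
    $rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);
    apcu_store($key, $rows, $ttl);             // keep the result for a few minutes
    return $rows;
}
```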
There are also tools like Vitess that help scale a MySQL database in a similar way to NoSQL databases.
If PHP/MySQL can scale to Facebook's usage, it can scale to yours. It's not necessarily the best solution for all situations, nor is scaling always a simple task.
You only really start to hit MySQL's limits in the millions of rows (and you can still push it further if really needed). You might take a look at MySQL's spin-offs, or, if you are dealing with billions of rows, look at something like Cassandra.
Performance-wise, PHP isn't the fastest, but it should be fast enough. Use background tasks/cron jobs for the time-consuming work, and a lot of caching.
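A sketch of the cron-job idea (the script path, schedule, and table names are assumptions): precompute an expensive aggregate on a schedule so page views read a small summary table instead of scanning everything.

```php
<?php
// rebuild_summary.php -- run from cron, e.g.:  0 * * * *  php /var/www/scripts/rebuild_summary.php
$pdo = new PDO('mysql:host=localhost;dbname=directory', 'user', 'pass');

// Rebuild a small summary table that the site reads directly.
$pdo->exec('TRUNCATE TABLE city_counts');
$pdo->exec('INSERT INTO city_counts (city, total)
            SELECT city, COUNT(*) FROM addresses GROUP BY city');
```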