Can a dacpac be used to manage databases with a large volume of data?
Our current database is nearly 200 MB, but once the application goes live we expect it to grow significantly, perhaps to 20-30 GB of data.
We are planning to use a dacpac (generated by a database project in SSDT) to deploy to the production server. The database will be created with a number of tables and a lot of initial data in lookup tables.
However, the concern is for future deployments, when we will use the dacpac generated by the SSDT database project to upgrade the database on the production server.
As I have no past experience of using dacpacs for deployments, can anyone please advise on the following:
And lastly, is it better than the traditional approach of manually writing SQL scripts to upgrade the database?
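For context, upgrading a database from a dacpac is typically done with SqlPackage.exe rather than hand-written scripts. Below is a minimal sketch of such a publish command; the server, database and file names are placeholders, and it assumes SqlPackage.exe is on the PATH and Windows authentication is used. The two properties shown keep the default protection against accidental data loss explicit and take a backup before applying changes.

    REM Publish a dacpac to the target database (Windows command prompt).
    REM Server, database and file names below are placeholders.
    SqlPackage.exe /Action:Publish ^
        /SourceFile:"MyDatabase.dacpac" ^
        /TargetServerName:"ProdServer" ^
        /TargetDatabaseName:"MyDatabase" ^
        /p:BlockOnPossibleDataLoss=True ^
        /p:BackupDatabaseBeforeChanges=True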
In my experience, the volume of data does have an impact when deploying a dacpac. How much longer the deployment takes will depend on what changes the dacpac applies across your database. My only advice here is to test with larger volumes of data to gauge the increase in time; it may be minimal.
All our objects are stored in a SQL Server Data Tools (SSDT) Visual Studio project, which is version controlled in TFS, so when we do a build after additional check-ins it creates a new version for us.
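As a rough illustration of that build step (assuming a hypothetical project file named MyDatabase.sqlproj and that the SSDT build targets are installed on the build machine), building the project is what produces the versioned dacpac:

    REM Build the SSDT project; the dacpac is written to the output folder,
    REM e.g. bin\Release\MyDatabase.dacpac (project name is a placeholder).
    msbuild MyDatabase.sqlproj /p:Configuration=Release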
This can depend on the type of updates you are applying, and whether you wish to invest the time to understand what it would take to roll back each schema update.
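One way to understand what each upgrade will do, and what rolling it back would involve, is to have SqlPackage generate the upgrade script for review instead of applying it directly. A minimal sketch, again with placeholder names:

    REM Generate the T-SQL upgrade script for review without deploying it.
    SqlPackage.exe /Action:Script ^
        /SourceFile:"MyDatabase.dacpac" ^
        /TargetServerName:"ProdServer" ^
        /TargetDatabaseName:"MyDatabase" ^
        /OutputPath:"upgrade.sql"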
I like using dacpacs and find it very useful to have all your SQL objects hosted within a single Visual Studio project. Going the manual route increases the chance that you forget to include one or more patches, given the number of changes required.