How to create a DACPAC file?

In this post, I will explain what a DACPAC is and how to create one.
In the previous post in the DevOps series, I described how to create a new, empty database project in Visual Studio.
One of the ways to import an existing database (from a server) was to import it from a DACPAC file.

The question is:

What is a DACPAC?

DACPAC = Data-tier Application Package (a DAC package)

A DACPAC is a single file that contains the database model, i.e. all the files representing database objects. It’s a binary representation of a database project, compatible with SSDT.
The name comes from the file extension. But if you change the extension from *.dacpac to *.zip, you will be able to jump inside and check what is in there.
This is an example of my file:

Content of a DACPAC file

Hence, a DACPAC file contains all the information about the database model that is needed to recreate it. Let’s take a look inside:
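You don’t even have to rename the file by hand: since the package is a regular ZIP archive, Python’s zipfile module can open it directly. A minimal sketch (the file path is just an example):

```python
import zipfile

def list_dacpac_contents(path):
    """A DACPAC is a ZIP archive - list the entries stored inside it."""
    with zipfile.ZipFile(path) as package:
        return package.namelist()

# Typical entries include model.xml (the database model),
# DacMetadata.xml, Origin.xml and [Content_Types].xml.
```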





Note: a DACPAC does not contain DATA, nor any server-level objects. The file can contain all object types that can be kept in an SSDT project.

How to create a DACPAC file?

There are a few options:

  • Use SSMS
  • Use Visual Studio
  • Use Visual Studio and create a snapshot of the database project (SSDT)

Creating DACPAC using SSMS

SQL Server Management Studio is a very popular and handy tool. You can generate DACPAC file using this tool, although I do not recommend it. Why? I will explain it in a minute.
Let’s take a look at how to do that.
Open SSMS, right-click a database (in Object Explorer) and go to: Tasks -> Extract Data-tier Application…

SSMS: Tasks and Extract Data-tier Application

Watch out. There are many options similar to each other. Use:
– Extract Data-tier Application – to create DACPAC
– Export Data-tier Application – to create BACPAC (more details in next post)
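The same Extract/Export distinction exists in the SqlPackage command-line tool: /Action:Extract produces a DACPAC, /Action:Export a BACPAC. A minimal Python sketch that builds such a command line (server and database names are placeholders, and it assumes SqlPackage is available on the PATH):

```python
def sqlpackage_command(action, server, database, target_file):
    """Build a SqlPackage command line.
    Extract -> DACPAC (schema), Export -> BACPAC (schema + data)."""
    if action not in ("Extract", "Export"):
        raise ValueError("action must be 'Extract' or 'Export'")
    return [
        "SqlPackage",
        f"/Action:{action}",
        f"/SourceServerName:{server}",
        f"/SourceDatabaseName:{database}",
        f"/TargetFile:{target_file}",
    ]

# e.g. sqlpackage_command("Extract", "localhost", "MyDb", "MyDb.dacpac")
# and then run the list with subprocess.run(...)
```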

Then you will be asked for the target location of the file.
If you are lucky – everything goes smoothly.
If not – you might encounter many errors, in which case the process fails and generates no file.
For example:
Error SQL71561: Error validating element [dbo].[xyzView]: View: [dbo].[xyzView] has an unresolved reference to object [refDb].[dbo].[someTable].

SSMS: When an extract DAC file goes wrong…

Usually, there is no chance to fix the issues on the server and repeat the action, because:
* it is a production environment – don’t touch anything without a proper deployment process;
* depending on the number of issues – it might take ages, and you need your DACPAC now.
That’s why I do not recommend generating the file from SSMS.
What other options do we have?

Creating DACPAC using Visual Studio

This option is the most robust. Although the final file (DACPAC) will be the same in the end – Visual Studio offers more options than SSMS. One of the most important is the ability to turn off “Verify extraction”, which in SSMS is enabled (and hidden) and leads to the failures described above. By disabling that option you will avoid the issue, and it doesn’t matter whether the database contains errors or not – the DAC package file will be generated.
Open the extract window via the menu: View -> SQL Server Object Explorer, select the required database, right-click it and choose Extract Data-tier Application…

Visual Studio: Extract Data-tier Application

Then set the options as shown below:

More options with extract DAC in Visual Studio – SQL Server Data Tools

Here, you have many more options.

Extract schema only

As before – it generates the model only.

Extract schema and data

This option allows you to include the data as well. Be careful! Your DACPAC file might unexpectedly grow to a huge size and the process can take a long time. Use this option wisely.
Wait a minute… data? The data, you said? Yes! But hold your horses. Personally, I have never used this option, and it does have limitations. One of them:

Extract failed

Extract failed. One or more unsupported elements were found in the schema used as part of a data package.
Error SQL71564: Foreign key Foreign Key: [Production].[FK_ProductProductPhoto_ProductPhoto_ProductPhotoID] cannot reference table Table: [Production].[ProductPhoto] when used as part of a data package.

DACPAC file contains the data in BCP format

Once the extraction succeeds – the DAC package file contains extra folders with BCP files:
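Because the package is still a ZIP archive, it is easy to check programmatically whether a given DACPAC carries data, simply by looking for BCP entries. A small sketch (the folder layout in the comment is illustrative):

```python
import zipfile

def dacpac_contains_data(path):
    """Return True if the package holds table data,
    i.e. any BCP files (e.g. under Data/<schema>.<table>/)."""
    with zipfile.ZipFile(path) as package:
        return any(name.lower().endswith(".bcp")
                   for name in package.namelist())
```

This is worth checking up front, because a package containing data cannot be imported into a database project.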

Select tables…

Despite the name, the tree shows not only tables but also other objects like views, stored procedures, etc. This option is only available when “Extract schema and data” is chosen.
All objects are grouped by schema:

Visual Studio: Choose the tables to extract data

Verify extraction

This option validates the objects in the extracted DAC package, so you could end up with the same errors as in SSMS and, as a result, no file would be generated. That’s not what we expect. Especially when planning to use the extract for import into Visual Studio as a database project – I recommend verifying all objects there instead.
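If you script the extraction with SqlPackage instead of clicking through the wizard, the same checkbox corresponds to the VerifyExtraction property, which you can turn off explicitly. A sketch (server, database and file names are placeholders):

```python
def extract_command(server, database, target_file, verify=False):
    """SqlPackage Extract command line, with the 'Verify extraction'
    behaviour controlled via the VerifyExtraction property."""
    return [
        "SqlPackage",
        "/Action:Extract",
        f"/SourceServerName:{server}",
        f"/SourceDatabaseName:{database}",
        f"/TargetFile:{target_file}",
        f"/p:VerifyExtraction={verify}",
    ]
```

With verify=False the package is generated even if the database contains unresolved references, mirroring the Visual Studio behaviour described above.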

Creating snapshot of database project (VS/SSDT)

SSDT: Create Snapshot Project

You might want to create a snapshot of the database model at a specific point in time. It’s a convenient method for keeping a history of releases when your company hasn’t introduced fully automated DevOps processes yet. Please do not confuse it with a database snapshot – this is something different. As the outcome you get a DACPAC file of the current version of the database model. Simply right-click the database project (in Visual Studio) and choose “Snapshot Project”.

It will create a “Snapshots” folder for you (if it doesn’t exist yet) and save a DACPAC file with a timestamp.




As I described in the previous post in this series (New DB project with SSDT and import existing database), using a DACPAC file is one of the methods of importing objects into an empty SSDT database project. Normally, a DAC package shouldn’t contain data. And even though extracting from Visual Studio gives us that option – you can’t import such a file into a database project. The attempt ends with the message:

This operation is not supported for packages containing data


A Data-tier Application (DAC) is a very powerful tool. You can create a new database project by importing from the file. Having that, you can use the DACPAC file to deploy changes onto a target database server, or to compare changes between the file and another file or a physical database. What’s more important: you can build a whole CI/CD process on top of it. Believe me or not – since I found this way of keeping database code – I cannot imagine managing it without one. Also, I completely understand the pain of those who have to manage the code of MySQL or PostgreSQL databases. They don’t have a counterpart of DACPAC.

Thanks for reading and please leave a comment if you have any questions or own experiences with DACPAC files.

Updated @ 08/11/2019
This post is now part of a whole series about SSDT.


About author

Kamil Nowinski

Blogger, speaker. Data Platform MVP, MCSE. Senior Data Engineer & data geek. Member of Data Community Poland, co-organizer of SQLDay, happy husband & father.



  1. John
    January 15, 18:50

    Is there a way to specifically select which tables, sps, functions, and views go into the dacpac?

  2. Chinari
    March 02, 16:59

    I need to deploy MYSQL. For MSSQL we have dacpac. What abt for MYSQL
    Please reply if anyone knows

  3. Raghu
    June 26, 17:15

    We have SSDT project for DB and we are doing deployment through Azure DevOps pipeline on target database, when we are altering the table columns type and table has 6 million records that time deployment takes more than 1 hour. How can we reduce deployment time in that scenario?

    • Kamil Nowinski
      July 19, 07:52

      1 hour sounds pretty long even for (only) 6 million records. If it’s Azure – you can scale up the database before deployment. Sometimes it’s possible to replace automatically created ALTER by manually prepared in pre or post-deployment script. Can you share the ALTER statement?

