
KevinChant

Extracting the schema of an existing SQL database in Fabric to perform CI/CD

In this post I cover extracting the schema of an existing SQL database in Fabric to perform CI/CD using Azure DevOps, as per the below diagram.

Extracting an existing database schema of a SQL database in Fabric

This approach is ideal when you want updates performed in one SQL database in Fabric replicated to another on a regular basis, without the need to maintain a database project.

Just a quick recap, SQL database in Fabric is a new offering that is currently in public preview. You can read more about it in a post I wrote that covers spreading your SQL Server wings with SQL database in Fabric.

To manage expectations, this post covers performing CI/CD for SQL databases in Fabric using classic pipelines. In other words, the GUI-based Classic Editor for the build (CI) and the Releases feature for the deployment (CD). I provide plenty of links along the way.

If you need help with any jargon used in this post, then I recommend that you read one of my other posts, which is a Microsoft Fabric Git integration jargon guide.

Prerequisites

For the benefit of this post, I specify Microsoft-hosted agents so that you can all mimic the setup quickly. However, I recommend configuring a self-hosted agent in the long term.

Plus, for the benefit of this post I configured an Azure Pipelines variable group, which is why you can see variables such as $(SourceConnString) used in the pipelines. I recommend storing sensitive values in Azure Key Vault and linking a variable group to it going forward.
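As a minimal sketch of how a script step might consume such a variable, the snippet below checks that a secret has actually reached the step before it is used. It assumes the classic pipeline maps $(SourceConnString) into an environment variable named SOURCE_CONN_STRING for the script; that mapping and the variable name are illustrative, not part of the original setup.

```shell
# Sketch of a script step consuming a secret pipeline variable.
# Assumption: $(SourceConnString) is mapped into SOURCE_CONN_STRING
# by the pipeline configuration for this script step.
STATUS="missing"
if [ -n "${SOURCE_CONN_STRING:-}" ]; then
  STATUS="present"
fi
# Never echo the secret itself; only report whether it arrived.
echo "SourceConnString is ${STATUS} (value intentionally not printed)."
```

Failing fast like this makes a broken variable group link visible in the task output rather than in a cryptic SqlPackage error later.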

Extracting the schema from an existing SQL database in Fabric

You can extract the schema of an existing SQL database in Fabric into a dacpac file in Azure DevOps thanks to the latest version of SqlPackage.

In order to read from the source database securely, I recommend creating a service principal and then giving it the relevant permissions on the source database. One simple way to add the service principal is with the functionality to manage SQL database-level roles from the Fabric portal.
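If you prefer scripting the permissions instead of using the Fabric portal, the sketch below writes the T-SQL to a file for later execution. It assumes the Microsoft Entra pattern used for Azure SQL (CREATE USER ... FROM EXTERNAL PROVIDER) applies here, and the principal name "devops-deploy-sp" is a placeholder.

```shell
# Sketch: generate a T-SQL script that adds a service principal to the
# database. The display name below is a placeholder assumption.
SP_NAME="devops-deploy-sp"

cat > grant-access.sql <<SQL
-- Create a database user for the service principal (Microsoft Entra ID)
CREATE USER [${SP_NAME}] FROM EXTERNAL PROVIDER;
-- Grant read access so SqlPackage can extract the schema
ALTER ROLE db_datareader ADD MEMBER [${SP_NAME}];
SQL

# Example of running it against the source database (requires sqlcmd):
# sqlcmd -S "{SERVER NAME AS SHOWN IN FABRIC}.database.fabric.microsoft.com" -d "{DATABASE NAME AS SHOWN IN FABRIC}" -G -i grant-access.sql
cat grant-access.sql
```

For extract-only access, db_datareader is usually enough; the deployment side of the pipeline needs broader rights on the target database.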

After granting the service principal access to the database, you can create the below pipeline with the Classic Editor in Azure DevOps.

Pipeline to extract the schema into a dacpac file

The first step is to update SqlPackage on the Azure Pipelines agent that is running the pipeline, so that it supports the target platform for SQL database in Fabric. Below is the PowerShell command:

dotnet tool update -g microsoft.sqlpackage

In the second task, you run the SqlPackage command to extract the schema into a dacpac file and save it to the artifact staging directory:

SqlPackage /Action:Extract /TargetFile:"$(Build.ArtifactStagingDirectory)\SQLDBinFabric.dacpac" /SourceConnectionString:"$(SourceConnString)"

In this example, SqlPackage authenticates to the source database with a connection string, which is stored as a secret variable with the below structure:

Server=tcp:{SERVER NAME AS SHOWN IN FABRIC}.database.fabric.microsoft.com,1433;Initial Catalog={DATABASE NAME AS SHOWN IN FABRIC};MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;Authentication=Active Directory Service Principal; User Id={SERVICE PRINCIPAL CLIENT ID}; Password={SERVICE PRINCIPAL SECRET}

One key point to remember is that the database name required in the connection string will differ from the database name you originally provided in Microsoft Fabric, because Microsoft Fabric adds an additional suffix for connectivity.

You can get the correct server and database names in Microsoft Fabric by going into the SQL query editor. From there, you can either click to connect to Visual Studio Code or navigate to get the connection strings.
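To make the suffix point concrete, the sketch below assembles the connection string from its parts. Every value is a placeholder assumption; take the real server and database names (including the suffix Fabric appends) from the SQL query editor.

```shell
# Sketch: assemble the connection string from placeholder parts.
SERVER_NAME="myserver-abc123"                        # placeholder
DATABASE_NAME="MyDatabase-def456"                    # placeholder; note Fabric's suffix
CLIENT_ID="00000000-0000-0000-0000-000000000000"     # placeholder service principal client id
CLIENT_SECRET="placeholder-secret"                   # placeholder secret

SOURCE_CONN_STRING="Server=tcp:${SERVER_NAME}.database.fabric.microsoft.com,1433;Initial Catalog=${DATABASE_NAME};MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;Authentication=Active Directory Service Principal;User Id=${CLIENT_ID};Password=${CLIENT_SECRET}"

echo "${SOURCE_CONN_STRING}"
```

In practice you would store the assembled string (or its secret parts) in the variable group rather than echoing it.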

Finally, you publish the artifact, setting the artifact staging directory (which contains only the dacpac file) as the path to publish.

Verifying the dacpac is published

You can then run the pipeline to confirm that it publishes the artifact into Azure DevOps.

Completed pipeline showing artifact

You can verify that the dacpac exists by selecting the artifact in the pipeline results and navigating to the dacpac file.

Artifact containing the dacpac

You can verify the contents of the dacpac by downloading it, renaming it with a ".zip" extension, and extracting its contents locally. For example, you can then check that the correct target platform is stated in the "model.xml" file by viewing the DspName value, as below.

DspName="Microsoft.Data.Tools.Schema.Sql.SqlDbFabricDatabaseSchemaProvider" 
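The steps above can be sketched end to end. Since a real dacpac comes from the pipeline artifact, this demo fabricates a minimal stand-in archive locally (an assumption made purely so the snippet is self-contained); the rename, extract, and grep steps are the same ones you would run against the downloaded file.

```shell
# A dacpac is a zip archive. Fabricate a minimal stand-in locally so the
# demo is self-contained (a real dacpac comes from the pipeline artifact).
printf '<DataSchemaModel DspName="Microsoft.Data.Tools.Schema.Sql.SqlDbFabricDatabaseSchemaProvider" />\n' > model.xml
python3 -m zipfile -c SQLDBinFabric.dacpac model.xml   # build the stand-in archive

cp SQLDBinFabric.dacpac SQLDBinFabric.zip              # "rename" with a .zip extension
python3 -m zipfile -e SQLDBinFabric.zip extracted/     # extract the contents

# Check the target platform recorded in model.xml.
grep -o 'DspName="[^"]*"' extracted/model.xml
```

Seeing SqlDbFabricDatabaseSchemaProvider in the output confirms the dacpac was extracted with the Fabric target platform.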

 

Deploying the contents of the dacpac file to another SQL database in Fabric

Once the dacpac has been published, you can deploy its contents to another SQL database in Fabric. This can be done in Azure Pipelines with the classic release feature. Two prerequisites must be in place beforehand.

First of all, you need to make sure the service principal you created has the relevant permissions in the target database.

Secondly, if you are intending to connect to the database with a connection string you need to ensure the connection string for the destination database is added as a variable. Alternatively, choose another connection strategy.

You can create a release pipeline to deploy the artifact, like in the below example, selecting the pipeline that created the dacpac as the artifact source.

Release pipeline

You can then add two tasks to the created stage: one to download the artifact, to be certain it is present, and another to deploy the contents of the dacpac with an Azure SQL Database Deployment task, like in the below example.

Azure SQL Database Deployment task

In the above example, you connect to the SQL database in Fabric via a connection string, which I strongly recommend you add as a secret variable to avoid sharing any sensitive information in your pipeline.
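If you would rather not use the deployment task, one alternative sketch is calling SqlPackage directly from a release script task. The command is assembled and echoed below so the shape is visible; the artifact path (the "_extract" source alias) and the $(TargetConnString) variable are assumptions, not values from the original setup.

```shell
# Sketch: deploy the dacpac by invoking SqlPackage directly in a release
# task. The artifact path and variable name below are placeholder assumptions.
DACPAC_PATH='$(System.DefaultWorkingDirectory)/_extract/drop/SQLDBinFabric.dacpac'  # placeholder artifact path

# Build the command as a string so it can be inspected before running.
PUBLISH_CMD="SqlPackage /Action:Publish /SourceFile:\"${DACPAC_PATH}\" /TargetConnectionString:\"\$(TargetConnString)\""

echo "${PUBLISH_CMD}"
```

/Action:Publish compares the dacpac's model against the target database and applies only the differences, which is what makes the regular replication scenario in this post work.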

Final words about extracting the schema of an existing SQL database in Fabric

I hope that this example, showing how to extract the schema of an existing SQL database in Fabric to perform CI/CD using Azure DevOps, proves to be insightful.

One key point to remember is that there are other ways to deploy updates to your SQL databases in Fabric across environments, such as Microsoft Fabric deployment pipelines, which I covered in a previous post.

Of course, if you have any comments or queries about this post feel free to reach out to me.