
NandanHegde

Automated Deployment of MSFT Fabric Database via Azure DevOps

Problem Statement:

Is it possible to implement CI/CD for a Microsoft Fabric database via Azure DevOps?

    Prerequisites:

    1. Service Principal (see: How to Create a Service Principal)
    2. AD Account
    3. Azure Data Studio / Visual Studio
    4. Fabric Capacity
    5. Fabric Database
    6. Azure DevOps
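    As a sketch, the service principal prerequisite can also be created from the Azure CLI (the display name below is illustrative; the linked article covers the portal route):

    ```shell
    # Create an app registration + service principal with a client secret (name is illustrative)
    az ad sp create-for-rbac --name fabric-cicd-sp
    # The JSON output contains appId (ClientId), password (ClientSecret) and tenant (TenantId)
    ```

    Record the appId, password, and tenant values; they map to the pipeline variables used later in this article.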

    Solution:

    DACPAC is your ally in deploying schema changes or managing versions across environments, all while keeping the existing data untouched.

    A DACPAC is the build output of a Database project.
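    For context, an SDK-style SQL database project can also be built into a DACPAC from the command line (the project file name is illustrative; this assumes the project uses the Microsoft.Build.Sql SDK, which the tooling below generates):

    ```shell
    # Build the SQL project; the DACPAC is emitted under bin/Debug/
    dotnet build FabricDb.sqlproj
    ```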

      1. We need to create a database project for the Microsoft Fabric database, which can be achieved in either of the following ways:

      a) Enable Git Integration in the workspace managing the Dev version of the Fabric Database


      b) Leverage Visual Studio / Azure Data Studio to generate the database project. In this demo, we will use Azure Data Studio.

      Assuming Azure Data Studio is installed as stated in the Prerequisites, install the SQL Database Projects extension.

      Connect to the Dev version of the Fabric database in Azure Data Studio, right-click the database, and select the 'Create Project from Database' option.


      Then synchronize the project to the Azure DevOps repository.
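      The synchronization step is an ordinary Git push; a minimal sketch, assuming a new local project and an existing Azure DevOps repo (the remote URL and names are illustrative):

      ```shell
      git init
      git add .
      git commit -m "Initial Fabric database project"
      git remote add origin https://dev.azure.com/yourorg/yourproject/_git/fabric-db
      git push -u origin main
      ```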

      2. After the database project is created, we will use Azure Pipelines and the classic Release editor in Azure DevOps to build the DACPAC and publish it to the intended destination Fabric database.

      3. In this scenario, we will use the Microsoft-hosted Azure Pipelines agent pool for the build pipeline, running the latest Windows image.


      a) MSBuild task: generates the DACPAC from the SQL project file in the Azure DevOps repo configured in the build pipeline.


      Note: the Project parameter can be parameterized via an Azure Pipelines variable group.

      b) Copy Files and Publish Build Artifacts tasks: copy the project build output to the Azure DevOps artifacts location and publish it.


      Outcome: after a successful pipeline run, the DACPAC artifact is published.


      4. Once the DACPAC is built, we will use the Release functionality to publish it to the intended Fabric database.


      The Artifacts section is set up to leverage the latest artifact from the Build pipeline.

      We will use the windows-latest Azure agent pool within the release pipeline, and a service principal to authenticate to the destination Fabric database.

      Therefore, we need to grant the service principal 'Member' access on the workspace.


      a) PowerShell task: generates an access token via the service principal for authentication, installs SqlPackage, and publishes the DACPAC.


      PowerShell code:

      # Acquire an access token for the service principal (client credentials flow)
      $TenantID  = "$(TenantId)"
      $ClientID  = "$(ClientId)"
      $ClientSct = "$(ClientSecret)"
      $resourceAppIdURI = "https://database.windows.net/"

      $tokenResponse = Invoke-RestMethod -Method Post -UseBasicParsing `
          -Uri "https://login.windows.net/$($TenantID)/oauth2/token" `
          -Body @{
              resource      = $resourceAppIdURI
              client_id     = $ClientID
              grant_type    = 'client_credentials'
              client_secret = $ClientSct
          } -ContentType 'application/x-www-form-urlencoded'

      if ($tokenResponse) {
          Write-Debug "Access token type is $($tokenResponse.token_type), expires $($tokenResponse.expires_on)"
          $Token = $tokenResponse.access_token
      }

      # Install SqlPackage as a .NET global tool, then publish the DACPAC with the token
      dotnet tool install -g microsoft.sqlpackage

      & "C:\Users\VssAdministrator\.dotnet\tools\sqlpackage.exe" /SourceFile:"$(SourceFilePath)" /Action:Publish /TargetServerName:"$(TargetServer)" /TargetDatabaseName:"$(TargetDatabase)" /AccessToken:$Token

      where values of the form $(TenantId) are pipeline variables.
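      For reference, a YAML pipeline would link such variables via a variable group (the group name below is hypothetical; classic pipelines link groups from the Variables tab instead):

      ```yaml
      variables:
      - group: FabricDbDeploy   # hypothetical group holding TenantId, ClientId, ClientSecret, TargetServer, etc.
      ```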


       

      Build Pipeline YAML:

      # Variable 'DACPACPath' was defined in the Variables tab
      jobs:
      - job: Job_1
        displayName: Agent job 1
        pool:
          vmImage: windows-latest
        steps:
        - checkout: self
          fetchDepth: 1
        - task: MSBuild@1
          displayName: Build DACPAC
          inputs:
            solution: $(DACPACPath)
        - task: CopyFiles@2
          displayName: 'Copy Files to: $(build.artifactstagingdirectory)/DB/'
          inputs:
            SourceFolder: '$(agent.builddirectory)\s\'
            TargetFolder: '$(build.artifactstagingdirectory)/DB/'
        - task: PublishBuildArtifacts@1
          displayName: 'Publish Artifact: drop'
        ...
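      The classic Release above can also be sketched as a YAML deployment job, assuming the build publishes its artifact as 'drop' (paths and names are illustrative; the inline script is the same token-acquisition and SqlPackage publish script shown earlier):

      ```yaml
      - job: Deploy_Fabric_DB
        pool:
          vmImage: windows-latest
        steps:
        - download: current            # fetch the 'drop' artifact from the build
          artifact: drop
        - task: PowerShell@2
          displayName: Publish DACPAC to Fabric database
          inputs:
            targetType: inline
            script: |
              # acquire $Token via the service principal as shown above, then:
              dotnet tool install -g microsoft.sqlpackage
              & "C:\Users\VssAdministrator\.dotnet\tools\sqlpackage.exe" /SourceFile:"$(Pipeline.Workspace)/drop/DB/YourDb.dacpac" /Action:Publish /TargetServerName:"$(TargetServer)" /TargetDatabaseName:"$(TargetDatabase)" /AccessToken:$Token
      ```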


Comments

Currently none of the tools used for the maintenance of database projects (Visual Studio, VS Code, Azure Data Studio) is able to generate correct scripts to change the structure of Fabric data objects: Warehouse, Lakehouse, or SQL Database.
This is annoying, as Git integration drops all the data, and we need a usable way of automating changes to the structures.
Anything else is a toy project.