Dynamic Row-Level Security (RLS) Implementation in Power BI (Part 2)

This article is co-authored by Chinmaya Patra, Pradip VS, and Vivienne Vereen, Architects at Microsoft.


 

In Part 1 of the Dynamic Row-Level Security (DRLS) Implementation in Power BI series, we covered the basics of setting up dynamic RLS to control data access based on user roles, improving efficiency and governance. It explained how dynamic RLS simplifies setup compared to static RLS and addressed key data security challenges. It detailed defining roles and profiles by Business Unit (BU) and Area/Department using a sample star schema with DimStore and DimBrand tables, covered the use of the USERPRINCIPALNAME() function for managing access, and discussed the performance impact of different relationship types.

 

In this part, we look at two further ways to implement dynamic Row-Level Security (RLS) in Power BI: RLS without ETL, and a set of alternative approaches. Managing user access is crucial for data security.

 

With RLS without ETL, you implement RLS directly in Power BI without any ETL process, which makes it well suited to smaller models with fewer transactions and users. You create a DynamicRLS table to store access rules based on StoreKey and BrandKey, where a value of 0 means full access to that dimension. You then define roles in Power BI Desktop whose DAX filters restrict data according to the access rights recorded in the DynamicRLS table. Different combinations of StoreKey and BrandKey values determine a user's access level, from a specific store and brand up to full access. While RLS without ETL is simple and efficient for smaller datasets, it may face scalability issues in larger models.

 

The guide also discusses alternative approaches, such as a direct relationship between the user and fact tables via an RLSKey, and a bridge dimension for managing full access. It wraps up with best practices for implementing security across multiple dimensions in fact tables, ensuring data security through RLS.

 

Approach 2: Row-Level Security (RLS) Without ETL

 

If you don’t want to perform ETL or maintain a separate Roles and Restrictions table, you can apply row-level security (RLS) directly in Power BI. This works well for small models with fewer transactions and users.

 

Instead of linking roles via ETL, create a disconnected table (DynamicRLS) to store user access rules based on StoreKey and BrandKey. A value of 0 in these columns means full access to that dimension.

 

How to Implement:

 

  • Create the DynamicRLS table with three columns: Email (user identification), StoreKey (restricts access to specific stores), and BrandKey (restricts access to specific brands).

 

  • Define a role in Power BI Desktop and apply a DAX filter expression to the fact table (FactOnlineSalesAggregated_DynamicRLS).
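As a sketch, the DynamicRLS table from the first step could be defined as a DAX calculated table (the email addresses and key values below are purely illustrative):

```dax
DynamicRLS =
DATATABLE (
    "Email", STRING,
    "StoreKey", INTEGER,
    "BrandKey", INTEGER,
    {
        { "user.full@contoso.com", 0, 0 },     // full access
        { "user.brand@contoso.com", 0, 5 },    // all stores for brand 5
        { "user.store@contoso.com", 199, 0 },  // all brands in store 199
        { "user.single@contoso.com", 199, 5 }  // store 199, brand 5 only
    }
)
```

In practice this table would typically be maintained outside the model (for example in a SharePoint list or database table) and imported, so that access rules can change without editing the PBIX file.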

How This Works:

  • Power BI identifies the logged-in user using USERPRINCIPALNAME().
  • It fetches the user’s access rights from the DynamicRLS table.
  • The SWITCH function applies access rules:
    • (0,0) → Full access (all stores & brands)
    • (_Brand, 0) → Access to all stores under a brand
    • (0, _Store) → Access to all brands in a store
    • (_Brand, _Store) → Access only to a specific store & brand
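Putting these rules together, the role filter on the fact table could be written, for instance, as the following DAX. This is a sketch, assuming one DynamicRLS row per user (LOOKUPVALUE raises an error on multiple conflicting matches); adjust the names to your model:

```dax
VAR _User = USERPRINCIPALNAME ()
VAR _Store =
    LOOKUPVALUE ( DynamicRLS[StoreKey], DynamicRLS[Email], _User )
VAR _Brand =
    LOOKUPVALUE ( DynamicRLS[BrandKey], DynamicRLS[Email], _User )
RETURN
    SWITCH (
        TRUE (),
        // (0, 0): full access to all stores and brands
        _Brand = 0 && _Store = 0, TRUE (),
        // (_Brand, 0): all stores under the user's brand
        _Store = 0, FactOnlineSalesAggregated_DynamicRLS[BrandKey] = _Brand,
        // (0, _Store): all brands in the user's store
        _Brand = 0, FactOnlineSalesAggregated_DynamicRLS[StoreKey] = _Store,
        // (_Brand, _Store): only the specific store and brand
        FactOnlineSalesAggregated_DynamicRLS[StoreKey] = _Store
            && FactOnlineSalesAggregated_DynamicRLS[BrandKey] = _Brand
    )
```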

 

Example Scenario:

Consider a DynamicRLS table with one row per user: a user with StoreKey = 0 and BrandKey = 0 gets full access; a user with only BrandKey populated sees every store for that brand; a user with only StoreKey populated sees every brand in that store; and a user with both keys populated sees only that specific store and brand combination.

 

What Happens When a User Logs In?

  • Power BI retrieves the user’s email and checks their access level in DynamicRLS.
  • The fact table filters itself dynamically based on StoreKey and BrandKey.
  • The user sees only the allowed data based on their access rights.


 

Advantages of Approach 2:

  1. Simplicity: This approach is simpler to implement as it does not require ETL processes or maintaining separate Roles and Restrictions tables.
  2. Efficiency: Works well for small models with fewer transactions and users, making it efficient for smaller datasets.
  3. Flexibility: Allows for dynamic filtering based on user access rules without the need for complex data transformations.
  4. Ease of Maintenance: Reduces the need for ongoing maintenance of ETL processes and separate tables, making it easier to manage.

Disadvantages of Approach 2:

  1. Limited Scalability: May not be suitable for large models with many transactions and users, as it relies on a disconnected table for user access rules.
  2. Performance: Can impact performance if the disconnected table grows large or if there are many users accessing the data simultaneously.
  3. Dependency: Relies on accurate user account information and proper configuration of access rules in the disconnected table.
  4. Customization: May offer less customization compared to approaches that use ETL processes and separate tables for roles and restrictions.

Other approaches:

While implementing the solution above, I came across a few other approaches that can be useful in very specific scenarios, even though they are not optimal. They are outlined below:

 

Approach 3:

I won’t go into a deep dive on this approach but will share my perspective on overcoming the challenge of implementing full or unrestricted access. The idea is to set up a direct relationship between a user table and a fact table using an RLSKey. This RLSKey must be populated in the fact table during the ETL process, ensuring that access control is enforced at the fact level without relying on dimensional tables. While this simplifies security enforcement, it does lead to an increase in the fact table volume since it now incorporates user-specific RLSKey mappings.

 

Establishing a direct relationship between the user table and the fact table via an RLSKey makes the model easy to manage and understand. Because the RLSKey is populated in the fact table during ETL, data filtering is efficient and users see only the data they are authorized to access. This method reduces overall complexity and provides direct access control, making it a streamlined solution for managing user permissions.
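Under this approach the role filter itself stays trivial, since the relationship does the filtering. A minimal sketch, assuming a UserTable related to the fact table on RLSKey with cross-filtering enabled from UserTable toward the fact (all names here are illustrative):

```dax
// Role filter defined on UserTable; the relationship
// UserTable[RLSKey] -> Fact[RLSKey] propagates the filter
// from the matched user rows down to the fact rows.
UserTable[Email] = USERPRINCIPALNAME ()
```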

 

However, this approach leads to an increase in the fact table volume due to the incorporation of user-specific RLSKey mappings. This can result in larger storage requirements and potentially impact performance. Additionally, the introduction of data redundancy can be challenging to manage, especially as the number of users and roles increases. The maintenance overhead associated with managing RLSKey mappings in the fact table can be significant, and scalability may become an issue for large datasets and high user concurrency. While this approach simplifies security enforcement, it requires careful consideration of the potential impact on performance and scalability.

 

Approach 4:

In this approach, I considered creating a bridge dimension and then appending it to itself. The new appended dimension will have all its keys set to 0. This means that we can set the store key in the user table to 0, indicating full access. By integrating this entry into the model with the help of the bridge table, we can ensure that a single 0 value in the store key will provide full access to the email address or UserPrincipalName.
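One way to read this construction: the bridge maps each user-side key to the stores it grants, with an appended copy whose user-side key is 0 and therefore grants every store. A hedged DAX sketch of such a bridge (table and column names are assumptions):

```dax
StoreBridge =
UNION (
    // Each store key grants access to that store only
    SELECTCOLUMNS (
        DimStore,
        "UserStoreKey", DimStore[StoreKey],
        "StoreKey", DimStore[StoreKey]
    ),
    // Appended copy: user-side key 0 maps to every store (full access)
    SELECTCOLUMNS (
        DimStore,
        "UserStoreKey", 0,
        "StoreKey", DimStore[StoreKey]
    )
)
```

A user row with StoreKey = 0 then matches the appended rows and, through the bridge, reaches every store in DimStore.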

 

This approach increases the dimensional data volume and introduces a lot of data redundancy. However, it simplifies the process of granting full access to users by using a single entry in the bridge table. This method ensures that users with the 0 value in their store key can access all data without any restrictions, making it easier to manage and maintain user permissions.

 

On the downside, appending the bridge dimension to itself increases data volume and introduces redundancy, which can lead to higher storage needs and slower performance. The duplication may create inconsistencies and complicate data integrity. Relying on a single zero-value key for full access also raises potential security risks and increases the chance of unintended data exposure. Additionally, as data grows, scalability and maintenance become challenging, requiring more effort to manage user permissions and ensure consistent access control.

 

Recommendations:

 

Our recommendation is Approach 1, covered in Part 1. It offers scalability, flexibility, security, efficiency, and customization, making it suitable for handling multiple users and roles efficiently. Although it requires a thorough understanding of Power BI and DAX, the benefits outweigh the complexity and maintenance challenges. This approach ensures that data is securely shared with the right users based on their roles and permissions, providing a robust and scalable solution for dynamic row-level security.

 


 

 

Best Practices for Multiple Dimensional Security Implementation in the Fact Table

When implementing multiple dimensional security in a fact table, it's essential to follow best practices to ensure data integrity, performance, and security. Here are some key practices to consider:

 

  1. Identify Shared Dimensions:

Relate every dimension to its respective fact tables, identify the shared dimensions, and relate them to every fact. This design is often referred to as a constellation schema, where each fact table on its own forms a perfect star schema because it relates directly to its dimensions.

 

  2. Avoid Duplicate Dimensions:

Avoid duplicating dimensions in the model as it increases the size of your model, which could impact the refresh time. Instead, consolidate the fact tables by joining them through a union (append in Power Query). However, be cautious as this can create a huge table that takes a long time to refresh and may contain a lot of blank values in unmatched columns.

 

  3. Different Granularities:

When dealing with multiple fact tables related to each other with different granularities, ensure that the granularity represents the level of detail in each row of your fact table. For example, in header/detail models, use the correct granularity in your calculations to keep reliable performance.

 

  4. Flatten Header Tables:

Flatten the header table into the detail table to get a denormalized model, increasing the granularity. This approach offers a perfect star schema that provides all its advantages, making the model easier to use as it can be summed and sliced by any dimension.

 

  5. Relate Header and Detail Tables:

Be cautious when relating header and detail tables, as you could get incorrect results due to granularity differences. Use this approach only when you need columns from different fact tables in the same visual.

 

  6. Split Fact Tables Based on Dimensional Relationships:

Split the single large fact table into multiple fact tables based on dimensional relationships. This approach helps in organizing the data based on unique dimensions and improves performance.

 

  7. Ensure Data Security:

Implement row-level security (RLS) to restrict data access based on user roles. Define RLS rules to ensure that users can only access data relevant to their roles and responsibilities.
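As a minimal sketch of such a rule, a role filter on a hypothetical DimStore table could restrict rows to the signed-in user's region (the UserRegion table and its columns are assumptions, not part of the model above):

```dax
// Role filter on DimStore: keep only rows in the user's region
DimStore[Region] =
    LOOKUPVALUE (
        UserRegion[Region],
        UserRegion[Email], USERPRINCIPALNAME ()
    )
```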

By following these best practices, you can ensure that your multiple-dimensional security implementation in the fact table is efficient, secure, and scalable.

 

Conclusion:

Implementing dynamic row-level security (DRLS) in Power BI is a crucial step towards ensuring that data is securely shared with the right users while keeping best practices in terms of security and data access. By leveraging DRLS, organizations can achieve a scalable, flexible, and unified security model that dynamically controls data access based on user roles, hierarchies, or predefined rules. This approach not only enhances data governance and efficiency but also reduces the need for manual intervention or static security rules. By following the steps and approaches outlined in this document, you can build a robust and scalable DRLS solution that meets the needs of your organization, ensuring seamless data access management and improved security across your Power BI environment.

 

If you have experience with similar initiatives and have helped customers unlock potential using DRLS in Power BI, please share your insights in the comments.

@cpatra, @PradipVS and @VivienneVereen, Architects at Microsoft