As a Fabric administrator / capacity admin in my organization, I have to work with many workspaces, and not all of the assets (semantic models, reports, dashboards) are built by an IT team. Some are built quite haphazardly by novice users. Unfortunately, the capacity is shared among all users and I have no way to limit capacity consumption by workspace. That means that if a novice user deploys a bad data model to the capacity, every workspace on it essentially chokes, all users start complaining about slow dashboards, and Power BI gets a negative reputation in the org.

Industry experts have suggested that I buy another capacity, assign all "development" workspaces to it, and vet datasets before deploying them to the production capacity. But that only works if we have an army of IT resources (people knowledgeable in Power BI) to vet every single asset being deployed, and that's simply not the case. We also don't have an unlimited budget to spend on Power BI / Fabric capacities.

My idea is as follows: somewhere in the admin portal / capacity settings, add some kind of resource limitation setting per workspace. This way I can safely grant a novice developer a slice of the capacity without affecting the rest of my users on the tenant.

I could go on about why this would benefit all of us in so many ways, but for the sake of brevity I'll leave it at that. If you'd like to understand the idea better, please reach out to me.
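To make the idea concrete, here is a purely hypothetical sketch of how a per-workspace limit could behave. None of this exists in the Fabric admin portal or any Power BI API today; the names (WorkspaceQuota, max_share, capacity_cus) are invented for illustration. The point is simply that each workspace gets a maximum share of the capacity's compute units, and only the offending workspace is throttled when it exceeds that share.

```python
# Hypothetical illustration only -- no such setting exists in Fabric today.
# All names below are made up to show the intent of the proposed feature.
from dataclasses import dataclass


@dataclass
class WorkspaceQuota:
    workspace_name: str
    max_share: float  # fraction of the capacity's CUs this workspace may consume


def is_throttled(quota: WorkspaceQuota, used_cus: float, capacity_cus: float) -> bool:
    """Return True if this single workspace should be throttled,
    leaving every other workspace on the capacity unaffected."""
    return used_cus > quota.max_share * capacity_cus


# Example: a 64-CU capacity where a novice "Dev Sandbox" workspace
# is capped at 10% while a production workspace keeps a larger share.
quotas = [
    WorkspaceQuota("Dev Sandbox", 0.10),
    WorkspaceQuota("Finance Production", 0.60),
]

capacity_cus = 64.0
usage = {"Dev Sandbox": 9.5, "Finance Production": 20.0}

for q in quotas:
    throttled = is_throttled(q, usage[q.workspace_name], capacity_cus)
    print(f"{q.workspace_name}: throttled={throttled}")
```

In this sketch the badly built model in "Dev Sandbox" blows past its 10% slice and gets throttled on its own, while "Finance Production" carries on normally, which is exactly the isolation the setting is meant to provide.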