SelectedOperations.Selected permissions in SharePoint

Microsoft says: “Initially, Sites.Selected existed to restrict an application’s access to a single site collection. Now, lists, list items, folders, and files are also supported, and all Selected scopes now support delegated and application modes.” This article deep-dives into granting and using granular SelectedOperations.Selected permissions in SharePoint.

SelectedOperations is exactly what Microsoft promised a few years ago, and this is great, as we really need granular access to SharePoint sites. I’ve been supporting enterprise SharePoint for more than 10 years now, and it was always a concern when an application required access to just a list, library, folder, or even a single document, but admins had to grant access to the entire site.

I believe this feature will become even more in demand because of Copilot for Microsoft 365. For now it’s mostly developers and data analysts who need unattended application access to SharePoint, but what if regular users powered with a Microsoft 365 Copilot license start creating autonomous agents?

So below are my lab setup, PowerShell scripts, and guides with screenshots on how to grant granular permissions to SharePoint (not to an entire site, but to a library/list, a folder, or even a single document or list item) and how to use the granted permissions.

Admin App

First, we need an Admin App – an app we will use to grant permissions.

The only requirement is that this app has the Microsoft Graph Sites.FullControl.All API permission consented:
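
Jumping ahead, this is roughly what the grant from the Admin App could look like in Python. A hedged sketch: the list-level permissions endpoint here is my assumption, modeled on the documented site-level POST /sites/{site-id}/permissions call used for Sites.Selected, so verify it against the current Graph docs; all ids and names are placeholders:

```python
import requests

# Placeholders from my lab setup - replace with your own values.
GRAPH = "https://graph.microsoft.com/beta"  # check docs for v1.0 availability
admin_token = "<token acquired with the Admin App (Sites.FullControl.All)>"
site_id = "<target site id>"
list_id = "<Test-List-01 id>"
client_app_id = "<client app registration (client) id>"

# Assumption: the list-level permissions endpoint mirrors the documented
# site-level POST /sites/{site-id}/permissions call used for Sites.Selected.
resp = requests.post(
    f"{GRAPH}/sites/{site_id}/lists/{list_id}/permissions",
    headers={"Authorization": f"Bearer {admin_token}"},
    json={
        "roles": ["read"],  # or "write"
        "grantedToIdentities": [
            {"application": {"id": client_app_id, "displayName": "Test-Client-App"}}
        ],
    },
)
print(resp.status_code, resp.json())
```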

Target Site and Dev Setup

For this lab/demo setup, I have created a team in Microsoft Teams (so it’s a group-based, Teams-connected SharePoint site), then a test list and a test library:

There must also be an app registration for the client application – the application that will have access to Test-List-01 and Test-Lib-01 only. This app registration should have the Microsoft Graph “Lists.SelectedOperations.Selected” API permission consented:

and I will use Python to access SharePoint programmatically.

At this moment, we have a client app with a secret and the “Lists.SelectedOperations.Selected” API permission consented, but we have not yet granted this app access to any specific site or library. So we should be able to authenticate to Microsoft 365, but not to get any data: we can get a token, but any other call to the Graph API returns a 403 error – {"error":{"code":"accessDenied","message":"Request Doesn't have the required Permission scopes to access a site."}}:
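
Here is a minimal Python sketch that demonstrates this state – token acquisition succeeds, but a data call is rejected. It uses the msal and requests packages; the tenant and app values are placeholders:

```python
import msal
import requests

# Placeholders - replace with your tenant and client app values.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-app-id>"
CLIENT_SECRET = "<client-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
# Token acquisition succeeds even before any access is granted...
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
print("token acquired:", "access_token" in token)

# ...but an actual data call returns 403 until permissions are granted.
resp = requests.get(
    "https://graph.microsoft.com/v1.0/sites/root",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
print(resp.status_code)  # 403: accessDenied
```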

tbc


Securing Storage Account in Azure Function

The storage account created as part of an Azure Function App has, out of the box, some configuration tradeoffs that can be considered vulnerabilities. Let’s look at what can be changed to improve the security of the storage account without affecting the functionality of the Azure Function App.

Public (anonymous) Access

Storage account blob anonymous (public) access should be disallowed. Though having this enabled does not automatically allow anonymous access to blobs, it is recommended to disable public access unless the scenario requires it. A storage account that supports an Azure Function App is not supposed to be shared anonymously.

Navigate to your storage account in the Azure portal, then Settings/Configuration, and for “Allow Blob anonymous access” select “Disabled”:

MS: “When allow blob anonymous access is enabled, one is permitted to configure container ACLs to allow anonymous access to blobs within the storage account. When disabled, no anonymous access to blobs within the storage account is permitted, regardless of underlying ACL configurations. Learn more about allowing blob anonymous access.”
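
If you prefer scripting this setting, here is a minimal sketch using the azure-mgmt-storage package (the subscription, resource group, and account names are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountUpdateParameters

# Placeholders - replace with your subscription, resource group, and account.
client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")
client.storage_accounts.update(
    "my-function-rg",
    "myfunctionstorage",
    StorageAccountUpdateParameters(allow_blob_public_access=False),
)
```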

Storage account key access

Shared key access is also considered a vulnerability, and it is recommended to disable storage account key access.

NB (Jan 2025): It is not possible to set “Allow storage account key access” to Disabled without breaking the function app if you are using the Consumption or Elastic Premium plans (on both Windows and Linux). For other plans it’s possible, but requires some work.

Set “Allow storage account key access” to Disabled:

MS: “When Allow storage account key access is disabled, any requests to the account that are authorized with Shared Key, including shared access signatures (SAS), will be denied. Client applications that currently access the storage account using Shared Key will no longer work. Learn more about Allow storage account key access.”
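
Programmatically, it is the same update call with a different flag – a sketch, assuming the same placeholder names as above; flip it only after the function app has been reconfigured as described below:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountUpdateParameters

# Do this only after the function app is reconfigured for Entra-based access;
# placeholders as in the previous sketch.
client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")
client.storage_accounts.update(
    "my-function-rg",
    "myfunctionstorage",
    StorageAccountUpdateParameters(allow_shared_key_access=False),
)
```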

The problem is that out of the box the function app is configured to use key access, so simply disabling storage account key access breaks the function app.

For the function app to adopt this new setting, you’d need to:

  • provide the function app with access to the storage account
  • reconfigure the function app to authorize via Microsoft Entra credentials

Here is the separate article “Function app to access Storage via Microsoft Entra credentials” with the detailed research and step-by-step guide, so here I just summarize it:

Function App Hosting Plan | Allow Storage Account Key Access
--------------------------|---------------------------------
Flex Consumption          | Possible to disable
Consumption, Premium      | Not possible to disable
Dedicated                 | Possible to disable
Container                 | tbc


Azure Key Vault Purge Protection

Azure Key Vault is a place where you can store the keys, secrets, and certificates that you need to access services (e.g. to call the Graph API).

What if somebody malicious gets access to the key vault (not what we want, but we should consider it a possible risk)? Leaked secrets are a serious issue in themselves (a separate topic), but imagine that somebody also deletes the key vault content – we would simply lose this data, which would break the solution and stop our existing systems from working.

So enabling purge protection on key vaults is a critical security measure to prevent permanent data loss. This feature enforces a mandatory retention period for soft-deleted key vault contents (e.g. secrets), making them immune to purging during the retention period.

In a nutshell, you’d just select “Enable purge protection” under the key vault Settings/Properties (note that once enabled, this option cannot be disabled):
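
The same can be scripted; a minimal sketch with the azure-mgmt-keyvault package (subscription, resource group, and vault names are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.keyvault import KeyVaultManagementClient
from azure.mgmt.keyvault.models import VaultCreateOrUpdateParameters

# Placeholders - replace with your subscription, resource group, and vault.
client = KeyVaultManagementClient(DefaultAzureCredential(), "<subscription-id>")
vault = client.vaults.get("my-rg", "my-key-vault")
vault.properties.enable_purge_protection = True  # one-way switch: cannot be turned off later
client.vaults.begin_create_or_update(
    "my-rg",
    "my-key-vault",
    VaultCreateOrUpdateParameters(location=vault.location, properties=vault.properties),
).result()
```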


Automating SharePoint operations with Azure Functions

There are many scenarios for SharePoint or Teams automation, and in most cases you need to run some code on a schedule (e.g. every 5 minutes or every 24 hours). This is where timer-triggered Azure Functions can help. In this article I will describe use cases, give an overview of the whole scenario and technical setup, and provide links to detailed step-by-step guides for configuring the different parts of the solution.

Possible scenarios

Possible scenarios (end-user-oriented):

  • Create a site upon user request
  • Convert a site to a hub site upon user request
  • Set the site search scope upon user request
  • Set up site metadata (site custom properties)
  • Request usage reports/analytics

Possible scenarios (admin-oriented):

  • Provide temporary access to a site (e.g. during troubleshooting)
  • Provide Sites.Selected permissions for an app to a site
  • Disable custom scripts or ensure custom scripts are disabled
  • Enable custom scripts (e.g. during site migration)
  • Monitor licenses – available, running out, etc.

Typical setup

Front-end

A SharePoint site works as the front-end. You do not need to develop a separate web application, as it’s already there, with rich functionality, secured and free.

The site can have:
– one or more lists to accept intake requests
– Power Apps to customize forms
– Power Automate to implement workflows (e.g. approvals), send notifications, etc.
– site pages as solution documentation
– libraries to store documents provided in response to requests

You can provide org-wide access to the site if your intention is to allow all users to submit requests or secure the site if you want to accept requests only from a specific limited group of people.

Back-end

A timer-triggered Azure Function works as the back-end. The function can be scheduled to run based on job-specific requirements (e.g. every 5 or 10 minutes, daily, weekly, etc.). The function can be written in PowerShell, C#, Python, etc.

The function’s logic is to (see the sketch after this list):

  • read the SharePoint list and iterate through items to get intake requests
  • validate request eligibility
  • perform the action
  • share results (e.g. update the intake form, send e-mail, save a document to a library, etc.)
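
Here is a minimal sketch of such a function using the Python v2 programming model for Azure Functions and the Graph API via msal/requests; the schedule, ids, and secrets handling are placeholders (in a real app, keep secrets in app settings or Key Vault):

```python
import logging
import azure.functions as func
import msal
import requests

app = func.FunctionApp()

# Placeholders - in a real app keep these in app settings / Key Vault.
TENANT_ID, CLIENT_ID, CLIENT_SECRET = "<tenant>", "<client-id>", "<secret>"
SITE_ID, LIST_ID = "<intake-site-id>", "<intake-list-id>"

@app.timer_trigger(schedule="0 */5 * * * *", arg_name="timer")  # every 5 minutes
def process_intake_requests(timer: func.TimerRequest) -> None:
    cca = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    token = cca.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
    # Read the intake list and iterate through items.
    items = requests.get(
        f"https://graph.microsoft.com/v1.0/sites/{SITE_ID}/lists/{LIST_ID}/items?expand=fields",
        headers={"Authorization": f"Bearer {token['access_token']}"},
    ).json().get("value", [])
    for item in items:
        # Here: validate eligibility, perform the action, share results (see list above).
        logging.info("Intake item %s: %s", item["id"], item["fields"].get("Title"))
```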

Configuration

Setting up the front-end should not be an issue. You’d just need solid SharePoint and Power Platform skills.

For the back-end, the solution stack would include the following tools/skills:
– an Azure subscription to host the solution
– registered apps to configure credentials and API access permissions
– an Azure Function App to actually run the code
– an Azure Key Vault to securely store credentials
– programming skills in the language/platform of your choice
– SharePoint API, Microsoft Graph API

Please refer to the separate article Configuring Azure Function App and Key Vault to work with Microsoft 365 SharePoint via Graph API for the basic setup.

Security

Having the basic setup in place, we’d improve the solution’s security. Specifically, we’d address the following:

  • Azure Function network security
  • Key Vault network security
  • Storage Account network security
  • Key Vault purge protection
  • tbc…

TBC…


Azure Function app to access Storage via Microsoft Entra credentials

There is a known problem with the Azure Function App security configuration. Out of the box, function app access to storage is configured using shared keys. This is considered a potential vulnerability, but disabling storage account key access breaks the app. This article shows how to reconfigure the app to use Microsoft Entra credentials instead of shared keys.

When a function app is created, a storage account is created to support it. By default, the storage account has shared keys enabled and the function app is configured to use them. So we’d need to:

  1. Enable the function app’s managed identity
  2. Provide the function app’s managed identity with access to the storage account
  3. Configure the function app to use the managed identity

Enabling function app managed identity

It’s done via Azure portal -> function app -> Settings -> Identity:

Providing access for the function app’s managed identity to the storage account

First, you’d navigate to the Azure portal -> Storage Account -> Access Control (IAM) and add a new role assignment.

Then you’d select the “Storage Blob Data Owner” role and, under “Managed identity”, your app’s identity.
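
The same role assignment can be scripted; a sketch using the azure-mgmt-authorization package – the subscription, resource names, and principal id are placeholders, and the role GUID is the built-in “Storage Blob Data Owner” definition from the Azure built-in roles reference:

```python
import uuid
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

SUB = "<subscription-id>"  # placeholders throughout
client = AuthorizationManagementClient(DefaultAzureCredential(), SUB)
scope = (
    f"/subscriptions/{SUB}/resourceGroups/my-function-rg"
    "/providers/Microsoft.Storage/storageAccounts/myfunctionstorage"
)
# Built-in "Storage Blob Data Owner" role definition id.
role_definition_id = (
    f"/subscriptions/{SUB}/providers/Microsoft.Authorization/roleDefinitions/"
    "b7e6dc6d-f1e8-4753-8033-0f276bb0955b"
)
client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # role assignment name must be a new GUID
    RoleAssignmentCreateParameters(
        role_definition_id=role_definition_id,
        principal_id="<function-app-managed-identity-object-id>",
        principal_type="ServicePrincipal",
    ),
)
```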

Configure function app to use managed identity

The relevant app settings (you can find them under your app’s Settings -> Environment Variables -> “App settings”) are:

You can remove “AzureWebJobsStorage” and replace it with “AzureWebJobsStorage__accountName” as per here.
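
A sketch of that settings swap using the azure-mgmt-web package (resource names are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.web import WebSiteManagementClient
from azure.mgmt.web.models import StringDictionary

# Placeholders - replace with your values.
client = WebSiteManagementClient(DefaultAzureCredential(), "<subscription-id>")
settings = client.web_apps.list_application_settings("my-function-rg", "my-function-app")
props = settings.properties
props.pop("AzureWebJobsStorage", None)  # drop the key-based connection string
props["AzureWebJobsStorage__accountName"] = "myfunctionstorage"  # identity-based connection
client.web_apps.update_application_settings(
    "my-function-rg", "my-function-app", StringDictionary(properties=props)
)
```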

For WEBSITE_CONTENTAZUREFILECONNECTIONSTRING, unfortunately, the documentation says that this environment variable is required for Consumption and Elastic Premium plan apps running on both Windows and Linux… that changing or removing this setting can cause your function app to not start… and that Azure Files doesn’t support using managed identity when accessing the file share…

Possible error messages

If you have a function app on the Consumption or Elastic Premium plan and completed the steps above to disable storage account key access, your function app will not work: there will be no new invocations. Personally, I observed the following error message:

“We were not able to load some functions in the list due to errors. Refresh the page to try again. See details” :



Function App with Dedicated Plan

If you have a function app created on a dedicated plan (App Service):

then under the function’s “Environment variables” you’ll see:

e.g. there is an AzureWebJobsStorage (that can be updated) and no WEBSITE_CONTENTAZUREFILECONNECTIONSTRING (the setting that is required on Consumption and Elastic Premium plans).

So, for a function app created with a dedicated plan (App Service): if you followed the steps above (provided a role to the managed identity, created AzureWebJobsStorage__accountName, removed AzureWebJobsStorage) and disabled key access – the function should work.

Function App with Flex Consumption plan

The Flex Consumption plan is a new Linux-based plan (GA announced Nov 2024), and it looks promising – it’s still consumption-based, but supports virtual networks and fast starts (and some more nice features).

What I do not like is that it does not support installing dependencies via requirements.psd1 – you have to go with custom modules here (it says: “Failure Exception: Failed to install function app dependencies. Error: ‘Managed Dependencies is not supported in Linux Consumption on Legion. Please remove all module references from requirements.psd1 and include the function app dependencies with the function app content’”).

For our specific needs – disabling storage key access and using the function’s identity to access its own storage – I found the following promising: “By default, deployments use the same storage account (AzureWebJobsStorage) and connection string value used by the Functions runtime to maintain your app. The connection string is stored in the DEPLOYMENT_STORAGE_CONNECTION_STRING application setting. However, you can instead designate a blob container in a separate storage account as the deployment source for your code. You can also change the authentication method used to access the container.”

Let us create a function app with the “Flex Consumption plan” hosting option (all other settings at defaults) via the Azure portal:

and right away we can see that the app is using storage keys by default via environment variables: AzureWebJobsStorage (which we know how to deal with) and DEPLOYMENT_STORAGE_CONNECTION_STRING (no description found).

Let us try to create a function app a different way (customized as per this). First, we’d create a storage account. When creating the function app, we’d select the existing storage. I did not find any option in the function app creation wizard to select a managed identity and configure the function to use it to access the storage account.

Let us try to reconfigure the existing app

After this, a system identity was created and the “Storage Blob Data Contributor” role on the storage account was assigned to this identity. The environment variables did not go away. Let us disable access keys on the storage account – and… the function app stopped working.

Since the environment variables are still there, let us blame “AzureWebJobsStorage” and do the trick with it: create a new “AzureWebJobsStorage__accountName”, put our storage account name as its value, remove “AzureWebJobsStorage”, and restart the app… Drumroll, please! And hooray! The function started working again!

“Container app environment” – to be tested

tbc…


Azure Data Factory: connecting to SharePoint with a Certificate

For a long time we had to provide legacy ACS permissions for Microsoft Azure Data Factory to connect to SharePoint. That’s not the case anymore. Finally, Microsoft updated the authentication page, and ADF V2 now supports authentication with a client ID and certificate, which means the app registration used to connect to SharePoint can have only modern Sites.Selected API permissions.

The steps would be:

  1. Obtain a certificate
  2. Get a service principal (Register your app in Entra Id )
  3. Upload the certificate to the app registration
  4. Provide access for the app id (client id) to your SharePoint site
  5. Configure linked service in ADF

Detailed step-by-step guide: ADF connect to SharePoint with a Certificate

1. Obtain a certificate

There are no special technical requirements for the certificate. Since this is about trust between two parties and you own both, the certificate can be self-signed (e.g. generated with PowerShell as described here). But some organizations require all certificates used in the org to be trusted by the org CA.
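
If you’d rather stay in Python than PowerShell, a self-signed certificate can also be generated with the cryptography package – a sketch; the subject name and validity period are arbitrary:

```python
from datetime import datetime, timedelta, timezone
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "adf-sharepoint-app")])
cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)  # self-signed: issuer == subject
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.now(timezone.utc))
    .not_valid_after(datetime.now(timezone.utc) + timedelta(days=365))
    .sign(key, hashes.SHA256())
)
with open("adf-app.cer", "wb") as f:
    f.write(cert.public_bytes(serialization.Encoding.DER))  # upload to the app registration
with open("adf-app-key.pem", "wb") as f:  # keep the private key safe for the ADF side
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.PKCS8,
        serialization.NoEncryption(),
    ))
```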

2. Register app in Azure to get a service principal

To get a service principal – a client ID (app ID) – you must create a so-called “app registration” in Entra ID (Azure AD). Specific requirements: the app should have both Microsoft Graph API and SharePoint API Sites.Selected permissions configured and consented. The process is described, e.g., here.

3. Upload the certificate to the app registration

Under the Certificates & secrets section of your app registration, select the Certificates tab and upload your certificate.

4. Provide access for the app id (client id) to your SharePoint site

This is something only your admins can do. Having Microsoft Graph API and SharePoint API Sites.Selected permissions configured and consented does not mean you automatically have access to SharePoint. The presence of Sites.Selected API permissions means the app is allowed to access specific SharePoint sites, but which sites, and what kind of access?
So you’d ask your SharePoint tenant admins to grant access (e.g. read-only, read-write, or full control) for your app ID (client ID) to specific SharePoint site URLs.
If you are an admin – check this.
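
For reference, granting Sites.Selected access is a single Graph call made by an admin app that has Sites.FullControl.All consented – a sketch with placeholder ids:

```python
import requests

# Run by an admin app with Sites.FullControl.All; ids and names are placeholders.
admin_token = "<admin app token>"
site_id = "<target site id>"
resp = requests.post(
    f"https://graph.microsoft.com/v1.0/sites/{site_id}/permissions",
    headers={"Authorization": f"Bearer {admin_token}"},
    json={
        "roles": ["write"],  # or "read"
        "grantedToIdentities": [
            {"application": {"id": "<adf-app-client-id>", "displayName": "ADF-SharePoint-App"}}
        ],
    },
)
print(resp.status_code, resp.json())
```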

5. Configure linked service in ADF

The last step is to configure your Data Factory connection to the SharePoint list using the service principal and certificate you got in steps 1–4.


Working with SharePoint from Python code via Graph API

The Python code samples published by Microsoft on the Microsoft Graph API reference pages use the GraphServiceClient module. But you can also use just the requests module and call the Microsoft Graph API directly, using requests.post or requests.get. Here I’m sharing my Python code samples.
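
For example, a minimal direct call with requests (tenant and app values are placeholders; the full samples are at the link below):

```python
import requests

# Placeholders - replace with your tenant and app values.
TENANT = "contoso.onmicrosoft.com"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-secret>"

# Client credentials flow against the token endpoint.
token_resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token",
    data={
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "grant_type": "client_credentials",
        "scope": "https://graph.microsoft.com/.default",
    },
)
token = token_resp.json()["access_token"]

# Plain Graph API call with the bearer token.
resp = requests.get(
    "https://graph.microsoft.com/v1.0/sites/root",
    headers={"Authorization": f"Bearer {token}"},
)
print(resp.json())
```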

https://github.com/VladilenK/m365-with-Python/tree/main/Graph-API-Plain