There has always been one problem in the SharePoint world: a full site permissions report. Full means across the entire site, including all objects with broken permissions. It seems Microsoft has solved the problem: a full site permissions report is now available to site owners out of the box.
How to get SharePoint All Site Permissions Report
(Ensure you are a site collection admin or team/group owner.) Just navigate to Site Usage, scroll to the end and run the report.
1. Select the gear icon “Settings” and then “Site usage”:
Or select “Site Contents”, then “Site Usage” as shown below:
2. Scroll down to the “Shared with external users” block and click “Run report”:
3. Create/select a folder (*) for the report and click “Save”:
(*) If there are no folders in the Documents library, you need to create one (otherwise you will not be able to save the report).
Once you have a folder available, just click “Save”:
Give it some time (5-10 minutes) and then check the folder’s content. There should be a file with a report on all site permissions. For items shared with direct access, the report contains one row for each user/item combination. SharePoint groups are shown in the report as groups, not the individual users inside them, so you have to check group membership to get a really full permissions report.
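To expand SharePoint groups into individual users, something like this minimal PnP.PowerShell sketch could help (it assumes you are already connected to the site; the output fields are the usual PnP member properties):
# Sketch: list the members of every SharePoint group on the site
# (assumes an existing Connect-PnPOnline session to the site)
Get-PnPGroup | ForEach-Object {
    $group = $_
    Get-PnPGroupMember -Group $group |
        Select-Object @{ Name = "Group"; Expression = { $group.Title } }, Title, Email, LoginName
}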
Again, you must be a site admin to run the report.
Secure the permissions report
If you don’t want other site members to see the report, secure the report’s folder, e.g. for site owners and for those who must be able to see the report. Consider creating a separate library for permissions reports and securing it instead of securing a folder under Documents.
Some more ideas on SharePoint permissions
Permissions are tricky in SharePoint. By default, you have permissions assigned to the root site of the site collection, and all subsites, libraries etc. inherit the root permissions. But you can break inheritance at any level where you need to provide specific (unique) permissions to a resource. Of course, you can always navigate to a resource and check its permissions. But what if there are hundreds of broken permissions? Should you iterate over everything under your site to check manually whether permissions are broken or inherited?
So the real problem was that you never knew who has access to your site, as there was no out-of-the-box tool to get all site permissions in one single report. There are third-party solutions, like ShareGate, Metalogix or SysKit, or you can develop a PowerShell script that generates a report on all SPO site permissions. But finally Microsoft solved this problem and implemented an out-of-the-box full site permissions report.
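If you still prefer the script route, here is a minimal sketch with PnP.PowerShell that reports role assignments only for lists with broken inheritance (a really full report would also walk webs, folders and items; the site URL is a placeholder):
# Sketch: report role assignments for lists with unique (broken) permissions
# (newer PnP.PowerShell versions may also require -ClientId of your own app registration)
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/yoursite" -Interactive
Get-PnPList -Includes HasUniqueRoleAssignments, RoleAssignments |
    Where-Object { $_.HasUniqueRoleAssignments } |
    ForEach-Object {
        $list = $_
        foreach ($ra in $list.RoleAssignments) {
            Get-PnPProperty -ClientObject $ra -Property Member, RoleDefinitionBindings | Out-Null
            [PSCustomObject]@{
                List      = $list.Title
                Principal = $ra.Member.Title
                Roles     = ($ra.RoleDefinitionBindings.Name -join ", ")
            }
        }
    }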
If you have country-specific content, Microsoft Search allows bookmarks to be configured to pop up only for users from a specific country.
And the “Use Azure AD locations” flag is a new option that makes it actually work. For a long time the country settings were the same but without the “Use Azure AD locations” flag. So what does the “Use Azure AD locations” flag do?
Use Azure AD locations
The “Use Azure AD locations” flag is a straightforward configuration setting. It says: “This bookmark will only appear for users with Azure AD locations that match selected countries or regions. If cleared, the user’s IP address will be used to determine location. This checkbox can be altered from both Country or region setting and Targeted variations setting.”
I have tested this new “Use Azure AD locations” flag, and it works. Once you configure the user’s country in Azure AD and country-targeted bookmarks, everything works as expected: the bookmarks appear for the user.
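For reference, this is roughly how you would set a user’s country attributes with Microsoft Graph PowerShell (a sketch only; the UI does not say exactly which Azure AD attribute is evaluated, so I set both Country and UsageLocation here; the user and country values are examples):
# Sketch: set a user's country attributes in Azure AD
Connect-MgGraph -Scopes "User.ReadWrite.All"
Update-MgUser -UserId "adele@contoso.com" -Country "Germany" -UsageLocation "DE"
Get-MgUser -UserId "adele@contoso.com" -Property DisplayName, Country, UsageLocation |
    Select-Object DisplayName, Country, UsageLocation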
What if we do not use the “Use Azure AD locations” flag
to be provided
What was there before the “Use Azure AD locations” flag
What was there before Microsoft implemented this “Use Azure AD locations” option? How did Microsoft decide “this user is from that country”? What were the criteria to correlate user and country? Physical IP address? License assignment country? Windows locale? Browser settings? Azure AD properties?
It turned out the way it was designed previously was:
– configure Microsoft 365 integration with Bing
– in Bing -> Settings, select Country/Region
– search from Bing
That was the only way to make it work! So yes, do not leave the “Use Azure AD locations” option unchecked. Microsoft confirmed it was poor design.
There was a known problem with Microsoft Delve. It’s not a technology problem though. (Update: Delve is scheduled for retirement in December 2024.)
We know SharePoint site permissions are not easy to manage. E.g. you can break permissions inheritance at any level: subsite, library, list, folder, list item or specific document. Anybody with full permissions can do that. The worst thing is that there was (*1) no native ability for a site owner to get a full site permissions report. We had to use third-party tools or PowerShell to get all permissions in one document.
So no wonder SharePoint sites were heavily over-exposed. Especially when a site owner, tired of the complexity of the SharePoint permissions system, decided to share a resource with “Everyone”. And another person, not knowing the site is shared with everyone, might save some sensitive data there. That is the real issue.
Now, what is Delve? It’s a service that gets signals from all over Office 365 (who did what, etc.) and, based on that, using AI and the Office Graph, generates suggestions: “what others are working on”. Of course, Delve is security-trimmed, i.e. it will never suggest a document you do not have access to. But some sites might be overshared. Delve works as it should: it suggests documents it believes are related to you (based on Microsoft Graph insights) and that you already have access to.
Now the bad thing happens: people start seeing documents they never knew they had access to. Where are these documents from? Of course, from sites shared with Everyone. Who is to blame for the security breach? Delve? Microsoft Graph? Microsoft 365 SharePoint Online?
Strictly speaking, it is not Delve’s problem. It’s more a human problem than a technological one. Delve just does its job, and does it perfectly. Delve simply displays information that is already shared widely.
How do we solve the issue?
Disable Delve?
Disable search (stop sites crawling and remove results)?
Those methods are half-measures. They just hide the problem rather than solving it. Agreed, they help stop the deterioration, but they do not fix the root cause.
How do we solve the real problem and what is the root cause?
Of course, we need to remove incorrectly granted permissions. How?
Only the site owner (data owner) knows which content should be shared with whom and with which access rights. So we need to ask site owners to review their permissions. How?
First, we need a list of over-exposed sites. How?
There are two methods (for more details, check this article):
Brute force – use PowerShell or a third-party tool to get a permissions report on all sites in the tenant and select permissions granted to Everyone…
Smart move – use Microsoft Search. As search is security-trimmed, we can search for available content on behalf of a user with no permissions granted (see the sketch below).
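Here is a sketch of the “smart move” with PnP.PowerShell (sign in interactively as a test account that has not been granted any permissions anywhere; URLs and property names are examples):
# Sketch: search on behalf of a permission-less test account
# anything returned is effectively open to Everyone
# (newer PnP.PowerShell versions may also require -ClientId of your own app registration)
Connect-PnPOnline -Url "https://contoso.sharepoint.com" -Interactive
$results = Submit-PnPSearchQuery -Query "IsDocument:True" -All -SelectProperties "Title","Path","SiteName"
$results.ResultRows | Select-Object Title, Path, SiteName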
Then we find the owners of each wide-open site (see the sketch after this list). How?
for group-based sites we get the members of the “Owners” group
for non-group-based sites we get the site collection administrators
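A sketch of both cases with PnP.PowerShell (it assumes tenant admin access; treat the cmdlet and property names as a starting point, not a definitive implementation):
# Sketch: find owners for group-connected and classic sites
Connect-PnPOnline -Url "https://contoso-admin.sharepoint.com" -Interactive
foreach ($site in Get-PnPTenantSite) {
    if ($site.GroupId -and $site.GroupId -ne [Guid]::Empty) {
        # group-connected site: owners of the Microsoft 365 group
        $owners = (Get-PnPMicrosoft365GroupOwners -Identity $site.GroupId).UserPrincipalName
    }
    else {
        # classic / non-group site: connect to the site and get the site collection administrators
        Connect-PnPOnline -Url $site.Url -Interactive
        $owners = (Get-PnPSiteCollectionAdmin).Email
    }
    [PSCustomObject]@{ Url = $site.Url; Owners = ($owners -join "; ") }
}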
We would also sort sites by “is it supposed to be public?”. I.e. if the site was born public, e.g. a public Team, a public Yammer community or a Communication site, maybe it’s less of a concern.
It would be a good idea to bring in DLP and/or automatic content sensitivity labelling, so we could start remediation with the sites labelled as storing the most sensitive data.
Finally, we need to let the site owner know that their site is open to everybody and ask them to fix it. How?
e-mail communication
a SharePoint list containing overshared sites with owners, configured the way every person can see only the sites he/she owns
You run some PnP PowerShell code unattended, e.g. as a daemon/service app or background job, under application permissions, with no user interaction. Your app needs to connect to SharePoint and/or the Microsoft Graph API. Your organization requires authentication with a certificate (no secrets). You want the certificate stored securely in Azure Key Vault.
Solution (Step-by-step process)
Obtain a certificate (create a self-signed one or request a trusted one)
In the Azure tenant where you have your Microsoft 365 SharePoint tenant
Create a new registered app in Azure; save the Application (client) Id and Directory (tenant) Id
Configure App: add MS Graph and SharePoint API application (not delegated) permissions
Upload the certificate to the app under “Certificates & secrets”
In the Azure environment where you have a paid subscription (could be the same tenant or a different one)
Create an Azure Key Vault
Upload certificate to the Key Vault manually (with GUI)
While you develop/debug your custom daemon application at your local machine
Provide permissions to the Key Vault for your personal account
Connect to Azure (the subscription where your Key Vault is) by running Connect-AzAccount, so your app can get the certificate to authenticate to SharePoint Online
For your application deployed to Azure (e.g. an Azure Function App)
Turn on managed identity (your Function App -> Identity -> Status: On) and Save; note the Object (principal) Id that is created
Grant your managed identity (principal Id) permissions to the Key Vault, so that when your daemon app runs in the cloud it can go to the Key Vault and retrieve the certificate (a sketch of these setup steps follows below)
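A sketch of those setup steps in PowerShell (certificate name, validity and account names are just examples; New-SelfSignedCertificate is Windows-only, and if your Key Vault uses the RBAC permission model you would assign a role like “Key Vault Secrets User” instead of an access policy):
# Sketch: create a self-signed certificate and export it
$cert = New-SelfSignedCertificate -Subject "CN=PnPDaemonApp" -CertStoreLocation "Cert:\CurrentUser\My" -NotAfter (Get-Date).AddYears(2)
$pfxPwd = Read-Host "PFX password" -AsSecureString
Export-PfxCertificate -Cert $cert -FilePath .\PnPDaemonApp.pfx -Password $pfxPwd
Export-Certificate -Cert $cert -FilePath .\PnPDaemonApp.cer   # upload the .cer to the app registration ("Certificates & secrets")
# Sketch: import the PFX into the Key Vault (instead of uploading via the GUI)
Import-AzKeyVaultCertificate -VaultName "<azure key vault name>" -Name "PnPDaemonApp" -FilePath .\PnPDaemonApp.pfx -Password $pfxPwd
# Sketch: allow your own account (local debugging) and the managed identity (cloud) to read the certificate/secret
Set-AzKeyVaultAccessPolicy -VaultName "<azure key vault name>" -UserPrincipalName "you@contoso.com" -PermissionsToSecrets get -PermissionsToCertificates get
Set-AzKeyVaultAccessPolicy -VaultName "<azure key vault name>" -ObjectId "<managed identity object id>" -PermissionsToSecrets get -PermissionsToCertificates get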
Here is the sample PowerShell code to get the certificate from Azure Key Vault and connect to SharePoint with PnP (Connect-PnPOnline):
# ensure you use PowerShell 7
$PSVersionTable
# connect to your Azure subscription
Connect-AzAccount -Subscription "<subscription id>" -Tenant "<tenant id>"
Get-AzSubscription | fl
Get-AzContext
# Specify Key Vault Name and Certificate Name
$VaultName = "<azure key vault name>"
$certName = "<certificate name as it is stored in the key vault>"
# Get certificate stored in KeyVault (Yes, get it as SECRET)
$secret = Get-AzKeyVaultSecret -VaultName $vaultName -Name $certName
$secretValueText = ($secret.SecretValue | ConvertFrom-SecureString -AsPlainText )
# connect to PnP
$tenant = "contoso.onmicrosoft.com" # or tenant Id
$siteUrl = "https://contoso.sharepoint.com"
$clientID = "<App (client) Id>" # Azure Registered App with the same certificate and API permissions configured
Connect-PnPOnline -Url $siteUrl -ClientId $clientID -Tenant $tenant -CertificateBase64Encoded $secretValueText
Get-PnPSite
Update: Private content mode will stop working on June 30, 2025. Microsoft announced Private content mode retirement in Microsoft Viva Engage (Yammer). Message ID MC1045211.
As an Office 365 administrator, I would like to get some reports on Yammer with PowerShell. How is it done?
By default, even with a Verified Admin token, you do not have access to private messages and private groups content. To get the private stuff, you need to select “Private Content Mode” under Yammer Admin Center -> Content and Security -> Content Mode:
Yes, sure… But! Since it is a cloud operation against Microsoft 365, you will be throttled if you start more than 2 parallel threads, and using just 2 threads does not provide a significant performance improvement.
Batching
So, try PnP.PowerShell batches instead. When you use batching, the number of requests to the server is much lower. Consider something like:
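A minimal sketch (list name and item values are placeholders; assumes an existing PnP connection):
# Sketch: create 1000 items with a handful of requests instead of 1000 round-trips
$batch = New-PnPBatch
1..1000 | ForEach-Object {
    Add-PnPListItem -List "Tasks" -Values @{ Title = "Item $_" } -Batch $batch
}
Invoke-PnPBatch -Batch $batch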
Scenario: you have a large (>5k items) list in SharePoint Online that you need to clean up, for instance:
You need to delete the entire list
You need to delete all the list items, but keep the list
You need to delete some of list items, but keep the others
Deleting a large SharePoint Online list
There used to be a problem in SharePoint Online: you could not delete a large list, you had to remove all items first, but removing all items was also a challenge. Microsoft has improved SharePoint Online, so now it takes about a second to delete any SharePoint list, including a 5000+ item list, via GUI or PowerShell:
Remove-PnPList -Identity $list
The command works very fast: about a second to delete an entire list with more than 5000 items.
Delete all items in a large SharePoint Online list
In this scenario we need to keep the list, but make it empty (clean it up).
GUI: you can change the list view’s “Item Limit” setting to <5000 and try to delete items in chunks, but (at least in my experience) when you try to select, let’s say, 1000 items and delete them via the GUI, it says “775 items were not deleted from large list”:
So this option does not seem like a good one.
ShareGate: third-party tools like ShareGate or SysKit give good results too.
For me, both methods gave the same good result: ~17 items per second (about 7 times faster than regular item-by-item deletion).
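The PnP batching variant of this clean-up is not shown above, so here is a minimal sketch (assumes an existing PnP connection; “Tasks” is a placeholder list name). Splitting the work into several smaller batches tends to behave better than one huge batch, as the measurements further below suggest:
# Sketch: delete all items of a large list using PnP batches of 1000 items
$listName = "Tasks"
$items = @(Get-PnPListItem -List $listName -PageSize 1000 -Fields "ID")
for ($i = 0; $i -lt $items.Count; $i += 1000) {
    $batch = New-PnPBatch
    $items[$i..([Math]::Min($i + 999, $items.Count - 1))] | ForEach-Object {
        Remove-PnPListItem -List $listName -Identity $_.Id -Batch $batch
    }
    Invoke-PnPBatch -Batch $batch
}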
Deleting some items from a large SPO list
Consider the following scenario: in a large SharePoint list there are items you need to delete, and the rest must stay (a typical case might be purging old items, e.g. items created last year).
In this case you would:
get all list items (or use query to get some list items)
select items that need to be deleted based on your criteria, e.g. created date or last modified date etc.
use PnP.PowerShell batches to delete only what you need
# to get all list items
$listItems = Get-PnPListItem -List Tasks -PageSize 1000
# or to get some list items
$listItems = Get-PnPListItem -List Tasks -Query <query>
# select items to delete, e.g. items not modified within the last year
$threshold = (Get-Date).AddYears(-1)
$itemsToDelete = $listItems | Where-Object { $_["Modified"] -lt $threshold }
# delete the selected list items in one batch
$batch = New-PnPBatch
$itemsToDelete | ForEach-Object { Remove-PnPListItem -List "Tasks" -Identity $_ -Batch $batch }
Invoke-PnPBatch -Batch $batch
PnP.PowerShell batch vs ScriptBlock
How fast are PnP batches? What is better in terms of performance – ScriptBlock or Batching? Here are my measurements:
| Time elapsed, seconds | with batches | with ScriptBlock | without batches |
|---|---|---|---|
| Add-PnPListItem (100 items) | 6-10 seconds | | 60-120 seconds |
| Add-PnPListItem (500 items) | 20-40 seconds | | 230-600 seconds |
| Add-PnPListItem (7000 items) | 314-600 seconds | | |
| Add-PnPListItem (37000 items) | 3200 seconds | | |
| Remove-PnPListItem (1000 items) | 58-103 seconds | 58 seconds | 430-1060 seconds |
| Remove-PnPListItem (7000 items) | 395-990 seconds | 397-980 seconds | 3000 seconds |
| Remove-PnPListItem (30000 items) | one big batch: 13600 seconds; 30 batches of 1000 items each: 3500 seconds | | |
Both PnP PowerShell batches and ScriptBlocks are 7-10 times faster than plain PnP PowerShell!
Can we use Microsoft Graph API to complete the same task? TBC…
Note, for the sake of history: it used to be like this for 5k+ lists: “Remove-PnPList” failed with the message “The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator”. Deleting via the GUI failed too.
As I am a SharePoint person, and retention policies and labels are not a SharePoint engineer’s responsibility, I do not go to the Microsoft 365 Compliance Center frequently. Below are my notes to myself on the key points: how to create and configure Office 365 retention labels and policies in the Compliance Center and use labels in SharePoint Online (SPO).
In SPO at each site collection level you can still work with retention policies the old way – create policies under Site Collection Settings – Content Type Policy – and apply policies at library level under Library Settings/Information Management Policy Settings. There is also Site Retention Policy.
But Microsoft is making efforts to centralize and unify such things, so you can specify retention policies in one place and apply them across all Office 365 content (not only SharePoint). That place was called the Office 365 Security and Compliance Center (SCC). Later Microsoft separated the Security Center and the Compliance Center. So currently Retention Policies are under “Microsoft Purview” (formerly the Microsoft Compliance Center) -> Solutions -> “Data lifecycle management”:
To get access to the “Data lifecycle management” solution, you need to have the “” or “” role. A SharePoint or Teams administrator cannot access Purview. Even with the “Global reader” or “Security reader” role, an admin will not be able to see the “Data lifecycle management” blade. Here is how Microsoft Purview looks for a Global reader:
Although SharePoint admins usually do not have access to the Compliance Center and do not go to site content, we still need to know how it all works. And labels are the recommended way to specify retention in SharePoint, so here we are.
Labels are applied to documents, documents are kept in libraries, and at each library you can “Apply a label to items in this library”.
Create Labels
Labels are created in SCC under Classification. The main part looks familiar to SharePoint people:
Label Settings
You can
Retain Content forever or for a specified number of days/months/years and then – delete it or trigger a disposition review or do nothing
Delete content if it’s older than specified number of days/months/years
after it was created/modified/labelled
Apply labels
Now you need to publish the created labels, and that is how you create a policy. I.e. policies are where you specify which labels apply to which content (Exchange, OneDrive, SharePoint, Office 365 groups).
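The same can be scripted with Security & Compliance PowerShell; a sketch under the assumption that you have the ExchangeOnlineManagement module and compliance permissions (label name, duration and site URL are examples only):
# Sketch: create a retention label and publish it to a SharePoint site
Connect-IPPSSession   # Security & Compliance PowerShell
# label: retain for 7 years (2555 days) after creation, then delete
New-ComplianceTag -Name "Retain 7 years then delete" -RetentionAction KeepAndDelete -RetentionDuration 2555 -RetentionType CreationAgeInDays
# publishing the label = creating a label policy for the chosen locations
New-RetentionCompliancePolicy -Name "SPO retention labels" -SharePointLocation "https://contoso.sharepoint.com/sites/records"
New-RetentionComplianceRule -Policy "SPO retention labels" -PublishComplianceTag "Retain 7 years then delete"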
You can also auto-apply labels based on conditions, like
content that contains sensitive info
content that contains specific words or phrases, or properties
content that matches a trainable classifier
but as per Microsoft, “It will take up to 7 days to automatically apply the label to all items that match your conditions.”
Note: “trainable classifier” means an AI ML will be used, and as per Microsoft “Creating machine learning rules requires an Office 365 E5 subscription for your organization”
SharePoint admin center
There is nothing you can do with labels in the SharePoint admin center. Labels are created, published and auto-applied in the Compliance Center. At the site collection level, site administrators can apply labels.
SharePoint site
In site collection settings you can still see “Content Type Policy Templates” and “Site Policy”, but that is not where retention labels live. Labels are applied at the library level under Library Settings / “Apply label to items in this list or library”,
where you can select a label to apply to all new items in the library. You can also apply the label to items that already exist in the library.
You can also apply (or change) a label for each single item, or for multiple selected items, from the details pane:
or from the context menu under More / Compliance details:
Adaptive retention policies and scopes
Microsoft recently implemented “adaptive” retention policies. At step 2 of “Create retention policy” you’ll be asked to “Choose the type of retention policy to create”: “A policy can be adaptive or static. An advantage of an adaptive policy is that it will automatically update where it’s applied based on attributes or properties you’ll define. A static policy is applied to content in a fixed set of locations and must be manually updated if those locations change.”
And if you selected “Adaptive” – on the next step you will need to provide the adaptive scope (so at this moment you should already have created your adaptive scopes):
So, let us create your adaptive scopes. What type of scope do you want to create? SharePoint sites…
And then you’ll have nothing more than a set of conditions:
where you can use the attributes “Site URL”, “Site Name” and “Refinable String 0”..“Refinable String 99”. The conditions are “is equal to”, “is not equal to”, “starts with” and “not starts with”. Or you can select “Advanced query builder” and enter a KQL query.
What is the takeaway from this for SharePoint administrators? We will be asked to configure SharePoint so that compliance/retention people can use Refinable Strings.
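One building block on the SharePoint side is an indexed property bag value per site. PnP.PowerShell has Set-PnPAdaptiveScopeProperty for that (a sketch; the site URL and key/value are examples, and the resulting crawled property still has to be mapped to a RefinableString managed property in the search schema before the adaptive scope can use it):
# Sketch: stamp a site with an indexed property bag value for adaptive scopes
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/finance" -Interactive
Set-PnPAdaptiveScopeProperty -Key "Department" -Value "Finance"
# after a re-crawl: map the new crawled property to e.g. RefinableString00 in the SharePoint search schema,
# then use RefinableString00 in the adaptive scope conditions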
How do users know what files are going to be deleted
It would be a good idea to let users know that their files will be deleted, but the next question would be “can I get a list of files that are scheduled for deletion?”
Surely it is possible for admins to go through site content and find files that were last modified earlier than a specific date, but there is another method: you can recommend that your users search through their site using a date-based query on the last modified time.
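A hedged example of such a query (KQL managed properties as commonly used for SharePoint search; adjust the date to your retention settings):
# Sketch: find documents not modified since a given date
# in the site's search box:  IsDocument:True LastModifiedTime<2023-01-01
# or via PnP.PowerShell:
Submit-PnPSearchQuery -Query "IsDocument:True AND LastModifiedTime<2023-01-01" -All -SelectProperties "Title","Path","LastModifiedTime"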