Tips for testing an auto-apply Retention Label policy


A post written in response to questions surrounding the testing of auto-apply label policies for retention labels in Microsoft 365, with some pragmatic advice based on what I do when testing.

As of May 2022, a retention label applied via an auto-apply policy can take up to 7 days to apply. That 7 days can seem like forever, particularly if you’re not sure you have the condition specified correctly in the auto-apply policy. Although NONE of the tips I’m about to share can make the back-end process faster than 7 days (sorry to disappoint), they may provide some structure, along with the comfort and confidence you’re seeking, when building out your own organization’s label policies, conditions, and the testing around them. Pick and choose those that suit.

  • Tip 1 – Start with a plan!
  • Tip 2 – Use a non-production tenant
  • Tip 3 – Validate conditions ahead of time
  • Tip 4 – Test against a single container location
  • Tip 5 – Consider a ‘Just label the items’ retention label (maybe)
  • Tip 6 – Confirm where the retention label has been applied
  • Tip 7 – Testing with short retention periods
  • Tip 8 – Clean up after yourself 😉

Tip 1 – Start with a plan!

This is the most important tip and why it’s #1. With few exceptions, this is where I start. Proof that I walk the talk on this tip: I keep a (large) OneNote notebook dedicated exclusively to retention testing all-up. I do this for several reasons:

  • Documenting what it is I’m trying to validate/test. Enough time passes and you might forget the thing you’re trying to test. What can I say? I’m a busy girl. 🙂
  • Unwinding a tenant from some of these tests is not a trivial undertaking (although it’s getting easier) so having a plan and making sure your setup isn’t haphazard and accidental will help unwind it when/if the time comes
  • Tests can take from days to weeks to complete depending on what you’re testing (disposition review I’m looking at you). Unless you have an exceptional memory or are doing a very small number of tests, it’s easy to forget the specific labels, policies, conditions, sites, test content, and expected results you had as part of the test (Trust me on this one)
  • It’s a great place to document ‘lessons learned’ and results for the eventual setup in a production tenant. After all, you don’t want to make the same mistake twice (Trust me, it’s happened)

You’re well-served to have a thoughtful, intentional, and organized test plan before hands touch the keyboard in the Compliance Center. Your future self will thank you!


Tip 2 – Use a non-production tenant

If you have one available to you, the safest place to test out an auto-apply condition is in a non-production tenant. This is where you “go to school” and learn how the tech works for your specific business scenario before doing it “live”.

Many great use-cases exist for a non-production tenant in this context… here are a few:

  • Test out your information architecture and how files can have a retention label applied based on certain metadata values. This may include how you want to set up your tenant-level term store to leverage it for compliance
  • Test out your SIT and Trainable Classifier conditions based on some seeded content (similar to production) to determine if it will match
  • Test out a site provisioning solution and how it can be leveraged to streamline your policies based on properties you set

Tip 3 – Validate conditions ahead of time

You have 3 options to auto-apply a retention label in the Records Management UI:

Before waiting (up to) 7 days to find out if your conditions were accurate, try to validate as much as you can in advance. Here are some ways to validate your conditions for each option:

  • Option 1 – ‘Apply label to content that contains sensitive info’
    • Seed some data on a test site with the sensitive information type (SIT) you’re wanting to detect. You’ll see data matching the SIT appear in the Content Explorer tab of Data Classification in the Microsoft Purview Center (image) across Exchange, SharePoint, OneDrive, and Teams.  (See my general comment below about Content Explorer)  
    • You can also use the Test option in the SIT definition to upload sample documents to see if it matches:

 

    • You can also use Content Search to validate the results for SharePoint sites. Please know that searching for a SIT will NOT work against an Exchange mailbox; it will only work against SharePoint sites, Teams sites, OneDrive sites, and Yammer sites.
  • Option 2 – ‘Apply label to content that contains specific words or phrases, or properties’

Note: It’s important to know there’s no built-in validation of the Keyword Query Language (KQL) query you enter as your condition within the tool, which makes it critically important to ensure the query is right!

    • Helpful link: Keyword queries and search conditions – Microsoft Purview (compliance)
    • Seed some data on a test site/mailbox with the conditions you’re wanting to detect. If you’re using a managed property for a custom piece of metadata in SharePoint, you must also ensure the mapping is done in the search schema in advance
    • Use Content Search to validate the results by entering the same KQL condition you’ve entered in the auto-apply condition as your search condition (SharePoint example below):

  • Option 3 – ‘Apply label to content that matches a trainable classifier’
    • This one is a bit trickier to validate. Seeding some content matching the trainable classifiers is the only way I know of to test. Business trainable classifiers (Agreement, Finance, HR, IT, Tax, etc.) may align with some content you already have in your production environment. A recent announcement by Microsoft (Data Discovery using Trainable Classifiers in Content Explorer – Microsoft Tech Community) allows you to see trainable classifier matches in the Content Explorer tab of the Data Classification app (the Agreements trainable classifier shown in image) whether you’ve used the trainable classifier in an existing policy or not – a great way to decide if you can use them as a condition in an auto-apply label policy. (See my general comment below about Content Explorer)  

General comment about using Content Explorer in Data Classification to validate: the number of items you see is a calculation and may not reflect the exact number, as indicated by this warning from Content Explorer:

This is why I suggest alternative methods to validate in some cases.
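The Content Search validation in Options 1 and 2 can also be scripted. Here’s a rough sketch using Security & Compliance PowerShell (requires the ExchangeOnlineManagement module and appropriate compliance permissions; the site URL, search name, and managed-property query below are hypothetical placeholders – substitute your own):

```powershell
# Connect to Security & Compliance PowerShell first
Connect-IPPSSession

# Validate the same KQL you plan to use as the auto-apply condition,
# scoped to a single test site (placeholder values below)
$query = 'DepartmentMP:"Finance"'   # swap in your own managed property or SIT query

New-ComplianceSearch -Name "Validate auto-apply condition" `
    -SharePointLocation "https://contoso.sharepoint.com/sites/RetentionTest" `
    -ContentMatchQuery $query

Start-ComplianceSearch -Identity "Validate auto-apply condition"

# Re-run until Status shows Completed, then review the item count
Get-ComplianceSearch -Identity "Validate auto-apply condition" |
    Format-List Name, Status, Items
```

If the item count matches what you seeded, you have some early confidence the auto-apply condition will match the same content.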


Tip 4 – Test against a single container location 

Keep things as simple as the test will allow but no simpler. Universal testing approach by the way. 😉

Testing against a single container location, particularly if you’re on a production tenant, will simplify the test. In most cases, it’s simply not necessary to test against all locations of a specific type. Once the test is complete, the job of unwinding it may be immeasurably simpler as well.

What do I mean by a “single container location”? 1 SharePoint site or 1 OneDrive site or 1 Exchange mailbox or 1 Microsoft 365 Group.

Make sure you seed content in the container location that will both pass and fail your conditions. Don’t be too “happy path” here… it’s just as important to ensure something is NOT labeled as it is to ensure it IS labeled. 🙂


Tip 5 – Consider a ‘Just label the items’ retention label

Note: the ‘Just label the items’ retention label was formerly known as the ‘Don’t retain or delete items’ retention label.

I almost removed this tip from my list after the recent announcement from Microsoft that you can now delete a record retention label that’s not in use (formerly not allowed). However… if you only want to see whether the retention label will be correctly auto-applied, without having any retention/deletion action taken (likely true in most cases), consider using a retention label configured to ‘Just label the items’, the third option below:

Advantage of this approach? Simplicity. You’re isolating the test to confirming where a retention label has been auto-applied, nothing more.

If you’ve never used this type of label before, here are a few things to know:

  • You can still publish OR auto-apply a ‘Just label items’ label just like a regular or record retention label. This is why it’s a great way to test out your auto-apply policy
  • Important! Even with this type of label, you CANNOT change the retention settings after-the-fact. This means you can’t start off with a label of this kind with the plan to modify the retention configuration after your test to reflect your real retention settings to retain/delete. In this case, its sole purpose is to test your auto-apply condition

Anecdotal story relating to the last bullet point above – I’ve had customers want to repurpose a ‘Just label items’ retention label once the auto-apply policy has been tested, but this isn’t (currently) possible.


Tip 6 – Confirm where the retention label has been applied

At some point, particularly if you’re testing against multiple container locations, you’ll want to see where all content had the retention label applied. A couple of ways of doing this…

Use Content Explorer in the Data Classification feature of the Purview Compliance Center. However, as stated in Tip 3, the number of items you see is a calculation and may not reflect the exact number with that retention label applied. You will see the warning below to indicate this:

There is also a back-end process that runs before items will appear in the Content Explorer that can take up to 2 days so don’t expect to see up-to-the-minute location results to appear.

For these reasons, I recommend using Search if you want to get the exact list of items with a retention label applied across multiple locations. The quickest ways to do this:

Option 1 – Use PowerShell and search (New-ComplianceSearch) with your matching search query to programmatically retrieve all matching items.
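A minimal sketch of Option 1, assuming you’ve already connected with Connect-IPPSSession from the ExchangeOnlineManagement module (the search name is a placeholder, and the label name is the sample label used later in this post):

```powershell
# Find all items (Exchange, SharePoint) carrying a specific retention label.
# "Find labeled items" is a hypothetical search name – use your own.
New-ComplianceSearch -Name "Find labeled items" `
    -ExchangeLocation All -SharePointLocation All `
    -ContentMatchQuery 'compliancetag:"Sample Do Nothing Label 2"'

Start-ComplianceSearch -Identity "Find labeled items"

# Re-run until Status shows Completed, then review the match count
Get-ComplianceSearch -Identity "Find labeled items" |
    Format-List Name, Status, Items
```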

Option 2 – Use Content Search from the Purview Compliance Center with your Retention Label condition. You can export the search report to see the list of items with the label applied. (Note: be aware the retention label is not included in the report if you were wanting to combine multiple retention labels in 1 search):

The above search equates to: compliancetag="Sample Do Nothing Label 2" and will return all items (SharePoint, Exchange) where the label has been applied.

If you want to build an end-user friendly way for test users to identify content with retention label(s) applied, you can also build a custom search page with filters. Refer to a recent post I wrote describing the steps to do this: Searching for Records inside SharePoint Online


Tip 7 – Testing with Short Retention Periods

You may be tempted to put a short retention period on your retention label (e.g., 1 day) if part of your test includes observing what happens to a record when it reaches the end of its retention period, including either an automatic deletion or a disposition review.

It’s important to remember that several back-end processes run to control this end-to-end flow:

  • the automatic application of the retention label (can take up to 7 days)
  • the movement of the item to the disposition review page (happens via a back-end process after the retention label has been applied)

Because of these back-end processes (which you can’t control), don’t expect your 1-day test retention label to appear in the disposition review page on day 2. In my testing, it may take up to 2 weeks for it to appear.

Yet another reason to document your test plan. 


Tip 8 – Clean up after yourself 😉

At some point, particularly if you’re “testing” in a production tenant (don’t judge… it happens a lot), you will/should want to unwind from your testing setup and get back to a pristine state. 🙂 

I haven’t tested every combination of settings you may want to unwind your tenant from. Below are some simple scenarios I’ve tested.

Note: I’ll add the steps required when you’ve configured a disposition review for the label and items are currently sitting in the Pending items tab of the disposition review page.

Microsoft’s guidance on deleting retention labels from Microsoft Purview is here: Deleting retention labels. In summary, it states you cannot delete:

  • an event-based retention label
  • a regulatory record label
  • a retention label that is currently part of an active/disabled retention label policy
  • a record label that has been applied to items

Steps to remove a retention label from the Purview Compliance Center and from the labeled content:

Note: this applies to regular retention labels, record retention labels, and ‘Just label items’ retention labels.

  1. Remove the label from all retention label policies it’s included in
  2. Wait until each policy is in a state of ‘Enabled (Success)’ – this will also automatically put the retention label in an Inactive status
  3. Delete the inactive retention label from the File Plan by clicking the Delete (garbage can) icon – this will remove it from any documents/emails where the label was applied (my testing took ~a day). You DO NOT have to remove the retention labels from content yourself – the system will do it for you

  4. [Optional] Delete the site/team/group you were testing with
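If you prefer scripting the cleanup, the same idea can be sketched in Security & Compliance PowerShell (this assumes you’ve connected with Connect-IPPSSession, that the label was only used in your test policy, and the policy/label names below are hypothetical placeholders):

```powershell
# Step 1/2: delete the test auto-apply policy the label was part of,
# and wait for the change to fully distribute
Remove-RetentionCompliancePolicy -Identity "Test auto-apply policy"

# Step 3: once no policy references the label, delete the label itself.
# This fails if the label is still part of a policy or is a record
# label applied to items (per Microsoft's deletion guidance above).
Remove-ComplianceTag -Identity "Sample Do Nothing Label 2"
```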

 

There you have it… some of my tips! Hopefully you’ve found some of these helpful; the hard part comes in the test itself, and that part’s on you! Good luck. 🙂

Thanks for reading.

-JCK

4 comments

  1. I don’t seem to get 1 day retention label with disposition review and then delete to actually work and delete the file (SPO). It is still there. Any ideas?

    1. Hi JN, I don’t have enough information based on your description to know for sure. The backend services that run don’t guarantee that 1-day retention file will get sent to disposition review on day 2. It can take up to 7 days for the file to get processed for its retention period. It may then take some extra time for it to appear in the pending items of disposition review since it’s controlled by another process. Could any of that explain what you’re seeing?
      Although tempting to test with a 1-day retention period, it can be confusing to validate.
      -Joanne
