Dataverse can now manage cloud flow runs, letting you use its scalable storage to track your cloud flow executions. Note that only solution cloud flows have their run metadata stored in Dataverse.
Each cloud flow execution is logged in the Flow Run table. Because this is an elastic table built on Dataverse’s non-relational storage, storing and managing cloud flow run metadata at scale is streamlined and efficient.
| Element | Description |
| --- | --- |
| Name | Unique identifier and Logic App ID for the flow run |
| Start time | Timestamp marking the initiation of the cloud flow |
| End Time | Timestamp indicating completion of the cloud flow |
| Run Duration | Duration of the cloud flow run in seconds |
| Status | Outcome of the flow run (e.g., Success, Failed, Cancelled) |
| Trigger Type | Category of the trigger (Automated, Scheduled, Manual) |
| Error Code | Code representing any error encountered during the flow run |
| Error Message | Detailed message explaining the error, if any |
| Owner | Individual or entity that owns the flow |
| Workflow Name | Name displayed for the cloud flow |
| Workflow ID | Unique identifier for the specific cloud flow |
| IsPrimary | Indicator if the flow run is primary (binary value) |
| Parent Run ID | Identifier of the parent cloud flow run, for child flow records |
| Partition ID | User-specific partition ID in the elastic table instance |
| Time to Live | Duration in seconds before automatic deletion of the run record |
You can also form a URL from a run instance’s details to open more detailed information about that specific run.
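As a sketch of that URL construction: the `make.powerautomate.com` deep-link pattern and the placeholder IDs below are assumptions for illustration, not values from this post, so verify the pattern against a run opened in your own portal.

```python
# Build a deep link to a specific cloud flow run's details page.
# NOTE: the URL pattern below is an assumption based on the Power Automate
# portal's usual deep-link format; confirm it against your environment.

def flow_run_url(environment_id: str, flow_id: str, run_name: str) -> str:
    """Compose the portal URL for one flow run instance."""
    return (
        "https://make.powerautomate.com"
        f"/environments/{environment_id}"
        f"/flows/{flow_id}/runs/{run_name}"
    )

# Hypothetical IDs for illustration only.
url = flow_run_url(
    "Default-00000000-0000-0000-0000-000000000000",
    "11111111-1111-1111-1111-111111111111",
    "08584404435591000000000000000000000",
)
print(url)
```

The Workflow ID and Name columns from the Flow Run table supply the flow and run segments of the path.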
You can access and modify the details of cloud flow run metadata using standard Dataverse APIs, via a Dataverse connector, or directly within the Tables view of the maker portal.
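As a minimal sketch of reading run metadata through the Dataverse Web API: the `flowruns` entity set name, the column names, and the org URL below are assumptions, so check the table’s logical name and columns in your environment’s metadata, and note that a real request also needs an OAuth bearer token.

```python
import urllib.parse

# Sketch: compose a Dataverse Web API query for recent failed flow runs.
# Assumptions: the Flow Run table's entity set is "flowruns" and the column
# names below match the table; verify both before using this for real.
org_url = "https://yourorg.crm.dynamics.com"  # hypothetical org URL

params = {
    "$select": "name,starttime,endtime,status,errorcode,errormessage",
    "$filter": "status eq 'Failed'",
    "$orderby": "starttime desc",
    "$top": "10",
}
query = urllib.parse.urlencode(params, quote_via=urllib.parse.quote)
request_url = f"{org_url}/api/data/v9.2/flowruns?{query}"
print(request_url)
# A real GET also needs headers such as Authorization: Bearer <token>
# and OData-MaxVersion: 4.0.
```

The same OData query shape works from the Dataverse connector’s “List rows” action by pasting the select/filter expressions into the corresponding fields.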
By default, the metadata for each flow run is retained for 28 days, equivalent to 2,419,200 seconds. Should you need to alter this retention period, you can do so by adjusting the ‘Time to live’ setting, measured in seconds, in the Organization table of your Dataverse-backed environment. This flexibility allows you to tailor the metadata storage duration to suit your specific environmental storage capabilities and needs.
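The arithmetic behind that setting is simple; this small helper (hypothetical, just for illustration) converts a desired retention in days into the seconds value you would put in the ‘Time to live’ setting:

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds in a day

def retention_seconds(days: int) -> int:
    """Convert a retention period in days to a TTL value in seconds."""
    return days * SECONDS_PER_DAY

print(retention_seconds(28))  # default retention: 2419200 seconds
print(retention_seconds(7))   # e.g. a shorter 7-day retention: 604800 seconds
```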

Great post! Any idea if something needs to be enabled to populate data into the flow run table? The table has no data in any of our environments!
Hi Lars Hem,
Apologies for the confusion; it looks like this is a preview feature. When I was initially testing, it was logging correctly, but now I’ve noticed that in my environment too the logs are getting cleared. Let’s wait a few days; it might start functioning properly again. Regarding activation, there’s no need to enable anything; it should store the logs automatically.
Another great way is to use the m365 CLI to access/manage flow runs.
From the website https://ashiqf.com/2021/05/16/cancel-all-your-running-power-automate-flow-runs-using-m365-cli-and-rest-api/:

```powershell
$flowEnvironment = $args[0]
$flowGUID = $args[1]
$flowRuns = m365 flow run list -e $flowEnvironment -f $flowGUID --output json | ConvertFrom-Json
$targetDate = "2024-01-22"
foreach ($run in $flowRuns) {
    if ($run.status -eq "Running" -and $run.startTime.Substring(0,10) -eq $targetDate) {
        Write-Output "Run details: " $run
        # Cancel all the running flow runs
        m365 flow run cancel -e $flowEnvironment -f $flowGUID --name $run.name
    }
}
```
This only works for solution cloud flows. For non-solution cloud flows you have to be the owner, or the flow must have been shared with you; otherwise it will fail.
Indeed Maria, these are targeting flows that are solution aware.