Trigger a Flow when a File is Created or Modified in an Azure Blob Container

In January 2020, I created a Flow in Power Automate to transcribe audio files to text using the Azure Cognitive Services Batch Transcription service. You can view that post here.

It was triggered on-demand and required whoever was running the Flow to specify the URL of the audio file in Blob storage. 

The Flow met my needs at the time, but I always planned to revisit the process and make it a bit more automated. Wouldn't it be great if the Flow triggered automatically whenever we added a file to the Blob storage container?

Well, this article explains how to achieve this. 

The purpose of this article is to document and share what I learned, and to encourage others to expand on what I have achieved and create solutions of their own. It is by no means a polished, final solution in itself.

Both my Batch Transcription Flow and the complete Blob to Transcription Flow are available from my Google Drive - https://link.freefall365.com/cognitiveservicesfiles.

I'm starting from my existing Batch Transcription Flow. If you don't have anything to start from, you can download the complete Flow from the URL above, or follow these steps and build your own Flow from here. The first thing I did was delete the manual trigger.

Next, search for 'Blob' and use the 'When a blob is added or modified (properties only)' trigger.

Select the container where your audio files will be stored.

For your next step, before you initialise the variables, search for 'Blob' again and choose the 'Create SAS URI by Path' action (you'll need to scroll to see it).

Populate the Blob path field with the 'List of Files Path' property from the output of your trigger.

Replace the ‘Value’ of the variable declared to hold the record URL with the ‘Web URL’ output from the previous step.
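To make what the 'Create SAS URI by Path' action is doing a little more concrete, here is a simplified sketch of how a read-only service SAS URI for a blob is constructed: an HMAC-SHA256 signature over a version-specific string-to-sign, appended to the blob URL as query parameters. The account name, container, blob name, and service version below are placeholders, and the string-to-sign format varies by service version, so treat this as illustrative rather than a drop-in implementation — in the Flow, the connector action handles all of this for you.

```python
import base64
import hashlib
import hmac
import urllib.parse
from datetime import datetime, timedelta, timezone


def make_blob_sas_uri(account, container, blob, account_key_b64, hours=1):
    """Build a read-only service SAS URI for a blob (simplified sketch).

    Uses the string-to-sign layout for storage service version 2018-11-09;
    later versions add extra fields, so check the Azure Storage docs for
    the version you target.
    """
    sv = "2018-11-09"  # storage service version (assumption for this sketch)
    sp = "r"           # read-only permission
    st = ""            # start time omitted (valid immediately)
    se = (datetime.now(timezone.utc) + timedelta(hours=hours)).strftime(
        "%Y-%m-%dT%H:%M:%SZ"
    )
    canonical = f"/blob/{account}/{container}/{blob}"
    # Fields: sp, st, se, canonicalizedresource, identifier, IP, protocol,
    # version, resource type, snapshot, then five response-header overrides.
    string_to_sign = "\n".join(
        [sp, st, se, canonical, "", "", "", sv, "b", "", "", "", "", "", ""]
    )
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    query = urllib.parse.urlencode(
        {"sv": sv, "sr": "b", "sp": sp, "se": se, "sig": sig}
    )
    return f"https://{account}.blob.core.windows.net/{container}/{blob}?{query}"
```

The resulting URI grants time-limited read access to that single blob, which is exactly what the transcription service needs to fetch the audio file without you exposing the storage account key.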


And finally, save and test your Flow!

That's it! Now, anytime a file is created or modified in the specified Blob container, the Flow will attempt to transcribe it to text and save the result to OneDrive for Business. As I mentioned previously, in a production environment you'd want to do some checks on file type, etc., and probably save the output somewhere else. I also haven't tested this by uploading or modifying a directory, so some error handling may be required. I hope this either helps you if you're stuck or inspires you to do something creative using the Azure Blob connector in Power Automate.
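On the file-type check mentioned above: in the Flow itself this would likely be a Condition action on the blob name from the trigger, but the logic is simple enough to sketch. The extension list below is an assumption — check the Batch Transcription documentation for the formats your service version actually accepts.

```python
def is_supported_audio(blob_name, allowed=(".wav", ".mp3", ".ogg")):
    """Return True if the blob name looks like an audio file we can transcribe.

    The allowed extensions are an assumption for this sketch; adjust them to
    match what the Batch Transcription service supports.
    """
    return blob_name.lower().endswith(allowed)


# Only blobs that pass the check would proceed to the transcription steps.
print(is_supported_audio("Meeting.WAV"))  # True
print(is_supported_audio("notes.txt"))    # False
```

Guarding the Flow this way means an accidental upload of, say, a text file or an image simply short-circuits instead of producing a failed transcription run.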

Blue Skies!!!

Paddy Byrne