Costa Rica
Last updated: 2025-04-21
This process shows how to set up Microsoft Fabric Activator to automate workflows by detecting file creation events in a storage system and triggering another pipeline to run.
- First Pipeline: The process starts with a pipeline that ends with a `Copy Data` activity. This activity uploads data into the Lakehouse.
- Event Stream Setup: An Event Stream is configured in Activator to monitor the Lakehouse for file creation or data upload events.
- Triggering the Second Pipeline: Once the event is detected (e.g., a file is uploaded), the Event Stream triggers the second pipeline to continue the workflow.
Note
The example script generates random data with fields such as `id`, `name`, `age`, `email`, and `created_at`, organizes it into a PySpark DataFrame, and saves it to a specified Lakehouse path in Delta format.
ExampleOfNotebookGeneratesRandomDeltaData.mp4
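As a rough illustration of what such a notebook does, the sketch below builds the same random fields using only the Python standard library; the commented lines show where the PySpark/Delta write would go. The exact schema values and Lakehouse path are assumptions for illustration.

```python
import random
import string
import uuid
from datetime import datetime, timezone

def make_random_records(n: int) -> list:
    """Generate n records with id, name, age, email, and created_at fields."""
    records = []
    for _ in range(n):
        name = "".join(random.choices(string.ascii_lowercase, k=8))
        records.append({
            "id": str(uuid.uuid4()),
            "name": name,
            "age": random.randint(18, 90),
            "email": f"{name}@example.com",  # assumed email pattern
            "created_at": datetime.now(timezone.utc).isoformat(),
        })
    return records

rows = make_random_records(5)

# In the actual notebook, these rows would be loaded into a PySpark
# DataFrame and written to a Lakehouse path in Delta format, e.g.:
#   df = spark.createDataFrame(rows)
#   df.write.format("delta").mode("append").save("Tables/random_data")
```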
- Create the Pipeline:
  - In Microsoft Fabric, create the first pipeline that performs the required tasks.
  - Add a `Copy Data` activity as the final step in the pipeline.
- Generate the Trigger File:
  - Configure the `Copy Data` activity to create a trigger file in a specific location, such as Azure Data Lake Storage (ADLS) or OneLake.
  - Ensure the file name and path are consistent and predictable (e.g., `trigger_file.json` in a specific folder).
- Publish and Test: Publish the pipeline and test it to ensure the trigger file is created successfully.
Example.Pipeline.Notebook.RandomData.Copy.Data.to.Different.LakeHouse.mp4
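The trigger file itself can be minimal; what matters is the consistent, predictable name and path. The sketch below writes such a file locally as an illustration — the payload field names are assumptions, and in the real pipeline the `Copy Data` sink would point at ADLS or OneLake instead of a local folder.

```python
import json
import tempfile
from datetime import datetime, timezone
from pathlib import Path

def write_trigger_file(folder: Path) -> Path:
    """Write trigger_file.json with a small payload marking pipeline completion."""
    payload = {
        "source_pipeline": "pipeline-1",  # assumed field names and values
        "completed_at": datetime.now(timezone.utc).isoformat(),
        "status": "Succeeded",
    }
    path = folder / "trigger_file.json"
    path.write_text(json.dumps(payload, indent=2))
    return path

# Demo against a temporary local folder:
tmp = Path(tempfile.mkdtemp())
trigger_path = write_trigger_file(tmp)
```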
Tip
Event options:
Event.Stream.Options.mp4
- Set Up an Event:
  - Create a new event to monitor the location where the trigger file is created (e.g., ADLS or OneLake). Click on `Real-Time`:
  - Choose the appropriate event type, such as `File Created`.
  - Add a source:
How.to.Create.an.Event.Stream.-.Real.Analytics.Activator.mp4
- Test Event Detection:
  - Save the event and test it by manually running the first pipeline to ensure Activator detects the file creation.
  - Check the Event Details screen in Activator to confirm the event is logged.
Event.Stream.In.Action.mp4
- Create the Pipeline:
  - In Microsoft Fabric, create the second pipeline that performs the next set of tasks.
  - Ensure it is configured to accept external triggers.
- Publish the Pipeline: Publish the second pipeline and ensure it is ready to be triggered.
Get.Meta.Data.from.Fabric.Items.-.Second.Pipeline.mp4
- Set up the Activator:
How.to.set.up.Activator.from.Event.Stream.mp4
- Create a New Rule:
  - In Activator, create a rule that responds to the event you just configured.
  - Set the condition to match the event details (e.g., file name, path, or metadata).
- Set the Action:
  - Configure the rule to trigger the second pipeline.
  - Specify the pipeline name and pass any required parameters.
- Save and Activate:
  - Save the rule and activate it.
  - Ensure the rule is enabled and ready to respond to the event.
Define.Rule.in.Activator.mp4
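Conceptually, the rule evaluates each incoming event against the conditions you set and fires the action only on a match. The sketch below mimics that matching logic in plain Python; the event field names (`fileName`, `folderPath`) are assumptions for illustration, not Activator's actual event schema.

```python
def rule_matches(event: dict, expected_name: str, expected_folder: str) -> bool:
    """Return True when the event's file name and folder match the rule conditions."""
    return (
        event.get("fileName") == expected_name
        and event.get("folderPath", "").rstrip("/") == expected_folder.rstrip("/")
    )

# A hypothetical file-created event payload:
event = {"fileName": "trigger_file.json", "folderPath": "Files/triggers/"}
should_fire = rule_matches(event, "trigger_file.json", "Files/triggers")

# When should_fire is True, Activator would run the configured action
# (here: starting the second pipeline).
```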
- Run the First Pipeline: Execute the first pipeline and verify that the trigger file is created.
- Monitor Activator: Check the Event Details and Rule Activation Details screens in Activator to ensure the event is detected and the rule is activated.
- Verify the Second Pipeline: Confirm that the second pipeline is triggered and runs successfully.
Testing.Entire.Workflow.mp4
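When verifying the end of the workflow, it can help to scan the second pipeline's recent run records for a successful run. A minimal sketch — the run-record shape (`pipelineName`, `status`) is an assumption for illustration, not the actual format returned by Fabric's monitoring views or APIs.

```python
def second_pipeline_succeeded(runs: list, pipeline_name: str) -> bool:
    """Return True if any run record for pipeline_name completed successfully."""
    return any(
        r.get("pipelineName") == pipeline_name and r.get("status") == "Succeeded"
        for r in runs
    )

# Example run records, e.g. copied from the monitoring view:
runs = [
    {"pipelineName": "pipeline-1", "status": "Succeeded"},
    {"pipelineName": "pipeline-2", "status": "Succeeded"},
]
ok = second_pipeline_succeeded(runs, "pipeline-2")
```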
- If the second pipeline does not trigger:
- Double-check the rule configuration in Activator.
- Review the logs in Activator for any errors or warnings.