Costa Rica
Last updated: 2025-04-21
- Log in to Azure Portal: Open your web browser and go to the Azure Portal. Enter your credentials to log in.
- Search for Data Factory: Use the search bar at the top to search for Data Factory and select Data Factory from the results.
- Create a New Data Factory:
  - Click on the + Create button.
  - In the "Basics" tab, fill in the required fields:
    - Subscription: Select your Azure subscription.
    - Resource Group: Select an existing resource group or create a new one.
    - Region: Choose the region where you want to deploy the Data Factory.
    - Name: Enter a unique name for your Data Factory.
    - Version: Select V2 (the latest version).
  - Click Next to continue.
- Configure Git (Optional): If you want to configure Git for source control, you can do so in the Git configuration tab. This step is optional and can be skipped if not needed.

  Note: Git integration can also be set up later; it is crucial if you need source control or auditing.
- Review and Create: In the Review + create tab, verify your settings and click Create.
- Wait for Deployment: The deployment process will take a few minutes. Once it's complete, you will see a notification.
- Access Data Factory: After the deployment is complete, click on the Go to resource button to access your new Data Factory.
- Launch Data Factory Studio: In the Data Factory resource page, click on the Launch Studio tile to launch the Data Factory Studio, where you can start creating pipelines and other data integration tasks.
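The "Basics" fields above map onto an ARM resource definition that the portal submits for you. A rough offline sketch of that mapping is below; the naming rules (3-63 characters, letters, numbers, and hyphens, starting and ending with a letter or number) and the `2018-06-01` API version come from general Azure documentation, and `adf-demo-123`/`eastus` are placeholder values, not values from this article:

```python
import re

def is_valid_factory_name(name: str) -> bool:
    """Check Data Factory naming rules: 3-63 characters, letters,
    numbers, and hyphens only, starting and ending with a letter or
    number. (Global uniqueness can only be checked against Azure.)"""
    return re.fullmatch(r"[A-Za-z0-9][A-Za-z0-9-]{1,61}[A-Za-z0-9]", name) is not None

def factory_resource(name: str, location: str) -> dict:
    """Build the ARM resource definition behind the Create step."""
    if not is_valid_factory_name(name):
        raise ValueError(f"invalid factory name: {name!r}")
    return {
        "type": "Microsoft.DataFactory/factories",
        "apiVersion": "2018-06-01",  # Data Factory V2 API version
        "name": name,
        "location": location,
        "identity": {"type": "SystemAssigned"},
        "properties": {},
    }

print(is_valid_factory_name("adf-demo-123"))  # True
print(is_valid_factory_name("-bad-name-"))    # False
```

This only illustrates the shape of the request; actually creating the factory still requires the portal flow described above (or an authenticated deployment tool).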
- Log in to Azure Portal: Open your web browser and go to the Azure Portal. Enter your credentials to log in.
- Go to Data Factory: Use the search bar at the top to search for Data Factory and select your Data Factory instance from the list.
- Launch Data Factory Studio: In the Data Factory resource page, click on the Launch Studio tile to launch the Data Factory Studio, where you can start creating pipelines and other data integration tasks.
- Create a New Pipeline: In the Author tab of Data Factory Studio, click the + (plus) icon and select Pipeline.
- Add Activities to the Pipeline: Drag and drop activities from the Activities pane onto the pipeline canvas.
- Configure Activities:
  - Click on each activity on the canvas to configure its properties.
  - For example, if you are using a Copy Data activity, you will need to specify the source and destination datasets.
- Set Up Linked Services: Create linked services that define the connections to your source and destination data stores.
- Create Datasets: Create datasets that point to the data within those linked services, to use as activity inputs and outputs.
- Validate the Pipeline: Click on the Validate button at the top of the pipeline canvas to check for any errors or missing configurations.
- Publish the Pipeline: Once validation is successful, click on the Publish All button to save and publish your pipeline.
- Trigger the Pipeline: Click on Trigger now to run the pipeline immediately, or configure a trigger for scheduled runs.
- Monitor Pipeline Runs: In the Monitor tab, you can view the status of pipeline runs, check for any errors, and review the execution details.
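Everything authored on the canvas is stored as a JSON pipeline definition. As a minimal sketch of what the Copy Data example above produces, assuming Blob-to-Blob copy and placeholder names (`DemoPipeline`, `InputDataset`, `OutputDataset` are illustrative, not from this article):

```python
import json

def dataset_reference(name: str) -> dict:
    """Reference a dataset created in the Create Datasets step."""
    return {"referenceName": name, "type": "DatasetReference"}

def copy_pipeline(name: str, source_ds: str, sink_ds: str) -> dict:
    """Build the JSON definition for a pipeline containing a single
    Copy activity, as authored on the Data Factory Studio canvas."""
    return {
        "name": name,
        "properties": {
            "activities": [
                {
                    "name": "CopySourceToSink",
                    "type": "Copy",
                    "inputs": [dataset_reference(source_ds)],
                    "outputs": [dataset_reference(sink_ds)],
                    "typeProperties": {
                        "source": {"type": "BlobSource"},
                        "sink": {"type": "BlobSink"},
                    },
                }
            ]
        },
    }

pipeline = copy_pipeline("DemoPipeline", "InputDataset", "OutputDataset")
print(json.dumps(pipeline, indent=2))
```

Publish All pushes definitions of this shape to the service, which is also why Git integration (from the setup steps) can version them as ordinary JSON files.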
- Log in to Azure Portal: Open your web browser and go to the Azure Portal. Enter your credentials to log in.
- Go to Azure Data Factory: Once logged in, use the search bar at the top to search for Data Factory and select your Data Factory instance from the list.
- Open the Activity Log:
  - In the Data Factory resource page, look for the Activity log option in the left-hand menu under the Monitoring section.
  - Click on Activity log to open the log view.
- View Activity Log Details:
  - In the Activity Log, you will see a list of events related to your Data Factory.
  - You can see columns such as Operation Name, Status, Event Initiated By, Time, Subscription, and more.
- Filter and Search:
  - Use the filters at the top to narrow down the events by time range, resource group, resource, and more.
  - You can also use the search bar to find specific events or operations.
- Review Event Details: Click on any event in the list to view more detailed information about that event, including the JSON payload with additional properties.
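The same filtering can be done offline once events are exported as JSON. A small sketch using hypothetical sample events (the field names mirror the portal columns; the operation names, caller, and timestamps below are made-up examples, not real log data):

```python
# Hypothetical Activity Log events, shaped like the portal columns
# (Operation Name, Status, Event Initiated By, Time).
events = [
    {"operationName": "Microsoft.DataFactory/factories/pipelineruns/write",
     "status": "Succeeded", "caller": "user@example.com",
     "eventTimestamp": "2025-04-20T10:15:00Z"},
    {"operationName": "Microsoft.DataFactory/factories/write",
     "status": "Failed", "caller": "user@example.com",
     "eventTimestamp": "2025-04-20T11:00:00Z"},
]

def filter_events(events, status=None, operation_contains=None):
    """Mimic the portal's filter bar: keep only events matching an
    exact status and/or an operation-name substring."""
    kept = []
    for event in events:
        if status and event["status"] != status:
            continue
        if operation_contains and operation_contains not in event["operationName"]:
            continue
        kept.append(event)
    return kept

print(len(filter_events(events, status="Failed")))  # 1
```

The JSON payload mentioned in the last step carries additional properties beyond these columns, which the same dictionary-based filtering approach handles unchanged.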




