
Commit b21040f

Merge pull request #36967 from MicrosoftDocs/main
Auto Publish – main to live - 2026-03-27 22:30 UTC
2 parents fbe2568 + 06bebc2 commit b21040f

12 files changed

Lines changed: 184 additions & 4 deletions

docs/includes/2016-esu.md

Lines changed: 9 additions & 0 deletions
---
author: MashaMSFT
ms.author: mathoma
ms.date: 03/27/2026
ms.service: sql
ms.topic: include
---
> [!NOTE]
> Price structure for Extended Security Updates (ESUs) is changing for SQL Server 2016 on Azure VMs. For more information, see the [ESU pricing blog](https://aka.ms/ESUPricingBlog).
(4 binary image files added: 27.7 KB, 64 KB, 17.1 KB, and 36 KB.)
Lines changed: 161 additions & 0 deletions
---
title: "Tutorial: Use SSIS Packages to Write Files to OneLake Through Azure Data Lake Storage Gen2"
description: Learn how to use SSIS packages with Azure Storage connection managers to write files to Azure Data Lake Storage Gen2 and access them in OneLake through shortcuts.
author: chugugrace
ms.author: chugu
ms.reviewer: randolphwest, maghan
ms.date: 03/27/2026
ms.service: sql
ms.subservice: integration-services
ms.topic: tutorial
ms.custom:
  - intro-deployment
  - sfi-image-nochange
---

# Tutorial: Use SSIS packages to write files to OneLake through Azure Data Lake Storage Gen2

This tutorial shows you how to run an existing SSIS package that writes files to Azure Data Lake Storage (ADLS) Gen2, and then surface those files in OneLake by using a shortcut. By combining the Invoke SSIS Package activity in Data Factory for Microsoft Fabric with OneLake shortcuts, you can centralize all your data in OneLake, even data produced by legacy SSIS workloads.

## Use case

Many organizations have SSIS packages that extract and transform data, then write the results as flat files (CSV, Parquet, XML, and others) to Azure Data Lake Storage Gen2. Downstream analytics and reporting systems consume these files.

With Microsoft Fabric, you can bring those files into OneLake without changing your SSIS package logic:

- **Preserve existing SSIS investments**: Continue using battle-tested packages that write files to ADLS Gen2 through the Azure Storage connection manager. No package rewrite is required.
- **Centralize data in OneLake**: Create an ADLS Gen2 shortcut in a Fabric lakehouse so that files written by SSIS appear automatically in OneLake, ready for consumption by Spark, SQL, Power BI, and other Fabric workloads.
- **Orchestrate in Fabric**: Use the Invoke SSIS Package activity in a Fabric pipeline to schedule and monitor package execution alongside other Fabric-native activities.

## Prerequisites

Before you begin, make sure you have:

- A [Microsoft Fabric workspace](/fabric/get-started/create-workspaces) with a Fabric capacity or trial.
- A [lakehouse](/fabric/data-engineering/create-lakehouse) in the workspace.
- An Azure Data Lake Storage Gen2 storage account with [hierarchical namespace enabled](/azure/storage/blobs/create-data-lake-storage-account).
- An SSIS package (*.dtsx*) that uses an [Azure Storage connection manager](../connection-manager/azure-storage-connection-manager.md) to write files to ADLS Gen2.
- Credentials for the ADLS Gen2 account (for example, an account key, shared access signature (SAS), service principal, or organizational account) with at least the **Storage Blob Data Contributor** role.
## Overview

The end-to-end workflow has four steps:

| Step | What you do | Result |
| --- | --- | --- |
| 1 | Configure the SSIS package to write files to ADLS Gen2 | Package produces output files in your storage account |
| 2 | Create an ADLS Gen2 shortcut in a Fabric lakehouse | Files written to ADLS Gen2 appear in OneLake automatically |
| 3 | Upload the SSIS package to OneLake | Package is stored in OneLake and ready to be invoked |
| 4 | Run the package from a Fabric pipeline | Pipeline orchestrates execution, and the package output flows through to OneLake |
## Step 1 - Configure the SSIS package to write files to ADLS Gen2

In this step, you make sure your SSIS package uses an Azure Storage connection manager to write files to your ADLS Gen2 account.

1. Open your SSIS project in **Visual Studio** with the [SQL Server Integration Services Projects extension](../integration-services-developer-documentation.md).
1. Install the [Azure Feature Pack for Integration Services (SSIS)](../azure-feature-pack-for-integration-services-ssis.md). The Feature Pack provides the Azure Storage connection manager, Azure Blob Source, Azure Blob Destination, and other Azure-related tasks and components needed to connect to ADLS Gen2 from an SSIS package.
1. In the **Connection Managers** tray, add (or verify) an **Azure Storage** connection manager. Set the following properties:

   | Property | Value |
   | --- | --- |
   | **Service** | ADLS Gen2 |
   | **Authentication** | Choose one: *AccessKey*, *ServicePrincipal*, or *SharedAccessSignature* |
   | **Account name** | Your ADLS Gen2 storage account name |

   :::image type="content" source="media/storage-connection.png" alt-text="Screenshot of the Azure Storage connection manager configuration dialog.":::

1. Configure your data flow or file system task to use this connection manager and write output files to a container and folder path in the storage account, for example, `mycontainer\myfolder`.

   :::image type="content" source="media/storage-folder-path.png" alt-text="Screenshot of the data flow configuration with the container and folder path for the storage account." lightbox="media/storage-folder-path.png":::

1. Test the connection and verify that the package runs correctly on your local machine.

For full details on the Azure Storage connection manager, see [Azure Storage connection manager](../connection-manager/azure-storage-connection-manager.md).

> [!TIP]
> If your package uses the **DontSaveSensitive** protection level, credentials aren't persisted in the package file. You supply them at runtime through the **Connection Managers** tab of the Invoke SSIS Package activity. Alternatively, you can set the package protection level to **EncryptSensitiveWithPassword**, which encrypts credentials inside the package. You then provide the package password in the Invoke SSIS Package activity at runtime instead of supplying individual connection manager credentials (Step 4).
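Before you wire credentials into the connection manager, you might want to confirm that they can actually write to the target container. The following sketch, which isn't part of the SSIS package itself, uses the `azure-storage-file-datalake` Python SDK; the account, container, and folder names are placeholders you'd replace with your own.

```python
# Optional smoke test: verify that the account name and credentials you plan to
# use in the Azure Storage connection manager can write to the target container.
# Requires: pip install azure-storage-file-datalake
# All names below are placeholders, not values from the tutorial environment.


def dfs_endpoint(account_name: str) -> str:
    """Build the ADLS Gen2 DFS endpoint URL for a storage account."""
    return f"https://{account_name}.dfs.core.windows.net"


def write_probe_file(account_name: str, account_key: str,
                     container: str, folder: str) -> str:
    """Upload a tiny probe file and return its path within the container.

    Raises an azure.core exception if authentication or authorization fails,
    which is exactly what you want to find out before running the package.
    """
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url=dfs_endpoint(account_name), credential=account_key)
    fs = service.get_file_system_client(container)
    path = f"{folder}/_ssis_probe.txt"
    fs.get_file_client(path).upload_data(b"probe", overwrite=True)
    return path
```

A successful call to `write_probe_file` confirms that the same account name and key should work in the **AccessKey** authentication mode of the connection manager.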
## Step 2 - Create an ADLS Gen2 shortcut in a Fabric lakehouse

A shortcut makes the files written by your SSIS package visible in OneLake without copying data. Any Fabric workload (Spark, SQL analytics endpoint, Power BI) can read the files through the shortcut.

1. Open your **lakehouse** in the Fabric portal.
1. In the **Explorer** pane, right-click the **Files** folder (or a subfolder) and select **New shortcut**.
1. Under **External sources**, select **Azure Data Lake Storage Gen2**.
1. Enter the connection URL, which is the DFS endpoint for your storage account:

   ```text
   https://<STORAGE_ACCOUNT_NAME>.dfs.core.windows.net
   ```

1. Select an existing connection or create a new one. Choose an authentication kind that has at least the **Storage Blob Data Reader** role on the storage account.
1. Select **Next**, then browse to the container and folder where your SSIS package writes files (for example, `mycontainer`).
1. Select the target folder, select **Next**, and then select **Create**.

   :::image type="content" source="media/shortcut-storage-container.png" alt-text="Screenshot of the shortcut creation dialog showing the selected storage container." lightbox="media/shortcut-storage-container.png":::

The shortcut now appears in your lakehouse. Any file that the SSIS package writes to the ADLS Gen2 target folder is automatically accessible in OneLake through this shortcut.

For detailed instructions, see [Create an Azure Data Lake Storage Gen2 shortcut](/fabric/onelake/create-adls-shortcut). For more information about shortcuts, see [OneLake shortcuts](/fabric/onelake/onelake-shortcuts).
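To see what the shortcut buys you, consider how a file lands in OneLake's namespace. A file that SSIS writes to `mycontainer/myfolder/report.csv` becomes addressable under the lakehouse **Files** tree. The helper below sketches the ABFS-style OneLake URI that a Fabric Spark notebook could read; the workspace, lakehouse, and shortcut names are placeholders, and the URI shape follows the general OneLake path convention rather than anything specific to this tutorial.

```python
# Illustration of the OneLake path convention for files reached via a shortcut.
# Workspace, lakehouse, and shortcut names are hypothetical placeholders.

ONELAKE_HOST = "onelake.dfs.fabric.microsoft.com"


def onelake_uri(workspace: str, lakehouse: str,
                shortcut: str, relative_path: str) -> str:
    """Build the OneLake URI for a file visible through a lakehouse shortcut."""
    return (f"abfss://{workspace}@{ONELAKE_HOST}/"
            f"{lakehouse}.Lakehouse/Files/{shortcut}/{relative_path}")


# Example: the path a Fabric Spark notebook could pass to spark.read.csv(...)
print(onelake_uri("MyWorkspace", "MyLakehouse", "myfolder", "report.csv"))
```

Because the shortcut is a reference, this path always reflects the latest files in the ADLS Gen2 folder; no copy or refresh step is involved.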
## Step 3 - Upload the SSIS package to OneLake

The Invoke SSIS Package activity reads packages from OneLake. Upload your *.dtsx* file (and optional *.dtsConfig* file) to a lakehouse.

1. In the Fabric portal, open the lakehouse where you want to store the package.
1. In the **Files** section, create a folder, for example, `ssis-packages`.
1. Upload the package by using one of these methods:

   | Method | How |
   | --- | --- |
   | **Fabric portal** | Select **Upload** > **Upload files** and choose your *.dtsx* file. |
   | **OneLake file explorer** | Drag and drop the file into the `ssis-packages` folder through the OneLake file explorer on your desktop. |

For more information about uploading files to OneLake, see the [Invoke SSIS Package activity documentation](/fabric/data-factory/invoke-ssis-package-activity).
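If you prefer to script the upload (for example, from a build pipeline), OneLake exposes an ADLS Gen2-compatible endpoint, so the same `azure-storage-file-datalake` SDK can write the package file. This is a sketch under stated assumptions: the workspace, lakehouse, and folder names are placeholders, and it assumes you have `azure-identity` installed so `DefaultAzureCredential` can pick up your signed-in identity.

```python
# Scripted alternative to the portal upload, using OneLake's ADLS-compatible
# endpoint. Requires: pip install azure-storage-file-datalake azure-identity
# Workspace, lakehouse, and folder names are hypothetical placeholders.

ONELAKE_URL = "https://onelake.dfs.fabric.microsoft.com"


def package_target_path(lakehouse: str, folder: str, package_file: str) -> str:
    """Path of the uploaded package, relative to the workspace file system."""
    return f"{lakehouse}.Lakehouse/Files/{folder}/{package_file}"


def upload_package(workspace: str, lakehouse: str, folder: str,
                   local_path: str) -> str:
    """Upload a local .dtsx file to a lakehouse folder and return its path."""
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(account_url=ONELAKE_URL,
                                    credential=DefaultAzureCredential())
    # In OneLake's ADLS-compatible view, the workspace acts as the file system.
    fs = service.get_file_system_client(workspace)
    target = package_target_path(lakehouse, folder, local_path.split("/")[-1])
    with open(local_path, "rb") as f:
        fs.get_file_client(target).upload_data(f, overwrite=True)
    return target
```

Either way, the resulting path under **Files** is what you browse to in the **Package path** setting in Step 4.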
## Step 4 - Run the package in a Fabric pipeline

1. In your Fabric workspace, create a new **Data Pipeline** or open an existing one.
1. From the **Activities** pane, add the **Invoke SSIS Package** activity to the pipeline canvas.
1. On the **Settings** tab, configure the activity:

   | Setting | Value |
   | --- | --- |
   | **Package path** | Browse to the *.dtsx* file you uploaded in Step 3. |
   | **Configuration path** *(optional)* | Browse to the *.dtsConfig* file, if applicable. |
   | **Encryption password** *(optional)* | If the package protection level is **EncryptSensitiveWithPassword** or **EncryptAllWithPassword**, provide the password used to encrypt the package. |
   | **Enable logging** | Select to write execution logs to OneLake. |

   :::image type="content" source="media/ssis-activity.png" alt-text="Screenshot of the Invoke SSIS Package activity settings tab in a Fabric pipeline." lightbox="media/ssis-activity.png":::

1. Select **Save**, then select **Run** to execute the pipeline immediately, or select **Schedule** to set up recurring execution.
1. Monitor progress on the pipeline **Output** tab or in the workspace **Monitor** hub. If logging is enabled, the activity output includes the **logging path** on OneLake.

For full configuration details, see [Use the Invoke SSIS Package activity to run an SSIS package](/fabric/data-factory/invoke-ssis-package-activity).
## Verify the results

After the pipeline run completes successfully:

1. Open the lakehouse and navigate to the shortcut you created in Step 2.
1. Confirm that the output files written by the SSIS package appear in the shortcut folder.

## Summary

By combining a few Fabric capabilities, you can bring file-based SSIS output into OneLake without modifying your existing packages:

1. The **Azure Storage connection manager** writes files to ADLS Gen2 from within your SSIS package.
1. A **OneLake shortcut** surfaces those files in a Fabric lakehouse, with no data copy required.
1. **Package upload to OneLake** makes the *.dtsx* file available for Fabric pipeline execution.
1. The **Invoke SSIS Package activity** orchestrates and monitors package execution in a Fabric pipeline.

This pattern lets you manage all your data in OneLake while preserving your existing SSIS investments.

## Related content

- [Invoke SSIS Package activity documentation](/fabric/data-factory/invoke-ssis-package-activity)
- [OneLake shortcuts](/fabric/onelake/onelake-shortcuts)
- [Create an Azure Data Lake Storage Gen2 shortcut](/fabric/onelake/create-adls-shortcut)
- [Azure Storage connection manager](../connection-manager/azure-storage-connection-manager.md)
- [Tutorial: Integrate SSIS with SQL database in Microsoft Fabric](integrate-fabric-sql-database.md)
- [Data Factory in Microsoft Fabric overview](/fabric/data-factory/data-factory-overview)

docs/integration-services/toc.yml

Lines changed: 2 additions & 0 deletions
@@ -75,6 +75,8 @@
     href: ../integration-services/fabric-integration/integrate-fabric-data-warehouse.md
   - name: Tutorial - Integrating SSIS with SQL database in Fabric
     href: ../integration-services/fabric-integration/integrate-fabric-sql-database.md
+  - name: Tutorial - Integrating SSIS with OneLake through Azure Data Lake Storage Gen2
+    href: ../integration-services/fabric-integration/tutorial-ssis-write-files-onelake.md
   - name: Install Integration Services
     href: ../integration-services/install-windows/install-integration-services.md
   - name: Installing Integration Services Versions Side by Side

docs/reporting-services/install-windows/configure-the-report-server-service-account-ssrs-configuration-manager.md

Lines changed: 1 addition & 1 deletion
@@ -67,7 +67,7 @@ To view and reconfigure service account information, always use the [!INCLUDE[ss
 For best results, specify an account that has network connection permissions, with access to network domain controllers and corporate Simple Mail Transfer Protocol (SMTP) servers or gateways. The following table summarizes the accounts and provides recommendations for how to use them.

 > [!NOTE]
-> [Group Managed Service Accounts Overview](/windows-server/security/group-managed-service-accounts/group-managed-service-accounts-overview) aren't supported as a report server service account.
+> Managed Service Accounts (MSAs), including both standalone MSAs (sMSAs) and [group MSAs (gMSAs)](/windows-server/security/group-managed-service-accounts/group-managed-service-accounts-overview), aren't supported as a report server service account.

 |Account|Explanation|
 |-------------|-----------------|

docs/sql-server/azure-arc/extended-security-updates.md

Lines changed: 3 additions & 1 deletion
@@ -15,9 +15,11 @@ ms.custom:

 [!INCLUDE [sql-migration-end-of-support](../../includes/applies-to-version/sql-migration-end-of-support.md)]

+This article explains how to manage a [!INCLUDE [ssNoVersion](../../includes/ssnoversion-md.md)] subscription to Extended Security Updates enabled by Azure Arc. For more information about the program, see [What are Extended Security Updates for SQL Server?](../end-of-support/sql-server-extended-security-updates.md)
+
 After [!INCLUDE [ssNoVersion](../../includes/ssnoversion-md.md)] reaches the end of its support lifecycle, you can sign up for an Extended Security Update (ESU) subscription for your servers and remain protected for up to three years. When you upgrade to a newer version of [!INCLUDE [ssNoVersion](../../includes/ssnoversion-md.md)], you can terminate your ESU subscription and stop paying for it. When you [migrate to Azure SQL](/azure/azure-sql/migration-guides/), the ESU charges automatically stop but you continue to have access to the security updates.

-This article explains how to manage a [!INCLUDE [ssNoVersion](../../includes/ssnoversion-md.md)] subscription to Extended Security Updates enabled by Azure Arc. For more information about the program, see [What are Extended Security Updates for SQL Server?](../end-of-support/sql-server-extended-security-updates.md)
+[!INCLUDE [2016-esu](../../includes/2016-esu.md)]

 ## Subscribe to Extended Security Updates in a production environment

docs/sql-server/azure-arc/manage-configuration.md

Lines changed: 4 additions & 2 deletions
@@ -100,6 +100,8 @@ Select the **Use physical core license** checkbox if you're configuring a virtua

 #### Subscribe to Extended Security Updates

+[!INCLUDE [2016-esu](../../includes/2016-esu.md)]
+
 You can subscribe to Extended Security Updates (ESUs) for the individual host. To qualify for an ESU subscription, the host must have **License type** set to **Pay-as-you-go** or **License with Software Assurance**. This option allows you to subscribe by using vCPUs (v-cores) when the host is a virtual machine, or by using physical cores when the host is a physical server that runs without using virtual machines.

 Select **Subscribe to Extended Security Updates**. It sets the host configuration property `EnableExtendedSecurityUpdates` to `True`. The subscription is activated after you select **Save**.
@@ -301,15 +303,15 @@ For more examples of Azure Resource Graph queries, see [Starter Resource Graph q

 #### List Azure Arc-enabled SQL Server instances subscribed to ESUs

-The following example shows how you can view all eligible [!INCLUDE [sssql11-md](../../includes/sssql11-md.md)] or [!INCLUDE [sssql14-md](../../includes/sssql14-md.md)] instances and their ESU subscription status:
+The following example shows how you can view all eligible [!INCLUDE [sssql14-md](../../includes/sssql14-md.md)] instances and their ESU subscription status:

 ```kusto
 resources
 | where type == 'microsoft.azurearcdata/sqlserverinstances'
 | extend Version = properties.version
 | extend Edition = properties.edition
 | extend containerId = tolower(tostring (properties.containerResourceId))
-| where Version in ("SQL Server 2012", "SQL Server 2014")
+| where Version in ("SQL Server 2014")
 | where Edition in ("Enterprise", "Standard")
 | where isnotempty(containerId)
 | project containerId, SQL_instance = name, Version, Edition
 ```
