
Read/Write Files from an SFTP-Enabled Azure BLOB Storage Container

Agenda:

In this article, we will learn to create an SFTP-enabled Azure BLOB Storage account and access its container using private credentials, with the help of the SSH.NET library.

Pre-requisites:

1) .NET SDK (Any version)

2) Visual Studio Code

=> Download the .NET SDK for Windows or Mac, depending on your operating system, from https://dotnet.microsoft.com/en-us/download/visual-studio-sdks.

=> Download Visual Studio Code from Microsoft's official website: https://code.visualstudio.com/Download

Once you install the .NET SDK, type the command below and you should see a result like the following. If the result is different, there might be an issue with the .NET installation.

Once you install VS Code, just type "code" in the terminal, and it should open the Visual Studio Code GUI.

=> Creating an SFTP-enabled Azure BLOB Storage Account

To store our files, we first need a resource group and a storage account. Let's create them step by step.

Step 1: Create a Resource Group

=> On the home page of the Azure portal, you should see a tile named Resource groups. If not, you can search for it in the top search bar.

=> Click the Create button on the Resource Groups page, and you should see the screen below.

=> Enter the resource group name and select a region.

=> Click the Review + Create button, and you will see the screen below, where Azure validates the data you entered.

=> Finally, click the Create button to create the resource group named “demo-rg”.

Step 2: Creating Storage Account for File Storage

=> Go inside the resource group and click the Create button.

=> Search for Storage Account and click the Create button.

=> Once you click the Create button, you will see the Storage Account configuration wizard. Select the values as shown in the screenshots below.

Basics Tab:

=> The most important setting on this page is Primary service: select Azure Data Lake Storage Gen2.

Advanced Tab:

=> To enable the SFTP feature, you first need to select Enable hierarchical namespace; you can then select Enable SFTP.

Networking Tab:

=> Keep the default settings on this tab.

Data Protection Tab:

=> Keep the default settings here as well.

Encryption Tab:

=> Keep the default settings here as well.

=> Press Review + Create and you should see the screen below. Verify all the details and click the Create button to create the storage account.
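=> If you prefer scripting over the portal, Steps 1 and 2 can also be sketched with the Azure CLI. The account name and region below are illustrative placeholders, and flag names can vary slightly between CLI versions, so verify them against az --help before running:

```shell
# Create the resource group (name as used in Step 1; region is a placeholder)
az group create --name demo-rg --location eastus

# Create the storage account with hierarchical namespace (required for SFTP)
# and SFTP enabled. "demostorage" is a placeholder account name.
az storage account create \
  --name demostorage \
  --resource-group demo-rg \
  --sku Standard_LRS \
  --kind StorageV2 \
  --hns true \
  --enable-sftp true
```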

Step 3: Configure Private User Credentials and Create Container

=> In this step, we will configure a local user with private credentials and create a container in the storage account.

=> Go inside the created storage account; on the left navigation panel, expand Settings and you will find the SFTP option there.

=> Click Add local user.

=> In the User + authentication tab, configure the following.

=> Create a new container and enter the same container name as the Landing directory.

=> Select all permissions for the container.

=> Once you click the Add button, you will see a popup showing the SSH password. Copy it and keep it somewhere handy; we will use this password when we connect to the container from our .NET app.

=> Finally, you will see the container and user details, like below, on the SFTP page of the storage account.

Step 4: Add some Sample Files inside the Container Folder

=> Go to the Containers option, as shown below, and you should see the container that we created in the previous step.

=> Click on the container name and then click the Upload button. You can then upload any sample files you want using the dialog box's Browse for files option.

=> I have uploaded 12 files, as shown below.

Step 5: Creating .NET Console App

=> We will now create a .NET console app and access the above container using the private credentials.

=> Open a terminal in a new folder, and use the following command to create the .NET console app using the .NET CLI.

dotnet new console -lang "C#"

=> The NuGet package that we are using is SSH.NET. You can install the package using the following command.

=> Once the package is installed, open the project in VS Code. You can see the package reference in the .csproj file.
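=> The reference in the .csproj file looks similar to the fragment below (the version number is only an example; yours will match the version the CLI installed):

```xml
<ItemGroup>
  <!-- Version is illustrative; use the version installed by the CLI -->
  <PackageReference Include="SSH.NET" Version="2023.0.0" />
</ItemGroup>
```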

=> Open the Program.cs file and write the code shown below.
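=> A minimal sketch of the listing code, using SSH.NET's SftpClient, is shown below. The host, username, and password values are placeholders: replace them with your storage account's BLOB endpoint and the local user credentials from Step 3.

```csharp
using Renci.SshNet;

// Placeholder values: host is the storage account's BLOB endpoint; for SFTP,
// Azure expects the username in the form
// "<storage-account>.<container>.<local-user>" (verify against the connection
// string shown on the SFTP page of your storage account).
var host = "demostorage.blob.core.windows.net";
var username = "demostorage.democontainer.demouser";
var password = "<ssh-password-from-step-3>";

using var client = new SftpClient(host, username, password);
client.Connect();

// List all entries in the local user's home (landing) directory.
foreach (var entry in client.ListDirectory("."))
{
    Console.WriteLine(entry.Name);
}

client.Disconnect();
```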

=> The code above is very simple and self-explanatory. We are just creating an SftpClient object and passing the hostname, username, and password.

=> The client object connects to the SFTP-enabled Azure BLOB Storage container and lists all the files inside the home directory.

=> Run the program using the F5 key, and you should see the following output in the output window. In the output, you can see the files that we uploaded to the Azure container.

=> SSH.NET is a powerful library, and you can use it for many more features. Below is a screenshot from its official website; you can explore it further.

Harsh Savani

Harsh Savani is an accomplished Business Analyst with a strong track record of bridging the gap between business needs and technical solutions. With 15+ years of experience, Harsh excels in gathering and analyzing requirements, creating detailed documentation, and collaborating with cross-functional teams to deliver impactful projects. Skilled in data analysis, process optimization, and stakeholder management, Harsh is committed to driving operational efficiency and aligning business objectives with strategic solutions.
