Open up a pipeline, click the Copy Data activity, and go to the user properties. ADF has some nice capabilities for file management that never made it into SSIS, such as zipping/unzipping files and copying from/to SFTP. I use this activity to read a CSV text file from a blob, parse its contents, and copy them to a SQL Server database. Or, second best, just create a delete function.

Note: This post is about Azure Data Factory V1. I've spent the last couple of months working on a project that includes Azure Data Factory and Azure Data Warehouse. But it also has some…

I am creating a test Azure Data Factory pipeline for learning purposes. You can now use the ADF built-in Delete activity in your pipeline to delete undesired files without writing code.

User properties are basically the same as annotations, except that you can only add them to pipeline activities. By adding user properties, you can view additional information about activities under activity runs. For the Copy Data activity, Azure Data Factory can auto-generate the user properties for us. Whaaat!

For this blog, I will be picking up from the pipeline in the previous blog post.

By: Fikrat Azizov | Updated: 2019-11-28 | Comments (5) | Related: More > Azure Data Factory

Problem

Delete activity in Azure Data Factory - Cleaning up your data files. Get started using the Delete activity in ADF from here.

Now we will add a new activity to the Azure Data Factory pipeline, inside the ForEach activity, that will act based on the source file size: we will copy the data inside the file if the file is larger than or equal to 1 KB, and delete the file if it is smaller than 1 KB. This should be as simple as a setting on the copy function (to delete after copy, i.e. make it a move).

To empty the Stage tables, you could of course add a Stored Procedure activity before the Copy Data activity to execute a TRUNCATE or DELETE statement.
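The size-based branching described above (copy files of at least 1 KB, delete smaller ones) can be sketched outside ADF to make the logic concrete. This is a minimal local-filesystem sketch, not ADF code: the function name, directories, and the 1 KB threshold are illustrative assumptions.

```python
import shutil
from pathlib import Path

def process_file(path: Path, dest_dir: Path, threshold: int = 1024) -> str:
    """Copy files >= threshold bytes to dest_dir; delete smaller ones.

    Mirrors the ForEach branching: Copy Data for large files,
    Delete activity for undersized ones.
    """
    if path.stat().st_size >= threshold:
        shutil.copy2(path, dest_dir / path.name)
        return "copied"
    path.unlink()  # remove the undersized file, like the Delete activity
    return "deleted"
```

In the actual pipeline, the same decision is made by an If Condition activity comparing the size returned by a Get Metadata activity against the threshold.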
You can delete either folders or files from Azure Blob Storage, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, File System, FTP Server, SFTP Server, and Amazon S3. For this specific pipeline, I want to move files from one blob to the other. We are going to explore the capabilities of this activity in this post.

Check out part one here: Azure Data Factory – Get Metadata Activity. Check out part two here: Azure Data Factory – Stored Procedure Activity. Check out part three here: Azure Data Factory – Lookup Activity.

Setup and configuration of the If Condition activity

The Data Factory UI in the Azure Portal includes multiple activities that one can chain up when creating a Data Factory pipeline, like the Copy activity, for instance.

Here is the dynamic expression I used in the Set Variable activity: @replace(variables('inputVar'), '\n', ''). Hope this helps.

A user recently asked me a question on my previous blog post (Setting Variables in Azure Data Factory Pipelines) about the possibility of extracting the first element of a variable if that variable is a set of elements (an array). I used a Set Variable activity to test this.

Azure Data Factory (ADF) has a ForEach loop construction that you can use to loop through a set of tables. It seems crazy that there is no means to delete a file on a blob store after ingesting it. Azure Data Factory (ADF) is a great example of this. In the ADF world, this involves creating a …

In a previous post (Lookup activity), we discussed the Lookup activity, which reads the content of database tables or files. ADF also has another type of activity: the Get Metadata activity, which allows reading metadata of its sources.

Let us know if you have any further questions, and also please provide sample/dummy data to …
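To make the intent of the two expressions above concrete (stripping newlines with @replace, and the reader's question about taking the first element of an array variable, which ADF answers with @first), here is a small Python sketch mirroring their behavior. The helper names are hypothetical, and this assumes the expression is replacing literal newline characters in the string value.

```python
def adf_replace(value: str, old: str, new: str) -> str:
    """Mirror of ADF's @replace(): substitute every occurrence of old with new."""
    return value.replace(old, new)

def adf_first(items: list):
    """Mirror of ADF's @first(): return the first element of an array."""
    return items[0]

# Equivalent of @replace(variables('inputVar'), '\n', '')
input_var = "line1\nline2\nline3"
cleaned = adf_replace(input_var, "\n", "")  # -> "line1line2line3"

# Equivalent of @first(variables('myArray'))
first_item = adf_first(["alpha", "beta", "gamma"])  # -> "alpha"
```

The same pattern applies inside a Set Variable activity: the expression is evaluated at runtime and the result is stored back into the pipeline variable.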