Azure Blob Storage as Destination in Azure Data Factory Pipeline with Specific File Name and Compression


I am trying to copy data from an Azure SQL Database to Azure Blob Storage.

Source: Azure SQL Database. Destination: Azure Blob Storage.

"typeProperties": {
    "fileName": "myfile.csv.zip",
    "folderPath": "myfolderpath/",
    "format": {
        "type": "TextFormat",
        "columnDelimiter": ",",
        "nullValue": "",
        "firstRowAsHeader": false
    },
    "compression": {
        "type": "ZipDeflate",
        "level": "Fastest"
    }
},

The zip file that gets created contains a file with an auto-generated name after extraction (not myfile.csv).

When I use gzip compression, the .gz file contains a file with the expected name, myfile.csv.

I have read the documentation but cannot find anything about this.

Has anybody faced the same issue? Please advise.

Zip is different from gzip: the former is an archive compressor, while the latter is a file-based compressor of a single stream.
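The distinction can be seen with Python's standard library: a gzip stream carries no inner file name (the extracted name is just the .gz file name minus the extension), while a zip archive stores explicit entry names independently of the archive's own name. The entry name below is a hypothetical stand-in for ADF's auto-generated name.

```python
import gzip
import os
import tempfile
import zipfile

data = b"col1,col2\n1,2\n"
tmp = tempfile.mkdtemp()

# gzip: compresses a single byte stream; no inner name is stored,
# so the extracted name is derived from the container's file name.
gz_path = os.path.join(tmp, "myfile.csv.gz")
with gzip.open(gz_path, "wb") as f:
    f.write(data)

# zip: the inner entry name is stored independently of the archive name.
zip_path = os.path.join(tmp, "myfile.csv.zip")
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("auto_generated_123.csv", data)  # hypothetical auto name

with zipfile.ZipFile(zip_path) as zf:
    inner_names = zf.namelist()
print(inner_names)  # the stored entry name, not 'myfile.csv'
```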

The fileName you specified is the name of the resulting zip file; it is not related to the inner file's name. Since the data is extracted from a SQL source and has no original file name, ADF uses an auto-generated name for it.

One zip file may contain several inner files, so a specified inner file name could only apply unambiguously when there is exactly one inner file. Could that case be supported? Yes, it could, but such an improvement hasn't been made yet.

To mitigate this and fulfill your requirement, you can author two activities. The first copies the data from SQL to a blob with the specified file name (myfile.csv). The second archives that file into a zip (myfile.csv.zip).
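The second step of that workaround can be sketched in Python: once the first copy activity has produced myfile.csv, any script or custom activity can zip it with a controlled inner entry name via `arcname`. The function name and paths here are hypothetical, not part of ADF.

```python
import zipfile


def archive_csv(csv_path: str, zip_path: str, inner_name: str = "myfile.csv") -> None:
    """Zip an existing CSV so that it extracts under a chosen name."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        # arcname controls the name the file will have after extraction,
        # regardless of where the source file lives on disk.
        zf.write(csv_path, arcname=inner_name)
```

Running this on the output of the first activity yields myfile.csv.zip that extracts back to myfile.csv.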

