How do you do string replacement in MSBuild 4.0?

After inputting an ItemGroup containing a list of file names, I'm trying to swap the extensions to something different. For example, I want file.foo converted to file.bar. Is this possible within MSBuild?
Thanks

If you already have an item group containing the files you want to rename, just use it as the input to a Copy or Move task and modify its metadata (MSDN documents the common item metadata that is available).
<!-- The list of .foo files is in the ItemGroup myFiles -->
<Copy SourceFiles="@(myFiles)" DestinationFiles="%(myFiles.Filename).bar" />
Metadata is accessed with %(itemName.metadata). In this example, I am setting the extension to whatever I want by including only the original Filename metadata (the file name without its extension) in the output.
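To place that in context, a minimal self-contained project might look like this (the include pattern, target name, and RelativeDir handling are illustrative assumptions, not part of the original answer):
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <!-- hypothetical item group: every .foo file under the project directory -->
    <myFiles Include="**\*.foo" />
  </ItemGroup>
  <Target Name="RenameFoo">
    <!-- %(Filename) is the name without extension, so appending .bar swaps the extension;
         the item transform keeps each copy in its original folder -->
    <Copy SourceFiles="@(myFiles)" DestinationFiles="@(myFiles->'%(RelativeDir)%(Filename).bar')" />
  </Target>
</Project>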

Related

ADF Copy Activity problem with wildcard path

I have a seemingly simple task: integrate multiple JSON files that are residing in a Data Lake Gen2.
The problem is that the files to be integrated are located in multiple folders; this is a typical structure that I am dealing with:
Folder1\Folder2\Folder3\Folder4\Folder5\2022\Month\Day\Hour\Minute\ <--- 1 file in the Minute folder
Then the same structure repeats for 2023, so in order to collect all the files I have to go to the bottom of the structure, which is the Minute folder. If I use a wildcard path it looks like this:
Wildcard paths: 'source from dataset'/*.json. It copies everything including all the folders, and I just want the files. I tried to narrow it down so it copies only 2022 first, but whatever I do is not working in terms of wildcard paths. Help is much appreciated.
Trying different wildcard combinations did not help; obviously I am doing something wrong.
There is no option to copy files from multiple sub-folders to a single destination folder. Flatten hierarchy as a copy behavior will also autogenerate file names in the target.
[Image: MS documentation on copy behavior]
Instead, you can follow the below approach.
To list the file paths in the container, take a Lookup activity and connect it to an XML dataset with an HTTP linked service.
Give the Base URL in the HTTP connector as
https://<storage_account_name>.blob.core.windows.net/<container>?restype=directory&comp=list
(replace <storage_account_name> and <container> with the appropriate names in the above URL).
The Lookup activity returns the list of folders and files as separate line items.
Take a Filter activity and filter, from the Lookup activity output, the URLs that end with .json.
Settings of filter activity:
items:
@activity('Lookup1').output.value[0].EnumerationResults.Blobs.Blob
condition:
@endswith(item().URL,'.json')
[Image: output of the Filter activity]
Take a ForEach activity next to the Filter activity and give its items value as @activity('Filter1').output.value.
Inside for-each activity, take the copy activity.
Take the HTTP connector and a JSON dataset as the source, and give the base URL as
https://<account-name>.blob.core.windows.net/<container-name>/
Create a parameter for the relative URL, and set the value of that parameter to @item().name.
In the sink, give the container name and folder name.
Give the file name as dynamic content:
@split(item().name,'/')[sub(length(split(item().name,'/')),1)]
This expression takes the file name from the relative URL value (split on '/' and take the last segment).
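Equivalently, the built-in last() function picks the final path segment with a shorter expression (an alternative sketch assuming the same relative-URL value, not from the original answer):
@last(split(item().name,'/'))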
When the pipeline is run, all files from the multiple folders are copied to a single folder.

I want to copy files at the bottom of the folder hierarchy and put them in one folder

In Azure Synapse Analytics, I want to copy files at the bottom of the folder hierarchy and put them in one folder.
The files I want to copy are each located in their own folder.
(There are 21 files in total.)
I tried using the "flatten hierarchy" copy behavior of the Copy activity.
However, the file names are then generated on the Synapse side.
I tried to get the name of the bottom-level file with the "Get Metadata" activity, but I could not use wildcards in the file path.
I considered creating and running 21 pipelines that would copy each file, but since the files are updated daily in Blob, it would be impractical to run the pipeline manually every day using 21 folder paths.
Does anyone know of any smart way to do this?
Any help would be appreciated.
Using flatten hierarchy does not preserve the existing file names; new file names will be generated. Wildcard paths are not accepted by the Get Metadata activity. Hence one option is to use Get Metadata with ForEach to achieve the requirement.
[Images: the folder structure used for this demonstration]
I created a Get Metadata activity first, retrieving the folder names (21 folders like '20220701122731.zip') inside the Intage Sample folder, using a field list of Child items.
Then I used a ForEach activity to loop through these folder names, giving the items value as @activity('Get folders level1').output.childItems.
Inside the ForEach there are 3 activities. The first is another Get Metadata activity to get the subfolder names (to get the one folder, '20220701122731', inside '20220701122731.zip').
While creating this dataset, the name of the parent folder (folder_1 = '20220701122731.zip') is passed to the dataset as a parameter and used in the path as
@{concat('unzipped/Intage Sample.zip/Intage Sample/',dataset().folder_1)}
This returns the names of the subfolders (like '20220701122731') inside each parent folder (each of the 21 folders like '20220701122731.zip' has exactly one subfolder). I then used a Set variable activity to assign the childItems output to a variable ('test') using @activity('Get folder inner').output.childItems.
The final step is a Copy activity to move the required files to one single destination folder. Since there is only one subfolder inside each of the 21 folders, the values obtained in the steps above can be used directly to complete the copy.
With the help of wildcard paths in this Copy data activity, the copy can be completed. The wildcard directory path will be
@{concat('unzipped/Intage Sample.zip/Intage Sample/',item().name, '/', variables('test')[0].name)}
@item().name gives the parent folder name, in your case '20220701122731.zip'.
@variables('test')[0].name gives the subfolder name, in your case '20220701122731'.
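For example, with the sample names above, the wildcard directory path expression resolves to
unzipped/Intage Sample.zip/Intage Sample/20220701122731.zip/20220701122731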
For the sink, I created a dataset pointing to a folder inside my container called output_files. When triggered, the pipeline runs successfully.
[Image: contents of the output_files folder]

Select all files from directory that contain file with given name

I would like to create a scope that contains all files from directories that contain files with given names.
My example pseudo-pattern:
file:*/box-wideo.tpl/./*
I mean all directories (and their contents) that contain the file box-wideo.tpl.
Scopes do not support conditional inclusion (e.g. "include a folder IF it has a specific file in it").
Scopes work on file/folder names directly, i.e. "include files that match this pattern but exclude those that match another pattern".
Therefore it's not possible to do what you are asking for.
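What the scope language can express are plain name-based patterns, optionally combined with the operators &&, || and ! (a rough sketch; the folder names below are made up, see the references for the exact syntax):
file:*.tpl||file:*.js
file:templates//*&&!file:templates/vendor//*
The first matches files by extension; the second matches everything under templates/ recursively while excluding its vendor/ subtree.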
References to the manual:
http://www.jetbrains.com/phpstorm/webhelp/scope-language-syntax-reference.html
http://www.jetbrains.com/phpstorm/webhelp/scopes.html

How to Read in multiple CSV files from an XSLT file and output a single XML file

I plan to use Saxon for an XSLT problem. I need to run my program on a schedule. When it runs it needs to select all CSV files from a directory. The number of files can be random but once processed they are cleared from the folder by another process. Originally there was only one CSV file with a fixed name so referencing it in the XSLT wasn’t a problem. I could also programmatically set the filename at runtime so all was working well. My XSLT now needs to know about all the files so I can output a single XML. I’m not sure if I can pass in a file path and let the XSLT read in all the files at that location? Is there a command to do this or is there a better way to do this? Remember I don’t know how many CSV files will be in the folder when the XSLT is run.
See www.saxonica.com/documentation/sourcedocs/intro.xml: you can use the collection() function to read in the files from a directory, e.g.
<xsl:for-each select="collection('file:///C:/dir/subdir?select=*.csv;unparsed=yes')/tokenize(., '\n')">
  <line><xsl:value-of select="."/></line>
</xsl:for-each>
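For the "single XML output" part, a fuller sketch along the same lines (the ?select and unparsed parameters are Saxon-specific collection URI options; the directory path, element names, and the named initial template are assumptions for illustration):
<xsl:stylesheet version="2.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" indent="yes"/>
  <!-- run with Saxon's -it:main so no source document is needed -->
  <xsl:template name="main">
    <files>
      <!-- ?select filters the directory listing; unparsed=yes returns each CSV as a string -->
      <xsl:for-each select="collection('file:///C:/dir/subdir?select=*.csv;unparsed=yes')">
        <file>
          <!-- split each file into lines, skipping blank ones -->
          <xsl:for-each select="tokenize(., '\r?\n')[normalize-space()]">
            <line><xsl:value-of select="."/></line>
          </xsl:for-each>
        </file>
      </xsl:for-each>
    </files>
  </xsl:template>
</xsl:stylesheet>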

Making a custom report with ccnet

So I have output that I've put into ccnet using the merge task.
Now what I need to do is come up with my own custom xsl and output the data.
Any ideas on where there may be a tutorial on how to do this?
For example what plugin do I need to use? Can I create my own? What does action name do?
<xslReportBuildPlugin description="MSBuild Output" actionName="MSBuildOutputBuildPlugin" xslFileName="xsl\msbuild.xsl" />
In your CruiseControl.NET folder:
CruiseControl.NET\webdashboard\xsl
Copy any existing xsl (preferably one that is close to what you want or in a format you like); you could also start an xsl file from scratch.
Edit it to what you want it to be, with your own file name.
I copied msbuild.xsl to BMsBuild.xsl and made my changes.
Then, in dashboard.config:
<xslReportBuildPlugin description="BBuildReport" actionName="BBuildReport" xslFileName="xsl\Bmsbuild.xsl"></xslReportBuildPlugin>
Description: the title you want it to have on the webdashboard link.
actionName: a unique name that will be used to generate a URL for that xsl/report.
xslFileName: the path to the xsl, usually just xsl\[your xsl file name].xsl.
That's the easiest way. The file is nearly entirely xsl, so there's not really anything special you need to do or know, except that the xsl is going to target the merged xml file from whatever you have in the publishers xmllogger tag in your ccnet.config:
<publishers>
<statistics/>
<xmllogger logDir="D:\Projects\TFS\Main\BuildProcess\logs\ServiceBuilds" />
</publishers>
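If you need a starting point for the xsl itself, here is a minimal sketch (the element names under the merged log depend on what your logger and merge task produce; treat the msbuild//error match here as a placeholder):
<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html"/>
  <!-- the dashboard passes the whole merged build log to this stylesheet -->
  <xsl:template match="/">
    <h2>BBuildReport</h2>
    <!-- adjust this XPath to the elements your merged output actually contains -->
    <xsl:for-each select="//msbuild//error">
      <p style="color:red"><xsl:value-of select="."/></p>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>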
