Other questions (MSBUILD Splitting text file into lines) mention implementation-specific alternatives, but none seem to directly address how to split a simple string property into an item group based on line endings.
How can you do this? Attempts that didn't work:
<ItemGroup>
  <SplitLines Include="$(SourceString.Split('\r\n'))" />
</ItemGroup>
(splits on 'r' or 'n')
<ItemGroup>
  <SplitLines Include="$(SourceString.Split('%0A%0D'))" />
</ItemGroup>
(doesn't split at all)
In case you're curious: SourceString is the output of an Exec command that needs splitting, so ReadLinesFromFile isn't an option. It can't go through an intermediate file either, because file systems are slow and this needs to be usable by build processes that care about file operations.
Using property functions is the way to go. You can search for solutions using e.g. 'C# split string lines' in your search engine of choice and then translate the answer. That search turns up this SO question, and the Regex.Split method is the easiest to implement:
<ItemGroup>
  <SplitLines Include="$([System.Text.RegularExpressions.Regex]::Split(`$(SourceString)`, `\r\n|\r|\n`))" />
</ItemGroup>
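For reference, a minimal self-contained sketch of that approach inside a target; the hard-coded SourceString here is only a stand-in for whatever your Exec task captured, and the Message task merely prints the resulting items:
<Target Name="SplitDemo">
  <PropertyGroup>
    <!-- stand-in for the Exec output; note the embedded line breaks -->
    <SourceString>first line
second line
third line</SourceString>
  </PropertyGroup>
  <ItemGroup>
    <SplitLines Include="$([System.Text.RegularExpressions.Regex]::Split(`$(SourceString)`, `\r\n|\r|\n`))" />
  </ItemGroup>
  <!-- item batching prints one message per resulting line -->
  <Message Importance="high" Text="line: %(SplitLines.Identity)" />
</Target>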
I am actually working on a dataset of customer complaints.
The dataset is really dirty, so I want to split the data into different files. One of those files should contain the rows that have the wrong date format. The only problem is that the ISDATE function doesn't work with the Conditional Split.
Can someone tell me what function I should use to check a date format with the Conditional Split?
I am using Visual Studio (shell) 2013
Thank you
The root problem with your question is that ISDATE does not exist within the SSIS Expression language.
Depending on what "dirty" means to you, I would solve this problem with a Data Conversion Task acting as the Conditional Split. Route the failed conversion (Error) rows to one destination and the clean ones to another.
I start with the following query to simulate bad source data:
SELECT '2015-02-28' As DirtyDate
UNION ALL SELECT '2015-02-29'
UNION ALL SELECT 'penguin'
The first is a valid date. The second suffers from an invalid range and the third is right out.
Within the Data Conversion Task, I generate a new column called CleanDate, which simply casts DirtyDate to DT_DATE.
I then simulate your destinations with Row Count transformations to capture how many good versus bad rows I had.
Biml
If you have BIDS Helper or BimlExpress installed, the following snippet will create an SSIS package implementing the data flow described above. Adjust the connection string as needed.
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <OleDbConnection Name="tempdb" ConnectionString="Data Source=localhost\dev2014;Initial Catalog=AdventureWorksDW2014;Provider=SQLNCLI11.0;Integrated Security=SSPI;"/>
  </Connections>
  <Packages>
    <Package Name="so_37465482">
      <Tasks>
        <Dataflow Name="DFT Sample">
          <Variables>
            <Variable DataType="Int32" Name="RowCountGood">0</Variable>
            <Variable DataType="Int32" Name="RowCountBad">0</Variable>
          </Variables>
          <Transformations>
            <OleDbSource ConnectionName="tempdb" Name="OLESRC Dirty Dates">
              <DirectInput><![CDATA[SELECT '2015-02-28' As DirtyDate
UNION ALL SELECT '2015-02-29'
UNION ALL SELECT 'penguin'
]]>
              </DirectInput>
            </OleDbSource>
            <DataConversion Name="DCT Filter DirtyDate">
              <Columns>
                <Column DataType="Date" SourceColumn="DirtyDate" TargetColumn="CleanDate" />
              </Columns>
              <ErrorHandling
                ErrorRowDisposition="RedirectRow"
                TruncationRowDisposition="IgnoreFailure"></ErrorHandling>
            </DataConversion>
            <RowCount VariableName="User.RowCountGood" Name="RC Good">
            </RowCount>
            <RowCount VariableName="User.RowCountBad" Name="RC Bad">
              <InputPath OutputPathName="DCT Filter DirtyDate.Error" />
            </RowCount>
          </Transformations>
        </Dataflow>
      </Tasks>
    </Package>
  </Packages>
</Biml>
Caveat on "dirty" dates
Some dates that convert just fine in SQL Server won't convert in SSIS. For example, the ever-popular yyyymmdd format can't be cast as-is to a date within SSIS: 20150228 won't convert, while 2015-02-28 casts just fine. In SQL Server, SELECT CAST('20150228' AS date) AS WorksFine is a-ok.
Is it possible to use the ${shortdate} in the internalLogFile?
<nlog internalLogFile="C:\Logs\${shortdate}_nlog.log">
  <targets>
    <target name="logfile"
            fileName="C:/logs/${shortdate}_dev.log" />
  </targets>
</nlog>
I'm getting the expected dated logfile, but the internal log file is named ...
${shortdate}_nlog.log
Short answer: No.
Longer answer: The internal log file name is just a string. It's read in during initialization, and the XmlLoggingConfiguration class merely ensures that the directory exists, whereas (for example) the FileTarget uses a Layout for fileName, which converts the value you provide using LayoutRenderers.
https://github.com/NLog/NLog/issues/581#issuecomment-74923718
My understanding from reading their comments is that the internal logging should be simple, stable, and used sparingly. Typically you are only supposed to turn it on when trying to figure out what's going wrong with your setup.
You can still dynamically name your internal log file based on the date and time if you want. However, it won't have the same rollover effect a target file would; it would essentially get a new datetime only whenever you initialize your logger, I think.
DateTime dt = DateTime.Now;
NLog.Common.InternalLogger.LogFile = @"C:\CustomLogs\NLog_Internal\internal_NLogs" + dt.ToString("yyyy-MM-dd") + ".txt";
I have a WCF configuration file that I am trying to transform with SlowCheetah. For development use, we want to include the MEX endpoints, but when we release the product, these endpoints should be removed on all services except one. The server for which it should be left has the following endpoint:
<endpoint address="MEX"
binding="mexHttpBinding"
contract="IMetadataExchange" />
The ones that should be removed are as follows:
<endpoint address="net.tcp://computername:8001/WCFAttachmentService/MEX"
binding="netTcpBinding"
bindingConfiguration="UnsecureNetTcpBinding"
name="WCFAttachmentServiceMexEndpoint"
contract="IMetadataExchange" />
The transform I am using is:
<service>
<endpoint xdt:Locator="Condition(contains(#address, 'MEX') and not(contains(#binding, 'mexHttpBinding')))" xdt:Transform="RemoveAll" />
</service>
However, when I run this, ALL MEX endpoints are removed from the config file including the one that I wish to keep. How do I make this work properly?
The Locator Condition expression that selects the nodes seems to be correct. If you had only the two endpoints you posted in your example, this expression would select the second endpoint.
According to the documentation the Transform attribute RemoveAll should "remove the selected element or elements." Based on the information you posted it's not working as expected, since the first element was not selected and was removed anyway. Based on this StackOverflow answer it seems to me that the issue is with Condition. I'm not sure if that's a bug (it's poorly documented), but you could try some alternative solutions:
1) Using XPath instead of Condition. The effective XPath expression that is applied to your configuration file as a result of the Condition expression is:
/services/service/endpoint[contains(@address, 'MEX') and not(contains(@binding, 'mexHttpBinding'))]
You should also obtain the same result using the XPath attribute instead of Condition:
<endpoint xdt:Locator="XPath(/services/service/endpoint[contains(#address, 'MEX')
and not(contains(#binding, 'mexHttpBinding'))])" xdt:Transform="RemoveAll" />
2) Using Match and testing an attribute such as binding. This is a simpler test, and would be IMO the preferred way to perform the match. You could select the nodes you want to remove by the binding attribute
<endpoint binding="netTcpBinding" xdt:Locator="Match(binding)" xdt:Transform="RemoveAll" />
3) Using XPath instead of Match, in case you have many different bindings and want to eliminate only those which are not mexHttpBinding:
<endpoint xdt:Locator="XPath(/services/service/endpoint[not(#binding='mexHttpBinding'))" xdt:Transform="RemoveAll" />
4) Finally, you could try using several separate statements with Condition() or Match() to individually select the <endpoint> elements you wish to remove, and use xdt:Transform="Remove" instead of RemoveAll.
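As a sketch of that last option, matching on the endpoint name given in the question (adjust the surrounding element path to match your actual configuration section):
<service>
  <endpoint name="WCFAttachmentServiceMexEndpoint" xdt:Locator="Match(name)" xdt:Transform="Remove" />
</service>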
I have a series of square SVG files that I would like to arrange lengthwise into one super long SVG file.
I attempted to use ImageMagick to combine them. Based on this page:
http://linux.about.com/library/cmd/blcmdl1_ImageMagick.htm
and this
http://www.imagemagick.org/Usage/compose/
I tried this command
composite 'file1.svg' 'file2.svg' +adjoin 'outputfile.svg'
However, I received the following error message:
composite: unrecognized option '+adjoin' @ error/composite.c/CompositeImageCommand/565.
I tried several other imagemagick commands (convert, display), but had no success. How can I combine these files on the command line? Is there an inkscape command that does this?
There's currently no convenient way to do this with only the command line and no custom scripting.
Closest pre-written thing I could find currently (4-16-2012) is https://github.com/astraw/svg_stack, which lets you write commands of the form:
svg_stack.py --direction=h --margin=100 red_ball.svg blue_triangle.svg > shapes.svg
to concatenate.
It should be pretty easy if you're willing to use a scripting language. For each file, just add a prefix to all id attributes; so in file 1, id="circle" becomes id="file1_circle", and in file 2, id="circle" becomes id="file2_circle".
In most cases you could get away with a trivial search and replace (find id=" and replace it with id="fileX_), although there are cases where this won't work (for example, if that search string appears in a text element).
If you want to do this 'the proper way', you'll need an XML parser (such as XMLReader in PHP).
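To illustrate the idea, here is a hypothetical hand-merged result for two 100x100 squares placed side by side (file names, ids, and sizes are made up for the example; a script would generate the equivalent):
<svg xmlns="http://www.w3.org/2000/svg" width="200" height="100">
  <!-- contents of file1.svg, ids prefixed to avoid collisions -->
  <g id="file1_root">
    <rect id="file1_square" x="0" y="0" width="100" height="100" fill="red"/>
  </g>
  <!-- contents of file2.svg, shifted right by the width of the previous file -->
  <g id="file2_root" transform="translate(100,0)">
    <rect id="file2_square" x="0" y="0" width="100" height="100" fill="blue"/>
  </g>
</svg>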
I've got a question about Ant and string splitting.
In an ini file I have a section "[app_version]" with one element: "VERSION = 3.48".
My goal is to split "3.48" into 3 and 48.
I've managed to read the ini file successfully with this code:
<target name="get_new_version_number">
<property file="${basedir}/Ini File/Config.ini" prefix="config.">
</property>
<property name="version_actuelle" value="${config.VERSION}" />
<echo message="version de l'application: ${version_actuelle}"/>
But how can I split "3.48", which is my value, into 3 and 48? I need to do this to increment the 48 each time I execute the script.
Thanks in advance for your consideration.
Regards.
Simon
Thanks for your answer.
I've tried your solution but it doesn't work for me, because I end up with 3.48.1, 3.48.1.2, 3.48.1.2.3... etc.
I really need to increment the "48", so I have to split my value 3.48 with a split function or something else.
But, again, thanks very much for your time.
regards
The simplest solution would be to read the major number from the ini file and then use the buildnumber task to manage the incrementing number:
<buildnumber/>
<echo message="${majorNum}.${build.number}"/>
The Ant addon Flaka provides a split function, e.g.:
<project name="demo" xmlns:fl="antlib:it.haefelinger.flaka">
<property name="yourvalue" value="3.48"/>
<fl:echo>#{split('${yourvalue}', '\.')[0]}${line.separator}#{split('${yourvalue}', '\.')[1]}</fl:echo>
</project>
If you have further requirements -- you mentioned a "need to increment" -- you have to give more details. It's no problem to wrap this in a for loop with Flaka.
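If Flaka isn't an option, here is a sketch of the same split-and-increment using the ant-contrib tasks instead (this assumes ant-contrib is available under the antlib URI shown; property names are illustrative):
<project name="bump-minor" default="bump" xmlns:ac="antlib:net.sf.antcontrib">
  <target name="bump">
    <property name="version_actuelle" value="3.48"/>
    <!-- pull the major and minor parts out of "3.48" -->
    <ac:propertyregex property="version.major" input="${version_actuelle}" regexp="(\d+)\.(\d+)" select="\1"/>
    <ac:propertyregex property="version.minor" input="${version_actuelle}" regexp="(\d+)\.(\d+)" select="\2"/>
    <!-- increment the minor part -->
    <ac:math result="version.minor.next" operand1="${version.minor}" operation="+" operand2="1" datatype="int"/>
    <echo message="${version.major}.${version.minor.next}"/>
  </target>
</project>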