I have set up CruiseControl.NET for a number of related projects.
As a result, a single project tag in CruiseControl performs multiple SVN checkouts, and a series of msbuild tasks then compiles the individual .sln files.
I need to update the assembly version of all the solutions when this build runs.
However, since I'm not using NAnt or MSBuild project files, I am unsure how to achieve this.
Am I missing something obvious? I need a solution that can be implemented purely through changes to the ccnet.config file, without requiring changes to the .csproj files.
Thanks,
Anj
What about using a shared AssemblyInfo across your projects?
This is what we do for our products:
Each project has its own AssemblyInfo.cs - this contains AssemblyTitle, AssemblyDescription, Guid, and other attributes that are unique to that assembly.
Each project also has two other assembly info files; note that these are added as links rather than direct copies (VS -> Add -> Existing Item -> Add as Link, via the little down arrow next to the Add button).
The two link files:
CompanyAssemblyInfo.cs - AssemblyCompany, AssemblyCopyright, AssemblyConfiguration, CLSCompliant, SecurityPermission, etc. Basically everything we want standard on all our assemblies.
ProductAssemblyInfo.cs - AssemblyProduct, AssemblyVersion, AssemblyFileVersion. This allows us to push the same version across to all assemblies from the one file.
Our CI and release process is more complicated, but that's at the heart of it - a single point (file) which controls the product version (assemblies, installers, everything!)
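With this layout, the CI build only has to stamp the new version into that single linked file before compiling. A minimal PowerShell sketch, assuming a hypothetical file location and a version value derived from the CCNet label (both are placeholders, not part of the setup described above):
# Hypothetical sketch: stamp a CI-supplied version into the one shared file
$versionFile = "C:\Source\Shared\ProductAssemblyInfo.cs"   # assumed location
$newVersion  = "2.1.0.123"                                 # e.g. built from the CCNet label

$lines = Get-Content $versionFile
$lines = $lines -replace 'AssemblyVersion\("[^"]*"\)', "AssemblyVersion(""$newVersion"")"
$lines = $lines -replace 'AssemblyFileVersion\("[^"]*"\)', "AssemblyFileVersion(""$newVersion"")"
Set-Content $versionFile $lines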
There's a task to do just what you're asking about.
You'll need to install the MSBuild Community Tasks package.
Then, you can create something like this:
<PropertyGroup>
  <MyAssemblyVersion>$(CCNetLabel)</MyAssemblyVersion>
</PropertyGroup>

<Target Name="GenAssemblyInfo">
  <AssemblyInfo
    ContinueOnError="false"
    CodeLanguage="CS"
    OutputFile="$(MSBuildProjectDirectory)\YourAssembly\AssemblyInfo.cs"
    AssemblyTitle="blah"
    AssemblyDescription="blah blah"
    AssemblyCompany="Anj Software, Inc."
    AssemblyProduct="Anj's Awesome App"
    AssemblyCopyright="blah blah"
    CLSCompliant="false"
    AssemblyVersion="$(MyAssemblyVersion)"
    AssemblyFileVersion="$(MyAssemblyVersion)"
  />
</Target>
Note that you can set a build number prefix in your ccnet.config file so that your assemblies will be numbered 2.1.0.x where x is the build number. That's how we do our version numbering where I work.
You'll still need to keep a default AssemblyInfo.cs file as part of each of the projects that make up your solution.
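For reference, CCNet makes the current build label available to the build as CCNetLabel, which is how the $(CCNetLabel) property above gets its value. To try the target outside of CCNet you can supply the property yourself; the project file name here is just an example:
# run from a developer command prompt or PowerShell
msbuild build.proj /t:GenAssemblyInfo /p:CCNetLabel=2.1.0.123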
I use PowerShell for this. $lpath is the path to the source code, and $buildnum is the build number I append; that is all I actually change with it, but it should give you enough to change or set any of the other available fields. I pass in $lpath, get the build number from the environment variables CC.NET provides, and reuse this script over and over, just changing what I pass on the command line in the config file. I also have a version that modifies the resource files for C++ code, if that is what you actually need to modify.
# Find every *AssemblyInfo.cs under $lpath and stamp $buildnum into the last part
# of AssemblyVersion and AssemblyFileVersion.
$files = Get-ChildItem $lpath -recurse -filter *AssemblyInfo.cs -name
Foreach ($file in $files)
{
    $file = $lpath + "\" + $file
    $fileObject = Get-Item $file
    $fileObject.Set_IsReadOnly($False)

    $sr = New-Object System.IO.StreamReader( $file, [System.Text.Encoding]::GetEncoding("utf-8") )
    $content = $sr.ReadToEnd()
    $sr.Close()

    # Replace only the revision (fourth) part of the version number
    $content = [Regex]::Replace( $content, '(?<=\[assembly: AssemblyVersion\("[0-9]\.[0-9]\.[0-9]\.)[0-9]?[0-9]?[0-9]', $buildnum )
    $content = [Regex]::Replace( $content, '(?<=\[assembly: AssemblyFileVersion\("[0-9]\.[0-9]\.[0-9]\.)[0-9]?[0-9]?[0-9]', $buildnum )

    $sw = New-Object System.IO.StreamWriter( $file, $false, [System.Text.Encoding]::GetEncoding("utf-8") )
    $sw.Write( $content )
    $sw.Close()
    $fileObject.Set_IsReadOnly($True)
}
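One way to wire this up is to give the script a param block and pass the values on the command line from an exec task in ccnet.config; the script name and paths below are only illustrative:
# At the top of the script:
param([string]$lpath, [string]$buildnum)

# Invocation (e.g. from an <exec> task in ccnet.config):
powershell.exe -File UpdateAssemblyInfo.ps1 -lpath "C:\Build\Source" -buildnum 123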
I would like to write a function to create a Windows .lnk file from my Lua script. I found a function for this in the LuaFileSystem library. Is there a way to do this without the library? (The reason: I am writing the script for multiple users, and it would be nice if we didn't have to install the library on every machine.)
I appreciate the help!
To make a shortcut (an .lnk file)
-- your .lnk file
local your_shortcut_name = "your_shortcut.lnk"
-- target (file or folder) with full path
local your_target_filespec = [[C:\Windows\notepad.exe]]
local ps = io.popen("powershell -command -", "w")
ps:write("$ws = New-Object -ComObject WScript.Shell;$s = $ws.CreateShortcut('"..your_shortcut_name.."');$s.TargetPath = '"..your_target_filespec.."';$s.Save()")
ps:close()
To make a symlink simply use os.execute"mklink ..."
Using LuaCOM is faster than PowerShell:
local luacom=require'luacom'
local shortcut_file_path='test_create_shortcut.lnk'
local target_file_path=arg[0]
local shellObject=luacom.CreateObject("WScript.Shell")
local shortcut=shellObject:CreateShortcut(shortcut_file_path)
shortcut.TargetPath=target_file_path
shortcut:Save()
assert(io.open(shortcut_file_path)):close()--shortcut file exist now
os.remove(shortcut_file_path)
To retrieve shortcut info, use the FileSystemObject object (another COM object), or the Windows shell link file format spec for Kaitai Struct (which parses the binary file structure of various file formats) - something 'lfs' cannot do at the moment.
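The WScript.Shell object used in the PowerShell answer above can also read an existing shortcut, since CreateShortcut on a path that already exists loads it instead of creating a new one; a minimal sketch (the path is a placeholder), which Lua could drive through io.popen just like the earlier example:
$ws  = New-Object -ComObject WScript.Shell
$lnk = $ws.CreateShortcut('C:\full\path\to\your_shortcut.lnk')  # loads the existing .lnk
$lnk.TargetPath   # prints the shortcut's target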
see: Create a desktop shortcut with Windows Script Host - Windows Client | Microsoft Docs
LuaCOM User Manual (Version 1.3)
I have two MVC projects, and I created two Windows Azure projects: WindowsAzure1, which packages MVC1, and WindowsAzure2, which packages MVC2. After check-in on my local TFS 2012 I build the solution. MSBuild arguments:
/t:Publish /p:PublishDir=c:\drops\app.publish\
After the build I see 3 files instead of 4:
1. WindowsAzure1.cspkg
2. WindowsAzure2.cspkg
3. ServiceConfiguration.Cloud.cscfg (it contains only the configuration for WindowsAzure2.cspkg)
I tried to rename ServiceConfiguration.Cloud.cscfg, but the output file is not renamed.
So I think it would be better to place each package in its own folder. The problem is that the number of MVC and Azure projects will not be known in advance, so I need to create folders named after each project automatically. How can I do that?
A simple way to create dynamic folders is with a PowerShell script.
Let's say your projects have the following folder structure -
Then you can use the following script to generate the package folders -
# Solution directory, which contains all the projects
$path = "C:\Solution"
$folders = Get-ChildItem -Path $path
foreach ($folder in $folders)
{
if ($folder.Attributes -eq "Directory")
{
if($folder.Name -like "*.Cloud")
{
New-Item -Path "$($path)\$($folder.Name)package" -ItemType "Directory"
}
}
}
Output will be -
Then you can use a combination of the CSPack utility and PowerShell to create the packages and save the configuration files to the location of your choice.
http://www.intstrings.com/ramivemula/articles/jumpstart-30-create-azure-cloud-service-package-cspkg-of-visual-studio-2013-project-solution-using-powershell/
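A rough sketch of what that might look like for one cloud project, following the folder convention above (role name, paths, and switches are placeholders; the exact CSPack switches depend on your SDK version and role type, so check the cspack documentation):
# Hypothetical example; assumes cspack.exe from the Azure SDK is on the PATH
$csdef  = "C:\Solution\WindowsAzure1.Cloud\ServiceDefinition.csdef"
$outDir = "C:\Solution\WindowsAzure1.Cloudpackage"
cspack $csdef "/role:WebRole1;C:\Solution\MVC1" "/out:$outDir\WindowsAzure1.cspkg"
Copy-Item "C:\Solution\WindowsAzure1.Cloud\ServiceConfiguration.Cloud.cscfg" $outDir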
In setting up a Jenkins deployment job, I kept running into this error when trying to deploy a Visual Studio 2012 Web project via the command line.
error MSB4044: The "ConcatFullServiceUrlWithSiteName" task was not given a value for the required parameter "SiteAppName"
For reference, here are the parameters that I used:
/p:Configuration=Release /t:Rebuild /p:VisualStudioVersion=11.0 /p:PublishProfile="DeployToDevServer"
/p:DeployOnBuild=True /p:DeployTarget=MSDeployPublish
/P:AllowUntrustedCertificate=True /p:MSDeployPublishMethod=WMSvc
/p:MsDeployServiceUrl=https://devmachine.server.com:8172/MsDeploy.axd
/p:username=domainhere\adminuserhere /p:password=adminpasshere
Note: It would deploy just fine if I chose Publish... from inside the project.
After much googling, I finally figured it out by opening the .csproj files of a project that would deploy and the one that wouldn't in a text editor and comparing them.
In the project that worked, I found this section:
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
And it had this line:
<DeployIisAppPath>Default Web Site/sitenamehere</DeployIisAppPath>
I added this same line to the non-working project, changed the sitename, and it worked.
Hope this helps someone else.
You could pass DeployIisAppPath as a parameter from Jenkins, like this:
/p:DeployIisAppPath="Default Web Site/sitenamehere"
This would allow you to have different site names on different machines, whereas with the .csproj modification in your example you would be obliged to use the same IIS site name on all target machines.
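For example, combined with the original argument list it might look like this (the site name is the placeholder from the question):
/p:Configuration=Release /t:Rebuild /p:VisualStudioVersion=11.0
/p:PublishProfile="DeployToDevServer" /p:DeployOnBuild=True /p:DeployTarget=MSDeployPublish
/p:DeployIisAppPath="Default Web Site/sitenamehere"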
I would like to build a NuGet package for my add-on; end users will install it as a startup task and then upload their applications to the Windows Azure platform.
Take, for example, one simple web application and one cloud project. When the end user adds the add-on package via NuGet, it should add 2 files (an exe and a config) to the web application project and add a startup task to the ServiceDefinition.csdef of the cloud project, as shown in the figure.
How can I create this type of NuGet package?
Thanks in advance.
Update:
I have tried Nick's answer; however, I am running into a problem with two web roles in the cloud project, as shown in the figure below.
The error I am getting is shown below as well.
I also have a question: if I install the NuGet package from the command line, how can I cover all web role projects in the solution so that the exe and config file are added to each one?
Nuget packages work based on a convention:
http://docs.nuget.org/docs/creating-packages/creating-and-publishing-a-package#From_a_convention_based_working_directory
As far as the exe and config go, you can do the following:
In your package directory make the following directories
mkdir lib (for the exe)
mkdir content (for the config)
All you have to do for the exe is drop the file in the lib directory and modify your .nuspec file just under the metadata node. There should be a "files" node (if not, you can add one). Add something like this inside the files node:
<file src="content\my.exe" target="content\my.exe" />
The config is a little different. Just add a file named myname.config.transform to the content directory and add an entry for it to the .nuspec file:
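The entry itself is not shown above; presumably it mirrors the exe entry, along these lines:
<file src="content\myname.config.transform" target="content\myname.config.transform" />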
A couple of things to note:
If the config file does not exist in your app it will add one for you.
You only have to add the nodes you want transformed if there is a file that already exists.
The transform file will do a complete match on your node, so if the following existed in your config file:
<add key="test" value="myval"/>
And in your transform you had:
<add key="test" value="myval2"/>
The resulting file would look like:
<add key="test" value="myval"/>
<add key="test" value="myval2"/>
As far as adding the startup task goes, that's been a little more tricky for me (there might be a much better way). I use PowerShell in install.ps1 (handled just like the files above, but you create a "tools" directory for it):
param($installPath, $toolsPath, $package, $project)
#Modify the service config - adding a new Startup task
$svcConfigFile = $DTE.Solution.Projects|Select-Object -Expand ProjectItems|Where-Object{$_.Name -eq 'ServiceDefinition.csdef'}
$ServiceDefinitionConfig = $svcConfigFile.Properties.Item("FullPath").Value
[xml] $xml = gc $ServiceDefinitionConfig
#Create startup and task nodes
# So that you don't get a blank ns in your node
$startupNode = $xml.CreateElement('Startup','http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition')
$taskNode = $xml.CreateElement('Task','http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition')
$taskNode.SetAttribute('commandLine','my.exe')
$taskNode.SetAttribute('executionContext','elevated')
$taskNode.SetAttribute('taskType','simple')
$startupNode.AppendChild($taskNode)
#Check to see if the startup node exists
$modified = $xml.ServiceDefinition.WebRole.StartUp
if($modified -eq $null){
$modified = $xml.ServiceDefinition.WebRole
$modified.PrependChild($startupNode)
}
else{
$nodeExists = $false
foreach ($i in $xml.ServiceDefinition.WebRole.Startup.Task){
if ($i.commandLine -eq 'my.exe'){
$nodeExists = $true
}
}
if(!$nodeExists){
$modified.AppendChild($taskNode)
}
}
$xml.Save($ServiceDefinitionConfig);
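If the cloud project contains more than one web role (as in the update above), the same idea can be applied per role by looping over every WebRole node; a rough sketch along the same lines, using the same my.exe task (duplicate checking omitted):
# Sketch: add the startup task to every web role, not just the first one
$ns = 'http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition'
foreach ($role in $xml.ServiceDefinition.WebRole) {
    $task = $xml.CreateElement('Task', $ns)
    $task.SetAttribute('commandLine', 'my.exe')
    $task.SetAttribute('executionContext', 'elevated')
    $task.SetAttribute('taskType', 'simple')

    $startup = $role['Startup']
    if ($startup -eq $null) {
        $startup = $xml.CreateElement('Startup', $ns)
        $role.PrependChild($startup) | Out-Null
    }
    $startup.AppendChild($task) | Out-Null
}
$xml.Save($ServiceDefinitionConfig)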
I hope this helps out.
--Nick
I have written a script that inserts some test data into a document library. I intend to use it as a post-deployment step in Visual Studio 2010, so that the library is not empty after a retract & deploy.
The relevant portions of the script are:
Install.ps1:
$scriptDirectory = Split-Path -Path $script:MyInvocation.MyCommand.Path -Parent
. "$scriptDirectory\Include.ps1"
$webUrl = "http://localhost/the_site_name"
$web = Get-SPWeb($webUrl)
...
Include.ps1:
function global:Get-SPSite($url)
{
return new-Object Microsoft.SharePoint.SPSite($url)
}
function global:Get-SPWeb($url,$site)
{
if($site -ne $null -and $url -ne $null){"Url OR Site can be given"; return}
#if SPSite is not given, we have to get it...
if($site -eq $null){
$site = Get-SPSite($url);
...
}
It works fine when run as follows from the command line, even immediately after a Visual Studio re-deploy:
powershell \source\ProjectFiles\TestData\Install.ps1
However, it does not work when I use the exact same command as a post-deployment command line in the SharePoint project's properties in Visual Studio:
Run Post-Deployment Command:
New-Object : Exception calling ".ctor" with "1" argument(s): "The Web applicati
on at http://localhost/the_site_name could not be found. Verify that you have t
yped the URL correctly. If the URL should be serving existing content, the syst
em administrator may need to add a new request URL mapping to the intended appl
ication."
At C:\source\ProjectFiles\TestData\Include.ps1:15 char:18
+ return new-Object <<<< Microsoft.SharePoint.SPSite($url)
+ CategoryInfo : InvalidOperation: (:) [New-Object], MethodInvoca
tionException
+ FullyQualifiedErrorId : ConstructorInvokedThrowException,Microsoft.Power
Shell.Commands.NewObjectCommand
Interestingly, I can reproduce the error on the command line if I run:
c:\windows\Syswow64\WindowsPowerShell\v1.0\powershell \source\ProjectFiles\TestData\Install.ps1
However, the post-deployment command fails even if I explicitly run \windows\System32\WindowsPowerShell\v1.0\powershell and \windows\Syswow64\WindowsPowerShell\v1.0\powershell.
Update: Solution found
I seem to be having a similar problem to the one discussed here:
http://social.technet.microsoft.com/Forums/en-US/sharepoint2010programming/thread/faa25866-330b-4e60-8eee-bd72dc9fa5be
I cannot access a 64-bit SharePoint API using 32-bit clients. Because Visual Studio is 32-bit, the post-deployment action will run in a 32-bit process and will fail. There is, however, a 64-bit MSBuild. If we let it run the PowerShell script, all is fine.
Wrap the script in an MSBuild file such as this:
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Install" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="Install">
    <Exec Command="powershell .\Install" />
  </Target>
</Project>
Then, set the post-deployment command line to:
%WinDir%\Microsoft.NET\Framework64\v4.0.30319\MSBuild $(SolutionDir)\ProjectFiles\TestData\Install.msbuild
Use
%WINDIR%\SysNative\WindowsPowerShell\v1.0\powershell.exe
It’s important that you use the virtual path of %WINDIR%\SysNative and not the actual
path of C:\Windows\System32. The reason for this is that Visual Studio 2010 is a 32-bit
application that needs to call the 64-bit version of powershell.exe to successfully load the
Microsoft.SharePoint.Powershell snap-in.
(c)"Inside Microsoft SharePoint 2010", Microsoft Press, Mar 2011
I had the same situation: I needed a post-deployment PowerShell script to create dummy data for lists on my local instance. I tried several other ways, including MSBuild with the .msbuild file as suggested above, but I could not pass all the variables and had to hard-code the file path and URL, which is not what I wanted.
I finally figured out a way to explicitly call the 64-bit powershell.exe.
I knew the 64-bit executable had to be somewhere on the hard drive, and that the WinSXS folder has all the files. A quick search for powershell.exe in the C:\Windows\winsxs folder turned up two files, so I grabbed the path of the one in the amd64 folder.
This is the command I have in the post-deployment option:
C:\Windows\winsxs\amd64_microsoft-windows-powershell-exe_31bf3856ad364e35_6.1.7600.16385_none_c50af05b1be3aa2b\powershell.exe -command "&{$(ProjectDir)PowerShell\dataload.ps1 -xmlPath "$(ProjectDir)PowerShell\dataload.xml" -webUrl "$(SharePointSiteUrl)"}"
I hope this will help someone in the future.
Visual Studio is a 32-bit application, so in 64-bit Windows it runs in a simulated 32-bit environment.
Strangely, the 32-bit environment is called "WoW64" (when 32-bit Windows did this for 16-bit apps, it was called "WoW16"). The "WoW" part means "Windows on Windows".
It's similarly strange that "System32" didn't become "System64" with 64-bit Windows. The "32" is from the 16-bit -> 32-bit transition, to differentiate from "System". Whatever, that's legacy/compatibility for you.
In WoW64, everything looks like a 32-bit Windows.
For example, for a 32-bit process, c:\windows\system32 is redirected to c:\windows\syswow64. 32-bit applications can't (easily) reach anything 64-bit.
It is possible to use PowerShell Remoting to get a 64-bit PowerShell session from a 32-bit environment.
PS>gci env:PROCESSOR_ARCH*
Name Value
---- -----
PROCESSOR_ARCHITECTURE x86
PROCESSOR_ARCHITEW6432 AMD64
PS>Invoke-Command -ConfigurationName Microsoft.PowerShell -ComputerName LOCALHOST { gci env:PROCESSOR_ARCH* }
Name Value PSComputerName
---- ----- --------------
PROCESSOR_ARCHITECTURE AMD64 localhost
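Applied to the scenario in the question, the 32-bit post-deployment process could hand the install script to a 64-bit session like this (a sketch; it assumes PowerShell remoting is enabled on the machine):
# Run the install script in a 64-bit session from a 32-bit process
Invoke-Command -ConfigurationName Microsoft.PowerShell -ComputerName localhost {
    & "C:\source\ProjectFiles\TestData\Install.ps1"
}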
I have had success doing this as a post-deployment command:
%comspec% /c powershell -File "c:\foo\bar.ps1"