SlowCheetah transform not working on Jenkins build server - VS2010 - different drives

I have added SlowCheetah to two solutions that I'm working with. The first one transforms when I build it locally but not on the build server. The second one doesn't transform either locally or on the build server.
Overall I'm a little confused about how to get SlowCheetah working. I've read several links from Sayed and am struggling with which one I should be following. I have added version 2.5.5 to both projects.
http://sedodream.com/2012/12/24/SlowCheetahBuildServerSupportUpdated.aspx - This one says that I need to build a packagerestore.proj file first to restore the NuGet package. However, I get path errors when I try this: error MSB4019: The imported project "E:\jenkins\CAREweb.net (DEV APP BUILD)\workspace\development\systems\CARE\apps.nuget\nuget.targets" was not found.
http://sedodream.com/2011/12/12/SlowCheetahXMLTransformsFromACIServer.aspx - This is the one I would like to follow, because I can just put the files in the LocalAppData folder on the CI server for the user that the Jenkins service runs under: solve it once for everyone on the team and for every build. But it just doesn't transform, and I don't seem to get any kind of error. Not sure if it makes a difference, but we are building on the E: drive and the LocalAppData folder is on the C: drive.
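In case it helps, here is roughly how the LocalAppData location can be checked from a Jenkins build step (the SlowCheetah\v1 folder name is my reading of the 2011 post, so treat it as an assumption). Note that if the Jenkins service runs as LocalSystem, its LocalAppData is under C:\Windows\System32\config\systemprofile, not C:\Users:

:: sanity check from a Jenkins build step: which profile does the
:: build actually see, and is the targets file really there?
echo %LOCALAPPDATA%
dir "%LOCALAPPDATA%\Microsoft\MSBuild\SlowCheetah\v1"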
I've read a number of his other posts on SlowCheetah and I've looked at a bunch of the answers on Stack Overflow, but haven't found one that solves my problem yet.
We are using Visual Studio 2010 and building the project file on Jenkins from the command line. I'm sure that if I can get the one solution working then I should be able to get the other one working, but I'm just not making any progress at the moment.
Update
These are the related SlowCheetah sections from my project file:
<PropertyGroup Label="SlowCheetah">
<SlowCheetah_EnableImportFromNuGet Condition=" '$(SC_EnableImportFromNuGet)'=='' ">true</SlowCheetah_EnableImportFromNuGet>
<SlowCheetah_NuGetImportPath Condition=" '$(SlowCheetah_NuGetImportPath)'=='' ">$([System.IO.Path]::GetFullPath( $(MSBuildProjectDirectory)\..\packages\SlowCheetah.2.5.5\tools\SlowCheetah.Transforms.targets )) </SlowCheetah_NuGetImportPath>
<SlowCheetahTargets Condition=" '$(SlowCheetah_EnableImportFromNuGet)'=='true' and Exists('$(SlowCheetah_NuGetImportPath)') ">$(SlowCheetah_NuGetImportPath)</SlowCheetahTargets>
</PropertyGroup>
<Import Project="$(SlowCheetahTargets)" Condition="Exists('$(SlowCheetahTargets)')" Label="SlowCheetah" />
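From what I can tell, the import above only happens when that Exists() check passes on the relative packages path, which is resolved against the project directory on whatever drive the build runs from. One possible workaround (just a sketch; the targets location below is a placeholder) is to bypass the probing and pass the property explicitly on the Jenkins command line, since /p: global properties override anything the project file sets:

rem targets path is a placeholder: point it at wherever the
rem SlowCheetah.Transforms.targets file actually lives on the build server
msbuild MySolution.sln /p:Configuration=Release ^
  /p:SlowCheetahTargets="E:\BuildTools\SlowCheetah\SlowCheetah.Transforms.targets"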

Try it with version 2.5.10. I've modified SlowCheetah to add the .targets file to the project, so no extra configuration is required now. More info at https://github.com/sayedihashimi/slow-cheetah/issues/113.
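If you're updating from 2.5.5, something like this from the Package Manager Console should do it (adjust the version as needed):

PM> Update-Package SlowCheetah -Version 2.5.10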

Related

Xamarin Android + iOS builds failing on Azure DevOps

I have a Xamarin mobile app with 3 projects: Shared, Android and iOS.
All 3 build perfectly fine locally but fail in the Azure DevOps pipeline.
iOS and Android only have 2 XAML views that are platform specific. The rest are located in Shared.
For both of those XAML views, all the errors come from the code-behind .cs files, complaining that something doesn't exist in the current context. There are around 80 errors like the one below. The errors are identical on both platform builds.
Example error from Droid build:
Droid\Views(Filename).xaml.cs(26,13): error CS0103: The name 'InitializeComponent' does not exist in the current context
This build hasn't run for a while, around 8 months. It used to work fine, and none of the views' XAML or C# code has changed. I'm assuming a version is now misconfigured somewhere.
Both builds run on VS 2022 pipelines.
Both builds restore okay.
I have tried (mostly suggestions from similar posts):
Adding a restore argument to the Build step.
Checking that namespaces match.
Adding a small change (whitespace) to the 2 XAML pages to force a change.
Removing the shared project reference and re-adding it.
I would be grateful for any suggestions or ideas.
Thanks in advance.
In past experience, if something works locally and not on the build-server pipeline, this usually points to a discrepancy between the two machines, and potentially their versions of given libraries.
If you run the pipeline off a local machine as well, confirm that all libraries installed on that machine match your local ones (Xcode, android-sdk, etc.).
If you run the pipeline off a hosted machine, it may be that the hosted image needs to be updated to a newer one to keep up with the project.
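For example, one mismatch worth ruling out (assuming the projects use PackageReference; the version below is only illustrative) is the Xamarin.Forms package version across the three projects, since InitializeComponent lives in generated *.xaml.g.cs files that only exist when the XAML build task runs with a consistent toolchain:

<!-- keep this identical across Shared, Android and iOS;
     the version shown is an example, not a recommendation -->
<PackageReference Include="Xamarin.Forms" Version="5.0.0.2612" />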

How to build a .sqlproj requiring SSDT in a Linux Docker container?

I want to build a .sqlproj inside a Linux container. The problem is that building the .sqlproj depends on SSDT, and so far I can't find an SSDT that can be installed standalone on Linux.
Error I see running 'dotnet msbuild' in my container:
error MSB4019: The imported project "/usr/share/dotnet/sdk/2.2.402/Microsoft/VisualStudio/v11.0/SSDT/Microsoft.Data.Tools.Schema.SqlTasks.targets" was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk.
Searching the .sqlproj file for the issue, I see it imports a Schema.SqlTasks.targets file, which I'm assuming SSDT provides:
<Import Condition="'$(SQLDBExtensionsRefPath)' == ''" Project="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v$(VisualStudioVersion)\SSDT\Microsoft.Data.Tools.Schema.SqlTasks.targets" />
I see the same error testing out 'dotnet build'.
There is a Windows standalone option:
https://learn.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?redirectedfrom=MSDN&view=sql-server-ver15
https://www.nuget.org/packages/Microsoft.Data.Tools.Msbuild/10.0.61804.210
Has anyone found a way to provide SSDT in a linux container?
Goal State: The build step to generate the dacpacs would occur inside the container.
Current State: For now I'm using Visual Studio on my machine as the build machine, then copying the dacpacs up to the container where sqlpackage.exe can then publish the schema.
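(For reference, the publish step I run against the container today is roughly the following; server name, database name and credentials are placeholders.)

# placeholders throughout; sqlpackage's Publish action pushes the
# dacpac schema to the target database
sqlpackage /Action:Publish /SourceFile:MyDb.dacpac \
  /TargetServerName:localhost /TargetDatabaseName:MyDb \
  /TargetUser:sa /TargetPassword:"$SA_PASSWORD"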
Why Linux? This dockerized DB will support a stack of services running in Linux containers, so a Windows container is not ideal.
OK, so with a bit of help from ErikEJ and other members of the community at the GitHub link given by Brett Rowberry in the comments, I finally figured out how to do this.
The steps to follow are quite simple.
Add your SQL Server project
Add a .NET Standard class library project and call it something like "database.build"
Remove all the code from the class library project, and modify the csproj file so it reads something like the following
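(The original snippet isn't reproduced here; this is a minimal sketch assuming ErikEJ's MSBuild.Sdk.SqlProj SDK that the linked blog post builds on, so the SDK version and relative path are placeholders.)

<!-- database.build.csproj: compiles the .sql files from the SQL Server
     project into a dacpac on any platform, no SSDT required -->
<Project Sdk="MSBuild.Sdk.SqlProj/1.11.0">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <Content Include="..\MyDatabase\**\*.sql" />
  </ItemGroup>
</Project>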
Change the properties on your solution so as NOT to build the SQL Server project for any configuration.
Once you've done this, you'll find that you still get access to the SQL database project in Visual Studio, with all the syntax highlighting and IntelliSense, but your CI will build the SQL code via the linked class library and produce a dacpac in its output folder, ready to be deployed.
Right-clicking on the project will allow you to build it from within Visual Studio, and right-clicking and selecting "Publish" will allow you to specify your database parameters and publish it to the DB server being used. If you have other objects already in your database that you need to reference, you can create an object-only dacpac from the database using SQL Server Management Studio, which you can then add to your project and check in with its sources so that you do not need to recreate every single object you may already have.
I've written a blog post explaining it all at length here:
https://shawtyds.wordpress.com/2020/08/26/using-a-full-framework-sql-server-project-in-a-net-core-project-build/
I had the same issue; what I did as a workaround:
1- Generated a SQL script from the database project ("Generate Script")
2- Followed this code sample to run the script on the Linux container:
https://github.com/microsoft/mssql-docker/issues/2#issuecomment-547699532
https://github.com/lkurzyniec/netcore-boilerplate/tree/master/db/mssql
https://github.com/lkurzyniec/netcore-boilerplate/blob/master/docker-compose.yml
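For example, once the generated script is copied into the image, it can be run with the sqlcmd that ships in the official SQL Server image's mssql-tools package (the script name is a placeholder):

# run the generated schema script against the containerized instance
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P "$SA_PASSWORD" -i create-database.sql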

Why can't I open my Azure Bot Service in Visual Studio?

Okay, so this problem is kinda complicated, but I'll try to make it as simple as possible.
So I have this Bot Service created in Microsoft's Azure platform. It's been hooked up to BitBucket with the continuous integration option. Then, I tried following the instructions here in order to be able to debug the bot locally with Visual Studio. I downloaded and installed all the requisite tools and pulled the BitBucket project into a local repository. However, when I tried running 'dotnet restore' in the messages folder, I received this error message:
C:\...\messages\project.json(1,1): error MSB4025: The project file could not be loaded. Data at the root level is invalid. Line 1, position 1.
This project.json file was automatically built by Azure; why should it be invalid? The contents look like this:
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.Bot.Builder.Azure": "3.1"
      }
    }
  }
}
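(One thing I did learn: MSB4025 at line 1, position 1 generally means MSBuild tried to parse the file as XML, and a project.json starts with '{', so a newer, MSBuild-based dotnet CLI will fail on it exactly like this. A suggestion I found was to pin the old project.json-era CLI with a global.json next to the solution; the exact preview2 build number below is a guess, so treat it as an assumption.)

{
  "sdk": { "version": "1.0.0-preview2-003131" }
}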
I've also tried the same thing with Visual Studio 2017, this time including the messages.csproj file in the messages folder. This time, dotnet restore said I needed to specify a project/solution file because there were multiple in the folder. I dunno if that's necessarily a problem, but it's not mentioned at all in the official guide, so it's at least a bit suspicious. Anyway, specifying project.json leads to the same error, while specifying messages.csproj seems to work all right and outputs this:
NuGet Config files used:
C:\Users\Connor.Johnson\AppData\Roaming\NuGet\NuGet.Config
C:\Program Files (x86)\NuGet\Config\Microsoft.VisualStudio.Offline.config
Feeds used:
https://api.nuget.org/v3/index.json
C:\Program Files (x86)\Microsoft SDKs\NuGetPackages\
That being the case, I ran debughost.cmd. Previously, I had to manually copy over project.lock.json from the downloadable zip file available on Azure (it's in .gitignore), as debughost wouldn't automatically restore that stuff. More recently that doesn't seem to be a problem anymore. Anyway, the debughost.cmd stuff works fine.
Now, the real problem's when I try to open this stuff in Visual Studio. See, when I try to open the bot.sln file, I get this error message:
One or more projects in the solution were not loaded correctly. Please see the Output Window for more details.
Okay, so the Output Window gives me this super useful information:
Some of the properties associated with the solution could not be read.
Uh huh... Well, in terms of what shows up in Visual Studio, only debughost.cmd, commands.json, and readme.md show up. The messages folder is there, but it's empty. There's also an Azure Functions func thing. That's it.
Now I've looked all over for information on this issue, but I'm apparently the only person who's had it. Moreover, I've tried opening the bot in VS2015, VS2017, from only the source code downloaded from Azure (i.e., without the Git stuff), and from the BitBucket repository. I've also tried connecting to source control from the Team Explorer in Visual Studio. Nothing's working! I can't find any information on what might be incorrectly configured, and I find it odd that I should have to change anything. I could seriously use some help here.

Unable to publish Node.js site to Azure using Visual Studio 2013

I am publishing my Node.js site to Azure using this tutorial - http://blogs.technet.com/b/sams_blog/archive/2014/11/14/azure-websites-deploy-node-js-website-using-visual-studio.aspx
I get the following error, as mentioned in one of the comments on the blog. Any idea what this error is about and how I can fix it? I am able to run my app locally with no issues.
Error: InvalidParameter
Parameter name: index
P.S.: the site is a very basic "Hello World" kind of site, and this is the first time I am using and deploying to Azure too.
I created a new project as a "Blank Azure Node.js web application", replaced the resulting package.json and .js files with what I had before, and it publishes fine now.
All was working fine and then I suddenly got the error! I'm pretty sure it's something in the project, as it's now happening in VS2013 and VS2015 on different computers.
After a lot of searching: it's something to do with templates. For me, Azure TFS CI got things working again, if that's possible for you?
I had this issue with some projects but not with others, all created in a similar way. So I went through every change and every setting I could until eventually I worked it out. I didn't want to give up and just remake them.
Basically, it's file paths. The first thing you notice is that it errors very quickly compared to a usual publish. The first thing triggered is a build, but unlike heavy framework languages there's not really much to actually build.
Like all VS builds, it pops out a bin folder; take note of where this appears. This is the key: you want it to appear in the root of your deployment, usually at the same level as the publish profile.
Before I moved my projects to VS, TFS and Azure, I used to use Git with Azure's push deployment, so I instinctively structured my folders in a similar fashion, with a src folder and all the extra VS baggage a directory higher.
This is where I noticed the bin folder, so I restructured my solution, made changes to the .njsproj (in Notepad) to bring it inline with the source code, and re-added it to my solution.
Technically speaking this is a bug within VS, as it allows you to create the project and specify different locations, which is all fine unless you want to build and publish locally.
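If it helps, the relevant bit to check in the .njsproj is the output path (the fragment below is illustrative; OutputPath is a standard MSBuild property, but your value may differ):

<!-- illustrative .njsproj fragment: keep the build output in the
     project root, next to the publish profile -->
<PropertyGroup>
  <OutputPath>bin\</OutputPath>
</PropertyGroup>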
Once you get your head around what is going on, you should be able to solve this problem easily and not make the same mistake in the future. If anyone is still confused, comment and I'll grab some screenshots.

VS2012 cannot copy DeploymentItem to test output folder

I have a project of MSTest unit tests. I have 3 machines that have VS2012 Ultimate Update 4 installed. But with this project, on one of my 3 machines, the DeploymentItem files are not copied to the output folder, which in turn causes unit-test failures. The other two machines are fine with the same project. I am using TFS as the source control system. Can someone help me fix this issue?
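(For context, the deployment items are declared the usual MSTest way; the file name below is a placeholder.)

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ParserTests
{
    // MSTest should copy TestData\input.xml next to the test binaries
    [TestMethod]
    [DeploymentItem(@"TestData\input.xml")]
    public void ParsesInput()
    {
        Assert.IsTrue(System.IO.File.Exists("input.xml"));
    }
}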
Update: I have given up. This seems to be an issue with the VS2012 installation itself, since the same project can run its tests fine on the other machines.
Do you have a different test project on that machine that points to the same output folder?
According to this thread, one of the projects could fail to output in that case.
It could be that you excluded the extra project on the working machines, or that the order in which the projects are built and deployed is different. Are there any other differences (like number of processor cores) between the machines?
Another cause could be differences in user rights on the different machines (depending on the destination directory you are deploying to). Try starting Visual Studio on the failing machine by right-clicking the icon and choosing "Run as administrator". Does that make any difference?
Since it is working on your other machines, I guess the Copy to Output Directory properties are already set correctly.
This is a weird one, and I encountered a similar issue recently where the build and output succeeded on a few machines while failing on others. My web project referenced assemblies from the GAC and from a folder located at a relative path inside the solution.
On the machines where the output was failing, I was receiving an error in the output window along the lines of: "Can't locate or access assemblies in the path.."
I resolved the issue by:
Removing all the assemblies that were referenced from the relative path
(Optional) Manually removing the Debug and Release folders in both bin and obj for all projects. (A Clean Solution from the Build menu would probably work here as well, but I didn't want to take the risk and wanted to be sure.)
Adding back the assemblies from the local path
Rebuilding the project
Running the test project; everything was working fine on all machines.
Hope this helps!
It turns out it was my own fault: I hadn't set the test settings in the "Test -> Test Settings" menu. But how could I know? VS2012 on my other machines was configured automatically; for some reason, that particular machine wasn't. So there you have it, the answer to the question. It's a simple one, but when you don't know, it's utterly hard.
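For anyone hitting the same thing: deployment has to be enabled in the active .testsettings file, which looks roughly like this (the schema namespace is the VS2010/2012 one; the settings name and file path are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<TestSettings name="Local" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Deployment enabled="true">
    <!-- items listed here are deployed in addition to [DeploymentItem] attributes -->
    <DeploymentItem filename="TestData\input.xml" />
  </Deployment>
</TestSettings>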
