I would like to add basic logging and make some other minor changes to the classes generated by SubSonic 2.1 (I'm not using the SubSonic 3.0 T4 templates).
Is there a way to do this without modifying the SubSonic source code?
You have two choices: you can modify the default templates or create your own. I suggest making your own templates, which will live side-by-side with the originals, and then generating your code via the following instructions.
Note that these steps assume you ran the default SubSonic installation; in other words, sonic.exe and the default templates can be found under C:\Program Files\. If not, you'll find your SubSonic files and templates in your alternative installation location.
Make a copy of the default templates folder found at C:\Program Files\SubSonic\SubSonic 2.1 Final\src\SubSonic\CodeGeneration\Templates. I recommend naming the copied folder "TemplatesWithLogging".
Open the aspx files in Visual Studio and modify them to your heart's content. For example, I wanted an alternate C# class template, so I modified CS_ClassTemplate.aspx. If you would rather alter the default templates directly, you can, but I suggest making a backup first.
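To give a flavor of the kind of edit involved, here is a minimal sketch of markup you might add inside the generated class body in CS_ClassTemplate.aspx. The tbl variable and its ClassName property are assumptions about the template's internal naming, and the Debug.WriteLine call is just a stand-in for whatever logging framework you use:

// Hypothetical fragment added inside the generated class body in
// CS_ClassTemplate.aspx; tbl is assumed to be the template's
// current-table variable.
private static void Log(string message)
{
    System.Diagnostics.Debug.WriteLine("<%= tbl.ClassName %>: " + message);
}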
I am going to assume you are already familiar with code generation in SubSonic. I personally like to set up a Visual Studio External Tool for quick, pre-configured regeneration; otherwise, the following can be ported over to the command line. Here are the External Tool setup instructions:
Tools > External Tools > Add
Title: TemplatesWithLogging SubSonic Classes
Command: C:\Program Files\SubSonic\SubSonic 2.1 Final\SubCommander\sonic.exe
Arguments: generate /out Generated /namespace NAMESPACE /server SERVER /db DATABASE (where NAMESPACE, SERVER, and DATABASE are replaced accordingly)
Initial Directory: $(ProjectDir)
Check "Use Output window" and "Prompt for arguments."
Select Apply or OK.
Select the project which will contain the "Generated" folder and auto-generated files, then select Tools > TemplatesWithLogging.
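For reference, the same generation can be run straight from a command prompt; the namespace, server, and database values below are placeholders:

"C:\Program Files\SubSonic\SubSonic 2.1 Final\SubCommander\sonic.exe" generate /out Generated /namespace MyApp.Data /server localhost /db Northwind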
You can find more here.
Yes you can modify the templates that version 2 uses, they're just aspx files. The templates are stored in src\SubSonic\CodeGeneration\Templates under your installation directory.
This blog post goes into more detail:
http://johnnycoder.com/blog/2008/06/09/custom-templates-with-subsonic/
I'm new to C++/COM. I have created an ATL COM project with a callback mechanism to send messages to the managed side. It has one IDL file (sample1.idl) which exposes a number of methods so that the managed environment can access them. Now I would like to add another IDL file (sample2.idl) to that project.
A .tlb is created for both sample1 and sample2, and the build succeeds. On browsing the .dll, however, I couldn't find anything related to sample2.idl. I suspect that the .tlb generated from sample2.idl is not reflected in the .dll.
Can we have more than one IDL file in an ATL (COM) project?
The default for ATL, as with many native build environments, is to embed the type library as a resource in the DLL. You can see this in Visual Studio (retail edition required): use File + Open + File and select the DLL. Open the "TYPELIB" node and you'll see one type library with resource ID #1. This is the one that Visual Studio sees when you use Add Reference.
Almost any build tool that consumes type libraries will only ever look for that one resource, and Visual Studio is no exception. It can also encode only one type library in its project files. You can perhaps make it work by selecting the 2nd .tlb file in the Add Reference dialog, but it is very likely that you'll now be exposed to more problems in your ATL project, like forgetting to register that 2nd type library in your .rgs file.
It is very hard to give proper advice without any hint of what that second IDL file might contain. Stay out of trouble by merging them, or by using the existing support in IDL for importing other .idl files or type libraries.
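For illustration, one conventional way to get the second file's types into the single embedded type library is to reference them from the first IDL's library block. A minimal sketch, with placeholder names and a placeholder GUID:

import "oaidl.idl";
import "ocidl.idl";
import "sample2.idl";   // brings the second file's interfaces into scope

[
    uuid(11111111-2222-3333-4444-555555555555),   // placeholder GUID
    version(1.0)
]
library Sample1Lib
{
    importlib("stdole2.tlb");

    // Naming the imported interface inside the library block makes MIDL
    // include it in the one .tlb that gets embedded as resource #1.
    interface ISample2;
};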
Using Visual Studio 2012 and Azure SDK 2.1, I am trying to figure out the best way to create the csx folder for running in the Azure emulator. My understanding is that the csx folder is not created until I package the Azure project. I can create a package manually from Visual Studio, but this is not an option for an automated build. The other option is to create the package using the msbuild command line. This seems a bit heavy-handed, as it will actually do a build, which is more time-consuming than just repackaging.
So, I thought that cspack might be a more lightweight option. However, when I call cspack with the following command line:
cspack.exe ServiceDefinition.csdef /copyOnly
I get the error: Need to specify the physical directory for the virtual path 'Web/' of role MyProjWeb.
But, I don't do anything like that when using msbuild. I have read a bunch of things about specifying the physical directory and some of the confusion that it can cause. So, I would prefer not to use it unless absolutely necessary, especially since I don't need to specify this when building from msbuild.
So, my main question is what is msbuild doing that cspack is not doing and how do I do the same with cspack?
My other question is: what is the easiest way to generate the csx folder for testing in the Azure emulator?
Edit - Resolution
I thought that I would put down how I resolved this here in case it helps someone else. The big answer to my question (thanks to Chandermani and some other reading) is that CSPack with /copyOnly is basically a fancy xcopy to a folder structure according to some rules, and without /copyOnly it also does a fancy zip to create a package. Not complaining, it is fine that it is simple, but it is good to know this at the outset. You can use it to package anything for Azure; it is not tied to what can be built in Visual Studio, e.g. a PHP site. Using msbuild has the added benefit of only copying the files that are part of your web site deployment.
So, what I found when I got CSPack working and pointed it at the MVC project folder is that it copied everything, including source files, which is not what I wanted. The solution I could find is to first package the web site and then point CSPack at the packaged files. If you go down this path, then this link is very valuable as it describes it step by step.
So, it was either an msbuild post-build step in the Web project to package the files plus a post-build step in my Azure project to cspack them, or a single msbuild post-build step in my Azure project to create the package (doing the cspack work with the benefit of only including my web deployment files). It seemed simpler and less error-prone to have the one post-build step and let msbuild do the heavy lifting. So, the post-build step in my Azure project is something like:
"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe" /devfabric:shutdown > NUL
"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe" /devstore:shutdown > NUL
if $(ConfigurationName) == Debug set CONSTANTSPARAMETER=DEBUG
if $(ConfigurationName) == Release set CONSTANTSPARAMETER=
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe $(ProjectDir)$(ProjectFileName) /t:clean;publish /p:Configuration=$(ConfigurationName) /p:TargetProfile=cloud /p:OutputPath=bin\Cloud$(ConfigurationName) /p:VisualStudioVersion=11.0 /p:overwritereadonlyfiles=true /p:DefineConstants="%CONSTANTSPARAMETER%" /verbosity:minimal /p:PostBuildEvent=
The first two lines shut down the compute and storage emulator.
The next two lines set the preprocessor constants. I found that #if DEBUG was no longer taking effect when built using the msbuild line; I think this is a safety measure whereby DEBUG is stripped when creating a package. I only ever use the package that is created by an automated build system, so it is safe for me to keep the DEBUG constant.
The actual msbuild line has a number of switches. I'll describe the unusual ones:
/p:PostBuildEvent=
If we don't set PostBuildEvent to empty, then the same post-build step will keep getting called forever. And ever...
/p:VisualStudioVersion=11.0
Those clever guys at Microsoft made it possible to open projects with both Visual Studio 2010 and 2012, which is great, but it can bring great sadness when you run msbuild from the command line and end up with nasty MSB4019 error messages because it is looking in the wrong Visual Studio folder for the Azure tools.
Also, note that I use the cloud profile. Since I am only after the csx files, it doesn't seem to make a difference whether I use local or cloud at this point. When I run in the Azure emulator I specify ServiceConfiguration.Local.cscfg.
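For completeness, once the csx folder exists it can be handed straight to the compute emulator with csrun; the csx path below is a placeholder for this project's actual output location:

"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe" /run:bin\CloudDebug\MyAzureProject.csx;ServiceConfiguration.Local.cscfg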
Edit: In the end I took this out of the post-build step and put it in my automated build. My original intention was that running the tests from my dev machine would be the same as my automated build, but the post-build step took too long, and the views were sourced from the obj folder rather than the project folder when running under the debugger, which meant I had to copy files across when making changes on the fly.
Unanswered questions
It would still be good to understand how msbuild does things, to reduce knowledge friction when dabbling in this area. Does it create a package for the website and pass it to CSPack? Or does it parse the project files and then pass some crazy arguments to CSPack? Also, when you run an Azure project in the debugger, it runs in the emulator with only the binaries in the csx folder (not the images, etc.). How does it do that? It would be great to see a description of the Azure build pipeline, with pictures, that shows the lifecycle all the way to deployment. That might also explain why there are two copies of the binaries. Also, this would have been a whole lot easier if Visual Studio had a project flag like packageOnBuild for the Azure project, with options to do a copyOnly or to create a package. I see no point in uneaten cake. Edit: There is a DeployOnBuild setting that can be added to csproj.
Finally, as I mentioned, the whole purpose of this is to get a csx folder that I can point the emulator at so that I can run my unit tests on my dev machine. I do the formal packaging on a build machine, so I don't really need it in Visual Studio. So, really, I don't want to package anything and was hoping there was an easier way of achieving all this.
Since msbuild uses the Azure project file to perform the build, it can derive a lot of information from the project file.
For cspack, the assumption is that the role code has already been compiled and is available for packaging. Since cspack does not depend upon the project file, it needs explicit information about the code path of the web/worker role project, and the csdef file does not contain any such information. If you want to use cspack, I suggest looking at its documentation and trying to create a package for emulator deployment from the command line (the /copyOnly option). Once you find the correct syntax, you can embed it in your build script.
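For what it's worth, the general shape of such a command, with the role and site directories supplied explicitly, is something like the following; the role name comes from the error message above, the directories are placeholders, and the exact syntax should be checked against the cspack documentation for your SDK version:

cspack.exe ServiceDefinition.csdef /role:MyProjWeb;MyProjWeb\bin /sites:MyProjWeb;Web;MyProjWeb /out:MyProj.csx /copyOnly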
I have a large solution that I want to distribute via ClickOnce. It consists of one main shell executable that directly references only a small subsection of the libraries and processes that constitute the solution.
The solution consists of a few other processes and several libraries (some C++). I need to be able to include all of these libraries and processes in one ClickOnce distribution for both local builds and TFS server builds.
I cannot reference every other library and process from the shell project. And I do not wish to push these files into an MSI to be treated as a prerequisite, as it would defeat the purpose of using ClickOnce to distribute/update the product.
What is the correct method to incorporate all of our necessary files/projects into a single ClickOnce distribution?
The IDE won't detect native DLLs as dependencies when publishing, but you can run the SDK tools directly to include them manually in your ClickOnce distribution. You can either use mage.exe in your post-build script or run MageUI.exe to have a wizard guide you through the package generation.
Suggested reading:
Walkthrough: Manually Deploying a ClickOnce Application
Understanding Dependencies of a Visual C++ Application
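For a rough sketch of the mage.exe route (file names, version, and certificate are placeholders):

mage -New Application -ToFile bin\Release\MyShell.exe.manifest -FromDirectory bin\Release -Version 1.0.0.0
mage -New Deployment -ToFile MyShell.application -AppManifest bin\Release\MyShell.exe.manifest -Version 1.0.0.0
mage -Sign MyShell.application -CertFile mycert.pfx -Password secret

The -FromDirectory pass is what picks up every file in the output folder, including native DLLs that the IDE's publish step would miss.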
There is an alternative to Visual Studio for this kind of situation. You could try using Mage, but it can be a little tricky to use. My company wrote an alternative called ClickOnceMore.
ClickOnceMore is a ClickOnce build tool for when you don't want or can't use Visual Studio to do ClickOnce builds.
There is a specific page in the UI for including files (using rules to include anything from a single file to entire directory trees), so you should be able to do exactly what you need with it.
This is what I have done in a similar situation. I use TFS at work, so convert the terms to whatever you may use (or not use) for source control.
I have a main workspace that I use for all development of my application, I keep this workspace pristine.
I then created another workspace with a proper name (ex: solution-deploy) and in this workspace I do the following:
Get latest and merge everything from source-control into the deployment workspace
I build a Release build of my application
I right-click on the root project folder for my deployment project (I put them in the root because I need to access them from there; put them in whatever folder you want) and select "Add -> Existing Item"
I browse in the file selector to the Release directory of the assemblies I want to add to my deployment package and select them. Then I use the arrow next to the Add button and drop down to "Add As Link". Do this for all of the assemblies you want to add, and place them wherever you want them organized in your deployment
In the Solution Explorer, select the added assemblies, and in the Properties window set the Build Action to "Content". This should be all you have to do, but others have had to also set "Copy to Output Directory" to "Copy Always"; I don't do that (see the project-file sketch after this list)
Run a Release Build
Go to the Properties view for your deployment Project
Go to the Publish Tab and Click on the Application Files button
Your files should all be available and added to the Deployment
Set up your ClickOnce settings however you need them to be
Publish your ClickOnce package
Your published package should contain all of the assemblies you need now.
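For reference, after the Add As Link and Build Action steps above, the deployment project file ends up with entries roughly like this (the path and assembly name are placeholders):

<ItemGroup>
  <Content Include="..\NativeLib\bin\Release\NativeLib.dll">
    <Link>NativeLib.dll</Link>
  </Content>
</ItemGroup>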
Keep your separate Deployment workspace set up this way and never check it in. Do your work in your development workspace. Whenever a new deployment is needed, open your solution in your Deployment workspace and get the latest code, build, then publish.
In our company we regularly create MSIs with InstallShield (latest).
These setups adhere to a set of rules and naming schemes so they work with our deployment system, autobuilds, etc.
Is there a way to eliminate the repetitive overhead of going through all the boilerplate stuff (setting the company metadata, basic folder structures, a few events, including some default helper files, etc.) for each setup?
Take a look at the InstallShield Automation interface. What I did was:
Abstract all my components out into WiX merge modules (they could be InstallShield merge modules, though).
Create a base InstallShield project (Common.ism)
Create XML files to describe my feature tree and product configurations
Create Build Automation to reflect the XML and invoke the Automation Interface to "Emit" my installer source.
Build the Product Config in the ISM.
This gave me a great deal of code reuse but it's not trivial to set up this type of system. However it scales very well and the advantages are huge if you have the right business needs.
There are two ways you can do this:
Save the .ism file in XML format (there is a setting for this in the project settings). Then, at run time, push the desired values in with a new application written for the purpose (which edits the XML file using DOM or similar).
Use the InstallShield Automation interface. This can be done using VBScript. You may check this link: InstallShield Automation Interface
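As a flavor of the automation route from VBScript, a minimal sketch; the ISWiAuto ProgID is specific to your InstallShield version, and the path and property tweak are illustrative:

Set proj = CreateObject("ISWiAuto14.ISWiProject") ' ProgID varies by InstallShield version
proj.OpenProject "C:\Setups\Common.ism"
proj.ProductName = "My Product"                   ' stamp the boilerplate metadata
proj.SaveProject
proj.CloseProject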
What files can we modify so that our solution is still supported by Microsoft?
Are we allowed to customize error pages?
Can we modify the web.config files to use custom HTTPHandlers?
You can certainly edit the web.config file for your sites. The one thing that you should be aware of, however, is that when you start editing files manually on the file system, you will have to remember to manually make those changes across all servers in the farm (assuming a farm exists). In addition to this, when you edit files in the 12 hive, it's important to understand that you will be making a change to all SharePoint sites hosted on the server(s) for which the files were edited.
Personally, if I were going to create a custom error page, I would simply add a <customErrors> section to my web.config. I avoid editing any existing files in the 12 hive, but I have added files (though it's rare).
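For instance, a minimal sketch of that kind of addition (the error page URL is a placeholder):

<system.web>
  <customErrors mode="On" defaultRedirect="/_layouts/CustomErrors/Error.aspx" />
</system.web>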
The customization of the error page is not very easy (or flexible). You can see an example here:
http://blogs.msdn.com/jingmeili/archive/2007/04/08/how-to-create-your-own-custom-404-error-page-and-handle-redirect-in-sharepoint-2007-moss.aspx
The web.config can be changed. I used my own HttpModules in addition to the original ones, but I haven't used custom HttpHandlers. In my opinion it should work if you don't change the original handler (i.e. if you add your handler for a specific type of file not handled by SharePoint).
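As an illustration of that last point, registering a handler for a file type SharePoint doesn't already claim might look like this in web.config (the extension and type names are placeholders):

<httpHandlers>
  <add verb="*" path="*.report" type="MyCompany.ReportHandler, MyCompany.Handlers" />
</httpHandlers>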
Do not modify any pre-installed files in the 12 hive (Program Files\Common Files\Microsoft Shared\Web Server Extensions\12)... a service pack may update and overwrite any changes.
Anything in the content database (master pages and stylesheets in the _catalogs lists) is available to modify (I would add rather than update, in case a service pack changes anything), as it sits atop the file system and is instantly available to all members of the web farm, including newly added servers.
Custom features can be added to the 12 hive under the FEATURES folder, in your own non-Microsoft folder (that is, inside 12\TEMPLATE\FEATURES, do not modify any preinstalled files, but feel free to add a folder for your feature and work within it).
Custom features can be developed using the Visual Studio Extensions for Windows SharePoint Services (VSeWSS), currently available for Visual Studio 2005/2008... the benefit being that the output is a feature package (.WSP file) which is designed to be portable across SharePoint. Additionally, .WSP files are just CAB files with a different extension, so they can be explored by simply renaming them.
For site definitions, Microsoft has a good article about what is supported and unsupported. In short, the only change you can make to the out-of-the-box site definitions is changing the entry in the webtemp.xml file to hidden in order to prevent the site definition from appearing in the site template list. This is something many may be interested in doing.
You may also, of course, copy existing definitions and rename them in order to create new ones.
The complete list of supported and unsupported scenarios for working with custom site definitions can be found here:
http://support.microsoft.com/default.aspx?scid=kb;en-us;898631
Here is the closest I can find to an official response from Microsoft:
http://technet.microsoft.com/en-us/library/cc263010.aspx