Code coverage tool from Visual Studio 2012 Team Tools does not work on release bits - IIS

When I try to run the codecoverage tool on the release bits of my website, I get an empty .coverage file containing the following error:
Empty results generated: No binaries were instrumented. Make sure the tests ran, required binaries were loaded, had matching symbol files, and were not excluded through custom settings. For more information see http://go.microsoft.com/fwlink/?LinkID=253731
This issue does not occur if I run it on the debug version of the same build.
These are the exact steps I perform:
- start monitoring code coverage on IIS server
codecoverage collect /IIS /session:test /output:test.coverage
- perform a few click around the website
- stop monitoring code coverage
codecoverage shutdown /session:test
Note! I do have the .pdb files in the same place as the binaries.
Any ideas?
Thanks,
Cristina

I finally managed to get to the bottom of this. For the release build, there were 2 things I needed to change in the .config file:
- specify the path to the symbols (apparently this is needed even if the .pdb files are in the same location as the dlls)
- remove the default exclusion list, since it includes Microsoft public key tokens and our product is a Microsoft product.
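For anyone hitting the same wall, here is a minimal sketch of what those two edits can look like, using the element names from the standard Visual Studio code coverage settings (the symbol path is only a placeholder, and you may want to keep more of the default exclusions than shown here):

<CodeCoverage>
  <!-- 1. point the collector at the folder holding the release .pdb files -->
  <SymbolSearchPaths>
    <Path>C:\inetpub\wwwroot\MySite\bin</Path>
  </SymbolSearchPaths>
  <!-- 2. trim the default exclusions that filter out Microsoft-signed assemblies -->
  <PublicKeyTokens>
    <Exclude>
      <!-- default entries such as ^B77A5C561934E089$ were removed here -->
    </Exclude>
  </PublicKeyTokens>
</CodeCoverage>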
Hope this will be helpful for others running into the same situation.
Regards,
Cristina

Try this: Enabling Profiling.
[UPDATE 5 Jun 2013 14:04:00-04:00 UTC]
I also found this article that you may be interested in: Application Analytics: What Every Developer Should Know.
As far as Code Coverage is concerned, I didn't see anything specific to using the VS Code Coverage tool and configuring for IIS. I did see lots of articles about testing web applications in general, some of which involved setting up code coverage (Setting Up Machines and Collecting Diagnostic Information Using Test Settings being an example).
Sorry I couldn't be of more help to you.

Related

Klocwork - how to scan cross-platform projects?

We have a project with a single code base which we build on both Windows and Linux, and we want to run Klocwork code analysis on both. Currently our approach is:
- We have set up one KW project in the web UI
- Inject and build on Linux, push the results to the server, save the report
- Inject and build on Windows, push the results to the server, save the report
It somehow works, but the problem is that the latter scan effectively overwrites the results of the first one. If we save the report directly after the push, we still have a saved copy, but if developers want to triage/analyze a hit that is present only in the first build (i.e. some Linux-specific code), it's almost impossible, because KW has already marked that hit as "obsolete" (since it was not present in the Windows scan).
Having two projects is not really an option, because 90% of the code is shared, and it would create a huge overhead for developers to triage the same hits twice.
There are multiple options to achieve your goal.
Option 1: There is a tab under Projects called Builds that gives you the build chain report. Here you can see the reports of previous builds.
Option 2: Are you using the Klocwork desktop tools (plugins/kwcheck)? If so, developers are notified automatically about the new defects/issues they have introduced on their own machines, so there may be no need for a developer to review the Klocwork portal just to see which issues they created.
Option 3: I see you have mentioned that 90% of the code is shared. Does that mean your project needs Windows dlls and Linux libraries together to build?
If the answer is yes, please do let me know and I will think about some possible workarounds.
If the answer is no, then creating the kwtables is a one-time job, and from the second run onwards Klocwork can perform incremental analysis (kwbuildproject ... --incremental); see the command sketch after these options.
Option 4: Creating multiple projects is not a bad option. Existing project settings can be replicated and issue statuses can be synced. When you push results to the Klocwork server, they are pushed from the build machine to the Klocwork web/database server, which creates a /projects_root/My_Project/builds/My_Build_Name/ directory. So maintaining two Klocwork projects won't make much of a difference.
Option 5: Schedule a call with the Klocwork support team. They will be happy to assist you with the best possible approach.
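As a rough illustration of option 3, an incremental flow might look something like this (the server URL, project name and build command are placeholders; check the documentation of your Klocwork version for the exact switches):

kwinject --output build_spec.out make
kwbuildproject --url http://kwserver:8080/MyProject --tables-directory kwtables build_spec.out
# later runs can reuse the tables and only re-analyze what changed
kwbuildproject --url http://kwserver:8080/MyProject --tables-directory kwtables --incremental build_spec.out
kwadmin --url http://kwserver:8080 load MyProject kwtables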
I hope this helps.

Windows automation needs to interact with a .NET application

I really don't know how to better explain this situation I'm running into but I'll try my best.
This is what I'm trying to do...
I currently have a batch file that does a set of actions such as downloading files from FTP, creating folders, modifying some text files, etc. This used to take us 45 minutes to do manually, and the batch script automation makes it easier.
The next step I want is to launch our .NET Windows application, log in to it, do some actions in it, and then log out of it. This is actually a regression test case which I've automated using VS Coded UI on another machine. The problem I'm running into is that there is a separate support team that will need to do that 45 minutes of work I've already automated, followed by some actions after logging into the application. That support team's machines will not have VS or Coded UI installed.
So, how do I go about it? Any idea, please?
You can execute the CodedUI tests you've written without Visual Studio/CodedUI being installed on the machines. Remember that you can run tests through controllers and agents in TFS or Microsoft Test Manager. You can take those principles to run them manually, even if it's a strange corner case. This takes two steps, if I recall correctly:
Design your CodedUI tests to reference the CodedUI .dlls as part of the solution, rather than the GAC. By this I mean copy and paste the required .dlls into a solution folder and replace the existing references with ones that point to the .dlls in that folder. When you distribute your tests, be sure to include this solution folder, of course. (UPDATE: After some more experience with this, I've found it much easier to use NuGet packages instead. Project-level references are an absolute NIGHTMARE.)
Install the "Test Agent" software that Microsoft provides for free on the tester's machine. This will install the other testing .dlls your tester will probably need in their GAC. You could do step 1 with these as well but to be honest I think this is less trouble. In addition, it installs the necessary mstest executables.
Your testers will then have to use mstest.exe (UPDATE: once you've installed the agents, you can/should use vstest.console.exe as an alternative) in the console to run your CodedUI tests. Alternatively, you can probably use PowerShell or your batch files to execute these tests and your other tests in one neat package.
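For example, the command lines could look something like this (the dll and settings file names are placeholders for whatever your test project produces):

mstest /testcontainer:MyCodedUITests.dll /testsettings:LocalTestRun.testsettings
rem or, with the newer runner installed by the agents setup:
vstest.console.exe MyCodedUITests.dll /Settings:CodedUI.runsettings /Logger:trx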
Please let me know if this gave you a potential solution for your problem.

How to see MSBuild output generated in project load time?

I've noticed that when Visual Studio 2012 loads/initializes projects (when opening a solution or when changing platforms/configurations), it may execute some MSBuild targets - those that are listed as InitialTargets (it doesn't always do that - sometimes it waits until you actually build it; I can't figure out when exactly, but that's a different question).
Anyway, these targets may generate some output in the form of MSBuild messages. If the targets were being run as part of a build, these messages would go to the output window of Visual Studio (and perhaps a log file). These "load-time targets", though, do not seem to send their output to the output window.
How can I see or log the output of MSBuild targets which execute outside of build time, and specifically in initialization time?
The best information source from the MSBuild team at Microsoft that I could find is dated (2005), but may still be accurate if no one comes up with fresher information:
The project load logger is used when projects are opened. It discards all messages logged while the project is opened, puts warnings in the error list, and displays any errors in a message box to the user. The errors displayed are quite detailed and useful in helping diagnose project file formatting issues.
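One workaround, if you just need to see what those targets print, is to run the project through MSBuild on the command line with a file logger attached; InitialTargets run before whatever target you request, so their messages end up in the log (the project name here is a placeholder):

msbuild MyProject.csproj /t:Build /fl /flp:LogFile=load.log;Verbosity=diagnostic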

Possible to use the Windows Phone 8 Store Test Kit with Custom Build Configurations?

I've got a custom build configuration set up in my V.S. 2012 Win Phone 8 solution (distinct configurations other than just Debug and Release) and am using SQLite (which cannot be built against the AnyCPU platform), and I cannot find a way to run the test kit, since it insists on a "Release + AnyCPU" build. Copying the contents of the actual output folder to "Bin\Release" doesn't seem to satisfy the test kit (nor did setting the output folder in the project file to "Bin\Release" for the appropriate configuration), and I can find no way to change the test kit settings/path. Is this simply impossible, or am I just missing something?
I have read this post but it's less than helpful and also incomplete (that post doesn't say anything about custom build configurations). I have also sent negative feedback here asking a similar question (but of course that's unlikely to be replied to).
I was going to add "store-test-kit" and "custom-build-config" tags to this but don't have the reputation to do so so I've added them in this note instead in case it helps with future searches.
My solution combines a number of projects spanning from Windows 8 Desktop to Windows 8 Phone (including DLLs and other supporting dependency projects such as SQLite). In order to build only what is needed, I drastically customized my build configurations. So e.g. I have:
N-Dbg, N-Rls, N-Str and P-Dbg, P-Rls, P-Str as build configurations. These distinguish the Desktop ([N]ot phone) and [P]hone primary projects and the Dbg (Debug), Rls (Release) and Str (Store) build sets. I have a distinct Str build so that the live store code bits are excluded from the Release build, which also leaves out all the extra Debug features.
As complicated as it all sounds it works really quite well in practice, until I need to build a "Release" version. (And this is completely ignoring the platform part of the equation for which just selecting ARM might suffice, I really can't say because I have no good way to test it at the moment).
Ideally then what I want the Store Test Kit to test against is the P-Str build config. I tried setting the configuration to output that set of files to "Bin\Release" which is where a brand new standard build config puts "Release\Any CPU" builds, but the Store Kit didn't like it. It doesn't even list anything for a path to the XAP file that it's supposed to be testing.
After my initial post I did think to try making a new Release build config that was just setup the same as the P-Str build config but that also did not seem to make the Store Test Kit happy (though I admit that I am not 100% sure that I added the new build configuration completely accurately since it had been a long time since I created the custom config in the first place).
Does anyone out in the S.O. world have any experience with such a completely custom build configuration that might be able to help me out here?
I guess after almost a full year, if no one has found a better way, I'll take the advice and call this the answer...
I was very close with the idea of copying the files to the Bin\Release location, but there were a couple of missing pieces. First, I had to add a Release+AnyCPU entry to the project file (even though I never actually use it), apparently so that the Store Test Kit knows where to look for the xap. Second, I had to be sure to rename the xap file in the Bin\Release folder to ProjName_Release_AnyCPU.xap. Once these things are done, I can run the Test Kit.
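For reference, the hand-added project-file entry and the rename were roughly of this shape (the output path and source xap name are placeholders; adjust them to whatever your P-Str configuration actually produces):

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <OutputPath>Bin\Release\</OutputPath>
</PropertyGroup>

copy Bin\P-Str\MyApp.xap Bin\Release\ProjName_Release_AnyCPU.xap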

How to publish MSHTML.dll and SHDOCVW.dll to Azure

I have a 3rd party web page screen capture DLL from http://websitesscreenshot.com/ that lets me target a URL and save the page to an image file. I've moved this code into my Azure-based project, and when I run it on my local sandboxed dev box and save to the Azure blob, everything is fine. But when I push the bits to my live server on Azure, it fails.
I think this is because either MSHTML.dll and/or SHDOCVW.dll are missing from my Azure configuration.
How can I get these libraries (plus any dependent binaries) up to Azure?
I found the following advice on an MSFT forum but haven't tried it yet. http://social.msdn.microsoft.com/Forums/en-US/windowsazuredevelopment/thread/0344dcff-6fdd-4479-a3b4-3e89750a92f4/
Hello, I haven't tried mshtml in the cloud. But generally speaking, to use a native dll in a Web Role, you add the dll to the Web Role project just like adding a picture (choose Add Existing Items). Then make sure the Build Action is set to Content. This tells Visual Studio to copy the dll file to the output package.
Also check dependencies carefully. A lot of problems related to native code are caused by missing dependencies, such as a particular VC++ runtime dll.
Thought I'd ask here first before I burn a day or two on an unproven solution.
EDIT #1:
It turns out that our problem was not related to MSHTML.dll or SHDOCVW.dll missing from the Azure server. They're there.
The issue is that, by default, new server instances have the IE security hardening feature enabled, and this was preventing our 3rd party dll from executing script. So we needed to turn off the enhanced IE security configuration settings. This is also a non-trivial exercise.
In the meantime, we just created a server-side version of the feature on our site we need to make screen captures from (e.g. we eliminated JSON-based rendering of UI on the client), and we were able to proceed.
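For what it's worth, the usual scripted way to switch that hardening off is to flip the well-known Active Setup registry values for IE Enhanced Security Configuration and then restart the instance; treat this as an untested sketch and double-check the GUIDs against your OS image:

reg add "HKLM\SOFTWARE\Microsoft\Active Setup\Installed Components\{A509B1A7-37EF-4b3f-8CFC-4F3A74704073}" /v IsInstalled /t REG_DWORD /d 0 /f
reg add "HKLM\SOFTWARE\Microsoft\Active Setup\Installed Components\{A509B1A8-37EF-4b3f-8CFC-4F3A74704073}" /v IsInstalled /t REG_DWORD /d 0 /f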
I think the solution mentioned in the MSDN forum thread is correct. You should put them as part of your project files, so that the SDK will package and deploy them to the VM on the cloud.
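In project-file terms, that advice boils down to item entries along these lines (the folder and dll name are placeholders for whichever native libraries you need to ship):

<ItemGroup>
  <Content Include="NativeLibs\ThirdPartyCapture.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>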
But if they are COM and need to be registered, you'd better call the register command via the Startup feature. Please check http://msdn.microsoft.com/en-us/hh351539
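A sketch of that startup approach, assuming a small .cmd script added to the role project (names are placeholders; the MSDN link above has the authoritative steps). In ServiceDefinition.csdef:

<WebRole name="MyWebRole">
  <Startup>
    <Task commandLine="RegisterCom.cmd" executionContext="elevated" taskType="simple" />
  </Startup>
</WebRole>

and in RegisterCom.cmd:

regsvr32 /s ThirdPartyCapture.dll
exit /b 0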
HTH
