Trouble building VS2012 project with signed ClickOnce manifest on Jenkins

I'm having trouble building a VS2012 project with a signed ClickOnce deployment manifest. The .pfx certificate used to sign the project is a paid-for certificate and is password-protected. I'm using Jenkins with the MSBuild plugin.
The error that shows up right after BUILD FAILED is:
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Microsoft.Common.targets(2455,5): error MSB3325: Cannot import the following key file: xxxx.pfx. The key file may be password protected. To correct this, try to import the certificate again or manually install the certificate to the Strong Name CSP with the following key container name: VS_KEY_11234A3322194 [C:\Program Files (x86)\Jenkins\jobs\xxxx\workspace\xxxx\xxxx.csproj]
I've tried installing the certificate as a Trusted Root Certificate on my local machine (using certlm.msc) and also tried installing it to the specific container mentioned with sn.exe, but this makes no difference. I've made sure that the Jenkins service is running as a specific user rather than the Local System account.
If I open the project in VS2012, I can build it but only once I've selected the certificate under the Signing tab of the project properties and entered the password. This is even though the .csproj is set to use that specific file to sign the ClickOnce manifest.
I can successfully build the project using MSBuild from the command line as an admin user, but it fails when Jenkins runs the build. I've also tried passing the certificate to Jenkins' MSBuild as the /property:AssemblyOriginatorKeyFile parameter.
Any ideas on what could be causing this?

I have no idea what "Jenkins" is, but I know this about certificates and ClickOnce. The certificate must be installed under the user account that is doing the build. Try importing it with the defaults, which will put it in that user's Personal store, rather than importing it into Trusted Publishers or one of the other folders. You can see it after you import it by running certmgr.msc.
We hit this when doing automated builds on our TFS server. Every year, we have to log into the account that does the builds and import the new signing certificate.
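If you want to script the import rather than click through the wizard, a minimal sketch is below; it assumes you run it while logged in as the account the Jenkins service uses, and the path and password are placeholders:
using System.Security.Cryptography.X509Certificates;

// Import the signing .pfx into the build account's Personal (CurrentUser\My) store.
// Path and password are placeholders; run this as the Jenkins service account.
var cert = new X509Certificate2(@"C:\certs\signing.pfx", "<password>",
    X509KeyStorageFlags.UserKeySet | X509KeyStorageFlags.PersistKeySet);
using (var store = new X509Store(StoreName.My, StoreLocation.CurrentUser))
{
    store.Open(OpenFlags.ReadWrite);
    store.Add(cert);   // the certificate then shows up in certmgr.msc under Personal
}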

Related

How to deploy to Artifactory from Azure pipeline?

I'm trying to copy files to artifactory from my Azure pipeline but getting error:
"curl: (60) SSL certificate problem: unable to get issuer certificate
curl failed to verify the legitimacy of the server and therefore could
not establish a secure connection to it."
This used to work fine but I suspect someone has blocked me somehow.
If I try to deploy the file manually within Artifactory, the error shows:
"Error deploying these files: my-linux-shell-script.sh -
artifactory-build-info:path/to/repo/my-linux-shell-script.sh could not
be deployed. Build Info repositories only supports build-info.json
files."
Assuming I could get the Artifactory server's public key, I wouldn't know where to install it on Azure.
I can view all our files within Jfrog Artifactory but can no longer deploy to them. Suggestions?
The error "curl: (60) SSL certificate problem: unable to get issuer certificate" usually means that your server (client) does not have the target's server trusted certificates.
You can ignore it by adding the -k flag or -insecure to the cURL command.
With regards to the artifactory-build-info repository, it will only support to deploy json files that have valid build name, build number and timestamp. There is another possibility that the deploy repository might have not been selected correctly rather the repository "artifactory-build-info" is selected, if that is the case, "artifactory-build-info" will not allow artifacts to be deployed as it only allows a valid JSON files. Please validate.

Installed certificates on Batch account and Pool not available for task

I have an Azure Batch account set up with a system-assigned identity (the account was created through Terraform, and user-assigned identities are not yet supported).
A certificate is available to the batch account and on the pool as well.
When inspecting the node on the pool (scaled to one for now), it shows a certificate reference.
I've manually created a job and a simple task (/bin/bash -c 'ls -la $AZ_BATCH_CERTIFICATES_DIR/') to list contents and everything comes empty.
This seems to be the case for all self-signed certificates I've used to try this.
Can somebody please point out what I'm doing wrong?
(I've tried all combinations of Task-NonAdmin, Task-Admin, Pool-NonAdmin, Pool-Admin together with LocalMachine and CurrentUser.)
Thanks all!
Well, this thing happened:
Issue with Windows LocalMachine certificates:
If you are adding certificate references on your pool which install into the Windows LocalMachine certificate store, and are running tasks without admin access which need access to the certificate's private key, your tasks will work on the old agent but not work in the new agent.
Only pfx files where your non-admin task needs access to the private key should be moved to "My" in CurrentUser.
https://github.com/Azure/Batch/issues/1
If I upload the certs to CurrentUser\My, the tasks do get the certs.
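For reference, here is a minimal sketch of how that certificate reference can be set when creating the pool with the Batch .NET SDK; the account URL, key, pool id, VM image and thumbprint are all placeholders, and the CurrentUser\My store plus Task visibility mirror what worked above:
using System.Collections.Generic;
using Microsoft.Azure.Batch;
using Microsoft.Azure.Batch.Auth;
using Microsoft.Azure.Batch.Common;

// Sketch only: credentials, pool id and image are assumptions, not the asker's values.
var client = BatchClient.Open(new BatchSharedKeyCredentials(
    "https://<account>.<region>.batch.azure.com", "<account>", "<key>"));
var vmConfig = new VirtualMachineConfiguration(
    new ImageReference(publisher: "MicrosoftWindowsServer", offer: "WindowsServer",
                       sku: "2019-datacenter-core-smalldisk"),
    nodeAgentSkuId: "batch.node.windows amd64");
CloudPool pool = client.PoolOperations.CreatePool(
    poolId: "<pool-id>", virtualMachineSize: "standard_d2_v3",
    virtualMachineConfiguration: vmConfig, targetDedicatedComputeNodes: 1);
pool.CertificateReferences = new List<CertificateReference>
{
    new CertificateReference
    {
        Thumbprint = "<thumbprint>",
        ThumbprintAlgorithm = "sha1",
        StoreLocation = CertStoreLocation.CurrentUser,  // CurrentUser rather than LocalMachine
        StoreName = "My",
        Visibility = CertificateVisibility.Task         // make it visible to (non-admin) tasks
    }
};
pool.Commit();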

Can't create new schedules from Azure Websites

I followed the tutorial here for setting up the Azure Scheduler:
http://fabriccontroller.net/blog/posts/a-complete-overview-to-get-started-with-the-windows-azure-scheduler/
I want to run my application on an Azure Website but it is blocking me from creating my X509Certificate.
I found this article: http://blog.tylerdoerksen.com/2013/08/23/pfx-certificate-files-and-windows-azure-websites/
Which points out the issue:
"Well it turns out that when you load certificates the system will use a local directory to store the key (??)
The default location for the key is under the local user profile, and with Windows Azure Websites, there is no local user profile directory."
So, following his advice and adding the "X509KeyStorageFlags.MachineKeySet" flag, I get past:
CryptographicException: The system cannot find the file specified
but I now get:
CryptographicException: Access denied.
Is there really no way to use the SDK from an Azure Website?! It defeats a lot of the appeal of the Azure Scheduler if I am forced into using a WebRole instead of an Azure Website.
In this thread: http://social.msdn.microsoft.com/Forums/windowsazure/en-US/cfe06e73-53e1-4030-b82d-53200be37647/load-privately-created-p12-cert-from-azureblob-and-have-it-be-trusted
It appears as if they are successfully creating an X509Certificate on an Azure Website, so what is different that mine throws "Access Denied" when I try to?
The problem was with using the ManagementCertificate string in the PublishSettings file... I created a self-signed certificate on my local machine using the Visual Studio console and exported both a '.cer' and a '.pfx'.
Uploaded the self signed .cer into my Azure/Settings/Management Certificates
Bundled the .pfx with my solution and published to Azure Web Sites
Then used the following code to create the certificate:
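// MachineKeySet stores the key under the machine rather than a user profile, which does not exist on Azure Websites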
var certificate = new X509Certificate2(
    HttpContext.Current.Server.MapPath("~/<filename>.pfx"),
    "<password>",
    X509KeyStorageFlags.MachineKeySet);

Azure cloud deployment fails : Certificate with thumbprint was not found

I am developing a Web API based web service to be hosted on Azure. I am using Azure 1.8 SDK.
When I try to deploy my cloud service, it takes a very long time to upload after which I get an error message which says:
12:09:52 PM - Error: The certificate with thumbprint d22e9de125640c48a4f83de06ae6069f09cfb76c was not found. Http Status Code: BadRequest OperationId: 50daf49111c9487f82f3be09763e7924
12:09:53 PM - Deployment failed with a fatal error
Apparently, the certificate being referred to is related to enabling remote desktop to role instances on the cloud (I am not very sure about this; I saw it mentioned on the internet for a similar problem). However, I did not check the option to enable remote desktop on the instances while publishing.
What could be going wrong here?
What worked for me was:
Go to PowerShell and type mmc
Add the Certificates snap-in by going to File > Add/Remove Snap-in > choose Certificates from the list > choose My user account
Right-click on Certificates - Current User and select Find Certificates
In the dialog box, set Contains to 'azure' and Look in Field to 'Issued To'
Press Find Now. You should see a list of certificates.
Check the thumbprint by double-clicking the certificate > Details tab > scroll down to Thumbprint
Once you've found your certificate, close the dialog, right-click it and select Export
Choose to export the private key. Follow the steps until you have a .pfx file for upload to Azure
Go to your service and select the Certificates tab
Click Upload, select the exported .pfx file, and supply the password you set during export
Go to the Dashboard and update the cloud package
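If you prefer to script the find-and-export steps instead of clicking through mmc, a rough C# equivalent is sketched below; the output path and export password are placeholders, and it assumes the certificate's private key is exportable:
using System;
using System.IO;
using System.Security.Cryptography.X509Certificates;

// Find the certificate by the thumbprint from the deployment error and export it,
// private key included, to a password-protected .pfx ready for upload to Azure.
using (var store = new X509Store(StoreName.My, StoreLocation.CurrentUser))
{
    store.Open(OpenFlags.ReadOnly);
    var matches = store.Certificates.Find(
        X509FindType.FindByThumbprint,
        "d22e9de125640c48a4f83de06ae6069f09cfb76c", validOnly: false);
    if (matches.Count == 0)
        throw new InvalidOperationException("Certificate not found in CurrentUser\\My.");
    byte[] pfx = matches[0].Export(X509ContentType.Pfx, "<export-password>");
    File.WriteAllBytes(@"C:\temp\azure-deploy.pfx", pfx);
}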
The certificate used in your project doesn't exist in the cloud environment. Make sure the same certificate used by your project is uploaded to the cloud environment. If you are using Visual Studio, you can fix this error as follows:
Right-click your Web Role / Worker Role (under the Roles folder in the cloud project) → Properties → Certificates
Click on the ellipsis button under Thumbprint, which points to your certificate.
Upload the certificate shown here to the Windows Azure environment (Production or Staging)
Have you uploaded your publishing settings file in Visual Studio and/or a management certificate? This is crucial for being trusted by your Azure subscription, which is why you could be having this issue. BTW, try upgrading to SDK 2.1 too for better support and better features (if possible, of course).
Adding to Arbie's answer: you can avoid the first few steps. Just type "Manage user certificates" in the Windows search bar and go to Personal > Certificates.
Your certificates will show Issued To "Windows Azure Tools".
You can check for the thumbprint by opening the certificate and checking the Details.

Keyset does not exist exception with deployment on windows server 2012 on Azure

I have a problem upgrading my deployment to Windows Server 2012. My deployment works fine with osFamily=2 compiled against .NET 4, but fails with .NET 4.5 and osFamily=3.
The exception I see when I remote into the VM is "Keyset does not exist", which seems to be related to certificates. My program uses certificates to encrypt a stream and should be able to use the same certificates to decode the stream after I deploy it.
I checked the certificates on the VM; they are installed fine, in the right place.
So I suspect this is an issue with a different security policy in Server 2012 that prevents my role from getting to the key in the certificates.
This has blocked me for a while, so thanks a lot for any clue!
"Keyset does not exist" typically means your program is trying to access a certificate's private key and is unable to do so, either because the private key does not exist or because the process has no permission to access it.
You will need to find the certificate in question in your certificate store and verify that it contains a private key (this shows up in the properties of the certificate).
Then verify that your process/application pool has permission to access the private key by right-clicking the certificate in the certificate store and choosing All Tasks → Manage Private Keys. From there, make sure the relevant accounts are added to the allowed list.
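If you want to confirm which of the two it is, a small diagnostic you can run under the same identity as the failing role (thumbprint is a placeholder) might look like this:
using System;
using System.Security.Cryptography.X509Certificates;

// Run as the same account as the role: reports whether the private key exists
// and whether this process is allowed to open it.
using (var store = new X509Store(StoreName.My, StoreLocation.LocalMachine))
{
    store.Open(OpenFlags.ReadOnly);
    foreach (var cert in store.Certificates.Find(
        X509FindType.FindByThumbprint, "<thumbprint>", validOnly: false))
    {
        Console.WriteLine("HasPrivateKey: " + cert.HasPrivateKey);
        try
        {
            // Accessing PrivateKey throws "Keyset does not exist" when the key is
            // missing or the account has no permission on it.
            var key = cert.PrivateKey;
            Console.WriteLine("Private key accessible, size: " + key.KeySize + " bits");
        }
        catch (Exception ex)
        {
            Console.WriteLine("Private key not accessible: " + ex.Message);
        }
    }
}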
Hope this helps
