Google Calendar API and shared hosting issue - c#-4.0

I'm trying to use a public Google calendar in a webpage that will need editing functionality.
To that end, I created the calendar and made it public. I then created a Google service account and the related client ID.
I also enabled the Calendar API and added the v3 DLLs to the project.
I downloaded the .p12 certificate, and that's where the problems start.
The call to Google uses an X509 certificate, but the way the .NET Framework is built, loading it relies on a per-user temp folder.
Since it's a shared host for the web server (GoDaddy), I cannot have the app pool identity modified.
As a result, I'm getting this error:
System.Security.Cryptography.CryptographicException: The system cannot
find the file specified.
when calling:
X509Certificate2 certificate = new X509Certificate2(GoogleOAuth2CertificatePath,
"notasecret", X509KeyStorageFlags.Exportable);
That certificate variable is then used in the Google call:
ServiceAccountCredential credential = new ServiceAccountCredential(
    new ServiceAccountCredential.Initializer(GoogleOAuth2EmailAddress)
    {
        User = GoogleAccount,
        Scopes = new[] { CalendarService.Scope.Calendar }
    }.FromCertificate(certificate));
... but I never get that far.
Question: is there a way to make the call differently, i.e. to use JSON instead of an X509 certificate?
Or can I get the X509 constructor to use a general temp location rather than a user location I have no access to, since I can't change the identity in the app pool?
Since I'm completely stuck, any help would be appreciated.

One simple option which avoids needing to worry about file locations is to embed the certificate within your assembly. In Visual Studio, right-click on the file and show its properties. Under Build Action, pick "Embedded resource".
You should then be able to load the data with something like this:
// In a helper class somewhere...
private static byte[] LoadResourceContent(Type type, string resourceName)
{
    // Embedded resources are named "<default namespace>.<file name>".
    string fullName = type.Namespace + "." + resourceName;
    using (var stream = type.Assembly.GetManifestResourceStream(fullName))
    {
        var output = new MemoryStream();
        stream.CopyTo(output);
        return output.ToArray();
    }
}
Then:
byte[] data = ResourceHelper.LoadResourceContent(typeof(MyType), "Certificate.p12");
var certificate = new X509Certificate2(data, "notasecret", X509KeyStorageFlags.Exportable);
Here MyType is some type which is in the same folder as your resource.
Note that there are lots of different "web" project types in .NET... depending on the exact project type you're using, you may need to tweak this.
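Putting the pieces together, a minimal sketch of building the CalendarService from the embedded certificate might look like this. It reuses GoogleOAuth2EmailAddress, GoogleAccount, ResourceHelper and MyType from above; the ApplicationName value is only a placeholder.

// Sketch only: wire the embedded .p12 into the Calendar service.
byte[] certData = ResourceHelper.LoadResourceContent(typeof(MyType), "Certificate.p12");
var certificate = new X509Certificate2(certData, "notasecret", X509KeyStorageFlags.Exportable);

var credential = new ServiceAccountCredential(
    new ServiceAccountCredential.Initializer(GoogleOAuth2EmailAddress)
    {
        User = GoogleAccount,
        Scopes = new[] { CalendarService.Scope.Calendar }
    }.FromCertificate(certificate));

var service = new CalendarService(new BaseClientService.Initializer
{
    HttpClientInitializer = credential,
    ApplicationName = "Calendar sample" // placeholder
});

The rest of the Calendar API calls can then go through the service as usual, and there is no separate .p12 file to deploy or locate on the shared host.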

Related

How to get a TokenCredential from a ServiceClientCredential object?

In my application, we are presently using ServiceClientCredentials from Microsoft.Rest. We are migrating parts of our application over to start using Azure.ResourceManager's ArmClient.
Basically, all of our previous application integrations with Azure used Microsoft.Azure.ResourceManager, which exposed clients like BlobClient or SecretClient, and these all accepted ServiceClientCredentials as a valid credential type.
Now, with ArmClient I need to authenticate using DefaultAzureCredential which derives from Azure.Core's TokenCredential.
Surprisingly I haven't been able to find any examples yet of how to create this TokenCredential.
DefaultAzureCredential just works on my local PC since I'm signed into Visual Studio, but not on my build pipeline where I use Certificate based auth exposed as a ServiceClientCredential.
This was easier than I thought. The fix ended up being adding a new ServiceCollection extension method and passing in IWebHostEnvironment.
I use that to determine whether running in local debug, in which case we can use DefaultAzureCredential, or whether running in prod mode, in which case we should use Certificate Based auth.
It looks somewhat like this and works like a charm.
public static IServiceCollection AddDefaultAzureToken(this IServiceCollection services, IWebHostEnvironment environment)
{
    if (environment.IsDevelopment())
    {
        // Local debug: use the developer's existing Visual Studio / Azure CLI sign-in.
        services.AddSingleton<TokenCredential>(new DefaultAzureCredential());
    }
    else
    {
        // Prod: certificate-based auth. Tenant id, client id and certificate path are placeholders.
        var certCredential = new ClientCertificateCredential("tenant-id", "client-id", "path-to-certificate.pfx");
        services.AddSingleton<TokenCredential>(certCredential);
    }
    return services;
}
This works since DefaultAzureCredential and ClientCertificateCredential both derive from TokenCredential, and the L in SOLID, the Liskov Substitution Principle, tells us that an instance of a derived type can be substituted wherever the base type is expected without breaking the application.
Note: the above sample is pseudocode and may need slight changes to work in your environment; it should also be cleaned up to match your team's coding standards.
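Once the credential is registered, consumers can simply take TokenCredential from DI. As a minimal sketch (the registration style is illustrative, not part of the original answer), the ArmClient mentioned in the question could be built from it, for example just before the return services; line in the extension method above:

// Sketch only: build the ArmClient from whichever TokenCredential was registered above.
services.AddSingleton(sp => new ArmClient(sp.GetRequiredService<TokenCredential>()));

Because ArmClient only asks for a TokenCredential, it does not care whether it receives the DefaultAzureCredential or the ClientCertificateCredential.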

X509Certificate2 different on Windows and Linux in .NET Core application

I have a console application written in .NET Core 2.1, and it has two functionalities.
One: load a certificate chain (certificate, intermediate certificate and root certificate), as well as a private key, which are located in a single .p12 file created from a .p7b file and a .key file. This is loaded into one X509Certificate2 instance, using the ctor(string filename, string password).
The setup I have works fine on Windows, but when I deploy the app to a Linux environment (Debian 9) and try to sign the message, I get an error: A certificate chain could not be built to a trusted root authority.
Here is the code for loading the certificate:
public CertificateManager(IOptions<CertificateSettings> options)
{
    var settings = options.Value;
    _filePath = settings.Path;
    _password = settings.Password;
    Certificate = new X509Certificate2(_filePath, _password);
}
The code for the XML signing is as follows:
public string SignMessage(XmlNode message)
{
    var signed = new SignedXml((XmlElement)message)
    {
        SigningKey = _certificateManager.Certificate.PrivateKey,
    };
    var referenceID = message.Attributes[PAResDescr.PAResID].Value;
    signed.SignedInfo.SignatureMethod = SignedXml.XmlDsigRSASHA1Url;
    var reference = new Reference { Uri = $"#{referenceID}" };
    reference.DigestMethod = SignedXml.XmlDsigSHA1Url;
    // Add the reference to the SignedXml object.
    signed.AddReference(reference);
    var keyInfo = new KeyInfo();
    var keyInfoData = new KeyInfoX509Data(_certificateManager.Certificate, X509IncludeOption.WholeChain);
    keyInfo.AddClause(keyInfoData);
    signed.KeyInfo = keyInfo;
    signed.ComputeSignature();
    return signed.GetXml().OuterXml;
}
As I mentioned, this code works fine when I run it on a Windows machine, but when I run it on Linux (Debian 9) I get the mentioned error. Here is the whole stack trace:
at System.Security.Cryptography.Xml.KeyInfoX509Data..ctor(X509Certificate cert, X509IncludeOption includeOption)
at Payment.Service.Cryptography.LocalCertificateSigner.SignMessage(XmlNode message) in /home/juls/Projects/ACS/Payment.Service/Cryptography/LocalCertificateSigner.cs:line 44
The interesting part is that when I run this without the X509IncludeOption.WholeChain constructor parameter, the error is gone, but only the signing certificate is included in the signature, not the whole chain. My thought is that on Windows loading the certificate pulls in the whole chain, while on Linux only the signing certificate is loaded. I couldn't find anything related here or elsewhere on the internet, so I'm asking if anyone has a clue what my problem is.
Regards,
Julian
The problem was that the intermediate and root certificates were not trusted by the Linux machine. As @Crypt32 suggested in the first comment below the question, adding them to the trusted store solved the problem. Thanks!
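For completeness, an alternative sketch (not part of the accepted fix): if you cannot touch the machine trust store, you can skip X509IncludeOption.WholeChain and instead load every certificate contained in the .p12 yourself, adding each one to the KeyInfo, so the signature carries the chain without the library having to build it against the OS store. Here _filePath and _password are the fields from CertificateManager and keyInfo is the local variable from SignMessage above.

// Sketch only: include every certificate from the .p12 in the signature explicitly.
var allCerts = new X509Certificate2Collection();
allCerts.Import(_filePath, _password, X509KeyStorageFlags.DefaultKeySet);

var keyInfoData = new KeyInfoX509Data();
foreach (X509Certificate2 cert in allCerts)
{
    keyInfoData.AddCertificate(cert);   // leaf, intermediate and root
}
keyInfo.AddClause(keyInfoData);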

How can I sign a JWT token on an Azure WebJob without getting a CryptographicException?

I have a WebJob that needs to create a JWT token to talk with an external service. The following code works when I run the WebJob on my local machine:
public static string SignES256(byte[] p8Certificate, object header, object payload)
{
    var headerString = JsonConvert.SerializeObject(header);
    var payloadString = JsonConvert.SerializeObject(payload);
    CngKey key = CngKey.Import(p8Certificate, CngKeyBlobFormat.Pkcs8PrivateBlob);
    using (ECDsaCng dsa = new ECDsaCng(key))
    {
        dsa.HashAlgorithm = CngAlgorithm.Sha256;
        var unsignedJwtData = Base64UrlEncoder.Encode(Encoding.UTF8.GetBytes(headerString)) + "." + Base64UrlEncoder.Encode(Encoding.UTF8.GetBytes(payloadString));
        var signature = dsa.SignData(Encoding.UTF8.GetBytes(unsignedJwtData));
        return unsignedJwtData + "." + Base64UrlEncoder.Encode(signature);
    }
}
However, when I deploy my WebJob to Azure, I get the following exception:
Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Exception while executing function: NotificationFunctions.QueueOperation ---> System.Security.Cryptography.CryptographicException: The system cannot find the file specified. at System.Security.Cryptography.NCryptNative.ImportKey(SafeNCryptProviderHandle provider, Byte[] keyBlob, String format) at System.Security.Cryptography.CngKey.Import(Byte[] keyBlob, CngKeyBlobFormat format, CngProvider provider)
It says it can't find a specified file, but the parameters I am passing in are not looking at a file location, they are in memory. From what I have gathered, there may be some kind of cryptography setting I need to enable to be able to use the CngKey.Import method, but I can't find any settings in the Azure portal to configure related to this.
I have also tried using JwtSecurityTokenHandler, but it doesn't seem to handle the ES256 hashing algorithm I need to use (even though it is referenced in the JwtAlgorithms class as ECDSA_SHA256).
Any suggestions would be appreciated!
UPDATE
It appears that CngKey.Import may actually be trying to store the certificate somewhere that is not accessible on Azure. I don't need it stored, so if there is a better way to access the certificate in memory or convert it to a different kind of certificate that would be easier to use that would work.
UPDATE 2
This issue might be related to Azure Web Apps IIS setting not loading the user profile as mentioned here. I have enabled this by setting WEBSITE_LOAD_USER_PROFILE = 1 in the Azure portal app settings. I have tried with this update when running the code both via the WebJob and the Web App in Azure but I still receive the same error.
I used a decompiler to take a look under the hood at what the CngKey.Import method was actually doing. It looks like it tries to insert the certificate I am using into the "Microsoft Software Key Storage Provider". I don't actually need this; I just need to read the value of the certificate, but it doesn't look like that is possible.
Once I realized a certificate was getting inserted into a store somewhere on the machine, I started thinking about how bad of a thing that would be from a security standpoint if your Azure Web App was running in a shared environment, like it does for the Free and Shared tiers. Sure enough, my VM was on the Shared tier. Scaling it up to the Basic tier resolved this issue.

NodeJS and storing OAuth credentials, outside of the code base?

I am creating a NodeJS API server that will be delegating authentication to an OAuth2 server. While I could store the key and secret along with the source code, I want to avoid doing that since it feels like a security risk, and it is something that doesn't match the lifespan of a server implementation update (key/secret refreshes will likely happen more often).
I could store it in a database or maybe a transient JSON file, but I would like to know what are considered the best practices in the NodeJS world, or what is considered acceptable. Any suggestions are appreciated.
One option would be to set environment variables as part of your deployment and then access them in the code from the global process object:
var clientId = process.env.CLIENT_ID
var clientSecret = process.env.CLIENT_SECRET
Since I wanted to provide something that can store multiple values, I just created a JSON file and then read that into a module I called keystore (using ES6 class):
const fs = require('fs');

class KeyStore {
  load() {
    // Load the JSON file from a location specified in the config,
    // falling back here to process.env.MYSERVER_KEYSTORE.
    const path = process.env.MYSERVER_KEYSTORE;
    this._keys = JSON.parse(fs.readFileSync(path, 'utf8'));
  }

  get(keyname) {
    // Return the key I am looking for.
    return this._keys[keyname];
  }
}

module.exports = new KeyStore();
I would ideally want to store the file encrypted, but for now I am just storing it read only to the current user in the home directory.
If there is another way that is considered 'better', then I am open to that.

Upload a file to SharePoint through the built-in web services

What is the best way to upload a file to a Document Library on a SharePoint server through the built-in web services that version WSS 3.0 exposes?
Following the two initial answers...
We definitely need to use the Web Service layer as we will be making these calls from remote client applications.
The WebDAV method would work for us, but we would prefer to be consistent with the web service integration method.
There is additionally a web service to upload files, painful but works all the time.
Are you referring to the “Copy” service?
We have been successful with this service’s CopyIntoItems method. Would this be the recommended way to upload a file to Document Libraries using only the WSS web service API?
I have posted our code as a suggested answer.
Example of using the WSS "Copy" Web service to upload a document to a library...
public static void UploadFile2007(string destinationUrl, byte[] fileData)
{
    // List of destination Urls; just one in this example.
    string[] destinationUrls = { Uri.EscapeUriString(destinationUrl) };

    // Empty field information. This can be populated, but not for this example.
    SharePoint2007CopyService.FieldInformation information = new SharePoint2007CopyService.FieldInformation();
    SharePoint2007CopyService.FieldInformation[] info = { information };

    // To receive the result Xml.
    SharePoint2007CopyService.CopyResult[] result;

    // Create the Copy web service instance configured from the web.config file.
    SharePoint2007CopyService.CopySoapClient CopyService2007 = new SharePoint2007CopyService.CopySoapClient("CopySoap");
    CopyService2007.ClientCredentials.Windows.ClientCredential = CredentialCache.DefaultNetworkCredentials;
    CopyService2007.ClientCredentials.Windows.AllowedImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Delegation;

    CopyService2007.CopyIntoItems(destinationUrl, destinationUrls, info, fileData, out result);

    if (result[0].ErrorCode != SharePoint2007CopyService.CopyErrorCode.Success)
    {
        // ...
    }
}
Another option is to use plain ol' HTTP PUT:
WebClient webclient = new WebClient();
webclient.Credentials = new NetworkCredential(_userName, _password, _domain);
webclient.UploadFile(remoteFileURL, "PUT", FilePath);
webclient.Dispose();
Where remoteFileURL points to your SharePoint document library...
There are a couple of things to consider:
Copy.CopyIntoItems needs the document to be already present on some server. The document is passed as a parameter of the web service call, which will limit how large the document can be. (See http://social.msdn.microsoft.com/Forums/en-AU/sharepointdevelopment/thread/e4e00092-b312-4d4c-a0d2-1cfc2beb9a6c)
The 'HTTP PUT' method (i.e. WebDAV) will only put the document in the library, but not set field values.
To update field values you can call Lists.UpdateListItem after the 'HTTP PUT'.
Document libraries can have directories; you can make them with 'HTTP MKCOL' (see the sketch after this list).
You may want to check in files with Lists.CheckInFile.
You can also create a custom web service that uses the SPxxx .NET API, but that new web service will have to be installed on the server. It could save trips to the server.
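A minimal sketch of the WebDAV approach from the list above, creating a folder with MKCOL and then uploading the file with PUT. It reuses the _userName/_password/_domain credentials from the earlier HTTP PUT snippet; the server name, folder, file name and local path are placeholders.

// Sketch only: create a folder in the document library via WebDAV MKCOL, then upload with HTTP PUT.
var credentials = new NetworkCredential(_userName, _password, _domain);

var mkcol = (HttpWebRequest)WebRequest.Create("http://servername/sitename/doclibrary/newfolder");
mkcol.Method = "MKCOL";
mkcol.Credentials = credentials;
using (mkcol.GetResponse()) { }   // throws if the folder could not be created

using (var webclient = new WebClient())
{
    webclient.Credentials = credentials;
    webclient.UploadFile("http://servername/sitename/doclibrary/newfolder/filename.doc", "PUT", localFilePath);
}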
public static void UploadFile(byte[] fileData)
{
    var copy = new Copy
    {
        Url = "http://servername/sitename/_vti_bin/copy.asmx",
        UseDefaultCredentials = true
    };

    string destinationUrl = "http://servername/sitename/doclibrary/filename";
    string[] destinationUrls = { destinationUrl };

    var info1 = new FieldInformation
    {
        DisplayName = "Title",
        InternalName = "Title",
        Type = FieldType.Text,
        Value = "New Title"
    };
    FieldInformation[] info = { info1 };

    var copyResult = new CopyResult();
    CopyResult[] copyResults = { copyResult };

    copy.CopyIntoItems(destinationUrl, destinationUrls, info, fileData, out copyResults);
}
NOTE: Changing the 1st parameter of CopyIntoItems to the file name, Path.GetFileName(destinationUrl), makes the unlink message disappear.
I've had good luck using the DocLibHelper wrapper class described here: http://geek.hubkey.com/2007/10/upload-file-to-sharepoint-document.html
From a colleague at work:
Lazy way: use the Windows WebDAV filesystem interface. It is bad as a programmatic solution because it relies on the WebClient service running on your OS, and it also only works on websites running on port 80. Map a drive to the document library and get on with the file copying.
There is additionally a web service to upload files, painful but works all the time.
I believe you are able to upload files via the FrontPage API but I don’t know of anyone who actually uses it.
Not sure exactly which web service to use, but if you are in a position where you can use the SharePoint .NET API DLLs, then using SPList and SPLibrary.Items.Add is really easy.
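To illustrate that last point, a minimal sketch using the server-side object model (it must run on the SharePoint server itself; the site URL, library name and file name are placeholders, and fileData is the document content as a byte array):

// Sketch only: upload a document with the server-side API instead of the web services.
using (SPSite site = new SPSite("http://servername/sitename"))
using (SPWeb web = site.OpenWeb())
{
    SPList library = web.Lists["doclibrary"];
    library.RootFolder.Files.Add("filename.doc", fileData);
}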
