Migrating Thales payShield 9000 to Azure Key Vault

We want to migrate HSM keys from a Thales payShield 9000 to Azure Key Vault. We would like to know if this migration is supported and, if so, what the migration approach is and which use cases exist where customers have already migrated to Azure. We have gone through the article https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/key-vault/key-vault-hsm-protected-keys.md, but it talks about the Thales nShield family, while we are using https://www.thalesesecurity.com/products/payment-hsms/payshield-9000
Thanks in advance.

Excellent question. As Dan suggests, you should contact Microsoft for clarification, but unfortunately I don't think it's possible.
Recapping: as I'm sure you are aware, the purpose of HSMs is that the keys are not exportable.
Microsoft (and I assume Thales) supports key backup: https://learn.microsoft.com/en-us/rest/api/keyvault/backupkey but a backup can only be restored to the same geographical area.
The article you supplied mentions a "Key Exchange Key" in each geographical area, which I assume means Microsoft will be using a different key from that of another HSM installation.
Having said this, I'm not an HSM expert; these are just links I have come across over time using Key Vault.
Please do contact Microsoft, as I would be interested to know if this is possible. Please post an answer once you have heard back, or perhaps a Microsoft employee can answer directly.
On the Thales literature it states:
"With nShield BYOK for Microsoft Azure, your on-premises
nShield HSM generates, stores, wraps, and exports keys to the
Microsoft Azure Key Vault on your behalf"
http://go.thalesesecurity.com/rs/480-LWA-970/images/Thales-e-Security-Microsoft-Azure-UK-sb.pdf
Interestingly it says generates/stores, which suggests a pre-created key could be migrated. On the contrary, however, I'm guessing that in the BYOK process the export must happen using the "Key Exchange Key", with the key stored on-prem and exported for Azure at the same time, not created on-prem first.
This blog post has keyvault team's contact details if it helps: https://blog.romyn.ca/key-management-in-azure/

Migrating important keys that are encrypted under the current LMK on your on-premises Thales payShield is a straightforward process:
1. Use the console command GC to generate a new ZMK in clear component format. Use key type 000, which is the ZMK key type, and choose the clear components option with the letter 'x' in the GC command steps.
2. Repeat the GC command three times to generate three different plaintext components of the new ZMK.
3. On your payShield 9000 HSM, use the console command FK (Form Key from components); the result is the new ZMK encrypted under the old LMK.
4. Use the KE command (export key) to export the important data encryption keys (DEKs), such as a ZPK, from encryption under the old LMK to encryption under the new ZMK. Note: in the KE command use key type 001, which is the ZPK key type.
5. Now you need to manually distribute the new ZMK to the other party you are migrating to.
6. Do this manual distribution of such an important key (the new ZMK) by sending the three plaintext components, which you generated earlier in step 2, to three different security officers at your corporation; for security reasons, no one person may hold all three components.
7. On the receiving side, the Microsoft Azure Key Vault cloud service, Azure secures your keys in an nShield hardware HSM environment, which is a general-purpose HSM rather than one specific to payment transactions like the Thales payShield.
8. Refer to the Azure Key Vault documentation for how to form the new ZMK from the three plaintext components you generated, and refer to the nShield manuals for the command that imports keys.
9. Your important keys, such as the ZPK that was exported under the new ZMK, are now imported under the same ZMK and finally stored encrypted under the new LMK of your cloud-hosted nShield.
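As an illustration of the component scheme in steps 1-6: clear components are combined by XOR, so no single officer's component reveals anything about the ZMK. All of this really happens inside the HSM's secure boundary; this Python sketch (with hypothetical component values) only shows the arithmetic:

```python
import secrets

def form_key(components):
    """Combine clear key components by XOR, as the FK console command does."""
    key = bytes(len(components[0]))
    for comp in components:
        key = bytes(a ^ b for a, b in zip(key, comp))
    return key

# Three hypothetical 16-byte (double-length DES) clear components,
# one per security officer:
comps = [secrets.token_bytes(16) for _ in range(3)]
zmk = form_key(comps)
# Any two components alone are statistically independent of the ZMK;
# all three are needed to reconstruct it.
```

The same XOR combination is what step 8 asks the receiving side to perform when re-forming the ZMK from the three components.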

Related

When using Azure Key Vault or JWT, what is the proper design for setting and retrieving/decrypting metadata: 1-to-many or 1-to-1 keys?

The use case: a user has metadata that needs to be encrypted, so that when they sign in, a protected, stored, encrypted object is checked to verify that the object information coming in as plaintext matches what is in the encrypted object.
The question is: is it more appropriate in Azure Key Vault to give each and every user a key with public- and private-key ability, or to just use a single key that encrypts the stored object and is used to un-sign/decrypt the object when it is accessed?
To me, the object is what needs to be encrypted, and that doesn't really relate to how the key is encrypted, hence a universal 1-key-to-many approach.
The other approach makes sense too, but I would have to create a hell of a lot of keys to facilitate it. Are thousands or millions of keys, one per user, appropriate?
What are the advantages and disadvantages of each approach?
I think the same practice would apply to JWT token signing.
I think it's better to have one key and rotate it on a regular basis.
For example, like the ASP.NET Core Data Protection API (I know you are using Node), where every 90 days (by default) the current key is replaced with a new one, and the old one is kept to allow decryption of old data. In .NET they call this the key ring, which holds many keys.
I did blog about this here.
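The key-ring idea can be sketched outside of .NET too. Here is a minimal, hypothetical Python version that signs with the current key but verifies against any key still in the ring, so tokens issued before a rotation remain valid (HMAC stands in for whatever signing scheme you actually use):

```python
import hmac
import hashlib
import secrets

class KeyRing:
    """One active signing key plus retired keys kept for verification."""

    def __init__(self):
        self._keys = {}          # key id -> secret
        self.current_kid = None

    def rotate(self):
        """Make a fresh key the active one; old keys stay verifiable."""
        kid = secrets.token_hex(4)
        self._keys[kid] = secrets.token_bytes(32)
        self.current_kid = kid
        return kid

    def sign(self, payload: bytes):
        """Sign with the current key; return (key id, tag)."""
        tag = hmac.new(self._keys[self.current_kid], payload,
                       hashlib.sha256).hexdigest()
        return self.current_kid, tag

    def verify(self, kid, payload: bytes, tag) -> bool:
        """Verify against whichever key issued the tag, if still in the ring."""
        key = self._keys.get(kid)
        if key is None:
            return False
        expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag)
```

Usage: call `rotate()` once at start-up and then on a schedule (e.g. every 90 days); anything signed before a rotation still verifies because the old key is kept in the ring.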
Also, be aware that some SDKs for Azure Key Vault try to download all secrets at start-up, one by one. That can be quite time-consuming if you have many secrets.

Two PIV certificates - one YubiKey 5

I'm trying to import two PIV certificates to be used on one YubiKey 5 (slot 9a).
One certificate for regular use and another for elevated privileges. For the life of me, I can't figure it out!
I've tried using the GUI (YubiKey Manager > PIV > Configure Certificates > Import), but all this does is overwrite the existing certificate on the key with the one being imported.
I've tried figuring out what command line to use with the following pdf: https://www.yubico.com/wp-content/uploads/2016/05/Yubico_PIV_Tool_Command_Line_Guide_en.pdf
At this point, I'm just banging my skull against the wall and not seeing how to solve this. Does anyone have any ideas or insights on this?
It's not possible to store more than one certificate in one slot. There are different slots for different purposes. See this page for details: https://developers.yubico.com/PIV/Introduction/Certificate_slots.html
So I think what you planned to do is not possible.
This is unfortunate, because Gemalto smartcards allow multiple certificates to be loaded on a single card. The inability to load multiple certs into slot 9a will require using two different YubiKeys for two different certs.
Technically the slot numbers refer to key slots, where the private key is stored. Tools typically allow you to import certificates 'to' slot 9a when they really mean 'for' slot 9a. Certificates are stored separately from keys in PIV, and there is a well-known mapping from key slot to certificate slot. For example, the certificate for slot 9a is stored in a certificate slot named 0x5fc105. Each such certificate slot can only store one certificate according to the PIV standard (which specifies the format of the data in the certificate slot), but there are 24 key slots (with corresponding certificate slots) on a YubiKey.

Depending on tooling, you may be able to use other slots for your alternate key and certificate. Since certificates are public information, they could also be stored somewhere other than on the YubiKey entirely, as long as you can convince your tools to use them that way.

If your goal is to store two separate certificates for one key, your best bet would be to import the same private key into two separate key slots and store your two different certificates in their respective certificate slots. That won't work for onboard-generated keys, since you can't copy or extract those.
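For reference, the key-slot-to-certificate-object mapping described above can be written out for the four standard slots. The object IDs below follow my reading of NIST SP 800-73; treat this table as illustrative and check your tooling's documentation:

```python
# PIV key slot -> certificate data object ID (per NIST SP 800-73).
# Illustrative lookup table; retired key slots (82-95) have their own
# certificate objects and are omitted here.
PIV_CERT_OBJECTS = {
    "9a": 0x5FC105,  # PIV Authentication
    "9c": 0x5FC10A,  # Digital Signature
    "9d": 0x5FC10B,  # Key Management
    "9e": 0x5FC101,  # Card Authentication
}
```

So "importing a certificate to slot 9a" really means writing it into data object 0x5FC105, which can hold exactly one certificate.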

Security on azure Cosmos db

I want to use Cosmos DB from C# code. A really important point is that the data should stay encrypted at every point. As I understand it, once the data is on the server it is automatically encrypted by Azure's encryption at rest. But during transport, do I have to use a certificate, or is it encrypted automatically? I used this link to manage the database: https://learn.microsoft.com/fr-fr/azure/cosmos-db/create-sql-api-dotnet. My question, finally: is there any safety risk if I just follow this tutorial?
Thanks.
I think that's a great starting point.
Just one note: your data is only as secure as the access keys to the account, so on top of encryption at rest and in transit, the access key is probably the most sensitive piece of information you need to protect.
My advice is to use a Key Vault to store the database access key rather than defining it as an environment variable. Combined with Managed Identity, your key never leaves the confines of Azure, which makes this the most secure option. I'm not sure how you plan on deploying your code, but more often than not I've seen those keys encoded in source code or in some configuration file that ends up exposed.
A while ago I wrote a step-by-step tutorial describing how to implement this. You can find my article here
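As a rough sketch of the Key Vault + Managed Identity flow (shown in Python rather than C# for brevity; the vault and secret names are made up, and error handling is omitted): the app asks the VM's instance-metadata endpoint for a token, then reads the secret over the Key Vault REST API, so the access key never appears in config or source:

```python
import json
import urllib.request

# Azure instance-metadata (IMDS) token endpoint, available only from
# inside an Azure VM/App Service with a managed identity assigned.
IMDS = ("http://169.254.169.254/metadata/identity/oauth2/token"
        "?api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net")

def secret_url(vault: str, name: str) -> str:
    """Build the Key Vault REST URL for reading one secret."""
    return f"https://{vault}.vault.azure.net/secrets/{name}?api-version=7.4"

def get_cosmos_key(vault: str, secret_name: str) -> str:
    """Fetch a secret (e.g. the Cosmos DB access key) via managed identity."""
    req = urllib.request.Request(IMDS, headers={"Metadata": "true"})
    token = json.load(urllib.request.urlopen(req))["access_token"]
    req = urllib.request.Request(
        secret_url(vault, secret_name),
        headers={"Authorization": f"Bearer {token}"})
    return json.load(urllib.request.urlopen(req))["value"]

# Hypothetical usage, assuming a vault "myvault" holding "cosmos-key":
# key = get_cosmos_key("myvault", "cosmos-key")
```

In practice you would use the Azure SDK (`DefaultAzureCredential` + a secret client) rather than raw REST, but the moving parts are the same: identity token first, then the secret.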
I would suggest you follow the instructions mentioned here and avoid using access keys at all, because if they are accidentally exposed, your database is out there, whether you stored them in a Key Vault or not. Besides, if you do use access keys, it is recommended to rotate them periodically, which you then need to automate and make known to your Key Vault; here is a description of how you could automate that.

Encryption settings for Widevine CENC on Azure Media Services

I want to use Azure Media Services to provide the licenses for content protection. I created the ContentKey using the PHP SDK, and got the license URL for Widevine. I'm using packager in my side to encrypt the video.
My problem is (mostly out of ignorance) that I don't know where to find the values for the parameters "--signer", "--aes_signing_key" and "--aes_signing_iv". I read in some tutorials that these values are provided by Widevine, but in my case I assumed they were provided by Azure.
It's not an issue with PHP or the packager. Even using the REST API, I don't know which information corresponds to "signer", "signing key" and "signing iv". This information is required even by other platforms like bitcodin and other packagers.
If you are using your own packager, you only need to configure the common encryption key and the Widevine license template with us; you will obtain a license URL and key ID in return, and you can put those values into your packager.
You don't need to configure an AES envelope key, which is for AES-128 clear key encryption; that's another service we offer.
Feel free to reach out to me at yanmf#microsoft.com if you have more questions. I am the PM for these Widevine services on Azure. We will help you.

Windows 8 Apps - Local Storage Security

How secure is the local data
ApplicationData.Current.LocalSettings
storage used in Windows 8 Store Apps?
This application data storage article says nothing about security, neither does this one.
Can this data be manipulated from outside of the app?
I looked at the location of the data
C:\Users\[username]\AppData\Local\Packages\[package_namespace]\LocalState
but did not find it. Where is it saved exactly?
I'm trying to assess the security of this storage mechanism to decide whether I can store security-critical information there.
After some more investigation I found:
http://lunarfrog.com/blog/2012/09/13/inspect-app-settings/
The data is stored in
C:\Users\[username]\AppData\Local\Packages\[package_namespace]\LocalState\Settings\settings.dat
which is a Windows NT registry file (REGF) that can be opened with the Registry Editor and can also be manipulated.
Meaning, local storage is NOT safe.
If there is no other way, encrypting the data and obfuscating the keys is a possibility.
If it's user credentials that you want to store, take a look at PasswordVault class. Otherwise use DPAPI as you already suggested yourself.
This application data storage article says nothing about security, neither does this one.
Can this data be manipulated from outside of the app?
That storage is similar to iOS's Core Data. It's essentially untrusted input unless the storage is protected (below the application level). Even if the storage is protected with encryption, it's likely not authenticated, so it's subject to tampering.
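One way to address the tampering point is to authenticate the stored data yourself. A minimal sketch, assuming you can keep the MAC key out of the attacker's reach (for example by protecting it with DPAPI): append an HMAC tag when saving, and refuse to load data whose tag no longer matches.

```python
import hmac
import hashlib

TAG_LEN = hashlib.sha256().digest_size  # 32 bytes

def seal(settings: bytes, key: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so out-of-band edits are detectable."""
    return settings + hmac.new(key, settings, hashlib.sha256).digest()

def unseal(blob: bytes, key: bytes) -> bytes:
    """Verify the tag before trusting the stored settings."""
    settings, tag = blob[:-TAG_LEN], blob[-TAG_LEN:]
    expected = hmac.new(key, settings, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("settings were modified outside the app")
    return settings
```

This gives integrity, not confidentiality; combine it with encryption (e.g. DPAPI, as discussed below) if the data must also stay secret.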
If there is no other way, encrypting the data and obfuscating the keys is a possibility.
On Windows Platforms, the standard way to protect sensitive data is to use the Data Protection API (DPAPI). Use DPAPI with the user supplied secret (the additional entropy in the APIs) for the best protection. You store the DPAPI'd data with the user's profile, in the registry, or on the filesystem. See, for example, Windows Data Protection, How to: Use Data Protection, and Data protection API (Windows Store apps). Michael Howard and David LeBlanc have a good treatment of the subject in Writing Secure Code, Second Edition. See Chapter 9, Protecting Secret Data, beginning on page 299.
If you want database-like encryption, look at SQLCipher. It uses authenticated encryption, so it provides confidentiality and integrity. Windows 8 supports native libraries, including on phones (see, for example, Native code on Windows Phone 8).
