What determines the appended string when we sign an MSIX application with signtool?

I bought a code signing certificate last year and signed an MSIX package using signtool:
sign /f "$SigningKeyFilePath" /fd SHA256 /v /a /p $SigningKeyPassword "$Package"
The resultant MSIX package, when installed, created a folder in the AppData directory with the name PackageName_244zpcd23egta. Since the certificate was valid for only a year, we had to get a new certificate; we were told that we couldn't renew the existing one.
Now, after signing with the new certificate, the folder created is different: PackageName_123zwerd23ewea. This means that we can't update the previously installed application; the MSIX installer returns an error saying a previous application with the same name is already installed.
I want to know how we can prevent this problem in the future. What determines the random-looking string at the end of the folder name? I have noticed that the new certificate did not have a PIN code. Could that make this happen? Or should we always insist on certificate renewal (if it is possible) instead of getting a new certificate?

The block appended at the end of the name is a publisher hash, calculated from your certificate subject (the publisher). It is a 13-character string: a base32-encoded representation of the first few bytes of the SHA-256 hash of your certificate's Distinguished Name.
The algorithm is relatively straightforward:
Take the UTF-16 string containing the publisher name (the certificate DN), as-is, with all spaces and punctuation.
Calculate the SHA-256 hash of the byte representation of this string.
Take the first 8 bytes (64 bits).
Pad the binary value with a single zero bit on the right (= left-shift all bits).
Group the bits into groups of 5 (since we had 64 + 1 bits, we get 13 groups of 5 bits each).
For each group, convert the bit representation to an integer and perform a look-up in a replacement table mapping the numbers to letters and digits.
Join the letters together and make them lowercase to get the publisher hash.
This can be done, for example, with the following PowerShell:
function Get-PublisherHash($publisherName)
{
    # UTF-16 (little-endian) bytes of the publisher name, then their SHA-256 hash
    $publisherNameAsUnicode = [System.Text.Encoding]::Unicode.GetBytes($publisherName);
    $publisherSha256 = [System.Security.Cryptography.HashAlgorithm]::Create("SHA256").ComputeHash($publisherNameAsUnicode);

    # First 8 bytes (64 bits), written out as a binary string and padded to 65 bits
    $publisherSha256First8Bytes = $publisherSha256 | Select-Object -First 8;
    $publisherSha256AsBinary = $publisherSha256First8Bytes | ForEach-Object { [System.Convert]::ToString($_, 2).PadLeft(8, '0') };
    $asBinaryStringWithPadding = [System.String]::Concat($publisherSha256AsBinary).PadRight(65, '0');

    # Douglas Crockford's base32 alphabet (no I, L, O, U)
    $encodingTable = "0123456789ABCDEFGHJKMNPQRSTVWXYZ";

    $result = "";
    for ($i = 0; $i -lt $asBinaryStringWithPadding.Length; $i += 5)
    {
        # Each 5-bit group is an index into the encoding table
        $asIndex = [System.Convert]::ToInt32($asBinaryStringWithPadding.Substring($i, 5), 2);
        $result += $encodingTable[$asIndex];
    }
    return $result.ToLower();
}
For example:
Get-PublisherHash "CN=SomeName, DN=Some Domain"
> qwz5zh2hhehvm
Any change to your certificate that alters the publisher name means that the hash is completely different. And since the hash is part of the MSIX family name and its full package name, the new app is treated as another application.
In some recent efforts to change this (Insider builds 22000 and newer) there is a feature called "persistent identity", which can be used to provide a smooth upgrade experience even if the certificate changes.
Source:
My blog post about calculating the hash
Microsoft documentation for feature "Persistent identity"
Description of base32 algorithm

Related

Microsoft Azure - Cannot satisfy password requirements when making FTP user credentials

I can't seem to satisfy the password requirements for creating FTP user credentials.
I go to
App Service > yourwebapp > Deployment Center > FTP > User Credentials
to create a username and password,
but I can never make a password that satisfies all the requirements.
I have literally dragged my hands across my keyboard pressing every button, but I still can't.
Could someone please tell me that I'm not going crazy and that I can follow basic directions?
I have used different browsers, both Chrome and Edge, but always get the message
"The specified password does not meet the minimum requirements. The password should be at least eight characters long and must contain capital letters, lowercase letters, numbers, and symbols."
I have verified that both password fields match. The username is in the form of <username>, just like the docs say.
My sample password is
TestTestTest_1234!##$
I think it SHOULD meet the requirements.
OK, so I found out that certain characters are not supported in the password, even though this is not explicitly stated.
The following characters are not supported. There may be others, but these are the ones I have tested myself:
_ (underscore)
#
^
( )
{ }
[ ]
- (hyphen)
+ =
: ;
" '
< >
, .
/
| \
` ~
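
Based on the list above, here is a small C# sketch that builds a password only from characters that are not on the unsupported list. The symbol set "!@$%&*" is an assumption on my part (simply what the list above doesn't rule out), not a verified whitelist:

using System;
using System.Security.Cryptography;

class FtpPasswordSketch
{
    // Assumed-safe character classes; the symbol set is a guess limited to
    // characters that do NOT appear on the unsupported list above.
    const string Upper   = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    const string Lower   = "abcdefghijklmnopqrstuvwxyz";
    const string Digits  = "0123456789";
    const string Symbols = "!@$%&*";

    static void Main()
    {
        string pool = Upper + Lower + Digits + Symbols;
        char[] pw = new char[16];
        // RandomNumberGenerator.GetInt32 requires .NET Core 3.0 or later.
        for (int i = 0; i < pw.Length; i++)
            pw[i] = pool[RandomNumberGenerator.GetInt32(pool.Length)];
        // Force one character from each required class into fixed slots.
        pw[0] = Upper[RandomNumberGenerator.GetInt32(Upper.Length)];
        pw[1] = Lower[RandomNumberGenerator.GetInt32(Lower.Length)];
        pw[2] = Digits[RandomNumberGenerator.GetInt32(Digits.Length)];
        pw[3] = Symbols[RandomNumberGenerator.GetInt32(Symbols.Length)];
        Console.WriteLine(new string(pw));
    }
}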

Verifiable logfile at customer site

We want to create a logfile at the customer site where
the customer is able to read the log (plain text)
we can verify at our site that the log file isn't manipulated
A few hundred bytes of unreadable data is okay. But some customers will not send us files if they can't verify that the files contain no sensitive data.
The only reasonable option I see so far is to append a cryptographic checksum (e.g. SHA256(SECRET_VALUE + "logtext")). The SECRET_VALUE would be something hardcoded, which is plain "security through obscurity". Is there any better way?
We use the .NET library, and I do not want to implement any crypto algorithm by hand, if that matters.
You can use the standard HMAC algorithm with a secret key to compute the checksum.
Using a secret key prevents, in a simple way, the checksum from being regenerated directly. A hardcoded key could be extracted from the code, but for your use case I think it is enough.
The result is a binary hash. To insert it into the text file, encode the value as hexadecimal or Base64, and make sure you can reverse the process on the server side so you can recalculate the hash over the original file.
You could also use a detached hash file to avoid modifying the log file.
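
For illustration, a minimal C# sketch using the built-in HMACSHA256; the key value and the "#HMAC " line format are my own assumptions, not a standard:

using System;
using System.IO;
using System.Security.Cryptography;
using System.Text;

static class LogSigner
{
    // Hypothetical hardcoded key; see the discussion about protecting it.
    static readonly byte[] Key = Encoding.UTF8.GetBytes("replace-with-a-real-secret");

    // Appends a single readable line "#HMAC <base64>" to the log.
    public static void AppendSignature(string logPath)
    {
        string body = File.ReadAllText(logPath);
        using var hmac = new HMACSHA256(Key);
        string tag = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(body)));
        File.AppendAllText(logPath, "#HMAC " + tag + Environment.NewLine);
    }

    // Server side: recompute the HMAC over everything before the signature line.
    public static bool Verify(string logPath)
    {
        string text = File.ReadAllText(logPath);
        int idx = text.LastIndexOf("#HMAC ", StringComparison.Ordinal);
        if (idx < 0) return false;
        string body = text.Substring(0, idx);
        string tag = text.Substring(idx + 6).Trim();
        using var hmac = new HMACSHA256(Key);
        string expected = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(body)));
        return expected == tag;
    }
}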
Target
customer-readable logfiles
verifiable by our side
minimum of binary data
must work offline
Options
Public/private-key signatures (RSA, ...)
would be secure
but only binary data
Add a signature (HMAC)
We are not the first ones with that idea ( https://en.wikipedia.org/wiki/Hash-based_message_authentication_code )
.NET supports that ( System.Security.Cryptography.HMACSHA256 )
The key must be stored somewhere ... in the source
Even with obfuscation: not possible to do so securely
Trusted timestamping
again: we are not the first ( https://en.wikipedia.org/wiki/Trusted_timestamping )
needs a connection to a "trusted third party" (meaning: a web service)
Build hash + timestamp -> send to third party -> they sign the data (public/private-key signature) -> send it back
Best option so far
Add a signature with HMAC
Store the key in native code (not THAT easy to extract)
Get code obfuscation running and build some extra loops in C#
Every once in a while (5 min?) put a signature into the log AND into the Windows application log (see the sketch below)
The application log is at least basically secured against modification (read-only)
and it's collected by our error report
easy for the customer to overlook (evil grin)
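
As a rough sketch of the "mirror into the Windows application log" step, using System.Diagnostics.EventLog; the source name "MyLogSigner" is hypothetical, and creating an event source requires administrative rights:

using System.Diagnostics;

static class SignatureMirror
{
    // Mirrors a signature line (e.g. the HMAC tag from the earlier sketch)
    // into the Windows Application event log. "MyLogSigner" is a hypothetical
    // source name; register it once, e.g. at install time, with admin rights.
    public static void Write(string hmacTag)
    {
        if (!EventLog.SourceExists("MyLogSigner"))
            EventLog.CreateEventSource("MyLogSigner", "Application");
        EventLog.WriteEntry("MyLogSigner", "Log signature: " + hmacTag, EventLogEntryType.Information);
    }
}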

Contents of p7s files

Does anyone out there know the content of a p7s file?
At least one digital signature is located in the file, and I can locate that pretty easily, but I want to catch the part of the file that contains the section encrypted with the private key - i.e., the non-repudiation part of the digital signature.
Does anyone know where a simple copy of the general file format can be found?
SUPPLEMENT ON EDIT:
The general format of the files I am working with is:
byte offset    record type
0              x509 cert (or so the program says)
0              version
1              8x (information type)
3              1 offset to next information (length of file - 4)
52             x509 cert (embedded, collection member: ROOT CA)
56             x509 cert (embedded, collection member: ROOT CA)
(1 offset)     x509 cert (new, subject identifier: Email)
(1 offset +4)  x509 cert (embedded, collection member: Email)
(varies)       x509 cert (new, enhanced key usage: Digital Signature, Non-Repudiation)
(varies +4)    x509 cert (embedded, collection member: Digital Signature, Non-Repudiation)
(varies)       x509 cert (new, certificate authority info: CA EMAIL)
(varies +4)    x509 cert (embedded, collection member: CA EMAIL)
(varies)       cross-certificate pair information, unknown class type
(varies)       certificate of unknown class type (padded with trailing zeros, if needed)
I do know that this file does not contain a message digest, and it is not unique except that it is original to the certificate holder (i.e., a different certificate will produce different results, but different messages all have the same p7s file attached to them).
I believe the message digest is stripped from the files when they are sent to a system that does not support encryption/decryption of emails (such as Gmail or Yahoo). That would be my explanation for the lack of distinct contents in the p7s files.
The files are not damaged, other than the likelihood that the message digest has been stripped from them. They will not open in an ASN.1 editor, whereas regular certificate files (.cer files) will.
I wrote a program that analyzes the files, essentially by brute force.
I check the file byte by byte and copy the bytes into the input of an X509Certificate2 class instantiation in a try-catch loop.
Whether it aborts or not, I move on one byte and check again.
If the check is successful, I add the resulting cert to a collection.
At the end, I examine the certs in the collection and dump both hex and formatted ASCII, with the file offset, into a report file.
The p7s files I am referring to are detached from the original file and apparently do not contain any signed information; that is stripped when the signature is sent to an email address that does not have the capability of encrypting/decrypting information.
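
For reference, a minimal C# sketch of the brute-force scan described above; the file name is a placeholder, and this is slow on large files:

using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;

class CertScanner
{
    static void Main()
    {
        // "signature.p7s" is a placeholder file name.
        byte[] data = File.ReadAllBytes("signature.p7s");
        var found = new List<X509Certificate2>();
        for (int offset = 0; offset < data.Length; offset++)
        {
            byte[] slice = new byte[data.Length - offset];
            Array.Copy(data, offset, slice, 0, slice.Length);
            try
            {
                // Parses only if certificate data starts at this offset.
                found.Add(new X509Certificate2(slice));
            }
            catch (CryptographicException)
            {
                // No certificate here; keep scanning.
            }
        }
        foreach (var cert in found)
            Console.WriteLine(cert.Subject);
    }
}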
A P7S file usually contains a DER-encoded CMS (Cryptographic Message Syntax) structure of the SignedData type, which is defined in RFC 5652. You can use an ASN.1 editor to conveniently examine the exact structure and contents of your file.
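
If you prefer code over a GUI tool, .NET can decode the structure with the SignedCms class; a sketch, assuming a placeholder file name and the System.Security.Cryptography.Pkcs package on .NET Core:

using System;
using System.IO;
using System.Security.Cryptography.Pkcs;

class P7sInspector
{
    static void Main()
    {
        // "message.p7s" is a placeholder; a detached .p7s contains signatures
        // and certificates, but not the signed content itself.
        byte[] raw = File.ReadAllBytes("message.p7s");
        var cms = new SignedCms();
        cms.Decode(raw); // throws if the bytes are not a valid CMS structure

        foreach (SignerInfo signer in cms.SignerInfos)
            Console.WriteLine("Signer: " + signer.Certificate?.Subject);

        foreach (var cert in cms.Certificates)
            Console.WriteLine("Embedded certificate: " + cert.Subject);
    }
}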
I recently needed to get the contents of some ".rtf.p7s" files. Looking at their content using Notepad++, I could see that the RTF was there, unencrypted, after a fixed number of bytes. So I made this little script that removes the first bytes containing the key information that I did not need.
PS: opening the file in Sublime did not help; it opened the file as binary.
<?php
$dir = __DIR__;
$dh = opendir($dir);
$files = array();
while (false !== ($filename = readdir($dh))) {
    if (strpos($filename, ".p7s") !== false) {
        $files[] = $filename;
    }
}
closedir($dh);
foreach ($files as $file) {
    $content = file_get_contents($file);
    // Strip the leading signature data (62 bytes in these particular files).
    $newcontent = substr($content, 62);
    // Drop the ".p7s" extension to recover the original file name.
    $newfilename = substr($file, 0, -4);
    file_put_contents($newfilename, $newcontent);
}
?>

hash different strings to same result

I am setting up a voucher code system for a checkout written in C#, and I want to be able to distribute unique vouchers that do the same thing, sort of like a product key.
Is there any way of generating unique (fairly short and preferably alphanumeric) strings that will "hash" in some way to give the same result?
In other words, can I start with a defined voucher code and get multiple results for a reverse hash?
I'm sorry if I'm not explaining this very well - I can give more information if needed.
EDIT: I know that I could use a look-up table with pre-defined codes, but I was wondering whether there is a way to auto-generate these codes to allow the system to scale easily.
Here's a thought...
Start with some secret password: "100:mypass:yourpass"
then md5 that; you'll get
md5("100:mypass:yourpass") = f6ff5421b31e609c7dcd19c4a462caa0
'key 1' => left 16 chars of the md5 = 'f6ff5421b31e609c'
run the right 16 chars of the md5 into another md5:
md5('7dcd19c4a462caa0') = 582fbfb7a035d08094cdef57d88f720e
'key 2' => '582fbfb7a035d080'
[repeat again here, and again... and again]
...
Not sure on the 'distribution' point here, e.g. whether it will run on a POS-type gift card or voucher card system, but notice that I put 3 components into the 'password': this value could contain the total number of legitimate keys (split on ":", giving a cap of 100 valid keys), a system (distributor) password, and a local system password that would be required to 'verify' or 'match' a 'good' key. You could just do a quick scan to see whether the key exists or not and write an invalidation routine locally. I know my math-genius friends would probably say there's a better, more secure and effective way, but hey... this is what you asked for, right? I'm a simple man and like simple things... but you could make it more complicated by allowing for ranges too...
e.g. pass => "100:1000:pass1:pass2", that way you could test the 100th->1000th md5 partial keys...
Cheers!!
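
A small C# sketch of that chaining scheme; the seed string and key count are arbitrary example values:

using System;
using System.Security.Cryptography;
using System.Text;

class VoucherChain
{
    // Each round hashes the right half of the previous digest;
    // the left half becomes the next voucher key.
    static string[] GenerateKeys(string seed, int count)
    {
        var keys = new string[count];
        string current = seed;
        using var md5 = MD5.Create();
        for (int i = 0; i < count; i++)
        {
            byte[] digest = md5.ComputeHash(Encoding.UTF8.GetBytes(current));
            string hex = BitConverter.ToString(digest).Replace("-", "").ToLower();
            keys[i] = hex.Substring(0, 16);  // 'key i' = left 16 hex chars
            current = hex.Substring(16);     // right 16 hex chars feed the next round
        }
        return keys;
    }

    static void Main()
    {
        foreach (string key in GenerateKeys("100:mypass:yourpass", 5))
            Console.WriteLine(key);
    }
}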
What you are looking for is called a perfect hash function.
Here you can find an article about how to efficiently generate a perfect hash for large key sets.
And here you can find a C# minimal perfect hash function generator.
You could use a hash generated from the current date/time:

using System;
using System.Security.Cryptography;
using System.Text;

// Hash the current timestamp and print the digest as hex.
byte[] ByteArray = Encoding.UTF8.GetBytes(DateTime.Now.ToString());
using (MD5 md5 = MD5.Create())
{
    byte[] ByteResult = md5.ComputeHash(ByteArray);
    StringBuilder result = new StringBuilder(ByteResult.Length * 2);
    for (int i = 0; i < ByteResult.Length; i++)
        result.Append(ByteResult[i].ToString("X2"));
    Console.WriteLine(result.ToString());
}

How to programmatically re-enable documents in the MS Office list of disabled files

MS Office programs keep a list of disabled files that have caused errors when previously opened. A user can remove documents from this list by accessing it through the program menu and selecting a document to be re-enabled. (http://support.microsoft.com/kb/286017)
The question is: how can this re-enabling of documents be accomplished programmatically, without interaction with the GUI?
Consolidating previous answers and expounding upon them here.
Office products store disabled items in the registry under keys named HKEY_CURRENT_USER\Software\Microsoft\Office\<version>\<product>\Resiliency\DisabledItems. For example, Excel 2010's disabled list is under HKEY_CURRENT_USER\Software\Microsoft\Office\14.0\Excel\Resiliency\DisabledItems.
Each disabled item is stored as a randomly-named key of type REG_BINARY. The format of the byte array is:
bytes 0-3    : ??? (perhaps a 32-bit uint type code, 1 = COM add-in)
bytes 4-7    : 32-bit uint length (in bytes) of the first string (path)
bytes 8-11   : 32-bit uint length (in bytes) of the second string (description)
bytes 12-end : two strings of Unicode (UTF-16) characters, the byte length of each of which is stored in the uints above
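
Based on that layout, a C# sketch that lists the disabled items for one product; the version/product path is just an example, and the field layout is assumed to be exactly as described above:

using System;
using System.Text;
using Microsoft.Win32;

class DisabledItemsDump
{
    static void Main()
    {
        // Example path: 14.0\Excel = Excel 2010; adjust version and product.
        const string path = @"Software\Microsoft\Office\14.0\Excel\Resiliency\DisabledItems";
        using RegistryKey key = Registry.CurrentUser.OpenSubKey(path);
        if (key == null) return;

        foreach (string name in key.GetValueNames())
        {
            byte[] data = (byte[])key.GetValue(name);
            uint pathLen = BitConverter.ToUInt32(data, 4);  // byte length of first string
            uint descLen = BitConverter.ToUInt32(data, 8);  // byte length of second string
            string itemPath = Encoding.Unicode.GetString(data, 12, (int)pathLen).TrimEnd('\0');
            string desc = Encoding.Unicode.GetString(data, 12 + (int)pathLen, (int)descLen).TrimEnd('\0');
            Console.WriteLine($"{name}: {itemPath} ({desc})");
        }
    }
}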
BAT script to re-enable all "disabled items" in Excel 2016.
Disabled items are found in Excel->File->Options->Addins->Manage->Disabled items.
:: Deletes all values under the key.
REG DELETE HKEY_CURRENT_USER\Software\Microsoft\Office\16.0\Excel\Resiliency\DisabledItems /va /f
FYI on the parameters:
/va Deletes all values under this key.
/f Forces the deletion without a prompt.
P.S. I have a bunch of workbooks that run macros via Task Scheduler. Excel would randomly add workbooks that crashed to the disabled items list, so running this BAT script daily resolves it.
Here is a PowerShell script that I threw together to address a similar problem I was having with MS Access 2013 on Windows 7.
#RemoveOfficeDisabledItem.ps1
#command line:
# powershell -executionpolicy unrestricted -file ".\RemoveOfficeDisabledItem.ps1"
#Update these variables to suit your situation
$OfficeVersion="15.0"
$OfficeApp="Access"
$FileName="My Blocked File.mdb"
#Converts the File Name string to UTF16 Hex
$FileName_UniHex=""
[System.Text.Encoding]::ASCII.GetBytes($FileName.ToLower()) | %{$FileName_UniHex+="{0:X2}00" -f $_}
#Tests whether the Disabled Items registry key exists
$RegKey=(Get-Item "HKCU:\Software\Microsoft\Office\${OfficeVersion}\${OfficeApp}\Resiliency\DisabledItems\" -ErrorAction SilentlyContinue)
if($RegKey -eq $NULL){exit}
#Cycles through all the properties and deletes each one that contains the file name.
foreach ($prop in $RegKey.Property) {
    $Val=""
    ($RegKey|gp).$prop | %{$Val+="{0:X2}" -f $_}
    if($Val.Contains($FileName_UniHex)){$RegKey|Remove-ItemProperty -name $prop}
}
Regarding MS Office XP (2002): for MS Word, the list of disabled documents is kept as randomly named binary values under the key:
[HKEY_CURRENT_USER\Software\Microsoft\Office\10.0\Word\Resiliency\DisabledItems]
So deleting the values under the "DisabledItems" key for every user will probably do the trick.
Is there something more to it? I don't know - yet.
There is a good article about how Office handles COM add-ins at CodeProject. Normal add-ins are handled the same way, and the system has been kept unchanged so far (up to Office 2013).
As far as I found out, the randomly named value contains a byte array of Unicode characters, separated by null strings.
I could not find out about all the entries in the null-separated array of values; however, index (3) contains the filename of the add-in and index (4) contains a description of the add-in, if available.
So one should read the values and ask the user to reinstall the add-ins before deleting the registry keys, as Luie wrote back in 2009.
