Currently, I am working on a web app. The backend frequently communicates with the Docker Engine API, and I am using certificate-based authentication (client/server keys). However, where should I store the certificate PEM files? Should I store them in the database, or store them as files on disk and keep the file paths in the database?
Storing them in the database would mean I have to query the database every time a command is sent. Am I correct?
I feel storing the PEM files on disk would be less resource-intensive. As you stated, if they were stored in the database, you would need to make a query every time you wanted to access them.
If you do store the PEM files on disk, make sure they are not in a web-accessible directory, e.g. nobody should be able to go to https://yourapp/your.pem.
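As a minimal sketch of the on-disk approach (the directory, host name, and port below are assumptions, not anything from your setup), the backend can read the PEM files once at startup and reuse them for every Docker Engine API call, so there is no per-command database or disk access:

```ts
import { readFileSync } from "node:fs";
import https from "node:https";

// Hypothetical location: the PEM files live on disk, outside any web-accessible
// directory, and only this path (or nothing at all) is kept in the database/config.
const CERT_DIR = "/etc/myapp/docker-certs";

// Read the certificates once at startup and keep them in memory via the agent.
const agent = new https.Agent({
  ca: readFileSync(`${CERT_DIR}/ca.pem`),
  cert: readFileSync(`${CERT_DIR}/cert.pem`),
  key: readFileSync(`${CERT_DIR}/key.pem`),
});

// Example call: list containers through the Docker Engine API (dockerd listening with TLS on 2376).
https.get(
  { host: "docker.example.internal", port: 2376, path: "/containers/json", agent },
  (res) => {
    let body = "";
    res.on("data", (chunk) => (body += chunk));
    res.on("end", () => console.log(JSON.parse(body)));
  },
);
```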
I would like to upload a CSV file with sensitive data to a website that allows CSV uploads. The website is secured with SSL; however, I am uncertain whether the CSV upload itself would be transmitted securely.
When you upload a file to a website over SSL/TLS, the file is encrypted while it is transmitted from your device to the website's server, so anyone who intercepts the transmission cannot read its contents. Note that this protects the file only in transit; once it arrives, it is stored however the site chooses to store it.
If you are concerned about the security of your data, you might consider using a file sharing service that allows you to password-protect your files or encrypt them before uploading. This will add an extra layer of security to your data and help to ensure that it is protected from unauthorized access.
Dropbox
Google Drive
OneDrive
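As a sketch of the "encrypt before uploading" option mentioned above, here is a minimal Node/TypeScript example (the file names and passphrase are placeholders) that encrypts a CSV with AES-256-GCM before it is handed to any upload form:

```ts
import { createCipheriv, randomBytes, scryptSync } from "node:crypto";
import { readFileSync, writeFileSync } from "node:fs";

// Placeholder passphrase; in practice it would be agreed with the recipient out of band.
const passphrase = "correct horse battery staple";
const salt = randomBytes(16);
const key = scryptSync(passphrase, salt, 32); // derive a 256-bit key from the passphrase
const iv = randomBytes(12);

// Encrypt the CSV with AES-256-GCM (authenticated encryption).
const cipher = createCipheriv("aes-256-gcm", key, iv);
const plaintext = readFileSync("sensitive.csv");
const ciphertext = Buffer.concat([cipher.update(plaintext), cipher.final()]);
const tag = cipher.getAuthTag();

// Keep the salt, IV and auth tag with the ciphertext so the recipient can decrypt it.
writeFileSync("sensitive.csv.enc", Buffer.concat([salt, iv, tag, ciphertext]));
```

The recipient reverses the steps: derive the key from the same passphrase and salt, then decrypt using the IV and auth tag.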
I'm developing a small private social network where users can share media files and text with each other. To provide some security, the data is stored encrypted on the server / in the database. The question now is where and how to store the private key(s) for the encrypted data. I use MongoDB to store the user information and a simple Apache 2 server to store the actual encrypted media files. The client connects to a socket.io server running over HTTPS to read the database content. Here is a list of the steps I thought of to make it as safe as possible; please let me know if I am missing something or going down the wrong path.
AES-128 encryption for the media files and text
HTTPS/TLS encryption for the server-client connection
Storing the secret key for the database content in a C/C++ class (if possible?) in the client code
Creating a separate secret key for every media file / upload and storing it encrypted in the database (see the sketch after this list)
Restricting access to the server so that nothing but the encrypted media files can be reached
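A minimal Node/TypeScript sketch of the per-upload key idea (the master-key handling is an assumption; in practice it would come from a KMS or at least server-side configuration, never from client code):

```ts
import { createCipheriv, randomBytes } from "node:crypto";

// Assumed server-side master key; a random value is used only to keep the sketch self-contained.
const MASTER_KEY = randomBytes(32);

// Encrypt one media file with its own fresh key, then wrap that key under the
// master key so only the wrapped form ever reaches the database.
function encryptMedia(media: Buffer) {
  const fileKey = randomBytes(16); // AES-128 key, one per upload
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-128-gcm", fileKey, iv);
  const ciphertext = Buffer.concat([cipher.update(media), cipher.final()]);
  const tag = cipher.getAuthTag();

  // Wrap (encrypt) the per-file key with the master key.
  const wrapIv = randomBytes(12);
  const wrapper = createCipheriv("aes-256-gcm", MASTER_KEY, wrapIv);
  const wrappedKey = Buffer.concat([wrapper.update(fileKey), wrapper.final()]);
  const wrapTag = wrapper.getAuthTag();

  return {
    ciphertext,                  // goes to the Apache file store
    iv, tag,                     // stored next to the ciphertext
    wrappedKey, wrapIv, wrapTag, // stored in the MongoDB document for this upload
  };
}
```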
I would like to know whether Azure Storage's client-side encryption also applies to file blobs or only to strings. I could find some documents on how to do client-side "data encryption" for Azure, but they don't specify which data types are valid for client-side encryption.
If, for example, I have a JPEG file, can it be encrypted before upload and then decrypted when it is downloaded, using Azure Storage's client-side encryption?
Short answer: Yes.
If you have a JPEG file (or, for that matter, any file) you will be able to encrypt it and store it in encrypted form in Azure Storage. The feature is not limited to strings.
There are some caveats for client-side encryption to work, though, and for those I suggest you read this article on the Azure documentation site, which explains the whole process very well: https://learn.microsoft.com/en-us/azure/storage/storage-client-side-encryption.
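To illustrate the principle only (this is not the SDK's built-in client-side encryption feature from the linked article, which also handles key wrapping for you), here is a hand-rolled Node/TypeScript sketch: the blob service only ever sees bytes, so a JPEG encrypted locally uploads like any other blob. The container, blob, and file names are placeholders, and the connection string is assumed to be in an environment variable.

```ts
import { BlobServiceClient } from "@azure/storage-blob";
import { createCipheriv, randomBytes } from "node:crypto";
import { readFileSync } from "node:fs";

async function main(): Promise<void> {
  // Assumed to be set in the environment.
  const service = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING!);
  const blob = service.getContainerClient("photos").getBlockBlobClient("holiday.jpg.enc");

  // Encrypt the JPEG locally before it leaves the machine.
  const key = randomBytes(32);
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const jpeg = readFileSync("holiday.jpg");
  const encrypted = Buffer.concat([cipher.update(jpeg), cipher.final()]);
  const tag = cipher.getAuthTag();

  // Upload the ciphertext; downloading and decrypting with the same key, IV and tag reverses this.
  await blob.uploadData(Buffer.concat([iv, tag, encrypted]));
}

main().catch(console.error);
```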
I am building an app that needs to upload files to S3. Initially, I had my secret key in the web.config file. Since that key has access to my entire account, I am realizing that I should instead rely on IAM to generate a user for accessing just that bucket.
However, this doesn't solve the problem of storing the key in plain text. How can I manage it otherwise?
Actually, IAM permissions for S3 do solve your problem, because the user you create will only be allowed to access that specific bucket, so it can't do any harm to the rest of your account. Better still, if the app runs on AWS (for example on EC2), you can attach an IAM role to the instance instead of creating a user, and then you don't have to store access/secret keys on the machine at all.
Further, you can restrict access to a bucket to a specific IP.
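As a sketch of the "no keys on the machine" part (the bucket name and region are placeholders, and it assumes the AWS SDK for JavaScript v3 with an instance/task role or environment credentials available), the application code never mentions an access key or secret key:

```ts
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { readFile } from "node:fs/promises";

// No credentials in code or config: the SDK's default credential provider chain
// finds them in the environment or, on EC2/ECS, in the attached IAM role.
const s3 = new S3Client({ region: "us-east-1" });

async function uploadToBucket(localPath: string, key: string): Promise<void> {
  const body = await readFile(localPath);
  await s3.send(
    new PutObjectCommand({
      Bucket: "example-upload-bucket", // placeholder bucket name
      Key: key,
      Body: body,
    }),
  );
}
```

The bucket-only and source-IP restrictions mentioned above live in the IAM or bucket policy on the AWS side, not in the application code.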
Consider the following image that shows the encryption hierarchy used in SQL Server. Please note the first blue block, which says the SMK (Service Master Key) is encrypted using DPAPI. DPAPI uses the currently logged-in user's credentials (among other things) to encrypt data, so it is machine-specific. This means that the SMK (as well as the DMK, the Database Master Key, and any derived password) will be machine-specific (it is actually generated by SQL Server's setup). On the other hand, I can create/back up an X.509 certificate in SQL Server (using CREATE CERTIFICATE, BACKUP CERTIFICATE, and so on).
The scenario/question:
I'm developing a web app that needs to encrypt credit-card information and store it in a database column. I need to access that data later, on another machine, so the database backup should actually be readable when restored elsewhere (albeit only by someone who has access to the above-mentioned certificate).
I'm wondering how I am supposed to restore a backup on another machine when the SMK is specific to the current SQL Server instance. What should I do to access the encrypted data once it has been restored on another machine?
UPDATE: Correct me if I am wrong!
We could use the BACKUP SERVICE MASTER KEY TO FILE command to back up the currently used SMK. This key can then be restored on any other SQL Server instance (on the same machine or not) using the RESTORE SERVICE MASTER KEY FROM FILE command. When the SMK is restored, it is encrypted once again using DPAPI so that the key itself can be stored on that machine.
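For reference, the two commands described above look roughly like this (the path and password are placeholders):

```sql
-- On the source instance: export the SMK, protected by a password.
BACKUP SERVICE MASTER KEY
    TO FILE = 'C:\backup\smk.key'
    ENCRYPTION BY PASSWORD = 'StrongSmkPassword!1';

-- On the destination instance: import it (DPAPI re-encrypts it for that machine).
RESTORE SERVICE MASTER KEY
    FROM FILE = 'C:\backup\smk.key'
    DECRYPTION BY PASSWORD = 'StrongSmkPassword!1';
```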
Any help would be highly appreciated,
The diagram shows that a certificate can be protected either by the DMK or via a password. If you protect it with just a password, it should be portable.
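A minimal T-SQL sketch of the password-protected route (names, paths, and passwords are placeholders): the certificate is created with its own password rather than being protected by the DMK, backed up together with its private key, and re-created from those files on the destination server.

```sql
-- Source server: certificate protected by a password, not by the DMK.
CREATE CERTIFICATE CreditCardCert
    ENCRYPTION BY PASSWORD = 'StrongCertPassword!1'
    WITH SUBJECT = 'CC column encryption';

BACKUP CERTIFICATE CreditCardCert
    TO FILE = 'C:\backup\CreditCardCert.cer'
    WITH PRIVATE KEY (
        FILE = 'C:\backup\CreditCardCert.pvk',
        ENCRYPTION BY PASSWORD = 'StrongBackupPassword!1', -- protects the exported key file
        DECRYPTION BY PASSWORD = 'StrongCertPassword!1'    -- the certificate's own password
    );

-- Destination server: re-create the certificate from the backup files.
CREATE CERTIFICATE CreditCardCert
    FROM FILE = 'C:\backup\CreditCardCert.cer'
    WITH PRIVATE KEY (
        FILE = 'C:\backup\CreditCardCert.pvk',
        DECRYPTION BY PASSWORD = 'StrongBackupPassword!1',
        ENCRYPTION BY PASSWORD = 'StrongCertPassword!1'    -- keep it password-protected, not DMK-protected
    );
```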