How to add a new key-value pair in Secrets Manager without impacting existing keys and values in AWS CDK - python-3.x

I have created an AWS stack in Python which creates a new secret in Secrets Manager. When I execute the code, the stack is created successfully, and the secret with all the provided key-value pairs is listed correctly. Below is the code.
templated_secret = asm.Secret(
    self, "abzzzz11",
    description="ddddd",
    secret_name="hahahah",
    generate_secret_string=asm.SecretStringGenerator(
        secret_string_template=json.dumps({
            "username1": "",
            "password1": "",
            "password2": "hello-world-prod2",
        }),
        generate_string_key="qwe",
    ),
)
I have two questions below:
Question 1: After the secret is created, the values are changed per key for the dev or stage environment. Now a new key and value need to be added to the same secret, but after adding the new entry in my code, executing the stack replaces all the existing values. Is it possible for the system to add only those keys which do not already exist in AWS Secrets Manager?
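For context on why this happens: CloudFormation owns the entire SecretString, so any change to the template rewrites the whole JSON value. One possible workaround is to merge new keys into the secret outside of the CDK deployment via the SDK. A minimal sketch, assuming the secret name from the question and a hypothetical new key:

import json

import boto3

client = boto3.client('secretsmanager')

# Read the current secret value, merge in only the new entries,
# and write the result back. Existing values win on conflict.
current = json.loads(client.get_secret_value(SecretId='hahahah')['SecretString'])
new_entries = {'new_key': 'new_value'}  # hypothetical addition
merged = {**new_entries, **current}
client.put_secret_value(SecretId='hahahah', SecretString=json.dumps(merged))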
Question 2: I am unable to understand the purpose of generate_string_key in the code above. I have read the AWS documentation but still cannot understand what this field is for. Please help me understand its usage.
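For what it's worth, generate_string_key names the JSON key under which Secrets Manager stores the random value it generates: the service takes secret_string_template as the base JSON and merges one generated value into it under that key. A minimal sketch (identifiers are illustrative, not from the question):

import json

import aws_cdk.aws_secretsmanager as asm

# Inside a Stack's __init__. The resulting secret value would look like:
# {"username": "app-user", "password": "<generated random string>"}
secret = asm.Secret(
    self, "AppSecret",
    generate_secret_string=asm.SecretStringGenerator(
        secret_string_template=json.dumps({"username": "app-user"}),
        generate_string_key="password",  # generated value lands under this key
    ),
)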

Related

Azure - Create Function App hostkey with Terraform azapi/bicep/powershell

I'm working on automating the rotation of my Azure Function App's host key, which is used to maintain a more secure connection between my API Management and my function apps. The issue is that I cannot figure out how to accomplish this, given the lack of clear documentation. I found a document describing how to create a key for a specific function within the function app, but not at the host level. I've tried using the web UI resource manager to figure out what the proper values are, but host seems to have no values available by GET request to help me see what the formatting needs to be. In fact, I can't find any reference to my function app's host keys anywhere in the Resource Manager UI. (Of course, I can in the portal.)
I don't care if it's PowerShell, Bicep, ARM, Terraform azapi, whatever; I'd just like to find a way to accomplish the creation of a new host key so that I can control its rotation with Terraform. Does anyone know how to accomplish this?
Right now my attempt looks like
resource "azapi_resource" "function_host_key" {
type = "Microsoft.Web/sites/host/functionkeys#2018-11-01"
name = "${azurerm_windows_function_app.api_function.name}-host-key"
parent_id = "${azurerm_windows_function_app.api_function.id}/host"
body = jsonencode({
properties = {
name = "test-key-terraform"
value = "asdfasdfasdfasdfasdfasdfasdf"
}
})
}
I also tried
resource "azapi_resource" "function_host_key" {
type = "Microsoft.Web/sites#2018-11-01"
name = "${azurerm_windows_function_app.api_function.name}-host-key"
parent_id = "${azurerm_windows_function_app.api_function.id}/functionsAppKeys"
location = var.region
}
since it said the body was invalid, but this also throws an error due to there being no body. I'm wondering if this just isn't possible.
I also just tried
resource "azapi_resource" "function_host_key" {
type = "Microsoft.Web/host/functionkeys#2018-11-01"
name = "${azurerm_windows_function_app.api_function.name}-host-key"
parent_id = "${azurerm_windows_function_app.api_function.id}/host"
location = var.region
}
and the result said that the parent_id was invalid:
`parent_id is invalid`: expect ID of `Microsoft.Web/host`
so I'm not sure what that parent_id should be.
I found an example of doing this through a bash/PowerShell script using the Azure REST API, but I get a 403 error when I attempt it. I can only assume this is because my function app is secured, but I'm not sure of a good way to determine that.
There must be a way to create a key programmatically...
UPDATE
I believe that this has been purposely made impossible to do with Terraform now, and I need to, as gross and backwards as it may be, use a CLI command in my pipeline. I understand you can do this, but in my opinion, if I am using Terraform, I should have Terraform manage something, not have random CLI commands outside of Terraform doing things that TF should be able to manage.
I created a key using az functionapp keys set and that worked; the output explicitly stated that the type of the created resource was Microsoft.Web/sites/host/functionKeys. So I went to the Azure Resource Explorer to see what API versions were available for this type, since it clearly exists... and found that no, Azure does not have it listed.
What confuses me is that I see this being done with ARM templates, and I believe that my code matches theirs, except that I'm using AZAPI... and I get a not-found error. Giving up for now.
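For reference, the CLI route mentioned in the update would look something like the following; the resource group, app name, and key value here are illustrative, not taken from the question:

az functionapp keys set \
  --resource-group my-rg \
  --name my-function-app \
  --key-type functionKeys \
  --key-name test-key-terraform \
  --key-value "some-generated-value"

Run from a pipeline step, this creates or updates a host-level function key; omitting --key-value should let Azure generate one for you.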

Hiding secrets in intake catalog for remote access (S3/MinIO)

I'm trying to build an intake catalog for my team. The datasets are on a shared MinIO server for which each user should have their own service account, and therefore a key/secret pair.
When creating the first catalog entry like this:
import intake

source = intake.open_netcdf(
    "s3://bucket/path/to/file.netcdf",
    storage_options=storage_options,
)
where storage_options is a dictionary (read from a JSON file that the user should have in their file system) containing:
{
    'key': 'KEY',
    'secret': 'SECRET',
    'client_kwargs': {'endpoint_url': 'http://X.X.X.X:9000'}
}
i.e. the necessary credentials for s3fs to access the MinIO server; I get a catalog entry containing the secrets:
sources:
  my_dataset:
    args:
      storage_options:
        client_kwargs:
          endpoint_url: http://X.X.X.X:9000
        key: KEY
        secret: SECRET
      urlpath: s3://bucket/path/to/file.netcdf
    description: 'my description'
    driver: intake_xarray.netcdf.NetCDFSource
Now this catalog file shouldn't be shared because it contains secrets, which defeats the purpose of having a catalog. My question then is: how do I make the storage_options part be read from the secrets file that the user will have? (Ideally without having to change from JSON to YAML, but it's not a requirement.)
Fortunately, AWS already provides for doing this, either via environment variables or files placed in special locations (https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#environment-variables and below).
Intake also has ways of templating values, but these ultimately end up using the environment or getting values directly from the user. Additionally, your case is complicated by needing these values not in a top-level parameter but nested inside storage_options. We could probably improve this system, but it would still leave the question: where should the secret values come from?
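To make that concrete, here is a sketch of what an environment-based catalog entry could look like, using intake's env() template substitution; the variable names MINIO_KEY and MINIO_SECRET are illustrative. Alternatively, key and secret can be omitted entirely so that s3fs falls back to the standard AWS credential chain (the AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables or ~/.aws/credentials):

sources:
  my_dataset:
    driver: intake_xarray.netcdf.NetCDFSource
    description: 'my description'
    args:
      urlpath: s3://bucket/path/to/file.netcdf
      storage_options:
        key: '{{env(MINIO_KEY)}}'
        secret: '{{env(MINIO_SECRET)}}'
        client_kwargs:
          endpoint_url: http://X.X.X.X:9000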

Is installing a local node necessary to create a wallet on test/main net?

I want to be able to create wallets on the Waves blockchain using their API, according to this:
https://nodes-testnet.wavesnodes.com/api-docs/index.html#/addresses/createWalletAddress
I need to send an API key in my request header. I looked into how I can obtain this API key, and the docs give these steps:
Set API Key
To set the API key, you need to generate an API key hash and then use it in your node configuration.
1. Create a unique string value that you will use as the API key.
2. Go to the Swagger web interface.
3. Open the /utils/hash/secure API method and input your unique string in the message field.
4. Click Execute to get the hashed API key.
5. Use the hashed API key as the value of the api-key-hash parameter in your node configuration file.
6. Restart your node.
It says:
Use the hashed API key as the value of the api-key-hash parameter in your node configuration file.
I'm very confused... I thought using the testnet means that I don't have to install a local node. Maybe I'm wrong?
Use this package:
https://www.npmjs.com/package/@waves/waves-api
You need to create a seed phrase, and using the seed you can create an address and public & private keys. Here is a shortcut to create all of them:
const WavesAPI = require('@waves/waves-api');

const Waves = WavesAPI.create(WavesAPI.TESTNET_CONFIG);
const seed = Waves.Seed.create();
console.log(seed); // includes the phrase, address, and key pair

How to connect to Google Datastore from a script in Python 3

We want to work with the data that is in Google Datastore. We already have a database, and we would like to use Python 3 to handle the data and make queries from a script on our development machines. What would be the easiest way to accomplish this?
From the Official Documentation:
You will need to install the Cloud Datastore client library for Python:
pip install --upgrade google-cloud-datastore
Set up authentication by creating a service account and setting an environment variable; please take a look at the official documentation for more details on this. You can perform this step using either the GCP Console or the command line.
Then you will be able to connect to your Cloud Datastore client and use it, as in the example below:
# Imports the Google Cloud client library
from google.cloud import datastore

# Instantiates a client
datastore_client = datastore.Client()

# The kind for the new entity
kind = 'Task'
# The name/ID for the new entity
name = 'sampletask1'

# The Cloud Datastore key for the new entity
task_key = datastore_client.key(kind, name)

# Prepares the new entity
task = datastore.Entity(key=task_key)
task['description'] = 'Buy milk'

# Saves the entity
datastore_client.put(task)

print('Saved {}: {}'.format(task.key.name, task['description']))
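Since the question also mentions making queries, here is a minimal sketch of reading entities back, reusing the client and kind from the example above:

# Fetch all entities of kind 'Task' and print them
query = datastore_client.query(kind='Task')
for entity in query.fetch():
    print(entity.key.name, entity['description'])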
As @JohnHanley mentioned, you will find a good example in this Bookshelf app tutorial that uses Cloud Datastore to store its persistent data and metadata for books.
You can create a service account and download the credentials as JSON and then set an environment variable called GOOGLE_APPLICATION_CREDENTIALS pointing to the json file. You can see the details at the link below.
https://googleapis.dev/python/google-api-core/latest/auth.html
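A minimal sketch of both options; the JSON file path here is illustrative:

import os

from google.cloud import datastore

# Option 1: point the environment variable at the key file
# before the client is created.
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/path/to/service-account.json'
client = datastore.Client()

# Option 2: load the credentials file explicitly.
client = datastore.Client.from_service_account_json('/path/to/service-account.json')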

Access secret environment properties in IBM Cloud deploy - NodeJS

I'm having some problems accessing the secret environment properties I've set in my build stage. In the build environment properties I have two secret fields called "w_username" and "w_password"; however, I cannot access these properties inside my NodeJS runtime. I've tried process.env['w_username'], but it seems it can't be found. How is it possible to access them?
Using NodeJS 6.x and npm 6.x with the SDK for NodeJS on IBM Cloud.
You can directly access the build environment properties in the next stage in the toolchain with their names like w_username and w_password.
You can examine the environment properties for a pipeline job by running the env command in the job's script.
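As a sketch of what that access looks like in a pipeline job's Node script (the property names are taken from the question):

// Pipeline properties are exposed to the job as environment variables.
const username = process.env.w_username;
const password = process.env.w_password;
if (!username || !password) {
  console.error('w_username / w_password are not set in this stage');
}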
You can also define your own environment properties. For example, you might define an API_KEY property that passes an API key that is used to access IBM Cloud resources by all scripts in the pipeline.
You can add the following types of properties:
Text: A property key with a single-line value.
Text Area: A property key with a multi-line value.
Secure: A property key with a single-line value that is secured with AES-128 encryption. The value is displayed as asterisks.
Properties: A file in the project's repository. This file can contain multiple properties. Each property must be on its own line. To separate key-value pairs, use the equals sign (=). Enclose all string values in quotation marks. For example, MY_STRING="SOME STRING VALUE".
For more information, refer here
Hope this helps
