Get a signed URL for a Google Storage object via Terraform

I am trying to get a signed URL for an object (e.g. abc.png) in a Google Cloud Storage bucket via a Terraform .tf script, but I am not getting any output on the console.
I have installed Terraform on my local Linux machine and I am providing a service account JSON key as credentials, but I am not getting the signed URL. Please check my script below:
provider "google" {
credentials = "account.json"
}
data "google_storage_object_signed_url" "get_url" {
bucket = "my bucket"
path = "new.json"
content_md5 = "pRviqwS4c4OTJRTe03FD1w=="
content_type = "text/plain"
duration = "2h"
credentials = "account.json"
extension_headers = {
x-goog-if-generation-match = 1
}
}
Please let me know what I am doing wrong.

Terraform only prints values that are exposed as outputs. If you want to see the signed URL on the console, add an output block as below:
output "signed_url" {
value = "${data.google_storage_object_signed_url.get_url.signed_url}"
}
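After adding the output, run terraform apply and the signed URL will appear under Outputs; terraform output signed_url then prints just that one value:

terraform apply
terraform output signed_url

Data sources are read during plan/apply, so the output is populated without creating any resources.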

Related

How to retrieve temporary credentials using the REST API or AssumeRole in the AWS SDK; facing these issues with my approach

I've been trying to retrieve temporary credentials using a role ARN, but I'm getting an "EC2 Metadata not found" error from the AWS SDK. Here is my first approach:
AssumeRoleRequest request = new AssumeRoleRequest();
request.RoleArn = "arn:aws:iam::532634566192:role/ap-redshift";
request.RoleSessionName = "newsessionanme";

// Uses the default credential search chain to call STS
var client = new AmazonSecurityTokenServiceClient();
AssumeRoleResponse resp = client.AssumeRole(request);
Console.WriteLine(resp.Credentials);
Console.ReadLine();
Second approach:
// Same call, with the request built inline
var client = new AmazonSecurityTokenServiceClient();
var response = client.AssumeRole(new AssumeRoleRequest
{
    RoleArn = "arn:aws:iam::532634566192:role/ap-redshift",
    RoleSessionName = "newsessionanme"
});
AssumedRoleUser assumedRoleUser = response.AssumedRoleUser;
Credentials credentials = response.Credentials;
This is the error I am getting: "Unable to get IAM security credentials from EC2 Instance Metadata Service."

Is it possible to use 1Password for Terraform provider credentials?

I'm trying to set up a Terraform configuration for Sonatype Nexus (among other things). Rather than providing my passwords directly, I want to pull them from my 1Password system. The advantage of doing this is that this Terraform config will live alongside my broader infrastructure configuration, which includes the setup of the 1Password Connect deployment.
My infrastructure CI/CD therefore already has environment variables set for the 1Password credentials out of necessity, and it would be nice to make those the only variables I need for anything. Hence my attempt to read this password from 1Password.
Below is my Terraform setup. As you can see, it gets the Nexus admin password from 1Password and tries to use it in the provider. However, when I run this Terraform script, it fails with a 401 response from Nexus when trying to create the blobstore.
To be honest, the 1Password Terraform documentation leaves much to be desired. I don't even know if I can configure a provider with data from another provider to begin with.
terraform {
  backend "kubernetes" {
    secret_suffix = "nexus-state"
    config_path   = "~/.kube/config"
  }

  required_providers {
    nexus = {
      source  = "datadrivers/nexus"
      version = "1.21.0"
    }
    onepassword = {
      source  = "1Password/onepassword"
      version = "1.1.4"
    }
  }
}

provider "onepassword" {
  url   = "https://my-1password"
  token = var.onepassword_token
}

data "onepassword_item" "nexus_admin" {
  vault = "VAULT_UUID"
  uuid  = "ITEM_UUID"
}

provider "nexus" {
  insecure = true
  password = data.onepassword_item.nexus_admin.password
  username = "admin"
  url      = "https://my-nexus"
}

resource "nexus_blobstore_file" "npm_private" {
  name = "npm-private"
  path = "/nexus-data/npm-private"
}
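One way to narrow down where the 401 comes from is to check what the 1Password data source actually returns before the Nexus provider consumes it. Below is a minimal debugging sketch, assuming the configuration above; the output name is made up for illustration:

output "nexus_admin_password" {
  # Temporary debug output; the name is illustrative, not from the question
  value     = data.onepassword_item.nexus_admin.password
  sensitive = true
}

Running terraform output nexus_admin_password after an apply prints the value on demand, which helps rule out an empty or mismatched 1Password field before suspecting the provider-to-provider wiring itself.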

Node.js reading a blob with Azure and creating a SAS token

So I am currently writing some code that gets a container, then selects a blob and makes a SAS token. That all currently works, but I get an error when I try to open the link.
The error being given is this:
AuthenticationFailed
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:somethingsomething
The specified signed resource is not allowed for the this resource level
// These names come from the @azure/storage-blob package
const { StorageSharedKeyCredential, BlobSASPermissions, generateBlobSASQueryParameters } = require('@azure/storage-blob');

const test = () => {
  const keyCredit = new StorageSharedKeyCredential('storageaccount', 'key');
  const sasOptions = {
    containerName: 'compliance',
    blobName: 'swo_compliance.csv',
  };
  sasOptions.expiresOn = new Date(new Date().valueOf() + 3600 * 1000);
  sasOptions.permissions = BlobSASPermissions.parse('r');
  const sasToken = generateBlobSASQueryParameters(sasOptions, keyCredit).toString();
  console.log(`SAS token for blob container is: url/?${sasToken}`);
  return `url/?${sasToken}`;
};
I tried to reproduce the scenario in my system and was able to download the blob using the SAS token.
Where you return url/?${sasToken} in your code, remove the "/" and return url?${sasToken} instead, so the token is appended directly to the blob URL.
Example URL: https://StorageName.blob.core.windows.net/test/test.txt?SASToken
With that change I was able to download the blob in my system.

Terraform backend SignatureDoesNotMatch

I'm pretty new to Terraform, but I'm stuck trying to set up a Terraform backend that uses S3.
INIT:
terraform init -backend-config="access_key=XXXXXXX" -backend-config="secret_key=XXXXX"
TERRAFORM BACKEND:
resource "aws_dynamodb_table" "terraform_state_lock" {
name = "terraform-lock"
read_capacity = 5
write_capacity = 5
hash_key = "LockID"
attribute {
name = "LockID"
type = "S"
}
}
resource "aws_s3_bucket" "bucket" {
bucket = "tfbackend"
}
terraform {
backend "s3" {
bucket = "tfbackend"
key = "terraform"
region = "eu-west-1"
dynamodb_table = "terraform-lock"
}
}
ERROR:
Error: error using credentials to get account ID: error calling sts:GetCallerIdentity: SignatureDoesNotMatch: The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Consult the service documentation for details.
status code: 403, request id: xxxx-xxxx
I really am at a loss, because these same credentials are used for my Terraform infrastructure and work perfectly fine there. The IAM user on AWS also has permissions for both DynamoDB & S3.
Am I supposed to tell Terraform to use a different authentication method?
Remove the .terraform/ directory and try again, and really double-check your credentials.
I regenerated the access keys and then it worked.
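A plausible reason removing .terraform/ helps: terraform init caches the backend configuration, including any -backend-config values, inside that directory, so rotated keys can linger until the backend is re-initialized. Assuming the same init flags as in the question, the sequence would be:

rm -rf .terraform
terraform init -reconfigure -backend-config="access_key=XXXXXXX" -backend-config="secret_key=XXXXX"

The -reconfigure flag tells Terraform to ignore the saved backend settings instead of trying to migrate them.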

Create Azure CDN endpoint for Azure container

I need to create an Azure CDN endpoint for an Azure storage container. I am using the code below to do so.
Endpoint endpoint = new Endpoint()
{
    IsHttpAllowed = true,
    IsHttpsAllowed = true,
    Location = this.config.ResourceLocation,
    Origins = new List<DeepCreatedOrigin> { new DeepCreatedOrigin(containerName, string.Format(STORAGE_URL, storageAccountName)) },
    OriginPath = "/" + containerName,
};

await this.cdnManagementClient.Endpoints.CreateAsync(this.config.ResourceGroupName, storageAccountName, containerName, endpoint);
All the information I provide is correct, and the endpoint is created successfully, but when I try to access any blob inside it I get an InvalidUrl error.
The weird thing is that if I create the same endpoint with the same values through the portal, I am able to access and download blobs.
Can anyone let me know what I am doing wrong in my code? Do I need to pass any extra parameters?
As far as I know, if you want to create a storage CDN endpoint in code, you need to set the OriginHostHeader value to your storage account host name.
For more details, you can refer to the code below:
// Create CDN management client
CdnManagementClient cdn = new CdnManagementClient(new TokenCredentials(token))
{ SubscriptionId = subscriptionId };
//ListProfilesAndEndpoints(cdn);

Endpoint e1 = new Endpoint()
{
    // OptimizationType = "storage",
    Origins = new List<DeepCreatedOrigin>() { new DeepCreatedOrigin("{yourstoragename}-blob-core-windows-net", "{yourstoragename}.blob.core.windows.net") },
    OriginHostHeader = "{yourstoragename}.blob.core.windows.net",
    IsHttpAllowed = true,
    IsHttpsAllowed = true,
    OriginPath = @"/foo2",
    Location = "EastAsia"
};

cdn.Endpoints.Create(resourcegroup, profilename, endpointname, e1);
Besides, I suggest you generate a SAS token so that the blob file can be accessed directly by URL.
