Escaping Dollar Sign Inside Snowflake JavaScript Stored Procedure within Terraform

I am trying to incorporate a stored procedure written in JavaScript into Terraform for Snowflake. When I tried to apply the script as it was developed, I got the error below:
A reference to a resource type must be followed by at least one attribute access, specifying the resource name
Based on the line numbers in the error message, Terraform does not like the dollar sign, so it seems it needs to be escaped. Examples of such unaltered lines are below:
if (rowCount == 0) return `Error: Script with SCRIPT_TYPE = ${SCRIPT_TYPE} and ACCES_TYPE = ${ACCES_TYPE} does not exist.`;
var sql = `select PARAMETER_NAMES, TEMPLATE from administration.utils.SCRIPT_TEMPLATE where SCRIPT_TYPE = ''${SCRIPT_TYPE}'' AND ACCES_TYPE = ''${ACCES_TYPE}''`
What I am after is how to escape it, with the escaping incorporated via the replace function in the resource "snowflake_procedure" creation below, so that future changes to the logic or the introduction of new procedures do not have to be altered manually. My attempt was to use '\$' for escaping in the function, but it was not successful:
resource "snowflake_procedure" "GENERATE_SCRIPT_FROM_TEMPLATE" {
name = "GENERATE_SCRIPT_FROM_TEMPLATE"
database = "ADMINISTRATION"
schema = "UTILS"
language = "JAVASCRIPT"
arguments {
SCRIPT_TYPE = "arg1"
type = "VARCHAR(250)"
}
arguments {
ACCES_TYPE = "arg2"
type = "VARCHAR(250)"
}
arguments {
PARAMETER_VALUES = "arg3"
type = "VARCHAR(5000)"
}
return_type = "VARCHAR"
execute_as = "OWNER"
statement = replace(
<<EOT
try
{
var parameterValues = JSON.parse(PARAMETER_VALUES);
}
catch (err) {
return `Failed to parse PARAMETER_VALUES: ${PARAMETER_VALUES}. Correct format is: {"DATABASE": "ADMINISTRATOR", "SCHEMA": "UTILS"}.`;
}
var sql = `select PARAMETER_NAMES, TEMPLATE from administration.utils.SCRIPT_TEMPLATE where SCRIPT_TYPE = ''${SCRIPT_TYPE}'' AND ACCES_TYPE = ''${ACCES_TYPE}''`
var stmt = snowflake.createStatement({ sqlText: sql });
var result = stmt.execute();
var rowCount = result.getRowCount();
if (rowCount == 0) return `Error: Script with SCRIPT_TYPE = ${SCRIPT_TYPE} and ACCES_TYPE = ${ACCES_TYPE} does not exist.`;
result.next();
var parameterNames = result.getColumnValue(1);
var scriptTemplate = result.getColumnValue(2);
var parameterNamesArray = parameterNames.split('','');
parameterNamesArray.forEach(parameterName => {
if (!parameterValues[parameterName]) return `Failed: Cannot find parameter ${parameterName} in PARAMETER_VALUES: ${PARAMETER_VALUES}.`
});
var oldStrimg = '''';
var newString = '''';
var script = scriptTemplate;
parameterNamesArray.forEach(parameterName => {
oldStrimg = `<${parameterName}>`;
newString = parameterValues[parameterName];
script = script.replace(oldStrimg,newString);
});
return script;
EOT
, "$", "'\$'")
}

What I did to escape $$ was to use another sign as a placeholder and replace it afterwards. For example (Snowflake SQL scripting):
let q := $$ ... ## something ## $$
q := replace(:q, '##', '$$')
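Applied to the Terraform resource above, the same placeholder idea might look like the sketch below. Note also that Terraform heredocs support the built-in escape $${, which renders a literal ${, so you could simply write $${SCRIPT_TYPE} in the body; the replace variant only needs a token that never occurs in the JavaScript source (#{ below is an arbitrary choice of mine):
resource "snowflake_procedure" "GENERATE_SCRIPT_FROM_TEMPLATE" {
  # ... name, database, schema, arguments and return_type as above ...
  statement = replace(
    <<-EOT
    if (rowCount == 0) return `Error: Script with SCRIPT_TYPE = #{SCRIPT_TYPE} does not exist.`;
    EOT
  , "#{", "$${")
}
After the replace runs, every #{ becomes a literal ${, so the JavaScript template literals reach Snowflake intact while Terraform never sees an unescaped interpolation sequence.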

Related

Terraform, get value from map

Please advise how I can separately get a value for the key paswd-0. I mean, I need separate values for password and username.
This is remote data from data.terraform_remote_state.user_passwd.outputs.login_passwd
output = {
  paswd-0 = jsonencode(
    {
      password = "uGo="
      username = "git"
    }
  )
  paswd-1 = jsonencode(
    {
      password = "wM="
      username = "kun"
    }
  )
}
I'm trying this and get the error: lookup() requires a map as the first parameter.
output "tetts" {
value = lookup(tomap(data.terraform_remote_state.user_passwd.outputs.login_passwd.paswd-0), "password", null)
}
Ideally I would go through each value and fill in these fields:
argocd_repositories = {
  "private-repo" = {
    url      = "https://repo.git"
    username = "argocd"
    password = "access_token"
  },
  "git-repo" = {
    url      = "https://repo.git"
    password = "argocd_access_token"
    username = "admin"
  },
  "private-helm-chart" = {
    url      = "https://charts.jetstack.io"
    type     = "helm"
    username = "foo"
    password = "bar"
  },
}
As per my comment, you can get the value from the data source by using the jsondecode built-in function [1]. You would have to update the output to look like the following:
output "tetts" {
value = lookup(tomap(jsondecode(data.terraform_remote_state.user_passwd.outputs.login_passwd["paswd-0"]), "password", null)
}
This is only to make it work as you intended it to. However, it will output only the value for the password. Since I do not have the remote state, I managed to get close to what you want with locals and the following:
locals {
  output = {
    paswd-0 = jsonencode(
      {
        password = "uGo="
        username = "git"
      }
    )
    paswd-1 = jsonencode(
      {
        password = "wM="
        username = "kun"
      }
    )
  }

  sorted_values = { for k, v in local.output : jsondecode(v).username => jsondecode(v).password }
}
Note that jsondecode is used on the values of the original map. Furthermore, since the JSON-decoded values are also in a key-value format, you can access the keys and corresponding values using the usual Terraform notation (i.e., jsondecode(v).username and jsondecode(v).password). In terraform console, the local sorted_values variable looks like this:
> local.sorted_values
{
  "git" = "uGo="
  "kun" = "wM="
}
I guess this is close to what you wanted to achieve with the tomap function.
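If the end goal is the argocd_repositories structure, a for expression over the decoded values gets you most of the way there. A sketch (the url value is hypothetical, since it is not part of the remote state output):
locals {
  argocd_repositories = {
    for k, v in local.output : k => {
      url      = "https://repo.git" # hypothetical; not in the remote state
      username = jsondecode(v).username
      password = jsondecode(v).password
    }
  }
}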
[1] https://www.terraform.io/language/functions/jsondecode

How to change code to prevent SQL injection in TypeORM

I am writing code using NestJS and TypeORM.
However, the way I write queries is vulnerable to SQL injection, so I am changing the code.
//Before
.where(`userId = ${userId}`)
//After
.where(`userId = :userId` , {userId:userId})
I am asking because, while changing the code, I couldn't find a way to rewrite a few of the cases.
//CASE1
const query1 = `select id, 'normal' as type from user where id = ${userId}`;
const query2 = `select id, 'doctor' as type from doctor where id = ${userId}`;
const finalQuery = await getConnection().query(`select id, type from (${query1} union ${query2}) as f limit ${limit} offset ${offset};`);
//CASE2
...
.addSelect(`CASE WHEN userRole = '${userRole}' THEN ...`, 'userType')
...
//CASE3 -> to find search results in order of accuracy
...
.orderBy(`((LENGTH(user.name) - LENGTH((REPLACE(user.name, '${keyword.replace(/ /g, '')}', '')))) / LENGTH('${keyword.replace(/ /g, '')}'))`, 'ASC')
...
//CASE4
let query = 'xxxx';
let whereQuery = '';
for (let i = 0; i < 5; i++)
  whereQuery += ' or ' + `user.name like '%${keyword}%'`;
query.where(whereQuery);
I cannot use parameters in the select function.
For the cases above, I am wondering how to change them.
Is it OK not to modify the select code?
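A hedged sketch of how these cases might be parameterized (assumptions: a MySQL driver, where raw-query placeholders are ?; getConnection().query() takes a parameters array as its second argument; and named parameters set with setParameter() are substituted into the final SQL, so they also work inside addSelect and orderBy expressions):
// CASE1: let the driver bind the values instead of interpolating them
const finalQuery = await getConnection().query(
  `select id, type from (
    select id, 'normal' as type from user where id = ?
    union
    select id, 'doctor' as type from doctor where id = ?
  ) as f limit ? offset ?;`,
  [userId, userId, limit, offset],
);

// CASE2: a named parameter inside a select expression
.addSelect(`CASE WHEN userRole = :userRole THEN ... END`, 'userType')
.setParameter('userRole', userRole)

// CASE4: one named parameter per LIKE clause
const clauses: string[] = [];
const params: Record<string, string> = {};
for (let i = 0; i < 5; i++) {
  clauses.push(`user.name like :kw${i}`);
  params[`kw${i}`] = `%${keyword}%`;
}
query.where(clauses.join(' or '), params);
CASE3 can be treated like CASE2: compute keyword.replace(/ /g, '') in JavaScript first, then bind the result as a named parameter inside the orderBy expression.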

Dynamic JSON in Terraform

I am using Terraform to invoke a Lambda function, and I need to pass an input JSON which includes a list of string values.
data "aws_lambda_invocation" "invo6" {
function_name = "my_function"
input = <<JSON
{
"pairs":[
{
"principal":"arn:aws:iam::12345678901:role/myRole",
"databases":[
"my_db_apple", "my_db_banana", "my_db_orange"
]
}
]
}
JSON
}
Instead of hard-coding these database names, I want to pull them in from a map that already exists elsewhere in my tf files.
variable "gluedb_map" {
  type = map(map(string))
  default = {
    "apple" = {
      description  = "my apple db"
      catalog      = ""
      location_uri = "s3://mybucket/"
      params       = ""
    }
    "banana" = {
      description  = "my banana db"
      catalog      = ""
      location_uri = "s3://anotherpath/"
      params       = ""
    }
  }
}
I tried swapping out the 'databases' code for this:
input = <<JSON
{
  "pairs": [
    {
      "principal": "arn:aws:iam::12345678901:role/myRole",
      ${jsonencode("databases": [for each in var.gluedb_map : "my_db_${each}"], )}
    }
  ]
}
JSON
but I then get the error:
A comma is required to separate each function argument from the next.
Can anyone spot where I'm going wrong? Thanks
If you're just interested in accessing the keys of the map then you can use the keys function to return a list of keys. You can then combine that with formatlist to interpolate each list item with a string.
I'd also recommend using a HCL map for the wider data structure and then encoding to JSON rather than trying to JSON encode a section of it and having to mangle things to get it in a suitable shape.
A fully worked example then looks something like this:
variable "gluedb_map" {
type = map(map(string))
default = {
"apple" = {
description = "my apple db"
catalog = ""
location_uri = "s3://mybucket/"
params = ""
}
"banana" = {
description = "my banana db"
catalog = ""
location_uri = "s3://anotherpath/"
params = ""
}
}
}
output "json" {
value = jsonencode({
pairs: [
{
principal = "arn:aws:iam::12345678901:role/myRole"
databases = formatlist("my_db_%s", keys(var.gluedb_map))
}
]
})
}
Applying this will output the following:
json = {"pairs":[{"databases":["my_db_apple","my_db_banana"],"principal":"arn:aws:iam::12345678901:role/myRole"}]}
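Wired back into the question's data source, that looks like the following (a sketch reusing the original names):
data "aws_lambda_invocation" "invo6" {
  function_name = "my_function"

  input = jsonencode({
    pairs = [
      {
        principal = "arn:aws:iam::12345678901:role/myRole"
        databases = formatlist("my_db_%s", keys(var.gluedb_map))
      }
    ]
  })
}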
You can also keep the heredoc and interpolate just the encoded list, using keys and formatlist (join is unnecessary once the list itself is JSON-encoded):
"databases": ${jsonencode(formatlist("my_db_%s", keys(var.gluedb_map)))}

Creating a dynamic customer group using SuiteScript

I am trying to create a dynamic customer group using SuiteScript in NetSuite. I am trying the code below but always get:
system INVALID_KEY_OR_REF
Invalid savedsearch reference key 21.
I have checked that it is a valid saved search. Please help; what am I doing wrong?
function createDynamicGroup(savedSearchId, groupName) {
  var saveSearchObj = nlapiLoadSearch('customer', savedSearchId);
  var initValues = new Array();
  initValues.grouptype = 'Customer';
  initValues.dynamic = 'T';
  var goupRecObj = nlapiCreateRecord('entitygroup', initValues);
  goupRecObj.setFieldValue('groupname', groupName);
  goupRecObj.setFieldValue('savedsearch', saveSearchObj.getId());
  nlapiSubmitRecord(goupRecObj);
}
You need group type = 'CustJob' as well as using a public search id:
function createDynamicGroup(savedSearchId, groupName) {
  var saveSearchObj = nlapiLoadSearch('customer', savedSearchId);
  var initValues = {
    grouptype: 'CustJob', // <-- use this
    dynamic: 'T'
  };
  var goupRecObj = nlapiCreateRecord('entitygroup', initValues);
  goupRecObj.setFieldValue('groupname', groupName);
  goupRecObj.setFieldValue('savedsearch', savedSearchId);
  return nlapiSubmitRecord(goupRecObj);
}
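A hypothetical call, assuming '21' is the internal id of a public customer saved search:
// returns the internal id of the newly created entity group
var groupId = createDynamicGroup('21', 'My Dynamic Customer Group');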

Derived column was not mapped to output column. How can I fix this?

The package is generated successfully, but the derived column shows the error below:
Validation error. This is a programmed DataFlowTask Derived Column [2]: Attempt to parse the expression "Empid" failed and returned error code 0xC00470A2. The expression cannot be parsed. It might contain invalid elements or it might not be well-formed. There may also be an out-of-memory error. MyProgrammedDataflowTaskWithDerivedColumn.dtsx
Below is my code:
// Create an application
Application app = new Application();
// Create a package
Package pkg = new Package();
//Setting some properties
pkg.Name = @"MyProgrammedDataflowTaskWithDerivedColumn";
//Adding a connection to the database AdventureWorksLT2008R2
ConnectionManager ConnMgrAdvent = pkg.Connections.Add("OLEDB");
ConnMgrAdvent.ConnectionString = "Data Source=412-1682;Initial Catalog=Empdb;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;";
ConnMgrAdvent.Name = @"AdventureWorks2008R2";
ConnMgrAdvent.Description = @"SSIS Connection Manager for OLEDB Source";
//Adding a connection to the database Import_DB
ConnectionManager ConnMgrImport_DB = pkg.Connections.Add("OLEDB");
ConnMgrImport_DB.ConnectionString = "Data Source=412-1682;Initial Catalog=stgEmpdb;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;";
ConnMgrImport_DB.Name = @"Import_DB";
ConnMgrImport_DB.Description = @"SSIS Connection Manager for OLEDB Source";
//Adding the dataflow task to the package
Executable exe = pkg.Executables.Add("STOCK:PipelineTask");
TaskHost TKHSQLHost = (TaskHost)exe;
TKHSQLHost.Name = "This is a programmed DataFlowTask";
MainPipe dataFlowTask = (MainPipe)TKHSQLHost.InnerObject;
// Create the source component.
IDTSComponentMetaData100 source =
dataFlowTask.ComponentMetaDataCollection.New();
source.ComponentClassID = "DTSAdapter.OleDbSource.3";
CManagedComponentWrapper srcDesignTime = source.Instantiate();
srcDesignTime.ProvideComponentProperties();
// Assign the connection manager.
if (source.RuntimeConnectionCollection.Count > 0)
{
source.RuntimeConnectionCollection[0].ConnectionManager =DtsConvert.GetExtendedInterface(ConnMgrAdvent);
source.RuntimeConnectionCollection[0].ConnectionManagerID =
pkg.Connections["AdventureWorks2008R2"].ID;
}
// Set the custom properties of the source.
srcDesignTime.SetComponentProperty("AccessMode", 0);
srcDesignTime.SetComponentProperty("OpenRowset", "[dbo].[emp1]");
// Connect to the data source, and then update the metadata for the source.
srcDesignTime.AcquireConnections(null);
srcDesignTime.ReinitializeMetaData();
srcDesignTime.ReleaseConnections();
// Create the destination component.
IDTSComponentMetaData100 destination =
dataFlowTask.ComponentMetaDataCollection.New();
destination.ComponentClassID = "DTSAdapter.OleDbDestination.3";
CManagedComponentWrapper destDesignTime = destination.Instantiate();
destDesignTime.ProvideComponentProperties();
// Assign the connection manager.
destination.RuntimeConnectionCollection[0].ConnectionManager =
DtsConvert.GetExtendedInterface(ConnMgrImport_DB);
if (destination.RuntimeConnectionCollection.Count > 0)
{
destination.RuntimeConnectionCollection[0].ConnectionManager =
DtsConvert.GetExtendedInterface(ConnMgrImport_DB);
destination.RuntimeConnectionCollection[0].ConnectionManagerID =
pkg.Connections["Import_DB"].ID;
}
// Set the custom properties of the destination
destDesignTime.SetComponentProperty("AccessMode", 0);
destDesignTime.SetComponentProperty("OpenRowset", "[dbo].[emp1]");
// Connect to the data source, and then update the metadata for the source.
destDesignTime.AcquireConnections(null);
destDesignTime.ReinitializeMetaData();
destDesignTime.ReleaseConnections();
//Derived Column
IDTSComponentMetaData100 derived =
dataFlowTask.ComponentMetaDataCollection.New();
derived.Name = "Derived Column Component";
derived.ComponentClassID = "DTSTransform.DerivedColumn.3";
CManagedComponentWrapper DesignDerivedColumns = derived.Instantiate();
DesignDerivedColumns.ProvideComponentProperties(); //design time
derived.InputCollection[0].ExternalMetadataColumnCollection.IsUsed = false;
derived.InputCollection[0].HasSideEffects = false;
//update the metadata for the derived columns
DesignDerivedColumns.AcquireConnections(null);
DesignDerivedColumns.ReinitializeMetaData();
DesignDerivedColumns.ReleaseConnections();
//Create the path from source to derived columns
IDTSPath100 SourceToDerivedPath = dataFlowTask.PathCollection.New();
SourceToDerivedPath.AttachPathAndPropagateNotifications(source.OutputCollection[0],derived.InputCollection[0]);
//Create the path from derived to desitination
IDTSPath100 DerivedToDestinationPath = dataFlowTask.PathCollection.New();
DerivedToDestinationPath.AttachPathAndPropagateNotifications(derived.OutputCollection[0], destination.InputCollection[0]);
// derivedColumns.SetUsageType(dInput.ID, vdInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
//Give me an output column
IDTSInput100 dInput;
IDTSVirtualInput100 vdInput;
//Get this components default input and virtual input
dInput = derived.InputCollection[0];
vdInput = dInput.GetVirtualInput();
IDTSOutputColumn100 myCol = derived.OutputCollection[0].OutputColumnCollection.New();
myCol.Name = "RowKey";
myCol.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_I4, 0, 0, 0, 0);
myCol.ExternalMetadataColumnID = 0;
myCol.ErrorRowDisposition = DTSRowDisposition.RD_FailComponent;
myCol.TruncationRowDisposition = DTSRowDisposition.RD_FailComponent;
IDTSCustomProperty100 myProp = myCol.CustomPropertyCollection.New();
myProp.Name = "Expression";
myProp.Value = "Empid";// + vColumn.LineageID;
myProp = myCol.CustomPropertyCollection.New();
myProp.Name = "FriendlyExpression";
myProp.Value = "Empid";
//Create the input columns for the transformation component
IDTSInput100 input = derived.InputCollection[0];
IDTSVirtualInput100 derivedInputVirtual = input.GetVirtualInput();
input.ErrorRowDisposition = DTSRowDisposition.RD_NotUsed;
input.ErrorOrTruncationOperation = "";
DesignDerivedColumns.ReleaseConnections();
// Get the destination's default input and virtual input.
IDTSInput100 destinationinput = destination.InputCollection[0];
int destinationInputID = input.ID;
IDTSVirtualInput100 vdestinationinput = destinationinput.GetVirtualInput();
//Iterate through the virtual input column collection.
foreach (IDTSVirtualInputColumn100 vColumn in vdestinationinput.VirtualInputColumnCollection)
{
IDTSInputColumn100 vCol = destDesignTime.SetUsageType(destinationinput.ID, vdestinationinput, vColumn.LineageID, DTSUsageType.UT_READWRITE);
destDesignTime.MapInputColumn(destinationinput.ID, vCol.ID, destinationinput.ExternalMetadataColumnCollection[vColumn.Name].ID);
}
app.SaveToXml(String.Format(@"D:\{0}.dtsx", pkg.Name), pkg, null);
Solved this by adding the code below, which selects the Empid input column for the derived column component:
IDTSInput100 DerivedColumnInput = derived.InputCollection[0];
IDTSVirtualInput100 DerivedColumnVirtualInput = DerivedColumnInput.GetVirtualInput();
IDTSVirtualInputColumnCollection100 DerivedColumnVirtualInputColumns = DerivedColumnVirtualInput.VirtualInputColumnCollection;
foreach (IDTSVirtualInputColumn100 virtualInputColumnDT in DerivedColumnVirtualInputColumns)
{
    // Select the column, and retain the new input column
    if (virtualInputColumnDT.Name == "Empid")
    {
        DesignDerivedColumns.SetUsageType(DerivedColumnInput.ID, DerivedColumnVirtualInput, virtualInputColumnDT.LineageID, DTSUsageType.UT_READONLY);
    }
}
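As an aside, the persisted Expression property normally references upstream columns by lineage ID rather than by name (which is what the commented-out + vColumn.LineageID in the original code hints at), while FriendlyExpression keeps the readable form. A sketch, assuming vColumn is the virtual input column for Empid:
// Expression stores column references as #<lineageID>;
// FriendlyExpression keeps the human-readable name.
myProp.Name = "Expression";
myProp.Value = "#" + vColumn.LineageID; // e.g. "#123"
myProp = myCol.CustomPropertyCollection.New();
myProp.Name = "FriendlyExpression";
myProp.Value = "Empid";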
