'nvablobs' at line '1' and column '784' is not valid - Cisco

Trying to create a Cisco SD-WAN for Azure Virtual WAN deployment but getting the following error:
Deployment template validation failed: 'The provided value for the template parameter 'nvablobs' at line '1' and column '784' is not valid. Length of the value should be greater than or equal to '2'. Please see https://aka.ms/arm-template/#parameters for usage details.'. (Code: InvalidTemplate)
and here's the Raw Error:
{
    "code": "InvalidTemplate",
    "message": "Deployment template validation failed: 'The provided value for the template parameter 'nvablobs' at line '1' and column '784' is not valid. Length of the value should be greater than or equal to '2'. Please see https://aka.ms/arm-template/#parameters for usage details.'."
}
This is the image from the Azure Marketplace: Cisco SD-WAN for Azure Virtual WAN
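For reference, the validation message means the value supplied for nvablobs fails a minLength constraint declared in the template. A minimal sketch of the kind of parameter definition that produces this error; the marketplace template itself is not shown here, so the type is an assumption and the names are illustrative:

{
    "parameters": {
        "nvablobs": {
            "type": "array",
            "minLength": 2
        }
    }
}

If the parameter has that shape, supplying at least two elements (or two characters, for a string parameter) should pass validation.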

Related

Status Message: Invalid value given for parameter Login. Specify a valid parameter value. (Code:InvalidParameterValue)

I get this error while creating a SQL database via ARM templates:
Status Message: Invalid value given for parameter Login. Specify a valid parameter value. (Code:InvalidParameterValue)

Error trying to create an automatic start using the "Start Virtual Machine" template

I've created a template, but when trying to change it or create a new one, I receive the message below:
The template validation failed: 'The value for the workflow parameter 'EmailID' at line '1' and column '5022' is not provided.'.
In my case, entering an email address in the relevant field solved the problem.
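Equivalently, giving the workflow parameter a defaultValue means validation no longer fails when the field is left empty. A minimal sketch, with the parameter name taken from the error above and the type and address purely illustrative:

{
    "parameters": {
        "EmailID": {
            "type": "String",
            "defaultValue": "someone@example.com"
        }
    }
}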

How do I use the "expand" param in the Azure SDK?

Given this method signature:
func (client LoadBalancersClient) Get(ctx context.Context, resourceGroupName string, loadBalancerName string, expand string) (result LoadBalancer, err error)
How does one use the "expand" parameter? There appears to be zero documentation on how to format it and all I'm getting is InvalidExpandQueryOptionValue errors.
lbClient := network.NewLoadBalancersClient(subId)
lbClient.Authorizer = authr
lbResult, err := lbClient.Get(context.TODO(), rgName, lbName, "loadBalancingRules")
if err != nil {
    panic(err)
}
Results in:
panic: network.LoadBalancersClient#Get: Failure responding to request:
StatusCode=400 -- Original Error: autorest/azure: Service returned an
error. Status=400 Code="InvalidExpandQueryOptionValue"
Message="$expand query option value is invalid. The resource does not
have property loadBalancingRules or the property does not represent a
reference to another resource." Details=[]
I've also tried $loadBalancingRules, {$loadBalancingRules}, and LoadBalancingRules.
I was hit by the same problem, but when dealing with VNets, subnets, and NSGs.
What happens is that when you query an object, by default the properties of referenced objects lower in the hierarchy are not fetched. So if I were to get a list of NSGs (Network Security Groups), it would show a list of subnets in the subnets property, which are references to the subnets assigned to that NSG; however, the properties of those subnets, like name, IP address, etc., will be set to None.
To overcome this, when querying for the NSG I used
expand='subnets'
With this set, I can access the properties of the referenced subnets as well.
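The same idea translates to the Go SDK. A minimal sketch, assuming the track-1 SDK (github.com/Azure/azure-sdk-for-go) with environment-based auth; the subscription, resource group, and NSG names are placeholders. Note that, per the error message above, the expand value must name a property that is a reference to another resource, which is why loadBalancingRules (an inline property, not a reference) was rejected.

package main

import (
    "context"
    "fmt"

    "github.com/Azure/azure-sdk-for-go/services/network/mgmt/2019-06-01/network"
    "github.com/Azure/go-autorest/autorest/azure/auth"
)

func main() {
    // Assumption: credentials come from environment variables.
    authorizer, err := auth.NewAuthorizerFromEnvironment()
    if err != nil {
        panic(err)
    }

    nsgClient := network.NewSecurityGroupsClient("<subscription-id>")
    nsgClient.Authorizer = authorizer

    // Pass "subnets" as the expand argument so referenced subnets come
    // back with their full properties instead of as bare references.
    nsg, err := nsgClient.Get(context.TODO(), "my-rg", "my-nsg", "subnets")
    if err != nil {
        panic(err)
    }

    if nsg.Subnets != nil {
        for _, s := range *nsg.Subnets {
            fmt.Println(*s.Name) // populated because of expand=subnets
        }
    }
}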

Azure Data Factory Copy Activity on Failure | Expression not evaluated

I'm trying to run a copy activity in ADF and purposely trying to fail this activity to test my failure logging.
Here is what the pipeline looks like (please note that this copy activity sits inside a "for each" activity and, inside the "for each", an "if condition" activity):
I expect the copy to fail, but not the "LOG FAILURE" stored procedure, since I want to log the copy activity details in a SQL DB table. Here is what the error says:
In the LOG FAILURE activity:
"errorCode": "InvalidTemplate",
"message": "The expression 'activity('INC_COPY_TO_ADL').output.rowsCopied' cannot be evaluated because property 'rowsCopied' doesn't exist, available properties are 'dataWritten, filesWritten, sourcePeakConnections, sinkPeakConnections, copyDuration, errors, effectiveIntegrationRuntime, usedDataIntegrationUnits, billingReference, usedParallelCopies, executionDetails, dataConsistencyVerification, durationInQueue'.",
"failureType": "UserError",
"target": "LOG_FAILURE"
In the copy activity INC_COPY_TO_ADL (this is expected, since the SQL query is wrong):
"errorCode": "2200",
"message": "Failure happened on 'Source' side. ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed with the following error: 'Invalid object name 'dbo.CustCustomerV3Staging123'.',Source=,''Type=System.Data.SqlClient.SqlException,Message=Invalid object name 'dbo.CustCustomerV3Staging123'.,Source=.Net SqlClient Data Provider,SqlErrorNumber=208,Class=16,ErrorCode=-2146232060,State=1,Errors=[{Class=16,Number=208,State=1,Message=Invalid object name 'dbo.CustCustomerV3Staging123'.,},],'",
"failureType": "UserError",
"target": "INC_COPY_TO_ADL"
I wonder why the LOG FAILURE activity failed (i.e., why was the expression not evaluated)? Please note that when the copy activity is correct, the "LOG SUCCESS" stored procedure works okay.
Many thanks.
RA
@rizal activity('INC_COPY_TO_ADL').output.rowsCopied is not part of the copy activity's output in case of failure. Try setting a default value for LOG_FAILURE, in this case -1, and keep LOG_SUCCESS as it is.
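One way to implement that default, assuming ADF's null-safe ? operator and coalesce() function (the activity name is taken from the pipeline above), is an expression like:

@coalesce(activity('INC_COPY_TO_ADL').output?.rowsCopied, -1)

This passes -1 to the logging procedure when the copy fails before reporting rowsCopied, and the real row count otherwise.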

Azure: How to write a path to get a file from a time-series-partitioned folder using Azure Logic Apps

I am trying to retrieve a CSV file from Azure Blob storage using Logic Apps.
I set the Azure Storage Explorer path in the parameters, and in the Get blob content action I use that parameter.
In the Parameters I have set the value as:
concat('Directory1/','Year=',string(int(substring(utcNow(),0,4))),'/Month=',string(int(substring(utcnow(),5,2))),'/Day=',string(int(substring(utcnow(),8,2))),'/myfile.csv')
So at run time this path should resolve to:
Directory1/Year=2019/Month=12/Day=30/myfile.csv
but during execution the action fails with the following error message:
{
    "status": 400,
    "message": "The specifed resource name contains invalid characters.\r\nclientRequestId: 1e2791be-8efd-413d-831e-7e2cd89278ba",
    "error": {
        "message": "The specifed resource name contains invalid characters."
    },
    "source": "azureblob-we.azconn-we-01.p.azurewebsites.net"
}
So my question is: how do I write the path to get data from the time-series-partitioned folder?
Joy Wang's response was partially correct.
Parameters in Logic Apps treat values as plain strings and will not evaluate functions such as concat().
The correct way to use the concat() function is in an expression.
And my solution to the problem is:
concat('container1/','Directory1/','Year=',string(int(substring(utcNow(),0,4))),'/Month=',string(int(substring(utcnow(),5,2))),'/Day=',string(int(substring(utcnow(),8,2))),'/myfile.csv')
You should not use that in the parameters. When you put the line concat('Directory1/','Year=',string(int(substring(utcNow(),0,4))),'/Month=',string(int(substring(utcnow(),5,2))),'/Day=',string(int(substring(utcnow(),8,2))),'/myfile.csv') in the parameters, its type is String; it will be treated as a literal string by the Logic App, and the function will not take effect.
You also need to include the container name in the concat(), and there is no need to use string(int()), because utcNow() and substring() both return a String.
To fix the issue, use the line below directly in the Blob option (my container name is container1):
concat('container1/','Directory1/','Year=',substring(utcNow(),0,4),'/Month=',substring(utcnow(),5,2),'/Day=',substring(utcnow(),8,2),'/myfile.csv')
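At run time (on 2019-12-30, as in the question) this resolves to:
container1/Directory1/Year=2019/Month=12/Day=30/myfile.csv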
Update:
As mentioned in Stark's answer, if you want to drop the leading 0, you can convert the substring from string to int and then back to string:
concat('container1/','Directory1/','Year=',string(int(substring(utcNow(),0,4))),'/Month=',string(int(substring(utcnow(),5,2))),'/Day=',string(int(substring(utcnow(),8,2))),'/myfile.csv')
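For example, on 2019-01-05 this yields Year=2019/Month=1/Day=5 rather than Month=01/Day=05.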
