ScriptError using Google Apps Script Execution API - node.js

Following these guides https://developers.google.com/apps-script/guides/rest/quickstart/target-script and https://developers.google.com/apps-script/guides/rest/quickstart/nodejs, I am trying to use the Execution API in Node to return some data from a Google Spreadsheet.
I have set the script ID to be the Project Key of the Apps Script file. I have also verified that running the function in the Script Editor works successfully.
However, when running the script locally with node, I get this error:
The API returned an error: Error: ScriptError
I have also made sure the script is associated with the project that I use to authenticate with the Google APIs.
Does anyone have any suggestions on what I can do to debug or fix this issue? The error is so generic that I am not sure where to look.
UPDATE: I've included a copy of the code in this JSBin (the year function is the entry point)
https://jsbin.com/zanefitasi/edit?js
UPDATE 2: The error seems to be caused by the inclusion of this line
var spreadsheet = SpreadsheetApp.open(DriveApp.getFileById(docID));

It seems that I didn't request the right scopes. The Node.js example includes 'https://www.googleapis.com/auth/drive', but I also needed to include 'https://www.googleapis.com/auth/spreadsheets' in the SCOPES array. The ScriptError message is not very informative here.
To find which scopes you need, go to the Script Editor > File > Project properties > Scopes. Remember to delete the old credentials file (~/.credentials/old-credential.json) so that the script will request a new one.
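For reference, with that change the SCOPES array from the Node.js quickstart ends up looking something like this (only the second URL is new):
// SCOPES used when requesting the OAuth token in the Node.js quickstart code.
var SCOPES = [
  'https://www.googleapis.com/auth/drive',
  'https://www.googleapis.com/auth/spreadsheets'
];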

EDIT: With the updated information I took a closer look and saw that you are returning a non-basic type. Specifically, you are returning a Sheet object.
The basic types in Apps Script are similar to the basic types in JavaScript: strings, arrays, objects, numbers and booleans. The Execution API can only take and return values corresponding to these basic types -- more complex Apps Script objects (like a Document or Sheet) cannot be passed by the API.
https://developers.google.com/apps-script/guides/rest/api
In your Account "class", the offending line is:
this.report = spreadsheet.getSheetByName(data.reportSheet);
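A minimal sketch of a fix (the function and sheet names here are placeholders, not your actual code): return plain values from the sheet rather than the Sheet object itself.
function getReportValues(docID) {
  var spreadsheet = SpreadsheetApp.open(DriveApp.getFileById(docID));
  var report = spreadsheet.getSheetByName('Report'); // placeholder sheet name
  // getValues() returns a 2-D array of plain cell values (strings, numbers,
  // booleans), which the Execution API can serialize back to the Node.js caller.
  return report.getDataRange().getValues();
}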
Old answer:
'data.business_exp' will be null in this context. You need to load the data from somewhere. Every time the script is called, a new instance of the script is created, and at the end of the execution chain it is destroyed. Any data stored in global objects will be lost. You need to save that data to a permanent location, such as the script/user properties, and reload it on each execution.
https://developers.google.com/apps-script/reference/properties/
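A minimal sketch of that pattern, assuming the data is JSON-serializable (the property key is just an example):
function saveBusinessExp(value) {
  // Script properties persist across executions; user properties work the same way.
  PropertiesService.getScriptProperties()
      .setProperty('business_exp', JSON.stringify(value));
}

function loadBusinessExp() {
  var stored = PropertiesService.getScriptProperties().getProperty('business_exp');
  return stored ? JSON.parse(stored) : null;
}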

Related

Problem accessing a dictionary value on Dialogflow fulfillment

I am using the fulfillment section on Dialogflow in a fairly basic program I have started, to show myself I can do a bigger project on Dialogflow.
I have an object set up that is a dictionary.
I can make the keys a constant through
const KEYS=Object.keys(overflow);
I am going through the values using:
if (KEYS.length > 0) {
    var dictionary = overflow[KEYS[i]]
if I stringify dictionary using
JSON.stringify(item);
I get:
{"stack":"overflow","stack2":"overflowtoo", "stack3":3}
This leads me to believe I am actually facing a dictionary, hence the name of the variable.
I am accessing a string value such as stack, as opposed to stack3.
Everything I have read online tells me
dictionary.stack
should work, since
JSON.stringify(item);
shows me:
{"stack":"overflow","stack2":"overflowtoo","stack3":3}
Whenever I try to add the variable to the response string, or append it to a string using
output += `${item.tipo}`;
I get an error that the function crashed. I can replace it with the stringify line and it works, giving me the JSON shown above, so the issue isn't there.
Dictionary values are created as follows before being accessed in another function:
dictionary[request.body.responseId] = {
    "stack": "overflow",
    "stack2": "overflowtoo",
    "stack3": 3
};
Based on the suggestion here, I saw that the properties were being accessed properly but their type was undefined. Going over things repeatedly, I saw that the properties were defined as lists rather than single values.
Dot notation works once they stop being lists.
Thanks for guiding me towards the problem.
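For anyone hitting the same thing, a small hypothetical illustration of the difference (the real objects aren't shown in full in the question):
// When each value is stored as a one-element list, dot notation returns the
// array rather than the plain value.
var asLists = { "stack": ["overflow"], "stack2": ["overflowtoo"], "stack3": [3] };
console.log(asLists.stack);     // [ 'overflow' ]
console.log(asLists.stack[0]);  // 'overflow'

// Once the values are stored as single values, plain dot notation works.
var asValues = { "stack": "overflow", "stack2": "overflowtoo", "stack3": 3 };
console.log(asValues.stack);    // 'overflow'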

jOOQ database/schema name mapping

I use jOOQ to generate objects against a local database, but when running "for real" later in production the actual databases will have different names. To remedy this I use the <outputSchemaToDefault>true</outputSchemaToDefault> config option (Maven).
At the same time, we have multiple databases (schemas), and are using a connection pool to the server like "jdbc:mysql://localhost:3306/" (without specifying a database here).
How do I tell jooq which database to use when running queries?
I have tried all config I can think of:
new Settings()
    .withRenderSchema(true)   // true/false seems to make no difference.
    .withRenderCatalog(true)  // true/false seems to make no difference.
    .withRenderMapping(new RenderMapping()
        .withDefaultSchema("my_database") // Seems to have no effect.
        // The above 3 configs always give me an error saying "no database selected".
        // Adding this gives me 'my_database.my_table' does not exist - while it actually does.
        .withSchemata(new MappedSchema()
            .withInputExpression(Pattern.compile(".*"))
            .withOutput("my_database")
        ));
I have also tried keeping the database/schema name (i.e. not configuring outputSchemaToDefault) and adding the MappedSchema code above, but that gives me errors like "'my_databasemy_database.my_table' does not exist", which is correct, since that doubled name does not exist. I have no clue why that configuration gives me the database/schema name twice.
Edit:
When jOOQ tells me that the db.table does not exist, if I put a breakpoint in a good place, get the SQL from jOOQ and run exactly that against my database, it does work. But jOOQ fails to run it.
Also, I'm using version 3.15.3 of jOOQ.
I solved it. Instead of using .withInputExpression(Pattern.compile(".*")), it works with .withInput("").
I'm still not sure why it works, or if this is the "correct" way of solving it, but at least it is a way forward.
I still have no clue why I got the name twice when using the pattern, but I'll leave that one alone.
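For reference, a sketch of the configuration that ended up working, assuming MySQL and using "my_database" as a placeholder for the real schema name (connection is the pooled JDBC connection):
// With <outputSchemaToDefault>true</outputSchemaToDefault> the generated code
// carries no schema, so the mapping matches the empty input schema and renders
// the real one at runtime.
Settings settings = new Settings()
        .withRenderMapping(new RenderMapping()
                .withSchemata(new MappedSchema()
                        .withInput("")
                        .withOutput("my_database")));

DSLContext ctx = DSL.using(connection, SQLDialect.MYSQL, settings);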

Issue with validating an ARM template using DeploymentsOperations.StartValidate

I am currently working on a project where I deploy multiple ARM templates, each deploying a VM and doing a few operations on it. I wanted to handle quota issues by calling template validation before triggering the first deployment. So I created a template which has the logic to create the required VMs, and I am using this template only for validation (to check that the quota will not be exceeded).
Since our code already has the ResourceManagementClient, I tried the following code:
Deployment parameters = new Deployment(
    new DeploymentProperties(DeploymentMode.Incremental)
    {
        Template = templateFile,
        Parameters = parameterFile,
    });
DeploymentsValidateOperation dp = deployments.StartValidate(groupName, "validation", parameters);
But when I try to access the Value from the variable dp, I keep getting the following exception:
Generic Exception System.InvalidOperationException: The operation has not completed yet.
   at Azure.Core.ArmOperationHelpers`1.get_Value()
   at Azure.ResourceManager.Resources.DeploymentsValidateOperation.get_Value()
   at DeployTemplate.Program.d__3.MoveNext() in \Program.cs:line 88
I even added a loop after StartValidate to wait until dp.HasCompleted is set to true, but this seems to run indefinitely. I also tried the StartValidateAsync method, which seems to have the same issue.
I wanted to understand whether I am using this method correctly, and whether there is a better way to do the template validations. I could not find any examples of this method's usage. If possible, please share any code snippet where this method is used, for my reference.
Note: Since this is not working, I am currently testing the Fluent API way instead. That seems to be working, but it requires a lot of changes in our code, as it creates ambiguity with many classes in "Azure.ResourceManager.Resources" which are already used for other operations.
I found that even though the deployment operation's HasCompleted field is not set, calling dp.GetRawResponse() returns the exact errors expected.
I now use this to validate my templates.
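In case it helps someone else, a rough sketch of what that looks like with the same preview Azure.ResourceManager.Resources SDK as in the question (groupName, templateFile and parameterFile are the question's placeholders):
var parameters = new Deployment(
    new DeploymentProperties(DeploymentMode.Incremental)
    {
        Template = templateFile,
        Parameters = parameterFile,
    });

DeploymentsValidateOperation dp = deployments.StartValidate(groupName, "validation", parameters);

// Even while dp.HasCompleted stays false and dp.Value throws, the raw HTTP
// response already carries the validation result (including quota errors).
Azure.Response raw = dp.GetRawResponse();
Console.WriteLine($"Validation returned HTTP {raw.Status}");
if (raw.ContentStream != null)
{
    using (var reader = new System.IO.StreamReader(raw.ContentStream))
    {
        Console.WriteLine(reader.ReadToEnd());
    }
}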

SOAP UI how to configure a PUT request body programmatically

I'm configuring some requests programmatically in my test cases. I can set headers, custom properties, teardown scripts, etc.; however, I can't find how to set a standard JSON body for my PUT requests.
Is there any possibility to do this from the restMethod class?
So far I end up getting the method used:
restService = testRunner.testCase.testSuite.project.getInterfaceAt(0)
resource = restService.getOperationByName(resource_name)
request = resource.getRequestAt(0)
httpMethod = request.getMethod()
if (httpMethod.toString().equals("PUT"))
but then I'm stuck trying to find how to set a standard body for my PUT requests.
I tried the getRequestParts() method, but it didn't give me what I expected...
Can anyone help, please?
Thank you,
Alexandre
I've managed this. I had a set of tests where I wanted to squirt the content of interest into the "bare bones" request, the idea being that I can wrap this in a data-driven test. Then, for each row in my data spreadsheet, I pull in the request body for my test. At first I simply pulled the request from a data source value in my spreadsheet, but this became unmanageable.
So, another tactic. In my test data sheet (data source) I stored the file name that contains the payload I want to squirt in.
In the test itself, I put a Groovy step immediately before the step I want to push the payload into.
The Groovy script uses the data source to first get the file name containing the payload, then reads the contents of that file.
In the step I want to push the data into, I just use a get from data, e.g. ${groovyStep#result}.
If this doesn't completely make sense, let me know and I'll update with a screenshot when I have access to SoapUI.
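A rough sketch of that Groovy step (the step and property names are examples, not from a real project):
// Get the payload file name from the data source step, then read the file.
// "DataSource" and "payloadFile" are example names - use your own.
def fileName = context.expand('${DataSource#payloadFile}')
def payload = new File(fileName).text

// The returned value becomes available to later steps via property
// expansion, e.g. ${Groovy Script#result}.
return payload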

Incremental loading in Azure Mobile Services

Given the following code:
listView.ItemsSource =
App.azureClient.GetTable<SomeTable>().ToIncrementalLoadingCollection();
We get incremental loading without further changes.
But what if we modify the read.js server side script to e.g. use mssql to query another table instead. What happens to the incremental loading? I'm assuming it breaks; if so, what's needed to support it again?
And what if the query used the untyped version instead, e.g.
App.azureClient.GetTable("SomeTable").ReadAsync(...)
Could incremental loading be somehow supported in this case, or must it be done "by hand" somehow?
Bonus points for insights on how Azure Mobile Services implements incremental loading between the server and the client.
The incremental loading collection works by sending the $top and $skip query parameters (those are also sent when you do a query by using the .Take and .Skip methods in the table). So if you want to modify the read script to do something other than the default behavior, while still maintaining the ability to use that table with an incremental loading collection, you need to take those values into account.
To do that, you can ask for the query components, which will contain the values, as shown below:
function read(query, user, request) {
    var queryComponents = query.getComponents();
    console.log('query components: ', queryComponents); // useful to see all information
    var top = queryComponents.take;
    var skip = queryComponents.skip;
    // do whatever you want with those values, then call request.respond(...)
}
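For example, a sketch of feeding those values into an mssql query against another table (the table and column names are made up):
function read(query, user, request) {
    var queryComponents = query.getComponents();
    var top = queryComponents.take || 50;  // fall back to a page size if the client sent none
    var skip = queryComponents.skip || 0;

    mssql.query(
        'SELECT * FROM OtherTable ORDER BY id OFFSET ? ROWS FETCH NEXT ? ROWS ONLY',
        [skip, top],
        {
            success: function (results) {
                request.respond(statusCodes.OK, results);
            }
        });
}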
The way it's implemented at the client is by using a class which implements the ISupportIncrementalLoading interface. You can see it (and the full source code for the client SDKs) in the GitHub repository, or more specifically the MobileServiceIncrementalLoadingCollection class (the method is added as an extension in the MobileServiceIncrementalLoadingCollectionExtensions class).
And the untyped table does not have that method - as you can see in the extension class, it's only added to the typed version of the table.
