Office Script keeps returning "Failed to fetch" error; the code works in Google Sheets and Python - office-scripts

Trying to run the following code in an Excel Office Script, but it keeps returning a "Failed to fetch" error:
async function main(workbook: ExcelScript.Workbook) {
  // Your code here
  const cuslocation = "New York";
  const endpoint = "http://api.weatherstack.com/current?access_key=****367d8059062daa61b5**********&query=" + cuslocation;
  const response = await fetch(endpoint);
  console.log(await response.text());
}
I'm learning Office Scripts, so I'm making this simple API call and trying to log the data to the Excel sheet.
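A guess at likely causes, not confirmed by the question: Office Scripts only permits HTTPS endpoints for external fetch calls, so the http:// URL alone can produce "Failed to fetch"; and `.text()` returns a promise that must be awaited before logging. The latter can be sketched with a stand-in `Response` (a global in Node 18+ and in the fetch API generally):

```javascript
// A stand-in for a fetch result, to show the .text() behavior without
// making a network call.
const response = new Response('{"current":{"temperature":21}}');

(async () => {
  // .text() returns a Promise<string>; logging it without await
  // prints a pending promise instead of the body.
  const body = await response.text();
  console.log(body);
})();
```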


How to write an API call in Node.js

I have a function written in Node.js. When that function ends, it calls another function. In most of those functions I am using the GitHub API to read and write from my repo.
So it involves a series of API calls, and it is working. It takes around 8-10 seconds to complete all the API calls.
Now, instead of running that function from the VS Code terminal using "node (unknown)", I want to make it an API.
So when I hit the API URL from Postman, this function should run.
Can someone point me in the right direction on how to convert this Node.js function to an API and run it from Postman?
I tried this syntax from a YouTube tutorial:
var common = require("./commonMethods");
const http = require('http');

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.write(() => {
    const promise = await new Promise((resolve, reject) => {
      let repoName = "sample";
      var status = common.createRepo(repoName);
      var status1 = common.getRepoContents(repoName);
      console.log(status);
      console.log(status1);
    });
  });
  res.end();
}).listen(4000);
But it is not working.

Events in Excel for the web

I am using a script in Excel for the web and it is working fine. Now I am trying to use events, but they do not seem to work there. Are events supported in Excel for the web? Something like this from the tutorial:
await Excel.run(async (context) => {
  const sheet = context.workbook.worksheets.getItem("Sample");
  sheet.onActivated.add(function (event) {
    return Excel.run(async (context) => {
      console.log("The activated worksheet ID is: " + event.worksheetId);
      await context.sync();
    });
  });
  await context.sync();
});
Make sure the Excel API requirement set 1.7 is supported by the host application. Read more about events in Excel in the "Work with events using the Excel JavaScript API" article.
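That requirement-set check can be done at runtime; a sketch (`Office` is a global provided by the Office JS runtime, so this only executes inside Office):

```javascript
// Returns true when the host supports the ExcelApi 1.7 requirement set,
// which worksheet events such as onActivated depend on.
function canUseWorksheetEvents() {
  return Office.context.requirements.isSetSupported('ExcelApi', '1.7');
}
```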

GCP Cloud Function reading files from Cloud Storage

I'm new to GCP, Cloud Functions, and the NodeJS ecosystem. Any pointers would be very helpful.
I want to write a GCP Cloud Function that does the following:
Read contents of file (sample.txt) saved in Google Cloud Storage.
Copy it to local file system (or just console.log() it)
Run this code using functions-emulator locally for testing
Result: 500 INTERNAL error with the message 'function crashed'. The function logs give the following:
2019-01-21T20:24:45.647Z - info: User function triggered, starting execution
2019-01-21T20:24:46.066Z - info: Execution took 861 ms, finished with status: 'crash'
Below is my code, picked mostly from GCP NodeJS sample code and documentation.
exports.list_files = (req, res) => {
  const fs = require('fs');
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  const bucket = storage.bucket('curl-tests');
  bucket.setUserProject("cf-nodejs");
  const file = bucket.file('sample.txt'); // file has a couple of lines of text
  const localFilename = '/Users/<username>/sample_copy.txt';
  file.createReadStream()
    .on('error', function (err) { })
    .on('response', function (response) {
      // Server connected and responded with the specified status and headers.
    })
    .on('end', function () {
      // The file is fully downloaded.
    })
    .pipe(fs.createWriteStream(localFilename));
};
I run it like this:
functions call list_files --trigger-http
ExecutionId: 4a722196-d94d-43c8-9151-498a9bb26997
Error: { error:
{ code: 500,
status: 'INTERNAL',
message: 'function crashed',
errors: [ 'socket hang up' ] } }
Eventually, I want to have certificates and keys saved in Storage buckets and use them to authenticate with a service outside of GCP. This is the bigger problem I'm trying to solve. But for now, focusing on resolving the crash.
Start your development and debugging on your desktop using Node, not an emulator. Once you have your code working without warnings and errors, start working with the emulator, and finally with Cloud Functions.
Let's take your code and fix parts of it.
bucket.setUserProject("cf-nodejs");
I doubt that your project is cf-nodejs. Enter the correct project ID.
const localFilename = '/Users/<username>/sample_copy.txt';
This won't work. The directory /Users/<username> does not exist in Cloud Functions. The only directory that you can write to is /tmp. For testing purposes, change this line to:
const localFilename = '/tmp/sample_copy.txt';
You are not doing anything with errors:
.on('error', function (err) { })
Change this line to at least print something:
.on('error', function (err) { console.log(err); })
You will then be able to view the output in Google Cloud Console -> Stackdriver -> Logs. In the log viewer, select "Cloud Functions" and your function name to see your debug output.
Last tip: wrap your code in a try/catch block and console.log the error message in the catch block. This way you will at least have a log entry when your program crashes in the cloud.

Make data pulled from a remote source available in NodeJS App

I'm just getting going with NodeJS and am trying to use jsforce (Salesforce) to populate a dropdown on a form.
I've written a module that requires jsforce, sets login params, and connects.
modules/sftools.js:
const jsforce = require('jsforce')
const conn = new jsforce.Connection({
loginUrl: process.env.SF_LOGINURL
})
conn.login(process.env.SF_USER, process.env.SF_PASS)
exports.metaDropDown = async (field) => {
  // conn.sobject...describe..
  return arrayOfValues
}
I want to make the value returned available throughout my app, so in index.js I've got
index.js:
const sftools = require('../modules/sftools')
const roles = sftools.metaDropDown(process.env.SF_ROLES)
and then I use some middleware to always set req.roles = roles.
I think the problem is that I'm requesting the roles before the connection is established, but I can't figure out the flow.
I tried logging in before the exports code, but I get an Invalid URL error, presumably because it isn't logged in yet.
I tried to put the login code directly into the metaDropdown export, which got rid of the error, but there is still no data returned.
Any suggestions?
I think the issue you're having is that the login function expects a callback as the third argument:
conn.login(process.env.SF_USER, process.env.SF_PASS, function() {
// do your role logic here.
})
Hope this helps.
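Another common pattern is to cache the login as a promise and await it inside each exported function, so no caller can race the connection. A sketch, with a stub standing in for the jsforce connection (the real one needs credentials, and its describe call is simplified here):

```javascript
// Stub for the jsforce connection; metaDropDown and the field argument
// mirror the names in the question.
const conn = {
  login: () => new Promise((resolve) => setTimeout(resolve, 10, 'logged in')),
  describe: async (field) => [field + '-a', field + '-b'],
};

// Start the login once and cache the promise; every export awaits it,
// so no call can run before the connection is established.
const loginReady = conn.login();

async function metaDropDown(field) {
  await loginReady;
  return conn.describe(field);
}

// Callers must await the result — it is a promise, not an array:
metaDropDown('roles').then((values) => console.log(values));
```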

Convert Google Spreadsheet to Excel XLS with Script

Let me first say I realize this question has been asked before, so sincere apologies for bringing it up again. However, I have tried some of the suggestions and can't get it to work. I am trying to write a script that will take a Google Spreadsheet, convert it to Excel XLS/XLSX format, and then email the converted file as an attachment. Here's the code that tries to create the converted file:
function googleOAuth_(name, scope) {
  var oAuthConfig = UrlFetchApp.addOAuthService(name);
  oAuthConfig.setRequestTokenUrl("https://www.google.com/accounts/OAuthGetRequestToken?scope=" + scope);
  oAuthConfig.setAuthorizationUrl("https://www.google.com/accounts/OAuthAuthorizeToken");
  oAuthConfig.setAccessTokenUrl("https://www.google.com/accounts/OAuthGetAccessToken");
  oAuthConfig.setConsumerKey('anonymous');
  oAuthConfig.setConsumerSecret('anonymous');
  return {oAuthServiceName: name, oAuthUseToken: "always"};
}

function test() {
  var id = DocsList.getFileById('FILE_ID_HERE');
  var url = 'https://docs.google.com/feeds/';
  var doc = UrlFetchApp.fetch(url + 'download/spreadsheets/Export?key=' + id + '&exportFormat=xls',
    googleOAuth_('docs', url)).getBlob();
  DocsList.createFile(doc).rename('newfile.xls');
}

function autorise() {
  // function to call to authorize googleOAuth
  var id = SpreadsheetApp.getActiveSpreadsheet().getId();
  var url = 'docs.google.com/feeds/';
  var doc = UrlFetchApp.fetch(url + 'download/documents/Export?exportFormat=html&format=html&id=' + id, googleOAuth_('docs', url)).getContentText();
}
I have read that the authorization procedure is the primary culprit. I have granted authorization, and the script appears to run to the line below and then errors:
var doc = UrlFetchApp.fetch(url+'download/spreadsheets/Export?key='+id+'&exportFormat=xls',
googleOAuth_('docs',url)).getBlob()
The error message states the following:
Request failed for returned code 404. Truncated server response:
<!DOCTYPE html><html lang="en" ><head><meta name="description" content="Web word
processing, presentations and spreadsheets"><link rel="shortcut ic...
(use muteHttpExceptions option to examine full response) (line 50, file "Code")
If anyone has any ideas, I would greatly appreciate it.
The code you show (without referencing the source) is for the old version of Google Sheets.
It does not work with the new version.
Read the documentation on the Drive advanced service; you will find everything you need to do what you want.
This post should help: OAuth error when exporting Sheet as XLS in Google Apps Script
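A hedged sketch of the modern approach that post describes: fetch the spreadsheet's export URL with an OAuth token from ScriptApp, then email the blob. This only runs inside Apps Script (UrlFetchApp, ScriptApp, MailApp, and SpreadsheetApp are Apps Script globals), and the recipient address is a placeholder:

```javascript
// Export the active spreadsheet as XLSX and email it as an attachment.
// Runs only in the Apps Script environment; recipient is a placeholder.
function emailAsXlsx() {
  var id = SpreadsheetApp.getActiveSpreadsheet().getId();
  var url = 'https://docs.google.com/spreadsheets/d/' + id + '/export?format=xlsx';
  var blob = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() },
  }).getBlob().setName('converted.xlsx');
  MailApp.sendEmail('recipient@example.com', 'Converted file', 'See attachment.', {
    attachments: [blob],
  });
}
```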
