I have a VPS running Amazon Linux with Node.js. I'm trying to run a Node.js script that sends a GET request to an API to get data, then sends a POST request to another API with that data. The script works when run with node in the terminal, but not after adding it to sudo crontab -e.
Now, I know about the complications with absolute paths when using crontab; I've read many of the Stack Overflow questions about it, and I'm fairly certain pathing isn't the issue. To test this, I created a test script that creates a file in the root directory, and it works in crontab. This test script is in the SAME directory as the main script, which has me very confused.
The script uses two packages, Axios and Puppeteer. Here is the script with some information replaced with placeholders:
import axios from "axios";
import puppeteer from "puppeteer";
const apiKey = process.env.API_KEY
const USER_ID = process.env.USER_ID
const sitekey = process.env.SITE_KEY
const surl = 'url'
const url= 'url'
async function webSubmit (data) {
const browser = await puppeteer.launch({ headless: true, args: ['--no-sandbox'] });
const page = await browser.newPage();
await page.goto(url);
await page.$eval("input[name='data']", (el, data) => el.value = data, data)
await page.click('#submitButton')
setTimeout(() => {
browser.close()
}, 5000)
}
function checkStatus (id) {
axios.get(url)
.then(response => {
if (response.data === 'NOT_READY') {
setTimeout(() => {
checkStatus(id)
}, 10000)
} else {
const data = response.data.slice(3)
webSubmit(data)
}
})
.catch(e => console.log(e))
}
function post() {
const postUrl = url
axios.post(postUrl)
.then(response => {
const id = response.data.slice(3)
setTimeout(() => {
checkStatus(id)
}, 30000)
})
.catch(error => console.log(error))
}
post()
Test Script:
import * as fs from 'fs'
fs.writeFile("/vote" + Math.random(), "Hi", function(err) {
if(err){
return console.log(err)
}
console.log("File saved")
})
Crontab Jobs:
* * * * * /opt/bitnami/node/bin/node /home/bitnami/app/index.js
* * * * * cd /home/bitnami/app && /opt/bitnami/node/bin/node index.js
Both of these jobs work if I change index.js to test.js, but not with index.js.
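A way to capture the failure under cron is to redirect the job's output to a log file and inspect it after a run (a sketch reusing the second job above):
* * * * * cd /home/bitnami/app && /opt/bitnami/node/bin/node index.js >> /home/bitnami/app/cron.log 2>&1
Whatever Node prints before exiting (a failed Puppeteer launch, an unresolved module, an unset environment variable) will land in cron.log.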
Any help would be appreciated, thank you!
Related
I'm writing Pact integration tests, which need to perform an actual call to a specific mock server while the tests run.
The problem is that I cannot find a way to change the RTK Query baseUrl after the api has been initialised.
it('works with rtk', async () => {
// ... setup pact expectations
const reducer = {
[rtkApi.reducerPath]: rtkApi.reducer,
};
// proxy call to configureStore()
const { store } = setupStoreAndPersistor({
enableLog: true,
rootReducer: reducer,
isProduction: false,
});
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const dispatch = store.dispatch as any;
dispatch(rtkApi.endpoints.GetModules.initiate());
// sleep for 1 second
await new Promise((resolve) => setTimeout(resolve, 1000));
const data = store.getState().api;
expect(data.queries['GetModules(undefined)']).toEqual({modules: []});
});
Base API:
import { createApi } from '@reduxjs/toolkit/query/react';
import { graphqlRequestBaseQuery } from '@rtk-query/graphql-request-base-query';
import { GraphQLClient } from 'graphql-request';
export const client = new GraphQLClient('http://localhost:12355/graphql');
export const api = createApi({
baseQuery: graphqlRequestBaseQuery({ client }),
endpoints: () => ({}),
});
The query is very basic:
query GetModules {
modules {
name
}
}
I tried digging into customizing baseQuery but was not able to get it working.
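One avenue that may work (a sketch, not verified against Pact): the base query above closes over the exported GraphQLClient instance, and graphql-request clients expose a setEndpoint method, so the test can point the already-created client at the mock server before dispatching. The import path is an assumption:
import { client } from './baseApi'; // hypothetical path to the module shown under "Base API" above

// retarget the existing GraphQLClient at the Pact mock server before dispatching any endpoint
client.setEndpoint('http://localhost:1234/graphql'); // placeholder port for the mock server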
With top-level await accepted into ES2022, I wonder if it is safe to assume that await import("./path/to/module") has no timeout at all.
Here is what I’d like to do:
// src/commands/do-a.mjs
console.log("Doing a...");
await doSomethingThatTakesHours();
console.log("Done.");
// src/commands/do-b.mjs
console.log("Doing b...");
await doSomethingElseThatTakesDays();
console.log("Done.");
// src/commands/do-everything.mjs
await import("./do-a");
await import("./do-b");
And here is what I expect to see when running node src/commands/do-everything.mjs:
Doing a...
Done.
Doing b...
Done.
I could not find any mention of a top-level await timeout, but I wonder if what I'm trying to do is a misuse of the feature. In theory, Node.js (or Deno) might throw an exception after reaching some predefined time cap (say, 30 seconds).
Here is how I’ve been approaching the same task before TLA:
// src/commands/do-a.cjs
import { autoStartCommandIfNeeded } from "@kachkaev/commands";
const doA = async () => {
console.log("Doing a...");
await doSomethingThatTakesHours();
console.log("Done.");
}
export default doA;
autoStartCommandIfNeeded(doA, __filename);
// src/commands/do-b.cjs
import { autoStartCommandIfNeeded } from "@kachkaev/commands";
const doB = async () => {
console.log("Doing b...");
await doSomethingThatTakesDays();
console.log("Done.");
}
export default doB;
autoStartCommandIfNeeded(doB, __filename);
// src/commands/do-everything.cjs
import { autoStartCommandIfNeeded } from "@kachkaev/commands";
import doA from "./do-a";
import doB from "./do-b";
const doEverything = async () => {
await doA();
await doB();
}
export default doEverything;
autoStartCommandIfNeeded(doEverything, __filename);
autoStartCommandIfNeeded() executes the function if __filename matches require.main?.filename.
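For illustration only, a minimal sketch of what such a helper could look like (hypothetical; not the actual @kachkaev/commands implementation):
// hypothetical sketch of the helper described above (CommonJS)
const autoStartCommandIfNeeded = (command, filename) => {
  // run the command only when this file is the process entry point
  if (require.main?.filename === filename) {
    command().catch((error) => {
      console.error(error);
      process.exit(1);
    });
  }
};
module.exports = { autoStartCommandIfNeeded };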
Answer: No, there is not a top-level timeout on an await.
This feature is actually being used in Deno for a webserver for example:
import { serve } from "https://deno.land/std@0.103.0/http/server.ts";
const server = serve({ port: 8080 });
console.log(`HTTP webserver running. Access it at: http://localhost:8080/`);
console.log("A");
for await (const request of server) {
let bodyContent = "Your user-agent is:\n\n";
bodyContent += request.headers.get("user-agent") || "Unknown";
request.respond({ status: 200, body: bodyContent });
}
console.log("B");
In this example, "A" gets printed to the console, and "B" isn't printed until the webserver is shut down (which doesn't happen automatically).
As far as I know, there is no timeout by default in async/await. There is the await-timeout package, for example, which adds timeout behavior. Example:
import Timeout from 'await-timeout';
const timer = new Timeout();
try {
await Promise.race([
fetch('https://example.com'),
timer.set(1000, 'Timeout!')
]);
} finally {
timer.clear();
}
Taken from the docs: https://www.npmjs.com/package/await-timeout
As you can see, a Timeout is instantiated and its set method defines the timeout and the timeout message.
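The same race pattern can be sketched without the package, using a plain setTimeout inside an ES module or any async function (a minimal sketch; clearing the timer keeps the process from being held open):
// minimal sketch of a manual timeout via Promise.race
const withTimeout = (promise, ms) => {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error('Timeout!')), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
};

// usage: rejects if the fetch takes longer than one second
await withTimeout(fetch('https://example.com'), 1000);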
Is it possible to run multiple tests in one browser window for playwright/test?
Currently it hits browser.close() after every test, even though the tests run against the same page, which adds a lot of extra time to the runs.
test.beforeAll(async ({ browser }) => {
const context = await browser.newContext();
const page = await context.newPage();
await page.goto('https://example.com');
});
test('nav test', async ({ page }) => {
const name = await page.innerText('.navbar__title');
expect(name).toBe('Playwright');
});
test('header test', async ({ page }) => {
const name = await page.innerText('.navbar__header');
expect(name).toBe('Playwright');
});
When you create a test like test('header test', async ({ page }) => {, you're requesting the page fixture and telling Playwright to create a new page context.
Remove page from the test's arguments and share the one you create in your beforeAll.
Try this:
test.describe('1 page multiple tests', () => {
let page;
test.beforeAll(async ({ browser }) => {
const context = await browser.newContext();
page = await context.newPage();
await page.goto('https://example.com');
});
test.afterAll(async ({ browser }) => {
await browser.close();
});
test('nav test', async () => {
const name = await page.innerText('h1');
expect(name).toContain('Example');
});
test('header test', async () => {
const name = await page.innerText('h1');
expect(name).toContain('Domain');
});
});
Run it like this :
npx playwright test .\StackTests_SinglePage.spec.ts --headed
(you can see the name of my file in there)
You might need to toggle it down to 1 worker if it tries to run your tests in parallel.
For me, that code opens one browser and one page, passes both tests, then closes out.
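If it does run them in parallel, the worker count can be pinned either with --workers=1 on the command line or in the config (a minimal sketch, assuming the default playwright.config.js file name):
// playwright.config.js — force a single worker so the shared page isn't hit by parallel tests
/** @type {import('@playwright/test').PlaywrightTestConfig} */
module.exports = {
  workers: 1,
};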
Can you try wrapping the tests in a describe block, so they are treated as a group and not as individual tests?
test.describe('two tests for same page', () => {
test('nav test', async ({ page }) => {
const name = await page.innerText('.navbar__title');
expect(name).toBe('Playwright');
});
test('header test', async ({ page }) => {
const name = await page.innerText('.navbar__header');
expect(name).toBe('Playwright');
});
});
I'm trying to run next build while using the getStaticProps and getStaticPaths methods in one of my routes, but it fails every time. At first it just couldn't connect to my API (which is obvious: the endpoints are created with Next.js' API routes, which aren't available when a Next.js app isn't running). I thought that maybe running a development server in the background would help. It did, but it generated other problems, like these:
Error: Cannot find module for page: /reader/[id]
Error: Cannot find module for page: /
> Build error occurred
Error: Export encountered errors on following paths:
/
/reader/1
I don't know why. Here's the code of /reader/[id]:
const Reader = ({ reader }) => {
const router = useRouter();
return (
<Layout>
<pre>{JSON.stringify(reader, null, 2)}</pre>
</Layout>
);
};
export async function getStaticPaths() {
const response = await fetch("http://localhost:3000/api/readers");
const result: IReader[] = await response.json();
const paths = result.map((result) => ({
params: { id: result.id.toString() },
}));
return {
paths,
fallback: false,
};
}
export async function getStaticProps({ params }) {
const res = await fetch("http://localhost:3000/api/readers/" + params.id);
const result = await res.json();
return { props: { reader: result } };
}
export default Reader;
Nothing special. It's code I literally rewrote from the docs and adapted to my site.
And here's the /api/readers/[id] handler.
export default async function handler(
req: NextApiRequest,
res: NextApiResponse
) {
const knex = getKnex();
const { id } = req.query;
switch (req.method) {
case "GET":
try {
const reader = await knex
.select("*")
.from("readers")
.where("id", id)
.first();
res.status(200).json(reader);
} catch {
res.status(500).end();
}
break;
}
}
Nothing special either. So why is it crashing every time I try to build my app? Thanks for any help in advance.
You should not fetch an internal API route from getStaticProps. Instead, you can write the data-fetching code from the API route directly in getStaticProps.
https://nextjs.org/docs/basic-features/data-fetching#write-server-side-code-directly
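A minimal sketch of that approach for the /reader/[id] page, assuming getKnex() can be imported from the same shared module the API route uses (the import path and export style below are guesses):
// pages/reader/[id].js — query the database directly instead of fetching /api/readers
import { getKnex } from "../../lib/knex"; // hypothetical path; use wherever getKnex actually lives

export async function getStaticPaths() {
  const knex = getKnex();
  const readers = await knex.select("id").from("readers");
  return {
    paths: readers.map((reader) => ({ params: { id: reader.id.toString() } })),
    fallback: false,
  };
}

export async function getStaticProps({ params }) {
  const knex = getKnex();
  const reader = await knex
    .select("*")
    .from("readers")
    .where("id", params.id)
    .first();
  return { props: { reader } };
}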
I'm having issues getting Puppeteer to use the default profile that my Chrome browser uses. I've tried setting the path to the user profile, but when I go to a site with Puppeteer that I know has data saved in the Chrome app's userDataDir, there's nothing saved there. What am I doing wrong? I appreciate any help!
const browser = await puppeteer.launch({
headless: false,
userDataDir: 'C:\\Users\\Bob\\AppData\\Local\\Google\\Chrome\\User Data',
}).then(async browser => {
I've also tried userDataDir: 'C:/Users/Phil/AppData/Local/Google/Chrome/User Data',, but still nothing.
UPDATED:
const username = os.userInfo().username;
(async () => {
try {
const browser = await puppeteer.launch({
headless: false, args: [
`--user-data-dir=C:/Users/${username}/AppData/Local/Google/Chrome/User Data`]
}).then(async browser => {
I had the same exact issue before. However, connecting my script to a real Chrome instance helped solve a lot of problems, especially the profile one.
You can see the steps here:
https://medium.com/@jaredpotter1/connecting-puppeteer-to-existing-chrome-window-8a10828149e0
//MACOS
/*
Open this instance first:
/Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome --remote-debugging-port=9222 --no-first-run --no-default-browser-check --user-data-dir=$(mktemp -d -t 'chrome-remote_data_dir')
// Windows:
- Add this to the Target field of your Chrome shortcut: --remote-debugging-port=9222
- Navigate to http://127.0.0.1:9222/json/version
- copy webSocketDebuggerUrl
More Info: https://medium.com/@jaredpotter1/connecting-puppeteer-to-existing-chrome-window-8a10828149e0
*/
// Puppeteer Part
// Always update this socket after running the instance in terminal (look up ^)
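The connect step those comments refer to looks roughly like this (a sketch; the browserWSEndpoint value is a placeholder to be replaced with the webSocketDebuggerUrl copied from http://127.0.0.1:9222/json/version):
import puppeteer from 'puppeteer';

// paste the webSocketDebuggerUrl copied from http://127.0.0.1:9222/json/version
const browser = await puppeteer.connect({
  browserWSEndpoint: 'ws://127.0.0.1:9222/devtools/browser/<paste-id-here>',
});
const page = await browser.newPage();
await page.goto('https://example.com');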
And this is an abstracted controller, written in TypeScript, that I always use in any project:
import * as puppeteer from 'puppeteer';
import { Browser } from 'puppeteer/lib/cjs/puppeteer/common/Browser';
import { Page } from 'puppeteer/lib/cjs/puppeteer/common/Page';
import { PuppeteerNode } from 'puppeteer/lib/cjs/puppeteer/node/Puppeteer';
import { getPuppeteerWSUrl } from './config/config';
export default class Puppeteer {
public browser: Browser;
public page: Page;
getBrowser = () => {
return this.browser;
};
getPage = () => {
return this.page;
};
init = async () => {
const webSocketUrl = await getPuppeteerWSUrl();
try {
this.browser = await ((puppeteer as unknown) as PuppeteerNode).connect({
browserWSEndpoint: webSocketUrl,
defaultViewport: {
width: 1920,
height: 1080,
},
});
console.log('BROWSER CONNECTED OK');
} catch (e) {
console.error('BROWSER CONNECTION FAILED', e);
}
this.page = await this.browser.newPage();
this.page.on('console', (log: any) => console.log(log._text));
};
}
Abstracted WebSocket fetcher:
import axios from "axios";
import { exit } from "process";
export const getPuppeteerWSUrl = async () => {
try {
const response = await axios.get("http://127.0.0.1:9222/json/version");
return response.data.webSocketDebuggerUrl;
} catch (error) {
console.error("Can not get puppeteer ws url. error %j", error);
console.info(
"Make sure you run this command (/Applications/Google Chrome.app/Contents/MacOS/Google Chrome --remote-debugging-port=9222 --no-first-run --no-default-browser-check --user-data-dir=$(mktemp -d -t 'chrome-remote_data_dir')) first on a different shell"
);
exit(1);
}
};
Feel free to adjust the template to suit whatever your environment/tools currently look like.