Could anyone help me with setting a proxy server for headless Chrome while using the Lighthouse Chrome launcher in Node.js, as mentioned here?
const launcher = new ChromeLauncher({
  port: 9222,
  autoSelectChrome: true, // False to manually select which Chrome install.
  additionalFlags: [
    '--window-size=412,732',
    '--disable-gpu',
    '--proxy-server="IP:PORT"',
    headless ? '--headless' : ''
  ]
});
However, the script above does not hit my proxy server at all; Chrome seems to fall back to DIRECT:// connections to the target website.
One other resource that discusses using an HTTP/HTTPS proxy with headless Chrome is this, but it gives no example of how to do it from Node.js.
I tried it using a plain exec and it works just fine; here is my snippet:
const exec = require('child_process').exec;

function launchHeadlessChrome(url, callback) {
  // Assuming macOS.
  const CHROME = '/Users/h0x91b/Desktop/Google\\ Chrome\\ Beta.app/Contents/MacOS/Google\\ Chrome';
  exec(`${CHROME} --headless --disable-gpu --remote-debugging-port=9222 --proxy-server=127.0.0.1:8888 ${url}`, callback);
}

launchHeadlessChrome('https://www.chromestatus.com', (err, stdout, stderr) => {
  console.log('callback', err, stderr, stdout);
});
Then I navigated to http://localhost:9222 and in Developer Tools I saw a proxy connection error. That is expected, since I have no proxy on that port, but it means Chrome did try to connect through the proxy.
BTW Chrome version is 59.
I checked the source code at https://github.com/GoogleChrome/lighthouse/blob/master/chrome-launcher/chrome-launcher.ts#L38-L44
There is no additionalFlags option there, only chromeFlags. Try using that instead.
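Since the launcher reads chromeFlags (not additionalFlags), one way to restructure the original snippet is to build the flag list in a helper and pass it through chromeFlags. This is a sketch assuming the chrome-launcher package's launch() API; note the proxy value carries no extra quotes around IP:PORT, which can also break the flag.

```javascript
// Build the Chrome flag list once; the proxy address is a placeholder.
function buildChromeFlags(proxy, headless) {
  const flags = [
    '--window-size=412,732',
    '--disable-gpu',
    `--proxy-server=${proxy}`, // no quotes around the value
  ];
  if (headless) flags.push('--headless');
  return flags;
}

// Usage (assumes the chrome-launcher package is installed):
// const chromeLauncher = require('chrome-launcher');
// chromeLauncher.launch({ port: 9222, chromeFlags: buildChromeFlags('127.0.0.1:8888', true) })
//   .then(chrome => console.log('Debugging on port', chrome.port));
```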
Related
How can I mimic double-clicking an application (.exe) on the desktop with Node.js, knowing the path to the exe? I want to launch the application and have it remain open even after I close the Node.js application that launched it.
I've tried a bunch of options, but they all seem to fail to show the application's native popup window, which makes me believe it's an issue with the way I'm launching it from Node.js.
const { exec, spawn } = require("child_process");

export function testExecute() {
  const command =
    "C:/Program Files/Side Effects Software/Houdini 18.5.633/bin/happrentice.exe";
  const cmd = `"${command}"`;

  // OPTION 1: the application's native window doesn't show when launching
  exec(cmd).unref();

  // OPTION 2: the native window doesn't show when launching either
  spawn(cmd, [], {
    detached: true,
    shell: true,
    stdio: ["ignore", "ignore", "ignore"],
  });
}
I have a Chrome extension that I want to install automatically into a Chrome profile stored on my Desktop.
Chrome Profile Path: C:\\Users\\user\\Desktop\\ChromeProfiles\\test
Chrome Extension Path:
C:\\Users\\user\\Desktop\\SSDC Bot Chrome Console\\Extension Ver
I use the code below to launch Chrome and load the extension:
const puppeteer = require('puppeteer');

(async () => {
  const pathToExtension = require('path').join("C:\\Users\\user\\Desktop\\SSDC Bot Chrome Console", 'Extension Ver');
  const browser = await puppeteer.launch({
    headless: false,
    args: [
      `--disable-extensions-except=${pathToExtension}`,
      `--load-extension=${pathToExtension}`,
      `--user-data-dir=${'C:\\Users\\user\\Desktop\\ChromeProfiles' + '\\' + 'test'}`
    ],
    executablePath: arg[0]
  });
})();
What I want to achieve is the following:
Open that Chrome profile using Puppeteer and install the extension.
Open that Chrome profile using CMD (not controlled by Puppeteer) and have the extension still be present.
However, while the code above runs successfully and the Puppeteer-controlled Chrome has the extension loaded, when I launch the profile using CMD the extension is gone.
Should I be using --load-extension? Is there a different flag to use, or another way to install the extension?
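One thing worth checking, as an assumption rather than a confirmed fix: --load-extension only applies to the session it is passed to, so an extension loaded this way is not persisted into the profile and disappears when the profile is opened without the flag. A workaround, sketched below with the question's paths, is to repeat the same flags on every manual launch:

```shell
:: Hypothetical CMD launch reusing the same profile and extension paths.
:: --load-extension is per-session, so it must be repeated on every launch.
chrome.exe --user-data-dir="C:\Users\user\Desktop\ChromeProfiles\test" ^
  --load-extension="C:\Users\user\Desktop\SSDC Bot Chrome Console\Extension Ver"
```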
I am trying to debug an issue which causes headless Chrome using Puppeteer to behave differently on my local environment and on a remote environment such as AWS or Heroku.
The application tries to search public available jobs on LinkedIn without authentication (no need to look at profiles), the url format is something like this: https://www.linkedin.com/jobs/search?keywords=Engineer&location=New+York&redirect=false&position=1&pageNum=0
When I open this url in my local environment I have no problems, but when I try the same thing on a remote machine such as an AWS EC2 instance or a Heroku dyno, LinkedIn redirects me to a login form. To debug this difference I've built a Docker image (based on this image) to isolate the test from my local Chrome profile:
Dockerfile
FROM buildkite/puppeteer
WORKDIR /app
COPY . .
RUN npm install
CMD node index.js
EXPOSE 9222
index.js
const puppeteer = require("puppeteer-extra");
puppeteer.use(require("puppeteer-extra-plugin-stealth")());
const testPuppeteer = async () => {
  console.log('Opening browser');
  const browser = await puppeteer.launch({
    headless: true,
    slowMo: 20,
    args: [
      '--remote-debugging-address=0.0.0.0',
      '--remote-debugging-port=9222',
      '--single-process',
      '--lang=en-GB',
      '--disable-dev-shm-usage',
      '--no-sandbox',
      '--disable-setuid-sandbox',
      "--proxy-server='direct://'",
      '--proxy-bypass-list=*',
      '--disable-gpu',
      '--allow-running-insecure-content',
      '--enable-automation',
    ],
  });
  console.log('Opening page...');
  const page = await browser.newPage();
  console.log('Page open');
  const url = "https://www.linkedin.com/jobs/search?keywords=Engineer&location=New+York&redirect=false&position=1&pageNum=0";
  console.log('Opening url', url);
  await page.goto(url, {
    waitUntil: 'networkidle0',
  });
  console.log('Url open');
  // page && await page.close();
  // browser && await browser.close();
  console.log("Done! Leaving page open for remote inspection...");
};

(async () => {
  await testPuppeteer();
})();
The docker image used for this test can be found here.
I've run the image on my local environment with the following command:
docker run -p 9222:9222 spinlud/puppeteer-linkedin-test
Then, from the local Chrome browser's chrome://inspect page, it should be possible to inspect the GUI of the application (I have deliberately left the page open in the headless browser):
As you can see, even in local Docker the page opens without authentication.
I've done the same test on an AWS EC2 (Amazon Linux 2) with Docker installed. It needs to be a public instance with SSH access and an inbound rule to allow traffic through port 9222 (for remote Chrome debugging).
I've run the same command:
docker run -p 9222:9222 spinlud/puppeteer-linkedin-test
Then, again from the local Chrome browser's chrome://inspect page, after adding the remote public IP of the EC2 instance, I was able to inspect the GUI of the remote headless Chrome as well:
As you can see, this time LinkedIn requires authentication. We can also see a difference in the cookies:
I can't understand the reason behind this different behaviour between my local and remote environments. In theory Docker should provide isolation, and in both environments the headless browser should start with no cookies and a fresh, empty session. Still, there is a difference and I can't figure out why.
Does anyone have any clue?
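One way to narrow this down, a diagnostic sketch rather than a fix: log where each environment actually lands after page.goto and what fingerprint it presents, then diff the two. The authwall/login URL patterns below are an assumption based on observed LinkedIn redirects.

```javascript
// Heuristic: did LinkedIn bounce us to its login/authwall instead of the
// jobs search page? The path list is an assumption, not an official API.
function wasRedirectedToLogin(finalUrl) {
  return /linkedin\.com\/(authwall|login|checkpoint)/.test(finalUrl);
}

// Usage inside testPuppeteer, after page.goto(...):
//   console.log('Redirected to login?', wasRedirectedToLogin(page.url()));
//   console.log('UA:', await page.evaluate(() => navigator.userAgent));
//   console.log('Cookie names:', (await page.cookies()).map(c => c.name));
```

Comparing the logged user agent and cookie names between the local and EC2 runs should show whether the difference is in what Chrome sends, or purely in how LinkedIn treats the source IP.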
I am attempting to use a proxy within my Node.js / Puppeteer application and receiving errors.
If I remove the proxy code, the application runs as intended.
const browser = await puppeteer.launch({args: ['--proxy-server=socks5://127.0.0.1:9050'], headless: false});
I expect the application to run as usual, but with a different IP.
Error received: ERR_PROXY_CONNECTION_FAILED
Either your proxy is not working, or Puppeteer is rejecting it because it most likely uses a self-signed certificate. To fix a certificate issue, add the following args:
args: [
  '--proxy-server=socks5://127.0.0.1:9050',
  '--ignore-certificate-errors',
  '--ignore-certificate-errors-spki-list'
]
See: https://github.com/GoogleChrome/puppeteer/issues/1159
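Combined into a full launch call, this is a sketch assuming a SOCKS5 proxy (for example Tor) is actually listening on 127.0.0.1:9050:

```javascript
// Flags from the answer above: route traffic through the SOCKS5 proxy and
// tolerate its self-signed certificate.
const proxyArgs = [
  '--proxy-server=socks5://127.0.0.1:9050',
  '--ignore-certificate-errors',
  '--ignore-certificate-errors-spki-list',
];

// Usage (assumes puppeteer is installed):
// const puppeteer = require('puppeteer');
// puppeteer.launch({ headless: false, args: proxyArgs })
//   .then(browser => console.log('Launched through proxy'));
```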
I am trying to launch a Firefox browser via Selenium using a Custom Profile. I downloaded the latest geckodriver and included it with the following:
java -Dwebdriver.firefox.driver="C:\\xampp\\htdocs\\project\\geckodriver.exe" -jar selenium-server.jar
Note: I am using Selenium Standalone Server 3.5.0.
Then, in my node script I use:
const options = {
  desiredCapabilities: {
    browserName: 'firefox',
    firefox_profile: "C:/Users/Administrator/AppData/Local/Mozilla/Firefox/Profiles/Prnlx0rh6w.bookmarks_player"
  }
};

const client = webdriverio.remote(options).init();
It seems to have worked, judging by the Selenium logs: no temp profile is created in the Windows temp folder (as there would be for a new session); instead, the profile above shows up. But it doesn't work!
The spawned Firefox instance doesn't load cookies, add-ons, or anything else; it's like a fresh installation every time. Instead of firefox_profile I also tried profile and "moz:profile". I even tried importing the profile as a base64 string with fs. Nothing worked. Maybe I'm just writing it wrong?
How can I include a custom Firefox profile using WebdriverIO?
Edit: I also tried the following: C:\Users\Administrator\AppData\Local\Mozilla\Firefox\Profiles\Prnlx0rh6w.bookmarks_player
I also tried double backslashes, but ended up with the same result. I tried installing a Firefox add-on, but it disappears on each new start. Cookies are not found either.
Try this as firefox_profile:
"C:\Users\Administrator\AppData\Local\Mozilla\Firefox\Profiles\greerg.profile"
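An alternative worth trying, as a sketch and not a confirmed fix: bypass the firefox_profile capability and hand the profile directory straight to Firefox through moz:firefoxOptions, whose args geckodriver forwards to the Firefox command line.

```javascript
// The question's profile path, forward slashes to avoid escaping issues.
const profilePath =
  'C:/Users/Administrator/AppData/Local/Mozilla/Firefox/Profiles/Prnlx0rh6w.bookmarks_player';

const options = {
  desiredCapabilities: {
    browserName: 'firefox',
    'moz:firefoxOptions': {
      // -profile tells Firefox itself to start with this exact directory
      args: ['-profile', profilePath],
    },
  },
};

// Usage, as in the question:
// const client = webdriverio.remote(options).init();
```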