I have a simple function that tries to accept the cookies. Here's my code:
(async () => {
  const browser = await puppeteer.launch({ headless: false });
  const page = await browser.newPage();
  await page.goto('https://www.sport1.de/live/darts-sport');
  await page.click('button[text=AKZEPTIEREN]');
  // await page.screenshot({ path: 'example.png' });
  // await browser.close();
})();
The cookie popup is placed in an iframe. You have to switch to the iframe via contentFrame() to be able to click the accept button.
Also, if you want to filter by textContent, you need to use XPath. With a CSS selector you can't select elements by their textContent.
// Grab the iframe element that contains the consent dialog, then switch into it
const cookiePopUpIframeElement = await page.$("iframe[id='sp_message_iframe_373079']");
const cookiePopUpIframe = await cookiePopUpIframeElement.contentFrame();
// XPath lets us match the button by its text content
const acceptElementToClick = await cookiePopUpIframe.$x("//button[text()='AKZEPTIEREN']");
await acceptElementToClick[0].click();
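Note that the consent iframe may be injected after the initial page load, so grabbing it immediately with page.$() can return null. A minimal sketch that waits for the iframe first (the iframe id and button text are taken from the answer above and may differ for you):

const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({ headless: false });
  const page = await browser.newPage();
  await page.goto('https://www.sport1.de/live/darts-sport');

  // Wait until the consent iframe is actually attached to the DOM
  const iframeHandle = await page.waitForSelector("iframe[id='sp_message_iframe_373079']");
  const consentFrame = await iframeHandle.contentFrame();

  // Match the button by its text via XPath and click it
  const [acceptButton] = await consentFrame.$x("//button[text()='AKZEPTIEREN']");
  if (acceptButton) {
    await acceptButton.click();
  }
})();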
I've tried a few things, e.g.:
await page.click('.ytp-fullscreen-button.ytp-button') // click on fullscreen button
await page.keyboard.press('f') // press f to open fullscreen
await page.keyboard.down('f'); await page.keyboard.up('f'); //similar to previous
await page.evaluate(() => document.getElementsByClassName('ytp-fullscreen-button ytp-button')[0].click()) //injecting js and using it to click on fullscreen button
but nothing worked. Is there a way to enter fullscreen mode on YouTube using Puppeteer?
This seems to work for me:
import puppeteer from 'puppeteer';

const browser = await puppeteer.launch({ headless: false, defaultViewport: null });

try {
  const [page] = await browser.pages();
  // David Lynch's Weather Report 7/22/21
  await page.goto('https://www.youtube.com/watch?v=MlyNWpf1N0s');
  await page.waitForSelector('.ytp-fullscreen-button.ytp-button');
  await page.evaluate(() => {
    document.querySelector('.ytp-fullscreen-button.ytp-button').click();
  });
} catch (err) { console.error(err); }
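If you want to verify that the player actually entered fullscreen, you can check the Fullscreen API state right after the click. This check is my own addition, not part of the answer above:

// Give the player a moment to toggle, then check the Fullscreen API state
const isFullscreen = await page.evaluate(() => document.fullscreenElement !== null);
console.log('fullscreen:', isFullscreen);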
I need to scrape, in headless mode, a site that contains a lot of debugger; statements.
Is there a way to prevent pausing on the debugger?
I tried sending CTRL+F8 and F8 on page load with this code, but without success!
await crt_page.keyboard.down('Control');
await crt_page.keyboard.press('F8');
await crt_page.keyboard.up('Control');
await crt_page.keyboard.press('F8');
Any advice?
Puppeteer sends key presses to the page, not to the browser itself, so the DevTools window never receives them.
So I think the solution is to install the npm package robotjs to send the keystrokes outside the browser.
Hope this helps you!
const puppeteer = require('puppeteer')
const robot = require('robotjs')

;(async () => {
  const browser = await puppeteer.launch({
    headless: false,
    devtools: true
  })
  const [page] = await browser.pages()
  const open = await page.goto('https://www.example.com', { waitUntil: 'networkidle0', timeout: 0 })
  await page.waitFor(4000)

  // Ctrl+] switches DevTools panels (pressed twice here to reach the Sources panel)
  await robot.keyToggle(']', 'down', 'control') // For Mac, change 'control' to 'command'
  await page.waitFor(500)
  await robot.keyToggle(']', 'down', 'control') // For Mac, change 'control' to 'command'
  await page.waitFor(500)
  await robot.keyToggle(']', 'up', 'control') // For Mac, change 'control' to 'command'
  await page.waitFor(1000)

  // Ctrl+F8 toggles "Deactivate breakpoints" in DevTools
  await robot.keyToggle('f8', 'down', 'control') // For Mac, change 'control' to 'command'
  await page.waitFor(500)
  await robot.keyToggle('f8', 'up', 'control') // For Mac, change 'control' to 'command'
})()
To check whether robotjs works at all on your machine, try the code below.
It launches Puppeteer and then changes the URL using robotjs.
If this also doesn't work on your server, then I'm sorry, I can't help you.
const puppeteer = require('puppeteer')
const robot = require('robotjs')

const pageURL = 'https://www.google.com'
// Unshifted keys and their shifted counterparts, index-aligned
const normal_Strings = ['`','1','2','3','4','5','6','7','8','9','0','-','=','[',']','\\',';','\'',',','.','/']
const shiftedStrings = ['~','!','@','#','$','%','^','&','*','(',')','_','+','{','}','|',':','"','<','>','?']
;(async () => {
  const browser = await puppeteer.launch({
    headless: false,
    devtools: true
  })
  const [page] = await browser.pages()
  const open = await page.goto('https://www.example.com', { waitUntil: 'networkidle0', timeout: 0 })

  console.log('First URL:')
  console.log(await page.url())

  // Ctrl+L focuses the browser's address bar
  await robot.keyToggle('l', 'down', 'control') // For Mac, change 'control' to 'command'
  await page.waitFor(500)
  await robot.keyToggle('l', 'up', 'control') // For Mac, change 'control' to 'command'
  await page.waitFor(1000)

  // Type the URL character by character
  for (let num in pageURL) {
    if (shiftedStrings.includes(pageURL[num])) {
      // Shifted characters (e.g. ':') are typed as their unshifted key with Shift held
      const key = normal_Strings[shiftedStrings.indexOf(pageURL[num])]
      await robot.keyToggle(key, 'down', 'shift')
      await page.waitFor(300)
      await robot.keyToggle(key, 'up', 'shift')
      await page.waitFor(300)
      continue // already typed, so skip the plain keyTap below
    }
    await robot.keyTap(pageURL[num])
    await page.waitFor(200)
  }

  await page.waitFor(1000)
  await robot.keyTap('enter')
  await page.waitForSelector('img#hplogo[alt="Google"]', { timeout: 0 })

  console.log('Second URL:')
  console.log(await page.url())
})()
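If robotjs is not an option (for example on a server without a display), an alternative worth trying is to tell the browser to skip all pauses through the Chrome DevTools Protocol instead of pressing DevTools shortcuts. This is not from the answer above, just a sketch using the documented Debugger.setSkipAllPauses command:

const puppeteer = require('puppeteer')

;(async () => {
  const browser = await puppeteer.launch({ headless: true })
  const page = await browser.newPage()

  // Open a raw CDP session for this page and ask the debugger domain
  // to ignore every pause, including `debugger;` statements
  const client = await page.target().createCDPSession()
  await client.send('Debugger.enable')
  await client.send('Debugger.setSkipAllPauses', { skip: true })

  await page.goto('https://www.example.com', { waitUntil: 'networkidle0' })
  // ... scrape as usual ...
  await browser.close()
})()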
I am trying to type my username on a website, but the input box is inside an iframe. I have tried this code to locate the element inside the iframe, but I keep getting the error "JSHandles can be evaluated only in the context they were created!"
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({ headless: false });
  const page = await browser.newPage();
  await page.goto("www.examplesite.com", { waitUntil: 'networkidle0' })
  await sleep(1000)
  const myframe = await page.frames()[2];
  const userselector = await myframe.$('input[name="usernameinput"]')
  await page.type(userselector, "myusername")
  await page.screenshot({ path: 'example.png' });
  await browser.close();
})();
The type function expects a selector string, not a handle. Since the frame also has a type function, you could do:
await myframe.type('input[name="usernameinput"]', "myusername")
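Alternatively, since you already have the element handle, you can call type on the handle itself, and it is usually more robust to find the frame by its URL or name rather than a fixed index. A sketch (the 'usernameinput' selector is from the question; the 'login' URL fragment is just a placeholder assumption):

// Find the frame by URL instead of relying on page.frames()[2]
const myframe = page.frames().find(f => f.url().includes('login')); // hypothetical URL fragment
const userselector = await myframe.$('input[name="usernameinput"]');

// ElementHandle also has its own type method, which runs in the correct frame context
await userselector.type('myusername');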
I am trying to get Puppeteer to wait for the navigation to finish before moving on to the next statement. Based on the docs for waitForNavigation(), the code below should work, but it just skips to the next statement and I have to use a workaround that waits for a specific URL in the response.
I have tried all the waitUntil options as well (load, domcontentloaded, networkidle0 and networkidle2).
Any ideas how I could get this working properly would be appreciated.
const browser = await puppeteer.launch({
  headless: false,
})
const page = await browser.newPage()

const home = page.waitForNavigation()
await page.goto(loginUrl)
await home

const login = page.waitForNavigation()
await page.type('#email', config.get('login'))
await page.type('#password', config.get('password'))
await page.click('#submitButton')
await login // << skips over this

// the following line is my workaround and it works, but ideally I don't want
// to specify the expected "after" page each time I navigate
await page.waitForResponse(request => request.url() === 'http://example.com/expectedurl')
The function page.waitForNavigation() waits for a navigation to begin and end. If the navigation has already been initiated by page.click() before the wait is in place, the event can be missed.
Therefore, you can use Promise.all() to avoid the race condition between the two calls:
const browser = await puppeteer.launch({
  headless: false,
});
const page = await browser.newPage();

await page.goto(loginUrl);
await page.type('#email', config.get('login'));
await page.type('#password', config.get('password'));

await Promise.all([
  page.click('#submitButton'),
  page.waitForNavigation({
    waitUntil: 'networkidle0',
  }),
]);

await browser.close();
I have been going through the same problem. I used pending-xhr-request, and it solved many problems when the requests were expected, but when I had late requests I still ran into trouble. It took me a while to work around it, so I built a package, puppeteer-response-waiter, to do that:
const puppeteer = require('puppeteer');
const { ResponseWaiter } = require('puppeteer-response-waiter');

(async () => {
  let browser = await puppeteer.launch({ headless: false });
  let page = await browser.newPage();
  let responseWaiter = new ResponseWaiter(page);

  await page.goto('http://somesampleurl.com');

  // start listening
  responseWaiter.listen();

  // do something here to trigger requests

  await responseWaiter.wait();
  // all requests are finished and responses are all returned back

  // remove listeners
  responseWaiter.stopListening();

  await browser.close();
})();
Hope this will solve your problem.