Gatsby page does not load when 'adwords' is in the page slug - frontend

I have multiple Gatsby pages whose URLs are http://localhost:<port>/adwords and http://localhost:<port>/adwords/remarketing, and neither of these pages loads when the AdBlock extension is active in Google Chrome. I have tried changing the slug to /ad-words/, which fixes the problem, but I would like a solution that does not require changing the slugs, for SEO reasons.
The error in the console says:
Failed to load resource: net::ERR_BLOCKED_BY_CLIENT
http://localhost:<port>/page-data/adwords/page-data.json:1
So I'm guessing that because this JSON file has the word 'adwords' in its path, the extension is stopping the browser from downloading it.
Any insights are greatly appreciated! Thanks
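For illustration only: ad blockers apply URL filter rules, and a path segment like adwords is enough to trip one. The substrings below are stand-ins, not the actual EasyList rules, but they sketch why /ad-words/ slips through while /adwords/ does not:

# Illustration only: these substrings are examples, not the real filter rules.
BLOCKED_SUBSTRINGS = ["/adwords/", "/ads/", "doubleclick"]

def would_be_blocked(path):
    return any(s in path for s in BLOCKED_SUBSTRINGS)

print(would_be_blocked("/page-data/adwords/page-data.json"))   # True  (request blocked)
print(would_be_blocked("/page-data/ad-words/page-data.json"))  # False (loads fine)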

Related

Eliminate Redirection malware in my website

I am experiencing redirection malware on some of the subpages of my website. I tried deleting a script that was created by someone else (I use WordPress), but it still redirects to that site. I don't know how to fix it and would like some help, please.
Messages like these also appear in the console:
www-widgetapi.js:1120 Failed to execute 'postMessage' on 'DOMWindow': The target origin provided ('https://www.youtube.com') does not match the recipient window's origin ('https://mysite.example').
q.sendMessage # www-widgetapi.js:1120
googleads.g.doubleclick.net/pagead/id:1
Failed to load resource: net::ERR_BLOCKED_BY_CLIENT
static.doubleclick.net/instream/ad_status.js:1
Failed to load resource: net::ERR_BLOCKED_BY_CLIENT
jsclou.in/pab1002.js:1
Mixed Content: The page at 'https://mysite.example/wp-admin/post.php?post=114&action=elementor' was loaded over HTTPS, but requested an insecure frame 'http://malware.example'. This request has been blocked; the content must be served over HTTPS.
Any advice about this please?
I tried to delete the script where I found the jsclou site, but it didn't solve it.
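One way to narrow down where the injected reference lives is to search the WordPress files for the suspect domain. A minimal sketch, assuming you have file access to the install; the wp-content path below is a placeholder, and note that injected scripts can also live in the database, which this does not cover:

# Minimal sketch: recursively search WordPress files for the injected domain.
import pathlib

SUSPECT = "jsclou.in"
for path in pathlib.Path("/var/www/html/wp-content").rglob("*"):
    if path.is_file():
        try:
            if SUSPECT in path.read_text(errors="ignore"):
                print(path)  # file that references the injected domain
        except OSError:
            pass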

"Uses deprecated APIs" (1 warning found) and "Browser errors were logged to the console"

I have added an image from my WordPress site where I am facing these two issues, which have become a hurdle in optimizing my site for best practices. I tried to find a solution but could not work out how to fix this on my WordPress site.
Screenshot from Chrome's built-in dev tools showing "Uses deprecated APIs: 1 warning found" and "Browser errors were logged to the console".
I found the solution: removing this policy from the .htaccess file. It's an HTTP header; you can check your website's HTTP headers to see whether it is being set.
Header set Expect-CT enforce,max-age=2592000,report-uri="https://example.com/"
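A quick way to confirm whether the header is actually gone after editing .htaccess (a sketch using the requests library; replace the URL with your own site):

# Check the live response headers; Expect-CT should disappear once removed.
import requests

resp = requests.get("https://example.com/", timeout=10)
print(resp.headers.get("Expect-CT"))  # prints None once the header is no longer sent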
Here is some help on this issue, I believe:
https://developer.chrome.com/blog/immutable-document-domain/

Google couldn't fetch my sitemap.xml file

I've got a small Flask site for my old WoW guild, and I have been unsuccessful in getting Google to read my sitemap.xml file. I was able to successfully verify my site using Google's Search Console, and it seems to crawl the site just fine, but when I go to submit my sitemap it lists the status as "Couldn't fetch". When I click on that for more info, all it says is "Sitemap could not be read" (not helpful).
I originally used a sitemap generator website (forgot which one) to create the file and then added it to my route file like this:
from flask import request, send_from_directory  # in addition to the existing app/blueprint imports

@main.route('/sitemap.xml')
def static_from_root():
    return send_from_directory(app.static_folder, request.path[1:])
If I navigated to www.mysite.us/sitemap.xml it would display the expected results, but Google was unable to fetch it.
I then changed things around and started using flask-sitemap to generate it like this:
@ext.register_generator
def index():
    yield 'main.index', {}
This also works fine when I navigate directly to the file, but Google again does not like it.
I'm at a loss. There doesn't seem to be any way to get help from Google on this, and so far my web searches aren't turning up anything helpful.
For reference, here is the current sitemap link: www.renewedhope.us/sitemap.xml
I finally got it figured out. This seems to go against what Google advises, but I submitted the sitemap as http://renewedhope.us/sitemap.xml and that finally worked.
From their documentation:
Use consistent, fully-qualified URLs. Google will crawl your URLs exactly as listed. For instance, if your site is at https://www.example.com/, don't specify a URL as https://example.com/ (missing www) or ./mypage.html (a relative URL).
I think that only applies to the sitemap document itself.
When submitting the sitemap to google, I tried...
http://www.renewedhope.us/sitemap.xml
https://www.renewedhope.us/sitemap.xml
https://renewedhope.us/sitemap.xml
The only format that they were able to fetch the sitemap from was:
http://renewedhope.us/sitemap.xml
Hope this information might help someone else facing the same issue :)
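For what it's worth, here is a minimal sketch of serving the sitemap with one consistent, fully-qualified base URL. The page paths are made up for illustration; the point is simply that every listed URL uses the exact scheme and host that was submitted to Search Console:

# Sketch: every <loc> uses the same scheme and host submitted to Search Console.
from flask import Flask, Response

app = Flask(__name__)
BASE = "http://renewedhope.us"        # must match the URL submitted to Search Console
PAGES = ["/", "/roster", "/events"]   # illustrative paths, not the real site structure

@app.route("/sitemap.xml")
def sitemap():
    urls = "".join(f"<url><loc>{BASE}{p}</loc></url>" for p in PAGES)
    xml = ('<?xml version="1.0" encoding="UTF-8"?>'
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
           f"{urls}</urlset>")
    return Response(xml, mimetype="application/xml")

if __name__ == "__main__":
    app.run()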
Put this line in your robots.txt file: Sitemap: https://domainname.com/sitemap.xml (it needs to be a fully-qualified URL). Hope this will be helpful.

Where/how is the HTML code of vsftpd generated?

I mean the HTML code generated when you go to ftp://yourserver... I tried to google it with no luck, and I didn't find it on the project web page either. I'd like to modify that code to make it more mobile friendly.
Thank you in advance
If you use a browser to load an FTP directory, the HTML page is generated by the browser itself from the raw directory listing; vsftpd only sends the listing, not HTML.
For example, see this part of the source code for Google Chrome.

Website not opening in Chrome?

Users of one of my friend's sites are getting this error.
Oops! This link appears to be broken in Google Chrome
http://www.labnol.org/software/webpages-not-opening-in-google-chrome/13041/
Can he do something with his hosting to ensure users of his site will not get this error?
As it is a browser bug, you cannot change this behaviour.
Well, since the bug only occurs when prefetching links from your page fails, you could of course remove all 'href' attributes from your HTML source and add them on page load using JavaScript. That would result in Chrome not prefetching anything: no prefetching => no prefetching error. But this 'solution' is not practical.
Are you using redirects? Because Chrome wants a proper status header with those.

Resources