This is sort of a statistics question. I am looking for a website analyser, not quite like Google Analytics. I want the analyser to crawl the website itself and record all the data on each page: images, image sizes, and so on.
Even if it is just a library, it's a start for me.
Thanks
You could try wget to download all the images on a site. I doubt it's the best way to do this, though. Chrome's Inspect Element function has information on the sizes of all images on a page, if that's more what you're looking for.
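If a per-page check in the browser console is enough, a snippet along these lines lists every image on the current page together with its natural pixel dimensions (a rough sketch: it only covers img elements, not CSS backgrounds, and byte sizes are easier to read off the Network panel):

// Run in the browser console on the page you want to inspect.
var report = Array.prototype.map.call(document.images, function (img) {
  return { src: img.src, width: img.naturalWidth, height: img.naturalHeight };
});
console.table(report);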
Related
It looks like I can't get the image links to optimize my sites like I could in the previous version. Is there a way to get these links?
Thanks!
Hey Ben, I am having the same issue. What he is talking about is that before PageSpeed Insights moved to Lighthouse as it is now, we were given the exact resources needed, in the form Google requested them to be. So if we had an image that was 4 MB and 2000x2000, but the viewport where the image appeared was, let's say, 300x300, Google would provide that resized picture in a zip folder along with all the other photos in the same boat. Also, if JavaScript or CSS needed minification, it provided those files for us too. I do not see that option at all any longer, and it's really disappointing, as it saved me two steps of optimization in regards to page speed. Hoping we can get it back!?
A client would like to have his website presented like the following one on Google:
https://www.google.ch/search?q=mlzd&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:fr:official&client=firefox-a
With:
a list of links to the different sections of the website.
the Google feature with the arrow (on rollover of the links) that shows previews of the website's pages.
So, here are my questions:
Do you know how to implement the several links? Is it done with tags?
Do I then have to do something to get the Google image previews, or is it automatic?
Will the Google image previews work with a Flash website? If not, will the preview be a screenshot of the website with no Flash enabled?
Thanks a lot for your answers!!
David
The links are called Sitelinks, and they are automatic. You can't do anything to specifically enable them.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=47334
The preview is automatic too, but you do need to make sure that you 1) allow Google to cache the URL (i.e. avoid "nocache") and 2) do not block the previews in any way: https://sites.google.com/site/webmasterhelpforum/en/faq-instant-previews
See the link in the previous answer.
We have been hounded by an issue on our websites: web protection services like Norton's keep telling certain visitors, in certain browsers, that our websites are potential risks because we link to a certain http://something.abnormal.com/ (sample URL only).
I've been trying to scour the site page by page, to no avail.
My question: do you know of any site or tool that would be able to "crawl" our website's pages and then check whether any text, image, or whatever in them links to the abnormal URL that keeps bugging us?
Thanks so much! :)
What you want is a 'spider' application. I use the spider in Burp Suite, but there is a range of free, cheap, and expensive ones.
The good thing about Burp is that you can get it to spider the entire site and then look at every page for whatever you want, whether that is something matching a regex, dynamic content, etc.
If your websites consist of a small number of static content pages, I would use wget to download all pages (ignoring images):
wget -r -np -R gif,jpg,png http://www.example.com
and then run a text search for the suspicious URL on the result. If your websites are more complex, HTTrack might be easier to configure for a text-only download.
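For a quick per-page check in the browser, a console snippet like this lists every element that references the suspicious domain (a sketch only; 'abnormal.com' is just the sample domain from the question, and links injected later by scripts only show up if you run it after they execute):

// Run in the browser console on a page you suspect.
var suspects = Array.prototype.filter.call(
  document.querySelectorAll('a[href], img[src], script[src], link[href], iframe[src]'),
  function (el) {
    return (el.href || el.src || '').indexOf('abnormal.com') !== -1;
  }
);
console.log(suspects);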
Hey guys,
Weird question - I have no idea how to describe what I want in the title of this question.
I wonder how I can measure or query how many megabytes or kilobytes my browser has to download to view the front page of my website.
I'm trying to optimize my website for mobile devices, so I wonder how many bytes a mobile browser has to download to view my website: images, JS files, CSS files, etc., all in all. Is there a nice and simple way to measure that?
Thank you for your help.
Regards, Matt
You can use your web browser's developer tools.
For Chrome, the tools are embedded in the browser:
http://www.chromium.org/devtools/google-chrome-developer-tools-tutorial
For Firefox you can use the Firebug plugin:
http://getfirebug.com/network
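If you want a number you can query rather than read off the Network panel, recent browsers also expose the Resource Timing API. A rough sketch (it only counts sub-resources, not the HTML document itself, and transferSize can report 0 for cross-origin resources that don't send a Timing-Allow-Origin header):

// Run in the browser console after the page has fully loaded.
var resources = performance.getEntriesByType('resource');
var totalBytes = resources.reduce(function (sum, r) {
  return sum + (r.transferSize || 0);
}, 0);
console.log((totalBytes / 1024).toFixed(1) + ' KB transferred by sub-resources');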
I'm trying to build a batch image downloader in Chrome. Basically, I will overlay a small download square on each image on the page, and the user clicks on it to download. Or the user can click to download all images on a page. I'm currently stuck on figuring out how to download the images. The best I can come up with is to use XHR to send the image to another server, where the user can then retrieve it.
If anyone has a solution for me, it would be much appreciated!
Jason
I believe you can XHR the images and, using the File API, store them locally.
Take a look at the following site: http://www.html5rocks.com/features/file. There are additional resources in the right column with detailed examples and tutorials, such as http://www.html5rocks.com/tutorials/file/filesystem/
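As a minimal sketch of that idea (assuming Chrome's prefixed FileSystem API from the tutorial above; the function name, quota, and file name are just placeholders, and error handling is omitted):

// Fetch an image with XHR and write it into Chrome's sandboxed file system.
function saveImageLocally(imageUrl, fileName) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', imageUrl, true);
  xhr.responseType = 'blob'; // receive the image bytes as a Blob
  xhr.onload = function () {
    window.webkitRequestFileSystem(window.TEMPORARY, 10 * 1024 * 1024, function (fs) {
      fs.root.getFile(fileName, { create: true }, function (fileEntry) {
        fileEntry.createWriter(function (writer) {
          writer.write(xhr.response); // write the Blob to the local file
        });
      });
    });
  };
  xhr.send();
}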
Mohamed Mansour
This code will do the trick for you: https://gist.github.com/1049553
It's a very simple usage of a 'feature' in Chrome when you open an image in a new tab.