Blogger search only showing 7 results

This is my blog: https://www.vapeupdeals.com
If I enter any search term, it only shows 7 results. I am not sure why this is happening.

There are two possibilities:
Your posts-per-page limit is set to 7.
Your post list exceeded the limits of the auto-pagination feature (a hidden feature that Blogger barely mentions): a page is truncated at 50 <img> tags, and there is also a cap on the size in kilobytes of the requested HTML page, the exact amount of which is not published.
Here's the full article from the official Blogger blog, from almost a decade ago:
https://blogger.googleblog.com/2010/02/auto-pagination-on-blogger.html

Related

After 3 years my website with 1338 URLs in the sitemap still shows 1232 pages "Crawled - currently not indexed" in Search Console. Is this normal?

I started my wildlife photography website (www.stevenbrumby.com) in 2018, and since the site relies heavily on JavaScript to display content, I was aware from the outset that a sitemap would be crucially important. Initially the sitemap included more than 1000 URLs; since then I have periodically updated it, and it now includes 1338 URLs. The sitemap status has been "success" all along. I've also checked with other sitemap validators, and no errors were found.
In Search Console, I have 43 valid pages, all with the status "Indexed, not submitted in sitemap". But these pages are actually in the sitemap (I have not checked all 43, but the ones I checked were all there). This is the first thing I don't understand!
There are 1.26K excluded pages, of which the majority (1232) have the status "Crawled - currently not indexed". Maybe I am being impatient, but I would have thought that by now some of these pages would have been indexed.
I would welcome any advice on where I am going wrong and how I might improve things.
After much trial and error, I have found the answer to my question. Most of the URLs in my sitemap had four query parameters. When I reduced this to one query parameter, Google immediately started indexing my site. Four weeks after the change, 88% of the URLs in my sitemap have been indexed, and I expect this percentage to increase further in the weeks ahead.
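For illustration, the change was roughly of the following shape. This is a minimal sketch in PHP; the domain and parameter names are hypothetical, not the actual ones from the site:

    <?php
    // Hypothetical illustration: collapse a four-parameter URL into a
    // single-parameter canonical URL before writing it to the sitemap.
    // Before: https://example.com/gallery?species=...&region=...&year=...&page=...
    // After:  https://example.com/gallery?species=...
    function sitemapUrl(string $species): string
    {
        return 'https://example.com/gallery?' . http_build_query(['species' => $species]);
    }

    echo sitemapUrl('kingfisher'); // https://example.com/gallery?species=kingfisher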
Sometimes it takes 2-3 days, or maybe a couple of weeks, before newly added URLs are indexed, especially if there are a lot of pages involved.
Good day! :)

Bing Search API / Drupal 8: search displays more pages and results than are actually available

On our site we use the Bing Search API v5.
Example: when I search for the term "music", either on our site search or on bing.com restricted to our site, I get 167 total estimated matches; given that we display 20 results per page, that is 9 pages' worth of results.
The issue is that when I click on page 9, the pager displays "Displaying 161 - 74 of approximately 74 results." and only shows 4 pages, with no option left to click on a page beyond 4. Is there some known bug that could be causing this issue? I am scratching my head here.
This is expected: as you page towards the initially promised number of matches, all search engines remove redundant/very similar matches, so you get fewer results. You can see this behavior on any search engine site, where millions of web pages are indicated on the first page but you get only a handful of results when you navigate to pages further down. The practical fix is to recompute your pager from the estimate returned with each request instead of caching the estimate from page 1, as sketched below.
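A minimal sketch of that approach in PHP, assuming the v5 endpoint and subscription-key header; the key is a placeholder:

    <?php
    // Re-read totalEstimatedMatches on every request instead of caching the
    // value from page 1: the estimate shrinks as Bing deduplicates results.
    function bingPage(string $query, int $page, int $perPage = 20): array
    {
        $url = 'https://api.cognitive.microsoft.com/bing/v5.0/search?' . http_build_query([
            'q'      => $query,
            'count'  => $perPage,
            'offset' => ($page - 1) * $perPage,
        ]);

        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HTTPHEADER, ['Ocp-Apim-Subscription-Key: YOUR_KEY_HERE']);
        $body = json_decode(curl_exec($ch), true);
        curl_close($ch);

        $estimate = $body['webPages']['totalEstimatedMatches'] ?? 0;

        return [
            'results'    => $body['webPages']['value'] ?? [],
            // Rebuild the pager from the current estimate, not the page-1 one.
            'totalPages' => (int) ceil($estimate / $perPage),
        ];
    }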

Google not indexing my page

I searched Google for a complete phrase, "visitors had the opportunity to swing them to-and-fro. Never had i experienced so ", taken from a comment on the post http://radhanathswamiweekly.com/radhanath-swami-describes-jhulan-yatra-festival/.
The comment was posted by "kiran shetty" a month ago on that post.
The Google search results are:
No results found for "visitors had the opportunity to swing them
to-and-fro. Never had i experienced so ".
Google cache says:
This is Google's cache of
http://radhanathswamiweekly.com/radhanath-swami-describes-jhulan-yatra-festival/.
It is a snapshot of the page as it appeared on 18 Sep 2014 08:00:49
GMT. The current page could have changed in the meantime. Learn more
Using "Fetch as Google" from the webmaster for the post:http://radhanathswamiweekly.com/radhanath-swami-describes-jhulan-yatra-festival/
the fetch status shows as completed.
Google fetch's Downloaded HTTP response can be found at: "http://pastebin.com/v4L1nuG3"
The Downloaded HTTP response contains the complete phrase "visitors had the opportunity to swing them to-and-fro. Never had i experienced so ".
That means google is able to see the text.
Since the cache shows that its cached on 18-sep. the comment is one month old (23-aug) from today (23-sep). Then why it is not getting indexed, as we see its not showing in the search results, even though the text exists in the http response which google sees the page as.
Your page is known and indexed by Google; you can verify this by running the following command in the Google search box:
site:radhanathswamiweekly.com/radhanath-swami-describes-jhulan-yatra-festival/
The query you are using is very specific, and it is a long phrase for Google to prepare search results for. Not many people will type that query.
You can go through a checklist I maintain if you are looking for more reasons why your page is not ranking.

Google Blogger News Feed for Website

I have a website and a google blogger site in conjunction. I would like to add a section to my website home page that displays the 3 or 4 most recent blog posts (post title and the first 100-200 words of the post).
Is there a widget that will do this or any suggestion on how to set this up?
Thanks!
I found a really useful tutorial that does exactly what I want, very easily. It pulls in the post title, the date/time of the entry, and the first 150 characters of the post. Very easy to set up.
http://alt-web.blogspot.com/2011/06/adding-blogger-rss-feed-to-html-page.html
Thanks!
I would suggest writing a short PHP script to print out the first 4 items from the RSS feed, along the lines of the sketch below.
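A minimal sketch, assuming Blogger's RSS feed at /feeds/posts/default?alt=rss; the blog address is a placeholder:

    <?php
    // Fetch the blog's RSS feed and print the 4 most recent posts:
    // title, publication date, and a ~150-character teaser.
    $feedUrl = 'https://YOURBLOG.blogspot.com/feeds/posts/default?alt=rss';

    $feed = simplexml_load_file($feedUrl);
    if ($feed === false) {
        exit('Could not load feed.');
    }

    $shown = 0;
    foreach ($feed->channel->item as $item) {
        if (++$shown > 4) {
            break;
        }
        $teaser = mb_substr(strip_tags((string) $item->description), 0, 150);
        printf(
            '<h3><a href="%s">%s</a></h3><p><small>%s</small><br>%s...</p>',
            htmlspecialchars((string) $item->link),
            htmlspecialchars((string) $item->title),
            htmlspecialchars((string) $item->pubDate),
            htmlspecialchars($teaser)
        );
    }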
Check out FeedBurner: https://feedburner.google.com/
Specifically, look at the BuzzBoost option under Publicize. This allows you to embed JavaScript in your HTML that will generate HTML of your last few posts.

question about sitemap files and their content for a dynamic website

I am writing a set of functions to generate a sitemap for a website. Let's assume that the website is a blog.
The definition of a sitemap is that it lists the pages that are available in a website. For a dynamic website, those pages change quite regularly.
Using the example of a blog, the 'pages' will be the blog posts (I'm guessing). Since there is a finite limit on the number of links in a sitemap (ignore sitemap indexes for now), I can't keep adding to the list of the latest blog posts, because at some point in the future the limit will be exceeded.
I have made two (quite fundamental) assumptions in the above paragraph. They are:
Assumption 1:
A sitemap contains a list of pages in a website. For a dynamic website like a blog, the pages will be the blog posts; therefore, I can create a sitemap that simply lists the blog posts on the website. (This sounds like a feed to me.)
Assumption 2:
Since there is a hard limit on the number of links in the sitemap file, I can impose some arbitrary limit N and simply regenerate the file periodically to list the latest N blog posts (at this stage, this is indistinguishable from a feed).
My questions then are:
Are the assumptions (i.e. my understanding of what goes inside a sitemap file) valid/correct?
What I described above, sounds very much like a feed, can bots not simply use a feed to index a web site (i.e. is a sitemap necessary)?
If I am already generating a file that has the latest changes in it, I don't see the point of adding in the sitemap protocol file - can someone explain this?
Assumption 1 is correct: the sitemap should indeed be a list of the pages on the site. In your case, yes, that would be the blog posts, plus any other pages like a contact page, home page, about page, etc. that you have.
Yes, it is a bit like a feed, but a feed generally only has the latest items in it, while the sitemap should have everything.
From Google's docs:
Sitemaps are particularly helpful if:
Your site has dynamic content.
Your site has pages that aren't easily discovered by Googlebot during the crawl process—for example, pages featuring rich AJAX or images.
Your site is new and has few links to it. (Googlebot crawls the web by following links from one page to another, so if your site isn't well linked, it may be hard for us to discover it.)
Your site has a large archive of content pages that are not well linked to each other, or are not linked at all.
Assumption 2 is a little incorrect: the limit for a sitemap file is 50,000 links / 10 MB uncompressed. If you think you are likely to hit that limit, start by creating a sitemap index file that links to a single sitemap, and then add to it as you go (see the sketch after this answer).
Google will accept an RSS feed as a sitemap if that's all you have, but points out that feeds usually contain only the most recent links. The value in having a sitemap is that it should cover everything on the site, not just the latest items, which are probably already the most discoverable.
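A minimal sketch of that chunk-and-index approach in PHP; the domain, file names, and post list are placeholders:

    <?php
    // Split the post URLs into chunks of at most 50,000, write one sitemap
    // file per chunk, and point a sitemap index at all of them.
    $posts = ['https://example.com/post-1', 'https://example.com/post-2']; // your post URLs

    $indexEntries = [];
    foreach (array_chunk($posts, 50000) as $i => $chunk) {
        $file = sprintf('sitemap-%d.xml', $i + 1);
        $xml = '<?xml version="1.0" encoding="UTF-8"?>' . "\n"
             . '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
        foreach ($chunk as $url) {
            $xml .= '  <url><loc>' . htmlspecialchars($url) . '</loc></url>' . "\n";
        }
        $xml .= '</urlset>';
        file_put_contents($file, $xml);
        $indexEntries[] = 'https://example.com/' . $file;
    }

    // The index file is what you submit; it grows as new chunks fill up.
    $index = '<?xml version="1.0" encoding="UTF-8"?>' . "\n"
           . '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
    foreach ($indexEntries as $loc) {
        $index .= '  <sitemap><loc>' . htmlspecialchars($loc) . '</loc></sitemap>' . "\n";
    }
    $index .= '</sitemapindex>';
    file_put_contents('sitemap-index.xml', $index);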
