Google Search Console - Excluded Pages?
I have a new website.
A few days ago I put the domain into Google Search Console. Just checked today and it's showing 143 excluded pages. The little question-mark help thing says: "These pages were intentionally not indexed" and "These pages won't appear in Google, but that was probably intentional." Err, no Google. This wasn't intentional. At all. These pages that you are excluding are pretty fucking important. Does anyone know why these are excluded? Is it just because the domain is too fresh? Will they get included, or is there some issue somewhere? |
The reason should be listed in Search Console... either duplicate content, a canonical issue, etc.
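If it turns out to be a canonical issue, you can check the tag yourself. A minimal Python sketch, assuming the requests and beautifulsoup4 packages are installed (the URL is a placeholder):
Code:
import requests
from bs4 import BeautifulSoup

# Placeholder URL -- swap in one of your excluded pages.
url = "https://www.example.com/some-excluded-page/"

resp = requests.get(url, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

# A canonical tag pointing at a different URL tells Google to index
# that other URL instead, which shows up as "excluded" in Search Console.
canonical = soup.find("link", rel="canonical")
if canonical is None:
    print("No canonical tag found.")
else:
    print("Canonical points to:", canonical.get("href"))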
|
How about some temporary downtime for whatever reason or some pages being blocked by robots.txt?
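The robots.txt part is easy to rule out from Python's standard library; a quick sketch with placeholder URLs:
Code:
from urllib.robotparser import RobotFileParser

# Placeholder domain -- point this at your own robots.txt.
rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

# Ask whether Googlebot is allowed to fetch one of the excluded pages.
page = "https://www.example.com/some-excluded-page/"
print("Googlebot allowed:", rp.can_fetch("Googlebot", page))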
|
Quote:
Discovered – currently not indexed
Status: Excluded |
Quote:
If I click one of the blocked URLs I get: "URL is not on Google. This page is not in the index, but not because of an error. See the details below to learn why it wasn't indexed. Learn more" |
One of my sites has 139K valid pages and 135K excluded pages. Not sure why they exclude them.
|
You should get an email from Google Search Console about this, where they usually say what they excluded and why...
It's now jumped up to 453 excluded pages.
A couple of things to note:
1. The site is an old site on a new domain.
2. It's all good quality content.
3. Google Search Console is pulling https://www.
4. There are 3 results listed in Google; all are www.
5. Robots.txt looks fine.
6. .htaccess looks fine.
7. I tried to run a Googlebot checker from a random website and got a 302 blocked error, but this could just be a false positive because I can access pages fine (see the snippet after this list).
8. It's WordPress. I have disabled a lot of the plugins that I think could be causing an issue.
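On point 7, rather than trusting a random checker site, you can reproduce the test yourself and compare what a normal browser gets against what Googlebot's user agent gets. A rough sketch assuming the requests package; the URL is a placeholder, and if only the Googlebot agent gets a 302, some plugin or firewall rule is treating bots differently:
Code:
import requests

# Placeholder -- one of the excluded URLs from Search Console.
url = "https://www.example.com/some-excluded-page/"

# Compare how the site answers a normal browser vs. Googlebot.
agents = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",
}

for name, ua in agents.items():
    # allow_redirects=False so a 302 shows up instead of being followed.
    resp = requests.get(url, headers={"User-Agent": ua},
                        allow_redirects=False, timeout=10)
    print(name, "-> HTTP", resp.status_code)
|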
It's not guaranteed at all that all your pages will be indexed; it's at Google's discretion.
Google may think your site or particular pages aren't high enough quality.
- Some people actually delete pages regularly to get more search traffic for the rest: https://www.semrush.com/blog/guide-seo-pruning-semrush/
- You may also consider manually excluding the pages you consider underperforming, rather than letting Google decide (see the sketch after this list).
- Improve your site in other ways (SEO) so they will index more pages.
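On the manual exclusion point: that usually means putting a noindex on the pages you want Google to skip, via a meta robots tag or an X-Robots-Tag header. A rough sketch for auditing which pages in a sitemap already carry one, assuming requests and beautifulsoup4 (the sitemap URL is a placeholder; a WordPress sitemap index would need one more level of unwrapping):
Code:
import requests
import xml.etree.ElementTree as ET
from bs4 import BeautifulSoup

# Placeholder -- your sitemap URL.
sitemap_url = "https://www.example.com/sitemap.xml"

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]

for url in urls:
    resp = requests.get(url, timeout=10)
    # noindex can live in an HTTP header or in a meta robots tag.
    header = resp.headers.get("X-Robots-Tag", "")
    meta = BeautifulSoup(resp.text, "html.parser").find(
        "meta", attrs={"name": "robots"})
    content = meta.get("content", "") if meta else ""
    if "noindex" in header.lower() or "noindex" in content.lower():
        print("noindex:", url)
|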
Important to you is not important to Google
|
Quote:
There are pages (such as tags) which are not indexed; however, I don't really care if Google ranks those pages or not. As mentioned, the site is an old one on a new domain. Is it possible that Google still knows about the old content and thinks this is duplicate? |
Little update: I know you don't care, but it seems I can add links manually. I have added 3 or 4, which are getting indexed the same day.
I am hoping that because these are indexed, Google will spider the rest of the website. If not, I will just have to keep adding new links.
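If adding links one at a time in the UI gets old, submitting a sitemap through the Search Console API points Google at everything in one go. A sketch, assuming the google-api-python-client and google-auth packages and a service account that has been added as an owner of the property; the file name and URLs are placeholders:
Code:
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder key file for a service account with Search Console access.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters"],
)
service = build("webmasters", "v3", credentials=creds)

# Submitting the sitemap hands Google the full list of pages,
# instead of requesting indexing one URL at a time.
service.sitemaps().submit(
    siteUrl="https://www.example.com/",
    feedpath="https://www.example.com/sitemap.xml",
).execute()
print("Sitemap submitted.")
|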
Good luck! Did it work for you?
|