And an equally important question #2: how many of my subpages are visible in Google?

A website is not visible in Google search results because:

  • it has not been added to the Google index (neither you nor anyone else submitted it)
  • it has poor or unoptimized content (no SEO), and Google considered it of little value
  • the whole domain was banned for bad positioning (aggressive, careless, exaggerated), i.e. it disappeared from the index
  • it got a filter for bad positioning (it was pushed down by dozens of positions)
  • there were technical problems, e.g. a noindex tag in the code, or a block in the robots.txt file or the sitemaps

How to check it? You will know the answer to both questions by typing into Google the phrase: site:yourdomain.com
This will show the number of indexed subpages of your site, ordered by the length of the links (addresses).
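For example (assuming the invented domain example.com), the queries typed into the Google search box could look like this:

```text
site:example.com          <- all indexed subpages of the whole domain
site:example.com/blog/    <- indexed subpages under the /blog/ path only
```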

Let's start with submitting a site to Google. For a website to appear in Google search, a crawler (i.e. a search robot) must first reach its address.

There are two ways to point it to that address:

1/ setting up a Google Search Console account and asking it to inspect the page in question (adding the subpage to the index). This always concerns a SINGLE address.

2/ placing a link to the page on ANOTHER, already indexed page. Such a page is, for example, your Facebook profile or a website directory.

The crawler visits the page, and a set of algorithms decides whether to INDEX this single page (and what initial value to give it).

Adding a page to the index means storing it in Google's giant database. From it, answers to people's queries are retrieved (a search result = a list of relevant pages with descriptions).

When answering your query, Google does not search the entire Internet, but only its own database (we used to call such a thing an archive), and it does so instantly!

Can the index (in other words: the SERPs, the archive on servers, the databases) differ from reality?


Is it possible for a site to be only partially indexed?

Yes. There are sites that have, for example, only 10-90% of their subpages indexed.

Is it possible for a particular subpage to be partially indexed? (for example, half of its text)


The developers of Google and other search engines are aware that most of the Internet consists of pages/subpages with duplicate or junk content. That's why they don't index everything.

According to Ahrefs statistics, 90% of the subpages on the Internet get no organic traffic at all; that is, no one reaches them from free search engine results. This is due both to junk content and to a lack of content optimization (no SEO).

Is it possible that the page header or content shown in Google differs from the real page?!


1/ Google, for its needs (the SERPs, i.e. search results), makes a copy of the page/subpage. If the robot (crawler) rarely visits you, the content shown in the search engine may be outdated. This can be fixed, for example, via a request in Search Console. By the way: the equivalent of this tool on Facebook is Fetch.

2/ MONEY -> for a search engine, the key word is "relevant", i.e. pertinent. The algorithms may decide it is better to present your site with a cluster of words that ("in their opinion") is both more relevant and unique. In practice, this prevents ten identical meta titles and meta descriptions from appearing next to your competitors'. You can reduce this somewhat by applying length rules to these fields: title 50-70 characters, description 60-160 characters. That way you signal to Google "rather not to change them".
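The length rules above can be expressed as a quick sanity check in a few lines of code (a minimal sketch: the 50-70 and 60-160 ranges come from the text above, while the function name and the sample strings are my own inventions):

```python
# Sketch: check meta title and description lengths against the
# ranges suggested above (50-70 chars for title, 60-160 for description).

def check_meta_lengths(title: str, description: str) -> dict:
    """Return which of the two fields fall inside the suggested ranges."""
    return {
        "title_ok": 50 <= len(title) <= 70,
        "description_ok": 60 <= len(description) <= 160,
    }

# Example: a title that is too short, a description inside the range
result = check_meta_lengths(
    "Web design Katowice",  # 19 characters -> too short
    "Studio72 is an interactive agency from Katowice, Poland, "
    "building and positioning websites for clients worldwide.",
)
print(result)  # {'title_ok': False, 'description_ok': True}
```

A check like this will not stop Google from rewriting your snippet, but it catches the obvious cases where a rewrite is almost guaranteed.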

What are other reasons for lack of visibility?

Google does not see your site because:

  • you have an indexing block (a noindex tag in the page files)
  • robots (crawlers) are blocked: User-agent: * Disallow: / in the robots.txt file, or rules in .htaccess
  • the site is blocked in public mode, e.g. password-protected or under construction, WHILE you see it as a logged-in administrator
  • duplicate content has occurred, i.e. the same content is repeated on different pages
  • you got a penalty from Google (a ban or a filter)
  • Google has diagnosed malware on your site
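The first two technical blocks on the list can be detected in the page source and in robots.txt. A minimal sketch (standard library only; the HTML and robots.txt contents are made-up examples, not fetched from any live site):

```python
# Sketch: detect two common indexing blocks -- a noindex meta tag in
# the HTML, and a blanket "Disallow: /" rule in robots.txt.
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Flags <meta name="robots" content="...noindex..."> tags."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            if "noindex" in (a.get("content") or "").lower():
                self.noindex = True

def has_noindex(html: str) -> bool:
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex

def blocks_all_crawlers(robots_txt: str) -> bool:
    """True if robots.txt disallows everything for every user agent."""
    current_agent, blocked = None, False
    for line in robots_txt.splitlines():
        line = line.split("#")[0].strip()   # drop comments
        if line.lower().startswith("user-agent:"):
            current_agent = line.split(":", 1)[1].strip()
        elif line.lower().startswith("disallow:") and current_agent == "*":
            if line.split(":", 1)[1].strip() == "/":
                blocked = True
    return blocked

# Example with the two blocks from the list above
page = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
robots = "User-agent: *\nDisallow: /"
print(has_noindex(page), blocks_all_crawlers(robots))  # True True
```

In practice you would feed these functions the real page source and robots.txt of your site; Search Console reports the same problems, but a script like this lets you catch them before deployment.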

Do you know how to get your site indexed, speed up indexation, or increase the number of subpages indexed in the search engine?

If not, use the "Consultation" button. How do we know how to do it? We have been creating websites for 20 years. Katowice (in Poland) is our headquarters, but we work for clients from all over the world.

Why does Google act this way?

It is all about wasted energy.

The "technical spokesman" for Google, John Mueller (formally: Senior Webmaster Trends Analyst), says bluntly that the search engine's goal is not to index all the pages on the Internet, because it is a gigantic and ever-growing resource.

The "spokeswoman" for the rival search engine Bing, Christi Olson, adds that blindly crawling (i.e. scouring) the Internet for new pages/subpages is a waste of energy.

Their posts suggest that in the future, REQUESTING that a page/subpage be crawled/indexed may only be possible "on demand".

Of course, the request may be granted or denied, by the algorithms.

How does the crawler get from one subpage to another (within the same site)? It does this through your internal link system, for example through the site's MENU.

For example, it goes from the 'Home' page to the 'Offer1' or 'Offer2' subpage.
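The way a crawler discovers those subpages can be sketched with a tiny link extractor (standard library only; the HTML menu below and the function names are my own inventions):

```python
# Sketch: extract internal links (same-site hrefs) from an HTML menu,
# the way a crawler would discover 'Offer1' and 'Offer2' from 'Home'.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def internal_links(html: str, base_url: str) -> list:
    """Return absolute URLs of links that stay on the same host."""
    collector = LinkCollector()
    collector.feed(html)
    host = urlparse(base_url).netloc
    urls = [urljoin(base_url, h) for h in collector.hrefs]
    return [u for u in urls if urlparse(u).netloc == host]

# An invented site menu, as the crawler might see it on the Home page
menu = ('<nav><a href="/offer1">Offer1</a><a href="/offer2">Offer2</a>'
        '<a href="https://facebook.com/profile">FB</a></nav>')
print(internal_links(menu, "https://example.com/"))
# ['https://example.com/offer1', 'https://example.com/offer2']
```

Note that the external Facebook link is filtered out: the crawler records it, but it does not count as part of your internal link system.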

Important points:

  • As of March 2021, the Google robot crawls, indexes and ranks your site ONLY in the mobile version. This means (in simple terms) that it sees the site the way you do on a mobile phone.
  • The site: command does not show all the pages of your site indexed by Googlebot; it shows a sample. The order of these subpages is variable; they are listed by link length, and the list does not reflect their strength. The full list of indexed subpages (along with any errors) can only be viewed in Google Search Console.
  • Does a sitemap solve the problem of indexing all subpages? No (but it is valuable).
  • Do Google structured data elements (rich snippets) make sense? Yes (the basic tool here –
  • A crawler scouring the Internet consumes energy, and it would like to consume as little as possible. So factors like your server's response time and LCP (Largest Contentful Paint), the largest element of the page, start to matter a lot (usually it's the main photo at the top of a given subpage, the so-called above the fold).
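For reference, the sitemap mentioned above (valuable, but not sufficient on its own) can be as small as this (a minimal sketch following the sitemaps.org format, with an invented domain and date):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-03-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/offer1</loc>
  </url>
</urlset>
```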

How to measure page speed (and learn about other parameters)? There are two schools:

1/ What matters to Google? Its own test, performed from abroad

2/ What matters to a normal person? Here I recommend any test run from abroad, such as Pingdom / GTmetrix / or

It is important to look at the results of MORE than one test/tool.

Glossary (definitions of terms from the official SEO guide maintained by Google, see:

Index – Google stores all the websites it knows about in an index. Each page’s entry in the index specifies its content and location (URL). During indexing, Google retrieves, reads and adds a page to the index: several pages of my site were indexed by Google today.

Crawling – the process of finding new or updated web pages. Google discovers URLs by following links, reading sitemaps, and in many other ways. Google crawls the Internet looking for new pages, and then (if necessary) indexes the ones it has found.

Robot – automated software that crawls (downloads) and indexes pages from the Internet.

Googlebot – the generic name for Google’s robot. Googlebot continually indexes the Internet.

SEO – search engine optimization, the process of improving sites for search engines. It’s also the job title of the person who does this professionally: we hired an SEO specialist to increase our online presence.

Finally … the beginning of another article of mine about SEO

(taken from

How many pages does Google have in its index?

It's hundreds of billions. The pages associated with the city of Katowice, for example, are a microscopic crumb in these data sets.

How did these sites get there?

They were submitted by developers and agencies, search robots came across them on their own (more here), and so on…

Does your site matter to Google?

If yours is missing, there will be others (in excess).

Is it possible to get along with Google?

No. It's a set of schemes (read: algorithms). They don't talk, they work.

What is positioning based on ?

On knowing these schemes.

Do the schemes change ?

Yes. There are huge changes, like the 2012 Penguin algorithm cutting off artificial links (the so-called game changer that bankrupted many businesses), AND dozens of micro-changes every month.

How much does Google earn from SEO?


And how much does it earn from the Google Ads (formerly AdWords) advertising system?

It is more than 80% of its revenue: about $147 billion in 2020 (source: CNBC).


You are reading the BLOG of the interactive agency Studio72 from Katowice. Our company website can be found at

Average Google rating: 5.0 – ☆☆☆☆☆ – See the reviews (click here) – reviews in Google Maps

Author:
Tomasz Kita [bio]
Studio72 / Katowice
tel. 501 469 329

Support my work –
