To perform SEO (search engine optimization), we first need to understand how search engines work.
At the most basic level, a search engine analyzes the pages and content of websites with a program called a "crawler" and stores what it finds in its databases.
Crawler
The crawler is the process by which the search engine visits websites and saves a list of everything it finds there: at minimum, page titles, images, keywords, and links to other pages. Some search engines store a copy of the entire page and can even analyze where ads and links are placed on it.
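To make the idea concrete, here is a minimal sketch (using only Python's standard library) of the extraction step described above: pulling the title, links, image sources, and meta keywords out of a page's HTML. The sample HTML and the `PageIndexer` class name are illustrative assumptions, not part of any real search engine's code; a real crawler would first fetch the page over HTTP.

```python
from html.parser import HTMLParser

class PageIndexer(HTMLParser):
    """Collects the items a crawler typically stores for a page:
    title, outgoing links, image sources, and meta keywords."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self.images = []
        self.keywords = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "img" and "src" in attrs:
            self.images.append(attrs["src"])
        elif tag == "meta" and attrs.get("name") == "keywords":
            self.keywords = [k.strip() for k in attrs.get("content", "").split(",")]

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# A hypothetical page, stood in for a real HTTP response:
html = ('<html><head><title>Example Shop</title>'
        '<meta name="keywords" content="shoes, sneakers"></head>'
        '<body><a href="/about">About</a><img src="/logo.png"></body></html>')
indexer = PageIndexer()
indexer.feed(html)
print(indexer.title, indexer.links, indexer.images, indexer.keywords)
# → Example Shop ['/about'] ['/logo.png'] ['shoes', 'sneakers']
```

Everything the parser collects would then be written to the search engine's database, keyed by the page's URL.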
Crawling is performed automatically by computers, which visit every page on a website one by one far faster than a person could. It is also an endless process: search engines re-crawl websites periodically.
When a crawled page links to a website the search engine has not seen before, that new website gets crawled as well. The frequency and depth of crawling vary from site to site.
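The discovery behavior described above can be sketched as a breadth-first traversal over links. The toy `WEB` dictionary below stands in for real HTTP fetches and is purely an assumed example: because `siteA/home` links to `siteB/home`, the previously unknown siteB is pulled into the crawl automatically.

```python
from collections import deque

# A toy "web": page URL -> list of outgoing links. A real crawler
# would fetch each page over HTTP and extract the links from its HTML.
WEB = {
    "siteA/home":  ["siteA/about", "siteB/home"],  # links to a new site
    "siteA/about": ["siteA/home"],
    "siteB/home":  ["siteB/blog"],
    "siteB/blog":  [],
}

def crawl(seed):
    """Breadth-first crawl: every newly discovered link is queued,
    so a link to an unknown site brings that site into the index."""
    seen = {seed}
    queue = deque([seed])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)            # "index" the page
        for link in WEB.get(url, []):
            if link not in seen:     # only queue pages not yet visited
                seen.add(link)
                queue.append(link)
    return order

print(crawl("siteA/home"))
# → ['siteA/home', 'siteA/about', 'siteB/home', 'siteB/blog']
```

Real crawlers add politeness rules on top of this loop (robots.txt, rate limits, per-site scheduling), which is one reason crawl frequency and depth differ from site to site.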