The vast majority of servers still deliver pages and files using the HTTP/1.1 transfer protocol from 1999. In its Webmaster Blog, Google announced that from November, Googlebot will actively support the new HTTP/2 standard when crawling pages. Because the protocol is backwards compatible, crawling will continue to work with servers that do not support version 2.
Thanks to multiplexing, retrieving pages via HTTP/2 is more efficient and uses fewer resources than the old version, for both the server and Google itself. And Google, as we know, crawls a very large number of pages. For a page to be crawled over HTTP/2, an SSL/TLS certificate must also be installed. Since many websites already run on HTTPS, such a certificate is usually already in place.
There is no need to rush
There is no rush for website operators to switch. Google will continue to support HTTP/1.1, especially since the new version of the protocol is still only sparsely used. In most cases, anyone whose website runs with a hosting provider will one day benefit from HTTP/2 without having to do anything themselves, as soon as the provider makes the switch.
Incidentally, pages will not receive any ranking advantage simply because they use HTTP/2. Google made this clear after speculation about it in SEO circles. HTTP/2 is primarily a server-side matter. Anyone who wants to try out HTTP/2 for themselves can do so using this example page.
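One simple way to check whether a server already speaks HTTP/2 is to look at what it negotiates during the TLS handshake: servers that support HTTP/2 advertise "h2" via ALPN. The sketch below uses only the Python standard library; the hostnames are illustrative examples, not sites mentioned by Google.

```python
import socket
import ssl


def supports_http2(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Return True if the server selects "h2" (HTTP/2) via TLS ALPN."""
    ctx = ssl.create_default_context()
    # Offer both protocols; the server picks the best one it supports.
    ctx.set_alpn_protocols(["h2", "http/1.1"])
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.selected_alpn_protocol() == "h2"


if __name__ == "__main__":
    # Example hosts chosen for illustration only.
    for host in ["www.google.com", "example.com"]:
        label = "HTTP/2" if supports_http2(host) else "HTTP/1.1 only"
        print(f"{host}: {label}")
```

Note that this only tests the TLS negotiation, not whether Googlebot actually crawls the site over HTTP/2; Google decides that per site based on whether a crawl would benefit.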