I previously wrote about how Google Ads is not impacted by the Log4j security vulnerability, but many sites and servers are. So if your website needs updates because of the Log4j issue, should you worry about taking the site down while Googlebot is unable to reach it?
If you take down your site and Google cannot crawl it, your rankings may suffer. Google has spoken before about how to handle site outages, including using the 503 status code and perhaps serving a static version of the site; much of that earlier advice from Google's John Mueller applies here as well.
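For reference, serving a 503 during maintenance can be as simple as putting a stopgap responder in front of the site. Below is a minimal sketch using Python's standard library; the port, retry window, and placeholder page are illustrative assumptions, not anything Google prescribes.

```python
# Minimal maintenance responder: answers every request with a 503 plus a
# Retry-After header, signaling to crawlers that the outage is temporary.
# Port 8080 and the one-hour retry hint are illustrative assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body><h1>Down for maintenance, back soon.</h1></body></html>"
        self.send_response(503)                  # 503 Service Unavailable: temporary
        self.send_header("Retry-After", "3600")  # suggest retrying in about an hour
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

The point of the 503 (rather than a 404, or a 200 with an error page) is that Google treats it as a temporary condition and retries later, instead of dropping the URLs from the index.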
John Mueller of Google shared a Twitter thread with new advice on how to handle taking your site offline to fix any Log4j security vulnerabilities.
(1) Make a static version of your site:
... ideally (strongly recommended) use the same URLs, then very little changes for search. Dynamic functionality usually doesn't play a role in SEO (exception: some search pages). Same URLs = no redirects needed = same content, bolding, headings, internal links, etc. (...
- John (@JohnMu) December 15, 2021
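As a rough illustration of the "same URLs" point in step (1), here is a minimal Python sketch that saves static copies of a few pages under their existing paths. The page list, output directory, and the index.html mapping are assumptions for the example; a real snapshot would also need page requisites such as CSS and images.

```python
# Sketch: fetch key pages and write them to disk under their original URL
# paths, so a static file server can answer on the same URLs while the
# dynamic backend is offline. PAGES and OUT are hypothetical.
import pathlib
import urllib.parse
import urllib.request

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/contact/",
]
OUT = pathlib.Path("static_snapshot")

for url in PAGES:
    path = urllib.parse.urlparse(url).path
    rel = path.lstrip("/") or "."
    # Map "/" and trailing-slash paths to index.html so paths stay identical.
    dest = OUT / rel / "index.html" if path.endswith("/") else OUT / rel
    dest.parent.mkdir(parents=True, exist_ok=True)
    with urllib.request.urlopen(url) as resp:
        dest.write_bytes(resp.read())
    print(f"saved {url} -> {dest}")
```

Because the paths are preserved, the static copy needs no redirects and keeps the same content, headings, and internal links, which is exactly why Mueller says very little changes for search.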
(2) Copy the site and also read the advice for pausing your online business:
... If you can't do that for the whole site, doing it for your primary pages is better than nothing (check Search Console & Analytics). We have more similar tips at https://t.co/uddzHNZwPB from several years ago, I think. (...
- John (@JohnMu) December 15, 2021
(3) Host it on the same domain, but if you can't, use 302s for...
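If the static copy has to live on a different host, a temporary (302) redirect keeps the original URLs as the ones search engines retain. A minimal sketch, assuming a hypothetical backup host:

```python
# Sketch: answer every request on the original domain with a 302 (temporary)
# redirect to a static copy hosted elsewhere. TEMP_HOST and port 8080 are
# hypothetical; a 302, unlike a 301, signals the move is not permanent.
from http.server import BaseHTTPRequestHandler, HTTPServer

TEMP_HOST = "https://static.example-backup.com"  # hypothetical temporary host

class TempRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(302)  # temporary, so original URLs stay indexed
        self.send_header("Location", TEMP_HOST + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), TempRedirectHandler).serve_forever()
```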
Read Full Story: https://www.seroundtable.com/google-seo-log4j-32605.html