A head-scratcher to start the year with (luckily a friend's client, not mine)
• Site has fallen off a cliff since early Dec: roughly a 90-95% drop in clicks, impressions and traffic. It doesn't look algorithm-related, and no major code or site changes were made around that time
• Losing most indexed pages to 'soft 404' status - all different page types, including the homepage, product pages and blog pages
• Testing pages via GSC URL Inspection, the Rich Results Test, a Screaming Frog crawl, viewing the rendered source etc. doesn't flag any issues; nothing appears off
• GSC Crawl Stats all appear fine in terms of response times and response codes
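For what it's worth, here's a rough Python sketch of cross-checking the raw access logs against what GSC Crawl Stats reports, i.e. what status codes and body sizes Googlebot is actually being served. The log path and combined log format are assumptions; adjust for the real stack.

# Rough sketch: tally status codes and response sizes served to Googlebot,
# straight from the server access log, independent of what GSC reports.
# Assumes an nginx/Apache combined log format at the path below (placeholder).
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # assumption: your log location/format may differ
LINE_RE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+)[^"]*" (\d{3}) (\S+) "[^"]*" "([^"]*)"'
)

statuses = Counter()
tiny_responses = []  # suspiciously small bodies returned to Googlebot

with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        m = LINE_RE.match(line)
        if not m:
            continue
        ip, method, path, status, size, ua = m.groups()
        if "Googlebot" not in ua:
            continue
        statuses[status] += 1
        if size.isdigit() and int(size) < 5000:  # arbitrary "looks too small" threshold
            tiny_responses.append((path, status, size))

print("Status codes served to Googlebot:", dict(statuses))
print("Suspiciously small responses:", tiny_responses[:20])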
The soft 404s suggest to me that some part of the page content isn't being rendered or displayed (e.g. Google is only seeing the header and footer with nothing in between). But the fact that we can't replicate this in GSC inspection testing, and that it's happening across nearly every page of the site rather than being localised to one template type or module, doesn't support that theory. Likewise, there's enough text and content in the raw HTML (even though it's a JS-heavy build) that Google shouldn't be seeing fully blank content.
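To sanity-check the blank-content theory outside of GSC, something like the sketch below could help: fetch the same URL with a normal browser user agent and a Googlebot user agent and compare the raw HTML that comes back. The URL and marker phrase are placeholders, and because the request won't originate from a genuine Googlebot IP, a Cloudflare challenge or block here would be expected rather than conclusive - but a big gap in body size, or the marker text missing from the raw HTML, is still a useful signal.

# Minimal sketch: compare the raw HTML returned to a browser UA vs a Googlebot UA.
# URL and MARKER are placeholders - use a real page and a phrase from its body copy.
import requests

URL = "https://www.example.com/some-product-page/"   # placeholder
MARKER = "unique phrase from the page's body copy"   # placeholder

UAS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
               "(KHTML, like Gecko) Chrome/120.0 Safari/537.36",
    "googlebot": "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
                 "Googlebot/2.1; +http://www.google.com/bot.html) Chrome/120.0 Safari/537.36",
}

for label, ua in UAS.items():
    r = requests.get(URL, headers={"User-Agent": ua}, timeout=30)
    print(f"{label}: status={r.status_code}, bytes={len(r.content)}, "
          f"marker_in_raw_html={MARKER.lower() in r.text.lower()}")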
Cloudflare is in place with some Googlebot spoof detection / protection. I'm assuming Cloudflare haven't stuffed that up, and that if they had, it would show up as a bunch of server errors in Crawl Stats anyway? Possibly another layer in the tech stack is causing problems?
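If the spoof detection is a suspect, one check is to run Google's documented reverse-DNS verification against the IPs hitting the site with a Googlebot user agent (pulled from the logs or Cloudflare firewall events), to see whether genuine Googlebot traffic is being misclassified. A minimal sketch, with a placeholder IP list:

# Google's documented check: reverse DNS the IP, confirm the hostname is under
# googlebot.com / google.com, then forward-resolve it back to the same IP.
import socket

CANDIDATE_IPS = ["66.249.66.1"]  # placeholder - use IPs seen in your own logs

def is_real_googlebot(ip: str) -> bool:
    try:
        host = socket.gethostbyaddr(ip)[0]              # reverse DNS lookup
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward-confirm the hostname
    except socket.gaierror:
        return False
    return ip in forward_ips

for ip in CANDIDATE_IPS:
    print(ip, "->", "real Googlebot" if is_real_googlebot(ip) else "not verified")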
What would you be looking at next?