Re the TTFB: avg. response time in Search Console != TTFB.
The GSC metric is the time it takes to fetch the whole resource, so you could think of it as time to last byte. It's therefore not unusual for it to be longer than TTFB. (Often mitigated by the fact Googlebot has hella good broadband.)
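If you want to see the difference on your own site, here's a rough Python sketch (the URL is a placeholder, and the timings are approximate, not how GSC actually measures):

```python
import time
import urllib.request

url = "https://example.com/"  # placeholder, swap in your own page

start = time.monotonic()
resp = urllib.request.urlopen(url)  # returns once status + headers arrive, roughly TTFB
ttfb = time.monotonic() - start
resp.read()                         # drain the full body
ttlb = time.monotonic() - start     # roughly what GSC's avg. response time reflects
resp.close()

print(f"TTFB ~{ttfb * 1000:.0f} ms, time to last byte ~{ttlb * 1000:.0f} ms")
```

On a big page or a slow origin those two numbers can be very far apart, which is the whole point.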
For the HTTP/2 vs 1.1 thing: the way the network layer works in the crawling / rendering pipeline is very different from the network layer in your browser; they're tuned for different things.
A browser is concerned with showing you a webpage as quickly as possible.
Googlebot needs to get stuff at the scale of the Internet in an efficient way.
So pages and resources are queued and crawled separately, not necessarily at the same time like they would be in a browser. Some resources may already be cached, and each URL needs to be checked against robots.txt before it's fetched.
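Very hand-wavy sketch of that idea (the queue and cache names are mine, not Google's, and the URLs are placeholders):

```python
from collections import deque
from urllib import robotparser
import urllib.request

# Illustrative only: a fetch queue with a cache check and a robots.txt gate.
rp = robotparser.RobotFileParser("https://example.com/robots.txt")
rp.read()

crawl_queue = deque([
    "https://example.com/",
    "https://example.com/app.js",
])
cache = {}  # URL -> body; stands in for a shared resource cache

while crawl_queue:
    url = crawl_queue.popleft()
    if url in cache:
        continue                           # already fetched for another page
    if not rp.can_fetch("Googlebot", url):
        continue                           # blocked by robots.txt, never fetched
    with urllib.request.urlopen(url) as resp:
        cache[url] = resp.read()
```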
So the performance HTTP/2 brings with multiplexing isn't really relevant to getting a single page crawled, rendered, and indexed faster.
But what they can and do seem to do, looking at logs, is use HTTP/2 to grab a few, possibly even unrelated, URLs that are in the queue (a page, a resource needed elsewhere, etc.) with one connection.
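Something like this shape, using httpx as a stand-in client (placeholder URLs; Googlebot's actual fetcher is obviously not httpx):

```python
import httpx  # pip install "httpx[http2]"

# A page plus resources that just happen to be next to each other in the queue.
queued = [
    "https://example.com/page-a",
    "https://example.com/style.css",
    "https://example.com/page-b",
]

with httpx.Client(http2=True) as client:  # requests share one HTTP/2 connection
    for url in queued:
        resp = client.get(url)
        print(url, resp.http_version, resp.status_code)
```

The win there is fewer connections to your server for the same batch of fetches, not a faster individual page.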
But HTTP/2 does have processing overheads, both for the site being accessed and for the client fetching, so there's a balancing act as to which is actually the most efficient for a given site. And for something that doesn't need to be crawled very frequently, or is very cacheable, HTTP/1.1 might well remain the best method.
I've never seen any link between HTTP/2 and better ranking, and as CWV data doesn't come from Googlebot, there's no implication for how performant Google would see your site.
Caveat: this is just what I've picked up from talks / docs / observation over time.