Seconding (thirding?!) the above. Google's robots.txt matching is *specific* - those directives will only block the matching parameter URLs from getting crawled (not indexed), whatever order the params are in.
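If you want to sanity-check what a directive actually matches, you can test it locally. A minimal sketch using Python's stdlib robotparser - the Disallow rule and URLs are made-up examples, and note that `urllib.robotparser` only does literal prefix matching (no Google-style `*`/`$` wildcards), so for full fidelity use Google's open-source robots.txt parser or Search Console:

```python
# Hypothetical rule + URLs, just to show how literal matching behaves.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /page?sessionid=
""".splitlines())

for url in [
    "https://example.com/page?sessionid=123",            # prefix match: blocked
    "https://example.com/page?color=red&sessionid=123",  # param in a different position: allowed
]:
    print("can fetch:", rp.can_fetch("Googlebot", url), url)
```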
I mentioned this somewhere else, I think, but I've seen it when sites do a cookie / bot / login redirect. The actual / final URL doesn't have the parameters, but one of the intermediate URLs does - and then these robots.txt directives block the whole set, because if any URL within a redirect chain is blocked, the whole chain is blocked. Search Console's URL Inspection focuses on the final URL. I *think* you can get around that by looking at the rendered HTML / console output there. You can also try it in an incognito window with the Googlebot user-agent - see the sketch below.
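For the redirect-chain case, a quick local check is to walk the chain yourself and test each hop. A rough sketch, not Google's actual logic: the start URL is a hypothetical placeholder, `requests` is a third-party assumption, and the stdlib robotparser won't evaluate wildcard rules the way Google does, so treat the output as an approximation:

```python
# Fetch a URL with a Googlebot user-agent, then check every hop in the
# redirect chain against that host's robots.txt.
import urllib.robotparser
from urllib.parse import urlsplit, urlunsplit

import requests  # third-party: pip install requests

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def robots_url(url):
    """robots.txt location for the host serving `url`."""
    s = urlsplit(url)
    return urlunsplit((s.scheme, s.netloc, "/robots.txt", "", ""))

def check_chain(start_url):
    # Follow redirects; resp.history holds the intermediate hops,
    # resp.url is the final URL.
    resp = requests.get(start_url,
                        headers={"User-Agent": GOOGLEBOT_UA},
                        allow_redirects=True, timeout=10)
    chain = [r.url for r in resp.history] + [resp.url]

    parsers = {}  # cache one parser per host's robots.txt
    for hop in chain:
        robots = robots_url(hop)
        if robots not in parsers:
            rp = urllib.robotparser.RobotFileParser(robots)
            rp.read()
            parsers[robots] = rp
        # Caveat: stdlib robotparser does plain prefix matching only.
        ok = parsers[robots].can_fetch("Googlebot", hop)
        print("ALLOWED" if ok else "BLOCKED", hop)

check_chain("https://example.com/login?next=/page")  # hypothetical URL
```

If any hop prints BLOCKED, that intermediate URL is your candidate for why the final URL never gets crawled, even though the final URL itself looks fine in URL Inspection.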