You could try having the HDMI input selected on the receiver before plugging it in, in case there's some EDID fuckery going on and the receiver passes the TV's EDID through that way. There might also be some useful info in dmesg showing what the computer doesn't like about the display. Computers read the EDID and try to be smart about outputting a supported resolution, whereas a lot of AV equipment just blasts out whatever it wants without checking.
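If you're on Linux, you can also check what the kernel thinks is attached by walking the DRM connectors in sysfs. A minimal sketch (connector names like card0-HDMI-A-1 vary per machine, and the /tmp paths are just for illustration):

```python
#!/usr/bin/env python3
"""List DRM connectors and dump the EDID of anything connected."""
from pathlib import Path

for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
    # "connected" / "disconnected" per connector, if the driver exposes it
    status_file = conn / "status"
    status = status_file.read_text().strip() if status_file.exists() else "?"
    edid_file = conn / "edid"
    edid = edid_file.read_bytes() if edid_file.exists() else b""
    print(f"{conn.name}: {status}, EDID {len(edid)} bytes")
    if edid:
        # Save a copy so you can inspect it or reuse it as an override later
        out = Path(f"/tmp/{conn.name}.bin")
        out.write_bytes(edid)
        print(f"  dumped to {out}")
```

You can feed the dumped .bin to edid-decode to see exactly which modes the receiver is advertising versus what the TV advertises when plugged in directly.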
If it is an EDID problem, there are ways to override it per port: dump your TV's EDID while it's plugged in directly, then force the kernel to use that one even when you're going through the receiver. Or you can use a generic 1080p/4K one.
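On Linux the stock mechanism for that override is the drm.edid_firmware kernel parameter, which loads an EDID blob from /lib/firmware/edid/ in place of whatever the port reports. A sketch assuming your connector is HDMI-A-1 and you're reusing the dump from above (both names are placeholders, adjust to your setup):

```python
#!/usr/bin/env python3
"""Stage a dumped EDID as a firmware override (run as root)."""
import shutil
from pathlib import Path

# EDID dumped earlier while plugged straight into the TV (hypothetical path)
SRC = Path("/tmp/card0-HDMI-A-1.bin")
# The kernel looks these up relative to /lib/firmware
DEST = Path("/lib/firmware/edid/tv.bin")

DEST.parent.mkdir(parents=True, exist_ok=True)
shutil.copy(SRC, DEST)

# Add the parameter to your kernel command line (e.g. GRUB_CMDLINE_LINUX)
# and regenerate your bootloader config for it to take effect at boot.
print("Add to your kernel command line:")
print("  drm.edid_firmware=HDMI-A-1:edid/tv.bin")
```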
Failing that, if your receiver and TV both support eARC, plugging the computer directly into the TV and having the TV feed audio back to the receiver is an option.