The Camp is Gone!

Everything is a social satire, but with no bite...
In the past few years, every time I go to the theater to watch the newest acclaimed movie, I come away dissatisfied and disappointed.
This wasn't always the case. So I wonder - have I changed? Or have the movies?
My very positive reaction to anything that came out in the 2010s and before answers my question. It's definitely the movies.
I think I've narrowed it down to a few reasons:
Campiness is gone. Things are now perfect. Characters are perfect. If they are flawed, they are still perfect; we just don't know it yet. There's no vulnerability, no humility, no willingness to mess up. Any shred of earnestness is masked by layers of irony, sarcasm, and social satire.
The First Act is gone. We are dropped into the action right away. The inciting incident happens before I know who I'm rooting for.
Being preached at without any subtlety or nuance (in conjunction with the first couple of points) makes me avert my eyes from the message even when I fundamentally agree with it. The presentation and the messenger couldn't be more wrong.
The fact that Top Gun: Maverick is one of the frontrunners for my favorite movie of the 2020s (I lean anti-war) says two things - you can present any message well, and the other creators couldn't be doing a worse job.