So I know a lot of people LOVE the trailers that play before a movie, but I HATE most of them. They just give away too much of the movie! And they just don't have to!
Personally, I try my best to watch as few trailers as possible. I don't like spoilers. In fact, I HATE spoilers. People who spoil movies or series are just awful and should probably be removed from society (or at the very least prevented from posting on social media). But the worst culprits are movie trailers! Why, movie studios?! Why do you do it?!
I first realized how terrible trailers are when I watched The Sixth Sense in the theater. As I was watching, the movie wanted the audience to wonder what was going on with the kid, but of course I already knew... he sees dead people. Cuz every trailer made that obvious. But why? It could have been so cool to just show weird stuff happening. Luckily, there's still more to the movie, but as I was watching it, I thought, "Wow! How much cooler would this movie have been if I hadn't known anything about it?"
The next movie that put me over the top, the one that made me never want to watch another trailer again, was Cast Away with Tom Hanks. The trailer gives everything away! You know he gets rescued! It's insane! How can they even get away with that?! There are still movies I've never seen because, thanks to the trailers, I don't have to! I still haven't seen Real Steel. I probably would have really liked it, but when they show everything in the trailer, I guess I don't need to pay to see it. If it comes on Netflix I might take the time to watch it, but otherwise, nope!
Of course, there comes a point where I have to know what movies are coming out at all, right? Without trailers, how can I decide which movies I want to see? So now I watch a trailer only until I know I'm going to see the movie. And for the movies I already know I'm gonna see, I avoid the trailers completely. And I live a much happier life! Were there any movies that the trailers spoiled for you?