Why does it seem like everything in this world comes down to sex or money? Seriously. It's all about sex or money. I'm watching the news, and three stories in a row are...
The mistresses of Tiger Woods; did Matt Lauer cheat on his wife; and Jon and Kate: can Jon ever start earning as much as Kate?
These stories are NEWS?
First off, why do people defend Tiger? Why do they say to just leave him alone so he can get on with his life? What about his wife? How does she get on with HER life after the shame and utter humiliation SHE has to suffer, hearing and seeing this played out over and over on the news every day?
And what about the All-American family throwing away their relationship for money and fame? Oh, come on, people, that's EXACTLY what Jon and Kate did. I mean, look at Kate; she doesn't even look or act the same as she did three years ago.
Now, don't get me wrong, they were offered a lot of money to do that show.
And Kate looks HOT! But at what price in the end? When the kids get older and watch the shows, they'll probably be like... that was my Mom and Dad? WTF?
I don't know. I personally love sex and money, and most of us do. But it seems like the morals of this country have gone down the toilet because people don't know how to conduct themselves.
Have you watched the news lately? We are very quickly becoming a society that has no morals. The stories on the news are horrific, yet they don't shock us anymore, because they're EVERYDAY news.
War, people killing each other for the dumbest reasons... it's like it's no big deal anymore, and that's sad.
The stories on America's Most Wanted or CNN are so commonplace, it's like life is turning into one big video game for some people.
I guess I'm going to stop watching TV.
It just seems like almost every reason things go wrong involves money or sex...