Just for giggles, I went to amazon.com and searched for "self-help books". I just wanted to see what would come up. Two of the top titles were:
1. "Self-Defeating Behaviors: Free Yourself From the Habits, Compulsions, Feelings and Attitudes That Hold You Back"
2. "When Am I Going to be Happy? How to Break the Emotional Bad Habits That Make You Miserable"
It seems like everywhere you turn, a new and popular self-help book is entering the market. Everyone wants to be the next Dr. Phil, making tons of money and maybe helping some people in the process. I wonder how individuals come to the point where they think they have something to say that will make other people's lives better. Is it some sort of divine calling, where the writer feels a burden to get the message out? Is it self-confidence, maybe even a bit of arrogance, where a writer thinks, "I've been through this, I've got it figured out now, everyone needs to know!"
Just wondering. And also, why is it even called self-help? The author is the one telling you what to do; you didn't come up with it yourself. I guess the reader has to make the individual decisions to stop letting emotions control them, or cancel bad thoughts, or whatever these books say, but the author is the one trying to help. So really, they should call it outside-help. But something tells me that wouldn't sell.