The number one skill needed in America to keep our economy growing isn't taught in schools. Why not? What is our educational system trying to sell us? Do they not want us to know? If selling is one of the highest-paid professions and it isn't taught in schools, is it a secret?
I guess to really understand this principle we must take a good look at what sales really is.
We are all salesmen from the day we are born, though most of us don't realize it. Selling is a naturally occurring condition. Let me explain!
Anytime you have an interest in the outcome of a situation, you are selling. That's pretty much everything, right?
From the time you were a kid and all through life, someone has been trying, or will try, to persuade, convince, or influence you to take action. Sometimes that convincing, persuading, and influencing happens without our willingness to buy into what is being offered.
How many times did we make the wrong decision, not realizing that someone was trying to sell us something?
Your parents, friends, spouse, teachers, relatives, co-workers, bosses, politicians, doctors, TV commercials, and attorneys are all examples of people who sell.
Then, of course, there are the people who are in the profession of selling.
Once you become consciously aware of this, it becomes pretty obvious that at every moment of our lives, someone is trying to sell us something.
Just imagine if we were all taught this skill, how to sell ourselves through life, and what a difference it would make to all of us. The unemployment numbers would go to zero, because each of us would be able to find the situation where we could create value for another human being and close that sale!
After 35-plus years in the business of selling, I have come to the conclusion that your success in life is determined by your ability to sell yourself. Fact!
It's your insurance policy for future success!
Question: Are you buying what they are selling in life? Are they buying you?