by Brent Johnson
The United States of America today has become a society in which the general populace is taught to live in debt. You are told to buy what you cannot afford, and thereby to possess things that make your life seem better. You do not need any money to get what you want, because banks will loan you whatever you need. American culture has become one in which living in debt is expected and borrowing money is the norm.