Every day I see shoppers trash the local Walmart and Target. They throw merchandise everywhere, open sealed packages, eat food they haven't paid for, hide items around the store, and take food out of the freezer and leave it on a shelf to spoil. My question to the public is: "Who taught Americans to shop in such a disgraceful, disrespectful, and destructive manner?"
When did families stop teaching children to put things back where they belong and to leave things as they found them? What happened that America now shops the way it does? Or am I wrong, and Americans have always shopped this way and I'm only noticing it now?