When did the Germans realize that they were the “bad guys,” during or after World War II? I ask because I’ve managed, like a great many others as of late, to take a step back, unwrap the “Old Glory” cloak that shrouds our bodies (a flag whose meaning most have forgotten) and view our world — the world, not the American one.
Our government is on the verge of having the ability to unleash the military upon the citizenry through indefinite detention without charge. We are currently engaged in at least two illegal conflicts and occupations on the world stage. Corporations now have the same rights as you and I. Police brutality has become an accepted norm, and most importantly, we, the people of the United States of America… couldn’t give two shits.
In other words, have we become so apathetic, so indifferent, and so ignorant that we can no longer recognize right from wrong, good from evil, or our place in this world? Something is severely broken, and I can finally see it.