The zombie series "The Walking Dead" is on tonight.
I've heard it said that Democrats feel it's necessary to impose laws because individuals are basically dangerous - evil, even.
Republicans, on the other hand, believe that individuals are basically good, so they want to keep laws to a minimum.
In "The Walking Dead," the remaining humans are just as dangerous to each other as the zombies.
So, does that mean that "The Walking Dead" is a Democratic show? With a Democratic world view?
Just asking.