Description
"Rights" is a funny word, isn't it? In your context it would be a noun, meaning it is something that God or government grants for people so they can live and have a core foundation of law. It changed meaning through time and it no longer has the same meaning as it did when the Bill of Rights was first written. I think most Americans believe rights are inherently assured and granted. I believe whatever authority grants rights has the authority to deni them as well. Whenever we start compartmentalizing "rights" what always happens is a protection, freedom or privilege afforded for the compartment comes at a cost of marginalizing some other compartment. This is true for women's rights, gay rights, African American rights, so on. Rights change as something for people to something you're born into or a belief you hold. Justice is no longer equal or fair. I would have greater hope for us as a people if more believe the word "rights" does not need a modifier.