One of my writing buddies, Tiffany Willis at AddictingInfo.org, wrote the article “10 Facts That Prove America Is Becoming More Liberal (VIDEO),” which put a very wide smile on my face this morning. I have been saying for some time now that America is not a center-right country but a center-left one. When you ask people what policies they like, their answers invariably place them in the center-left to liberal spectrum; yet when you ask them if they are liberal, many say no, they are moderate, centrist, or conservative.
The problem has been that the Right has successfully rebranded “liberal” to mean something bad. Liberal has never been a party. There have always been liberal Democrats and liberal Republicans.
It is important to note that all the good policies that gave many people rights and dignity, and that moved this nation forward for several decades, were liberal policies. By definition, that is what liberalism does. Conservatism works in areas where change is no longer needed, and that base then becomes the status quo. We should not forget that the abolition of slavery, women's suffrage, voting rights, the War on Poverty, Social Security, Medicare, Medicaid, and many other policies were vehemently opposed by conservatives but enacted by liberals of both parties.
As this political season matures and the media continues to distract with false narratives, it is important to stay grounded in what you believe, not in what others project that you should believe. One must allow one's real values, one's moral and humane values, to filter out false narratives that would otherwise lead to terrible life decisions and country-changing political mistakes.
Tiffany's article reviews numbers from scientific and non-scientific polls that show not only today's reality but also the trend, driven by where the liberal leanings are coming from: the youth and other growing demographics. Give the article a read.