I was taught in school to treat people equally and not to exclude one side over the other. I don't like the way men are being told what to do while women complain about not being given enough representation. I genuinely feel the left is stoking a culture war, and that makes me lose respect for them. I also feel the socialist agenda is populist, a big fantasy built around Denmark, but what is really going on in Denmark? What is the left really after there? The right, meanwhile, just makes people fearful about issues. All of it makes me scared to get involved in politics at all.