The Big Questions About Christianity
Has Christianity oppressed women?
The short answer to this question, if you look back through history and even today in some ways, is yes. But the longer answer, which is the more thoughtful and accurate answer, is that it’s not so simple, and Christianity has actually done far more good for women than harm.
Let’s look at how Christianity started. Of course, Christianity started with Jesus. Jesus was unlike most men in the world at that time. He showed extraordinary kindness and care towards women, including women on the margins of society.
Jesus welcomed women into his community of disciples (Luke 8:1-3), right alongside the men. One of the important reasons Christianity grew so rapidly is that it was very attractive to women. Christians didn’t make a big issue of this, but even the opponents of the Christian faith saw this very clearly.
In the second century, the philosopher Celsus disparaged Christianity as a faith that attracted only foolish and lowly people of little understanding, such as women. That shows you something of the attitudes of the time.
The reason Christianity attracted so many women is that it did just what Jesus had done. It elevated women above the stifling patriarchy of the ancient world, in which women were regarded as little more than household possessions. Christians stood out in contrast to the attitudes of their societies because they emphasized care for orphans, widows, children, the sick, the poor, and the imprisoned.
The Christian movement put compassion and charity towards others at its heart. Because women were so often victims, they found this compassion and charity attractive and liberating. It’s no surprise, then, that in the earliest Christian churches, women exercised every function and held every position of authority.
However, in the second and third centuries, things started to change. As the church grew larger, it needed more organization. The more structure that came into the church, the more men took over, and the more women were pushed out of the freedom and roles they had enjoyed before.
Despite this, the essential message of Christianity is still the same today, and its power to elevate and liberate women remains. The Christian church has kept this message alive throughout the centuries. That’s why, despite its mistakes, the message of Christianity has been responsible for some of the greatest changes in our world, changes that have continued to progressively elevate women: universal education for both men and women, the development of hospitals and health care systems, equality for women at the ballot box, and the very idea of human rights. These were all essentially Christian initiatives based on the message of Jesus.
So overall, I reckon Christianity has been instrumental in freeing women in the world rather than oppressing them.
– Eliezer Gonzalez
About this Plan
This reading plan will address some of the biggest questions about the Christian faith. The answers here will give you a deeper understanding of why Christianity isn’t just like any other religion. They will help you engage with your society in an informed way and deepen your appreciation of what it means to be part of God’s kingdom.