KS (Elite Member):
No, what played a role there was Christians in power who interpreted Christianity in a certain way. If the religion of Christianity itself had played that role, then Christianity would have been banned from America entirely. It has not been.
Meaningless semantics.