I have been asking myself this question over and over again: why would our slave masters, who enslaved us, tortured us, raped us, bred us, and branded us like animals, turn around and give us a religion to empower us?
My personal understanding is that our slave masters didn’t give us Christianity to save our souls, nor so that we could join them in heaven in the afterlife.
We know this to be indisputable because they believed that black people had no souls and that heaven was for whites only. The truth is that the one and only reason they made African slaves into Christians was to indoctrinate the concept of a white God into our minds. This made us more obedient to our white slave masters.
The views expressed above are my own, and opinions, as they say, are like noses: everyone has one. So let us know what you think.