I have lived in the South all my life, and these things have been happening to Blacks with very little fanfare or acknowledgement for over a century. Black folks left the South for greener pastures (prosperity and education), fleeing to other places around the country where the racism had simply evolved into other forms. More palatable to our ancestors, but racism nonetheless.

Over the last few decades, with the browning of America, the decline in White prosperity, and Black and Brown folks looking more like equals to privileged Whites (i.e., a Black man in their White House), White Southern men blew their dog whistle to the rest of White America to check Black and Brown folks in other areas of the nation the way they’ve done in the South since slavery. It’s only alarming to those who have lived outside the South since Reconstruction. Black lives being considered subhuman has always been “the” trend for Southern Blacks, especially women and children. It’s just another day in the life for us.

I remember when my cousins and family outside of the South used to talk about us because we stayed here. They bragged about how prosperous they were, going to “good schools” and having “good” jobs. They despised us and simply couldn’t deal with the overt racism Southern Blacks experienced. At least we knew our devils. They didn’t try to hide. Now the dog whistle of old has been blown, and White folks of all professions, all over the country, are doing what they’ve done since they landed on this continent.
The trend was always in style…for us in the South. I’m glad the rest of Black America finally recognizes our struggles, but I’m saddened that it took so long for them to be seen.