These Western Films Highlight Black Cowboys and How They Tamed the Wild West
Black westerns, and westerns featuring Black actors in general, are an important part of American cinema. They challenge the traditional, often whitewashed view of the Wild West, highlighting the underrepresented role Black cowboys played in taming the …