The WORST seasons of The West
Every season of The West ever, ranked from worst to best by thousands of votes from fans of the show. The worst seasons of The West!
This documentary covers the history of the American West, from the Native American tribes to their encounter with Europeans and how the Europeans conquered them and settled the land. In telling this story, the film takes into account the viewpoints of Native Americans and other minorities to balance the white population's history.