The BEST seasons of The West

Every season of The West, ranked from best to worst by thousands of votes from fans of the show.

This documentary covers the history of the American West, from the Native American tribes through their encounters with Europeans, and how the Europeans conquered them and settled the land. In telling this story, the film takes into account the viewpoints of Native Americans and other minorities to balance the white population's history.

Last Updated: 3/25/2024 · Network: PBS · Status: Ended