WWII went a long way toward ending the Depression and ushered in an era of prosperity, including a great expansion of home ownership and the growth of suburbs. Socially, it marked the beginning of increased freedom and opportunity for women, Black Americans, and poor men. Women who had worked in wartime factories gained greater confidence and independence, while Black Americans pointed to the hypocrisy of the US opposing Nazi racist ideology abroad while tolerating racism at home. In addition, the GI Bill opened up far greater opportunities and upward mobility for men of all socioeconomic backgrounds, as higher education was no longer reserved for an upper-class elite.
- Hope I helped! <3