American women added a touch of glamour to the grim task of arming their country for war.
During World War II, women earned new respect as they proved they could work outside the home. They took on jobs traditionally held by men, fought for equal pay, gained confidence in their own abilities, and made a lasting impact on the United States workforce.
Images courtesy of the Library of Congress