In everyday conversations, Americans often seem to equate health care workers with doctors. Similarly, although many sociologists have researched doctors, few have researched nurses. Yet nurses form the true backbone of the health care system, and hospital patients quickly learn that it is nurses who make the experience miserable or bearable and whose presence or absence often matters most. The history of nursing demonstrates how the drive toward professional status, or professionalization, can be especially difficult for a “female” occupation.
The Rise of Nursing

Before the 20th century, most people believed that caring came naturally to women and therefore families could always call on any female relative to care for any sick family member. Hospitals, meanwhile, relied for custodial nursing care on the involuntary labor of lower-class women who were either recovering hospital patients or inmates of public almshouses. These beginnings in home and hospital created the central dilemma of nursing: Nursing was considered a natural extension of women's character and duty rather than an occupation meriting either respect or rights. Nevertheless, increasingly during the 19th century, unmarried and widowed women sought paid work as nurses in both homes and hospitals. Few of them, however, had any training.