The media, and Hollywood in particular, represent one avenue in which the general public becomes familiar with the role of nurses. How does the media positively or negatively influence the public’s image of nursing? What other avenues may better educate the general public on the role and scope of nursing as well as the changing health care system?
This question had me baffled; I had never considered that the media and Hollywood directly influence nursing. After researching this phenomenon, it poses the question: are nurses the respected professionals I believed them to be? There are several examples of nurses in Hollywood that definitely don't depict nurses as hardworking professionals who have an impact on the lives of others every single day. I guess it's more appealing for your nurse to be a sexy vixen like "Hot Lips" Houlihan from the 1970s TV show M*A*S*H.
The bottom line is that writers of TV shows and movies have felt the need to take artistic license to create compelling, provocative stories filled with conflict and sexual scenes meant to entertain a mass audience. Nevertheless, fictional nurse characters tend to remain mostly insulting and unrealistic, marginalizing the nursing profession. In reality, nurses are underappreciated, working long hours in stressful situations and simply not receiving the respect they deserve. Unfortunately, the daily nursing grind without sex, drugs, and drama wouldn't make for good TV. Media-driven messages are powerful, influencing the cultural and collective mindset, and it's up to individuals to distinguish reality from fiction. In a study of primary and secondary school students, most participants mistakenly described nursing as a girl's job, a technical job "like shop," and an inappropriate career for private school students (JWT Communications, Memo to Nurses for a Healthier Tomorrow, focus group studies of 1,800 school children in 10 U.S. cities, 2000).