Review your favorite medical TV show.
Was it medically accurate? Was it anatomically accurate (that is, do residents actually look like that)?
Mon Sep 15, 2014 4:20 am
Guys, do you think that TV shows about health and medicine are important for educating people about their health?
Thu Sep 18, 2014 8:56 am
Good question. I think it depends on the type of show and people's personality types.
As for the shows, there are fictional and non-fictional types of health shows. I think the fictional ones are the least likely to be accurate about health issues and medical topics. Non-fictional shows might be honest about everything they share, but they might be presenting opinions that aren't necessarily proven facts, and they could be leaving out other truths.
As for people's personalities, individuals need to recognize whether they're gullible or overly critical. If they believe everything they hear and see, then they shouldn't be basing their health philosophies on any type of show. On the other hand, they could be so critical that they don't believe anything said on fictional or non-fictional shows.
I think it comes down to a good balance: trusting information a show presents when it matches what you've already heard elsewhere, and doing your own research whenever a TV show teaches you something new.
What types of shows, specifically, are you referencing? Talk shows, fictional TV shows, reenactment shows, etc.?
Tue Sep 23, 2014 3:39 am
It's a very good question. People of every age group watch TV, so through TV shows we can easily deliver a message to a wide audience, and if a show is related to health, it can genuinely help people take better care of themselves. :)
Wed Apr 19, 2017 5:42 am
While I don't believe that medical TV shows can teach you medicine, they help build awareness of different types of diseases. Plus, I like that you can hear different success stories that motivate people. After all, TV is aimed at entertaining, so it's a good combination, especially for those who are just setting foot in the medical field.