There has been an interesting shift in the world of monsters over the last hundred years. The creatures that started out as nightmares are becoming more human. They brood and struggle with morality. Sometimes they even fall in love with a mortal. For anyone who has enjoyed the surge in paranormal romances, this isn't a surprise. The question I'm interested in is: Why?
I'm not the first person (or the most educated on the topic, for that matter) to suggest that the correlation between the surge in werewolf tales in the fifteenth century and the rise in rabies is no coincidence. Humans have a history of inventing powerful monsters to help explain things outside our control or understanding. A similar fear of vampires hit the US in the nineteenth century alongside a virulent tuberculosis outbreak. Monsters reflect our fears.
At some point, these monsters became less fearsome. Is it possible that our growing knowledge of biology and epidemiology has taken the fear of magic and the supernatural out of the equation? Perhaps understanding how viruses are transmitted, along with less exposure to wild animals, has made the fear of these monsters less real. Of course, zombies are still portrayed as frightening (minus Liv from iZombie), but maybe that's because we still have viruses that terrify us and can't be cured.
There are fewer facts supporting my final theory, but my gut tells me I'm onto something. As authors, we are constantly pushed to reimagine things in a different light. People get bored with the same formulas and creatures. The perspective of a monster continues to be interesting, and a one-dimensional bad guy is no longer acceptable. I know I prefer the monster who isn't purely evil.
The literary evolution of any creature isn't due to a single force; these changes are multifaceted. Obviously, I don't have a definitive answer. Do you have a theory? Why do you think monsters have become more human?