My question is: Does human nature ever actively contradict evolutionary principles?
This is difficult to answer, and it requires making clear one's definition of 'evolutionary principle.' If one means the principles which strictly promote passing on one's genes to the next generation, then human nature absolutely can contradict them: some humans choose not to reproduce, some humans have sexual orientations which (assuming they act only upon those orientations) do not lead to reproduction, and some humans choose to engage in activities which clearly threaten their lives, thereby jeopardising their chances of passing on their genes. If one means the principles which promote the survival of the species as a whole, I think that human nature can contradict them as well - oddly enough, through the very aspects of human nature which promote the survival of the individual at the expense of the species. Overall, human nature is simply too variable to align itself with any one particular goal.
Sunday, May 6, 2012
Q&A 11, First Answer
The basic form of my question is: What is the relation between evolution and emotions?
Emotions, like every other aspect of human nature, are determined by the process of natural selection. However, like many other aspects, they are not always purely practical. Emotions can have other effects than those which caused them to be selected into human nature, and I think it is often these effects which can help in the development of a moral code. For example, the evolutionary use of empathy might be to promote group ties which contribute to the overall well-being of a species. However, empathy can also extend to those not in one's immediate group, and perhaps even those not of one's own species. This could lead to a general 'golden rule' type morality, which later on might develop into a more complex code of ethics, promoting tolerance and kindness to others, support of equality, or even vegetarianism.
Technology as a Scapegoat
I have heard many people claim that technology is responsible for some significant aspect of the world's current problems. However, I think that in most cases the problem lies not in the technology itself, but in people's use of it. The television, for example, is not an inherently evil object; one can use it to convey important information (as on some science shows) or to provide quality entertainment (as in the case of films which are undeniably works of art). However, people both misuse and overuse it, spending inordinate amounts of time fixated on mind-numbing reality shows and other content-lacking media. Yet many people blame the television itself for this problem. Similarly, many technologies which humans have used to cause death, suffering, or other harm have other, perfectly legitimate uses which could greatly benefit many people. Again, focusing on the technology is ineffective; if we eliminate the technology that enables us to do some immoral or problematic thing, we will simply find another way to do that very same thing. The problem is with people, not with technology - but owning up to being the cause of serious problems is difficult for many people, so rather than doing so, they find a convenient scapegoat.