
Science without a purpose

I am being forced to attend a two-day Professional Ethics Training/Responsible Conduct of Research seminar, offered through the Graduate College at the University of Oklahoma. A mandate from the National Science Foundation, passed down the chain of command to the lowly graduate students: if we (the students) are to be paid through an NSF research grant, we must take this seminar. No seminar, no stipend. With a plethora of choices before me, I opted to get paid, err, attend the seminar.

Course Objective:

The seminar’s purpose is to make money, err, give the attendees the tools to make ethical decisions. Aside from the fact that a “good ethical decision” is as subjective as the appreciation for art, I see no point in attending a seminar where the general response to any scenario is “well… that depends on the situation” or “how do I define (such and such)?”. It seems the only take-home message I will get from this impediment to my weekend (which could be better spent preparing for the upcoming SCEC meeting!) is that Social Science allows you to dodge commitment to an answer – even when the evidence is against you.

Case in point? Let’s look at the data…

Data collected by Soft scientists:
[livevideo id=7CB9CBD939C648519A52B18224199D3F/90989/milgram-experiment.aspx]

During the seminar, we are exposed to the fact that we may not always make “good” ethical decisions – a notion repeated several times throughout the morning. The compelling evidence can be seen in the Milgram experiments of the 1960s (video above). Given a choice between “being ethical” and obeying authority, roughly 65% of the people who participated in the Milgram experiments obeyed the authority and administered what they believed were high-voltage electric shocks to another human being. These results are not a happenstance of the 1960s: the Milgram experiments were replicated in 2002, with similar results.

Another statistical study was presented during the brainwash seminar. It involves a Predictor and an Experiencer. The Predictor is given a scenario and predicts how he or she would react in that situation, while the Experiencer is actually placed in the scenario and his or her reaction is recorded. The scenario gives a person the authority to decide the fate of the participants involved either (a) selfishly or (b) unselfishly, or to (c) be fair and flip a coin. The results are as follows:

Experiencer vs Predictor

The data show that most people have good intentions (figure above, Predictor: Self) of being fair, while predicting that others in a similar scenario would not be so virtuous. The reality is that when placed in the actual scenario, most people acted more selfishly (which is interpreted as “unethical”).

Interpretation by a Hard scientist:

The scenario is the following… We are required to attend a Professional Ethics Training/Responsible Conduct of Research seminar to learn strategies for ethical decision making. Fine. We learn them and go back to the lab, where we are under the supervision of the advisor… The advisor – an authority figure who is not required to take any such seminar on ethics and professional conduct. If it so happens that we have a crooked advisor, the Milgram experiments show that all training and conscience fly out the window in such a situation!

The response from the seminar coordinator: “Well, someday you will be the authority and in charge of students. This training will give you the tools to make the right ethical decisions.”

Fine. But… the data also show that it may all be fine and dandy now. We can predict what “ought to be” done in a given situation… now. But when placed in the actual situation, we crumble and make the selfish/unethical choice!


How ethical is it for me to waste my time sitting through a seminar that has absolutely no application when placed into practice?