
If you make hiring or admissions decisions, read this

Filed in Ideas, Research News
Subscribe to Decision Science News by Email (one email per week, easy unsubscribe)

MECHANICAL VERSUS CLINICAL DATA COMBINATION IN SELECTION AND ADMISSIONS DECISIONS: A META-ANALYSIS


The pink plastic alligator at Erasmus University Rotterdam says “Interview-based impressions belong in the trash can behind me.”

Is there something you’ve learned in your job that you wish you could tell everyone? We have something that has been well known to decision-making researchers for decades, yet is all but unknown in the outside world.

Here’s the deal. When hiring or making admissions decisions, impressions of a person from an interview are close to worthless. Hire on the most objective data you have. Even when people try to combine their impressions with data, they make worse decisions than they would by simply following the data.

Don’t be swayed by an interview. It’s not fair to the other candidates who are better on paper. They will most likely be better in practice, too.
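To make “mechanical” combination concrete, here is a minimal sketch in Python: candidates are ranked by a fixed, pre-specified weighted sum of objective predictors, with no interview impressions in the mix. The candidates, predictors, and weights below are hypothetical illustrations, not values taken from any of the papers cited here.

```python
# A minimal sketch of "mechanical" data combination: every candidate is scored
# by the same pre-specified rule (a weighted sum of standardized predictors)
# rather than by holistic impressions. All data and weights are hypothetical.

from statistics import mean, pstdev

# Hypothetical applicant data: objective predictors only, no interview impressions.
candidates = {
    "A": {"gpa": 3.4, "test_score": 620, "years_experience": 2},
    "B": {"gpa": 3.9, "test_score": 700, "years_experience": 1},
    "C": {"gpa": 3.1, "test_score": 680, "years_experience": 4},
}

# Weights fixed in advance, before looking at any individual applicant.
weights = {"gpa": 0.4, "test_score": 0.4, "years_experience": 0.2}

def standardize(values):
    """Convert raw values to z-scores so predictors are on a common scale."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma if sigma else 0.0 for v in values]

# Standardize each predictor across the candidate pool.
names = list(candidates)
z = {}
for predictor in weights:
    raw = [candidates[n][predictor] for n in names]
    z[predictor] = dict(zip(names, standardize(raw)))

# Mechanical score: the identical weighted sum applied to every candidate.
scores = {n: sum(weights[p] * z[p][n] for p in weights) for n in names}

for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:+.2f}")
```

The point of the sketch is not the particular weights but the discipline: the rule is set before any candidate is seen, and no one adjusts the ranking afterward based on how an interview “felt.”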

Please see:

* This paper by Kuncel, Klieger, Connelly, and Ones: Mechanical versus clinical data combination in selection and admissions decisions: A meta-analysis.

ABSTRACT
In employee selection and academic admission decisions, holistic (clinical) data combination methods continue to be relied upon and preferred by practitioners in our field. This meta-analysis examined and compared the relative predictive power of mechanical methods versus holistic methods in predicting multiple work (advancement, supervisory ratings of performance, and training performance) and academic (grade point average) criteria. There was consistent and substantial loss of validity when data were combined holistically—even by experts who are knowledgeable about the jobs and organizations in question—across multiple criteria in work and academic settings. In predicting job performance, the difference between the validity of mechanical and holistic data combination methods translated into an improvement in prediction of more than 50%. Implications for evidence-based practice are discussed.

REFERENCE
Kuncel, N. R., Klieger, D. M., Connelly, B. S., & Ones, D. S. (2013). Mechanical versus clinical data combination in selection and admissions decisions: A meta-analysis. Journal of Applied Psychology, 98(6), 1060-1072.

* This paper by Highhouse: Stubborn reliance on intuition and subjectivity in employee selection.

ABSTRACT
The focus of this article is on implicit beliefs that inhibit adoption of selection decision aids (e.g., paper-and-pencil tests, structured interviews, mechanical combination of predictors). Understanding these beliefs is just as important as understanding organizational constraints to the adoption of selection technologies and may be more useful for informing the design of successful interventions. One of these is the implicit belief that it is theoretically possible to achieve near-perfect precision in predicting performance on the job. That is, people have an inherent resistance to analytical approaches to selection because they fail to view selection as probabilistic and subject to error. Another is the implicit belief that prediction of human behavior is improved through experience. This myth of expertise results in an over-reliance on intuition and a reluctance to undermine one’s own credibility by using a selection decision aid.

REFERENCE
Highhouse, S. (2008). Stubborn reliance on intuition and subjectivity in employee selection. Industrial and Organizational Psychology, 1(3), 333-342.

* This paper by Highhouse and Kostek: Holistic assessment for selection and placement.

ABSTRACT
Holism in assessment is a school of thought or belief system rather than a specific technique. It is based on the notion that assessment of future success requires taking into account the whole person. In its strongest form, individual test scores or measurement ratings are subordinate to expert diagnoses. Traditional standardized tests are seen as providing only limited snapshots of a person, and expert intuition is viewed as the only way to understand how attributes interact to create a complex whole. Expert intuition is used not only to gather information but also to properly execute data combination. Under the holism school, an expert combination of cues qualifies as a method or process of measurement. The holistic assessor views the assessment of personality and ability as an idiographic enterprise, wherein the uniqueness of the individual is emphasized and nomothetic generalizations are downplayed (see Allport, 1962). This belief system has been widely adopted in college admissions and is implicitly held by employers who rely exclusively on traditional employment interviews to make hiring decisions. Milder forms of holistic belief systems are also held by a sizable minority of organizational psychologists, particularly those who conduct managerial, executive, or special-operation assessments. In this chapter, the roots of holistic assessment for selection and placement decisions are reviewed and the applications of holistic assessment in college admissions and employee selection are discussed. Evidence and controversy surrounding holistic practices are examined, and the assumptions of the holistic school are evaluated. The chapter concludes that the use of more standardized procedures over less standardized ones invariably enhances the scientific integrity of the assessment process.

REFERENCE
Highhouse, S., & Kostek, J. A. (2013). Holistic assessment for selection and placement. In APA handbook of testing and assessment in psychology, Vol. 1: Test theory and testing and assessment in industrial and organizational psychology. http://psycnet.apa.org/index.cfm?fa=search.displayRecord&UID=2012-22485-031

Feel free to post other references in the comments.

1 Comment

  1. Sam Swift says:

    Being systematic in the evaluation of an applicant’s performance cues may not be enough if there is substantial variance in the situational factors contributing to those cues. Don’t forget to include measures of the situation in your mechanical selection models.

    Swift SA, Moore DA, Sharek ZS, Gino F (2013) Inflated Applicants: Attribution Errors in Performance Evaluation by Professionals.
    http://dx.plos.org/10.1371/journal.pone.0069258

    May 28, 2014 @ 6:53 pm
