
Was it something else, buddy? A “something else” alternative does not increase accuracy of answers in forensic interviews with children

The addition of a “something else” alternative to forced-choice questions in forensic interviews with children does not increase the accuracy of answers and does not bypass the limitations of forced-choice questions. This is the bottom line of a recently published article in Psychology, Public Policy, and Law. Below is a summary of the research and findings as well as a translation of this research into practice.

Featured Article | Psychology, Public Policy, and Law | 2017, Vol. 23, No. 3, 281–289

Does It Help, Hurt, or Something Else? The Effect of a Something Else Response Alternative on Children’s Performance on Forced-Choice Questions

Authors

Kamala London, University of Toledo
Ashley K. Hall, Ohio Department of Education
Nicole E. Lytle, Montclair State University

Abstract

Forensic guidelines recommend minimizing forced-choice questions when interviewing children. We investigated whether adding a “something else” alternative to forced-choice questions affected 3- to 5-year-olds’ (N = 94) reports of an event involving innocuous touch. Following a 1-week delay, children were randomly assigned to receive either standard 2-alternative forced-choice questions or the same questions with an additional something else alternative. All children received 3 counterbalanced question types: correct alternative present, no correct alternative present, and unanswerable. Children’s overall accuracy was not affected by the something else alternative except on questions with no correct alternative present, where performance went from 15% to 31% accurate. Children selected or generated inaccurate and speculative responses to the majority of unanswerable questions regardless of a something else alternative. These findings suggest that the inclusion of a something else alternative does not bypass concerns about the use of forced-choice questions during interviews with children.

Keywords

forensic interviews, questioning techniques, children’s eyewitness testimony, forced choice questions, something else alternative

Summary of the Research

“Developmental psychologists have long expressed concern about the use of forced-choice questions during forensic interviews with children […] Although the strategy of using forced-choice questions increases the probability that children will provide information, it is problematic because children’s answers to these types of questions often are inaccurate […] Children have particular trouble when faced with forced-choice questions when neither response alternative is correct […] Children generally do not provide do not know responses” (p. 281)

“A novel modification to standard two-alternative forced-choice questions involves the addition of a third alternative, the option to select something else. The “something else” alternative has become ubiquitously used by child interviewers in the United States” (p. 281)

“The rationale for including the something else alternative is that interviewers usually are unaware of whether they are including a correct response option in their forced-choice questions. The something else alternative provides children with another option so they are not locked into selecting one of the two alternatives provided. […] A something else response is different than a do not know response in this regard. A do not know response implies the child does not have the requisite knowledge to answer the question. A something else reply, on the other hand, indicates the child has knowledge but that the answer is different than the two choices provided. […] the driving force for including a something else alternative in forensic interviews is that interviewers might ask children questions where the child does not have requisite knowledge to answer the question or where no correct answer choice is provided.” (p. 282)

“The goal of the current study was to examine whether the addition of a something else alternative affected children’s performance on forced-choice questions. Given the rationale for including the added alternative in forensic interviews, we were particularly interested in whether the something else alternative aided children in resisting false and unanswerable questions.” (p. 283)

The sample included 94 children (48 girls, 46 boys) between the ages of 3 and 5. The procedure was as follows:

“Children initially participated in a 20-min event that was part of a larger study of children’s use of forensic interview aids. In the event, a research assistant touched children on different “public” locations on the child’s body and then asked the child to show on either a doll or a human-figure drawing where they were touched. […] Children were individually interviewed about the touch event following a 1-week delay. Each child was asked 30 forced-choice questions and was randomly assigned to one of two question conditions. Half the children were randomly assigned to receive the standard two alternative version of the questions (standard condition). The other half of children received the same question with an added something else option (something else condition). […] All children received 10 each of three different forced-choice question types. […] If a child chose “something else” as their response, then the interviewer asked a follow up question. […] For all question types, children’s responses were coded for accuracy and also according to whether the child selected one of the provided alternatives or if they provided a self-generated response.” (p. 283–284)

“The major finding of the study is that children showed very high rates of incorrect responses on the false and the unanswerable questions regardless of the something else alternative. When given the something else alternative along with two other answer choices, children selected the something else alternative 35% of the time, at just over the chance rate. […] a something else prompt does not bypass concerns about children’s performance on forced-choice questions even when children are provided with an opportunity to follow up their something else reply with their own self-generated response. Although children made use of the something else alternative, the majority of their self-generated responses were inaccurate.” (p. 286)

“Children made use of the something else alternative on the unanswerable questions, despite the fact that it did not significantly improve their performance. […] Children’s overall accuracy on the unanswerable items was similar in the something else and standard conditions.” (p. 286)

“Children’s performance on the unanswerable questions improved with age across both questioning conditions. However, no developmental improvements occurred in children’s accuracy according to the something else prompt.” (p. 286)

“The something else alternative did produce higher accuracy rates for the false questions (i.e., questions where no correct response alternative was provided) compared to the standard two alternative questions, but still performance was very poor on these questions” (p. 287)

“Like the unanswerable questions, children were much more apt to self-generate their own response alternative when given the additional something else alternative on false questions. However, just over half of their self-generated responses were incorrect. In terms of raw numbers of inaccurate self-generated responses, children in the something else condition were over three times more likely to generate incorrect responses compared with children in the standard condition. Hence, the something else prompt might postpone children’s speculated responses, but children still performed poorly.” (p. 287)

“One encouraging finding is that the something else prompt did not have adverse effects on children’s performance when true answer choices were provided. However, when they did generate their own responses on true items, over half were incorrect.” (p. 287)

Translating Research into Practice

“The major concern expressed by developmental psychologists about forced-choice questions is that children tend to pick a response regardless of whether they know the answer. Some practitioners have argued providing a something else alternative obviates concerns about children’s tendency to pick an answer when given forced-choice questions. Since forensic interviewers rarely know whether they are including a correct response alternative, the idea is the something else alternative makes children free to generate their own response rather than locking them in to two alternatives.” (p. 287)

“Although perhaps intuitively appealing, the use of a something else alternative is without scientific support at this time. Our data indicate children generated responses at high rates for false and unanswerable questions, yet the responses showed high inaccuracy rates.” (p. 288)

“A general principle in forensic interviews is that children’s self-generated statements are more reliable than children’s responses to forced-choice questions. […] In the present study, children frequently generated their own incorrect responses when provided the something else alternative. These findings are particularly important given self-generated details are considered more reliable in legal settings. In a forensic interview, children might be pressured to provide a response to an interviewer’s follow up prompt because, by selecting the something else alternative, the child has indirectly agreed they have an answer to provide (vs. responding “I don’t know” or refuting the claim). However, forensic interviewers should not be lulled into a false sense of confidence about monosyllabic responses, even those produced (vs. selected) by the child.” (p. 288)

“Our study findings, combined with many others, indicate the most developmentally appropriate way to pose the question would be to avoid the forced-choice options altogether.” (p. 288)

“Our results suggest the provision of a something else alternative does not overcome children’s tendency to select (or generate) responses to forced-choice questions. Perhaps the biggest danger of incorporating a something else alternative is the practice may cause interviewers to have confidence in forced-choice questions, a confidence that is not warranted according to our data. Importantly, the youngest children were the most at risk for selecting or providing inaccurate responses. Overall, our findings lend further support to scientifically supported protocols that emphasize the use of open-ended prompts when interviewing children.” (p. 288)

Other Interesting Tidbits for Researchers and Clinicians

“The current study has a number of limitations. First, our design did not allow a direct comparison of children’s accuracy on forced-choice questions versus open-ended prompts. […] Future studies might directly compare the something else prompt with open-ended questions.” (p. 287)

“A second limitation of the study is that children performed poorly on the true event questions, indicating they had a fuzzy memory for the details of the event. However, given many children delay abuse disclosure, we chose to employ a delayed memory test. Future work can test children’s immediate memory using the something else prompt.” (p. 287)

“Finally, our study design was driven by scientific considerations of experimental control and statistical power. In actual forensic interviews, the interviewer generally intersperses the something else prompts among other question types. Future studies could employ a more ecologically valid presentation of the question types.” (p. 287)
