Blog Archive, February 2016


Have you ever wondered if critical thinking skills can be assessed using a multiple-choice test?

We sat down with Dr. Peter Facione, founder of Insight Assessment and the author of numerous books and the research paper Critical Thinking: What It Is and Why It Counts. He answered this and other questions regarding academic, skill, and employment assessments.


What is the Difference between Validity and Reliability?

A measuring tool is valid if it measures what you intend it to measure.  For example, it is valid to measure a runner’s speed with a stopwatch, but not with a thermometer. 

A test is reliable if it consistently produces the same result, when used correctly. 

Do Validity and Reliability Apply to Measuring Critical Thinking?

A valid assessment of critical thinking skills would be one that targets the correct list of skills. 

This means that scores on the test will be responsive to the quality of the test taker's skills: in this case, analysis, inference, evaluation, interpretation, explanation, and self-regulation, as used in the process of judging what to believe or what to do.

A reliable assessment of critical thinking will yield essentially the same results, if the same test takers retake the test without having done anything in the meanwhile to increase or to diminish their critical thinking skills.   
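Test-retest reliability of this kind is commonly estimated as the correlation between two administrations of the same instrument. As a rough illustration (the scores below are hypothetical, not Insight Assessment data), a correlation near 1.0 indicates highly consistent results:

```python
# Illustrative sketch: test-retest reliability as the correlation between
# two administrations of the same test. Scores are hypothetical.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical critical thinking scores for five test takers,
# tested twice with no intervening training.
first_administration = [78, 85, 62, 90, 71]
second_administration = [80, 83, 65, 91, 69]

r = pearson(first_administration, second_administration)
print(f"Test-retest reliability estimate: r = {r:.2f}")
```

A reliable instrument administered under these conditions should produce a coefficient close to 1.0; a value much lower would suggest the scores are not stable across administrations.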


Why Choose Insight Assessment Solutions?

At Insight Assessment, measurement is taken very seriously.

  • Our products measuring reasoning skills and mindset have been studied in a variety of populations and contexts over the past 30 years.
  • In each case, items/scales are piloted in target samples and validated in replicated studies (trainees, students, employees, military personnel, K-12 students, health professionals, and the general population) to assure the performance of the assessments in the intended population.

Peter Facione, PhD

Founder and Senior Researcher at Insight Assessment, the leading provider of critical thinking assessments and development programs; principal at Measured Reasons LLC, a Los Angeles-based research and consulting firm supporting excellence in strategic thinking and leadership decision making. Dr. Facione is the developer of the California Critical Thinking Skills Test family of measurement tools. His latest book is Think Critically, 2016, coauthored with Carol Ann Gittens, PhD.


Rubrics are rating forms designed to capture evidence of a particular quality or construct. The quality of the measure obtained from a rubric depends on how well it is designed. If the rubric is poorly designed, the rating is confounded or inaccurate. The quality of the rubric is also affected by the skill of the rater using the rubric.  When using a rubric it is necessary to train and calibrate the raters to use the rubric well to assure that ratings are accurate and consistent across all raters. Ratings using rubrics cannot be benchmarked against national comparison groups or compared to other ratings made by other rater groups.

Rubrics are a popular approach when the goal is largely developmental; rubrics are good pedagogical tools.   Issues arise, however, when rubrics are used for summative assessment.
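The calibration concern described above can be made concrete: after training, two raters score the same set of responses, and their agreement is checked with a chance-corrected statistic such as Cohen's kappa. A minimal sketch, using hypothetical rubric scores on a 1-4 scale:

```python
# Illustrative sketch: checking rater calibration with Cohen's kappa,
# a chance-corrected agreement statistic for two raters scoring the
# same items with the same rubric. Ratings below are hypothetical.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical ratings of the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Agreement expected by chance, from each rater's marginal frequencies.
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

rater_a = [4, 3, 3, 2, 4, 1, 3, 2]
rater_b = [4, 3, 2, 2, 4, 1, 3, 3]
print(f"kappa = {cohens_kappa(rater_a, rater_b):.2f}")
```

Well-calibrated raters should produce a kappa approaching 1.0; a low value signals that additional rater training is needed before the ratings can be trusted.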

Read more about measuring critical thinking with rubrics.

[Graphic: Academic uses for Insight Assessment student critical thinking skills and mindset data, including admissions, accreditation, documenting student learning, quality improvement initiatives, and institutional effectiveness]

It is increasingly important for educational organizations to document achievement of student learning outcomes. Many of Insight Assessment's customers use one or more of our test instruments to gather the detailed objective evidence required to refine curricular approaches and to demonstrate educational success for accrediting bodies. Effective assessment programs first identify goals for the attainment of critical thinking skills and habits of mind learning outcomes, and then create a plan to assess their achievement.

Read more about documenting achievement of evidence-based student learning outcomes.

Custom Questions

Custom questions enable test administrators to better organize their results.  Custom questions are used to gather the classification and demographic metrics needed to support analysis and grouping of test results. 

Insight Assessment's testing system enables clients to create up to ten custom demographic items.  Response options include selecting choices provided by the client or free response. There are also five default demographic items which clients may elect to eliminate or retain. Here are some frequently asked questions about getting the most out of our custom question option:

1. What kinds of questions can I ask the people I’m testing using the custom demographics items?

Just about anything that it is legal to ask.  For example, you can ask about educational background, employment background, which department or work group a person is a member of, or how much time a person spends on a given activity.  If you know the choices you want test takers to select from, you can supply them. If you want the test takers to enter a free text response, that works too.

2. What are some examples of custom demographic items?

Here are four examples.  You are able to create your own custom demographic items to best fit with your assessment plans.

  •  Type your student ID number here:  _______________
  •  What is your highest level of formal education?
    • Advanced degree
    • Baccalaureate degree
    • Associate's degree
    • High School Diploma or GED
    • Eighth Grade
    • Fourth Grade
    • None
  • Which professor assigned you to take this test?
    • Adams
    • Gomez
    • McDonnell
    • Smith
  • Which Division of our Company do you work in?
    • Accounting
    • Customer Relations
    • Human Resources
    • Marketing
    • Physical Plant & Security
    • Research & Development
    • Upper Management
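The item types shown above (a free-response prompt versus a fixed list of choices) can be sketched as a simple data model. This is a hypothetical illustration, not Insight Assessment's actual system:

```python
# Hypothetical data model for custom demographic items; this is an
# illustration of the options described above, not Insight Assessment's
# actual testing system or API.

from dataclasses import dataclass, field

MAX_CUSTOM_ITEMS = 10  # the testing system allows up to ten custom items

@dataclass
class DemographicItem:
    prompt: str
    choices: list[str] = field(default_factory=list)  # empty list = free response

    @property
    def is_free_response(self):
        return not self.choices

items = [
    DemographicItem("Type your student ID number here:"),
    DemographicItem("Which professor assigned you to take this test?",
                    ["Adams", "Gomez", "McDonnell", "Smith"]),
]

assert len(items) <= MAX_CUSTOM_ITEMS
print(items[0].is_free_response, items[1].is_free_response)  # True False
```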

3. I want to do a pretest and posttest analysis, but anonymity is important. How can I set that up in the IA testing system?

There is a very straightforward procedure using the custom demographics option. First, create an identifier that will link a given person's pretest with that person's posttest. The identifier can be, for example, a four-digit PIN that you create. Assign each individual a unique identifier. You will then ask them to enter that identifier as a free-response answer to a custom question. Each person will do that when they log in to take their pretest and again when they log in to take their posttest. When you download test results in spreadsheet form, use that code to sort the data and thereby associate each person's pretest and posttest. The custom demographic item you would create might look like this:

                Write the code number you were assigned here __________________
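Once both sets of results are downloaded in spreadsheet form, the shared code links each pretest row to its posttest row. A minimal sketch in Python, assuming hypothetical column names "code" and "score" (match these to the headers in your actual download):

```python
# Sketch of associating pretest and posttest rows by the assigned code.
# The column names ("code", "score") and the data are hypothetical;
# adjust them to the headers in your downloaded spreadsheet.

import csv
import io

# Stand-ins for the two downloaded result files.
pretest_csv = """code,score
1042,71
1043,64
1044,80
"""
posttest_csv = """code,score
1043,72
1042,78
1044,85
"""

def by_code(text):
    """Map each person's code to their score."""
    return {row["code"]: float(row["score"])
            for row in csv.DictReader(io.StringIO(text))}

pre, post = by_code(pretest_csv), by_code(posttest_csv)

# Pair each person's pretest with their posttest via the shared code.
for code in sorted(pre.keys() & post.keys()):
    gain = post[code] - pre[code]
    print(f"{code}: pretest {pre[code]:.0f}, posttest {post[code]:.0f}, gain {gain:+.0f}")
```

Because only the code appears in the data, individuals remain anonymous to the analyst while their pretest and posttest scores stay linked.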

4. Does Insight Assessment supply any demographic items automatically?

Yes, there are five optional default demographic questions. Clients can elect to eliminate or retain any or all of these five.  The first three are free response items, which enable the responder to enter anything they wish whether it is accurate or not.  The last two on the list provide responders with the option of declining to provide the requested information.  

  • Name
  • Email address
  • Age
  • Gender
  • Ethnicity 

5. How can I learn more about creating custom demographic items or eliminating one or more of the default demographic items?

Easy, just contact Insight Assessment. We are happy to assist clients in getting the most helpful information possible when using our testing instruments and our powerfully versatile testing system.

Measuring Changes and Gains in Thinking Skills

Performance-based funding is a fact of life, whether you're an educational institution, an HR manager, or a training specialist.

You need to prove that your training and development methods, curriculum or instruction are working.  Objective, validated criterion-based evidence will be required to demonstrate success and prove the value of your programs.  

The best way to demonstrate the impact of your program is to provide evidence of changes in individual and/or group thinking skills.

Effective training should result in measurable improvements. Thinking should be assessed before the training starts and again after the training is completed. It’s important to document the participants’ baseline as early as possible in the project; before training starts is best. This will help faculty and trainers to learn about strengths as well as areas for improvement.  Then critical thinking skills should be assessed again to measure the changes in thinking skills after the training program. 
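Once pretest and posttest scores are paired, the group-level change can be summarized. A minimal sketch with hypothetical scores, reporting the mean gain and a simple paired-samples effect size:

```python
# Illustrative sketch: summarizing pretest-to-posttest change for a group.
# Scores are hypothetical; a real analysis would use your downloaded results.

from statistics import mean, stdev

pretest = [62, 70, 55, 81, 66, 74]
posttest = [68, 75, 60, 83, 72, 79]

# Per-person change from pretest to posttest.
gains = [post - pre for pre, post in zip(pretest, posttest)]

mean_gain = mean(gains)
effect_size = mean_gain / stdev(gains)  # paired-samples effect size (Cohen's d_z)

print(f"mean gain: {mean_gain:.1f} points")
print(f"effect size (d_z): {effect_size:.2f}")
```

A positive mean gain with a substantial effect size is the kind of quantifiable evidence of improvement that funders and accreditors look for.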

Pre & post testing is a testing model designed to examine the change in overall critical thinking skills or mindset attributes in a group of test takers.

It's reasonable to posttest as soon as a few weeks after a focused training program in critical thinking, but most often a posttest is gathered months or even years after the pretest to measure program improvement.

  • In university settings a pretest is frequently scheduled at the beginning of a degree program and a posttest sometime toward the end of the program. Our clients count on our critical thinking tools for the detailed evidence required to refine curricular approaches and to demonstrate educational success.
  • For businesses, the pretest might be done before an employee training program begins, and a posttest could be set for weeks or months after the program has been completed. The INSIGHT Development Program includes assessment and critical thinking training modules that can be integrated into your training programs or implemented as a new initiative.

The validity and reliability of pretest-posttest assessment instruments must meet the highest standards. Insight Assessment's validated test instruments deliver quantifiable metrics for the essential skills of critical thinking and quantitative reasoning. Many of our clients use custom services to receive enhanced data analyses expressly designed to fit their improvement projects.

Only the highest quality assessment tools can deliver statistical evidence of value added to groups and individuals. Contact us to design a pretest posttest assessment program to document individual and/or group progress in attaining learning goals.

For further information on pretest-posttest assessment design, contact us.

Good thinking is in demand. Download the Critical Thinking Insight app from your app store today.

Contact Us


Insight Assessment will not share your data with anyone. See our privacy statement for details.
