BLOG Archive - 2 / 2016

Have you ever wondered whether critical thinking skills can be assessed with a multiple-choice test?

We sat down with Dr. Peter Facione, founder of Insight Assessment and author of numerous books as well as the research paper Critical Thinking: What It Is and Why It Counts. He answered this and other questions about academic, skill, and employment assessments.

What is the Difference between Validity and Reliability?

A measuring tool is valid if it measures what you intend it to measure.  For example, it is valid to measure a runner’s speed with a stopwatch, but not with a thermometer. 

A test is reliable if it consistently produces the same result, when used correctly. 

Do Validity and Reliability Apply to Measuring Critical Thinking?

A valid assessment of critical thinking skills is one that targets the correct list of skills: in this case, analysis, inference, evaluation, interpretation, explanation, and self-regulation as used in the process of judging what to believe or what to do.

This means that scores on the test will be responsive to the quality of the test-taker's skills.

A reliable assessment of critical thinking will yield essentially the same results if the same test takers retake the test without having done anything in the meantime to increase or diminish their critical thinking skills.

Why Choose Insight Assessment Solutions

At Insight Assessment, measurement is taken very seriously.

  • Our products measuring reasoning skills and mindset have been studied in a variety of populations and contexts over the past 30 years.
  • In each case, items and scales are piloted in target samples and validated in replicated studies (trainees, students, employees, military personnel, K-12 students, health professionals, and the general population) to ensure the performance of the assessments in the intended population.

Peter Facione, PhD

Founder and Senior Researcher at Insight Assessment, the leading provider of critical thinking assessments and development programs; principal at Measured Reasons LLC, a Los Angeles-based research and consulting firm supporting excellence in strategic thinking and leadership decision making. Dr. Facione is the developer of the California Critical Thinking Skills Test family of measurement tools. His latest book is Think Critically (2016), coauthored with Carol Ann Gittens, PhD.

Rubrics are rating forms designed to capture evidence of a particular quality or construct. The quality of the measure obtained from a rubric depends on how well the rubric is designed: a poorly designed rubric yields confounded or inaccurate ratings. Quality is also affected by the skill of the rater using the rubric. When using a rubric, raters must be trained and calibrated to ensure that ratings are accurate and consistent across all raters. Ratings made with rubrics cannot be benchmarked against national comparison groups or compared to ratings made by other rater groups.

Rubrics are a popular approach when the goal is largely developmental; they are good pedagogical tools. Issues arise, however, when rubrics are used for summative assessment.

To read more about measuring critical thinking with rubrics

[Graphic: academic uses for Insight Assessment student critical thinking skills and mindset data include admissions, accreditation, documenting student learning, quality improvement initiatives, and institutional effectiveness.]

It is increasingly important for educational organizations to document achievement of student learning outcomes. Many of Insight Assessment's customers use one or more of our test instruments to gather the detailed objective evidence required to refine curricular approaches and to demonstrate educational success for accrediting bodies. Effective assessment programs first identify goals for the attainment of critical thinking skills and habits of mind as learning outcomes, and then create a plan to assess their achievement.

To read more about documenting achievement of evidence-based student learning outcomes.

Custom Questions

Custom questions enable test administrators to better organize their results.  Custom questions are used to gather the classification and demographic metrics needed to support analysis and grouping of test results. 

Insight Assessment's testing system enables clients to create up to ten custom demographic items.  Response options include selecting choices provided by the client or free response. There are also five default demographic items which clients may elect to eliminate or retain. Here are some frequently asked questions about getting the most out of our custom question option:

1. What kinds of questions can I ask the people I’m testing using the custom demographics items?

Just about anything that it is legal to ask.  For example, you can ask about educational background, employment background, which department or work group a person is a member of, or how much time a person spends on a given activity.  If you know the choices you want test takers to select from, you can supply them. If you want the test takers to enter a free text response, that works too.

2. What are some examples of custom demographic items?

Here are four examples. You can create your own custom demographic items to best fit your assessment plans.

  •  Type your student ID number here:  _______________
  •  What is your highest level of formal education?
    • Advanced degree
    • Baccalaureate degree
    • Associate's degree
    • High School Diploma or GED
    • Eighth Grade
    • Fourth Grade
    • None
  • Which professor assigned you to take this test?
    • Adams
    • Gomez
    • McDonnell
    • Smith
  • Which Division of our Company do you work in?
    • Accounting
    • Customer Relations
    • Human Resources
    • Marketing
    • Physical Plant & Security
    • Research & Development
    • Upper Management

3. I want to do a pretest and posttest analysis, but anonymity is important.  How can I set that up in the IA testing system?

There is a straightforward procedure using the custom demographics option. First, create an identifier that will link a given person's pretest with that person's posttest. The identifier can be, for example, a four-digit PIN that you create. Assign each individual a unique identifier, and ask them to enter it as a free-response answer to a custom question. Each person will do this when they log in to take their pretest and again when they log in to take their posttest. When you download test results in spreadsheet form, use that code to sort the data and thereby associate each person's pretest and posttest. The custom demographic item you create might look like this:

                Write the code number you were assigned here __________________
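Once both downloads are in hand, pairing the records is a simple join on the code column. Here is a minimal sketch, assuming the results export to CSV with a column holding the custom code and a column holding an overall score; the column names `code` and `overall_score`, and the file layout, are hypothetical placeholders for whatever headers appear in your actual export:

```python
import csv

def load_scores(path, code_field="code", score_field="overall_score"):
    """Map each test taker's linking code to a numeric score.
    Column names here are assumptions; substitute the headers in your export."""
    with open(path, newline="") as f:
        return {row[code_field]: float(row[score_field])
                for row in csv.DictReader(f)}

def pair_pre_post(pre_path, post_path):
    """Associate pretest and posttest scores by the assigned code,
    keeping only codes that appear in both downloads."""
    pre, post = load_scores(pre_path), load_scores(post_path)
    return {code: (pre[code], post[code]) for code in pre if code in post}
```

Because the join key is the assigned code rather than a name or email address, the paired data stays anonymous to the analyst.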

4. Does Insight Assessment supply any demographic items automatically?

Yes, there are five optional default demographic questions. Clients can elect to eliminate or retain any or all of these five. The first three are free-response items, which let responders enter anything they wish, accurate or not. The last two on the list give responders the option of declining to provide the requested information.

  • Name
  • Email address
  • Age
  • Gender
  • Ethnicity 

5. How can I learn more about creating custom demographic items or eliminating one or more of the default demographic items?

Easy, just contact Insight Assessment. We are happy to assist clients in getting the most helpful information possible when using our testing instruments and our powerfully versatile testing system.

Measuring Changes and Improvement in Thinking Skills

Thinking skill matters; it is the springboard for problem-solving, decision-making and innovation. Building strong employee critical thinking skills is an important investment in the workforce of the future.  Employees who think well are more productive, dependable and creative. 

Performance-based funding, however, is a fact of life for HR managers and training specialists. You must constantly prove that your training and development methods, curriculum or instruction are working. 

To prove the value of your initiatives, you will need objective, validated, criterion-based evidence of success.

The best way is to provide evidence of changes in the targeted thinking skills of individuals and/or groups.

Effective training should result in measurable improvements. Thinking should be assessed before the training starts and again after the training is completed. It’s important to document the participants’ baseline as early as possible in the project.  Pre-training assessment is best. This will help faculty and trainers to learn about strengths as well as areas for improvement.  Then critical thinking skills should be assessed again, post-training, to measure the changes in thinking skills resulting from the training program. 

Use pre- and post-testing to tell whether training is working

Pre- and post-testing is an assessment model designed to examine the change in overall critical thinking skills or mindset attributes in a group of test takers. Comparison of individual or group results will highlight areas of improvement, and areas that still need improvement can be targeted in subsequent training or mentoring.

For business, the pretest might be done before an employee training program begins. The post-test could be set for weeks or months after the program has been completed. It is reasonable to post-test as soon as a few weeks after a focused critical thinking training program, but most often a post-test is gathered months, or even years, after the pretest to measure program improvement.  More about pre- and post-testing
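Once pretest and posttest scores have been paired up, the group-level comparison described above is a matter of summarizing the score differences. A minimal sketch, assuming `pairs` maps each test taker's identifier to a `(pretest, posttest)` score tuple (this data structure is an illustration, not part of any Insight Assessment system):

```python
from statistics import mean

def group_change(pairs):
    """Summarize group-level change from paired pre/post scores.
    `pairs` maps each identifier to a (pretest, posttest) tuple."""
    pres = [pre for pre, _ in pairs.values()]
    posts = [post for _, post in pairs.values()]
    gains = [post - pre for pre, post in zip(pres, posts)]
    return {
        "mean_pre": mean(pres),
        "mean_post": mean(posts),
        "mean_gain": mean(gains),
        "improved": sum(g > 0 for g in gains),  # test takers whose score rose
    }
```

A positive mean gain suggests the training moved the group, while the per-person gains point to individuals who may need further mentoring.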

[Report graphic: pre- and post-test results for INSIGHT Business Professional.]

Data to assess impact of employee development program 

Only the highest quality assessment tools can deliver statistical evidence of value added to groups and individuals.

INSIGHT validated business test instruments deliver quantifiable metrics for the essential skills of critical thinking and quantitative reasoning. INSIGHT assessment tools are based on more than 30 years of experience in measuring thinking and reasoning skills.  These tools were specifically designed to provide businesses with the metrics they need to create teams of people able to learn their jobs quickly and to perform successfully in situations requiring problem-solving. 

INSIGHT assessments meet the highest standards of validity and reliability.

INSIGHT measures and reports on core business skills (such as analyzing problems, evaluating options, managing consequences, managing uncertainty, and managing numbers) plus associated essential business thinking mindsets (such as job commitment, work ethic, honesty, and focus).

The INSIGHT Development Program is designed to help employers build critical thinking in teams as well as in individuals. The program includes assessment and critical thinking training modules that can be integrated into your existing training programs or implemented as a new initiative. This proven solution will address your talent enrichment challenges.

Contact us to make sure you can document individual and/or group progress in attaining your learning goals.

For further information:

Good thinking is in demand. Download Critical Thinking Insight from your app store today:

Contact Us

* Required fields

Insight Assessment will not share your data with anyone. Click here to view our privacy statement.
