Information Age Education
Issue Number 99
October 2012

This free Information Age Education Newsletter is written by Dave Moursund and Bob Sylwester, and produced by Ken Loge. The newsletter is one component of the Information Age Education project. See http://iae-pedia.org/ and the end of this newsletter. All back issues of this newsletter are available free online at http://i-a-e.org/iae-newsletter.html.

What Does Proficient Mean?

Under a spreading chestnut tree
The village smithy stands;
The smith, a mighty man is he,
With large and sinewy hands;
And the muscles of his brawny arms
Are strong as iron bands.
(Henry Wadsworth Longfellow;
American poet and educator; 1807–1882.)

Longfellow’s The Village Blacksmith is one of my favorite poems. The images of beauty and competence have stuck in my mind since I first encountered them in my childhood.

Competence—proficiency, expertise. These words spin together in my head. From time to time I read a report about the number of students who have reached or exceeded the proficient level on a state or national assessment. Sometimes the article laments that so many students have failed to reach this level in a specific academic area and goes on to discuss other failures of our educational system. Other times the article compares scores on different exams, noting that one measure of proficiency may be quite a bit different from another.

Quoting from a newspaper article by Bruce Maiman:

The future isn’t necessarily promising. In last year’s [2011] National Assessment of Educational Progress, only 17 percent of eighth-graders and 12 percent of high school seniors had a proficient knowledge of American history (Maiman, 8/7/2012).

The author quotes these data to support his argument that something is wrong with American history education in the U.S. However, the article gives no definition of proficient. What does it mean to be proficient in American history? Did the 8th graders and the 12th graders take the same test?

Moursund (2011b) discusses lower-order and higher-order aspects of history education. Lower-order tends to focus on names, dates, and places for various events. Higher-order tends to focus on causality, legacy, responsibility, and investigation/research. People with a reasonably high level of expertise in history have an appropriate blend of lower-order and higher-order knowledge and skills. Development of higher-order knowledge and skills in history depends on making good progress in achieving formal operations on a Piagetian cognitive development scale and on good teaching.

I am not a historian, but I can imagine it is quite a challenge to define proficiency in American history for 8th graders and 12th graders, and to design tests that divide students into categories such as significantly lower than proficient, proficient, and significantly higher than proficient. The average cognitive development level of 8th grade students is far below that of 12th grade students.

A 2011 report from the National Center for Education Statistics (NCES) helps to illustrate the approach the National Assessment of Educational Progress (NAEP) is taking. It provides samples of types of questions that are being used to help assess Basic, Proficient, and Advanced levels of student knowledge and skills in American history.

Basic Level

Interpret a map about the colonial economy (grade 4).

Identify a result of Native American-European interaction (grade 8).

Understand the context of a women’s movement document (grade 12).

Proficient Level

Understand that canals increased trade among states (grade 4).

Identify a domestic impact of war (grade 8).

Understand Missouri statehood in the context of sectionalism (grade 12).

Advanced Level

Explain how machines and factories changed work (grade 4).

Explain two differences between plantations and small farms in the antebellum South (grade 8).

Evaluate Civil War arguments (grade 12) (NCES, 2011).

Notice that all of the questions focus on higher-order knowledge and skills. This is in sharp contrast to much of the teaching of American history in our schools.

Here is another interesting example. Quoting from a California newspaper article by Sharon Noguchi:

And yet the results of the STAR tests administered last spring starkly show how far off the goal of competency remains. Among Latino students in Santa Clara County, only 21 percent tested proficient or above in Algebra I, the same percentage as last year. The figure is 24 percent in San Mateo County, up one percentage point from 2011.

What's more, amid widespread efforts to narrow the achievement gap that separates white and Asian students on the one hand and African-American and Latino students on the other, the gap between white and Latino students in Algebra I proficiency has grown in the past eight years (Noguchi, 9/1/2012).

The material and ideas given above made me start to think about whether I could give a good definition of proficient that would cover such diverse areas as history, math, and village blacksmithing. Here are three definitions of proficient gleaned from the Web:

Adjective: Competent or skilled in doing or using something.

Adjective: Well advanced in any art, science, subject; skilled.

Noun: An expert; an adept.

Hmm. These definitions do not say anything about students of different ages or students enrolled in different courses. It seems obvious to me that in talking about a student’s level of expertise in any curriculum area, we would expect more of a 12th grader than of an 8th grader.

On the other hand, suppose that a company is screening job applicants for a particular job. The company wants to hire someone who is proficient in doing the job or who can become proficient after a limited amount of training and experience. Likely the company is not concerned about whether some applicants are four years younger than others.

Using an expression from the game of cricket, it seems to me that defining proficiency is a “sticky wicket.” I wonder what is being communicated through articles that report on the proficiency of students.

Expertise

I have previously written about expertise and developed an expertise scale useful in my teaching (Moursund, 2011a). A person’s level of expertise in a particular area can be measured against all people in the world or against a specified group of people. We routinely do this in competitive sports and other competitive events. A 12-year-old female athlete may be the best in her school, her school district, the state, or the nation as compared to other 12-year-old girls.

Similar statements apply to performers in other areas such as music, math, and history. A person who is really good when compared to a collection of the best in the world is deemed to be world-class.

Within an area of knowledge, skills, and performance, a person’s level of expertise might fall anywhere on a scale from “absolute novice” to “world-class.” A person’s level of expertise may satisfy—or fail to satisfy—their own personal needs. It may satisfy—or fail to satisfy—needs specified by a potential employer or a governmental agency.

I thought I was off to a good start on understanding proficient as a level of expertise. Then I encountered an example from a 2008 Center for Public Education report (CPE, 2008). The report provides math performance data on state assessment exams and on the NAEP exam for each state in 2005. In my home state of Oregon, for example, 64% of the middle school students received the rating of proficient on the state test. On the NAEP test, only 34% of the students were rated as proficient.

This type of large difference between state and NAEP measures is common in the report. However, there are some exceptions. In Missouri, for example, 16% of the students were rated as proficient on the state test and 26% were rated as proficient on the NAEP test. From a student performance point of view, the Missouri state definition of proficient is set at a higher level than the NAEP definition.

This example told me that being proficient in a particular area is subject to multiple definitions. Various stakeholder groups may use different measures that provide different ratings for a particular student or group of students. A person might be proficient according to one definition/measure and not be proficient by a different definition/measure.

Moreover, it is still not clear to me what level of expertise the rating of proficient represents. How “well advanced” does a person need to be in order to be classified as proficient?

NAEP assessments strive to compare students from around the country and over the years. How do NAEP assessments take into consideration changes in technology and the potential effect of these changes on curriculum content? We still have blacksmiths, but their tools and materials have changed. Henry Longfellow’s village blacksmith would certainly face an uphill battle to achieve a rating of proficient as a blacksmith in a modern factory.

Think about comparing students over a period of years during this time when Information and Communication Technology (ICT) has been improving rapidly and is becoming more readily available. How does one compare the proficiency of a student equipped with good ICT facilities versus a student with little or no access to ICT?

Norm-referenced and Criterion-referenced Assessments

Here is a little background that you may find useful. High-stakes tests tend to be either norm-referenced or criterion-referenced. In a norm-referenced test, the test questions are first field-tested with a large group (a control group) of students. When the norm-referenced test is actually used, students are rated in terms of how well they do relative to the control group. Thus, a rating at the 50th percentile would indicate that a student’s score fell in the middle of the control group’s scores. See http://en.wikipedia.org/wiki/Norm-referenced_test.

This type of rating system does not tell us how well a student can actually perform in dealing with the problems and tasks in the area being tested. People who develop and/or use such an assessment might make up a definition such as “proficient means scoring at the 75th percentile or above,” but that certainly sounds silly to me. Under such a definition, students whose scores fall in or above the range of the top quarter of the control group would be rated as proficient, and the rest of the students would not.
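To make the arithmetic concrete, here is a minimal sketch in Python of norm-referenced scoring. The control-group scores, the student score, and the 75th-percentile cutoff are all hypothetical, invented only to illustrate the sort of definition described above; real percentile-rank calculations vary in their details.

# Hypothetical norm-referenced rating: compute a student's percentile
# rank relative to a control (norming) group, then apply the invented
# "75th percentile or above" cutoff discussed above.
from bisect import bisect_left

def percentile_rank(student_score, control_scores):
    """Percent of control-group scores that fall below student_score."""
    ranked = sorted(control_scores)
    return 100.0 * bisect_left(ranked, student_score) / len(ranked)

control_group = [42, 55, 61, 64, 68, 71, 74, 77, 80, 85, 88, 93]  # field-test scores
student_score = 78
rank = percentile_rank(student_score, control_group)  # about 67 for this group
print("Proficient" if rank >= 75 else "Not proficient")

Notice that the same student score would earn a different rating against a different control group, which is why a norm-referenced rating by itself says little about absolute performance.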

A criterion-referenced test is designed to measure how well a student can actually perform in dealing with a specified set of problems and tasks. A student’s rating is the student’s score. It is not adjusted to be a comparison with other students who have taken the test.

Suppose, for example, that students in a class are learning to repair airplane engines. The goal is for students to learn to detect, diagnose, and repair the most frequently occurring airplane engine problems or potential problems. The people doing the training define proficient to mean that a student scores 96% or higher on this test.

When the class is first tested, it may be that only a quarter of the students achieve proficiency. Those who don’t are given more training and practice. If they still don’t achieve proficiency, they may be given still more training and practice. Eventually, those who don’t achieve proficiency are removed from the class and/or receive a failing grade. A test score of 84% may earn a grade of B in some courses, but that is totally unacceptable in dealing with students who will actually be responsible for repairing airplane engines!
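By contrast, here is a companion sketch of criterion-referenced scoring using the airplane-engine scenario. The 96% cutoff comes from the example above; the student names and scores are invented for illustration.

# Hypothetical criterion-referenced rating: the rating is the raw score
# itself, compared against a fixed cutoff (96%, per the scenario above),
# with no adjustment for how other students performed.
PROFICIENT_CUTOFF = 96.0  # percent correct required by the training program

def is_proficient(score_percent):
    """A student is proficient iff the raw score meets the fixed criterion."""
    return score_percent >= PROFICIENT_CUTOFF

class_scores = {"Avery": 98.5, "Blake": 84.0, "Casey": 96.0}
for name, score in class_scores.items():
    status = "proficient" if is_proficient(score) else "needs more training"
    print(f"{name}: {score}% -> {status}")

Note that Blake’s 84%, a solid B in many courses, fails the criterion here—which is exactly the point of the example.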

In Summary, Proficiency Is a Level of Expertise

A learner receiving a rating of proficient has achieved a level of expertise that some group has defined to be (has labeled as) proficient as measured by a specified assessment instrument and/or procedure. The assessment instrument and/or procedure is designed for a particular group of learners, such as 8th graders, 12th graders, or Air Force recruits who want to be airplane engine repair technicians. Different assessment designers may well develop quite different assessment instruments/procedures. Different groups defining what constitutes proficiency as measured by a particular assessment instrument/procedure will likely set different cutoff scores for achieving proficiency.

In brief summary, we are not talking about an exact science, and the overall process is quite expensive. NAEP’s testing budget in fiscal year 2013 is apt to be about $130 million (see http://blogs.edweek.org/edweek/curriculum/2012/02/obama_proposes_cut_to_naep_fun.html). I fear that we are falling behind in developing assessments that are both authentic and relevant to our current needs. I am particularly concerned that our current state and national testing programs may be holding back important changes in content, pedagogy, and assessment that would be made possible by the routine and effective integration of Information and Communication Technology into our schools.


References

CPE (2008). How does your state compare to NAEP proficient in math? Center for Public Education. Retrieved 9/3/2012 from http://www.centerforpubliceducation.org/Main-Menu/Evaluating-performance/The-proficiency-debate-At-a-glance/How-does-your-state-compare-to-NAEP-proficient-in-math.html.

Maiman, B. (8/7/2012). As stupidity becomes fashionable, we all lose. The Sacramento Bee. Retrieved 9/4/2012 from http://www.sacbee.com/2012/08/07/4699642/as-stupidity-becomes-fashionable.html.

Moursund, D. (2011a). Education for increasing expertise. Retrieved 9/6/2012 from http://iae-pedia.org/Education_for_Increasing_Expertise.

Moursund, D. (2011b). Digital filing cabinet: Secondary school history. Retrieved 9/6/2012 from http://iae-pedia.org/Digital_Filing_Cabinet/Secondary_School_History.

Noguchi, S. (9/1/2012). Schools' STAR scores rise, but gap in algebra grows. The Oakland Tribune. Retrieved 9/8/2012 from http://www.insidebayarea.com/education/ci_21443602.

NCES (2011). The nation’s report card: U.S. history 2010. National Center for Education Statistics. Retrieved 9/8/2012 from http://nces.ed.gov/nationsreportcard/pubs/main2010/2011468.asp#section4.


Reader Comments

We are using the Disqus commenting system to facilitate comments and discussions pertaining to this newsletter. To use Disqus, please click the Login link below and sign in. If you have questions about how to use Disqus, please refer to this help page.
Readers may also send comments via email directly to moursund@uoregon.edu.


About Information Age Education, Inc.

Information Age Education is a non-profit organization dedicated to improving education for learners of all ages throughout the world. Current IAE activities include a Wiki with address http://IAE-pedia.org, a Website containing free books and articles at http://I-A-E.org, a Blog at http://i-a-e.org/iae-blog.html, and the free newsletter you are now reading.

For a free subscription to this twice-a-month newsletter and to see back issues, go to http://i-a-e.org/iae-newsletter.html. You can change your address or cancel your subscription by clicking on the “Manage your Subscription” link at the bottom of this message.