Motivated by the Force
Concept Inventory (FCI) created by Halloun and Hestenes
and its impact on physics education, the Foundation Coalition
(FC) is working to create concept inventories for specific engineering
disciplines. The FCI was designed to measure conceptual, not computational,
understanding of Newtonian mechanics. The questions focus on intuitive
comprehension independent of knowledge of the terminology or numerical
modeling. Following the lead of the FCI, faculty members are creating
concept inventories for other disciplines.
The FC offers thirteen easy-to-use concept inventories that are
intended for pre- and post-testing to encourage evaluation of different
teaching approaches. To prevent student access, the instruments are
not posted on the Web site but may be readily obtained by contacting
the developers listed below.
Circuits Concept Inventory (CCI)
The CCI is in two parts. Part I measures students'
conceptual understanding of the basic properties of electricity, circuit
components, and linear time-invariant networks (DC and AC). Part II
addresses frequency-domain concepts, coupled inductors, convolution,
impulse response, and transform techniques. Contact Robert.
Computer Engineering Concept Inventory (CECI)
Two parallel CECIs have been developed.
One is a 26-question inventory to measure mastery of fundamental digital
logic (DL) concepts to which sophomore students are introduced. Students
must master DL concepts to perform well in the follow-on courses.
The DL instrument will be released this summer. The other CECI measures
change in the conceptual understanding of computer engineering by
students who have completed sophomore-level courses; it is being refined.
Electromagnetics Concept Inventory (EMCI)
The EMCI is a tool designed to measure students' understanding
of fundamental concepts in electromagnetics. Although primarily intended
for junior-level electromagnetics courses in electrical engineering
departments, the EMCI can also be used in a variety of undergraduate
and graduate electromagnetics-related courses in engineering and physics
departments. EMCI Version 1.0 is composed of three instruments: (i)
Fields (electrostatic, magnetostatic, and time-varying EM fields), (ii)
Waves (uniform plane waves, transmission lines, waveguides, and antennas),
and (iii) Fields and Waves (a combination of the first two instruments).
Electronics Concept Inventory (ECI)
The 35-question ECI assesses student understanding of introductory
electronics concepts that are covered in the first of a two-course
sequence. The exam includes a small subset of basic circuit analysis
questions so that instructors can differentiate between misconceptions
in circuit analysis and misconceptions in electronics. The developers
hope the ECI will become standardized across the U.S. as an ABET instrument.
The official version is now in circulation. For more information or
to obtain the exam for use in your classes, contact Marc.
Signals and Systems Concept Inventory (SSCI)
The SSCI is a 25-question multiple-choice exam that assesses
students’ understanding of fundamental concepts in linear
signals and systems, with separate versions for continuous-time
and discrete-time material. To date, 28 instructors at 12 institutions
have administered versions 1 and 2 to over 1000 students. It has
been used for internal and ABET assessments. Contact Kathleen.
Wave Concepts Inventory (WCI) [6, 7]
The WCI has
20 questions with 34 possible answers. Areas probed include visualization
of waves, mathematical depiction of waves, and wave definitions. The
WCI allows more than one correct choice for most of the questions.
Choosing more than one answer correlates with increasing understanding
of the material. The WCI is intended for junior-level electronic
properties of materials courses. Contact Ron.
Dynamics Concept Inventory (DCI)
The DCI covers rigid-body mechanics and is
intended for use in the first course in dynamics commonly found in
many engineering curricula. Development (beta) versions of the DCI
were tested in the fall of 2003 and spring of 2004. The first official
release (version 1.0) will be available for use in the spring of 2005.
Contact Don Evans.
Fluid Mechanics Concept Inventory (FMCI)
The FMCI establishes a common base
of fluids concepts and provides instruments that evaluate the degree
to which students have mastered the concepts. A possible outcome of
the inventory could be modification of the curriculum and courses.
Contact Jay Martin.
Heat Transfer Concept Inventory (HTCI)
The HTCI assesses student understanding
of concepts, identifies misunderstandings, provides feedback to instructors,
and evaluates student gains in a heat transfer course. It is one piece
of a larger package intended to help instructors make learning of heat transfer
more effective. Its development involves both students and instructors.
The HTCI is being evaluated for coverage; concepts include fundamental
ideas, conduction, convection, and radiation. Contact Jay
Strength of Materials Concept Inventory (SMCI)
The SMCI measures mastery of fundamental strength of materials concepts
such as stress, strain, and buckling that are introduced to sophomore
students. Many students will not master the more abstract concepts
until they complete follow-on courses. The first draft of the SMCI
became available in 2002; a revised version is under development.
Contact Jim Richardson.
Thermodynamics Concept Inventory (TCI)
The TCI is intended for use in introductory thermodynamics courses.
Since thermodynamics is often taught as a two-course sequence, two
instruments ("beginning" and "intermediate") are
eventually desirable. This TCI focuses on the first course. The distribution
of subject matter over the questions does not reflect time spent in
class. The question distribution reflects expectations for students
upon course enrollment. Contact Clark.
Chemistry Concept Inventory (ChCI)
Faculty members selected thermochemistry,
bonding, intermolecular forces, equilibrium, acids and bases, and
electrochemistry as topics. The 20 questions are conceptual, not mathematical
or algorithmic, and can be answered in a short time. The third
version of the ChCI was given at the beginning and end of the
fall 2003 semester and then revised. Contact Jim.
Materials Concept Inventory (MCI)
The MCI measures misconceptions about materials
structure, processing, and properties. It is intended for introductory
materials engineering courses. MCI results suggest that utilizing
more active-learning methods in introductory materials engineering
courses may increase conceptual knowledge gains. Contact Steve
References for Further Information
- Hestenes, D., Wells, M., and Swackhamer, G.
(1992). "Force Concept Inventory,"The Physics Teacher,
30 (3), 141–151.
- Hestenes, D., and Halloun, I. (1995). "Interpreting
the Force Concept Inventory,"The Physics Teacher, 33
- Halloun, I., and Hestenes, D. (1985). "The
initial knowledge state of college physics students," American
Journal of Physics, 53 (11), 1043–1055.
- Halloun, I., and Hestenes, D. (1985). "Common
sense concepts about motion," American Journal of Physics,
- Evans, D.L., and Hestenes, D.L., "The Concept
of the Concept Inventory Assessment Instrument," Proceedings,
Frontiers in Education Conference, Reno, Nevada, 10–13 October 2001.
- Roedel, R.J., El-Ghazaly, S., Rhoads, T.R.,
and El-Sharawy, E., "The Wave Concepts Inventory: An
Assessment Tool for Courses in Electromagnetic Engineering,"
Proceedings, Frontiers in Education Conference, November
1998, Tempe, AZ.
- Rhoads, T.R., Roedel, R.J., "The Wave
Concept Inventory: A Cognitive Instrument Based on Bloom's
Taxonomy," Proceedings, Frontiers in Education Conference,
San Juan, Puerto Rico, 10–13 November 1999.
- Midkiff, K.C., Litzinger, T.A., and Evans, D.L.,
"Development of Engineering Thermodynamics Concept Inventory
Instruments," Proceedings, Frontiers in Education
Conference, Reno, Nevada, 10–13 October 2001.
One-page FIE2001 working paper: http://fie.engrng.pitt.edu/fie2001/papers/1356.pdf
FIE 2001 presentation: http://foundationcoalition.org/thermo
- Richardson, J., and Morgan, J., "Development
of an Engineering Strength of Material Concept Inventory Assessment
Instrument," Proceedings, Frontiers in Education
Conference, Reno, Nevada, 10–13 October 2001.
One-page FIE2001 working paper: http://fie.engrng.pitt.edu/fie2001/papers/1353.pdf
FIE 2001 presentation: http://foundationcoalition.org/strength
- Wage, K.E., and Buck, J.R., "Development
of the Signals and Systems Concept Inventory (SSCI) Assessment
Instrument," Proceedings, Frontiers in Education Conference,
Reno, Nevada, 10–13 October 2001.
One-page FIE2001 working paper: http://fie.engrng.pitt.edu/fie2001/papers/1358.pdf
FIE 2001 presentation: http://foundationcoalition.org/system
Resources beyond the Foundation Coalition
Determining and Interpreting Resistive Electric
Circuit Concepts Test (DIRECT)
Engelhardt, P.V., and Beichner, R.J. (2004) Students'
understanding of direct current resistive electrical circuits.
American Journal of Physics, 72 (1), 98–115.
Abstract: Both high school and university students' reasoning
regarding direct current resistive electric circuits often differ
from the accepted explanations. At present, there are no standard
diagnostic tests on electric circuits. Two versions of a diagnostic
instrument were developed, each consisting of 29 questions. The
information provided by this test can provide instructors with
a way of evaluating the progress and conceptual difficulties of
their students. The analysis indicates that students, especially
females, tend to hold multiple misconceptions, even after instruction.
During interviews, the idea that the battery is a constant source
of current was used most often in answering the questions. Students
tended to focus on the current in solving problems and to confuse
terms, often assigning the properties of current to voltage and/or resistance.
Conceptual Survey on Electricity (CSE), Conceptual
Survey on Magnetism (CSM), and Conceptual Survey on Electricity
and Magnetism (CSEM)
These surveys deal with E&M and can be used in pre-instruction
and post-instruction modes for the algebra/trigonometry-based
and calculus-based introductory, college-level physics courses.
They have undergone extensive revision and have been reviewed
by many college/university physics educators. Data from over 5000
students from over 30 different institutions (two-year colleges,
four-year colleges, and universities including one in Europe)
have been collected. The 1999 data for the CSEM show 31%
correct for calculus-based students and 25% for algebra/trigonometry-based
students on the pre-test. Post-instruction results rise only to
47% for calculus-based students and 44% correct for algebra/trigonometry-based
students.
Maloney, D., O'Kuma, T., Hieggelke, C., and Van Heuvelen, A. (2001)
"Surveying students' conceptual knowledge of electricity
and magnetism," Am. J. Phys. 69 (7), Supplement 1.
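Class averages like these are commonly summarized in the concept-inventory literature with Hake's normalized gain, the fraction of the possible improvement a class actually achieved. A minimal sketch in Python, using the CSEM percentages above (the function name is illustrative, and this summary statistic is not necessarily the one reported by the CSEM authors):

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Hake's normalized gain <g> = (post - pre) / (100 - pre):
    the fraction of the possible improvement actually achieved."""
    if pre_pct >= 100:
        raise ValueError("pre-test average must be below 100%")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# CSEM class averages quoted above
calc_gain = normalized_gain(31, 47)   # calculus-based sections
alg_gain = normalized_gain(25, 44)    # algebra/trig-based sections
print(f"calculus-based <g> = {calc_gain:.2f}")
print(f"algebra/trig-based <g> = {alg_gain:.2f}")
```

Both gains come out near 0.23 to 0.25, in the range typically reported for traditional lecture instruction.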
Test of Understanding of Kinematic Graphs (TUG-K)
Beichner, R.J. (1994) Testing
student interpretation of kinematics graphs. Am. J. Phys.
62 (8), 750–755.
Abstract: Recent work has uncovered a consistent set of
student difficulties with graphs of position, velocity, and acceleration
versus time. These include misinterpreting graphs as pictures,
slope/height confusion, problems finding the slopes of lines not
passing through the origin, and the inability to interpret the
meaning of the area under various graph curves. For this particular
study, data from 895 students at the high school and college level
was collected and analyzed. The test used to collect the data
is included at the end of the article and should prove useful
for other researchers studying kinematics learning as well as
instructors teaching the material. The process of developing and
analyzing the test is fully documented and is suggested as a model
for similar assessment projects.
Force and Motion Conceptual Evaluation (FMCE)
Thornton, R.K., and Sokoloff, D.R. (1998) "Assessing
student learning of Newton's laws: The Force and Motion Conceptual
Evaluation," American Journal of Physics, 66 (4), 338–352.
Outcomes Assessment Instruments for Identifying
Engineering Student Misconceptions in Thermal and Transport Sciences
This project focuses on creating an outcomes assessment
instrument to reliably identify engineering student misconceptions
in thermal and transport science courses (e.g. thermodynamics,
fluid mechanics, heat transfer, mass transfer).
Statistics Concept Inventory (SCI)
The Statistics Concept Inventory is designed in a format similar
to the Force Concept Inventory (FCI), which has been successful
in assessing student understanding of Newton's laws and in transforming
teaching to improve understanding. SCI development began in Fall 2002,
with a 32-item test. The scores and gains (from pre to post) are
similar to those found on early testing of the FCI in classes
which use the traditional lecture format. The SCI is divided into
the categories Descriptive, Probability, Inferential, and Graphical
based on the results of factor analysis. Each category has around
9 questions on the Summer 2004 version. Details on the sections
are provided on the Topics page.
Allen, K., Stone, A., Rhoads, T. R., and Murphy, T. J. (2004).
Statistics Concepts Inventory: Developing a Valid and Reliable
Instrument. Proceedings, ASEE Annual Conference and Exposition
Abstract: The Statistics Concepts Inventory (SCI) is
currently under development at the University of Oklahoma. This
paper documents the early stages of assessing the validity,
reliability, and discriminatory power of a cognitive assessment
instrument for statistics. The evolution of test items on the
basis of validity, reliability, and discrimination is included.
The instrument has been validated on the basis of content validity
through the use of focus groups and faculty surveys. Concurrent
validity is measured by correlating SCI scores with course grades.
The SCI currently attains concurrent validity for Engineering
Statistics courses, but fails to do so for Mathematics Statistics
courses. Because the SCI is targeted at Engineering departments,
this is a good starting point, but the researchers hope to improve
the instrument so that it has applicability across disciplines.
The test is shown to be reliable in terms of coefficient alpha
for most populations. This paper also describes how specific
questions have changed as a result of answer distribution analysis,
reliability, discrimination, and focus group comments. Four
questions are analyzed in detail: 1) one that was thrown out,
2) one that underwent major revisions, 3) one that required
only minor changes, and 4) one that required no changes.
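The reliability criterion mentioned in the abstract, coefficient alpha (Cronbach's alpha), can be computed directly from a students-by-items score matrix. A minimal sketch in Python; the data below are hypothetical toy scores, not SCI data:

```python
def coefficient_alpha(scores):
    """Cronbach's coefficient alpha for a list of per-student item
    score vectors (1 = correct, 0 = incorrect on a concept test).

    alpha = k/(k-1) * (1 - sum(item variances) / variance(totals)),
    where k is the number of items.
    """
    k = len(scores[0])  # number of items

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [pvar([s[i] for s in scores]) for i in range(k)]
    total_var = pvar([sum(s) for s in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# toy data: 4 students x 3 items (illustrative only)
data = [[1, 1, 1], [1, 0, 1], [0, 0, 0], [1, 1, 0]]
alpha = coefficient_alpha(data)
print(f"alpha = {alpha:.2f}")
```

Higher alpha indicates that the items hang together as a measure of one construct; values around 0.7 or above are the usual rule of thumb for acceptable reliability of an instrument like the SCI.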
Statics Concept Inventory
The Statics Concept Inventory is intended to assess the ability to use the concepts of Statics.
Steif, P. S. (2004). An
Articulation of the Concepts and Skills which Underlie Engineering
Statics. Proceedings, Frontiers in Education Conference,
accessed 23 June 2004
Abstract: Many instructional approaches are being developed
with the goal of improving learning in Statics. This paper is
aimed at providing guidance to such developments by articulating
the conceptual basis for Statics. This paper recognizes the
primary science prerequisite to Statics, freshman Newtonian
mechanics, and addresses the essential ways in which Statics
differs from freshman physics. A set of four concept clusters
is proposed, together with a set of skills for implementing
these concepts. Then, typical errors committed by students are
presented. Examples of these errors are extracted from student
solutions to Statics problems. These typical errors are then
explained by appealing to the proposed concepts and skills.
It is hoped that this paper can provide an impetus for mechanics
educators to come to a community-wide agreement on a conceptual
structure of this subject that can inform future instructional development.
Steif, P. S. (2003). Comparison
Between Performance On A Concept Inventory And Solving Of Multifaceted
Problems. Proceedings, Frontiers in Education Conference,
accessed 23 June 2004
Abstract: Engineering science courses teach students
to apply fundamental principles and methods to understand and
quantify new, unfamiliar situations. Prompted by the finding
that students often have widespread misconceptions regarding
basic principles, researchers in physics education have developed
concept inventories to assess conceptual understanding. In this
paper, we put forth a methodology for exploring the relation
between conceptual understanding, as judged by performance on
a concept inventory, and efforts to solve typical, multifaceted
problems. Based on an early version of a concept inventory for
Statics and a first attempt to employ this methodology, we find
there indeed to be correlations between conceptual understanding
and other general measures of performance on problem solving,
and course success in general. However, we did not find a one-to-one
correlation between an apparent understanding of specific concepts
and the successful application of those concepts in problem solving.
Steif, P. S. (2004). Initial
Data from a Statics Concept Inventory. Proceedings, ASEE
Annual Conference and Exposition, accessed 23 June 2005
Device Concept Inventory
The Device Concept Inventory (DCI) is a fifty-question multiple-choice/multiple-answer
Web-based quiz that has been developed to test conceptual
understanding of device theory.
© 2001 Foundation Coalition. All rights reserved.