Assessment and Evaluation

Concept Inventory Assessment Instruments for Engineering Science

The Foundation Coalition offers thirteen easy-to-use concept inventories intended for pre- and post-testing of intuitive comprehension of a subject, independent of knowledge of the terminology or numerical modeling.
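Pre- and post-test concept-inventory scores are commonly summarized with Hake's normalized gain, which measures the fraction of the possible improvement a class actually realized. The page does not prescribe an analysis method, so the sketch below is an illustrative assumption, not the Coalition's procedure:

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Hake's normalized gain <g>: the fraction of the possible
    improvement realized between pre-test and post-test, where
    scores are class averages in percent correct."""
    if pre_pct >= 100.0:
        raise ValueError("pre-test score leaves no room for gain")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Example: a class averaging 40% on the pre-test and 70% on the
# post-test has realized half of its possible improvement.
print(round(normalized_gain(40.0, 70.0), 2))  # prints 0.5
```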

Assessment Instruments for Interest and Attitudes

Assessment Instruments for Engineering Process Skills Data-based Assessment Stories

Overviews: ABET Engineering Criteria Program Educational Outcomes

The Foundation Coalition offers summaries of resources available for assessment and instruction related to each of the eleven program educational outcomes.


Summative Reports

What is assessment and evaluation?
Assessment is defined as data-gathering strategies, analyses, and reporting processes that provide information that can be used to determine whether or not intended outcomes are being achieved.[1] Evaluation uses assessment information to support decisions on maintaining, changing, or discarding instructional or programmatic practices.[2] These strategies can inform
  • The nature and extent of learning,
  • Curricular decision making,
  • The correspondence between learning and the aims and objectives of teaching, and
  • The relationship between learning and the environments in which learning takes place.[3]
Introduction to Assessment and Evaluation across the Foundation Coalition

Assessment instruments and processes can be used to collect data on many different attributes and performance characteristics. Instruments designed to collect the data may be placed into one of three broad categories:
  • Content Knowledge
  • Student Interest, Perceptions, and Attitudes
  • Process Knowledge, e.g., teamwork, design

Assessment Stories: In addition to the instruments that have been developed across the Coalition, data has been collected on the curricular innovations that have been implemented on many different partner campuses. Stories constructed from the data are available on the Web site.

Workshops: The Coalition also offers several different workshops that campuses can host. The Coalition will cover facilitator and travel expenses and asks the host institution only to cover costs associated with hosting the workshop.

Glossary: A glossary of terms commonly used in assessment and evaluation has been compiled.

References for Further Information

  1. Gagne, R.M., L.J. Briggs, and W.W. Wager. 1998. Principles of Instructional Design. Orlando, FL: Holt, Rinehart and Winston, Inc.
  2. Hanson, G., and B. Price. 1992. Academic Program Review. In: M. A. Whitley, J. D. Porter, and R. H. Fenske (eds.). The Primer for Institutional Research. Tallahassee: Association for Institutional Research.
  3. Satterly, D. 1989. Assessment in Schools. Oxford, UK: Basil Blackwell Ltd.

Web Resources
Accreditation Board for Engineering and Technology

Ethics Tools Database

The web site provides information on a number of instruments that have been developed to ascertain student competence in various areas related to ethics.

Determining and Interpreting Resistive Electric Circuit Concepts Test (DIRECT)

Engelhardt, P.V., and Beichner, R.J. (2004). Students' understanding of direct current resistive electrical circuits. American Journal of Physics 72 (1), 98-115.
Abstract: Both high school and university students' reasoning regarding direct current resistive electric circuits often differs from the accepted explanations. At present, there are no standard diagnostic tests on electric circuits. Two versions of a diagnostic instrument were developed, each consisting of 29 questions. The information provided by this test can provide instructors with a way of evaluating the progress and conceptual difficulties of their students. The analysis indicates that students, especially females, tend to hold multiple misconceptions, even after instruction. During interviews, the idea that the battery is a constant source of current was used most often in answering the questions. Students tended to focus on the current in solving problems and to confuse terms, often assigning the properties of current to voltage and/or resistance.

Conceptual Survey on Electricity (CSE), Conceptual Survey on Magnetism (CSM), and Conceptual Survey on Electricity and Magnetism (CSEM).

They deal with E & M and can be used in pre-instruction and post-instruction modes for the algebra/trigonometry-based and calculus-based introductory, college-level physics courses. They have undergone extensive revision and have been reviewed by many college/university physics educators. Data from over 5,000 students at over 30 different institutions (two-year colleges, four-year colleges, and universities, including one in Europe) have been collected. The 1999 data for the CSEM show pre-test averages of 31% correct for calculus-based students and 25% for algebra/trigonometry-based students; post-instruction results rise only to 47% and 44% correct, respectively. Maloney, D., O'Kuma, T., Hieggelke, C., and Van Heuvelen, A. (2001). Surveying students' conceptual knowledge of electricity and magnetism. Am. J. Phys. 69 (7), Supplement 1, S12-S23.
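The CSEM averages quoted above can be turned into normalized gains, one common way of comparing pre/post results across populations (the gain metric is an assumption about how one might summarize the figures; the source reports only raw percentages):

```python
# Normalized gain computed from the CSEM class averages quoted in
# the text: calculus-based 31% -> 47%, algebra/trig 25% -> 44%.
def gain(pre: float, post: float) -> float:
    """Fraction of the possible improvement actually realized."""
    return (post - pre) / (100.0 - pre)

print(f"calculus-based: {gain(31, 47):.2f}")  # prints 0.23
print(f"algebra/trig:   {gain(25, 44):.2f}")  # prints 0.25
```

Both populations land in the "low gain" range (below 0.3), consistent with the text's point that post-instruction scores remain under 50% correct.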

Test of Understanding of Kinematic Graphs (TUG-K)

Beichner, R.J. (1994) Testing student interpretation of kinematics graphs. Am. J. Phys. 62 (8), 750-755
Abstract: Recent work has uncovered a consistent set of student difficulties with graphs of position, velocity, and acceleration versus time. These include misinterpreting graphs as pictures, slope/height confusion, problems finding the slopes of lines not passing through the origin, and the inability to interpret the meaning of the area under various graph curves. For this particular study, data from 895 students at the high school and college level was collected and analyzed. The test used to collect the data is included at the end of the article and should prove useful for other researchers studying kinematics learning as well as instructors teaching the material. The process of developing and analyzing the test is fully documented and is suggested as a model for similar assessment projects.

Force and Motion Conceptual Evaluation (FMCE)

Thornton, R.K., and Sokoloff, D.R. (1998). Assessing student learning of Newton's laws: The Force and Motion Conceptual Evaluation. American Journal of Physics 66 (4), 338-352.

Maryland Physics Expectation Survey

The Maryland Physics Expectation (MPEX) survey was developed by the Maryland Physics Education Research Group as part of a project to study the attitudes, beliefs, and expectations that affect what students learn in an introductory calculus-based physics course. Students are asked to agree or disagree, on a five-point scale, with 34 statements about how they see physics and how they approach working in their physics course. The survey was also given to a group of experienced university faculty committed to reforming their teaching to increase its effectiveness, and this group's responses serve as the definition of "expert"; the group shows strong consistency (>90%) on most survey items. The developers hypothesize that students who become effective scientists and life-long learners either have or will develop these attitudes.
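Likert-style expectation surveys such as the MPEX are typically scored by comparing each response against the expert consensus and reporting the fraction of "favorable" (expert-aligned) answers. The sketch below illustrates that general approach; the item keys and the exact collapsing rule are hypothetical, not the published MPEX scoring scheme:

```python
FAVORABLE, UNFAVORABLE, NEUTRAL = "F", "U", "N"

def score_item(response: int, expert_agrees: bool) -> str:
    """Collapse a 1-5 Likert response (1 = strongly disagree,
    5 = strongly agree) to favorable/neutral/unfavorable relative
    to the expert consensus on that item."""
    if response == 3:
        return NEUTRAL
    student_agrees = response > 3
    return FAVORABLE if student_agrees == expert_agrees else UNFAVORABLE

def favorable_fraction(responses: list[int], key: list[bool]) -> float:
    """Fraction of items where the student sides with the experts."""
    scores = [score_item(r, k) for r, k in zip(responses, key)]
    return scores.count(FAVORABLE) / len(scores)

# Three-item toy example (invented key): the student sides with the
# experts on two of three items, so 2/3 of responses are favorable.
print(favorable_fraction([5, 4, 1], [True, True, True]))
```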

ASEE Directory of Assessment Programs in Engineering Colleges

The page provides links to assessment activities in engineering programs.

