Design Pattern for Model Evaluation in Model-Based Reasoning (Design Pattern #2221)
Title
Design Pattern for Model Evaluation in Model-Based Reasoning
Overview  
This design pattern supports developing tasks in which students evaluate the correspondence between a model and its real-world counterparts, with emphasis on anomalies and important features not accounted for in the model.
This design pattern is closely tied to model use, and is also associated with model revision and model elaboration.
Use
  U1.
It is essential to be able to determine the degree to which a model is in fact an appropriate means of reasoning about a physical situation. Examining evidence of the nature and quality of the model's misfit to data is important in this regard. Key to this endeavor is the degree to which predictions and inferences from the model, given some of the data, are consistent with other parts of the data or with further data that could be gathered.
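As an illustration only, the following is a minimal sketch (in Python) of the kind of model-fit check described above: a simple model is fit to part of the data, its predictions are compared against held-out observations, and residuals that the model cannot account for are flagged as anomalies. The simulated data, the linear model, and the three-sigma criterion are hypothetical assumptions, not part of the design pattern itself.

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 40)
    y = 2.0 * x + 1.0 + rng.normal(scale=1.0, size=x.size)  # simulated observations
    y[-3:] += 8.0                                           # a few points the model cannot explain

    # Fit a straight line to the first 30 observations (the "given" data).
    slope, intercept = np.polyfit(x[:30], y[:30], deg=1)

    # Compare model predictions with the remaining, held-out observations.
    residuals = y[30:] - (slope * x[30:] + intercept)

    # Flag held-out residuals that are large relative to the fit on the given data.
    train_residuals = y[:30] - (slope * x[:30] + intercept)
    threshold = 3.0 * train_residuals.std()
    anomalies = np.abs(residuals) > threshold

    print("held-out residuals:", np.round(residuals, 2))
    print("flagged as anomalous:", anomalies)

Tasks built from this design pattern would elicit the same kind of reasoning with whatever discipline-appropriate models and tools the situation calls for, rather than this particular statistical check.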
Focal knowledge, skills, and abilities
Fk1.
In broad terms, ability to determine the appropriateness of a model for reasoning about a situation, for a given purpose.
Fk2.
More specifically, the ability to identify salient features of available data for comparison and to detect anomalies that available models cannot explain.
Additional knowledge, skills, and abilities
Ak1.
Familiarity with real-world situation
Ak2.
Domain area knowledge (declarative, conceptual, and procedural)
Ak3.
Familiarity with required modeling tool(s)
Ak4.
Familiarity with required symbolic representations and associated procedures (especially statistical methods)
Ak5.
Familiarity with task type (e.g., materials, protocols, expectations)
Ak6.
Familiarity with standards of quality & expectation in the field
Ak7.
Ability to encode and represent evidence to be evaluated as an entity distinct from representations of the model
Ak8.
Knowledge of model at issue
Potential observations
Po1.
Comprehensiveness and appropriateness of methods of assessing model fit
Po2.
Systematicity of model evaluation procedures (e.g., are results of a test used to guide the choice of the next test?)
Po3.
Degree of integration of results from multiple tools/views for assessing model fits
Po4.
Quality of explanation of model fit
Po5.
Indication of which aspects of the model do not fit, with respect to both aspects of the data and aspects of the model
Po6.
Quality of determination of whether model misfit will degrade target inferences
Po7.
Quality of rationale the student provides for steps in the construction of the model. Includes domain-specific heuristics and domain-specific explanatory schemas when these are targets of inference
Po8.
Quality of rationale for what entities and relationships are expressed in the model, versus those which are omitted
Potential work products
Pw1.
Verbal/written explanation of model-fitting actions [especially looking for prediction of new observations]
Pw2.
Trace of actions
Pw3.
Statements of hypotheses that motivate evaluation procedures
Pw4.
Talk-aloud trace [which may exhibit evidence of model evaluation actions/reasoning]
Pw5.
Representations and summaries of formal model-fitting tools such as statistical tests.
Pw6.
Record of results of model-fit analysis on forms provided (note: this is a form of scaffolding)
Pw7.
Explanation of results of model-fit analysis
Pw8.
Record of hypotheses formulated and tested
Potential rubrics
Characteristic features
Cf1.
A model is proposed for a situation, and its suitability must be evaluated: Is it satisfactory for the purpose? Where and how might it prove inadequate?
Variable features
Vf1.
Is the model-to-be-evaluated given, or was it developed by the student in the course of an investigation?
Vf2.
Does the situation itself provide feedback about a model (e.g., as in interactive tasks such as troubleshooting)?
Vf3.
Is the model satisfactory or not satisfactory?
Vf4.
If the model is not satisfactory, in what way(s) is this so? (E.g., lack of fit to observations, inappropriateness to project goal, wrong grain size or aspects of phenomenon)
Vf5.
Is the problem context familiar (i.e., what degree of transfer is required)?
Vf6.
To what degree is the model evaluation prompted?
Vf7.
Complexity of problem situation
Vf8.
Complexity of the model (i.e., number of variables, complexity of variable relations, number of representations required, whether the model is runnable)
Vf9.
Is extraneous information present? (This makes tasks more difficult because it evokes the need to evaluate whether certain aspects of the situation should not be modeled.)
Vf10.
Group or individual work?
Vf11.
Degree of scaffolding provided
Narrative structure
Cause and effect. An event, phenomenon, or system is altered by internal or external factors.
Change over time. A sequence of events is presented to highlight sequential or cyclical change in a system.
General to Specific or Whole to Parts. A general topic is initially presented followed by the presentation of specific aspects of the general topic.
Investigation. A student or scientist completes an investigation in which one or more variables may be observed or manipulated and data are collected.
Specific to General and Parts to Whole. Specific characteristics of a phenomenon are presented, culminating in a description of the system or phenomenon as a whole.
Topic with Examples. A given topic is presented using various examples to highlight the topic.
National educational standards
State standards
State benchmarks
MCA III: 7.1.1.1.1. Understand that prior expectations can create bias when conducting scientific investigations. For example: Students often continue to think that air is not matter, even though they have contrary evidence from investigations.
MCA III: 7.1.1.2.1. Generate and refine a variety of scientific questions and match them with appropriate methods of investigation, such as field studies, controlled experiments, reviews of existing work and development of models.
MCA III: 7.1.1.2.3. Generate a scientific conclusion from an investigation, clearly distinguishing between results (evidence) and conclusions (explanation).
MCA III: 8.1.1.2.1. Use logical reasoning and imagination to develop descriptions, explanations, predictions and models based on evidence.
MCA III: 8.1.3.4.1. Use maps, satellite images and other data sets to describe patterns and make predictions about local and global systems in Earth science contexts. For example: Use data or satellite images to identify locations of earthquakes and volcanoes, ages of sea floor, ocean surface temperatures and ozone concentration in the stratosphere.
MCA III: 7.1.1.2.2. Plan and conduct a controlled experiment to test a hypothesis about a relationship between two variables, ensuring that one variable is systematically manipulated, the other is measured and recorded, and any other variables are kept the same (controlled). For example: The effect of various factors on the production of carbon dioxide by plants.
MCA III: 7.1.3.4.2. Determine and use appropriate safety procedures, tools, measurements, graphs and mathematical analyses to describe and investigate natural and designed systems in a life science context.
MCA III: 6.1.3.4.1. Determine and use appropriate safe procedures, tools, measurements, graphs and mathematical analyses to describe and investigate natural and designed systems in a physical science context.
MCA III: 7.1.3.4.1. Use maps, satellite images and other data sets to describe patterns and make predictions about natural systems in a life science context. For example: Use online data sets to compare wildlife populations or water quality in regions of Minnesota.
MCA III: 8.1.3.4.2. Determine and use appropriate safety procedures, tools, measurements, graphs and mathematical analyses to describe and investigate natural and designed systems in Earth and physical science contexts.
I am a kind of
These are kinds of me
These are parts of me
Templates
Exemplar tasks
Online resources
References
  R1.
Belsley, Kuh, & Welsch (1980)
  R2.
Cartier (2000)
  R3.
Mosteller & Tukey (1977)
I am a part of
Design Pattern for Model-Based Inquiry in Model-Based Reasoning (Design Pattern #2223)

 

Copyright 2002-2012 SRI International, University of Maryland, Regents of the University of California. Patent Pending.
For more information, see the PADI Web site.