[LON-CAPA-cvs] cvs: modules /gerd/concept description.tex

www lon-capa-cvs@mail.lon-capa.org
Fri, 09 Jul 2004 02:26:17 -0000



www		Thu Jul  8 22:26:17 2004 EDT

  Modified files:              
    /modules/gerd/concept	description.tex 
  Log:
  Next round ...
  
  

Index: modules/gerd/concept/description.tex
diff -u modules/gerd/concept/description.tex:1.6 modules/gerd/concept/description.tex:1.7
--- modules/gerd/concept/description.tex:1.6	Thu Jul  8 11:36:03 2004
+++ modules/gerd/concept/description.tex	Thu Jul  8 22:26:16 2004
@@ -34,7 +34,7 @@
 \LARGE\sc Physics education:\\ Does "conceptual" online formative assessment lead to conceptual understanding?
 \end{center}
 
-\section{Introduction}
+\section{Goals and Objectives}\label{intro}
 \begin{quote}
 Mathematics is {\it not} just another language. Mathematics is a language plus reasoning; it is a language plus logic. Mathematics is a tool for reasoning. It is in fact a big collection of the results of some person's careful thought and reasoning. By mathematics it is possible to connect one statement to another.
 \begin{flushright}\sc Richard Feynman~\cite{feynmanCharacter}\end{flushright}
@@ -64,7 +64,7 @@
 Expert and novice approaches to problem solving in physics have been studied extensively (e.g.~\cite{chi,larkin}). Two of the most apparent differences are that
 \begin{enumerate}
 \item experts initially characterize problems according to deep structure and physical concepts (e.g., "energy conservation"-problem), while novices tend to characterize them according to surface features (e.g., "sliding-block-on-incline"-problem) or applicable formulas (e.g., "$E=\frac12mv^2+mgh$"-problem)
-\item novices then continue to employ a formula-centered problem solving method~\cite{heuvelen}, frequently refered to as "plug-and-chug."
+\item novices then continue to employ a formula-centered problem solving method~\cite{heuvelen}, frequently referred to as "plug-and-chug."
 \end{enumerate}
 Redish~\cite{redish} somewhat bleakly describes a novice approach to learning physics as follows:
 \begin{itemize}
@@ -75,33 +75,46 @@
 \item Erase all information from your brain after the exam to make room for the next set of materials.
 \end{itemize}
 
-One cannot really blame learners for shortcircuiting physics "learning" this way, since the cognitive and metacognitive skills, which physicists value so much higher than factual knowledge or formulas, are hardly ever made explicit, neither in instruction, nor in formative or summative assessment~\cite{lin,reif,mazur96}; in fact, they are mostly altogether "hidden"~\cite{redish} from all aspects of a course, and students are affirmed in their novice expectations~\cite{hammer} of what it is to "do physics."
+One cannot really blame learners for short-circuiting physics "learning" this way, since the cognitive and metacognitive skills, which physicists value so highly, are hardly ever made explicit, either in instruction or in formative or summative assessment~\cite{lin,reif,mazur96}; in fact, they are mostly altogether "hidden"~\cite{redish} from all aspects of a course, and students are affirmed in their novice expectations~\cite{hammer} of what it is to "do physics." To quote Lin~\cite{lin}: "The primary determinants of student performance are the specific tasks for which teachers explicitly hold student responsible (e.g. problem sets and exams), rather than the general goals of the teacher (e.g. conveying an appreciation of the power of physics in a broad context)."
+
 \subsection{The Problem with Problems}
-{\bf This project focusses on formative assessment in introductory physics education, and how formative assessment can be used to help learners re-evaluate their epistemologies and develop expert-like problem solving skills.} The challenge is to move students away from treating physics as a set of unrelated factoids and formulas, and away from focussing on memorizing and using formulas without interpretation or sense-making~\cite{hammer}.
+{\bf This project focusses on formative assessment in introductory physics education, and on how formative assessment can be used to help learners re-evaluate their epistemologies, develop expertlike problem-solving skills, and gain a conceptual understanding of physics.} The challenge is to move students away from treating physics as a set of unrelated factoids and formulas, as well as away from focussing on memorizing and using formulas without interpretation or sense-making~\cite{hammer}, and toward "thinking like a physicist."
+
+
+{\bf "Conceptual understanding" in this project is defined as insight,
+as reflected in thoughtful and effective use of knowledge and skills in varied situations,
+into abstract key ideas,
+which are generalized from particular instances.}
 
-Alternative formative assessment as a classroom tool, where students are forced to verbally  express their views and teach each other, rather than calculate answers~\cite{mazur}, is starting to be accepted as an effective teaching practice in more and more courses. Formative assessment outside the classroom, on the other hand, is frequently limited by logistics, particularly in large enrollment courses, where timely feedback is often impossible without the use of computerized systems (e.g.~\cite{thoennessen,kashy00}). {\bf This project will focus on online homework as formative assessment tool.} As a model system, the project will use the Learning{\it Online} Network with CAPA (LON-CAPA), described in section~\ref{loncapa}.
+Alternative formative assessment as a classroom tool, where students are forced to express their views verbally and teach each other, rather than calculate answers~\cite{mazur}, is starting to be adopted as an effective teaching practice in more and more courses. Formative assessment outside the classroom, on the other hand, is frequently limited by logistics, particularly in large-enrollment courses, where timely feedback is often impossible without the use of computerized systems (e.g.~\cite{thoennessen,kashy00}). {\bf This project will focus on online homework as a formative assessment tool.} As a model system, the project will use the Learning{\it Online} Network with CAPA (LON-CAPA), described in section~\ref{loncapa}.
 
 The assumption of this project regarding "the problem with problems" (a phrase borrowed from~\cite{mazur96}) is that
 \begin{quote}
-Hypothesis 1a: Standard calculation-oriented textbook problems affirm non-expertlike epistemologies and encourage non-expertlike problem-solving strategies
+{\bf Hypothesis 1a:} Standard calculation-oriented textbook problems affirm non-expertlike epistemologies and encourage non-expertlike problem-solving strategies.
 \end{quote}
 and that, conversely --- within the limitations of online computer-evaluated problems ---
 \begin{quote}
-Hypothesis 1b: There are classes of online formative assessment computer-evaluated problems which make learners confront their non-expertlike epistemologies and encourage expert-like problem-solving strategies\end{quote}
-
-Online homework systems by the very nature of computers lend themselves to standard calculation-oriented textbook problems, and are extensively used in this way. Yet, the "plug-and-chug" approach is the most prominent symptom of novice-like problem-solving strategy, and calculation-oriented problems may encourage just that. As a result, there is a frequent call for "conceptual" online problems, where both instructors and students seem to define "conceptual" simply by the absence of numbers and formulas. But does depriving students of numbers and formulas indeed make them work on a conceptual level?
-
-Hewitt in the preface to his textbook "Conceptual Physics"~\cite{hewitt} argues that the mathematical language of physics often deters the average nonscience students, a notion which concurs with Tobias' concept of "math phobia"[CITE]. For many students, though, the opposite appears to be true, their "concept phobia" appears to be more prominent than any "math phobia."
+{\bf Hypothesis 1b:} There are classes of computer-evaluated online formative assessment problems which make learners confront their non-expertlike epistemologies and encourage expertlike problem-solving strategies.\end{quote}
 
+Hewitt in the preface to his textbook "Conceptual Physics"~\cite{hewitt} argues that the mathematical language of physics often deters the average non-science student, a notion which concurs with Tobias' concept of "math phobia"~\cite{tobias}, a particular issue for students in the "second tier"~\cite{tobiasST} of science courses. For these students, the use of mathematics in physics courses can present a hurdle, and a lack of the skills or confidence needed to perform basic algebraic manipulations ("$V=RI\ \Rightarrow\ R=V/I$"), or even trouble operating a pocket calculator, can hinder learning progress in physics at a very basic level.
 
+Yet, the majority of students appear to be perfectly able to substitute variables and execute calculations correctly, and seem quite content with the "plug-and-chug" approach. In fact, for them, "concept phobia" appears to be more prominent than any "math phobia."
 
+Moving beyond these initial barriers, the problem with mathematics as part of formative assessment appears not to be one of {\it operation}, but one of {\it translation}. Students see formulas in a purely operational sense~\cite{torigoe,breitenberger}, while lacking the ability to translate between the formulas and the situations they describe~\cite{clement}, as is also illustrated in the expert and novice quotes at the beginning of section~\ref{intro}.
 
-
-\subsection{Transfer between mathematical formulations and physical situations}
-Being able to plug-and-chug and perform basic algebraic manipulations ("$V=RI\ \Rightarrow\ R=V/I"$) has little do with the understanding of mathematics, it is a skill. Students who are lacking this skill are kept from advancing in physics on a very basic level, and remedial instruction is in order. Other students simply have problems operating their pocket calculators.
-
-Moving beyond these initial barriers, students are generally able to correctly substitute variables and execute calculations, but see formulas in a purely operational sense~\cite{torigoe}, while lacking the abilitity to translate between the formulas and the situations~\cite{clement}. 
-
+Online homework systems, by the very nature of computers, lend themselves to standard calcu\-lation-oriented textbook problems, and are extensively used in this way. Yet, the "plug-and-chug" approach is the most prominent symptom of a novice-like problem-solving strategy, and calculation-oriented problems may encourage just that. As a result, there is a frequent call for "conceptual" online problems, where {\bf both instructors and students seem to define "conceptual" simply by the absence of numbers and formulas}.
+\begin{itemize}
+\item But does "depriving" students of numbers and formulas indeed make them work on a conceptual level?
+\item Does it help both the "second tier" students who have problems applying mathematical methods and those who comfortably "plug-and-chug" gain a conceptual understanding of physics?
+\end{itemize}
+The following hypotheses reflect these notions in the positive form:
+\begin{quote}
+{\bf Hypothesis 2:} Learners with a low level of mathematical skills or confidence are more likely to develop a conceptual understanding of physics as a result of non-calculation-oriented online formative assessment, which removes mathematics as a barrier to their understanding.
+\end{quote}
+\begin{quote}
+{\bf Hypothesis 3:} Learners with an average or above-average level of mathematical skills or confidence are more likely to develop a conceptual understanding of physics as a result of non-calculation-oriented online formative assessment, which discourages non-expertlike problem-solving strategies.
+\end{quote}
+As evidenced by the above definition, it should be emphasized that the project by no means attempts to establish or promote a dichotomy between "conceptual understanding" and "basic skills/factual knowledge." {\it A physicist needs basic skills and factual knowledge, and their learning must not be underemphasized in formative assessment.} However, how best to develop these through formative assessment would constitute another valid research project.
 \subsection{Intellectual Merit}
 Much effort and many resources have been invested in developing effective curricular material and assessment, especially in the interactive or online realm, yet very little research has been done on the impact of different representations and question types on student conceptual understanding. The outcomes of this study will provide a broader research base for STEM curriculum development efforts regarding the most effective use of learner feedback. The outcomes will also inform development efforts for online course and learning content management systems, as well as provide input for educational metadata, content exchange, and interoperability standard efforts.
 \subsection{Broader Impact/Diversity}
 Attrition in STEM courses is a national problem and may be particularly acute in the large introductory courses that are the staples of many undergraduate science programs, with 40 to 60\% of students leaving STEM disciplines [Committee99, Seymour97]. Much of this attrition has been attributed to inadequate curriculum design, pedagogy, and assessment. It is broadly accepted that frequent formative assessment and feedback are a key component of the learning process~\cite{bransford}. Shifting the focus from summative to formative assessment with feedback can move student motivation from an extrinsic reward to the intrinsic reward of developing understanding of the materials~\cite{stipek}. Intrinsic motivation and positive feedback promote the feelings of competence, confidence~\cite{clark,pascarella02}, and engagement that are crucial to retaining students in introductory STEM courses. While improving student self-efficacy should have positive impacts on all student retention, Seymour and Hewitt [Seymour97] suggest that such changes should have a particularly strong impact on women and underrepresented groups who may feel that science excludes them.
 
 
@@ -158,7 +171,7 @@
 \section{Research Methodology}
 \subsection{Establishment of Initial Conditions}
 The validity of the hypotheses may depend on both learner and assessment characteristics.
 \subsubsection{Learner Attitudes, Beliefs, and Expectations}
 Instruments have been developed to assess epistemological beliefs, for example the Epistemological Beliefs Assessment for Physical Science (EBAPS) instrument~\cite{EBAPS}. Related to epistemological beliefs are learners' expectations and attitudes, and of particular interest is the Maryland Physics Expectations (MPEX) survey~\cite{MPEX}.
 \subsubsection{Learner Knowledge about the Topic}
 We will use existing concept inventory surveys as both pre- and post-tests. The qualitative Force Concept Inventory~\cite{fci} and the quantitative companion Mechanics Baseline Test~\cite{hestenesmech} have been used in a large number of studies connected to the teaching of introductory mechanics. The Foundation Coalition has been developing a number of relevant concept inventories~\cite{foundation}, namely the Thermodynamics Concept Inventory, the Dynamics Concept Inventory, and the Electromagnetics Concept Inventory (with two subcomponents, namely Waves and Fields). Since these were designed from an engineering point of view, some adjustment might be necessary. In addition, the Conceptual Survey of Electricity and Magnetism (CSEM)~\cite{maloney} is available for the second-semester course.
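As an illustrative sketch only (the specific gain measure is an assumption here, not stated above), pre-/post-test results on such concept inventories are commonly compared via the normalized gain
\begin{equation}
\langle g\rangle=\frac{\langle S_{\rm post}\rangle-\langle S_{\rm pre}\rangle}{100\%-\langle S_{\rm pre}\rangle},
\end{equation}
where $\langle S_{\rm pre}\rangle$ and $\langle S_{\rm post}\rangle$ are the class-average pre- and post-test scores in percent.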
 
-4.2.3 Assessment LevelOnline problems will be categorized according to BloomÕs taxonomy [Bloom56]. We will distinguished between ÒLevel 1Ó questions, which only require minimal interpretation of the presented content (corresponding to the Knowledge Level of BloomÕs taxonomy) and ÒLevel 2Ó questions (Comprehension and Application), which require learners to make connections between presented material and apply them to more complex scenarios.4.2.4 Assessment Difficulty and Baseline Statistical DataLON-CAPA automatically keeps tracks of the average number of attempts until a problem is solved, as well as the degree of difficulty and the degree of discrimination. This data is cumulative across semesters, and already exists for all assessment problems from their deployment in previous semesters.4.3 Observables4.3.1 EffectivenessEffectiveness will be measured both in terms of performance on summative assessments (quizzes and exams) and on pre-/post-test concept inventory surveys (section 4.2.2).  Each item on these instruments will be associated with topically corresponding formative online exercises to determine correlations and differential gain between the feedback types used with the respective online problems. A second posttest, correlated with first semester problems, will be administered at the end of the second semester to determine long-term effects.4.3.2 Problem Solving TechniqueWe intend to focus on a subset of students in the LBS Collaborative Learning Laboratory, and observe them while solving problems. Schoenfeld [Schoenfeld85] and Foster [Foster00] developed instruments to categorize and document the stages and expert-like characteristics [Chi81] of observed problem-solving activity by learners, as well as application of metacognitive skills.In addition, we will interview a group of students from all courses regarding their problem-solving strategies. Pascarella [Pascarella02] developed some frameworks for these interviews, which can be built upon.Finally, for all students in all courses, LON-CAPA log data will be analyzed. Kotas [Kotas00] and Minaei [Minaei03] developed a mechanism for this log data analysis, which include submission times between attempts, and quality of the entered input. 4.3.3 Help-Seeking BehaviorIt is impossible to observe all on-demand help seeking, but interactions in several settings can be analyzed:¥ Online discussions and email communication are preserved within LON-CAPA and can be analyzed even in retrospect for past semesters with respect to relevant behavioral patterns. Discussion contributions and states can be linked to online transactions, such as submission of homework attempts, browsing of content material, and hint usage. 
Wallace [Wallace03] reviewed existing research on such online interactions, however, some adaptation of several of the existing conceptualizations will be necessary to account for the nature of science and in particular physics and biology courses.¥ For the subset of LBS students who come to the LBS Collaborative Learning Laboratory, group discussions can to be documented, and linked to online behavior as absolute timing and learner identify are preserved.¥ Minaei [Minaei03] developed data mining strategies to categorize learner behavior, including navigational patterns between assessment and content material, the use of feedback, and communication functions.¥ Self-reporting can be used for several other help-seeking mechanisms, such as textbook use and peer-interaction ([Riffell03a, Riffell03b]).4.3.4 Cross-Cutting Open-Ended Documentation of Learner PerceptionsWe will interview focus groups of students from every course regarding their experiences and perceived relative helpfulness of the different feedback mechanisms. We will ask them to also reflect on how they perceived these mechanisms to influence their problem solving strategies. In the framework of the current NSF ITR grant [NSFITR00] an instrument was developed to gather learner input on perceived time-on-task and perceived helpfulness and functionality of different aspects of the system.4.4 Data Collection and Comparison GroupsFor the data collection, results would be most meaningful, if we divided the course into groups that experience the same assignment with different feedback mechanisms; indeed, this is what we propose to do in later phases of the study regarding the variation of solely the feedback character. However, when varying the number of available tries and the immediacy of the feedback, we will do so concurrently for all students across different problems Ð otherwise, it would be inevitable that students will complain about unfairness compared to the students with more attempts or immediate feedback. Also students with delayed feedback would most certainly confer with students in the other group.
+\subsubsection{Assessment Level}
+Online problems will be categorized according to Bloom's taxonomy [Bloom56]. We will distinguish between "Level 1" questions, which only require minimal interpretation of the presented content (corresponding to the Knowledge level of Bloom's taxonomy), and "Level 2" questions (Comprehension and Application), which require learners to make connections between presented materials and apply them to more complex scenarios.
+\subsubsection{Assessment Difficulty and Baseline Statistical Data}
+LON-CAPA automatically keeps track of the average number of attempts until a problem is solved, as well as the degree of difficulty and the degree of discrimination. This data is cumulative across semesters, and already exists for all assessment problems from their deployment in previous semesters.
+\subsection{Observables}
+\subsubsection{Effectiveness}
+Effectiveness will be measured both in terms of performance on summative assessments (quizzes and exams) and on pre-/post-test concept inventory surveys (section 4.2.2). Each item on these instruments will be associated with topically corresponding formative online exercises to determine correlations and differential gain between the feedback types used with the respective online problems. A second post-test, correlated with first-semester problems, will be administered at the end of the second semester to determine long-term effects.
+\subsubsection{Problem-Solving Technique}
+We intend to focus on a subset of students in the LBS Collaborative Learning Laboratory, and observe them while solving problems. Schoenfeld [Schoenfeld85] and Foster [Foster00] developed instruments to categorize and document the stages and expertlike characteristics [Chi81] of observed problem-solving activity by learners, as well as the application of metacognitive skills. In addition, we will interview a group of students from all courses regarding their problem-solving strategies. Pascarella [Pascarella02] developed some frameworks for these interviews, which can be built upon. Finally, for all students in all courses, LON-CAPA log data will be analyzed. Kotas [Kotas00] and Minaei [Minaei03] developed a mechanism for this log data analysis, which includes submission times between attempts and the quality of the entered input.
+\subsubsection{Help-Seeking Behavior}
+It is impossible to observe all on-demand help seeking, but interactions in several settings can be analyzed:
+\begin{itemize}
+\item Online discussions and email communication are preserved within LON-CAPA and can be analyzed, even in retrospect for past semesters, with respect to relevant behavioral patterns. Discussion contributions and states can be linked to online transactions, such as submission of homework attempts, browsing of content material, and hint usage. Wallace [Wallace03] reviewed existing research on such online interactions; however, some adaptation of several of the existing conceptualizations will be necessary to account for the nature of science courses, in particular physics and biology courses.
+\item For the subset of LBS students who come to the LBS Collaborative Learning Laboratory, group discussions can be documented and linked to online behavior, as absolute timing and learner identity are preserved.
+\item Minaei [Minaei03] developed data mining strategies to categorize learner behavior, including navigational patterns between assessment and content material, the use of feedback, and communication functions.
+\item Self-reporting can be used for several other help-seeking mechanisms, such as textbook use and peer interaction [Riffell03a, Riffell03b].
+\end{itemize}
+\subsubsection{Cross-Cutting Open-Ended Documentation of Learner Perceptions}
+We will interview focus groups of students from every course regarding their experiences and the perceived relative helpfulness of the different feedback mechanisms. We will also ask them to reflect on how they perceived these mechanisms to influence their problem-solving strategies. In the framework of the current NSF ITR grant [NSFITR00], an instrument was developed to gather learner input on perceived time-on-task and perceived helpfulness and functionality of different aspects of the system.
+\subsection{Data Collection and Comparison Groups}
+For the data collection, results would be most meaningful if we divided the course into groups that experience the same assignment with different feedback mechanisms; indeed, this is what we propose to do in later phases of the study regarding the variation of solely the feedback character. However, when varying the number of available tries and the immediacy of the feedback, we will do so concurrently for all students across different problems; otherwise, it would be inevitable that students complain about unfairness compared to the students with more attempts or immediate feedback. Also, students with delayed feedback would most certainly confer with students in the other group.
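As an illustrative sketch (standard item-analysis definitions are assumed here; the exact formulas LON-CAPA computes are not spelled out above), the degree of difficulty of an item $i$ can be expressed through the fraction of students who eventually solve it,
\begin{equation}
p_i=\frac{n_i^{\rm solved}}{n_i^{\rm attempted}},
\end{equation}
and the degree of discrimination through the difference of this fraction between the top and bottom quarters of the class, ranked by overall score,
\begin{equation}
D_i=p_i^{\rm top}-p_i^{\rm bottom},
\end{equation}
so that values of $D_i$ close to $1$ indicate an item that separates strong from weak performers, while values near $0$ flag an item that does not.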
 
 \section{Evaluation}
 The LON-CAPA Faculty Advisory Board was formed as part of our NSF ITR grant project. It consists of eight actively teaching faculty and administrators from a number of colleges on the campus of MSU, and meets once every month to both evaluate and advise projects connected to LON-CAPA. We propose to continue using this existing structure to evaluate this project's progress and findings. In addition, Dr.~Kortemeyer's Mentoring Committee, which consists of senior faculty members from both LBS and DSME, will guide and advise the progress of this project.
@@ -199,6 +212,12 @@
 \bibitem{kashy00} E. Kashy, M. Thoennessen, Y. Tsai, N. E. Davis, and G. Albertelli II, {\it Melding Network Technology with Traditional Teaching: Enhanced Achievement in a 500-Student Course}, Chapter in {\it Interactive Learning: Vignettes from America's Most Wired Campuses}, 
 David G. Brown (editor), Anker Publishing Company, Boston, 51, ISBN 1-882982-29-0 (2000)
 \bibitem{hewitt} Paul G. Hewitt, {\it Conceptual Physics}, Little, Brown, ISBN 0-673-39541-3
+\bibitem{tobias} Sheila Tobias, {\it Overcoming Math Anxiety}, W. W. Norton \& Company, revised edition, ISBN 0-393-31307-7 (1995)
+\bibitem{tobiasST} Sheila Tobias, {\it They're Not Dumb, They're Different: Stalking the Second Tier}, Research Corporation (1990)
+\bibitem{torigoe} Eugene Torigoe, {\it Student Difficulties with Equations in Physics}, ISAAPT Spring Meeting, Urbana, IL, (April 2004)
+\bibitem{breitenberger} Ernst Breitenberger, {\it The mathematical knowledge of physics graduates: Primary data and conclusions}, Am. J. Phys. {\bf 60}(4), 318-323 (1992)
+\bibitem{clement} J. Clement, J. Lochhead, and G. S. Monk, {\it Translation difficulties in learning mathematics}, Amer. Math. Mon. {\bf 88}, 286 (1981)
+
 \bibitem{stipek} D. J. Stipek, {\it Motivation and instruction}, in D.C. Berliner and R.C. Calfee (Eds.), {\it Handbook of educational psychology}, New York: Macmillan Library, 85-113 (1996)
 \bibitem{clark} K. Clark and F. Dwyer, {\it Effect of different types of computer-assisted feedback strategies on achievement and response confidence}, International Journal of Instructional Media {\bf 25}(1), 55-63 (1998)
 \bibitem{pascarella02} A. M. Pascarella, {\it CAPA (Computer-Assisted Personalized Assignments) in a Large University Setting}, Ph.D. (Physics) dissertation, University of Colorado (2002)
 
@@ -219,8 +238,7 @@
 
 % ---- UNUSED
 
-\bibitem{torigoe} Eugene Torigoe, {\it Student Difficulties with Equations in Physics}, ISAAPT Spring Meeting, Urbana, IL, (April 2004)
-\bibitem{clement} J. Clement, J. Lochhead, and G. S. Monk, {\it Translation difficulties in learning mathematics}, Amer. Math. Mon. {\bf 88}, 286 (1981)
+
 
 \bibitem{feynmanLectures} Richard Feynman, Robert B. Leighton, and Matthew L. Sands, {\it The Feynman Lectures on Physics}, Addison-Wesley, ISBN 0-201-5100(3,4,5)-(0,9,0) (1963-65)
 \bibitem{blatt} Frank J. Blatt, {\it Principles of Physics}, Allyn and Bacon, ISBN 0 205 11784 8
