[LON-CAPA-cvs] cvs: modules /gerd/correlpaper correlations.tex

www lon-capa-cvs@mail.lon-capa.org
Tue, 21 Nov 2006 22:04:37 -0000


This is a MIME encoded message

--www1164146677
Content-Type: text/plain

www		Tue Nov 21 17:04:37 2006 EDT

  Modified files:              
    /modules/gerd/correlpaper	correlations.tex 
  Log:
  Neverending paper, version 189.32.2
  
  
--www1164146677
Content-Type: text/plain
Content-Disposition: attachment; filename="www-20061121170437.txt"

Index: modules/gerd/correlpaper/correlations.tex
diff -u modules/gerd/correlpaper/correlations.tex:1.15 modules/gerd/correlpaper/correlations.tex:1.16
--- modules/gerd/correlpaper/correlations.tex:1.15	Sat Sep 30 19:45:46 2006
+++ modules/gerd/correlpaper/correlations.tex	Tue Nov 21 17:04:36 2006
@@ -63,7 +63,9 @@
                               %display desired
 \maketitle
 \section{\label{intro}Introduction}
-A traditional way of assessing student beliefs, attitudes, and expectations about physics is the deployment of surveys, for example the Maryland Physics Expectations Survey (MPEX)~\cite{mpex}, the Epistemological Beliefs Assessment for Physical Science (EBAPS)~\cite{ebaps}, or the Colorado Learning Attitudes about Science Survey (CLASS)~\cite{adams04}, or through structured interviews (see for example~\cite{hammer94,hogan99}). While these instruments take different approaches and have different philosophies behind their designs, they do have in common that the students need to react to statements outside of the normal course activity, and that they need to self-report their responses.
+In the study of student learning, epistemological beliefs are defined as views about how knowledge is constructed and evaluated. Typical dimensions along which these beliefs are characterized include independence in learning (taking responsibility versus relying on authority (instructors, books)), coherence (actively attempting to integrate new knowledge into a coherent framework versus seeing each piece of knowledge as a standalone entity), and emphasis on concepts (e.g., attempting to understand formulas versus memorizing them).
+
+A traditional way of assessing these beliefs and views is the deployment of surveys, for example the Maryland Physics Expectations Survey (MPEX)~\cite{mpex}, the Epistemological Beliefs Assessment for Physical Science (EBAPS)~\cite{ebaps}, or the Colorado Learning Attitudes about Science Survey (CLASS)~\cite{adams04}, or through structured interviews (see for example~\cite{hammer94,hogan99}). While these instruments take different approaches and have different philosophies behind their designs, they do have in common that the students need to react to statements outside of the normal course activity, and that they need to self-report their responses.
 
 The MPEX makes the limitations of this approach very explicit in its ``Product Warning Label''~\cite{mpexwarning}: ``students often think that they function in one fashion and actually behave differently. For the diagnosis of the difficulties of individual students more detailed observation is required.'' Online student discussions associated with online physics problems are different in that they are generated within the real context of the course, and students have a vested interest in making these discussions as productive as possible, given their understanding of how physics is done and their approach to it. They could thus be a ``reality check'' of students' beliefs, attitudes, and expectations. 
 
@@ -78,12 +80,16 @@
 \end{itemize}
 
 \section{\label{background}Background}
-Previous studies indicate that correlations between epistemological beliefs and academic performance exist, both directly and indirectly \cite{schommer93,may02}. The problem is how to measure these beliefs,  and techniques include surveys, guided interviews, and observations. 
-Research results regarding their predictive power of these instruments is not always conclusive: for example, Coletta and Philips~\cite{coletta05} found a strong correlation between the MPEX and FCI Gain, while Dancy~\cite{dancy02} found low correlations between the MPEX and the the performance on homework, tests, and final exams. The discrepancies might all be traced back to the ``Product Warning Lab''~\cite{mpexwarning}, that the survey is best used to gain insights into the beliefs of the class as a whole, rather than on an individual level.
+Previous studies indicate that correlations between epistemological beliefs and academic performance exist, both directly and indirectly. For example, Schommer~\cite{schommer93} found that belief in ``quick learning'' (characterized by seeking single answers, avoiding ambiguity, and relying on authority) negatively correlates with the GPA of secondary students, even after controlling for general intelligence. May~\cite{may02} found possible correlations between epistemological beliefs extracted from extensive lab reports and conceptual learning gain in introductory physics courses. For example, students who stated that they learned formulas (rather than investigated their conceptual implications), relied on authority, and made no efforts to interpret results were found to have lower gains on the Force Concept Inventory, Mechanics Baseline Test, and Conceptual Survey of Electricity and Magnetism.
+
+The problem is how to measure these beliefs, and techniques include surveys, guided interviews, and observations. While interviews and observations likely yield better data, the effort of conducting them also limits the scale at which they can be carried out. Surveys do not have this scalability problem, but
+research results regarding the predictive power of these instruments are not always conclusive: for example, Coletta and Phillips~\cite{coletta05} found a strong correlation between the MPEX and FCI gain, while Dancy~\cite{dancy02} found low correlations between the MPEX and performance on homework, tests, and final exams. It is unclear why these studies would come to such different results regarding the predictive power of the MPEX on an individual student level. Until more insights are gained, it remains a good idea to abide by the ``Product Warning Label''~\cite{mpexwarning} that the survey is best used to gain insights into the beliefs of the class as a whole.
 
 Online discussions take place within the regular course context and over its complete duration. They are a rich source of feedback to the instructor~\cite{kortemeyer05feedback}, and their quality and character were found to be correlated with the type and difficulty of the associated problems~\cite{kortemeyer05ana}, i.e., data exists regarding the influence of {\it problem} characteristics on associated discussions. Unfortunately, less data exists on the correlation between {\it student} characteristics and discussion behavior, because usually only very few student characteristics are known, with the exception of the students' overall performance in the course. Thus, one of the few findings is that certain discussion behavior, most prominently exhibited on ``non-sanctioned'' discussion sites external to the course, is negatively correlated with performance in the course~\cite{kashy03,kortemeyer05ana}.
 
-Few studies exist on the correlation between beliefs data gathered in research settings and actual discussion behavior in the course. For example, Hogan~\cite{hogan99} assessed eight graders' epistemological frameworks through interviews and then analyzed their discussion behavior in a science course with a particular focus on collaboration, finding a number of correlations.
+Few studies exist on the correlation between beliefs data gathered in research settings and actual discussion behavior in the course. For example, Hogan~\cite{hogan99} assessed eighth graders' epistemological frameworks through interviews and then analyzed their discussion behavior in a science course with a particular focus on collaboration, finding a number of correlations. In the interviews, students were asked to articulate views about themselves, about how they learn, and about the subject area. Students' views on learning correlated most strongly with their peer-discussion behavior; for example, students who exhibited a constructivist view of learning were also the most strongly engaged in peer discussions and collaborative knowledge building.
+
+In an earlier study~\cite{kortemeyer05ana}, the author used the analysis of online student discussions to identify characteristics of physics problems that elicit desirable problem solving strategies. The analysis of online student discussions is similar to observational techniques, yet it does not take place in an artificial research setting and, since it is self-documenting, does not suffer from the same scalability problems. 
 
 \section{\label{setting}Setting}
 The project was carried out in an introductory calculus-based physics course with initially 214 students. Most of the students in this course plan on pursuing a career in a medical field. The course had three traditional lectures per week. It did not use a textbook; instead, all course materials were available online. Topics were introductory mechanics, as well as sound and thermodynamics. There was twice-weekly online homework: one small set of reading problems due before the topic was dealt with in class (implementing JiTT~\cite{jitt}), and a larger set of traditional end-of-the-chapter style homework at the end of each topic. The online problems in the course were randomized using the LON-CAPA system, i.e., different students would receive different versions of the same problem (different graphs, numbers, images, options, formulas, etc.)~\cite{loncapa,kashyd01}. The students had weekly recitation sessions, and a traditional lab was offered in parallel. The course grade was determined from the students' performance on biweekly quizzes, the final exam, the recitation grades, and the homework performance.
@@ -244,7 +250,7 @@
 The most surprising result was that only 31\% of the students stated that they would be frustrated or very frustrated if they did not do well on the FCI, and only 30\% of the students stated the same for the MPEX. The FCI percentage in particular is smaller than expected, since the FCI is generally believed to be fairly robust in ungraded settings; see for example Henderson~\cite{henderson}, who found a difference of only 0.5 points between graded and ungraded administrations of the FCI. Also, the FCI is similar to the tests and exams used in the course, and students tend to base their relative value system regarding a subject area on the assessments used~\cite{lin}. 
 
 
-On the other hand, student discussions correlate more strongly with performance measures. Students are taking them seriously, likely because they are perceived as helpful and relevant. In the same post-course survey, 89\% of the students found the discussions either helpful or very helpful, and 73\% stated that they used the discussions to learn physics, as opposed to 35\% who said they often or very often just used the discussions to get the correct result as quickly as possible. Discussions appear to be an authentic reflection of what the students perceive as good problem solving strategy:  while an expert would characterize most postings as ``bad strategy,''  
+On the other hand, student discussions correlate more strongly with performance measures. Students are taking them seriously, likely because they are perceived as helpful and relevant. In the same post-course survey, 89\% of the students found the discussions either helpful or very helpful, and 73\% stated that they used the discussions to learn physics, as opposed to 35\% who said they often or very often just used the discussions to get the correct result as quickly as possible. Discussions appear to be an authentic reflection of what the students perceive as effective, not necessarily good, problem solving strategy:  while an expert would characterize most postings as ``bad strategy,''  
 only 17\% of the students admitted that, against their better judgment, they often used bad problem solving strategies to get the correct result as quickly as possible, and 48\% stated that they rarely or never did so (35\% were not sure). 
 
 
@@ -319,7 +325,7 @@
 \end{equation*}
 with an explained variance of 47.9\% of the Post FCI score. Both coefficients are significant; the solution-oriented discussion coefficient has $p=0.019$. Thus, controlling for pre-test FCI score, for each 10 percent increase in solution-oriented discussion, the predicted post-test FCI score goes down by 0.42 points. Students who make no solution-oriented contributions would on average gain 7.6 points on the 30-item FCI due to instruction, while at the other extreme, students who make only solution-oriented contributions would on average gain only 3.4 points -- less than half.
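The arithmetic behind the quoted gains can be made explicit; as a sketch (assuming the fitted slope is exactly $-0.042$ FCI points per percent of solution-oriented discussion, i.e., the stated $-0.42$ points per 10 percent, with $s$ the hypothetical percentage of solution-oriented contributions):

```latex
% Sketch: predicted FCI gain versus percentage s of solution-oriented
% discussion, assuming a slope of -0.042 points per percent
\begin{align*}
\text{gain}(s)   &\approx 7.6 - 0.042\,s,\\
\text{gain}(0)   &\approx 7.6\ \text{points},\\
\text{gain}(100) &\approx 7.6 - 4.2 = 3.4\ \text{points}.
\end{align*}
```

This is consistent with the figures quoted above: 3.4 points is less than half of 7.6.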
 \section{Conclusions}
-Online student discussions have very little correlation with MPEX outcomes, but appear to be a good reflection of students' individual beliefs regarding the nature of problem solving in physics. Students who exhibit more expert-like views and strategies have higher learning success, even when controlling for prior physics knowledge.
+Online student discussions have very little correlation with MPEX outcomes, but appear to be a good reflection of students' individual beliefs regarding the most effective strategies for problem solving in physics. Students who exhibit more expert-like views and strategies have higher learning success, even when controlling for prior physics knowledge.
 \begin{acknowledgments}
 Supported in part by the National Science Foundation under NSF-ITR 0085921 and NSF-CCLI-ASA 0243126. Any opinions, findings, and conclusions or recommendations expressed in this 
 publication are those of the author and do not necessarily reflect the views of the National Science Foundation. The author would like to thank the students in his course for their participation in this study, as well as Deborah Kashy from Michigan State University for assistance with the statistical analysis of the data, and Stephen Pellathy from the University of Pittsburgh for carrying out the interrater reliability study.

--www1164146677--