[LON-CAPA-cvs] cvs: modules /gerd/discussions/paper discussions.bib discussions.tex

www lon-capa-cvs@mail.lon-capa.org
Fri, 22 Apr 2005 17:20:42 -0000


www		Fri Apr 22 13:20:42 2005 EDT

  Modified files:              
    /modules/gerd/discussions/paper	discussions.bib discussions.tex 
  Log:
  Literature
  
  
Index: modules/gerd/discussions/paper/discussions.bib
diff -u modules/gerd/discussions/paper/discussions.bib:1.3 modules/gerd/discussions/paper/discussions.bib:1.4
--- modules/gerd/discussions/paper/discussions.bib:1.3	Sat Apr 16 11:40:11 2005
+++ modules/gerd/discussions/paper/discussions.bib	Fri Apr 22 13:20:42 2005
@@ -44,3 +44,21 @@
    title = "Individualized interactive exercises: a promising role for network technology"
 }
 
+@ARTICLE{steinberg,
+   author = "Richard N. Steinberg and Mel S. Sabella",
+   year = "1997",
+   journal = "Phys. Teach.",
+   volume = "35",
+   pages = "150-155",
+   title = "Performance on multiple-choice diagnostics and complementary exam problems"
+}
+
+
+@ARTICLE{lin,
+   author = "Herbert Lin",
+   year = "1982",
+   journal = "Phys. Teach.",
+   volume = "20",
+   pages = "151-157",
+   title = "Learning physics vs. passing courses"
+}
Index: modules/gerd/discussions/paper/discussions.tex
diff -u modules/gerd/discussions/paper/discussions.tex:1.13 modules/gerd/discussions/paper/discussions.tex:1.14
--- modules/gerd/discussions/paper/discussions.tex:1.13	Mon Apr 18 17:12:49 2005
+++ modules/gerd/discussions/paper/discussions.tex	Fri Apr 22 13:20:42 2005
@@ -86,13 +86,16 @@
 
 \subsection{\label{subsec:problemcat}Problem Classification}
 Kashy~\cite{kashyd01} showed that student mastery of different types of homework problems correlates differently with the student's performance on final exams --- 
-with multiple-choice non-numerical problems having the lowest correlation, and numerical/mathematical problems that require a translation of representation having the highest. For this project, we chose a finer-grained classification scheme of homework types: Redish~\cite{redish} identifies eight classes and features of exam and homework questions, 
+with multiple-choice non-numerical problems having the lowest correlation, and numerical/mathematical problems that require a translation of representation having the highest.
+Steinberg~\cite{steinberg} also analyzed student performance on multiple-choice diagnostics and open-ended exam problems, and found that while the two measures correlate in general, for certain students
+and certain questions the responses differ greatly. 
+For this project, we chose a finer-grained classification scheme of question types: Redish~\cite{redish} identifies eight classes and features of exam and homework questions, 
 an adapted version of which will be used:
 \begin{description}
 \item[Multiple-choice and short-answer questions] The most basic and most easily computer-evaluated type of question, representing the conventional (typical back-of-chapter textbook) problem.
 
 For the purposes of this project, ``multiple choice" and ``short-answer" will be considered as separate classes, where short-answer includes numerical answers such as ``$17 kg/m^3$," and formula answers, such as ``\verb!1/2*m*(vx^2+vy^2)!."  The problems on the left side of Figs.~\ref{threemasses} and \ref{trajectory} are examples of ``short-(numerical)-answer" problems.
-\item[Multiple-choice multiple-response questions]  This type of problem, a first step beyond conventional problems, requires a student to evaluate each statement and make a decision about it. The problems Fig.~\ref{problemview} and on the right side of Fig.~\ref{threemasses} are of this type.
+\item[Multiple-choice multiple-response questions]  This type of problem, a first step beyond conventional problems, requires a student to evaluate each statement and make a decision about it. The problem on the right side of Fig.~\ref{threemasses} is of this type.
 
 
 \begin{figure*}
@@ -382,7 +385,7 @@
 Multiple-choice problems that do not involve numbers are frequently called ``conceptual'' questions, but in this study it was found that they do not lead to conceptual discussions.
 
 It is a surprising result that the only significant difference between ``conventional'' and representation-translation problems is that students discuss slightly less procedure in favor of 
-more complaints, and that differences disappear when ``chat'' is excluded from the analysis. Kashy~\cite{kashyd01} on the other hand found that mastery of representation-translation problems 
+more complaints, and that differences disappear when ``chat'' is excluded from the analysis. McDermott~\cite{mcdermott} and Beichner~\cite{beichner}, on the other hand, found that students have unexpected difficulties translating, for example, data presented as graphs, so a stronger effect of this feature was expected. In addition, Kashy~\cite{kashyd01} found that mastery of representation-translation problems 
 is the best predictor of final exam scores, even when controlling for ACT, cumulative GPA, and force-concept inventory pretests.
 Discussion behavior and final exam performance are clearly different measures of the influence of problem types and do not necessarily need to correlate, but a connection between 
 individual discussion behavior and performance in the course clearly exists (see subsection~\ref{subsec:gradedep}).
@@ -401,13 +404,13 @@
 \end{itemize}  
 The last observation especially is discouraging, since as the students in the calculus-based course progressed further into their study of physics, the degree to which they were discussing concepts
 decreased. This might be due partly to the different subject matter (electricity and magnetism versus mechanics), but also to the lack of reward for conceptual considerations in solving standard
-homework problems. 
+homework problems~\cite{lin}. 
 \subsection{Qualitative Observations}
 Reading the online discussions associated with the homework provides the instructor with valuable insights that are hard to quantify.
 When assigning homework, instructors usually have an instructional goal in mind, for example, they would like the students to grapple with a certain concept or work through a specific strategy of problem 
 solving. Until the ``reality check,'' the fact that a specific problem serves this purpose only when approached with an expert mindset is under-appreciated. An even deeper misconception is the
 assumption that solving the problem correctly is a reliable indicator of the concept or problem solving strategy being successfully communicated. What the (expert) instructor had in mind, and what the
-(novice) learner actually does, is worlds apart. Students are going through reasoning processes and steps that are hardly imaginable to the instructor, and more often than not do several times more work
+(novice) learner actually does, are worlds apart~\cite{lin}. Students go through reasoning processes and steps that are hardly imaginable to the instructor, and more often than not do several times more work
 than necessary. Cases in which they get a problem right for the wrong reasons are rare, but instances in which they get the problem correct with the same (minimal) number of steps that an expert 
 would use are equally rare --- in the end, the concept that was meant to be communicated is lost.  
 \section{Conclusions}