[LON-CAPA-cvs] cvs: modules /gerd/concept description.tex

www lon-capa-cvs@mail.lon-capa.org
Sat, 10 Jul 2004 20:16:28 -0000


This is a MIME encoded message

--www1089490588
Content-Type: text/plain

www		Sat Jul 10 16:16:28 2004 EDT

  Modified files:              
    /modules/gerd/concept	description.tex 
  Log:
  Talk about discussions ...
  
  
--www1089490588
Content-Type: text/plain
Content-Disposition: attachment; filename="www-20040710161628.txt"

Index: modules/gerd/concept/description.tex
diff -u modules/gerd/concept/description.tex:1.10 modules/gerd/concept/description.tex:1.11
--- modules/gerd/concept/description.tex:1.10	Sat Jul 10 11:10:53 2004
+++ modules/gerd/concept/description.tex	Sat Jul 10 16:16:28 2004
@@ -1,4 +1,4 @@
-\documentclass[11pt]{article}
+\documentclass[10pt]{article}
 
 \newif\ifpdf
 \ifx\pdfoutput\undefined
@@ -22,6 +22,7 @@
 \headheight = 0.0 in
 \headsep = 0.3 in
 
+
 \pagestyle{headings}
 
 \begin{document}
@@ -105,7 +106,7 @@
 Online homework systems, by the very nature of computers, lend themselves to standard calcu\-lation-oriented textbook problems, and are extensively used in this way. Yet, the "plug-and-chug" approach is the most prominent symptom of a novice-like problem-solving strategy, and calculation-oriented problems may encourage just that. As a result, there is a frequent call for "conceptual" online problems, where {\bf both instructors and students seem to define "conceptual" simply by the absence of numbers and formulas}. 
 \begin{itemize}
 \item But does "depriving" students of numbers and formulas indeed make them work on a conceptual level?
-\item Does it help both the "second tier" students who have problems with applying mathematical methods and those who comfortably "plug-and-chug" gain conceptual understanding of physics?
+\item Does it help both students who have problems with applying mathematical methods and those who comfortably "plug-and-chug" gain conceptual understanding of physics?
 \end{itemize}
 The following hypotheses reflect these notions in the positive form:
 \begin{quote}
@@ -115,11 +116,12 @@
 {\bf Hypothesis 3:} Learners with an average or above average level of mathematical skills or confidence will more likely develop a conceptual understanding of physics as a result of non-calculation-oriented online formative assessment, by discouraging non-expertlike problem-solving strategies.
 \end{quote}
 As evidenced in the above definition, it should be emphasized that the project by no means attempts to establish or promote a dichotomy between "conceptual understanding" and "basic skills/factual knowledge." {\it A physicist needs basic skills and factual knowledge, and their learning must not be underemphasized in formative assessment.} However, how best to develop these through formative assessment would constitute another valid research project.
-\subsection{Intellectual Merit}Much effort and many resources have been invested into developing effective curricular material and assessment, especially in the interactive or online realm, yet very little research has been done on the impact of different representations and question types on student conceptual understanding.The outcomes of this study will provide a broader research base for STEM curriculum development efforts regarding the most effective use of learner feedback. The outcomes will also inform development efforts for online course and learning content management systems, as well as provide input for educational metadata, content exchange, and interoperability standard efforts.\subsection{Broader Impact/Diversity}Attrition in STEM courses is a national problem and may be particularly acute in large, introductory courses that are the staples of many undergraduate science programs, with 40 to 60\% of students leaving STEM disciplines [Committee99, Seymour97]. Much of this attrition has been attributed to inadequate curriculum design, pedagogy and assessment.It is broadly accepted that frequent formative assessment and feedback are a key component of the learning process~\cite{bransford}. Shifting the focus from summative to formative assessment with feedback can move student motivation from an extrinsic reward to the intrinsic reward of developing understanding of the materials~\cite{stipek}. Intrinsic motivation and positive feedback promote the feelings of competence, confidence~\cite{clark,pascarella02}, and engagement that are crucial to retaining students in introductory STEM courses.  While improving student self-efficacy should have positive impacts on all student retention, Seymour and Hewitt [Seymour97] suggest that such changes should have a particularly strong impact on women and underrepresented groups who may feel that science excludes them.
-
-
+\subsection{Intellectual Merit}Much effort and many resources have been invested in developing effective curricular material and assessment, especially in the interactive or online realm, yet very little research has been done on the impact of different representations and question types on student conceptual understanding. The outcomes of this study will provide a broader research base for STEM curriculum development efforts regarding the most effective use of learner feedback. The outcomes will also inform development efforts for online course and learning content management systems, as well as provide input for educational metadata, content exchange, and interoperability standard efforts.
+\subsection{Broader Impact/Diversity}
+Currently, every semester approximately 350,000 students are taking introductory undergraduate physics courses similar to the ones under investigation in this project~\cite{aapt}.
+It is broadly accepted that frequent formative assessment and feedback are a key component of the learning process~\cite{bransford}. Shifting the focus from summative to formative assessment with feedback can move student motivation from an extrinsic reward to the intrinsic reward of developing understanding of the materials~\cite{stipek}. Intrinsic motivation and positive feedback promote the feelings of competence, confidence~\cite{clark,pascarella02}, and engagement that are crucial to retaining students in introductory STEM courses.  While improving student self-efficacy should have positive impacts on all student retention, Seymour and Hewitt~\cite{seymour} suggest that such changes should have a particularly strong impact on the attrition of women and underrepresented groups in science, who may often feel that science excludes them.
 
-They report that even women with good academic records nonetheless lost confidence in judging their academic performances as "good enough" if they did not receive feedback through personal teacher-student relationships Ð these relationships were reported to be rare, possibly due to the disproportionate number of male teaching staff in STEM courses. Lack of confidence, paired with a perceived gender or race bias, can negatively influence performance [Steele97].A gender-differential benefit of using online formative assessment systems (particularly LON-CAPA) has been found in the past at both Michigan State and Central Michigan University, and was recently studied in more detail through an NSF planning grant [NSFPGE03]. While the final analysis of the data gathered during the last semester is still not completed, additional evidence was found that such technology helps women to more quickly close the initial gender gap in performance in the sciences. In the science courses we propose to focus on, we traditionally have approximately 50-60\% female students, so we anticipate being able to gather additional statistically significant data. The traditionally relatively low enrollment of ethnic minorities in the courses under consideration might unfortunately not yield statistically significant findings with regards to the improvement of STEM education for underrepresented groups.
+
 
 \section{Background and Environment}
 \subsection{PI Education and Appointments}
@@ -130,7 +132,8 @@
 
 LON-CAPA's core development group is located at Michigan State University and, in addition to faculty members, has a staff of three full-time programmers, two user support staff, one technician, one graduate student, and one project coordinator. The LON-CAPA group also offers training and support for adopters of the system.
 
-LON-CAPA is open-source (GNU General Public License) freeware, there are no licensing costs associated. Both aspects are important for the success of a research project like the one proposed here. The open-source nature of the system allows researchers to modify and adapt the system in order to address the needs of their project, and the freeware character allows easier dissemination of results, in particular adaptation and implementation at other universities.\subsubsection{Shared Distributed Content Repository}LON-CAPA is designed around the concept of easy sharing and re-use of learning resources. The ITR research showed that this is only possible by integrating all layers of the infrastructure to give instructors a seamless user experience: pure library systems, such as NSDL, do not offer instructors the ability to in one system: locate existing content; generate and publish new content; enforce secure digital rights management; assemble (sequence) content; deploy the assembled content in a complete course management system; and have continual objective and subjective assessment of the learning content quality.In addition, the system has to be highly scalable, and avoid single points of failure.In LON-CAPA, the underlying distributed content repository spans all servers in a given cluster. Navigation through selected resources is provided by an internal sequencing tool, which allows assembling, re-using, and re-purposing content at different levels of granularity (pages, lessons, modules, chapters, etc). LON-CAPA provides highly customizable access control on resources, and has a built-in key mechanism to charge for content access. The shared content pool of LON-CAPA currently contains over 60,000 learning resources, including more than 18,000 personalized homework problems. Any content material contributed ot the pool is immediately available ready-to-use within the system at all participating sites, thus facilitating dissemination of curricular development efforts. Disciplines include astronomy, biology, business, chemistry, civil engineering, computer science, family and child ecology, geology, human food and nutrition, human medicine, mathematics, medical technology, physics, and psychology. A large fraction of these resources are also available through the gateway to the National Science Digital Library. In addition, the problem supplements to a number of commercial textbooks are available in LON-CAPA format.The network provides constant assessment of the resource quality through objective and subjective dynamic metadata. Selection of a learning resource by instructors at other institutions while constructing a learning module does both establish a de-facto peer-review mechanism and provide additional context information for each resource. In addition, access statistics are being kept, and learners can put evaluation information on each resources.\subsubsection{Formative and Summative Assessment Capabilities}LON-CAPA started in 1992 as a system to give personalized homework to students in introductory physics courses.  ÒPersonalized" means that each student sees a different version of the same computer-generated problem: different numbers, choices, graphs, images, simulation parameters, etc, Fig.~\ref{twoproblems}.
+LON-CAPA is open-source (GNU General Public License) freeware; there are no licensing costs associated with it. Both aspects are important for the success of a research project like the one proposed here. The open-source nature of the system allows researchers to modify and adapt the system in order to address the needs of their project, and the freeware character allows easier dissemination of results, in particular adaptation and implementation at other universities.\subsubsection{Shared Distributed Content Repository}LON-CAPA is designed around the concept of easy sharing and re-use of learning resources. 
+In LON-CAPA, the underlying distributed content repository spans all servers in a given cluster. Navigation through selected resources is provided by an internal sequencing tool, which allows assembling, re-using, and re-purposing content at different levels of granularity (pages, lessons, modules, chapters, etc.). LON-CAPA provides highly customizable access control on resources, and has a built-in key mechanism to charge for content access. The shared content pool of LON-CAPA currently contains over 60,000 learning resources, including more than 18,000 personalized homework problems. Any content material contributed to the pool is immediately available ready-to-use within the system at all participating sites, thus facilitating dissemination of curricular development efforts. Disciplines include astronomy, biology, business, chemistry, civil engineering, computer science, family and child ecology, geology, human food and nutrition, human medicine, mathematics, medical technology, physics, and psychology. A large fraction of these resources are also available through the gateway to the National Science Digital Library. In addition, the problem supplements to a number of commercial textbooks are available in LON-CAPA format. The network provides constant assessment of the resource quality through objective and subjective dynamic metadata. Selection of a learning resource by instructors at other institutions while constructing a learning module both establishes a de-facto peer-review mechanism and provides additional context information for each resource. In addition, access statistics are kept, and learners can attach evaluation information to each resource.\subsubsection{Formative and Summative Assessment Capabilities}LON-CAPA started in 1992 as a system to give personalized homework to students in introductory physics courses.  "Personalized" means that each student sees a different version of the same computer-generated problem: different numbers, choices, graphs, images, simulation parameters, etc.\ (Fig.~\ref{twoproblems}).
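+
+As a minimal illustration of the underlying idea (a hypothetical Python sketch, not LON-CAPA's actual server-side implementation), per-student versions can be derived from a deterministic seed, so that each student always receives the same individual variant of a problem:
+\begin{verbatim}
+import hashlib
+import random
+
+def personalized_parameters(student_id, problem_id):
+    """Derive a stable, per-student variant of a projectile problem.
+
+    Hypothetical sketch of seed-based personalization; names and
+    parameter ranges are made up for illustration.
+    """
+    # Deterministic seed: the same student/problem pair always yields
+    # the same numbers, but different students get different numbers.
+    digest = hashlib.sha256(("%s:%s" % (student_id, problem_id)).encode())
+    rng = random.Random(int(digest.hexdigest(), 16))
+    v0 = round(rng.uniform(15.0, 30.0), 1)       # launch speed in m/s
+    angle = rng.choice([30, 37, 45, 53, 60])     # launch angle in degrees
+    return {"v0_m_per_s": v0, "angle_deg": angle}
+
+print(personalized_parameters("studentA", "mech/projectile01"))
+print(personalized_parameters("studentB", "mech/projectile01"))
+\end{verbatim}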
 \begin{figure}
 \includegraphics[width=6.5in]{atwood}
 \caption{Web-rendering of the same LON-CAPA problem for two different students.\label{twoproblems}
@@ -142,15 +145,37 @@
 \item as a result, lively discussions take place, both online and in the helproom --- both of which will be analyzed in this project, see section~\ref{discussion}
 \end{itemize}
 
-Students are generally given immediate feedback on the correctness of their solutions, and in some cases additional help. They are usually granted multiple attempts to get a problem correct. This allows to follow a learner's thought process, both through statistical analysis (see~\ref{statistical}) and data-mining (see~\ref{datamining}) approaches.
+Students are generally given immediate feedback on the correctness of their solutions, and in some cases additional help. They are usually granted multiple attempts to get a problem correct. This allows us to follow a learner's thought process, both through statistical analysis (see~\ref{analysis}) and data-mining approaches.
 
 The system also allows for free-form essay-type answers, which are, however, graded by humans with the assistance of the system (keyword highlighting, plagiarism checks, etc.).\subsubsection{Course Management}Over the years, the system added a learning content management system and standard course management features, such as communications, gradebook, etc., which are comparable to commercial course management systems, such as BlackBoard, WebCT, or ANGEL. See 
-Refs.~\cite{features,edutools} for an overview of features, and comparisons to other systems.In addition to standard features, the LON-CAPA delivery and course management layer is designed around STEM education, for example: support for mathematical typesetting throughout (\LaTeX\ inside of XML) Ð formulas are rendered on-the-fly, and can be algorithmically modified through the use of variables inside formulas; integrated GNUplot support, such that graphs can be rendered on-the-fly, and allowing additional layered labeling of graphs and images; support for multi-dimensional symbolic math answers; and full support of physical units.\subsection{Collaborative Learning Laboratory}
+Refs.~\cite{features,edutools} for an overview of features, and comparisons to other systems. In addition to standard features, the LON-CAPA delivery and course management layer is designed around STEM education, for example: support for mathematical typesetting throughout (\LaTeX\ inside of XML), where formulas are rendered on-the-fly and can be algorithmically modified through the use of variables inside formulas; integrated GNUplot support, such that graphs can be rendered on-the-fly, and allowing additional layered labeling of graphs and images; support for multi-dimensional symbolic math answers; and full support of physical units.
+
+\begin{figure}
+\includegraphics[width=6.5in]{problemview}
+\caption{Example of an individual student view for problem analysis.\label{problemview}}
+\end{figure}
+\subsection{Collaborative Learning Laboratory}
 The Lyman-Briggs School of Science Collaborative Learning Laboratory is expected to be completed in 2005. It is modeled in part after a setup by the North Carolina State University Physics Education R\&D Group~\cite{ncsu}, and offers a space where students can collaborate on homework while their interactions and online transactions are recorded.
 
 In addition to having whiteboards and wireless laptop computers for students to work with in flexible group settings, the facility will have integrated observation equipment to video- and audio-record student interactions. All recorded information is immediately digitized and made available for transcription and analysis using the Transana~\cite{transana} software system.
+
+
+\subsubsection{Analysis Capabilities}\label{analysiscap}
+LON-CAPA allows instructors to analyze student submissions both for individual students (Fig.~\ref{problemview}) and across the course (Fig.~\ref{problemanalysis}).
+
+For example, Fig.~\ref{problemview} indicates that in the presence of a medium between the capacitor plates, the student was convinced that the force would increase, but also that this statement was the one he was most unsure about: his first answer was that the force would double; no additional feedback except "incorrect" was provided by the system. In his next attempt, he changed his answer on only this one statement (indicating that he was convinced of his other answers) to "four times the force" --- however, only ten seconds passed between the attempts, showing that he was merely guessing by which factor the force increased. The graphs on the left of Fig.~\ref{problemanalysis} show which statements were answered correctly on the first and on the second attempt, respectively; the graphs on the right show which other options the students chose if the statement was answered incorrectly. Clearly, students have the most difficulty with the concept of how a medium acts inside a capacitor, with the absolute majority believing the capacitance would increase, and only about 20\% of the students believing the medium had no influence.
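+
+The course-wide view can be thought of as simple aggregation over the stored submission records. The following Python sketch is purely illustrative (the data layout is hypothetical, not LON-CAPA's internal storage format); it computes, for each statement of a multiple-choice multiple-response problem, the fraction of students answering correctly on a given attempt:
+\begin{verbatim}
+from collections import defaultdict
+
+# Hypothetical records: (student, attempt number, {statement: chosen option})
+submissions = [
+    ("s1", 1, {"force": "doubles",    "capacitance": "increases"}),
+    ("s1", 2, {"force": "quadruples", "capacitance": "increases"}),
+    ("s2", 1, {"force": "increases",  "capacitance": "increases"}),
+]
+key = {"force": "increases", "capacitance": "increases"}  # assumed answer key
+
+def fraction_correct(submissions, key, attempt):
+    """Per-statement fraction of correct answers on the given attempt."""
+    right, total = defaultdict(int), defaultdict(int)
+    for student, att, answers in submissions:
+        if att != attempt:
+            continue
+        for statement, option in answers.items():
+            total[statement] += 1
+            if option == key[statement]:
+                right[statement] += 1
+    return {s: right[s] / float(total[s]) for s in total}
+
+print(fraction_correct(submissions, key, attempt=1))
+print(fraction_correct(submissions, key, attempt=2))
+\end{verbatim}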
+
+\begin{figure}
+\begin{center}
+\includegraphics[width=5in]{problemanalysis}
+\end{center}
+\caption{Example of a course-wide problem analysis for the problem Fig.~\ref{problemview}.\label{problemanalysis}}
+\end{figure}
+
 \subsection{Courses}
-The project will be carried out  LBS course sequence LBS 271/272 (calculus-based physics I \& II), which have an enrollment of over 200 students each semester. 
+The project will be carried out in the two-semester LBS course sequence LBS 271/272, "Calculus-Based Introductory Physics I/II." These second-year, three-credit courses have a calculus prerequisite and traditionally an enrollment of over 200 students. Two separate but associated one-credit laboratory courses (LBS 271L/272L) are required, which most but not all students choose to take simultaneously.
+
+Faculty and teaching assistants frequently assume shared responsibilities between the lecture and laboratory courses, with a combined staff of two faculty members and six undergraduate student assistants. The latter are responsible for particular recitation and laboratory sections, and will be involved in this research project (see Section~\ref{undergrad}). Within the duration of this project, the lecture and laboratory courses might be combined to provide greater coherence between these two venues.
 \subsection{Synergy between Project and Institutional Goals}
 The mission statement of the Lyman-Briggs School of Science includes the statement
 \begin{quote}
@@ -168,21 +193,21 @@
 \item[Multiple-choice and short-answer questions] The most basic and most easily computer-evaluated type of question, representing the typical back-of-chapter textbook problem.
 
 For the purposes of this project, "multiple choice" and "short-answer" will be considered as separate classes, where short-answer includes numerical answers such as "$17\,\mathrm{kg/m^3}$," and formula answers, such as "\verb!1/2*m*(vx^2+vy^2)!."  The problems on the left side of Figs.~\ref{threemasses} and \ref{trajectory} are examples of "short-(numerical)-answer" problems.
-\item[Multiple-choice multiple-response questions]  This type of problem requires a student to evaluate each statement and make a decision about it. The problem on the right side of Fig.~\ref{threemasses} is of this type.
+\item[Multiple-choice multiple-response questions]  This type of problem requires a student to evaluate each statement and make a decision about it. The problem in Fig.~\ref{problemview} and the problem on the right side of Fig.~\ref{threemasses} are of this type.
 
 
 \begin{figure}
 \includegraphics[width=6.5in]{threemassesjpg}
-\caption{Example of two LON-CAPA problems addressing the same concepts. The problem on the left is a traditional short-numerical-answer problem, while the problem is of type "multiple-choice multiple-response."\label{threemasses}}
+\caption{Example of two LON-CAPA problems addressing the same concepts. The problem on the left is a traditional short-numerical-answer problem, while the problem on the right is of type "multiple-choice multiple-response."\label{threemasses}}
 \end{figure}
 
-\item[Representation-translation questions] This type of problem requires a student to translate between different representations of the same situation, for example from a graphical to a numerical or textual representation. The answer might be given in different formats, for example in the problem on the right side of Fig.~\ref{trajectory}, it is a short-numerical-answer.
+\item[Representation-translation questions] This type of problem requires a student to translate between different representations of the same situation, for example from a graphical to a numerical or textual representation. The answer might be given in different formats, for example in the problem on the right side of Fig.~\ref{trajectory}, it is a short-numerical-answer. Translation between representations can be surprisingly challenging for physics learners~\cite{mcdermott,beichner}.
 
-For the purposes of this project, "representation-translation" will be considered a feature, which may apply or may not apply to any of the other problem types.
+For the purposes of this project, "representation-translation" will be considered a feature, which may or may not apply to any of the other problem types.
 
 \begin{figure}
 \includegraphics[width=6.5in]{trajectoryjpg}
-\caption{Example of two LON-CAPA problems addressing the same concepts in two different representations. The problem on the left is a traditional short-numerical-answer problem, while the problem is of type "multiple-choice multiple-response."\label{trajectory}}
+\caption{Example of two LON-CAPA problems addressing the same concepts in two different representations. The problem on the left is a traditional short-numerical-answer problem, while the problem on the right requires "representation-translation."\label{trajectory}}
 \end{figure}
 
 \item[Ranking-tasks] This type of problem requires a student to rank a number of statements, scenarios, or objects with respect to a certain feature. For example, a student might be asked to rank a number of projectiles in the order that they will hit the ground, or a number of points in order of the strength of their local electric potential.
@@ -206,31 +231,197 @@
 &\multicolumn{4}{|c|}{Multiple-choice and short-answer}&Multiple-choice multiple-response&Ranking&Esti\-mation&Quali\-tative&Essay\\
 &Multiple-choice&Tex\-tual&Nume\-rical&For\-mula&&&&&\\
 \hline
-"Traditional"&S&MS&S&MS&S&S&&&\\\hline
-Repre\-sentation-translation&S&MS&S&MS&S&S&&&\\\hline
-Context-based&MS&MS&MS&MS&MS&MS&PMS&PMS&PMS\\\hline
+"Traditional"&S&AMS&S&AMS&S&AS&&&\\\hline
+Repre\-sentation-translation&AS&AMS&AS&AMS&AS&AS&&&\\\hline
+Context-based&MS&AMS&MS&AMS&MS&AMS&GMS&GMS&GMS\\\hline
 \end{tabular}
-\caption{Classification scheme for question types, adapted from Redish~\cite{redish}, see section~\ref{class}. The symbols denote different components of the project, i.e., "P" - additional platform development efforts (section~\ref{platform}); "M" - additional materials development (section~\ref{matdev}); "S" - this question type will be included in the study of its impact (sections~\ref{hypo} and \ref{analysis}).\label{classification}}
+\caption{Classification scheme for question types, adapted from Redish~\cite{redish}, see section~\ref{class}. The symbols denote different components of the project, i.e., "A" - additional analysis tool development (section~\ref{analysisnew}); "G" - additional scalable grading tool  development (section~\ref{platform}); "M" - additional materials development (section~\ref{matdev}); "S" - this question type will be included in the study of its impact (sections~\ref{hypo} and \ref{analysis}).\label{classification}}
 \end{table}
 \section{Preliminary Project Components}
 Several aspects of the research project can be started in the first year, while others will require additional functionality in the LON-CAPA platform, and the preparation of additional homework problems of certain types.
 
-In Table~\ref{classification}, problem types which are marked "S" with no additional symbols have both sufficient functionality support in the LON-CAPA system, and a sufficient library of problems of this type to conduct the study. If a problem type is marked "M," the library of problems of this type is still small, and additional problems need to be developed. If a problem type is marked "P", it means that while the platform in its current version can support it, the solution does not scale well with the number of students due to the effort required for manual grading.
+In Table~\ref{classification}, problem types which are marked "S" with no additional symbols have both sufficient functionality support in the LON-CAPA system and a sufficient library of problems of this type to conduct the study. If a problem type is marked "M," the library of problems of this type is still small, and additional problems need to be developed. If a problem type is marked "A," it means that the study would profit from the development of additional analysis tools; if it is marked "G," it means that while the platform in its current version can support it, the solution does not scale well with the number of students due to the effort required for manual grading.
 \subsection{Additional Platform Development}
-\subsubsection{Scalable Functionality for Estimation and Qualitative Questions}\label{platform}
-\subsubsection{Additional Analysis Tools}
-While the premise of this project is that feedback on formative assessment is crucial for the learner, it is almost equally important to the instructor~\cite{pellegrino}, with technology as enabler~\cite{novak,feedback}.
+\subsubsection{Additional Analysis Tools}\label{analysisnew}
+While the premise of this project is that feedback on formative assessment is crucial for the learner, it is almost equally important to the instructor~\cite{pellegrino}, with technology as enabler~\cite{novak,feedback}. Particularly in the context of a research project on formative assessment, timely and comprehensive feedback on student performance --- including on new material (section~\ref{matdev}) --- is essential. The LON-CAPA system already has sophisticated analysis tools (see section~\ref{analysiscap}), but these do not support all question types in Table~\ref{classification} equally well, and the project includes a tools development component to further enhance these mechanisms for the problem types marked "A."
+
+Data collection on a particular problem type can proceed independently of the existence of the respective analysis tools, since LON-CAPA permanently stores all data.
+
+\subsubsection{Scalable Functionality for Manual Grading of Free-Form Answers}\label{platform}
+LON-CAPA already offers grading support for free-form student submissions, such as keyword highlighting and plagiarism checks. Additional tools will be developed for the grading of the problem types marked "G" in Table~\ref{classification}: for questions that require student submissions of the type "Explain your reasoning," better coupling between the computer- and manually-evaluated sections will be provided; for the free-form "essay" submissions, better tools to compare student submissions with each other and with exemplary essays.
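+
+As an illustration of the kind of comparison tool envisioned (a hypothetical Python sketch using plain cosine similarity over word counts; the production tools will be integrated into LON-CAPA's grading interface and may use different measures):
+\begin{verbatim}
+import math
+import re
+from collections import Counter
+
+def word_vector(text):
+    """Bag-of-words vector: lowercased word counts for one submission."""
+    return Counter(re.findall(r"[a-z']+", text.lower()))
+
+def cosine_similarity(a, b):
+    """Cosine similarity of two texts (0 = no shared words, 1 = identical counts)."""
+    va, vb = word_vector(a), word_vector(b)
+    dot = sum(va[w] * vb[w] for w in va)
+    norm = math.sqrt(sum(c * c for c in va.values())) \
+         * math.sqrt(sum(c * c for c in vb.values()))
+    return dot / norm if norm else 0.0
+
+exemplar = "Momentum is conserved because no external force acts on the system."
+submission = "Because no external force acts on the system, momentum is conserved."
+print(round(cosine_similarity(exemplar, submission), 2))
+\end{verbatim}
+Scores near one flag submissions that closely track an exemplary essay or another student's text; low scores indicate answers that merit a closer manual read.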
+
 \subsection{Additional Materials Development}\label{matdev}
 For the question types marked "M" in Table~\ref{classification}, the currently existing library of LON-CAPA problems does not provide enough samples to carry out the study. Within this project, new homework problems of this type will be developed. Since development of completely new problems would constitute a project by itself, this component of the current project will heavily draw on existing problem collections, i.e., Redish (\cite{redish}, resource CD), McDermott~\cite{mcdermottprob}, Mazur~\cite{mazur}, and Project Galileo~\cite{galileo}. These research-based problems will be adapted and implemented in the LON-CAPA system.
 \section{Research Methodology}\label{analysis}
-\subsection{Establishment of Initial Conditions}The validity of the hypotheses may depend on both learner and assessment characteristics.\subsubsection{Learner Attitudes, Beliefs, and Expectations}Instruments have been developed to assess epistemological beliefs, for example the Epistemological Beliefs Assessment for Physical Science (EBAPS) Instrument~\cite{EBAPS}. Related to epistemological beliefs are learnerÕs expectations and attitudes, and of particular interest is the Maryland Physics Expectations (MPEX) survey~\cite{MPEX}.\subsubsection{Learner Knowledge about the Topic}We will use existing concept inventory surveys as both pre- and post-tests.The qualitative Force Concept Inventory~\cite{fci} and the quantitative companion Mechanical Baseline Test~\cite{hestenesmech} have been used in a large number of studies connected to the teaching of introductory mechanics. The Foundation Coalition has been developing a number of relevant concept inventories~\cite{foundation}, namely the Thermodynamics Concept Inventory, the Dynamics Concept Inventory, and the Electromagnetics Concept Inventory (with two subcomponents, namely Waves and Fields).  Since these were designed from an engineering point of view, some adjustment might be necessary. In addition, the Conceptual Survey of Electricity and Magnetism (CSEM)~\cite{maloney} is available for the second semester course.
+\subsection{Establishment of Initial Conditions}The validity of the hypotheses may depend on both learner and assessment characteristics.\subsubsection{Learner Attitudes, Beliefs, and Expectations}Instruments have been developed to assess epistemological beliefs, for example the Epistemological Beliefs Assessment for Physical Science (EBAPS) Instrument~\cite{EBAPS}. Related to epistemological beliefs are learners' expectations and attitudes, and of particular interest is the Maryland Physics Expectations (MPEX) survey~\cite{MPEX}.\subsubsection{Learner Knowledge about the Topic}\label{prepost}We will use existing concept inventory surveys as both pre- and post-tests. The qualitative Force Concept Inventory~\cite{fci} and the quantitative companion Mechanics Baseline Test~\cite{hestenesmech} have been used in a large number of studies connected to the teaching of introductory mechanics. The Foundation Coalition has been developing a number of relevant concept inventories~\cite{foundation}, namely the Thermodynamics Concept Inventory, the Dynamics Concept Inventory, and the Electromagnetics Concept Inventory (with two subcomponents, namely Waves and Fields). Since these were designed from an engineering point of view, some adjustment might be necessary. In addition, the Conceptual Survey of Electricity and Magnetism (CSEM)~\cite{maloney} is available for the second-semester course.
+
+\subsubsection{Problem Difficulty and Baseline Statistical Data}LON-CAPA automatically keeps track of the average number of attempts until a problem is solved, as well as the degree of difficulty and the degree of discrimination. This data is cumulative across semesters and already exists for all assessment problems from their deployment in previous semesters.
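+
+For reference, one common convention for these statistics (standard item analysis; LON-CAPA's exact definitions may differ in detail) is
+\[
+\mbox{difficulty} \;=\; 1-\frac{N_{\mathrm{correct}}}{N_{\mathrm{submissions}}},\qquad
+\mbox{discrimination} \;=\; p_{\mathrm{upper}}-p_{\mathrm{lower}},
+\]
+where $p_{\mathrm{upper}}$ and $p_{\mathrm{lower}}$ denote the fractions of students in the upper and lower quartiles of overall course performance who eventually solve the problem.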
+\subsection{Observables}\subsubsection{Effectiveness}Effectiveness will be measured both in terms of performance on summative assessments (quizzes and exams) and on pre-/post-test concept inventory surveys (Section~\ref{prepost}). Each item on these instruments will be associated with topically corresponding formative online exercises to determine correlations and differential gain between the feedback types used with the respective online problems. A second post-test, correlated with first-semester problems, will be administered at the end of the second semester to determine long-term effects.\subsubsection{Problem Solving Technique}We intend to focus on a subset of students in the LBS Collaborative Learning Laboratory, and observe them while solving problems. Schoenfeld~\cite{schoenfeld} and Foster~\cite{foster} developed instruments to categorize and document the stages and expertlike
+characteristics~\cite{chi} of observed problem-solving activity by learners, as well as the application of metacognitive skills. In addition, we will interview a group of students from all courses regarding their problem-solving strategies. Pascarella~\cite{pascarella02} developed some frameworks for these interviews, which can be built upon. Finally, for all students in all courses, LON-CAPA log data will be analyzed. Kotas~\cite{kotas} and Minaei~\cite{minaei} developed mechanisms for this log data analysis, which include submission times between attempts and the quality of the entered input.\subsubsection{Help-Seeking Behavior}It is impossible to observe all on-demand help seeking, but interactions in several settings can be analyzed: online discussions and email communication are preserved within LON-CAPA and can be analyzed, even in retrospect for past semesters, with respect to relevant behavioral patterns. Table~\ref{discussiontable} shows excerpts of discussions around the two problems in Fig.~\ref{trajectory}.
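+
+As an illustration of the kind of features this log data analysis can extract, consider the following hypothetical Python sketch; the actual LON-CAPA transaction records are richer than shown here:
+\begin{verbatim}
+from datetime import datetime
+
+# Hypothetical transaction log: (student, problem, timestamp, outcome)
+log = [
+    ("s1", "p1", "2004-09-14 20:01:05", "incorrect"),
+    ("s1", "p1", "2004-09-14 20:01:15", "incorrect"),
+    ("s1", "p1", "2004-09-14 20:14:02", "correct"),
+]
+
+def attempt_features(log, student, problem):
+    """Number of attempts, gaps between attempts (s), and whether solved."""
+    events = sorted((datetime.strptime(t, "%Y-%m-%d %H:%M:%S"), outcome)
+                    for s, p, t, outcome in log if s == student and p == problem)
+    gaps = [(b[0] - a[0]).total_seconds() for a, b in zip(events, events[1:])]
+    return {"attempts": len(events), "gaps_seconds": gaps,
+            "solved": any(outcome == "correct" for _, outcome in events)}
+
+print(attempt_features(log, "s1", "p1"))
+\end{verbatim}
+A ten-second gap between consecutive incorrect attempts, for example, is a plausible indicator of guessing, while longer gaps suggest rework of the problem.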
 
-4.2.3 Assessment LevelOnline problems will be categorized according to BloomÕs taxonomy [Bloom56]. We will distinguished between ÒLevel 1Ó questions, which only require minimal interpretation of the presented content (corresponding to the Knowledge Level of BloomÕs taxonomy) and ÒLevel 2Ó questions (Comprehension and Application), which require learners to make connections between presented material and apply them to more complex scenarios.4.2.4 Assessment Difficulty and Baseline Statistical DataLON-CAPA automatically keeps tracks of the average number of attempts until a problem is solved, as well as the degree of difficulty and the degree of discrimination. This data is cumulative across semesters, and already exists for all assessment problems from their deployment in previous semesters.4.3 Observables4.3.1 EffectivenessEffectiveness will be measured both in terms of performance on summative assessments (quizzes and exams) and on pre-/post-test concept inventory surveys (section 4.2.2).  Each item on these instruments will be associated with topically corresponding formative online exercises to determine correlations and differential gain between the feedback types used with the respective online problems. A second posttest, correlated with first semester problems, will be administered at the end of the second semester to determine long-term effects.4.3.2 Problem Solving TechniqueWe intend to focus on a subset of students in the LBS Collaborative Learning Laboratory, and observe them while solving problems. Schoenfeld [Schoenfeld85] and Foster [Foster00] developed instruments to categorize and document the stages and expertlike characteristics [Chi81] of observed problem-solving activity by learners, as well as application of metacognitive skills.In addition, we will interview a group of students from all courses regarding their problem-solving strategies. Pascarella [Pascarella02] developed some frameworks for these interviews, which can be built upon.Finally, for all students in all courses, LON-CAPA log data will be analyzed. Kotas [Kotas00] and Minaei [Minaei03] developed a mechanism for this log data analysis, which include submission times between attempts, and quality of the entered input. 4.3.3 Help-Seeking BehaviorIt is impossible to observe all on-demand help seeking, but interactions in several settings can be analyzed:¥ Online discussions and email communication are preserved within LON-CAPA and can be analyzed even in retrospect for past semesters with respect to relevant behavioral patterns. Discussion contributions and states can be linked to online transactions, such as submission of homework attempts, browsing of content material, and hint usage. 
Wallace [Wallace03] reviewed existing research on such online interactions, however, some adaptation of several of the existing conceptualizations will be necessary to account for the nature of science and in particular physics and biology courses.¥ For the subset of LBS students who come to the LBS Collaborative Learning Laboratory, group discussions can to be documented, and linked to online behavior as absolute timing and learner identify are preserved.¥ Minaei [Minaei03] developed data mining strategies to categorize learner behavior, including navigational patterns between assessment and content material, the use of feedback, and communication functions.¥ Self-reporting can be used for several other help-seeking mechanisms, such as textbook use and peer-interaction ([Riffell03a, Riffell03b]).4.3.4 Cross-Cutting Open-Ended Documentation of Learner PerceptionsWe will interview focus groups of students from every course regarding their experiences and perceived relative helpfulness of the different feedback mechanisms. We will ask them to also reflect on how they perceived these mechanisms to influence their problem solving strategies. In the framework of the current NSF ITR grant [NSFITR00] an instrument was developed to gather learner input on perceived time-on-task and perceived helpfulness and functionality of different aspects of the system.4.4 Data Collection and Comparison GroupsFor the data collection, results would be most meaningful, if we divided the course into groups that experience the same assignment with different feedback mechanisms; indeed, this is what we propose to do in later phases of the study regarding the variation of solely the feedback character. However, when varying the number of available tries and the immediacy of the feedback, we will do so concurrently for all students across different problems Ð otherwise, it would be inevitable that students will complain about unfairness compared to the students with more attempts or immediate feedback. Also students with delayed feedback would most certainly confer with students in the other group.
+\begin{table}
+\tiny
+\begin{tabular}{p{3.2in}|p{3.2in}}
+{\bf Student A:}
+since your not given the initial velocity or the angle, 
+but you know the distance covered, couldnt the angle be 
+anything as long as the velocity is big enough?
+
+{\bf Student B:}
+The angle could be anything if there was no time given, but 
+since there is time given, only one path can be the right 
+one. 
+
+To solve this problem, you have to take apart the initial 
+shot (velocity) into its x- and y-components. Since you 
+know the horizontal distance and that air-resistance is 
+negligible, the horizontal acceleration is zero (horizontal 
+velocity is constant). Hence, you can use the \verb!x = x0 + v0*t + .5*a*t^2! equation to come up with the x-component of 
+the initial velocity. 
+
+Do the same thing for the y-component: use the equation \verb!y = y0 + v0*t - .5*g*t^2!
+
+Now you have both components of the initial velocity. Put 
+these components into a triangle (and use tangent) to get 
+the angle, and keep the triangle for the initial velocity 
+(hypotenuse). 
+
+For the third part, use the y-component of the initial 
+velocity in the equation \verb!v^2 - v0^2 = -2*g*(x - h)!, where v 
+is the y-component of the velocity at the tip of the arc 
+path (...therefore, equals z...), v0 is the y-component of 
+the initial velocity, x is the height to find, and h is the 
+initial height (a.k.a. x0; it's given).
+
+
+{\bf Student C:}
+How do we use \verb!y = y0 + v0*t - .5*g*t^2!, when we dont have 
+two of the variables (y and v0)? How do we use that 
+formula to get the vo in the y direction? (i.e. what 
+numbers and such do we use?) Thanks.
+
+
+{\bf Student D:}
+Ok. Someone tell me what I'm doing wrong. I figure since 
+they give you the distance traveled in the x direction, and 
+the time it was in the air, you should be able to get the x 
+component of velocity with distance/time. 
+
+Now for the y component. My logic was that at half the 
+total airtime, the object would be at the peak of the arc, 
+and thus would be moving at 0m/s, (being in transition from 
+going up to coming down). I tried solving for the initial y 
+velocity using this information and the Vfinal=Vinitial + 
+(A)(t)equation. But still no luck. Any pointers would be 
+greatly appreciated.
+
+{\bf Student E:}
+Hey EVERYBODY, whoever did this FORGOT to divide whatever 
+your total change in x is by 2 and use that as the 
+displacement in x to find the V nought x, just a little 
+heads up 
+
+i.e. i used 175 m/ 2 = 87.5 m (since the object launced 
+isnt an even parabolic function, its not all of the upside 
+down U shape on the graph) as my displacement for finding 
+V nought x
+
+{\bf Student F:}
+When using the equation \verb!y = y0 + v0*t - .5*g*t^2! 
+the v0 in the second term on the right side is really the 
+initial velocity in the y direction not the total initial 
+velocity. 
+In general the equation \verb!x = x0 + v0*t + .5*a*t^2! is always 
+a one demensional equation so when you use it in the y- 
+direction all variables are for the y-direction only 
+initial y, initial y velocity, and constant y acceleration
+
+&
+
+{\bf Student A:} What does the magnitude of the gravitational field mean?
+
+{\bf Student B:} i'm guessing acceleration
+
+{\bf Teaching Assistant:} That is correct. You need to calculate 'g' for Planet X.
+
+{\bf Student C:}
+How are you supposed to do this problem? I am confused, it 
+seems like we have learned nothing during lecture to help 
+us understand these problems, we never do any examples and 
+work thru problems in lecture. please help.
+
+{\bf Student D:} Yeah, I'm totally lost on this one and all I have to look 
+at in my notes are a bunch of variables in an equation, I 
+don't know where to plug in half of the numbers I have. 
+This problem and the catapult one I'm totally lost on cuz 
+all I have to go by are these equations with like 5-6 
+variables such as \verb!y=(tan[!$\ddot{y}$\verb!o]*x-(g*x^2/2*(vocos[!$\ddot{y}$\verb!o])^2)! and 
+then all the problem tells me is "you threw the rock at 
+22.8 m/s" or something... 
+
+I got all the other problems done easily, but this one and 
+the catapult one... I dunno, I just can't figure them out. 
+I worked on them for a while the other day and then got up 
+at like 8:30 today to work on them and still haven't 
+figured them out. :/
+
+{\bf Student E:} Here is a simple answer to the question, go to sample 
+problem 4-7 in your book and you'll get the answer. But 
+I'll be nice enough to help you out a little more. 
+
+1.) Lecture we talked about getting the tangent line in 
+order to find the angle, DO THIS!!! Print out the paper 
+and find the angle, IT'S THE ONLY WAY!!! 
+
+2.) Sort of kind of eye ball the total distance the 
+object traveled from start to finish. 
+
+3.) In sample problem 4-7 in the book they used the 
+Horizontal Range equation in order to find the answer, but 
+you have to adjust the problem to find Gravity or G. 
+
+Here is the ADJUSTED equation so all you have to do is 
+plug in the numbers that you got. 
+
+\verb!G = Intial speed * sin(2*your angle) / Total distance^2!
+
+Now the computer gives you some lead way due to the "eye- 
+balling" you have to do, but it gave me my answer and I 
+was 0.08 off. 
+
+Hope this helps you guy's.
+
+{\bf Student F:}
+what are the units used for this?
+
+{\bf Student G:}
+Gravity is acceleration, so the units should be \verb!m/s^2!.
+
+
+{\bf Student H:}
+Once you plot your points how does this determine your 
+angle?
+
+{\bf Student I:}
+I had to do 3 iterations of this problem before getting it 
+right. Assuming the math is done correctly there is not 
+much tolerance in this problem in regards to calculating 
+the launch angle (theta). When I was off by more than 3 
+degrees I got it wrong. Be VERY careful when drawing the 
+tangent.
+\end{tabular}
+\caption{Excerpts from online discussions around the two problems in Fig.~\ref{trajectory}.\label{discussiontable}}
+\end{table}
+
+Discussion contributions and states can be linked to online transactions, such as submission of homework attempts, browsing of content material, and hint usage. Wallace~\cite{wallace} reviewed existing research on such online interactions; however, some adaptation of several of the existing conceptualizations will be necessary to account for the nature of physics courses. For the subset of LBS students who come to the LBS Collaborative Learning Laboratory, group discussions can be documented and linked to online behavior, as absolute timing and learner identity are preserved. Minaei~\cite{minaei} developed data mining strategies to categorize learner behavior, including navigational patterns between assessment and content material, the use of feedback, and communication functions. Self-reporting can be used for several other help-seeking mechanisms, such as textbook use and peer interaction~\cite{riffell1,riffell2}.\subsubsection{Cross-Cutting Open-Ended Documentation of Learner Perceptions}We will interview focus groups of students regarding their experiences and perceived relative helpfulness of the different problem types, and ask them to also reflect on how they perceive these question types to influence their problem-solving strategies.
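+
+Linking discussion contributions to homework transactions, as described above, can be as simple as matching on student and timestamp proximity; a hypothetical sketch (the actual LON-CAPA records contain considerably more context):
+\begin{verbatim}
+from datetime import datetime, timedelta
+
+def parse(t):
+    return datetime.strptime(t, "%Y-%m-%d %H:%M:%S")
+
+# Hypothetical records: discussion posts and homework submissions
+posts = [("s1", "2004-09-14 20:05:00", "How do I find the x-component?")]
+submissions = [("s1", "2004-09-14 20:01:15", "incorrect"),
+               ("s1", "2004-09-14 21:30:00", "correct")]
+
+def link_posts_to_attempts(posts, submissions, window_minutes=30):
+    """Pair each post with submissions by the same student within the window."""
+    window = timedelta(minutes=window_minutes)
+    links = []
+    for student, ptime, text in posts:
+        nearby = [(stime, outcome) for s, stime, outcome in submissions
+                  if s == student and abs(parse(stime) - parse(ptime)) <= window]
+        links.append((student, text, nearby))
+    return links
+
+print(link_posts_to_attempts(posts, submissions))
+\end{verbatim}
+Patterns such as "post after repeated incorrect attempts, then a correct submission" can then be tabulated as help-seeking behaviors.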
+\subsection{Data Collection and Comparison Groups}For the data collection, results would be most meaningful if we divided the course into groups that experience the same assignment with different feedback mechanisms; indeed, this is what we propose to do in later phases of the study regarding the variation of solely the feedback character. However, when varying the number of available tries and the immediacy of the feedback, we will do so concurrently for all students across different problems; otherwise, it would be inevitable that students complain about unfairness compared to the students with more attempts or immediate feedback. Also, students with delayed feedback would most certainly confer with students in the other group.
+
+\section{Involvement of Undergraduate Students in Research}\label{undergrad}
 
 \section{Evaluation}
 The LON-CAPA Faculty Advisory Board was formed as part of our NSF ITR grant project. It consists of eight actively teaching faculty and administrators from a number of colleges on the MSU campus, and meets once every month to both evaluate and advise projects connected to LON-CAPA. We propose to continue using this existing structure to evaluate this project's progress and findings. In addition, Dr.~Kortemeyer's Mentoring Committee, which consists of senior faculty members from both LBS and DSME, will guide and advise the progress of this project.
 \section{Dissemination}
-We will present papers at conferences such as the LON-CAPA User Conference, Frontiers in Education, and the American Association of Physics Teachers Annual Meeting.  We will submit papers to journals such as The Physics Teacher, the American Journal of Physics, Computers and Education, and the Journal of Asynchronous Learning Networks.  Finally, any content material adapted and implemented in this project will be immediately available to all participant LON-CAPA institutions and via the LON-CAPA gateway to the NSF-funded National Science Digital Library.
+We will present papers at conferences such as the LON-CAPA User Conference, Frontiers in Education, and the American Association of Physics Teachers Annual Meeting.  We will submit papers to journals such as The Physics Teacher, the American Journal of Physics, Computers and Education, and the Journal of Asynchronous Learning Networks.  Finally, any content material adapted and implemented in this project will be immediately available to all participant LON-CAPA institutions, and via the LON-CAPA gateway to the NSF-funded National Science Digital Library.
 \section{Project Timeline}
 
 
@@ -272,8 +463,11 @@
 \bibitem{breitenberger} Ernst Breitenberger, {\it The mathematical knowledge of physics graduates: Primary data and conclusions}, Am. J. Phys. {\bf 60}(4), 318-323 (1992)
 \bibitem{clement} J. Clement, J. Lochhead, and G. S. Monk, {\it Translation difficulties in learning mathematics}, Amer. Math. Mon. {\bf 88}, 286 (1981)
 
+\bibitem{aapt} American Association of Physics Teachers, {\it Final Report: Project SPIN-UP (Strategic Programs for Innovations in Undergraduate Physics)}, available online (2003)
 \bibitem{stipek} D. J. Stipek, {\it Motivation and instruction}, in D.C. Berliner and R.C. Calfee (Eds.), {\it Handbook of educational psychology}, New York: Macmillan Library, 85-113 (1996)
 \bibitem{clark} K. Clark and F. Dwyer, {\it Effect of different types of computer-assisted feedback strategies on achievement and response confidence}, International Journal of Instructional Media {\bf 25}(1), 55-63 (1998)
 \bibitem{pascarella02} A. M. Pascarella, {\it CAPA (Computer-Assisted Personalized Assignments) in a Large University Setting}, Ph.D. (Physics) dissertation, University of Colorado (2002)
+\bibitem{seymour} E. Seymour and N. M. Hewitt, {\it Talk about leaving: Why undergraduates leave the sciences}, Boulder, CO: Westview Press (1997)
+
 
 % Background
 
@@ -283,6 +477,10 @@
 \bibitem{ncsu} North Carolina State University, Physics Education and Development Group, {\tt http://www.physics.ncsu.edu:8380/physics\_ed/Room\_Design\_files/frame.htm}
 \bibitem{transana} University of Wisconsin, Wisconsin Center for Education Research, {\tt http://www2.wcer.wisc.edu/Transana/}
 
+% cat
+\bibitem{mcdermott} Lillian McDermott, Mark L. Rosenquist, and Emily H. van Zee, {\it Student difficulties in connecting graphs and physics: Examples from kinematics}, Am. J. Phys. {\bf 55}(6), 503-513 (1987)
+\bibitem{beichner} Robert J. Beichner, {\it Testing student interpretation of kinematics graphs}, Am. J. Phys. {\bf 62}(8), 750-762 (1994)
+
 % Prelim
 
 \bibitem{novak} Gregor Novak, Andrew Gavrin, Wolfgang Christian, and Evelyn Patterson,
@@ -297,16 +495,12 @@
 \bibitem{hestenesmech} D. Hestenes and M. Wells, {\it A Mechanics Baseline Test}, Phys. Teach. {\bf 30}(3), 159-166 (1992)
 \bibitem{foundation} Foundation Coalition, {\it Key Components: Concept Inventories}, {\tt http://www.foundationcoalition.org/home/keycomponents/concept/index.html} (2003)
 \bibitem{maloney} D. P. Maloney, T. L. O$'$Kuma, C. J. Hieggelke, A. van Heuvelen, {\it Surveying students' conceptual knowledge of electricity and magnetism}, Am. J. Phys. Suppl. {\bf 69}, S12 (2001)
-
-% ---- UNUSED
-
-
-
-\bibitem{feynmanLectures} Richard Feynman, Robert B. Leighton, and Matthew L. Sands, {\it The Feynman Lectures on Physics}, Addison-Wesley, ISBN 0-201-5100(3,4,5)-(0,9,0) (1963-65)
-\bibitem{blatt} Frank J. Blatt, {\it Principles of Physics}, Allyn and Bacon, ISBN 0 205 11784 8
-
-
-
+\bibitem{schoenfeld} A. H. Schoenfeld, {\it Mathematical Problem Solving}, Academic Press (1985)
+\bibitem{foster} T. M. Foster, {\it The Development of Student's Problem-Solving Skill from Instruction Emphasizing Qualitative Problem-Solving}, dissertation, University of Minnesota (2000)
+\bibitem{minaei} Behrouz Minaei-Bidgoli and William F. Punch, {\it Using Genetic Algorithms for Data Mining Optimization in an Educational Web-based System}, Proceedings, Genetic and Evolutionary Computation Conference (2003)
+\bibitem{kotas} P. Kotas, {\it Homework Behavior in an Introductory Physics Course}, Master's Thesis (Physics), Central Michigan University (2000)
+\bibitem{wallace} Raven M. Wallace, {\it Online Learning in Higher Education: a review of research on interactions among teachers and students}, Education, Communication and Information,
+{\bf 3}(2), 241 (2003)
+\bibitem{riffell1} Samuel K. Riffell and Duncan F. Sibley, {\it Can hybrid course formats increase attendance in undergraduate environmental science courses?}, Journal of Natural Resources and Life Sciences Education, in press (2003)
+\bibitem{riffell2} Samuel K. Riffell and Duncan F. Sibley, {\it Student perceptions of a hybrid learning format: can online exercises replace traditional lectures?}, Journal of College Science Teaching, {\bf 32}(6), 394-399 (2003)
 \end{thebibliography}
 \end{document}
 \end
\ No newline at end of file

--www1089490588--