Usage - Graphical interface

This section describes a usage example with the graphical interface, from the design of the multiple-choice test to the export of the students' scores.

Creating a new project and subject

Let's open the graphical interface. This is ordinarily done by selecting Applications → Education → Auto Multiple Choice in the general menu of Gnome (or its equivalent in KDE or another desktop), but the command auto-multiple-choice can also be used directly.

Let's now create a new project with Project → New. A window opens, showing the names of existing projects (if any); to create a new project, we choose a name (made of simple characters; « test » will do for our short test), type it in the field New project, then push the Create a project button.

Now we must choose a LaTeX file as the source for the multiple-choice test. To do so, let's click on the button named LaTeX file. The current directory is the one containing the examples shipped with AMC (usually /usr/share/doc/auto-multiple-choice/exemples/): let's select simple.tex. To be able to modify it at will, we must copy it into the project's directory with the Import button. We can then edit it to modify the shape of the document or the questions, thanks to the Edit button, which launches the default editor.

Preparing the subject

Preparing a project is done in two steps. First, we must build the reference documents from the LaTeX source file, by clicking the Update button in the section Work documents. The following documents are produced:

  • The subject. This file must be printed to distribute its pages to students.

  • The correction. It lets us check that the chosen responses are the right ones. It can also be distributed to students.

  • The layout document. Framing marks, checkboxes, and the area where students write their names are drawn in color. The program will use this document to locate those elements precisely.

Once produced, those documents can be viewed (and possibly printed) by double-clicking their names.

Now we can begin the last step of the preparation: analyzing the layout document. It is launched with the button Calculate in the section Layout. This analysis detects, on every page of the subject, the exact position of every element which must be analyzed on the students' copies. It can take a while (mainly when the subject has many pages). We can see the result in the right-side list, which displays, for each page, its number, its identifier (a code containing the student number, the page number for this student, and a check code), and the date of the last update of the layout.

To check whether the layouts have been correctly detected, we can use the button Check the layouts. A quick look allows us to check that the red marks are correctly placed over the boxes of the subject.

Exam

When the preparation is over, we can print the subject and distribute it to the students... In simple cases, we can print directly from the viewer (after clicking the line Subject in the list of work documents). When it is better to print the copies separately (for example, when copies contain multiple pages and the printer can staple them together), we shall rather use the button Print copies, after calculating the layout.

[Important]Important

Once the subject is printed and distributed, we must no longer modify the work documents, because they must remain identical to the distributed copies.

Reading the copies

Now we shall describe the input of data from the students' copies, which can be done automatically and/or manually.

Let's move to the Input tab of the graphical interface.

Automated input

For automatic recognition of the checked boxes on the students' pages, they must first be digitized. I use a copier/scanner which does it automatically (a whole bundle of pages without any interaction), with the following settings: 300 dpi, OCR mode (i.e. black and white without greyscale - the scanner does not actually perform any character recognition), each scan delivered as a single TIFF file per page.

[Note]Note

To analyze the scans, we must have them in one or several image files (TIFF, JPG, PNG, etc.). Vector graphics formats (such as PDF, EPS, SVG) are not suitable.

Then we select this set of scan files in the dialog opened by the button Automated in the section Input of copies after exam, and validate with the button Use of this dialog. It's now time to drink a coffee, because the analysis of the copies can be quite long (and can proceed quietly, because it requires no interaction).

The result of the analysis of each page is indicated in the lists of the section Diagnosis:

  • The value MSD (mean square deviation) is an indicator of the quality of the alignment of the marks (the four black dots surrounding each copy). When it is too great, the alignment must be checked (right-click on the page's line, then choose page to view the scanned page and the boxes as they were detected).

  • The value sensitivity is an indicator of how close the darkness of the boxes is to the detection threshold. If it is too great (from 8 up to its maximal value 10), we must check whether the boxes recognized as checked are the right ones (right-click on the page's line, then choose zoom to view the set of boxes in the copy and verify whether the detection worked correctly).

Manual input

If the scanner cannot easily be used, or if, for a few copies, the automated input did not work as expected, we can do the input manually. To do so, let's open the corresponding window with the button Manual in the section Input of the copies after exam. In that window, we can report the checked boxes ourselves (by clicking them) on the desired pages.

[Note]Note

Every manual input overwrites any results coming from a previous or subsequent automated input for the same page.

Correction

In the Scoring tab of the graphical interface, the part Scoring allows us to compute the students' scores from the inputs, and also to read the codes written by the students (see the section called “Code acquisition”).

Process

The computation of the scores is launched with the button Correct, but we must first make the following choice:

  • If we check the box Update the scale, the scoring strategy will first be extracted from the LaTeX source file. This allows us to try several strategies at the end of the correction process. The method for specifying the strategy in the LaTeX file is explained in the section Scoring strategy (a default scoring strategy is used when no indication is given).

When we click the button Correct, the correction is made (this can take some time if we also asked for the scale to be read).

Scoring strategy

The strategy used to score the copies is indicated in the LaTeX source file, with the command scoring. It can be used in a question or questionmult environment, to set the strategy for every response, but also in the choices environment, to give scoring indications for a single response. The argument of the LaTeX command scoring is made of indications of the form parameter=value, separated by commas. The usable parameters are the following (depending on the parameter, it applies to simple and/or multiple questions, at the question or response level):

  • e — The score given when responses are incoherent: several boxes checked for a simple question, or, for a multiple question, the box "none of the responses is correct" checked while another box is also checked.

  • v — The score given in case of no response (no box is checked).

  • d — An offset, i.e. a value added to every score not covered by parameters e and v.

  • p — The bottom score. If the computation of the score for the question yields a value below the bottom value, the score is set to the bottom value.

  • b — The score for a good response to a question.

  • m — The score for a bad response to a question.

  • (no parameter name; syntax: \scoring{2}) — The score to give if the student has checked this response.

  • auto — With this parameter, the response numbered i gets the value auto+i-1. This option is mainly used with \QuestionIndicative (see section Questions and answers).

  • haut — When this parameter is given a value n, a perfect response is scored n, and one point is withdrawn for each error.

  • MAX — The maximal value given for the question (for a "question scored 5", one can write MAX=5). To be used only when it differs from the score obtained by giving every good response.

The default scale for a simple question is e=0,v=0,b=1,m=0, which gives one point for a good response and no point in the other cases. The default scale for a multiple question is e=0,v=0,b=1,m=0,p=-100,d=0, which gives one point for every correctly treated box (good box checked, or wrong box not checked).
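As an illustration of these defaults (the question name "capital" and its content are hypothetical), a simple question can tighten the default scale so that an incoherent answer, i.e. several boxes checked, costs one point:

\begin{question}{capital}\scoring{e=-1,v=0,b=1,m=0}
  [...]
\end{question}

Since v, b and m keep their default values here, only e actually needs to be written; the full list is shown for clarity.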

The LaTeX command \scoring can also be used outside question definitions, for whole examination parameters:

  • SUF=x gives a total number of points sufficient to get the maximal mark. For example, with 10 for the maximal mark and parameter SUF=8, a student getting a total of 6 points will get mark 6/8*10=7.5, whatever the value of the total number of points for a perfect answer sheet.
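As a sketch, reusing the value SUF=8 from the example above, such a global parameter would be written outside any question or questionmult definition:

\scoring{SUF=8}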

Using all of these parameters in combination allows to define many kinds of scoring strategies, as in the following example:

\documentclass{article}

\usepackage[utf8x]{inputenc}
\usepackage[T1]{fontenc}

\usepackage[bloc,completemulti]{automultiplechoice}

\begin{document}

\element{qqs}{
\begin{question}{good choice}
  How many points would you like for this question?
  \begin{choices}
    \correctchoice{Maximum: 10}\scoring{10}
    \wrongchoice{Only 5}\scoring{5}
    \wrongchoice{Two will be enough}\scoring{2}
    \wrongchoice{None, thanks}\scoring{0}
  \end{choices}
\end{question}
}

\element{qqs}{
\begin{questionmult}{added}
  Get free points checking the following boxes:
  \begin{choices}
    \correctchoice{2 points}\scoring{b=2}
    \wrongchoice{One negative point!}\scoring{b=0,m=-1}
    \correctchoice{3 points}\scoring{b=3}
    \correctchoice{1 point}
    \correctchoice{Half point}\scoring{b=0.5}
  \end{choices}
\end{questionmult}
}

\element{qqs}{
\begin{questionmult}{2 ok}\scoring{b=3,d=-9,p=0}
  Only a perfect response will be scored 3 points - otherwise, null score.
  \begin{choices}
    \wrongchoice{Wrong}
    \wrongchoice{Wrong}
    \correctchoice{Right}
    \correctchoice{Right}
  \end{choices}
\end{questionmult}
}

\element{qqs}{
\begin{questionmult}{all}\scoring{d=-3,p=0}
  A perfect response is scored 2 points, and one point is withdrawn for each error...
  \begin{choices}
    \correctchoice{Right}
    \correctchoice{This one is OK}
    \correctchoice{Yes!}
    \wrongchoice{False!}
    \wrongchoice{Don't check!}
  \end{choices}
\end{questionmult}
}

\element{qqs}{
\begin{question}{attention}\scoring{b=2}
  One very bad answer here yields a negative score (-2), while the correct answer is rewarded with 2 points.
  \begin{choices}
    \correctchoice{Good!}
    \wrongchoice{Not correct}
    \wrongchoice{Not correct}
    \wrongchoice{Not correct}
    \wrongchoice{Very bad answer!}\scoring{-2}
  \end{choices}
\end{question}
}

\element{qqs}{
\begin{questionmult}{as you like}
  Choose how many points you need:
  \begin{choices}
    \correctchoice{You take two points here}\scoring{b=2}
    \wrongchoice{Check to give 3 points}\scoring{b=0,m=3}
    \correctchoice{Get one if checked, but give one if not}\scoring{m=-1}
  \end{choices}
\end{questionmult}
}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\onecopy{20}{

\noindent{\bf QCM  \hfill Scoring strategy test}

\vspace*{.5cm}
\begin{minipage}{.4\linewidth}
\centering\large\bf Test\\ Jan. 2008\end{minipage}
\champnom{\fbox{\begin{minipage}{.5\linewidth}
Name:

\vspace*{.5cm}\dotfill
\vspace*{1mm}
\end{minipage}}}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\shufflegroup{qqs}

\insertgroup{qqs}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\clearpage

}

\end{document}

Global scoring strategy

To apply a strategy globally to a set of questions, one can define it in a LaTeX command, as in the following example:

\def\barQmult{haut=3,p=-1}

\begin{questionmult}\scoring{\barQmult}
[...]
\end{questionmult}

Another possibility is given by the LaTeX commands \scoringDefaultS and \scoringDefaultM, to be used at the beginning of the document (outside the command \onecopy), which set default values for the scoring strategy of simple and multiple questions:

\scoringDefaultM{haut=3,p=-1}

In some cases, it can be interesting to define a global strategy which depends on the number of proposed responses. To do so, just use the variable N. For example, to get a scale yielding 4 as the maximal score and such that the expected score of a student checking boxes at random is 1, one can use the scale d=4,b=0,m=-(4-1)*2/N (which gives the score -2 if every response is wrong, i.e. the wrong boxes have been checked and the right boxes have not). The operations allowed in those formulas are the four simple operations (+ - * /), the conditional operator ( ? : ) and parentheses.
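This scale can, for instance, be installed as the default for all multiple questions (using \scoringDefaultM here, rather than a per-question \scoring, is an illustrative choice):

\scoringDefaultM{d=4,b=0,m=-(4-1)*2/N}

The expectation can be checked by hand: when each of the N boxes is checked at random with probability 1/2, each box is wrongly treated with probability 1/2, so the expected score is d + (N/2)*m = 4 - (N/2)*(6/N) = 4 - 3 = 1, as announced.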

Other variables can also be used:

  • N is the number of proposed responses, not counting the response possibly added by the option completemulti.

  • NB is the number of correct responses to the question (without taking into account checked or non-checked boxes).

  • NBC is the count of correct responses which have been checked.

  • NM is the number of wrong responses to the question (without taking into account checked or non-checked boxes).

  • NMC is the count of wrong responses which have been checked.

  • IS is set to 1 if the question is simple and 0 if not.

  • IMULT is set to 1 if the question is multiple and 0 if not.
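As a sketch combining these variables with the scale parameters (the question name "normalized" and its content are hypothetical), a multiple question can be scored out of one point whatever the number of proposed responses, since each of the N correctly treated boxes then earns 1/N:

\begin{questionmult}{normalized}\scoring{b=1/N,m=0,p=0}
  [...]
\end{questionmult}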

Identification of the students

This stage is not mandatory. It consists in associating each copy with a student. The name of the student is not read automatically, but two reasonable possibilities are available:

  1. It is possible to ask students to identify themselves on their copy with their student number, written by checking one box per digit. A LaTeX command is available to use this method on the copy (see the section called “Code acquisition”). After the exam, the copies will be identified automatically, using a list matching the students' numbers with their names.

  2. With no input of the students' numbers, or when the automated identification has not succeeded perfectly (for example, when a student made a wrong input), the graphical interface allows an assisted manual association.

Let's first move to the Scoring tab of the graphical interface.

List of the students

We must first supply a list of students. This list can of course be reused for several multiple-choice tests. It is a text file containing one student per line. We may also add some complementary information to the file, as in the following example (here, the email column will not be used at all by AMC):

# STUDENTS / 1ST YEAR
surname:name:id:email
Bienvenüe:Alexis:001:paamc@passoire.fr
Boulix:Jojo:002:jojo.boulix@rien.xx
Noël:Père:003:pere.noel@pole-nord.xx

The lines of the file which begin with the character `#' are comments. The first of the other lines contains (separated by the character `:') the column titles. Then, with one line per student, we write the corresponding information. There must be at least one column named name or surname.

[Note]Note

The separator `:' can be replaced by a comma, a semicolon or a tabulation. However, the same separator must be used everywhere in the file containing the list of students. The separator is detected by taking the character (out of the four possible ones) which appears most frequently in the first line which is not a comment.

The prepared list of students (either a simple list or one with more information) is then selected with the button List of the students. We must also choose one of the columns as a unique key which will identify the students (generally the column containing the student number). Finally, to prepare an automated association, we must choose the name of the relevant code used in the LaTeX command AMCcode (if used).

Association

Automated association

When we push the button Automated association in the part Identification of the students, the matching of the codes given by the students begins. We can review or improve the result later with a (partial) manual association.

[Warning]Warning

To make an automated association, at least one AMCcode command is required in the LaTeX source file (see the section called “Code acquisition”), as well as a list of students with a column containing a reference (generally a student number) identical to the input given in the boxes produced by the command AMCcode.

Manual association

To open the window allowing the recognition of the students' names, let's click on the Manual association button in the part Identification of the students. This window is made of an upper part presenting, one after another, the images of the names written by the students, a lower part containing a button for each student from the list we supplied, and a right part for browsing easily through the copies to be identified. For each presented page, let's click the button matching the name written in the upper part (by default, only the copies which are not identified, or badly identified, are presented - this can be changed by checking the corresponding box). When every page has been processed, a blue background appears instead of the names, and we just need to click the Save button to finish the association.

Exporting the scores list

At this stage, we can export the list of scores in various formats (currently CSV and OpenOffice), with the button Export. The export is followed by the opening of the produced file in the appropriate software (if available).

Annotation

When we push the button Annotate the copies, the annotation of the copies begins: on every scan, the following annotations are made:

  • The boxes wrongly checked by the student will be circled in red;

  • the boxes which should have been checked but were not are ticked in red;

  • the boxes which were correctly checked are ticked in blue;

  • for each question, obtained and maximal scores are indicated;

  • the global score of the copy is indicated on the first page of the copy.

This operation is made for each page, producing annotated pages in JPG format.

If we wish to distribute the annotated/corrected copies to the students in electronic format, it is useful to make one PDF file per student from his/her annotated JPG pages. This is done by pushing the button Gather. The name of the PDF file containing the corrected copy of a student is based on the template indicated in the field Template of file name. In that template, every substring of the form « (col) » is replaced by the contents of the column named col in the file containing the list of students (see section List of the students). If we leave this field empty, a default value based on the student's name is used.