Marynn Dause

My feedback

  1. 2 votes
    0 comments  ·  EssayTagger Feature Requests
    Marynn Dause shared this idea
  2. 2 votes
    0 comments  ·  EssayTagger Feature Requests
  3. 1 vote
    0 comments  ·  EssayTagger Feature Requests
  4. 1 vote
    0 comments  ·  EssayTagger Feature Requests

    Sort of.

    If you use Common Core-aligned rubrics, you can view progression charts across all CC-aligned assignments for the year for individual students and whole sections.

    You can also download the data into Excel, but running the pre/post comparison yourself in a spreadsheet is still quite involved.

    But we don’t currently support data comparisons between non-CC-aligned rubrics.

    The problem with non-CC-aligned rubrics is that I’ve built in complete flexibility—which is great for the instructor—but creates all sorts of problems if the code is asked to compare results.

    Imagine all the possible comparison challenges: this rubric element was removed, this element was added, this one was renamed, this one was split into two elements, etc.

    Even if 95% of all comparisons will use an identical rubric, the code has to account for ALL possible scenarios. And it’s those last 5% of cases that make the whole thing way more complicated than…
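    To make the difficulty concrete, here is a minimal, purely illustrative sketch of the comparison problem. It assumes a rubric is just a set of element names (EssayTagger's actual data model is not public here, so the function and structure are hypothetical). Even this naive name-matching version shows why renames and splits are ambiguous.

    ```python
    # Hypothetical sketch: naive comparison of two rubric versions by element name.
    # The data model and function names are illustrative, not EssayTagger's actual API.

    def diff_rubrics(old_elements, new_elements):
        """Classify rubric elements as kept, added, or removed between versions."""
        old, new = set(old_elements), set(new_elements)
        return {
            "kept": sorted(old & new),     # safe to compare scores directly
            "added": sorted(new - old),    # no earlier data to compare against
            "removed": sorted(old - new),  # no later data to compare against
        }

    # A rename ("Thesis" -> "Thesis Statement") looks like one removal plus one
    # addition; a split looks like one removal plus two additions. Telling those
    # apart from genuine adds/removes is exactly the ambiguity described above.
    result = diff_rubrics(
        ["Thesis", "Evidence", "Mechanics"],
        ["Thesis Statement", "Evidence", "Mechanics", "Organization"],
    )
    ```

    Handling the remaining cases would require either tracking element identity across edits or asking the instructor to map old elements to new ones by hand, which is where the real complexity lies.
    
    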

  5. 4 votes
    0 comments  ·  EssayTagger Feature Requests

    I’ve gone back and forth on this one. The Error Mark feature was designed to partially resolve this.

    As I say in the blog post about it (http://blog.essaytagger.com/2013/01/latest-update-error-marking.html): In most cases I recommend not entering a comment about the error. Instead, hold the student responsible for reviewing her errors and thinking through them herself to figure out and learn from her own mistakes. She can seek out help if she needs it, but we shouldn’t take on the responsibility of making corrections for the students when it really doesn’t do them any good.

    I’d much rather have students submit corrections to earn back some mechanics points rather than having me write endless “subject-verb agreement” or “you’re/your” comments that the students won’t even read.

    There’s also the possibility of using Kevin Brookhouser’s shorthand for his grmr.me resource:
    http://blog.essaytagger.com/2013/02/check-out-grmrme-and-stop-endlessly-re.html

    Regardless, I’ll follow the demands of EssayTagger’s instructors! If lots of…

  6. 4 votes
    0 comments  ·  EssayTagger Feature Requests

    I’m envisioning this as an addition to the “grading” tab of the Assignment Details page, which lists all of the essays submitted for an assignment. An additional column would show a checkmark once the student has viewed their graded results.

    I think this would be nice to have but I’m not sure it’s a crucial feature. Let me know!
