Essaytagger Buy Direct

The only other limitation is that iOS devices (iPad, iPhone) do not support Flash and therefore the grading app will not function on iOS.

An internet connection is required.

How do I get my students' papers into the system?

Teacher batch uploads: The most basic method is to have teachers upload the essays themselves. This places the burden on the teacher, but makes sense for those teachers who have a "hand-in" folder on the school network or who receive their assignments via email.

Students upload to an assignment link: No logins are required. Each assignment is given a unique five-character upload code (e.g. "WE4T2"). Give this code to your students and the site will guide them from there. This has the added advantage of building your class roster for you (see this video for details).

Other options

We have a number of other possibilities for getting essays into the system that have not yet been implemented. We will listen to our users to help us prioritize which ones to work on first. Those options are:

Students have their own logins: This is how all of the online learning environments work (e.g. Blackboard, moodle, Sakai). Students log in and then submit their assignment. We're able to identify the essay by student and by section. The downside is, of course, that the students will have yet another login to remember. Perhaps if the usernames and passwords are very simple (e.g. first.lastname and studentID) this will be a little less frustrating.

Moodle integration: Link your account to your school's moodle server so that we can copy the submitted essays from moodle and import them for you. This is one of our preferred methods, but obviously only works for the teachers that have access to and use moodle.

Sakai integration: Same as moodle integration, but a lower priority.

Dropbox integration: You are using Dropbox, right?! If not, read why you should! With Dropbox integration you'd be able to place all of your students' essays in a particular Dropbox subdirectory and then link your account so that we can copy the files for you.

Google Docs integration: You would have your students share their documents with you and then, once you link your account, we would copy the documents and import them for you. The downside of this is that Google Docs is not very good at document organization. It quickly becomes messy and confusing if you have multiple sections and multiple preps sharing documents with you.

How do students view their graded papers?

When you hit "Mark essay as Graded" in the grading app we generate a marked-up version of the graded essay with all of the comments incorporated into the text. 
If you opted to "Enable student email support" for your course, the site can email the graded papers to each student.
The graded papers can also be viewed directly on the web, or you can print them out. 
The marked-up version also includes a completed rubric grid that reflects the student's performance on the assignment.

This is really just the beginning. One of the more powerful possibilities within EssayTagger is that the graded essays don't have to be the end of the process; they can be the launching point for the next phase in the student's education.

Future features:

  • Student interaction with comments: have the kids view their graded essays in our system and have them click on each comment and then select "I agree" or "I disagree or don't understand". That could then trigger a discussion (in person or through the system) about a specific comment you made on the paper. Note: We would love to go forward with this feature, but we need to know that our users will find it useful before doing so. It would take a fair amount of work to implement, but we think it would be well-worth it.

With EssayTagger's core platform in place, it's time to turn our attention to the incredibly rich data that is generated when you grade your essays in our system.

UPDATE 11/3:
We've already updated the charts quite a bit and have updated this post to reflect the changes!

UPDATE 11/29:
Even more improvements and two new charts! Post updated again.

UPDATE 11/30:
You can now download your grading data to Excel!

We've reached the first milestone of our major push to enhance and extend the data reporting features of the site. Today's release opens the first new data reports on a beta test basis. "Beta" in programmer lingo means it's not yet finalized, but is mostly where it needs to be. There will likely be further refinements based on instructors' feedback as well as minor bugs to be fixed.

Quick highlights
  • "Section snapshot" overall section-wide aggregate performance graph
  • "Section details" chart of all students' performance on each rubric element
  • "Individual details" in-depth view of a particular student's performance on the assignment
  • Statistically significant outlier identification to help you focus on the students who are furthest from the pack

All of these data reports are amazingly useful tools for teachers, but I'm particularly excited about the statistical analysis we're able to provide. You don't have to know the first thing about stats, standard deviation, or z-values; we're computing everything for you and flagging the kids that need your attention the most!

You grade, we crunch the numbers. How awesome is that?!

(See the demo video here.)

"Section snapshot" overall results
This is the new default view; you'll be routed here automatically when you click "exit grading app" when you're done grading. It's the broadest view of the data and includes two charts. The goal is to provide a rough "snapshot" look at how your class section performed as a whole on the essays graded thus far:

The stacked column graph displays how many of your students fell into which quality levels when you evaluated their essays in the grading app.

Put simply: the more green, the better.

The second chart takes the same data but presents it in a slightly different manner:

Now the rubric elements are sorted from best performance to worst performance so you can quickly home in on the areas that need the most work.

The average rating for each rubric element is reported in the far-right column. The percentages within the grid make it easier to see what proportion of your students fell into each quality level for each rubric element.

Where did that average score come from?

The system automatically scales your evaluations based on the number of quality levels you specified when creating your rubric.
  • 3 quality levels: 1.0 - 3.0
  • 4 quality levels: 1.0 - 4.0
  • 5 quality levels: 1.0 - 5.0

Note: Common Core-aligned rubrics are always restricted to 5 pre-configured quality levels.

The numeric value is listed under each quality level (e.g. "Proficient" equates to the 4.00 range).

We can then take each of these numeric values and perform aggregate calculations, such as determining a class-wide average, as well as more advanced statistical analysis (more on this below).
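As a rough illustration, the scaling and averaging described above can be sketched in a few lines of Python (the function name and sample scores are invented for illustration, not EssayTagger's actual implementation):

```python
def section_average(level_choices):
    """Average the numeric values for one rubric element across a section.

    Each quality level maps directly to its 1-based numeric value, so a
    5-level rubric yields scores in the 1.0 - 5.0 range.
    """
    scores = [float(level) for level in level_choices]
    return sum(scores) / len(scores)

# Five students on a 5-level rubric; "Proficient" (level 4) contributes 4.0.
print(section_average([4, 5, 3, 4, 2]))  # 3.6
```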

"Section details" individual results
This view drills down to the per-student level and gives you a color-coded view of each student's performance on each rubric element:

The color coding allows you to see who is struggling on which rubric elements. Each column of the grid is sortable; just click on the column header to swap between ascending and descending order. This allows you to see, for example, who is struggling the most on "Introduction" or who is excelling at "Conclusion." The table scrolls horizontally so all the rubric elements can fit.

Note: If an element is evaluated more than once in an essay (e.g. "Textual Evidence / Inferences" in the chart above), the evaluations are averaged together to create a single score for the student for that rubric element.
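That averaging rule is simple to sketch (illustrative code only; the data structure and element names are assumptions, not EssayTagger's internals):

```python
from collections import defaultdict

def collapse_repeats(evaluations):
    """Average repeated evaluations of the same rubric element into a
    single score per element for one student."""
    buckets = defaultdict(list)
    for element, score in evaluations:
        buckets[element].append(score)
    return {el: sum(vals) / len(vals) for el, vals in buckets.items()}

# Two evaluations of "Textual Evidence / Inferences" average to one score.
evals = [("Thesis", 4.0),
         ("Textual Evidence / Inferences", 3.0),
         ("Textual Evidence / Inferences", 5.0)]
print(collapse_repeats(evals))
# {'Thesis': 4.0, 'Textual Evidence / Inferences': 4.0}
```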

Outlier analysis

The down arrows indicate scores that are statistically significant outliers; these students are significantly under-performing relative to their peers. This is where our ability to run the statistics does the work for you. Sure, we'll always have students that are struggling, but now you'll know exactly which students stand out as statistical outliers on specific rubric elements.

And the results may be surprising. For example, in the chart above Bart Connor is struggling on "Overall Organization" (indicated in red), but he's actually further behind his classmates on "Textual Evidence / Inferences" and "Transitions/Links" based on the statistical analysis. This doesn't necessarily mean you shouldn't work on "Overall Organization" with him, but the outlier analysis reveals a significant weakness in his skill set that you might have otherwise missed.
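The post doesn't spell out the exact test, but a standard way to flag low outliers like this is a z-score check against the section mean. A minimal sketch, assuming a z-score approach and an arbitrary cutoff (both assumptions; the names and sample data are made up):

```python
import statistics

def flag_low_outliers(scores, z_cutoff=-1.5):
    """Return students whose score on a rubric element falls far below
    the section mean, measured in standard deviations.

    The -1.5 cutoff is an assumption for illustration; the actual
    threshold EssayTagger uses isn't stated in the post.
    """
    mean = statistics.mean(scores.values())
    stdev = statistics.pstdev(scores.values())
    if stdev == 0:
        return []  # everyone scored the same; no outliers possible
    return [name for name, s in scores.items()
            if (s - mean) / stdev <= z_cutoff]

section = {"Bart": 1.0, "Lisa": 4.0, "Milhouse": 3.5,
           "Nelson": 3.8, "Martin": 4.2}
print(flag_low_outliers(section))  # ['Bart']
```

The teacher never sees any of this math, of course; only the flag itself surfaces in the chart.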

"Individual details" full breakdown
Clicking on a student's name will bring up a more detailed view of that student's performance on the assignment:

Each of the evaluations you made in the student's essay appear here as checkmarks in the rubric grid. The multiple checkmarks in "Textual Evidence / Inferences" indicate that the instructor made multiple evaluations of the student's evidence in his essay.

Performance relative to peers
The student's numeric value for each rubric element is reported (this is the same as what was displayed in the previous chart). But this time we add the "diff vs section average" column. This shows how the student performed relative to his or her peers.

In the example here, Brian beat the average on "Thesis" and "Develop Counterclaims" but underperformed in the remaining rubric elements.
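The "diff vs section average" column is just a subtraction, along these lines (hypothetical function name and sample numbers):

```python
def diff_vs_section_average(student_score, section_scores):
    """How far one student's rubric-element score sits above (+) or
    below (-) the section average (illustrative sketch only)."""
    average = sum(section_scores) / len(section_scores)
    return round(student_score - average, 2)

# Brian scored 4.0 on "Thesis" while his section averaged 3.4.
print(diff_vs_section_average(4.0, [4.0, 3.0, 3.5, 3.0, 3.5]))  # 0.6
```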

You can view other students by selecting them from the "student" droplist above the chart.

A note about these scores
These numbers are just a simple way to quantify the quality level decisions you made while evaluating each essay. They are not intended to be used to calculate a grade for each essay; that is left up to the instructor and his or her own judgment (in fact, numeric scores for essays are optional and are enabled/disabled by the Max Point Value setting in the Assignment setup).

We avoid auto-tabulating a final score based on these numbers because we assume that the assignment will weight certain elements more heavily than others (e.g. "Thesis" might count more than "Citation Format"). Because we currently do not support the weighting of one rubric element over the other, any "final" calculation we could perform would be unlikely to produce satisfactory results for the instructor.

We're also wary of removing the instructor's judgment from the final score. The stats may say that a student underperformed on every aspect of the rubric, but it still might be the best paper he's ever written. Instructors should be able to exercise their judgment and reward that student's effort accordingly. Technology does not have all the answers and it never will.

How to access these reports
The grading app will auto-redirect you to the first report when you click "exit grading app." However, you can directly access the reports from the "analytics" tab on the Assignment Details page. Notice that the original v1 data reports are still available as well:

You can also click on the new "data reports" link at the top right of any member area page:

Future enhancement: Student access
I'd like to add a link in the students' graded version of their essays to their "Individual details" chart. I think it's useful to see how you're performing relative to your peers.

I'm used to teaching seniors and I train them to be tough and face facts, but I can also see some teachers worrying about this comparison against the section average; it can be quite disheartening to see that every aspect of your essay underperformed relative to your peers.

Obviously there would also have to be security restrictions so that students would not be able to access anyone else's chart.

Future enhancement: Sharable reports
I think there would be a lot of utility in being able to send a link to these reports via email. Send a report to your administrator, to the other members of your teacher team, to your mentor or coordinator. The reports are currently only available to the logged-in instructor, but a future code enhancement could certainly make sharing possible.
