Bob’s Assessment, ED 653

Original Post

The performance I really need to assess occurs in the workplace, at the service desk, in employees’ various interactions. Wiggins and McTighe define “authentic performances” in six ways:

  • Is realistically contextualized….
  • Requires judgment and innovation….
  • Asks the student to “do” the subject….
  • Replicates key challenging situations in which adults are truly “tested” in the workplace, in civic life and in personal life….
  • Assesses the student’s ability to efficiently and effectively use a repertoire of knowledge and skill to negotiate complex and multistage tasks….
  • Allows appropriate opportunities to rehearse, practice, consult resources, and get feedback on and refine performances and products. (Wiggins & McTighe, 2006)

My quick and dirty assessment is that I have addressed the first four elements; my first draft falls down on the last two. Both my learning outcomes and my journal assignments need revision with these elements in mind. A profound difference in this context is that my instruction is blended/flipped into the workplace: I am assessing employees’ performance for continued employment and possible promotion. Turning to the work I need them to do because of this training, I identified these five performances:

  • using effective listening skills in conversations with customers, employees, and supervisors.
  • providing higher-quality, more efficient service interactions.
  • showing increased confidence in their performance.
  • navigating difficult conversations successfully.
  • choosing to have difficult conversations more frequently.

I chose to emphasize learning journals for two reasons, one selfish and one pragmatic. The selfish reason first: a graduate school mentor, before the advent of online instruction, used learning journals; alas, his life was cut short by cancer. I want to experiment with journals, in part, in his remembrance. But some of the best demonstrations of learning come out of the tradition of journals: the Lewis and Clark expedition journals, Darwin’s Voyage of the Beagle, and a personal favorite, The Log from the Sea of Cortez. My pragmatic reason is that I do not want the work in the online environment to overshadow the work at the service desk; rather, I want it to support that work. If I get into required essays or quizzes, I have slipped back into schooling for the sake of schooling, which is exactly what I need to help these young people keep at a distance. They are excellent at schooling. They are not so good at real life. So, returning to my assignment:

Let us think of the forums as our journals. Journal entries do three kinds of work: first, they record what we have seen and heard in the training videos (slides, bullet points, interesting turns of phrase, ideas we encounter for the first time); second, we turn to reflecting, recording our reactions, feelings, judgments, and learning; third, we engage in analysis:

  • What was really going on?
  • What sense can you make of the training video?
  • Can you integrate theory into the workplace experience/online training?
  • Can you demonstrate an improved awareness and self-development because of the training and our work in the forums?

Unlike our private journals, this is a shared journal. As such, we have responsibility and accountability to each other. We need to be both courteous and hardheaded. If we simply pitch each other softballs, our learning will be limited. If we are rude to each other, no one will participate.

A quick and dirty literature review shows that a lot of the scholarship around learning logs was done in the mid-1980s through the early 1990s. A lull occurs until the early 2000s (Babcock, 2007; Hurst, 2005). The most recent articles, however, do not explore the use of this tool in online contexts such as forums and blogs. I think these media are ready-made sites for these teacher-student/student-teacher and peer-to-peer interactions. Returning to Wiggins and McTighe, I am reminded to ask about the evidence I need to confirm learning.

  • What specific characteristics in student responses, products, or performances should we examine to determine the extent to which the desired results were achieved?
  • Does the proposed evidence enable us to infer a student’s knowledge, skill, or understanding? (Wiggins & McTighe, 2006)

Because this journal is shared in real time, it is possible to engage in formative assessment rather than depend on summative assessment for proof of learning. At any time, we can step out of the conversation to ask about the conversation itself; that is one important aspect of assessment. I can look for key words, or their synonyms (“explain, interpret, apply, express perspectives, empathize, and self-knowledge”), in their journaling and commenting.
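Because the journals live in online forums, that keyword scan could even be roughed out in software. Here is a minimal sketch, entirely my own; the synonym lists for each facet are illustrative assumptions, not a vetted instrument:

```python
# Hypothetical sketch: tally words suggesting each of the six facets
# of understanding in a forum journal entry. The synonym lists below
# are placeholders I invented for illustration, not a validated rubric.
facets = {
    "explain": ["explain", "because", "therefore"],
    "interpret": ["interpret", "means", "significance"],
    "apply": ["apply", "used", "tried"],
    "perspective": ["perspective", "point of view", "on the other hand"],
    "empathize": ["empathize", "felt", "imagine"],
    "self-knowledge": ["i realized", "i noticed", "my own"],
}

def facet_counts(entry: str) -> dict:
    """Count how often words suggesting each facet appear in an entry."""
    text = entry.lower()
    return {facet: sum(text.count(word) for word in words)
            for facet, words in facets.items()}

entry = ("I tried the listening technique at the desk, and it worked "
         "because the customer felt heard.")
print(facet_counts(entry))
```

A scan like this would never replace reading the entries, of course; at most it flags which facets an employee’s journaling has not yet touched, so the coaching conversation can go there next.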

As I have said elsewhere, I really care most about what employees do at the service desk. Accordingly, the rubric I want to work on is focused on the workplace, not the journal. Over the years, we have created documents to help us do our work: training checklists and performance evaluation forms. Like many workplace documents, they are organic and local. The notion of workplace rubrics inspired me to do a little Googling, to see what other people are doing. I stumbled on the Arizona Workplace Employability Skills Project (Arizona Department of Education in partnership with the University of Arizona and Corporate Education Consulting, 2012), which is simply a great resource for the task. This is their rubric in outline.

[Chart: the Arizona Workplace Employability Skills rubric in outline, with four stages of progression from novice to leader]

However, Owen’s point is that we practice writing rubrics. Right now, I am content with the two elements of core communication, so provisionally I will let them stand. “Sensitive to diversity” is a serious hot button in both the workplace and in higher education. We have not really grappled with serious and focused training; we give protected classes and harassment a salute. We have not treated diversity as a core skill (mental note: we have work to do here). The technology piece is in many ways assumed (mental note: we have work to do here). We do talk about privacy laws. We talk around the matter of brand integrity. Therefore, I have a lot of work to do, both in refining the training and in defining this rubric. Alas, the actual evidence for relative competence offered in the Arizona report is of limited use, which is why I leave it out of my summary. So yes, this is feeling overwhelming. Our authors say, “If the thought of using so many rubric traits seems overwhelming, start small. Go back to the two basic criteria – quality of the understandings and the quality of the performance. Add a third for process when appropriate, and other rubric traits as time and interest permit” (Wiggins & McTighe, 2006).

I need to refine the technology criteria of this core skill set. We use telephones (voice and text), email, and Google Calendar in our communication strategy; one of my Library Coordinators uses Facebook as well. Therefore, I will rephrase the criterion to be specific: “Exercises competence in using telephone, email, Google Calendar, or Facebook for workplace communication.” I will rephrase the brand-integrity criterion to say, “Represents the library in a positive manner.” Regarding laws, I offer this formulation: “Abides by privacy laws and library policy protecting customer information.” The criterion “Matches technology to content” reminds me of matching affect appropriately to the situation (laughing when someone is crying, as a negative example). Perhaps I can do a parallel formulation, one in the core workplace skills and one in the technology section. These criteria are heavy on performance and light on evidence of understanding. By posing “why” questions, I may better formulate my learning outcomes and facilitate the employees’ attempts to offer evidence of understanding along with practical performances.

Full stop: I am worried about using rubrics in the workplace because of misplaced concreteness and misplaced emphasis. I do not want student employees chasing points on assignments; I want them chasing excellence in a real-world performance, in all of its ambiguity. The right answer in a service situation does not exist once and for all. Rather, it is negotiated repeatedly, instance by instance. Yet I sense the value and merit of rubrics for supervisors in having a touchstone, a standard for evaluation, even as we negotiate this best possible version of the Colby College libraries.

I think I like the Arizona rubric in the context of a student employee’s workplace performance. I like the four stages of progression, from novice to leader. However, in the context of blended/flipped online instruction supporting workplace performance, and given my emphasis on learning logs, I prefer the rubric that Tatiana offers: clarity, analysis, relevance, and self-reflection. I mentioned in an earlier post that I am teaching the precursor material at the same time I am developing this unit. What I am seeing is almost a hunger among the student employees for this kind of interaction, one where I and my library coordinators engage with them on the topics and coach them on these necessary life skills. The engagement and self-reflection are extraordinary, and I am right to be cautious about developing a rubric for grading in this instance. I get that I need to learn more about writing rubrics, but that is a different situation.

References

Arizona Department of Education, in partnership with the University of Arizona and Corporate Education Consulting, Inc. (2012). Arizona Workplace Employability Skills Project: Rubric Development, Workplace Employability Skills Project Phase II. Arizona.

Babcock, M. J. (2007). Learning logs in introductory literature courses. Teaching in Higher Education, 12(4). doi: 10.1080/13562510701415615

Hurst, B. (2005). My journey with learning logs. Journal of Adolescent & Adult Literacy, 49(1). doi: 10.1598

Wiggins, G., & McTighe, J. (2006). Understanding by Design (2nd ed.). New Jersey: Pearson Merrill Prentice Hall.
