
"Growth"? What is the Measurement that Matters: Describing Adult Learners' Progress

It has been quite some time since I was in a U.S. classroom setting faced with standardized tests. I am an elementary school teacher by training, a bilingual teacher as emergency demanded, and a nonprofit staffer by experience. While I was in Guatemala, the evaluation "illiterate" was commonplace. I had to agree with that assessment when I saw thumbprints in ink instead of signatures, or sixth graders struggling to decode, but I was in constant disagreement about what it was taken to mean for critical or analytical skills. I reflected on my own abilities in content areas where I had less knowledge, like cooking, and on my struggle to teach my father to use his tablet to Skype my brother and his family. I have a Master's degree in Teacher Education. He runs his own business and is constantly constructing arguments concerning world events and local policy. Neither of us is lacking in critical or analytical skills in our contexts of choice, so I began to question the accuracy of such assessments and what lies behind them.

Currently, as Program Manager at a community-based nonprofit serving adults with literacy goals, I have the task of revising and creating the assessment my organization uses both to place incoming adults at a literacy level and to track their progress. The most widely accepted instrument for grant applications is a boxed test. First, the organization does not currently have the funds to purchase this type of test. Second, the organization serves adults with a variety of content needs, from ESL to computers. I returned to my question from Guatemala: what types of tests produce the most useful information for a particular context or student? The answer was both exciting and frightening. "And yet, those are the moments that are most empowering," I related to fellow literacy colleagues at the Wisconsin Literacy Regional Meeting in March. "Challenges in creating and maintaining a method of assessment that unifies program and student progress, not to mention one that tutors and students can use without regular support, were only the beginning." I proceeded to share the following strategy benchmarks.

The Literacy Connection is an organization that grew out of individual mission projects based on the "each one, teach one" philosophy. For the organization, this has meant that volunteers are trained to interact with, teach, and plan progressively for one student in regular tutoring sessions, once or more a week. However, these sessions often do not occur on a regular basis: difficulties arise in students' personal lives, work schedules change, or students are simply unable to organize their time to prioritize the learning session every week, even as they work toward identified personal, educational, or employment goals. At this moment, my position has the unique opportunity to (re)create the language through which the organization speaks its mission and describes students' steps toward their own interpretation of its vision. However, this means I must also be both creative and efficient in designing assessment options for a volunteer staff with a range of professional and educational experiences, which may or may not include teaching.

Both tutors and students are adults. The students themselves are identified by adult literacy levels (beginning, intermediate, and advanced), descriptions of which can be found on websites like proliteracy.org. Some students are highly educated but unemployed due to citizenship or language-proficiency issues, but the majority struggle to make ends meet and often have not finished high school, either in the U.S. or in their native country. For both tutor and student, a key piece of moving forward is the identification of specific skills, but the piece most often overlooked is how to practice those skills. That missing piece became the first words in a developing organizational vocabulary, a shared understanding for discussing content and strategies.

Community-based literacy agencies that receive Workforce Investment Act (WIA) Title II: Adult Education & Family Literacy Act (AEFLA) funding are required to report on five core outcomes:

  • Educational Gain: Did the student progress from his/her entry educational level to a higher level?

  • Enter Employment: Did the student find employment?

  • Retain Employment: Did the student maintain employment?

  • Obtain Secondary Credential: Did the student pass the GED® Test and earn a secondary credential?

  • Enter Postsecondary Education or Training: Did the student enroll in college or a training program?

Educational gains are determined by National Reporting System (NRS) recognized standardized assessments, including the TABE, TABE CLAS-E, BEST Plus, BEST Literacy, and CASAS (I can explain the acronyms if asked). As my Wisconsin Literacy representative explained to me, it takes approximately 60-90 instructional hours to move up one NRS level. The Literacy Connection, like many community-based literacy agencies, does not receive WIA funding because our students do not show this growth. How could they? At one tutoring session a week, often interrupted, it would take a student in our program over a year simply to accumulate the instructional hours required to register a single level of gain.

What are the obstacles, visible or not, that our students are, in fact, conquering? As I continued to explore the challenges, I carried my guiding question with me: "What types of tests produce the most useful information?" However, my question was drowned out by other voices. Students were asking, "Why am I not making any progress?" Tutors were asking, "Why doesn't my student tell me what he or she needs?" I came to the conclusion that students seeking services through TLC needed strategy instruction embedded in their current literacy goal (reading/writing, listening/speaking, math, etc.) and grounded in the problem-solving strategies imperative for academic success. I determined that the lack of perceived progress, as the NRS assessments would measure it, is a direct result of not knowing how to structure one's own learning. What a tutor may perceive as passivity may in fact be the effect of years of the people in an individual's life doing everything for that person, or of a lack of learning experiences, both positive and negative, and the content vocabulary to express them.

As a result of strategy-benchmarked instruction, students are measured across four levels: novice, apprentice, practitioner, and expert. The result has been increased interaction with instructors, more effective study, and progress in literacy content. That is useful information to measure!

Currently, what do growth outcomes mean in our context? Below are short descriptions, modified from examples at www.exemplars.com. The rubrics used for recording are specific to student goals: reading/writing, listening/speaking, and math.

  • Novice: an individual who chooses no strategy when approaching a task; little or no evidence of engagement in the task is present.

  • Apprentice: an individual who chooses a partially correct strategy, or a correct strategy for solving only part of the task. Evidence of drawing on some relevant prior knowledge is present, showing some engagement in the task.

  • Practitioner: an individual who chooses a correct strategy for the unfamiliar task introduced. Planning or monitoring of the strategy is evident, as is evidence of solidifying prior knowledge and applying it to the task. The practitioner must achieve correct answers on the presented task.

  • Expert: an individual who selects an efficient strategy and whose progress toward literacy content can be evaluated consistently using a summative assessment. Adjustments in strategy, if necessary, are made along the way, and/or alternative strategies are considered. Evidence of analyzing the task and relating solutions to those of others is also present. The expert must achieve correct answers on the presented task.

While earning grant money with outcomes based on the strategy benchmarks is a challenge, the benchmarks give every level of our organization a language for speaking to student progress, one that describes students rather than evaluates them. Since The Literacy Connection adopted rubrics based upon the strategy benchmarks, we have seen:

  • Involvement of students in the assessment conversation. They make connections that give their new knowledge meaning, and they are excited to talk about their own learning.

  • Involvement of tutors in the assessment conversation. Tutors do not need to feel overwhelmed by "teacher"-specific literacy terminology or to develop content-specific knowledge of literacy components; they can speak from their observations of their interactions with their students.

  • A more positive perception of teaching and learning accomplishments among tutors, as well as an increased ability to structure observations and analysis of student work. Tutors are more likely to express where their student is and where he or she needs to go next.

  • Provision of vocabulary and organizational structure for content already in practice. Students need a way to talk about their progress, and the organization needs a means to express and track it.
