On this page:
Dr. Shona tests the user experience for Cambium to show that taking a tutorial on any assessment platform is not enough exposure to screen complexities to eliminate the challenges of online assessments for young student readers.
It is critical to address the challenges of online assessments for students, specifically on platforms like Cambium. The platform has so many components and idiosyncrasies that working memory cannot keep up with the complexity, especially for struggling readers, young readers, and emerging bilingual thinkers. Teachers must respond with explicit instruction that addresses how the platform works and the impact it has on comprehension and memory. Teaching students how to read online and use the platform effectively begins with teacher awareness of the Cambium components, and a close examination like this one reveals online assessment best practices we might otherwise miss.
Challenges of Online Assessments Include Screen Composition and Design
Remember the experiments we tried earlier with our devices? Usually, we find that we use different strategies and reading styles depending on the kind of thing we are reading. Some of it is conscious and purposeful. Some of it is so automatic that we may not realize why we are doing what we are doing. There’s a lot of design and marketing behind much of the text we consume online or on any device: market research, design, and engineering purposefully shape our user experience to achieve the goals of the product or service.
Consider website design. Usually, there’s some kind of bar at the top or side for navigation. Images and key segments of text sit in predictable locations spaced throughout the page or are indicated with links. At the bottom, there’s usually text and links that we ignore unless we are looking for something specific, such as footnotes or contact information.
Over time, our eyes have come to expect a similar design for almost all digital texts. Researchers have put special eye-tracking goggles on folks as they encountered different kinds of texts and screens; the goggles measured eye fixations. They discovered that our eyes automatically move across a screen in a pattern that is different from how they move across a physical text. On a digital screen, our eyes move in an F or E pattern. With a physical text, our eyes do some of that, but they shift to more of a Z pattern when there are paragraphs and stretches of connected text.
Do kids know that their eyes are doing that? Do they know when to stop doing the F and E scanning and when to move to a Z pattern to get all the details? Do they know their eyes might go back to scanning when they need to be internalizing the texts? Seems like that should be a thing.
TEA has all kinds of tutorials about the new platform and how to use it. Great. But do those tutorials help kids comprehend and use all the stuff on those screens that is supposed to help them understand the texts and questions? The answer to that question will help us uncover some of the many challenges of online assessments. Knowing what the tools are and how they help is critical, and we’ll cover that next. For now, I want you to focus on the Cambium platform itself. Combine that with what you know about kids. Combine that with what you know about working memory.
Testing the User Experience for Cambium
Open the Cambium assessment portal and pick a released assessment for your grade here.
Now, try out the user experience. Count how many “clicks” you have to make for each component to work. What works differently than what you are used to? How many different components are there?
I’m going to use the 3rd grade 2023 released ELAR exam, because most third graders are nine years old. Note: I’m working in the practice sections, so things might be different on the live platform.
I created my own chart counting how many “clicks” I had to make for each component to work, and I highlighted elements that might cause problems for users. View it here.
Well, I got frustrated making the chart and fiddling with the tools. Am I the only one? When I train folks face to face, they’ll guess that there are about 10 components on the screen that kids have to navigate. No one ever gets close to the real number. When I first started doing this work, I counted roughly 65 components and clicks. Today, I found more: 197. Who knows what the real number is for a given child anyway? Some of them are nine years old, y’all. The user interface is wonky enough to cause confusion and frustration for an adult with a PhD in reading, all before reading a single word of the passage. There are more components, clicks, and navigational elements than we have slots in working memory.
Now, I’m not totally ridiculous. I know that some of these tasks and screen components are automatic and don’t cause much of a fuss for most readers most of the time. And I know that short-term memory is cleared frequently as we move from thing to thing. But there’s enough here that taking a tutorial on the platform is not going to be enough exposure to screen complexities to eliminate the challenges of online assessments for young student readers. Readers and writers don’t become familiar with these tools and formats just because we take our weekly CBAs on the computer. That data is ultimately flawed unless our instruction illuminates how the screen impacts the reading and writing experience. Learners are going to need more than practice tests and simulations in the platform to navigate the complexities in ways that don’t confound comprehension or cause frustration and anxiety.
Next Up:
Now that we understand the working memory problem and the challenges of online assessments like Cambium more clearly, and have experienced them for ourselves, let’s tackle some solutions.