Saturday, May 30, 2015

EDUC 7726: Week 8 - Formative Assessment

According to Black and Wiliam, good formative assessment practices can increase student learning dramatically. Formative assessment helps "low-attainers" more than high-achievers.
We are focused on differentiated instruction. Why not formative assessment?
Our national focus is on competitive summative testing. We have been "teaching to the test" for years.

* This is a slide from my presentation. Formative assessment is NOT all technology based!

At Abbott Tech, we have been talking about formative assessments in relation to SLOs. Teachers need data, and formative assessment is a good way to collect it. Pre-tests, exit tickets, and mid-lesson check-ins all provide valuable data. Until I looked in depth at formative assessment, I was insecure about how to gather this data. Technology can make it quick and game-like.

Next year, our freshmen will be getting Chromebooks. In addition to cataloging and distributing devices and overseeing equipment maintenance, I am hoping to provide some solid instruction on using the Internet for research. Recently, I attended a keynote address by Erik Palmer. He wrote a concise 50-page book called "Researching in a Digital World." Although I use the information myself, I have struggled to embed these skills into classroom teachers' lessons. The author had some wonderful ideas around which I was able to create formative assessments.
I chose to focus on three different ways to use Poll Everywhere as a formative assessment tool.


When working on this project, I realized that, when I taught Internet research skills this past year, I did not have crystal-clear goals for student learning. Of course my students struggled. I used formative assessments, but they were cumbersome and not "in the moment." Because I see the students sporadically, I need to give instant assessments. These tech-based quick assessments will be more valuable for me as a teacher and for my students as learners.

Black, P. and Wiliam, D. (2001) Inside the Black Box: Raising Standards Through Classroom Assessment. BERA.
Greenstein, L. Formative Assessment [PowerPoint slides].
Palmer, E. (2015) Researching in a Digital World. ASCD.

Saturday, May 23, 2015

EDUC 7726: Week 7 - Assessment with Technology

Computer-based assessment is in the news. Parents across the nation are exempting their children from participating in the annual high-stakes testing ritual. Tests like the SBAC were supposed to correct the flaws of the high-stakes tests generated under NCLB, which created "teach to the test" environments. According to chapter 7 of the Gordon Commission report, these new tests are designed to assess higher-order thinking, and experts are working on making the tests more effective.

Little attention has been paid to the day-to-day formative assessments that make the most impact on a child's learning. By allowing students to use the electronic devices that have become like extensions of their bodies, teachers can quickly assess their students' understanding in engaging ways.
- Apps that act as games, like Kahoot and Plickers, can quickly pinpoint information gaps.
- Digital tools that can handle more thoughtful classroom response, like Socrative and Lino, can draw out a student who is unlikely to speak up in a classroom discussion. Others learn from and about fellow students, and the teacher gets real-time feedback.
- Reflection is a key component of learning. Blogging and mind-mapping tools can demonstrate student learning to a teacher in a way that multiple choice cannot.

This timely formative assessment is important. Teachers use the data to inform instruction and bring their students to a deeper level of learning. As with traditional instruction, not every digital tool works well in every situation. An instructor should not be afraid to try new digital tools and "change up" their assessments. Some things will work. Some will fail. There are online learning communities where teachers post articles and ideas for using digital tools in the classroom for instruction and assessment. It takes time and practice, but with persistence, it will add a new dimension (and more data) to instruction.

Studies show that "effective formative assessment causes large improvements in learning" (Black and Wiliam, 1998).

Friday, May 22, 2015

Translating Chapter 7 of the Gordon Commission report

Throughout the readings, the concept that kept jumping out at me was that "effective formative assessment causes large improvements in learning" (Black and Wiliam, 1998).

So much talk today is about summative assessment: high-stakes tests that compare our schools to those in neighboring towns and schools halfway across the world. As I was reading, my mind wandered back to my youthful impression of the Iowa tests in elementary school. I considered myself smart, and I wanted to be placed with the smart kids, so I took the tests seriously. At the time, I did not understand that these were formative tests, instructing teachers about skills gaps and missing concepts. The test was designed to form instruction.

I chose to read Chapter 7 of the Gordon Commission report because it focused on the best that summative testing with technology can be. There are 13 different components that new summative tests like the SBAC are measured against. I will focus on three.

1. Provide meaningful information: Data collected from the results of the tests must be "trustworthy and actionable." The results must be true and have practical value. Information from high-stakes tests is used in studies like the PISA report, which compares countries from around the world with one another. It has become like a competition, so it is important that we have the correct data to make the comparisons.
2. Satisfy multiple purposes.
3. Use modern conceptions of competency as a design basis.
4. Align test and task designs, scoring, and interpretation with those modern conceptions.
5. Adopt modern methods for designing and interpreting complex assessments.
6. Account for context: The answers to the tests are facts. The reasons why the student may have given the answers are subject to interpretation. Data about each school - demographics, ethnicity, free lunch recipients, etc. - can cause a child to answer differently than peers in other contexts. This must be taken into consideration during testing.
7. Design for fairness and accessibility.
8. Design for positive impact.
9. Design for engagement: There is much discussion that students must be engaged in order to learn. Similarly, when students are engaged in their assessments, they do better. Using gaming principles can help make tests seem more relevant.
10. Incorporate information from multiple sources.
11. Respect privacy.
12. Gather and share validity evidence.
13. Use technology to achieve substantive goals.

No Child Left Behind became synonymous with frequent testing and teaching to the test. Common Core and SBAC are trying to remedy that with more rigorous standards and adaptive testing. At this time, testing with technology still relies on easy-to-score questions. As test designers become more familiar with what works, they will be better able to apply the above best practices.


Bennett, R. (2013). Preparing for the Future: What Educational Assessment Must Do. The Gordon Commission on the Future of Assessment in Education. 123-141.

Kahl, S. (2015) Technology and the Future of Assessment: Pitfalls and Potential. Measured Progress.

Saturday, May 16, 2015

Peer Assessment

Notes about Peer Assessment best practices

  • Thoughts about how it would work in a lesson:

  1. Supply students with "well-defined, instructor-established criteria."
  2. Have students post work to be reviewed in a public, class-accessible space.
  3. Reviewers (4 is best) look at the work and give feedback based on the specific criteria. In order for the feedback to be anonymous, it must go through another format. What could that be? Should it go through the teacher and be sent to the student? Anonymous feedback is best and can be most brutal! TurnItIn and StudySync are two platforms that support double-blind reviews. Both cost money. Alternatively, the reviewer sends feedback to the teacher, who sends it to the reviewee.
  4. After corrections/adjustments, student work is resubmitted and re-evaluated by same team.
  5. This serves multiple purposes: not only do the students get feedback, they are able to look at other students' work and learn from it. They give diagnoses, receive information, adjust their work and resubmit, and then the same group re-evaluates for a summative assessment.

Link to lesson:

Helpful resources found this week:

Bostock, S. (2000) "Student Peer Assessment." The Higher Education Academy.

Looney, J. “Making it Happen: Formative Assessment and Educational Technologies.” Assessment Network.

Vega, V. (2014) “Comprehensive Assessment Research Review.” Edutopia.
Rubistar rubric maker.

Friday, May 8, 2015

Adventures with Infographics

I enjoy creating visual representations. I also obsess over the finished product. The project becomes more of a production for me. Though I enjoy it, it becomes very time-consuming. There comes a point when I just take a deep breath and hit the "share" button.

That said, I found information on the topic of technology coaching and quickly realized that it is not a data-rich subject. I was not going to be able to throw statistics on the screen and write little blurbs tying the information together. I found a template in Canva that really shouted to me. I like information to be linear, succinct, and tied up with a quick recap. With some adjustments, the design I chose worked for me.

My infographic working title was "Why hire a technology coach?" which evolved into "Why teachers need a technology coach." The concept of technology (instructional) coaching fascinates me. Much of the literature about integrating technology into classrooms is not encouraging. Many studies show that, once the newness wears off, little improvement in learning takes place. In many instances, the actual device and its functions have been the focus. In order for technology integration to be meaningful, teachers need to be trained to focus on the student outcomes.

I sent my design to a technology teacher (Tina) and an artist (Carl), who each gave me points to look at, both design and content. I also looked at Josh's infographic and saw the link to his works cited. I did not want to add a second page, and Canva has font size parameters, so this eased my struggle to fit everything in.

"ISTE Standards: Coaching" was a gold mine of information. I was previously unaware of the varied responsibilites of a good tech coach. A coach will help set goals, lay a strong foundation, provide professional development, give individual feedback, troubleshoot, support teachers, look at data... At first glance it seemed like a tech coach would be indispensible. But why?

The ISTE Conference white paper, "Technology, Coaching, and Community," really filled in the gaps. The authors cited studies, and the studies had some data behind them. Coaching is one feature of an effective professional development program, and PD is critical for teacher growth and system change. I was able to use some of the works cited in this article to find more anecdotes and information.

Finally, I found a statistic in an article sponsored by Amplify that related to my overall infographic: The WHY. School systems are adding technology as quickly as finances allow, and they are starved for good teacher training. A technology coach would fill the need. The article included an amazing infographic:

Tips for teachers: To make an infographic with lots of data, choose a topic with lots of data. As an infographic novice, I chose a template on an app where I had previously created an account. If I had had a lot of time to play, I would have created something from scratch, but templates are really handy and can be adjusted. 
When first using infographics in the classroom, I would recommend teacher-created infographics or "authentic" (found) infographics for practice with data interpretation. Some infographics are more like reading a comic book than a short story. Personally, I would not assign students an infographic creation project unless (1) the class had worked on interpretation of infographics, (2) there was available data for the project, and (3) there was a very specific rubric for student expectations. Even with high school students, I would specify the number of "numbers" that should appear in the piece.

I was musing that I may create an infographic for my year-end library report. Considering the many hours I spent designing and re-thinking this project, I may just type up a Word document.
I was so excited about my classmates' infographics, I decided to try Piktochart. Here is the same information in another format: . I will post it on the G+ forum in a separate post.