Using Research & Data to Define & Measure Success – LIVE Blog of Dr. Damian Bebell at LFL15

Dr. Damian Bebell, assistant research professor at the Center for the Study of Testing, Evaluation, and Data, kicked off the afternoon. He is a researcher rather than a practitioner, so he brings a new perspective to the conference.

How do we study the impact of educational technology?

This is the first big question that Damian asks. To answer it, he says, we need to define outcomes, measure access, measure use, and examine the relationship among access, use, and outcomes. Looking at the impact of 1:1 access on education, Damian shows the ratio of students to computers over time. In 1983, the ratio was 125:1. By 2014, the ratio had become 3:1. Over that time, access shifted from shared access (i.e., computer labs) to sporadic use to limited individual access.

1:1, in reality, describes access but not necessarily pedagogy. From a research perspective, the emphasis has been on the selection and mechanics of programs rather than on an understanding of how teaching and learning might change. This raises two new questions:

  1. How do you know what success looks like?
  2. How do you measure your success?

For a program to be successful, there needs to be ongoing professional support as well as building-level leadership that models expectations. These are critical for sustaining programs and for moving them from an introductory phase to an established one.

What is the purpose of school?

While it would be nice to think that these conversations happen regularly, research has found a tremendous number of assumptions but not much evidence that people are actually having this conversation.

How do we categorize and reflect on the purpose of school? To answer this question, researchers collected school mission statements and published their findings at www.purposeofschool.com. They analyzed the language across thousands of mission statements to identify each school's stated purpose according to a set of themes. It's critical that schools keep their assessments and their missions in alignment. For example, if a school's mission is to inspire creative students in a child-centric environment, then measuring success based on test scores doesn't match up.
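Damian doesn't walk through the coding scheme itself, but to make the idea concrete, here is a minimal sketch of keyword-based theme tagging. The themes, keywords, and the `tag_themes` function are illustrative assumptions, not the actual method used at www.purposeofschool.com.

```python
from collections import Counter

# Hypothetical theme/keyword lists -- illustrative only, not the
# coding scheme used by the purposeofschool.com researchers.
THEMES = {
    "academic": ["achievement", "rigor", "excellence", "knowledge"],
    "citizenship": ["citizen", "community", "global", "responsible"],
    "creativity": ["creative", "curiosity", "innovation", "imagination"],
    "child-centered": ["whole child", "individual", "nurturing", "child-centered"],
}

def tag_themes(mission_statement: str) -> Counter:
    """Count how often each theme's keywords appear in one mission statement."""
    text = mission_statement.lower()
    counts = Counter()
    for theme, keywords in THEMES.items():
        counts[theme] = sum(text.count(word) for word in keywords)
    return counts

mission = ("Our mission is to inspire creative, responsible citizens "
           "in a nurturing, child-centered environment.")
print(tag_themes(mission))
# The child-centered and citizenship themes score highest for this statement.
```

Across thousands of statements, tallies like these make it possible to compare what a school says its purpose is with what its assessments actually measure.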

As another example, Damian points to LA Unified. Because of differing assumptions about how the devices should be used, the media deemed the district's 1:1 iPad program a failure; there was no consensus on how to measure success before the program began.

While it is easier than ever to capture both formative and summative assessment data, doing so raises the question of how teachers and school leaders will be prepared to use that information. It also removes the ability to make assumptions without data to support them.

Why Collect Data or Conduct Research?

Before a study even begins, the process of bringing constituents together to discuss project and program goals helps surface assumptions. This happens before any data is collected, and it empowers the voices of the various parties and stakeholders. Ultimately, it can lead to informed discussions about the purpose of school and the ways in which we can support advances in teaching and learning.

By collecting data, it becomes possible to document the evolution of practices and then assess their effectiveness. Data itself is agnostic and has no opinions, but it allows conversations to develop and shines a light on pockets of innovation and excellence. It can also support planning, resource allocation, professional learning, and strategic decision-making.

Within the culture of the school community, data can serve as a dashboard that provides indications of success. While it may or may not change practice, it can offer insights into the community as well as into teacher practice.

What Resources are Available?

Most schools already have access to data; it's just a matter of how to explore and examine it. Both formative and summative assessments can be used to support decision-making and conversations. Different visualization tools also allow for different types of analysis. Data can illuminate nuances and help provide answers, particularly when viewed in different ways.
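As a toy illustration of how the same numbers look different depending on the view, the sketch below uses made-up assessment scores: two classes share the same mean, but a distribution-oriented view (spread and students below a cut score) tells a very different story.

```python
from statistics import mean, stdev

# Made-up formative assessment scores for two classes (illustrative only).
class_a = [70, 72, 74, 75, 76, 78, 80]
class_b = [50, 58, 66, 75, 84, 92, 100]

for name, scores in [("Class A", class_a), ("Class B", class_b)]:
    print(f"{name}: mean={mean(scores):.1f}, "
          f"spread={stdev(scores):.1f}, "
          f"students below 70: {sum(s < 70 for s in scores)}")
# Both classes average 75, but Class B has far more variation and three
# students scoring below 70 -- a nuance the mean alone hides.
```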

To conclude, Damian references Seymour Papert: "it's hard to think about the future when thinking only about the capabilities of the technology today."

