In 1994, only 3 percent of U.S. classrooms offered access to the internet. Researchers like me coined the term the “digital divide” to acknowledge the emerging disparities between the “technology rich” and the “technology poor.”
Since then, schools have been in a virtual arms race to acquire more technology. By 2005, 94 percent of instructional classrooms offered access to the internet. Today, students in lower-income schools are just as likely as their counterparts in more affluent schools to see computers, the internet and a wide range of software applications in their classrooms.
As the pressure to promote science, technology, engineering and math (STEM) in schools continues to intensify, the acquisition of technology has increased, too. Tech companies such as Google and Apple are in an epic battle to shape STEM learning and control the classrooms of tomorrow.
A few years ago, a team of graduate students and I went to a high school to study how schools integrate technology into the curriculum. Many of the educators we met equated providing students with technology with providing them richer learning opportunities. But a new reality is forming that questions the logic that technology is a certain pathway to better learning futures.
During our time in the school we saw evidence of what some researchers call “technology-rich, curriculum-poor classrooms.” These are schools that have invested substantial financial resources into hardware and software but have not figured out how to build curriculum that prepares students for a society and economy undergoing steady change.
Students practice basic computing skills such as search, word processing and creating slide presentations, but those are not the skills that matter most. The most important skills in today’s economy involve being able to use technology in smart and novel ways. For example, we rarely saw students learning how to code or use technology to work with information in creative ways or to solve problems.
Schools aim to produce students who are college ready or career ready. But this, too, is problematic. Data from the U.S. Department of Education suggest that only a small percentage of students graduate from college in four years. And the notion of a career — steady and long-term employment with one employer — is steadily fading in the so-called gig economy. Most millennials and members of Generation Z will work several jobs in their lifetimes, and not necessarily in the fields they studied.
Rather than focus exclusively on college-ready or career-ready skills, schools should also develop “future ready” skills. What does it mean to be future ready? It means rather than preparing students for the jobs in today’s economy, our schools should be preparing them to be flexible enough to find or create work in tomorrow’s economy. Future readiness suggests that the ability to think creatively, practice lifelong learning, grapple with uncharted problems and use technology in novel ways will be some of the primary skills of tomorrow.
Even as many schools across the country invest in more technology, Silicon Valley parents are opting to send their young children to schools that do not provide screen technologies in the classroom. And while these parents certainly operate from a position of privilege, the decision to limit their kids’ exposure to screens highlights something that we found in our research: In the economy of tomorrow, the ability to use creative, critical and problem-solving skills will be just as important as the ability to use technology.
Future-ready schools prepare students to perform tasks that computers cannot perform, such as asking novel questions, solving uncharted problems, or working with and analyzing data in creative ways. Developing these skills does not require an abundance of technology. It does require, however, well-trained teachers, carefully calibrated curricula and a vision of education that is relevant in our knowledge-driven economy.
Importantly, future-ready schools understand that technology is a tool to help create better learning futures and not, by itself, an indicator of better learning.
S. Craig Watkins is a professor in the Moody College of Communication at the University of Texas at Austin, where he studies young people’s social and digital media behaviors.