Whenever we see a job listing that says “digital native required,” we read it as code for “someone under 30, please.” Exclusionary, perhaps?
The idea of the “digital native” rests on the assumption that people who cut their teeth on smartphones think and take in information fundamentally differently from the rest of us “digital immigrants.”
Digital Native Is a Myth
A 2017 editorial in Nature takes issue with this belief. Essentially, millennials and younger generations are no better at technology than the rest of us. And despite the assumption that digital natives have a special ability to multitask, it turns out there’s sparse proof of that, too. Shocker. We all learn at the same speed, we all take in information at the same speed, and we all use technology the same way: as a means to passively absorb information.
It’s not just the study referenced in the Nature editorial that makes this argument. A multitude of studies agree with the Nature article, including a paper by the ECDL Foundation, an international non-profit dedicated to improving digital competence standards in society. The group contends that the digital native is a dangerous fallacy and that studies have, in fact, repeatedly shown a glaring digital skills gap among younger folks, who often overestimate their skills. (For example, in a study in Switzerland, 85 percent of respondents thought they had good internet and email skills, but only 34 percent actually did.)
For the most part, younger generations tend to learn their skills from friends rather than through formal education, according to Google senior research scientist Dan Russell, who has also called digital natives a myth. Kids can be fast with technology, but only with things they’ve spent a lot of time practicing (social media and video games, say). So while they may possess lifestyle technology skills, like texting and watching videos, those are different from the workplace skills employers look for, like changing a chart type on a spreadsheet.
Our ability to access a seemingly infinite amount of information at our fingertips has certainly changed the definition of what it means to be literate, Russell says. But being able to craft a good question and organize what you learn are still crucial components of literacy in this digital age.
Exposure to technology alone doesn’t necessarily translate into an ability to use it, and younger generations aren’t innately better or more talented users of digital technology. Digital skills are a continually growing asset in the workforce, but they aren’t found only in people born in the 1980s and ’90s.
Maybe someone should let HR know about this?
Click here to read the editorial in Nature, and here to read some bite-sized arguments about the fallacy of the digital native from the ECDL Foundation.