Hugh McGuire recently pointed to a New Scientist blog entry that begins:
A bunch of sources are reporting on a University College London study into how people born after the arrival of the internet – sometimes dubbed the Google generation – handle information. The top line is, they’re not very good at it.
The link points to a press release, entitled Pioneering research shows ‘Google Generation’ is a myth, which summarizes a 35-page report in PDF format. That report in turn summarizes a whole series of “work packages” (more PDF files) identified as the full project documentation.
Let’s trace one of the assertions made in the report, as retransmitted by Information Week:
Also, it’s not true that young people pick up computer skills by trial-and-error. “The popular view that Google Generation teenagers are twiddling away on a new device while their parents are still reading the manual is a complete reversal of reality,” researchers said.
Fascinating. I’d like to know more. How did the researchers arrive at this conclusion? Here’s the piece of the report summary that Information Week sourced:
They pick up computer skills by trial-and-error
Our verdict: This is a complete myth. The popular view that Google generation teenagers are twiddling away on a new device while their parents are still reading the manual is a complete reversal of reality, as Ofcom survey(22) findings confirm.
Ofcom? There’s no link, but footnote 22 says Ibid, referring to footnote 21, which says: Communications Market Report: Converging Communications Markets. Ofcom, August 2007. No link.
Maybe the “work packages” say more about this? In package 2 I found this:
The source? Ofcom (2006). No link. Unclear what the superscript 6 means, as the references in this report are not numbered, but they do mention:
Ofcom (2007) Communications Market Report: Converging Communications Markets. Research Document. London, UK: Office for Communications
Ofcom (2006). The Consumer Experience. London, UK: Office for Communications
So I searched for Ofcom (2006), The Consumer Experience, and found, you guessed it, another PDF, the relevant part of which appears to be section 2.4.2: Profile of those who experience difficulties when using technology. But nothing I can find there, or elsewhere in this report, says anything about who is or isn’t likely to learn about technology by reading the manual. And nothing in Ofcom (2007) either.
At this point I have to stop and remind myself what I was looking for in the first place: Evidence that it is a myth that kids learn by doing, and adults by reading the manual. All I have found is a flock of PDF files that mention one another obliquely, and in ways I cannot even correlate. No links. No data.
Now, the message of this highly-touted “Google generation” report, as refracted through the lens of Information Week, is:
Information literacy has not improved with the widening access to technology. Instead, the speed of Web searching means little time is spent evaluating information for relevance, accuracy, or authority.
And that may well be true. But do you see the irony here? The study making this claim was constructed and published in a way that resists all efforts to evaluate its relevance, accuracy, or authority. Which hardly matters, since none of the reporting about the study seems to have made any such effort.
Pioneering research shows ‘Google Generation’ is a myth? So far as I can see, that report says more about the researchers who wrote it, and about the reporters who reacted to it, than it says about any real or imaginary trends.
20 thoughts on “Mythbusting the ‘Google generation’ report”
Great post, Jon. It’s issues such as this that make my boss, who is not part of the “google generation”, say “Trust nothing you read anywhere, especially on the internet”.
We are all equally guilty of rarely dotting our i’s and crossing our t’s. But I blame traditional media more for such mistakes, because they were the fourth estate before the internet or blogging or anything came about. They supposedly had their act together… and then they lost it!
Actually, it is self-proving since the researchers are probably of the Google Generation.
This is exactly the type of thing I find on the web all the time: 37 websites that all reference each other, which means you actually have only 1 data point, not 37. At one point, web searches could be used for research, but the web is now so self-referential (or circularly referential) that you can’t see the forest for the TREE (and it wasn’t a forest in the first place).
Of course, our two boys – 9 and 5 (and the 5-year-old can’t really read yet) – absolutely *do* pick up things by trial and error and experimenting. Including online games, the Nintendo Wii, hooking the laptop up to the large-screen plasma, cameras. Whilst we poor parents are valiantly trying to RTFM.
Maybe it’s the case that this is something kids do – ie kids have always done this – rather than a “Google generation” thing, but it’s certainly not a “complete reversal of reality” in my experience.
@Mike: You hit the nail right on the head. They pick up everything by trial and error and… by copying their influencers. Although I am quite computer-affine, I have a hard time beating my son (5) at good old Bomberman on the Nintendo DS.
But I lose trying to beat him at memory cards (offline, 60 cards, 20-year-old style)! I have lost at memory cards against my daughter more than once. They do it effortlessly while I try to make up all kinds of schemes to remember where those d*** matching pairs were.
I agree that there is no “Google Generation”. It’s just youth applied to current technology.
Let’s hear it for good old common sense. When you cannot even begin to verify information, its relevance and value decrease to zero. The faster you can determine its relevance, the better. That is where the internet shines. Consider the example without the internet to assist in determining the value (ignoring the fact that the question came from the internet to begin with and probably would not even have been a consideration without the internet).
Or more and faster information just makes bad decision makers make more bad decisions.
Interesting post. I found myself spending quite a bit of time trying to read into the graph results and realized that the apparent increase or decrease from one age group to the next seems difficult (and that’s being nice) to correlate at best. At any age group, the data points noted seem to add up to approximately 145% (+/- 10%). Either the data for the graphic is in error or they allowed any given individual the ability to select more than one preferred means of learning. Either way the conclusion is the same… bad data = bad results.
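For what it’s worth, column percentages that sum well past 100% are the expected signature of a multi-select (“multi-coded”) survey question rather than necessarily bad data, since each respondent can contribute to several categories. A minimal sketch with entirely made-up responses:

```python
# Hypothetical multi-coded survey: each respondent may pick more than one
# "preferred way of learning", so the column percentages need not sum to 100.
respondents = [
    {"trial and error", "reading the manual"},
    {"trial and error"},
    {"asking someone", "reading the manual"},
    {"trial and error", "asking someone"},
]

# Count how many respondents chose each option.
counts = {}
for picks in respondents:
    for choice in picks:
        counts[choice] = counts.get(choice, 0) + 1

# Each percentage is a share of *respondents*, not of total answers given.
percentages = {c: 100 * n / len(respondents) for c, n in counts.items()}
total = sum(percentages.values())
print(percentages)
print(f"column total: {total}%")  # 175.0% - over 100% because answers overlap
```

With four respondents giving seven answers between them, the column totals 175%, even though every individual figure is correct.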
Great post, Jon. Going back to the data – if it is even available in any satisfactory form – is very important.
I’ve found similar problems in supposedly serious research papers about the age-related “digital divide” in Canada (basically doesn’t exist except as a function of economic power) and the wide assumptions about Facebook usage (outside of the US, a clear majority of Facebookers are over 25 and a good portion are over 35).
People seem very willing to take some kernel of an idea and take it much further than any data will support. When I see that, I always wonder what the underlying motivation might be…
Good note, but it’s one fact among a whole report that you have fisked. Also, is the alternative hypothesis – that the Google Generation is very different – any better backed up? I liked this report simply because its research is no worse than most of what I have seen pimping the opposite point of view, and some of the other points it makes are correct.
Two wrongs may not make a right, but they at least give you a believable mean ;)
Nice piece of digging, Jon.
I guess this just goes to show that today’s ‘Google generation’ is just like the ‘permissive society’ of the 60s and 70s — most of us reading about it get the uneasy feeling that everyone else is doing it, and we’re the only ones not getting it …
It’s certainly a shame that the referencing isn’t clearer. For what it’s worth, my best guess at the evidence they had in mind for this particular point was Figure 1.85 (page 90) of the 2007 Ofcom Report (http://www.ofcom.org.uk/research/cm/cmr07/cm07_print/cm07_1.pdf) – which “indicates that assistance in setting up and using the internet is more likely to encourage adults without the internet to adopt it” and that “This trend is true across all age groups, although it is least marked in the 25-44 bracket” – which in turn can be taken to mean that those in the 15-24 bracket feel this need for assistance more than those in the 25-44 bracket.
However, while I would agree that there are some problems with the report, the authors themselves seem to feel this as well – with some caveats such as “The evidence base relevant to the issues raised in this report is incomplete and, in some cases, contradictory.”, the marking of conclusions with confidence levels, and the admission at the start that what is really needed is a proper longitudinal study.
BTW, the source for the other chart is http://www.ofcom.org.uk/advice/media_literacy/of_med_lit/OfcomPromotion/ecconsult/ecresponse.pdf
The superscript 6 has clearly come across from this report, and reads “Base: All UK adults (3,244). Question Z4, prompted responses, multi-coded”
The raw data for the chart seems to be from
http://www.ofcom.org.uk/advice/media_literacy/medlitpub/medlitpubrss/adult_audit.csv – line 41582 onwards – although I can’t say it means much to me…
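For anyone who wants to poke at that file themselves, here is a minimal sketch of slicing out the region mentioned above, assuming the CSV has been downloaded locally. The filename and line number come from the comment; everything else (the helper name, the synthetic sample) is illustrative:

```python
import itertools


def rows_at(lines, start, count):
    """Return `count` lines beginning at 1-based line number `start`."""
    return list(itertools.islice(lines, start - 1, start - 1 + count))


# Against the real file you would pass an open handle, e.g.:
#   with open("adult_audit.csv") as f:
#       for row in rows_at(f, 41582, 20):
#           print(row.rstrip())
#
# Demonstrated here on synthetic lines so the sketch is self-contained:
sample = [f"field_a,field_b,{i}\n" for i in range(1, 101)]
print(rows_at(sample, 42, 3))  # lines 42 through 44 of the sample
```

Using `itertools.islice` avoids loading the whole 40,000-plus-line file into memory just to inspect a small slice of it.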
Awesome post Jon. We need more people like you keeping all those “reports” and reporters honest, and pointing out the hypocrites when necessary.
Owen: Awesome sleuthing! I marvel at the way the RSS feed where (I presume) you found that CSV file is unadvertised, and four levels deep in another part of the tree from the place where the main reports live.
“while I would agree that there are some problems with the report, the authors themselves seem to feel this as well – with some caveats”
I guess so. But geez, the irony is thick here. For example:
“Our final message, one which information professionals have exactly the right skills set to address is the need for greater simplicity. We know that younger scholars especially have only a very limited knowledge of the many library-sponsored services that are on offer to them. The problem is one of both raising awareness of this expensive and valuable content and making the interfaces much more standard and easier to use. The cognitive load on any library user (or librarian) in trying to work through such complexity is at present immense. Librarians are guilty of complacency here.”
Seems to me this report does more to contribute to that cognitive load than to lighten it.
Dude, seriously get a life. Who gives a rats?
Did they even discuss how “instructions” have changed in the past two decades? In the past, software was documented in books. Today, most software is documented right in the user interface, either in the forms, like web pages or tax software, or via a sidebar, like office software. Of course younger people read the instructions – the instructions are right there, a single search away.
Re: Google Generation searching report…
I would imagine that people use keyword searches in order to view as much as possible about any topic. Research requires a comprehensive review of available information, and that might not be possible unless someone properly indexes the literature (print & online). The untrained formatting of an “advanced search” is exclusionary. Poor information indexing, similar to the political/bureaucratic construct of “information on a need-to-know basis”, is not only exclusionary, it is prohibitive.
Don’t bash the keyword searcher. Better, inform the Google-like databases and their indexers they’ve lost the trust of those trying to use/promote online information.
It’s the “researcher’s” job to hunt down everything available and make the most of it. I think that’s why keyword searches “reign.”
Funny I ran into exactly the same questions, then found this post.