Online Engagement: What Causes Variation in Contributing to Participatory Web Sites?

Uploaded by GoogleTechTalks on 27.01.2011

I'm very happy to introduce Eszter Hargittai who's from Northwestern. I first got to know
Eszter when she was a visiting scholar here at Stanford. She came over and was helping
us with a bunch of field study work, and her insights into how people search, how they
think about working on the web and getting information have proven really interesting
and useful. So, I also want to point out that she's also a Master Geocacher, so she's capable
of finding anything under the most rigorous of conditions. So, please welcome me in welcoming
Eszter Hargittai. >> HARGITTAI: Great. Thanks very much. And
thanks for coming. So I thought I would start by situating myself since this is an interdisciplinary
group. I don't know if you've seen this slide from XKCD but there I am identifying myself
as to where I belong on the scale. So I'm a sociologist by training. I thought that
would be helpful for you to know because I know there are people from very different
backgrounds here. And generally speaking, my background in terms of what has motivated
me to do the research I do is an interest in questions of social inequality and how
internet use in particular, relates to that. So that's the very general motivating framework
that I use. So I'll start with a few framing words for my talk. So there's--to this day,
after quite a while of having had the internet around, there's still a lot of enthusiasm
about its potential to improve our lives in various ways. But what's interesting is that
there's also another side of this, which is that there's actually quite a bit of anxiety
when it comes to kind of the opposite of all these potential positives. So depending on
who you are or how you think about these things, people get either extremely excited about
the internet and its potential in positive ways or freak out about its negative potential
consequences. So, if you think this is kind of an ugly slide, that's on purpose because
I just wanted to show the cacophony of just these opinions that are out there and how
jumbled they are and not--not very clear view I would say, because how can you really have
all these different perspectives? So, a simpler way to show it is just to say that. So one
perspective suggests that there's the internet and everything that comes from it is great.
And then there's the other side which suggests that there's the internet and we have all
these negative consequences. Conveniently, these smileys actually can transform into
a Venn diagram. And what I argue is that, in reality the way that we should be thinking
about the internet's potential social implications is something in between. That there's a balance
and that there's nothing inherent about what will emanate from the internet, socially
speaking. Rather there's a balance and in fact it depends on the social context of a
person's usage of the internet, what might come of it, whether there will be potential
benefits or negative consequences and what the balance is of those types of outcomes.
So that's just the framing of the talk for you. So, I'm very interested in people's social
background, what they're bringing to their internet uses and, in particular, we'll be
focusing quite a bit on the concept of skill, and how skill in particular matters, and why
that matters and why we need to pay more attention to it. Another issue that I wanted to mention
concerns what seem like pretty widely held assumptions about how savvy today's youths
are with technology. And this is what I will call a myth, and I will show you data to suggest
that this assumption is actually incorrect. These are some of the--what some people, quite
a few people seem to consider the capital-T Truth about young people, but in fact
there's little empirical evidence to suggest that this is all true. So, what we--we do
have empirical work on is more that "yes, young people have been online a lot throughout
their lives and actually do spend a lot of time online". However, there's
not really empirical evidence to suggest that they do lots of different things online per
se and there's definitely no empirical evidence to suggest that they are universally savvy.
So, I'll be showing you some data on this. Just one more word on overall framework. So,
again, I am interested in internet users. I'm not that interested, for the purposes
of this talk and much of my work, in people who don't use the internet at all. To be sure
there are still lots of Americans who don't, and lots of people in the world who don't
and that's an issue as well, it's just not my focus. So I start out with the user and
recognize very consciously that that user has a certain position in society and brings
all sorts of characteristics to their uses, and then there's also a technical context
and a social context of use that also matters to how people use the internet which is often
ignored in a lot of work. And then I argue that these influence someone's skill, all of
which then influences what they end up doing with the medium. And then the really
big question, which I'll leave more as a question but is something that really
guides all of this work at a much larger level, is how all these things might feed
into people's life chances--what kind of feedback mechanism do we see here; how might people's
skills and what they do online impact their offline life
chances as well? So, whether that's health wellbeing, financial wellbeing, or what not.
So that's why ultimately we should care about these things from my perspective. And again,
as I said, skill is a--is a factor that I will be focusing on quite a bit. Okay. So
what data should one turn to if interested in skill? So that's--that's an issue because
people don't tend to collect data on skill, large data sets don't really have information
on that. So, I've been left to collect a lot of my own data. Before explaining, specifically,
the data sources that I am drawing on for today's talk, I did want to spend a few minutes,
just a couple of minutes talking about why I don't rely on logged data and why I think
there are challenges to relying on logged data. And why I'm excited to see that Google over
the years is supplementing its work on logged data with other types of data work about users.
So, for example, an important issue is that becoming the user of a service is not a random event.
Now that's an empirical question. I will show you data to show that this is the case. But
basically what I mean by that is, whether you use a site is not random across the population
and so, often we'll see data analyses where people analyze the users of a certain site.
Well, by analyzing users of a certain site, that project, that study has already automatically
sampled out, or biased against, those people who are not users
of that site in the first place. So that's one issue. Another issue is that, people understand
and use sites and services differently. So, for example, if you want to look at people's
friendship networks in general, and you just use data about people's networks on Facebook,
you can't just say that you are looking at people's friendship networks per se because
some people may use Facebook to connect with colleagues, others might use Facebook to connect
with family, yet others might use it to connect with distant friends, others with their closest
friends. So you can't compare the data about networks of people on that one site assuming
that they're all the same types of ties. So, that's just an example of why, again relying
on just logged data may be problematic. And then finally, or one more point is that, what
people do online is just one aspect of what people do for different things. So for example,
how am I in touch with my best friends? Well, maybe I'm in touch with them, again, say on
Facebook, but in fact if I'm in touch with my best friends mostly through texting or
picking up the phone, then if you just look at my Facebook network, you're getting a skewed
perception of what I do with my best friend or even who my best friend may be because you
don't have the other data of how we interact, for example. So those are some reasons why--
logged data have tons of potential, but obviously, they can't address all issues. There are also
some other factors as well, things that we can't really assume by looking at logged data
and I'll get to some of that as I talk to you about my data. So as Dan mentioned in
the introduction, I've done quite a bit of work looking at, specifically looking at how
people search for information. Actually meeting with people one on one, sitting with them,
it's what's called usability testing in some areas. We just call it observation in my area.
But in one study I did, what we found across, or what I found across a hundred cases was
that people differed considerably in how long they took to complete a group of tasks. And--so
that was interesting in and of itself and it showed that there are definitely skill differences
among users. I'm not going to get into details of that particular study because it's not
really what I'm focusing on today. The reason I bring this up is to mention how I developed
a skill measure, an awareness measure that I use on surveys. So one of the big challenges
of these in-person studies is that of course, it's incredibly labor intensive, it's very
expensive. It takes a lot of time to do and you really can't do it on large samples and
it would be very difficult to do on representative samples. So there's limited generalizability
from the findings given that it would be very hard to get a nationally representative
sample that's large enough. So what I did in that study was, I looked to see what were
good survey measures--what were good proxies on the survey to measure people's actual skills
as I observed them while looking at how they found the information. And so this is how
I came up with the survey instrument that I used, that I call a measure of skill, which
itself--it's a--it's a complicated methodological question. I'm happy to talk about it more
but, just so you know, it's not that one day I woke up and came up with this. I've been
working on this for about a decade now. And have three publications that detail how it
was developed and how it relates to other types of data, but basically this is the question
that people see on the survey. And this is a list of items that they are asked to rank.
And the idea is that a measure that comes from this survey item list--and this
is just half; there's a whole other list of another dozen terms--is a better predictor
of actual skill based on a previous study than how many years you've been a user, how
much time you spend online, or even simply asking people what they think their internet
skill is. So this--this just performs better statistically so that's why I use it. Okay.
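As a rough illustration of how a composite measure like this can be scored (the term list and scale here are hypothetical, not the actual survey instrument), each respondent rates their understanding of a set of internet-related terms and the index is simply the mean rating:

```python
# Hypothetical sketch: scoring a composite web-skill index from
# self-rated understanding of internet-related terms (1 = no
# understanding, 5 = full understanding). Terms and ratings are
# illustrative, not the actual survey items.
from statistics import mean

TERMS = ["tagging", "tabbed browsing", "bcc", "wiki", "podcasting"]

def skill_index(ratings):
    """Mean self-rated understanding across all terms (1-5 scale)."""
    missing = [t for t in TERMS if t not in ratings]
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    return mean(ratings[t] for t in TERMS)

respondent = {"tagging": 5, "tabbed browsing": 4, "bcc": 2,
              "wiki": 5, "podcasting": 3}
print(skill_index(respondent))  # -> 3.8
```

The point of validating such an index against observed performance, as described above, is that the composite tracks actual skill better than any single self-report question would.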
So then--today I'll mainly be drawing on survey data about internet uses and skills. In an
ideal world, I'd have nationally representative data. I don't, although I will mention a data
set that has since been collected that has some similar data. So, I'll be able
to compare the nationally representative data set to some of my findings. But basically,
so, yes, in an ideal world we'd have a nationally representative sample, [INDISTINCT] expensive
I haven't had the funding to do that. So what I've done is I've done a lot of my data
collection on a specific college campus at the University of Illinois, Chicago. And I
want to say a few words about why there. So first of all, note from my signature line
in case you forgot, that's not my institution and I've never been affiliated with it, so
this is not simply me studying my students, just to clarify. And for those of you not
familiar with Chicago geography, that's Northwestern and that's where UIC is, and they're about 20
miles apart. And we do in-person data collection in the winter. So, I used to have--I used
to have the average temperatures for Chicago winters. But you can imagine, I mean last
weekend was zero but the average is about 20. So, all of which is to say this
is not a convenience sample because as my research assistants will tell you there's nothing convenient
about going down to UIC from Northwestern 30 times in the dead of winter. So, the reason
I work with UIC is twofold. One is that, it turns out that it's one of the most ethnically
diverse universities in the country, and so as someone who's interested in social inequality,
it's helpful to work with the population that has quite a bit of diversity in terms of students'
backgrounds. And then the other reason that I work with this group is that, there's one
course at UIC that every first year actually has to take. So that's very
helpful because there's no bias towards a course that people happen to sign up for.
There's a course I can go into, a course where people have been very kind and generous to
work with me, where I can go week after week into the classrooms and survey people in
there. Now just to clarify, it's not one big class. It's actually more than 80 sections
so we surveyed them 20 at a time. So it's still a lot of work. And we did a study back in
2007 with over a thousand of their first years at that time and then followed up with over
a thousand new first years in 2009. And then what's also very exciting is that this past
year, we actually followed up with that same sample so we have [INDISTINCT] data for--for
about 500 students for what they do online and their skills. Okay, so as I said, it's
a diverse campus and indeed you can see that from the demographics of the students who
participated in the study. So we have half, almost half who were first generation college
students. This is important because--so obviously by going with the first year college cohort
we're controlling on age and education. Well you can see that age is pretty much constant
and education obviously, we're controlling for [INDISTINCT] first years of the same school.
However, parental education is a proxy for socioeconomic status, as is often used in the literature.
I just want to remind you again, this is, while a diverse sample, nonetheless one that
controls on education and controls on age; both are related to internet uses so that's
important to acknowledge. But if anything, it means that findings would be conservative
when it comes to a more representative national sample, so that's just something to keep in
mind. So is it the wired generation? Is it the case that--for those assumptions--that
this generation has grown up with technology? Yes, we can confirm that. So, from the data
we see that they have lots of experiences with the internet--I don't think I have to
go through all of the slide--I will add though that most of them still do use email a lot. There's
some--at least in the academic community there's some discussion that today's young adults
no longer use email, but I think that's more they don't want to, you know, they don't want
you to reach [INDISTINCT] but they actually do. Okay. So, when it comes to understanding
some of these internet-related items that I showed you were part of this measure; this
is what the figures look like. So again, this was that--a survey item that I described that
has all these responses. So, there are certain terms that lots of people understand but then
others that people understand much less. Now, it's an ongoing process
for me to figure out additional ways to measure skill. So, I try to come up with new measures
as I field new surveys, and one of the things I did on the 2010 survey was to actually use a
multiple choice question for whether people understand BCC and--just to be clear I didn't
care if they knew that it stands for Blind Carbon Copy. That's not the point. The point
is do they care, do they understand the function? Like what does that actually mean? It turns
out that a third of them don't. They don't know what this BCC is. So all those disastrous
situations sometimes when, I'm sure we've all experienced it, people get cc'd who should
have been bcc'd. This would be one source of that. So, just because an organization
hires a young person doesn't mean that they know how to use every aspect of technology.
So it's kind of--I often think of some of my data as a reality check about how to think
about employees and colleagues. So, then there's--there was a list of what I call more advanced internet
related terms so this was just--really just me deciding what to call them. It's not about--they
don't have to be classified in any particular way per se. And then again, we see that there's
a pretty quick drop in average level of understanding of some of these terms. I will add that the
2007 measures were actually quite consistent. So there's very--that was obviously a different
cohort but the measures in general for the averages tend to be very close over time.
So it's only at the hundredths decimal place that it changed over two years actually. And
to 2010, which is the same group, again the measures were quite consistent. So--because
sometimes people react and say, "Oh by next year they all know this," and no they don't.
Yes. Go ahead. >> [INDISTINCT] measure of a learning effect,
which--supposing there were a learning effect, you'd predict change with exposure.
>> HARGITTAI: No, there's not really. That's--that's why I'm saying that its--it pretty much
stays consistent. I mean, again, it goes up at the hundredths decimal level. There are a few
things that--really from 2007 to 2009 there were just a couple of things that went
up, tagging went up quite a bit and I think that's because of, I would say, Facebook and photo
tagging, in particular. Social bookmarking has definitely gone up as browsers--not social
bookmarking, sorry. [PAUSE]
>> HARGITTAI: Tab browsing. That went up since 2007 [INDISTINCT] I think because browsers
with tabs have really started to spread. But, unless there's that kind of a change, where
there's this one technology that's really taking off, you don't see much change in how
people rate their understanding. So, then as I said I'm really interested in how people's
backgrounds relate to their skills and uses. So let's look at just some [INDISTINCT] relationships.
So, what we see is that women tend to rate their skills lower than men and I'm going
to come back to the gender variable more in the talk because it's actually a very complicated
variable that I want to talk about a little bit more. But this is a--this is a statistically
significant difference. We see differences by race/ethnicity as to where people are in
terms of their average level of understanding. And we also see it by parental education,
again, the proxy for socioeconomic status. So taking it all together, what we have is
that, if you take the average Hispanic female student in the group whose parents have less
than a high school education, that's her score. And if you take the average male Asian student
in the group whose parents have a graduate degree that's his score. That's a pretty large
difference. I also have it later; after the talk, people can see the regression
results I have. Okay. So, what's exciting is that I--I've been talking to people about
the importance of measuring skill and that we need to get it on nationally representative
samples. I'm happy to say that in--in the fall of 2009, when the Federal Communications
Commission fielded a survey nationally, they did include some of these items on their survey.
So I'm able--and those data are made public--so I'm able to look at, "Okay, so we know,
as you saw, that there are these differences, but how does it look nationally?" So basically,
just quickly looking at their figures, so the findings I--I showed you are consistent.
So, in terms of gender and race, those data show the same relationships with the skill
measures. Then there are data I didn't have in my data set. Income, perhaps not surprisingly as another
measure of socioeconomic status, is positively related to skill. Education, which as you
might recall in my data set is constant--nationally, it's not--is also positively related to skill.
But what's interesting is age: if you look at just people 50 and under in that data set,
it is actually not related to skill. So, that's one of those assumptions people have that
the younger, the more highly skilled, the older, the less skilled. Over 50, it's true
that there we have a decline, but 50 and under there's no clear correlation between age and
skill. Okay. So I said that I would come back to the question of gender and I want to talk
about this just a little bit more. I keep coming back to it. So, in a paper that I published
a few years ago, we showed--and this was based on that study where you saw the--where I observed
a hundred people doing actual tasks. And one of our findings was that, if you predict
people's self-perceived skill while controlling for actual skill, gender is still significant.
So that shows that regardless of their skill, women are rating their skills lower than men.
So that's an issue. But one of the things we concluded is, whether it's actual or perceived,
this might still influence what people actually end up doing. So, we can't completely dismiss
it as, "Oh it's just in their heads. It's just a perception," because it might still
influence what women versus men do online. So it's just something to keep in mind but
again not to assume that there's something inherent per se about women, or that it's
a simple question. It's actually quite complicated. So I like to show this slide or this XKCD
comic because it shows again the social factors that influence how women might be thinking
about their skills versus men, that if we have situations where women in general
are seen to be worse, that could influence how people perceive their skills. Like
if we draw on those stereotype threat literatures from social psychology, for example. Okay.
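The regression logic described a moment ago--predicting self-perceived skill while controlling for actual skill, and finding that gender remains significant--can be sketched with simulated data (the coefficients and sample here are made up for illustration, not the study's results):

```python
# Illustrative simulation (made-up data, not the study's): regress
# self-perceived skill on actual skill plus a gender indicator. A
# nonzero gender coefficient means the perception gap persists even
# net of actual skill.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
female = rng.integers(0, 2, size=n)       # 1 = female (hypothetical coding)
actual = rng.normal(3.0, 1.0, size=n)     # actual skill, same distribution by gender
# perceived skill tracks actual skill, with a built-in -0.5 gap for women
perceived = actual - 0.5 * female + rng.normal(0.0, 0.3, size=n)

X = np.column_stack([np.ones(n), actual, female])  # intercept, actual, gender
coef, *_ = np.linalg.lstsq(X, perceived, rcond=None)
print(f"actual-skill coefficient: {coef[1]:.2f}")  # close to 1
print(f"gender coefficient: {coef[2]:.2f}")        # stays near -0.5
```

Because the gender coefficient stays near its built-in value even with actual skill in the model, the simulation mirrors the finding that the perception gap is not explained away by skill differences.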
So now moving on to what--what is it that people actually do online? So, just a few
slides on some of the sites that I asked about on the survey, because again there are so
many assumptions that young people do all these things. So these are the three sites
that many of the people in the sample actually used, but what's interesting is that there
are a lot of sites that are often associated with younger people that they actually don't really
use at all. So I think, yet again, it's a bit of a reality check. So, in particular,
comparing Facebook and MySpace, it's very important I think to highlight that there
is quite a bit of racial/ethnic difference in who selects into using these sites. So
this brings us back to that point I made earlier that you can't rely on one site to draw conclusions
about all people out there because people select to use these sites at different rates.
Another reason to be concerned about something like this: if you're trying to reach the general
population, whether it's an ad campaign or what not, you should be very careful not to
target just one site because if you, say, want to reach Hispanic people as well then Facebook
might not be the best place to do it. And I had already shown this relationship back
in 2007 from the 2007 data set. And, again, it was one of those things where back then
people said, "Oh, this is just timing. It will change very quickly," but in 2009 the differences
still held up. So, we can't just dismiss these differences as just
a very temporary blip. They tend to persist quite a bit. What was also interesting was
when--when I published the 2007 piece, I tried to get some places to cover it and the Chronicle
of Higher Education covered it because it was about students but other places said,
"Oh, well you're just talking about this group of students." What's interesting is that Nielsen
two years later had more representative data, so that got covered very widely, but
in fact they just confirmed my findings from two years prior. So I think it's helpful to
know that this data set is actually quite generalizable more so than it might seem at
first. Partly because it's such a nicely diverse sample. So, I thought I'd say a few words
about twitter in particular; I have some interesting findings there. So you might have noticed
that for 2009 the level of use was 4% and maybe you thought, "Well, surely a
year makes a difference," because 2010 was really the year it took off. So, what is the rate
in 2010? Well, it did go up quite a bit, for sure. But let's recognize that it's still
less than a fifth of the students who were using it. To be fair, all I have data on is
whether people use it. I don't know if they're using it to tweet or just to follow
people. But it's still interesting. Now, I have a couple of quotes here from a different
study. This was focus groups with Northwestern seniors, actually. These again show that just
because you're young you don't necessarily get it. So, I think these are just interesting
[INDISTINCT] instead of quotes. But quantity--back to the quantitative data, so I'm able to draw
on--well, okay, so one of the things that others have shown as well is that there's
quite a bit of variation in twitter use by race and ethnicity, where we have African-Americans
much more likely to use it than anybody else. So what might be going on? And this is where
longitudinal data actually come in really handy. And so in a paper that I've written
with my student Eden Litt, we looked at the following. We looked at the outcome, twitter use
in 2010, as predicted by various factors about you in 2009. So not surprisingly, African-Americans
are more likely to tweet. Yes, we've--I showed that previously. If we look at skill--your
skill in 2009, that's positively related to adopting twitter in 2010, but here's--here's
the really interesting thing. So we have data for 2009 about different topics that you're
interested in, and it turns out that this is actually what's driving adoption, insofar
as it gets rid of the relationship with being African-American. So if we put in information about being interested
in entertainment and celebrity news in 2009, then being black is no longer associated with
being a twitter adopter. So it's really that it looks like African-Americans on average
seem to be more interested in entertainment news than whites and that's what's driving
it. This is the first study I know that has actually managed to explain some of this racial/ethnic
difference in site adoption, so I'm pretty excited about it. What's also interesting,
not sure what to make of this, is that if you were interested in science and research
in 2009, you're actually less likely to use twitter in 2010. And I think also interesting
is that interest in technology and in politics are in no way related to twitter adoption
either. I think especially the politics one is interesting because there's so much rhetoric
about the potential of twitter to get people excited about news and politics and get people
participating, but it doesn't look like that's actually driving why people would adopt
it. Obviously there's room for more research here where, for example, we'd have more
specific data on what people are actually doing on twitter, right? Are they tweeting or
are they just following? How are they choosing? That would be nice to have. So now, what I
call the participation gap, which is the idea that you're actually contributing content to the
web yourself. So you're making your voices heard yourself. You're potentially influencing
public opinion yourself. So this is one--another one of those areas that's supposed to be really
exciting about the web. So here are some questions that I asked on both the 2009 and 2010
surveys, but I'll just be focusing on the 2009 data. But I checked the numbers and they're
really similar for 2010. So here are some activities I asked about, whether people engage in them.
And here are the numbers. So, I think it's helpful to know that none of them are things
that at least half the people do. So, again, some people will engage in them but not everybody.
Now, you might be wondering how I listed these activities because they're not, obviously
they're not in order of popularity. So I'd like you to take a look at the list and think
a little bit about how I might have picked this listing because it's not random, there's
a reason for this ordering. I wonder if anybody has any thoughts on that.
[PAUSE] >> HARGITTAI: I'm sorry. From difficult to
easy or from easy to difficult? Something like that. So, like a difficulty level issue
potentially? Yes John? >> JOHN: [INDISTINCT] range of like how many
people we have [INDISTINCT] >> HARGITTAI: Okay. That's another possibility.
So the--you might be thinking I'm saying "yes" maybe because those aren't the right answers,
but actually they may be the right answers. It's not how I chose to list them, but as
soon as I tell you the reason for this listing you'll see that it might actually relate to
those factors. So, I chose this listing based on the level of difference by gender in engaging
in these activities. So, just to clarify, the top activity is probably about Facebook
quizzes, right? So let's not think about really complicated quizzes that you're putting out
there for students to take. I mean, it's more those playful things. But what's
interesting--so creating quizzes--it seems to be very similar by gender. But then with uploading
video there's quite a bit of difference, and if you go down to changing or adding a
Wikipedia entry, the gender difference is--is enormous there. So, while I listed this by
gender difference, it may be that those things you mentioned, whether it's difficulty level
or reach are still related, right? But it's interesting to note that this is what we get
reported by gender. The only thing in the 2009 data set that women reported doing
more than men was changing the privacy settings on their Facebook accounts, which is probably
not too surprising. By 2010, it turns out that it's so universal that there's no variation
because pretty much everyone has done it. So, then I was interested in seeing whether
we see differences by other factors as well. So I created a summary variable. Have you
engaged in these--how many of these activities have you engaged in? And again it's a very
simple measure, right? It's just ever. So not like how many times, [INDISTINCT],
or anything like that; just have you ever done it? And so, I think not surprisingly
from the figures I already showed you, there's a significant gender difference. But there's
also significant variation by race and ethnicity and how many of these things people tend to
do. And also variation by parental education, so the proxy for socioeconomic status. And
again, bringing all those variables together, what you have is: if you take the average female
Hispanic student whose parents have less than a high school education, she will have done
one of those activities. If you take the average white male
who has parents with a graduate degree, he will have engaged in three of those activities.
That's a huge difference on a five-point scale. And again, I have the regressions if anyone's
interested. So, now another paper that I'm not talking about but is somewhat similar.
We looked at who shares content online, so there it was about videos and photos and fiction.
And what we found was that men shared more than women. However, in a regression model
when we controlled for skill, the skill measure I've been showing you, that gender effect
went away. So, in fact if you have a man and a woman of similar skill level, whether that's
actual or perceived, they will share similarly. So again, I'm pointing out the importance
of that skill measure. I thought I'd show you a little bit from the 2010 data because
it's more detailed about YouTube in particular. And so we see that pretty much everybody
has watched a video on YouTube; I think that's helpful partly just to verify that the data
are right. But, again, note the difference in what it is exactly that people are doing
on YouTube and the variation, especially by gender. And again, even in terms of posting
a video, there's a difference between black students and other students in the sample, so just pointing out
that background seems to matter in different ways. And so to assume, sort of, a universal
internet for all is an incorrect assumption. I wanted to say just a few words about Wikipedia
as well. Now, I'm going to shift a little bit to talk about some more recent observational
work that we've done partly on the UIC students but also on a hundred Northwestern students
a couple of years ago. And I'll just take another few minutes, I'll wrap up.
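The summary engagement measure described a moment ago is simple enough to sketch in code. A rough illustration follows; the activity names, the two toy respondents, and the helper functions are all hypothetical stand-ins, not the actual survey items or data:

```python
# Rough sketch of the summary engagement measure described in the talk:
# count how many of five activities a respondent has EVER done (0-5),
# then compare group averages. The activity names and the two toy
# respondents below are purely illustrative, not the actual survey data.

ACTIVITIES = ["posted_video", "edited_wikipedia", "wrote_review",
              "shared_photo", "posted_comment"]  # hypothetical five items

respondents = [
    {"gender": "female", "posted_video": 0, "edited_wikipedia": 0,
     "wrote_review": 1, "shared_photo": 0, "posted_comment": 0},
    {"gender": "male", "posted_video": 1, "edited_wikipedia": 1,
     "wrote_review": 0, "shared_photo": 1, "posted_comment": 0},
]

def engagement_index(person):
    """Ever-done count across the five activities: an integer 0..5."""
    return sum(person[a] for a in ACTIVITIES)

def group_mean(people, key, value):
    """Average engagement index among respondents where person[key] == value."""
    scores = [engagement_index(p) for p in people if p[key] == value]
    return sum(scores) / len(scores)

print(group_mean(respondents, "gender", "female"))  # 1.0
print(group_mean(respondents, "gender", "male"))    # 3.0
```

In the talk's actual analysis, this kind of index is then put into regressions on gender, race and ethnicity, and parental education, which is where the roughly one-versus-three gap comes from.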
[PAUSE] >> HARGITTAI: In terms of understanding Wikipedia,
there's quite a bit of variation. So this is another thing where we know that
students use Wikipedia but we don't necessarily know if they really understand it or not.
And so it's interesting to note these quotes that I think signal that people have different
levels of understanding. And again this is one of those things that I'm trying to form
into survey questions. So, in fact in the 2010 data, we have a question where we tried
to get at whether people understand Wikipedia or not. And it was interesting to note that
almost a third believed that there were official editors responsible for what content is
on there, which has implications for how people understand the content that's on there and
how they approach it. In another survey question, again on the 2010 survey, we also asked about
more specifically what people are doing on Wikipedia and again pretty much everyone goes
on there to read content but huge, huge variation in who's actually editing the material, which
again goes back to this issue of whose voices are really being heard out there.
[PAUSE] >> HARGITTAI: Just finally, I had to put this
in here since I'm giving a talk at Google. I mean, I've talked about this in other
talks, but there I focus more on the information-seeking and search process.
But I think this is another area where there are skill issues at hand. So, in a paper that
we published last year, we looked at how people decide whether they trust the content they
encounter. Traditionally, a lot of work on credibility looks at site features:
what the layout's like, who's the source. But what we found, which this quote illustrates
very succinctly, is that basically: Google ranked it number one, so of course it's trustworthy.
That was interesting to note because, as we know, Google doesn't rank sites per se based
on credibility. There are all sorts of factors that go into how Google ranks sites, and
while one would hope that in that process credibility comes into play, we know of examples
where that's absolutely not the case, where the number one hits are very problematic.
So, this is a cause for some concern. Just finally, a few words about mobile, because
you might be thinking, you know, "She hasn't talked about mobile at all, and this is
the mobile generation." So, it is. I mean, everyone's very wired when it comes to owning
a cellphone for sure. However, again, not in every way. It turns out that in 2009, the
majority of the students in this sample had not accessed the web on their mobile phones.
And using regression we were able to show that in fact it's the students who already
accessed the web more anyway, who have more autonomy, who have better skills, who are
more likely to have access to the web on their mobile phones, in the first place. So again,
it's a concern about whether we're recreating traditional divides if we just assume,
"Oh well, they're accessing it on their phones anyway and everyone's doing it," when it's
not in fact the case that everyone's doing that. So why is it helpful to focus on skill?
I've shown you that it matters, or hopefully you're convinced by the data that I've shown
you that it matters. Well, other things matter, too. I've shown you that user background
matters. Those are not things that we can really affect or change. Not that some of
those things don't need fixing, but, you know, it's very hard to change people's level
of education at some level, or certainly their financial status, economic status. That's hard
to change. However, skill is something that is much more amenable to intervention, right?
So that's something where we could really intervene to help people get better skilled.
Obviously, there's the other side of it, which is that information should be supplied to
users in a more friendly way and whatnot, and there's obviously lots of progress
and innovation being made there. But at the same time, people are using these sites and
services, and we see how they differ in understanding them. So I think it's reasonable to suggest
that we try to intervene and improve people's skills. So, that's the component of the story
that skill matters, and I think we need to be working on that. So one way to do that:
partly, we need more data on how people do these searches, and I know there's
obviously work being done here, and I know Dan thinks about that a lot, too. Another
is interventions, to see what the ways are. So even if we agree, sure, let's
try to improve people's skills, we don't really know yet what's the optimal way to do
that. How do we do that? Is it the same across populations, given what I've shown in terms
of how different types of people use the internet in the first place? Maybe we need different
interventions based on type of person. Who knows? So that's a question that remains.
And then another issue, which I haven't really addressed because we simply lack the data
to really get at it, is this final loop back to: what are the life
outcome implications of these online uses? And here, what's important is to look at,
"Well, how do your skill and what you do online relate to all these different types
of outcomes?" So, to wrap up, you might recall that in 2006, Time came out with Person of
the Year as "You," because the idea was that everybody was contributing. But hopefully
what you'll take away from this talk is that, in fact, it's really just some of you,
more precisely some of us in this room. So let's not think of it as this universal happy
story in which everyone is benefiting and taking advantage of all this, or is able to,
partly because they lack the skills. So thank you, and I welcome questions.
>> So do they have a your teammates professor [INDISTINCT].
>> That's a great talk. I'm just curious if you repeated any of these measures
or surveys at Northwestern [INDISTINCT] sort of similar school populations. [INDISTINCT]
>> HARGITTAI: So the question is whether I've repeated these surveys at Northwestern
or other schools to see if these differences exist? And yes. So, to an extent--not at
the same scale. We did observations of over a hundred Northwestern students at a similar
time to when we did the observations of over a hundred UIC students. And we surveyed
those 109 Northwestern students. There are slight differences but not huge differences.
Some of the factors are hard to track at Northwestern. Well, actually, I didn't
point out that Northwestern is not quite as diverse as UIC, but you could probably imagine
that. So students don't differ as much in terms of their parental
education, for example. So I can't really look at the predictors there. But in terms
of just the core numbers, they're a little bit higher on average, but they still differ
quite a bit, so there's still variation. Did you have your hand up?
>> [INDISTINCT] word [INDISTINCT] at yet you're saying that play some [INDISTINCT] and do
you think that they're slow [INDISTINCT] >> HARGITTAI: So the question is, when I was
showing the numbers that 20% of students had edited, whether I thought that was low or
high. >> Because people's use of encyclopedias [INDISTINCT]
>> HARGITTAI: Right. Right. So it is definitely important to recognize that, compared to--so
first of all, I'd like to point out that those numbers are true for the men and not for the
women. So, ten old white guys, ten young white guys--that's simplified, but the story
is actually not that different. Now, obviously more people are contributing to the content
today than they were 20 years ago, for sure. But part of the point of showing this data
is that there's still difference in who those types of people are, who are contributing.
And I think, and part of, part of the issue here in pointing out these differences is
also, that while things may look better it's important to remember that there are these
differences because if we just walk away assuming that it's all great, then we might really be
leaving those people behind. Like, before we knew that it was just those few people.
And now if we assume, "Oh, it's everybody," and it's not, then that's a really wrong perception
of what's actually going on. I don't deny though that it is more democratic, for sure.
I just want to make sure that we remain conscious of the trends that we see and who is actually
contributing and who isn't. [PAUSE]
>> [INDISTINCT] Have you looked at anonymity online, and in terms of gender differences
there? And whether they're contributing [INDISTINCT] and then [INDISTINCT] where there [INDISTINCT]
like [INDISTINCT] more, really identifying yourself.
>> HARGITTAI: So the question is whether I've looked at to what extent anonymous
contributions might influence--well, you didn't say that, but that's what I'm assuming you're
asking--might influence to what extent people are contributing? And I think it's
a really great question, and it's something I've thought about, because I do think that
some of the reactions that people get to content may make them contribute more or less,
so to speak. So I don't know to what extent people have looked at this for online content.
I'll give you an example from academic research where, as far as I know, there's some evidence
that shows, for example, that in, you know, the journal review process, peer review, you
get lots of critical comments. I think there is research that has shown that female
academics are more likely, when they get a rejection like that, to internalize the critiques
and say, "Oh, wow. I did poorly. I'm not even going to send this out," whereas, on
average, the male scholar would say, "Oh, these reviewers have no idea what they're talking
about. Send it back out, turn it around." And so I think there is a relevant parallel
here, because it's the question of how people perceive critical comments they
get. And so the way I'm linking it back here is: let's say you make a comment
on YouTube and people jump on you and say, "Oh, you're so stupid, what are you doing?"
Or you post a video and it gets all negative comments. This would suggest
that gender might predict how you react to it and then that might influence whether you
come back to do more of this activity. So, I mean, I don't want to say that that's what
you were suggesting but it seems like, that's one thing that could be going on here.
>> Did you ask on your surveys whether [INDISTINCT] contributed to Wikipedia and what [INDISTINCT]
but rather things they were doing today? >> HARGITTAI: Oh, so do I have data on it?
Sorry. Right, so the question is, do I have data on whether people are contributing on
[INDISTINCT] or not. I'm afraid I do not have data on that. I did ask a question--so
I've been trying to get at this because it is something I've been curious about for the
reasons I just mentioned. And I had a question about the extent to which you've had
bad experiences with just aggressive behavior, that kind of stuff. It didn't
really show a lot; I mean, there's definitely some gender variation there,
but not a lot. So, I haven't quite been able to get at that. I do have, however,
some data that I haven't looked at closely in that way yet, where I have data on sharing
photos, videos, writing, images that you created either just privately or publicly. So it's
not about whether you're posting anonymously but it's about whether you're just sharing
it with people presumably you know because you're doing it privately versus publicly
and that could get at that a little bit as well. Just haven't looked at it yet. Yes.
>> So what's the best example of the most diverse input knowledge base?
>> HARGITTAI: What's the best example of the most diverse input for...?
>> That's something unexcused [INDISTINCT]. >> HARGITTAI: I think it would be neat if--oh
sorry. So the--what is the best example of contributions to a type of site or service?
What am I shooting for? I guess what would I like to see? I think it would be great if
we didn't see that your background matters, right? That it didn't matter whether you were
female or male, or whether you were Hispanic or white, or whether your parents have a graduate
degree or just a high school degree. That regardless of that, people are contributing.
And it seems that skill is mediating this, so one way to achieve it is to make sure that
people are better skilled and understand sites better. So, one of the ways
people sometimes approach this is to say, "Well, some people just don't want to contribute."
Well, maybe. But my point is that, some people might not know how to contribute and that's
why they're not doing it. Yes? >> So you're [INDISTINCT] something else as
well, and so I've [INDISTINCT] somewhere, though I can't be sure, that women create more complete
profiles [INDISTINCT] sites where they [INDISTINCT]. I don't remember if that's completely correct, but I
remember reading it at some point. But so I guess that made me think of, in terms of
like content creation on the web, could you in some ways define your concept through
like basically how easy it is, or something else [INDISTINCT] back on it, you know, like
you can post a video and people can comment on it, but nobody can [INDISTINCT] comment
on your [INDISTINCT] >> HARGITTAI: So the question is whether
I've looked at content creation in terms of how easy it is to comment or respond to the
content that was created, and the example given was that apparently
there's some data to show that women create more complete profiles on
Facebook. I'm not familiar with that research. I don't know if it's about more of a safe
space, because on Facebook it's, again, the people you know, so it's what we would consider
more private space. And recall that creating quizzes was actually quite equal between men
and women, and again, I think that's pretty much Facebook quizzes. So I don't know if
it's about that, but it's an interesting question, although I feel like with the things we
had here, I mean, on Wikipedia, sure, someone could erase your change. So that's a way of
commenting on it for sure. Yes. I'll think about it more. But I think it may partly be
related to the whole private/public nature of it. I will say also, in a completely different
paper--well, drawing on the same data set, the 2007 data set--we looked
at how people use social network sites differently. One of the things we showed was that men were
more likely to make new connections, whereas women were more likely to engage in activities
that solidify their existing connections. So, there are some differences, and again,
you might say there, too, that women are solidifying connections they already have.
So there isn't that fear of "who might this be?" or "I don't know who this is," whereas men, perhaps--I
don't know if some of it is the fear that we talk about, of what may be lurking outside in the unknown,
and that then ends up scaring people away. And certainly the way the rhetoric is framed
around those issues makes it more of a concern, I think, often for women, just the way we hear
about it. So, that's something to consider, too, where I would again argue that if we
have more skilled people, then hopefully they'd know how they can be safe and not
alter their behavior per se, or alter it in a way that still leaves them engaged.
>> Time for one more question? >> HARGITTAI: No pressure. One last question?
Yes? >> You mentioned the perception and skill
gap between men and women, and that for online skills, women rate themselves lower. I guess
I wondered whether you control for variables like amount of experience online
and interest online. I'm wondering if there's something about [INDISTINCT].
>> HARGITTAI: Okay, so the question is whether I look at things like experience with the
internet and interest to explain the gender variation, partly because historically there are
stereotypes associated with who goes into what areas. So, in the regressions I always
control for things like number of years you've been online, how much time you spend online.
Things like that, I do. And the gender effect is pretty much robust to all that. I do think
that there's some stereotype issue going on when it comes to women's self-perceptions,
right? So this issue of "well, technology is not women's
domain"--it's the stereotype threat literature that I've mentioned that
would also be related to it. So this then concerns how we think about this thing.
I think it's interesting because often we talk about the internet as a very social
space, and not so much a technology per se, so it's curious that it translates to this
domain. I will say, as a last note on that point, that a literature review we did as we
wrote up that piece showed that in pretty much every domain that has been surveyed--
and this is not necessarily technology related--women always rate their skills lower than
men. So that's completely consistent; there's just an issue going on there.
>> Join me in thanking the professor.