English subtitles for clip: File:Brown bag - Panthea from Reboot.webm

1
00:00:00,000 --> 00:00:00,500


2
00:00:00,500 --> 00:00:01,710
PRESENTER: So welcome.

3
00:00:01,710 --> 00:00:04,500
Good morning, good afternoon,
or evening, depending

4
00:00:04,500 --> 00:00:07,720
on where you are joining us.

5
00:00:07,720 --> 00:00:09,360
Thank you, Anna.

6
00:00:09,360 --> 00:00:14,430
So today, I'm really, really
happy and pleased to be here

7
00:00:14,430 --> 00:00:16,650
with Panthea Lee.

8
00:00:16,650 --> 00:00:21,540
And Panthea is working with
us in the movement strategy

9
00:00:21,540 --> 00:00:22,680
research.

10
00:00:22,680 --> 00:00:25,020
And she's going to
tell us a lot more.

11
00:00:25,020 --> 00:00:27,610
Panthea is the
principal of Reboot,

12
00:00:27,610 --> 00:00:29,640
which is a design
research firm that

13
00:00:29,640 --> 00:00:34,620
has been partnering with us for
now over two years, I think?

14
00:00:34,620 --> 00:00:35,270
Over a year.

15
00:00:35,270 --> 00:00:36,090
Wow.

16
00:00:36,090 --> 00:00:37,670
PANTHEA LEE: We've done
a lot of work in a year.

17
00:00:37,670 --> 00:00:38,295
PRESENTER: Yes.

18
00:00:38,295 --> 00:00:40,930
We're really
productive together.

19
00:00:40,930 --> 00:00:45,840
And I also want to mention
that Zack is here with her.

20
00:00:45,840 --> 00:00:48,720
He's also principal at Reboot.

21
00:00:48,720 --> 00:00:51,750
And we hope to be
learning a lot from them.

22
00:00:51,750 --> 00:00:54,660
And before Panthea
starts talking,

23
00:00:54,660 --> 00:00:57,300
I just wanted to make
sure that you all

24
00:00:57,300 --> 00:00:59,950
feel free to ask
questions and interrupt

25
00:00:59,950 --> 00:01:02,540
and interact with
us as you're hearing

26
00:01:02,540 --> 00:01:04,440
the many interesting,
fascinating things

27
00:01:04,440 --> 00:01:06,370
that she's going to
share with us today.

28
00:01:06,370 --> 00:01:07,710
OK?

29
00:01:07,710 --> 00:01:12,779
So to start, tell us a
bit more about Reboot.

30
00:01:12,779 --> 00:01:14,320
PANTHEA LEE: So
thank you for coming.

31
00:01:14,320 --> 00:01:21,830


32
00:01:21,830 --> 00:01:22,770
Hello?

33
00:01:22,770 --> 00:01:24,270
Great.

34
00:01:24,270 --> 00:01:27,550
So Reboot is a design
firm that works

35
00:01:27,550 --> 00:01:30,700
with mission-driven
organizations to design

36
00:01:30,700 --> 00:01:33,490
policies and services
and products that

37
00:01:33,490 --> 00:01:35,860
help them better meet
the needs of their users

38
00:01:35,860 --> 00:01:37,270
and of their constituents.

39
00:01:37,270 --> 00:01:40,030
So that means we work
with anyone from the UN

40
00:01:40,030 --> 00:01:43,920
to the city of New York
to grassroots activist

41
00:01:43,920 --> 00:01:45,610
collectives.

42
00:01:45,610 --> 00:01:47,836
And we were founded
on the belief

43
00:01:47,836 --> 00:01:49,210
that people should
have a greater

44
00:01:49,210 --> 00:01:52,390
say in the policies
and the products

45
00:01:52,390 --> 00:01:54,610
that impact their lives.

46
00:01:54,610 --> 00:01:57,940
And we do that by bringing
in a lot of the practices

47
00:01:57,940 --> 00:02:00,580
that the private sector uses to
design things people actually

48
00:02:00,580 --> 00:02:01,750
want and need.

49
00:02:01,750 --> 00:02:04,150
And we try and translate
that to the public sector,

50
00:02:04,150 --> 00:02:06,580
and to the social sector,
where there's not always

51
00:02:06,580 --> 00:02:10,354
the same incentives or the
same accountability mechanisms.

52
00:02:10,354 --> 00:02:11,770
And so what that
means in practice

53
00:02:11,770 --> 00:02:17,100
is we've done everything from
helping the government of Libya

54
00:02:17,100 --> 00:02:21,310
post-revolution design the
world's first mobile voter

55
00:02:21,310 --> 00:02:23,410
registration and
elections management

56
00:02:23,410 --> 00:02:28,060
system to working with New
York City on criminal justice

57
00:02:28,060 --> 00:02:29,120
reform.

58
00:02:29,120 --> 00:02:31,420
We've been working
specifically on bail reform

59
00:02:31,420 --> 00:02:34,220
and helping immigrants better
understand their rights.

60
00:02:34,220 --> 00:02:35,440
So, a range of things.

61
00:02:35,440 --> 00:02:36,280
PRESENTER: Yes.

62
00:02:36,280 --> 00:02:36,880
Fascinating.

63
00:02:36,880 --> 00:02:37,850
Fascinating.

64
00:02:37,850 --> 00:02:41,600
And how about the work with the
Wikimedia Foundation, right?

65
00:02:41,600 --> 00:02:45,060
Like, how has Reboot been
working with the Wikimedia

66
00:02:45,060 --> 00:02:45,835
Foundation?

67
00:02:45,835 --> 00:02:46,585
PANTHEA LEE: Yeah.

68
00:02:46,585 --> 00:02:49,750
So, I think our work
together started about just

69
00:02:49,750 --> 00:02:51,530
over a year ago.

70
00:02:51,530 --> 00:02:53,710
A lot of Reboot's
work is global.

71
00:02:53,710 --> 00:02:57,250
And so I think it was at a
point where the foundation was

72
00:02:57,250 --> 00:03:00,267
thinking about investing more
deeply into understanding

73
00:03:00,267 --> 00:03:02,350
the really diverse communities
that you guys serve

74
00:03:02,350 --> 00:03:03,320
all over the world.

75
00:03:03,320 --> 00:03:07,150
And so it really started with
this New Readers project,

76
00:03:07,150 --> 00:03:10,810
where we were helping you
all design and conduct

77
00:03:10,810 --> 00:03:14,170
design research in
Nigeria and India

78
00:03:14,170 --> 00:03:17,875
to try and inform strategies
from communications to product

79
00:03:17,875 --> 00:03:19,589
to partnerships, whatnot.

80
00:03:19,589 --> 00:03:21,880
And I know that's led to some
really interesting things

81
00:03:21,880 --> 00:03:25,090
that Jack and Zack are
doing around communications,

82
00:03:25,090 --> 00:03:28,655
around sort of offline support
that Ann's leading, whatnot.

83
00:03:28,655 --> 00:03:29,780
So that's where it started.

84
00:03:29,780 --> 00:03:35,620
And then we did some work around
mapping your various audiences,

85
00:03:35,620 --> 00:03:37,060
developing an
audience framework,

86
00:03:37,060 --> 00:03:39,840
and thinking about how
to prioritize engagement

87
00:03:39,840 --> 00:03:43,720
and investment in research
on different audiences, which

88
00:03:43,720 --> 00:03:46,990
led to some work around
conducting design research

89
00:03:46,990 --> 00:03:48,785
with editors, actually.

90
00:03:48,785 --> 00:03:52,450


91
00:03:52,450 --> 00:03:53,650
With editors?

92
00:03:53,650 --> 00:03:55,990
Great.

93
00:03:55,990 --> 00:04:01,180
And so we've been working with
the editing team in mid-sized

94
00:04:01,180 --> 00:04:05,440
Wikis in South Korea, in the
Czech Republic to understand

95
00:04:05,440 --> 00:04:08,080
the editor experience,
and where all you are--

96
00:04:08,080 --> 00:04:11,170
where you guys are gaining
people and losing people,

97
00:04:11,170 --> 00:04:13,780
and learning from lateral
communities, as well.

98
00:04:13,780 --> 00:04:15,280
And then sort of
coming full circle,

99
00:04:15,280 --> 00:04:19,270
we've been obviously doing work
around movement strategy, Track

100
00:04:19,270 --> 00:04:23,260
D, now called New
Voices, research

101
00:04:23,260 --> 00:04:28,160
in Brazil and Indonesia
just to understand how

102
00:04:28,160 --> 00:04:29,577
are people getting information.

103
00:04:29,577 --> 00:04:31,910
What are the trends that the
foundation and the movement

104
00:04:31,910 --> 00:04:35,510
need to be aware of looking
out to the next 15 years.

105
00:04:35,510 --> 00:04:36,940
How do people get online.

106
00:04:36,940 --> 00:04:40,160
How do they come to seek and
trust information online.

107
00:04:40,160 --> 00:04:42,645
What do they know about
Wikipedia, and do they care.

108
00:04:42,645 --> 00:04:45,020
So some of those questions
that we've been batting around

109
00:04:45,020 --> 00:04:48,160
to inform strategy process.

110
00:04:48,160 --> 00:04:49,790
PRESENTER: And why
design research?

111
00:04:49,790 --> 00:04:53,750
Like, how design research
compares to a more analytics,

112
00:04:53,750 --> 00:04:54,670
data-driven approach?

113
00:04:54,670 --> 00:04:57,670


114
00:04:57,670 --> 00:05:00,650
PANTHEA LEE: You know,
design research is

115
00:05:00,650 --> 00:05:05,390
at the heart of a lot of what
Reboot does because we believe

116
00:05:05,390 --> 00:05:08,960
that fundamentally
understanding human experiences,

117
00:05:08,960 --> 00:05:11,510
human behaviors,
mental models is

118
00:05:11,510 --> 00:05:14,360
critical to designing
things that people actually

119
00:05:14,360 --> 00:05:16,810
want and need and will use.

120
00:05:16,810 --> 00:05:18,620
And so what that
means is, instead

121
00:05:18,620 --> 00:05:20,870
of just cold hard
logic, looking at sort

122
00:05:20,870 --> 00:05:23,190
of market analytics,
which are important,

123
00:05:23,190 --> 00:05:26,960
we also use empathy as a
primary processing tool.

124
00:05:26,960 --> 00:05:29,990
And it's been really cool to
have foundation staff with us

125
00:05:29,990 --> 00:05:32,060
on all of these
research projects

126
00:05:32,060 --> 00:05:34,640
to understand, OK, once we talk
to all these different users

127
00:05:34,640 --> 00:05:36,800
and potential
users of Wikipedia,

128
00:05:36,800 --> 00:05:38,780
can we walk in their shoes.

129
00:05:38,780 --> 00:05:40,455
Can we understand
what they care about,

130
00:05:40,455 --> 00:05:42,830
how they feel about things,
how they get online, what the

131
00:05:42,830 --> 00:05:44,720
challenges that they have are.

132
00:05:44,720 --> 00:05:47,180
Because if we can understand
that and put ourselves

133
00:05:47,180 --> 00:05:50,060
in their shoes, that will
make us better strategists.

134
00:05:50,060 --> 00:05:52,670
That will make us
better designers.

135
00:05:52,670 --> 00:05:56,480
And I think it's actually-- we
do a lot of generative design

136
00:05:56,480 --> 00:05:58,970
research, which means
we don't necessarily

137
00:05:58,970 --> 00:06:01,100
know what the solution
or answer will

138
00:06:01,100 --> 00:06:03,800
be at the very start
of the process,

139
00:06:03,800 --> 00:06:05,300
compared to some
of the more I think

140
00:06:05,300 --> 00:06:09,230
evaluative work that the
foundation currently does,

141
00:06:09,230 --> 00:06:11,120
which is also very important.

142
00:06:11,120 --> 00:06:14,180
I know the product teams have
been doing a lot around sort

143
00:06:14,180 --> 00:06:16,460
of testing different sort
of products and approaches

144
00:06:16,460 --> 00:06:18,077
and iterating upon those.

145
00:06:18,077 --> 00:06:19,910
But I think some of the
more generative work

146
00:06:19,910 --> 00:06:22,700
here has been really valuable
for this sort of strategy

147
00:06:22,700 --> 00:06:25,894
process where we want
to think bigger picture.

148
00:06:25,894 --> 00:06:27,310
PRESENTER: And you
were thinking--

149
00:06:27,310 --> 00:06:29,630
we were just talking about
the movement strategy,

150
00:06:29,630 --> 00:06:33,800
and how ambitious
this big process is.

151
00:06:33,800 --> 00:06:37,580
And I think that everyone is
really interested and excited

152
00:06:37,580 --> 00:06:44,010
to hear what are the findings so
far from Brazil and Indonesia.

153
00:06:44,010 --> 00:06:46,220
PANTHEA LEE: We have a
25-page memo that you're all

154
00:06:46,220 --> 00:06:48,740
welcome to read.

155
00:06:48,740 --> 00:06:52,550
You know, I think we
have a lot of findings

156
00:06:52,550 --> 00:06:54,050
that we're still
processing and want

157
00:06:54,050 --> 00:06:56,150
to work with you all and
the movement strategy

158
00:06:56,150 --> 00:06:59,870
team over the coming few weeks.

159
00:06:59,870 --> 00:07:03,260
But I think one of the biggest
things that is coming out

160
00:07:03,260 --> 00:07:08,120
from the research is that
I think, looking forward,

161
00:07:08,120 --> 00:07:10,760
Wikipedia and the
Wikimedia movement

162
00:07:10,760 --> 00:07:15,440
is going to have to think
about how Wikipedia is not just

163
00:07:15,440 --> 00:07:19,680
a destination for knowledge
and for information,

164
00:07:19,680 --> 00:07:22,370
but how it can become
a source of knowledge

165
00:07:22,370 --> 00:07:25,370
and a source of
information in all

166
00:07:25,370 --> 00:07:28,370
the different and diverse
ways that people learn.

167
00:07:28,370 --> 00:07:30,980
And what I mean by
that is, you know,

168
00:07:30,980 --> 00:07:33,590
the internet was a
very different place

169
00:07:33,590 --> 00:07:35,930
when Wikipedia first started.

170
00:07:35,930 --> 00:07:37,430
What we're seeing
now and what we're

171
00:07:37,430 --> 00:07:39,920
finding through the
research is people

172
00:07:39,920 --> 00:07:41,840
are learning and
getting information

173
00:07:41,840 --> 00:07:45,680
in all these really diverse
and fascinating ways,

174
00:07:45,680 --> 00:07:50,630
from people learning to cook and
groom their eyebrows on YouTube

175
00:07:50,630 --> 00:07:53,420
to all the different
homework help communities,

176
00:07:53,420 --> 00:07:56,090
the Brainly.coms,
the other things.

177
00:07:56,090 --> 00:07:58,430
You know, and young
people especially

178
00:07:58,430 --> 00:08:00,890
are just finding really
creative and new ways

179
00:08:00,890 --> 00:08:04,250
to learn, to share information,
to get information.

180
00:08:04,250 --> 00:08:08,510
And they're not really
going to websites anymore.

181
00:08:08,510 --> 00:08:11,890
And so, you know, if we play
that out for Wikipedia, I

182
00:08:11,890 --> 00:08:15,980
think it's interesting to
think about how the site

183
00:08:15,980 --> 00:08:20,390
and how the platform and all the
knowledge that you guys have,

184
00:08:20,390 --> 00:08:24,440
how to make it more modular,
how to make it more portable,

185
00:08:24,440 --> 00:08:28,130
to be able to take the resource
that you all have built

186
00:08:28,130 --> 00:08:31,940
and feed it into all the diverse
and different channels in ways

187
00:08:31,940 --> 00:08:33,890
that people are learning.

188
00:08:33,890 --> 00:08:37,100
And I think that's going to
be something quite interesting

189
00:08:37,100 --> 00:08:40,520
because, you know, the
first 15 years was, I think,

190
00:08:40,520 --> 00:08:42,530
for you all really
about building

191
00:08:42,530 --> 00:08:48,050
this incredible resource,
and really thinking about how

192
00:08:48,050 --> 00:08:51,392
this production model works.

193
00:08:51,392 --> 00:08:53,600
The next 15 years might be
really thinking about, OK,

194
00:08:53,600 --> 00:08:55,975
so then once we have this
information, how do we actually

195
00:08:55,975 --> 00:08:57,620
get it out to people.

196
00:08:57,620 --> 00:08:59,919
What is the distribution model.

197
00:08:59,919 --> 00:09:01,460
And I think that's
going to be really

198
00:09:01,460 --> 00:09:05,735
interesting to look at how
Wikimedia innovates next.

199
00:09:05,735 --> 00:09:07,110
PRESENTER: And
some of-- and what

200
00:09:07,110 --> 00:09:08,800
would be some of
the key insights

201
00:09:08,800 --> 00:09:11,260
when you're thinking about,
like, Brazil and Indonesia

202
00:09:11,260 --> 00:09:12,880
specifically?

203
00:09:12,880 --> 00:09:15,460
Are there things that
stood out for you?

204
00:09:15,460 --> 00:09:18,540


205
00:09:18,540 --> 00:09:20,640
PANTHEA LEE: Something
that was really fascinating

206
00:09:20,640 --> 00:09:26,070
for us was looking at the
rise of messaging apps,

207
00:09:26,070 --> 00:09:31,410
and just how prevalent
and popular they are now.

208
00:09:31,410 --> 00:09:34,770
I think WhatsApp is installed
in something like 2/3

209
00:09:34,770 --> 00:09:36,330
of smartphones in Indonesia.

210
00:09:36,330 --> 00:09:42,690
And it grew 300%
between 2015 and 2016.

211
00:09:42,690 --> 00:09:45,480
And there's a lot
of factors driving

212
00:09:45,480 --> 00:09:49,080
that, one of them being that a
lot of telcos, mobile network

213
00:09:49,080 --> 00:09:52,230
operators, are offering
these apps for free.

214
00:09:52,230 --> 00:09:53,730
They're zero rating
them, or they're

215
00:09:53,730 --> 00:09:55,313
including them in
sort of data bundles

216
00:09:55,313 --> 00:09:57,400
and packages and whatnot.

217
00:09:57,400 --> 00:09:59,850
And so these are really--
you know, whether it's

218
00:09:59,850 --> 00:10:02,010
social networking
apps, messaging apps,

219
00:10:02,010 --> 00:10:04,970
they're really becoming
people's onramp to the internet.

220
00:10:04,970 --> 00:10:07,402
And for some people,
they are the only ways

221
00:10:07,402 --> 00:10:08,610
that they are getting online.

222
00:10:08,610 --> 00:10:11,070
And they're not even
thinking about using WhatsApp

223
00:10:11,070 --> 00:10:14,850
as being online, or
using the internet.

224
00:10:14,850 --> 00:10:17,520
And where cost is a
factor or where it's just,

225
00:10:17,520 --> 00:10:21,060
like, fun to chat with and share
links with your friends, what

226
00:10:21,060 --> 00:10:23,880
does that mean for Wikipedia?

227
00:10:23,880 --> 00:10:27,014
You know, how--

228
00:10:27,014 --> 00:10:28,680
And I think what's
interesting, as well,

229
00:10:28,680 --> 00:10:31,920
is people are also
using these messaging

230
00:10:31,920 --> 00:10:35,520
apps now not just to
chat and to share links.

231
00:10:35,520 --> 00:10:37,935
They're also forming
these-- what we

232
00:10:37,935 --> 00:10:41,520
were calling sort of
hyper-targeted social networks.

233
00:10:41,520 --> 00:10:43,560
Like, people don't want
to use Facebook anymore.

234
00:10:43,560 --> 00:10:46,600
They're saying, gosh,
Facebook is for old people.

235
00:10:46,600 --> 00:10:47,100
You know?

236
00:10:47,100 --> 00:10:51,960
We are-- I mean, I was
like, OK, I use Facebook.

237
00:10:51,960 --> 00:10:55,170
But so Facebook is for old
people, what we want to do now

238
00:10:55,170 --> 00:10:58,680
is we want to form our networks
for, when we go to an event,

239
00:10:58,680 --> 00:11:01,500
afterwards we have our community
that we form a WhatsApp group

240
00:11:01,500 --> 00:11:03,040
around that.

241
00:11:03,040 --> 00:11:05,970
I have a study group for
every single one of my classes

242
00:11:05,970 --> 00:11:07,530
at university.

243
00:11:07,530 --> 00:11:11,290
I want to form these
hyper-targeted social networks,

244
00:11:11,290 --> 00:11:12,000
and then--

245
00:11:12,000 --> 00:11:15,390
and my free messaging app
that telco is supporting

246
00:11:15,390 --> 00:11:17,160
is enabling me to do that.

247
00:11:17,160 --> 00:11:18,930
So what does that
mean for Wikipedia?

248
00:11:18,930 --> 00:11:21,540
How do you guys appear and
show up and help people

249
00:11:21,540 --> 00:11:24,660
in these study groups, and
appear organically in context?

250
00:11:24,660 --> 00:11:27,900
Is that in-app previews
of Wikipedia content?

251
00:11:27,900 --> 00:11:29,970
Is it some other ways
to sort of help people

252
00:11:29,970 --> 00:11:32,770
share information where
and how they're doing so?

253
00:11:32,770 --> 00:11:38,052
And I think that was
really fascinating to us.

254
00:11:38,052 --> 00:11:39,510
And then I know
you guys know this,

255
00:11:39,510 --> 00:11:43,500
but there's obviously a
lot of brand confusion

256
00:11:43,500 --> 00:11:44,790
that we surfaced.

257
00:11:44,790 --> 00:11:46,860
And I know it's
been talked about,

258
00:11:46,860 --> 00:11:49,630
and I know the comms
team is working on this.

259
00:11:49,630 --> 00:11:56,820
But Wikipedia has great
brand recognition,

260
00:11:56,820 --> 00:11:59,590
but a lot of brand confusion.

261
00:11:59,590 --> 00:12:02,520
And so what that means is people
think of you all as, you know,

262
00:12:02,520 --> 00:12:04,810
this is a technology giant.

263
00:12:04,810 --> 00:12:06,990
We know the name,
but we'll compare it

264
00:12:06,990 --> 00:12:09,284
to a Google or a Facebook.

265
00:12:09,284 --> 00:12:11,700
That's kind of crappy for you,
because you get judged then

266
00:12:11,700 --> 00:12:14,850
to be a pretty poor
search engine or a really

267
00:12:14,850 --> 00:12:17,130
confusing social network.

268
00:12:17,130 --> 00:12:20,760
And so, you know, how
do we help people really

269
00:12:20,760 --> 00:12:23,380
understand what Wikipedia
is, how it works,

270
00:12:23,380 --> 00:12:25,202
and why they should care.

271
00:12:25,202 --> 00:12:27,410
And those are some of the
things that are coming out.

272
00:12:27,410 --> 00:12:29,790
I'm happy to talk about any
of the other findings that

273
00:12:29,790 --> 00:12:30,450
surfaced.

274
00:12:30,450 --> 00:12:31,116
PRESENTER: Yeah.

275
00:12:31,116 --> 00:12:33,790
And I'm also interested, like,
in talking about the youth,

276
00:12:33,790 --> 00:12:34,290
right?

277
00:12:34,290 --> 00:12:36,900
Like, you mentioned
that, and what

278
00:12:36,900 --> 00:12:39,030
are some of the key
interesting things

279
00:12:39,030 --> 00:12:42,450
that you're seeing particularly
with that audience,

280
00:12:42,450 --> 00:12:45,300
and if there are things there
that we should be particularly

281
00:12:45,300 --> 00:12:49,460
paying attention to as we
think about our next 15

282
00:12:49,460 --> 00:12:52,310
years as a movement.

283
00:12:52,310 --> 00:12:54,680
PANTHEA LEE: That's
a big question.

284
00:12:54,680 --> 00:12:58,250
What are the kids doing?

285
00:12:58,250 --> 00:13:00,620
So I think one
thing that is really

286
00:13:00,620 --> 00:13:05,420
interesting about the youth
that we were looking at

287
00:13:05,420 --> 00:13:13,000
is trust in content--

288
00:13:13,000 --> 00:13:17,290
whether or not they trust
content doesn't really matter.

289
00:13:17,290 --> 00:13:18,920
Trust doesn't equal usage.

290
00:13:18,920 --> 00:13:21,310
Trust doesn't equal utility.

291
00:13:21,310 --> 00:13:26,800
You're seeing a generation that
is highly skeptical of media,

292
00:13:26,800 --> 00:13:31,870
of information sources, and
particularly of online content.

293
00:13:31,870 --> 00:13:34,990
And there's a lot of factors
that give rise to that.

294
00:13:34,990 --> 00:13:37,150
In Brazil and Indonesia
specifically, you

295
00:13:37,150 --> 00:13:40,870
have a long history
of government control

296
00:13:40,870 --> 00:13:42,280
of the media.

297
00:13:42,280 --> 00:13:46,120
You have highly
concentrated media ownership

298
00:13:46,120 --> 00:13:48,280
in both markets.

299
00:13:48,280 --> 00:13:52,420
You have the proliferation
of, you know, fake news,

300
00:13:52,420 --> 00:13:54,970
and people now use
sort of fake news

301
00:13:54,970 --> 00:13:58,180
as a term to describe
seemingly everything.

302
00:13:58,180 --> 00:14:01,810
And you have business models
that incentivize clickbaits

303
00:14:01,810 --> 00:14:03,267
and sensational content.

304
00:14:03,267 --> 00:14:05,350
So there's lots of reasons
that we can talk about.

305
00:14:05,350 --> 00:14:07,120
But ultimately,
what we're seeing

306
00:14:07,120 --> 00:14:11,560
is young people don't trust
the content that they have,

307
00:14:11,560 --> 00:14:16,270
or they know content is biased,
but they will use it anyway.

308
00:14:16,270 --> 00:14:19,570
They will then take that
content and discuss it

309
00:14:19,570 --> 00:14:22,526
with their friends to
try and triangulate

310
00:14:22,526 --> 00:14:24,400
between lots of different
sources to say, OK,

311
00:14:24,400 --> 00:14:28,540
you know, should I use
this, for what purposes,

312
00:14:28,540 --> 00:14:30,610
I know it's biased.

313
00:14:30,610 --> 00:14:32,830
And I think that's
quite interesting,

314
00:14:32,830 --> 00:14:36,730
because Wikipedia and Wikimedia
spend a lot of time thinking

315
00:14:36,730 --> 00:14:39,820
about trust and accuracy.

316
00:14:39,820 --> 00:14:43,110
And that's great,
but also how do

317
00:14:43,110 --> 00:14:45,120
we think about
relevance and utility,

318
00:14:45,120 --> 00:14:47,940
and what's actually going to
get people to use this content.

319
00:14:47,940 --> 00:14:50,490
And you know, one of the
things that we're seeing, too,

320
00:14:50,490 --> 00:14:53,160
around trust is that
the indicators of trust

321
00:14:53,160 --> 00:14:54,330
are changing.

322
00:14:54,330 --> 00:14:58,050
You know, young people are
not trusting institutions

323
00:14:58,050 --> 00:15:00,810
to give them credible,
verifiable content.

324
00:15:00,810 --> 00:15:03,480
They're trusting each other.

325
00:15:03,480 --> 00:15:07,620
And so they are looking for
indicators such as number

326
00:15:07,620 --> 00:15:12,277
of followers, number of likes on
articles, other sort of social,

327
00:15:12,277 --> 00:15:13,860
more individual
indicators-- you know,

328
00:15:13,860 --> 00:15:16,560
the reputation of
a content curator--

329
00:15:16,560 --> 00:15:18,330
to help them
determine what content

330
00:15:18,330 --> 00:15:20,624
they should trust and use.

331
00:15:20,624 --> 00:15:23,040
And I think there's really
interesting implications there,

332
00:15:23,040 --> 00:15:27,570
then, for Wikipedia, because
your process and your content

333
00:15:27,570 --> 00:15:29,610
is driven by individuals.

334
00:15:29,610 --> 00:15:33,360
And so is there a way of
showcasing that and surfacing

335
00:15:33,360 --> 00:15:37,380
that to help people understand
how the sausage is made,

336
00:15:37,380 --> 00:15:39,860
because they want to know
that to then understand

337
00:15:39,860 --> 00:15:42,184
whether or not they
should invest in and use

338
00:15:42,184 --> 00:15:42,975
a piece of content.

339
00:15:42,975 --> 00:15:46,030


340
00:15:46,030 --> 00:15:53,190
Yeah, and then maybe the
final point on young people

341
00:15:53,190 --> 00:15:59,140
is that, you know, I think we
hear visual, we hear real-time,

342
00:15:59,140 --> 00:16:02,620
we hear social, and
we know all that.

343
00:16:02,620 --> 00:16:04,320
But those aren't just buzzwords.

344
00:16:04,320 --> 00:16:08,340
I mean, kids, young people
are getting their news

345
00:16:08,340 --> 00:16:09,810
on Instagram.

346
00:16:09,810 --> 00:16:12,540
You know, instead of going to
the website of a newspaper,

347
00:16:12,540 --> 00:16:14,940
they are following the Instagram
account of a newspaper,

348
00:16:14,940 --> 00:16:16,380
because we're sitting there
and they're scrolling through

349
00:16:16,380 --> 00:16:18,120
like this is the exact
amount of content

350
00:16:18,120 --> 00:16:21,060
that I want on a
significant news story.

351
00:16:21,060 --> 00:16:22,560
Less than 100 words.

352
00:16:22,560 --> 00:16:23,100
That's it.

353
00:16:23,100 --> 00:16:24,540
You know, big photo, great.

354
00:16:24,540 --> 00:16:26,420
Scroll through.

355
00:16:26,420 --> 00:16:28,950
They're getting breaking news
by following trending topics

356
00:16:28,950 --> 00:16:30,010
on Twitter.

357
00:16:30,010 --> 00:16:31,860
They're then taking
that to WhatsApp

358
00:16:31,860 --> 00:16:34,430
to discuss with their friends.

359
00:16:34,430 --> 00:16:36,770
So what does that
mean for Wikipedia?

360
00:16:36,770 --> 00:16:42,020
You know, do we need
to think about using--

361
00:16:42,020 --> 00:16:46,230
allowing video as
references, for example.

362
00:16:46,230 --> 00:16:50,840
Do we think about, I
don't know, push alerts

363
00:16:50,840 --> 00:16:54,440
around articles that are getting
sort of rapid distributed

364
00:16:54,440 --> 00:16:56,800
and concentrated edits
so people-- you know,

365
00:16:56,800 --> 00:16:59,540
so Wikipedia can be seen
as relevant and real-time

366
00:16:59,540 --> 00:17:02,596
in the way that they want
their news and information.

367
00:17:02,596 --> 00:17:04,220
I think those are
interesting questions

368
00:17:04,220 --> 00:17:08,594
to explore and wrestle with.

369
00:17:08,594 --> 00:17:09,260
PRESENTER: Yeah.

370
00:17:09,260 --> 00:17:11,960
And I'm wondering
here, now that you just

371
00:17:11,960 --> 00:17:14,720
mentioned some of the possible
opportunities and things

372
00:17:14,720 --> 00:17:17,440
that we could be
exploring and considering,

373
00:17:17,440 --> 00:17:19,399
and how we take all that, right.

374
00:17:19,399 --> 00:17:20,940
We touched base on--
you touched base

375
00:17:20,940 --> 00:17:23,990
on some of the
bigger key findings

376
00:17:23,990 --> 00:17:27,440
how youth is relating
to content information,

377
00:17:27,440 --> 00:17:29,910
how they're accessing it.

378
00:17:29,910 --> 00:17:32,420
So making sense of
that and connecting

379
00:17:32,420 --> 00:17:37,940
that with the five thematic
directions of the strategy,

380
00:17:37,940 --> 00:17:39,500
like, how are you seeing those?

381
00:17:39,500 --> 00:17:41,420
How are you seeing
what you learn

382
00:17:41,420 --> 00:17:46,170
and saw from Indonesia and
Brazil relating to the themes?

383
00:17:46,170 --> 00:17:50,870


384
00:17:50,870 --> 00:17:56,060
PANTHEA LEE: So, we were
really excited to see the five

385
00:17:56,060 --> 00:17:58,310
movement strategy themes.

386
00:17:58,310 --> 00:17:59,570
They are ambitious.

387
00:17:59,570 --> 00:18:00,500
They are visionary.

388
00:18:00,500 --> 00:18:02,570
They are comprehensive.

389
00:18:02,570 --> 00:18:04,760
And we were then also
thinking about how

390
00:18:04,760 --> 00:18:07,145
to map our findings
against those.

391
00:18:07,145 --> 00:18:10,340


392
00:18:10,340 --> 00:18:11,830
And I think what's
interesting is

393
00:18:11,830 --> 00:18:16,930
we try to separate out between
objectives and strategies

394
00:18:16,930 --> 00:18:20,140
and then tactics, because
I think the five themes are

395
00:18:20,140 --> 00:18:22,480
slightly different in that way.

396
00:18:22,480 --> 00:18:25,302
So I'm going to get
the letters confused.

397
00:18:25,302 --> 00:18:26,260
PRESENTER: I have them.

398
00:18:26,260 --> 00:18:27,551
PANTHEA LEE: OK, you have them.

399
00:18:27,551 --> 00:18:32,305
But you know, the
ones around being

400
00:18:32,305 --> 00:18:34,810
a respected and relevant
source of knowledge, that

401
00:18:34,810 --> 00:18:37,000
is an objective.

402
00:18:37,000 --> 00:18:40,110
And we can do that through--

403
00:18:40,110 --> 00:18:41,960
by advancing with technology.

404
00:18:41,960 --> 00:18:44,290
And we can do that by engaging
the knowledge ecosystem.

405
00:18:44,290 --> 00:18:45,160
Those are tactics.

406
00:18:45,160 --> 00:18:50,140
You know, becoming a truly
global movement, that's an objective.

407
00:18:50,140 --> 00:18:52,015
And you know, again,
we can do that sort

408
00:18:52,015 --> 00:18:55,600
of through technology, through
engaging diverse partners.

409
00:18:55,600 --> 00:18:57,190
And so as-- perfect.

410
00:18:57,190 --> 00:19:00,160
And so as we were mapping
some of the opportunities

411
00:19:00,160 --> 00:19:01,420
that we saw--

412
00:19:01,420 --> 00:19:03,825
and I know we have a workshop
with some of the Movement

413
00:19:03,825 --> 00:19:05,320
Strategy folks after this--

414
00:19:05,320 --> 00:19:09,100
we tried to essentially sort
of map our opportunities

415
00:19:09,100 --> 00:19:10,670
into a matrix of sorts--

416
00:19:10,670 --> 00:19:12,820
I know folks can't see
this-- to understand

417
00:19:12,820 --> 00:19:18,467
how we use technology to meet
each of these objectives.

418
00:19:18,467 --> 00:19:20,050
Obviously, the work
that we were doing

419
00:19:20,050 --> 00:19:26,650
was really focused on readers,
on communities, on audiences.

420
00:19:26,650 --> 00:19:29,890
But we know that, you know,
just strong, healthy communities

421
00:19:29,890 --> 00:19:32,050
will be critical and
foundational to driving

422
00:19:32,050 --> 00:19:32,890
all of this forward.

423
00:19:32,890 --> 00:19:34,264
But you know, our
work under this

424
00:19:34,264 --> 00:19:37,170
wasn't really
focused on Track A.

425
00:19:37,170 --> 00:19:39,610
And I think what's
really then interesting

426
00:19:39,610 --> 00:19:43,510
about this is this notion
of being more modular, more

427
00:19:43,510 --> 00:19:45,880
portable, you know, engaging
with diverse partners

428
00:19:45,880 --> 00:19:47,920
to push content out,
I think that's really

429
00:19:47,920 --> 00:19:51,880
going to be at the intersection
of all of these themes.

430
00:19:51,880 --> 00:19:56,560
You know, how do you work
with educational institutions

431
00:19:56,560 --> 00:19:58,165
to think about using
Wikipedia content

432
00:19:58,165 --> 00:20:01,750
in after-school programs
that extend learning outside

433
00:20:01,750 --> 00:20:03,530
of the classroom.

434
00:20:03,530 --> 00:20:07,390
How do you all work
with nonprofits

435
00:20:07,390 --> 00:20:12,880
that are investing in skills
training for unemployed youth,

436
00:20:12,880 --> 00:20:15,010
to take Wikipedia
content, mix and match

437
00:20:15,010 --> 00:20:17,410
it to develop a curriculum.

438
00:20:17,410 --> 00:20:21,390
I think those will require
both a mix of technology

439
00:20:21,390 --> 00:20:24,120
and partnerships.

440
00:20:24,120 --> 00:20:27,690
And I think, you know,
those will effectively

441
00:20:27,690 --> 00:20:31,650
help Wikipedia be relevant
and global and sustainable

442
00:20:31,650 --> 00:20:33,040
into the future.

443
00:20:33,040 --> 00:20:33,806
So, yes.

444
00:20:33,806 --> 00:20:34,680
PRESENTER: Thank you.

445
00:20:34,680 --> 00:20:38,250
Thank you for showing and
talking about this map,

446
00:20:38,250 --> 00:20:41,670
and how you are integrating
that based on the data.

447
00:20:41,670 --> 00:20:45,900
And I think another thing that
I would love to hear from you

448
00:20:45,900 --> 00:20:50,100
is that how do you think we are
serving the emerging markets

449
00:20:50,100 --> 00:20:51,390
and emerging communities?

450
00:20:51,390 --> 00:20:53,190
Are we-- like,
what are the things

451
00:20:53,190 --> 00:20:57,940
that we could be doing
differently there?

452
00:20:57,940 --> 00:21:01,650
And what are the ways
of actually identifying

453
00:21:01,650 --> 00:21:03,510
their needs and then
serving those needs?

454
00:21:03,510 --> 00:21:06,450


455
00:21:06,450 --> 00:21:10,745
PANTHEA LEE: So I think, in
terms of emerging markets--

456
00:21:10,745 --> 00:21:14,610


457
00:21:14,610 --> 00:21:16,490
you know, I think
one thing that I

458
00:21:16,490 --> 00:21:18,260
know we've all
been talking about

459
00:21:18,260 --> 00:21:20,930
is emerging markets,
global [INAUDIBLE],

460
00:21:20,930 --> 00:21:23,420
it's a very broad term.

461
00:21:23,420 --> 00:21:27,050
And you all have been
investing in research

462
00:21:27,050 --> 00:21:29,180
in a lot of specific markets.

463
00:21:29,180 --> 00:21:32,060
So, you know, I think one
key step moving forward

464
00:21:32,060 --> 00:21:35,300
is to try and
disentangle some of this,

465
00:21:35,300 --> 00:21:37,520
and to break these
markets apart.

466
00:21:37,520 --> 00:21:40,980
So if you take a look at
Indonesia and Nigeria,

467
00:21:40,980 --> 00:21:45,410
for example, where you guys have
invested in primary research,

468
00:21:45,410 --> 00:21:48,380
these are quite
different markets.

469
00:21:48,380 --> 00:21:54,920
Let's take a look at mobile
penetration and cost of data,

470
00:21:54,920 --> 00:21:56,030
for example.

471
00:21:56,030 --> 00:22:02,065
Nigeria may be representative of
markets in sub-Saharan Africa.

472
00:22:02,065 --> 00:22:04,430
You know, data is
relatively expensive,

473
00:22:04,430 --> 00:22:05,690
and so people ration it.

474
00:22:05,690 --> 00:22:07,712
It's kind of a scarce resource.

475
00:22:07,712 --> 00:22:09,170
And so the strategies
that you guys

476
00:22:09,170 --> 00:22:12,110
develop there will be
applicable to certain regions.

477
00:22:12,110 --> 00:22:16,651
But Indonesia, where cost of
data is dropping quite rapidly,

478
00:22:16,651 --> 00:22:18,400
even though there's
still sort of barriers

479
00:22:18,400 --> 00:22:23,330
to access, that might be a much
more sort of illustrative view

480
00:22:23,330 --> 00:22:27,240
of where different
markets are going

481
00:22:27,240 --> 00:22:33,850
in terms of internet usage,
information behaviors, whatnot.

482
00:22:33,850 --> 00:22:35,430
And so I think
first step might be

483
00:22:35,430 --> 00:22:37,950
to sort of disentangle
emerging markets,

484
00:22:37,950 --> 00:22:41,250
and think about what each
market can tell you about larger

485
00:22:41,250 --> 00:22:43,620
patterns in which regions.

486
00:22:43,620 --> 00:22:46,260


487
00:22:46,260 --> 00:22:53,040
And then I think, from there,
we see that Wikipedia has--

488
00:22:53,040 --> 00:22:57,240
we talked about sort of
brand recognition, whatnot.

489
00:22:57,240 --> 00:23:02,040
But I think one group,
or one set of users

490
00:23:02,040 --> 00:23:04,710
that the foundation I know--
and the movement is thinking

491
00:23:04,710 --> 00:23:07,860
about whether and how
we can better serve them

492
00:23:07,860 --> 00:23:12,930
is what we might term sort of
more marginalized communities,

493
00:23:12,930 --> 00:23:14,680
lower income users, whatnot.

494
00:23:14,680 --> 00:23:17,940
I know that's been a topic
of conversation here.

495
00:23:17,940 --> 00:23:20,040
And I think that
is an area where

496
00:23:20,040 --> 00:23:24,930
there is still a significant
gap, through no fault

497
00:23:24,930 --> 00:23:26,970
of the movement.

498
00:23:26,970 --> 00:23:32,190
You know, these are populations
that, for lots of reasons--

499
00:23:32,190 --> 00:23:36,780
you know, levels of
education, income, whatnot--

500
00:23:36,780 --> 00:23:40,830
simply are not getting
online, or are not

501
00:23:40,830 --> 00:23:44,700
able to get to and
use the information

502
00:23:44,700 --> 00:23:47,700
that Wikipedia
provides once they are.

503
00:23:47,700 --> 00:23:53,070
And so I think an interesting
question in these markets

504
00:23:53,070 --> 00:23:57,510
and with these users
in particular is what--

505
00:23:57,510 --> 00:24:00,690
does the movement
want to serve them,

506
00:24:00,690 --> 00:24:06,510
and, if you do, who are
the partners that you might

507
00:24:06,510 --> 00:24:08,760
need to engage to do that.

508
00:24:08,760 --> 00:24:12,030


509
00:24:12,030 --> 00:24:14,850
So there are nonprofits,
there are government programs,

510
00:24:14,850 --> 00:24:17,250
there are other folks
that sort of specialize

511
00:24:17,250 --> 00:24:22,440
in serving communities and
populations such as these that

512
00:24:22,440 --> 00:24:26,700
very much need the content
that Wikipedia has,

513
00:24:26,700 --> 00:24:28,560
that could really
learn from and draw

514
00:24:28,560 --> 00:24:31,200
on the energy and the
passion of the community

515
00:24:31,200 --> 00:24:33,330
that you all have.

516
00:24:33,330 --> 00:24:35,700
These folks-- you
know, we work with tons

517
00:24:35,700 --> 00:24:37,740
of nonprofits
[INAUDIBLE] we want

518
00:24:37,740 --> 00:24:41,820
to develop educational
content in x or y way.

519
00:24:41,820 --> 00:24:45,990
We have the-- we know how to
reach low-income populations,

520
00:24:45,990 --> 00:24:48,600
but we don't necessarily know
how to package and put together

521
00:24:48,600 --> 00:24:51,000
things for them or to have
the resources to invest

522
00:24:51,000 --> 00:24:52,150
in developing this content.

523
00:24:52,150 --> 00:24:56,640
So I think that's an area
that the movement might

524
00:24:56,640 --> 00:25:01,510
continue to explore, because we
do see that as a gap right now.

525
00:25:01,510 --> 00:25:04,810


526
00:25:04,810 --> 00:25:07,440
PRESENTER: So I
think we already--

527
00:25:07,440 --> 00:25:09,810
I don't know, I don't
have my phone on me,

528
00:25:09,810 --> 00:25:12,436
so I don't know how
we're doing on time.

529
00:25:12,436 --> 00:25:14,240
10:30?

530
00:25:14,240 --> 00:25:16,070
How are you folks
feeling in terms

531
00:25:16,070 --> 00:25:19,040
of asking questions
and jumping in?

532
00:25:19,040 --> 00:25:21,470
Because I think I
have more questions,

533
00:25:21,470 --> 00:25:24,590
and I can keep asking them,
but I just want to do a check

534
00:25:24,590 --> 00:25:26,930
and open a little bit
more of the floor for us

535
00:25:26,930 --> 00:25:30,290
to have more of a
conversation with Panthea

536
00:25:30,290 --> 00:25:32,180
and with the group.

537
00:25:32,180 --> 00:25:34,050
So, how are we feeling?

538
00:25:34,050 --> 00:25:37,370
Can-- like, do we have
questions, do we have comments?

539
00:25:37,370 --> 00:25:39,290
Yes, we have Sati there.

540
00:25:39,290 --> 00:25:42,180


541
00:25:42,180 --> 00:25:43,406
Brendan?

542
00:25:43,406 --> 00:25:45,286
Go over there.

543
00:25:45,286 --> 00:25:46,330
Oh, there's no mic there.

544
00:25:46,330 --> 00:25:46,830
OK.

545
00:25:46,830 --> 00:25:49,830


546
00:25:49,830 --> 00:25:50,636
AUDIENCE: Hello.

547
00:25:50,636 --> 00:25:51,380
Hello?

548
00:25:51,380 --> 00:25:51,880
OK.

549
00:25:51,880 --> 00:25:55,070
So, I have many
questions, too, but I'm

550
00:25:55,070 --> 00:25:58,970
going to ask one that's
probably not fully filled out.

551
00:25:58,970 --> 00:26:02,190
And I guess my question is
a lot of what you've said

552
00:26:02,190 --> 00:26:05,550
has been talking about how--

553
00:26:05,550 --> 00:26:07,170
kind of a comparison, right?

554
00:26:07,170 --> 00:26:10,320
Like, we sitting here
today in San Francisco

555
00:26:10,320 --> 00:26:12,270
understand a world
a certain way.

556
00:26:12,270 --> 00:26:14,280
We see Wikipedia
in a certain way.

557
00:26:14,280 --> 00:26:16,320
We do research,
and we ask people

558
00:26:16,320 --> 00:26:19,470
to understand kind of Wikipedia
in their own context, right?

559
00:26:19,470 --> 00:26:24,120
And then there is this feeling
I guess I get where we say,

560
00:26:24,120 --> 00:26:28,920
well, they have
barriers to, let's say,

561
00:26:28,920 --> 00:26:31,050
accessing the world
in the way we do.

562
00:26:31,050 --> 00:26:33,040
And I guess I'm
asking, that's very--

563
00:26:33,040 --> 00:26:36,360
that focuses a lot on
deficiencies, right?

564
00:26:36,360 --> 00:26:38,450
Like barriers, like things
that they don't have,

565
00:26:38,450 --> 00:26:40,200
or ways in which they
do different things.

566
00:26:40,200 --> 00:26:43,020
And I guess I'm asking, in
what ways should we actually

567
00:26:43,020 --> 00:26:44,460
be learning from them?

568
00:26:44,460 --> 00:26:46,410
Should we actually be
saying maybe the world

569
00:26:46,410 --> 00:26:50,580
is trending in a way where
we're actually behind

570
00:26:50,580 --> 00:26:52,380
and they're
progressing, and instead

571
00:26:52,380 --> 00:26:57,990
of trying to match
them to us, maybe

572
00:26:57,990 --> 00:26:59,590
we should be
matching us to them.

573
00:26:59,590 --> 00:27:00,520
Does that make sense?

574
00:27:00,520 --> 00:27:02,700
PANTHEA LEE: Yeah,
no, absolutely.

575
00:27:02,700 --> 00:27:03,555
I think that--

576
00:27:03,555 --> 00:27:07,970


577
00:27:07,970 --> 00:27:10,190
I don't necessarily
mean to say, you know,

578
00:27:10,190 --> 00:27:13,310
that people have deficiencies.

579
00:27:13,310 --> 00:27:15,630
I think it's more
there are barriers.

580
00:27:15,630 --> 00:27:19,130
So Wikipedia works a
certain way right now

581
00:27:19,130 --> 00:27:21,920
that then sort of erects
barriers to people

582
00:27:21,920 --> 00:27:25,190
being able to access
and use you all.

583
00:27:25,190 --> 00:27:28,460
And so, you know, what are
ways that then the Movement

584
00:27:28,460 --> 00:27:33,230
Foundation might think of
addressing those barriers.

585
00:27:33,230 --> 00:27:37,130
I do think that there are
things that the movement can

586
00:27:37,130 --> 00:27:40,830
definitely learn from
the creativity of--

587
00:27:40,830 --> 00:27:43,220
and it's hard to talk
about these populations,

588
00:27:43,220 --> 00:27:46,640
because we're talking about
quite a diverse set of users

589
00:27:46,640 --> 00:27:47,700
and markets.

590
00:27:47,700 --> 00:27:49,250
But I think one
of the things that

591
00:27:49,250 --> 00:27:51,560
has been really
interesting is the rise

592
00:27:51,560 --> 00:27:52,820
of these social networks.

593
00:27:52,820 --> 00:27:55,250
And people that
have cost barriers

594
00:27:55,250 --> 00:27:58,220
to be able to use the internet
are getting around them

595
00:27:58,220 --> 00:27:59,780
in quite creative ways.

596
00:27:59,780 --> 00:28:02,180
You know, when telcos
are offering a file

597
00:28:02,180 --> 00:28:05,420
transfer for free on these, all
of a sudden you see a rush--

598
00:28:05,420 --> 00:28:07,550
you know, everyone is now,
instead of using email,

599
00:28:07,550 --> 00:28:10,760
just using, like, WhatsApp
to basically send everything

600
00:28:10,760 --> 00:28:11,730
around.

601
00:28:11,730 --> 00:28:14,060
And so that's actually
quite interesting.

602
00:28:14,060 --> 00:28:16,070
Instead of using
other social networks,

603
00:28:16,070 --> 00:28:17,600
they are creating their own.

604
00:28:17,600 --> 00:28:21,650
And I think that's a really
interesting way for people--

605
00:28:21,650 --> 00:28:24,050
for you all to explore.

606
00:28:24,050 --> 00:28:28,700
I think also there are
really interesting--

607
00:28:28,700 --> 00:28:31,880
you know, we work a lot
with nonprofits and media

608
00:28:31,880 --> 00:28:35,600
organizations and whatnot
that are leveraging

609
00:28:35,600 --> 00:28:39,740
traditional media in quite
interesting ways, as well,

610
00:28:39,740 --> 00:28:43,310
thinking about how
do we use television

611
00:28:43,310 --> 00:28:48,170
and radio in creative ways to
deliver educational content.

612
00:28:48,170 --> 00:28:53,210
And I think that could be an
area for you all to explore,

613
00:28:53,210 --> 00:28:56,007
as well.

614
00:28:56,007 --> 00:28:57,590
AUDIENCE: So then I
guess my follow-up

615
00:28:57,590 --> 00:29:04,940
to that is then you've kind
of laid out, it feels like,

616
00:29:04,940 --> 00:29:07,190
two models, one where,
right now, we're

617
00:29:07,190 --> 00:29:09,480
kind of this self-encased
thing, right?

618
00:29:09,480 --> 00:29:12,620
A website, like a Facebook,
like an ecosystem, in a sense.

619
00:29:12,620 --> 00:29:15,380
And what I'm hearing from
you is that we actually need

620
00:29:15,380 --> 00:29:16,984
to be more ubiquitous, right?

621
00:29:16,984 --> 00:29:18,650
That we need to be
embedded in some way,

622
00:29:18,650 --> 00:29:22,430
we need to show up in ways
that might not even be branded

623
00:29:22,430 --> 00:29:23,870
in a way that's recognizable.

624
00:29:23,870 --> 00:29:26,570
But if our true mission
is about knowledge--

625
00:29:26,570 --> 00:29:29,480
let's say a part of it is about
knowledge dissemination, then

626
00:29:29,480 --> 00:29:31,640
that knowledge, it's
really important for that

627
00:29:31,640 --> 00:29:34,790
to show up in the ways that
people are accessing it.

628
00:29:34,790 --> 00:29:39,157
And so how, I guess,
in that vein--

629
00:29:39,157 --> 00:29:40,490
and that's a lot about readers--

630
00:29:40,490 --> 00:29:45,660
how do you think about content,
let's say, curation, creation?

631
00:29:45,660 --> 00:29:48,200
Kind of the upstream
pieces to dissemination.

632
00:29:48,200 --> 00:29:52,610
How does that end
of the flow match

633
00:29:52,610 --> 00:29:54,740
into kind of this
ubiquity that you're

634
00:29:54,740 --> 00:29:57,178
kind of describing at the
other end of the flow?

635
00:29:57,178 --> 00:30:00,850


636
00:30:00,850 --> 00:30:02,980
PANTHEA LEE: So I think that--

637
00:30:02,980 --> 00:30:04,710
yeah, I think you're
absolutely right.

638
00:30:04,710 --> 00:30:06,372
Instead of becoming
like a product,

639
00:30:06,372 --> 00:30:07,830
or instead of
thinking of ourselves

640
00:30:07,830 --> 00:30:11,280
as a product and knowledge here,
how do we actually be, like,

641
00:30:11,280 --> 00:30:13,500
an engine that pushes
information out

642
00:30:13,500 --> 00:30:17,340
in ways to enable
ubiquity, to be

643
00:30:17,340 --> 00:30:21,000
able to push it out to other
content creators or folks that

644
00:30:21,000 --> 00:30:23,230
are providing educational
resources, whatnot.

645
00:30:23,230 --> 00:30:26,700
And I think that
the role of curators

646
00:30:26,700 --> 00:30:33,060
here is really
interesting and important.

647
00:30:33,060 --> 00:30:37,620
We see a lot of young people
following vloggers and bloggers

648
00:30:37,620 --> 00:30:40,650
and, you know, that have
built up trust in certain ways

649
00:30:40,650 --> 00:30:44,850
that may not see Wikipedia
content as credible or easy

650
00:30:44,850 --> 00:30:45,980
to use or whatnot.

651
00:30:45,980 --> 00:30:50,080
And so does the role of
the community, for example,

652
00:30:50,080 --> 00:30:51,090
change here?

653
00:30:51,090 --> 00:30:53,970
Does the role of the
affiliates change?

654
00:30:53,970 --> 00:30:56,580
You know, right now I
think the community often--

655
00:30:56,580 --> 00:30:58,770
in the way that I
understand it, a lot of it

656
00:30:58,770 --> 00:31:05,820
is dominated by editors that are
contributing content and doing

657
00:31:05,820 --> 00:31:07,260
a lot of different functions.

658
00:31:07,260 --> 00:31:11,160
But you know, in the future,
is it about developing topic

659
00:31:11,160 --> 00:31:16,570
guides to help nonprofits
that are thinking

660
00:31:16,570 --> 00:31:18,180
about digital literacy.

661
00:31:18,180 --> 00:31:20,880
OK, so these are the
resources that we have,

662
00:31:20,880 --> 00:31:23,340
these are-- you know,
through APIs, through more

663
00:31:23,340 --> 00:31:24,419
modular content, whatnot.

664
00:31:24,419 --> 00:31:26,460
This is how you might be
able to mix and match it

665
00:31:26,460 --> 00:31:29,670
to design your own
curricula in x or y way.

666
00:31:29,670 --> 00:31:36,890
Is it about thinking about
community members as liaisons

667
00:31:36,890 --> 00:31:39,350
between other content
creators, whether they're

668
00:31:39,350 --> 00:31:42,050
more individuals-- you
know, alternative history

669
00:31:42,050 --> 00:31:44,930
vloggers on YouTube,
to nonprofits

670
00:31:44,930 --> 00:31:50,635
focusing on educating
people about women's rights.

671
00:31:50,635 --> 00:31:52,760
I think those are different
ways that you all might

672
00:31:52,760 --> 00:31:54,050
be able to think about it.

673
00:31:54,050 --> 00:31:59,480
But I think this ubiquity point
is a really interesting one,

674
00:31:59,480 --> 00:32:02,320
and definitely
worth considering.

675
00:32:02,320 --> 00:32:03,839
Please.

676
00:32:03,839 --> 00:32:06,130
AUDIENCE: It's a really great
question where you ended.

677
00:32:06,130 --> 00:32:07,660
And I think one
other piece of it

678
00:32:07,660 --> 00:32:10,115
that's somewhat simple to
add on to what Panthea said

679
00:32:10,115 --> 00:32:12,490
is the more ubiquitous you
are and the more people you're

680
00:32:12,490 --> 00:32:15,064
reaching, the more opportunities
you have to build somebody

681
00:32:15,064 --> 00:32:16,480
up the ladder of
engagements where

682
00:32:16,480 --> 00:32:18,896
they might become a contributor
and actually be an editor.

683
00:32:18,896 --> 00:32:21,280
So there is a sort
of feedback loop

684
00:32:21,280 --> 00:32:23,620
that comes from reaching
more people in terms

685
00:32:23,620 --> 00:32:26,212
of potentially increasing
the pool of contributors.

686
00:32:26,212 --> 00:32:30,927


687
00:32:30,927 --> 00:32:31,760
PRESENTER: Oh, yeah.

688
00:32:31,760 --> 00:32:34,379
So let's maybe go over them.

689
00:32:34,379 --> 00:32:35,754
AUDIENCE: I think
that you should

690
00:32:35,754 --> 00:32:38,134
stack the first question.

691
00:32:38,134 --> 00:32:39,086
PRESENTER: Can you--

692
00:32:39,086 --> 00:32:39,562
AUDIENCE: Yeah.

693
00:32:39,562 --> 00:32:40,062
Let's see.

694
00:32:40,062 --> 00:32:41,010
Let's get some people.

695
00:32:41,010 --> 00:32:44,860
All right, so we have a set of
questions from BlueJeans here.

696
00:32:44,860 --> 00:32:46,860
Jamo, you're first on the list.

697
00:32:46,860 --> 00:32:51,870
And then we can go
to Edward and Amir.

698
00:32:51,870 --> 00:32:52,890
AUDIENCE: Hello.

699
00:32:52,890 --> 00:32:54,540
How's my audio?

700
00:32:54,540 --> 00:32:55,786
AUDIENCE: You sound great.

701
00:32:55,786 --> 00:32:58,830
AUDIENCE: Aw, thank you.

702
00:32:58,830 --> 00:33:03,960
So, thank you very much
for speaking with us.

703
00:33:03,960 --> 00:33:09,270
I had a question about
methods because I'm

704
00:33:09,270 --> 00:33:13,950
a design researcher,
and so I like methods.

705
00:33:13,950 --> 00:33:18,390
My question is around
how Reboot gets

706
00:33:18,390 --> 00:33:25,830
to the point of providing
the high-priority findings

707
00:33:25,830 --> 00:33:27,220
and recommendations.

708
00:33:27,220 --> 00:33:30,420
So I believe that
a lot of people

709
00:33:30,420 --> 00:33:34,080
have an idea of how design
research works in terms of,

710
00:33:34,080 --> 00:33:38,010
you know, you talk to people,
you observe what they do,

711
00:33:38,010 --> 00:33:41,070
you go to where they
live, you find out

712
00:33:41,070 --> 00:33:44,070
kind of how they interact
with, in our case, you know,

713
00:33:44,070 --> 00:33:47,070
Wikipedia or information
technologies or technology

714
00:33:47,070 --> 00:33:49,950
in general, how it
fits into their lives.

715
00:33:49,950 --> 00:33:52,800
And then you take lots of notes.

716
00:33:52,800 --> 00:33:56,250
But I think that it's really
interesting and kind of less

717
00:33:56,250 --> 00:33:58,980
understood how you
get from, OK, we

718
00:33:58,980 --> 00:34:03,480
learned all this stuff about
people and about what they do

719
00:34:03,480 --> 00:34:06,120
and what they believe
and what motivates them,

720
00:34:06,120 --> 00:34:07,590
how do we get to
the point where we

721
00:34:07,590 --> 00:34:11,040
are synthesizing that
and prioritizing what are

722
00:34:11,040 --> 00:34:13,139
the most important findings?

723
00:34:13,139 --> 00:34:16,672
What are the big takeaways
for this audience?

724
00:34:16,672 --> 00:34:17,755
You know, for our clients.

725
00:34:17,755 --> 00:34:19,900
Wikimedia, in this case.

726
00:34:19,900 --> 00:34:22,739
I think we did a really great
job of that in the New Readers

727
00:34:22,739 --> 00:34:27,389
project, but I'd love to hear
a little more kind of about how

728
00:34:27,389 --> 00:34:30,360
that process works
within your organization.

729
00:34:30,360 --> 00:34:33,639


730
00:34:33,639 --> 00:34:35,940
PANTHEA LEE: I love
nerding out on methods.

731
00:34:35,940 --> 00:34:38,980
And so thank you
for the question.

732
00:34:38,980 --> 00:34:42,292
You know, so I think for us--

733
00:34:42,292 --> 00:34:44,750
I'll try not to get lost in
details, but I think, you know,

734
00:34:44,750 --> 00:34:50,030
where we started with you all
is thinking about this research

735
00:34:50,030 --> 00:34:52,280
framework, and what were
the questions that we

736
00:34:52,280 --> 00:34:53,179
wanted to ask.

737
00:34:53,179 --> 00:34:55,670
And usually, where
we start here is

738
00:34:55,670 --> 00:34:59,810
one principle we have
in mind is always

739
00:34:59,810 --> 00:35:01,880
don't necessarily
ask about the thing

740
00:35:01,880 --> 00:35:03,680
that you're most
interested in, which

741
00:35:03,680 --> 00:35:07,190
means we did not start by
asking people about Wikipedia.

742
00:35:07,190 --> 00:35:10,850
We started by asking people
about just information systems,

743
00:35:10,850 --> 00:35:13,670
like what information they
need, what they care about,

744
00:35:13,670 --> 00:35:14,420
how they get it.

745
00:35:14,420 --> 00:35:16,929
And we go from sort of the
broadest to how they get

746
00:35:16,929 --> 00:35:18,470
information to then
sort of narrow it

747
00:35:18,470 --> 00:35:21,950
how do they get online to then
how do they come to Wikipedia,

748
00:35:21,950 --> 00:35:23,190
how do they use Wikipedia.

749
00:35:23,190 --> 00:35:26,369
So we go from sort of
broadest to most narrow.

750
00:35:26,369 --> 00:35:28,160
Happy to talk about
the research framework.

751
00:35:28,160 --> 00:35:31,580
I know that's sort of been
shared with different team

752
00:35:31,580 --> 00:35:32,930
members.

753
00:35:32,930 --> 00:35:34,970
And then as part of
the field research,

754
00:35:34,970 --> 00:35:40,790
we bring a team of local
researchers from the city,

755
00:35:40,790 --> 00:35:42,590
the community, wherever
that we're working,

756
00:35:42,590 --> 00:35:46,190
and then paired with a
Wikimedia team, as well,

757
00:35:46,190 --> 00:35:49,220
to make sure that we both
have the institutional sort

758
00:35:49,220 --> 00:35:53,600
of context and
knowledge, and then

759
00:35:53,600 --> 00:35:58,550
also the sort of local and
cultural translation, as well.

760
00:35:58,550 --> 00:36:00,590
For us, it's really
important to make

761
00:36:00,590 --> 00:36:02,090
sure we spend a lot
of time with you

762
00:36:02,090 --> 00:36:04,130
all to understand your
strategies, your work

763
00:36:04,130 --> 00:36:06,290
processes, whatnot,
because I think sometimes

764
00:36:06,290 --> 00:36:09,470
user-centered design is
misinterpreted as, you know,

765
00:36:09,470 --> 00:36:11,810
just let's focus
on the end user,

766
00:36:11,810 --> 00:36:13,610
whereas in fact, you
know, we actually

767
00:36:13,610 --> 00:36:16,070
need to do a lot of work to
understand the organizations

768
00:36:16,070 --> 00:36:18,860
and the institutions that
serve them to be able to design

769
00:36:18,860 --> 00:36:21,290
strategies that are actually
feasible and implementable,

770
00:36:21,290 --> 00:36:24,740
rather than a shiny blue sky
deck that you can't do anything

771
00:36:24,740 --> 00:36:26,480
with.

772
00:36:26,480 --> 00:36:29,360
And then I think from there, in
terms of the actual research,

773
00:36:29,360 --> 00:36:32,360
we do a lot of sort of
semi-structured ethnographic

774
00:36:32,360 --> 00:36:32,930
interviews.

775
00:36:32,930 --> 00:36:34,340
We do user observation.

776
00:36:34,340 --> 00:36:36,140
We do tech demos, whatnot.

777
00:36:36,140 --> 00:36:38,330
I'm happy to get
into any of those.

778
00:36:38,330 --> 00:36:42,020
But then we end up doing
nightly synthesis sessions

779
00:36:42,020 --> 00:36:43,910
with the entire
team, which means

780
00:36:43,910 --> 00:36:48,110
we are making sense of the data
collected every single day.

781
00:36:48,110 --> 00:36:50,660
What are the patterns,
what are the connections,

782
00:36:50,660 --> 00:36:52,970
and where should we go
deeper the next day,

783
00:36:52,970 --> 00:36:55,311
because this type of
research, it's really applied.

784
00:36:55,311 --> 00:36:56,810
And so we're not
asking the same set

785
00:36:56,810 --> 00:36:58,990
of questions every single day.

786
00:36:58,990 --> 00:37:01,119
If we're asking the same
questions we did on day 10

787
00:37:01,119 --> 00:37:02,910
as we are on day one,
we would have failed.

788
00:37:02,910 --> 00:37:05,030
You know, as we're thinking
about opportunities,

789
00:37:05,030 --> 00:37:07,670
we're trying to figure out
how to hone in deeper on those

790
00:37:07,670 --> 00:37:09,170
so that we can start
sort of testing

791
00:37:09,170 --> 00:37:10,790
some of your hypotheses.

792
00:37:10,790 --> 00:37:15,500
And so it's quite active
and iterative research.

793
00:37:15,500 --> 00:37:17,540
And then I would
say, in terms of how

794
00:37:17,540 --> 00:37:20,660
we got to some of these
higher priority findings--

795
00:37:20,660 --> 00:37:23,390
and then I'll stop because
I might be losing people--

796
00:37:23,390 --> 00:37:27,470
is we tried to map
all the users that we

797
00:37:27,470 --> 00:37:31,940
spoke to in each context
against a matrix along sort

798
00:37:31,940 --> 00:37:33,800
of digital competence
and literacy,

799
00:37:33,800 --> 00:37:36,260
and then also the
type of use cases

800
00:37:36,260 --> 00:37:38,900
that they are using
the internet for.

801
00:37:38,900 --> 00:37:41,630
From there, we then
segmented into a couple

802
00:37:41,630 --> 00:37:45,710
of subsets of users.

803
00:37:45,710 --> 00:37:48,200
And so you all have
user personas now

804
00:37:48,200 --> 00:37:51,230
for these sort of
different archetypes.

805
00:37:51,230 --> 00:37:54,320
And then from there, we
mapped the user journey

806
00:37:54,320 --> 00:37:58,130
of each of these
types of users to help

807
00:37:58,130 --> 00:38:01,340
us understand, in their
information journey

808
00:38:01,340 --> 00:38:04,010
to try and get the
information that they want

809
00:38:04,010 --> 00:38:07,370
and need to do x or y,
depending on the type

810
00:38:07,370 --> 00:38:10,670
of user, what are the biggest
barriers that they face

811
00:38:10,670 --> 00:38:15,020
and what are the most valuable
information sources for them.

812
00:38:15,020 --> 00:38:17,090
For the most valuable
information sources,

813
00:38:17,090 --> 00:38:19,040
we then think about
what can we learn

814
00:38:19,040 --> 00:38:21,962
from these lateral examples, and
then for the biggest barriers

815
00:38:21,962 --> 00:38:24,170
we then try and sort of
aggregate them to understand,

816
00:38:24,170 --> 00:38:27,590
OK, what are the most prominent
and significant barriers that

817
00:38:27,590 --> 00:38:31,760
are preventing more users from
being able to take advantage

818
00:38:31,760 --> 00:38:32,930
of Wikipedia.

819
00:38:32,930 --> 00:38:34,880
And then that's
how we end up rank

820
00:38:34,880 --> 00:38:37,190
ordering the
opportunities that then we

821
00:38:37,190 --> 00:38:38,360
put forward to you guys.

822
00:38:38,360 --> 00:38:41,180


823
00:38:41,180 --> 00:38:42,167
That was pretty fast.

824
00:38:42,167 --> 00:38:43,500
I'm not sure if that was useful.

825
00:38:43,500 --> 00:38:47,566


826
00:38:47,566 --> 00:38:48,690
AUDIENCE: Excellent answer.

827
00:38:48,690 --> 00:38:50,350
Thanks.

828
00:38:50,350 --> 00:38:51,020
Nice work.

829
00:38:51,020 --> 00:38:54,610
OK, so we had I think
Edward next, is that right?

830
00:38:54,610 --> 00:38:57,230
Edward, would you
like to take over?

831
00:38:57,230 --> 00:38:58,045
EDWARD: Sure.

832
00:38:58,045 --> 00:38:59,920
My apologies, I'm in a
coffee shop right now.

833
00:38:59,920 --> 00:39:02,650
So can you all hear me?

834
00:39:02,650 --> 00:39:05,210
AUDIENCE: We can hear you.

835
00:39:05,210 --> 00:39:07,070
EDWARD: Thank you.

836
00:39:07,070 --> 00:39:10,130
Yeah, so thank you so
much for talking with us.

837
00:39:10,130 --> 00:39:12,949


838
00:39:12,949 --> 00:39:14,490
Yeah, this is super
interesting work,

839
00:39:14,490 --> 00:39:18,330
and I've been following the
audience's work for a while.

840
00:39:18,330 --> 00:39:21,400
And I've actually been trying
to use it in my own work.

841
00:39:21,400 --> 00:39:24,060
So I do a lot of surveys
at the foundation

842
00:39:24,060 --> 00:39:25,590
and with communities,
so I'm also

843
00:39:25,590 --> 00:39:28,650
interested in the
methods a little bit.

844
00:39:28,650 --> 00:39:30,870
So I am actually curious
to hear generally

845
00:39:30,870 --> 00:39:33,690
what have been some of the
limitations or challenges

846
00:39:33,690 --> 00:39:38,774
that you're finding in
conducting your research,

847
00:39:38,774 --> 00:39:41,190
especially what you said at
the beginning, which is around

848
00:39:41,190 --> 00:39:43,890
how you try to bring voices
into the design process.

849
00:39:43,890 --> 00:39:46,350
So what have been some
challenges around that?

850
00:39:46,350 --> 00:39:49,594
And perhaps if there's been
anything in our context

851
00:39:49,594 --> 00:39:51,010
that has been
challenging for you.

852
00:39:51,010 --> 00:39:53,670


853
00:39:53,670 --> 00:39:56,440
PANTHEA LEE: Time?

854
00:39:56,440 --> 00:40:02,040
Challenges, I think that one of
the biggest challenges for us

855
00:40:02,040 --> 00:40:12,820
has been thinking about the
breadth of opportunities there,

856
00:40:12,820 --> 00:40:16,480
and then how to organize
them, which is why sort of--

857
00:40:16,480 --> 00:40:17,980
going to sort of
Jonathan's question

858
00:40:17,980 --> 00:40:21,340
previously, for us having a
really structured process,

859
00:40:21,340 --> 00:40:26,830
to narrow down, to rank order
to process the opportunities

860
00:40:26,830 --> 00:40:28,660
and findings was
really important.

861
00:40:28,660 --> 00:40:31,000
But I think that has
been a challenge,

862
00:40:31,000 --> 00:40:34,600
but I think what it's been
helped by is the fact that it's

863
00:40:34,600 --> 00:40:36,280
been really
refreshing and really

864
00:40:36,280 --> 00:40:38,722
exciting to work with such
a cross-functional team.

865
00:40:38,722 --> 00:40:40,930
We've been working with
Global Reach and partnerships

866
00:40:40,930 --> 00:40:43,300
and community and
community engagement

867
00:40:43,300 --> 00:40:45,470
and comms and
product and whatnot.

868
00:40:45,470 --> 00:40:49,720
And so you all having the
conversations to then give us

869
00:40:49,720 --> 00:40:54,650
sort of a more targeted brief,
that's been really useful.

870
00:40:54,650 --> 00:40:56,635
I think that we've--

871
00:40:56,635 --> 00:40:58,390
I think there is
more that we could

872
00:40:58,390 --> 00:41:02,410
do to think about
how to use surveys

873
00:41:02,410 --> 00:41:05,980
and other sort of quantitative
methods in conjunction

874
00:41:05,980 --> 00:41:09,400
with the more qualitative
methods that we use.

875
00:41:09,400 --> 00:41:11,890
Typically, when we do this, we--

876
00:41:11,890 --> 00:41:16,490
design research is really good
at going deep and understanding

877
00:41:16,490 --> 00:41:19,001
people's behaviors,
attitudes, whatnot.

878
00:41:19,001 --> 00:41:19,750
And I know we've--

879
00:41:19,750 --> 00:41:22,690
and we've been working
with Dan and taking a look

880
00:41:22,690 --> 00:41:24,910
at some of the
trends and findings

881
00:41:24,910 --> 00:41:26,980
from the mobile surveys,
thinking about how

882
00:41:26,980 --> 00:41:28,074
that informs our research.

883
00:41:28,074 --> 00:41:30,240
I think we could do more
perhaps to think about, OK,

884
00:41:30,240 --> 00:41:32,080
so now we've gone
really deep, how

885
00:41:32,080 --> 00:41:36,130
do we go broad again to test out
the representativeness of some

886
00:41:36,130 --> 00:41:40,780
of our findings here
against a wider audience.

887
00:41:40,780 --> 00:41:46,720
And then I know that
I think one area that

888
00:41:46,720 --> 00:41:51,930
has also been somewhat
challenging is thinking about--

889
00:41:51,930 --> 00:41:54,400
is the time factor.

890
00:41:54,400 --> 00:41:56,830
You know, we've been doing
this sort of quite rapidly.

891
00:41:56,830 --> 00:41:59,910
And we've had basically time
for about two-week sprints

892
00:41:59,910 --> 00:42:01,870
in each of these countries.

893
00:42:01,870 --> 00:42:05,440
And so-- which has
been about sort

894
00:42:05,440 --> 00:42:07,960
of 70 respondents per country.

895
00:42:07,960 --> 00:42:11,200
And so we haven't
really had time

896
00:42:11,200 --> 00:42:16,540
to be able to prototype any
sort of solutions or strategies,

897
00:42:16,540 --> 00:42:18,520
which sometimes,
given a longer sprint,

898
00:42:18,520 --> 00:42:21,220
we are able to do to
test out and come back

899
00:42:21,220 --> 00:42:24,850
with user feedback on
specific strategies

900
00:42:24,850 --> 00:42:27,160
or product designs that
you might want to pursue.

901
00:42:27,160 --> 00:42:30,530


902
00:42:30,530 --> 00:42:32,020
PRESENTER: On that
note of time, I

903
00:42:32,020 --> 00:42:34,745
think there was a question also
there about the user personas

904
00:42:34,745 --> 00:42:36,090
not being posted.

905
00:42:36,090 --> 00:42:38,510
They're not posted yet,
but they will be soon.

906
00:42:38,510 --> 00:42:43,300
So there's a big deck coming with
all the personas from Indonesia

907
00:42:43,300 --> 00:42:45,750
and Brazil, and
more information.

908
00:42:45,750 --> 00:42:47,870
So yeah, stay tuned for that.

909
00:42:47,870 --> 00:42:51,920
And then I believe that the
next question is from Amir.

910
00:42:51,920 --> 00:42:52,580
Is that right?

911
00:42:52,580 --> 00:42:54,934


912
00:42:54,934 --> 00:42:55,600
AUDIENCE: Hello.

913
00:42:55,600 --> 00:42:56,660
Can anybody hear me?

914
00:42:56,660 --> 00:43:00,190
PRESENTER: Yes, we can hear you.

915
00:43:00,190 --> 00:43:00,930
AUDIENCE: Oh.

916
00:43:00,930 --> 00:43:03,630
You might also hear
my son in the back.

917
00:43:03,630 --> 00:43:04,950
OK.

918
00:43:04,950 --> 00:43:06,810
So, quick question.

919
00:43:06,810 --> 00:43:09,204
It was all really,
really interesting.

920
00:43:09,204 --> 00:43:11,370
Thank you so much for coming
and talking about this.

921
00:43:11,370 --> 00:43:12,550
I appreciate this a lot.

922
00:43:12,550 --> 00:43:17,159
The global perspective is
really important for us.

923
00:43:17,159 --> 00:43:19,200
My question is about
something that you mentioned

924
00:43:19,200 --> 00:43:23,010
in the beginning of your talk.

925
00:43:23,010 --> 00:43:26,910
You said that the messenger
networks like WhatsApp

926
00:43:26,910 --> 00:43:30,570
are really important today.

927
00:43:30,570 --> 00:43:36,240
How do you think
Wikipedia could get there?

928
00:43:36,240 --> 00:43:40,740
Because currently, we are
very much a web organization,

929
00:43:40,740 --> 00:43:43,780
and people access
us through browsers.

930
00:43:43,780 --> 00:43:49,290
And a few people access us
through the Wikipedia app,

931
00:43:49,290 --> 00:43:54,720
but it's almost the same as
reading it in the browser.

932
00:43:54,720 --> 00:43:56,970
What could we do with
these networks, given

933
00:43:56,970 --> 00:44:01,050
that they are so important and
popular, possibly even more

934
00:44:01,050 --> 00:44:02,662
popular than Facebook by now?

935
00:44:02,662 --> 00:44:03,620
What could we do there?

936
00:44:03,620 --> 00:44:06,338


937
00:44:06,338 --> 00:44:08,800
PANTHEA LEE: That's
a good question.

938
00:44:08,800 --> 00:44:11,690
So I think that first
of all it's just I

939
00:44:11,690 --> 00:44:13,460
think really
understanding what people

940
00:44:13,460 --> 00:44:15,950
are using these networks for.

941
00:44:15,950 --> 00:44:20,870
And so, you know, it wasn't
really a focus of our research,

942
00:44:20,870 --> 00:44:24,710
but it sort of emerged
quite quickly as a key trend

943
00:44:24,710 --> 00:44:25,787
that we are seeing.

944
00:44:25,787 --> 00:44:27,620
So, you know, I think
I mentioned, you know,

945
00:44:27,620 --> 00:44:29,920
whether it was study groups,
I think that is actually--

946
00:44:29,920 --> 00:44:33,980
that could be a big opportunity
for Wikipedia in terms

947
00:44:33,980 --> 00:44:36,110
of people are talking
about homework assignments,

948
00:44:36,110 --> 00:44:40,220
they're debating topics
learned in class and whatnot

949
00:44:40,220 --> 00:44:42,380
sort of on these networks.

950
00:44:42,380 --> 00:44:46,910
And is there a way
to bring in Wikipedia

951
00:44:46,910 --> 00:44:49,880
to be sort of like
a context provider

952
00:44:49,880 --> 00:44:53,130
when people are having these
debates and conversations.

953
00:44:53,130 --> 00:44:58,181
I'm not sure of the specific
sort of product strategy

954
00:44:58,181 --> 00:45:00,680
that would mean, but basically
how do you be sort of organic

955
00:45:00,680 --> 00:45:02,300
and in context.

956
00:45:02,300 --> 00:45:04,940
I think there's been some
experiments with WhatsApp chat

957
00:45:04,940 --> 00:45:05,509
bots?

958
00:45:05,509 --> 00:45:06,050
I'm not sure.

959
00:45:06,050 --> 00:45:08,810
I think I heard
something about this.

960
00:45:08,810 --> 00:45:12,590
But those could be
ways to help people,

961
00:45:12,590 --> 00:45:14,390
you know, again, sort
of get information

962
00:45:14,390 --> 00:45:18,650
from Wikipedia that is, in many
markets, you know, low-cost

963
00:45:18,650 --> 00:45:20,830
and where people are.

964
00:45:20,830 --> 00:45:22,115
I think that there's also--

965
00:45:22,115 --> 00:45:24,710


966
00:45:24,710 --> 00:45:31,590
in many of these markets, we
found that the ways that--

967
00:45:31,590 --> 00:45:35,090
so this now maybe ties a little
bit into the editor's work

968
00:45:35,090 --> 00:45:40,710
that we've been doing, but
in some of these markets,

969
00:45:40,710 --> 00:45:45,080
there are not great
representations

970
00:45:45,080 --> 00:45:47,730
of, like, essentially
sort of people's,

971
00:45:47,730 --> 00:45:51,090
like, culture in
context on the internet.

972
00:45:51,090 --> 00:45:54,510
I think Wikipedia's
current model

973
00:45:54,510 --> 00:45:57,420
of, you know, determining sort
of what is good content that

974
00:45:57,420 --> 00:46:01,380
can be included on
the site presents

975
00:46:01,380 --> 00:46:05,040
some barriers to cultures
that may be sort of more oral

976
00:46:05,040 --> 00:46:10,650
or may not have the
resources and the references,

977
00:46:10,650 --> 00:46:17,490
rather, to be, like, you know,
a verified and trustworthy sort

978
00:46:17,490 --> 00:46:20,850
of Wikipedia article.

979
00:46:20,850 --> 00:46:29,480
Are there ways to have
different versions of Wikipedia,

980
00:46:29,480 --> 00:46:33,920
perhaps, that allow different
types of contribution

981
00:46:33,920 --> 00:46:37,880
that you might be sourcing
via these chat channels

982
00:46:37,880 --> 00:46:40,520
that then sort of gets
put into larger processes

983
00:46:40,520 --> 00:46:43,100
of verification and whatnot.

984
00:46:43,100 --> 00:46:45,562
Do you guys think about it as
a way to sort of get content,

985
00:46:45,562 --> 00:46:48,020
or just source material that
then might be put through more

986
00:46:48,020 --> 00:46:48,890
rigorous processes?

987
00:46:48,890 --> 00:46:52,250
I think those are also things
that you could explore.

988
00:46:52,250 --> 00:46:55,240
But yeah, those are some ideas.

989
00:46:55,240 --> 00:46:57,240
PRESENTER: Amir, does
that answer your question?

990
00:46:57,240 --> 00:46:59,345
Do you have follow-ups
or comments?

991
00:46:59,345 --> 00:47:02,596


992
00:47:02,596 --> 00:47:05,650
AUDIENCE: Yeah, it
answers the question.

993
00:47:05,650 --> 00:47:07,540
Just a tiny follow-up.

994
00:47:07,540 --> 00:47:09,360
You only mentioned
WhatsApp, but I

995
00:47:09,360 --> 00:47:11,740
know that there are
several other platforms

996
00:47:11,740 --> 00:47:14,350
around the world, like
Viber or WeChat or Telegram.

997
00:47:14,350 --> 00:47:17,950


998
00:47:17,950 --> 00:47:22,970
Are there any others that you
suggest looking closely at?

999
00:47:22,970 --> 00:47:25,000
Which are the important
ones around the world,

1000
00:47:25,000 --> 00:47:28,400
or maybe in particular regions?

1001
00:47:28,400 --> 00:47:33,290
PANTHEA LEE: I know some of
this is in our findings memo.

1002
00:47:33,290 --> 00:47:38,750
I do know, in both markets,
WhatsApp was dominant.

1003
00:47:38,750 --> 00:47:43,580
I know Line and Telegram are--

1004
00:47:43,580 --> 00:47:45,440
and people use them
for various purposes.

1005
00:47:45,440 --> 00:47:47,090
You know, we've
heard WhatsApp we

1006
00:47:47,090 --> 00:47:49,640
might use for more sort of
professional and school

1007
00:47:49,640 --> 00:47:51,740
reasons, but we might
use Line because we

1008
00:47:51,740 --> 00:47:54,760
like their emojis better.

1009
00:47:54,760 --> 00:47:58,400
And so we didn't
really do a deep dive

1010
00:47:58,400 --> 00:48:01,040
into comparing the
messaging apps,

1011
00:48:01,040 --> 00:48:07,960
but there is good market
research on some of this.

1012
00:48:07,960 --> 00:48:11,150
A note of caution on the
market research is oftentimes,

1013
00:48:11,150 --> 00:48:12,920
they really sort
of just give you

1014
00:48:12,920 --> 00:48:15,320
penetration in terms of,
like, downloads, whatnot.

1015
00:48:15,320 --> 00:48:19,280
And that may not be the
same thing as usage.

1016
00:48:19,280 --> 00:48:22,700
And so I think sort of probing
into why people are using

1017
00:48:22,700 --> 00:48:24,380
specific messaging
platforms, for what

1018
00:48:24,380 --> 00:48:27,380
and how could be something
that you guys might

1019
00:48:27,380 --> 00:48:30,065
want to think about further.

1020
00:48:30,065 --> 00:48:30,564
Yeah.

1021
00:48:30,564 --> 00:48:32,820
PRESENTER: All right.

1022
00:48:32,820 --> 00:48:34,080
We have a question there.

1023
00:48:34,080 --> 00:48:36,230
Thank you, Amir.

1024
00:48:36,230 --> 00:48:38,650
And we have Rosie
there with a question.

1025
00:48:38,650 --> 00:48:40,100
Do we have the mic in the back?

1026
00:48:40,100 --> 00:48:53,407


1027
00:48:53,407 --> 00:48:54,240
AUDIENCE: Thank you.

1028
00:48:54,240 --> 00:48:56,900
This has been pretty
enlightening for me.

1029
00:48:56,900 --> 00:49:00,510
I have a question about
who's being left behind.

1030
00:49:00,510 --> 00:49:03,570
So for the first 15 years,
there's been a lot said,

1031
00:49:03,570 --> 00:49:08,910
a lot written about the
teenagers and young men

1032
00:49:08,910 --> 00:49:13,500
in their 20s are the ones
who kind of dug in deep

1033
00:49:13,500 --> 00:49:19,080
and were the ones that spent
the most time on the Wikipedia

1034
00:49:19,080 --> 00:49:22,050
that we've known of
the last 15 years

1035
00:49:22,050 --> 00:49:25,830
or so, and that
there were definitely

1036
00:49:25,830 --> 00:49:29,990
big segments of people
who were kind of left out.

1037
00:49:29,990 --> 00:49:35,340
A very small percentage of
women versus men, and so on.

1038
00:49:35,340 --> 00:49:37,830
And so I'm wondering, does
your research touch on that?

1039
00:49:37,830 --> 00:49:39,650
Do you have a feeling for--

1040
00:49:39,650 --> 00:49:42,770
and you've spoken to groups of
people in Indonesia and Brazil,

1041
00:49:42,770 --> 00:49:45,690
but can you sense
who has been left out

1042
00:49:45,690 --> 00:49:49,980
of the conversation in
Brazil, in Indonesia?

1043
00:49:49,980 --> 00:49:53,040
And is there a way that,
in the next 15 years,

1044
00:49:53,040 --> 00:49:54,930
we're not going to
replicate this feeling

1045
00:49:54,930 --> 00:49:57,360
of somebody being left out?

1046
00:49:57,360 --> 00:49:59,970
Will we be able to be
more inclusive based

1047
00:49:59,970 --> 00:50:05,010
on the things you're kind
of learning so that we kind

1048
00:50:05,010 --> 00:50:08,771
of learn from what we've
experienced in these first 15

1049
00:50:08,771 --> 00:50:09,270
years?

1050
00:50:09,270 --> 00:50:11,317


1051
00:50:11,317 --> 00:50:13,400
PANTHEA LEE: That's a
really interesting question.

1052
00:50:13,400 --> 00:50:15,850
So I think, on just
your earlier point

1053
00:50:15,850 --> 00:50:22,430
around it being mostly young
men really digging into this,

1054
00:50:22,430 --> 00:50:26,900
I think that our work
on editors, which

1055
00:50:26,900 --> 00:50:30,860
has been in South Korea
and Czech Republic,

1056
00:50:30,860 --> 00:50:35,030
that somewhat confirms
what you're saying here.

1057
00:50:35,030 --> 00:50:38,810
And we were sort of probing
into where you lose people

1058
00:50:38,810 --> 00:50:40,400
in terms of contributors.

1059
00:50:40,400 --> 00:50:45,290
What is the editor experience,
and where do people drop off.

1060
00:50:45,290 --> 00:50:50,390
And I think we see that there
are elements of policies,

1061
00:50:50,390 --> 00:50:54,650
norms, cultural norms and
practices that, you know,

1062
00:50:54,650 --> 00:50:59,720
pose barriers to, I think--
you know, we saw sometimes,

1063
00:50:59,720 --> 00:51:02,540
you know, women
feeling excluded,

1064
00:51:02,540 --> 00:51:04,700
or just it was a difficult--

1065
00:51:04,700 --> 00:51:06,640
I think we heard
people say, you know,

1066
00:51:06,640 --> 00:51:09,800
Wikipedia is almost more
hassle than it's worth,

1067
00:51:09,800 --> 00:51:13,340
getting into the tussles
to be able to contribute.

1068
00:51:13,340 --> 00:51:14,630
And so I think we saw that--

1069
00:51:14,630 --> 00:51:17,630
I'm not sure-- and I think
now people are then finding

1070
00:51:17,630 --> 00:51:19,640
and founding other--

1071
00:51:19,640 --> 00:51:23,390
you know, we saw FemiWiki in
South Korea, this new resource

1072
00:51:23,390 --> 00:51:27,110
that had been developed by
contributors to Wikipedia

1073
00:51:27,110 --> 00:51:32,210
that just didn't want to get
into the fight, as they saw it.

1074
00:51:32,210 --> 00:51:35,840
We're seeing sort of
older people, retirees

1075
00:51:35,840 --> 00:51:40,730
that we spoke with that
chose instead to contribute

1076
00:51:40,730 --> 00:51:42,650
to other platforms.

1077
00:51:42,650 --> 00:51:48,500
We were analyzing how Coursera,
the sort of MOOC platform, how

1078
00:51:48,500 --> 00:51:53,660
they recruit and, you know,
onboard and then continue

1079
00:51:53,660 --> 00:51:56,780
encouraging contributors
to their platforms.

1080
00:51:56,780 --> 00:51:58,520
And they've been
successful, we've

1081
00:51:58,520 --> 00:52:02,162
seen, with retirees that
actually have a lot to give.

1082
00:52:02,162 --> 00:52:03,620
And so those are
some of the things

1083
00:52:03,620 --> 00:52:06,320
that we're finding through
the editors research.

1084
00:52:06,320 --> 00:52:10,220
In terms of movement strategy, I
think we got a sort of big push

1085
00:52:10,220 --> 00:52:12,350
from Adele and her
team really to look

1086
00:52:12,350 --> 00:52:16,220
at who's been left behind, and
to really talk with populations

1087
00:52:16,220 --> 00:52:19,670
that are not currently
online, lower income

1088
00:52:19,670 --> 00:52:25,100
populations, to test out some of
our hypotheses that, you know,

1089
00:52:25,100 --> 00:52:27,920
Wikipedia may not
be reaching them and may not

1090
00:52:27,920 --> 00:52:30,350
be able to on its
current trajectory.

1091
00:52:30,350 --> 00:52:32,420
And I think that's
really where some

1092
00:52:32,420 --> 00:52:36,680
of the ideas around
partnering with nonprofits

1093
00:52:36,680 --> 00:52:38,840
or other social organizations
that really serve

1094
00:52:38,840 --> 00:52:41,570
these populations and
who's, like, you know,

1095
00:52:41,570 --> 00:52:44,810
core value is doing
outreach to them,

1096
00:52:44,810 --> 00:52:47,660
and then thinking about how you
guys bring your unique model

1097
00:52:47,660 --> 00:52:51,920
to then be the engine
of their work in terms

1098
00:52:51,920 --> 00:52:54,980
of contributing content or,
you know, community members,

1099
00:52:54,980 --> 00:52:56,810
helping them navigate
your content,

1100
00:52:56,810 --> 00:52:59,450
I think that's where some
of those ideas came from,

1101
00:52:59,450 --> 00:53:02,294
out of the push to speak with
those types of populations.

1102
00:53:02,294 --> 00:53:03,210
PRESENTER: Absolutely.

1103
00:53:03,210 --> 00:53:07,520
Yeah, and I think for us, when
we were thinking about this

1104
00:53:07,520 --> 00:53:10,370
and, like, really taking
advantage of design research

1105
00:53:10,370 --> 00:53:13,490
and really being able
to hear from the users

1106
00:53:13,490 --> 00:53:17,140
that we were trying to serve,
we had no limits, right?

1107
00:53:17,140 --> 00:53:19,760
It was really like let's go
there and really understand.

1108
00:53:19,760 --> 00:53:23,990
And I think we were trying to
probe and go further into who

1109
00:53:23,990 --> 00:53:25,880
we are leaving behind now.

1110
00:53:25,880 --> 00:53:29,210
So those are really, like, a
lot of people in these countries,

1111
00:53:29,210 --> 00:53:29,710
right?

1112
00:53:29,710 --> 00:53:36,980
And they are, like, women,
nonwhite populations, right.

1113
00:53:36,980 --> 00:53:43,110
Like, so racially and
ethnically diverse populations.

1114
00:53:43,110 --> 00:53:44,450
Age diversity.

1115
00:53:44,450 --> 00:53:47,770
Like, all the things
that we have--

1116
00:53:47,770 --> 00:53:50,570
that we feel that we should
be paying more attention to.

1117
00:53:50,570 --> 00:53:52,850
And those are part
of the new voices

1118
00:53:52,850 --> 00:53:57,350
that we feel that have not
been represented or included

1119
00:53:57,350 --> 00:53:59,260
in our conversations, right.

1120
00:53:59,260 --> 00:54:02,150
And now, in this
process, we want

1121
00:54:02,150 --> 00:54:05,100
them to be sitting at the
table and being part of that.

1122
00:54:05,100 --> 00:54:07,880
And I think the work that we
did in Brazil and Indonesia,

1123
00:54:07,880 --> 00:54:10,580
but also with all the
other work that we

1124
00:54:10,580 --> 00:54:12,320
have been doing in
all these countries,

1125
00:54:12,320 --> 00:54:16,850
were really our go to that.

1126
00:54:16,850 --> 00:54:18,980
Let's really open
the floor, and let's

1127
00:54:18,980 --> 00:54:21,770
really elevate those
voices and make sure

1128
00:54:21,770 --> 00:54:23,470
that we're hearing from them.

1129
00:54:23,470 --> 00:54:26,960
And we are now at cycle three,
right, in the movement strategy

1130
00:54:26,960 --> 00:54:27,660
process.

1131
00:54:27,660 --> 00:54:32,390
And I think that is how
this information is coming

1132
00:54:32,390 --> 00:54:34,770
right to the communities.

1133
00:54:34,770 --> 00:54:36,440
And we're sharing all that.

1134
00:54:36,440 --> 00:54:37,700
And we are connecting, OK?

1135
00:54:37,700 --> 00:54:40,640
So where we want
to be in 15 years,

1136
00:54:40,640 --> 00:54:46,061
and can we work and have
those new voices as part

1137
00:54:46,061 --> 00:54:47,060
of this movement, right?

1138
00:54:47,060 --> 00:54:49,970
Can we build something
that it's not--

1139
00:54:49,970 --> 00:54:51,950
that is with them.

1140
00:54:51,950 --> 00:54:53,060
And I think we are--

1141
00:54:53,060 --> 00:54:54,560
like, I'm excited
for this phase,

1142
00:54:54,560 --> 00:54:57,950
because I think we have a lot
to exchange, a lot to learn

1143
00:54:57,950 --> 00:54:59,220
from each other, right.

1144
00:54:59,220 --> 00:55:01,850
Like Sati was talking about,
how we learn from them.

1145
00:55:01,850 --> 00:55:06,440
And I think there's so many
things that we have to learn.

1146
00:55:06,440 --> 00:55:08,120
And they are out there, right?

1147
00:55:08,120 --> 00:55:09,870
Like those findings,
the insights,

1148
00:55:09,870 --> 00:55:13,820
and really interesting
creative ideas are out there.

1149
00:55:13,820 --> 00:55:16,670
But we need to open up a
space for the new voices

1150
00:55:16,670 --> 00:55:19,100
to be in the dialogue.

1151
00:55:19,100 --> 00:55:19,600
Yeah.

1152
00:55:19,600 --> 00:55:22,470


1153
00:55:22,470 --> 00:55:23,600
One more question.

1154
00:55:23,600 --> 00:55:25,740
And who is it?

1155
00:55:25,740 --> 00:55:26,240
Jessica.

1156
00:55:26,240 --> 00:55:30,230


1157
00:55:30,230 --> 00:55:32,027
Jessica, are you there?

1158
00:55:32,027 --> 00:55:32,610
JESSICA: Yeah.

1159
00:55:32,610 --> 00:55:33,300
Can you hear me?

1160
00:55:33,300 --> 00:55:35,112
PRESENTER: Yes, we can hear you.

1161
00:55:35,112 --> 00:55:35,820
JESSICA: Perfect.

1162
00:55:35,820 --> 00:55:36,720
Thank you.

1163
00:55:36,720 --> 00:55:39,520
Thank you for this great
talk, to begin with.

1164
00:55:39,520 --> 00:55:41,430
It was really, really,
really interesting.

1165
00:55:41,430 --> 00:55:42,170
Just a quick question.

1166
00:55:42,170 --> 00:55:43,794
And I know we don't
have a lot of time,

1167
00:55:43,794 --> 00:55:48,520
but you mentioned something
that was really interesting,

1168
00:55:48,520 --> 00:55:52,620
I thought, around brand
awareness and brand confusion,

1169
00:55:52,620 --> 00:55:54,740
that a lot of-- about--

1170
00:55:54,740 --> 00:55:57,270
a lot of people know
about Wikipedia,

1171
00:55:57,270 --> 00:56:01,590
but they might not have a good
understanding of who we are

1172
00:56:01,590 --> 00:56:02,970
and how we work.

1173
00:56:02,970 --> 00:56:04,920
And I was just
interested in getting

1174
00:56:04,920 --> 00:56:08,170
some high-level thoughts
and suggestions from you of,

1175
00:56:08,170 --> 00:56:10,560
you know, how that
can be addressed.

1176
00:56:10,560 --> 00:56:13,950
What can we do better as a
movement and a foundation side

1177
00:56:13,950 --> 00:56:19,320
in order to educate people
about who we are and how we work

1178
00:56:19,320 --> 00:56:23,250
and why they should care,
as you put it very clearly?

1179
00:56:23,250 --> 00:56:26,010


1180
00:56:26,010 --> 00:56:29,160
PANTHEA LEE: So, I think that
we're seeing a lot of interest

1181
00:56:29,160 --> 00:56:31,440
from different users around
sort of understanding

1182
00:56:31,440 --> 00:56:36,600
the process behind, like,
how the sausage gets made.

1183
00:56:36,600 --> 00:56:40,050
And we heard people say,
you know, I know Wikipedia,

1184
00:56:40,050 --> 00:56:41,520
but they seem to be--

1185
00:56:41,520 --> 00:56:44,302
they're not very transparent.

1186
00:56:44,302 --> 00:56:45,510
We don't know where they are.

1187
00:56:45,510 --> 00:56:46,980
We don't know how they work.

1188
00:56:46,980 --> 00:56:49,710
But you know, Google seems
much more transparent,

1189
00:56:49,710 --> 00:56:53,434
because Google posts videos
about, you know, how they work.

1190
00:56:53,434 --> 00:56:56,100
And we, like, see their offices,
and their offices are colorful.

1191
00:56:56,100 --> 00:56:57,210
And we're like, really?

1192
00:56:57,210 --> 00:56:59,880
That's so fascinating.

1193
00:56:59,880 --> 00:57:03,180
And so, you know,
and we saw people--

1194
00:57:03,180 --> 00:57:08,520
we had users that were on
the Wikipedia Instagram.

1195
00:57:08,520 --> 00:57:12,420
We were watching them sort of
scroll through, and they go,

1196
00:57:12,420 --> 00:57:14,070
I don't get it.

1197
00:57:14,070 --> 00:57:17,610
Like, what's the
logic, basically,

1198
00:57:17,610 --> 00:57:19,740
behind all of this content?

1199
00:57:19,740 --> 00:57:23,970
You know, because people are--
you guys cover everything.

1200
00:57:23,970 --> 00:57:26,760
And so people are following
Instagram accounts that

1201
00:57:26,760 --> 00:57:29,160
basically relate to
a specific theme,

1202
00:57:29,160 --> 00:57:33,300
you know, like cute cats,
beautiful sunsets, you know,

1203
00:57:33,300 --> 00:57:34,650
space, whatever.

1204
00:57:34,650 --> 00:57:36,990
And so, you know, they're
trying to understand also,

1205
00:57:36,990 --> 00:57:39,360
like, what is Wikipedia,
because I can't make

1206
00:57:39,360 --> 00:57:41,730
sense of all of this content.

1207
00:57:41,730 --> 00:57:43,530
And so, you know,
I think there are

1208
00:57:43,530 --> 00:57:45,510
things that you all
might be able to do

1209
00:57:45,510 --> 00:57:47,310
to expose the process.

1210
00:57:47,310 --> 00:57:49,250
Who are the people
behind Wikipedia?

1211
00:57:49,250 --> 00:57:50,370
People want to know.

1212
00:57:50,370 --> 00:57:52,184
People want to
know how Wikipedia

1213
00:57:52,184 --> 00:57:53,850
is made, who are the
people behind them,

1214
00:57:53,850 --> 00:57:55,500
because they're
trusting-- like, we're

1215
00:57:55,500 --> 00:57:58,390
seeing trust shift from
institutions to individuals.

1216
00:57:58,390 --> 00:58:01,610
And so you guys are a
movement of individuals.

1217
00:58:01,610 --> 00:58:04,020
Let's show that, because--

1218
00:58:04,020 --> 00:58:05,610
and there are things
that you can do,

1219
00:58:05,610 --> 00:58:08,640
whether in terms of
communications campaigns,

1220
00:58:08,640 --> 00:58:11,880
but then also I think on
articles and on the platform

1221
00:58:11,880 --> 00:58:15,270
to be able to show that process
to help people understand

1222
00:58:15,270 --> 00:58:16,920
how it gets made.

1223
00:58:16,920 --> 00:58:19,980
And then I think around
the actual sort of product,

1224
00:58:19,980 --> 00:58:24,210
is it around just aggregating,
you know, just knowledge, you

1225
00:58:24,210 --> 00:58:29,310
know, one platform, and
developing subchannels, topic

1226
00:58:29,310 --> 00:58:33,070
guides, whatever it is, thinking
about how you communicate,

1227
00:58:33,070 --> 00:58:36,670
whether it's subchannels
around your Instagram accounts

1228
00:58:36,670 --> 00:58:38,580
or is there other
things to help people

1229
00:58:38,580 --> 00:58:40,830
understand Wikipedia
is all of these things,

1230
00:58:40,830 --> 00:58:44,220
but I don't have to just
engage with Wikipedia as, like,

1231
00:58:44,220 --> 00:58:45,530
one singular platform.

1232
00:58:45,530 --> 00:58:50,220
You know, I can find whatever
it is that Wikipedia offers

1233
00:58:50,220 --> 00:58:53,790
that is interesting to me and
sort of ignore the other stuff.

1234
00:58:53,790 --> 00:58:57,120
And I think that could be
another interesting area

1235
00:58:57,120 --> 00:58:58,404
to explore.

1236
00:58:58,404 --> 00:58:59,820
But I know the
comms team has been

1237
00:58:59,820 --> 00:59:02,580
doing different sort of
communications campaigns

1238
00:59:02,580 --> 00:59:08,760
in local languages and
through other sort of locally

1239
00:59:08,760 --> 00:59:10,140
relevant distribution channels.

1240
00:59:10,140 --> 00:59:14,580
And so they might be better
suited to speak to that.

1241
00:59:14,580 --> 00:59:17,150
PRESENTER: So I think that
was our last question.

1242
00:59:17,150 --> 00:59:19,040
Thank you, Jessica.

1243
00:59:19,040 --> 00:59:20,220
Thank you, Panthea.

1244
00:59:20,220 --> 00:59:21,230
This was really good.

1245
00:59:21,230 --> 00:59:22,920
We're really happy
to have you here.

1246
00:59:22,920 --> 00:59:27,410
And yeah, thank you for
all attending, as well.

1247
00:59:27,410 --> 00:59:29,730
And that's it for this morning.

1248
00:59:29,730 --> 00:59:30,230
Thank you.

1249
00:59:30,230 --> 00:59:31,420
PANTHEA LEE: Thanks
for having me.

1250
00:59:31,420 --> 00:59:32,800
This has been really fun work.

1251
00:59:32,800 --> 00:59:36,150
[CLAPPING]

1252
00:59:36,150 --> 00:59:37,543