English subtitles for clip: File:WikiCon 2023 - Wikipedia. Macht. Bilder. Ein Streitgespräch über (Gender) Bias und Bilder in der Wikipedia.webm

1
00:00:00,000 --> 00:00:10,000
[Subtitles with Whisper Medium/revised] So, welcome. I hope that Eva and Ziko are both online now.

2
00:00:10,000 --> 00:00:12,000
Okay, he's nodding, so that's settled.

3
00:00:12,000 --> 00:00:18,000
Yes, Eva was my colleague at the University of Mannheim for a long time, and Ziko was also our guest in Mannheim.

4
00:00:18,000 --> 00:00:22,000
That's why I'm all the more pleased to be able to briefly introduce the two of them today

5
00:00:22,000 --> 00:00:27,000
and then moderate the discussion.

6
00:00:27,000 --> 00:00:31,000
Eva Gredel is a junior professor in Duisburg-Essen.

7
00:00:31,000 --> 00:00:35,000
She is a junior professor of German linguistics

8
00:00:35,000 --> 00:00:40,000
and her focus is on digital discourse and education in the digital world.

9
00:00:40,000 --> 00:00:44,000
In both areas she also deals with Wikipedia,

10
00:00:44,000 --> 00:00:49,000
with Wikipedia didactics, i.e. the use of Wikipedia in teaching, and

11
00:00:49,000 --> 00:00:53,000
with Wikipedia studies, i.e. the scholarly research on Wikipedia.

12
00:00:53,000 --> 00:00:57,000
And in particular she deals with digital multimodal discourses

13
00:00:57,000 --> 00:01:00,000
and also the gender bias in Wikipedia.

14
00:01:00,000 --> 00:01:05,000
In 2017 she was co-organizer of WikiDACH in Mannheim

15
00:01:05,000 --> 00:01:10,000
and also gave a lecture at this year's Femnetz meeting.

16
00:01:10,000 --> 00:01:18,000
Ziko van Dijk has been active since 2003 in Wikipedia and at Wikimedia Germany and Wikimedia Netherlands, and

17
00:01:18,000 --> 00:01:25,000
was also on the board and the arbitration committee of the Dutch and the German Wikipedia.

18
00:01:25,000 --> 00:01:30,000
He is a co-founder of Klexikon, an online encyclopedia for children,

19
00:01:30,000 --> 00:01:36,000
holds a doctorate in history and has a particular interest in images in Wikipedia,

20
00:01:36,000 --> 00:01:39,000
including AI-generated images.

21
00:01:39,000 --> 00:01:45,000
He has already published two books on Wikipedia, in 2010 and 2021,

22
00:01:45,000 --> 00:01:48,000
such as “Understanding Wikis and Wikipedia”.

23
00:01:48,000 --> 00:01:52,000
Both speakers share an interest in images on Wikipedia,

24
00:01:52,000 --> 00:01:57,000
and they will now explain to us their different views on the topic.

25
00:01:57,000 --> 00:02:01,000
And at the end, as I said, there is time for discussion,

26
00:02:01,000 --> 00:02:04,000
both here in the room and via the chat.

27
00:02:04,000 --> 00:02:08,000
And I will then pass on the questions.

28
00:02:08,000 --> 00:02:10,000
Welcome, Eva and Ziko.

29
00:02:10,000 --> 00:02:11,000
Hello.

30
00:02:11,000 --> 00:02:12,000
Hello.

31
00:02:12,000 --> 00:02:13,000
Yes.

32
00:02:13,000 --> 00:02:21,000
Yes, according to the structure that we have planned for our discussion,

33
00:02:21,000 --> 00:02:27,000
that is how we named the format, I would now first

34
00:02:27,000 --> 00:02:32,000
present my linguistic, partly semiotic and visual-studies view,

35
00:02:32,000 --> 00:02:37,000
with a focus on women in Wikipedia.

36
00:02:37,000 --> 00:02:39,000
But it also goes somewhat beyond that.

37
00:02:39,000 --> 00:02:43,000
And after that, as mentioned, Ziko's point of view will follow.

38
00:02:43,000 --> 00:02:46,000
I can say for myself that I'm a bit excited

39
00:02:46,000 --> 00:02:49,000
because the technical connection is also a bit more complex.

40
00:02:49,000 --> 00:02:53,000
In any case, I'm looking forward to the discussion afterwards

41
00:02:53,000 --> 00:02:55,000
and hope that our discussion,

42
00:02:55,000 --> 00:03:01,000
which in the end may not turn out to be all that contentious or controversial,

43
00:03:01,000 --> 00:03:07,000
will nevertheless bring good food for thought to the Wikipedia community.

44
00:03:08,000 --> 00:03:14,000
Maybe let's go back to the overarching title that Ziko and I came up with:

45
00:03:14,000 --> 00:03:17,000
Wikipedia. Macht. Bilder. [in German, “Macht” means both “power” and “makes”]

46
00:03:17,000 --> 00:03:22,000
This is a title that can be read in different ways.

47
00:03:22,000 --> 00:03:28,000
On the one hand, you can see that of course many Internet users

48
00:03:28,000 --> 00:03:32,000
access Wikipedia and use it as a central source of information.

49
00:03:32,000 --> 00:03:36,000
And you can then express the assumption, the supposition,

50
00:03:36,000 --> 00:03:44,000
that Wikipedia, as a central source of information, influences people's view of the world,

51
00:03:44,000 --> 00:03:49,000
so to speak, conveys images of the world.

52
00:03:49,000 --> 00:03:53,000
And on the other hand, you can also read it in such a way that the question is

53
00:03:53,000 --> 00:04:00,000
who actually has the power in Wikipedia to push certain images through on discussion pages,

54
00:04:00,000 --> 00:04:06,000
i.e. to decide the question of how certain facts

55
00:04:06,000 --> 00:04:10,000
in the analogue world should be depicted and illustrated.

56
00:04:10,000 --> 00:04:14,000
And my title, today it's about wiki world views,

57
00:04:14,000 --> 00:04:17,000
I subtitled it with “underrated images,

58
00:04:17,000 --> 00:04:20,000
captions and digital image practices”.

59
00:04:20,000 --> 00:04:24,000
So my concern now is how

60
00:04:24,000 --> 00:04:28,000
worldviews are constructed and created using images in Wikipedia.

61
00:04:28,000 --> 00:04:35,000
And this is an aspect, a topic, that is really very much worth discussing.

62
00:04:35,000 --> 00:04:40,000
Exactly, before we get into the actual presentations,

63
00:04:40,000 --> 00:04:45,000
we wanted to have two blocks in which Ziko and I would present our statements.

64
00:04:45,000 --> 00:04:51,000
And you could show my second slide at this point for my statements.

65
00:04:51,000 --> 00:04:54,000
Ziko and I each thought about three theses.

66
00:04:54,000 --> 00:04:57,000
I have added a small preview of the topic

67
00:04:57,000 --> 00:05:01,000
that we can only touch on today, but which will probably also

68
00:05:01,000 --> 00:05:06,000
be of relatively great importance for the Wikipedia community in the future.

69
00:05:06,000 --> 00:05:10,000
So my theses are that images don't just lighten up Wikipedia articles.

70
00:05:10,000 --> 00:05:14,000
They do not merely illustrate them; rather, they suggest world views

71
00:05:14,000 --> 00:05:19,000
and can also lead to thematic distortions and bias.

72
00:05:20,000 --> 00:05:24,000
Secondly, captions concretize the meaning,

73
00:05:24,000 --> 00:05:27,000
the relatively open meaning of images in the respective context.

74
00:05:27,000 --> 00:05:31,000
And they should therefore not be underestimated in Wikipedia.

75
00:05:31,000 --> 00:05:35,000
These are relatively small texts, the captions, but in my opinion they have

76
00:05:35,000 --> 00:05:39,000
great importance for the overall meaning of an article.

77
00:05:39,000 --> 00:05:44,000
And finally, the handling of images and text-image constellations in the culture

78
00:05:44,000 --> 00:05:49,000
of digitality is a topic worth discussing, also for the Wikipedia community.

79
00:05:49,000 --> 00:05:53,000
We have digital platforms that are of course very image-heavy, such as

80
00:05:53,000 --> 00:05:58,000
Instagram, but in Wikipedia, images and their captions

81
00:05:58,000 --> 00:06:02,000
and the texts surrounding images are also very important.

82
00:06:02,000 --> 00:06:06,000
And I think that there is still room for improvement

83
00:06:06,000 --> 00:06:10,000
when it comes to the rules on how to use images in Wikipedia, and that one or two aspects

84
00:06:10,000 --> 00:06:13,000
could perhaps be included in the Wikipedia rules.

85
00:06:13,000 --> 00:06:19,000
And the outlook concerns the distribution of images generated with AI tools.

86
00:06:19,000 --> 00:06:25,000
And from my point of view, there are completely new challenges associated with these

87
00:06:25,000 --> 00:06:30,000
AI-generated images, especially for the design of Wikimedia projects.

88
00:06:30,000 --> 00:06:34,000
Exactly, those are my theses. Now, just before I start the lecture, I would

89
00:06:34,000 --> 00:06:39,000
hand it over to Ziko so that he can briefly present his theses.

90
00:06:40,000 --> 00:06:45,000
Yes, thank you Eva. Well, I'll make it short and painless or we'll see.

91
00:06:45,000 --> 00:06:52,000
Thesis 1. The Wikimedia movement needs to do a better job with images and I'll have examples of that in a moment.

92
00:06:52,000 --> 00:06:58,000
Thesis 2. What can have an influence does not automatically always have it.

93
00:06:58,000 --> 00:07:06,000
And thesis 3. Emancipatory impetus can slide into problematic collectivism.

94
00:07:06,000 --> 00:07:12,000
Bamm, well, we'll see about that in a moment. Now I'm looking forward to Eva's lecture.

95
00:07:12,000 --> 00:07:17,000
Yes, thank you very much Ziko, then I would continue. On the next slide I would like to briefly

96
00:07:17,000 --> 00:07:22,000
go over the key points that Maja mentioned. Well, I have a junior professorship

97
00:07:22,000 --> 00:07:26,000
with tenure track in German linguistics, but linguistics has

98
00:07:26,000 --> 00:07:32,000
also been greatly expanded in recent years to include semiotic and visual-studies aspects.

99
00:07:32,000 --> 00:07:38,000
Multimodality, i.e. the use of images, videos and audio documents,

100
00:07:38,000 --> 00:07:43,000
also plays a major role in linguistics. And it is precisely this multimodality,

101
00:07:43,000 --> 00:07:48,000
i.e. the use of images, video and audio material in Wikipedia, for example, that was also the subject of my

102
00:07:48,000 --> 00:07:55,000
habilitation thesis, which I defended in a habilitation colloquium in 2022.

103
00:07:55,000 --> 00:08:02,000
And I was also interested in what effects certain image inventories

104
00:08:02,000 --> 00:08:07,000
have on meaning in articles. Yes, in the last few years

105
00:08:07,000 --> 00:08:13,000
some of my publications on Wikipedia and Wikipedia didactics have appeared in the context of this habilitation project.

106
00:08:13,000 --> 00:08:20,000
So I'm interested in how one can introduce Internet users, especially young ones, to the reflective use of Wikipedia

107
00:08:20,000 --> 00:08:24,000
in teaching contexts, be it at school or university. I experience again

108
00:08:24,000 --> 00:08:29,000
and again that the so-called digital natives in particular access Wikipedia very heavily,

109
00:08:29,000 --> 00:08:33,000
but, for example, do not know the discussion pages and the version histories at all,

110
00:08:33,000 --> 00:08:38,000
so they have never looked behind the scenes of Wikipedia, and that they are missing very

111
00:08:38,000 --> 00:08:44,000
central aspects for understanding wikis in general and Wikipedia in particular.

112
00:08:44,000 --> 00:08:48,000
And that's why I take the approach, for example in my university seminars here with students,

113
00:08:48,000 --> 00:08:54,000
to look closely at where, for example, you can find arguments for using certain images on the

114
00:08:54,000 --> 00:08:58,000
discussion pages. So that's briefly about me. And it is precisely against this background

115
00:08:58,000 --> 00:09:04,000
that I would now like

116
00:09:04,000 --> 00:09:10,000
to place my presentation today in a larger context on the next slide. Why do I integrate Wikipedia into my university seminars? We have now had

117
00:09:10,000 --> 00:09:17,000
new educational standards in place since 2022, and in the theoretical framework for these educational standards,

118
00:09:17,000 --> 00:09:22,000
that is, the skills that today's students are supposed to acquire in German lessons,

119
00:09:22,000 --> 00:09:31,000
there is a lot about the topic of digitality and digitality as a culture-shaping

120
00:09:31,000 --> 00:09:37,000
framework of our time. And very specifically, these educational standards, which

121
00:09:38,159 --> 00:09:44,159
are to be implemented in schools nationwide, or at the state level,

122
00:09:44,159 --> 00:09:48,000
are intended to influence the educational plans and school curricula.

123
00:09:48,000 --> 00:09:55,000
Stalder, with his book “Culture of Digitality”, is explicitly mentioned there, and he says that there are three central forms of the culture

124
00:09:55,000 --> 00:10:00,000
of digitality. On the one hand, there is communality, referentiality and

125
00:10:00,000 --> 00:10:07,000
algorithmicity. When it comes to communality, a number of

126
00:10:07,000 --> 00:10:13,000
projects have emerged in the last 20 or 30 years in which commons have been created. This also includes Wikipedia

127
00:10:13,000 --> 00:10:20,000
with its freely accessible texts. And Stalder explains that new

128
00:10:20,000 --> 00:10:26,000
commons institutions have emerged in this context and he describes Wikipedia, for example, as a

129
00:10:26,000 --> 00:10:31,000
so-called ad hoc meritocracy, meaning that those who have been there for a long time also

130
00:10:31,000 --> 00:10:39,000
have the power to edit certain texts and to bring images into the negotiation of encyclopedically relevant

131
00:10:39,000 --> 00:10:44,000
issues. Regarding referentiality, he points out, for example in relation to

132
00:10:44,000 --> 00:10:51,000
digital photography, that with the possibility of taking digital images using smartphones,

133
00:10:51,000 --> 00:10:59,000
the artifacts, i.e. photos, have multiplied and are easily available, and that

134
00:10:59,000 --> 00:11:05,000
many people can access these digital artifacts over the Internet, share them,

135
00:11:05,000 --> 00:11:11,000
link them and use them for their own purposes. And finally he describes

136
00:11:11,000 --> 00:11:17,000
algorithmicity. I think everyone knows that. And the aspect that is particularly important is that

137
00:11:17,000 --> 00:11:23,000
artificial intelligence can now be used to generate images, for example. This means that

138
00:11:23,000 --> 00:11:29,000
the number of available images increases even further and a

139
00:11:29,000 --> 00:11:35,000
tendency in the depiction of certain people or facts can also be identified here. Then I'll move

140
00:11:35,000 --> 00:11:40,000
on to the next slide and would like to share some thoughts from linguistics here. So in

141
00:11:40,000 --> 00:11:45,000
linguistics it is assumed that many resources, sign resources such as language, images, typography,

142
00:11:45,000 --> 00:11:51,000
contribute to the creation of meaning in texts. And that it is their interaction that is important in

143
00:11:51,000 --> 00:11:57,000
creating meaning in texts. And these resources fulfill very different tasks.

144
00:11:57,000 --> 00:12:02,000
I'll show you what that means in detail in a moment. Maybe another important quote. As early as 2009, Warnke wrote

145
00:12:02,000 --> 00:12:08,000
with regard to the knowledge society that the naive view that knowledge societies are

146
00:12:08,000 --> 00:12:12,000
protected against the influence of opinions by rationally grounded, intersubjectively secured knowledge is based on

147
00:12:12,000 --> 00:12:17,000
the assumption that knowledge is opinion-neutral and ultimately power-neutral.

148
00:12:17,000 --> 00:12:22,000
The opposite is the case. Because knowledge is fundamentally contested, especially in

149
00:12:22,000 --> 00:12:30,000
knowledge societies that communicate via mass media. What does that mean? The assumption that knowledge can be neutral,

150
00:12:30,000 --> 00:12:36,000
opinion-neutral and power-neutral, so that constellations of power play no role: he rejects that thesis here;

151
00:12:36,000 --> 00:12:42,000
rather, knowledge is fundamentally contested and is intensively negotiated. I think this is a

152
00:12:42,000 --> 00:12:48,000
point that also applies very, very strongly to Wikipedia. On the next slide I would like

153
00:12:48,000 --> 00:12:56,000
to show what functions and properties are attributed to images in linguistics and also what

154
00:12:56,000 --> 00:13:02,000
function language fulfills. When it comes to images, it means that they are

155
00:13:02,000 --> 00:13:07,000
perceived simultaneously and holistically, that they can be taken in quickly, that they are highly memorable and impactful,

156
00:13:07,000 --> 00:13:13,000
and that they are linked to emotions. And when it comes to meaning potential, one can say that images

157
00:13:13,000 --> 00:13:18,000
are accompanied by an excess of meaning because they can show objects rich in features. But at the same time they are vague and

158
00:13:18,000 --> 00:13:24,000
underdetermined. This means that they bring with them a wide range of potential meanings. In addition,

159
00:13:24,000 --> 00:13:29,000
images can also be accompanied by emotional appeals and instructions for action. With

160
00:13:29,000 --> 00:13:34,000
language, perception is successive, linear and significantly slower than with

161
00:13:34,000 --> 00:13:40,000
images. Its meaning is firmly established, precise and definite

162
00:13:40,000 --> 00:13:45,000
compared to the meaning of images. When we think about scientific texts, I think this is very

163
00:13:45,000 --> 00:13:51,000
intuitive and insightful. And texts can represent action in time, create logical connections

164
00:13:51,000 --> 00:13:56,000
and a variety of references are possible. For example, you can use language to

165
00:13:56,000 --> 00:14:01,000
talk about language, which is not the case with images; you

166
00:14:01,000 --> 00:14:07,000
can hardly reflect on images by means of images. On the next slide I would like to clarify this a little more. I

167
00:14:07,000 --> 00:14:12,000
took a closer look at the sustainability discourse in Wikipedia, i.e. the question of how sustainability

168
00:14:12,000 --> 00:14:18,000
is specifically negotiated and presented in the article Sustainability in Wikipedia. And here it is

169
00:14:18,000 --> 00:14:24,000
the case that one or more images can be found in the Wikipedia

170
00:14:24,000 --> 00:14:30,000
article. And one of them shows some kind of building, probably a stable with lots and lots of

171
00:14:30,000 --> 00:14:38,000
pigs. And now the question is, what is it and what it is supposed to show us is

172
00:14:38,000 --> 00:14:43,000
made concrete by the caption. But now it is the case that this is being negotiated very heavily and

173
00:14:43,000 --> 00:14:48,000
the version history can be used to reconstruct that the caption

174
00:14:48,000 --> 00:14:55,000
is also changed very often. Namely, a user on November 17, 2017 at 16:28 describes what can be

175
00:14:55,000 --> 00:15:01,000
seen here in the picture as modern factory farming. And this is defined in the Duden dictionary relatively clearly and with negative

176
00:15:01,000 --> 00:15:07,000
connotations as follows. This is mechanized animal husbandry on large farms to

177
00:15:07,000 --> 00:15:13,000
produce as many animal products as possible; so it has a rather negative connotation. In the second caption, by contrast,

178
00:15:13,000 --> 00:15:17,000
it says that modern livestock farming can be sustainable if efficient

179
00:15:17,000 --> 00:15:23,000
use of resources, environmental requirements, animal husbandry conditions and so on

180
00:15:23,000 --> 00:15:29,000
are taken into account. So you can see here that one and the same picture

181
00:15:29,000 --> 00:15:35,000
is given very different captions and that the potential meaning that goes along with the picture

182
00:15:35,000 --> 00:15:41,000
is intended to be made more precise through the writing, through the text that accompanies the picture. And we also see that the

183
00:15:41,000 --> 00:15:47,000
image here is relatively “patient”, tolerating very different captions, which just illustrates the open meaning potential

184
00:15:47,000 --> 00:15:52,000
and the properties that go along with images, in contrast to what writing and text

185
00:15:52,000 --> 00:15:58,000
can achieve. Then I would come to the next slide and would like to briefly

186
00:15:58,000 --> 00:16:04,000
mention the information in Wikipedia about illustrating articles. This means that

187
00:16:04,000 --> 00:16:09,000
illustrating the article can significantly contribute to understanding the article text and also

188
00:16:09,000 --> 00:16:13,000
loosen up the body text. From my point of view, this is a point where images are actually underestimated.

189
00:16:13,000 --> 00:16:18,000
So pictures and illustrations of articles are not just about loosening things up, but actually,

190
00:16:18,000 --> 00:16:23,000
as I said, world views are conveyed. The rule page wants the carefully selected media files

191
00:16:23,000 --> 00:16:28,000
to be integrated at the appropriate place in relation to the text in such a way that the text is neither dominated

192
00:16:28,000 --> 00:16:33,000
nor its readability impaired. So this is specifically about text-image constellations.

193
00:16:33,000 --> 00:16:38,000
There should be a reference to the content, a significance, i.e. relevance of the image and, to a certain

194
00:16:38,000 --> 00:16:43,000
extent, representativeness. Now I would like to look at individual examples on the next

195
00:16:43,000 --> 00:16:51,000
slide and would like to show that what is relatively little discussed here

196
00:16:51,000 --> 00:16:57,000
can be problematic in certain articles or should at least stimulate discussion. Exactly,

197
00:16:57,000 --> 00:17:05,000
next slide please. Exactly, here we have the image inventory, i.e. the images that

198
00:17:05,000 --> 00:17:12,000
are integrated into the article about Marie Curie, and I would like to

199
00:17:12,000 --> 00:17:18,000
add a quote from the Diversathon 2021, an event within the Wikipedia community

200
00:17:18,000 --> 00:17:24,000
at which the handling of images was discussed, especially in relation to women, and it says that

201
00:17:24,000 --> 00:17:29,000
in articles about women, avoid using images of the husband, father or family

202
00:17:29,000 --> 00:17:34,000
unless it is absolutely necessary and adds quality to the article. These images

203
00:17:34,000 --> 00:17:39,000
tend to ignore women's individuality without adding important information to the article.

204
00:17:39,000 --> 00:17:45,000
We now have a person here who

205
00:17:45,000 --> 00:17:50,000
developed her important scientific findings at the end of the 19th century and the beginning of the 20th century, and that is why we also have

206
00:17:50,000 --> 00:17:56,000
historical image inventories here. And as you can see, what is described in the Diversathon is here

207
00:17:56,000 --> 00:18:01,000
actually implemented, namely that Marie Curie

208
00:18:01,000 --> 00:18:07,000
is shown in context, in constellations of images that depict, for example, the father and the husband, and in which Marie Curie

209
00:18:07,000 --> 00:18:13,000
is in many places not centrally placed, even though it is the Wikipedia article

210
00:18:13,000 --> 00:18:20,000
about her. Here, for example, we have Wladyslaw Sklodowski with his three daughters, and

211
00:18:20,000 --> 00:18:27,000
one of them is Maria Sklodowska, who was later called Marie Curie. Exactly,

212
00:18:27,000 --> 00:18:33,000
you can see a historical constellation of people here that we probably

213
00:18:33,000 --> 00:18:38,000
often find in pictures in the 19th century, but the question that arises here is whether such a

214
00:18:38,000 --> 00:18:43,000
picture does justice to the requirements of an online encyclopedia of the 21st century when the person

215
00:18:43,000 --> 00:18:50,000
being discussed is not placed centrally here at all, but rather family images are reproduced

216
00:18:50,000 --> 00:18:57,000
that are no longer tenable today, with the father placed so centrally, and that perhaps in

217
00:18:57,000 --> 00:19:02,000
particular do not fit with an article that is actually intended to represent a woman with her scientific

218
00:19:02,000 --> 00:19:07,000
achievements. We then see her again with her daughter and her husband,

219
00:19:07,000 --> 00:19:13,000
Pierre Curie, and, quite interestingly, from an earlier version there is the picture that is captioned

220
00:19:13,000 --> 00:19:18,000
“the widow with her young daughters Eve and Irène”, and here Marie Curie actually

221
00:19:18,000 --> 00:19:24,000
sits in the middle, but in the caption there is the relational

222
00:19:24,000 --> 00:19:30,000
personal name widow, so she is

223
00:19:30,000 --> 00:19:36,000
placed in relation to her already deceased husband, and the question is why the word widow is used here and not, for example, scientist,

224
00:19:36,000 --> 00:19:42,000
so why it needs to be pointed out that there is no man in the picture here. The question is,

225
00:19:42,000 --> 00:19:47,000
is this a relevant aspect, does the image actually have to be made so concrete by the caption?

226
00:19:47,000 --> 00:19:53,000
It has actually already been fixed, so it can no longer be found in the current version,

227
00:19:53,000 --> 00:19:59,000
but of course it is a nice example to illustrate image inventories and captions, text-image constellations.

228
00:19:59,000 --> 00:20:05,000
At the top is the framed picture, Pierre and Marie Curie in their laboratory on Rue Curie.

229
00:20:05,000 --> 00:20:10,000
A very exciting discussion has developed on the corresponding discussion page and I have

230
00:20:10,000 --> 00:20:15,000
another excerpt on the next slide, Simon, if you could click on further, please.

231
00:20:15,000 --> 00:20:22,000
Exactly, thank you very much. The question is which image should actually be used to

232
00:20:22,000 --> 00:20:28,000
show Marie Curie in her work context. There on the talk page it says,

233
00:20:28,000 --> 00:20:33,000
hello Suku, why did you prefer your photo on the left over my photo and revert my change?

234
00:20:33,000 --> 00:20:39,000
This is about Marie, and the main character should look at the viewer, greetings from OS and then the answer,

235
00:20:39,000 --> 00:20:44,000
you are putting the cart before the horse, it is up to you to explain why you

236
00:20:44,000 --> 00:20:49,000
consider your change in this form to be an article improvement. Attributes such as impressive or the idea that

237
00:20:49,000 --> 00:20:54,000
the main character should look at the viewer are rather subjective here. So what this excerpt from the

238
00:20:54,000 --> 00:21:00,000
discussion page shows is that there are actually discussions about images and that over the course of

239
00:21:00,000 --> 00:21:06,000
Wikipedia's history over the last 22 years, a lot has improved in terms of the topic of

240
00:21:07,000 --> 00:21:13,000
reflection on images and image discussions, and the question of how women or other groups

241
00:21:13,000 --> 00:21:19,000
should be portrayed is an example of this, so to speak.

242
00:21:19,000 --> 00:21:26,000
On the next slide I would like to come to another example, which is now about

243
00:21:26,000 --> 00:21:32,000
an architect couple, one is Elisabeth Böhm and the other is Gottfried Böhm, both of whom

244
00:21:32,000 --> 00:21:37,000
have a Wikipedia article. And here too I have brought a quote

245
00:21:37,000 --> 00:21:43,000
that can be found in Wikipedia, but in the context of the English version of Wikipedia.

246
00:21:43,000 --> 00:21:48,000
What you see here in these small excerpts of the image material is that for Elisabeth Böhm there is

247
00:21:48,000 --> 00:21:53,000
significantly less picture material, and we see here above all the family grave, and she is

248
00:21:53,000 --> 00:21:58,000
also shown again together with her husband, and below, in the associative references

249
00:21:58,000 --> 00:22:03,000
that are shown here alongside the pictures,

250
00:22:03,000 --> 00:22:08,000
it is the case that the

251
00:22:08,000 --> 00:22:14,000
men of this family of architects are shown,

252
00:22:14,000 --> 00:22:19,000
but she is not shown there at all.

253
00:22:19,000 --> 00:22:26,000
What about her husband then? The thing is, in his article

254
00:22:26,000 --> 00:22:31,000
you can find a lot of pictures of buildings that he designed, whereas with Elisabeth Böhm, not a single one of the

255
00:22:31,000 --> 00:22:36,000
buildings she designed was integrated into the article at a certain point in time, and that remained the case for quite a long time.

256
00:22:36,000 --> 00:22:41,000
On the next slide I showed it again so that you

257
00:22:41,000 --> 00:22:46,000
can see it a little bit bigger. So the question here is, why is she

258
00:22:46,000 --> 00:22:51,000
included in Wikipedia as an architect? So she met the relevance criteria. But why

259
00:22:51,000 --> 00:22:57,000
aren't her buildings also integrated into the Wikipedia article? Why are there images here

260
00:22:57,000 --> 00:23:04,000
that do not really portray her in her work or in her professional role as an architect?

261
00:23:04,000 --> 00:23:12,000
On the next slide I would like

262
00:23:12,000 --> 00:23:17,000
to go into a text-image constellation and point out something like multimodal blank spaces. But first I would like

263
00:23:17,000 --> 00:23:23,000
to go back to the text that accompanies the pictures. It says that after her marriage in 1948, she worked, as a

264
00:23:23,000 --> 00:23:29,000
wife but not officially as an employee, in the architectural office, and that after

265
00:23:29,000 --> 00:23:35,000
raising the four sons she showed more presence in the

266
00:23:35,000 --> 00:23:40,000
Böhm office. Now these are excerpts that I have highlighted here. So from my point of view,

267
00:23:40,000 --> 00:23:46,000
this is an example of an article that very, very strongly

268
00:23:46,000 --> 00:23:52,000
emphasizes the family roles of this architect, over-emphasizes them in my opinion, and in parts also describes them in a disrespectful manner.

269
00:23:52,000 --> 00:24:00,000
I can certainly ask myself the question: are these formulations that emphasize that she

270
00:24:00,000 --> 00:24:07,000
has combined or united both, i.e. family roles and professional roles, or is it

271
00:24:07,000 --> 00:24:13,000
the case that her status as an architect, her professional role, was not taken seriously

272
00:24:13,000 --> 00:24:18,000
and was discussed here too disrespectfully. As I said, you can see it again here in the frame without

273
00:24:18,000 --> 00:24:24,000
a picture. A photo of buildings that she designed was not integrated into her article for a long time.

274
00:24:24,000 --> 00:24:32,000
On the next slide, however, I will show another excerpt from Gottfried

275
00:24:32,000 --> 00:24:40,000
Böhm's article. And there it is the case that the WDR Arkaden, which

276
00:24:40,000 --> 00:24:45,000
were largely designed by Elisabeth Böhm, are actually integrated with an image. In principle, a suitable image would

277
00:24:45,000 --> 00:24:51,000
have been available via Wikimedia [Commons], but it was only integrated into Gottfried Böhm's article, whereas

278
00:24:51,000 --> 00:25:00,000
no photo of this building can be found at Elisabeth Böhm's. And on the next slide I would like

279
00:25:00,000 --> 00:25:04,000
to talk about digital image practices, also a small aspect that

280
00:25:04,000 --> 00:25:09,000
can perhaps be used to improve one or two articles without much effort. I mentioned earlier that the

281
00:25:09,000 --> 00:25:18,000
Diversathon concluded that it would be appropriate to show women centrally in pictures and not

282
00:25:18,000 --> 00:25:25,000
in family constellations with husbands, sons, whatever. In fact, Elisabeth Böhm had

283
00:25:25,000 --> 00:25:31,000
a picture with her husband integrated into the article for a long time. With Gottfried Böhm, on the other hand,

284
00:25:31,000 --> 00:25:37,000
an image was cropped, a digital image was cropped and adjusted so

285
00:25:37,000 --> 00:25:42,000
that Gottfried Böhm could be seen centrally. And as we can see, this was a picture

286
00:25:42,000 --> 00:25:49,000
in which other people can also be seen. Now the question is: why is the picture of the man cropped

287
00:25:49,000 --> 00:25:55,000
so that he is placed centrally, while for the woman it is actually the family constellation again?

288
00:25:55,000 --> 00:26:05,000
The man is placed centrally in the picture. And then I come to my last substantive slide.

289
00:26:06,000 --> 00:26:11,000
What I can only touch on here, but what I find really interesting and

290
00:26:11,000 --> 00:26:17,000
would like to bring into the discussion here, is the topic of AI-generated images. I titled it a blessing or a curse.

291
00:26:17,000 --> 00:26:22,000
So I definitely see advantages of AI-generated images. Certain historical constellations could perhaps

292
00:26:22,000 --> 00:26:28,000
be presented better or in a more contemporary way using AI-generated images.

293
00:26:28,000 --> 00:26:33,000
But that is a bold thesis. But what I'm noticing at the moment, particularly at Wikimedia, is

294
00:26:33,000 --> 00:26:38,000
that there is an incredibly large number of images posted daily on Wikimedia

295
00:26:38,000 --> 00:26:46,000
that are very, very strongly gender-stereotypical images of women or can be seen as such.

296
00:26:46,000 --> 00:26:52,000
And the question now is: what does that do to Wikimedia and to the illustration of Wikipedia

297
00:26:52,000 --> 00:27:00,000
when we see such a flood of AI-generated images that come nowhere near meeting the criteria

298
00:27:00,000 --> 00:27:06,000
that images in Wikipedia are usually supposed to fulfill? Maybe this is also a nice starting point

299
00:27:06,000 --> 00:27:13,000
for discussing rules in Wikipedia about images. And then I only have the last slide,

300
00:27:13,000 --> 00:27:20,000
which says thank you very much again for your attention. And with that I would hand it over to Ziko for his rejoinder.

301
00:27:20,000 --> 00:27:32,000
Yes, thank you. Can you hear me? Yes, you can hear me? OK. Yes, dear Eva, thank you very much for the very interesting lecture.

302
00:27:32,000 --> 00:27:40,000
I'm afraid that the debate won't be a very spectacular fight, if only because of the friendly space policy.

303
00:27:40,000 --> 00:27:48,000
But I mostly agree with your diagnosis, with an “although”. And then at the end I will say something about the third thesis.

304
00:27:48,000 --> 00:27:53,000
There is still a lot that can be improved in Wikipedia. This means that I often think directly about the therapy,

305
00:27:53,000 --> 00:27:59,000
but I also have to dwell on the diagnosis. Let us then consider the article Elisabeth Böhm.

306
00:27:59,000 --> 00:28:06,000
So overall, some things actually seem strange, the woman as an appendage of the husband.

307
00:28:06,000 --> 00:28:12,000
But when I look closer, there is a navigation bar with a photo and Ms. Böhm is missing from it.

308
00:28:13,000 --> 00:28:20,000
That's true, but others are missing too. And the Wikipedia author, in Wikipedia you can always see transparently who did what,

309
00:28:20,000 --> 00:28:26,000
and it is someone I happen to know, the Wikipedia author then had the choice between a picture of Mrs. and Mr. Böhm

310
00:28:26,000 --> 00:28:32,000
or the picture she chose, with Mr. Böhm and three male relatives. And then the author probably just thought,

311
00:28:32,000 --> 00:28:37,000
I'll take the picture with the most faces. So that's not misogyny, unless of

312
00:28:37,000 --> 00:28:45,000
course you could argue that Ms. Böhm is individually much more relevant than the three gentlemen, you could.

313
00:28:45,000 --> 00:28:52,000
And when it comes to therapy, I think that with navigation bars it's best to take a photo that represents all the elements anyway.

314
00:28:52,000 --> 00:29:00,000
And if you don't have one, you do without it. That should have been done here. So that's actually a point.

315
00:29:00,000 --> 00:29:07,000
Yes, and in the article text there is actually a strong reference to Elisabeth Böhm's husband.

316
00:29:07,000 --> 00:29:13,000
Therapy, you could rewrite the text. But then I would ask myself, would I dare to do that?

317
00:29:13,000 --> 00:29:23,000
Because my feeling tells me that I first need to know the facts better. So what did it look like at the Böhms' home and in the architect's office?

318
00:29:23,000 --> 00:29:28,000
And if I rephrase it just a little bit, then a statement can become incorrect.

319
00:29:28,000 --> 00:29:37,000
Then I would have to look it up again in the literature. But I suspect that the Wikipedia author wrote the article very conscientiously based on the literature.

320
00:29:37,000 --> 00:29:46,000
So the criticized style probably already comes from the works used. And yes, Wikipedia is dependent on literature.

321
00:29:46,000 --> 00:29:50,000
None of us know Böhm personally. So what should you do there?

322
00:29:51,000 --> 00:29:58,000
Another topic, the photo of Marie Curie. Should the main character of a picture look at the viewer?

323
00:29:58,000 --> 00:30:04,000
Or let's put it another way, should the lemma person look at the viewer? And that is a question in itself.

324
00:30:04,000 --> 00:30:13,000
I don't know that exactly. I don't even know of a rule. In any case, I admit that the answer in the quoted discussion might come across as a bit snippy.

325
00:30:13,000 --> 00:30:16,000
And that's not good for the discussion. I understand that.

326
00:30:16,000 --> 00:30:21,000
Allow me to use another example. This summer, or, I don't know, “heurig”, is that what you say?

327
00:30:21,000 --> 00:30:28,000
I always have my Austria hat from the Austrians here. Thank you again, Manfred.

328
00:30:28,000 --> 00:30:35,000
So this summer I worked on the article about the Titanic, this ship with the iceberg.

329
00:30:35,000 --> 00:30:40,000
And in the article there was a long list of the prominent passengers, including the rich gentlemen.

330
00:30:40,000 --> 00:30:52,000
These were the millionaires, and the rich ladies were the millionaires' wives; for example, the owner of the New York department store Macy's, Isidor Straus, and his wife Ida Straus.

331
00:30:52,000 --> 00:30:58,000
This relational thing that Eva has worked out. And then I heard a voice inside me.

332
00:30:58,000 --> 00:31:04,000
Well, what's that supposed to mean? Ida Straus lived like a millionaire, so why sell her short?

333
00:31:04,000 --> 00:31:12,000
I can also simply write here: Ida Straus, millionaire. Maybe that was even Eva's voice in the back of my head that I heard there.

334
00:31:12,000 --> 00:31:19,000
But then I had another voice inside me. Yes, what was it like legally and financially back in 1912?

335
00:31:19,000 --> 00:31:26,000
I'm not an expert on social law at the time, but I'm guessing that Ida Straus didn't have any millions of her own.

336
00:31:26,000 --> 00:31:31,000
At that time, the husband alone controlled the assets in a marriage.

337
00:31:31,000 --> 00:31:38,000
If I were to simply call Ms. Straus a millionaire, then I would ignore inequality.

338
00:31:38,000 --> 00:31:46,000
And so I also have to think about Marie Curie, who does not always have to be prominently portrayed as a widow in the article.

339
00:31:46,000 --> 00:31:54,000
I agree. But yes, back then it was very important for a woman whether she was a wife or a widow.

340
00:31:54,000 --> 00:32:01,000
This must be dealt with appropriately in the article. And how? You have to talk about it and it will then be negotiated.

341
00:32:01,000 --> 00:32:09,000
By the way, I wouldn't say that the article's purpose is to show the person's academic achievements, but rather their entire life.

342
00:32:09,000 --> 00:32:17,000
But yes, of course you have the question in your head: would you have written it that way if it were a man? The widower, the widower, the widower.

343
00:32:17,000 --> 00:32:25,000
I have to say that your third thesis, Eva, could have been formulated more sharply.

344
00:32:25,000 --> 00:32:31,000
I think the Wikimedia movement urgently needs to take image culture to a whole new level.

345
00:32:31,000 --> 00:32:37,000
And your post showed us wonderfully what else we need to pay attention to. Thank you very much for that.

346
00:32:37,000 --> 00:32:45,000
Yes, thank you very much, Ziko. Maybe a brief reply to the rejoinder.

347
00:32:45,000 --> 00:32:54,000
So of course, as for the last point you mentioned, I would think it great if there were more discussions about images and image cultures within Wikipedia.

348
00:32:54,000 --> 00:33:03,000
I think this is exactly the direction it has to go. As I said, for the rule page on illustrating articles I would choose a completely different title.

349
00:33:03,000 --> 00:33:14,000
So it should really be clear that images are used to construct meanings in articles and that certain views of the world are suggested.

350
00:33:14,000 --> 00:33:26,000
So, I think you could really formulate the rules much more sharply and clearly, based on the findings of visual science, and not have such a naive conception of images.

351
00:33:27,000 --> 00:33:36,000
Images loosen up the text but have no further function; I'm really exaggerating now, bringing in the sharpness that some people might expect in a debate.

352
00:33:36,000 --> 00:33:43,000
But I think there is definitely still a need for improvement in order to make the rules more specific.

353
00:33:44,000 --> 00:33:54,000
Maybe let's go back to Elisabeth Böhm. Yes, there are certainly historical facts that suggest certain statements in the article.

354
00:33:54,000 --> 00:34:05,000
But I don't think it's appropriate to structure the article about an architect solely around family turning points.

355
00:34:06,000 --> 00:34:17,000
That means dealing with marriage, the birth of sons and the accompanying, I'll call it parental leave, even though that didn't exist back then.

356
00:34:17,000 --> 00:34:22,000
And then using formulations such as: Böhm then showed more presence in the office.

357
00:34:22,000 --> 00:34:37,000
So from my point of view, this is a formulation that is rather disrespectful and overall adequately honors neither her family work nor her achievements and merits as an architect.

358
00:34:37,000 --> 00:34:42,000
Exactly, and then of course the question is why this one image isn't integrated into her article.

359
00:34:42,000 --> 00:34:50,000
In the meantime, I have to say, the article has been wonderfully revised; in my opinion, a lot has already been achieved with very few edits.

360
00:34:50,000 --> 00:35:08,000
Therefore, to conclude the reply to the rejoinder, I would just like to point out that sometimes small changes can have a very, very big impact and that captions and, above all, of course, images should not be underestimated in the construction, in the production of meanings.

361
00:35:08,000 --> 00:35:13,000
Exactly, with that I'll hand it over to you, Ziko, and look forward to your presentation.

362
00:35:13,000 --> 00:35:15,000
Yes thank you.

363
00:35:15,000 --> 00:35:24,000
So, from afar: yes, dear friends of knowledge, we have heard that images in Wikipedia enhance an article.

364
00:35:24,000 --> 00:35:27,000
And now picture 2, please.

365
00:35:27,000 --> 00:35:32,000
But yes, we don't always use images in a meaningful and appropriate way.

366
00:35:32,000 --> 00:35:42,000
There are long articles but hardly any pictures, or no sufficient caption, or a use of an image where someone really didn't think carefully, as we'll see.

367
00:35:42,000 --> 00:35:52,000
So my topic is: images are important, but not always harmless, and unfortunately the discussions about them are sometimes overshadowed by an attitude that goes too far.

368
00:35:52,000 --> 00:35:59,000
An image on Wikimedia Commons: a Congolese village in a so-called “Völkerschau”, an ethnological show.

369
00:35:59,000 --> 00:36:04,000
In Europe, people from other parts of the world were sometimes exhibited like animals.

370
00:36:05,000 --> 00:36:08,000
Yes, the picture probably seems harmless to many people.

371
00:36:08,000 --> 00:36:12,000
No nudity, no violence, no bloodshed.

372
00:36:12,000 --> 00:36:18,000
But when people are reduced to their ethnic show value, is that really harmless?

373
00:36:18,000 --> 00:36:22,000
And should I use the image at all, and if so, how?

374
00:36:22,000 --> 00:36:29,000
Briefly about me. Among other things, I studied history and worked in image archives and I insert images into Wikipedia articles.

375
00:36:29,000 --> 00:36:33,000
And I do the same with the Klexikon, the encyclopedia for children.

376
00:36:33,000 --> 00:36:41,000
And thinking about a good choice of images led me to literature, such as a contribution by Mr. Ian Ramjohn.

377
00:36:45,000 --> 00:36:49,000
Ian Ramjohn is concerned about how poor countries are portrayed on Wikipedia.

378
00:36:49,000 --> 00:36:51,000
And he writes:

379
00:37:03,000 --> 00:37:09,000
If the competitions are judged through the eyes of the global north, the effect is magnified.

380
00:37:09,000 --> 00:37:14,000
No one wants to give out prizes for things they can regularly see in their own backyard.

381
00:37:14,000 --> 00:37:23,000
But as long as developing countries are portrayed through this lens on Wikipedia, they remain exotic.

382
00:37:23,000 --> 00:37:26,000
Yes, Florence Devouard answered.

383
00:37:26,000 --> 00:37:35,000
She belongs to the Wiki loves Africa organization and explained indignantly that this competition, on the contrary, welcomes the everyday.

384
00:37:35,000 --> 00:37:39,000
And three to four people from the global north take part in the organization.

385
00:37:39,000 --> 00:37:45,000
The rest are all from the south. And most of the jury members also come from Africa.

386
00:37:45,000 --> 00:37:50,000
Did Ian Ramjohn mean a different competition?

387
00:37:50,000 --> 00:37:54,000
Yes, that's probably it, replied Ian Ramjohn sheepishly.

388
00:37:54,000 --> 00:37:59,000
He wrote the post two years ago and can no longer remember.

389
00:37:59,000 --> 00:38:08,000
Well, it also applies to Wiki loves Earth, says Florence Devouard, that most of the national winners in African countries come from there.

390
00:38:08,000 --> 00:38:16,000
And why should we call the winning photos exotic? Everywhere, people try to show the most beautiful places in a country.

391
00:38:16,000 --> 00:38:20,000
Ohh! What went wrong?

392
00:38:20,000 --> 00:38:25,000
Firstly, I find it morally questionable to impute something like this to the competitions.

393
00:38:25,000 --> 00:38:31,000
And just like that, automatically, without having looked at the matter closely.

394
00:38:31,000 --> 00:38:38,000
Secondly, it is counterproductive because those who are attacked in this way can then think: No matter what I do, everyone will just complain.

395
00:38:38,000 --> 00:38:42,000
Why should I advocate for a more global Wikipedia?

396
00:38:42,000 --> 00:38:47,000
Thirdly, the attitude is problematic for the discourse. And another example.

397
00:38:47,000 --> 00:38:51,000
Ian Ramjohn also complains about a Wikipedia discussion.

398
00:38:51,000 --> 00:38:56,000
So when you search for the word Java, what should you see?

399
00:38:56,000 --> 00:39:01,000
The article about the Indonesian island or Java, the programming language?

400
00:39:01,000 --> 00:39:08,000
And then he comments that the sensible people have prevailed, so that the lemma Java is still the island,

401
00:39:08,000 --> 00:39:13,000
but the fact that it was even discussed shows a fundamental weakness.

402
00:39:13,000 --> 00:39:21,000
It is absurd: white people from the global north marginalize 150 million inhabitants of Java

403
00:39:21,000 --> 00:39:25,000
by weighing their merits against a programming language.

404
00:39:25,000 --> 00:39:31,000
And at first I just thought: well yes, but the discussion was 15 years ago.

405
00:39:31,000 --> 00:39:35,000
And about ten people joined in the discussion and the island prevailed.

406
00:39:36,000 --> 00:39:43,000
And then I saw that Wikipedia, in English, had discussed the Java question several times.

407
00:39:43,000 --> 00:39:50,000
With sensible arguments on both sides, for example the article about the programming language

408
00:39:50,000 --> 00:39:55,000
is viewed significantly more often than the one about the island.

409
00:39:55,000 --> 00:40:02,000
Apparently some people don't even want to discuss things because they already have a fixed opinion.

410
00:40:02,000 --> 00:40:07,000
And anyone who holds a different one, well, represents a fundamental weakness.

411
00:40:07,000 --> 00:40:10,000
And he is no longer perceived as an individual with his own opinion,

412
00:40:10,000 --> 00:40:14,000
but only as a representative of his skin color or his origins.

413
00:40:14,000 --> 00:40:19,000
This novelist is also not conducive to discourse.

414
00:40:19,000 --> 00:40:25,000
She complained on Twitter that “a man” had deleted the article about her.

415
00:40:25,000 --> 00:40:30,000
Yes, and you can see how the Twitter crowd supported her

416
00:40:32,000 --> 00:40:38,000
here. And what does the organization “Whose Knowledge?” think?

417
00:40:38,000 --> 00:40:44,000
It wants to make underrepresented groups and minorities more visible. Fine.

418
00:40:44,000 --> 00:40:47,000
Quote from Adele Vrana's video.

419
00:40:47,000 --> 00:40:51,000
Most of the people who write Wikipedia are still white men.

420
00:40:51,000 --> 00:40:58,000
Siko Bouterse: And since who you are affects what you create,

421
00:40:58,000 --> 00:41:02,000
these gaps are also reflected in the content of Wikipedia.

422
00:41:02,000 --> 00:41:06,000
For example, there is a Wikipedia article for every episode of The Simpsons.

423
00:41:06,000 --> 00:41:08,000
Military history is covered pretty well too.

424
00:41:08,000 --> 00:41:12,000
The coverage of female porn stars is also fine.

425
00:41:12,000 --> 00:41:19,000
Vrana: Yes, but there are still tons of biographies of Brazilian women scientists missing

426
00:41:19,000 --> 00:41:24,000
or activists. - Yes, I ask myself, what is the point of such a polemic

427
00:41:24,000 --> 00:41:29,000
that is presented in a giggly mood? Do you want to defame a population group

428
00:41:29,000 --> 00:41:35,000
by saying that they constantly think about sex? We know historical examples of this.

429
00:41:35,000 --> 00:41:42,000
Sure, the statement should be: us against them. But does this do justice to the individual Wikipedia author?

430
00:41:42,000 --> 00:41:47,000
Someone who has never written an article about porn stars?

431
00:41:47,000 --> 00:41:52,000
And by the way, I once looked at Wikipedia categories and the result is that there are

432
00:41:52,000 --> 00:41:59,000
23,000 articles for scientists and 512 for porn actresses.

433
00:41:59,000 --> 00:42:07,000
So is Ian Ramjohn's article pure polemic for the trash?

434
00:42:07,000 --> 00:42:11,000
No, the question of exoticism is legitimate.

435
00:42:11,000 --> 00:42:17,000
So whether pictures intentionally show the different, the strange.

436
00:42:17,000 --> 00:42:20,000
Because taking a photo is always an effort.

437
00:42:20,000 --> 00:42:23,000
Of course, this is much easier today than it was 100 years ago.

438
00:42:23,000 --> 00:42:27,000
But of course I tend to pull out my cell phone for the unusual.

439
00:42:27,000 --> 00:42:33,000
For example, when fresh snow has fallen in the garden and not when the snow is melting again.

440
00:42:33,000 --> 00:42:42,000
And in Wikipedia, in an article about a village, I show a special building, not an ordinary one.

441
00:42:43,000 --> 00:42:47,000
An image from the article about Ghana.

442
00:42:47,000 --> 00:42:50,000
The man works a traditional loom.

443
00:42:50,000 --> 00:42:52,000
Now is that bad?

444
00:42:52,000 --> 00:42:56,000
Would you rather show someone working on a modern machine?

445
00:42:56,000 --> 00:43:01,000
What is a typical meal in the Netherlands? The famous herring?

446
00:43:01,000 --> 00:43:05,000
Perhaps. McDonald's, on the other hand, may not be typical of the country,

447
00:43:05,000 --> 00:43:10,000
but it is much more representative of what is really eaten.

448
00:43:11,000 --> 00:43:18,000
It is by no means unimportant who looks at a picture and what prior knowledge they have in order to classify a picture.

449
00:43:18,000 --> 00:43:21,000
For example, traditional culture.

450
00:43:21,000 --> 00:43:27,000
Many of us here have been to Austria before; those present [at WikiCon] certainly have.

451
00:43:27,000 --> 00:43:32,000
And you know that Austrians don't all walk around like that all the time.

452
00:43:32,000 --> 00:43:36,000
But what about the picture from Kenya? What do I see there?

453
00:43:36,000 --> 00:43:40,000
Everyday life, tradition, re-enactment?

454
00:43:40,000 --> 00:43:44,000
So reenactment is a tradition that has been interrupted.

455
00:43:44,000 --> 00:43:48,000
Well, I don't know much about Kenya and maybe my readers don't either.

456
00:43:48,000 --> 00:43:53,000
And that's why it's so important that we have good image explanations on Commons.

457
00:43:53,000 --> 00:43:57,000
Yes, and unfortunately the description of the image on Commons was very brief.

458
00:43:57,000 --> 00:44:02,000
Maybe because it was obvious to the photographer what was there.

459
00:44:02,000 --> 00:44:07,000
Strictly speaking, you would have to say to yourself: If I don't understand exactly what I'm seeing,

460
00:44:07,000 --> 00:44:11,000
then I shouldn't actually use the image. A pity.

461
00:44:11,000 --> 00:44:17,000
Well, now that's a single image. But what I think is even more important is the overall impression that is created.

462
00:44:17,000 --> 00:44:21,000
In an article about a country you often see images like this.

463
00:44:21,000 --> 00:44:23,000
And especially when it comes to poor countries.

464
00:44:23,000 --> 00:44:29,000
Coffee is planted, raw materials are mined and you can see beautiful sandy beaches.

465
00:44:29,000 --> 00:44:35,000
Okay, that's there. And exports and tourism are really important sectors of the economy.

466
00:44:35,000 --> 00:44:38,000
Why shouldn't you show that?

467
00:44:38,000 --> 00:44:42,000
Well, I'm afraid that might give off a certain impression.

468
00:44:42,000 --> 00:44:48,000
The entire country seems to exist only for us people in rich countries.

469
00:44:48,000 --> 00:44:54,000
That's where we go on vacation and that's where the beautiful products come from for us.

470
00:44:55,000 --> 00:44:58,000
So in the Klexikon I tried to avoid such an impression.

471
00:44:58,000 --> 00:45:03,000
Here is the article about Barbados. Beautiful beaches, great.

472
00:45:03,000 --> 00:45:09,000
But then I also added an Independence Day celebration or the Parliament building.

473
00:45:12,000 --> 00:45:16,000
Where do the images that we find on Wikimedia Commons actually come from?

474
00:45:16,000 --> 00:45:21,000
It doesn't matter whether the photos show a rich country or a poor country.

475
00:45:21,000 --> 00:45:26,000
The photographer is usually a man from the rich north. Conversely, this is much rarer.

476
00:45:26,000 --> 00:45:31,000
I regret that and I think it would be good if people from poor countries were encouraged

477
00:45:31,000 --> 00:45:35,000
so that they could take photos in rich countries.

478
00:45:35,000 --> 00:45:39,000
But please let us be very clear on two points.

479
00:45:39,000 --> 00:45:44,000
Firstly, what concerns the photographer and secondly, what concerns the subject.

480
00:45:45,000 --> 00:45:49,000
This is Diego Delso from Spain.

481
00:45:49,000 --> 00:45:54,000
He has taken and uploaded numerous high-quality photos in many countries.

482
00:45:54,000 --> 00:46:00,000
Ha, a rich, white, privileged man from the global north.

483
00:46:00,000 --> 00:46:08,000
His camera gaze reproduces colonial stereotypes and thus he manipulates the stupid Wikipedia readers.

484
00:46:08,000 --> 00:46:10,000
Is that so?

485
00:46:11,000 --> 00:46:15,000
No, Diego Delso is not the problem.

486
00:46:15,000 --> 00:46:19,000
He does everything on a voluntary basis and could save himself the trouble with Commons.

487
00:46:19,000 --> 00:46:25,000
His pictures are not a problem, they are part of the solution. They make illustration easier.

488
00:46:25,000 --> 00:46:28,000
What would be the alternative?

489
00:46:28,000 --> 00:46:32,000
On Commons these are often images of the national government,

490
00:46:32,000 --> 00:46:36,000
the American Navy or the Russian embassy.

491
00:46:36,000 --> 00:46:40,000
I don't want to claim that such photos are automatically unacceptable,

492
00:46:40,000 --> 00:46:45,000
but they were created with a specific intention.

493
00:46:45,000 --> 00:46:50,000
That's exactly why I'm such a big fan of Wiki Loves Africa.

494
00:46:50,000 --> 00:46:54,000
With the many images from the past few years, we have

495
00:46:54,000 --> 00:46:57,000
really taken a giant step forward as a global movement.

496
00:46:57,000 --> 00:47:02,000
And most of the photos come from people from the countries in question.

497
00:47:02,000 --> 00:47:07,000
As for the content, let's be clear about the following.

498
00:47:07,000 --> 00:47:12,000
As a historian, I am aware that one should question

499
00:47:12,000 --> 00:47:17,000
who took a photo, why, what is the cultural and social context, etc.

500
00:47:17,000 --> 00:47:19,000
Yes.

501
00:47:19,000 --> 00:47:28,000
But you also have to say very clearly that this is simply a picture of a mill.

502
00:47:28,000 --> 00:47:32,000
Does it really matter who took the photo?

503
00:47:32,000 --> 00:47:36,000
Is he a native of the town or just passing through?

504
00:47:36,000 --> 00:47:39,000
Whether the photo was taken by a woman or a man?

505
00:47:39,000 --> 00:47:43,000
Would it really look completely different in each case?

506
00:47:43,000 --> 00:47:46,000
Critical questions can be important, but one can doubt

507
00:47:46,000 --> 00:47:54,000
whether a person's background is extremely relevant at all times and everywhere.

508
00:47:54,000 --> 00:48:01,000
Here we see the Dutch Wikipedia, the article on Sudan, female circumcision.

509
00:48:01,000 --> 00:48:04,000
Yes, what is the real problem here?

510
00:48:04,000 --> 00:48:08,000
Of course, we wonder, did the person photographed give their consent

511
00:48:08,000 --> 00:48:10,000
for the image to be uploaded to Commons?

512
00:48:10,000 --> 00:48:15,000
And most importantly, would she actually like to see this image being used?

513
00:48:15,000 --> 00:48:21,000
Such a cruel topic, and she looks happily into the camera?

514
00:48:21,000 --> 00:48:27,000
This is about personal rights and this use would be a problem anywhere in the world.

515
00:48:27,000 --> 00:48:31,000
Does this have anything to do with poor and rich countries?

516
00:48:33,000 --> 00:48:35,000
Well, I admit, maybe yes.

517
00:48:35,000 --> 00:48:40,000
It could be that some photographers from rich countries do not dare

518
00:48:40,000 --> 00:48:43,000
to violate personal rights in their own country.

519
00:48:43,000 --> 00:48:47,000
Because someone they photograph could come after them with a lawyer.

520
00:48:47,000 --> 00:48:51,000
And if those affected live in the global south, then you might be a little more lax

521
00:48:51,000 --> 00:48:54,000
because they can hardly afford a lawyer.

522
00:48:54,000 --> 00:48:58,000
I emphasize that this is just a hypothesis of mine, I don't want to assume anything,

523
00:48:58,000 --> 00:49:02,000
but anyway, the problem of personal rights,

524
00:49:02,000 --> 00:49:06,000
that is, street photography and that whole complex,

525
00:49:06,000 --> 00:49:09,000
so the problem of personal rights is a general one,

526
00:49:09,000 --> 00:49:15,000
but it could be that the problem of rich and poor comes on top of it.

527
00:49:16,000 --> 00:49:21,000
The Wikimedia movement sees itself as an educational movement.

528
00:49:21,000 --> 00:49:23,000
Let's use the word education for this.

529
00:49:23,000 --> 00:49:28,000
The aim of education is to help people better understand themselves,

530
00:49:28,000 --> 00:49:31,000
but also their environment and of course their fellow human beings.

531
00:49:31,000 --> 00:49:34,000
Self-realization in social responsibility.

532
00:49:34,000 --> 00:49:38,000
And that fits well with the traditional ideal of the encyclopedia.

533
00:49:38,000 --> 00:49:41,000
It doesn't just collect any bits of facts,

534
00:49:41,000 --> 00:49:45,000
but rather knowledge that helps the reader orient themselves in their world.

535
00:49:45,000 --> 00:49:51,000
Yes, and we often see a tension.

536
00:49:51,000 --> 00:49:56,000
First, we are all human beings with universal human rights.

537
00:49:56,000 --> 00:50:00,000
Secondly, people can see themselves,

538
00:50:00,000 --> 00:50:03,000
or be seen, as members of groups, i.e. collectives.

539
00:50:03,000 --> 00:50:07,000
And thirdly, we want to see the individual as an individual.

540
00:50:08,000 --> 00:50:11,000
And

541
00:50:11,000 --> 00:50:13,000
opinions differ as to what a good balance between these areas looks like.

542
00:50:13,000 --> 00:50:17,000
And a person's opinion depends, among other things,

543
00:50:17,000 --> 00:50:22,000
on their personal situation or the global status of their group.

544
00:50:24,000 --> 00:50:28,000
And this tension, well, we also see that in the Wikimedia movement.

545
00:50:28,000 --> 00:50:30,000
And hence my theses.

546
00:50:30,000 --> 00:50:34,000
As we have seen, there is still much to improve at Wikimedia Commons.

547
00:50:34,000 --> 00:50:36,000
Keyword open letter.

548
00:50:36,000 --> 00:50:38,000
And on Wikipedia.

549
00:50:38,000 --> 00:50:44,000
And yes, where someone comes from and which collectives they belong to, yes, that is important.

550
00:50:44,000 --> 00:50:48,000
And it can influence how they photograph.

551
00:50:48,000 --> 00:50:52,000
But that doesn't mean that such an influence automatically always exists.

552
00:50:52,000 --> 00:50:55,000
Anyone who only pays attention to a person's background

553
00:50:55,000 --> 00:51:00,000
and judges or condemns their opinions or photos accordingly

554
00:51:00,000 --> 00:51:03,000
is arguing ad hominem.

555
00:51:03,000 --> 00:51:08,000
But an individual has the right to be perceived as such.

556
00:51:08,000 --> 00:51:11,000
And yes, when people stand up for a collective,

557
00:51:11,000 --> 00:51:15,000
be it women or people in poor countries, then that's good.

558
00:51:15,000 --> 00:51:18,000
However, they can overstep the mark

559
00:51:18,000 --> 00:51:21,000
by demeaning other collectives.

560
00:51:21,000 --> 00:51:25,000
And that would mean throwing the baby out with the bathwater.

561
00:51:25,000 --> 00:51:27,000
Thank you.

562
00:51:34,000 --> 00:51:39,000
Yes, thank you very much, dear Ziko, for the exciting presentation.

563
00:51:39,000 --> 00:51:42,000
Exactly, now we can both be seen in the hall again.

564
00:51:42,000 --> 00:51:47,000
And in my reply to your presentation I would like

565
00:51:47,000 --> 00:51:51,000
to address your theses, which you just showed again.

566
00:51:51,000 --> 00:51:56,000
The first was: yes, the Wikimedia movement needs to deal better with images.

567
00:51:56,000 --> 00:51:58,000
I absolutely agree with that.

568
00:51:58,000 --> 00:52:01,000
I think we almost speak with one voice.

569
00:52:01,000 --> 00:52:05,000
And that's why I don't want to be too critical of this thesis,

570
00:52:05,000 --> 00:52:08,000
but would actually like to support it even more.

571
00:52:08,000 --> 00:52:12,000
I think what you have now shown very well with your lecture is

572
00:52:12,000 --> 00:52:19,000
that the illustration of Wikipedia articles involves many aspects

573
00:52:19,000 --> 00:52:24,000
that relate to the legal dimension surrounding images,

574
00:52:24,000 --> 00:52:29,000
to an ethical-moral and also to a cultural dimension.

575
00:52:29,000 --> 00:52:34,000
In my opinion, the illustrations of Wikipedia articles should

576
00:52:34,000 --> 00:52:37,000
be examined much, much more closely.

577
00:52:37,000 --> 00:52:42,000
And you've already mentioned at one point or another

578
00:52:42,000 --> 00:52:47,000
that perhaps the image inventories at Wikimedia are to some extent the problem.

579
00:52:47,000 --> 00:52:53,000
I think you said at one point that the image explanations on Wikimedia are relatively brief.

580
00:52:53,000 --> 00:52:57,000
From my point of view, this is a very central point that I would like to take up again here.

581
00:52:57,000 --> 00:53:01,000
I have the impression, but let the discussion prove me wrong,

582
00:53:01,000 --> 00:53:09,000
that Wikimedia needs to provide a lot more context for the images.

583
00:53:09,000 --> 00:53:13,000
So there really needs to be more precise image descriptions

584
00:53:13,000 --> 00:53:18,000
so that those who then use the images in Wikipedia articles

585
00:53:18,000 --> 00:53:24,000
to illustrate them can be even more precise about

586
00:53:24,000 --> 00:53:26,000
where the image actually comes from.

587
00:53:26,000 --> 00:53:30,000
I think that this plays a role in many topics, but in some it doesn't.

588
00:53:30,000 --> 00:53:36,000
So of course, Ziko, I also think that with a mill it's certainly not that problematic

589
00:53:36,000 --> 00:53:37,000
who took the picture.

590
00:53:37,000 --> 00:53:42,000
But there are very, very many topics, and you have also shown good examples of them,

591
00:53:42,000 --> 00:53:47,000
where you have to think very carefully about which image inventories are taken up.

592
00:53:47,000 --> 00:53:51,000
And in my opinion, images of the global south play a big role.

593
00:53:52,000 --> 00:53:58,000
And you also named a photographer here who, in your opinion,

594
00:53:58,000 --> 00:54:01,000
made a major contribution to illustrating Wikipedia.

595
00:54:01,000 --> 00:54:06,000
From my point of view, it also matters that you don't just send the photographers off on their own,

596
00:54:06,000 --> 00:54:08,000
even if they are already making a great contribution,

597
00:54:08,000 --> 00:54:14,000
but that you perhaps start training photographers on

598
00:54:14,000 --> 00:54:18,000
what they can achieve with their photos.

599
00:54:18,000 --> 00:54:23,000
Namely, not this naive view that they simply depict the world

600
00:54:23,000 --> 00:54:31,000
and illustrate Wikipedia and lighten up texts, but that they also

601
00:54:31,000 --> 00:54:38,000
convey certain views of the world with their pictures, especially when it comes to certain topics

602
00:54:38,000 --> 00:54:41,000
such as the discussion of the global south.

603
00:54:41,000 --> 00:54:47,000
So I do think that a certain perspective, a Eurocentric perspective,

604
00:54:47,000 --> 00:54:53,000
plays a role in Wikipedia. So my appeal is that there must definitely continue to

605
00:54:53,000 --> 00:54:59,000
be a discussion about image inventories in Wikipedia, much more intensively than before.

606
00:54:59,000 --> 00:55:05,000
You should also develop a kind of grid or guideline about which aspects play a role.

607
00:55:05,000 --> 00:55:10,000
As I said, I think that the cultural background of a photographer

608
00:55:10,000 --> 00:55:16,000
plays a big role. You should also explain legal aspects,

609
00:55:16,000 --> 00:55:21,000
as you said or showed with the picture of a woman who may not have been informed at all

610
00:55:21,000 --> 00:55:26,000
that, in the context of street photography, it would suddenly end up in Wikipedia.

611
00:55:26,000 --> 00:55:34,000
In my opinion, all of this should be made available to

612
00:55:34,000 --> 00:55:41,000
the many photographers who perhaps take professional, semi-professional or even private pictures in order to

613
00:55:41,000 --> 00:55:46,000
improve the quality of the image inventories in Wikimedia and thus also in Wikipedia.

614
00:55:46,000 --> 00:55:51,000
So that would be my big appeal, now also after listening to your presentation,

615
00:55:51,000 --> 00:55:58,000
more discussions about the different dimensions, aspects of images that are relevant, as I said legal,

616
00:55:58,000 --> 00:56:04,000
ethical, moral, cultural, and even more training. I think there will be some ideas on

617
00:56:04,000 --> 00:56:12,000
how to train people on sensitive topics such as the global south, or on groups that are fundamentally underrepresented in Wikipedia,

618
00:56:12,000 --> 00:56:20,000
such as women, about what the important aspects are when producing new images.

619
00:56:20,000 --> 00:56:28,000
And then also, as I said, the aspect that, in my opinion, many images in Wikimedia are decontextualized

620
00:56:29,000 --> 00:56:35,000
and, as I said earlier in my lecture, they have great potential for meaning

621
00:56:35,000 --> 00:56:41,000
and, in my view, they are partially integrated into Wikipedia in a dysfunctional way.

622
00:56:41,000 --> 00:56:46,000
This should be avoided if possible, and awareness should be created here, as I said,

623
00:56:46,000 --> 00:56:51,000
that images not only lighten things up, but also have a decisive influence on

624
00:56:51,000 --> 00:56:57,000
how the texts are perceived and how meaning is reconstructed.

625
00:56:57,000 --> 00:57:01,000
Exactly, so I'll hand it back to you, Ziko, if you want.

626
00:57:01,000 --> 00:57:07,000
Yes, thank you very much for this classification, it is really very important.

627
00:57:07,000 --> 00:57:12,000
For the Klexikon, I thought that it should be suitable for children

628
00:57:12,000 --> 00:57:18,000
and then I came up with criteria: I go to Wikimedia Commons

629
00:57:18,000 --> 00:57:24,000
and try to choose good pictures, and I want to make a longer piece about it for my YouTube channel.

630
00:57:24,000 --> 00:57:28,000
You notice with Wikimedia Commons that there is a kind of automatic bias,

631
00:57:28,000 --> 00:57:33,000
so when I look for a photo of a doctor, it is usually a man who has a light skin color

632
00:57:33,000 --> 00:57:38,000
and you have to invest extra time if you want to have some diversity.

633
00:57:38,000 --> 00:57:43,000
My keyword for this is actually the quality of the encyclopedia.

634
00:57:43,000 --> 00:57:50,000
There are these woke and anti-woke discussions; I can't do much with them.

635
00:57:50,000 --> 00:57:56,000
I want to make a good, high-quality encyclopedia, and if I now have an article

636
00:57:56,000 --> 00:58:04,000
about noses, eyes, arms, feet, where you only ever see light-skinned people, then the encyclopedia would not be of high quality.

637
00:58:04,000 --> 00:58:13,000
That's kind of my approach, and it's really not easy to find images of feet by skin color.

638
00:58:13,000 --> 00:58:18,000
That's really incredible and I don't know whether it's the indexing or other things,

639
00:58:18,000 --> 00:58:22,000
we have so many problems at Wikimedia Commons, and the question is how we can improve that.

640
00:58:22,000 --> 00:58:27,000
There's a kind of automatic bias, and if, for example, in the article on chimney sweeps,

641
00:58:27,000 --> 00:58:33,000
I happen to find a great picture of a woman who is currently learning to be a chimney sweep,

642
00:58:33,000 --> 00:58:39,000
then I prefer to take that, even if there may be other, better pictures of men,

643
00:58:39,000 --> 00:58:44,000
but then I take that and then I know that in other articles I don't have the luxury,

644
00:58:44,000 --> 00:58:47,000
I don't have that much choice, but that's how I try to counteract that

645
00:58:47,000 --> 00:58:54,000
and you notice what kind of thoughts you have to keep in your head. Yes, and it just takes an awful lot of time.

646
00:58:54,000 --> 00:59:00,000
When I pay attention to things like that, the illustration takes maybe twice or three times as much time as usual,

647
00:59:00,000 --> 00:59:05,000
but somehow I still have the pride that I say to myself, great now, it's worth it,

648
00:59:05,000 --> 00:59:10,000
even if the result only looks okay and not great.

649
00:59:11,000 --> 00:59:18,000
Take a photo, upload it, index it, find it, select it, caption it, image-image relationship, image-text relationship

650
00:59:18,000 --> 00:59:23,000
and what's in the reader's head; the exotic is also somehow what readers then have in their heads,

651
00:59:23,000 --> 00:59:28,000
how they classify things, keyword Austria and Kenya: with the Austrians we then assume,

652
00:59:28,000 --> 00:59:33,000
well, they don't always walk around in traditional costumes, that's a thing.

653
00:59:33,000 --> 00:59:37,000
I sometimes have the impression, I don't know, that with excellent articles,

654
00:59:37,000 --> 00:59:43,000
the authors have usually paid attention to good illustrations, but otherwise, I have the impression that

655
00:59:43,000 --> 00:59:48,000
perhaps illustration is not so prestigious or something. There is the initiative

656
00:59:48,000 --> 00:59:57,000
Wikipedia Pages Wanting Photos, WPWP, that Wikipedia articles should be illustrated, but that hasn't really gotten off the ground.

657
00:59:57,000 --> 01:00:03,000
One more thing: you described the legal and ethical-moral aspects. Yes, when it comes to legal matters,

658
01:00:03,000 --> 01:00:08,000
I think we have to be very good at that and many Wikipedians are already aware of that

659
01:00:08,000 --> 01:00:13,000
and they know that many of the photos we get there actually shouldn't be on Wikimedia Commons at all,

660
01:00:13,000 --> 01:00:18,000
and then it's always left to the subsequent users to think about personal rights.

661
01:00:18,000 --> 01:00:22,000
But yes, when it comes to ethical and moral aspects, I think we should think about an image culture,

662
01:00:22,000 --> 01:00:30,000
how we want to be as a community that deals with images, and then we can be more papal than the Pope,

663
01:00:30,000 --> 01:00:38,000
so our ethical guidelines can be a lot stricter than the legal ones, why not? Thank you very much.

664
01:00:42,000 --> 01:00:49,000
Thank you Ziko, thank you Eva. We now have a good half hour left for the discussion,

665
01:00:49,000 --> 01:00:53,000
that is, if there are any requests to speak here in the room.

666
01:00:54,000 --> 01:01:00,000
Ah, okay, so if you want, you can come forward so they can see you too.

667
01:01:05,000 --> 01:01:11,000
Thanks, I'm asking anonymously. I just wanted to make a few comments from the practice of illustration,

668
01:01:11,000 --> 01:01:16,000
because I also write long texts and try to illustrate them as intensively as possible.

669
01:01:16,000 --> 01:01:21,000
And starting from Ziko, from the picture you showed, where you had three pictures,

670
01:01:21,000 --> 01:01:28,000
strawberries in Ecuador, then the sulfur mining in Indonesia and this beach in Jamaica, I think it was

671
01:01:28,000 --> 01:01:36,000
where you said that shows either clichés or exoticism. What struck me about all the pictures you showed

672
01:01:36,000 --> 01:01:43,000
is that you never actually said what captions were underneath them,

673
01:01:43,000 --> 01:01:49,000
which makes you wonder whether some things that seem problematic at first glance

674
01:01:49,000 --> 01:01:54,000
can actually be addressed via the caption, or whether that has in fact been done.

675
01:01:54,000 --> 01:02:00,000
And the second thing is that it is not only important that an image is at the top right in articles,

676
01:02:00,000 --> 01:02:11,000
but also that images are assigned to sections. And for example, I also know why this sulfur mining in Indonesia

677
01:02:11,000 --> 01:02:20,000
is considered exotic, but when it comes to the illustration of the section about Indonesian exports,

678
01:02:20,000 --> 01:02:29,000
this picture is simply very good. As I said, and I also think this is a note to the speaker,

679
01:02:29,000 --> 01:02:37,000
the positioning of the image in the article is really an important thing, and also, for example, that you have the courage,

680
01:02:37,000 --> 01:02:44,000
that you are brave enough, to create an article without an introductory image if you don't have a suitable one,

681
01:02:44,000 --> 01:02:54,000
but rather just illustrate it in the article body. Last note: although we don't have a feet-by-skin-color category,

682
01:02:54,000 --> 01:03:01,000
I think you can still work with photo galleries much more than is generally done.

683
01:03:01,000 --> 01:03:09,000
My experience is that if you

684
01:03:09,000 --> 01:03:15,000
create a gallery of four pictures on a topic that is a little bit suspected of being exotic, such as impressive mountains, agriculture, rice fields, terraces, whatever, then

685
01:03:15,000 --> 01:03:23,000
the pictures take away a bit of this impressiveness from each other,

686
01:03:23,000 --> 01:03:29,000
or take away from this exoticism, which I actually think is a very good effect, because then you see, okay,

687
01:03:29,000 --> 01:03:35,000
that's striking, but also a bit normal; or three of the pictures just don't have such an exoticism,

688
01:03:35,000 --> 01:03:44,000
while only one picture shows it. And if you then have the courage to say, yes, I'll just go beyond the one picture,

689
01:03:44,000 --> 01:03:51,000
which of course also requires that you are reasonably capable on Commons of finding such pictures at all,

690
01:03:51,000 --> 01:04:01,000
but it's possible, and that's why I'm a bit unsure, Ziko, what you said at the end,

691
01:04:01,000 --> 01:04:10,000
that the illustrations on Wikipedia are progressing so slowly, could also be a very good thing, namely that

692
01:04:10,000 --> 01:04:17,000
it is better not to put a picture in there than a bad picture, even from a content point of view.

693
01:04:17,000 --> 01:04:21,000
I'm unsure what that looks like in reality.

694
01:04:24,000 --> 01:04:33,000
Yes, how do we do that, Eva? Mono-Ett, shall we collect questions first? Maybe that's better, yes? Or should I react to it?

695
01:04:33,000 --> 01:04:41,000
So I think that was a very differentiated contribution. I think we should perhaps get into it straight away, Ziko,

696
01:04:41,000 --> 01:04:47,000
and you're welcome to start, because there were a few points, otherwise it might just be too much.

697
01:04:47,000 --> 01:04:55,000
Wonderful points. I just wanted to clear up a misunderstanding about the coffee and sulfur and sandy beaches:

698
01:04:55,000 --> 01:05:01,000
it wasn't so much about exoticism for me, and that impression can always arise, but it was about

699
01:05:01,000 --> 01:05:07,000
whether the overall impression arises that this country now only has sandy beaches or raw materials to offer;

700
01:05:07,000 --> 01:05:17,000
then I would like to see more pictures from everyday life, street life. Or, somewhere in a Klexikon article, I have a trade union meeting.

701
01:05:17,000 --> 01:05:25,000
People are sitting around a table at a trade union in Angola, I think it was, and I would like to show something like that,

702
01:05:25,000 --> 01:05:32,000
or a parliament building and then not just the holiday impressions. I think galleries are great. I have the impression that

703
01:05:32,000 --> 01:05:40,000
there are articles here, like the ones I looked at on Ghana or Burundi, that were essentially written around 2009,

704
01:05:40,000 --> 01:05:51,000
and essentially illustrated during that time, and in the meantime so much good material has been added through Wiki Loves Africa and other initiatives,

705
01:05:51,000 --> 01:05:59,000
so you can do more. But I actually find such reticence - rather no picture than an inappropriate picture - very sensible.

706
01:06:00,000 --> 01:06:07,000
Yes, then I would perhaps jump right in, so I think it's very sensible to say that it's better not to have a picture than to have a picture that

707
01:06:07,000 --> 01:06:17,000
is inappropriate or that could definitely be problematic. But of course I also see, now when I take the perspective of my students, for example,

708
01:06:17,000 --> 01:06:27,000
who use Wikipedia very, very frequently, that as a Wikipedia author you might want to keep up with other platforms,

709
01:06:27,000 --> 01:06:36,000
which, as I said, are very image-heavy. So here is the question, how does Wikipedia position itself in relation to other knowledge resources on the Internet,

710
01:06:36,000 --> 01:06:45,000
and I would also like to consider the following: I also think the idea of the image galleries is good, and I also have to keep in mind

711
01:06:45,000 --> 01:06:56,000
that the image constellation is important to consider, that images can also influence each other in their meaning.

712
01:06:56,000 --> 01:07:07,000
What I always ask myself is that many young people, but also of course in all age groups, now access Wikipedia via their smartphone,

713
01:07:07,000 --> 01:07:16,000
which means that when illustrating articles, you may also have to take into account how the image galleries appear on the smartphone.

714
01:07:16,000 --> 01:07:27,000
I think many Wikipedia authors still edit Wikipedia via the PC, the laptop that is standing somewhere and have the classic view,

715
01:07:27,000 --> 01:07:35,000
but the question is always how do text-image relations and image-image constellations appear via the smartphone,

716
01:07:35,000 --> 01:07:43,000
i.e. here the specific Internet users who access a Wikipedia article may have a completely different view

717
01:07:43,000 --> 01:07:56,000
than you as the author have taken into account. So you have to consider what the actual reception situation is like, via laptop or smartphone.

718
01:07:56,000 --> 01:08:06,000
And I think there has been little research on this so far, which is definitely an exciting point, also considering that the end devices used to access Wikipedia have

719
01:08:07,000 --> 01:08:14,000
changed, and that perhaps you as an author have to anticipate what the reception situation is like, but I know that's asking a lot,

720
01:08:14,000 --> 01:08:24,000
especially since many authors work on Wikipedia on a voluntary basis, and taking so many aspects into account is of course asking a lot.

721
01:08:24,000 --> 01:08:30,000
Thank you, I think we could now move on to the next question. Mono-ett.

722
01:08:30,000 --> 01:08:34,000
Thanks, Raimund here. I don't even know if you see us here in the hall.

723
01:08:37,000 --> 01:08:38,000
No.

724
01:08:38,000 --> 01:08:39,000
Then we move forward.

725
01:08:39,000 --> 01:08:48,000
Hi, now you see me. I would like to provide some context on the topic of Elisabeth Böhm.

726
01:08:48,000 --> 01:08:54,000
I was immediately triggered because my wife wrote the article.

727
01:08:54,000 --> 01:09:01,000
And she and I have been working on the subject of the Böhm family, i.e. the Böhm family of architects, for a very, very long time.

728
01:09:01,000 --> 01:09:06,000
I know that it was very difficult for her to write the article,

729
01:09:06,000 --> 01:09:11,000
she worked on it for a very, very long time,

730
01:09:11,000 --> 01:09:17,000
because Elisabeth Böhm,

731
01:09:17,000 --> 01:09:23,000
yes, we think she is relevant, but, like many mothers, she was just a mother.

732
01:09:23,000 --> 01:09:33,000
That's just how it is. Think about why there wasn't a photo of her in the article, not a single photo.

733
01:09:33,000 --> 01:09:38,000
I see the deficiency too, but it would actually have been better to have no photo than this photo,

734
01:09:38,000 --> 01:09:43,280
the photo that you, Eva, criticized,

735
01:09:43,280 --> 01:09:49,280
because the photo was taken in 2009

736
01:09:49,280 --> 01:09:53,960
by, I don't know, I just looked, and I don't know the user.

737
01:09:53,960 --> 01:10:00,600
It's terrible, it's blurry, it's small. At some point

738
01:10:00,600 --> 01:10:09,720
in 2017, someone cut the man out and cropped the picture to Ms. Böhm. It hurts to see the picture.

739
01:10:09,720 --> 01:10:16,120
So that's really, really bad. She died in 2012 and we

740
01:10:16,120 --> 01:10:21,000
saw the family from time to time, but mostly only Gottfried Böhm and the brothers.

741
01:10:21,000 --> 01:10:28,200
His wife simply didn't appear anymore, including between 2009 and 2012.

742
01:10:28,200 --> 01:10:32,000
Well, not that I know of. There's an interesting film, I think

743
01:10:32,000 --> 01:10:37,520
it's about the Böhm family, about an hour long. So just a little bit of context. I

744
01:10:37,520 --> 01:10:43,240
can't say much about the text right now. My wife isn't here either, she's at home. But I think

745
01:10:43,240 --> 01:10:49,160
I sent her a few screenshots of the slides earlier. We'll see what she makes of it.

746
01:10:49,160 --> 01:10:56,800
Yes, well, because I also take a lot of photos, it's not always easy

747
01:10:56,800 --> 01:11:03,640
to find the right thing, take the right photo and do a little advertising for a

748
01:11:03,640 --> 01:11:08,440
SteePro session. I don't know whether it's today or tomorrow; he will complain about the fact

749
01:11:08,440 --> 01:11:14,080
that there is so much harping on small, ugly photos. I also find it an exciting topic.

750
01:11:14,080 --> 01:11:16,280
Yes, so much for that.

751
01:11:16,280 --> 01:11:23,920
Yes, thank you very much. Of course, for me this is also a very, very exciting contextualization

752
01:11:23,920 --> 01:11:29,440
of images. So that wasn't meant as a fundamental criticism of the article.

753
01:11:29,440 --> 01:11:35,080
I think it's great that a woman

754
01:11:35,080 --> 01:11:40,440
has stood her ground and taken on an article about a woman who perhaps only just meets the relevance criteria. So

755
01:11:40,440 --> 01:11:46,440
that's also a point that is being discussed in many initiatives surrounding Wikipedia. How do

756
01:11:46,440 --> 01:11:53,240
we manage to get more articles about women who also had family constellations,

757
01:11:53,240 --> 01:11:58,280
historically speaking, that did not allow them

758
01:11:58,280 --> 01:12:02,600
to be present in an architecture office or in professional roles. Thinking about that, I've

759
01:12:02,600 --> 01:12:07,880
already spoken intensively about it with Ziko. So no fundamental criticism of the article,

760
01:12:07,880 --> 01:12:13,320
but rather nice that something is there. But there is still a bit of room for improvement

761
01:12:13,320 --> 01:12:20,240
to design the article well so that nothing clashes. And then it is now an

762
01:12:20,240 --> 01:12:25,440
exciting point to talk about image quality again. So I see that there

763
01:12:25,440 --> 01:12:30,800
are different aspects here too. So on the one hand there is of course the technical aspect. I'm

764
01:12:30,800 --> 01:12:36,160
actually relatively open and forthcoming. So it doesn't bother me when a photo

765
01:12:36,160 --> 01:12:42,000
isn't of good quality from a technical point of view, but rather I argued earlier that perhaps it would

766
01:12:42,000 --> 01:12:47,160
be better to crop it. I don't know exactly what that involves, but I suspect that it means

767
01:12:47,160 --> 01:12:53,360
cutting out a person from a group photo in order to place the person in the center. Exactly,

768
01:12:53,360 --> 01:12:58,680
Ziko indicated it with a gesture. From my cultural linguistics and

769
01:12:58,680 --> 01:13:03,360
image studies perspective, I think this is a good option

770
01:13:03,360 --> 01:13:08,480
for cropping the images so that the person is in the center, so that certain constellations of people

771
01:13:08,480 --> 01:13:14,440
don't also get offered a stage. And in particular constellations of people that

772
01:13:14,440 --> 01:13:19,840
bring men to the fore because, in my opinion, they are already very present in many places.

773
01:13:19,840 --> 01:13:26,600
Exactly, to put it a bit polemically. So as I said, from a

774
01:13:26,600 --> 01:13:30,600
technical perspective I can understand that with the image qualities there may

775
01:13:30,600 --> 01:13:35,320
be reservations, but from a cultural studies, image studies

776
01:13:35,320 --> 01:13:42,680
perspective I absolutely think it's a good process, good digital image practice, to crop the image to the person

777
01:13:42,680 --> 01:13:47,640
who is being described; that is my perspective.

778
01:13:47,640 --> 01:13:52,600
That would also be an exciting point to negotiate. And I'm very excited

779
01:13:52,600 --> 01:13:55,640
about the session I just mentioned, which might also be about that. Maybe you can

780
01:13:55,640 --> 01:14:00,920
include this cultural studies perspective in the session too.

781
01:14:00,920 --> 01:14:06,480
We now have one more question from the audience here and then a few more questions from the chat.

782
01:14:12,920 --> 01:14:21,000
User Matthias B. from Schwetzingen. Hello Eva. You mentioned earlier

783
01:14:21,000 --> 01:14:29,600
the question, or the concern, of describing images better for their captions.

784
01:14:29,600 --> 01:14:34,840
Of course, this runs into a really big problem, and that is the mass of files that

785
01:14:34,840 --> 01:14:43,240
are on Wikimedia Commons. There are tens of millions of them now, so no one has the time to do it.

786
01:14:43,240 --> 01:14:50,680
So maybe we have to start by saying that images used on Wikipedia

787
01:14:50,680 --> 01:14:56,520
should be described in more detail. But this also goes into accessibility, with the alternative

788
01:14:56,520 --> 01:15:04,400
texts. It's a completely different story that the visually impaired cannot see the image.

789
01:15:04,400 --> 01:15:11,720
Or rather, it is even more difficult to assess. And another point that occurred to me,

790
01:15:11,720 --> 01:15:17,160
we had a discussion a few years ago, a very heated discussion in Wikipedia,

791
01:15:17,160 --> 01:15:23,440
about the illustration of the article Tights. At the time it was illustrated, if I'm correct,

792
01:15:23,440 --> 01:15:28,320
with a photo of an advertisement from the Austrian company Palmers, which is known for very,

793
01:15:28,320 --> 01:15:36,000
I would say, revealing billboards, which I think have now somewhat disappeared from the street scene

794
01:15:36,000 --> 01:15:44,040
here in Austria. And then they were deleted, other pictures came in, then there was a woman with

795
01:15:44,040 --> 01:15:48,760
opaque tights, that was then replaced again, then a ballet dancer was

796
01:15:48,760 --> 01:15:55,720
inserted, back and forth and back and forth. I honestly don't know what the situation is now,

797
01:15:56,120 --> 01:15:59,760
but the problem is not new; it has been known for several years.

798
01:16:03,240 --> 01:16:09,160
Yes, thank you very much, Matthias B., for all the great tips. So, of course, I see

799
01:16:09,160 --> 01:16:15,280
that this is a practical problem to subsequently contextualize images well,

800
01:16:15,280 --> 01:16:19,360
especially the tens of millions that are already in Wikimedia Commons.

801
01:16:19,840 --> 01:16:24,080
But perhaps that would be something that could be taken into account with a view to the future.

802
01:16:24,120 --> 01:16:31,320
So for the images that are newly added, there could be good rules and criteria governing the context in which

803
01:16:31,320 --> 01:16:36,760
these images are placed. I think it might also help with the specific illustration

804
01:16:36,760 --> 01:16:41,760
if there was a good context here. You would perhaps save time when working on

805
01:16:41,760 --> 01:16:46,640
illustrating an article if you had better access to the images.

806
01:16:47,120 --> 01:16:54,400
And as I said, what I'm now observing, especially against the background of AI, is that a large number

807
01:16:54,400 --> 01:16:59,640
of images are being posted in Wikimedia Commons, especially for women, that

808
01:16:59,640 --> 01:17:05,280
are not well contextualized at all and, in my view, have no business in a project like Wikimedia Commons,

809
01:17:05,280 --> 01:17:11,960
because they reproduce gender stereotypes on a mass scale. And that actually brings me

810
01:17:11,960 --> 01:17:17,520
to your second point, Matthias B. I actually know the case of the tights article.

811
01:17:17,520 --> 01:17:22,720
Of course, I have to ask of an online encyclopedia of the 21st century whether it

812
01:17:22,720 --> 01:17:28,640
really needs to reproduce revealing images from advertising in a Wikipedia article.

813
01:17:28,640 --> 01:17:33,680
I mean, it's clear to everyone that advertising wants to get attention, that it wants

814
01:17:33,680 --> 01:17:40,320
to bring a product to women and that perhaps certain ideals of beauty

815
01:17:40,360 --> 01:17:45,120
are also addressed. And I know that in this case there were also some kind of fetish images

816
01:17:45,120 --> 01:17:51,600
that also showed problematic images of women that, in my opinion,

817
01:17:51,600 --> 01:17:57,360
had no online encyclopedic relevance at all. So, this is a point that

818
01:17:57,360 --> 01:18:04,680
is really worth discussing. What do you really want to show in an online encyclopedia with such topics and

819
01:18:04,680 --> 01:18:09,440
articles like this? Is it necessary, as I said, to reproduce ideals of beauty or, even worse,

820
01:18:09,600 --> 01:18:16,680
fetish images, or images that can be read as fetish material, and

821
01:18:16,680 --> 01:18:22,440
include them in an online encyclopedia? In my opinion, that has no place there. Therefore,

822
01:18:22,440 --> 01:18:24,800
I can understand if this article has been modified accordingly.

823
01:18:29,080 --> 01:18:29,800
Yes, thank you.

824
01:18:29,800 --> 01:18:32,480
Yes, if I can add something... or I'll leave it.

825
01:18:34,000 --> 01:18:36,920
We also have questions from the chat and then there's someone else here at the front.

826
01:18:37,480 --> 01:18:39,520
Then it's better to ask your questions now, yes.

827
01:18:46,520 --> 01:18:50,640
Dear Ziko, I am the online angel from the session and we didn't have any questions, just

828
01:18:50,640 --> 01:18:55,000
comments, but I'll throw them out there anyway. So a comment that

829
01:18:55,000 --> 01:19:00,640
deals with the picture of Elisabeth Böhm again and actually says it is regrettable that the

830
01:19:01,120 --> 01:19:07,520
photo, which was already so poorly resolved, was cropped. And then two comments that are so thoughtful,

831
01:19:07,520 --> 01:19:13,760
moving away from illustrations to the larger topic of how you get women out of the shadow of their

832
01:19:13,760 --> 01:19:18,560
men in their Wikipedia articles, and this is a bit like the example of Mileva Marić,

833
01:19:18,560 --> 01:19:25,760
the Serbian physicist and mathematician, whom many know as Albert Einstein's wife,

834
01:19:25,760 --> 01:19:31,560
Albert Einstein's first wife. So the people of Zurich simply

835
01:19:31,560 --> 01:19:36,720
thought along a bit, so to speak. Yes, now here's another question from the audience.

836
01:19:39,840 --> 01:19:45,400
Hello, user Alfons here. I actually thought that we would talk a lot more

837
01:19:45,400 --> 01:19:52,280
here about the question of AI-generated images. You touched on it only very briefly, but as you

838
01:19:52,280 --> 01:19:57,240
may know or have followed on the discussion pages, when ChatGPT

839
01:19:57,240 --> 01:20:02,720
popped up everywhere in the spring, the question immediately arose: what about AI-generated

840
01:20:02,720 --> 01:20:11,040
images in Wikipedia? Not ones illustrating the article on AI, but where someone just

841
01:20:11,040 --> 01:20:17,720
thinks, I'll write a prompt, I'll let the AI work for me and put together some

842
01:20:17,720 --> 01:20:24,080
picture that I think will fill some gap in Wikipedia. And we

843
01:20:24,080 --> 01:20:29,400
had a discussion about it in this context, and in my opinion the outcome was relatively astonishing,

844
01:20:29,400 --> 01:20:36,600
gratifying and consistent, namely that such pictures, if they do not illustrate the AI article itself,

845
01:20:36,600 --> 01:20:44,960
fall under suspicion of TF, i.e. theory development or original research; you just cobble something together yourself, which has

846
01:20:44,960 --> 01:20:51,760
no claim to authenticity, and they are deleted from the articles. We even had that,

847
01:20:51,760 --> 01:20:57,840
there is still a small variation of it, for a while there were people who

848
01:20:57,840 --> 01:21:05,800
drew outlines of people for whom there were no pictures in Wikipedia. We had 30 or 40 pictures like that,

849
01:21:05,800 --> 01:21:11,560
I think especially articles about musicians, all of which

850
01:21:11,640 --> 01:21:15,280
were then deleted. Maybe we could discuss that again briefly. Thanks.

851
01:21:15,280 --> 01:21:24,160
So I'm very happy to take up this topic, even though I'm not an AI expert at all, but

852
01:21:24,160 --> 01:21:29,280
I'm approaching the topic out of my own interest, and I don't

853
01:21:29,280 --> 01:21:34,960
think there's been that much research yet, at least not on the phenomenon of ChatGPT and DALL-E, i.e. the chatbots and image generators that are currently

854
01:21:34,960 --> 01:21:40,600
being debated across society as a whole since last November. But I also see it as

855
01:21:40,680 --> 01:21:44,880
a big problem, the AI-generated images. So I see it as a

856
01:21:44,880 --> 01:21:52,200
double-edged sword. You can definitely see opportunities, but you have to contain them very, very well,

857
01:21:52,200 --> 01:21:57,880
so you have to think about very good rules about when images

858
01:21:57,880 --> 01:22:04,160
are still acceptable for Wikipedia. For example, when it comes to historical people, i.e. people

859
01:22:04,160 --> 01:22:09,240
who have already died and for whom there are Wikipedia articles, if there

860
01:22:09,240 --> 01:22:16,320
is image material that only contains historical constellations, such as the

861
01:22:16,320 --> 01:22:22,080
family photo of Marie Curie, on which her father sits so centrally, it would of course have a

862
01:22:22,080 --> 01:22:30,640
certain appeal to generate images based on such images in which Marie Curie

863
01:22:30,640 --> 01:22:36,720
can be seen in contemporary constellations. But that is of course very risky because it

864
01:22:36,720 --> 01:22:41,600
completely contradicts previous image cultures. I would say that we simply want

865
01:22:41,600 --> 01:22:47,000
images to have a certain authenticity and the

866
01:22:47,000 --> 01:22:52,280
images that were mentioned were the drawn outlines of people. I think there was also a wish to

867
01:22:52,280 --> 01:22:56,560
give people who have already died and can no longer be photographed a

868
01:22:56,560 --> 01:23:02,960
face, let's say on Wikipedia, because we have a very

869
01:23:02,960 --> 01:23:08,760
visual culture. But of course I also understand if you have concerns about

870
01:23:08,760 --> 01:23:14,920
integrating such images, as I said, given the idea of the authenticity of images. In my

871
01:23:14,920 --> 01:23:20,360
opinion, the dangers of AI-generated images clearly predominate at the moment. As I said,

872
01:23:20,360 --> 01:23:25,040
I have the impression at the moment that there is a flood of

873
01:23:25,040 --> 01:23:33,160
image material, especially when it comes to the representation of women, that builds on earlier material and

874
01:23:33,160 --> 01:23:37,880
even reinforces gender stereotypes that can already be found in earlier, authentic image material.

875
01:23:37,880 --> 01:23:43,880
And the prompts are chosen in such a way that the gender stereotyping is taken to

876
01:23:43,880 --> 01:23:48,960
a much greater extent and that this is image material in Wikimedia Commons

877
01:23:48,960 --> 01:23:54,200
that should under no circumstances find its way into Wikipedia. I can't

878
01:23:54,200 --> 01:24:00,800
imagine any contexts in which these images could be used. And that's why I see

879
01:24:00,800 --> 01:24:05,680
a great need here to rethink the rules at Wikimedia Commons. I

880
01:24:05,680 --> 01:24:10,640
don't think there's much regulation yet as to what can be posted on Wikimedia Commons. Probably in

881
01:24:10,640 --> 01:24:14,240
view of the fact that people were generally happy if someone had taken the trouble

882
01:24:14,240 --> 01:24:20,960
to post pictures here, but with the possibility of

883
01:24:20,960 --> 01:24:26,920
generating masses of pictures at the push of a button with certain prompts, from my point of view this floods the beautiful image inventory, the beautiful

884
01:24:26,920 --> 01:24:32,120
image material that was previously available in Wikimedia Commons. In my opinion, it will be completely

885
01:24:32,120 --> 01:24:36,840
overshadowed if regulations are not found here in a timely manner. And then,

886
01:24:36,840 --> 01:24:41,320
secondly, you would of course have to consider how to deal with these image materials from Wikimedia Commons in other wiki projects.

887
01:24:41,320 --> 01:24:47,960
This is my first, rather tentative

888
01:24:47,960 --> 01:24:53,640
attitude towards these images. Exactly, Ziko, you surely want to say something now. Oh, I'm very tentative about that too.

889
01:24:53,640 --> 01:24:59,840
Yes, thank you very much, Alfons, for bringing that up again. I have a video

890
01:24:59,840 --> 01:25:05,600
on my channel so I don't want to go into most aspects of it. Yes, it's always the

891
01:25:05,600 --> 01:25:11,920
question, what do I expect from a photo? Do I really take it as a source or as an illustration? And

892
01:25:11,920 --> 01:25:16,880
then we have just as many examples from the history of the encyclopedia where you also

893
01:25:16,880 --> 01:25:22,760
had pure illustrations of something: history painting. And yes, it is always difficult for me to

894
01:25:22,760 --> 01:25:29,360
say why AI in particular should not be allowed. But well, more on that elsewhere. As far as the images of women are concerned,

895
01:25:29,360 --> 01:25:34,080
they are very sexualized representations, what do I know, a fantasy

896
01:25:34,080 --> 01:25:39,320
warrior woman who is scantily clad or something like that. And I also ask myself what the need for that is,

897
01:25:39,320 --> 01:25:45,080
unless you want to show in an article about artificial intelligence that such images exist

898
01:25:45,080 --> 01:25:50,480
or something. Yes, I don't know if the rules need to be stricter. Then they are simply

899
01:25:50,480 --> 01:25:56,480
ignored. They are on Commons. I don't know to what extent they're a nuisance there. However, yes,

900
01:25:56,480 --> 01:26:02,920
to what extent does Wikimedia Commons now want to be a platform for what is ultimately

901
01:26:02,920 --> 01:26:09,680
a kind of fan art or amateur drawings or whatever you want to call it. So this

902
01:26:09,680 --> 01:26:14,440
may not be the appropriate platform for this. I can understand this displeasure.

903
01:26:14,440 --> 01:26:23,680
Yes, but in general, how this will develop in society, whether people

904
01:26:23,680 --> 01:26:30,600
will perhaps be much more relaxed in five years about such representations, Albert Einstein somehow in

905
01:26:30,600 --> 01:26:35,200
reconstructions with AI. It may be that people will find it completely okay.

906
01:26:35,200 --> 01:26:39,840
I have one more question from the audience and then we are slowly coming to the end.

907
01:26:44,440 --> 01:27:01,480
So my name is Andreas Werle. You all know that my hobby is Shakespeare and I,

908
01:27:01,480 --> 01:27:09,400
together with an English studies scholar, Wide Horizons,

909
01:27:09,400 --> 01:27:16,440
completely reworked the entire body of work, all of Shakespeare's dramas, and also exchanged all the pictures and added new ones. And that

910
01:27:16,440 --> 01:27:24,200
is problematic, because Shakespeare's work is full of misogynistic elements and

911
01:27:24,200 --> 01:27:32,440
colonial fantasies and so on. And I deliberately chose two examples of images

912
01:27:32,440 --> 01:27:40,040
that reproduce the colonial stereotype and a misogynistic element, namely in King Lear,

913
01:27:40,040 --> 01:27:49,040
a photo by Margaret Cameron, which shows Lear and then the two evil daughters

914
01:27:49,040 --> 01:27:54,000
who stand behind him, who are deceitful and then the good daughter, Cordelia,

915
01:27:54,000 --> 01:28:00,600
who bows to him a bit and who is supposed to be the humble daughter

916
01:28:00,680 --> 01:28:09,080
because she is the good daughter. In fact, the history of interpretation and

917
01:28:09,080 --> 01:28:15,560
the modern interpretation of the work and the work itself are completely contrary to this. So Cordelia

918
01:28:15,560 --> 01:28:22,600
is rebellious, contradicts Lear and then later appears as a military leader and tries

919
01:28:22,600 --> 01:28:28,600
to conquer England. And the two devious daughters, the evil ones, are the ones

920
01:28:28,600 --> 01:28:37,000
who are stubborn, who oppose the father, the father figure. And in the

921
01:28:37,000 --> 01:28:42,640
Cleopatra article I chose a very classic image, which

922
01:28:42,640 --> 01:28:48,760
represents Cleopatra as a femme fatale, which basically reflects the colonial context in an affirmative way,

923
01:28:48,760 --> 01:28:56,040
even though Cleopatra is one of the most fantastic characters that Shakespeare

924
01:28:56,040 --> 01:29:03,440
created, alongside Hamlet and Falstaff, and where everyone is fascinated by this strong and

925
01:29:03,440 --> 01:29:09,040
emotional and contradictory woman. And this contradiction, I find it interesting

926
01:29:09,040 --> 01:29:14,240
and challenging and I did this consciously, so it is completely clear to me that

927
01:29:14,240 --> 01:29:20,360
these images are contrary, firstly to the content of the works and secondly to the history of interpretation.

928
01:29:20,360 --> 01:29:31,280
Do you think that's good? What do you think about it? The caption in the Lear article is simply

929
01:29:31,280 --> 01:29:38,960
King Lear and his daughters by Margaret Cameron, and for Cleopatra it is just

930
01:29:38,960 --> 01:29:46,040
a name, painting by so-and-so. Thanks. Yes, totally exciting; Ziko, I would start

931
01:29:46,040 --> 01:29:51,680
if you like. The discussion of fictional texts in Wikipedia is of course

932
01:29:51,680 --> 01:29:57,680
highly relevant for teaching contexts, because students also

933
01:29:57,680 --> 01:30:03,080
access Wikipedia when they

934
01:30:03,080 --> 01:30:11,680
discuss fictional texts in class, and this area of tension arises from what we have now discussed about how images work. And

935
01:30:11,680 --> 01:30:16,600
I find it very exciting that you want to include the history of reception in the article,

936
01:30:16,600 --> 01:30:21,800
but I would also like the captions to perhaps also

937
01:30:21,800 --> 01:30:27,200
reveal the background against which these images were chosen. So I think it will

938
01:30:27,200 --> 01:30:33,320
make a lot of sense to reformulate the captions again and at least include to some extent

939
01:30:33,320 --> 01:30:37,760
why which images were selected here, with what background,

940
01:30:37,760 --> 01:30:46,240
so that it becomes transparent, at least for students. Whether students

941
01:30:46,240 --> 01:30:51,360
then access Wikipedia in such a thoughtful way remains to be seen, but at least for

942
01:30:51,360 --> 01:30:56,360
students it would perhaps become more transparent and perhaps also offer a different opportunity

943
01:30:56,680 --> 01:31:02,280
to think again about why certain image inventories are included here. So,

944
01:31:02,280 --> 01:31:08,040
as I said, please don't underestimate image captions, but rather disclose

945
01:31:08,040 --> 01:31:18,800
why which image inventories are included here. Yes, thank you Andreas. A very interesting example,

946
01:31:18,800 --> 01:31:22,640
I honestly hadn't even thought of something like that. Yes, I don't know whether

947
01:31:22,800 --> 01:31:27,360
an AI image might help, and then we'll discuss it together: yes, the daughter would have to

948
01:31:27,360 --> 01:31:32,760
be portrayed a little less humbly or something like that, who knows. Very interesting, thank you.

949
01:31:32,760 --> 01:31:40,800
Yes, I think that's where we've come to the end. Thank you Ziko and Eva

950
01:31:40,800 --> 01:31:47,280
for joining us digitally and thank you to the audience for joining in the discussion. Bye.