User talk:Jimbo Wales: Difference between revisions


Revision as of 13:53, 6 May 2010

Wikimedia Commons admins who wish to remove from the project all images that are of little or no educational value but which appeal solely to prurient interests have my full support. This includes immediate deletion of all pornographic images. We should keep educational images about sexuality - mere nudity is not pornography - but as with all our projects, editorial quality judgments must be made and will be made - appropriately and in good taste. I am stating here my public support for admins who are prepared to enforce quality standards and get rid of a large quantity of what can only be characterized as "trolling" images of people's personal pornography collections. I am fully willing to change the policies for adminship (including removing adminship in case of wheel warring on this issue). I think our existing policies here on commons are sufficient to deal with the problem - with the minor exception that many things should just be speedy deleted and argued about later. If you want to be technical about it, please consider this a policy change in that regard. Try to relax. Anything which is deleted can be restored if there's a good reason.--Jimbo Wales (talk) 01:52, 6 May 2010 (UTC)[reply]

RIP consensus. Roux (talk) 04:43, 6 May 2010 (UTC)[reply]
Wikimedia is not a democracy. It is a benevolent dictatorship headed by Jimbo Wales. - Stillwaterising (talk) 07:25, 6 May 2010 (UTC)[reply]
Sanity breaks out. Thank goodness for this. Commons contains vast swathes of garbage; penis pics, yet more masturbation pics, homemade pr0nz, etc, etc. What value does all this add to the projects? Little or nothing. Thanks for stating this clearly and plainly, Jimmy - Alison 04:53, 6 May 2010 (UTC)[reply]
But most of all, Commons contains lots and lots of propaganda by the invaders of Iraq. Commons welcomes the pictures of mass murderers, but whines about pictures made by consenting adults. Erik Warmelink (talk) 05:45, 6 May 2010 (UTC)[reply]
That dog don't hunt. It's a discredited argument. The existence of bad thing A does not justify the existence of bad thing B. It merely suggests more work to do. ++Lar: t/c 12:00, 6 May 2010 (UTC)[reply]
Incidentally, would you like to comment on this user's "contributions" and username? Killiondude (talk) 05:43, 6 May 2010 (UTC)[reply]
I think his username is juvenile and unfortunate, and tends to shed a negative light on his contributions. I did not review all of his contributions but at least some of what I looked at seemed fine, some quite talented in fact. (Maybe there's some particular bad stuff in there, as I say, I didn't look at it all.) I think he should change his username to something more dignified, but I'd consider that to be a matter of not very great importance.--Jimbo Wales (talk) 10:36, 6 May 2010 (UTC)[reply]
100 percent in agreement with Alison's comment. Thank you, Jimbo. --Túrelio (talk) 06:08, 6 May 2010 (UTC)[reply]
Bravo. This is consistent with the mission of the Wikimedia Foundation, which is about educational content not about being a free web host. JzG (talk) 06:50, 6 May 2010 (UTC)[reply]
Thank you Jimbo. This directive is an excellent first step in restoring Wikimedia's credibility. My hope is that all projects can implement standardized content filtering metatags so that they can once again be declared safe for use in primary schools. Commons:Sexual Content was two users' attempt to author such guidelines. Could you please review what has been done so far (see Commons:Sexual content/April 2010 for the latest draft proposal) and give some advice on how these guidelines can become global policy? - Stillwaterising (talk) 07:16, 6 May 2010 (UTC)[reply]
Jimbo, given this declaration, and in hopes of heading off some wheel wars, it would be very useful if you could provide more guidance, with examples, as to where you consider the line to fall. In the absence of examples, I have to say that your remark above does not give me, as an admin, much guidance. - Jmabel ! talk 07:48, 6 May 2010 (UTC)[reply]
In the interest of avoiding wheel wars, let me first say that we can proceed in a relaxed and friendly way. There will be some disagreements on some details, and I ask all admins to be extremely conservative about undeleting things that are even remotely borderline. Remember, there is no hurry to undelete things - nothing is permanently lost - and things mistakenly deleted in the cleanup can be undeleted in the fullness of time after a calm discussion.
I would say that images that would trigger 2257 record keeping requirements are the obvious starting point. I know there is some question as to whether the 2257 requirements apply to the Foundation (apparently not), but they may very well apply to the uploader. But that's not really the point. The point of 2257 in our context is that it does provide a reasonably well-understood and objective "line" beyond which we do not go.
I do not mean to imply that if an image doesn't cross the 2257 line, it's ok. A suitably tasteful image of nudity helpful in a medical/instructional context is not the issue. "Homemade pr0nz" as Alison put it, is just stupid and should go.--Jimbo Wales (talk) 10:30, 6 May 2010 (UTC)[reply]
It's great that we're (finally!) getting some momentum here - good on ya', Jimbo (although you weren't in a rush ;-). @Jmabel - I would say starting with photographs and videos which depict genital contact with hands, mouths, and other genitals, and probably masturbation too, would be a sensible set of images to discuss first - they're probably not really ok. Thoughts? Privatemusings (talk) 09:11, 6 May 2010 (UTC)[reply]

You can see some images it may be worth discussing here, if you're over 18 and ok with explicit content. Privatemusings (talk) 09:20, 6 May 2010 (UTC)[reply]

Here is a collection of categories requiring attention by admins aged 18 or over. Note that some of these have many, many subcategories:

http://commons.wikimedia.org/wiki/Category:BDSM

http://commons.wikimedia.org/wiki/Category:BDSM_humiliation

http://commons.wikimedia.org/wiki/Category:Sexual_fetishism

http://commons.wikimedia.org/wiki/Category:Spanking

http://commons.wikimedia.org/wiki/Category:Zoophilia

http://commons.wikimedia.org/wiki/Pedophilia

http://commons.wikimedia.org/wiki/Category:Sexual_penetrative_use_of_dildos

http://commons.wikimedia.org/wiki/Category:Sexual_penetrative_use_of_objects

http://commons.wikimedia.org/wiki/Category:Masturbating_with_dildos

http://commons.wikimedia.org/wiki/Category:Female_masturbation

http://commons.wikimedia.org/wiki/Category:Fisting

http://commons.wikimedia.org/wiki/Category:Male_masturbation

Some of these might be kept if we had a tagging system in place. As it is, libraries and schools providing minors with access to Commons have no practical way to filter these images out; this leaves them with the choice of blocking access to Commons altogether, or running the risk of losing funding or being found guilty of breaking the law.

Also, I believe Commons is breaking the law in a number of Western countries if it allows minors to administer these images. For reference, see [1]. We may need a separate Commons for adult content. --JN466 10:09, 6 May 2010 (UTC)[reply]

Jayen466, actually there is a very easy way to filter those images out because we do have a category system, the very system you used to find those images! As for your other idea, I think it is a very good one: if someone, somewhere wants to start a commons for adult content, that'd be fine with me. Wikimedia is not a free homepage provider or general hosting company. Saying that not everything belongs here is not to say that it can't live somewhere more appropriate.--Jimbo Wales (talk) 10:30, 6 May 2010 (UTC)[reply]
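(As a purely illustrative aside: the category-based filtering mentioned here could, in principle, be scripted against the public MediaWiki API. The sketch below assumes Python 3 with the requests library and an invented list of flagged category names; a real filter would also have to walk the deep subcategory trees listed further down, which is exactly the gap a tagging scheme would close.)

    # Minimal sketch: check whether a Commons file is directly in a flagged category.
    # The FLAGGED set is invented for illustration and ignores subcategories.
    import requests

    API = "https://commons.wikimedia.org/w/api.php"
    FLAGGED = {"Category:BDSM", "Category:Sexual fetishism"}  # example values only

    def categories_of(file_title):
        """Return the set of category titles the given file page belongs to."""
        params = {
            "action": "query",
            "titles": file_title,
            "prop": "categories",
            "cllimit": "max",
            "format": "json",
        }
        data = requests.get(API, params=params).json()
        cats = set()
        for page in data.get("query", {}).get("pages", {}).values():
            for cat in page.get("categories", []):
                cats.add(cat["title"])
        return cats

    def is_flagged(file_title):
        """True if the file is directly in at least one flagged category."""
        return bool(categories_of(file_title) & FLAGGED)

    # Example: is_flagged("File:Example.jpg")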

How about tagging media categories that feature explicit violent or sexual content? Then all that our admins would need to do would be to ensure that the files are assigned to one of these categories. Libraries, schools and other organisations providing computer access to minors could use these tags as a basis for their content filters.
I think we also need to address the issue of admins aged 17 or under administering this content. --JN466 10:42, 6 May 2010 (UTC)[reply]
Regarding content ratings: bugzilla:982. TheDJ (talk) 12:08, 6 May 2010 (UTC)[reply]

I am very sad to see that we are now deleting slews of images that are in use by various projects. To say "things can be undeleted" is putting the burden on the projects, on people who might not even speak English. I am truly ashamed. TheDJ (talk) 12:08, 6 May 2010 (UTC)[reply]

On the other hand, I am very thankful that we finally have this issue sorted out. Though in my opinion "every image that potentially might require 2257 record keeping" is ridiculously broad, since, well, those are ALL images of teasing nudity, BDSM and sex. TheDJ (talk) 12:13, 6 May 2010 (UTC)[reply]
And I really appreciate your support here, as I know you don't agree with the specifics of this. I think the important thing is that we work together to refine our notions in a way that is consistent with the overall mission of the Wikimedia Foundation projects. I think there are two possible views of Commons - one is as a radical free speech zone where everything is allowed as long as it is legal. Speaking philosophically about the Internet in general, I don't have a problem with something like that existing somewhere. The other view is that commons has a firmly defined educational mission that has to allow for a certain amount of 'difficult' content, but within bounds and always with a very detailed rationale. And speaking philosophically about Wikimedia and our goals, I think that's the right approach.--Jimbo Wales (talk) 12:42, 6 May 2010 (UTC)[reply]
If "work together" means announcing your willingness to "change the policies for adminship", it is a fine example of en:Newspeak. Has the Board of the Wikimedia Foundation taken a decision? Had you even discussed consistency of your notions with the overall mission with them? /Pieter Kuiper (talk) 13:04, 6 May 2010 (UTC)[reply]
Nothing at all is sorted out. All that has happened so far is that one board member has expressed his opinion on the issue. Please note, this is just a single opinion of a single person. Jimbo is not god here. Lots of other people have already expressed their opinions on this matter, and a new proposal about what we should do with content which might be considered pornographic is currently under discussion at Commons:Sexual content. If you want to change policy, that is the right place to go. Just because Jimbo offers advice on how he would like to see something handled doesn't mean that is what we are going to do. Remember, this is a community-driven project, not ruled by a single person. Please don't take Jimbo's opinion on this as the final word in this matter - this might be a starting point for a discussion, but it is certainly not the end of the decision-making process! @Jimbo: I disagree with your opinion, but I respect it. But please don't try to interfere with processes and threaten to de-admin people. This is not at all helpful and cannot be part of an open discussion and policy-making process. Regards, -- ChrisiPK (Talk|Contribs) 12:48, 6 May 2010 (UTC)[reply]
ChrisiPK, I thank you for your remarks, but I don't want to leave any mistaken impression here. This is not merely the advice of one person. I don't intend to deadmin anyone - but I will. The order of operation here is going to be that we first clean up commons, and then we can open a broader discussion about what to undelete if necessary. I just don't want anyone to get the impression that we're going to have an open vote about whether to turn Commons into a porn server. We aren't. It isn't going to happen. This is not a democratic debate, this is policy.
As is well known, I am a very strong supporter of community process and consensus. I'm not the right person to work out every single detail of what should remain. But just as NPOV is non-negotiable, so is "Wikimedia Commons is not a porn server".--Jimbo Wales (talk) 12:56, 6 May 2010 (UTC)[reply]
I'm sure you mean en:WP:NPOV, but it would be nice if you could keep in mind that Commons is a different project; our COM:NPOV actually states the opposite of what en:WP:NPOV does: we don't delete content just because it's biased. As for deleting everything and then restoring a few, it's not as easy as you make it sound. If a file is in use 100 times on 20 different projects, deleting it will automatically delink it from all these articles. But upon undeletion, it will not be automatically added back; there's a huge asymmetry in the amount of work. Not even mentioning the fact that "delete first, ask questions later" is a recipe for disaster. –Tryphon 13:22, 6 May 2010 (UTC)[reply]
(EC) I understand that we are not turning Commons into a porn server. But please understand that the line between what people consider pornography and what people consider useful for an educational purpose is not very clear. Thus I think it is a _very_ bad idea to encourage people to delete images which they think might be borderline or even beyond the line. This brings the problems mentioned below by Dragons flight, with images being removed from projects. Obviously the projects found the images useful, so who are we to decide, as a single person, that these images are not useful? I really think we need to establish some sort of policy on how to decide whether an image is useful or not. Simply deleting anything that is not of high quality and features sexual content is exactly what was overwhelmingly opposed when it was last proposed, and what is overwhelmingly opposed when a deletion request on such a file is filed. Unless we work out some guideline to distinguish between "porn" and "educational content", the last thing we should do is issue a "shoot on sight" order.
Also, I still don't see the immediate urgency of the matter. Commons has been hosting sexual content for years; you will surely not have learned that only today? Why are you showing up here and requesting removal of questionable images outside policy (and then in policy)? If our policy is too lax on this issue, propose improvements for it so admins can take action according to it. Right now our policy states that an image which is in use is in scope. Below you disagree with that. Please don't give commands on how to act which contradict local policy. Regards, -- ChrisiPK (Talk|Contribs) 13:28, 6 May 2010 (UTC)[reply]
@ChrisiPK, in regard to urgency: you might have missed the FOX News accusations against Commons of distributing child p0rn; though largely or completely wrong, they are surely not the PR we need. The immediate trigger, though, was likely Commons:Deletion requests/Images of Stan Spanker. --Túrelio (talk) 13:36, 6 May 2010 (UTC)[reply]

In the general spirit of not making a mess, shouldn't we give at least an initial presumption that images that appear in articles have an educational purpose? I agree that warehousing questionable content is a bad idea when it isn't being used for anything, but I'd like to think we'd stop and have at least a little discussion about images that are being used. I bring this up because someone has already deleted images that appeared in 30+ enwiki articles [2]. Dragons flight (talk) 12:50, 6 May 2010 (UTC)[reply]

I'd agree with you that those images ought to be among the first to be examined for possible undeletion. However, I think it extremely unlikely that many of them should be undeleted.--Jimbo Wales (talk) 12:58, 6 May 2010 (UTC)[reply]
Wait, seriously? I'm not sure if I can do that in good conscience. I had mostly been skipping such images so far. TheDJ (talk) 13:09, 6 May 2010 (UTC)[reply]
Part of the problem with delete first, sort it out later, is that CommonsDelinker will automatically remove those images from all the articles and other places they are being used once they are deleted. This in turn makes it much more difficult to see how an image might have been used prior to deletion, since all the image links will already have been removed. It also means that restoring any undeleted images to the places they were used would require someone to edit each of the appropriate places to put it back (potentially a daunting task if something is used on many different projects). If there is really a possibility that some of the images might be okay, then doing it this way creates a lot more work for people who might be tasked with considering the legitimacy of the images. Dragons flight (talk) 13:10, 6 May 2010 (UTC)[reply]
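(One way to soften this asymmetry, sketched here only as a hypothetical helper and not an existing tool, would be to snapshot a file's cross-wiki usage before deletion, assuming the GlobalUsage API module (prop=globalusage) is available on Commons; the saved list could then guide re-insertion if the file is later undeleted.)

    # Hypothetical helper: record where a file is used across all wikis before deletion,
    # assuming the GlobalUsage extension exposes prop=globalusage on the Commons API.
    import json
    import requests

    API = "https://commons.wikimedia.org/w/api.php"

    def snapshot_usage(file_title):
        """Return a list of (wiki, page title) pairs where the file is currently in use."""
        params = {
            "action": "query",
            "titles": file_title,
            "prop": "globalusage",
            "gulimit": "max",
            "format": "json",
        }
        data = requests.get(API, params=params).json()
        usage = []
        for page in data.get("query", {}).get("pages", {}).values():
            for entry in page.get("globalusage", []):
                usage.append((entry.get("wiki"), entry.get("title")))
        return usage

    # Saving the snapshot to disk keeps the usage list even after CommonsDelinker runs:
    # print(json.dumps(snapshot_usage("File:Example.jpg"), indent=2))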
Removing these questionable images encourages new images that are within scope (like illustrations) to be found or created. While this may remove a few live images in the short term, in the long term this should help improve the overall quality of the project. - Stillwaterising (talk) 13:03, 6 May 2010 (UTC)[reply]
FWIW, I'll add my 2c in support of Jimmy's statement at the top of the page. It's about time and I am pleasantly surprised that it has come about. So kudos from me too. cheers, Casliber (talk) 13:31, 6 May 2010 (UTC)[reply]


We should follow a methodical and organized approach to solving the broader problem.

Why is there such a focus on sexual content? The broader picture is clearly that there is a variety of content that is objectionable. A content filtering mechanism needs to be developed based on categorization, rather than a sudden broad censorship sweep based on subjective opinions of individual editors or admins. Some prudish editors will be inclined to delete anything that they view as sexually explicit, including nudity (it is already happening). The method being followed here is irresponsible and outside of the established process.

If prospectively objectionable content is really a big problem, then we should discuss it and make a workable mechanism. (see Labeling of prospective objectionable content) For instance, initial photograph submissions should go into a pool that only administrators can view and categorize. That can't be that hard to set up. Volunteer administrators who recognize that they may be exposed to objectionable content of all kinds (sexual, religious, violent, etc.) -- and who are also legally capable of viewing it -- can follow agreed-upon policies for content categorization. Eventually (someday) individual viewers would be able to check a box and select the types of prospectively objectionable content that they wish to allow or not allow (on whatever project includes the content). Yes -- this would take time. It always takes time to do things the right way. As they say, "God is in the details" - or, if you prefer, "The devil is in the details."
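(To make the checkbox idea concrete: the viewer-side decision could be as simple as comparing a file's content labels against the labels a viewer has opted in to. The sketch below is purely hypothetical - the label names and data structures are invented for illustration and do not correspond to any existing Commons feature.)

    # Purely illustrative: label-based viewer filtering with invented label names.
    from dataclasses import dataclass, field

    @dataclass
    class ViewerPrefs:
        # Labels of prospectively objectionable content the viewer has opted in to seeing.
        allowed_labels: set = field(default_factory=set)

    def is_visible(file_labels, prefs):
        """A file is shown only if every label attached to it has been opted in to."""
        return set(file_labels) <= prefs.allowed_labels

    # Example: a viewer who accepts medical nudity but not explicit sexual content.
    prefs = ViewerPrefs(allowed_labels={"nudity-medical"})
    print(is_visible(["nudity-medical"], prefs))   # True
    print(is_visible(["sexual-explicit"], prefs))  # False
    print(is_visible([], prefs))                   # True: unlabeled files are shown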

The narrow focus exclusively on sexual content, when it is not harmful and is objectionable only from a personal perspective, is just censorship. Images of violence, or a Wikipedia article on how to build a nuclear bomb, a pipe bomb, a Molotov cocktail and the like, would be things that are potentially harmful. I recognize that there are a lot of useless images of penises and the like brought into Commons, but there are also many images of many other things that are of poor quality, tasteless and useless being uploaded too. Again, taking the time to do this correctly and establishing a process and procedure is the proper method.

In my opinion (my apologies) the method that has been followed here shows a real lack of leadership and a lack of respect for everyone else involved in the project, from the Board on down. Policy from the board should flow downward in a planned and organized manner. An edict or directive from the founder outside of the accepted process and hierarchy should be addressed and dealt with by the board. Atomaton (talk) 13:53, 6 May 2010 (UTC)[reply]