Commons talk:Sexual content/Archive 6


Archive 1 (Dec 2008-Apr 2009) | Archive 2 (April 2010) | Archive 3 (May 2010) | Archive 4 (June 2010) | Archive 5 (June - October 2010)

Recent Changes

I did a pretty major copy-edit to the whole policy. No content changes of any kind were intended. Here's a summary of what I did.

  • Fixed typos and obvious grammar errors
  • Shortened phrases for brevity
  • Rewrote some wikilinks for flow and simplicity
  • Cleaned up header and list formatting
  • Added H2 header to combine File Descriptions and Categorization under 'Working with sexual content'
  • Added H2 header to combine 2257 and obscenity law under 'Legal issues'
  • Removed duplicate policy links
  • Rearranged sections to put most important info first
  • Added italics to two key areas
  • Refactored legal discussion to separate Wikipedia procedure from historical background

If there are any questions, shoot. The diff won't be very helpful, since there was a lot of shuffling and it happened over 10 different edits, but there are no content changes. I'd just give it a top to bottom read to make sure that it didn't change anything substantive, and hopefully it's a cleaner, shorter version of an important policy. Ocaasi (talk) 05:20, 2 November 2010 (UTC)

Limits on what 2257 applies to

Should we add a note that {{2257}} should not be used indiscriminately and never on pictures of children? I removed 2257 from a lot of pictures of nude children; pictures of children that 2257 applied to would need deletion, not tagging. In the larger sense, I don't want to see this indiscriminately spammed across a bunch of files in Category:Nudity when it only applies to sexual activity.--Prosfilaes (talk) 07:29, 2 November 2010 (UTC)

That seems like an important point: record keeping is not for illegal images; deletion is for illegal images. I don't know when exactly the line is drawn, but if you have that sense, it sounds good. Ocaasi (talk) 09:36, 2 November 2010 (UTC)
That's not my point at all; my point is that record keeping is not for mere nudity.--Prosfilaes (talk) 11:07, 2 November 2010 (UTC)
Alright, misread then. I think they're both good points. I will add yours as a counterweight, explaining both ends of the 2257 spectrum. Check in and see if it works. Sorry I identified you as privatemusings in the edit comment, I had just written on that talk page. Ocaasi (talk) 12:45, 2 November 2010 (UTC)
I tried this: Content identified as a sexual depiction of a minor should not be tagged with a 2257 template or amended to include 2257 information; it should be deleted. Ocaasi (talk) 10:21, 2 November 2010 (UTC)
New Version: 2257 requirements apply specifically to sexual content involving subjects of legal age. If content is of child nudity that does not qualify as sexual, then 2257 notifications are not needed. On the other hand, if child nudity is identified as sexual, then it should not be tagged with a 2257 template or amended to include 2257 information; it should be deleted and reported to the WMF. Ocaasi (talk) 12:55, 2 November 2010 (UTC)
Yes - I can say unequivocally that 2257 should never be used on images of minors (unless the image happens to also include adults who fit it). Explicitly sexual images of minors should be deleted, not tagged - mere nudity is generally not a reason for 2257 tagging. I've added brief notes to the tag regarding this. Dcoetzee (talk) 21:44, 4 November 2010 (UTC)

How about artworks? They shouldn't be counted as sexual just because someone is naked (e.g. a god) unless the work really contains sexual content. 221.126.247.208 10:08, 12 December 2010 (UTC)

What does this mean?

Can someone tell me what this sentence is trying to say:

By sporadic application of the First Amendment obscenity prosecution for adult sexual content or for sexual artwork, depicting apparent minors has become rare and unpredictable, but explicit sexual content depicting actual minors has not been protected by the courts and its prohibition is strictly enforced. Ocaasi (talk) 01:16, 4 November 2010 (UTC)
The Feds could sue a site hosting w:lolicon claiming that the material was legally obscene, but given that obscenity requires that it be without artistic, etc. value, it's never cut and dried, so they rarely do so. However, material showing actual children doesn't have such requirements, and will get slammed down quickly and hard. (In practice, I understand a lot of cases are stopped by the fact the FBI won't generally go after teenage CP unless they can get a confirmed ID and date of birth. But that's not really relevant for us.)--Prosfilaes (talk) 01:45, 4 November 2010 (UTC)

Check please:

Obscenity prosecutions of adult sexual content, sexual art, or caricatures depicting apparent minors have become rare and unpredictable; however, explicit sexual content depicting actual minors has received no protection and prohibitions against it are strictly enforced.

--Ocaasi (talk) 09:57, 4 November 2010 (UTC)

I read that, and it still doesn't make sense, because there is nothing qualifying 'protection' - the sentence above uses the term 'modification', and 'protection' does not obviously apply when what is actually being referred to is a loosening of restrictions on the depiction of such materials. How about Obscenity prosecutions of adult sexual content, sexual art, or caricatures depicting apparent minors have become rare and unpredictable; however, there has been no such modification where explicit sexual content depicting actual minors is concerned, and prohibitions against it are strictly enforced. --Elen of the Roads (talk) 22:50, 5 December 2010 (UTC)
Ah, I see what you mean. The perils of a la carte fixes. It will probably have to wait until after the poll, but that should be an uncontroversial improvement. I think we should start a list of to-fix-afterwards edits, since all of the new eyeballs will be wasted while the draft is !vote-locked. Maybe I'll do that. Ocaasi (talk) 18:14, 6 December 2010 (UTC)

Archive

Would someone mind archiving the old threads on this page? I'm a bit inexperienced in the archive business...Ocaasi (talk) 01:16, 4 November 2010 (UTC)

I've given it a go - although we may not even require 4 / 5 months of threads above? cheers, Privatemusings (talk) 02:19, 4 November 2010 (UTC)
Excellent...Ocaasi (talk) 07:37, 4 November 2010 (UTC)

Question about content without explicit consent

Edit went from:

This shall not in itself be a reason to delete them, but the community will discuss whether or not consent can be assumed in individual cases.

to

The community is free to discuss whether or not consent can be assumed in individual cases.

Do we want to remove the statement that lack of explicit consent is not sufficient for deletion? Is 'community is free to discuss' weaker than 'community will discuss'? Which is intended... Ocaasi (talk) 07:48, 4 November 2010 (UTC)

I think we need the statement, because it's been such a controversial question in this discussion and is likely to arise over and over. A small but persistent minority have wanted a strict requirement of explicit consent for everything. I think it's important that we make it clear that is not the policy we are proposing. - Jmabel ! talk 15:08, 4 November 2010 (UTC)
Agreed. There are many works for which explicit consent is unnecessary and unreasonable (e.g. notable artwork that happens to include a nude person). Dcoetzee (talk) 21:39, 4 November 2010 (UTC)
That sentence is not about "everything". There is already a list of cases where explicit consent is not necessary (including non-photographic artwork..., and old photographs). That sentence is only about old uploads. We compromised that there would be no mass deletion of old untagged uploads, but that individual discussions would proceed. So, how does the new wording not convey that? --99of9 (talk) 09:40, 5 November 2010 (UTC)
Logically, it shouldn't matter if we say explicit consent is necessary or not if we're leaving it to case by case discussion anyway. Even so, I agree that I'd like to keep the extra statement anyway, just to be clear about it. Wnt (talk) 09:37, 5 November 2010 (UTC)
I think there's a risk of being overly verbose, a bit confusing, and perhaps even mildly prejudicial, to be honest. The intent of my edit (as requested) was to keep it simple - a deletion discussion where an older file is deleted due to perceived consent issues is ok, as is a deletion discussion where an older image is kept despite consent issues being raised. If our intention here is not to mandate things either way, which I think it is, then I think it's prolly best the way it is :-) Privatemusings (talk) 23:25, 7 November 2010 (UTC)
I combined the bullet point with the other point about this to reduce confusion (I hope). Looking at the text and messing about with it this way and that, I didn't see any obvious way to make the text clearer without making it more confusing. ;) I think that as long as we have agreement here that we're not somehow intending this to be interpreted to allow deletion by default of old material (which I don't think is how it reads), I suppose we can spare readers of the policy the extra verbiage. Wnt (talk) 04:02, 8 November 2010 (UTC)
I kept the combination, but reworded it a bit, dropping the future tense and instead stating it like this:
Uploaders of self-produced sexual content are expected to make a positive assertion that the subjects of the photos consented to the photograph and to its public release. Provision of further evidence of consent is welcome (using the OTRS process) but is not normally required. Content which lacks assertions of consent, usually from files uploaded before implementation of this policy, should not be automatically deleted. The community should instead discuss whether or not consent can be assumed in each case.
I think that covers both issues ok, while keeping it as straightforward as possible. Ocaasi (talk) 17:42, 8 November 2010 (UTC)

< I tried again, because I felt that 'your' version kind of implied that future uploads of sexual content which are missing explicit assertions of consent might be ok - whereas I think we've agreed that they're necessary. Hopefully it's even clearer now :-) cheers, Privatemusings (talk) 23:45, 8 November 2010 (UTC)

I see how your version was an improvement in affirming the present requirements. I tightened the phrasing just a touch, no change to content (except 'discuss' to 'determine', if you want to be really technical). On the other sentence, you went from the 'should not be automatically deleted' to 'free to discuss', which though correct is a somewhat weaker way of saying that prior content without consent 'shouldn't be deleted without discussion'. I think in cases where people get deletion-happy, 'free to discuss' doesn't set a requirement for them to make that effort. How about both changes? Ocaasi (talk) 05:55, 9 November 2010 (UTC)
I thought that version sounded a little too favorable toward random proposals for deletion of old content, while being too restrictive toward proposals to delete new content that seems very suspicious despite an assertion of consent. (A different issue, mentioned only briefly in the previous discussions, was that you might come up with some content that seems intrinsically suspicious despite an assertion - a flash snapshot of a startled couple in a hotel bed, etc. The assertion should generally be a good defense, but we weren't going to require editors to be completely blind, either.) I'm not sure my last edit is the best way either... Wnt (talk) 09:14, 9 November 2010 (UTC)

I've made a small edit, but I'm still concerned by this new wording. One thing that bugs me is that old random Flickr uploads will now be treated differently to new random Flickr uploads (without assertions). The old need reasonable suspicion to delete, the new ones should not be uploaded. --99of9 (talk) 09:55, 9 November 2010 (UTC)

In general I think we should have some hint of positive trustworthiness (e.g. trusted uploader) to keep the old stuff, rather than just no negatives. --99of9 (talk) 10:00, 9 November 2010 (UTC)
Looks ok to me. Maybe 'preferred' instead of 'encouraged' just as a word choice. Ocaasi (talk) 14:11, 9 November 2010 (UTC)
I thought the last draft (99of9) was OK also. I just added a footnote linking an article about the Tyler Clementi case as general background - but if someone doesn't like that I won't insist on keeping it. Wnt (talk) 03:55, 11 November 2010 (UTC)
That case is really significant (or it will be soon), but I have a few thoughts. I'd like to tighten up the summary a bit just for brevity. Also, do we need to consider victimization issues? This case is receiving ample attention right now, and the family hasn't been avoiding publicity, but maybe until this actually winds up in a court ruling we should just speak generally about it. I could go either way. Thoughts? Ocaasi (talk) 20:13, 11 November 2010 (UTC)
I've seen several news cycles about Clementi, and in the U.S. his case is being used as something of a rallying cry against bullying. I don't think it would be inappropriate to name him. The legal case against the people involved in the videotaping is likely to be a significant test of a 2003 "invasion of privacy" law — as I understand it, they were operating their own webcam in their own (shared) dorm room. It may be more important in social terms, however — before this case, I think that many people would think of the consent issues more in the context of actresses suing to try to keep sex tapes off the Internet, whereas now some are going to think of it as a life and death issue. I don't think that's really fair — it was prejudice and bullying, not video, that caused this tragedy — but I can't pretend it won't be true, and editors (many of whom may not be from the U.S. and never heard of the case) should have some warning.
As people here know by now, I believe in the "not censored" principle of Wikipedia, and even now I wouldn't have acquiesced to demands for an explicit assertion of consent for future content without a tangible reason. If scenes from the Clementi video had been posted to Commons without an explicit assertion of consent, policy wouldn't touch Clementi (we wouldn't know who he was), nor most likely the uploader (who would be breaking the law and existing policy with or without an explicit statement about it, and probably wouldn't need a notice to tell him this). But it might have protected people "distributing" the video, which I think might be interpreted to affect various types of innocent editors by various arguments which it is probably not productive to detail here. By having an explicit statement to fall back on, they would have a firm defense, whereas without it prosecutors might well end up Wikilawyering in the courtroom.
I am not at all happy to have yet another way for it to be illegal to copy content you find on the Internet! But from what I've read (which questions whether the video was viewed by anyone) I don't know if the current prosecution could lead to constitutional review of that part of the law; if it did, this is not the test case I'd want to see used to make the decision. Wnt (talk) 21:46, 11 November 2010 (UTC)

Commons Study on Controversial Content

I noticed there's no mention of, not even a link to, the recent study on controversial content. Is that intentional? Does this policy assume that those recommendations have no teeth or force until the foundation decides to do something about them? Does it make sense to link to them in a 'see also' section, perhaps? Ocaasi (talk) 10:01, 4 November 2010 (UTC)

See also seems a good idea but the study is just a study, not a policy mandate. I think we ought to adopt this proposal as policy on our own rather than by being forced to via a WMF mandate, though. ++Lar: t/c 11:44, 4 November 2010 (UTC)
Yes, it's always nicer to self-legislate, like when McDonald's added apple slices in lieu of FDA regulation. I'll try a see also link, which seems to be the right fit between informing and capitulating to the (thoughtful) meddling of the outsiders. Ocaasi (talk) 11:56, 4 November 2010 (UTC)
(I really don't get the apple-slices/McDonald's reference. Was this a US problem?) Archolman (talk) 00:43, 6 December 2010 (UTC)
(There were rumblings about Big Food becoming the next Big Tobacco, facing multi-billion dollar obesity lawsuits, congressional hearings, etc. with documents showing execs tried to market to kids and get them addicted to fatty, salty, hormone-ridden snacks. So the industry took a hint and voluntarily adopted milk, apples, more salads, and a few other things to health-wash their image. Which uncharitably suggests by analogy that that is what we're doing, but there's either a) no truth to that or b) nothing wrong with it) Ocaasi (talk) 18:20, 6 December 2010 (UTC)
The study makes few specific recommendations, and those it makes largely apply to other projects (image hiding on Wikipedia) or are not limited to sexual content as defined here (semi-nude women, which are not considered sexual content by our definition). In any case, I'm not very favorable about it. Wnt (talk) 09:31, 5 November 2010 (UTC)
If the recommendations are passed on by the foundation, we can then consider whether to expand our definition of sexual content. The good thing is that what we've developed here is a framework for how to deal with whatever is considered sexual. --99of9 (talk) 09:44, 5 November 2010 (UTC)
Yes - it is important though that we adopt this (in some form) as policy before the Foundation considers the study, or else I fear they might swoop in, say we're not taking clear action, and institute something more conservative. Dcoetzee (talk) 20:53, 5 November 2010 (UTC)
It's premature to speculate on how to deal with a hypothetical arbitrary policy, which might affect anything. But I think we've designed this policy to deal with a very specific class of content, based to a large degree on American legal distinctions. If the WMF imposed some policy about bare breasts I think we would be best off (at least) to try to keep it as something separate rather than trying to integrate it here. The whole point of an imposed policy is that it is imposed on you, so how can you edit it? So how and why would you try to unify it with any other? Wnt (talk) 03:47, 8 November 2010 (UTC)
Agree with that, except that when something is 'imposed' on you, it can still be in your interest to adapt it to your purposes, or at least to integrate it with them. But it's premature to bring it up while those issues are just floating through the proposals-and-recommendations cloud. Ocaasi (talk) 17:45, 8 November 2010 (UTC)

Do we need assertions of consent from unidentifiable people?

The way the last part of the "Prohibited content" section is currently written, we "must" have an assertion of consent even for close-up photos showing, e.g., a particular kind of STD rash. Firstly, is there any reason to require an assertion of consent when the subject of the photograph isn't identifiable? Furthermore, wouldn't requiring an assertion of consent in such cases hinder the ability to illustrate educational medical sexuality articles? 71.198.176.22 16:44, 16 November 2010 (UTC)

were this proposal adopted, yes - sexual content, as defined on the proposal page, would require an explicit assertion of consent. If you read the note related to the 'close up' type of image you use as an example, I think it's likely such an image would fall outside of the purview of this policy. Personally, I think an assertion of consent would be desirable anywhooo - don't you? cheers, Privatemusings (talk) 20:10, 16 November 2010 (UTC)
No, I don't think it would be desirable to have to get consent for close up, unidentifiable images because what is the point of such a hassle? I'm not sure what the note you linked to has to do with images of unidentifiable people, either. Is that the note you meant to link to? 71.198.176.22 04:26, 18 November 2010 (UTC)
I see there was a "grammar tweak" from "with sexual content all are subject" to "all sexual content is subject".[1] This is an example of the perils of copy editing. Additionally, there was some effort to define "identifiable" quite expansively before, including e.g. EXIF information, which has since been taken out; I think the feeling was that it was best to have one single definition of identifiability under the main policy about it.
However, on consideration, I think we should think through carefully just what we want or need to say about this. For example, Wikimedia might want to steer clear of unpublished photos from a secret locker room camera, no matter how carefully cropped and anonymous they may be. On the other hand, we would want to rule in non-identifiable photos from a retired surgeon who has taken a collection of photos for personal use and in medical publications, but never obtained consent to publish them here. My first thought on processing this is that we need to exclude even non-identifiable photos if the photography itself was an illegal violation of privacy, but that we don't need to exclude them otherwise. For example, in the locker room example, it is possible that the act of investigating the dissemination of the "anonymous" photos would end up revealing that they were of a particular person, and put Wikimedia in an entirely undesirable position. But before trying to implement such a zig-zag line, I should see what others here have to say. Wnt (talk) 13:27, 18 November 2010 (UTC)
I think if you re-read the copy-edit closely, it actually removed a redundancy and did not change the meaning at all. Generally, though I respect legal writing, policy, constitutions, etc., there is something wrong if a single word or phrase can obscure the intent of the guidance. So we should make it more plain and robust if that is what happened.
I think your 'illegal itself' notion is a good distinction, but I don't know how it would be incorporated. It seems more than reasonable that content which appears to be illegally acquired should be excluded, even if it does not violate other provisions (identifiability, sexual content with children). But how would we know what 'looks illegal'? I think the only question is whether those situations should be a) speedy deleted, with the possibility of discussion afterwards; b) discussed per regular deletion, with the burden on showing it is legal; or c) discussed per regular deletion, with the burden on showing it is illegal.
My gut instinct is that where there is identifiability or sexual content with children, a) is the obvious choice; where the content is blatantly sexual or exposing body parts (or if there are subtle identifiers that could end up as clues), b) is the best option; and if there is just a generic body part but no children, sex act, or obvious identifiers, c) is okay. That's not very simple, but it's my best stab at it. Thoughts? Ocaasi 02:20, 19 November 2010 (UTC)
There's really no way to verify that consent was really obtained for photography under any scenario - even if we kept 2257 paperwork, which we shouldn't, it would be child's play to forge an electronic copy of a driver's license and a signature for which we don't have a valid exemplar. Here we're just trying to figure out a required standard, without much thought for serious enforcement beyond the assertion of consent. However, I think it's possible to be pretty sure in some cases that no consent was obtained, allowing AfDs to be raised. Even so, I don't think that speedy deletion will usually be appropriate for suspected lack of consent, simply because it is going to be a complex conclusion to draw from media that don't explicitly say that they are violating policy. I think that the current guidance that speedy deletion is only for uncontroversial cases is sufficient. As for burden of proof, I don't think that any AfD on Commons truly has a "reasonable doubt" standard; it's more a "preponderance of the evidence", which puts b and c as pretty much the same option. Disturbing the current language on the point could be trouble, since we've had enough trouble just agreeing that there should be a reason to suspect lack of consent... Wnt (talk) 12:00, 19 November 2010 (UTC)
I've added yet another bullet point to the section, trying to deal with this issue out in the open; it should make any other rewording unnecessary.[2] Provided I didn't screw it up... we're diving into some rather fine detail nowadays. Wnt (talk) 12:25, 19 November 2010 (UTC)
No no no. Why are we suddenly reopening something we've hammered out so hard already? I thought we had agreed to stick fairly closely to copyedits until we had a clean stable page to vote on. This "close crop" stuff allows someone to put their ex-girlfriend all over commons as long as he cuts her up small enough. Consent is important. I will revert; please obtain wide consensus before making such significant changes since we've already found a compromise point. --99of9 (talk) 13:27, 19 November 2010 (UTC)
What would be the point of a vindictive ex putting unidentifiable images up? Anything intended to embarrass is necessarily going to be identifiable. I don't think this is a legitimate point of compromise since we already have dozens of close-cropped images of genitalia and breasts, several of which are currently being used to illustrate Wikipedias. The proposal as worded would require the deletion of those images unless the uploaders, many of whom are long gone, come up with assertions of consent. I don't believe anyone actually wants that. Do they?
I think Wnt's bullet point was a reasonable solution, so I'm replacing it. 71.198.176.22 03:37, 20 November 2010 (UTC)
Hmmm, I don't want to break down a fragile consensus, but at the same time I'm responding to a November 4 "grammar tweak" as explained above. I don't really want to go back to that rather peculiar phrasing either, especially since we apparently had more difference of opinion about what we were reading there than we imagined. I'd hoped that my wording might be acceptable, but is there a way we can fix this? I should emphasize that if "identifiable" is interpreted narrowly enough, 99of9 does have a point about the cut-up photo sequences. I originally considered some sentences that tried to define "identifiable" strictly, e.g. specifying that a photo can be identifiable from context, clothing, tattoos, etc. That should include cut-up photo sequences where one end has the head and one end has some other part. The problem is, I see this really as a question for the photographs of identifiable people policy - we shouldn't have two different definitions of "identifiable" knocking around or chaos will ensue. Wnt (talk) 05:20, 20 November 2010 (UTC)
Wnt, I don't understand how you could have otherwise interpreted that sentence before the grammar tweak. Can you please spell out what you thought it meant? The diff you provided seems like a straightforward improvement on the English, and I can't for the life of me see any change of meaning, even implied meaning... --99of9 (talk) 12:09, 20 November 2010 (UTC)
The pre-Nov. 4 version read "All media is subject to Commons:Photographs of identifiable people.... In addition with sexual content, all [media subject to COM:IDENT] are subject to the following requirements:" It seems to me that the following points were meant to modify the normal requirements of the COM:IDENT policy. Since the consent issue has previously been part of COM:IDENT, this is what was to be expected. Wnt (talk) 13:51, 20 November 2010 (UTC)
IP address, your statement "The proposal as worded would require deletion of ..." appears completely false to me. We have a dotpoint that explicitly addresses files uploaded prior to activation of this policy. Since that was the justification of your revert, I trust that you will restore to correct your error. Further, my point above was that "I think ... is a reasonable solution" is no longer a good enough reason to make functional changes to the text. We have a carefully agreed compromise, and one or two people is no longer sufficient consensus to make that kind of change. We are supposed to be copyediting. (PS, please consider logging in and backing your opinions with the reputation of your username.)--99of9 (talk) 12:09, 20 November 2010 (UTC)
I should agree with 99of9 that this part of the policy isn't about the lack of explicit assertions of consent, which would affect the old files, but only about actual lack of consent, which is a much narrower matter. What I'm interested in preserving here is the use of photos taken with consent but not with explicit consent for use here, primarily medical and surgical photos.
To give an example of the sort of things I'm thinking about, consider an NIH-funded paper about a surgery which is published in a private journal, but which by law is required to be under an open license. The paper may contain a low-resolution black and white photo of the procedure, which is published in the journal, but include "supplemental data" at the journal's, author's, or department's web address with a wider array of high-resolution color photos and video. Such photos can be used on Commons as far as copyright is concerned, but may not be regarded by some as having been published (especially since it is not that uncommon to update supplemental data after the paper is out...). The patient would have consented to a medical publication and other uses by the author, but probably not by us.
I think there is a way to write this to clearly rule out the cut-up ex scenario, and clearly rule in the medical images scenario. Wnt (talk) 23:18, 20 November 2010 (UTC)

I hadn't realized that the text currently includes "Explicit assertions of consent are encouraged but not required for sexual content uploaded prior to implementation of this policy" which does indeed address my concerns about earlier images above. However, I really do think requiring an assertion of consent for unidentifiable images is much less desirable than allowing them. The chance that an uploader is going to try to embarrass someone with an unidentifiable image is tiny, and there's no way it would work if they tried, by definition. That means requiring consent for unidentifiable images is needless instruction creep. Also I hope Wnt can tweak his text to address the external publication issue he describes. I don't understand why we shouldn't still be trying to make improvements to the proposal; we improve policies even after they're adopted. 71.198.176.22 06:51, 21 November 2010 (UTC)

Want to bet? Send me a full frontal nude image of yourself, your full name, and the copyright holder's release, and I'm sure I can embarrass you on commons without breaking existing policy. --99of9 (talk) 08:42, 21 November 2010 (UTC)
If you cut an image in two, where one half is identifying and the other is embarrassing, then the existing policy would allow publishing the embarrassing part here. We do not want that, as the identifying part can be published later or elsewhere. That is why we want consent for most sexual content. Truly unidentifiable images and images from serious sources are no problem. --LPfi (talk) 11:48, 22 November 2010 (UTC)
As I said, 99of9 has a good point if we define identifiability too narrowly. But this scenario extends to many things beyond this policy. After all, a full frontal nude image of a typical person, especially if female, can be quite embarrassing even if the actual genitals are obscured, and it isn't within the scope of this policy at all. Nor do I think we should consider a separate COM:BREASTS policy to deal with that; after all, it might be embarrassing for a man to have his belly cropped out and used to illustrate the obesity article and so forth, if the picture still gets linked back to his name.
My feeling is that we can't force our policies on someone with truly malicious intent - he can always falsely assert consent for the whole image, even fake "evidence" such as it is, so why focus on a later malicious release of a second half of an image? But we might warn editors against doing this unintentionally. Based on this discussion my thought is to add some redundant warning about identifiability, though trying not to set up a separate definition. Wnt (talk) 15:41, 22 November 2010 (UTC)
I agree; it doesn't matter how close-cropped an image is if it has been, or is likely to be, associated with someone's name or identity. In that case an assertion of consent should be required for new uploads, and if the person associated with an image uploaded at any time wants to withdraw consent or claims that there never was consent, we should respect that and delete the image. I am not sure those points are in the policy, and they should be. I'm going to try to put them in. 71.198.176.22 21:54, 22 November 2010 (UTC)
best way forward in my view, and the one currently written up, is for the sexual content policy on consent to apply to all sexual content - it's easier, clearer, and just plain better that way :-) Privatemusings (talk) 20:26, 22 November 2010 (UTC)
"All sexual content" includes both images and text, a substantial subset of which there is a wide consensus to allow instead of delete. Which version are you referring to as "the one currently written up?" 71.198.176.22 21:54, 22 November 2010 (UTC)
not by our definition :-) - take another look..... Privatemusings (talk) 22:36, 22 November 2010 (UTC)
I'm sure that whoever wrote "media depicting" didn't intend to include textual media or textual descriptions, but text is a kind of media, and according to two of the dictionaries I just checked, depictions include text as well as pictures. Do you think it would be a good idea to resolve such ambiguities before asking people to approve the draft? 71.198.176.22 23:48, 22 November 2010 (UTC)
  • "Media" here is clearly in the sense that Commons is WMF's media repository. We can't just say "images" because it also includes audio, video, etc. Still, in the context of Commons content, when we refer to "media" we mean the media files, distinct from descriptive text. I believe this is pretty consistent across policies & guidelines. - Jmabel ! talk 05:15, 23 November 2010 (UTC)
Perhaps I was going about this the wrong way. I'm trying a different edit here [3] explicitly stating that "previously published" includes certain websites. Some of my other concerns are actually not prohibited by the current language of the first section, which only says "public release" rather than release on Commons. I'm not going to work myself up just now over the right to take other people's snapshots for personal use then release anonymized cropped bits on Wikipedia for fun; no doubt there is some case in which this is terribly important but I'm not thinking of it today. An aberration in this edit which might raise some hackles is that I refer to a Wikipedia policy in a link - unfortunately, I didn't see a Commons policy to cite about reliable sources! Wnt (talk) 19:57, 23 November 2010 (UTC)

Really not ready for prime time

I think we've got a reasonable consensus on the meaning of the policy we will present, but it's really not yet all that well written. If I have time (and focus), I may take a shot at this myself, but I'd welcome another good writer trying in my stead.

For the most part, I would hope any rewrite would be shorter, not longer, but there is one thing I think is missing: an explanation of why sexual content raises certain issues that are not found, or are found only to a lesser degree, in other content. E.g.

  1. Sexual depictions of minors can trigger criminal legal concerns in many countries.
  2. Sexual depictions of identifiable living or recently deceased people have enormous potential to prove embarrassing to those people or their associates, and therefore it is more important than usual to be confident that the subject gave any necessary consent for a photo to be distributed.
  3. To a greater degree than in other subject-matter areas, hosting numerous sexual images with little or no educational value is likely to be detrimental to Commons' reputation. Few people will consider it a serious problem if Commons has superfluous pictures of domestic cats or of the west face of St. Paul's Cathedral. Far more will consider it a problem if Commons has superfluous amateur pornographic images.

If someone wants to rework that, great, but I think something like this is needed.

Also, there is currently one very confusing passage in the lede: "This policy addresses how Wikimedia Commons should handle materials outside Commons' project scope..." Is that really what we mean to say? It seems to me that the policy is, instead, about determining the boundaries of that scope. "Materials outside Commons' project scope" are routinely deleted. - Jmabel ! talk 21:02, 16 November 2010 (UTC)

I'm not sure if writing style is justification for holding up a vote. After all, it's a policy, not a literary masterpiece, and having a policy that we agree on the meaning of already seems like an accomplishment. We keep having gaps of up to a week when no one edits the policy, and waiting for major revisions could take a very, very long time. And every change, however small, seems to risk reopening disputes. I see you've made improvements, and so far we've been delaying the vote for copy-editing; still, it has to end sometime. Wnt (talk) 11:38, 17 November 2010 (UTC)
I think Jmabel is getting at the fact that we have a good document that needs one more paragraph in the lead. It can be done. Points 1 and 2 are clearly in the article in different sections. Point 3 is somewhat more controversial because it gets into the why of this policy, and whether there are extra-legal, extra-commons concerns about mere reputation or social responsibility involved. I can't address the last point, but I'll incorporate the first two into the intro and see if we can discuss the third.
Sentences added: Sexual content must be handled differently from other content, because sexual depictions of minors can trigger legal issues in many countries; depictions of identifiable living or recently deceased people may be embarrassing or violate their privacy; and hosting sexual images can be done in a way which protects the reputation of Commons among the public to safeguard their continued support.
Question: Is point three accurate: are we writing this policy out of any consensus-concern about the reputation of Commons or about some general social concern?
I have to say that I oppose this paragraph altogether. The policy as written really does not treat sexual content differently: Commons has never had a policy of hosting illegal content, it has long had the Commons:Photographs of identifiable persons policy, and it does not censor photographs based on perceived external or even internal opposition to their content. Furthermore, it is not actually contributing to the policy in any identifiable way - it's exactly the sort of excess and confusing verbiage that we should be trying to chop out. Wnt (talk) 00:58, 18 November 2010 (UTC)
I do not like the paragraph either. There are other things that can trigger legal issues. Depictions of identifiable people may be embarrassing. Sure sexual content is extra sensitive, but I think people know that. And the third sentence is strange, it sort of says the thing backwards. We want to have these things straight, why do we say we do it to safeguard our support? --LPfi (talk) 13:53, 19 November 2010 (UTC)
I've offered a compromise (?) sentence about the above.[4] But I don't know if this will satisfy Jmabel, nor would I object to its deletion since it is more or less redundant. Wnt (talk) 01:08, 18 November 2010 (UTC)
I see what you're getting at, but if identifiability and illegality are sufficient, what is the point of this policy at all? By your description, either sexual content is already covered by existing policy and a sexual content policy is redundant (or merely guidance for applying the main policies), or else sexual content is handled differently than other content. I don't have a preference either way, but I think we should know which is which. I'm going to try rephrasing the added sentence to say the same thing with less perceived distinction. See if it gets closer.
Sentences added (2): Sexual content tends to need closer attention because of the legal and privacy-related issues it raises. In order to ensure that such content does not violate U.S. law, or other Commons policies, this policy addresses processes to handle sexual material that falls within Wikimedia Commons' project scope and to identify and remove material that does not.
Re: Wnt's point about voting despite style, I agree. Policy on Wikipedia is rephrased constantly, but it doesn't affect the basic guidance. User:Ocaasi 13:25, 17 November 2010 (UTC)
I'd be perfectly glad to see us head toward a vote at this time.
As for the matter of reputation: that's essentially what started this all in the first place. Jimbo launched a bit of a crusade, deleting even some images that were clearly significant works of art. His actions were almost certainly in response to sensationalistic news stories besmirching Commons' reputation. - Jmabel ! talk 16:19, 17 November 2010 (UTC)
I recall that his edits generally said "out of project scope", and I would like to think that whatever external pressures were brought to bear on him, that he would not have launched an all-out attack on a category of content if much of it were not actually at odds with existing policy. We know some files were deleted that should not have been, but many of his deletions stuck based on preexisting policy. Wnt (talk) 01:15, 18 November 2010 (UTC)

Wikimedia leadership response to controversial content

Linking these because they may be of interest to people watching this page, and to inform policy discussion.

  • en:Wikipedia:Wikipedia_Signpost/2010-11-15/News_and_notes#Controversial_content_and_Wikimedia_leadership
  • Sue Gardner: Making change at Wikimedia: nine patterns that work: "we’re the only major site that doesn’t treat controversial material –e.g., sexually-explicit imagery, violent imagery, culturally offensive imagery– differently from everything else. The Board wanted –in effect– to probe into whether that was helping or hurting our effectiveness at fulfilling our mission."
  • [Ting Chen] writes: "the ongoing controversial content discussion is a result of our strategic planning (development and adaption in the nonwestern cultures) and the response of the changes in public policy and in our responsibility."
  • meta:Talk:2010_Wikimedia_Study_of_Controversial_Content:_Part_Two#Recommendations_discussion: Inviting feedback on the 11 recommendations of the study, which are:
    1. no changes be made to current editing and/or filtering regimes surrounding text
    2. no changes or filters be added to text on current Wikimedia projects to satisfy the perceived needs of children
    3. creating Wikijunior
    4. review images of nudity on Commons
    5. historical, ethnographic and art images be excluded from review
    6. active curation within controversial categories
    7. user-selected shuttered images (NSFW button)
    8. no image be permanently denied with such shuttering
    9. allow registered users the easy ability to set viewing preferences
    10. tagging regimes that would allow third-parties to filter Wikimedia content be restricted
    11. principle of least astonishment to be codified

Feel free to visit the last link and provide your own feedback. Dcoetzee (talk) 04:03, 17 November 2010 (UTC)

Should we allow subjects to withdraw consent for images uploaded prior to adoption?

Is there any reason not to allow subjects associated (correctly or incorrectly) with a potentially embarrassing image to withdraw consent for its inclusion, as a BLP-strength support for deletion? I remember people suggesting that there was, but I'm having a hard time remembering why. 71.198.176.22 22:03, 22 November 2010 (UTC)

the granting of, and subsequent withdrawing of, consent is a tricky issue. I'd be happy to continue discussions about the best policies in this area whilst the current draft is adopted. Privatemusings (talk) 22:37, 22 November 2010 (UTC)
I'd rather we put forth a draft addressing this issue, because it is tricky, instead of delaying addressing it before asking people whether they want to support the draft. But I'm not opposed to addressing the issue in a wider proposal instead, since it's not specific to sexual content. Commons:Living persons would seem to apply, but that has only been edited by one person about a year ago, and it's in pretty bad shape. Can we use foundation:Resolution:Biographies of living people instead? It doesn't say anything about images, but the spirit of that Resolution seems very clear to me here. Alternatively, should this issue be addressed in Commons:Photographs of identifiable people instead of here? Or do we want to limit withdrawal of consent to sexual content? 71.198.176.22 23:48, 22 November 2010 (UTC)
Currently the photographs of identifiable people policy is the only place where consent issues come from; depending on ongoing discussions above, things may or may not stay that way.
My position is that the point of Commons is to put images permanently into the public domain. So consent should not be revocable, just as the copyright terms are not revocable. That said, I think we should heavily weight the subject's statement about consent: if he says he never consented, or never consented to public distribution of the image, this outweighs the photographer's assertion of consent. Only hard evidence like a verifiable prior publication of the image is likely to outweigh the subject's statement. It is true that in many cases subjects can abuse this to effectively revoke consent, but at least we're not committing ourselves to the principle that Commons images are only temporarily public. I should note that a subject should only be able to challenge an image if he asserts that it is correctly associated with him; dispelling false rumors isn't our responsibility. Lastly, in the interests of defending the public domain, we must not remove published photos that have reached the public domain, no matter how the subject objects; otherwise we fall behind the archives of the original magazine and other copyrighted resources. Wnt (talk) 18:52, 23 November 2010 (UTC)
2 remarks, though I'm not sure either affects the policy we are discussing here:
  1. I presume you mean "public domain" in a metaphorical rather than a legal sense. I've posted tens of thousands of my photos to the Commons, but I have not placed them in the public domain.
  2. As a matter of practice, we often remove photos as a courtesy to the subject of the photo. If the picture is readily replaceable for all current uses, and there doesn't seem to be anything particularly notable about it, and if the subject of the photo objects to its presence here, we usually do this. For example, if we have a routine photo of someone around the threshold of encyclopedic notability, and they don't like the particular photo, and they provide us with a better one (or we already have a better one), we almost always accede to their wishes. - Jmabel ! talk 05:32, 24 November 2010 (UTC)
As a matter of practice, not policy, and when the image is replaceable. I would much rather have deleting photos on request a matter of practice when it's convenient or reasonable, rather than policy.--Prosfilaes (talk) 06:33, 24 November 2010 (UTC)
This is worth a separate policy. Generally, regardless of legality, copyright, or current policy, I can't really see a good reason not to permit someone to change their mind for any reason at any time. If an image is harmful to a person's reputation, it's not worth hosting it, and we should probably be able to replace it anyway. The only situation I can see where this might not apply is mass uploads of non-sexual content (e.g. trains), where the author wakes up one day and wants to make a profit; that might be too late. Thoughts? Also, where would be the place to discuss this aspect of policy? Ocaasi (talk) 05:25, 25 November 2010 (UTC)
This discussion is getting off topic but, yes, there are lots of reasons not to allow people to revoke their permissions. None of the following has particularly to do with sexual imagery (which is why this is off topic) but...
Is there a better place where this discussion could happen?
  1. When you upload a photo you took, you grant an irrevocable license. If someone uploads a bunch of useful images, then later stalks out of the project in a fit of pique, they don't get to revoke those permissions and remove their pictures from Wikipedia.
  2. Example rather than abstraction here: the images from Abu Ghraib are certainly embarrassing to their subjects (both Iraqi and American) but we certainly wouldn't remove them because Lynndie England found an image of herself embarrassing.
  3. Example again: a U.S. Senator might object to us hosting anything other than their official photo, but it would still be entirely legitimate (for example) for us to host their high school yearbook photo or a photo contributed by someone who photographed them at a public event.
  4. Similar example, less obvious: if I take a photo of someone about whom we have a Wikipedia article, and I take it entirely in accord with the usual rules in the relevant country (e.g. photographed in a public space in the U.S.), and that person doesn't like Wikipedia having their photo, frankly, unless we decide we want to do a courtesy to them, it's their problem, not ours. Now, if it's really not a good photo, and especially if we have a better one, I'll probably support doing that courtesy, but it's a courtesy, not a matter of policy.
Jmabel ! talk 06:36, 25 November 2010 (UTC)
I think we're discussing different cases. Generally, I was only referring to pictures that the subject took themselves or granted permission themselves to use, not other public domain images which someone is merely the subject of. Re Lynndie England, she's a public figure at this point, and the photograph wasn't produced or uploaded by her--same with the Senator, who presumably didn't upload the content that s/he is seeking to take down. I was trying to think of counterexamples where we should not honor a request for someone to take down a photo they took themselves or granted permission themselves. Can you think of any for those? Ocaasi (talk) 06:48, 25 November 2010 (UTC)
One of the most frustrating things Commons does is going around deleting images in use, for good reasons or bad. In an optimal world, we would never delete an image used in a current or historical version of any page on a Wikimedia project. We already give photographers many more rights than authors of encyclopedia articles, who have their names buried in the history and don't get to choose their license; there's no need to privilege them further. A photographer shouldn't get to rip their photo out of an article any more than an author could rip their text out of a WP article. We have to treat the portrayed with all the respect of a professional scholar or journalist, which usually coincides with their own desires; when it doesn't, we destroy our own quality as a news source and encyclopedia by letting them dictate to us.
Sexual content is complex, and I will freely acknowledge that the issues surrounding it mean it will get deleted much more freely than most content. I still think it important that policy demand that uploads to Wikimedia projects are not revocable, and that people think before uploading, instead of coming back and demanding that users on 14 Wikimedia projects fix up 47 pages because images that they assumed--and should have had every right to assume--were permanent are now being deleted.--Prosfilaes (talk) 07:09, 25 November 2010 (UTC)
What you're saying makes sense about the problems caused by revoked images. As long as the warnings to the uploader are sufficiently clear, I think they should at least need a 'good reason' and maybe an OTRS ticket. On the other hand, what about: a photo of a girl in a bikini which ends up on a very popular article like 'beach'; a masturbating woman who decides 5 years later that for employment, personal, or family reasons the image is harming her; a picture of someone's pierced genitals which has become un-anonymized from cross-publishing in a body-modification mag; a topless photo from an 18 year old which 20 years later doesn't seem like such a good idea. I'm stretching credulity on some of those, but I'm looking for what the precise rationale is. We definitely shouldn't let people revoke permissions for content, except perhaps when we should. Ocaasi (talk) 07:23, 25 November 2010 (UTC)
I'm happy to let all of those be dealt with under practice, not policy. But what about some other examples: someone writes the definitive biography of a Stalinist lackey that, while its positive spin has been toned down, stands as the Wikipedia article 10 years later, when he wants it deleted because it will hurt his political career. Or a woman who contributes extensively to a Wikiversity project on polyamory who's now worried about the effects on her marriage. Or someone who rewrote masturbation to portray it as dangerous, and filled a talk page with arguments for keeping it; in any of those cases, do we shred articles or talk pages to preserve the modesty of the author?--Prosfilaes (talk) 16:27, 25 November 2010 (UTC)
We're obviously well into hypothetical-world, which is fine with me. I think the major difference between our examples is that photographs are different than text. In both cases, there is a contribution (photo, writing). In both cases, the contributions can be linked to a contributor (uploader, writer). But in the case of the photograph the uploader (or consent giver) is the subject--the contribution bears the mark of the person who uploaded it right out in the open. In the case of an article writer, a screen name could easily be anonymous; and even if the screen name was a Real Name, the text itself would not obviously reveal who did it--you'd have to dig through the history to find the user and then check the diffs, etc. A photograph gives pretty much all of that information without any effort or words, which is why I think the comparison doesn't fit for me. I agree that text contributions like the ones you described should not be revoked, but I don't think that settles the photograph situation.
You're probably right that policy can avoid this issue, but it might be worth beefing up warnings on the upload page to make clear that consent is non-revocable. Once this COM:SEX policy is set, it might be worth linking to or summarizing as well. Something like: "You can upload sexual content to commons if: it is of your body and you understand it will be publicly seen in any context under an open license which you cannot revoke; it is of someone else who consented to and understood those terms; it is of content already released under compatible permissions." Currently the COM:SEX policy is geared towards Commons curators rather than uploaders, which is understandable but not ideal for a broad issue that should address the concerns of both 'users' and 'editors'. Ocaasi (talk) 02:17, 27 November 2010 (UTC)
So far as I know the upload page has always said that the consent is non-revocable. The point is, once an image goes up on Commons, and especially once it is used to illustrate Wikipedia, it is going to get mirrored all over the world. Publishing a photo here isn't much different than publishing in a sex magazine. Now Wikimedia does recognize a "right to vanish", and the uploader's account name, and presumably any mention of the subject's name in the image, could be removed from the image on this basis. But bear in mind that if we make a policy decision to delete a photo, we're also making a policy decision to delete its derivative works. We're telling the contributor who has taken that picture and put it beside another and labelled up the comparative anatomy between glans and clitoris, labia and scrotum and so forth that all the work he did was for nothing. Now if we were returning the original uploader to secrecy and anonymity we might be tempted, but knowing we're not? Wnt (talk) 18:58, 29 November 2010 (UTC)
I see what's going on here and the cat's-out-of-the-bag nature of a non-revocable, open license. I think where it's pushing me is that we need to be a little more explicit about what exactly placing an image on Commons means. Basically, I think we should spell it out: "If you upload an image of your naked body or a sexual act to Wikimedia Commons, you are granting a license for anyone in the world to use the image anywhere they choose, so long as they follow the terms of the license. Also, you can never revoke your consent once it has been granted." Maybe it doesn't have to be as scary as that, but it's basically just the facts, and it is a little scary. That's what I'm getting at. We shouldn't change policy regarding revoking consent (although I bet OTRS does it quietly on occasion), but we should be abundantly clear that this content is not just for a single Wikipedia article and that uploaders have absolutely no control over content once it is uploaded. For-Ev-Er. Ocaasi (talk) 05:28, 30 November 2010 (UTC)[reply]

Does the upload form create a binding contract when images are submitted? If so, what is the consideration of that contract? Does the notice that the license is non-revocable carry any weight when the uploader isn't compensated? 71.198.176.22 14:26, 2 December 2010 (UTC)[reply]

I'm not a lawyer, but similar terms are stated by quite a few prominent sites on the Internet. On Facebook, for example, once you upload content you give them a pretty nearly unlimited license in that content, and it is not revocable either. - Jmabel ! talk 02:10, 3 December 2010 (UTC)[reply]
Armchair analysis (also not a lawyer): Consideration is the mutual exchange of something of value, whether money, goods, or a promise not to do something (e.g., I can contractually pay you $100 not to paint your car red). In the case of cc-by-sa licenses, there is obviously no two-way exchange of value or goods (unless one considers Commons to be exchanging a service by hosting and allowing the distribution of these materials, which is nice but legally unlikely); there is also no exchange of money. Is there a promise 'not' to do something that applies? Well, many copyleft licenses require attribution, so there is a promise not to distribute the content without attribution and without a compatible license. Still, I don't think this is what the courts have in mind, since the license itself should probably be separate from the consideration which makes it binding. There are things which have no contractual power but still cannot be taken back. They're called gifts, and once you give them, you lose your claim of property over them. Although copyleft licenses are couched in the terminology of contract, they are really just gifts with a few strings attached (that's how I personally think of them; copyleft lawyers probably have more nuanced terminology). This 2006 article discussed consideration and the GNU GPL, seemingly coming down against contract designation (on consideration grounds) but for license status, however the two differ. There's more of this out there if you want to poke around the Google or the Google Scholar, or check out http://www.eff.org, or ask at the RefDesk, where they are wicked smart. Ocaasi (talk) 09:10, 3 December 2010 (UTC)[reply]
I spend some time at the RefDesk, and I wouldn't trust it for legal advice (which is specifically banned there anyway...). The main problem here is that if uploads were revocable, that would apply to every single image on Commons, so it's not specifically relevant to this policy. I would also worry that any change to the site upload notice might risk the status of all uploads, and should be done only with formal legal input. The only question here is whether some special courtesy deletion policy is required, which I don't see. Wnt (talk) 15:39, 3 December 2010 (UTC)[reply]

Simulation resolution[edit]

What is the resolution between a diagram and a simulation? 71.198.176.22 20:36, 24 November 2010 (UTC)[reply]

are you asking for thoughts on whether or not / how this policy should differentiate a diagram (drawn, or computer generated, I presume) from a simulation (presumably photographed?) - I'm not 100% sure how to kick off a response, so a clarification would be helpful to me :-) Privatemusings (talk) 00:30, 25 November 2010 (UTC)[reply]
Yes, if we are going to prohibit photographs and simulations both, then wouldn't it be a good idea to provide some guidance about what is a simulation and what is merely a diagrammatic drawing? 71.198.176.22 22:40, 25 November 2010 (UTC)[reply]
do you have trouble discerning the difference - aren't they different just by definition? :-) Privatemusings (talk) 01:21, 26 November 2010 (UTC)[reply]
This policy isn't about prohibiting photographs, just keeping them within Commons policies. Diagrams are specifically exempted from speedy deletion because certain uncontroversial reasons for deletion can't exist for them: they can't be child pornography, they can't be created without consent, and the act of drawing them probably implies a certain educational purpose. Simulations aren't specifically defined here, and might be regarded as diagrams; but it is also possible to imagine looking at a "simulation" and not being sure if it is a drawing or a photograph run through a few Photoshop filters, or more controversially (for example) a nude rendition of a particular person recreated from X-ray, terahertz, or other forms of scanning. In any case, no one should claim a simulation can be speedy deleted if it doesn't fall into one of the prohibited content categories at all. So I'm not going to stickle on having simulations specifically exempted from uncontroversial speedy deletion when it will probably just raise more arguments. The term is just too prone to varying interpretations to use for policy purposes. Wnt (talk) 18:46, 29 November 2010 (UTC)[reply]

I asked the question poorly. I'm referring to "simulated sexually explicit conduct" -- where is the boundary between that and diagrammatic drawings? For example, are the cartoon illustrations used for sex acts in the English Wikipedia simulated sexually explicit conduct or diagrams, and why? 71.198.176.22 12:01, 30 November 2010 (UTC)[reply]

The policy doesn't try to define a difference between simulations and diagrams. At the beginning it defines sexual content to include "simulated" images, which aren't really defined. The main dividing lines are (1) what is prohibited content, which very likely does not include them, and (2) what must go through a full deletion review, with the additional comment that "Some categories of material which are generally useful for an educational purpose include: diagrams, illustrations..." (which reiterates that diagrams and illustrations probably aren't going to be prohibited by the policy).
So to take the example of File:Wiki-analsex.png, the file probably counts as "simulated sexually explicit conduct", so it gets sorted through the policy. Then you ask if it's illegal (probably not; fortunately even photos aren't counted as "obscene" nowadays), or if it's taken without consent (nobody to consent), or if it's out of the project scope (the policy specifically says that illustrations are generally educational). And if someone wants to argue for its deletion, it has to go through a proper discussion.
This may be roundabout logic, but it's the result of people trying to come up with a policy from different points of view, perhaps regarding situations we haven't thought of yet. And to be fair, an image like that isn't all that different from the sort of "animated/live action" image that you could find in the film A Scanner Darkly, which arguably would need (for example) the consent of the original participant. Wnt (talk) 01:00, 1 December 2010 (UTC)[reply]

Time to take it to a poll?[edit]

No, it's not perfect. But few things in this world are, and it's been pretty stable, and I'd like to see us adopt this before we are overtaken by events from the Foundation level. - Jmabel ! talk 02:26, 25 November 2010 (UTC)[reply]

I think that timeline is important. The remaining questions are:
  • Should a user be able to revoke sexual content depicting themselves? (If it was uploaded by them; if it was not uploaded by them; if they gave prior consent in either case)
  • How to handle 'close-up' images. What qualifies as identifiable? Is consent required? What is the process or 'burden' regarding identifiability and consent in these cases?
  • How to handle what could be an illegally acquired image? What does illegally acquired 'look like'? What is the process or 'burden' in these cases?
These are increasingly minor and should only be resolved first if they have the potential to change the outcome of the vote. Ocaasi (talk) 07:02, 25 November 2010 (UTC)[reply]

Here's my answers fwiw :-)

  • Revoking consent will remain outside the purview of this policy, to be handled through practice. The principle of irrevocable consent is important to all WMF content generation, so the best thing for this policy to do is to invite contact with the office, and that invitation is suitably in place.
  • Close-up images which are sexual content, as defined, require assertions of consent, as does all sexual content. We've left the burden to be decided through community discussion - I'm fine with that.
  • We've stated 'Provision of further evidence of consent is welcome (using the OTRS process) but is not normally required.' - in the event that an image has a reasonable basis for being considered illegally acquired (as discussed and established through community discussion), this would, in my view, qualify as 'not normal', hence the community is empowered to insist on further evidence, the nature of which is left to the community discussion to evaluate. I'm ok with that for now too.

that's my thoughts. Privatemusings (talk) 22:09, 25 November 2010 (UTC)[reply]

At a quick read, I agree completely with Privatemusings here. And none of this entails any further rewordings. - Jmabel ! talk 02:05, 26 November 2010 (UTC)[reply]
I like those answers too. Maybe they'd be useful for a COM:SEX FAQ. In fact, an FAQ, even an informal one, might help explain the intention of some of the policies and let voters browse through the issues. That's not to say the policy shouldn't be self-explanatory or that the FAQ is binding--just that a guide to a new policy might help introduce the policy itself (see W:Wikipedia:Neutral_point_of_view/FAQ for an example). Ocaasi (talk) 05:30, 26 November 2010 (UTC)[reply]
  • Yes, please, let's put the baby up for adoption. --JN466 12:25, 26 November 2010 (UTC)[reply]
  • Ok from me too. 99of9 (talk) 13:04, 26 November 2010 (UTC)[reply]
  • It's ready. - Stillwaterising (talk) 01:38, 29 November 2010 (UTC)[reply]
  • I'm generally agreeing with Privatemusings' answers above; though I'd favored a less restrictive policy on consent for cropped images, it wasn't well accepted and I took a different and probably better approach as described above. I think we're ready to hold the vote, using the text previewed in the "[PREVIEW] Second poll for promotion to policy" thread above. Wnt (talk) 19:18, 29 November 2010 (UTC)[reply]
  • It looks good to me. I would still like to see some declaration that cartoons are more diagrams than simulations unless they include the nucleic acids, proteins, lipids, and glycans of the gonocytes, just to be on the safe side, because I have a feeling there may be case law suggesting low-resolution diagrams are simulations. 71.198.176.22 08:31, 4 December 2010 (UTC)[reply]
I don't actually understand what you mean, but it looks like people want a poll, so I'll start the section, closely based on the preview text above (which I'll archive now to avoid any misplaced votes). Wnt (talk) 13:56, 4 December 2010 (UTC)[reply]
  • Maybe we are ready for the polls, but not yet, I feel, till the question of "irrevocable consent" is well settled. "sign" jayan 05:51, 9 December 2010 (UTC)
  • The poll is ongoing. And I think there was consensus about "irrevocable consent": no big need to write anything about it here. --LPfi (talk) 08:04, 9 December 2010 (UTC)[reply]

Summary of poll arguments[edit]

This section is intended to be a brief, neutral summary of opinions raised in the policy poll below. Please make NPOV edits for completeness, concision, neatness, or accuracy, but not for persuasion (that's what the poll comments are for). Also, !nosign! any changes to keep it tidy. Thanks, Ocaasi (talk) 15:50, 7 December 2010 (UTC)[reply]

Thank you for doing this, it's very helpful! -- Phoebe (talk) 00:26, 12 December 2010 (UTC)[reply]
Gladly, I think every poll should have one! Ocaasi (talk) 12:16, 12 December 2010 (UTC)[reply]

Like it...[edit]

  • Policy is needed
  • Good enough for a start
  • Represents compromise
  • Not having the policy is worse
  • Not a major change of existing policies, just a focused reorganization
  • Can help avoid social backlash and panic
  • Needed for legal reasons
  • Needed for ethical reasons
  • Protects the WMF
  • Prevents stricter censorship
  • Legally sound
  • Prevents becoming a porn repository
  • Better to have a centralized policy than multiple deletion rationales
  • Preempts the WMF from imposing a less nuanced sexual content policy

Don't like it...[edit]

  • Slippery slope to more censorship
  • Educational purpose is not clearly defined
  • Doesn't explicitly protect content from deletion
  • Policies on illegality, privacy, pornography, and nudity already cover the material
  • Better to handle on a case-by-case basis
  • Instruction creep
  • Policy addresses legal issues but is not written by lawyers
  • Policy addresses legal issues but commons users are not legal experts
  • Sexual content should not be treated differently than other content
  • Vague wording
  • The phrase 'low-quality'
  • Out-of-scope deletions should be normal, not speedy
  • US-centric legal concerns
  • Legal concerns that are unresolved by courts
  • Addresses sex but not offensive or violent content
  • After the Wales deletions, implementation cannot be trusted
  • Biased against Western cultural taboos (it's too conservative)
  • Biased against non-Western cultural taboos (it's too liberal)
  • Better as a guideline
  • Needs more specific criteria for inclusion and exclusion
  • Better to solve this on the user end (personal image settings/filters)
  • Insufficient protection for erotic art
  • Only available in English, which makes fair overall understanding and voting impossible

Questions[edit]

  • What does 'low-quality' mean?
  • What qualifies sexual content as educational?
  • What content cannot be deleted?
  • Would it be better as a guideline?
  • Can the legal discussion be rephrased in normal language?
  • Should out-of-scope deletions go through the normal process rather than speedy deletion?

Tweaks[edit]

  • Shorter
  • Less legalese
  • Clarity of legal interpretation
  • More specific criteria for inclusion/exclusion
  • Out-of-scope deletions to be handled by regular process, not speedy

Second poll for promotion to policy (December 2010)[edit]

This is a poll to adopt Commons:Sexual content as a Wikimedia Commons policy.

Please give your opinion below with {{Support}}, {{Oppose}}, or {{Neutral}}, with a brief explanation.

The poll concerns whether people accept or reject the November 26 revision. Edits made to the policy during the polling period may be temporarily reverted, and will need to be adopted by consensus, like changes to other established policies.

Voting on the poll will proceed until 06:08, 15 December, ten days after the time that the poll was first advertised at MediaWiki:Sitenotice.

A previous poll was closed as "no consensus" at Commons talk:Sexual content/Archive 4#Poll for promotion to policy. Wnt (talk) 07:47, 23 October 2010 (UTC)[reply]