Commons:Requests for comment/Safesearch

From Wikimedia Commons, the free media repository

Introduction

Note: Due to length issues this was moved from Commons:Requests for comment/improving search.

Recently the issue of how to improve Commons' search (Special:Search) has been raised. This RFC is to further discussion of how to improve search. Rd232 (talk) 18:35, 28 February 2012 (UTC).

  • Links to working and proposed search enhancements are listed at Help:Searching.

Some prior discussions:


One issue is how to give users control over whether pornographic images show up in searches. To some extent, making search more effective will address this, by reducing the likelihood of pornographic images being shown to users not looking for them. However, the issue can also be addressed directly.

Google uses the term Safesearch for this, and uses a variety of means based on classifying search terms, blacklisting URLs, and automated image analysis looking for flesh tones. Commons could attempt to use classification based on the categories files are in: pornographic images should be classified appropriately within the topic category tree. Rd232 (talk) 18:35, 28 February 2012 (UTC)

Actually, this is not only about pornography but just as much about pictures of Mohammed. I think that it would be a good idea to offer the possibility to exclude any category of choice (for example Category:Nudity or Category:Cars), including all of their subcategories. Not only for unpleasant pictures but also as an aid for finding pictures. --Stefan4 (talk) 23:05, 28 February 2012 (UTC)
Easily excluding any category would be part of the "effective search" section above. By contrast "Safesearch" is specifically excluding things users generally don't want unless they specifically ask for it. Porn is the most obvious example. The idea is that users wouldn't need to understand the category structure in order to exclude porn from search; it would be excluded if they ticked the Safesearch box or user preference (maybe with some optional customisation, but it should work well enough to work as an on/off thing). Exclusion would most obviously be based somehow on the existing topic categorisation scheme. Rd232 (talk) 00:06, 29 February 2012 (UTC)
  • Comment: Stefan4 (talk · contribs), above diff, makes a very interesting point with regard to the inherent censorship problems this raises. But let's analyze this a bit further, shall we:
  1. Should there be a parameter for users who wish to avoid seeings depictions of Muhammad ?
  2. Perhaps users who are members of the Catholic church should be able to avoid viewing images of contraception ?
  3. Maybe users who are members of Jehovah's Witnesses should have a parameter to avoid viewing images of blood transfusions ?
  4. What about users who belong to the Scientology organization, who don't want to see images of psychiatry or psychology in practice ?

Thank you for your time, -- Cirt (talk) 04:39, 29 February 2012 (UTC)

  1. Using the word censorship in this context - and even linking to the article on it! - is an insult to the many people who have lived or still live under actual censorship regimes. Tools that empower people to choose what they see have nothing to do with censorship.
  2. Maybe there should be different "safe search" schemes for different needs. A system based on topic categorisation could in principle exclude any selection of topics that people don't want to see in categories or search results - whether because they're not looking for those right now, or because they never want to see them. But let's do one thing at a time, eh? Rd232 (talk) 04:57, 29 February 2012 (UTC)
    Instead of screwing with stuff here, it'd be easier to just create a separate site(s) for those who are "insulted", as you put it, by any one of the various topics, above. :) -- Cirt (talk) 05:08, 29 February 2012 (UTC)
    What is it with people putting words in my mouth on this topic? I specifically never used "insulted" for the unwanted topics. As for "separate site(s)" - I think you're missing the point of Commons. (The clue's in the name.) I think you're also missing that Commons exists for educational purposes; nowhere does it say that such educational purposes are limited to cases where this sort of search result is acceptable. Rd232 (talk) 11:51, 29 February 2012 (UTC)
    But those are all constructed examples. How about the thousands, if not millions, of keywords that don't produce such results? I searched for melons and did not get the result you might expect: I found actual melons. Just think about how such examples are created. Look up some pictures that have sexual content, go to the description, find a more or less unique word pattern, and feed it into the search engine. After that it gets posted in the usual places to do propaganda work. That has nothing to do with an actual problem. It is a self-made, provoked problem. It's like me typing "futanari" into Google image search, setting it to strict filtering, and finding explicit content even though it is filtered. The construction of a problem is a problem in itself. -- /人 ‿‿ 人\ 苦情処理係 12:11, 29 February 2012 (UTC)
    The example is not "constructed" - it's there in the index, and it's a reasonable search term. That "melons" doesn't seem to have that problem is great, but it's not very helpful because "melon" is a slang term and would probably be removed from descriptions; at any rate, I can't find any uses of the slang term when specifically searching for it. Rd232 (talk) 12:29, 29 February 2012 (UTC)
    There already is, it's called Conservapedia or something like that. People should do a search and move there. I really don't understand why they are wasting everybody's time here. Personally I regard this sort of proposal as a very good trolling attempt; even the Gay Nigger Association of America wasn't able to get so much time wasted with their "jewsdid911.com" and other troll sites. VolodyA! V Anarhist Beta_M (converse) 05:50, 29 February 2012 (UTC)
    • That's ridiculous. Conservapedia is an unencyclopedic piece of crap. There are genuine reasons to have a form of search that people can use on Commons where they don't have to be wary of violating the rules of many schools, workplaces, and even libraries by using our site. There are genuine reasons to have a form of search that allows parents, in good conscience, to allow their 10-year-old to use our site. Providing a mode like this is not censorship. - Jmabel ! talk 16:07, 29 February 2012 (UTC)
      • But if you add an option to exclude specific categories and their subcategories (such as Category:Nudity or Category:Falun Gong), users may feel forced to use the censorship option because of rules regulating what they may do, whereas a complete lack of such a search option might instead force the removal of censorship rules at various institutions, making information more accessible to some people. --Stefan4 (talk) 16:15, 29 February 2012 (UTC)
        • Doubtful. Institutions that keen on censorship won't be relying on a currently non-existent Wikimedia system for users to filter their own search results. They'll have their own systems, or they might just block Wikipedia entirely. That's how real censorship works. Rd232 (talk) 16:30, 29 February 2012 (UTC)
  • I'm no fan of content hiding, and if such a scheme is implemented anyway, at least let's not have two different forms of it. I assume that any content hiding 'feature' would conceal such results anyway. This isn't the route I want to go here. Wnt (talk) 05:21, 29 February 2012 (UTC)
  • I think the whole problem is that there is no global definition of what kind of material you don't want to see. A safe search functionality would probably need to have a set of checkboxes (nudity, pornography, Muhammed pictures, dog meat, pig meat, etc.) so that users can tell what they find disgusting. However, I'm no big fan of censorship, so I'd prefer that this functionality is off by default. --Stefan4 (talk) 10:58, 29 February 2012 (UTC)
First, offering a safesearch option is not censorship. I have no problem with it in principle as long as it's both optional and transparent. To me, this means that it should be an opt-in (i.e. the default search is unfiltered), either on a search-by-search basis, or as an option that registered users can turn on. It also means that the search results page should plainly state that the search was filtered, and should probably offer a link to unfiltered search results.
The issue raised by Stefan4, "that there is no global definition of what kind of material you don't want to see", is a legitimate practical problem. The checkbox proposal makes sense, but the question is how many options to offer and where to draw the line, as Cirt sarcastically (I assume) pointed out (e.g. "What about users who belong to the Scientology organization, who don't want to see images of psychiatry or psychology in practice?"). The expanded forms of incategory: discussed above would solve this problem. Ideally there would also be a way for users to set a preference for default searches (custom JavaScript?). cmadler (talk) 14:33, 29 February 2012 (UTC)
It's not entirely not censorship either. A "safesearch" option potentially forces workers, students and others to use it out of fear for being blamed for what comes up if they don't, when currently it is clear that if you go to Wikipedia this is what happens. But that's really beside the point. What I object to most in this instance is that when we have an ineffective search interface that presents people with irrelevant results, the answer is not to cover up those few results that got your attention. The right way to spend the Foundation's developer time is making a better search that works in all cases to get you the content you actually looked for, no matter what the extraneous hits might be. Wnt (talk) 15:54, 29 February 2012 (UTC)
"currently it is clear that if you go to Wikipedia this is what happens." - I would vigorously dispute that most users are aware of this risk. However, I agree that developer time is very precious and anything that is asked for on this issue specifically should be as simple as possible to do. I don't agree that doing anything specifically on it should be ruled out completely, because massively improving search is difficult and will take years. Bear in mind too that the backend for the search is Apache's Lucene, and AFAIK the MediaWiki developer focus is adapting it for Wikimedia's scaling and performance requirements, not better functionality. Rd232 (talk) 16:26, 29 February 2012 (UTC)
  • Where is the difference between your proposed "safesearch" (an awful euphemism, btw... you should not use this word) and the dead WMF image filter? None, right? So why do you try it again? We are not stupid. --Saibo (Δ) 16:31, 29 February 2012 (UTC)
    • (i) as the name suggests, "safesearch" (borrowed from Google to give a handle for the discussion - would you prefer "MySearch"?) applies to searching only (ii) WMF image filter is a WMF thing, this isn't. Also, I wasn't aware of the WMF filter being "dead", so "again" doesn't come into it from my perspective. Rd232 (talk) 16:38, 29 February 2012 (UTC)
      • You could even run into trademark problems if you use "safesearch", btw. Use whatever word you want but not one that tries to label censorship as something positive. The image filter is not implemented since it was not liked by us. So you do not need to propose the same stuff again. --Saibo (Δ) 16:41, 29 February 2012 (UTC)
        • trademark :) yes the thought crossed my mind. BTW I'm getting really tired of arguing with people who are either unwilling or unable to understand the concept of censorship. Allow me to quote en.wp: ...the suppression of speech or other public communication which may be considered objectionable, harmful, sensitive, or inconvenient to the general body of people as determined by a government, media outlet, or other controlling body. Frankly, at times the abuse of COM:NOTCENSORED seems two steps away from declaring that all searches must include the word "masturbation", because anything else would be "censorship". And I'm just waiting for someone to change the {{done}} icon to an image of an ejaculating penis... just because there's no discernible difference between that and a tick mark! Rd232 (talk) 17:04, 29 February 2012 (UTC)
          • Fine, if you are also tired, can't we just stop this "crap, the Xth edition"? I do not understand why your quote should not match your and Fæ's attempts here. all searches must include the word "masturbation" - what? Stop fantasizing... really. You don't understand the issue (clearly, by your fictional {{done}} example). --Saibo (Δ) 17:13, 29 February 2012 (UTC)
            • I was being silly for effect. On the other hand, even with the assistance of bolding, you still don't understand the concept of censorship. One more time: censorship involves someone else deciding what you can say or hear - taking control away from you. It does not involve classifying things and then saying "well it's up to you" (giving power to you). You might as well say that Commons using categories is "censorship" (maybe we should delete them all, just in case, and redirect the Main Page to Special:Search). Rd232 (talk) 17:29, 29 February 2012 (UTC)
              • ... and someone will decide which categories are excluded by the "savebullshitsearch", or which are presented to the user as excluded by default, labeling all included categories as insecure and threatening to everyone. In Germany we call that "Meinungsmache" (opinion-mongering) and comment on it with "Dümmer geht's nimmer" (it doesn't get any dumber). -- /人 ‿‿ 人\ 苦情処理係 17:37, 29 February 2012 (UTC)
                  • Excluded options would be labelled as what they are (eg Category:Nudity, for the sake of argument), not as Danger! You will become a danger to society if you look at this stuff! Click if you want to know what we're protecting you from... but even knowing that may harm you!. Really, I can only come back to the point I made earlier, and supplement it: this topic brings out irrationality and rudeness in people. Could we put a fraction of that energy into the general "improving search" section, please? Rd232 (talk) 17:44, 29 February 2012 (UTC)
  • This whole category should be deleted, since it is POV from the very beginning. It has no definition where nudity starts and where it ends. It's a shame that we have it. -- /人 ‿‿ 人\ 苦情処理係 17:53, 29 February 2012 (UTC)
And how many categories that aren't referring to a unique object (eg a person) are well-defined? The problem is a general one. Possibly fuzzy sets would help (but they would confuse the hell out of most people). Rd232 (talk) 18:00, 29 February 2012 (UTC)
What isn't well defined will be removed as a category, since it is arbitrary. It's that simple. -- /人 ‿‿ 人\ 苦情処理係 18:58, 29 February 2012 (UTC)
Well you'd better get to work then, hadn't you? ;) Or more seriously, we could try to use the Template:Category definition approach more. Rd232 (talk) 19:21, 29 February 2012 (UTC)

Proposal: Reduce the ranking of non-keywords

We have some well known example sexual images appearing either high, or at the top, of generic searches for household words like "cucumber" or "toothbrush". The vast majority of users of Wikimedia Commons find that very unhelpful and it has been the subject of genuine negative press attention and several independent complaints by users concerned about how these images appear on the first page of matches for a search. This problem is not going away and does our community no favours while we wring our hands over marginal interpretations of censorship and POLA.

An optional and simple template such as {{lowsearch | <list of words>}} could trigger the search function to skip or reduce the ranking of the appearance of problematic matches in results. I doubt anyone would care much if a search for "toothbrush" had a close-up photograph of a woman masturbating on even the second or third pages of search results; the real problem here is it appearing on the first page.

In the first instance I would strongly resist any usage of such a template apart from those images which have been subject to specific complaint and detailed review, such as going through Deletion review where this type of solution could be available for an administrator to recommend as an alternative. If the implementation could be done in such a way that only administrators could make such an option effective, this might resolve many concerns and ensure that associated policy or consensus is strictly enforced. -- (talk) 08:09, 29 February 2012 (UTC)
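The demotion mechanics could look roughly like the sketch below. This is purely illustrative: the {{lowsearch}} template does not exist, MediaWiki's search offers no such post-processing hook, and the `rerank` function, its field names, and the demotion value are all made-up assumptions about how flagged files might sink in results without being hidden.

```python
# Hypothetical sketch: demote (not hide) flagged results for given query words.
# "lowsearch_words" stands in for data parsed from a {{lowsearch|...}} template;
# the template name and this post-processing hook are assumptions, not MediaWiki APIs.

def rerank(results, query, demotion=50):
    """Move results whose lowsearch word list matches the query toward the end.

    results: list of dicts like {"title": ..., "lowsearch_words": set_of_words}
    query:   the raw search string typed by the user
    """
    query_words = set(query.lower().split())
    ranked = []
    for position, result in enumerate(results):
        flagged = bool(query_words & {w.lower() for w in result.get("lowsearch_words", set())})
        # Flagged matches keep their relative order but sink by `demotion` places;
        # unflagged results keep their original ranking untouched.
        ranked.append((position + (demotion if flagged else 0), position, result))
    ranked.sort(key=lambda item: (item[0], item[1]))
    return [result for _, _, result in ranked]

results = [
    {"title": "File:Explicit example.jpg", "lowsearch_words": {"toothbrush"}},
    {"title": "File:Toothbrush.jpg", "lowsearch_words": set()},
]
print([r["title"] for r in rerank(results, "toothbrush")])
# → ['File:Toothbrush.jpg', 'File:Explicit example.jpg']
```

A search for any word not on a file's lowsearch list leaves the ordering exactly as the search engine produced it, which is what distinguishes this from filtering.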

Even if the community accepts that (which I doubt), I seriously doubt the developers would be willing to do it. Allowing the community such broad scope to mess with search results is a non-starter, I would think. The fix for the problem you describe needs to be systemic. Alternate proposal below. Rd232 (talk) 12:04, 29 February 2012 (UTC)
Reducing the ranking of certain words but still displaying the results, is further away from "censorship" issues than a safesearch would be. The addition of requiring administrator control also avoids highly contentious issues about whether Wikimedia is in the business of providing censorship tools that may be used to force limitations on certain groups of users, such as schools requiring that students can only see safesearch results that exclude anyone not fully clothed, potentially even excluding 19th century paintings that might show heroic figures of women with their breast visible. -- (talk) 12:34, 29 February 2012 (UTC)
Administrator control is not really a well-integrated part of this proposal, if it's just a template that can be applied to a file. As for "forcing" people to use safesearch results - well if we absolutely don't want to permit that (maybe it would be appropriate some places?), I think we can probably prevent it, and ensure it's a user choice. Rd232 (talk) 13:11, 29 February 2012 (UTC)
This line has already been extensively discussed in the filter debate. The ranking of search results is conceptually separate and not undermined by the same issues. -- (talk) 13:39, 29 February 2012 (UTC)
Censorship, image filter... not here, thanks. Stop listening to such middle-ages media as Fox News (reading their article about Wikipedia really was funny - yes, you can see sex positions, uh!). --Saibo (Δ) 16:34, 29 February 2012 (UTC)
LOL. My sole exposure to Fox News is via the Daily Show. My concern is searches like this. I just don't see how you can argue that this is not a problem ("it's a feature, not a bug"?). So what do you suggest to fix it? Rd232 (talk) 18:14, 29 February 2012 (UTC)
I was referring to this article xxxx (see edit mode - don't want to give them free links). That search is not a concern to me. The search engine is far from "good", but this example search result is not bad - no false positives. What is your concern? Yes, I really don't see a problem here, in fact. You probably don't like(!) the electric toothbrush used for masturbation; then I suggest searching with Google, which annoyingly has "safesearch" turned on by default, so it is always two clicks more to get all search results. Oh, but don't let your teacher, mother, grandma see that you have explicitly turned "safesearch" off - here you are with the restriction imposed by other people on you. Do you notice something? --Saibo (Δ) 21:40, 29 February 2012 (UTC)
Hm? With my browser settings (which delete all cookies when I close the browser), your Google link shows me several controversial photos (Category:Nude or partially nude people with electric toothbrushes, File:Dental Hygene-Nude child.jpg, en:File:Bundesarchiv Bild 183-S33882, Adolf Hitler retouched.jpg) without any extra clicks. They're not on the first page, but all I need is some scrolling. --Stefan4 (talk) 21:57, 29 February 2012 (UTC)
Ironically (perhaps), with SafeSearch off, Google shows the toothbrush masturbation image on line 2; with "moderate", it's on line 5. It needs to be "strict" to avoid it. Rd232 (talk) 22:03, 29 February 2012 (UTC)
A generic image search for "toothbrush" in Google does not bring up any masturbation images at all, whether with safe, moderate or unrestricted search. Same for tolling bells, or backhand. --JN466 05:32, 2 March 2012 (UTC)
Yes, my previous comments related to using Google Images to search Commons (at Saibo's suggestion). Global web search has totally different results. Rd232 (talk) 07:39, 2 March 2012 (UTC)
(i) Google SafeSearch is not "always two clicks more" - go to the cog icon, Search Settings, change, save. (ii) if safesearch is off, nobody is likely to complain about the mere fact of it being off - if they even notice, you can explain false positives. (iii) I have no problem with the toothbrush masturbation image myself, but I find it hard to evaluate how tiny the proportion of searches must be where that image is the sort of thing a user typing "toothbrush" into Special:Search wants to find. Rd232 (talk) 22:03, 29 February 2012 (UTC)
It is always two more clicks - for whatever reason. Maybe because I do not allow big-data Google to set cookies. ;-) I try to avoid more comments regarding your POV regarding humans (and their sexuality, which is an inherent part of life ...). --Saibo (Δ) 02:25, 1 March 2012 (UTC)
Then don't use Google unless necessary. I've switched to en:duckduckgo as my main search engine. Rd232 (talk) 13:44, 1 March 2012 (UTC)
I tried out duckduckgo. But I'm disappointed. The results are horrible and mostly not useful at all. At least if you want to search for content that isn't part of Wikipedia. -- /人 ‿‿ 人\ 苦情処理係 14:01, 1 March 2012 (UTC)
It takes getting used to (results are certainly different from Google), and you have to be willing to switch to other search engines at times, which it makes easy (eg add !g to a search to get sent to Google; duckduckgo tips here). But that's better than the search engine monoculture that's developed since Google started to dominate so much (90% of the market in Europe!). It's not healthy, even without the privacy concerns. Rd232 (talk) 07:36, 2 March 2012 (UTC)
I know that Google is a monoculture and that using it has its disadvantages because of that. But often you have no real alternative when other search engines greet you with "no results". For Japanese content I switched instead to Yahoo and Baidu, because they are better than Google when it comes to Asian languages. -- /人 ‿‿ 人\ 苦情処理係 07:49, 2 March 2012 (UTC)
I prefer to have a template to raise relevant search results, but one to lower them can also be helpful. It's not censorship to tell a search "there's not really any cucumber in this image", especially if it still comes up near the end. Wnt (talk) 05:32, 1 March 2012 (UTC)

Proposal: Simple safesearch

A simple version of the Safesearch concept would exclude images listed in certain categories and their subcategories (to be defined in a MediaWiki: page, perhaps, like MediaWiki:Safesearch-categories), unless the user opts to include them in the search. Users could opt to do that in their preferences ("always show Safesearch images") or per search ("click here to include images excluded because they match Safesearch criteria"). In a slightly more complex version, there could be different levels or types of Safesearch, with different category sets. For a number of reasons the categories should probably be existing topic categories rather than some new Safesearch category system, though technically they needn't be.

This might be relatively feasible to implement, since the search already pulls descriptions from each file page. And deciding which topic categories to include or exclude would be an issue, but at least such decisions would be global, and then implemented by the system. Rd232 (talk) 12:04, 29 February 2012 (UTC)
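The proposal's two moving parts are expanding a configured category list into all of its subcategories, then screening search hits against that set. A minimal sketch, assuming a toy category graph: the page name MediaWiki:Safesearch-categories is the proposal's own placeholder, and `expand`, `safesearch_filter`, and the data shapes here are invented for illustration, not MediaWiki functions.

```python
# Hypothetical sketch: expand a configured category list (as might live on a page
# like MediaWiki:Safesearch-categories) into the full set of subcategories, then
# filter search hits. The category graph and all names here are made up.

from collections import deque

CATEGORY_CHILDREN = {          # toy stand-in for the real category graph
    "Nudity": ["Nude people", "Nude art"],
    "Nude people": [],
    "Nude art": [],
    "Cars": [],
}

def expand(roots, children=CATEGORY_CHILDREN):
    """Breadth-first walk collecting the configured roots and every subcategory."""
    seen = set()
    queue = deque(roots)
    while queue:
        cat = queue.popleft()
        if cat in seen:
            continue           # the real category graph has cycles; guard against them
        seen.add(cat)
        queue.extend(children.get(cat, []))
    return seen

def safesearch_filter(hits, roots):
    """Drop hits that fall in an excluded category (unless the user opts back in)."""
    excluded = expand(roots)
    return [h for h in hits if not (set(h["categories"]) & excluded)]

hits = [
    {"title": "File:Beach.jpg", "categories": ["Nude people"]},
    {"title": "File:Sedan.jpg", "categories": ["Cars"]},
]
print([h["title"] for h in safesearch_filter(hits, ["Nudity"])])
# → ['File:Sedan.jpg']
```

In practice the expanded set would be precomputed and cached rather than rebuilt per search, and the "click here to include excluded images" link would simply rerun the query with the filter step skipped.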

You know that this kind of preselected content filtering was rejected after the image filter referendum, and that the Foundation accepted this in a statement? -- /人 ‿‿ 人\ 苦情処理係 12:31, 29 February 2012 (UTC)
No, I don't know that, and I can't find anything at m:Image filter referendum. Can you point me to it? I see m:Image_filter_referendum/Next_steps/en#Basing_the_filter_on_the_existing_category_system, which does raise the point that using existing categories might lead to a change in how those categories are applied. That's something that would need to be addressed. Rd232 (talk) 13:06, 29 February 2012 (UTC)
I want to point you at the last paragraph (top section) of this article. -- /人 ‿‿ 人\ 苦情処理係 15:01, 29 February 2012 (UTC)
Censorship: "unless the user opts to include them in the search". Stop using that "safesearch" word. No, we are not prudish US-american-safe. Also: why do you need several sections for the same? Do you think your opinion looks more important then? --Saibo (Δ) 16:37, 29 February 2012 (UTC)
Amazing how sex topics seem to bring out the irrationality in so many people... Do you know FKK (nudism), Saibo? I do... Anyway, to answer your question, the subsection Proposal: is a concrete proposal; the main section was a general heading to differentiate from ways of improving search more generally. Rd232 (talk) 16:40, 29 February 2012 (UTC)
What am I supposed to make of your question, which seems quite unconnected to me?! Make another sub-section of the same stuff - sure. The concept overall is not accepted - so you do not need to make sub-proposals which imply that the overall concept is accepted. --Saibo (Δ) 16:45, 29 February 2012 (UTC)
It was fairly connected to your "US-american-safe"... never mind. And nothing was implied about acceptance. Rd232 (talk) 16:54, 29 February 2012 (UTC)
This whole discussion circles around exactly the same arguments as the image filter discussion. To be more precise: it's repeating the same thing again. Using categories for filtering (searching with exclusion of categories is filtering) is off the table. I will do my best to ensure that it stays this way. Anything else would be just stupid. Can we close this bullshit of "image filter - the extended extended edition, also known as X"? -- /人 ‿‿ 人\ 苦情処理係 17:00, 29 February 2012 (UTC)
Hum. The image filter applied to images everywhere, including images presented in places where you'd expect to see them. It wasn't useful in the same way that filtering search results is; filtering of search results is a basic necessity to make them useful. That includes filtering things out that most people don't want and aren't looking for, just as it includes filtering out, say "Cancer (crab)" if you're looking for the other kind of cancer. Can we not concede the basic point that porn results in an average search result are not useful because the vast majority of searches are not for porn? But also beyond that, Commons is not designed solely to be useful to people who don't mind porn in search results; its educational remit goes beyond that. Rd232 (talk) 17:39, 29 February 2012 (UTC)
You start right away with the "most people" assumption. I find this dangerous, since it implies that you want to introduce presets for someone else ("most people"), despite repeatedly denying it. If a visitor wants to refine his search, then he might do so. I would favor giving him such tools. But presenting him with presets, at best with the label "things you don't want to see", which exclude some media in general, is out of the question. -- /人 ‿‿ 人\ 苦情処理係 17:48, 29 February 2012 (UTC)
I haven't denied that I would make these things on by default (at least some settings, anyway). As long as it's easy to change (both temporarily per search and permanently as a user preference), I fail to see any "danger" here (nor "censorship"). As for the assumption: if we cannot say with some confidence that most people searching Commons are not looking for porn, then the WMF may get in trouble with the IRS, since that might rather undermine the claim that Commons is an educational resource. Rd232 (talk) 17:56, 29 February 2012 (UTC)
Sorry, but if you are not able to see what you are trying to do, then you must be blind in both eyes or just stupid. I know that I'm going too far with this kind of comment, but your words are utterly unacceptable and I felt a strong enough urge to tell you what I think, despite the rules. -- /人 ‿‿ 人\ 苦情処理係 18:44, 29 February 2012 (UTC)
Maybe if you explained your views better you wouldn't be reduced to getting angry at someone who doesn't share them but is willing to listen. Rd232 (talk) 19:00, 29 February 2012 (UTC)
If you had followed this discussion you would already have read everything you need to understand the issue, and you wouldn't talk that way. All the needed arguments are present (again). But I'm kind and will give you a simple thought experiment:
Let's consider the filter option "no blacks". If we presented it to our users as an option to exclude black, or even dark-skinned, people from the search results, it would clearly be considered first-class discrimination. I guess you can agree with me. But when you add "no nudity" instead, then it would be no discrimination? Can you follow me this far, or is there some detail which won't fit? -- /人 ‿‿ 人\ 苦情処理係 19:18, 29 February 2012 (UTC)
You really have lost the plot, equating filtering by subject's race with filtering by sexual explicitness. Not to say that filtering by race is never a useful thing to do - sometimes users need an image specifically of a black man or a white woman or whatever. But that sort of filtering is (or should be) well covered by categories; or at least it will be if incategory: is improved to cover subcategories (see section above somewhere). Inclusive filtering is much better served at the moment than exclusive (making it much easier to search for X than "everything but X" unless X happens to be a single category). Rd232 (talk) 19:31, 29 February 2012 (UTC)
No, I'm still at it. You made the proposal that "safesearch" should exclude certain categories by default. But this is just a cover-up mission for the actual categories which will be excluded. These categories might be "nudity" and "black people". But who (you?) decides which of these categories will be excluded? You seem to have no moral problem with adding "nudity" to the excluded categories by default. But at the same time you find yourself having a problem admitting that you might have added "black people" as well, since this would certainly get the attention of more than just FOX. So I might conclude that you lack the morality to treat subjects as equal in value, and therefore aren't able to see the issue. -- /人 ‿‿ 人\ 苦情処理係 19:52, 29 February 2012 (UTC)
Hum, I think I might finally understand your position. Your view is that in the safesearch concept (i) users only exclude things from search results that they think are "immoral", or which they otherwise disapprove of; and (ii) by allowing users to do that we would be endorsing that moral view. Well, both points are wrong: (i) it's perfectly reasonable for users to want to exclude things some of the time because of the context they're in (eg the workplace) and (ii) by empowering users to filter search results in particular ways we're not endorsing anything; we're acknowledging and respecting needs. And frankly, I think the extremist NOTCENSORED position insists on imposing its own particular morality on everyone else, which I personally find immoral. Commons exists to serve the whole world, and the whole world's needs and views are diverse. By insisting on imposing your morality on everyone else, you are not treating people's views as having equal value, which is fundamentally more important than pretending that we cannot distinguish sexual subjects from non-sexual ones without devaluing either. We can, and we should, because that is what the actual and potential consumers of Commons need and want. Rd232 (talk) 20:30, 29 February 2012 (UTC)
I don't impose any morality. It's a show-all-or-show-nothing approach. By showing everything or nothing you ensure that no arbitrary line is drawn somewhere in the middle, defined by someone who isn't the user. -- /人 ‿‿ 人\ 苦情処理係 20:46, 29 February 2012 (UTC)
"I don't impose any morality. It's a show all or show nothing approach. " is contradictory - you're precisely imposing a "show all" approach. And it's irrelevant that the line isn't defined by the user (which would be totally impractical) if the user has easy control over whether the line applies. Besides, the line might be arbitrary, but it wouldn't be random: it would be drawn by consensus in the usual way, and that's good enough for everything else we do. Rd232 (talk) 21:02, 29 February 2012 (UTC)
Showing everything does not judge any of the content. Everything is treated as equal, as valuable as anything else. That way no morality (an imbalanced judgment) is applied. Using consensus would impose our morality onto the visitor (we decide) and we would draw lines according to our preferences. That simply isn't our job. -- /人 ‿‿ 人\ 苦情処理係 21:43, 29 February 2012 (UTC)
Drawing lines does not imply value judgements, any more than categorising dog images into Category:Dogs does. And drawing a consensus line does not impose our judgements on visitors; the imposition lies in giving or not giving users the choice of whether the line applies to their searches. By insisting on showing everything you are already drawing a line, without giving users any control over whether to apply it. There's no getting away from it - that is imposing your morality. Rd232 (talk) 22:13, 29 February 2012 (UTC)
Sorry, but I will end this discussion right here at this point. You only shout out nonsense, and it gets worse and worse as this goes on. I hoped for a long time that some kind of primitive logic would settle the matter and lead to some kind of conclusion. But now I have given up that hope. To me it is like talking to a wall, or trying to teach derivatives at primary school. I don't think that I can convince you to see the obvious. It just makes no sense to exchange any more arguments at this point. -- /人 ‿‿ 人\ 苦情処理係 22:42, 29 February 2012 (UTC)
Support adding a filter on Category:Nudity (since it upsets Fox News moralists) only if you also add a filter on Category:Judaism (since it upsets Nazis). And let's also get an option to censor the safesearch filter since seeing that there is such a filter upsets other people, e.g. User:Saibo. --Stefan4 (talk) 20:22, 29 February 2012 (UTC)
Much of the discussion here is overly theoretical. We should simply follow the lead of reputable sources out there in the real world. As far as the world's top websites are concerned, all of them enable filtering of adult material. Wikipedia is the only exception. And of course none of these top websites offer options like filtering out black people, or filtering out Jews. Absurd! What is happening on this page is not a mature discussion; it looks a lot more like adolescent trolling, or filibustering. --JN466 05:54, 2 March 2012 (UTC)
Why do you wish to discriminate against Muslims, Nazis and Ku Klux Klan members by only providing a safe search for Fox News moralists? --Stefan4 (talk) 09:43, 2 March 2012 (UTC)
Actually it's the pro-censorshit crowd that is trolling. Commons isn't censored, if you don't like that fact, there's Conservapedia, so please leave the adolescent Commons and go there. VolodyA! V Anarhist Beta_M (converse) 06:28, 2 March 2012 (UTC)
I don't think you realise just how entertaining the suggestion is that I should go to Conservapedia. Let's just say that if I did, that would involve trolling... Rd232 (talk) 07:42, 2 March 2012 (UTC)
Let's just say that Wikimedia is the only such website operator in the world where people like you would stand a chance in hell of being involved in decisions like this. And that's only because Wikimedia allows literally anyone who turns up to participate, regardless of intelligence or competence, and regardless of intent, trolling or otherwise. --JN466 14:32, 2 March 2012 (UTC)
Can we not agree that most people aren't looking for butterflies in their search? And yet here I am, searching for different monarchs, and what do I get? This awful set of results. I mean, almost half of these pictures have nothing to do with monarchs. Clearly there's some agenda here to show our children as many butterflies as possible. There should be some sort of a filter to ban butterflies from all searches (well, a person may have a right to register and unselect the box "I don't hate the children"). VolodyA! V Anarhist Beta_M (converse) 18:50, 29 February 2012 (UTC)
That's a perfectly good example of the disambiguation problem - see #Disambiguation above. Solving it generally (for all cases) is difficult - so difficult that I would have a separate solution for certain limited cases. But by all means, please go to that section and address the general problem. At this point it's fairly unlikely (...!...) that the SafeSearch part of this RFC will achieve anything. However, the discussion about it might still help shed some light on how search can be improved more generally. So, again, everyone bothering to get steamed up about the SafeSearch, please contribute to the general discussion as well. Thanks. Rd232 (talk) 19:00, 29 February 2012 (UTC)
I don't understand what you're trying to say with that. Please do something useful for Commons - that is, not inventing new ways of censorship. --Saibo (Δ) 17:07, 29 February 2012 (UTC)

First we had Jimbo's clean up, now a safe search is proposed. It would almost be a relief if someone had the honesty to openly advocate censorship. Any such mechanism could be used by an individual to limit their searches but such an individual could simply avoid unwanted categories: the mechanism most certainly would be used by those in authority over a user (parents, schools, governments etc.). The Commons' sex-related images are mostly for the purpose of sex-education, though there is also some art. Both were removed in the "cleanup", both would be blocked by the safe-search (or any other such mechanism). Sex education takes place, in most instances, because the religious right are unable to prevent it. These projects exist to implement the ideals of the enlightenment: knowledge for all, whether or not those in authority approve. --Simonxag (talk) 19:17, 29 February 2012 (UTC)

"such an individual could simply avoid unwanted categories" - this RFC is about search. Please explain how end users can practically prevent things like this happening. And BTW there's plenty of opposition to properly categorising such images too... (see eg Commons talk:Nudity). Rd232 (talk) 19:24, 29 February 2012 (UTC)
Ha ha, I see, but why should one bit of life (human sexuality) be singled out. Presumably if we search on "bugger" there's now going to be a risk of throwing up a picture of a Murdoch. How gross! I'm happy for anything to improve the power of the search but not if it gives the opportunity for authorities to block material. As for Commons nudity, museums don't have galleries with nudes and "safe" galleries: editors object to such segregation (not least for the uses that might be made of it). --Simonxag (talk) 20:32, 29 February 2012 (UTC)
I find it vaguely amusing that people genuinely seem to think that a safesearch system on the user-empowering basis described would be of any interest to actual censors. They already have systems; and if they didn't, they wouldn't rely on something that could so easily be changed (or even deactivated, if it was genuinely being abused by censors). And people going to museums broadly know what to expect. A Mapplethorpe exhibition? OK, there'll be sex. Science Museum, in amongst the steam engines, I don't expect video screens of people having sex in a steam engine. Whereas in a section about reproduction, maybe. Clear? See also Principle of Least Astonishment (eg m:Resolution:Controversial content). Rd232 (talk) 20:45, 29 February 2012 (UTC)
It would be of interest since it would strengthen their position at least in some parts: "Hey look, these guys know what is good for you, we know it even better, let us help you, don't look for this shit". Well, it is written with some irony, but keep in mind that today's censors use various methods, and even if the content rating system (I would call it that) is imperfect, it could serve as an improvement, as a part or subsystem for actual censors. Simply put: this idea is bullshit and nothing good will come from this. Larry sang for the first time, FOX adopted it, Jimbo followed their lead, he made them proud, and now it isn't enough anymore and we are talking shit again. Do you think it will stop after implementing such changes? If you do, then be assured I would laugh at you by the grave of my mother. -- /人 ‿‿ 人\ 苦情処理係 21:53, 29 February 2012 (UTC)
Alright, let's assume for a moment that a censor takes an interest. What, then, could they usefully get from Commons to support their censorship? Well, as far as I can see, they'd use the topic classification system, probably combined with keyword-filtering of file descriptions and titles, and build a system which doesn't allow users any choice over how it's applied. I can't see a Commons-based user system based on topic categories materially helping them; for one thing, they wouldn't want to rely on it, so they'd have to do their own work even if they thought it might be useful. So the proposed system would make no difference; people subject to censorship will be subject to censorship, because the information already exists to do it. And the money and resources to turn that information into censorship exists too. This is unavoidable (even if you stripped out all file descriptions and categories and randomised file titles, you have the information content in the files themselves). Censorship can only be beaten by getting rid of censors. Rd232 (talk) 22:25, 29 February 2012 (UTC)
Proposition: Smiley blind.gif Action: Notpayingattention.gif Reaction: Smiley lol.gif -- /人 ‿‿ 人\ 苦情処理係 22:52, 29 February 2012 (UTC)
Harsh but fair. Well at least you're self-aware. Rd232 (talk) 23:02, 29 February 2012 (UTC)
(EC) LOL, are you intentionally trying to make me laugh? Very amusing, sadly... --Saibo (Δ) 21:54, 29 February 2012 (UTC)
  • I am generally with Rd232 here. The search function is completely unsatisfactory as it is, presenting hardcore porn when it is not expected. The Commons search function is also the Wikipedia search function. [1][2][3] --JN466 05:17, 2 March 2012 (UTC)
I explained under A little bit of intelligence why it isn't necessary to introduce new tags or to hide categories. Additionally, it is a proposal that would improve the search for other cases as well. So it is up to you to decide if you want to use "a little bit of intelligence" that empowers the user, or to waste time with inappropriate tagging (a constant time effort) and violating NPOV. -- /人 ‿‿ 人\ 苦情処理係 07:17, 2 March 2012 (UTC)
What part of Commons:Project scope/Neutral point of view is relevant here? Rd232 (talk) 07:24, 2 March 2012 (UTC)
  • Support Unfortunately it's an idea that's not going to fly, but I am very favourable towards the idea of a safe search option. Specifically, my preferred implementation would be default filtering (for non-logged-in, obviously logged-in would be able to choose) based on a simple, agreed category designation. Then, we could even put a big link at the top of search results with the words "In addition to the results displayed below, one or more images deemed "not safe for work" matched your search query. To include them in your search results, click here." It's not censorship - we're not hiding the files, they're only one click away - but it prevents Commons getting a reputation for including pornographic images for innocent search terms, and, well, generally showing visitors images they don't want to see. Okay, so maybe we'd like people's sensibilities not to be touched by the sight of god knows what sex act, but unfortunately they are, and we should be trying where easy to make Commons more user friendly. Otherwise, what's the point of having a visitor-accessible search facility at all? (Further thought: if safesearch isn't going to fly, what about simply not displaying thumbnails for images in certain categories?) Jarry1250 (talk) 14:33, 2 March 2012 (UTC)

COM:NOTCENSORED[edit]

I've had enough of the abuse of the term "censorship" for proposals which give control to end users over their search experience; every time this is done, it is an insult to the millions who suffer under actual censorship. I've also had enough of the abuse of the Commons policy COM:NOTCENSORED (a section of Commons:Project scope). Quote:

The policy of "Commons is not censored" means that a lawfully-hosted file, which falls within Commons' definitions of scope, will not be deleted solely on the grounds that it may not be "child-friendly" or that it may cause offense to you or others, for moral, personal, religious, social, or other reasons. [emphasis added]

What the policy says is that Commons may host "controversial content" (to use the WMF resolution term - wmf:Resolution:Controversial content). It says nothing about preventing users from choosing whether to filter some or all of it out of their searches. Rd232 (talk) 07:22, 2 March 2012 (UTC)

But you have yourself argued for prevention of display of some idiotic-adult-unsafe (they aren't unsafe for intelligent children) images. You said that you wanted to hide sexual use of toothbrushes from the search for toothbrushes for example; and that is censorship. VolodyA! V Anarhist Beta_M (converse) 07:38, 2 March 2012 (UTC)
No it isn't, because I was always very clear that users would retain the ability to (i) see what was filtered out from a specific search and (ii) turn off the filtering permanently. If it was impossible for users to access what was filtered out then that would be censorship. Rd232 (talk) 08:15, 2 March 2012 (UTC)
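The distinction drawn here — filtering that sets results aside but never makes them inaccessible — can be sketched in a few lines. This is a purely hypothetical illustration, not an actual MediaWiki implementation: the category list, data shapes, and function names are all invented for the example.

```python
# Sketch of user-controlled search filtering: filtered results are set
# aside rather than discarded, and the user can disable the filter
# entirely. FILTERED_CATEGORIES stands in for a consensus-maintained list.
FILTERED_CATEGORIES = {"Nudity"}

def search(results, filter_on=True):
    """Split (title, categories) results into shown and set-aside lists."""
    if not filter_on:
        return list(results), []
    shown, set_aside = [], []
    for title, categories in results:
        if FILTERED_CATEGORIES & set(categories):
            set_aside.append((title, categories))  # still viewable on demand
        else:
            shown.append((title, categories))
    return shown, set_aside

results = [("Toothbrush.jpg", ["Toothbrushes"]),
           ("Example.jpg", ["Nudity", "Toothbrushes"])]
shown, hidden = search(results)                  # filter on by default
all_shown, _ = search(results, filter_on=False)  # user turns the filter off
```

Nothing is deleted: `hidden` remains one action away, which is the distinction being drawn from censorship in the thread above.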
There are various definitions of censorship and it starts at very different points. In this regard I share the position of the ALA, which calls filtering based on value judgment a "censor's tool" and therefore censorship. -- /人 ‿‿ 人\ 苦情処理係 08:17, 2 March 2012 (UTC)
"Labels on library materials may be viewpoint-neutral directional aids designed to save the time of users, or they may be attempts to prejudice or discourage users or restrict their access to materials. When labeling is an attempt to prejudice attitudes, it is a censor’s tool. The American Library Association opposes labeling as a means of predisposing people’s attitudes toward library materials." [4]
Good of you to provide the quote which disproves your interpretation of it. When labeling [of materials] is an attempt to prejudice attitudes, it is a censor’s tool. is not the same as saying "filtering [of search results] based on value judgment" is a censor's tool. It would be more like sticking giant warning templates on certain files' file description pages, "warning, this content is not something you'd want your wife or your servants to see" (to quote an old English obscenity trial). Besides, the definition is simply wrong. Censorship is about preventing access to or dissemination of information. Labelling intended to prejudice attitudes is a propagandist's tool. Censorship and propaganda may often be combined, but they are far from the same thing. Rd232 (talk) 10:18, 2 March 2012 (UTC)
"Viewpoint-neutral directional aids facilitate access by making it easier for users to locate materials. The materials are housed on open shelves and are equally accessible to all users, who may choose to consult or ignore the directional aids at their own discretion."[5]
"Directional aids can have the effect of prejudicial labels when their implementation becomes proscriptive rather than descriptive. When directional aids are used to forbid access or to suggest moral or doctrinal endorsement, the effect is the same as prejudicial labeling."[6]
I guess you did not read it at all, right? (Emphasis mine.) -- /人 ‿‿ 人\ 苦情処理係 10:30, 2 March 2012 (UTC)
I read the quote you provided, and skimmed the source document; I didn't see what you've now quoted in addition. I'm not sure whether there's much point in repeating in different form the question we talked about above: is allowing users to filter their search (or not, if they don't want) enforcing a view? I don't think so. Is preventing users from filtering their search as they might wish because "all content is equally valuable" enforcing a view? I think it is. I'll concede that the user filtering could be set up to imply endorsement, and we wouldn't want that and it can be avoided by appropriate description. Rd232 (talk) 10:39, 2 March 2012 (UTC)
That thing you suggested and called "safe search" would perfectly meet the quote "to suggest moral or doctrinal endorsement". You would provide a preset with the secondary description "activate me, or you might find immoral results". Which is exactly what I, most German users and the ALA see as discrimination and censorship. This is also a point which doesn't have any room for discussion. Point. -- /人 ‿‿ 人\ 苦情処理係 10:57, 2 March 2012 (UTC)
I adopted "SafeSearch" as a handle for it, borrowing Google's term, without ever intending the end product to be called that (sorry if that wasn't clear). Because, as I said in my comment just above, it should be presented without implying judgement. The only thing it should do is acknowledge that many people don't want to see particular types of image unexpectedly. If search can really be fixed to avoid the unexpectedness (as discussed in the Effective Search section) then great - the "safesearch" approach is no longer necessary. It's not like I ever envisaged somebody searching for, say, "man masturbating" being asked if they really want to see that; they've searched for it, they should get it, without further delay. PS "Punkt" doesn't translate that way for that usage. You need an idiom like "end of story". Rd232 (talk) 12:27, 2 March 2012 (UTC)
Or you can say, "period", or "full-stop". --JN466 14:45, 2 March 2012 (UTC)
OK. Could we agree,
  1. that it would be unnecessary to introduce a filter that suggests moral or doctrinal endorsement by using some kind of fixed rule set to exclude content from the search, one defined by us and not by the individual user?
  2. that we should not demote or promote search results by manipulating the ranking in any way?
-- /人 ‿‿ 人\ 苦情処理係 13:04, 2 March 2012 (UTC)
Niabot, you may have noticed that when you go into a library, the books are sorted, right? So when I look for books on dental hygiene, I don't find them mixed with porn DVDs. Now, do you feel that the library is engaged in censorship by not interspersing its books on other topics with pornographic works? Should there always be a little bit of porn on every shelf, regardless whether it's dentistry, cathedral architecture, or tennis? Because that is what we are doing. --JN466 14:42, 2 March 2012 (UTC)
The difference is that library books can only be sorted under one category whereas Commons images can be included in multiple categories, so a library would only sort Media:Masturbating with a toothbrush.jpg under its most significant category. --Stefan4 (talk) 14:52, 2 March 2012 (UTC)
... which would not be dental hygiene. The whole point of sorting is to enable library visitors to find what they want. So if the library enables them to avoid porn while looking for books on toothbrushing, is it engaged in censorship? And is Google, used by almost every porn consumer in the world to find their porn of choice, "censored", because it enables safe searching? You clearly have no idea how ridiculous some of these arguments sound. It's [like] arguing with a teenager. --JN466 14:57, 2 March 2012 (UTC)
Did you notice that this is exactly what I try to fix with my proposal A little bit of intelligence, without any problems of unequal treatment? Wouldn't this be sufficient, or is this whole talk actually conducted with the intention of banning certain topics from Commons? -- /人 ‿‿ 人\ 苦情処理係 15:12, 2 March 2012 (UTC)
It is of course useful if the search function finds what you want and place less important things further down in the list, but this should not be done in any moralistic way. For example, if a search for dental hygiene places images depicting pornography at the bottom of the search results, the converse should also be true so that a search for pornography places images depicting dental hygiene at the bottom of the search results. If a search for pornography instead places images depicting pornography at the bottom of the search results, as you seem to be proposing, the search function is censored and faulty. And there should not be any Show unexpected results button; all images should appear somewhere in the list of results. --Stefan4 (talk) 15:29, 2 March 2012 (UTC)
I don't know where you got the idea from that anyone here would advocate a system where people looking for images of sex should be shown images of toothbrushes instead. Google doesn't do that either, does it? What I and others would like is for Wikimedia simply to behave in the way that is standard among all other reputable websites out there, be it Google, Flickr or YouTube. --JN466 15:40, 2 March 2012 (UTC)
That wasn't my idea. I think you misunderstood me. My idea is to place the most important results at the top while grouping them by topics/categories. That way the user has an easy way to see what he might expect, and it would be his decision to look at the results. I did not propose anything like a search that would have a preference for some kind of topic. It's just a way to display results in a manner that can't be objected to. If you search for a cucumber and open the topic "sex toys", then it is your fault if you don't want to see a woman using it for masturbation. As simple as that. The idea behind this approach is to display anything at will and to enable refined searches that avoid ambiguous results. -- /人 ‿‿ 人\ 苦情処理係 15:50, 2 March 2012 (UTC)
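The grouping-by-topic idea described in that comment could be sketched roughly as follows. This is an illustrative assumption, not an actual search implementation: the grouping key (a result's first category) and the data shapes are invented for the example.

```python
from collections import defaultdict

def cluster_results(results):
    """Group (title, categories) search results by their first category,
    so the user sees labelled topic groups ('Vegetables', 'Sex toys', ...)
    and chooses which groups to open."""
    clusters = defaultdict(list)
    for title, categories in results:
        key = categories[0] if categories else "Uncategorised"
        clusters[key].append(title)
    return dict(clusters)

hits = [("Cucumber salad.jpg", ["Vegetables"]),
        ("Cucumber plant.jpg", ["Vegetables"]),
        ("Cucumber used as sex toy.jpg", ["Sex toys"])]
clusters = cluster_results(hits)
```

Every result is still present in some group; the user's only decision is which topic headings to expand, which is how this proposal differs from exclusion-based filtering.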
(ec) Niabot, I have never objected to the fact that we cover adult topics, or have explicit media. I object to the absence of an image filter, when an image filter is standard among all mainstream websites. And I object to our keeping a lot of exhibitionistic and poorly-executed material that is not realistically useful for an educational purpose. (And my objections on the latter count would be reduced somewhat if we had an image filter.) You probably realise that Commons would be illegal on a German server without AVS, according to German Jugendschutz law. --JN466 15:34, 2 March 2012 (UTC)
That others do something can't be the rule for us, nor does it mean that it would be good for us and the mission. Regarding German law (I don't know why you bring it up): we wouldn't have a problem with the JuSchG. It's far from the only rule a court has to follow; there are many other laws to consider. To say it is forbidden because we have the JuSchG would be fraud, as it would be fraud to build a stupidly simplified chain of argument like this: German laws are more strict (no proof). The Germans are the ones who protest (you forgot France, Spain, Italy, Sweden, ...). They have no right to protest, because US laws are less strict. If we want an image filter - you shut up. -- /人 ‿‿ 人\ 苦情処理係 16:02, 2 March 2012 (UTC)
See http://www.bundespruefstelle.de/bpjm/information-in-english.html – to wit:
Media with pornographic content are regularly considered to be obviously and severely harmful to minors. Pornography itself is defined by the German High Court as a presentation of sexuality that is not connected to any kind of psychologically motivated human relationship and which glorifies sexual satisfaction as the only reason for human existence, often accompanied by grossly depicted genitals.
Distributing those objects to minors is illegal (§ 15 I and III-VI JuSchG) and will be punished by law (§ 27 JuSchG). In addition, the German penal code (Strafgesetzbuch - StGB) penalizes the dissemination of pornographic content (§ 184 StGB).
Completely prohibited - even among people of legal age - are the depictions of sexual acts involving children, animals or violence. Similar regulations prohibit media with explicitly violent content.
The spreading of pornographic content and other harmful media via the internet is a criminal offence under German jurisdiction. A pornographic content on the internet is legal only if technical measures prohibit minors from getting access to the object (AVS = Age Verification System or Adult-Check-System).
I enquired with the Bundesprüfstelle a while ago, and they were quite happy to confirm that sample media files I sent them, like the van Maele incest and bestiality drawings and various photographs of sex acts, plainly fall foul of German youth protection law. However, they also told me that their hands are tied because the servers are in the US.
Now, when you say that what everybody else does might not be good for us, what you really mean is that Wikimedia should be doing what you, some dude on the Internet, want it to do – even though it brings Wikimedia into disrepute, and even though no other reputable site behaves that way. The concept of Wikimedia projects doing what other reputable publishers do is well anchored in fundamental Wikimedia project policies such as WP:NPOV, WP:V and WP:OR, all of which militate against some dude on the Internet making up what content Wikimedia should offer, and how it should be presented. How major players in wider society, such as Google, YouTube, Yahoo and Flickr, present adult material is rather more relevant to this project than the ideas of some dude on the Internet whose signature looks like a pair of nipples. --JN466 14:23, 3 March 2012 (UTC)
Do you know what kind of organization you were exchanging with? They are pushing their own point of view because it is their job to do so. On the other hand we have movies, TV series, literature and comics that clearly show sexual activity, brutality and other such material and are freely available to minors. That's because the JuSchG itself is also limited by other laws that have to be considered as well. You only cite one side of the coin, while willingly ignoring the other. That's why I pity you. Additionally (look at the header), you are way off topic. So I would recommend not continuing this discussion at this place. -- /人 ‿‿ 人\ 苦情処理係 14:39, 3 March 2012 (UTC)
So they are pushing a point of view, but you are not? Wouldn't you say that their point of view is rather more representative of the society we aim to serve than yours? --JN466 14:53, 3 March 2012 (UTC)
I want to provide media files for all interest groups. They only want to provide media files which they think are suited for children. Quite a difference in mission, which makes it impossible to answer the question of what would be representative. -- /人 ‿‿ 人\ 苦情処理係 15:18, 3 March 2012 (UTC)
We are all concerned with providing media files for all interest groups, including educational adult media for adults. However, I hope we can agree that the view of the Bundesprüfstelle represents the mainstream view in Germany on how to do that responsibly (and that in many other countries the mainstream view is similar). Note that that's a quite separate matter from agreeing with the mainstream view. --JN466 17:27, 3 March 2012 (UTC)
I can't agree with that. There is a constant dispute between the BPJM and all kinds of media, which results in strong criticism of its practices. The reason for that is simple, and I doubt that it needs an explanation at this place. Since there is this constant dispute, which is also stoked by the BPJM to point out its own importance, while schoolchildren openly exchange adult material behind its back (it was never easier than today), I can't agree that it would be representative of the mainstream. It has too narrow a focus (goal, aim) to be mainstream or mainstream-compatible. -- /人 ‿‿ 人\ 苦情処理係 18:00, 3 March 2012 (UTC)
So-called adult material is obviously only an issue for some adults who can't stand it and try to restrict its availability whereas children have no problems with it. In some countries the censors are lucky, managing to expel the material to especially designated areas marked as 18禁 (or similar) whereas in most countries the material would be readily available for anyone to access in standard magazine stands mixed with other material such as children's comics (although typically not on the same shelf as the comics due to categorisation). --Stefan4 (talk) 20:59, 3 March 2012 (UTC)
As an organ of a democratically elected German government, which as part of its publicity work draws the public's attention to relevant legislation shaped by the German people's democratically elected representatives, the Bundesprüfstelle represents the mainstream in Germany. That's not to say that the topic isn't contentious, or that things may not change at some point in the future, but that is the status quo. Thanks for adding the cluster search idea on Meta. --JN466 04:07, 4 March 2012 (UTC)

Custom Safesearch[edit]

Rather than creating a list for all users, each user should be able to decide what categories to filter by using CSS scripts. Then users could import other users' filter options by adding an include statement in their own CSS file. Maybe a user is just afraid of spiders and wants to hide the Araneae category. This would also fix the battle over tagging of images mentioned by AerobicFox above. MorganKevinJ(talk) 00:17, 10 March 2012 (UTC)

Apart from being inconvenient, and possibly unworkable for the less techy users, I don't see the difference. The other proposal allowed for different lists too, with user choice of which (or none). Rd232 (talk) 01:04, 10 March 2012 (UTC)
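The import mechanism proposed in this section could work roughly like this. Everything here is an illustrative assumption (the page names, the data layout, and the resolution logic); the actual proposal is user CSS, and this sketch only models how per-user exclusion lists that include other users' lists might be resolved, with a `seen` set to guard against circular includes.

```python
# Hypothetical per-user filter pages: each lists excluded categories and
# may include other users' filter pages.
FILTER_PAGES = {
    "User:A/filter": {"includes": ["User:B/filter"], "categories": {"Araneae"}},
    "User:B/filter": {"includes": [], "categories": {"Nudity"}},
}

def resolve(page, seen=None):
    """Collect a user's excluded categories, following includes
    recursively and ignoring missing or already-visited pages."""
    seen = seen if seen is not None else set()
    if page in seen or page not in FILTER_PAGES:
        return set()
    seen.add(page)
    entry = FILTER_PAGES[page]
    cats = set(entry["categories"])
    for inc in entry["includes"]:
        cats |= resolve(inc, seen)
    return cats
```

So user A, who only listed Araneae but included user B's list, ends up filtering both Araneae and Nudity; deleting the include line restores A's own list alone.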

Proposal: Increase the ranking of keywords[edit]

If an image or category is more relevant to the search terms, or is a featured image, it should be ranked higher. The issue seems only to be brought up when an offensive image has very low relevance to the search terms. This would also be very useful from a usability standpoint, since it would reduce the appearance of irrelevant search results. MorganKevinJ(talk) 16:54, 15 March 2012 (UTC)

But what if the user expects to find the "irrelevant" results, because he was actually searching for them? It would make his search unnecessarily complicated. --/人 ‿‿ 人\ どこに見てもオッパイばかり 23:19, 20 March 2012 (UTC)
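A rough sketch of the ranking proposal above, with invented field names and a deliberately crude score (term overlap with title and description, plus a bonus for featured images). This is not how MediaWiki search actually scores results; it only illustrates the idea that low-relevance matches sink rather than disappear, which also addresses the objection that results must remain findable.

```python
def rank(results, query_terms):
    """Sort results by a simple relevance score: how many query terms
    appear in the title or description, plus 1 if the image is featured.
    No result is removed; weak matches just rank lower."""
    q = {t.lower() for t in query_terms}
    def score(r):
        words = set(r["title"].lower().split())
        words |= set(r["description"].lower().split())
        return len(q & words) + (1 if r.get("featured") else 0)
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Monarch butterfly", "description": "a butterfly", "featured": False},
    {"title": "Queen Elizabeth", "description": "British monarch portrait", "featured": True},
]
ranked = rank(results, ["monarch", "portrait"])
```

The butterfly still appears in the output for a "monarch" search, just below the stronger match, so a user who actually wanted the weak matches still reaches them.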

Create image galleries for search terms that lead to offensive content[edit]

As an example, searching for pearl necklace led users to this offensive image, but now when users search for pearl necklace they are directed to this page. MorganKevinJ(talk) 20:54, 26 April 2012 (UTC)
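The redirection described here amounts to a lookup from known-ambiguous search terms to curated gallery pages, falling back to ordinary search otherwise. The mapping and function names below are invented for the sketch; on Commons this is done with hand-made gallery pages, not code.

```python
# Hypothetical mapping from ambiguous search terms to curated galleries.
CURATED_GALLERIES = {
    "pearl necklace": "Pearl necklace",  # the gallery page in the example
}

def resolve_search(term):
    """Send terms with a curated gallery to that gallery page;
    otherwise fall through to the normal search results."""
    page = CURATED_GALLERIES.get(term.lower().strip())
    return ("gallery", page) if page else ("search", term)
```

This scales only as fast as editors curate galleries, but it needs no new software: each entry is just a wiki page that presents the term's main senses first.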