File:Erste Artikel in der deutschen Wikipedia - Versionsfragmente von extern.gekuerzt.pdf

From Wikimedia Commons, the free media repository


Original file (1,239 × 1,754 pixels, file size: 6.82 MB, MIME type: application/pdf, 136 pages)


Summary

Description
  • de:Polymerase-Kettenreaktion of 12 May 2001 is often cited as the first article of the Wikipedia – yet it is merely the oldest revision of any article still preserved in the version histories.
  • A misconfigured option of the wiki software used at the time limited the number of revisions kept per article. Once this limit was reached, every edit deleted the oldest remaining revision of the article. Which article of the German-language Wikipedia really was the first can nowadays only be determined with external resources.
  • The compiled PDF relies on the correctness of the archive data at archive.org, which mirrors data of the early Wikipedia. It is probably a fortunate circumstance that "robots.txt" had not yet been filled with extensive exclusion rules back then.
See also
Date
Source Data [archive.org], compilation, chronological ordering of archived content (/not/ equal to the time of archiving), PDF bookmarks by cmuelle8
Author Early Wikipedia authors, PDF compilation by cmuelle8
Other versions

Licensing

Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. A copy of the license is included in the section entitled GNU Free Documentation License.
If this file is eligible for relicensing, it may also be used under the Creative Commons Attribution-ShareAlike 3.0 license.

Howto / Source Code

The PDF was generated using a script. It's PD, be creative. It relies on wkhtmltopdf, pdftk and Python 2. It also fixes some quirks in the sources, mainly bad character encodings; see the fx() function for the modifications done. The base64 blob contains built-in URLs that are sourced to build the final PDF. The script also generates a bookmark hierarchy from the URLs sourced; you can modify it afterwards if you need to.

#!/bin/bash
TMP="/dev/shm"
WK="$PWD/wkhtmltopdf -n" f="--footer"

# comment if qt patched version of wkhtmltopdf is not available
WK="$WK --image-dpi 100 --image-quality 85 --margin-bottom 1cm \
  $f-left [doctitle] $f-right [page] $f-line $f-font-size 6 \
  --zoom 0.${ZOOM:-80}" #--use-xserver --no-pdf-compression"

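# python2 one-liner: percent-decode one line from stdin; urlencode below
# turns unquote_plus into quote_plus via the ${Y/.un/.} substitution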
Y="import sys, urllib as ul
sys.stdout.write(ul.unquote_plus(sys.stdin.readline()[:-1]))"

function urldecode() { python2 -c "$Y"; }
function urlencode() { python2 -c "${Y/.un/.}"; }

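# _ll: scrape the page index of the early-Wikipedia mirror at jansson.de
# and emit one "url title" line per article, per revision history and per
# revision browse/diff view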
function _ll() {
  B="http://jansson.de/wikipedia/wiki.cgi?"

  wget -qO - "${B}action=index" \
  | grep -ao 'wiki.cgi[^"]\+' | cut -b 10- | uniq \
  | sed -e '/^?\(HomeP\|RecentC\|action=editprefs\)$/d' \
  | while read u
    do
      u="$(urldecode <<<"$u")" t="$u"
      u="$(iconv -f utf-8 -t latin1 <<<"$u" | urlencode \
           | sed -e 's,%2F,/,g;s,%28,(,g;s,%29,),g')"
      echo "${B}$u" "$t"

      uh="${B}action=history&id=$u"
      echo "$uh" "$t art revhist"

      wget -qO - "$uh" \
      | grep -ao "name..revision..value..[0-9]\+" \
      | grep -ao "[0-9]\+" \
      | while read to
        do
          ur="${B}action=browse&id=$u&revision=$to"
          echo "$ur $t art r$to"

          ur="${ur/id=/diff=1&id=}&diffrevision=$((to-1))"
          echo "$ur $t diff r$((to-1)) r$to"
        done \
      | sed -e '$d;1,2d'
    done
}

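# _ar: ask the Wayback Machine to save each url of the link list, print
# the archived counterpart, and pause briefly between requests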
function _ar() {
  B="https://web.archive.org"

  while read u t
  do
    wget "$B/save/$u" -O /dev/null
    echo "$B/web/$u"
    sleep 2
  done
}

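# bm: emit one pdftk bookmark record (Begin/Title/Level/PageNumber);
# quellen.html itself is skipped, it gets the "Quellen / Epilog" bookmark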
function bm() {
  B="Bookmark" ; [[ -n "${1%%quellen.html}" ]] && \
  echo -e "\n${B}Begin\n${B}Title: $1\n${B}Level: $2\n${B}PageNumber: $3"
}

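# fx: fetch $1 (or assemble quellen.html from $URLS), declare the charset
# detected by file(1), style form controls, hide the page-actions bar,
# point converted links back at the source url and strip broken images
# from selected snapshots; the fixed html goes to $2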
function fx() {
  echo -e "\n$1" | tee /dev/tty | grep -q '^http' 3>wk && wget -qkO wk "$1"

  tr '\n' '\a' <wk | sed -e "\
    s,<head>,\0<meta http-equiv=\"content-type\"\
      content=\"text/html; charset=$(file -bi wk | grep -o "[^=]*$")\">,I;\
    s,<button\|<input,\0 style=\"font-size:85%;border:1px solid darkgray;\
      color:black;background-color:white;padding: 0.2em;\" ,gI;\
    s,id=.p-cactions.,\0 style=\"visibility:hidden\" ,I;\
    s,\(href=[ \"\']*\)wk,\1$1,gI" \
  | case "$1" in \
      *=index) sed -e "s,%28,(,g;s,%29,),g;s,%2F,/,g;\
        s,%C3%84,%C4,g;s,%C3%96,%D6,g;s,%C3%9F,%DF,g;s,%C3%A4,%E4,g;\
        s,%C3%A8,%E8,g;s,%C3%A9,%E9,g;s,%C3%B4,%F4,g;s,%C3%B6,%F6,g;\
        s,%C3%BC,%FC,g" ;;
      *20050216*Hauptseite) sed -e 's,<img[^>]*ujoma.01.jpg[^>]*>,,' ;;
      *20050628*Hauptseite) sed -e 's,</\?img[^>]*>,,g' ;;
      $Q) sed -ne '/^M4/,${/^M/d;p}' <<<"$URLS" ;;
      *) cat ;;
    esac | tr '\a' '\n' > "$2"
}

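# _pdf: render each url into a numbered pdf via fx and wkhtmltopdf, track
# page offsets with pdfinfo to place the bookmarks, then concatenate all
# parts with pdftk and attach the bookmark hierarchy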
function _pdf() {
  D="$TMP/$0.tmp~" J="jansson.de" A="archive.org"
  rm -rf "$D" ; mkdir "$D" ; cd "$D"

  pp=1 n=1 v="prevurl" w="http://[^/]*pedia"
  while read url title
  do
    m="" fi=wk.html fo=$(printf %04d $n).pdf ; fx "$url" $fi </dev/null
    c="$WK --title \"$url\" $fi $fo" ; { echo "$c"; $c; } &>>log || exit 1

    if ! grep -q "$Q\|RecentChanges$" <<<"$url"
    then 
      [[ -z "${url%%*$J*}" && -n "${v%%*$J*}" ]] && v="$url" \
        m="$(bm "$J/wikipedia/wiki.cgi?" 1 $pp)"

      # add top level bookmarks when year or domain changes
      t="$(grep -o "$A/web/200." <<<"$url")"
      l="$(grep -Po "/$w\.(com|org)" <<<"$url")" s="$l"
      grep -q "[3-9]$" <<<"$t" && l="${l/.com/.org}"

      [[ -n "$t" && -n "${v%%*$t*$l*}" ]] && v="${url/$s/$l}" l="$l/wiki/" \
        m="$(bm "$t ${l/%*.com*/${l%/}.cgi?}" 1 $pp)"
    fi

    t="$(urldecode <<<"$url" | iconv -f iso8859-1 -t utf-8 | sed -e "\
      s,/200[0-2]0[0-6].*com/$,\0HomePage,;s,^[^0-9?*]\+\(200.\|?\),,;\
      s,^\([0-9]*\)/http://[^/]*/\(\|wiki/\?\(\|.cgi?\?\)\),\1 ,;\
      s, $, Hauptseite,")"
    l="$(grep -q "?action=[heb]\|from=" <<<"$url" && echo 3 || echo 2)"

    grep -q "$Q" <<<"$url" && m="$(bm "Quellen / Epilog" 1 $pp)"
    echo "$m$(bm "$t" $l $pp)" | tee -a bookmarks >/dev/tty

    let "n++" "pp+=$(pdfinfo "$fo" | grep Pages | cut -b 7-)"
  done

  pdftk *.pdf cat output p ; t="$OLDPWD/$1"
  pdftk p update_info_utf8 bookmarks output "$t" ; rm p

  cat [bl]* | xz -9 >"$t.bookmarks+log.xz" ; cd "$OLDPWD"
}

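# builtin data: a uuencoded, xz-compressed url list; blank and comment
# lines are dropped on extraction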
URLS="$(uudecode -o - <<EOF | unxz | grep -v '^\( *\|#.*\)$'
begin-base64 644 urls.xz
/Td6WFoAAATm1rRGAgAhARwAAAAQz1jM4CGDBI1dADQdLOA6CVosH42u84VL
J2NdCVTwKjVJpnq1ldFbzTmucKT/3e6/Dg0SeG0/SyDI6GNCSW3IBy4RG5I0
Q48sUT8qC/d43LW24rzuiBCcUTCeovNWKTb7wbAOYJhz/zHqz7hFlDFd2la3
XiVXYg97zLcXwh+u52p8USp241dKM6mYYS3sPA7X/m+qdrvS6aYSho/+35RW
xuE6SlAZqWphzag46PCny8/cwp9G+Hw0GjkHzlLzgDV7Z2kKHgoLjhmS9UYv
QWX7So764rZuV32T+9nBRW2JP9ZOi5oTRldE5I/L2omcpwCJ7LQSQBqC8bps
ZZ6jmLCvKlMPwfNl0xuEWhSR/JRbr+g4QQ5tHs39sRPvafnWZdoHJM6LFMxG
F+dbAiXQIXWOwLrZ2DFiG2R81HSZIy/BiWHjRbd8nK5U+AVxCTD07oettHfx
NYLbXI+nHlXEg3ksNUWLClto72H4z/DITUTO6DZw19or5fsTnP2MQq3IdG1C
6PCm1/Xvf8zkWmDv1pVi3YfH5p76iMeGDBGxXnak0U9uDSv63Vs4EhjDt3Pz
KY87XtIeQTR0E07d6dsaDIUzXTcFlQXAljSd2mdzOCNzulH+UnRlOMRkePU4
TSUF3/NMK2kAL326+tMtkjOqAy+yRAoI3MnBy5JkdxOr30pDVVSJ8eeYxIAY
fOlmLhxi37XGwX9WYCfkVjgSMgEw2YHE6rLrJm6c0KfKuKe/2Tor82mQ7026
TSigavrXkiBfCb2TrcH+ZqPtIvMWrn7HgOH0pV/I8lY1ntw58Vz0nz6prhzj
zUhBYZCloCVVLd07wL9xOVgchgbwv8oARf8RJTZ4m0Ralfbg0yYlowYupy78
IdGQG7oPDoRQQCJY+4BCMKtNQMtfBBcynwy/22NAbdvFGGB3OgYG30fGKv02
oF5Jtol3xrWUWjr/dFtrlz2yL2PPOzTYzlmGRiL+G9XW9EbRmcpp7ob6JAKQ
fhaYMp/C4Cqe3doyfY/BssgJUXmKBxDbg54KO41Gv1x/SU5f6qvYfjRzB3Xb
zNlj0ApKoyhlnB7i8yCDIjyT6udqKbszRXA8OkufmHPiD8Vtn/9ctZfBCYwB
nLsFyM+5Du+CwvoRfew03lqQyZ97ZMSmrDBMO4KPIP+IpwOzmKdQtZGPZLHy
gwXSOWGjUeJnPeGU2bTyfYLX3OnaD4gZQw6hjjpuarwMap8W54CA/a0Ehp5b
QVqPPVkUtBt+izYOtCGkBDZddcQsBiAJmJCSjalQCl04ikyZorR0wzeToOK4
C4GZCtdkrju4hAmcKrZ0rUeNjL8YwluR46kqPwRROnQMuxX5ePoVn6kZU6yp
j56fT2jWOdHENNoiwoNZwOWpECbOekEPp9w1xM9aVCYZG7fj8w9IXGCaehJR
qeuQozYNLCADZzftoRlmhRBlxI2NHlJQ0B6XONKFG0BMtd22Ev7lJWnK7RLU
5gZ/b++Vwn0MpyBIs0UZCH6UpEiVdAQ+1+M3noZN+1AAg9+5q3GrSsTe8YML
UEA3DH3faMGDv4GOpHEbm78NZCALo0836D0AAAAAImB53mGrqIwAAakJhEMA
AH/kseCxxGf7AgAAAAAEWVo=
====
EOF
)"

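# slice the blob at its M2/M3/M4 marker lines: a common prolog P plus the
# shortpdf and alphapdf url lists; fx() extracts the quellen.html epilog
# from the section after M4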
P="$(sed -ne '1,/^M2/{/^M/d;p}' <<<"$URLS")" Q="quellen.html"
shortpdf="$P$(echo ; sed -ne '/^M2/,/^M3/{/^M/d;p}' <<<"$URLS")"
alphapdf="$P$(echo ; sed -ne '/^M3/,/^M4/{/^M/d;p}' <<<"$URLS")"
llmodpdf="$alphapdf"

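# dispatch on the first argument; pdf* presets the builtin short list and
# falls through (;&) into the generic *pdf branch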
case "${CMD:=$1}" in
  li*) _ll > "$0.ll" ;;
  ar*) _ar < "$0.ll" 1>"$0.log.archive" 2>"$0.log.wget" ;;

  pdf*) CMD="shortpdf" ;&
  *pdf) _pdf "${2:-erste_artikel.${CMD/pd/.pd}}" < <(\
    {
      [[ -f "$3" ]] && cat "$3" || echo "${!CMD}"

      case "$CMD" in
        (al*) _ll | tee "$0.ll" ;;
        (ll*) [[ -f "$5" ]] && cat "$5" \
           || [[ -f "$0.ll" ]] && cat "$0.ll" \
           || echo "no linklist!" && exit 1 ;;
      esac | grep -v "&revision=[0-9]\+ " | sort -k 2,2 -k 3,3 -k 4r

      [[ -f "$4" ]] && cat "$4" || echo "$Q"
    } ) ;;

  *) echo -e "Compiles wiki urls from $0.ll to a single pdf,\n"\
     "[short|alpha]pdf commands default to builtin lists.\n"\
     "usage: $0 [ shortpdf | alphapdf | [ linklist | archive | llmodpdf ]]"
esac
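
For example, a run might look like this (a sketch only: the script name make_pdf.sh is hypothetical, the wkhtmltopdf binary is expected in the working directory, and ZOOM optionally overrides the default 0.80 page zoom):

ZOOM=90 ./make_pdf.sh shortpdf
# renders the builtin short url list into erste_artikel.short.pdf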

Changing Bookmarks / Table of Contents

# you can change bookmarks / TOC e.g. with pdftk
pdftk source.pdf dump_data_utf8 | grep ^Bookmark > bookmarks

# edit bookmarks file, then
pdftk source.pdf update_info_utf8 bookmarks output changed.pdf

# if pages need to be reordered, there's e.g. pdfshuffler
# https://sourceforge.net/projects/pdfshuffler/
# https://wiki.ubuntuusers.de/PDF-Shuffler/ (german)
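
The dump_data_utf8 output groups bookmarks into records of BookmarkBegin/BookmarkTitle/BookmarkLevel/BookmarkPageNumber lines, the same format the bm() helper above emits. A minimal sketch of one record (the page number here is made up):

BookmarkBegin
BookmarkTitle: Quellen / Epilog
BookmarkLevel: 1
BookmarkPageNumber: 130

# edit Title, Level or PageNumber as needed, keep one BookmarkBegin per
# record, then write the result back with update_info_utf8 as shown above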

File history


Date/Time                Dimensions                          User      Comment
03:44, 9 March 2016      1,239 × 1,754, 136 pages (6.82 MB)  Cmuelle8  Table of contents / bookmarks structured; links made clickable; selected main pages (current version)
21:47, 28 February 2016  1,240 × 1,753, 71 pages (1.35 MB)   Cmuelle8  User created page with UploadWizard
