Port: WGET and WPUT for the PPC platform

  • Hi,


    The rudimentary, incomplete wget in busybox was getting on my nerves, so I quickly compiled the latest wget (1.10.1) for PPC (5xxx/70x0). Compare the options of the two variants and the difference becomes obvious. That said, the busybox wget is small and optimized for speed, while the one attached here shines through functionality instead. A small usage example follows the help output below.


    Regards, Mamba


    GNU Wget 1.10.1, a non-interactive network retriever.
    Usage: wget [OPTION]... [URL]...


    Mandatory arguments to long options are mandatory for short options too.


    Startup:
    -V, --version display the version of Wget and exit.
    -h, --help print this help.
    -b, --background go to background after startup.
    -e, --execute=COMMAND execute a `.wgetrc'-style command.


    Logging and input file:
    -o, --output-file=FILE log messages to FILE.
    -a, --append-output=FILE append messages to FILE.
    -d, --debug print lots of debugging information.
    -q, --quiet quiet (no output).
    -v, --verbose be verbose (this is the default).
    -nv, --no-verbose turn off verboseness, without being quiet.
    -i, --input-file=FILE download URLs found in FILE.
    -F, --force-html treat input file as HTML.
    -B, --base=URL prepends URL to relative links in -F -i file.


    Download:
    -t, --tries=NUMBER set number of retries to NUMBER (0 unlimits).
    --retry-connrefused retry even if connection is refused.
    -O, --output-document=FILE write documents to FILE.
    -nc, --no-clobber skip downloads that would download to
    existing files.
    -c, --continue resume getting a partially-downloaded file.
    --progress=TYPE select progress gauge type.
    -N, --timestamping don't re-retrieve files unless newer than
    local.
    -S, --server-response print server response.
    --spider don't download anything.
    -T, --timeout=SECONDS set all timeout values to SECONDS.
    --dns-timeout=SECS set the DNS lookup timeout to SECS.
    --connect-timeout=SECS set the connect timeout to SECS.
    --read-timeout=SECS set the read timeout to SECS.
    -w, --wait=SECONDS wait SECONDS between retrievals.
    --waitretry=SECONDS wait 1..SECONDS between retries of a retrieval.
    --random-wait wait from 0...2*WAIT secs between retrievals.
    -Y, --proxy explicitly turn on proxy.
    --no-proxy explicitly turn off proxy.
    -Q, --quota=NUMBER set retrieval quota to NUMBER.
    --bind-address=ADDRESS bind to ADDRESS (hostname or IP) on local host.
    --limit-rate=RATE limit download rate to RATE.
    --no-dns-cache disable caching DNS lookups.
    --restrict-file-names=OS restrict chars in file names to ones OS allows.
    -4, --inet4-only connect only to IPv4 addresses.
    -6, --inet6-only connect only to IPv6 addresses.
    --prefer-family=FAMILY connect first to addresses of specified family,
    one of IPv6, IPv4, or none.
    --user=USER set both ftp and http user to USER.
    --password=PASS set both ftp and http password to PASS.


    Directories:
    -nd, --no-directories don't create directories.
    -x, --force-directories force creation of directories.
    -nH, --no-host-directories don't create host directories.
    --protocol-directories use protocol name in directories.
    -P, --directory-prefix=PREFIX save files to PREFIX/...
    --cut-dirs=NUMBER ignore NUMBER remote directory components.


    HTTP options:
    --http-user=USER set http user to USER.
    --http-password=PASS set http password to PASS.
    --no-cache disallow server-cached data.
    -E, --html-extension save HTML documents with `.html' extension.
    --ignore-length ignore `Content-Length' header field.
    --header=STRING insert STRING among the headers.
    --proxy-user=USER set USER as proxy username.
    --proxy-password=PASS set PASS as proxy password.
    --referer=URL include `Referer: URL' header in HTTP request.
    --save-headers save the HTTP headers to file.
    -U, --user-agent=AGENT identify as AGENT instead of Wget/VERSION.
    --no-http-keep-alive disable HTTP keep-alive (persistent connections).
    --no-cookies don't use cookies.
    --load-cookies=FILE load cookies from FILE before session.
    --save-cookies=FILE save cookies to FILE after session.
    --keep-session-cookies load and save session (non-permanent) cookies.
    --post-data=STRING use the POST method; send STRING as the data.
    --post-file=FILE use the POST method; send contents of FILE.


    FTP options:
    --ftp-user=USER set ftp user to USER.
    --ftp-password=PASS set ftp password to PASS.
    --no-remove-listing don't remove `.listing' files.
    --no-glob turn off FTP file name globbing.
    --no-passive-ftp disable the "passive" transfer mode.
    --retr-symlinks when recursing, get linked-to files (not dir).
    --preserve-permissions preserve remote file permissions.


    Recursive download:
    -r, --recursive specify recursive download.
    -l, --level=NUMBER maximum recursion depth (inf or 0 for infinite).
    --delete-after delete files locally after downloading them.
    -k, --convert-links make links in downloaded HTML point to local files.
    -K, --backup-converted before converting file X, back up as X.orig.
    -m, --mirror shortcut for -N -r -l inf --no-remove-listing.
    -p, --page-requisites get all images, etc. needed to display HTML page.
    --strict-comments turn on strict (SGML) handling of HTML comments.


    Recursive accept/reject:
    -A, --accept=LIST comma-separated list of accepted extensions.
    -R, --reject=LIST comma-separated list of rejected extensions.
    -D, --domains=LIST comma-separated list of accepted domains.
    --exclude-domains=LIST comma-separated list of rejected domains.
    --follow-ftp follow FTP links from HTML documents.
    --follow-tags=LIST comma-separated list of followed HTML tags.
    --ignore-tags=LIST comma-separated list of ignored HTML tags.
    -H, --span-hosts go to foreign hosts when recursive.
    -L, --relative follow relative links only.
    -I, --include-directories=LIST list of allowed directories.
    -X, --exclude-directories=LIST list of excluded directories.
    -np, --no-parent don't ascend to the parent directory.


    Mail bug reports and suggestions to <bug-wget@gnu.org>.
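

    Purely as an illustration of what the full build adds over the busybox variant (host name and file names here are invented, not taken from this thread):


    Code
    # resume a partially downloaded file, throttle to 50 KB/s and retry 5 times
    ./wget -c --limit-rate=50k -t 5 -O /media/hdd/movie.ts http://server.example/movie.ts
    # only fetch the file if the remote copy is newer than the local one
    ./wget -N http://server.example/epg.xml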

  • Can you also build it for the DM7025?


    And there is also a wput that could be compiled :smiling_face:


    I always like getting new toys!


    The background is that I am currently trying to get a phonebook plugin working (using www.dasoertliche.de), and with the normal wget that is too tedious. So far I have only managed to get a quote-of-the-day plugin running, because there the web request is simpler and so is the result.


    Regards
    thowi


  • Sure,


    I am currently looking for the 7025 compiler. Is there a ready-built one available for download?


    Regards, Mamba

    __________________________________
    Dreambox 800/7020, 250 GB HDD, 100 Mbit Lan


  • Attached is wput for the PPC platform. A 7025 build will follow as soon as I have the compiler for it.


    Regards, Mamba



    root@dm7020:/media/hdd/bin# ./wput -h
    Usage: wput [options] [file]... [url]...
    url        ftp://[username[:password]@]hostname[:port][/[path/][file]]


    Startup:
    -V, --version Display the version of wput and exit.
    -h, --help Print this help-screen
    -b, --background go to background after startup


    Logging and input file:
    -o, --output-file=FILE log messages to FILE
    -a, --append-output=FILE append log messages to FILE
    -q, --quiet quiet (no output)
    -v, --verbose be verbose
    -d, --debug debug output
    -nv, --less-verbose be less verbose
    -i, --input-file=FILE read the URLs from FILE
    -s, --sort sorts all input URLs by server-ip and path
    --basename=PATH snip PATH off each file when appending to an URL
    -I, --input-pipe=COMMAND take the output of COMMAND as data-source
    -R, --remove-source-files unlink files upon successful upload


    Upload:
    --bind-address=ADDR bind to ADDR (hostname or IP) on local host
    -t, --tries=NUMBER set retry count to NUMBER (-1 means infinite)
    -nc, --dont-continue do not resume partially-uploaded files
    -u, --reupload do not skip already completed files
    --skip-larger do not upload files if remote size is larger
    --skip-existing do not upload files that exist remotely
    -N, --timestamping don't re-upload files unless newer than remote
    -T, --timeout=10th-SECONDS set various timeouts to 10th-SECONDS
    -w, --wait=10th-SECONDS wait 10th-SECONDS between uploads. (default: 0)
    --random-wait wait from 0...2*WAIT secs between uploads.
    --waitretry=SECONDS wait SECONDS between retries of an upload
    -l, --limit-rate=RATE limit upload rate to RATE
    -nd, --no-directories do not create any directories
    -Y, --proxy=http/socks/off set proxy type or turn off
    --proxy-user=NAME set the proxy-username to NAME
    --proxy-pass=PASS set the proxy-password to PASS


    FTP-Options:
    -p, --port-mode no-passive, turn on port mode ftp (def. pasv)
    -A, --ascii force ASCII mode-transfer
    -B, --binary force BINARY mode-transfer


    See wput(1) for more detailed descriptions of the options.
    Report bugs and suggestions via SourceForge at
    http://sourceforge.net/tracker/?group_id=141519
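

    To give a feel for how it is called (server address, login and paths below are placeholders, not from this thread):


    Code
    # upload one file; -u forces a re-upload even if it already exists remotely
    ./wput -u /media/hdd/timer.log ftp://user:secret@192.168.1.2/backup/
    # upload everything listed in filelist.txt, waiting 2 seconds (-w takes tenths of a second)
    ./wput -i filelist.txt -w 20 ftp://user:secret@192.168.1.2/backup/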

  • I compile small things like this directly on the Dreambox, using the compiler that has already been posted here in the development section - gcc-mipsel is, I believe, what you need, but I am not familiar with cross-compiling on a PC, sorry (a rough build sketch follows below).


    Thanks for the wput anyway :winking_face:
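

    In case it helps, a rough sketch of a native build directly on the box (the wput version, archive name and install path are assumptions, adjust them to the source you actually use):


    Code
    # unpack and build on the Dreambox with the native gcc
    tar xzf wput-0.6.tgz
    cd wput-0.6
    ./configure
    make
    cp wput /media/hdd/bin/
    # a PC cross-build would instead point configure at the target toolchain,
    # e.g. ./configure --host=mipsel-linux (toolchain name is an assumption)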

  • Hello, unfortunately wput does not run for me on the current image. I always get this message.


    Code
    ../wput: relocation error: ./wput: symbol gettext, version GLIBC_2.0 not defined
    in file libc.so.6 with link time reference


    Regards, Michael

  • Hi,


    The GLIBC error shows up when you use an old image or a 500 or 7000 box. Only a statically linked wget/wput will run there. But it seems you have already found that out.
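

    A statically linked binary can be built roughly like this (only a sketch, the exact flags depend on the source tree and toolchain):


    Code
    # ask the linker not to pull in the shared libc
    LDFLAGS="-static" ./configure
    make
    # "not a dynamic executable" from ldd confirms the result is static
    ldd wput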


    Mamba

    __________________________________
    Dreambox 800/7020, 250 GB HDD, 100 Mbit Lan

  • Hiho...



    I went to the trouble of compiling wget for the Dreambox 7025.
    That was necessary because the busybox derivative does not support authentication.
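

    As an example of the authentication the busybox variant is missing (host and credentials are made up):


    Code
    # HTTP basic authentication with the full wget
    ./wget --http-user=myname --http-password=secret http://server.example/protected/page.html
    # --user/--password set the login for ftp and http alike
    ./wget --user=myname --password=secret ftp://server.example/pub/file.bin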


    Regards
    MausFan