Download offline version of dynamic pages with Wget

24 November, 2006 - 10:43
Reminder, mainly to myself: a short list of useful wget options for recursively downloading dynamic (PHP, ASP, ...) webpages (because wget's man page is too long):

  • --no-clobber: do not redownload pages that already exist locally.
  • --html-extension: append the extension .html to webpages whose URL does not end in .html or .htm but in something like .php or .php?q=boo&bar=4.
  • --recursive: turn on recursive downloading.
  • --level=3: set the recursion depth.
  • --convert-links: make the links in downloaded documents point to local files if possible.
  • --page-requisites: download embedded images and stylesheets for each downloaded html document.
  • --relative: only follow relative links, not absolute links (even if in the same domain).
  • --no-parent: do not ascend to parent directory of the given URL while recursively retrieving.
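Putting the options above together, a full invocation could look something like this (the URL is just an example):

```shell
# Recursively mirror a dynamic site, up to 3 levels deep,
# rewriting links for offline browsing. Example URL only.
wget --no-clobber --html-extension --recursive --level=3 \
     --convert-links --page-requisites --no-parent \
     http://www.example.com/docs/index.php
```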

Using custom LaTeX document classes in LyX

21 September, 2006 - 10:23
For writing with LaTeX I prefer using LyX, because it hides the ugliness of LaTeX source code behind a pseudo-WYSIWYG frontend (the developers call it WYSIWYM: what you see is what you mean). Entering mathematical expressions is still very LaTeX-minded, but comfortable: you type LaTeX math constructs (stuff with _, ^, \sum, ...) and LyX directly visualises them as a pretty-printed formula. For advanced LaTeX constructions not available in LyX's interface one can always fall back on "raw LaTeX" input fields. In short, I think LyX is a very handy compromise between the power of LaTeX and the user-friendliness of a WYSIWYG system.

Sometimes, for example when writing an article for a conference, one needs to use a custom LaTeX document class, as prescribed by the conference's author guidelines. The procedure for making LyX use such a custom document class is non-obvious and a bit involved. It is explained at length in the LyX manual (Part Customization, Chapter 5: Installing New Document Classes, Layouts, and Templates), but here is the short version (for teTeX on Linux or related systems; I don't know about Windows):

Making a video from frames with transcode

20 August, 2006 - 11:51
Making a video from a set of frames can be done with transcode as follows.
Disclaimer: this is mainly a reminder to myself, based on these instructions for making animations from frames, with some extras.

First, make a file listing the filenames of the individual frames (in the right order, of course). E.g. if the frames are named /tmp/0001.png, /tmp/0002.png, ..., /tmp/0100.png:

$> ls /tmp/0*.png  > framelist.txt

Also, identify the size of the input frames, e.g. with ImageMagick's identify:

$> identify /tmp/0001.png
/tmp/0001.png PNG 400x300 400x300+0+0 DirectClass 65kb

Then invoke transcode with the frame list as input:

$> transcode -i framelist.txt \
   -x imlist,null -g400x300 --use_rgb -z \
   -y xvid,null -f25 -o frames.avi -H 0

The meaning of the different arguments:

  • -x: the input video format (imlist) and audio format (null)
  • -g: the size of the frames (400x300 in this case)
  • --use_rgb: to indicate that the input color space is RGB
  • -z: to flip the input upside down (I need this to get the output video right)
  • -y: the output video format (xvid) and audio format (null)
  • -f: the frame rate
  • -o: the output video file
  • -H 0: disable autoprobing for the input format
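The two manual steps (looking up the frame size and typing it into -g) can be glued together in a small script. A sketch, assuming ImageMagick's identify and the frame location used above:

```shell
#!/bin/sh
# Build the frame list (frame location as in the example above).
ls /tmp/0*.png > framelist.txt
# Let ImageMagick report the frame size directly, e.g. "400x300".
SIZE=$(identify -format "%wx%h" /tmp/0001.png)
# Encode with the same options as above, with the size filled in.
transcode -i framelist.txt \
   -x imlist,null -g "$SIZE" --use_rgb -z \
   -y xvid,null -f 25 -o frames.avi -H 0
```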

Feh, yet another image viewer for Linux

1 August, 2006 - 10:12
Some years ago, when I worked solely in Windows, I happily used Irfanview as my image viewer. Under Linux I haven't found a real replacement for it yet. I have already tried Gwenview, Kuickshow, Kview, ImageMagick's display, xv and maybe some others, but they all have their little quirks and annoyances concerning speed or usability.

Here's a new candidate for my image viewer list: Feh (available for Ubuntu in the universe repository). Linux.com has a short overview of feh. It's a command line application, so it has its shortcomings on the usability front, but it is quite fast and has nice features (hot keys, different view modes, mouse control, directory traversal, ...).
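For example, a minimal sketch of how one might browse a directory tree of photos with it (the directory is just an example; see feh's man page for the full option list):

```shell
# Fullscreen slideshow of all images under ~/pictures,
# descending into subdirectories, 5 seconds per image.
feh --fullscreen --recursive --slideshow-delay 5 ~/pictures
```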

Thunderbird: install spell check dictionary (under Ubuntu)

28 June, 2006 - 18:46

I had some difficulties installing a new spell check dictionary for my mail client Thunderbird under Ubuntu. My native language is Dutch (Nederlands), and writing Dutch with an English spell checker active is not very pleasant. I tried the methods explained at http://www.mozilla.com/thunderbird/dictionaries.html , but that didn't work: I got a message that the dictionary "has been successfully installed", but that message did not reflect the truth.

I messed a bit with the installation script inside the spell-nl.xpi file and found out that the dictionary would/should be installed to /usr/lib/mozilla-thunderbird/components/myspell, a directory that is only writable by the root user. Apparently the installation procedure did not detect the write failure. I could have put the Dutch dictionary there as the root user, but I looked a bit further and found that installing the appropriate MySpell package (myspell-nl in my case) is a much cleaner/safer solution. MySpell is the spell checker of OpenOffice.org, and it is also used by Thunderbird.
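On Ubuntu this boils down to a single command (myspell-nl is the package for Dutch, as mentioned above; substitute the package for your own language):

```shell
# Install the Dutch MySpell dictionary system-wide (Ubuntu/Debian).
sudo apt-get install myspell-nl
```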

My Ubuntu Breezy to Dapper upgrade experiences

25 June, 2006 - 16:00
On Friday 23 June 2006 I decided to upgrade my Kubuntu-powered laptop (Acer Travelmate 4002WLMi) from Breezy (aka 5.10) to Dapper Drake (aka 6.06). Here is a list (under construction) of my experiences with the upgrade so far. The merits/blames are not all addressed to (K)ubuntu; some are related to the fact that the software itself is upgraded too (KDE 3.5.3 instead of 3.4, for example). Moreover, some things might be due to stupid mistakes/wrong expectations of my own.

All in all, I'm not as satisfied with this Ubuntu upgrade as I expected to be, after all those "Ubuntu Dapper is the best" vibes in the Linux community.

Extracting fonts from PDF's

19 June, 2006 - 15:56
Thanks to Planet Ubuntu NL I found a blog entry by Pascal de Bruijn with a hack to extract fonts from a PDF, using pdftops and FontForge. When I have some time, I should definitely try this.

Get working directory in Python

2 June, 2006 - 16:14
This is mainly a reminder to myself, because I always forget it and spend too much time finding it again. It's so simple/basic that it is almost embarrassing.
To get the current working directory in a Python script, like pwd (print working directory) on a Linux/Unix command line:

import os
print(os.getcwd())
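The same thing as a shell one-liner, handy when you just need it quickly (on modern systems the interpreter is typically called python3):

```shell
# Print the current working directory via Python, like pwd.
python3 -c 'import os; print(os.getcwd())'
```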

Amarok crash (problem with collection.db)

15 May, 2006 - 09:17
I'm a happy Amarok user, using it for playing/browsing/querying my music collection, but this weekend I encountered a problem: Amarok wouldn't start anymore. When I launched Amarok (version 1.3.1, that is) from the command line, I got the following error:

$> amarok
amaroK: [Loader] Starting amarokapp..
amaroK: [Loader] Don't run gdb, valgrind, etc. against this binary! Use amarokapp.
QLayout: Adding KToolBar/mainToolBar (child of QVBox/unnamed) to layout for PlaylistWindow/PlaylistWindow
kio (Scheduler): FATAL: BUG! _ScheduleJob(): No extraJobData for job!

I experimented with changing/deleting my Amarok settings in ~/.kde/share/apps/amarok and it became clear that there was some problem with collection.db. Deleting that file (after making a backup, of course) would solve the problem, but I did not want to lose the statistical information about my music collection stored in that file.
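Before experimenting with that file, making the backup is one line (the path is the one mentioned above, for Amarok 1.3 under KDE 3):

```shell
# Back up collection.db before touching it, if it exists.
DB=~/.kde/share/apps/amarok/collection.db
if [ -f "$DB" ]; then
  cp "$DB" "$DB.backup"
fi
```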

Blocking referer spam with Apache .htaccess directives

28 April, 2006 - 16:03

The logs of my (Drupal-powered) website show a lot of referer spam. Some time ago I had a statistics page which contained a listing of the last 10 pages my site's visitors came from (aka referers). Soon spambots found out and spammed this list. I made the list invisible to anonymous visitors, but spambots still target my site nevertheless (less frequently than when the list was visible, however), pollute my stats, use bandwidth, use processing power and kill those cute little puppies. Now I went a bit further to block those dirty spambots ...
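A sketch of the kind of .htaccess rules I mean, using Apache's mod_setenvif and mod_access (the referer patterns are placeholders, not real spam domains):

```apache
# Mark requests whose Referer matches a known spam pattern...
SetEnvIfNoCase Referer "casino-spam\.example" spam_ref
SetEnvIfNoCase Referer "cheap-pills\.example" spam_ref
# ...and deny them access.
Order Allow,Deny
Allow from all
Deny from env=spam_ref
```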