Little programs and tools
Bulk Conversion of Images into an OpenOffice Presentation
I wrote a little Perl script based on the OpenOffice::OODoc Perl module that takes a list of images and creates an OpenOffice presentation from them, one image full size on each slide. This script was published as an article on Linux-Gazette. Unfortunately, somewhere between the 1.x version of OpenOffice::OODoc and the newer 2.0.x version my script broke. There is an update, also published on Linux-Gazette, and you can grab two versions of the updated img2ooImpress.pl script here:
- The first version works just like the original -- it puts one image full size on each slide and adapts the aspect ratio as needed. It is intended for converting slides created by e.g. the LaTeX beamer class into an OpenOffice or PowerPoint presentation.
- The second version simply puts one image on each slide while preserving the aspect ratio, so it can be used to create an Impress or PowerPoint presentation from digital photos. Please note that the script does absolutely nothing to beautify the slides right now -- they are simply dropped onto white pages.
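The aspect-preserving placement in the second version boils down to a small fit calculation. Here is a minimal sketch of that idea (the function name and the page dimensions are my own illustration, not the script's actual code):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Scale an image (w x h) to fit a slide (pw x ph) while preserving
# the aspect ratio, and center it on the slide.
# Returns (new width, new height, x offset, y offset).
sub fit_on_slide {
    my ($w, $h, $pw, $ph) = @_;
    my $scale = ($pw / $w < $ph / $h) ? $pw / $w : $ph / $h;
    my ($nw, $nh) = ($w * $scale, $h * $scale);
    return ($nw, $nh, ($pw - $nw) / 2, ($ph - $nh) / 2);
}

# A 4:3 photo on a 28cm x 21cm Impress page fills it completely;
# a portrait photo is shrunk to page height and centered horizontally.
my @landscape = fit_on_slide(1600, 1200, 28, 21);
my @portrait  = fit_on_slide(1200, 1600, 28, 21);
```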
Both scripts have a list of prerequisites which are described in detail in the Linux-Gazette article, so please have a look there. There are also usage examples and details on the actual LaTeX/beamer to odp conversion. To get PowerPoint you'll need OpenOffice to load the odp file and export it again as ppt. Of course you lose all the navigation functionality of the beamer PDF slides -- you just get the slides.
The scripts are published under GPL V3.
Update: Rafael Laboissiere sent me back an improved version of this script which also does the pdf-to-images conversion (with helper programs). You can find his version on the Alioth website.
Script for recording internet radio
Since I like "Hörspiele" (German radio dramas) and am often not at home to listen to them live, I started recording them from the available internet streams. After a while it got annoying to end up with too many incomplete stories because the stream broke down somewhere in the middle, so I improved the script to find the streaming link by fetching the webpages and extracting the current link, and made it restart the download if it stops short of the programmed time.
The script is written in Perl and relies on wget and mplayer for the actual recording/download and on WWW::Mechanize for reading the webpages. The script handles the link extraction, restarts the download if the stream breaks down, and stops the download at the end of the programmed time.
Timed starting is done by the "at" daemon, so I would type something like this into a (bash) shell:
$ at 00:05 15.3.2008
at> getradio.pl dlf 56
at> [Ctrl-D]
to record the stereo (!) Ogg stream of the weekly (every Saturday at 00:05) crime story from "Deutschlandfunk" on Saturday, 15.3.2008. "at" will start the script at 00:05 on 15.3.2008; the script then fetches the webpages, extracts the streaming link and starts, in this case, wget to download the Ogg stream. If the wget process dies, the script restarts it with a little delay to avoid too-quick restarts. Once the end time (56 minutes in the example above) is reached, the wget process is stopped and not restarted.
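The restart logic can be sketched roughly like this (a simplified stand-in, not the script's actual code; the real script also has to kill a wget that is still running when the end time arrives, which this sketch glosses over):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Keep restarting a download command until the end time is reached.
# Returns the number of (re)starts performed.
sub record_until {
    my ($cmd, $end_time, $delay) = @_;
    my $starts = 0;
    while (time() < $end_time) {
        $starts++;
        system($cmd);    # blocks until the downloader exits or dies
        last if time() >= $end_time;
        sleep $delay;    # little delay to avoid too-quick restarts
    }
    return $starts;
}

# In the real script $cmd would be the wget call for the stream and
# $end_time would be start time + 56 * 60 for the example above.
my $starts = record_until('true', time() + 2, 1);
```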
Right now the script takes two arguments:
Usage:
getradio.pl [br2|br4|wdr3|wdr5|dlr|einslive|figaro] duration
and can handle the following German radio stations: Bayerischer Rundfunk BR2 and BR4; Westdeutscher Rundfunk WDR3, WDR5 and Einslive; Mitteldeutscher Rundfunk Figaro; and the two channels of Deutschlandradio -- Deutschlandfunk and Deutschlandradio Kultur. Extending the script to other stations based on these examples should be pretty easy once you take a peek at the HTML source of the webpage that lets you start the internet radio stream.
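The per-station link extraction is essentially a pattern match over the fetched page. A minimal sketch (the HTML snippet and the example.invalid URL are made up; in the real script the page comes from WWW::Mechanize and each station needs its own pattern):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Pull the first stream URL out of a station's "listen live" page.
# An inline snippet stands in for the fetched page here.
my $html = <<'HTML';
<a href="/programm.html">Programm</a>
<a href="http://example.invalid/live.ogg">Livestream (ogg)</a>
HTML

sub extract_stream_link {
    my ($page) = @_;
    return $page =~ m{href="(https?://[^"]+\.(?:ogg|m3u|asx))"} ? $1 : undef;
}

my $link = extract_stream_link($html);
```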
If you happen to preprogram a lot of these for the coming month, you'll find the usual output of atq rather unhelpful. Another little script gathers all the relevant information from the programmed getradio jobs (start time, channel, duration, job ID for canceling, where the files will end up, ...), outputs them in chronological order and warns of duplicates:
start date and time  chan  day  ID    dur  target dir
2008-03-15 20:05:00  dlf   Sat  1004  115  /home/khh
2008-03-15 22:00:00  ????  Sat  1013  ??   /home/khh  <- not a proper entry (unknown channel, not getradio.pl, ...)
2008-03-16 15:00:00  br2   Sun  1011  60   /home/khh
2008-03-19 20:30:00  br2   Wed  1012  60   /home/khh
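The sorting and duplicate check behind a listing like the one above can be sketched as follows (the entry format and sample data are illustrative, not the script's real internals; the real script gets its information from atq and at -c):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Each entry: [ISO start timestamp, channel, job id, duration, target dir].
my @jobs = (
    ['2008-03-16 15:00:00', 'br2', 1011,  60, '/home/khh'],
    ['2008-03-15 20:05:00', 'dlf', 1004, 115, '/home/khh'],
    ['2008-03-15 20:05:00', 'dlf', 1020, 115, '/home/khh'],
);

# Chronological order: ISO timestamps sort correctly as plain strings.
my @sorted = sort { $a->[0] cmp $b->[0] } @jobs;

# Duplicates: the same start time and channel programmed twice.
my %seen;
my @dups = grep { $seen{"$_->[0] $_->[1]"}++ } @sorted;
warn "duplicate job: @{$_}[0,1] (id $_->[2])\n" for @dups;
```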
I had to modify the script when switching to Ubuntu; maybe the old version is helpful for somebody as well.
The license for the scripts is GPL V3.
E-Mail: kh1@khherrmann.de
Last modified: Sat Mar 15 18:50:56 CET 2008