I needed to update many of the links in our wiki because a team member left, so I had to reupload all of her files to a shared service and change all the URLs to point to the new files. Unfortunately, the file service didn’t send me the former URLs of the files, so that was going to be a manual process. Our wiki had 149 pages in it. Not fun.
After a few pages of editing (and correcting the occasional typo that crept in as I changed URLs), I decided to partially automate the process. Using a smidgen of Emacs Lisp, I created a function that pasted text into a temporary buffer, performed whatever automatic fixes it could make, prompted me for any URLs it didn’t recognize, remembered the old URL – new URL mapping I defined, and copied the text back.
The function looked somewhat like this:
(defvar sacha/wiki-links nil "Associative list of (old-url . new-url).")

(defun sacha/wiki-fix ()
  "Fix wiki links in the clipboard text, prompting for unknown URLs."
  (interactive)
  (with-temp-buffer
    (yank)                              ; paste the copied wiki text
    (goto-char (point-min))
    ;; Wiki links look like [label|url]
    (while (re-search-forward "\\[\\([^|]+\\)|\\([^]]+\\)\\]" nil t)
      (let ((label (match-string 1))
            (url (match-string 2)))
        (when (or (string-match-p "viewpage" url)
                  (string-match-p "lsoohoo" url))
          ;; Remember the old-URL -> new-URL mapping the first time we see it
          (unless (assoc url sacha/wiki-links)
            (save-match-data            ; read-string can clobber match data
              (push (cons url (read-string (concat label ": ")))
                    sacha/wiki-links)))
          ;; Replace just the URL part (subexpression 2)
          (replace-match (cdr (assoc url sacha/wiki-links)) t t nil 2))))
    (kill-new (buffer-string))))        ; copy the fixed text back
I used M-x global-set-key to bind it to a convenient function key (F12, I think), and then it was just a matter of clicking on each page, clicking on Edit, typing Ctrl-C to copy the text, switching to Emacs, pressing F12, switching back to my browser, typing Ctrl-V, and saving the wiki page. I also added some lines (not shown here) to convert the previous wiki gardener's full links to intrawiki links, change server URLs, and do other fun things.
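If you want the binding to stick around, the equivalent line for your ~/.emacs would look something like this (a minimal sketch; the function-key choice is from memory):

(global-set-key (kbd "<f12>") 'sacha/wiki-fix)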
I thought about fully automating it (somehow hooking into w3, perhaps?), but that seemed like more trouble than it was worth. Besides, it was good to review all the pages.
As a result of this Emacs wizardry, processing all 149 wiki pages took me a few hours instead of a few days. Yay!
Of course, as soon as I finished the last wiki page, I found out that I needed to change the servers in the URLs. I decided to go ahead and fully automate the darn thing.
A little text munging later (replace-regexp rocks), I had a list of URLs to the different pages. I knew I needed to put in some kind of delay when loading web pages; sleep-for let me spread out my requests so I didn't hammer the server too badly. Reading the w3m.el source code turned up w3m-async-exec. Once I set that to nil, requesting web pages and running code on the results turned out to be straightforward. Selecting the right widgets was a bit of a hack (a search-forward here, a w3m-previous-anchor there), but hey, it worked. After running it manually on a few pages to confirm that it worked, I left it merrily running in the background.
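That setting is just a variable, so the relevant line looked something like this:

(setq w3m-async-exec nil) ; run w3m synchronously so results are ready before the next step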
Here it is (some tweaking required):
(defun sacha/edit-wiki-page ()
  "Visit each URL in the current buffer with w3m and fix its links."
  (interactive)
  (let ((buffer (current-buffer))
        (delay 5))                      ; seconds between requests
    (while (not (eobp))
      ;; Load the page on the current line, then follow its Edit link
      (w3m-goto-url (buffer-substring (line-beginning-position) (line-end-position)))
      (when (search-forward "Edit" nil t)
        (w3m-view-this-url)
        (when (search-forward "Minor change" nil t)
          ;; Fix the server names in the page text
          (while (re-search-forward "https?://example.com/path" nil t)
            (replace-match "http://path.example.com" t t nil 0))
          (when (search-backward "Save" nil t)
            (w3m-view-this-url))))      ; submit the form
      (sleep-for delay)                 ; don't hammer the server
      (set-buffer buffer)
      (forward-line 1))))
I’m sure this kind of automation is possible with lots of hacking in Mozilla Firefox, and I’ve seen great scripts for the Mac, too. But I know Emacs, I’m comfortable digging into source code, and I can make things work.