Uploaded a new set of changes by K to the departmental website, on HN's instructions.
Still pushing forward with MOL. Hope to be done by the end of the week.
Met to discuss two new project proposals, one of which we'll start writing up now, and the other later. I'll get to work on the first one tomorrow.
A few fixes done, and remaining things:
- The XML link no longer appears in the More Info box when there's no XML source document (as in the case of an index page, for instance). However, this now leaves the box empty some of the time, so I'll need to think of a cleverer solution: I don't want to slow down the XSLT processing by testing for the presence of everything that might be in there.
- Old render_page.php and other .php paths are now redirected to the correct page wherever possible (which covers most cases). This is rather neat. Here's how it works:
[in controller.xql]

    ...
    else if (contains($exist:path, 'render_page.php')) then
        <dispatch xmlns="http://exist.sourceforge.net/NS/exist">
            <redirect url="{request:get-parameter('id', ())}.htm"></redirect>
        </dispatch>
- Annoying xmlns:exist attributes on XHTML nodes, which were making some pages invalid, have now been removed, by adding the exist prefix to @exclude-result-prefixes on the XSLT root node.
My plan for handling variant spellings is to create an AJAX routine that lets the user retrieve a set of suggested variant spellings while on the search page, so they can see and select among candidates; it will also let users retrieve variant spellings without having to run a search.
After discovering that fuzzy-search is broken in eXist at the moment, I've found an implementation of Levenshtein Distance in XSLT by Jeni Tennison, which I've been able to re-purpose for my module, and it's working well. I now have a complete system for tokenizing a search string and retrieving all the possible variant spellings, based on an analysis of all the <ref> tags in the db. It's quite effective. It returns the variant list as plain text, which I plan to AJAX into an HTML5 <pre> element on the page, so the user can copy/paste from there into the search box if that's what they want to do.
It's a bit complicated, and took me half the day to figure out, but it seems reasonably fast. I don't want to do variant-spelling searches by default, because that will considerably increase the burden for every search, and for most, it probably won't be necessary, so I like the two-stage solution.
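For the record, the distance measure itself is the standard dynamic-programming Levenshtein algorithm. A minimal JavaScript sketch (for illustration only -- the module itself uses the re-purposed XSLT implementation) looks like this:

```javascript
// Standard dynamic-programming Levenshtein distance: the number of
// single-character insertions, deletions, and substitutions needed to
// turn string a into string b. Uses two rolling rows of the DP table.
function levenshtein(a, b) {
  let prev = [];
  let curr = [];
  // Row 0: turning the empty prefix of a into b.slice(0, j) takes j inserts.
  for (let j = 0; j <= b.length; j++) prev[j] = j;
  for (let i = 1; i <= a.length; i++) {
    curr[0] = i; // turning a.slice(0, i) into the empty string takes i deletes
    for (let j = 1; j <= b.length; j++) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      curr[j] = Math.min(
        prev[j] + 1,        // delete a[i-1]
        curr[j - 1] + 1,    // insert b[j-1]
        prev[j - 1] + cost  // substitute (free if the characters match)
      );
    }
    prev = curr.slice();
  }
  return prev[b.length];
}
```

Candidate variants would then be ranked by distance from each search token, with some small threshold (1 or 2, say) deciding what counts as a plausible variant spelling.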
Next I need to plug the AJAX stuff into it. I need a really small and efficient AJAX library -- I want to avoid jQuery bloat if possible.
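A plain XMLHttpRequest wrapper may be all that's needed. Here's a minimal sketch, assuming a hypothetical variants.xql endpoint that returns the plain-text variant list, and a pre element with id variantList (both names are mine, not the real ones):

```javascript
// Build the query URL for the (hypothetical) variant-spelling endpoint.
// Separated out from the request logic so it can be tested on its own.
function buildVariantUrl(searchString) {
  return 'variants.xql?q=' + encodeURIComponent(searchString);
}

// Fetch the plain-text variant list and drop it into the <pre> element,
// from which the user can copy/paste into the search box.
function loadVariants(searchString) {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', buildVariantUrl(searchString));
  xhr.onload = function () {
    if (xhr.status === 200) {
      document.getElementById('variantList').textContent = xhr.responseText;
    }
  };
  xhr.send();
}
```

That's about fifteen lines, versus pulling in all of jQuery for a single GET request.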
A "peripheral_vessels.xml" file was created to house vessels mentioned in files other than the despatches. For example, in Captain Cook's biography, we might mention his ship, Discovery, which does not appear in the despatches, at least not in the content currently transcribed.
As we discussed as a team, it seems odd that the online reader should encounter some vessels tagged and others not. After all, readers do not know which vessels occur in the letters and which do not. The peripheral-vessels file removes this potential for confusion.
Lastly, should a vessel that appears in the peripheral-vessels file be discovered elsewhere in the future (say, if the enclosures are eventually transcribed), then we would move its entry over to the "vessels.xml" file, a simple copy/paste operation.
I've re-written the handling of editorial and marginal notes so that notes are now fully rendered, meaning that markup inside them is processed, and they're displayed in the margin like bibliographical entries. This means they have clickable names, italicized titles, etc. The process for doing this is somewhat convoluted, because of the way documents are processed in fragments with XSLT during the XQuery processing. Essentially, a second copy of each note is created and placed in a list at the end of the document; those copies are processed during rendering but kept hidden, and the clickable note marker in the text then links to them.
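Stripped of the XQuery/XSLT specifics, the two-pass scheme amounts to the following sketch (all names and markup here are hypothetical stand-ins; "fragments" represents the document pieces the pipeline actually processes):

```javascript
// Two-pass note handling: replace each inline note with a clickable
// marker, and collect a second, fully-rendered copy of every note in a
// hidden list appended at the end of the document. The marker links to
// the hidden copy via its id.
function renderWithMarginNotes(fragments) {
  const noteCopies = [];
  const body = fragments.map(function (frag) {
    if (frag.type !== 'note') return frag.html;
    const id = 'note' + (noteCopies.length + 1);
    // Second copy of the note, rendered in full but kept hidden.
    noteCopies.push(
      '<div class="note hidden" id="' + id + '">' + frag.html + '</div>'
    );
    // Clickable marker left in the text, linking to the hidden copy.
    return '<a class="noteMarker" href="#' + id + '">*</a>';
  }).join('');
  return body + noteCopies.join('');
}
```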
Forgot to post for the last two weeks.
week of Oct 10 - Oct 14
M vac, T, W +0.5 hold fort, R, F +1.0 faculty meeting
week of Oct 17 - Oct 21
M -1.0 CSG, T +0.5 hold fort, W +0.5 hold fort, T, F +0.5 teaching class
AC reported another bug when editing poems in the database. It turned out to be virtually identical to the bug we fixed the other week, which affected authors. I made the same fixes, with minor variations:
Editing and saving a poem was throwing the following error:
Could not successfully run query (UPDATE `poems` SET `po_title` = 'Ode to Labour (From Tait\'s Magazine)x' , `po_translator` = '' , `po_date` = '1839-10-12' , `po_organ` = '1' , `po_vol` = '1' , `po_num` = '3' , `po_pages` = '12' , `po_anonymous` = '0' , `po_unsigned` = '0' , `po_pseudonym` = 'An Industrious Englishman' , `po_text` = '' , `po_origLang` = '' , `po_images` = '' , `po_illustrator` = '' , `po_illustrations` = '' , `po_links` = '' , `po_notes` = '' WHERE `po_id` = 1) from DB: Incorrect number of arguments for PROCEDURE vpn.related_poem_lookup_loop; expected 3, got 2
I looked at the db backup, and found the following:
DROP TRIGGER IF EXISTS `vpn`.`related_poem_update`;
DELIMITER //
CREATE TRIGGER `vpn`.`related_poem_update` AFTER UPDATE ON `vpn`.`poems`
FOR EACH ROW BEGIN
    DECLARE related_poem_content text;
    CALL related_poem_lookup_loop(new.po_id, related_poem_content);
END //
DELIMITER ;

DROP TRIGGER IF EXISTS `vpn`.`poem_delete`;
DELIMITER //
CREATE TRIGGER `vpn`.`poem_delete` BEFORE DELETE ON `vpn`.`poems`
FOR EACH ROW BEGIN
    DECLARE related_poem_content text;
    CALL related_poem_loop(old.po_id, related_poem_content, old.po_id);
    DELETE FROM poem_search WHERE id = old.po_id;
END //
DELIMITER ;
The first trigger calls related_poem_lookup_loop with only two arguments; the second uses three, and the procedure signature expects three (although I couldn't find the procedure itself in the db backup file -- that's worth looking into). By analogy with the other error we fixed, the third argument is presumably "exclude_id", which is passed on to the related_poem_loop function, where its purpose seems to be to let you avoid processing a poem which you're actually in the process of deleting. Since that doesn't apply to an update, we followed our logic of last time and passed -1, which won't match any po_id. Accordingly, we modified the trigger at the MySQL command line: ssh hcmc@mysqldev, then mysql -u hcmc -p vpn, which logs you into that db. Finally, this is what we ran:
DROP TRIGGER IF EXISTS `vpn`.`related_poem_update`;
DELIMITER //
CREATE TRIGGER `vpn`.`related_poem_update` AFTER UPDATE ON `vpn`.`poems`
FOR EACH ROW BEGIN
    DECLARE related_poem_content text;
    CALL related_poem_lookup_loop(new.po_id, related_poem_content, -1);
END //
DELIMITER ;
This appears to have solved the immediate problem -- we can now save changes to poems -- but we should watch out for any other issues that might have arisen out of the re-introduction of triggers.
Leaving early.