Archives for: May 2012

31/05/12

Permalink 04:01:51 pm, by esaint, 73 words, 85 views   English (CA)
Categories: Activity log; Mins. worked: 10

Update from E.S on May 31, 2012

1. ES met with Pierre for recording, on Friday, May 25. All 8 videos have been edited. 2 have been chosen to go on the site. Transcripts still to be done.
2. The next recording session was supposed to happen on Friday, June 1. However the subject cancelled our meeting. To be rescheduled later in the month (i.e. mid-June 2012).
3. 15 new transcripts have been added to Oxygen. CC has agreed to upload them to the site. To be continued.

Permalink 03:24:42 pm, by mholmes, 37 words, 140 views   English (CA)
Categories: R & D, Activity log, Documentation; Mins. worked: 45

All data from old machine retrieved

I've now got all important data off my old Lucid box (the Drive2 data) onto my new machine. We can now take Drive2 out of that machine, use it in a cradle, and repurpose the machine.

Permalink 03:18:25 pm, by mholmes, 235 words, 133 views   English (CA)
Categories: Servers, R & D, Activity log, Documentation; Mins. worked: 180

More fixes, still not quite there with Jenkins...

Problems solved this afternoon:

  • rnv was failing to download and build; ampersands in the URL query string needed to be backslash-escaped, and the file name needed to be specified with the -O flag (see the sketch after this list).
  • The log parse rules were not actually being used, even though they were downloaded, and were referred to in the job configs. I think this is because a file called hudson.plugins.logparser.LogParserPublisher.xml also needed to be there, to specify that the other file exists somehow. That's the theory anyway; not tested yet.
  • Emailing could never have worked without going to a lot of trouble to mess with Jenkins's setup, so I'm simply sidestepping it and leaving it up to the user to set it up if they want to. I'm removing my and SR's email addresses from the job configs.
  • The Priority Sorter is working, but didn't actually solve the sequencing problem for the first build. Jinks managed to complete OxGarage, Roma and Stylesheets1 before Stylesheets completed, so it just had time to start and fail on P5-Test before the required artifacts were there. I'm considering making P5-Test a downstream job from Stylesheets, which should solve it once and for all.
  • Core files are now in the TEI repo, as they should be, and the hudson log parse rules have been moved from P5/Source to Documents/Editing/Jenkins, where all the rest of the stuff is.
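
For the record, the fix described in the first bullet above takes roughly this shape (the query string here is hypothetical; the real download URL is the one used later in the build script):

wget http://downloads.sourceforge.net/project/rnv/Sources/1.7.10/rnv-1.7.10.zip?r=\&use_mirror=master -O rnv-1.7.10.zip

Backslash-escaping the ampersand stops the shell from treating the rest of the command as a separate background job, and -O gives the download a sensible file name instead of one ending in the query string.
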
Permalink 03:10:25 pm, by mholmes, 163 words, 133 views   English (CA)
Categories: Activity log, Tasks; Mins. worked: 45

DB Permissions; appearance of links

CB now has a user id on the eXist db, and is a member of the editors group. In the process of testing this, we discovered that lots of files and dirs in /db/data did not have group write, or were assigned to the dba group, which meant that he couldn't overwrite them. This is fixed for most files, but we need to watch out for it. I think it may happen when I upload stuff as admin; although admin is in the editors group, it's also dba, and it may cause uploaded data to be set to group dba.

I have a task, which needs to be clarified a bit before I start work on it. We have two types of links: those which open up popup windows, and those which navigate off the page or off the site. It would help users if they could tell the difference before they click on one. A number of options are under consideration.

Permalink 02:49:42 pm, by sarneil, 133 words, 110 views   English (CA)
Categories: Activity log; Mins. worked: 120

tomcat eXist incompatibility resolved?

Found out from the list that the betterform extension to eXist is making a call to a SAXON method that doesn't exist in my saxon parser (though I'm using the one suggested, so I don't know why that method isn't there). Anyway, by removing all the configuration specifications for betterform from the WEB-INF/web.xml file, I can get the vanilla install of eXist 2.1 to launch in Tomcat. That's progress. I then trashed the betterform folder and that introduced no problems. Martin suggested that just getting rid of the betterform folder might have been enough, but I haven't tried that. Next I've got to try to create an instance of the francotoile as an eXist 2.1 app, remove any betterform config stuff and folder, put that into the tomcat environment and hopefully it will run.
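
For reference, "the configuration specifications for betterform" means the XFormsFilter blocks in WEB-INF/web.xml, something along these lines (a sketch only; the exact class name and mappings vary by eXist release):

<filter>
    <filter-name>XFormsFilter</filter-name>
    <filter-class>de.betterform.agent.web.filter.XFormsFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>XFormsFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
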
Permalink 10:07:47 am, by mholmes, 152 words, 519 views   English (CA)
Categories: Announcements; Mins. worked: 120

Jenkins build progress

  • Added the OxGarage messages to the log parser rules file, so they no longer show up as errors.
  • Re-organized the structure of the build process so EULAs come first, and the rest of the build should be able to proceed unattended, using the -y flag on apt-get install commands.
  • Moved the requirement for an Oxygen license to the beginning of the process, so that the user must provide it before the build can proceed. This prevents the Stylesheets build from failing first time out, because of the missing license.
  • Added more useful information messages, particularly before the build.
  • Tested the Priority Sorter plugin for Jenkins, which should enable us to put P5 builds at the end of the queue, so that we can be sure that Stylesheets will build first.

The last thing to do is to XSLT the job configs to insert the user's email address instead of SR's and mine.
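
A minimal sketch of how that substitution could work, assuming the job configs carry the address in a recipients element (that element name is a guess, not verified against a real config.xml):

<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:param name="userEmail"/>
  <!-- identity transform: copy everything through unchanged -->
  <xsl:template match="@*|node()">
    <xsl:copy><xsl:apply-templates select="@*|node()"/></xsl:copy>
  </xsl:template>
  <!-- except the recipients list, which gets the user's own address -->
  <xsl:template match="recipients">
    <recipients><xsl:value-of select="$userEmail"/></recipients>
  </xsl:template>
</xsl:stylesheet>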

30/05/12

Permalink 05:43:06 pm, by mholmes, 21 words, 40 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 166 + 2 = 168 hours G&T

Testing server builds takes forever -- you make one small change and then you have to run the whole script again...

Permalink 05:13:18 pm, by mholmes, 129 words, 141 views   English (CA)
Categories: Servers, R & D, Activity log, Documentation; Mins. worked: 45

rnv problem solved

It turns out that the response at the command line when you try to run rnv is completely misleading. It leads you to believe that the executable is missing, whereas it's clearly there. Actually the problem is caused by the fact that the rnv installed from the TEI repo is 32-bit, and won't run on a 64-bit kernel.
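
A quick general-purpose check for this kind of thing (not from the original troubleshooting, just a diagnostic sketch): ask file what the binary actually is.

file /usr/bin/rnv
# reports something like: ELF 32-bit LSB executable, Intel 80386 ...
# on a 64-bit kernel with no 32-bit loader installed, running it then fails
# with the misleading "No such file or directory"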

So instead of installing it from the repos, I'm downloading and building it instead:

apt-get install libexpat-dev
wget http://downloads.sourceforge.net/project/rnv/Sources/1.7.10/rnv-1.7.10.zip
unzip rnv-1.7.10.zip
cd rnv-1.7.10
./configure
make
make install

In view of this, I don't think rnv should be in the TEI repos at all, unless it can be done in such a way as to provide the right build for the host architecture.

Permalink 04:42:07 pm, by sarneil, 367 words, 101 views   English (CA)
Categories: Activity log; Mins. worked: 300

namespace issues in kwic - eXist 2.1 differs from 1.4

After much testing and experimenting, I think I've got the search page working in eXist 2.1 (running under jetty). The newer version of eXist (and/or the lucene extensions) handle default or implicit namespaces differently.

Original code looked something like this:

for $match in $utter//exist:match
let $summary := kwic:get-summary($expanded, $match, <config xmlns="" width="40"/>)
for $line in $summary//self::p
return

The p element is introduced by the kwic:get-summary function, and the question is what namespace that element is deemed to be in. In the older version of eXist, the code above worked. In eXist 2.1, that code returned nothing. I don't know what namespace that p is in, so Martin suggested the wildcard namespace:

for $match in $utter//exist:match
let $summary := kwic:get-summary($expanded, $match, <config xmlns="" width="40"/>)
for $line in $summary//self::*:p
return

and that worked.

I had to work through similar issues with the span elements that kwic embeds within the p element. They too are in some limbo namespace, so my code had to include the wildcard namespace selector. In addition, the span output to the page included an xmlns="" attribute, and that caused the css to fail to select it.
Original code (worked in eXist 1.4)

for $line in $summary//self::p
let $before := $line/span[@class='previous']
let $match := $line/span[@class='hi']
let $after := $line/span[@class='following']
return
<li>
<a href="player.xql?id={$id}&amp;start={$startTime}" title="{$start}"> {$before} {$match} {$after} </a>
</li>

Modified code (works in eXist 2.1)

for $line in $summary//self::*:p
let $before := $line/*:span[@class='previous']
let $match := $line/*:span[@class='hi']/text()
let $after := $line/*:span[@class='following']
return
<li>
<a href="player.xql?id={$id}&amp;start={$startTime}" title="{$start}"> {$before} <span class="hi">{$match}</span> {$after} </a>
</li>

As the span is coming from the lucene kwic extension, it will be only text, so I don't think explicitly grabbing only the text should cause me any problems, and it allows me to then code in the containing span, which renders properly on the page.

Permalink 04:30:33 pm, by sarneil, 97 words, 71 views   English (CA)
Categories: Activity log; Mins. worked: 60

add wildcards at start of search string

I discovered that I can append an option to the ft query that allows wildcards at the start of the search string. I added a searchClauseOptions variable

let $searchClauseOptions := '<options><leading-wildcard>yes</leading-wildcard></options>'

and then passed that in as an argument to the search clause:

fn:concat('[tei:text/tei:body[ft:query(.,"', $searchterm, '",', $searchClauseOptions, ')]]')

There's a lot of escaping of string delimiters as that search clause itself ends up as a string which is eval'd to generate the results.
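
Putting the pieces together, the pattern looks roughly like this (a sketch: the collection path, element names and the eval call are assumptions based on the description above, not the actual search page code):

let $searchClauseOptions := '<options><leading-wildcard>yes</leading-wildcard></options>'
let $clause := fn:concat('[tei:text/tei:body[ft:query(.,"', $searchterm, '",', $searchClauseOptions, ')]]')
let $queryString := fn:concat('collection("/db/data")//tei:TEI', $clause)
return util:eval($queryString)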

Permalink 04:25:36 pm, by sarneil, 105 words, 107 views   English (CA)
Categories: Activity log; Mins. worked: 180

change lucene analyzer

The system/config/db/site/data/collection.xconf file controls which lucene analyzer to use when indexing the data collection. It was set to use the WhitespaceAnalyzer. When I changed that to use the StandardAnalyzer instead and re-indexed the files, then the upper-case/lower-case issues went away.
I've done a bit of testing and the change does not seem to have introduced any problems, so I'm going to stick with it.
I also ran across the SnowballAnalyzer, which looks interesting, but I'll postpone investigating that until I get eXist working within Tomcat, as the arrangement with Jetty is annoying - particularly the implications for SVN.
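
For reference, the analyzer switch amounts to one line in that collection.xconf; the indexed qname below is a placeholder rather than the project's real configuration:

<collection xmlns="http://exist-db.org/collection-config/1.0">
    <index xmlns:tei="http://www.tei-c.org/ns/1.0">
        <lucene>
            <!-- was org.apache.lucene.analysis.WhitespaceAnalyzer; StandardAnalyzer lower-cases tokens at index time -->
            <analyzer class="org.apache.lucene.analysis.standard.StandardAnalyzer"/>
            <text qname="tei:u"/>
        </lucene>
    </index>
</collection>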

Permalink 04:19:57 pm, by sarneil, 250 words, 84 views   English (CA)
Categories: Activity log; Mins. worked: 60

install problem possibly caused by SAXON version mismatch

I poked around the log files for a while to see what I could see about the problems launching eXist 2.1 in Tomcat. Someone on the eXist list posted the following in response to the log excerpts showing the errors. I haven't yet taken any action on it.

From this [see below] I read that one of exist-db extensions (betterFORM) tries to initialize the SAXON xslt library without success…. a method is missing.

Since the error is not about a missing class, but about a missing java method , I think a different (older or newer) version of saxon.jar is installed.

The solution is….. either to change saxon.jar (endorsed directory or somewhere else) to the version expected by betterFORM (actually bF depends on version 9.2.x.y ; for a newer version the bF code needs to be changed), or to disable bF in the configuration files [need to check; it is in web.xml I think]

The localhost log includes this:

May 30, 2012 9:20:13 AM org.apache.catalina.core.StandardContext filterStart
SEVERE: Exception starting filter XFormsFilter
javax.servlet.ServletException:
de.betterform.xml.config.XFormsConfigException:
java.lang.reflect.InvocationTargetException
at de.betterform.agent.web.filter.XFormsFilter.init(XFormsFilter.java:71)
and
Caused by: de.betterform.xml.config.XFormsConfigException:
java.lang.reflect.InvocationTargetException
at de.betterform.xml.config.Config.initSingleton(Config.java:135)
Caused by: java.lang.NoSuchMethodError:
net.sf.saxon.sxpath.IndependentContext.setFunctionLibrary(Lnet/sf/saxon/functions/FunctionLibrary;)V

Permalink 04:16:36 pm, by mholmes, 604 words, 191 views   English (CA)
Categories: Servers, R & D, Activity log, Documentation; Mins. worked: 240

More progress with Jinks 2012 build

I now have a more convenient setup for creating and testing Jinks builds. This is how it works:

  • There's a sort of "seed" vm called PreJenkins in /home/mholmes/VirtualBox VMs/
  • That's a fully-updated vanilla install of Precise server.
  • In my tei/jenkins directory, there's a script called vboxmanage_Jenkins2012A.sh. When you run that script, it clones the vanilla server to create a new VM called Jenkins2012A. It also configures that VM so that you can see its port 8080 (Jenkins) on the host's 7070, and so that you can ssh into it on port 2012 (see the sketch after this list).
  • After running the above script, you start the newly-created VM. You log in as hcmc, and sudo su, then you run a script you'll find there called make_jenkins.sh (also in my local tei/jenkins directory on my host).
  • The make_jenkins.sh script first scps a copy of the jenkins_builder_script_2012.sh from the host into the hcmc home directory (pulling this in, rather than having it there in the seed, enables me to tweak the script easily while working on it). It runs the script.
  • At this point, you're seeing what the ordinary user of the jenkins build script would see.
  • Once the build is complete, and you exit from the build script, make_jenkins.sh scps a copy of our Oxygen license from the host into the right location, so that Oxygen is registered.
  • Now Jenkins should be running on the new machine. You can now run a script on the host called connect_to_Jenkins2012A.sh, which will send Firefox to the right port, and ssh into the machine. The connection to the machine is done like this:
    ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -l hcmc -p 2012 localhost
    
    which precludes checking of the machine's key, or storage of its key; this avoids the problem where every time you clone the pre-seed to create a new VM, it has a different key, and the host complains that the key in known_hosts has changed.
  • Then you wait and watch to see if Jenkins builds the jobs OK.
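
For anyone reproducing this, the clone-and-forward step referred to above boils down to something like the following (a sketch; the VM names and ports are as described, but the vboxmanage_Jenkins2012A.sh script itself is the authority):

VBoxManage clonevm "PreJenkins" --name "Jenkins2012A" --register
VBoxManage modifyvm "Jenkins2012A" --natpf1 "jenkins,tcp,,7070,,8080"
VBoxManage modifyvm "Jenkins2012A" --natpf1 "ssh,tcp,,2012,,22"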

This is what remains to be done:

  • Fix the rnv problem. It's installed now from the TEI packages, and is clearly there, but make and bash are unable to find it:
     rnv
    -bash: /usr/bin/rnv: No such file or directory
    
    Very weird indeed. There are executable copies in /usr/bin and in /usr/bin/X11, just like on my local machine, where they work fine.
  • Figure out what to do about job configs which include my and SR's email addresses. We should probably XSLT the config.xml files during the setup process to replace SR's email with the user's own, which we can ask for.
  • Solve the problem whereby P5-Test builds and fails before Stylesheets has built for the first time. Stylesheets needs to build once successfully before the P5 jobs can build; thereafter, they're independent. There must be a way to force Jenkins to build Stylesheets before P5-Test the first time out.
  • List the error messages and warnings that show up ONLY on the first checkout/build of the jobs. There are lots of these in OxGarage, and possibly elsewhere. These need to be put into the hudson-log-parse-rules file so that a first-time user of the script is not worried by a lot of errors that will never show up again, and aren't relevant.
  • Document and publish the script, by updating the page on the TEI wiki.
  • Get a real VM to replace our current teijenkins, and when it's working, repoint the domain and bring the old one down.
Permalink 03:07:18 pm, by mholmes, 157 words, 76 views   English (CA)
Categories: Activity log; Mins. worked: 30

Fixed a sorting issue, one more remaining

CB reported that on the People index page, the name Æthelred II was sorting at the end. The sort is done in XQuery, and I've fixed this issue by adding a collation parameter to the order by clause:

order by $p/persName/reg/text() collation "?lang=en&amp;strength=primary&amp;decomposition=full"

Another issue is that Disraeli and D'Israeli are out of order; the apostrophe sorts before the letters. This can't be solved with a standard collation, so I could either strip the punctuation prior to the sort:

order by $p/persName/reg/text()/replace(., "'", '') collation "?lang=en&amp;strength=primary&amp;decomposition=full"

(untested), which would probably slow the page down noticeably, or write a custom collation and move the sort into XSLT (very disruptive and also slow). If it proves important, though, I'll have to do one of these. The former, for preference, if it works.

Permalink 08:55:55 am, by mholmes, 33 words, 382 views   English (CA)
Categories: Activity log, Announcements; Mins. worked: 30

CO 60 Vol 10 page images added to the Colonial Despatches collection

767 page images for CO 60 Vol 10 (in three different sizes) have been added to the collection. These cover the 1861 Despatches from London, January to August. These will now be linked into the transcription documents.

29/05/12

Permalink 05:43:02 pm, by mholmes, 13 words, 72 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 164 + 2 = 166 hours G&T

Trying to get stuff off the table so I can work on MoEML.

Permalink 05:38:34 pm, by mholmes, 24 words, 67 views   English (CA)
Categories: Activity log; Mins. worked: 60

First timesheets done

I've submitted the first set of timesheets. Takes a while to get all the info together. Watching to make sure they get processed correctly.

Permalink 05:37:25 pm, by mholmes, 83 words, 100 views   English (CA)
Categories: Activity log; Mins. worked: 240

ContentDM metadata now imported

Today's progress:

  • Finished the XSLT, and ran it against the whole collection.
  • Went through those files with no matches in the ContentDM repo, and manually ported over info from similar files, and elaborated what was there based on map legends etc.
  • Tweaked the XSLT to add a link to the ContentDM repo.

Still to do: rework the processMapBibl template so that it really uses all of the info that's now there (author, publisher, etc. etc.). This should probably be done with regular templates.

Permalink 05:34:39 pm, by mholmes, 97 words, 184 views   English (CA)
Categories: Labs, Activity log, Documentation; Mins. worked: 60

Problems with LibreOffice Writer

SM working on MoL has been seeing LibreOffice Writer crash frequently when editing a complex document with comments. I've worked with similar documents without problems, though.

I looked around for alternative word-processors, but neither Calligra Words nor AbiWord handle comments. Noting that Writer works fine for me on Precise + Gnome 3, and fine for CB with Gnome 2, I've now installed Gnome Shell on Radish, and we'll see if that solves the problem. Never liked Unity anyway.

In the process of doing this, I noticed that an HCMC style deb is failing to update itself when doing apt-get upgrade.

Permalink 05:31:35 pm, by mholmes, 41 words, 68 views   English (CA)
Categories: Activity log; Mins. worked: 30

Fixed bug in display of people list

Links on the people list weren't working, due to a missing block of XHTML that should have been supplied by the XQuery. Due to a problem with Flow, I didn't know about this bug report until today, but it's now fixed.

Permalink 01:22:07 pm, by jnazar, 4 words, 90 views   English (CA)
Categories: Activity log; Mins. worked: 180

"Interviews" - project (Part 2)

2 more interviews completed today.

Permalink 10:45:16 am, by Cameron, 418 words, 148 views   English (CA)
Categories: Activity log, Tasks, Planning Notes, Encoding Notes; Mins. worked: 1

First few days on the job

Greetings all,

I am really enjoying my time with MoEML so far. While working through the BIBL1 and PERS1 files I have noticed a few things that we will need to be thinking about in the near (and distant) future. The following is a log of my tasks so far, including notes about what we may need to think about looking forward.

My first task was to delete the dates and names of contributors who have added files to BIBL1 in the past. JJ decided that there was no longer any need for this info. Working in this file, I noticed numerous formatting inconsistencies (arising from different people adding different info at different times with different MLA conventions). I look forward to amending these errors in the coming weeks.

My second task was to ensure that all links in PERS1 were updated. The ODNB had made some changes to their website and so most of our links were broken. Again, I noticed many formatting/style inconsistencies that I am eager to amend in the coming weeks.

My third task was to add the medium (i.e. Print, Web) to each BIBL1 entry. This is a newer MLA convention and has not been used consistently since the website launched. While adding these, I made some changes to the more easily-spottable inconsistencies. This got me pretty excited about a large-scale tidy-up! I am hoping to have this "spring cleaning," as I am calling it, finished by mid-June.

Since I will be spring cleaning for the next few weeks, JJ has assigned me the task of creating an updated style guide for the website. This will ensure that everyone adding information to these files continues to follow a correct and consistent format. I cannot stress enough the importance of consistent formatting for even the most trivial matters like using an en-dash instead of a hyphen between a person's life dates. If we are to continue to assert MoEML as a serious academic publication, we cannot allow formatting errors to persist. I (think I) have attached a draft of the style guide (which is also available in the svn file "documentation," in case the file addition backfired) and I would love to hear your input. It is not yet completely implemented, but please begin referring to it when inputting information. If you have any questions about formatting, please don't hesitate to contact me.

I think that's all for now. Thanks to JJ and MH for all of their guidance so far.

CB

Permalink 09:47:25 am, by mholmes, 40 words, 255 views   English (CA)
Categories: Activity log, Tasks; Mins. worked: 60

Added XSLT tweaks to handle tei:ref and empty paragraphs

Fixed the two problems reported in the last two posts, and provided a new version to the user who reported them. If everything works well, I'll do an official release. Putting this on as a task in case I forget.

28/05/12

Permalink 03:18:37 pm, by mholmes, 302 words, 145 views   English (CA)
Categories: Servers, R & D, Activity log; Mins. worked: 180

Great progress on the Jinks 2012 build

I've been working through the problems on the Jinks build with SR, and we're close to a working system. This is where we're at right now:

  • Problems caused by the absence of a JDK have been solved by apt-get install openjdk-6-jdk (so we apparently don't actually need Oracle).
  • There are still a few error messages about not being able to write to /root/.java/.com.oxygenxml.rk, which I'm puzzled by -- I even see them if I make the file world-writable. I'm inclined to suppress these messages in the log-parse rules.
  • SR has added rnv to the TEI packages, so I'm now installing it as part of that group of installs; no need to get it from SourceForge.
  • We still have a problem with some error messages in the OxGarage build, but they don't seem to be indicating a serious issue.
  • All of the P5 jobs have now completed apparently successfully, but there is one set of errors on TEIP5:
    tar: p5odds.rng: Cannot stat: No such file or directory
    tar: p5odds.rnc: Cannot stat: No such file or directory
    tar: p5odds-examples.rng: Cannot stat: No such file or directory
    tar: p5odds-examples.rnc: Cannot stat: No such file or directory
    tar: Exiting with failure status due to previous errors
    and in fact none of those four files are in the release/xml/tei/odd directory of the archived artifacts, whereas they are there in my current Jenkins artifacts. So I think this build should be tagged as a failure. As it is, although the console says it will be tagged as a fail, it isn't. This will need a bit of investigation.
  • I've created a vanilla Precise Server install which I can keep updated and use as the basis for script testing, by cloning it each time.
Permalink 01:54:29 pm, by jnazar, 9 words, 90 views   English (CA)
Categories: Activity log; Mins. worked: 180

"Interviews" - project (Part 2)

2 more interviews conducted today. More forthcoming through June 15th.

Permalink 11:52:35 am, by sarneil, 121 words, 223 views   English (CA)
Categories: Activity log; Mins. worked: 60

etcl : confirm problems with links

Compared the code in the twentyeleven theme and the etcl theme that is generating the links on the pages. The code is different, but the output from each is identical (other than the order of the attributes), so I'm not sure why the navigation fails. I guess the ETCL theme must parse the arguments in the URL differently than the twentyeleven theme does, but that doesn't make sense, as that would be the DB doing that parsing and it hasn't changed.

TwentyTen theme:
Brown Bag Seminar: Close Reading, Distant Reading and in Between
http://hcmc.uvic.ca/~etcl/wordpress/blog/2012/03/13/brown-bag-seminar-close-reading-distant-reading-and-in-between/

ETCL theme:
Brown Bag Seminar: Close Reading, Distant Reading and in Between
http://hcmc.uvic.ca/~etcl/wordpress/blog/2012/03/13/brown-bag-seminar-close-reading-distant-reading-and-in-between/

Permalink 11:04:27 am, by Greg, 223 words, 153 views   English (CA)
Categories: Labs, Activity log; Mins. worked: 60

B045 setup for presentations

Tested projector with new build/Intel Graphics and wireless keyboard.
Turns out that the projector looks fabulous as long as you use its native resolution. The bad news is that its native resolution is 1024x768. Any other 4:3 resolution that works with the Intel Graphics chip looks terrible on the screen.
To get mirroring to work, you need to plug the HDMI plug in to the DVI port on the computer (via the HDMI-DVI converter) and use a VGA cable for the monitor itself. Any other combo is fraught.
Wireless keyboard/trackpad has a nice feel to it but is a real pain to use as it loses its connection regularly. Pressing the connect button MAY work, but it doesn't always. I tried elevating the receiver, which helps some (but not enough). Martin replaced the batteries and reports that performance is better.

If you want to do a presentation in B045 using our equipment:
Projector plugs in to the orange DVI converter and then in to the DVI port on the computer. Monitor plugs in to the computer via the VGA cable.
Connect wireless keyboard: plug the receiver in to the computer using a USB extension cable and elevate the receiver so it's within line-of-sight of the keyboard. Turn on the (keyboard) power and depress the connect button. It should work more-or-less right away.

Permalink 10:42:53 am, by sarneil, 289 words, 513 views   English (CA)
Categories: Activity log; Mins. worked: 90

etcl : changing upload limits in WordPress multi-user

The ETCL/INKE is a multi-user WordPress site. There are global settings which control the total size of uploaded files and the maximum individual file size. These limits apply to all sites in the instance (and I'm guessing the total is an aggregate for all the sites, rather than for each, but I'm not sure).

To get to the settings
- log in to admin interface
- go to My Sites / Network Admin / Dashboard in the grey navbar
- choose settings from the menu on the left
- scroll down to Upload Settings.
(e.g. http://hcmc.uvic.ca/~etcl/wordpress/wp-admin/network/settings.php)

There's a "limit total size of uploads" checkbox (to enable/disable limit) and textfield (to specify limit).
There is also a "max upload filesize" textfield (which is currently set to 1500 KB or 1.5 MB)
Don't forget to save your modifications.

The maximum size of uploadable file that appears in the media / add new page (e.g. http://hcmc.uvic.ca/~etcl/wordpress/wp-admin/media-new.php) is 1 MB. Not sure why that is not the 1.5 MB specified in the network settings.
If you get very close to the total size allocated, the max uploadable file size displayed in the add media page is whatever space you have left within the limit.

Finally, this wasn't a factor in this instance, but php has limits too (which you can see by calling phpinfo):
post_max_size 150M
upload_max_filesize 150M
I'm not sure how to modify those values. As an experiment, I tried adding
php_value upload_max_filesize 100M
php_value post_max_size 100M
in the .htaccess file at the root of wordpress, but that threw errors. I didn't try a php.ini file within the account.

Permalink 10:00:43 am, by mholmes, 189 words, 219 views   English (CA)
Categories: Activity log; Mins. worked: 210

Mapping between ContentDM metadata and TEI

This is the complete mapping for copying metadata over from the ContentDM records to our TEI files:

  • dc:title (multiple): titleStmt/title, bibl/title.
  • dc:description[1]: notesStmt/note (replace the first one).
  • dc:description[preceding-sibling::dc:description][string-length(.) gt 50]: notesStmt/note (add new ones). These are the textual descriptions; the shorter ones are various scale and coordinate details.
  • dc:description[matches(., "^[0-9]+[ 0-9'NW\-\./]+$") and string-length(.) gt 3]: bibl/geo. These one-line expressions of geo locations will have to be further processed into something we can use to map to Google. They're not really in consistent format.
  • dc:subject (multiple) = notesStmt/note type="subject".
  • dc:creator = bibl/author.
  • dc:contributor[not(preceding-sibling::dc:creator)][not(starts-with(., 'Fund'))] = bibl/author.
  • dc:language == 'eng' : bibl/@xml:lang = 'en'
  • dc:language == 'spa' : bibl/@xml:lang = 'es'
  • dc:contributor[starts-with(., 'Fund')] = funder.
  • dc:publisher = bibl/publisher
  • dc:relation = bibl/publisher (really should be repository, but we don't want to get into having a full msIdentifier).
  • dc:identifier[starts-with(., 'http://contentdm')] = idno type="contentdm".

I'm now halfway through the XSLT which will integrate the metadata into the TEI files. Should be done tomorrow.
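
By way of illustration, each line of the mapping above becomes a template more or less like this one for dc:creator (a sketch; in the real stylesheet the output of course has to land inside the right bibl context):

<xsl:template match="dc:creator" xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns="http://www.tei-c.org/ns/1.0">
  <author><xsl:value-of select="normalize-space(.)"/></author>
</xsl:template>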

Permalink 07:57:38 am, by mholmes, 24 words, 239 views   English (CA)
Categories: Activity log, Tasks; Mins. worked: 15

Another task: add proper handling for ref/@target to XSLT

The <ref> element's @target attribute should be converted to a proper XHTML link when you export to create a web view.
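
Something along these lines in the converter should do it (a fragment to drop into the existing stylesheet; assumes the usual tei prefix binding and XHTML output):

<xsl:template match="tei:ref[@target]">
  <a href="{@target}"><xsl:apply-templates/></a>
</xsl:template>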

25/05/12

Permalink 03:17:05 pm, by sarneil, 44 words, 88 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

SA G&T 18.5 - 2.5 = 16 hours

Just noticed that I haven't posted G&T for 4 weeks. I estimate +1.5 for the first week, +1.5 for the second week, +0.5 for the third week and +1 for the fourth week. Deduct 7 hours for the pro-d day on the 18th. Brings me current to April 25.

Permalink 03:04:11 pm, by sarneil, 42 words, 110 views   English (CA)
Categories: Activity log; Mins. worked: 30

phil : small adjustments to undergrad courses page

Made a few modifications to the undergrad index page (added links to pdf checklist files provided by dept) and the undergrad courses page (made controls at top of page for selecting which term's courses to display more obviously controls rather than text).
Permalink 03:02:16 pm, by sarneil, 121 words, 79 views   English (CA)
Categories: Activity log; Mins. worked: 60

malahat : how to handle gift subscriptions

Trying to figure out the best way of handling gift subscriptions (i.e. person 1 at address 1 buys the subscription, but it goes to person 2 at address 2). After some experimenting, we discovered that the ref1 through ref5 values don't appear in the transaction details field in the admin interface, but the trnComments field does. The ref1 through ref5 values do show up in the confirmation notice that is sent, so they will use that to alert themselves to special treatment for gift subscriptions. They are still thinking about using the "shipping" address fields for the giftee's details, rather than asking for them separately in the comments field and leaving the user wondering whether or not to fill in the "shipping" fields.

Permalink 02:58:20 pm, by sarneil, 138 words, 99 views   English (CA)
Categories: Activity log; Mins. worked: 120

malahat : add code to basket based on value in get array

The Malahat Review wants to create two pages with special offers for contributors (subscriptions and issues) which connect to the shopping basket as usual. Problem is how to allow a user coming from either of those pages to get back from the basket to either of those pages, but not allow anyone getting to the basket from the normal products pages to get to the special offer pages.
I use the reserved field names in the get array to submit a value from the special pages to the basket and then added javascript code on the basket page so that if it finds the right value in the right variable in the get array, it then displays buttons which link back to the special pages. Obviously this could be extended for other special navigation requests in the future.
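
The basket-page check is roughly this (a sketch; the parameter name, expected value and element id are placeholders, not the real ones):

function getQueryParam(name) {
    var match = window.location.search.match(new RegExp('[?&]' + name + '=([^&]*)'));
    return match ? decodeURIComponent(match[1]) : null;
}
// only reveal the "back to special offer" buttons when the expected value arrives
if (getQueryParam('ref1') === 'contributor-offer') {
    document.getElementById('backToOfferLinks').style.display = 'block';
}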

Permalink 02:32:04 pm, by mholmes, 2 words, 58 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 165 - 1 = 164 hours G&T

Leaving early.

Permalink 02:30:26 pm, by mholmes, 1005 words, 376 views   English (CA)
Categories: Servers, R & D, Activity log; Mins. worked: 120

Work on new TEI Jenkins server based on Precise

Started testing on a VM, running my old script bit by bit and watching the results. I've got as far as a working Jinks server, and one or two builds work, but there will have to be many changes. These are my notes so far:

First time building Jenkins with Precise, I'm running only what I think I need to run, and keeping notes.

#START UNTESTED BIT -- TEST LATER ON DEFAULT INSTALL.#
#First do updates.
echo "Doing system updates before starting on anything else."
apt-get update
apt-get upgrade

#Now add the repositories we want.
echo "Backing up repository list."
cp /etc/apt/sources.list /etc/apt/sources.list.bak

#Uncomment partner repos.
echo "Uncommenting partner repositories on sources list, so we can get Sun Java."
sed -i -re '/partner/ s/^#//' /etc/apt/sources.list
#FINISH UNTESTED BIT#


#First Jenkins
echo "Adding Jenkins repository."
wget -q -O - http://pkg.jenkins-ci.org/debian/jenkins-ci.org.key | apt-key add -
echo "deb http://pkg.jenkins-ci.org/debian binary/" > /etc/apt/sources.list.d/jenkins.list

#OK

#Next TEI.
echo "Adding TEI Debian repository."
     gpg --keyserver wwwkeys.uk.pgp.net --recv-keys FEA4973F86A9A497
#NOT OK: HAD TO USE:
#Next TEI.
gpg --keyserver wwwkeys.pgp.net --recv-keys FEA4973F86A9A497
echo "deb http://tei.oucs.ox.ac.uk/teideb/binary ./" > /etc/apt/sources.list.d/tei.list

#OK

#Now we can start installing packages.
echo "Updating for new repositories."
apt-get update

#OK

echo "Installing core packages we need."
apt-get install openssh-server libxml2 libxml2-utils devscripts xsltproc debhelper subversion trang &&
echo "Installing curl, required for some tei building stuff."
apt-get install curl &&

#OK. curl already installed.

#TEI packages
echo "Installing TEI packages."
	apt-get install psgml xmlstarlet debiandoc-sgml linuxdoc-sgml jing jing-trang-doc libjing-java rnv texlive-xetex &&
#NOT OK: rnv no longer in repos (but is in TEI, so no worries); linuxdoc-sgml should be linuxdoc-tools. HAD TO USE:

apt-get install psgml xmlstarlet debiandoc-sgml linuxdoc-tools jing jing-trang-doc libjing-java texlive-xetex

	apt-get install trang-java tei-p5-doc tei-p5-database tei-p5-source tei-schema saxon nxml-mode-tei tei-p5-xsl tei-p5-xsl2 tei-p5-xslprofiles tei-roma onvdl tei-oxygen zip &&

#NOT OK: nxml-mode-tei has no installation candidate. HAD TO USE:
apt-get install trang-java tei-p5-doc tei-p5-database tei-p5-source tei-schema saxon tei-p5-xsl tei-p5-xsl2 tei-p5-xslprofiles tei-roma onvdl tei-oxygen zip &&

#Setting up configuration for oXygen
mkdir /root/.com.oxygenxml
chmod a+x /root/.com.oxygenxml
mkdir /root/.java
chmod a+x /root/.java
echo "Don't forget to put your licensekey.txt file in the folder /usr/share/oxygen so that oXygen is registered."

#OK

#Various fonts and the like.
echo "Installing fonts we need."
apt-get install ttf-dejavu msttcorefonts ttf-arphic-ukai ttf-arphic-uming ttf-baekmuk ttf-junicode ttf-kochi-gothic ttf-kochi-mincho
echo "The Han Nom font is not available in repositories, so we have to download it from SourceForge."
cd /usr/share/fonts/truetype
mkdir hannom
cd hannom
wget -O hannom.zip http://downloads.sourceforge.net/project/vietunicode/hannom/hannom%20v2005/hannomH.zip
unzip hannom.zip
find . -iname "*.ttf" | rename 's/\ /_/g'
rm hannom.zip
fc-cache -f -v

#OK

#Configuration for Jenkins
echo "Starting configuration of Jenkins."
echo "Getting the Hudson log parsing rules from TEI SVN."
cd /var/lib/jenkins
svn export https://tei.svn.sourceforge.net/svnroot/tei/trunk/P5/Utilities/hudson-log-parse-rules
chown jenkins hudson-log-parse-rules
echo "Getting all the job data from TEI SVN."
#Don't bring down the config.xml file for now; that contains security settings specific to 
#Sebastian's setup, and will prevent anyone from logging in. We leave the server unsecured,
#and make it up to the user to secure it.
#svn export https://tei.svn.sourceforge.net/svnroot/tei/trunk/Documents/Editing/Jenkins/config.xml
#chown jenkins config.xml
svn export --force https://tei.svn.sourceforge.net/svnroot/tei/trunk/Documents/Editing/Jenkins/jobs/ jobs
chown -R jenkins jobs
echo "Installing Jenkins plugins."
cd plugins
wget --no-check-certificate http://updates.jenkins-ci.org/latest/copyartifact.hpi
chown jenkins copyartifact.hpi
wget --no-check-certificate http://updates.jenkins-ci.org/latest/emotional-hudson.hpi
chown jenkins emotional-hudson.hpi
wget --no-check-certificate http://updates.jenkins-ci.org/latest/greenballs.hpi
chown jenkins greenballs.hpi
wget --no-check-certificate http://updates.jenkins-ci.org/latest/jobConfigHistory.hpi
chown jenkins jobConfigHistory.hpi
wget --no-check-certificate http://updates.jenkins-ci.org/latest/plot.hpi
chown jenkins plot.hpi
wget --no-check-certificate http://updates.jenkins-ci.org/latest/log-parser.hpi
chown jenkins log-parser.hpi
wget --no-check-certificate http://updates.jenkins-ci.org/latest/scp.hpi
chown jenkins scp.hpi
wget --no-check-certificate http://updates.jenkins-ci.org/latest/WebSVN2.hpi
chown jenkins WebSVN2.hpi

echo "Restarting Jenkins server, so that it finds and initializes all the new plugins."
/etc/init.d/jenkins restart

#OK


PROBLEMS AFTER STARTUP:

1. Builds fail because rnv is not installed. Have to get it from SourceForge:

cd ~
apt-get install libexpat-dev
wget http://downloads.sourceforge.net/project/rnv/Sources/1.7.10/rnv-1.7.10.zip
unzip rnv-1.7.10.zip
cd rnv-1.7.10
./configure
make
make install

2. NON-FATAL, but investigate: in Stylesheets:
	Unable to locate tools.jar. Expected to find it in /usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar

3. FATAL:

BUILD Build for P5, XSLT 2.0
test -d release/p5 || mkdir -p release/p5/xml/tei/stylesheet/
for i in  bibtex common2 docx dtd docbook epub epub3 fo2 html html5 latex2 nlm odds2 odt profiles/default rdf relaxng rnc slides tbx tite tools txt xhtml2 xsd ; do \
		tar cf - --exclude .svn $i | (cd release/p5/xml/tei/stylesheet; tar xf - ); \
	done
(cd odt;  mkdir TEIP5; saxon -o:TEIP5/teitoodt.xsl -s:teitoodt.xsl expandxsl.xsl ; cp odttotei.xsl TEIP5.ott teilite.dtd TEIP5; jar cf ../teioo.jar TEIP5 TypeDetection.xcu ; rm -rf TEIP5)

/bin/sh: 1: jar: not found
mkdir -p /var/lib/jenkins/jobs/Stylesheets/workspace/debian-tei-p5-xsl2/debian/tei-p5-xsl2/usr/share/xml/tei/stylesheet
cp catalog.xml /var/lib/jenkins/jobs/Stylesheets/workspace/debian-tei-p5-xsl2/debian/tei-p5-xsl2/usr/share/xml/tei/stylesheet
cp teioo.jar /var/lib/jenkins/jobs/Stylesheets/workspace/debian-tei-p5-xsl2/debian/tei-p5-xsl2/usr/share/xml/tei/stylesheet
cp: cannot stat `teioo.jar': No such file or directory
make[1]: *** [installp5] Error 1
make[1]: Leaving directory `/var/lib/jenkins/jobs/Stylesheets/workspace'

Permalink 02:28:14 pm, by sarneil, 124 words, 167 views   English (CA)
Categories: Activity log; Mins. worked: 60

security certificates on interpretive essays

User noted that https://canadianmysteries.ca... threw up a security warning. The certificate is actually on the canadianmysteries.uvic.ca domain and not the canadianmysteries.ca domain (long story involving it being purchased through systems and thus part of the uvic.ca domain name). Got user to try using the http protocol and the domain name with uvic in it, but still he couldn't get through, so I suspect some kind of security setting on his machine or network was also in play.

Ended up sending him the visible text of those files as attachments to an email, so at least he had that, which is all he wanted.

Nobody else has ever complained about this, so I don't think I'll do anything more.

Permalink 02:22:17 pm, by sarneil, 101 words, 83 views   English (CA)
Categories: Activity log; Mins. worked: 30

hist : suitability of conference tool for review of abstracts

JSR enquired about the suitability of the UVic conference hosting service for review of abstracts. The software has those features. I've been involved in a conference (GRS) that used them, sort of, for abstracts that had already been submitted and approved, just so the abstracts would appear to users. I seem to recall a problem uploading more than one attached file per submission (as in, I had to get the admin of the conference system to do it), but no other problems. I have never used them in earnest, but they should be adequate. Did some poking around with those features (users, user management, workflow management etc.).

Permalink 02:17:20 pm, by sarneil, 38 words, 126 views   English (CA)
Categories: Activity log; Mins. worked: 30

missing file on thomson site

A user reported that links to thomson/artistsworld/importance/5331en.html break. The file does not exist, so links to it naturally break. I found a partial file with that ID in the development database, but it's not ready.

Permalink 02:09:15 pm, by sarneil, 116 words, 92 views   English (CA)
Categories: Activity log; Mins. worked: 120

phil : fiddling because UVic Calendar is late

Code in course_list_generation.php assumes that the UVic calendar for the upcoming year will be posted by May 1. It wasn't, so my code failed. Temporarily changed this:

if($the_month < 5){ $calendarYearForWinter = $the_year - 1;

to this:

if($the_month < 6){ $calendarYearForWinter = $the_year - 1;

When the UVic calendar went live, I set that "6" back to "5". I also noticed that integrateCoursesFromCalendar(winterList, summerList) simply returned the winter list, so I had to write that method: it copies winterList to outputList, goes through each item in the summer list and, if that item does not appear in the winterList, appends it to the outputList; at the end of the loop it sorts the outputList and returns it.
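
The missing method amounts to something like this (a sketch assuming the course lists are flat arrays of comparable values; the real data structure in course_list_generation.php may differ):

<?php
function integrateCoursesFromCalendar($winterList, $summerList) {
    // start from everything in the winter list
    $outputList = $winterList;
    // append any summer course that isn't already in the winter list
    foreach ($summerList as $course) {
        if (!in_array($course, $winterList)) {
            $outputList[] = $course;
        }
    }
    // sort the combined list before returning it
    sort($outputList);
    return $outputList;
}
?>
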
Permalink 02:00:22 pm, by sarneil, 115 words, 136 views   English (CA)
Categories: Academic; Mins. worked: 30

phil : redirect from dept URL to conference site

Dept wanted a URL that was associated with the dept site and simpler than the actual conference url on the conferences site
(http://conferences.uvic.ca/index.php/WCPA/wcpa2012)

so I created a simple setup in
http://web.uvic.ca/philosophy/wcpa
to redirect to the conference site

I put in an index.htm file with a redirect.
I also created a .htaccess file to catch any other URLs within that folder and redirect them to the home page on the conference site

ErrorDocument 404 http://conferences.uvic.ca/index.php/WCPA/wcpa2012

There is about a half-second lag while the redirect occurs, but I don't think that is long enough to bother your users.
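
For completeness, the index.htm redirect itself is just a meta refresh along these lines (a sketch; only the target URL matters):

<html>
  <head>
    <meta http-equiv="refresh" content="0; url=http://conferences.uvic.ca/index.php/WCPA/wcpa2012"/>
    <title>WCPA 2012</title>
  </head>
  <body>
    <p>This page has moved to <a href="http://conferences.uvic.ca/index.php/WCPA/wcpa2012">the conference site</a>.</p>
  </body>
</html>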

Permalink 01:55:38 pm, by sarneil, 46 words, 43 views   English (CA)
Categories: Activity log; Mins. worked: 30

phil : fix error in data

courses_taught.php had the following line:
$professor['kluge'][3] = array ("208, "331"); //jan-apr
It was missing a quotation mark. I corrected it to look like this:
$professor['kluge'][3] = array ("208", "331"); //jan-apr
I also tidied up extra spacing and stuff just to make the file easier
for me to read.

Permalink 01:51:32 pm, by sarneil, 344 words, 79 views   English (CA)
Categories: Activity log; Mins. worked: 120

vpn : bug fixes

1) I modified the detailed information for a single poem
(vpn-single-record.php) so that the "more information" link for the poem
appears only for those poems that have at least one link and not for
those poems that no links. compare
http://web.uvic.ca/~vicpoet/?vpn_id=1096 and
http://web.uvic.ca/~vicpoet/?vpn_id=1130
Similarly for the "more information" link on an author.

2) AC's convention for data entry is that if there is more than one link for a poem or an author, each is delimited by a carriage return (e.g. Eliza Cook). I've added code on the results page to
look for carriage returns in the links field and create a new link when
it finds one, rather than make all the lines one big link to the first URL. For example, see the information on Cook, Eliza at
http://web.uvic.ca/~vicpoet/?vpn_id=53

3) I've modified the vpn-search-results.php page so that if a poem has more than one author, only one instance of that poem appears in the listing in the search results page, e.g. Hidden Light.

I originally tried pruning out duplicates during the search procedure,
but that introduced a cascade of headaches. Instead, I decided to accept the results including duplicates and then prune any duplicates as they were being displayed, telling the user that's what I'm doing.

The result is that the total number of hits is still reported (e.g. 2 in the case of Hidden Light), and a message is appended saying the number of duplicate entries which are not being displayed (e.g. 1 in the case of Hidden Light). That's so that, from the user's perspective, the number of items in the listing doesn't inexplicably go out of sync with the reporting and navigation schemes. For example, if the query returned 12 hits, 2 of which were duplicates, the user would expect to see 10 hits and then 2 hits. If they got 10 and then 1, they'd be puzzled. My hope is that adding a note about hiding the duplicates will resolve the puzzle for them.

Permalink 01:35:12 pm, by sarneil, 56 words, 103 views   English (CA)
Categories: Activity log; Mins. worked: 180

audio files of student interviews sent to archives

EGW students did interviews of retired people. Got 20 interviews from Judy as mp3 files; Greg made a wav copy of each, resulting in a total of 40 files and 11 GB of data. After some back and forth with LW of archives, arranged to deliver the files. Took them over on a USB stick and transferred them.

Permalink 01:32:51 pm, by sarneil, 191 words, 101 views   English (CA)
Categories: Activity log; Mins. worked: 120

can't install eXist 2.1 on mac OS 10.7

Encountered various problems with the lucene indexing and reporting in Francotoile, so decided to upgrade from eXist 1.5 to 2.1 in the hope that improvements to lucene between those two versions would solve the problems.

I am successfully running an eXist instance on
a Mac running OS 10.7.4
java version 1.6.0_31
Java(TM) SE Runtime Environment (build 1.6.0_31-b04-415-11M3635)
apache tomcat 7.0.21
eXist 1.5.0

I tried downloading and running exist-2.1-dev-rev16458 and was unable to.
The catalina log says
"org.apache.catalina.core.StandardContext startInternal
SEVERE: Error filterStart"
If I use the tomcat manager to start exist 2.1 I get
"FAIL - Application at context path /exist21 could not be started"
and the same error in the catalina log

I installed a newer version of tomcat (7.0.27) on that same computer and got the same results when I tried to launch eXist 1.5 (worked) and eXist 2.1 (failed), so I don't think that's the issue.

I then went to a Mac running OS 10.6.8
java version 1.6.0_31
JRE build 1.6.0_31-b04-415-10M3646
apache tomcat 7.0.27

I tried to run exist-2.1-dev-rev16458 and that worked, so it appears to be an issue with the JRE or (less likely) the OS on the first Mac.

Permalink 01:30:31 pm, by sarneil, 414 words, 76 views   English (CA)
Categories: Activity log; Mins. worked: 1200

completed bunch of upgrades on dev instance

Over the past 10 days or so, I've worked through the to-do list and made a number of improvements on the dev site. If CC approves, I'll migrate these to the production site. Estimate about 20 hours in all.

improvements to searching:
- if user types in upper-case search string, the search now works as
well as if they typed in a lower-case search string
- the selected item in the "gender" and "topic" dropdowns on the search page remain visible
- all apostrophes have been normalized to straight (') rather than smart (’)
- quotation marks around a phrase now match the entire phrase rather
than any word within the phrase
- if the user puts a colon (:) in the search string, the page removes it

improvements to markup and presentation:
- if you put a <title> element into an <utterance>, a <reference>, a <note>, or into <person><trait><p> in the teiHeader, it will be rendered in italics on the page
- if you add an <incident who="#interviewer"><desc> element into an utterance or a <u who="#interviewer"> element, it will be rendered as grey on the page
- there is a show/hide control on the full transcript

remaining issues:

There are still problems with searching for words that happen to be
upper case in the transcript (e.g. search for bonjour and you'll see
there are 4 hits, the two that are lower-case in the transcript show a
link to the occurrence in the transcript, the two that are upper-case
don't show the link). To fix those I need to upgrade the version of the database engine and for some reason I'm unable to do that on my computer (though I can on others). So, until I sort that out, we're stuck on that issue.

We still don't match instances of the search string that occur in the
notes. I think this may be related to the upper-case/lower-case problem,
so a solution to it will have to wait for a newer version of the database
engine.

The underline of the space following a reference results from the way
the machinery I'm relying on handles whitespace and is virtually
impossible to fix reliably, so I'm leaving it for now.

The indexing engine (lucene) used by the database does not allow wildcard
characters (? or *) at the start of the search string. No way around that.

Permalink 11:37:22 am, by mholmes, 36 words, 122 views   English (CA)
Categories: Activity log; Mins. worked: 120

Meeting and committee proposal

Met with ECH to plan work for the Fall, and write application for the HCMC committee. The application is done, workstation time is booked for SK, and we have plans under way for grant applications etc.

Permalink 09:47:00 am, by mholmes, 54 words, 139 views   English (CA)
Categories: Activity log; Mins. worked: 60

Mapping between ContentDM metadata and TEI

This is my preliminary mapping:

  • dc:title (multiple): titleStmt/title, bibl/title.
  • dc:description[1]: notesStmt/note (replace the first one).
  • dc:subject (multiple) = notesStmt/note type="subject".
  • dc:creator = bibl/author.
  • dc:language == 'eng' : bibl/@xml:lang = 'en'
  • dc:language == 'spa' : bibl/@xml:lang = 'es'
  • dc:contributor[starts-with(., 'Fund')] = funder.
  • [ more to come later... ]
Permalink 09:05:10 am, by mholmes, 76 words, 266 views   English (CA)
Categories: Activity log, Tasks; Mins. worked: 60

Tech support for user

Interesting issue reported by a user: if you have empty p tags in your annotations, then a self-closing div is output to the HTML. If you then open the file in Firefox, it will screw up, but if you give the file an .xhtml extension, it works fine.

I should put in a trap for empty paragraphs, and insert a non-breaking space into them. That could easily be done in the XSLT. Making this a task.
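
Roughly, the trap would be a template like this in the XSLT (a fragment for the existing stylesheet; the output element shown is a guess, the point is the non-breaking space):

<xsl:template match="tei:p[not(node())]">
  <p>&#160;</p>
</xsl:template>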

24/05/12

Permalink 03:44:16 pm, by esaint, 119 words, 56 views   English (CA)
Categories: Activity log; Mins. worked: 10

Update from ES - May 24, 2012

1. With iMovie, which was installed on POMME on Wednesday, 8 video files have been edited. ES will write a document to explain the procedure for future use.
2. ES prepared two posters for recruitment. To be displayed at GSS and other significant places on campus.
3. ES will record two new subjects on Friday, May 25 and Wednesday, May 30 (respectively a male 60+, from south of France, and a female 20-30, from south of France).
4. Real Player was installed on POMME as it currently is the only player that will display one digit beyond the seconds and therefore provide enough precision for establishing the utterances for subtitles.
5. New transcripts for accf1, ancf1, and ancf2 were entered on Oxygen. To be continued with all files.

Permalink 03:22:48 pm, by mholmes, 6 words, 54 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 167 - 2 = 165 hours G&T

Out running errands during the morning.

Permalink 03:00:26 pm, by mholmes, 113 words, 121 views   English (CA)
Categories: Activity log; Mins. worked: 240

Matching part of the process finished

Spent most of the day manually aligning records between ContentDM and ColDesp, so this is where we're at:

  • DONE: Manually edit the XHTML file to fix bad matches among the candidates.
  • DONE: Search for matches for the unmatched items manually.
  • DONE: Add matches found back into the XHTML.
  • Generate from the XHTML a list of pairings from which metadata can be brought over.
  • Map desired metadata fields in ContentDM OAI file to TEI.
  • Write XSLT to port the metadata into the TEI files.
  • Update the map gallery rendering code to include the new metadata.

Also wrote to CP with a list of 7 maps that we have, but which are apparently missing from ContentDM.

23/05/12

Permalink 03:14:27 pm, by mholmes, 150 words, 121 views   English (CA)
Categories: Activity log; Mins. worked: 240

Matching with ContentDM records

More progress on matching with ContentDM. I've now generated an XHTML file with two tables, one of candidate matches (186 maps) with links to both ColDesp and ContentDM, for human checking, and one of failed matches (33 maps from ColDesp), with ColDesp links and enough metadata for a manual search. I've manually verified the 186 candidate matches and found that most match; I reported one map apparently missing from ContentDM to CP, and found a dupe in ColDesp.

Next steps:

  • Manually edit the XHTML file to fix bad matches among the candidates.
  • Search for matches for the unmatched items manually.
  • Add matches found back into the XHTML.
  • Generate from the XHTML a list of pairings from which metadata can be brought over.
  • Map desired metadata fields in ContentDM OAI file to TEI.
  • Write XSLT to port the metadata into the TEI files.
  • Update the map gallery rendering code to include the new metadata.
Permalink 11:31:38 am, by Erin, 120 words, 264 views   English (CA)
Categories: Activity log; Mins. worked: 170

various

Today I added in "education" and "rappelle_senat" tags in the following accounts: benignewinslow, bordeu, chambaud, de thou vol3, bullart, dethou vol1, eloy, de thou vol4, guillemeau, feller, carlencas, de thou vol3 rigault, lacroixdumaine_et_verdier, moreri_english, moreri, niceron vol5, pare x 2, niceron vol10, senac. Now entered globally. Also searched for "mélancolie" to add seg tags (the search missed the accent in some texts previously). Changed la_palestine to palestine within the place-ography, and within accounts as well. Marked up imperialis. Went over paragraph conflicts; discussion with Greg. Created xml files for argenterius and clusius-notes, currently in the "xml to be added" folder, keeping "imperialis" company. Began markup (finished argenterius; clusius almost done; adding persName and placeName tags).

Permalink 10:15:02 am, by mholmes, 98 words, 175 views   English (CA)
Categories: Activity log; Mins. worked: 60

Revisiting normalization

Tested out Franscriptor.com with some sample text from our content, to see what it's doing and to try to deduce how (it's a black box). It offers to "dissimiler" and "détilder" the text, but it's not clear exactly what that means. This is what I've learned:

  • It does nothing with long s, so that has to be normalized before submission.
  • It expands ligatures such as œ.
  • It does quite a good job with u/v normalization, although it failed with "oeuures".
  • Many anachronistic spellings survive unchanged ("luy", "bastir", "tousjours"), so it's clearly not trying to do modernization.
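
So the practical upshot for now is that long s has to be stripped from our text before it goes to the tool. A minimal sketch of that pre-normalization step (not part of any pipeline yet, just a note of the obvious approach):

<xsl:stylesheet version="2.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="@*|node()">
    <xsl:copy><xsl:apply-templates select="@*|node()"/></xsl:copy>
  </xsl:template>
  <!-- &#x17F; is LATIN SMALL LETTER LONG S; normalize it to ordinary s. -->
  <xsl:template match="text()">
    <xsl:value-of select="translate(., '&#x17F;', 's')"/>
  </xsl:template>
</xsl:stylesheet>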

22/05/12

Permalink 11:27:16 pm, by Janelle Jenstad Email , 110 words, 123 views   English (CA)
Categories: Encoding Notes; Mins. worked: 0

.docx and .odt working files

Note to Janelle and to the RAs:

The HCMC computers do not have MS-Word. If we are editing working files, use OpenOffice. If you are on another computer and need to convert a .doc or .docx file to .odt, do NOT use MS-Word to save the file as a .odt file. The comments (where we record so much information for our encoders) will be stripped away in the file conversion.

Instead, save and close the .doc/.docx file. Then, start up OpenOffice and open the .doc/.docx file. Now you can save it as a .odt file without losing the comments.

Remember: File names must not include spaces or punctuation.

Permalink 05:36:32 pm, by mholmes, 19 words, 78 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 165 + 2 = 167 hours G&T

Getting the MoEML project under way with our RAs, and getting some TODOs from Flow out of the way.

Permalink 03:50:05 pm, by mholmes, 78 words, 142 views   English (CA)
Categories: Activity log, Encoding Notes; Mins. worked: 180

Implemented popup people references

JJ requested on Flow that when you click on a person's name, their info be shown in a popup. This is now implemented. Specifically, if you use this type of reference:

<name type="person" ref="mol:HOLM3">Martin Holmes</name>

then the name will generate a popup link, but if you put this:

<ref target="mol:HOLM3">Martin Holmes</ref>

then a link to that person's page will be generated.
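
For the record, the rendering distinction boils down to a pair of templates along these lines. This is a simplified sketch only; the class name, the data- attribute, and the URL pattern are placeholders rather than the site's actual code:

<xsl:stylesheet version="2.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:tei="http://www.tei-c.org/ns/1.0"
    exclude-result-prefixes="tei">

  <!-- Popup link: JavaScript on the page shows the person's info in a popup. -->
  <xsl:template match="tei:name[@type='person'][@ref]">
    <a href="javascript:void(0)" class="personPopup"
       data-person="{substring-after(@ref, 'mol:')}">
      <xsl:apply-templates/>
    </a>
  </xsl:template>

  <!-- Plain reference: an ordinary link to the person's page. -->
  <xsl:template match="tei:ref[starts-with(@target, 'mol:')]">
    <a href="{concat(substring-after(@target, 'mol:'), '.htm')}">
      <xsl:apply-templates/>
    </a>
  </xsl:template>
</xsl:stylesheet>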

Permalink 12:53:52 pm, by Erin, 9 words, 133 views   English (CA)
Categories: Activity log; Mins. worked: 180

transcriptions

Today I transcribed Clusius' notes and began Metellus' letter.
Permalink 11:33:43 am, by mholmes, 34 words, 277 views   English (CA)
Categories: Activity log, Announcements; Mins. worked: 20

CO 305 Vol 18 page images added to the Colonial Despatches collection

910 page images for CO 305 Vol 18 (in three different sizes) have been added to the collection. These cover the 1861 Vancouver Island Public Offices and Miscellaneous Correspondence. These will now be linked into the transcription documents.

Permalink 11:09:26 am, by mholmes, 77 words, 81 views   English (CA)
Categories: Activity log; Mins. worked: 90

Implemented title level="a"

Worked on bibliography encoding and rendering this morning:

  • Tweaked the XSLT so that following punctuation is rendered inside the quotes resulting from <title level="a">, even though in the encoding it's (correctly) placed outside the <title> tag.
  • Used a variety of search/replace/regex and manual fixes to add <title level="a"> tags throughout the bibliography file. These will be checked by CB.

All changes have been uploaded into the db.

Permalink 08:02:38 am, by mholmes, 20 words, 96 views   English (CA)
Categories: Activity log; Mins. worked: 10

MVP: tweaked the CSS for KT

Suppressed display of the speech statistics in the XML ographies files when viewed in a web browser, at KT's request.

21/05/12

Permalink 10:37:47 pm, by Janelle Jenstad Email , 228 words, 137 views   English (CA)
Categories: Planning Notes; Mins. worked: 0

encodingDesc

Martin's Comments on the simple_template in an email to Janelle on 2012-05-15:

The template looks fine. The only thing I noticed that I'd reconsider is the use of the <segmentation> element. I'm not quite sure how that got into our documents in the first place:

"segmentation<>> describes the principles according to which the text has been segmented, for example into sentences, tone-units, graphemic strata, etc. [2.3.3 The Editorial Practices Declaration 15.3.2 Declarable Elements]"

We don't really have much to say about that; segmentation isn't a major issue for us, especially in the born-digital documents such as Location files.

I think what we should probably have is:

  • One file on the site where we detail all of our transcription and encoding practices for the original documents such as Stow;
  • Another file on the site where we detail our markup practices for the born-digital/modern content;
  • Something like this in the encodingDesc of every other file:

<encodingDesc>
  <p>See <ref target="mol:modernEncoding">modernEncoding.xml</ref> for full details of transcription and encoding used in this document.</p>
</encodingDesc>

We do have to produce those files first, though. We should start with the original document encoding practices, which we need to lay out in some detail before the work on Stow begins in earnest.

18/05/12

Permalink 01:00:08 pm, by Janelle Jenstad Email , 515 words, 180 views   English (CA)
Categories: Encoding Notes; Mins. worked: 0

Draft instructions for preparing a basic text for encoding

First Pass: Preparing a basic text for encoding.

  • Copy text from EEBO-TCP print view (this presupposes we have permission to use their transcription -- we're actually using their .xml files for Stow).
  • Convert short s to long s (lower case only). (Convert long s before spaces, periods, and commas back to short s.) See the regex sketch after this list.
  • Change the colour of italicized text (for your convenience and the encoder’s convenience). You may want to convert all the text to black and then the italicized text back to a colour once you have finished all your corrections.
  • Add in signature numbers as milestones (a way for US to navigate through the book)
  • Record what is in the headline (usually the running title but sometimes also a page number) and direction lines (signature number on signed leaves and catchword). EEBO-TCP does not display this info.
  • Proofread. Correct errors in EEBO-TCP’s transcription. Correct any long s errors introduced by us. Check punctuation carefully (is it really italicized after names?). Infer gaps in the transcription if you can but don’t make it up if you can’t. If you make an educated editorial guess, put the conjectural characters within editorial square brackets. The microfilms in the library are often clearer.
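
The s conversion step above is mechanical enough to be done with two regex replacements. Here is a rough sketch of the pattern, shown as XSLT replace() calls, though the same two patterns work in any regex-capable find-and-replace:

<xsl:stylesheet version="2.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="@*|node()">
    <xsl:copy><xsl:apply-templates select="@*|node()"/></xsl:copy>
  </xsl:template>
  <xsl:template match="text()">
    <!-- Step 1: every lower-case s becomes long s (&#x17F;). -->
    <xsl:variable name="longS" select="replace(., 's', '&#x17F;')"/>
    <!-- Step 2: long s immediately before a space, period, or comma reverts to short s. -->
    <xsl:value-of select="replace($longS, '&#x17F;([ .,])', 's$1')"/>
  </xsl:template>
</xsl:stylesheet>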

Second Pass (depends on time and the instructions from JJ).

If you are putting the information into comment bubbles:

  • Identify the unique XML:id for all the streets and sites.
  • Put notes about queries, unresolved ids, and gaps in database (e.g., people and places who aren't there) into comments. Put a reminder in Flow in Janelle's TTD folder.

If you are printing out the file, highlighting locations, people, bibl items, and other features you want to flag for the encoder:

  • Print out file (or send it to Janelle to print out).
  • Use pink for bibl tags, yellow for locations, purple for people, green for anything that encoders will need to note (foreign words, book/article titles).
  • Look up unique XML:ids and write them on the print-out. If an XML:id is recurrent, you don't need to write it every time unless the context requires you to give it again.

Finally...

  • Write a note about the structure of the document. Should we use the simple template (and for most topics and transcriptions we will), the location template, or the complex template? To create sections and subsections and sub-sub-sections, we use the div element. You'll have to think about the structure of the document. Does it consist of sections already? Do we need to divide it into sections? Is there a piece of text that will serve as the header for the section? E.g., The Carriers Cosmographie has letters for section headers, but we also had to make up some editorial section headers. See Basic Document Structure for information. You may need to work out these suggestions by discussing them with the encoders and/or JJ.
  • Send file back to JJ.
  • When JJ gives the okay, tell the encoders to go ahead. Use Flow to tell them the full name of the file in the SVN working_files folder.
Permalink 10:52:24 am, by Janelle Jenstad Email , 156 words, 209 views   English (CA)
Categories: Tasks; Mins. worked: 0

Rendering punctuation around article titles

JJ, MS, and CB have started to use the title level="a" markup for articles, both in the BIBL1.xml file and in the markup of pages where an article title appears.

Current rendering code puts quotation marks around article titles. We note, however, that MLA style and MoEML house style call for commas and periods to be INSIDE quotation marks, even if they are not part of the quotation. Colons, semi-colons, exclamation marks, and question marks remain outside the quotation marks.

The rendering code we need will pull a following period or comma into the quotes but leave a following colon, semi-colon, or question mark outside the quotes.

If the question mark is part of the title, then it will be inside the title tag anyway and will be automatically included inside the quotation marks.

Priority: Sometime this summer. We can live with a few stray commas until MH has time to write the code.
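
When the time comes, one plausible shape for that code is sketched below. This is only a sketch (it assumes typographic quotes and a plain following text node); the real version will have to fit in with the existing title rendering:

<xsl:stylesheet version="2.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:tei="http://www.tei-c.org/ns/1.0"
    exclude-result-prefixes="tei">

  <xsl:template match="@*|node()">
    <xsl:copy><xsl:apply-templates select="@*|node()"/></xsl:copy>
  </xsl:template>

  <!-- Article titles go in quotation marks; a period or comma that
       immediately follows the title is pulled inside the closing quote. -->
  <xsl:template match="tei:title[@level='a']">
    <xsl:variable name="next"
        select="substring(following-sibling::node()[1][self::text()], 1, 1)"/>
    <xsl:text>&#x201C;</xsl:text>
    <xsl:apply-templates/>
    <xsl:if test="$next = ('.', ',')">
      <xsl:value-of select="$next"/>
    </xsl:if>
    <xsl:text>&#x201D;</xsl:text>
  </xsl:template>

  <!-- Suppress the punctuation mark that was pulled inside the quotes;
       colons, semi-colons, and question marks are left alone. -->
  <xsl:template match="text()[preceding-sibling::node()[1][self::tei:title[@level='a']]]">
    <xsl:choose>
      <xsl:when test="substring(., 1, 1) = ('.', ',')">
        <xsl:value-of select="substring(., 2)"/>
      </xsl:when>
      <xsl:otherwise><xsl:value-of select="."/></xsl:otherwise>
    </xsl:choose>
  </xsl:template>
</xsl:stylesheet>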

Permalink 10:35:23 am, by Janelle Jenstad Email , 69 words, 150 views   English (CA)
Categories: Tasks; Mins. worked: 0

Displaying article titles in XML:id table

Right now, the titles of articles are not showing up in our handy list of all XML:ids used in the site. Let's try adding the title level="a" tag to a couple of articles, then checking out what displays on http://mapoflondon.uvic.ca/ids.htm. I've asked MS to undertake this task next week. If titles do not show up, we'll ask MH to adjust the programming.

Permalink 10:28:23 am, by Janelle Jenstad Email , 130 words, 92 views   English (CA)
Categories: Activity log; Mins. worked: 10

working_files

Added a new folder to the Subversion repository called "working_files." This folder is where we can store spreadsheets, workbooks, Word files, and OpenOffice files that we are sharing and storing in the process of preparing articles and texts for the site.

Important! File names must not contain spaces or punctuation.

Examples of files we might store here:

  • Sarah's transcriptions if she's not typing directly into an .xml file
  • Janelle's master Excel workbook of all the essays, transcriptions, and editions that have been assigned. This workbook tracks the progress of each contribution from Janelle's initial contact with the contributor to the final proofing of the page once it's been encoded and published. Normally, only Janelle will make updates to this file, but it's available for any team member to see.

17/05/12

Permalink 05:09:17 pm, by mholmes, 13 words, 70 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 172 - 7 = 165 hours G&T

Taking Friday to work on my final course assignment for the NLP course.

Permalink 05:08:47 pm, by mholmes, 10 words, 89 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 171 + 1 = 172 hours G&T

Late duty, and had to get things finished: away tomorrow.

Permalink 05:01:31 pm, by mholmes, 66 words, 151 views   English (CA)
Categories: Activity log; Mins. worked: 60

Cascade help session for MK

The Pacific Asia site work is now under way, and MK dropped by for help with some Cascade issues. The GUI for Cascade editing is pretty under-functional in non-IE browsers, so we always end up messing with the code in the end for things like tables. Gave her a copy of all the photos from the current PA site, to save downloading them one by one.

Permalink 04:59:48 pm, by mholmes, 73 words, 107 views   English (CA)
Categories: Activity log; Mins. worked: 60

Expansion of SVN instructions

I expanded the XML Encoding document today to include a set of instructions for using SVN in Linux (previously it covered only Windows). The new stuff can be easily tweaked so it applies both to Mac and Linux, and the screenshots should look the same for both.

This was prompted by SM showing up for her first day of work (I wasn't expecting her till Tuesday), but she's not using SVN yet anyway.

Permalink 04:58:00 pm, by mholmes, 82 words, 73 views   English (CA)
Categories: Activity log; Mins. worked: 120

Stow document conversion

I've made a number of changes to both Stow documents using regex and search-and-replace, and I'm half-way through an XSLT conversion that will do such things as add all the forme works, add the long s, and fix various other things. When that's ready, I'll add documentation to the blog of all the changes we've made. At present I'm tracking them in a wp document (because I'm going to send some of this info to PS as feedback on the TCP documents).

Permalink 03:37:09 pm, by esaint, 186 words, 74 views   English (CA)
Categories: Activity log; Mins. worked: 1

Update from ES - May 17, 2012

1. Transcripts for "Gary 1" and "Gary 2" are now complete. Once iMovie is installed on POMME and training has been provided, ES will proceed with video editing and timeline setup for subtitles.
2. All transcripts as they currently appear on the website have been saved in a folder named "TranscriptionsOLD" on Dropbox and in a folder named "Old Transcripts" on POMME. Each file contains the timeline and the transcript in plain text format. This will allow us to quickly copy/paste the info back into Oxygen, should we need to revert to these versions in the future. The following three files did not have any transcripts: "sngl1", "sngl2", and "mixm1".
3. Agreed with CC today: Interviewer utterances and incidents within utterances will display in a grey color (no italics)
4. Agreed with CC today: Titles of books and films, for example, will show within utterances and within notes in italics (no change of color)
5. Next steps are:
a. contact the two potential contacts for recording;
b. edit "Gary 1" and "Gary 2" (see 1. above);
c. prepare a poster for recruitment that can be posted at GSS;
d. start entering the new annotated transcripts in Oxygen.

16/05/12

Permalink 04:06:26 pm, by ccaws, 308 words, 111 views   English (CA)
Categories: Activity log; Mins. worked: 0

Update on work - 16 May 2012

Met with Elizabeth Saint this morning regarding the planning of the next phase of recording. As of today we have a total of 50 videos (including 7 that still need to be edited and placed on the site): 22 female, 28 male. 17 videos are from the west of France, 7 from the south-east (including 2 that might be from Switzerland), 2 from Montreal, 11 from the Lac St Jean area, 1 from Haiti, 1 from Mali, 1 from B. Faso, 3 from Cameroun, 4 from Ile Maurice, and 1 from BC. 2 videos (female) from Belgium need to be re-edited/edited (contact Marie-Claude).

We have a plan to go to 100 videos with a balance of gender and origin. Underrepresented are: the Francophonie in Asia and Oceania, the Francophonie in Europe outside of France, North Africa, Canada outside QC, Gaspésie, Acadie, St Pierre et Miquelon, and the Antilles. We will contact GSS to find students on campus from some of these areas, re-contact Moussa Magassa for contacts in the community (Rwanda, Cameroun, etc.?), Martin Beaudoin for contacts in Northern Alberta, and C. Guilbault for contacts in Maillardville. We have made contact with one male (over 60) from the southwest of France with potentially 2 (or 3) videos, and one female (20-30) from Aix en Provence (again with 2 videos). In order to maximize work we will ask our future speakers to do two videos (sometimes three if appropriate).

Next steps:

  • Elizabeth will transcribe Gary 1 and Gary 2, then continue with the import of new (i.e. updated) transcripts.
  • Elizabeth and Catherine will continue making contacts for video recording (Catherine will purchase a video camera this week), and will review the process for video editing, chunking, and importing into Oxygen.
  • Catherine will update the XCL database of videos and additional info, and create a digital database of all documents.
  • Paul will start the analysis of intervention 2 in FRAN 200 (the pre-task doc will be transcribed, then shared with Elizabeth and Catherine for future analysis); Paul will then transcribe the results of analysis 2.
Permalink 02:38:59 pm, by mholmes, 5 words, 70 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 172.5 - 1.5 = 171 hours G&T

Leaving early for an appointment.

Permalink 02:24:23 pm, by mholmes, 38 words, 62 views   English (CA)
Categories: Activity log; Mins. worked: 180

Stow document conversion

I've moved forward with my detailed documentation of the Stow encoding and conversion. I've also added P5 versions of both documents to the repository, and re-worked the encoding of the "ye" for "the" typographical convention, following MUFI guidelines.

Permalink 11:31:25 am, by Erin, 73 words, 115 views   English (CA)
Categories: Activity log; Mins. worked: 150

16th May 2012

Added segments in some Latin texts. pelerinage-louanges-injures tagged globally. Created 2 new themes, "education" and "rappelle_senat", and added them in from portal through to imperialis. These 2 new themes still need to be added in the following accounts: benignewinslow, bordeu, chambaud, de thou vol3, bullart, dethou vol1, eloy, de thou vol4, guillemeau, feller, carlencas, de thou vol3 rigault, lacroixdumaine_et_verdier, moreri_english, moreri, niceron vol5, pare x 2, niceron vol10, senac. Also, paragraph conflicts (with seg) need to be resolved.
Permalink 10:28:09 am, by mholmes, 18 words, 142 views   English (CA)
Categories: Activity log; Mins. worked: 90

Built out the structure of Professional Writing site

Created all the pages and navigation structure based on the approved navplan. Over to EG-W for the content.

15/05/12

Permalink 02:43:39 pm, by mholmes, 63 words, 65 views   English (CA)
Categories: Activity log; Mins. worked: 240

Meetings and work on Stow

I've continued work on my conversions of Stow 1598 and 1633, creating a full list of issues that need to be addressed, and we've discussed them in detail. The results are in my conversion_process_notes.odt file, which still needs to be completed with an action list before I start work on actual fixes. The 1633 needs to be ready for RA work in 10 days.

Permalink 02:41:11 pm, by mholmes, 504 words, 201 views   English (CA)
Categories: Activity log; Mins. worked: 90

Long discussion: next stage

JS-R is able to work with the current spreadsheet for the May presentation, but would like some more elaborate output for the next phase. These are the details we've discussed:

  • The initial db XML output should go through a transformation which basically takes all the information encoded in relations and makes it explicit on individual records. So, for instance, all owners should have explicit ethnicities realized on the owner record; each title should have complete copies of all its owners; and so on. This will make it much easier, and faster, to generate other views of the data.
  • This output also needs to include some new boolean flags on titles:
    • Sale to self (as currently created during the spreadsheet transaction transform).
    • Possible family sale (ditto).
    • Liquidated property: properties sold by a Japanese owner to custodians or the state between the beginning of 1943 and the end of 1946. The custodian category would be set as an institution type by JS-R.
    • Control for liquidated property: any title which is not flagged as above, is not a family transaction or a sale to self, and which takes place from 1943-01-01 through 1946-12-12.
  • We need a view which constitutes a chain of transactions, constructed by preceding title. The way to construct the chains is:
    • Order titles by date ascending.
    • Start from the first.
    • Look for another title which has this one as its preceding title. Add that to the chain, and continue.
    • Every time you add a title to a chain, flag the title as having been used.
    • If you find two titles with the current title as preceding, then you have a fork. Annotate the end of the current chain to point to those two titles, and start new chains from each of those titles. Annotate the first link in the new chains to point back to the fork title.
    • If your current chain hits a title which has already been used, then you have a merge. In that case, split the previously-constructed chain into two and annotate the break points; then stop your current chain, annotating its end, so that you end up with two chains which end by pointing to a single chain which continues.
    This view will have to be realized in XML (although it's not clear whether XSLT can be used to create it -- probably) because it's not a 2-dimensional matrix. Since it's in XML, each title can bring along a full copy of all its data, including flags such as LIQ and LIQ_CONTROL, and we can then generate matrix views of the chains which are flattened in various ways. Titles in this view have an inherent generation number, which can be output to spreadsheets. A rough XSLT sketch of the chain-building follows this list.
  • In the current spreadsheet output, the LIQ and LIQ_CONTROL flags would be output, along with a generation number for any title which has one of these flags, constituting the count of transactions subsequent to the custodian transaction (in the case of LIQ titles), or the first transaction following 1943-01-01 (in the case of LIQ_CONTROL properties).
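
The chain-building itself can probably be expressed as a recursive XSLT function over a flattened list of titles. The sketch below assumes a hypothetical input of <title id="..." date="..." preceding="..."/> elements; it handles forks and guards against loops within a single path, but detecting merges across separately-built chains would need a second pass (or XSLT 3.0 accumulators), so this is a starting point rather than the solution:

<xsl:stylesheet version="2.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:xs="http://www.w3.org/2001/XMLSchema"
    xmlns:loc="http://hcmc.uvic.ca/ns/local">

  <xsl:key name="byPreceding" match="title" use="@preceding"/>

  <xsl:template match="/titles">
    <chains>
      <!-- Chain heads: titles whose preceding title is not in the data. -->
      <xsl:for-each select="title[not(@preceding = ../title/@id)]">
        <xsl:sort select="@date"/>
        <chain>
          <xsl:sequence select="loc:follow(., ())"/>
        </chain>
      </xsl:for-each>
    </chains>
  </xsl:template>

  <xsl:function name="loc:follow" as="element()*">
    <xsl:param name="curr" as="element(title)"/>
    <xsl:param name="seen" as="xs:string*"/>
    <link ref="{$curr/@id}"/>
    <xsl:variable name="next"
        select="key('byPreceding', string($curr/@id), root($curr))"/>
    <xsl:choose>
      <!-- Two or more successors: a fork. Start a new chain from each. -->
      <xsl:when test="count($next) gt 1">
        <fork from="{$curr/@id}">
          <xsl:for-each select="$next">
            <chain><xsl:sequence select="loc:follow(., ($seen, string($curr/@id)))"/></chain>
          </xsl:for-each>
        </fork>
      </xsl:when>
      <!-- One successor: carry on down the chain, guarding against loops. -->
      <xsl:when test="count($next) eq 1 and not($next/@id = ($seen, string($curr/@id)))">
        <xsl:sequence select="loc:follow($next, ($seen, string($curr/@id)))"/>
      </xsl:when>
    </xsl:choose>
  </xsl:function>
</xsl:stylesheet>
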
Permalink 11:28:50 am, by Erin, 51 words, 175 views   English (CA)
Categories: Activity log; Mins. worked: 170

Prioreschi

Worked on markup of Imperialis; need to locate themes within the Latin to continue. Did research out of Plinio Prioreschi's History of Medicine, as it needs to be returned through interlibrary loan today: Fabri, Geminus, Guillemeau, Lancisi, Riolan, Rondelet, Zacchia. TOMORROW: finish inserting insult themes + check marking up of paragraphs.

14/05/12

Permalink 10:50:47 pm, by Janelle Jenstad Email , 196 words, 111 views   English (CA)
Categories: Activity log; Mins. worked: 20

Resources for RAs

I gave this list of resources to our incoming RAs:

  • (1) Start here: YouTube video on TEI (Text Encoding Initiative) prepared by Amelia Chesley, a grad student at Texas Tech: http://www.youtube.com/watch?v=R6iiIFrWvmU
  • (2) Oxford DigiHums Workshop Notes on TEI: http://tei.oucs.ox.ac.uk/Talks/2007-12-Poznan/
  • (3) HCMC notes for English 500 workshops on TEI and CSS: http://hcmc.uvic.ca/presentations/xml/. (Sarah and Michael: These notes will look familiar! Noam and Nathan: You’ll be meeting these notes again in English 500.)
  • (4) MoEML’s own encoding notes: http://mapoflondon.uvic.ca/xml_encoding.htm. Martin wrote these notes after the November 2011 rebuild. I refer to them EVERY time I encode something. You’ll have templates to work from, but you’ll still be referring to these notes. Don’t worry if these notes don’t make sense yet. All will become clear with training and practice.
  • (5) If you are interested, here’s a brief statement of the tech specs for the site: http://mapoflondon.uvic.ca/technical_specs.htm (don’t worry if this info doesn’t make sense – you won’t be dealing with the back end of the site)
Permalink 05:07:06 pm, by mholmes, 9 words, 86 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 171.5 + 1 = 172.5 hours G&T

Mainly working on NLP course, but interruptions all day...

Permalink 05:06:04 pm, by mholmes, 23 words, 96 views   English (CA)
Categories: Activity log; Mins. worked: 420

NLP course work

Completed the Week 8 programming assignment. Now I have to go back and do the week 6, which I couldn't finish in the time available...

Permalink 02:58:45 pm, by jnazar, 5 words, 77 views   English (CA)
Categories: Activity log; Mins. worked: 60

"Interviews" - project (Part 2)

Another retiree interview completed today

Permalink 02:57:54 pm, by jnazar, 35 words, 65 views   English (CA)
Categories: Activity log; Mins. worked: 360

HCMC website - Cascade

Working today on:
- proposals and projects: index page, links (e.g. to same page and/or different location within same page); linking to other newly created pages
- about us: publications and presentations, links and content

Permalink 02:11:35 pm, by sarneil, 51 words, 218 views   English (CA)
Categories: Activity log; Mins. worked: 180

etcl : new wordpress blog

got multisite support enabled in wp 3.3.x
added inke front-end
added connection to inke database back-end
added etcl theme (it broke links the same way as it does in the dev etcl site). When I figure out a fix for the broken navigation problem, it presumably will work for both sites.

Permalink 08:25:07 am, by mholmes, 12 words, 190 views   English (CA)
Categories: Activity log; Mins. worked: 15

Added a new field

At JS-R's request, added a new "Description" field to the Ethnicities table.

11/05/12

Permalink 02:55:37 pm, by jnazar, 29 words, 69 views   English (CA)
Categories: Activity log; Mins. worked: 360

HCMC website - Cascade

Continued working on the site today. Have updated various sections (e.g. services and resources, grant writing, in-class workshops, training and supervision). Sorted and updated various links and index pages.

Permalink 02:38:49 pm, by mholmes, 2 words, 102 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 172.5 - 1 = 171.5 hours G&T

Leaving early.

Permalink 02:26:04 pm, by mholmes, 107 words, 82 views   English (CA)
Categories: R & D, Activity log, Documentation; Mins. worked: 160

Switching to new machine

This took all day. I first attempted to set the home dir for mholmes on the second disk drive, but when I did that, I ended up with no bash profile. In the end I left it at /home/mholmes, but symlinked to specific folders on the second drive instead.

To move VirtualBox VMs, I first deleted all snapshots, then moved the disks over (only the disks). Then I created new VMs for the HDs. All working normally, after lots of Windows updates and a bit of tweaking on the Win7 machine (which by default tried to attach the old IDE disk image to a SATA disk controller).

Permalink 02:20:45 pm, by mholmes, 31 words, 269 views   English (CA)
Categories: Activity log, Tasks; Mins. worked: 30

Generated output spreadsheet

Generated output for JS-R, who's finished the current batch of edits. He's also asked for a large Notes field on the Owners table or the Ethnicity table (not clear yet which).

Permalink 09:00:00 am, by Greg, 504 words, 141 views   English (CA)
Categories: Labs; Mins. worked: 0

Build system complete

There is now a complete build system for Ubuntu 12.04 set up. Here's how it works.
A machine called papaya is set up with a mirror of the precise repo (main, restricted, security, extras, universe and multiverse), Google's repos for Earth and Chrome, Oracle's Virtualbox and a groovy icon set (Faenza). It also has a reprepro setup that runs a kind-of local ppa with a few home-brew apps for use in the HCMC labs: hcmc-desktop (a metapackage that installs a bunch of necessary software and sets up stuff like printers and so forth), hcmc-auth (for LDAP logins), hcmc-oxygen (xml editor) and hcmc-style (adjusts the candy-cane look to a greyscale look).

In order to manage the mirrors, see the setup documentation here. The mirror should automatically update itself every day. To add a new repo to be mirrored, run the script called add-mirror.sh in the admin user's homedir. It's a wizard-kind-of-thing that leads you by the hand through the process.

In order to add a package to the hcmc repo there is a script in the admin homedir called uprepo.sh. It is extremely basic, adding anything it finds in the admin user's homedir/packages directory to the repo, ignoring anything that is already in the repo. It demands a passphrase (twice) for my gpg key (ask me for it) in order to add a package.

To remove a package from the hcmc repo there is a script called rmpkg.sh in the admin user's homedir. It takes a package name as an argument (e.g. hcmc-desktop) and also demands my gpg passphrase

In order to install a fresh Ubuntu 12.04 you can either do the vanilla install first, then run the Bob the Builder script, or you can grab the hcmc-mini.iso from http://apt.hcmc.uvic.ca/iso/precise/ and put it on a thumb drive using something like unetbootin, which is in the repos. The hcmc-mini.iso is a custom-built iso which has a set of preseeds built in to it so it sets up everything required in the HCMC labs. The great thing about using it is that it pulls all packages for the install directly from papaya, so there is no need to update the machine after the install. After the install you're left with a completely set up HCMC lab machine that's ready to go.
***** NOTE: the admin user that gets set up by hcmc-mini is preseeded with a LAME password because I have so far been unsuccessful in creating an md5 hash to store in the preseed - although it is *supposed* to work. I'll change the preseed if I can get it to work. In the meantime, change the admin user's password after the build is finished.

The hcmc-mini.iso image is created by a script in the admin user's homedir called build-hcmc-mini.sh. It downloads a stock netboot image from an official Ubuntu source, mangles it to include the necessary preseeds, then repacks it into a bootable iso image, storing it in /var/www/iso/precise.

10/05/12

Permalink 05:02:47 pm, by mholmes, 79 words, 91 views   English (CA)
Categories: Activity log; Mins. worked: 180

Stow: conversion of the TCP 1598 and 1603 texts

I've done a preliminary analysis of the TCP encoding of the Stow 1598 and 1630 texts, along with a test conversion using the latest TEI stylesheets for TCP. There are some problems, which I've detailed in my report, but they seem fairly minor, and I think with some post-processing we'll have usable texts. All the placename markup will have to be added, of course, and some current encoding will have to be added to and elaborated, but the core is sound.

Permalink 02:59:27 pm, by jnazar, 37 words, 75 views   English (CA)
Categories: Activity log; Mins. worked: 120

Medieval Studies website

Received requests from SA (Med.Studies) to update their course offerings for the 2012-13 academic year and to update the faculty page with new faculty information.
Completed all updates to their site; sent confirmation emails to SA (MStudies) - cc'd SA.

Permalink 02:55:36 pm, by kim, 42 words, 172 views   English (CA)
Categories: Tasks; Mins. worked: 15

Apostrophe rendering glitch?

EDIT: Fixed 2012-05-23. In this file, hover over the word "Majesties," which has sic/corr tags around it, the intention being to correct it to "Majesty's." In the hover-over pop-up, the apostrophe renders as the hex-code for an apostrophe. Very strange!
Permalink 09:43:32 am, by jnazar, 26 words, 69 views   English (CA)
Categories: Activity log; Mins. worked: 60

"Interviews" - project (Part 2)

Set up equipment; filed paperwork for Ret. Interviews - Part 2 recording session #4.
More bookings made via Doodle for on-site interviews as well as home interview sessions.

09/05/12

Permalink 04:40:17 pm, by mholmes, 3 words, 103 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 171.5 + 1 = 172.5 hours G&T

Procession of visitors...

Permalink 03:17:13 pm, by jnazar, 73 words, 77 views   English (CA)
Categories: Activity log; Mins. worked: 360

HCMC website - Cascade

Worked specifically on the grant writing section of the website today.
Have learned how to input new content with links now redirected to the new site.
Edited html; have implemented two different options for navigating on a page, e.g. linking to tabs below and also to text further down the same page. (A decision regarding which method we will use will be made later.)

Created new pages within grant writing section.

In progress: working on "in-class workshops" section

Permalink 01:23:28 pm, by mholmes, 36 words, 315 views   English (CA)
Categories: Activity log, Announcements; Mins. worked: 60

Complete set of CO 305 Vol 17 page images added to the Colonial Despatches collection

The complete collection of 1208 page images for CO 305 Vol 17 (in three different sizes) has been added to the collection. These cover the 1861 Vancouver Island Despatches to London. These will now be linked into the transcription documents.

Permalink 11:32:00 am, by Erin, 84 words, 184 views   English (CA)
Categories: Activity log; Mins. worked: 150

biographies and imperialis

May 9th - added themes "pelerinage/pilgrimage" and "louanges/praise for Vesalius". Changed injures_sylvius to "injures/insults". Began to insert these into the texts: benignewinslow, bordeu, chambaud, de thou vol3, bullart, dethou vol1, eloy, de thou vol4, guillemeau, feller, carlencas, de thou vol3 rigault, lacroixdumaine_et_verdier, moreri_english, moreri, niceron vol5. Created an xml file for the imperialis anecdote. Still to be done: pare x 2, niceron vol10, senac, portal, terilli, strada, teissier, sigaud de la fond, imperialis; mark up imperialis. Note: adam, castellanus, lancisi, and imperialis are in Latin.
Permalink 11:16:51 am, by mholmes, 655 words, 200 views   English (CA)
Categories: Activity log; Mins. worked: 210

Two tasks completed

Completed the tasks set yesterday, as follows:

  • Generated a list of all surnames for which multiple ethnicities are associated with owners bearing those surnames (so, for instance, if there are several owners with the surname Lee, and some have ethnicity Chinese while others have Unknown, that surname is added to the list). I used XQuery running against the XML output of the db to do this:
    xquery version "1.0";
    
    (: The purpose of this query is to find all examples where the same 
       owner surname has been associated with different ethnicities. :)
    
    declare namespace saxon="http://saxon.sf.net/";
    declare option saxon:output "method=text";
    
    let $surnames := distinct-values(//own_surname),
    (:return count($surnames):)
    $names := 
    for $name in $surnames
    order by $name
    return <name>
    <surname>{$name}</surname>
    <ids>
    {let $ids := //owners[own_surname = $name]/own_owner_id
    for $id in $ids return xs:string($id)
    }
    </ids>
    <ethnicities>
    {let $ids := //owners[own_surname = $name]/own_owner_id, $eths := //owners_to_ethnicities[ote_owner_id_fk = $ids]
    for $eth in $eths
    return <eth>{$eth/ote_ethnicity_id_fk/text()}</eth>
    }
    </ethnicities>
    </name>
    
    return 
    <names>
    {for $n in $names
    where count(distinct-values($n/ethnicities/eth)) gt 1
    return 
    concat(string-join(($n/surname, ': ',
    for $e in distinct-values($n//eth)
    return //ethnicities[eth_ethnicity_id = $e]/eth_name/text()), ' '), '&#xa;')
    }
    </names>
    
  • Generated a list of owners who have the same surname and forename, and are associated with titles on the same property. These are likely to be either duplicate owners or cross-generation family transactions. Again, this was done with XQuery:
    xquery version "1.0";
    
    (: The purpose of this query is to pull out instances of 
       owners who have the same surname and forename, 
       and who are associated with titles that have the same
       property. :)
       
       (:declare namespace map="http://www.w3.org/2005/xpath-functions/map";:)
       
    declare namespace saxon="http://saxon.sf.net/";
    declare option saxon:output "method=text";
       
       let $owners := //owners,
       $dupes := for $curr in $owners where $curr/following-sibling::owners[own_surname = $curr/own_surname and own_forenames = $curr/own_forenames and $curr/own_surname != '' and $curr/own_forenames != ''] return $curr,
       $dupeIds := for $d in $dupes 
          let $owner_set := //owners[own_surname = $d/own_surname and own_forenames = $d/own_forenames]
          return <group>
          <surname>{$d/own_surname/text()}</surname>
          <forenames>{$d/own_forenames/text()}</forenames>
          {$owner_set//own_owner_id}
          <properties>
          {let $owner_ids := $owner_set//own_owner_id/text(),
                $titles_for_group := (//owners_to_titles[ott_owner_id_fk = $owner_ids], //sellers_to_titles[stt_owner_id_fk = $owner_ids])
            for $t in $titles_for_group
            return
            if ($t/ott_owner_id_fk) then
            <title><title_id>{$t/ott_title_id_fk/text()}</title_id> <property_id>{//titles[ttl_title_id = $t/ott_title_id_fk/text()]/ttl_property_id_fk/text()}</property_id></title>
            else
            <title><title_id>{$t/stt_title_id_fk/text()}</title_id> <property_id>{//titles[ttl_title_id = $t/stt_title_id_fk/text()]/ttl_property_id_fk/text()}</property_id></title>
          </properties>
          </group>
          
          
       
       return 
       <result>
       {for $d in
       $dupeIds
       where count($d//property_id) gt count(distinct-values($d//property_id))
       order by $d/surname, $d/forenames
       return 
       ('&#x0a;',
       $d/surname/text(),
       ', ',
       $d/forenames/text(),
       '&#x0a;owner ids: ',
       for $o in $d//own_owner_id
       return ($o/text(), ' '),
       for $t in $d//title return
        (
        '&#x0a;&#09;title: ', $t/title_id/text(),
        '&#09;&#09;property: ', $t/property_id/text()
        ),
       '&#x0a;&#x0a;'
       )
       }
    </result>
    

08/05/12

Permalink 04:37:43 pm, by mholmes, 3 words, 51 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 170.5 + 1 = 171.5 hours G&T

On late duty.

Permalink 03:32:52 pm, by mholmes, 135 words, 187 views   English (CA)
Categories: Activity log; Mins. worked: 60

Deleting duplicate owners_to_ethnicities records

An unwanted side-effect from the way I've deduped owners was that I've ended up with duplicate records in the owners_to_ethnicities table (different primary keys, but the same owner and ethnicity). This is a quick way to fix that, which I've now implemented:

DELETE ote2
FROM owners_to_ethnicities AS ote1, owners_to_ethnicities AS ote2
WHERE ote1.ote_owner_id_fk = ote2.ote_owner_id_fk
AND ote1.ote_ethnicity_id_fk = ote2.ote_ethnicity_id_fk
AND ote2.ote_ote_id > ote1.ote_ote_id

Similar processes may have to be run on other linking tables. It's probably safer to allow the duplicate records to be created by the merge, then examine and de-dupe them, than it would be to make the de-dupe process itself, which is already complicated enough, more messy.

Permalink 01:51:18 pm, by mholmes, 20 words, 119 views   English (CA)
Categories: Activity log; Mins. worked: 240

NLP course work

Completed the week 7 programming assignment (6 is still not done, but I'm going to come back to it at the end).

Permalink 08:55:14 am, by mholmes, 3351 words, 232 views   English (CA)
Categories: Activity log, Tasks; Mins. worked: 60

New batch of owners de-duped; new tasks for tomorrow

Got a new batch of owners to de-dupe, ran the generator script and ran the SQL on the db to merge them. SQL is below for the record.

Two new tasks for tomorrow:

  • Generate a list of owners who have the same surname and forename, and who appear on titles which are linked to the same property (either as sellers or buyers).
  • Generate a list of owners who have multiple ethnicities, including id and names.
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1921" WHERE jtt_owner_id_fk = "1900"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1921" WHERE ltt_owner_id_fk = "1900"; UPDATE owners_to_titles SET ott_owner_id_fk = "1921" WHERE ott_owner_id_fk = "1900"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1921" WHERE stt_owner_id_fk = "1900"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1921" WHERE ote_owner_id_fk = "1900"; DELETE FROM owners WHERE own_owner_id = "1900";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2356" WHERE jtt_owner_id_fk = "2340"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2356" WHERE ltt_owner_id_fk = "2340"; UPDATE owners_to_titles SET ott_owner_id_fk = "2356" WHERE ott_owner_id_fk = "2340"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2356" WHERE stt_owner_id_fk = "2340"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2356" WHERE ote_owner_id_fk = "2340"; DELETE FROM owners WHERE own_owner_id = "2340";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2496" WHERE jtt_owner_id_fk = "2049"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2496" WHERE ltt_owner_id_fk = "2049"; UPDATE owners_to_titles SET ott_owner_id_fk = "2496" WHERE ott_owner_id_fk = "2049"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2496" WHERE stt_owner_id_fk = "2049"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2496" WHERE ote_owner_id_fk = "2049"; DELETE FROM owners WHERE own_owner_id = "2049";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2477" WHERE jtt_owner_id_fk = "2259"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2477" WHERE ltt_owner_id_fk = "2259"; UPDATE owners_to_titles SET ott_owner_id_fk = "2477" WHERE ott_owner_id_fk = "2259"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2477" WHERE stt_owner_id_fk = "2259"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2477" WHERE ote_owner_id_fk = "2259"; DELETE FROM owners WHERE own_owner_id = "2259";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2473" WHERE jtt_owner_id_fk = "2246"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2473" WHERE ltt_owner_id_fk = "2246"; UPDATE owners_to_titles SET ott_owner_id_fk = "2473" WHERE ott_owner_id_fk = "2246"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2473" WHERE stt_owner_id_fk = "2246"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2473" WHERE ote_owner_id_fk = "2246"; DELETE FROM owners WHERE own_owner_id = "2246";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2486" WHERE jtt_owner_id_fk = "2279"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2486" WHERE ltt_owner_id_fk = "2279"; UPDATE owners_to_titles SET ott_owner_id_fk = "2486" WHERE ott_owner_id_fk = "2279"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2486" WHERE stt_owner_id_fk = "2279"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2486" WHERE ote_owner_id_fk = "2279"; DELETE FROM owners WHERE own_owner_id = "2279";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2211" WHERE jtt_owner_id_fk = "2475"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2211" WHERE ltt_owner_id_fk = "2475"; UPDATE owners_to_titles SET ott_owner_id_fk = "2211" WHERE ott_owner_id_fk = "2475"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2211" WHERE stt_owner_id_fk = "2475"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2211" WHERE ote_owner_id_fk = "2475"; DELETE FROM owners WHERE own_owner_id = "2475";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2211" WHERE jtt_owner_id_fk = "2278"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2211" WHERE ltt_owner_id_fk = "2278"; UPDATE owners_to_titles SET ott_owner_id_fk = "2211" WHERE ott_owner_id_fk = "2278"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2211" WHERE stt_owner_id_fk = "2278"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2211" WHERE ote_owner_id_fk = "2278"; DELETE FROM owners WHERE own_owner_id = "2278";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2211" WHERE jtt_owner_id_fk = "2485"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2211" WHERE ltt_owner_id_fk = "2485"; UPDATE owners_to_titles SET ott_owner_id_fk = "2211" WHERE ott_owner_id_fk = "2485"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2211" WHERE stt_owner_id_fk = "2485"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2211" WHERE ote_owner_id_fk = "2485"; DELETE FROM owners WHERE own_owner_id = "2485";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2481" WHERE jtt_owner_id_fk = "2263"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2481" WHERE ltt_owner_id_fk = "2263"; UPDATE owners_to_titles SET ott_owner_id_fk = "2481" WHERE ott_owner_id_fk = "2263"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2481" WHERE stt_owner_id_fk = "2263"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2481" WHERE ote_owner_id_fk = "2263"; DELETE FROM owners WHERE own_owner_id = "2263";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2139" WHERE jtt_owner_id_fk = "2462"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2139" WHERE ltt_owner_id_fk = "2462"; UPDATE owners_to_titles SET ott_owner_id_fk = "2139" WHERE ott_owner_id_fk = "2462"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2139" WHERE stt_owner_id_fk = "2462"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2139" WHERE ote_owner_id_fk = "2462"; DELETE FROM owners WHERE own_owner_id = "2462";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2139" WHERE jtt_owner_id_fk = "1912"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2139" WHERE ltt_owner_id_fk = "1912"; UPDATE owners_to_titles SET ott_owner_id_fk = "2139" WHERE ott_owner_id_fk = "1912"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2139" WHERE stt_owner_id_fk = "1912"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2139" WHERE ote_owner_id_fk = "1912"; DELETE FROM owners WHERE own_owner_id = "1912";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2487" WHERE jtt_owner_id_fk = "2276"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2487" WHERE ltt_owner_id_fk = "2276"; UPDATE owners_to_titles SET ott_owner_id_fk = "2487" WHERE ott_owner_id_fk = "2276"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2487" WHERE stt_owner_id_fk = "2276"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2487" WHERE ote_owner_id_fk = "2276"; DELETE FROM owners WHERE own_owner_id = "2276";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2465" WHERE jtt_owner_id_fk = "2134"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2465" WHERE ltt_owner_id_fk = "2134"; UPDATE owners_to_titles SET ott_owner_id_fk = "2465" WHERE ott_owner_id_fk = "2134"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2465" WHERE stt_owner_id_fk = "2134"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2465" WHERE ote_owner_id_fk = "2134"; DELETE FROM owners WHERE own_owner_id = "2134";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2466" WHERE jtt_owner_id_fk = "2202"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2466" WHERE ltt_owner_id_fk = "2202"; UPDATE owners_to_titles SET ott_owner_id_fk = "2466" WHERE ott_owner_id_fk = "2202"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2466" WHERE stt_owner_id_fk = "2202"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2466" WHERE ote_owner_id_fk = "2202"; DELETE FROM owners WHERE own_owner_id = "2202";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2484" WHERE jtt_owner_id_fk = "2277"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2484" WHERE ltt_owner_id_fk = "2277"; UPDATE owners_to_titles SET ott_owner_id_fk = "2484" WHERE ott_owner_id_fk = "2277"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2484" WHERE stt_owner_id_fk = "2277"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2484" WHERE ote_owner_id_fk = "2277"; DELETE FROM owners WHERE own_owner_id = "2277";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2482" WHERE jtt_owner_id_fk = "2264"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2482" WHERE ltt_owner_id_fk = "2264"; UPDATE owners_to_titles SET ott_owner_id_fk = "2482" WHERE ott_owner_id_fk = "2264"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2482" WHERE stt_owner_id_fk = "2264"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2482" WHERE ote_owner_id_fk = "2264"; DELETE FROM owners WHERE own_owner_id = "2264";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2023" WHERE jtt_owner_id_fk = "2459"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2023" WHERE ltt_owner_id_fk = "2459"; UPDATE owners_to_titles SET ott_owner_id_fk = "2023" WHERE ott_owner_id_fk = "2459"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2023" WHERE stt_owner_id_fk = "2459"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2023" WHERE ote_owner_id_fk = "2459"; DELETE FROM owners WHERE own_owner_id = "2459";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2023" WHERE jtt_owner_id_fk = "2495"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2023" WHERE ltt_owner_id_fk = "2495"; UPDATE owners_to_titles SET ott_owner_id_fk = "2023" WHERE ott_owner_id_fk = "2495"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2023" WHERE stt_owner_id_fk = "2495"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2023" WHERE ote_owner_id_fk = "2495"; DELETE FROM owners WHERE own_owner_id = "2495";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2474" WHERE jtt_owner_id_fk = "2247"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2474" WHERE ltt_owner_id_fk = "2247"; UPDATE owners_to_titles SET ott_owner_id_fk = "2474" WHERE ott_owner_id_fk = "2247"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2474" WHERE stt_owner_id_fk = "2247"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2474" WHERE ote_owner_id_fk = "2247"; DELETE FROM owners WHERE own_owner_id = "2247";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2478" WHERE jtt_owner_id_fk = "2260"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2478" WHERE ltt_owner_id_fk = "2260"; UPDATE owners_to_titles SET ott_owner_id_fk = "2478" WHERE ott_owner_id_fk = "2260"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2478" WHERE stt_owner_id_fk = "2260"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2478" WHERE ote_owner_id_fk = "2260"; DELETE FROM owners WHERE own_owner_id = "2260";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2008" WHERE jtt_owner_id_fk = "2471"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2008" WHERE ltt_owner_id_fk = "2471"; UPDATE owners_to_titles SET ott_owner_id_fk = "2008" WHERE ott_owner_id_fk = "2471"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2008" WHERE stt_owner_id_fk = "2471"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2008" WHERE ote_owner_id_fk = "2471"; DELETE FROM owners WHERE own_owner_id = "2471";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2008" WHERE jtt_owner_id_fk = "2090"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2008" WHERE ltt_owner_id_fk = "2090"; UPDATE owners_to_titles SET ott_owner_id_fk = "2008" WHERE ott_owner_id_fk = "2090"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2008" WHERE stt_owner_id_fk = "2090"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2008" WHERE ote_owner_id_fk = "2090"; DELETE FROM owners WHERE own_owner_id = "2090";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2008" WHERE jtt_owner_id_fk = "1992"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2008" WHERE ltt_owner_id_fk = "1992"; UPDATE owners_to_titles SET ott_owner_id_fk = "2008" WHERE ott_owner_id_fk = "1992"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2008" WHERE stt_owner_id_fk = "1992"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2008" WHERE ote_owner_id_fk = "1992"; DELETE FROM owners WHERE own_owner_id = "1992";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2022" WHERE jtt_owner_id_fk = "2458"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2022" WHERE ltt_owner_id_fk = "2458"; UPDATE owners_to_titles SET ott_owner_id_fk = "2022" WHERE ott_owner_id_fk = "2458"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2022" WHERE stt_owner_id_fk = "2458"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2022" WHERE ote_owner_id_fk = "2458"; DELETE FROM owners WHERE own_owner_id = "2458";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2022" WHERE jtt_owner_id_fk = "2494"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2022" WHERE ltt_owner_id_fk = "2494"; UPDATE owners_to_titles SET ott_owner_id_fk = "2022" WHERE ott_owner_id_fk = "2494"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2022" WHERE stt_owner_id_fk = "2494"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2022" WHERE ote_owner_id_fk = "2494"; DELETE FROM owners WHERE own_owner_id = "2494";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2476" WHERE jtt_owner_id_fk = "2248"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2476" WHERE ltt_owner_id_fk = "2248"; UPDATE owners_to_titles SET ott_owner_id_fk = "2476" WHERE ott_owner_id_fk = "2248"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2476" WHERE stt_owner_id_fk = "2248"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2476" WHERE ote_owner_id_fk = "2248"; DELETE FROM owners WHERE own_owner_id = "2248";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2488" WHERE jtt_owner_id_fk = "2024"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2488" WHERE ltt_owner_id_fk = "2024"; UPDATE owners_to_titles SET ott_owner_id_fk = "2488" WHERE ott_owner_id_fk = "2024"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2488" WHERE stt_owner_id_fk = "2024"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2488" WHERE ote_owner_id_fk = "2024"; DELETE FROM owners WHERE own_owner_id = "2024";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2453" WHERE jtt_owner_id_fk = "2025"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2453" WHERE ltt_owner_id_fk = "2025"; UPDATE owners_to_titles SET ott_owner_id_fk = "2453" WHERE ott_owner_id_fk = "2025"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2453" WHERE stt_owner_id_fk = "2025"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2453" WHERE ote_owner_id_fk = "2025"; DELETE FROM owners WHERE own_owner_id = "2025";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2455" WHERE jtt_owner_id_fk = "2026"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2455" WHERE ltt_owner_id_fk = "2026"; UPDATE owners_to_titles SET ott_owner_id_fk = "2455" WHERE ott_owner_id_fk = "2026"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2455" WHERE stt_owner_id_fk = "2026"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2455" WHERE ote_owner_id_fk = "2026"; DELETE FROM owners WHERE own_owner_id = "2026";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2454" WHERE jtt_owner_id_fk = "2018"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2454" WHERE ltt_owner_id_fk = "2018"; UPDATE owners_to_titles SET ott_owner_id_fk = "2454" WHERE ott_owner_id_fk = "2018"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2454" WHERE stt_owner_id_fk = "2018"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2454" WHERE ote_owner_id_fk = "2018"; DELETE FROM owners WHERE own_owner_id = "2018";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2490" WHERE jtt_owner_id_fk = "2018"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2490" WHERE ltt_owner_id_fk = "2018"; UPDATE owners_to_titles SET ott_owner_id_fk = "2490" WHERE ott_owner_id_fk = "2018"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2490" WHERE stt_owner_id_fk = "2018"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2490" WHERE ote_owner_id_fk = "2018"; DELETE FROM owners WHERE own_owner_id = "2018";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2021" WHERE jtt_owner_id_fk = "2457"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2021" WHERE ltt_owner_id_fk = "2457"; UPDATE owners_to_titles SET ott_owner_id_fk = "2021" WHERE ott_owner_id_fk = "2457"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2021" WHERE stt_owner_id_fk = "2457"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2021" WHERE ote_owner_id_fk = "2457"; DELETE FROM owners WHERE own_owner_id = "2457";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2021" WHERE jtt_owner_id_fk = "2493"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2021" WHERE ltt_owner_id_fk = "2493"; UPDATE owners_to_titles SET ott_owner_id_fk = "2021" WHERE ott_owner_id_fk = "2493"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2021" WHERE stt_owner_id_fk = "2493"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2021" WHERE ote_owner_id_fk = "2493"; DELETE FROM owners WHERE own_owner_id = "2493";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2491" WHERE jtt_owner_id_fk = "2019"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2491" WHERE ltt_owner_id_fk = "2019"; UPDATE owners_to_titles SET ott_owner_id_fk = "2491" WHERE ott_owner_id_fk = "2019"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2491" WHERE stt_owner_id_fk = "2019"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2491" WHERE ote_owner_id_fk = "2019"; DELETE FROM owners WHERE own_owner_id = "2019";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2020" WHERE jtt_owner_id_fk = "2456"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2020" WHERE ltt_owner_id_fk = "2456"; UPDATE owners_to_titles SET ott_owner_id_fk = "2020" WHERE ott_owner_id_fk = "2456"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2020" WHERE stt_owner_id_fk = "2456"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2020" WHERE ote_owner_id_fk = "2456"; DELETE FROM owners WHERE own_owner_id = "2456";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2020" WHERE jtt_owner_id_fk = "2492"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2020" WHERE ltt_owner_id_fk = "2492"; UPDATE owners_to_titles SET ott_owner_id_fk = "2020" WHERE ott_owner_id_fk = "2492"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2020" WHERE stt_owner_id_fk = "2492"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2020" WHERE ote_owner_id_fk = "2492"; DELETE FROM owners WHERE own_owner_id = "2492";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2489" WHERE jtt_owner_id_fk = "2017"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2489" WHERE ltt_owner_id_fk = "2017"; UPDATE owners_to_titles SET ott_owner_id_fk = "2489" WHERE ott_owner_id_fk = "2017"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2489" WHERE stt_owner_id_fk = "2017"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2489" WHERE ote_owner_id_fk = "2017"; DELETE FROM owners WHERE own_owner_id = "2017";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "338" WHERE jtt_owner_id_fk = "337"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "338" WHERE ltt_owner_id_fk = "337"; UPDATE owners_to_titles SET ott_owner_id_fk = "338" WHERE ott_owner_id_fk = "337"; UPDATE sellers_to_titles SET stt_owner_id_fk = "338" WHERE stt_owner_id_fk = "337"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "338" WHERE ote_owner_id_fk = "337"; DELETE FROM owners WHERE own_owner_id = "337";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2463" WHERE jtt_owner_id_fk = "2140"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2463" WHERE ltt_owner_id_fk = "2140"; UPDATE owners_to_titles SET ott_owner_id_fk = "2463" WHERE ott_owner_id_fk = "2140"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2463" WHERE stt_owner_id_fk = "2140"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2463" WHERE ote_owner_id_fk = "2140"; DELETE FROM owners WHERE own_owner_id = "2140";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2469" WHERE jtt_owner_id_fk = "2146"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2469" WHERE ltt_owner_id_fk = "2146"; UPDATE owners_to_titles SET ott_owner_id_fk = "2469" WHERE ott_owner_id_fk = "2146"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2469" WHERE stt_owner_id_fk = "2146"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2469" WHERE ote_owner_id_fk = "2146"; DELETE FROM owners WHERE own_owner_id = "2146";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2472" WHERE jtt_owner_id_fk = "2009"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2472" WHERE ltt_owner_id_fk = "2009"; UPDATE owners_to_titles SET ott_owner_id_fk = "2472" WHERE ott_owner_id_fk = "2009"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2472" WHERE stt_owner_id_fk = "2009"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2472" WHERE ote_owner_id_fk = "2009"; DELETE FROM owners WHERE own_owner_id = "2009";

07/05/12

Permalink 05:41:17 pm, by mholmes, 10 words, 65 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 168.5 + 2 = 170.5 hours G&T

Lots to do, not enough time to do it in...

Permalink 04:30:42 pm, by mholmes, 240 words, 86 views   English (CA)
Categories: Activity log; Mins. worked: 240

Implemented Buyer_is_seller and Poss_family_transaction, and more

To get test data against which to check my XSLT implementation, I ran this against the database to get back instances of buyers and sellers sharing surnames (some because they were identical, some not):

Searching for family transactions:

SELECT titles.ttl_title_id, titles.ttl_title_code, 
sellers.own_owner_id AS seller_id, buyers.own_owner_id AS buyer_id 
FROM titles 
LEFT JOIN owners_to_titles on titles.ttl_title_id = owners_to_titles.ott_title_id_fk
LEFT JOIN owners AS buyers on owners_to_titles.ott_owner_id_fk = buyers.own_owner_id
LEFT JOIN sellers_to_titles on titles.ttl_title_id = sellers_to_titles.stt_title_id_fk
LEFT JOIN owners AS sellers on sellers_to_titles.stt_owner_id_fk = sellers.own_owner_id
WHERE buyers.own_surname = sellers.own_surname
AND buyers.own_owner_id IS NOT NULL
AND sellers.own_owner_id IS NOT NULL
LIMIT 0, 500

Then I wrote the XSLT to generate those two fields. It's a bit tricky to disambiguate the two fields -- all instances of Buyer_is_seller do have the same surname, of course, so you have to exclude them -- but I think I have it working OK.
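
The same two flags can also be expressed directly in SQL; this is just a sketch reusing the joins from the query above (the actual implementation is the XSLT), handy for spot-checking the output:

SELECT titles.ttl_title_id, titles.ttl_title_code,
       (buyers.own_owner_id = sellers.own_owner_id) AS buyer_is_seller,          -- same individual on both sides
       (buyers.own_surname = sellers.own_surname
        AND buyers.own_owner_id <> sellers.own_owner_id) AS poss_family_transaction  -- same surname, different individuals
FROM titles
LEFT JOIN owners_to_titles ON titles.ttl_title_id = owners_to_titles.ott_title_id_fk
LEFT JOIN owners AS buyers ON owners_to_titles.ott_owner_id_fk = buyers.own_owner_id
LEFT JOIN sellers_to_titles ON titles.ttl_title_id = sellers_to_titles.stt_title_id_fk
LEFT JOIN owners AS sellers ON sellers_to_titles.stt_owner_id_fk = sellers.own_owner_id
WHERE buyers.own_owner_id IS NOT NULL
  AND sellers.own_owner_id IS NOT NULL
  AND buyers.own_surname = sellers.own_surname;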

In the process, I discovered a lot more candidate owner duplicates, so my process last week was obviously too cautious. I've re-run it and generated a new list, which JS-R will look at, then I'll de-dupe those.

Permalink 11:30:47 am, by Erin, 52 words, 160 views   English (CA)
Categories: Activity log; Mins. worked: 170

notes for person-ography

added "author" in notes for peoplexml added Adam biography, argenterius is written, Teissier is written. got call numbers for sources in library for Du Verdier's Bio. Wrote Winslow biography Worked on Clusius' biography - requested book from UBC - The World of Carolus Clusius Began de Thou's biography - to be continued...
Permalink 01:29:11 am, by Janelle Jenstad Email , 38 words, 51 views   English (CA)
Categories: Activity log; Mins. worked: 30

News Tab

Added a News tab to the main site navigation bar and created page. Would like to make this a dynamic feature -- link in with FB, this blog, Twitter, for example -- and make it an RSS feed.

04/05/12

Permalink 02:32:53 pm, by mholmes, 2 words, 57 views   English (CA)
Categories: G&T Hours; Mins. worked: 0

MDH: 169.5 - 1 = 168.5 hours G&T

Leaving early.

Permalink 02:03:19 pm, by jnazar, 23 words, 71 views   English (CA)
Categories: Activity log; Mins. worked: 60

"Interviews" - project (Part 2)

Part 2 of the Interviews project has started. Recording sessions are now being booked. Written summaries will be submitted by the interviewers and will accompany the interview records.

Permalink 02:03:03 pm, by mholmes, 10 words, 63 views   English (CA)
Categories: Activity log; Mins. worked: 130

Committee work

Posting time spent in committee/focus group work in HR.

Permalink 02:02:27 pm, by mholmes, 265 words, 77 views   English (CA)
Categories: Activity log; Mins. worked: 120

Diagnosing problems with owners, and specifying requirements for next week

Discussion with JS-R by phone, during which we determined some immediate tasks:

  • DONE: Fix the proliferation of new individual owners added with only display_name info (no surname or forename). Discovered 262 of these, using this query:
    SELECT * FROM `owners` WHERE own_institutional=0 AND own_surname=""
    
    and at JS-R's request, fixed 260 of them with these two statements (a worked example of the string-splitting appears after this list):
    UPDATE `owners` SET `own_surname` = (SELECT RIGHT( `own_display_name` , LOCATE( ' ', REVERSE( `own_display_name` ) ) - 1 )) WHERE CHAR_LENGTH(`own_surname`) = 0 and own_institutional=0;
    
    UPDATE `owners` SET `own_forenames` = (SELECT LEFT( `own_display_name` , CHAR_LENGTH( `own_display_name`) - LOCATE( ' ', REVERSE( `own_display_name` ) ) )) WHERE CHAR_LENGTH(`own_forenames`) = 0 and own_institutional = 0;
    
  • DONE: Generate a list of owners not associated with any title. That was done with this:
    SELECT owners.* FROM owners 
         LEFT JOIN owners_to_titles ON owners.own_owner_id=owners_to_titles.ott_owner_id_fk
    WHERE owners_to_titles.ott_owner_id_fk IS NULL;
    
    and sent to JS-R in the form of a spreadsheet. These owners may be linked in some other way, of course -- as sellers, or through mortgages. "owners" is actually "actors" (individuals or companies who act in transactions).
  • TODO: Add a new flag to the new spreadsheet generated from XML, which is true when the same individual is involved both as a buyer and a seller in a transaction (buyer_is_seller).
  • TODO: Add a new flag to the new spreadsheet generated from XML, which is true when there is a buyer who has the same surname as a seller, but is a different individual (poss_family_transaction).
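
As referenced in the first item above, here is how the RIGHT/LOCATE/REVERSE expressions behave on a sample display name (the name is purely illustrative): everything after the last space becomes the surname, and everything before it the forenames.

SELECT RIGHT('John A. Smith', LOCATE(' ', REVERSE('John A. Smith')) - 1) AS own_surname,                               -- 'Smith'
       LEFT('John A. Smith', CHAR_LENGTH('John A. Smith') - LOCATE(' ', REVERSE('John A. Smith'))) AS own_forenames;   -- 'John A.'
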
Permalink 01:37:54 pm, by jnazar, 16 words, 136 views   English (CA)
Categories: Activity log; Mins. worked: 15

FAMIS REPORT

Submitted FAMIS report for supply of bioBags to be delivered to HCMC office via campus mail.

Permalink 01:31:07 pm, by jnazar, 25 words, 70 views   English (CA)
Categories: Activity log; Mins. worked: 120

HCMC website - Cascade

Continued inserting content.
Working on redirecting links (old to new site).
Setting up tabs as display option.
Final editing of content will be done later.

Permalink 11:35:38 am, by jnazar, 18 words, 53 views   English (CA)
Categories: Activity log; Mins. worked: 30

Hispanic and Italian Studies - Cascade website

Sent email to DR, DF (cc'd SA) summarizing the recent Cascade website meeting. Included communications attachments as reference material.

Permalink 11:29:51 am, by Erin, 17 words, 123 views   English (CA)
Categories: Activity log; Mins. worked: 165

biographies

Research for Languet, Lancisi, La Croix du Maine, Antoine du Verdier, Teissier, Eloy, Castellanus, Benigne-Winslow, Avicenne, Adam.
Permalink 08:52:03 am, by Erin, 20 words, 140 views   English (CA)
Categories: Activity log; Mins. worked: 180

Wednesday

Wednesday I spent the morning in the library doing some transcription and research for biographies (Fabri, Clusius, Argenterius, Bottoni, Dryander).
Permalink 08:16:01 am, by mholmes, 3669 words, 76 views   English (CA)
Categories: Activity log; Mins. worked: 30

Owner de-duplication done

Got back a spreadsheet with owners who can be merged, and ran the automated process, testing first on the dev db. One issue was with multiple owners who are all to be merged into one; these operations obviously have to be done in the right order. Once that was sorted out, and some two-way dupes (x=y and y=x) taken care of, the process seems to have gone smoothly. This is the dupe list and the SQL, for the record:

48|902
49|51
65|155
73|253
101|104
101|436
104|436
187|250
202|252
330|332
459|553
460|555
483|502
544|454
569|471
1131|1141
1863|1864
1870|1872
1870|1999
1870|2076
1872|1999
1872|2076
1875|1877
1887|1889
1897|1898
1917|1926
1946|2100
1979|1980
1979|1981
1980|1981
1999|2076
2015|2034
2024|2016
2032|2053
2041|2165
2041|2166
2070|2158
2075|2077
2092|2093
2163|2164
2165|2166
2171|2172
2194|2154
2220|2221
2286|2287
2452|2024
2452|2024
2452|2016


UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "902" WHERE jtt_owner_id_fk = "48"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "902" WHERE ltt_owner_id_fk = "48"; UPDATE owners_to_titles SET ott_owner_id_fk = "902" WHERE ott_owner_id_fk = "48"; UPDATE sellers_to_titles SET stt_owner_id_fk = "902" WHERE stt_owner_id_fk = "48"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "902" WHERE ote_owner_id_fk = "48"; DELETE FROM owners WHERE own_owner_id = "48";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "51" WHERE jtt_owner_id_fk = "49"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "51" WHERE ltt_owner_id_fk = "49"; UPDATE owners_to_titles SET ott_owner_id_fk = "51" WHERE ott_owner_id_fk = "49"; UPDATE sellers_to_titles SET stt_owner_id_fk = "51" WHERE stt_owner_id_fk = "49"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "51" WHERE ote_owner_id_fk = "49"; DELETE FROM owners WHERE own_owner_id = "49";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "155" WHERE jtt_owner_id_fk = "65"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "155" WHERE ltt_owner_id_fk = "65"; UPDATE owners_to_titles SET ott_owner_id_fk = "155" WHERE ott_owner_id_fk = "65"; UPDATE sellers_to_titles SET stt_owner_id_fk = "155" WHERE stt_owner_id_fk = "65"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "155" WHERE ote_owner_id_fk = "65"; DELETE FROM owners WHERE own_owner_id = "65";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "253" WHERE jtt_owner_id_fk = "73"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "253" WHERE ltt_owner_id_fk = "73"; UPDATE owners_to_titles SET ott_owner_id_fk = "253" WHERE ott_owner_id_fk = "73"; UPDATE sellers_to_titles SET stt_owner_id_fk = "253" WHERE stt_owner_id_fk = "73"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "253" WHERE ote_owner_id_fk = "73"; DELETE FROM owners WHERE own_owner_id = "73";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "104" WHERE jtt_owner_id_fk = "101"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "104" WHERE ltt_owner_id_fk = "101"; UPDATE owners_to_titles SET ott_owner_id_fk = "104" WHERE ott_owner_id_fk = "101"; UPDATE sellers_to_titles SET stt_owner_id_fk = "104" WHERE stt_owner_id_fk = "101"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "104" WHERE ote_owner_id_fk = "101"; DELETE FROM owners WHERE own_owner_id = "101";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "436" WHERE jtt_owner_id_fk = "101"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "436" WHERE ltt_owner_id_fk = "101"; UPDATE owners_to_titles SET ott_owner_id_fk = "436" WHERE ott_owner_id_fk = "101"; UPDATE sellers_to_titles SET stt_owner_id_fk = "436" WHERE stt_owner_id_fk = "101"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "436" WHERE ote_owner_id_fk = "101"; DELETE FROM owners WHERE own_owner_id = "101";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "436" WHERE jtt_owner_id_fk = "104"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "436" WHERE ltt_owner_id_fk = "104"; UPDATE owners_to_titles SET ott_owner_id_fk = "436" WHERE ott_owner_id_fk = "104"; UPDATE sellers_to_titles SET stt_owner_id_fk = "436" WHERE stt_owner_id_fk = "104"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "436" WHERE ote_owner_id_fk = "104"; DELETE FROM owners WHERE own_owner_id = "104";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "250" WHERE jtt_owner_id_fk = "187"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "250" WHERE ltt_owner_id_fk = "187"; UPDATE owners_to_titles SET ott_owner_id_fk = "250" WHERE ott_owner_id_fk = "187"; UPDATE sellers_to_titles SET stt_owner_id_fk = "250" WHERE stt_owner_id_fk = "187"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "250" WHERE ote_owner_id_fk = "187"; DELETE FROM owners WHERE own_owner_id = "187";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "252" WHERE jtt_owner_id_fk = "202"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "252" WHERE ltt_owner_id_fk = "202"; UPDATE owners_to_titles SET ott_owner_id_fk = "252" WHERE ott_owner_id_fk = "202"; UPDATE sellers_to_titles SET stt_owner_id_fk = "252" WHERE stt_owner_id_fk = "202"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "252" WHERE ote_owner_id_fk = "202"; DELETE FROM owners WHERE own_owner_id = "202";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "332" WHERE jtt_owner_id_fk = "330"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "332" WHERE ltt_owner_id_fk = "330"; UPDATE owners_to_titles SET ott_owner_id_fk = "332" WHERE ott_owner_id_fk = "330"; UPDATE sellers_to_titles SET stt_owner_id_fk = "332" WHERE stt_owner_id_fk = "330"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "332" WHERE ote_owner_id_fk = "330"; DELETE FROM owners WHERE own_owner_id = "330";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "553" WHERE jtt_owner_id_fk = "459"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "553" WHERE ltt_owner_id_fk = "459"; UPDATE owners_to_titles SET ott_owner_id_fk = "553" WHERE ott_owner_id_fk = "459"; UPDATE sellers_to_titles SET stt_owner_id_fk = "553" WHERE stt_owner_id_fk = "459"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "553" WHERE ote_owner_id_fk = "459"; DELETE FROM owners WHERE own_owner_id = "459";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "555" WHERE jtt_owner_id_fk = "460"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "555" WHERE ltt_owner_id_fk = "460"; UPDATE owners_to_titles SET ott_owner_id_fk = "555" WHERE ott_owner_id_fk = "460"; UPDATE sellers_to_titles SET stt_owner_id_fk = "555" WHERE stt_owner_id_fk = "460"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "555" WHERE ote_owner_id_fk = "460"; DELETE FROM owners WHERE own_owner_id = "460";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "502" WHERE jtt_owner_id_fk = "483"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "502" WHERE ltt_owner_id_fk = "483"; UPDATE owners_to_titles SET ott_owner_id_fk = "502" WHERE ott_owner_id_fk = "483"; UPDATE sellers_to_titles SET stt_owner_id_fk = "502" WHERE stt_owner_id_fk = "483"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "502" WHERE ote_owner_id_fk = "483"; DELETE FROM owners WHERE own_owner_id = "483";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "454" WHERE jtt_owner_id_fk = "544"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "454" WHERE ltt_owner_id_fk = "544"; UPDATE owners_to_titles SET ott_owner_id_fk = "454" WHERE ott_owner_id_fk = "544"; UPDATE sellers_to_titles SET stt_owner_id_fk = "454" WHERE stt_owner_id_fk = "544"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "454" WHERE ote_owner_id_fk = "544"; DELETE FROM owners WHERE own_owner_id = "544";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "471" WHERE jtt_owner_id_fk = "569"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "471" WHERE ltt_owner_id_fk = "569"; UPDATE owners_to_titles SET ott_owner_id_fk = "471" WHERE ott_owner_id_fk = "569"; UPDATE sellers_to_titles SET stt_owner_id_fk = "471" WHERE stt_owner_id_fk = "569"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "471" WHERE ote_owner_id_fk = "569"; DELETE FROM owners WHERE own_owner_id = "569";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1141" WHERE jtt_owner_id_fk = "1131"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1141" WHERE ltt_owner_id_fk = "1131"; UPDATE owners_to_titles SET ott_owner_id_fk = "1141" WHERE ott_owner_id_fk = "1131"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1141" WHERE stt_owner_id_fk = "1131"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1141" WHERE ote_owner_id_fk = "1131"; DELETE FROM owners WHERE own_owner_id = "1131";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1864" WHERE jtt_owner_id_fk = "1863"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1864" WHERE ltt_owner_id_fk = "1863"; UPDATE owners_to_titles SET ott_owner_id_fk = "1864" WHERE ott_owner_id_fk = "1863"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1864" WHERE stt_owner_id_fk = "1863"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1864" WHERE ote_owner_id_fk = "1863"; DELETE FROM owners WHERE own_owner_id = "1863";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1872" WHERE jtt_owner_id_fk = "1870"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1872" WHERE ltt_owner_id_fk = "1870"; UPDATE owners_to_titles SET ott_owner_id_fk = "1872" WHERE ott_owner_id_fk = "1870"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1872" WHERE stt_owner_id_fk = "1870"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1872" WHERE ote_owner_id_fk = "1870"; DELETE FROM owners WHERE own_owner_id = "1870";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1999" WHERE jtt_owner_id_fk = "1870"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1999" WHERE ltt_owner_id_fk = "1870"; UPDATE owners_to_titles SET ott_owner_id_fk = "1999" WHERE ott_owner_id_fk = "1870"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1999" WHERE stt_owner_id_fk = "1870"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1999" WHERE ote_owner_id_fk = "1870"; DELETE FROM owners WHERE own_owner_id = "1870";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2076" WHERE jtt_owner_id_fk = "1870"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2076" WHERE ltt_owner_id_fk = "1870"; UPDATE owners_to_titles SET ott_owner_id_fk = "2076" WHERE ott_owner_id_fk = "1870"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2076" WHERE stt_owner_id_fk = "1870"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2076" WHERE ote_owner_id_fk = "1870"; DELETE FROM owners WHERE own_owner_id = "1870";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1999" WHERE jtt_owner_id_fk = "1872"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1999" WHERE ltt_owner_id_fk = "1872"; UPDATE owners_to_titles SET ott_owner_id_fk = "1999" WHERE ott_owner_id_fk = "1872"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1999" WHERE stt_owner_id_fk = "1872"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1999" WHERE ote_owner_id_fk = "1872"; DELETE FROM owners WHERE own_owner_id = "1872";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2076" WHERE jtt_owner_id_fk = "1872"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2076" WHERE ltt_owner_id_fk = "1872"; UPDATE owners_to_titles SET ott_owner_id_fk = "2076" WHERE ott_owner_id_fk = "1872"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2076" WHERE stt_owner_id_fk = "1872"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2076" WHERE ote_owner_id_fk = "1872"; DELETE FROM owners WHERE own_owner_id = "1872";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1877" WHERE jtt_owner_id_fk = "1875"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1877" WHERE ltt_owner_id_fk = "1875"; UPDATE owners_to_titles SET ott_owner_id_fk = "1877" WHERE ott_owner_id_fk = "1875"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1877" WHERE stt_owner_id_fk = "1875"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1877" WHERE ote_owner_id_fk = "1875"; DELETE FROM owners WHERE own_owner_id = "1875";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1889" WHERE jtt_owner_id_fk = "1887"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1889" WHERE ltt_owner_id_fk = "1887"; UPDATE owners_to_titles SET ott_owner_id_fk = "1889" WHERE ott_owner_id_fk = "1887"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1889" WHERE stt_owner_id_fk = "1887"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1889" WHERE ote_owner_id_fk = "1887"; DELETE FROM owners WHERE own_owner_id = "1887";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1898" WHERE jtt_owner_id_fk = "1897"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1898" WHERE ltt_owner_id_fk = "1897"; UPDATE owners_to_titles SET ott_owner_id_fk = "1898" WHERE ott_owner_id_fk = "1897"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1898" WHERE stt_owner_id_fk = "1897"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1898" WHERE ote_owner_id_fk = "1897"; DELETE FROM owners WHERE own_owner_id = "1897";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1926" WHERE jtt_owner_id_fk = "1917"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1926" WHERE ltt_owner_id_fk = "1917"; UPDATE owners_to_titles SET ott_owner_id_fk = "1926" WHERE ott_owner_id_fk = "1917"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1926" WHERE stt_owner_id_fk = "1917"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1926" WHERE ote_owner_id_fk = "1917"; DELETE FROM owners WHERE own_owner_id = "1917";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2100" WHERE jtt_owner_id_fk = "1946"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2100" WHERE ltt_owner_id_fk = "1946"; UPDATE owners_to_titles SET ott_owner_id_fk = "2100" WHERE ott_owner_id_fk = "1946"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2100" WHERE stt_owner_id_fk = "1946"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2100" WHERE ote_owner_id_fk = "1946"; DELETE FROM owners WHERE own_owner_id = "1946";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1980" WHERE jtt_owner_id_fk = "1979"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1980" WHERE ltt_owner_id_fk = "1979"; UPDATE owners_to_titles SET ott_owner_id_fk = "1980" WHERE ott_owner_id_fk = "1979"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1980" WHERE stt_owner_id_fk = "1979"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1980" WHERE ote_owner_id_fk = "1979"; DELETE FROM owners WHERE own_owner_id = "1979";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1981" WHERE jtt_owner_id_fk = "1979"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1981" WHERE ltt_owner_id_fk = "1979"; UPDATE owners_to_titles SET ott_owner_id_fk = "1981" WHERE ott_owner_id_fk = "1979"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1981" WHERE stt_owner_id_fk = "1979"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1981" WHERE ote_owner_id_fk = "1979"; DELETE FROM owners WHERE own_owner_id = "1979";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "1981" WHERE jtt_owner_id_fk = "1980"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "1981" WHERE ltt_owner_id_fk = "1980"; UPDATE owners_to_titles SET ott_owner_id_fk = "1981" WHERE ott_owner_id_fk = "1980"; UPDATE sellers_to_titles SET stt_owner_id_fk = "1981" WHERE stt_owner_id_fk = "1980"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "1981" WHERE ote_owner_id_fk = "1980"; DELETE FROM owners WHERE own_owner_id = "1980";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2076" WHERE jtt_owner_id_fk = "1999"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2076" WHERE ltt_owner_id_fk = "1999"; UPDATE owners_to_titles SET ott_owner_id_fk = "2076" WHERE ott_owner_id_fk = "1999"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2076" WHERE stt_owner_id_fk = "1999"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2076" WHERE ote_owner_id_fk = "1999"; DELETE FROM owners WHERE own_owner_id = "1999";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2034" WHERE jtt_owner_id_fk = "2015"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2034" WHERE ltt_owner_id_fk = "2015"; UPDATE owners_to_titles SET ott_owner_id_fk = "2034" WHERE ott_owner_id_fk = "2015"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2034" WHERE stt_owner_id_fk = "2015"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2034" WHERE ote_owner_id_fk = "2015"; DELETE FROM owners WHERE own_owner_id = "2015";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2053" WHERE jtt_owner_id_fk = "2032"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2053" WHERE ltt_owner_id_fk = "2032"; UPDATE owners_to_titles SET ott_owner_id_fk = "2053" WHERE ott_owner_id_fk = "2032"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2053" WHERE stt_owner_id_fk = "2032"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2053" WHERE ote_owner_id_fk = "2032"; DELETE FROM owners WHERE own_owner_id = "2032";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2165" WHERE jtt_owner_id_fk = "2041"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2165" WHERE ltt_owner_id_fk = "2041"; UPDATE owners_to_titles SET ott_owner_id_fk = "2165" WHERE ott_owner_id_fk = "2041"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2165" WHERE stt_owner_id_fk = "2041"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2165" WHERE ote_owner_id_fk = "2041"; DELETE FROM owners WHERE own_owner_id = "2041";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2166" WHERE jtt_owner_id_fk = "2041"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2166" WHERE ltt_owner_id_fk = "2041"; UPDATE owners_to_titles SET ott_owner_id_fk = "2166" WHERE ott_owner_id_fk = "2041"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2166" WHERE stt_owner_id_fk = "2041"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2166" WHERE ote_owner_id_fk = "2041"; DELETE FROM owners WHERE own_owner_id = "2041";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2158" WHERE jtt_owner_id_fk = "2070"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2158" WHERE ltt_owner_id_fk = "2070"; UPDATE owners_to_titles SET ott_owner_id_fk = "2158" WHERE ott_owner_id_fk = "2070"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2158" WHERE stt_owner_id_fk = "2070"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2158" WHERE ote_owner_id_fk = "2070"; DELETE FROM owners WHERE own_owner_id = "2070";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2077" WHERE jtt_owner_id_fk = "2075"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2077" WHERE ltt_owner_id_fk = "2075"; UPDATE owners_to_titles SET ott_owner_id_fk = "2077" WHERE ott_owner_id_fk = "2075"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2077" WHERE stt_owner_id_fk = "2075"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2077" WHERE ote_owner_id_fk = "2075"; DELETE FROM owners WHERE own_owner_id = "2075";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2093" WHERE jtt_owner_id_fk = "2092"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2093" WHERE ltt_owner_id_fk = "2092"; UPDATE owners_to_titles SET ott_owner_id_fk = "2093" WHERE ott_owner_id_fk = "2092"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2093" WHERE stt_owner_id_fk = "2092"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2093" WHERE ote_owner_id_fk = "2092"; DELETE FROM owners WHERE own_owner_id = "2092";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2164" WHERE jtt_owner_id_fk = "2163"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2164" WHERE ltt_owner_id_fk = "2163"; UPDATE owners_to_titles SET ott_owner_id_fk = "2164" WHERE ott_owner_id_fk = "2163"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2164" WHERE stt_owner_id_fk = "2163"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2164" WHERE ote_owner_id_fk = "2163"; DELETE FROM owners WHERE own_owner_id = "2163";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2166" WHERE jtt_owner_id_fk = "2165"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2166" WHERE ltt_owner_id_fk = "2165"; UPDATE owners_to_titles SET ott_owner_id_fk = "2166" WHERE ott_owner_id_fk = "2165"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2166" WHERE stt_owner_id_fk = "2165"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2166" WHERE ote_owner_id_fk = "2165"; DELETE FROM owners WHERE own_owner_id = "2165";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2172" WHERE jtt_owner_id_fk = "2171"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2172" WHERE ltt_owner_id_fk = "2171"; UPDATE owners_to_titles SET ott_owner_id_fk = "2172" WHERE ott_owner_id_fk = "2171"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2172" WHERE stt_owner_id_fk = "2171"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2172" WHERE ote_owner_id_fk = "2171"; DELETE FROM owners WHERE own_owner_id = "2171";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2154" WHERE jtt_owner_id_fk = "2194"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2154" WHERE ltt_owner_id_fk = "2194"; UPDATE owners_to_titles SET ott_owner_id_fk = "2154" WHERE ott_owner_id_fk = "2194"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2154" WHERE stt_owner_id_fk = "2194"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2154" WHERE ote_owner_id_fk = "2194"; DELETE FROM owners WHERE own_owner_id = "2194";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2221" WHERE jtt_owner_id_fk = "2220"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2221" WHERE ltt_owner_id_fk = "2220"; UPDATE owners_to_titles SET ott_owner_id_fk = "2221" WHERE ott_owner_id_fk = "2220"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2221" WHERE stt_owner_id_fk = "2220"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2221" WHERE ote_owner_id_fk = "2220"; DELETE FROM owners WHERE own_owner_id = "2220";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2287" WHERE jtt_owner_id_fk = "2286"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2287" WHERE ltt_owner_id_fk = "2286"; UPDATE owners_to_titles SET ott_owner_id_fk = "2287" WHERE ott_owner_id_fk = "2286"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2287" WHERE stt_owner_id_fk = "2286"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2287" WHERE ote_owner_id_fk = "2286"; DELETE FROM owners WHERE own_owner_id = "2286";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2024" WHERE jtt_owner_id_fk = "2452"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2024" WHERE ltt_owner_id_fk = "2452"; UPDATE owners_to_titles SET ott_owner_id_fk = "2024" WHERE ott_owner_id_fk = "2452"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2024" WHERE stt_owner_id_fk = "2452"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2024" WHERE ote_owner_id_fk = "2452"; DELETE FROM owners WHERE own_owner_id = "2452";
UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "2016" WHERE jtt_owner_id_fk = "2452"; UPDATE lawyers_to_titles SET ltt_owner_id_fk = "2016" WHERE ltt_owner_id_fk = "2452"; UPDATE owners_to_titles SET ott_owner_id_fk = "2016" WHERE ott_owner_id_fk = "2452"; UPDATE sellers_to_titles SET stt_owner_id_fk = "2016" WHERE stt_owner_id_fk = "2452"; UPDATE owners_to_ethnicities SET ote_owner_id_fk = "2016" WHERE ote_owner_id_fk = "2452"; DELETE FROM owners WHERE own_owner_id = "2452";

03/05/12

Permalink 04:01:08 pm, by mholmes, 52 words, 81 views   English (CA)
Categories: Activity log; Mins. worked: 90

Automation of owner-merging

Tomorrow a list of duplicate owners to be merged will come back from JS-R. In preparation, I've written a script that will automatically produce the correct SQL to do the job (which involves changes to five tables), along with some SQL that can be used to check that the process went OK.
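
The script itself isn't reproduced here, but the same merge SQL could equally be generated inside MySQL; a minimal sketch, assuming the keep/merge pairs were loaded into a hypothetical scratch table owner_dupes(keep_id, merge_id):

SELECT CONCAT(
    'UPDATE joint_tenants_to_titles SET jtt_owner_id_fk = "', keep_id, '" WHERE jtt_owner_id_fk = "', merge_id, '"; ',
    'UPDATE lawyers_to_titles SET ltt_owner_id_fk = "', keep_id, '" WHERE ltt_owner_id_fk = "', merge_id, '"; ',
    'UPDATE owners_to_titles SET ott_owner_id_fk = "', keep_id, '" WHERE ott_owner_id_fk = "', merge_id, '"; ',
    'UPDATE sellers_to_titles SET stt_owner_id_fk = "', keep_id, '" WHERE stt_owner_id_fk = "', merge_id, '"; ',
    'UPDATE owners_to_ethnicities SET ote_owner_id_fk = "', keep_id, '" WHERE ote_owner_id_fk = "', merge_id, '"; ',
    'DELETE FROM owners WHERE own_owner_id = "', merge_id, '";'
) AS merge_sql
FROM owner_dupes;    -- one line of merge SQL per keep/merge pair (five link tables plus the final delete)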

Permalink 03:57:47 pm, by mholmes, 10 words, 49 views   English (CA)
Categories: Activity log; Mins. worked: 120

Meetings with JS

Two meetings with JS to help with CFI grant planning.

Permalink 03:55:29 pm, by mholmes, 55 words, 63 views   English (CA)
Categories: Activity log; Mins. worked: 75

Planning meeting

Met with JJ to start planning the next four years. Decisions made on:

  • Stow texts (probably 1598, 1603, 1633 and 1908); some already transcribed, some encoded (EEBO/TCP), one needs transcribing (1603).
  • Map: first priority is to get the new image, then get a tiled zoomable version of it up, for the use of encoders. Then look at the layers.
Permalink 02:03:03 pm, by sarneil, 212 words, 144 views   English (CA)
Categories: Activity log; Mins. worked: 180

future registrars and hosts

Met with Greg JL and MF to confirm plans for future registrars and hosts for various domain names associated with canmys. Discovered that there are two domain registrars involved (sibername and domainpeople). The two primary domain names are registered through sibername (but point at the uvic dns), so at some point we'll apply to transfer those to domainpeople. All the domain names registered through domainpeople are using the synergies Domain Name Servers (or, for domains that are unused, the dns at domainpeople). We've got access to change those, but before we actually do, we have to make sure that we're ready to handle the requests on the uvic servers. The specific problem is the email addresses. Greg will arrange with the sys admins how to handle the admin@, feedback@ and webmaster@ email addresses.

Backed up the entire set of files on the development server (synergies) to my mac and then backed that up. Backed up the mysql db containing all the development files (Manual's CMS backend) to my mac and backed that up. Renamed a number of the folders in the public folder on the dev server so that any URLs to the old folder names would fail, and added instructions in the htaccess file and a 404.htm file to catch those.
Permalink 10:24:22 am, by mholmes, 13 words, 76 views   English (CA)
Categories: Activity log; Mins. worked: 15

Hispanital site updates

Did a couple of updates to the old Hispanital site on DR's instructions.

Permalink 08:00:38 am, by mholmes, 66 words, 64 views   English (CA)
Categories: Activity log; Mins. worked: 15

Graves: solution to the puzzle

The mysterious numbers in Graves bibliographic data turn out to be references to the Higginson bibliography, and in the case of Riding, to the Wexler one. Duplication of idnos is caused by the fact that multiple analytic-level items may appear in the same m-level work. We can preserve these numbers in the data without worrying about displaying them, presumably, which obviates the need to explain them.

02/05/12

Permalink 02:52:23 pm, by mholmes, 339 words, 72 views   English (CA)
Categories: Activity log; Mins. worked: 360

Graves: existing markup all converted to P5

I've now finished the conversion of the existing markup to P5, after a fair amount of work converting the references file from its proprietary XML to TEI. One puzzle remains (see below).

This is the collection structure I've got so far:

  • graves.rng (the schema; this will be a generic diary schema when we're done)
  • abstracts (XML files containing <div type="abstract">s)
  • editorial (XML files from which each root-level <div> will become a front page on the site -- e.g. <div id="home" n="Home">, which would result in a menu entry "Home")
  • entries (XML files containing <div type="diaryEntry">s; we may just handle any root level <div>, and assume that all attachments, enclosures etc. are contained within root level <div>s. In this case, we'll have to check the structure of the Graves files.)
  • metadata (XML files containing metadata for the entire project. I don't yet know how these will be rendered, if at all.)

The puzzle: Bibliographic references have an "idno" field, and I'm not sure what kind of idno it is. For instance, "I, Claudius" has the idno "A42". It can't be a Cutter number, or it would begin with G.

These are not the unique ids used for linking <rs> elements in the text to references (which are stored in the id attribute); these are in addition to those. Also, these idnos are not unique; Graves's "Landscape" and Riding's "Letter on International Affairs" share the idno A36.

For the moment, I've preserved these as an <idno> element in the P5. This is the complete list of those idnos:

A101 A13 A18 A21c A23 A26 A29 A32 A33 A35 A35 [Wexler] A36 A37 A38
A39 A40 A41 A42 A43 A44 A45 A46 A47 A48 A49 A50 A58 A59 A6 A60 A7 A9
B22 B23 B24 B25 B26 B27 B29 B29.1 B30 C22 C24 C291 C291.2 C291.3
C291.6 C291.7 C292 C292.1 C292.2 C292.3 C293 C293.1 C293.2 C294 
Permalink 11:42:41 am, by jnazar, 54 words, 48 views   English (CA)
Categories: Activity log; Mins. worked: 60

Hispanic and Italian Studies - Cascade website

Had meeting today with DR, DF.
Reviewed:
- the site's structure
- temporary inserted content
- blocks; page editing; saving changes; publishing; linking; photos and tips for writing for the web.

Decisions made:
- DR: register for upcoming training session
- Hispanic Dept: plan/organize the department's division of the new site's workload; decide on timeframe

Permalink 09:27:10 am, by jnazar, 17 words, 68 views   English (CA)
Categories: Activity log; Mins. worked: 20

FAMIS REPORT

Submitted FAMIS report today for replacement/repair of office lights in B043 and adjoining research computer area.

01/05/12

Permalink 04:20:11 pm, by sarneil, 159 words, 138 views   English (CA)
Categories: Activity log; Mins. worked: 90

vpn : problem connecting from web.uvic.ca page to mysql.hcmc db server

Went in to debug some things on the VPN site and noticed that the home page was taking forever to load (and in fact I think I ended up getting some cached version of the page).

Web server serving static pages: no problem.
phpMyAdmin communicating with the db: no problem.
Therefore the problem has to be something to do with the WordPress code trying to establish a connection with the db.
Created a test.php file which does nothing other than try to connect to the DB. If it tries to connect to any DB running on the hcmc's mysql server, it times out and generates an error. If it tries to connect to a DB running on uvic's mysql server, it's fine. So, there's definitely a problem with any page on web.uvic.ca trying to connect to a db on the hcmc server. Put in request to sysadmins to check the Access Control List on the hcmc db server.

Permalink 04:09:47 pm, by sarneil, 144 words, 276 views   English (CA)
Categories: Activity log; Mins. worked: 180

malahat : php version of front-end page for ecommerce

Noticed yesterday that when you're returned from the beanstream shopping cart to the page on the uvic server, you of course get the default version of the page (i.e. no form has been selected), and the only way to get back to the shopping cart page again is to buy another item (and then delete that item on the beanstream page). So, decided to implement a php page.

The URL back from the beanstream shopping cart includes a bunch of key-value pairs in the GET array, so I can test for those and figure out which form to display (I pass in a value using the ref1 variable, which is intended for this kind of thing) and also display a "return to shopping cart" button with the appropriate URL. Also tidied up and commented the php code and the static html page.
Permalink 04:02:28 pm, by mholmes, 8 words, 97 views   English (CA)
Categories: Activity log; Mins. worked: 420

NLP course work

Still struggling mightily with last week's programming assignment...

Permalink 11:27:28 am, by Erin, 70 words, 133 views   English (CA)
Categories: Activity log; Mins. worked: 170

proofing persnames and places

Today I finished going through all of the accounts and modifying persName tags so that they are all consistent, using last names only for the tag. (Borgarucci, bottonus, pare, wechel, teissier, peucer, clusius, charles_quint, frisius, fabri, cassandre, boucher, rondelet, garetius, languet, thomasius, benigne_winslow, malatesta, sylvius, fernel, gessenius, metellus, riolan, joubert, fuchs, lemnius, duret, moreri, adam, fontanus, castellanus, zacchias, gemini) I also added the note tags into the place-ography :)
Permalink 08:46:32 am, by Erin, 34 words, 139 views   English (CA)
Categories: Activity log; Mins. worked: 180

last week

Last Tuesday I globally added xml ids above the header as well as in page number tags. I also edited a few tags in the people xml file and in the corresponding xml account files.
