At the request of PD, I processed the raw check census CSV files he submitted into something suitable for upload to the database. Oddities in the data that had to be resolved: street numbers such as "7-9" or "12 and a half"; occupant counts such as "20+", "20plus", "20 to 30", etc.; and quotation marks and commas inside the data being confused with the quotation marks and commas used as delimiters.
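The cleanup above can be sketched roughly as follows. This is a minimal, assumed version, not the script actually used: the occupants column position is hypothetical, and the real rules for ranges may differ (here a range just keeps its lower bound).

```python
import csv
import re

def normalize_occupants(raw):
    """Reduce free-form occupant counts ("20+", "20plus", "20 to 30")
    to a single integer, keeping the lower bound of any range.
    Returns None when no number can be found."""
    m = re.search(r"\d+", raw)
    return int(m.group()) if m else None

def clean_rows(in_path, out_path, occupants_col):
    # The csv module parses and re-emits quoting correctly, so quotation
    # marks and commas inside a field stop masquerading as delimiters.
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        writer = csv.writer(dst, quoting=csv.QUOTE_MINIMAL)
        for row in csv.reader(src):
            val = normalize_occupants(row[occupants_col])
            row[occupants_col] = "" if val is None else str(val)
            writer.writerow(row)
```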
Also set up the table and created the fields, including an entry in the sequences table, all based on similar work done for the 1871 census, which was also a kind of one-off data set.
Did some twiddling with the presentation of results (particularly with how to represent the various note fields). For those records that have an id for the 1891 Dominion census, wrote code to generate a link to the full record for that individual in the Dominion Census, but have not modified the 1891 Dominion census in any way (e.g. to include a link back to an entry in the 1891 Check census).
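The link generation is just a small template over the Dominion census id. A sketch with a hypothetical URL pattern (the real page and parameter names are whatever the viHistory search pages actually use):

```python
def dominion_census_link(dominion_id):
    """Return an HTML link to the full 1891 Dominion census record,
    or an empty string for records that have no Dominion census id.
    Page name and query parameters here are guesses, not the site's
    real URL scheme."""
    if not dominion_id:
        return ""
    return (f'<a href="census.php?census=1891&amp;id={dominion_id}">'
            "full 1891 Dominion census record</a>")
```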
Looked into what's involved in updating 1400+ records in the census_1911 table.
Have a spreadsheet from PD. About 37 of the fields in the db can be read straight in from the spreadsheet. About 12 of the fields have to be calculated from values in the spreadsheet (the nature of the calculation varies for each of the 12 instances; the notes field in the PG table describes the needed calculations). Not sure how Jamie did this last year, but it looks like I'll have to write a bunch of code to do that processing and generate the CSV needed.
Then I'll need to tread carefully with the actual upload. I think I'll need to delete the existing records and then upload (using the COPY command, or possibly the import feature in the admin client).
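The "37 direct, 12 calculated" step could be sketched like this. The field names and derivation rules below are invented for illustration; the real rules are the ones recorded in the notes field of the census_1911 table.

```python
import csv

# Hypothetical derived-field rules: each maps an output column to a
# function of the source row. The actual ~12 calculations are documented
# in the notes field of the PG table and would replace these examples.
DERIVED = {
    "age_at_census": lambda r: str(1911 - int(r["birth_year"])),
    "full_name": lambda r: f'{r["surname"]}, {r["given_name"]}',
}
DIRECT = ["surname", "given_name", "birth_year", "street"]  # ~37 in practice

def build_upload_csv(src_path, dst_path):
    """Copy the direct fields straight through and append the
    calculated fields, producing the CSV needed for upload."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.writer(dst)
        writer.writerow(DIRECT + list(DERIVED))
        for row in reader:
            writer.writerow([row[c] for c in DIRECT] +
                            [fn(row) for fn in DERIVED.values()])
```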
PD asked me to create one page to display a table of all the streets (or portions of streets) that have been renamed over the time range of the VIHistory datasets. I did that by running the tab-delimited file through a few regexes to produce a styled HTML table, and put that into a new page.
PD also asked for a similar page to display a table of all the addresses that were renumbered in 1907. That took a bit more regex processing, as I wanted to create a heading for each street and then a table for all the addresses on that street.
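The heading-per-street output has the same shape as a group-by. A sketch of the structure (not the regex passes actually used, and the column layout is assumed):

```python
from itertools import groupby

def renumbering_page(rows):
    """rows: (street, old_number, new_number) tuples, sorted by street.
    Emit a heading per street followed by a table of its renumbered
    addresses -- the structure the regex processing produced."""
    out = []
    for street, group in groupby(rows, key=lambda r: r[0]):
        out.append(f"<h3>{street}</h3>")
        out.append("<table><tr><th>Old number</th><th>New number</th></tr>")
        for _, old, new in group:
            out.append(f"<tr><td>{old}</td><td>{new}</td></tr>")
        out.append("</table>")
    return "\n".join(out)
```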
Made copies of a number of ta (tax assessment) files, renamed them and edited them to work with the building_permits table in the db. Did this in the dev instance in my account on our server.
Files I've added:
The link to the searchbp.php page is on the tax assessment page (ta/taxassessment.php).
Emailed JL and PD for guidance on which fields to include in search interface(s) and which results fields are primary and which secondary.
More headaches with svn than with the actual code (as usual). I had in the repo and on my local drive a folder (bp) containing 1 file. I did an svn delete, and the file was deleted in the local instance, but not the folder. When I then did a commit, I got a "file out of date" error. When I tried various ways of sorting out this problem, I consistently got "'/path/on/local/drive/' remains in conflict" errors.
Googled it and discovered I'm not alone: quite a number of people renaming, moving or deleting files have everything go smoothly except for one file or folder that somehow gets into "conflict". This sequence cleared it:
svn resolved path/to/conflicted-folder
svn update path/to/conflicted-folder
svn commit -m "resolving conflicted folder or whatever"
Created a building permits table in the vihdev db. Noticed that to auto-increment the building_permit_id field, you have to reference a sequence, so I created the necessary sequence, modelled on others I found in the db (census tables).
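The sequence wiring looks roughly like this in PostgreSQL. Names and columns here are hypothetical sketches; the real sequence was modelled on the census tables' definitions.

```sql
-- Hypothetical names, following the usual table_column_seq convention.
CREATE SEQUENCE building_permits_building_permit_id_seq;

CREATE TABLE building_permits (
    building_permit_id integer PRIMARY KEY
        DEFAULT nextval('building_permits_building_permit_id_seq'),
    permit_date date,
    description varchar(255)
);
```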
Processed the raw data file (spreadsheet) into normalized data (typed a couple of the data fields where I could, e.g. int or date, and normalized the data to comply with the constraints I had established, e.g. length of varchar fields). Saved that as CSV (rather than tab-delimited), as the documentation seemed to favour the CSV approach.
The only substantial fiddling I had to do with the data was for the records whose date field was only a year (e.g. 1889): I arbitrarily assigned them the 1st of January (e.g. 18890101), as the date field requires 8 digits.
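That year-padding rule is small enough to state as code; a sketch, assuming the incoming field is either a bare 4-digit year or already a full 8-digit date:

```python
import re

def pad_date(raw):
    """Expand a bare year to the 8-digit YYYYMMDD form the date field
    requires, arbitrarily using January 1st. Anything that is not a
    bare 4-digit year passes through unchanged."""
    raw = raw.strip()
    if re.fullmatch(r"\d{4}", raw):
        return raw + "0101"
    return raw
```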
Once that instance uploaded successfully, did the exact same thing in the production instance, just so I have a second copy of the thing somewhere.
Once that was all working in the dev instance of the db, replaced the contents of the trunk, the Alex branch and the backup branch with the files as updated by Martin and Greg, so they're up to date.
Gave AD all the svn, web account and db connection info he should need to get to the files, check them out, and post them to the web space for testing.
Note to self on the nuts and bolts:
on local file system:
create the folder structure you want (if you're copying an existing local instance of an svn project, you have to delete the .svn folder from each folder in that project)
on command line,
cd to parent folder of the one you want to add (that parent folder has to already be in svn)
svn add FOLDER_YOU_WANT_TO_ADD
svn commit -m "message about adding new folder"
There are three files in the site which contain database connection strings:
In each of these three files, the values for the database connection string have been replaced with placeholders. You have to make a copy of each of those files with the following names:
In the copies, substitute the correct values for your connection string.
If the folder is in svn (which it probably is), you'll need to use svn add to add each of the files to the repo, then do your svn commit.
viHistory is a web site that serves as a teaching, learning and research tool. It's principally about the history of Vancouver Island in British Columbia, but it is also a vehicle for exploring the larger field of Canadian history in the late 19th and early 20th centuries. It allows census, directory and tax assessment roll data from that period to be searched in many ways. It also incorporates IMaP to display historical maps. The project director is Dr. Patrick A. Dunae.