New plans provided by JSR from LTO turned out to have been dealt with already by AC. Put a couple of them into Zotero, confirmed that others were still missing or still wrong.
The next stage in handling the Plans is to use the basic GeoJSON we now have in WGS84 to create BreezeMap-compatible GeoJSON with all the extra info we need in it. I decided to do this with XSLT3, to get familiar with the new JSON-handling functionality, and it seems pretty straightforward once you get the hang of it. I'm now producing all the output I need, but I haven't yet figured out how to create an index for these pages.
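For the record, the reshaping itself is simple enough to sketch in a few lines of Node (even though I'm actually doing it in XSLT3 with the JSON functions); the filenames and the extra name/desc properties here are placeholders, not necessarily the fields BreezeMap really wants:

var fs = require('fs');

// Read one of the plain WGS84 GeoJSON files (filename is an example).
var gj = JSON.parse(fs.readFileSync('Plan123_wgs84.json', 'utf8'));
gj.features.forEach(function(feature){
  // Carry the db UID through as the feature id, so a title page
  // can point at the feature directly.
  feature.id = feature.properties.UID;
  // Add human-readable fields for the map interface to display
  // (these property names are stand-ins).
  feature.properties.name = 'Lot ' + feature.properties.UID;
  feature.properties.desc = 'Child lot of Plan 123.';
});
fs.writeFileSync('Plan123_breezemap.json', JSON.stringify(gj, null, 2));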
Wrote another node script using proj4 to convert all our Plan and Sketch files to WGS84. This is a key component of creating TEI files which merge the geo data from the plans and sketches with the land title data. I've also massaged the UIDs of the Maple Ridge lots so that they match the new ids in the db. That means that the most reliable resources for mapping are the *_wgs84.json files.
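The heart of that script is tiny; a minimal sketch, assuming simple Polygon geometries and a made-up input filename (the real script runs over all the Plan and Sketch files):

var fs = require('fs'),
    proj4 = require('proj4');

// EPSG:32610 is UTM zone 10N in metres; 'WGS84' is predefined in proj4.
proj4.defs('epsg32610', '+proj=utm +zone=10 +ellps=WGS84 +datum=WGS84 +units=m +no_defs');

function toWGS84(inFile){
  var gj = JSON.parse(fs.readFileSync(inFile, 'utf8'));
  gj.features.forEach(function(f){
    // A Polygon is an array of rings, each an array of [x, y] points.
    f.geometry.coordinates = f.geometry.coordinates.map(function(ring){
      return ring.map(function(pt){
        return proj4('epsg32610', 'WGS84', pt);
      });
    });
  });
  fs.writeFileSync(inFile.replace(/\.json$/, '_wgs84.json'), JSON.stringify(gj));
}

toWGS84('MergedMapleRidge_geom.json');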
From the existing TEI, built a map for a specific dataset that shows the sequence of titles with rising prices for JC property that was seized. JSR now wants to add plans to the picture, so I've started to build a transformation that works from the GML of the plan files to create GeoJSON; that would give us one map per plan, with all the child lots, which could be linked to their titles. The current issue I'm wrestling with is that the coordinate values are too large for the XSLT to serialize cleanly: an xs:double of a million or more comes out in exponential notation, and our UTM northings are well past that threshold. It looks like I'll have to find some way to call an external process to convert all the numbers to WGS84 first. Not sure how to do that; I may have to work from the GeoJSON files I've already created, and use Node/Proj4 to do it. In any case, I'm using the XSLT 3 JSON reading-and-writing functionality, so perhaps it should be possible to pre-convert the GeoJSON and then use that as a source when transforming the GML.
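To make that concrete, a quick Node check (the easting/northing pair is made up):

var proj4 = require('proj4');
proj4.defs('epsg32610', '+proj=utm +zone=10 +ellps=WGS84 +datum=WGS84 +units=m +no_defs');
// As an xs:double, this northing would serialize as 5.4521935E6.
console.log(proj4('epsg32610', 'WGS84', [507351.4, 5452193.5]));
// Logs a [longitude, latitude] pair in plain decimal degrees,
// comfortably below the exponential-notation threshold.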
Met with JSR and started to thrash out some of the requirements for the next block of processing. Outcomes:
- (Urgent): this week, put together a demo map of a single sequence of titles pertaining to a single Maple Ridge property. Do this by getting a list of titles in a chain, then combining their generated TEI place files into a single file; then edit that file to make info more human-friendly; then generate GeoJSON and create a map.
- The existing TSV seems to have all the required info in it for the various calculations that will need to be done.
JSR raised the possibility of using the cadastrals on top of the title maps. I've looked at the GeoTiffs created by AC, and determined that I can use gdalwarp like this:
gdalwarp input.tif output.WGS84.tif -t_srs "+proj=longlat +ellps=WGS84"
to create a version in degrees in WGS84; then I can use:
gdalinfo output.WGS84.tif
to see the resulting coordinates. I think we can then use convert to get PNGs and use ol.source.ImageStatic to put the results on the map, but I haven't actually succeeded in making that work yet; still wrestling with it.
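For reference, this is the kind of OpenLayers call I'm aiming at; a minimal sketch, assuming an existing ol.Map called map, a reasonably recent OpenLayers build (which can reproject a static image from EPSG:4326 to the view projection), and a placeholder extent to be copied from the gdalinfo corner coordinates:

// Extent in degrees: [minLon, minLat, maxLon, maxLat] (placeholder values).
var imageExtent = [-122.65, 49.19, -122.52, 49.26];
var cadastralLayer = new ol.layer.Image({
  opacity: 0.7,
  source: new ol.source.ImageStatic({
    url: 'output.WGS84.png',
    projection: 'EPSG:4326',
    imageExtent: imageExtent
  })
});
map.addLayer(cadastralLayer);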
I have the map working for individual titles, and I've added lots of extra warning info to the description of the title for cases where lot information is incomplete. The current streets tables now link to the map. I've also updated the code which builds the TSV file of transactions by lot in 2016 values, to include both Powell Street and Maple Ridge together.
I discovered an issue with our databases. When we moved from the Powell Street database to the Maple Ridge db, I intentionally set the new database to assign automatic ids to new items starting from numbers beyond the totals reached in the Powell Street db. So (for example) the original Powell St properties maxed out at 1039, and the Maple Ridge db was set to start at 1105, allowing a buffer of 65 ids for small changes to the Powell St db.
Similarly, with titles, the Powell St db maxed out at 6459, and I started the Maple Ridge db at 6555, leaving a buffer of just under a hundred.
However, we later made the decision to start entering Kitsilano data into the Powell Street database, and this turned out to be on a scale which pushed it beyond the buffer and created overlapping ids. The result is that we have properties and titles in both databases with identical ids, meaning that we cannot easily combine that data. Similar overlaps are happening with owners, and presumably also lawyers and so on.
This is not a disaster; the two databases are separate, so there's no confusion unless and until we start trying to combine them. Similarly, AC's work on maps is divided into five distinct sets which each map to one or other of the dbs.
However, we had to solve the problem because we are now in the business of merging this data, and overlapping ids are a bad idea generally. My solution, which I tested thoroughly on the dev db before doing it to the live db, was to simply update the Maple Ridge owners, titles and properties tables to add 20,000 to each of their id fields. This propagated automatically through the linking tables, which was very neat. Then I just reset the next-autoincrement value appropriately (that doesn't happen automatically). Seems to have worked perfectly.
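For the record, the fix amounted to something like this (sketched with the node mysql package; props/prp_property_id is the real naming from the SQL-generation script below, but the owners and titles column names are guesses on the same pattern, and the connection details and AUTO_INCREMENT value are placeholders):

var mysql = require('mysql');
var conn = mysql.createConnection({
  host: 'localhost', user: 'devuser', password: '...', database: 'mapleridge_dev'
});
// Shift the three id ranges clear of the Powell Street/Kitsilano ids.
// The changes propagate to the linking tables, presumably because the
// foreign keys cascade on update.
conn.query('UPDATE `props`  SET `prp_property_id` = `prp_property_id` + 20000');
conn.query('UPDATE `owners` SET `own_owner_id` = `own_owner_id` + 20000');
conn.query('UPDATE `titles` SET `ttl_title_id` = `ttl_title_id` + 20000');
// The next-autoincrement value has to be reset by hand for each table.
conn.query('ALTER TABLE `props` AUTO_INCREMENT = 30000');
conn.end();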
I've also corrected some inconsistencies between the location table values in the two dbs, and I'm working on some properties which have no assigned location in the Powell St db.
I have a basic OSM map working for any given title. This involved hacking the BreezeMap code a bit; I'll port some of the changes back into BreezeMap itself. What remains to be done is to create index pages for the titles, fix the links between maps (they're done with a ttl: prefix), push the map build into the products, add links to the titles from the street pages, and (perhaps most urgent) fix the build, which is currently broken for reasons I don't understand.
I've solved the problem of converting the coordinates from EPSG:32610 in metres to WGS84 in the process of prepping the data for insertion into the MySQL dbs. This is the script:
/* This file is designed to be run against a series of GeoJSON
 * files to create SQL statements which insert polygon and
 * area information into our Land Title Databases. */

/* The GeoJSON files are generated from QGIS. */

// We need fileio and proj4 for coordinate conversion.
var fs = require("fs"),
    content, i, maxi, j, maxj, k, maxk, l, maxl,
    gj, output, id, area, pathBits, outFile,
    arrCoords, degreeCoords, coords,
    ids = [], idList = '', idCheck = '',
    vancouverFiles = [], mapleRidgeFiles = [],
    proj4 = require("proj4");

//We create these Proj4JS definitions so we can more easily refer to them later.
proj4.defs('epsg32610', '+proj=utm +zone=10 +ellps=WGS84 +datum=WGS84 +units=m +no_defs');
proj4.defs('WGS84', "+title=WGS 84 (long/lat) +proj=longlat +ellps=WGS84 +datum=WGS84 +units=degrees");

//GeoJSON files
//Two sets of data are for properties in the Vancouver db.
vancouverFiles.push('KitsilanoPlans/MergedKitsilano/MergedKitsilano_geom.json');
vancouverFiles.push('PowellStreetPlans/MergedPowellStreet/MergedPowell_geom.json');

//Three sets are for properties in the Maple Ridge db.
mapleRidgeFiles.push('MapleRidgePlans/MergedMapleRidge/MergedMapleRidge_geom.json');
mapleRidgeFiles.push('SaltSpringPlans/MergedSaltSpring/MergedSaltSpring_geom.json');
mapleRidgeFiles.push('StevestonPlans/MergedSteveston/MergedSteveston_geom.json');

console.log("\n *START* \n");

processFileArray(vancouverFiles, 'vancouver.sql');
processFileArray(mapleRidgeFiles, 'mapleRidge.sql');

//Process an array of files, creating a single output file
//with SQL commands to set the coordinate and area fields.
function processFileArray(files, outputFile){
  output = '';
  for (i = 0, maxi = files.length; i < maxi; i++){
    ids = [];
    content = fs.readFileSync(files[i]);
    console.log('Read in ' + files[i] + '.');
    gj = JSON.parse(content);
    console.log('Found ' + gj.features.length + ' features in the file.');
    for (j = 0, maxj = gj.features.length; j < maxj; j++){
      //The UID is the value of the prp_property_id field in the database props table.
      id = gj.features[j].properties["UID"];
      if (id !== null){
        //Keep a list of ids so we can do some checking later if necessary.
        ids.push('"' + id + '"');
        //The area in sqm needs to be rounded to two decimal places.
        area = Math.round(parseFloat(gj.features[j].properties.area) * 100) / 100;
        //Here we convert the original coordinates from 32610/metres to WGS84 (for GeoJSON).
        arrCoords = gj.features[j].geometry.coordinates;
        for (k = 0, maxk = arrCoords.length; k < maxk; k++){
          for (l = 0, maxl = arrCoords[k].length; l < maxl; l++){
            degreeCoords = proj4(proj4('epsg32610'), proj4('WGS84'), arrCoords[k][l]);
            //console.log(JSON.stringify(arrCoords[k][l]));
            //console.log(JSON.stringify(degreeCoords));
            arrCoords[k][l] = degreeCoords;
          }
        }
        coords = JSON.stringify(gj.features[j].geometry.coordinates);
        //Write the two update statements for this property record.
        output += 'UPDATE `props` set `prp_coords` = "' + coords + '" WHERE `prp_property_id`="' + id + '";\n';
        output += 'UPDATE `props` set `prp_area` = "' + area + '" WHERE `prp_property_id`="' + id + '";\n';
      }
    }
    idList = '(' + ids.join(', ') + ')';
    idCheck = '#id check: \n\n# ' + 'let $ids := ' + idList + ' return for $i in $ids where not(//prp_property_id[. = $i]) return $i \n\n';
    //idCheck = 'CREATE TYPE id_nums AS TABLE OF NUMBER;\n';
    //idCheck += 'SELECT * FROM TABLE(id_nums(' + idList + ')) '
    //idCheck += ' WHERE NOT IN (SELECT `prp_property_id` from `props`)\n\n';
    output = idCheck + output;
  }
  fs.writeFile(outputFile, output, function(err){
    if (err){ console.log('Error writing ' + outputFile + ': ' + err); }
  });
}

console.log("\n *EXIT* \n");
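Running this with node writes vancouver.sql and mapleRidge.sql; each file starts with the commented-out id check (an XQuery snippet for flagging UIDs that have no matching prp_property_id) followed by the UPDATE statements, and can then be applied with the mysql command-line client.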