Permalink 01:04:16 pm, by mholmes, 100 words, 55 views   English (CA)
Categories: Activity log; Mins. worked: 120

Working on figuring out lot overlap

Put a lot of time into figuring out whether I could use Node and Turf to do the lot calculation, but it seems unlikely: the current release of Turf lacks the advertised boolean-overlap function as far as I can see, and the turf.area() function gives me very weird results (such as negative numbers). Had a go at using OpenLayers under Node, but that's very difficult because of its requirement for a browser window object, document, and so on; so I'm now investigating the use of a hosting browser page, which seems more promising. Lots of work to do, though.
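One plausible source of the negative numbers is ring winding order: the planar shoelace formula yields a signed area, negative when a ring is wound clockwise. A minimal sketch (illustrative only; turf.area() itself computes geodesic area, not this planar version, and the function name here is hypothetical):

```javascript
// Signed area of a closed polygon ring via the shoelace formula.
// A negative result means the ring is wound clockwise -- a common
// cause of "negative area" surprises in geometry libraries that
// expect counter-clockwise rings.
function signedRingArea(ring) {
  let sum = 0;
  for (let i = 0; i < ring.length - 1; i++) {
    const [x1, y1] = ring[i];
    const [x2, y2] = ring[i + 1];
    sum += (x1 * y2) - (x2 * y1);
  }
  return sum / 2;
}

// A closed unit square, counter-clockwise and then clockwise:
const ccw = [[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]];
console.log(signedRingArea(ccw));                    // 1
console.log(signedRingArea(ccw.slice().reverse()));  // -1
```

Taking Math.abs() of such a result (or reversing the ring) would be the usual workaround.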


Permalink 03:26:28 pm, by mholmes, 224 words, 44 views   English (CA)
Categories: Activity log; Mins. worked: 320

Meeting, plans, progress

Met with JSR and discussed the situation with respect to geo plans and properties. Collected the remaining issues with missing plans and turned them into detailed requests for the NWLTO to go back to original titles and confirm/correct data to find them; sent this to JSR to forward. Also did stats on the problems generated by these issues; the number of affected titles is quite low, but we don't know the number of affected "chains" yet, because we don't know how chains should be constructed. However, I did make some progress on this. I've started some XSLT to build a spreadsheet of per-lot value histories per square metre in 2016 dollars.

The next stage is to figure out how to continue the chain for an extinguished lot. My suggested approach is to analyse the potential overlap of all lots, in order to discover any pairs which overlap by more than 50% of the area of one of them; this should be sufficient to generate a link between them, and then any chain can be continued by following [one of] the lot[s] into which it was [de]composed. I think this can be done in JavaScript using Node and the Turf.js library, specifically turf.intersect(). I'll need to script and test this, and then it will run for many hours to generate all the required lists.
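The linking rule itself can be sketched independently of the geometry library. In this sketch, axis-aligned boxes [minX, minY, maxX, maxY] stand in for real lot polygons (in practice turf.intersect() would supply the intersection geometry); all function names are illustrative:

```javascript
// Treat two lots as linked when their intersection covers more than
// half the area of either one (the >50% rule described above).
function boxArea([minX, minY, maxX, maxY]) {
  return (maxX - minX) * (maxY - minY);
}

// Intersection area of two axis-aligned boxes; 0 if they don't overlap.
function boxIntersectionArea(a, b) {
  const w = Math.min(a[2], b[2]) - Math.max(a[0], b[0]);
  const h = Math.min(a[3], b[3]) - Math.max(a[1], b[1]);
  return (w > 0 && h > 0) ? w * h : 0;
}

function lotsLinked(a, b, threshold = 0.5) {
  const overlap = boxIntersectionArea(a, b);
  return overlap > threshold * Math.min(boxArea(a), boxArea(b));
}

// A 10x10 lot containing a 4x4 lot, versus a disjoint pair:
console.log(lotsLinked([0, 0, 10, 10], [1, 1, 5, 5]));     // true
console.log(lotsLinked([0, 0, 10, 10], [20, 20, 24, 24])); // false
```

Comparing against the smaller of the two areas means a small child lot carved out of a larger parent still registers as linked.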


Permalink 04:35:05 pm, by mholmes, 33 words, 47 views   English (CA)
Categories: Activity log; Mins. worked: 60

Worked through apparently new plans

New plans provided by JSR from LTO turned out to have been dealt with already by AC. Put a couple of them into Zotero, confirmed that others were still missing or still wrong.


Permalink 04:24:23 pm, by mholmes, 83 words, 49 views   English (CA)
Categories: Activity log; Mins. worked: 240

Working on plans: processing JSON with XSLT

The next part of the plan to handle the Plans is to use the basic GeoJSON we now have in WGS84 to create BreezeMap-compatible GeoJSON with all the extra info we need in it. I decided to do this with XSLT 3, to get familiar with the new JSON-handling functionality, and it seems pretty straightforward once you get the hang of it. I'm now producing all the output I need, but I haven't yet figured out how to create an index for these pages.
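The shape of that enrichment step, sketched in Node for illustration (the actual pipeline uses XSLT 3, and every property name below is hypothetical):

```javascript
// Merge extra per-lot info into a GeoJSON Feature's properties,
// leaving the rest of the feature untouched -- the same kind of
// transformation the XSLT 3 pass performs on the WGS84 GeoJSON.
function enrichFeature(feature, extras) {
  return {
    ...feature,
    properties: { ...feature.properties, ...extras }
  };
}

const lot = { type: "Feature", properties: { id: "lot_001" }, geometry: null };
const enriched = enrichFeature(lot, { planNumber: "P123", district: "Maple Ridge" });
console.log(enriched.properties.planNumber); // "P123"
```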


Permalink 03:57:56 pm, by mholmes, 75 words, 61 views   English (CA)
Categories: Activity log; Mins. worked: 60

Working on plans: Converted GeoJSON to WGS84

Wrote another Node script using proj4 to convert all our Plan and Sketch files to WGS84. This is a key component of creating TEI files which merge the geo data from the plans and sketches with the land title data. I've also massaged the UIDs of the Maple Ridge DB lots so that they match the new ones in the db. That means that the most reliable resources for mapping are the *_wgs84.json files.


Permalink 04:53:42 pm, by mholmes, 190 words, 44 views   English (CA)
Categories: Activity log; Mins. worked: 240

Created hand-crafted map

From the existing TEI, built a map for a specific dataset that shows the sequence of titles with rising prices for a JC property that was seized. JSR now wants to add plans to the picture, so I've started to build a transformation that works from the GML of the plan files to create GeoJSON; that would give us one map per plan, with all the child lots, which could be linked to their titles. The current issue I'm wrestling with is that the coordinate values are a bit too large for the XSLT to handle, so some of them end up being rendered in exponential notation. It looks like I'll have to find some way to call an external function to convert all the numbers to WGS84 first. Not sure how to do that; I may have to work from the GeoJSON files I've already created and use Node/proj4 to do it. In any case, I'm using the XSLT 3 JSON reading-and-writing functionality, so it should be possible to transform the GeoJSON and then use that as the source for the GML, perhaps; then transform that GML.
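If the exponential-notation values do end up in the Node-side files, they are at least easy to normalize there: JavaScript's Number() parses scientific notation directly, and toFixed() writes it back out as plain decimal. A small sketch (the helper name and sample values are illustrative):

```javascript
// Normalize a coordinate string that may be in exponential notation
// (e.g. "5.45231E6") into a plain decimal string.
function plainCoord(s, decimals = 3) {
  return Number(s).toFixed(decimals);
}

console.log(plainCoord("5.45231E6")); // "5452310.000"
console.log(plainCoord("1.2345E2"));  // "123.450"
```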


Permalink 04:23:43 pm, by mholmes, 100 words, 44 views   English (CA)
Categories: Activity log; Mins. worked: 90

Meeting on tsvs and maps

Met with JSR and started to thrash out some of the requirements for the next block of processing. Outcomes:

  1. (Urgent): this week, put together a demo map of a single sequence of titles pertaining to a single Maple Ridge property. Do this by getting a list of titles in a chain, then combining their generated TEI place files into a single file; then edit that file to make info more human-friendly; then generate GeoJSON and create a map.
  2. The existing TSV seems to have all the required info in it for the various calculations that will need to be done.
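The last step of item 1 — folding the per-title GeoJSON into a single file for the demo map — amounts to concatenating FeatureCollections. A minimal sketch (structure and names are illustrative, not the project's actual code):

```javascript
// Combine several GeoJSON FeatureCollections (e.g. one per title in
// the chain) into a single collection suitable for one map.
function mergeFeatureCollections(collections) {
  return {
    type: "FeatureCollection",
    features: collections.flatMap(fc => fc.features)
  };
}

const a = { type: "FeatureCollection", features: [{ type: "Feature", properties: { title: "T1" }, geometry: null }] };
const b = { type: "FeatureCollection", features: [{ type: "Feature", properties: { title: "T2" }, geometry: null }] };
console.log(mergeFeatureCollections([a, b]).features.length); // 2
```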


Permalink 03:25:07 pm, by mholmes, 100 words, 46 views   English (CA)
Categories: Activity log; Mins. worked: 180

Investigation into the use of geotiffs on the map

JSR raised the possibility of using the cadastrals on top of the title maps. I've looked at the GeoTiffs created by AC, and determined that I can use gdalwarp like this:

gdalwarp input.tif output.WGS84.tif -t_srs "+proj=longlat +ellps=WGS84" 

to create a version in degrees in WGS84; then I can use:

gdalinfo output.WGS84.tif

to see the resulting coordinates. I think we can then use ImageMagick's convert to get PNGs and use ol.source.ImageStatic to put the results on the map, but I haven't actually succeeded in making that work yet; still wrestling with it.


Permalink 04:08:26 pm, by mholmes, 63 words, 37 views   English (CA)
Categories: Activity log; Mins. worked: 300

Maps working, and index page created; TSV by property now complete

I have the map working for individual titles, and I've added lots of extra warning info to the description of the title for cases where lot information is incomplete. The current streets tables now link to the map. I've also updated the code which builds the TSV file of transactions by lot in 2016 values, to include both Powell Street and Maple Ridge together.

Permalink 04:06:31 pm, by mholmes, 340 words, 43 views   English (CA)
Categories: Activity log; Mins. worked: 60

Fix for database id overlap

I discovered an issue with our databases. When we moved from the Powell Street database to the Maple Ridge db, I intentionally set the new database to assign automatic ids to new items starting from numbers beyond the totals reached in the Powell Street db. So (for example) the original Powell St properties maxed out at 1039, and the Maple Ridge db was set to start at 1105, allowing a buffer of 65 for small changes to the Powell St db.

Similarly, with titles, the Powell St db maxed out at 6459, and I started the Maple Ridge db at 6555, leaving a buffer of just under a hundred.

However, we later made the decision to start entering Kitsilano data into the Powell Street database, and this turned out to be on a scale which pushed it beyond the buffer and created overlapping ids. The result is that we have properties and titles in both databases with identical ids, meaning that we cannot easily combine that data. Similar overlaps are happening with owners, and presumably also lawyers and so on.

This is not a disaster; the two databases are separate, so there's no confusion unless and until we start trying to combine them. Similarly, AC's work on maps is divided into five distinct sets which each map to one or other of the dbs.

However, we had to solve the problem because we are now in the business of merging this data, and overlapping ids are a bad idea generally. My solution, which I tested thoroughly on the dev db before doing it to the live db, was to simply update the Maple Ridge owners, titles and properties tables to add 20,000 to each of their id fields. This propagated automatically through the linking tables, which was very neat. Then I just reset the next-autoincrement value appropriately (that doesn't happen automatically). Seems to have worked perfectly.
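The fix itself was an UPDATE on the live db tables, but the logic is easy to sketch: shift every Maple Ridge id by 20,000 and confirm the shifted set no longer collides with the Powell Street ids. (The arrays and names below are illustrative stand-ins, not real data.)

```javascript
// Shift a set of ids by a fixed offset -- the same operation the
// UPDATE applied to the Maple Ridge owners, titles and properties
// tables -- and check for remaining collisions.
const OFFSET = 20000;

function shiftIds(ids, offset = OFFSET) {
  return ids.map(id => id + offset);
}

function hasCollision(aIds, bIds) {
  const seen = new Set(aIds);
  return bIds.some(id => seen.has(id));
}

const powellTitleIds = [6458, 6459, 6600];     // grew past the buffer after Kitsilano
const mapleRidgeTitleIds = [6555, 6600, 6601]; // 6600 collides

console.log(hasCollision(powellTitleIds, mapleRidgeTitleIds));           // true
console.log(hasCollision(powellTitleIds, shiftIds(mapleRidgeTitleIds))); // false
```

After a shift like this, the next-autoincrement value has to be bumped by hand, as noted above.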

I've also corrected some inconsistencies between the location table values in the two dbs, and I'm working on some properties which have no assigned location in the Powell St db.


Landscapes of Injustice

