01/12/17

Permalink 03:25:07 pm, by mholmes, 100 words, 22 views   English (CA)
Categories: Activity log; Mins. worked: 180

Investigation into the use of GeoTIFFs on the map

JSR raised the possibility of using the cadastrals on top of the title maps. I've looked at the GeoTIFFs created by AC, and determined that I can use gdalwarp like this:

gdalwarp input.tif output.WGS84.tif -t_srs "+proj=longlat +ellps=WGS84" 

to create a version in degrees in WGS84; then I can use:

gdalinfo output.WGS84.tif

to see the resulting coordinates. I think we can then use ImageMagick's convert to get PNGs and use ol.source.ImageStatic to put the results on the map, but I haven't actually succeeded in making that work yet; still wrestling with it.
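A minimal sketch of the plan, assuming the PNG comes from something like convert output.WGS84.tif cadastral.png; the file name and extent values here are hypothetical, not real project values:

//Sketch: overlay a georeferenced PNG using an ImageStatic source.
//The extent (WGS84 degrees, as reported by gdalinfo) and the file
//name are placeholders.
var extent4326 = [-123.10, 49.27, -123.08, 49.29];
var cadastralLayer = new ol.layer.Image({
  source: new ol.source.ImageStatic({
    url: 'cadastral.png',
    //Stretch the image to its extent in the view projection; this is
    //an approximation, not a true raster reprojection.
    imageExtent: ol.proj.transformExtent(extent4326, 'EPSG:4326', 'EPSG:3857'),
    projection: 'EPSG:3857'
  })
});
map.addLayer(cadastralLayer);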

30/11/17

Permalink 04:08:26 pm, by mholmes, 63 words, 15 views   English (CA)
Categories: Activity log; Mins. worked: 300

Maps working, and index page created; TSV by property now complete

I have the map working for individual titles, and I've added lots of extra warning info to the description of the title for cases where lot information is incomplete. The current streets tables now link to the map. I've also updated the code which builds the TSV file of transactions by lot (in 2016 values) so that it includes both Powell Street and Maple Ridge.

Permalink 04:06:31 pm, by mholmes, 340 words, 17 views   English (CA)
Categories: Activity log; Mins. worked: 60

Fix for database id overlap

I discovered an issue with our databases. When we moved from the Powell Street database to the Maple Ridge db, I intentionally set the new database to assign automatic ids to new items starting from numbers beyond the totals reached in the Powell Street db. So (for example) the original Powell St properties maxed out at 1039, and the Maple Ridge db was set to start at 1105, allowing a buffer of 65 for small changes to the Powell St db.

Similarly, with titles, the Powell St db maxed out at 6459, and I started the Maple Ridge db at 6555, leaving a buffer of just under a hundred.

However, we later made the decision to start entering Kitsilano data into the Powell Street database, and this turned out to be on a scale which pushed it beyond the buffer and created overlapping ids. The result is that we have properties and titles in both databases with identical ids, meaning that we cannot easily combine that data. Similar overlaps are happening with owners, and presumably also lawyers and so on.

This is not a disaster; the two databases are separate, so there's no confusion unless and until we start trying to combine them. Similarly, AC's work on maps is divided into five distinct sets, each of which maps to one or the other of the dbs.

However, we had to solve the problem because we are now in the business of merging this data, and overlapping ids are a bad idea generally. My solution, which I tested thoroughly on the dev db before applying it to the live db, was simply to update the Maple Ridge owners, titles and properties tables to add 20,000 to each of their id fields, as sketched below. This propagated automatically through the linking tables, which was very neat. Then I just reset the next-autoincrement value appropriately (that doesn't happen automatically). Seems to have worked perfectly.
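A minimal sketch of the fix (the owners and titles column names here are assumptions; prp_property_id is the real column from the props table). The automatic propagation assumes the linking tables reference these ids with ON UPDATE CASCADE:

-- Shift the Maple Ridge ids clear of the Powell Street range.
UPDATE `owners` SET `own_owner_id` = `own_owner_id` + 20000;
UPDATE `titles` SET `ttl_title_id` = `ttl_title_id` + 20000;
UPDATE `props` SET `prp_property_id` = `prp_property_id` + 20000;

-- The next-autoincrement value doesn't follow automatically and has
-- to be reset by hand; the value here is illustrative.
ALTER TABLE `props` AUTO_INCREMENT = 21105;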

I've also corrected some inconsistencies between the location table values in the two dbs, and I'm working on some properties which have no assigned location in the Powell St db.

29/11/17

Permalink 04:08:19 pm, by mholmes, 85 words, 14 views   English (CA)
Categories: Activity log; Mins. worked: 240

Maps working, but build still broken

I have a basic OSM map working for any given title. This involved hacking the BreezeMap code a bit; I'll port some of the changes back into BreezeMap itself. What remains to be done is to create index pages for the titles, fix the links between maps (they're done with a ttl: prefix), push the map build into the products, add links to the titles from the street pages, and (perhaps most urgent) fix the build, which is currently broken for reasons I don't understand.

Permalink 10:55:32 am, by mholmes, 565 words, 16 views   English (CA)
Categories: Activity log; Mins. worked: 120

Converted coords to WGS84 degrees

I've solved the problem of converting the coordinates from EPSG:32610 (metres) to WGS84 degrees while prepping the data for insertion into the MySQL dbs. This is the script:

/* This file is designed to be run against a series of GeoJSON 
 * files to create SQL statements which insert polygon and 
 * area information into our Land Title Databases. */
 
/* The GeoJSON files are generated from QGIS. */

// We need fs for file I/O and proj4 for coordinate conversion.
var fs = require("fs"), content, i, maxi, j, maxj, k, maxk, l, maxl, 
              gj, output, id, area, pathBits, outFile, arrCoords,
              degreeCoords, coords, ids=[], idList='',
              idCheck='', vancouverFiles=[], 
              mapleRidgeFiles=[], proj4 = require("proj4");
              
//We create these Proj4JS definitions so we can more easily refer to them later.
proj4.defs('epsg32610', '+proj=utm +zone=10 +ellps=WGS84 +datum=WGS84 +units=m +no_defs');
proj4.defs('WGS84', "+title=WGS 84 (long/lat) +proj=longlat +ellps=WGS84 +datum=WGS84 +units=degrees");

//GeoJSON files

//Two sets of data are for properties in the Vancouver db.
vancouverFiles.push('KitsilanoPlans/MergedKitsilano/MergedKitsilano_geom.json');
vancouverFiles.push('PowellStreetPlans/MergedPowellStreet/MergedPowell_geom.json');

//Three sets are for properties in the Maple Ridge db.
mapleRidgeFiles.push('MapleRidgePlans/MergedMapleRidge/MergedMapleRidge_geom.json');
mapleRidgeFiles.push('SaltSpringPlans/MergedSaltSpring/MergedSaltSpring_geom.json');
mapleRidgeFiles.push('StevestonPlans/MergedSteveston/MergedSteveston_geom.json');

console.log("\n *START* \n");

processFileArray(vancouverFiles, 'vancouver.sql');
processFileArray(mapleRidgeFiles, 'mapleRidge.sql');

//Process an array of files, creating a single output file
//with SQL commands to set the coordinate and area fields.
function processFileArray(files, outputFile){
  output = '';
  for (i=0, maxi=files.length; i<maxi; i++){
      ids = [];
      content = fs.readFileSync(files[i]);
      console.log('Read in ' + files[i] + '.');
      gj = JSON.parse(content);
      console.log('Found ' + gj.features.length + ' features in the file.');
      for (j=0, maxj=gj.features.length; j<maxj; j++){
//The UID is the value of the prp_property_id field in the database props table.
        id = gj.features[j].properties["UID"];
        if (id != null){ //loose != also skips undefined (missing UID)
//Keep a list of ids so we can do some checking later if necessary.

          ids.push('"' + id + '"');
        
//The area in sqm needs to be rounded to two decimal places.
          area = Math.round(parseFloat(gj.features[j].properties.area) * 100) / 100;
          
//Here we convert the original coordinates from 32610/metres to WGS84
//(for GeoJSON). This assumes simple Polygon geometries: an array of
//rings, each an array of [x, y] points.
          arrCoords = gj.features[j].geometry.coordinates;
          for (k=0, maxk=arrCoords.length; k<maxk; k++){
            for (l=0, maxl=arrCoords[k].length; l<maxl; l++){
              degreeCoords = proj4(proj4('epsg32610'), proj4('WGS84'), arrCoords[k][l]);
              //console.log(JSON.stringify(arrCoords[k][l]));
              //console.log(JSON.stringify(degreeCoords));
              arrCoords[k][l] = degreeCoords;
            }
          }

          coords = JSON.stringify(gj.features[j].geometry.coordinates);
          
//Write the two update statements for this property record.
          output += 'UPDATE `props` set `prp_coords` = "' + coords  + '" WHERE `prp_property_id`="' + id + '";\n';
          output += 'UPDATE `props` set `prp_area` = "' + area + '" WHERE `prp_property_id`="' + id + '";\n';
        }
      }
      idList = '(' + ids.join(', ') + ')';
      idCheck = '#id check: \n\n# ' + 'let $ids := ' + idList + ' return for $i in $ids where not(//prp_property_id[. = $i]) return $i \n\n';
      
      //idCheck = 'CREATE TYPE id_nums AS TABLE OF NUMBER;\n';
      //idCheck += 'SELECT * FROM TABLE(id_nums(' + idList + ')) '
      //idCheck += ' WHERE NOT IN (SELECT `prp_property_id` from `props`)\n\n';
      output = idCheck + output;
  }
  fs.writeFile(outputFile, output, function(err){
    if (err){ console.log('Error writing ' + outputFile + ': ' + err); }
  });
}

console.log("\n *EXIT* \n");
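For the record, a run would look something like this (the script filename, MySQL user, and db names are placeholders):

node convert_coords.js
mysql -u username -p vancouver_db < vancouver.sql
mysql -u username -p mapleridge_db < mapleRidge.sql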

28/11/17

Permalink 03:04:54 pm, by mholmes, 88 words, 13 views   English (CA)
Categories: Activity log; Mins. worked: 60

Converting coords from EPSG 32610

My next problem with creating the GeoJSON is that I have coords in EPSG:32610, in metres, and I need them in WGS84 for the GeoJSON. This approach will probably work:

npm install proj4

node...

proj4.defs('epsg32610', '+proj=utm +zone=10 +ellps=WGS84 +datum=WGS84 +units=m +no_defs');
proj4.defs('WGS84', "+title=WGS 84 (long/lat) +proj=longlat +ellps=WGS84 +datum=WGS84 +units=degrees");

proj4(proj4('epsg32610'), proj4('WGS84'), [493251.00240006257,5459084.067122783])
[ -123.09280153510765, 49.28447161994075 ]

This converts a single point; I still need to figure out how to convert polygons. (The script in the "Converted coords to WGS84 degrees" entry above, from 30/11/17, eventually handles this by looping over each ring of each polygon.)

27/11/17

Permalink 04:51:15 pm, by mholmes, 81 words, 15 views   English (CA)
Categories: Activity log; Mins. worked: 240

Working on price per metre and mapping

  • Integrated the new price-per-metre data into the spreadsheet output for stats. Reported to JSR.
  • Began work on generating mappable data for titles. TEI files are now generated from the title-based XML view, and these can be converted with the BreezeMap code to GeoJSON. This is working, but in the process I discovered that a set of GeoJSON coordinates (for 121 locations) inadvertently got truncated during the import process, so I'll have to rescue that data.

Ids of property records with truncated coords:

152, 159, 168, 174, 177, 184, 190, 193, 194, 266, 267, 268, 278, 281, 283, 288, 293, 294, 295, 320, 321, 322, 323, 328, 329, 332, 333, 338, 339, 347, 396, 424, 439, 480, 481, 482, 483, 484, 485, 488, 489, 490, 491, 494, 552, 553, 568, 569, 570, 571, 572, 573, 574, 575, 577, 590, 610, 623, 630, 636, 700, 701, 705, 706, 707, 708, 710, 711, 713, 820, 821, 824, 825, 831, 832, 844, 845, 846, 847, 848, 849, 850, 851, 852, 853, 854, 856, 857, 859, 860, 861, 862, 863, 864, 865, 866, 867, 872, 873, 877, 891, 896, 900, 925, 977, 980, 984, 985, 994, 996, 997, 999, 1000, 1004, 1006, 1012, 1013, 1018, 1021, 1024, 1031

24/11/17

Permalink 02:56:43 pm, by mholmes, 74 words, 18 views   English (CA)
Categories: Activity log; Mins. worked: 240

XML generation with value per square metre

After putting the coords and areas into the dbs yesterday, I reworked the XSLT that generates the title-based and lot-based XML views of the data so that they include price-per-square-metre for titles and lots, and also a version of it normalized to 2016 values. There are still lots of unknowns about how we handle this data, especially the edge cases (missing lot info, extinguished lots, etc.), but I'm working this stuff out now with JSR.
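As a sketch of the derived values (the variable names are mine, and the source of the 2016 multiplier is an assumption; this log doesn't specify it):

//Derived values for a title or lot: price per square metre, and the
//same figure normalized to 2016 dollars. The multiplier's source is
//an assumption; the log doesn't say where it comes from.
function perSquareMetre(salePrice, areaSqm, factor2016){
  var pricePerSqm = salePrice / areaSqm;
  return {
    pricePerSqm: pricePerSqm,
    pricePerSqm2016: pricePerSqm * factor2016
  };
}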

23/11/17

Permalink 02:55:03 pm, by mholmes, 97 words, 13 views   English (CA)
Categories: Activity log; Mins. worked: 240

Adding coords and areas to db

I figured out how to use QGIS to generate a GeoJSON file with areas in it, and documented that in the protocol document; then I wrote some JS for nodejs to generate SQL update statements to add that data to the db. I created two new fields in landscapes_dev (my testing db):

ALTER TABLE `props` ADD `prp_coords` VARCHAR(256) CHARACTER SET utf8 COLLATE utf8_unicode_ci NULL,
ADD `prp_area` DECIMAL(8,2) NULL;

Took backups of the dbs, then merged the new data into them. There are 262/1079 Powell St lots without coords/area, and 56/1120 Maple Ridge lots.

22/11/17

Permalink 04:53:34 pm, by mholmes, 53 words, 17 views   English (CA)
Categories: Activity log; Mins. worked: 120

Working on calculating areas for polygons

I'm trying to find a straightforward way to calculate the areas of all the polygons in the lot maps created in QGIS by AC, but so far I haven't found one. I've been trying with OpenLayers and the GeoJSON output created with ogr2ogr, but for some reason the JS won't read the JSON files.
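For reference, the ogr2ogr conversion is along these lines (the file names here are placeholders for AC's QGIS layers, not real project paths):

ogr2ogr -f GeoJSON lots.json lots.shp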
