Dave Wolowicz asked for some help with an XSLT list-processing question, for his work on the uSource portal. (Not strictly Humanities, but uSource is for everyone.)
Posted minutes; worked with Martin on the start of the back-end for reporting.
Got final approval to upload the History department site.
Zipped the existing site and put the zip file into the root of their account.
Uploaded the new files and did some brief testing.
Emailed the department secretaries and chair.
Still have to show the secretaries how to upload PDF files and make simple text edits to announcements.
Figured out how to hook into the main db functionality by including the relevant libraries in reports.php. Worked out how to extract parameters from the URL using the B2Evo param() function. Began building the SQL string we'll need to pass to the database to get back a list of posts. Ran into a number of difficulties: posts are not associated directly with blogs; they're associated with categories (through two different tables), and categories are associated with blogs. Got some way into it; more tomorrow.
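The post-to-category-to-blog chain above is easiest to see written out as the JOIN it implies. Here's a minimal sketch as a JS string-builder; the table and column names (T_posts, T_postcats, T_categories and their ID fields) are placeholders standing in for the real B2Evolution schema, not verified against it.

```javascript
// Sketch only: assemble the SQL to list posts for one blog by walking
// posts -> post/category link table -> categories -> blog.
// All table/column names below are hypothetical stand-ins.
function buildPostsSql(blogId) {
  return [
    'SELECT p.post_ID, p.post_title',
    'FROM T_posts AS p',
    'JOIN T_postcats AS pc ON pc.postcat_post_ID = p.post_ID',
    'JOIN T_categories AS c ON c.cat_ID = pc.postcat_cat_ID',
    // Number() coerces the parameter so this sketch never splices raw text
    'WHERE c.cat_blog_ID = ' + Number(blogId)
  ].join('\n');
}
```

The real query will be built in reports.php from the GET parameters, but the JOIN shape should be the same.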
I hacked together a style derived from the main (new) HCMC website.
Please try it and comment. I've only tested in Firefox on Linux so far.
Modified the AJAX code from the Moses project to call the back-end PHP script and put the response into the page. (PHP script itself doesn't do anything yet, other than respond...)
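The call-and-insert pattern borrowed from Moses looks roughly like this. A minimal sketch, assuming a plain XMLHttpRequest GET; the script name `reports.php`, the `callBackend` helper, and the target element id are illustrative, not the actual code.

```javascript
// Sketch of the AJAX round trip: fetch a URL, hand the response text
// to a callback once the request completes successfully.
function callBackend(url, onDone) {
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    // readyState 4 = request complete; status 200 = OK
    if (xhr.readyState === 4 && xhr.status === 200) {
      onDone(xhr.responseText);
    }
  };
  xhr.open('GET', url, true); // true = asynchronous
  xhr.send(null);
}

// Hypothetical usage: drop the back-end's response into the page.
// callBackend('reports.php?blog=3', function (text) {
//   document.getElementById('reportOutput').innerHTML = text;
// });
```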
This is a repeat of my comments on your post, but it rightly belongs here.
Discussed the Samsung 32-inch LCD TV (LNS3251D, Black) with Purchasing Officer Sarwan Singh Dillon (email@example.com), and was told the item needed to go to tender. I sent him the specs on 11/20 and we should hear back within a week or so. I will follow up next Wednesday if I don't hear from him.
Added the HTML code to:
Added CSS code to:
The JS code which displays and updates the progress bar is part of the AJAX code, which will be imported next.
Contrary to the previous post, I decided that the best way to submit a report query to the back-end is to build a URL search string; this will enable us to provide URLs directly to reports. The SQL query should be constructed by the back-end code based on the field values submitted (through GET). This is also consistent with our AJAX code in the Moses and ScanCan projects.
Created a JS file to do this (hcmc_reports.js). This will later include all the AJAX functions. Tested the search string generation, and tweaked the form field names so they make more sense as GET fields.
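The search-string generation described above can be sketched as follows. The field names (`blog`, `startDate`) are illustrative, not the actual form fields in hcmc_reports.js; the point is that the form's name/value pairs become a GET query string, so any report can also be reached by pasting its URL.

```javascript
// Sketch: turn a map of form field names/values into a URL search
// string, percent-encoding both sides so arbitrary values are safe.
function buildSearchString(fields) {
  var parts = [];
  for (var name in fields) {
    if (Object.prototype.hasOwnProperty.call(fields, name)) {
      parts.push(encodeURIComponent(name) + '=' + encodeURIComponent(fields[name]));
    }
  }
  return '?' + parts.join('&');
}

// e.g. buildSearchString({blog: '3', startDate: '2006-11-01'})
//      → '?blog=3&startDate=2006-11-01'
```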
Reviewed another paper. Had the unfortunate experience of believing the conference tool when it told me that "a ConfTool session is limited to two hours"; after 45 minutes of work, I submitted my review, only to be bounced out to the login screen with the loss of my data. Reported this to the admin email on the site, and then reconstructed my review from memory.
In future I'll write the review in a text editor and save it before trying to submit it.