13 Mar 2018
But, why?
queried the Supervisor after I had responded that I had taken a few days off regular work to volunteer. Still, I felt it didn’t quite make sense to her. The last time I was at the CTICC (Cape Town International Convention Centre) was seven years earlier, meeting a prospective employer for a hands-on job interview. Long story short, I didn’t get the job, but I learnt (from the interviewer) how to do a definition query. At the time I was oblivious to the AfricaGeo 2011 conference taking place around me. The place was abuzz with activity and I feared I would fail to meet my potential job provider.
But this time around I was here for a different agenda.

A Runner Doesn’t Run
As I sat down and scanned the room, the age and mannerisms profile of those gathered started to become apparent. The vibe, and the increasingly obvious old acquaintances who clustered and chatted away, strengthened my suspicions. I surveyed the room once more to try to identify contemporaries. Before I could make a count I dropped the exercise and paid attention to the Program Director, who was now welcoming us and introducing key persons.

The team of Volunteers was briefed on what the 17th World Conference on Tobacco or Health was about, how important a role volunteers played and, at greater length, volunteer expectations. I had elected the roles of Runner, Speaker Room and Session Room volunteer. I chose these ground-level roles as I generally appreciate seeing the gears moving, literally. High-profile roles like team lead and supervisor were not so attractive at this stage.

We got access to the WCTOH2018 conference WiFi and downloaded the conference app. What an indispensable tool this was: floor plans, session times, presenter profiles and what-not, all a tap of a finger away. (Surely every conference must have this - at the back of my mind, Conference = FOSS4G 2018.) We were taken around the venue to get oriented and become expert Directionals come conference days. The thorough labelling in the CTICC made it easy to hold on to what the ‘Venue Guide’ wanted us to remember. Before long we were back in the briefing room for a healthy lunch pack and directions for the day that followed. The grandeur in design and yet simple layout of the CTICC lingered in my mind as I left the Centre, to return on Wednesday - the first day I had chosen to be placed.

Surprised
I reported to my station for the day - the Media Centre - lanyard and Volunteer tee on and a bout of enthusiasm abounding. Wednesday was a special day, as there were going to be press conferences and a key Plenary Session, so access to my station (as for everyone else) was through a metal detector and security scanner. An eye cast in any direction met a security officer.
I reported to my contact person, a lovely Matilde, who had a game plan ready for execution. A press conference was to take place in one of the Session Rooms. Together with my team members I distributed print material for the attendees and made sure the room was ready for use. I carried name tags for the key persons and laid them out on the desk. High-tech audio-visual equipment was conspicuous in the room. When the speakers started, a plethora of gadgets was waved around to capture the moments.
Towards the end of the press conference the struggle in my head concluded as I read one of the speakers’ name labels - Michael Bloomberg! I had seen this face many a time online, and at this moment I was in the same room with this globally renowned man. South Africa’s Minister of Health, Dr Aaron Motsoaledi, among others, was also here. What a surprise!

Delegates left the room, we cleared it up and went back to our station. Here we assisted journalists in getting online and having a comfortable working space. It was interesting to see how ‘NEWS’ developed from the press conference that had just taken place. News was going out into cyberspace, driven by individuals from various parts of the world, now seated in one room - literally minutes after the event.

We hovered in the room churning out print material for upcoming events. Of interest were articles with titles such as ‘For Immediate Release on Wednesday 7 March 2018’. The content of such publications tied congruently with what had just been said in the press conference. (Someone must surely have published this two-page release a minute after the conference had concluded.) I gained an insight into news propagation.
We also scoured the web for news coverage of the conference, printed the relevant articles and furnished our Press Coverage board near the main entrance. My shift concluded quite quickly. When the clock struck 12:30 I had not eaten anything substantial save for a quarter-palm-sized energy biscuit and the cup of coffee I had had circa 6:00. I was too excited to get hungry, so had forgone my break. I snacked on the healthy muffin in the lunch bag, contemplating how quickly the first half of the day had gone by. Unfortunately my schedule didn’t allow me to stay for the Awards Ceremony, a big event which was to take place in the evening.
Three Steps Up
Thursday was supposed to be much calmer than the previous day, so today I was scheduled for the Runner role… but where would my Running Base be? I lingered at the Deployment Centre (CTICC2, Aloe) to get further instruction from the shift manager.

As fate would have it, I became Co-Supervisor for the day - the person assigned wasn’t coming any more. I was to ‘float’ between the Media Centre and the Speaker Centre. This was my first detailed interaction with the Speaker Centre, the section that ensured speakers had the resources they needed to present their work: Session Rooms ready to go, presentations loaded to the virtual space and, above all, concurrent sessions running flawlessly.
I was briefed by the Co-Head of this section and off I went to explore the entire conference venue as I wished. I roamed from room to room checking on my team members - the volunteer crew manning the Session Rooms. The deployment from the Speaker Centre was flawless, the sessions went well, and the onsite CTICC technicians made sure the audio-visual equipment was impeccable.
09:00 saw a shift in tempo among the volunteers. There was an important Plenary Session taking place in Auditorium I, and every unengaged volunteer stormed A1. It was akin to reinforcements being deployed to where the war was fiercest. You could not find a volunteer idling in the empty session rooms; all hands were on deck ensuring guests were rightly directed to A1 and comfortably seated.
The session got off to a good start. When the panelist session began, I got one of my great conference takeaways - equipment failure is bound to happen. Inevitable failure. In spite of capable technicians expertly massaging the dials, the sound at times just wouldn’t bow. The thought of an ink cartridge running empty patrolled the periphery of my thoughts. I watched the technician wince in response to negative feedback from the mic and to stretches of silence as the words of a panelist vanished.
I had my fair share of knowledge impartation from the panel of experts as I kept a watchful eye on the comfort of the attendees. Without doubt the greatest challenge became aisle-seat occupation. Luckily A1 was big enough to apportion everyone a seat.

The importance of the Speaker Centre became apparent when I had to direct at least three Presenters to the station, who wanted to ensure their ducks were in a row before their presentation time slot.
A Delicate Order
When the session ended, volunteers went back to their usual stations - mainly the Session Rooms. During this time slot I was assigned to help with poster take-down. Hanne, who was responsible for the section, had it figured out. It wasn’t long before all the remaining posters were rolled up for collection by their owners or destined for the recycle bin.
In no time, together with other volunteers, we had the appropriate stickers in place for the next Poster Session. The posters were made of a plethora of materials, from canvas via rexine to flimsy paper. I got to have a chat with a presenter from north of the African Equator! We helped each other battle off the wall a poster made of the more perma-stick material. The print service provider he had used had misunderstood his request: this was an academic poster, not an advertisement. I shared his frustration but, happily, his session had gone well.
When the Poster work was done, so was my shift and the conference experience. I headed to the Volunteer Centre, signed out and thanked the volunteer manager for the opportunity. I grabbed the lunch bag as I left CTICC2, thinking to myself, I would do this all over again.

#Postscript
The conference experience should be for every college student, and volunteering is such an opportune entry point. I had been to user group meetings before and even been involved in logistics, but a conference is on another level. So I took the chance and came away with some takeaways:
- I should consider learning French. (Either the bulk of the conference organising team spoke it or it is truly a global language.)
- A Runner doesn’t run. It projects a negative image about the conference hosts.
- Ready or double-ready: equipment can fail and extra resilience is needed.
17 Aug 2017
Water Waste to Water, Less ?
This is a pseudo-follow-up post to A Point About Points. Cape Town is still in the midst of drought, albeit with winter rains bringing some relief. As further mitigation, Level 4b water restrictions were introduced from 1 July 2017, under which each individual should use no more than 87 L per day. With this information all around, I could not resist the itch to know where the water sources were in Cape Town - well, geographically that is. The Centre for Geographical Analysis at Stellenbosch University started a satellite image time-series of the main dams supplying Cape Town with water. What a brilliant idea! But where were these 6 main (and 8 minor) water sources in the Western Cape? I sought to answer that.
Think Water, Find Dams

To get me started I got the dam names off the City of Cape Town’s Open Data Portal. The search term “dams” yields a CSV file of dam levels (Dam levels update 2012-2017.csv). This listed all 14 water supply sources, with storage capacities. I Googled the locations for these and soon had X, Y coordinates for each.
I planned on having a story map of sorts to help the ‘gazer’ get an idea of where the water sources were. I settled on Jack Dougherty’s template on GitHub. This was particularly attractive because it had LeafletJS in the mix and some GeoJSON. My mind was set on the map portion of this excellent visualisation, which proved to be too complex to emulate. I also considered Odyssey.js, which turned out to be unpolished and seemingly abandoned as a project.
It didn’t take long to have the scrollable map-story for the dams after following the instructions on the GitHub repo and editing a few lines in Sublime Text.
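The data-preparation step above boils down to turning a CSV of dam records into GeoJSON that LeafletJS can load directly. A minimal sketch, assuming hypothetical column names (`name`, `capacity_ml`, `lon`, `lat`) and illustrative sample values - the real file from the Open Data Portal has different headers:

```python
import csv
import io
import json

# Hypothetical extract of the dam list: name, capacity, and the
# X, Y coordinates looked up by hand (approximate WGS84 lon/lat).
DAMS_CSV = """name,capacity_ml,lon,lat
Theewaterskloof,480188,19.29,-34.08
Voelvlei,164095,19.03,-33.36
"""

def csv_to_geojson(text):
    """Turn rows of dam records into a GeoJSON FeatureCollection."""
    features = []
    for row in csv.DictReader(io.StringIO(text)):
        features.append({
            "type": "Feature",
            "geometry": {
                "type": "Point",
                # GeoJSON orders coordinates as [longitude, latitude]
                "coordinates": [float(row["lon"]), float(row["lat"])],
            },
            "properties": {
                "name": row["name"],
                "capacity_ml": float(row["capacity_ml"]),
            },
        })
    return {"type": "FeatureCollection", "features": features}

geojson = csv_to_geojson(DAMS_CSV)
print(json.dumps(geojson)[:80])
```

The resulting FeatureCollection can be dropped into the template’s data folder and fed to Leaflet’s `L.geoJSON` layer as-is.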
See Full Version
Credits
The following sources were used for the Dams Narratives.
- Cape Water Harvest
- Hike Table Mountain
- The Table Mountain Fund
- Cape Trekking
and the City of Cape Town website for dam capacities.
07 Jul 2017
Love to Doodle ?
I tweeted a workflow doodle of what I was doing geo-wise in my day job. The workflow, aka algorithm, did produce the desired results when I executed it. In retrospect, however, I asked myself whether I really had to doodle, and whether I enjoyed it. What was wrong with working with models (ESRI’s Model Builder) on the computer? Was doodling not ‘wasting’ precious time… since I had to recreate the thing in the geo-software later on anyway? Add to that, I was the sole workman on this project, so I had the liberty to transfer thought to implementation without the need to bounce the idea off a co-worker. Well, whiteboards are a great idea, especially when there’s need for team collaboration - not to mention the departure from the boxing-in effect a computer system or software tool may have on the thinking process.
The Handicap
Mine wasn’t (isn’t) just a matter of preferring to doodle on papyrus first. I then realised the doodling was born in the past, dating back to the time I was first introduced to the word processor - Corel WordPerfect on a Windows 95 laptop, circa 2000. A computer, I was made to believe - and also observed many treating it so - was a formatter of work. It was meant to make one’s work look neater, cleverer; add fancy text formatting to that and you had a document whose ‘look’ you could admire, never mind the worth of what was being presented. So I would spend hours writing out stuff on paper first, for later transfer onto ‘The Computer’ (…well, call it an electronic typewriter).
The idea that a personal computer was a tool crept in two years later, when I was introduced to programming… in Pascal. Only then did it start to dawn on me that you could put your own text on the screen as you interacted with the PC. But then, PCs at the college were communal, so back to doodling again… for when I got a chance at the keyboard.
So that’s how I fell in love with doodling. One shouldn’t take their mistakes with them to the computer, you see. So in working the habit off, it’s now ‘let’s muddle on the machine’. It is a tool, and it’s not like it will explode or anything.
21 Apr 2017
Point or Line?
The City of Cape Town on 27 February 2017 released a list of the top 100 water consumers in the city. Cape Town lies in a water-scarce region and is (Q1, 2017) in the grip of a severe drought. Among other measures like water usage restrictions, part of mitigating the water crisis in the city was the publication of the list of ‘heavy users’ above. Several media houses ran with the story (and list) with some including a points map of these water users’ streets.
The top water wasters were named by street name and not street address. The ‘offenders’ points maps (I’ve come across) serve an excellent purpose in pinpointing the location of the subject street. A point, however, gives the impression of ‘on-this-spot’. There is also, as an attribute of the point, the subject street name to aid the map user. Still, a point map exposes our species of gazers to misinterpretation. The camouflaged water users cannot be quickly and easily identified via some buffer of the points plotted, but rather by buffering the entire ‘offending’ street. The offending household lives ‘along-this-street’ and not ‘around-this-point’.
Well, enough read? You can - Cut To The Chase
For The Love Of Eye-Candy
Well, I’ve learnt (and come to realise it too) that the value of spatial data decreases with the passage of time. But does good cartography follow the same trend? (A discussion for another day.) For the love of better cartography I decided to come up with an improved version of the Top 100 Water Users map.
Get The List
I got the list from the News24 website. From the way it was formatted, I quickly concluded this would be a good candidate for CSV. 100 lines in Notepad++ is not a lot to manipulate and format.
Make It Geo
Address geocoding used to be a costly and rigorous process. Nowadays there are several options to turn address descriptions into vector data. QGIS has a geocode option via the MMQGIS plugin. Since I had determined to have the data visualised in CARTO, I moved the streets data there for address geocoding. In seconds I had points!

So in minutes I had the same Points Map as the media houses!

On inspecting the point data - look, spatial duplicates! These would slip past the untrained eye when looking at text data, especially data sorted/ranked by consumption. This is where spatial representation shines. The following duplicates (street-wise) were found:
89. Hofmeyr Street‚ Welgemoed - 143 000 litres
94. Hofmeyr Street‚ Welgemoed - 123 000 litres
4. Upper Hillwood Road‚ Bishop’s Court - 554 000 litres
71. Upper Hillwood Road‚ Bishop’s Court - 201 000 litres
23. Deauville Avenue‚ Fresnaye - 334 000 litres
29. Deauville Avenue‚ Fresnaye - 310 000 litres
100. Deauville Avenue‚ Fresnaye - 116 000 litres
69. Sunset Avenue‚ Llandudno - 204 000 litres
95. Sunset Avenue‚ Llandudno - 122 000 litres
46. Bishop’s Court Drive‚ Bishop’s Court - 236 000 litres
97. Bishop’s Court Drive‚ Bishop’s Court - 119 000 litres
This ‘repeat’ of a street name changes the way the data would be represented. For Hofmeyr Street, for instance, the combined consumption would be 266 000 litres, since consumption is presented by street name. (At the back of our minds we remain cognisant that the data was presented by street to mask individual street addresses.)
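The aggregation described above is a straightforward group-and-sum. A minimal sketch using a subset of the duplicate rows listed earlier (rank, street, litres):

```python
from collections import defaultdict

# (rank, street, litres) rows taken from the duplicates listed above.
rows = [
    (89, "Hofmeyr Street, Welgemoed", 143000),
    (94, "Hofmeyr Street, Welgemoed", 123000),
    (4, "Upper Hillwood Road, Bishop's Court", 554000),
    (71, "Upper Hillwood Road, Bishop's Court", 201000),
]

# Aggregate consumption per street, as a street-line map must do.
totals = defaultdict(int)
for _rank, street, litres in rows:
    totals[street] += litres

print(totals["Hofmeyr Street, Welgemoed"])  # 266000, matching the figure above
```

The rank column is deliberately dropped: once consumption is summed per street, the original 1-100 positions no longer apply.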
The geocoding in CARTO was also not 100% accurate (I would say 95%). I had to move a few points to be exactly over the subject street in the subject suburb. Not CARTO’s geocoder’s fault at all - my data was too roughly formatted for a really good geocode hit rate.
Street = Line
Now, to get my streets (spatial) data, I headed over to the City of Cape Town Open Data Portal. Road_centrelines.zip holds the street vector data.
In CARTO I exported the points data, to be used to identify the street ‘line’ data.
Within QGIS, the next task was to select the subject streets, guided by the points data. The (un)common spatial question:
Select all the streets (lines) near this point (having the same street name).
To answer the above question, a ‘Select by Attribute’ was done on the street geo-data. The expression with the 100 street names was formatted in a text editor (Notepad++) but executed in QGIS. Some subject streets (e.g. GOVAN MBEKI ROAD) spanned more than one suburb and had to be clipped appropriately. Again I headed over to the City of Cape Town Open Data Portal to get the suburbs spatial data (Official planning suburbs 2016.zip).
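Formatting that expression by hand is exactly the kind of chore a few lines of scripting avoids - including escaping the troublesome apostrophes in names like FISHERMAN’S BEND. A sketch, assuming a hypothetical field name `STR_NAME` (the real road-centreline attribute may differ):

```python
streets = ["GOVAN MBEKI ROAD", "FISHERMAN'S BEND", "DEAUVILLE AVENUE"]

def in_expression(field, names):
    """Build a QGIS 'Select by Attribute' IN-expression; a single quote
    inside a name (e.g. FISHERMAN'S BEND) is doubled to escape it."""
    quoted = ", ".join("'" + n.replace("'", "''") + "'" for n in names)
    return '"%s" IN (%s)' % (field, quoted)

expr = in_expression("STR_NAME", streets)
print(expr)
```

With all 100 street names in the list, the one generated expression replaces a hundred manual copy-paste-and-quote operations in Notepad++.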
I used these to clip the road segments to the corresponding suburbs. I further merged small road sections to ease the process of assigning attributes from the points to the streets. A field to store the combined consumption for each street was also created.
A buffer operation on the points data was done, followed by a Join Attributes by Location operation to transfer attributes from the points to the lines. An edit was made for the spatial duplicates to sum the consumption totals.
The streets vector data (lines) do not have a usage ranking assigned to them: the data has been aggregated and totals per street are now in use. This reduces the data count from 100 to 93 (6 repeats, 1 street not found).
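What the buffer-plus-Join-Attributes-by-Location step accomplishes is, in effect, matching each street line to the point record for the same street in the same suburb. A pure-Python stand-in for that attribute transfer, with hypothetical field names and sample records for illustration:

```python
# Point records carry the (already aggregated) consumption;
# line records come from the clipped road centrelines.
points = [
    {"street": "HOFMEYR STREET", "suburb": "WELGEMOED", "litres": 266000},
]
lines = [
    {"id": 1, "street": "HOFMEYR STREET", "suburb": "WELGEMOED"},
    {"id": 2, "street": "MAIN ROAD", "suburb": "WELGEMOED"},
]

# Index points by (street, suburb) and copy attributes across,
# mimicking what Join Attributes by Location achieves spatially.
lookup = {(p["street"], p["suburb"]): p["litres"] for p in points}
for line in lines:
    line["litres"] = lookup.get((line["street"], line["suburb"]))

print(lines[0]["litres"])  # 266000; MAIN ROAD stays None (not an offender)
```

The spatial join in QGIS is of course more robust - it works even when names are spelt inconsistently between the two datasets, which is precisely why the buffer was used.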
Show It
Back to CARTO, I imported the edited streets data for mapping and styling. Wizards in CARTO make mapping trivial for the user. The resultant map below -
I chose the Dark Matter basemap in order to highlight the subject streets. An on-hover info window option was chosen, in keeping with the ‘gazing species’.
Perhaps Areas (Polygons) ?
I am somewhat satisfied with the way the data looks represented as lines (read: streets). The strong point being that a line forces one to scan its entire length, while with a point, only its immediate vicinity. There’s still some discontentment with the way the lines look - ‘dirtyish’ - but then, that is the reality of the data. There are several flaws in such a visualisation:
- The length of a subject street gives the impression of higher water usage. (Colour coding of the legend attempts to address that.)
- Chances are high a would-be user associates the ‘lines’ with water pipes.
- On first sight, the map doesn’t ‘say’ intuitively what’s going on.
This leads me to Part B - “Maybe it’s better to have property boundaries along the street mapped for the water usage”.
#Postscript
- Good luck finding Carbenet Way‚ Tokai! (Looks like a data hole to me.)
- With the data wrangling bit, street names such as FISHERMAN’S BEND were problematic because of the ’s.
- The CCT open data portal could improve on the way data is downloaded - preserving the downloaded filename, for instance, among other things.
05 Apr 2017
Make Me A Map
Oft in my day job I get requests to map data contained in spreadsheets. I must say I prefer ‘seeing’ my (soon-to-be-former) spreadsheets in a GIS software environment - TableView - so I quickly want to move any data I get to that place. The path to that, and eventually to the cartographic output, is rarely straightforward, springing from the fact that:
- Most data cells are merged to make the spreadsheet look good.
- Column (field) names are rarely database friendly. (Remember the Shapefile/ DBF >10 characters limit?)
- Something is bound to happen when exporting the data to an intermediate format - CSV - for manipulation. (Data cells formatting.)
- Not forgetting the unnecessary decimal places at times.
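The column-name problem in the list above (the Shapefile/DBF 10-character limit) is mechanical enough to script. A sketch of one way to sanitise spreadsheet headers before export - the sample headers are hypothetical:

```python
import re

def dbf_safe(names, limit=10):
    """Sanitise spreadsheet headers for DBF/shapefile fields:
    replace non-alphanumerics, truncate to `limit` characters,
    uppercase, and de-duplicate names that collide after truncation."""
    seen, out = set(), []
    for name in names:
        field = re.sub(r"\W+", "_", name.strip()).strip("_")[:limit].upper()
        base, n = field, 1
        while field in seen:              # collision after truncation
            suffix = str(n)
            field = base[:limit - len(suffix)] + suffix
            n += 1
        seen.add(field)
        out.append(field)
    return out

print(dbf_safe(["Water Consumption (litres)", "Water Consumption (kl)"]))
```

Note the de-duplication step: two long headers that differ only in their tails can truncate to the same 10 characters, which is exactly the sort of thing that surfaces as a cryptic export error later.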
The map requester often wonders why I’m taking so long to have their maps ready. In the meantime I have to deal with a plethora of issues as I wrangle the data:
- Discovering that some field names chosen by the client are reserved words in “My GIS” - [ size, ESRI File Geodatabase ]
- The tabular data can’t be joined to the spatial component straight-on because:
- A one-to-one relationship doesn’t exist cleanly because the supplied data contains spatial duplicates but unique attribute data. (What to retain?)
- The supposed common (join) field of the source data doesn’t conform to the format of my reference spatial dataset.
- Some entries of the common field from the master data contain values that do not fall within the domain of the spatial data - e.g. a street address such as 101 Crane Street, where evidently street numbers only run from 1 to 60 (and 01, 10 and 11 exist in the database, in case you’re thinking of a transcription error).
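That last check - whether a supplied value falls within the domain of the reference spatial dataset - can be automated before any join is attempted. A minimal sketch using the Crane Street example from above (the street name and address format are illustrative):

```python
# Street numbers known to exist in the reference spatial dataset
# (per the example above, Crane Street runs from 1 to 60).
valid_numbers = set(range(1, 61))

supplied = ["7 Crane Street", "101 Crane Street", "42 Crane Street"]

def out_of_domain(addresses, valid):
    """Flag addresses whose leading street number isn't in the
    reference set - candidates to send back to the client."""
    bad = []
    for addr in addresses:
        number = int(addr.split()[0])  # assumes 'NUMBER Street Name' format
        if number not in valid:
            bad.append(addr)
    return bad

print(out_of_domain(supplied, valid_numbers))  # the 101 entry is suspect
```

Flagged rows go back to the client as questions rather than silently dropping out of the join - which is the let-the-data-speak point made below.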
Frustration from the client is understandable. The integrity of their data has come under scrutiny by the let-the-data-speak dictate of data processing and cleaning. Unfortunately they have to wait a bit longer for their map. The decision on how to deal with the questionable data lies with them. I simply transform it, as best I can, into a work of map art.
Database Again Please
… entry level Data Scientists earn $80-$100k per year. The average US Data Scientist makes $118K. Some Senior Data Scientists make between $200,000 to $300,000 per year…
~ datascienceweekly.org, Jan 2017
Working with and in geodatabases makes me somewhat feel like I’m getting closer to being a data scientist. So as I resolved for year 2017 - more SQL (…and Spatial SQL) and databases.
Importing spreadsheets into an ESRI geodatabase has major advantages but isn’t straightforward either. The huge plus is being able to preserve field names from the source and do data-check things. Good luck importing a spreadsheet without first tweaking a field or two to get ‘Failed To Import’ errors out of the way.

Comma Separated Values (CSV) always wins but beware of formatting within the spreadsheet - if wrongly formatted… garbage in! Once the import is neatly done, I can comfortably clean the data for errors and check integrity better in my GIS environment. As a bonus I get to learn SQL too. Can’t resist that pretty window!

#Postscript
The above are just a few matters highlighting what goes on behind the scenes in transforming a spreadsheet into a map. The objective: to show the non-GIS user that it takes more than a few clicks to come up with a good map.