Spatial Data Hub Scotland

Planning in the UK has long suffered from a classic siloing of data by authority, resulting in slow and inconsistent analysis of information. Authorities relied on talented, motivated individuals with particular interests and skills to develop bespoke solutions that assisted in things like housing land audits, population forecasts, capital planning and local plan development, which, while often impressive individually, struggled to transfer between authorities.

The continual improvement of digital tools has dramatically improved the standardisation of attributes in particular spatial datasets, and database technology originally developed for accounting and flight control systems is starting to be applied to the amalgamation and analysis of planning-related information. Within the UK, different regions are progressing along this path at different rates. Scotland now has a body, the Improvement Service, with a specific remit to collect spatial planning data, which it does at something called the Spatial Data Hub.

As at 8 January 2024 the Spatial Data Hub had 59 datasets listed at Scotland coverage level, including:
Planning application boundaries
School Catchment Areas
Housing Land Supply
Vacant and Derelict Land
Employment Land

The Improvement Service has been building these datasets for a number of years now; however, last year they expanded general access to much of the information, and I have since been experimenting with it to see what can be achieved.

Multi-tenancy – Design Options

Some personal notes on different options for managing multi-tenancy

Silo – Different databases for each tenant. PHPRunner has an excellent solution for this.

Siloed Database with one Front End PHPRunner

Bridging – Unique schema for each tenant – I can’t see any great advantages to this, and quite a lot of extra work is required.

Row Level Security – Tenancy managed by the database – One database, with an additional column in each table recording the tenant; before returning the results of a query, the database filters them by tenant. A policy is set up with a unique user for each tenant. Applicable to Postgres. It still requires an additional field in all the tables, but it is nice to have and can be extremely quick to implement (see the sketch just after this list).

Client Software – Tenancy implemented client side – PHPRunner has an option to implement Row Level Security in the design of the front end.
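For the Row Level Security option above, here is a minimal Postgres sketch; the table, column and role names are hypothetical:

```sql
-- Minimal Postgres RLS sketch; each tenant connects as its own
-- database role, and one extra column records the owning tenant.
ALTER TABLE invoices ADD COLUMN tenant_id text NOT NULL DEFAULT current_user;

ALTER TABLE invoices ENABLE ROW LEVEL SECURITY;

-- Rows are only visible (and writable) when the tenant column matches
-- the connected role.
CREATE POLICY tenant_isolation ON invoices
    USING (tenant_id = current_user);
```

One thing to remember: superusers and the table owner bypass RLS by default, so use ALTER TABLE … FORCE ROW LEVEL SECURITY if the owner should be subject to the policy too.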

Amazon Web Services article on Multi-tenancy

Some points – for Postgres, implementing Row Level Security on tables does not carry through to views, which is a major issue for me. There are ways round it though. It would be great to have the database managing the row level security, and if you have all your tenants in one database possibly the best way to do it is the approach below.

How to Add Row Level Security to Views in PostgreSQL
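For what it’s worth, since PostgreSQL 15 there is also a built-in route: a view can be created so that it runs with the caller’s permissions, meaning the RLS policies on the underlying table still apply. A sketch, reusing the hypothetical invoices table from above:

```sql
-- PostgreSQL 15+: security_invoker makes the view run with the
-- caller's permissions, so RLS on the base table still filters rows.
CREATE VIEW invoice_summary
    WITH (security_invoker = true) AS
SELECT tenant_id, count(*) AS invoice_count
FROM   invoices
GROUP  BY tenant_id;
```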

New System Assessment : A Simple Set of Guidelines for judging material differences between Software

Here’s a short guide I put together for myself to help me assess the value of enterprise systems that I may be looking for or designing myself. It is not hard and fast, and some points may not be relevant to every application, but it should form a good basis from which to start. Generally a higher number is better than a lower number, but within a number, items are cumulative and of roughly equal value.

1) Paper based = everything from single sheets to formal Books of Accounts
2) Simple digital = spreadsheet and other file-based storage
3) Cloud-based simple digital = spreadsheet and other file-based storage – this can include file-based blob storage
4.1) LAN relational database – normalised but otherwise fairly locked down
4.2) LAN relational database + easy import and export of data other than spatial
4.3) LAN relational database + report writer definable by users
4.4) LAN relational database + spatially enabled
5.1) Internet-available relational database – normalised but otherwise fairly locked down
5.2) Internet-available relational database + easy import and export of data other than spatial
5.3) Internet-available relational database + report writer definable by users
5.4) Internet-available relational database + spatially enabled (noted: not all applications require a spatial aspect)
5.5) Internet-available relational database + separate site available for public access, view only
5.6) Internet-available relational database + separate public access to edit and add information

Requirements
+ Uptime / reliability
+ Speed – after reliability, very important
+ Good master detail forms (sounds easy, but Bootstrap is not great for this)
+ Good search – the larger the system, the more important the search options are. Look out for things like automated objective index creation (case-sensitive search is more of a hassle than a benefit a lot of the time; odd wildcards, or searches not being wild by default, are problem areas)

Important Points
+ Portability – can you up sticks and move it to a different cloud provider?
+ Non-proprietary (can be a gotcha if it’s really obscure or has weird security; often linked to portability)
+ Two factor authentication
+ Responsive design (not necessarily as important as you think in some applications; see GitHub / OpenLayers, which arguably don’t need it)
+ Dynamic saving – I really miss the fast dynamic saving environment of MS Access; the save button isn’t quite the same
+ Built-in CRM – I generally like them, especially if data can go straight into the correct files. Increasingly I am designing systems with inbuilt CRM. I know this might not be to every organisation’s taste, but it is jarring to go between systems, normalisation between systems is usually very sub-optimal, and you frequently come across proof-of-negative problems when the data required for a specific task is not held appropriately.
+ Satellite application for customers to enter information (why not let your customers maintain their information? It can be great and can empower customers – fintech anyone?)

= All the bells and whistles
A spatially enabled, internet-available database with two factor authentication, a report writer, easy import and export of spatial and attribute information, and public access for customers to edit and add information (possibly via a satellite application).

Creation of SITE History from Planning Application Polygons using QGIS

In planning it is important to know the planning history of a site. The status and likelihood of approval will often relate to previous permissions. Many council planning systems do not explicitly relate planning applications to each other, and there may be situations where you would like to create such links. This is essentially an exercise in using spatial analysis to create the junction table that holds what are many-to-many relationships.

If your datasets are at all large, you will need to set aside a computer so that it can perform the calculations. When I first tried this, the process took a weekend, with queries running overnight.

Start by obtaining as many years of planning application polygons as you can. Here I use polygon files in shapefile format.

All the polygons should be in a single shapefile, so merge the shapefiles you have together if necessary. I did this, and the file I ended up with was

AllPlanningApplications.shp

Next – Delete all attribute fields EXCEPT the planning application number.

Next – Create a centroids file from AllPlanningApplications.shp I called mine
AllPlanningApplicationsCentroids.shp
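As an aside, the same pipeline can be sketched in PostGIS rather than the QGIS menus, so I will show hypothetical SQL equivalents alongside the steps. These assume the merged polygons have been loaded into a table planning_apps(app_no, geom); all table names are my own invention. The centroid step, for instance:

```sql
-- Hypothetical PostGIS parallel of the centroid step.
-- ST_PointOnSurface guarantees the point falls inside the polygon,
-- which plain ST_Centroid does not for awkward shapes.
CREATE TABLE app_centroids AS
SELECT app_no, ST_PointOnSurface(geom) AS geom
FROM   planning_apps;
```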

The next series of iterations is about getting a unique set of polygons from which we can generate a set of SitePKIDs that can be attached to the child records.

Step – Using AllPlanningApplications.shp, add an additional field called area and populate it using the QGIS $area calculation – save this file.

Step – this is where it becomes interesting – in most authorities a vast number of planning application boundaries overlap. Performing a dissolve at this point would result in a large congealed set of polygons that could not clearly identify unique sites. By buffering the polygons down (a negative buffer) we can start to identify unique sites; this is particularly important where boundaries are completely contiguous with each other. The buffering command within the geometry tools is used to separate adjacent, overlapping and contiguous polygons.

Step ‐ Create two files from AllPlanningApplications.shp: one for polygons of less than 4500 square metres and one for polygons of 4500 square metres or more. This allows a different buffering process to be performed on each.

AllSmallLessthan4500PlanningApplications.shp

AllLargeGreaterthanequal4500PlanningApplications.shp

Now, the 4500 is an empirical figure that was subjectively chosen; there may be a better figure, so feel free to investigate.

The following 2 steps also introduce empirical figures for the buffering that can be altered as appropriate.

Step ‐ Take the file AllSmallLessthan4500PlanningApplications.shp and create a buffer polygon file from it with a buffer distance of minus 2 m – let’s call it

AllSmallLessthan4500PlanningApplicationsBufferMinus2.shp

Step ‐ Take the file AllLargeGreaterthanequal4500PlanningApplications.shp and create a buffer polygon file from it with a buffer distance of minus 20 m – let’s call it

AllLargeGreaterthanequal4500PlanningApplicationsMinus20.shp
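In the PostGIS sketch, both buffering steps collapse into one statement (the 4500, minus 2 and minus 20 figures are the empirical ones above):

```sql
-- The two buffering steps as one statement. Negative buffers shrink
-- each polygon so that merely touching boundaries no longer overlap.
-- Distances assume a metric projected CRS such as British National Grid.
CREATE TABLE apps_buffered AS
SELECT app_no,
       CASE WHEN ST_Area(geom) < 4500
            THEN ST_Buffer(geom, -2)   -- small sites: shrink by 2 m
            ELSE ST_Buffer(geom, -20)  -- large sites: shrink by 20 m
       END AS geom
FROM   planning_apps;
```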

THIS NEXT STEP WILL TAKE SEVERAL HOURS IT MAY BE BEST TO DO EACH ONE OVERNIGHT

Step ‐ Perform dissolves on both of these new files, ensuring that dissolve all is used:

Vector / Geoprocessing Tools / Dissolve /

Set the input layer alternately to the two files above and set the Dissolve field to dissolve all.

Suggested file names are

MultipartDissolvedPolygonLessthan4500PlanningApplicationsBufferMinus2.shp

MultipartDissolvedPolygonAllLargeGreaterthanequal4500PlanningApplicationsMinus20.shp

Step ‐ You should now have two shapefiles, each consisting of one large multipart polygon. Now perform the multipart to single part operation:

Vector / Geometry Tools / Multipart to Single Part

The processing involved is typically quick, and suggested names for these files are

DistinctPolygonsAllSmallLessthan4500PlanningApplicationsMinus2.shp

DistinctPolygonsAllLargeGreaterthanequal4500PlanningApplicationsMinus20.shp

Add an area column and identify the largest polygon in the small file.

Add an area column and identify the smallest polygon in the large file – you may want to remember these values.
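In the PostGIS sketch, the dissolve-all followed by multipart to single part is a single statement:

```sql
-- Dissolve-all then multipart to single part: union everything into
-- one multipart geometry, then dump out the individual parts.
CREATE TABLE distinct_site_polygons AS
SELECT (ST_Dump(ST_Union(geom))).geom AS geom
FROM   apps_buffered
WHERE  NOT ST_IsEmpty(geom);  -- negative buffers can empty small polygons
```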

Step ‐ perform a merge on these two files to get

Vector / Data Management Tools / Merge

CombinedSmallandLargeDistinctPolygonsPlanningApplicationswithbuffering.shp

ONGOING investigation ‐ would Difference be better than Dissolve here, and should the above files be put together first?

Step ‐ perform dissolve

Vector / GeoprocessingTools / Dissolve

ensure that ‐‐Dissolve all‐‐ is selected

DissolvedCombinedSmallandLargeDistinctPolygonsPlanningApplicationswithbuffering.shp

Step ‐ now split multipart to single part, giving

DistinctPolygonsAllPlanningApplications.shp

Step ‐ add a field called SitePKID and populate it using the $rownum command.
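The PostGIS equivalent of the $rownum trick might look like this:

```sql
-- Assign a sequential SitePKID to each distinct site polygon,
-- mirroring the $rownum field calculator step above.
ALTER TABLE distinct_site_polygons ADD COLUMN site_pkid integer;

UPDATE distinct_site_polygons d
SET    site_pkid = n.rn
FROM  (SELECT ctid, row_number() OVER () AS rn
       FROM   distinct_site_polygons) n
WHERE  d.ctid = n.ctid;
```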

Step

Vector / Data Management Tools / Join Attributes by Location

Set Target Vector Layer as

AllPlanningApplicationsCentroids.shp

Set Join Vector Layer as

DistinctPolygonsAllPlanningApplications.shp

Ensure that “Keep all records (including non‐matching target records)” is selected.

Output Shapefile suggestions

AllPlanningApplicationsCentroidswithSitePKID.shp

If there are centroids without SitePKIDs, put them to the end and give them consecutive unique row numbers. The attribute table associated with AllPlanningApplicationsCentroidswithSitePKID.shp should now be a child table of the shapefile DistinctPolygonsAllPlanningApplications.shp. Perform checks here to see that all centroids within a polygon defined by the distinct polygons have the same SitePKID and that it matches the SitePKID of the parent shapefile.
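The join by location has a direct PostGIS parallel; a LEFT JOIN plays the part of the “keep all records” option:

```sql
-- Join attributes by location: tag each centroid with the SitePKID of
-- the site polygon containing it. LEFT JOIN keeps unmatched centroids
-- (their site_pkid stays NULL, to be filled in manually as above).
CREATE TABLE centroids_with_site AS
SELECT c.app_no, s.site_pkid, c.geom
FROM   app_centroids c
LEFT   JOIN distinct_site_polygons s
       ON ST_Within(c.geom, s.geom);
```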

You should be able to do a join on this file to get the SitePKID back into the very original file.

AllPlanningApplications.shp

Finally, perform a dissolve on the corrected AllPlanningApplications.shp file, but this time dissolve on the field

SitePKID

You can call this

DistinctCorrectedPolygonsAllPlanningApplications.shp
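In the PostGIS sketch this final dissolve, grouped on SitePKID rather than dissolve-all, looks like this (assuming the IDs have been joined back onto the original polygons as a hypothetical planning_apps_with_site table):

```sql
-- One output polygon per site: union the original application
-- boundaries that share a SitePKID.
CREATE TABLE corrected_site_polygons AS
SELECT site_pkid, ST_Union(geom) AS geom
FROM   planning_apps_with_site
GROUP  BY site_pkid;
```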

QED!!!!

MS Access like development environments for the Web – 3 alternatives

So you would like to construct, on the web, the kind of simple applications you can currently create in MS Access. By that I mean a data-driven application with somewhat complicated forms that can be accessed by anyone, anywhere in the world, through a browser such as IE or Chrome, with a simple login screen at the front to prevent just anyone accessing the applications and the information they collect into a database. What are your options for programs that will assist you in an MS Access-like environment, rather than going the full IDE deep dive – the Visual Studio route – for what I consider to be a reasonable fee?

From my experience the unicorn of Access on the web is slowly coming to fruition, BUT for the vast majority of people with a budget similar to that for MS Access – let’s say £200 ($250) a year for unlimited applications – there is simply nothing which is quite as easy and powerful as MS Access. Some are pretty close, but they are simply not as stable and typically require orders of magnitude more configuration. WYSIWYG design isn’t quite as WYSIWYG, and stability is a few orders lower than on the desktop.

What you are probably looking at can typically be described as a RAD tool for the web, a low-code platform or something called a code generator – any of those phrases will be useful for Google.

Assuming you don’t have your own servers, whatever you do you will need to spend money on a web host.

The minimum this is likely to cost you is in the region of $15 a month. If you don’t want to spend the next six months learning the ins and outs of frameworks, then I would suggest you go to one of the three providers below, who all provide complete environments set up and ready to go with their particular generators pre-installed. This is good value: it is extremely difficult to beat these guys on cloud hosting costs, and unless you are very advanced and have very particular requirements it’s a waste of time to try. All three of the providers below will allow you to create a limitless number of applications, albeit limited by the space you hire on your web server. Similarly, distribution will be limited by the quality of web server you sign up for. In all likelihood, if you have few users, the coding front ends of your applications are unlikely to limit the number you create – more likely limits are the size of the databases you are attaching them to and the sheer time you have available to create applications.

For a period I was paying a monthly amount for a hosted nuBuilder Pro platform. This performed well and I could create an unlimited number of applications. As it was hosted, I skipped the step of learning some of the deeper parts of the initial configuration. I hope at some point to go back to this. It is open source and seems well maintained, with a very dedicated developer. The developer re-wrote much of it, and as at March 2019 its latest re-incarnation is nuBuilder Forte.

Be warned: n-tier web applications do not play as friendly as the desktop. You WILL be slower to construct applications than you are on the desktop; getting into it WILL take time and a bit of commitment; you WILL have far less flexibility regarding coding; there WILL be fewer people about to ask questions; and there are far, far fewer WYSIWYG design capabilities. Error trapping is poor and errors are far more likely to be fatal. The really big warning is that on release of new web frameworks you may not be able to update without a full site re-design (a fact that comes as a nasty surprise to many CIOs and project managers when they realise they are locked into front end replacements every 4 or 5 years). Know how to get data out of the back end to your local environment, and accept that the front end is ephemeral and not likely to last in the same way as your desktop applications (your database will last, but don’t expect to be running it through the same front end ten years from now). Accept that you will now have monthly or annual rental fees for cloud provision.

That said, the design of these items is significantly faster than it’s ever been.

Scriptcase and ASP Runner dot net (Xlinesoft also produces a PHP equivalent generator) have free downloads that are good for getting started.

Commit to one and go for it – I’ve got both PHP and ASP.NET solutions. nuBuilder only connects to MySQL, whereas Scriptcase and ASPRunner.NET connect to pretty much any database. I started with nuBuilder and am using ASPRunner.NET as well because, importantly, it connects to SQL Server and it was easy to get up and running in MS Azure. Scriptcase is PHP based, and I believe the applications it builds require some kind of security program to sit on the web server, which put me off – they do however have pre-configured hosting that you can sign up for. Their IDE is web based, which could be a winning advantage. One of the great advantages of ASPRunner.NET is that the program produces an open web application that should run on all stock servers. I found nuBuilder Pro (now nuBuilder Forte) really conceptually elegant; despite its rather drab looks it is incredibly flexible, although the applications it produces are limited to MySQL and are non-responsive (but being non-responsive you get more detailed forms!). I would probably be able to change its look if I was prepared to get my own server and install everything on it myself; that is not something I have time to do at present.
nuBuilder hosts its IDE in the browser, which again is an advantage. ASPRunner.NET is more traditional in that you have a program running on a desktop that creates the plumbing of your application, which you then need to push to a server for publication. This has the advantage that you get to see the plumbing in the background, which makes backup of the site easier but publishing slightly harder.

You may have heard of other generators / design applications out there – for example Zoho Creator / Alpha 5 / OutSystems. These hold your hand even more, but as a result are even more proprietary and won’t fit in that budget of £200 per year (by quite a long way!).

Some further information on costs – nuBuilder, being open source, could in theory scale for very little money, especially if you already have your own servers. Scriptcase and Xlinesoft’s ASPRunner products have an initial fee followed by an annual subscription – you may be able to configure things so that you can create unlimited applications for that one fee (if you have good access to web servers), but it is likely that initially there will be some kind of variable cost per additional application you wish to build. I am using MS Azure with ASPRunner.NET; a developer database costs me about £5 a month, with each application hosted in a web application service which again costs £5. With both Scriptcase and ASPRunner, if you stop paying the annual fees your applications will continue to work; you just will not get version upgrades. You will be able to step back into version upgrades, but you may need to restart your subscription with an additional fee.

Nubuilder Forte Link

Scriptcase Link

ASP Runner – PHPRunner and ASPRunner.Net Link

Good luck

nuBuilderPro – a database driven Web Framework for simple Form Design

I have been seeking a way of designing simple web forms for a reasonable cost for some time. I kept coming across either very professional products with really quite large annual charges (Alpha Anywhere) or products that charge a per-seat cost for users (Zoho Creator). Fine, I don’t mind paying for software, I really don’t, but what about a project that only brings in $2000 a year – how can I justify Alpha Anywhere’s costs for that, or how can I give access to individuals who will need an application very rarely but for whom there is real value in knowing the information it holds? I have long believed that software costs should be coming down rather than ever increasing, and getting users interested in your applications is often impossibly hard at the best of times when there is no cost, let alone when you tell them that they need to shell out for a subscription. What happens to the user who only needs something once in a blue moon, or a user from a section outside your department where you have no control of the budget?

I have recently had a lot of success with open source software, and as a result I have been scouring the internet for options for some years. Recently I found a project called nuBuilderPro from an Australian development house called nuSoftware. nuBuilderPro is version 3 of what I think is their own framework. It looked good, and although it is based on the LAMP stack, which is something I’ve had very little dealings with, they offered a fully hosted VPS service where I could experiment and get my bearings. They completely host the development environment for a very, very reasonable price. Given that I didn’t know whether it was going to be useful to me yet, this seemed an ideal opportunity to experiment. It has only been a short time since sign-up, but I can already see that it will prove useful. I have signed up for the most basic of server accounts and, as at 17 January, have created 3 very basic applications – all in about 3 hours of work. That’s what I like to see: proper RAD development.

The website is here.

www.nubuilder.net

And the options for hosting are here

www.nubuilder.net/hosting.php

Importantly, you can start small, get to know the product and work up – additional users do NOT cost you extra money, login security comes built in, and you are free to design as many applications as you can fit within your VPS. Given that when I first signed up I didn’t have any users or applications and didn’t know how to design applications, this seemed completely ideal.

So far it looks like I’m going to use it for small but important applications that need very quick development and are very distributed. That’s not to say it couldn’t be used for much larger applications, but I need to walk before I can run. Given the underlying database engine and stack, it is likely to be able to handle much bigger projects than I am likely to throw at it. Although open source, all but the top of the stack has been in solid development for many years; I guess the question mark is over the management framework for the database integration and form-building UIs, which is somewhat new and untested. I would very likely have gone with something like LightSwitch, but Microsoft have shown a distinct lack of commitment to any one RAD web tool recently. Certainly it is true that very few individuals seem to know about nuBuilderPro, and the forum is somewhat quiet compared with some open source projects. Importantly, though, I haven’t come across any other open source projects with their price model, and I can easily get any information in and out, so I don’t believe I am taking any risk. I hope to investigate it further over the coming months. From what I have seen so far I am very impressed – and did I say it had a unique price model? Anyway, watch this space!

SQL Saturday #388 Edinburgh

Went along to SQL Saturday BI Edition on 13th of June.
Honestly I thought it was really great. Many thanks to Jenny Stirrup for being the event organiser. If you are thinking about going next year you really should.

This was the schedule for SQL Saturday in Edinburgh

Sessions I attended
Key Note with Jon Woodward – on long term future of computing in general
Web Scraping with Python – Iain Elder of Sky Scanner
Better Data Visualisation for CRM and ERP – Adam Vero
The European Economic Crisis and the Euro-A Data Tale – Carmel Gunn
Blue Sky Thinking : SQL Azure Geospatial Mashup – Thomas Sykes
Master Data Management – Dave Lawrence

All the talks were really thought provoking, and it was nice to hear from people who are really at the top of their game. I have already started experimenting with web scraping and Python.

Administering a Web Application some thoughts

In the last couple of years I have started administering a bought-in web application from an outside vendor. I have found this really quite educational. I will not reveal the name of the product; the vendor seems neither better nor worse than other vendors, and many of the issues I am experiencing I see repeated by other admins of other web applications.

The software consists of various pieces of middleware linking SQL Server with .NET and JavaScript files. It uses a fairly standard Microsoft stack to deliver a single page application. The big selling point was that it was widely available through browsers pretty much anywhere on the web.

We have had it up and running in anger for a couple of years now, and I have about 75 users, all internal.

The vendor has created a new version and we now need to update. In discussion, the company tells me that I should not be doing this myself and that it is standard practice to get them in to upgrade the version. We already use it more as a stack technology to build our own applications, and I would want to continue this in the future. They have indicated that it will cost money to get them in to upgrade the software. This is triggering alarm bells, because the existing configuration is very complicated, errors are quite regular, and error messages tend to be pretty cryptic or non-existent. I also like doing setup myself, as it teaches me the ins and outs and speeds up my learning of systems that have non-trivial configuration.
A project board has been set up, senior management are on board, and, well, it’s grown legs, arms and a tail.

We have not widely publicised the application to the general public and they do not access the application at present.

The application itself requires significant amounts of configuration to set up correctly, and parameters tend to be static and buried deep in very large HTML files. All this leads me to conclude that configuration in the new version is likely to be just as awkward. The company’s help desk is very good and there is extensive documentation, but the sheer obscurity and quantity of configuration parameters makes the documentation the size of a phone directory: difficult and opaque.

Here are some questions I should remind myself of when looking at new web applications in the future.
Are staff really so distributed that you require wide distribution?
Is the distribution internal or external? If it is not external, do you really need a web-enabled application?
Do you have internal expertise that is interested in developing applications?
With web applications or local applications the backend database technology could well be the same. Internal local area development is likely to be far more flexible, cheap and quick, could motivate staff, and can be far richer in scope. It would be possible, for instance, to develop a rich interface internally and a minimal interface externally using something like LightSwitch, taking advantage of the strengths of both development environments.

There is a big push towards web applications; I just think IT managers need to carefully consider their positions. These applications tend not to be cheap, and they have a number of disadvantages that seem to have been pushed out of sight.

Database Normalisation – a review of the rules to refresh my knowledge

Here is a review of the six forms of normalisation (yes, the fourth is called 3.5 for some reason) – there are lots of summaries of these on the web, but I wanted to re-write them in my own words to ensure I understand them fully. Revisiting these after real-world experience is particularly enlightening.

Normalisation rules are cumulative, i.e. a database or set of tables can be said to be compliant with forms 1 through 5. To rank at the highest level it must comply with all previous forms.

That is to say that a database (or table) to be in the 2nd Normal Form must first fulfill all the criteria of the 1st Normal Form.

A database is said to be properly normalised if the first three normalised forms are inviolate.

Violations of the normalisation forms beyond the third normal form may be legitimately required, and as such those forms are NOT absolutely required for a database to be classified as properly normalised.

Some web explanations do not list the forms after the third. I would argue there is still a lot of value in trying to adhere to these less well-known forms, but I must admit to not always including them in design, usually due to poor knowledge of the application domain.

The differing forms of normalisation start simple and become increasingly complicated.
If you have a lot of experience in creating relational back ends you may have deduced these issues for yourself.

(1NF) First Normal Form – Edgar F Codd – first stated 1970
Eliminate duplicate columns from the same table.
Create separate tables for each group of related data and identify each row with a unique column or set of columns (the primary key).

(2NF) Second Normal Form – Edgar F Codd – first stated 1971
No partial dependencies – every non-key field must depend on the whole of the key, not just part of it.

(3NF) Third Normal Form – Edgar F Codd – first stated 1971
Every non-prime attribute must be non-transitively dependent on every key of the table. The example I saw was Wimbledon winners, some of whom have won repeatedly: you would not store their birth dates in the winners table. Instead you would have a table of the winners linking to a table of the individuals, and it is in that individuals table that you would store the dates of birth.
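A sketch of that structure in SQL (the table names are my own):

```sql
-- Date of birth depends on the player, not on the year of the win,
-- so it lives in the players table; winners just reference a player.
CREATE TABLE players (
    player_id     serial PRIMARY KEY,
    name          text NOT NULL,
    date_of_birth date
);

CREATE TABLE wimbledon_winners (
    year      integer PRIMARY KEY,
    player_id integer NOT NULL REFERENCES players (player_id)
);
```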

The 3.5 Normal Form or the Boyce Codd Normal Form
This is a slightly stronger version of the third normal form developed in 1974 by Raymond F. Boyce and Edgar F. Codd to address certain types of anomaly not dealt with by the 3rd Normal form as originally defined.

Only in rare cases does a 3rd Normal Form table not meet the requirements of Boyce Codd Normal Form. A 3rd Normal Form table which does not have multiple overlapping candidate keys is guaranteed to be in Boyce Codd Normal Form.

Web investigation indicates that the most concise description is:

Every determinant must be a candidate key

and that Boyce Codd Normal Form is not always achievable.

Nearest Shops
Person  | Shop Type   | Nearest Shop
Mark    | Jewellers   | H Samuel
Mark    | Barbers     | Mr Man
Natalia | Coffee Shop | Starbucks
John    | Coffee Shop | Costa
John    | Barbers     | Mr Man
John    | Jewellers   | H Samuel

The issue above is that Mr Man on record 2 could have its shop type changed to Hairdressers, but the second-to-last record for John would not change, and the data would then be inconsistent. In this case I would split this table into two, with Nearest Shop and Shop Type linked in one table, and some kind of table linking People to Nearest Shops.
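A sketch of that split in SQL (table names are my own):

```sql
-- Shop type is a fact about the shop, held once; the link table
-- records which shop is nearest to which person.
CREATE TABLE shops (
    shop_name text PRIMARY KEY,
    shop_type text NOT NULL
);

CREATE TABLE nearest_shops (
    person    text NOT NULL,
    shop_name text NOT NULL REFERENCES shops (shop_name),
    PRIMARY KEY (person, shop_name)
);
```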

The 4th Normal Form
A table is in the 4th Normal Form if it has no multi-valued dependencies.
Definition: Multivalued dependencies occur when the presence of one or more rows in a table implies the presence of one or more other rows in that same table.

For example, let’s say you have a chain of shops selling coffee – let’s call it Starbucks – and you have a table that lists the permutations of coffee sold in its stores:
CoffeeShop | Coffee Product | Location
Starbucks  | Café Latte     | Livingston
Starbucks  | Americano      | Livingston
Starbucks  | Café Latte     | Edinburgh
Starbucks  | Americano      | Edinburgh

If we know that all Starbucks shops sell the same products, then a new product in one shop implies it should appear in the others: adding a Skinny Latte to the Livingston shop implies that there should be another record for Skinny Lattes in the Edinburgh shop.

To eliminate the possibility of anomalies, we must place the facts about the products and the locations of the shops into two tables:

Products
Starbucks | Café Latte
Starbucks | Americano
Starbucks | Skinny Latte

Location of Coffee Shops
Starbucks | Livingston
Starbucks | Edinburgh

Alternatively, if the coffee products did vary from location to location, the original table would satisfy 4NF.
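As a sketch of the two-table split in SQL (names are mine):

```sql
-- Products and locations are independent facts about the chain,
-- so each multi-valued dependency gets its own table.
CREATE TABLE chain_products (
    chain   text NOT NULL,
    product text NOT NULL,
    PRIMARY KEY (chain, product)
);

CREATE TABLE chain_locations (
    chain    text NOT NULL,
    location text NOT NULL,
    PRIMARY KEY (chain, location)
);
```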

The 5th Normal Form
A table is in fifth normal form, or Project-Join Normal Form, if it is in the fourth normal form and it cannot have a lossless decomposition into any number of smaller tables.

Or as I like to call it the – I know about it and try to adhere to it but sometimes I’m a bit lazy form.

I certainly try to keep my designs to the full six normalisation principles but there are instances where certain tables do not comply.

Towards the best table design for Addresses

Floor Reference or Flat No
Building Name or House Name
House/Building/Unit Number
Street Name
Through Street
Locality
Town or City
Country
Postcode

A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away.

Antoine de Saint-Exupery

OK, so I didn’t know who he was either until I did a search on simplicity, but I wanted to record my best field structure for the storage of addresses, which although important is on a par with watching paint dry in terms of excitement. Despite this, addresses are central to a heck of a lot of applications, so it is important to get them right. This structure copes with pretty much all addresses. I mainly use locality in larger towns or cities to denote a sub-area, specifically for users, as you can guarantee they will be unfamiliar with the majority of specific street names; particularly for users doing business in the same city they are located in, it is very useful to have a quick understanding of where things are in relation to the assets of the operation involved. Such context can be added by using maps, but it will come for free with intelligent staff if you give them the opportunity. I long ago dropped counties and districts, though I can see that in a larger country such as Russia or America it could be useful to note the state / oblast etc. Additionally, in the UK, town is really postal town – which is often not obvious at all – and while it can be overkill, you can organise addresses so that the postal town is referenced from the town automatically. Probably overkill.
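For the record, here is the field structure above expressed as a minimal SQL table sketch; types and names are my own choices (note that building numbers are stored as text, since values like 12A are common):

```sql
CREATE TABLE addresses (
    address_id     serial PRIMARY KEY,
    flat_no        text,  -- floor reference or flat number
    building_name  text,  -- building or house name
    building_no    text,  -- house/building/unit number; text allows "12A"
    street_name    text,
    through_street text,
    locality       text,  -- sub-area within larger towns or cities
    town           text,  -- in the UK effectively the postal town
    country        text,
    postcode       text
);
```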

Information on BS7666
National Land and Property Gazetteer

Business Continuity

I frequently see business continuity cited by management as a reason to deny staff responsibility for their business systems, the belief being that individuals will become too central to the operation of the organisation and somehow irreplaceable.

For the record I would like to state that I consider this to be failed logic.

Parents encourage their children to take responsibility by trusting them with tasks. Why is this important? Because one day the parents won’t be there to support their children, who will have to stand on their own two feet, and it is better for the offspring to make mistakes while the parents are there to catch them than to wait till the parents are gone and then make the same mistakes. Many parents consider that their children do things in inefficient ways; nonetheless they continually trust them. The world continues to progress, and the human race continues to be the most successful species on the planet (hands down). This is the mechanism of passing knowledge down from one generation to another. So giving responsibility to and trusting children, far from decreasing continuity, has a long history of ensuring it.

If anyone uses business continuity as a reason to prevent individuals from being given responsibility, you might want to highlight the above point to them.

SQL Saturday coming to Edinburgh again 13 June 2015

SQL Saturday is coming to Edinburgh again on Saturday 13 June 2015 and I hope to attend. I have attended the two previous events and highly recommend it. Thanks must go to Jennifer Stirrup and others for taking time out to organise and speak at it.
The venue is Pollock Halls, on the edge of Holyrood Park in the beautiful city of Edinburgh.

Edinburgh SQL Saturday event page

In the beginning

UPDATE January 2021
Yes, it’s still here, and I still seem to be sporadically writing. I decided that I wanted to change the name to something more representative of what the site is about, so henceforth it is the Cloudy DATA blog.

April 2014
This was my first post, on 27 April 2014; as I write this update I am about a month away from its 7th birthday.