SQL Server Express – Connection to QGIS issues with geometry display

I was having issues displaying polygons stored in SQL Server Express locally within a QGIS 3.28 project.

After quizzing GPT-4 unsuccessfully I found the answer on good old Stack Overflow.

As part of your SQL Server Express installation you should have installed SQL Server Configuration Manager. Open this administration package and navigate to your instance to show the following:

Make sure both Named Pipes and TCP/IP are enabled.

That fixed it for me.
Important note: the host connection string IS case sensitive, and sometimes when you use the properties within SSMS to look at the hostname it capitalises things, so watch out! That caught me out for half an hour!

And here is my definition of the geometry table

CREATE TABLE [dbo].[geometry_columns](
	[pkid] [int] IDENTITY(1,1) NOT NULL,
	[f_table_catalog] [varchar](50) NULL,
	[f_table_schema] [varchar](50) NULL,
	[f_table_name] [varchar](100) NULL,
	[f_geometry_column] [varchar](50) NULL,
	[coord_dimension] [int] NULL,
	[srid] [int] NULL,
	[geometry_type] [varchar](50) NULL,
	PRIMARY KEY CLUSTERED
	(
		[pkid] ASC
	) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
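QGIS can use this table to find spatial layers more quickly, so a registration row is typically added for each spatial table you want it to pick up. A minimal sketch of such a row is below – the catalog, schema, table, column, SRID and geometry type values are purely illustrative and should be replaced with your own:

INSERT INTO [dbo].[geometry_columns]
	([f_table_catalog], [f_table_schema], [f_table_name], [f_geometry_column], [coord_dimension], [srid], [geometry_type])
VALUES
	('exampledb', 'dbo', 't00001fields', 'geom', 2, 27700, 'POLYGON');
GO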

Add PostGIS extension to Azure Postgres Database – Worked Example

Firstly we need to allow for the extension on the server.
Within the Azure portal navigate to server parameters

Search for azure.extensions

Select which extensions to add
REMEMBER to save before exit!

Wait a short period while the extension is loaded onto the server.

Some points
Azure Database for PostgreSQL supports a subset of key PostgreSQL extensions as listed HERE. This information is also available by running SHOW azure.extensions;. Extensions not listed in this document aren’t supported on Azure Database for PostgreSQL – Flexible Server. You can’t create or load your own extension in Azure Database for PostgreSQL.

After extensions are allow-listed and loaded, they must be installed in your database before you can use them. To install a particular extension, run the CREATE EXTENSION command. This command loads the packaged objects into your database.
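For example, once postgis has been added to the allow-list in the server parameters, connect to the target database and run the following (postgis_full_version simply confirms the install worked):

CREATE EXTENSION postgis;
SELECT postgis_full_version();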

The extension(s) should now be listed against your database.

QGIS – Working with Free BASEMAP Services for Desktop Projects (Bing and Google Maps Aerial and Lines)

This is an older post updated at 30 June 2023 for changes within QGIS

As soon as you need to practically implement any information in spatial databases, display of the information through a mapping front end becomes absolutely vital. Many database administrators are used to simply going into their favourite database editors and displaying the raw subsets of tables and queries. That works well for financial transactions and inventory tables, where collapsing the attributes of objects into single figures is often valuable or possibly preferable to simple photos. When dealing with boundary information the complete opposite applies. Display of information as simple screens of matrix numbers is completely useless.

Additionally often boundaries make zero sense unless referenced to the land beneath them either through aerial photography or topographic maps.

At a previous employer we actually commissioned a company to give us aerial photos of a local authority. This was not an insignificant amount of money and was probably only marginally reduced by the vendor having multiple channels of sale. Google and Microsoft are now very good at offering aerial and line interpretations for limited use to companies and individuals. This is absolutely great as it can be used as background either to confirm the accuracy of other information or as data upon which to calculate further information (eg routing).

So how can an individual get up and running with some of these basemaps?

Sometime recently (I know not when exactly) QGIS changed its implementation of OpenStreetMap within the desktop – rather than being an additional plugin, OpenStreetMap provision is now included on install.

Here I am working with QGIS version 3.10

To get there navigate to

Layer > Data Source Manager > XYZ

You should now be presented with the Data Source Manager dialog, which looks like this

Now expand the XYZ Tiles entry

You can then double click on any one of the three and the tile layer will be transferred into the layer panel
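If you also want Google or Bing imagery alongside the bundled OpenStreetMap entry you can right click XYZ Tiles and choose New Connection, then supply a tile URL. As an illustration only (check the provider's terms of use – this URL is an assumption on my part rather than something QGIS ships with), a commonly quoted template for Google aerial imagery is:

Name: Google Satellite
URL:  https://mt1.google.com/vt/lyrs=s&x={x}&y={y}&z={z}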

Thank you to Google and Microsoft and OSM and QGIS for a great implementation.

SQL Server Express – starting to work with Geometry Functions in SQL Server

Firstly I created a database.

This is essentially me trying to implement in SQL Server what I had written about in Postgres here – Using ST_Within in Postgres

Now I create the starting tables.

CREATE TABLE [dbo].[t00001fields](
	[pkid] [int] IDENTITY(1,1) NOT NULL,
	[fieldname] [varchar](50) NULL,
	[geom] [geometry] NULL
)
GO

Then

CREATE TABLE [dbo].[t00002plots](
	[pkid] [int] IDENTITY(1,1) NOT NULL,
	[plotname] [varchar](50) NULL,
	[geom] [geometry] NULL
)
GO

Now a view acting as a magic junction table that will identify which plot is within which field when it is run

CREATE VIEW v001FieldPlotJunction AS
SELECT t00002plots.pkid as Plotspkid,t00001fields.pkid as Fieldspkid
FROM
t00002plots,
t00001fields
WHERE
t00002plots.GEOM.STWithin(t00001fields.GEOM) = 1;

and here’s an example of it running locally in SQL Express

And this is what I am seeing when viewed through QGIS – many of the fields are smaller than the plots, so although 1 and 2 look like they should be in the view, because the plot is not WITHIN the field it does not show, which is the correct response in this case.
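If you want to sanity check the view without digitising anything in QGIS, a couple of hand-built polygons will do – the coordinates and the SRID of 0 below are purely illustrative:

INSERT INTO dbo.t00001fields (fieldname, geom)
VALUES ('Field A', geometry::STGeomFromText('POLYGON((0 0, 100 0, 100 100, 0 100, 0 0))', 0));

INSERT INTO dbo.t00002plots (plotname, geom)
VALUES ('Plot 1', geometry::STGeomFromText('POLYGON((10 10, 20 10, 20 20, 10 20, 10 10))', 0));

-- the view should now return the single matching pair
SELECT * FROM v001FieldPlotJunction;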

Postgres – create a trigger to automatically calculate area of Polygon on edit

This will calculate the area in hectares of a polygon on creation or edit (referenced here as Before Insert or Update On).

This appears to work for polygon and multi-polygon geometry types. I have had a discussion with a colleague and he has indicated that the projection used may affect the accuracy of the measurement. Certainly there is a straight division in there so it would probably be possible to alter the division figure by empirically testing to get it as accurate as possible. Apparently with some popular projections the further north you are the less accurate the resultant figure. (Just something to bear in mind).

CREATE OR REPLACE FUNCTION calc_area()
RETURNS trigger AS $BODY$
BEGIN
  NEW.grossarea := ROUND((st_area(NEW.geom)/10000)::numeric,2);
  RETURN NEW;
END;
$BODY$ LANGUAGE plpgsql;

What I found interesting about this was that the field name is defined in the function and called from the trigger. There is possibly a better way of writing this to define the field in the trigger itself.

And the trigger that goes on the table

CREATE TRIGGER area_calculate
BEFORE INSERT OR UPDATE ON public.t001landparcels
FOR EACH ROW EXECUTE PROCEDURE calc_area();
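To see it in action, a minimal sketch is below. It assumes a table along the lines of the t001landparcels table the trigger is attached to, with a numeric grossarea column and a British National Grid polygon geometry (both assumptions on my part), and the parcel coordinates are made up:

CREATE TABLE public.t001landparcels
(
pkid serial primary key,
parcelname varchar(50),
grossarea numeric(12,2),
geom geometry(polygon,27700)
);

-- with the trigger above attached, a 100m x 100m square should come back as 1.00 hectare
INSERT INTO public.t001landparcels (parcelname, geom)
VALUES ('Test parcel', ST_GeomFromText('POLYGON((0 0,100 0,100 100,0 100,0 0))',27700));

SELECT parcelname, grossarea FROM public.t001landparcels;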

021 Postgres with PostGIS plugin – Create junction table sites in catchments

Quick post that I will come back and edit

So we need two tables

t001asites which has a geometry field called geom
and another table which will be the catchments table called
t002bcatchments which has a geometry field called geom.

Both tables must have a serial primary key of pkid and both tables must be polygon data and the geom field MUST be defined as polygon and NOT multipolygon.

Air code is as follows.

    1. Create table containing digitised polygons of housing sites.
    2. Create table containing digitised polygons of catchments.
    3. Measure the area of the housing sites and place that value in an area column within the housing sites table t001asites.
    4. Split the housing sites by the catchment boundaries ensuring that each split polygon inherits the catchment it was split by.
    5. Re-measure the areas of these split sites and add an area column to store the new calculations.
    6. Divide the figure obtained in 5. by the figure obtained in 3. which will indicate what proportion of the housing site is in each catchment.
    7. Perform a least remainder method on the individual sites grouped by their original housing sites to ensure the proportions sum to 1.

So to the code

BEGIN;
SET LOCAL check_function_bodies TO FALSE;
CREATE OR REPLACE FUNCTION part01catchjunctionmaker() returns void as $$
Alter table t001asites add column area integer;
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part02catchjunctionmaker() returns void as $$
Update t001asites set area=ST_Area(geom);
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part022catchjunctionmaker() RETURNS void AS $$
DROP TABLE IF EXISTS t200;
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part03catchjunctionmaker() RETURNS void AS $$
CREATE TABLE t200 AS select a.pkid as t001pkid, b.pkid as t002pkid, a.area as t001area, ST_intersection(a.geom, b.geom) as geom FROM t001asites a, t002bcatchments b;
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part04catchjunctionmaker() RETURNS void AS $$
ALTER TABLE t200 add column pkid serial primary key, add column area integer,add column proportion decimal (10,9);
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part06catchjunctionmaker() RETURNS void AS $$
UPDATE t200 SET area=ST_Area(geom);
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part07catchjunctionmaker() RETURNS void AS $$
DELETE from t200 where area = 0 or area is null;
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part08catchjunctionmaker() RETURNS void AS $$
UPDATE t200 SET proportion= cast(area as decimal)/cast(t001area as decimal) WHERE area > 0;
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part088catchjunctionmaker() RETURNS void AS $$
DROP table IF EXISTS t201;
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part09catchjunctionmaker() RETURNS void AS $$
Create table t201 as Select pkid,t001pkid,t002pkid, t001area, area, proportion, sum(proportion) OVER (PARTITION BY t001pkid ORDER BY t001pkid, proportion) as cum_proportion FROM t200 ORDER BY t001pkid, proportion;
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part10catchjunctionmaker() RETURNS void AS $$
Alter table t201 add column value decimal (14,9),
Add column valuerounded integer,
Add column cumulvaluerounded integer,
Add column prevbaseline integer,
Add column roundproportion integer;
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part11catchjunctionmaker() RETURNS void AS $$
UPDATE t201 set value = proportion * 100;
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part12catchjunctionmaker() RETURNS void AS $$
UPDATE t201 set valuerounded = round(value,0);
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part13catchjunctionmaker() RETURNS void AS $$
update t201 set cumulvaluerounded = round((cum_proportion*100),0);
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part14catchjunctionmaker() RETURNS void AS $$
update t201 set cumulvaluerounded=100 where cumulvaluerounded = 101;
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part15catchjunctionmaker() RETURNS void AS $$
update t201 set prevbaseline = round((cum_proportion - proportion)*100);
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part16catchjunctionmaker() RETURNS void AS $$
update t201 set roundproportion = (cumulvaluerounded-prevbaseline);
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part17catchjunctionmaker() RETURNS void AS $$
DELETE from t201 where roundproportion = 0 or roundproportion is null;
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part18catchjunctionmaker() RETURNS void AS $$
alter table t201 add column proppercent decimal(3,2);
$$ LANGUAGE SQL;
CREATE OR REPLACE FUNCTION part19catchjunctionmaker() RETURNS void AS $$
update t201 set proppercent = cast(roundproportion as decimal)/100;
$$ LANGUAGE SQL;
COMMIT;

and now a function to pull it all together;

CREATE OR REPLACE FUNCTION createcjt()
RETURNS TEXT AS
$BODY$
BEGIN
PERFORM part01catchjunctionmaker();
PERFORM part02catchjunctionmaker();
PERFORM part022catchjunctionmaker();
PERFORM part03catchjunctionmaker();
PERFORM part04catchjunctionmaker();
PERFORM part06catchjunctionmaker();
PERFORM part07catchjunctionmaker();
PERFORM part08catchjunctionmaker();
PERFORM part088catchjunctionmaker();
PERFORM part09catchjunctionmaker();
PERFORM part10catchjunctionmaker();
PERFORM part11catchjunctionmaker();
PERFORM part12catchjunctionmaker();
PERFORM part13catchjunctionmaker();
PERFORM part14catchjunctionmaker();
PERFORM part15catchjunctionmaker();
PERFORM part16catchjunctionmaker();
PERFORM part17catchjunctionmaker();
PERFORM part18catchjunctionmaker();
PERFORM part19catchjunctionmaker();
RETURN 'process end';
END;
$BODY$
LANGUAGE plpgsql;
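With all of the functions in place the whole chain can be kicked off with a single call, and t201 then acts as the junction table – a quick way to inspect the proportions per site is shown as well:

SELECT createcjt();

-- one row per site/catchment overlap; proppercent should sum to 1 for each original site
SELECT t001pkid, t002pkid, proppercent FROM t201 ORDER BY t001pkid, t002pkid;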

QGIS and PostGIS : Identifying direction of a vector

If using the dijkstra function with direction turned on it is important to identify the order in which the nodes of a vector line have been digitised. This is called the direction; dijkstra can use this with a reverse_cost attribute to handicap wrong movement along lines to such an extent that the correct path can be calculated around things like roundabouts.

Here is an example of the roundabout at Straiton in Edinburgh just north of the A720 bypass. While some of the lines have a correct anti-clockwise orientation, clearly some have been incorrectly digitised.

First we can see this by displaying the network in QGIS but using the styling to arrow the direction.

If you can’t resort to buying a corrected dataset, the function that can be used to reverse such inaccuracies is ST_Reverse.
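As a sketch of how that might be applied – assuming a network table like the public.t01roadnetwork used in the routing post, and with the ids below standing in for whichever links you have identified as wrongly digitised:

UPDATE public.t01roadnetwork SET geom = ST_Reverse(geom) WHERE id IN (101, 102, 103);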

013 Postgres command line : psql : Using ST_Within function to build junction tables to compare 2 separate polygon tables

First off let us create a new database to hold our examples in.

CREATE DATABASE stwithindb;

Now add the postgis extension.
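While connected to the new database as a superuser this is simply:

CREATE EXTENSION postgis;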

Let’s create two tables, one called fields and one called plots

CREATE TABLE
t00001fields
(
pkid serial primary key,
fieldname varchar(50),
geom geometry(polygon,27700)
)
;
CREATE TABLE
t00002Plots
(
pkid serial primary key,
plotname varchar(50),
geom geometry(polygon,27700)
)
;

Now let’s go to QGIS, connect to the PostGIS instance, add the tables and create some test data manually.

Here I have added fields in green with bold number labels and plots in brown with smaller number labelling. The numbers represent the pkid fields.

Now here I can quickly run a query to identify the plots that are in fields

SELECT t00002plots.pkid
FROM
t00002plots,
t00001fields
WHERE 
ST_WITHIN(PUBLIC.T00002PLOTS.GEOM, PUBLIC.T00001FIELDS.GEOM);

And it correctly identifies that plot 1 is within the fields layer.

But what would be great in an application is to have some kind of junction table that individual master records could display their children through. For this we need a junction table that links between the fields and plots tables showing the pkids from each.

SELECT t00002plots.pkid as Plotspkid,t00001fields.pkid as Fieldspkid
FROM
t00002plots,
t00001fields
WHERE 
ST_WITHIN(PUBLIC.T00002PLOTS.GEOM, PUBLIC.T00001FIELDS.GEOM);

Now I will move plot 2 into field 3 and rerun the above.

The layer now looks like

and running the former query we get.

Now it’s possible to either create a junction table to hold this information..

eg

CREATE TABLE t00010fieldplotjunction AS 
SELECT t00002plots.pkid as Plotspkid,t00001fields.pkid as Fieldspkid
FROM
t00002plots,
t00001fields
WHERE 
ST_WITHIN(PUBLIC.T00002PLOTS.GEOM, PUBLIC.T00001FIELDS.GEOM);

or we can create a view that will constantly calculate this every time it is queried

CREATE VIEW v001FieldPlotJunction AS
SELECT t00002plots.pkid as Plotspkid,t00001fields.pkid as Fieldspkid
FROM
t00002plots,
t00001fields
WHERE 
ST_WITHIN(PUBLIC.T00002PLOTS.GEOM, PUBLIC.T00001FIELDS.GEOM);

Now if I add a few more plots and fields and then pull up the view we shall see that everything has been adjusted

and running the view we now get

In some circumstances this calculation may be expensive, so we may wish to run it overnight and create a junction table; other times we may be happy to do it fully dynamically. Of course in a front end you could query and filter such that only one plot record was compared against the fields table at any time. Very useful nonetheless.
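One thing that usually helps once the tables grow is a spatial index on each geometry column, which PostGIS can use when evaluating ST_Within:

CREATE INDEX idx_t00001fields_geom ON t00001fields USING GIST (geom);
CREATE INDEX idx_t00002plots_geom ON t00002plots USING GIST (geom);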

010 Postgres command line : psql : Getting started with pgrouting using open data from Ordnance Survey to identify and measure the shortest route between two points.

Objective here is to write a series of queries that can be used to measure the shortest distance between selected paired locations on a network such that the geometry of the routes can be calculated and displayed on a map.

For this particular tutorial you will need QGIS 3 or higher and a version of Postgres – I am using version 11.0 here (I have upgraded since my former posts). I believe this tutorial will work with previous versions but if you are following along now might be a good time to upgrade.

QGIS 3.4 or higher – needed as the Ordnance Survey road network geometry contains a z coordinate which would prevent the creation of the required geometry for measurement. QGIS 3 introduced the ability to save geometry excluding the z coordinate. If you have a network without z coordinates you should not require this.

So let us first get the data. Here you tick the option in the top right hand corner, scroll to the bottom and submit your request, after which you will be asked a few basic questions along with the email address you wish the download to be sent to. After a few minutes you should be sent the download link through your email – follow the instructions and you should be able to get the information.

Ordnance Survey Open Data

The information you are downloading is a block framework for the whole of the UK. When you unzip the download into a folder you will see multiple files. We will be using a section of the national dataset relating to Edinburgh – NT. Choose the block or selection that you are interested in; more blocks may take more time however.

Open QGIS
Create a new project : eg EdinburghRouting.qgz
Load in your chosen network block : eg NT_RoadLink.shp

Select the layer you just loaded in : eg NT_RoadLink.shp

and navigate to the following in the menu settings
Layer / Save As

Fill out the Save Vector Layer as … dialog box
IMPORTANT – ensure within the Geometry section
Geometry type is set to LineString
Include z-dimension is unticked

Give the new file a name : eg ntosroutingnetwork.shp

Hit ok

Within the layer dialog of QGIS your new layer should appear and you can now remove the NT_RoadLink shape file from the project

Next go to your version of PostgreSQL and using a superuser account create a new database : eg edinburghrouting

I would suggest you use lower casing as well

As a superuser ensure you add the postgis and pgrouting extensions.
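From psql that amounts to something along these lines (the database name is simply the one used in this example):

CREATE DATABASE edinburghrouting;
\c edinburghrouting
CREATE EXTENSION postgis;
CREATE EXTENSION pgrouting;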

Next I set up the following connection between the QGIS project and PostgreSQL

Personal tastes may vary but I like to select
Also list tables with no geometry
Allow saving/loading QGIS projects in the database

OK the selection and you should now have a connection to the new database you just created.

QGIS has an excellent DB Manager window which we will use to load our new shape file, which excludes the z dimension, into the new database we created in PostgreSQL

Ensuring that you have a connection to your localpostgis database hit the Import Layer/File button.

Here I load the information into a new table t01roadnetwork

On pressing OK there will be a delay after which, if things go well, you will receive the following message.

As ever it is good to check that things appear to be going well.
Add the layer to your project and determine approximately whether import was successful.

Next back in psql command line and in an editor we are going to run 4 queries
The first 2 add columns that are required in the shortest distance algorithm we shall use, the third will allow anyone to write an aggregation function to see the total cost of the route and the last creates a topology for the road network.

alter table public.t01roadnetwork add column source integer;
alter table public.t01roadnetwork add column target integer;
alter table public.t01roadnetwork add column agg smallint default 1;
select pgr_createTopology('public.t01roadnetwork', 0.0001, 'geom', 'id');

If things go correctly you should see the database engine start to create the topology and what I see is it gradually stepping through the creation process.

and on completion you should have something like the following:

A new table has been added to the edinburghrouting database and the next step is to display the network and its vertices in QGIS.

In QGIS we should see something like

The next thing that I like to do is to label the nodes for quick identification.

And look to the t01roadnetwork table and see if the columns are clear and present.

We are now ready to make a measurement. Here I choose the nodes 15883 and 10967

SELECT seq, id1 AS node, id2 AS edge, cost, geom , agg
  FROM pgr_dijkstra(
    'SELECT id, source, target, st_length(geom) as cost FROM public.t01roadnetwork',
    15883, 10967, false, false
  ) as di
  JOIN public.t01roadnetwork pt
  ON di.id2 = pt.id ;

Now we can load this as a new layer and then improve the symbology

Doing this we get.

It should be noted that the line you see is a collection of lines. In my next post I will go through and indicate how we can amalgamate that into a single line for storage in a table.

Congratulations if you have got this far you should be able to measure the shortest distance between any two points on a valid network by altering the numbers.
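As a final aside, the total length of the route in the units of the projection can be reported by simply summing the cost column returned by pgr_dijkstra – for example, reusing the query above:

SELECT sum(di.cost) AS totallength
  FROM pgr_dijkstra(
    'SELECT id, source, target, st_length(geom) as cost FROM public.t01roadnetwork',
    15883, 10967, false, false
  ) as di
  JOIN public.t01roadnetwork pt
  ON di.id2 = pt.id;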

004 Postgres Command Line : psql : Create a spatially enabled table

For this you will need to have a version of Postgres Database engine installed and running and you will need to have created a database which has the PostGis extension installed.

Open psql
Login to the database you wish to create the table in

type the following

CREATE TABLE t001landparcels (PKID SERIAL PRIMARY KEY, PARCELNAME VARCHAR(50), GEOM GEOMETRY(POLYGON,27700));

Here I do this and then check on the tables with the \dt command before inspecting the columns itself using the \d command.

and here I open up QGIS and link to my local postgres instance and the exampledb database;

and here I connect to it and draw a polygon. If you are wondering where it is, this is InchKeith in the Firth of Forth, an island very visible from George Street in Edinburgh. If you have flown into Edinburgh you will have flown almost over it.

and here after having digitised a single polygon I look at the contents of the table

SELECT count(*) FROM t001landparcels;

Produces the more helpful count of records in the table.
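If you want to see the geometry itself at the command line rather than just a count, ST_AsText is useful:

SELECT pkid, parcelname, ST_AsText(geom) FROM t001landparcels;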

002 Postgres Command Line : psql : Enabling extensions (PostGIS) to a database

So you have an existing database in Postgres that you wish to add the PostGIS extension to.

You will need to be logged in with a username that has superuser privileges.
Here let me do this using the postgres default account that on my instance has superuser access.

Here rather than logging in at the start I am choosing to change the connection to a different database

This is done by using

\c DatabaseName;

or

\connect DatabaseName;

Please note that the semicolon is vital in performing the instruction.

The way I have discovered to do this is by opening the database and then allocating the postgis extension to the database once it is open.

Let us first examine the tables in our blank database before the extension is enabled.

\dt

Now let us enable the extension.

CREATE EXTENSION postgis;

Let us now examine the database tables

\dt

Enabling the PostGIS extension adds this table along with the ability to add geometry types of variables within tables.

001 Postgres Command Line : psql : Getting Started with Postgres

I am just getting into Postgres and here are some rough notes for my reference.

Assuming you have a postgres admin account you want to sign in first of all and create a database

To find the command line go to search in Windows and type psql

Ensure that your postgres engine is running firstly

You should be presented with the following

There are default connections to the local host keep hitting these initially until you reach the following;

You will now need to know your password enter it here and press return

I will deal with the warning message further in the post – but a lot of people experience this message so I wanted to keep it in at present.

From my initial investigations, as ever it is a good idea to restrict users to particular privileges. At this level I am not being particularly refined – I would like to have a poweruser role that I can allocate to people and give this a defined password.

Signing in you can check out the roles and users as follows – on starting up a new instance you may well see something like this after typing \du

So here I try and set up a user called general and create a role called general which I give create DB rights
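The statement I use for this is along the following lines (the role name and password are just the example values here):

CREATE ROLE general LOGIN PASSWORD 'password' CREATEDB;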

I would recommend something stronger than the classic password password.

Issuing the \du command again we get to see the roles

Now we can close down and go back in but this time login as username general by altering the appropriate item when asked.

Note how the =# characters have been replaced by => this appears to denote the non superuser sign in.

To identify what username you are logged in as type \c at the prompt

My investigations suggest that the # sign denotes that you are logged into the database as superuser.

So first of all lets get rid of that annoying warning message when you log in at psql

I am running Postgres version 9.5 – your version may vary – but you can remove the warning by editing the runpsql.bat file. Every version of postgres has this file and it will be located in the version equivalent directory to 9.5 for me.

C:\Program Files\PostgreSQL\9.5\scripts

Add the line

cmd.exe /c chcp 1252

as per the underline and save the file

Now fire up psql as usual you should get the following

It should be noted that if you REM out the SET statements you can choose to login with particular server / localhost / database / port and username presets, which may be useful if you find yourself constantly going into a particular database as a particular user.

Here you see that the warning note has stopped.

It should be noted that using the general username you will NOT be able to create Databases

In order to CREATE databases you will have to be signed in with a username with sufficient privileges – here I am in as postgres and I create a database called ExampleDB
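The command itself is simply:

CREATE DATABASE ExampleDB;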

You can see that on carrying out a successful command we usually see a repeat of the command.

To get a list of all databases in the instance type

\l

It can be seen that despite typing the name in capitals the database has been created in lower case – Postgres folds unquoted identifiers to lower case. If you desperately want capitals you can double quote the name or use the pgAdmin tool.

I think I’ll end it there for now.

MS Access Function – import all CSV files from a directory with the same structure into a single table

This is a really nice function that can be used to place all data from multiple CSVs (with the same structure) into a single table.

Here I use the Ordnance Survey’s excellent Code Point data set that gives postcodes in the UK along with eastings and northings as an example – This lists each postcode in the UK along with further administrative categories. Apologies to anyone from outside of the UK that may not be able to access these files I hope the demonstration is still useful. For those wishing to try please follow the links.

After download you will see the problem each postcode is in a separate CSV.

Ordnance Survey Open Data Code Point UK Postcodes

After a short administration exercise to request a data download involving filling out your name you should be sent an email link to initiate a data download. The download consists of a zip file of two directories, one named DOC and one named DATA; the DATA directory contains a subdirectory called CSV which at May 2018 for my download consisted of 120 csv files.

Opening a single file (in this case Edinburgh EH) we see

I’ve already figured this out here, but there are 10 fields here (some are blank in my example)

Here I create a table called T01CodePointCombined with 10 fields marked
F1 through to F10
Note if you don’t create the table this function is so powerful it will do it for you

I then create a module and ensure that all the CSV files I wish to import are in a single directory here “C:\Users\Mark\Documents\CodePoint\Data\CSV\”

Public Function ImportAllFiles()

        Dim strPathFile As String, strFile As String, strPath As String
        Dim strTable As String
        Dim blnHasFieldNames As Boolean

        ' Change this next line to True if the first row in csv file
        ' has field names
        blnHasFieldNames = False

        ' Replace C:\Users\Mark\Documents\CodePoint\Data\CSV\ with the real path
        ' to the folder that contains the csv files
        strPath = "C:\Users\Mark\Documents\CodePoint\Data\CSV\"

        ' Replace tablename with the real name of the table into which
        ' the data are to be imported
        strTable = "T01CodePointCombined"

        strFile = Dir(strPath & "*.csv")
        Do While Len(strFile) > 0
              strPathFile = strPath & strFile

              DoCmd.TransferText _
                TransferType:=acImportDelim, _
                TableName:=strTable, _
                filename:=strPathFile, _
                HasFieldNames:=blnHasFieldNames

        ' Uncomment out the next code step if you want to delete the
        ' csv file after it's been imported
        '       Kill strPathFile

              strFile = Dir()
        Loop

        MsgBox "Finished"

End Function

Points to note: make sure all the csv files are closed when you run it. That’s about it – it takes less than 5 minutes to move all the records from those 120 files into a single table within an MS Access database.
After import if it’s gone correctly you should have in the region of 1.7 million records in T01CodePointCombined.

Links to GIS information for test system design

If you are trying to design software that includes a Geographical element it is easier if you are working with data that makes some kind of sense.

The following are a list of sites where you can get good and consistent information on Local Authority Geographical Datasets within Scotland and in London. There has been an improvement in the quality and extent of information available but open data still remains patchy. Fortunately some datasets are available. The interesting thing about this data is that although it is rich it is largely unstructured and without relationships. Fortunately if there are geographical attributes then these can be used to spatially analyse the information and create relationships from which you can start to construct better systems.

I understand why the data is patchy. To really publish well it is a necessity to get your systems working well so that the export (and publication) of data can be at least semi-automated. Without this it is simply too onerous for staff to repeatedly perform Extraction Transformation and Load procedures on ever larger numbers of datasets. Taking a step back however therein may lie the benefit. The quicker they can learn to cleanly optimise, export and hopefully automate these procedures the more likely they are to have their systems working properly and importantly the more investigation and experimentation they can put into linking their datasets. The skills needed to link these datasets constantly to a web data portal are similar to the skills required to link between systems.

It might be expected therefore that better availability of open data is reflective of better internal systems.

Here is the information that I was able to identify through Google Searches at February 2018.

Aberdeenshire Open Data

Angus Council Open Data Portal

Argyll and Bute Open Data Portal

Dundee City Open Data Portal

Moray Council Open Data Portal

North Ayrshire Open Data Portal

North Lanarkshire Open Data Portal

Perth and Kinross Open Data Portal

South Ayrshire Open Data Portal

and scheduled ancient monument information can be obtained here.

Historic Environment Scotland

Here is London

London

Creation of SITE History from Planning Application Polygons using QGIS

In planning it is important to know the planning history on a site. The status and likelihood of approved permission will often relate to previous permissions. Many council planning systems do not specifically relate planning applications to each other and there may be situations where you would like to create such links. This is essentially an exercise in using spatial analysis to create the junction table to hold what are many to many relationships.

If your datasets are in any way large you will need to set aside a computer so that it can perform the calculations. When I first tried this the process took a weekend with queries running overnight.

Start by obtaining as many years of planning application polygons as you can. Here I use polygon files in shape format.

The polygon file or shape file should be in one file, so if necessary merge the shape files you have together. I did this and the file I ended up with was

AllPlanningApplications.shp

Next – Delete all attribute fields EXCEPT the planning application number.

Next – Create a centroids file from AllPlanningApplications.shp I called mine
AllPlanningApplicationsCentroids.shp

The next series of iterations are about getting a unique set of polygons with which we can go forward and generate a set of SITEPKIDS that can be attached to the child records.

Step – Using AllPlanningApplications.shp ADD an additional field called area and populate it using QGIS $area calculation – save this file.

Step – this is where it becomes interesting – in most authorities there are a vast number of planning application boundaries that overlap. Performing a dissolve at this point would result in a large congealed set of polygons that could not clearly identify unique sites. Thus buffering the polygons down we can start to identify unique sites. This is particularly important where boundaries are completely contiguous to each other.

The buffering command is used within the geoprocessing tools to try to separate adjacent, overlapping and contiguous polygons.

Step ‐ Create two files from AllPlanningApplications.shp, one for polygons less than 4500 square metres and one for polygons more than or equal to 4500 square metres. This is to allow two differing buffering processes to be performed on each.

AllSmallLessthan4500PlanningApplications.shp

AllLargeGreaterthanequal4500PlanningApplications.shp

Now the 4500 is an empirical figure that was subjectively chosen – there may be a better figure, feel free to investigate.

The following 2 steps also introduce empirical figures for the buffering that can be altered as appropriate.

Step ‐ Take the file AllSmallLessthan4500PlanningApplications.shp and create a buffer polygon file of this with a buffer distance of minus 2m – let’s call it

AllSmallLessthan4500PlanningApplicationsBufferMinus2.shp

Step ‐ Take the file AllLargeGreaterthanequal4500PlanningApplications.shp and create a buffer polygon file with a buffer distance of minus 20m – let’s call it

AllLargeGreaterthanequal4500PlanningApplicationsMinus20.shp

THIS NEXT STEP WILL TAKE SEVERAL HOURS IT MAY BE BEST TO DO EACH ONE OVERNIGHT

Step ‐ Perform dissolves on both of these new files ensuring that dissolve all is used – names could be something like

Vector / Geoprocessing Tools / Dissolve /

Set input layer alternatively to the two above files and set Dissolve field to dissolve all.

Suggested file Names are

MultipartDissolvedPolygonLessthan4500PlanningApplicationsBufferMinus2.shp

MultipartDissolvedPolygonAllLargeGreaterthanequal4500PlanningApplicationsMinus20.shp

Step You should now have two shape files of a large multipart polygon you want to perform the multipart to single part operation now

Vector / Geometry Tools / Multipart to Single Part

Processing involved with this is typically quick and suggested names for these files are

DistinctPolygonsAllSmallLessthan4500PlanningApplicationsMinus2.shp

DistinctPolygonsAllLargeGreaterthanEqual4500PlanningApplicationsMinus20.shp

Add an area column and identify the largest polygon in the small file.

Add an area column and identify the smallest polygon in the large file – you may want to remember this.

Step ‐ perform merge on these two files to get

Vector / Data Management Tools / Merge

CombinedSmallandLargeDistinctPolygonsPlanningApplicationswithbuffering.shp

ONGOING investigation ‐ would Difference be better than dissolve on this and should the above files be put together before

Step ‐ perform dissolve

Vector / GeoprocessingTools / Dissolve

ensure that ‐‐Dissolve all‐‐ is selected

DissolvedCombinedSmallandLargeDistinctPolygonsPlanningApplicationswithbuffering.shp

Step ‐ now you want to split multipart to single part

DistinctPolygonsAllPlanningApplications.shp

Step add field called SitePKID and populate it using $rownum command.

Step

Vector / Data Management Tools / Join Attributes by Location

Set Target Vector Layer as

AllPlanningApplicationsCentroids.shp

Set Join Vector Layer as

DistinctPolygonsAllPlanningApplications.shp

Ensure that Keep all records (including non‐matching target records) is ticked

Output Shapefile suggestions

AllPlanningApplicationsCentroidswithSitePKID.shp

If there are centroids without Site PKIDs put them to the end and give them consecutive unique row numbers. The attribute file associated with AllPlanningApplicationsCentroidswithSitePKID.shp should now be a child table of the shape file DistinctPolygonsAllPlanningApplications.shp. Perform checks here to see if all centroids within a polygon defined by the distinct polygons have the same SitePKID and that it is matched by the SitePKID of the parent shape file.

You should be able to do a join on this file to get the PKID back into the very original file.

AllPlanningApplications.shp

Finally perform a dissolve on the corrected AllPlanningApplications.shp file but this time dissolve on the field

SitePKID

You can call this

DistinctCorrectedPolygonsAllPlanningApplications.shp

QED!!!!

QGIS – Import shape file into PostGIS Table

The following uses
QGIS 2.14.2 Essen and
PostGres 9.5

A number of local authorities have released information through the UK’s data government site. The following example uses a shape file obtained from Lichfield District Council – at the 2nd of October 2016 this was available for download from the following link

Lichfield Planning Applications

Open up QGIS and add Lichfield’s planning application shape file

Now scan along the top menu and go to Database

Select the sub menu DB Manager and then DB Manager


The following windows dialog should appear


Expand the area on the left named PostGIS – any PostGIS instances that you have created should be visible here. Note you will have to have the PostGIS server running. Then highlight the actual instance that you would like to import information into.

In this case I use the instance LocalPostGres


Choose the third icon from the left.

It should be noted that the window on the right may or may not show the correct connection to the database.


Name the table you wish to create and then hit OK – additional parameters are available.
There will be a delay before confirmation of a successful import appears – try not to issue commands during this time. Once confirmation has been received go back into the PostGIS option and add the layer.

Connecting to PostgreSQL 9.3 from QGIS 2.8.1 – local host

First ensure that you have both Postgres and QGIS installed on your machine.

In order for you to be able to connect to Postgres from QGIS on local host you must ensure 2 things. Firstly that the PostGIS plugin has been installed on your laptop AND secondly that you have included the postgis extension in each database that you wish to connect to. Without enabling the extension in the database you won’t be able to connect OR import shape files. Installation of PostGIS is often a default during the install of postgres but you can check whether this was completed correctly by using the Application Stack Builder, a small program that is installed with later versions of postgres.

I navigated to this on the win 8.1 machine I was using by using search.

Opening application stack builder you will be presented with the following.


Expand the spatial extensions tree to identify whether you already have the PostGIS plugin installed – if not, select the plugin as appropriate and you will be prompted to install it. You will need an internet connection for this. Above you can see that my plugin was already installed.

Next you need to add the PostGIS extension to each Postgres database you wish to link to from QGIS this is done through PG Admin.

This is something that both myself and a colleague got caught out by and it took me an hour of searching to find how to fix it.

Below I have a database called GISData which I have just created. You will note there is only one object within the expanded extensions tree. You will not be able to connect to a database that does not include PostGIS in its list of extensions.

Highlight the database you want to spatially enable then go to Tools – Query Tool (Ctrl + E will do the same). In the above picture I’ve slightly jumped the gun. To add the extension to the database type.

CREATE EXTENSION postgis

Run the query by selecting the green right arrow
There will be a short delay and then upon refresh of the connection postgis should appear in the list of extensions.


You can now close the Postgres administrator and return to QGIS where you should be able to setup the connection to the database.

Parameters should be similar to below and it is useful to test the connection prior to saving.


Introduction to Basic Printing QGIS 2.2

Creating maps that you can pass on to others is often a central and regular requirement if not in paper format then in a digital format that can be e-mailed or printed out. Here’s a quick reference for myself as much as anything else.

To get into the print composer you can create a completely new print composition or alternatively load an existing print composition. Generally the 5th icon in the main menu bar (a white landscape rectangle with a star) will take you there and give you access to the print composer, demonstrated below;

The 6th icon can be used to get to an existing declared print composition.


Now in the first instance you are going to want to navigate around the map and ensure that the map you wish to produce has the correct extents. In the image below, on a 2 screen setup, I show QGIS v2.2 open on the left screen and the print composer open on the right. To move the composition area around go to the map window within the main program and navigate accordingly. Then within the print composer window hit the command button titled

Set to map canvas extent

This will re-draw your composition with an interpreted boundary defined directly from your map window. You can enforce scale in the item properties. Similarly after changing layers you will need to ensure that you hit the above button again when you want the composition to reflect the layers within the map window.

VB Function for Zooming to location in Google Maps at 23 March 2014

Google recently slightly altered their URL to link to Google Maps. This code passes eastings and northings from fields into a function that calculates longitude and latitude and then passes those calculated coordinates to the Google Maps URL for display.

Note you will need to have implemented the functions that calculate lat and long for this to be useful.

Private Sub Command01_Click()
On Error GoTo Err_Command01_Click

    Dim Llatitude As Double
    Dim Llongitude As Double
    Dim strSitename As String
    Dim strlatlong As String

    Llatitude = degrees(lat([Eastings], [Northings]))
    Llongitude = degrees(lon([Eastings], [Northings])) - 0.0015
    strSitename = Me.Sitename
    strlatlong = Llatitude & ",+" & Llongitude

    'From 24 March Google changed their hyperlink address this is the new one
    Command01.HyperlinkAddress = "https://maps.google.com/maps?q=" & strlatlong & "+(" & strSitename & ")&z=18&iwloc=near&hl=en&ll=" & strlatlong

Exit_Command01_Click:
    Exit Sub

Err_Command01_Click:
    MsgBox Err.Description
    Resume Exit_Command01_Click

End Sub
End Sub