Geoserver Installation (Configuration for SQL Server on Windows)

Once you have your data in an enterprise-grade database, sharing that information becomes important. Some vendors, in particular digital mapping SaaS vendors, want you to transfer your data into their clouds, but for a lot of authorities that is categorically impossible for large segments of their data. For example, about three quarters of UK planning authorities run on software built by IDOX Group plc, which holds its data in Oracle, and this cannot be moved. For geographical data, this is where GeoServer comes in.

Geoserver Official Website

From the website
GeoServer is an open source server for sharing geospatial data.
Designed for interoperability, it publishes data from any major spatial data source using open standards.

GeoServer implements industry standard OGC protocols such as Web Feature Service (WFS), Web Map Service (WMS), and Web Coverage Service (WCS). Additional formats and publication options are available as extensions including Web Processing Service (WPS), and Web Map Tile Service (WMTS).

Put another way, GeoServer allows programs that display and manipulate geographic information to view and edit that data while it stays safe, secure and located in enterprise databases. It can connect to SQL Server, PostgreSQL and, I am told, Oracle, which are the main relational databases used by local authorities today, so once set up and configured it could be used to compare and contrast information in one location drawn from disparate datastores; datastores that for historical reasons cannot be moved. Here I concentrate on SQL Server because we have a particular issue with new systems needing to connect to SQL Server, but it would be great if we could get it working with Oracle as well.

The following are my notes on the installation and configuration of GeoServer on Windows, linking to SQL Server; in this case a remote (not on the same computer) SQL Azure instance, though I believe any remote SQL Server instance would be similar. This post is a detailed explanation of installing GeoServer locally on a laptop, but I believe installation would be exactly the same on a Windows Server machine, albeit additional steps would be required to publish to the web either internally or externally.

It's all configuration, so keep a note of your host names / IP addresses / ports / usernames / passwords / database, table and column names.

To start this tutorial please ensure the following resources are available:

  • SSMS – an installation of SQL Server Management Studio
  • A SQL Azure instance with privileges to run SQL. Please note you will need to know the port number of your SQL Server instance and have a username and password for your user.
  • The necessary privileges to set the firewall on the database server which you will link to GeoServer
  • QGIS – an installation – if you don't have a SQL Server instance with a geodatabase already that you can use, you will probably need this to add at least one record to a geometry-enabled table. I plan to expand my notes on this, but at present you might be able to skip it.
  • Geoserver 2.24.1 – installation files
  • Enough privileges on the computer you are installing GeoServer on to download installation files and move files into a library subdirectory of the C:\Program Files\GeoServer directory

    Workflow Synopsis Overview
    1) Download and install the Java Runtime Engine (as above – importantly, here I use version 11)
    2) Download and install GeoServer (2.24.1 in this case)
    3) Test that the GeoServer admin dashboard is up and working
    4) Download and configure the GeoServer extension for the SQL Server library (SQL Azure example)
    5) It is a good idea to set up a login specifically for your SQL Azure instance with least privilege – your SQL Server instance with geodatabases may already have suitable logins, in which case steps 5 and 6 can be skipped
    6) It is a good idea to create an example table with a geometry column in SQL Azure – this will be used as a test connection table – NOTE if you already have a SQL Server with geodatabases you could use that instead
    7) White list the IP of the GeoServer computer within the SQL Azure instance firewall rules
    8) Open GeoServer and set the 'Store' to reference your SQL Azure database using the login set up in step 5
    9) Add a new layer from the store made in step 8 and see if you can preview the layer – ideally you should have a table with some features in it. Set up the WMS and WFS in GeoServer with sufficient rights to allow editing (if WFS is set up)
    10) I don't explain it in my post below, but the next step would be to test in QGIS: set up a project in QGIS which pulls from GeoServer against a basemap and check that the polygons are actually where you want them to be.

    Workflow Detailed
    1) Download and install the Java Runtime via the JDK

    There are some complications here.
    Initially I tried to use Java Runtime Engine 17 and went through the complete GeoServer installation, but on testing I was getting a Java error when trying to view layers. After a short google I came across a Stack Overflow thread indicating that GeoServer support for JDK 17 was experimental and that I should use Java Runtime Engine 11. Uninstalling 17 and installing 11 solved this.

    Additionally there is the issue of where you obtain the Java Runtime installation from – there is the Adoptium location and then there is the Oracle location. Oracle are making moves to charge for use of their Java Runtime Engine, so it is important that you use the Adoptium resource to reduce costs.

    Later I will list the URL of the JDK download for the Java Runtime Engine, but it is important to realise that Adoptium is a safe source, well supported and noted as open source.

    https://adoptium.net/en-GB/members/

    so

    Google Search for Adoptium
    https://adoptium.net/en-GB/

    Navigate to Other platforms and versions, target the Windows 64-bit JDK, version 11 (LTS), and download the .msi.

    After a short delay you will be shown the msi which will be in your downloads folder

    Double click on it to install.

    Now the only thing that is tricky here is to set up the JAVA_HOME environment variable, as follows

    Video Installing Adoptium JDK on Windows

    In particular note this section

    2) Download and install Geoserver

    Go to the official Geoserver web page and follow the links to download the windows installer

    Here I use the Windows version of GeoServer 2.24.1, obtained January 2024 from the following url
    Geoserver Official Website

    See download button and then go to Windows Installer

    I chose the left option and then I chose the Windows Installer option

    This will download Geoserver-2.24.1-winsetup.exe to your download directory and you can then start the geoserver installation process

    Next open up the executable and follow the instructions

    Next agree the licence

    Next we reference the Java library installed in the previous step – if you have set the JAVA_HOME variable correctly it should automatically find JRE 11 for you and place it within the path reference below. I don't have a screenshot of the JRE 11 reference here as on my first installation I referenced the Oracle 17 JRE (note I went back, de-installed GeoServer, installed JRE 11 and then reinstalled GeoServer, both to avoid the proprietary Oracle runtime library and, importantly, to fix the issue I was facing of GeoServer not being compatible with JRE 17).

    The rest of the install from here is standard for a windows install

    Next setup the default admin password – defaults are admin / geoserver

    Set up port geoserver runs on – default is 8080

    Next choose the execution type. I like to install GeoServer as a service.

    Confirm your preferences and then trigger install

    Now you should see GeoServer on your local machine as a service, which you will need to ensure is running before you can access it.

    3) Test that Geoserver Admin dashboard is up and working at least locally

    If you have used the same settings as me open a browser and navigate to the following url

    http://localhost:8080/geoserver/web

    Login with the username and password which is usually admin / geoserver

    At which point you should see something similar to the following

    4) Download and configure Geoserver Extension for SQL Server

    At install GeoServer comes with the ability to connect to Postgres but NOT SQL Server, so we must install and configure a plugin extension to enable GeoServer to SQL Server connections.

    Here we go back to the geoserver.org website and go to Download, but this time instead of choosing the Windows installer we look to the Extensions section and choose SQL Server.

    This should download the geoserver-sqlserver extension plugin (a zip archive containing the .jar files).

    Next copy all files with the jar suffix into the following directory

    C:\Program Files\GeoServer\webapps\geoserver\WEB-INF\lib

    Next restart the GeoServer service and go back to localhost and sign in

    http://localhost:8080/geoserver/web/?0

    We are now very close to linking to SQL Server. Prior to that we must whitelist our GeoServer so that your SQL Server instances will accept connections from it. Steps 5 and 6 are more about creating a user and ensuring you have a geodatabase with a georeferenced table; skip to step 7 if you want to read about white listing in SQL Azure, or if you already have an alternative database with these requirements set up.

    5) Next it's a good idea to create a user with least privilege; this will be used to set up the link, so go to SSMS

    Least privilege User Login Setup SQL Server

    Here allow the amount of access you wish the user to have, remembering the principle of least privilege.
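
    As a hedged sketch of what that might look like in SSMS, the following creates a contained database user with read-only access plus edit rights on the test table from step 6 (the user name, password and table are placeholders – adjust to your own setup):

    -- Run against the SQL Azure database GeoServer will connect to.
    -- geoserver_reader and the password are placeholder values.
    CREATE USER geoserver_reader WITH PASSWORD = 'UseALongRandomPasswordHere!1';

    -- Read access to all tables is enough for publishing layers over WMS.
    ALTER ROLE db_datareader ADD MEMBER geoserver_reader;

    -- Only needed if you intend to edit features back through WFS-T,
    -- and only on the specific tables you want editable (t064 is the step 6 test table).
    GRANT INSERT, UPDATE, DELETE ON dbo.t064 TO geoserver_reader;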

    6) Let's create an example table with geometry that we will be connecting to
    When I first tried connecting to SQL Server (SQL Azure in my case) I didn't have any georeferenced tables, so I created one and added a few records. If you already have a database with geometry or geography tables you might not need to do this step.

    So open SSMS, navigate to your database and use the following T-SQL to create a table; here I call it t064

    SET ANSI_NULLS ON
    GO
    
    SET QUOTED_IDENTIFIER ON
    GO
    
    CREATE TABLE [dbo].[t064](
    [PKID] [int] IDENTITY(1,1) NOT NULL,
    [geomcol] [geometry] NULL,
    [sitename] [varchar](30) NULL,
    PRIMARY KEY CLUSTERED
    (
    [PKID] ASC
    )WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
    ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
    GO

    This is a really simple table with three columns. You might want to link to it through QGIS and just create a few records – I would use a projection of 27700 (British National Grid) if you are UK based. I might come back with an example table with records.

    If you want to link to this table for testing then you should input at least one record here. I added records using QGIS (a minimal T-SQL insert is sketched below) and I hope to come back to this post and update it to be more specific.
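
    If you would rather add a test record directly in SSMS instead of through QGIS, a minimal insert could look like the following (the polygon coordinates are arbitrary placeholder eastings/northings in SRID 27700 – substitute a shape in your own area):

    INSERT INTO dbo.t064 (geomcol, sitename)
    VALUES (geometry::STGeomFromText(
            'POLYGON((325000 673000, 325100 673000, 325100 673100, 325000 673100, 325000 673000))',
            27700),
            'Test site 1');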

    7) White List the IP within your SQL Server or SQL Azure instance.
    Setting IP White List in SQL Azure
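
    The white listing can be done through the Azure portal (the Networking blade of the SQL server), or as a sketch in T-SQL against the master database of the instance – the rule name and IP address below are placeholders for the public IP of the machine running GeoServer:

    -- Run in the master database of the SQL Azure server (requires appropriate privileges).
    EXECUTE sp_set_firewall_rule
        @name = N'GeoserverMachine',
        @start_ip_address = '203.0.113.25',
        @end_ip_address   = '203.0.113.25';

    -- List the server-level firewall rules currently in place.
    SELECT * FROM sys.firewall_rules;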

    8) Opening Geoserver and setting the ‘Store’ to reference your SQL Azure database using the login setup in step 5

    Navigate to the localhost URL for GeoServer, namely

    http://localhost:8080/geoserver/web/?0

    Go to Stores and you should see the following page

    Hit the green Add new Store button in the top left corner and you will be presented with the following screen

    Fill out as many details as you can

    For SQL Azure this is likely to be similar to the following

    Host = namevariable.database.windows.net (normally unique to the instance in SQL Azure)
    Port = 1433 (SQL Azure default)
    Database = databasevariable (the database containing your spatial tables)
    Schema = dbo (the SQL Azure default is dbo but your database may be bespoke)
    User = uservariable (remember least privilege is a good idea)
    Passwd = passwordvariable (should be unique to your database)
    

    Save and apply, and then your new connection should appear in your list of stores

    Here’s an example…. (see top line)

    For the purposes of this tutorial I set up a login with db_owner rights to the database LandRegisterAzure; your mileage may vary.

    9) Next we add the layer which references the SQL Server

    Select Layers within the Data section of the menu (usually to the left of the geoserver dashboard) then hit Add a new layer

    You should get the following window

    From the drop down select the store which references your SQL Server. This will reveal all the tables and views in the database; scroll through them to the table or view you wish to publish and hit the Publish link in the column marked Action. (I've done that here already, so under Action our table is marked Publish again and there is a tick in the column titled Published.)

    We can now quickly test to see if things look like they are working by going to Layer Preview.

    Look to the left hand side and select Layer Preview

    Scroll down through the list and identify the layer that you have just added. I now select the OpenLayers option

    A new browser tab will open and if you have successfully configured the SQL Server you should be presented with your layer – without any background

    10) I don't go into it here but next I will try and write up my notes on QGIS and GeoServer setup – on my laptop I managed to set up a WFS to my SQL Azure instance through GeoServer, added the layer to a QGIS project, and was able to edit information in my SQL Azure database through QGIS, with the edits displayed correctly against a basemap in the right place!

    –/END/–

    Spatial Data Hub Scotland

    Planning in the UK has for a long time suffered from a classic siloing of data by authority, resulting in slow and varied analysis of information. Authorities relied on talented, motivated individuals with particular interests and skills to develop bespoke solutions that assisted in the development of things like housing land audits, population forecasts, capital planning and local plan development, which, while often impressive individually, struggled to transfer between authorities.

    The continual improvement of digital tools has dramatically improved standardisation of the attributes of particular spatial data sets, and database technology originally developed for accounting systems and flight control systems is starting to be applied to the amalgamation and analysis of planning-related information. Within the UK different regions are progressing along this path at different rates. Scotland now has a body, the Improvement Service, with a specific remit to collect spatial planning data, which it does at something called the Spatial Data Hub.

    As of 8 January 2024 the Spatial Data Hub had 59 datasets listed at Scotland coverage level, including:
    Planning application boundaries
    School Catchment Areas
    Housing Land Supply
    Vacant and Derelict Land
    Employment Land

    The Improvement Service has been building these datasets for a number of years now; however, last year they expanded general access to much of the information and I have since been experimenting with it to see what can be achieved.

    SQL Azure – Using a Trigger to watch changes on fields

    The below trigger monitors a table called t001Table4Monitoring and four particular fields: three date fields and one integer field. It monitors the trigger table and compares new values to old values. As a comparison between NULL and a value will always return NULL, we need to catch situations where a field is changed from NULL or to NULL, hence the ISNULL section. The record has a bit field that is updated to true if a change is detected.

    This code was created with the help of GPT-4

    CREATE TRIGGER [dbo].[trg_Update_t001Table4Monitoring]
    ON [dbo].[t001Table4Monitoring]
    AFTER UPDATE
    AS
    BEGIN
        IF UPDATE(Date01Monitor) OR UPDATE(Date02Monitor) OR UPDATE(Date03Monitor) OR UPDATE(IntegerMonitor)
        BEGIN
            -- Get old and new values
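            -- NOTE: these scalar variables assume a single-row UPDATE; a multi-row
            -- UPDATE will only pick up one (arbitrary) row from inserted/deleted here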
            DECLARE @old_Date01Monitor DATE, @new_Date01Monitor DATE;
            DECLARE @old_Date02Monitor DATE, @new_Date02Monitor DATE;
            DECLARE @old_Date03Monitor DATE, @new_Date03Monitor DATE;
            DECLARE @old_IntegerMonitor INT, @new_IntegerMonitor INT;
    
            SELECT @old_Date01Monitor = Date01Monitor, @old_Date02Monitor = Date02Monitor, @old_Date03Monitor = Date03Monitor, @old_IntegerMonitor = IntegerMonitor FROM deleted;
            SELECT @new_Date01Monitor = Date01Monitor, @new_Date02Monitor = Date02Monitor, @new_Date03Monitor = Date03Monitor, @new_IntegerMonitor = IntegerMonitor FROM inserted;
    
            -- Check if values have changed
            IF (ISNULL(@old_Date01Monitor, '1900-01-01') <> ISNULL(@new_Date01Monitor, '1900-01-01')) OR 
               (ISNULL(@old_Date02Monitor, '1900-01-01') <> ISNULL(@new_Date02Monitor, '1900-01-01')) OR 
               (ISNULL(@old_Date03Monitor, '1900-01-01') <> ISNULL(@new_Date03Monitor, '1900-01-01')) OR 
               (ISNULL(@old_IntegerMonitor, 0) <> ISNULL(@new_IntegerMonitor, 0))
            BEGIN
                UPDATE dbo.t001Table4Monitoring
                SET RecordChanged = 1
                FROM dbo.t001Table4Monitoring t
                INNER JOIN inserted i ON t.pkid = i.pkid
            END
        END
    END;

    SQL Azure – Adjusting for different timezones a practical example

    Here I create a main table which can be used to translate times into a UTC value using a function I store here. It should be noted that if you have a field of type datetime and you set the default to GetDate(), the server will create a UTC value on record creation, because the time is set on the server and with SQL Azure you don't have an option to change the server's time zone.

    CREATE TABLE [dbo].[MainTable](
    	[pkid] [int] IDENTITY(1,1) NOT NULL,
    	[storeddatetime] [datetime] NULL,
    	[timezonename] [nvarchar](128) NULL,
    PRIMARY KEY CLUSTERED 
    (
    	[pkid] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO
    
    ALTER TABLE [dbo].[MainTable] ADD  CONSTRAINT [DF_MainTable_storeddatetime]  DEFAULT (getdate()) FOR [storeddatetime]
    GO
    

    Next we create a timezone table

    CREATE TABLE [dbo].[TimeZoneTable](
    	[pkid] [int] IDENTITY(1,1) NOT NULL,
    	[timezonename] [nvarchar](128) NULL,
    PRIMARY KEY CLUSTERED 
    (
    	[pkid] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO
    

    And we can populate this from sys.time_zone_info table.

    INSERT INTO TimeZoneTable (timezonename)
    SELECT name FROM sys.time_zone_info;

    We can create a function to convert this datetime value, knowing the time zone, into UTC as follows

    CREATE FUNCTION dbo.fn_ConvertToUTC (@localDatetime DATETIME, @timeZone NVARCHAR(128))
    RETURNS DATETIME
    AS
    BEGIN
        DECLARE @utcDatetime DATETIME;
        SET @utcDatetime = @localDatetime AT TIME ZONE @timeZone AT TIME ZONE 'UTC';
        RETURN @utcDatetime;
    END;
    

    Now running the following query on some example data we get the following

    SELECT 
        pkid, 
        storeddatetime, 
        timezonename, 
        dbo.fn_ConvertToUTC(storeddatetime, timezonename) AS utcValue
    FROM 
        MainTable;
    

    And here is a function that will convert to local time from utc

    CREATE FUNCTION dbo.fn_ConvertToLocal (@utcDatetime DATETIME, @timeZone NVARCHAR(128))
    RETURNS DATETIME
    AS
    BEGIN
        DECLARE @localDatetime DATETIME;
        SET @localDatetime = @utcDatetime AT TIME ZONE 'UTC' AT TIME ZONE @timeZone;
        RETURN @localDatetime;
    END;

    Javascript option
    The getTimezoneOffset() method of the Date object in JavaScript calculates the difference, in minutes, between the local time zone and UTC. The method returns a positive value if the local time zone is behind UTC, and a negative value if the local time zone is ahead of UTC.

    The method works by subtracting the local time from UTC time. For example, if the local time zone is 5 hours behind UTC, then the method will return 300 (5 * 60).

    Everyone gets beaten by Complexity

    I don't do that many strategic posts these days (compared with my first posts) but this is really a reminder to myself to always seek out the simplest solution, especially when configuring systems and writing code. Less is generally better. The above image was a DALL-E production and the below graph is the simpler one, which (QED) actually makes sense. I would suggest that the line moves to the right with improved education of staff, time, money and number of staff, but eventually productivity dramatically drops with complexity no matter how much money, time or people you have.

    SQL Server Express – Connection to QGIS issues with geometry display

    I was having issues displaying polygons stored in SQL Server Express locally within a QGIS 3.28 project.

    After quizzing GPT 4.0 unsuccessfully, I found the answer on good old Stack Overflow.

    As part of your SQL Server Express installation you should have installed SQL Server Configuration Manager. Open this administration package and navigate to your instance to show the following;

    Make sure both Named Pipes and TCP/IP are enabled.

    That fixed it for me
    Important note: the host connection string IS case sensitive, and sometimes when you use the properties from within SSMS to look at the hostname it capitalises things, so watch out! That caught me out for half an hour!

    And here is my definition of the geometry table

    CREATE TABLE [dbo].[geometry_columns](
    [pkid] [int] IDENTITY(1,1) NOT NULL,
    [f_table_catalog] [varchar](50) NULL,
    [f_table_schema] [varchar](50) NULL,
    [f_table_name] [varchar](100) NULL,
    [f_geometry_column] [varchar](50) NULL,
    [coord_dimension] [int] NULL,
    [srid] [int] NULL,
    [geometry_type] [varchar](50) NULL,
    PRIMARY KEY CLUSTERED
    (
    [pkid] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
    ) ON [PRIMARY]
    GO
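
    For reference, QGIS and other clients can use this table to discover spatial layers; an illustrative row registering the t064 example table from the GeoServer post earlier might look like this (the catalog, SRID and geometry type values are assumptions – adjust them to your own database):

    INSERT INTO dbo.geometry_columns
        (f_table_catalog, f_table_schema, f_table_name, f_geometry_column,
         coord_dimension, srid, geometry_type)
    VALUES
        ('LandRegisterAzure', 'dbo', 't064', 'geomcol', 2, 27700, 'POLYGON');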

    PHPRunner – Pass Value to CSS Reference using Javascript in a pop up

    Javascript is very powerful and will easily calculate all sorts of interesting things for you dynamically. In PHPRunner I use the popup windows for nearly every table or view form so I wanted it to work with these.

    But with PHPRunner we want to store these values in the database. I had a devilish time finding a way of referencing the field into which to copy any JavaScript value. After some lengthy discussion with ChatGPT 4 (via Bing), it suggested that I might try and use the CSS selector.

    I then discovered that I couldn't seem to identify the name of the CSS selector.

    What I discovered is that I could not see a static CSS selector reference for any of the fields except those I had altered the formatting on, for example by changing the font to Roboto Mono.

    So as a first step, choose your target field, alter it using the PHPRunner page designer and then publish.

    After that, I discovered I could use Inspect to identify the CSS selector.

    On the published application navigate to the form and the field you wish to target for entry, right click and select Inspect.

    Look to the DevTools window (in Chrome) and you should see in bold the CSS names of your field.

    You can then ask ChatGPT the following

    Can you parse the input css selector I need from the following string that can be used by javascript to be passed a value

    [data-page="t0017_add"][data-itemid="integrated_edit_field4"][data-page="t0017_add"][data-itemid="integrated_edit_field4"][data-page="t0017_add"][data-itemid="integrated_edit_field4"][data-page="t0017_add"][data-itemid="integrated_edit_field4"] > * > * > input

    An element is a part of a webpage. In XML and HTML, an element may contain a data item or a chunk of text or an image, or perhaps nothing. A typical element includes an opening tag with some attributes, enclosed text content, and a closing tag. Elements and tags are not the same things.

    More on elements is available here

    This can now be used behind a button and away we go… see below

    document.querySelector('[data-page="t0017_add"][data-itemid="integrated_edit_field4"] > * > * > input').value = "Password123";

    And here are some notes on finding CSS selectors from ChatGPT 4. I am still investigating CSS selectors; there seems to be a black art to understanding their structure and how they can be useful.

    I also asked ChatGPT about child selectors in CSS. In my discussions with ChatGPT I have discovered that spaces and symbols are important in naming, and of course JavaScript is case sensitive. The dynamic nature of CSS selectors and their very specific naming conventions, combined with spaces being significant characters, really means you need to be on your toes when you use them.

    Geoserver – Setting up Users / Roles / Workspaces and Layers

    Create Users
    Firstly ensure you are logged in as administrator. ONLY the system administrator can set up users and roles, and the default GeoServer username/password for the administrator is admin/geoserver.

    The sa login has additional menus as here

    Go to the Security menu on the left towards the bottom and find the line marked Users/Groups/Roles.

    Go to the Roles tab

    Look to the green circle with white cross and add new role and complete as appropriate

    Note that once you have created a role you will not be able to edit its name, so if you don't want the old role name to continue you have to delete the role and then create a role with a different name.

    Next we create the users

    Go to the Users/Groups tab

    Look to the green circle with the white cross to add a new user, and within the Roles taken section ensure you move the role you created from the left box to the right (see grey above).

    The new username should now appear in the User list

    You grant admin privileges to the workspaces through the Workspace section as follows

    Within the edit workspace section go to security

    And give admin rights to the role you wish to link to the workspace

    Note that you still won’t see anything until you go into layer edit and edit the security as follows.

    Within the Security section you can also see the roles added

    PHPRunner and MS SQL Azure – Running a Stored Procedure behind a button – With Sweet Alert User Feedback

    I have written previously about triggering stored procedures in MS SQL Azure through a PHPRunner-generated web front end. This is a vanilla explanation which includes Sweet Alerts to keep users updated on the progress of longer-running stored procedures. It also locks the front end, preventing users from rushing procedures before the queries have had time to complete.

    Firstly navigate to the table you wish to run the stored procedure from; in this example it is the table T032Email List Screen

    Insert a Custom Button; here I call it Transform

    Next select the button and look to the right for the Edit Button Code button.

    You will be presented with the Buttons properties window dialog with three tabs
    1=Client Before
    2=Server
    3=Client After

    1) In the Client Before place the following code

    Swal.fire({
      icon:"info",
      title: 'Please wait... transforming selected records',
      allowOutsideClick: false,
      allowEscapeKey: false,
      showConfirmButton: false
    });

    2) On the Server trigger the stored procedure – swap out your stored procedure name.

    DB::Exec("EXEC dbo.transformemail");

    3) On the Client After tab, notify the user once everything has been completed

      Swal.fire({
        icon: "success",
        allowOutsideClick: false,
        allowEscapeKey: false,
        title: "Records Transformed",
        showConfirmButton: false,
        timer: 2000
      });
    setTimeout("window.t032emailPage.reload({a:'reload'})", 2000);
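
    The Server tab call above assumes a stored procedure named dbo.transformemail already exists in the database. Purely as a hypothetical placeholder (the table and column it touches are invented for illustration), it could be as simple as:

    CREATE PROCEDURE dbo.transformemail
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Placeholder logic only: flag rows in an email list table that have not yet been transformed.
        UPDATE dbo.t032emaillist
        SET transformed = 1
        WHERE transformed = 0;
    END;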

    PHPRunner – Using Sweet Alerts to give Users confirmation of Save

    In PHPRunner, as of version 10.51, the Sweet Alert JavaScript library is included in PHPRunner-generated web applications.

    How can we add code to a project to make bespoke adjustments?

    Firstly navigate to the form you wish to add a special sweet alert to and insert a Custom Button.

    Next navigate to the Events tab in PHPRunner and expand the table or view to which you added the additional button.

    Behind the JavaScript OnLoad event add the following code

    $('a[id^="saveButton"]').hide(); // Hide button "Save"

    And on the Client Before Event of the new button add the following code

    Swal.fire({
      icon: "success",
      title: "Saved",
      showConfirmButton: false,
      timer: 1000
    });
    $('a[id^="saveButton"]').click();
    return false;

    So what does the code do?
    JavaScript OnLoad event – hides the real Save button.
    On Click – triggers the Sweet Alert success routine and, once that is complete, triggers the hidden Save button code.

    And more examples and inspiration

    Sweet alert 2

    And also if you find the Sweet Alert modal size too small add this to your page css:

    .swal2-popup { font-size: 1.6rem !important; }

    Most of this was from the following Xlinesoft User forum Thread
    Thread

    MS SQL Azure – Computed Column – Create UTC Datetime value from Datetime & Timezone fields – Daylight Saving

    This is not tested at the time of writing; it was obtained from ChatGPT 4.

    Answer
    Sure, here’s an example of how you might implement the ConvertToUTC function to handle time zones like ‘London/Europe’. This example uses the AT TIME ZONE clause, which is available in SQL Server 2016 and later:

    CREATE FUNCTION dbo.ConvertToUTC (@localDateTime DATETIME, @timeZone NVARCHAR(50))
    RETURNS DATETIME
    AS
    BEGIN
        -- Convert the local datetime to a datetimeoffset value using the specified time zone
        DECLARE @datetimeoffset DATETIMEOFFSET = @localDateTime AT TIME ZONE @timeZone
        -- Convert the datetimeoffset value to UTC
        DECLARE @utcDateTime DATETIME = CONVERT(DATETIME, SWITCHOFFSET(@datetimeoffset, 0))
        RETURN @utcDateTime
    END

    You can then use this function in a computed column definition:

    ALTER TABLE YourTable
    ADD UtcDateTime AS dbo.ConvertToUTC(field01, field02)

    In this example, field02 should contain time zone names that are valid for the AT TIME ZONE clause. For example, ‘UTC’, ‘Pacific Standard Time’, ‘Central European Standard Time’, etc. For London, you would use ‘GMT Standard Time’.

    Please note that this is just a basic example. The actual implementation would depend on how you’re storing time zone information and the specific requirements of your application. It’s also worth noting that handling date and time conversions can be complex due to factors like daylight saving time, so it’s important to thoroughly test your solution to ensure it handles all edge cases correctly.

    If you’re frequently performing these types of conversions, you might want to consider storing datetime values in UTC directly in your database. This can simplify your data model and avoid potential issues with time zone conversions.
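
    Picking up on that last suggestion, one simple way of storing UTC directly in SQL Server / SQL Azure is to default the column to GETUTCDATE() rather than GetDate(); a minimal sketch, with placeholder table and constraint names:

    CREATE TABLE dbo.t_example_utc (
        pkid INT IDENTITY(1,1) PRIMARY KEY,
        -- Defaults to the UTC clock regardless of the server's local time zone.
        storeddatetimeutc DATETIME NOT NULL
            CONSTRAINT DF_t_example_utc_storeddatetimeutc DEFAULT (GETUTCDATE())
    );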

    Question
    Does AT TIME ZONE have a default value that will alter depending on whether the UK is under British Summer time?

    Answer
    Yes, the AT TIME ZONE clause in SQL Server does take daylight saving time into account. When you use a time zone name with this clause, it will automatically adjust the time based on the current daylight saving time rules for that time zone.

    For example, if you use ‘GMT Standard Time’ (which is the time zone name for UK time in SQL Server), it will automatically show the correct offset for daylight saving when it is in effect.

    Here’s an example:

    SELECT GETDATE() AT TIME ZONE 'GMT Standard Time'

    This will return the current date and time on the database server BUT in my testing does NOT adjust the time to the time zone; it will, however, show the correct time zone OFFSET for the date, corrected for daylight saving, i.e. Greenwich Mean Time (GMT) or British Summer Time (BST), depending on the date.

    Please note that the actual time zone names and their daylight saving time rules are based on the time zone data on the server where SQL Server is running. This data is updated through the Windows operating system, so it’s important to keep your servers up-to-date to ensure that the time zone rules are accurate.

    And here are what I think are the supported time zones
    MS Azure supported time zone values

    Post-script
    A lot of online editors won’t let you run the At Time Zone function but this one does..

    dofactory

    Comparing the returned time to the time where I am, I can tell that the server is on Pacific Standard Time – i.e. western America, California perhaps.

    We also note that the time isn't adjusted to the stated time zone, but we do see the offset (see +01:00 in the returned record). Again this is because AT TIME ZONE doesn't by itself tell you what the server is set to, but by using SYSDATETIMEOFFSET() we can correct the server time back to UTC and then adjust for the time zone – see the second code example. The time zone, although reflected in the offset, is again only stated rather than applied.

    SELECT SYSDATETIMEOFFSET() AT TIME ZONE 'W. Europe Standard Time' as WEuropeTime, 
    SYSDATETIMEOFFSET() AT TIME ZONE 'GMT Standard Time' as GMTStandard

    And so there is a direction to the time zone switch – in the above we have mainly been switching from UTC to a time zone, but the below switches from a time zone to UTC, which is what we will need if we are storing the input as a datetime and a separate time zone for each record.

    SELECT SWITCHOFFSET(GetDate() AT TIME ZONE 'Pacific Standard Time', '+00:00')
    

    Remember though that time zone rules are held outside SQL Server databases, on the server, and as such AT TIME ZONE is non-deterministic. This is a good demonstration of determinism in practice:

    Deterministic algorithms are entirely predictable and always produce the same output for the same input.

    Non-deterministic algorithms may produce different outputs for the same input due to random events or other factors.

    Databases and Time – Time Zones are not Datatypes

    When working with time in databases and web applications we should recognise that time zones are NOT a datatype. They are an important piece of context that attaches to a datetime value and may represent the time on the server, the time in the browser, or some time in a particular country for an attribute of a record. Simply viewing a datetime value on its own is incomplete if users are entering data from all over the world. If this is not designed into the schema from the beginning, problems may present themselves when filtering and ordering records based on datetime.

    Does your application have users within the same or different time zones? Is your database hosted in the same or a different time zone?

    SQL Azure database servers, for instance, only allow a default time zone of UTC. This means that wherever you are in the world, if you set a default value on a datetime field, new records will record the time at UTC. This can present issues where users submit values based on a different localised time.

    Default values that are set to GetDate() can therefore be out by a given offset. Additionally, web interfaces which set the datetime value at input may take it from the browser, and if the browser is not on UTC that time will use a clock that may be ahead of or behind UTC. What's the solution? My suggestion is to record the time zone in a separate field per record alongside the datetime value and then use a view to convert all datetimes to a common time zone (probably best UTC), as sketched below. At least that should reduce complexity for users. Developers and users should be aware that this could be confusing when a user inputs a time into a database and is then presented with the UTC value when viewing it.
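
    Using the MainTable and fn_ConvertToUTC examples from the earlier time zone post, such a view might be sketched as follows (assuming the time zone name stored against each record is valid for AT TIME ZONE):

    CREATE VIEW dbo.vw_MainTableUTC
    AS
    -- Present each stored datetime alongside its UTC equivalent for consistent filtering and ordering.
    SELECT
        pkid,
        storeddatetime,
        timezonename,
        dbo.fn_ConvertToUTC(storeddatetime, timezonename) AS storeddatetimeutc
    FROM dbo.MainTable;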

    It should be noted that Azure Logic Apps run on a server that is not set to UTC but to local time (in my case London / Europe); I suspect that whatever locale you set will affect the local time. This led to a situation where, in British Summer Time, a Logic App procedure triggered for 10pm was showing 9pm in the database – the BST offset.

    Repeat after me – Time Zones are not Datatypes

    And here is a nice site that identifies differences from UTC of timezones based on URL parameters

    Time around the world
    British Summer Time – 10am UTC

    Eastern Seaboard Time – 10am UTC

    Sydney Australia – Aussie Eastern Daylight Time – 10am UTC

    WordPress and OceanWP(theme) – Default config for blank Blog

    WordPress is excellent but setup through the admin menu system seems to me unintuitive and a standard configuration can be really difficult to replicate. I constantly wrestle to get sites how I like them. I resolved to document a standard configuration with WordPress and the theme OceanWP (free version) to speed things up.

    What are my preferences for good design? A wide section for text, an obvious and intuitive search option, and a menu that can be set up dynamically to add new posts under their respective category without continually needing to alter the menu design.

    It should be noted that I don’t use the OceanWP theme for this site – which is a historical thing.

    I've used the OceanWP theme (free version) for some time; it's a good, free, popular WordPress theme.
    You can read more about OceanWP here
    My shared hosting provider is InMotion; I have found them to be excellent, particularly their online support, which is 24/7 – link here

    Firstly install a default WordPress website. My provider uses Softaculous through cPanel; your preferred web hosting management platform may vary. I won't document this stage.

    Next install the OceanWP Theme and add Plugins.

    Next I add my preferred plugins, some of which come with shared hosting WordPress installation;

  • Akismet (default with my shared hosting wordpress installer (inmotion))
  • Broken Link Checker
  • Jetpack (again default with my shared hosting wordpress installer (inmotion))
  • Login logo
  • Nginx Helper (default)
  • Password Protected (if enabled will require a password for the entire site)
  • Post Types Order
  • Total Upkeep (I use premium as it allows for improved backups)
  • Current Year, Symbols and IP Shortcode (allows addition of copyright symbol with dynamic year)

    Next Default configuration of OceanWP WordPress theme using the above plugins

    Settings / General
    – alter site title and tagline

    Settings / Reading / Search engine visibility
    Set as appropriate: tick the box if you want to discourage search engines from indexing the site, or leave it clear if you want the site indexed
    – Discourage search engine visibility
    This is also where you show blog posts as either Full Text or just an Excerpt. If you are on Full text there is little point in having the Read more element in the blog element positioning section.

    Settings / Permalinks
    Apparently this helps with SEO
    – Postname

    Settings/ Discussion

    Appearance/Customize/Widgets/Default Sidebar
    This is where you can add elements and alter the order of elements in the sidebar. I find the admin design clunky and unintuitive, but I eventually discovered how to do it: use the edit icon and the three vertical dots to add and subtract elements (search, for example, is not added and removed from the sidebar here).

    Appearance / Menus
    I set up a menu called Primary Menu and add Home and Index to it as follows. Note that Home is a Custom Link and Index is a Category link. Note that if Home is not available you can create it and then add it; the URL link will just be the default domain.

    Appearance / Widgets

    You can highlight recent comments and delete the entire block – I also add the category list to the side bar

    Appearance / Customise / General Options
    – Layout Style – Separate

    Appearance / Customise / Top Bar / General
    Turn off Top bar by ensuring the ENABLE TOP BAR tick box is blank

    Appearance / Customise / Menus
    Here I am ensuring the menus are not repeated. In OceanWP there are four places where you can place menus. I turn off Top Bar and I don’t like the Menu being in the footer so I leave both at the default blank -Select-

    Appearance / Customise / Blog
    We are given two options here – Blog, which is the overall listing, and Single Posts. I will normally remove the side panel for single posts, and I am still considering whether I should remove the side panel for the full list of posts; either way you do both here.

    Appearance / Customise / Header / General
    I haven’t experimented with all the different options but one that seems to work is Minimal so these are the settings

    Appearance / Customise / Header / Menu
    Again there are a lot of options here and not all seem to work or seem to be intuitive so I have settled on the following

    Appearance / Customise / Header / Mobile Menu
    Again there are a lot of options, but here is a set that I am happy with and that works. It should be noted (see the drop down Elements Positioning) that it likes to have the Logo at the front, otherwise the magnifying glass can disappear. Note that if you change it there is sometimes a lag in the display; this might be related to your hosting provider and caching (not sure).

    Appearance / Customise / General Options / Theme Icons
    I like to set to Font Awesome

    Appearance / Customise / Footer Bottom
    A nice touch to ensure that copyright is Set – here we use the shortcode from the plugin

    Appearance / Customise / General Options / Scroll to Top
    If you haven't installed a sticky header this can be a reasonable alternative (OceanWP specific).

    Next let's make sure the search box is above content on mobile devices. I like this as it makes it really obvious to users on mobiles how to use the search facility, which to my mind is one of the most important things in a WordPress site. I also like infinite scroll.

    I've pretty much documented everything here – I haven't touched the logo, which is how you get the icon on a site and is also something you are going to want to do. I will come back to this post and add that when I am doing it in anger for a demonstration site. Remember if you are reading this that these options are relevant to WordPress with the free OceanWP theme installed as at October 2023.

    Add POSTGIS extension to Azure Postgres Database – Worked Example

    Firstly we need to allow for the extension on the server.
    Within the Azure portal navigate to server parameters

    Search for azure.extensions

    Select which extensions to add
    REMEMBER to save before exit!

    Wait a short period while the extension is loaded onto the server..

    Some points
    Azure Database for PostgreSQL supports a subset of key PostgreSQL extensions as listed HERE. This information is also available by running SHOW azure.extensions;. Extensions not listed in this document aren’t supported on Azure Database for PostgreSQL – Flexible Server. You can’t create or load your own extension in Azure Database for PostgreSQL.

    After extensions are allow-listed and loaded, these must be installed in your database before you can use them. To install a particular extension, you should run the CREATE EXTENSION command. This command loads the packaged objects into your database.
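
    As a sketch, once the extension has been allow-listed in the server parameters, the install and a quick check from psql or pgAdmin (connected to your database) look like this:

    -- Install the extension into the current database (it must already be allow-listed on the server).
    CREATE EXTENSION IF NOT EXISTS postgis;

    -- Confirm what is now installed.
    SELECT extname, extversion FROM pg_extension;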

    The extension(s) should now be listed against your database.

    Postgres on Azure : also named Azure Database for Postgres Flexible Server : MS Azure

    Quick Start Azure Documentation

    Good link on tips and tricks Azure

    Login to your azure portal and choose Create a resource

    Next select the Azure Database for PostgresSQL Flexible Server

    Now indicate the starting parameters
    Please note, IF you want to connect using pgAdmin, that I found issues with version 15 so chose version 14. It is not clear why this is; some quick googling suggested it was the version of pgAdmin I was using, so at the bottom of this post I capture the About details of the pgAdmin 4 client I am using. I was able to connect to the server via psql.
    Remember all of your parameters and choose an appropriate account

    Note you can stop the server if this is a development test copy which is good for reducing costs…

    Do this by navigating to the server and selecting the stop button at the top.

    Note: give it time to start and stop – don't, like me, try to start the server before it has finished stopping!

    MS Azure : Identify outbound IP addresses from an MS Azure Web App Service

    When setting up a Web App Service on Azure which you want to hook up to an Azure database, you will need to ensure that the IP address of the Web App Service is white listed on the database server. This is true of Postgres and SQL Azure in my experience. When you upload your project to the Web App Service and connect to SQL Azure, the error message indicates which IP address of the web application is failing. That is not the case when trying to connect to Azure Database for PostgreSQL Flexible Server (my experience at September 2023), so how do we find this? As usual the answer is in the Azure documentation, but thoroughly buried.

    One paragraph states

    Regardless of the number of scaled-out instances, each has a set number of outbound IP addresses at any given time. Any outbound connection from the App Service app, such as to a back-end database, uses one of the outbound IP addresses as the origin IP address. The IP address is selected randomly at runtime.

    SO YOUR BACK END SERVICE MUST OPEN ITS FIREWALL TO ALL THE OUTBOUND IP ADDRESSES FOR YOUR APP

    So how do we find these outbound IP addresses?
    Log into the Azure Portal using the administrator account that has access to the Web App Service you want to identify outbound IP Addresses for.

    Select the web app service

    Look to Networking within Settings menu

    Now several boxes should be presented to you towards the right of your browser window identify the one marked Outbound Traffic.

    IP Addresses you are interested in should be presented where the red box is. There is a helpful copy button should you need to quickly copy them.

    So that is great, but what if you quickly want to identify the outbound IP address of a web app?

    So go to the windows terminal

    and use the nslookup command followed by the domain you wish to identify the IP address of.

    Postscript
    One point: I seemed to have difficulty registering the outbound IP addresses and it repeatedly failed to connect a front-end application to the Azure Database for PostgreSQL Flexible Server – I went through my outbound addresses thoroughly and then stopped and started the server itself. Note the restart option wasn't working for me, so I did a complete stop and then start. After this it seemed to work. I will continue to monitor this, but it's not a process I perform often, so it may be some time before I can truly establish my issue. It should be noted that nslookup did not initially reveal the IP that worked; I still don't know which of the outbound addresses are working or whether it was the restart that made it work. I did note that the 'allow all Azure connections' option always corrected the issue (so this can be used for testing).

    Postgres – Export a Spatial database and then Importing again (pg_dump + psql utilities)

    Let's take this from the beginning. The first thing you need to do is be on the physical machine where the database is running.

    Next check that the database server is up and running

    To do that

    Go to the windows bar click on the search icon and type in

    Services

    Scroll down to P and look for the name postgresql, then look to the Status column. In the above example there are two, postgres 11 and postgres 14; of the two only 14 shows as Running.

    There are several Utility executables that come with every postgres installation.

    The standard installation set up puts these into a directory equivalent to the following;

    We need to navigate at the command prompt to the location of the executable we are interested in. Technically you could qualify the path to the executable when running a command, but I tend to navigate to the location.

    So firstly in the search button go to the command prompt

    cmd

    cd C:\Program Files\PostgreSQL\14\bin

    pg_dump -U postgres -W -F p -d dbspatialhub > c:\sql\dbspatialhub.sql

    and here is a string that works if your database is on the local server with a non-standard port – substitute your port for 5433

    pg_dump -U postgres -h localhost -p 5433 -d dbsource > c:\sql\dbsourceexport.sql

    Run this command – you will note that I qualify the output directory here. I prefer to do this as I don't particularly like writing and deleting files in the program directory; I prefer that directory to be reserved for standard files, which reduces the likelihood of me accidentally deleting an important file.

    You should be presented with the Password: prompt; input the password for the specified user (-U) to complete.

    Success will be shown as a return to the former prompt with no message.

    and running a DIR on the directory look for the sql file…

    Next, from either the psql prompt or from within pgAdmin 4, create a database you are going to import the SQL into and ensure that the PostGIS extension is installed. Remember this is the psql prompt and not the DOS command line prompt.

    CREATE DATABASE dbspatialhubnew;
    \c dbspatialhubnew
    CREATE EXTENSION postgis;
    

    Now you have to go back to the command prompt (and the correct directory).

    If you are on a different port add the following string -p 5434

    psql -U postgres -d dbspatialhubnew -f c:\sql\dbspatialhub.sql

    After being asked for the password the import will start and continue showing a list of various commands and data being executed and imported.
    Similar to below.

    Now you can go back to a client such as QGIS and link to the database to start confirming a successful import – or run the quick check below first.
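
    Before opening QGIS you can also run a quick sanity check from psql against the new database – geometry_columns is a view that PostGIS maintains for you:

    -- List the spatial tables the restore has created, with their SRIDs and geometry types.
    SELECT f_table_schema, f_table_name, f_geometry_column, srid, type
    FROM geometry_columns
    ORDER BY f_table_schema, f_table_name;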

    Congratulations you have copied a database.

    It should be noted that as it stands I am having problems importing the layers table which has the format of the boundaries for tables.
    I will continue to work on this.

    QGIS – Working with Free BASEMAP Services for Desktop Projects (Bing and Google Maps Aerial and Lines)

    As soon as you need to practically implement any information in spatial databases, display of the information through a mapping front end becomes absolutely vital. Many database administrators are used to simply going into their favourite database editors and displaying raw subsets of tables and queries. That works well for financial transactions and inventory tables, where collapsing the attributes of objects into single figures is often valuable or possibly preferable to simple photos. When dealing with boundary information the complete opposite applies: display of information as simple screens of matrix numbers is completely useless.

    Additionally often boundaries make zero sense unless referenced to the land beneath them either through aerial photography or topographic maps.

    At a previous employer we actually commissioned a company to give us aerial photos of a local authority. This cost a not insignificant amount of money, probably only marginally reduced by the vendor having multiple channels of sale. Google and Microsoft are now very good at offering aerial and line interpretations for limited use to companies and individuals. This is absolutely great as it can be used as a background either to confirm the accuracy of other information or as data upon which to calculate further information (e.g. routing).

    So how can an individual get up and running with some of these basemaps?

    Sometime recently (I know not when exactly) QGIS changed its implementation of OpenStreetMap in the desktop application – rather than being an additional plugin, OpenStreetMap provision is now included on install.

    Here I am working with QGIS version 3.10

    Now you should be presented with the Data Source Manager Dialog which looks like this

    Now expand the XYZ Tiles

    You can then double click on any one of the three and the WMS will be transferred into the layer panel

    Layer > Data Source Manager > XYZ

    Thank you to Google and Microsoft and OSM and QGIS for a great implementation.

    Long Passwords : Evidence showing why long passwords are a simple security principle

    Creating long passwords is one of the easiest ways of improving the security of your applications and resources – here's a graphic that estimates the time it might take to brute force a password depending on the complexity of the character set and its length.

    Secure Password Generator

    Last Pass Password Generator

    Norton Password Generator

    1Password – random password generator

    bitwarden random password generator

    AVAST Password Generator

    MariaDB – Add calculated field that shows day name

    ALTER TABLE tablename 
    ADD COLUMN dayname VARCHAR(20) GENERATED ALWAYS AS (DAYNAME(startdate)) 
    STORED;

    This query will add a new column called dayname to the tablename table. The column will be of type VARCHAR with a maximum length of 20 characters. The GENERATED ALWAYS clause tells MariaDB to calculate the value of the column each time a row is inserted or updated. The AS keyword specifies the expression used to calculate the value, in this case the DAYNAME function applied to the startdate column. Finally, the STORED keyword tells MariaDB to store the calculated value in the table so that it can be retrieved more efficiently.

    Note that in MariaDB the STORED keyword is accepted from version 10.2 (earlier versions use PERSISTENT); the 5.7.6 requirement applies to MySQL rather than MariaDB. If you are using an older version, you can still add a calculated field using a trigger or a view.
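
    For reference, the view alternative mentioned above might be sketched as follows for versions without generated-column support (vw_tablename is a placeholder name, and it assumes the generated column has not already been added to the base table):

    CREATE VIEW vw_tablename AS
    SELECT t.*, DAYNAME(t.startdate) AS dayname
    FROM tablename t;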