Postgres : Create Table SQL Syntax Best Practice

For primary keys the current recommendation is to use an identity column during table creation.

It explicitly conveys the purpose of the column and, unlike the older serial pseudo-type, follows the SQL standard, so it should be used where possible. In theory this should make the SQL more portable as well.

Depending on your Postgres version you may or may not need the following extension to create UUIDs – gen_random_uuid() is built into core Postgres from version 13 onwards, but on older versions it is supplied by the pgcrypto extension (I needed it).

CREATE EXTENSION IF NOT EXISTS pgcrypto WITH SCHEMA public;

So here is standard code for creating a table with three columns

CREATE TABLE t001example 
(
pkid bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
genuuid uuid DEFAULT gen_random_uuid(),
sitename varchar(500)
);

And these are example values
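As a rough sketch of what to expect (the output below is illustrative – identity values count up from 1 and the UUIDs will be random):

INSERT INTO t001example (sitename) VALUES ('Head Office');
INSERT INTO t001example (sitename) VALUES ('Depot North');

SELECT * FROM t001example;

-- pkid | genuuid       | sitename
--    1 | (random uuid) | Head Office
--    2 | (random uuid) | Depot North

Note that because pkid is GENERATED ALWAYS you cannot insert a value into it directly without adding OVERRIDING SYSTEM VALUE to the INSERT.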

cPanel – Just gets better and better

Good self-service web hosting management software is just a game changer for DevOps work.

Every year I learn a bit more and every year they probably improve it a bit more. I am really very far off the cutting edge, so I can’t speak to cutting-edge improvements of the whole platform, but I can talk about the options that I am continually learning.

  • Being able to quickly create, manage and backup additional WordPress sites for zero cost
  • Being able to quickly create, manage and backup additional MySQL databases for zero cost
  • Being able to host additional CRUD applications for zero cost (with thanks to PHPRunner)
  • Being able to create and manage email addresses easily

    AND NOW……

Being able to quickly manage and update PHP versions is just amazing…

    cPanel:

    Log into cPanel

  • Under Software click MultiPHP Manager
  • Make a note of your current PHP version on the domain
  • Select an updated version of PHP and click Apply
  • Visit your site to confirm that the change has not broken anything.
  • If the site is broken, you can switch back to the previous PHP version

Yes, it’s that easy.

    Things I want next
    Front end cPanel at InMotion backend SQL Server in Azure
    Front end cPanel at InMotion backend Postgres Server in Azure
    Postgres backends in cPanel

    cPanel rocks
    and so does PHP and PHPRunner
    and AI when it can learn to spell and count!
    (yes the picture annoys me as well)

    Device Name – Quickest ways to Identify on Windows x64

Computing tries hard to obfuscate the device that is actually running the software you are using. That is particularly the case with cloud computing and it’s particularly the case with web-enabled software. Are you logging into the software that manages the database? Or are you logging into the database itself?

Irrespective, software gets installed on an individual machine, and it is often important to know the name of that machine: the software might have its own username and password requirement, and you need to know the username/password combination for the particular installation you are trying to access.

    This is often confusing for database servers – increasingly I am trying to remember a broad map of what computers are where and what servers are where so I can at least reference them accurately.

    Two ways to find the DEVICE NAME on a windows machine (there are others)

    In the Windows Search bar type

    ABOUT
    

    or in the Windows Search bar type

    DEVICE MANAGER
    

    About will give you this

    Device Manager will give you this

Note you can get to that About screen through File Explorer by right clicking on This PC and selecting Properties – but it is slower than just typing about in the Windows Search box.
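A third way, if you are comfortable with the command prompt, is the hostname command, which prints the device name directly (the machine name shown below is just an illustration):

C:\>hostname
DESKTOP-AB12CD3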

    GeoServer and ORACLE – WFS & WMS Config Tips

    This post is a brief description of my findings on setting up Geoserver from scratch and some particular points on setting it up for an Oracle connection. I hope it will prove useful for others.

    INSTALLATION
    GeoServer with ORACLE database connection support
Geoserver Postgres (PostGIS) connection support comes with the default installation. This is NOT the case for Oracle or SQL Server; therefore, for enterprise purposes it is a requirement to install Geoserver WITH the Oracle and SQL Server extensions if you wish to support these database connections.

I’ve done a post on Geoserver installation on Windows (link here) that talks specifically about SQL Server, so I will not go into a lot of detail about installation here. See below for links to the Geoserver Docker image. I didn’t install the Geoserver I describe here, but I can confirm that we have been successful in including Oracle as a plugin to our Geoserver installation and we are able to connect.

    Official Geoserver Docker Image GITHUB

Below is a recent (Feb 2024) list of Docker parameters available at install, including Oracle and SQL Server. Please seek advice on the use of Docker for installation if you are not familiar with it already.

    GENERAL SETUP

    Contact Information – General Setup
On the Welcome screen there is a line that reads For more information visit {organization}. To change this from the initial meaningless variable setting, go to Contact Information and fill out your organisation.

    Difference between Save and Apply – General point

In geoserver admin there are a lot of pages where there is a Save and an Apply button. Most users, initially at least, find this confusing.

    In GeoServer, the terms “Save” and “Apply” serve different purposes:

    Save:
When you click “Save”, it permanently stores the changes you’ve made to the configuration. These changes are saved in the GeoServer configuration files, so “Save” ensures that your modifications persist even after restarting GeoServer or rebooting the system. It’s like committing your changes to a version control system – it makes them permanent.
    Apply:
“Apply”, on the other hand, is more immediate. When you click “Apply”, it takes the current configuration (including any unsaved changes) and applies it dynamically. The applied changes take effect immediately without requiring a restart or permanent storage. Think of it as a temporary adjustment that affects the running instance of GeoServer but doesn’t alter the saved configuration. In summary, “Save” makes changes permanent, while “Apply” applies them immediately without saving them to the configuration files.

    The first thing you will want to do is determine if you can actually connect to your ORACLE database.

    ORACLE required parameters for database connection. Before you continue make sure you have the following

    ORACLE host name
ORACLE port (default 1521)
    database name
    user – case sensitive
    password – case sensitive
    Geoserver installed with Oracle extension

It is a good idea to get your DBA to set up a read-only user if you are connecting to a database that is part of a vendor-supplied product: you really should not be editing the database through anything except the vendor’s UI, and Geoserver can be set up to allow editing of data. Geoserver has powerful security configuration included, but conflicting settings can arise that may alter the intended security levels, making it safer to enforce read-only least privilege at the database level.

    ORACLE and Postgres – some SQL Differences
Please be aware that Oracle identifiers are case sensitive when quoted, but unquoted identifiers are folded to upper case when the SQL is passed to Oracle. I believe this complies strictly with the SQL Standard.

Postgres classifies tables, columns, views and other database objects as identifiers. SQL identifiers and keywords must begin with a letter. If possible, avoid special characters when naming identifiers: although Postgres may support them, special characters are not allowed by the SQL Standard, so their use might render applications less portable. Keywords are items such as SELECT / UPDATE / CREATE etc. The SQL Standard says that databases should not define a keyword that contains a digit or starts or ends with an underscore, so an identifier like t001users should be safe against possible conflict with future extensions of the standard. In Postgres the maximum identifier length is 63 bytes. Additionally, Postgres folds unquoted names to lower case (the opposite of Oracle), although some Postgres management UIs allow the naming of identifiers with varied case.
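A minimal sketch of the folding behaviour (the table name is hypothetical):

-- Postgres folds unquoted identifiers to lower case
CREATE TABLE T001Users (pkid int);   -- actually creates t001users
SELECT * FROM t001users;             -- works
SELECT * FROM "T001Users";           -- fails: relation "T001Users" does not exist
-- Oracle does the reverse: an unquoted CREATE TABLE t001users creates T001USERS,
-- so a quoted lower-case reference would fail there instead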

    UPSHOT for Geoserver (and anything else for that matter)

    If you are having difficulty referencing tables or views in any enterprise database – experiment with the case sensitivity in the query. There might be some folding of object names when passed to the server.
This characteristic should be borne in mind when using SQL Views in Geoserver.

    ORACLE Store setup to get the most out of Geoserver
    Login to Geoserver admin panel using your admin account

    On the left hand side under Data hit Stores

    Assuming you have successfully installed Geoserver with the Oracle extension you should be presented with the following dialog and in particular the items highlighted below.

Oracle NG uses a standard Oracle driver (ignore the other options – this article will not explain them)

    Select Oracle NG

    Use the default settings for most things but IMPORTANTLY ensure Expose Primary Keys is ticked

    NOTE from our brief testing we found that exposing primary keys seemed to be one factor that improved stability in our Oracle connection when dealing with QGIS.

    QGIS WFS set up to improve Oracle Connection Stability
It is recommended that you have a copy of QGIS to test the setup of things like WFS.

When we first started displaying the Oracle spatially enabled table in QGIS via the WFS service, we found that the layer would initially display, but if we left the project and went back in, for a period of time the layer would not be visible. This seemed to be temporary and after an unpredictable amount of time the layer would reappear. The attribute table remained viewable but polygons were not visible.

We fixed this by ensuring Expose Primary Keys (see Geoserver layer configuration) was ticked and, within the QGIS WFS Connection Configuration, ensuring in the WFS options section that the version is set to match the linked WFS service. This might not be necessary in all QGIS versions.

    LAYER Config
    Useful background information (partly written by co-pilot)

    Enabled and Advertised parameters – Layer Config
    In GeoServer, the Enabled and Advertised checkboxes play distinct roles in layer configuration:

    Enabled:
When a layer is enabled, it means that the layer is available for requests (such as WMS GetMap or WFS GetFeature). If a layer is not enabled, it won’t be accessible for any kind of request. However, it will still appear in the configuration (and in the REST config). Essentially, enabling a layer makes it operational and ready to serve data.
    Advertised:
By default, a layer is advertised. An advertised layer is included in the GetCapabilities request and appears in the layer preview. However, if you uncheck the Advertised checkbox, the layer will still be available for data access requests (like WMS), but it won’t appear in any capabilities documents or previews. In other words, non-advertised layers remain functional but are not explicitly listed in service metadata. Remember, enabling a layer makes it operational, while advertising it determines whether it appears in service capabilities and previews.

    Name and Title parameters – Layer Config
    Name:
The Name corresponds to the identifier used to reference the layer in WMS (Web Map Service) requests. It is primarily used for computer interaction and serves as an internal reference; the default name is usually the name of the table or the name of the referenced view. When creating a new layer for an already-published resource, the Name must be unique to avoid conflicts. Essentially, it’s the technical name associated with the layer.
    Title:
The Title provides a human-readable description that briefly identifies the layer to clients. Unlike the Name, which is for computers, the Title is meant for humans to read. It helps users understand the purpose or content of the layer. For example, if you have a layer representing a map of the USA, you might set the Title to “This is a map of USA.” Remember, while the Name is essential for system functionality, the Title enhances user experience by providing meaningful context about the layer.

    Security – the Name does not need to be the table or view name you are referencing. This string appears in the URL which may be viewable to the user – you may wish to obfuscate this by changing the name

i18n – Internationalization – Layer Config
    Internationalization (i18n):
    GeoServer supports returning a GetCapabilities document (used for describing available services and layers) in various languages. The i18n functionality is available for the following services:
    WMS 1.1 and 1.3
    WFS 2.0
    WCS 2.0
    The i18n editor allows you to provide translations for the title and abstract of various components:
    Layers configuration page
    Layergroups configuration page
    WMS, WFS, and WCS service configuration pages
Additionally, for Styles, there’s a separate i18n configuration (see i18n in SLD). By default, the i18n editor is disabled and can be enabled via the i18n checkbox.

    GetCapabilities Document Language:
The language of the GetCapabilities document can be selected using the AcceptLanguages request parameter. GeoServer’s response varies based on the following rules:
Internationalized elements include titles, abstracts, and keywords. If a single language code is specified (e.g., AcceptLanguages=en), GeoServer tries to return content in that language. If multiple language codes are specified (e.g., AcceptLanguages=en,fr), GeoServer attempts to return content in one of the specified languages.
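For illustration, a GetCapabilities request asking for French content might look like the following (the host and default local install are assumptions):

http://localhost:8080/geoserver/ows?service=WMS&version=1.3.0&request=GetCapabilities&AcceptLanguages=fr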

    Coordinate Reference Systems – Declared SRS and SRS handling – Layer Config

If you are in the UK ensure that Declared SRS is set to EPSG:27700 (British National Grid).

    SRS Handling
    In GeoServer, the SRS handling setting plays a crucial role in how coordinate reference systems (CRS) are managed. Let’s explore its significance:

    Declared SRS:
    The Declared SRS specifies the coordinate system that GeoServer publishes to clients. It represents the CRS that clients should use when interacting with the layer. Essentially, it’s the officially declared CRS associated with the layer.
    SRS Handling:
    The SRS Handling option determines how GeoServer should handle projection transformations when the declared CRS and the native CRS of the data differ. Here are the possible values for SRS Handling:
    Force declared (default): In this mode, GeoServer forces the declared SRS upon the data, overwriting the native CRS if necessary.
    Other options (not mentioned in the snippet):
    Reproject native to declared: GeoServer performs a reprojection from the native CRS to the declared CRS.
    Reproject declared to native: GeoServer reprojects from the declared CRS to the native CRS.
    None: No reprojection is performed; the data remains in its native CRS.
In summary, the SRS handling setting ensures that data is presented consistently to clients, regardless of the underlying native CRS. It’s a critical aspect of maintaining accurate spatial information in GeoServer.

    Bounding Box recommendations – Layer Config
    Scotland

    Min X = 010000
    Min Y = 533000

    Max X = 464000
    Max Y = 1215000

    West Lothian

    Min X = 287000
    Min Y = 655000

    Max X = 314000
    Max Y = 678000

    UK

    Workspace setup and visibility in WFS
Workspaces can allow editing or prevent editing of information, although to date I have struggled to isolate layers and workspaces so that some can be edited and some cannot. One thing that is important in our present system is to tick Enabled in the workspace Basic Info settings – without this the layers within the workspace were not showing up in the WFS.

    Restrictions on Layers : Important Note

    Important – Workspace Security settings override Layer Security Settings

If you need to have different layer security settings within one Workspace you will need to turn off all Workspace security settings, e.g. see below.

    Geoserver Installation (Configuration for SQL Server on Windows)

Once you have your data in an enterprise-grade database, sharing that information becomes important. Some vendors, in particular digital mapping SaaS vendors, want you to transfer your data into their clouds, but for a lot of authorities that is categorically impossible for large segments of their data. For example, about 3/4 of UK planning authorities run on software built by a company called IDOX Group plc which holds its data in Oracle, and this cannot be moved. For geographical data, here’s where Geoserver comes in.

    Geoserver Official Website

    From the website
    GeoServer is an open source server for sharing geospatial data.
    Designed for interoperability, it publishes data from any major spatial data source using open standards.

    GeoServer implements industry standard OGC protocols such as Web Feature Service (WFS), Web Map Service (WMS), and Web Coverage Service (WCS). Additional formats and publication options are available as extensions including Web Processing Service (WPS), and Web Map Tile Service (WMTS).

To put it another way, Geoserver allows programs that display and manipulate geographic information to display and edit that data while keeping it safe, secure and located in enterprise databases. It can connect to SQL Server / PostgreSQL and, I am told, Oracle, which are the main relational databases used by local authorities today, so once set up and configured it could be used to compare and contrast information in one location from various datastores – datastores that for historical reasons cannot be moved. Here I concentrate on SQL Server because we have a particular issue with new systems needing to connect to SQL Server, but it would be great if we could get it working with Oracle as well.

The following are my notes on installation and configuration of Geoserver on Windows linking to SQL Server – in this case a remote (not on the same computer) SQL Azure instance; I believe any remote SQL Server instance would be similar. This post is a detailed explanation of the installation of Geoserver locally on a laptop, but I believe installation would be exactly the same on a Windows Server machine, albeit additional steps would be required to publish to the web either internally or externally.

It’s all configuration, so remember your host names / IP numbers / ports / usernames / passwords / database, table and column names.

    To start this tutorial please ensure the following resources are available;

  • SSMS – installation of SQL Server Management Studio
  • SQL Azure instance with privileges to run SQL. Please note you will need to know the port number of your SQL server instance and have a Username and Password for your user.
  • Necessary privileges to set the firewall on the database which you will link to Geoserver
  • QGIS – installation – if you don’t already have a SQL Server instance with a geodatabase that you can use, you will probably need this to add at least one record to a geometry-enabled table. I plan to expand my notes on this but at present you might be able to skip it
  • Geoserver 2.24.1 – installation files
  • Enough privileges on the computer you are installing geoserver to download installation files and move files into a library subdirectory of the c:/program file/geoserver directory
  • WorkFlow Synopsis Overview
    1)Download and install Java runtime engine (as above importantly here I use 11)
    2)Download and install Geoserver (2.24.1 in this case)
    3)Test that the Geoserver Admin dashboard is up and working
    4)Download and configure Geoserver Extension for SQL Server library (SQL Azure example)
5)Good idea to set up a login specifically to your SQL Azure instance with least privilege – your SQL Server instance with geodatabases may already have suitable logins, in which case steps 5 and 6 can be skipped
    6)Good idea to create an example table with Geodatabase in SQL Azure – will be used as a test connection table – NOTE if you already have a SQL Server with geodatabases you could use that instead.
    7)White list IP within the SQL Azure instance firewall rules to Geoserver computer
    8)Opening Geoserver and setting the ‘Store’ to reference your SQL Azure database using the login setup in step 5.
    9)Adding a new layer from the Store made in point 8 and seeing if you can Preview the layer – Ideally you should have a table that has some features in it. Setting the WMS and WFS up in Geoserver with sufficient rights to allow editing (if WFS setup)
    10)I don’t explain it in my post below but the next step would be to test in QGIS to see if you can setup a project in QGIS which can pull from Geoserver against a basemap and you check that the Polygons are actually where you want them to be.

    Workflow Detailed
1) Download and install the Java Runtime via the JDK

    There are some complications here
Initially I tried to use the Java Runtime engine 17 and went through the complete Geoserver installation, but on testing I was getting a Java error when trying to view layers. After a short google I came across a Stack Overflow thread that indicated that Geoserver support for JDK 17 was experimental and I should use Java Runtime engine 11. Uninstalling 17 and installing 11 solved this.

Additionally there is the issue of where you obtain the Java Runtime installation from – there is the Adoptium location and there is the Oracle location. Oracle is making moves to charge for use of its Java Runtime engine, so it is important that you use the Adoptium resource to reduce costs.

    Later I will list the url to the JDK download for Java Runtime engine but it is important to realise that Adoptium is a safe source, well supported and noted as open source.

    https://adoptium.net/en-GB/members/

    so

    Google Search for Adoptium
    https://adoptium.net/en-GB/

Navigate to Other platforms and versions, target the Windows 64-bit JDK, 11 LTS version, and download the .msi.

    After a short delay you will be shown the msi which will be in your downloads folder

    Double click on it to install.

Now the only thing that is tricky here is to set up the JAVA_HOME directory, as follows.
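For reference, a sketch of setting JAVA_HOME system-wide from an elevated command prompt – the install path here is an assumption and will vary with the exact JDK 11 build you download:

setx /M JAVA_HOME "C:\Program Files\Eclipse Adoptium\jdk-11.0.22.7-hotspot"

(Open a new command prompt afterwards and check it with echo %JAVA_HOME%.)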

    Video Installing Adoptium JDK on Windows

    In particular note this section

    2) Download and install Geoserver

    Go to the official Geoserver web page and follow the links to download the windows installer

Here I use the Windows version of Geoserver version 2.24.1, obtained January 2024 from the following url
    Geoserver Official Website

    See download button and then go to Windows Installer

    I chose the left option and then I chose the Windows Installer option

    This will download Geoserver-2.24.1-winsetup.exe to your download directory and you can then start the geoserver installation process

    Next open up the executable and follow the instructions

    Next agree the licence

Next we reference the java library installed in the previous step – if you have set the JAVA_HOME variable correctly it should automatically find JRE 11 for you and place it within the path reference below. I don’t have a screenshot of the JRE 11 reference here as on my first installation I referenced the Oracle 17 JRE (note I went back, de-installed Geoserver, installed JRE 11 and then reinstalled Geoserver, both to avoid the proprietary Oracle runtime library and, importantly, to fix the issue I was facing of Geoserver not being compatible with JRE 17).

    The rest of the install from here is standard for a windows install

    Next setup the default admin password – defaults are admin / geoserver

    Set up port geoserver runs on – default is 8080

Next choose the execution type. I like to install Geoserver as a service.

    Confirm your preferences and then trigger install

Now you should see Geoserver on your local machine as a service, which you will need to ensure is running in order to access it.

    3) Test that Geoserver Admin dashboard is up and working at least locally

    If you have used the same settings as me open a browser and navigate to the following url

http://localhost:8080/geoserver/web

    Login with the username and password which is usually admin / geoserver

    At which point you should see something similar to the following

    4) Download and configure Geoserver Extension for SQL Server

At install Geoserver comes with the ability to connect to Postgres but NOT SQL Server, so we must install/configure a plugin extension to enable Geoserver SQL Server connections.

    Here we go back to the geoserver.org website and go to download but this time instead of choosing the windows installer we look to the Extensions section and choose SQL Server

This should download the geoserver-sqlserver extension plugin.

    Next copy all files with the jar suffix into the following directory

C:\Program Files\GeoServer\webapps\geoserver\WEB-INF\lib

Next restart Geoserver, go back to the localhost url and sign in

    http://localhost:8080/geoserver/web/?0

We are now very close to linking to SQL Server; prior to that we must whitelist our Geoserver so that your SQL Server instances will accept connections from it. Steps 5 and 6 are more about creating a user and ensuring you have a geodatabase with a georeferenced table – skip to 7 if you want to read about white listing in SQL Azure, or if you have an alternative database with the requirements already set up.

5) Next it’s a good idea to create a user with least privilege; this will be used to set up the link, so go to SSMS

    Least privilege User Login Setup SQL Server

Here allow the amount of access you wish users to have, remembering the principle of least privilege.
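As a minimal sketch – the user name and password below are placeholder assumptions, and SQL Azure lets you create a contained user directly in the target database:

-- run against the target database
CREATE USER geoserver_reader WITH PASSWORD = 'ChangeMe-Str0ngPassw0rd!';
-- read only access is usually enough for publishing layers
ALTER ROLE db_datareader ADD MEMBER geoserver_reader;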

6) Let’s create an example table with geometry that we will be connecting
    When I first tried connecting to SQL Server (SQL Azure in my case) I didn’t have any georeferenced tables so I created one and added a few records. If you have a database already with tables with geometry or geography you might not need to do this step.

So open SSMS, navigate to your database and use the following TSQL to create a table; here I call it t064.

    SET ANSI_NULLS ON
    GO
    
    SET QUOTED_IDENTIFIER ON
    GO
    
    CREATE TABLE [dbo].[t064](
    [PKID] [int] IDENTITY(1,1) NOT NULL,
    [geomcol] [geometry] NULL,
    [sitename] [varchar](30) NULL,
    PRIMARY KEY CLUSTERED
    (
    [PKID] ASC
    )WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
    ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
    GO

This is a really simple table with three columns; you might want to link to it through QGIS and just create a few records – I would try to use a default projection of 27700 if you are UK based. I might come back with an example table with records.

If you want to link to this table for testing then you should try to input at least one record here. I added records using QGIS (see also the sketch below) and I hope to come back to this post and update it to be more specific.
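If you would rather add a test record with TSQL than through QGIS, a minimal sketch along these lines should work (the coordinates are arbitrary eastings/northings in the 27700 British National Grid):

INSERT INTO dbo.t064 (geomcol, sitename)
VALUES (geometry::STGeomFromText(
'POLYGON((295000 665000, 296000 665000, 296000 666000, 295000 666000, 295000 665000))', 27700),
'Test site');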

    7) White List the IP within your SQL Server or SQL Azure instance.
    Setting IP White List in SQL Azure

    8) Opening Geoserver and setting the ‘Store’ to reference your SQL Azure database using the login setup in step 5

    Navigate to the local host url for Geoserver namely

    http://localhost:8080/geoserver/web/?0

    Go to store and then you should see the following page

Hit the Add New Store green plus button in the top left corner and you will be presented with the following screen

    Fill out as many details as you can

    For SQL Azure this is likely to be similar to the following

    Host = namevariable.database.windows.net (normally unique to instance in SQL Azure)
    Port = 1433 (sql azure default)
Schema = dbo (sql azure default is dbo but your database may be bespoke)
    User = uservariable (remember least privilege is a good idea)
    Passwd = passwordvariable (should be unique to your database)
    

    Save and Apply and then your new connection should appear in your store

    Here’s an example…. (see top line)

For the purposes of this tutorial I set up a login with db_owner rights to the database LandRegisterAzure – your mileage will vary.

    9) Next we add the layer which references the SQL Server

    Select Layers within the Data section of the menu (usually to the left of the geoserver dashboard) then hit Add a new layer

You should get the following window

From the drop down select the store which references your SQL Server. This will reveal all the tables and views in the database; scroll through them to the table or view you wish to publish and hit the highlighted Publish text in the column marked Action. (I’ve done that here already, so under Action our table is marked Publish again and there is a tick in the column titled Published.)

    We can now quickly test to see if things look like they are working by going to Layer Preview.

    Look to the left hand side and select Layer Preview

Scroll down through the list and identify the layer that you have just added. I now select the OpenLayers option.

A new browser tab will open, and if you have successfully configured the SQL Server store you should be presented with your layer – without any background.

10) I don’t go into it here but next I will try to write up my notes on QGIS and Geoserver setup – on my laptop I managed to set up a WFS to my SQL Azure instance through Geoserver, where the layer was added to a project and I was able to edit information in my SQL Azure database through QGIS, and the edits were displayed correctly against a basemap in the right place!

    –/END/–

    Everyone gets beaten by Complexity

I don’t do that many strategic posts these days (compared with my first posts) but this is really a reminder to myself to always seek out the simplest solution, especially when configuring systems and writing code. Less is generally better. The above picture was a DALL·E production and the below graph is the simpler one, which QED actually makes sense. I would suggest that the line moves to the right with improved education of staff / time / money and number of staff, but eventually productivity dramatically drops with complexity no matter how much money, time or people you have.

    Postgres – Export a Spatial database and then Importing again (pg_dump + psql utilities)

Let’s take this from the beginning. The first thing you need to do is be on the physical machine where the database is running.

    Next check that the database server is up and running

    To do that

    Go to the windows bar click on the search icon and type in

    Services

Scroll down to P and look for the name postgresql, then look to the Status column. In the above example there are two, Postgres 11 and Postgres 14; of the two only 14 shows as Running.

    There are several Utility executables that come with every postgres installation.

The standard installation setup puts these into a directory equivalent to the following;

C:\Program Files\PostgreSQL\14\bin

We need to navigate at the command prompt to the location of the executable we are interested in. Technically you could qualify the path to the executable when running a command, but I tend to navigate to the location.

So firstly, via the search button, go to the command prompt

    cmd

    cd C:\Program Files\PostgreSQL\14\bin

pg_dump -U postgres -W -F p -d dbspatialhub > c:\sql\dbspatialhub.sql

and here is a string that works if your database is on the local server with a non-standard port – substitute your port for 5433

    pg_dump -U postgres -h localhost -p 5433 -d dbsource > c:\sql\dbsourceexport.sql

Run this command – you will note that I qualify the output directory here. I prefer to do this as I don’t particularly like writing and deleting files in the program directory; I prefer that directory to be reserved for standard files, reducing the likelihood of me accidentally deleting an important file.

You should be presented with the Password: prompt – input the password for the specified user (-U) to complete.

    Success will be shown as a return to the former prompt with no message.

and running a DIR on the directory, look for the sql file…

Next, from either the psql prompt or from within pgAdmin4, create a database you are going to import the sql into and ensure that the postgis extension is installed. Remember this is the psql prompt and not the DOS command line prompt.

CREATE DATABASE dbspatialhubnew;
\c dbspatialhubnew
CREATE EXTENSION postgis;
    

    Now you have to go back to the command prompt (and the correct directory).

    If you are on a different port add the following string -p 5434

    psql -U postgres -d dbspatialhubnew -f c:\sql\dbspatialhub.sql

    After being asked for the password the import will start and continue showing a list of various commands and data being executed and imported.
    Similar to below.

Now you can go back to a client such as QGIS and link to the database to start confirming a successful import.
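A couple of quick sanity checks you can also run from psql or pgAdmin4 against the new database (assuming the postgis extension installed cleanly):

SELECT postgis_full_version();          -- confirms PostGIS is installed and which version
SELECT count(*) FROM geometry_columns;  -- how many spatial columns were registered by the import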

    Congratulations you have copied a database.

It should be noted that, as it stands, I am having problems importing the layers table, which holds the formatting of the boundaries for tables.
    I will continue to work on this.

MS Azure & Web Applications – TSQL Create a check constraint on an nvarchar column specifying a minimum length, in the DB and through the front end, to enforce OWASP security

    Double security!
Enforce length in the database and then through the front end by validating – in this case – password values (PHPRunner example).

    First the backend
You have an nvarchar(128) column in MS Azure SQL Database (or any Microsoft SQL Server database) and you wish to add a check constraint that does not allow a value in the column to be less than, let’s say, 15 characters long.

    tablename = users
    field = userpassword

ALTER TABLE dbo.users
ADD CONSTRAINT [MinLengthConstraint]
CHECK (LEN([userpassword]) >= 15);
-- LEN counts characters; DATALENGTH counts bytes (two per character for nvarchar),
-- so a byte-based equivalent would be DATALENGTH([userpassword]) >= 30
    

And then a good way to check the constraints on an individual table is using TSQL through SSMS for MS AZURE:

    SELECT TABLE_NAME, 
    CONSTRAINT_TYPE, 
    CONSTRAINT_NAME 
    FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS 
    WHERE TABLE_NAME ='users';
    

    Front End
    Now for double security in your web application use a regular expression to prevent users entering a weak password.

    ^(?=.*\d)(?=.*[a-z])(?=.*[A-Z]).{15,128}$
    
    ^               # start of the string
    (?=.*\d)        # a digit must occur at least once
    (?=.*[a-z])     # a lower case letter must occur at least once
    (?=.*[A-Z])     # an upper case letter must occur at least once
.{15,128}       # any characters, minimum length 15 and maximum length 128
    $               # end of the string
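Server side, the same expression can be reused – a minimal PHP sketch (the function name is my own; PHPRunner has its own validation hooks):

<?php
// Returns true only when the password satisfies the policy above
function password_meets_policy($password) {
    return preg_match('/^(?=.*\d)(?=.*[a-z])(?=.*[A-Z]).{15,128}$/', $password) === 1;
}
?>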
    

    OWASP – Open Web Application Security Project

    Setting up CRON job using cPanel

Automation is so important these days and the following details how to create a simple cron job using CPANEL to send out a regular email. You can automate anything, but I wanted to use an example that would send me something to show that it was working. So I set up the following example to send an email to myself every 5 minutes.

    See also
    How to Set Up and Run a Cron in CPanel

    I am going to focus on a single method. The above link details how to do it using the command line on a linux server which may be helpful. I am concentrating on using the cpanel interface here.

    Step 1 : Sign into your cpanel administrative panel.

    Step 2 : Look to the Advanced Tab and identify the Cron Jobs Icon

    Entering the Cron Jobs panel you will be presented with the following screen.

    Leave the Update email button…

Step 3 : Identify the PHP script you wish to run and load it into a directory, taking note of where you have put it.
    (You will need to point CRON to run your specific script)

If you don’t have a script try this one. It uses the php mail function to send out a simple email to whatever target email you want, which allows you to confirm the job is running.

    <?php     
    $to_email = 'targetemail@targetcompany.com';
    $subject = 'Mail sent using a Cron Job Script';
    $message = 'Hello and best wishes';
    $headers = 'From: noreply@yourcompany.com';
    mail($to_email,$subject,$message,$headers);
    ?>
    

Step 4 : Save the above PHP script and load it onto your server, taking a note of where you put it.

Step 5 : Set the schedule – I would start by setting this to run every 5 minutes just to test that it is working.

In the Common Settings there is a drop down with a range of settings, everything from once a minute to once a year – select once every 5 minutes.

    Step 6 : Set the command to be run.
    Now this is the only really tricky part to the whole thing. I haven’t found a great deal of documentation on the syntax of the command that you should enter. I found two different syntaxes that seemed to work.

    /usr/local/bin/php /home/youruserid/public_html/cronscripts/testrun.php
    

    or

    php -q /home/youruserid/public_html/cronscripts/testrun.php
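For reference, the equivalent entry in a raw linux crontab (every five minutes, using the same assumed path) would be;

*/5 * * * * /usr/local/bin/php /home/youruserid/public_html/cronscripts/testrun.php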
    

Then simply hit the Add New Cron Job button and wait for five minutes – if everything has completed successfully there will be an additional line in the cron jobs listing and you will get an email every 5 minutes.

    Happy Cron Jobbing.

One interesting point on my cpanel host was that you could limit the cron job to the domain, which may be useful as you become more proficient with the kinds of jobs you want to run.

    MYSQL / MariaDB – Useful SQL in both creating New Tables and Clarifying Default Parameters

Within an instance, to identify the default character sets and collations in all databases:

    SELECT * FROM INFORMATION_SCHEMA.SCHEMATA;
    

To alter the default character sets and collations for a database in a MySQL or MariaDB instance:

    ALTER DATABASE dbname CHARACTER SET utf8 COLLATE utf8_general_ci;
    

    Creating Table – Template SQL

    CREATE TABLE table1 
    (col1 INT(10) NOT NULL AUTO_INCREMENT, 
    col2 CHAR(30), 
    PRIMARY KEY (col1))
    ENGINE = INNODB;
    

    And a good source of data types are here
    Tech on the Net

There are unconfirmed reports that queries against TEXT fields are generally 3 times slower than against VARCHAR fields.

    Values in VARCHAR columns are variable-length strings. The length can be specified as a value from 0 to 255 before MySQL 5.0.3, and 0 to 65,535 in 5.0.3 and later versions. The effective maximum length of a VARCHAR in MySQL 5.0.3 and later is subject to the maximum row size (65,535 bytes, which is shared among all columns) and the character set used.

Note that VARCHAR(MAX) is a SQL Server feature, not a MySQL/MariaDB one – in MySQL the closest equivalents for very long strings are the TEXT, MEDIUMTEXT and LONGTEXT types.

And if you want to change the engine on a table retrospectively, see OPERATIONS, or the example below.
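For example, switching the template table above to INNODB retrospectively is a one liner;

ALTER TABLE table1 ENGINE = INNODB;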

    And a short note on your choice of collation

    From Stack overflow here

    Actually, you probably want to use utf8_unicode_ci or utf8_general_ci.

    utf8_general_ci sorts by stripping away all accents and sorting as if it were ASCII
    utf8_unicode_ci uses the Unicode sort order, so it sorts correctly in more languages
    However, if you are only using this to store English text, these shouldn’t differ.

Note further down there is a person that states “be very aware of this problem that can occur using utf8_general_ci”:

    “MySQL will not distinguish between some characters in select statements, if the utf8_general_ci collation is used. This can lead to very nasty bugs – especially for example, where usernames are involved. Depending on the implementation that uses the database tables, this problem could allow malicious users to create a username matching an administrator account.

This problem exposes itself at the very least in early 5.x versions – I’m not sure if this behaviour has changed later.”

    And here is some code that allowed me to reset the auto increment index on a table.

    ALTER TABLE my_table MODIFY COLUMN pkid INT(10) UNSIGNED;
    COMMIT;
    ALTER TABLE my_table MODIFY COLUMN pkid INT(10) UNSIGNED AUTO_INCREMENT;
    COMMIT;
    

I have tested it and it works with INNODB on MySQL.

Another way to do it might be to drop the pkid column and then recreate it

    Using

    ALTER TABLE mytable DROP pkid;
    ALTER TABLE mytable ADD pkid INT NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST;
    

    CPanel – Setting up an additional FTP Account

    As previously stated there are a lot of hosted services that come with CPANEL installed. We have already indicated how it is possible to create a subdomain that can be used to host php based sites.
If you are using a design environment you will want to set up an FTP account that can be used to upload your applications to. It is worth just making sure that you are familiar with this.

    Firstly log into your account at your host cpanel.

    Once configured it should appear in your FTP Accounts section here

    If you want to see the FTP parameters for upload hit the related Configure FTP Client

CPANEL only allows you to create FTP accounts against domains and subdomains that already exist, so with this done you should be ready to install software.

    SQL Azure – Reset Primary Key Identity/Index to Zero after Deleting all records

I was faced with a situation where I needed to set up an index for widgets. I made some attempts to import information and deleted those records from the table several times, and was faced with a situation where users might occasionally see the index. As a result I wanted the index to restart from 1 and, at least at the start, go up to the approximate number of widgets.

It’s a quick Google search to find this, but I wanted to reset the identity on a blank table.

In SSMS navigate to the database and open a new query window

DBCC CHECKIDENT ('dbo.t001table', RESEED, 0);
    GO
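To check the current identity value without changing anything, the same command can be run with NORESEED;

DBCC CHECKIDENT ('dbo.t001table', NORESEED);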

    New System Assessment : A Simple Set of Guidelines for judging material differences between Software

Here’s a short guide I put together for myself to help me assess the value of enterprise systems that I may be looking for or designing myself. It is not hard and fast, and some points may not be relevant to every application, but it should form a good basis from which to start. Generally a higher number is better than a lower number, but within a number the items are cumulative and of equal value.

    1)Paper based = Everything from single sheets to formal Books of Accounts
    2)Simple Digital = spreadsheet and other file based storage
    3)Cloud based Simple Digital = spreadsheet and other file based storage – this can include file based blob storage
    4.1)LAN relational database – normalised but otherwise fairly locked down
    4.2)LAN relational database + easy import and export of data other than spatial
    4.3)LAN relational database + report writer definable by users
    4.4)LAN relational database + spatially enabled
    5.1)Internet available relational database – normalised but otherwise fairly locked down
    5.2)Internet available relational database + easy import and export of data other than spatial
    5.3)Internet available relational database + report writer definable by users
    5.4)Internet available relational database + spatially enabled (noted not all applications require a spatial aspect)
    5.5)Internet available relational database + separate site available for public access view only
5.6)Internet available relational database + separate access to public access to edit and add information

    Requirements
    + UPtime / reliability
    + SPEED – after reliability very important
    + Good master details forms (sounds easy but bootstrap is not great for this)
+ GOOD Search – the larger the system the more important the search options are in it. Look out for things like Automated Objective Index creation (case sensitive search is more of a hassle than a benefit a lot of the time; odd wild cards, or searches not being wild by default, are problem areas)

    Important Points
    + Portability can you up sticks and move it to a different cloud provider
    + Non – Proprietary (Can be a gotcha if its really obscure or has weird security, often linked to portability)
    + two factor authentication
+ Responsive Design (not necessarily as important as you think in some applications – see GitHub / OpenLayers, which arguably don’t need it)
+ Dynamic saving – I really miss the fast dynamic saving environment of MS Access; the save button isn’t quite the same
+ Built in CRM – I generally like them, especially if data can go straight into the correct files. Increasingly I am designing systems with inbuilt CRM – I know this might not be to every organisation’s taste, but it is jarring to go between systems and normalisation between systems is usually very sub optimal – plus you frequently come across proof-of-negative problems when data required for a specific task is not held appropriately.
+ Satellite application for customers to enter information (why not let your customers maintain their information – it can be great and can empower customers; Fintech anyone?)

    = All the bells and whistles
    Spatially enabled internet available database with two factor authentication – report writer and easy import and export of spatial and attribute information with public access to edit and add information by customers (Possibly via satellite application).

    Extraction Transformation and Load (ETL) – some thoughts on a large IT transfer project

In 2017 I was involved in an important work project to transfer all the records in a legacy system that was being deprecated by the vendor (IDOX Group) into another maintained system. We were in some ways fortunate because both systems had been designed by a single company (IDOX) and they were encouraging us to transfer. We had delayed transfer for several years already but were aware that we now had to move. The vendor did have some tools in place, had staff dedicated to such transfers and was offering favourable consultancy rates. The amount of data was not horrendous in computing terms, but it was far, far beyond the remit of any sort of manual data correction, and the system was an absolute core system upon which several departments completely depended. These were systems that all departments are in from the moment they start the work day to the end. Generally it’s unusual if they are down for more than 5 minutes in a month, all work pretty much stops when they stop, and in no circumstances could they be down for more than a day without special dispensation and coordination to manage customer expectations.

The whole project was a success although it was challenging. Here is an outline of the steps we took. As ever, order is important in most of the steps. I had written something on this before, but consider this to be a more accurate rundown – see here

    Step 1
    Inform managers of all involved sections and ensure they are on board – identify and ring fence budget

    Step 2
Appoint project managers on the vendor and client side and
draw together a team to perform the transformation.

    Step 3
Draft a timetable of how long it will take, putting in place planning for tutorials on systems and consultancy.

    Step 4
    Request managers to put forward staff on all sides willing to be involved

    Step 5
    Identify any omissions in knowledge and start to identify how this can be remedied. Kick off and complete acquisition of said staff.

    Step 6
Meeting with lead staff to confirm buy-in. Request buy-in from staff, including ring-fencing of holidays etc., to ensure key staff are available at required times.

    Step 7
    Set up test systems that all individuals have access to and ensure that the old and new systems can be viewed simultaneously by individuals. Ensure that the domain specialists can identify processes that will be mirrored from the old system to the new system

    Step 8
Give DBAs, or those that will be doing the data transfer, access to the source databases so that they can start thinking about how they will pull out the information.

    Step 9
    Training for all individuals concerned in new systems.

    Step 10
In the new system start tasking individuals with how they are going to do the simple processes – e.g. register a record, approve a record, alter a record and get reports out. If possible allow new champions to start to define things like reports.

    Step 11
Start making up any new lookup fields compared with the old lookups, and also start tasking individuals with creation of the reports and letters that will need to be done.

    Step 12
Start mapping the data from the old system to the new system – excel spreadsheets can be used for this, showing the data coming from the old system and what fields it is going to go into in the new system. Divide this task up between domain users – this step needs to be done after old and new systems are on domain users’ machines. As part of this, the applications in question should if possible expose the table and field names of the source and target fields. With the systems we were involved in this was possible for both the old and new systems.

    For each form on the two systems try to identify the below

    Source table.field Target table.field

    Also get them to map the lookup table values if direct transfer is not possible or if alias id are used in these lookups.

    Source table.field.value=Equivalent.Target table.field.value

    Step 13
Give both mapping documents to the ETL people to allow them to start writing the queries. It is unlikely that there will be a straight transfer across from table to table. While it is to be expected that field and table names will be completely different, the table structure will also in certain places be different; in this respect it is good to have a really nice schema diagram of both source and target.

    Step 14
Allow data individuals to write scripts that can be run against the present live system – if necessary this doesn’t need to be truly live: you could copy every night and then perform the run on a one-day-old database backend, which is what we did. This means work can go on in the old system and the transfer can then be run at the touch of a button.

Encourage the DBAs to run these scripts every day to ensure that running them for go-live is absolutely no issue. Our scripts only took about half an hour to run so this wasn’t a problem. I was personally involved in writing the SQL for those, and I had systems in place to cross-tab the amounts coming into each new table, so I could see new records and information entered in the old system trickling through and being transferred.

    Step 15
    Test data input into new system

    Step 16
    Check test data input into new system with reference to domain users.

    Step 17
Confirm the go-live date and ensure staff are available for issues

    Step 18
Go live to production and start all new procedures; ensure technical staff and domain key players are on hand to find flexible solutions to issues

    Step 19
Project review, ongoing maintenance and improvement of the new system

    Step 20
After a suitable time, turn off the old system if possible.

    Simple Bat File to open multiple Web Pages in One Browser Window with alternate Tabs

    Simple but can be useful

    Useful if you are wanting to open multiple tabs in a browser window at the same time

@echo off
rem each start command below opens its URL as a new tab in the default browser
start "Simple Search" "https://planning.westlothian.gov.uk/publicaccess/search.do?action=simple&searchType=Application"
start "Google Maps " "https://www.google.co.uk/maps/@55.8625775,-3.6759593,17z"
start "WLC Spade" ;
start "idoxEDRMS Login" "http://cc-dmsapp-01:8080/IDOXSoftware/secure/IG_Main?url="

    MS SQL Azure – Take Complete Backup of Azure Database (Structure and Data)

    SQL Azure as part of the service offers a number of differing back up options as standard – however you may wish to take additional backups which for instance you can load onto a local version of SQL Server (Express or Enterprise). Here’s a quick rundown of one method of doing this.

    Firstly open up your version of SQL Server Management Studio and navigate to the instance and database that you wish to backup

Highlight the database, in this case DB001, then right click and select Tasks > Export Data-tier Application

Choose an appropriate location to put the backup file (this export produces a .bacpac file)

    Review the summary of items that will be backed up and then hit Finish

    There will be a run down of how the export has worked

    And just a check to see the exported file in the directory

    MS Access like development environments for the Web – 3 alternatives

So you would like to construct simple applications that you can at the moment create in MS Access, but you want to do it on the web. By that I mean you would like to create a data-driven application with somewhat complicated forms that can be accessed by anyone, anywhere in the world, through either IE or Chrome, with a simple login screen at the front to prevent just anyone accessing the applications, collecting the information into a database. What are your options for programs that will assist you in an MS Access like environment – rather than going the full IDE deep dive Visual Studio route – for what I consider to be a reasonable fee?

From my experience the unicorn of Access on the web is slowly coming to fruition, BUT for the vast majority of people with a budget similar to that for MS Access – let’s say £200 ($250) a year for unlimited applications – there is simply nothing which is quite as easy and powerful as MS Access. Some are pretty close but not as stable, and they typically require several magnitudes more configuration. WYSIWYG design isn’t quite as WYSIWYG and stability is a few orders lower than on the desktop.

    What you are probably looking at can typically be described as either RAD tools for the Web, a Low Coding Platform or something called a Code Generator any of those phrases can be useful for Google.

    Assuming you don’t have your own servers whatever you do you will need to spend money on a web host.

The minimum this is likely to cost you is in the region of $15 a month. If you don’t want to spend the next 6 months learning about the insides and outsides of frameworks, then I would suggest you go to one of the below three providers, who all provide complete environments set up and ready to go with their particular generators pre-installed. This is good value: it is extremely difficult to beat these guys on cloud hosting costs, and unless you are very advanced and have very particular requirements it’s a waste of time to try. All three of the below providers will allow you to create a limitless number of applications, albeit you are limited by the space you hire on your web server. Similarly, distribution will be limited by the quality of web server you sign up for. In all likelihood, if you have few users, the coding front ends of your applications are unlikely to limit the number you create – more likely limits are the size of the databases you are attaching them to and the sheer time you have available to create applications.

For a period I was paying a monthly amount for a Nubuilder Pro hosted platform. This performed well and I could create an unlimited number of applications. As it was hosted, I skipped the step of learning some of the deeper parts of the initial configuration. I hope at some point to go back to this. It is open source and seems well maintained with a very dedicated developer. The developer re-wrote much of it, and as at March 2019 its latest re-incarnation is Nubuilder Forte.

Be warned: n-tier web applications do not play as friendly as the desktop. You WILL be slower to construct applications than you are on the desktop, getting into it WILL take time and a bit of commitment, you WILL have far less flexibility regards coding, there WILL be fewer people about to ask questions, and there is far, far less WYSIWYG design capability; error trapping is poor and errors are far more likely to be fatal. The really big warning is that on release of new web frameworks you may not necessarily be able to update without a full site re-design (a fact that comes as a nasty surprise to many CIOs and Project Managers when they realise that they are locked into front end system replacements every 4 or 5 years). Know how to get data out of the back end to your local environment, and accept that the front end is ephemeral and not likely to last in the same way as your desktop applications. (Your database will last, but don’t expect to be running it through the same front end ten years from now.) Accept that you will now have monthly or annual rental fees for cloud provision.

    That said, designing with these tools is significantly faster than it has ever been.

    Scriptcase and ASPRunner.NET (Xlinesoft also produces a PHP equivalent, PHPRunner) have free downloads that are good for getting started.

    Commit to one and go for it – I've got both PHP and ASP.NET solutions. Nubuilder only connects to MySQL, whereas Scriptcase and ASPRunner.NET connect to pretty much any database. I started with Nubuilder and am using ASPRunner.NET as well because, importantly, it connects to SQL Server and it was easy to get up and running in MS Azure. Scriptcase is PHP based, and I believe the applications it builds require some kind of security program to sit on the web server, which put me off – they do however have pre-configured hosting that you can sign up for, and their IDE is web based, which could be a winning advantage. One of the great advantages of ASPRunner.NET is that it produces an open web application that should run on all stock servers. I found Nubuilder Pro (now Nubuilder Forte) really conceptually elegant: despite its rather drab looks it is incredibly flexible, although the applications it produces are limited to MySQL and are non-responsive (but being non-responsive you can get more detailed forms!). I would probably be able to change its look if I was prepared to get my own server and install everything on it myself, but that is not something I have time to do at present.
    Nubuilder hosts its IDE in the browser, which again is an advantage. ASPRunner.NET is more traditional in that a program running on your desktop creates the plumbing of your application, which you then push to a server for publication. This has the advantage that you get to see the plumbing in the background, which makes backing up the site easier but publishing slightly harder.

    You may have heard of other generators / design applications out there – for example Zoho Creator, Alpha 5 or Outsystems. These hold your hand even more, but as a result are even more proprietary and won't fit in that budget of £200 per year (by quite a long way!).

    Some further information on costs: Nubuilder, being open source, could in theory scale for very little money, especially if you already have your own servers. Scriptcase and Xlinesoft's Runner products have an initial fee followed by an annual subscription – you may be able to configure things so that you can create unlimited applications for that one fee (if you have good access to web servers), but it is likely that initially there will be some kind of variable cost per additional application you wish to build. I am using MS Azure with ASPRunner.NET: a developer database costs me about £5 a month, with each application hosted in a web application service which again costs £5. With both Scriptcase and ASPRunner, if you stop paying the annual fees your applications will continue to work; you just will not get version upgrades. You can step back into version upgrades later, but you may need to restart your subscription with an additional fee.

    Nubuilder Forte Link

    Scriptcase Link

    ASP Runner – PHPRunner and ASPRunner.Net Link

    Good luck

    Upload XLS information to a specific table or new table within an SQL Azure Database

    I have started experimenting with Microsoft Azure – if you haven't already, you can get a free experimentation account here:

    Microsoft Azure Trial account with £125 credit

    This gets you registered on Microsoft's cloud, and after a free trial period you will be able to continue with a Pay As You Go account which, depending on the services you go for, can start at very cheap rates.

    In order for this to work you will need the following
    1) Microsoft Azure account
    2) An SQL Azure Database
    3) SQL Server Management Studio downloaded and installed on the machine you will be uploading from – this can be obtained from the SSMS download link, but be warned it's over 800MB. Here I use SSMS 2016
    4) Knowledge of your server name – this is generally [Yourname].database.windows.net
    5) Login and Password (I use SQL Server Authentication)

    Testing things out, I have been using the Web Apps service to run a website and connect to an SQL Azure database – both on the cheapest options.

    What makes the website particularly cheap is that it can be stopped and started, and because you pay by the minute you can get a powerful demonstration site up and running and stop it immediately after the demonstration for very little money.

    So after having created an Azure SQL database (Microsoft Create Azure Database Tutorial) I wanted to get a decent number of records into it, which would be the starting position when taking on most work.
    Here I use the Lichfield planning application information previously referred to in this post: QGIS Import. I took the 45,000 records of planning applications from the shape file by opening up the dbf file of the shape collection in Excel 2003 and saving it in Excel format. This file will be used later to import into the database.

    With your Excel file ready, the next step is to open up SQL Server Management Studio and connect to your Azure database. The username and server name parameters will have been set up when you created the Azure database – it is important that you record these details somewhere at creation time.
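    Once connected, it is worth a quick sanity check that you are pointed at the right server and database before importing anything. A minimal sketch – DB001 is the database name used in this walkthrough, everything else is standard T-SQL:

    -- Run in a new query window in SSMS once connected
    -- Confirms which server, database and login you are actually using
    SELECT @@SERVERNAME AS ServerName,
           DB_NAME() AS CurrentDatabase,
           SUSER_SNAME() AS LoginName;

    If CurrentDatabase comes back as master rather than DB001, switch to the right database in the drop down at the top left of the query window before going any further.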

    Next highlight the database – in my case this was DB001 – and right click to get to Tasks.

    At this point you enter the Import Wizard dialog boxes. Having passed the opening welcome screen (a screen that can be turned off for subsequent runs), you should hit the first screen that allows you to define the format of the file to be imported.

    The next step is about the only one that is slightly confusing – you are given a number of different options for the destination. SQL Server Native Client 11.0 worked for me.

    Now complete the next dialog using the parameters specific to your database server and your database name.

    The next dialog asks whether you want to copy all the information or write a query to filter what is imported. For my example I chose to copy everything. Here I select the database and am then able to see the from and to destinations.

    If you wish to import into an existing table, use the drop down to select a table from the database – if you wish to import into a new table, you can type the name of the new table within the square brackets.
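    If you would rather control the column types yourself than accept the wizard's guesses (it tends to default everything from Excel to fairly generous text columns), you can pre-create the target table and pick it from that drop down instead. A minimal sketch only – the table and column names below are hypothetical, loosely based on the planning application data used here:

    -- Hypothetical pre-created target table for the planning records
    -- Choosing the types up front avoids everything arriving as text
    CREATE TABLE T011PlanningApps
    (
    appref nvarchar(50),
    sitename nvarchar(500),
    decisiondate date
    );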

    Here I create a new table called T010Test, import, and then continue through the Import Wizard dialogs until, on completion of the import, you should see a screen similar to that below. If you are copying into a table that already exists, it is possible to go into Edit Mappings, which gives you a preview showing how the fields map and to what extent the mapping will be successful. You may wish to alter the names of columns to match your target table at this point.
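    Once the wizard reports success, it is worth checking the result from a query window. A minimal sketch, assuming the T010Test table created above and the roughly 45,000 source records mentioned earlier:

    -- Row count should be close to the ~45,000 records in the source file
    SELECT COUNT(*) AS RecordCount FROM T010Test;

    -- Eyeball a handful of rows to check the column mapping looks sane
    SELECT TOP 10 * FROM T010Test;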

    Microsoft Azure – moving to the cloud

    I must confess I do love MS Access, and I have a number of projects that I suspect will always remain as they are – highly complex but very personal projects that I really don't need to share, and some that, because of their sensitivity, I would never want to share. But despite the difficulty of designing really user friendly interfaces on the web, the ability to distribute your applications to everyone in the world is a very powerful attractor, and quite clearly going to be a game changer.

    I have therefore started to experiment with data driven online applications and so far I am impressed.

    To get started you will first want to sign up with an Azure account.
    Azure Portal Sign In

    I can see that in the future I will probably be using Access as a desktop platform for writing queries and personal content curation, with SQL Azure for projects where I need to communicate with others, linked through the App Service in the Azure portal.
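    As an illustration of that division of labour, this is the kind of T-SQL I might send from the Access side (for example via a pass-through query, which hands raw SQL straight to the server). A sketch only, reusing the hypothetical planning table from the import example above:

    -- Sent from an Access pass-through query to SQL Azure
    -- The aggregation runs on the server; Access just displays the result
    SELECT YEAR(decisiondate) AS DecisionYear,
           COUNT(*) AS Applications
    FROM T011PlanningApps
    GROUP BY YEAR(decisiondate)
    ORDER BY DecisionYear;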

    The Economics of Immortality (part 2) and a real reason for moving platform.

    The announcement in December 2014 that Microsoft was releasing Visual Studio Community to virtually everyone for free, and then the further announcement in February 2016 of Microsoft's purchase of Xamarin and its subsequent release with VS Community edition, suggest to me that my theory that the cost of development tools should steadily reduce to next to nothing is on the button. Still, it comes as a surprise that it is happening so quickly. This can only be a good thing, especially considering that increasingly any application can be developed locally and sold globally, or at least published internationally.

    In my opinion it is looking more and more likely that the best reason for moving legacy systems to new platforms will in future be not cost savings but the revenue possibilities of the resulting systems – in a phrase, Software as a Service. But why would anyone want to create their own services rather than paying to be a customer of others? Two reasons: first, at present the field is wide open and in many areas there is a complete lack of rich software applicable to an individual's problem; second, the caution with which some individuals relinquish their information and knowledge of their tools – in that respect revenue generation might just be a nice optional bonus. I do think different categories of software will progress at different rates, and perhaps we can look to accounting software to see the future for other categories. Everyone is interested in money, and the sheer ubiquity of the requirement for accounting support has focused resources heavily on tools, resulting in a healthy market of online options for customers. There are still sticking points, particularly cost when being charged on a per seat basis, but it is no longer really practical for any individual organisation to develop its own accounting software when you can choose on both quality and price in the wider national markets. Outside of accounting, however, the choice of products seems somewhat scarce.

    So it seems it's early days for Software as a Service, but perhaps the question should be not what legacy systems need replacing but…

    What tool do you have at your work which you think with a redesign could be used by others by publishing to the web?