Postgres – Exporting a Spatial database and Importing it Again (pg_dump + psql utilities)

Let's take this from the beginning. The first thing you need to do is be on the physical machine where the database is running.

Next check that the database server is up and running. To do that, go to the Windows bar, click on the search icon and type in

Services

Scroll down to P and look for the name postgresql, then look to the Status column. In the above example there are two, postgresql 11 and postgresql 14; of the two, only 14 shows as Running.

There are several Utility executables that come with every postgres installation.

The standard installation set up puts these into a directory equivalent to the following (this assumes a default install of PostgreSQL 14 on Windows – adjust the version number to match yours);

C:\Program Files\PostgreSQL\14\bin

We need to navigate at the command prompt to the location of the executable we are interested in. Technically you could fully qualify the path to the executable when running a command, but I tend to navigate to the location.

pg_dump -U postgres -W -F p -d dbspatialhub > c:\sql\dbspatialhub.sql

and here is a string that works if your database is on the local server with a non-standard port (substitute your port for 5433)

pg_dump -U postgres -h localhost -p 5433 -d dbsource > c:\sql\dbsourceexport.sql

Run this command – you will note that I qualify the output directory here. I prefer to do this as I don't particularly like writing and deleting files in the program directory; I prefer that directory to be reserved for standard files, which reduces the likelihood of me accidentally deleting an important file.

You should be presented with a Password: prompt – input the password for the specified user (-U) to complete the export.

Success will be shown as a return to the former prompt with no message.

Run a DIR on the output directory and look for the sql file…

Next, from the psql prompt, create the database you are going to import the sql into, connect to it and add the PostGIS extension. Remember this is the psql prompt and not the DOS command line prompt.

CREATE DATABASE dbspatialhubnew;
\c dbspatialhubnew
CREATE EXTENSION postgis;

Then quit psql (\q) and run the import from the DOS command line:

psql -U postgres -d dbspatialhubnew -f c:\sql\dbspatialhub.sql

And as with the export, here is a string that should work if your database is on the local server with a non-standard port (substitute your port for 5433) – mirroring the pg_dump example above:
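psql -U postgres -h localhost -p 5433 -d dbspatialhubnew -f c:\sql\dbspatialhub.sql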

After being asked for the password the import will start, showing a list of the various commands and data being executed and imported.

Now you can go back to a client such as QGIS and link to the database to start confirming a successful import.

Congratulations, you have copied a database.

It should be noted that as it stands I am having problems importing the layers table, which holds the formatting of the boundaries for the spatial tables.
I will continue to work on this.

MS Azure & Web Applications – TSQL: Create a check constraint on an nvarchar column specifying a minimum length in the DB, then enforce OWASP-style validation through the front end

Double security!
Enforce length in the database and then through the front end by validating – in this case password values (PHPRunner example).

First the backend
You have an nvarchar(128) column in MS Azure SQL Database (or any Microsoft SQL Server database) and you wish to add a check constraint that does not allow a value in the column to be less than, let's say, 15 characters long.

tablename = users
field = userpassword

ALTER TABLE dbo.users
ADD CONSTRAINT [MinLengthConstraint]
CHECK (LEN([userpassword]) >= 15);

(LEN counts characters; DATALENGTH on an nvarchar column counts bytes, two per character, so a DATALENGTH version would need to compare against 30 rather than 15.)

And here is a good way to check the constraints on an individual table using TSQL through SSMS for MS Azure.

SELECT TABLE_NAME, 
CONSTRAINT_TYPE, 
CONSTRAINT_NAME 
FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS 
WHERE TABLE_NAME ='users';

Front End
Now, for double security, in your web application use a regular expression to prevent users entering a weak password.

^(?=.*\d)(?=.*[a-z])(?=.*[A-Z]).{15,128}$
^               # start of the string
(?=.*\d)        # a digit must occur at least once
(?=.*[a-z])     # a lower case letter must occur at least once
(?=.*[A-Z])     # an upper case letter must occur at least once
.{15,128}       # between 15 and 128 characters in total
$               # end of the string
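By way of illustration, if the front end is PHP (as in the PHPRunner example) the same expression could back up the form-level validation server side – a minimal sketch, with the submitted field name assumed:

<?php
// Assumed name of the submitted password field
$password = isset($_POST['userpassword']) ? $_POST['userpassword'] : '';
// Same rule as above: a digit, a lower case letter, an upper case
// letter, and between 15 and 128 characters in total
if (!preg_match('/^(?=.*\d)(?=.*[a-z])(?=.*[A-Z]).{15,128}$/', $password)) {
    die('Password does not meet the minimum complexity requirements.');
}
?>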

OWASP – Open Web Application Security Project

Setting up a CRON job using cPanel

Automation is so important these days and the following details how to create a simple cron job using cPanel to send out a regular email. You can automate anything, but I wanted an example that would send me something to show that it was working, so I set up the following to send an email to myself every 5 minutes.

See also
How to Set Up and Run a Cron in CPanel

I am going to focus on a single method. The above link details how to do it using the command line on a Linux server, which may be helpful; I am concentrating on the cPanel interface here.

Step 1 : Sign into your cpanel administrative panel.

Step 2 : Look to the Advanced Tab and identify the Cron Jobs Icon

Entering the Cron Jobs panel you will be presented with the following screen.

You can leave the Cron Email setting and its Update Email button alone for now…

Step 3 : Identify the PHP script you wish to run and upload it to a directory, taking note of where you put it.
(You will need to point CRON at your specific script)

If you don't have a script try this one. It uses the PHP mail function to send a simple email to whatever target email you want, which lets you confirm the job is running.

<?php
// Target address, subject and body of the test message
$to_email = 'targetemail@targetcompany.com';
$subject = 'Mail sent using a Cron Job Script';
$message = 'Hello and best wishes';
// From header so the message is less likely to be treated as junk
$headers = 'From: noreply@yourcompany.com';
mail($to_email, $subject, $message, $headers);
?>

Step 4 : Save the above PHP script and load it to your server, taking a note of where you put it.

Step 5 : Set the schedule. I would start by setting this to run every 5 minutes just to test that it is working.

In the common settings there is a drop down which has a range of settings, everything from once a minute to once a year; select once every 5 minutes.

Step 6 : Set the command to be run.
Now this is the only really tricky part of the whole thing. I haven't found a great deal of documentation on the syntax of the command that you should enter, but I found two different syntaxes that seemed to work.

/usr/local/bin/php /home/youruserid/public_html/cronscripts/testrun.php

or

php -q /home/youruserid/public_html/cronscripts/testrun.php
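Either way, combined with the "once every 5 minutes" setting, the resulting crontab entry amounts to a single line like this (path as per the examples above):

*/5 * * * * /usr/local/bin/php /home/youruserid/public_html/cronscripts/testrun.php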

Then simply hit the Add New Cron Job button and wait for five minutes – if everything has completed successfully there will be an additional line in the cron jobs listing and you will get an email every 5 minutes.

Happy Cron Jobbing.

One interesting point on my cPanel host was that you could limit the cron job to the domain, which may become necessary as you grow more proficient with the kinds of jobs you want to run.

MYSQL / MariaDB – Useful SQL in both creating New Tables and Clarifying Default Parameters

Within an instance, to identify the default character sets and collations of all databases:

SELECT * FROM INFORMATION_SCHEMA.SCHEMATA;

To alter the default character set and collation for a database in a MySQL or MariaDB instance:

ALTER DATABASE dbname CHARACTER SET utf8 COLLATE utf8_general_ci;
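Note that changing the database default does not convert existing tables and columns; to see what an individual table's columns are actually using, a query along these lines does the job (substitute your own database and table names):

SELECT COLUMN_NAME, CHARACTER_SET_NAME, COLLATION_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbname' AND TABLE_NAME = 'table1';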

Creating Table – Template SQL

CREATE TABLE table1 
(col1 INT(10) NOT NULL AUTO_INCREMENT, 
col2 CHAR(30), 
PRIMARY KEY (col1))
ENGINE = INNODB;

And a good source of data types is here:
Tech on the Net

There are unconfirmed reports that queries against TEXT fields are generally 3 times slower than against VARCHAR fields.

Values in VARCHAR columns are variable-length strings. The length can be specified as a value from 0 to 255 before MySQL 5.0.3, and 0 to 65,535 in 5.0.3 and later versions. The effective maximum length of a VARCHAR in MySQL 5.0.3 and later is subject to the maximum row size (65,535 bytes, which is shared among all columns) and the character set used.

VARCHAR(MAX)

Is a thing – but in SQL Server rather than MySQL/MariaDB, where VARCHAR is capped as described above and the TEXT types take over beyond that.

And if you want to change the engine on a table retrospectively, see OPERATIONS.
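For the template table above the retrospective change is a one-liner:

ALTER TABLE table1 ENGINE = InnoDB;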

And a short note on your choice of collation

From Stack overflow here

Actually, you probably want to use utf8_unicode_ci or utf8_general_ci.

utf8_general_ci sorts by stripping away all accents and sorting as if it were ASCII
utf8_unicode_ci uses the Unicode sort order, so it sorts correctly in more languages
However, if you are only using this to store English text, these shouldn’t differ.

Note further down there is a person who states "be very aware of this problem that can occur using utf8_general_ci":

“MySQL will not distinguish between some characters in select statements, if the utf8_general_ci collation is used. This can lead to very nasty bugs – especially for example, where usernames are involved. Depending on the implementation that uses the database tables, this problem could allow malicious users to create a username matching an administrator account.

This problem exposes itself at the very least in early 5.x versions – I'm not sure if this behaviour has changed later."

And here is some code that allowed me to reset the auto increment index on a table.

ALTER TABLE my_table MODIFY COLUMN pkid INT(10) UNSIGNED; -- strip the AUTO_INCREMENT attribute
COMMIT;
ALTER TABLE my_table MODIFY COLUMN pkid INT(10) UNSIGNED AUTO_INCREMENT; -- re-adding it resets the counter to MAX(pkid)+1
COMMIT;

I have tested it and it works with InnoDB on MySQL.

Another way to do it might be to drop the pkid column and then recreate it again using:

ALTER TABLE mytable DROP pkid;
ALTER TABLE mytable ADD pkid INT NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST;

CPanel – Setting up an additional FTP Account

As previously stated there are a lot of hosted services that come with cPanel installed. We have already indicated how it is possible to create a subdomain that can be used to host PHP-based sites.
If you are using a design environment you will want to set up an FTP account that can be used to upload your applications to. It is worth making sure that you are familiar with this.

Firstly log into your account at your host's cPanel, then find the FTP Accounts icon (under the Files section) and fill in the details for the new account.

Once configured it should appear in your FTP Accounts section.

If you want to see the FTP parameters for upload, hit the related Configure FTP Client link.

cPanel only allows you to create accounts against domains and subdomains that already exist, so once this is done you should be ready to install software.

SQL Azure – Reset Primary Key Identity/Index to Zero after Deleting all records

I was faced with a situation where I needed to set up an index for widgets. I made some attempts to import information and deleted those records from the table several times, and since users might occasionally see the index I wanted it to start from 1 and, at least at the start, go up to the approximate number of widgets.

It's a quick Google search to find this; here it is for resetting the identity on an emptied table.

In SSMS navigate to the database and open a new query window:

DBCC CHECKIDENT (dbo.t001table, RESEED, 0); -- next inserted row takes identity value 1
GO

New System Assessment : A Simple Set of Guidelines for judging material differences between Software

Here's a short guide I put together for myself to help me assess the value of enterprise systems that I may be looking for or designing myself. It is not hard and fast, and some points may not be relevant to every application, but this should form a good basis from which to start. Generally a higher number is better than a lower number, but within numbers items are cumulative and of roughly equal value.

1) Paper based – everything from single sheets to formal Books of Accounts
2) Simple Digital – spreadsheet and other file-based storage
3) Cloud-based Simple Digital – spreadsheet and other file-based storage; this can include file-based blob storage
4.1) LAN relational database – normalised but otherwise fairly locked down
4.2) LAN relational database + easy import and export of data other than spatial
4.3) LAN relational database + report writer definable by users
4.4) LAN relational database + spatially enabled
5.1) Internet-available relational database – normalised but otherwise fairly locked down
5.2) Internet-available relational database + easy import and export of data other than spatial
5.3) Internet-available relational database + report writer definable by users
5.4) Internet-available relational database + spatially enabled (noted not all applications require a spatial aspect)
5.5) Internet-available relational database + separate site available for public access, view only
5.6) Internet-available relational database + separate public access to edit and add information

Requirements
+ Uptime / reliability
+ Speed – after reliability, very important
+ Good master-detail forms (sounds easy but Bootstrap is not great for this)
+ Good search – the larger the system the more important the search options are. Look out for things like automated objective index creation (case-sensitive search is more of a hassle than a benefit a lot of the time; odd wild cards, or searches not being wild by default, are problem areas)

Important Points
+ Portability – can you up sticks and move it to a different cloud provider?
+ Non-proprietary (can be a gotcha if it's really obscure or has weird security; often linked to portability)
+ Two-factor authentication
+ Responsive design (not necessarily as important as you think in some applications – see GitHub / OpenLayers, which arguably don't need it)
+ Dynamic saving – I really miss the fast dynamic saving environment of MS Access; the save button isn't quite the same
+ Built-in CRM – I generally like them, especially if data can go straight into the correct files. Increasingly I am designing systems with inbuilt CRM. I know this might not be to every organisation's taste, but it is jarring to go between systems, normalisation between systems is usually very sub-optimal, and you frequently come across proof-of-negative problems when data required for a specific task is not held appropriately.
+ Satellite application for customers to enter information (why not let your customers maintain their information – it can be great and can empower customers; Fintech anyone?)

= All the bells and whistles
A spatially enabled, internet-available database with two-factor authentication, a report writer, easy import and export of spatial and attribute information, and public access for customers to edit and add information (possibly via a satellite application).

Extraction Transformation and Load (ETL) – some thoughts on a large IT transfer project

In 2017 I was involved in an important work project to transfer all the records in a legacy system that was being deprecated by the vendor into another maintained system. We were in some ways fortunate because both systems had been designed by a single company and they were encouraging us to transfer. We had delayed the transfer for several years already but were aware that we now had to move. The vendor did have some tools in place, had staff dedicated to such transfers and was offering favourable consultancy rates. The amount of data was not horrendous in computing terms, but it was far beyond what could be coped with by any sort of manual data correction, and the system was an absolute core system upon which several departments completely depended. These were systems that all departments are in from the moment they start the work day to the end. Generally it is unusual if they are down for more than 5 minutes in a month; pretty much all work stops when they stop, and under no circumstances could they be down for more than a day without special dispensation and coordination to manage customer expectations.

The whole project was a success although it was challenging. Here is an outline of the steps we took; as ever, order is important in most of the steps. I had written something on this before but consider this to be a more accurate rundown – see here.

Step 1
Inform managers of all involved sections and ensure they are on board – identify and ring fence budget

Step 2
Appoint project managers on the vendor and client side; draw together a team to perform the transformation.

Step 3
Draft a timetable of how long it will take, putting in place planning for tutorials on systems and consultancy.

Step 4
Request managers to put forward staff on all sides willing to be involved

Step 5
Identify any omissions in knowledge and start to identify how these can be remedied. Kick off and complete acquisition of said staff.

Step 6
Meet with lead staff to confirm buy-in. Request buy-in from staff, including ring-fencing of holidays etc. to ensure key staff are available at required times.

Step 7
Set up test systems that all individuals have access to and ensure that the old and new systems can be viewed simultaneously by individuals. Ensure that the domain specialists can identify processes that will be mirrored from the old system to the new system.

Step 8
Give DBAs, or those that will be doing the data transfer, access to the source databases so that they can start thinking about how they will pull out information.

Step 9
Training for all individuals concerned in new systems.

Step 10
In the new system start tasking individuals with how they are going to do the simple processes – e.g. register a record, approve a record, alter a record and get reports out. If possible allow new champions to start to define things like reports.

Step 11
Start making up any new lookup fields compared with the old lookups, and also start tasking individuals with the creation of the reports and letters that will need to be done.

Step 12
Start mapping the data from the old system to the new system – Excel spreadsheets can be used for this, showing the data coming from the old system and the fields it is going into in the new system. Divide this task up between domain users – this step needs to be done after the old and new systems are on domain users' machines. As part of this the applications in question should, if possible, expose the table and field names of the source and target fields. With the systems we were involved in this was possible for both the old and new systems.

For each form on the two systems try to identify the below:

Source table.field = Target table.field

Also get them to map the lookup table values if direct transfer is not possible or if alias ids are used in these lookups:

Source table.field.value = Equivalent Target table.field.value

Step 13
Give both mapping documents to the ETL people to allow them to start writing the queries. It is unlikely that there will be a straight transfer across from table to table. While it is to be expected that field and table names will be completely different, table structure will also differ in places, so it is good to have a really nice schema diagram of both source and target.
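As a sketch of the shape these queries take – all table and field names here are invented for illustration – each one is essentially an INSERT … SELECT that renames fields and pushes old lookup values through a mapping table:

INSERT INTO new_applications (app_ref, received_date, status_id)
SELECT o.refval,
       o.date_received,
       m.new_status_id -- the mapped lookup value, not the old id
FROM old_applications o
JOIN status_mapping m ON m.old_status_id = o.status;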

Step 14
Allow data individuals to write scripts that can be run against the present live system – if necessary this doesn't need to be truly live: it could copy every night and then run against a one-day-old backup of the backend, which is what we did. This means work can go on in the old system and the transfer can then happen at the touch of a button.

Encourage the DBAs to run these scripts every day so that running them for go-live is absolutely no issue. Our scripts only took about half an hour to run, so this wasn't a problem. I was personally involved in writing the SQL for these, and I had queries in place to cross-tab the number of records coming into each new table, so I could see new records and information from the old system trickling in and being transferred.
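The cross-tab check itself can be as simple as a union of row counts run after each rehearsal (table names again invented for illustration):

SELECT 'new_applications' AS table_name, COUNT(*) AS rows_loaded FROM new_applications
UNION ALL
SELECT 'new_contacts', COUNT(*) FROM new_contacts;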

Step 15
Test data input into new system

Step 16
Check the test data input into the new system with reference to the domain users.

Step 17
Confirm the go-live date and ensure staff are available for issues.

Step 18

Go live to production and start all new procedures; ensure key technical and domain staff are on hand to find flexible solutions to issues.

Step 19
Project review, ongoing maintenance and improvement of the new system.

Step 20
After a suitable time, turn off the old system if possible.

Simple Bat File to open multiple Web Pages in One Browser Window as separate Tabs

Simple, but can be useful if you are wanting to open multiple tabs in a browser window at the same time.

@echo off
REM Each start line opens its URL in the default browser; successive
REM calls typically land as new tabs in the same window.
start "Simple Search" "https://planning.westlothian.gov.uk/publicaccess/search.do?action=simple&searchType=Application"
start "Google Maps " "https://www.google.co.uk/maps/@55.8625775,-3.6759593,17z"
start "WLC Spade" ;
start "idoxEDRMS Login" "http://cc-dmsapp-01:8080/IDOXSoftware/secure/IG_Main?url="

MS SQL Azure – Take Complete Backup of Azure Database (Structure and Data)

SQL Azure as part of the service offers a number of differing backup options as standard – however you may wish to take additional backups which, for instance, you can load onto a local version of SQL Server (Express or Enterprise). Here's a quick rundown of one method of doing this.

Firstly open up your version of SQL Server Management Studio and navigate to the instance and database that you wish to back up

Highlight the database – in this case DB001 – right click, select Tasks and then Export Data-Tier Application

Choose an appropriate location to put the backup file (a .bacpac file containing both structure and data)

Review the summary of items that will be backed up and then hit Finish

There will be a rundown of how the export has worked

And just a check to see the exported file in the directory
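If you later want to script this rather than clicking through SSMS, the same export can be driven from the SqlPackage command line tool – a sketch, with server, database, credentials and output path to be substituted for your own:

SqlPackage.exe /Action:Export /SourceServerName:yourserver.database.windows.net /SourceDatabaseName:DB001 /SourceUser:youruser /SourcePassword:yourpassword /TargetFile:c:\sql\DB001.bacpac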

MS Access like development environments for the Web – 3 alternatives

So you would like to construct simple applications, of the kind you can currently create in MS Access, but you want to do it on the web. By that I mean you would like to create a data-driven application with somewhat complicated forms that can be accessed by anyone, through either IE or Chrome, anywhere in the world, with a simple login screen at the front to prevent just anyone accessing the applications, collecting the information into a database. What are your options for programs that will assist you in an MS Access-like environment – rather than going the full IDE deep dive, Visual Studio route – for what I consider to be a reasonable fee?

From my experience the unicorn of Access on the web is slowly coming to fruition, BUT for the vast majority of people with a budget similar to that for MS Access – let's say £200 ($250) a year for unlimited applications – there is simply nothing which is quite as easy and powerful as MS Access. Some are pretty close, but they are simply not as stable and typically require several times the configuration; WYSIWYG design isn't quite as WYSIWYG, and stability is a few orders lower than the desktop.

What you are probably looking for can typically be described as either RAD tools for the web, a low-code platform, or something called a code generator – any of those phrases will be useful for Google.

Assuming you don't have your own servers, whatever you do you will need to spend money on a web host.

The minimum this is likely to cost you is in the region of $15 a month. If you don't want to spend the next 6 months learning about the insides and outsides of frameworks then I would suggest you go to one of the below three providers, who all provide complete environments set up and ready to go with their particular generators pre-installed. This is good value: it is extremely difficult to beat these guys on cloud hosting costs, and unless you are very advanced with very particular requirements it is a waste of time to try. All three providers will allow you to create a limitless number of applications, albeit limited by the space you hire on your web server; similarly, distribution will be limited by the quality of web server you sign up for. In all likelihood, if you have few users, the coding front ends of your applications will not be what limits the number you create – more likely the size of the databases you attach them to and the sheer time you have available to create applications.

For a period I was paying a monthly amount for a hosted nuBuilder Pro platform. This performed well and I could create an unlimited number of applications. As it was hosted, I skipped the step of learning some of the deeper parts of the initial configuration; I hope at some point to go back to this. It is open source and seems well maintained, with a very dedicated developer who re-wrote much of it – as at March 2019 its latest incarnation is nuBuilder Forte.

Be warned: n-tier web applications do not play as friendly as the desktop. You WILL be slower to construct applications than you are on the desktop, getting into it WILL take time and a bit of commitment, you WILL have far less flexibility regarding coding, there WILL be fewer people about to ask questions, and there is far, far less WYSIWYG design capability. Error trapping is poor and errors are far more likely to be fatal, and the really big warning is that on release of new web frameworks you may not necessarily be able to update without a full site re-design (a fact that comes as a nasty surprise to many CIOs and project managers when they realise they are locked into front-end system replacements every 4 or 5 years). Know how to get data out of the back end to your local environment, and accept that the front end is ephemeral and not likely to last in the same way as your desktop applications (your database will last, but don't expect to be running it through the same front end ten years from now). Accept that you will now have monthly or annual rental fees for cloud provision.

That said, the design of these items is significantly faster than it's ever been.

Scriptcase and ASP Runner dot net (Xlinesoft also produces a PHP equivalent generator) have free downloads that are good for getting started.

Commit to one and go for it – I've got both PHP and ASP.NET solutions. nuBuilder only connects to MySQL, whereas Scriptcase and ASPRunner.NET connect to pretty much any database. I started with nuBuilder and am using ASPRunner.net as well because, importantly, it connects to SQL Server and was easy to get up and running in MS Azure. Scriptcase is PHP-based and I believe the applications it builds require some kind of security program to sit on the web server, which put me off – they do however have hosting that you can sign up for which is pre-configured, and their IDE is web-based, which could be a winning advantage. One of the great advantages of ASP Runner dot net is that the program produces an open web application that should run on all stock servers. I found nuBuilder Pro (now nuBuilder Forte) really conceptually elegant: despite its rather drab looks it is incredibly flexible, although the applications it produces are limited to MySQL and non-responsive (but being non-responsive you can get more detailed forms!). I would probably be able to change its look if I were prepared to get my own server and install everything on it myself; that is not something I have time to do at present.
nuBuilder hosts its IDE in the browser, which again is an advantage. ASPRunner.net is more traditional in that you have a program running on a desktop that creates the plumbing of your application, which you then push to a server for publication. This has the advantage that you get to see the plumbing in the background, which makes backup of the site easier but publishing slightly harder.

You may have heard of other generators / design applications out there – for example Zoho Creator / Alpha 5 / OutSystems. These hold your hand even more, but as a result are even more proprietary and won't fit in that budget of £200 per year (by quite a long way!)

Some further information on costs: nuBuilder, being open source, could in theory scale for very little money, especially if you have your own servers already. Scriptcase and Xlinesoft's ASP Runner products have an initial fee followed by an annual subscription – you may be able to configure things so that you can create unlimited applications for that one fee (if you have good access to web servers), but it is likely that initially there will be some kind of variable cost per additional application you wish to build. I am using MS Azure with ASP Runner dot net: a developer database costs me about £5 a month, with each application hosted in a web application service which again costs £5. With both Scriptcase and ASP Runner, if you stop paying the annual fees your applications will continue to work, you just won't get version upgrades. You will be able to step back into version upgrades, but you may need to restart your subscription with an additional fee.

Nubuilder Forte Link

Scriptcase Link

ASP Runner – PHPRunner and ASPRunner.Net Link

Good luck

Upload XLS information to a specific table or new table within an SQL Azure Database

I have started experimenting with Microsoft Azure. If you haven't already, you can get a free experimentation account here:

Microsoft Azure Trial account with £125 credit

This gets you registered on Microsoft's cloud, and after a free trial period you will be able to continue with a Pay As You Go account which, depending on the services you go for, can start at very cheap rates.

In order for this to work you will need the following
1) Microsoft Azure account
2) An SQL Azure Database
3) SQL Server Management Studio downloaded and installed on the machine you will be uploading from; this can be obtained from the SSMS download link – be warned it's over 800MB. Here I use SSMS 2016
4) Know your server name – this is generally [Yourname].database.windows.net
5) Login and Password (I use SQL Server Authentication)

Testing things out I have been using the Web Apps Service to run a website and connect to an SQL Azure Database – both on the cheapest options.

What makes the website particularly cheap is that it can be stopped and started, and because you pay by the minute you can get a powerful demonstration site up and running and stop it immediately after the demonstration, for very little money.

So after having created an Azure SQL database (Microsoft Create Azure Database Tutorial) I wanted to get a decent number of records into it, which would be the starting position when taking on most work.
Here I use the Lichfield Planning Application information previously referred to in this post: QGIS Import. I took the 45,000 records of planning applications from the shape file by opening up the dbf file of the shape collection in Excel 2003 and then saving it in Excel format. This will be used later to import into the database.

Having your Excel file ready, the next step is to open up SQL Server Management Studio and connect to your Azure database. The parameters with regard to username and servername will have been set up when you created your Azure database; it is important that you record these details somewhere at creation time.

Next highlight the database – in my case this was DB001 and right click to get tasks.

At this point you enter the import wizard dialog boxes. Having passed the opening welcome screen – a screen that can be turned off for subsequent runs – you should hit the first screen that allows you to define the format of the file to be imported.

The next step is about the only one that is slightly confusing – you are given a number of different options for the target; SQL Server Native Client 11.0 worked for me.

Now, using the previous parameters specific to your database server and your database name, complete the next dialog.

The next dialog asks you whether you want to copy all the information or to write a query to filter what is imported. For my example I chose to import everything. Here I select the database and then I am able to see the from and to destinations.

If you wish to import into an existing table use the drop down to select tables from the database – if you wish to import into a new table you can type in the name of the new table within the square brackets.

Here I create a new table called T010Test, import, and then continue through the import wizard dialogs until, on completion of the import, you see a screen similar to that below. It is possible to go into Edit Mappings if you are copying into a table that already exists; this will give you a preview showing to what extent the mapping will be successful and how the fields map. You may wish to alter the names of columns to match your target table at this point.
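Once the wizard reports success, a quick sanity check from a new query window confirms the rows arrived (using the table name created above):

SELECT COUNT(*) FROM dbo.T010Test;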

Microsoft Azure – moving to the cloud

I must confess I do love MS Access, and I have a number of projects that I suspect will always remain as they are – ones that are highly complex but very personal that I really don't need to share, and even some that, because of their sensitivity, I would never want to share. But despite the difficulty of designing really user-friendly interfaces on the web, the ability to distribute your applications to everyone in the world is a very powerful attractor which is quite clearly going to be a game changer.

I have therefore started to experiment with data driven online applications and so far I am impressed.

To get started you will first want to sign up with an Azure account.
Azure Portal Sign In

I can see that in the future I will probably be using Access as a desktop platform for writing queries and personal content curation, with SQL Azure for projects where I need to communicate with others, linked through the App Service in the Azure portal.

The Economics of Immortality (part 2) and a real reason for moving platform.

The announcement in December 2014 that Microsoft was releasing a version of Visual Studio Community to virtually everyone for free, and then a further announcement in February 2016 of Microsoft's purchase of Xamarin and its subsequent release with VS Community edition, suggests to me that my theory – that the cost of development tools should steadily reduce to next to nothing – is on the button. Still, it comes as a surprise that it seems to be happening so quickly. This can only be a good thing, especially considering that increasingly any application can be developed locally and sold globally, or at least published internationally.

In my opinion it is looking more and more likely that the best reason for moving legacy systems to new platforms will in future not be cost savings but rather the revenue possibilities of the resulting systems – in a word, Software as a Service. But why would anyone want to create their own services rather than paying to be a customer of others? Two reasons: number one, at present the field is wide open and in many areas there is a complete lack of rich software applicable to an individual's problem; secondly, the caution with which some individuals will relinquish their information and knowledge of their tools – in that respect revenue generation might just be a nice optional bonus. I do think that different categories of software will progress at different rates, and maybe in that respect we can look to accounting software to see the future for other categories. Everyone is interested in money, and the sheer ubiquity of the requirement for accounting support has focused resources heavily on tools, resulting in a healthy market of online software options for customers. There are still sticking points, in the form of cost, especially when being charged on a per-seat basis, but it is no longer really practical for any individual organisation to develop its own accounting software, especially when you can choose on both quality and price in the wider national markets. But outside of accounting the choice of products seems somewhat scarce.

So it seems it's early days for Software as a Service, but certainly maybe the question should be not what legacy systems need replacing but…

What tool do you have at your work which you think with a redesign could be used by others by publishing to the web?

Using Task Scheduler and Visual Basic Scripts to automate pretty much anything (Win 7)

This is hardly ground-breaking, but Windows Task Scheduler can be used to run VBS scripts on a variety of events to automate repetitive and boring tasks – backups for example, or refreshing caches on web browsers before users come in.

Hit the Windows button

Type "Task Scheduler"

Press return and you should get the following screen.

Schedule1

Make sure Task Scheduler Library is selected in the tree on the left, then within the Actions sub-window on the right click Create…

Schedule2

Give the task a name in the area marked with a red arrow above, and then go to the Actions tab where you can use the New… button to navigate to the VBS file you wish to run.

Schedule3

Next use the browse button or, if you know the path and file name, simply type in the path to the Visual Basic script you would like to run. Here I have put in an imaginary backup script. Next go to the Triggers tab and again hit the New… button.

The Begin Task selection list gives a good indication of the events you can use to trigger the task.

Schedule4

If we continue with the theme, here a VBS script runs a backup at 11:48 every day.

Schedule5
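The same task can also be created without the GUI, using the built-in schtasks command – a sketch, with the script path imaginary as above:

schtasks /create /tn "DailyBackup" /tr "wscript.exe C:\scripts\backup.vbs" /sc daily /st 11:48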

nuBuilderPro – a database driven Web Framework for simple Form Design

I have been seeking a way of designing simple web forms for a reasonable cost for some time. I was continually coming across either very professional products with really quite large annual charges (Alpha Anywhere) or alternatively products that charge a per-seat cost for users (Zoho Creator). Fine, I don't mind paying for software, I really don't – but what about a project that only brings in $2,000 a year? How can I justify Alpha Anywhere's costs for that, or how can I give access to individuals who will need an application very rarely but for whom there is real value in knowing this information? I have long believed that software costs should be coming down rather than ever increasing, and getting users interested in your applications is often impossibly hard at the best of times when there is no cost, let alone when you tell them that they need to shell out for a subscription. What happens to the user who only needs something once in a blue moon, or a user not from within your department, in a section where you have no control of the budget?

I've recently had a lot of success with open source software, and as a result I have been scouring the internet for options for some years. Recently I found a project called nuBuilderPro from an Australian-based development house called nuSoftware. nuBuilderPro is version 3 of what I think is their own framework. It looked good and, what's more, although based on the LAMP stack – something I've had very little dealing with – they offered a fully hosted VPS service where I could experiment and get my bearings. They completely host the development environment for a very, very reasonable price. Given that I didn't know whether it was going to be useful to me yet, this seemed an ideal opportunity to experiment.

It has only been a short time since sign-up but I can already see that it will prove useful. I have signed up for the most basic of server accounts and by the 17th of January had created 3 very basic applications – all in about 3 hours of work. That's what I like to see: proper RAD development.

The website is here.

www.nubuilder.net

And the options for hosting are here

www.nubuilder.net/hosting.php

Importantly, you can start small, get to know the product and work up – additional users do NOT cost you extra money, login security comes built in, and you are free to design as many applications as you can fit within your VPS. Given that when I first signed up I didn't have any users or applications and didn't know how to design applications, this seemed completely ideal.

So far it looks like I'm going to use it for small but important applications that need very quick development and are very distributed. That's not to say it couldn't be used for much larger applications, but I need to walk before I can run. Given the underlying database engine and stack, it is likely to be able to handle much bigger projects than I am likely to throw at it. Although open source, all but the top of the stack has been in solid development for many years; I guess the question mark is over the management framework for the database integration and form-building UIs, which is somewhat new and untested. I would very likely have gone with something like Lightswitch, but Microsoft has shown a distinct lack of commitment to any one RAD web tool recently. Certainly very few individuals seem to know about nuBuilder, and the forum is somewhat quiet compared with some open source projects. Importantly, though, I haven't come across any other open source project with their price model, and I can easily get any information in and out, so I don't believe I am taking any risk. I hope to investigate it further over the coming months. From what I have seen so far I am very impressed – and did I say it had a unique price model? Anyway, watch this space!

SQL Saturday #388 Edinburgh

SQLSAT388_web
Went along to SQL Saturday BI Edition on the 13th of June.
Honestly, I thought it was really great. Many thanks to Jenny Stirrup for being the event organiser. If you are thinking about going next year, you really should.

This was the schedule for SQL Saturday in Edinburgh

Sessions I attended
Key Note with Jon Woodward – on long term future of computing in general
Web Scraping with Python – Iain Elder of Sky Scanner
Better Data Visualisation for CRM and ERP – Adam Vero
The European Economic Crisis and the Euro-A Data Tale – Carmel Gunn
Blue Sky Thinking : SQL Azure Geospatial Mashup – Thomas Sykes
Master Data Management – Dave Lawrence

All the talks were really thought provoking and nice to hear from people who are really at the top of the game. I have already started experimenting with Web Scraping and Python.

Setting up a Blank SQL Server Spatially enabled Table using Microsoft SQL Server Management Studio 2008R2 Express and displaying it in QGIS 2.8.1

Programs used;

1- SQL Server 2008R2 Express
2- SQL Server Management Studio 2008R2 Express
3- QGIS

The example uses UK National Grid coordinates to create a triangle polygon in a SQL Server table

I've previously written that while we've had spatially enabled SQL Server for over 5 years, I constantly come across line-of-business applications that, although using SQL Server, have not and do not intend to spatially enable the application. This is undoubtedly because of the difficulty of re-designing legacy systems actively in use, and because the benefits, although significant, are not generally requested by all but the most knowledgeable of colleagues.

While I understand this legacy-system reasoning, spatially enabled databases are the future, so it is just a matter of when and not if an application will require alteration. Understanding it in this context makes it really a requirement to start seriously planning for its inclusion.

Developers creating new applications, however, should always consider spatially enabling relevant tables from the start, even if it is not specced by the client/colleague. It is so much easier to spend a couple of minutes future-proofing the schema of a new-born database than hours trying to retrofit a live, in-production back end.

Firstly it's important to understand what a geodatabase in SQL Server actually is.
Really it is a normal database which has one or more tables with a field of geometry or geography value type. In this example I will use desktop QGIS 2.8.1 to display the resulting geometry, but any other digital mapping package that can link to SQL Server could be used. SQL Server also has a very rudimentary mapping display, but you will need something better if you want to manipulate boundaries visually.

Many digital mapping products have plugins that will create geodatabases and tables; however, I haven't seen one for QGIS. I really wanted to be able to create spatial SQL tables on my own, without recourse to paid tools, directly in SQL Server Management Studio. So here's my method of creating a blank polygon table whose geometry is ready to be read and edited in QGIS or any other digital mapping system, just using SQL Server Management Studio Express 08R2.

Steps
1. Create a new Table
2. Ensure the table has an identity Key that increments
3. Create a geometry column
4. Write a query that updates the geometry column
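Steps 1 to 3 can be done through the SSMS table designer or with SQL along the lines of the sketch below – the table name T001Sites and the Coordinates column match the step 4 query that follows; the other column is illustrative:

CREATE TABLE T001Sites
(pkid INT IDENTITY(1,1) NOT NULL PRIMARY KEY, -- incrementing identity key (step 2)
SiteName NVARCHAR(100),
Coordinates geometry); -- the geometry column (step 3)

And the step 4 query: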

UPDATE T001Sites SET Coordinates=geometry::STGeomFromText('POLYGON((301804 675762,295789 663732,309581 664870,301804 675762))',27700)

You will note that there are four coordinates here (each coordinate being a pair of numbers).
The first coordinate and the last are the same; this is required by SQL Server to ensure that the polygon is closed.

The 27700 number is the Spatial Reference System Identifier (SRID) – a unique value used to unambiguously identify the projection. 27700 is the unique identifier for the United Kingdom National Grid and represents the coordinates my example refers to. The spatial reference identification system was defined by the European Petroleum Survey Group (EPSG) – a set of standards developed for cartography, surveying etc., now owned by the International Association of Oil and Gas Producers – list here: http://www.epsg-registry.org/

The above coordinates display a triangle in West Lothian near Edinburgh

5. Set up the connection to SQL Server Instance

Ensure the box marked "Only look in the geometry_columns metadata table" is unchecked. By default this is checked, and if the geometry_columns table does not exist you will get an error message.

QGIS-SSMS-Connection

6. Display the table and edit as appropriate.

Select the table then hit the Add button

QGIS-SSMS-TableReadyforDisplay

And here is the SQL Server table in QGIS ready to be added to or edited.
QGISshowingSQLServerPolygon

Administering a Web Application – some thoughts

In the last couple of years I have started administering a bought-in web application from an outside vendor. I have found this really quite educational. I will not reveal the name of the product; the vendor seems neither better nor worse than other vendors, and many of the issues I am experiencing I see repeated by other admins of other web applications.

The software consists of various pieces of middleware linking SQL Server and the .NET and JavaScript files. It uses a fairly standard Microsoft stack to deliver a single page application. The big selling point was that it was widely available through browsers pretty much anywhere on the web.

We have had it up and running in anger for a couple of years now and I have about 75 users all internal.

The vendor has created a new version and we now need to update. In discussion, the company tells me that I should not be doing this myself and that it is standard practice to get them in to upgrade the version. We already use it more as a stack technology to build our own applications, and I would want to continue this in the future. They have indicated that it will cost money to get them in to upgrade the software. This is triggering alarm bells, because the existing configuration is very complicated, errors are quite regular, and error messages tend to be pretty cryptic or non-existent. I also like doing setup myself, as it teaches me the ins and outs and speeds up my learning on systems which have non-trivial configuration.
A project board has been set up, senior management are on board, and, well, it's grown legs, arms and a tail.

We have not widely publicised the application to the general public and they do not access the application at present.

The application itself requires significant amounts of configuration to set up correctly, and parameters tend to be static and buried deep in very large HTML files. All this leads me to conclude that configuration in the new version is likely to be just as awkward. The company's help desk is very good and there is extensive documentation, but the sheer obscurity and quantity of configuration parameters make the documentation the size of a phone directory – difficult and opaque.

Here are some questions I should remind myself of when looking at new web applications in the future:
Are staff really so distributed that you require wide distribution?
Is it an internal or external distribution? If not external, do you really need a web-enabled application?
Do you have internal expertise that is interested in developing applications?
With web applications or local applications the backend database technology could well be the same. Internal local-area development is likely to be far more flexible, cheap and quick, could motivate staff, and can be far richer in scope. It would be possible, for instance, to develop a rich interface internally and a minimal interface externally using something like Lightswitch, taking advantage of the strengths of both development environments.

There is a big push towards web applications; I just think IT managers need to carefully consider their positions. These applications tend not to be cheap and have a number of disadvantages that seem to have been pushed out of sight.