SQL Azure – Script to identify all Users in a SQL Azure DATABASE and how to remove Users

SELECT name AS username,
       create_date,
       modify_date,
       type_desc AS type,
       authentication_type_desc AS authentication_type
FROM sys.database_principals
WHERE type NOT IN ('A', 'G', 'R', 'X')
      AND sid IS NOT NULL
ORDER BY username;

One of the easiest ways both to see all the users and to remove an individual user is through SSMS. After connecting to your Azure database, expand the database in question, identify the Security branch and then the Users branch – right click on the user you wish to remove and select Delete.
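
If you would rather remove the user in T-SQL, a minimal sketch (webappuser is a placeholder name):

-- Remove a contained database user.
-- If the user owns schemas or objects, transfer or drop those first.
DROP USER [webappuser];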

This follows on from the post where we were trying to create a user – see here
Creating Contained Users

IDOX – Using Public Access to find Planning Application History

A significant number of authorities in the UK run the same software, which gives members of the public viewing access to most of their recent planning application data. The last time I looked the software was in place for almost the whole of Scotland and Northern Ireland and close to three quarters of authorities in England and Wales.

Part of this provision is a public access website that lets members of the public view planning application details via a mapping screen. With practice this can be used to search and research planning application history, although first-time users tend to be defeated by the user interface.

Here’s a quick guide to assist in finding available planning applications relative to a particular site if you know its location on a map.

We will take West Lothian as an example. The UI varies slightly between councils, but the principles are the same for a significant number of authorities.

Firstly navigate to the Public Access Search screen of the authority that you are interested in.

https://planning.westlothian.gov.uk/publicaccess/

You should be presented with the following screen.

Two thirds of the way down on the right is a tab marked – Map – click this.

You should see something similar :-

Using your mouse navigate to the land parcel you are interested in determining the planning application history for.

Next, hit the filter icon.

Note on some authority sites this has been replaced by a drop down combination on the right.

Hitting this icon on the West Lothian site reveals a slider – initially it is set to 6 months.

Chances are you will want to know all of the planning application history so move that slider all the way across to all time.

Now back on the map you should see many more boundaries representing many more planning applications.

Users can now zoom in on the particular site they are interested in e.g. Wilkieston below

Now using the mouse you can click at the position you are interested in and the attributes of the planning application at the location you click will be revealed in a further dialog.

At this point you can note down the planning application references for further research or you can hit the link marked Application Details – here I click on the link related to 0761/P/18

Which takes you to the Planning Application details page.

Want to see any available documents associated with the application?

Look to the text line beneath the table of attributes. The 2 documents text is a link that takes you to the available documents linked to the application. The number will differ according to the planning application.

Clicking on it brings up the Planning Application Documents screen where you should be able to download or view the documents.

SQL Azure – Azure Automation Hybrid Worker Setup and RunAs Account

UPDATE January 2024 – For SQL Azure you are better off using Logic Apps and whitelisting the outbound Logic App IP addresses, which are specific to the Azure location where your Logic App is hosted.
See Link for Logic App setup example
and
Azure Logic App IP addresses to White List

My old notes…
The highs and lows.

You’re flying high – you’ve created a stored procedure that you’ve managed to reference in your automation account within a powershell runbook and you’ve got it nicely churning over on an extremely regular basis. You want to run it daily or weekly but as a test you run it hourly just to ensure that it works before releasing it to production. A day passes ok and you think wow this is really great. Then you forget about it and come back to it a week later. You find that somehow it stopped working at 1 am on Tuesday.

What happened????

Sorry, you got hit by a block resulting from a dynamic IP change on the runbook in the Automation Account – DAMN.
(there is probably a good reason for having a dynamic IP on automation accounts but damn it was really going too well)

So after quite a lot of investigation I found those in the know seemed to be saying the solution is:
Azure Automation linked to a Hybrid Worker

I kept investigating this and I kept getting directed to Microsoft documentation which although really extensive was just not helping me really get my head around things. It wasn’t until I found Travis Roberts that things started to click… I link to three of his videos below.

So the downsides of Hybrid Worker setup..

  • Despite being the recommended solution configuration is complicated.
• You need to create a VM with a static IP just to run the hybrid worker (and then grant firewall access to the database server for that VM)
  • That VM needs to have 2 cores and 4 GB of RAM (yes, unfortunately not the smallest VM, so you could be looking at a $25 a month charge)
  • You need to set up log analytics
  • It is recommended that log analytics is in the SAME Azure location as the sample database BUT not all Azure locations have log analytics set up… Use UK South! Hands up who has already set up their server and just realised it is in a location that doesn’t allow log analytics.
So I was really struggling with this after my runbook unexpectedly started failing. After several days of reading and searching I found Travis Roberts. Hallelujah for MVPs.

So I haven't done it yet, but I think the procedure can all be defined here.

    1. Setting up an Azure VM with a fixed IP (It can be any VM but Azure is good for me)
    Setup VM with static IP
    The server you setup will need to have internet connectivity with SSL outbound on port 443

    2. Setting up Azure Log Analytics a step by step guide
    Setting up Azure Log Analytics a step by step guide

    3. Azure Automation Setup and Linking Log Analytics to your Automation account
    Set up Azure Automation and link it to Log Analytics

    4. Configuring an Azure Automation Hybrid Runbook Worker to run your runbook.
    Installation of Hybrid Runbook Worker
    Includes information about Runbook Worker Group and
    RunAsAccount
    You add credentials to the Runbook Worker Group

Firstly, a big thank you to Travis Roberts for putting out those videos – he has loads of Azure-centric content on his YouTube channel, which you can find here

CAVEAT: I haven't implemented this myself properly, and at the time of writing (December 2021) those videos are a couple of years old. At the start of one of the videos Travis indicates that things are changing quickly in this space, so expect that there have been changes since they were recorded. Let's face it, this should really be easier to do – Azure is just frustrating its users with this complicated configuration.

    SQL Azure – Using Azure Automation and Runbooks to Schedule an Azure SQL Stored Procedure

UPDATE January 2024 – I have moved away from Runbooks for scheduling Azure SQL stored procedures towards using Logic Apps, but I keep this here as it's always good to have alternative methods of completing the same task. Link to Logic Apps Setup

    Original Post
It's vital to be able to run stored procedures on a schedule – it can be extremely important for daily housekeeping.

    This site was where I learned how to do this.
    How to Schedule an Azure SQL Database’s Stored Procedure

    Runbooks sit in Azure Automation and a Runbook consists of a script that performs a task.

Runbooks can be written in Python or PowerShell. See the last part of this post for an important additional issue.

The connection string takes the following form (the values below are examples only):

"Data Source=01server.database.windows.net;Initial Catalog=database;Integrated Security=False;User ID=01userid;Password=JK12jl423;Connect Timeout=60;Encrypt=False;TrustServerCertificate=False"
    
    # Instantiate the connection to the SQL Database
    $sqlConnection = new-object System.Data.SqlClient.SqlConnection
    $sqlConnection.ConnectionString = "Data Source=servername.database.windows.net;Initial Catalog=database;Integrated Security=False;User ID=userid;Password=password;Connect Timeout=60;Encrypt=False;TrustServerCertificate=False"
    
    $sqlConnection.Open()
    Write-Output "Azure SQL database connection opened"
    
    # Define the SQL command to run
    $sqlCommand = new-object System.Data.SqlClient.SqlCommand
    $sqlCommand.CommandTimeout = 120
    $sqlCommand.Connection = $sqlConnection
    
    Write-Output "Issuing command to run stored procedure"
    
    # Execute the SQL command
    $sqlCommand.CommandText= 'exec [dbo].[storedProcedure]'
    $result = $sqlCommand.ExecuteNonQuery()
    Write-Output "Stored procedure execution completed"
    
    # Close the SQL connection
    $sqlConnection.Close()
    
    Write-Output "Run completed"
    

Honestly, the hardest part is getting the code right – setting up the scheduling after that is relatively easy.

WARNING : Runbooks have dynamic IPs, so although you can grant firewall access to an IP, your runbook will only run for a certain period before it stops working unless you either have a Hybrid Runbook Worker set up OR you allow Azure services and resources to access the server (which isn't recommended). The next post I do will be my initial notes on how to configure a Hybrid Runbook Worker.
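
For reference, granting access to a single static IP at the database level is just one statement – a minimal sketch with a placeholder address (substitute the static IP of whatever machine will be connecting):

-- Add (or update) a database-level firewall rule for a single static IP
EXECUTE sp_set_database_firewall_rule
    @name = N'StaticClientIP',
    @start_ip_address = '203.0.113.10',
    @end_ip_address = '203.0.113.10';

-- Review the rules currently in place
SELECT * FROM sys.database_firewall_rules;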

    SQL AZURE – Group records or widgets in Sets of a Specific Number

Useful if you have factory products that are being packed into sets of a specific number. In our case, sixes.

Swap the 6 out for a different amount if you need to. The last group in the record set will hold the remainder if the record count is not perfectly divisible by the set size.

    SELECT pkid,
    floor(((ROW_NUMBER() OVER(ORDER BY pkid))-1)/6)+1 as grpNum
    FROM t002caskfill
    

    Here is a screenshot of the above code being run on a table.
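
And if you want to try the idea without a real table, here is a throwaway sketch using a VALUES list and groups of three – rows 1 to 3 land in group 1, rows 4 to 6 in group 2, and the remaining two rows in group 3:

SELECT pkid,
       FLOOR(((ROW_NUMBER() OVER (ORDER BY pkid)) - 1) / 3) + 1 AS grpNum
FROM (VALUES (1),(2),(3),(4),(5),(6),(7),(8)) AS v(pkid);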

    PHP Runner : Tips and Tricks – Reload a page from event that belongs to a button

    For example on a page we have a button that parses text coming in.

    1. Add the following code to Javascript OnLoad event of the table in question

    window.tablePage = pageObj
    

2. Add the following code to any Javascript event where you want to reload the table. This can be the ClientAfter event of any button.

    if( window.tablePage ) {
    window.tablePage.reload({a:'reload'});
    }
    

    As far as I can tell what you name the pageObj in this case tablePage is up to you – at the point of naming you are creating this variable.
    My hunch is that you would want to name this something relating to the page and something not overly complicated but unique.

The ClientAfter event references the same pageObj for the page refresh.

    My understanding of the way the code works is.
    On loading the page create a pageObj variable named window.tablePage

    After pressing the button if there is a pageObj variable named window.tablePage refresh it!

    PHP-Runner linked to SQL Azure – Passing parameters from a row to a stored procedure which is then Executed

    This is useful where you have values in a table row that you wish to pass to a stored procedure.

    As per the previous post I was wanting to use this in the context of needing to create a particular view from the data in a particular row.
It's probably possible to do this with some kind of DB::Query or DB::Lookup, but in this instance I chose to use dynamic SQL. I will be working on other methods as well.

    From my previous post I already had my SQL Azure Stored procedure defined as.

    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    
    ALTER   PROCEDURE [dbo].[auditselection]
    (@firstname nvarchar(15),
    @Dcreate datetime)
    AS
    BEGIN
    SET NOCOUNT ON
     
    EXEC ('CREATE or ALTER VIEW v01 AS 
    SELECT dbo.T0001Persons.PKID, 
    dbo.T0001Persons.Firstname, 
    dbo.T0001Persons.DateCreated
    FROM
    dbo.T0001Persons 
    WHERE 
    dbo.T0001Persons.Firstname=' + '''' + 
    @firstname + '''' + ' 
    AND dbo.T0001Persons.DateCreated <= ' + '''' 
    + @Dcreate + '''' +
    '')
    END
    

    Firstly I went to the list view in PHP Runner and within the designer I inserted a button into the row.
    (I will be working on something a bit cleaner later but for now I’m just trying to get it to work)

In the CLIENT BEFORE part of the tri-part event I put

    params["Firstname"] = row.getFieldValue("Firstname");
    params["Dcreate"] = row.getFieldValue("DateCreated");
    

    Where Firstname and DateCreated are field values in my table T0001Persons

    So in Javascript I am passing the field values to parameters called Firstname and Dcreate and then in PHP I pass those variables to the stored procedure and the database procedure does the rest.

    Next I put the following code in the Server event

    $sql = DB::PrepareSQL( "EXEC dbo.auditselection 
    @firstname=':1',
    @Dcreate=':2'", 
    $params["Firstname"], $params["Dcreate"]);
    DB::Exec( $sql );
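
By the time it reaches SQL Azure, the call is equivalent to running something like this in SSMS (example values only):

EXEC dbo.auditselection @firstname = 'Mark', @Dcreate = '2021-10-03';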
    

Looking at the stored procedure syntax in SQL Azure, you can see that the stored procedure variables happen to be called @firstname and @Dcreate, although they could have been named anything, because PHP is passing the values and not the memory location (variable name). That makes sense because you are passing from a web server to a separate database server, and the same memory spaces don't exist in both.

Which led me on to investigating how programs and operating systems manage memory space. link

    And here are some screenshots of the code in PHPRunner

    Javascript Client Before Part

PHP Server Part (note this was my first effort which, although it worked, did not qualify the database schema or name the procedure variables that the parameters were being passed to)

    and just for completeness the Javascript Client After Part

    and how it looks in the list once deployed

    and what happens once I have hit the button.

Of course this is all documented, but when there is so much to learn it can take some time to just look at the documentation and understand it. Most of what I have discovered here is written up
    Link

    Couple of points

• Qualify the stored procedure with the database schema (it does seem to work without this, but I am told qualifying results in better performance)
  • You can name the procedure variables that values are being passed to. Again, it will work without this.

SQL Azure – Dynamic SQL – Using a parameterized Stored Procedure to Create a View

There are situations where you need to create SQL on the fly from a record that you are currently dealing with. This is sometimes referred to as dynamic SQL. Here I start investigating creating an SQL view dynamically using a stored procedure. In my case I was doing this so that I could derive some particularly useful values.

    Here we have a table called T0001Persons that has the following three fields

    PKID
    Firstname
    DateCreated

How do we create a stored procedure that will create a view?

    USE yourdatabasename
    GO
    
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    
    ALTER   PROCEDURE [dbo].[auditselection]
    (@firstname nvarchar(15),
    @Dcreate datetime)
    AS
    BEGIN
    SET NOCOUNT ON
    
    EXEC ('CREATE or ALTER VIEW v01 AS SELECT dbo.T0001Persons.PKID,
     dbo.T0001Persons.Firstname, 
    dbo.T0001Persons.DateCreated
    FROM
    dbo.T0001Persons WHERE dbo.T0001Persons.Firstname=' + '''' 
    + @firstname + '''' + ' AND dbo.T0001Persons.DateCreated < 
    ' + '''' + @Dcreate + '''' +
    '')
    END
    

    and then an example execution is

    EXEC auditselection 'Mark', '2021-10-03'
    

Using the Script As facility in SSMS we can identify the query that has been created by EXEC auditselection 'Mark', '2021-10-03'.

SELECT dbo.T0001Persons.PKID, 
dbo.T0001Persons.Firstname, dbo.T0001Persons.DateCreated
FROM
dbo.T0001Persons WHERE dbo.T0001Persons.Firstname='Mark' AND dbo.T0001Persons.DateCreated<'Oct  3 2021 12:00AM'
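
As an aside, if you prefer to stay in a query window, OBJECT_DEFINITION will show the view that was actually created:

SELECT OBJECT_DEFINITION(OBJECT_ID('dbo.v01')) AS ViewDefinition;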
    

    Several things should be noted here

    Within the EXEC statement we use a single quote to escape the string and then a plus sign followed by the parameter to insert the parameter into the SQL that will be executed. A single quote is then used again to escape the code to go back into the string.

But what if the SQL we want to run needs to include a single quote (which is often the case)? I tried escaping the string with a single quote and then a plus sign and then a Unicode character reference (I only tried 39) surrounding a single quote, but that didn't really work.

    What did work was to have four single quotes with two pluses on either side.

    + '''' +

    Here a single quote escapes the code and starts the string again. Two single quotes together are interpreted as a single quote (clearly some kind of special interpretation is happening in the background) and then the last single quote escapes the string back into code.
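
A quick way to see what the four-quote pattern produces is to PRINT the string instead of executing it (a throwaway example):

DECLARE @firstname nvarchar(15) = 'Mark';
-- Prints: SELECT * FROM dbo.T0001Persons WHERE Firstname = 'Mark'
PRINT ('SELECT * FROM dbo.T0001Persons WHERE Firstname = ' + '''' + @firstname + '''');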

From a discussion on Stack Overflow I also found the option of the SET QUOTED_IDENTIFIER switch.

    If escaping your single quote with another single quote isn’t working for you (like it didn’t for one of my recent REPLACE() queries), you can use SET QUOTED_IDENTIFIER OFF before your query, then SET QUOTED_IDENTIFIER ON after your query.

    SET QUOTED_IDENTIFIER OFF;
    
    UPDATE TABLE SET NAME = REPLACE(NAME, "'S", "S");
    
    SET QUOTED_IDENTIFIER ON;
    -- set OFF then ON again
    

Next up I will be passing parameters to a parameterized stored procedure via PHP Runner, where the parameters are taken from an onscreen record.

    PHPRunner create an event to pass parameters to a stored procedure

    This looks promising
    Passing one of the field values as a parameter:

    $sql = DB::PrepareSQL( "EXEC StoredProcNameHere ':1'", $values["FieldName"] );
    DB::Exec( $sql );
    

I have discovered that :1 is a placeholder for the first variable that follows, so passing two variables comes in as the following, assuming both the first and second parameters are text values. If :1 were the target of an integer then instead of ':1' we would simply have :1.

    $sql = DB::PrepareSQL( "EXEC auditselection ':1',':2'", $values["FieldName1"], $values["FieldName2"]);
    DB::Exec( $sql );
    

This is PHP – here the single quotes DO NOT end the string. They form part of the string that will be executed, which is what I want.
The string is not ended until the double quote. Confusingly, two single quotes look EXACTLY the same as a double quote but they are completely different. If you use your cursor you can see that the double quote is a single character, which is interpreted completely differently by the PHP engine.

    The following link may be important as well in determining what can be achieved
    Grid row Javascript API

    SQL Azure – Convert UTC to your local time – and an interesting observation

Unsurprisingly, SQL Azure GetDate() stores datetime as a UTC value.

    If your application is hosted in a specific country it is likely you will wish to return that value to the local user in the local time.
    Here I host in the UK and want to display in the UK so my choice is simple enough.

    The default CONVERT() statement in the SQL Azure library will do this for you.

An example of the statement with parameters is here (slightly confusingly, my field name here is a reserved word – just ensure that your field name is in the square brackets)

    CONVERT(datetime, 
    SWITCHOFFSET([datetime], 
    DATEPART(TZOFFSET, 
    [datetime] AT TIME ZONE 'GMT Standard Time'))) 
    as UKTime

    And here is that statement included in a full query.
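
In text form, the same pattern against a hypothetical table dbo.T001TimeLog with a created column holding UTC datetime values would look like this:

SELECT pkid,
       created AS UTCTime,
       CONVERT(datetime,
               SWITCHOFFSET(created,
                            DATEPART(TZOFFSET,
                                     created AT TIME ZONE 'GMT Standard Time'))) AS UKTime
FROM dbo.T001TimeLog;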

    And here is an SQL statement to determine all time zones currently observing daylight saving on your server.
    This query can also be used to identify the pre-defined text string parameter you should have for your convert statement.

    SELECT
      name,
      current_utc_offset
    FROM sys.time_zone_info
    WHERE is_currently_dst = 1;
    

    Running this I get the following..

    Now what is interesting about this is that the text string defined by Microsoft is GMT Standard Time and according to my investigations it definitely adjusts for British Summer Time (which most of the time is what I want!). OK but if you go to Wikipedia you get the following definition of what GMT is :-

    Greenwich Mean Time or GMT is the time displayed by the Shepherd Gate Clock at the Royal Observatory in Greenwich, London. When the sun is at its highest point exactly above the Prime Meridian, it is 12:00 noon at Greenwich.

    GMT is not affected by Daylight Saving Time (DST) clock changes.
    The Greenwich Meridian (Prime Meridian or Longitude Zero degrees) is the reference point for every Time Zone in the world.

    Every 15° longitude represents one hour’s difference in time: (24 x 15 = 360, the degrees of a circle). You can work out the time at every location on earth if you know how many degrees it is east or west of Greenwich.

GMT is not affected by Daylight Saving Time (DST) clock changes -> so obviously that contradicts the definition in SQL Server – fair play to the engineer who set up the sys table, he just wasn't aware. It probably won't be changed now, but it's indicative of the danger in dealing with time when you need to be ultra accurate. Users tend to think that dealing with time is straightforward, but in reality it's not!

    Multi-tenancy – Design Options

    Some personal notes on different options for managing multi-tenancy

    Silo – Different databases for each tenant. Again PHPRunner has an excellent solution for this.

    Siloed Database with one Front End PHPRunner

Bridging – a unique schema for each tenant – I can't see any great advantages to this and it means quite a lot of extra work.

Row Level Security – tenancy managed by the database – one database, with an additional column in the tables that records the tenant; before returning query results the database filters them by tenant. A policy is set up with a unique user for each tenant. Applicable to Postgres. It still requires an additional field in all the tables, but it is nice to have and can be extremely quick to implement.

    Client Software – Client side implemented tenancy – PHP Runner has an option to implement Row Level Security in the design of the front end.

    Amazon Web Services article on Multi-tenancy

Some points – for Postgres, implementing Row Level Security on tables does not carry through to views, which is a major issue for me. There are ways round it though. It would be great to have the database managing the row level security, and if you have all your tenants in one database possibly the best way to do it is the below.

    How to Add Row Level Security to Views in PostgreSQL
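
For reference, the basic table-level policy setup in Postgres looks something like this (a minimal sketch – the table, column and role names are hypothetical):

-- One tenant column per table; each tenant connects as its own database role
CREATE TABLE t001orders (
    pkid   serial PRIMARY KEY,
    tenant text NOT NULL,
    detail text
);

ALTER TABLE t001orders ENABLE ROW LEVEL SECURITY;

-- Rows are only visible (and writable) where the tenant matches the connected role
CREATE POLICY tenant_isolation ON t001orders
    USING (tenant = current_user)
    WITH CHECK (tenant = current_user);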

    Identify whether you are using MariaDB or MySQL using phpMyAdmin (through cPanel here)

    When it comes to relational databases I used to think right there’s

      MySQL
      SQL Server
      PostGres
      Oracle

I know there is MariaDB as well, and I know it is a direct substitute for MySQL. I think I might be using it but, you know what, I'm not really sure. Whenever I look things up I quite often search for MySQL, and to date any information obtained using those search terms has worked when executed against my databases through phpMyAdmin.

    Should I be using MariaDB or hang on – am I???

Well, it turns out they are so similar that you might be using MariaDB and not even know it. It turns out my hosting company uses MariaDB instead of MySQL, but you might be under the impression you are still using MySQL. Why? Because if you are like me and you are using cPanel, all the branding is still MySQL. I suspect this is actually very usual for anyone using cPanel.

    Here’s my Database panel in cPanel

    MySQL written all over the place even with a registered trademark symbol reinforced with the Dolphin MySQL logo.

Additionally, phpMyAdmin never really mentions much about the database server you are using. And if you look it up, phpMyAdmin is often described as the MySQL client.

    Am I using MySQL —— NOPE

    Open up phpMyAdmin

    Open the SQL panel and execute the following query.

    SHOW VARIABLES LIKE "%Version%";

    So despite all the logos and branding to the contrary I’m using MariaDB for this blog! Useful to know. I don’t know if it was MariaDB when I signed up in 2014 or not but this blog has been running continuously for 7 years and 7 months (at date of publishing this article) with no issues.

    Kinsta on the differences between MariaDB and MySQL

    I’m happy with being on MariaDB just because I am not mad about Oracle. Turns out I’m not alone in that position. Google / AliBaba and the European Investment Bank are all heavily invested in MariaDB which is probably why so many hosting companies actually run MariaDB in the background. There are big companies on MySQL but MariaDB is not going anywhere and many like me might have been running everything on MariaDB without even knowing it for years!

    MariaDB on Wikipedia

I note the Wiki article states Geographical Function support as a difference between MySQL and MariaDB – I'm not sure that part is true and I have asked Wikipedia to investigate; according to my research there is support via extensions, although I have no personal experience of their use.

    Setting up CRON job using cPanel

    Automation is so important these days and the following details how to create a simple cron job using CPANEL to send out a regular email. You can automate anything but I was wanting to use an example that would send me something to show that it was working. So I set up the following as an example to send out an email to myself every 5 minutes.

    See also
    How to Set Up and Run a Cron in CPanel

    I am going to focus on a single method. The above link details how to do it using the command line on a linux server which may be helpful. I am concentrating on using the cpanel interface here.

    Step 1 : Sign into your cpanel administrative panel.

    Step 2 : Look to the Advanced Tab and identify the Cron Jobs Icon

    Entering the Cron Jobs panel you will be presented with the following screen.

    Leave the Update email button…

Step 3 : Identify the PHP script you wish to run and load it into a directory, taking note of where you have put it.
    (You will need to point CRON to run your specific script)

If you don't have a script, try this one. It uses the PHP mail function to send a simple email to whatever target address you want, which lets you confirm the job is firing.

    <?php     
    $to_email = 'targetemail@targetcompany.com';
    $subject = 'Mail sent using a Cron Job Script';
    $message = 'Hello and best wishes';
    $headers = 'From: noreply@yourcompany.com';
    mail($to_email,$subject,$message,$headers);
    ?>
    

Step 4 : Save the above PHP script and upload it to your server, taking a note of where you put it.

Step 5 : Set the schedule. I would start by setting this to run every 5 minutes just to test that it is working.

In the Common Settings there is a drop down with a range of settings, everything from once a minute to once a year – select once every 5 minutes.

    Step 6 : Set the command to be run.
    Now this is the only really tricky part to the whole thing. I haven’t found a great deal of documentation on the syntax of the command that you should enter. I found two different syntaxes that seemed to work.

    /usr/local/bin/php /home/youruserid/public_html/cronscripts/testrun.php
    

    or

    php -q /home/youruserid/public_html/cronscripts/testrun.php
    

Then simply hit the Add New Cron Job button and wait for five minutes – if everything has completed successfully there will be an additional line in the cron jobs listing and you will get an email every 5 minutes.

    Happy Cron Jobbing.

    One interesting point in my cpanel host was that you could limit the cron job to the domain which may be necessary as you become more proficient with the kind of jobs that you are wanting to run in your cron job.

    PHPRunner by Xlinesoft – a personal recommendation if you need a Low Code platform perfect for CRUD application development

Bit of background: like most people I am constantly looking out for better tools, and I have been periodically searching Google for low code tools for nearly 5 years. 3 years ago I started dipping my toe into web code generators, and about 2 and a half years ago I went out and bought tools from Xlinesoft – I have copies of both ASP.NET Runner and PHPRunner and I thought I would give some feedback. It seems to me incredibly difficult to get unbiased reviews of low code platforms. If you do a search on Google you only seem to find the really big (and expensive) players, who don't clearly indicate how good they are for CRUD application generation. The main generators I looked at were Nubuilder (open source, but it only supports MySQL databases and doesn't produce responsive designs), PHPMaker (not bad, but I wasn't getting great performance and didn't like the UI of the generator that much) and Radzen (good, but ASP.NET based and a bit young at the moment).

    Xlinesoft Main Website

Word up – I basically think Xlinesoft's PHPRunner is particularly excellent. I have copies of both ASP.NET Runner and PHPRunner, but because I can get better performance with PHPRunner and because Linux hosting is cheaper, I am moving all my applications across to PHPRunner. ASP.NET Runner has served me well – I had one client on it and they are very happy – but I still plan to move them across to PHPRunner.

    I have managed to create 5 working applications and I am starting to get independent clients who I make applications for. To date they have been very happy with my work and are usually very happy with how quickly I can turn round their requests.

    Points I would draw your attention to specific to PHPRunner.

  • Code produced is non proprietary and uses vanilla PHP.
  • PHP 7.4 support is standard and PHP 8 support is being introduced. I generally use the latest PHP version my host provides an option for which to date has been 7.4. I am really looking forward to using it with PHP 8.
• I've had no problem deploying the code to Linux web servers – it seems to work seamlessly. Previous versions of PHP are supported.
  • Initial scaffolding of a connected database is fast and flawless. Every connected table gets a List / Add / Edit / View and Search page. This saves a massive amount of time in getting started.
• So far I have connected to MySQL / Postgres and SQL Azure databases as the backends and all seem to work well.
  • To date I have only used SQL Azure databases with Azure application service
• I have had excellent performance between PHP Runner on Azure Web App Services and SQL Azure (even with the basic developer level web app and database provisions)
  • I have had excellent performance between PHP Runner and MariaDB hosted on a linux paid host
  • I have had excellent performance between PHP Runner and Postgres where the web client is on a linux paid hosting service and postgres is on ACUGIS hosting service.
  • PHP Runner seems to deploy really really nicely to an apache server.
• Security setup seems intuitive and clear but very powerful.
  • Two factor authentication is really easy to set up, either via email or via SMS through a service such as Twilio.
  • Additional applications are free – there is no per seat charge. If you are paying for a host with the option of unlimited subdomains, it is likely that you will be able to get an application up and running for no additional monetary cost.
  • Setting up email is really easy.
• Code behind for events is very intuitive – the only thing that will hold you back is your ability to code.
  • Charting seems good – I haven't used it extensively, but it works as I expected and looks pretty.
  • The menu setup either the initial screen or a tree setup is very flexible and extremely intuitive
  • Like I said the generated code is not proprietary so no need to fuss with security tokens or things like that.

SQL Server – TSQL – Create a computed column that calculates the name of the day from a date

Nothing earth shattering but really quite useful, and often a ubiquitous requirement: I find it useful to have the day of the week in tables sometimes, and in most tables I will always have default fields of updated and created. I can then use the created field to drive a day name.

[daysname]  AS (datename(weekday,[T001TimeLog].[created]))
    

    And here it is as part of a table’s creation

CREATE TABLE [dbo].[T001TimeLog](
	[pkid] [int] IDENTITY(1,1) NOT NULL,
	[TimeEstimate] [time](7) NULL,
	[Description] [nvarchar](max) NULL,
	[updated] [date] NULL,
	[created] [date] NULL,
	[daysname]  AS (datename(weekday,[T001TimeLog].[created]))
)

    And here is the field value on screen and through a User Interface
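
If you want to check it from a query window instead, a quick sketch against the table above:

INSERT INTO dbo.T001TimeLog (TimeEstimate, Description, updated, created)
VALUES ('01:30', 'Test entry', '2021-10-04', '2021-10-04');

-- daysname comes back as 'Monday' for 2021-10-04
SELECT pkid, created, daysname
FROM dbo.T001TimeLog;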

    MYSQL / MariaDB – Useful SQL in both creating New Tables and Clarifying Default Parameters

    Within an instance to identify the default character sets and collation in all databases.

    SELECT * FROM INFORMATION_SCHEMA.SCHEMATA;
    

To alter the default character set and collation for a database in a MySQL or MariaDB instance.

    ALTER DATABASE dbname CHARACTER SET utf8 COLLATE utf8_general_ci;
    

    Creating Table – Template SQL

    CREATE TABLE table1 
    (col1 INT(10) NOT NULL AUTO_INCREMENT, 
    col2 CHAR(30), 
    PRIMARY KEY (col1))
    ENGINE = INNODB;
    

    And a good source of data types are here
    Tech on the Net

There are unconfirmed reports that queries against TEXT fields are generally 3 times slower than against VARCHAR fields.

    Values in VARCHAR columns are variable-length strings. The length can be specified as a value from 0 to 255 before MySQL 5.0.3, and 0 to 65,535 in 5.0.3 and later versions. The effective maximum length of a VARCHAR in MySQL 5.0.3 and later is subject to the maximum row size (65,535 bytes, which is shared among all columns) and the character set used.

VARCHAR(MAX)

Is a thing – although in SQL Server rather than MySQL / MariaDB, where the practical options are VARCHAR(n) up to the row size limit or the TEXT types.

And if you want to change the engine on a table retrospectively, see OPERATIONS.
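
The statement itself is a one-liner (using the table1 example from above):

ALTER TABLE table1 ENGINE = InnoDB;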

    And a short note on your choice of collation

    From Stack overflow here

    Actually, you probably want to use utf8_unicode_ci or utf8_general_ci.

    utf8_general_ci sorts by stripping away all accents and sorting as if it were ASCII
    utf8_unicode_ci uses the Unicode sort order, so it sorts correctly in more languages
    However, if you are only using this to store English text, these shouldn’t differ.

    Note further down there is a person that states “be very aware of this problem that can occur using utf8_general_ci

    “MySQL will not distinguish between some characters in select statements, if the utf8_general_ci collation is used. This can lead to very nasty bugs – especially for example, where usernames are involved. Depending on the implementation that uses the database tables, this problem could allow malicious users to create a username matching an administrator account.

    This problem exposes itself at the very least in early 5.x versions – I’m not sure if this behaviour as changed later.”

    And here is some code that allowed me to reset the auto increment index on a table.

    ALTER TABLE my_table MODIFY COLUMN pkid INT(10) UNSIGNED;
    COMMIT;
    ALTER TABLE my_table MODIFY COLUMN pkid INT(10) UNSIGNED AUTO_INCREMENT;
    COMMIT;
    

I have tested it and it works with InnoDB and MySQL.

Another way to do it might be to drop the pkid column and then recreate it again

    Using

    ALTER TABLE mytable DROP pkid;
    ALTER TABLE mytable ADD pkid INT NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST;
    

    CPanel – Setting up an additional FTP Account

As previously stated, there are a lot of hosted services that come with CPANEL installed. We have already indicated how it is possible to create a subdomain that can be used to host PHP based sites.
If you are using a design environment you will want to set up an FTP account that can be used to upload your applications to. It is worth just making sure that you are familiar with this.

    Firstly log into your account at your host cpanel.

    Once configured it should appear in your FTP Accounts section here

    If you want to see the FTP parameters for upload hit the related Configure FTP Client

CPANEL only allows you to create FTP accounts against domains and subdomains that already exist, so with that done you should be ready to install software.

    SQL Azure – Trying to Alter Computed Column definition

    The situation – I have a computed column that I want to alter in SQL Azure

After googling away, it turns out you cannot alter a computed column – what you have to do is drop it and then re-create it.

Bit disappointing – I then thought maybe I can just alter the function. Foiled AGAIN – that MIGHT have been possible but for the fact that I needed to introduce an additional parameter into the function, and that won't work as I have to alter the parameter definition in the column definition, which again is an alteration of the column.

    Oh well

    You might want to take a minute to compose yourself / make sure you’ve got a backup and then I guess go for it!

ALTER TABLE dbo.t002table1
	DROP COLUMN DistillPeriod;
ALTER TABLE dbo.t002table1
	ADD DistillPeriod AS ([dbo].[calculatematuration]([t002table1].[dateoffill], getdate(), [t002table1].[disgorgedate]));
    

    Postgres – create a trigger to automatically calculate area of Polygon on edit

This will calculate the area in hectares of a polygon on creation or edit (referenced here as BEFORE INSERT OR UPDATE ON).

This appears to work for polygon and multi-polygon geometry types. I have had a discussion with a colleague and he has indicated that the type of projection may affect the accuracy of the measurement. Certainly there is a straight division in there, so it would probably be possible to alter the division figure by empirically testing to get it as accurate as possible. Apparently with some popular projections, the further north you are the less accurate the resultant figure. (Just something to bear in mind.)

    CREATE OR REPLACE FUNCTION calc_area() 
    RETURNS trigger AS $BODY$ 
    BEGIN NEW.grossarea := ROUND((st_area(NEW.geom)/10000)::numeric,2); 
    RETURN NEW; 
    END; 
    $BODY$ LANGUAGE plpgsql; 

    What I found interesting about this was the field name is defined in the function and called from the trigger. There is possibly a better way of writing this to define the field in the trigger itself..

    And the trigger that goes on the table

    CREATE TRIGGER area_calculate BEFORE INSERT OR UPDATE ON public.t001landparcels FOR EACH ROW EXECUTE PROCEDURE calc_area();
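
A quick way to check the trigger is behaving (this assumes PostGIS is installed, the table has geom and grossarea columns, and the geometry is stored in a metre-based projection such as British National Grid, SRID 27700):

-- A 300m x 200m rectangle = 60,000 sq m, so the trigger should write 6.00 hectares
INSERT INTO public.t001landparcels (geom)
VALUES (ST_GeomFromText('POLYGON((0 0, 0 200, 300 200, 300 0, 0 0))', 27700));

SELECT grossarea FROM public.t001landparcels;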

    CPANEL setting up ALLOW and Deny on IP addresses / forcing HTTPS

    Two changes to the .htaccess that are easy to implement and will improve the security of PHP websites.

    If you have subdomains that you would like to restrict access to and they are running a PHP site. You can use the .htaccess file to allow or deny access.

A very uncomplicated approach: if you have a particular site that you would like to ringfence, why not just reduce access to it to one location… your home.

    <RequireAny>
    require ip x.x.x.x
    require ip y.y.y.y
    </RequireAny>
    

    You can also use this to allow or block entire countries

    Block a country from your site

    And if you would like to find out the IP address range required to block you can find that here.

    IP2location.com

    The nice thing about this is that it works on subdomains. Using the CPANEL file manager just go into the subdomain find the htaccess file and block accordingly.

    And

    Force all of your pages to use HTTPS. To do this you will need to modify your .htaccess file (or create one if it doesn’t exist)

    Using the Code Editor in the file manager, add these lines to the beginning of the .htaccess file

    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

    Installation of Nubuilder through CPANEL

    Background – I wrote sometime ago about an open source low code project called nubuilder originally started by a Steven Copely – it has been steadily and consistently developed over the years and Steven has since shared development with some other talented individuals. I took another look at it recently to see whether  I could find a way of installing and playing about with it on my existing web hosting provider…

    and…

    If you have a webhost that allows you access to a CPANEL client to configure your hosting environment then it is highly likely that you can set up a nubuilder low code environment. If you have a webhost that allows access to a CPANEL account AND allows you unlimited sub domains and unlimited storage then it is highly likely that you will be able to create a nubuilder low code environment for zero cost!!! 

    A quick search online found that the following offer CPANEL / unlimited subdomains / unlimited MYSQL

    A2Hosting

    WebhostingHub (which are a subsidiary of InMotion)

    SiteGround

    There is usually a limit on the size of an individual MYSQL database but it is so high as to not be an issue.

    Configuration Instructions:

This post was adapted from Steven Copley's video which can be found here..

    Installing on Bluehost

Firstly, below I have blanked out my cPanel username with either a grey circular box or, in code, replaced it with an X. This variable is used by CPANEL software to allow resellers to ensure that thousands of users on the same servers do not accidentally duplicate directory and file names. The username can only be used with an additional password, and it is not generally available to anyone but the host admin, so it acts as additional security. It will appear in many of the default settings when creating things like databases and directory names.

    Firstly Log into your providers CPANEL

    Look to the Domains section and within it there should be a Subdomains icon.

    and select the Subdomains icon

    Create a new subdomain

Here in the Subdomain field I have typed nbexample – the document root will automatically be completed

    Hit the Create button

    Next we create a database

go to MySQL Databases within the Databases menu and create a new database – here I have called it nb4exampledb (cPanel prefixes it with the username to give X_nb4exampledb)

    Now we create a New User

    Here I do the same

    username will be X_nb4exampledbuser

    password

    BlueSkyIsEverywhereToday2020

Hit the Create User button, then Go Back and add the user to the database

On hitting Add you will be asked to select the privileges that the user has over the database – just indicate that you would like to allow All Privileges.

Then hit the Make Changes button. You can, if you want, then check the MySQL databases screen to confirm that the database exists and that the user is there.

Next we go to the nuBuilder GitHub site and download the master branch as a zip file

    Nubuilder 4.5 on GitHub

And select the Download ZIP option from the Code drop down list.

This will download a master zip file to your download folder – in your browser window, in the bottom left, you will see the file like this

You now need to go back to the main CPANEL hub, look for the section marked Files and find the File Manager option. In my CPANEL it is a red icon.

    The subdomain will be created as a directory within your root home directory and in my example looks like this. You will be transferring the master zip file into this directory so you want to select it.

    Next hit the Upload button and navigate to find

    nubuilder-4.5-master.zip

    And load it into your base directory. It should look something like this now

    Next you want to extract it..

    It will then ask where you wish to extract it to

    I will normally not put anything in here as it creates its own directory.

    There will be a short delay after which you will be presented with a dialog that shows the outcome of the extraction process.

You should now see a new directory, which in my case is called nuBuilder-4.5-master.

I don't like hyphens, dots, capitals or special characters in directory names. So:

    • I rename it to nubuilder45
    • Convert to all lower case
    • Delete the old zip file

Next we want to go into the nubuilder45 directory and open up the nuconfig.php file.

This is where we allow the nuBuilder PHP to link to the MySQL database we created.

    Right mouse click and select edit.

    And find the following lines

    Now remember from the start our database name / user and password are as follows.

    • Database : X_nb4exampledb
    • Username : X_nb4exampledbuser
    • Password : BlueSkyIsEverywhereToday2020

And save the changes.

    Next we want to create a very simple index.php file – alter the path to suit your subdomain / domain and directory where you extracted the nubuilder master file to.

<!DOCTYPE html>
<html>
<body>
<script type="text/javascript">
window.location.replace("https://nbexample.cloudydatablog.net/nubuilder45")
</script>
</body>
</html>
    
    

    It may well look like this

    This simple index.php file is placed in the root of the new subdomain you created

    Now you should be able to go to the new subdomain in any browser on the planet and …

     

    Congratulations you have successfully configured a default instance of nubuilder – your journey to creating low code online databases can now begin..

    PS the default credentials are

    Username : globeadmin

    Password : nu

    Please change these immediately in the nuconfig.php file when you get a chance.

    For more information on starting to design and develop with nubuilder please follow the link below

    link