If you are running things in the cloud you should always be thinking about backups: be aware of what is being backed up, know how to restore from those backups, and practise restoring databases regularly.
Basic, Standard and Premium SQL Azure databases have two basic options for restore:
Point in Time Restore and
Long Term Retention
They perform the same task – they keep old copies of databases that can be used to restore to a server if need be – what varies is the UI by which you restore them.
First point – the Long Term Retention policy is attached to the server. To get to it, first navigate to your server and then look to the left-hand section marked Data management.
Select Backups and you will be presented with a list of all the long term backups of the databases on that server. The following are a couple of basic DTU databases and what can be seen on the fictional server 01SeverName.
In the above table you can see that a Long Term Retention policy has NOT been set. How do you change this?
First, ensure that you are on the Retention policies tab, then tick the particular database you would like to set the retention policy for in the lower section; this will allow you to configure a policy for that database.
Here are the options for a BASIC DTU database
Compare this with the options for a database in the Standard Tier
Set it up and then hit apply and you should be good to go.
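If you would rather script this than click through the portal, the Azure CLI has an `az sql db ltr-policy set` command that does the same job. Here is a rough sketch in Python that builds the command line – the resource group, server and database names are made-up examples, and the retention values are ISO 8601 durations:

```python
# Sketch: build the Azure CLI command that sets a long term retention
# policy for a database. All names below are hypothetical examples.
def build_ltr_command(resource_group, server, database,
                      weekly="P4W", monthly="P12M", yearly="P5Y",
                      week_of_year=1):
    """Return the argv list for `az sql db ltr-policy set`."""
    return [
        "az", "sql", "db", "ltr-policy", "set",
        "--resource-group", resource_group,
        "--server", server,
        "--database", database,
        "--weekly-retention", weekly,    # keep weekly backups for 4 weeks
        "--monthly-retention", monthly,  # keep monthly backups for 12 months
        "--yearly-retention", yearly,    # keep yearly backups for 5 years
        "--week-of-year", str(week_of_year),  # which week's backup is kept yearly
    ]

cmd = build_ltr_command("rg-demo", "01severname", "db001")
print(" ".join(cmd))
```

Once you have logged in with `az login` you can hand the list to `subprocess.run(cmd)`, or just paste the printed command into a shell.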
There is no shortage of articles on how to take a backup of a MySQL or MariaDB database from cPanel; however, given how important it is, I like to write these things down showing how I accomplished it, for my own reference. It is extremely easy to download a backup, which means you have no reason not to do it regularly if you have an important database that is, for instance, part of a web application.
1. Enter your given cPanel management portal.
2. Find the database section and select the phpMyAdmin icon.
3. Select the database you are interested in from the left-hand side.
4. Click Export in the menu section and then it is just a case of clicking Go. An SQL file will be downloaded to the download directory (on a Windows machine). Keep this file and you can run it to create a new database.
And it is very easy to go in and check that it is OK: you can use any good IDE, or you could change the suffix to .txt and just look at it in Word.
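If you want to go a step beyond eyeballing the file, a few lines of script can confirm the dump contains roughly what you expect. A small sketch in Python – the sample dump text here is invented, standing in for a real file downloaded from phpMyAdmin:

```python
# Sketch: quick sanity check of a MySQL/MariaDB dump by counting the
# CREATE TABLE and INSERT statements it contains.
import re

sample_dump = """
CREATE TABLE `messages` (`pkid` INT PRIMARY KEY, `subject` TEXT);
INSERT INTO `messages` VALUES (1, 'hello');
INSERT INTO `messages` VALUES (2, 'world');
"""

def summarise_dump(sql_text):
    """Return counts of CREATE TABLE and INSERT statements in a dump."""
    creates = len(re.findall(r"^\s*CREATE TABLE", sql_text,
                             re.MULTILINE | re.IGNORECASE))
    inserts = len(re.findall(r"^\s*INSERT INTO", sql_text,
                             re.MULTILINE | re.IGNORECASE))
    return {"create_table": creates, "insert": inserts}

print(summarise_dump(sample_dump))  # {'create_table': 1, 'insert': 2}
```

Swap `sample_dump` for `open("yourdump.sql").read()` to run it against a real download; if a table you care about is missing from the counts, something went wrong with the export.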
Happy backing up!
I note that for local SQL Express (and, I believe, the full SQL Server editions) there is the additional option of creating a BAK backup. I also note that, from my reading, BAK files are considered preferable to bacpac files because they enforce ACID compliance, and for large databases that are constantly in use they are recommended over bacpacs. SQL Azure doesn’t allow BAK file backups through SSMS from what I can see, so if this is an issue for you, consider temporarily disconnecting front ends from the database while a bacpac export is conducted. If you need a BAK file for some reason, you can restore the bacpac to a local SQL Server instance and take a BAK file from there.
Something to be aware of.
See this link for further information
Blobeater blog post
1. Connect to SQL Azure through SQL Server Management Studio.
2. Select the database that contains the table you wish to script.
3. Right click on the database and select Generate Scripts.
4. The Generate Scripts dialog should appear; choose Select specific database objects.
5. Hit Next and set the location where you wish the file to go.
6. Review the summary dialog to ensure you have everything correct.
7. Next you want to decide whether or not to include data in the script file. Hit the Advanced button and scroll down to Types of data to script. If you want data to be included, select Schema and data.
8. Hit the Next button and review your configuration.
9. Hit Next and it will start the process of exporting your table, with or without data. The dialog will update to show the status of script generation.
10. You will now be able to navigate to the script and open it in an editor of your choice to double check that things look OK.
Here I show a script that includes data – note you can’t see all of the insert information because the script is rather wide but I assure you it is there at the bottom.
I was contemplating a better way of storing our old emails and unhappy with some of the systems in place I started considering whether I could dump them into a database.
The thing is, when you export from Outlook, some of the standard fields – in particular To and From – are concatenated. Wouldn’t it be nice to separate those fields into their own table of names and addresses and reference them back to a main table of messages?
This is what I came up with.
For demonstration purposes I will use two tables – the parent table, t001parent:
pkid - autonumber primary key
ccaddresses - memo or long text
and the child table, t002newchildren:
pkid - autonumber primary key
ccaddress - string (150 should do it)
pkidt001 - number
and here is the blank second table
Next we create a user defined function
Public Function CreateChildTable()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim rsTarget As DAO.Recordset
    Dim strSQL As String
    Dim varData As Variant
    Dim i As Integer
    Dim intPKID As Integer

    Set db = CurrentDb
    'Select all the ccaddresses from the parent table
    strSQL = "SELECT pkid, ccaddresses FROM t001parent"
    Set rsTarget = db.OpenRecordset("t002newchildren", dbOpenDynaset, dbAppendOnly)
    Set rs = db.OpenRecordset(strSQL, dbOpenDynaset)

    With rs
        While Not .EOF
            intPKID = !pkid
            varData = Split(Nz(!ccaddresses, ""), ";") ' Get all semicolon delimited fields
            'loop through the email addresses and add each as a child record
            For i = 0 To UBound(varData)
                rsTarget.AddNew
                rsTarget!ccaddress = Trim(varData(i)) ' remove spaces before writing new fields
                rsTarget!pkidt001 = intPKID
                rsTarget.Update
            Next i
            .MoveNext
        Wend
    End With

    rs.Close
    rsTarget.Close
    Set rsTarget = Nothing
    Set rs = Nothing
    Set db = Nothing
End Function
After running this we should see the child table populated with one row per email address.
We’ve just nicely split the parent table ccaddresses field into constituent emails and normalised it as a child of the parent table.
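For what it’s worth, the same splitting logic is easy to express outside Access. Here is a sketch in Python that mirrors what the VBA function does – the sample rows are invented:

```python
# Sketch: split a semicolon-delimited ccaddresses field into child rows,
# mirroring the VBA CreateChildTable function. Sample data is made up.
def split_to_children(parent_rows):
    """parent_rows: list of (pkid, ccaddresses) tuples.
    Returns child rows as (ccaddress, pkidt001) tuples."""
    children = []
    for pkid, ccaddresses in parent_rows:
        for part in (ccaddresses or "").split(";"):
            address = part.strip()  # remove spaces before writing new fields
            if address:  # skip empty entries from trailing semicolons
                children.append((address, pkid))
    return children

parents = [(1, "a@example.com; b@example.com"), (2, "c@example.com")]
print(split_to_children(parents))
# [('a@example.com', 1), ('b@example.com', 1), ('c@example.com', 2)]
```

The only real difference from the VBA is that blank entries (from trailing semicolons or empty fields) are skipped rather than written as empty child rows.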
SQL Azure, as part of the service, offers a number of differing backup options as standard – however, you may wish to take additional backups which, for instance, you can load into a local version of SQL Server (Express or Enterprise). Here’s a quick rundown of one method of doing this.
Firstly open up your version of SQL Server Management Studio and navigate to the instance and database that you wish to backup
Highlight the database, in this case DB001, right click and select Tasks > Export Data-tier Application.
Choose an appropriate location to put the backup file
Review the summary of items that will be backed up and then hit Finish
There will be a rundown of how the export has worked.
And just a check to see the exported file in the directory
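The same export can be driven from the command line with the SqlPackage tool, which is handy if you want to automate it. A sketch in Python that builds the command – the server, database, credentials and file name are all placeholders:

```python
# Sketch: build a SqlPackage command that exports a bacpac from SQL Azure.
# Server, database, credentials and file names below are hypothetical.
def build_bacpac_export(server, database, user, password, target_file):
    """Return the argv list for a SqlPackage bacpac export."""
    conn = (f"Server=tcp:{server},1433;Initial Catalog={database};"
            f"User ID={user};Password={password};Encrypt=True;")
    return [
        "SqlPackage",
        "/Action:Export",                     # bacpac export action
        f"/SourceConnectionString:{conn}",    # where to read from
        f"/TargetFile:{target_file}",         # where to write the bacpac
    ]

cmd = build_bacpac_export("myserver.database.windows.net", "DB001",
                          "backupuser", "secret", "DB001.bacpac")
print(cmd[1])  # /Action:Export
```

With SqlPackage installed, `subprocess.run(cmd)` performs the same export as the SSMS wizard above, so it can be dropped into a scheduled job.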
This is hardly ground-breaking, but Windows Task Scheduler can be used to run VBS scripts on a variety of events to automate repetitive and boring tasks – backups, for example, or refreshing web browser caches before users come in.
Hit the Windows button.
Type “Task Scheduler”.
Press return and you should get the following screen.
Make sure Task Scheduler Library is selected in the tree on the left, then within the Actions sub-window on the right, click Create.
Give the task a name in the area above marked with a red arrow, then go to the Actions tab where you can use the New… button to set up the action.
Next use the Browse button or, if you know the path and file name, simply type in the path to the Visual Basic script you would like to run. Here I have put in an imaginary backup script. Next go to the Triggers tab and again hit the New… button.
The Begin the task selection list gives a good indication of the events that can trigger your task.
Continuing with the theme, here we set it to run a VBS script to do a backup at 11:48 every day.
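The post doesn’t show the VBS itself, but the sort of backup step such a script might perform is trivial in any scripting language. As an aside, here is a sketch in Python of a timestamped file copy – the paths are made up:

```python
# Sketch: copy a file to a timestamped backup, the kind of job you might
# point Task Scheduler at. Paths here are examples only.
import shutil
import time
from pathlib import Path

def backup_file(source, backup_dir):
    """Copy source into backup_dir with a timestamp in the file name."""
    source = Path(source)
    backup_dir = Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)  # create target dir if missing
    stamp = time.strftime("%Y%m%d-%H%M%S")
    target = backup_dir / f"{source.stem}-{stamp}{source.suffix}"
    shutil.copy2(source, target)  # copy2 preserves file timestamps
    return target
```

Point the scheduled task’s action at this script instead of a .vbs file and the Task Scheduler setup above is otherwise identical.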