Thursday, September 30, 2010

Azure Storage Performance

From a presentation by Brad Calder - Microsoft PDC 2009 

Per Object/Partition Performance at Commercial Availability


Single Queue and Single Table Partition

Up to 500 transactions per second 

Single Blob

Small reads/writes up to 30 MB/s

Large reads/writes up to 60 MB/s


Latency

Typically around 100ms (for small transactions)

But can increase to a few seconds during heavy spikes while load balancing kicks in

Tuesday, September 21, 2010

A really good explanation of SQL Server and indexes

It didn't answer my current question, but it was good to read this again:

For reference: my current question was how to optimise a query which does a > comparison between two columns. I can see how to do it using an indexed computed column, but that's not available to me at present, as I'm using Fluent NHibernate...

This question helped a bit "What happens when a function is used in the WHERE clause? In that case, the index will not be used because the function will have to be applied to every row in the table."
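A sketch of that point, using a hypothetical Orders table with an index on its OrderDate column:

```sql
-- Hypothetical table: Orders, with an index on OrderDate.

-- The function forces SQL Server to evaluate YEAR() for every row,
-- so the index on OrderDate cannot be used for a seek:
SELECT * FROM Orders WHERE YEAR(OrderDate) = 2010;

-- Rewritten as a range predicate on the bare column,
-- the same index can be used:
SELECT * FROM Orders
WHERE OrderDate >= '20100101' AND OrderDate < '20110101';
```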

Sunday, September 19, 2010

When your SQL log (ldf) file gets Huge...

If you're having problems with large SQL log files, then this was the most useful advice I found.

For my purposes I was a "no" - my data wasn't money-sensitive... so I switched to "simple" and could then shrink my database using the Tasks right-click menu in SSMS.

Do you make transaction backups?
Yes -> Do not shrink the log unless you have done some exceptional operation (such as a massive delete) as the "cost" of SQL Server re-growing the Log file is significant, and will lead to fragmentation and thus worse performance.
No and I don't want to -> Change the Recovery Model to "Simple" - Enterprise Manager : Right click database : Properties @ [Options]
Don't know -> Transaction log backups allow you to recover to a "point in time" - you restore the last Full Backup (and possibly a subsequently Differential backup) and then every log backup, in sequence, until the "point in time" that you want to roll-forward to.
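For reference, the same switch can be scripted rather than done through the Properties dialog - a sketch assuming a database called MyDatabase whose log file has the default logical name MyDatabase_log:

```sql
-- Switch the recovery model to Simple
ALTER DATABASE MyDatabase SET RECOVERY SIMPLE;

USE MyDatabase;

-- Check the logical name of the log file (assumed to be MyDatabase_log below)
SELECT name FROM sys.database_files WHERE type_desc = 'LOG';

-- Shrink the log file down to (roughly) 10 MB
DBCC SHRINKFILE (MyDatabase_log, 10);
```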

Saturday, September 18, 2010

Timing tests on Azure

In order to test the performance of your app on Azure, you really need to test on Azure.

This is especially the case when your app needs to talk to local SQL Azure, Table, Blob, and Queue resources - you can't test the performance of these:
- when you are running the storage on the DevFabric
- when you are running on the real storage, but with your app on a dev machine outside of the data center

When you are running on Azure, how do you get your results out?

Well, to start with I've just used some simple tests using System.Diagnostics.StopWatch:

At the start of the method to be tested:
            List<KeyValuePair<string, long>> times = new List<KeyValuePair<string, long>>();
            times.Add(new KeyValuePair<string, long>("Original", 0L));
            // StartNew() creates the Stopwatch and starts it running
            System.Diagnostics.Stopwatch timer = System.Diagnostics.Stopwatch.StartNew();

Half way through:
            times.Add(new KeyValuePair<string, long>("namedPoint", timer.ElapsedMilliseconds));

And at the end:
            times.Add(new KeyValuePair<string, long>("timeToEnd", timer.ElapsedMilliseconds));

            foreach (var t in times)
                System.Diagnostics.Trace.TraceError(string.Format("{0}:{1}", t.Value, t.Key));

This then uploads the results to Azure Diagnostics - currently Trace is routed to Table storage.

Obviously this is only a temporary solution - only suited for simple methods, and you have to remove this code for Production.

Friday, September 17, 2010

Azure Storage Tools

A useful list from

Personally I quite like although I've also used recently

Windows Azure Storage - Block Blob - Page Blob

Azure Blob Client

Azure Blob Compressor - enables compressing blobs for upload and download

Azure Blob Explorer

Azure Storage Explorer

Azure Storage Simple Viewer

Cerebrata Cloud Storage Studio

Cloud Berry Explorer

Clumsy Leaf Azure Explorer - Visual Studio plug-in

Factonomy Azure Utility

Gladinet Cloud Desktop - a portal to access blobs, tables and queues

Space Block

Windows Azure Management Tool

Azure Links

Credit to Neil Mackenzie for this list - from!B863FF075995D18A!706.entry

Azure Team – Azure Support Team – Microsoft Codename Dallas – SQL Azure Team - Windows Azure Developer Tools Team – Windows Azure Storage Team – Windows Azure Team

Individual – Adam Sampson – Brent Stineman (@BrentCodeMonkey) – Brian Hitney – Cloud Cover (@cloudcovershow) – David Aiken (@TheDavidAiken) – David Chappell – Eric Nelson (@ericnel) – Guy Shahine – J.D. Meier – Jim Nakashima (@jnakashima) – Maarten Balliauw – Neil Mackenzie (@mknz) – Patterns & Practices: Windows Azure Security Guidance – Roger Jennings (@rogerjenn) – Ryan Dunn (@dunnry) – Scott Densmore (@scottdensmore) – Steve Marx (@smarx) – Sumit Mehrotra – Steven Nagy (@snagy) – Toddy Mladenov (@toddysm)

Azure Status – Azure Status

Other Clouds – High Scalability – James Hamilton – Werner Vogels

SQL Azure Management - Backup and Monitoring

SQL Azure Backup and Restore
SQL Azure does not support the normal SQL Server BACKUP commands.

It does provide inherent data redundancy and reliability. However, this will not help if:
  • there is an app bug which corrupts your data
  • a hacker destroys your data
  • Microsoft's datacenter loses all of your data (unlikely but possible)
To cope with these scenarios, there are two mechanisms currently available to back up your data:
  • You can use the SQL Azure copy commands to copy your database contents to a second database, still stored within SQL Azure - this will protect against only the first problem above (assuming the hacker has a way into your Azure store)
  • You can use the open source SQL Azure Migration Wizard to download the data to a local copy - this will provide a recovery mechanism for all three items above
The use of this second tool is discussed further below:

SQL Azure Data Migration

Warning - using this for a backup process will involve a large amount of data transfer (which will cost money), but it's better than not having a backup at all.


To use this tool:
  1. download it from
  2. choose "Analyze and migrate T-SQL database"
  3. choose the Azure database - username (in the form username@yourserver) - and password
  4. Note that you can only connect to the SQL Azure database from IP addresses whitelisted in the firewall settings on the SQL Azure web control panel
  5. choose "[databaseName]"
  6. choose "script all database objects"
  7. hit "next" and it will slowly happen
  8. choose your local SQL server to connect to
  9. create a new SQL server database
  10. hit "next" to insert the data into your local database
For restore, you need to go in the opposite direction


To do this:
  1. typically suspend the Azure web application
  2. use the Azure web user interface to drop the existing SQL Azure database
  3. run the steps above in reverse - from local to SQL Azure
  4. restart the Azure web application

To use the SQL Azure copy commands instead:
  1. connect to SQL Azure using SQL Server Management Studio
  2. open a "New Query"
  3. type "CREATE DATABASE backupName AS COPY OF mainName"
  4. hit Execute

The state of the copy can then be monitored using "select name, state, state_desc from sys.databases where name = 'backupName'"

Note that the cost of running multiple databases on Azure suggests that you should not keep taking more and more backups and storing them all within SQL Azure.

SQL Azure Monitoring
There really isn't much that can currently be done in the way of monitoring the health of an Azure SQL Server.

This document describes the ability to query:
  • Database size
  • Number of connections - and their activity
  • Some simple query performance
However, this doesn't really provide any clue as to monitoring the database "health" - e.g. uptime/status, loading, etc.

There is some access to overall status at - but this is at a very coarse level.

The only current suggestions for checking that a SQL Server instance is alive are:
  • just rely on Microsoft uptime (under their SLA) - not ideal
  • use tools like Nagios to add HTTP monitoring to Azure applications connected to the SQL Server database - use this web monitoring to check the health of the server (at least to check it is alive)
  • try to build something (e.g. a web app) based around sys.dm_exec_connections, which would provide some analysis of loading
  • try to extend the Enzo SQL library - its benchmarking tools might give you some clue on current loading (but this monitoring might also impact your current performance)
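As a sketch of the sys.dm_exec_connections idea (this assumes your login has the VIEW DATABASE STATE permission; some columns may be restricted on SQL Azure):

```sql
-- One row per current connection, with connect time and traffic counts
SELECT c.session_id,
       c.connect_time,
       c.num_reads,
       c.num_writes,
       c.client_net_address
FROM sys.dm_exec_connections AS c
ORDER BY c.connect_time;

-- A coarse "loading" figure: just count the connections
SELECT COUNT(*) AS current_connections
FROM sys.dm_exec_connections;
```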

Thursday, September 16, 2010

If you can't run the Visual Studio wizard for SharpArchitecture

If you see a series of messages like "A problem was encountered
creating the sub project 'sharp.Core'. A project with that name is
already opened in the solution."

Then it could be due to problems with AnkhSVN - see

So I had to uninstall source control to get it working....

Tuesday, September 14, 2010

Azure - stuck in Initialising-Busy-Stopping...repeat

This link seems a good reference!

Some points:
  • missing dependencies
  • problems with initialising WAD (Azure Diagnostics connection string)
  • problems with other connection strings
  • lack of permission (medium trust!)

If you hit a problem with ConfigMonitoringPoll - Polling for configuration changes - when using Azure Diagnostics

Reported as:
Polling for configuration changes:Microsoft.WindowsAzure.StorageClient.StorageClientException: The specified blob does not exist. ---> System.Net.WebException: The remote server returned an error: (404) Not Found. at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult) at Microsoft.WindowsAzure.StorageClient.EventHelper.ProcessWebResponse(WebRequest req, IAsyncResult asyncResult, EventHandler`1 handler, Object sender) --- End of inner exception stack trace --- at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.get_Result() at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.ExecuteAndWait() at Microsoft.WindowsAzure.StorageClient.CloudBlob.DownloadToStream(Stream target, BlobRequestOptions options) at Microsoft.WindowsAzure.Diagnostics.ControlChannel.<>c__DisplayClass10.<ConfigMonitoringPoll>b__f() at Microsoft.WindowsAzure.Diagnostics.ControlChannel.TryExpectError(HttpStatusCode status, Action act) at Microsoft.WindowsAzure.Diagnostics.ControlChannel.ConfigMonitoringPoll(Object sender, ElapsedEventArgs args); TraceSource 'WaWebHost.exe' event

Then the problem might be that you must initialise the Azure Diagnostics from the start of the WebRole - and not from the start of the app in Global.asax - just add this new file to your web role project:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace MyProject.Web
{
    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // initialise Azure Diagnostics here, before the ASP.NET app starts

            return base.OnStart();
        }
    }
}
Adding Diagnostic monitoring to Azure

If you want to get some diagnostics from Windows Azure deployed applications then follow the instructions on:

This gives you a solution like:

    public static class AzureDiagnostics
    {
        public static void InitializeAzureDiagnostics()
        {
            // do not run this code if we're not running in the cloud.
            if (!Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.IsAvailable)
                return;

            // Enable the capture of partial crash dumps
            Microsoft.WindowsAzure.Diagnostics.CrashDumps.EnableCollection(false);

            // configure and start the diagnostics
            DiagnosticMonitorConfiguration dmc = DiagnosticMonitor.GetDefaultInitialConfiguration();

            AddInfrastructureLogs(dmc);
            AddSystemEventLog(dmc);
            AddLogs(dmc);
            AddPerformanceCounters(dmc);

            DiagnosticMonitor.Start("DiagnosticsConnectionString", dmc);
        }

        private static void AddInfrastructureLogs(DiagnosticMonitorConfiguration dmc)
        {
            dmc.DiagnosticInfrastructureLogs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
            dmc.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = TimeSpan.FromMinutes(10.0);
        }

        private static void AddSystemEventLog(DiagnosticMonitorConfiguration dmc)
        {
            // Add event collection from the Windows Event Log
            dmc.WindowsEventLog.DataSources.Add("System!*");
            dmc.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
            dmc.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(10);
        }

        private static void AddLogs(DiagnosticMonitorConfiguration dmc)
        {
            dmc.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(10);
            dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
        }

        private static void AddPerformanceCounters(DiagnosticMonitorConfiguration dmc)
        {
            List<string> counters = new List<string>();
            counters.Add(@"\Processor(_Total)\% Processor Time");
            counters.Add(@"\Memory\Available Mbytes");
            counters.Add(@"\TCPv4\Connections Established");
            counters.Add(@"\ASP.NET Applications(__Total__)\Requests/Sec");

            // These network interfaces removed - not that interesting for now
            //counters.Add(@"\Network Interface(*)\Bytes Received/sec");
            //counters.Add(@"\Network Interface(*)\Bytes Sent/sec");

            foreach (string counter in counters)
            {
                PerformanceCounterConfiguration counterConfig = new PerformanceCounterConfiguration();
                counterConfig.CounterSpecifier = counter;
                counterConfig.SampleRate = TimeSpan.FromSeconds(10);
                dmc.PerformanceCounters.DataSources.Add(counterConfig);
            }

            dmc.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(5.0);
        }
    }

Monday, September 13, 2010

Re: "The "COM0" port did not open." Bada device development

If you're trying to debug a Bada device and you see "The "COM0" port
did not open." under the entry for the Broker.exe in the Bada IDE
Console, then:

1. If it's the first time you've ever tried to connect, then check that
you've installed the necessary USB drivers and have the device connected.
2. If you've previously been using this fine, but it has suddenly
stopped working, then try:
--- rebooting your phone
--- killing broker.exe in the PC task manager
--- setting your phone's USB connection mode to Samsung Kies or to Mass
Storage, and then using Run As rather than Debug As in the IDE
--- hitting the task manager on the phone and choosing "End All Applications"

I'm not entirely sure what worked for me... but I think it was a
combination of the last few in the end!