Thursday, September 30, 2010

Azure Storage Performance

From a presentation by Brad Calder - Microsoft PDC 2009 


Per Object/Partition Performance at Commercial Availability

Throughput

Single Queue and Single Table Partition

Up to 500 transactions per second 


Single Blob

Small reads/writes up to 30 MB/s

Large reads/writes up to 60 MB/s


Latency

Typically around 100ms (for small transactions)

But can increase to a few seconds during heavy spikes while load balancing kicks in

Tuesday, September 21, 2010

A really good explanation of SQL Server and indexes

It didn't answer my current question, but it was good to read this again:
http://www.sqlteam.com/article/sql-server-indexes-the-basics

For reference: my current question was how to optimise a query which does a > comparison on two columns... I can see how to do it using a computed column with an index, but that's not available to me at present, as I'm using Fluent NHibernate (a sketch of the computed-column approach is below).

This question helped a bit "What happens when a function is used in the WHERE clause? In that case, the index will not be used because the function will have to be applied to every row in the table."
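For the record, the computed-column trick I couldn't use looks something like this - a sketch only, with made-up table and column names:

    using System.Data.SqlClient;

    class ComputedColumnSetup
    {
        // Hypothetical names throughout - substitute your own table and columns.
        static void AddIndexedComparisonColumn(string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = conn.CreateCommand())
            {
                conn.Open();
                // Materialise the ColA > ColB comparison as a persisted computed
                // column, then index it so the optimiser can seek rather than scan.
                cmd.CommandText =
                    @"ALTER TABLE dbo.MyTable
                        ADD AIsGreaterThanB AS CASE WHEN ColA > ColB THEN 1 ELSE 0 END PERSISTED;
                      CREATE INDEX IX_MyTable_AIsGreaterThanB ON dbo.MyTable (AIsGreaterThanB);";
                cmd.ExecuteNonQuery();
                // Queries then become: SELECT ... FROM dbo.MyTable WHERE AIsGreaterThanB = 1
            }
        }
    }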

Sunday, September 19, 2010

When your SQL log (ldf) file gets Huge...

If you're having problems with large SQL log files, then this was the most useful advice I found.

For my purposes I was a "no" - my data wasn't money-sensitive... so I switched to "simple" and could then shrink my database using the Tasks right-click menu in SSMS (a scripted equivalent is sketched after the decision list below).


Do you make transaction backups?
Yes -> Do not shrink the log unless you have done some exceptional operation (such as a massive delete) as the "cost" of SQL Server re-growing the Log file is significant, and will lead to fragmentation and thus worse performance.
No and I don't want to -> Change the Recovery Model to "Simple" - Enterprise Manager : Right click database : Properties @ [Options]
Don't know -> Transaction log backups allow you to recover to a "point in time" - you restore the last Full Backup (and possibly a subsequently Differential backup) and then every log backup, in sequence, until the "point in time" that you want to roll-forward to.
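
If you'd rather script it than click through SSMS, the equivalent commands are roughly as follows - a sketch, where the database name and the log's logical file name are made up (check yours with SELECT name FROM sys.database_files):

    using System;
    using System.Data.SqlClient;

    class LogShrinker
    {
        static void SwitchToSimpleAndShrink(string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = conn.CreateCommand())
            {
                conn.Open();
                // Switch to simple recovery - you lose point-in-time restore from here on.
                cmd.CommandText = "ALTER DATABASE [MyDb] SET RECOVERY SIMPLE";
                cmd.ExecuteNonQuery();
                // Shrink the log file down to roughly 10 MB.
                cmd.CommandText = "DBCC SHRINKFILE (MyDb_log, 10)";
                cmd.ExecuteNonQuery();
            }
        }
    }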


Saturday, September 18, 2010

Timing tests on Azure

In order to test the performance of your app on Azure, you really need to test on Azure.

This is especially the case when your app needs to talk to local SQL Azure, Table, Blob, and Queue resources - you can't test the performance of these:
- when you are running the storage on the DevFabric
- when you are running on the real storage, but with your app on a dev machine outside of the data center

So when you are running on Azure, how do you get your results out?

Well, to start with I've just used some simple timing code based on System.Diagnostics.Stopwatch:

At the start of the method to be tested:
            List<KeyValuePair<string, long>> times = new List<KeyValuePair<string, long>>();
            times.Add(new KeyValuePair<string, long>("Original", 0L));
            System.Diagnostics.Stopwatch timer = new System.Diagnostics.Stopwatch();
            timer.Start();

Half way through:
            times.Add(new KeyValuePair<string,long>("namedPoint", timer.ElapsedMilliseconds));

And at the end:
            times.Add(new KeyValuePair<string, long>("timeToEnd", timer.ElapsedMilliseconds));
            timer.Stop();

            foreach (var t in times)
            {
                System.Diagnostics.Trace.TraceError(string.Format("{0}:{1}", t.Value, t.Key));
            }

This then uploads the results to Azure Diagnostics - currently Trace is routed to Table storage.

Obviously this is only a temporary solution - only suited for simple methods, and you have to remove this code for Production.
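
To actually read the timings back out, you can query the diagnostics table directly. A rough sketch, assuming the standard WADLogsTable name and the SDK 1.x StorageClient library:

    using System;
    using System.Linq;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    // Minimal entity for reading WADLogsTable rows - just the column we care about.
    public class WadLogEntry : TableServiceEntity
    {
        public string Message { get; set; }
    }

    public class LogReader
    {
        public static void DumpRecentTraces(string connectionString)
        {
            var account = CloudStorageAccount.Parse(connectionString);
            var context = account.CreateCloudTableClient().GetDataServiceContext();

            // WADLogsTable partition keys are "0" plus the UTC tick count,
            // so this fetches roughly the last hour of trace output.
            string from = "0" + DateTime.UtcNow.AddHours(-1).Ticks.ToString("d19");
            var entries = context.CreateQuery<WadLogEntry>("WADLogsTable")
                                 .Where(e => e.PartitionKey.CompareTo(from) > 0);

            foreach (var entry in entries)
                Console.WriteLine(entry.Message);
        }
    }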

Friday, September 17, 2010

Azure Storage Tools

A useful list from http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx

Personally I quite like https://www.myazurestorage.com/ although I've also used http://www.cerebrata.com recently

Windows Azure Storage Explorer    Block Blob  Page Blob  Tables  Queues  Free
Azure Blob Client                 X                                      Y
Azure Blob Compressor (1)         X                                      Y
Azure Blob Explorer               X                                      Y
Azure Storage Explorer            X                      X       X       Y
Azure Storage Simple Viewer       X                      X       X       Y
Cerebrata Cloud Storage Studio    X           X          X       X       Y/N
CloudBerry Explorer               X           X                          Y
ClumsyLeaf Azure Explorer (2)     X           X          X       X       Y
Factonomy Azure Utility           X                                      Y
Gladinet Cloud Desktop            X                                      N
MyAzureStorage.com (3)            X           X          X       X       Y
SpaceBlock                        X                                      Y
Windows Azure Management Tool     X           X          X       X       Y

(1) Enables compressing blobs for upload and download
(2) Visual Studio plug-in
(3) A portal to access blobs, tables and queues


Azure Links

Credit to Neil Mackenzie for this list - from http://nmackenzie.spaces.live.com/blog/cns!B863FF075995D18A!706.entry

Azure Team

http://blogs.msdn.com/b/azuredevsupport/ – Azure Support Team

http://blogs.msdn.com/b/dallas/ – Microsoft Codename Dallas

http://blogs.msdn.com/b/sqlazure/ – SQL Azure Team

http://blogs.msdn.com/b/cloud/ - Windows Azure Developer Tools Team

http://blogs.msdn.com/b/windowsazurestorage/ – Windows Azure Storage Team

http://blogs.msdn.com/b/windowsazure/ – Windows Azure Team

Individual

http://blogs.msdn.com/b/partlycloudy/ – Adam Sampson

http://bstineman.spaces.live.com/blog/ – Brent Stineman (@BrentCodeMonkey)

http://blog.structuretoobig.com/ – Brian Hitney

http://channel9.msdn.com/shows/Cloud+Cover/ – Cloud Cover (@cloudcovershow)

http://www.davidaiken.com/ – David Aiken (@TheDavidAiken)

http://davidchappellopinari.blogspot.com/ – David Chappell

http://geekswithblogs.net/iupdateable/Default.aspx – Eric Nelson (@ericnel)

http://gshahine.com/blog/ – Guy Shahine

http://blogs.msdn.com/b/jmeier/ – J.D. Meier

http://blogs.msdn.com/b/jnak/ – Jim Nakashima (@jnakashima)

http://blog.maartenballiauw.be/ - Maarten Balliauw

http://nmackenzie.spaces.live.com/blog/ – Neil Mackenzie (@mknz)

http://azuresecurity.codeplex.com/ - Patterns & Practices: Windows Azure Security Guidance

http://oakleafblog.blogspot.com/ – Roger Jennings (@rogerjenn)

http://dunnry.com/blog/ – Ryan Dunn (@dunnry)

http://scottdensmore.typepad.com/blog/ – Scott Densmore (@scottdensmore)

http://blog.smarx.com/ - Steve Marx (@smarx)

http://blogs.msdn.com/b/sumitm/ - Sumit Mehrotra

http://azure.snagy.name/blog/ – Steven Nagy (@snagy)

http://blog.toddysm.com/ - Toddy Mladenov (@toddysm)

Azure Status

http://www.microsoft.com/windowsazure/support/status/servicedashboard.aspx – Azure Status

Other Clouds

http://highscalability.com/ – High Scalability

http://perspectives.mvdirona.com/ – James Hamilton

http://www.allthingsdistributed.com/ – Werner Vogels

SQL Azure Management - Backup and Monitoring

SQL Azure Backup and Restore
SQL Azure does not support the normal SQL Server BACKUP commands.

It does provide inherent data redundancy and reliability. However, this will not help if:
  • there is an app bug which corrupts your data
  • a hacker destroys your data
  • Microsoft's datacenter loses all of your data (unlikely, but possible)
To cope with these scenarios, there are two mechanisms currently available to backup your data:
  • You can use the SQL Azure copy commands to copy your database contents to a second database, still stored within SQL Azure - this protects against only the first problem above (it won't help with the second, assuming the hacker has a way into your Azure store)
  • You can use the open source SQL Azure Migration Wizard to download data to a local copy - this provides a recovery mechanism for all three problems above
The use of this second tool is discussed further below:

SQL Azure Data Migration

Warning - using this for a backup process will involve a large amount of data transfer (which will cost money), but it's better than not having a backup at all.

Backup

To use this tool:
  1. download it from http://sqlazuremw.codeplex.com/
  2. choose "Analyze and migrate T-SQL database"
  3. choose the Azure database - server yourserver.database.windows.net, username username@yourserver, and password
  4. Note that you can only connect to the SQL Azure database from IP addresses whitelisted in the firewall settings on the SQL Azure web control panel
  5. choose "[databaseName]"
  6. choose "script all database objects"
  7. hit "next" and it will run (this can be slow)
  8. choose your local SQL server to connect to
  9. create a new SQL server database
  10. hit "next" to insert the data into your local database
For restore, you need to go in the opposite direction.

Restore

To do this:
  1. typically suspend the Azure web application
  2. use the Azure web user interface to drop the existing SQL Azure database
  3. run the steps above in reverse - from local to SQL Azure
  4. restart the Azure web application.
Notes - to use the SQL Azure database copy mechanism (the first backup option above):
    1. connect to SQL Azure using SQL Server Management Studio
    2. open a "New Query"
    3. type "CREATE DATABASE backupName AS COPY OF mainName"
    4. Hit Execute

      The state of the copy can then be monitored using "select name, state, state_desc from sys.databases where name = 'backupName'"
Note that costs of running multiple databases on Azure suggest that you should not keep taking more and more backups and storing them all within SQL Azure.
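
If you want to script that copy-and-poll rather than type it into Management Studio, something like this should work - a sketch, where the connection must be to the master database on your SQL Azure server and the database names are the placeholders from above:

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    class SqlAzureCopy
    {
        static void CopyAndWait(string masterConnectionString)
        {
            using (var conn = new SqlConnection(masterConnectionString))
            {
                conn.Open();
                using (var cmd = conn.CreateCommand())
                {
                    // Kick off the asynchronous server-side copy
                    cmd.CommandText = "CREATE DATABASE backupName AS COPY OF mainName";
                    cmd.ExecuteNonQuery();
                }
                using (var cmd = conn.CreateCommand())
                {
                    // Poll until the copy leaves the COPYING state
                    cmd.CommandText = "SELECT state_desc FROM sys.databases WHERE name = 'backupName'";
                    string state;
                    do
                    {
                        Thread.Sleep(TimeSpan.FromSeconds(30));
                        state = cmd.ExecuteScalar() as string;
                    } while (state == "COPYING");
                    Console.WriteLine("Copy finished with state: " + (state ?? "(database not found)"));
                }
            }
        }
    }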

SQL Azure Monitoring
There really isn't much that can currently be done in the way of monitoring the health of an Azure SQL Server.

This http://msdn.microsoft.com/en-us/library/ff394114.aspx document describes the ability to query:
  • Database size
  • Number of connections - and their activity
  • Some simple query performance
However, this doesn't really provide any clue as to monitoring the database "health" - e.g. uptime/status, loading, etc.

There is some access to overall status at http://www.microsoft.com/windowsazure/support/status/servicedashboard.aspx - but this is at a very coarse level.

The only current suggestions for checking that a SQL Azure server is alive are:
  • just rely on Microsoft uptime (under their SLA) - not ideal
  • use tools like Nagios to add "HTTP" monitoring to Azure applications connected to the SQL Server database - use this web monitoring to check the health of the server (at least to check it is alive).
  • try to build something (e.g. a web app) based around sys.dm_exec_connections (http://msdn.microsoft.com/en-us/library/ff394114.aspx) which would provide some analysis of loading - a minimal sketch of this is below the list.
  • try to extend the Enzo SQL library - http://enzosqlbaseline.codeplex.com/ - its benchmarking tools might give you some clue on current loading (but this monitoring might also impact your current performance).
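
For the sys.dm_exec_connections idea, the core of it is only a few lines - a sketch, where everything apart from the DMV name is made up for illustration:

    using System;
    using System.Data.SqlClient;

    class SqlAzureHealthCheck
    {
        // Doubles as a liveness check (the query succeeding at all) and a
        // very crude load indicator (the number of open connections).
        static int GetConnectionCount(string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = conn.CreateCommand())
            {
                conn.Open();
                cmd.CommandText = "SELECT COUNT(*) FROM sys.dm_exec_connections";
                return (int)cmd.ExecuteScalar();
            }
        }
    }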

Thursday, September 16, 2010

If you can't run the Visual Studio wizard for SharpArchitecture

If you see a series of messages like "A problem was encountered
creating the sub project 'sharp.Core'. A project with that name is
already opened in the solution."

Then it could be due to problems with AnkhSVN - see
http://www.softweb.net.au/blogs/development/sharp-architecture-new-project-wizard-is-incompatible-with-ankhsvn

So I had to uninstall the AnkhSVN source control plug-in to get it working....

Tuesday, September 14, 2010

Azure - stuck in Initialising-Busy-Stopping...repeat

This link seems a good reference!
 
http://blog.toddysm.com/2010/01/windows-azure-deployment-stuck-in-initializing-busy-stopping-why.html

Some points:
  • missing dependencies
  • problems with initialising WAD (Azure Diagnostics connection string)
  • problems with other connection strings
  • lack of permission (medium trust!)

If you hit a problem with ConfigMonitoringPoll - Polling for configuration changes - when using Azure Diagnostics

Reported as:
Polling for configuration changes:Microsoft.WindowsAzure.StorageClient.StorageClientException: The specified blob does not exist. ---> System.Net.WebException: The remote server returned an error: (404) Not Found. at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult) at Microsoft.WindowsAzure.StorageClient.EventHelper.ProcessWebResponse(WebRequest req, IAsyncResult asyncResult, EventHandler`1 handler, Object sender) --- End of inner exception stack trace --- at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.get_Result() at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.ExecuteAndWait() at Microsoft.WindowsAzure.StorageClient.CloudBlob.DownloadToStream(Stream target, BlobRequestOptions options) at Microsoft.WindowsAzure.Diagnostics.ControlChannel.<>c__DisplayClass10.<ConfigMonitoringPoll>b__f() at Microsoft.WindowsAzure.Diagnostics.ControlChannel.TryExpectError(HttpStatusCode status, Action act) at Microsoft.WindowsAzure.Diagnostics.ControlChannel.ConfigMonitoringPoll(Object sender, ElapsedEventArgs args); TraceSource 'WaWebHost.exe' event

Then the problem might be that you must initialise Azure Diagnostics at the start of the WebRole - and not at the start of the app in Global.asax. Just add this new file to your web role project:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace MyProject.Web
{
    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Set up diagnostics before the web app itself starts
            AzureDiagnostics.InitializeAzureDiagnostics();
            return base.OnStart();
        }
    }
}
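
(The AzureDiagnostics.InitializeAzureDiagnostics helper called here is the class from the "Adding Diagnostic monitoring to Azure" post below.)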

Adding Diagnostic monitoring to Azure

If you want to get some diagnostics from Windows Azure deployed applications then follow the instructions on: http://msdn.microsoft.com/en-us/library/ee843890

This gives you a solution like:

    // usings needed by this class (LogLevel etc. live in Microsoft.WindowsAzure.Diagnostics)
    using System;
    using System.Collections.Generic;
    using Microsoft.WindowsAzure.Diagnostics;

    public static class AzureDiagnostics
    {
        public static void InitializeAzureDiagnostics()
        {
            // do not run this code if we're not running in the cloud
            if (!Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.IsAvailable)
                return;

            // Enable the capture of partial crash dumps
            Microsoft.WindowsAzure.Diagnostics.CrashDumps.EnableCollection(false);

            // configure and start the diagnostics
            DiagnosticMonitorConfiguration dmc = DiagnosticMonitor.GetDefaultInitialConfiguration();

            AddPerformanceCounters(dmc);
            AddLogs(dmc);
            AddSystemEventLog(dmc);
            AddInfrastructureLogs(dmc);

            DiagnosticMonitor.Start("DiagnosticsConnectionString", dmc);
        }

        private static void AddInfrastructureLogs(DiagnosticMonitorConfiguration dmc)
        {
            dmc.DiagnosticInfrastructureLogs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
            dmc.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = TimeSpan.FromMinutes(10.0);
        }

        private static void AddSystemEventLog(DiagnosticMonitorConfiguration dmc)
        {
            // Add event collection from the Windows Event Log
            dmc.WindowsEventLog.DataSources.Add("System!*");
            dmc.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
            dmc.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(10);
        }

        private static void AddLogs(DiagnosticMonitorConfiguration dmc)
        {
            dmc.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(10);
            dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
        }

        private static void AddPerformanceCounters(DiagnosticMonitorConfiguration dmc)
        {
            List<string> counters = new List<string>();
            counters.Add(@"\Processor(_Total)\% Processor Time");
            counters.Add(@"\Memory\Available Mbytes");
            counters.Add(@"\TCPv4\Connections Established");
            counters.Add(@"\ASP.NET Applications(__Total__)\Requests/Sec");
            // These network interfaces removed - not that interesting for now
            //counters.Add(@"\Network Interface(*)\Bytes Received/sec");
            //counters.Add(@"\Network Interface(*)\Bytes Sent/sec");
            foreach (string counter in counters)
            {
                PerformanceCounterConfiguration counterConfig = new PerformanceCounterConfiguration();
                counterConfig.CounterSpecifier = counter;
                counterConfig.SampleRate = TimeSpan.FromSeconds(10);
                dmc.PerformanceCounters.DataSources.Add(counterConfig);
            }
            dmc.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(5.0);
        }
    }


Monday, September 13, 2010

Re: "The "COM0" port did not open." Bada device development

If you're trying to debug a Bada device and you see "The "COM0" port
did not open." under the entry for the Broker.exe in the Bada IDE
Console, then:

1. If it's the first time you've ever tried to connect then check that
you've installed the necessary USB drivers and have the device
attached.
2. If you've previously been using this fine, but this has suddenly
stopped working then try:
--- rebooting your phone
--- killing broker.exe in the PC task manager
--- setting your phone's USB connection to Samsung Kies or to Mass
Storage, and then using Run As rather than Debug As in the IDE
--- hit the task manager on the phone and choose "End All Applications"

I'm not entirely sure what worked for me... but I think it was a
combination of the last few in the end!