Cloud seems to be mainstream now!!

February 29, 2012

This was a super-cool experience, so I wanted to share it. You know a technology is mainstream when the end user knows it's useful and feels handicapped without it.

Currently I am working in a small town 100+ miles from Chicago, staying at a motel with good ambience and good people around. Last weekend I found the manager busy with a lot of paperwork. She looked harried and said she had been very busy because their server had crashed. They would have to wait almost a week for a new server, which would cost them more than just a few dollars. Her tale of woe didn't end there: after the server came in, they would have to transfer the manual paper entries onto it. Oh… long hours for the poor lady and her team, I thought.

It struck me how appropriate the cloud would be in such a situation, but I desisted from showing off my cloud skills to the manager. Still, I couldn't hold back entirely, and out of curiosity I asked, "But why do you have only one server?"

She gave a big laugh and said, "You are right!" She had asked the owner the same question, and the reason she was given was the high cost of the server – they could afford only one for now.

So far, all of that was a normal, run-of-the-mill conversation. It was a later chat that astonished me enough to make my eyes pop out with delight.

She said, "I don't understand why we don't use the cloud here. I can use it on my mobile; everybody is making use of the cloud – at least a cloud drive – from anywhere. Why can't we do that?"

I was like, WOW!! She started explaining to me whatever she knew about the cloud. That feeling was super-cool. Remember those days when we had to explain to techies what the cloud is? And here was a manager at a small motel, not very technical, who knows what the cloud is and, more importantly, how and where it makes sense to use it! :)

Just using a cloud server hosted elsewhere, with high availability, automated replication and quick provisioning, would definitely fit this situation. If nothing else, they should at least use a cloud drive that holds a regular automated backup of their server and restores it when needed.

A good conversation to remember. From now on, if anybody asks me about the cloud and where it makes sense, I will definitely remember this smart manager at a motel who knows not only about the cloud but also its use case!!

There are a lot of scenarios, workflows, cases and best fits for the cloud, but rather than always going into a detailed analysis, we just need to look around and we will quickly figure out where it makes sense to be in the cloud.

Happy to experience that the cloud is mainstream now.


SQL Azure Pricing Change – where is the 100MB MAXSIZE?

February 22, 2012

I blogged earlier about the big SQL Azure Spring Sale here: http://adititechnologiesblog.blogspot.com/2012/02/sql-azure-spring-sale-its-implications.html

The latest pricing structure, including the 100MB database price, is below:

Database Size                   Price Per Database Per Month
0 to 100 MB                     Flat $4.995
Greater than 100 MB to 1 GB     Flat $9.99
Greater than 1 GB to 10 GB      $9.99 for first GB, $3.996 for each additional GB
Greater than 10 GB to 50 GB     $45.954 for first 10 GB, $1.998 for each additional GB
Greater than 50 GB to 150 GB    $125.874 for first 50 GB, $0.999 for each additional GB

The important thing to note is that to avail of the 100MB price you still have to select 1 GB as MAXSIZE while creating the SQL Azure database. You are billed at the 100MB rate ($4.995 per month) as long as the database stays under 100MB; once it grows beyond 100MB and up to 1 GB, you are charged $9.99 per month.
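For reference, here is a minimal sketch of creating such a database from C#. The server name, credentials and database name are placeholders; EDITION and MAXSIZE are the T-SQL options SQL Azure accepts for CREATE DATABASE:

using System.Data.SqlClient;

class CreateDatabaseSample
{
    static void Main()
    {
        // Connect to the master database of your SQL Azure server
        // (placeholder server and credentials).
        const string connectionString =
            "Server=tcp:yourserver.database.windows.net;Database=master;" +
            "User ID=youruser@yourserver;Password=yourpassword;Encrypt=True;";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var command = connection.CreateCommand())
            {
                // MAXSIZE is set to 1 GB even though we plan to stay under
                // 100MB; billing drops to the 100MB rate automatically while
                // the database remains below that size.
                command.CommandText =
                    "CREATE DATABASE MySmallDb (EDITION = 'web', MAXSIZE = 1 GB)";
                command.ExecuteNonQuery();
            }
        }
    }
}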

Cross-Post: Why do we like Windows Azure

January 29, 2012

Instead of comparing Windows Azure with any other platform, I thought of putting down, as cloud architects, the reasons we like it so much. It is a two-part series, available here: Part I and Part II.

Hope you like it; any comments are welcome.

Presented for Techgig Cloud Computing Series – Part 1

January 29, 2012

I got an opportunity to start an online webinar series on cloud computing for Techgig (http://www.techgig.com), an online community. It was the first session on cloud in the series, so I built it up right from what the cloud is, through the SPI service models, to an introduction to Windows Azure and its features. I don't like to present without a demo, so I built a ground-up, end-to-end application in Visual Studio, running it from the emulator all the way to the Windows Azure cloud. It was fun presenting, and more than 300 people attended. I got good feedback to feel good about, and more than that, I was happy to share knowledge with the community.

You can watch the recorded session here: http://www.techgig.com/webinar.php?webinar_id=66

I am planning to take the next session on advanced role scenarios in Windows Azure; I will share it once it is done.

Thanks to TechGig for giving me this opportunity to share knowledge with the community, and that too on one of the hottest topics in technology: the cloud and Windows Azure.

 

Cross-Post: Could Cloud have saved the bubble from bursting?

December 30, 2011

My boss/mentor/friend (yes, I mean it, and it is possible :)) Bharat Kumar K and I got into an interesting discussion about the cloud and the dot-com bust:

Would the dot-com boom and bust have happened if cloud computing, in its current form and name, had been around in the 1990s? Could millions of dollars have been saved?

The outcome is a post on our company blog site; read the complete article here.

Windows Azure AutoScaling Application Block (WASABi) Issue and Resolution

December 30, 2011

Recently, we implemented the Windows Azure Autoscaling Application Block (WASABi) in one of our projects. It looked really simple to begin with, but it turned out not to be that easy: we faced two specific issues which ate a lot of our time.
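For context, hosting the block inside a worker role takes only a few lines. It is roughly the pattern below – a sketch based on the block's documented usage; the rules store and service information store are configured separately in app.config:

using Microsoft.Practices.EnterpriseLibrary.Common.Configuration;
using Microsoft.Practices.EnterpriseLibrary.WindowsAzure.Autoscaling;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    private Autoscaler autoscaler;

    public override bool OnStart()
    {
        // Resolve the Autoscaler from the Enterprise Library container;
        // it picks up the rules and service information stores from config.
        autoscaler = EnterpriseLibraryContainer.Current.GetInstance<Autoscaler>();
        autoscaler.Start();
        return base.OnStart();
    }

    public override void OnStop()
    {
        autoscaler.Stop();
        base.OnStop();
    }
}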
To begin with, we developed a worker role with WASABi and tested it on the development emulator, where it worked fine. Configuring the rules XML is somewhat painful, though (see the sketch at the end of this post). Before deploying it to Windows Azure, we found that there was a new version of WASABi, 5.0.1118.0, on NuGet. We tested that version locally and it wasn't working :(. We got the following error while WASABi was trying to insert entities into the collector table (in our case the table was named DataPointsCollected):

"InvalidInput: One of the request inputs is not valid."

To solve the above error we had to upgrade to Windows Azure SDK 1.6 (we hadn't evaluated it earlier, as we were using Umbraco with SDK 1.5 and were not sure about the implications). This problem is also indicated here in the MSDN forum.

After fixing the above problem, we thought it was going to be smooth sailing once deployed to Windows Azure, but it wasn't. We were getting the following error:

System.ServiceModel.Security.SecurityNegotiationException: Could not establish secure channel for SSL/TLS with authority 'management.core.windows.net'. ---> System.Net.WebException: The request was aborted: Could not create SSL/TLS secure channel.

and it was difficult to debug or to find pointers.

While checking all the possibilities, I came across the subscription entry in the file service-information-store.xml:

<subscription name="mysubscription"
              subscriptionId="80fa6a27-xxxxxxxxxxxxxxa13f"
              certificateThumbprint="7xxxxxxxxxxxxxxxxxxxx1"
              certificateStoreLocation="LocalMachine"
              certificateStoreName="My">

The role certificate had the same location (LocalMachine) and the same store name (My). I guessed that the worker role was probably not able to read the certificate from LocalMachine, so we changed the certificate location to CurrentUser, keeping the store name My, as indicated below:

<subscription name="mysubscription"
              subscriptionId="80fa6a27-xxxxxxxxxxxxxxa13f"
              certificateThumbprint="7xxxxxxxxxxxxxxxxxxxx1"
              certificateStoreLocation="CurrentUser"
              certificateStoreName="My">

With the above two issues fixed, our worker role with the Autoscaling Application Block is working like a charm. Hope this saves time for others who are trying WASABi.
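And the rules XML mentioned above: for anyone starting out, it has roughly the following shape. This is a from-memory sketch of the documented schema with placeholder names and values, so double-check it against the samples that ship with the block:

<rules xmlns="http://schemas.microsoft.com/practices/2011/entlib/autoscaling/rules">
  <constraintRules>
    <!-- Keep the role between 2 and 6 instances at all times. -->
    <rule name="Default" enabled="true" rank="1">
      <actions>
        <range min="2" max="6" target="MyWorkerRole" />
      </actions>
    </rule>
  </constraintRules>
  <reactiveRules>
    <!-- Add an instance when average CPU over 5 minutes exceeds 60%. -->
    <rule name="ScaleUpOnHighCpu" enabled="true" rank="10">
      <when>
        <greater operand="CpuAvg5m" than="60" />
      </when>
      <actions>
        <scale target="MyWorkerRole" by="1" />
      </actions>
    </rule>
  </reactiveRules>
  <operands>
    <performanceCounter alias="CpuAvg5m" source="MyWorkerRole"
                        performanceCounterName="\Processor(_Total)\% Processor Time"
                        aggregate="Average" timespan="00:05:00" />
  </operands>
</rules>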

Improve Azure Table performance with query projections

September 27, 2011

At Build 2011, Microsoft announced improvements in the REST APIs used for Azure Storage, with version "2011-08-18". Of these improved features, the specific one we are going to discuss here is query projection while selecting entities from Azure Table storage. In one of our projects we had a requirement for this feature, but due to its unavailability we couldn't implement it. Now, with this feature, we will have to change our implementation, but the change is a good one, as it helps improve performance.

Earlier, selecting one or a few specific properties from Azure Table storage was not supported. For example, in one of our implementations we were using an IsDeleted flag in table storage (alongside somewhere around 50 more properties) to maintain soft deletes of entities. Just to retrieve the IDs of the deleted entities, we had to retrieve the entire entities with all 50 properties, which unnecessarily increased the latency and bandwidth load of the application.

For example, retrieving the full entities looked like this:

var query = from entity in playersServiceContext.CreateQuery<Players>(PlayersTableName)
            where entity.IsDeleted == true
            select entity;

Now, with version "2011-08-18", we can select specific properties using select projections, which reduces latency and bandwidth and improves the performance of the application.

var query = from entity in playersServiceContext.CreateQuery<Players>(PlayersTableName)
            where entity.IsDeleted == true
            select new
            {
                PlayerName = entity.PlayerName,
                Rank = entity.Rank
            };

Here, select new does the trick, using an anonymous entity.

We could also explicitly project the properties into a specific entity type:

var query = from entity in playersServiceContext.CreateQuery<Players>(PlayersTableName)
            where entity.IsDeleted == true
            select new DeletedPlayers
            {
                PlayerName = entity.PlayerName,
                Rank = entity.Rank
            };

Here, DeletedPlayers is a DataServiceEntity that can be thought of as a partial view of the entity.
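For illustration, here is a minimal sketch of what such a partial entity class might look like. I am assuming the 1.x storage client's TableServiceEntity base class, which supplies the core properties; the other property names follow the example above:

using Microsoft.WindowsAzure.StorageClient;

// A trimmed-down projection target that holds only the selected properties.
// TableServiceEntity provides PartitionKey, RowKey and Timestamp.
public class DeletedPlayers : TableServiceEntity
{
    public string PlayerName { get; set; }
    public int Rank { get; set; }
}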

In another scenario, we wanted to display the name, rank and age of players in an HTML table, from which the user might drill into the details (view/edit) of a player using a link. We had used a page size of 10 to decrease latency, as there were around 50 properties attached to every player entity. Earlier we had to retrieve all 50 properties, but now with the select projection we can get only the specific properties we need (Name, Rank, Age and the core properties). This gives us better performance and an option to consider displaying more than 10 entities per page while still performing well.
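As a sketch (same context and property names as above, with an assumed Age property), combining the projection with server-side paging looks something like this; Take(10) maps to the $top query option:

// Fetch only the three display columns for the first page of live players.
var page = (from entity in playersServiceContext.CreateQuery<Players>(PlayersTableName)
            where entity.IsDeleted == false
            select new
            {
                PlayerName = entity.PlayerName,
                Rank = entity.Rank,
                Age = entity.Age
            }).Take(10);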

A few things to remember when using select projections with Azure Table storage:

  • If you do not define the core properties [PartitionKey, RowKey and ETag] in your data service entity definition, they are selected by default in the projection.
  • If you do explicitly define the core properties, you will have to explicitly select them in the projection, which is useful when you want to update the entities afterwards.
  • Prefer explicit (partial) entities over anonymous entities, as this makes looping and updates explicit and improves readability.
  • Remember that even if you want to select only one property, you still have to use at least an anonymous entity; a bare select entity.PlayerName, for example, will not work (see the sketch after this list).
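Here is the single-property case from the last point above, wrapped in an anonymous entity as required (same hypothetical context as the earlier examples):

// A bare "select entity.PlayerName" is rejected; wrap it in an anonymous type.
var names = from entity in playersServiceContext.CreateQuery<Players>(PlayersTableName)
            where entity.IsDeleted == true
            select new { PlayerName = entity.PlayerName };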

So go ahead, revisit, and improve your code wherever you made design decisions just because query projections were not supported for Azure tables.