Friday, July 31, 2015

Updating Service Tiers in Azure SQL Database

Azure SQL Web and Business databases are being deprecated and retired in September 2015, so it's time to start planning to upgrade existing Web or Business databases to the Basic, Standard, or Premium service tiers.

Upgrading a Web or Business database to a new service tier involves the following steps:

  1. Determine service tier based on feature capability
  2. Determine an acceptable performance level based on historical resource usage
  3. Understand why existing performance for a Web or Business database may map to the higher Premium levels
  4. Tune your workload to fit a lower performance level
  5. Upgrade to the new service tier/performance level
  6. Monitor the upgrade to the new service tier/performance level
  7. Monitor the database after the upgrade

In this post I will discuss step 2: “Determine an acceptable performance level based on historical resource usage”.

 

How to view the recommended service tier in the new Azure Management Portal

New Management Portal

  1. Log on to the new management portal and navigate to a server containing a Web or Business database.
  2. Go to Browse All, then SQL Servers
  3. Select the server you want to upgrade
  4. Click the Latest Update part in the server blade.


  5. Click Upgrade this server.

The Upgrade this server blade now shows a list of Web or Business databases on the server along with the recommended service tier.


How to view DTU consumption in the Management Portal

Log on to the management portal and navigate to an existing Web or Business database.

  1. Click the MONITOR tab.
  2. Click ADD METRICS.
  3. Select DTU percentage and click the checkmark at the bottom to confirm.


Hope this helps.

Monday, July 13, 2015

Clear Azure Cache: Get All Keys in Azure Redis Cache

 

Recently I got a requirement to clear cache values for a given client account. In our systems we have multiple contractors, and we cache some data related to each contractor account. The cache keys are as follows.

string.Format("{0}:ServiceTypeList", clientConfiguration.Code);
string.Format("{0}:PropertyViewModelList", clientConfiguration.Code);
string.Format("{0}-{1}:FaultTypeList", clientConfiguration.Code, serviceType.ServiceTypeID);
string.Format("{0}-{1}:LocationList", clientConfiguration.Code, property.PropertyNo);
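To make the key shapes concrete, here is a minimal sketch of the same formats; the client code "ACME", service type ID 7, and property number "P-100" are made-up values for illustration:

```csharp
using System;

static class CacheKeyExamples
{
    // Each method mirrors one of the key formats above.
    public static string ServiceTypeList(string clientCode)
    {
        return string.Format("{0}:ServiceTypeList", clientCode);
    }

    public static string FaultTypeList(string clientCode, int serviceTypeId)
    {
        return string.Format("{0}-{1}:FaultTypeList", clientCode, serviceTypeId);
    }

    public static string LocationList(string clientCode, string propertyNo)
    {
        return string.Format("{0}-{1}:LocationList", clientCode, propertyNo);
    }
}

class Program
{
    static void Main()
    {
        Console.WriteLine(CacheKeyExamples.ServiceTypeList("ACME"));       // ACME:ServiceTypeList
        Console.WriteLine(CacheKeyExamples.FaultTypeList("ACME", 7));      // ACME-7:FaultTypeList
        Console.WriteLine(CacheKeyExamples.LocationList("ACME", "P-100")); // ACME-P-100:LocationList
    }
}
```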

Here some cache keys are based on the client configuration code alone, and some combine it with another identifier (a service type ID or property number). But when we have to clear the cache for a given client account, we have only the client configuration code. So we used the following approach.

1. Get all Cache Keys

private static ConnectionMultiplexer _connection;

public static IServer GetDataCacheServer()
{
    if (_connection == null || !_connection.IsConnected)
    {
        _connection = ConnectionMultiplexer.Connect(HybridConfig.GetAppSetting("RedisCacheConnectionString"));
    }

    var endpoints = _connection.GetEndPoints();
    var server = _connection.GetServer(endpoints.First());
    return server;
}

public static IEnumerable<RedisKey> GetAllCacheKeys()
{
    var server = GetDataCacheServer();

    // Keys() issues SCAN (or KEYS on older servers) against the endpoint.
    var keys = server.Keys();
    return keys;
}

2. Create Remove method

public static void Remove(string key)
{
    try
    {
        key = string.Format("{0}_{1}_{2}", key, HybridConfig.GetAppSetting("ApplicationUrl"), HybridConfig.GetAppSetting("ApplicationVersion"));
        var database = GetDataCache();
        database.KeyDelete(key);

        if (AllKeys.Contains(key))
        {
            AllKeys.Remove(key);
        }
    }
    catch (Exception ex)
    {
        SqlLogger.Error(typeof(AzureCache), string.Format("Error while trying to remove the key : {0} | Message : {1}", key, ex.Message), ex);
    }
}

3. Search the client configuration code in it and clear the cache


var keys = AzureCache.GetAllCacheKeys();
string applicationTag = string.Format("_{0}_{1}", HybridConfig.GetAppSetting("ApplicationUrl"),
                                      HybridConfig.GetAppSetting("ApplicationVersion"));
List<string> cacheKeys = (from redisKey in keys
                          select redisKey.ToString()
                          into key
                          where key.Contains(applicationTag) && key.Contains(clientConfiguration.Code)
                          select key.Replace(applicationTag, string.Empty)).ToList();
if (cacheKeys.Any())
{
    cacheKeys.ForEach(AzureCache.Remove);
}
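The matching rule in step 3 can also be factored into a small helper that is easy to test on its own. This is a sketch; the key strings and tag below are illustrative, not real production keys:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class CacheKeyFilter
{
    // Keep only keys that carry the application tag and the client code,
    // then strip the tag so the remainder matches the logical cache key.
    public static List<string> ForClient(IEnumerable<string> allKeys, string applicationTag, string clientCode)
    {
        return allKeys
            .Where(k => k.Contains(applicationTag) && k.Contains(clientCode))
            .Select(k => k.Replace(applicationTag, string.Empty))
            .ToList();
    }
}

class Program
{
    static void Main()
    {
        // Hypothetical stored keys; "_tag" stands in for the url/version suffix.
        var keys = new[]
        {
            "ACME:ServiceTypeList_tag",
            "OTHER:ServiceTypeList_tag",
            "ACME:PropertyViewModelList_tag"
        };

        foreach (var key in CacheKeyFilter.ForClient(keys, "_tag", "ACME"))
        {
            Console.WriteLine(key);
        }
        // ACME:ServiceTypeList
        // ACME:PropertyViewModelList
    }
}
```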

Happy Coding !!!

Monday, July 6, 2015

Azure Search Introduction

 

Today almost all web applications and mobile applications provide search functionality, because it is often the easiest way for users to reach the data they want to see. The most common approach is a search textbox, and there are many further ways of providing a richer search experience to make users' lives comfortable. But building such search functionality can be challenging for application developers. It's not reasonable to expect every development team to build its own search engine, and even installing and running a commercial search engine can be a lot of work. What's needed is a managed search service that can be used by many different applications, whether they're running in the cloud or on premises. This is exactly what Azure Search offers.

What is Azure Search?

Azure Search is a managed service running in the public cloud. You can create a new instance of Azure Search and start using it; applications using Azure Search can run on Microsoft Azure, on-premises, or in another cloud platform. In Azure Search you create indexes, which are similar to tables in SQL, and sync your data to them. There are different ways of syncing your data to search indexes.

  • Push: Where the data is in a different store, or when the application wants more control over how data is updated, developers can use the upload API to push content into the index in batches
  • Indexer: Data population and updates from Azure SQL, SQL Server on Azure VM’s or DocumentDB stores can be automatically loaded into the index (NOTE: Other stores will be supported based on customer demand)

Azure Search provides a RESTful interface, so application developers can send search requests to the service. The following figure illustrates this.


 

This interface provides many capabilities:

  • Search Text: Text as written by the user used for full text search
  • Highlighting: Define fields to be used for hit highlighting of search text
  • Filter: Used to further limit results. E.g., a product catalog might exclude products with no stock left
  • Sorting: Sort results by values in document fields instead of score
  • Paging: Limit the number of results to be returned (skip & take)
  • Projection: Limits results to a subset of the fields in results allowing conservation of bandwidth
  • Count: Total count of fetched items
  • Lookup: Retrieves a specific document from Azure Search by its key
  • MoreLikeThis: Finds documents that are relevant to another specific document
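Many of these options surface as query parameters on the REST endpoint. Here is a rough sketch of building such a request URL; the service name, index name, and parameter values are hypothetical, and the api-version shown was current when this was written:

```csharp
using System;

static class SearchQuery
{
    // Build a query URL against the Azure Search REST API.
    // Service name, index name and parameter values are made up for illustration.
    public static string Build(string service, string index, string search,
                               string filter, string orderBy, int top, int skip)
    {
        return string.Format(
            "https://{0}.search.windows.net/indexes/{1}/docs?api-version=2015-02-28" +
            "&search={2}&$filter={3}&$orderby={4}&$top={5}&$skip={6}",
            service, index,
            Uri.EscapeDataString(search),
            Uri.EscapeDataString(filter),
            orderBy, top, skip);
    }
}

class Program
{
    static void Main()
    {
        // search text, filter (exclude out-of-stock), sorting and paging in one request
        Console.WriteLine(SearchQuery.Build("myservice", "products", "running shoes",
                                            "stock gt 0", "price", 10, 0));
    }
}
```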

Scenarios

E-commerce applications

such as the web site for an online retailer. Providing a search option for an ecommerce site is essential—users expect it. But the organization that provides this site almost certainly wants to control what information is returned and—especially—the order of those results. Think of an online shoe store, for example, that’s currently running a promotion with a particular shoe manufacturer. Suppose that manufacturer is paying the online retailer for this promotion, and so the site’s search results need to list this brand of shoes first. Or perhaps the shoe store has lots of a particular style in stock right now that it wants to sell off. Placing the style first in search results can help the retailer achieve this business goal. Owning its own search function has other benefits, too, such as letting this firm see what its customers are searching for that it doesn’t currently sell. None of this makes internet search engines any less important; an online retailer should still do whatever it can to direct Google and Bing searches to its site. Once people are there, however, the retailer can benefit from controlling how customers search the site.

User-generated content sites

such as a discussion site for movie buffs. As with e-commerce applications, users expect to be able to navigate this kind of site via search. For the creators of the site, controlling that search once again brings some advantages. As with e-commerce applications, for example, there might be business reasons for returning search results in a particular order. Suppose an online cooking site is sponsored by three large food companies. The site’s owners might choose to show recipes that use foods sold by these companies higher in search results. (This might seem cynical—do we really need more pay-for-play sites?—yet it is in fact how much of the internet business works.) And because Azure Search lowers the barrier to entry for creating custom search, an organization doesn’t need to realize enormous benefits to justify the effort of doing this.

Custom business applications

such as an employee benefits solution. Traditionally, a line-of-business application is accessed by clicking through its UI until the user finds what he needs. If the application is simple, or if the user knows the application very well, this approach works. But many business applications (maybe even most of them) would be significantly more usable if they provided a search option. Once again, people love search. (Don’t you?) Add a search box to an application’s UI, then watch how rapidly people start using it. In fact, one way to smooth adoption of a new business application in an organization might be to make sure that it has a search option in its UI.

Azure Search Pricing


For more details about Azure Search pricing, go here.

I will discuss how to create an Azure Search service using the Azure portal and how to manage it in my next post.

Saturday, June 13, 2015

Localizing Azure Push Notifications at the Server End

 

Suppose your mobile application supports localization and you need to send localized push notifications to your mobile app users. Azure Notification Hubs now has templates that support localization.

See the following link which gives good guidance on it.

Use Notification Hubs to send localized breaking news

In this post I will show how we can use resource files to localize the messages to be sent.

1. First, configure the resource file with the messages you need.

2. Configure the supported languages for the notification.

public const string LangzhSG = "zh-sg";
public const string LangenUS = "en-us";
public const string LangenGB = "en-gb";
public const string LangitIT = "it-it";
public const string LangsiLK = "si-lk";

 

public static List<string> SupportedLanguages
{
    get { return new List<string> { LangzhSG, LangenUS, LangenGB, LangitIT, LangsiLK }; }
}
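Devices may report cultures outside this list (say en-AU), so you may also want a fallback rule before looking up translations. This is a sketch of one possible approach, not part of the Notification Hubs API:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class LanguageResolver
{
    // Hypothetical fallback: exact match first, then any supported culture
    // sharing the same language prefix, otherwise the default language.
    public static string Resolve(string deviceCulture, List<string> supported, string fallback)
    {
        string culture = deviceCulture.ToLowerInvariant();
        if (supported.Contains(culture))
        {
            return culture;
        }

        string prefix = culture.Split('-')[0] + "-";
        string match = supported.FirstOrDefault(s => s.StartsWith(prefix));
        return match ?? fallback;
    }
}

class Program
{
    static void Main()
    {
        var supported = new List<string> { "zh-sg", "en-us", "en-gb", "it-it", "si-lk" };

        Console.WriteLine(LanguageResolver.Resolve("en-GB", supported, "en-us")); // en-gb
        Console.WriteLine(LanguageResolver.Resolve("en-AU", supported, "en-us")); // en-us
        Console.WriteLine(LanguageResolver.Resolve("fr-FR", supported, "en-us")); // en-us
    }
}
```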

3. Localize the messages as you need. Here the message to be sent varies based on the work order status.

public static Dictionary<string, string> GetMessageTranslations(WorkorderUpdateNotification message,
                                                                List<string> supportedLanguages)
{
    string messageText = string.Empty;

    Dictionary<string, string> messageList = new Dictionary<string, string>();
    if (message.Message.ToLower().Equals("failed"))
    {
        foreach (var supportedLanguage in supportedLanguages)
        {
            // Switch the resource culture so the lookup returns the translated text.
            Resources.terms.Culture = new CultureInfo(supportedLanguage);
            messageText =
                string.Format(Resources.terms.webapi_helpers_pushnotification_messages_FailedToUpdateWO1,
                              message.WorkOrderNumber);
            messageList.Add(supportedLanguage, messageText);
        }
    }

    return messageList;
}

4. Create the class whose properties hold all the data required by the notification.

public class WorkOrderUpdatePushNotification
{
    public string WorkOrderNumber { get; set; }
    public bool IsSuccessful { get; set; }
    public string From { get; set; }
    public string Message { get; set; }
    public string PushNotificationHub { get; set; }
}

5. Create the required JSON string

private static Dictionary<string, string> GetWorkorderUpdatePushNotificationsCollection(
    WorkorderUpdateNotification message)
{
    List<string> supportedLanguages = ResourceKeyConstants.SupportedLanguages;
    Dictionary<string, string> messageList = GetMessageTranslations(message, supportedLanguages);
    Dictionary<string, string> workorderUpdateNotifications = new Dictionary<string, string>();
    string notificationName = "WorkorderUpdateNotification_{0}";
    string alertName = "WorkorderUpdateAlert_{0}";
    foreach (var supportedLanguage in supportedLanguages)
    {
        string localizedNotificationName = string.Format(notificationName,
                                                         supportedLanguage.Replace("-", string.Empty));
        string localizedAlertName = string.Format(alertName, supportedLanguage.Replace("-", string.Empty));
        Resources.terms.Culture = new CultureInfo(supportedLanguage);
        string from = Resources.terms.webapi_helpers_pushnotification_messages_FromDtz;

        WorkOrderUpdatePushNotification workOrderUpdatePushNotification = new WorkOrderUpdatePushNotification
            {
                WorkOrderNumber = message.WorkOrderNumber,
                IsSuccessful = message.Message.ToLower().Equals("successful"),
                From = from,
                Message = messageList[supportedLanguage],
                PushNotificationHub = HybridConfig.GetAppSetting("PushNotificationHub")
            };

        var alert = JsonConvert.SerializeObject(workOrderUpdatePushNotification);

        workorderUpdateNotifications.Add(localizedNotificationName, alert);
        workorderUpdateNotifications.Add(localizedAlertName, workOrderUpdatePushNotification.Message);
    }

    return workorderUpdateNotifications;
}
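The per-language dictionary entries generated above follow a simple naming convention: the base template name plus the culture code with the hyphen removed. A minimal sketch of just that convention:

```csharp
using System;

static class TemplateNames
{
    // Mirrors the naming used above: "<base>_<culture without hyphen>".
    public static string Localized(string baseNameFormat, string language)
    {
        return string.Format(baseNameFormat, language.Replace("-", string.Empty));
    }
}

class Program
{
    static void Main()
    {
        Console.WriteLine(TemplateNames.Localized("WorkorderUpdateNotification_{0}", "en-us")); // WorkorderUpdateNotification_enus
        Console.WriteLine(TemplateNames.Localized("WorkorderUpdateAlert_{0}", "si-lk"));        // WorkorderUpdateAlert_silk
    }
}
```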

6. Push the notification

public static Dictionary<string, string> SendWorkorderUpdatePushNotificationTest(
    WorkorderUpdateNotification message)
{
    NotificationHubClient hub = NotificationHubClient.CreateClientFromConnectionString
        (HybridConfig.GetAppSetting("PushNotificationServiceBus"),
         HybridConfig.GetAppSetting("PushNotificationHub"));

    Dictionary<string, string> workorderUpdateNotifications =
        GetWorkorderUpdatePushNotificationsCollection(message);

    // Fire-and-forget here; consider awaiting the task in production code.
    hub.SendTemplateNotificationAsync(workorderUpdateNotifications, message.UserUpdated);

    return workorderUpdateNotifications;
}

 

Happy Coding !!!

Tuesday, May 5, 2015

protobuf-net

 

Protocol Buffers is the name of the binary serialization format used by Google for much of their data communications. It is designed to be:

  • small in size - efficient data storage (far smaller than XML)
  • cheap to process - both at the client and server
  • platform independent - portable between different programming architectures
  • extensible - to add new data to old messages

protobuf-net is a .NET implementation of this, allowing you to serialize your .NET objects efficiently and easily. It is compatible with most of the .NET family, including .NET 2.0/3.0/3.5/4.0, .NET CF 2.0/3.5, Mono 2.x, Silverlight, etc.

See the following samples where you can use it.

Step 1: Install protobuf-net

To install protobuf-net, run the following command in the Package Manager Console

PM> Install-Package protobuf-net

Step 2: Create the Data Models

Unlike XmlSerializer, the member-names are not encoded in the data - instead, you must pick an integer to identify each member. Additionally, to show intent it is necessary to show that we intend this type to be serialized. See the following sample.

[ProtoContract]
class Person
{
    [ProtoMember(1)]
    public int Id { get; set; }
    [ProtoMember(2)]
    public string Name { get; set; }
    [ProtoMember(3)]
    public Address Address { get; set; }
}

[ProtoContract]
class Address
{
    [ProtoMember(1)]
    public string Line1 { get; set; }
    [ProtoMember(2)]
    public string Line2 { get; set; }
}

Notes for Identifiers

  • they must be positive integers
  • they must be unique within a single type
    • but the same numbers can be re-used in sub-types if inheritance is enabled
  • the identifiers must not conflict with any inheritance identifiers
  • lower numbers take less space - don't start at 100,000,000
  • the identifier is important; you can change the member-name, or shift it between a property and a field, but changing the identifier changes the data

 

Step 3: Serialize to stream

Class with protobuf functions
public class ProtoBuf
{
    public static MemoryStream Serialize(object content)
    {
        MemoryStream objectStream = new MemoryStream();
        Serializer.Serialize(objectStream, content);
        return objectStream;
    }

    // Accept any Stream so both MemoryStream and FileStream work.
    public static T Deserialize<T>(Stream objectStream)
    {
        objectStream.Position = 0;
        T content = Serializer.Deserialize<T>(objectStream);
        return content;
    }
}

 

Calling the methods
var person = new Person
    {
        Id = 12345,
        Name = "Fred",
        Address = new Address
            {
                Line1 = "Flat 1",
                Line2 = "The Meadows"
            }
    };

// serialize to stream
MemoryStream stream = ProtoBuf.Serialize(person);

// deserialize to a Person object
stream.Position = 0;
Person personObject = ProtoBuf.Deserialize<Person>(stream);

 

Step 4: Serialize and write to a file.

var person = new Person
    {
        Id = 12345,
        Name = "Fred",
        Address = new Address
            {
                Line1 = "Flat 1",
                Line2 = "The Meadows"
            }
    };

using (var file = File.Create("person.bin"))
{
    Serializer.Serialize<Person>(file, person);
}

using (var file = File.OpenRead("person.bin"))
{
    Person personObject = Serializer.Deserialize<Person>(file);
}

 

For more information on protobuf-net, visit here.

Happy Coding !!!

Sunday, April 5, 2015

Clean Code Practices: Conditions

 

It is said

“Any fool can write code that a computer can understand. Good programmers write code that humans can understand.”

so we have to make sure the code we write can be easily read and understood by other developers. In this post I will discuss practices we can follow to make our conditional statements clean, showing "dirty" and "clean" versions of the code so you can see the difference.

Principles:

  1. Clear Intent
  2. Use the right tool
  3. Bite-size logic
  4. Sometimes code isn’t the answer

 

1. Use Positive Conditionals

When you write conditions, always try to use positive conditionals rather than negative ones, because humans grasp positive statements more easily.

Dirty
if (!isNotLoggedIn)
{
}

In the example above, it is a bit confusing to grasp the meaning at first glance.

Clean
if (loggedIn)
{
}

 

2. Use the Ternary Operator

See the following example first.

Dirty

int bookingFee;

if (isEarlyBirdBooking)
{
    bookingFee = 5000;
}
else
{
    bookingFee = 8000;
}

We can see the variable bookingFee is referenced in two places, and the assignment is spread over many lines. For conditions like this you can use the ternary operator to make the code cleaner.

Clean

int bookingFee = isEarlyBirdBooking ? 5000 : 8000;

As you can see, this version has

  • less code
  • the intent of the logic presented briefly and clearly

3. Avoid Using ‘Stringly’ Types

See the following example first.

Dirty

if (user.Type == "Administrator")

In this sample we are checking a condition against a string, which might lead to many issues: there could be a case mismatch, and developers might make spelling mistakes. To avoid these, use enums rather than string values.

Clean

if (user.UserType == UserType.Administrator)

if you use enums

  • there will be no typos
  • you will get IntelliSense support from Visual Studio, as it shows all possible values for the user type
  • you can easily search for where a specific user type is used in the code; if you use strings instead, a search will also hit commented-out code and other unwanted matches
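
A minimal sketch of the enum approach; the UserType members and the User class here are made up for illustration:

```csharp
using System;

// Hypothetical enum; a real application would define its own member list.
enum UserType
{
    Administrator,
    Contractor,
    Guest
}

class User
{
    public UserType UserType { get; set; }
}

class Program
{
    static void Main()
    {
        var user = new User { UserType = UserType.Administrator };

        // Strongly typed comparison: no case mismatches, no spelling mistakes.
        if (user.UserType == UserType.Administrator)
        {
            Console.WriteLine("admin"); // prints "admin"
        }

        // Enum.TryParse gives a safe bridge when the value arrives as text.
        UserType parsed;
        bool ok = Enum.TryParse("Contractor", out parsed);
        Console.WriteLine(ok && parsed == UserType.Contractor); // True
    }
}
```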