Choosing the right MS SQL editions and implementation

I have frequently come across situations where clients are confused about which MS SQL edition to select and whether they should deploy on-premises or on the cloud. And given that there are several other IaaS players too, choosing where to host a SQL Server on the cloud becomes even more difficult.

 

Editions of MS SQL 2016

Microsoft recently launched the MS SQL 2016 editions. They are available in four flavors: Express, Standard, Enterprise and Developer.

Here is a brief summary of what these editions are meant for:

[Table: summary of the SQL Server 2016 editions]

 

Although there is a lot of information available on Microsoft’s website, I have filtered out some of the most important features of the different 2016 editions that should be considered before deciding which edition is right for you.

[Table: feature comparison of the SQL Server 2016 editions]

* Basic HA – restricted to two-node, single-database failover with a non-readable secondary. Basic HA ensures data availability: your data is not lost, thanks to a fast two-node synchronous replica, but the secondary cannot be read.

** Advanced HA – Always On availability groups: multi-database failover with readable secondaries.

On-premises deployment

There are four costs associated with an on-premises setup:

  1. Infra cost
  2. Hardware
  3. License cost
  4. Personnel

I would like to focus only on the licensing cost. There are two types of licenses available for the SQL Server Standard edition:

  1. Server + CAL license: the MS SQL Server license costs $931, plus $209 per client access license (CAL), which is either user based or device based.
  2. Core-based license: $3,717 per core, sold in two-core packs. There is no restriction on the number of users or devices that can access the server under this type of license.
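To make the trade-off concrete, here is a rough break-even sketch using the list prices above. The pack arithmetic is a simplification (real SQL Server licensing has additional per-processor core minimums), so treat it as an illustration only:

```python
# Rough break-even sketch for SQL Server Standard licensing,
# using the 2016 list prices quoted above.
SERVER_LICENSE = 931      # Server + CAL model: one server license
CAL_PRICE = 209           # per user or per device CAL
CORE_PRICE = 3717         # per core, sold in two-core packs

def server_cal_cost(num_cals: int) -> int:
    """Total cost under the Server + CAL model."""
    return SERVER_LICENSE + CAL_PRICE * num_cals

def core_cost(num_cores: int) -> int:
    """Total cost under the core-based model (two-core pack minimum)."""
    packs = max(1, -(-num_cores // 2))  # round up to whole two-core packs
    return CORE_PRICE * 2 * packs

# Example: a 4-core server.
four_core = core_cost(4)    # 4 cores -> 2 packs
print(four_core)            # 14868

# How many CALs before core licensing becomes cheaper on that server?
users = 0
while server_cal_cost(users) < four_core:
    users += 1
print(users)                # 67
```

In other words, on a small four-core box, core licensing only starts to pay off once you pass roughly 67 users or devices; below that, Server + CAL is cheaper.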

On-cloud deployment

The benefit of spinning up SQL Server on the cloud is that it’s fast and easy, and you also have the option of getting a fully managed SQL instance.

Comparison of costs on Azure, AWS and SoftLayer


[Chart: monthly cost comparison of the configurations below on Azure, AWS and SoftLayer]


The tables below show the configurations that were considered.


For AWS and IBM SoftLayer:

         Cores   RAM (GB)   HDD (GB)
CONF 1     2        8         100
CONF 2     4       16         200
CONF 3     8       32         400
CONF 4    16       64         800

For Azure:

         Cores   RAM (GB)   HDD (GB)
CONF 1     2        7         100
CONF 2     4       14         200
CONF 3     8       28         400
CONF 4    16       56         800

The graph clearly shows that SoftLayer is the cheapest compared to both AWS and Azure. IBM SoftLayer also offers the following added advantages:

  1. An included data download allowance of 250 GB with virtual instances and 500 GB with bare metal instances
  2. There are no inter-DC data transfer charges
  3. The instances are not bundled, so you have the flexibility of increasing or decreasing cores, RAM and HDD independently, which is not the case with Azure

Comparing the cost of on-cloud vs. on-premises deployment is a little tricky. You need to take the following into account:

  1. When is your server hardware refresh due: This is important because, assuming you have recently invested in hardware and the next refresh is due only after three years, you will incur only the MS SQL license cost. In this case, going on-premises will usually make more sense.
  2. Number of users in the organization: Assuming you have only 20–25 users and there is a lot of uncertainty about whether that number will grow or shrink, then on-cloud will usually make sense. You just purchase the server license and take CALs from your cloud service provider at a minimal monthly cost.

In case you want to know more about the implementation and pricing on SoftLayer, or want a TCO analysis for your implementation, you can reach out to me on this link or drop a comment here.

 

Shopping cart for Hadoop as a Service

Hadoop is an open source framework that different vendors take, customize, add their own products on top of, and bring to market as new products with different features and functionalities.
I don’t know how far this analogy holds, but it’s like the Android OS: different vendors take the same Android core, customize it, build their own functionalities on top of it and create an altogether different product.
Typically, different Hadoop distributions come with different sets of tools, support, optimizations and additional features. The challenge, then, is how to decide which Hadoop service suits our requirements and can serve the organization’s purpose.
You can see a list of Hadoop distributions here. Forrester, in a recent report, has done a market analysis and rated different Hadoop-on-cloud vendors.
[Figure: Forrester ratings of Hadoop-on-cloud vendors]
Here is a list of the top Hadoop distributions, the value additions in each, and my thoughts on which would work for which use case:

  1. Cloudera Distribution of Apache Hadoop (CDH): Cloudera was the first commercial Hadoop startup. CDH offers the core open distribution along with a number of frameworks, including Cloudera Search, Impala, Cloudera Navigator and Cloudera Manager.
  2. Pivotal HD: includes a number of Pivotal software products such as HAWQ (SQL engine), GemFire, XD (analytics), Big Data Extensions and the USS storage abstraction. Pivotal supports building one physical platform to host multiple virtual clusters, as well as PaaS using Hadoop and RabbitMQ.
  3. IBM InfoSphere BigInsights: includes visualization and exploration, advanced analytics, security and administration. No other vendor gives you the flexibility of working on a bare metal machine, but that comes at the price of scalability: a bare metal machine can’t be scaled up or down on the fly. IBM’s other products (BigQuality, BigIntegrate and IBM InfoSphere Big Match) can be seamlessly integrated for mature enterprise operations.
  4. Amazon Elastic MapReduce (EMR): comes with EMRFS, which allows EMR to connect to S3 and use it as a storage layer. The fact that S3 is the market leader in object storage, and that many enterprises already use it for their big data storage, makes EMR an obvious choice. But EMR works with AWS data stores only, and I doubt it can be integrated with other storage options.
  5. Azure HDInsight: uses the Hortonworks Data Platform (HDP) distribution, which is designed for the Azure cloud. Enterprise architects can use C#, Java and .NET to create, configure, monitor and submit Hadoop jobs.
  6. Google Cloud Dataproc: has built-in integration with Google Cloud services like BigQuery and Bigtable. Unlike other vendors, Google bills you by the minute.
Looking at the functionalities and features, it’s easy to get confused by the plethora of options available right now, with each vendor trying hard to grab a bigger piece of the pie.

 

4 Disaster Recovery Strategies you must know

Typically, there are four possible scenarios when designing a DR solution:

1. Backup and restore, or Cold DR: data is backed up to a data center in another region and restored when required. We typically have the following storage options available to perform the backup and restore:

  • Object Storage
  • Block Storage
  • File Storage or a NAS

Another thing that really matters with this approach is how you transfer the data from your on-premises data center to the cloud provider. The options are:

  • Using the internet
  • Shipping the media directly to the cloud vendor
  • Using an application that can transfer data at higher speed, such as IBM Aspera
  • A direct line between your DC and the cloud provider’s DC

2. Pilot light for quick recovery, or Warm DR: a minimal version of the environment is always running in the cloud. The idea is that your data is kept ready in the cloud through replication, and in case of a disaster your network is configured to route traffic to the surviving active site.
Configuring the network for this failover is possible in two ways:

  1. Using IP addresses
  2. Using Load balancing

The prerequisite for this type of setup is a two-tiered architecture, i.e. the app server and DB server are separate servers. You replicate only the DB server and keep installation scripts (along with images of the production server) ready for the app server, while the core components are always mirroring.
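As a toy illustration of the IP-address approach, the sketch below probes a primary site and falls back to a DR address when it stops responding. The hostnames are hypothetical, and real production setups would use DNS health checks or a load balancer instead:

```python
# Toy illustration of IP-based failover: probe the primary site and
# fall back to the DR site's address when it stops responding.
# Hostnames and ports are hypothetical examples.
import socket

PRIMARY = ("primary.example.com", 443)
DR_SITE = ("dr.example.com", 443)

def reachable(addr, timeout=2.0):
    """Return True if a TCP connection to addr succeeds within the timeout."""
    try:
        with socket.create_connection(addr, timeout=timeout):
            return True
    except OSError:
        return False

def active_site():
    """Route to the primary while it is up; otherwise fail over to DR."""
    return PRIMARY if reachable(PRIMARY) else DR_SITE
```

A real load-balancer or DNS health check does essentially this probe continuously, shifting traffic to the DR site the moment the primary stops answering.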

3. Warm standby, or Hot DR: a scaled-down version of the environment is always running in the cloud. The app server in the cloud is also connected to the on-premises DB server and vice versa, and both DB servers are always running. In this setup you end up paying a little more for DR, because both the app servers and the DB servers are always on.

4. Multi-site solution, or Active/Active:
Both the cloud and on-premises environments are always active. The app servers and DB servers are all active and share the workload, and the data in the two DB servers is mirrored.

The table below shows the gist of the above 4 DR strategies:

Strategy                       What runs in the cloud                      Relative cost   Recovery speed
Backup and restore (Cold DR)   Backups only                                Lowest          Slowest
Pilot light (Warm DR)          Replicated DB; app server scripted/imaged   Low             Fast
Warm standby (Hot DR)          Scaled-down app and DB, always running      Higher          Faster
Multi-site (Active/Active)     Full environment, sharing the workload      Highest         Near-instant

In case you need any assistance setting up a DR solution, drop me a direct message or just ping me here.

How to create a video application using IBM SoftLayer

Typically, a video service needs the following capabilities, and IBM provides an application for each:

  1. Something that can upload files faster – Aspera
  2. Something that empowers users to access videos from any device – Clearleap
  3. Something that provides live streaming capabilities – UStream
  4. Something that enables on-demand ingest and distribution – Clearleap

It’s also very important that the core infrastructure components are planned wisely. These typically include:

  1. Compute – ideally a VM, which ensures faster scalability
  2. Storage – ideally object storage
  3. Network bandwidth
  4. CDN – a strong content delivery network with a greater number of POPs

If you need more information about any of these services, just click here to reach out to me.

 

Future of Healthcare

With the emergence of Cloud as the backbone, Big Data analytics as the heart and cognitive computing as the brain, we can safely say that the healthcare industry is transforming like never before, at an unprecedented pace. Think about the time when it was extremely difficult for patients to find the right doctor, when diagnosis took ages, and when doctors were burdened with the uphill task of identifying the right treatment.

Here are five ways in which healthcare is going to change in the future:

1. Integrated healthcare ecosystem: it’s important to understand that, like any other industry, healthcare has a complete ecosystem of value providers. For example:

  • Pharmacists for medicines
  • Ambulance services
  • Path labs
  • Hygiene and housekeeping services
  • Insurance providers
  • Clinical instrument manufacturers
  • Regulators

Technology has, to a great extent, brought this entire ecosystem together to provide integrated healthcare. The result is improved patient services, cost optimization and new emerging business models.

2. Ubiquitous and personalized healthcare: digitization of patient health records enables doctors to preserve a patient’s medical history and provide health services with far more personalization and precision. This trend will continue; over time, as this digitized data is analyzed using sophisticated Big Data analytics technologies and made available ubiquitously, almost anywhere and anytime, ailments can even be predicted and medicines prescribed accordingly.

3. Convergence of healthcare and mobility with IoT: healthcare providers will place mobile diagnostic devices in patients’ homes, link them to cloud platforms and monitor them continually. The explosion of wearable devices, and the amount of data they produce, will empower doctors to make patients aware of their medical conditions in real time.

Imagine getting an alert from your hospital or doctor, much like the one you get for internet overuse, the next time your lifestyle needs to change.

4. Actionable-insights-driven and targeted healthcare: insights based on analytics integrated with mobile devices or smart sensors will be used to improve clinical outcomes. The next step for hospitals, then, is to develop the capability to tap into this enormous and complex data.

5. Healthcare access for rural and poor people: with the advent of video streaming and expanded high-speed fibre coverage, it will become possible to provide treatment remotely and virtually. Doctors can see patients on their screens over high-speed internet in places where it is almost impossible for quality healthcare to reach.

Like any other profession and industry, healthcare too is poised for a big leap forward. The only deciding factor, then, is: how willing and prepared are you to embrace this change?

How to cancel a SoftLayer VM

The SoftLayer SLA says that you have to cancel a device 24 hours before the next billing cycle to avoid being billed for the next month. Follow the steps below to cancel a device.

Step 1 : Log in to https://control.softlayer.com by entering your username and password

Step 2: Go to Devices -> Device List


 

Step 3: You will see all the devices you have purchased. Go to the device you want to cancel and, on the right-hand side of the device name, click Actions.

Step 4: Click on Cancel Device, as shown below.


A cancellation ticket will be raised and your device will be cancelled within 24 hours. Please ensure you take a backup of your data: once your device is deleted, you will not be able to recover the data in any possible way.

 

Amazon’s new S3 life cycle policies

Amazon recently announced two new S3 lifecycle policies:
1. Incomplete multipart upload expiration policy: the multipart upload feature improves PUT performance by splitting large files into small parts and uploading them in parallel, which increases upload speed. But what if the upload never completes? You are still left with some S3 storage consumed, and you are still being charged for it. Until now, you had to remove the incomplete parts manually.

With this new policy you can set an expiration period for incomplete uploads so they are removed from storage automatically and you avoid being charged.

2. Expired object delete markers expiration policy: The S3 bucket versioning feature helps you recover from unintended user deletes or application logic failures. When you delete a versioned object, a delete marker becomes the current version of the object and the original is retained as the previous version. While you are not charged for storing delete markers, removing expired markers can improve performance for list requests on your bucket. With this launch, you can now set a lifecycle policy to automatically remove the current version delete marker when previous versions of the object no longer exist.  ( As explained in AWS Blog)
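Here is a minimal sketch of how the two rules could be expressed together as an S3 lifecycle configuration. The 7-day abort window and the bucket name are illustrative assumptions, not part of the announcement:

```python
# Sketch of an S3 lifecycle configuration combining both new rules.
# The 7-day abort window is an arbitrary example value.
lifecycle_rules = {
    "Rules": [
        {
            # Rule 1: abort multipart uploads that never complete, so the
            # orphaned parts stop consuming (billed) storage.
            "ID": "abort-stale-multipart-uploads",
            "Filter": {"Prefix": ""},   # apply to the whole bucket
            "Status": "Enabled",
            "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
        },
        {
            # Rule 2: remove delete markers once no older versions remain.
            "ID": "remove-expired-delete-markers",
            "Filter": {"Prefix": ""},
            "Status": "Enabled",
            "Expiration": {"ExpiredObjectDeleteMarker": True},
        },
    ]
}

# Applying it would look roughly like this (requires boto3 and credentials):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-example-bucket", LifecycleConfiguration=lifecycle_rules
# )
```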

Essentials of Mobile App hosting on cloud

Hosting your mobile app on the cloud requires many decisions. What type of cloud service do I want, IaaS or PaaS? Which components are required to complete the mobile hosting?

These are the PaaS services one would require to host a mobile app:

  • Mobile Gateway: acts as the entry point to the cloud provider’s services for end users. It provides the following services:
    1. Authentication
    2. Policy enforcement
    3. API invocation analytics
  • Mobile back end: this is the actual platform where most of the functionality lives. When you design a mobile app, this is the first thing you have to consider:
    1. App Logic / API Implementation
    2. Mobile App Operational Analytics
    3. Push Notifications
    4. Location Services
    5. Mobile Data Sync
    6. Mobile App Security

    Most PaaS providers have a lot of services available for each of these functionalities.

  • Data Services: mobile apps usually work on unstructured data, with relatively little need for structured data. Following are the services you should look to take from a PaaS provider:
    1. Mobile App Data / NoSQL
    2. File Repositories
    3. Cache
  • Mobile Device Management: These services are typically provided to enterprise customers who offer mobility for their employees. They include:
    1. Enterprise App Distribution
    2. Mobile Device Security
    3. Device Management
    4. Device Analytics

Enterprises use these services to manage the laptops, mobile phones and tablets given to employees working on the go.

  • Mobile Business Applications: these services help marketers understand their customers better and execute personalized marketing campaigns. Typically, the services below are provided:
    1. Proximity services and Analytics
    2. Campaign Management
    3. Business Analytics and Reporting
    4. Work flow / Rules
  • Security services: needless to say, security is the most important aspect of any application, and for a mobile application it matters even more. Below are the services you should consider implementing:
    1. Identity and Access Management
    2. Data and Application Protection
    3. Security Intelligence

There are different products/services from different vendors, and I will be covering those products in coming blogs.

In addition to these, you can use IaaS to get the following services:

Compute: this depends on what type of computing service you want to implement or start with. You have the following options to choose from:

  1. Virtual Machines
  2. Container Services
  3. Run time environment services

The faster you want to get started, the less control you will have over your computing services. For example, a virtual machine provides the maximum control, but at the same time you have to be technically very sound to host your application and set up the entire environment.

Network Services: bandwidth is the most important aspect of networking services. Typically, IaaS providers don’t charge for inbound traffic but do charge for outbound data, so go for a service provider that offers the maximum outbound data allowance. If your app streams video or audio, you also need a Content Delivery Network (CDN), which caches and streams data from the server nearest to each user to decrease latency.

Storage Services: if your workload involves transactional data, go for block storage, whereas if it involves unstructured data, object storage is the better option.

Essentials of Big Data and Analytics on Cloud

Starting a Big Data and Analytics (BDA) project on the cloud is not only faster but also considerably cheaper. I have come across many clients who want to start a BDA project but don’t proceed, or rather keep delaying it, worrying about the upfront cost, required skills and execution time. All of these concerns can be well taken care of if they start the project on the cloud.

The challenge for them is deciding which cloud services to go for and which vendor to select. Here is a list of the services currently available on the cloud that are worth considering when starting a BDA project.

1. Edge Services: these act as the interface between your users, your data and the cloud service provider. They serve the following purposes:

  • DNS resolution
  • CDN services
  • Firewall
  • Load balancers

These are typically available as IaaS from companies like IBM, AWS and Microsoft.

2. Data Streaming: this is primarily for data in motion. You need data streaming for:

  • Real time analytical processing
  • Data Augmentation

Data streaming tools are available as SaaS on various cloud marketplaces.

3. Data Integration: data from different sources is delivered to the cloud service provider using edge services and then goes through the following steps to extract insights:

  • Data Staging
  • Data quality checks
  • Transformation and loading

These are also available as SaaS on various cloud marketplaces.

4. Data Repositories: a data repository holds both data in motion (from streaming services) and data at rest (after the data integration process) and prepares it for the various analytical engines. Data repositories are meant for the following functionalities:

  • Data warehousing
  • Landing, exploration and archive
  • Deep Analytics and modelling
  • Interactive Analytics and Reporting
  • Catalog

Earlier, few SaaS offerings were available for data repository services, but now there are many on different cloud marketplaces.

5. Actionable Insights: data from the data repository is then fed into a variety of tools to extract insights. Typically you need different tools to perform the following:

  • Decision Management
  • Discovery and Exploration
  • Predictive Analytics
  • Analysis and Reporting
  • Content Analytics
  • Planning and Forecasting
  • Visualization

In addition to the above services, you also get data security and governance services on the cloud. Multiple vendors provide either all or part of these services; the market is flooded with options, and selecting one service or vendor really requires a lot of research and careful consideration.

AWS S3 and IBM Object Storage- which one is better ?

IBM object storage is based on OpenStack Swift technology. You can use Swift APIs or one of the language clients to control Object Storage objects. Swift functions as a distributed, API-accessible storage platform that can be integrated directly into applications or used to store files like VM images, backups, and archives as well as photos and videos.

Since IBM’s Object Storage is based on Swift, all the APIs that work with Swift also work with IBM’s Object Storage.

Architecturally, AWS S3 and IBM SoftLayer Object Storage are the same: once you create an account, you create a container that holds the objects you define. You can create sub-folders and nested folders (a folder within a folder) inside a container.

The only differentiating factor that remains is pricing.

How IBM SoftLayer charges for Object Storage :

IBM SoftLayer Object Storage pricing is relatively easy to understand and predictable. It charges for only the following components:

  1. Storage used per GB per month
  2. Bandwidth

The storage pricing includes all types of requests to the object storage.

How does AWS charge for S3?

You can get started with AWS S3 for free. Upon sign-up, new AWS customers receive 5 GB of Amazon S3 standard storage, 20,000 Get Requests, 2,000 Put Requests, and 15GB of data transfer out each month for one year.

After the free tier, AWS charges for:

  1. Storage
  2. Requests to storage
  3. Data Transfer

For more details on the pricing, refer to this link.

Which is cheaper – IBM SoftLayer or AWS S3 ?

I used the AWS Simple Monthly Calculator and the IBM TCO calculator to compare prices under the following assumptions:

  1. US Datacenter
  2. 2TB Object storage
  3. 200 GB outbound data
  4. 10,000 requests (PUT/COPY/POST/LIST plus GET and other requests)

The price for IBM SoftLayer Object Storage comes out to $98 USD.

AWS S3 price is $76.06 USD
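As a back-of-the-envelope sketch of how such an estimate is assembled, the snippet below applies illustrative (assumed) per-GB rates to the scenario above. The rates are not published prices; the vendor calculators remain the authoritative source:

```python
# Back-of-the-envelope object storage estimate for the scenario above:
# 2 TB stored, 200 GB outbound, 10,000 requests. The rates are illustrative
# assumptions, not quoted prices; use the vendor calculators for real numbers.
STORAGE_GB = 2 * 1024
OUTBOUND_GB = 200
REQUESTS = 10_000

def estimate(storage_rate, outbound_rate, per_10k_requests=0.0):
    """Monthly cost = storage + outbound bandwidth + request charges."""
    return (STORAGE_GB * storage_rate
            + OUTBOUND_GB * outbound_rate
            + (REQUESTS / 10_000) * per_10k_requests)

# Hypothetical rates ($/GB-month, $/GB out, $/10k requests):
softlayer_like = estimate(0.04, 0.09)         # storage + bandwidth only
aws_like = estimate(0.03, 0.09, 0.05)         # also bills per request

print(round(softlayer_like, 2), round(aws_like, 2))
```

Even with made-up rates, the structure of the bill is what matters: SoftLayer’s two-component model is easy to predict, while the AWS total moves with request counts and storage-class splits as well.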

A few points to note:

  1. AWS pricing is not completely predictable: every request is charged, and the storage itself is further divided into infrequent access storage and reduced redundancy storage.
  2. The definitions of infrequent access storage and reduced redundancy storage are not clear-cut, which makes it even more difficult to predict the total bill using the AWS monthly calculator.
  3. In AWS, inter-region data transfer is charged, whereas intra-region data transfer is free.
  4. AWS provides basic tech support that mainly covers billing-related issues, whereas IBM’s tech support is completely free. However, for object storage, tech support is not essential and can easily be ignored.

Conclusion

AWS S3 comes out to be cheaper than IBM SoftLayer Object storage if we consider the above parameters.

 

 

 

 

 

 

 
