How Do You Qualify Something as Cloud Computing?

Jeff Bezos: "You don't generate your own electricity, why generate your own computing?"

Melvin Lancelot's Blog

Practically everyone with an online business model is now referring to their service as cloud computing - from your average Joe hosting firm all the way to the SaaS/S+S vendors, everyone wants to ride the next buzzword wave, and this distorts the term "cloud computing" altogether. Part of the reason is that historically the term "cloud" loosely referred to anything available online/on the Internet. Ask a bunch of geeks and you would get a different explanation of cloud computing from each person (if you have ever asked a bunch of people what Web 2.0 is, you know what I mean). So how can we qualify whether a service is really leveraging the cloud computing model?

That's a tough question. An easier way to answer it is by first examining the behavior of services provided by some of the well-known cloud computing vendors. Let's take Amazon and Google as two examples. Both have different business models, but under the hood, when we use their cloud computing services within our own application or service, the three common behaviors we see are:

a. Scalability
b. Availability
c. Economical/cost effective

Let's discuss scalability first. Imagine you're tasked to design an online application/service that should be "Internet scalable." Imagine (if you will) that you're designing the next big social network or the next big YouTube. How do you go about designing it to support millions of users? For that matter, how does anyone do it?

The short answer - you do it iteratively. "Iteratively" is a good euphemism; the reality is that you do it after several design blunders and limitations :). In the iterative approach you first design a cost-effective solution that scales for a smaller audience, and when you start seeing more traffic you add more hardware until it no longer improves anything, at which point you redesign for better scalability. This is how Amazon and Google have grown as well, and how they have designed their overall systems to support such high scalability - but interestingly, they have abstracted their designs to a degree that they could be repackaged into a subscription-based service - a cloud computing service.

But first, how do you build a massively scalable solution? How would you build your architecture to accommodate linear scalability?

Any architecture can be described as being composed of two types of components:

a. Stateless components
b. Stateful components

Stateless components are those that only do some processing on data and don't persist any state - hence they are easy to scale via a scaled-out design, which as a byproduct also gives you higher availability. With a scaled-out architecture you can keep adding inexpensive boxes to the system, reducing your cost while the system continues to hum along.
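
As a minimal, hypothetical sketch of what "stateless" buys you (the worker function and the naive balancer below are invented for illustration, not taken from any vendor's stack):

```python
# A minimal sketch, not production code. Because handle_request is a pure
# function of its input and touches no local state, any replica can serve
# any request, so capacity grows by simply adding more identical workers.

import hashlib


def handle_request(payload: bytes) -> str:
    """Process a request using only its input; no state is read or written."""
    return hashlib.sha256(payload).hexdigest()


class RoundRobinBalancer:
    """Naive round-robin dispatch across interchangeable stateless workers."""

    def __init__(self, workers):
        self.workers = workers
        self.counter = 0

    def dispatch(self, payload: bytes) -> str:
        worker = self.workers[self.counter % len(self.workers)]
        self.counter += 1
        return worker(payload)


# "Scaling out" is just appending more identical boxes to the pool.
balancer = RoundRobinBalancer([handle_request, handle_request, handle_request])
print(balancer.dispatch(b"some unit of work"))
```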

Stateful components, on the other hand, are those that persist the state of the resources they work with - for example, file systems, databases, BLOBs, etc. Traditionally the only option to scale these was a scaled-up approach - i.e., beef up the hardware on the server. This is more expensive than a scaled-out architecture and introduces a single point of failure. We have traditionally used approaches such as data replication to scale these components out, but they introduce several complications. Fundamentally, stateful components are the ones that limit the overall performance and scalability of a system.
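
To see why stateful components are harder, here is a deliberately simplified sketch of one common scale-out workaround - partitioning ("sharding") data by key - with plain dictionaries standing in for database nodes; everything that makes this painful in practice (rebalancing, replication, cross-shard queries) is left out:

```python
# A toy sharding sketch: state is split across "nodes" by hashing the key.
# The dicts stand in for separate database servers; failure handling,
# rebalancing when nodes are added, and replication are all omitted.

import hashlib

shards = [
    {},  # pretend database node 0
    {},  # pretend database node 1
    {},  # pretend database node 2
]


def shard_for(key: str) -> dict:
    """Deterministically map a key to one of the shards."""
    digest = int(hashlib.md5(key.encode()).hexdigest(), 16)
    return shards[digest % len(shards)]


def put(key: str, value: str) -> None:
    shard_for(key)[key] = value


def get(key: str) -> str:
    return shard_for(key)[key]


put("user:42", "profile blob")
print(get("user:42"))
```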

Stateful components may be notoriously difficult to scale; however, even stateless components can present scalability challenges when you're planning for massively scalable/Internet-scale scenarios. Cloud computing vendors have made significant investments in technology to ensure that compute-intensive processing can be parallelized and distributed to the max. Google's MapReduce is an elegant approach to solving these challenges.
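
For readers new to the model, a toy word-count sketch of the MapReduce idea follows; it only illustrates the programming model, not the distributed machinery Google actually runs it on:

```python
# A toy, single-process word count in the MapReduce style: "map" emits
# key/value pairs from independent chunks of input, a shuffle groups the
# values by key, and "reduce" folds each group into a result.

from collections import defaultdict


def map_phase(document: str):
    """Emit (word, 1) for every word in one chunk of input."""
    for word in document.split():
        yield word.lower(), 1


def reduce_phase(word: str, counts) -> tuple:
    """Fold all intermediate counts for a word into a single total."""
    return word, sum(counts)


documents = ["the cloud scales", "the cloud is elastic", "scales and scales"]

# Shuffle: group intermediate values by key (each document could have been
# mapped on a different machine).
grouped = defaultdict(list)
for doc in documents:
    for word, count in map_phase(doc):
        grouped[word].append(count)

word_counts = dict(reduce_phase(w, c) for w, c in grouped.items())
print(word_counts)  # {'the': 2, 'cloud': 2, 'scales': 3, 'is': 1, ...}
```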

We need a new breed of products to handle how stateless and stateful components can be scaled and distributed. The traditional approach of persisting all state in the database just doesn't make sense any more; increasingly, the trend is not to store atomic-level transaction information in a database as we used to, but to store it in the form of blobs, while still managing resources economically. A large part of Amazon's and Google's innovation in cloud computing (their magic sauce, if you will) involves developing these proprietary components.
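
As a hedged illustration of the blob-oriented approach (the bucket name and key scheme below are placeholders, not anyone's real schema), the sketch writes a whole order record to an object store instead of normalizing it into relational rows:

```python
# A sketch of storing a transaction as a single blob in Amazon S3 via boto3.
# Running this requires AWS credentials; the bucket and key are made up.
# The object store takes care of durability and scale-out.

import json

import boto3  # pip install boto3

order = {"order_id": "1234", "items": [{"sku": "A1", "qty": 2}], "total": 19.99}

s3 = boto3.client("s3")
s3.put_object(
    Bucket="example-orders-bucket",          # placeholder bucket name
    Key=f"orders/{order['order_id']}.json",  # one blob per transaction
    Body=json.dumps(order).encode("utf-8"),
    ContentType="application/json",
)
```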

Another way of looking at the scalability that cloud computing gives you is the ability to scale your computing resources as and when you see demand - after all, it's a pay-as-you-go model. Traditionally, online companies have provisioned hardware to meet spikes in traffic (like holiday seasons), which implies that they are unnecessarily paying for extra scale they don't leverage all the time. By hosting with a cloud vendor they can dramatically reduce their operational cost.
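
To make the pay-as-you-go argument concrete, here is a toy comparison (the traffic figures and per-server capacity are assumptions, not real pricing) between a fleet provisioned for the worst-case spike and one resized to match demand hour by hour:

```python
# A back-of-the-envelope sketch: static provisioning pays for peak capacity
# around the clock, while elastic provisioning pays only for what each hour
# of demand actually requires.


def servers_needed(requests_per_sec: int, capacity_per_server: int = 500) -> int:
    """Size the fleet to current demand (ceiling division, minimum one box)."""
    return max(1, -(-requests_per_sec // capacity_per_server))


hourly_traffic = [200, 800, 5_000, 12_000, 3_000, 400]  # a holiday-style spike

peak_fleet = servers_needed(max(hourly_traffic))         # provisioned up front
static_server_hours = peak_fleet * len(hourly_traffic)
elastic_server_hours = sum(servers_needed(t) for t in hourly_traffic)

print(f"static provisioning:  {static_server_hours} server-hours")
print(f"elastic provisioning: {elastic_server_hours} server-hours")
```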

Well, it's not just Amazon and Google that are thinking in this direction; several platform vendors like Microsoft, along with various open source groups, are releasing products that will address these challenges. You can expect to see some radically different products from these vendors aimed at the distributed, massively scalable problem. You can also expect (hosting) companies to leverage these prepackaged cloud computing capabilities and provide them as a subscription service.

Cloud computing vendors are able to provide Internet scalability at an affordable cost and can potentially give you a better SLA than if you were to manage your own infrastructure - that's the overall package that makes cloud computing so compelling, probably best described by Jeff Bezos: "You don't generate your own electricity, why generate your own computing?" Arguably there are other factors that influence your vendor decision, but we hope that the next time you're evaluating a cloud computing vendor or solution, or building your own, you know what to look for.

 

More Stories By Melvin T Lancelot

Melvin Lancelot is a technical architect working for the consulting group at Aditi Technologies. On a day-to-day basis he helps ISVs and enterprises succeed by leveraging the right blend of technology, platform, and market trends. He contributes to a blog at http://techturks.blogspot.com.
