
Control, Choice, and Cost: The Conflict in the Cloud

One of the oft-cited reasons in surveys that enterprises aren’t flocking to the cloud like lemmings off a cliff is “lack of control”. The problem is that the articles and pundits quoting this reason never really define what that means.

After all, cloud providers appear to be cognizant of the need for users (IT) to be able to define thresholds, reserve instances, deploy a variety of “infrastructure”, and manage their cloud deployment themselves. The lack of control, however, is at least partially about control over the infrastructure itself and, perhaps, complicated by the shallow definition of “infrastructure” by cloud providers.

Perusing the options available from Amazon can be overwhelming and underwhelming at the same time, depending on what you’re looking for. The list of operating systems and application environments that can be deployed is extensive, a veritable potpourri of options. But that’s where it ends – at the application infrastructure. If there’s an option for deploying or managing the rest of the supporting infrastructure, I’ve yet to stumble across it. Load balancing. Acceleration. Optimization. Identity and access management. Security. Quality of service. Rate shaping. Compression. Secure remote access? Certificate management? LDAP? XML gateway? The list would grow quite extensive if you enumerated all the components that go into a well-oiled network and application network infrastructure. You know, the infrastructure that’s necessary to deliver applications.
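To make the gap concrete, here is a minimal sketch – not any real provider’s API – of what a deployment object might look like if delivery services were first-class offerings. The service names and the `CloudDeployment` class are entirely hypothetical; the point is that today’s IaaS catalogs stop at the application image and offer nothing like this.

```python
from dataclasses import dataclass, field

# Hypothetical catalog of delivery-infrastructure services; none of these
# names correspond to a real cloud provider's offering.
DELIVERY_SERVICES = {
    "load_balancer", "acceleration", "optimization", "iam",
    "rate_shaping", "compression", "cert_management", "xml_gateway",
}

@dataclass
class CloudDeployment:
    app_image: str                       # the part providers DO offer today
    services: set = field(default_factory=set)

    def attach(self, service: str) -> None:
        # Reject anything outside the (hypothetical) provider catalog.
        if service not in DELIVERY_SERVICES:
            raise ValueError(f"provider does not offer {service!r}")
        self.services.add(service)

d = CloudDeployment(app_image="linux-web-ami")
d.attach("load_balancer")
d.attach("compression")
print(sorted(d.services))  # -> ['compression', 'load_balancer']
```

The interesting part is the catalog itself: the architectural control the article describes is precisely the ability to populate and draw from a list like `DELIVERY_SERVICES`, which current IaaS offerings don’t expose.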

Suffice it to say this is where the shallow definition of infrastructure in the cloud meets lack of control and forces C-level executives to make a decision on which way to go: with the cloud or with control.


When IT is tasked with delivering an application it must necessarily consider availability, security, and performance of that application as it pertains to traversing the data center network. When applications perform poorly IT is tasked with addressing the issue in order to adhere to organizational service-level agreements. This is the reason so many solutions and options exist in the network and application network infrastructure: to provide the means by which IT can architect a well-performing, reliable, and secure environment in which applications can be delivered.

When IT loses the ability to choose and deploy those integral pieces of the network and application network infrastructure, it loses control over the delivery of that application. IT no longer has the means to deploy solutions designed to augment, enhance, or fix issues commonly occurring with the deployment of applications. It loses control over its own architectural destiny, trusting implicitly in cloud providers to offer the choices necessary for the successful deployment and delivery of applications.

The shallow definition of “infrastructure” used today is impeding adoption of the cloud because users lack choices and options. They lack control over that environment in an architectural sense. It isn’t the turning over of the reins for management and maintenance that’s problematic; it’s the architectural control that’s lost that gives rise to these concerns.


Cloud providers are doing the term “infrastructure” a disservice when they focus solely on the application infrastructure – databases, operating systems, web and application servers, server hardware – and ignore the fact that a real infrastructure includes the network and application network infrastructure as well. Without the ability to easily choose and deploy network- and application-network-focused services in the cloud, organizations are necessarily going to be leery of their ability to meet service-level agreements and improve performance and security in the cloud without the control they are typically afforded by a physical data center.

Increased capacity does not always result in improved performance of applications. It is but one way in which solutions attempt to address the problem of application performance. In the cloud, this is the only way to address the problem, unless you’re using Amazon and are keen on adding yet another line-item to your monthly cloud provider bill by signing up for CloudFront. That’s certainly an option, but unfortunately for organizations it’s currently the only option for many in the cloud.

Cloud providers need to recognize the need for alternative solutions addressing performance and security in the cloud and offer their potential enterprise customers not only choice but control over those solutions. If providers are going to offer Infrastructure as a Service then they need to support more than just application infrastructure as a service. They need to broaden their support and inclusion of the same types of options available to the CTO and CIO in their own data center, which means a much wider array of solutions than is available today.


Dave Rosenberg of C|NET’s Negative Approach hit upon the reason for the reluctance we’re likely to continue seeing in the cloud around choice and control:

Disaster recovery, compliance, and enterprisey [sic] features are where the growth is in the near term. You have to have an SLA and support for true enterprise-class applications. Amazon will probably do this over time, but right now, you have no real option. Rowell pointed out that for Amazon to offer these services would add significant overhead and likely cause the price point to rise significantly.

Cost. Right now cloud is selling, albeit slowly in the enterprise market, based on perceived initial and long-term cost reductions. Offering choice, and therefore the control necessary to implement the infrastructure required to ensure compliance with SLAs and support “true enterprise-class applications”, would require additional investments by cloud providers that would most certainly be passed on to the customer by way of price increases. Increasing the cost of cloud computing would certainly decrease its already lackluster appeal to many enterprises, with providers thus effectively shooting themselves in the foot.

It’s a Catch-22 situation; giving control to customers would make it more appealing technically but less appealing from a financial perspective. This means cloud providers are going to need to evaluate options that keep the costs down while affording enterprise markets the control and choice they need to adopt cloud computing as a viable alternative to in-house data centers.


Vendors already understand the evolutionary need for a more dynamic infrastructure that addresses the issues cropping up with cloud, virtualization, and emerging data center models. Hence the drive toward Infrastructure 2.0. Of its core capabilities – connectivity intelligence (collaboration), elasticity, and intelligence – it is elasticity that matters most here.

That elasticity needs to expand beyond the traditional view of “capacity on demand” and include “management on demand”. Support for multi-tenant environments like cloud computing and virtualized architectures is a must. Coupled with a flexible architecture and collaborative intelligence, multi-tenancy allows providers the ability to deploy new functionality “on-demand” on existing infrastructure platforms and extend the ability to control that new functionality to customers in an isolated management structure.
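A minimal sketch of that “management on demand” idea, under my own assumptions (the `SharedPlatform` class and its feature names are illustrative, not any vendor’s API): one platform instance deployed by the provider, with each tenant given isolated control over which of its functions are enabled for them.

```python
# Hypothetical multi-tenant platform: the provider deploys functionality
# once; each tenant controls its own slice in isolation.
class SharedPlatform:
    def __init__(self, available):
        self.available = set(available)   # functionality deployed by the provider
        self._tenants = {}                # tenant name -> features that tenant enabled

    def tenant(self, name):
        # Each tenant gets its own isolated management scope.
        return self._tenants.setdefault(name, set())

    def enable(self, tenant_name, feature):
        if feature not in self.available:
            raise KeyError(f"{feature!r} is not deployed on this platform")
        self.tenant(tenant_name).add(feature)

platform = SharedPlatform({"rate_shaping", "compression", "waf"})
platform.enable("acme", "compression")
platform.enable("globex", "rate_shaping")
# acme's configuration change does not affect globex:
print(platform.tenant("acme"), platform.tenant("globex"))
```

The design point is the isolation: the provider’s investment (the `available` set) is made once and shared, while the per-tenant scopes give customers the architectural control the preceding paragraphs argue they currently lack.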

By enabling cloud providers to extend the functionality of existing investments to include many of the solution choices enterprises would normally make when deploying applications and their supporting infrastructure, “lack of control” and “limited choice” can be removed from the list of reasons an organization rejects cloud computing. At the same time, the cost borne by the provider is significantly reduced, making it feasible to offer these options to the customers who require them.

Providers will, of course, charge for such options. But if the investment is not so heavy a burden on the provider, then the cost passed on to the organization can be lessened to the point where it is comparable to the investment the enterprise would make internally.



More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
