By Pat Romanski
May 31, 2014 01:00 PM EDT
"While cost has traditionally been a foundational benefit of the cloud, we believe that other factors can play a more important part in defining the cloud of tomorrow," said Todd Gleason, Vice President of Technology at FireHost, in this exclusive Q&A with Cloud Expo conference chairs Larry Carvalho and Vanessa Alvarez. "We believe that generalist cloud providers - those low-cost commodity clouds that created price competition and insecure offerings - are going to be either gobbled up or beaten up as the industry evolves from this generalist mentality toward specialist clouds with a particular focus."
Cloud Computing Journal: How are cloud standards playing a role in expanding adoption among users? Are standards helping new business models for service providers?
Todd Gleason: Standardization is important, but we're concerned because security desperately needs to be part of this effort. It is not addressed enough, if at all. Much of the open-standards work in cloud and other IT technologies has traditionally not been security-conscious, making adoption riskier than many want to admit or truly understand. So, while standards are a good thing, we believe security should be baked in at their inception. We advocate for innovation, but it's important that the innovative spirit not be blind to the need to protect vendors and customers.
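The "baked in at inception" idea is often phrased as secure by default: a configuration that omits a security setting should fail validation rather than silently fall back to an insecure value. A minimal sketch of that principle follows; the field names and defaults are hypothetical, not drawn from any real standard or FireHost product.

```python
# Hypothetical "secure by default" validator. A setting that is missing
# from the config is treated as if it held the insecure default, so it
# gets flagged too -- silence is not safety.

INSECURE_DEFAULTS = {
    "tls_enabled": False,       # transport encryption off
    "encrypt_at_rest": False,   # storage encryption off
    "public_access": True,      # world-readable by default
}

def validate(config: dict) -> list:
    """Return the settings that match (or default to) an insecure value."""
    violations = []
    for key, bad_value in INSECURE_DEFAULTS.items():
        if config.get(key, bad_value) == bad_value:
            violations.append(key)
    return violations

# Only TLS was explicitly enabled; the other two are flagged.
print(validate({"tls_enabled": True}))  # ['encrypt_at_rest', 'public_access']
```

The design point is that the burden of proof sits with the configuration, not the validator: anything unstated is assumed insecure, which is the opposite of how many permissive defaults behave today.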
Cloud Computing Journal: How are hybrid clouds evolving to allow the coexistence of private and public clouds? What are the challenges to meeting a true hybrid cloud scenario?
Gleason: Too much attention and marketing real estate have been devoted to debating the merits of public, private and hybrid clouds. Definitions of what's public, what's private and what's hybrid are wide-ranging and are often crafted to suit a vendor's point of view rather than to give the industry an objective, educational delineation. From where we sit, use cases and security are the keys to defusing this debate. The conversation needs to shift from which model is best in general to which one is appropriate for an individual vendor, provider, or customer, and it must incorporate security, because regardless of the model chosen, a cloud is insecure unless it's built secure. Use cases vary by industry and by company, but security is a common denominator applicable to all of them. This is where the industry discussion needs to shift.
Cloud Computing Journal: Are on-premises software vendors successfully migrating their business model to a SaaS model? What are the challenges faced in this journey?
Gleason: Most are, and the bigger questions are how effective they will be at delivering from the cloud and whether they can do so securely. Will they expand their business model and provide infrastructure, or will they stay in their niche as software-only players? And for those who do not migrate, will they become more inefficient and outdated?
The cloud as a delivery vehicle is the next wave of the Internet, especially as automation, software techniques, and infrastructures increasingly adapt to applications and to servicing their performance. The efficiencies of today are miles ahead of an insulated, on-premises software solution - a situation that gets much sweeter when security is architected into the infrastructure. However, we are not even close to being a truly application-centric world.
Truthfully, though, we need to focus on what the customer wants. Every model will be viable for some customers, so some software vendors may become irrelevant if the cloud provides a better way to serve their customers' needs. The cloud industry can be too hung up on terms such as hybrid, public, SaaS or PaaS. Instead, we have to focus on customers' needs and on the applications that run their businesses, give them operational competitive advantages, and provide a truly seamless experience.
Cloud Computing Journal: What are the challenges for end users to adopt a new model for application development using Platform as a Service? Are vendors doing enough to meet their needs?
Gleason: Platforms are important for reducing time-to-market for SaaS or internal use cases, and it's good to see the industry shifting from infrastructures being the center of the universe to how they can be architected to service applications in a more on-demand, dynamic, intelligent manner. But, again, where is security in the discussion? It's nowhere to be found in the majority of industry conversations and vendor communications. It badly needs to be.
In addition to security, there are two other big to-dos that need to be addressed: orchestration and inter-cloud technology. To make applications work well across infrastructures, those infrastructures must work well together in an inter-cloud fashion. This brings us back to security. Businesses will have many clouds, as will their customers. The weakest link is the most insecure cloud, so change must happen to bring security into the larger, strategic conversations rather than focusing so much on architecting an infrastructure or specific software techniques.
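The "weakest link" observation can be stated precisely: when an application spans several clouds, the deployment as a whole inherits the security posture of its least secure member. The sketch below illustrates that point only; the numeric "assurance levels" and cloud names are invented for illustration, not an industry metric.

```python
# Hypothetical model of the weakest-link principle for multi-cloud
# deployments: overall assurance is the minimum across all members.

from dataclasses import dataclass

@dataclass
class Cloud:
    name: str
    assurance_level: int  # invented scale: 1 = commodity, 5 = hardened/compliant

def effective_assurance(clouds):
    """A deployment is only as secure as its least secure cloud."""
    return min(c.assurance_level for c in clouds)

deployment = [
    Cloud("hardened-private", 5),
    Cloud("partner-saas", 3),
    Cloud("commodity-public", 1),
]
print(effective_assurance(deployment))  # 1 -- the commodity cloud sets the bar
```

The takeaway matches the argument above: adding a hardened cloud to a deployment does not raise its effective security; removing or fixing the weakest one does.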
Cloud Computing Journal: With several vendors lowering costs for infrastructure, is there a way for new cloud service providers entering this space to make money?
Gleason: While cost has traditionally been a foundational benefit of the cloud, we believe that other factors can play a more important part in defining the cloud of tomorrow. We believe that generalist cloud providers - those low-cost commodity clouds that created price competition and insecure offerings - are going to be either gobbled up or beaten up as the industry evolves from this generalist mentality toward specialist clouds with a particular focus. For FireHost, that focus is security. We evolved out of the need for security and have created an infrastructure that provides an advanced security architecture, top security personnel and compliance expertise for customers that are security- and compliance-driven.
• • •
Todd Gleason, Vice President of Technology at FireHost, is a central figure in continuously innovating and architecting the secure cloud infrastructure for FireHost and its customers. He brings 15 years of global IT and R&D experience, including deep knowledge of security, cloud, networking, compute, virtualization, storage, and application delivery technologies.
Gleason is driven by the notion that customers ultimately care about servicing their applications to perform efficiently in a secure environment. With this in mind, Gleason has displayed a knack for understanding the latest technology trends in the industry, synthesizing what is applicable for FireHost's customers, and incorporating relevant technology into the company's cloud infrastructure to ensure businesses' applications and data are protected in a cutting-edge, secure cloud. Gleason has helped redefine industry expectations about performance, security, and compliance in the cloud. His creative approach to architecting FireHost's secure cloud proves that security can be incorporated without compromising infrastructure performance, which improves customers' risk management while maintaining thrifty usage of infrastructure resources.
Prior to his role with FireHost, Gleason worked as director of information technology at Panini America, formerly Donruss Trading Cards, which is now a subsidiary of the global Panini conglomerate. There he implemented a high-performance infrastructure and unique business applications enabling rapid high-quality production workflows and integration into the Panini global business. Gleason holds a degree in computer information systems from Remington College.