Bare Metal Blog: FPGAs: Reaping the Benefits

All the goodness FPGAs bring to hardware in general, and ADC hardware in particular.

In two previous installments, I talked at a high level about the uses of FPGAs, risk mitigation, and the potential benefits. Today I’d like to delve into the benefits that the industry in general, and F5 in particular, gain from using FPGAs, and why it matters to IT. If you’re a regular reader, you know that I try not to be a chorus line for F5 solutions, but don’t shy away from talking about them when it fits the topic. That will continue with this post. While I will use F5 for the specifics, the benefits can be generalized to the bulk of the industry.

Used to be, way back in the day, everyone walked everywhere. That worked for a long period of world history. The horse was adopted for longer trips, and it about doubled travel speed, but still, the bulk of the world populace walked nearly all of the time. Then along came cars, and they enabled a whole lot of things. One of the great benefits that the automobile introduced was the ability to be more agile. By utilizing the machinery, you could move from one town to another relatively quickly. You could even work in a town 30 miles – a day's walk for a physically fit person – from your home. At this point in human – or at least first-world – history, walking is a mode of transportation that is rarely used for important events. There are some cities so tightly packed that walking makes sense, but for most of us, we take a car the vast majority of the time. When speed is not of the essence – say, when you take a walk with a loved one – the car is left behind, but for day-to-day transport, the car is the go-to tool.

There is a corollary to this phenomenon in the Application Delivery world. While in some scenarios a software ADC will do the trick, there are benefits to hardware that mean if you have it, you'll use the hardware much more frequently. This is true of far more than ADCs, but bear with me, I do work for an ADC vendor ;). There are some things that can just be done more efficiently in hardware, and some things that are best left (normally due to complexity) to software. In the case of FPGAs, low-level operations that do a lot of repetitive actions are relatively easy to implement – to the point that FPGAs and their programming tools now come with certain pre-built layouts. As such, network processing that is latency-sensitive and can be done with little high-level logic is well suited to FPGA processing. When a packet can be processed in X microseconds in the FPGA, but takes orders of magnitude longer by the time it passes through the hardware, a DMA transfer, and the firmware/network stack before finally landing in software that can manipulate it, definitely go with the FPGA option if possible.
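
To make the software half of that comparison concrete, here is a minimal sketch of what "finally lands in software" means: a Linux user-space process receiving raw Ethernet frames. This is purely illustrative – the post includes no code – but it shows how many layers a frame crosses before software can touch it.

```c
/* Minimal sketch (Linux, illustrative only): the software packet path.
 * By the time recvfrom() returns, every frame has already crossed the
 * NIC, a DMA transfer, and the kernel network stack - the layers of
 * latency an FPGA fast path never pays. Requires root (CAP_NET_RAW). */
#include <arpa/inet.h>      /* htons */
#include <linux/if_ether.h> /* ETH_P_ALL */
#include <stdio.h>
#include <sys/socket.h>

int main(void)
{
    /* AF_PACKET delivers whole link-layer frames to user space. */
    int sock = socket(AF_PACKET, SOCK_RAW, htons(ETH_P_ALL));
    if (sock < 0) {
        perror("socket");
        return 1;
    }

    unsigned char frame[2048];
    for (;;) {
        ssize_t len = recvfrom(sock, frame, sizeof(frame), 0, NULL, NULL);
        if (len < 0) {
            perror("recvfrom");
            return 1;
        }
        /* Only here - well after the bits left the wire - can software
         * inspect or modify the packet. */
        printf("got %zd-byte frame\n", len);
    }
}
```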

And that’s where a lot of the benefits of FPGAs in the enterprise are being seen. Of course you don’t want to run your own FPGA shop and maintain your own installation program to reap the benefits. But vendors have sets of hardware that are largely the same and are produced en masse. It makes sense that they would make use of FPGAs, and they do. Get that packet off the wire, and if it meets certain criteria, turn it around and get it back on the wire with minor modifications.
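
For a sense of what that "turn it around on the wire" fast path does, here is a rough sketch in C of a match-and-rewrite rule of the kind an FPGA implements directly in gate logic. The matching criterion (EtherType is IPv4) and the rewrite (swapping source and destination MACs) are my own illustrative assumptions, not any vendor's actual logic.

```c
/* Illustrative only: the match-and-rewrite logic an FPGA fast path
 * performs in hardware, sketched in C. Real implementations run as
 * gate logic at wire speed; the criteria and rewrite are assumptions. */
#include <stdbool.h>
#include <stdint.h>
#include <string.h>

#define ETH_HDR_LEN 14

/* Does this frame meet the fast-path criteria? (Here: a hypothetical
 * check that the EtherType is IPv4, i.e. 0x0800.) */
static bool fast_path_match(const uint8_t *frame, size_t len)
{
    return len >= ETH_HDR_LEN && frame[12] == 0x08 && frame[13] == 0x00;
}

/* Turn the frame around: swap source and destination MAC addresses
 * so it can go straight back out on the wire. */
static void fast_path_rewrite(uint8_t *frame)
{
    uint8_t tmp[6];
    memcpy(tmp, frame, 6);       /* save destination MAC  */
    memcpy(frame, frame + 6, 6); /* source -> destination */
    memcpy(frame + 6, tmp, 6);   /* old dest -> source    */
}
```

In hardware, both steps happen as the bits stream through the device, which is why the packet never pays the DMA-and-software round trip sketched earlier.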

But that’s not all. While it was a great step to be able to utilize FPGAs in this manner and not have to pay the huge up-front fees of getting an ASIC designed and a run of them completed, the use of FPGAs didn’t stop there – indeed, it is still growing and changing. The big area that has really grown the usage of ever-larger FPGAs is software assistance. Much like BIOS provides discrete functionality that software can call to achieve a result, FPGAs can define functions with a register interface that are called directly from software – not as a solution, but as an incremental piece of the solution. This enables an increase in the utilization of FPGAs and, if the functions are chosen carefully, an improvement in the overall performance of the system the FPGAs are there to support. It is, essentially, offloading workload from software. When that offload is of computationally intensive operations, the result can be a huge performance improvement. Where a software solution might have a function call, hardware can just do register writes and reads, leaving the system resources less taxed. Of course, if the operation requires a lot of memory for data storage, it still will, which is why I said “computationally intensive”.
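
Here is a minimal sketch of what "just do register writes and reads" looks like from the software side, assuming a hypothetical memory-mapped FPGA function block. The device path, register layout, and offloaded operation (a checksum) are all invented for illustration.

```c
/* Illustrative sketch: calling an FPGA-resident function through
 * memory-mapped registers instead of a software function call.
 * The device path, register layout, and offloaded operation
 * (a checksum) are hypothetical. */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

#define REG_SRC_ADDR 0x00 /* physical address of input buffer */
#define REG_LENGTH   0x08 /* number of bytes to process       */
#define REG_CONTROL  0x10 /* write 1 to start                 */
#define REG_STATUS   0x18 /* reads 1 when the result is ready */
#define REG_RESULT   0x20 /* the computed checksum            */

int main(void)
{
    /* Map the FPGA's register block into our address space. */
    int fd = open("/dev/fpga0", O_RDWR); /* hypothetical device */
    if (fd < 0) { perror("open"); return 1; }

    volatile uint64_t *regs = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                                   MAP_SHARED, fd, 0);
    if (regs == MAP_FAILED) { perror("mmap"); return 1; }

    /* Instead of computing the checksum in software, hand the work
     * to the FPGA with a few register writes... */
    regs[REG_SRC_ADDR / 8] = 0x10000000; /* assumed DMA-able buffer */
    regs[REG_LENGTH / 8]   = 65536;
    regs[REG_CONTROL / 8]  = 1;

    /* ...then poll for completion and read the result back. */
    while (regs[REG_STATUS / 8] == 0)
        ; /* spin; a real driver would sleep or take an interrupt */

    printf("checksum: 0x%llx\n",
           (unsigned long long)regs[REG_RESULT / 8]);

    munmap((void *)regs, 4096);
    close(fd);
    return 0;
}
```

The shape of the interaction is the point: on the hot path there is no function-call machinery, just loads and stores, which is how the offload leaves system resources less taxed.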

The key thing is to ask your vendor (assuming they use FPGAs) what they’re doing with them, and what benefit you see from it. It is a truth that the vast majority of vendors go to FPGAs for their own benefit, but that is not mutually exclusive with making things better for customers. So ask them how you, as a customer, benefit.

And when you wonder why a VM can’t perform every bit as well as custom hardware, well, the answer is at least partially above. The hardware functionality of custom devices must be implemented in software for a VM, and that software then runs on not one but two operating systems, and eventually calls general-purpose hardware. While VMs, like feet, are definitely good for some uses, when you need your app to be the fastest it can possibly be, hardware – specifically FPGA-enhanced hardware – is the best answer, much as the car is the best answer for daily travel in most of the world. Each extra layer – generic hardware, the host operating system, the virtual network, and the guest operating system – adds cost to processing. The lack of an FPGA does too, because those low-level operations must be performed in software.

So know your needs, and use the right tool for the job. I would not drive a car to my neighbor’s house – 200 feet away – nor would I walk from Green Bay to Cincinnati (just over 500 miles). Know what your needs are and what your traffic is like, then ask about FPGA usage. And generalize this… to network switches, WAPs, you name it. You’re putting it into your network, so that IS your business.

(Image: Walking in Ust-Donetsk)

And yeah, you’ll hear more on this topic before I wrap up the Bare Metal Blog series, but for now, keep doing what you do so well, and I’ll be back with more on testing soon.


More Stories By Don MacVittie

Don MacVittie is founder of Ingrained Technology, a technical advocacy and software development consultancy. He has experience in application development, architecture, infrastructure, technical writing, DevOps, and IT management. MacVittie holds a B.S. in Computer Science from Northern Michigan University, and an M.S. in Computer Science from Nova Southeastern University.
