Looking at “Real World” Windows Azure Scenarios

Migrating a Classic 3-Tier Application to Windows Azure with Don Noonan from Skylera


I wrote this article about Don Noonan, a Cloud Architect at Skylera, and his overview of Microsoft's "Infrastructure as a Service" platform. Don and I met at TechEd 2012 in Orlando, where I interviewed him on the newest technologies around Windows Azure. Don has experience working at Microsoft and Boeing, and has been working with storage technologies, virtual machines, workloads, and desktop client deployment using cloud services instead of the usual on-premise infrastructure services.

We start by discussing the working components of a cloud deployment in a real customer scenario. His current customer had a future mobile application on .NET but wanted to sell more of their current classic products. The customer had many servers to manage, with their IT staff on call to maintain the on-premise infrastructure. Given the new technology, Don's customer decided to look at Windows Azure to scale their applications and workloads on Microsoft's infrastructure cloud services.

So they started with a set of functional groups within IaaS. They separated their virtual machines by role, such as Active Directory and other core services. This was a basic implementation of Windows Azure availability sets: at the datacenter level, the platform guarantees that at least one member of a group of virtual machines remains available while updates are being made to the Windows Azure platform.

You should use a combination of availability sets and load-balancing endpoints to make sure that your application is always available and running efficiently. For more information about using load-balanced endpoints, see Load Balancing Virtual Machines.
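As a sketch, adding a load-balanced endpoint with the classic Windows Azure PowerShell module looks roughly like this (the service, VM, and load-balancer set names here are hypothetical):

```powershell
# Add an HTTP endpoint to an existing VM and place it in a
# load-balancer set shared with the other web front-ends.
Get-AzureVM -ServiceName "contoso-web" -Name "web01" |
    Add-AzureEndpoint -Name "http" `
        -Protocol tcp -LocalPort 80 -PublicPort 80 `
        -LBSetName "web-lb" `
        -ProbeProtocol http -ProbePort 80 -ProbePath "/" |
    Update-AzureVM
```

Repeating the same command against each web VM with the same `-LBSetName` is what spreads incoming traffic across the set.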

This task includes the following steps, outlined on the Windows Azure website:

· Step 1: Create a virtual machine and an availability set

· Step 2: Add a virtual machine to the cloud service and assign it to the availability set during the creation process

· Step 3: (Optional) Create an availability set for previously created virtual machines

· Step 4: (Optional) Add a previously created virtual machine to an availability set
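Step 4 can be sketched with the classic Azure cmdlets (the service, VM, and availability set names below are hypothetical):

```powershell
# Assign an existing VM to an availability set; the platform
# restarts the VM when its updated configuration is applied.
Get-AzureVM -ServiceName "contoso-svc" -Name "sql02" |
    Set-AzureAvailabilitySet -AvailabilitySetName "sql-as" |
    Update-AzureVM
```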

Don wanted to make sure that the cloud services and hypervisor have the appropriate virtual machines and that the compute resources remain available. In this project, they placed availability sets around their SQL virtual machines so that the platform understands one of the SQL instances must always be highly available. Even with availability sets, you still have to implement failover at the database level, either using a witness or the new AlwaysOn capability in SQL Server 2012. They also have a custom management service specific to their mobile solution, so their customers can look at logs and activities, as well as a custom C++ sync service application used to sync data between the mobile phone application and the backend database.

Don explains that, from a Windows Azure Mobile Services context, he likes to group the virtual machines, define what roles they will play, and lay out the networking specifics such as load balancers and endpoints. In the IT Time Radio interview, Don shows the Windows Azure portal interface with virtual machines inside an availability set: two Domain Controllers paired up and running. He configures the DC availability set running Active Directory; AD Domain Services itself has built-in replication, giving it high-availability capabilities.

The demo in the video also covers setting up affinity groups, and we explain how they are used in the Windows Azure datacenter: an affinity group keeps your resources close together, acting like a high-level container so that compute and storage can be provisioned near each other. For instance, since we are here on the East Coast, we would pick East US and build out affinity groups close to where we are physically located. Datacenters are large, so you would first set up an affinity group, and then within it you can build out your storage and virtual networks.
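Creating an affinity group and anchoring storage to it can be sketched like this (the group and storage account names are hypothetical):

```powershell
# Create an affinity group in the East US region, then tie a
# storage account to it so compute and storage are provisioned
# close together in the datacenter.
New-AzureAffinityGroup -Name "contoso-east" -Location "East US" `
    -Label "Contoso East Coast resources"
New-AzureStorageAccount -StorageAccountName "contosoeaststore" `
    -AffinityGroup "contoso-east"
```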
For security reasons, within virtual networking you may want to subnet out the virtual networks so that services are segregated and only certain ports can talk to each other, which is common within public cloud services. For example, you could set Windows Firewall rules that say external servers may only talk to me on port 443, or only allow SQL traffic from the middle tier to the database tier.
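The 443 example might be sketched as follows; note that `New-NetFirewallRule` assumes Windows Server 2012 or later, while on the Windows Server 2008 R2 images discussed here you would use `netsh advfirewall` instead:

```powershell
# Allow inbound HTTPS only; other inbound traffic stays blocked
# by the default inbound firewall policy.
New-NetFirewallRule -DisplayName "Allow inbound HTTPS" `
    -Direction Inbound -Protocol TCP -LocalPort 443 -Action Allow
```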

The nice part about IaaS is that each customer can have their own management network with their own virtual machine instances, so you can segregate customers and services. I had a chance to walk through the overall picture of segregating the workloads: first Directory Services, then Database Services, Management Services, and Sync Services, wrapping the whole thing in an affinity group and, around that, the virtual networking. We took a look at building this out in the video, and Don shows how to use PowerShell scripts and the Windows Azure IaaS cmdlets that make the actual application work. What he likes to do is break the scripts out into chunks, such as core infrastructure and back-end management servers like the Active Directory Domain Controllers, and the middleware tier in the front end, in this case SharePoint Server, similar to how he segmented the network. Don shows the scripts he uses to provision objects in Windows Azure with PowerShell, including how to script out an affinity group so that, for performance reasons, the resources are not a football field away from each other. XML is used for many of the functions within the portal, which you can create from scratch, or you can find pre-canned management scripts on http://www.windowsazure.com; Don has been working with the Windows Azure team to publish more scripts after they have had time to test these "real world" proofs of concept.
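Before any of those provisioning chunks run, a script typically points the session at the right subscription and default storage account; a minimal sketch (the file path, subscription, and account names are hypothetical, and older releases of the module named the last parameter `-CurrentStorageAccount`):

```powershell
# Load subscription credentials and set the default storage
# account that newly created VM disks will land in.
Import-AzurePublishSettingsFile "C:\azure\contoso.publishsettings"
Set-AzureSubscription -SubscriptionName "Contoso Production" `
    -CurrentStorageAccountName "contosoeaststore"
```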

With the foundation in place, including the networking, affinity groups, and storage, Don then shows how to create a virtual machine. He creates the management service layer, which contains two Domain Controllers, and with the same header information he tells the default storage account to put new objects into the same storage account, for instance five virtual machines within that account. Don explains which cmdlets perform which functions, such as setting up instance variables for his two domain controllers so they land in the same availability set.

As the DCs are being configured, he explains the beauty of Windows Azure: it has an existing gallery of pre-built virtual machine images, so he builds from the Windows Server 2008 R2 SP1 image, tells it which subnet to use, and then shows the New-AzureVMConfig cmdlet to create the first and second virtual machines, adding them to the same availability set name. If we did not include them, they would be independent and might therefore be serviced at the same time, which would not give you high availability.

The last thing he configures is the cloud service for the management network. This is where you would open ports and configure the connection to the virtual machines to service them via RDP. He finishes the overview of the real-world Windows Azure application by covering computing power, administrative privileges, and adding a set of disks to the database tier, such as a 100 GB LUN for data and a 50 GB LUN for log files. You can add many disks: up to 16 data disks at 1 TB apiece, which gives you room for expansion. There are over 2,400 PowerShell cmdlets in Windows Server 2012, and you can get the Windows Azure PowerShell cmdlets from the management area on http://www.windowsazure.com.
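The domain-controller provisioning Don demonstrates can be sketched roughly like this with the classic service-management cmdlets (the image, service, subnet, network, and credential values below are all hypothetical):

```powershell
# Build two DC configurations in the same availability set so the
# platform never services both at once, then create them together
# in one cloud service on the AD subnet.
$dc1 = New-AzureVMConfig -Name "dc01" -InstanceSize Small `
        -ImageName $win2008r2Image -AvailabilitySetName "dc-as" |
    Add-AzureProvisioningConfig -Windows `
        -AdminUsername "cloudadmin" -Password $adminPassword |
    Set-AzureSubnet -SubnetNames "ad-subnet"

$dc2 = New-AzureVMConfig -Name "dc02" -InstanceSize Small `
        -ImageName $win2008r2Image -AvailabilitySetName "dc-as" |
    Add-AzureProvisioningConfig -Windows `
        -AdminUsername "cloudadmin" -Password $adminPassword |
    Set-AzureSubnet -SubnetNames "ad-subnet"

New-AzureVM -ServiceName "contoso-mgmt" -VMs $dc1, $dc2 `
    -AffinityGroup "contoso-east" -VNetName "contoso-vnet"

# Database tier: attach a 100 GB data LUN and a 50 GB log LUN
# to an existing SQL VM, as in the demo.
Get-AzureVM -ServiceName "contoso-data" -Name "sql01" |
    Add-AzureDataDisk -CreateNew -DiskSizeInGB 100 `
        -DiskLabel "data" -LUN 0 |
    Add-AzureDataDisk -CreateNew -DiskSizeInGB 50 `
        -DiskLabel "logs" -LUN 1 |
    Update-AzureVM
```

Because both VM configurations name the same `-AvailabilitySetName`, the platform places them in different update and fault domains, which is exactly the high-availability behavior described above.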
The last piece is the web tier, on a newly created public-facing subnet with two web front-ends; he explains that setup at the end of the TechNet IT Time Radio episode, (Part 1 of 5) Real World Azure – Migrating a Classic 3-Tier Application to Windows Azure.

Catch the other episodes of "IT Time Radio" below:

TechNet Radio: IT Time – (Part 2 of 5) Real World Azure - Deploying a Custom SharePoint Application to Windows Azure

TechNet Radio: IT Time – (Part 3 of 5) Real World Azure – Moving an All-In-One Server from Co-location to Windows Azure

TechNet Radio: IT Time – (Part 4 of 5) Real World Azure – Implementing RemoteApp for Client / Server Applications on Windows Azure

TechNet Radio: IT Time – (Part 5 of 5) Real World Azure – Migrating a Classic 3-Tier Application to Windows Azure

Try Windows Azure at http://aka.ms/try-azure (the free account requires a credit card, but it is not charged)

Get your Microsoft Trial Products at http://aka.ms/msproducts

In case you missed any of the series here is a list to all of the articles: http://aka.ms/31azure

More Stories By Blain Barton

Raised in Spokane, Washington, Blain Barton has been with Microsoft for 20 years and has held many diverse positions. His career started in 1988 as a Team Leader in Manufacturing and Distribution, progressed through PSS Team Manager for Visual Basic Product Support, Product Consultant for the Microsoft Word division, and OEM Systems Engineer, and he currently serves as a Senior IT Pro Evangelist.

Blain has organized and delivered a wide array of technical events, presenting at over 1,000 live events and receiving more than six "top presenter" speaking awards. He has traveled around the world three times delivering OEM training sessions on pre-installing Microsoft Windows on new PCs.

He attended Washington State University, graduating with a Bachelor's degree in English/Business and a minor in Computer Science. After college, Blain taught snow skiing professionally in the Cascade Mountains before starting his career with Microsoft. He currently resides in Tampa, Florida.
