Understanding Remote Desktop Services (RDS)

The backbone of Microsoft's VDI solution

In Windows Server 2008 R2 (WS2008R2), Terminal Services (TS) has been expanded and renamed Remote Desktop Services (RDS). RDS is the backbone of Microsoft's VDI solutions. In Windows Server 2012, RDS is further enhanced with a scenario-based configuration wizard, while the concepts and architecture remain very much the same as in WS2008R2. The new and enhanced architecture takes advantage of virtualization and makes remote access a much more flexible solution with new deployment scenarios. To realize the capabilities of RDS, it is essential to understand the functions of the key architectural components and how they complement one another to process an RDS request. There are many new terms and acronyms to get familiar with in the context of RDS. For the remainder of this post, note that RDS implies the server platform of WS2008R2 and later, while TS implies WS2008.

There are five main architectural components in RDS, and all require an RDS licensing server. Each component includes a set of features designed to achieve particular functions. Together, the five form a framework for accessing Terminal Services applications, remote desktops, and virtual desktops. Essentially, WS2008R2 offers a set of building blocks with the essential functions for constructing an enterprise remote access infrastructure.

To start, a user accesses an RDS webpage by specifying a URL where RDS resources are published. This interface, provided by Remote Desktop Web Access (RDWA) and configured on a local IIS with SSL, is the web access point to RemoteApp and VDI. The URL is consistent regardless of how resources are organized, composed, and published from multiple RDS session hosts behind the scenes. By default, RDS publishes resources at https://the-FQDN-of-a-RDWA-server/rdweb, and this URL is the only information a system administrator needs to provide to a user for accessing authorized resources via RDS. A user needs to be authenticated with his or her AD credentials when accessing the URL, and the RemoteApp programs presented at this URL are trimmed based on access control lists. Namely, an authenticated user will see, and be able to access, only authorized RemoteApp programs.
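
As a minimal illustration of this single publishing point, the sketch below simply builds the default RD Web Access URL from an RDWA server's FQDN. It is Python for illustration only; the function and the example host name are made up and are not part of RDS itself.

    # Conceptual sketch only: RDS exposes no such function; the FQDN below is a placeholder.
    def rdweb_url(rdwa_fqdn: str) -> str:
        """Return the default RD Web Access publishing URL for a given RDWA server."""
        return f"https://{rdwa_fqdn}/rdweb"

    # The one URL an administrator hands out, regardless of how many session hosts sit behind it.
    print(rdweb_url("rdwa01.example.com"))  # https://rdwa01.example.com/rdweb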

Remote Desktop Gateway (RDG) is optional and functions very much the same as its counterpart in TS. An RDG is placed at the edge of a corporate network to filter incoming RDS requests by referencing criteria defined in a designated Network Policy Server (NPS). With a server certificate, RDG offers secure remote access to the RDS infrastructure. As far as a system administrator is concerned, RDG is the boundary of an RDS network. There are two policies in NPS relevant to an associated RDG, and a sketch of how they work together follows the list:

  • One is the Connection Authorization Policy, or CAP. I call it a user authorization list, showing who can access an associated RDG.
  • The other is the Resource Authorization Policy, or RAP. In essence, this is a resource list specifying which devices a CAP user can connect to via an associated RDG.
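
To make the division of labor concrete, here is a minimal sketch in Python (illustration only; the class, group names, and host names are made up and have nothing to do with the actual NPS implementation) of how the two checks combine: CAP answers "may this user use the gateway at all?", and RAP answers "may this user reach that device?".

    # Conceptual sketch, not NPS: CAP = who may use the gateway, RAP = which devices they may reach.
    class GatewayPolicy:
        def __init__(self, cap_groups: set[str], rap_devices: dict[str, set[str]]):
            self.cap_groups = cap_groups      # groups allowed through this RDG (CAP)
            self.rap_devices = rap_devices    # group -> reachable device names (RAP)

        def authorize(self, user_groups: set[str], target_device: str) -> bool:
            # CAP check: is the user in any group allowed to use this gateway?
            if not (user_groups & self.cap_groups):
                return False
            # RAP check: is the target device reachable by any of the user's groups?
            allowed = set().union(*(self.rap_devices.get(g, set()) for g in user_groups))
            return target_device in allowed

    policy = GatewayPolicy(
        cap_groups={"RDS-Users"},
        rap_devices={"RDS-Users": {"rdsh01.example.com", "rdsh02.example.com"}},
    )
    print(policy.authorize({"RDS-Users"}, "rdsh01.example.com"))    # True: passes CAP and RAP
    print(policy.authorize({"Contractors"}, "rdsh01.example.com"))  # False: fails the CAP check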

In RDS, applications are installed and published on a Remote Desktop Session Host (RDSH), similar to a TS Session Host, or simply a Terminal Server, in a TS solution. An RDSH loads applications, crunches numbers, and produces results. It is our trusted and beloved workhorse in an RDS solution. Digital signing can be easily enabled on an RDSH with a certificate. Multiple RDSHs can be deployed along with a load-balancing technology, which requires every RDSH in a load-balancing group to be identically configured with the same applications.
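
Because every member of a load-balancing group must publish the same applications, a farm is only as consistent as its least-configured host. The following Python sketch illustrates one way an administrator could verify that requirement before placing hosts behind a load balancer; the host names and data structure are hypothetical, and this is not how RDS itself validates a farm.

    # Conceptual sketch: verify all session hosts in a farm publish an identical application set.
    farm = {
        "rdsh01.example.com": {"Word", "Excel", "LOB-App"},
        "rdsh02.example.com": {"Word", "Excel", "LOB-App"},
        "rdsh03.example.com": {"Word", "Excel"},            # missing LOB-App: misconfigured
    }

    reference = set().union(*farm.values())                 # everything published anywhere in the farm
    for host, apps in farm.items():
        missing = reference - apps
        if missing:
            print(f"{host} is missing: {sorted(missing)}")  # rdsh03.example.com is missing: ['LOB-App']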

A noticeable enhancement in RDSH (as compared with a TS Session Host) is the ability to trim the presence of a published application based on the access control list (ACL) of the application. An authenticated user will see, and hence have access to, only those published applications for which the user is authorized in the ACL. By default, the Everyone group is included in a published application's ACL, so all connected users will have access to a published application.
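
The trimming itself is conceptually just a filter of the published list against each application's ACL. The Python sketch below shows that idea, including the default Everyone entry; the applications, groups, and data structures are illustrative only and not the RDSH implementation.

    # Conceptual sketch of ACL-based trimming: a user sees only the RemoteApps whose ACL matches.
    published_apps = {
        "Word":    {"Everyone"},             # default ACL: every connected user sees it
        "Payroll": {"HR", "Finance"},        # trimmed to members of HR or Finance
    }

    def visible_apps(user_groups: set[str]) -> list[str]:
        groups = user_groups | {"Everyone"}  # every authenticated user is implicitly in Everyone
        return [app for app, acl in published_apps.items() if acl & groups]

    print(visible_apps({"Sales"}))           # ['Word']
    print(visible_apps({"HR"}))              # ['Word', 'Payroll']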

Remote Desktop Virtualization Host (RDVH) is a new feature that serves requests for virtual desktops running in virtual machines, or VMs. An RDVH server is a Hyper-V-based host, for instance a Windows Server machine with the Hyper-V server role enabled. When serving a VM-based request, the associated RDVH will automatically start the intended VM if the VM is not already running, and a user will always be prompted for credentials when accessing a virtual desktop. However, an RDVH does not directly accept connection requests; it uses a designated RDSH as a "redirector" for serving VM-based requests. The pairing of an RDVH and its redirector is defined in Remote Desktop Connection Broker (RDCB) when adding the RDVH as a resource.
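
Putting that behavior in pseudo-Python: when a VM-based request arrives through the paired redirector, the host starts the target VM only if it is not already running, then lets the connection proceed. The classes and method names below are hypothetical and sketch the flow described above, not any Hyper-V or RDS API.

    # Conceptual sketch: an RDVH never takes a request directly; its paired redirector (an RDSH) does.
    class VirtualMachine:
        def __init__(self, name: str, running: bool = False):
            self.name, self.running = name, running

        def start(self) -> None:
            self.running = True
            print(f"Starting VM {self.name}")

    class RDVirtualizationHost:
        def __init__(self, vms: dict[str, VirtualMachine]):
            self.vms = vms

        def serve(self, vm_name: str) -> VirtualMachine:
            vm = self.vms[vm_name]
            if not vm.running:        # auto-start the intended VM only if it is not already running
                vm.start()
            return vm                 # the user is still prompted for credentials at the desktop

    class Redirector:                 # the designated RDSH paired with this RDVH in RDCB
        def __init__(self, rdvh: RDVirtualizationHost):
            self.rdvh = rdvh

        def redirect(self, vm_name: str) -> VirtualMachine:
            return self.rdvh.serve(vm_name)

    redirector = Redirector(RDVirtualizationHost({"win7-pool-01": VirtualMachine("win7-pool-01")}))
    redirector.redirect("win7-pool-01")   # prints "Starting VM win7-pool-01" on the first request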

Remote Desktop Connection Broker (RDCB), an expansion of the Terminal Services Session Broker in TS, provides a unified experience for setting up user access to traditional TS applications and virtual machine (VM)-based virtual desktops. Here, a virtual desktop can be running in either a designated VM or a VM dynamically picked, based on load balancing, from a defined VM pool. A system administrator uses the RDCB console, called Remote Desktop Connection Manager, to include RDSHs, TS Servers, and RDVHs so that the applications published by the RDSHs and TS Servers, and the VMs running on the RDVHs, can later be composed and presented to users at a consistent URL by RDWA. With this consistent URL, authenticated users can access authorized RemoteApp programs and virtual desktops.
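
The broker's core decision can be sketched as: if the user has a designated (personal) virtual desktop, return it; otherwise pick a VM from the pool based on load. The Python below is a deliberately simplified illustration with made-up names, not the actual RDCB algorithm.

    # Conceptual sketch of brokering: personal desktop if assigned, otherwise the least-loaded pool VM.
    personal_desktops = {"alice": "vm-alice"}                    # user -> designated VM
    pool_load = {"pool-vm1": 3, "pool-vm2": 1, "pool-vm3": 2}    # pool VM -> current usage count

    def broker_virtual_desktop(user: str) -> str:
        if user in personal_desktops:
            return personal_desktops[user]                       # designated VM wins
        vm = min(pool_load, key=pool_load.get)                   # simple load balancing
        pool_load[vm] += 1
        return vm

    print(broker_virtual_desktop("alice"))   # vm-alice (designated VM)
    print(broker_virtual_desktop("bob"))     # pool-vm2 (least loaded at the time of the request)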

A Remote Desktop (RD) client gets connection information from the RDWA server in an RDS solution. If an RD client is outside of the corporate network, the client connects through an RDG. If an RD client is internal, the client can connect directly to the intended RDSH or RDVH once RDCB provides the connection information. In both cases, RDCB plays a central role in making sure a client gets connected to the correct resource. With certificates, a system administrator can configure digital signing and single sign-on among RDS components to provide a great user experience with high security.
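
The routing decision for a client reduces to: external clients tunnel through the gateway, internal clients connect straight to whatever endpoint the broker resolves. The tiny Python sketch below captures only that decision; the function and host names are placeholders, not an RDS API.

    # Conceptual sketch of the connection path: through RDG when outside, direct when inside.
    def connection_path(client_is_internal: bool, target_host: str, rdg: str = "rdg.example.com"):
        # In both cases RDCB has already resolved which RDSH or RDVH (target_host) serves the request.
        if client_is_internal:
            return [target_host]        # direct connection to the session or virtualization host
        return [rdg, target_host]       # external clients are tunneled through the gateway first

    print(connection_path(False, "rdsh01.example.com"))  # ['rdg.example.com', 'rdsh01.example.com']
    print(connection_path(True,  "rdsh01.example.com"))  # ['rdsh01.example.com']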

Conceptually, RDCB is the chief intelligence and operations officer of an RDS solution; it knows which is where, whom to talk to, and what to do with an RDS request. Before a logical connection can be established between a client and a target RDSH or RDVH, RDCB acts as a go-between, passing and forwarding pertinent information to and from the associated parties when serving an RDS request. From a 50,000-foot view, a remote client uses RDWA/RDG to obtain access to a target RDSH or RDVH, while RDCB connects the client to a session on the target RDSH or to an intended VM configured on a target RDVH. An RDS architecture poster with a visual presentation of how it all flows together, along with a number of free e-books covering WS2008R2 Active Directory, RDS, and other components, is available at http://aka.ms/free.

The configuration in WS2008 is a bit challenging, with many details easily overlooked. Windows Server 2012 greatly improves the user experience by facilitating the configuration process with a scenario-based wizard. Stay tuned; I will discuss this further in an upcoming blog post series.

Recommended additional reading on RDS/VDI/App-V, cloud essentials, and private cloud

[This is a cross-posting from http://blogs.technet.com/yungchou.]

More Stories By Yung Chou

Yung Chou is a Technology Evangelist at Microsoft. Within the company, he has had opportunities to serve customers in the areas of support account management, technical support, technical sales, and evangelism. Prior to Microsoft, he had established his expertise in system programming, application development, consulting services, and IT management. His recent technical focus has been on virtualization and cloud computing, with strong interests in hybrid cloud and emerging enterprise computing architecture. He is a frequent speaker at Microsoft conferences, roadshows, and TechNet events.
