Portals and Web Services - When business issues are technical

We've all heard the terms: portals, gadgets, portlets, dashboarding. But what does it all mean? And what role do Web services play in this exciting new world of componentized content?

I would define "portalization" as the creation of an environment that gives the end user one-stop shopping for actionable content and collaboration. Portals don't create anything new; they merely give us comprehensive, personalized access to content that may reside in any number of repositories (see Figure 1). Moreover, portal environments benefit developers and implementers by providing a robust set of "out of the box" tools and features to ease application development and deployment. These tools include single sign-on (SSO) access to virtually any authentication data source, centralized authorization, logging and reporting, document versioning, and so on.

[Figure 1]

Content and Web services
Content can come from a vast range of sources, and Web services are just one of them. What sets Web services apart from other content-providing mechanisms, however, is that they are a natural, synergistic fit with portal environments. From the beginning, the goal of Web services has been to deliver content in a standard, XML-based format that lets consumers of that data render or use it as they choose. Since personalization, localization, and customization are all standard fare in a portal environment, the presentation layer can be left to the developer of the portlet consuming the data. Sounds like the perfect job for good ol' Web services, doesn't it? It's also no secret that the great benefits of Web services are their reusability and their client-agnostic, protocol-independent nature. Given all this, using Web services in a portal environment is a no-brainer.

When to 'Portalize'
Does portalizing the weather along with a link to your company's homepage give you a good ROI on your portal software? I would guess probably not. But many other applications may make sense. Let's say we are a Web-engineering firm and at any given time our CEO wants to get:

  • Information on sales leads that are in the pipeline
  • Updates on development efforts that are in progress and status of client deliverables
  • Meeting information for the day
  • Access to the latest proposals with impending deadlines
  • And maybe even an employee vacation schedule

Access to this content in a centralized, concise, easy-to-use format is a powerful thing. It eliminates the need to open and swap among multiple applications. It lets the CEO log in once, through SSO, and reach all of his authorized resources rather than logging in to multiple systems. It also eliminates the hassle of crawling through a document server's directory structure to find a specific proposal. Finally, it allows him to collaborate with other team members on documents, meetings, and more. With its ability to improve project management and accelerate proposal and development efforts, the ROI on this type of portal application can be substantial indeed.

The portal above can be described as a "dashboard" specific to our CEO's needs. It gives the current end user a personalized set of relevant information, drawn as subsets of data from larger repositories: in this case a sales system, a project system, a scheduling system, a file structure, and an HR system. Building a portlet on top of an entire repository and exposing its complete functionality may sound great, but in reality each end user wants only a slice of that data, customized for them. At first glance, for example, exposing the entire sales system sounds like it could be meaningful for many employees. However, the sales team is only interested in lead-flow information and not billing information, while the admin department only needs billing information so it can generate invoices. In an ideal portal application, each of these data slices remains separate as part of a specific user's personalized dashboard.
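To make the "data slice" idea concrete, here is a minimal sketch in Java. All of the class and method names are invented for illustration; they don't come from any particular portal product or from the systems described above. The point is simply that each portlet asks for a narrow, role-specific view rather than the whole repository.

```java
import java.util.List;

// Hypothetical "data slice" types; all names here are invented for illustration.

// Narrow view of the sales system for the sales team: pipeline data, no billing.
class LeadSlice {
    String company;
    String stage;           // e.g., "Qualified" or "Proposal Sent"
    double estimatedValue;
}

// Narrow view of the same system for the admin team: billing data only.
class BillingSlice {
    String company;
    String invoiceNumber;
    double amountDue;
}

// Each portlet requests only the slice its audience needs, instead of the
// portal exposing the entire sales system to every user.
interface SalesSystemSlices {
    List<LeadSlice>    leadPipelineFor(String salesRepId);
    List<BillingSlice> openInvoicesFor(String adminId);
}
```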

How Is It Done?
So how do we go about building a dashboard that is relevant and useful to each end user? We do it by combining built-in portal environment functionality with custom development. The portal environment gives the developer a solid framework for common tasks such as authentication, authorization, content formatting, and portlet behavior. Custom development provides exposure of, and access to, application content and functionality, and makes it "consumable" by existing and future applications. This custom development effort is very specific to the business need. Where the business need allows, that content exposure is best built with Web services. By using Web services to expose key business functionality you are "future proofing" your application: the service may be consumed by a portlet today but by a third-party application tomorrow. Once the necessary content and/or functionality is exposed, it must be consumed. That is the job of a portlet.
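As a rough sketch of what "exposing key business functionality" can look like, here is a plain business contract in Java, with invented names. Keeping the logic behind an interface like this, and then putting a Web service in front of it, is what buys the future proofing: the same contract can serve a portlet today and a third-party consumer tomorrow.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical business contract; the names are invented for illustration.
// The interface knows nothing about portals or Web services, so the same
// logic can sit behind a portlet today and a third-party consumer tomorrow.
interface SalesLeadService {
    /** Returns one-line summaries of the open leads for a sales representative. */
    List<String> openLeadSummaries(String salesRepId);
}

// Trivial in-memory implementation standing in for the real sales system.
class InMemorySalesLeadService implements SalesLeadService {
    public List<String> openLeadSummaries(String salesRepId) {
        return Arrays.asList("Acme Corp - Proposal Sent", "Globex - Qualified");
    }
}
```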

Today developers are somewhat limited when building consuming portlets: they must adhere to the APIs defined by their portal environment, and portlet portability is not quite there yet, as standards like JSR 168 are still being finalized. If you expose your content through Web services, however, the effort to consume that data is fairly trivial in most portal environments; most offer simple wizard interfaces that let a developer point at a Web service and have the portlet nearly build itself. Given the reusability, ease of deployment, and "future proofing" this brings to your applications, using Web services to expose functionality and content is a smart move.
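For readers who haven't seen the JSR 168 API yet, a minimal portlet looks roughly like the sketch below: extend GenericPortlet and render markup in doView. The hard-coded content is purely illustrative; a consuming portlet would pull it from the Web service it wraps.

```java
import java.io.IOException;
import java.io.PrintWriter;

import javax.portlet.GenericPortlet;
import javax.portlet.PortletException;
import javax.portlet.RenderRequest;
import javax.portlet.RenderResponse;

// Minimal JSR 168 portlet skeleton. The markup below is a placeholder; a real
// consuming portlet would fetch its content from a back-end Web service.
public class HelloLeadsPortlet extends GenericPortlet {

    protected void doView(RenderRequest request, RenderResponse response)
            throws PortletException, IOException {
        response.setContentType("text/html");
        PrintWriter out = response.getWriter();
        out.println("<h3>Open Sales Leads</h3>");
        out.println("<p>(data would come from the sales lead Web service)</p>");
    }
}
```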

Portal Environment: Build or Buy?
While today's portal software is impressive, it comes at a price. Portal software with implementation can run a company anywhere from the mid five figures to well into the high six figures and beyond, depending on the number of users, servers, and so on. Is it worth the investment? The answer really depends on your business need. Giving your staff access to their vacation-day reports and links to HR policies within a branded intranet site may not be worthy of the sexy and expensive features of a full-blown portal environment. In that case, your efforts are better spent handing the project to a junior developer with a month to bang it out; you'll see far greater ROI than you would from deploying a real portal. If, however, you truly intend to portalize your business functions as described in our CEO dashboard example, there is no doubt you'll want to buy rather than build your own. The time and money needed to match the features and functionality of a portal environment are not well spent, and in this instance a good portal environment is worth the investment.

While portal environments differ in the goodies they offer, most include some flavor of SSO authentication and authorization; the ability to organize users into communities with common interests and associated content and applications; monitoring, management, and reporting; content aggregation, indexing, and searching; document check-in, check-out, and versioning; collaboration; portlet development APIs or SDKs; and much more. Because portal environments have been around for some time now, and most offer easy ways to develop portlets for them, giant libraries of portlets have accumulated. Plumtree, for example, has hundreds of portlets in its library, from integrations with Lotus Notes, SAP, Siebel, and PeopleSoft to AOL Instant Messenger. These libraries are a huge advantage over building all of these components yourself, as many of the portlets are open source or relatively inexpensive.

What About Integration?
How well a portal environment integrates with Web services also depends heavily on the portal vendor you choose. While some environments certainly make the integration process less painful than others, that should not be a developer's main consideration. Developers should be more concerned with how well a portal vendor interoperates with other vendors and portlets; in other words, how good is this vendor's support for emerging portal and Web service standards? At the beginning of the portal revolution it was acceptable for each vendor to operate in its own box: every portlet was proprietary to the environment it lived in, because portlet-specific standards did not yet exist. Enter OASIS and the Java Community Process (JCP). With the recent approval of the Web Services for Remote Portlets (WSRP) standard by OASIS and the development of the JSR 168 Portlet Specification by the JCP, vendors now have the choice of whether to play nice together. Happily, several vendors have already integrated these standards into their products. This is a big step in the right direction and great news for those wishing to implement Web services as portlets.

Web Services Standards and Portals
WSRP is an interesting standard and somewhat of a paradigm shift from what we are used to in the Web services world. Until now everything has been very data-centric: it is the nature of Web services not to worry about specific protocols or client displays. Web service calls either performed a function or accessed some data, and presentation was left in the hands of the consumer. The WSRP standard changes this for portlets. The idea is that portlet Web services are no longer "data-oriented," as OASIS calls it; they are now "presentation-oriented" (see Figure 2).

[Figure 2]

WSRP defines a Web service interface that allows a much richer interaction than straight Web service calls. The producer of a WSRP-compliant service can expose description information and capabilities of the service, presentation markup that is now part of the service, an optional registration interface for creating a relationship between consumer and producer, and optional management interfaces. By creating WSRP-compliant portlets, producers can publish portlets developed in their familiar environments and be assured that consumers will receive the service as it was intended, from the application logic up to the presentation. It lets them reuse portlets, make them WSRP compliant, and expose them to other consumers. From the consumer side, no additional development is required to integrate a WSRP portlet, as there would be if a straight Web service were being consumed. So long as the consumer is also WSRP compliant, including the portlet is virtually development free.
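To give a feel for the flow WSRP defines, here is a schematic sketch of a consumer talking to a producer. The stub class and its methods are hypothetical stand-ins for whatever your SOAP toolkit would generate from a producer's WSRP 1.0 WSDL; only the two-step shape of the exchange, describe the producer and then ask a portlet for ready-to-render markup, reflects the description and markup interfaces the standard defines.

```java
import java.util.Arrays;
import java.util.List;

// Schematic WSRP consumer. WsrpProducerStub is a hypothetical stand-in for the
// stubs a SOAP toolkit would generate from a producer's WSRP 1.0 WSDL; it is
// not a real API.
public class WsrpConsumerSketch {

    public static void main(String[] args) {
        // 1. Ask the producer what portlets it offers and what it can do
        //    (WSRP's service description interface).
        WsrpProducerStub producer = WsrpProducerStub.connect("http://example.com/wsrp");
        for (String handle : producer.offeredPortletHandles()) {
            System.out.println("Producer offers portlet: " + handle);
        }

        // 2. Ask one portlet for ready-to-render markup (WSRP's markup
        //    interface). The producer returns presentation, not raw data;
        //    that is the "presentation-oriented" shift described above.
        String markup = producer.getMarkup("salesLeadPortlet", "en-US");
        System.out.println(markup);
    }
}

// Hypothetical stand-in for generated WSRP stubs.
class WsrpProducerStub {
    static WsrpProducerStub connect(String producerUrl) { return new WsrpProducerStub(); }

    List<String> offeredPortletHandles() {
        return Arrays.asList("salesLeadPortlet", "vacationCalendarPortlet");
    }

    String getMarkup(String portletHandle, String locale) {
        return "<div class=\"portlet\">...markup rendered by the producer...</div>";
    }
}
```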

The adoption of the WSRP standard will ultimately make portlet interoperability issues between portal vendors a problem of the past, opening up new opportunities for businesses to expose and market their developed portlets while maintaining full control over application and presentation logic. It will also benefit businesses looking to consume existing WSRP portlets by offering them a complete packaged portlet application, presentation and all, requiring virtually no development effort.

Though there are clearly many benefits to using Web services in a portal environment, there are still some drawbacks. Until the portal standards have been fully adopted, developers will need to continue to produce and consume Web services the "old-fashioned" way, and consuming that way requires a good deal of additional development to create presentation and logic around the data you get back. Another problem with standard Web services is that you are constrained by the APIs you build. Let's say, for example, that you decide your fancy sales lead Web service should now return fax numbers, which it didn't do before. You first need to produce an updated Web service containing the new field. Then the updated WSDL needs to be republished to your UDDI registry so new consumers of the service can develop against the new API. That is fine for new consumers, but your existing consumers are now out of sync.

Each existing consumer will need to develop against this new API to handle the fax data, and you can see how that can very quickly become a maintenance nightmare. If your varied consumers and deployment needs require controlling presentation, data, and other service attributes from a central location, you may want to choose a vendor who is on board with these new portal standards.
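Here is a sketch, with invented types, of why the fax-number change ripples outward. In practice these classes would be generated from the service's WSDL rather than written by hand; consumers compiled against version 1 simply never see the new field until they regenerate their proxies and rebuild.

```java
// Invented types illustrating the contract-evolution problem. In a real
// project these would be generated from the service's WSDL, and the change
// would be made in the WSDL first.

// Version 1 of the sales lead contract, as existing consumers compiled it.
class SalesLeadV1 {
    String company;
    String contactName;
    String phoneNumber;
}

// Version 2 adds the fax number. A consumer still using proxy classes
// generated from the old WSDL never sees this field until it regenerates
// and rebuilds; an incompatible change (a renamed or removed element, say)
// would break it outright.
class SalesLeadV2 {
    String company;
    String contactName;
    String phoneNumber;
    String faxNumber;   // new field unknown to existing consumers
}
```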

Implementing Web Services
So let's say you're willing to take the good with the bad and want to use data-oriented Web services in a portal. How do you go about it? Like everything else, this depends on the environment you are in, but the basics are as follows. First, develop the business logic that needs to be exposed; since we are in the Web services world, you can do this in any language and on any platform you like. Next, expose that business logic as a Web service. At this stage of the game just about every IDE has some support for generating the files you need, including the Web service wrapper code, the WSDL definition file, and deployment descriptors. Deploy the service and publish its description to a UDDI server, and you have successfully produced a Web service.
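Here is a minimal sketch of the "expose it as a Web service" step. The annotations are JAX-WS, which postdates this article, so treat them as a modern stand-in for the JAX-RPC or Apache Axis wrapper code an IDE of the day would have generated for you; the class names and URL are invented.

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hypothetical sales lead service exposed as a SOAP Web service. The JAX-WS
// annotations stand in for the wrapper code, WSDL, and deployment descriptors
// that contemporary tooling generated.
@WebService
public class SalesLeadEndpoint {

    @WebMethod
    public String[] openLeadSummaries(String salesRepId) {
        // Delegate to the real business logic; hard-coded here for brevity.
        return new String[] { "Acme Corp - Proposal Sent", "Globex - Qualified" };
    }

    public static void main(String[] args) {
        // Publishing the endpoint exposes its WSDL at ?wsdl, which is the
        // description you would then register with your UDDI server.
        Endpoint.publish("http://localhost:8080/services/salesLeads",
                         new SalesLeadEndpoint());
        System.out.println("WSDL at http://localhost:8080/services/salesLeads?wsdl");
    }
}
```

Running the class and fetching the ?wsdl URL gives you the definition you would publish so that consumers can generate their client proxies.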

Now you must consume this service in your portal. The development kits that come with your portal software will determine how easily you can integrate the newly deployed service. If you're lucky, you'll have a wizard-style interface that asks where the service is defined and, from the WSDL, generates all the client proxy files needed to access it. That is still only one piece of the client work: in the Java world it is just the M of MVC (Model-View-Controller). You will still need to build the presentation layer and any logic needed to interact with the service. In building that client code you will have access to all the goodies your portal environment offers, such as session information, portlet-to-portlet communication, authorization information, and so on. Implementing portlets as Web services takes some work, and how smoothly your implementation goes will depend on the development platform you use, the portal vendor you choose, and the in-house talent you can bring to bear on the job.
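A sketch of that remaining client work follows. The proxy class and the JSP path are hypothetical stand-ins for what your portal's wizard would generate and for the view you would write yourself; the division of labor, proxy as Model, JSP as View, portlet as Controller, is the one described above.

```java
import java.io.IOException;

import javax.portlet.GenericPortlet;
import javax.portlet.PortletException;
import javax.portlet.PortletRequestDispatcher;
import javax.portlet.RenderRequest;
import javax.portlet.RenderResponse;

// The Controller: a JSR 168 portlet that calls the generated proxy (the Model)
// and hands the result to a JSP (the View). SalesLeadServiceProxy and the JSP
// path are hypothetical.
public class SalesLeadDashboardPortlet extends GenericPortlet {

    protected void doView(RenderRequest request, RenderResponse response)
            throws PortletException, IOException {
        // Model: the wizard-generated client proxy for the Web service.
        SalesLeadServiceProxy proxy = new SalesLeadServiceProxy();
        String[] leads = proxy.openLeadSummaries(request.getRemoteUser());

        // Hand the data to the View and let the JSP render it.
        request.setAttribute("leads", leads);
        response.setContentType("text/html");
        PortletRequestDispatcher view =
                getPortletContext().getRequestDispatcher("/WEB-INF/jsp/leads.jsp");
        view.include(request, response);
    }
}

// Hypothetical stand-in for the proxy classes generated from the WSDL.
class SalesLeadServiceProxy {
    String[] openLeadSummaries(String userId) {
        return new String[] { "Acme Corp - Proposal Sent", "Globex - Qualified" };
    }
}
```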

Conclusion
Individually, Web services and portals are very powerful technologies, so one would think that together they would be unstoppable. In theory that's correct, but in reality there are still some obstacles to seamless, portable, simple Web service portlet integration. The good news is that these are known problems, and both the standards bodies and the portal vendors themselves are addressing them. If your business needs dictate it, a portal can be a wise investment, and if it is used properly the ROI will not take long to appear. And when you've searched every portlet library without finding that special application you require, take a stab at building your portlet with a Web service. For all the current and future standards support, platform and protocol independence, reusability, interoperability, and "future proofing" you get, you'll be glad you did.

About the Author

Alec Graziano is director of Web engineering at Miller Systems, where he is responsible for all of Miller Systems' Internet-based software development engagements and process methodology, including design, architecture, and programming. Alec has significant and broad experience in designing and managing the delivery of industry-standard, Web-based software, particularly in the Web services arena, in both J2EE and .NET frameworks.
