By Deborah Strickland
November 12, 2012 12:25 PM EST
NGSON: An Interesting Standardization Effort
When user-generated content began to populate the Internet and to drive a revolution in content-related services, researchers understood that user-generated services could be the next step in that evolution. Over the last decade, standardization efforts have emerged to provide answers and architectural models; efforts like TMF SDF, OMA OSE, ATIS SON, ITU NGN SIDE, OPUCE, SOA and SPICE can be considered the most relevant. Recently the IEEE WG P1903 was created to develop the concept of NGSON (Next Generation Service Overlay Network). A previous concept, SON (Service Overlay Network), was formulated with the intention of integrating service-aware1 technologies created by SOA and delivered over multiple networks using an SDP, but it lacked critical features such as service continuity across multiple networks and the presentation of content on different devices under multiple and diverse environmental contexts. NGSON promises to include these features, so it is natural to think that this new architectural model could revolutionize the way the industry develops and delivers services over telecommunications networks today. The concept can be conceived as a big umbrella that not only integrates service-aware models but will also allow telcos, through a powerful network scheme, to provide the first truly pervasive service. The convergence of services, models and technologies in telecommunications has been widely criticized in the past, and it is obvious that adoption of convergence models hasn't been massive or enthusiastic. Can this new concept become the answer for telcos seeking to keep services converging under their roof?
NGSON Brings To the Table…
Image courtesy of Paola Buelvas ([email protected])
This concept can be thought of as an extra layer between the service and the network that telcos use today to deliver services. What really draws attention is that it allows the service provider to extend beyond its own network infrastructure to reach different networks, content and even users. NGSON also includes three advanced functionalities: Context Awareness, Dynamic Adaptation and Self-Organization. Context Awareness is a key tool for service realization; it allows the network to evaluate the situational context of the user in multiple dimensions: the service, the user, the device and the network. Dynamic Adaptation will allow the service to transcend networks and devices (total mobility), ensuring continuity. Self-Organization ensures that the internal resource configuration of this new layer can adapt autonomously, imitating the proven resilience principle of the Internet. But what can you do with it? This architectural concept is divided into several layers and functions, and research efforts and proofs of concept so far have focused on: user-created services, where the user chooses, from a list of services, those he wishes to be combined and delivered, without needing deep software or telecommunications knowledge; temporary content storage and optimization functions that provide core traffic offloading through a specialized layer interfacing with traditional transport networks; and aggregation of cloud computational resources. Finally, the one I consider to raise the highest expectations is the possibility of providing a truly pervasive service: your identity, applications, contacts and preferences everywhere, on all your devices and without interruption.
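To make the Context Awareness and Dynamic Adaptation ideas above more tangible, here is a minimal sketch of a context evaluation across the four dimensions (user, service, device, network) driving an adaptation decision. This is purely illustrative; the P1903 work defines functional entities, not an API, so every name and threshold below is a hypothetical assumption of mine:

```python
from dataclasses import dataclass

# Hypothetical sketch: NGSON's four context dimensions feeding a
# dynamic-adaptation decision. All names and thresholds are
# illustrative assumptions; the standard defines no concrete API.

@dataclass
class Context:
    user_profile: str      # user dimension (e.g., identity/preferences)
    service_type: str      # service dimension (e.g., "video")
    device_screen: str     # device dimension (e.g., "phone", "tv")
    network_bw_kbps: int   # network dimension (measured bandwidth)

def adapt_delivery(ctx: Context) -> dict:
    """Pick a delivery profile from the situational context."""
    # Dynamic adaptation: degrade gracefully as the network dimension shrinks,
    # and tailor presentation to the device dimension.
    if ctx.network_bw_kbps < 500:
        quality = "audio-only" if ctx.service_type == "video" else "text"
    elif ctx.device_screen == "phone":
        quality = "sd"
    else:
        quality = "hd"
    return {"quality": quality, "user": ctx.user_profile}

profile = adapt_delivery(Context("alice", "video", "phone", 8000))
```

The point of the sketch is the shape of the decision, not the policy itself: the same service follows the user across devices and networks, with the overlay (rather than the application) deciding how to present it.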
The Technical Challenges Today
This effort is just starting and a lot remains to be done. Given the current state of the art, the following challenges and work ahead need to be addressed to arrive at a complete technical proposal:
- The Context Awareness feature must be used to optimize traffic delivery; P2P concepts can be integrated to optimize how traffic moves through the network.
- The protocols to be executed over the different interfaces are still unknown, but they have to be compatible with the service technologies currently in place.
- The layer that interfaces with current telco networks, called the delivery control plane, hasn't been completely developed; several issues and complexity definitions remain to be worked out.
- Transparent QoS support is required throughout the different interfaces and layers of the model; needless to say how challenging this may be in some networks.
- Easy integration with telco networks: the interconnection not only has to be possible, it has to be simple and very fast to deploy.
The Problems of the Past Could be Problems of Today
The industry has tried, through different approaches, to regain the user attention lost to Internet content providers. The adoption barriers faced by those approaches are well known; let me summarize the ones I can foresee for NGSON and contrast them against its objectives and architecture definition:
- Failure to attract a critical mass of application developers for the telecom service: NGSON, through its service control layer that includes discovery, integration, composition and routing, seems to address this issue; but again, without knowing the complexity definitions and the compatibility with existing service platforms, it is difficult to forecast success.
- Complexity of the model or service architecture: there are several dimensions of complexity to consider: the model itself, application developers, integration with existing service technologies, commercialization of the model and integration into telco networks. Although the model seems realizable, and simple enough at a high level to allow planning for growth and maintenance activities, the current information makes it hard to conclude whether complexity will ruin the party.
- Compatibility: the dimensions to be compatible with are current content providers and service platforms; control protocols and signaling schemes, both for user content delivery and for control of resources and coordination among functional entities; and current transport networks, where communication protocols again play a key role and the negotiation of capabilities and resources must be considered for all possible transport networks. For this, NGSON separately implements a control plane for the service, another plane for unified signaling control and a different one for control of the media bearers; this is definitely one of the strongest points of the functional architecture.
- Time to market: when talking about adoption of new network infrastructure for service delivery (e.g., WCDMA or LTE), it is quite understandable that telcos take their time, and even desirable that standardization bodies pace their work accordingly. But when talking about service creation infrastructure, time is a critical factor: if there isn't a quick path to release and availability, the whole concept can be invalidated by a new service paradigm developed by industries less bound by convergence and complexity. NGSON enjoys no advantage here.
- Services/applications/hooks: it has been proven that telcos normally do not "nail it" when it comes to service creation, so this time around collaborative efforts must be convened: Internet innovators, young developers, content giants and groups of users must be invited to take part in the service-layer planning. As usual, real collaborative efforts seem very hard to find for NGSON, and no initiative is well known and strong enough to be taken into account alongside existing service-aware technologies.
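The separation of service, signaling and media-bearer control planes noted in the compatibility point can be sketched as three independent interfaces, so each plane can be swapped per transport network without touching the others. Again, this is only an illustrative sketch under my own assumptions; all class and method names are hypothetical, not anything defined by P1903:

```python
from abc import ABC, abstractmethod

# Illustrative sketch of the plane separation: service control,
# signaling control and media (bearer) control kept behind independent
# interfaces. All names are hypothetical; NGSON defines no concrete API.

class ServiceControl(ABC):
    @abstractmethod
    def compose(self, services: list) -> str: ...

class SignalingControl(ABC):
    @abstractmethod
    def negotiate(self, capability: str) -> bool: ...

class MediaControl(ABC):
    @abstractmethod
    def open_bearer(self, quality: str) -> str: ...

class SimpleComposer(ServiceControl):
    def compose(self, services):
        # Service plane: combine user-selected services into one offering.
        return "+".join(services)

class SipLikeSignaling(SignalingControl):
    def negotiate(self, capability):
        # Signaling plane: a real implementation would map this onto the
        # transport network's own control protocols.
        return capability in {"qos", "handover"}

class RtpLikeMedia(MediaControl):
    def open_bearer(self, quality):
        # Media plane: bearer setup, independent of the other two planes.
        return f"bearer:{quality}"

svc, sig, media = SimpleComposer(), SipLikeSignaling(), RtpLikeMedia()
session = (svc.compose(["chat", "video"]),
           sig.negotiate("qos"),
           media.open_bearer("hd"))
```

The design point is the decoupling itself: compatibility work on, say, the signaling plane for one transport network does not ripple into service composition or bearer handling.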
One of the main objectives of NGSON is to benefit content providers, service providers and users. The problem is that there seems to be a gap between the development of the concept and the communication of its value to the key stakeholders; someone in the industry should bridge this gap.
What should the Industry do to Ensure Success?
Considering the huge potential this concept holds to provide the jump toward a new type of service for telecommunications companies, and speaking as a fan of technology, a professional in the industry and a plain user, in my opinion this should be the action plan for the industry:
- Take full advantage of the advanced features. In the short term this means integrating, under the technical and functional concepts of EPC/4G, the various access technologies available, considering that one access technology alone will not reach as many users as all of them combined.
- Do not be timid when it comes to exposing the capabilities of the architecture; in other words, telcos must hand over the power of service creation to those who can exploit it to the best this concept can provide.
- The industry must take the risk. The economic indicators are no longer favorable for telcos: ARPU continues to decline, and network usage spikes beyond reasonable network growth. It's time!
- I'm not sure everyone is convinced about services created by the user, but if the industry really believes this could be the evolution path of the service, we need to start generating information about it and explaining it in a different way, and the commercial conditions must be properly set.
I see this concept as an interesting opportunity that, for the first time, will give telcos enough arguments to give away control of the service offering, ensuring economic survival while evolving toward the next phase of services.
So, rephrasing the question that frames this blog: will this new concept of service creation really revolutionize the industry? To the best of my understanding, NGSON is a well-conceived technical concept that overcomes several flaws of earlier approaches and has all the necessary components to be innovative and successful.
But the cold truth is that there is still a lot of work to be done: various barriers faced by unified frameworks in the past are still present today, and industry leaders remain distant from the need for a new approach to the service; in other words, the innovator roles that could take us to the next level remain vacant. In conclusion, this is a very interesting concept with huge potential, which has to overcome several non-trivial challenges but still has some time to accomplish its objective. As a user and fan of telecommunications, I would really like to see the realization of the service as promised by this interesting effort.
For more, follow me, @jomaguo
1. The concept of service awareness is a paradigm that conceives service composition through high-level functional blocks that abstract the complexity of the lower transport and delivery layers.