By Dana Gardner
December 3, 2012 06:45 AM EST
Welcome to the latest edition of the HP Discover Performance Podcast Series. Our next discussion examines how regional healthcare services provider Lake Health in Ohio has matured from deploying security technologies to running a more comprehensive internal risk-reduction practice for its own users.
We learn how Lake Health's Information Security Officer has been expanding the breadth and depth of risk management there to a more holistic level -- and we're even going to discuss how they've gone about deciding which risk and compliance services to seek from outside providers, and which to retain and keep on-premises.
Here to explore these and other security-related enterprise IT issues, we're joined by our co-hosts for this sponsored podcast, Chief Software Evangelist at HP, Paul Muller, and Raf Los, Chief Security Evangelist at HP.
And we also welcome our special guest, Keith Duemling, Information Security Officer at Lake Health. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]
Here are some excerpts:
Gardner: Many people are practicing IT security and they're employing products and technologies. They're putting in best practices and methods, of course.
But you have a different take. You've almost abstracted this up to information assurance -- even quality assurance -- for knowledge, information, and privacy. Tell me how that higher abstraction works, and why you think it's more important or more successful than just IT security?
Duemling: If you look at the history of information security at Lake Health, we started like most other organizations. We were very technology focused, implementing one or two point solutions to address specific issues. As our program evolved, we started to change how we looked at it and considered it less of a pure privacy issue and more of a privacy and quality issue.
Go back to the old tenets of security, with confidentiality, integrity, and availability. We started thinking that, of those three, we really focused on the confidentiality. But as an industry, we haven't focused that much on the integrity -- and the integrity is closely tied to the quality.
So we wanted to transform our program into an information-assurance program, so that we could allow our clinicians and other caregivers to have the highest level of assurance that the information they're making decisions based on is accurate and is available, when it needs to be, so that they feel comfortable in what they are doing.
As background, Lake Health is a not-for-profit healthcare system. We’re about 45 minutes outside of Cleveland, Ohio. We have two freestanding hospitals and approximately 16 satellite sites of different sizes that provide healthcare to the citizens of the county that we’re in and three adjacent counties.
We have three freestanding 24×7 emergency rooms (ERs), which treat all kinds of injuries, from simple broken fingers to severe car accidents, heart attacks, and things of that nature.
We also have partnerships with a number of very large healthcare systems in the region, and organizations of that size. We send some of our more critically injured patients to those providers, and they will send some of their patients to us for more localized, smaller care closer to their place of residence.
We’ve grown from a single, small community hospital to the organization that we have now.
Typically, when I started, an individual was assigned a set of projects to work on, and I was assigned a series of security projects. I came to the organization with a security background. Over time, those projects coalesced into the security program that we have now, and if I'm not mistaken, it's in its third iteration right now. We seem to be on a three-year run for our security program, before it goes through a major retrofit.
So it's not just protecting information from being disclosed, but it's protecting information so that it's the right information, at the right time, for the right patient, for the right plan of care.
From a high level, the program has evolved from simple origins to more of a holistic type of analysis, where we look at the program and how it will impact patient care and the quality of that patient care.
Gardner: It sounds like what I used to hear -- and it shows how long I have been around -- in the manufacturing sector. I covered that 20 years ago. They talked about a move toward quality, and rather than just looking at minute or specific parts of a process, they had to look at it in total. It was a maturity move on behalf of the manufacturers, at that time.
Raf Los, do you see this as sort of a catching up time for IT and for security practices that are maybe 20 years behind where manufacturing was?
Los: What Keith’s group is doing, and where many organizations are evolving to, is a practice that focuses less on “doing security” and more on enabling the enterprise and keeping quality high. After all, security is simply a function -- one of the three pillars -- of quality. We look at: does it perform, does it function, and is it secure?
So it's a natural expansion of this, sort of a Six Sigma-esque approach to the business, where IT is catching up, as you’ve aptly put it. So I tend to agree with it.
Gardner: Of course, compliance is really important in the healthcare field. Keith, tell us how your approach may also be benefiting you, not just in the quality of the information, but helping you with your regulatory and compliance requirements too?
Duemling: In the approach that we’ve taken, we haven’t tried to change the dynamics of that significantly. We've just tried to look at the other side of the coin, when it comes to security. We find that a lot of the controls that we put in place for security benefit from an assurance standpoint, and the same controls for assurance also benefit from a security standpoint.
As long as we align what we're doing to industry-accepted frameworks, whether it be NIST or ISO, and then add the healthcare-specific elements on top of that, we find that this gives us a good architecture to continue our program, and to be mindful of the assurance aspect as well as the security side.
One of the other benefits of the approach is that we look at the data itself or the business function and try to understand the risks associated with it and the importance of those functions and the availability of the data. When we put the controls and the protective measures around that, we typically find that if we're looking specifically at what the target is when we implement the control, our controls will last better and they will defend from multiple threats.
So we're not putting in a point solution to protect against the buzzword of the day. We're trying to put in technologies and practices that will improve the process and make it more resilient from both what the threats are today and what they are in the future.
Muller: A couple of observations ... The first is that we need to be really careful when we think about compliance. It's something of a security blanket -- not so much for security executives, who I think understand the role of compliance, but it can give business leaders a false sense of security to say, "Hey, we passed our audit, so we're compliant."
There was a famous case of a very large financial-services institution that had been through five separate audits, all of which gave it a clean bill of health. But it was clear from some of the honeypots it had put in place that it was leaking data to a market-based adversary. In other words, somebody was selling its data, and it wasn’t until the sixth audit that the source of the problem was uncovered.
So we need to be really careful. Compliance is actually the low bar. We're dealing with a market-based adversary. That is, someone will make money from your data. It's not the nation-state that we need to worry about so much as the people who are looking to exploit the value of your information.
Of course, once money and profit enter the equation, there are a lot of people very interested in automating and mechanizing their attack against your defense, and that attack surface is obviously constantly increasing.
The challenge, particularly in examples such as the one that Keith is talking about, comes in the mid-sized organizations. They've got all of the compliance requirements, the complexity, and the fascinating, or interesting, data from the point of view of a market-based adversary. They have all of that great data, but don't necessarily have the scale and the people to be able to protect it.
It's a question of how you balance the needs of a large enterprise with the resources of a mid-sized organization. I don't know, Keith, whether you've had any experience of that problem.
Duemling: I have all too many times experienced that problem that you’re defining right there. We find that technology that helps us to automate our situational awareness is something that's key for us. We can take the very small staff that we have and make it so that we can respond to the threats and have the visibility that we need to answer those tough questions with confidence, when we stand in front of the board or senior management. We're able to go home and sleep at night and not be working 24×7.
Los: Keith, let me throw a question at you, if you don't mind. We mentioned automation, and everybody that I have this conversation with tends to -- I don't want to say oversimplify -- but there can be an over-reliance on automation technology.
In an organization of your size, you’re right smack in the middle of that: too big not to be a target, too small to have all the resources you've ever wanted to defend yourself. How do you keep from being overrun by the automation -- too many dashboards, too many red lights blinking at you -- so that you can actually make sense of any of this?
Duemling: That's actually one of the reasons we selected HP's ArcSight. We had too many dashboards for our very small staff to manage, and we didn’t want Monday to be the dashboard for Product A, Tuesday for Product B, and things of that nature.
So we figured we would aggregate them and create the master dashboard, which we could use to have a very high-level, high-altitude view, drill down into the specific events, and then start referring them to subject-matter experts. We wanted to have just those really sensitive events bubble up to the surface, so that we could respond to them and they wouldn’t get lost in the maze of dashboards.
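The "bubble up only the sensitive events" approach Duemling describes can be sketched as a simple severity filter over aggregated product feeds. This is a hypothetical illustration of the idea, not ArcSight's actual correlation logic; the event fields and threshold are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Event:
    source: str      # which point product reported the event
    severity: int    # 0 (informational) .. 10 (critical)
    summary: str

def aggregate(feeds):
    """Merge the per-product event feeds into one master stream."""
    return [event for feed in feeds for event in feed]

def bubble_up(events, threshold=7):
    """Surface only high-severity events, most severe first."""
    hot = [e for e in events if e.severity >= threshold]
    return sorted(hot, key=lambda e: e.severity, reverse=True)

# Two hypothetical product feeds ("Product A", "Product B" in the transcript).
feeds = [
    [Event("firewall", 3, "port scan"), Event("firewall", 8, "data exfil pattern")],
    [Event("antivirus", 9, "ransomware signature"), Event("antivirus", 2, "definitions updated")],
]

for e in bubble_up(aggregate(feeds)):
    print(e.severity, e.source, e.summary)
```

The design point is the single funnel: instead of one dashboard per product per day, every feed flows into one stream, and only events above the threshold reach an analyst, who can then drill down into the originating product.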
Gardner: How did you unify all of these different elements under what you call a program for security? What were some of the steps you needed to take? We heard a little bit about the dashboard issue, but I'm trying to get a larger perspective on how you unified culture around this notion of information assurance?
Duemling: We started within the information and technology department, where we had to do an evaluation: What technologies do we have in place? What are different individuals responsible for, and who do they report to? Once we found that there was this sprinkling of technology and responsibilities throughout the department, we had to put together a plan to unify it all into one program that has one set of objectives, is under one central leadership, and has clear marching orders.
Then once we accomplished that, we started to do the same thing across the entire organization. We improved our relationship within IT, not just with sub-departments within IT, but then we also started to look outside and said, "We have to improve our relationship with compliance and we have to improve our relationship with physical security."
So we’re unifying our security program under the mantra of risk, and that's bringing all the different departments that are related to risk into the same camp, where we can exchange notes and drive toward a bigger, enterprise-focused set of objectives.
Los: At the end of the day, what security is chartered with, along with most of the rest of IT, as I said earlier, is empowering the organization to do its work. Lake Health does not exist for the sole purpose of security, and clearly they get that.
That's step one on this journey of understanding what the purpose of an IT security organization is. Along the broader concept of resiliency, one of the things that we look at in terms of security and its contribution to the business is, can the organization take a hit and continue, get back up to speed, and continue working?
Not if, but when
Most organizations' technologists by now know it’s not a question of if you’re going to be hacked or attacked, but a question of when, and of how you’re going to respond -- by allowing the intelligent use of automation, aligning toward business goals, and understanding the organization and what's critical in it.
They rely on critical systems, critical patient-care systems. That goes straight to the enterprise-resiliency angle. If you get hacked and your network goes down, IT security is going to be fighting that hack. At the same time, we need to figure out how to separate the bad guys from the patient and the critical-care systems, so that our doctors, nurses, and support professionals can go back to saving lives and making people’s lives better, while we contain the issue and eradicate it from our systems.
It's more than just about security, and that's a fantastic revelation to wake up to every morning.
Gardner: Are there some other returns on investment (ROI), maybe it's a softer return like an innovation benefit or being able to devote more staff to innovation?
Duemling: I'd put forward two paybacks. One relates to some earlier comments I heard. As an organization, we did suffer a specific event in our history, where we were fighting a threat while it was expected that our facilities would continue operating. Because of the significant size of that threat, we had degraded services, but we were able to continue -- patients were able to keep coming in and being treated, things of that nature.
That happened earlier in our program, but it didn’t happen to the point where we didn’t have a program in place. So, as an organization, we were able to wage that war, for lack of a better term, while the business continued to function.
Although those were some challenging times for us, and luckily there was no patient data directly or indirectly involved with that, it was a good payoff that we were able to continue to fight the battle while the operations of the organization continued. We didn't have to shut down the facilities and inconvenience the patients or potentially jeopardize patient safety and/or care.
A second payoff is, if we fast forward to where we are now, lessons learned, technologies put in place, and things of that nature. We have a greater ability to answer those questions, when people put them to us, whether it's a middle manager, senior manager, or the board. What are some of the threats we're seeing? How are we defending ourselves? What is the volume of the challenge? We're able to answer those questions with actual answers as opposed to, "I don't know," or "I'll get back to you."
So we can demonstrate more of an ROI through an improvement in situational awareness and security intelligence that we didn't have three, four, or five years earlier in the program’s life. And tools like ArcSight and some of the other technologies that we have, which aggregate that for us, get rid of the noise, and just let us home in on the crown jewels of the information, are really helpful for us in answering those questions.
System of record
Gardner: How about looking at this through the lens of a system of record -- an architectural term, perhaps? Has that single view, that single pane of glass, allowed you to gain the sense that you have a system of record, or systems of record? Has that been your goal, or has that been perhaps even an unintended consequence?
Duemling: It's actually kind of both. One, it retains information that sometimes you wish you didn't retain, but that's the nature of the device and the technology in the solution, and it’s meeting its objective.
But it is nice to have that historical system of record, to use your term, where you can see the historical events as they unfold and explain to someone, via one dashboard or one image, as a situation evolves.
Then, you can use that for forensic analysis, documentation, presentation, or legal purposes, to show the change in the threat landscape related to a specific incident or, from a higher level, to a specific technology that's providing its statistical information into ArcSight, which you can then do trending and analysis on.
It is also good to get toward a single unified dashboard where you can see all of the security events that are occurring in the environment, or outside the environment that you're pulling in, like a feed from a disaster recovery (DR) site. You have that single dashboard where, if you think there's a problem, you can go to it, start drilling down, and answer that question in a relatively short period of time.
Muller: I'll go back to Keith’s opening comments as well. Let's not undervalue confidence -- not having to second-guess not just the integrity of your systems and your applications, but the value of your information. It's one thing when we're talking about the integrity of a customer's bank balance. Let's be clear that that's important, but it can also be corrected just as easily as it can be modified.
When you're talking about confidence in patient data, medical imaging, drug dispensations, and so forth, that’s the sort of information you can't afford to lack confidence in, because you need to make split-second decisions that will obviously have an impact on somebody’s life.
Duemling: I would add to that. As you were saying, you can undo an incorrect or fraudulent bank transfer, but you cannot undo something such as the integrity of your blood bank. If your blood bank has values that randomly change, or if you put the wrong type of blood into a patient, you cannot undo that without a decidedly negative patient outcome.
Los: Keith, along those lines, do you have separate critical systems that you have different levels of classifications for that are defended and held to a different standard of resilience, or do you have a network wide classification? I am just curious how you figure out what gets the most attention or what gets the highest concentration of security?
Duemling: The old model of security in healthcare environments was to have a very flat type of architecture, from a networking, support, and security standpoint. As healthcare continues to modernize, for multiple reasons, there's a need to build islands or castles. That’s the term we use internally, "castles," to describe it. You put additional controls, monitoring, and integrity checks in place around specific areas where the data is the most valuable and the integrity is the most critical, because there are systems in a healthcare environment that are more critical than others.
Obviously, as we talked about earlier, the ones that are used for clinical decision making are technically more critical than the ones that are used for financial compensation as it results from treating patients. So although it's important to get paid, it's more important that patient safety is maintained at all times.
We can't necessarily defend all of our vast resources with the limited set of tools that we have. So we've tried to pick the ones that are the most critical to us and that's where we've tried to put all the hardening steps in place from the beginning, and we will continue to expand from there.
We look at every security project with the mindset of how we can do this most effectively and with the least amount of resources diverted from the clinical environment to the information security program. That being said, security as a service, cloud-based technology, outsourcing, whatever term you would like to use, is definitely something that we consider on a regular basis, when it comes to different types of controls or processes that we have to be responsible for. The same goes for professional services in the event of things like forensics, where you don’t do it on a regular basis, so you may not consider yourself an expert.
We tend to do an evaluation of the likelihood of the threat materializing, our dependence on the technology, what offerings are out there, both as a service and premises-based, and what it would take from an internal resource standpoint to adequately support and use the technology. Then, we try to articulate that into a high-level summary of the different options, with cost, pros, and cons related to each.
Then, typically, our senior management will discuss all of those, and we'll try to come to the decision that we think is best for our organization, not just for that point in time, but for the next three to five years. So some initiatives have gone premises-based and some have gone security-as-a-service based. We're kind of a mix.
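The evaluation Duemling outlines -- threat likelihood, dependence on the technology, available offerings, internal support effort, and cost -- can be imagined as a simple weighted-scoring matrix. The criteria, weights, and scores below are hypothetical illustrations, not Lake Health's actual decision model:

```python
# Hypothetical weighted scoring of delivery options for one security control.
# Criterion scores run 1 (poor fit) .. 5 (strong fit); weights sum to 1.0.
weights = {
    "threat_likelihood_coverage": 0.30,  # how well the option covers likely threats
    "internal_effort_saved":      0.25,  # staff time not diverted from clinical IT
    "five_year_cost":             0.25,  # higher score = more favorable cost
    "vendor_maturity":            0.20,
}

options = {
    "on_premises": {
        "threat_likelihood_coverage": 4, "internal_effort_saved": 2,
        "five_year_cost": 3, "vendor_maturity": 4,
    },
    "security_as_a_service": {
        "threat_likelihood_coverage": 4, "internal_effort_saved": 5,
        "five_year_cost": 4, "vendor_maturity": 3,
    },
}

def score(option):
    """Weighted sum of criterion scores for one option."""
    return sum(weights[c] * option[c] for c in weights)

ranked = sorted(options, key=lambda name: score(options[name]), reverse=True)
for name in ranked:
    print(f"{name}: {score(options[name]):.2f}")
```

The point of the exercise is the high-level summary it produces for senior management: each option reduced to one comparable number, with the per-criterion scores preserved as the pros and cons behind it.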
Gardner: It's interesting that a common thread for successful organizations is knowing yourself well. It's also an indicator of maturity, of course.
You have had a good opportunity to know yourself and then to track your progress. Is that helping you make these decisions about what's core or context in the design of your risk-mitigation activities?
What you do well
Duemling: Yes, it is. You have to know what you do well and also you have to know the areas where you, as an organization, are not going to be able to invest the time or the resources to get to a specific comfort level that you would feel would be adequate for what you are trying to achieve. Those are some of the things where we look to use security as a service.
We don't want to necessarily become experts on spam filtering, so we know that there are companies that specialize in that. We will leverage their investment, their technology, and their IP to help defend us from email-borne threats and things of that nature.
We're not going to try to get into the business of writing a program to create an event-correlation engine. That's why we're going to go out and look for the best-of-breed technologies out there to do it for us.
We'll pick those different technologies, whether as a service or premises-based, and we'll implement them. That will allow us to invest in the people who know our environment best and most intimately, and who can make decisions based on what those tools and those managed services tell them.
They can be the boots on the ground, for lack of a better term, making the decisions that are effective at the time, with all the situational awareness that they need to resolve the problem right then and there.
Gardner: For those of our listeners who are perhaps juggling quite a few security products or technologies and they would like to move into this notion of a program, and would like to have a unified view -- any thoughts about getting started, any lessons learned that you could share?
Duemling: I would say just a couple of bullet points. Security is more than just technology. It really is the people, the process, and the technology. You have to understand the business that you are trying to protect. You have to understand that security is there to support the business, not to be the business.
Probably most importantly, when you want to evolve your security and set up projects into an actual security program, you have to be able to talk the language of the business to the people who run the business, so that they understand that it’s a partnership and you are there to support them, not to be a drain on their valuable resources.
Los: I think he has put it brilliantly just now. IT security is a resource and also a potential drain on resources. So the less we can take away from anything else the organization is doing, while enabling them to basically be better, deliver better, deliver smarter, and save more lives and make people healthier, that is ultimately the goal.
If there's nothing else that anybody takes away from a conversation like this, IT security is just another enabler in the business and we should really continue to treat it that way and work towards that goal.
Gardner: All right, last word to you today, Paul Muller. What sort of lessons learned or perhaps perceptions from the example of Lake Health would you amplify or extend?
Muller: I will just go back to some of my earlier comments, which is, let’s remember that our adversary is increasingly focused on the market opportunity of exploiting the data that we have inside our organizations -- data in all of its forms. Where there is profit, as I said, there will be a drive for automation and best practices. They are also competing to hire the best security people in the world.
But as a result of that, mixed in with the fact that we have this ever-increasing attack surface, the vulnerabilities are increasing dramatically. The statistic I saw from just this October is that the cost of cybercrime has risen by 40 percent and the attack frequency has doubled in the last 12 months. This is very real proof that these market forces are at work.
The challenge that we have is educating our executives that compliance is important, but it is the low bar. It is table stakes, when we think about information and security. And particularly in the case of mid-sized enterprises, as Raf pointed out, they have all of the attractiveness as a target of a large enterprise, but not necessarily the resources to be able to effectively detect and defend against those sorts of attacks.
You need to find the right mix of services, whether we call it hybrid, whether we call it cloud or managed services, combined with your own on-premises services to make sure that you're able to defend yourself responsibly.
Gardner: I'd like to thank our supporter for this series, HP Software, and remind our audience to carry on the dialogue with Paul Muller through the Discover Performance Group on LinkedIn, and also to follow Raf on his popular blog, Following the White Rabbit.
You can also gain more insights and information on the best of IT performance management at http://www.hp.com/go/discoverperformance.
And you can always access this and other episodes in our HP Discover Performance Podcast Series at hp.com and on iTunes under BriefingsDirect. Thanks!
You may also be interested in:
- HP Discover Performance Podcast: McKesson Redirects IT to Become a Services Provider That Delivers Fuller Business Solutions
- Investing Well in IT With Emphasis on KPIs Separates Business Leaders from Business Laggards, Survey Results Show
- Expert Chat with HP on How Better Understanding Security Makes it an Enabler, Rather than Inhibitor, of Cloud Adoption
- Expert Chat with HP on How IT Can Enable Cloud While Maintaining Control and Governance
- Expert Chat on How HP Ecosystem Provides Holistic Support for VMware Virtualized IT Environments
Nov. 24, 2014 07:00 PM EST Reads: 1,598
Bit6 today issued a challenge to the technology community implementing Web Real Time Communication (WebRTC). To leap beyond WebRTC’s significant limitations and fully leverage its underlying value to accelerate innovation, application developers need to consider the entire communications ecosystem.
Nov. 24, 2014 12:00 PM EST Reads: 1,477
The definition of IoT is not new, in fact it’s been around for over a decade. What has changed is the public's awareness that the technology we use on a daily basis has caught up on the vision of an always on, always connected world. If you look into the details of what comprises the IoT, you’ll see that it includes everything from cloud computing, Big Data analytics, “Things,” Web communication, applications, network, storage, etc. It is essentially including everything connected online from hardware to software, or as we like to say, it’s an Internet of many different things. The difference ...
Nov. 24, 2014 11:00 AM EST Reads: 1,616
Cloud Expo 2014 TV commercials will feature @ThingsExpo, which was launched in June, 2014 at New York City's Javits Center as the largest 'Internet of Things' event in the world.
Nov. 24, 2014 09:00 AM EST Reads: 1,629
SYS-CON Events announced today that Windstream, a leading provider of advanced network and cloud communications, has been named “Silver Sponsor” of SYS-CON's 16th International Cloud Expo®, which will take place on June 9–11, 2015, at the Javits Center in New York, NY. Windstream (Nasdaq: WIN), a FORTUNE 500 and S&P 500 company, is a leading provider of advanced network communications, including cloud computing and managed services, to businesses nationwide. The company also offers broadband, phone and digital TV services to consumers primarily in rural areas.
Nov. 23, 2014 07:30 PM EST Reads: 1,817
"There is a natural synchronization between the business models, the IoT is there to support ,” explained Brendan O'Brien, Co-founder and Chief Architect of Aria Systems, in this SYS-CON.tv interview at the 15th International Cloud Expo®, held Nov 4–6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.
Nov. 23, 2014 12:00 PM EST Reads: 1,750
The major cloud platforms defy a simple, side-by-side analysis. Each of the major IaaS public-cloud platforms offers their own unique strengths and functionality. Options for on-site private cloud are diverse as well, and must be designed and deployed while taking existing legacy architecture and infrastructure into account. Then the reality is that most enterprises are embarking on a hybrid cloud strategy and programs. In this Power Panel at 15th Cloud Expo (http://www.CloudComputingExpo.com), moderated by Ashar Baig, Research Director, Cloud, at Gigaom Research, Nate Gordon, Director of T...
Nov. 23, 2014 07:45 AM EST Reads: 1,776
ARMONK, N.Y., Nov. 20, 2014 /PRNewswire/ -- IBM (NYSE: IBM) today announced that it is bringing a greater level of control, security and flexibility to cloud-based application development and delivery with a single-tenant version of Bluemix, IBM's platform-as-a-service. The new platform enables developers to build ap...
Nov. 22, 2014 05:30 PM EST Reads: 1,573