
Adopting a Comprehensive Risk Management Program

In the Boardroom with...Mr. Rob Wigley, Director, Cybersecurity Consulting Services, HP Enterprise Services, U.S. Public Sector

Rob Wigley is Director, Cybersecurity Consulting Services at HP Enterprise Services, U.S. Public Sector. He has more than 30 years of information technology experience supporting manufacturing, high tech, healthcare, and public sector market segments. For the last 10 years, he has focused on developing and delivering cybersecurity consulting solutions for public sector clients.

SecuritySolutionsWatch.com: Thank you for joining us today, Rob. Can you please tell us a little about your background and your role within HP.

Rob Wigley: I have more than 30 years of information technology experience supporting manufacturing, high tech, healthcare, and public sector market segments. For the last 10 years, as regulatory requirements for information security have increased alongside the mounting threats facing government and businesses today, I have focused on developing and delivering cybersecurity consulting solutions for public sector clients. This market has unique IT security requirements and is facing a significant increase in cyber threats. HP Cybersecurity Consulting Services are soundly structured to help our clients manage risks to their environments.

SecuritySolutionsWatch.com: Cybersecurity is front-page news on a daily basis. Attacks and threats may emanate from anywhere, at any time - from well-organized, state-sponsored actors and foreign governments, to lone-wolf hackers, and even from natural disasters such as Hurricanes Katrina and Sandy. We seem to be in an environment where it is not "IF" your organization will be exposed to a breach or experience downtime due to a security threat, but "WHEN." What are the minimum best practices that your Cybersecurity Consulting Team recommends for implementation in this very challenging environment?

Rob Wigley: Without question, adversaries have become smarter, better organized, and more persistent as they seek to cause disruptions and access intellectual property and other sensitive information. Cyber threats are proliferating faster than many organizations can defend against them. Many companies and governments around the world have been subjected to very sophisticated and targeted attacks that have had a significant impact on their missions, including some recent, extremely high-profile breaches conducted by insiders.

In order to address these threats, first and foremost an organization needs to have an effective risk management program supported by senior management, with a strong governance structure, and integrated across all business processes. As businesses look for new growth opportunities and consequently adopt new technologies, the tradeoff becomes uncertainty and risk that could affect their mission and goals. A continuous process of risk management activities should be applied to identify new risks, reassess previously identified risks, and monitor and track the effectiveness of risk mitigation plans.

An often overlooked component of risk management is the benefit of integrating these processes early in the business and technology lifecycle. Doing so helps to identify potential threats and vulnerabilities so they can be addressed from the start in order to prevent security breaches. It's much less costly to fix vulnerabilities detected up front, in the design and development phases, than to fix them in production environments or after a security incident. This lifecycle-based risk management approach can also reduce costs, as full risk assessments would not be required as frequently.

Another important aspect of a comprehensive risk management program is that it allows business leaders to make informed decisions when balancing the cost of managing risk against its potential impact on the business. As IT budgets continue to come under scrutiny, a thorough, documented risk analysis is necessary to justify the expenditure of implementing risk-based controls. That's why it's critical to describe risk in terms of "risk to the business." The bottom line is that an effective risk management program is one that permeates an organization's culture, including people, processes, technology, and governance.
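
One common way to put "risk to the business" into numbers - offered here as an illustration rather than anything HP describes in the interview - is annualized loss expectancy (ALE): the expected yearly loss from a risk, compared before and after a proposed control and weighed against that control's annual cost. A minimal sketch, with purely hypothetical figures:

```python
# Minimal sketch: expressing "risk to the business" in financial terms using
# annualized loss expectancy (ALE = SLE x ARO). All figures are hypothetical.

def annualized_loss_expectancy(single_loss_expectancy: float,
                               annual_rate_of_occurrence: float) -> float:
    """Expected yearly loss from a given risk."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical scenario: a breach costing $400,000 per incident,
# expected roughly once every four years (ARO = 0.25).
ale_before = annualized_loss_expectancy(400_000, 0.25)   # $100,000/year

# A proposed control costing $30,000/year is estimated to halve the
# occurrence rate (ARO = 0.125).
ale_after = annualized_loss_expectancy(400_000, 0.125)   # $50,000/year
control_cost = 30_000

net_benefit = (ale_before - ale_after) - control_cost
print(f"ALE before control: ${ale_before:,.0f}")
print(f"ALE after control:  ${ale_after:,.0f}")
print(f"Net annual benefit of the control: ${net_benefit:,.0f}")
```

A positive net benefit is the kind of documented justification a risk analysis can put in front of business leaders; a negative one argues for accepting or transferring the risk instead.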

SecuritySolutionsWatch.com: If possible, may we have a brief overview of the comprehensive risk management services you are able to provide to both public and private sector entities?

Rob Wigley: HP has a set of core services under our Governance, Risk and Compliance Consulting Services specifically designed to improve a client's overall risk posture in a cost-effective manner. These include compliance and risk assessment services that evaluate the infrastructure and applications within your agency against corporate security policies and industry best practices. We perform vulnerability scanning, penetration testing, code review, and comprehensive application threat assessments.
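
The interview doesn't describe the tooling behind these services, but as a rough illustration of what automated vulnerability scanning involves, the sketch below wraps the open-source nmap scanner to run service detection plus its "vuln" script category against a single host. The host name is hypothetical, and scans should only ever target systems you are explicitly authorized to test:

```python
# Illustrative sketch (not HP's tooling): run nmap service detection and its
# vulnerability scripts against an authorized host and capture the XML report.
import subprocess

def scan_host(host: str) -> str:
    """Return nmap's XML output for service detection plus vuln scripts."""
    result = subprocess.run(
        ["nmap", "-sV", "--script", "vuln", "-oX", "-", host],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout  # XML suitable for downstream parsing and triage

if __name__ == "__main__":
    # Hypothetical in-scope target; never scan systems without authorization.
    report_xml = scan_host("scan-target.example.internal")
    print(report_xml[:500])
```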

We also perform assessments on the strategic aspects of an information security program. We evaluate the effectiveness of security governance, security strategy, incident management, and an organization's risk management process. This consists of a current-state and maturity assessment, a gap analysis, and the development of a strategic roadmap, with the end goal of aligning your security program with business requirements and measurably reducing business risk. We also offer Security Discovery Workshops: one-day interactive workshops facilitated by senior HP consultants and involving both business and IT stakeholders. The objective is to examine your cybersecurity strategy, identify your biggest challenges, look at how you're currently addressing those challenges, and show how you can use our maturity model to implement a prioritized roadmap that improves your overall cybersecurity posture.

SecuritySolutionsWatch.com: In your opinion, does a misperception exist within the IT world between the terms "risk management" and "risk assessments"?

Rob Wigley: This is a very good question and one that I find myself explaining quite frequently. A risk assessment is just one step in a risk management framework: the process of identifying and analyzing risk by determining the potential threats and vulnerabilities associated with an IT system. Risk management, by contrast, also includes categorizing systems to determine their criticality, selecting and implementing security controls to reduce risk to an acceptable level, and reviewing the regulations, policies, and standards affecting the security of the information.
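
To make the distinction concrete, here is a minimal sketch of the assessment step alone: scoring identified threat/vulnerability pairs by likelihood and impact, in the spirit of qualitative methods such as NIST SP 800-30. The scales and findings are illustrative, not taken from the interview:

```python
# Illustrative risk assessment step: qualitative likelihood x impact scoring.
LEVELS = {"low": 1, "moderate": 2, "high": 3}

def risk_score(likelihood: str, impact: str) -> int:
    """Simple qualitative score: likelihood level times impact level."""
    return LEVELS[likelihood] * LEVELS[impact]

# Hypothetical findings from threat and vulnerability identification.
findings = [
    {"threat": "phishing leading to credential theft", "likelihood": "high",     "impact": "moderate"},
    {"threat": "unpatched web server exploited",       "likelihood": "moderate", "impact": "high"},
    {"threat": "physical theft of backup media",       "likelihood": "low",      "impact": "high"},
]

# Rank findings so the broader risk management process can decide which
# controls to select, implement, and monitor.
ranked = sorted(findings, key=lambda f: risk_score(f["likelihood"], f["impact"]), reverse=True)
for f in ranked:
    print(risk_score(f["likelihood"], f["impact"]), "-", f["threat"])
```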

Another very important process within risk management is monitoring security controls. Because of the expanding threat landscape and the sophistication of attackers, periodic risk assessments are no longer sufficient for many organizations. Continuously monitoring for threats and vulnerabilities has become critical to support risk management decisions.
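
As a simplified illustration rather than any specific HP implementation, continuous monitoring can be pictured as a scheduler that repeatedly re-evaluates each control's status instead of waiting for the next periodic assessment. The checks, control names (loosely styled after NIST SP 800-53 identifiers), and interval below are placeholders:

```python
# Illustrative continuous-monitoring loop over a small set of control checks.
import time
from typing import Callable, Dict

def check_patch_level() -> bool:
    # Placeholder: in practice, query a patch-management or CMDB system.
    return True

def check_malware_signatures_current() -> bool:
    # Placeholder: in practice, query the endpoint-protection console.
    return False

CONTROLS: Dict[str, Callable[[], bool]] = {
    "SI-2 flaw remediation": check_patch_level,
    "SI-3 malicious code protection": check_malware_signatures_current,
}

def monitor(interval_seconds: int, cycles: int) -> None:
    """Re-evaluate every control each cycle and flag failures for follow-up."""
    for _ in range(cycles):
        for name, check in CONTROLS.items():
            status = "PASS" if check() else "FAIL - open a risk item"
            print(f"{name}: {status}")
        time.sleep(interval_seconds)

if __name__ == "__main__":
    monitor(interval_seconds=0, cycles=1)  # single pass for demonstration
```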

There are some well-established risk management frameworks and standards that explain in detail the processes for all aspects of risk management and risk assessment, including publications from the International Organization for Standardization (ISO) and the National Institute of Standards and Technology (NIST).

SecuritySolutionsWatch.com: With everything moving into the Cloud, can we take a moment and drill down specifically into Cloud Security? What resources does HP offer its clients to help design and deploy a secure cloud strategy?

Rob Wigley: Many surveys indicate that one of the top concerns organizations have in moving to the cloud is the security of their information. Security concerns are not unique to cloud computing; cloud is just one of many disruptive technology trends that organizations are facing. HP offers a comprehensive, secure portfolio for cloud computing - from private and virtual private cloud to public cloud and hybrid environments - serving customers in both the private and public sector market segments.

To help clients address their concerns on cloud security, we establish a risk-based approach. First, we assess our client's risk tolerance profile, compliance requirements, operational requirements, organizational capabilities, and resources. We typically do this within HP Cloud and HP Security Discovery Workshops with the client. We then look to transform the client's environment.

To assist in that transformation process, HP Security Architects help clients develop a secure reference architecture. This provides a common set of essential architectural design artifacts that can be tailored to fit the needs of a particular cloud delivery model. It also provides a checklist against which architects and engineers can verify that they have covered all the necessary security requirements in their design solution. Next, HP consultants implement secure application design and deployment practices and secure data management in the cloud. As I mentioned previously, security needs to be built into the infrastructure and applications early in the design cycle to reduce attack surfaces.
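
HP's reference architecture itself isn't published in the interview, but the checklist idea can be sketched simply: required security controls are captured as data, and a proposed design is validated against them before deployment. The control names and design entries below are hypothetical:

```python
# Illustrative checklist validation of a proposed cloud design.
REQUIRED_CONTROLS = {
    "encryption_at_rest",
    "encryption_in_transit",
    "centralized_logging",
    "multi_factor_authentication",
    "network_segmentation",
}

def gaps(design: dict) -> set:
    """Return the required controls the design does not yet satisfy."""
    return {control for control in REQUIRED_CONTROLS if not design.get(control, False)}

proposed_design = {
    "encryption_at_rest": True,
    "encryption_in_transit": True,
    "centralized_logging": False,
    "multi_factor_authentication": True,
    # network_segmentation not yet addressed
}

print("Design gaps to address before deployment:", sorted(gaps(proposed_design)))
```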

SecuritySolutionsWatch.com: Can you highlight any Cybersecurity process/methodologies HP follows?

Rob Wigley: For our U.S. Public Sector clients, HP routinely uses the NIST Special Publications (800 Series), an extensive set of documents developed collaboratively by government, academia, and the private sector and available to the public. These publications range from very specific technical requirements and guidance for IT systems to strategic guidance at the organizational and business-process level.

HP has developed an Enterprise Security Framework that encompasses end-to-end security. Our end-to-end approach incorporates the capabilities of HP ArcSight, Fortify and TippingPoint, along with our suite of Risk Management Consulting Services and Managed Security Services.

To support our clients with this framework, HP has developed a risk-based methodology, "Assess, Transform, Optimize, Manage" (or ATOM), that helps organizations reduce risk in a cost-effective manner. We Assess our client's risk tolerance profile, compliance requirements, operational requirements, organizational capabilities, and resources. We then work to Transform our client's environment, structuring and prioritizing their security issues and undertaking remediation projects with them. Next, we Optimize and broaden our client's level of security awareness. We help them continually monitor their environment and proactively recommend operational and process improvements that can deliver an optimized security and risk posture. We also Manage the associated transformation programs required to deliver security in the most effective way for the enterprise. In this phase, we can also reduce cost by leveraging our worldwide security operations centers.

SecuritySolutionsWatch.com: Any final thoughts you'd like to share?

Rob Wigley: Cyber threats are real and growing, and most organizations are overwhelmed by the increasing risk to their business. When organizations adopt a comprehensive risk management program, they experience significant benefits in compliance achievement, reduced risk, and better decision-making. Risk management is most effective when it's ingrained within an organization's culture. The role of senior leaders must be to establish and emphasize the need for a strong risk management program; failure to do so poses a significant risk to organizational objectives.

This interview originally appeared in SecuritySolutionsWatch.com. Republished with permission.
