|By Maureen O'Gara||
|November 21, 2012 08:00 AM EST||
HP Tuesday morning charged its Autonomy acquisition with massive fraud.
It claims it was the victim of accounting improprieties and disclosure failures by the publicly held British company prior to its acquisition by HP last year, and of downright misrepresentations made to HP in connection with the ultimately $12 billion-plus acquisition.
As a result HP will write off most of the inflated purchase price and take an $8.8 billion non-cash charge, worse than the $8 billion goodwill charge HP took this summer on its EDS acquisition.
HP has turned the case over to the SEC and the UK's Serious Fraud Office for civil and criminal prosecution. HP CEO Meg Whitman said the company would also press its own civil and criminal charges against certain unnamed individuals seeking redress for its benighted stockholders. She expects it all to take years.
HP said that after Autonomy founder Mike Lynch was fired earlier this year because the unit was significantly underperforming, an unidentified senior member of Autonomy's staff blew the whistle on the company's inflated margins and phony growth rates, and the sleight of hand that got it so overvalued.
A seven-month internal investigation ensued, leading HP to conclude that Autonomy was overvalued all along.
That investigation is still going on but it found that Autonomy misstated its financial performance, including its revenue, core growth rate and gross margins, and misrepresented its business mix.
Margins were inflated from 28% to 45% by passing low-end hardware sales off as IDOL software license sales; VAR license fees were booked as revenue when no end customer existed at the time of the sale; VARs paid high up-front fees in exchange for lower future fees; and revenues were pulled forward. The idea was to suggest better growth than actually existed.
"This appears to have been a willful effort on behalf of certain former Autonomy employees to inflate the underlying financial metrics of the company in order to mislead investors and potential buyers," HP said in a statement. "These misrepresentations and lack of disclosure severely impacted HP management's ability to fairly value Autonomy at the time of the deal."
Whitman said HP's board, which voted to make the Autonomy acquisition, "feels terribly" about all this but depended on financials audited by Deloitte and re-audited by KPMG before doing the deal. "You have to depend on audited results," she said, although neither accounting firm tumbled to what had been going on.
Apparently PricewaterhouseCoopers' forensics team finally caught it after it was brought in in May.
According to Meg, "blogs" at the time of the acquisition questioned Autonomy's revenue recognition and what it called a sale, but Autonomy's management was allowed to run the reports to ground with the pair of original auditors and claimed there was nothing to them.
After listening to the HP conference call Tuesday morning, Lynch "flatly rejected" HP's allegations, "which are false."
Meg blamed her predecessor Leo Apotheker and former head of R&D and strategy Shane Robison. Due diligence at HP no longer reports to M&A, as it did when Apotheker was there, but it's kind of late in the day to give it to the CFO.
Robison has yet to be heard from, but Apotheker, who orchestrated the Autonomy acquisition, issued a statement to the Wall Street Journal saying he was "stunned and disappointed" to learn of the alleged accounting improprieties and defended HP's due diligence as "meticulous and thorough."
He pointed out that since Autonomy was a public company "much of the process relied on public financial reports - accounting statements approved, filed and backed by Autonomy's leadership, board and auditors" and offered to make himself available "to assist HP and the appropriate authorities to get to the bottom of this."
Autonomy's shenanigans account for $5.3 billion of the charge; the rest stems from HP's declining stock price, which has been in free fall since Apotheker announced HP was buying Autonomy and disposing of its PC unit in order to turn HP into a software company. The company quickly disposed of Mr. Apotheker, replacing him with Whitman, who went through with the Autonomy acquisition and decided to keep PCs. Most outsiders couldn't understand the Autonomy buy.
Although Autonomy's long-term financial performance is expected to be impacted by the mess, Whitman says the company is "100% committed to Autonomy" and expects it to play a "significant role" in HP's recovery.
The disclosure was part of Hewlett-Packard's quarterly report. It lost $6.85 billion, or $3.49 a share, compared to a profit of $239 million, or 12 cents a share, this time last year. Revenue was $29.96 billion, down 4%. Adjusted EPS was $1.16. Analysts were expecting adjusted earnings of $1.14 a share, on revenue of $30.44 billion.
HP said revenue from its PC unit was down 14% since commercial revenues dropped 13% and consumer revenue plunged 16%. Revenue in the enterprise, servers, storage and networking segment dropped about 9%. Printing sales fell about 5%. Services fell about 6%. Software revenue was up 14%.
HP's battered shares were down 12.48% to $11.64 in pre-market trades on the news, dropped further as the market woke up and then recouped slightly to $11.87, down 10.68%. Its stock has lost over 48% this year, and the company had $21.8 billion in long-term debt at the end of its fiscal fourth quarter.