Where Is My Disk Space?

An HTML 5 File System Visualizer

by Nick Mueller, Zetta.net

Hello new users! The file system visualizer can be found at wheresmydiskspace.com - continue reading to learn more about the development of the tool and the visualization options.

Before buying more storage space it's a good idea to make sure your existing space isn't filled with redundant or old data - or hundreds of downloaded cat videos.

Disk capacity keeps increasing and prices continue to drop, but those savings are offset by the demand for capacity to store more and larger files. Not only does this mean more primary disk space, but twice that amount for backups.

Zetta co-founder Lou Montulli may have the answer to this problem. Recently Lou combined his experience with browsers and storage in creating an open-source tool - a File System Visualizer (www.wheresmydiskspace.com) - for analyzing storage usage.

Lou was a founding engineer at Netscape in 1994, when he helped create the first commercial web browser, Netscape Navigator. Over the years he's been responsible for many browser-related innovations, and he co-founded Zetta.net in 2008, where he continues to serve as VP of Engineering and Chief Scientist.

"The tool was conceived as a method for visualizing multiple aspects of any large file set: an existing file system, a backup or an archive," he says. "This can be a great tool to use if you find yourself running low on disk space and need to find files to delete to free up space."

The tool makes heavy use of the Data-Driven Documents (D3.js) JavaScript library together with jQuery, Dojo, PrettyPhoto and Scalable Vector Graphics (SVG). Sponsored by Zetta, all the source code for the File System Visualizer is available, and a BSD license allows anyone to use it, commercially and non-commercially, free of charge.

"Part of the challenge and opportunity of this tool was writing it in JavaScript and using HTML as the user interface," Lou says. "I was part of the team who wrote the very first web browsers, so I was personally motivated to design a tool that takes advantage of some of the great new technologies coming out of HTML5, Mozilla.org and the broader web community."

Getting Started
The File System Visualizer is free to use and doesn't require installing any software. You just need a web browser that supports SVG and has a fast JavaScript engine. Go to www.wheresmydiskspace.com. The home page has a few video demonstrations of the product that you can view before running the software.

Or you can:

  1. Click the link at the top of the page to take you directly to the visualizer.
  2. There you have three options: you can look at some sample data sets, use a Java applet to collect the data from your local machine and create a manifest file detailing what is in the file system, or you can load a manifest file created in a previous scan.
  3. If you choose to do a new scan, and there are a large number of folders, the software will prompt you to save the manifest to your disk rather than keeping it in the browser.

Test Setup
We recently tested the File System Visualizer on a Windows 7 desktop with a third-generation Intel Core i7 processor and 16 GB of RAM. The scan took approximately five minutes and, when complete, reported that there were 52,993 folders.

The software can analyze a local disk, or an administrator can run it remotely on any mountable drive. At this point it runs on Windows (32-bit and 64-bit) and OS X.

Visualizing Your Data
After running the scan, the software then presents seven different views of the data. The views are illustrated at the top of the page and you can click on any of the images to access that view of the data.

Summary Page - This showed that the test computer had 353.1 GB of data in 52,993 folders containing 364,931 items, with an average file size of 967.7 KB.
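The summary figures are simple aggregates over the scan data. As a sketch, here is how those numbers could be computed from a flat list of records; the record shape is an assumption, not the tool's actual data model.

```javascript
// Compute Summary Page-style figures from a flat list of scan records.
function summarize(records) {
  const files = records.filter(r => r.type === "file");
  const folders = records.filter(r => r.type === "dir");
  const totalBytes = files.reduce((sum, f) => sum + f.size, 0);
  return {
    folderCount: folders.length,
    itemCount: files.length,
    totalBytes,
    averageFileSize: files.length ? totalBytes / files.length : 0,
  };
}
```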

Visual Tree - This gives a hierarchical tree visualization of the data. On the left is a pull-down menu where you can choose to view the data by size, by type or by date. There is also a slider where you can set the tree display depth from one to seven levels.

Screenshot of the Tree View

Viewing by size shows a hierarchical view of the file system and the amount of data in each folder with up to seven levels of depth. To look at just the contents of a single folder, rather than the entire file system at once, just click on the dot next to that folder.
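The size view depends on rolling each file's size up into every folder above it. A sketch of that aggregation, assuming "/"-separated paths for simplicity (not the tool's actual implementation):

```javascript
// Roll file sizes up into cumulative per-folder totals, so that every
// ancestor folder's total includes the files beneath it.
function folderSizes(files) {
  const totals = {};
  for (const f of files) {
    const parts = f.path.split("/").slice(0, -1); // drop the file name
    for (let depth = 1; depth <= parts.length; depth++) {
      const folder = parts.slice(0, depth).join("/");
      totals[folder] = (totals[folder] || 0) + f.size;
    }
  }
  return totals;
}
```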

Viewing by type at the first level divided the data into known types and uncategorized. Going to the second depth level divided the uncategorized by their file extension and the categorized into groups such as disk images, games, database, software development, fonts, plugins, office types, settings, executables, media, backup and system. For most of those categories, going to the next level would give the file extensions, but some categories (media, office types and encodings) would further subdivide before getting to their final level.
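The first two levels of the type view amount to mapping extensions into coarse categories and keeping per-extension totals within each. Here is a small sketch of that grouping; the category table below is illustrative only, not the tool's real taxonomy.

```javascript
// Illustrative extension-to-category table (an assumption, not Zetta's).
const CATEGORIES = {
  jpg: "media", png: "media", mp4: "media",
  doc: "office types", xls: "office types",
  exe: "executables",
};

// Group files into categories, with per-extension byte totals inside each.
function groupByType(files) {
  const groups = {};
  for (const f of files) {
    const ext = f.path.includes(".") ? f.path.split(".").pop().toLowerCase() : "";
    const category = CATEGORIES[ext] || "uncategorized";
    const key = ext || "(none)";
    groups[category] = groups[category] || {};
    groups[category][key] = (groups[category][key] || 0) + f.size;
  }
  return groups;
}
```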

Viewing by date, the first level divides the data into "1 year and older" and "within 1 year" and shows the GB of data in each category. Taking it to the second level splits the "within 1 year" branch into five levels and the "1 year and older" into each of the years for which you have data. There is no third level available.
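The first level of the date view is a simple age bucketing on each file's modification time. A sketch of that split, where the one-year cutoff is inferred from the labels described above:

```javascript
const YEAR_MS = 365 * 24 * 60 * 60 * 1000;

// Split files into the date view's two first-level buckets by byte total.
// The cutoff rule here is an assumption based on the bucket labels.
function bucketByAge(files, now) {
  const buckets = { "within 1 year": 0, "1 year and older": 0 };
  for (const f of files) {
    const label = now - f.mtime < YEAR_MS ? "within 1 year" : "1 year and older";
    buckets[label] += f.size;
  }
  return buckets;
}
```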

Hierarchical List - This view presents the data in list rather than tree format. To get to deeper levels, click the + sign next to any of the categories. In addition to the file names, there are columns for Size in Directory, Total Size and % with children. When you click on the headers for the columns, up and down arrows appear, making it look like the data is sortable by those columns, but it isn't.

Flattened List - This is a sortable, non-hierarchical list of the folders. When viewing by Size, in addition to File Name, there are seven other sortable columns of data in each folder, including Size and Number of Items. The Type and Date views are similarly sortable. In none of these views can you look at a subtree, only at the entire file system. To view a subtree, go to one of the other views and narrow it down to the subtree and view type you want, and then click on the Flattened List visualization.
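Conceptually, the Flattened List just discards the hierarchy and sorts what remains. A sketch of that transformation over per-folder totals (sorting by size, descending, as one of the sortable columns would):

```javascript
// Flatten a map of folder totals into a list sorted by size, descending.
function flattenedList(totals) {
  return Object.entries(totals)
    .map(([name, size]) => ({ name, size }))
    .sort((a, b) => b.size - a.size);
}
```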

Your hard drive in "sun burst" view.

Sunburst - A type of pie chart, with rings showing each of the levels of depth. The chart can display each slice as an even size, or can adjust the sizes by the file count or amount of data in the slice. Clicking on any of the slices will move that folder or data point into the center circle, with the rings showing the subfolders or subcategories of that particular subdirectory.
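In the "sized by amount of data" mode, each slice's angular width is proportional to its share of the ring's total. A sketch of just that geometry (the tool's actual rendering is done with D3 and SVG):

```javascript
// Compute start/end angles (radians) for one sunburst ring, with each
// slice's sweep proportional to its share of the total size.
function sunburstAngles(slices) {
  const total = slices.reduce((sum, s) => sum + s.size, 0);
  let start = 0;
  return slices.map(s => {
    const sweep = (s.size / total) * 2 * Math.PI;
    const arc = { name: s.name, start, end: start + sweep };
    start += sweep;
    return arc;
  });
}
```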

Tree Map - A box type view of the data. As with the Sunburst, the boxes can be sized equally, or sized by data size or number of files. Clicking on any of the boxes will show the details within that subdirectory or data type.

Bubble Chart - This gives two layout options for showing the data: Bubble Chart or Circle Pack. The Bubble Chart shows bubbles for all the items in that category sized by the amount of data in that folder or file type. The Circle Pack presents a hierarchical view of the bubbles. In either view, clicking on a bubble or circle will give the bubbles showing the subcategories of that item.
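When bubbles are "sized by the amount of data," the honest way to do it is to make area, not radius, proportional to size, which means radius scales with the square root. A sketch of that sizing rule (an assumption about the chart's behavior, not confirmed from its source):

```javascript
// Size bubbles so area is proportional to data size: radius ~ sqrt(size).
// The largest item gets maxRadius; everything else scales down from it.
function bubbleRadii(items, maxRadius) {
  const maxSize = Math.max(...items.map(i => i.size));
  return items.map(i => ({
    name: i.name,
    radius: maxRadius * Math.sqrt(i.size / maxSize),
  }));
}
```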

Conclusion
The File System Visualizer is a quick and easy way to understand what's on your file system. It's intuitive to use, and within minutes you can start locating what is taking up disk space. You can then delete or archive anything that is no longer needed, or establish policies to prevent wasted space. And if additional storage is still required, you can give management a clear visual presentation of how storage is being used in your environment. You can start visualizing your hard drive right now.

Nick is Zetta's Corporate Reporter, and has been writing and telling stories about technology with blogs, social media, and content marketing since the days when the BBS reigned.
