Zycko has launched an online tool that can help IT bosses analyse the massive growth of unstructured data within their organisation
Value added distributor Zycko has launched an online analysis tool for organisations struggling with the massive growth of unstructured data (emails, documents, videos, etc). The aim is to give them a better understanding of the most appropriate and cost-effective storage options for their organisation.
Zycko said that its Storage File Analyser has already been employed by over 20 organisations, revealing up to 400 percent over-use of primary disk space. The company also said that in a recent study of 40 million files (25TB) that it analysed, over 6TB (or nearly 25 percent of the data) was taken up by email archives, resulting in unnecessary duplication.
Similarly, only about 20 percent of the data files had been accessed in the last six months.
“We get involved in a lot of data storage projects, and we see a lot of different storage methods out there,” said David Galton-Fenzi, Zycko’s Group Sales Director, speaking to eWEEK Europe UK.
“Zycko sells business to business, and we saw a requirement for a consultative tool to empower our resellers, so they can go to the end user and show them a profile of their storage systems,” he said. “We were astonished to find out how inefficient some end-user storage policies were. Even in some enterprise-sized organisations the tool found that many files had not been modified for six months, but they were still backing these files up every night.”
“This is an astonishing lack of management, and these files are not only costing a lot of money to store, considering equipment costs, but are also costing because of the heat and energy requirements as well,” he added. “If 60 percent of data hasn’t been reviewed or monitored for over six months, why are they not asking themselves questions, especially as they are backing up these files every day?”
Galton-Fenzi acknowledges that many organisations are struggling to deal with the growth of unstructured data.
“The need to deal with unstructured data will be our greatest IT challenge in the next four or five years,” he said. “There has been an explosion of digital content, with some analysts pointing to 50 to 80 percent compound growth of data storage per year. Up until now, people have not been controlling this, but they have just been buying more and more capacity.”
“Deduping data is not rocket science, but the problem is implementing these policies in the first place,” he said. “We are targeting this tool at the MD or FD, rather than the IT manager, so they can question the IT manager about all these files. For example, we found one enterprise organisation had up to 0.5TB (or 500GB) of data in a recycling bin, simply because they had not set a rule.”
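The basic hash-based deduplication Galton-Fenzi alludes to really is straightforward. The sketch below is a simplified illustration in Python, not Zycko's implementation: it walks a directory tree, hashes each file's contents, and reports any groups of identical files.

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Group files under `root` by content hash; any group with
    more than one path is a set of exact duplicates."""
    by_hash = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            digest = hashlib.sha256()
            try:
                with open(path, "rb") as f:
                    # Hash in 1MB chunks so large files do not exhaust memory
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        digest.update(chunk)
            except OSError:
                continue  # unreadable file: skip rather than abort the scan
            by_hash[digest.hexdigest()].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

A production tool would typically compare file sizes first and only hash size-matched candidates, but the principle is the same.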
“Until now, when an organisation wanted to profile its unstructured data, the supplier would send agents to the site for weeks at a time,” said Galton-Fenzi. “These agents install software and firewalls, interrupting business continuity and drawing out the data collection process.”
Galton-Fenzi said the Storage File Analyser is benign software, which can be downloaded from the Zycko website and collects the raw data in less than a day. “The analysis is then conducted on our premises, with no further disruption to the client.”
“It is a sniffer that collects data (only file types, not file content), and it gives you a list of those files (.xls .pdf .doc) and who owns them (user or department). From that you can work out how many are duplicate files,” he said. The tool analyses files by modification date, which has direct implications for backup and recovery, and also analyses them by date accessed and date created, which can be used to determine the optimum storage media.
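A metadata-only scan of the kind described, reading timestamps and extensions but never file contents, can be sketched in Python. This is a hedged illustration, not Zycko's code; the summary fields and the roughly-six-month staleness threshold are assumptions chosen to match the figures quoted above.

```python
import os
import time
from collections import Counter

SIX_MONTHS = 182 * 24 * 3600  # roughly six months, in seconds

def profile_storage(root):
    """Collect per-file metadata (no contents) and summarise:
    file counts by extension, plus how many files are 'stale',
    i.e. neither accessed nor modified in about six months."""
    now = time.time()
    by_extension = Counter()
    stale = total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            total += 1
            ext = os.path.splitext(name)[1].lower() or "<none>"
            by_extension[ext] += 1
            if max(st.st_atime, st.st_mtime) < now - SIX_MONTHS:
                stale += 1
    return {"total": total, "by_extension": dict(by_extension), "stale": stale}
```

Because only `os.stat` metadata is read, a scan like this is fast and non-intrusive, which is what makes the agentless, sub-one-day collection described above plausible.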
Zycko said that after the analysis, organisations will be in a better position to establish which storage options are most appropriate and cost effective for them. These could include anything from deduplication to virtualisation, archiving or compression.
Galton-Fenzi admits this is not the first file analyser on the market, but Zycko offers the first 3TB of analysis for free. “It is a good tool, and from there, we have developed a backup analytical tool as well. Backup is an obvious area for it to move to.”