The responsibility for managing that balance falls on the shoulders of Manager of IT David Kirchhoff and his team. Organizations like Brigham understand that technology is a competitive weapon and are committed to building an IT infrastructure that delivers a business advantage. But they also know that with that commitment comes a dose of data storage reality. This is especially true of the data sets Brigham uses for 3-D reservoir characterization, which can run to several terabytes each. The storage system they select is critical to keeping the business running and ensuring the productivity of Brigham's geologists as well as its IT staff. The core storage system Brigham uses is the Isilon IQ clustered NAS solution. "We have been very pleased with our Isilon system. It is easy to manage, easy to scale and very cost effective. We have most of our raw SEG-Y data and GeoFrame data on it, in addition to other geo-seismic data sets. We're also hosting VMware virtual machines on it, along with user home directories," says Kirchhoff.


While the Isilon system is easy to manage and scale, that scaling of course means buying additional capacity. At some point, Brigham would have to archive projects to tape, which meant long restore times if those projects were ever needed again. The restore process involves identifying which tapes hold the project, retrieving them from off-site storage and then executing a multi-terabyte recovery, a very time-consuming hit to the IT staff's already overloaded day.


There is also the physical footprint the storage system consumes. Brigham was already approaching 100 terabytes of capacity in their current environment and was in a near-constant buying cycle for additional storage. They were also nearly out of physical space in their data center. The physical limits on adding systems, combined with the challenge of powering and cooling them, were becoming a bigger concern than the cost of buying additional storage.


Brigham recently evaluated Ocarina Networks to see if its solution could help solve these challenges. Ocarina is the developer of a storage data reduction product known as the Ocarina ECOsystem, which compresses and deduplicates data. The Ocarina solution can either optimize data and archive it to another storage system, or optimize it and leave it in the primary storage system.


The product is a hybrid solution. It performs its optimization step out-of-band, for maximum space savings and no impact on storage system performance, but handles reads in-band, so users see a transparent reconstruction of their data. This combination keeps both the storage system and its users productive. There was concern about how well traditional deduplication appliances could optimize the pre-compressed formats unique to geo-exploration, such as SEG-Y files, but Ocarina's specialized compressors were able to shrink SEG-Y data by up to 50%. The out-of-band optimization step provides the additional time needed to fully optimize pre-compressed data sets like SEG-Y.
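As a rough illustration of that out-of-band/in-band split, the sketch below uses stock gzip in Python to mimic a background compression pass paired with a transparent read path. It is a conceptual toy under those assumptions only; Ocarina's content-aware SEG-Y compressors and appliance internals are not represented here.

    import gzip
    import os
    import shutil

    def optimize_out_of_band(path):
        """Background pass: gzip the file to path + '.gz' and remove the original."""
        with open(path, "rb") as src, gzip.open(path + ".gz", "wb") as dst:
            shutil.copyfileobj(src, dst)
        os.remove(path)

    def read_transparent(path):
        """In-band read path: return the original bytes whether or not the file
        has been optimized, decompressing the '.gz' copy on the fly if needed."""
        if os.path.exists(path):
            with open(path, "rb") as f:
                return f.read()
        with gzip.open(path + ".gz", "rb") as f:
            return f.read()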


An additional problem with other deduplication products under consideration is that they require a separate storage tier, meaning the data must be sent to that tier rather than being optimized directly in primary storage. "The ability to optimize the data and leave it in place appealed to us. Many of the geo-seismic applications that we use have very specific and somewhat elaborate directory structures, and moving that data around has proven to be problematic for the software. The last thing I can afford is to have one of our projects come to a halt because a file can't be found. The exploration process is too time consuming and too competitive to afford that risk," explained Kirchhoff.


The first step in evaluating Ocarina was to run the optimizer against what Kirchhoff expected to be the most challenging data set, their GeoFrame data, to see what kind of reduction it could deliver. "This is a fairly dense data format already; I was going to be happy with a 10% to 20% reduction." Instead, after some fine tuning from Ocarina, Brigham is seeing a 30% to 40% overall reduction in capacity consumed. "A 30 to 40% reduction on potentially 100TBs of data caught my attention," said Kirchhoff. The graph below shows the actual savings by file type that Brigham is experiencing.
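To put those percentages in perspective, the quick calculation below applies the quoted 30% to 40% range to the roughly 100 TB Brigham has under management; these are the article's round numbers, not measured per-share results.

    # Back-of-the-envelope check of the reduction figures quoted above
    capacity_tb = 100                  # approximate capacity under management
    for reduction in (0.30, 0.40):
        print(f"{reduction:.0%} of {capacity_tb} TB frees roughly {capacity_tb * reduction:.0f} TB")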

[Graph: actual savings by file type across Brigham's 10 most common file types]

Ocarina's success at compressing the seismic data, combined with the fact that Ocarina could optimize the data in place, led Brigham to move a pair of 2U Ocarina optimizers into production. As data is optimized, the now-smaller files stay on the same Isilon shares they occupied before. This saves Kirchhoff the trouble of managing another tier of storage and tracking down relocated data, and it avoids any application issues that might arise from data sitting in different locations.


One concern was the performance impact of reading data back from its compressed state. Kirchhoff's informal testing showed no noticeable delay when reading from the optimized version of the data; there were no lags in displaying, time-slicing or viewing the data in 3-D. The other important consideration was that the Ocarina system allowed Kirchhoff to use policies to designate which projects are optimized; the system was set up to compress older projects that had reached 120 days of inactivity.
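As an illustration of that kind of age-based policy, the sketch below walks a set of project directories and flags those with no file modified in the last 120 days as optimization candidates. The /ifs/projects path and the scan logic are assumptions made for the example; this is not Ocarina's actual policy engine.

    import os
    import time

    INACTIVITY_DAYS = 120
    PROJECTS_ROOT = "/ifs/projects"    # hypothetical Isilon share path

    def latest_mtime(root):
        """Return the most recent modification time of any file under root."""
        newest = 0.0
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                try:
                    newest = max(newest, os.path.getmtime(os.path.join(dirpath, name)))
                except OSError:
                    pass   # file vanished mid-scan; skip it
        return newest

    cutoff = time.time() - INACTIVITY_DAYS * 86400
    for project in sorted(os.listdir(PROJECTS_ROOT)):
        path = os.path.join(PROJECTS_ROOT, project)
        if os.path.isdir(path) and latest_mtime(path) < cutoff:
            print(f"candidate for optimization: {path}")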


The system has worked so well that Kirchhoff has already decided to expand its use beyond the geophysical data. Already underway is the use of the product to reduce the size of user home directories, where they are finding that the optimizer works well with pre-compressed formats such as Microsoft Office files.


Going forward, Kirchhoff expects to use the product on the directories that act as targets for their disk-to-disk backups, as well as on the TIFF-heavy geology projects the company is involved with.


"Overall we are very satisfied with Ocarina Optimizer thus far. We will likely free up 20 to 30TBs worth of data, which will be the fastest return on investment I've ever seen in IT, but most important is the time this saves us by not having to constantly move data back and forth between different system. Everything can now reside, cost effectively, in one place." concluded Kirchhoff.


This Article Sponsored by Ocarina Networks