Attunity: Moving Big Data for Dollars
September 17, 2012 Alex Woodie
A year after acquiring fellow data replication software vendor RepliWeb, Attunity is remaking itself as a provider of Big Data pumps. The company, which recently moved up from the pink sheets to trade on the NASDAQ Capital Market, excels at moving large amounts of data quickly. Thanks to its new partnership with Amazon Web Services (AWS), a whole lot of data is going to go through the Attunity pump.

IBM i shops may have heard of Attunity for its change data capture (CDC) software, called Attunity Stream. That product excels at continuously pulling updates from source databases, such as DB2/400, into other relational databases, most commonly Microsoft SQL Server. Attunity, after all, is also an OEM partner of Microsoft, and its technology is sold as part of SQL Server Integration Services (SSIS). The company has established an IBM i customer base that includes companies in the hospitality industry as well as manufacturers and distributors, such as EMI Music North America.

Since its acquisition of RepliWeb for $8 million last September, however, the Burlington, Massachusetts, company has changed its approach dramatically. Instead of focusing entirely on moving the structured data that sits in relational database management systems, Attunity now also moves unstructured and semi-structured data, which is RepliWeb's forte. The acquisition has worked out well for Attunity and has helped the company redefine itself.

Matt Benati, Attunity's vice president of global marketing, recently gave IT Jungle the lowdown on Attunity's Big Data aspirations.

"What Attunity aspires to be, and the reason we acquired RepliWeb, is frankly we aim to be data agnostics, or poly-structured as some people call it," Benati says. "We don't care about the structure of the information, or the unstructured nature of it. Customers need it to go from A to B to get value out of it. And that's what Attunity is all about. Whatever information you need moved, we're your player. We can do it when and where you need it. That's our universal availability model for Attunity."

That universal model was on full display in July, when Attunity rolled out its first cloud offering, called Attunity CloudBeam. The offering is a software as a service (SaaS) version of its data replication technology that runs on the Amazon cloud. According to Benati, the service has enabled AWS customers to move data, in some cases, 100 times faster than was previously possible with AWS utilities.

Analytics is another Big Data interest for Attunity. Also in July, the company announced a deal with EMC to support that company's Greenplum analytic appliances. According to Benati, Attunity worked closely with EMC's Greenplum team to optimize the connection between Greenplum and Attunity, providing very fast loads of source material into the analytic machine.

Analytics has always been one of the main drivers behind the desire to move massive amounts of data from one location to another as quickly as possible. In the IBM i world, this usually meant moving customer or sales data from a DB2/400 production database to an Oracle or SQL Server database for analysis. The analytic needs are similar today, but the infrastructure pieces and technologies are changing. For example, ETL (extract, transform, and load) is no longer a critical component with the new generation of large analytic machines.

"The model of ETL had much more applicability 10 years ago than it does today," Benati says.
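To make the contrast Benati is drawing concrete, here is a minimal, hypothetical sketch in Python, using throwaway SQLite tables rather than anything Attunity ships. The staged, ETL-style path writes extracted rows to a disk file before loading them into the target, while the streaming path feeds the source cursor straight into the target so the rows stay in memory the whole way. All table and function names here are illustrative.

import csv
import sqlite3
import tempfile

def staged_etl_copy(source, target):
    # ETL-style: extract to a staging file on disk, then load it; the data "touches down."
    with tempfile.NamedTemporaryFile("w+", newline="") as staging:
        writer = csv.writer(staging)
        for row in source.execute("SELECT id, amount FROM sales"):
            writer.writerow(row)                      # extracted rows land on disk
        staging.seek(0)
        rows = ((int(i), float(a)) for i, a in csv.reader(staging))
        target.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    target.commit()

def streaming_copy(source, target):
    # Streaming: iterate the source cursor straight into the target; no staging file.
    rows = source.execute("SELECT id, amount FROM sales")
    target.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    target.commit()

if __name__ == "__main__":
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
    src.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 9.99), (2, 24.50), (3, 3.75)])
    for copier in (staged_etl_copy, streaming_copy):
        tgt = sqlite3.connect(":memory:")
        tgt.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
        copier(src, tgt)
        print(copier.__name__, "copied", tgt.execute("SELECT COUNT(*) FROM sales").fetchone()[0], "rows")

Real replication products layer CDC, transformation, parallelism, and recovery on top of this, but the disk-versus-memory distinction is the performance point Benati goes on to make.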
Benati continues: "In today's world, with the advent of these high-powered, parallelized analytics devices, purpose built for analytics, like [IBM's] Netezza or Greenplum, they do all the data transformations and processing when they get it there. And they're so much faster than using ETL on a general purpose CPU box."

When ETL is no longer a major factor, speed becomes the main factor, he says.

"People want to get data from source to target as fast as possible, so they can do the analytics as fast as possible. Data movement is the longest duration of that process," he says. "Attunity saw this trend, and built Replicate for this process."

Replicate was developed to keep the replicated data in memory at all times and never write it to disk. "The data never touches down," he says. "In an ETL process, the data touches down and goes to disk multiple times. But Replicate streams the data straight across from source to target. That is a huge performance gain. We clearly, unabashedly state that our performance is much better than others."

Attunity can't prove those claims. It would like to, but it is still a small company, and it cannot afford the large sums required to commission an independent source to test and validate the performance of one piece of software against another. In the meantime, it will have to focus on developing and selling software, which appears to be going well enough.

Today Attunity has more than 2,000 customers; RepliWeb had about 1,500 when it was bought. The company boasts seven straight quarters of revenue growth, although it has reported several losses. It did squeak out a $500,000 profit for the quarter ended June 30, on revenues of $6.4 million. On August 9, just as the company was moving up from the OTC:BB exchange, Attunity's chairman and CEO, Shimon Alon, got to ring the NASDAQ's closing bell.

RELATED STORIES

Attunity Unveils New Data Replication Suite

Attunity Signs OEM Deal with Microsoft for SSIS

Attunity Updates CDC Software for SQL Server

Insurance Company Chooses Attunity for DB2/400-to-SQL Server Replication

Equipment Dealer Lauds Attunity for Speedy DB2/400 Replication

Attunity Delivers New Data Replication Agent for DB2/400

Music Company Picks Attunity to Help Connect Bricks with Clicks

Attunity Web Services Gateway Extends 'Legacy' OS/400 Apps

This article has been corrected. Amazon Web Services is not an OEM partner of Attunity. IT Jungle regrets the error.