How to Handle the “V” for Volume in Big Data on DB2 for z/OS

Posted by Frank Fillmore on March 14, 2013 under Big Data, DB2 for z/OS, IDAA, Netezza.

One of the challenges of Big Data is, well, its bigness.  There are two approaches you should consider – one fairly new, the other not yet delivered – for your data volume issues on DB2 for z/OS.

The first technique exploits the IBM DB2 Analytics Accelerator (IDAA).  IDAA is a Netezza (now PureData System for Analytics) appliance lashed together with DB2 for z/OS using a high-speed connection.  The same data typically resides in both DB2 and Netezza.  The DB2 optimizer determines if a particular dynamic query can be addressed at a lower cost by offloading that query to Netezza.

Until November 2012, all of the data needed to be present on both Netezza and DB2.  In IDAA version 3, that restriction was removed.  The attached presentation, IDAA HPSS, describes offloading older data to Netezza and eliminating it from DB2.  In a common use case the most current data (say, the last 30 days) remains in DB2, but older data (day 31 back through the preceding 2 years) resides *only* in Netezza.  This capability is called the High Performance Storage Saver (HPSS).

Based on the cost of mainframe DASD storage and other factors, it might be less expensive to store stale data in a “hot” back-end like the Netezza component of an IDAA than to continue to store that data natively in DB2 for z/OS.  From an API standpoint, you run your queries as if all of the data resides locally in DB2 for z/OS.  That way you can change which data reside in each of the IDAA components without impacting applications or requiring SQL changes.
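As a sketch of what that transparency looks like from the application side, query routing is controlled with the CURRENT QUERY ACCELERATION special register in DB2 for z/OS; the table and column names below are hypothetical, chosen to match the insurance example later in this post:

```sql
-- Allow DB2 to offload eligible dynamic queries to the accelerator,
-- falling back to native DB2 processing if the accelerator fails.
SET CURRENT QUERY ACCELERATION = ENABLE WITH FAILBACK;

-- The query itself is unchanged: whether older rows live in DB2 or
-- reside only on the Netezza side (HPSS), the SQL is the same.
SELECT claim_id, SUM(paid_amt)
  FROM claims
 WHERE loss_date >= CURRENT DATE - 2 YEARS
 GROUP BY claim_id;
```

The register can also be set to NONE (never offload) or ELIGIBLE/ALL, so the routing decision is a configuration choice rather than an application change.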

The second technique exploits temporal data functionality delivered in DB2 10 for z/OS.  Temporal tables enable you to see values for attributes of a particular business object (account, part, employee, customer, etc.) at a specific point-in-time (PIT).  As SQL UPDATEs are processed against a transaction table, deltas are recorded in a history table associated with the transaction table.  That way you can issue a query like:

SELECT coverage_amt FROM policy FOR SYSTEM_TIME AS OF '12-01-2010' WHERE id = 111

and see the coverage_amt at a point-in-time regardless of any intervening changes.  DB2 11 for z/OS – not yet Generally Available (GA), but in beta with Early Support Program (ESP) customers – builds on this technology by extending it to “archive” data.  The use case is the same as the one described above: for processing efficiency you want the last 30 days of transaction data to reside in one DB2 table and older data (days 31 to the preceding 2 years) to be stored in another.  Think insurance claims: most activity regarding a claim occurs within the first 30 days of a loss.  But for business and compliance reasons, you want to retain up to 2 years of data on spinning disks; even older data might be kept on offline storage.  Using this technique, all of the data resides natively in two different DB2 for z/OS tables (current and archive).  The benefit this time is segregating commonly accessed data for processing efficiency.  As was the case with IDAA/HPSS, the location of the data is transparent to the SQL.  You write the query for the data you want and DB2 determines whether it resides in the current data table, archive table, or both.
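As a minimal sketch of the DDL behind both halves of this technique, the following uses hypothetical table names; the DB2 10 temporal syntax is GA, but the DB2 11 archive syntax shown here is pre-GA and could change before general availability:

```sql
-- DB2 10: system-period temporal table plus its history table.
CREATE TABLE policy
  (id           INTEGER NOT NULL,
   coverage_amt DECIMAL(11,2),
   sys_start    TIMESTAMP(12) NOT NULL GENERATED ALWAYS AS ROW BEGIN,
   sys_end      TIMESTAMP(12) NOT NULL GENERATED ALWAYS AS ROW END,
   trans_id     TIMESTAMP(12) GENERATED ALWAYS AS TRANSACTION START ID,
   PERIOD SYSTEM_TIME (sys_start, sys_end));

CREATE TABLE policy_hist LIKE policy;
ALTER TABLE policy ADD VERSIONING USE HISTORY TABLE policy_hist;

-- DB2 11: pair the current table with an archive table.
CREATE TABLE policy_arch LIKE policy;
ALTER TABLE policy ENABLE ARCHIVE USE policy_arch;

-- Per-session switch: when SYSIBMADM.GET_ARCHIVE is 'Y', a query
-- against POLICY transparently includes rows from POLICY_ARCH.
SET SYSIBMADM.GET_ARCHIVE = 'Y';
SELECT coverage_amt FROM policy WHERE id = 111;
```

With GET_ARCHIVE set to 'N', the same SELECT touches only the current table, which is the processing-efficiency win described above.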

In summary:

|                 | Data segregated by         | Data movement            | Administrative overhead | Static SQL | Value prop            |
|-----------------|----------------------------|--------------------------|-------------------------|------------|-----------------------|
| IDAA with HPSS  | DB2 and IDAA               | Bulk load or replication |                         |            | Cost savings          |
| DB2 11 for z/OS | Current and Archive tables |                          |                         |            | Processing efficiency |

Big Data Forum in Virginia

Posted by Kim May on March 11, 2013 under Big Data, Frank Fillmore, IBM Information Management Software Sales, IBM Pure Systems, Netezza, Optim, TFG Blog.

I had the pleasure of attending and participating as a Business Partner in the IBM Big Data and Governance Forum road show event in Reston, Virginia, last week.  Many of the attendees were from Federal government agencies – no surprise – and most of the attendees, at least in my opinion, seemed to be trying to satisfy their curiosity about  Big Data. 

The first session presenter was Tracey Mustacchio, the Director of Marketing for Big Data.  Tracey came to IBM from Vivisimo, one of the latest IBM Information Management acquisitions.  She started her session with a standard show-of-hands opening.  The amazing thing was, with about 100 people in the room, only THREE were actually using any Big Data solutions in their IT shops.  3%!!  That meant – with the simplistic way I do math – the other 97% of the attendees were, like me, still trying to figure out how to apply this new technology.  And of course, overcome the skepticism that comes from hearing a solution referred to as a “game changer” a few too many times.

The information delivered was excellent, with the presenters using clear, understandable business use case examples to demonstrate the Big Data solutions at work.  I thought it was great. 

As a Business Partner The Fillmore Group raffled off the gorgeous, huge coffee table book, “The Human Side of Big Data”, by Rick Smolan and Jennifer Erwitt.  After reading a glowing review I bought the book about a month ago for myself, and while I am a bit overwhelmed by the book’s size (did I mention it’s huge?), which made it impossible to stuff in my laptop bag like my Kindle, it is simply beautiful and the perfect way to get excited about the potential of Big Data.  Each article is short – maximum length 2 pages – and tells how Big Data has changed the way people are doing something.  And yes, IBM fans, page 208 features Jeff Jonas.  Because what book about Big Data could leave out Jeff Jonas and his fascinating life and experience developing the IBM Entity Analytics solutions?

The forum delivered lots of great ideas and use cases.  It’s time to start moving beyond the ideas.  We all need to figure out how to apply this amazing technology and reap the benefits!  The next step, recommended by IBM’s Crysta Anderson, is to attend the IBM Big Data announcement on April 30th to hear more about how you can move from theory to practice with Big Data.

DB2 101 Luncheon at Cinghiale April 11th

Posted by Kim May on February 28, 2013 under Big Data, DB2 Education, DB2 for i, DB2 for Linux Unix Windows, DB2 for VSE&VM, DB2 for z/OS, DB2 Gold Consultants, DB2 Migrations, Frank Fillmore, IBM Information Management Software Sales, IBM Mid Market Customers, IBM Pure Systems, InfoSphere, Netezza, Optim, Oracle.

As volumes of data explode and organizations look for unique ways to leverage the quantities of data – often in real-time – it’s easy to become confused.  We can help – join us for lunch at Cinghiale for an overview of the data management products in the IBM portfolio and gain an understanding of today’s top solutions.

Frank Fillmore will deliver an overview of hot topics in IBM data management – Big Data, Information Governance, and appliances – and explain how new solutions are changing database and warehousing technologies.  Gain an understanding of how IBM’s acquisitions and integrated solutions have positioned the company as today’s thought leader in data management.

DB2 101 at Cinghiale (the “Boar”), 822 Lancaster Street, Baltimore, MD, 21202
April 11, 2013, 11am – 1pm

To register, click here.  

Satisfy your appetite and your curiosity with a tasty Italian lunch at local favorites Tony Foreman and Cindy Wolf’s Cinghiale. Whether you already use IBM data products or are interested in learning what IBM has to offer, the presentation will provide a dose of nutritious food for thought.

I will send you a confirmation with directions to the award-winning Cinghiale.  Valet parking will be provided.

Free Webinar: What Your IBM Competitive Sales Specialist Wants You To Know

Posted by Kim May on February 25, 2013 under Baltimore Washington DB2 Users Group, Big Data, DB2 Education, DB2 for Linux Unix Windows, DB2 Migrations, IBM DB2 Services, IBM Information Management Software Sales, IBM Mid Market Customers, IBM Pure Systems, International DB2 Users Group (IDUG), Netezza, Optim, Oracle, Q-Replication, TFG Blog.

Join IBM DB2 Competitive Sales Specialist Bill Kincaid and me for The Mid-Atlantic Information Management Virtual Users Group webinar on February 28th.  The Mid-Atlantic Information Management Virtual Users Group is focused on delivering information and education to DB2 users in the Mid-Atlantic region.  Given the topic and the terrific speaker lined up, and since the session will be delivered via webinar, anyone interested in IBM database solutions is welcome to join us.
In the session current DB2 customers will learn how IBM has enhanced DB2 administrative tool functionality, added new features, and is offering new product bundles that deliver better value at a lower cost.  Customers considering DB2 will learn how IBM is providing better assessment tools and easier migration processes to simplify the transition to DB2. 
Is DB2 the best?  Is moving to DB2 a no-brainer?  Join us, ask questions, and learn more about today’s DB2. 
Date:  Thursday, February 28, 2013
Time:  11am – 12:00pm Eastern
Registration is required.  Please register here.
Bill Kincaid
Bill Kincaid is the IBM DB2 Competitive Sales Specialist for the Mid-Atlantic Region.  Bill has worked for IBM, Microsoft and Oracle, as well as data virtualization solution developer xkoto.  Bill’s specialty is helping database users – customers currently using DB2 and customers considering migrating to DB2 – to fully understand the value of DB2.  Bill earned his BA at the University of North Carolina – Charlotte and his MBA from the University of Georgia.  He lives in Charlotte and works with IBM customers in the Eastern US.
I hope you will join us.

New Netezza – aka “Striper” Announced

Posted by Kim May on February 5, 2013 under Big Data, DB2 for z/OS, IBM Pure Systems, Netezza, Sidecar, TFG Blog.

When IBM acquired Netezza, the phenomenal processing capability enabled by the Netezza FPGA technology (Field Programmable Gate Arrays) made Netezza’s query speeds clear winners when benchmarked against all competitors.  Today IBM announced a new generation of Netezza, now branded “IBM PureData System for Analytics, Powered by Netezza”: the N2001 systems.

The new N2001 system specs once again put Netezza into the “catch-me-if-you-can” category, with a scan rate 3 times faster than the prior generation, 50% greater storage capacity per rack, and improved resiliency and fault tolerance, thanks to more spare drives in the cabinet and faster drive regeneration.

The N2001 is available with the same quick, simple installation, and IBM also updated the IBM DB2 Analytics Accelerator, the “sidecar” Netezza that attaches to the mainframe.  More on the new sidecar soon – I hear the IDAA has even more capabilities to offer DB2 for z/OS customers.  Stay tuned!


IBM Big Data, Integration and Governance Forum

Posted by Kim May on January 23, 2013 under Baltimore Washington DB2 Users Group, Big Data, DB2 Education, DB2 Gold Consultants, Frank Fillmore, IBM Information Management Software Sales, IBM Mid Market Customers, IBM Pure Systems, InfoSphere, Netezza, Q-Replication, TFG Blog.

The Fillmore Group – and a special guest to be announced soon – will be participating in the IBM Big Data, Integration and Governance Forum at the Sheraton Hotel in Reston, VA on March 5th.  The forum begins at 8am with a complimentary breakfast; sessions end at 1:15pm, followed by lunch provided by IBM.  There is no charge to attend the forum.

Great speakers are lined up to explain how best to leverage innovative techniques and gain a rock-solid understanding of how to:

  • Enhance the quality, availability and integrity of your data
  • Gain real-time insight into performance
  • Optimize and improve critical decision making
  • Reduce data management cost and complexity issues
  • Enforce data quality standards and stewardship policies
  • Effectively manage regulatory compliance demands
  • Leverage support for master data management (MDM) programs
