Webcast to Explain New DB2 Packaging – AESE

Posted by Kim May on November 5, 2010 under DB2 for Linux Unix Windows, IBM DB2 Services, InfoSphere, Optim, Q-Replication, Uncategorized.

Frank Fillmore will deliver a one-hour presentation explaining the new DB2 for LUW Advanced Enterprise Server Edition (AESE) packaging – what’s included, how AESE differs from DB2 Enterprise Server Edition, and, most importantly, why anyone running DB2 for LUW should give AESE serious consideration.

Date: Thursday, November 18, 2010
Time: 12pm-1pm Eastern

With AESE, IBM is bundling Optim and InfoSphere tools that have previously been available only as for-fee add-on features.  Come learn how to upgrade DB2 and get:

• DB2 Storage Optimization Feature
• DB2 Performance Optimization Feature
• Optim Performance Manager
• InfoSphere Federation Server
• InfoSphere Replication Server (Q Replication)
• Optim Development Studio
• DB2 Advanced Access Control Feature

To register, click here.  If you have a problem registering or have questions, email Blaire Crowley (blaire.crowley@thefillmoregroup.com).

InfoSphere Data Architect Webcast with Norma Mullin

Posted by Kim May on October 5, 2010 under Optim, Uncategorized.

IBM’s Norma Mullin is presenting a webcast next week titled “Using What’s New in InfoSphere Data Architect to Accelerate Design for Data Warehouse and BI”.  Norma, whose official IBM title is Consulting IT Specialist/Client Technical Specialist, has worked extensively with customers implementing IDA and authored a whitepaper on migrating from ERwin to IDA.  The webcast is scheduled to run from 1:00pm to 2:00pm Eastern time this coming Thursday, October 14th.  To register, go to:  http://bit.ly/bq4U1J

Norma’s presentation abstract is below:

InfoSphere Data Architect is a data design solution to model, relate, and develop data assets. New features will help accelerate the design of data warehouses, automatically discover and create data warehouse schema, and enable the use of standardized logical models to centralize data definitions across the enterprise.

What you will learn:
• Classify dimensions, create dimensional objects, and analyze dimensional models in InfoSphere Data Architect
• Work with dimensional physical models
• How InfoSphere Data Architect integrates with Cognos and InfoSphere Warehouse

Local September Events – RUG and PoT

Posted by Kim May on August 25, 2010 under Baltimore Washington DB2 Users Group, DB2 Education, DB2 for Linux Unix Windows, Optim, Uncategorized.

If you are in the Baltimore/Washington area I want to make you aware of two upcoming DB2 events – the September Baltimore/Washington DB2 Users Group meeting, with a special LUW track, on September 15th, and a hands-on pureQuery Proof of Technology (PoT) The Fillmore Group is hosting at our Towson headquarters the following day, September 16th.

IBM Silicon Valley Lab’s Vijay Bommireddipalli is presenting at the users group meeting on the 15th on .NET and Java tuning, then leading the pureQuery PoT the following day.  Read More…

IBM/Optim Team Virtual Tech Briefing – DB2 Connect

Posted by Kim May on August 16, 2010 under DB2 Connect, DB2 for i, DB2 for Linux Unix Windows, DB2 for z/OS, Optim, Uncategorized.

Kimberly Madia’s IBM/Optim team is hosting a Virtual Tech Briefing this Thursday:

DB2 Connect for DBAs: A Primer and Look Ahead

Whether you are a DB2 for LUW DBA who would like to access enterprise information or a z/OS DBA wondering how all those Java and .NET programmers are getting to your data, DB2 Connect is the solution.  Frank Fillmore, a DB2 Gold Consultant with an extensive history in training and consulting, will step you through a DB2 Connect primer from end to end, including platform architecture, DB2 Connect configuration parameters, and more.  Case studies from large-scale DB2 Connect health checks will be included.  IBM’s Kevin Foster, who manages the development of the product, will be on hand to discuss licensing, new packaging options such as the DB2 Connect Advanced Edition, which provides pureQuery acceleration in the box, and upcoming changes to the product.

Date: 19 August 2010
Time:  10:00-11:00AM Pacific, 1:00-2:00PM Eastern
Register here: http://ow.ly/2iAlr
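
For a taste of the configuration topics the primer will cover, the classic gateway setup uses the DB2 command line processor’s catalog commands.  The node name, hostname, port, and database names below are hypothetical placeholders, not values from the briefing:

```
db2 catalog tcpip node zosnode remote zhost.example.com server 446
db2 catalog dcs database db2prod
db2 catalog database db2prod as prodloc at node zosnode authentication dcs
db2 connect to prodloc user dbuser
```

Once the node, DCS, and database directories are cataloged, the gateway routes distributed requests to DB2 for z/OS over the remote port (446 is the conventional DRDA port); `db2 list dcs directory` shows what has been cataloged.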

IBM zEnterprise Has Benefits on Many Levels

Posted by Frank Fillmore on July 23, 2010 under DB2 for z/OS, InfoSphere, Optim, Q-Replication.

I attended IBM’s zEnterprise announcement in New York yesterday.  IBM Senior VP Steve Mills said it was the most important announcement IBM had ever made in its impact on saving customers money.  He also said IBM spent US$1.5 billion on the zEnterprise research and development effort over the past several years.  So, as movie reviewers ask about the latest blockbuster: “Can you see the money up on the screen?”  The answer comes in a few loosely coupled parts.

  1. Oskar Schindler said you must have a “clever accountant”.  Mills made it clear that organizations that can accurately allocate their IT expenses will see the most benefit from zEnterprise.  zEnterprise delivers System z quality of service (QoS) across heterogeneous architectures: the aforementioned System z as well as POWER7 blade servers and (eventually) System x blades.  The problem for most organizations is that System z “mainframe” costs have been capitalized from central IT budgets for over four decades.  As the PC revolution has unfolded since the early 1980s, most of the costs for networking, systems administrator salaries, PCs themselves, and the software they run have been expensed out of departmental budgets.  Organizations with the discipline to accurately accumulate these costs certainly will be able to see the benefit of deploying the zEnterprise platform.  Interestingly, the table talk at lunch indicated that some IBMers see the sweet spot for zEnterprise in the rapidly growing economies of China and Russia.  The reason?  Tight budgetary control and hierarchical, centralized decision-making in state and quasi-state enterprises (think Gazprom) will help them “get it” immediately.  I would not be surprised to see zEnterprise adoption in emerging and growing economies exceed that of North America in the next two years.
  2. IBM has been able to run Linux on System z hardware using Virtual Servers and z/VM for a decade.  And the System z has been able to dispatch workloads to specialty engines within System z such as the Integrated Facility for Linux, zIIP, and zAAP for years.  Think of zEnterprise as extending that dispatching capability out of the physical System z box to discrete blade servers.  IBM’s goal is to move away from the “you can do everything on System z” posture – which in reality was a losing, rear-guard action – to embracing disparate architectures and acknowledging that maybe a print server really should run under Linux on an x86 platform.  Yet you can benefit from the centralized management and security of the System z.  This is workload integration at the chip, firmware, hypervisor, and middleware levels.  A pretty neat trick.
  3. So what can I do with the zEnterprise?  Here are two relatively simple scenarios.

The Online Travel Portal  One well-known travel reservation site front-ends their expensive Oracle transaction servers with MySQL running on cheap x86 hardware.  While you’re noodling around trying to figure out the best itinerary, all you’re seeing is data replicated from Oracle to MySQL on a near-real-time basis.  When you enter your credit card number and hit “Purchase”, you’re routed to the Oracle OLTP server.  This is called “database tiering”.  I can now architect the same topology on System z with DB2 for z/OS on the back-end and x86 blades running the DB2 Express-C freeware database.  On the zEnterprise platform, these databases will communicate over a 10Gb private, secure network with extraordinarily low latency.  Ever get the “That seat is no longer available” message?  It might be a thing of the past with zEnterprise.
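
The “database tiering” pattern can be sketched in a few lines of code.  This is an illustrative skeleton only – the “connections” below are stand-in functions rather than real Oracle/MySQL or DB2 handles, and the read/write classification is deliberately crude:

```python
# Minimal sketch of database tiering: route read-only queries to a cheap
# replica tier and transactional statements to the system-of-record.
# The two "connections" here are stand-ins for real database handles.

READ_VERBS = ("select", "values")

def route(sql, replica, primary):
    """Send read-only SQL to the replica tier, everything else to the primary."""
    verb = sql.lstrip().split(None, 1)[0].lower()
    target = replica if verb in READ_VERBS else primary
    return target(sql)

# Stand-in "connections" that simply record where each statement landed.
sent = {}
replica = lambda sql: sent.setdefault("replica", []).append(sql)
primary = lambda sql: sent.setdefault("primary", []).append(sql)

# Browsing traffic hits the replica; the purchase hits the OLTP back-end.
route("SELECT price FROM flights WHERE dest = 'HNL'", replica, primary)
route("INSERT INTO bookings VALUES (42, 'HNL')", replica, primary)
```

A real deployment would make the routing decision in the connection pool or application server, but the division of labor is the same: cheap reads up front, the system of record behind.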

The Hospitality Company  This organization runs their centralized reservation systems on DB2 for z/OS already.  In order to support their frequent-guest affinity program portal, they have WebSphere Application Server running on a separate System p AIX server.  The only problem is that sometimes transactions hang to the point that the JVMs have to be recycled.  The Java programmers say their code is tightly written and the DB2 for z/OS database administrators say that the incoming SQL requests are satisfied sub-second.  While zEnterprise alone would not resolve this problem – see pureQuery and the lyrically named Optim Performance Manager Extended Edition – the application and the database servers will be as tightly coupled as possible while each runs on the optimal platform.  Since the transfer points and the servers themselves are under unified management, an entire layer of complexity (and potential breakage) will be eliminated.

The real buzz in the announcement for me is the IBM Smart Analytics Optimizer (ISAO).  For a generation as a DB2 database administrator, I’ve told my clients that OLTP and ad hoc query workloads should not be intermingled.  The solution has been to make copies of the data using replication technologies – InfoSphere Change Data Capture and Q Replication among them.  This approach has been a boon to DBAs, and software, storage, and server salesmen everywhere.  When it achieves its full promise, ISAO will evaluate incoming database requests and dispatch them along with the data needed to satisfy the request to the appropriate platform server.  DB2 for z/OS will serve as a centralized front-end for all workloads: OLTP, OLAP, ad hoc query, etc.  ISAO will transparently run the workload on the optimal platform and return the result set to the requesting application.  Organizations will be able to dismantle the miasma of extracts, FTPs, and other artifices now necessary to keep analytic workloads from bogging down OLTP.  And they’ll reduce complexity.  And save a ton of cash.
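
To make the idea concrete, here is a toy sketch of the kind of routing ISAO promises.  Nothing here reflects the actual product internals – the cost estimates, threshold, and “engines” are invented purely for illustration:

```python
# Toy model of transparent workload routing: short, indexed OLTP requests
# stay on the transactional engine, while long-running analytic scans are
# dispatched to an accelerator. All names and numbers are illustrative.

def classify(estimated_rows_scanned, threshold=100_000):
    """Crude stand-in for an optimizer's decision: big scans are 'analytic'."""
    return "accelerator" if estimated_rows_scanned > threshold else "oltp"

def dispatch(query, estimated_rows_scanned, engines):
    """Run the query on whichever engine the classifier picks,
    returning the chosen engine and its result."""
    target = classify(estimated_rows_scanned)
    return target, engines[target](query)

engines = {
    "oltp": lambda q: f"oltp result for: {q}",
    "accelerator": lambda q: f"accelerator result for: {q}",
}

# A one-row lookup stays on the OLTP engine...
routed, _ = dispatch("SELECT * FROM orders WHERE id = 7", 1, engines)
# ...while a large aggregation is shipped to the accelerator.
routed_big, _ = dispatch("SELECT region, SUM(amt) FROM sales GROUP BY region",
                         5_000_000, engines)
```

The point of ISAO is that this decision happens inside DB2 for z/OS, invisibly to the application – the requester sends one SQL statement and gets one result set back, regardless of where the work ran.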

So on whose door will IBM knock first?  Clearly the System z installed base will be getting lots of attention.  But could Facebook or some other enterprise with orders-of-magnitude scaling issues (500 million Facebook users and counting) benefit from zEnterprise?  Surprisingly, the answer is Yes!  Facebook needs to manage lots of unstructured data (pictures, videos, et al.), but they also have the need for complex analytics.  First, to target online advertising ever more precisely, but also to serve larger societal needs.  Let’s say a man declares he needs a reduction in child support because he’s nearly broke.  The local social services agency unleashes a smart agent to run against social networking sites and comes up with pictures on Facebook from the man’s recent two-week vacation in Hawaii.  Too Big Brother-ish?  A topic for another day.

Summer Services Special

Posted by Kim May on July 21, 2010 under DB2 Connect, DB2 for i, DB2 for Linux Unix Windows, DB2 for VSE&VM, DB2 for z/OS, IBM DB2 Services, InfoSphere, Optim, Q-Replication, SQL Tuning.

As much as I dislike the relentless repetition that’s part of the world of blogs, Twitter, listservs, and email blasts, here I go with a shameless pitch for a TFG special services offering I emailed to several DB2 users earlier today.  I am doing this because, at the end of the day, the rate disparity in today’s DB2 services market baffles me.  Are the ridiculously high rates being charged eroding product adoption?  I am afraid so, which is why we are offering a summer services special.

Read More…

InfoSphere Data Architect – Help for ERWIN Users

Posted by Kim May on July 2, 2010 under DB2 Education, InfoSphere, Optim, Uncategorized.

While the IBM Optim teams are excited to see customers adopting InfoSphere Data Architect, everyone recognizes the journey from CA ERwin to IDA can be frustrating.  The Optim team has been working to develop more resource materials for ERwin users and recently emailed a new Tips & Tricks guide developed by Joe Cullen.  If you find it valuable please let us know, and we will let you know when new materials are released.  Thanks to Joe for offering to share this with the community!

IDA Tips & Tricks by Joe Cullen

Free Webcast Thursday, June 24th – pureQuery from developerworks

Posted by Kim May on June 18, 2010 under DB2 Education, DB2 for z/OS, Optim, Uncategorized.

The developerworks team has announced a free technical webcast next week on pureQuery, based on feedback from the z community.  Like other developerworks webcasts it will be available for replay – as is the recommended prerequisite, pureQuery Part 1.

pureQuery Deep Dive Part 3:  Client Optimization Administration Enhancements for DBAs

June 24th, at 1pm Eastern, 10am Pacific

Our dynamic duo of client optimization experts is back again!  Patrick Titzler and Chris Farrar, whom some of you may remember from pureQuery Deep Dive Part 1, will discuss the enhancements released in Fix Pack 3 of Optim Development Studio and pureQuery Runtime that make the maintenance, security, and management of the client optimization process easier for DBAs.  These enhancements are driven by real customer experiences and are centered around a relational repository that enhances security, fosters collaboration, and streamlines the process of administering a pureQuery-enabled application.  Tooling support for these capabilities in Optim Development Studio will be demonstrated.

Register today and don’t miss out on your opportunity to hear from the experts!

DB2 Connect Virtual Briefing with the IBM Team

Posted by Kim May on June 7, 2010 under DB2 for i, DB2 for Linux Unix Windows, DB2 for z/OS, Optim.

IBM has moved DB2 Connect support into the Optim group, as it is in some ways the original Optim solution.  Kathy Zeidenstein, who manages community outreach efforts for the Optim team (a weekly e-newsletter, content on the developerworks site, Twitter, etc.), is coordinating a DB2 Connect Virtual Briefing scheduled for August 19th.  The primary purpose of the presentation is to introduce new features in the Advanced Edition; however, because DB2 Connect can be used in so many ways, and is so often under-utilized, she’s invited Frank to devote some of the allocated time to delivering an overview of what DB2 Connect can do.  More information is on the way…for the moment we have the date and time reserved (mark your calendars – August 19th at 1pm Eastern), and a tentative agenda:

DB2 Connect for DBAs:  A Primer and a Look to the Future

Whether you are a DB2 for LUW DBA who would like to access enterprise information or a z/OS DBA wondering how all those Java and .NET programmers are getting to your data, DB2 Connect is the solution.  Frank Fillmore, a DB2 Gold Consultant with an extensive history in training and consulting, will step you through a DB2 Connect primer from end to end, including platform architecture, DB2 Connect configuration parameters, and more.  Case studies from large-scale DB2 Connect health checks will be included.  Kevin Foster, who manages the development of the product, will be on hand to discuss licensing, new packaging options such as the DB2 Connect Advanced Edition, which provides pureQuery acceleration in the box, and upcoming changes to the product.
What you will learn:
• Why you need DB2 Connect
• How DB2 Connect is packaged and licensed
• Platform architecture
• Configuration and tuning options

As soon as I have registration information I will post it.


Take Me Out to the Ball*park*

Posted by Kim May on March 4, 2010 under DB2 Education, InfoSphere, Optim.

The countdown to the first day of Spring is below 20 days!  Spring can’t come soon enough for me, as here in Baltimore we’ve had over 80 inches of snow – our average is 18.  In preparation for Spring I am coordinating events with the local IBM teams to introduce customers to two great technologies we hope everyone will consider implementing in 2010:  InfoSphere Change Data Capture (CDC) and Optim Data Growth Solutions.

Both events are planned for Baltimore’s beautiful Camden Yards, home of the Baltimore Orioles.  The tentative date for the CDC event is Thursday, May 6th, with a PoT scheduled two weeks later on May 20th.  I am working on getting the Optim dates on the calendar.  If you will be in the Baltimore/Washington area please try to join us – both topics should be of value to organizations looking for ways to better control their data.  CDC’s niche is in quickly moving data to where it’s most effective, while Optim Data Growth Solutions can help any organization with a packrat in the IT Department.  You know who you are.

Hope to see you there!