Archive for August, 2013

Reduce Transactional System Complexities to Fund Your Next Innovation

Businesses today cannot operate without data—not even for a moment.

  • Businesses once had the upper hand, but today consumers have gained tremendous power. Increased choices, lower switching costs and easier access to product information have empowered customers to make more informed decisions and compare alternatives more easily. It has become extremely important for organizations to understand and anticipate customer behavior and needs using all available sources of information, including social media.
  • The situation is further complicated as organizations are expected to do more with less. Organizations need to optimize their processes and IT resources to create new opportunities, to mitigate risk, and to increase efficiency.
  • Every day, incredible amounts of diverse data are being generated, ranging from online clicks, transactions and machine-generated sensor data to social media posts, emails and videos. Businesses understand that collecting, processing and embedding this constantly growing stream of both structured and unstructured data into daily operations is key to meeting emerging challenges and uncovering new opportunities. In short, analyzing big data brings success. This is the new reality. Embracing analytics is now a requirement for successful organizational performance.

What is needed for IT organizations to meet today’s top challenges?

  • Handle more data, faster
  • Simplify setup, use and maintenance
  • Support existing systems
  • Leverage existing skills and require no application code changes

In short, make it super fast, super easy, and have it deliver super savings.

PureData System for Transactions is a highly available, large-scale database appliance that reduces the time, effort, cost and risk of designing, procuring, integrating, and deploying highly available transactional database services.

Database Appliance:

  • Reduce the time, effort, cost and risk of designing, procuring, integrating, and deploying non-stop data services
  • Fast deployment of high availability clusters and databases

High Availability / Scalability:

  • Improve uptime and reduce downtime costs
  • Simplified disaster recovery
  • Scale out to handle growing data

Infrastructure Efficiency:

  • Consolidate many databases onto a single system
  • Reduce data center costs: space, power, cooling
  • Reduce storage costs

IT Administrator Productivity:

  • Application transparency; no application changes
  • Simplified self-management reduces IT staff time
  • Leverage existing skills

Find out more about PureData System for Transactions. Attend one of our seminars on “Reduce Transactional System Complexities to Fund Your Next Innovation.” Click here to see the dates, locations and agenda.


Oh no! The next “big data” project is coming!

Your IT infrastructure has grown and evolved over many years and is the heart and soul of how your company operates. You’ve invested years to get it to where it is today – it’s running smoothly, and you consider your IT staff to be the very best at what they do. But deep in your heart you have an uneasy feeling…you know what’s coming next.

There is a backlog of projects on your plate. Important projects that will improve your company’s bottom line. First-mover projects that will tap into “big data” and empower your line of business managers to pursue new markets and get a jump on your competitors. But you know that your infrastructure can’t handle much more and that your staff can’t keep up with performance tuning and the few projects that are currently in the works.

The CEO just requested a meeting for next week. You know another significant big data project is coming, and this is just the tip of the iceberg for what’s coming later this year. On your drive home you ask yourself, “How can I possibly take on more projects, and more data? How can I change my infrastructure so I can deploy new applications faster? How can I shift my staff from tuning and maintenance to focus on higher value work?”

Workload optimization with expert integrated systems

As more servers, storage and software components have been brought into the data center, complexity has risen to the point of being almost unmanageable. General purpose systems have been forced to handle multiple workloads, and teams of database, application and system administrators spend a great deal of time and effort configuring, tuning and maintaining the systems for top performance and efficiency. With such a complex infrastructure, reliability often suffers and system downtime becomes a serious business risk.

At the crux of the issue is that different applications have different data workload characteristics, placing often conflicting requirements on the hardware, storage and software. Transaction and analytic processing tasks constitute very different workloads. Unless your data workloads are modest with respect to characteristics like data volume, number of users and analytics complexity, you need systems optimized in different ways to efficiently meet big data challenges.

Typically, IT organizations purchase general-purpose systems that are not optimized for any particular workload. They tune these systems for one workload or the other, and spend considerable time and effort keeping them tuned.

But what is good tuning for one type of workload is not good for another. Data retrieval optimizations that benefit one access path are likely to penalize alternative paths. The structural elements that were optimized for transactions, for example – indexes, shared memory, locks, caches, etc. – all impose performance and complexity penalties in an analytic environment, where unpredictable (“against the grain”) access paths and patterns are the rule. Likewise, systems optimized to handle structured data differ from those built to handle a wide variety of unstructured or structured data.
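The access-path trade-off can be seen even in a tiny database. The sketch below uses Python’s built-in sqlite3 module with a hypothetical orders table (the schema and index names are illustrative, not from any IBM product): an index built for transactional look-ups is used for a point query, while an analytic aggregate that cuts “against the grain” of that index falls back to a full table scan.

```python
import sqlite3

# Hypothetical orders table, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 0.5) for i in range(10_000)],
)
# An index tuned for the transactional access path: look up one customer.
conn.execute("CREATE INDEX idx_customer ON orders (customer_id)")

# Transactional query: a simple look-up touching a handful of rows.
# The plan typically reports a SEARCH using idx_customer.
for row in conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
):
    print(row)

# Analytic query: filters and aggregates on a column the index does not
# cover, so the plan typically reports a full SCAN of the table.
for row in conn.execute(
    "EXPLAIN QUERY PLAN SELECT AVG(amount) FROM orders WHERE amount > 100"
):
    print(row)
```

On a single general-purpose system, the same index that makes the first query cheap contributes nothing to the second, and every index added for transactions must still be maintained during analytic loads – which is the cost the paragraph above describes.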

Separating transaction processing and analytic processing onto separate, workload-optimized systems helps ensure that overall performance is optimized. Data transaction systems can process large numbers of simple look-ups, while analytic systems execute complex queries on massive volumes of data.

There is a great opportunity to improve system performance and efficiency, and to accelerate solution deployment, by using expert integrated systems that come from the factory already optimized for specific workloads. And this is why IBM designed and built the PureData System with different models that are specifically optimized for different transaction processing and analytic workloads.

