
Posts Tagged ‘data-driven decision making’


Many companies have found success in building data warehouses that meet basic needs, but are now finding they need to move beyond the back-office warehouse to leverage information on the front lines of decision making throughout their entire company. They need information on demand and need the ability to build systems that can deliver on those promises with real incremental returns.

For those who understand the power of an analytics-driven organization, this is a most exciting time. The opportunities are limitless: customers, prospects, suppliers and the business itself are creating endless geysers of data. Analytics tools are inexpensive, widely available and so easy to use that they make business sense in almost any situation.

To move forward, organizations need a strategy that delivers on several focused business requirements:
1) Operational management: Accelerate time-to-market to meet business SLAs for new and existing business processes, operational analytics and business intelligence (BI).
2) Big data: Leverage unstructured data, social media and other “big data” information sources to gain more insights from more data—without impacting the business SLAs.
3) Predictive analytics: Forecast future trends and analyze risks and potential outcomes.

Many IT organizations are adopting a strategy called smart consolidation that reconciles the need to simultaneously distribute data warehousing and analytics capabilities and infrastructure while centralizing management. Smart consolidation is a method for evolving an existing data warehouse architecture to meet today’s demanding analytic needs, such as big data, streaming data and unstructured data.

In a nutshell, it involves thinking beyond the traditional warehouse structures that have provided great success with structured data, basic reporting and analysis. Smart consolidation is driven by these four goals:

  1. Consolidate and govern enterprise data
  2. Optimize workloads for performance and SLAs
  3. Simplify the delivery of analytics by leveraging appliances
  4. Flexibly extend analytic capability as needed

The basis for smart consolidation is to completely optimize an analytics architecture by placing the right workload against the right data, in the right place, at the right cost and the right performance level.

Smart consolidation acknowledges that an organization requires different types of databases, analysis tools and data formats. It needs traditional data warehouses, data warehouse appliances and operational BI systems that can accommodate different types of workloads. It also needs systems based on advanced technologies that can efficiently handle data that is moving extremely quickly as well as large volumes of data that does not change frequently.

Single system? I think not

No single data system could efficiently serve all these requirements and perform well for both transactional and analytical workloads. Under the smart consolidation strategy, multiple specialized elements use industry standards to communicate and join together to form a fluid, agile data ecosystem that delivers business insight, cross-organizational data governance and centralized IT resource management. By allowing many different elements to serve specialized needs, smart consolidation also enables organizations to accommodate the endless variety and rapidly growing ocean of semi-structured and unstructured data.
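The "right workload, right place" idea can be made concrete with a small routing sketch. This is a toy illustration, not a product feature: the workload attributes, thresholds and platform names below are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical workload profile; field names and thresholds are illustrative.
@dataclass
class Workload:
    name: str
    latency_sla_ms: int   # required response time for the business SLA
    data_volume_gb: int   # approximate data scanned per run
    structured: bool      # structured vs. semi-/unstructured data

def route(w: Workload) -> str:
    """Send each workload to the platform best suited to it."""
    if not w.structured:
        return "big-data platform"          # semi-/unstructured sources
    if w.latency_sla_ms <= 100:
        return "operational BI system"      # low-latency operational reads
    if w.data_volume_gb >= 1000:
        return "data warehouse appliance"   # large analytic scans
    return "enterprise data warehouse"      # governed, general-purpose

jobs = [
    Workload("order lookup", 50, 1, True),
    Workload("quarterly sales analysis", 60_000, 5_000, True),
    Workload("social media sentiment", 300_000, 200, False),
]
for j in jobs:
    print(f"{j.name} -> {route(j)}")
```

The point of the sketch is the decision itself: each workload lands on a specialized element, while a single routing policy keeps management centralized.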




Well, I have to admit: using the retail sales of high-heeled shoes to indicate growth or a downturn in the economy is a bit out there. But after seeing IBM in the news this week, I decided to look into this a bit further.

And they are right; the height of high heels sold does provide an indicator of the current economic situation. In an economic downturn, data shows that the height of high heels goes up – evidently, women buy higher heels in an attempt to escape the reality of tough economic times. Surely, feeling better about yourself, feeling prettier, perhaps a bit of personal indulgence, does help one escape the feeling of being controlled (can’t escape a bad job), beaten down (unemployed), and even trodden upon financially (foreclosed).

I have spoken about retail analytics a number of times in recent months, and I have been using a graph to open my presentation that shows the rise and dramatic fall of U.S. consumer spending as a wake-up call to those analytically inclined. Consumer spending is such a key part of the U.S. economy and the retail industry. And as I wrote about in one of my earliest blog posts, in good economic times, everyone makes money. But in these tough economic times with consumer spending at a similar level to 1997 (mind you, after a fairly significant increase from its recent low), it takes much more than luck to survive, let alone prosper. It takes an “analytics-driven” attitude to survive and thrive.

“This time…something different is happening – perhaps a mood of long term austerity is evolving among consumers sparking a desire to reduce ostentation in everyday settings.”

So what will happen later this week on “Black Friday?” Will consumer spending be strong – hitting the $20 billion mark on Black Friday as analysts at MasterCard predict – or will it limp along like a wounded duck? Is there enough pent-up demand after consumers have cut back so much in recent months, or is there a bright future on the horizon for retailers? Will the sales, promotions and advertising make a difference? Will hot ticket items like the new Kindle Fire be strong performers? What will happen to retailers that don’t sell hot ticket items?

There are a lot of questions here…. One thing, however, is certain. Many retailers that rely on gut feel may not make it. But those that mined their data, that used predictive analytics, and that extended themselves to analyze the “big data” of social media, well…they may hit the proverbial nail on the head.

Let’s see what shakes in the upcoming days. Stay tuned for more….

http://www-03.ibm.com/press/us/en/pressrelease/35985.wss



Good data is attained by integrating multiple data sources, deriving a ‘single version of the truth,’ and putting that good data (and unstructured content) into a data warehouse where the BA/BI tools can perform their magic. DDD (data-driven decision making) begins and ends with good data.
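Deriving a "single version of the truth" usually means reconciling the same entity across source systems. A minimal sketch, assuming two hypothetical source records for one customer, where the newest non-empty value wins:

```python
# Toy record merge: build a "golden" customer record from two source
# systems. Records, field names and the survivorship rule (newest
# non-empty value wins) are all illustrative assumptions.

crm    = {"id": "C42", "name": "A. Smith",  "email": "",            "updated": "2011-03-01"}
orders = {"id": "C42", "name": "Ann Smith", "email": "ann@ex.com",  "updated": "2011-06-15"}

def merge(records):
    """Merge records for one entity, letting newer non-empty fields win."""
    records = sorted(records, key=lambda r: r["updated"])  # oldest first
    golden = {}
    for rec in records:
        for field, value in rec.items():
            if value:  # skip empty values; later (newer) records overwrite
                golden[field] = value
    return golden

golden = merge([crm, orders])
print(golden)
```

Real data-integration tools add matching, standardization and audit trails on top, but the survivorship step at the heart of it looks much like this.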

“smarter technology” posted a great article “Data Two, Gut One” that got me thinking more about the value of good data. In the article, they state that new research shows data-driven decisions improve organizational performance and company value.

In my current position as a marketing evangelist for data warehousing and analytics, this has proven itself time and time again. When looking for ways to increase our marketing effectiveness, I look at the various marketing channels we use to get our messages out to the market. Having complete data available regarding marketing activities, marketing channels used, customer segmentation sets, response rates…and how leads progress through the sales funnel is critical.

When some of the data is unavailable, quite simply, the “truth” is not known, and any decision based on this information is just a guess, gut feel, intuition, a hope and a prayer, a SWAG. When the data is complete, accurate and trusted, I can then make quality decisions to fill in any gaps, go after new markets, tweak the messaging to get higher response rates, etc.
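One practical discipline that follows from this: report a metric only when its inputs are complete. A small sketch, with made-up channel names and counts, that computes response rates and refuses to guess when data is missing:

```python
# Illustrative campaign data; channel names and counts are invented.
campaigns = [
    {"channel": "email",   "sent": 10_000, "responses": 240},
    {"channel": "webinar", "sent": 500,    "responses": 85},
    {"channel": "direct",  "sent": None,   "responses": 60},  # incomplete data
]

for c in campaigns:
    if c["sent"] is None or c["responses"] is None:
        # No "truth" available -- flag it rather than produce a SWAG.
        print(f"{c['channel']}: incomplete data, no rate reported")
    else:
        rate = c["responses"] / c["sent"]
        print(f"{c['channel']}: {rate:.1%} response rate")
```

The guard clause is the point: an explicit "incomplete data" flag is worth more than a confident number built on a gap.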

Analytics applications that present dashboards, scorecards, historical trends and predictive analysis, and that give me actionable insights, all benefit from good data. Good data begins with data integration, data quality, and a good data warehouse.

If anyone reading this post has had good experience with good data, or a bad experience with bad data, I encourage you to share your story by commenting.

