Big Data Analytics: Three Critical Success Points

As I discussed last week, the new big data analytics era is upon IT departments in corporations of all types.  Insurance, health care, financial and other businesses are accumulating many different and new types of analytics data to gain an edge over their competition.

For data management professionals, these challenges add pressure as all the hard questions get magnified by big data analytics: huge data volumes, tremendous transaction rates, data dumps and new types of monitoring data.  To start addressing the issues of your big data analytics project, look into these three critical success points to guide and position your project for success.

First, determine how big your big data analytics project is going to be.  While big data analytics can easily marshal enough data storage and computing power to go against every policyholder or the entire stock market, you need to determine whether doing so is really practical.  Analyze the big data requirements, run the numbers in a spreadsheet and size the project.  Do these calculations on the first day of the project if possible.  They will either shock you with how big the requirements are or reassure you with how manageable the big data analytics project might be.
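The spreadsheet calculation above can be sketched in a few lines of code. This is a minimal back-of-envelope sizing sketch; the row counts, row sizes, growth rate and index overhead below are all hypothetical assumptions to be replaced with figures from your own requirements analysis.

```python
# Back-of-envelope storage sizing for a big data analytics project.
# Every figure here is a hypothetical assumption -- substitute the
# numbers from your own requirements analysis.

def project_storage_gb(rows, bytes_per_row, daily_growth_rows, days,
                       index_overhead=0.3):
    """Estimate total storage (GB) for the initial load plus growth,
    with a fractional allowance for indexes on top of the raw data."""
    total_rows = rows + daily_growth_rows * days
    raw_bytes = total_rows * bytes_per_row
    return raw_bytes * (1 + index_overhead) / 1024**3

# Hypothetical example: 500 million policy records at 2 KB each,
# growing by 1 million rows per day over a 3-year horizon.
size_gb = project_storage_gb(
    rows=500_000_000,
    bytes_per_row=2048,
    daily_growth_rows=1_000_000,
    days=3 * 365,
)
print(f"Estimated storage: {size_gb:,.0f} GB (~{size_gb / 1024:.1f} TB)")
```

Running the numbers this way on day one makes the shock (or the reassurance) concrete: a seemingly modest record size and growth rate can still add up to several terabytes once indexes are included.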

Next, start small, validate, and grow.  Decide what sample set or limited amount of data will be able to validate the return on investment of the big data project.  Given the expense of all the resources a big data analytics project requires, plus the fact that over half of all IT projects fail, validate your concepts, data and analytics with a prototype.  For example, instead of trying to analyze all the insurance policies, start with a 10% sample, validate your analytics, and confirm their return on investment.
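The start-small approach can be sketched as follows. The policy records, the loss-ratio metric and the ROI threshold are all hypothetical stand-ins for your own data and validation criteria; the point is simply to run the analytics against a 10% sample before committing full resources.

```python
# Sketch of a start-small validation: run the analytics on a 10% sample
# and only green-light the full run if the sample supports the projected
# return on investment. Data and ROI rule are hypothetical stand-ins.
import random

random.seed(42)

# Hypothetical policy records: (policy_id, annual_premium, claim_cost).
policies = [(i, random.uniform(500, 5000), random.uniform(0, 4000))
            for i in range(100_000)]

# Draw a 10% sample instead of processing every policy.
sample = random.sample(policies, k=len(policies) // 10)

def loss_ratio(records):
    """Claims paid divided by premiums collected -- a simple profitability check."""
    premiums = sum(p for _, p, _ in records)
    claims = sum(c for _, _, c in records)
    return claims / premiums

sample_ratio = loss_ratio(sample)
print(f"Sample loss ratio: {sample_ratio:.3f}")

# Validate before scaling up: proceed only if the sample result
# suggests the full analysis is worth the cost.
if sample_ratio < 0.8:
    print("Sample supports ROI -- proceed to the full data set.")
else:
    print("Rework the analytics before committing full resources.")
```

If the prototype's numbers hold up on the sample, scaling to the full data set is a funding decision backed by evidence rather than a leap of faith.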

Finally, plan an archiving strategy for your big data early in the project.  As you structure your big data database and set up the system processing environment, determine and define your archiving strategy.  Big data analytics performance depends on fast answers from your big data.  By referencing only the absolute minimum of your big data necessary to achieve insights, and by storing big data effectively and efficiently across the environment, you will get the most efficient access.  Big data always has degrees of effectiveness and analytical importance across its date range.  Research these aspects of your data to determine the correct parameters for an archive process that separates big data by importance, performance and cost-effective storage.  This will help minimize the active big data set and keep big data analytics processing efficient.
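The separation of hot and cold data described above can be sketched with a simple date-based archive process. The table names, columns and 90-day cutoff are hypothetical (and in-memory SQLite stands in for the real database; a production DB2 system would more likely use range partitioning), but the pattern is the same: move rows past the analytical sweet spot out of the active table.

```python
# Sketch of a date-based archive process: keep recent, frequently
# referenced rows in the active table and move the rest to cheaper
# archive storage. Names and the 90-day cutoff are hypothetical;
# derive your real cutoff from measured access patterns.
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events_active (id INTEGER, event_date TEXT, payload TEXT)")
cur.execute("CREATE TABLE events_archive (id INTEGER, event_date TEXT, payload TEXT)")

# Load hypothetical rows spanning roughly two years, 30 days apart.
today = date(2013, 1, 1)
rows = [(i, (today - timedelta(days=i * 30)).isoformat(), f"data-{i}")
        for i in range(24)]
cur.executemany("INSERT INTO events_active VALUES (?, ?, ?)", rows)

# Archive everything older than the 90-day analytical window.
cutoff = (today - timedelta(days=90)).isoformat()
cur.execute("INSERT INTO events_archive "
            "SELECT * FROM events_active WHERE event_date < ?", (cutoff,))
cur.execute("DELETE FROM events_active WHERE event_date < ?", (cutoff,))
conn.commit()

active = cur.execute("SELECT COUNT(*) FROM events_active").fetchone()[0]
archived = cur.execute("SELECT COUNT(*) FROM events_archive").fetchone()[0]
print(f"Active rows: {active}, archived rows: {archived}")
```

With the bulk of the data moved to archive storage, the analytics queries scan only the small active set, which is where the performance and cost savings come from.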

These are only three of the critical items to analyze for your big data system.  As these projects get the green light, addressing these areas quickly will help your big data project be a success.


Dave Beulke is an internationally recognized DB2 consultant, DB2 trainer and education instructor.  Dave helps his clients improve their strategic direction, dramatically improve DB2 performance and reduce their CPU demand, saving millions across their systems, databases and application areas within their mainframe, UNIX and Windows environments.
