With Big Data becoming overwhelmingly popular, many organizations have either already implemented a Big Data solution or are investigating how Big Data can assist them. A recent Gartner survey revealed that 73% of organizations plan to integrate Big Data solutions within the next two years. As Big Data platforms mature, many companies see them less as an experiment and more as a viable solution to some of today’s complex data management challenges.

While it’s common for organizations to have unique Big Data solutions that don’t fit a popular use case, this article discusses 5 Big Data use cases that are often encountered. The data does not necessarily have to be petabytes, or even terabytes, in size. As we’ve discussed previously, Big Data solutions can address various situations for which a typical RDBMS is not well suited.


1. Improving Operational Efficiency

Leveraging Big Data platforms can make a significant impact by making your data more transparent; by discovering inefficiencies in operations, organizations can realize vast savings in both time and money.

Traditional systems already handle your core analytics; where Big Data helps is in analyzing the data that your primary systems are not making use of. Most large companies actually structure and analyze only a fraction of their data – doing so with all of it would be impractical.


2. Archiving Data

We’ve previously talked about using Hadoop as an active archiving solution.  In today’s data-driven world, the cost of data archival is very high. One common question for any warehouse implementation is, “How much historical data do we need to maintain in the warehouse?”. Many enterprises give a very simple answer: 7 years or more. In some areas, such as crime control units or government organizations, online data retention is very important and the data may be accessed frequently. In corporate data warehouses, however, older data is less valuable, yet maintaining it is very costly.

In this scenario, the combination of low-cost storage and linear-cost scalability offered by Hadoop’s HDFS storage subsystem can be an excellent solution.  By archiving historical data from high-cost, high-performance storage systems to low-cost HDFS storage, the primary goals of removing data from the queryable data warehouse are achieved: storage cost is reduced, while query performance and analytics capabilities are maintained.

Because Hadoop scales linearly simply by adding more storage nodes, the business can continue to archive data by incrementally adding nodes as its archive needs grow. At its core, Hadoop is both a low-cost storage platform and an engine to query the data it stores, and data stored in Hadoop can be queried in a variety of ways.
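To make the cost argument concrete, here is a minimal back-of-the-envelope sketch. The per-GB prices below are illustrative assumptions, not vendor quotes, and the calculation accounts for HDFS’s default three-way block replication:

```python
# Hypothetical comparison of archival storage cost: high-performance
# SAN vs. commodity HDFS storage. Prices are assumed for illustration.

SAN_COST_PER_GB = 5.00    # high-performance warehouse storage (assumed)
HDFS_COST_PER_GB = 0.25   # commodity disk in a Hadoop cluster (assumed)
HDFS_REPLICATION = 3      # HDFS default replication factor

def archive_savings(gb_archived: float) -> float:
    """Return the cost saved by moving gb_archived from SAN to HDFS."""
    san_cost = gb_archived * SAN_COST_PER_GB
    # HDFS stores each block HDFS_REPLICATION times, so raw capacity
    # needed is replication factor times the logical data size.
    hdfs_cost = gb_archived * HDFS_REPLICATION * HDFS_COST_PER_GB
    return san_cost - hdfs_cost

print(archive_savings(10_000))  # savings for archiving 10 TB
```

Even with replication tripling the raw capacity required, the commodity-hardware price point keeps HDFS far cheaper per logical gigabyte in this sketch.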

At the lowest level, flexible Map/Reduce jobs can be used.  However, when the archived data consists of data warehouse tables with predictable structures, a simple and effective approach is to use Hive or PolyBase. Hive is a technology that lets users query data in the Hadoop cluster with SQL-like syntax.  It can be accessed through familiar tools like Excel, PowerPivot, and Reporting Services through the use of a standard ODBC driver.
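As a sketch of the lowest-level option, the filter below mimics a Hadoop Streaming-style mapper in Python. The tab-delimited field layout (`order_id`, `order_date`, `amount`) is an assumed archive format, not a real schema:

```python
# Sketch of a Hadoop Streaming-style mapper that filters archived
# warehouse rows down to a single fiscal year.
TARGET_YEAR = "2010"

def map_line(line: str):
    """Return the row if its order_date falls in TARGET_YEAR, else None."""
    fields = line.rstrip("\n").split("\t")  # order_id, order_date, amount
    if len(fields) == 3 and fields[1].startswith(TARGET_YEAR):
        return line.rstrip("\n")
    return None

# In a real streaming job this loop would iterate sys.stdin, and Hadoop
# would gather whatever the mapper writes to stdout.
sample = ["1\t2010-03-01\t99.50\n", "2\t2011-06-12\t12.00\n"]
for row in sample:
    out = map_line(row)
    if out is not None:
        print(out)
```

With Hive, the same filter collapses to a single SQL-like statement, e.g. `SELECT * FROM orders_archive WHERE order_date LIKE '2010%'` (table and column names hypothetical).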

More enterprise deployments are beginning to leverage the power of Hadoop to find creative solutions to use cases that were either impossible or prohibitively expensive in the past. While Hadoop enables an amazing variety of scenarios, many of its uses are simple and have compelling and short-term ROI.  Low-cost online data warehouse archiving is one of the simplest Hadoop patterns to implement, and has an easily quantifiable and often immediate return on investment.


3. 360-Degree View of the Customer

An often-heard phrase when people are discussing Customer Experience Management (CEM), is “360 degree view”. However, unless you happen to have a solid grounding in CEM already, it is unlikely that you will understand what this phrase means the first time you hear it.

When a customer interacts with the business, at any touch point and across any channel, we should capture the Voice of the Customer (VoC) and make informed business decisions that improve the customer experience, based on insights gained from interrogating this VoC data. This is the very essence of Customer Experience Management; it is the reason that CEM exists – to empower the Voice of the Customer.

Leveraging emerging big data sources and types to gain a much more complete understanding of customer behavior—what makes them tick, why they buy, how they prefer to shop, why they switch, what they’ll buy next, what factors lead them to recommend a company to others—is strategic for virtually every company. To gain that 360-degree view of the customer, organizations need to be able to leverage internal and external sources of information – along with strong master data management – to assess customer sentiment.

A large part of building a 360-degree view of the customer experience is making sure not only that we have captured the Voice of the Customer at every possible touch point, but also that this VoC data is warehoused in a mineable form. This gives us the first part of the 360-degree view – the past.

The part that really closes the loop and brings a true 360 degree view of the customer experience is, you guessed it – the future. This is the aspect of Customer Experience Management that can have incredibly beneficial effects. We basically take the past, compare it with the present, and try to gain insights into how we can improve the customer journey. We have access to plenty of VoC data from previous customer interactions, we know what the current customer experience is, and we should be able to highlight shortcomings in the customer journey, trends in customer behavior, and problems with service provisioning that can be used to develop new business processes that are more customer centric.
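A toy illustration of comparing past and present VoC data: the channel names and sentiment scores below are invented, but the pattern – compute a per-touch-point trend and flag deteriorating channels – is the kind of insight described above:

```python
# Illustrative sketch: compare past and present Voice-of-the-Customer
# sentiment per touch point. Scores and channels are invented; in
# practice they would come from the warehoused VoC data.
from statistics import mean

past_scores = {"call_center": [0.6, 0.7, 0.5], "web_chat": [0.8, 0.7]}
present_scores = {"call_center": [0.4, 0.3, 0.5], "web_chat": [0.8, 0.9]}

def channel_trends(past: dict, present: dict) -> dict:
    """Return the change in mean sentiment for each touch point."""
    return {ch: round(mean(present[ch]) - mean(past[ch]), 2)
            for ch in past if ch in present}

print(channel_trends(past_scores, present_scores))
# A negative value flags a touch point where the experience is degrading.
```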

So the past, present, and future combine to give us a complete 360-degree view of the customer experience. Big data solutions excel in this area by handling the sheer volume and rapid accumulation of this data and preparing it for Business Intelligence analytics – a workload that may be too taxing for your primary transactional systems.


4. Security Intelligence

It is important to recognize that the attributes that make big data valuable to the business also make it valuable to others—whether they’re hardened cyber criminals or a disgruntled system administrator looking to make a quick, illicit buck. Establishing effective security across the categories above—and the massive number of specific outputs, systems, and services that fall into each category—is both critical and challenging.

Further, given the massive, widely fluctuating processing demands associated with big data environments, many organizations are leveraging cloud-based services and platforms to support their big data initiatives. For those organizations running big data environments in the cloud, the task of managing security grows even more difficult. In the cloud, security teams have to contend with threats from the vendor’s infrastructure administrators, potential exposure to other tenants, and a number of other additional risks.

Some big data vendors provide traditional encryption approaches to protect data, but these break down once organizations move that data to the cloud. Big data solutions can provide security for all types of data that organizations store in the cloud, which can vary widely: audio, video, social media feeds, databases, error logs, and configuration files. Very sensitive information can end up in system logs, configuration files, disk caches, error logs, and so on. With this kind of security intelligence, organizations can sift through massive amounts of data, uncover fraud, and examine new sources and varieties of data.
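One small example of this kind of sifting: scanning log lines for sensitive values that should never sit in plain text. The patterns below are deliberately simplified illustrations, not a production-grade scanner:

```python
# Sketch of one "security intelligence" task: flagging log lines that
# contain sensitive values. Patterns are simplified for illustration.
import re

PATTERNS = {
    # 16 digits, optionally separated by spaces or dashes (naive check).
    "card_number": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
    # A password written out in a config dump or debug log.
    "password_field": re.compile(r"password\s*=\s*\S+", re.IGNORECASE),
}

def scan_line(line: str) -> list:
    """Return the names of the sensitive patterns found in a log line."""
    return [name for name, pat in PATTERNS.items() if pat.search(line)]

print(scan_line("DEBUG config reload: password=hunter2"))
```

At Big Data scale, a scan like this would run as a distributed job over the archived logs rather than line by line on one machine.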


5. Data-Driven Products and Services

Innovating new products and services is the lifeblood of any business. Unless businesses can develop offerings that closely align with customer needs and desires, how can they create new revenue streams, gain a competitive advantage, and boost customer loyalty?

Savvy companies are leveraging big data to create new, data-driven product and service offerings. Think about how you can harness your customer data, social media insights, transaction data, geo-location data and device data. All this data can be combined with third-party data to offer new services. For example, a media company could provide brands and advertisers with analytic reports about how customers behave using mobile apps, allowing them to optimize ads and boost responses.
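As a sketch of the media-company example, the snippet below rolls raw mobile-app events up into per-ad response rates for advertisers. The event schema (`ad_id`, `action`) is a hypothetical tracking format:

```python
# Hypothetical sketch of the analytics report described above: turn raw
# mobile-app events into per-ad response rates for advertisers.
from collections import defaultdict

events = [
    {"ad_id": "a1", "action": "impression"},
    {"ad_id": "a1", "action": "impression"},
    {"ad_id": "a1", "action": "click"},
    {"ad_id": "a2", "action": "impression"},
]

def response_rates(evts):
    """Return clicks divided by impressions for each ad_id."""
    shown = defaultdict(int)
    clicked = defaultdict(int)
    for e in evts:
        if e["action"] == "impression":
            shown[e["ad_id"]] += 1
        elif e["action"] == "click":
            clicked[e["ad_id"]] += 1
    return {ad: clicked[ad] / shown[ad] for ad in shown}

print(response_rates(events))
```

In production, the same aggregation would run over billions of events in the big data platform, with the results surfaced to brands as the analytic reports mentioned above.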

With data analytics, there are limitless possibilities. Apply smart, data-driven decision making to these mission critical business processes, and you will reap the rewards of high impact data analytics.

While cloud computing and SaaS are not new trends, their next phase will likely hinge on big data. Application providers are innovating around data and analytics architecture to make their products more powerful, intelligent, and valuable to customers. An embedded analytics interface inside the end-user application allows the vendor to fully capitalize on this innovation.
