Trends in Cloud Computing: Big Data’s New Home


IT organizations are increasingly looking to cloud computing as the best structure to support their big data projects.


by Satyendra Kumar Pasalapudi, January 2014

Big data environments require clusters of servers to support the tools that process the large volumes, high velocity, and varied formats of big data. IT organizations are increasingly looking to cloud computing as the structure to support their big data projects. While enterprises often keep their most sensitive data in-house, huge volumes of data, such as social media data, may be located externally. Analyzing the data where it resides, in either internal or public clouds, makes big data in the cloud more appealing in terms of both cost and time to insight.


With the increase in the amount of unstructured data from social media, more value can be extracted from big data when structured and unstructured data sets are merged and analyzed to gain competitive advantage. Data that is too big to process is also too big to transfer anywhere, so it is the analytical program that must move, not the data. Public clouds make this practical: most large public data sets, such as Facebook, Twitter, and Pinterest feeds, financial markets data, weather data, genome data sets, and aggregated industry-specific data, already live in the cloud, so it is more cost-effective for the enterprise to tame this data in the cloud itself.
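As a concrete illustration of moving the program to the data, the following minimal PySpark sketch aggregates a social media data set directly in cloud object storage. The bucket path and JSON schema are hypothetical, and it assumes a Spark cluster running in the same cloud as the data:

# Minimal sketch: run the analysis where the data lives instead of
# downloading terabytes first. The bucket path and field names are
# hypothetical; assumes a Spark cluster in the same cloud as the data.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("tweets-in-place").getOrCreate()

# Read the public data set directly from cloud object storage.
tweets = spark.read.json("s3://example-public-tweets/2014/01/*.json")

# Aggregate in the cloud; only the small result leaves the cluster.
top_tags = (tweets
            .select(F.explode("entities.hashtags.text").alias("tag"))
            .groupBy("tag")
            .count()
            .orderBy(F.desc("count"))
            .limit(10))

top_tags.show()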

Analyzing the data where it resides makes big data in the cloud more appealing in terms of both cost and time to insight.

Drivers for big data adoption in the cloud

Cost reduction: Cloud computing offers a cost-effective way to support big data technologies and the advanced analytics applications that can drive business value, helping enterprises unlock data’s hidden potential and deliver competitive advantage. Rather than buying and maintaining the clusters of servers that big data tools require, IT organizations can rely on the cloud’s pay-per-use model and pay only for the capacity their workloads actually consume.
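The pay-per-use argument is easy to make concrete. The back-of-the-envelope sketch below compares an owned cluster, amortized over three years, with renting equivalent capacity by the hour for a nightly batch window; every figure is an illustrative assumption, not a quoted price:

# Back-of-the-envelope cost comparison; every number here is an
# illustrative assumption, not a real price.
NODES = 20

# Owned cluster: hardware amortized over 3 years, plus power/admin.
server_cost = 8_000            # dollars per node, one-time
yearly_overhead = 1_200        # power, cooling, admin per node per year
owned_per_year = NODES * (server_cost / 3 + yearly_overhead)

# Cloud cluster: pay only for a 4-hour nightly batch window.
hourly_rate = 0.50             # dollars per node-hour
hours_per_year = 4 * 365
cloud_per_year = NODES * hourly_rate * hours_per_year

print(f"owned: ${owned_per_year:,.0f}/year")
print(f"cloud: ${cloud_per_year:,.0f}/year")
# Under these assumptions the cloud cluster costs a fraction of the
# owned one, because the owned hardware would sit idle 20 hours a day.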

Reduced overhead: Any big data implementation requires numerous components to be installed and integrated. With cloud computing, much of this work can be automated, reducing complexity and improving the IT team’s productivity.

Rapid provisioning/time to market: Provisioning servers in the cloud is as easy as buying something on the Internet, and big data environments can be scaled up or down easily to match processing requirements. Fast provisioning matters for big data applications because the value of data decays quickly over time.
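On most public clouds, provisioning a cluster really is a single API call. The sketch below uses the AWS boto3 SDK as one example; the image ID, instance type, region, and node count are placeholders:

# Sketch: provision a 10-node cluster with one API call.
# The AMI ID, instance type, and region are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",      # placeholder Hadoop-ready image
    InstanceType="m3.xlarge",
    MinCount=10,
    MaxCount=10,
)

ids = [i["InstanceId"] for i in response["Instances"]]
print(f"launched {len(ids)} nodes: {ids}")

# When the job finishes, terminate the nodes and stop paying.
ec2.terminate_instances(InstanceIds=ids)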

Thought Leader

Satyendra Kumar Pasalapudi (satyendra.pasalapudi@appsassociates.com) is a practice manager in the Infrastructure Managed Services Team at Apps Associates and an Oracle ACE.

Flexibility/scalability: Big data analysis, especially in the life sciences industry, can demand huge compute power for a brief period, with servers provisioned in minutes. The cloud delivers this kind of scalability and flexibility, replacing a huge investment in supercomputers with computing paid for by the hour.

Cloud computing gives enterprises cost-effective, flexible access to big data’s enormous stores of information, pairing vast amounts of on-demand computing resources with best-practice analytics. Both technologies will continue to evolve and converge in the future.

 
 