13.11.2014

MathWorks seminar

Venue: Didžioji g. 35/2, LT-01128 Vilnius

Time:

Price: FREE


How MATLAB can help you use “Big Data” to your business’ advantage

 

Tesco, the largest retailer in the UK, saves £100m per year through advanced data analytics with MATLAB.

 

“Big data” commonly refers to the dramatic increase in the amount and rate of data being created and made available for analysis. A primary driver of this trend is the ever-increasing digitization of information. Almost all industries face the challenge of taking advantage of this growing volume of data from sales and marketing systems, financial markets, experiments, sensors, video systems, and similar sources. How can you use more agile data analytics techniques to leverage your data, make better-informed business decisions, find new opportunities, and reduce risk?

 

Big data represents an opportunity for analysts and data scientists to gain greater insight and make more informed decisions, but it also presents a number of challenges. Big data sets may not fit into available memory, may take too long to process, or may stream in too quickly to store. Standard algorithms are usually not designed to process big data sets within reasonable limits of time or memory, and there is no single approach that fits every big data problem.

 

In this seminar we will demonstrate how MATLAB can become your platform to:

–  Understand Your Data – read data from multiple sources, then pre-process, filter, align, and visualize it

–  Put Your Data to Work – statistical analysis, machine learning, predictive modeling, and “what if” scenarios

–  Scale Up – create desktop apps and deploy your analytics to production systems, data warehouses, and “big data” environments
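The first two steps above can be sketched in a few lines of MATLAB. This is a minimal illustration, not the seminar's actual demo; the file name 'sales.csv' and its columns (Price, Promo, UnitsSold) are hypothetical.

```matlab
% Understand your data: read, clean, and visualize
T = readtable('sales.csv');
T = T(~any(ismissing(T), 2), :);   % drop rows with missing values
scatter(T.Price, T.UnitsSold);     % quick visual sanity check
xlabel('Price'); ylabel('Units sold');

% Put your data to work: fit a simple predictive model
mdl = fitlm(T, 'UnitsSold ~ Price + Promo');

% "What if" scenario: predicted sales at a new price point
newData = table(9.99, 1, 'VariableNames', {'Price', 'Promo'});
predictedUnits = predict(mdl, newData);
```

From here the same model object can be packaged with MATLAB Compiler or MATLAB Production Server for the "Scale Up" step.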

 

By using MATLAB as your big data platform, you get access to a number of tools for tackling these challenges:

–  Built-in statistics libraries, including powerful techniques for predictive analysis and machine learning

–  Parallelization, whether on your local multi-core computer, a GPU, a company cluster, or remotely on Amazon's Elastic Compute Cloud (EC2)

–  Streaming data – stream processing of data that is too large and/or arrives too fast to hold in memory

–  Application deployment – integration of your analysis and prediction algorithms with other applications, the web, etc.
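As an example of the streaming-data point, a datastore (available in MATLAB since R2014b) lets you process a file too large for memory in chunks. This is a sketch under assumptions: the file 'hugeLog.csv' and its Latency column are invented for illustration.

```matlab
% Process an out-of-memory CSV file chunk by chunk
ds = datastore('hugeLog.csv', 'TreatAsMissing', 'NA');
ds.SelectedVariableNames = {'Latency'};   % only read the column we need

% Accumulate a running mean without ever loading the whole file
total = 0; n = 0;
while hasdata(ds)
    chunk = read(ds);                 % reads one block, not the whole file
    x = chunk.Latency;
    x = x(~isnan(x));                 % ignore missing values
    total = total + sum(x);
    n = n + numel(x);
end
meanLatency = total / n;
```

The same chunked loop body could be parallelized with Parallel Computing Toolbox (e.g. a parfor over file partitions) when a multi-core machine or cluster is available.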