TEDx Conference Speaker on “Big Data”
Patrick Schwerdtfeger is a leading authority on global business trends including “big data” and the challenges and opportunities of massive data management and analytics. More and more devices are monitoring more and more activities, resulting in unprecedented quantities of data as well as the insights and opportunities hidden inside it. The trick is to identify the key performance indicators (KPIs) that reveal actionable insights. Patrick is the author of the award-winning book Marketing Shortcuts for the Self-Employed (2011, Wiley) and a regular speaker for Bloomberg TV. He has spoken at conventions and business events around the world. His approach to the data analytics topic is strategic and empowering. And since “big data” discussions tend to be dry and technical, Patrick’s dynamic and engaging speaking style makes him a perfect selection to end your conference on a high note.
Visit Patrick’s video blog and subscribe to be notified of future videos by clicking the button below.
Past clients include …
Recent speaking destinations include …
Big Data and Analytics Keynote Speech
Patrick begins his Big Data keynote program with the primary players involved. Aerospace, utilities, eGovernment initiatives, healthcare and financial institutions are all accumulating incredible amounts of data. The problem is that they don’t know how to organize it, process it or analyze it. In some cases, due to the magnitude of data involved, it takes networks of thousands of servers to run simple filters and queries (using technologies like Hadoop and MapReduce). The resulting challenge has led to entirely new educational specialties and professional occupations including data scientists and data engineers. These professionals specialize in the structuring and analysis of massive data sets.
Once the evolving trends have been introduced, Patrick’s keynote shifts to the opportunities. Algorithms and predictive analytics are the future. The accumulation of data is step #1. Identifying patterns and opportunities (analytics) is step #2. The final step is to create algorithms that deliver actionable insights, as services, to customers and prospects. There are countless examples of this including automated stock market timing, arbitrage trading, medical diagnostics and pharmaceutical procurement, weather forecasting models, internet search engines, kidney transplant networks and even sports journalism. Algorithms are the final link between big data and revenue.
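The three steps can be sketched in miniature. This is a hypothetical illustration only: the sales figures, the trend calculation, and the `recommend` function are all invented for the example, not drawn from any of Patrick’s case studies.

```python
# Step 1: accumulate data (here, a week of daily sales figures -- invented numbers).
sales = [120, 135, 128, 160, 155, 170, 180]

# Step 2: analytics -- identify a pattern. Here we use the average
# day-over-day change as a crude measure of trend.
changes = [b - a for a, b in zip(sales, sales[1:])]
trend = sum(changes) / len(changes)

# Step 3: an algorithm that turns the pattern into an actionable insight,
# the kind of output that could be delivered as a service to a customer.
def recommend(trend):
    """Convert a measured trend into a plain-language recommendation."""
    if trend > 0:
        return f"Demand rising (~{trend:.0f}/day): increase inventory."
    return "Demand flat or falling: hold inventory steady."

print(recommend(trend))  # Demand rising (~10/day): increase inventory.
```

The point of the sketch is the division of labor: the data, the pattern, and the algorithm are separate pieces, and only the last one produces revenue-relevant advice.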
To what extent is big data a reality in any particular field? Where are the opportunities? What are the technical requirements? And how can you identify and test theories efficiently? Depending on the audience, this might include innovations in data center technology, strategies for database management, identification of key performance indicators or the recruitment of data engineers and scientists. Regardless of the implications, Patrick strives to present an empowering angle for event attendees.
The Future of Big Data
Data storage and processing power continue to accelerate. Much of the current innovation revolves around parallel processing facilitated by Hadoop and MapReduce. As such, the term “big data” will refer to increasingly massive data sets as time goes on. Databases that seem unmanageable today will be commonplace tomorrow, and data storage companies like EMC continue to expand capacity along the way. But at any given point, the organizations that can effectively analyze their data will be the first to exploit new opportunities via analytics and algorithms. As consumers, we can expect increasingly intuitive product and service offerings as businesses better understand what factors influence purchase decisions.
The trends leading to “big data” are worth mentioning. Historically, data was accumulated and recorded by employees. As the internet evolved, data could then be entered directly by users. Facebook is a great example, accumulating over 25 terabytes of fresh data every single day. Later still, data started being generated by machines. Buildings are full of monitors, residential houses have smart meters and satellites record data about our planet 24 hours each day. Each evolution increased data collection by orders of magnitude, making traditional data processing (via relational databases) impossible.
To respond to increasingly large data sets, technologies like Hadoop and MapReduce introduced “parallel processing” as a new option. Historically, we took the data to the processor. Today, we bring the processors to the data. Each server contains its own CPU, making data processing as scalable as the data itself. Google is leading the charge. In time, companies and governments will be able to mine their proprietary data for consumer insights never before revealed, and we can all expect an increasingly intuitive world as a result.
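The MapReduce pattern itself is simple enough to sketch. Here is a minimal single-machine word-count example, assuming the classic two-phase model: in a real Hadoop cluster the map phase runs on the server that already holds each document (bringing the processor to the data), and the reduce phase combines the partial results.

```python
from collections import defaultdict

def map_phase(document):
    """Map: emit (word, 1) pairs. In Hadoop, this runs locally on the
    server holding the data, rather than shipping the data elsewhere."""
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    """Reduce: sum the counts per word, merging output from all mappers."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

documents = ["Big Data", "big data analytics", "data"]
pairs = [p for doc in documents for p in map_phase(doc)]
print(reduce_phase(pairs))  # {'big': 2, 'data': 3, 'analytics': 1}
```

Because each document is mapped independently, adding servers adds capacity almost linearly, which is why the approach scales to data sets that a single relational database cannot handle.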
Because of the surging demand for Big Data expertise, Patrick has created a small website dedicated to that topic. You’re welcome to visit that website by clicking here.