Goals
To prepare industry, science, and society in Germany and Europe optimally for the global Big Data trend, tightly coordinated activities in research, teaching, and technology transfer are required that integrate data analysis methods with scalable data processing. To achieve this, the Berlin Big Data Center is pursuing the following seven objectives:
- Pooling expertise in scalable data management, data analytics, and big data applications.
- Conducting fundamental research to develop novel and automatically scalable technologies capable of performing “Deep Analysis” of “Big Data”.
- Developing an integrated, declarative, highly scalable open-source system that enables the specification, automatic optimization, parallelization and hardware adaptation, and fault-tolerant, efficient execution of advanced data analysis problems, using a variety of methods (e.g., from machine learning, linear algebra, statistics and probability theory, computational linguistics, or signal processing), building on our work on Apache Flink (a brief illustration follows this list).
- Transferring technology and know-how to support innovation in companies and startups.
- Educating data scientists with respect to the five big data dimensions (i.e., application, economic, legal, social, and technological) via leading educational programs.
- Empowering people to leverage “Smart Data”, i.e., to discover new information in their massive data sets.
- Enabling the general public to engage in sound, data-driven decision-making.
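To make the third objective more concrete, the following is a minimal sketch of a declarative Apache Flink program in Java: the developer specifies only the logical dataflow (here, a simple word count over an in-memory collection), and the Flink runtime takes care of optimization, parallelization, and fault-tolerant execution. The example uses the standard Flink DataSet API; the class name and sample data are hypothetical and not part of any BBDC system.

    import org.apache.flink.api.common.functions.FlatMapFunction;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.util.Collector;

    public class DeclarativeWordCount {

        public static void main(String[] args) throws Exception {
            // The execution environment abstracts over local and cluster execution;
            // the same program runs unchanged on a laptop or a large cluster.
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            // Hypothetical in-memory sample data standing in for a massive data set.
            DataSet<String> text = env.fromElements(
                    "deep analysis of big data",
                    "scalable data processing for big data");

            // Declarative specification of the analysis: tokenize, group, aggregate.
            // Flink's optimizer picks the physical execution strategy and parallelizes
            // the dataflow; the programmer does not manage threads, partitioning,
            // or recovery explicitly.
            DataSet<Tuple2<String, Integer>> counts = text
                    .flatMap(new Tokenizer())
                    .groupBy(0)
                    .sum(1);

            counts.print();
        }

        /** Splits each line into (word, 1) pairs. */
        public static final class Tokenizer
                implements FlatMapFunction<String, Tuple2<String, Integer>> {
            @Override
            public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
                for (String word : line.toLowerCase().split("\\W+")) {
                    if (!word.isEmpty()) {
                        out.collect(new Tuple2<>(word, 1));
                    }
                }
            }
        }
    }

The point of the sketch is the separation of concerns: because the analysis is expressed declaratively, the system is free to optimize and scale its execution automatically, which is exactly the property the center's research aims to extend from relational-style queries to advanced analytics such as machine learning and signal processing.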