Accelerate the Transformation
Speed up data processes, Optimize faster for Value


What is BDU?
Big Data Universe Conference is the #1 Big Data HUB for data and IT professionals in Hungary. Over the last two years we have hosted almost 600 attendees from nearly every European country. Attendees include CEOs and CTOs of both the world's fastest-growing startups and largest companies, alongside leading investors and many data enthusiasts.
Our aim with BDU is to provide data lovers and IT professionals with the latest trends and insights in Big Data management and analytics solutions, along with valuable advice and implementable use-case examples.
The one-day agenda features presentations from industry luminaries and data experts, and provides outstanding networking opportunities during and after the event.
Big Data Universe Conference 3.0 will focus on practical and technical innovations in the following topics:
Automated chat communication, learning bots
Expanded visualization in 3D with BI platforms
Self-driving cars (AImotive)
Exploring the explosive world of financial technology
Internet of Things & Internet of Everything
SPEAKERS
Keynote speaker

Kubernetes and cloud-native technologies are bringing a significant paradigm shift to infrastructure, big data deployments and applications, and these changes are fast, not incremental. We claim that the existing big data frameworks are all built on outdated infrastructural components and that the proposed changes are too little, too late. We are essentially making the pitch that Kubernetes solves these problems and enables a better separation of concerns between compute, SQL, streaming and the underlying infrastructure, be it cloud, on-prem or hybrid. Kubernetes has become the de facto standard, a “runtime fabric” purposely designed and built for the cloud and for scale that does things the right way: it is an overall game changer, and we hope that the big data landscape will benefit from it.
This talk familiarizes the audience with these changes and the drawbacks and benefits of each, and also walks through our journey of implementing Pipeline, a Heroku/Cloud Foundry-like PaaS where we move these frameworks from monoliths to microservices.
János Mátyás
CTO
@ Banzai Cloud
Living the paradigm shift

At Neticle we have designed automated social media text insight discovery methods that let the machine recognize interesting patterns. We have also focused on how to communicate these findings to our users, so we have created a Natural Language Generation (NLG) solution to generate readable, multilingual insights automatically. I will show you how we built this solution and how you can create a similar one.
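Neticle has not published its implementation, but the core idea of turning a detected insight into readable text can be sketched in a few lines. The hypothetical Python snippet below assumes an upstream discovery step that emits insights as dictionaries and uses simple per-language templates; every field name and template here is made up for illustration only.

```python
# Minimal, hypothetical sketch of template-based NLG for data insights.
# Not Neticle's implementation; it only illustrates turning a detected
# pattern (an "insight") into a readable, language-specific sentence.

TEMPLATES = {
    "en": "Mentions of {topic} {direction} by {change:.0%} last week, driven mostly by {source}.",
    "hu": "A(z) {topic} említései {change:.0%}-kal {direction_hu} a múlt héten, főként a(z) {source} felületen.",
}

def render_insight(insight: dict, lang: str = "en") -> str:
    """Fill a language-specific template with the fields of a detected insight."""
    direction = "rose" if insight["change"] > 0 else "fell"
    direction_hu = "nőttek" if insight["change"] > 0 else "csökkentek"
    return TEMPLATES[lang].format(direction=direction, direction_hu=direction_hu, **insight)

# Example: an insight the pattern-discovery step might emit.
insight = {"topic": "mobile banking", "change": 0.42, "source": "Facebook"}
print(render_insight(insight, "en"))
print(render_insight(insight, "hu"))
```

Production systems typically add grammar-aware realization and variation on top of such templates, but the template layer is usually where multilingual support starts.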
Péter Szekeres
CEO
@ Neticle
Text Insight Discovery: From NLP to NLG

Mate has more than a decade of experience in Big Data architectures, data analytics pipelines, infrastructure operations, and growing organisations by focusing on culture. Mate also teaches Big Data analytics at the Budapest University of Technology and Economics, and is a speaker and organiser of local and international conferences and meetups.
Data-driven solutions are eating up the industry. We call it Industry 4.0, the next industrial revolution. We are still in the early stages of this significant transformation, but we already see emerging patterns. Which technologies will come out as the winners of the coming change? What tools and technologies should you invest in? Are Kafka, Spark, AWS and cloud solutions the Holy Grail of IoT? Will AI live up to the hype? What are the low-risk, high-return strategies in data analytics? We'll try to answer these questions and look into what the future will bring us in data analytics during this industrial revolution.
Máté Gulyás
Partner and Instructor
@ Datapao
Winning strategies and technologies in Industry 4.0 and IoT

The Austrian research project IES-Austria strives to adapt and implement a vendor-neutral and cooperative method to achieve interoperability of ICT systems in smart energy systems. It is based on an existing method from ICT in healthcare, where interoperability of systems has long been achieved. Integrating the Healthcare Enterprise (IHE) is a global non-profit organisation that engages actors in the health system to achieve interoperability of ICT systems in healthcare. IHE developed a fair, cooperative and participatory method to engage vendors, manufacturers and users alike. The modular, well-defined process starts with the definition of use cases, followed by cooperatively developing “interoperability profiles” which select, optimise and specify the normalised use of existing technology standards that address the well-defined use cases. These profiles assemble specific “base standards” which together provide complete technical specifications that cover all interoperability issues (e.g. data formats, transport protocols, semantics and security methods).
Karl Knöbl
Senior Project Manager
@ University of Applied Sciences – FH Technikum Wien
Learning from e-Health – Interoperability Process for the Energy System

His mission is to spread data-driven decision making, solving strategy, sales and operations management problems with econometric and data mining models. He specializes in potential-based sales planning and forecasting using geodata, loyalty-based customer behaviour models, demand curve and price elasticity determination, and BI consulting.
Since 2006 Gábor has been a guest lecturer at the Corvinus University of Budapest, teaching regularly at seminars and at local and international conferences. He believes that citizen data scientists can leverage the benefits of analytics and act as accelerators of integrating data innovations, so cross-disciplinary knowledge development has to be supported in every education channel.
He manages projects in Advanced Analytics, Statistical Analysis, Geoinformation-Based Sales Analytics and Forecasting, CPM Suites, BI Consulting, Business Modeling, Planning and Reporting, Management Control, Management Accounting and Cost Calculation, and Performance Improvement.
A story of price elasticity, demand modeling and price optimization based on transaction data enriched with external data sources. During the presentation you will learn about the challenges, pitfalls and success stories of such projects.
How can we model the demand curve? Why is it interesting to calculate the price elasticity of a product, and what kind of conclusions can we draw? How do external factors - such as seasonality and the prices of competitors - affect demand?
How can we accelerate the transformation by establishing understanding and buy-in inside the organisation to make the results actionable (data science models without business actions are nothing but art)?
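The speakers do not publish their models here, but one standard starting point for these questions is a constant-elasticity (log-log) demand specification, where the coefficient on log price is the price elasticity. The sketch below uses synthetic data and plain NumPy purely as an illustration of that idea, not as the presenters' actual approach.

```python
# Minimal sketch of constant-elasticity demand estimation (not IFUA's actual model).
# In a log-log specification  log(Q) = a + b*log(P) + c*season + ...  the coefficient
# b is the price elasticity of demand: a 1% price change moves demand by roughly b%.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic transaction data: price, a seasonality dummy, and observed quantity.
price = rng.uniform(80, 120, size=500)
season = rng.integers(0, 2, size=500)            # e.g. 1 = high season
true_elasticity = -1.8
quantity = np.exp(5.0 + true_elasticity * np.log(price) + 0.3 * season
                  + rng.normal(0, 0.1, size=500))

# Ordinary least squares on the log-log model.
X = np.column_stack([np.ones_like(price), np.log(price), season])
coef, *_ = np.linalg.lstsq(X, np.log(quantity), rcond=None)
elasticity = coef[1]

print(f"estimated price elasticity: {elasticity:.2f}")   # close to -1.8
# |elasticity| > 1 means demand is elastic: a price cut raises revenue, all else equal.
```

Real projects add competitor prices, promotions and other external factors as further regressors, which is exactly where the enriched data sources mentioned above come in.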
Gábor Ádám
Head of Advanced Analytics
@ IFUA Horváth & Partners Kft.

Levente Havas
Head of Enterprise Analytics Competence Center
@ IFUA Horváth & Partners Kft.
Accelerate the Transformation with Advanced Analytics in Pricing

Everyone talks about how machine learning will transform business forever and generate massive outcomes. However, it's surprisingly simple to draw completely wrong conclusions from statistical models, and “correlation does not imply causation” is just the tip of the iceberg. The trend of democratization in data science further increases the risk of applying models in the wrong way. In this talk, we will discuss capital mistakes as well as small errors that add up to completely ruin the potential positive impact of many data science projects. Attending this talk will hopefully help you avoid many of those mistakes.
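The talk's concrete examples are not listed on this page, but one classic way to fool yourself is judging a model by its accuracy on the very data it was trained on. The hypothetical sketch below (scikit-learn assumed) shows how a flexible model scores perfectly in-sample on pure noise yet performs no better than chance under cross-validation.

```python
# Sketch of one classic pitfall (not necessarily one covered in the talk):
# judging a model by its accuracy on the training data.
# On pure noise a flexible model looks perfect in-sample and useless out-of-sample.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 20))          # random features
y = rng.integers(0, 2, size=500)        # labels with NO relation to the features

model = DecisionTreeClassifier(random_state=0)
model.fit(X, y)

print("accuracy on training data:", model.score(X, y))                         # ~1.00
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())  # ~0.50
```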
Zoltán Prekopcsák
VP Engineering
@ RapidMiner
How to Ruin Your Business with Machine Learning & Data Science

Blockchain is a new technology for connecting partners that do not completely trust each other. In enterprise systems, every partner in a blockchain needs to connect its own systems of record, create completely new user interfaces, or both. In most cases there is also additional data that needs to be stored off-chain. Backend as a Service provides possibilities to store data, create APIs and frontend SDKs with analytics, and even connect existing backends like ERPs, CRMs or databases with out-of-the-box connectors. Both technologies combined can reduce the effort in blockchain projects. Examples will be shown in the presentation.
Dr. Lutz Kohl
Co-Founder
@ ApiOmat
Blockchain meets Backend as a Service

Blockchain-based Distributed Ledger Technologies (DLT) are rapidly transforming established B2B and B2C cooperation patterns; innovative solutions range from ledger-based non-repudiable communication to radical disintermediation scenarios.
There are many exciting and disruptive new possibilities at the intersection of DLT and data-driven cooperation, with applications from IoT and cyber-physical systems through healthcare to supply chain management. However, at the same time, real-life applications have to fulfill a number of requirements beyond functionality - e.g. data visibility, privacy, process orchestration and performance - that make solution development a much more nuanced task than “smart contract programming” in itself.
The presentation gives a broad overview of these novel application patterns, and demonstrates how Blockchain technologies of the Linux Foundation Hyperledger project can be used as a framework to realize data-driven cooperation solutions that fulfill key enterprise and industrial requirements.
Imre Kocsis
Assistant Lecturer
@ BME
Blockchain as a platform for data-driven cooperation

Big Data vendors often say: “Save all of your logs, or any kind of information, as raw data. Storage is endless and you never know what data could come in handy in the future, or in what format you'll need it.” However, when you want vendor support and use quality hardware, Big Data suddenly becomes expensive. It feels even more expensive because you won't ever use the majority of the data. Using syslog-ng for logging can optimize your Big Data costs in multiple ways:
- Parse and enrich log messages (or any kind of collected data) to help message filtering. Discarding useless messages can significantly reduce the amount of data to be stored.
- Convert messages to the format that the processing applications use, or save them in a structured format that is easy to process, for example JSON. Syslog-ng can do this in real time on the clients, saving you processing time in the Big Data server farm.
- Collect logs and forward them directly to your Big Data destination (for example Elasticsearch, Apache Kafka or Hadoop) without having to install additional applications, simplifying your infrastructure.
- The processing capabilities of syslog-ng are much better than those of most Big Data client applications: it can process more messages using fewer resources.
- Processing, reformatting, and filtering messages saves on the storage, maintenance, and processing costs of your Big Data cluster.
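The parsing, filtering and JSON reformatting described above are done in syslog-ng's own configuration language; the hypothetical Python sketch below only illustrates the same parse-filter-reformat idea, i.e. why dropping useless messages and structuring the rest before they reach the cluster saves storage and processing. The regex, sample lines and field names are made up for illustration.

```python
# Conceptual sketch of the parse -> filter -> reformat step described above.
# In practice syslog-ng does this in its configuration; this Python version only
# illustrates why discarding and restructuring messages early saves storage.
import json
import re

# Hypothetical pattern for a simple sshd syslog line.
SSH_LOGIN = re.compile(
    r"(?P<host>\S+)\s+sshd\[\d+\]:\s+(?P<result>Accepted|Failed) password for "
    r"(?P<user>\S+) from (?P<ip>\S+)"
)

def to_structured(line: str):
    """Return a JSON-ready dict for interesting messages, or None to discard."""
    match = SSH_LOGIN.search(line)
    if match is None:
        return None                      # useless message: never reaches the cluster
    return match.groupdict()             # structured, easy to process downstream

raw = [
    "gateway sshd[1023]: Accepted password for alice from 10.0.0.5 port 50022 ssh2",
    "gateway CRON[2200]: (root) CMD (run-parts /etc/cron.hourly)",
]
for line in raw:
    record = to_structured(line)
    if record is not None:
        print(json.dumps(record))        # e.g. forwarded to Kafka/Elasticsearch as JSON
```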
Péter Czanik
Syslog-ng Evangelist
@ Balabit
Save all or save costs?

If you're not into eSports, some of the stats surrounding this rapidly growing industry can be surprising. Electronic video gaming has extended from being a hobby into a serious sport and business. Every day, millions and millions of people play these games or watch them as spectators, and these activities produce a significant amount of easily obtainable data. The data is stored in replay files and made accessible online, providing us the opportunity not only to visualize and analyse it, but also to dive into the details at a level that is unprecedented in other sports. In my talk I will present our solution for after-game analytics and how commentators can make the most of it.
Attila Komor
Data Analyst
@ Nextent
“For the win” – eSport as a goldmine for data analytics

When we originally started our first application to create a SaaS solution for energy management, we didn't anticipate such a quick ramp-up in demand. The application was aimed at large, multinational companies, where it can bring significant savings and a fast return on investment. We did a handful of implementations in the last 18 months and now have 3 major branches of the same core solution.
As we gained more traction, even with mid-size companies where our value-added intelligent analytics can generate significant returns while reducing manual work, we started to rethink our architecture. Our current cloud-based, single-tenant architecture runs on Amazon, leveraging Kubernetes, Angular and a handful of current technologies. Our aim is to integrate the branches into one multi-tenant application, where we can provide scalable, tailor-made services more cost-effectively while lowering the operational requirements of the system. The presentation will go into the whys and hows of this journey.
Ádám Szücs
Chief Architect of Data Analytics & Visualization of IoT-Energy Branch
@ Nextent
Challenges of moving from single-tenant to multi-tenant application architecture

The presentation gives a rational abstract of chained applications such as:
- Parameter settings for automated alert-generation systems
- Possibilities of suspicion generation based on data assets about energy use
- Development of a controlling/monitoring system for base stations in telecommunications
- Possibilities of automated outlier detection in time series of energy consumption
László Pitlik
CEO & Associate Professor
@ CREW-PA Ltd., SZIE
Suspicion-generation - theory and case studies

Most employees of any organization in the world have only a few data points about how they work every day: the time they start and finish work, plus the task(s) they work on during the day. World-class employees want to know more: they look at how they spend their time, and they need deep insights so they can make data-driven decisions. Come and see how Crossover scales up the organization in a fast-paced environment, by creating world-class teams and taking productivity to the next level.
Ovidiu Gavril
VP of Engineering
@ Crossover
The future of work in a data-driven culture

He is engaged in various international projects, such as the DARPA UPSIDE program, developing efficient algorithms for object detection and recognition that exploit non-Boolean architectures.
The projects and solutions implemented by his group and faculty serve, among other fields, the healthcare, finance, biology, automotive and agricultural industries.
András regularly lectures and facilitates workshops in machine learning, and he collaborates with international research and academic partners such as Intel, Hughes Research Lab, MIT and the University of Notre Dame.
As Algorithm Development Lead, he has also worked at a former US-Hungarian startup, later acquired by a leading global telecommunications company, on advancing cutting-edge computer vision technologies for smart cities.
Modern machine learning approaches have brought new opportunities in various fields where sufficient training data is available, and these methods have enabled us to solve extremely complex tasks, but human learning still outperforms algorithmic approaches in various respects.
We can learn new concepts from a handful of examples; meanwhile, when the amount of labeled data is limited, or generating a large dataset would be extremely resource-consuming, machine learning approaches tend to reveal their weaknesses.
The aim of this talk is to introduce a couple of techniques which can help increase the generalization power of neural networks and reduce their strong dependence on extreme amounts of data.
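The specific techniques are not listed on this page, but two standard ways to reduce dependence on labeled data are data augmentation and transfer learning from a pretrained network. The sketch below assumes PyTorch/torchvision and a hypothetical 10-class image task; it is an illustration of those general ideas, not the speaker's material.

```python
# Sketch of two standard data-hunger reducers (the talk's exact techniques may differ):
# (1) data augmentation, which multiplies the effective size of a small labeled set, and
# (2) transfer learning, which reuses features learned on a large dataset (ImageNet).
import torch
import torch.nn as nn
from torchvision import models, transforms

# (1) Label-preserving random transformations applied on the fly during training.
train_transform = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.ToTensor(),
])

# (2) Start from a pretrained backbone, freeze it, and train only a small new head.
model = models.resnet18(pretrained=True)
for param in model.parameters():
    param.requires_grad = False                   # keep the generic ImageNet features
model.fc = nn.Linear(model.fc.in_features, 10)    # new head for a hypothetical 10-class task

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```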
András Horváth
Associate Professor
@ Peter Pazmany Catholic University, Faculty of Information Technology and Bionics
Reducing Data Hunger of Machine Learning Algorithms

He is the CEO and owner of Com-Forth Ltd., a Hungarian family-owned business (founded in 1988) that deals with innovative industrial IT solutions and plays a significant role in introducing IoT and Industry 4.0 technologies in the country. Com-Forth's main mission is to support manufacturing and other industrial companies along their journey to Industry 4.0; its primary expertise is automatic, real-time data collection, the foundation of any future Big Data analysis.
Collecting data is the basis of data analytics. If you don't have accurate, automatically collected, real-time data, then data analytics doesn't make much sense. Recently many studies have been published on the main challenges and barriers to starting the industrial IoT journey. These challenges relate first of all to connectivity (collecting real-time data directly from the processes), secondly to creating a bridge between IT and OT (Operational Technology), and last but not least to cybersecurity. During my presentation I will introduce some ideas and best practices for handling real-time data acquisition and connecting OT systems such as PLCs and SCADA directly to the cloud for data analytics.
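The talk does not prescribe particular protocols or vendors; as one common pattern, readings collected on the OT side are pushed to a cloud broker over MQTT. The sketch below assumes the paho-mqtt Python client; the broker address, topic and payload fields are hypothetical.

```python
# Hypothetical sketch of pushing a PLC/SCADA reading to a cloud MQTT broker.
# The talk does not prescribe a protocol; MQTT is simply a common choice for
# OT-to-cloud telemetry. Broker address, topic and payload fields are made up.
import json
import time

import paho.mqtt.client as mqtt

BROKER = "broker.example.com"        # hypothetical cloud endpoint
TOPIC = "factory/line1/temperature"

# paho-mqtt 1.x style; with paho-mqtt 2.x pass mqtt.CallbackAPIVersion.VERSION2 first.
client = mqtt.Client()
client.connect(BROKER, 1883, keepalive=60)
client.loop_start()                   # handle network traffic in the background

# In a real deployment the value would come from the PLC (e.g. via OPC UA or Modbus).
reading = {"sensor": "line1-temp-01", "value": 72.4, "ts": time.time()}
client.publish(TOPIC, json.dumps(reading), qos=1)

client.loop_stop()
client.disconnect()
```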
Péter Bóna
CEO @ Com-Forth Ipari Informatikai Kft.
3 Major Considerations to Bring Industrial IoT to Reality

How NEXOGEN helps Waberer's transcend its industry: “The difficulty lies not so much in developing new ideas as in escaping from old ones.” - John Maynard Keynes
Waberer's is the European leader in full-truckload transportation with an industry-leading 92% utilization rate. The company achieved 18% growth in revenue with 16.5% growth in profit in 2017. This is a story about a small team with a big dream and their journey to tackle challenges in optimization and automation in the transportation logistics industry with Waberer's. They are called NEXOGEN, and their dream is to help shape a world where all companies are built around smart algorithms to automate and optimize their operations. Turning data into actions and visions into reality.
Giang Le Hoang
COO - Production and Commercial Operations
@ Nexogen
Turning data into action, a simple idea into reality - A NEXOGEN case study

During the presentation we will walk through how Magyar Telekom created its Data Lake. We will cover the challenges faced and overcome during implementation and operation. We will discuss how a DWH and a Data Lake can co-exist, and how Data Science activity can augment traditional development lifecycles in a truly enterprise-grade production Big Data environment.
Márton Kelemen
Head of BI Innovation Center
@ Magyar Telekom Nyrt.
Data Lake Evolution at Magyar Telekom
BEST MOMENTS
CONTACT US
Orsolya Kovács
Mobile: +36.30.560.2927
kovacs.orsolya@nextent.hu
For sponsorship opportunities, please contact:
For press inquiries:
SPONSORS
TICKETS
The conference is FREE OF CHARGE but attendees must register via Eventbrite.
However, VIP tickets are also sold (35,000 HUF), giving you the excellent opportunity to spoil yourself with quality gourmet food and PaaS (Pálinka as a Service), and to leverage excellent networking possibilities.
