INNOQ Technology Lunch: Authorization with RBAC, ABAC, ReBAC, PBAC – What the Heck? (02 Jul, 12:15)
Modern data engineering focuses primarily on developing modular data products. This article outlines the advantages of modularity over monolithic data pipelines and explains, step by step, how to develop data products using Databricks – from defining a data contract to creating and implementing Databricks Asset Bundles, setting up a CI/CD pipeline, and publishing metadata.
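To make the Asset Bundle step a little more concrete, here is a minimal PySpark sketch of the kind of transformation such a bundle could deploy as a job task. The table and column names (raw.orders, data_products.orders_daily, order_ts, amount) are illustrative assumptions, not taken from the article.

```python
# Minimal sketch of a data product transformation that a Databricks
# Asset Bundle could deploy as a job task.
# Table and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

# In Databricks notebooks a SparkSession is provided implicitly;
# getOrCreate() keeps the script self-contained elsewhere.
spark = SparkSession.builder.getOrCreate()

orders = spark.read.table("raw.orders")

# Aggregate raw orders into a daily revenue view of the data product.
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_amount"),
    )
)

# Publish the output port of the data product as a managed table.
daily_revenue.write.mode("overwrite").saveAsTable("data_products.orders_daily")
```

In a bundle, a script like this would typically be referenced as a job task in databricks.yml and deployed per target (for example dev and prod) through the CI/CD pipeline.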
Time is a valuable asset, and it is easily consumed by analyzing and preparing data. That alone may be good reason to come to grips with BI software.
A data contract defines the structure, format, semantics, quality, and terms of use for data exchanged between a data provider and its consumers. It is thus a central tool that lets teams agree on data interfaces and thereby ensures stability, data quality, and traceability in the data architecture.
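As an illustration only, the sketch below shows roughly which kinds of information such a contract captures; the field names, teams, and rules are assumptions for this example, not a specific contract standard.

```python
# Illustrative sketch of the parts a data contract covers.
# All names and rules are hypothetical.
from __future__ import annotations
from dataclasses import dataclass, field


@dataclass
class ColumnSpec:
    name: str
    dtype: str        # structure and format, e.g. "decimal(10,2)"
    description: str  # semantics


@dataclass
class DataContract:
    provider: str
    consumers: list[str]
    columns: list[ColumnSpec]
    quality_rules: list[str] = field(default_factory=list)  # e.g. "amount >= 0"
    terms_of_use: str = ""


# Hypothetical contract between a checkout team and an analytics team.
orders_contract = DataContract(
    provider="checkout-team",
    consumers=["analytics-team"],
    columns=[
        ColumnSpec("order_id", "string", "Unique identifier of an order"),
        ColumnSpec("order_ts", "timestamp", "Time the order was placed (UTC)"),
        ColumnSpec("amount", "decimal(10,2)", "Order total in EUR"),
    ],
    quality_rules=["order_id is unique", "amount >= 0"],
    terms_of_use="Internal analytics only; personal data must not be re-exported.",
)
```

In practice, such contracts are usually maintained as YAML documents alongside the data product and checked automatically, for example in the CI/CD pipeline.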
Have you heard of data mesh? Are you intrigued by its potential but uncertain how to get started building a data mesh and data products? If so, this article outlines a potential approach and delves into the key concepts behind it!
Together with our customer CluePoints, we evaluated new technologies, tools and standards for data storage, data processing, data versioning, and data lineage. These might become useful for refactoring their self-serve data platform.
You know what a data mesh is? You understand its basic principles? But you don’t know how on earth to arrive at the data products? Then I will show you how to extract your data products from your Domain-driven Design (DDD) artifacts.