Disclaimer: ATTOW, Zhamak Dehghani is writing an (awesome and challenging) book about her vision. Therefore, I will stay as far as possible from its (current) content to avoid any spoilers. That is why I am not expanding much on the definition of Data Mesh; if you need more, I recommend (re)reading her brilliant blog posts, e.g., How to Move Beyond a Monolithic Data Lake to a Distributed Data Mesh.
Before getting started with the content, let me preview what you will walk away with after reading this article.
In this blog, I want to introduce you to something I have had to deal with whether as an engineer, a scientist, a leader, or a manager: Datastrophes.
Nevertheless, because most of these events also carry an opportunity, I will not only cover the dark side of datastrophes but also what you can gain from them, and how.
Here are the key takeaways in this article:
- The success of a data project depends on a series of assumptions being valid and robust (a minimal sketch of such checks follows this list).
- A datastrophe is an impactful event occurring in a (managed) data system.
- A datastrophe always has a good…
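To make the first takeaway concrete, here is a minimal sketch of what encoding and checking such assumptions could look like. It is not taken from the article: the `orders` dataset, its columns, and the three rules are hypothetical, and pandas is used only for illustration.

```python
import pandas as pd


def check_assumptions(orders: pd.DataFrame) -> list:
    """Return the list of violated assumptions (empty means they all hold)."""
    violations = []

    # Assumption 1: every order references a customer.
    if orders["customer_id"].isnull().any():
        violations.append("customer_id contains nulls")

    # Assumption 2: amounts are strictly positive.
    if (orders["amount"] <= 0).any():
        violations.append("amount contains non-positive values")

    # Assumption 3: order dates are never in the future.
    if (pd.to_datetime(orders["order_date"]) > pd.Timestamp.now()).any():
        violations.append("order_date contains future dates")

    return violations


if __name__ == "__main__":
    # A tiny, deliberately broken dataset: each flaw breaks one assumption.
    orders = pd.DataFrame({
        "customer_id": [1, None, 3],
        "amount": [10.0, 25.0, -4.0],
        "order_date": ["2021-01-04", "2021-02-11", "2031-01-01"],
    })
    for violation in check_assumptions(orders):
        print(f"Datastrophe in the making: {violation}")
```

In practice, checks like these would live in the pipeline itself (as tests, data contracts, or monitoring), so a broken assumption surfaces as an alert rather than as a silent datastrophe downstream.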
When we think about enterprise-level management, many tools and systems are available for the different key positions in an organization.
Although the CDO, CDMO, Head of Data, or Head of Analytics will leverage Data Management, a critical aspect of the data lifecycle won’t be intrinsically covered:
Managing the value of data throughout its various usages.
This is what I’ll cover here, in the first entry of a series of blog posts about Data Intelligence Management.
Here are the key takeaways in this article:
…