MetaDAMA - Data Management in the Nordics


Author: Winfried Etzel, VP Activities, DAMA Norway



This is DAMA Norway's podcast, created as an arena for sharing experiences within Data Management, showcasing competence and the level of knowledge in this field in the Nordics, getting in touch with professionals, spreading the word about Data Management, and not least promoting Data Management as a profession.
33 Episodes
«The real value in changing the status quo and getting more women to the forefront is having women share the important work that they do, and not just talk about their gender as a topic.»

This episode was recorded right before Christmas, when I had the pleasure to chat with Alexandra Gunderson and Sheri Shamlou. Alexandra, inspired by a Women in Data dinner in New York, took it upon herself to find like-minded people in Norway. That is how she came across Women in Data Science, the conference that was brought to Norway by Heidi Dahl in 2017. The first meetup as a community was in June 2018, and this year's WiDS event, «Crossing the AI Chasm», is coming to Oslo (and digitally) on May 24th, 2023.

Here are my key takeaways:

Women in Data Science (WiDS)
- «Creating a meeting place, a place for people to connect and get inspired» - creating a platform and stage for outstanding women.
- Some of the events WiDS organizes:
  - «Champagne Coding» - a hands-on event
  - «Data after Dark» - an after-work event with 1-2 quick, high-level presentations
  - «Data for Good» - getting together to solve difficult challenges for greater causes
- An important mission is to increase the number of role models in the community to look up to.
- The goal is to provide arenas to learn together, so it is just as important to share stories about failure and collaborate around the learnings from those.
- WiDS is looking for sponsors, and one benefit is that real-life use cases can be solved through events.
- The focus for 2023 is «scalability»: how do you get unstuck from ML and AI pilots and bring your work to production?

The Quest for Diversity
- Diversity is a complex topic with several perspectives: gender, nationality, background, knowledge, expertise, and experience.
- Why is diversity important?
  - It leads to more innovative and effective solutions.
  - It leads to more fair and just outcomes.
- The starting point when working with diversity on a daily basis is awareness.

Diversity in the workplace
- Diversity doesn't magically happen. You have to work for it.
- Awareness is a first step, but you also need to collaborate in broader groups.
- The value is gained when you are able to include everyone in your events and talks.
- For people to work together against biases of any kind, you need an inclusive culture from the beginning.
- Be open in your communication and foster a culture of collaboration.
- The «3rd shift» is an important prerequisite for women to be able to spend the same amount of time and intellectual capacity at work.
- The work for an inclusive work environment is never over. We have to continuously work on it and talk about it.

Diversity in recruitment
- You have to actively seek to hire people with different backgrounds.
- In recruitment, be aware of how you write a job announcement. Use gender-neutral language (avoid terms like «Data Science Ninja» or «Data Rock Star»).
- There are online applications that check whether your language is gender neutral and suggest replacements for biased words.
- Highlight more of the possibilities that come with a job, like growth and learning opportunities.
- Minimize the list of requirements in a job posting.
- Be aware of your own biases and work with diverse teams in recruiting as well.
- When screening CVs, be aware that different people write in different styles.
- Be mindful of the interview process; e.g. many women don't like to do coding tests with someone watching them.

Get involved: LinkedIn group, Meetup or …
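The episode mentions online tools that flag gendered wording in job ads. As a minimal sketch only (the word lists and helper below are my own illustrative assumptions, not any specific tool discussed in the episode), such a checker can simply match a posting against lists of gender-coded terms:

```python
# Minimal sketch of a gender-coded language checker for job postings.
# The word lists are illustrative; a real tool would use far larger,
# research-backed lexicons and suggest neutral replacements.
MASCULINE_CODED = {"ninja", "rockstar", "competitive", "dominant", "fearless", "guru"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "interpersonal"}

def check_posting(text: str) -> dict:
    """Return gender-coded words found in a job posting."""
    words = {w.strip(".,!?:;()").lower() for w in text.split()}
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

result = check_posting("We need a Data Science Ninja, a competitive rockstar!")
print(result)  # flags «ninja», «competitive», «rockstar»
```

A real checker would also flag requirement-heavy phrasing, since long requirement lists are another known deterrent mentioned in the episode.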
«Data Mesh promises so much, so of course everyone is talking about it.»

I had the pleasure to chat with Karin Håkansson about Data Governance in a Mesh. Karin has worked with Data Governance and Data Mesh and is really active in the Data Mesh community, speaking on podcasts and moderating the LinkedIn group «Data Governance - Data Mesh».

Here are my key takeaways from our conversation:

Data in Retail
- The culture in retail is about innovation, experimentation, and new products, so governance has to adapt to this environment in order to be successful. If retail did what we do in data, a fashion retailer would sell yarn instead of t-shirts.
- Retail knows what the customer wants before the customer wants it. What would happen if we in data thought like retailers?
- It is more about understanding the business better than about making the business data literate.

Data Governance
- The Data Governance best practices in the DMBOK are still relevant, also in a Data Mesh setting.
- Data Governance has been on a journey from compliance driven to business value driven.
- Centralized Data Governance creates a bottleneck; decentralized governance creates silos. Federated Data Governance is the middle ground.
- Create incentives to create trust.
- If you utilize your platform correctly, you can have high expectations of computational governance.

Data Mesh
- Data Mesh comes with a cost - you need to invest in Data Mesh. More than anything, a Data Mesh implementation is an enormous change effort.
- If you don't know why Data Mesh, you will implement something else.
- Implement Data Mesh in an agile way: «start small, fail fast and iterate».
- To start with Data Mesh, work with a business team that is eager to get started and sees the benefits - «You have to have the business on board, otherwise it's not going to work.»
- Always check whether you get the value that you expected.
- When you do it, make sure you get governance, business, and tech teams to work together, aligned on the why.
- Make sure to upskill for Data Mesh - it is fundamentally different: talk about it, have debates, book clubs, and more.
- The four elements of Data Mesh: can you implement them in a sequence, or should you look at them as a unit to implement within a limited scope? Start by finding ways for people to work together (e.g. a common goal, and an environment where it is OK to share).
- A good first step is to find an example of data with a certain issue or limitation and talk with the business user about exactly this.
- Data Governance, as much as Data Mesh, is about change management: you need to get close to the business and collaborate actively.
- Your first two steps should be:
  - Start with one business unit, an early adopter.
  - Find their most critical data and talk about actual data.

MDM and Data Mesh
- Are we still hunting for that golden record? How do we work with MDM in a mesh? This is not solved yet.
- You can refer to data instead of collecting data in an MDM system.
- Maybe the best approach so far is global IDs to track data across domains, but how you link your data might become the new MDM.
- «You still need to connect the data, but you don't need to collect the data.» - MDM in a mesh.

Domains, federation and responsibility
- If you federate responsibility to the domains, they also need the resources and competency to fulfill these responsibilities.
- If the domain data teams are successful in abstracting the complexity, it will become easy to create data products.
- If you scale too fast (faster than your data platform), you might end up having to duplicate teams.
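The «connect, don't collect» idea with global IDs can be illustrated with a toy example (the domain names, record keys, and helper below are my own assumptions for illustration): instead of copying records into one master store, each domain keeps its own data, and a thin linking layer maps a global ID to the domain-local keys.

```python
# Sketch: linking records across domains via a global ID instead of
# copying them into a central MDM store ("connect, don't collect").
sales = {"S-17": {"name": "Kari Nordmann", "last_order": "2023-01-04"}}
support = {"C-203": {"name": "Kari Nordmann", "open_tickets": 2}}

# "The new MDM": a mapping from a global ID to domain-local keys.
global_index = {"person:42": {"sales": "S-17", "support": "C-203"}}

def resolve(global_id: str) -> dict:
    """Assemble a connected view of one entity across domains on demand."""
    links = global_index[global_id]
    return {
        "sales": sales[links["sales"]],
        "support": support[links["support"]],
    }

view = resolve("person:42")  # connected, but nothing was collected centrally
```

The linking index is small and cheap to govern, while the actual data stays with (and is maintained by) the owning domains.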
«If we think of Data Mesh as an evolution of data lakes, knowledge graphs are an evolution of Master Data Management.»

Data overload is becoming a real challenge for all types of businesses. All the data that is gathered, in multiple formats and in huge volumes, has created a need for connected, contextualized data. Combined with continuing developments in AI, this has resulted in increasing interest in knowledge graphs as a means to generate context-based insights.

I had a fantastic chat with Mozhgan Tavakolifard. Mozhgan describes herself as «incubator and alchemist». She did her PhD research on trust and social media, and the techniques she used to collect data for that research introduced her to Data Science. On this episode of #MetaDAMA, we talked about knowledge-graph-enabled Data Mesh.

Here are my key takeaways:

Data-driven transformation
- Personal transformation and transformation of businesses are quite similar.
- Very few companies can claim that they have industrialized data and transformed in a data-driven way.
- For a transformation to be successful, you need a holistic view and to invest in a practical manner according to your business case.
- Don't change your business culture to match a data culture; rather, let data be an enabler for the business.

Data Mesh
- Each domain can have a different culture and produce different data products.
- Four ingredients:
  - Give data back to the domains where it is produced
  - Create self-service data infrastructure
  - Federated governance
  - Data as a product
- The centralized data and analytics platform has failed.
- A knowledge graph can basically be considered the data supply chain for Data Mesh.
- A knowledge graph can be used to semantically link data products.

Knowledge Graphs
- Knowledge graphs enable a human-brain-like approach to deriving new knowledge.
- A knowledge graph is quite simply any graph of data that accumulates and conveys knowledge of the real world.
- Every consumer-facing digital brand, such as Google, Amazon, Facebook, or Spotify, has invested significantly in building knowledge graphs, and the concept has evolved to underpin everything from critical infrastructure to supply chains and policing.
- There is a difference between a knowledge graph and a graph data store: the semantic layer is what makes data smart. A knowledge graph is when you have a dynamic and rich context around knowledge.
- A knowledge graph can be used to semantically search for data.
- Knowledge graphs are a very important part of Data Mesh.
- If you want to start with knowledge graphs: find your business case. What is the purpose for you?
- «RIP Semantic Web! The Semantic Web is dead.» Maybe part of the problem was that it was academia-focused more than practical and industry-focused.
- If we manage to implement Data Mesh on a societal level, we might have taken a big step toward realizing some parts of the Semantic Web.

Trust
- Trust and explainability are based on context. Knowledge graphs can provide context and connections, and this can ultimately generate trust.
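As a toy illustration of what «semantically linking data products» can look like (the triples and query helper are my own minimal assumptions, not from the episode), a knowledge graph can be represented as subject-predicate-object triples that connect data products across domains:

```python
# Sketch: a tiny knowledge graph as subject-predicate-object triples,
# used to semantically link and search for data products across domains.
triples = [
    ("orders_dataset", "producedBy", "sales_domain"),
    ("orders_dataset", "describes", "customer"),
    ("tickets_dataset", "producedBy", "support_domain"),
    ("tickets_dataset", "describes", "customer"),
]

def find(predicate: str, obj: str) -> list[str]:
    """Semantic search: which subjects relate to `obj` via `predicate`?"""
    return [s for s, p, o in triples if p == predicate and o == obj]

# Every data product that carries knowledge about the «customer» concept,
# regardless of which domain produced it:
customer_products = find("describes", "customer")
```

This is the sense in which a knowledge graph acts as a «data supply chain» for a mesh: the semantic layer links products that no single domain owns end to end.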
«How can you use AI algorithms to make your city more sustainable?»

How can we use AI to work more sustainably and optimize our operations for less pollution and more efficiency? I talked with Umair, Head of Data Science, Data Warehouse and Artificial Intelligence at Ruter AS, the public transport authority in the region around Oslo, the Norwegian capital. Umair is also an associate professor at OsloMet, teaching AI to bachelor-level students, as well as founder and CTO of Bineric Crowdsourcing and founder of the volunteer organization Offentlig AI.

Public Transportation
- «Public transportation is complex, because data is coming in from many different internal and external sources.»
- «A bus can have a minimum of 20 sensors.» All these sensors send real-time data, and a huge amount of data is collected through external sources.
- Ruter is moving away from centralized Data Management teams toward more of a mesh approach.
- Data Mesh will give you a more complex data function, with a need for more people and more organization and coordination.
- One team cannot have full ownership of the entire data-driven prerogative of a company.
- Two factors that helped Ruter succeed with data and AI:
  - End-to-end responsibility for the whole AI algorithm
  - Create in production and don't overdo PoCs

Sustainability & AI
- Capacity prediction
  - Predict capacity 3 days in advance, and possibly up to 1 month in advance.
  - This gives passengers the possibility to plan their trips better.
  - An operations team that sees a peak in traffic in real time sends additional buses to ensure enough capacity.
  - Through the prediction algorithm, fleet capacity can be reduced, and it is easy to plan for balancing the load beforehand.
- Fleet management
  - The long-term vision for Ruter is to work more with on-demand services. In the future you shouldn't have to walk to a bus stop, but can order a bus to your home.
  - The existing solution is for seniors (67+) and is being tested in the Viken area.
  - But how can you ensure that the buses are close to a possible future order? Ruter is training an algorithm to predict where orders might come from, to ensure a bus is parked close by. This results in less driving and less emissions.
  - To train the algorithm, Ruter mainly uses historical information, combined with e.g. weather information.
- Analysis of customer feedback
  - Sentiment analysis to see how happy or unsatisfied a customer is.

Explainable AI
- «AI is just statistics on steroids!»
- It is hard to explain how a probability output is achieved; that makes an AI algorithm a black box.
- Developers create a set of tools and applications which can give insight into the factors behind a decision made by an AI algorithm.

Quantum Computing
- Traditional computing is expensive and needs more time and resources to reach a specific output, while volumes of data are constantly increasing.
- This was a research project together with OsloMet.
- Quantum computing was cheaper to work with than traditional computing, and quantum computers are more sustainable and energy efficient.
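The capacity-prediction idea (historical patterns plus external signals like weather) can be sketched with a deliberately naive baseline. Everything here is my own illustrative assumption, not Ruter's actual model: the historical mean for the same weekday and hour, adjusted by an assumed weather effect.

```python
# Sketch: a naive capacity-prediction baseline for a bus departure,
# using the historical mean for the same weekday/hour plus a simple
# weather adjustment. A production model would be far richer.
from statistics import mean

# (weekday, hour) -> passenger counts observed on past departures
history = {
    ("mon", 8): [52, 61, 58, 55],
    ("mon", 12): [20, 24, 18, 22],
}

def predict_load(weekday: str, hour: int, rain_expected: bool) -> float:
    """Predict passenger load days in advance from historical data."""
    baseline = mean(history[(weekday, hour)])
    # Assumed effect: rain pushes more people onto public transport.
    return baseline * (1.15 if rain_expected else 1.0)

forecast = predict_load("mon", 8, rain_expected=False)  # -> 56.5
```

Even a baseline like this supports the fleet-planning loop described above: predicted peaks can trigger extra buses before they happen, rather than reactively.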
«It's not just about the use of data, but the use of data in a cross-functional setting.»

What a fantastic conversation with Nina Walberg. Nina has been with Oda since 2019 and has a background in optimization and SCM from NTNU. Oda strives to create a society where people have more space for life and to make life as hassle-free as possible. To achieve this with help from data, Oda has created its six principles for how they create value with data.

Here are my key takeaways:

Business model and use of data
- Oda's business model allows for a better and more cautious way of thinking about sustainability. The quantity of products can be tailored to the actual need.
- Climate receipt: Oda provides data on the climate footprint to its customers when they order products.
- Use data not just to inspire your customers, but also to help them create a complete basket of groceries for the week, avoiding additional trips to the supermarket.
- The use of data needs to be combined with all functions, including business functions. All should be part of the development process.

6 Principles
- The six principles were recently updated, but originally created by the entire team in 2019.
- Oda believes in autonomous teams and in giving trust and responsibility to these teams.

Domain knowledge and discipline expertise
- 70% of Data and Insight people are embedded in cross-functional teams.
- Data is connected across the company and across domains, so you cannot work exclusively in an embedded model. You need some central functionality.
- Data maturity differs between domains, so the embedded model really depends on the circumstances and how Oda applies it.
- Data Mesh has been an inspiration.

Distributed data ownership, shared data governance
- Processes and parts of the product are developed by the domain teams at Oda, so federating the ownership is a natural move.
- Domain boundaries need to be explicit. «Every model that we have in dbt is tagged with a team.»
- Ownership of certain products is harder, or sometimes not right, to distribute: either core products that many depend on but no natural domain team owns, or the core data platform itself.
- Customer data comes first, but without proper product data you can't live up to that.

Data as a product
- Make sure you don't just deliver a product, but that it meets the customer need, not just the customer demand.
- Consistency matters, structure matters, and naming matters when it comes to data products.

Enablement over handovers
- Enabling others to do what they should be able to do themselves.
- Oda has established a segmentation model with five levels of self-service, with different expectations for different user groups.
- Self-service needs to be tailored to the different roles, maturity, and needs of the internal users.
- Data University, Data Hours, and many other initiatives help to create a learning culture and improve data literacy.

Impact through exploration and experimentation
- It is important to test and see how a solution actually provides value against the expectations. This provides insights and information you can act on.

Proactive attitude towards privacy and data ethics
- Data ethics needs to be incorporated and can't be an afterthought.
- Company values can and should be directly linked to the work with ethics.

This episode was recorded in September 2022.
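The quote «Every model that we have in dbt is tagged with a team» suggests a simple, automatable ownership check. As a sketch (the model entries and `team:` tag convention are my own assumptions, not Oda's actual setup), you could flag every model that lacks an owning team:

```python
# Sketch: enforcing "every model is tagged with a team" over a list of
# dbt-style model entries (illustrative data, not a real dbt manifest).
models = [
    {"name": "orders", "tags": ["team:logistics"]},
    {"name": "customers", "tags": ["team:growth", "pii"]},
    {"name": "products", "tags": []},  # missing an owner!
]

def untagged_models(models: list[dict]) -> list[str]:
    """Return names of models without a team:* ownership tag."""
    return [
        m["name"] for m in models
        if not any(t.startswith("team:") for t in m["tags"])
    ]

missing = untagged_models(models)  # -> ["products"]
```

A check like this could run in CI, making the «explicit domain boundaries» principle enforceable rather than aspirational.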
«If you do guidance correctly, people will follow it. People want to do the correct thing. Nobody wants to do things wrong.»

From fisherman in Iceland to Data MVP in Copenhagen! Asgeir has been through a fantastic journey, and we had a great conversation about Power BI governance. Asgeir has his own blog about the topic.

Here are my key takeaways:
- Think of the word platform as a metaphor: you are on a journey, and the platform is where you stand, from where you push off and start your journey.
- A lot of organizations are only on maturity level 1 when it comes to Power BI, even though they think they rank higher, or there are different levels of maturity in different parts of the organization.
- Buzzwords help shape opinions and drive discussions. With good help, these opinions can be shaped into actions.
- If you have the possibility to take a green-field approach to Data Mesh, consider it: it will not get easier than this.
- The parts of Data Governance that can be solved with a focus on technology and process are much more mature and easier to handle than the softer parts that are concerned with people.
- It is easier to lock out intentional mistakes than unintentional ones.
- Governance is changing focus, from being compliance driven to providing a framework for how we can get more value from our data; it's about making people productive. By making people more productive and efficient, you can pay off the cost of governance really fast.
- If you use data more proactively, you are moving power from people to the machine. But there needs to be a balance; it's not either-or.

Power BI
- In Power BI implementations we should have talked governance from the start.
- Power BI enables people in your organization to use data on their own. There is always value in that.
- To use self-service tools correctly, you need to have people formally trained and/or have guidelines in place BEFORE they start doing things.
- Not everyone in your organization will become a Power BI expert or is a data person. Don't expect everyone to go there.
- Governance is about giving people a framework and a good chance to do things correctly from the beginning.
- The lowest-hanging fruit: build a report inventory!
- Try to keep the usage of Power BI within the M365 environment and meet people where they are.
- Know your requirements before you start using tools.
- The 5 pillars cover all of what your governance strategy implementation should cover:
  - The people pillar is about having the right roles in place and training people to perform in these roles. Because Power BI is a self-service tool, the administrator role is often allocated randomly.
  - The processes and framework pillar is about having proper documents in place so users can use Power BI correctly and be compliant. This covers administrative documents as well as end-user documents.
  - The training and support pillar is about making sure everyone that uses Power BI has gotten the required training. This is also about deciding what kind of support mechanisms you need to have in place.
  - The monitoring pillar is about setting up monitoring of Power BI. Usually, it involves extracting data from the Power BI activity log as well as the Power BI REST APIs for information about existing artifacts in your Power BI tenant.
  - The settings and external tools pillar is about making sure Power BI settings are correctly set, as well as how to use other approved tools to support Power BI (extensions to Power BI).
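The «build a report inventory» advice can be sketched in miniature. The records below are illustrative stand-ins, not real Power BI activity-log output; the point is only the shape of the aggregation, from raw view events to a per-workspace inventory:

```python
# Sketch: building a minimal report inventory from activity-log-style
# records (illustrative dicts, not a real Power BI API response).
from collections import defaultdict

activity_log = [
    {"report": "Sales KPI", "workspace": "Sales", "user": "ada", "action": "ViewReport"},
    {"report": "Sales KPI", "workspace": "Sales", "user": "bo", "action": "ViewReport"},
    {"report": "Churn", "workspace": "CRM", "user": "ada", "action": "ViewReport"},
]

def build_inventory(log: list[dict]) -> dict:
    """Aggregate view counts per (workspace, report) - a first inventory."""
    inventory = defaultdict(int)
    for event in log:
        if event["action"] == "ViewReport":
            inventory[(event["workspace"], event["report"])] += 1
    return dict(inventory)

inv = build_inventory(activity_log)
# {('Sales', 'Sales KPI'): 2, ('CRM', 'Churn'): 1}
```

Even this crude view answers the first governance questions: which reports exist, where they live, and which ones nobody actually uses.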
«If you think about how we can work effectively together, you need to look at how you can effectuate your delivery teams.»

I had the pleasure of chatting with Trond Sogn-Lunden from Veidekke, one of the biggest construction contractors in the Nordics. The construction industry is characterized by mergers and acquisitions. This is an interesting setting and reflects the importance of understanding the organization's culture when looking at ways of working. The other important element of Veidekke's culture and organization is project orientation. Some projects can last for years and are located away from the corporate office.

Here are my key takeaways:

The Project setting
- «A project is the smallest unit for profit and loss. If the project is a success, the business succeeds.»
- Even if some projects are more digital than others, even frontrunners need to use the same base systems and be consistent in how problems are tackled.
- There is a balance to strike between project autonomy and application landscape consolidation.
- In the same way, there is an important distinction between what is best for each project vs. what is best for the company as a whole. For the insight factory, this is a constant push and pull.

The Insight Factory
- The insight factory brings a factory mindset to data and ensures that data products are delivered.
- Capacity to deliver is more important than the technology stack.
- Developers (solution responsibles) translate business requirements into solutions, close to the business. A developer needs to be a combination of «superman and Jesus».
- A coordinating role, à la a business analyst, can help ensure a unified application portfolio. This role can also help the insight factory access relevant raw data for analysis.
- The insight factory provides data sets to the business and helps enable the business to do their own data visualization on top of them. This helps build data competency in the business, secures ownership, and adds capacity.

Product Management
- Product management is an important concept for how deliveries are organized.
- Product focus and pipeline structure enable Veidekke to have different change rates for different products, manage them as units, and deliver on demand.
- Every product can be connected to a domain, which is what insight is organized around.
- You need allies in the business to get the reach you need as a central insight factory. Business users should be part of your data product groups.
- You can set up inspirational events to get traction and understanding for your insight & analytics work.
«Think of data availability as online vs. offline.»

Much of the discussion around data products, data catalogs, and self-service boils down to data discoverability, observability, and availability. I talked to Ivan Karlovic, Director of Data Analytics and Master Data at Norwegian, about these topics and gained some fantastic insights. Ivan always loved analytics and using data to improve the business, and started his data journey with a course in data mining and with «pure curiosity on how we can use data!»

Here are my key takeaways:

The Airline sector
- The airline industry is reliant on partners: «A flight is just a subset of an end-to-end journey.»
- Data privacy and ethics are important topics for Norwegian and are addressed with a systematic approach, aiming for automation.
- Norwegian is building a cloud-based analytical platform to ensure greater visibility of the data analytics setup.
- The first improvement should be data discoverability, closely connected to data observability. «A data catalog will raise awareness of what we have of data assets.»
- There is a clear goal to ensure automated observation of all data assets in real time.
- A central team needs to be able to deliver cross-domain use cases, also across domains with different data maturity. «With this cross-domain approach we are putting away some of the legacy discussions.» We can engage with each domain.
- It is OK to have specific crawlers on local data, but you need to synchronize them into the central data catalog. The organization needs a way to stay aware of everything that is produced.

Data Availability
- Except for sensitive data, everyone in a domain should be able to see all domain data. Outside the domain, people should be aware of what kind of data each domain maintains.
- «If you work with analytics or machine learning you always want to talk to the domain people, because you can easily misinterpret if you don't have that domain experience.»
- Domain data products that are domain specific, without a use case outside the domain, do not have to adhere to central standards. But if they can have a use case outside the domain, they need to be fed into the central data catalog.
- Communication and understanding of intentions, from data producers to data users, is really important. You have to continuously work on understanding.
- There is lost business potential in not having data discoverable, no matter the quality. Most effort is wasted in rework of data products that were simply not discoverable.

Self-Service
- «When it comes to self-service we need to set up technology in a way that the end user does not have to think about the data, only the problem to solve.»
- Even if only 70% of use cases can be solved by self-service, we need to strive for 100%, to offload the expert data analytics team as much as possible so they can work on the tough cases.
- «Data catalog: you can buy a monster that gives you 95% of things you don't need, or cutting-edge, super-niche startups. But you have some interesting players somewhere in the middle.»
- «Can we do data engineering on a meta level, without seeing the underlying data? E.g. for PII?»

Documentation
- How can you retain knowledge in a distributed architecture? Ensure domain knowledge is fostered in the domains, and build a documentation repository.
- Infrastructure as code: technical knowledge supplied with context.
- Domain knowledge is the trickiest and most difficult to replace.
- Great documentation can be both motivating and time saving: motivating to reach a higher standard, and time saving for problem finding, onboarding, etc.
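The pattern of domain-local crawlers feeding one central catalog can be sketched like this (the domain names, metadata fields, and merge helper are my own illustrative assumptions, not Norwegian's actual setup):

```python
# Sketch: merging metadata from domain-local crawlers into one central
# data catalog, keyed by a domain-qualified asset name so the whole
# organization can stay aware of what each domain maintains.
crew_crawler = [{"asset": "rosters", "owner": "crew", "pii": True}]
ops_crawler = [{"asset": "flights", "owner": "ops", "pii": False}]

def sync_catalog(*crawler_outputs: list[dict]) -> dict:
    """Central catalog: one entry per domain-qualified asset name."""
    catalog = {}
    for output in crawler_outputs:
        for entry in output:
            key = f"{entry['owner']}.{entry['asset']}"
            catalog[key] = entry
    return catalog

central = sync_catalog(crew_crawler, ops_crawler)
sorted(central)  # ['crew.rosters', 'ops.flights']
```

Only metadata crosses the domain boundary here; the data itself stays local, which also makes it possible to flag PII assets centrally without exposing them.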
«The data lifecycle collides with the system lifecycle. It's a classic.»

Let's talk about the paradoxes of data: the data lifecycle, search, and data catalogs! What a fantastic chat Ole and I had! Ole is writing his O'Reilly book Enterprise Data Catalog and has the newsletter Symphony of Search. He brings in a new perspective from Library and Information Science and is a great advocate for transforming the way we think about data and search. Ole has worked as a specialist, a leader, and an architect, and has an academic background with a PhD in Information Science from the University of Copenhagen.

Here are my key takeaways:
- The data lifecycle was first described as the POSMAD lifecycle:
  - Plan - plan for creation
  - Obtain - acquire data
  - Store - store it in a system
  - Share - expose it and make it accessible
  - Maintain - curate data, keep it accurate
  - Apply - use the data
  - Dispose - archive or delete
- Store, Share, and Apply are where the business value is derived. The points where you get value from data are normally not the same as the points where we manage data.
- The work that e.g. national archives do in cataloguing and readying data for research happens at the very last stage of the lifecycle. But the value resides much earlier in the lifecycle.
- Data-driven innovation, data-driven culture… What these terms actually mean is that we need to get better at utilizing the value inside data.
- Intangible assets hold the highest value - data is the key to value creation.
- One of the potentials of a data catalog is to push the high-level Data Management activities to earlier stages of the lifecycle. Catalogs are pushing inventory activities from the Dispose phase to the Store and Share phases of the lifecycle.
- There is a huge difference between the perspective of an IT system lifecycle and a data lifecycle. Data always resides in a system, and that system has its own lifecycle. These lifecycles do not match.
- If you do not maintain data in your systems, any potential data migration becomes exponentially more difficult. What do we migrate, what do we keep, what do we delete? The solution can be a data catalog and/or a metadata repository with retention policies for data.
- The distinction between searching in data and searching for data has become really important due to the rise of data science. When you search for data, you are looking at data sources with potential value to search in.
- Metadata is key in searching for data - that means we have to manage the metadata lifecycle as well.
- A data catalog is basically just a search engine. Data catalogs rely more and more on the same technology components as search engines for the web, e.g. knowledge graphs.
- The key capability of data catalogs is a metadata overview of the data in your company.
- Data catalogs have an untapped potential to ensure data lifecycle management.
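The «a data catalog is basically just a search engine» point can be made concrete with a tiny inverted index over dataset metadata (the dataset names and descriptions are my own illustrative assumptions). Searching for data queries this index; searching in data would query the datasets themselves:

```python
# Sketch: an inverted index over dataset *metadata* - the core of
# searching FOR data, as opposed to searching IN data.
datasets = {
    "bookings": "passenger bookings per route and travel date",
    "crew_rosters": "crew shift plans per route",
}

def build_index(catalog: dict[str, str]) -> dict[str, set[str]]:
    """Map each metadata term to the datasets whose description uses it."""
    index: dict[str, set[str]] = {}
    for name, description in catalog.items():
        for term in description.split():
            index.setdefault(term, set()).add(name)
    return index

index = build_index(datasets)
index["route"]  # both datasets mention routes in their metadata
```

This is also why the metadata lifecycle matters: the index is only as good as the descriptions it is built from.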
«Data is mainly used to create value for customers, both inside the company and outside!»

Customer centricity is one of the great mantras in data at the moment. I wanted to get to the bottom of what customer experience actually means. So, whom better to ask than Leif Eric Fedheim, Customer Insights Manager at Elkjøp and one of the top 100 Data, Analytics and AI professionals in the Nordics? We talked about the retail data quest, what other sectors can learn from retail, and naturally the value of customer experience and insight.

Here are my key takeaways:
- Data has to be fast and easily accessible for the business, so they can use and consume data when they need it to make better decisions. This happens on a self-service basis.
- Make people aware of how they can use, combine, and analyze data.
- Analysts and scientists must be placed in the different business domains. Elkjøp is organizing towards a product-oriented organization.
- The data governance structure is organized around freedom under responsibility. There are rules in place, but creativity should not be hindered.
- Business-critical data products are created and maintained centrally.
- Data privacy is important, especially when it comes to customer data. Privacy is an important element of good customer experience.
- Are data-savvy front runners setting the requirements for how we work with data? There are data science & analysis teams embedded in data-savvy departments. Analysis should happen embedded in the business and involve the central data team if necessary.
- Lasting customer relations and a strong focus on customer experience across all channels are learnings other industries can take from retail.
- Omnichannel players need to find ways to connect the physical customer experience with the digital.
- Retail can learn from other sectors, like banking and finance, that there is a need for explainable models - how to be transparent with your customers. How do we manage the gap between explainability and performance?
- CX is about listening to the customer and using tools and data to better their experience. Digital stores make it easier to use data for better customer experience than physical stores do.
- Quantitative analysis is about CX without directly involving the customer. Qualitative analysis is the opposite: the customer is actively involved in bettering his or her experience.
- Only 30-40% of CX is focused on the actual purchase; the rest lies before the point of decision and after the purchase. Correct and updated product information is vital before the purchase. You can support the decision process and idea phase through helpful articles and inspirational content, at the right time and in the right place.
- Natural Language Processing (NLP) for review data is a vital part of ensuring good customer experience.
- Many have a way to go when it comes to self-service CX. You need to instrument all your customer touchpoints to gain a reliable and varied data foundation that ensures CX success.
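The NLP-on-reviews point can be sketched with the simplest possible approach, a lexicon-based sentiment score (the word lists are my own illustrative assumptions; real pipelines use trained models):

```python
# Sketch: a minimal lexicon-based sentiment score for product reviews.
# Real NLP pipelines use trained models; this just counts cue words.
POSITIVE = {"great", "excellent", "happy", "love", "fast"}
NEGATIVE = {"broken", "slow", "unhappy", "terrible", "refund"}

def sentiment(review: str) -> int:
    """Positive score -> satisfied customer, negative -> unsatisfied."""
    words = [w.strip(".,!?").lower() for w in review.split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

sentiment("Fast delivery, great product!")  # -> 2
sentiment("Terrible, it arrived broken.")   # -> -2
```

Scoring reviews at scale is one way to monitor the post-purchase part of CX, the 60-70% of the journey that happens away from the actual buy.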
“The closer you are to the business, the greater the chance to make an impact with data!” In this episode I interviewed Marti Colominas, VP Head of Data & Insight at reMarkable. When we had our chat this summer, Marti was still working as Head of Data for Kahoot!. Marti combines business with data and works on a daily basis for value creation and balance at the crossroads between business and tech. Marti has experience from big corporations but was looking for the high pace and ever-changing environment of a startup.
Here are my key takeaways:
The startup / scaleup setting
- In startups and scaleups, roles are not that defined. There is a huge amount of flexibility.
- You can react quickly and explore new options without a lot of bureaucracy.
- It is very dynamic, and data is used by everyone to base their decisions on.
- On the other hand, datasets are not as stable and have lower quality.
- There is a significantly shorter distance between C-level executives and operations.
The business value of data
- You need to find balance in delivering fast (time to insight, speed, and accuracy).
- Speed is there no matter what; you need to ensure the right level of accuracy at that speed.
- If you want to have impact, you need good data quality. If not, the numbers will not be trusted or used.
- The goal is data that is trustworthy and easy to use.
- Long-term commitment to Data Quality and Data Governance, combined with speedy day-to-day operations and short time to insight, has been key to success.
- The role of the CDO is to ensure that the business derives value from Data Science and Analytics; that counts for a startup as much as for a large enterprise.
- The combination of business and engineering becomes more and more important.
- The data stack is moving towards speed and scalability, which makes it easier to handle large volumes of data.
- The key innovation will happen in automated data quality, self-serve analytics, even API contracts for click-stream data, as well as tracking and lineage.
- Data is not always the goal.
It can be a means to create value.
- For Kahoot! and reMarkable, data is used to make a better product and to improve the user experience, not to monetize or even mine that data.
- Users should see and feel enhancements in the product through their provided data right away.
- To show the business value of Data Management you have to argue with "What if…?" What if we don't do it? You need to show the consequences of not acting to the C-level executives.
- Data quality problems grow exponentially over time. If you do not act, data-driven decision making will eventually be replaced by gut feeling.
- Not having control and metrics in place is like going to the casino: you can win once or twice, but in the long run you will lose.
- Data products should have an assigned value - so Data as a Product can help us argue for the business value of data.
- Step 1 is to treat data as an API contract.
- Self-service depends on good structure and governance on the data producer side.
- Self-service can cover 60-70%; not everything can be self-service.
- A dataset tells the story of what happened in the business. You need business context to understand it, you need to phrase a business question as a data question, and you need to know how to manipulate the data correctly.
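The idea of treating data as an API contract can be sketched as a schema check on the producer side: the producer publishes the expected fields and types, and every batch is validated before consumers see it. This is a minimal illustration only; all field names and types below are hypothetical, not from the episode.

```python
# Minimal sketch of a data contract: the producer declares a schema,
# and each batch is validated against it before publication.
# Field names and types are illustrative assumptions.

CONTRACT = {
    "order_id": int,
    "customer_id": int,
    "amount_nok": float,
}

def validate(rows: list[dict]) -> list[str]:
    """Return a list of contract violations found in the batch."""
    errors = []
    for i, row in enumerate(rows):
        for field, expected in CONTRACT.items():
            if field not in row:
                errors.append(f"row {i}: missing field '{field}'")
            elif not isinstance(row[field], expected):
                errors.append(f"row {i}: '{field}' is not {expected.__name__}")
    return errors

batch = [
    {"order_id": 1, "customer_id": 42, "amount_nok": 199.0},
    {"order_id": 2, "amount_nok": "199"},  # violates the contract twice
]
print(validate(batch))
```

A real setup would typically enforce such a contract in the producer's pipeline, so that downstream self-service consumers can trust the published data without re-checking it.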
Welcome to Season 2 of MetaDAMA!
The first episode is dedicated to DAMA. I talked to Marilu Lopez, Leader of the Presidents Council for DAMA, Peter Aiken, President of DAMA International, and Achillefs Tsitsonis, President of DAMA Norway.
We had a great conversation about the vision and mission of the voluntary, vendor-independent organization DAMA and its value for the knowledge worker community, as well as society as a whole.
We also talked about what Data Literacy is, how we can operationalize the term, and to what end. The best definition of Data Literacy so far is “the ability to read, write and communicate data in context, with an understanding of the data sources and constructs, analytical methods and techniques applied, and the ability to describe the use case application and resulting business value or outcome.”
Here are my key takeaways:
The data quest
- Data needs to become a profession.
- People come to data through different routes, which brings new and different perspectives. It is not easy to view data as a uniform term.
- Data is as much a part of the business world as it is of the IT world. DAMA wants to bring the business and IT worlds together to collaborate and understand each other better.
- The metaverse is collecting all data about us, and we give it willingly. Is there a lack of understanding of the consequences in society as a whole?
The aspect of change
- A lot of Data Management is about Change Management and relies on the existing culture.
At the same time, culture is something unique and needs to be fostered locally.
- We need to prepare data professionals to become change agents.
- The principles of Data Management are about collaboration, and DAMA is trying to live by this principle.
Data literacy in society
- What can we tell people that they objectively need in terms of skills and knowledge in order to become data literate?
- What conversations do you need to have in your family to ensure that you are data literate enough to operate a smartphone without exposing your data?
Knowledge workers
- Anyone who works with their brain uses data and is thereby a knowledge worker.
- The sooner we make our knowledge workers more literate, the sooner we will end up with smarter and more effective organizations.
- Knowledge workers need a learning path.
- Professionals need to have an ethical compass, an urge or even a duty to call out unethical behavior when they see it.
- We all have a responsibility to share our knowledge.
Is the EU providing the legal framework for data-driven value creation?
Digitalization is a focus area for the European Commission, and at its core, the European digital strategy is a data strategy - digitalization is focused on data. The goal is to utilize the value of data and give better conditions to SMBs in the European market.
I was fortunate to talk to Astrid Solhaug from DigDir, working for the Norwegian resource center for sharing and use of data. Astrid provides both a Norwegian and a European perspective on the topic.
We talked about:
EU digitalization strategy
- What does that mean?
- How does the EU approach digitalization at a strategic level?
- How do regulations and incentives interplay with digitalization in the EU?
EU regulations
- What are the EU Data Act, the Open Data Directive, and the Data Governance Act?
- How can we share and reuse data? What does it mean for value creation with data?
- Why do we need a data innovation board and what should it look like?
Challenges
- Can the market be regulated? Is EU regulation harming innovation in the market?
- How can ethical values be regulated across countries? Is the EU trying to take an active role in the development of European society?
Possibilities
- What are the possibilities in these regulations? Can they mean safe and easy data sharing?
- What are the effects of simpler, easier data sharing between the public and private sector? Can it create value for us?
- What are the opportunities in the regulations for users in Norway?
What I've learned:
- The EU is trying to take an active part in democratizing data.
- The EU defines data as a non-competitive good.
- A data collector has no exclusive rights to the data collected.
- All open governmental data should be accessible to everyone.
- The Data Act arranges for private sector data, to secure sharing and end-user rights.
- Private organizations will have a duty to share information with the public sector in times of crisis.
- These three regulations for sharing data are interesting to see in combination with the AI Act, which governs the use of the data that is now available.
Are we living "The Truman Show"? I had a chat with Mads Flensted Hauge, Chairman of DAMA Denmark and DPO and Data Governance Manager at a Danish health care provider.
We looked at Data Privacy from four different perspectives:
The society perspective
- What is data privacy and why is it important? How transparent are we as citizens?
- What considerations need to take place when we talk about data sharing across public agencies?
The company perspective
- Why should we care about privacy in our company? Are we just talking about compliance? Where should the DPO role be placed in a company?
- Does your HR system need a feature to show your employees' homes on Google Maps?
The data worker perspective
- What does this mean for us data workers in how we treat data? What does privacy by design mean?
- What is the impact of AI, ML, … on privacy?
The personal perspective
- What can I do to keep my personal data safe? How many smart devices do I need in my home? Can I live without a washing machine with a Wi-Fi connection?
- Has the corona pandemic made it even more OK to share private health data?
Here are my key takeaways:
- Convenience drives change and digitalization in the public sector - and sometimes privacy becomes the victim of efficiency.
- To apply GDPR you need to apply different knowledge areas of Data Management. That is why the two are closely connected.
- It has always been hard to show the value of a data management journey from the start, but with GDPR and the ominous notion of fines, data management got the ear of the C-level.
- Since GDPR is framed as compliance, it leads companies to do just the bare minimum to be compliant.
- GDPR forces you to get deeper knowledge about your business.
- There is an ethics dimension to data privacy, and DPOs are at the forefront of instigating this ethics side.
- A DPO does not just write policies and procedures but must navigate company culture to promote ethics and privacy actively.
Everyone who has ever played with LEGO knows that the bricks don't just fall into place. It takes dedication, finding the right brick at the right time, maybe even sorting your bricks. The same goes for Data Governance. So whom better to ask about Data Governance than the Director of Data Governance at the LEGO Group, Michael Bendixen?
Michael was really clear in his message to all of us, giving us his 13 commandments for Data Governance:
1. Drop the data management "lingo".
2. Invest the time to build a strong data governance framework.
3. Align your data governance/data ownership structure with existing organizational structures, terminology being used, etc. to the extent possible, as that will also make your implementation less intrusive.
4. Make sure to get the right people in the team that facilitates and supports data governance - people with great collaboration and communication skills who are good at building strong relationships are vital.
5. Ensure data quality is a part of your setup and that you are able to report on data quality.
6. There is no "one size fits all" when it comes to data governance.
7. Depending on the organization you work for, compliance can be a good driver for data governance - but have a plan that will take you towards a more value-focused data governance with more carrot and less stick.
8. Data governance is not about technology and tools.
9. Communication is key.
10. Data governance is not a project nor a program - it is a lifestyle change and does not have an end date.
11. Invest in training and onboarding the people that will take on data governance roles.
12. Be ready to support the people that take on data governance roles - and make sure they know you are there to help them.
13. Be very aware that until you demonstrate business value, you will often just be the guy with a PowerPoint slide deck talking about something fairly abstract that not everyone understands.
Can you put a value on quality data? Definitely! But how? Who better to ask than Kristin Otter Rønnevig and Espen Hjelmeland? They both dedicated their master thesis to exploring this topic, with really interesting results. Their thesis "An Investment Perspective on Data Quality in Data Usage" asks: "How can an organization optimize its investments in data quality?"
To answer the question, Kristin and Espen posed three research questions:
- What are the main drivers for willingness to pay for data quality?
- Which data quality aspects should one invest in to maximize opportunities and minimize risk?
- How can the quality of data be improved, and what are the costs?
Here are some of their key observations I found particularly interesting:
- Increasing confidence in data in order to enhance company operations is one of the motivations for willingness to pay for data quality.
- A greater level of knowledge of data quality gives a higher willingness to pay for data quality.
- Demonstrating to the customer how quality improvements may enhance profit at each step of the value chain contributes to the willingness to pay for data quality.
- Prioritizing improvements is based on the time and cost of the particular improvement. Reverse-engineering the impact of various improvements can help to identify how to improve the quality.
- It is critical to invest in a professional, highly skilled team environment to succeed in data quality investments.
- To cope with data quality, it is necessary to invest in security.
- To know whether the organization is investing in data quality optimally, it will need a deep understanding of the business and experience with it.
Espen Langbråten is leading Europris' bet on digital platform, data warehouse, and analytics. We had a great talk about everything data, but at the core we discussed three questions:
- How do you build your team?
- How do you recruit and build competencies in an ever-changing environment whilst keeping an eye on cost?
- What resources do you need in house? What can you outsource to a consultancy?
Here are my key takeaways:
- It is not only hard to find competency in the market and recruit, but also to retain valuable knowledge in the company.
- Give your employees the responsibilities they want and need to thrive on the job.
- Mentor programs give new employees a chance to get integrated fast, and at the same time give existing employees a boost to their sense of value and the chance to share their knowledge.
- It is important to recruit basic technical skills and a hunger for learning; business acumen and domain knowledge is something you learn on the job.
- You need to have core competencies in house and work in close relation with your consultants to transfer knowledge.
- The conceptual and even logical architecture needs to be owned in house.
- Most problems the business faces are process-based. Tech can aid in solving these, but it is seldom a tech problem by definition.
What has been the hot topic in Norway this winter? Energy prices. The prices for energy skyrocketed. But how does this relate to data? Can we save energy with Data Management? The answer might be on the infrastructure side. And since "the color of data centers is green!", I took this question to Espen Bjarnes, who works as VP Sales for Green Mountain, one of the most sustainable data centers in the world according to Data Center Magazine. Green Mountain was founded in 2009 and opened their first data center in Rennesøy (Stavanger) in 2013 in an old NATO facility.
Norway has a unique standing compared to other countries:
- We use close to 100% renewable energy sources.
- We have a cold climate, which reduces cooling expenses.
- We have relatively low energy costs.
With that, there is a chance to reduce CO2 emissions by storing data in Norway. And on top of that, there are political incentives in place to support the data center industry in Norway, such as reduced taxes. So, for Norway, "Data Center is the new oil" - because this might become the driving force for Norwegian industry going forward.
The amount of data in the world doubles every second year, and streaming of live data grows exponentially. All this needs data centers to support it. That translates to natural growth of the data center industry. At the same time, there is a potential for the Nordic data center industry to take the lead, due to sustainability advantages in the Nordics. Espen calls it the "sustainable edge".
How can you democratize data from Norway's public services? What does it mean to have a user-centric approach to public services? What needs to be in place to organize your organization, and the public services as a whole, while maintaining focus on good services to your citizens?
I had the pleasure of chatting with Gustav Aagesen, Chief Data Officer at Lånekassen, the Norwegian State Educational Loan Fund, which is celebrating its 75th anniversary in 2022. Gustav started as Information Architect at Lånekassen in 2012 and became analysis manager before taking the position as CDO. Today, he sees his responsibility in supporting the entire organization and institutionalizing information management.
We looked at three different perspectives on "orden i eget hus" (order in one's own house), a Data Governance framework for public services, and the democratization of data in Norway:
1. Lånekassen: an internal view on automation and structures, data citizenship, and culture.
2. Public Norway: a perspective that includes valuable work on the common data catalogue, "orden i eget hus", and common concepts and datasets.
3. Citizen perspective: thoughts about finding ways to make use and consumption of data easier and with fewer barriers, and to provide citizen-centric services.
Here are some of my key takeaways:
- Information Management is not a goal by itself, but a way to create and gain value.
- Information Management has to start with a purpose!
- Data and information have a longer lifecycle than applications.
- Data lineage is important, with the objective in mind to create services and gather data based on consumer demands, or the needs of the citizens.
- The value for the citizen is an end-to-end value stream that is traceable and can create trust in the data.
- If you want to give citizens access to proactive public services, information has to flow between different institutions.
- It seems easier to get funding for technology than for work processes. That is also a reason automation is in high demand.
- Data sharing needs to be balanced with trust and privacy to ensure good solutions for the consumers and citizens.
How can you set up your services to have technology support your data journey? How do you work with tech procurement? And what is the impact of Data Mesh, especially on data governance? How do you evaluate which consumer need to satisfy first? How do you prioritize what is important to create value?
I chatted with Bente Busch, who is leading the Service Platform Team at NAV, the Norwegian Labour and Welfare Administration. Bente and her team are responsible for platform services for the product teams at NAV, the application platform, the design toolbox, as well as good practice around digital product development. Bente is driven by being a product director, helping the users of data to minimize their cognitive burden, and delivering an attractive and easy-to-use service.
Modernization of digital systems and ways of working has been a priority for NAV. In the last 5 years they fundamentally changed the way they deliver projects and programs, by focusing on ongoing product development. This was done in combination with breaking down technological one-size-fits-all suites into a micro-service-oriented architecture. With that, NAV provides technology and systems tailored to consumer needs.
Here are some of my key takeaways:
- Good technological services can help foster a data culture.
- The goal is for consumers and citizens to connect with public services in an easier, faster, and more user-friendly way.
- Framework agreements can help to liberate resources, gain agility, and tailor to specialized needs.
- Data analysis can help in procurement processes to ensure consistency and integrity in future needs.
- NAV views consumer needs as digital products to be developed, and designs products that can be improved and changed if needed.
- Specific solutions that need to be tailored to the consumer's needs have to be built, whilst more generic services can be based on "out-of-the-box" solutions.
- Open source should be a stable part of the architecture, both as open-source libraries and as solutions.
- Data is moving in the same direction as software, to a more decentralized architecture.
- A network of actors and collaborators needs to agree on and enforce governance in a distributed architecture. Technology can help to codify and automate Data Governance. "Lett å gjøre rett" - easy to do the right thing.
- The data domains in NAV are building autonomous architectures. They are independent in their architectural decisions but need to account for their surroundings.
- NAV is inspired by "Team Topologies" to create "APIs" between teams.
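Codifying and automating Data Governance often means expressing policy as code: a dataset cannot be published unless its metadata satisfies the agreed rules. The sketch below illustrates the idea only; the required fields and classification levels are my own assumptions, not NAV's actual model.

```python
# Hypothetical "policy as code" check: a dataset's metadata must name
# an owner, a classification, and a description before publication.
# Field names and allowed values are illustrative assumptions.

REQUIRED_FIELDS = {"owner", "classification", "description"}
ALLOWED_CLASSIFICATIONS = {"open", "internal", "restricted"}

def policy_violations(metadata: dict) -> list[str]:
    """Return a list of governance policy violations for a dataset."""
    violations = [f"missing '{f}'"
                  for f in sorted(REQUIRED_FIELDS - metadata.keys())]
    cls = metadata.get("classification")
    if cls is not None and cls not in ALLOWED_CLASSIFICATIONS:
        violations.append(f"unknown classification '{cls}'")
    return violations

dataset = {"owner": "team-arbeid", "classification": "secret"}
print(policy_violations(dataset))
```

Running such a check automatically in the publication pipeline is one way to make it "lett å gjøre rett": the easy path and the compliant path become the same path.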