This article was written by Informatica and originally appeared on the Informatica Cloud Data Management Blog here: https://www.informatica.com/blogs/the-top-five-data-management-trends-to-watch-in-2022.html
Data Management Trend #1: Multicloud and Intercloud Data Management
The survey found 82% of organizations are currently using multiple clouds – or plan to within the next 12 months. As more applications and data move to the cloud, data leaders face increasingly complex data management requirements: within the same cloud, across different clouds and with on-premises sources. Multicloud and intercloud data management are crucial to supporting these diverse topologies.
Multicloud means a given data management service can operate on more than one cloud ecosystem — for example, a data integration service that can run on Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform. Whether a multicloud environment comes about because of data sovereignty issues, efforts to avoid vendor lock-in or mergers and acquisitions, organizations want the flexibility of running their data management services across cloud ecosystems.
Meanwhile, intercloud data management enables services running on different cloud ecosystems to work seamlessly together. For example, a data engineer can find data through a data catalog and marketplace service running on AWS, which uses a data integration service running on Azure to access data from Snowflake, and move it into Google Cloud Platform for use in a TensorFlow project.
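The intercloud workflow described above can be sketched as cooperating services, each hosted on a different cloud. This is a minimal illustrative model only — the class names (`CatalogService`, `IntegrationService`) and the workflow wiring are invented for this sketch and are not a real Informatica API:

```python
# Toy model of intercloud data management: a catalog service on one
# cloud hands off to an integration service on another. All names
# and structures here are hypothetical illustrations.

class CatalogService:
    """Data catalog/marketplace service (imagine it hosted on AWS)."""
    def __init__(self):
        self._assets = {}          # asset name -> source system

    def register(self, name, source):
        self._assets[name] = source

    def find(self, name):
        return self._assets[name]  # tells callers where the data lives


class IntegrationService:
    """Data integration service (imagine it hosted on Azure)."""
    def move(self, asset, source, target):
        # A real service would read from Snowflake and load into GCP;
        # here we only record the hop to show the control flow.
        return {"asset": asset, "from": source, "to": target}


catalog = CatalogService()
catalog.register("claims_2021", source="snowflake")

integration = IntegrationService()
job = integration.move("claims_2021",
                       source=catalog.find("claims_2021"),
                       target="gcp-tensorflow-project")
print(job["to"])
```

The point of the sketch is the separation of concerns: the catalog knows where data lives, the integration service knows how to move it, and neither cares which cloud the other runs on.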
Figure 2 shows how Informatica can help you address fragmentation and complexity with our multicloud and intercloud data management capabilities.
Data Management Trend #2: AI-Powered Data Management Automation
Automation is the only viable way to tackle the scale of data fragmentation and complexity. Yet more than two-thirds (68%) of the surveyed organizations have not operationalized AI for data management across the organization.
AI can help with automation of all aspects of data management, including data discovery and cataloging, data and application integration, cleansing and mastering, governance and privacy and data sharing. It also improves productivity for all data users, including developers, architects, application administrators, data stewards, financial analysts and front-line workers.
Automating data management with AI also accelerates the operationalization of AI for decision-making and business processes. Optimized data management organizations are:
- Five times more likely to have fully operationalized AI for data management
- Three times more likely to have fully operationalized AI for insights and analysis
- Six times more likely to have fully operationalized AI for process automation and optimization
Data Management Trend #3: Data Fabric Architecture
As organizations put more data in more clouds, they need a way to connect the siloed data sources and make data more accessible across the organization. To address these cloud data siloes, data management leaders are looking to data fabric architectures.
In fact, more than half (54%) of the surveyed organizations said they are either investigating approaches and solutions or have put some parts of a data fabric architecture in place.
Data fabric is a design concept that serves as an architectural layer for simplifying and scaling data management tasks and empowering broader and more consistent use of data throughout the organization.
The key components of a data fabric include (see Figure 6):
- An augmented metadata catalog for discovery and curation of data assets
- A metadata knowledge graph for understanding the relationships between data assets
- An AI-enabled recommendation engine to suggest data assets for use
- Data preparation and data delivery that support ETL, streaming and API data movement
- An enterprise data orchestration layer that coordinates the collaboration of different data management services
Embedded into those five components is an AI engine that automates the data management tasks performed by the data fabric. For example, recommending data sets that might be of interest or automatically associating business terms and definitions with the underlying technical data to empower business users to self-serve.
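Three of the components listed above — the augmented catalog, the metadata knowledge graph and the recommendation engine — can be sketched in a few lines. This is a toy model under loose assumptions (a dict-based catalog and a term-overlap "recommender" standing in for a real AI engine), not Informatica's implementation:

```python
# Toy sketch of a data fabric's catalog, knowledge graph and
# recommendation components. Structure and names are illustrative only.

class DataFabric:
    def __init__(self):
        self.catalog = {}  # asset -> business terms (augmented metadata)
        self.graph = {}    # asset -> set of related assets (knowledge graph)

    def add_asset(self, name, terms):
        self.catalog[name] = terms
        self.graph.setdefault(name, set())

    def relate(self, a, b):
        # add an undirected knowledge-graph edge between two assets
        self.graph[a].add(b)
        self.graph[b].add(a)

    def recommend(self, asset):
        # naive stand-in for the AI engine: suggest related assets
        # that share at least one business term with this one
        terms = set(self.catalog[asset])
        return sorted(n for n in self.graph[asset]
                      if terms & set(self.catalog[n]))


fabric = DataFabric()
fabric.add_asset("customers", terms=["customer", "pii"])
fabric.add_asset("orders", terms=["customer", "sales"])
fabric.add_asset("warehouses", terms=["logistics"])
fabric.relate("customers", "orders")
fabric.relate("customers", "warehouses")
print(fabric.recommend("customers"))  # suggests 'orders' via the shared term
```

Even this toy version shows the division of labor: the catalog holds curated metadata, the graph holds relationships, and the recommendation step combines the two to surface relevant data sets.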
Data Management Trend #4: Multidomain Master Data Management
As organizations seek to digitize their businesses, they are also accelerating the number of cloud applications they use. Managing end-to-end digital experiences requires consistent master data across the applications used in digital processes.
When asked about their main data management budget priorities, 61% of survey respondents cited multidomain master data management (MDM) for a 360-degree view of the business.
While many companies initially focus on managing customer data, they quickly realize that material, supplier, product, location and other domains of master data need to be managed and connected to get the 360-degree view of the business that will help them deliver extraordinary digital experiences. Some of the ways multidomain MDM improves experiences include:
- Customer experience: Enables marketing to use customer, product and channel data to understand preferences and deliver personalized offers. Empowers personalized support and services across customer touchpoints.
- Product experience: Enables commerce and merchandising teams to use customer, product and location data to deliver content for engaging, relevant product experiences throughout the customer journey.
- Supplier experience: Enables procurement and supplier relationship teams to use supplier, material and location data to simplify supplier onboarding and better manage total spend with suppliers across the organization.
- Financial experience: Enables financial planning and analysis teams to use customer, product, channel, supplier, cost center and location data to model scenarios, develop plans and deliver timely reporting and analysis.
Figure 8 shows how Informatica can connect customer, policy and location master data with interaction, transaction, and service request data to create a knowledge graph that provides a comprehensive 360-degree view of the customer and their interactions with the company.
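In the spirit of that knowledge graph, here is a minimal sketch of how master-data records (customer, policy, location) can be linked to interaction and transaction data and traversed to assemble a 360-degree view. The graph shape, entity identifiers and relation names are all invented for illustration:

```python
# Illustrative customer-360 knowledge graph: master data linked to
# interaction and transaction records. A toy model, not a real product.

edges = {
    ("cust:42", "holds"): ["policy:H-100", "policy:A-7"],
    ("cust:42", "lives_at"): ["loc:berlin"],
    ("cust:42", "contacted_via"): ["interaction:call-2022-01-03"],
    ("policy:H-100", "claimed_in"): ["transaction:claim-881"],
}

def view_360(entity):
    """Collect everything reachable from a master-data record."""
    seen, frontier = set(), [entity]
    while frontier:
        node = frontier.pop()
        for (src, _relation), targets in edges.items():
            if src == node:
                for t in targets:
                    if t not in seen:
                        seen.add(t)
                        frontier.append(t)
    return sorted(seen)

print(view_360("cust:42"))
```

The traversal is what makes the view "360-degree": starting from one customer record, it pulls in not just directly linked policies and locations but also downstream records like the claim transaction reachable through a policy.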
Data Management Trend #5: Data Marketplaces
Data in the hands of people is transformational. It drives innovation in products and services, empowers collaboration and transforms business and society. Yet 72% of survey respondents said line-of-business staff could not self-access all the data they need.
To address the need for greater data access and sharing, I believe the trend of expanding beyond data cataloging to more comprehensive data marketplace capabilities will accelerate in 2022.
While a data catalog is a component of a data marketplace, the marketplace also provides order management as well as delivery and fulfillment capabilities. It simplifies data consumption with an online, retail-like shopping experience. With just a few clicks, employees can search for a topic or domain of interest, add data sets to the shopping cart, check out and have data securely delivered.
More advanced data marketplaces also ensure that the organization’s data assets are used in a compliant and ethical manner. Data governance policies can be mapped to data sets, which are then used to create terms and conditions for data usage based on the type of data being accessed. This gives data consumers guidance on appropriate use, and consumers must accept the terms before being given access. The marketplace provides full auditability of who is using what data, where the data is being used and for what purposes.
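The flow described in the last two paragraphs — search, cart, a terms-acceptance gate derived from governance policy, and a full audit trail — can be sketched as follows. Everything here (class names, policy tags, the delivery stand-in) is a hypothetical illustration of the pattern, not any vendor's API:

```python
# Hedged sketch of a data marketplace: retail-like search/checkout,
# terms acceptance tied to governance policy, and full auditability.

class DataMarketplace:
    def __init__(self):
        self.datasets = {}   # dataset name -> governance policy tag
        self.audit_log = []  # who accessed what, and for what purpose

    def publish(self, name, policy):
        self.datasets[name] = policy

    def search(self, keyword):
        return [d for d in self.datasets if keyword in d]

    def checkout(self, user, cart, purpose, accepted_terms):
        # consumers must accept the terms before being given access
        if not accepted_terms:
            raise PermissionError("terms must be accepted before delivery")
        for d in cart:
            # auditability: record who used what data, and why
            self.audit_log.append({"user": user, "dataset": d,
                                   "policy": self.datasets[d],
                                   "purpose": purpose})
        return cart  # stand-in for secure delivery


mp = DataMarketplace()
mp.publish("sales_2021", policy="internal-use-only")
mp.publish("customer_pii", policy="restricted")

cart = mp.search("sales")
delivered = mp.checkout("analyst@example.com", cart,
                        purpose="quarterly forecast", accepted_terms=True)
print(delivered, len(mp.audit_log))
```

The design choice worth noting is that the policy tag travels with the dataset into the audit record, so the log can answer not just "who used this data" but "under which terms."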