Data Driven Enterprise – Part II: Building an operative data ecosystem strategy
Ecosystems—interconnected sets of services in a single integrated experience—have emerged across a range of industries, from financial services to retail to healthcare. Ecosystems are not limited to a single sector; indeed, many transcend multiple sectors. For traditional incumbents, ecosystems can provide a golden opportunity to increase their influence and fend off potential disruption by faster-moving digital attackers. For example, banks are at risk of losing half of their margins to fintechs, but they have the opportunity to increase margins by a similar amount by orchestrating an ecosystem.
In my experience, many ecosystems focus on the provision of data: exchange, availability, and analysis. Incumbents seeking to excel in these areas must develop the proper data strategy, business model, and architecture.
What is a data ecosystem?
Simply put, a data ecosystem is a platform that combines data from numerous providers and builds value through the usage of processed data. A successful ecosystem balances two priorities:
- Building economies of scale by attracting participants through lower barriers to entry. In addition, the ecosystem must generate clear customer benefits and dependencies beyond the core product to establish high exit barriers over the long term.
- Cultivating a collaboration network that motivates a large number of parties with similar interests (such as app developers) to join forces and pursue similar objectives. One of the key benefits of the ecosystem comes from the participation of multiple categories of players (such as app developers and app users).
What are the data-ecosystem archetypes?
As data ecosystems have evolved, five archetypes have emerged. They vary based on the model for data aggregation, the types of services offered, and the engagement methods of other participants in the ecosystem.
- Data utilities. By aggregating data sets, data utilities provide value-adding tools and services to other businesses. The category includes credit bureaus, consumer-insights firms, and insurance-claim platforms.
- Operations optimization and efficiency centers of excellence. This archetype vertically integrates data within the business and the wider value chain to achieve operational efficiencies. An example is an ecosystem that integrates data from entities across a supply chain to offer greater transparency and management capabilities.
- End-to-end cross-sectorial platforms. By integrating multiple partner activities and data, this archetype provides an end-to-end service to the customers or business through a single platform. Car reselling, testing platforms, and partnership networks with a shared loyalty program exemplify this archetype.
- Marketplace platforms. These platforms offer products and services as a conduit between suppliers and consumers or businesses. Amazon and Alibaba are leading examples.
- B2B infrastructure (platform as a business). This archetype builds a core infrastructure and tech platform on which other companies establish their ecosystem business. Examples of such businesses are data-management platforms and payment-infrastructure providers.
The ingredients for a successful data ecosystem
Data ecosystems have the potential to generate significant value. However, the entry barriers to establishing an ecosystem are typically high, so companies must understand the landscape and potential obstacles. Typically, the hardest pieces to figure out are finding the best business model to generate revenues for the orchestrator and ensuring participation.
If the market already has a large, established player, companies may find it difficult to stake out a position. To choose the right partners, executives need to pinpoint the value they can offer and then select collaborators who complement and support their strategic ambitions. Similarly, companies should look to create a unique value proposition and excellent customer experience to attract both end customers and other collaborators. Working with third parties often requires additional resources, such as negotiating teams supported by legal specialists to negotiate and structure the collaboration with potential partners. Ideally, partnerships should be mutually beneficial arrangements between the ecosystem leader and other participants.
As companies look to enable data pooling and the benefits it can generate, they must be aware of laws regarding competition. Companies that agree to share access to data, technology, and collection methods restrict access for other companies, which could raise anti-competition concerns. Executives must also ensure that they address privacy concerns, which can differ by geography.
Other capabilities and resources are needed to create and build an ecosystem. For example, to find and recruit specialists and tech talent, organizations must create career opportunities and a welcoming environment. Significant investments will also be needed to cover the costs of data-migration projects and ecosystem maintenance.
Ensuring ecosystem participants have access to data
Once a company selects its data-ecosystem archetype, executives should then focus on setting up the right infrastructure to support its operation. An ecosystem can’t deliver on its promise to participants without ensuring access to data, and that critical element relies on the design of the data architecture. We have identified five questions that incumbents must resolve when setting up their data ecosystem.
How do we exchange data among partners in the ecosystem?
Industry experience shows that standard data-exchange mechanisms among partners, such as cookie handshakes, can be effective. The data exchange typically follows three steps: establishing a secure connection, exchanging data through browsers and clients, and storing results centrally when necessary.
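The three-step exchange above can be sketched in code. This is a minimal illustration, not a production protocol: the shared secret, record shapes, and in-memory "central store" are all hypothetical stand-ins (real deployments would rely on TLS and managed key rotation).

```python
import hashlib
import hmac
import json

# Step 1: establish a secure connection — modeled here as an HMAC scheme in
# which both parties prove knowledge of a shared secret.
SHARED_SECRET = b"example-secret"  # hypothetical; real setups use TLS + rotated keys

def sign(payload: bytes) -> str:
    return hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()

# Step 2: exchange data — the sender attaches a signature that the receiver
# verifies before accepting the payload.
def send(record: dict) -> dict:
    body = json.dumps(record, sort_keys=True).encode()
    return {"body": body, "signature": sign(body)}

def receive(envelope: dict) -> dict:
    if not hmac.compare_digest(sign(envelope["body"]), envelope["signature"]):
        raise ValueError("signature mismatch — reject the payload")
    return json.loads(envelope["body"])

# Step 3: store results centrally when necessary.
central_store = []
central_store.append(receive(send({"partner": "A", "metric": 42})))
```

The point of the sketch is the separation of concerns: authentication of the channel, verification of each payload, and central storage are three distinct decisions an orchestrator must make.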
How do we manage identity and access?
Companies can pursue two strategies to select and implement an identity-management system. The more common approach is to centralize identity management through solutions such as Okta, OpenID, or Ping. An emerging approach is to decentralize and federate identity management—for example, by using blockchain ledger mechanisms.
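The centralized approach typically means a single party issues and validates access tokens for all ecosystem participants. The sketch below shows the issue-and-verify pattern in simplified form; the signing key, claim names, and HMAC scheme are illustrative assumptions (products such as Okta or Ping issue asymmetrically signed OpenID Connect tokens rather than this toy format).

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-signing-key"  # hypothetical; real identity providers use RS256 keys

def issue_token(subject: str, ttl: int = 3600) -> str:
    """Central identity provider issues a signed, expiring token."""
    claims = {"sub": subject, "exp": int(time.time()) + ttl}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def verify_token(token: str) -> dict:
    """Any ecosystem service verifies the signature and expiry locally."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    claims = json.loads(base64.urlsafe_b64decode(body))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims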
How can we define data domains and storage?
Traditionally, an ecosystem orchestrator would centralize data within each domain. More recent trends in data-asset management favor an open data-mesh architecture. Data mesh challenges the conventional centralization of data ownership within one party by using the existing definitions and domain assets within each party, based on each use case or product. Certain use cases may still require centralized domain definitions with central storage. In addition, global data-governance standards must be defined to ensure interoperability of data assets.
How do we manage access to non-local data assets, and how can we possibly consolidate?
Most use cases can be implemented with periodic data loads through application programming interfaces (APIs). This approach results in a majority of use cases having decentralized data storage. Pursuing this environment requires two enablers: a central API catalog that defines all APIs available to ensure consistency of approach, and strong group governance for data sharing.
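A central API catalog can be as simple as a registry that records where each partner feed lives and how often it should be pulled. The sketch below assumes hypothetical feed names, URLs, and refresh windows; the scheduling logic is the part that matters.

```python
# Hypothetical central API catalog: every partner API available to the
# ecosystem is registered here, giving one consistent point of discovery.
API_CATALOG = {
    "claims-data": {"url": "https://partner-a.example/api/claims", "refresh_hours": 24},
    "pricing-data": {"url": "https://partner-b.example/api/prices", "refresh_hours": 6},
}

def apis_due_for_refresh(hours_since_last: dict) -> list:
    """Return the catalog entries whose periodic-load window has elapsed.

    `hours_since_last` maps an API name to hours since its last pull;
    an API never pulled before is always due.
    """
    return [
        name
        for name, spec in API_CATALOG.items()
        if hours_since_last.get(name, float("inf")) >= spec["refresh_hours"]
    ]
```

A scheduler would call `apis_due_for_refresh` each cycle and trigger the actual loads, keeping storage decentralized while the catalog and refresh policy stay centrally governed.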
How do we scale the ecosystem, given its heterogeneous and loosely coupled nature?
Enabling rapid and decentralized access to data or data outputs is the key to scaling the ecosystem. This objective can be achieved by having robust governance to ensure that all participants of the ecosystem do the following:
- Make their data assets discoverable, addressable, versioned, and trustworthy in terms of accuracy
- Use self-describing semantics and open standards for data exchange
- Support secure exchanges while allowing access at a granular level
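The three governance requirements above translate naturally into a metadata record that every participant publishes for each data asset. The field names and the accuracy threshold below are illustrative assumptions, not a standard schema.

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    """Hypothetical catalog entry making an asset discoverable,
    addressable, versioned, and trustworthy."""
    name: str              # discoverable: searchable identifier
    address: str           # addressable: stable URI where the asset lives
    version: str           # versioned: schema/content version
    accuracy_score: float  # trustworthy: measured accuracy between 0 and 1
    schema: dict = field(default_factory=dict)  # self-describing semantics

def is_publishable(asset: DataAsset, min_accuracy: float = 0.95) -> bool:
    """Governance gate: only assets with a declared schema and sufficient
    measured accuracy are exposed to other ecosystem participants."""
    return bool(asset.schema) and asset.accuracy_score >= min_accuracy
```

Enforcing a gate like `is_publishable` at registration time is one way robust governance can be automated rather than policed manually.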
The success of a data-ecosystem strategy depends on data availability and digitization, API readiness to enable integration, data privacy and compliance—for example, General Data Protection Regulation (GDPR)—and user access in a distributed setup. This range of attributes requires companies to design their data architecture to check all these boxes.
As incumbents consider establishing data ecosystems, we recommend they develop a road map that specifically addresses the common challenges. They should then look to define their architecture to ensure that the benefits to participants and themselves come to fruition. The good news is that the data-architecture requirements for ecosystems are not complex. The priority components are identity and access management, a minimum set of tools to manage data and analytics, and central data storage. That said, developing an operative data-ecosystem strategy is far more difficult than getting the technology requirements right.
Related Posts
AIQRATIONS
Data Driven Enterprise – Part I: Building an effective Data Strategy for competitive edge
Few enterprises take full advantage of data generated outside their walls. A well-structured data strategy for using external data can provide a competitive edge. Many enterprises have made great strides in collecting and utilizing data from their own activities. So far, though, comparatively few have realized the full potential of linking internal data with data provided by third parties, vendors, or public data sources. Overlooking such external data is a missed opportunity. Organizations that stay abreast of the expanding external-data ecosystem and successfully integrate a broad spectrum of external data into their operations can outperform other companies by unlocking improvements in growth, productivity, and risk management.
The COVID-19 crisis provides an example of just how relevant external data can be. In a few short months, consumer purchasing habits, activities, and digital behavior changed dramatically, making preexisting consumer research, forecasts, and predictive models obsolete. Moreover, as organizations scrambled to understand these changing patterns, they discovered little of use in their internal data. Meanwhile, a wealth of external data could—and still can—help organizations plan and respond at a granular level.
Although external-data sources offer immense potential, they also present several practical challenges. To start, simply gaining a basic understanding of what’s available requires considerable effort, given that the external-data environment is fragmented and expanding quickly. Thousands of data products can be obtained through a multitude of channels—including data brokers, data aggregators, and analytics platforms—and the number grows every day. Analyzing the quality and economic value of data products also can be difficult. Moreover, efficient usage and operationalization of external data may require updates to the organization’s existing data environment, including changes to systems and infrastructure. Companies also need to remain cognizant of privacy concerns and consumer scrutiny when they use some types of external data.
These challenges are considerable but surmountable. This blog series discusses the benefits of tapping external-data sources, illustrated through a variety of examples, and lays out best practices for getting started. These include establishing an external-data strategy team and developing relationships with data brokers and marketplace partners. Company leaders, such as the executive sponsor of a data effort and a chief data and analytics officer, and their data-focused teams should also learn how to rigorously evaluate and test external data before using and operationalizing the data at scale.
External-data success stories: Companies across industries have begun successfully using external data from a variety of sources. The investment community is a pioneer in this space. To predict outcomes and generate investment returns, analysts and data scientists in investment firms have gathered “alternative data” from a variety of licensed and public data sources, many of which draw from the “digital exhaust” of a growing number of technology companies and the public web. Investment firms have established teams that assess hundreds of these data sources and providers and then test their effectiveness in investment decisions.
A broad range of data sources are used, and these inform investment decisions in a variety of ways:
- Investors actively gather job postings, company reviews posted by employees, employee-turnover data from professional networking and career websites, and patent filings to understand company strategy and predict financial performance and organizational growth.
- Analysts use aggregated transaction data from card processors and digital-receipt data to understand the volume of purchases by consumers, both online and offline, and to identify which products are increasing in share. This gives them a better understanding of whether traffic is declining or growing, as well as insights into cross-shopping behaviors.
- Investors study app downloads and digital activity to understand how consumer preferences are changing and how effective an organization’s digital strategy is relative to that of its peers. For instance, app downloads, activity, and rating data can provide a window into the success rates of the myriad of live-streaming exercise offerings that have become available over the last year.
Corporations have also started to explore how they can derive more value from external data. For example, a large insurer transformed its core processes, including underwriting, by expanding its use of external-data sources from a handful to more than 40 in the span of two years. The effort involved was considerable; it required prioritization from senior leadership, dedicated resources, and a systematic approach to testing and applying new data sources. The hard work paid off, increasing the predictive power of core models by more than 20 percent and dramatically reducing application complexity by allowing the insurer to eliminate many of the questions it typically included on customer applications.
Three steps to creating value with external data:
Use of external data has the potential to be game changing across a variety of business functions and sectors. The journey toward successfully using external data has three key steps.
1. Establish a dedicated team for external-data sourcing
To get started, organizations should establish a dedicated data-sourcing team. Per our understanding at AIQRATE, a key role on this team is a dedicated data scout or strategist who partners with the data-analytics team and business functions to identify operational, cost, and growth improvements that could be powered by external data. This person also would be responsible for building excitement around what can be made possible through the use of external data, planning the use cases to focus on, identifying and prioritizing data sources for investigation, and measuring the value generated through use of external data. Ideal candidates for this role are individuals who have served as analytics translators and who have experience in deploying analytics use cases and in working with technology, business, and analytics profiles.
The other team members, who should be drawn from across functions, would include purchasing experts, data engineers, data scientists and analysts, technology experts, and data-review-board members. These team members typically spend only part of their time supporting the data-sourcing effort. For example, the data analysts and data scientists may already be supporting data cleaning and modeling for a specific use case and help the sourcing work stream by applying the external data to assess its value. The purchasing expert, already well versed in managing contracts, will build specialization on data-specific licensing approaches to support those efforts.
Throughout the process of finding and using external data, companies must keep in mind privacy concerns and consumer scrutiny, making data-review roles essential peripheral team members. Data reviewers, who typically include legal, risk, and business leaders, should thoroughly vet new consumer data sets—for example, financial transactions, employment data, and cell-phone data indicating when and where people have entered retail locations. The vetting process should ensure that all data were collected with appropriate permissions and will be used in a way that abides by relevant data-privacy laws and passes muster with consumers.
This team will need a budget to procure small exploratory data sets, establish relationships with data marketplaces (such as by purchasing trial licenses), and pay for technology requirements (such as expanded data storage).
2. Develop relationships with data marketplaces and aggregators
While online searches may appear to be an easy way for data-sourcing teams to find individual data sets, that approach is not necessarily the most effective. It generally leads to a series of time-consuming vendor-by-vendor discussions and negotiations. The process of developing relationships with a vendor, procuring sample data, and negotiating trial agreements often takes months. A more effective strategy involves using data-marketplace and -aggregation platforms that specialize in building relationships with hundreds of data sources, often in specific data domains—for example, consumer, real-estate, government, or company data. These relationships can give organizations ready access to the broader data ecosystem through an intuitive search-oriented platform, allowing organizations to rapidly test dozens or even hundreds of data sets under the auspices of a single contract and negotiation. Since these external-data distributors have already profiled many data sources, they can be valuable thought partners and can often save an external-data team significant time. When needed, these data distributors can also help identify valuable data products and act as the broker to procure the data.
Once the team has identified a potential data set, the team’s data engineers should work directly with business stakeholders and data scientists to evaluate the data and determine the degree to which the data will improve business outcomes. To do so, data teams establish evaluation criteria, assessing data across a variety of factors to determine whether the data set has the necessary characteristics for delivering valuable insights. Data assessments should include an examination of quality indicators, such as fill rates, coverage, bias, and profiling metrics, within the context of the use case. For example, a transaction data provider may claim to have hundreds of millions of transactions that help illuminate consumer trends. However, if the data include only transactions made by millennial consumers, the data set will not be useful to a company seeking to understand broader, generation-agnostic consumer trends.
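Two of the quality indicators named above, fill rate and coverage, are straightforward to compute. The sketch below uses a tiny hypothetical transaction sample; the column names and segment labels are illustrative, not from any real data product.

```python
def fill_rate(rows: list, column: str) -> float:
    """Share of rows with a non-null value for `column`."""
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get(column) is not None) / len(rows)

def coverage_bias(rows: list, column: str, expected: set) -> set:
    """Segments the use case needs but the data set never covers."""
    return expected - {r.get(column) for r in rows}

# Hypothetical sample from a transaction data provider.
rows = [
    {"amount": 25.0, "segment": "millennial"},
    {"amount": None, "segment": "millennial"},
    {"amount": 40.0, "segment": "millennial"},
]
# fill_rate(rows, "amount") -> 2/3, a red flag for a sparsely populated field.
# coverage_bias(rows, "segment", {"millennial", "gen-x", "boomer"}) surfaces
# exactly the millennial-only skew described in the example above.
```

Running such checks against the use case's expected population turns the "is this data set actually useful?" question into a measurable gate rather than a judgment call.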
3. Prepare the data architecture for new external-data streams
Generating a positive return on investment from external data calls for up-front planning, a flexible data architecture, and ongoing quality-assurance testing.
Up-front planning starts with an assessment of the existing data environment to determine how it can support ingestion, storage, integration, governance, and use of the data. The assessment covers issues such as how frequently the data come in, the amount of data, how data must be secured, and how external data will be integrated with internal data. This will provide insights about any necessary modifications to the data architecture.
Modifications should be designed to ensure that the data architecture is flexible enough to support the integration of a continuous “conveyor belt” of incoming data from a variety of data sources—for example, by enabling application-programming-interface (API) calls from external sources along with entity-resolution capabilities to intelligently link the external data to internal data. In other cases, it may require tooling to support large-scale data ingestion, querying, and analysis. Data architecture and underlying systems can be updated over time as needs mature and evolve.
The final process in this step is ensuring an appropriate and consistent level of quality by constantly monitoring the data used. This involves examining data regularly against the established quality framework to identify whether the source data have changed and to understand the drivers of any changes (for example, schema updates, expansion of data products, change in underlying data sources). If the changes are significant, algorithmic models leveraging the data may need to be retrained or even rebuilt.
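The schema-update check described above can be automated with a simple diff between the last known schema and the one observed in the latest feed. This is a minimal sketch; the `{column: dtype}` representation is an assumed simplification of whatever schema registry a team actually uses.

```python
def schema_drift(previous: dict, current: dict) -> dict:
    """Compare two {column: dtype} schemas from successive data loads and
    report the changes that may require retraining downstream models."""
    return {
        "added": sorted(set(current) - set(previous)),
        "removed": sorted(set(previous) - set(current)),
        "retyped": sorted(
            col for col in set(previous) & set(current)
            if previous[col] != current[col]
        ),
    }
```

A monitoring job would run this on every refresh and alert when `removed` or `retyped` is non-empty, since those are the changes most likely to silently break models leveraging the feed.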
Minimizing risk and creating value with external data will require a unique mix of creative problem solving, organizational capability building, and laser-focused execution. That said, business leaders who demonstrate the achievements possible with external data can capture the imagination of the broader leadership team and build excitement for scaling beyond early pilots and tests. An effective route is to begin with a small team that is focused on using external data to solve a well-defined problem and then use that success to generate momentum for expanding external-data efforts across the organization.
Redefine the new code for GCCs: Winning with AI – strategic perspectives
Global Capability Centers (GCCs) are strategic extensions of their parent organizations’ business imperatives. GCCs are at an inflection point, as AI is changing every aspect of their work at an exponential pace. The rapid transformation and innovation of GCCs today is driven largely by their ability to position AI as a strategic imperative for their parent organizations. AI is seen as the Trojan horse that can catapult GCCs to the next level of innovation and transformation. In recent times, the GCC story has entered a new era of value and transformative arbitrage.
Most GCCs aim to deploy a suite of AI-led strategies to position themselves as the model template of an AI Center of Excellence. It is widely predicted that AI will disrupt and transform capability centers in the coming decades. How are Global Capability Centers in India positioning themselves as the model template for developing an AI center of competence? And how have the strategies of GCCs transformed with reference to their parent organizations while delivering tangible business outcomes, innovation, and transformation?
Strategic imperatives for GCCs to consider as they move up the value chain, develop an edge, and start winning with AI:
AI transformation
Artificial Intelligence has become a main focus area for GCCs in India. The increasing digital penetration of GCCs across business verticals has made it imperative to focus on AI. Hence, GCCs are upping their innovation agenda by building bespoke AI capabilities, solutions, and offerings. Accelerated AI adoption has transcended industry verticals, with organizations exploring different use cases and application areas. GCCs in India are strategically leveraging one of the following approaches to drive AI penetration:
- Federated Approach: Different teams within GCCs drive AI initiatives
- Centralized Approach: Focus is to build a central team with top talent and niche skills that would cater to the parent organization requirements
- Partner Ecosystem: Paves a new channel for GCCs by partnering with research institutes, start-ups, and accelerators
- Hybrid Approach: A mix of any two or more of the above approaches, leveraged according to the GCC’s needs and constraints.
Ecosystem creation: start-ups, research institutes, and accelerators
One of the crucial ways that GCCs can boost their innovation agenda is by collaborating with start-ups, research institutes, and accelerators. Hence, GCCs are employing a variety of strategies to build the ecosystem. These collaborations are a combination of build, buy, and partner models:
- Platform Evangelization: GCCs offer access to their AI platforms to start-ups
- License or Vendor Agreement: GCCs and start-ups enter into a license agreement to create solutions
- Co-innovate: Start-ups and GCCs collaborate to co-create new solutions & capabilities
- Acqui-hire: GCCs acquire start-ups for the talent & capability
- Research centers: GCCs collaborate with academic institutes for joint IP creation, open research, and customized programs
- Joint accelerator programs: GCCs and accelerators build joint programs for customized start-up cohorts
To drive these ecosystem creation models, GCCs can leverage different approaches. Further, successful collaboration programs have a high degree of customization, with clearly defined objectives and talent allocation to drive tangible and impact driven business outcomes.
Differentiated AI Center of Capability
GCCs are increasingly shifting to competency- and capability-creation models to reduce time-to-market. In this model, AI Center of Competence teams are aligned to capability lines of business, where the center is responsible for creating AI capabilities, roadmaps, and new value offerings in collaboration with the parent organization’s business teams. This alignment gives specific roles clear visibility into business user requirements. Further, capability creation combined with parent-organization alignment helps deliver tangible value outcomes. In several cases, AI teams are building a new range of innovations around AI-based capabilities and solutions to showcase the GCC as a model template for innovation and transformation. GCCs need to conceptualize a bespoke strategy for building and sustaining an AI Center of Competence and keep it high on the value chain with mature and measured transformation- and innovation-led metrics.
AI Talent Mapping Strategy:
With the evolution from analytics and data sciences to AI, the lines between different skills are blurring. GCCs are witnessing a convergence of skills required across verticals. The strategic shift of GCCs toward the AI-center-of-capability model has led to the creation of AI, data engineering, and design roles. To build skills in AI and data engineering, GCCs are adopting a hybrid approach: the skill-development roadmap for AI is a combination of build and buy strategies. The decision to acquire talent from the ecosystem or build capabilities internally is a function of three parameters: the maturity of the GCC’s existing AI capabilities in the desired or adjacent areas, the tactical nature of the skill requirement, and the availability and accessibility of talent in the ecosystem. There is a heavy inclination toward building skills in-house, and a majority of GCCs have stressed that the bulk of future AI deployment will come through in-house skill-building and reskilling initiatives. However, talent mapping for AI capability must be a measured approach; otherwise it can become an Achilles heel for GCC and HR leaders.
GCCs in India are uniquely positioned to drive the next wave of growth by building high-impact AI centers of competence. There is a slew of innovative and transformative models that they are working on to up the ante, trigger new customer experiences, products, and services, and unleash business transformation for their parent organizations. This will not only set existing GCCs on the path to cutting-edge innovation but also pave the way for other global organizations contemplating a global center setup in India. AI is becoming the front runner in driving innovation and transformation for GCCs.
Key Strategic Imperatives for GCCs to drive AI Center of Excellence : The new model
Global Capability Centers (GCCs) are at an inflection point, as AI is changing every aspect of their work at an exponential pace. The rapid transformation and innovation of GCCs today is driven largely by their ability to position AI as a strategic imperative for their parent organizations. AI is seen as the Trojan horse that can catapult GCCs to the next level of innovation and transformation. In recent times, the GCC story has entered a new era of value and transformative arbitrage. Most GCCs aim to deploy a suite of AI-led strategies to position themselves as the model template of an AI Center of Excellence. It is widely predicted that AI will disrupt and transform capability centers in the coming decades. How are Global Capability Centers in India positioning themselves as the model template for developing an AI center of competence? And how have the strategies of GCCs transformed with reference to their parent organizations while delivering tangible business outcomes, innovation, and transformation?
Strategic imperatives for GCCs to consider as they move up the value chain and become a premier AI center of excellence
AI transformation
Artificial Intelligence has become a main focus area for GCCs in India. The increasing digital penetration of GCCs across business verticals has made it imperative to focus on AI. Hence, GCCs are upping their innovation agenda by building bespoke AI CoEs. Accelerated AI adoption has transcended industry verticals, with organizations exploring different use cases and application areas. GCCs in India are strategically leveraging one of the following approaches to drive AI penetration:
- Federated Approach: Different teams within GCCs drive AI initiatives
- Centralized Approach: Focus is to build a central team with top talent and niche skills that would cater to the parent organization requirements
- Partner Ecosystem: Paves a new channel for GCCs by partnering with research institutes, start-ups, and accelerators
- Hybrid Approach: A mix of any two or more of the above approaches, leveraged according to the GCC’s needs and constraints.
Ecosystem creation: start-ups, research institutes, and accelerators
One of the crucial ways that GCCs can boost their innovation agenda is by collaborating with start-ups, research institutes, and accelerators. Hence, GCCs are employing a variety of strategies to build the ecosystem. These collaborations are a combination of build, buy, and partner models:
- Platform Evangelization: GCCs offer access to their AI platforms to start-ups
- License or Vendor Agreement: GCCs and start-ups enter into a license agreement to create solutions
- Co-innovate: Start-ups and GCCs collaborate to co-create new solutions & capabilities
- Acqui-hire: GCCs acquire start-ups for the talent & capability
- Research centers: GCCs collaborate with academic institutes for joint IP creation, open research, and customized programs
- Joint accelerator programs: GCCs and accelerators build joint programs for customized start-up cohorts
To drive these ecosystem creation models, GCCs can leverage different approaches. Further, successful collaboration programs have a high degree of customization, with clearly defined objectives and talent allocation to drive tangible and impact driven business outcomes.
AI Center of Competence/Capability
GCCs are increasingly shifting to competency- and capability-creation models to reduce time-to-market. In this model, AI Center of Competence teams are aligned to capability lines of business, where the center is responsible for creating AI capabilities, roadmaps, and new value offerings in collaboration with the parent organization’s business teams. This alignment gives specific roles clear visibility into business user requirements. Further, capability creation combined with parent-organization alignment helps deliver tangible value outcomes. In several cases, AI teams are building a new range of innovations around AI-based capabilities and solutions to showcase the GCC as a model template for innovation and transformation. GCCs need to conceptualize a bespoke strategy for building and sustaining an AI Center of Competence and keep it high on the value chain with mature and measured transformation- and innovation-led metrics.
Talent Mapping Strategy
With the evolution from analytics and data science to AI, the lines between different skills are blurring, and GCCs are witnessing a convergence of the skills required across verticals. The strategic shift of GCCs towards the AI center of capability model has led to the creation of AI, data engineering, and design roles. To build skills in AI and data engineering, GCCs are adopting a hybrid approach: the skill-development roadmap for AI is a combination of build and buy strategies. The decision to acquire talent from the ecosystem or build capabilities internally is a function of three parameters: the maturity of the GCC's existing AI capabilities in the desired or adjacent areas, the tactical nature of the skill requirement, and the availability and accessibility of talent in the ecosystem. There is a heavy inclination towards building skills in-house, and a majority of GCCs have stressed that the bulk of future deployment in AI areas will come through in-house skill-building and reskilling initiatives. However, the talent mapping strategy for building AI capability must be a measured approach, or it can become an Achilles' heel for GCC and HR leaders.
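The three-parameter build-versus-buy decision described above can be sketched as a simple scoring heuristic. This is an illustrative assumption only: the scoring scale, weights, and function name below are hypothetical, not a model prescribed by any GCC.

```python
def talent_sourcing_decision(capability_maturity, tactical_need, ecosystem_availability):
    """Illustrative heuristic for the build-vs-buy AI talent decision.

    All three inputs are scores from 0 (low) to 1 (high):
      capability_maturity    -- maturity of the GCC's existing AI capabilities
                                in the desired or adjacent areas
      tactical_need          -- how tactical (short-term) the skill requirement is
      ecosystem_availability -- availability and accessibility of talent
                                in the external ecosystem
    Returns "build" (in-house skilling/reskilling) or "buy" (hire externally).
    """
    # Mature internal capability favors building in-house; a tactical,
    # readily available skill favors buying from the market.
    build_score = capability_maturity
    buy_score = (tactical_need + ecosystem_availability) / 2
    return "build" if build_score >= buy_score else "buy"

# A GCC with strong adjacent AI capability and a strategic (non-tactical)
# skill need leans toward in-house skill-building.
print(talent_sourcing_decision(0.8, 0.2, 0.5))  # -> build
```

In practice the weighting of these parameters would be calibrated per role and per GCC; the point is simply that the decision can be made explicit and repeatable rather than ad hoc.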
GCCs in India are uniquely positioned to drive the next wave of growth by building high-impact AI centers of competence. They are working on a slew of innovative and transformative models to up the ante, trigger new customer experiences, products, and services, and unleash business transformation for their parent organizations. This will not only set existing GCCs on the path to cutting-edge innovation but also pave the way for other global organizations contemplating a global center setup in India. AI is becoming the front runner in driving innovation and transformation for GCCs.
Lock in winning AI deals: Strategic recommendations for enterprises & GCCs
Artificial Intelligence is unleashing exciting growth opportunities for enterprises and GCCs; at the same time, it also presents challenges and complexities when sourcing, negotiating, and enabling AI deals. The hype surrounding this rapidly evolving space can make it seem as if AI providers hold the most power at the negotiation table. After all, the market is ripe with narratives from analysts stating that enterprises and GCCs that fail to embrace and implement AI swiftly run the risk of losing their competitiveness. With a pragmatic approach and acknowledgement of concerns and potential risks, it is possible to negotiate mutually beneficial contracts that are flexible, agile, and, most importantly, scalable. The following strategic choices will help you lock in winning AI deals:
Understand AI readiness & roadmap and use cases
It can be difficult to predict exactly where and how AI will be used in the future, as the technology is constantly developing, but creating a readiness roadmap and identifying a reckoner of potential use cases is a must. This readiness roadmap will help guide your sourcing efforts so you can find the provider best suited to your needs and able to scale with your business use cases. You must also clearly frame your targeted objectives, both in your discussions with vendors and in the contract. This includes not only a stated performance objective for the AI, but also a definition of what would constitute failure and the legal consequences thereof.
Understand your service provider’s roadmap and ability to provide AI evolution to steady state
Once you begin discussions with AI vendors and providers, be sure to ask how evolved their capabilities and offerings are, and about the complexity of the data sets used to train their systems, along with the implementation use cases. These discussions can uncover potential business and security risks and help shape the questions the procurement and legal teams should address in the sourcing process. Understanding the service provider's roadmap will also help you decide whether they will be able to grow and scale with you. Gaining insight into the service provider's growth plans can uncover how they will benefit from your business and where they stand against their competitors. The cutthroat competition among AI rivals means that early-adopter enterprises and GCCs that want to pilot or deploy AI@scale will see more capabilities available at ever-lower prices over time. Always note that AI service providers benefit significantly from the use cases you bring forward for trial, as well as from the vast amounts of data being processed on their platforms. These points should be leveraged to negotiate a better deal.
Identify business risk cycles & inherent bias
As with any implementation, it is important to assess the various risks involved. As technologies become increasingly interconnected, entry points for potential data breaches and risk of potential compliance claims from indirect use also increase. What security measures are in place to protect your data and prevent breaches? How will indirect use be measured and enforced from a compliance standpoint? Another risk AI is subject to is unintentional bias from developers and the data being used to train the technology. Unlike traditional systems built on specific logic rules, AI systems deal with statistical truths rather than literal truths. This can make it extremely difficult to prove with complete certainty that the system will work in all cases as expected.
Develop a sourcing and negotiation plan
Using what you have learned in the first three steps, develop a sourcing and negotiation plan that focuses on transparency and clearly defined accountability. You should seek to build an agreement that aligns both your enterprise's and the service provider's roadmaps and addresses data ownership and overall business and security risks. For the development of AI, the transparency of the algorithm is essential so that unintended bias can be addressed. Moreover, such systems should be subjected to extensive testing based on appropriate data sets, as they need to be "trained" to gain equivalence to human decision making. Gaining upfront and ongoing visibility into how the systems will be trained and tested will help you hold the AI provider accountable for potential mishaps resulting from their own erroneous data and help ensure the technology is working as planned.
Develop a deep understanding of your data, IP, and commercial aspects
Another major issue with AI is the intellectual property of the data integrated and generated by an AI product. For an artificial intelligence system to become effective, enterprises would likely have to supply an enormous quantity of data and invest considerable human and financial resources to guide its learning. Does the service provider of the artificial intelligence system acquire any rights to such data? Can it use what its artificial intelligence system learned in one company’s use case to benefit its other customers? In extreme cases, this could mean that the experience acquired by a system in one company could benefit its competitors. If AI is powering your business and product, or if you start to sell a product using AI insights, what commercial protections should you have in place?
In the end, do realize the enormous value of your data; participate in AI readiness and maturity workshops and immersion sessions, and identify new and practical AI use cases. All of this is hugely beneficial to the service provider's success as well, and it will enable you to strategically source and win the right AI deal.
(AIQRATE advisory & consulting is a bespoke global AI advisory & consulting firm providing strategic advisory services to boards, CXOs, and senior leaders to curate and design the building blocks of AI strategy, embed AI@scale interventions, and create AI-powered enterprises. Visit www.aiqrate.ai or reach out to us at consult@aiqrate.ai)