Reimagine Business Strategy & Operating Models with AI: The CXO’s Playbook
AlphaGo caused a stir by defeating 18-time world champion Lee Sedol in Go, a game thought to be impenetrable by AI for another 10 years. AlphaGo’s success is emblematic of a broader trend: An explosion of data and advances in algorithms have made technology smarter than ever before. Machines can now carry out tasks ranging from recommending movies to diagnosing cancer — independently of, and in many cases better than, humans. In addition to executing well-defined tasks, technology is starting to address broader, more ambiguous problems. It’s not implausible to imagine that one day a “strategist in a box” could autonomously develop and execute a business strategy. I have spoken to several CXOs and leaders who express such a vision, and they would like to embed AI in their business strategy and operating models.
Business Processes – Increasing productivity by reducing disruptions
AI algorithms are not natively “intelligent.” They learn inductively by analyzing data. Most leaders are investing in AI talent and have built robust information infrastructures. When Airbus started to ramp up production of its new A350 aircraft, the company faced a multibillion-euro challenge: the plan was to increase the production rate of that aircraft faster than ever before. To do that, Airbus needed to respond quickly to disruptions in the factory, because disruptions will happen. Airbus turned to AI. It combined data from past production programs, continuing input from the A350 program, fuzzy matching, and a self-learning algorithm to identify patterns in production problems. The approach led to the rectification of about 70% of production disruptions at Airbus, by matching them to solutions used previously, in near real time.
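A minimal illustrative sketch (in Python, using only the standard library’s difflib) of the general pattern described here: fuzzy-matching a new disruption description against a log of past disruptions and surfacing the fixes that resolved them. The data and function names are hypothetical; Airbus’s actual system is far richer.

```python
# Illustrative sketch only, not Airbus's system: fuzzy-match a new disruption
# description against past disruptions and suggest the fixes that worked before.
from difflib import SequenceMatcher

# Hypothetical history of (disruption description, fix that resolved it)
history = [
    ("torque wrench calibration drift on station 40", "recalibrate tooling and re-check fastener batch"),
    ("missing bracket fasteners at fuselage join", "expedite kitting from line-side buffer stock"),
    ("hydraulic test rig pressure fault", "swap sensor harness and rerun acceptance test"),
]

def suggest_fixes(new_issue: str, top_n: int = 2):
    """Rank past disruptions by textual similarity and return their fixes."""
    scored = [
        (SequenceMatcher(None, new_issue.lower(), past.lower()).ratio(), past, fix)
        for past, fix in history
    ]
    scored.sort(reverse=True)  # highest similarity first
    return scored[:top_n]

for score, past, fix in suggest_fixes("fastener shortage at fuselage join station"):
    print(f"{score:.2f}  previously: {past!r}  ->  try: {fix!r}")
```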
Just as it is enabling speed and efficiency at Airbus, AI is leading directly to new, better processes and results at other pioneering organizations. Large companies such as BP, Wells Fargo, and Ping An Insurance are already solving important business problems with AI. Many others, however, have yet to get started.
Integrated Strategy Machine – The Implementation Scope of AI @ scale
The integrated strategy machine is the AI analogy of what new factory designs were for electricity. In other words, the increasing intelligence of machines could be wasted unless businesses reshape the way they develop and execute their strategies. No matter how advanced technology is, it needs human partners to enhance competitive advantage. It must be embedded in what we call the integrated strategy machine. An integrated strategy machine is the collection of resources, both technological and human, that act in concert to develop and execute business strategies. It comprises a range of conceptual and analytical operations, including problem definition, signal processing, pattern recognition, abstraction and conceptualization, analysis, and prediction. One of its critical functions is reframing, which is repeatedly redefining the problem to enable deeper insights.
Amazon represents the state-of-the-art in deploying an integrated strategy machine. It has at least 21 AI systems, which include several supply chain optimization systems, an inventory forecasting system, a sales forecasting system, a profit optimization system, a recommendation engine, and many others. These systems are closely intertwined with each other and with human strategists to create an integrated, well-oiled machine. If the sales forecasting system detects that the popularity of an item is increasing, it starts a cascade of changes throughout the system: The inventory forecast is updated, causing the supply chain system to optimize inventory across its warehouses; the recommendation engine pushes the item more, causing sales forecasts to increase; the profit optimization system adjusts pricing, again updating the sales forecast.
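To make the cascade concrete, here is a deliberately simplified, hypothetical Python sketch of how interlinked forecasting, inventory, recommendation, and pricing components might propagate a demand signal. It is not Amazon’s architecture; the numbers and function names are invented purely to illustrate the idea of an integrated strategy machine.

```python
# Hypothetical sketch of a cascade triggered by a rising demand signal.

def sales_forecast(demand_signal):
    return demand_signal * 1.2                      # naive uplift when popularity rises

def inventory_plan(forecast):
    return {"reorder_units": int(forecast * 7)}     # stock roughly a week of forecast demand

def recommendation_boost(forecast):
    return min(1.0, forecast / 1000)                # push the item harder as demand grows

def price_adjustment(forecast, base_price):
    return round(base_price * (1 + 0.05 * (forecast > 500)), 2)  # small premium if demand is hot

def cascade(item, demand_signal, base_price):
    f = sales_forecast(demand_signal)
    return {
        "item": item,
        "forecast": f,
        "inventory": inventory_plan(f),
        "recommendation_weight": recommendation_boost(f),
        "price": price_adjustment(f, base_price),
    }

print(cascade("wireless-earbuds", demand_signal=600, base_price=49.99))
```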
Manufacturing Operations – An AI assistant on the floor
CXOs at industrial companies expect the largest effect in operations and manufacturing. BP plc, for example, augments human skills with AI in order to improve operations in the field. Its BP Well Advisor takes all of the data coming off the drilling systems, creates advice for engineers to adjust their drilling parameters to remain in the optimum zone, and alerts them to potential operational upsets and risks down the road. BP is also trying to automate root-cause failure analysis so that the system trains itself over time and can rapidly move from description to prediction to prescription.
Customer-facing activities – Near real time scoring
Ping An Insurance Co. of China Ltd., the second-largest insurer in China, with a market capitalization of $120 billion, is improving customer service across its insurance and financial services portfolio with AI. For example, it now offers an online loan in three minutes, thanks in part to a customer scoring tool that uses an internally developed AI-based face-recognition capability that is more accurate than humans. The tool has verified more than 300 million faces in various uses and now complements Ping An’s cognitive AI capabilities including voice and imaging recognition.
AI for Different Operational Strategy Models
To make the most of this technology implementation in various business operations in your enterprise, consider the three main ways that businesses can or will use AI:
1. Insights enabled intelligence
Now widely available, insights enabled intelligence improves what people and organizations are already doing. For example, Google’s Gmail sorts incoming email into “Primary,” “Social,” and “Promotions” default tabs. The algorithm, trained with data from millions of other users’ emails, makes people more efficient without changing the way they use email or altering the value it provides. Insights enabled intelligence tends to involve clearly defined, rules-based, repeatable tasks.
Insights enabled intelligence applications often involve computer models of complex realities that allow businesses to test decisions with less risk. For example, one auto manufacturer has developed a simulation of consumer behaviour, incorporating data about the types of trips people make, the ways those affect supply and demand for motor vehicles, and the variations in those patterns for different city topologies, marketing approaches, and vehicle price ranges. The model spells out more than 200,000 variations for the automaker to consider and simulates the potential success of any tested variation, thus assisting in the design of car launches. As the automaker introduces new cars and the simulator incorporates the data on outcomes from each launch, the model’s predictions will become ever more accurate.
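A hedged sketch of this kind of scenario simulator in Python: it enumerates launch variations (price, marketing channel, city type) and estimates expected sales with a crude demand model plus random noise. All parameters are invented; the automaker’s real model is far richer.

```python
# Hypothetical launch simulator: enumerate variations and estimate expected sales.
import itertools
import random

prices = [18000, 22000, 26000]
marketing = ["digital-heavy", "dealer-events", "mixed"]
cities = ["dense-urban", "suburban", "small-town"]

def expected_sales(price, channel, city, runs=200):
    base = {"dense-urban": 900, "suburban": 700, "small-town": 400}[city]
    channel_lift = {"digital-heavy": 1.10, "dealer-events": 1.00, "mixed": 1.05}[channel]
    price_elasticity = max(0.2, 1.5 - price / 30000)          # cheaper cars sell more
    sims = [base * channel_lift * price_elasticity * random.gauss(1.0, 0.1) for _ in range(runs)]
    return sum(sims) / runs                                    # Monte Carlo average

scenarios = sorted(
    ((expected_sales(p, m, c), p, m, c) for p, m, c in itertools.product(prices, marketing, cities)),
    reverse=True,
)
print("Top scenario:", scenarios[0])
```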
2. Recommendation based Intelligence
Recommendation based Intelligence, emerging today, enables organizations and people to do things they couldn’t otherwise do. Unlike insights enabled intelligence, it fundamentally alters the nature of the task, and business models change accordingly.
Netflix uses machine learning algorithms to do something media has never done before: suggest choices customers would probably not have found themselves, based not just on the customer’s patterns of behaviour, but on those of the audience at large. A Netflix user, unlike a cable TV pay-per-view customer, can easily switch from one premium video to another without penalty, after just a few minutes. This gives consumers more control over their time. They use it to choose videos more tailored to the way they feel at any given moment. Every time that happens, the system records that observation and adjusts its recommendation list — and it enables Netflix to tailor its next round of videos to user preferences more accurately. This leads to reduced costs and higher profits per movie, and a more enthusiastic audience, which then enables more investments in personalization (and AI).
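As a rough illustration of the underlying technique, the following Python sketch implements a tiny collaborative-filtering recommender: it scores unwatched titles for a viewer by weighting other viewers’ ratings by their similarity to that viewer. The ratings matrix is fabricated and this is not Netflix’s algorithm.

```python
# Minimal collaborative-filtering sketch on made-up ratings.
import numpy as np

titles = ["drama_a", "thriller_b", "docu_c", "comedy_d"]
# rows = viewers, cols = titles, 0 = unrated
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [0, 1, 5, 4],
    [1, 0, 4, 5],
], dtype=float)

def recommend(viewer: int, top_n: int = 2):
    norms = np.linalg.norm(ratings, axis=1, keepdims=True)
    sims = (ratings @ ratings[viewer]) / (norms.squeeze() * norms[viewer] + 1e-9)
    sims[viewer] = 0.0                              # ignore self-similarity
    scores = sims @ ratings                         # similarity-weighted ratings
    scores[ratings[viewer] > 0] = -np.inf           # hide titles already watched
    ranked = [titles[i] for i in np.argsort(scores)[::-1] if np.isfinite(scores[i])]
    return ranked[:top_n]

print(recommend(viewer=0))
```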
3. Decision enabled Intelligence
Being developed for the future, decision enabled intelligence creates and deploys machines that act on their own. Very few such systems — systems that make decisions without direct human involvement or oversight — are in widespread use today. Early examples include automated trading in the stock market (about 75 percent of Nasdaq trading is conducted autonomously) and facial recognition. In some circumstances, algorithms are better than people at identifying other people. Other early examples include robots that dispose of bombs, gather deep-sea data, maintain space stations, and perform other tasks inherently unsafe for people.
As you contemplate the deployment of artificial intelligence at scale, articulate what mix of the three approaches works best for you.
a) Are you primarily interested in upgrading your existing processes, reducing costs, and improving productivity? If so, then start with insights enabled intelligence and a clear AI strategy roadmap.
b) Do you seek to build your business around something new — responsive and self-driven products, or services and experiences that incorporate AI? Then pursue a decision enabled intelligence approach, probably with more complex AI applications and robust infrastructure.
c) Are you developing a genuinely new platform? In that case, think of building first principles of AI led strategy across the functionalities and processes of the platform.
CXOs need to create their own AI strategy playbook to reimagine their business strategies and operating models and derive accentuated business performance.
Session on AI Strategy at The Vedica Women’s Alliance (V-WA)
The Vedica Women’s Alliance (V-WA) is hosting a session with Sameer Dhanrajani, Chief Executive Officer, AIQRATE, and President, 3AI – AI & Analytics Association, on AI Strategy: The New Next in Transformation & Innovation.
Wednesday, 24th November 2021 | 5pm – 6pm IST
In this session, Sameer will demystify the adoption of AI and discuss how organisations can accelerate embracing AI.
Fifth TEDx speaking engagement at TEDxNMIMSBangalore
Sameer Dhanrajani, CEO, AIQRATE Advisory & Consulting, is part of the speaker line-up at TEDxNMIMS Bangalore. This is his fifth TEDx speaking engagement, at a marquee platform with a distinguished ensemble of speakers.
The TEDx theme “miniature yet monumental” is apt in topical times. I will be sharing my perspectives on AI taking center stage in the enterprise decision-making process: the unknown phenomenon, yet the biggest one!
AI & Decision Making session at Spectrum 6.0 Learning Unlimited event at Reliance Industries Limited
Sameer Dhanrajani, CEO, AIQRATE Advisory & Consulting, presented an AI framework & approach on the art of the possible & decision making at scale with AI at the Spectrum 6.0 Learning Unlimited event at Reliance Industries Ltd on 12th October, 2021. The session was attended by 300+ senior HR leaders & executives and covered reimagining the HR business chain with AI and the ensuing impact across the employee journey & experience.
8th NASSCOM Annual Technology Conference 2021
NASSCOM Annual Technology Conference 2021
GO DIGITAL – Thrive in the New Normal
22nd – 24th September, 2021
REGISTER NOW: https://nasscom.in/natc/
The global pandemic has accelerated technology disruption, in addition to shifting buyer behaviors and increasing competition. At NASSCOM Annual Technology Conference 2021, you will get a uniquely informed update on the technology trends that are impacting your business, along with a guide to thriving and managing high growth while ensuring seamless delivery, process optimization, efficiencies, and customer experience enhancement without compromising security, trust, and ethics.

The future of any business is AI. Being AI ready is changing the whole economics of the world, the buying behaviour of consumers, and the dynamics of organizations. In this ever evolving world, any organization that is not in sync with the velocity and agility of change stands at a huge risk of becoming irrelevant.
Leveraging AI will result in the emergence of a trifecta of better decision making: augmented intelligence, automate & learn, and incorporating human behaviour. This trifecta of better decision making enabled by AI will compel CXOs to usher in new strategic paradigms.
AI: the New Next in strategy, innovation & business transformation
The session, delivered by seasoned and proven AI evangelist and business builder Sameer Dhanrajani, CEO, AIQRATE Advisory & Consulting, will compel participants to think about developing AI strategies, along with frameworks and action plans for leveraging AI capabilities to inculcate transformation, innovation, and disruption dynamics within their organizations.
“RE-ENGINEERING” BUSINESSES – THINK “AI” led STRATEGY
While AI adoption across industries is galloping at a rapid pace and the resulting benefits are increasing by the day, some businesses are challenged by the complexity and confusion that AI can generate. Enterprises can get stuck trying to analyse all that’s possible and all that they could do through AI, when they should be taking the next step of recognizing what’s important and what they should be doing — for their customers, stakeholders, and employees. Discovering real business opportunities and achieving desired outcomes can be elusive. To overcome this, enterprises should constantly attempt to re-engineer their AI strategy to generate insights & intelligence that lead to real outcomes.
Re-engineering Data Architecture & Infrastructure
To successfully derive value from data immediately, there is a need for faster data analysis than is currently available using traditional data management technology. With the explosion of digital analytics, social media, and the “Internet of Things” (IoT), there is an opportunity to radically re-engineer data architecture to provide organizations with a tiered approach to data collection, with real-time and historical data analyses. Infrastructure-as-a-service for AI is the combination of components that enables an architecture that delivers the right business outcomes. Developing this architecture involves designing the cluster computing power, the networking, and the software innovations that enable advanced technology services and interconnectivity. Infrastructure is the foundation for optimal processing and storage of data, and it is also the foundation for any data farm.
The new era of AI led infrastructure is built on virtualized (analytics) environments, which can be referred to as the next big “V” of big data. The virtualization approach has several advantages, such as scalability, ease of maintenance, elasticity, cost savings due to better utilization of resources, and the abstraction of the external layer from the internal implementation (back-end) of a service or resource. Containers are the trending technology making headlines recently as an approach to virtualization and cloud-enabled data centres. Fortune 500 companies have begun to “containerize” their servers, data centres and cloud applications with Docker. Containerization sidesteps many of the problems of virtualization by eliminating the hypervisor and its VMs: each application is deployed in its own container, which runs on the “bare metal” of the server plus a single, shared instance of the operating system.
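A small, hypothetical sketch using the Docker SDK for Python (pip install docker) to show the “one application per container, shared operating system” idea. It assumes a local Docker daemon and a generic public image; it is not tied to any specific vendor setup.

```python
# Hypothetical sketch: run two applications in separate containers that share the host kernel.
import docker

client = docker.from_env()

for name, script in [("app-a", "print('service A up')"),
                     ("app-b", "print('service B up')")]:
    output = client.containers.run(
        "python:3.11-slim",                 # generic public image (assumption)
        ["python", "-c", script],           # one application per container
        name=name,
        remove=True,                        # clean up the container after it exits
    )
    print(name, output.decode().strip())
```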
AI led Business Process Re-Engineering
The BPR methodologies of the past have significantly contributed to the development of today’s enterprises. However, today’s business landscape has become increasingly complex and fast-paced. The regulatory environment is also constantly changing. Consumers have become more sophisticated and have easy access to information, on-the-go. Staying competitive in the present business environment requires organizations to go beyond process efficiencies, incremental improvements and enhancing transactional flow. Now, organizations need to have a comprehensive understanding of its business model through an objective and realistic grasp of its business processes. This entails having organization-wide insights that show the interdependence of various internal functions while taking into consideration regulatory requirements and shifting consumer tastes.
Data is the basis on which fact-based analysis is performed to obtain objective insights of the organization. In order to obtain organization-wide insights, management needs to employ AI capabilities on data that resides both inside and outside its organization. However, an organization’s AI capabilities are primarily dependent on the type, amount and quality of data it possesses.
The integration of an organization’s three key dimensions of people, process and technology is also critical during process design. The people are the individuals responsible and accountable for the organization’s processes. The process is the chain of activities required to keep the organization running. The technology is the suite of tools that support, monitor and ensure consistency in the application of the process. The integration of all these, through the support of a clear governance structure, is critical in sustaining a fact-based driven organizational culture and the effective capture, movement and analysis of data. Designing processes would then be most effective if it is based on data-driven insights and when AI capabilities are embedded into the re-engineered processes. Data-driven insights are essential in gaining a concrete understanding of the current business environment and utilizing these insights is critical in designing business processes that are flexible, agile and dynamic.
Re-engineering Customer Experience (CX) – The new paradigm
It’s always of great interest to me to see new trends emerge in our space. One such trend gaining momentum is enterprises looking at solving customer needs & expectations with what I’d describe as re-engineering customer experience. Just like everything else in our industry, changes in consumer behaviour caused by mobile and social trends are disrupting the CX space. Just a few years ago, web analytics solutions gave brands the best view into the performance of their digital business and user behaviours. Fast-forward to today, and this is often not the case. With the growth in volume and importance of new devices, digital channels and touch points, CX solutions are now just one of the many digital data silos that brands need to deal with and integrate into the full digital picture. While some vendors may now offer ways for their solutions to run in different channels and on a range of devices, these capabilities are often still a work in progress. Many enterprises today find their CX solution is another critical set of insights that must be downloaded daily into an omni-channel AI data store and then visualized to provide cross-channel business reporting.
Re-shaping Talent Acquisition and Engagement with AI
AI is causing disruption in virtually every function, but talent acquisition is one of the more recent to get a business refresh. A new data driven approach to talent management is reshaping the way organizations find and hire staff, while the power of talent analytics is also changing how HR tackles employee retention and engagement. The implications for anyone hoping to land a job, and for businesses that have traditionally relied on personal relationships, are extreme, but robots and algorithms will not yet completely replace human interaction. AI will certainly help to identify talent in specific searches. Rather than relying on a rigorous interview process and resume, employers are able to “mine” deep reserves of information, including a candidate’s online footprint. The real value will be in identifying personality types, abilities, and other strengths to help create well-rounded teams. Companies are also using people analytics to understand the stress levels of their employees to ensure long-term productiveness and wellness.
The Final Word
Based on my experiences with clients across enterprises, GCCs and start-ups, alignment among the three key dimensions of talent, process and AI led technology within a robust governance structure is critical to effectively utilize AI and remain competitive in the current business environment. AI is able to open doors to growth & scalability through insights & intelligence, resulting in the identification of industry white spaces. It enhances operational efficiency through process improvements based on relevant and fact-based data. It is able to enrich human capital through workforce analysis, resulting in more effective human capital management. It is able to mitigate risks by identifying areas of regulatory and company policy non-compliance before actual damage is done. An AI led re-engineering approach unleashes the potential of an organization by putting the facts and the reality into the hands of the decision makers.
(AIQRATE is a bespoke global AI advisory and consulting firm. A first of its genre, AIQRATE provides strategic AI advisory services and consulting offerings across multiple business segments to enable clients on their AI powered transformation & innovation journey and accentuate their decision making and business performance.
AIQRATE works closely with Boards, CXOs and senior leaders, advising them on navigating their Analytics to AI journey with the art of the possible, or making them jumpstart to an AI rhythm with an AI@scale approach, followed by consulting them on embedding AI as core to business strategy within business functions and augmenting the decision-making process with AI. We have proven bespoke AI advisory services to enable CXOs and senior leaders to curate & design building blocks of AI strategy, embed AI@scale interventions and create AI powered organizations.
AIQRATE’s path breaking 50+ AI consulting frameworks, assessments, primers, toolkits and playbooks enable Indian & global enterprises, GCCs, startups, SMBs, VC/PE firms, and academic institutions to enhance business performance and accelerate decision making.
AIQRATE also consults with consulting firms, technology service providers, pure play AI firms, technology behemoths & platform enterprises on curating differentiated & bespoke AI capabilities & offerings, market development scenarios & GTM approaches.
Visit www.aiqrate.ai to experience our AI advisory services & consulting offerings)
AI Experiential Masterclass – NASSCOM CoE IOT & AI
NASSCOM Centre of Excellence IOT & AI, in collaboration with Startup Karnataka, invited Sameer Dhanrajani, CEO at AIQRATE Advisory & Consulting, to deliver a one-of-a-kind, customized and experiential masterclass, “AI: The New Next in Strategy and Business Transformation“. The masterclass helped delegates learn about developing AI strategies that can be inculcated to address disruption dynamics within organizations.
The masterclass was held on 22nd July, 2021 | 3pm – 4:30pm.
Data Driven Enterprise – Part II: Building an operative data ecosystems strategy
Ecosystems—interconnected sets of services in a single integrated experience—have emerged across a range of industries, from financial services to retail to healthcare. Ecosystems are not limited to a single sector; indeed, many transcend multiple sectors. For traditional incumbents, ecosystems can provide a golden opportunity to increase their influence and fend off potential disruption by faster-moving digital attackers. For example, banks are at risk of losing half of their margins to fintechs, but they have the opportunity to increase margins by a similar amount by orchestrating an ecosystem.
In my experience, many ecosystems focus on the provision of data: exchange, availability, and analysis. Incumbents seeking to excel in these areas must develop the proper data strategy, business model, and architecture.
What is a data ecosystem?
Simply put, a data ecosystem is a platform that combines data from numerous providers and builds value through the usage of processed data. A successful ecosystem balances two priorities:
- Building economies of scale by attracting participants through lower barriers to entry. In addition, the ecosystem must generate clear customer benefits and dependencies beyond the core product to establish high exit barriers over the long term.
- Cultivating a collaboration network that motivates a large number of parties with similar interests (such as app developers) to join forces and pursue similar objectives. One of the key benefits of the ecosystem comes from the participation of multiple categories of players (such as app developers and app users).
What are the data-ecosystem archetypes?
As data ecosystems have evolved, five archetypes have emerged. They vary based on the model for data aggregation, the types of services offered, and the engagement methods of other participants in the ecosystem.
- Data utilities. By aggregating data sets, data utilities provide value-adding tools and services to other businesses. The category includes credit bureaus, consumer-insights firms, and insurance-claim platforms.
- Operations optimization and efficiency centers of excellence. This archetype vertically integrates data within the business and the wider value chain to achieve operational efficiencies. An example is an ecosystem that integrates data from entities across a supply chain to offer greater transparency and management capabilities.
- End-to-end cross-sectorial platforms. By integrating multiple partner activities and data, this archetype provides an end-to-end service to the customers or business through a single platform. Car reselling, testing platforms, and partnership networks with a shared loyalty program exemplify this archetype.
- Marketplace platforms. These platforms offer products and services as a conduit between suppliers and consumers or businesses. Amazon and Alibaba are leading examples.
- B2B infrastructure (platform as a business). This archetype builds a core infrastructure and tech platform on which other companies establish their ecosystem business. Examples of such businesses are data-management platforms and payment-infrastructure providers.
The ingredients for a successful data ecosystem: Data ecosystems have the potential to generate significant value. However, the entry barriers to establishing an ecosystem are typically high, so companies must understand the landscape and potential obstacles. Typically, the hardest pieces to figure out are finding the best business model to generate revenues for the orchestrator and ensuring participation.
If the market already has a large, established player, companies may find it difficult to stake out a position. To choose the right partners, executives need to pinpoint the value they can offer and then select collaborators who complement and support their strategic ambitions. Similarly, companies should look to create a unique value proposition and excellent customer experience to attract both end customers and other collaborators. Working with third parties often requires additional resources, such as negotiating teams supported by legal specialists to negotiate and structure the collaboration with potential partners. Ideally, partnerships should be mutually beneficial arrangements between the ecosystem leader and other participants.
As companies look to enable data pooling and the benefits it can generate, they must be aware of laws regarding competition. Companies that agree to share access to data, technology, and collection methods restrict access for other companies, which could raise anti-competition concerns. Executives must also ensure that they address privacy concerns, which can differ by geography.
Other capabilities and resources are needed to create and build an ecosystem. For example, to find and recruit specialists and tech talent, organizations must create career opportunities and a welcoming environment. Significant investments will also be needed to cover the costs of data-migration projects and ecosystem maintenance.
Ensuring ecosystem participants have access to data
Once a company selects its data-ecosystem archetype, executives should then focus on setting up the right infrastructure to support its operation. An ecosystem can’t deliver on its promise to participants without ensuring access to data, and that critical element relies on the design of the data architecture. We have identified five questions that incumbents must resolve when setting up their data ecosystem.
How do we exchange data among partners in the ecosystem?
Industry experience shows that standard data-exchange mechanisms among partners, such as cookie handshakes, can be effective. The data exchange typically follows three steps: establishing a secure connection, exchanging data through browsers and clients, and storing results centrally when necessary.
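A hedged Python sketch of that three-step exchange pattern using the requests library. The partner endpoint, token, and mutual-TLS certificate paths are hypothetical placeholders rather than a real integration.

```python
# Hypothetical sketch of a partner data exchange: secure connection, exchange, central storage.
import json
import requests

PARTNER_URL = "https://partner.example.com/api/v1/shared-dataset"   # hypothetical endpoint

# 1) Establish a secure connection: TLS verification plus (optionally) a client certificate.
session = requests.Session()
session.verify = True                                   # validate the partner's TLS certificate
session.cert = ("client.crt", "client.key")             # mutual TLS, if the partner requires it
session.headers["Authorization"] = "Bearer <access-token>"   # placeholder credential

# 2) Exchange data over the secured channel.
response = session.get(PARTNER_URL, params={"since": "2021-10-01"}, timeout=30)
response.raise_for_status()
records = response.json()

# 3) Store results centrally only when the use case requires it.
with open("partner_dataset.json", "w") as fh:
    json.dump(records, fh)
```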
How do we manage identity and access?
Companies can pursue two strategies to select and implement an identity-management system. The more common approach is to centralize identity management through solutions such as Okta, OpenID, or Ping. An emerging approach is to decentralize and federate identity management—for example, by using blockchain ledger mechanisms.
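As an illustration of the centralized approach, the sketch below validates an OpenID Connect access token issued by a central identity provider, using the PyJWT library. The issuer, audience, and scope names are hypothetical placeholders.

```python
# Hypothetical sketch: the ecosystem orchestrator accepts a request only if the token
# was signed by the central identity provider (e.g., an OIDC-compliant IdP).
import jwt  # PyJWT (pip install pyjwt)

def is_authorized(token: str, public_key_pem: str) -> bool:
    """Validate signature, issuer, and audience, then check an agreed-upon scope."""
    try:
        claims = jwt.decode(
            token,
            public_key_pem,
            algorithms=["RS256"],
            audience="data-ecosystem-api",          # hypothetical audience
            issuer="https://idp.example.com",       # hypothetical central IdP
        )
    except jwt.PyJWTError:
        return False
    return "ecosystem:read" in claims.get("scope", "")   # hypothetical ecosystem scope
```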
How can we define data domains and storage?
Traditionally, an ecosystem orchestrator would centralize data within each domain. More recent trends in data-asset management favor an open data-mesh architecture. Data mesh challenges the conventional centralization of data ownership within one party by using existing definitions and domain assets within each party, based on each use case or product. Certain use cases may still require centralized domain definitions with central storage. In addition, global data-governance standards must be defined to ensure interoperability of data assets.
How do we manage access to non-local data assets, and how can we possibly consolidate?
Most use cases can be implemented with periodic data loads through application programming interfaces (APIs). This approach results in a majority of use cases having decentralized data storage. Pursuing this environment requires two enablers: a central API catalog that defines all APIs available to ensure consistency of approach, and strong group governance for data sharing.
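A minimal Python sketch of those two enablers: a central API catalog (represented here as a plain dictionary) and a periodic load that pulls each registered data product through its API. The endpoints and schedule are hypothetical.

```python
# Hypothetical sketch: a central API catalog plus a periodic pull of each registered data product.
import time
import requests

API_CATALOG = {
    "consumer-transactions": "https://api.partner-one.example.com/v2/transactions",
    "store-footfall":        "https://api.partner-two.example.com/v1/footfall",
}

def load_all(since: str) -> dict:
    """Pull each registered data product; storage stays decentralized with the consumer."""
    results = {}
    for name, url in API_CATALOG.items():
        resp = requests.get(url, params={"since": since}, timeout=30)
        resp.raise_for_status()
        results[name] = resp.json()
    return results

if __name__ == "__main__":
    while True:                      # naive scheduler; real setups would use cron or an orchestrator
        snapshot = load_all(since="2021-11-01")
        print({name: len(records) for name, records in snapshot.items()})
        time.sleep(24 * 60 * 60)     # periodic daily load
```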
How do we scale the ecosystem, given its heterogeneous and loosely coupled nature?
Enabling rapid and decentralized access to data or data outputs is the key to scaling the ecosystem. This objective can be achieved by having robust governance to ensure that all participants of the ecosystem do the following:
- Make their data assets discoverable, addressable, versioned, and trustworthy in terms of accuracy
- Use self-describing semantics and open standards for data exchange
- Support secure exchanges while allowing access at a granular level
The success of a data-ecosystem strategy depends on data availability and digitization, API readiness to enable integration, data privacy and compliance—for example, General Data Protection Regulation (GDPR)—and user access in a distributed setup. This range of attributes requires companies to design their data architecture to check all these boxes.
As incumbents consider establishing data ecosystems, we recommend they develop a road map that specifically addresses the common challenges. They should then look to define their architecture to ensure that the benefits to participants and themselves come to fruition. The good news is that the data-architecture requirements for ecosystems are not complex. The priority components are identity and access management, a minimum set of tools to manage data and analytics, and central data storage. Truthfully, developing an operative data-ecosystem strategy is far more difficult than getting the technology requirements right.
Data Driven Enterprise – Part I: Building an effective Data Strategy for competitive edge
Few enterprises take full advantage of data generated outside their walls. A well-structured data strategy for using external data can provide a competitive edge. Many enterprises have made great strides in collecting and utilizing data from their own activities. So far, though, comparatively few have realized the full potential of linking internal data with data provided by third parties, vendors, or public data sources. Overlooking such external data is a missed opportunity. Organizations that stay abreast of the expanding external-data ecosystem and successfully integrate a broad spectrum of external data into their operations can outperform other companies by unlocking improvements in growth, productivity, and risk management.
The COVID-19 crisis provides an example of just how relevant external data can be. In a few short months, consumer purchasing habits, activities, and digital behavior changed dramatically, making preexisting consumer research, forecasts, and predictive models obsolete. Moreover, as organizations scrambled to understand these changing patterns, they discovered little of use in their internal data. Meanwhile, a wealth of external data could—and still can—help organizations plan and respond at a granular level.

Although external-data sources offer immense potential, they also present several practical challenges. To start, simply gaining a basic understanding of what’s available requires considerable effort, given that the external-data environment is fragmented and expanding quickly. Thousands of data products can be obtained through a multitude of channels—including data brokers, data aggregators, and analytics platforms—and the number grows every day. Analyzing the quality and economic value of data products also can be difficult. Moreover, efficient usage and operationalization of external data may require updates to the organization’s existing data environment, including changes to systems and infrastructure. Companies also need to remain cognizant of privacy concerns and consumer scrutiny when they use some types of external data.
These challenges are considerable but surmountable. This blog series discusses the benefits of tapping external-data sources, illustrated through a variety of examples, and lays out best practices for getting started. These include establishing an external-data strategy team and developing relationships with data brokers and marketplace partners. Company leaders, such as the executive sponsor of a data effort and a chief data and analytics officer, and their data-focused teams should also learn how to rigorously evaluate and test external data before using and operationalizing the data at scale.
External-data success stories: Companies across industries have begun successfully using external data from a variety of sources. The investment community is a pioneer in this space. To predict outcomes and generate investment returns, analysts and data scientists in investment firms have gathered “alternative data” from a variety of licensed and public data sources, many of which draw from the “digital exhaust” of a growing number of technology companies and the public web. Investment firms have established teams that assess hundreds of these data sources and providers and then test their effectiveness in investment decisions.
A broad range of data sources are used, and these inform investment decisions in a variety of ways:
- Investors actively gather job postings, company reviews posted by employees, employee-turnover data from professional networking and career websites, and patent filings to understand company strategy and predict financial performance and organizational growth.
- Analysts use aggregated transaction data from card processors and digital-receipt data to understand the volume of purchases by consumers, both online and offline, and to identify which products are increasing in share. This gives them a better understanding of whether traffic is declining or growing, as well as insights into cross-shopping behaviors.
- Investors study app downloads and digital activity to understand how consumer preferences are changing and how effective an organization’s digital strategy is relative to that of its peers. For instance, app downloads, activity, and rating data can provide a window into the success rates of the myriad of live-streaming exercise offerings that have become available over the last year.
Corporations have also started to explore how they can derive more value from external data. For example, a large insurer transformed its core processes, including underwriting, by expanding its use of external-data sources from a handful to more than 40 in the span of two years. The effort involved was considerable; it required prioritization from senior leadership, dedicated resources, and a systematic approach to testing and applying new data sources. The hard work paid off, increasing the predictive power of core models by more than 20 percent and dramatically reducing application complexity by allowing the insurer to eliminate many of the questions it typically included on customer applications.
Three steps to creating value with external data:
Use of external data has the potential to be game changing across a variety of business functions and sectors. The journey toward successfully using external data has three key steps.
1. Establish a dedicated team for external-data sourcing
To get started, organizations should establish a dedicated data-sourcing team. Per our understanding at AIQRATE, a key role on this team is a dedicated data scout or strategist who partners with the data-analytics team and business functions to identify operational, cost, and growth improvements that could be powered by external data. This person would also be responsible for building excitement around what can be made possible through the use of external data, planning the use cases to focus on, identifying and prioritizing data sources for investigation, and measuring the value generated through the use of external data. Ideal candidates for this role are individuals who have served as analytics translators and who have experience in deploying analytics use cases and in working with technology, business, and analytics profiles.
The other team members, who should be drawn from across functions, would include purchasing experts, data engineers, data scientists and analysts, technology experts, and data-review-board members. These team members typically spend only part of their time supporting the data-sourcing effort. For example, the data analysts and data scientists may already be supporting data cleaning and modeling for a specific use case and help the sourcing work stream by applying the external data to assess its value. The purchasing expert, already well versed in managing contracts, will build specialization on data-specific licensing approaches to support those efforts.
Throughout the process of finding and using external data, companies must keep in mind privacy concerns and consumer scrutiny, making data-review roles essential peripheral team members. Data reviewers, who typically include legal, risk, and business leaders, should thoroughly vet new consumer data sets—for example, financial transactions, employment data, and cell-phone data indicating when and where people have entered retail locations. The vetting process should ensure that all data were collected with appropriate permissions and will be used in a way that abides by relevant data-privacy laws and passes muster with consumers. This team will need a budget to procure small exploratory data sets, establish relationships with data marketplaces (such as by purchasing trial licenses), and pay for technology requirements (such as expanded data storage).
2. Develop relationships with data marketplaces and aggregators
While online searches may appear to be an easy way for data-sourcing teams to find individual data sets, that approach is not necessarily the most effective. It generally leads to a series of time-consuming vendor-by-vendor discussions and negotiations. The process of developing relationships with a vendor, procuring sample data, and negotiating trial agreements often takes months. A more effective strategy involves using data-marketplace and -aggregation platforms that specialize in building relationships with hundreds of data sources, often in specific data domains—for example, consumer, real-estate, government, or company data. These relationships can give organizations ready access to the broader data ecosystem through an intuitive search-oriented platform, allowing organizations to rapidly test dozens or even hundreds of data sets under the auspices of a single contract and negotiation. Since these external-data distributors have already profiled many data sources, they can be valuable thought partners and can often save an external-data team significant time. When needed, these data distributors can also help identify valuable data products and act as the broker to procure the data.
Once the team has identified a potential data set, the team’s data engineers should work directly with business stakeholders and data scientists to evaluate the data and determine the degree to which the data will improve business outcomes. To do so, data teams establish evaluation criteria, assessing data across a variety of factors to determine whether the data set has the necessary characteristics for delivering valuable insights. Data assessments should include an examination of quality indicators, such as fill rates, coverage, bias, and profiling metrics, within the context of the use case. For example, a transaction data provider may claim to have hundreds of millions of transactions that help illuminate consumer trends. However, if the data include only transactions made by millennial consumers, the data set will not be useful to a company seeking to understand broader, generation-agnostic consumer trends.
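A small pandas sketch, on fabricated data, of the kinds of quality indicators mentioned above: per-column fill rates, coverage against a reference population, and a crude bias check on one segment.

```python
# Hypothetical data-quality assessment of a small external data sample.
import pandas as pd

sample = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, 5],
    "age_band":    ["18-25", "18-25", None, "26-40", "18-25"],
    "spend":       [120.0, 85.5, None, 310.0, 42.0],
})

fill_rates = sample.notna().mean()                         # share of non-missing values per column
coverage = sample["customer_id"].nunique() / 10_000        # vs. a hypothetical 10k-customer base
young_segment_share = (sample["age_band"] == "18-25").mean()   # crude bias indicator

print("Fill rates:\n", fill_rates)
print(f"Coverage of customer base: {coverage:.2%}")
print(f"Share of records in the 18-25 band: {young_segment_share:.0%}")
```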
3. Prepare the data architecture for new external-data streams
Generating a positive return on investment from external data calls for up-front planning, a flexible data architecture, and ongoing quality-assurance testing. Up-front planning starts with an assessment of the existing data environment to determine how it can support ingestion, storage, integration, governance, and use of the data. The assessment covers issues such as how frequently the data come in, the amount of data, how data must be secured, and how external data will be integrated with internal data. This will provide insights about any necessary modifications to the data architecture.
Modifications should be designed to ensure that the data architecture is flexible enough to support the integration of a continuous “conveyor belt” of incoming data from a variety of data sources—for example, by enabling application-programming-interface (API) calls from external sources along with entity-resolution capabilities to intelligently link the external data to internal data. In other cases, it may require tooling to support large-scale data ingestion, querying, and analysis. Data architecture and underlying systems can be updated over time as needs mature and evolve. The final process in this step is ensuring an appropriate and consistent level of quality by constantly monitoring the data used. This involves examining data regularly against the established quality framework to identify whether the source data have changed and to understand the drivers of any changes (for example, schema updates, expansion of data products, change in underlying data sources). If the changes are significant, algorithmic models leveraging the data may need to be retrained or even rebuilt.
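As an illustration of the entity-resolution step mentioned above, the following Python sketch links external company records to an internal master list by fuzzy name matching. The records are fabricated and the similarity threshold is arbitrary; real pipelines use richer features such as addresses and identifiers.

```python
# Hypothetical entity-resolution sketch: link external records to internal ones by name similarity.
from difflib import SequenceMatcher
import pandas as pd

internal = pd.DataFrame({"internal_id": [101, 102], "name": ["Acme Industries Ltd", "Globex Corporation"]})
external = pd.DataFrame({"ext_id": ["x1", "x2"], "name": ["ACME Industries Limited", "Initech LLC"]})

def best_match(name: str, candidates: pd.DataFrame, threshold: float = 0.8):
    """Return the internal_id of the most similar internal record, or None below the threshold."""
    scored = candidates.assign(
        score=candidates["name"].apply(lambda c: SequenceMatcher(None, name.lower(), c.lower()).ratio())
    ).sort_values("score", ascending=False).iloc[0]
    return scored["internal_id"] if scored["score"] >= threshold else None

external["linked_internal_id"] = external["name"].apply(lambda n: best_match(n, internal))
print(external)
```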
Minimizing risk and creating value with external data will require a unique mix of creative problem solving, organizational capability building, and laser-focused execution. That said, business leaders who demonstrate the achievements possible with external data can capture the imagination of the broader leadership team and build excitement for scaling beyond early pilots and tests. An effective route is to begin with a small team that is focused on using external data to solve a well-defined problem and then use that success to generate momentum for expanding external-data efforts across the organization.
Redefine the new code for GCCs: Winning with AI – strategic perspectives
Global Capability Centers (GCCs) are reflections of the strategic components of their parent organizations’ business imperatives. GCCs are at an inflection point, as the pace at which AI is changing every aspect of business is exponential and high velocity. The rapid transformation and innovation of GCCs today is driven largely by their ability to position AI as a strategic imperative for their parent organizations. AI is seen as the Trojan horse to catapult GCCs to the next level of innovation & transformation. In recent times, the GCC story has entered a new era of value and transformative arbitrage.
Most GCCs are aiming to deploy a suite of AI led strategies to position themselves as the model template of an AI Center of Excellence. It is widely predicted that AI will disrupt and transform capability centers in the coming decades. How are Global Capability Centers in India looking at positioning themselves as the model template for developing an AI center of competence? How have the strategies of GCCs transformed with reference to the parent organization, whilst delivering tangible business outcomes, innovation & transformation for parent organizations?
Strategic imperatives for GCCs to consider in order to move up the value chain, develop an edge and start winning with AI:
AI transformation:
Artificial Intelligence has become a main focus area for GCCs in India. The increasing digital penetration in GCCs across business verticals has made it imperative to focus on AI. Hence, GCCs are upping their innovation agenda by building bespoke AI capabilities, solutions & offerings. Accelerated AI adoption has transcended industry verticals, with organizations exploring different use cases and application areas. GCCs in India are strategically leveraging one of the following approaches to drive AI penetration ahead:
- Federated Approach: Different teams within GCCs drive AI initiatives
- Centralized Approach: Focus is to build a central team with top talent and niche skills that would cater to the parent organization requirements
- Partner ecosystem: Paves a new channel for GCCs by partnering with research institutes, start-ups, accelerators
- Hybrid Approach: A mix of any two or more above mentioned approaches, and can be leveraged according to GCCs needs and constraints.
Ecosystem creation: Startups / Research institutes / Accelerators
One of the crucial ways that GCCs can boost their innovation agenda is by collaborating with start-ups, research institutes and accelerators. Hence, GCCs are employing a variety of strategies to build this ecosystem. These collaborations are a combination of build, buy, and partner models:
- Platform Evangelization: GCCs offer access to their AI platforms to start-ups
- License or Vendor Agreement: GCCs and start-ups enter into a license agreement to create solutions
- Co-innovate: Start-ups and GCCs collaborate to co-create new solutions & capabilities
- Acqui-hire: GCCs acquire start-ups for the talent & capability
- Research centers: GCCs collaborate with academic institutes for joint IP creation, open research, customized programs
- Joint Accelerator program: GCCs & accelerators build a joint program for a customized startup cohort
To drive these ecosystem creation models, GCCs can leverage different approaches. Further, successful collaboration programs have a high degree of customization, with clearly defined objectives and talent allocation to drive tangible and impact driven business outcomes.
Differentiated AI Center of Capability:
GCCs are increasingly shifting to competency and capability creation models to reduce time-to-market. In this model, AI Center of Competence teams are aligned to capability lines of business, where the AI center of competence is responsible for creating AI capabilities, roadmaps and new value offerings in collaboration with the parent organization’s business teams. This alignment gives specific roles clear visibility into business user requirements. Further, capability creation combined with parent organization alignment helps deliver tangible value outcomes. In several cases, AI teams are building a new range of innovations around AI based capabilities and solutions to showcase the ensuing GCC as a model template for innovation & transformation. GCCs need to conceptualize a bespoke strategy for building and sustaining an AI Center of Competence and keep it up the value chain with mature and measured transformation & innovation metrics.
AI Talent Mapping Strategy:
With the evolution from analytics and data sciences to AI, the lines between different skills are blurring. GCCs are witnessing a convergence of the skills required across verticals. The strategic shift of GCCs towards an AI center of capability model has led to the creation of AI, data engineering & design roles. To build skills in AI & data engineering, GCCs are adopting a hybrid approach. The skill development roadmap for AI is a combination of build and buy strategies. The decision to acquire talent from the ecosystem or internally build capabilities is a function of three parameters: the maturity of the GCC’s existing AI capabilities in the desired or adjacent areas, the tactical nature of the skill requirement, and the availability and accessibility of talent in the ecosystem. There is always a heavy inclination towards building skills in-house within GCCs, and a majority of GCCs have stressed that the bulk of future deployment in AI areas will be through in-house skill-building and reskilling initiatives. However, the talent mapping strategy for building AI capability requires a measured approach; otherwise it can become an Achilles heel for GCC and HR leaders.
GCCs in India are uniquely positioned to drive the next wave of growth by building high impact AI centers of competence. There is a slew of innovative & transformative models that they are working on to up the ante, trigger new customer experiences, products & services, and unleash business transformation for the parent organizations. This will not only set the existing GCCs on the path to cutting-edge innovation but also pave the way for other global organizations contemplating global center setup in India. AI is becoming the front runner to drive innovation & transformation for GCCs.