ACCELERATED DECISION MAKING AMPLIFIED BY REAL TIME ANALYTICS – A PERSPECTIVE
Companies are using more real-time analytics because of the pressure to increase the speed and accuracy of business processes — particularly for digital business and the Internet of Things (IoT). Although data and analytics leaders intuitively understand the value of fast analytical insights, many are unsure how to achieve them.
Every large company makes thousands of real-time decisions each minute. For example, when a potential customer calls the contact center or visits the company’s website to gather product information, the company has a few seconds to figure out the next-best-action offer to propose to maximize the chance of making a sale. Or, when a customer presents a credit card to buy something or submits a withdrawal transaction request to an automated teller machine, a bank has one second or less to determine if the customer is who they say they are and whether they are likely to pay the bill when it is due. Of course, not all real-time decisions are tied to customers. Companies also make real-time decisions about internal operations, such as dynamically rerouting delivery trucks when a traffic jam forms; calling in a mechanic to replace parts in a machine when it starts to fail; or adjusting their manufacturing schedules when incoming materials fail to arrive on time.
Many decisions will be made in real time, regardless of whether real-time analytics are available, because the world is event-driven and the company has to respond immediately as events unfold. Improved real-time responses that are informed by fact-based, real-time analytics are optional, but clearly desirable.
Real-time analytics can be confusing, because different people may be thinking of different concepts when they use the term “real time.” Moreover, it isn’t always simple to determine where real-time analytics are appropriate, because the “right time” for analytics in a given business situation depends on many considerations; real-time is not always appropriate, or even possible. Finally, data and analytics leaders and their staff typically know less about real-time analytics than about traditional business intelligence and analytics.
Find the Concept of “Real Time” Relevant to Your Business Problem
Real-time analytics is defined as “the discipline that applies logic and mathematics to data to provide insights for making better decisions quickly.” Real time means different things to different people.
When engineers say “real time” they mean that a system will always complete the task within a specified time frame.
Each component and subtask within the system is carefully designed to provide predictable performance, avoiding anything that could take longer to occur than is usually the case. Real-time systems prevent random delays, such as Java garbage collection, and may run on real-time operating systems that avoid nondeterministic behavior in internal functions such as scheduling and dispatching. There is an implied service-level agreement or guarantee. Strictly speaking, a real-time system could take hours or more to do its work, but in practice, most real-time systems act in seconds, milliseconds or even microseconds.
The concept of engineering real time is most relevant when dealing with machines and fully automated applications that require a precise sequence and timing of interactions among multiple components. Control systems for airplanes, power plants, self-driving cars and other machines often use real-time design. Time-critical software applications, such as high-frequency trading (HFT), also leverage engineering real-time concepts although they may not be entirely real time.
Use Different Technology and Design Patterns for Real-Time Computation Versus Real-Time Solutions
Some people use the term real-time analytics to describe fast computation on historical data from yesterday or last year. It’s obviously better to get the answer to a query, or run a model, in a few seconds or minutes (business real time) rather than waiting for a multihour batch run. Real-time computation on small datasets is executed in-memory by Excel and other conventional tools. Real-time computation on large datasets is enabled by a variety of design patterns and technologies, such as:
- Preloading the data into an in-memory database or in-memory analytics tool with large amounts of memory
- Processing in parallel on multiple cores and chips
- Using faster chips or graphics processing units (GPUs)
- Applying efficient algorithms (for example, minimizing context switches)
- Leveraging innovative data architectures (for example, hashing and other kinds of encoding)
Most of these can be hidden within modern analytics products so that the user does not have to be aware of exactly how they are being used.
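To make the pattern concrete, here is a minimal sketch, in Python, of two of the techniques above: preloading data into memory and processing partitions in parallel on multiple cores. The file name and column names are illustrative assumptions, not features of any particular product.

```python
# Hypothetical example: in-memory, multi-core aggregation of historical data.
from multiprocessing import Pool

import pandas as pd

def region_revenue(chunk: pd.DataFrame) -> pd.Series:
    """Aggregate one in-memory partition."""
    return chunk.groupby("region")["amount"].sum()

if __name__ == "__main__":
    df = pd.read_csv("transactions.csv")        # preload the dataset into memory once
    parts = [df.iloc[i::8] for i in range(8)]   # one stride partition per core
    with Pool(processes=8) as pool:
        partials = pool.map(region_revenue, parts)
    # combine per-partition results into final totals
    totals = pd.concat(partials).groupby(level=0).sum()
    print(totals)
```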
Real-time computation on historical data is not sufficient for end-to-end real-time solutions that enable immediate action on emerging situations. Analytics for real-time solutions requires two additional things:
- Data must be real time (current)
- Analytic logic must be predefined
If conditions are changing from minute to minute, a business needs to have situation awareness of what is happening right now. The decision must reflect the latest sensor readings, business transactions, web logs, external market data, social computing postings and other current information from people and things.
Real-time solutions use design patterns that enable them to access the input data quickly so that data movement does not become the weak link in the chain. There is no time to read large amounts of data one row or message at a time across a wide-area network. Analytics are run as close as possible to where the data is generated. For example, IoT applications run most real-time solutions on or near the edge, close to the devices. Also, HFT systems are co-located with the stock exchanges to minimize the distance that data has to travel. In some real-time solutions, including HFT systems, special high-speed networks are used to convey streaming data into the system.
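As a sketch of what “predefined logic on current data” can look like in practice, the fragment below scores each incoming event against a model trained ahead of time. The model file, event fields and threshold are illustrative assumptions.

```python
# Hypothetical example: real-time scoring with predefined analytic logic.
import json
import pickle

with open("fraud_model.pkl", "rb") as f:        # logic is predefined, trained offline
    model = pickle.load(f)

def handle_event(raw_message: str) -> None:
    event = json.loads(raw_message)             # the latest transaction, not history
    features = [[event["amount"], event["merchant_risk"], event["velocity"]]]
    fraud_probability = model.predict_proba(features)[0][1]
    if fraud_probability > 0.9:                 # decide within the one-second window
        print(f"BLOCK transaction {event['id']} (p={fraud_probability:.2f})")
```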
Match the Speed of Analytics to the Speed of the Business Decision
Decisions have a range of natural timing, so “right time” is not always real time. Business analysts and solution architects should work with managers and other decision makers to determine how fast to make each decision. The two primary considerations are:
- How quickly will the value of the decision degrade?
- Decisions should be executed in real time if a customer is waiting on the phone for an answer, if resources would otherwise sit idle and be wasted, if fraud would otherwise succeed, or if physical processes would fail when the decision takes more than a few milliseconds or minutes. On the other hand, a decision on corporate strategy may be nearly as valuable in a month as it would be today, because its implementation will take place over months and years, so starting a bit earlier may not matter much.
- How much better will a decision be if more time is spent?
- Simple, well-understood decisions on known topics, and for which data is readily available, can be made quickly without sacrificing much quality. Complex or novel decisions, by contrast, may improve substantially when more time is spent gathering data and deliberating.
Lastly, Automate Decisions if Algorithms Can Represent the Entire Decision Logic
Algorithms offer the “last mile” of the decision. However, automating algorithms requires a well-described process to code against. According to Gartner, “Decision automation is possible only when the algorithms associated with the applicable business policies can be fully defined.”
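A minimal sketch of what “fully defined” means in code: every branch of the business policy below is explicit, so the decision can run without a human in the loop. The policy, thresholds and field names are illustrative assumptions, not Gartner’s.

```python
# Hypothetical example: decision logic that is fully defined, hence automatable.
def withdrawal_decision(balance: float, amount: float, fraud_score: float) -> str:
    """Every case of the business policy is covered by an explicit rule."""
    if fraud_score > 0.9:
        return "DECLINE: suspected fraud"       # policy rule 1
    if amount > balance:
        return "DECLINE: insufficient funds"    # policy rule 2
    return "APPROVE"                            # default rule

print(withdrawal_decision(balance=500.0, amount=200.0, fraud_score=0.05))
```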
Final Word
Performing some analytics in real time is a goal in many analytics and business intelligence modernization programs. To operate in real time, data and analytics leaders must leverage predefined analytical models, rather than ad hoc models, and use current input data rather than just historical data.
THE BEST PRACTICES FOR INTERNET OF THINGS ANALYTICS
In most ways, Internet of Things analytics are like any other analytics. However, the need to distribute some IoT analytics to edge sites, and to use some technologies not commonly employed elsewhere, requires business intelligence and analytics leaders to adopt new best practices and software.
Analytics leaders are struggling to understand where to start with Internet of Things (IoT) analytics; some are not even sure which technologies are needed. Analytics vendors face prominent challenges of their own in building an IoT capability. IoT analytics use most of the same algorithms and tools as other kinds of advanced analytics, but a few techniques occur much more often in IoT analytics, and many analytics professionals have limited or no expertise in them.
The advent of IoT also leads to the collection of raw data on a massive scale. IoT analytics that run in the cloud or in corporate data centers are the most similar to other analytics practices. Where major differences appear is at the “edge” — in factories, connected vehicles, connected homes and other distributed sites. The staple inputs for IoT analytics are streams of sensor data from machines, medical devices, environmental sensors and other physical entities. Processing this data in an efficient and timely manner sometimes requires event stream processing platforms, time series database management systems and specialized analytical algorithms. It also requires attention to security, communication, data storage, application integration, governance and other considerations beyond analytics. Hence it is imperative to evolve toward edge analytics and distribute the data processing load accordingly.
Consequently, some IoT analytics applications have to be distributed to “edge” sites, which makes them harder to deploy, manage and maintain. Many analytics and data science practitioners lack expertise in the streaming analytics, time series data management and other technologies used in IoT analytics.
Some visions of the IoT describe a simplistic scenario in which devices and gateways at the edge send all sensor data to the cloud, where the analytic processing is executed, and there are further indirect connections to traditional back-end enterprise applications. However, this describes only some IoT scenarios. In many others, analytical applications in servers, gateways, smart routers and devices process the sensor data near where it is generated — in factories, power plants, oil platforms, airplanes, ships, homes and so on. In these cases, only subsets of conditioned sensor data, or intermediate results (such as complex events) calculated from sensor data, are uploaded to the cloud or corporate data centers for processing by centralized analytics and other applications.
The design and development of IoT analytics — the model building — should generally be done in the cloud or in corporate data centers. However, analytics leaders need to distribute runtime analytics that serve local needs to edge sites. For certain IoT analytical applications, they will need to acquire, and learn how to use, new software tools that provide features not previously required by their analytics programs. These scenarios consequently give us the following best practices to be kept in mind:
Develop Most Analytical Models in the Cloud or at a Centralized Corporate Site
When analytics are applied to operational decision making, as in most IoT applications, they are usually implemented in a two-stage process. In the first stage, data scientists study the business problem and evaluate historical data to build analytical models, prepare data discovery applications or specify report templates. The work is interactive and iterative.
A second stage occurs after models are deployed into operational parts of the business. New data from sensors, business applications or other sources is fed into the models on a recurring basis. If it is a reporting application, a new report is generated, perhaps every night or every week (or every hour, month or quarter). If it is a data discovery application, the new data is made available to decision makers, along with formatted displays and predefined key performance indicators and measures. If it is a predictive or prescriptive analytic application, new data is run through a scoring service or other model to generate information for decision making.
The first stage is almost always implemented centrally, because model building typically requires data from multiple locations for training and testing purposes. It is easier, and usually less expensive, to consolidate and store all this data centrally. It is also less expensive to provision advanced analytics and BI platforms in the cloud or at one or two central corporate sites than to license them for multiple distributed locations.
The second stage — calculating information for operational decision making — may run either at the edge or centrally in the cloud or a corporate data center. Analytics are run centrally if they support strategic, tactical or operational activities that will be carried out at corporate headquarters, at another edge location, or at a business partner’s or customer’s site.
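The two stages can be sketched as follows: stage one trains a model centrally on consolidated history, and stage two ships the frozen model to edge sites for scoring. The file names, features and model choice are illustrative assumptions.

```python
# Hypothetical example: central model building, edge scoring.
import pickle

import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Stage 1 (cloud or corporate data center): train on data from many sites.
history = pd.read_csv("all_sites_sensor_history.csv")
X = history[["temperature", "vibration", "pressure"]]
y = history["failed_within_24h"]
model = RandomForestClassifier(n_estimators=100).fit(X, y)

with open("failure_model.pkl", "wb") as f:      # artifact distributed to edge sites
    pickle.dump(model, f)

# Stage 2 (runs at the edge): score fresh local readings on a recurring basis.
def failure_risk(temperature: float, vibration: float, pressure: float) -> float:
    return model.predict_proba([[temperature, vibration, pressure]])[0][1]
```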
Distribute the Runtime Portion of Locally Focused IoT Analytics to Edge Sites
Some IoT analytics applications need to be distributed, so that processing can take place in devices, control systems, servers or smart routers at the sites where sensor data is generated. First, this keeps the edge location in operation even when the corporate cloud service is down. Second, wide-area communication is generally too slow for analytics that support time-sensitive industrial control systems.
Third, transmitting all sensor data to a corporate or cloud data center may be impractical or impossible if the volume of data is high or if reliable, high-bandwidth networks are unavailable. It is more practical to filter, condition and do analytic processing partly or entirely at the site where the data is generated.
Train Analytics Staff and Acquire Software Tools to Address Gaps in IoT-Related Analytics Capabilities
Most IoT analytical applications use the same advanced analytics platforms and data discovery tools as other kinds of business applications. The principles and algorithms are largely similar. Graphical dashboards, tabular reports, data discovery, regression, neural networks, optimization algorithms and many other techniques found in marketing, finance, customer relationship management and advanced analytics applications also cover most aspects of IoT analytics.
However, a few aspects of analytics occur much more often in the IoT than elsewhere, and many analytics professionals have limited or no expertise in these. For example, some IoT applications use event stream processing platforms to process sensor data in near real time. Event streams are time series data, so they are stored most efficiently in databases (typically column stores) that are designed especially for this purpose, in contrast to the relational databases that dominate traditional analytics. Some IoT analytics are also used to support decision automation scenarios in which an IoT application generates control signals that trigger actuators in physical devices — a concept outside the realm of traditional analytics.
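As an illustration of the stream-processing style described above, here is a minimal sliding-window sketch over sensor events. The window length, threshold and event format are illustrative assumptions rather than features of any particular platform.

```python
# Hypothetical example: a one-minute sliding window over a sensor stream.
from collections import deque
from statistics import mean

WINDOW_SECONDS = 60
window: deque = deque()                          # (timestamp, value) pairs

def on_sensor_event(timestamp: float, value: float) -> None:
    window.append((timestamp, value))
    while window and window[0][0] < timestamp - WINDOW_SECONDS:
        window.popleft()                         # expire readings outside the window
    rolling_avg = mean(v for _, v in window)
    if rolling_avg > 85.0:                       # illustrative alert threshold
        print(f"ALERT at {timestamp:.0f}: one-minute average {rolling_avg:.1f}")
```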
In many cases, companies will need to acquire new software tools to handle these requirements. Business analytics teams need to monitor and manage their edge analytics to ensure they are running properly and determine when analytic models should be tuned or replaced.
Increased Growth, if not Competitive Advantage
The huge volume and velocity of data in IoT will undoubtedly put new levels of strain on networks. The increasing number of real-time IoT apps will create performance and latency issues; it is important to reduce the end-to-end latency of machine-to-machine interactions to single-digit milliseconds. Following the best practices of implementing IoT analytics will support a judo strategy of increased output at reduced cost. That alone may not be sufficient to define a competitive strategy, but as more and more players adopt IoT as mainstream, the race will be to scale and grow as quickly as possible.
Ethics and Ethos in Analytics – Why is it Imperative for Enterprises to Keep Winning Customers’ Trust?
Data Sciences and analytics technology can bring huge benefits to both individuals and organizations: personalized service, detection of fraud and abuse, efficient use of resources and prevention of failure or accident. So why are questions being raised about the ethics of analytics, and of its superset, Data Sciences?
Ethical Business Processes in Analytics Industry
At its core, an organization is “just people,” and so are its customers and stakeholders. It is individuals who choose what the organization does or does not do, and individuals who will judge its appropriateness. As individuals, our perspectives are formed from our experience and the opinions of those we respect. Not surprisingly, different people will have different opinions on what is an appropriate use of Data Sciences and analytics technology, so who decides which is “right”? Customers and stakeholders may voice differing opinions to the organization about what is ethical.
This suggests that organizations should be thoughtful in their use of analytics, consulting widely and forming policies that record the decisions and conclusions they have come to. They should consider the wider implications of their activities, including:
Context – For what purpose was the data originally surrendered? For what purpose is the data now being used? How far removed from the original context is its new use? Is this appropriate?
Consent & Choice – What are the choices given to an affected party? Do they know they are making a choice? Do they really understand what they are agreeing to? Do they really have an opportunity to decline? What alternatives are offered?
Reasonable – Is the depth and breadth of the data used and the relationships derived reasonable for the application it is used for?
Substantiated – Are the sources of data used appropriate, authoritative, complete and timely for the application?
Owned – Who owns the resulting insight? What are their responsibilities towards it in terms of its protection and the obligation to act?
Fair – How equitable are the results of the application to all parties? Is everyone properly compensated?
Considered – What are the consequences of the data collection and analysis?
Access – What access to data is given to the data subject?
Accountable – How are mistakes and unintended consequences detected and repaired? Can the interested parties check the results that affect them?
Together, these facets are called the ethical awareness framework. This framework was developed by the UK and Ireland Technical Consultancy Group (TCG) to help people develop ethical policies for their use of analytics and Data Sciences. Examples of good and bad practices are emerging in the industry, and in time they will guide regulation and legislation. The choices we make as practitioners will ultimately determine the level of legislation imposed around the technology and our subsequent freedom to pioneer in this exciting emerging technical area.
Designing Digital Business for Transparency and Trust
With the explosion of digital technologies, companies are sweeping up vast quantities of data about consumers’ activities, both online and off. Feeding this trend are new smart, connected products—from fitness trackers to home systems—that gather and transmit detailed information.
Though some companies are open about their data practices, most prefer to keep consumers in the dark, choose control over sharing, and ask for forgiveness rather than permission. It’s also not unusual for companies to quietly collect personal data they have no immediate use for, reasoning that it might be valuable someday.
In a future in which customer data will be a growing source of competitive advantage, gaining consumers’ confidence will be key. Companies that are transparent about the information they gather, give customers control of their personal data, and offer fair value in return for it will be trusted and will earn ongoing and even expanded access. Those that conceal how they use personal data and fail to provide value for it stand to lose customers’ goodwill—and their business.
At the same time, consumers appreciate that data sharing can lead to products and services that make their lives easier and more entertaining, educate them, and save them money. Neither companies nor their customers want to turn back the clock on these technologies—and indeed the development and adoption of products that leverage personal data continue to soar. The consultancy Gartner estimates that nearly 5 billion connected “things” will be in use in 2015—up 30% from 2014—and that the number will quintuple by 2020.
Resolving this tension will require companies and policy makers to move the data privacy discussion beyond advertising use and the simplistic notion that aggressive data collection is bad. We believe the answer is more nuanced guidance — specifically, guidelines that align the interests of companies and their customers, and ensure that both parties benefit from personal data collection.
Understanding the “Privacy Sensitiveness” of Customer Data
Keeping track of the “privacy sensitiveness” of customer data is also crucial, as data and its privacy are not perfectly black and white. Some forms of data are more crucial for customers to protect and keep private. To see how much consumers value their data, a conjoint analysis was performed to determine what amount survey participants would be willing to pay to protect different types of information. Though the value assigned varied widely among individuals, we were able to determine, in effect, a median, by country, for each data type.
The responses revealed significant differences from country to country and from one type of data to another. Germans, for instance, place the most value on their personal data, and Chinese and Indians the least, with British and American respondents falling in the middle. Government identification, health, and credit card information tended to be the most highly valued across countries, and location and demographic information among the least.
This spectrum doesn’t represent a “maturity model,” in which attitudes in a country predictably shift in a given direction over time (say, from less privacy conscious to more). Rather, our findings reflect fundamental dissimilarities among cultures. The cultures of India and China, for example, are considered more hierarchical and collectivist, while Germany, the United States, and the United Kingdom are more individualistic, which may account for their citizens’ stronger feelings about personal information.
Adopting Data Aggregation Paradigm for Protecting Privacy
If companies want to protect their users and data, they need to collect only what is truly necessary. An abundance of data doesn’t necessarily mean an abundance of usable data. Keeping data collection concise and deliberate is key, and relevant data must be held in high regard in order to protect privacy.
It’s also important to keep data aggregated in order to protect privacy and instill transparency. Algorithms are currently being used for everything from machine thinking and autonomous cars, to data science and predictive analytics. The algorithms used for data collection allow companies to see very specific patterns and behavior in consumers all while keeping their identities safe.
One way companies can harness this power while heeding privacy worries is to aggregate their data: if the data shows 50 people following a particular shopping pattern, stop there and act on that data, rather than mining further and potentially exposing individual behavior.
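In code, the “50 people” rule above amounts to a minimum group-size threshold that must be met before any pattern is released or acted on. A minimal sketch, with a toy dataset and illustrative column names:

```python
# Hypothetical example: release a pattern only when the group is large enough.
import pandas as pd

K_THRESHOLD = 50   # minimum number of distinct customers per pattern

purchases = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, 5, 6],
    "pattern":     ["A", "A", "B", "A", "B", "A"],
})

counts = purchases.groupby("pattern")["customer_id"].nunique()
actionable = counts[counts >= K_THRESHOLD]       # small groups are suppressed
print(actionable)  # empty for this toy data: no pattern reaches 50 customers
```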
Things are getting very interesting: Google, Facebook, Amazon, and Microsoft hold the most private information and also carry the most responsibility. Because they understand data so well, companies like Google typically have the strongest parameters in place for analyzing and protecting the data they collect.
Finally, Analyze the Analysts
Analytics will play an increasingly significant role in today’s integrated, global industries, where the individual decisions of analytics professionals may impact decision making at the highest levels, in ways unimagined years ago. A wrong or misjudged model, analysis or statistic carries substantial risk and can jeopardize the proper functioning of an organization.
Instruction, rules and supervision are essential, but they alone cannot prevent lapses. Given all this, it is imperative that ethics be deeply ingrained in the analytics curriculum today. I believe that some of the tenets of this code of ethics and standards in analytics and data science should be:
- Apply these ethical benchmarks regardless of job title, cultural differences, or local laws
- Place the integrity of the analytics profession above one’s own interests
- Maintain a governance and standards mechanism that data scientists adhere to
- Maintain and develop professional competence
- Have top managers create a strong culture of analytics ethics at their firms, one that filters throughout the entire analytics organization
Can AI and Eternal Humanity Both Co-exist?
At the turn of the century, it’s likely few, if any, could anticipate the many ways artificial intelligence would later affect our lives.
Take Emotional Robot with Intelligent Network, or ERWIN, for example. He’s designed to mimic human emotions like sadness and happiness in order to help researchers understand how empathy affects human-robot connections. When ERWIN works with Keepon—a robot who looks eerily similar to a real person—scientists gather data on how emotional responses and body language can foster meaningful relationships in an inevitably droid-filled society. Increasingly, robots are integrating into our lives as laborers, therapeutic and medical tools, assistants and more.
While some predict mass unemployment or all-out war between humans and artificial intelligence, others foresee a less bleak future.
The Machine-Man Coexistence
Professor Manuela Veloso, head of the machine learning department at Carnegie Mellon University, is already testing out the idea on the CMU campus, building roving, segway-shaped robots called “cobots” to autonomously escort guests from building to building and ask for human help when they fall short. It’s a new way to think about artificial intelligence, and one that could have profound consequences in the next five years.
There will be a co-existence between humans and artificial intelligence systems that will hopefully be of service to humanity. These AI systems will involve software systems that handle the digital world, systems that move around in physical space, like drones, robots and autonomous cars, and systems that process the physical space, like the Internet of Things.
You will have more intelligent systems in the physical world, too — not just on your cell phone or computer, but physically present around us, processing and sensing information about the physical world and helping us with decisions that include knowing a lot about features of the physical world. As time goes by, we’ll also see these AI systems having an impact on broader problems in society: managing traffic in a big city, for instance; making complex predictions about the climate; supporting humans in the big decisions they have to make.
Digital – The Ultimate Catalyst to Accelerate AI
A lot of [AI] research in the early days was actually acquiring [that] knowledge. We would have to ask humans. We would have to go to books and manually enter that information into the computer.
In the last few years, more and more of this information has become digital. It seems that the world reveals itself on the internet. So AI systems are now about the data that’s available and the ability to process that data and make sense of it, and we’re still figuring out the best ways to do that. On the other hand, we are very optimistic, because we know that the data is there.
The question now becomes, how do we learn from it? How do you use it? How do you represent it? How do you study the distributions — the statistics of the data? How do you put all these pieces together? That’s how you get deep learning and deep reinforcement learning and systems that do automatic translation and robots that play soccer. All these things are possible because we can process all this data so much more effectively and we don’t have to take the enormous step of acquiring that knowledge and representing it. It’s there.
Rules of Coexistence
As of late, discussions have run rampant about the impact of intelligent systems on the nature of work, jobs and the economy. Whether it is self-driving cars, automated warehouses, intelligent advisory systems, or interactive systems supported by deep learning, these technologies are rumored to first take our jobs and eventually run the world.
There are many points of view on this issue, all aimed at defining our role in a world of highly intelligent machines, though some aggressively deny the shape of the world to come. Below are the critical arguments for how we’ll coexist with machines in the future:
Machines Take Our Jobs, New Jobs Are Created
Some arguments are driven by the historical observation that every new piece of technology has both destroyed and created jobs. The cotton gin automated the cleaning of cotton, so people no longer had to do that work by hand; the machine enabled the massive growth of cotton production, which shifted the work to cotton picking. For nearly every piece of technology, from the steam engine to the word processor, the argument is that as some jobs were destroyed, others were created.
Machines Only Take Some Of Our Jobs
A variant of the first argument is that even if new jobs are not created, people will shift their focus to those aspects of work that intelligent systems are not equipped to handle. This includes areas requiring the creativity, insight and personal communication that are hallmarks of human abilities, and ones that machines simply do not possess. The driving logic is that there are certain human skills that a machine will never be able to master.
A similar, but more nuanced argument portrays a vision of man-machine partnerships in which the analytical power of a machine augments the more intuitive and emotional skills of the human. Or, depending on how much you value one over the other, human intuition will augment a machine’s cold calculations.
Machines Take Our Jobs, We Design New Machines
Finally, there is the view that as intelligent machines do more and more of the work, we will need more and more people to develop the next generation of those machines. Supported by historical parallels (e.g., cars created the need for mechanics and automobile designers), the argument is that we will always need someone working on the next generation of technology. This is a particularly presumptuous position, as it is essentially technologists arguing that while machines will do many things, they will never be able to do what technologists do.
But Could Coexistence Exist Eternally?
The arguments above are all reasonable, and each one has its merits. But they are all based on the same assumption: machines will never be able to do everything that people can do, because there will always be gaps in a machine’s ability to reason, be creative or intuitive. Machines will never have empathy or emotion, nor the ability to make decisions or be consciously aware of themselves in a way that could drive introspection.
These assumptions have existed since the earliest days of A.I. They tend to go unquestioned simply because we prefer to live in a world in which machines cannot be our equals, and we maintain control over those aspects of cognition that, to this point at least, make us unique.
But the reality is that, from consciousness to intuition to emotion, there is no reason to believe that any of these gaps will hold. The only alternative to the belief that human thought can be modeled on a machine is to believe that our minds are the product of “magic.” Either we are part of the world of causation or we are not. If we are, A.I. is possible.
Analytics Business Leaders are a Scant Community
A considerable amount of current conversation in the area of data science and analytics focuses on the virtues of solving all the challenges that organizations face when using this new paradigm in the business world. There is also a lot of discussion around the technology-related issues that impact achieving data science and analytics goals.
What hasn’t gotten the attention that it merits, however, is the role of business leadership and how thought leaders need to raise the stakes to become not only well versed in analytics, but to build data science and analytics literacy throughout their organizations. They need a heightened awareness of analytics if they are going to effectively drive analytics strategies and outcomes for their organizations and become true leaders in this area by all relevant measures.
The following three significant findings align with this point of view:
- Create a culture for making fact-based decisions.
- Establish a common data science and analytics vision—and strategy—to focus everyone on the outcomes.
- Instill analytics expertise across the entire organization, from the top down.
Senior executives and business managers should aspire to create the core competencies, and to develop the analytical insights, that enable them to become data science and analytics leaders within their industries. Education, mentorship, and consultation with outside advisors should be employed to gain the knowledge necessary to attain a leadership role. When selecting a consultancy, business leaders should choose one that can advise, mentor, and support based on specific needs and levels of maturity, as opposed to those that may take a force-fit approach, essentially attempting to force a round-peg methodology into a square-hole environment. A good grasp of numbers, in the sense of numerical literacy, is also important in this age of fact-based decision making built on insights derived from large volumes of data.
Numerical literacy means acceptable levels of working knowledge and experience in decision science, in which analytical techniques such as statistical and descriptive analysis, forecasting, and performance management can be applied. In addition, moving from a gut-based decision model to a fact-based one requires both cultural change and the tools and know-how needed to create and manage the facts themselves. Finance teams can typically be the source of such competencies, and they can serve as a center for fostering and developing these competencies across the enterprise. It ultimately falls to top-level management to decide how to leverage their business acumen to instill a cultural change that fosters data thinking.
Bringing In Data Mentors At The Very Top
Today’s executives and managers are trained primarily in operations, finance, marketing, and sales, with a bit of strategy thrown in for good measure. While a significant number of senior executives in the US have advanced degrees in their field of expertise, few have been formally trained in information management, analytics, decision science, information theory, or risk management. This lack of training creates a dilemma for organizations that want to focus on data science and analytics but do not have experienced leaders who can lead from a position of domain expertise.
That does not mean stacking up data workers at the TLM (top-level management). Leadership drive, domain insight and people skills will remain the most coveted virtues at the very top; but the lack of ground-level data-sifting skills need not be plugged only at the lower levels. Achieving these competencies without formal education or hands-on experience requires consultation with outside data mentors and advisors who can work hand in hand with the entire senior executive team. These advisors help ground the team in both the science and the pragmatics required to achieve successful data science and analytics outcomes that can be applied pervasively across the organization.
This approach — some call it the “charm school” approach — is characterized by close collaboration among all parties involved. It can rapidly and nondisruptively accelerate the development of the senior executive team’s data science and analytics expertise and competency, maximizing strategic outcomes.
Data science and analytics success should be driven by the business, and more importantly from the ranks of its senior executives and managers—not from the bottom up or from the IT function. The inherent accountability for all strategic initiatives is at the very top of the organization and cascades down and across to business managers at various levels who then have responsibilities for its execution within their area of control. Organizations today remain hierarchical in both structure and cultural behavior. To change either of these structures requires engaged and competent senior executive teams that are committed to the outcome and can influence and align behaviors to support it.
Strategic Thinking in Data Monetization
A number of organizations have come to this realization already. They’re now engaging with management consultancies and analytics boutiques to address their shortcomings, accelerate results from their data science and analytics strategies, and successfully monetize them. Alongside these mentoring activities, organizational leadership is strongly advised to consider organizational structure and change readiness as complementary endeavors. These can help illuminate the revisions to structure and the Organizational Change Management (OCM) activities required to bring the entire organization to a level where it can begin assessing the monetization potential of data thinking. These accompanying measures can also bring cohesion to the entire data science and analytics strategy and the pursuit of its outcomes.
Firms that are looking to monetize data and their data science strategies must look beyond the data and into the economic questions that the data can answer. Often the data can help answer questions about the value, use, or risk of a specific asset, or its future value or risk. Or the data can say something about an overall market: how asset classes perform and how customers behave generally. Such insights are understood to have great economic value to asset owners and market participants. However, not all data will offer these features or this value. The temperature readings from inside our refrigerators are unlikely to alter markets. The temperature readings of our furnaces and air conditioners, however, could, in aggregate, drive new energy conservation and policy decisions.
Transforming data into economic insights will be the focus of the top-level executives who monetize data. This transformation will require the creation of data products. Such data products may be sold or traded to clients, or giving them away may drive other, related monetization strategies. Hence, creating a data product will require not only technical expertise in manipulating data, but also a clear vision of how data can be leveraged to answer economic questions and of how the future markets of the various domains will pan out.
2017 Digital Trends
Digital transformation reshapes every aspect of a business. As digital technology continues to evolve, I believe that successful digital transformation will require careful collaboration, thoughtful planning, and the inclusion of every department.
During recent years, we’ve seen shifts in how traditional leadership roles operate, as silos break down and the scopes of various roles widen and change. Digital transformation has morphed from a trend to a central component of modern business strategy. The following major trends capture the gist of what is to come in 2017.
DIGITAL PLATFORM VIEW OF BUSINESS
A platform provides the business with a foundation where resources can come together — sometimes quickly and temporarily, sometimes in a relatively fixed way — to create value. The value comes largely from connecting the resources, and the network effects between them. As digitalization moves from an innovative trend to a core competency, enterprises will understand and exploit platform effects throughout all aspects of their businesses.
- The deepening of digital means that lines are becoming increasingly blurred, and boundaries semi-porous — both inside and outside the enterprise — as multiple networks of stakeholders bring value to each other by exploiting and exploring platform dynamics
- CIOs are clearly being given the opportunity to lead a digital transformation that exploits platform effects, particularly in managing delivery and talent and in executing leadership
Detailed Analysis can be found here:
https://sameerdhanrajani.wordpress.com/2016/12/18/sameer-dhanrajani-key-win-themes-to-master-in-digital-business/
THE ADVENT OF IMMERSIVE CONTENT: AUGMENTED REALITY AND VIRTUAL REALITY
The booming success of the Pokémon GO AR app is a wakeup call to any business that hasn’t evaluated the potential of AR and VR. These technologies were once limited to the gaming realm, but they’re now easier to implement than ever before. The mainstream shift toward AR and VR provides new ways to connect with customers and offer unique, memorable interactions.
- The AR and VR resurgence will open the gates for workplace gamification to move, in a big way, into core business strategy
- 2017 is also going to mark a turning point in the way audiences interact with and consume video content through the releases of the HTC Vive, Oculus Rift, PSVR etc.
- Significant improvements in immersive devices as well as software are anticipated
Detailed Analysis can be found here:
https://sameerdhanrajani.wordpress.com/2016/05/27/sameer-dhanrajani-retail-industry-redefined-through-data-sciences/
SMART MACHINES AND ARTIFICIAL INTELLIGENCE ARE TAKING OFF IN A BIG WAY
Our relationship with technology continues to evolve. Soon machines will be able to learn and adapt to their environments. While advanced learning machines may replace low-skill jobs, AIs will be able to work collaboratively with human professionals to solve intensely complex problems.
- Data complexity is the top challenge standing in the way of digital transformation
- AI tools will evolve to read, review and analyze vast quantities of disparate data, providing insight into how customers feel about a company’s products or services and why they feel the way they do
- Using AI to expedite knowledge-based activities to improve efficiency and performance will spread from reducing costs through automation to transforming customer experience
Detailed Analysis can be found here:
https://sameerdhanrajani.wordpress.com/2016/11/18/sameer-dhanrajani-banking-evolution-using-ai/
GROWING IMPORTANCE OF THE USER EXPERIENCE
The customer experience (including employees) is the ultimate goal of any digital transformation. Customers are more cautious than ever; they’ll turn away from brands that don’t align with their values and needs. A top-notch user experience is a fantastic way to keep customers involved and engaged with your brand.
- Every touch point matters, and those leading the transformation will strive to constantly ask how they are removing friction and enhancing the experience for every customer regardless of where they are in the journey
- Understanding digital consumers’ biases, behaviors and expectations at each point along the customer journey will be at the heart of every successful digital transformation
Detailed Analysis can be found here:
https://sameerdhanrajani.wordpress.com/2015/07/24/sameer-dhanrajani-how-to-bring-about-a-customer-experience-focused-digital-transformation/
https://sameerdhanrajani.wordpress.com/2016/12/18/sameer-dhanrajani-key-win-themes-to-master-in-digital-business/
BLOCKCHAIN’S DISRUPTIVE GROWTH
What Uber did for on-demand transportation, Blockchain will do for financial transactions. And with $1.4 billion in venture-capital money in the past three years, 24 countries investing in Blockchain technology for government services, 90-plus central banks engaged in related discussions, and 10 percent of global GDP expected to be traded via Blockchain technology by 2025-2027, it is important that marketers understand the potential implications for their business.
- Blockchain technology will majorly be a part of the next great flattening and removal of middle-layer institutions
- The semi-public nature of some types of Blockchain paves the way for an enhanced level of security and privacy for sensitive data – a new kind of database where the information ‘header’ is public but the data inside is ‘private’ (see the sketch after this list)
- Data analytics using Blockchain, distributed ledger transactions and smart contracts will become critical in future, creating new challenges and opportunities in the world of data science
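To make the “public header, private data” idea concrete, here is a minimal sketch of a block whose header (hashes and chain link) is public while the payload stays opaque. The toy XOR cipher stands in for real encryption; everything here is an illustrative assumption, not any production Blockchain design.

```python
# Hypothetical example: public block header, private (encrypted) payload.
import hashlib
import json

def make_block(prev_hash: str, payload: dict, key: bytes) -> dict:
    plaintext = json.dumps(payload).encode()
    ciphertext = bytes(b ^ key[i % len(key)]      # toy XOR cipher, illustration only
                       for i, b in enumerate(plaintext))
    header = {
        "prev_hash": prev_hash,                   # public: links the chain
        "payload_hash": hashlib.sha256(ciphertext).hexdigest(),
    }
    header["block_hash"] = hashlib.sha256(
        json.dumps(header, sort_keys=True).encode()).hexdigest()
    return {"header": header, "payload": ciphertext.hex()}  # payload stays opaque

block = make_block("0" * 64, {"from": "A", "to": "B", "amount": 120}, b"secret")
print(block["header"])   # anyone can verify the chain without reading the data
```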
Detailed Analysis can be found here:
https://sameerdhanrajani.wordpress.com/2016/06/21/sameer-dhanrajani-data-sciences-fintech-companies-for-competitive-disruption-advantage/
DIGITAL TRANSFORMATION DRIVEN BY THE INTERNET OF THINGS (IOT)
Speaking of how invaluable big data is to marketers, the IoT offers immeasurable insight into customers’ minds. Businesses and customers alike will continue to benefit from the IoT. With an estimated 50 billion IoT sensors by 2020 and more than 200 billion “things” on the internet by 2030, there is no question that IoT will be not only transformative, but disruptive, to business models.
- IoT will change how daily life operates by helping create more efficient cities and leaner enterprises
- The staple tech for autonomous systems would be the Internet of Things (IoT), which would be both the infrastructure and the customer, since things work, interact, negotiate and decide with zero human intervention
- Real-time streaming analytics will enable collection, integration, analysis, and visualization of IoT data in real time, without disrupting the working of existing sources, storage, and enterprise systems
Detailed Analysis can be found here:
https://sameerdhanrajani.wordpress.com/2015/09/25/sameer-dhanrajani-real-time-streaming-analytics/
API ECONOMY
We live in an API economy, a set of business models and channels based on secure access to functionality and exchange of data. APIs will continue to make it easier to integrate and connect people, places, systems, data, things and algorithms, create new user experiences, share data and information, authenticate people and things, enable transactions and algorithms, leverage third-party algorithms, and create new products/services and business models.
- One industry vision is to use APIs to turn a business into a platform built around digital business models
- As the Internet of Things (IoT) gets smarter, things using an application programming interface (API) to communicate, transact and even negotiate with one another will become the norm
Detailed Analysis can be found here:
https://sameerdhanrajani.wordpress.com/2016/02/12/mr-algorithms-the-new-member-in-the-board-room-to-discuss-algorithm-economy/
http://www.gartner.com/smarterwithgartner/welcome-to-the-api-economy/
The Rush for Artificial Intelligence in Silicon Valley…Is This Here to Stay?
For more than a decade, Silicon Valley’s technology investors and entrepreneurs obsessed over social media and mobile apps that helped people do things like find new friends, fetch a ride home or crowdsource a review of a product or a movie.
Robots after the “Like” Button
Now Silicon Valley has found its next shiny new thing. And it does not have a “Like” button.
The new era in Silicon Valley centers on artificial intelligence and robots, a transformation that many believe will have a payoff on the scale of the personal computing industry or the commercial internet, two previous generations that spread computing globally. Computers have begun to speak, listen and see, as well as sprout legs, wings and wheels to move unfettered in the world.
The shift was evident in a Lowe’s home improvement store here this month, when a prototype inventory checker developed by Bossa Nova Robotics silently glided through the aisles using computer vision to automatically perform a task that humans have done manually for centuries.
The robot, which was skilled enough to autonomously move out of the way of shoppers and avoid unexpected obstacles in the aisles, alerted people to its presence with soft birdsong chirps. Gliding down the middle of an aisle at a leisurely pace, it can recognize bar codes on shelves, and it uses a laser to detect which items are out of stock.
Silicon Valley’s financiers and entrepreneurs are digging into artificial intelligence with remarkable exuberance. The region now has at least 19 companies designing self-driving cars and trucks, up from a handful five years ago. There are also more than a half-dozen types of mobile robots, including robotic bellhops and aerial drones, being commercialized.
The Surge after the Static – The Social Way
There had been a slow trickle of investment in robotics for years; suddenly, a dozen companies seem to be securing large investment rounds focused on specific robotic niches. Funding in A.I. start-ups has increased more than fourfold to $681 million in 2015, from $145 million in 2011, according to the market research firm CB Insights. The firm estimates that new investments will reach $1.2 billion this year, up 76 percent from last year.
By contrast, funding for social media start-ups peaked in 2011 before plunging. That year, venture capital firms made 66 social media deals and pumped in $2.4 billion. So far this year, there have been just 10 social media investments, totaling $6.9 million, according to CB Insights. Last month, the professional social networking site LinkedIn was sold to Microsoft for $26.2 billion, underscoring that social media has become a mature market sector.
Even Silicon Valley’s biggest social media companies are now getting into artificial intelligence, as are other tech behemoths. Facebook is using A.I. to improve its products. Google will soon compete with Amazon’s Echo and Apple’s Siri, which are based on A.I., with a device that listens in the home, answers questions and places e-commerce orders. Satya Nadella, Microsoft’s chief executive, recently appeared at the Aspen Ideas Conference and called for a partnership between humans and artificial intelligence systems in which machines are designed to augment humans.
The auto industry has also set up camp in the valley to learn how to make cars that can do the driving for you. Both technology and car companies are making claims that increasingly powerful sensors and A.I. software will enable cars to drive themselves with the push of a button as soon as the end of this decade — despite recent Tesla crashes that have raised the question of how quickly human drivers will be completely replaced by the technology.
AI Outdoes the Silicon Valley Reset Trend
Silicon Valley’s new A.I. era underscores the region’s ability to opportunistically reinvent itself and quickly follow the latest tech trend. This is at the heart of the region’s culture that goes all the way back to the Gold Rush. The valley is built on the idea that there is always a way to start over and find a new beginning.
The change has spurred a rush for A.I. talent that has become intense; recruiters now try to lure students out of A.I. classes halfway through, once they know even a little bit of this material.
The valley’s tendency toward reinvention dates back to the region’s initial emergence from the ashes of a deep aerospace industry recession as a consumer-electronics manufacturing center producing memory chips, video games and digital watches in the mid-1970s. A malaise in the personal computing market in the early 1990s was followed by the World Wide Web and the global expansion of the consumer internet.
A decade later, in 2007, just as innovation in mobile phones seemed to be on the verge of moving away from Silicon Valley to Europe and Asia, Apple introduced the first iPhone, resetting the mobile communications marketplace and ensuring that the valley would — for at least another generation — remain the world’s innovation center.
In the most recent shift, the A.I. idea emerged first in Canada in the work of cognitive scientists and computer scientists like Geoffrey Hinton, Yoshua Bengio and Yann LeCun during the previous decade. The three helped pioneer a new approach to deep learning, a machine learning method that is highly effective for pattern recognition challenges like vision and speech. Modeled on a general understanding of how the human brain works, it has helped technologists make rapid progress in a wide range of A.I. fields.
The Road Ahead
How far the A.I. boom will go is hotly debated. For some technologists, today’s technical advances are laying the groundwork for truly brilliant machines that will soon have human-level intelligence. Yet Silicon Valley has faced false starts with A.I. before. During the 1980s, an earlier generation of entrepreneurs also believed that artificial intelligence was the wave of the future, leading to a flurry of start-ups. Their products offered little business value at the time, so the commercial enthusiasm ended in disappointment, leading to a period now referred to as the “A.I. Winter.” The current resurgence will not fall short this time, and the economic potential in terms of new efficiency and new applications is strong.
Chatbots – The Protégé of AI & Data Sciences
There has been a great deal of talk about the use of Artificial Intelligence chatbots in the last few weeks, especially given the news that Facebook is looking to implement AI into its Messenger and WhatsApp platforms, which are currently used by more than 1.8 billion people worldwide. However, does this bode well for the relationship between humans and Artificial Intelligence programs? Would you rather speak to an intelligent algorithm than to a fellow human being?
The Sales and Customer Support Bot-ler?
Chatbots, done right, are a cutting-edge form of interactive communication that captivates and engages users. But what kind of potential do they have for sales and customer support?
To answer this, I should emphasize that customer service can be a delicate field. A lot of consumer engagement with a company happens when something goes wrong — such as a recently-purchased broken product or an incorrect bill or invoice.
By nature, these situations can be highly emotional. And as a business, you want to be responsive to potentially problematic customer inquiries like these. So relying on a chatbot to resolve issues that require a human touch might not be the best idea.
This is especially true if you let your bot “learn” from interactions it sees (say, in user forums) with no or minimal supervision. Things can easily go wrong, as the disaster around Microsoft’s Twitter bot “Tay” showed.
On the other hand, with the right supervision and enough training data, machine learning as an A.I. technique can help build very responsive and accurate informational chatbots — for example those that are meant to help surface data from large text collections, such as manuals.
I’d say that machine learning as a technique has been shown to work best on image processing. The advancements that Google, Facebook, and innovative startups such as Moodstocks (just acquired by Google) are showing in that space are truly amazing. Part of the amazement however, comes from the fact that we now see software take on another cognitive task that we thought could only be managed by humans.
What can bots do for the bottom line?
In my opinion, a bot’s primary application lies in customer service since most companies unfortunately continue to rely on an ancient methodology to manage customer interaction. And this is to be expected as most consumers themselves are still “hard-wired” to pick up a phone and dial a number when they want to engage with a company.
Companies haven’t made it easy for consumers to transition to digital-first interaction. Consumers are forced to download a mobile app, browse websites, or use voice, the “dumbest” channel the smartphone has to offer, to retrieve information or perform transactions.
This is truly unfortunate because when it comes to paying a bill, checking on an order status, or reviewing account transactions, nothing is easier than sending a simple message. And with 900 million users now on Facebook Messenger, 1 billion on WhatsApp, and hundreds of millions more on basic SMS, companies have a consumer-preferred new medium for engaging with customers.
With messaging, a question can be posed in a simple message such as “Where is my order?”
Contrast this with the conventional options: shepherding that question through a maze of web or mobile app menus, or through an IVR system over the phone. Now imagine how consumer-adopted, digital, automated interaction for simple questions, versus agent interaction over the phone, could impact customer service and its cost. When chatbots handle the most commonly asked questions, agent labor is reduced or redeployed to manage more complex and time-consuming interactions. Simple and moderate issues are resolved faster, leading to greater customer satisfaction and long-term loyalty. Bots can also help deflect calls from the contact center and the IVR, which further reduces speech recognition license and telephony costs.
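To make the deflection argument concrete, here is a hedged sketch of how a bot might automate a handful of high-volume intents and escalate everything else to an agent. The intents, keywords, and canned responses are invented for illustration; a production bot would use a trained classifier rather than keyword matching.

```python
# Illustrative intent router: automate the common questions, escalate the rest.
INTENT_KEYWORDS = {
    "order_status": ["where is my order", "order status", "tracking"],
    "pay_bill": ["pay my bill", "payment due", "invoice"],
}

def classify(message):
    text = message.lower()
    for intent, phrases in INTENT_KEYWORDS.items():
        if any(p in text for p in phrases):
            return intent
    return "escalate"   # anything unrecognized goes to a human

def handle(message):
    intent = classify(message)
    if intent == "order_status":
        return "Your order shipped yesterday; here is your tracking link."
    if intent == "pay_bill":
        return "You can settle your balance at the payment link on file."
    return "Connecting you to an agent."   # human handoff

print(handle("Where is my order?"))
```

Even this naive router illustrates the economics: every message resolved in the first two branches is a call the contact center never takes.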
Could there be Bot-tle-necks?
There is also the question of whether these chatbots will take jobs from humans, a subject of fierce debate across all industries and levels in the last few months. Facebook has been quick to clarify that its chatbots are not going to replace the people in its organisation, but instead will work alongside them. For example, Facebook has said that customer service executives will be required to train the AI bots and to step in when the AI comes unstuck, which is likely to happen fairly frequently in the early stages. Chinese messaging service WeChat has taken up the chatbot idea, with companies running official accounts through which they can communicate with their customers. However, the platform is still in its early stages and is reported to be incredibly frustrating to use, so those in the customer service sector needn’t worry that their jobs are under threat quite yet.
We may well see chatbots starting to appear on the likes of the Facebook Messenger and WhatsApp platforms in the coming 12 months, with companies dedicating teams of engineers to train them rather than relying on the general public. There are three main factors on which their success depends.
The first is how much freedom AI in general will be allowed in its development, especially given the hesitation that the likes of Elon Musk and Bill Gates have expressed about a potential ‘Singularity’, with Musk recently quoted as saying that ‘Artificial Intelligence is our biggest existential threat’.
The second is arguably more important: how willing the general public is to help develop the chatbots by having conversations with them, in the knowledge that they are talking to an autonomous entity.
More important still: will these chatbots be safe from cyberattacks? How will you know your financial information is secure if you disclose it to a chatbot, especially when it is unlikely to carry the same multi-stage security checks that are the hallmark of person-to-person customer service interactions?
The Road Ahead?
Many companies are already launching bots for customer acquisition or customer service. We will see failures, and in part already have. Bots are not trivial to build: you need people with experience in man-machine interface design. But to quote Amara’s Law: “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.”
Bots are here to stay; they will be a great new platform and will make things easier for all of us. But bots that try to do too much, or that set unreasonable expectations, will slow consumer confidence in and acceptance of them. What might help now is to calm down a bit on the hype and focus on building good bots that have value, then share our experiences and show the world where the true value lies.
Data Sciences @ Fintech Companies for Competitive Disruption & Advantage
Long considered an impenetrable fortress dominated by a few well-known names, the banking and financial services industry is currently riding a giant wave of entrepreneurial disruption, disintermediation, and digital innovation. Everywhere, things are in flux. New, venture-backed arrivals are challenging the old powerhouses. Banks and financial services companies are caught between increasingly strict and costly regulations and the need to compete through continuous innovation.
How does an entire industry remain relevant, authoritative, and trustworthy while struggling to surmount inflexible legacy systems, outdated business models, and a tired culture? Is there a way for banks and other traditional financial services companies to stay on budget while managing the competitive threat of agile newcomers and startups that do business at lower costs and with better margins? The threat is real. Can established institutions evolve in time to avoid being replaced? What other strategies can protect their extensive infrastructures and win the battle for the customer’s mind, heart, and wallet?
Financial technology, or fintech, is on fire with innovation and investment. The movement is reshaping entrepreneurial businesses and shaking the financial industry, reimagining the methods and tools consumers use to manage, save, and spend money. Agile fintech companies and their technology-intensive offerings do not shy away from big data, analytics, cloud computing, and machine learning, and they insist on a data-driven culture.
Consider a recent Forbes article by Chance Barnett, which quantifies fintech startup investments at $12 billion in 2014, quadrupling the $3 billion level achieved a year earlier. Adding to the wonderment, crowdfunding is likely to surpass venture capital in 2016 as a primary funding source. And people are joining the conversation. Barnett writes, “According to iQ Media, the number of mentions for ‘fintech’ on social media grew four times between 2013 and 2014, and will probably double again in 2015.” All of this activity underscores how technology is rattling the financial status quo and changing the very nature of money.
Yesterday’s Bank: A Rigid Culture, Strapped for Funds
Established banking institutions are strapped. The financial meltdown of 2008 called their operations into question, eroded trust, and invited punitive regulation designed to command, control, and correct the infractions of the past. Regulatory requirements have drained budgets, time, and attention, locking the major firms into constant compliance reporting. To the chagrin of some, these same regulations have also opened the door for new market entrants, technologies, platforms, and modalities, all of which are transforming the industry.
For traditional banking institutions, the focus and energy for innovation are simply not there, nor are the necessary IT budgets. Gartner’s Q3 2015 forecast for worldwide IT spending growth (including all devices, data center systems, enterprise software, and IT and telecom services) hints at the challenge banks face: global IT spending growth is now forecast at -4.9%, down even further from the -1.3% originally forecast, evidence of the spending and investment restraint we see across the financial landscape.
With IT budgets limited, it is hard to imagine banking firms easily reinventing themselves. Yet some are doing just that. Efficient spending is a top strategic priority for banking institutions. Many banks are moving away from a heavy concentration on compliance spending to focus instead on digital transformation, innovation, or collaboration with fintech firms. There is a huge amount of activity on all fronts, and data science sits at the center of it.
Data Sciences Intervention
Digital data has snowballed with the proliferation of the internet, smartphones and other devices. Companies and governments alike recognize the massive potential in using this information – also known as Big Data – to drive real value for customers and improve efficiency.
Big Data could transform businesses and economies, but the real game changer is data science.
Data science goes beyond traditional statistics to extract actionable insights from information – not just the sort of information you might find in a spreadsheet, but everything from emails and phone calls to text, images, video, social media data streaming, internet searches, GPS locations and computer logs.
With powerful new techniques, including complex machine-learning algorithms, data science enables us to process data better, faster and cheaper than ever before.
We’re already seeing significant benefits of this – in areas such as national security, business intelligence, law enforcement, financial analysis, health care and disaster preparedness. From location analytics to predictive marketing to cognitive computing, the array of possibilities is overwhelming, sometimes even life-saving. The New York City Fire Department, for example, was one of the earlier success stories of using data science to proactively identify buildings most at risk from fire.
Unleashing the power of Advanced Data Mining using Data Sciences
For banks – in an era when banking is becoming commoditized – data mining provides a massive opportunity to stand out from the competition. Every banking transaction is a nugget of data, so the industry sits on vast stores of information.
By using data science to collect and analyze data, banks can improve, or reinvent, nearly every aspect of banking. Data science can enable hyper-targeted marketing, optimized transaction processing, personalized wealth management advice and more – the potential is endless.
A large proportion of the current Data Mining projects in banking revolve around customers – driving sales, boosting retention, improving service, and identifying needs, so the right offers can be served up at the right time.
Banks can model their clients’ financial performance across multiple data sources and scenarios. Data science can also help strengthen risk management in areas such as card fraud detection, financial crime compliance, credit scoring, stress testing and cyber analytics.
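As a hedged illustration of the fraud-detection use case, the sketch below flags anomalous card transactions with an isolation forest, an unsupervised technique often used when labeled fraud examples are scarce. The features (amount, hour, distance from home) and the contamination rate are invented for the example.

```python
# Illustrative card-fraud screen: flag unusual transactions without labels.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Simulate mostly routine transactions...
normal = np.column_stack([
    rng.normal(40, 15, 500),   # amount in dollars
    rng.normal(14, 3, 500),    # hour of day
    rng.normal(5, 2, 500),     # km from the cardholder's home
])
# ...plus two that look like fraud (large, late at night, far away).
odd = np.array([[900.0, 3.0, 4800.0], [750.0, 2.0, 5200.0]])
X = np.vstack([normal, odd])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = model.predict(X)   # -1 marks an anomaly
print("flagged rows:", np.where(flags == -1)[0])
```

In practice banks would combine a screen like this with supervised models, rules, and human review, since a raw anomaly score alone generates false positives.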
The promise of Big Data is even greater than this, however, potentially opening up whole new frontiers in financial services.
Seamless experience for customers
Over 1.7 billion people with mobile phones are currently excluded from the formal financial system. This makes them invisible to credit bureaus, but they are increasingly becoming discoverable through their mobile footprint. Several innovative fintech firms have already started building predictive models using this type of unconventional data to assess credit risk and provide new types of financing.
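Here is a hedged sketch of how such unconventional-data scoring might look. Every feature (top-up regularity, airtime balance, contact network size) and every label in it is invented for the demonstration, not drawn from any real lender.

```python
# Toy credit-scoring model on mobile-footprint features (all invented).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 400
X = np.column_stack([
    rng.uniform(0, 1, n),      # airtime top-up regularity score
    rng.uniform(0, 50, n),     # average airtime balance
    rng.integers(5, 300, n),   # distinct contacts per month
])
# Toy label: steadier top-ups and balances loosely imply repayment here.
y = ((0.6 * X[:, 0] + 0.01 * X[:, 1] + rng.normal(0, 0.2, n)) > 0.5).astype(int)

scorer = LogisticRegression(max_iter=1000).fit(X, y)
applicant = [[0.8, 30, 120]]   # a hypothetical applicant
print("repayment probability:", scorer.predict_proba(applicant)[0, 1])
```

The point is not the model, which is deliberately trivial, but the inputs: behavioral signals standing in for the credit history these customers do not have.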
While banks have historically been good at running analytics at a product level, such as credit cards, or mortgages, very few have done so holistically, looking across inter-connected customer relationships that could offer a business opportunity – say when an individual customer works for, supplies or purchases from a company that is also a client of the bank. The evolving field of data science facilitates this seamless view.
Blockchain as the new database
Much more is yet to come. Blockchain, the disruptive technology underlying the cryptocurrency Bitcoin, could spell huge change for financial services in the future. By saving information as a ‘hash’, rather than in its original format, the blockchain ensures each data element is unique, time-stamped and tamper-resistant.
The semi-public nature of some types of blockchain paves the way for an enhanced level of security and privacy for sensitive data – a new kind of database where the information ‘header’ is public but the data inside is ‘private’.
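The tamper-resistance comes from hash chaining, which a few lines of standard-library Python can sketch; real blockchains add distribution, consensus and digital signatures on top of this, so treat it as a minimal illustration only.

```python
# Minimal hash-chained ledger: each block commits to its data and its parent,
# so altering any earlier record invalidates every link after it.
import hashlib, json, time

def make_block(data, prev_hash):
    block = {"time": time.time(), "data": data, "prev": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block({"trade": "FX settlement", "amount": 1000000},
                        chain[-1]["hash"]))
chain.append(make_block({"trade": "bond repo", "amount": 250000},
                        chain[-1]["hash"]))

def verify(chain):
    """Check that every block still points at its parent's hash."""
    return all(b["prev"] == p["hash"] for p, b in zip(chain, chain[1:]))

print("ledger intact:", verify(chain))
```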
As such, the blockchain has several potential applications in financial markets – think of trade finance, stock exchanges, central securities depositories, trade repositories or settlement systems.
Data analytics using blockchain, distributed ledger transactions and smart contracts will become critical in future, creating new challenges and opportunities in the world of data science.
How Healthcare Industry will Benefit by Embracing Data Sciences
In the healthcare industry, what could be more important than having better healthcare outcomes? Each and every day healthcare workers around the globe are striving to find more ways of improving our lives. However, the world is changing, and frankly, at a faster rate than most of us can keep up with. Intuition alone will no longer be enough for quality patient outcomes. The amount of healthcare data continues to mount every second, making it harder and harder to find any form of helpful information. Big Data is not to be romanticized; it can be a blessing and a curse. It can contribute both to insight and to the fog that obscures it.
In truth, data science is proving invaluable to improving outcomes due to its ability to automate so much of the heavy lifting – in fast, scalable, and precise ways. All one has to do is look at our ability to predict epidemics, advance cures, and make patient stays in hospitals safer and more pleasant. In healthcare, data science should be seen as a beneficial intelligence rather than only artificial intelligence, providing an augmentation of services to the healthcare experts already in play.
Hospital Claims Data
In 2010, there were 35.1 million hospital discharges, with an average length of stay of 4.8 days, according to the National Hospital Discharge Survey. The same survey noted that 51.4 million procedures were performed. The National Hospital Ambulatory Medical Care Survey in 2011 put the number of outpatient department visits at 125.7 million, with 136.3 million emergency department visits. These are some of the basic figures showing the amount of care the U.S. health care system has provided. Using data science to analyze this sort of data allows healthcare providers to build a new intuition, grounded in a data narrative, that could help avoid the spread of diseases or address specific health threats. Using a combination of descriptive statistics, exploratory data analysis, and predictive analytics, it becomes relatively easy to identify the most cost-effective treatments for specific ailments and to reduce the number of duplicate or unnecessary treatments. The power of predicting a future state lies in using that knowledge to change the behavior patterns of today.
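A short, hedged sketch shows how the cost-effectiveness question might be answered once claims are in tabular form; the tiny table of ailments, treatments and outcomes below is invented purely for illustration.

```python
# Sketch: rank treatments per ailment by cost per successful outcome.
import pandas as pd

claims = pd.DataFrame({
    "ailment":   ["asthma", "asthma", "asthma", "diabetes", "diabetes"],
    "treatment": ["inhaler A", "inhaler B", "inhaler A", "drug X", "drug Y"],
    "cost":      [120, 340, 110, 500, 220],
    "recovered": [1, 1, 0, 1, 1],
})

summary = (claims.groupby(["ailment", "treatment"])
                 .agg(avg_cost=("cost", "mean"),
                      recovery_rate=("recovered", "mean"))
                 .reset_index())
# Lower cost per recovery means a more cost-effective treatment.
summary["cost_per_recovery"] = summary["avg_cost"] / summary["recovery_rate"]
print(summary.sort_values(["ailment", "cost_per_recovery"]))
```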
Electronic Health Record (EHR)
Interoperable electronic health records (EHRs) for patient care hold tremendous potential to reduce the growth in costs. EHRs can help healthcare organizations improve chronic disease management, increase operating efficiencies, transform their finances, and improve patient outcomes. However, EHR implementations are in various stages of maturity across the country, and their benefits have not been fully realized. One of the primary challenges healthcare decision-makers face is how to make meaningful use of the data collected, available, and accessible in EHRs.
By optimizing the use of data accessible in EHRs, we can uncover hidden relationships and identify patterns and trends in this diverse and complex information to improve chronic disease management, increase operating efficiencies, and transform healthcare organizations’ finances.
Patient Behavior and Sentiment Data
A study by AMI Research suggests that “wearables” are expected to reach $52 million by 2019. Wearables monitor heart rates, sleep patterns, walking, and much more while providing new dimensions of context, geolocation, behavioral pattern, and biometrics. Combine this with the unstructured “lifestyle” data that comes across social media and you have a potent combination that is more than just numbers and tweets.
It is obvious that we will experience huge benefits from analyzing the ins and outs of healthcare data. In my judgment, we will continue to see a push for prevention over cure, which puts predicting outcomes front and center. After all, conditions caught at an earlier stage are easier to treat, and outbreaks can be more easily contained.
It may not resonate as widely today, but in the future we will look back on data science as something significant for healthcare. It is reasonable to expect that we will likely recover more quickly from illness and injury, live longer because of newly discovered drugs, and benefit from more efficient hospital surgeries – and in large part this will be because of how we analyze Big Data.
What makes living in the era of Big Data such a delight is that the healthcare industry is being pressed to find better tools, skills, and techniques to deal competently with the deluge of patient data and its inherent insights. When healthcare makes the choice to fully embrace data science, it will change the future for everyone.
Genomics
Inexpensive DNA sequencing and next-generation genomic technologies are changing the way healthcare providers do business. We now have the ability to map entire DNA sequences and measure tens of thousands of blood components to assess health.
Next-generation genomic technologies allow data scientists to drastically increase the amount of genomic data collected on large study populations. When combined with new informatics approaches that integrate many kinds of data with genomic data in healthcare applications such as disease research and prescription effectiveness, this will help us better understand the genetic bases of drug response and disease. Researchers aim to achieve ultra-personalized healthcare. As a beginning, the FDA has already begun to issue medicine labels that specify different dosages for patients with particular genetic variants.
Predictive Analytics and Preventive Measures
Prevention is always better than cure. For the health-care industry, it also happens to save a lot of money. (The Centers for Medicaid and Medicare Services, for instance, can penalize hospitals that exceed average rates of readmission – indicating that they could be doing more to prevent medical problems.)
Take, for example, the partnership between Mount Sinai Medical Center and former Facebook data guru Jeff Hammerbacher. Mount Sinai’s problem was how to reduce readmission rates. Hammerbacher’s solution was predictive analytics:
- In a pilot study, Hammerbacher and his team combined data on disease, past hospital visits and other factors to determine a patient’s risk of readmission.
- These high-risk patients would then receive regular communication from hospital staff to help them avoid getting sick again. (A minimal sketch of such a risk model follows this list.)
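The sketch below shows the general shape of a readmission-risk model; the features, threshold and synthetic patients are all assumptions made for the example, not Mount Sinai’s actual method.

```python
# Hedged sketch of a readmission-risk model on invented discharge data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 300
X = np.column_stack([
    rng.integers(0, 6, n),     # hospital visits in the past year
    rng.integers(0, 4, n),     # number of chronic conditions
    rng.integers(20, 95, n),   # age
])
# Toy label loosely tied to visits and conditions, for demonstration only.
y = ((X[:, 0] + X[:, 1] + rng.normal(0, 1, n)) > 4).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
patient = [[3, 2, 70]]   # a hypothetical discharge
risk = model.predict_proba(patient)[0, 1]
if risk > 0.5:           # high risk: schedule the follow-up outreach
    print(f"flag for outreach (risk {risk:.2f})")
```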
Sinai isn’t alone. In 2008, Texas Health partnered with Healthways to merge and analyze clinical and insurance claims information. Their goal was the same – identify high-risk patients and offer them customized interventions.
Meanwhile, starting in 2013, data scientists at Methodist Health System have been analyzing accountable-care organization claims from 14,000 Medicare beneficiaries and 6,000 employees. Their aim? You guessed it: predict which patients will need high-cost care in the future.
Patient Monitoring and Home Devices
Doctors can do a lot, but they can’t follow a patient around every minute of the day. Wearable body sensors – sensors tracking everything from heart rate to testosterone to body water – can.
Sensors are just one way in which medical technology is moving beyond the hospital bed. Home-use medical monitoring devices and mobile applications are cropping up daily. A scanner to diagnose melanomas? A personal ECG heart monitor? No problem.
These gadgets are designed to help the patient, naturally, but they’re also busy harvesting data.
For example:
- Asthmapolis’s GPS-enabled tracker, already available by 2011, records inhaler usage by asthmatics. This information is collated, analyzed and merged with data on asthma catalysts from the CDC (e.g., high pollen counts in New England) to help doctors learn how best to prevent attacks.
- With Ginger.io’s mobile application, out in 2012, patients consent to have data about their calls, texts, location and movements monitored. These are combined with data on behavioral health from the NIH and other sources to pinpoint potential problems. Too many late-night phone calls, for instance, might signal a higher risk of anxiety attack (the sketch after these examples shows the general pattern).
- To improve patient drug compliance, Eliza, a Boston-based company, monitors which types of reminders work on which types of people. Smarter targeting means more compliance.
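These examples share one pattern: compare a stream of readings against a personal baseline and flag deviations worth a human follow-up. The sketch below applies that pattern to resting heart rate; the window, threshold and readings are invented for illustration.

```python
# Flag wearable readings that sit far outside a rolling personal baseline.
import statistics
from collections import deque

def monitor(readings, window=20, z_limit=3.0):
    """Yield readings well outside the wearer's recent baseline."""
    recent = deque(maxlen=window)
    for value in readings:
        if len(recent) >= 5:
            mean = statistics.fmean(recent)
            sd = statistics.pstdev(recent) or 1.0
            if abs(value - mean) / sd > z_limit:
                yield value   # e.g. prompt a check-in message
        recent.append(value)

resting_hr = [62, 64, 61, 63, 65, 62, 60, 64, 63, 118, 62, 61]
print("flagged:", list(monitor(resting_hr)))
```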
The Challenges Ahead
There are plenty of hurdles to creating a data-driven health care industry. Some are technical, some emotional. Health care providers have had decades to accumulate paper records, inefficiencies and entrenched routines. A remedy will not be quick.
And some say it shouldn’t. At least, not without a hard look at patient privacy, data ownership and the overall direction of U.S. health care.