Strategic Insights and Clickworthy Content Development


Why Quantum Computing Should Be on Your Radar Now


Boston Consulting Group and Forrester are advising clients to get smart about quantum computing and start experimenting now so they can separate hype from reality.

There’s a lot of chatter about quantum computing, some of it false and some of it true. For example, there’s a misconception that quantum computers are going to replace classical computers for every possible use case, which is false. “Quantum computing” is not necessarily synonymous with “quantum leap.” Instead, quantum computing relies on quantum physics, which makes it fundamentally different from classical, binary computers. Binary computers can only process 1s and 0s; quantum computers can process many more possibilities, simultaneously.

If math and physics scare you, a simple analogy (albeit not an entirely correct one) involves a light switch and a dimmer switch that represent a classical computer and a quantum computer, respectively. The standard light switch has two states: on and off. The dimmer switch provides many more options, including on, off, and a range of states between on and off that are experienced as degrees of brightness and darkness. With a dimmer switch, a light bulb can be on, off, or a combination of both.

If math and physics do not scare you, quantum computing involves quantum superposition, which explains the nuances more eloquently.
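For the code-inclined, superposition can be illustrated (in a highly simplified way) with plain Python: a qubit’s state is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. This is an illustrative sketch, not a simulation of real hardware.

```python
import math

# A qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.

# A classical bit is simply one of two values:
classical_bit = 1

# An equal superposition ("both on and off" in the dimmer analogy):
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

prob_zero = abs(alpha) ** 2  # probability of measuring 0
prob_one = abs(beta) ** 2    # probability of measuring 1
assert math.isclose(prob_zero + prob_one, 1.0)
```

Until it is measured, the qubit genuinely carries both amplitudes at once, which is what the dimmer-switch analogy gestures at.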

One reason quantum computers are not an absolute replacement for classical computers has to do with their physical requirements. Quantum computers require extremely cold conditions for quantum bits, or qubits, to remain “coherent.” For example, much of D-Wave’s input/output (I/O) system must function at 15 millikelvin (mK), which is near absolute zero; 15 mK is equivalent to minus 273.135 degrees Celsius or minus 459.643 degrees Fahrenheit. By comparison, the classical computers most individuals own have built-in fans and may include heat sinks to dissipate heat, while supercomputers tend to be cooled with circulated water. In other words, the ambient operating environments required by quantum computers and classical computers vary greatly. Naturally, there are efforts aimed at achieving quantum coherence at room temperature, one of which is described here.

Quantum computers and classical computers are fundamentally different tools. In a recent report, Brian Hopkins, vice president and principal analyst at Forrester explained, “Quantum computing is a class of emerging hardware and software that exploits subatomic phenomenon to solve computationally hard problems.”

What to expect, when

There’s a lot of confusion about the current state of quantum computing, which industry research firms Boston Consulting Group (BCG) and Forrester are attempting to clarify.

In the Forrester report, Hopkins estimates that quantum computing is in the early stages of commercialization, a stage that will persist until sometime between 2025 and 2030. The growth stage will begin at the end of that period and continue through the end of the forecast period, which is 2050.

A recent BCG report estimates that quantum computing will become a $263 billion to $295 billion market under two different forecasting scenarios, both of which span 2025 to 2050. BCG also reasons that the quantum computing market will advance in three distinct phases:

  1. The first generation will be specific to applications that are quantum in nature, similar to what D-Wave is doing.
  2. The second generation will unlock what report co-author and BCG senior partner Massimo Russo calls “more interesting use cases.”
  3. In the third generation, quantum computers will have achieved the number of logical qubits required to achieve Quantum Supremacy. (Note: Quantum Supremacy and logical qubits versus physical qubits are important concepts addressed below.)

“If you consider the number of logical qubits [required for problem-solving], it’s going to take a while to figure out what use cases we haven’t identified yet,” said BCG’s Russo. “Molecular simulation is closer. Pharma company interest is higher than in other industries.”

Life sciences, developing new materials, manufacturing, and some logistics problems are ideal for quantum computers for a couple of possible reasons:

  • A quantum machine is more adept at solving quantum mechanics problems than classical computers, even when classical computers are able to simulate quantum computers
  • The nature of the problem is so difficult that it can’t be solved using classical computers at all, or it can’t be solved using classical computers within a reasonable amount of time, at a reasonable cost.

There are also hybrid use cases in which parts of a problem are best solved by classical computers and other parts of the problem are best solved by quantum computers. In this scenario, the classical computer breaks the problem apart, communicates with the quantum computer via an API, receives the result(s) from the quantum computer and then assembles a final answer to the problem, according to BCG’s Russo.

“Think of it as a coprocessor that will address problems in a quantum way,” he said.
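The coprocessor pattern Russo describes can be sketched in a few lines of Python. Everything here is hypothetical, with invented function names rather than a real quantum SDK; it only illustrates the division of labor between the classical and quantum sides.

```python
# Hypothetical sketch of the hybrid pattern: the classical side
# decomposes the problem, sends quantum-suited subproblems to a
# quantum coprocessor over an API, and assembles the final answer.
# All names here are illustrative, not part of any real SDK.

def decompose(problem):
    """Split a problem into classical and quantum-suited parts."""
    return problem["classical_parts"], problem["quantum_parts"]

def solve_classically(part):
    return {"part": part, "result": f"classical:{part}"}

def submit_to_quantum_coprocessor(part):
    # In practice this would be an API call to quantum hardware.
    return {"part": part, "result": f"quantum:{part}"}

def solve_hybrid(problem):
    classical_parts, quantum_parts = decompose(problem)
    results = [solve_classically(p) for p in classical_parts]
    results += [submit_to_quantum_coprocessor(p) for p in quantum_parts]
    return results  # the classical side assembles the final answer

answers = solve_hybrid({
    "classical_parts": ["preprocessing"],
    "quantum_parts": ["molecular simulation"],
})
```

The key point is that the quantum machine never owns the whole workflow; it handles only the pieces that suit it.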

While there is a flurry of quantum computing announcements at present, practically speaking, it may take a decade to see the commercial fruits of some efforts and multiple decades to realize the value of others.

Logical versus physical qubits

Not all qubits are equal, and that is true in two regards. First, there’s an important difference between logical qubits and physical qubits. Second, the large vendors are approaching quantum computing differently, so their “qubits” may differ.

When people talk about quantum computers or semiconductors that have X number of qubits, they’re referring to physical qubits. The number of qubits matters because computational power grows exponentially with each added qubit. According to Microsoft, a calculator is more powerful than a single qubit, and “simulating a 50-qubit quantum computation would arguably push the limits of existing supercomputers.”
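The exponential claim is easy to make concrete: an n-qubit register is described by 2^n complex amplitudes, so each added qubit doubles the memory a classical simulator needs. A quick back-of-the-envelope calculation in Python (assuming 16 bytes per complex amplitude, i.e., two 64-bit floats):

```python
# State-space growth: an n-qubit register needs 2**n complex
# amplitudes, so classical simulation cost doubles per qubit.
for n in (1, 10, 30, 50):
    amplitudes = 2 ** n
    memory_bytes = amplitudes * 16  # 16 bytes per complex amplitude
    print(f"{n:2d} qubits -> {amplitudes} amplitudes, "
          f"~{memory_bytes / 1e9:.3g} GB to store")
```

At 50 qubits the state vector alone runs to roughly 18 petabytes, which is why Microsoft’s comparison to supercomputer limits is plausible.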

BCG’s Russo said for semiconductors, the number of physical qubits required to create a logical qubit can be as high as 3,000:1. Forrester’s Hopkins stated he’s heard numbers ranging from 10,000 to 1 million or more, generally.

“No one’s really sure,” said Hopkins. “Microsoft thinks [it’s] going to be able to achieve a 5X reduction in the number of physical qubits it takes to produce a logical qubit.”  

The difference between physical qubits and logical qubits is extremely important because physical qubits are so unstable that additional qubits are needed for error correction and fault tolerance.
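A classical analogy helps explain why so many physical qubits go into one logical qubit: encode each logical bit redundantly across several noisy physical bits and decode by majority vote. Real quantum error-correcting codes are far more sophisticated than this, but the Python sketch below illustrates the underlying trade of redundancy for reliability.

```python
import random
from collections import Counter

# Classical analogy for error correction: one "logical" bit is
# stored in many noisy "physical" bits and read out by majority
# vote, driving the logical error rate far below the physical one.

def noisy_copy(bit, error_rate, rng):
    # Flip the bit with the given probability.
    return bit ^ (rng.random() < error_rate)

def logical_readout(bit, n_physical, error_rate, rng):
    physical = [noisy_copy(bit, error_rate, rng) for _ in range(n_physical)]
    return Counter(physical).most_common(1)[0][0]  # majority vote

rng = random.Random(42)
trials = 10_000
errors = sum(logical_readout(0, 15, 0.1, rng) != 0 for _ in range(trials))
print(f"logical error rate: {errors / trials:.4%}")
```

With a 10% physical error rate and 15 redundant copies, the majority vote almost never fails; quantum codes pay a much steeper overhead for the same effect, hence ratios like 3,000:1.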

Get a grip on Quantum Supremacy

Quantum Supremacy does not signal the death of classical computers for the reasons stated above. Google cites this definition: “A critical question for the field of quantum computing in the near future is whether quantum devices without error correction can perform a well-defined computational task beyond the capabilities of state-of-the-art classical computers, achieving so-called quantum supremacy.”

“We’re not going to achieve Quantum Supremacy overnight, and we’re not going to achieve it across the board,” said Forrester’s Hopkins. “Supremacy is a stepping stone to delivering a solution. Quantum Supremacy is going to be achieved domain by domain, so we’re going to achieve Quantum Supremacy, which Google is advancing, and then Quantum Value, which IBM is advancing, in quantum chemistry or molecular simulation or portfolio risk management or financial arbitrage.”

The fallacy is believing that Quantum Supremacy means that quantum computers will be better at solving all problems, ergo classical computers are doomed.

Given the proper definition of the term, Google is attempting to achieve Quantum Supremacy with its 72-qubit quantum processor, Bristlecone.

How to get started now

First, understand the fundamental differences between quantum computers and classical computers. This article is merely introductory, given its length.

Next (before, after, and alongside the next piece of advice), find out what others are attempting to do with quantum computers and quantum simulations, and consider what use cases might apply to your organization. Do not limit your thinking to what others are doing. Based on a fundamental understanding of quantum computing and your company’s business domain, imagine what might be possible, whether the end result is a minor percentage optimization that would give your company a competitive advantage or a disruptive innovation such as a new material.

Experimentation is also critical, not only to test hypotheses, but also to better understand how quantum computing actually works. The experimentation may inspire new ideas, and it will help refine existing ideas. From a business standpoint, don’t forget to consider the potential value that might result from your work.

Meanwhile, if you want to get hands-on experience with a real quantum computer, try IBM Q. The “IBM Q Experience” includes user guides, interactive demos, the Quantum Composer, which enables the creation of algorithms that run on real quantum computing hardware, and the QISKit software development kit.

Also check out Quantum Computing Playground, a browser-based WebGL Chrome experiment that features a GPU-accelerated quantum computer with a simple IDE interface and its own scripting language with debugging and 3D quantum state visualization.

In addition, the Microsoft Quantum Development Kit Preview is available now. It includes the Q# language and compiler, the Q# standard library, a local quantum machine simulator, a trace quantum simulator that estimates the resources required to run a quantum program, and a Visual Studio extension.


Lisa Morgan Will Address A/IS Ethics at the University of Arizona Eller College of Management’s Annual Executive Ethics Symposium

Lisa Morgan will be speaking at this year’s University of Arizona Eller College of Management’s Annual Executive Ethics Symposium, an invitation-only event in September 2018.   Her presentation will address the need for AI ethics given the current state of AI and innovation, as well as the opportunities and challenges shaping the advancement of AI ethics.

Lisa Morgan Advances Digital Ethics

Lisa Morgan is on a mission to educate the high-tech industry about the importance of digital ethics.  In advancement of that goal, she has been appointed Program Manager, Content and Community of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems Outreach Committee.  In that capacity, she is responsible for content and community development for the worldwide membership which now exceeds 1,050 members.  She is also a contributor to the group’s Ethically-Aligned Design document that is being cooperatively developed by technologists, business leaders, law makers, attorneys and others dedicated to advancing A/IS ethics.

How AR/VR Analytics May Help Your Business

Augmented reality and virtual reality are gaining traction, and some of the early adopters are already trying to figure out what they mean for their businesses. The use cases vary from industry to industry, but the idea is to leverage virtual assets (AR) or create a completely virtual environment (VR) to provide a low-cost yet effective means of accomplishing what is otherwise expensive and difficult in the real world.

The possibilities seem only limited by the imagination; however, adoption numbers underscore the early nature of the products and related analytics among businesses.  For example, a recent survey by IT trade association CompTIA shows that about 21% of the responding organizations had some kind of AR or VR initiative in place.

“Most organizations realize there’s some potential because they saw what happened with Pokémon Go last year, but it’s going to take some time to happen,” said Tim Herbert, senior VP of research and market intelligence at CompTIA.

Right now, people are focused on the visualization aspects and what that means. Interest in analytics will come later, once it becomes clear that what happens in an AR or VR environment needs to be monitored, analyzed, and optimized. For now, most organizations are focused on the technology itself and the talent needed.

“VR analytics can empower organizations to better understand and connect with their audiences. It’s about knowing exactly how your audience interacted with your content and, on a psychological level, how emotionally salient they found it,” said Joshua Setzer, CEO of VR/AR solutions provider Lucid Dream. “How you look at it depends upon your own job function and the objectives behind your project. A marketer may want to [understand] which parts of a message resonate with an audience and which don’t. A trainer may wish to tease out the psychophysical signatures of learning to understand which elements of content are being imprinted in memory and which are more likely to be forgotten.”

Companies in different industries are exploring AR/VR technologies to see what impact they have on sales, marketing, HR, product development and more.

“If you think through some of those use cases, you can see how having some of the new streams of data would be valuable to an organization,” said Herbert.

Following are a few things your organization can start thinking about today.

B2B Chatbots are Poised for Explosive Growth

Chatbot use is on the rise, and the use cases are growing. According to Gartner, by 2021, more than 50% of enterprises will spend more each year on bots and chatbot creation than traditional mobile app development.

In a recent blog, Gartner Brand Content Manager Kasey Panetta said, “Individual apps are out. Bots are in. In the ‘post-app era,’ chatbots will become the face of AI, and bots will transform the way apps are built. Traditional apps, which are downloaded from a store to a mobile device, will become just one of many options for customers.”

Chatbots and virtual assistants such as Alexa are being interwoven into consumer lifestyles. KPMG Digital Enablement Managing Director Michael Wolf says his company sees tremendous potential on the B2B side.

“B2B chatbots and virtual assistants could be the interface across multiple systems,” said Wolf. “We’re seeing a lot of growth in that, and the enterprise platform companies are making investments there, either acquiring the capability or acquiring the platforms to do that stuff.”

Implementing chatbots differs from implementing virtual assistants because of their respective designs and capabilities. Traditional chatbots are script-based, so they respond only to pre-programmed inputs. Virtual assistants utilize machine learning to continually improve their ability to understand and respond appropriately to natural language.
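The distinction is easy to see in code. Below is a minimal, purely illustrative script-based bot: it only handles inputs its author anticipated, which is precisely the limitation that machine-learning-driven virtual assistants address.

```python
# A toy script-based chatbot: canned triggers and canned replies.
# The triggers and responses here are invented for illustration.

SCRIPT = {
    "invoice status": "Invoice #1234 was paid on the 3rd.",
    "delivery status": "Your order shipped yesterday.",
    "hours": "Support is available 9am-5pm ET.",
}

def scripted_reply(message):
    for trigger, response in SCRIPT.items():
        if trigger in message.lower():
            return response
    # No learning, no handling of paraphrases or unseen intents.
    return "Sorry, I didn't understand that."

print(scripted_reply("What's my delivery status?"))
print(scripted_reply("Where is my package?"))  # same intent, but falls through
```

The second query has the same intent as the first, yet the scripted bot fails on it; a virtual assistant trained on real user language would be expected to map both to the same answer.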

“One of the problems with bots is modeling what they think customers want rather than training the system with real people, not just employees and customers, but the person asking the questions. What are they asking?  How are they asking it?” said Wolf. “If you just try to follow your same traditional route paradigms without concentrating on learning and design thinking, you’re going to get less desirable outcomes.”

Expanding B2B use cases

Like other forms of automation, chatbots and virtual assistants are seen as human-augmenting technologies that enable humans to focus on less repetitive, higher-value tasks.

David Nichols, Americas Innovation and Alliance Leader for EY Advisory, sees numerous opportunities for B2B chatbots, including internal employee communications, most HR interactions, and everyday interactions such as checking invoice status, delivery status and updates, and customer service interactions.

“The biggest challenge with B2B companies is getting suppliers and customers to use the chatbot functionality,” said Nichols. “Also, B2B companies don’t usually place the same priority on customer personalization as B2C companies. As a result, the customer service interactions at B2B companies don’t usually have the same level of detailed customer segmentation and interaction history. This will present a challenge when developing the use cases and scenarios for the bot conversation flow.”

In HR scenarios, chatbots provide intelligent means of re-engaging with candidates, specifically sourcing, screening, and updating candidate information.

“[Using] other methods these interactions can take days to weeks for an organization to handle,” said Chris Collins, CEO of recruitment automation company RoboRecruiter. “Chatbots significantly increase the speed and scale that you can operate down to hours and combined with AI can keep the data active.”

That could lead to more positive recruiting experiences for candidates, contract workers, and employers. Similarly, from an outward-facing standpoint, chatbots and virtual assistants could improve brands’ relationships with customers.

“It might seem counter-intuitive that an AI-driven chatbot can help companies build relationships with their customers, but remember, the ‘Millennial Mindset’ is quickly becoming the dominant purchasing orientation, and those customers want to efficiently self-service,” said Anthony Smith, CEO of CRM solution provider Insightly. “In 2018, B2B chatbots will be utilized not only for lead generation, but also as virtual business assistants, and they will handle different tasks such as scheduling and canceling meetings, setting alarms, etc.”

Depending on the enterprise applications chatbots are integrated with, they’ll be able to undertake more complex tasks, such as placing orders, invoicing, and other B2B activities that are time-consuming and usually require precision. However, there are challenges.

“Integrating chatbots with the major payment systems and with social media is tough and it will probably take time, but once this is covered, chatbots will be able to take orders directly through social accounts and that will be a revolution,” said Insightly’s Smith.

Application integration is critical

Automating business processes requires tight integration with enterprise systems. Exactly how many and which systems depends on the purpose of the chatbot. However, because user experience is vitally important, it’s critical to understand what the users of such systems will want to do with them.

“Some are just trying to redo web and mobile rather than using a design approach to using this,” said KPMG’s Wolf. “There’s an assumption because it’s not visual, it doesn’t involve design.”

In B2B contexts, there are many repetitive tasks within business processes, some of which require integrations with different types of systems.

“The injection of the chatbot is allowing consumer-like experiences. ‘I want my ERP to feel like Google’ and ‘I want my CRM to feel like Amazon’ is a constant discussion for my customers,” said KPMG’s Wolf. “Applying an enterprise chatbot is obvious in that scenario.”

The end goal for virtual assistants is orchestrating everything necessary to answer a query or execute a request, which can involve a complex web of interconnections among disparate systems.

In short, the best way forward is iterative because requirements, technology and user expectations are constantly changing.

4 Ways to Improve Data Storytelling

Analytical results are often interpreted differently by different people. Sometimes the conclusions presented don’t align with intuition. Differences in experience and expertise can also come into play. An effective way to align thinking is through data storytelling, although there are better and worse ways to do it.

Data storytelling typically includes text, visualizations, and sometimes tables to illustrate a developing trend or issue that requires attention if not action. Data storytelling can make results more memorable and impactful for the audience, assuming the presentation is done effectively. Following are a few things to consider.

1. Consider the Audience

Data scientists are often considered poor data storytellers because they struggle to align a story with the needs and knowledge level of the audience. Sometimes others are brought in to translate all the technical jargon into something that is meaningful to business leaders.

Similarly, different parts of a business may require a slightly different focus that uses different language and maybe even different types of data visualizations to have the desired effect, which is understanding analytical results in context.

2. Tell A Story

Effective stories have a beginning, a middle, and an end. The beginning of a story provides context, setting the stage for the story itself. The middle tells the story, and the end usually includes a set of possibilities. Getting the end right is important because insights without action have little value. Are there actionable insights from the data? How can the results be used to drive strategy? In a business context, is there a significant revenue opportunity or an opportunity for cost savings? How much more likely is it that one course of action will succeed versus another? If you provide curated data points and visualizations that support the key points, you can often pre-emptively address the most likely questions and objections.

Effective storytelling also addresses issues beyond the “what.” Take a sales situation, for example. Heads of sales constantly monitor progress against sales targets. Let’s say sales fell short of or exceeded expectations last quarter. That leads to other questions: Why were sales better or worse than expected? How could we use those insights to turn the situation around or increase sales even further? How well do we understand our customer base and their requirements? Which levers work well and which don’t?

With some solid analytics and effective data storytelling, everyone in the room, from the head of sales to the C-suite and her team, can have a common understanding of what impacted the sales results, why, how things are changing, and what that means going forward for the sales team, products, marketing, and more.

Data storytelling should also explain why the analysis was performed, how the analysis was performed, whether hypotheses were proven or disproven in addition to the important findings and what those findings mean for the audience. Some people make the mistake of showing the many steps required for an analysis to demonstrate how challenging the exercise was, which adds little, if any, value.

3. Quality Matters

Great stories can be derailed by simple mistakes, such as misspellings, a lack of focus and a propensity to demonstrate the mastery of a software program to the point of distracting the audience.

Misspellings and grammatical errors tend to be caught by modern software; however, it doesn’t always catch everything. Some programs have default settings that limit the amount of text that can be included, but that’s usually configurable. Sadly, it’s possible to overload stories with so much noise that the audience has trouble staying focused; in other words, the point gets lost. Similarly, getting too creative with the colors used in data visualizations can distract the audience from the point.

Also consider the presentation of the data in relation to the data itself. On a scale of one to two, a move from one to two reflects a 100% increase; plotted against an axis that runs to 25, 50, 100, or 1,000, that same single-unit increase would look very different.
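The arithmetic behind that point is trivial but worth making explicit:

```python
# The same absolute change reads very differently depending on the
# baseline: a 1-unit increase is a 100% jump from a base of 1, but
# a 0.1% blip from a base of 1,000.

def percent_change(old, new):
    return (new - old) / old * 100

print(percent_change(1, 2))        # 100.0
print(percent_change(1000, 1001))  # 0.1 (approximately)
```

Axis ranges in a visualization should be chosen so that the visual impression matches the magnitude that actually matters to the audience.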

4. Be Prepared to Address Alternatives

One of the reasons businesses have placed greater emphasis on analytics over traditional reporting is the ability to interact with data rather than passively consume it. There is a parallel in data storytelling: a move away from the traditional, static presentation format that reserves questions for the end, toward interactive storytelling in which questions or alternate points of view can be explored live.

Generally speaking, data storytellers should be prepared for questions and challenges, regardless. Why wasn’t something else explored? If a particular variable were added or subtracted, what would the effect be? Of the X possibilities, which is the most likely to occur and why?

AI Has a Foothold in Business, Now for the Next Steps

AI is seeping into different industries, slowly remolding the global competitive landscape. However, most business leaders still don’t know how machine intelligence will impact their businesses.

EY recently published a brief that focuses on the current state of AI. We interviewed Nigel Duffy, EY Global Innovation AI Leader, who co-authored the document with Chris Mazzei, EY Global Innovation Technologies Leader and Global Chief Analytics Officer.

The brief frames the current state of AI well: “Most organizations aren’t exploiting the potential of AI; they are just at the beginnings of their AI journeys. What should be holding companies back is a lack of talent, but it’s actually a lack of understanding of what’s possible – particularly at the top of large enterprises.”

Addressing the C-Suite disconnect

It’s often hard to imagine the impact new technologies will have on a business. Granted, AI is not new; however, due to recent research and developments, it’s finally at a point where more organizations are either using it in production or experimenting with it.

Some say AI is at an inflection point, namely, at the beginning stages of exponential “hockey stick” growth. If that’s true, the latecomers may find themselves blind-sided by competitors, simply because they didn’t think about and learn first-hand how AI would affect their own companies.

According to Duffy, part of the confusion stems from the fact that AI is a broad set of technologies as opposed to a single, coherent capability. Given the complexity of the landscape (machine learning, computer vision, natural language processing, deep learning, neural networks, etc.), it’s not surprising that business leaders don’t have a clear understanding of how it will transform their businesses.

Also, the hype about AI is skewed. When new technologies hit the scene, evangelists and the media tend to focus on the opportunities and disregard the potential challenges. These skewed views fuel silver-bullet belief systems when silver bullets do not actually exist. It takes hands-on experience, including successes and failures, to truly understand the potential and limitations of a technology as applied to a specific business.

“It goes without saying that AI has the potential to completely transform business. I recently spoke on a panel about this topic at Fortune Global Forum in China, and everyone there, from prime ministers to chairmen of Fortune 500 firms, discussed the transformational potential of AI,” said Duffy. “There is a broad understanding that it is going to be transformational, but the challenge is that it requires work and investment to develop the strategy [and] vision to realize that potential.”

Many organizations are in the early stages of AI adoption, so they have not yet invested sufficient time and money in the process. In order to bridge this gap, leaders need to start gaining experience now, developing initial use cases or proofs of concept. Duffy recommends investing in a big-picture strategy, and developing a vision for how this could transform a firm or sector.

AI is more than a technology

AI is a piece of the digital transformation puzzle. As with all things related to digital transformation, technology is only part of the picture. The most effective strategies focus on business problem-solving.

“I believe companies will have the most impact with a business-first, value-led approach,” said Duffy. “The best way to approach AI is to focus on how to add value to a business beyond just cost efficiencies. Businesses must think now about AI from a strategic perspective and ask themselves how much more value they can deliver through more intelligent use of AI.”

Of course, some barriers to adoption are technology-related. As is typical in the early adoption stages of a technology, the initial tools tend to be targeted at a narrow, technical audience capable of using them. However, as the technology matures, easier-to-use tools follow and abstract away some of the complexity. Usually those tools are aimed at “power users.”

Finally, the technology becomes easy enough for the masses to take advantage of, such as analytics dashboards in the enterprise. Already, AI is built into, and will increasingly be embedded in, many kinds of devices and software, to the point where it is transparent to the user. For example, one does not have to be an AI expert to use Amazon Echo.

“AI will create winners and losers in every industry,” said Duffy. “AI is here today and can provide significant value now. Can you really afford to be slower to adopt it than your competitors?”

Business leaders and organizations should get familiar with AI technology now, because it will make it easier to determine where AI can be used as an effective problem-solving solution, Duffy said.

Business leaders and technologists need to work together. EY does this internally to meld cultures and disrupt traditional ways of thinking.

Set reasonable ROI expectations

In the AI brief, Duffy and Mazzei say, “Many early projects will have low ROI and a limited impact.”  So, at what point, then, should businesses invest in AI?  On one hand, the early adopters gain insight and experience that those sitting on the sidelines miss. On the other hand, those who are later to the game have the luxury of using more mature toolsets and learning from others’ mistakes.

“Early adoption doesn’t necessarily have a low ROI. [To clarify,] the early adopters are often focused on the technology rather than the business problem – this can lead to low ROI,” Duffy said. “However, [early adoption] does lead to invaluable learning.”

Early technology-led projects may also have low ROI because they are (and should be) as much about learning as about value. Rather than limiting the scope to technology-led projects, businesses should identify projects based on their business value and have them led by business stakeholders.

“Because of the transformational potential of AI, if you wait and your competitors don’t, you will be at a disadvantage. AI will differentiate between winners and losers, and the pace at which that is happening is only accelerating,” said Duffy. “Most people are early in their AI journey and the actual investment can be small relative to the potential. It’s a smart decision to make a relatively small investment to start.”

Overconfidence can be dangerous

The immense interest in AI is creating career opportunities and with it overstatements about qualifications. Duffy said it’s important to get the right talent.

“The Dunning–Kruger Effect is of significant concern in this space, that is, people can be unskilled and unaware of it,” said Duffy. “The field has grown so rapidly that there are many people who can solve technical problems, but they have a lack of deep experience.”

EY conducted a survey of 200 senior AI professionals, 56% of whom said that a lack of talent is the greatest barrier to implementation within business operations. If companies don’t have competent AI professionals, they face three big risks, all preventable if business leaders anticipate them and act proactively.

The first is testing. It’s much easier to get AI testing wrong than testing for other technologies. Getting it right requires a certain amount of sophistication, as there are many subtle statistical issues. According to Duffy, AI testing requires talent with deep expertise working with this type of technology.


The second challenge is that machine learning can amplify bias, another key takeaway from the recent EY AI survey. Forty-one percent of the survey participants said they see the gender diversity of existing AI talent influencing machine biases. Researchers need to be especially mindful of racial, gender and other cultural biases. To avoid them proactively, organizations need to hire their AI talent from a diverse pool.
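As a rough illustration of the kind of check this implies (my sketch, not from the survey), one common sanity test is to compare a model’s approval rates across groups. The group labels and the four-fifths threshold below are illustrative assumptions:

```python
def selection_rates(decisions):
    """decisions: list of (group_label, approved_bool) pairs.
    Returns the per-group approval rate, a simple fairness check."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + int(ok)
    return {g: approved[g] / totals[g] for g in totals}


# Hypothetical automated decisions for two groups:
decisions = (
    [("group_a", True)] * 60 + [("group_a", False)] * 40 +
    [("group_b", True)] * 30 + [("group_b", False)] * 70
)
rates = selection_rates(decisions)
# Ratio of the lowest to the highest approval rate; the common
# "four-fifths" rule of thumb flags ratios below 0.8 for review.
ratio = min(rates.values()) / max(rates.values())
print(rates, f"ratio={ratio:.2f}")
```

A check this simple won’t prove a model is fair, but it can surface a disparity early enough for a diverse team to investigate the underlying data.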

Finally, AI tools are making automated decisions, quickly. A sophisticated monitoring system needs to be in place to ensure that anomalies are caught quickly, Duffy said.
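Duffy’s point about monitoring can be made concrete with a toy sketch (my own illustration, not a description of EY’s tooling): track the model’s daily positive-prediction rate and flag days that deviate sharply from a trailing baseline.

```python
import statistics


def flag_anomalies(daily_rates, window=7, z_threshold=3.0):
    """Flag days whose positive-prediction rate deviates sharply from
    the trailing window's mean: a crude drift/anomaly monitor."""
    alerts = []
    for i in range(window, len(daily_rates)):
        baseline = daily_rates[i - window:i]
        mu = statistics.fmean(baseline)
        sigma = statistics.pstdev(baseline) or 1e-9  # avoid divide-by-zero
        z = (daily_rates[i] - mu) / sigma
        if abs(z) > z_threshold:
            alerts.append((i, round(z, 1)))
    return alerts


# Stable for days, then the model suddenly approves far more often:
rates = [0.20, 0.21, 0.19, 0.20, 0.22, 0.21, 0.20, 0.19, 0.21, 0.20, 0.45]
print(flag_anomalies(rates))
```

A production system would monitor many more signals (input distributions, latency, feedback labels), but the principle is the same: anomalies should trigger an alert within hours, not surface in a quarterly review.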

How to ask the right questions

Business leaders who lack experience with AI may wonder how it’s possible to know whether they’re asking the right questions in the first place.

“Some of this is about building up experience over time, which reflects back to my point about how it’s better to be an early adopter. You start asking the questions, seeing the answers, and seeing the outcomes that lead to asking better questions,” said Duffy. “By starting soon, leaders can get experience in determining how AI can have the most meaningful impact on their business.”

How CIO/CFO Relationships Are Evolving

Digital transformation is driving huge organizational changes, not the least of which is the evolving relationships of CIOs and CFOs.

Traditionally, the two roles have been somewhat at odds because CIOs must continually invest in technologies while CFOs are ultimately responsible for financial performance. In today’s highly competitive business environment, CIOs and CFOs need to partner at a strategic level to drive growth and enable organizational agility.

From old school to new school

Data provider Dun & Bradstreet is going through a digital transformation that allows the 176-year-old company to behave and compete like a much younger entity. To get there, the CFO and former CIO (now Chief Content and Technology Officer) are working in partnership to set strategies and execute them.

“We come together quite a lot because what we’re trying to drive is more innovation at a faster clip in a more efficient way,” said Richard Veldran, CFO of Dun & Bradstreet. “It all comes down to data and technology which is at the core of many of the things we’re trying to get done here.”

As the sheer amount of data continues to grow exponentially, Dun & Bradstreet has more opportunities to drive growth by monetizing data. However, to do that, the CFO and CTO need to work as partners.

“So much of it now depends on the alignment of your technical capabilities and investments,” said Curtis Brown, chief content and technology officer at the firm. “Rich and I spend a lot more time talking about our strategy and our execution against that strategy. I would say that’s the single biggest change.”

The partnership allows Veldran and Brown to allocate resources more effectively and make joint decisions about where to invest and how to invest. They’re also working together in a lean agile fashion which enables them to accomplish more in less time while reducing the risk of big project failures.

Focused on high growth

Hitachi Vantara CIO Renee McKaskle and CFO Lori Varlas act as if they’re co-founders and, in a way, they are. Both women were hired into their respective positions about two years ago to spearhead digital transformation. Years before, Varlas and McKaskle had become acquainted while working at PeopleSoft.

“We’re two women in non-traditional women’s roles, so from the get-go, we bonded on the common vision of where we’re going to take this company and how our individual skills and experiences added to that story and towards that journey,” said Varlas. “I think the other thing that bonded us was time is not our friend, particularly in terms of technology, so we had to quickly align on what the business strategy was and figure out how we leverage our own backgrounds and experiences to make that vision a reality.”

They both say it’s important to learn from each other, listen to each other and be aligned on the vision or outcome.

“As we work really closely with the business, things come up. Someone might approach Renee or [me] for different purposes, but it springs to mind, ‘Has Renee’s cybersecurity team looked at that?’” said Varlas. Or, “Does Lori know about that for investment purposes?” said McKaskle. “There’s a bit of a tag team going there because we both have a common understanding and purpose of how it fits together.”

Empathy is key

Cross-functional collaboration is necessary to drive effective digital transformation; however, everyone interviewed for this blog said empathy for the other person’s role is critical.

“I can sometimes be a propeller head, but to think more empathically and as a partnership toward the enablement and delivery of the operation of the company, that’s where folks sometimes get stuck,” said Dun & Bradstreet’s Brown. “CFOs do have to put pressure on delivering a certain set of results within a certain financial framework while [CIOs and] CTOs are trying to drive technical improvements that often require investment.”

As businesses undergo digital transformation, the CIO and CFO have to move quickly and in unison. The best results come when they’re aligned on the business outcomes they’re trying to achieve. That alignment also helps CIOs and CFOs overcome some of the tensions that stem from traditionally separate roles.

Beware Analytics’ Mid-Life Crisis

Businesses are using analytics to stay competitive. One by one, departments are moving from static reports to modern analytics so they can fine-tune their operations. There’s no shortage of solutions designed for specific functions, such as marketing, sales, customer service and supply chain, most of which are available in SaaS form. So, when it’s possible just to pull out a credit card and get started with an application, why complicate things by involving IT?

Freedom from IT seems like a liberating concept until something goes wrong. When data isn’t available or the software doesn’t work as advertised, it becomes the IT department’s job to fix it.

“I used to call this the BI mid-life crisis. Usually about a year and a half or two years in, [departments] realize they can’t report accurately and then they need some help,” said Jen Underwood, founder of Impact Analytix, and a recognized analytics industry expert. “Now I’m seeing more IT involvement again.”

Organizations serious about competing on insights need to think holistically about how they’re approaching analytics and the role of IT. Disenfranchising IT from analytics may prove to be short-sighted. For example, a proof of concept may not scale well or the data required to answer a question might not be available.

Analytics’ long-term success depends on IT

IT was once the sole gatekeeper of technology, but as the pace of business has continued to accelerate, departments have become less tolerant of delays caused by IT. While it’s true no one understands departmental requirements better than the department itself, IT is better equipped to identify what could go wrong, technically speaking.

Even if a department owns and manages all of its data, at some point it will likely want to combine that data with other data, perhaps from a different group.

“We became accustomed to IT organizations managing the database architectures or the data stores and any of the enterprise wide user-facing applications,” said Steven Escaravage, vice president in Booz Allen Hamilton’s Strategic Innovation Group. “I think that’s changed over the last decade, where there’s been a greater focus on data governance, and so you also see IT organizations today managing the process and the systems used to govern data.”

Additionally, as more organizations start analyzing cross-functional data, it becomes apparent that the IT function is necessary.

“IT plays an important part in ensuring that these new and different kinds of data are in a platform or connected or integrated in a way that the business can use. That is the most important thing and something companies struggle with,” said Justin Honaman, a managing director in the Digital Technology Advisory at Accenture.

Where analytics talent resides varies greatly

There’s an ongoing debate about where analytics talent should reside in a business. It’s common for departments to have their own business analysts, but data science teams, including data analysts, often reside in IT.

The argument in favor of a centralized analyst team is visibility across the organization, though domain-specific knowledge can be a problem. The argument in favor of decentralization is the reverse. Accenture’s Honaman said he’s seeing more adoption of the decentralized model in large companies.

Hybrid analytics teams, like hybrid IT, combine a center of excellence with dedicated departmental resources.

Hot analytics techs

Machine learning and AI are becoming popular features of analytics solutions. However, letting machine learning loose on dirty and biased data can lead to spurious results; the value of predictive and prescriptive analytics depends on their accuracy.

As machine learning-based applications become more in vogue, analytics success depends on “the quality of not just the data, but the metadata associated with it [that] we can use for tagging and annotation,” said Booz Allen Hamilton’s Escaravage. “If IT is not handling all of that themselves, they’re insisting that groups have metadata management and data management capabilities.”

Meanwhile, the IoT is complicating IT ecosystems by adding more devices and edge analytics to the mix. Edge analytics ensures that the enterprise can filter meaningful data out of the mind-boggling amount of data IoT devices can collect and generate.
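As a minimal sketch of what that edge-side filtering can look like (an assumption for illustration, not a description of any vendor’s product), a simple deadband filter forwards a sensor reading only when it changes meaningfully from the last value sent upstream:

```python
def deadband_filter(readings, threshold=0.5):
    """Forward a reading only when it moves more than `threshold`
    from the last value sent upstream: simple edge-side data reduction."""
    sent, last = [], None
    for r in readings:
        if last is None or abs(r - last) > threshold:
            sent.append(r)
            last = r
    return sent


# Hypothetical temperature samples: mostly noise around a steady value,
# with two genuine shifts worth transmitting.
readings = [20.0, 20.1, 20.2, 20.1, 22.5, 22.6, 22.4, 20.0]
print(deadband_filter(readings))  # only the meaningful jumps leave the device
```

Even this crude rule cuts eight samples down to three, which is the basic economics of edge analytics: spend a little computation at the device to avoid shipping and storing data that carries no new information.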

In short, the analytical maturity of organizations can’t advance without IT’s involvement.

Just a Bit of Advice: Strategies for Successful Analytics

A few helpful hints as you move through your analytics journey.

If you’re just getting started on your data and analytics journey, think before you act.

Steven Escaravage of Booz Allen Hamilton noted, “I tell clients to take a step back before they invest millions of dollars.” Among other things, he said, make sure to have a good foundation around the questions you’re trying to answer today and the questions you perceive are coming down the path.

“Let’s put together a data wish list and compare it to where we’re at, because usually you’re going to have to make investments in generating data to answer questions effectively,” he added. All the other pieces about methods and techniques, tools and solutions follow these actions.

If you’re at the pilot stage, beware of scalability challenges.

“Very rarely for sophisticated analytic problems would I lean on a typical Python pilot deployment in production,” said Escaravage. “You’d typically move to something you knew could scale and wouldn’t become a bottleneck in the computational pipeline.”

If you’re in production, you may be analyzing all kinds of things, but are you measuring the effectiveness of your solutions, processes and outcomes? If not, you may not have the complete feedback loop you think you have.

5 Cross-functional Analytics Challenges

More businesses are attempting to optimize their operations with the help of analytics, although most of the activity still takes place at the departmental level. Additional value can be gained from cross-functional analytics, but it represents a different kind of challenge because the functional units tend to use different systems and data owners often want to maintain control of their data.

According to recent research by EY and Forbes Insights, 60% to 70% of companies now use analytics at a departmental level, up from 30% to 40% in 2015.

“Companies have had success in one part of the business, and they then try to replicate that in other departments,” said Chris Mazzei, global chief analytics officer and emerging technology leader at EY. “The companies that are more mature across a number of different dimensions, those we would put into the ‘leading’ category, are outperforming the others. They’re reporting higher revenue growth, better operating margins and more effective risk management, so at least there’s a correlation between analytics adoption and driving better business outcomes.”

Here are a few things that can hold cross-functional analytics back.

Analytics Isn’t Part of the Business Strategy

Cross-functional analytics is more likely to yield competitive advantages and drive more business value when the analytics are an integral part of the business model and strategy.

“The vast majority of organizations still are not able to say that their business strategy really reflects the role analytics plays in how they’re trying to compete,” said Mazzei. “There’s this fundamental misalignment that can occur when the leadership team is not able to have a consistent view of where and how analytics is making the biggest impact on the business strategy.”

Operating Models Don’t Facilitate Cross-Functional Analytics

Executing an analytics strategy at a departmental level such as finance or marketing is relatively easy because it’s clear that resources need to be dedicated to the effort. When it’s a cross-functional endeavor, who’s responsible for providing, funding and managing those resources? What should the data flow look like and how can that be facilitated?

“If you’re trying to deploy analytics across the organization, the operating model becomes much more important,” said Mazzei. “Do we have a centralized team? Do we distribute analytics resources in the individual business units or functions? What’s the relationship between those teams?”

Like bimodal IT, bimodal analytics services benefit the enterprise and the departments simultaneously. The centralized group helps facilitate best practices and ensures appropriate governance while dedicated resources tend to have specialized knowledge of that particular function and its analytics requirements.

The Initiatives Aren’t Designed Well

Analytics efforts should drive business value. There’s a lot to do, but not everything will have the same level of impact or necessarily achieve the desired results, so the desired business outcomes should drive the prioritization of analytics efforts.

“Initiative design is really important, as is having competent frameworks and processes to use for it,” said Mazzei.

Not surprisingly, companies are still at very different stages of maturity in terms of having a consistent process for designing an analytics initiative. The more analytically mature a company is, the more likely it is to have common frameworks, a common understanding of what the term “analytics initiative” means, and common tools for executing one, Mazzei said.

Analytics Isn’t Part of Business Operations

As companies embrace analytics and mature in their use of analytics, business processes tend to change. It’s wise to think about that and other impacts early on.

“The more mature companies are thinking about that earlier in the process and using an initial point of view about what that intervention needs to be to inform how you design the analytics themselves,” said Mazzei. “A lot of companies don’t think about that early enough.”

According to the report, design intervention is “Translating all the upfront goal-setting, modeling, and methodology into action—making analytics insights an integral part of business operations.”

The True Value of Analytics Isn’t Understood

Interestingly, analytics enables organizations to measure all kinds of things and yet success metrics may not have been defined for the analytics initiatives themselves.

“That really matters because [if] you can learn what’s working and what’s not earlier on, you can change the nature of the intervention or the analytic you’re building,” said Mazzei. “It’s that feedback loop you have in place.”
