Lisa Morgan's Official Site

Strategic Insights and Clickworthy Content Development

Author: misslisa (page 2 of 10)

How Valuable Is Your Company’s Data?

Companies are amassing tremendous volumes of data, which many consider among their greatest assets. Yet few business leaders can articulate what their company’s data is worth.

Successful data-driven digital natives understand the value of their data, and their valuations depend on sound applications of that data. Increasingly, venture capitalists, financial analysts and board members will expect startup, public company and other organizational leaders to explain the value of their data in terms of opportunities, top-line growth, bottom-line improvement and risks.

For example, venture capital firm Mercury Fund recently analyzed SaaS startup valuations based on market data that its team has observed. According to Managing Director Aziz Gilani, the team confirmed that SaaS company valuations, which range from 5x to 11x revenue, depend on the underlying metrics of the company. The variable that determines whether those companies land in the top or bottom half of the spectrum is the company’s annual recurring revenue (ARR) growth rate, which reflects how well a company understands its customers.

Mercury Fund’s most successful companies scrutinize their unit economics “under a microscope” to optimize customer interactions in a capital-efficient manner and maximize their revenue growth rates.
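The mechanics described above can be sketched in a few lines of code. This is a hypothetical illustration only: the 5x-11x multiple range comes from the article, but the linear mapping from ARR growth rate to a point in that range is an assumption for illustration, not Mercury Fund's actual model.

```python
# Hypothetical sketch: map ARR growth rate onto the 5x-11x revenue
# multiple range described above. The interpolation is an assumption.

def estimate_valuation(arr: float, growth_rate: float) -> float:
    """Estimate a SaaS valuation from ARR and ARR growth rate."""
    low, high = 5.0, 11.0
    # Assume growth rates between 0% and 100% interpolate linearly
    # across the multiple range; clamp anything outside that band.
    t = min(max(growth_rate, 0.0), 1.0)
    multiple = low + t * (high - low)
    return arr * multiple

# A company with $10M ARR growing 50% year over year:
print(estimate_valuation(10_000_000, 0.5))  # 80000000.0
```

The point of the sketch is that the same revenue supports very different valuations depending on growth, which is why the article calls ARR growth the deciding variable.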

For other companies, the calculus is not so straightforward and, in fact, it’s very complicated.

Direct value

When business leaders and managers ponder the value of data, their first thought is direct monetization, which means selling the data they have.

“[I]t’s a question of the holy grail because we know we have a lot of data,” said David Schatsky, managing director at Deloitte. “[The first thought is] let’s go off and monetize it, but they have to ask themselves the fundamental questions right now of how they’re going to use it: How much data do they have? Can they get at it? And, can they use it in the way they have in mind?”

Data-driven digital natives have a better handle on the value of their data than the typical enterprise because their business models depend on collecting data, analyzing that data and then monetizing it. Usually, considerable testing is involved to understand the market’s perception of value, although a shortcut is to observe how similar companies are pricing their data.

“As best as I can tell, there’s no manual on how to value data but there are indirect methods. For example, if you’re doing deep learning and you need labeled training data, you might go to a company like CrowdFlower and they’d create the labeled dataset and then you’d get some idea of how much that type of data is worth,” said Ben Lorica, chief data officer at O’Reilly Media. “The other thing to look at is the valuation of startups that are valued highly because of their data.”

Observation can be especially misleading for those who fail to consider the differences between their organization and the organizations they’re observing. The business models may differ, the audiences may differ, and the amount of data the organization has and the usefulness of that data may differ. Yet, a common mistake is to assume that because Facebook or Amazon did something, what they did is a generally-applicable template for success.

However, there’s no one magic formula for valuing data because not all data is equally valuable, usable or available.

“The first thing I look at is the data [a client has] that could be turned into data-as-a-service and if they did that, what is the opportunity the value [offers] for that business,” said Sanjay Srivastava, chief digital officer at global professional services firm Genpact.

Automation value

More rote and repeatable tasks are being automated using chatbots, robotic process automation (RPA) and AI. The question is, what is the value of the work employees do in the absence of automation and what would the value of their work be if parts of their jobs were automated and they had more time to do higher-value tasks?

“That’s another shortcut to valuing the data that you already have,” said O’Reilly’s Lorica.
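The automation-value question posed above, what employees' time is worth before and after part of the job is automated, reduces to simple arithmetic. The following is a rough back-of-the-envelope sketch; all figures and the formula itself are hypothetical illustrations, not a method from the article.

```python
# Hypothetical sketch of "automation value": the value gained when
# automation frees up hours that can be spent on higher-value work.

def automation_value(hourly_cost: float, hours_per_week: float,
                     share_automated: float, uplift: float) -> float:
    """Weekly value gained by shifting automated hours to higher-value work.

    share_automated: fraction of the week that automation frees up.
    uplift: how much more valuable the reclaimed time is (e.g. 0.5 = 50%).
    """
    freed_hours = hours_per_week * share_automated
    return freed_hours * hourly_cost * uplift

# 40-hour week at $60/hour, 25% automated, reclaimed time worth 50% more:
print(automation_value(60, 40, 0.25, 0.5))  # 300.0
```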

Recombinant value

Genpact also advances the concept of “derivative opportunity value,” which means creating an opportunity or an entirely new business model by combining a company’s data with external data.

For example, weather data by zip code can be combined with data about prevalent weeds and available core seed attributes by zip code. Agri-food companies use such data to determine which pesticides to use and to optimize crops in a specific region.

“The idea is it’s not just selling weather data as a service, that’s a direct opportunity,” said Srivastava. “The derivative opportunity value is about enhancing the value of agriculture and what value we can drive.”

It is also possible to do an A/B test with and without a new dataset to determine the value before and after the new data was added to the mix.
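The A/B idea above can be made concrete: fit the same simple model with and without a candidate dataset and treat the reduction in error as a proxy for that dataset's value. The data and the "model" below (a mean-based predictor versus a segmented one) are toy assumptions for illustration only.

```python
# Toy A/B sketch: does adding an external dataset reduce prediction error?

def mean_abs_error(predictions, actuals):
    return sum(abs(p - a) for p, a in zip(predictions, actuals)) / len(actuals)

# Outcomes we want to predict, e.g. weekly sales per store (invented).
actuals = [100, 120, 80, 95]

# Variant A: no external data -> predict the overall historical mean (98.75).
baseline = [98.75] * 4
# Variant B: external data (say, local weather) lets us segment predictions.
augmented = [102, 118, 83, 96]

# A positive difference suggests the new dataset carries value.
value_signal = mean_abs_error(baseline, actuals) - mean_abs_error(augmented, actuals)
print(round(value_signal, 2))  # 9.25
```

In practice the error reduction would be translated into business terms (revenue retained, fraud prevented) to price the dataset.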

Algorithmic value

Netflix and Amazon use recommendation engines to drive value. For example, Netflix increases its revenue and stickiness by matching content with a customer’s tastes and viewing habits. Similarly, Amazon recommends products, including those that others have also viewed or purchased. In doing so, Amazon successfully increases average order values through cross-selling and upselling.

“Algorithmic value modeling is the most exciting,” said Srivastava. “For example, the more labeled data I can provide on rooftops that have been damaged by Florida hurricanes, the more pictures I have of the damage caused by the hurricanes and the more information I have about claim settlements, the better my data engine will be.”

For that use case, the trained AI system can automatically provide an insurance claim value based on a photograph associated with a particular claim.

Risk-of-loss value

If a company using an external data source were to lose access to that data source, what economic impact would it have? Further, given the very real possibility of cyberattacks and cyberterrorism, what would the value of lost or corrupted data be? Points to consider include the financial impact, which may comprise actual loss, opportunity cost, regulatory fines and litigation settlement values. If the company has cybersecurity insurance, the policy’s coverage limit may differ from the actual claim settlement value and from the overall cost to the company.
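The points listed above suggest a simple way to frame risk-of-loss value: sum the potential financial impacts, then net out what a cyber policy would actually pay given its coverage limit. The following sketch and all of its numbers are hypothetical.

```python
# Hypothetical risk-of-loss framing: gross financial impact of lost or
# corrupted data, minus the capped insurance recovery.

def risk_of_loss(actual_loss: float, opportunity_cost: float,
                 regulatory_fines: float, settlements: float,
                 claim_value: float, coverage_limit: float) -> float:
    gross = actual_loss + opportunity_cost + regulatory_fines + settlements
    # The policy pays at most its coverage limit, which may fall
    # short of the actual claim value.
    recovered = min(claim_value, coverage_limit)
    return gross - recovered

# $2M direct loss, $1M opportunity cost, $500K fines, $750K settlements,
# a $3M claim against a policy capped at $2M:
print(risk_of_loss(2e6, 1e6, 5e5, 7.5e5, 3e6, 2e6))  # 2250000.0
```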

A bigger risk than data loss is the failure to use data to drive value, according to Genpact’s Srivastava.

There’s no silver bullet

No single equation can accurately assess the value of a company’s data. The value of data depends on several factors, including the usability, accessibility and cleanliness of the data. Other considerations are how the data is applied to business problems and what the value of the data would be if it were directly monetized, combined with other data, or used in machine learning to improve outcomes.

Further, business leaders should consider not only what the value of their company’s data is today, but the potential value of new services, business models or businesses that could be created by aggregating data, using internal data or, more likely, using a combination of internal and external data. In addition, business leaders should contemplate the risk of data loss, corruption or misuse.

While there’s no standard playbook for valuing data, expect data valuation and the inability to value data to have a direct impact on startup, public company, and merger and acquisition target valuations.

Lisa Morgan Advances Digital Ethics

Lisa Morgan is on a mission to educate the high-tech industry about the importance of digital ethics. In furtherance of that goal, she has been appointed Program Manager, Content and Community, of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems Outreach Committee. In that capacity, she is responsible for content and community development for the worldwide membership, which now exceeds 1,050 members. She is also a contributor to the group’s Ethically Aligned Design document, which is being cooperatively developed by technologists, business leaders, lawmakers, attorneys and others dedicated to advancing A/IS ethics.

Why Operationalizing Analytics is So Difficult

Today’s businesses are applying analytics to a growing number of use cases, but analytics for analytics’ sake has little, if any, value. The most analytically astute companies have operationalized analytics, but many of them, particularly the non-digital natives, have faced several challenges along the way getting the people, processes and technology aligned in a way that drives value for the business.

Here are some of the hurdles that an analytics initiative might encounter.

Analytics is considered a technology problem

Some organizations consider analytics a technology problem, and then they wonder why the ROI of their efforts is so poor. While having the right technology in place matters, successful initiatives require more.

“The first key challenge is designing how and in what way an analytics solution would affect the outcome of the business,” said Bill Waid, general manager of Decision Management at FICO. “We start by modeling the business problem and then filling in the analytic pieces that address that business problem. More often than not, there’s a business process or business decision that needs to be incorporated into the model as we build the solution.”

Framing the business problem is essential, because if the analytics don’t provide any business value, they won’t get used.

“Better than 80% of analytics never end up being used. A lot of that stems from the fact that an analysis gets built and it might make sense given the dataset but it’s not used to make something happen,” said Waid. “That’s probably the hardest element.”

Placing analytics in the hands of the business requires access to the right data, but governance must also be in place.

“[T]he technical aspects are becoming easier to solve and there are many more options for solving them, so the people and the process challenges that you’ll face obviously have to come along,” said Bill Franks, chief analytics officer at the International Institute for Analytics (IIA). “In a non-digital-native company, the people and process progress does not match the technology progress.”

Operationalizing analytics lacks buy-in

Many analytics initiatives have struggled to get the executive and organizational support they need to be successful. Operationalizing analytics requires the same thing.

“When you operationalize analytics, you’re automating a lot of decisions, so the buy-in you require from all of the various stakeholders has to be high,” said IIA’s Franks. “If you’re a digital native, this is what you do for a living so people are used to it. When you’re a large, legacy company dipping your toe into this, the first couple of attempts will be painful.”

For example, if an organization is automating what used to be batch processes, there need to be more safety checks, data checks, and accuracy checks. Chances are high that everything won’t be done right the first time, so people have to get comfortable with the concept of iteration, which is just part of the learning process.

Analytical results are not transparent

If your company operates in a regulated environment, you need to be able to explain an analytical result. Even if you’re not in a regulated industry, business leaders, investors and potential M&A partners may ask for an explanation.

“We refer to it as ‘reasoning code’ or ‘the outcomes,’ but in AI it’s a form of explainable AI where you can explain to a business owner or a business user why the analytics came to the conclusion it came to,” said FICO’s Waid. “The second thing that you need to provide the business person with is some kind of dashboard for them to be able to change, adjust or accommodate different directions.”

4 Ways Companies Impede Their Analytics Efforts

Businesses in the race to become “data-driven” or “insights-driven” often face several disconnects between their vision of an initiative and their execution of it. Of course, everyone wants to be competitive, but there are several things that differentiate the leaders from the laggards. Part of it is weathering the growing pains that companies tend to experience, some of which are easier to change than others. These are some of the stumbling blocks.

Business objectives and analytics are not aligned

Analytics still takes place in pockets within the majority of organizations. The good news is that various functions are now able to operate more effectively and efficiently as a result of applying analytics. However, there is greater power in aligning efforts with the strategic goals of the business.

In a recent research note, Gartner stated, “Internally, the integrative, connected, real-time nature of digital business requires collaboration between historically independent organizational units. To make this collaboration happen, business and IT must work together on vision, strategy, roles and metrics. Everyone is going to have to change, and everyone is going to have to learn.”

All of that requires cultural adjustment, which can be the most difficult challenge of all.

There’s insight but no action

It’s one thing to get an insight and quite another to put that insight into action. To be effective, analytics need to be operationalized, which means weaving analytics into business processes so that insights can be turned into meaningful actions. Prescriptive analytics is part of it, but fundamentally, business processes need to be updated to include analytics. A point often missed is that decisions and actions are not ends in themselves. They, too, need to be analyzed to determine their effectiveness.

An EY presentation stresses the need to operationalize analytics. Specifically, it says, "The key to operationalizing analytics is to appreciate the analytics value chain."

Interestingly, when most of us think about “the analytics value chain” we think of data, analytics, insights, decisions and optimizing outcomes. While that’s the way work flows, EY says our thought process should be the reverse. Similarly, to optimize a process, one must understand what that process is supposed to achieve (e.g., thwart fraud, improve customer experience, reduce churn).

They’re not looking ahead

Less analytically mature companies haven’t moved beyond descriptive analytics yet. They’re still generating reports, albeit faster than they used to, because IT and lines of business tend to agree that self-service reporting is better for everyone. Gartner says “the BI and analytics market is in the final stages of a multiyear shift from IT-led, system-of-record reporting to business-led, self-service analytics. As a result, the modern business intelligence and analytics platform has emerged to meet new organizational requirements for accessibility, agility and deeper analytical insight.”

Still, organizations can only get so far with descriptive analytics. If they want to up their competitive game, they need to move to predictive and prescriptive analytics.

Poor data quality prevents accurate analytics

If you don’t have good data or a critical mass of the right data, your analytical outcomes will fall short. Just about any multichannel (and sometimes even single-channel) communication experience with a bank, a telephone company, a credit card company or a vendor support organization will prove that data quality is still a huge issue. Never mind that some of these companies are big brands that invest staggering amounts of money in technology, including data and analytics technologies.

In a typical telephone scenario, a bot asks the customer to enter an account number or a customer number. If the customer needs to be transferred to a live customer service representative (CSR), chances are the CSR will ask the customer to repeat the number because it doesn’t come up on their screen automatically. If the CSR can’t resolve the issue, then the call is usually transferred to a supervisor or different department. What was your name and number again? It’s a frustrating problem that’s all too common.

The underlying problem is that customer information is stored in different systems for different purposes, such as sales, CRM and finance.

I spoke with someone recently who said a company he worked with had gone through nearly 20 acquisitions. Not surprisingly, data quality was a huge issue. The most difficult part was dealing with the limited fields in a legacy system. Because the system did not contain enough of the appropriate fields in which to enter data, users made up their own workarounds.

These are just a few of the challenges organizations face on their journey.

How AR/VR Analytics May Help Your Business

Augmented reality and virtual reality are gaining traction, and some early adopters are already trying to figure out what the technologies mean for their businesses. The use cases vary from industry to industry, but the idea is to leverage virtual assets (AR) or create a completely virtual environment (VR) to provide a low-cost yet effective means of accomplishing what is otherwise expensive and difficult in the real world.

The possibilities seem limited only by the imagination; however, adoption numbers underscore the early nature of the products and related analytics among businesses. For example, a recent survey by IT trade association CompTIA shows that about 21% of responding organizations had some kind of AR or VR initiative in place.

“Most organizations realize there’s some potential because they saw what happened with Pokémon Go last year, but it’s going to take some time to happen,” said Tim Herbert, senior VP of research and market intelligence at CompTIA.

Right now, people are focused on the visualization aspects and what they mean, along with the technology and the talent needed. Interest in analytics will come later, as it becomes clear that what happens in an AR or VR environment needs to be monitored, analyzed and optimized.

“VR analytics can empower organizations to better understand and connect with their audiences. It’s about knowing exactly how your audience interacted with your content and, on a psychological level, how emotionally salient they found it,” said Joshua Setzer, CEO of VR/AR solutions provider Lucid Dream. “How you look at it depends upon your own job function and the objectives behind your project. A marketer may want to [understand] which parts of a message resonate with an audience and which don’t. A trainer may wish to tease out the psychophysical signatures of learning to understand which elements of content are being imprinted in memory and which are more likely to be forgotten.”

Companies in different industries are exploring AR/VR technologies to see what impact they have on sales, marketing, HR, product development and more.

“If you think through some of those use cases, you can see how having some of the new streams of data would be valuable to an organization,” said Herbert.

Following are a few things your organization can start thinking about today.

Why Businesses Must Start Thinking About Voice Interfaces, Now

Voice interfaces are going to have an immense impact on human-to-machine interaction, eventually replacing keyboards, mice and touch. For one thing, voice can be a much more efficient way to interact with computers, laptops, tablets and smartphones. More importantly, voice interfaces provide an opportunity to develop closer relationships with customers based on a deeper understanding of those customers.

Despite the popularity of Alexa among consumers, one might assume that voice interfaces are aspirational at best for businesses. However, a recent Capgemini conversational commerce study tells a different story. The findings indicate that 40% of the 5,000 consumers interviewed would use a voice assistant instead of a mobile app or website. Within three years, active users expect 18% of their total expenditures to take place via a voice assistant, a six-fold increase from today. The study also concluded that voice assistants can improve Net Promoter Scores by 19%. Interestingly, this was the first such study by Capgemini.

“Businesses really need to come to grips with voice channels because they will change the customer experience in ways that we haven’t seen since the rise of ecommerce,” said Mark Taylor, chief experience officer, DCX Practice, at Capgemini. “I think it’s going to [have a bigger impact] than ecommerce because it’s broader. We call it ‘conversational commerce,’ but it’s really voice-activated transactions.”

Voice interfaces need to mimic humans

The obvious problem with voice interfaces is their limited understanding of human speech, which isn’t an easy problem to solve. Their accuracy depends on understanding the words spoken in context, including the emotions of the speaker.

“We’re reacting in a human way to a very robotic experience, and as that experience evolves, it will only increase our openness and willingness to experience that kind of interaction,” said Taylor. “Businesses have recognized that they’re going to need a branded presence in voice channels, so some businesses have done a ton of work to learn what that will be.”

For example, brands including Campbell’s Soup, Sephora and Taco Bell are trying to understand how consumers want to interact with them, what kind of tone they have as a brand and what to do with the data they’re collecting.

“Brands have spent billions of dollars over the years representing how they look to their audience,” said Taylor. “Now they’re going to have to represent how they sound. What is the voice of your brand? Is it a female or male voice, a young voice or an older voice? Does it have a humorous or dynamic style? There are lots of great questions that will need to be addressed.”

Don’t approach voice like web or mobile

Web and mobile experiences guide users down a path that is meant to translate human thought into something meaningful, but the experience is artificial. In web and mobile experiences, it’s common to search using keywords or step through a pre-programmed hierarchy. Brands win and lose market share based on the customer experience they provide. The same will be true for voice, but the difference is that voice will enable deeper customer relationships.

Interestingly, in the digital world, voice has lost its appeal. Businesses are replacing expensive call centers with bots. Meanwhile, younger generations are using smartphones for everything but traditional voice phone calls. Voice interfaces will change all of that, albeit not in the way older generations might expect. In Europe, for example, millennials prefer to use a voice assistant in stores rather than talk to a person, Taylor said.

“What’s interesting here is the new types of use cases [because you can] interact with customers where they are,” said Ken Dodelin, VP of Conversational AI Products at Capital One.

Instead of surfing the web or navigating through a website, users can simply ask a question or issue a command.

“[Amazon’s] Dash button was the early version of a friction-free thing where someone can extend their finger and press a button to go from thought to action,” said Dodelin. “Alexa is a natural evolution of that.”

In banking, there is considerable friction between wanting money or credit and getting it. Capital One has enabled financial account access via voice on the Alexa and Cortana platforms. It is also combining visual and voice access on the Echo Show, because humans communicate information faster by speaking but consume information faster visually.

“[I]t usually boils down to what’s the problem you’re solving and how do you take friction out of things,” said Dodelin. “When I think about what it means for a business, it’s more about how can we [get] good customer and business outcomes from these new experiences.”

When Capital One first started with voice interfaces, customers would ask about the balance on their credit cards, but when they asked about the balance due, the system couldn’t handle it.

“Dialogue management is really important,” said Dodelin. “The other piece is who or what is speaking?”

Brand image is reflected in the characteristics of the voice interface. Capital One didn’t have character development experts, so it hired one from Pixar who now leads its conversational AI design work.

“Natural language processing technology has progressed so much that we can expect it to become an increasingly common channel for customer experience,” said Dodelin. “If they’re not doing it directly through a company’s proprietary voice interface, they’re doing it by proxy through Alexa, Google Home or Siri and soon through our automobiles.”

The move to voice interfaces is going to be a challenge for some brands and an opportunity for others. Now is the time for companies to experiment and if they’re successful, leap ahead of their competitors and perhaps even set a new standard for creating customer experiences.

Clearly, more work needs to be done on natural language processing, but already some consumers have been tempted to thank Alexa, despite its early-stage capabilities, said Capgemini’s Taylor.

In short, voice interfaces are here and evolving rapidly. What will your brand do?

B2B Chatbots are Poised for Explosive Growth

Chatbot use is on the rise, and the use cases are growing. According to Gartner, by 2021, more than 50% of enterprises will spend more each year on bots and chatbot creation than traditional mobile app development.

In a recent blog, Gartner Brand Content Manager Kasey Panetta said, “Individual apps are out. Bots are in. In the ‘post-app era,’ chatbots will become the face of AI, and bots will transform the way apps are built. Traditional apps, which are downloaded from a store to a mobile device, will become just one of many options for customers.”

Chatbots and virtual assistants such as Alexa are being interwoven into consumer lifestyles. KPMG Digital Enablement Managing Director Michael Wolf says his company sees tremendous potential on the B2B side.

“B2B chatbots and virtual assistants could be the interface across multiple systems,” said Wolf. “We’re seeing a lot of growth in that, and the enterprise platform companies are making investments there, either acquiring the capability or acquiring the platforms to do that stuff.”

Implementing chatbots differs from implementing virtual assistants, based on their respective designs and capabilities. Traditional chatbots are script-based, responding only to pre-programmed inputs. Virtual assistants use machine learning to continually improve their ability to understand and respond appropriately to natural language.
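To make the script-based limitation concrete, here is a toy scripted chatbot: it matches only pre-programmed inputs, which is exactly what ML-backed virtual assistants are meant to overcome. The intents and responses are invented for illustration.

```python
# Toy script-based chatbot: pre-programmed inputs only, no learning.

SCRIPT = {
    "invoice status": "Invoice #1042 was paid on the 3rd.",
    "delivery status": "Your order shipped and arrives Tuesday.",
}

def scripted_reply(utterance: str) -> str:
    key = utterance.lower().strip()
    # Anything outside the script falls through to a canned fallback;
    # there is no natural-language understanding here.
    return SCRIPT.get(key, "Sorry, I don't understand. Try 'invoice status'.")

print(scripted_reply("Invoice status"))    # matches the script
print(scripted_reply("where's my stuff"))  # falls back
```

A virtual assistant would instead classify "where's my stuff" as a delivery-status intent from training data, which is why Wolf stresses training with real users' phrasing.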

“One of the problems with bots is modeling what they think customers want rather than training the system with real people, not just employees and customers, but the person asking the questions. What are they asking? How are they asking it?” said Wolf. “If you just try to follow your same traditional route paradigms without concentrating on learning and design thinking, you’re going to get less desirable outcomes.”

Expanding B2B use cases

Like other forms of automation, chatbots and virtual assistants are seen as human-augmenting technologies that enable humans to focus on less repetitive, higher-value tasks.

David Nichols, Americas Innovation and Alliance Leader for EY Advisory sees numerous opportunities for B2B chatbots, including internal employee communications, most HR interactions, and everyday interactions such as checking invoice status, delivery status and updates, and customer service interactions.

“The biggest challenge with B2B companies is getting suppliers and customers to use the chatbot functionality,” said Nichols. “Also, B2B companies don’t usually place the same priority on customer personalization as B2C companies. As a result, the customer service interactions at B2B companies don’t usually have the same level of detailed customer segmentation and interaction history. This will present a challenge when developing the use cases and scenarios for the bot conversation flow.”

In HR scenarios, chatbots provide intelligent means of re-engaging with candidates, specifically sourcing, screening, and updating candidate information.

“[Using] other methods, these interactions can take days to weeks for an organization to handle,” said Chris Collins, CEO of recruitment automation company RoboRecruiter. “Chatbots significantly increase the speed and scale at which you can operate, down to hours, and combined with AI they can keep the data active.”

That could lead to more positive recruiting experiences for candidates, contract workers, and employers. Similarly, from an outward-facing standpoint, chatbots and virtual assistants could improve brands’ relationships with customers.

“It might seem counter-intuitive that an AI-driven chatbot can help companies build relationships with their customers, but remember, the ‘Millennial Mindset’ is quickly becoming the dominant purchasing orientation, and those customers want to efficiently self-service,” said Anthony Smith, CEO of CRM solution provider Insightly. “In 2018, B2B chatbots will be utilized not only for lead generation, but also as virtual business assistants, and they will handle different tasks such as scheduling and cancelling meetings, setting alarms, etc.”

Depending on the enterprise applications chatbots are integrated with, they’ll be able to undertake more complex tasks, such as placing orders, invoicing and other B2B activities that are time-consuming and usually require precision. However, there are challenges.

“Integrating chatbots with the major payment systems and with social media is tough and it will probably take time, but once this is covered, chatbots will be able to take orders directly through social accounts and that will be a revolution,” said Insightly’s Smith.

Application integration is critical

Automating business processes requires tight integration with enterprise systems. Exactly how many and which systems depends on the purpose of the chatbot. However, because user experience is vitally important, it’s critical to understand what the users of such systems will want to do with them.

“Some are just trying to redo web and mobile rather than using a design approach,” said KPMG’s Wolf. “There’s an assumption that because it’s not visual, it doesn’t involve design.”

In B2B contexts, many repetitive tasks take place within business processes, some of which require integrations with different types of systems.

“The injection of the chatbot is allowing consumer-like experiences. ‘I want my ERP to feel like Google’ and ‘I want my CRM to feel like Amazon’ is a constant discussion for my customers,” said KPMG’s Wolf. “Applying an enterprise chatbot is obvious in that scenario.”

The end goal for virtual assistants is orchestrating everything necessary to answer a query or execute a request, which can involve a complex web of interconnections among disparate systems.

In short, the best way forward is iterative because requirements, technology and user expectations are constantly changing.

4 Ways to Improve Data Storytelling

Analytical results are often interpreted differently by different people. Sometimes the conclusions presented don’t align with intuition. Differences in experience and expertise can also come into play. An effective way to align thinking is through data storytelling, although there are better and worse ways to do it.

Data storytelling typically includes text, visualizations and sometimes tables to illustrate a developing trend or an issue that requires attention, if not action. Data storytelling can make results more memorable and impactful for the audience, assuming the presentation is done effectively. Following are a few things to consider.

1. Consider the Audience

Data scientists are often considered poor data storytellers because they struggle to align a story with the needs and knowledge level of the audience. Sometimes others are brought in to translate all the technical jargon into something that is meaningful to business leaders.

Similarly, different parts of a business may require a slightly different focus that uses different language and maybe even different types of data visualizations to have the desired effect, which is understanding analytical results in context.

2. Tell A Story

Effective stories have a beginning, a middle, and an end. The beginning of a story provides context, setting the stage for the story itself. The middle tells the story, and the end usually includes a set of possibilities. Getting the end right is important because insights without action have little value. Are there actionable insights from the data? How can the results be used to drive strategy? In a business context, is there a significant revenue opportunity or an opportunity for cost savings? How much more likely is it that one course of action will succeed versus another? If you provide curated data points and visualizations that support the key points, you can often pre-emptively address the most likely questions and objections.

Effective storytelling also addresses issues beyond the “what.” Take a sales situation, for example. Heads of sales are constantly monitoring progress against sales targets. Let’s say sales fell short of or exceeded expectations last quarter. That leads to other questions, such as: Why were sales better or worse than we expected? How could we use those insights to turn the situation around or increase sales even further? How well do we understand our customer base and their requirements? Which levers work well and which don’t?

With some solid analytics and effective data storytelling, everyone in the room — the head of sales, the C-suite and her team — can have a common understanding of what impacted the sales results, why, how things are changing and what that means going forward for the sales team, products, marketing, etc.

Data storytelling should also explain why and how the analysis was performed and whether hypotheses were proven or disproven, in addition to the important findings and what those findings mean for the audience. Some people make the mistake of showing the many steps required for an analysis to demonstrate how challenging the exercise was, which adds little, if any, value.

3. Quality Matters

Great stories can be derailed by simple mistakes, such as misspellings, a lack of focus and a propensity to demonstrate the mastery of a software program to the point of distracting the audience.

Misspellings and grammatical errors tend to be caught by modern software, though not always all of them. Some tools have default settings that limit the amount of text that can be included; however, that’s usually configurable. Sadly, it’s possible to overload stories with so much noise that the audience has trouble staying focused; in other words, the point gets lost. Similarly, getting too creative with the colors used in data visualizations can pull the audience’s attention away from the point.

Also consider the presentation of the data in relation to the data itself. On a scale of one to two, a move from one to two is a 100% increase. Against a baseline of 25, 50, 100 or 1,000, the same single-unit increase would look far less dramatic.
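The scale effect can be sketched in a few lines of Python. The numbers here are purely illustrative, not drawn from any real dataset:

```python
# A minimal sketch of how the same absolute change reads very
# differently depending on the baseline it is measured against.

def percent_change(old, new):
    """Percent change from old to new."""
    return (new - old) / old * 100

# On a scale of one to two, a move from 1 to 2 is a 100% increase.
print(percent_change(1, 2))  # 100.0

# Against a baseline of 1,000, the same +1 move is only about a
# tenth of a percent, so it would barely register on a chart.
print(percent_change(1000, 1001))
```

The takeaway for storytellers: state both the relative and the absolute change, and choose axis scales that keep the two in honest proportion.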

4. Be Prepared to Address Alternatives

One of the reasons businesses have placed greater emphasis on analytics over traditional reporting is the ability to interact with data rather than passively consume it. Data storytelling has a parallel: a move away from the traditional, static business presentation format, which tends to reserve questions for the end, toward interactive storytelling in which questions and alternate points of view can be explored live.

Generally speaking, data storytellers should be prepared for questions and challenges regardless. Why wasn’t something else explored? If a particular variable were added or subtracted, what would the effect be? Of the X possibilities, which is the most likely to occur and why?

How SaaS Strategies Are Evolving

Enterprises are subscribing to more SaaS services than ever, with considerable procurement happening at the departmental level. Specialized SaaS providers target problems that those departments want solved quickly. Because SaaS software tends to be easy to set up and use, there appears to be no need for IT’s involvement, until something goes wrong.

According to the Harvey Nash/KPMG 2017 CIO Survey, 91% of the nearly 4,500 CIO and IT leaders who responded expect to make moderate or significant SaaS investments, up from 82% in 2016. The report also states that 40% of SaaS product procurement now happens outside IT.

“IT needs a new operating model,” said Gianna D’Angelo, principal of KPMG CIO Advisory. “CIOs must respond by continuing to focus on operational excellence while adopting a new operating model for IT to drive innovation and value in these changing times.”

Some IT shops are reacting to shadow IT like they reacted to “bring your own device” (BYOD), meaning if you can’t stop it, you have to enable it with governance in mind. However, issues remain.

“In the last three years, we’ve put policies and some governance in place, but it doesn’t matter. You pull out your credit card, you buy an open source application and I have a virus on my network,” said Todd Reynolds, CTO of WEX Health, which provides a platform for benefit management and healthcare-related financial management. “I don’t even know about it until there’s an issue.”

How SaaS pricing is changing

KPMG’s D’Angelo said most SaaS pricing is based on users or revenue, and that contract timeframes typically run three to five years. There has been some movement toward shorter terms, as low as two years.

Sanjay Srivastava, chief digital officer of Genpact, a global professional services company, said his firm sees a shift from user-based pricing to usage-based pricing, which in Genpact’s case takes the form of a per-item charge for a document or balance sheet, for example.

Regardless of what the SaaS pricing model is, SaaS providers are facing downward pricing pressure. According to Gartner, “Vendors are becoming more creative with their SaaS business models to reflect a need to stand out in the fast-growing subscription economy.”

For its part, WEX Health is responding with new services that drive additional revenue. It has also put some usage-based pricing in place for customers that require elastic compute capabilities. “Mobile is killing us,” said WEX Health’s Reynolds. “You’ve given somebody an application to use on their phone 24/7, so they’re starting to leverage that usage so much more. It’s good people are using [our software] more often, but it requires us to have more storage.”

Longer-term thinking is wise

When departments purchase SaaS software, they usually are seeking relief from some sort of business problem, such as multichannel marketing attribution – studying the set of actions that users take in various environments. What business people often miss is the longer-term requirement to share data across disparate systems.

“If you have half on-premises and half in different clouds, you might have a private cloud, some in Azure and some in Amazon because the technology stack is beneficial to the apps,” said WEX Health’s Reynolds. “Pulling all of that together and making it safe and accessible is the biggest challenge from an operational perspective on the IT side.”

While SaaS systems tend to have APIs that help with data exchange, most enterprises have hybrid environments that include legacy systems, some of which do not have APIs. In the older systems, the data dictionaries may not be up-to-date and Master Data Management (MDM) may not have been maintained. So enterprises often face substantial data quality issues that negatively impact the value they’re getting from their investments.

“If you really want to get value out of [SaaS] — if you want Salesforce to run CRM and you want it to run sales, integrated, and it still has to be connected to ERP — each thing has to be connected,” said Genpact’s  Srivastava. “There’s a lot of back and forth. Planning for that back and forth, and planning well, is really critical.”

Part of that back-and-forth is ensuring that the right governance, compliance and security controls are in place.

Bottom line

There’s more to SaaS investments than may be obvious to the people procuring them. At the same time, IT departments can no longer be the sole gatekeepers of all things tech.

“The challenge for CIOs is enormous, the stakes are large and change efforts of this magnitude take years, but transforming the IT operating model can be done,” said KPMG’s D’Angelo. “Complicating the effort is that IT must continue to support the existing portfolios, including retained infrastructure and legacy applications, during the transformation.”

This means that, for a period of time, IT will have to use a hybrid model comprising both the project-oriented, plan-build-run approach and the next-generation, broker-integrate-orchestrate approach, D’Angelo added.

Tips for Ensuring Winning SaaS Strategies

SaaS software is not a one-size-fits-all proposition. Costs and benefits vary greatly, as do the short-term and long-term trade-offs. Following are a few things you can do along the way to ease the transition.

If you’re just starting out, chances are that most if not all of the software you procure will be SaaS because that’s the way things are going. In addition, SaaS allows for an economic shift to relatively low-cost subscriptions that include upgrades and maintenance (an operational expenditure). This is instead of substantial up-front, on-premises software investments that require subsequent maintenance investments and IT’s help (a capital expenditure). Regardless of what type of software you choose, though, it’s wise to think beyond today’s requirements so you have a better chance of avoiding unforeseen challenges and costs in the future.

If you’re piloting a new type of software, SaaS is probably the way to go because you can usually experiment without a long-term commitment. However, be mindful of the potential integration, security and governance challenges you may encounter as you attempt to connect different data sources.

If you’re in production, you’ll want to continuously assess your requirements in terms of software models, integration, compliance, governance and security. As you continue your move into the cloud, understand what’s holding you back. Finance and HR, for instance, may still hesitate to store their sensitive data anywhere but on-premises. For the foreseeable future, you’ll probably have a hybrid strategy that becomes more cloud-based with time.

At each stage, it’s wise to understand the potential risks and rewards beyond what’s obvious today.

Why Surveys Should Be Structured Differently

If you’re anything like me, you’re often asked to participate in surveys.  Some of them are short and simple.  Others are very long, very complicated, or both.

You may also design and implement surveys from time to time like I do.   If you want some insight into the effectiveness of your survey designs and their outcomes, pay attention to the responses you get.

Notice the Drop-off Points

Complicated surveys that take 15 or 20 minutes to complete tend to show drop-off points at which respondents decided that the time investment required wasn’t worth whatever incentive was offered.  After all, not everyone actually cares about survey results or a 1-in-1,000 chance of winning the latest iPad, for example.  If there’s no incentive whatsoever, long and complicated surveys may be even less successful, even if you’re pinging your own database.

A magazine publisher recently ran such a survey, and boy, was it hairy.  It started out like similar surveys, asking questions about the respondent’s title, affiliation, company revenue and size.  It also asked about purchasing habits – who approves, who specifies, who recommends, etc. for different kinds of technologies.  Then it asked about the respondent’s content preferences for learning about tech (several drop-down menus), using tech (several drop-down menus), purchasing tech (several drop-down menus), and I can’t remember what else.  At that point, one was about 6% done with the survey.  So much for “10 – 15 minutes.”  It took about 10 or 15 minutes just to wade through the first single-digit percentage of it.  One would have to really want that slim chance at the incentive to complete the survey.

In short, the quest to learn everything about everything in one very long and complex survey may end in more knowledge about who took the survey than about how people feel about important issues.

On the flip side are very simple surveys that take a minute or two to answer.  Those types of surveys tend to focus on whether a customer is satisfied or dissatisfied with customer service, rather than delving into the details of opinions about several complicated matters.

Survey design is really important.  Complex fishing expeditions can and often do reflect a lack of focus on the survey designer’s part.

Complex Surveys May Skew Results

Overly complicated surveys may also yield spurious results.  For example, let’s say 500 people agree to take a survey we just launched that happens to be very long and very complex.  Not all of the respondents will get past the who-are-you questions because those, too, are complicated.  Then, as the survey goes on, more people drop off, then more.

The result is that X% of the survey responses at the end of the survey does not mean the same thing as X% earlier in the survey.  What I mean is that 500 people started, maybe 400 got past the qualification portion, and the numbers continue to fall as still more complicated questions arise while the “progress bar” shows little forward movement.  By the end of the survey, far fewer than 500 have participated, maybe 200 or 100.

Of course, no one outside the survey team knows this, including the people in the company who are presented with the survey results.  They only know that 500 people participated in the survey and X% said this or that.

However, had all 500 people answered all the questions, the results of some of the questions would likely look slightly or considerably different, which may be very important.

Let’s say 150 people completed our survey and the last question asked whether they planned to purchase an iPhone 7 within the next three months.  40% of them, or 60 respondents, said yes.  If all 500 survey respondents had answered that same question, I can almost guarantee you the answer would not be 40%.  It might be close to 40%, or it might not be close at all.
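A quick, purely hypothetical simulation makes the point.  When drop-off is not random — say, the people most interested in the product are also the most patient — the completers’ percentage can drift well away from the full group’s.  The starting count, the 30% “true” rate and the completion probabilities below are all invented for illustration:

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

N_START = 500  # people who began the survey

# Invent a "true" answer for everyone who started: roughly 30%
# of the full group would say yes to the final question.
full_sample = [random.random() < 0.30 for _ in range(N_START)]

# Model non-random drop-off: suppose "yes" respondents are more
# engaged and so likelier to reach the final question (80%) than
# "no" respondents are (20%).
completers = [ans for ans in full_sample
              if random.random() < (0.80 if ans else 0.20)]

full_rate = 100 * sum(full_sample) / len(full_sample)
completer_rate = 100 * sum(completers) / len(completers)

print(f"Yes rate among all {N_START} starters: {full_rate:.1f}%")
print(f"Yes rate among {len(completers)} completers: {completer_rate:.1f}%")
# With this (invented) drop-off pattern, the completer rate lands far
# above the full-sample rate, even though nobody changed their mind.
```

The direction of the skew depends entirely on who drops off; the only safe assumption is that a completer-only percentage is not automatically representative of everyone who started.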

So, if you genuinely care about divining some sort of “truth” from surveys, you need to be mindful of how you define and structure the survey, and aware that the data you see may not be telling you the entire story, or even an accurate one.

The point about accuracy is very important, and one that people without some kind of statistical background likely haven’t considered because they view all aggregate numbers as having equal weight and equal accuracy.

I, for one, think that survey “best practices” are going to evolve in the coming years with the help of data science.  While the average business person knows little about data science now, in the future it will likely seem cavalier not to consider the quality of the data you’re getting and what you can do to improve the quality of that data.  Your credibility and perhaps your job may depend on it.

In the meantime, try not to shift the burden of thinking entirely to your survey audience because it won’t do either of you much good.  Think about what you want to achieve, structure your questions in a way that gives you insight into your audience and their motivations (avoid leading questions!), and be mindful that not all aggregate answers are equally accurate or representative, even within the same survey.
