Strategic Insights and Clickworthy Content Development

Why Businesses Must Start Thinking About Voice Interfaces, Now

Voice interfaces are going to have an immense impact on human-to-machine interaction, eventually replacing keyboards, mice and touchscreens. For one thing, voice can be a far more efficient way to interact with computers, laptops, tablets and smartphones. More importantly, voice interfaces provide an opportunity to develop closer relationships with customers based on a deeper understanding of those customers.

Given the popularity of Alexa among consumers, one might assume that voice interfaces are aspirational at best for businesses, but a recent Capgemini conversational commerce study tells a different story. The findings indicate that 40% of the 5,000 consumers interviewed would use a voice assistant instead of a mobile app or website. Active users expect that, within three years, 18% of their total expenditures will take place via a voice assistant, a six-fold increase from today. The study also concluded that voice assistants can improve Net Promoter Scores by 19%. Notably, this was the first such study by Capgemini.

“Businesses really need to come to grips with voice channels because they will change the customer experience in ways that we haven’t seen since the rise of ecommerce,” said Mark Taylor, chief experience officer of Capgemini’s DCX Practice. “I think it’s going to [have a bigger impact] than ecommerce because it’s broader. We call it ‘conversational commerce,’ but it’s really voice-activated transactions.”

Voice interfaces need to mimic humans

The obvious problem with voice interfaces is their limited understanding of human speech, which isn’t an easy problem to solve. Their accuracy depends on understanding spoken words in context, including the emotions of the speaker.

“We’re reacting in a human way to a very robotic experience, and as that experience evolves, it will only increase our openness and willingness to experience that kind of interaction,” said Taylor. “Businesses have recognized that they’re going to need a branded presence in voice channels, so some businesses have done a ton of work to learn what that will be.”

For example, brands including Campbell’s Soup, Sephora and Taco Bell are trying to understand how consumers want to interact with them, what kind of tone they have as a brand and what to do with the data they’re collecting.

“Brands have spent billions of dollars over the years representing how they look to their audience,” said Taylor. “Now they’re going to have to represent how they sound. What is the voice of your brand? Is it a female or male voice, a young voice or an older voice? Does it have a humorous or dynamic style? There are lots of great questions that will need to be addressed.”

Don’t approach voice like web or mobile

Web and mobile experiences guide users down a path that is meant to translate human thought into something meaningful, but the experience is artificial. In web and mobile experiences, it’s common to search using keywords or step through a pre-programmed hierarchy. Brands win and lose market share based on the customer experience they provide. The same will be true for voice, but the difference is that voice will enable deeper customer relationships.

Interestingly, in the digital world, voice has lost its appeal. Businesses are replacing expensive call centers with bots. Meanwhile, younger generations are using smartphones for everything but traditional voice phone calls. Voice interfaces will change all of that, albeit not in the way older generations might expect. In Europe, for example, millennials prefer to use a voice assistant in stores rather than talking to a person, Taylor said.

“What’s interesting here is the new types of use cases [because you can] interact with customers where they are,” said Ken Dodelin, VP of Conversational AI Products at Capital One.

Instead of surfing the web or navigating through a website, users can simply ask a question or issue a command.

“[Amazon’s] dash button was the early version of a friction-free thing where someone can extend their finger and press a button to go from thought to action,” said Dodelin. “Alexa is a natural evolution of that.”

In banking, there is considerable friction between wanting money or credit and getting it. Capital One has enabled financial account access via voice on the Alexa and Cortana platforms. It is also combining visual and voice access on Echo Show. The reasoning for the latter is that humans convey information faster by speaking and consume information faster visually.

“[I]t usually boils down to what’s the problem you’re solving and how do you take friction out of things,” said Dodelin. “When I think about what it means for a business, it’s more about how can we [get] good customer and business outcomes from these new experiences.”

When Capital One first started with voice interfaces, customers would ask about the balance on their credit cards, but when they asked about the balance due, the system couldn’t handle it.

“Dialogue management is really important,” said Dodelin. “The other piece is who or what is speaking?”

Brand image is reflected in the characteristics of the voice interface. Capital One didn’t have character development experts, so it hired one from Pixar, who now leads its conversational AI design work.

“Natural language processing technology has progressed so much that we can expect it to become an increasingly common channel for customer experience,” said Dodelin. “If they’re not doing it directly through a company’s proprietary voice interface, they’re doing it by proxy through Alexa, Google Home or Siri and soon through our automobiles.”

The move to voice interfaces is going to be a challenge for some brands and an opportunity for others. Now is the time for companies to experiment and if they’re successful, leap ahead of their competitors and perhaps even set a new standard for creating customer experiences.

Clearly, more work needs to be done on natural language processing, but some consumers have already been tempted to thank Alexa, despite its early-stage capabilities, said Capgemini’s Taylor.

In short, voice interfaces are here and evolving rapidly. What will your brand do?

How SaaS Strategies Are Evolving

Enterprises are subscribing to more SaaS services than ever, with considerable procurement happening at the departmental level. Specialized SaaS providers target problems that those departments want solved quickly. Because SaaS software tends to be easy to set up and use, there appears to be no need for IT’s involvement, until something goes wrong.

According to the Harvey Nash/KPMG 2017 CIO Survey, 91% of the nearly 4,500 CIOs and IT leaders who responded expect to make moderate or significant SaaS investments, up from 82% in 2016. The report also states that 40% of SaaS product procurement now happens outside IT.

“IT needs a new operating model,” said Gianna D’Angelo, principal of KPMG CIO Advisory. “CIOs must respond by continuing to focus on operational excellence while adopting a new operating model for IT to drive innovation and value in these changing times.”

Some IT shops are reacting to shadow IT like they reacted to “bring your own device” (BYOD), meaning if you can’t stop it, you have to enable it with governance in mind. However, issues remain.

“In the last three years, we’ve put policies and some governance in place, but it doesn’t matter. You pull out your credit card, you buy an open source application and I have a virus on my network,” said Todd Reynolds, CTO of WEX Health, which provides a platform for benefit management and healthcare-related financial management. “I don’t even know about it until there’s an issue.”

How SaaS pricing is changing

KPMG’s D’Angelo said most SaaS pricing is based on users or revenue, and that contracts typically run three to five years, though there has been some movement toward shorter timeframes, as low as two years.

Sanjay Srivastava, chief digital officer of Genpact, a global professional services company, said his firm sees a shift from user-based pricing to usage-based pricing, which in Genpact’s case takes the form of a per-item charge for a document or balance sheet, for example.
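To make the difference concrete, here is a back-of-the-envelope sketch in Python; all figures are invented for illustration and don’t reflect any vendor’s actual rates.

    # Invented figures: per-seat (user-based) pricing vs. per-item (usage-based) pricing.
    users, price_per_user = 200, 30.00        # user-based: 200 seats at $30/month
    documents, price_per_doc = 15_000, 0.40   # usage-based: $0.40 per document processed

    print(f"user-based monthly bill:  ${users * price_per_user:,.2f}")
    print(f"usage-based monthly bill: ${documents * price_per_doc:,.2f}")

Under usage-based pricing, the bill tracks activity rather than headcount, which is why heavier usage (as WEX Health describes below) changes the economics for both sides.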

Regardless of what the SaaS pricing model is, SaaS providers are facing downward pricing pressure. According to Gartner, “Vendors are becoming more creative with their SaaS business models to reflect a need to stand out in the fast-growing subscription economy.”

For its part, WEX Health is responding with new services that drive additional revenue. It has also put some usage-based pricing in place for customers that require elastic compute capabilities. “Mobile is killing us,” said WEX Health’s Reynolds. “You’ve given somebody an application to use on their phone 24/7, so they’re starting to leverage that usage so much more. It’s good people are using [our software] more often, but it requires us to have more storage.”

Longer-term thinking is wise

When departments purchase SaaS software, they usually are seeking relief from some sort of business problem, such as multichannel marketing attribution – studying the set of actions that users take in various environments. What business people often miss is the longer-term requirement to share data across disparate systems.

“If you have half on-premises and half in different clouds, you might have a private cloud, some in Azure and some in Amazon because the technology stack is beneficial to the apps,” said WEX Health’s Reynolds. “Pulling all of that together and making it safe and accessible is the biggest challenge from an operational perspective on the IT side.”

While SaaS systems tend to have APIs that help with data exchange, most enterprises have hybrid environments that include legacy systems, some of which do not have APIs. In the older systems, the data dictionaries may not be up-to-date and Master Data Management (MDM) may not have been maintained. So enterprises often face substantial data quality issues that negatively impact the value they’re getting from their investments.

“If you really want to get value out of [SaaS] — if you want Salesforce to run CRM and you want it to run sales, integrated, and it still has to be connected to ERP — each thing has to be connected,” said Genpact’s  Srivastava. “There’s a lot of back and forth. Planning for that back and forth, and planning well, is really critical.”

Part of that back-and-forth is ensuring that the right governance, compliance and security controls are in place.

Bottom line

There’s more to SaaS investments than may be obvious to the people procuring them. At the same time, IT departments can no longer be the sole gatekeepers of all things tech.

“The challenge for CIOs is enormous, the stakes are large and change efforts of this magnitude take years, but transforming the IT operating model can be done,” said KPMG’s D’Angelo. “Complicating the effort is that IT must continue to support the existing portfolios, including retained infrastructure and legacy applications, during the transformation.”

This means that, for a period of time, IT will have to use a hybrid model comprising both the project-oriented, plan-build-run approach and the next-generation, broker-integrate-orchestrate approach, D’Angelo added.

Tips for Ensuring Winning SaaS Strategies

SaaS software is not a one-size-fits-all proposition. Costs and benefits vary greatly, as do the short-term and long-term trade-offs. Following are a few things you can do along the way to ease the transition.

If you’re just starting out, chances are that most if not all of the software you procure will be SaaS because that’s the way things are going. SaaS also allows for an economic shift: relatively low-cost subscriptions that include upgrades and maintenance (an operational expenditure) replace substantial up-front, on-premises software investments that require subsequent maintenance investments and IT’s help (a capital expenditure). Regardless of what type of software you choose, though, it’s wise to think beyond today’s requirements so you have a better chance of avoiding unforeseen challenges and costs in the future.
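To illustrate the opex-versus-capex shift, a quick sketch with invented numbers; real costs vary widely by vendor and deployment.

    # Invented figures: five-year cost of a SaaS subscription (opex) versus an
    # up-front on-premises license plus annual maintenance (capex).
    years = 5
    saas_total = 500 * 12 * years          # $500/month subscription, upgrades included
    onprem_total = 20_000 + 3_000 * years  # up-front license + $3,000/yr maintenance

    print(f"SaaS over {years} years:    ${saas_total:,}")
    print(f"On-prem over {years} years: ${onprem_total:,}")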

If you’re piloting a new type of software, SaaS is probably the way to go because you can usually experiment without a long-term commitment. However, be mindful of the potential integration, security and governance challenges you may encounter as you attempt to connect different data sources.

If you’re in production, you’ll want to continuously assess your requirements in terms of software models, integration, compliance, governance and security. As you continue your move into the cloud, understand what’s holding you back. Finance and HR, for instance, may still hesitate to store their sensitive data anywhere but on-premises. For the foreseeable future, you’ll probably have a hybrid strategy that becomes more cloud-based with time.

At each stage, it’s wise to understand the potential risks and rewards beyond what’s obvious today.

Deloitte: 5 Trends That Will Drive Machine Learning Adoption

Companies across industries are experimenting with and using machine learning, but actual adoption rates are lower than one might think. According to a 2017 SAP Digital Transformation Study, fewer than 10% of 3,100 executives from small, medium and large companies said their organizations were investing in machine learning. That will change dramatically in the coming years, according to a new Deloitte report, because researchers and vendors are making progress in five key areas that may make machine learning more practical for businesses of all sizes.

1. Automating data science

There is a lot of debate about whether data scientists will or won’t be automated out of a job. It turns out that machines handle rote tasks, such as data wrangling, faster and more reliably than humans.

“The automation of data science will likely be widely adopted and speak to this issue of the shortage of data scientists, so I think in the near term this could have a lot of impact,” said David Schatsky, managing director at Deloitte and one of the authors of Deloitte’s new report.

Industry analysts are bullish about the prospect of automating data science tasks, since data scientists can spend an inordinate amount of time collecting data and preparing it for analysis. For example, Gartner estimates that 40% of a data scientist’s job will be automated by 2020.

Data scientists aren’t so sure about that, and to be fair, few people, regardless of their position, have considered which parts of their job are ripe for automation.

2. Reducing the need for training data

Machine learning tends to require a lot of data. According to the Deloitte report, training a machine learning model might require millions of data elements. While machine learning requirements vary based on the use case, “acquiring and labeling data can be time-consuming and costly.”

One way to address that challenge is to use synthetic data. According to the report, Deloitte was able to reduce the amount of real data required for training by 80%: only 20% of the training data was actual data, with synthetic data making up the remaining 80%.
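The report doesn’t spell out Deloitte’s technique, but a minimal sketch of the general idea is to fit a generative model to a small real sample and draw synthetic records from it; here a simple Gaussian stands in for whatever generator is actually used, and all numbers are illustrative.

    # Minimal sketch: fit a crude generator to scarce real data, then draw
    # synthetic records so real data makes up only ~20% of the training set.
    import numpy as np

    rng = np.random.default_rng(0)
    real = rng.normal(loc=50.0, scale=8.0, size=200)   # stand-in for scarce real data

    mu, sigma = real.mean(), real.std()                # fit the generator
    synthetic = rng.normal(mu, sigma, size=800)        # draw 4x synthetic records

    training_set = np.concatenate([real, synthetic])   # 20% real, 80% synthetic
    print(f"{len(real) / len(training_set):.0%} real, "
          f"{len(synthetic) / len(training_set):.0%} synthetic")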

“How far we can go in reducing the need for training data has two kinds of question marks: How far can you reduce the need for training data and what characteristics of data are most likely minimized and which require massive datasets?” said Schatsky.

3. Accelerating training

Massive amounts of data and heavy computation can take considerable time. Chip manufacturers are addressing this issue with various types of chips, including GPUs and application-specific integrated circuits (ASICs). The end result is faster training of machine learning models.

“I have no doubt that with the new processor architectures, execution is going to get faster,” said Schatsky. “[The chips] are important and necessary, but not sufficient to drive significant adoption on their own.”

4. Explaining results

Many machine learning models spit out a result, but they don’t provide the reasoning behind the result. As Deloitte points out, business leaders often hesitate to place blind faith in a result that can’t be explained, and some regulations require an explanation.

In the future, we’ll likely see machine learning models that are more accurate and transparent, which should open the door for greater use in regulated industries.

“No one knows how far you can go yet in terms of making an arbitrary neural network-based model interpretable,” said Schatsky. “We could end up hitting some limits identifying a fairly narrow set of cases where you can turn a black box model into an open book for certain kinds of models and situations, but there will be other scenarios where they work well but you can’t use them in certain situations.”

5. Deploying locally

Right now, machine learning typically requires a lot of data and training can be time-consuming. All of that requires a lot of memory and a lot of processing power, more than mobile and smart sensors can handle, at least for now.

In its report, Deloitte points out there is research in this area too, some of which has reduced the size of models by an order of magnitude or more using compression.
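The report doesn’t detail those compression methods, but magnitude pruning is one widely used approach: zero out the weights closest to zero so the model can be stored sparsely. A rough sketch with an invented weight matrix:

    # Rough sketch of magnitude pruning: drop the 90% of weights nearest zero,
    # shrinking the stored model by roughly an order of magnitude.
    import numpy as np

    rng = np.random.default_rng(0)
    weights = rng.normal(size=(1024, 1024))          # stand-in weight matrix

    threshold = np.quantile(np.abs(weights), 0.90)   # keep only the largest 10%
    pruned = np.where(np.abs(weights) >= threshold, weights, 0.0)

    kept = np.count_nonzero(pruned)
    print(f"kept {kept:,} of {weights.size:,} weights ({kept / weights.size:.0%})")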

The bottom line

Machine learning is having profound effects in areas ranging from TV pilots to medical diagnoses. It seems somewhat magical and somewhat scary to the uninitiated, though the barriers to adoption are falling. As machine learning becomes more practical for mainstream use, more businesses will use it, whether they realize it or not.

“[The five] things [we identified in the report] are converging to put machine learning on a path toward mainstream adoption,” said Schatsky.  “If companies have been sitting it out waiting for this to get easier and more relevant, they should sit up instead and start getting involved.”

What Data Analysts Want to See in 2018

The demand for data analysts is at an all-time high, but organizations don’t always get the value they expect, mainly because the organization, or parts of it, get in the way.

Being an analyst can be a frustrating job if your position isn’t getting what it needs in terms of data, tools and organizational support. Are you getting what you need? Here are some of the things your contemporaries are saying.

More Data

Despite the glut of data companies have, analysts don’t always get the data they need, often because the data owners are concerned about privacy, security, losing control of their data or some combination of those things.

“The problem of data ownership and data sharing is universal,” said Sam Ruchlewicz, director of Digital Strategy & Data Analytics at advertising, digital, PR and brand agency Warschawski. “For analytics professionals, these artificial barriers hinder the creation of comprehensive, whole-organization analyses that can provide real, tangible value and serve as a catalyst for the creation (and funding) of additional analytics programs.”

Jesse Tutt, program lead of the IT Analytics Center of Excellence at Alberta Health Services, said getting access to the data he needs takes a lot of time because he has to work with the data repository owners to get their approval and then work with the technologists to get access to the systems. He also has to work with the vendors and the data repository subject matter experts.

“We’ve worked really hard getting access to the data sets, correlating the different datasets using correlation tables and cleaning up the data within the source systems,” he said. “If you ask a specific data repository what something is, it can tell you, but if you snapshot it on a monthly basis you can see a trend. If you correlate that across other systems, you can find more value. In our case, the highest value is connecting the systems and creating the capability in a data warehouse, reporting you can correlate across the systems.”

Four years ago, people at Alberta Health Services wanted to see trend data instead of just snapshots, so one system was connected to another. Now, 60 data sources are connected, with 60 more planned by the end of 2017. The organization has a total of about 1,600 data sources, many of which will be connected in the next couple of years.
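A minimal sketch of the snapshot idea Tutt describes, with invented data: a point-in-time count answers “what is it now,” but captured monthly it becomes a trend.

    # Invented data: monthly snapshots of the same point-in-time metric.
    import pandas as pd

    snapshots = pd.DataFrame({
        "month": pd.to_datetime(["2017-07-01", "2017-08-01", "2017-09-01"]),
        "open_tickets": [420, 465, 510],             # point-in-time counts
    })
    snapshots["month_over_month"] = snapshots["open_tickets"].diff()
    print(snapshots)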

More Respect

The most effective data analytics align with business objectives, but what happens when your data analysts aren’t informed? Warschawski’s Ruchlewicz recently had dinner with the CEO of a large, international agency who spent millions of dollars on a marketing campaign that failed simply because the executive didn’t want to listen to “the analytics kids.” Never mind the fact that the analytics team had identified a major issue the target audience had with the client’s brand.

“[The CEO] dismissed them as analytics kids who didn’t know what they were talking about and proceeded to launch the campaign,” said Ruchlewicz. “Only later, after millions of dollars in spending (with no results to show for it), did the CEO allow them to make their case and implement their recommendations.”

Ultimately, their recommendations turned the campaign around, Ruchlewicz said.

“I wish this was a one-off story. It’s not. I wish this was confined to ‘old school’ companies. It’s not,” said Ruchlewicz. “Until analytics teams are given a seat at the table where decisions are made, analytics will continue to be undervalued and underappreciated across the entire organization.”

Analysts have to earn respect like anyone else, however. That requires communicating to business professionals in business terms.

“Executives and investors today are hyper-focused on the bottom line, and most that I’ve interacted with perceive analytics as a line item expenditure,” said Ruchlewicz. “[A]nalytics professionals need to take the first step toward resolution. There are several methods that allow the creation of a rigorous, defensible first approximation, which is sufficient to get the conversation started (and usually, some data shared).”

To help turn the tide, analytics practitioners are well-advised to present information and construct business cases around their activities.

More Consistency

If everyone in the organization used the same terminology for everything, always had the right database fields accessible, and always entered data correctly and in the same manner, some enterprise data would be much cleaner than it is today. However, the problem doesn’t stop there.

“If a person says, ‘I want an analytical tool,’ how do you group that and do trending on it when a person may call it one of 100 different analytical tool names, or they’ll say I need to do analysis on data? The words they submit are often different from what they actually want,” said Alberta Health Services’ Tutt.

Tutt and his team are endeavoring to better understand what people are requesting in service desk tickets so the organization can manage its software investments more effectively. Now that his team has access to the different systems, they know who’s using a product and when they used it. They’re looking at the problem from a Robotic Process Automation (RPA) perspective so software can be automatically removed if it hasn’t been used within a certain time period.
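A minimal sketch of that removal rule; the threshold, product names and data are all invented for illustration.

    # Invented rule: queue software for automated removal when it hasn't
    # been used within a 180-day window.
    from datetime import date, timedelta

    UNUSED_WINDOW = timedelta(days=180)   # assumed threshold

    installs = [
        {"user": "jsmith", "product": "AnalyticsPro", "last_used": date(2017, 1, 5)},
        {"user": "mlee",   "product": "AnalyticsPro", "last_used": date(2017, 9, 20)},
    ]

    today = date(2017, 10, 1)
    for item in installs:
        if today - item["last_used"] > UNUSED_WINDOW:
            print(f"queue uninstall: {item['product']} for {item['user']}")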

More Power to Effect Change

Industry analysts are pushing back on “data-driven” mantras because they think companies should be “insight-driven.” While they have a valid point, insights without action have little value.

For example, a large U.S. health provider has a massive analytics team that’s generating highly-actionable insights, but those insights are not being acted upon by the business. They can meet with a functional unit such as risk or compliance and show them insights. The operating unit will say, “That’s interesting,” but there’s no way to connect insights and action.

“The data teams are frustrated because they’re not getting the operational support they need,” said Adam Nathan, CEO and Founder of analytics strategy firm The Bartlett System. “The data teams don’t know how to drive that, except to get frustrated and quiet and get more value elsewhere. I think the tipping point will come when the company realizes it’s falling behind competitors. They’ll realize the company isn’t getting the value it could from analytics and that will put pressure on them to do something with those insights.”

How Today’s Analytics Change Recruiting

HR is late to the analytics game by modern standards, and yet, HR metrics is not a new concept. The difference is that modern analytics enable HR professionals and recruiters to measure more things in less time and derive more insight than ever before.

“If you’re looking at recruiting, there have always been metrics such as time to hire and cost per hire, but you’re seeing other channels and avenues opening up,” said Rosemary Haefner, chief human resources officer at online employment website CareerBuilder.com.

The “time to hire” or “time to fill” metric measures how many days it takes from the time a requisition is posted until the time an offer is accepted. The longer a position remains open, the higher the cost of talent acquisition. In addition, if a position remains open, an intervention may be necessary to ensure the work at hand is getting done.
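As a worked example of the metric (dates invented):

    # Time to fill: days from requisition posting to offer acceptance.
    from datetime import date

    requisition_posted = date(2017, 8, 1)
    offer_accepted = date(2017, 9, 15)

    time_to_fill = (offer_accepted - requisition_posted).days
    print(f"time to fill: {time_to_fill} days")   # 45 days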

If time to fill were the only measure of success, then, in theory, the faster a position is filled, the better. However, as most working professionals have experienced, the person who can be hired the fastest isn’t necessarily (and probably isn’t) the best candidate.

On the other hand, moving too slowly can cost organizations sought-after talent.

“There’s the time to fill, the cost of the person you hire, whether that person is high-potential and what their expected tenure in the organization is. That’s an example of four interrelated metrics,” said Muir Macpherson, Americas analytics leader, People Advisory Services at EY. “HR needs to stop thinking about individual metrics and consider the problem they’re trying to solve and how to optimize across a set of metrics simultaneously.”

Beyond keywords

Talent marketplaces and talent acquisition software made it easier to navigate a sea of resumes using keywords and filters. In response, some candidates stuffed their resumes full of keywords so their resumes would rank higher in searches. If one’s resume ranked higher in searches, then more people would see it, potentially increasing the candidate’s chance of getting interviews and landing a job.

Masterful keyword use demonstrated an awareness that the recruiting process was changing from a paper-based process to a computer or web-based process. However, other candidates who might have been better fits for positions risked getting lost in the noise.

The whole keyword trend was a noble effort, but keywords, like anything else, are not a silver bullet.

With today’s analytics tools, HR departments and search firms can understand much more about candidates and the effectiveness of their operations.

“You can use a variety of big data and machine learning techniques that go way beyond the keyword analysis people have been doing for a while that integrates all of the data available about a candidate into one, unified prediction score that can then be used as one additional piece of information that recruiters and hiring managers can look at when making their decisions,” said Macpherson.
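Macpherson doesn’t specify a particular technique, but a minimal sketch of the idea, folding several candidate signals into a single score with a simple model, might look like this; the features and data are invented.

    # Invented features: combine candidate signals into one prediction score.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # columns: years of experience, skills-match score, assessment score
    X = np.array([[3, 0.60, 72], [8, 0.85, 88], [1, 0.40, 65], [5, 0.75, 90]])
    y = np.array([0, 1, 0, 1])                # 1 = historically successful hire

    model = LogisticRegression().fit(X, y)
    candidate = np.array([[4, 0.70, 80]])
    print(f"unified prediction score: {model.predict_proba(candidate)[0, 1]:.2f}")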

Data impacts recruiters too

Recruiters now have access to data analytics tools that enable them to better match candidates with potential employers and improve the quality of their services. Meanwhile, HR departments want insight into what recruiters are doing and how well they’re doing it. The Scout Exchange marketplace provides transparency between the two.

“We can look at every candidate [a recruiter] submits to see how far they got in the process and whether they got hired. We use that for ratings so [companies and the recruiters they use] can see the other side’s rating,” said Scout Exchange CEO Ken Lazarus.

The site enables organizations to quickly find appropriate recruiters who can identify the best candidates for a position. It also allows HR departments to see data and trends specific to their company.

Bottom line

Analytics is providing HR departments, recruiters and business leaders with quantitative information they can use to improve their processes and outcomes.

“Knowledge is power and having that data is helpful. For me, the first step is knowing what you’re solving for,” said CareerBuilder’s Haefner.

Right now, HR analytics tend to emphasize recruitment. However, attracting talent is sometimes easier than retaining it, so it’s important to have insight throughout the lifecycle of employee relationships. EY’s Macpherson said HR departments should think in terms of “employee lifetime value,” similar to the way marketers think about customer lifetime value.

“[HR analytics represents] a huge opportunity because for most companies, people and compensation are their biggest costs and yet there has been very little effort put into analyzing those costs or getting the most out of those investments that companies are making,” said EY’s Macpherson.

Why Privacy Is a Corporate Responsibility Issue

Many organizations have Corporate Responsibility programs that focus on social issues and philanthropy. Especially in today’s Big Data era, why is privacy not part of the program?

Today’s companies are promising to lower their carbon footprints and save endangered species. They’re donating to people in developing countries who have far less than we do, which is also noble. But what about the fact that American citizens are a product whose information is bought, sold, and obtained without consent? In light of recent events, perhaps privacy policies deserve more consideration than two linked words at the bottom of a website home page.

“Privacy is a big issue for a host of reasons — legal, ethical, brand protection and moral,” said Mark Cohen, chief strategy officer at consultancy and technology service provider Elevate. “[Privacy] is an element of corporate culture, [so what goes into a privacy policy depends on] your values and priorities.”

Problems with Privacy Policies

There are three big problems with privacy policies, at least in the US: what’s in them, how they’re written, and how they’re ignored.

One might think that privacy policies are tailored to a particular company and its audience. However, such documents are not necessarily original. Rather than penning a privacy policy from scratch, some companies literally cut and paste entire privacy policies regardless of their contents. In fact, the people who simply grab another company’s privacy policy might not even bother to read it before using it.

The boilerplate language is also a problem. In-house counsel often uses freely available forms to put together a privacy policy. They may use one form or a combination of forms available to lawyers, but again, they’re not thinking about what should be in the document.

In addition, the documents are written in legalese, which is difficult for the average person to read. Businesses are counting on that because if you don’t know what’s in a privacy policy, what you’re giving away and what they intend to do with your information, you’ll probably just hope for the best. Even better, you’ll click an “I agree” button without knowing what clicking that button actually means. It’s a common practice, so you’re not alone if that’s the case.

Oh, and what’s stated in the documents may or may not be true, either because the company changed the policy since you last read it or they’re ignoring the document itself.

“After May 2018, when the new GDPR [General Data Protection Regulation] goes into effect, it’s going to force many companies to look at their privacy policies, their privacy statements and consents, and make them more transparent,” said Sheila Fitzpatrick, data governance and privacy counsel and chief privacy officer at hybrid cloud data services company NetApp. “They’re going to have to be easily understandable and readable.”

Businesses Confuse Privacy with Security

Privacy and security go hand-in-hand, but they’re not the same thing. However, a common assumption is that if you’re encrypting data, you’re protecting privacy.

“Every company focuses on risk, export control, trade compliance and security, but rarely do you find companies focused on privacy,” said Fitzpatrick. “That’s changing with GDPR because it’s extraterritorial. It’s forcing companies to start really addressing areas around privacy.”

It’s entirely possible to have all kinds of security and still not address privacy issues. OK, so the data is being locked down, but are you legally allowed to have it in the first place? Perhaps not.

“Before you lock down that data, you need the legal right to have it,” said Fitzpatrick. “That’s the part that organizations still aren’t comprehending because they think they need the data to manage the relationship. In the past organizations thought they need the data to manage employment, customer or prospect relationships, but they were never really transparent about what they’re doing with that data, and they haven’t obtained the consent from the individual.”

In the US, the default is opt-out: companies can use your data unless you object. In countries with restrictive privacy laws, the default is opt-in: your data can’t be used until you expressly consent.

The Data Lake Mentality Problem

We hear a lot about data lakes and data swamps. In a lot of cases, companies are just throwing every piece of data into a data lake, hoping it will have value in the future. After all, cloud storage is dirt cheap.

“Companies need to think about the data they absolutely need to support a relationship. If they’re an organization that designs technology, what problem are they trying to solve and what data do they need to solve the problem?” said Fitzpatrick.

Instead of collecting massive amounts of information that’s totally irrelevant, they should consider data minimization if they want to lower privacy-related risks and comply with the EU’s GDPR.

“Companies also need to think about how long they’re maintaining this data, because they have a tendency to want to keep data forever even if it has no value,” said Fitzpatrick. “Under data protection laws, not just the GDPR, data should only be maintained for the purpose it was given and only for the time period for which it was relevant.”

The Effect of GDPR

Under the GDPR, consent has to be freely given, not forced or implied. That means companies can’t pre-check an opt-in box or force people to trade personal data for the use or continued use of a service.

“Some data is needed. If you’re buying a new car they need financial information, but they’d only be using it for the purpose of the purchase, not 19 other things they want to use it for including sales and marketing purposes,” said Fitzpatrick.

Privacy may well become the new competitive advantage as people become more aware of privacy policies and what they mean and don’t mean.

“Especially Europeans, Canadians, and those who live in Asia-Pacific countries that have restrictive privacy laws, part of their vetting process will be looking at your privacy program,” said Fitzpatrick. “If you have a strong privacy program and can answer a privacy question with a privacy answer as opposed to answering a privacy question with a security answer, [you’ll have an advantage].”

On the flip side, sanctions from international regulators can destroy a company from reputational, brand and financial points of view. Fines under the new GDPR can reach as high as 4% of a company’s annual global turnover.

Quantum Computing Brings Promise and Threats

Digital computing has some serious limitations. While the technology advances of the past few decades are impressive, including smaller footprints, faster processors, better UIs, and more memory and storage, some problems could be solved better by quantum computers.

For certain classes of problems, quantum computers are faster than classical (traditional) computers. They are also able to solve problems that classical computers can’t solve well, or can’t solve within a reasonable amount of time.

“Quantum computing exploits fundamental laws of physics to solve complex computing problems in new ways, problems like discovering how diseases develop and creating more effective drugs to battle them,” said Jim Clarke, director of quantum hardware at Intel Labs. “Once quantum systems are available commercially, they can be used to simulate nature to advance research in chemistry, materials science and molecular modeling. For instance, they can be used to help create a new catalyst to sequester carbon dioxide or a room-temperature superconductor.”

Quantum computing will also drive new levels of business optimization, benefit machine learning and artificial intelligence, and change the cryptography landscape.

David Schatsky, managing director at Deloitte, said the common thread is optimization problems where there are multiple probable answers and the task is to find the right one. Examples include investment management, portfolio management, risk mitigation and the design of communication systems and transportation systems. Logistics companies are already exploring route optimization while the defense industry is considering communications applications.

“A year ago, [quantum computing] was thought of more as a physics experiment, [but] the perception has changed quickly,” said Schatsky. “In the last three months there has been a flurry of breakthroughs, including fundamental engineering breakthroughs and commercial product announcements.”

Test drive a quantum computer today

It’s probably safe to say that none of us will have a quantum computer sitting on our desks anytime soon, but just about anyone with a browser can get access to IBM’s 5- and 16-qubit (quantum bit) computers via the cloud. Earlier this year, the company announced IBM Q, an initiative intended to result in commercially available quantum computing systems. IBM also announced that it had built and tested two quantum computing processors: a 16-qubit open processor for use by the public and a 17-qubit commercial processor for customers.
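As a sense of what that cloud access looks like in practice, here is a minimal sketch using Qiskit, IBM’s open-source SDK for its quantum processors; it builds the two-qubit Bell-state circuit that serves as the “hello world” of quantum programming.

    # Minimal sketch with Qiskit: a two-qubit Bell-state circuit.
    from qiskit import QuantumCircuit

    qc = QuantumCircuit(2, 2)
    qc.h(0)                      # put qubit 0 into superposition
    qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])   # read both qubits into classical bits
    print(qc.draw())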

According to an IBM paper in Nature, scientists successfully used a seven-qubit quantum processor to address a molecular structure problem for beryllium hydride (BeH2), the largest molecule simulated on a quantum computer to date.

“It is early days, but it’s going to scale rapidly,” said Scott Crowder, vice president and CTO, Quantum Computing, Technical Strategy & Transformation at IBM Systems. “When you start talking about hundreds or low thousands of qubits, you can start exploring business value problems that [can’t be addressed well using] classical computers such as quantum chemistry [and] certain types of optimization problems that are also exponential problems.”

An exponential problem is one that scales exponentially with the number of elements in it. For example, planning a route involving 50 locations could be optimized in a number of ways depending on the objective, such as identifying the fastest route. That seemingly simple problem actually involves one quadrillion different possibilities, which is more than a classical computer can handle, Crowder said.
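To see why such problems explode, consider how the number of possible orderings of a route grows with the number of stops; a quick sketch:

    # Factorial growth: possible orderings of n stops.
    import math

    for n in (5, 10, 15, 20):
        print(f"{n} stops -> {math.factorial(n):,} possible routes")
    # 20 stops already yields ~2.4 quintillion orderings.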

Intel is making progress too

Intel teamed up with QuTech, an academic partner in the Netherlands, in 2015. Since then, Intel has achieved milestones such as demonstrating key circuit blocks for an integrated cryogenic-CMOS control system, developing a spin qubit fabrication flow on Intel’s 300mm process technology, and developing a unique packaging solution for superconducting qubits, which it demonstrated in the 17-qubit superconducting test chip introduced on October 10, 2017. A week later, at the Wall Street Journal D.Live conference in Laguna Beach, Calif., Intel CEO Brian Krzanich said he expects Intel to deliver a 49-qubit quantum chip by the end of 2017.

“Ultimately the goal is to develop a commercially relevant quantum computer, one that is relevant for many applications and one that impacts Intel’s bottom line,” said Intel’s Clarke.

Toward that end, Intel’s work with QuTech spans the entire quantum stack from the qubit devices to the overall hardware architecture, software architecture, applications and complementary electronics that workable quantum systems will require.

“Quantum computing, in essence, is the ultimate in parallel computing, with the potential to tackle problems conventional computers can’t handle,” said Clarke. “But, realizing the promise of quantum computing will require a combination of excellent science, advanced engineering and the continued development of classical computing technologies, which Intel is working towards through our various partnerships and R&D programs.”

Decryption and other threats

There is a debate about whether quantum computers will render current encryption methods obsolete. Take a brute force attack, for example. In a brute force attack, hackers continually guess passwords and use computers to accelerate that work. Quantum computing would accelerate such an attack even further.
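The article doesn’t name it, but the standard quantum search technique here is Grover’s algorithm, which needs on the order of the square root of N steps to search N possibilities, versus roughly N/2 classical guesses on average; a back-of-the-envelope comparison:

    # Classical vs. quantum (Grover) search effort over an N-key space.
    import math

    for bits in (40, 64, 128):
        n = 2 ** bits
        print(f"{bits}-bit keyspace: ~{n // 2:.1e} classical guesses, "
              f"~{math.isqrt(n):.1e} Grover iterations")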

“Virtually all security protocols that are used and deployed today are vulnerable to an attack by a quantum computer,” said William “Whurley” Hurley, chair of the Quantum Standards Working Group at the IEEE. “Quantum information allows us to secure information in ways that are completely unbreakable, even against a quantum attack.”

Along those lines, there are efforts to develop a new type of security protocol that doesn’t necessarily leverage quantum mechanics. Hurley said they’re using extremely difficult mathematical problems that even quantum computers won’t be able to solve, an approach referred to as “quantum-safe cryptography” or “post-quantum cryptography.”

The IEEE Quantum Standards Working Group is working on other quantum technologies as well, including quantum sensors and quantum materials. The institute has brought together physicists, chemists, engineers, mathematicians and computer scientists to ensure that it can adapt rapidly to change.

Deloitte’s Schatsky said synthetic biology and gene editing are also potentially dangerous, mainly because capabilities can be developed faster than one’s ability to understand how to apply such technologies wisely. The same could be said for many emerging technologies.

Quantum computing should be on your radar

Quantum computing is advancing rapidly now, so it’s wise to ponder how its capabilities might benefit your business. The reality is that no one knows all the ways quantum computing can be used, but it will eventually impact businesses in many different industries.

Will quantum computers overtake classical computers, following the same evolutionary path we’ve seen over the past several decades or will the two co-exist? For the foreseeable future, co-existence is the answer because binary and quantum computers each solve different kinds of problems better than the other.

Your Data Is Biased. Here’s Why.

Bias is everywhere, including in your data. A little skew here and there may be fine if the ramifications are minimal, but bias can negatively affect your company and its customers if left unchecked, so you should make an effort to understand how, where and why it happens.

“Many [business leaders] trust the technical experts but I would argue that they’re ultimately responsible if one of these models has unexpected results or causes harm to people’s lives in some way,” said Steve Mills, a principal and director of machine intelligence at technology and management consulting firm Booz Allen Hamilton.

In the financial industry, for example, biased data may produce results that violate the Equal Credit Opportunity Act (fair lending). That law, enacted in 1974, prohibits credit discrimination based on race, color, religion, national origin, sex, marital status, age or source of income. While lenders take steps not to include such data in a loan decision, it may be possible to infer race in some cases using a zip code, for example.
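A minimal sketch of that proxy effect with invented toy data: even when race is excluded from a model, a “neutral” field like zip code may predict it.

    # Invented toy data: test how well an allegedly neutral feature (zip code)
    # predicts a protected attribute that was excluded from the model.
    import pandas as pd

    df = pd.DataFrame({
        "zip":  ["10001", "10001", "60629", "60629", "60629", "10001"],
        "race": ["A", "A", "B", "B", "B", "A"],
    })

    majority_by_zip = df.groupby("zip")["race"].agg(lambda s: s.mode()[0])
    predicted = df["zip"].map(majority_by_zip)
    print(f"proxy accuracy: {(predicted == df['race']).mean():.0%}")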

“The best example of [bias in data] is the 2008 crash in which the models were trained on a dataset,” said Shervin Khodabandeh, a partner and managing director at Boston Consulting Group (BCG) Los Angeles, a management consulting company. “Everything looked good, but the datasets changed and the models were not able to pick that up, [so] the model collapsed and the financial system collapsed.”

What Causes Bias in Data

A considerable amount of data has been generated by humans, whether it’s the diagnosis of a patient’s condition or the facts associated with an automobile accident.  Quite often, individual biases are evident in the data, so when such data is used for machine learning training purposes, the machine intelligence reflects that bias.  A prime example of that was Microsoft’s infamous AI bot, Tay, which in less than 24 hours adopted the biases of certain Twitter members. The results were a string of shocking, offensive and racist posts.

“There’s a famous case in Broward County, Florida, that showed racial bias,” said Mills. “What appears to have happened is there was historically racial bias in sentencing so when you base a model on that data, bias flows into the model. At times, bias can be extremely hard to detect and it may take as much work as building the original model to tease out whether that bias exists or not.”

What Needs to Happen

Business leaders need to be aware of bias and the unintended consequences biased data may cause.  In the longer-term view, data-related bias is a governance issue that needs to be addressed with the appropriate checks and balances which include awareness, mitigation and a game plan should matters go awry.

“You need a formal process in place, especially when you’re impacting people’s lives,” said Booz Allen Hamilton’s Mills. “If there’s no formal process in place, it’s a really bad situation. Too many times we’ve seen these cases where issues are pointed out, and rather than the original people who did the work stepping up and saying, ‘I see what you’re seeing, let’s talk about this,’ they get very defensive and defend their approach so I think we need to have a much more open dialog on this.”

As a matter of policy, business leaders need to consider which decisions they’re comfortable allowing algorithms to make, the safeguards which ensure the algorithms remain accurate over time, and model transparency, meaning that the reasoning behind an automated decision or recommendation can be explained.  That’s not always possible, but still, business leaders should endeavor to understand the reasoning behind decisions and recommendations.

“The tough part is not knowing where the biases are there and not taking the initiative to do adequate testing to find out if something is wrong,” said Kevin Petrasic, a partner at law firm White & Case.  “If you have a situation where certain results are being kicked out by a program, it’s incumbent on the folks monitoring the programs to do periodic testing to make sure there’s appropriate alignment so there’s not fair lending issues or other issues that could be problematic because of key datasets or the training or the structure of the program.”

Data scientists know how to compensate for bias, but they often have trouble explaining what they did and why they did it, or the output of a model in simple terms. To bridge that gap, BCG’s Khodabandeh uses two models: one that’s used to make decisions and a simpler model that explains the basics in a way that clients can understand.
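BCG’s exact method isn’t described, but a common form of this two-model pattern is a global surrogate: a simple, readable model trained to mimic the complex model’s outputs. A sketch with synthetic data:

    # Sketch of a surrogate-model pattern (not BCG's actual method): a complex
    # model makes the decisions; a shallow tree trained on its outputs
    # gives a human-readable approximation.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.tree import DecisionTreeClassifier, export_text

    X, y = make_classification(n_samples=500, n_features=5, random_state=0)

    complex_model = GradientBoostingClassifier(random_state=0).fit(X, y)

    surrogate = DecisionTreeClassifier(max_depth=2, random_state=0)
    surrogate.fit(X, complex_model.predict(X))   # mimic the complex model
    print(export_text(surrogate))                # readable decision rules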

BCG also uses two models to identify and mitigate bias. One is the original model; the other is used to test extreme scenarios.

“We have models with an opposite hypothesis in mind which forces the model to go to extremes,” said Khodabandeh. “We also force models to go to extremes. That didn’t happen in the 2008 collapse. They did not test extreme scenarios. If they had tested extreme scenarios, there would have been indicators coming in in 2007 and 2008 that would allow the model to realize it needs to adjust itself.”
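As a self-contained illustration of extreme-scenario testing (not the 2008 models themselves): train a model on “normal” data, then push inputs far outside the training range and watch how its confidence behaves.

    # Illustration: probe a model with inputs scaled far beyond the training
    # range to see whether it fails loudly or stays silently overconfident.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)

    model = LogisticRegression().fit(X, y)

    for scale in (1, 10, 100):                   # 1 = normal, 100 = extreme
        probs = model.predict_proba(X * scale)[:, 1]
        near_certain = np.mean((probs < 0.01) | (probs > 0.99))
        print(f"input scale {scale:>3}: {near_certain:.0%} predictions near-certain")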

A smart assumption is that bias is present in data, regardless.  What the bias is, where it stems from, what can be done about it and what the potential outcomes of it may be are all things to ponder.

Conclusion

All organizations have biased data.  The questions are whether the bias can be identified, what effect that bias may have, and what the organization is going to do about it.

To minimize the negative effects of bias, business leaders should make a point of understanding the various types and how they can impact data, analysis and decisions. They should also ensure there’s a formal process in place for identifying and dealing with bias, which is likely best executed as a formal part of data governance.

Finally, the risks associated with data bias vary greatly, depending on the circumstances. While it’s prudent to ponder all the positive things machine learning and AI can do for an organization, business leaders are wise to understand the weaknesses also, one of which is data bias.

How to Teach Executives About Analytics

If your data is failing to persuade executives, maybe it’s not the data that is the problem. Here’s how to change your approach to fit the audience.

One of the biggest challenges data analysts and data scientists face is educating executives about analytics. The general tendency is to nerd out on data and fail to tell a story in a meaningful way to the target audience.

Sometimes data analytics professionals get so wrapped up in the details of what they do that they forget not everyone has the same background or understanding. As a result, they may use technical terms, acronyms, or jargon and then wonder why no one “got” their presentations or what they were saying.

They didn’t do anything wrong, per se; the problem is how they said it and to whom.

If you find yourself in such a situation, following are a few simple things you can do to facilitate better understanding.

Discover What Matters

What matters most to your audience? Is it a competitive issue? ROI? Building your presence in a target market? Pay attention to the clues they give you and don’t be afraid to ask about their priorities. Those will clue you in to how you should teach them about analytics within the context of what they do and what they want to achieve.

Understand Your Audience

Some executives are extremely data-savvy, but the majority aren’t just yet. Dialogs between executives and data analysts or data scientists can be uncomfortable and even frustrating when the parties speak different languages. Consider asking what your target audience would like to learn about and why. That will help you choose the content you need to cover and the best format for presenting that content.

For example, if the C-suite wants to know how the company can use analytics for competitive advantage, then consider a presentation. If one of them wants to understand how to use a certain dashboard, that’s a completely different conversation and one that’s probably best tackled with some 1:1 hands-on training.

Set Realistic Expectations

Each individual has a unique view of the world. Someone who isn’t a data analyst or a data scientist probably doesn’t understand what that role actually does, so they make up their own story, which becomes their reality. That reality probably involves some unrealistic expectations about what data-oriented roles, or analytics generally, can accomplish.

One of the best ways to deal with unrealistic expectations is to acknowledge them and then explain what is realistic and why. For example, a charming and accomplished data scientist I know would be inclined to say, “You’d think we could accomplish that in a week, right? Here’s why it actually takes three weeks.”

Tell a Story

Stories can differ greatly, but the one thing good presentations have in common is a beginning, a middle, and an end. One of the mistakes I see brilliant people making is focusing solely on the body of a presentation, immediately going down some technical rabbit hole that’s fascinating for people who understand it and confusing for others.

A good beginning gets everyone on the same page about what the presentation covers, why the topic is important, and what you’re going to discuss. The middle should explain the meat of the story in a logical way that flows from beginning to end. The end should briefly recap the highlights and bring your audience to the same conclusion you’re stating in your presentation.

Consider Using Options

If the executive(s) you’re presenting to hold the keys to an outcome you desire, consider giving them options from which to choose. Doing that empowers them as the decision-makers they are. Usually, that approach also helps facilitate a discussion about tradeoffs. The more dialog you have, the better you’ll understand each other.

Another related tip is to make sure your options are within the realm of the reasonable. In a recent scenario, a data analyst wanted to add two people to her team. Her options were: A) if we do nothing, you can expect the same results; B) if we hire these two roles, we’ll be able to do X and Y, which we couldn’t do before; and C) if we hire five people, we’ll be able to do even more, but it will cost this much. She came prepared to discuss the roles, the interplay with the existing team and where she got her salary figures. If asked what adding one, three, or four people would look like, she was prepared to answer that too.

Speak Plainly

Plain English is always a wise guide. Choose simple words and concepts, keeping in mind how the meaning of a single word can differ. For example, if you say, “These two variables have higher affinity,” someone may not understand what you mean by variables or affinity.

Also endeavor to simplify what you say, using concise language. For example, “The analytics of the marketing department has at one time or another tended to overlook the metrics of the customer service department” can be consolidated into, “Our marketing analytics sometimes overlooks customer service metrics.”

Why Your Business May Not Be Ready for Analytics

Artificial intelligence is on the minds of business leaders everywhere because they’ve either heard or believe that AI will change the way companies do business.

What we’re seeing now is just the beginning. For everyone’s sake, more thought needs to be given to the workforce impact and how humans and machines will complement each other.

Recently, professional services company Genpact and FORTUNE Knowledge Group surveyed 300 senior executives from companies in the North American, European and Asia-Pacific regions with annual revenues of $1 billion or more. According to the report, “AI leaders expect that the modern workforce will be comfortable working alongside robots by 2020.”

However, getting there will require a different approach to organizational change.

“A bunch of people are thinking about AI as a technology. What they’re not thinking about is AI as the enabler of new enterprise processes, AI as an augmenter of humans in enterprise processes,” said Genpact Senior Vice President Gianni Giacomelli. “Right now, 70% of the effort is spent on technology, 20% on processes and 10% on humans as a process piece. I think that’s the wrong way to look at it.”

What is the right way to think about AI? At one end of the spectrum, people are touting all the positive things AI will enable, such as tackling some of our world’s biggest social problems. On the other end of the spectrum are Elon Musk, Stephen Hawking and others who foresee a dark future that involves unprecedented job losses if not human extermination.

Regardless of one’s personal view of the matter, business leaders need to be thinking harder and differently about the impact AI may have on their businesses and their workforces. Now.

How to think about the problem

The future’s trajectory is not set. It changes and evolves with technology and culture. Since AI’s end game is not completely foreseeable, one way to approach the problem, according to the survey, is to begin with the desired outcome, think about the processes required to achieve that outcome and then ponder how machines and humans can complement each other.

“Generally, the biggest impediment we see out there is the inability to create a portfolio of initiatives, so having a team or a number of teams coming back and saying, ‘These are the 50 things I could do with AI based on what AI is able to do today and in the next 12 months,’ and then [it’s up to senior management to] prioritize them,” said Giacomelli. “You need to have people going through the organization, unearthing places where value can be impacted.”

Over the last three decades or so, business leaders have been setting strategy and then implementing it, which isn’t going to work moving forward. The AI/human equation requires a hypothesis-driven approach in which experiments can fail fast or succeed.

“It’s a lot more about collective intelligence than let’s get a couple of experts and let them tell us where to do this. There are no experts here,” Giacomelli said.

Focus on the workforce

AI will impact every type of business in some way. The question is, what are business leaders doing to prepare their workforce for a future in which part or all of their jobs will be done by AI? According to the survey, 82% of the business leaders plan to implement AI-related technologies in the next three years but only 38% are providing employees with reskilling options.

“I think HR functions are completely backwards on this one,” said Giacomelli. “They haven’t started connecting the dots with what needs to be done with the employees.”

Some companies are already working on workforce planning, but they view AI as a means of materially reducing the workforce, such as by 20% or 30%, which Giacomelli considers “a primitive approach.”

“There are jobs that will go away completely. For example, people who do reconciliation of basic accounts, invoices, that kind of stuff,” he said. “Most of the jobs that will be impacted will be impacted fractionally, so part of the job is eliminated and then you figure out how to skill the person who does that job so she can use the machine better.”

What would people do, though? It’s clear that most working professionals have various types of experience. The challenge for HR is to stop looking at a snapshot of what a candidate or employee is today and what prior experience has qualified them to do what they do today. Instead, they should consider an individual’s future trajectory. For example, some accountants have become sales analysts or supply chain analysts.

Looking for clues about what particular roles could evolve into is wise, but that does not provide the entire picture, since all types of jobs will either evolve or become obsolete in their current forms.

“I don’t feel that many people are looking at the human element of digital transformation and AI except fearful people,” said Giacomelli. “Every year, we will see people somewhere making sense of this riddle and starting to work in a different way. I think we need to change the way we look at career paths. We’ll have to look at them in a hypothesis testing way as opposed to have a super guru in HR who knows how AI will impact our career paths, because they don’t [know].”

The bottom line is that individuals need to learn how to learn because what AI can do today differs from what it will be able to do tomorrow, so the human-and-machine relationship will evolve over time.

Even if AI were just a science fiction concept today, the accelerating pace of technology and business underscores the fact that change is inevitable, so organizations and individuals need to learn how to cope with it.

Don’t dismiss the other guy

AI proponents and opponents both have valid arguments because any tool, including AI, can be used for good or evil. While it’s true AI will enable positive industrial, commercial and societal outcomes, the transition could be extremely painful for the organizations and individuals who find themselves relics of a bygone era, faster than they imagined.

AI-related privacy and security also need more attention than they’re getting today because the threats are evolving rapidly and the pace will accelerate over time.

An important fundamental question is whether humans can ultimately control AI, which remains to be seen. Microsoft’s Tay Twitterbot demonstrated that AI can adopt the most deplorable forms of human expression, quickly. In less than 24 hours, that experiment was shut down. Similarly, a Facebook chatbot experiment demonstrated that AI is capable of developing its own language, which may be nonsensical or even undecipherable by humans. So risks and rewards both need to be considered.
