Lisa Morgan's Official Site

Strategic Insights and Clickworthy Content Development

Category: Analytics (page 1 of 4)

Workforce Analytics Move Beyond HR

Workforce analytics have traditionally been viewed as an HR tool, yet their value can have significant business-wide impact. Realizing this, more business leaders are demanding insight into workforce dynamics to unearth patterns that weren’t apparent before.

Businesses often claim that talent is their greatest asset, but they’re not always able to track what’s working, what isn’t, and why. For example, in Deloitte Consulting’s 2018 Global Human Capital Trends report, 71% of survey participants said their companies consider people analytics a high priority, but only 10% are “very ready” to deal with it. According to David Fineman, specialist leader at Deloitte Consulting, who co-authored the report, business leaders want insights into six focus areas: workforce planning and shaping, recruiting and staffing, talent optimization, culture and engagement, performance and rewards, and HR service delivery.

“The important distinction between focus areas that are addressed today compared with the focus areas from prior years is the emphasis on issues that are important to business leaders, not limiting analytics recipients to an HR audience,” said Fineman.

In fact, the Deloitte report explicitly states that board members and CEOs want access to people analytics because they’re “impatient with HR teams that can’t deliver actionable information and insights…”

As businesses continue to digitize more tasks and functions, it’s essential for them to understand the current makeup of their workforces, what talent will be needed in the future, and what’s necessary to align the two.

Shebani Patel, People Analytics leader at professional services firm PricewaterhouseCoopers (PwC), said that companies now want to understand employee journeys from onboarding to daily work experiences to exit surveys.

“They’re trying to get more strategic about how all of that comes together to build and deliver an exceptional [employee] experience that ultimately has ROI attached to it,” she said.

What companies are getting right

The availability of more people analytics tools enables businesses to understand their workforces in greater detail than ever before. However, the insights sought are not just insights about people, but about how those insights relate directly to business value, such as achieving higher levels of customer satisfaction or improving product quality. Businesses are also placing more emphasis on organizational network analysis (ONA), which provides insight into the interactions and relationships among people.

While it’s technologically possible to track what individuals do, there are also privacy concerns that are best addressed using clustering techniques. For example, KPMG’s clients are looking at email patterns, chat correspondence and calendared meetings to understand how team behavior correlates with performance, productivity or sales.
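One common way to honor those privacy concerns is to aggregate individual communication metrics up to the team level before anyone sees them. The sketch below illustrates the idea; the event format and the `min_team_size` threshold are my own assumptions for illustration, not details from KPMG’s work:

```python
from collections import defaultdict

def team_level_metrics(events, min_team_size=5):
    """Aggregate per-person message counts up to team level, suppressing
    teams too small for individuals to remain anonymous."""
    teams = defaultdict(lambda: {"messages": 0, "members": set()})
    for person, team, message_count in events:
        teams[team]["messages"] += message_count
        teams[team]["members"].add(person)
    # Report only averages for teams above the anonymity threshold.
    return {
        team: t["messages"] / len(t["members"])
        for team, t in teams.items()
        if len(t["members"]) >= min_team_size
    }
```

Because only team-level averages leave the function, no individual’s email or chat volume is ever exposed downstream.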

“Organizations today are using [the data] to derive various hypotheses and then use analytics to prove them out,” said Paul Krasilnick, director, Advisory Services at KPMG. “They recognize that it needs to be done cautiously in terms of data privacy and access to information, but they also recognize the value of advancing their internal capabilities and maturity from descriptive reporting to more prescriptive [analytics].”

According to Deloitte’s Fineman, high performing people analytics teams are characterized by increasing the analytics acumen within the HR function and among stakeholders.

What needs to improve

Like any other analytics journey, what needs to be improved depends on an organization’s level of mastery.  While all organizations have people data, they don’t necessarily have clean data.  Further, the mere existence of the data does not mean it’s readily usable or necessarily valuable.

The Trouble with Data About Data

Two people looking at the same analytical result can come to different conclusions. The same goes for the collection of data and its presentation. A couple of experiences underscore how the data about data — even from authoritative sources — may not be as accurate as the people working on the project or the audience believe. You guessed it: Bias can turn a well-meaning, “objective” exercise into a subjective one. In my experience, the most nefarious thing about bias is the lack of awareness or acknowledgement of it.

The Trouble with Research

I can’t speak for all types of research, but I’m very familiar with what happens in the high-tech industry. Some of it involves considerable primary and secondary research, and some of it involves one or the other.

Let’s say we’re doing research about analytics. The scope of our research will include a massive survey of a target audience (because higher numbers seem to indicate statistical significance). The target respondents will be a subset of subscribers to a mailing list or individuals chosen from multiple databases based on pre-defined criteria. Our errors here most likely will include sampling bias (a non-random sample) and selection bias (aka cherry-picking).

The survey respondents will receive a set of questions that someone has to define and structure. That someone may have a personal agenda (confirmation bias), may be privy to an employer’s agenda (funding bias), and/or may choose a subset of the original questions (potentially selection bias).

The survey will be supplemented with interviews of analytics professionals who represent the audience we survey, demographically speaking. However, they will have certain unique attributes — a high profile or they work for a high-profile company (selection bias). We likely won’t be able to use all of what a person says so we’ll omit some stuff — selection bias and confirmation bias combined.

We’ll also do some secondary research that bolsters our position — selection bias and confirmation bias, again.

Then, we’ll combine the results of the survey, the interviews, and the secondary research. Not all of it will be usable because it’s too voluminous, irrelevant, or contradicts our position. Rather than stating any of that as part of the research, we’ll just omit those pieces — selection bias and confirmation bias again. We can also structure the data visualizations in the report so they underscore our points (and misrepresent the data).

Bias is not something that happens to other people. It happens to everyone because it is natural, whether consciously or unconsciously. Rather than dismiss it, it’s prudent to acknowledge the tendency and attempt to identify what types of bias may be involved, why, and rectify them, if possible.

I recently worked on a project for which I did some interviews. Before I began, someone in power said, “This point is [this] and I doubt anyone will say different.” Really? I couldn’t believe my ears. Personally, I find assumptions to be a bad thing because unlike hypotheses, there’s no room for disproof or differing opinions.

Meanwhile, I received a research report. One takeaway was that vendors are failing to deliver “what end customers want most.” The accompanying infographic shows, on average, that 15.5% of end customers want what 59% of vendors don’t provide. The information raised more questions than it answered on several levels, at least for me, and I know I won’t get access to the raw data.

My overarching point is that bias is rampant and burying our heads in the sand only makes matters worse. Ethically speaking, I think as an industry, we need to do more.

 

Analytics Leaders and Laggards: Which Fits Your Company?

Different companies and industries are at different levels of analytical maturity. There are still businesses that don’t use analytics at all and businesses that are masters by today’s standards. Most organizations are somewhere in between.

So, who are the leaders and laggards anyway? The International Institute for Analytics (IIA) asked that question in 2016 and found that digital natives are the most mature and the insurance industry is the least mature.

How Industries and Sectors Stack Up

IIA’s research included 11 different industries and sectors, in addition to digital natives. The poster children included Google, Facebook, Amazon, and Netflix. From Day 1, data has been their business and analytics has been critical to their success.

The report shows the descending order of industries in terms of analytical maturity, with insurance falling behind because its IT and finance analytics are the weakest of all.

Another report, from business and technology consultants West Monroe Partners found that only 11% of the 122 insurance executives they surveyed think their companies are realizing the full benefits of advanced analytics. “Advanced analytics” in this report is defined as identifying new revenue opportunities, improving customer and agent experience, performing operational diagnostics, and improving control mechanisms.

Two of the reasons West Monroe cited for the immaturity of the insurance industry are the inability to quantify the ROI and poor data quality.

Maturity is a Journey

Different organizations and individuals have different opinions about what an analytics maturity model looks like. IIA defines five stages ranging from “analytically impaired” (organizations that make decisions by gut feel) to “analytical nirvana” (using enterprise analytics).

“Data-first companies haven’t had to invest in becoming data-driven since they are, but for the companies that aren’t data-first, understanding the multi-faceted nature of the journey is a good thing,” said Daniel Magestro, research director at IIA. “There’s no free lunch, no way to circumvent this. The C-suite can’t just say that we’re going to be data-driven in 2017.”

Others look at the types of analytics companies are doing: descriptive, predictive, and prescriptive. However, looking at the type of analytics doesn’t tell the entire story.

What’s interesting is that different companies at different stages of maturity are stumped by different questions: Do you think you need analytics? If the answer is no, then it’s going to be a long and winding road.

Why do you think you need analytics? What would you use analytics to improve? Those two related questions require serious thought. Scope and priorities are challenges here.

How would you define success? That can be a tough question because the answers have to be quantified and realistic to be effective. “Increase sales” doesn’t cut it. How much and when are missing.

One indicator of maturity is what companies are doing with their analytics. The first thing everyone says is, “make better business decisions,” which is always important. However, progressive companies are also using analytics to identify risks and opportunities that weren’t apparent before.

The degree to which analytics are siloed in an organization also affects maturity, as can the user experience. Dashboards can be so complicated that they’re ineffective, or simple enough to prioritize and expedite decision-making.

Time is another element. IT-created reports have fallen out of favor. Self-service is where it’s at. At the same time, it makes no sense to pull the same information in the same format again and again, such as weekly sales reports. That should simply be automated and pushed to the user.

The other time element — timeliness whether real-time, near real-time, or batch — is not an indication of maturity in my mind because what’s timely depends on what’s actually necessary.

4 Ways Companies Impede Their Analytics Efforts

Businesses in the race to become “data-driven” or “insights-driven” often face several disconnects between their vision of an initiative and their execution of it. Of course, everyone wants to be competitive, but there are several things that differentiate the leaders from the laggards. Part of it is weathering the growing pains that companies tend to experience, some of which are easier to change than others. These are some of the stumbling blocks.

Business objectives and analytics are not aligned

Analytics still takes place in pockets within the majority of organizations. The good news is that various functions are now able to operate more effectively and efficiently as a result of applying analytics. However, there is greater power in aligning efforts with the strategic goals of the business.

In a recent research note, Gartner stated, “Internally, the integrative, connected, real-time nature of digital business requires collaboration between historically independent organizational units. To make this collaboration happen, business and IT must work together on vision, strategy, roles and metrics. Everyone is going to have to change, and everyone is going to have to learn.”

All of that requires cultural adjustment, which can be the most difficult challenge of all.

There’s insight but no action

It’s one thing to get an insight and quite another to put that insight into action. To be effective, analytics need to be operationalized, which means weaving analytics into business processes so that insights can be turned into meaningful actions. Prescriptive analytics is part of it, but fundamentally, business processes need to be updated to include analytics. A point often missed is that decisions and actions are not ends in themselves. They, too, need to be analyzed to determine their effectiveness.

An EY presentation stresses the need to operationalize analytics. Specifically, it says, “The key to operationalizing analytics is to appreciate the analytics value chain.”

Interestingly, when most of us think about “the analytics value chain” we think of data, analytics, insights, decisions and optimizing outcomes. While that’s the way work flows, EY says our thought process should be the reverse. Similarly, to optimize a process, one must understand what that process is supposed to achieve (e.g., thwart fraud, improve customer experience, reduce churn).

They’re not looking ahead

Less analytically mature companies haven’t moved beyond descriptive analytics yet. They’re still generating reports, albeit faster than they used to because IT and lines of business tend to agree that self-service reporting is better for everyone. Gartner says “the BI and analytics market is in the final stages of a multiyear shift from IT-led, system-of-record reporting to business-led, self-service analytics. As a result, the modern business intelligence and analytics platform has emerged to meet new organizational requirements for accessibility, agility and deeper analytical insight.”

Still, organizations can only get so far with descriptive analytics. If they want to up their competitive game, they need to move to predictive and prescriptive analytics.

Poor data quality prevents accurate analytics

If you don’t have good data, or a critical mass of the right data, your analytical outcomes are going to fall short. Just about any multichannel (and sometimes even single-channel) communication experience with a bank, a telephone company, a credit card company, or a vendor support organization will prove that data quality is still a huge issue. Never mind that some of these are big-brand companies that invest staggering amounts of money in technology, including data and analytics technologies.

In a typical telephone scenario, a bot asks the customer to enter an account number or a customer number. If the customer needs to be transferred to a live customer service representative (CSR), chances are the CSR will ask the customer to repeat the number because it doesn’t come up on their screen automatically. If the CSR can’t resolve the issue, then the call is usually transferred to a supervisor or different department. What was your name and number again? It’s a frustrating problem that’s all too common.

The underlying problem is that customer information is stored in different systems for different reasons, such as sales, CRM, and finance.

I spoke with someone recently who said a company he worked with had gone through nearly 20 acquisitions. Not surprisingly, data quality was a huge issue. The most difficult part was dealing with the limited fields in a legacy system. Because the system did not contain enough of the appropriate fields in which to enter data, users made up their own workarounds.

These are just a few of the challenges organizations face on their journey.

Why Surveys Should Be Structured Differently

If you’re anything like me, you’re often asked to participate in surveys. Some of them are short and simple. Others are very long, very complicated, or both.

You may also design and implement surveys from time to time like I do.   If you want some insight into the effectiveness of your survey designs and their outcomes, pay attention to the responses you get.

Notice the Drop-off Points

Complicated surveys that take 15 or 20 minutes to complete tend to have drop-off points at which respondents decide that the time investment required isn’t worth whatever incentive is offered. After all, not everyone actually cares about survey results or a 1-in-1,000 chance of winning the latest iPad, for example. If there’s no incentive whatsoever, long and complicated surveys may be even less successful, even if you’re pinging your own database.

A magazine publisher recently ran such a survey, and boy, was it hairy. It started out like similar surveys, asking questions about the respondent’s title, affiliation, company revenue and size. It also asked about purchasing habits – who approves, who specifies, who recommends, etc., for different kinds of technologies. Then it asked about the respondent’s content preferences for learning about tech (several drop-down menus), using tech (several drop-down menus), purchasing tech (several drop-down menus), and I can’t remember what else. At that point, one was about 6% done with the survey. So much for “10 – 15 minutes.” It took about 10 or 15 minutes just to wade through the first single-digit percent of it. One would really want a slim chance of winning the incentive to complete that survey.

In short, the quest to learn everything about everything in one very long and complex survey may end in more knowledge about who took the survey than how people feel about important issues.

On the flip side are very simple surveys that take a minute or two to answer.  Those types of surveys tend to focus on whether a customer is satisfied or dissatisfied with customer service, rather than delving into the details of opinions about several complicated matters.

Survey design is really important. Complex fishing expeditions can, and often do, reflect a lack of focus on the survey designer’s part.

Complex Surveys May Skew Results

Overly complicated surveys may also yield spurious results.  For example, let’s say 500 people agree to take a survey we just launched that happens to be very long and very complex.  Not all of the respondents will get past the who-are-you questions because those too are complicated.  Then, as the survey goes on, more people drop, then more.

The result is that X% of the survey responses at the end of the survey are not the same as X% earlier in the survey. What I mean is that 500 people started, maybe 400 got past the qualification portion, and the numbers continue to fall as yet more complicated questions arise while the “progress bar” shows little forward movement. By the end of the survey, far fewer than 500 have participated, maybe 200 or 100.

Of course, no one outside the survey team knows this, including the people in the company who are presented with the survey results.  They only know that 500 people participated in the survey and X% said this or that.

However, had all 500 people answered all the questions, the results of some of the questions would likely look slightly or considerably different, which may be very important.

Let’s say 150 people completed our survey and the last question asked whether they planned to purchase an iPhone 7 within the next three months. 40% of them, or 60 respondents, said yes. If all 500 survey respondents had answered that same question, I can almost guarantee you the answer would not be 40%. It might be close to 40%, or it might not be even close.
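A deliberately contrived sketch makes the mechanism concrete; the population and the patience model are invented purely for illustration, using the 500-start/150-finish numbers above:

```python
def completion_bias(population, finishers):
    """population: list of (patience, intends_to_buy) pairs.
    Returns the yes-rate among everyone vs. among only the
    `finishers` most patient respondents who complete the survey."""
    full_rate = sum(buy for _, buy in population) / len(population)
    stayers = sorted(population, key=lambda p: p[0], reverse=True)[:finishers]
    completer_rate = sum(buy for _, buy in stayers) / finishers
    return full_rate, completer_rate
```

If purchase intent correlates with the patience needed to finish a long survey, the rate among the 150 completers can differ dramatically from the rate the full 500 would have reported.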

So, if you genuinely care about divining some sort of “truth” from surveys, you need to be mindful about how to define and structure the survey and that the data you see may not be telling you the entire story, or even an accurate story.

The point about accuracy is very important and one that people without some kind of statistical background likely haven’t even considered because they’re viewing all aggregate numbers as having equal weight and equal accuracy.

I, for one, think that survey “best practices” are going to evolve in the coming years with the help of data science.  While the average business person knows little about data science now, in the future it will likely seem cavalier not to consider the quality of the data you’re getting and what you can do to improve the quality of that data.  Your credibility and perhaps your job may depend on it.

In the meantime, try not to shift the burden of thinking entirely to your survey audience because it won’t do either of you much good.  Think about what you want to achieve, structure your questions in a way that gives you insight into your audience and their motivations (avoid leading questions!), and be mindful that not all aggregate answers are equally accurate or representative, even within the same survey.

What Retailers Know About You

Retailers now have access to more information than ever. They’re using loyalty cards, cameras, POS transaction data, GPS data, and third-party data in an effort to get shoppers to visit more often and buy more. The focus is to provide better shopping experiences on a more personalized level. Operationally speaking, they’re trying to reduce waste, optimize inventory selection, and improve merchandising.

The barrier to personalized experiences is PII (personally identifiable information), of course.

“[What retailers know about you] is still largely transaction-based,” said Dave Harvey, VP of Thought Leadership at branding and retail services provider Daymon. “Information about lifestyles, behaviors and attitudes is hard for retailers to get themselves, so they partner with companies and providers that have those kinds of panels, especially getting information about how people are reacting through social media and what they’re buying online.”

Transactions Are Driving Insights

The most powerful asset is a retailer’s transactional database. How they segment data is critical, whether it’s transaction-based reach, frequency, lifestyle behaviors, or product groupings. Retailers can identify how you live your life based on the products you buy.

“The biggest Achilles’ heel of transaction data, no matter how much you’re segmenting it and how much you’re mining it, is that you’re not seeing what your competitors are doing,” said Harvey. “Looking at transaction data across your competition becomes critical.”

As consumers we see the results of that in offers, which may show up in an app, email, flyer or coupons generated at the POS.

Social media scraping has also become popular, not only to gauge consumer sentiment about brands and products, but also to provide additional lifestyle insight.

Some retailers are using predictive and prescriptive analytics to optimize pricing, promotions and inventory. They also have a lot of information about where their customers are coming from, based on credit card transactions. In addition, they’re using third-party data to understand customer demographics, including the median incomes of the zip codes in which customers live.

They’re Watching Your Buying Patterns

Retailers monitor what shoppers buy over time, including items they tend to buy together, such as shirts and ties or eggs and orange juice. The information helps them organize shelves, aisles, and end caps.
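The items-bought-together signal is classically computed with market-basket co-occurrence counts. A minimal sketch of that technique follows; the basket data is invented for illustration:

```python
from collections import Counter
from itertools import combinations

def pair_counts(baskets):
    """Count how often each pair of distinct items appears in the
    same transaction; high-count pairs suggest shelf adjacencies."""
    counts = Counter()
    for basket in baskets:
        # Deduplicate and sort so each pair has one canonical key.
        for pair in combinations(sorted(set(basket)), 2):
            counts[pair] += 1
    return counts
```

Running this over a transaction log surfaces pairs like eggs and orange juice, which can then inform aisle layout and end-cap decisions.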

“There are a lot of implications for meal solutions and category adjacencies, how people are shopping in the store, and how that might lead a retailer to test ways of offering the right combination of products to create a solution somewhere in the store,” said Harvey. “You can’t be everything to everyone, so how can the information help you prioritize where to focus? The information you can mine from your transaction database and your loyalty card database can help you become more efficient.”

Information about buying patterns and price elasticity allows retailers to micro-target so effectively that shoppers visit the store more often and spend more money.

They May Know How You Shop the Store

Shopping carts and baskets are a necessary convenience for customers, although the latest ones include sensors that track customers’ paths as they navigate through the store.

“They can get the path data and purchase data about how much time you spend at stations, and they can use it to redesign the store and get you to move through the store more, because they know the more you move through the store, the more you buy,” said PK Kannan, a professor of marketing science at the University of Maryland’s Robert H. Smith School of Business.

Retailers also use cameras to optimize merchandising and to better understand customer behavior, including where shoppers go and how long they stay. Now they’re also analyzing facial expressions to determine shoppers’ states of mind.

Driving Business Value from Analytics

Different kinds of analytics result in different ROI. If a retailer is just starting out, Kannan recommends starting with loyalty cards, since other types of data capture and analysis can be prohibitively expensive and the analysis can be cumbersome.

“The ROI on loyalty cards is pretty good,” said Kannan. “The initial ROI is going to be high, and then as you go into more of this cart data or visual data, video data, your ROI is going to level off.”

Strategies also differ among types of retailers. For example, a specialty retailer will want data that provides deep insight into the category and shoppers of that category versus a store such as Walmart that carries items in many different categories.

“If you’re a retailer trying to sell a ton of categories you want to understand how people are talking about their shopping experience,” said Harvey. “There’s still a lot of untapped opportunity in understanding social media as it relates to doing better analysis with retailers.”

They’re Innovating

Retailers are working hard to understand their customers, so they can provide better shopping experiences. While personalization techniques are getting more sophisticated, there’s only so far they can go legally in many jurisdictions.

Kannan said a way of getting around this is to take all the informational content, remove any PII, and then extract the resulting information out of the data.

“It’s like I’m taking the kernel from this thing because I don’t have the space to store it and keeping it is not a good policy, so I am going to keep some of the sufficient statistics with me and as new data comes in, I’m going to combine the old data with new data and use it for targeting purposes,” said Kannan. “That’s becoming more of a possibility now, and also it’s a reality because data volumes are increasing like crazy. That way I don’t have to store all the data in a data lake.”
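For a simple case like a running mean, Kannan’s sufficient-statistics idea can be sketched as follows: keep only compact aggregates, discard the raw (possibly PII-bearing) rows, and merge new batches as they arrive. The function names here are mine, not his:

```python
def summarize(values):
    """Reduce a batch of raw observations to sufficient statistics
    for mean and variance: (count, sum, sum of squares)."""
    return (len(values), sum(values), sum(v * v for v in values))

def merge(a, b):
    """Combine two summaries; the raw rows behind them can be deleted."""
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def mean(stats):
    count, total, _ = stats
    return total / count
```

Because the summaries are additive, last month’s aggregates can be combined with this week’s batch without ever re-reading, or retaining, the original records.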

How Today’s Analytics Change Recruiting

HR is late to the analytics game by modern standards, and yet, HR metrics is not a new concept. The difference is that modern analytics enable HR professionals and recruiters to measure more things in less time and derive more insight than ever before.


“If you’re looking at recruiting, there have always been metrics such as time to hire and cost per hire, but you’re seeing other channels and avenues opening up,” said Rosemary Haefner, chief human resources officer at the online employment website CareerBuilder.com.

The “time to hire” or “time to fill” metric measures how many days it takes from the time a requisition is posted until the time an offer is accepted. The longer a position remains open, the higher the cost of talent acquisition. In addition, if a position remains open, an intervention may be necessary to ensure the work at hand is getting done.
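The metric itself is simple date arithmetic, sketched below; the requisition data in the example is invented:

```python
from datetime import date

def time_to_fill(posted, accepted):
    """Days from the requisition being posted to the offer being accepted."""
    return (accepted - posted).days

def average_time_to_fill(requisitions):
    """Average time to fill across (posted, accepted) date pairs."""
    days = [time_to_fill(posted, accepted) for posted, accepted in requisitions]
    return sum(days) / len(days)
```

Tracking the average over time, or per department, is what turns the raw dates into a signal that an intervention may be needed.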

If time to fill were the only measure of success then, in theory, the faster a position is filled, the better. However, as most working professionals have experienced, the person who can be hired the fastest isn’t necessarily (and probably isn’t) the best candidate.

On the other hand, moving too slowly can cost organizations sought-after talent.

“There’s the time to fill, the cost of the person you hire, whether that person is high-potential and what their expected tenure in the organization is. That’s an example of four interrelated metrics,” said Muir Macpherson, Americas analytics leader, People Advisory Services at EY. “HR needs to stop thinking about individual metrics and consider the problem they’re trying to solve and how to optimize across a set of metrics simultaneously.”

Beyond keywords

Talent marketplaces and talent acquisition software made it easier to navigate a sea of resumes using keywords and filters. In response, some candidates stuffed their resumes full of keywords so their resumes would rank higher in searches. If one’s resume ranked higher in searches, then more people would see it, potentially increasing the candidate’s chance of getting interviews and landing a job.

Masterful keyword use demonstrated an awareness that the recruiting process was changing from a paper-based process to a computer or web-based process. However, other candidates who might have been better fits for positions risked getting lost in the noise.

The whole keyword trend was a noble effort, but keywords, like anything else, are not a silver bullet.

With today’s analytics tools, HR departments and search firms can understand much more about candidates and the effectiveness of their operations.

“You can use a variety of big data and machine learning techniques that go way beyond the keyword analysis people have been doing for a while that integrates all of the data available about a candidate into one, unified prediction score that can then be used as one additional piece of information that recruiters and hiring managers can look at when making their decisions,” said Macpherson.

Data impacts recruiters too

Recruiters now have access to data analytics tools that enable them to better match candidates with potential employers and improve the quality of their services. Meanwhile, HR departments want insight into what recruiters are doing and how well they’re doing it. The Scout Exchange marketplace provides transparency between the two.

“We can look at every candidate [a recruiter] submits to see how far they got in the process and whether they got hired. We use that for ratings so [companies and the recruiters they use] can see the other side’s rating,” said Scout Exchange CEO Ken Lazarus.

The site enables organizations to quickly find appropriate recruiters who can identify the best candidates for a position. It also allows HR departments to see data and trends specific to their company.

Bottom line

Analytics is providing HR departments, recruiters and business leaders with quantitative information they can use to improve their processes and outcomes.

“Knowledge is power and having that data is helpful. For me, the first step is knowing what you’re solving for,” said CareerBuilder’s Haefner.

Right now, HR analytics tend to emphasize recruitment. However, attracting talent is sometimes easier than retaining it so it’s important to have insight throughout the lifecycle of employee relationships. EY’s Macpherson said HR departments should think in terms of “employee lifetime value” similar to the way marketers think about customer lifetime value.

“[HR analytics represents] a huge opportunity because for most companies, people and compensation are their biggest costs and yet there has been very little effort put into analyzing those costs or getting the most out of those investments that companies are making,” said EY’s Macpherson.

How the IoT Will Impact Data Analytics

IoT devices are just about everywhere: in cities, on oil rigs, and on our wrists. They're impacting virtually every industry, and their growth is outpacing organizations' ability to make the most of the data they generate.

To give you an idea of scale, IDC expects global IoT spending to reach nearly $1.4 trillion by 2021, up from $800 billion in 2017. The IoT is all around us, in many cases fading into the backgrounds of our homes and lifestyles, all the while generating massive amounts of data. The trick is driving value from that data.

The Balance of Data is Shifting

Over the past decade, we’ve witnessed several shifts in enterprises’ ability to deal with data. While different companies and industries are at different stages of maturity, we’ve seen and continue to see analytics evolving, whether it’s adding unstructured analytics capabilities to structured analytics, third-party data sources to our own, or IoT data to enterprise data. Slowly but surely, we’ve been seeing the balance of data shift from internal data to external data, particularly as more IoT devices emerge.

Edge analytics helps separate meaningful data from all the noise, which usually means identifying, and perhaps reacting to, exceptions and outliers. For example, if the temperature of a piece of industrial equipment rises beyond a threshold, maintenance crews may be alerted, or the equipment might be shut down.
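A minimal sketch of that kind of edge-side exception filtering, where only threshold breaches leave the device; the temperature limit, device names, and readings are illustrative assumptions:

```python
from dataclasses import dataclass

TEMP_THRESHOLD_C = 90.0  # hypothetical safe limit for this equipment

@dataclass
class Reading:
    device_id: str
    temp_c: float

def filter_exceptions(readings, threshold=TEMP_THRESHOLD_C):
    """Keep only readings that breach the threshold; the rest is the
    'noise' that never needs to leave the edge device."""
    return [r for r in readings if r.temp_c > threshold]

readings = [Reading("pump-1", 72.4), Reading("pump-2", 95.1), Reading("pump-3", 88.9)]
alerts = filter_exceptions(readings)
for r in alerts:
    print(f"ALERT: {r.device_id} at {r.temp_c} C, notify maintenance")
```

A real deployment would also debounce transient spikes and decide locally whether to shut the equipment down, but the core idea is the same: react to outliers at the edge, forward only what matters.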

Organizations attempting to manage IoT data using their traditional data centers are fighting a losing battle. In fact, Gartner noted that the IoT is causing businesses to move to the cloud faster than they might move otherwise. In other words, when so many things are happening in the cloud, it makes sense to analyze them in the cloud.

Data and Analytics Strategies: Top-down and Bottom-up

The sheer amount of data organizations must deal with increases greatly with the IoT, and there are still philosophical debates about how much data should be kept and how much should be discarded. Gartner strongly advises its clients to be smart about IoT data, meaning that one should not save all of it in the hope of driving value from it in the future, but instead focus on strategic goals and how IoT data fits into them.

We often hear how important it is to align analytics efforts with business goals. At the same time, we also hear how important it is to uncover unknown opportunities and risks simply by allowing the data to speak for itself. Some of the most sophisticated companies I’ve talked to over the last several years are doing both, with machine learning identifying that which was not obvious previously. In Gartner’s view, “data and analytics must drive business operations, not reflect them.”

One major challenge organizations face, practically speaking, is operationalizing analytics — with or without the IoT. The core problem is moving from insights to action, which can’t be solved completely with prescriptive analytics. It’s a larger problem that has to do with company culture, stubborn attitudes and the very real challenges of integrating data sources.

Meanwhile, some organizations are pondering how they can use the IoT to improve customer experience, whether that’s minimizing transportation delays, improving environmental safety or otherwise eliminating friction points that tend to irritate humans. Humans have become fickle customers after all, and each touch point can affect a brand positively or negatively.

For example, Walmart placed kiosks in some of its stores that retrieve online orders, scan receipts and trigger the conveyor belt delivery of the items a customer purchased. The kiosks address a customer pain point which is walking all the way to the back of the store and waiting several minutes for someone to show up only to be told the order can’t be located.

Now think about what Walmart gets from the kiosks: trend data about customer use and experiences that may inform staffing, inventory management, marketing, and the supply chain. That data will also indicate whether the kiosks are ultimately a good idea or a bad one.

In the pharmaceutical industry, GSK has been working with partners to develop smart inhalers that track prescription compliance and dosing. The data helps inform research, and it also has value to doctors and pharmacies.

Similarly, enterprises can use IoT data to develop predictive models that help improve business operations, logistics, supply chain and more, depending on the nature of the sensors and the device.

Your Data Is Biased. Here’s Why.

Bias is everywhere, including in your data. A little skew here and there may be fine if the ramifications are minimal, but bias can negatively affect your company and its customers if left unchecked, so you should make an effort to understand how, where and why it happens.

“Many [business leaders] trust the technical experts but I would argue that they’re ultimately responsible if one of these models has unexpected results or causes harm to people’s lives in some way,” said Steve Mills, a principal and director of machine intelligence at technology and management consulting firm Booz Allen Hamilton.

In the financial industry, for example, biased data may produce results that violate the Equal Credit Opportunity Act (fair lending). That law, enacted in 1974, prohibits credit discrimination based on race, color, religion, national origin, sex, marital status, age or source of income. While lenders take steps not to include such data in a loan decision, it may be possible to infer race in some cases from a zip code, for example.
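A toy sketch of how a proxy variable can leak a protected attribute even when that attribute is excluded from the model. The zip codes and group labels below are fabricated for illustration; the check measures how accurately the majority group per zip code "predicts" the protected attribute:

```python
from collections import Counter, defaultdict

# Fabricated records: (zip code, protected attribute). The protected
# attribute is never fed to the model, but the zip code stands in for it.
records = [
    ("10001", "A"), ("10001", "A"), ("10001", "B"),
    ("20002", "B"), ("20002", "B"), ("20002", "B"),
]

def proxy_strength(rows):
    """Fraction of rows the per-zip majority group predicts correctly;
    a crude proxy-leakage check (0.5 = no leakage for two groups)."""
    by_zip = defaultdict(list)
    for zip_code, group in rows:
        by_zip[zip_code].append(group)
    correct = sum(Counter(groups).most_common(1)[0][1] for groups in by_zip.values())
    return correct / len(rows)

print(f"zip-code proxy accuracy: {proxy_strength(records):.2f}")
```

A high value means the "neutral" feature reconstructs the protected one, which is exactly the situation fair-lending reviews look for.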

“The best example of [bias in data] is the 2008 crash in which the models were trained on a dataset,” said Shervin Khodabandeh, a partner and managing director of Boston Consulting Group (BCG) Los Angeles, a management consulting company. “Everything looked good, but the datasets changed and the models were not able to pick that up, [so] the model collapsed and the financial system collapsed.”

What Causes Bias in Data

A considerable amount of data has been generated by humans, whether it’s the diagnosis of a patient’s condition or the facts associated with an automobile accident.  Quite often, individual biases are evident in the data, so when such data is used for machine learning training purposes, the machine intelligence reflects that bias.  A prime example of that was Microsoft’s infamous AI bot, Tay, which in less than 24 hours adopted the biases of certain Twitter members. The results were a string of shocking, offensive and racist posts.

“There’s a famous case in Broward County, Florida, that showed racial bias,” said Mills. “What appears to have happened is there was historically racial bias in sentencing so when you base a model on that data, bias flows into the model. At times, bias can be extremely hard to detect and it may take as much work as building the original model to tease out whether that bias exists or not.”

What Needs to Happen

Business leaders need to be aware of bias and the unintended consequences biased data may cause.  In the longer-term view, data-related bias is a governance issue that needs to be addressed with the appropriate checks and balances which include awareness, mitigation and a game plan should matters go awry.

“You need a formal process in place, especially when you’re impacting people’s lives,” said Booz Allen Hamilton’s Mills. “If there’s no formal process in place, it’s a really bad situation. Too many times we’ve seen these cases where issues are pointed out, and rather than the original people who did the work stepping up and saying, ‘I see what you’re seeing, let’s talk about this,’ they get very defensive and defend their approach so I think we need to have a much more open dialog on this.”

As a matter of policy, business leaders need to consider which decisions they’re comfortable allowing algorithms to make, the safeguards which ensure the algorithms remain accurate over time, and model transparency, meaning that the reasoning behind an automated decision or recommendation can be explained.  That’s not always possible, but still, business leaders should endeavor to understand the reasoning behind decisions and recommendations.

“The tough part is not knowing where the biases are there and not taking the initiative to do adequate testing to find out if something is wrong,” said Kevin Petrasic, a partner at law firm White & Case.  “If you have a situation where certain results are being kicked out by a program, it’s incumbent on the folks monitoring the programs to do periodic testing to make sure there’s appropriate alignment so there’s not fair lending issues or other issues that could be problematic because of key datasets or the training or the structure of the program.”

Data scientists know how to compensate for bias, but they often have trouble explaining what they did and why they did it, or the output of a model in simple terms. To bridge that gap, BCG’s Khodabandeh uses two models: one that’s used to make decisions and a simpler model that explains the basics in a way that clients can understand.
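That pairing resembles the surrogate-model technique: a simple, inspectable rule is fitted to mimic the black box's decisions so they can be explained. A minimal sketch, with a made-up "opaque" scoring function and a one-threshold explainer (the weights, features, and loan data are all invented):

```python
def complex_model(income, debt_ratio, tenure_years):
    """Stand-in for an opaque decision model; the weights are made up."""
    score = 0.5 * income / 1000 - 40 * debt_ratio + 2 * tenure_years
    return score > 20  # approve?

def fit_surrogate(samples):
    """Fit the simplest possible explainer: the single income threshold
    that best reproduces the black box's approve/deny decisions."""
    best_t, best_acc = None, -1.0
    for t in sorted({s[0] for s in samples}):
        acc = sum((s[0] >= t) == complex_model(*s) for s in samples) / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

samples = [(30_000, 0.6, 1), (55_000, 0.3, 4), (80_000, 0.2, 10), (42_000, 0.5, 2)]
threshold, fidelity = fit_surrogate(samples)
print(f"Surrogate rule: approve if income >= {threshold} (fidelity {fidelity:.0%})")
```

The fidelity number matters: a surrogate that matches the real model only loosely can mislead the very audience it was built to inform.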


BCG also uses two models to identify and mitigate bias: the original model, and a second one used to test extreme scenarios.

“We have models with an opposite hypothesis in mind which forces the model to go to extremes,” said Khodabandeh. “We also force models to go to extremes. That didn’t happen in the 2008 collapse. They did not test extreme scenarios. If they had tested extreme scenarios, there would have been indicators coming in in 2007 and 2008 that would allow the model to realize it needs to adjust itself.”
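One lightweight version of that kind of check is simply flagging incoming values that fall well outside the range the model was trained on, so the model's owners know its assumptions no longer hold. The numbers below are illustrative, not real market data:

```python
def training_bounds(train):
    """Observed range of the training data."""
    return min(train), max(train)

def drift_alerts(stream, bounds, margin=0.25):
    """Flag live values more than `margin` of the training span outside
    the training range; a crude out-of-regime detector."""
    lo, hi = bounds
    span = hi - lo
    return [x for x in stream if x < lo - margin * span or x > hi + margin * span]

train = [0.9, 1.1, 1.0, 1.2, 0.95]   # e.g. historical default rates (%)
live = [1.0, 1.1, 2.8, 3.5]          # incoming values the model never saw
alerts = drift_alerts(live, training_bounds(train))
print(f"{len(alerts)} out-of-regime observations: {alerts}")
```

A model that keeps scoring confidently on inputs like these is exactly the failure mode Khodabandeh describes from 2008.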

A smart assumption is that bias is present in your data regardless of its source. What the bias is, where it stems from, what can be done about it, and what its potential outcomes may be are all worth examining.

Conclusion

All organizations have biased data.  The questions are whether the bias can be identified, what effect that bias may have, and what the organization is going to do about it.

To minimize the negative effects of bias, business leaders should make a point of understanding the various types and how they can impact data, analysis and decisions. They should also ensure there’s a formal process in place for identifying and dealing with bias, which is likely best executed as a formal part of data governance.

Finally, the risks associated with data bias vary greatly, depending on the circumstances. While it’s prudent to ponder all the positive things machine learning and AI can do for an organization, business leaders are wise to understand the weaknesses also, one of which is data bias.

How to Teach Executives About Analytics

If your data is failing to persuade executives, maybe it’s not the data that is the problem. Here’s how to change your approach to fit the audience.

One of the biggest challenges data analysts and data scientists face is educating executives about analytics. The general tendency is to nerd out on data and fail to tell a story in a meaningful way to the target audience.

Sometimes data analytics professionals get so wrapped up in the details of what they do that they forget not everyone has the same background or understanding. As a result, they may use technical terms, acronyms, or jargon and then wonder why no one “got” their presentations or what they were saying.

They didn't do anything wrong, per se; the problem is how they're saying it and to whom.

If you find yourself in such a situation, following are a few simple things you can do to facilitate better understanding.

Discover What Matters

What matters most to your audience? Is it a competitive issue? ROI? Building your presence in a target market? Pay attention to the clues they give you and don’t be afraid to ask about their priorities. Those will clue you in to how you should teach them about analytics within the context of what they do and what they want to achieve.

Understand Your Audience

Some executives are extremely data-savvy, but the majority aren’t just yet. Dialogs between executives and data analysts or data scientists can be uncomfortable and even frustrating when the parties speak different languages. Consider asking what your target audience would like to learn about and why. That will help you choose the content you need to cover and the best format for presenting that content.

For example, if the C-suite wants to know how the company can use analytics for competitive advantage, then consider a presentation. If one of them wants to understand how to use a certain dashboard, that’s a completely different conversation and one that’s probably best tackled with some 1:1 hands-on training.

Set Realistic Expectations

Each individual has a unique view of the world. Someone who isn’t a data analyst or a data scientist probably doesn’t understand what that role actually does, so they make up their own story which becomes their reality. Their reality probably involves some unrealistic expectations about what data-oriented roles can do or accomplish or what analytics can accomplish generally.

One of the best ways to deal with unrealistic expectations is to acknowledge them and then explain what is realistic and why. For example, a charming and accomplished data scientist I know would be inclined to say, “You’d think we could accomplish that in a week, right? Here’s why it actually takes three weeks.”

Tell a Story

Stories can differ greatly, but the one thing good presentations have in common is a beginning, a middle, and an end. One of the mistakes I see brilliant people make is focusing solely on the body of a presentation, immediately going down some technical rabbit hole that's fascinating for people who understand it and confusing for everyone else.

A good beginning gets everyone on the same page about what the presentation is about, why the topic of discussion is important, and what you're going to discuss. The middle should explain the meat of the story in a logical way that flows from beginning to end. The end should briefly recap the highlights and help bring your audience to the same conclusion you're stating in your presentation.

Consider Using Options

If the executive(s) you’re presenting to hold the keys to an outcome you desire, consider giving them options from which to choose. Doing that empowers them as the decision-makers they are. Usually, that approach also helps facilitate a discussion about tradeoffs. The more dialog you have, the better you’ll understand each other.

Another related tip: make sure your options are within the realm of the reasonable. In a recent scenario, a data analyst wanted to add two people to her team. Her options were: A) do nothing and expect the same results; B) hire the two roles and be able to do X and Y, which the team couldn't do before; or C) hire five people and do even more, at a stated cost. She came prepared to discuss the roles, the interplay with the existing team, and where she got her salary figures. If asked what adding one, three, or four people would look like, she was prepared to answer that too.

Speak Plainly

Plain English is always a wise guide. Choose simple words and concepts, keeping in mind how the meaning of a single word can differ. For example, if you say, “These two variables have higher affinity,” someone may not understand what you mean by variables or affinity.

Also endeavor to simplify what you say, using concise language. For example, "The analytics of the marketing department has at one time or another tended to overlook the metrics of the customer service department" can be consolidated into "Our marketing analytics sometimes overlooks customer service metrics."
