Strategic Insights and Clickworthy Content Development

Author: misslisa

I'm a writer, editor, analyst, and writing coach.

Beware Analytics’ Mid-Life Crisis

Businesses are using analytics to stay competitive. One by one, departments are moving from static reports to modern analytics so they can fine-tune their operations. There’s no shortage of solutions designed for specific functions, such as marketing, sales, customer service and supply chain, most of which are available in SaaS form. So, when it’s possible just to pull out a credit card and get started with an application, why complicate things by involving IT?

Freedom from IT seems like a liberating concept until something goes wrong. When data isn’t available or the software doesn’t work as advertised, it becomes the IT department’s job to fix it.

“I used to call this the BI mid-life crisis. Usually about a year and a half or two years in, [departments] realize they can’t report accurately and then they need some help,” said Jen Underwood, founder of Impact Analytix, and a recognized analytics industry expert. “Now I’m seeing more IT involvement again.”

Organizations serious about competing on insights need to think holistically about how they’re approaching analytics and the role of IT. Disenfranchising IT from analytics may prove to be short-sighted. For example, a proof of concept may not scale well or the data required to answer a question might not be available.

Analytics’ long-term success depends on IT

IT was once the sole gatekeeper of technology, but as the pace of business has continued to accelerate, departments have become less tolerant of delays caused by IT. While it’s true no one understands departmental requirements better than the department itself, IT is better equipped to identify what could go wrong, technically speaking.

Even if a department owns and manages all of its data, at some point it will likely want to combine that data with other data, perhaps from a different group.

“We became accustomed to IT organizations managing the database architectures or the data stores and any of the enterprise-wide user-facing applications,” said Steven Escaravage, vice president in Booz Allen Hamilton’s Strategic Innovation Group. “I think that’s changed over the last decade, where there’s been a greater focus on data governance, and so you also see IT organizations today managing the process and the systems used to govern data.”

Additionally, as more organizations start analyzing cross-functional data, it becomes apparent that the IT function is necessary.

“IT plays an important part in ensuring that these new and different kinds of data are in a platform or connected or integrated in a way that the business can use. That is the most important thing and something companies struggle with,” said Justin Honaman, a managing director in the Digital Technology Advisory at Accenture.

Where analytics talent resides varies greatly

There’s an ongoing debate about where analytics talent should reside within a business. It’s common for departments to have their own business analysts, but data science teams, including data analysts, often reside in IT.

The argument in favor of a centralized analyst team is visibility across the organization, though domain-specific knowledge can be a problem. The argument in favor of decentralization is the reverse. Accenture’s Honaman said he’s seeing more adoption of the decentralized model in large companies.

Hybrid analytics teams, like hybrid IT, combine a center of excellence with dedicated departmental resources.

Hot analytics techs

Machine learning and AI are becoming popular features of analytics solutions. However, letting machine learning loose on dirty and biased data can lead to spurious results; the value of predictive and prescriptive analytics depends on their accuracy.

As machine learning-based applications become more in vogue, analytics success depends on “the quality of not just the data, but the metadata associated with it [that] we can use for tagging and annotation,” said Booz Allen Hamilton’s Escaravage. “If IT is not handling all of that themselves, they’re insisting that groups have metadata management and data management capabilities.”

Meanwhile, the IoT is complicating IT ecosystems by adding more devices and edge analytics to the mix.  Edge analytics ensures that the enterprise can filter meaningful data out of the mind-boggling amount of data IoT devices can collect and generate.

In short, the analytical maturity of organizations can’t advance without IT’s involvement.

Just a Bit of Advice:  

Strategies for Successful Analytics

A few helpful hints as you move through your analytics journey.

If you’re just getting started on your data and analytics journey, think before you act.

Steven Escaravage of Booz Allen Hamilton noted, “I tell clients to take a step back before they invest millions of dollars.” Among other things, he said, make sure to have a good foundation around what questions you’re trying to solve today and the questions you perceive are coming down the path.

“Let’s put together a data wish list and compare it to where we’re at, because usually you’re going to have to make investments in generating data to answer questions effectively,” he added. All the other pieces about methods and techniques, tools and solutions follow these actions.

If you’re at the pilot stage, beware of scalability challenges.

“Very rarely for sophisticated analytic problems would I lean on a typical Python pilot deployment in production,” said Escaravage. “You’d typically move to something you knew could scale and wouldn’t become a bottleneck in the computational pipeline.”

If you’re in production, you may be analyzing all kinds of things, but are you measuring the effectiveness of your solutions, processes and outcomes? If not, you may not have the complete feedback loop you think you have.

How Today’s Analytics Change Recruiting

HR is late to the analytics game by modern standards, and yet HR metrics are not a new concept. The difference is that modern analytics enable HR professionals and recruiters to measure more things in less time and derive more insight than ever before.

“If you’re looking at recruiting, there have always been metrics such as time to hire and cost per hire, but you’re seeing other channels and avenues opening up,” said Rosemary Haefner, chief human resources officer at the online employment website CareerBuilder.com.

The “time to hire” or “time to fill” metric measures how many days it takes from the time a requisition is posted until the time an offer is accepted. The longer a position remains open, the higher the cost of talent acquisition. In addition, if a position remains open, an intervention may be necessary to ensure the work at hand is getting done.
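
To make the arithmetic concrete, here is a minimal Python sketch that computes time to fill per requisition and the average across them; the dates and record layout are invented for illustration.

```python
from datetime import date
from statistics import mean

# Hypothetical requisition records: (date posted, date the offer was accepted)
requisitions = [
    (date(2017, 9, 1), date(2017, 10, 13)),
    (date(2017, 9, 15), date(2017, 11, 20)),
    (date(2017, 10, 2), date(2017, 10, 30)),
]

# Time to fill: days from posting until the offer is accepted
days_to_fill = [(accepted - posted).days for posted, accepted in requisitions]

print("Time to fill per requisition:", days_to_fill)                 # [42, 66, 28]
print("Average time to fill:", round(mean(days_to_fill), 1), "days")  # 45.3
```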

If time to fill were the only measure of success, then, in theory, the faster a position is filled, the better. However, as most working professionals have experienced, the person who can be hired the fastest isn’t necessarily (and probably isn’t) the best candidate.

On the other hand, moving too slowly can cost organizations sought-after talent.

“There’s the time to fill, the cost of the person you hire, whether that person is high-potential and what their expected tenure in the organization is. That’s an example of four interrelated metrics,” said Muir Macpherson, Americas analytics leader, People Advisory Services at EY. “HR needs to stop thinking about individual metrics and consider the problem they’re trying to solve and how to optimize across a set of metrics simultaneously.”

Beyond keywords

Talent marketplaces and talent acquisition software made it easier to navigate a sea of resumes using keywords and filters. In response, some candidates stuffed their resumes full of keywords so their resumes would rank higher in searches. If one’s resume ranked higher in searches, then more people would see it, potentially increasing the candidate’s chance of getting interviews and landing a job.

Masterful keyword use demonstrated an awareness that the recruiting process was changing from a paper-based process to a computer or web-based process. However, other candidates who might have been better fits for positions risked getting lost in the noise.

The whole keyword trend was a noble effort, but keywords, like anything else, are not a silver bullet.

With today’s analytics tools, HR departments and search firms can understand much more about candidates and the effectiveness of their operations.

“You can use a variety of big data and machine learning techniques that go way beyond the keyword analysis people have been doing for a while that integrates all of the data available about a candidate into one, unified prediction score that can then be used as one additional piece of information that recruiters and hiring managers can look at when making their decisions,” said Macpherson.
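
The article doesn’t describe a particular model, but as a rough sketch of the idea, the snippet below combines several normalized candidate signals into a single score with a weighted sum. The feature names and weights are invented; a production system would learn them from historical hiring outcomes.

```python
# Illustrative only: combine several candidate signals into one score.
# The feature names and weights are invented; a real system would learn
# weights from historical hiring outcomes (e.g., with logistic regression).
WEIGHTS = {
    "skills_match": 0.4,      # resume-to-job-description similarity, 0-1
    "experience_fit": 0.3,    # relevant experience, normalized to 0-1
    "assessment_score": 0.2,  # skills assessment result, normalized to 0-1
    "referral": 0.1,          # 1.0 if referred by an employee, else 0.0
}

def candidate_score(features):
    """Weighted sum of normalized (0-1) candidate signals."""
    return sum(WEIGHTS[name] * features.get(name, 0.0) for name in WEIGHTS)

print(candidate_score({"skills_match": 0.8, "experience_fit": 0.6,
                       "assessment_score": 0.9, "referral": 1.0}))  # ~0.78
```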

Data impacts recruiters too

Recruiters now have access to data analytics tools that enable them to better match candidates with potential employers and improve the quality of their services. Meanwhile, HR departments want insight into what recruiters are doing and how well they’re doing it. The Scout Exchange marketplace provides transparency between the two.

“We can look at every candidate [a recruiter] submits to see how far they got in the process and whether they got hired. We use that for ratings so [companies and the recruiters they use] can see the other side’s rating,” said Scout Exchange CEO Ken Lazarus.

The site enables organizations to quickly find appropriate recruiters who can identify the best candidates for a position. It also allows HR departments to see data and trends specific to their company.

Bottom line

Analytics is providing HR departments, recruiters and business leaders with quantitative information they can use to improve their processes and outcomes.

“Knowledge is power and having that data is helpful. For me, the first step is knowing what you’re solving for,” said CareerBuilder’s Haefner.

Right now, HR analytics tend to emphasize recruitment. However, attracting talent is sometimes easier than retaining it, so it’s important to have insight throughout the lifecycle of employee relationships. EY’s Macpherson said HR departments should think in terms of “employee lifetime value,” similar to the way marketers think about customer lifetime value.

“[HR analytics represents] a huge opportunity because for most companies, people and compensation are their biggest costs and yet there has been very little effort put into analyzing those costs or getting the most out of those investments that companies are making,” said EY’s Macpherson.

How the IoT Will Impact Data Analytics

IoT devices are just about everywhere: in cities, on oil rigs, and on our wrists. They’re impacting virtually every industry, and their growth is outpacing organizations’ ability to make the most of the data those devices generate.

To give you an idea of scale, IDC expects global IoT spending to reach nearly $1.4 trillion by 2021, up from $800 billion in 2017. The IoT is all around us, in many cases fading into the backgrounds of our homes and lifestyles, all the while generating massive amounts of data. The trick is driving value from that data.

The Balance of Data is Shifting

Over the past decade, we’ve witnessed several shifts in enterprises’ ability to deal with data. While different companies and industries are at different stages of maturity, we’ve seen and continue to see analytics evolving, whether it’s adding unstructured analytics capabilities to structured analytics, third-party data sources to our own, or IoT data to enterprise data. Slowly but surely, we’ve been seeing the balance of data shift from internal data to external data, particularly as more IoT devices emerge.

Edge analytics helps separate meaningful data from all the noise, which usually means identifying, and perhaps reacting to, exceptions and outliers. For example, if the temperature of a piece of industrial equipment rises beyond a threshold, maintenance crews may be alerted, or the equipment might be shut down.
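
As a minimal sketch of that kind of edge rule (the threshold and readings are invented), only exceptional readings leave the device; everything else is summarized or discarded locally.

```python
# Minimal sketch of an edge filter: only exceptional readings leave the device.
# The threshold and readings are invented for illustration.
TEMP_THRESHOLD_C = 90.0

def process_reading(temperature_c):
    """Return an alert event for an outlier reading, or None to keep it local."""
    if temperature_c > TEMP_THRESHOLD_C:
        return {"event": "overheat", "temperature_c": temperature_c}
    return None  # normal readings are summarized or dropped at the edge

readings = [72.4, 75.1, 93.7, 71.9]
alerts = [event for r in readings if (event := process_reading(r)) is not None]
print(alerts)  # [{'event': 'overheat', 'temperature_c': 93.7}]
```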

Organizations attempting to manage IoT data using their traditional data centers are fighting a losing battle. In fact, Gartner noted that the IoT is causing businesses to move to the cloud faster than they might move otherwise. In other words, when so many things are happening in the cloud, it makes sense to analyze them in the cloud.

Data and Analytics Strategies: Top-down and Bottom-up

The sheer amount of data organizations must deal with increases greatly with the IoT, and there are still philosophical debates about how much data should be kept and how much should be discarded. Gartner strongly advises its clients to be smart about IoT data, meaning that one should not save all the data hoping to drive value from it in the future, but instead focus on strategic goals and how IoT data fits into them.

We often hear how important it is to align analytics efforts with business goals. At the same time, we also hear how important it is to uncover unknown opportunities and risks simply by allowing the data to speak for itself. Some of the most sophisticated companies I’ve talked to over the last several years are doing both, with machine learning identifying that which was not obvious previously. In Gartner’s view, “data and analytics must drive business operations, not reflect them.”

One major challenge organizations face, practically speaking, is operationalizing analytics — with or without the IoT. The core problem is moving from insights to action, which can’t be solved completely with prescriptive analytics. It’s a larger problem that has to do with company culture, stubborn attitudes and the very real challenges of integrating data sources.

Meanwhile, some organizations are pondering how they can use the IoT to improve customer experience, whether that’s minimizing transportation delays, improving environmental safety or otherwise eliminating friction points that tend to irritate humans. Humans have become fickle customers after all, and each touch point can affect a brand positively or negatively.

For example, Walmart placed kiosks in some of its stores that retrieve online orders, scan receipts and trigger the conveyor-belt delivery of the items a customer purchased. The kiosks address a common customer pain point: walking all the way to the back of the store and waiting several minutes for someone to show up, only to be told the order can’t be located.

Now think about what Walmart gets from the kiosks: trend data about customer use and experiences that may impact staffing, inventory management, marketing and the supply chain. Clearly, the data will also indicate whether the kiosks were ultimately a good idea or a bad one.

In the pharmaceutical industry, GSK has been working with partners to develop smart inhalers that track prescription compliance and dosing. The data helps inform research, and it also has value to doctors and pharmacies.

Similarly, enterprises can use IoT data to develop predictive models that help improve business operations, logistics, supply chain and more, depending on the nature of the sensors and the device.

Are Media Relationships Dead?

Strange as it may seem, PR pros used to spend incredible amounts of time cultivating relationships with the media. It wasn’t an email here or there or a social media ping. It was face-to-face time with editors of the target publications at events, on the road, and elsewhere.

I don’t know how many lunches, dinners, and media tours I went on when all of those things were fashionable.  While my PR clients were more interested in “hits” and cover stories, my agency was more concerned about the relationships we established because relationships transcend any client engagement.

In today’s highly fragmented world, things are very different.  PR people have to multitask on entirely different levels and in doing so, they sacrifice focus – focus on relationships, focus on targeting pitches, focus on learning what their clients really do.

I believe it’s still important to develop actual relationships with the media.  I can’t speak for all journalists on this point, but I can tell you that if we’ve established a relationship, your pitch will be placed at the top of the virtual pile, and I’m less inclined to delete it in the first place.  Also, if I have to do outreach for a story, I’ll probably contact you first.

One time, I spent 30 minutes on the phone talking to a PR person about his client’s product strategy simply because every time I needed him to cut through the red tape at that client’s organization, he did it.

PR success requires a confluence of many things, some of which are in your control and some of which are not.  One thing you can control is the way you approach and work with the media.  If you want to have more influence, stop looking at your job as a series of rat-tat-tat news announcements and start looking at the bigger picture.  Cultivate actual relationships with people, because there will be times when you need them, and vice versa.

Remember:  actual relationships transcend clients and publications.  You or I may move tomorrow.  If one or both of us does, you can count on me to point you in some kind of helpful direction, even if your client does not fit within one of my beats.

Why Privacy Is a Corporate Responsibility Issue

Many organizations have Corporate Responsibility programs that focus on social issues and philanthropy. Especially in today’s Big Data era, why is privacy not part of the program?

Today’s companies are promising to lower their carbon footprints and save endangered species. They’re donating to people in developing countries who have far less than we do, which is also noble. But what about the fact that American citizens are a product whose information is bought, sold, and obtained without consent? In light of recent events, perhaps the privacy policies deserve more consideration than just two linked words at the bottom of a website home page.

“Privacy is a big issue for a host of reasons — legal, ethical, brand protection and moral,” said Mark Cohen, chief strategy officer at consultancy and technology service provider Elevate. “[Privacy] is an element of corporate culture [so what goes into a privacy policy depends on] your values and priorities.”

Problems with Privacy Policies

There are three big problems with privacy policies, at least in the US: what’s in them, how they’re written, and how they’re ignored.

One might think that privacy policies are tailored to a particular company and its audience. However, such documents are not necessarily original. Rather than penning a privacy policy from scratch, some companies literally cut and paste entire privacy policies regardless of their contents. In fact, the people who simply grab another company’s privacy policy might not even bother to read it before using it.

The boilerplate language is also a problem. In-house counsel often uses freely available forms to put together a privacy policy. They may use one form or a combination of forms available to lawyers, but again, they’re not thinking about what should be in the document.

In addition, the documents are written in legalese, which is difficult for the average person to read. Businesses are counting on that because if you don’t know what’s in a privacy policy, what you’re giving away and what they intend to do with your information, you’ll probably just hope for the best. Even better, you’ll click an “I agree” button without knowing what clicking that button actually means. It’s a common practice, so you’re not alone if that’s the case.

Oh, and what’s stated in the documents may or may not be true, either because the company changed the policy since you last read it or they’re ignoring the document itself.

“After May 2018, when the new GDPR [General Data Protection Regulation] goes into effect, it’s going to force many companies to look at their privacy policies, their privacy statements and consents and make them more transparent,” said Sheila Fitzpatrick, data governance and privacy counsel and chief privacy officer at NetApp, a hybrid cloud data services company. “They’re going to have to be easily understandable and readable.”

Businesses Confuse Privacy with Security

Privacy and security go hand in hand, but they’re not the same thing. However, a common assumption is that if you’re encrypting data, then you’re protecting privacy.

“Every company focuses on risk, export control trade compliance, security, but rarely you find companies focused on privacy,” said Fitzpatrick. “That’s changing with GDPR because it’s extraterritorial. It’s forcing companies to start really addressing areas around privacy.”

It’s entirely possible to have all kinds of security and still not address privacy issues. OK, so the data is being locked down, but are you legally allowed to have it in the first place? Perhaps not.

“Before you lock down that data, you need the legal right to have it,” said Fitzpatrick. “That’s the part that organizations still aren’t comprehending because they think they need the data to manage the relationship. In the past organizations thought they need the data to manage employment, customer or prospect relationships, but they were never really transparent about what they’re doing with that data, and they haven’t obtained the consent from the individual.”

In the US, the default is opt-out: data can generally be collected and used unless the individual objects. In countries with restrictive privacy laws, the default is opt-in: consent must be obtained before the data is collected.

The Data Lake Mentality Problem

We hear a lot about data lakes and data swamps. In a lot of cases, companies are just throwing every piece of data into a data lake, hoping it will have value in the future. After all, cloud storage is dirt cheap.

“Companies need to think about the data they absolutely need to support a relationship. If they’re an organization that designs technology, what problem are they trying to solve and what data do they need to solve the problem?” said Fitzpatrick.

Instead of collecting massive amounts of information that’s totally irrelevant, they should consider data minimization if they want to lower privacy-related risks and comply with the EU’s GDPR.

“Companies also need to think about how long they’re maintaining this data because they have a tendency to want to keep data forever even if it has no value,” said Fitzpatrick. “Under data protection laws, not just the GDPR, data should only be maintained for the purpose it was given and only for the time period for which it was relevant.”
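
As an illustrative sketch of that retention principle (the purposes and periods below are invented, not legal guidance), each purpose for which data was collected gets its own retention window, and records are purged once the window passes.

```python
from datetime import datetime, timedelta

# Illustrative retention rules: each collection purpose has its own window.
RETENTION = {
    "order_fulfillment": timedelta(days=365 * 2),
    "marketing_consent": timedelta(days=365),
}

def should_delete(record, now):
    """A record expires once its purpose's retention window has passed."""
    window = RETENTION.get(record["purpose"])
    return window is not None and now - record["collected_at"] > window

records = [
    {"id": 1, "purpose": "marketing_consent", "collected_at": datetime(2016, 1, 10)},
    {"id": 2, "purpose": "order_fulfillment", "collected_at": datetime(2017, 6, 1)},
]
now = datetime(2017, 11, 1)
expired = [r["id"] for r in records if should_delete(r, now)]
print(expired)  # [1]
```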

The Effect of GDPR

Under the GDPR, consent has to be freely given, not forced or implied. That means companies can’t pre-check an opt-in box or force people to trade personal data for the use or continued use of a service.

“Some data is needed. If you’re buying a new car they need financial information, but they’d only be using it for the purpose of the purchase, not 19 other things they want to use it for including sales and marketing purposes,” said Fitzpatrick.

Privacy may well become the new competitive advantage as people become more aware of privacy policies and what they mean and don’t mean.

“Especially Europeans, Canadians, and those who live in Asia-Pacific countries that have restrictive privacy laws, part of their vetting process will be looking at your privacy program,” said Fitzpatrick. “If you have a strong privacy program and can answer a privacy question with a privacy answer as opposed to answering a privacy question with a security answer, [you’ll have an advantage].”

On the flip side, sanctions from regulators abroad can destroy a company from reputational, brand and financial points of view. Fines under the new GDPR can be as high as 4% of a company’s annual global turnover.

Quantum Computing Brings Promise and Threats

Digital computing has some serious limitations. While the technology advances of the past few decades are impressive (smaller footprints, faster processors, better UIs, more memory and storage), some problems could be solved better by quantum computers.

For one thing, quantum computers are faster than classical (traditional) computers. They are also able to solve problems that classical computers can’t do well or can’t do within a reasonable amount of time.

“Quantum computing exploits fundamental laws of physics to solve complex computing problems in new ways, problems like discovering how diseases develop and creating more effective drugs to battle them,” said Jim Clarke, director of quantum hardware at Intel Labs. “Once quantum systems are available commercially, they can be used to simulate nature to advance research in chemistry, materials science and molecular modeling. For instance, they can be used to help create a new catalyst to sequester carbon dioxide or a room-temperature superconductor.”

Quantum computing will also drive new levels of business optimization, benefit machine learning and artificial intelligence, and change the cryptography landscape.

David Schatsky, managing director at Deloitte, said the common thread is optimization problems where there are multiple probable answers and the task is to find the right one. Examples include investment management, portfolio management, risk mitigation and the design of communication systems and transportation systems. Logistics companies are already exploring route optimization while the defense industry is considering communications applications.

“A year ago [quantum computing] was thought of more of as a physics experiment [but] the perception has changed quickly,” said Schatsky.  “In the last 3 months there have been a flurry of breakthroughs including fundamental engineering breakthroughs and commercial product announcements.”

Test drive a quantum computer today

It’s probably safe to say that none of us will have a quantum computer sitting on our desks anytime soon, but just about anyone with a browser can get access to IBM’s 5- and 16-qubit (quantum bit) computers via the cloud. Earlier this year, the company announced IBM Q, an initiative intended to result in commercially available quantum computing systems. IBM also announced that it had built and tested two quantum computing processors: a 16-qubit open processor for use by the public and a 17-qubit commercial processor for customers.

According to an IBM paper in Nature, scientists successfully used a seven-qubit quantum processor to address a molecular structure problem for beryllium hydride (BeH2), the largest molecule simulated on a quantum computer to date.

“It is early days, but it’s going to scale rapidly,” said Scott Crowder, vice president and CTO, Quantum Computing, Technical Strategy & Transformation at IBM Systems. “When you start talking about hundreds or low thousands of qubits, you can start exploring business value problems that [can’t be addressed well using] classical computers such as quantum chemistry [and] certain types of optimization problems that are also exponential problems.”

An exponential problem is one that scales exponentially with the number of elements in it. For example, planning a route involving 50 locations could be optimized in a number of ways depending on the objective, such as identifying the fastest route. That seemingly simple problem actually involves one quadrillion different possibilities, which is too many possibilities for a classical computer to handle, Crowder said.

Intel is making progress too

Intel teamed up with QuTech, an academic partner in the Netherlands, in 2015. Since then, Intel has achieved milestones such as demonstrating key circuit blocks for an integrated cryogenic-CMOS control system, developing a spin qubit fabrication flow on Intel’s 300mm process technology and developing a unique packaging solution for superconducting qubits, which it demonstrated in the 17-qubit superconducting test chip introduced on October 10, 2017. A week later, at the Wall Street Journal D.Live conference in Laguna, Calif., Intel CEO Brian Krzanich said he expects Intel to deliver a 49-qubit quantum chip by the end of 2017.

“Ultimately the goal is to develop a commercially relevant quantum computer, one that is relevant for many applications and one that impacts Intel’s bottom line,” said Intel’s Clarke.

Toward that end, Intel’s work with QuTech spans the entire quantum stack from the qubit devices to the overall hardware architecture, software architecture, applications and complementary electronics that workable quantum systems will require.

“Quantum computing, in essence, is the ultimate in parallel computing, with the potential to tackle problems conventional computers can’t handle,” said Clarke. “But, realizing the promise of quantum computing will require a combination of excellent science, advanced engineering and the continued development of classical computing technologies, which Intel is working towards through our various partnerships and R&D programs.”

Decryption and other threats

There is a debate about whether quantum computers will render current encryption methods obsolete. Take a brute force attack, for example. In a brute force attack, hackers continually guess passwords and use computers to accelerate that work. Quantum computing would accelerate such an attack even further.
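
For a rough sense of the scale of that acceleration: Grover’s algorithm can search an unstructured keyspace in roughly the square root of the number of guesses a classical brute-force attack needs, which effectively halves a symmetric key’s strength in bits. A minimal sketch:

```python
import math

def effective_bits_against_grover(key_bits):
    """Classical brute force tries ~2**key_bits keys; Grover's algorithm needs
    only ~sqrt(2**key_bits) quantum queries, i.e. about key_bits / 2 bits of work."""
    grover_steps = math.isqrt(2 ** key_bits)  # ~square root of the keyspace
    return math.log2(grover_steps)

print(effective_bits_against_grover(128))  # 64.0 "effective" bits for a 128-bit key
```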

“Virtually all security protocols that are used and deployed today are vulnerable to an attack by a quantum computer,” said William “Whurley” Hurley, chair of the Quantum Standards Working Group at the IEEE. “Quantum information allows us to secure information in ways that are completely unbreakable even against a quantum attack.”

Along those lines, there are efforts to develop a new type of security protocol that doesn’t necessarily leverage quantum mechanics. Hurley said they’re using extremely difficult mathematical problems that even quantum computers won’t be able to solve, an approach referred to as “quantum-safe cryptography” or “post-quantum cryptography.”

The IEEE Quantum Standards Working Group is working on other quantum technologies, including quantum sensors and quantum materials. The institute has brought together physicists, chemists, engineers, mathematicians and computer scientists to ensure that it can adapt rapidly to change.

Deloitte’s Schatsky said synthetic biology and gene editing are also potentially dangerous, mainly because capabilities can be developed faster than one’s ability to understand how to apply such technologies wisely. The same could be said for many emerging technologies.

Quantum computing should be on your radar

Quantum computing is advancing rapidly now, so it’s wise to ponder how the capabilities might benefit your business. The reality is that no one knows all the ways quantum computing can be used, but it will eventually impact businesses in many different industries.

Will quantum computers overtake classical computers, following the same evolutionary path we’ve seen over the past several decades or will the two co-exist? For the foreseeable future, co-existence is the answer because binary and quantum computers each solve different kinds of problems better than the other.

5 Cross-functional Analytics Challenges

More businesses are attempting to optimize their operations with the help of analytics, although most of the activity still takes place at the departmental level. Additional value can be gained from cross-functional analytics, but it represents a different kind of challenge because the functional units tend to use different systems and data owners often want to maintain control of their data.

According to recent research by EY and Forbes Insights, 60% to 70% of companies now use analytics at a departmental level, up from 30% to 40% in 2015.

“Companies have had success in one part of the business, and they then try to replicate that in other departments,” said Chris Mazzei, global chief analytics officer and emerging technology leader at EY. “The companies that are more mature across a number of different dimensions, those we would put into the ‘leading’ category, are out-performing the others. They’re reporting higher revenue growth, better operating margins and more effective risk management, so there’s at least a correlation between analytics adoption and driving better business outcomes.”

Here are a few things that can hold cross-functional analytics back.

Analytics Isn’t Part of the Business Strategy

Cross-functional analytics is more likely to yield competitive advantages and drive more business value when the analytics are an integral part of the business model and strategy.

“The vast majority of organizations still are not able to say that their business strategy has really reflected the role analytics plays in how they’re trying to compete,” said Mazzei. “There’s this fundamental misalignment that can occur when the leadership team is not able to have a consistent view of where and how analytics is making the biggest impact on the business strategy.”

Operating Models Don’t Facilitate Cross-Functional Analytics

Executing an analytics strategy at a departmental level such as finance or marketing is relatively easy because it’s clear that resources need to be dedicated to the effort. When it’s a cross-functional endeavor, who’s responsible for providing, funding and managing those resources? What should the data flow look like and how can that be facilitated?

“If you’re trying to deploy analytics across the organization, the operating model becomes much more important,” said Mazzei. “Do we have a centralized team? Do we distribute analytics resources in the individual business units or functions? What’s the relationship between those teams?”

Like bimodal IT, bimodal analytics services benefit the enterprise and the departments simultaneously. The centralized group helps facilitate best practices and ensures appropriate governance while dedicated resources tend to have specialized knowledge of that particular function and its analytics requirements.

The Initiatives Aren’t Designed Well

Analytics efforts should drive business value. There’s a lot to do, but not everything will have the same level of impact or necessarily achieve the desired results, so the desired business outcomes should drive the prioritization of analytics efforts.

“Initiative design is really important, and [so are the] competent frameworks and processes you use for that,” said Mazzei.

Not surprisingly, companies are still at very different stages of maturity in terms of having any kind of consistent process for designing an analytics initiative. The more analytically mature a company is, the greater the likelihood that it has common frameworks, a common understanding of what the term “analytics initiative” means, and common tools for executing it, Mazzei said.

Analytics Isn’t Part of Business Operations

As companies embrace analytics and mature in their use of analytics, business processes tend to change. It’s wise to think about that and other impacts early on.

“The more mature companies are thinking about that earlier in the process and using an initial point of view about what that intervention needs to be to inform how you design the analytics themselves,” said Mazzei. “A lot of companies don’t think about that early enough.”

According to the report, design intervention is “Translating all the upfront goal-setting, modeling, and methodology into action— making analytics insights an integral part of business operations.”

The True Value of Analytics Isn’t Understood

Interestingly, analytics enables organizations to measure all kinds of things and yet success metrics may not have been defined for the analytics initiatives themselves.

“That really matters because [if] you can learn what’s working and what’s not earlier on, you can change the nature of the intervention or the analytic you’re building,” said Mazzei. “It’s that feedback loop you have in place.”

Your Data Is Biased. Here’s Why.

Bias is everywhere, including in your data. A little skew here and there may be fine if the ramifications are minimal, but bias can negatively affect your company and its customers if left unchecked, so you should make an effort to understand how, where and why it happens.

“Many [business leaders] trust the technical experts but I would argue that they’re ultimately responsible if one of these models has unexpected results or causes harm to people’s lives in some way,” said Steve Mills, a principal and director of machine intelligence at technology and management consulting firm Booz Allen Hamilton.

In the financial industry, for example, biased data may produce results that violate the Equal Credit Opportunity Act (fair lending). That law, enacted in 1974, prohibits credit discrimination based on race, color, religion, national origin, sex, marital status, age or source of income. While lenders take steps not to include such data in a loan decision, it may be possible to infer race in some cases using a zip code, for example.

“The best example of [bias in data] is the 2008 crash, in which the models were trained on a dataset,” said Shervin Khodabandeh, a partner and managing director at Boston Consulting Group (BCG) Los Angeles, a management consulting company. “Everything looked good, but the datasets changed and the models were not able to pick that up, [so] the model collapsed and the financial system collapsed.”

What Causes Bias in Data

A considerable amount of data has been generated by humans, whether it’s the diagnosis of a patient’s condition or the facts associated with an automobile accident.  Quite often, individual biases are evident in the data, so when such data is used for machine learning training purposes, the machine intelligence reflects that bias.  A prime example of that was Microsoft’s infamous AI bot, Tay, which in less than 24 hours adopted the biases of certain Twitter members. The results were a string of shocking, offensive and racist posts.

“There’s a famous case in Broward County, Florida, that showed racial bias,” said Mills. “What appears to have happened is there was historically racial bias in sentencing so when you base a model on that data, bias flows into the model. At times, bias can be extremely hard to detect and it may take as much work as building the original model to tease out whether that bias exists or not.”
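
Detecting that kind of bias often starts with simple group-level checks. The sketch below (not how the Broward County analysis was done, and with invented toy data) compares favorable-outcome rates across groups and computes a disparate impact ratio; values well below roughly 0.8 are a common signal to investigate further.

```python
from collections import defaultdict

def outcome_rates(records):
    """Rate of favorable outcomes per group."""
    totals, favorable = defaultdict(int), defaultdict(int)
    for group, outcome in records:
        totals[group] += 1
        favorable[group] += outcome
    return {g: favorable[g] / totals[g] for g in totals}

def disparate_impact_ratio(records, protected, reference):
    """Ratio of favorable-outcome rates; values below ~0.8 warrant a closer look."""
    rates = outcome_rates(records)
    return rates[protected] / rates[reference]

# Invented toy data: (group, 1 = favorable outcome such as a loan approval)
data = [("A", 1), ("A", 1), ("A", 0), ("A", 1),
        ("B", 1), ("B", 0), ("B", 0), ("B", 0)]
print(disparate_impact_ratio(data, protected="B", reference="A"))  # ~0.33
```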

What Needs to Happen

Business leaders need to be aware of bias and the unintended consequences biased data may cause.  In the longer-term view, data-related bias is a governance issue that needs to be addressed with the appropriate checks and balances which include awareness, mitigation and a game plan should matters go awry.

“You need a formal process in place, especially when you’re impacting people’s lives,” said Booz Allen Hamilton’s Mills. “If there’s no formal process in place, it’s a really bad situation. Too many times we’ve seen these cases where issues are pointed out, and rather than the original people who did the work stepping up and saying, ‘I see what you’re seeing, let’s talk about this,’ they get very defensive and defend their approach so I think we need to have a much more open dialog on this.”

As a matter of policy, business leaders need to consider which decisions they’re comfortable allowing algorithms to make, the safeguards which ensure the algorithms remain accurate over time, and model transparency, meaning that the reasoning behind an automated decision or recommendation can be explained.  That’s not always possible, but still, business leaders should endeavor to understand the reasoning behind decisions and recommendations.

“The tough part is not knowing where the biases are there and not taking the initiative to do adequate testing to find out if something is wrong,” said Kevin Petrasic, a partner at law firm White & Case.  “If you have a situation where certain results are being kicked out by a program, it’s incumbent on the folks monitoring the programs to do periodic testing to make sure there’s appropriate alignment so there’s not fair lending issues or other issues that could be problematic because of key datasets or the training or the structure of the program.”

Data scientists know how to compensate for bias, but they often have trouble explaining what they did and why they did it, or the output of a model in simple terms. To bridge that gap, BCG’s Khodabandeh uses two models: one that’s used to make decisions and a simpler model that explains the basics in a way that clients can understand.
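
The article doesn’t say which models Khodabandeh’s team uses, but a common version of this pattern pairs the decision-making model with a small, readable surrogate trained to mimic its predictions. A minimal sketch with scikit-learn and synthetic data:

```python
# Generic surrogate-model sketch (not BCG's actual approach): pair a complex
# model with a small, readable one trained to mimic its predictions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=1000, n_features=6, random_state=0)

# 1. The model that actually makes the decisions.
complex_model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# 2. A shallow tree trained on the complex model's predictions, used only
#    to explain its behavior in terms stakeholders can follow.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, complex_model.predict(X))

print("Surrogate fidelity:", surrogate.score(X, complex_model.predict(X)))
print(export_text(surrogate))
```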

BCG also uses two models to identify and mitigate bias. One is the original model; the other is used to test extreme scenarios.

“We have models with an opposite hypothesis in mind which forces the model to go to extremes,” said Khodabandeh. “We also force models to go to extremes. That didn’t happen in the 2008 collapse. They did not test extreme scenarios. If they had tested extreme scenarios, there would have been indicators coming in in 2007 and 2008 that would allow the model to realize it needs to adjust itself.”
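
One generic way to “force a model to extremes,” sketched here with a toy linear model and synthetic data, is to shock each input feature well beyond its historical range and measure how much the predictions move; large, unexplained swings are a cue to re-examine the model before relying on it in a crisis.

```python
import numpy as np

def stress_test(model_predict, X, scale=5.0):
    """Push each feature far beyond its historical range and compare predictions."""
    baseline = model_predict(X)
    results = {}
    for j in range(X.shape[1]):
        X_extreme = X.copy()
        X_extreme[:, j] += scale * X[:, j].std()  # extreme upward shock to feature j
        shifted = model_predict(X_extreme)
        results[j] = float(np.abs(shifted - baseline).mean())
    return results

# Toy example for illustration only: a linear "model" and synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
weights = np.array([0.5, -1.0, 2.0])

def model_predict(X):
    """Toy linear model standing in for a trained model."""
    return X @ weights

print(stress_test(model_predict, X))  # feature 2 moves the output the most
```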

A smart assumption is that bias is present in data, regardless.  What the bias is, where it stems from, what can be done about it and what the potential outcomes of it may be are all things to ponder.

Conclusion

All organizations have biased data.  The questions are whether the bias can be identified, what effect that bias may have, and what the organization is going to do about it.

To minimize the negative effects of bias, business leaders should make a point of understanding the various types and how they can impact data, analysis and decisions. They should also ensure there’s a formal process in place for identifying and dealing with bias, which is likely best executed as a formal part of data governance.

Finally, the risks associated with data bias vary greatly, depending on the circumstances. While it’s prudent to ponder all the positive things machine learning and AI can do for an organization, business leaders are wise to understand the weaknesses also, one of which is data bias.

Computer History May Not Be What You Think

When many of us ponder computer history, we think of Steve Jobs, Bill Gates, and other high-profile white men who changed the course of history through innovation and shrewd business tactics. Of course, there have been other significant contributors along the way who are not white, including Guy Kawasaki and Jerry Yang, but how much have we really heard about the computer industry contributions made by African-Americans?

For the most part, precious little, if anything. However, that may change with the help of Arvid Nelsen, IEEE Computer Society member, Southern Methodist University rare books and manuscripts librarian, and contributor to the IEEE Annals of the History of Computing.

“I look at historical understanding as something that evolves over time as new information comes to light or as we examine the past through the lens of different priorities and values,” said Nelsen, in an email interview. “Scholars are just beginning to scratch the surface in respect to persons of color. I think these efforts add to history by examining previously ignored, overlooked, invisible, and perhaps devalued evidence. I hope that means the development of a more complete, complex, and nuanced understanding of history.”

Is Computer History Revisionist History?

What if everything we know about the computer industry isn’t entirely correct?  In today’s global business environment, innovation, disruption and contributions can come from anywhere. However, it may be that African-Americans still remain in the shadows rather than the limelight, at least in the US.

But what, exactly, have we in the computer industry missed? More work needs to be done to answer those and other questions.

Unearthing African-Americans’ computer industry contributions won’t be an easy task because there’s a lack of archival source material. In his recent IEEE Annals of the History of Computing article, Nelsen writes, “Archives and libraries should undertake to identify and collect materials from persons of color. Meanwhile, scholars may find material in nontraditional sources, and prosopography may prove useful for examining computer professionals of color.”

One non-traditional source is Ebony magazine, which lists at least 57 African-Americans working in various computing fields between 1959 and 1996.

“I hope that the article encourages historians who are interested in critically examining race in computing simply to start looking for stories. They are out there,” said Nelsen. “I provide a number of examples and specifically encourage the examination of publications by and for particular communities, publications which may have been previously considered out-of-scope in contrast to scientific and professional publications.”

Why Computer Industry History Lacks Color

Racism was rampant in the computer industry’s early days. Perhaps it’s less obvious to some of us now, given the diversity of today’s high-tech workforce. However, racism is still alive and well, despite greater workforce diversity.

To align contributions with contributors, Nelsen thinks historians need to understand the development of the computer industry, as well as the specific technologies that comprise the computer industry.

One of Nelsen’s articles inspired a letter from a retired professor who had worked for Burroughs Corp. While at Burroughs, some of his African-American colleagues developed new hardware and software, including the operating system for the Burroughs B5000 and B5500 mainframe computers.

“I hope my article will inspire readers to reach out with their own stories to scholars and to archives like the Charles Babbage Institute with papers and other source materials,” said Nelsen.

The Time is Ripe for Change

The movie Hidden Figures, based on the book of the same name by Margot Lee Shetterly, helped raise at least partial awareness that the accomplishments of African Americans in the computer industry have indeed been ignored, forgotten or overlooked. The book and the movie focus on mathematicians Mary Jackson, Katherine Johnson and Dorothy Vaughan, all of whom worked for the National Aeronautics and Space Administration (NASA).

“The contributions of these three women were essential to both the Space Race and the development of the computing disciplines, and have been shamefully neglected,” wrote Nathan Ensmenger, editor in chief of the IEEE Annals of the History of Computing.

In the same commentary, he added, “[A]s we begin to incorporate race and ethnicity into our scholarship, we will discover new insights, methods, and perspectives that will radically reshape the focus of our discipline.”

How the focus of our discipline may change as the result of such research remains to be seen. As both Nelsen and Ensmenger note, the task won’t be easy, but it’s a necessary endeavor.

How to Teach Executives About Analytics

If your data is failing to persuade executives, maybe it’s not the data that is the problem. Here’s how to change your approach to fit the audience.

One of the biggest challenges data analysts and data scientists face is educating executives about analytics. The general tendency is to nerd out on data and fail to tell a story in a meaningful way to the target audience.

Sometimes data analytics professionals get so wrapped up in the details of what they do that they forget not everyone has the same background or understanding. As a result, they may use technical terms, acronyms, or jargon and then wonder why no one “got” their presentations or what they were saying.

They didn’t say anything wrong, per se; the problem is how they said it and to whom.

If you find yourself in such a situation, following are a few simple things you can do to facilitate better understanding.

Discover What Matters

What matters most to your audience? Is it a competitive issue? ROI? Building your presence in a target market? Pay attention to the clues they give you and don’t be afraid to ask about their priorities. Those will clue you in to how you should teach them about analytics within the context of what they do and what they want to achieve.

Understand Your Audience

Some executives are extremely data-savvy, but the majority aren’t just yet. Dialogs between executives and data analysts or data scientists can be uncomfortable and even frustrating when the parties speak different languages. Consider asking what your target audience would like to learn about and why. That will help you choose the content you need to cover and the best format for presenting that content.

For example, if the C-suite wants to know how the company can use analytics for competitive advantage, then consider a presentation. If one of them wants to understand how to use a certain dashboard, that’s a completely different conversation and one that’s probably best tackled with some 1:1 hands-on training.

Set Realistic Expectations

Each individual has a unique view of the world. Someone who isn’t a data analyst or a data scientist probably doesn’t understand what that role actually does, so they make up their own story, which becomes their reality. That reality probably involves some unrealistic expectations about what data-oriented roles can accomplish, or what analytics can accomplish generally.

One of the best ways to deal with unrealistic expectations is to acknowledge them and then explain what is realistic and why. For example, a charming and accomplished data scientist I know would be inclined to say, “You’d think we could accomplish that in a week, right? Here’s why it actually takes three weeks.”

Tell a Story

Stories can differ greatly, but the one thing good presentations have in common is a beginning, a middle, and an end. One of the mistakes I see brilliant people making is focusing solely on the body of a presentation, immediately going down some technical rabbit hole that’s fascinating for people who understand it and confusing for others.

A good beginning gets everyone on the same page about what the presentation is about, why the topic of discussion is important, and what you’re going to discuss. The middle should explain the meat of the story in a logical way that flows from beginning to end. The end should briefly recap the highlights and help bring your audience to the same conclusion you’re stating in your presentation.

Consider Using Options

If the executive(s) you’re presenting to hold the keys to an outcome you desire, consider giving them options from which to choose. Doing that empowers them as the decision-makers they are. Usually, that approach also helps facilitate a discussion about tradeoffs. The more dialog you have, the better you’ll understand each other.

Another related tip is to make sure your options are within the realm of the reasonable. In a recent scenario, a data analyst wanted to add two people to her team. Her options were: A) if we do nothing, you can expect the same results; B) if we hire these two roles, we’ll be able to do X and Y, which we couldn’t do before; and C) if we hire five people, we’ll be able to do even more, but it will cost this much. She came prepared to discuss the roles, the interplay with the existing team and where she got her salary figures. If they asked what adding one, three, or four people would look like, she was prepared to answer that too.

Speak Plainly

Plain English is always a wise guide. Choose simple words and concepts, keeping in mind how the meaning of a single word can differ. For example, if you say, “These two variables have higher affinity,” someone may not understand what you mean by variables or affinity.

Also endeavor to simplify what you say, using concise language. For example, “The analytics of the marketing department has at one time or another tended to overlook the metrics of the customer service department” can be consolidated into, “Our marketing analytics sometimes overlooks customer service metrics.”
