Strategic Insights and Clickworthy Content Development

Category: Business strategy (Page 3 of 3)

One Point the Equifax Breach Drives Home

Today’s developers use more third-party and open-source components, libraries, and frameworks than ever to deliver and update products in ever-shrinking delivery cycles. In the race to get to market, it’s easy to overlook or ignore details that can lead to a security breach.

For example, Equifax blamed its recent security breach on an Apache Struts vulnerability (CVE-2017-5638) and later admitted it failed to install a patch. That patch had been available for six months, according to The Apache Software Foundation.

“The Equifax hack is so interesting, mostly because their response to the hack has been so poor. Blaming a hack on any sort of software issue – open source or proprietary – is simply part of their inadequate response. It’s a ‘the dog ate my paper’ excuse,” said James Stanger, chief technology evangelist at CompTIA. “That’s not much of an explanation, especially considering that Equifax disclosed this problem on September 7 after knowing about it since July 29.”

What if the software you built was compromised and you discovered that the root cause was a third-party building block you used? You didn’t build that piece of software, after all, so perhaps that party should be liable for any damages that piece of software caused.

Practically speaking, good luck with that argument.

Little or No Third-Party Liability

If you’re using third-party building blocks in your software, which you likely are, the buck stops with you. Sure, someone else’s code may have caused a catastrophic failure, but did you read the fine print in the license agreement? Third-party developers have several ways of dealing with the matter contractually.

“There may be disclaimers, especially in the open source community, that say, ‘This component is [provided] as-is’ and you as the licensee are responsible for its testing and usage in another system,” said Roy Hadley, Jr., co-chair of the Privacy & Cybersecurity team at law firm Thompson Hine. “If you choose to use it in a nuclear facility or the space shuttle, that’s on you.”

Those who use third-party software in their products are ultimately responsible because the provider can’t foresee how its software will be used or configured by others. So, the licensor protects itself using an “as-is” clause or a limitation of liability. Alternatively, the licensor may require indemnity from the licensee, which means that if you use third-party software, something goes wrong, and the provider of the component gets sued, you cover the provider’s losses.

What Software Developers Should Do

Test, test, test. Ideally, developers should take the time to understand every piece of third-party software they’re using to make sure it does what it’s supposed to do and that it’s been tested for security vulnerabilities. They should also have a mechanism to ensure that the associated updates and patches are up-to-date.
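
That update-and-patch mechanism can be partly automated. Below is a minimal Python sketch, not any particular vendor’s tool, that compares installed package versions against a hand-maintained table of minimum patched versions; the package names and version floors here are illustrative assumptions, and in practice the table would come from an advisory feed or a software-composition-analysis scanner.

```python
from importlib.metadata import version, PackageNotFoundError

# Hypothetical minimum patched versions for components we depend on.
# In a real pipeline this table would be fed by a vulnerability advisory feed.
PATCHED_VERSIONS = {
    "requests": (2, 20, 0),
    "urllib3": (1, 24, 2),
}

def parse_version(text):
    """Convert a dotted version string like '2.19.1' into a comparable tuple."""
    parts = []
    for piece in text.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

def outdated_packages(minimums):
    """Return (name, installed, minimum) for packages older than the patched floor."""
    stale = []
    for name, minimum in minimums.items():
        try:
            installed = parse_version(version(name))
        except PackageNotFoundError:
            continue  # not installed, nothing to patch
        if installed < minimum:
            stale.append((name, installed, minimum))
    return stale
```

A check like this running in CI at least surfaces the “patch available for six months” situation before it becomes a headline.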

“I think you have an absolute responsibility to make sure that third-party components work, work together and work the way they’re supposed to,” said Jason Wacha, an attorney and founder of WS Law Offices, which specializes in software licensing. “One of the things about the open source community is you hear [about a software vulnerability], they announce it and everybody jumps on it and tries to fix it. Certainly this was true for the Struts project. One of the things about proprietary software is if someone discovers a vulnerability, it’s not going to get out there and people aren’t going to talk about it.”

The obvious constraint is time. There just isn’t enough time to test everything.

“The issues we keep confronting or not confronting in the IT industry are ignoring or omitting key steps of the Software Development Lifecycle (SDLC) and then mismanaging how that resulting software is deployed,” said CompTIA’s Stanger. “One of the primary reasons why software issues get missed by the good guys and exploited by the bad guys is because companies, individuals and groups that develop software tend to rush software to market.”

There are also challenges with the way software is configured and deployed.

“Many IT pros and even security pros still tend to think, ‘If I obtain the code from a secure source and verify the hash, I’ll be fine. I’ll just update the code as necessary.’ Plus, relatively few companies actually test the software properly in a production environment by ‘throwing rocks’ at the code and doing proper threat hunting,” said CompTIA’s Stanger. “Fewer still are able to update software adequately, because updates often break custom code. Imagine how security issues can propagate when you combine installed and cloud solutions.”

While developers should verify that the third-party software they use has been adequately tested by whoever built it, they need to retest it in the context of their own product.

“The reality of the current world we live in is that any business must undertake extreme caution and implement a thorough due diligence process when vetting any vendor that impacts their supply chain or is processing or storing any information on its behalf,” said Rob Rosenzweig, vice president and national cyber practice leader for insurance brokerage Risk Strategies Company. “While there is significant upside to the utilization of outsourced vendors in managing expense, obtaining a higher level of security and realizing operational efficiencies; the flipside is that organizations lose control and still retain all of the risk.”

Lesson Learned

The Equifax breach underscores the need for vigilance because hackers are constantly experimenting to find and exploit vulnerabilities, particularly when sensitive information is involved. When a vulnerability is found, it needs to be addressed promptly, which Equifax failed to do. As a result of that failure, the Federal Trade Commission (FTC) is now investigating the company.

As is evident, the failure to implement one patch can have devastating consequences.

5 Things to Know About IT Candidates

Hiring and retaining IT talent is difficult. Part of the problem is that some companies don’t understand what IT professionals want and why they want it.

ManpowerGroup Solutions recently published a survey-based report that sheds some light on the matter. More than 14,000 currently employed individuals between the ages of 18 and 65 participated, across industries. Some of the results are specific to IT professionals, and they may surprise you.

#1: Expect turnover

IT professionals who change jobs frequently do so for two reasons: to advance their careers (60%) and to increase their compensation (43%). Employers should appeal to those desires.

“Candidates within the IT space shouldn’t be measured solely on their time spent within a specific role,” said Stephen Rees, Director of Program Delivery at ManpowerGroup Solutions, in an interview. “A review of a project’s purpose, the candidate’s role and [her] accomplishments within the timeframe of the project should be the key areas of focus. Seasoned recruiters and hiring managers will need to account for the time needed to ramp up performance in order to understand the value of work delivered.”

Technology is constantly changing, which affects what IT does and what IT professionals must know. Those who learn the newest must-have skills, whether it’s DevOps or virtualized IT infrastructure, tend to be in high demand. When skills are in high demand and there’s a “skills shortage,” companies will pay handsomely for the right talent.

IT professionals have to acquire those new skills somewhere, however. If they can’t learn those skills at their present company, or their company doesn’t invest in education or training, they may seek opportunities at a company that provides such benefits.

#2:  Monetary compensation isn’t everything

IT professionals weigh several factors before accepting a position. The top three of seven options are compensation (23%), opportunity for advancement (22%), and benefits (21%). Schedule flexibility, type of work, geography, and the company’s brand reputation rank lower. Of those, schedule flexibility ranks the highest.

Interestingly, opportunity for advancement is almost twice as important to IT professionals as it is to individuals who work in financial services, healthcare/pharmaceuticals, and retail. Benefits also matter more to IT professionals than to others, and not just traditional benefits such as a 401(k) program or health and dental insurance. They tend to value non-traditional benefits such as game areas, rest areas, and perhaps a healthy drink on tap. Although benefits hold some value in themselves, more importantly, they tend to reflect a company’s culture.

“Today’s benefits are becoming more lifestyle/non-work specific,” said Rees. “The emphasis is shifting from the immediate short-term benefits that tie employees to the office and are instead focusing on the broader impact on an individual’s life such as PTO, sabbaticals, learning and development, diversity and inclusion, etc. While the specific role, project or product is still important, the company the work is being done for is increasing in importance as candidates increasingly want to align themselves with an organization that shares their values.”

#3: Your digital presence and industry associations matter

Most survey respondents, including IT professionals, use company websites and search engines to research career opportunities. However, IT professionals are more likely to rely on social media (55%) and industry associations (33%) than the U.S. average of 38% and 18%, respectively.

In the IT world, associations are where standards are defined. Defining standards involves a lot of intellectual banter and collaboration among individuals who work at competing companies. The camaraderie can result in very compelling career opportunities that don’t appear on a job site or a company’s website.

Manpower notes that some of these IT associations have emerged around certification, training programs, and hacking events. Within those groups, knowledge exchange and mentoring happen.

“Networking has always been a core component of the IT space. For IT professionals, their work is typically their passion,” said Rees. “This participation is also seen as a way of giving back and helping others develop – there is a true desire to share experiences and knowledge, helping others to learn instead of keeping information to themselves.”

Companies can create their own hubs for interaction, whether that’s offering training or certification at an event or hosting informational sessions that enable IT professionals to meet with some of the company’s engineers.

#4:  They want you to reach out to them

More than half (55%) of IT professionals said they prefer weekly emails from potential employers of interest, which is considerably more than retail (37%), financial services (37%) and healthcare/pharmaceuticals (33%). Manpower equates this finding with the fact that 65% of IT professionals are always looking for the next job opportunity.

If you’re going to reach out to IT professionals and you’re truly interested in maintaining a dialog, don’t send out a general email blast. Instead, engage in a meaningful conversation.

#5: They’re more willing to relocate than others

IT professionals are more likely to relocate to a new city (38%) or a new state (40%) than the U.S. average of 30% and 29%, respectively, but less willing to move to a different country (8%) than the U.S. average (10%). Manpower attributes the greater degree of mobility to the lure of California locations.

While Skype interviews are common, be ready and willing to reimburse top candidates for their travel to and from an on-site interview. It demonstrates a willingness to invest in your people.

Conclusion

Companies should avoid cookie-cutter approaches to IT recruitment because they tend to overlook some of the important things candidates value. What candidates value also changes with time.

Manpower’s report can provide more insight into what IT professionals really want. It also includes some great advice. Happy reading.

Want to Succeed at Data-Driven Transformation? Start Slow

Data-driven transformation efforts often fail because companies are moving too fast or too slow. If they’re racing into it, they’re probably not getting the details right. If they’re trying to get everything into a perfect state before doing anything, they’re wasting precious time. Either way, ROI suffers.

If you want to strike the right balance, you have to think carefully about what you want to do and why. You also have to move at an appropriate speed, which means balancing short-term and long-term goals.

Start small

Boston Consulting Group recently published an in-depth article about data-driven transformation. In it, the authors raise a lot of good points, one of which is taking small steps rather than giant leaps.

Taking small steps enables businesses to succeed more often, albeit on a smaller scale. If the project is a success, the money saved or earned can be used to fund subsequent steps. Small successes also help keep teams motivated. However, executing too many small projects simultaneously can be counterproductive.

“The first mistake I see is doing 200 proof of concepts. The second thing I see is people start to do a pilot, even at scale, but [they think] first we need a developer to reinvent the system and then we’ll get the value at the end,” said Antoine Gourévitch, a Senior Partner and Managing Director at BCG. “I’d rather find a quick and dirty way to connect the IT systems and get [some immediate value] rather than doing a full transformation. You can have a transformation over three to five years, but at least I need to do the connection between my pilot at scale and the dedicated systems for the data platform that’s needed to be of value as we go.”

The third challenge is prioritizing pilots or projects. Value is the criterion there. Without a prioritized roadmap, “cool” projects may take precedence over projects that deliver business value.

Three steps to better ROI

BCG offers a lot of good advice in its article, not the least of which is breaking short-term and long-term goals into a three-step process that enables quick wins while paving the way to data-driven transformation. The three steps are:

  • Use quick wins to fund the digital journey and learn
  • Design the company-wide transformation
  • Organize for sustained performance

Within those three steps, BCG specified actions companies should take. Upon close review, it’s clear that some of the recommended actions, such as “fail fast,” apply to more than one step. If you read BCG’s article and ponder the graphics, it will get you thinking about how to scale success in your organization.

BCG also presents a five-level transformation model that includes vision, use cases, analytics, data governance and data infrastructure. Gourévitch said data governance tends to be the most problematic because data isn’t viewed as a corporate asset, so data owners may hesitate to share it.

Bottom line

Companies often move too fast or too slowly when becoming data-driven organizations. When they move too fast, they can overlook important details that cause initiatives to fail. When they move too slowly, they risk losing competitive advantages.

Somewhere between 200 pilots and one massive transformation effort is a balance of short-term and long-term goals, investment and ROI. The challenge, as always, is finding the balance.

Can Virtual Companies Scale?

Modern technology has enabled more working professionals to telecommute, whether they work for traditional companies or progressive ones. Some employers nevertheless maintain dedicated office space for each employee; others offer shared workspaces that aren’t assigned to any one person.

Over the past couple of decades, technology and societal changes have enabled the rise of virtual companies. Those that succeed have some kind of “secret sauce,” which differs from organization to organization.

“If you’re a virtual company, you have to work differently than if you were in an office,” said Bjorn Freeman-Benson, CTO at product design platform provider InVision. “We have to deliberately coordinate our work. And because we’re deliberate, we scale more easily. We’ve got 250 to 300 employees now.”

Worldwide talent pool

Virtual companies tend to be distributed by default because the talent is spread out over several geographic locations. As they grow, one of two things happens: they either move into office space because they’re unable to operate efficiently, or they stay virtual by placing more emphasis on talent than on where that talent resides. New York-based InVision has employees in Montana, Argentina, and other locations, for example.

Similarly, virtual law firm Culhane Meadows has 55 partners in different locations, most of whom are senior partners with eight or more years’ experience.

“I think we are the only alternative model that’s truly a partnership,” said Culhane Meadows Managing Partner Kelly Culhane. “You really have to use technology purposefully to maintain the standards of traditional law firms.”

Culhane Meadows started out with two advantages: Fortune 100 clients and four founding partners with different areas of expertise. Those four founders are responsible for operations, finance, marketing, and technology, respectively, which provides a solid foundation from which to grow.

If a partner needs a temporary office, she rents it from temporary office space provider Regus. If she needs paralegals or secretaries, they are hired on a contract basis through a temporary staffing partner.

The lower overhead enables Culhane Meadows to provide big law firm service without the big law firm price tag.

Results trump time

Managing a virtual workforce can be challenging, especially when it’s viewed through a traditional lens: not just 8:00 am to 5:00 pm schedules, but also scenarios where an employee might work 40 hours one week and many more, or fewer, the following week.

“We don’t care about butt time, we care about results,” said Freeman-Benson.

His company uses Slack for collaboration. Within Slack, the company has set up different channels so that salespeople can go to a virtual “deal desk” before extending an offer. Similarly, if a customer wants to know what InVision’s security policies are, a salesperson can tap into the security channel.

Some structure is good

Virtual companies often have less formal reporting structures, but not always. For example, Disney-focused MickeyTravels has a two-founder husband-and-wife team and 115 contract travel agents. Some of those travel agents have additional responsibilities, such as managing a group of contract agents or training agents.

Apparently, the business model is working well: MickeyTravels is one of the most successful Disney travel agencies in the world and one of only 12 Disney Platinum agencies.

“Our agents are well-versed on what we sell so nobody can ask them a question they don’t know the answer to,” said Greg Antonelle, co-founder of MickeyTravels. “The beauty of technology now is you have FaceTime, Skype, GoToMeeting, webinars, and all that stuff.”

Why Marketing Is So Smart, Yet So Dumb

Marketing is considered the most analytically advanced function in most companies. Yet, consumers and businesses are still bombarded with irrelevant promotional messages.

It’s true that marketers have had access to “modern” analytics tools longer than most others in an organization. It started with web analytics and then grew to encompass other digital channels and even offline channels.

In the last decade, there has been a push toward “multi-channel” and “omnichannel” analytics. Multi-channel analytics is designed to optimize marketing effectiveness within and across channels. Omnichannel analytics focuses on improving a continuous user experience across channels.

Marketing analytics is difficult, in other words, despite the availability of more and better tools.

“What are the exact ads, the exact conversations, and the exact place that drove someone to make a purchase on my site or in my store?” said Chris Madden, co-founder of digital marketing agency Matchnode. “I think in 0% of the cases does the super smart, data-driven marketer, CMO, or CEO know what drove the sale.”

It’s Complicated

The number of online channels has exploded over the past couple of decades with the rise of Search Engine Marketing (SEM), social media, and mobile, to name a few. Anyone familiar with even one of those channels knows that change is constant, and if you don’t keep up, you’ll slip up eventually.

“We’ve seen Facebook come out twice in the past year claiming that their method for measuring engagement metrics [was] wrong. There will be more growing pains as these platforms stabilize,” said Mitul Jain, vice president at data science platform provider r4 Technologies.

Meanwhile, brick-and-mortar entities are tracking what’s happening in stores using kiosks, digital point-of-sale (POS) systems, customers’ smartphones, and security cameras. They also have e-commerce sites. Their big challenge is to understand the relationship between online and offline channels.

We’re Tracking Activities, Not People

Activities are being monitored in every channel whether posts, clicks, downloads, foot traffic, or credit card swipes, but not all of that information is being stitched together into a coherent, accurate picture.

“Analytics does not do a very good job of knowing that the person on my phone is the same person on my tablet and desktop,” said Matchnode’s Madden. “The marketers who are doing well are those that start with the person.”
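
Starting with the person usually means resolving device-level activity through a shared identifier captured on each device, such as a hashed login email. The Python sketch below uses made-up events and a deliberately simple key; it illustrates the idea rather than any vendor’s actual identity-resolution method.

```python
from collections import defaultdict

# Hypothetical event records from separate device-level trackers.
# The shared key (a hashed email captured at login) is what lets us
# resolve three "visitors" into one person.
events = [
    {"device": "phone-123",   "user_hash": "ab12", "action": "view"},
    {"device": "tablet-456",  "user_hash": "ab12", "action": "add_to_cart"},
    {"device": "desktop-789", "user_hash": "ab12", "action": "purchase"},
    {"device": "phone-999",   "user_hash": "cd34", "action": "view"},
]

def stitch(events):
    """Group device-level events into one journey per known person."""
    journeys = defaultdict(list)
    for event in events:
        journeys[event["user_hash"]].append(event["action"])
    return dict(journeys)
```

Without the shared key, the first three events would read as three separate one-touch visitors instead of one person’s path to purchase.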

Attribution is Difficult

Most of the time, there isn’t a 1:1 relationship between a message and an outcome (e.g., a sale, download, or donation). Usually, the final outcome is influenced by several factors that may include search-based research, search or social media advertising, product reviews, direct mail and email offers, apps, and websites.

The natural, and incorrect, thing to do is to attribute the outcome entirely to the last interaction. The shopper visited the site and bought something, so the ecommerce site gets full credit. However, since several other factors likely influenced the decision, what percentage of the sale should be attributed to each? That’s the burning question.
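
The contrast between last-touch credit and a simple multi-touch split can be made concrete. The Python sketch below is a simplified illustration with a made-up path to purchase; real attribution models weight position, recency, or fitted coefficients rather than splitting credit evenly.

```python
def last_touch(touchpoints):
    """Give 100% of the credit to the final interaction before the sale."""
    credit = {channel: 0.0 for channel in touchpoints}
    credit[touchpoints[-1]] = 1.0
    return credit

def linear(touchpoints):
    """Split credit evenly across every interaction in the path."""
    share = 1.0 / len(touchpoints)
    credit = {}
    for channel in touchpoints:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

# A hypothetical path to purchase drawn from the factors above.
path = ["search ad", "product review", "email offer", "ecommerce site"]
```

Under last-touch, the ecommerce site gets all the credit; under the linear model, each of the four touchpoints gets 25%. Neither answer is “correct,” which is precisely the attribution problem.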

Data Quality Could Improve

Marketing tends to use several different systems and platforms, each of which may differ enough to affect data quality. Perhaps fields or tags are implemented differently, or there are five instances of a customer record, all of which are inconsistent.
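
Finding those five inconsistent instances of a customer record starts with a matching rule. The Python sketch below uses made-up CRM rows and a deliberately naive match key (a normalized email address); production master-data tools use fuzzier matching and survivorship rules on top of this idea.

```python
def match_key(record):
    """Canonicalize the field used to decide whether two rows are the same customer."""
    return record["email"].strip().lower()

def find_duplicates(records):
    """Return groups of records that collapse to the same match key."""
    groups = {}
    for record in records:
        groups.setdefault(match_key(record), []).append(record)
    return {key: recs for key, recs in groups.items() if len(recs) > 1}

# Made-up CRM rows: the same customer entered three different ways.
records = [
    {"name": "Pat Q. Jones",    "email": "pat@example.com "},
    {"name": "Patricia Jones",  "email": "PAT@example.com"},
    {"name": "P. Jones",        "email": "pat@example.com"},
    {"name": "Sam Lee",         "email": "sam@example.com"},
]
```

Any analysis that counts customers before running a pass like this will overcount, which is one way inconsistent records quietly corrupt downstream metrics.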

“Large sites may not have the code in the right places or double instances of code. For example, an .edu site with multiple departments may have different tracking codes on the same site, which can create a lot of confusion,” said Max Thomas, CMO at fintech startup YayPay.

Thomas audits a client’s data points to make sure they’re correct early in the relationship. Quite often he discovers that the client hasn’t set up website analytics correctly or they haven’t set up conversion tracking correctly. If either or both of those things are true, the client is referencing faulty data.

We’re Biased

Humans are biased creatures. What we perceive is based on beliefs and experience, most of which are subjective. Our subjective view, or bias, causes us to do many things that skew analytical results, such as selecting non-random samples or cherry-picking data.

“The mistake people are making is they don’t let the data talk to them. They’re looking for something in the data that’s not there,” said D. Anthony Miles, CEO and founder of consulting firm Miles Development Industries Corp. “You have to ask what the data is telling you and what it isn’t telling you.”

Marketers tend not to look at analytical results critically, however. They tend to accept results at face value unless the results are out of sync with their beliefs. If they were looking critically, they’d ask why a particular result occurred or didn’t occur.

“The data can and should tell the story, but we make up our own story and look for data to support it, so we may miss the most important thing because we were looking for something else,” said Matchnode’s Madden.

Pesky PII

Finally, marketing can only be so accurate without a critical mass of Personally Identifiable Information (PII), some of which consumers do not want to give and some of which is illegal to collect. Without the missing data points, it’s difficult to reach consumers with the right message at the right time for the right reason.

The lack of that “last mile” data is the reason why some people think marketing will never be 100% accurate. What do you think? Before you answer, think of a compromising message that might be sent to you, just at the wrong moment.

What A Chief Analytics Officer Really Does

As analytics continues to spread out across an organization, someone needs to orchestrate it all. The “best” person for the job is likely a chief analytics officer (CAO) who understands the business, understands analytics, and can help align the two.

The CAO role is a relatively new C-suite position, as is the chief data officer, or CDO. Most organizations don’t have both, and when they don’t, the titles tend to be used interchangeably. The general distinction is that the CAO focuses more on analytics and its business impact, while the CDO is in charge of data management and data governance.

“The new roles are really designed to expand the use of data and expand the questions that data is used to answer,” said Jennifer Belissent, principal analyst at Forrester. “It’s changing the nature of data and analytics use in the organization, leveraging the new tools and techniques available, and creating a culture around the use of data in an organization.”

Someone in your organization may already have some or all of a CAO’s responsibilities and may be succeeding in the position without the title, which is fine. However, in some organizations a C-suite title and capability can help underscore the importance of the role and the organization’s shift toward more strategic data usage.

“The CAO needs to be able to evangelize the use of data, demonstrate the value of data, and deliver outcomes,” said Belissent. “It’s a role around cultural change, change management, and evangelism.”

If you’re planning to appoint a CAO, make sure that your organization is really ready for one because the role can fail if it is prevented from making the kinds of change the organization needs. A successful CAO needs the support of senior management, as well as the authority, responsibility, budget, and people skills necessary to effect change.

One mistake organizations make when hiring a CAO is placing too much emphasis on technology and not enough emphasis on business acumen and people skills.

The making of a CAO

When professional services company EY revisited its global strategy a few years ago, it was clear to its leadership that data and analytics were of growing importance to both its core business and the new services it would provide to clients.

Rather than hiring someone from the outside, EY chose its chief strategy officer, Chris Mazzei, for the role. His charter as CAO was to develop an analytics capability across EY’s four business units and the four global regions in which it operates.

[Want to learn more about CAOs and CDOs? Read 12 Ways to Connect Data Analytics to Business Outcomes.]

Part of his responsibility was shaping the strategy and making sure each of the businesses had a plan they were executing against. He also helped expand the breadth and depth of EY’s analytical capabilities, which included acquiring 30 companies in four years.

The acquisitions coupled with EY’s matrixed organizational structure meant lots of analytics tools, lots of redundancies, and a patchwork of other technology capabilities that were eventually rationalized and made available as a service. Meanwhile, the Global Analytics Center of Excellence Mazzei leads was also building reusable software assets that could be used for analytics across the business and for client engagements.

Mazzei and his team also have been responsible for defining an analytics competency profile for practitioners and providing structured training that maps to it. Not surprisingly, his team also works in a consultative capacity with account teams to help enable clients’ analytical capabilities.

“The question is, ‘What is the strategy and how does analytics fit into it?’ It sounds obvious, but few organizations have a clear strategy where analytics is really connected into it across the enterprise and at a business level,” said Mazzei. “You really need a deep understanding of how the business creates value, how the market is evolving, what the sources of competitive differentiation are and how those could evolve. Where you point analytics is fundamentally predicated on having those views.”

Mazzei had the advantage of working for EY for more than a decade and leading the strategy function before becoming the CAO. Unlike a newly hired CAO, he already had relationships with the people at EY with whom he’d be interfacing.

“Succeeding in this role takes building really trusted relationships in a lot of different parts of the organization, and often at very senior levels,” said Mazzei. “One reason we’ve seen CAOs fail is either because they didn’t have the skills to build those relationships or didn’t invest enough time on it during their tenure.”

Big Data: The Interdisciplinary Vortex

As seen in InformationWeek.

Getting the most from data requires information sharing across departmental boundaries. Even though information silos remain common, CIOs and business leaders in many organizations are cooperating to enable cross-functional data sharing to improve business process efficiencies, lower costs, reduce risks, and identify new opportunities.

Interdepartmental data sharing can take a company only so far, however, as evidenced by the number of companies using (or planning to use) external data. To get to the next level, some organizations are embracing interdisciplinary approaches to big data.

Why Interdisciplinary Problem-Solving May Be Overlooked

Breaking down departmental barriers isn’t easy. There are the technical challenges of accessing, cleansing, blending, and securing data, as well as very real cultural habits that are difficult to change.

Today’s businesses are placing greater emphasis on data scientists, business analysts, and data-savvy staff members. Some of them also employ or retain mathematicians and statisticians, although they may not have considered tapping other forms of expertise that could help enable different and perhaps more accurate forms of data analysis and new innovations.

“Thinking of big data as one new research area is a misunderstanding of the entire impact that big data will have,” said Dr. Wolfgang Kliemann, associate VP for research at Iowa State University. “You can’t help but be interdisciplinary because big data is affecting all kinds of things including agriculture, engineering, and business.”

Although interdisciplinary collaboration is mature in many scientific and academic circles, applying non-traditional talent to big data analysis is a stretch for most businesses.

But there are exceptions. For example, Ranker, a platform for lists and crowdsourced rankings, employs a chief data scientist who is also a moral psychologist.

“I think psychology is particularly useful because the interesting data today is generated by people’s opinions and behaviors,” said Ravi Iyer, chief data scientist at Ranker. “When you’re trying to look at the error that’s associated with any method of data collection, it usually has something to do with a cognitive bias.”

Ranker has been working with a UC Irvine professor in the cognitive sciences department who studies the wisdom of crowds.

“We measure things in different ways and understand the psychological biases each method of data creates. Diversity of opinion is the secret to both our algorithms and the philosophy behind the algorithms,” said Iyer. “Most of the problems you’re trying to solve involve people. You can’t just think of it as data, you have to understand the problem area you’re trying to solve.”
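The wisdom-of-crowds effect Iyer and his collaborator study can be illustrated with a short simulation. This is an illustrative sketch, not Ranker’s actual algorithm: assuming each individual estimate is noisy but unbiased, averaging many independent estimates lands much closer to the truth than a typical individual does.

```python
import random

random.seed(42)
true_value = 100.0

# 500 simulated individual estimates: each is noisy but unbiased
estimates = [true_value + random.gauss(0, 20) for _ in range(500)]

# The crowd's estimate is the simple average of all individual guesses
crowd_estimate = sum(estimates) / len(estimates)

# How far off is a typical individual, on average?
typical_individual_error = sum(abs(e - true_value) for e in estimates) / len(estimates)

print(f"crowd error: {abs(crowd_estimate - true_value):.2f}")
print(f"typical individual error: {typical_individual_error:.2f}")
```

The crowd’s error shrinks roughly with the square root of the number of independent estimates, which is why diversity of opinion matters: correlated biases break the independence the averaging relies on.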

Why Interdisciplinary Problem-Solving Will Become More Common

Despite the availability of new research methods, online communities, and social media streams, products still fail and big-name companies continue to make high-profile mistakes. They have more data available than ever before, but there may be a problem with the data, the analysis, or both. Alternatively, the outcome may fall short of what is possible.

“A large retail chain is interested in figuring out how to optimize supply management, so they collect the data from sales, run it through a big program, and say, ‘this is what we need.’ This approach leads to improvements for many companies,” said Kliemann. “The question is, if you use this specific program and approach, what is your risk of not having the things you need at a given moment? The way we do business analytics these days, that question cannot be answered.”

One mistake is failing to understand the error structure of the data. With that understanding in hand, it’s possible to identify missing pieces of data, the possible courses of action, and the risk associated with a particular strategy.

“You need new ideas under research, ideas of data models, [to] understand data errors and how they propagate through models,” said Kliemann. “If you don’t understand the error structure of your data, you make predictions that are totally worthless.”
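One simple way to propagate data error through a model, rather than trusting a single point estimate, is a Monte Carlo simulation. The following sketch revisits Kliemann’s retail example with invented numbers (the forecast, stock level, and 5% measurement error are all assumptions, not from the article): instead of asking only “what do we need?”, it asks “what is the risk of not having what we need?”

```python
import random

random.seed(7)

# Hypothetical point forecast of weekly demand derived from sales data
forecast = 1000.0
stock_on_hand = 1050

# Assume the underlying sales data carries roughly 5% measurement error.
# Propagate that error through the forecast by resampling many times.
trials = 10_000
shortfalls = 0
for _ in range(trials):
    noisy_demand = forecast * (1 + random.gauss(0, 0.05))
    if noisy_demand > stock_on_hand:
        shortfalls += 1

print(f"estimated stockout risk: {shortfalls / trials:.1%}")
```

A point forecast of 1,000 against 1,050 units in stock looks safe, but once the data’s error structure is modeled, a meaningful stockout risk emerges — exactly the question Kliemann says standard business analytics leaves unanswered.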

Already, organizations are adapting their approaches to accommodate the growing volume, velocity, and variety of data. In the energy sector, cheap sensors, cheap data storage, and fast networks are enabling new data models that would have been impossible just a few years ago.

“Now we can ask ourselves questions such as if we have variability in wind, solar, and other alternative energies, how does it affect the stability of a power system? [We can also ask] how we can best continue building alternative energies that make the system better instead of jeopardizing it,” said Kliemann.

Many universities are developing interdisciplinary programs focused on big data to spur innovation and educate students entering the workforce about how big data can affect their chosen field. As the students enter the workforce, they will influence the direction and culture of the companies for which they work. Meanwhile, progressive companies are teaming up with universities with the goal of applying interdisciplinary approaches to real-world big data challenges.

In addition, the National Science Foundation (NSF) is trying to accelerate innovation through Big Data Regional Innovation Hubs. The initiative encourages federal agencies, private industry, academia, state and local governments, nonprofits, and foundations to develop and participate in big data research and innovation projects across the country. Iowa State University is one of about a dozen universities in the Midwestern region working on a proposal.

In short, interdisciplinary big data problem-solving will likely become more common in industry as organizations struggle to understand the expanding universe of data. Although interdisciplinary problem-solving is alive and well in academia and in many scientific research circles, most businesses are still trying to master interdepartmental collaboration when it comes to big data.

How Corporate Culture Impedes Data Innovation

As seen in InformationWeek.

Corporate culture moves slower than tech

Competing in today’s data-intensive business environment requires unprecedented organizational agility and the ability to drive value from data. Although businesses have allocated significant resources to collecting and storing data, their abilities to analyze it, act upon it, and use it to unlock new opportunities are often stifled by cultural impediments.

While the need to update technology may be obvious, it may be less obvious that corporate cultures must also adapt to changing times. The necessary adjustments to business values, business practices, and leadership strategies can be uncomfortable and difficult to manage, especially when they conflict with the way the company operated in the past.

If your organization isn’t realizing the kind of value from its big data and analytics investments that it should be, the problem may have little to do with technology. Even with the most effective technologies in place, it’s possible to limit the value they provide by clinging to old habits.

Here are five ways that cultural issues can negatively affect data innovation:

1. The Vision And Culture Are At Odds

Data-driven aspirations and “business as usual” may well be at odds. What served a company well up to a certain point may not serve the company well going forward.

“You need to serve the customer as quickly as possible, and that may conflict with the way you measured labor efficiencies or productivity in the past,” explained Ken Gilbert, director of business analytics at the University of Tennessee Office of Research and Economic Development, in an interview with InformationWeek.


Companies able to realize the most benefit from their data are aligning their visions, corporate mindsets, performance measurement, and incentives to effect widespread cultural change. They are also more transparent than similar organizations, meaning that a wide range of personnel has visibility into the same data, and data is commonly shared among departments, or even across the entire enterprise.

“Transparency doesn’t come naturally,” Gilbert said. “Companies don’t tend to share information as much as they should.”

Encouraging exploration is also key. Companies that give data access to more executives, managers, and employees than they did in the past have to also remove limits that may be driven by old habits. For example, some businesses discourage employees from exploring the data and sharing their original observations.

2. Managers Need Analytics Training

Companies that are training their employees in ways to use analytical tools may not be reaching managers and executives who choose not to participate because they are busy or consider themselves exempt. In the most highly competitive companies, executives, managers, and employees are expected to be — or become — data savvy.

Getting the most from BI and big data analytics means understanding what the technology can do, and how it can be used to best achieve the desired business outcomes. There are many executive programs that teach business leaders how to compete with business analytics and big data, including the Harvard Business School Executive Education program.

3. Expectations Are Inconsistent

This problem is not always obvious. While it’s clear the value of BI and big data analytics is compromised when the systems are underutilized, less obvious are inconsistent expectations about how people within the organization should use data.

“Some businesses say they’re data-driven, but they’re not actually acting on that. People respond to what they see rather than what they hear,” said Gilbert. “The big picture should be made clear to everybody — including how you intend to grow the business and how analytics fits into the overall strategy.”

4. Fiefdoms Restrict Data Sharing

BI and analytics have moved out from the C-suite, marketing, and manufacturing to encompass more departments, but not all organizations are taking advantage of the intelligence that can be derived from cross-functional data sharing. An Economist Intelligence Unit survey of 530 executives around the world revealed that information-sharing issues represented the biggest obstacle to becoming a data-driven organization.

“Some organizations supply data on a need-to-know basis. There’s a belief that somebody in another area doesn’t need to know how my area is performing when they really do,” Gilbert said. “If you want to use data as the engine of business growth, you have to integrate data from internal and external sources across lines, across corporate boundaries.”

5. Little-Picture Implementations

Data is commonly used to improve the efficiency or control the costs of a particular business function. However, individual departmental goals may not align with the strategic goal of the organization, which is typically to increase revenue, Gilbert said.

“If the company can understand what the customer values, and build operational systems to better deliver, that is the company that’s going to win. If the company is being managed in pieces, you may save a dime in one department that costs the company a dollar in revenue.”

Six Ways to Master the Data-Driven Enterprise

As seen in InformationWeek.

Big data is changing the way companies and industries operate. Although virtually all businesses acknowledge the trend, not all of them are equally prepared to meet the challenge. The companies in the best position to compete have transformed themselves into “data-driven” organizations.

Data-driven organizations routinely use data to inform strategy and decision-making. Although other businesses share the same goal, many are still struggling to build the necessary technological capabilities, to overcome a culture that interferes with their use of data, or both.

Becoming a data-driven organization is difficult. While all organizations have a glut of data, their abilities to collect, cleanse, integrate, manage, access, secure, govern, and analyze it vary significantly from company to company. Each of those capabilities helps ensure that data can be used with confidence, yet even the strongest of them deliver little value if the corporate culture lags behind the technology.

Data-driven organizations have extended the use of data across everyday business functions, from the C-suite to the front lines. Rather than hoping that executives, managers, and employees will use business intelligence (BI) and other analytical tools, companies that are serious about the use of data are training employees, making the systems easier to use, making it mandatory to use the systems, and monitoring the use of the systems. Because their ability to compete effectively depends on their ability to leverage data, such data-driven organizations make a point of aligning their values, goals, and strategies with their ability to execute.

On the following pages we reveal the six traits common to data-driven organizations that make them stand out from their competitors.

Forward Thinkers

Data-driven enterprises consider where they are, where they want to go, and how they want to get there. To ensure progress, they establish KPIs to monitor the success of business operations, departments, projects, employees, and initiatives. Quite often, these organizations have also established one or more cross-functional committees of decision-makers who collectively ensure that business goals, company practices, and technology implementations are in sync.

“The companies that have integrated data into their business strategies see it as a means of growing their businesses. They use it to differentiate themselves by providing customers with better service, quicker turnaround, and other things that the competition can’t meet,” said Ken Gilbert, director of business analytics at the University of Tennessee’s Office of Research and Economic Development, in an interview with InformationWeek. “They’re focused on the long-term and big-picture objectives, rather than tactical objectives.”

Uncovering Opportunities

Enterprises have been embracing BI and big data analytics with the goal of making better decisions faster. While that goal remains important to data-driven enterprises, they also are trying to uncover risks and opportunities that may not have been discoverable previously, either because they didn’t know what questions to ask or because previously used technology lacked the capability.

According to Gartner research VP Frank Buytendijk, fewer than half of big data projects focus on direct decision-making. Other objectives include marketing and sales growth, operational and financial performance improvement, risk and compliance management, new product and service innovation, and direct or indirect data monetization.

Hypothesis Trumps Assumption

People have been querying databases for decades to get answers to known questions. The shortcoming of that approach is assuming that the question asked is the optimal question to ask.

Data-driven businesses aim to continuously improve the quality of the questions they ask. Some of them also try to discover, through machine learning or other means, what questions they should be asking that they have not yet asked.

The desire to explore data is also reflected in the high demand for interactive self-service capabilities that enable users to adjust their thinking and their approaches in an iterative fashion.

Pervasive Analytics

Data analytics has completely transformed the way marketing departments operate. More departments than ever are using BI and other forms of analytics to improve business process efficiencies, reduce costs, improve operational performance, and increase customer satisfaction. A person’s role in the company influences how the data is used.

Big data and analytics are now on the agendas of boards of directors, which means that executives not only have to accept and support the use of the technologies, they also have to use them — meaning they have to lead by example. Aberdeen’s 2014 Business Analytics survey indicated that data-driven organizations are 63% more likely than the average organization to have “strong” or “highly pervasive” adoption of advanced analytical capabilities among corporate management.

Failure Is Acceptable

Some companies encourage employees to experiment because they want to fuel innovation. With experimentation comes some level of failure, which progressive companies are willing to accept within a given range.

Encouraging exploration and accepting the risk of failure that accompanies it can be difficult cultural adjustments, since failure is generally considered the opposite of success. Many organizations have made significant investments in big data, analytics, and BI solutions. Yet, some hesitate to encourage data experimentation among those who are not data scientists or business analysts. This is often because, historically, the company’s culture has encouraged conformity rather than original thinking. Such a mindset not only discourages innovation, it fails to acknowledge that the failure to take risks may be more dangerous than risking failure.

Data Scientists And Machine Learning

Data-driven companies often hire data scientists and use machine learning so they can continuously improve their ability to compete. Microsoft, IBM, Accenture, Google, and Amazon ranked first through fifth, respectively, in a recent list of 7,500 companies hiring data scientists. Google, Netflix, Amazon, Pandora, and PayPal are a few examples of companies using machine learning with the goal of developing deeper, longer-lasting, and more profitable relationships with their customers than previously possible.

Tech Buying: 6 Reasons Why IT Still Matters

Originally published in InformationWeek, and available as a slideshow here.

Making major tech purchases, especially big data analytics and business intelligence tools, without consulting IT may cause major problems. Here’s why.

Although shadow IT is not new, the percentage of business tech purchases made outside IT is significant and growing. When Bain & Company conducted in-depth interviews with 67 marketing, customer service, and supply chain professionals in February 2014, it found that nearly one-third of technology purchasing power had moved to executives outside of IT. Similarly, member-based advisory firm CEB has estimated that non-IT departments control 30% of enterprise IT spend. By 2020, Gartner estimates, 90% of tech spending will occur outside IT.

There are many justifications for leaving IT in the dark about departmental tech purchases. For one thing, departmental technology budgets seem to point to departmental decision making. Meanwhile, cloud-based solutions, including analytics services, have become more popular with business users because they are easy to set up. In addition, their relatively low subscription rates or pay-per-use models may be more attractive from a budgetary standpoint than their traditional on-premises counterparts, which require significant upfront investments and IT consideration. Since the cost and onboarding barriers to cloud service adoption are generally lower than for on-premises products, IT’s involvement may seem to be unnecessary.

Besides, IT is busy. Enterprise environments are increasingly complex, and IT budgets are not growing proportionally, so the IT department is resource-constrained. Rather than waiting for IT — or complicating decision-making by getting others involved — non-IT tech buyers anxious to deploy a solution may be tempted to act first and answer questions later.

However, making tech purchases without IT’s involvement may result in unforeseen problems. On the following pages, we reveal six risks associated with making business tech purchases without involving IT.

1. Tech Purchases Affect Everybody
Tech purchases made without IT’s involvement may affect IT and the IT ecosystem in ways that someone outside IT couldn’t anticipate. You might be introducing technical risk factors or tapping IT resources IT will have to troubleshoot after the fact. To minimize the potential of unforeseen risks, IT can perform an in-depth assessment of your department’s requirements, the technology options, their trade-offs, and the potential ripple effect that your tech purchase might have across the organization. This kind of risk/benefit analysis is important. Even if it seems like a barrier for your department to get what it wants, it’s better for the entire organization in the long run.
Also, you may need help connecting to data sources, integrating data sources, and ensuring the quality of data, all of which require specific expertise. IT can help you understand the scope of an implementation in greater detail than you might readily see.

2. Sensitive Information May Be Compromised
Information security policies need to be defined, monitored, and enforced. While it’s common for businesses to have security policies in place, education about those policies, and the enforcement of those policies, sometimes fall short. Without appropriate precautions, security leaks can happen innocently, or you could be opening the door to intentional bad actors.
Cloud-based services can expose organizations to risks that users haven’t considered, especially when the service’s terms of use are not understood. A survey of 4,140 business and IT managers, conducted in July 2012 by The Ponemon Institute and sponsored by Thales e-Security, revealed that 63% of respondents did not know what cloud providers were doing to protect their sensitive or confidential data.

3. Faulty Data = Erroneous Conclusions
There is no shortage of data to analyze. However, inadequate data quality and access to only a subset of information can negatively impact the accuracy of analytics and, ultimately, decision making.
In an interview with InformationWeek, Jim Sterne, founder of the eMetrics Summit and the Digital Analytics Association, warned that the relative reliability of sources needs to be considered since CRM system data, onsite user behavior data, and social media sentiment analysis data are not equally trustworthy.
“If I’m looking at a dashboard as a senior executive and I know where the data came from and how it was cleansed and blended, I’m looking at the numbers as if they have equal weight,” he said. “It’s like opening up a spice cabinet and assuming each spice is as spicy as any other. I will make bad decisions because I don’t know how the information was derived.”
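Sterne’s spice-cabinet point — that unequally reliable sources shouldn’t carry equal weight — can be made concrete with a toy calculation. The figures below are invented for illustration: three sources estimate the same customer-satisfaction metric, and weighting each by the inverse of its assumed error variance pulls the blended number toward the trustworthy sources.

```python
# Three hypothetical sources estimating the same 0-100 metric,
# each with an assumed (invented) standard error.
sources = {
    "CRM data":         {"estimate": 72.0, "std_error": 2.0},
    "onsite behavior":  {"estimate": 68.0, "std_error": 5.0},
    "social sentiment": {"estimate": 55.0, "std_error": 15.0},
}

# Naive equal weighting: every spice treated as equally spicy.
equal = sum(s["estimate"] for s in sources.values()) / len(sources)

# Inverse-variance weighting: reliable sources count for more.
weights = {name: 1 / s["std_error"] ** 2 for name, s in sources.items()}
total = sum(weights.values())
weighted = sum(weights[name] * sources[name]["estimate"] for name in sources) / total

print(f"equal weighting:      {equal:.1f}")
print(f"reliability-weighted: {weighted:.1f}")
```

The equal-weight average is dragged down by the noisiest source, while the reliability-weighted figure stays close to the two trustworthy ones — the difference an executive never sees if the dashboard hides how the numbers were derived.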

4. Not Getting What You Bought
Similar products often sound alike, but their actual capabilities can vary greatly. IT can help identify important differences.
While it may be tempting to purchase a product based on its exhaustive feature set or its latest enhancements, feature-based buying often proves to be a mistake because it omits or minimizes strategic thinking. To reduce the risk of buyer’s remorse, consulting with IT can help you assess your current and future requirements and help you choose a solution that aligns with your needs.

5. Scope Creep
Business users typically want immediate benefits from big data, analytics packages, and BI systems. But, if the project has a lot of technological complexity — and particularly if it involves tech dependencies that are outside the control of your department — it’s often best to implement in phases. Approaching large initiatives as one big project may prove to be more complicated, time-consuming, and costly than anticipated.
IT can help you break a large, difficult-to-manage project into several smaller projects, each of which has its own timeline and goals. That way, you can set realistic end-user and C-suite expectations and effectively control risks. Phasing large projects can also provide you with the flexibility you need to adjust your implementation as business requires.

6. Missing Out On Prior Experience
IT professionals and outsourced IT resources often have prior experience with BI and analytics implementations that are specific or relevant to your department. Some of them have implemented solutions in other companies, departments, or industries and have gained valuable insight from those experiences. When armed with such knowledge, they can help you understand potential opportunities, challenges, and pitfalls you may not have considered, all of which can affect planning, implementation, and the choice of solutions.
