Lisa Morgan's Official Site

Click-worthy Content Development

Big Data: The Interdisciplinary Vortex

As seen in InformationWeek.

Getting the most from data requires information sharing across departmental boundaries. Even though information silos remain common, CIOs and business leaders in many organizations are cooperating to enable cross-functional data sharing to improve business process efficiencies, lower costs, reduce risks, and identify new opportunities.

Interdepartmental data sharing can take a company only so far, however, as evidenced by the number of companies using (or planning to use) external data. To get to the next level, some organizations are embracing interdisciplinary approaches to big data.

Why Interdisciplinary Problem-Solving May Be Overlooked

Breaking down departmental barriers isn’t easy. There are the technical challenges of accessing, cleansing, blending, and securing data, as well as very real cultural habits that are difficult to change.

Today’s businesses are placing greater emphasis on data scientists, business analysts, and data-savvy staff members. Some also employ or retain mathematicians and statisticians, although they may not have considered tapping other kinds of expertise that could enable different, and perhaps more accurate, forms of data analysis and spur new innovation.

“Thinking of big data as one new research area is a misunderstanding of the entire impact that big data will have,” said Dr. Wolfgang Kliemann, associate VP for research at Iowa State University. “You can’t help but be interdisciplinary because big data is affecting all kinds of things including agriculture, engineering, and business.”

[Tear down the silos. See How Corporate Culture Impedes Data Innovation.]

Although interdisciplinary collaboration is mature in many scientific and academic circles, applying non-traditional talent to big data analysis is a stretch for most businesses.

But there are exceptions. For example, Ranker, a platform for lists and crowdsourced rankings, employs a chief data scientist who is also a moral psychologist.

“I think psychology is particularly useful because the interesting data today is generated by people’s opinions and behaviors,” said Ravi Iyer, chief data scientist at Ranker. “When you’re trying to look at the error that’s associated with any method of data collection, it usually has something to do with a cognitive bias.”

Ranker has been working with a UC Irvine professor in the cognitive sciences department who studies the wisdom of crowds.

“We measure things in different ways and understand the psychological biases each method of data creates. Diversity of opinion is the secret to both our algorithms and the philosophy behind the algorithms,” said Iyer. “Most of the problems you’re trying to solve involve people. You can’t just think of it as data, you have to understand the problem area you’re trying to solve.”

Why Interdisciplinary Problem-Solving Will Become More Common

Despite the availability of new research methods, online communities, and social media streams, products still fail and big-name companies continue to make high-profile mistakes. They have more data available than ever before, but there may be a problem with the data, the analysis, or both. Alternatively, the outcome may fall short of what is possible.

“A large retail chain is interested in figuring out how to optimize supply management, so they collect the data from sales, run it through a big program, and say, ‘this is what we need.’ This approach leads to improvements for many companies,” said Kliemann. “The question is, if you use this specific program and approach, what is your risk of not having the things you need at a given moment? The way we do business analytics these days, that question cannot be answered.”

One mistake is failing to understand the error structure of the data. With that understanding, it’s possible to identify missing pieces of data, the possible courses of action, and the risk associated with a particular strategy.

“You need new ideas under research, ideas of data models, [to] understand data errors and how they propagate through models,” said Kliemann. “If you don’t understand the error structure of your data, you make predictions that are totally worthless.”
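
To make Kliemann’s point concrete, here is a minimal sketch, in Python, of propagating measurement error through a model via Monte Carlo resampling. The toy model, the sales figures, and the 5% error assumption are all invented for illustration; only the technique is the point.

    import random
    import statistics

    def forecast_demand(weekly_sales):
        # Toy model: next period's demand = average observed sales * growth factor.
        return statistics.mean(weekly_sales) * 1.1

    observed = [102, 98, 110, 95, 105]  # hypothetical point-of-sale counts

    def noisy(values, rel_error=0.05):
        # Assume each measurement carries roughly 5% multiplicative error.
        return [v * random.gauss(1.0, rel_error) for v in values]

    # Propagate the input error through the model by resampling many times.
    runs = sorted(forecast_demand(noisy(observed)) for _ in range(10_000))

    print(f"point forecast: {forecast_demand(observed):.1f}")
    print(f"95% interval  : {runs[250]:.1f} to {runs[9750]:.1f}")

Delivered as an interval rather than a single number, the forecast makes visible exactly the stock-out risk Kliemann says today’s business analytics typically cannot answer.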

Already, organizations are adapting their approaches to accommodate the growing volume, velocity, and variety of data. In the energy sector, cheap sensors, cheap data storage, and fast networks are enabling new data models that would have been impossible just a few years ago.

“Now we can ask ourselves questions such as if we have variability in wind, solar, and other alternative energies, how does it affect the stability of a power system? [We can also ask] how we can best continue building alternative energies that make the system better instead of jeopardizing it,” said Kliemann.

Many universities are developing interdisciplinary programs focused on big data to spur innovation and educate students entering the workforce about how big data can affect their chosen field. As the students enter the workforce, they will influence the direction and culture of the companies for which they work. Meanwhile, progressive companies are teaming up with universities with the goal of applying interdisciplinary approaches to real-world big data challenges.

In addition, the National Science Foundation (NSF) is trying to accelerate innovation through Big Data Regional Innovation Hubs. The initiative encourages federal agencies, private industry, academia, state and local governments, nonprofits, and foundations to develop and participate in big data research and innovation projects across the country. Iowa State University is one of about a dozen universities in the Midwestern region working on a proposal.

In short, interdisciplinary big data problem-solving will likely become more common in industry as organizations struggle to understand the expanding universe of data. Although interdisciplinary problem-solving is alive and well in academia and in many scientific research circles, most businesses are still trying to master interdepartmental collaboration when it comes to big data.

Why Surveys Should Be Structured Differently

If you’re anything like me, you’re often asked to participate in surveys. Some of them are short and simple. Others are very long, very complicated, or both.

You may also design and implement surveys from time to time like I do. If you want some insight into the effectiveness of your survey designs and their outcomes, pay attention to the responses you get.

Notice the Drop-off Points

Complicated surveys that take 15 or 20 minutes to complete tend to show drop-off points at which respondents decided that the time investment required wasn’t worth whatever incentive was offered. After all, not everyone actually cares about survey results or a 1-in-1,000 chance of winning the latest iPad. If there’s no incentive whatsoever, long and complicated surveys may be even less successful, even if you’re pinging your own database.

A magazine publisher recently ran such a survey, and boy, was it hairy. It started out like similar surveys, asking questions about the respondent’s title, affiliation, company revenue, and size. It also asked about purchasing habits – who approves, who specifies, who recommends, etc. – for different kinds of technologies. Then it asked what the respondent’s content preferences are for learning about tech (several drop-down menus), using tech (several drop-down menus), and purchasing tech (several drop-down menus), and I can’t remember what else. At that point, one was about 6% done with the survey. So much for “10 – 15 minutes.” It took about 10 or 15 minutes just to wade through the first single-digit percent of it. One would have to really want that slim chance at the incentive to complete that survey.

In short, the quest to learn everything about everything in one very long and complex survey may end in more knowledge about who took the survey than about how people feel about important issues.

On the flip side are very simple surveys that take a minute or two to answer.  Those types of surveys tend to focus on whether a customer is satisfied or dissatisfied with customer service, rather than delving into the details of opinions about several complicated matters.

Survey design is really important.  Complex fishing expeditions can and often do reflect a lack of focus on the survey designer’s part.

Complex Surveys May Skew Results

Overly complicated surveys may also yield spurious results.  For example, let’s say 500 people agree to take a survey we just launched that happens to be very long and very complex.  Not all of the respondents will get past the who-are-you questions because those too are complicated.  Then, as the survey goes on, more people drop, then more.

The result is that X% of the survey responses at the end of the survey is not the same as X% earlier in the survey. What I mean is that 500 people started, maybe 400 got past the qualification portion, and the numbers continued to fall as yet more complicated questions arose while the “progress bar” showed little forward movement. By the end of the survey, far fewer than 500 were still participating, maybe 200 or 100.

Of course, no one outside the survey team knows this, including the people in the company who are presented with the survey results.  They only know that 500 people participated in the survey and X% said this or that.

However, had all 500 people answered all the questions, the results of some of the questions would likely look slightly or considerably different, which may be very important.

Let’s say 150 people completed our survey, and the last question asked whether they planned to purchase an iPhone 7 within the next three months. 40% of them, or 60 respondents, said yes. If all 500 survey respondents had answered that same question, I can almost guarantee the answer would not be 40%. It might be close to 40%, or it might not be close at all.
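
For readers who want to check the intuition, the standard normal-approximation margin of error for a proportion shows how much precision evaporates as respondents drop off. This is a textbook formula, not something from the survey in question, and the 40% figure is the hypothetical one above.

    import math

    def margin_of_error(p, n, z=1.96):
        # 95% normal-approximation margin of error for a sample proportion.
        return z * math.sqrt(p * (1 - p) / n)

    for n in (500, 150, 60):
        print(f"n = {n:>3}: 40% +/- {margin_of_error(0.40, n):.1%}")

    # n = 500: 40% +/- 4.3%
    # n = 150: 40% +/- 7.8%
    # n =  60: 40% +/- 12.4%

And that only accounts for random sampling noise; the drop-off described above is not random, so it adds bias that no margin-of-error formula can repair.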

So, if you genuinely care about divining some sort of “truth” from surveys, you need to be mindful of how you define and structure the survey, and aware that the data you see may not be telling you the entire story, or even an accurate one.

The point about accuracy is very important and one that people without some kind of statistical background likely haven’t even considered because they’re viewing all aggregate numbers as having equal weight and equal accuracy.

I, for one, think that survey “best practices” are going to evolve in the coming years with the help of data science.  While the average business person knows little about data science now, in the future it will likely seem cavalier not to consider the quality of the data you’re getting and what you can do to improve the quality of that data.  Your credibility and perhaps your job may depend on it.

In the meantime, try not to shift the burden of thinking entirely to your survey audience because it won’t do either of you much good.  Think about what you want to achieve, structure your questions in a way that gives you insight into your audience and their motivations (avoid leading questions!), and be mindful that not all aggregate answers are equally accurate or representative, even within the same survey.

How To Increase Contributed Article Placement Success

Pitching and placing contributed articles is a staple of a good PR program. Editors are bombarded with ideas every day. Some make the cut, some don’t. Want to up your chances? Think and write like a journalist.

I’ve received a few pitches lately that I found quite incredible.  I imagine the PR reps thought the pitches were logical – I certainly would have before I had journalism experience myself – but sitting in the chair of a journalist, I could see how faulty their strategy was.

The idea was, “How about if my client contributes an article that highlights the features of its product?”  My response was, “Sounds like a sponsored editorial product.”  Why?  Because the proposed content was essentially an ad.

My audiences – business executives and technologists – don’t have much of an appetite for blatantly self-promotional prose.  Moreover, promotional content does little to establish your client as “a thought leader.”  It positions them more like a salesperson.  The same can be said for other media including video and webinars.

The best contributed pieces really show off an expert’s chops.  That person knows more about leadership or emotional analytics or programming in Python than most of his or her peers do, and that person is willing to share their expertise, free of blatant product or service tie-backs.  A good pitch reflects that.

I bring this up because I hate to see people waste time and their clients’ budgets.  There are better and worse ways to do things.  It is entirely possible to advance your client’s agenda without attaching flashing lights to it.

Having said all of this, I see contributed articles published that are self-promotional, especially in the tech pubs that have had their budgets slashed dramatically. I don’t think those articles do the community or the contributor much good, so I’m sticking to my guns.

If you want to improve your chances of making it into the little pile, think and write like a journalist.


…But the bugs remain

As seen in SD Times.

Software teams are under pressure to deliver higher-quality software faster, but as high-profile failures and lackluster app ratings indicate, it’s easier said than done. With the tremendous growth of agile development, finding bugs earlier in the development cycle has become an imperative, but not all organizations are succeeding equally well.

“Developers realize they need better tools to investigate problems, but we need to make sure we’re not creating problems in the first place,” said Gil Zilberfeld, product manager of unit testing solution provider Typemock.

Software teams are using all kinds of tools, including bug and defect trackers, SCM tools, testing suites, and ALM suites, and yet software quality has not improved generally, according to William Nichols, a senior member of the technical staff at the Software Engineering Institute.

“The data don’t suggest that the software being produced is any better than it was a decade or 20 years ago, whether you measure it by lines of code or function points and defects,” he said. “We’re seeing one to seven defects per 1,000 lines of code. We’re making the same mistakes, and the same mistakes cause the same problems.”
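
To put Nichols’ range in perspective, a back-of-the-envelope calculation (mine, not his) scales it to common codebase sizes:

    # Nichols' reported range: one to seven defects per 1,000 lines of code.
    for kloc in (10, 100, 1000):
        print(f"{kloc:>5,} KLOC: roughly {1 * kloc:,} to {7 * kloc:,} defects")

Even at the low end, a million-line system carries on the order of a thousand latent defects, which is why the tracking and testing tools discussed below matter.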

One problem is focusing too much on the speed of software delivery rather than software quality. Nichols said this is a symptom of unrealistic management expectations. Tieren Zhou, founder and CEO of testing and ALM solution provider TechExcel, considered it a matter of attention: what’s sexy versus what matters.

“Bug fixing is less interesting than building features,” said Zhou. “In the interest of acquiring new customers, you may be losing old customers who are not happy with your products.”

While software failures are often blamed on coding and inadequate testing, there are many other reasons why software quality isn’t what it should be, as evidenced by defects injected at various points within the software life cycle.

Bug and defect tracking go agile

Bug and defect tracking is becoming less of a siloed practice as organizations embrace agile practices. Because agile teams are cross-functional and collaborative, tools are evolving to better align with their needs.

“We’re moving from isolation to transparency,” said Paula Rome, senior project manager at Seapine, a provider of testing and ALM solutions. “It makes no sense to have critical decision-making information trapped in systems.”

Since software teams no longer have weeks to dedicate to QA, developers and testers are working closer together than ever. While pair programming and test-driven development practices can help improve the quality of code, not every team is taking advantage of those and other means that can help find and fix defects earlier in the life cycle.

“There’s a need to find problems earlier, more often and faster, but what you’re seeing are .01 releases that fix a patch, or software teams using their customer base as a bug-tracking and bug-finding system,” said Archie Roboostoff, experience director for the Borland portfolio at Micro Focus, a software-quality tool provider.

Atlassian, maker of the JIRA issue and bug tracker, is helping teams get more insight into bugs with its latest 6.2 release. Instead of viewing bugs in “open” and “closed” terms, users can now see how many commits have been made, whether the peer reviews were successful, and whether the code has been checked into production.

“The process of fixing a bug is a multi-stage process,” said Dan Chuparkoff, head of JIRA product marketing at Atlassian. “Developers check things out of the master branch, write some code, submit their code for peer reviews, peers comment on their code, the developers make some adjustments, check it into the master branch, and roll it up into production. Those steps are completely invisible in most bug systems so the stakeholders have trouble seeing whether something’s close to being finished or not.”

uTest (soon to be known as Applause) offers “in the wild” testing, which is a crowdsourced approach to quality assurance that enables organizations to find issues in production before their customers do.

Software teams are using the service to supplement their lab tests, although some, especially those doing three builds a week, are running lab and in-the-wild tests in parallel.

“In an agile world and in a continuous world, you want to make sure things are thoroughly tested and want to accelerate your sprints,” said Matt Johnston, chief strategy officer of uTest. “We help them catch things that were missed in lab testing, and we’re helping them find things they can’t reproduce.”

To keep pace with faster software release velocities, Hamid Shojaee, CEO of Scrum software provider Axosoft, is focusing on usability so individuals and teams can do a better job of resolving defects in less time.

“The custom pieces of information associated with each tracked bug are different for every team,” he said. “Creating custom fields has been a time-consuming and difficult thing to do when you’re customizing a bug-tracking tool. We have an intuitive user interface, so what would have taken you 20 to 30 minutes takes seconds.”

AccuRev is also enabling teams to spend less time using tools and more time problem-solving.

“Defect tracking can be cumbersome,” said Joy Darby, a director of engineering at AccuRev. “By the time software gets to QA, they have to ask questions or reference e-mails or look at a white board. With a central repository, you have instant access to all the artifacts, all the tests that were done, the build results, and any sort of complex code analysis you may have done.”

While more tools are evolving to support continuous integration and deployment, organizational cultures are not moving as quickly.

“While we’re all off iterating, the business is off waterfalling,” said Jeff Dalton, a Standard CMMI Appraisal Method for Process Improvement lead appraiser and CMMI instructor. “Software teams are accelerating their delivery cycles while the rest of the business still views software in terms of phases, releases, large planning efforts, large requirements, and 12-month delivery cycles.”

The disconnect between agile and traditional ways of working can work against software quality when funding is not tied to the outcome of sprints, for example.

Adopting a life cycle view of quality

As software teams become more agile, discrete workflows become collaborative ones that require a life-cycle view of software assets and interoperable tools. Even with the greater level of visibility life-cycle approaches provide, the root cause of problems nevertheless may be overlooked in the interest of finding and fixing specific bugs.

“We’ve inflated processes and tools in order to support something that could have been figured out earlier in the process if we had defined a better spec,” said Typemock’s Zilberfeld. “Instead, we spend five hours talking about something the customer doesn’t care about.”

Most ALM tools are open enough to support other tools whose capabilities equal or surpass what is in the ALM suite. Conversely, narrower tool providers are looking at bugs and defects in a broader sense because customers want to use the tools in a broader context than they have in the past.

“Software is no longer an isolated venture. It really affects all parts of the business,” said Atlassian’s Chuparkoff. “Modern issue trackers have REST APIs that allow you to easily connect your issue tracker to the entire product life cycle. We wanted to make sure JIRA can integrate with your proprietary tool and other tools via REST APIs or plug-ins from our marketplace. We realize people aren’t going to use JIRA in a silo by itself.”
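
As a rough illustration of what Chuparkoff describes, JIRA’s REST API can be queried in a few lines of Python using the third-party requests library. The instance URL, credentials, and project key below are placeholders, and the exact fields vary by JIRA version, so treat this as a sketch rather than production code.

    import requests

    JIRA_URL = "https://jira.example.com"  # placeholder instance
    AUTH = ("api_user", "api_token")       # placeholder credentials

    # Ask the standard search endpoint for open bugs in a hypothetical project.
    params = {
        "jql": "project = APP AND issuetype = Bug AND status != Closed",
        "fields": "summary,status,assignee",
        "maxResults": 50,
    }
    resp = requests.get(f"{JIRA_URL}/rest/api/2/search", params=params, auth=AUTH)
    resp.raise_for_status()

    for issue in resp.json()["issues"]:
        fields = issue["fields"]
        who = fields["assignee"]["displayName"] if fields["assignee"] else "unassigned"
        print(issue["key"], fields["status"]["name"], who, fields["summary"])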

Octo Consulting Group, which provides technology and management consulting services to federal agencies, is one of many organizations that are using JIRA in tandem with ALM solutions.

“Bug and defect tracking is part of ALM,” said Octo Consulting Group CTO Ashok Nare. “While we use JIRA, and there are a lot of good ALM products like Rally, VersionOne and CollabNet…the tools are really there to facilitate a process.”

Despite the broader life-cycle views, software quality efforts often focus on development and testing even though many defects are caused by ill-defined requirements or user stories.

“Philosophically, we didn’t use to think about bugs and defects in terms of requirements problems or customer problems or management problems, so we focused on code,” said CMMI’s Dalton. “But what we found was the code did what it was supposed to do, but didn’t do what the customer wanted it to do. It’s important to understand where the defect is injected into the process, because if we know that, we can change the process to fix it.”

Dalton prefers the process model approach, which includes prototypes, mockups, and wireframes as part of requirements and the design process, because they surface problems in the early stages, when they are the least costly to fix.

“Every time there’s an assumption or something fuzzy in the requirements it leads to defects,” said Adam Sandman, director at Inflectra (a maker of test-management software). “If you can’t define it, you can’t build it well.”

Inflectra, TechExcel, Seapine and the other ALM solution providers tie requirements, development, testing and other life-cycle stages together so that, among other things, defects can be identified, fixed and prevented from coming back in future iterations or releases.

“We’re connecting the dots, making it possible to have transparency between the silos so you get the data you need when you need it,” said Seapine’s Rome.

In addition to providing solutions, TechExcel is trying to help software teams deliver better products by promoting the concept of “QA floaters” who, as part of an agile team, help developers define test cases and run test cases in parallel with developers.

“When developers and QA floaters are both testing, you have a built-in process that helps you find and fix bugs earlier so the developer can satisfy a requirement or story,” said TechExcel’s Zhou. “When you tie in total traceability, you tie requirements, development and testing together in a way that improves productivity and software quality.”

Who owns software quality?

Software quality has become everyone’s job, but not everyone sees it that way, which is one reason why defects continue to fall through organizational cracks.

“When you separate the accountability and resources, that’s where disaster always starts,” said Andreas Kuehlmann, SVP of research and development at testing solution provider Coverity. “A lot of teams have gotten to the point where the developers are doing a little bit of testing, but the rest is tossed over the fence to QA who can’t even start the executable.”

Coverity offers three products that move testing into development: Quality Advisor, which uses deep semantic and static analysis to identify bugs in code as it compiles; Security Advisor, which uses the same technology to find security vulnerabilities; and Test Advisor, which identifies the riskiest code.

“Moving testing into development requires a lot to be done from a workflow perspective,” said Kuehlmann. “You have to have tests running 24×7, you have to have the tools and infrastructure in place, and you have to change developers’ mindsets. That’s really hard. The role of QA is evolving into more like a sign-off check.”

The dynamics between coders and testers are changing, but not in a uniform way. A minority of organizations are collapsing coding and testing into a single function, while the majority are leveraging the skill sets of both developers and QA with the goal of optimizing delivery speed and quality.

“Developers are really good at solving problems, and test engineers are good at finding vulnerabilities,” said Atlassian’s Chuparkoff. “If a developer can run an automated test after he finishes his code, he can fix the bug immediately while he’s in the thinking mode of fixing it. It’s a lot more efficient than fixing it four days later after someone gave you the issue.”

Annotated screen shots help speed up issue resolution, which is why Axosoft, Atlassian and Seapine have added the capability to their tools.

“You have to make sure people are taking the time to put the proper reproduction steps in to make sure those bugs are fixed,” said Axosoft’s Shojaee.

Not everyone on the team may be responsible for fixing defects, but many have the potential to inject them. Because software is increasingly the face of businesses, organizations are starting to realize that software quality isn’t a technical problem; it’s a business problem. For example, uTest’s Johnston recently met with the CIO of a major media company who considers software quality the CEO’s responsibility since a major portion of the company’s revenue is driven by digital experiences.

“If that sentiment can win the day, a lot more companies will be successful in the app economy,” said Johnston.

The complexity paradox

The software landscape is becoming more complex and, at the same time, tools and approaches to software development are becoming more abstract, all of which can make finding and fixing defects more difficult.

“It’s not about Windows and Linux anymore,” said Inflectra’s Sandman. “Now you have all these mobile devices and frameworks, and you’re seeing constant updates to browsers. If you’re building systems with frameworks and jQuery plug-ins and something goes wrong, do you fix it, ask the vendor to fix it, or ask the open-source community to fix it? Inevitably the bugs may not be in your application but in the infrastructure you’re relying on.”

Micro Focus’ Roboostoff agreed. “If users see quality as something that works, and your product doesn’t work, then it’s hugely defective,” he said.

“When I had my Web server, application server and database server sitting in my office, I could rest assured a problem was somewhere in the closet and I’d find it eventually. Now, I might have some REST services sitting in Amazon, some service-based message in Azure, six CDNs around the world, and A/B testing for optimizing linking going on, and then on Monday morning half of my customers say something is slow.”

Because there is so much complexity and because the landscape is changing so fast at so many levels, edge-case testing is becoming more important.

“When you consider there are about 160,000 combinations of devices, browsers and platforms you have to test for, most customers aren’t coming close to where they should be,” said Roboostoff. “Since it isn’t practical, you pick the biggest screen and the smallest screen, the newest devices and the oldest devices to lower that risk profile.”
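
The arithmetic behind that 160,000 figure, and behind the boundary-picking heuristic Roboostoff describes, is easy to sketch. The dimension sizes below are invented to match the quoted total; the point is how quickly the cross-product grows and how small an extremes-only subset is by comparison.

    from itertools import product

    # Hypothetical dimension sizes, chosen so the full cross-product matches
    # the roughly 160,000 combinations quoted above.
    devices   = [f"device_{i}"  for i in range(40)]
    browsers  = [f"browser_{i}" for i in range(10)]
    platforms = [f"os_{i}"      for i in range(20)]
    versions  = [f"v{i}"        for i in range(20)]

    print(len(devices) * len(browsers) * len(platforms) * len(versions))  # 160000

    # Roboostoff's heuristic: test only the extremes of each dimension
    # (biggest and smallest screen, newest and oldest device, and so on).
    extremes = list(product(devices[:1] + devices[-1:],
                            browsers[:1] + browsers[-1:],
                            platforms[:1] + platforms[-1:],
                            versions[:1] + versions[-1:]))
    print(len(extremes))  # 16 configurations to run first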

The fragmentation that is continuing to occur at so many levels can cause errors that are difficult to identify and rectify.

“One brand may have four to 10 different codebases, four to 10 product road maps, varying skill sets to accomplish all that, and a multitude of platforms and devices they are building software for that they have to test against,” said Johnston. “Meanwhile, users expect things to operate like a light switch.”

The U.S. government established a standardized approach to security assessment called the Federal Risk and Authorization Management Program (FedRAMP), which is apparently benefitting some software developers and consultants who need to be responsible for their software quality but are not in control of the cloud infrastructure. Octo Consulting Group’s Nare said that FedRAMP’s certification simplifies the testing he would otherwise have to do.

“As the level of abstraction goes up, if you’re only testing the top layer, you have to assume that the lower layers underneath, like the infrastructure in the cloud and the PaaS, are fundamentally sound so that everything is working the way it’s supposed to,” he said. “When we do security testing today and we test our applications, we don’t certify the whole stack anymore because the cloud service providers have already been certified. Otherwise you might have to write tests at the infrastructure or PaaS level.”

Meanwhile, most organizations are trying to wrap their arms around the breadth of testing and defect resolution practices necessary to deliver Web and mobile applications that provide the scalability, performance, and security customers expect.

“If you’re going to build better quality software faster, you need to make sure that the build actually works,” said Andreas Grabner, technology strategist at Compuware (an IT services company). “The software I write is more complex because it is interacting with things I can’t control.”

And that’s just the current state of Web and mobile development. With the Internet of Things looming, some tool providers expect that mainstream developers will have to write applications for devices other than smartphones, and as a result, system complexity and the related bug- and defect-tracking challenges will increase.

“If you think about the Web and the fragmentation of mobile devices, the complexity has increased by an order of magnitude,” said Johnston. “If you think about wearables or automobiles or smart appliances or smartwatches, it’s going to get exponentially worse.”

There’s no excuse for bad quality

There are many reasons why software quality falls short of user expectations, but the problem is that users don’t want to hear it. Even though every user complaint won’t make it to the top of a backlog, what customers consider “bugs” and “defects” have a nasty habit of making headlines, resulting in seething customer reviews and negatively impacted revenue.

“It’s unacceptable to tell users that you can’t reproduce a bug. These days they have all the cards,” said Johnston. “We live in a world where app quality—functional quality, usability quality, performance quality and security quality—are differentiators, and yet quality is still thought of as a cost center.”

Bug and defect tracking is all about problem-solving, but unfortunately some of the lingering problems aren’t being addressed despite impressive tool advancements because organizations change slower than technology does.

“I can have a product that’s completely bug free and has a great user experience, but if you get no value out of it, the quality is bad,” said Micro Focus’ Roboostoff. “People need to understand quality. It’s not about function; it’s about the customer perception of your product, your brand, and your company.”


Are Media Relationships Dead?

Strange as it may seem, PR pros used to spend incredible amounts of time cultivating relationships with the media. It wasn’t an email here or there or a social media ping. It was face-to-face time with editors of the target publications at events, on the road, and elsewhere.

I don’t know how many lunches, dinners, and media tours I went on when all of those things were fashionable.  While my PR clients were more interested in “hits” and cover stories, my agency was more concerned about the relationships we established because relationships transcend any client engagement.

In today’s highly fragmented world, things are very different.  PR people have to multitask on entirely different levels and in doing so, they sacrifice focus – focus on relationships, focus on targeting pitches, focus on learning what their clients really do.

I believe it’s still important to develop actual relationships with the media.  I can’t speak for all journalists on this point, but I can tell you that if we’ve established a relationship, your pitch will be placed at the top of the virtual pile, and I’m less inclined to delete it in the first place.  Also, if I have to do outreach for a story, I’ll probably contact you first.

One time, I spent 30 minutes on the phone talking to a PR person about his client’s product strategy simply because every time I needed him to cut through the red tape at that client’s organization, he did it.

PR success requires a confluence of many things, some of which are in your control and some of which are not.  One thing you can control is the way you approach and work with the media.  If you want to have more influence, stop looking at your job as a series of rat-tat-tat news announcements and start looking at the bigger picture.  Cultivate actual relationships with people, because there will be times when you need them, and vice versa.

Remember:  actual relationships transcend clients and publications.  You or I may move tomorrow.  If one or both of us does, you can count on me to point you in some kind of helpful direction, even if your client does not fit within one of my beats.

Six Characteristics of Data-Driven Rock Stars

As seen in InformationWeek

Data is being used in and across more functional aspects of today’s organizations. Wringing the most business value out of the data requires a mix of roles that may include data scientists, business analysts, data analysts, IT, and line-of-business titles. As a result, more resumes and job descriptions include data-related skills.

A recent survey by technology career site Dice revealed that nine of the top 10 highest-paying IT jobs require big data skills. On the Dice site, searches and job postings including big data skills have increased 39% year-over-year, according to Dice president Shravan Goli. Some of the top-compensated skills include big data, data scientist, data architect, Hadoop, HBase, MapReduce, and Pig – and pay for those skills ranges from more than $116,000 to more than $127,000, according to data Dice provided to InformationWeek.

However, the gratuitous use of such terms can cloud the main issue, which is whether the candidate and the company can turn that data into specific, favorable outcomes — whether that’s increasing the ROI of a pay-per-click advertising campaign or building a more accurate recommendation engine.

If data skills are becoming necessary for more roles in an organization, it follows that not all data-driven rock stars are data scientists. Although data scientists are considered the black belts, it is possible for other roles to distinguish themselves based on their superior understanding and application of data. Regardless of a person’s title or position in an organization, there are some traits common to data-driven rock stars that have more to do with attitudes and behaviors than technologies, tools, and methods. Click through for six of them.  [Note to readers:  This appeared as a slideshow.]

They Understand Data

Of course data-driven rock stars are expected to have a keener understanding of data than their peers, but what exactly does that mean? Whether a data scientist or a business professional, the person should know where the data came from, the quality of it, the reliability of it, and what methods can be used to analyze it, appropriate to the person’s role in the company.

How they use numbers is also telling. Rather than presenting a single number to “prove” that a certain course of action is the right one, a data-driven rock star is more likely to compare the risks and benefits of alternative courses of action so business leaders can make more accurate decisions.

“‘Forty-two’ is not a good answer,” said Wolfgang Kliemann, associate VP for research at Iowa State University. “‘Forty-two, under the following conditions and with a probability of 1.2% chance that something else may happen,’ is a better answer.”
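
Kliemann’s “better answer” can be sketched in a few lines: report an estimate together with a resampled interval rather than a bare number. The data below are simulated purely for illustration and come from no source in this article.

    import random
    import statistics

    random.seed(7)

    # Simulated daily results for two hypothetical courses of action.
    option_a = [random.gauss(42, 6) for _ in range(90)]
    option_b = [random.gauss(45, 12) for _ in range(90)]

    def bootstrap_means(data, reps=5000):
        # Resample with replacement to estimate the spread of the mean.
        n = len(data)
        return sorted(statistics.mean(random.choices(data, k=n)) for _ in range(reps))

    for name, data in (("A", option_a), ("B", option_b)):
        means = bootstrap_means(data)
        print(f"option {name}: mean {statistics.mean(data):.1f}, "
              f"95% interval {means[125]:.1f} to {means[4875]:.1f}")

Option B’s higher mean comes with a much wider interval; presenting both lets a decision-maker weigh risk alongside expected benefit, which is exactly the comparison described above.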

They’re Curious

Data-driven rock stars are genuinely curious about what data indicates and does not indicate. Their curiosity inspires them to explore data, whether toggling between data visualizations, drilling down into data, correlating different pieces of data, or experimenting with an alternative algorithm. The curiosity may be inspired by data itself, a particular problem, or problem-solving methods that have been used in a similar or different context.

Data scientists are expected to be curious because their job involves scientific exploration. Highly competitive organizations hire them to help uncover opportunities, risks, behaviors, and other things that were previously unknown. Meanwhile, some of those companies are encouraging “out of the box” thinking from business leaders and employees to fuel innovation, which increasingly includes experimenting with data. Some businesses even offer incentives for data-related innovation.

They Actively Collaborate with Others

The data value chain has a lot of pieces. No one person understands everything there is to know about data structure, data management, analytical methods, statistical analysis, business considerations, and other factors such as privacy and security. Although data-driven rock stars tend to know more about such issues than their peers, they don’t operate in isolation, because others possess knowledge they need. For example, data scientists need to be able to talk to business leaders, and business leaders have to know something about data. Similarly, a data architect or data analyst may not have the ability to manipulate, explore, understand, and dig through large data sets, but a data scientist could dig through and discover patterns and then bring in statistical and programming knowledge to create forward-looking products and services, according to Goli.

They Try to Avoid Confirmation Bias

Data can be used to prove anything, especially a person’s opinion. Data-driven rock stars are aware of confirmation bias, so they are more likely to try to avoid it. While the term itself may not be familiar, they know it is not a best practice to disregard or omit evidence simply because it differs from their opinions.

“People like to think that the perspective they bring is the only perspective or the best perspective. I’m probably not immune to that myself,” said Ravi Iyer, chief data scientist at Ranker, a platform for lists and crowdsourced rankings. “They have their algorithms and don’t appreciate experiments or the difference between exploratory and confirmatory research. I don’t think they respect the traditional scientific method as such.”

The Data Science Association’s Data Science Code of Professional Conduct has a rule dedicated specifically to evidence, data quality, and evidence quality. Several of its subsections are relevant to confirmation bias. Among the violations it lists are failing to “disclose any and all data science results or engage in cherry-picking” and failing to “disclose failed experiments or disconfirming evidence known to the data scientist to be directly adverse to the position of the client.”

They Update Their Skill Sets

Technology, tools, techniques, and available data are always evolving. The data-driven rock star is motivated to continually expand his or her knowledge base through learning, which may involve attending executive education programs, training programs, online courses, boot camps, or meetups, depending on the person’s role in the company.

“I encourage companies to think about growing their workforce because there aren’t enough people graduating with data science degrees,” said Goli. “You have to create a pathway for people who are smart, data-driven, and have the ability to analyze patterns; they just have to add a couple more skills.”

Job descriptions and resumes increasingly include more narrowly defined skills because it is critical to understand which specific types of big data and analytical skills a candidate possesses. A data-driven rock star understands the technologies, tools, and methods of her craft as well as when and how to apply them.

They’re Concerned About Business Impact

With so much data available and so many ways of analyzing it, it’s easy to get caught up in the technical issues or the tasks at hand while losing sight of the goal: using data in a way that positively impacts the business. A data-driven rock star understands that.

Making a business impact requires three things, according to IDC adjunct research adviser Fred McGee: having a critical mass of data available in a timely manner, using analytics to glean insights, and applying those insights in a manner that advances business objectives.

A data-driven rock star understands the general business objectives as well as the specific objective to which analytical insights are being applied. Nevertheless, some companies are still falling short of their goals. Three-quarters of data analytics leaders from major companies recently told McKinsey & Company that, despite using advanced analytics, their companies had improved revenue and costs by less than 1%.

Pitch Closes That May Not Help You

As a journalist, I’m pitched constantly. I’d say that 20 percent of the pitches I get are good and perhaps 5 percent are excellent. How would I know? Lots of journalism experience and lots of PR experience.

Interestingly, whether a pitch gets a response or not can boil down to a few words.

“If you’re interested in X, let me know.” With that close, I have the sender’s permission not to respond, so I don’t. If the close had been different, I probably would have said, “You should try pitching X instead.” Likely, this person will follow up and ask if I got their pitch. Yup.

The same close is often posed as a question: “Are you interested?” This is an easy one to answer most of the time because the answer is binary (yes or no). These are so easy to say “no” to without explanation.

I guess my issue with all of this is that the PR person doesn’t understand why they’re getting no response or curt responses, neither of which feels good. I understand. I spent a lot of years as a PR pro and PR exec, and I know how frustrating pitching can be. OTOH, when journalists are sorting through a pile of pitches, we can and will choose the path of least resistance whenever possible.

What Your PR Client Should NOT Do

Runaway clients can hurt coverage

Media interviews are an interesting thing to “manage.” There are clients who just want you to set up interviews, clients who value your involvement and guidance, and clients who are like helium balloons that just lost their strings.

Every now and then, even the best clients can get a little out of control, because they’re so passionate.  Passion is fine, but when it gets to the point of bulldozing an interview, it’s time for media training.

Why Bulldozing is a Bad Thing

I’m one of those journalists who prepares for interviews. I have a set of questions I develop for each story, because somebody is going to ask me for them and I need to define the scope of the interviews. Sometimes I have to improvise when I’m interviewing, which is fine, but when the whole interview is off-script, it may cause problems for everyone involved.

Sometimes I can’t get a word in, let alone a question, if it’s a telephone interview (which is very rare these days).  If it’s an email response, I’ll read through it, but…

Why I Have a “Script”

I develop a list of questions for every set of interviews I do.  I’m happy to send them in advance when requested, but I tend not to send them as a matter of course.  Occasionally, whether or not the interviewee has the questions in advance, that person will say, “I know you want to cover this, but…[I’ve decided the angle of the story should be something else]” or “I’ve looked at your questions and [I’m going to ignore them].”  Then they wonder why they’re not included in the story, or why the other guy was quoted multiple times.

There are several answers to these types of queries:

  • The content was irrelevant
  • The content was difficult or impossible to use given its lack of structure
  • The content doesn’t dovetail well with other conversations
  • It’s just too much work to use

An important thing to know is: I write on assignment.  That means an editor says, “write this,” or I pitch an idea, and that’s what I’m expected to deliver.

The Good News

The good news is that most interviewees have figured out that the best way to conduct interviews is to answer the questions asked, directly. It’s fine to give examples, cite use cases, or use analogies as supplementary material as long as the content is relevant to the angle of the story. If they respond to questions in a relevant manner, their chances of being included in a story, or of getting more coverage than they otherwise would, can improve significantly.

I do my best to include everyone I interview, but it’s not always possible.  I am happy to explain the situation to the PR rep, if asked.  After all, I spent many years sitting on that side of the desk.

Thankfully for all of us, the bulldozers are few and far between.  If your client is one of them, you’re wise to explain why bulldozing isn’t wise.  It will help you better manage client expectations down the line.

How Corporate Culture Impedes Data Innovation

As seen in InformationWeek

Corporate culture moves slower than tech

Competing in today’s data-intensive business environment requires unprecedented organizational agility and the ability to drive value from data. Although businesses have allocated significant resources to collecting and storing data, their abilities to analyze it, act upon it, and use it to unlock new opportunities are often stifled by cultural impediments.

While the need to update technology may be obvious, it may be less obvious that corporate cultures must also adapt to changing times. The necessary adjustments to business values, business practices, and leadership strategies can be uncomfortable and difficult to manage, especially when they conflict with the way the company operated in the past.

If your organization isn’t realizing the kind of value from its big data and analytics investments that it should be, the problem may have little to do with technology. Even with the most effective technologies in place, it’s possible to limit the value they provide by clinging to old habits.

Here are five ways that cultural issues can negatively affect data innovation:

1. The Vision And Culture Are At Odds

Data-driven aspirations and “business as usual” may well be at odds. What served a company well up to a certain point may not serve the company well going forward.

“You need to serve the customer as quickly as possible, and that may conflict with the way you measured labor efficiencies or productivity in the past,” explained Ken Gilbert, director of business analytics at the University of Tennessee Office of Research and Economic Development, in an interview with InformationWeek.

[ What matters more: Technology or people? Read Technology Is A Human Endeavor. ]

Companies able to realize the most benefit from their data are aligning their visions, corporate mindsets, performance measurement, and incentives to effect widespread cultural change. They are also more transparent than similar organizations, meaning that a wide range of personnel has visibility into the same data, and data is commonly shared among departments, or even across the entire enterprise.

“Transparency doesn’t come naturally,” Gilbert said. “Companies don’t tend to share information as much as they should.”

Encouraging exploration is also key. Companies that give data access to more executives, managers, and employees than they did in the past also have to remove limits that may be driven by old habits. For example, some businesses discourage employees from exploring the data and sharing their original observations.

2. Managers Need Analytics Training

Companies that are training their employees in ways to use analytical tools may not be reaching managers and executives who choose not to participate because they are busy or consider themselves exempt. In the most highly competitive companies, executives, managers, and employees are expected to be — or become — data savvy.

Getting the most from BI and big data analytics means understanding what the technology can do, and how it can be used to best achieve the desired business outcomes. There are many executive programs that teach business leaders how to compete with business analytics and big data, including the Harvard Business School Executive Education program.

3. Expectations Are Inconsistent

This problem is not always obvious. While it’s clear the value of BI and big data analytics is compromised when the systems are underutilized, less obvious are inconsistent expectations about how people within the organization should use data.

“Some businesses say they’re data-driven, but they’re not actually acting on that. People respond to what they see rather than what they hear,” said Gilbert. “The big picture should be made clear to everybody — including how you intend to grow the business and how analytics fits into the overall strategy.”

4. Fiefdoms Restrict Data Sharing

BI and analytics have moved out from the C-suite, marketing, and manufacturing to encompass more departments, but not all organizations are taking advantage of the intelligence that can be derived from cross-functional data sharing. An Economist Intelligence Unit survey of 530 executives around the world revealed that information-sharing issues represented the biggest obstacle to becoming a data-driven organization.

“Some organizations supply data on a need-to-know basis. There’s a belief that somebody in another area doesn’t need to know how my area is performing when they really do,” Gilbert said. “If you want to use data as the engine of business growth, you have to integrate data from internal and external sources across lines, across corporate boundaries.”

5. Little-Picture Implementations

Data is commonly used to improve the efficiency or control the costs of a particular business function. However, individual departmental goals may not align with the strategic goal of the organization, which is typically to increase revenue, Gilbert said.

“If the company can understand what the customer values, and build operational systems to better deliver, that is the company that’s going to win. If the company is being managed in pieces, you may save a dime in one department that costs the company a dollar in revenue.”
