Strategic Insights and Clickworthy Content Development

Month: March 2018

How To Increase Contributed Article Placement Success

Pitching and placing contributed articles is a staple of a good PR program.  Editors are bombarded with ideas every day.  Some make the cut, some don’t.  Want to up your chances?  Think and write like a journalist.

I’ve received a few pitches lately that I found quite incredible.  I imagine the PR reps thought the pitches were logical – I certainly would have before I had journalism experience myself – but sitting in the chair of a journalist, I could see how faulty their strategy was.

The idea was, “How about if my client contributes an article that highlights the features of its product?”  My response was, “Sounds like a sponsored editorial product.”  Why?  Because the proposed content was essentially an ad.

My audiences – business executives and technologists – don’t have much of an appetite for blatantly self-promotional prose.  Moreover, promotional content does little to establish your client as “a thought leader.”  It positions them more like a salesperson.  The same can be said for other media including video and webinars.

The best contributed pieces really show off an expert’s chops.  That person knows more about leadership or emotional analytics or programming in Python than most of his or her peers do, and that person is willing to share their expertise, free of blatant product or service tie-backs.  A good pitch reflects that.

I bring this up because I hate to see people waste time and their clients’ budgets.  There are better and worse ways to do things.  It is entirely possible to advance your client’s agenda without attaching flashing lights to it.

Having said all of this, I do see contributed articles published that are self-promotional, especially in tech pubs that have had their budgets slashed dramatically.  I don’t think those articles do the community or the contributor much good, so I’m sticking to my guns.

If you want to improve your chances of making it into the small pile of pitches that make the cut, think and write like a journalist.

Deloitte: 5 Trends That Will Drive Machine Learning Adoption

Companies across industries are experimenting with and using machine learning, but actual adoption rates are lower than one might expect. According to a 2017 SAP Digital Transformation Study, fewer than 10% of 3,100 executives from small, medium and large companies said their organizations were investing in machine learning. That will change dramatically in the coming years, according to a new Deloitte report, because researchers and vendors are making progress in five key areas that may make machine learning more practical for businesses of all sizes.

1. Automating data science

There is a lot of debate about whether data scientists will or won’t be automated out of a job. It turns out that machines are far better than humans at rote tasks such as data wrangling, performing them faster and more reliably.

“The automation of data science will likely be widely adopted and speak to this issue of the shortage of data scientists, so I think in the near term this could have a lot of impact,” said David Schatsky, managing director at Deloitte and one of the authors of Deloitte’s new report.

Industry analysts are bullish about the prospect of automating data science tasks, since data scientists can spend an inordinate amount of time collecting data and preparing it for analysis. For example, Gartner estimates that 40% of a data scientist’s job will be automated by 2020.

Data scientists aren’t so sure about that, and to be fair, few people, regardless of their position, have considered which parts of their job are ripe for automation.
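The rote wrangling those estimates refer to is easy to picture in code. Here is a minimal, hypothetical sketch of the kind of task being automated; the `auto_clean` helper and its mostly-numeric threshold are illustrative assumptions, not anything described in the Deloitte report.

```python
from statistics import median

def auto_clean(rows):
    """Automate two rote wrangling tasks on a list of dict records:
    numeric type coercion ("34" -> 34.0) and missing-value imputation."""
    def to_num(v):
        try:
            return float(v)
        except (TypeError, ValueError):
            return None

    cleaned = [dict(r) for r in rows]
    for col in cleaned[0]:
        nums = [to_num(r[col]) for r in cleaned]
        if sum(n is not None for n in nums) / len(nums) > 0.5:
            # Mostly numeric: coerce, and fill gaps with the median.
            fill = median(n for n in nums if n is not None)
            for r, n in zip(cleaned, nums):
                r[col] = fill if n is None else n
        else:
            # Categorical: fill gaps with the most common value.
            present = [r[col] for r in cleaned if r[col] is not None]
            fill = max(set(present), key=present.count)
            for r in cleaned:
                if r[col] is None:
                    r[col] = fill
    return cleaned

rows = [{"age": "34", "city": "NY"},
        {"age": None, "city": None},
        {"age": "29", "city": "NY"}]
clean = auto_clean(rows)  # ages coerced to floats, gaps filled
```

Tedious as it is for a human, every decision above is mechanical, which is exactly why this slice of the job is a candidate for automation.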

2. Reducing the need for training data

Machine learning tends to require a lot of data. According to the Deloitte report, training a machine learning model might require millions of data elements. While machine learning requirements vary based on the use case, “acquiring and labeling data can be time-consuming and costly.”

One way to address that challenge is to use synthetic data. Using synthetic data, Deloitte was able to reduce the actual amount of data required for training by 80%. In other words, 20% of the data was actual data and the remaining 80% was synthetic data.
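The 20/80 mix can be illustrated with the simplest form of synthetic data: jittered copies of real samples. The report doesn’t describe Deloitte’s actual generation technique, so the noise-based `augment` helper below is purely illustrative.

```python
import random

def augment(real_samples, synthetic_ratio=4, noise=0.05, seed=0):
    """Pad a small real dataset with synthetic records: each synthetic
    sample is a real one with a little Gaussian jitter per feature.
    synthetic_ratio=4 yields a 20% real / 80% synthetic mix."""
    rng = random.Random(seed)
    synthetic = [
        [v + rng.gauss(0, noise) for v in sample]
        for sample in real_samples
        for _ in range(synthetic_ratio)
    ]
    return real_samples + synthetic

real = [[1.0, 2.0], [3.0, 4.0]]
data = augment(real)
# len(data) == 10: 2 real samples plus 8 synthetic ones
```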

“How far we can go in reducing the need for training data has two kinds of question marks: How far can you reduce the need for training data and what characteristics of data are most likely minimized and which require massive datasets?” said Schatsky.

3. Accelerating training

Massive amounts of data and heavy computation can take considerable time. Chip manufacturers are addressing this issue with various types of chips, including GPUs and application-specific integrated circuits (ASICs). The end result is faster training of machine learning models.

“I have no doubt that with the new processor architectures, execution is going to get faster,” said Schatsky. “[The chips] are important and necessary, but not sufficient to drive significant adoption on their own.”

4. Explaining results

Many machine learning models spit out a result, but they don’t provide the reasoning behind the result. As Deloitte points out, business leaders often hesitate to place blind faith in a result that can’t be explained, and some regulations require an explanation.

In the future, we’ll likely see machine learning models that are more accurate and transparent, which should open the door for greater use in regulated industries.
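One widely used, model-agnostic way to add that kind of transparency is permutation importance: shuffle one input feature and measure how far accuracy falls. A minimal sketch follows; the toy model and data are invented for illustration.

```python
import random

def permutation_importance(model, X, y, feature_idx, seed=0):
    """Accuracy drop when one feature column is shuffled: a model-agnostic
    signal of which inputs a black-box prediction actually relies on."""
    def accuracy(rows):
        return sum(model(x) == label for x, label in zip(rows, y)) / len(y)

    col = [x[feature_idx] for x in X]
    random.Random(seed).shuffle(col)
    shuffled = [x[:feature_idx] + [v] + x[feature_idx + 1:]
                for x, v in zip(X, col)]
    return accuracy(X) - accuracy(shuffled)

# Toy black-box model that secretly looks only at feature 0.
model = lambda x: int(x[0] > 0.5)
X = [[0.9, 0.1], [0.2, 0.8], [0.7, 0.3], [0.1, 0.9]]
y = [1, 0, 1, 0]
# Shuffling feature 1 never changes the predictions, so its importance is 0.
```

The attraction for regulated industries is that the technique treats the model as a sealed box, so it works even when the internals can’t be inspected.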

[Deloitte also recently discussed 9 AI Benefits Enterprises Are Experiencing Today.]

“No one knows how far you can go yet in terms of making an arbitrary neural network-based model interpretable,” said Schatsky. “We could end up hitting some limits identifying a fairly narrow set of cases where you can turn a black box model into an open book for certain kinds of models and situations, but there will be other scenarios where they work well but you can’t use them in certain situations.”

5. Deploying locally

Right now, machine learning typically requires a lot of data and training can be time-consuming. All of that requires a lot of memory and a lot of processing power, more than mobile and smart sensors can handle, at least for now.

In its report, Deloitte points out there is research in this area too, some of which has reduced the size of models by an order of magnitude or more using compression.
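One common compression technique is quantization: storing weights as small integers plus a shared scale factor, so an 8-bit representation stands in for 32-bit floats. A rough sketch of the idea, with made-up weights:

```python
def quantize(weights, bits=8):
    """Linear quantization: map float weights to signed integers plus one
    shared scale factor. int8 storage is ~4x smaller than float32, and
    combined with pruning, total compression can be far greater."""
    levels = 2 ** (bits - 1) - 1          # 127 for 8-bit signed
    scale = max(abs(w) for w in weights) / levels
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    return [q * scale for q in quantized]

weights = [0.12, -0.5, 0.33, 0.91, -0.07]
q, scale = quantize(weights)
restored = dequantize(q, scale)
# Every restored weight is within scale/2 of the original.
```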

The bottom line

Machine learning is having profound effects across industries, in applications ranging from TV pilots to medical diagnoses. It seems somewhat magical and somewhat scary to the uninitiated, though the barriers to adoption are falling. As machine learning becomes more practical for mainstream use, more businesses will use it whether they realize it or not.

“[The five] things [we identified in the report] are converging to put machine learning on a path toward mainstream adoption,” said Schatsky.  “If companies have been sitting it out waiting for this to get easier and more relevant, they should sit up instead and start getting involved.”

What Data Analysts Want to See in 2018

The demand for data analysts is at an all-time high, but organizations don’t always get the value they expect, mainly because the organization, or parts of it, get in the way.

Being an analyst can be a frustrating job if your position isn’t getting what it needs in terms of data, tools and organizational support. Are you getting what you need? Here are some of the things your contemporaries are saying.

More Data

Despite the glut of data companies have, analysts don’t always get the data they need, often because the data owners are concerned about privacy, security, losing control of their data or some combination of those things.

“The problem of data ownership and data sharing is universal,” said Sam Ruchlewicz, director of Digital Strategy & Data Analytics at advertising, digital, PR and brand agency Warschawski. “For analytics professionals, these artificial barriers hinder the creation of comprehensive, whole-organization analyses that can provide real, tangible value and serve as a catalyst for the creation (and funding) of additional analytics programs.”

Jesse Tutt, program lead of the IT Analytics Center of Excellence at Alberta Health Services, said getting access to the data he needs takes a lot of time because he has to work with the data repository owners to get their approval and then work with the technologists to get access to the systems. He also has to work with the vendors and the data repository subject matter experts.

“We’ve worked really hard getting access to the data sets, correlating the different datasets using correlation tables and cleaning up the data within the source systems,” he said. “If you ask a specific data repository what something is, it can tell you, but if you snapshot it on a monthly basis, you can see a trend. If you correlate that across other systems, you can find more value. In our case, the highest value is connecting the systems and creating the capability in a data warehouse, with reporting you can correlate across the systems.”

Four years ago, people at Alberta Health Services wanted to see trend data instead of just snapshots, so one system was connected to another. Now, 60 data sources are connected, with 60 more planned by the end of 2017. The company has a total of about 1,600 data sources, many of which will be connected in the next couple of years.
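The snapshot-then-trend approach Tutt describes can be sketched in a few lines. The helpdesk metric, system name and dates below are hypothetical, chosen only to show the mechanics.

```python
from datetime import date

def record_snapshot(history, system, snapshot_date, value):
    """Append a point-in-time measurement so trends can be computed later."""
    history.setdefault(system, []).append((snapshot_date, value))

def trend(history, system):
    """Month-over-month deltas for one system's snapshots."""
    points = sorted(history[system])
    return [b[1] - a[1] for a, b in zip(points, points[1:])]

history = {}
record_snapshot(history, "helpdesk", date(2017, 1, 1), 120)
record_snapshot(history, "helpdesk", date(2017, 2, 1), 135)
record_snapshot(history, "helpdesk", date(2017, 3, 1), 150)
# trend(history, "helpdesk") -> [15, 15]: ticket volume rising steadily
```

A single snapshot answers “what is it now?”; the accumulated series answers the more valuable “where is it heading?”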

More Respect

The most effective data analytics align with business objectives, but what happens when your data analysts aren’t informed? Warschawski’s Ruchlewicz recently had dinner with the CEO of a large, international agency who spent millions of dollars on a marketing campaign that failed simply because the executive didn’t want to listen to “the analytics kids.” Never mind the fact that the analytics team had identified a major issue the target audience had with the client’s brand.

“[The CEO] dismissed them as analytics kids who didn’t know what they were talking about and proceeded to launch the campaign,” said Ruchlewicz. “Only later, after millions of dollars in spending (with no results to show for it), did the CEO allow them to make their case and implement their recommendations.”

Ultimately, their recommendations turned the campaign around, Ruchlewicz said.

“I wish this was a one-off story. It’s not. I wish this was confined to ‘old school’ companies. It’s not,” said Ruchlewicz. “Until analytics teams are given a seat at the table where decisions are made, analytics will continue to be undervalued and underappreciated across the entire organization.”

Analysts have to earn respect like anyone else, however. That requires communicating to business professionals in business terms.

“Executives and investors today are hyper-focused on the bottom line, and most that I’ve interacted with perceive analytics as a line item expenditure,” said Ruchlewicz. “[A]nalytics professionals need to take the first step toward resolution. There are several methods that allow the creation of a rigorous, defensible first approximation, which is sufficient to get the conversation started (and usually, some data shared).”

To help turn the tide, analytics practitioners are well-advised to present information and construct business cases around their activities.

More Consistency

If everyone in the organization used the same terminology for everything, always had the right database fields accessible, and always entered data correctly and in the same manner, some enterprise data would be much cleaner than it is today. However, the problem doesn’t stop there.

“If a person says, ‘I want an analytical tool,’ how do you group that and do trending on it when a person may call it one of 100 different analytical tool names, or they’ll say, ‘I need to do analysis on data’? The words they submit are often different from what they actually want,” said Alberta Health Services’ Tutt.

Tutt and his team are endeavoring to better understand what people are requesting in service desk tickets so the company can manage its software investments more effectively. Now that his team has access to the different systems, they know who’s using a product and when they used it. They’re looking at the problem from a robotic process automation (RPA) perspective so software can be automatically removed if it hasn’t been used in a certain time period.
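The reclamation rule Tutt describes reduces to a simple policy: flag licenses idle past a cutoff so a bot can remove them. In the sketch below, the product names, dates and 90-day window are invented for illustration.

```python
from datetime import datetime, timedelta

def flag_unused(last_used, now, max_idle_days=90):
    """Return software licenses idle longer than the cutoff, candidates
    for an RPA bot to reclaim automatically."""
    cutoff = now - timedelta(days=max_idle_days)
    return sorted(sw for sw, ts in last_used.items() if ts < cutoff)

last_used = {
    "visio": datetime(2017, 1, 10),
    "tableau": datetime(2017, 11, 2),
    "acrobat": datetime(2017, 5, 30),
}
stale = flag_unused(last_used, now=datetime(2017, 11, 15))
# stale -> ["acrobat", "visio"]; tableau was used recently and is kept
```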

More Power to Effect Change

Industry analysts are pushing back on “data-driven” mantras because they think companies should be “insight-driven.” While they have a valid point, insights without action have little value.

For example, a large U.S. health provider has a massive analytics team that’s generating highly-actionable insights, but those insights are not being acted upon by the business. They can meet with a functional unit such as risk or compliance and show them insights. The operating unit will say, “That’s interesting,” but there’s no way to connect insights and action.

“The data teams are frustrated because they’re not getting the operational support they need,” said Adam Nathan, CEO and Founder of analytics strategy firm The Bartlett System. “The data teams don’t know how to drive that, except to get frustrated and quiet and get more value elsewhere. I think the tipping point will come when the company realizes it’s falling behind competitors. They’ll realize the company isn’t getting the value it could from analytics and that will put pressure on them to do something with those insights.”

How CIO/CFO Relationships Are Evolving

Digital transformation is driving huge organizational changes, not the least of which is the evolving relationships of CIOs and CFOs.

Traditionally, the two roles have been somewhat at odds because CIOs must continually invest in technologies and CFOs are ultimately responsible for financial performance. In today’s highly competitive business environment, CIOs and CFOs need to partner at a strategic level to drive growth and enable organizational agility.

From old school to new school

Data provider Dun & Bradstreet is going through a digital transformation that allows the 176-year-old company to behave and compete like a much younger entity. To get there, the CFO and former CIO (now Chief Content and Technology Officer) are working in partnership to set strategies and execute them.

“We come together quite a lot because what we’re trying to drive is more innovation at a faster clip in a more efficient way,” said Richard Veldran, CFO of Dun & Bradstreet. “It all comes down to data and technology which is at the core of many of the things we’re trying to get done here.”

As the sheer amount of data continues to grow exponentially, Dun & Bradstreet has more opportunities to drive growth by monetizing data. However, to do that, the CFO and CTO need to work as partners.

“So much of it now depends on the alignment of your technical capabilities and investments,” said Curtis Brown, chief content and technology officer at the firm. “Rich and I spend a lot more time talking about our strategy and our execution against that strategy. I would say that’s the single biggest change.”

The partnership allows Veldran and Brown to allocate resources more effectively and make joint decisions about where to invest and how to invest. They’re also working together in a lean agile fashion which enables them to accomplish more in less time while reducing the risk of big project failures.

Focused on high growth

Hitachi Vantara CIO Renee McKaskle and CFO Lori Varlas act as if they’re co-founders and, in a way, they are. Both women were hired into their respective positions about two years ago to spearhead digital transformation. Years before, Varlas and McKaskle had become acquainted while working at PeopleSoft.

“We’re two women in non-traditional women’s roles, so from the get-go, we bonded on the common vision of where we’re going to take this company and how our individual skills and experiences added to that story and towards that journey,” said Varlas. “I think the other thing that bonded us was time is not our friend, particularly in terms of technology, so we had to quickly align on what the business strategy was and figure out how we leverage our own backgrounds and experiences to make that vision a reality.”

They both say it’s important to learn from each other, listen to each other and be aligned on the vision or outcome.

“As we work really closely with the business, things come up. Someone might approach Renee or [me] for different purposes, but it springs to mind, ‘Has Renee’s cybersecurity team looked at that?’” said Varlas. Or, “Does Lori know about that for investment purposes?” said McKaskle. “There’s a bit of a tag team going there because we both have a common understanding and purpose of how it fits together.”

Empathy is key

Cross-functional collaboration is necessary to drive effective digital transformation; however, everyone interviewed for this blog said empathy for the other person’s role is critical.

“I can sometimes be a propeller head, but to think more empathically and as a partnership toward the enablement and delivery of the operation of the company, that’s where folks sometimes get stuck,” said Dun & Bradstreet’s Brown. “CFOs do have to put pressure on delivering a certain set of results within a certain financial framework while [CIOs and] CTOs are trying to drive technical improvements that often require investment.”

As businesses undergo digital transformation, the CIO and CFO have to move quickly and in unison. The best results come when they’re aligned on the business outcomes they’re trying to achieve. That alignment also helps CIOs and CFOs overcome some of the tensions that stem from traditionally separate roles.

Beware Analytics’ Mid-Life Crisis

Businesses are using analytics to stay competitive. One by one, departments are moving from static reports to modern analytics so they can fine-tune their operations. There’s no shortage of solutions designed for specific functions, such as marketing, sales, customer service and supply chain, most of which are available in SaaS form. So, when it’s possible just to pull out a credit card and get started with an application, why complicate things by involving IT?

Freedom from IT seems like a liberating concept until something goes wrong. When data isn’t available or the software doesn’t work as advertised, it becomes the IT department’s job to fix it.

“I used to call this the BI mid-life crisis. Usually about a year and a half or two years in, [departments] realize they can’t report accurately and then they need some help,” said Jen Underwood, founder of Impact Analytix, and a recognized analytics industry expert. “Now I’m seeing more IT involvement again.”

Organizations serious about competing on insights need to think holistically about how they’re approaching analytics and the role of IT. Disenfranchising IT from analytics may prove to be short-sighted. For example, a proof of concept may not scale well or the data required to answer a question might not be available.

Analytics’ long-term success depends on IT

IT was once the sole gatekeeper of technology, but as the pace of business has continued to accelerate, departments have become less tolerant of delays caused by IT. While it’s true no one understands departmental requirements better than the department itself, IT is better equipped to identify what could go wrong, technically speaking.

Even if a department owns and manages all of its data, at some point it will likely want to combine that data with other data, perhaps from a different group.

“We became accustomed to IT organizations managing the database architectures or the data stores and any of the enterprise-wide user-facing applications,” said Steven Escaravage, vice president in Booz Allen Hamilton’s Strategic Innovation Group. “I think that’s changed over the last decade, where there’s been a greater focus on data governance, and so you also see IT organizations today managing the process and the systems used to govern data.”

Additionally, as more organizations start analyzing cross-functional data, it becomes apparent that the IT function is necessary.

“IT plays an important part in ensuring that these new and different kinds of data are in a platform or connected or integrated in a way that the business can use. That is the most important thing and something companies struggle with,” said Justin Honaman, a managing director in the Digital Technology Advisory at Accenture.

Where analytics talent resides varies greatly

There’s an ongoing debate about whether analytics talent should reside in the business units or in IT.  It’s common for departments to have their own business analysts, but data science teams, including data analysts, often reside in IT.

The argument in favor of a centralized analyst team is visibility across the organization, though domain-specific knowledge can be a problem. The argument in favor of decentralization is the reverse. Accenture’s Honaman said he’s seeing more adoption of the decentralized model in large companies.

Hybrid analytics teams, like hybrid IT, combine a center of excellence with dedicated departmental resources.

Hot analytics techs

Machine learning and AI are becoming popular features of analytics solutions. However, letting machine learning loose on dirty and biased data can lead to spurious results; the value of predictive and prescriptive analytics depends on their accuracy.

As machine learning-based applications become more in vogue, analytics success depends on “the quality of not just the data, but the metadata associated with it [that] we can use for tagging and annotation,” said Booz Allen Hamilton’s Escaravage. “If IT is not handling all of that themselves, they’re insisting that groups have metadata management and data management capabilities.”

Meanwhile, the IoT is complicating IT ecosystems by adding more devices and edge analytics to the mix.  Edge analytics ensures that the enterprise can filter meaningful data out of the mind-boggling amount of data IoT devices can collect and generate.
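At its simplest, the filtering role of edge analytics is forwarding only out-of-range readings instead of the full sensor stream. The thresholds and readings in this sketch are invented for illustration.

```python
def edge_filter(readings, low, high):
    """Keep only out-of-range readings at the edge, so the device forwards
    a handful of meaningful events rather than the entire sensor stream."""
    return [(i, r) for i, r in enumerate(readings) if not (low <= r <= high)]

# Seven temperature samples; only two fall outside the normal band.
readings = [21.0, 21.2, 21.1, 35.7, 21.0, 20.9, 3.2]
alerts = edge_filter(readings, low=10.0, high=30.0)
# alerts -> [(3, 35.7), (6, 3.2)]: two events sent upstream, five discarded
```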

In short, the analytical maturity of organizations can’t advance without IT’s involvement.

Just a Bit of Advice: Strategies for Successful Analytics

A few helpful hints as you move through your analytics journey.

If you’re just getting started on your data and analytics journey, think before you act.

Steven Escaravage of Booz Allen Hamilton noted, “I tell clients to take a step back before they invest millions of dollars.” Among other things, he said, make sure to have a good foundation around the questions you’re trying to answer today and the questions you perceive are coming down the path.

“Let’s put together a data wish list and compare it to where we’re at, because usually you’re going to have to make investments in generating data to answer questions effectively,” he added. All the other pieces about methods and techniques, tools and solutions follow these actions.

If you’re at the pilot stage, beware of scalability challenges.

“Very rarely for sophisticated analytic problems would I lean on a typical Python pilot deployment in production,” said Escaravage. “You’d typically move to something you knew could scale and wouldn’t become a bottleneck in the computational pipeline.”

If you’re in production, you may be analyzing all kinds of things, but are you measuring the effectiveness of your solutions, processes and outcomes? If not, you may not have the complete feedback loop you think you have.