Strategic Insights and Clickworthy Content Development


How to Teach Executives About Analytics

If your data is failing to persuade executives, maybe it’s not the data that is the problem. Here’s how to change your approach to fit the audience.

One of the biggest challenges data analysts and data scientists face is educating executives about analytics. The general tendency is to nerd out on data and fail to tell a story in a meaningful way to the target audience.

Sometimes data analytics professionals get so wrapped up in the details of what they do that they forget not everyone has the same background or understanding. As a result, they may use technical terms, acronyms, or jargon and then wonder why no one “got” their presentations or what they were saying.

They didn’t do anything wrong, per se; the problem is how they’re saying it and to whom.

If you find yourself in such a situation, following are a few simple things you can do to facilitate better understanding.

Discover What Matters

What matters most to your audience? Is it a competitive issue? ROI? Building your presence in a target market? Pay attention to the clues they give you and don’t be afraid to ask about their priorities. Those will clue you in to how you should teach them about analytics within the context of what they do and what they want to achieve.

Understand Your Audience

Some executives are extremely data-savvy, but the majority aren’t just yet. Dialogs between executives and data analysts or data scientists can be uncomfortable and even frustrating when the parties speak different languages. Consider asking what your target audience would like to learn about and why. That will help you choose the content you need to cover and the best format for presenting that content.

For example, if the C-suite wants to know how the company can use analytics for competitive advantage, then consider a presentation. If one of them wants to understand how to use a certain dashboard, that’s a completely different conversation and one that’s probably best tackled with some 1:1 hands-on training.

Set Realistic Expectations

Each individual has a unique view of the world. Someone who isn’t a data analyst or a data scientist probably doesn’t understand what the role actually involves, so they make up their own story, which becomes their reality. That reality probably includes some unrealistic expectations about what data-oriented roles can deliver or what analytics can accomplish generally.

One of the best ways to deal with unrealistic expectations is to acknowledge them and then explain what is realistic and why. For example, a charming and accomplished data scientist I know would be inclined to say, “You’d think we could accomplish that in a week, right? Here’s why it actually takes three weeks.”

Tell a Story

Stories can differ greatly, but the one thing good presentations have in common is a beginning, a middle, and an end. One of the mistakes I see brilliant people making is focusing solely on the body of a presentation, immediately going down some technical rabbit hole that’s fascinating for people who understand it and confusing for everyone else.

A good beginning gets everyone on the same page about what the presentation is about, why the topic is important, and what you’re going to discuss. The middle should explain the meat of the story in a logical way that flows from beginning to end. The end should briefly recap the highlights and help bring your audience to the same conclusion you’re stating in your presentation.

Consider Using Options

If the executive(s) you’re presenting to hold the keys to an outcome you desire, consider giving them options from which to choose. Doing that empowers them as the decision-makers they are. Usually, that approach also helps facilitate a discussion about tradeoffs. The more dialog you have, the better you’ll understand each other.

Another related tip is to make sure your options are within the realm of the reasonable. In a recent scenario, a data analyst wanted to add two people to her team. Her options were: A) if we do nothing, you can expect the same results; B) if we hire these two roles, we’ll be able to do X and Y, which we couldn’t do before; and C) if we hire five people, we’ll be able to do even more, but it will cost this much. She came prepared to discuss the roles, the interplay with the existing team, and where she got her salary figures. If they asked what adding one, three, or four people looked like, she was prepared to answer that too.

Speak Plainly

Plain English is always a wise guide. Choose simple words and concepts, keeping in mind how the meaning of a single word can differ. For example, if you say, “These two variables have higher affinity,” someone may not understand what you mean by variables or affinity.

Also endeavor to simplify what you say, using concise language. For example, “The analytics of the marketing department has at one time or another tended to overlook the metrics of the customer service department” can be consolidated into, “Our marketing analytics sometimes overlooks customer service metrics.”

Why Your Business May Not Be Ready for Analytics

Artificial intelligence is on the minds of business leaders everywhere because they’ve either heard or believe that AI will change the way companies do business.

What we’re seeing now is just the beginning. For everyone’s sake, more thought needs to be given to the workforce impact and how humans and machines will complement each other.

Recently, professional services company Genpact and FORTUNE Knowledge Group surveyed 300 senior executives from companies in the North American, European and Asia-Pacific regions with annual revenues of $1 billion or more. According to the report, “AI leaders expect that the modern workforce will be comfortable working alongside robots by 2020.”

However, getting there will require a different approach to organizational change.

“A bunch of people are thinking about AI as a technology. What they’re not thinking about is AI as the enabler of new enterprise processes, AI as an augmenter of humans in enterprise processes,” said Genpact Senior Vice President Gianni Giacomelli. “Right now, 70% of the effort is spent on technology, 20% on processes and 10% on humans as a process piece. I think that’s the wrong way to look at it.”

What is the right way to think about AI? At one end of the spectrum, people are touting all the positive things AI will enable, such as tackling some of our world’s biggest social problems. On the other end of the spectrum are Elon Musk, Stephen Hawking and others who foresee a dark future that involves unprecedented job losses if not human extermination.

Regardless of one’s personal view of the matter, business leaders need to be thinking harder and differently about the impact AI may have on their businesses and their workforces. Now.

How to think about the problem

The future’s trajectory is not set. It changes and evolves with technology and culture. Since AI’s end game is not completely foreseeable, one way to approach the problem, according to the survey, is to begin with the desired outcome, think about the processes required to achieve that outcome and then ponder how machines and humans can complement each other.

“Generally, the biggest impediment we see out there is the inability to create a portfolio of initiatives, so having a team or a number of teams coming back and saying, ‘These are the 50 things I could do with AI based on what AI is able to do today and in the next 12 months,’ and then [it’s up to senior management to] prioritize them,” said Giacomelli. “You need to have people going through the organization, unearthing places where value can be impacted.”

Over the last three decades or so, business leaders have been setting strategy and then implementing it, which isn’t going to work moving forward. The AI/human equation requires a hypothesis-driven approach in which experiments can fail fast or succeed.

“It’s a lot more about collective intelligence than let’s get a couple of experts and let them tell us where to do this. There are no experts here,” Giacomelli said.

Focus on the workforce

AI will impact every type of business in some way. The question is, what are business leaders doing to prepare their workforce for a future in which part or all of their jobs will be done by AI? According to the survey, 82% of the business leaders plan to implement AI-related technologies in the next three years but only 38% are providing employees with reskilling options.

“I think HR functions are completely backwards on this one,” said Giacomelli. “They haven’t started connecting the dots with what needs to be done with the employees.”

Some companies are already working on workforce planning, but they view AI as a means of materially reducing the workforce, such as by 20% or 30%, which Giacomelli considers “a primitive approach.”

“There are jobs that will go away completely. For example, people who do reconciliation of basic accounts, invoices, that kind of stuff,” he said. “Most of the jobs that will be impacted will be impacted fractionally, so part of the job is eliminated and then you figure out how to skill the person who does that job so she can use the machine better.”

What would people do, though? Most working professionals have various types of experience. The challenge for HR is to stop looking at a snapshot of what a candidate or employee is today and the prior experience that qualified them for that role, and instead consider an individual’s future trajectory. For example, some accountants have become sales analysts or supply chain analysts.

Looking for clues about what particular roles could evolve into is wise, but that does not provide the entire picture, since all types of jobs will either evolve or become obsolete in their current forms.

“I don’t feel that many people are looking at the human element of digital transformation and AI except fearful people,” said Giacomelli. “Every year, we will see people somewhere making sense of this riddle and starting to work in a different way. I think we need to change the way we look at career paths. We’ll have to look at them in a hypothesis testing way as opposed to have a super guru in HR who knows how AI will impact our career paths, because they don’t [know].”

The bottom line is that individuals need to learn how to learn because what AI can do today differs from what it will be able to do tomorrow, so the human-and-machine relationship will evolve over time.

Even if AI were still just a science fiction concept today, the accelerating pace of technology and business underscores the fact that change is inevitable, so organizations and individuals need to learn how to cope with it.

Don’t dismiss the other guy

AI proponents and opponents both have valid arguments because any tool, including AI, can be used for good or evil. While it’s true AI will enable positive industrial, commercial and societal outcomes, the transition could be extremely painful for the organizations and individuals who find themselves relics of a bygone era, faster than they imagined.

AI-related privacy and security also need more attention than they’re getting today because the threats are evolving rapidly and the pace will accelerate over time.

An important fundamental question is whether humans can ultimately control AI, which remains to be seen. Microsoft’s Tay Twitterbot demonstrated that AI can adopt the most deplorable forms of human expression, quickly. In less than 24 hours, that experiment was shut down. Similarly, a Facebook chatbot experiment demonstrated that AI is capable of developing its own language, which may be nonsensical or even undecipherable by humans. So risks and rewards both need to be considered.

 

One Point the Equifax Breach Drives Home

Today’s developers use more third-party and open-source components, libraries and frameworks than ever to deliver and update products in ever-shrinking delivery cycles. In the race to get to market, it’s easy to overlook or ignore details that can lead to a security breach.

For example, Equifax blamed its recent security breach on an Apache Struts vulnerability (CVE-2017-5638) and later admitted it failed to install a patch. That patch had been available for six months, according to The Apache Software Foundation.

“The Equifax hack is so interesting, mostly because their response to the hack has been so poor. Blaming a hack on any sort of software issue – open source or proprietary – is simply part of their inadequate response. It’s a ‘the dog ate my paper’ excuse,” said James Stanger, chief technology evangelist at CompTIA. “That’s not much of an explanation, especially considering that Equifax disclosed this problem on September 7 after knowing about it since July 29. ”

What if the software you built was compromised and you discovered that the root cause was a third-party building block you used? You didn’t build that piece of software, after all, so perhaps that party should be liable for any damages that piece of software caused.

Practically speaking, good luck with that argument.

Little or No Third-Party Liability

If you’re using third-party building blocks in your software, which you likely are, the buck stops with you. Sure, someone else’s code may have caused a catastrophic failure, but did you read the fine print in the license agreement?  Third-party developers have several ways of dealing with the matter contractually.

“There may be disclaimers, especially in the open source community, that say, ‘This component is [provided] as-is’ and you as the licensee are responsible for its testing and usage in another system,” said Roy Hadley, Jr., co-chair of the Privacy & Cybersecurity team at law firm Thompson Hine. “If you choose to use it in a nuclear facility or the space shuttle, that’s on you.”


Those who use third-party software in their products are ultimately responsible because the provider can’t foresee how its software will be used or configured by others. So, the licensor protects itself using an “as-is” clause or a limitation of liability. Alternatively, the licensor may require indemnity from the licensee, which means that if something goes wrong and the provider of the component you used gets sued, you’re liable.

What Software Developers Should Do

Test, test, test. Ideally, developers should take the time to understand every piece of third-party software they’re using to make sure it does what it’s supposed to do and that it’s been tested for security vulnerabilities. They should also have a mechanism to ensure that the associated updates and patches are applied promptly.
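
To make that mechanism concrete, here is a minimal sketch of an automated dependency audit in Python. Everything in it is illustrative rather than anyone’s actual process: the KNOWN_VULNERABLE table and the examplelib package are invented stand-ins for a real advisory feed, and most teams would lean on a dedicated scanner rather than a hand-rolled script.

```python
# Minimal sketch of a dependency audit: compare installed Python packages
# against a (hypothetical) list of versions with known vulnerabilities.
from importlib.metadata import distributions

# Stand-in advisory data; in practice this would come from a vulnerability feed.
KNOWN_VULNERABLE = {
    "examplelib": {"1.2.0", "1.2.1"},  # fictitious package and versions
}

def audit_installed_packages():
    """Return (name, version) pairs for installed packages on the advisory list."""
    findings = []
    for dist in distributions():
        name = (dist.metadata["Name"] or "").lower()
        if dist.version in KNOWN_VULNERABLE.get(name, set()):
            findings.append((name, dist.version))
    return findings

if __name__ == "__main__":
    for name, version in audit_installed_packages():
        print(f"UPDATE NEEDED: {name} {version} has a known vulnerability")
```

The value of even a toy check like this is that it can run in a build pipeline on every release, so a patch that has been available for months is far less likely to go unnoticed.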

“I think you have an absolute responsibility to make sure that third-party components work, work together and work the way they’re supposed to,” said Jason Wacha, an attorney and founder of WS Law Offices, which specializes in software licensing. “One of the things about the open source community is you hear [about a software vulnerability], they announce it and everybody jumps on it and tries to fix it. Certainly this was true for the Struts project. One of the things about proprietary software is if someone discovers a vulnerability, it’s not going to get out there and people aren’t going to talk about it.”

The obvious constraint is time. There just isn’t enough time to test everything.

“The issues we keep confronting or not confronting in the IT industry are ignoring or omitting key steps of the Software Development Lifecycle (SDLC) and then mismanaging how that resulting software is deployed,” said CompTIA’s Stanger. “One of the primary reasons why software issues get missed by the good guys and exploited by the bad guys is because companies, individuals and groups that develop software tend to rush software to market.”

There are also challenges with the way software is configured and deployed.

“Many IT pros and even security pros still tend to think, ‘If I obtain the code from a secure source and run the hash tag, I’ll be fine. I’ll just update the code as necessary.’ Plus, relatively few companies actually test the software properly in a production environment by ‘throwing rocks’ at the code and doing proper threat hunting,” said CompTIA’s Stanger. “Fewer still are able to update software adequately, because updates often break custom code. Imagine how security issues can propagate when you combine installed and cloud solutions.”
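
Stanger’s point about obtaining code from a secure source and checking the hash is easy to demonstrate. The sketch below, which uses a placeholder file name and digest, verifies a downloaded component against its published SHA-256 checksum; it illustrates that single step, not a full vetting process.

```python
# Minimal sketch: verify a downloaded component against its published SHA-256 checksum.
# The file name and expected digest are placeholders, not real values.
import hashlib

EXPECTED_SHA256 = "0" * 64  # copy the real digest from the project's release page

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    actual = sha256_of("third-party-component.tar.gz")
    if actual != EXPECTED_SHA256:
        raise SystemExit(f"Checksum mismatch ({actual}); do not deploy this artifact.")
    print("Checksum verified.")
```

As the quote suggests, though, a matching checksum only proves the artifact is the one its publisher shipped; it says nothing about whether that code is free of vulnerabilities or behaves safely once configured and deployed.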

While developers should verify that the third-party software they use has been adequately tested by whoever built it, they need to retest it in the context of their own product.


“The reality of the current world we live in is that any business must undertake extreme caution and implement a thorough due diligence process when vetting any vendor that impacts their supply chain or is processing or storing any information on its behalf,” said Rob Rosenzweig, vice president and national cyber practice leader for insurance brokerage Risk Strategies Company. “While there is significant upside to the utilization of outsourced vendors in managing expense, obtaining a higher level of security and realizing operational efficiencies; the flipside is that organizations lose control and still retain all of the risk.”

Lesson Learned

The Equifax breach underscores the need for vigilance because hackers are constantly experimenting to find and exploit vulnerabilities, particularly when sensitive information is involved. When a vulnerability is found, it needs to be addressed in a timely fashion, which did not happen at Equifax. In the wake of those events, the Federal Trade Commission (FTC) is now investigating the company.

As is evident, the failure to implement one patch can have devastating consequences.

Want to Succeed at Data-Driven Transformation? Start Slow

Data-driven transformation efforts often fail because companies are moving too fast or too slow. If they’re racing into it, they’re probably not getting the details right. If they’re trying to get everything into a perfect state before doing anything, they’re wasting precious time. Either way, ROI suffers.

If you want to strike the right balance, you have to think carefully about what you want to do and why you want to do it. You also have to move at an appropriate speed, which means balancing short-term and long-term goals.

Start small

Boston Consulting Group recently published an in-depth article about data-driven transformation. In it, the authors raise a lot of good points, one of which is taking small steps rather than giant leaps.

Taking small steps enables businesses to succeed more often, albeit on a smaller scale. If the project is a success, the money saved or earned can be used to fund subsequent steps. Small successes also help keep teams motivated. However, executing too many small projects simultaneously can be counterproductive.

“The first mistake I see is doing 200 proof of concepts. The second thing I see is people start to do a pilot, even at scale, but [they think] first we need a developer to reinvent the system and then we’ll get the value at the end,” said Antoine Gourévitch, a Senior Partner and Managing Director at BCG. “I’d rather find a quick and dirty way to connect the IT systems and get [some immediate value] rather than doing a full transformation. You can have a transformation over three to five years, but at least I need to do the connection between my pilot at scale and the dedicated systems for the data platform that’s needed to be of value as we go.”

The third challenge is prioritizing pilots or projects. Value is the criterion there. Without a prioritized roadmap, “cool” projects may take precedence over projects that deliver business value.

Three steps to better ROI

BCG offers a lot of good advice in its article, not the least of which is breaking short-term and long-term goals into a three-step process that enables quick wins while paving the way to data-driven transformation. The three steps are:

  • Use quick wins to fund the digital journey and learn
  • Design the company-wide transformation
  • Organize for sustained performance

Within those three steps, BCG specified actions companies should take. Upon close review, it’s clear that some of the recommended actions, such as “fail fast,” apply to more than one step. If you read BCG’s article and ponder the graphics, it will get you thinking about how to scale success in your organization.

BCG also presents a five-level transformation model that includes vision, use cases, analytics, data governance and data infrastructure. Gourévitch said data governance tends to be the most problematic because data isn’t viewed as a corporate asset, so data owners may hesitate to share it.

Bottom line

Companies often move too fast or too slow when becoming data-driven organizations. When they move too fast, they can overlook important details that cause initiatives to fail. When they move too slow, they risk losing competitive advantages.

Somewhere between 200 pilots and one massive transformation effort is a balance of short-term and long-term goals, investment and ROI. The challenge, as always, is finding the balance.

Analytics Ensure Safety in LA and White Plains

Security is top of mind when city CIOs think about the types of analytics they need. However, analytics is also enabling them to improve internal processes and the experience citizens and businesses have.

The City of White Plains, New York, stores its data in a data center to ensure security. The City of Los Angeles has a hybrid implementation because it requires cloud-level scalability. In LA, 240 million records from 37 different departments are ingested every 24 hours just for cybersecurity purposes, according to the city’s CIO, Ted Ross.

“We didn’t start off at that scale but [using the cloud] we’re able to perform large amounts of data analysis whether it’s cybersecurity or otherwise,” Ross said.

He thinks it’s extremely important that organizations understand their architecture, where the data is, and how data gets there and then put the appropriate security measures in place so they can leverage the benefits of the cloud without being susceptible to security risks.

“If you’re not doing analytics and you’re moving [to the cloud], it’s easy to think it will change your world and in certain [regards] it may. The reality is, you have to go into it with both eyes open and understand what you’re trying to accomplish and have realistic expectations about what you can pursue,” said Ross.

White Plains is on a multi-year journey with its analytics, as are its peers because connecting the dots is a non-trivial undertaking.

“Municipalities have a lot of data, but they move slowly,” said White Plains CIO Michael Coakley. “We have a lot of data and we are trying to get to some of the analytics [that make sense for a city].”

Departments within municipalities still tend to operate in silos. The challenge is eliminating those barriers so data can be used more effectively.

“It’s getting better. It’s something we’ve been working on for the last few years, which is knocking down the walls, breaking down the silos and being able to leverage the data,” said Coakley. “It’s for the betterment of citizens and businesses.”

Connecting data from individual departments improves business process efficiencies and alleviates some of the frustrations citizens and businesses have had in the past.

“If you’re a small business owner who bought a plot of land in White Plains and wants to [erect] a building, you could go to the Department of Public Works to get a permit, the Building Department to get a permit and the Planning Department to get a permit and none of those departments know what you’re talking about,” said Coakley. “With the walls being broken down and each department being able to use the data, it makes the experience better for the business or home owner.”

The city is also connecting some of its data sets with data sets of an authority that operates within the city, but is not actually part of the city.

“There’s a reason for their autonomy, but it’s important to start the dialog and show them [how connecting the data sets] will benefit them,” said Coakley. “Once you show the department what they can provide for you, and ensure it’s not going to compromise the integrity of their data, they usually come along. They see the efficiencies it creates and the opportunities it creates.”

In those discussions, it becomes clearer what data can be generated when data sets are used and shared, and what kinds of analytics become possible. Interconnecting the data sets creates opportunities for insights that were not previously possible or practical when the data generated in a department stayed in that department.

White Plains is trying to connect data from all of its departments so it can facilitate more types of analytics and further improve the services it provides citizens and businesses. However, cybersecurity analytics remain at the top of the list.

“Cybersecurity is number one,” said Coakley. “We have to worry about things like public safety, which is not just police, fire, and emergency, but also public works, facilities, water, electrical, and engineering. There’s a lot of data and the potential for a lot of threats.”

DevOps Not Working? Here’s Why.

DevOps can help organizations get better software to market faster, when it’s working. When it’s not working, development and operations teams aren’t working as a cohesive unit.  They’re operating as distinct phases of a software development lifecycle.

Part of the problem may involve tools. Either the tools still operate as silos or they don’t provide the kind of cross-functional visibility that DevOps teams require. However, a bigger task may be getting development and operations working together.

What makes DevOps even more challenging is that there’s no one right way to do it.  Of course, there are better and worse ways to approach it, so here are a few suggestions to consider.

Think before automating. Automation is part of DevOps, but it’s not synonymous with DevOps. While it’s true that automating tasks saves time, automation also accelerates the rate at which mistakes can be propagated.

“If you just automate things and you haven’t built the skills to handle high speed, you’re putting yourself in a place where friction and accidents can happen,” said Sean Regan, head of growth, software teams, at software development tool provider Atlassian (www.atlassian.com). “Before you automate everything, start with a culture. You’ll have happier developers, happier customers, and better software.”

Test automation is essential for DevOps, and to do that well, developers need to test their software in a production environment.

“DevOps is founded in automation. One of the first things organizations recognize is they need a dynamic infrastructure, which most people think is cloud,” said Nathen Harvey, vice president, Community Development, at DevOps workflow platform provider Chef Software (www.chef.io). “It doesn’t have to be cloud; it means you have compute resources available to developers and the people who are running your production organization.”

With the help of automation and developer access to production environments, DevOps teams are delivering software in days or weeks instead of months.
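
As one concrete illustration of that kind of automation, here is a minimal smoke test a delivery pipeline might run after every deployment. The health-check URL and the expected response fields are hypothetical; the point is the pattern, not the specifics.

```python
# Minimal sketch of a post-deployment smoke test a CI/CD pipeline might run.
# The URL and the expected JSON fields are hypothetical placeholders.
import json
import urllib.request

HEALTH_URL = "https://example.internal/service/health"

def check_health(url: str = HEALTH_URL, timeout: float = 5.0) -> None:
    """Fail loudly if the service is unreachable or reports an unhealthy status."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        assert resp.status == 200, f"unexpected HTTP status {resp.status}"
        payload = json.loads(resp.read().decode("utf-8"))
    assert payload.get("status") == "ok", f"service reported {payload.get('status')!r}"

if __name__ == "__main__":
    check_health()
    print("Smoke test passed.")
```

Run automatically on every change, a check like this surfaces problems in minutes, which matters precisely because automation also accelerates the rate at which mistakes can propagate.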

Cultivate a DevOps culture. Software teams that have gone through an Agile transformation remember they had to change their culture for it to succeed. The same is true for DevOps.

“You need to get your teams collaborating in a way they haven’t done before,” said Harvey. “It becomes much less about a hand-off and more about understanding the common goals we’re working towards.”

One indication of DevOps maturity is whether the shipment of software is considered an end or a beginning. Atlassian used to celebrate after a product shipped, which used to be common for software companies. Now Atlassian celebrates milestones hit after the release, such as the number of customers using a new feature within a given time frame.

Take a hint from web giants. A decade ago, web companies were embracing DevOps and figuring out how infrastructure could be managed as code.  Meanwhile, other companies were operating in business-as-usual mode.

“If you’re coming from a more traditional organization, the idea of managing infrastructure as code may still be new,” said Chef Software’s Harvey. “I think the best way to achieve success is to pull together a cross-functional team that cares about driving a particular business outcome, such as how to deliver this one change out to our customer.”
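
To illustrate the underlying idea rather than any particular tool’s syntax, here is a deliberately tiny Python sketch of the desired-state approach behind infrastructure as code: declare what a resource should look like, then converge toward it idempotently. The file path and contents are placeholders.

```python
# Toy sketch of the "desired state" idea behind infrastructure-as-code tools:
# declare what a resource should look like, then converge toward it idempotently.
from pathlib import Path

DESIRED_FILES = {
    Path("/tmp/demo-app/app.conf"): "listen_port = 8080\nlog_level = info\n",
}

def converge() -> None:
    """Create or update each declared file only if it differs from the desired state."""
    for path, content in DESIRED_FILES.items():
        path.parent.mkdir(parents=True, exist_ok=True)
        current = path.read_text() if path.exists() else None
        if current != content:
            path.write_text(content)
            print(f"converged {path}")
        else:
            print(f"{path} already in desired state")

if __name__ == "__main__":
    converge()  # running it a second time changes nothing
```

Because the definition lives in code, it can be reviewed, versioned, and applied identically across environments, which is part of what lets a cross-functional team treat infrastructure changes the same way it treats application changes.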

Cheat. Companies spend lots of time reinventing what works at other companies. Atlassian memorialized a lot of what it has learned in self-assessments and playbooks, so DevOps teams can identify and address the challenges they face.

“Customers are coming to us saying, ‘Give us playbooks, give us patterns, give us specific actionable ways to move toward DevOps,’” said Regan. “If you’re moving to DevOps, there’s usually an early stage where you wonder if you’re doing it right.”

Why IT is in Jeopardy

Some IT departments are struggling to prove their relevance as the pace of change continues to accelerate. On one hand, they’re responsible for their own predicament and on the other hand they’re not.

IT has been the master of change. On the other hand, what department wants to be responsible for its own demise? IT as a function isn’t dead, and it’s not going to be dead any time soon. However, IT is changing for good. Here’s why:

IT overpromised and under-delivered

Lines of business no longer want to wait for IT. They can’t. The competitive pressures are just too great to ignore. But, when something goes wrong with their tech purchases, who do they call?  IT.

“IT is in jeopardy because of the agreements or promises they’ve made to the business,” said David Caldwell, senior IT solutions consultant at Kaiser Permanente. “You can’t deliver on time, you can’t deliver what you promised and you can’t deliver reliable systems.”

What the business really wants is a dependable, enabling service that delivers what it promises.

Business expectations are too high

IT can’t be successful if the business leadership is viewing IT as a cost rather than an investment, which seems a bit strange, given the fact that today’s companies depend on technology for survival. Nevertheless, some businesses still have legacy cultural issues to overcome, one of which is realizing how value in their company is actually produced in this day and age.


Worse, even C-level information and technology executives may not be viewed as equals among business leaders, so they’re left out of important meetings. Rather than having a partnership between IT and the business, the business may tell IT what it wants and when, without understanding the entire scope of the problem or how difficult or complex the solution may be.

“They don’t consider that IT leadership can help you decide how you’re going to strategically differentiate your business,” said Caldwell. “If you don’t let them in, you’re missing out on a lot of valuable input.”

A related issue is budget. If IT isn’t given enough budget to be successful, can failures really be pinned on IT? Yet, over the past couple of decades, IT has been told to do more with less to the point where the model itself breaks down.

IT has enabled its own demise

IT had a specific role to play before the cloud, SaaS and Shadow IT were fashionable. They were the keepers of hardware, software and networks.

“IT brought the wave of innovation in, [and yet,] IT is under the same assault of things they were the perpetrators of,” said Greg Arnette, CTO at cloud information archiving company, Sonian. “IT is going through a metamorphosis that reduces the need to have as many in IT as in previous history.”

The adoption of cloud architectures and SaaS was fueled by the economic downturn of 2008 and 2009, which forced companies to view IT in terms of operating expenses rather than capital expenses.

“It was a perfect storm,” said Arnette. “Shadow IT was driven by business unit managers frustrated with their IT departments [so they] used their credit cards to sign up for Salesforce.com or go buy ZenDesk or any of these popular SaaS apps that have become the new back office systems for most companies.”

Never mind who purchased what, or whether it was bought with a purchase order or a credit card: when things go wrong, it’s IT’s job to fix it. That’s one way to provide the business with services, but probably not the model IT had in mind.

The CIO/CTO role is changing

There are plenty of CIOs and CTOs, but some of them are being moved into new roles such as Chief Data Officer, Chief Analytics Officer or Chief Innovation Officer. Whether these roles are a reflection of The Brave New World or whether they’re ultimately too narrow is a debatable point.

“It’s not such a focus on information. It’s now analytics, data wrangling and a focus on innovation as a key way IT can help customers do more,” said Arnette. “I think that’s where IT will come back, but it won’t be the same type of IT department.”

Indeed. Traditional hardware and enterprise software management are being usurped by IaaS and SaaS alternatives. It’s true that a lot of companies have hybrid strategies that combine their own systems with virtualized equivalents and that some companies are managing all of their own technology, but the economics of the virtual world (when managed responsibly) are too attractive to ignore over the long term.

Why Collaboration is Critical in Technology Acquisition

Technology teams and lines of business are often seduced by cool new technologies, products, and services. The business is drawn to promises of better insights, higher productivity, improved economics, and ease of use. IT is drawn to increasingly powerful technologies that enable the team to more effectively implement and manage an increasingly complex ecosystem.

However, the art of the possible often overshadows what’s practical or what the business is trying to achieve.

Solutions architects can help better align technology acquisition with business goals, albeit not single-handedly. They need to collaborate with the business, IT, and vendors to orchestrate it all.

“A solutions architect has a foot in enterprise architecture, a foot in business program management, and a foot in vendor product management,” said Dirk Garner, principal consultant at Garner Software. “We help determine what the business needs are and align the right technologies and products. Solutions architecture is really at the center of those things.”

Architecting the right solutions

Sound technology acquisition starts with a business problem or goal. Then, it’s a matter of selecting “the right” technologies and products that will most effectively solve the business problem or help the business achieve its goal.

“So often we do it backwards, we say we have this technology so let’s do this,” said Garner. “Once you understand the business environment, it’s assessing the current state of technology and then taking a look at what you actually need to pursue opportunities and survive in the business environment.”

Given the fast pace of business today, there’s an inclination to just acquire technology now. However, there are often trade-offs between short-term pain relief and a longer-term benefit to the business. A sounder approach is to compare current capabilities with the capabilities required and then define a roadmap for getting there.

“The number one challenge is that people are myopic,” said Garner. “Vendors focus on how great their product is [rather than] what the customer needs. The business always comes to the table with unrealistic expectations – how little money they want to spend and how fast they want things delivered.”

Since IT can’t meet those expectations, lines of business purchase their own technology, not realizing that they’ll probably need IT’s help to implement it.

“You hear a lot about collaboration today but when you talk to these people, they’re still siloed,” said Curt Cornum, VP and chief solution architect at global technology provider Insight Enterprises. “Even within the IT department, when you get into those types of conversations they’re not talking to each other as much as they should.”

The persistent silos are keeping businesses from meeting their goals and staying competitive. Meanwhile, their agile counterparts are pulling ahead because their business and IT functions are working in unison. Collaboration is critical.

Rapid Tech Change Challenges IT Leaders

Faster technology innovation and competitive pressures are taking their toll on IT. Gone are the days when IT procured and managed all of an organization’s technology. The reason: IT can’t deliver fast enough on what individual operating units need.

To help their companies stay competitive, IT departments are evolving from centralized organizations to hub-and-spoke organizations that serve individual operating units and the enterprise simultaneously. But even then, keeping up with the latest technologies is challenging.

“Things are progressing at such an exponential rate that it’s tough to keep up, and you’re a little more uneasy about the decisions you make,” said Steve Devine, director of IT at international law firm O’Melveny. “Solutions are being developed so quickly and hitting the market so quickly that it’s much harder to differentiate between the solutions that are coming out.”

Part of the problem is the technology landscape itself. Everything runs on software today, including businesses and hardware. Much of that software is developed in an Agile fashion so it can be delivered faster, in weeks or months versus years. The result is often a minimum viable product that is continually enhanced over time, versus a traditional product that includes more features out of the gate, albeit at a much slower pace.

The cloud has also helped accelerate the pace of software innovation and the economics of software innovation because software developers no longer have to build and maintain their own infrastructure. They can buy whatever they need on demand which speeds software testing and DevOps, further accelerating software delivery.

The on-demand nature of the cloud and the shift to minimum viable products lower the barrier to market entry, which means the number of vendors in virtually every product area has exploded, and so has the number of products hitting the market.

Keeping up with all of that challenges even large IT departments.

Security is front and center

IT departments have always had some security element, but with the growing number and types of threats, they are necessarily expanding their capabilities. That means changes such as adopting more types of security products and services, and having talent on hand that understands all the details.

“With so many outsiders trying to hack into systems, even if you understand security systems, the technology is always changing,” said Jermaine Dykes, senior IT project manager, Wi-Fi Strategy & Operations, at telecommunications infrastructure company Mobilitie.

O’Melveny’s Devine said his company’s IT department has evolved from a “keeping-the-lights-on” type of shop to a security-focused organization in which members maintain expertise in their specific areas.

“Retaining talent is really key with all the emphasis on security, machine learning and AI,” said Devine. “People in that world are very hard to find and very hard to keep.”

Enabling analytics is critical

As more businesses become insight-driven, IT organizations need to provide a solid, governed foundation for data usage that can be leveraged by different parts of the organization as necessary. That way, departments and lines of business can access the data they need without exposing the enterprise to unnecessary risks.

“Big data is huge. Gone are the days when we used a huge server and IT was considered overhead,” said Mobilitie’s Dykes. “Today’s IT leaders need to have a vision about how they can incorporate data analytics to propel their organizations into the 21st century.”

More analytics solutions use machine learning and AI to improve the quality of insights they deliver, but quite often the hype about the solutions outpaces their actual abilities.

“The healthcare industry uses machine learning for diseases and things of that nature, but if you look at other industries, it’s basically nowhere,” said O’Melveny’s Devine. “The early adopters pay a price because you spend a lot of cycles getting something like that implemented and a lot of times it’s just a non-starter once you’ve gone through all that.”

Six Ways to Master the Data-Driven Enterprise

As seen in InformationWeek.

Big data is changing the way companies and industries operate. Although virtually all businesses acknowledge the trend, not all of them are equally prepared to meet the challenge. The companies in the best position to compete have transformed themselves into “data-driven” organizations.

Data-driven organizations routinely use data to inform strategy and decision-making. Although other businesses share the same goal, many of them are still struggling to build the necessary technological capabilities, or their culture is interfering with their ability to use data, or both.

Becoming a data-driven organization isn’t easy, however. In fact, it’s very difficult. While all organizations have a glut of data, their abilities to collect it, cleanse it, integrate it, manage it, access it, secure it, govern it, and analyze it vary significantly from company to company. Even though each of these factors helps ensure that data can be used with higher levels of confidence, it’s difficult for a business to realize the value of its data if its corporate culture lags behind its technological capabilities.

Data-driven organizations have extended the use of data across everyday business functions, from the C-suite to the front lines. Rather than hoping that executives, managers, and employees will use business intelligence (BI) and other analytical tools, companies that are serious about the use of data are training employees, making the systems easier to use, making it mandatory to use the systems, and monitoring the use of the systems. Because their ability to compete effectively depends on their ability to leverage data, such data-driven organizations make a point of aligning their values, goals, and strategies with their ability to execute.

Below are the six traits common to data-driven organizations that make them stand out from their competitors.

Forward Thinkers

Data-driven enterprises consider where they are, where they want to go, and how they want to get there. To ensure progress, they establish KPIs to monitor the success of business operations, departments, projects, employees, and initiatives. Quite often, these organizations have also established one or more cross-functional committees of decision-makers who collectively ensure that business goals, company practices, and technology implementations are in sync.

“The companies that have integrated data into their business strategies see it as a means of growing their businesses. They use it to differentiate themselves by providing customers with better service, quicker turnaround, and other things that the competition can’t meet,” said Ken Gilbert, director of business analytics at the University of Tennessee’s Office of Research and Economic Development, in an interview with InformationWeek. “They’re focused on the long-term and big-picture objectives, rather than tactical objectives.”

Uncovering Opportunities

Enterprises have been embracing BI and big data analytics with the goal of making better decisions faster. While that goal remains important to data-driven enterprises, they also are trying to uncover risks and opportunities that may not have been discoverable previously, either because they didn’t know what questions to ask or because previously used technology lacked the capability.

According to Gartner research VP Frank Buytendijk, fewer than half of big data projects focus on direct decision-making. Other objectives include marketing and sales growth, operational and financial performance improvement, risk and compliance management, new product and service innovation, and direct or indirect data monetization.

Hypothesis Trumps Assumption

People have been querying databases for decades to get answers to known questions. The shortcoming of that approach is assuming that the question asked is the optimal question to ask.

Data-driven businesses aim to continuously improve the quality of the questions they ask. Some of them also try to discover, through machine learning or other means, what questions they should be asking that they have not yet asked.

The desire to explore data is also reflected in the high demand for interactive self-service capabilities that enable users to adjust their thinking and their approaches in an iterative fashion.
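
As a modest, hypothetical illustration of that kind of question discovery, the sketch below uses made-up numbers to scan a small table of operational metrics for strongly correlated pairs that might be worth asking about; real exploration would go well beyond pairwise correlation.

```python
# Illustrative only: flag strongly correlated metric pairs as prompts for new questions.
# The data below is fictitious; in practice it would come from the organization's systems.
import pandas as pd

df = pd.DataFrame({
    "support_tickets":   [120, 95, 140, 80, 160, 110],
    "churned_customers": [14, 9, 18, 7, 21, 12],
    "marketing_spend":   [50, 60, 45, 70, 40, 55],
})

corr = df.corr()  # pairwise Pearson correlations
for i, a in enumerate(corr.columns):
    for b in corr.columns[i + 1:]:
        r = corr.loc[a, b]
        if abs(r) > 0.8:
            print(f"{a} vs {b}: r = {r:.2f} (worth a closer question?)")
```

The output is not an answer; it is a prompt for a better question, which is the point of iterative, self-service exploration.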

Pervasive Analytics

Data analytics has completely transformed the way marketing departments operate. More departments than ever are using BI and other forms of analytics to improve business process efficiencies, reduce costs, improve operational performance, and increase customer satisfaction. A person’s role in the company influences how the data is used.

Big data and analytics are now on the agendas of boards of directors, which means that executives not only have to accept and support the use of the technologies, they also have to use them — meaning they have to lead by example. Aberdeen’s 2014 Business Analytics survey indicated that data-driven organizations are 63% more likely than the average organization to have “strong” or “highly pervasive” adoption of advanced analytical capabilities among corporate management.

Failure Is Acceptable

Some companies encourage employees to experiment because they want to fuel innovation. With experimentation comes some level of failure, which progressive companies are willing to accept within a given range.

Encouraging exploration and accepting the risk of failure that accompanies it can be difficult cultural adjustments, since failure is generally considered the opposite of success. Many organizations have made significant investments in big data, analytics, and BI solutions. Yet, some hesitate to encourage data experimentation among those who are not data scientists or business analysts. This is often because, historically, the company’s culture has encouraged conformity rather than original thinking. Such a mindset not only discourages innovation, it fails to acknowledge that the failure to take risks may be more dangerous than risking failure.

Data Scientists And Machine Learning

Data-driven companies often hire data scientists and use machine learning so they can continuously improve their ability to compete. Microsoft, IBM, Accenture, Google, and Amazon ranked first through fifth, respectively, in a recent list of 7,500 companies hiring data scientists. Google, Netflix, Amazon, Pandora, and PayPal are a few examples of companies using machine learning with the goal of developing deeper, longer-lasting, and more profitable relationships with their customers than previously possible.
