Strategic Insights and Clickworthy Content Development

Computer History May Not Be What You Think

When many of us ponder computer history, we think of Steve Jobs, Bill Gates, and other high-profile white men who changed the course of history through innovation and shrewd business tactics. Of course, there have been other significant contributors along the way who are not white, including Guy Kawasaki and Jerry Yang, but how much have we really heard about the computer industry contributions made by African-Americans?

For the most part, precious little, if anything. However, that may change with the help of Arvid Nelsen, IEEE Computer Society member, Southern Methodist University rare books and manuscripts librarian, and contributor to the IEEE Annals of the History of Computing.

“I look at historical understanding as something that evolves over time as new information comes to light or as we examine the past through the lens of different priorities and values,” said Nelsen, in an email interview. “Scholars are just beginning to scratch the surface in respect to persons of color. I think these efforts add to history by examining previously ignored, overlooked, invisible, and perhaps devalued evidence. I hope that means the development of a more complete, complex, and nuanced understanding of history.”

Is Computer History Revisionist History?

What if everything we know about the computer industry isn’t entirely correct?  In today’s global business environment, innovation, disruption and contributions can come from anywhere. However, it may be that African-Americans still remain in the shadows rather than the limelight, at least in the US.

But what, exactly, have we in the computer industry missed? More work needs to be done to answer that and other questions.

Unearthing African-Americans’ computer industry contributions won’t be an easy task because there’s a lack of archival source material. In Nelsen’s recent IEEE Annals of the History of Computing article, he writes, “Archives and libraries should undertake to identify and collect materials from persons of color. Meanwhile, scholars may find material in nontraditional sources, and prosopography may prove useful for examining computer professionals of color.”

One non-traditional source is Ebony magazine, which lists at least 57 African-Americans working in various computing fields between 1959 and 1996.

“I hope that the article encourages historians who are interested in critically examining race in computing simply to start looking for stories. They are out there,” said Nelsen. “I provide a number of examples and specifically encourage the examination of publications by and for particular communities, publications which may have been previously considered out-of-scope in contrast to scientific and professional publications.”

Why Computer Industry History Lacks Color

Racism was rampant in the computer industry’s early days. Perhaps it’s less obvious to some of us now, given the diversity of today’s high-tech workforce. However, racism is still alive and well, despite greater workforce diversity.

To align contributions with contributors, Nelsen thinks historians need to understand the development of the computer industry, as well as the specific technologies that comprise the computer industry.

One of Nelsen’s articles inspired a letter from a retired professor who had worked for Burroughs Corp. While at Burroughs, some of his African-American colleagues developed new hardware and software, including the operating system for the Burroughs B5000 and B5500 mainframe computers.

“I hope my article will inspire readers to reach out with their own stories to scholars and to archives like the Charles Babbage Institute with papers and other source materials,” said Nelsen.

The Time is Ripe for Change

The movie Hidden Figures, based on the book of the same name by Margot Lee Shetterly, helped raise at least partial awareness that the accomplishments of African Americans in the computer industry have indeed been ignored, forgotten or overlooked. The book and the movie focus on mathematicians Mary Jackson, Katherine Johnson and Dorothy Vaughan, all of whom worked for the National Aeronautics and Space Administration (NASA).

“The contributions of these three women were essential to both the Space Race and the development of the computing disciplines, and have been shamefully neglected,” wrote Nathan Ensmenger, Editor in Chief of the IEEE Annals of the History of Computing, in his own commentary. “[A]s we begin to incorporate race and ethnicity into our scholarship, we will discover new insights, methods, and perspectives that will radically reshape the focus of our discipline.”

How the focus of our discipline may change as the result of such research remains to be seen. As both Nelsen and Ensmenger note, the task won’t be easy, but it’s a necessary endeavor.

Tips for Getting Your Company Started with Analytics

In today’s fast-paced economy, businesses need access to insights faster than before. While periodic reporting still has its place, organizations are looking for deeper and more timely insights that can help them make better decisions, cut costs, improve efficiencies, reduce risks and drive more revenue.

It’s pretty obvious from all the hype around the topic that many powerful things can be done with analytics, but it isn’t always obvious where one should begin. We asked some experts — they or their firms presented in the Data & Analytics Track at Interop ITX this month — where they thought businesses should start, and here’s what they had to say.

Prove value, not concepts

Some organizations spend too much time trying to get everything in a perfect state before using analytics, which wastes valuable time and overcomplicates what could be a simple beginning. The best way to start is to choose a project that has the potential to demonstrate value without requiring a lot of extra work or heavy investment.

“When we build something that returns value to the organization right away, people start buying in because they see the ROI and the other types of value it provides like efficiencies in the workforce, short time to solution or short time to value,” said Kirk Borne, principal data scientist at Booz Allen Hamilton. “It can be something simple.”

For example, a financial services company managed to save $1 billion simply by analyzing web clicks, Borne said.

“They already had web analytics in place. They just needed to pay attention to what the signals were telling them,” said Borne. “It doesn’t have to be a complicated model or involve complicated data to prove value.”

Analytics can start anywhere

Analytics can begin at any level in an organization, whether it’s an executive who wants to answer a strategic question or a line-of-business manager or staff member who needs to solve a tactical problem.

Five years ago, the Association of Schools and Programs of Public Health (ASPPH) assigned data analytics as a part-time job to the employee who is now its director of data analytics and to one other staff member. The association had been sending members periodic reports, but as the volume of data grew, it became obvious ASPPH could provide more value to its members with analytics.

“We started out small, providing a few dashboards to our members,” said Christine Plesys, director of data analytics at ASPPH. “They didn’t have to wait four months to receive a report.”

Now ASPPH is teaching its members about the best practices in data analytics and how to use the data ASPPH provides for strategic planning and internal benchmarking purposes. In addition, its data analytics staff has grown to four full-time employees.

Get an executive sponsor

First efforts can be difficult to get off the ground if no one at the executive level understands the potential value of the project. To minimize that challenge, it’s wise to have an executive involved who will help the project succeed.

“One of the things some people miss is it’s not just a chief data officer or a data scientist you should be talking to,” said Booz Allen Hamilton’s Borne. “Sometimes it’s the chief financial officer or chief marketing officer because those people hold the purse strings on the kinds of investments that need to be made.”

Expect the unexpected

When people receive reports, they often have more questions that require a new report to be built. Analytics dashboards enable individuals to explore data in a more iterative fashion, which needs to be considered when launching a first attempt.

“A lot of people think analytics is a new word for data warehousing or business intelligence and then they try to run their [analytics] project the same way,” said Karen Lopez, senior project manager and architect at data project and data management consultancy InfoAdvisors. “At a base level, you’re building a Q&A system because you don’t know [all of the questions] you’re going to ask.”

It’s also difficult to anticipate what unexpected circumstances might arise, especially when launching an analytics project without the help of an expert. For example, it may be difficult to get the necessary data from IT or from another department because they don’t want to share it.

People new to analytics also tend to overlook data quality. Poor data quality can cause spurious analytical results, and perfect data quality is virtually unattainable. It’s wise to understand the tradeoffs between data quality and the time and expense it takes to get data into a state that’s “good enough” for the purposes it will serve. An expert can help you find that balance.
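To make the “good enough” idea concrete, here is a minimal sketch of a fitness-for-purpose check. The field names and the 95% completeness threshold are hypothetical; a real project would profile the specific columns that matter to the analysis at hand.

```python
def profile_quality(rows, required_fields):
    """Return the share of rows with every required field populated."""
    if not rows:
        return 0.0
    complete = sum(
        1 for row in rows
        if all(row.get(field) not in (None, "") for field in required_fields)
    )
    return complete / len(rows)

def good_enough(rows, required_fields, threshold=0.95):
    """Decide whether the data is fit for purpose rather than perfect."""
    return profile_quality(rows, required_fields) >= threshold
```

The point of the threshold parameter is the tradeoff described above: rather than chasing 100% clean data, you decide up front how much incompleteness the intended use can tolerate.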

Bottom line

Too often, first attempts are derailed by overcomplicating the problem or attempting to solve a problem that is too complex to be solved well with existing team members and tools. The best first analytics project is one that can demonstrate value quickly and cost-effectively. If you succeed, it will be easier to make a case for follow-on projects. If you fail, you’ll learn a lot without wasting months or years and several million dollars.

3 Data Governance Challenges Today’s Companies Face

Some organizations have mastered data governance, but they are in the minority. As data volumes continue to grow, most businesses are finding it hard to keep up.

“You’re going to do this one way or another,” said Shannon Fuller, director of data governance at Carolinas Healthcare System. “You can do it in a controlled, methodical manner or you can do it when your hair’s on fire.”

Poor data governance can result in lawsuits, regulatory fines, security breaches and other data-related risks that can be expensive and damaging to a company’s reputation. “We don’t have regulation about data lineage and reporting and all that, but it’s going to come,” said Fuller. “Do you want to prepare for that now or do you want to be like Bank of America and spend billions of dollars complying with the law?  Most healthcare organizations don’t have that kind of cash lying around.”

Another problem is legal discovery. Without proper data governance, companies end up handing over information that is not relevant to the case.  Some of that information may be sensitive.

There are valid reasons why companies are struggling with data governance.  Following are three of them.

1. It’s considered a technology problem. Effective data governance requires the use of good tools; however, the use of good tools does not guarantee effective data governance.  Some companies find this out the hard way when they invest in technology but fail to make the necessary adjustments to their culture and business processes.

“The common wisdom is you need an executive sponsor and the support of the C-suite and roll that down. It helps, but there are things you can do from a data governance perspective without having that buy-in,” said Fuller in a recent interview. “It has to be tied to your business processes. In healthcare, that’s one of the biggest stumbling blocks.”

2. Old approaches are applied to new requirements. Data governance policies and procedures require updating as more data flows into and out of organizations. Nevertheless, some companies are trying to apply concepts and constructs developed decades ago to modern requirements, which doesn’t work well.

“I hear people say, ‘This is what I get out of my relational database so why can’t I just use it for everything?’ You’re forcing this rigid structure because it makes people feel warm and fuzzy,” said Jim Scott, director of converged data platform provider MapR.  “It’s dangerous when people have the myopic perspective of governing data the same old way they always have.”

Yet, some of those very organizations are now planning to add streaming data from IoT devices to the mix.

3. The value of data is not understood. Some businesses are throwing every piece of data into a data lake, hoping that it will have value someday.  Other companies are deciding what to keep and what to throw away based on current requirements.  When those requirements change, they may regret some of those decisions.

“One of the challenges I hear often is how do you assign value to different datasets because that might impact how you think about your governance policies,” said Sanjay Sarathy, CMO at Talena.  “How do I leverage data coming out of IoT streams versus the marketing folks who leverage media data?  Thinking through the value of these different datasets will enable you to define how you govern them, cleanse them, and protect them.”

Assigning value can be a difficult challenge, however. Some organizations don’t know where to start. Others struggle to assign an accurate value when the value is both qualitative and quantitative. Even if organizations are able to get the value right and get data governance right, what’s “right” may change when a merger or acquisition happens.

In short, data governance isn’t a static thing; it’s an evolving mindset that requires cultural and technological support along the way to succeed.

Don’t Let Outliers Sabotage Your Cybersecurity Analytics

Cybersecurity analytics solutions are becoming more intelligent and nuanced to understand anomalous behavior that’s outside the norm and potentially dangerous. Identifying outliers is important, but not every outlier is a threat, nor is every threat an outlier.

“Companies have made hundreds of millions of dollars building tools that look for behavior that’s outside a rule or a set of parameters,” said Jason Straight, SVP of Cyber Risk Solutions and chief privacy officer at legal outsourcing services provider United Lex. “For machines, that works pretty well; for people, it doesn’t.”

Tracking behavior at the machine level can be as simple as monitoring the number of packets sent to and from a particular machine.
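As a rough illustration of that machine-level monitoring, here is a sketch of a z-score check that flags a machine whose current packet count deviates sharply from its own history. The threshold and the baseline approach are assumptions for illustration, not any vendor’s actual method.

```python
import statistics

def flag_outlier_machine(history, current, z_threshold=3.0):
    """Flag a machine whose current packet count is an outlier
    relative to its own historical baseline (simple z-score)."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        # A perfectly flat history: any change at all is anomalous.
        return current != mean
    return abs(current - mean) / stdev > z_threshold
```

This is the kind of check that works well for machines precisely because their traffic patterns are stable; as the article notes, the same approach breaks down for people.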

Humans behave differently in different contexts. For example, many of us usually work at a particular office Monday through Friday during “normal” work hours. However, if we’re traveling internationally, we’re probably accessing the same corporate network, albeit at a different time from a different IP address that’s located somewhere else in the world.

A rule-based system could be programmed to disallow network access under those conditions, but traveling professionals wouldn’t get much work done. The trick is to balance the needs of users and the business against potential threats.

“Instead of setting a bunch of rules that say if someone logs in from an IP address that they’ve never used before, at a time they’ve never logged in before, and they’re accessing part of the network they’ve never used before, that’s a complicated rule that would require constant updating and it would be impossible to manage on a person-by-person basis,” said Straight.

User Behavior Analytics Can Help

Enterprise security budgets have been heavily focused on keeping outside threats at bay, but more enterprises are realizing that to protect their assets, they need to assume that their network has been hacked and that there’s an active intruder at work.

Similarly, when the average person thinks about a cybersecurity breach, hackers come to mind. However, insiders are a bigger problem. In addition to being responsible for more security breaches than hackers, insiders fail their companies both accidentally and willfully.

“If I see a server doing something funny, I can shut it down, take it offline, or reroute the traffic, which doesn’t disrupt an organization much or at all,” said Straight. “If I do that to people, that could be really disruptive.”

User behavior analytics are an effective mechanism for insider threats because they’re able to model a user’s behavior. For example, when an employee is getting ready to leave a job, that person usually visits certain websites and updates her resume, which isn’t the best use of company assets, but it doesn’t justify security intervention. However, when that employee starts downloading files to USB drives, uploading files to file-sharing services, and printing volumes of information, intervention may be necessary.

Monitoring a single user doesn’t always tell the entire story, however, which is why user behavior analytics enable users to see what an individual is doing within the context of a group. For example, if someone in marketing accessed a part of the network she’s never visited before, that’s strange. Whether it actually requires action or not may depend on whether others in her department have accessed that same part of the network and if so, when.
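The group-context idea above can be sketched in a few lines. This is a simplified illustration, not a real UBA product’s logic; the access-log structure and user names are made up.

```python
def needs_review(user, resource, access_log, peers):
    """Flag an access as unusual only if neither the user nor her
    department peers have touched the resource before.

    access_log maps each user to the set of resources they have accessed.
    """
    if resource in access_log.get(user, set()):
        return False  # the user has been here before: normal
    peer_hits = any(resource in access_log.get(p, set()) for p in peers)
    return not peer_hits  # novel for the user AND the group: review it
```

For example, a marketer visiting a share her whole team uses would not be flagged, while the same marketer visiting a share nobody in her department has ever touched would be.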

While such capabilities sound attractive, many organizations are failing to get the value they expected from user behavior analytics, despite spending seven figures, because they don’t know how to handle the alerts and intelligence, Straight said.

User behavior analytics can also help determine whether someone’s login credentials have been stolen. Unlike traditional rule-based systems, they use machine learning and AI to model an authorized user’s behavior, and that behavior is associated with the person’s login credentials. If someone else tries to use the same user ID and password, her behavior indicates the account has been compromised.
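One crude behavioral signal of a stolen credential is an account suddenly touching far more unfamiliar hosts than its learned profile. The sketch below illustrates the idea only; the threshold and host names are invented, and real systems model many signals at once.

```python
def looks_compromised(known_hosts, session_hosts, max_new=5):
    """Flag a session whose count of never-before-seen hosts exceeds a
    threshold -- a rough proxy for an account scanning the network."""
    new_hosts = set(session_hosts) - set(known_hosts)
    return len(new_hosts) > max_new
```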

“That’s when you start to see an account that’s never really used more than a departmental server suddenly scanning the entire network, trying to get into different places and being denied access,” said Straight.

Think First

Before investing in a new security tool, it’s essential to understand the problem you’re trying to solve, which is true of any technology. Different security tools serve different purposes.

“Do you want to understand problems you haven’t identified or are you trying to prevent data leakage?” said Avivah Litan, vice president and distinguished analyst at Gartner. “You have to be real clear, and then you also need to spend some time training the models and supervising them.”

Data-Driven Effectiveness Is A Team Sport

Most companies are trying to understand how they can make the best use of their data. They’ve invested in tools and they’ve invested in people, but the results continue to fall short of expectations. Competitors are stealing customers, disruptors are upsetting the natural order of things, and business as usual is showing diminishing returns.

Why companies fall so far behind or advance so fast isn’t always obvious, but there’s one thing that separates the leaders from the laggards: The leaders have integrated data-driven decision-making into their culture and business processes. In fact, their ability to use data effectively is part of their core competency.

“Legacy processes and procedures have led to really siloed organizations,” said Rich Wagner, CEO of business performance forecasting solutions provider Prevedere. “The analysts within each function all operate differently. They use different tools, different techniques, and different technologies to build their business plans to run the business so they’re not an integrated group.”

How to Make Teamwork Work

One sign of analytical maturity is the effectiveness of cross-functional problem solving. IT likely has the data, the data team needs to surface insights for the business, and the business has to be confident that the decisions it makes advance its objectives. In today’s rapidly changing business environment, cross-functional teams are necessary because their collective knowledge and skills enable more effective problem solving, faster.

“You need to have a team that’s focused on a shared understanding of what the business problem is and what the objective is. Do not pass go until you do that because it’s a recipe for disaster,” said Chris Mazzei, chief analytics officer at professional services organization EY (Ernst & Young).

Unifying efforts doesn’t just happen. Business professionals need to understand what the data team and IT do and vice versa, which is best accomplished by working together to solve a business problem.

“[T]ask a team to solve a pretty big problem with a tight deadline and let each of them see the value that the other brings,” said Prevedere’s Wagner.

Success may also require some self-motivation. There’s significant value in spending time with members of the team that have different areas of expertise. By working together and being inquisitive, individuals can learn more about how other functions operate and why they operate that way, which is essential. Without that, important details may be overlooked.

For example, one of EY’s telecom clients wanted to improve its customer retention model. So the analytics team built a new model that could accurately identify customers who would leave within two weeks. That’s impressive, but marketing and sales needed four to six weeks to intervene.

“Nobody asked the marketing and sales team how far in advance they needed to know [a customer was leaving],” said EY’s Mazzei. “We see that all the time.”

One of the cheapest ways to understand what works and what doesn’t is to hear what other companies have done right and wrong.

Harnessing the Disruptive Power of Analytics

There are two things that distinguish companies: what they say and what they do. Business theoreticians, marketers, and even research firms use buzzwords to sell books, products, and reports, respectively. Whenever a particular buzzword such as “disruptor” becomes popular, a lot of companies choose to use it whether it actually applies or not.

There are many ways to be disruptive, not all of which change the world like computers, smartphones, social networks, and self-driving cars do. Sometimes being disruptive isn’t just about being innovative, it’s a matter of opportunity and timing.

Disruptor or Disrupted?

Digital natives continue to impact the business models of companies in a growing number of industries as more things in the physical world are replicated or reimagined using ones and zeros. Who would have imagined that the largest bookseller would have no physical stores or that the largest taxi company would own no cars?

Today’s disruptions tend to be technology-enabled, and often a confluence of technologies is necessary for the business model to succeed. For example, connected cars and even fitness trackers require a combination of sensors, computing power, adequate bandwidth, and cloud agility. As history has demonstrated, a game-changing idea has a better chance of succeeding if there’s a practical way to implement it at scale.

For example, mobile advertising was expected to explode at the beginning of the millennium, but the cellular networks were comparatively slow, certainly not fast enough to support rich media. People carried cellphones back then, not smartphones, and there was a much smaller installed base of users. Sometimes, great ideas fail because the timing is wrong.

Meanwhile, Amazon has expanded from books to all sorts of product categories, threatening the viability of many types of storefronts. However, it has failed at several me-too initiatives that were similar to disruptive offerings from Apple, PayPal, Groupon, and others. Disruptors are not immune from disruption, nor are disrupted companies incapable of disruption.

For example, GE successfully reinvented itself and is innovating in the industrial IoT space. Big pharma companies are working closely with their competitors on R&D projects (outside of consortia work) out of necessity.

Reframing the Status Quo

More analog products are being disrupted by new-generation replacements that are digitized and connected, from smart buildings to connected inhalers. While some organizations are developing entirely new categories of technologies and products, others are finding new ways of using what already exists. Solar power is a good example of that. It isn’t new, but it’s a practical and affordable way to deliver electricity to consumers who lack access to a power grid. In rural Africa, consumers pay for their solar power using their cell phones because they don’t have bank accounts. And the solar power panel providers are using cellular networks to monitor and manage the units they’ve installed at residences and small businesses.

A number of large, established companies including Wells Fargo and Disney now have incubator or accelerator programs so they have direct insight into start-up innovation. Acquisition is another means big companies use to become disruptive or sustain a disruptive track record.

Some businesses adapt a disruptive idea, such as Etsy’s ecommerce marketplace for handmade crafts. Other organizations including Travelocity, Hotels.com, and Zillow disrupted entire industries by eliminating the need for a middleman.

What Does Analytics Suggest?

What would happen if your company digitized a product or service in a way that provided more value to the customer and new sources of revenue for you? How could a disruptive trend in another industry apply to your industry? What insight might third-party data sources give you that would make a material difference to your customers and your company? How might your business model change if data and analytics were considered the core competency of your organization?

Emotional Analytics is Next. Are You Ready?

In the near future, more organizations will use emotional analytics to fine-tune their offerings, whether they’re designing games or building CRM systems. Already, there are platforms and software development tools that allow software developers to build emotional analytics into desktop, mobile, and web apps. In a business context, that can translate to mood indicators built into dashboards that show whether the customer on the phone or in a chat discussion is happy, whether the customer service rep is effective, or both — in real time.

Such information could be used to improve the efficiency of escalation procedures or to adapt call scripts in the moment. It could also be used to refine customer service training programs after the fact. In many cases, emotional analytics will be used in real time to determine how a bot, app, IoT device, or human should react.

Although the design approaches to emotional analytics differ, each involves some combination of AI, machine learning, deep learning, neural nets, natural language processing, and specialized algorithms to better understand the temperament and motivations of humans. The real-time analytical capabilities will likely affect the presentation of content, the design of products and services, and how companies interact with their customers. Not surprisingly, emotional analytics requires massive amounts of data to be effective.

Emotion isn’t an entirely new data point, and at the same time, it is. In a customer service or sales scenario, a customer’s emotion may have been captured “for training purposes” in a call or in a rep’s notes. In the modern sense, emotions will be detected and analyzed in real time by software that is able to distinguish the nuances of particular emotions better than humans. Because the information is digital, it can be used for analytical purposes like any other kind of data, without transformation.

Voice Inflection

What people say is one thing. How they say it provides context. Voice inflection is important because in the not-too-distant future, more IoT devices, computing devices, and apps will use voice interfaces instead of keyboards, keypads, or gestures designed for mobile devices.

Because humans and their communication styles are so diverse, contextual information is extremely important. Demographics, personas, account histories, geolocation, and what a person is doing in the moment are just a few things that need to be considered. Analyzing all that information, making a decision about it, and acting upon it requires considerable automation for real time relevance. The automation occurs inside an app, an enterprise application, or a service that acts autonomously, notifies humans, or both.

Body Language

Body language adds even more context. Facial expressions, micro expressions, posture, gait, and gestures all provide clues to a person’s state of mind.

Media agency MediaCom is using emotional analytics to more accurately gauge reactions to advertisements or campaigns so the creative can be tested with greater accuracy and adjusted.

Behavioral health is another interesting application. Using emotional analytics, healthcare providers can gain insight into conditions such as depression, anxiety, and schizophrenia.

The potential applications go on, including law enforcement interrogations, retail, and business negotiations, to name a few.

A Tough Problem

Natural language processing, which is necessary for speech and text analysis, is hard enough to get right. Apple Siri, Microsoft Cortana, and even spellcheckers are proof that there’s a lot of room for improvement. Aside from getting the nuances of individual languages and their dialects right, there are also cultural nuances that need to be understood – not only in the context of words but the way in which words are spoken.

The same thing goes for gestures. Large gestures are fine in Italy, but inappropriate in Japan, for example. The meaning of gestures can change with culture, which intelligent systems must understand.

As a result, emotional analytics will crawl before it walks or runs, like most technologies.

What You Should Know About Machine-Aided Analytics

More organizations are supplementing their analytics capabilities with intelligent systems that are easier to use than ever. While the results may look impressive, the devil is in the details.

There is a key difference between traditional analytics systems and some of the newer ones. If you understand it, you’ll be a step ahead of your peers.

Old: Input → Output

Traditional analytics systems tend to be rules-based, which means they have “if/then” scenarios built into them: if a user clicks the red button, one result occurs; if she clicks the blue button, another result occurs. The key thing to know here is that, assuming the programming is done right, an input results in a predictable output. That’s great, but it doesn’t work so well with the complex Big Data we have today, which is why machine learning is gaining momentum.
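The if/then behavior described above can be sketched in a few lines. The button names and outcomes here are purely illustrative; the point is that every input maps to one predictable output.

```python
def handle_click(button):
    """A toy rule-based handler: each input has a hard-coded result."""
    if button == "red":
        return "show_alert"
    elif button == "blue":
        return "open_report"
    else:
        return "no_op"
```

Contrast this with a machine-learning system, where the mapping from input to output is learned from data rather than written out by hand, so the same input can produce different results as the model is retrained.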

Modern systems use machine learning to provide more intelligent solutions. The solutions are more “intelligent” because the machine learns what humans feed it, and depending on the algorithms used, they may be capable of learning on their own. Training by humans and self-learning allows such systems to “see” things in the data that weren’t apparent before, such as patterns and relationships. The other major value, of course, is the ability to comb through massive amounts of structured and unstructured data faster than a human could, understand the data, make predictions on it, and perhaps make recommendations. It is the latter characteristics — prediction and prescription — that are most obvious to analytics users.

What’s not well understood is what can potentially go wrong. An analytics system designed for general purpose use is likely not what someone on Wall Street would use. That person would want a solution that’s tailored to the needs of the financial services industry. Making the wrong movie prediction is one thing; making the wrong trade is another.

As users, it’s easy to assume that the analytics we get or come up with are accurate, but there is so much that can affect accuracy: data quality, algorithms, models, interpretation. And, as I mentioned in my last post, bias, which can impact all of those things and more.

Why you should care

There is a shortage of really good data science and analytics talent. One answer to the problem is to build solutions that abstract the complexity of all the nasty stuff — data collection, data preparation, choice of algorithms and models, etc. — so business users don’t have to worry about it. On one hand, the abstraction is good because it enables solutions that are easy to use and don’t require much, if any, training.

But what if the underlying math or assumptions aren’t exactly right? How would you know what effect that might have? To understand how and why those systems are working the way they are requires someone who understands all the hairy technical stuff, like a car mechanic. That means, like a car, do not pop the hood and start tinkering with things unless you know what you’re doing.

Some solutions don’t have a pop-the-hood option. They’re black boxes, which means no one can see what’s going on inside. The opaqueness doesn’t make business users nervous, but it’s troublesome to experts who didn’t build the system in the first place.

Bottom line, you’re probably going to get spurious results once in a while, and when you do, ask why. If it’s not obvious to you, ask for help.
