Lisa Morgan's Official Site

Strategic Insights and Clickworthy Content Development

Month: July 2017

16 Machine Learning Terms You Should Know

Advanced analytics is heating up. AI, machine learning, deep learning, and neural networks are just some of the terms we hear and should know more about. While most of us will never become statisticians or unicorn data scientists, it’s wise to understand the basic vocabulary, especially since we’ll be hearing a lot more about machine learning in the coming years. Here are a few terms we should all know, drawn from sites that have much more to offer:

Algorithm – a step-by-step procedure for solving a problem.

Attribute – a characteristic or property of an object.

Classification – the task of assigning objects to predefined groups or categories.

Clusters – groups of objects that share a characteristic that is distinct from other groups.

Correlation – the extent to which two numerical variables have a linear relationship.

Deep Learning – a form of machine learning based on multi-layered neural networks, loosely modeled on the workings of the human brain.

Decision Tree – a decision support tool that uses a tree-like graph or model of decisions and their possible consequences, including chance event outcomes, resource costs, and utility.

Natural Language Processing (NLP) – the automatic (or semi-automatic) processing of human language.

Neural Networks – a series of algorithms that attempts to identify underlying relationships in a set of data by using a process that mimics the way the human brain operates.

Normal Distribution – a symmetrical distribution with a bell-shaped density curve and a single peak.

Outlier – an observation that lies an abnormal distance from other values in a random sample from a population.

Regression – a statistical process for estimating the relationships among variables.

Statistical Model – a formalization of relationships between variables in the form of mathematical equations.

Supervised Learning – machine learning performed with training data that includes both the inputs and the desired results.

Unsupervised Learning – machine learning performed with training data that does not include the desired results.
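A couple of these definitions can be made concrete with a few lines of code. The sketch below (invented data, standard library only) computes a Pearson correlation and flags an outlier; the z-score cutoff of 2.0 is an arbitrary choice for this toy sample, not a universal rule.

```python
# Toy illustration of two glossary terms: correlation and outlier.
from statistics import mean, stdev

def correlation(xs, ys):
    """Pearson correlation: the extent of a linear relationship
    between two numerical variables."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / ((len(xs) - 1) * stdev(xs) * stdev(ys))

def outliers(values, threshold=2.0):
    """Flag observations lying an abnormal distance (measured in
    standard deviations) from the rest of the sample."""
    m, s = mean(values), stdev(values)
    return [v for v in values if abs(v - m) / s > threshold]

hours  = [1, 2, 3, 4, 5]            # e.g., hours studied
scores = [52, 55, 61, 64, 70]       # e.g., test scores
print(round(correlation(hours, scores), 3))   # 0.993 -- strongly linear

sample = [9.8, 10.1, 10.0, 9.9, 10.2, 42.0]
print(outliers(sample))                       # [42.0]
```

Real work would use a statistics library rather than hand-rolled formulas, but the definitions map directly onto a few lines of arithmetic.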

Why IT is in Jeopardy

Some IT departments are struggling to prove their relevance as the pace of change continues to accelerate. On one hand, they’re responsible for their own predicament; on the other hand, they’re not.

IT has been the master of change, and yet what department wants to be responsible for its own demise? IT as a function isn’t dead, and it isn’t going to be any time soon. However, IT is changing for good. Here’s why:

IT overpromised and under-delivered

Lines of business no longer want to wait for IT. They can’t. The competitive pressures are just too great to ignore. But, when something goes wrong with their tech purchases, who do they call?  IT.

“IT is in jeopardy because of the agreements or promises they’ve made to the business,” said David Caldwell, senior IT solutions consultant at Kaiser Permanente. “You can’t deliver on time, you can’t deliver what you promised and you can’t deliver reliable systems.”

What the business really wants is a dependable, enabling service that delivers what it promises.

Business expectations are too high

IT can’t be successful if business leadership views IT as a cost rather than an investment, which seems strange given that today’s companies depend on technology for survival. Nevertheless, some businesses still have legacy cultural issues to overcome, one of which is recognizing how value in their company is actually produced in this day and age.

Worse, even C-level information and technology executives may not be viewed as equals among business leaders, so they’re left out of important meetings. Rather than having a partnership between IT and the business, the business may tell IT what it wants and when, without understanding the full scope of the problem or how difficult and complex the solution may be.

“They don’t consider that IT leadership can help you decide how you’re going to strategically differentiate your business,” said Caldwell. “If you don’t let them in, you’re missing out on a lot of valuable input.”

A related issue is budget. If IT isn’t given enough budget to be successful, can failures fairly be pinned on IT? Yet over the past couple of decades, IT has been told to do more with less, to the point where the model itself breaks down.

IT has enabled its own demise

IT had a specific role to play before the cloud, SaaS and Shadow IT became fashionable: it was the keeper of hardware, software and networks.

“IT brought the wave of innovation in, [and yet,] IT is under the same assault of things they were the perpetrators of,” said Greg Arnette, CTO at cloud information archiving company, Sonian. “IT is going through a metamorphosis that reduces the need to have as many in IT as in previous history.”

The adoption of cloud architectures and SaaS was fueled by the economic downturn of 2008 and 2009, which forced companies to view IT in terms of operating expenses rather than capital expenses.

“It was a perfect storm,” said Arnette. “Shadow IT was driven by business unit managers frustrated with their IT departments [so they] used their credit cards to sign up for Salesforce.com or go buy ZenDesk or any of these popular SaaS apps that have become the new back office systems for most companies.”

Never mind who purchased what, or whether it was bought with a purchase order or a credit card: when things go wrong, it’s IT’s job to fix it. That’s one way to provide the business with services, but probably not the model IT had in mind.

The CIO/CTO role is changing

There are plenty of CIOs and CTOs, but some of them are being moved into new roles such as Chief Data Officer, Chief Analytics Officer or Chief Innovation Officer. Whether these roles reflect a brave new world or are ultimately too narrow is a debatable point.

“It’s not such a focus on information. It’s now analytics, data wrangling and a focus on innovation as a key way IT can help customers do more,” said Arnette. “I think that’s where IT will come back, but it won’t be the same type of IT department.”

Indeed. Traditional hardware and enterprise software management are being usurped by IaaS and SaaS alternatives. It’s true that a lot of companies have hybrid strategies that combine their own systems with virtualized equivalents and that some companies are managing all of their own technology, but the economics of the virtual world (when managed responsibly) are too attractive to ignore over the long term.

3 Cool AI Projects

AI is all around us, quietly working in the background or interacting with us via a number of different devices. Various industries are using AI for specific reasons such as ensuring that flights arrive on time or irrigating fields better and more economically.

Over time, our interactions with AI are becoming more sophisticated. In fact, in the not-too-distant future we’ll have personal digital assistants that know more about us than we know about ourselves.

For now, there are countless AI projects popping up in commercial, industrial and academic settings. Following are a few examples of projects with an extra cool factor.

Get Credit. Now.

Who among us hasn’t sat in a car dealership, waiting for the finance person to run a credit check and provide us with financing options? We’ve also stood in lines at stores, filling out credit applications, much to the dismay of those standing behind us in line. Experian DataLabs is working to change all that.

Experian created Experian DataLabs to experiment with the help of clients and partners. Located in San Diego, London, and Sao Paulo, Experian DataLabs employs scientists and software engineers, 70% of whom hold Ph.D.s. Most of these professionals have backgrounds in machine learning.

“We’re going into the mobile market where we’re pulling together data, mobile, and some analytics work,” said Eric Haller, EVP of Experian’s Global DataLabs. “It’s cutting-edge machine learning which will allow for instant credit on your phone instead of applying for credit at the cash register.”

That goes for getting credit at car dealerships, too. Simply text a code to the car manufacturer and get the credit you need using your smartphone. Experian DataLabs is also combining the idea with Google Home, so you can shop for a car, and when you find one you like, you can ask Google Home for instant credit.

There’s no commercial product available yet, but a pilot will begin this summer.

AI About AI

Vicarious is attempting to achieve human-level intelligence in vision, language, and motor control. It is taking advantage of neuroscience to reduce the amount of input machine learning requires to achieve a desired result. At the moment, Vicarious is focusing on mainstream deep learning and computer vision.

Its concept is compelling to many investors. So far, the company has received $70 million from corporations, venture capitalists and affluent private investors, including Ashton Kutcher, Jeff Bezos, and Elon Musk.

On its website, Vicarious wisely points out the downside of optimizing models ad infinitum for merely incremental improvements. So, instead of trying to beat a state-of-the-art algorithm, Vicarious is trying to identify and characterize the source of errors.

Draft Better Basketball Players

The Toronto Raptors are working with IBM Watson to identify what skills the team needs and which prospective players can best fill the gaps. The team is also pre-screening each potential recruit’s personality traits and character.

During the recruiting process, Watson helps select the best players and it also suggests ideal trade scenarios. While prospecting, scouts enter data into a platform to record their observations. The information is later used by Watson to evaluate players.

And, a Lesson in All of This

Vicarious is using unsupervised machine learning. The Toronto Raptors are using supervised learning, though perhaps not exclusively. If you don’t yet know the difference between the two, it’s worth learning. Unsupervised learning looks for patterns in unlabeled data. Supervised learning is given examples labeled with classifications, such as which characteristics are “good” and which are “bad.”

Supervised and unsupervised learning are not mutually exclusive, since unsupervised learning needs to start somewhere. However, supervised learning is more comfortable for humans with egos and biases, because we are used to giving machines a set of rules (programming). It takes a strong ego, curiosity, or both to accept that some of the most intriguing findings can come from unsupervised learning, because it is not constrained by human biases. For example, we may define the world in terms of red, yellow and blue; unsupervised learning could point out crimson, vermillion, banana, canary, cobalt, lapis and more.
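The distinction can be sketched in a few lines of code. This is a deliberately tiny, invented example (it is not how Vicarious or Watson work): the supervised classifier learns from labels we supply, while the unsupervised two-means routine discovers groups on its own.

```python
# Toy contrast between supervised and unsupervised learning,
# standard library only; data and labels are made up.
from statistics import mean

# Supervised: we hand the machine labeled examples ("good"/"bad"),
# and it classifies a new point by its nearest labeled neighbor.
training = [(2, "bad"), (3, "bad"), (8, "good"), (9, "good")]

def classify(x):
    return min(training, key=lambda pair: abs(pair[0] - x))[1]

print(classify(7))  # 'good' -- learned from the labels we provided

# Unsupervised: no labels at all. A crude 1-D two-means clustering
# splits the same points into groups it discovers by itself.
def two_means(points, iterations=10):
    a, b = min(points), max(points)          # initial cluster centers
    for _ in range(iterations):
        ga = [p for p in points if abs(p - a) <= abs(p - b)]
        gb = [p for p in points if abs(p - a) > abs(p - b)]
        a, b = mean(ga), mean(gb)            # move centers to group means
    return sorted(ga), sorted(gb)

print(two_means([2, 3, 8, 9]))  # ([2, 3], [8, 9]) -- groups, no labels
```

The unsupervised routine never sees the words “good” or “bad”; it only reports that the data falls into two groups, and it is up to a human to decide what those groups mean.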