Tech glossary

Defining IT & technology terms

Machine learning

Machine learning is the practice of training computer systems to act on data without explicit, step-by-step programming. A subfield of artificial intelligence (AI), machine learning uses algorithms and statistical models to identify patterns in data and predict future outcomes.

Most machine learning initiatives fit one of two models: supervised or unsupervised. Supervised machine learning begins with a known, labeled dataset (often called "training data") and uses it to make predictions, which are compared against actual outcomes to further refine the algorithm. Unsupervised machine learning works with unlabeled data, allowing the system to discover patterns and groupings on its own.
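The contrast between the two models can be sketched in a few lines of plain Python. This is a hedged, illustrative toy (the data, function names, and the nearest-centroid and 1-D k-means approaches are assumptions for demonstration, not any particular library's API): the supervised routine learns from labeled examples, while the unsupervised routine groups the same values with no labels at all.

```python
def train_supervised(training_data):
    """Supervised: learn from labeled examples, given as (value, label) pairs."""
    # Compute the mean value (centroid) of each label seen in training.
    sums, counts = {}, {}
    for value, label in training_data:
        sums[label] = sums.get(label, 0.0) + value
        counts[label] = counts.get(label, 0) + 1
    centroids = {label: sums[label] / counts[label] for label in sums}

    def predict(value):
        # Predict the label whose centroid lies closest to the new value.
        return min(centroids, key=lambda lbl: abs(centroids[lbl] - value))
    return predict


def cluster_unsupervised(values, k=2, iterations=10):
    """Unsupervised: group unlabeled values into k clusters (toy 1-D k-means)."""
    centroids = values[:k]  # naive initialization from the first k values
    for _ in range(iterations):
        groups = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(centroids[i] - v))
            groups[nearest].append(v)
        # Recompute each centroid as the mean of its group.
        centroids = [sum(g) / len(g) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return groups


# Supervised: labeled training data drives a predictor.
labeled = [(1.0, "low"), (1.2, "low"), (9.8, "high"), (10.1, "high")]
predict = train_supervised(labeled)
print(predict(1.5))  # → low

# Unsupervised: the same values, stripped of labels, still form two groups.
unlabeled = [1.0, 1.2, 9.8, 10.1]
print(cluster_unsupervised(unlabeled))  # → [[1.0, 1.2], [9.8, 10.1]]
```

The supervised predictor is "refined" against known answers (the labels), whereas the clustering routine only ever sees raw values, mirroring the distinction described above.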

Machine learning can be applied to many business problems, and it is particularly useful in areas where conventional algorithms have proven insufficient. Examples of machine learning innovation include self-driving cars, email filtering, and speech-recognition software.

Related terms

  • Artificial Intelligence
  • Digital Innovation
  • Machine learning operations (MLOps)

Featured content for machine learning

  • Infographic: Dell Technologies Validated Designs for AI
  • eBook: Build The Future Of Education With Engagement And Accessibility
  • eBook: Build Better Communities With Digital Tools For Government
  • TechTalk: CXO Edition: How Intel Is Empowering Ambitious Healthcare Goals
