About: Hyperparameter optimization

An Entity of Type: dbo:Disease, within Data Space: el.dbpedia.org, associated with source document(s)

In machine learning, hyperparameter optimization or tuning is the problem of choosing a set of optimal hyperparameters for a learning algorithm. A hyperparameter is a parameter whose value is used to control the learning process. By contrast, the values of other parameters (typically node weights) are learned.
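The sketch below illustrates that distinction; it assumes scikit-learn, and the classifier, dataset, and settings are illustrative choices rather than anything stated in the source. The constructor arguments are hyperparameters fixed before training, while the node weights in coefs_ are the parameters the algorithm learns from data.

    # Minimal sketch (assumes scikit-learn; model and data are illustrative).
    from sklearn.datasets import make_classification
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=200, n_features=10, random_state=0)

    # Hyperparameters: fixed before training; they control the learning process.
    clf = MLPClassifier(hidden_layer_sizes=(16,),   # network architecture
                        learning_rate_init=0.01,    # optimizer step size
                        max_iter=500,
                        random_state=0)

    # Parameters: the node weights in clf.coefs_ are learned from the data.
    clf.fit(X, y)
    print([w.shape for w in clf.coefs_])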

Attributes        Values
rdf:type
rdfs:label
  • Hyperparameter optimization (en)
rdfs:comment
  • In machine learning, hyperparameter optimization or tuning is the problem of choosing a set of optimal hyperparameters for a learning algorithm. A hyperparameter is a parameter whose value is used to control the learning process. By contrast, the values of other parameters (typically node weights) are learned. (en)
sameAs
dbp:wikiPageUsesTemplate
Subject
thumbnail
foaf:depiction
  • http://commons.wikimedia.org/wiki/Special:FilePath/Hyperparameter_Optimization_using_Grid_Search.svg
  • http://commons.wikimedia.org/wiki/Special:FilePath/Hyperparameter_Optimization_using_Random_Search.svg
  • http://commons.wikimedia.org/wiki/Special:FilePath/Hyperparameter_Optimization_using_Tree-Structured_Parzen_Estimators.svg
gold:hypernym
prov:wasDerivedFrom
Wikipage page ID
page length (characters) of wiki page
Wikipage revision ID
Link from a Wikipage to another Wikipage
has abstract
  • In machine learning, hyperparameter optimization or tuning is the problem of choosing a set of optimal hyperparameters for a learning algorithm. A hyperparameter is a parameter whose value is used to control the learning process. By contrast, the values of other parameters (typically node weights) are learned. The same kind of machine learning model can require different constraints, weights or learning rates to generalize different data patterns. These measures are called hyperparameters, and have to be tuned so that the model can optimally solve the machine learning problem. Hyperparameter optimization finds a tuple of hyperparameters that yields an optimal model which minimizes a predefined loss function on given independent data. The objective function takes a tuple of hyperparameters and returns the associated loss. Cross-validation is often used to estimate this generalization performance. (en)
foaf:isPrimaryTopicOf
is Wikipage redirect of
is Link from a Wikipage to another Wikipage of
is foaf:primaryTopic of
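As the abstract above describes, the objective function takes a tuple of hyperparameters and returns the associated loss, with cross-validation often used to estimate generalization performance. Below is a minimal grid-search sketch of that loop; it assumes scikit-learn and an RBF-kernel SVC whose C and gamma are the hyperparameters being tuned, an illustrative choice rather than anything specified in the source.

    # Minimal sketch of hyperparameter tuning by grid search with cross-validation
    # (assumes scikit-learn; the SVC and its candidate values are illustrative).
    from itertools import product
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=10, random_state=0)

    def objective(hyperparams):
        # Map one hyperparameter tuple to an estimated loss (1 - mean CV accuracy).
        C, gamma = hyperparams
        scores = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5)
        return 1.0 - scores.mean()

    # Grid search: evaluate the objective on the Cartesian product of candidate
    # values and keep the tuple with the lowest estimated loss.
    grid = list(product([0.1, 1.0, 10.0], [0.01, 0.1, 1.0]))
    best = min(grid, key=objective)
    print("best (C, gamma):", best, "estimated loss:", objective(best))

The thumbnails listed above depict grid search alongside random search and Tree-structured Parzen Estimators; those methods evaluate the same kind of objective but replace the exhaustive grid with random sampling or a model-based choice of the next tuple to try.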