This HTML5 document contains 27 embedded RDF statements represented using HTML+Microdata notation.

The embedded RDF content will be recognized by any processor of HTML5 Microdata.
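As an illustrative sketch of what such a processor does, the following Python example extracts a (property, value) pair from a hypothetical Microdata fragment in the style of this page, using only the standard-library html.parser. The fragment shown is an assumption for illustration; the actual markup of this document may differ.

```python
from html.parser import HTMLParser

# Hypothetical fragment in the style of an HTML+Microdata data page;
# the real markup of this document may differ.
HTML = """
<div itemscope itemid="http://dbpedia.org/resource/Multitask_optimization">
  <span itemprop="http://www.w3.org/2000/01/rdf-schema#label">Multitask optimization</span>
</div>
"""

class MicrodataStatements(HTMLParser):
    """Collect (property IRI, literal text) pairs from itemprop attributes."""
    def __init__(self):
        super().__init__()
        self._prop = None          # itemprop of the element currently being read
        self.statements = []       # extracted (property, value) pairs

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "itemprop" in attrs:
            self._prop = attrs["itemprop"]

    def handle_data(self, data):
        if self._prop and data.strip():
            self.statements.append((self._prop, data.strip()))
            self._prop = None

parser = MicrodataStatements()
parser.feed(HTML)
print(parser.statements)
# [('http://www.w3.org/2000/01/rdf-schema#label', 'Multitask optimization')]
```

A full Microdata-to-RDF processor also tracks itemscope nesting and itemid subjects; this sketch only shows the property/value extraction step.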

Namespace Prefixes

Prefix  IRI
dct     http://purl.org/dc/terms/
dbo     http://dbpedia.org/ontology/
foaf    http://xmlns.com/foaf/0.1/
dbt     http://dbpedia.org/resource/Template:
rdfs    http://www.w3.org/2000/01/rdf-schema#
rdf     http://www.w3.org/1999/02/22-rdf-syntax-ns#
n6      http://en.wikipedia.org/wiki/
dbc     http://dbpedia.org/resource/Category:
prov    http://www.w3.org/ns/prov#
dbp     http://dbpedia.org/property/
xsdh    http://www.w3.org/2001/XMLSchema#
dbr     http://dbpedia.org/resource/
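Under this mapping, a prefixed name expands to a full IRI by simple concatenation of the prefix's IRI and the local part. A minimal Python sketch (the PREFIXES dict merely restates the table above):

```python
# Prefix map restating the namespace table above.
PREFIXES = {
    "dct":  "http://purl.org/dc/terms/",
    "dbo":  "http://dbpedia.org/ontology/",
    "foaf": "http://xmlns.com/foaf/0.1/",
    "dbt":  "http://dbpedia.org/resource/Template:",
    "rdfs": "http://www.w3.org/2000/01/rdf-schema#",
    "rdf":  "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "n6":   "http://en.wikipedia.org/wiki/",
    "dbc":  "http://dbpedia.org/resource/Category:",
    "prov": "http://www.w3.org/ns/prov#",
    "dbp":  "http://dbpedia.org/property/",
    "xsdh": "http://www.w3.org/2001/XMLSchema#",
    "dbr":  "http://dbpedia.org/resource/",
}

def expand(curie: str) -> str:
    """Expand a prefixed name like 'dbr:Multitask_optimization' to a full IRI."""
    prefix, _, local = curie.partition(":")
    return PREFIXES[prefix] + local

print(expand("dbr:Multitask_optimization"))
# http://dbpedia.org/resource/Multitask_optimization
```

For example, expand("rdfs:label") yields http://www.w3.org/2000/01/rdf-schema#label, matching the predicates listed in the statements below.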

Statements

Subject Item
dbr:Multitask_optimization
rdfs:label
Multitask optimization
rdfs:comment
Multi-task optimization is a paradigm in the optimization literature that focuses on solving multiple self-contained tasks simultaneously. The paradigm is inspired by the well-established concepts of transfer learning and multi-task learning in predictive analytics. The key motivation behind multi-task optimization is that if optimization tasks are related to each other, either in their optimal solutions or in the general characteristics of their function landscapes, search progress on one task can be transferred to substantially accelerate the search on the others.
dbp:wikiPageUsesTemplate
dbt:Short_description dbt:Context
dct:subject
dbc:Machine_learning
prov:wasDerivedFrom
n6:Multitask_optimization?oldid=1065047534&ns=0
dbo:wikiPageID
58175832
dbo:wikiPageLength
7813
dbo:wikiPageRevisionID
1065047534
dbo:wikiPageWikiLink
dbr:Evolutionary_computation dbr:Machine_learning dbr:Gaussian_process dbr:Paradigm dbr:Multicriteria_classification dbr:Ensemble_learning dbr:Multiple-criteria_decision_analysis dbc:Machine_learning dbr:Multi-task_learning dbr:Predictive_analytics dbr:Bayesian_optimization dbr:Multi-objective_optimization dbr:Transfer_learning dbr:Crossover_(genetic_algorithm) dbr:Implicit_parallelism dbr:Hyperparameter_optimization
dbo:abstract
Multi-task optimization is a paradigm in the optimization literature that focuses on solving multiple self-contained tasks simultaneously. The paradigm is inspired by the well-established concepts of transfer learning and multi-task learning in predictive analytics. The key motivation behind multi-task optimization is that if optimization tasks are related to each other, either in their optimal solutions or in the general characteristics of their function landscapes, search progress on one task can be transferred to substantially accelerate the search on the others. The success of the paradigm is not necessarily limited to one-way knowledge transfer from simpler to more complex tasks. In practice, one may intentionally attempt a more difficult task in order to incidentally solve several smaller problems along the way.
foaf:isPrimaryTopicOf
n6:Multitask_optimization