This HTML5 document contains 37 embedded RDF statements represented using HTML+Microdata notation.

The embedded RDF content will be recognized by any HTML5 Microdata processor.
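The statements listed below can also be retrieved programmatically. The following sketch is editorial and not part of this page; it uses Python with rdflib and assumes DBpedia's content negotiation returns an RDF serialization when the dbr: IRI is dereferenced. Extracting the Microdata from the HTML itself would instead require a dedicated Microdata parser.

from rdflib import Graph, URIRef

# Resource IRI for this page's subject (dbr:Maximum-entropy_Markov_model).
MEMM = URIRef("http://dbpedia.org/resource/Maximum-entropy_Markov_model")

g = Graph()
# Dereference the IRI; rdflib selects a parser from the server's content type.
g.parse(MEMM)

# Print every predicate/object pair for the subject,
# mirroring the Statements section of this page.
for predicate, obj in g.predicate_objects(subject=MEMM):
    print(predicate, obj)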

Namespace Prefixes

Prefix     IRI
dct        http://purl.org/dc/terms/
yago-res   http://yago-knowledge.org/resource/
dbo        http://dbpedia.org/ontology/
foaf       http://xmlns.com/foaf/0.1/
dbt        http://dbpedia.org/resource/Template:
rdfs       http://www.w3.org/2000/01/rdf-schema#
freebase   http://rdf.freebase.com/ns/
rdf        http://www.w3.org/1999/02/22-rdf-syntax-ns#
owl        http://www.w3.org/2002/07/owl#
n13        http://en.wikipedia.org/wiki/
dbc        http://dbpedia.org/resource/Category:
dbp        http://dbpedia.org/property/
prov       http://www.w3.org/ns/prov#
xsdh       http://www.w3.org/2001/XMLSchema#
gold       http://purl.org/linguistics/gold/
dbr        http://dbpedia.org/resource/

Statements

Subject Item
dbr:Maximum-entropy_Markov_model
rdf:type
dbo:Person
rdfs:label
Maximum-entropy Markov model
rdfs:comment
In statistics, a maximum-entropy Markov model (MEMM), or conditional Markov model (CMM), is a graphical model for sequence labeling that combines features of hidden Markov models (HMMs) and maximum entropy (MaxEnt) models. An MEMM is a discriminative model that extends a standard maximum entropy classifier by assuming that the unknown values to be learnt are connected in a Markov chain rather than being conditionally independent of each other. MEMMs find applications in natural language processing, specifically in part-of-speech tagging and information extraction.
owl:sameAs
freebase:m.0ch0rl8 yago-res:Maximum-entropy_Markov_model
dbp:wikiPageUsesTemplate
dbt:Redirect dbt:Reflist dbt:Citation_needed
dct:subject
dbc:Statistical_natural_language_processing dbc:Markov_models
gold:hypernym
dbr:Model
prov:wasDerivedFrom
n13:Maximum-entropy_Markov_model?oldid=1000108847&ns=0
dbo:wikiPageID
27900233
dbo:wikiPageLength
6929
dbo:wikiPageRevisionID
1000108847
dbo:wikiPageWikiLink
dbr:Generalized_iterative_scaling
dbr:Part-of-speech_tagging
dbr:Forward–backward_algorithm
dbc:Markov_models
dbr:Viterbi_algorithm
dbr:Hidden_Markov_model
dbr:Statistics
dbr:Maximum_entropy_probability_distribution
dbr:Information_extraction
dbr:Conditional_independence
dbr:Natural_language_processing
dbr:Discriminative_model
dbr:Semi-supervised_learning
dbr:Baum–Welch_algorithm
dbr:Markov_chain
dbr:Conditional_random_field
dbc:Statistical_natural_language_processing
dbr:Multinomial_logistic_regression
dbr:Sequence_labeling
dbr:Graphical_model
dbo:abstract
In statistics, a maximum-entropy Markov model (MEMM), or conditional Markov model (CMM), is a graphical model for sequence labeling that combines features of hidden Markov models (HMMs) and maximum entropy (MaxEnt) models. An MEMM is a discriminative model that extends a standard maximum entropy classifier by assuming that the unknown values to be learnt are connected in a Markov chain rather than being conditionally independent of each other. MEMMs find applications in natural language processing, specifically in part-of-speech tagging and information extraction.
foaf:isPrimaryTopicOf
n13:Maximum-entropy_Markov_model
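As editorial context for the abstract quoted above (this sketch is not part of the page's RDF data), the standard MEMM formulation factorizes the conditional distribution of the label sequence given the observation sequence along the Markov chain:

\[
P(s_1, \dots, s_n \mid o_1, \dots, o_n) = \prod_{t=1}^{n} P(s_t \mid s_{t-1}, o_t),
\]

where each transition distribution is a maximum entropy (multinomial logistic) model over feature functions $f_a$ with learned weights $\lambda_a$:

\[
P(s \mid s', o) = \frac{1}{Z(o, s')} \exp\!\Big(\sum_a \lambda_a f_a(o, s)\Big),
\]

with $Z(o, s')$ normalizing over the possible current states.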