URL: https://github.com/google-research/bert
Proper Citation: BERT (RRID:SCR_018008)
Description: Technique for Natural Language Processing pre-training developed by Google, introduced in the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding". A language representation model designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers.
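A minimal usage sketch, for illustration only (it is not part of this registry entry, and the google-research/bert repository itself ships TensorFlow training scripts): the snippet below assumes the third-party Hugging Face transformers and torch packages and shows how a pre-trained BERT checkpoint produces bidirectional contextual embeddings for a sentence.

    # Assumption: uses the Hugging Face "transformers" and "torch" packages,
    # not the TensorFlow code from google-research/bert.
    from transformers import BertTokenizer, BertModel
    import torch

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertModel.from_pretrained("bert-base-uncased")

    # Tokenize an example sentence and run it through the encoder.
    inputs = tokenizer("BERT conditions on both left and right context.",
                       return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # One contextual vector per token, computed from the full bidirectional context.
    print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768)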
Abbreviations: BERT
Synonyms: Bidirectional Encoder Representations from Transformers
Resource Type: software application, software resource
Keywords: text mining, natural language processing, pre-training, Google, deep bidirectional transformers, language understanding, unlabeled text, language representation model
A list of researchers who have used the resource and an author search tool. This is available for resources that have literature mentions.
No rating or validation information has been found for BERT.
No alerts have been found for BERT.
Source: SciCrunch Registry