
BERT (RRID:SCR_018008)
Resource Information

URL: https://github.com/google-research/bert

Proper Citation: BERT (RRID:SCR_018008)

Description: Language representation model for Natural Language Processing pre-training, developed by Google and introduced in "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding". It is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers.
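To make the description concrete, the following is a minimal, illustrative sketch (not part of this record) of obtaining contextual embeddings from a pre-trained BERT checkpoint. It assumes the Hugging Face transformers package and the "bert-base-uncased" checkpoint; the registered repository (google-research/bert) ships its own TensorFlow tooling instead.

```python
# Minimal sketch: contextual token embeddings from pre-trained BERT.
# Assumes the Hugging Face "transformers" package and the
# "bert-base-uncased" checkpoint as illustrative stand-ins.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

# Because BERT conditions on both left and right context in every
# layer, the vector for "bank" differs between these two sentences.
sentences = ["He sat by the river bank.", "She deposited cash at the bank."]
with torch.no_grad():
    for text in sentences:
        inputs = tokenizer(text, return_tensors="pt")
        outputs = model(**inputs)
        # last_hidden_state: (batch, sequence_length, hidden_size)
        tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
        bank_index = tokens.index("bank")
        bank_vector = outputs.last_hidden_state[0, bank_index]
        print(text, "->", bank_vector[:4])
```

The two printed vectors for "bank" differ, which is the bidirectional-context property the description refers to.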

Abbreviations: BERT

Synonyms: Bidirectional Encoder Representations from Transformers

Resource Type: software application, software resource

Keywords: text mining, natural language processing, pre-training, Google, deep bidirectional transformers, language understanding, unlabeled text, language representation model

This resource is related to: BioBERT

Usage and Citation Metrics

Mention counts and the most recent citing articles (with PMIDs) in the open access literature are tracked for this resource.

Collaborator Network

A list of researchers who have used the resource and an author search tool. This is available for resources that have literature mentions.

Ratings and Alerts

No rating or validation information has been found for BERT.

No alerts have been found for BERT.

Data and Source Information