Performance of Transformer-Based Natural Language Models


Natural Language Processing (NLP) has gained attention in several fields that study how computerized systems handle human language. NLP models can aid professionals, for instance doctors, in extracting and summarizing essential information from enormous pools of data during decision-making in hospitals (Jain et al., 2018). The performance of transformer-based models, including BERT, XLNet, and RoBERTa, varies under stress tests and in tasks that support differential diagnosis (Aspillaga et al., 2020). Among the various technologies deployed for information extraction, the sentence embedding approach has several applications, encoding sentences into fixed-length vectors that capture their semantic content.

BERT and RoBERTa models have demonstrated state-of-the-art performance on sentence-pair regression tasks such as semantic textual similarity (STS). However, both sentences of every candidate pair must be fed into the network together, which causes an enormous computational overhead when the models are used to aid clinicians in differential diagnosis (Ndukwe et al., 2020). When extracting relevant clinical information from massive data, finding the most similar pair in a collection of approximately 10,000 sentences requires roughly 50 million inference computations, as illustrated in the sketch below. The extracted information helps clinicians conduct a differential diagnosis from the facts gathered via the models. BERT needs approximately 65 hours for such a computation (Reimers & Gurevych, 2019). This construction makes BERT ineffective for semantic similarity search and for unsupervised tasks such as clustering.
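The figure of roughly 50 million comparisons follows directly from the number of unordered sentence pairs, n(n-1)/2. A minimal back-of-the-envelope sketch of that arithmetic (the function name is illustrative, not from the cited work):

```python
def pair_count(n: int) -> int:
    # Every unordered pair of sentences must be scored once: n * (n - 1) / 2.
    return n * (n - 1) // 2

if __name__ == "__main__":
    n = 10_000
    print(f"{n:,} sentences -> {pair_count(n):,} pairwise comparisons")
    # 10,000 sentences -> 49,995,000 pairwise comparisons (about 50 million)
```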

Sentence-BERT (SBERT), a modification of the existing BERT model, uses Siamese and triplet network structures to derive semantically meaningful sentence embeddings. The resulting embeddings of clinical text can then be compared using cosine similarity (Wang & Kuo, 2020). SBERT reduces the time and effort clinicians need to identify the most similar pair from approximately 65 hours with BERT/RoBERTa to approximately 5 seconds. Additionally, SBERT is computationally efficient: it is roughly 9 percent faster than the well-known InferSent model and approximately 55 percent faster than the Universal Sentence Encoder (Reimers & Gurevych, 2019). Clinicians can therefore access relevant data more quickly and easily during differential medical examination when using information extracted via SBERT. In evaluation on various STS tasks, SBERT outperforms several other sentence embedding techniques and is thus recommendable for clinical practice.
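A minimal sketch of this encode-once, compare-with-cosine-similarity workflow, assuming the open-source sentence-transformers package and its pretrained "all-MiniLM-L6-v2" checkpoint; the clinical sentences are invented for illustration and are not taken from the cited studies:

```python
from sentence_transformers import SentenceTransformer, util

# Load a pretrained Sentence-BERT style model (assumed checkpoint name).
model = SentenceTransformer("all-MiniLM-L6-v2")

# Hypothetical clinical sentences; in practice these would come from patient notes.
sentences = [
    "Patient reports chest pain radiating to the left arm.",
    "The patient complains of chest discomfort spreading to the arm.",
    "Routine follow-up for well-controlled type 2 diabetes.",
]

# Encode each sentence once into a fixed-length vector.
embeddings = model.encode(sentences, convert_to_tensor=True)

# Compare all pairs with cosine similarity; semantically close sentences score higher.
scores = util.cos_sim(embeddings, embeddings)
print(scores)
```

Because each sentence is encoded only once, the expensive network runs n times rather than once per pair; the pairwise step reduces to cheap vector operations, which is where the reported speedup over cross-encoding with BERT comes from.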

References

Aspillaga, C., Carvallo, A., & Araujo, V. (2020). Stress test evaluation of transformer-based models in natural language understanding tasks. In Proceedings of the 12th Language Resources and Evaluation Conference (LREC 2020) (pp. 1882-1894). European Language Resources Association.

Jain, A., Kulkarni, G., & Shah, V. (2018). Natural language processing. International Journal of Computer Sciences and Engineering, 6(1), 161-167.

Ndukwe, I., Amadi, C., Nkomo, L., & Daniel, B. (2020). Lecture Notes in Computer Science, 12164, 224-227.

Reimers, N., & Gurevych, I. (2019). Sentence-BERT: Sentence embeddings using Siamese BERT-networks. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Association for Computational Linguistics.

Wang, B., & Kuo, C. (2020). SBERT-WK: A sentence embedding method by dissecting BERT-based word models. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 28, 2146-2157.
