{"id":47849,"date":"2022-08-31T15:01:46","date_gmt":"2022-08-31T13:01:46","guid":{"rendered":"https:\/\/pharma-trend.com\/en\/cerebras-systems-enables-gpu-impossible-long-sequence-lengths-improving-accuracy-in-natural-language-processing-models\/"},"modified":"2022-08-31T15:01:46","modified_gmt":"2022-08-31T13:01:46","slug":"cerebras-systems-enables-gpu-impossible-long-sequence-lengths-improving-accuracy-in-natural-language-processing-models","status":"publish","type":"post","link":"https:\/\/pharma-trend.com\/en\/cerebras-systems-enables-gpu-impossible-long-sequence-lengths-improving-accuracy-in-natural-language-processing-models\/","title":{"rendered":"Cerebras Systems Enables GPU-Impossible\u2122 Long Sequence Lengths Improving Accuracy in Natural Language Processing Models"},"content":{"rendered":"<div>\n<p class=\"bwalignc\">\n<i>CS-2 Sets Another Record: The Only Single System Capable of Training Transformer-Style NLP Models with 20x Longer Sequences<\/i>\n<\/p>\n<p>SUNNYVALE, Calif.&#8211;(BUSINESS WIRE)&#8211;<a href=\"https:\/\/twitter.com\/hashtag\/AI?src=hash\" target=\"_blank\" rel=\"noopener\">#AI<\/a>&#8212;<a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.cerebras.net%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=Cerebras+Systems&amp;index=1&amp;md5=9119ace795a562cb00ff913d1e26cfdc\" rel=\"nofollow noopener\" shape=\"rect\">Cerebras Systems<\/a>,<b> <\/b>the pioneer in accelerating artificial intelligence (AI) compute, released yet another industry-first capability today. Customers can now rapidly train Transformer-style natural language AI models with 20x longer sequences than is possible using traditional computer hardware. This new capability is expected to lead to breakthroughs in natural language processing (NLP). 
By providing vastly more context to the understanding of a given word, phrase or strand of DNA, the long sequence length capability gives NLP models a much finer-grained understanding and better predictive accuracy.\n<\/p>\n<p><a href=\"https:\/\/mms.businesswire.com\/media\/20220831005105\/en\/1557161\/19\/2022_AI-Driven_Drug_Discovery_Kim_Branson_082922_cc.mp4\"><img decoding=\"async\" src=\"https:\/\/mms.businesswire.com\/media\/20220831005105\/en\/1557161\/21\/2022_AI-Driven_Drug_Discovery_Kim_Branson_082922_cc.jpg\"><\/a><br \/><a href=\"https:\/\/mms.businesswire.com\/media\/20220831005105\/en\/839280\/5\/Cerebras_logo_in_jpeg_format_for_a_black_background.jpg\"><img decoding=\"async\" src=\"https:\/\/mms.businesswire.com\/media\/20220831005105\/en\/839280\/21\/Cerebras_logo_in_jpeg_format_for_a_black_background.jpg\"><\/a><\/p>\n<p>\n\u201cEarlier this year, the Cerebras CS-2 set the record for <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.cerebras.net%2Fpress-release%2Fcerebras-systems-sets-record-for-largest-ai-models-ever-trained-on-a-single-device%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=training+the+largest+natural+language+processing+%28NLP%29+models&amp;index=2&amp;md5=f11c00ff694c96a4ef244c57239ec0e3\" rel=\"nofollow noopener\" shape=\"rect\">training the largest natural language processing (NLP) models<\/a> of up to 20 billion parameters on a single device,\u201d said Andrew Feldman, CEO and co-founder of Cerebras Systems. \u201cWe are now enabling our customers to train with longer sequences on the largest NLP models. This provides previously unobtainable accuracy, unlocking a new world of innovation and possibilities across AI and deep learning.\u201d\n<\/p>\n<p>\nLanguage is context-specific. This is why translating word by word with a dictionary fails \u2014 without context, the meaning of words is often vague. 
In language, a word is best understood in the context of the surrounding words, which provide cues to its meaning. This is true in AI as well. Long sequence lengths enable an NLP model to understand a given word within a larger and broader context.\n<\/p>\n<p>\nImagine hearing the expression \u201cTo be or not to be\u201d without context, just using a dictionary. And then imagine understanding it within the context of Act III, Scene 1 of Hamlet. And then imagine if you had broader context and could understand it within the context of the entire play \u2013 or better yet, within the context of all Shakespearean literature. As the context within which understanding occurs is broadened, so too is the precision of the understanding. By vastly enlarging the context (the sequence of words within which the target word is understood), Cerebras enables NLP models to demonstrate a more sophisticated understanding of language. Bigger and more sophisticated context improves the accuracy of understanding in AI.\n<\/p>\n<p>\nWhile many industries will benefit from this new capability, Cerebras\u2019 <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.cerebras.net%2Findustry-pharma%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=pharmaceutical+and+life+sciences&amp;index=3&amp;md5=ee33278d59077a507a7b29b747a22659\" rel=\"nofollow noopener\" shape=\"rect\">pharmaceutical and life sciences<\/a> customers are particularly excited about the implications for their drug discovery efforts. DNA is the language of life, and the analysis of DNA has been a particularly powerful application of large language models.\n<\/p>\n<p>\n\u201cMachine learning at GSK involves taking complex datasets generated at scale and answering very challenging biological questions,\u201d said Kim Branson, senior vice president and global head of AI and Machine Learning at GSK. 
\u201cThe long sequence length capability enables us to examine a particular gene in the context of tens of thousands of surrounding genes. We know that surrounding genes have an impact on gene expression, but we have never before been able to explore this within AI.\u201d\n<\/p>\n<p>\nThe proliferation of NLP has been propelled by the exceptional performance of Transformer-style networks such as BERT and GPT. However, these models are extremely computationally intensive. Even when trained on massive clusters of graphics processing units (GPUs), today these models can only process sequences up to about 2,500 tokens in length. Tokens might be words in a document, amino acids in a protein, or base pairs on a chromosome. But an eight-page document could easily exceed 8,000 words, which means that an AI model attempting to summarize a long document would lack a full understanding of the subject matter. The unique Cerebras wafer-scale architecture overcomes this fundamental limitation and enables sequences up to a heretofore impossible 50,000 tokens in length.\n<\/p>\n<p>\nThis innovation unlocks previously unexplored frontiers of deep learning. Even within traditional language processing, there are many examples of tasks in which this type of extended context matters. 
<a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.cerebras.net%2Fpress-release%2Fcerebras-systems-sets-record-for-largest-ai-models-ever-trained-on-a-single-device%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=Recent+work&amp;index=4&amp;md5=9d2637d2a047c2734e5aa0d8f9ad3b57\" rel=\"nofollow noopener\" shape=\"rect\">Recent work<\/a> has shown that for tasks such as <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.cerebras.net%2Fpress-release%2Fcerebras-systems-sets-record-for-largest-ai-models-ever-trained-on-a-single-device%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=evaluating+intensive+care+unit+patient+discharge+data&amp;index=5&amp;md5=f102106b69aea2c6f405dae3b76b6740\" rel=\"nofollow noopener\" shape=\"rect\">evaluating intensive care unit patient discharge data<\/a> and <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Faclanthology.org%2FP19-1424%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=analyzing+legal+documents&amp;index=6&amp;md5=8b4a819f604da23e6c0466a02ef1eb48\" rel=\"nofollow noopener\" shape=\"rect\">analyzing legal documents<\/a>, seeing the entire document matters for understanding. These documents can be tens of thousands of words long. The potential applications beyond language are even more exciting. 
For example, <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Farxiv.org%2Fpdf%2F2203.00854.pdf&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=research&amp;index=7&amp;md5=a98d9aa2832d05f8e4c8c72e0e3a1302\" rel=\"nofollow noopener\" shape=\"rect\">research<\/a> has shown that protein structures are highly dependent on long-range interactions between building blocks, and training models with longer sequence lengths is likely to yield better results. Now that the Cerebras CS-2 system makes long sequence training not only possible, but easy, researchers are sure to uncover many more applications and solve problems previously thought to be intractable.\n<\/p>\n<p>\nTraining large models with massive data sets and long sequence lengths is an area in which the Cerebras CS-2 system, powered by the Wafer-Scale Engine (WSE-2), excels. The WSE-2 is the largest processor ever built. It is 56 times larger, has 2.55 trillion <i>more<\/i> transistors, and has 100 times as many compute cores as the largest GPU. This scale means that the WSE-2 has both the memory to hold computations for the largest layers of the largest models, and the computational power to process such huge computations quickly. In contrast, similar workloads on GPUs have to be parallelized across hundreds or thousands of nodes to train a model in a reasonable amount of time. This type of GPU infrastructure requires specialized expertise and valuable engineering time to set up. 
Meanwhile, the Cerebras CS-2 system can perform similar workloads with the push of a button, removing the complexity while accelerating time to insight.\n<\/p>\n<p>\nWith customers in North America, Asia, Europe and the Middle East, Cerebras is delivering industry leading AI solutions to a growing roster of customers in the enterprise, government, and high performance computing (HPC) segments, including <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.cerebras.net%2Fcerebras-customer-spotlight-overview%2Fspotlight-glaxosmithkline%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=GSK&amp;index=8&amp;md5=e8527095a8c73c03167af400b3ba4a0d\" rel=\"nofollow noopener\" shape=\"rect\">GSK<\/a>, <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.cerebras.net%2Fcerebras-customer-spotlight-overview%2Fspotlight-astrazeneca%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=AstraZeneca&amp;index=9&amp;md5=edb7a65a767bc9c93ff980a3c1abcff0\" rel=\"nofollow noopener\" shape=\"rect\">AstraZeneca<\/a>, <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.cerebras.net%2Fcerebras-customer-spotlight-overview%2Fspotlight-totalenergies%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=TotalEnergies&amp;index=10&amp;md5=37b0d1dab6e1fd599824d3b5d5d819fe\" rel=\"nofollow noopener\" shape=\"rect\">TotalEnergies<\/a>, <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fcerebras.net%2Fspotlight-nference%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=nference&amp;index=11&amp;md5=896b39c99c6d45a15787b8ca871cadda\" rel=\"nofollow noopener\" shape=\"rect\">nference<\/a>, <a target=\"_blank\" 
href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fcerebras.net%2Fspotlight-argonne-national-laboratory%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=Argonne+National+Laboratory&amp;index=12&amp;md5=7869ac783717a2e706712166c4231d9f\" rel=\"nofollow noopener\" shape=\"rect\">Argonne National Laboratory<\/a>, <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fcerebras.net%2Fspotlight-lawrence-livermore-national-laboratory%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=Lawrence+Livermore+National+Laboratory&amp;index=13&amp;md5=09f753117b7f92c78d0cca582f61f912\" rel=\"nofollow noopener\" shape=\"rect\">Lawrence Livermore National Laboratory<\/a>, <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fnam10.safelinks.protection.outlook.com%2F%3Furl%3Dhttps%253A%252F%252Fwww.businesswire.com%252Fnews%252Fhome%252F20200609005134%252Fen%252FNSF-Funds-Neocortex-Groundbreaking-AI-Supercomputer-PSC%26data%3D04%257C01%257Cliz%2540cerebras.net%257C7af5cd07225e4bb7936208d99f2545dd%257C16c409e7e5a24663a88467e3ba571505%257C0%257C0%257C637715805624503595%257CUnknown%257CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%253D%257C1000%26sdata%3Dk6hE859Pd9%252FLf8P3UqKVIMvSzmdwI%252BTkJGWz1%252Fmtboo%253D%26reserved%3D0&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=Pittsburgh+Supercomputing+Center&amp;index=14&amp;md5=b131b7f35425f3a4f860692b0981a7e7\" rel=\"nofollow noopener\" shape=\"rect\">Pittsburgh Supercomputing Center<\/a>, <a target=\"_blank\" 
href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.cerebras.net%2Fcerebras-customer-spotlight-overview%2Fspotlight-leibniz-supercomputing-centre%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=Leibniz+Supercomputing+Centre&amp;index=15&amp;md5=6dd165e06f26b8bb493c31f3bf9fc484\" rel=\"nofollow noopener\" shape=\"rect\">Leibniz Supercomputing Centre<\/a>, <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.cerebras.net%2Fcerebras-customer-spotlight-overview%2Fspotlight-national-center-for-supercomputing-applications%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=National+Center+for+Supercomputing+Applications&amp;index=16&amp;md5=61284224caea4eb38fffdce7ccead6d6\" rel=\"nofollow noopener\" shape=\"rect\">National Center for Supercomputing Applications<\/a>, <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fnam10.safelinks.protection.outlook.com%2F%3Furl%3Dhttps%253A%252F%252Fwww.businesswire.com%252Fnews%252Fhome%252F20210203005062%252Fen%252FEPCC-Selects-Cerebras-Systems-AI-Supercomputer-to-Rapidly-Accelerate-AI-Research%26data%3D04%257C01%257Cliz%2540cerebras.net%257C7af5cd07225e4bb7936208d99f2545dd%257C16c409e7e5a24663a88467e3ba571505%257C0%257C0%257C637715805624513551%257CUnknown%257CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%253D%257C1000%26sdata%3D18MAsAvTZz78hAKOB3Za0nfi4aYz86mPV4ZNlW2xap0%253D%26reserved%3D0&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=Edinburgh+Parallel+Computing+Centre+%28EPCC%29&amp;index=17&amp;md5=e8b2edf0cef07aff7a93757c8d72d752\" rel=\"nofollow noopener\" shape=\"rect\">Edinburgh Parallel Computing Centre (EPCC)<\/a>, <a target=\"_blank\" 
href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fcerebras.net%2Fcerebras-customer-spotlight-overview%2Fspotlight-national-energy-technology-laboratory%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=National+Energy+Technology+Laboratory&amp;index=18&amp;md5=3b4a85948b4985f690ed62ca0f287301\" rel=\"nofollow noopener\" shape=\"rect\">National Energy Technology Laboratory<\/a>, and <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.businesswire.com%2Fnews%2Fhome%2F20210720005426%2Fen%2FNew-TED-AI-Lab-Featuring-Cerebras-Systems-CS-1-AI-Accelerator-Opens-in-Tokyo&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=Tokyo+Electron+Devices&amp;index=19&amp;md5=05f40d23203db11eef26465796db31cf\" rel=\"nofollow noopener\" shape=\"rect\">Tokyo Electron Devices<\/a>.\n<\/p>\n<p>\nFor more information on Cerebras Systems, please visit the <a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.cerebras.net%2Fblog%2Fcontext-is-everything-why-maximum-sequence-length-matters%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=Cerebras+blog&amp;index=20&amp;md5=e7422ca5d3b8ebd07029469703c3bb0e\" rel=\"nofollow noopener\" shape=\"rect\">Cerebras blog<\/a>.\n<\/p>\n<p>\n<b>About Cerebras Systems<\/b>\n<\/p>\n<p>\n<a target=\"_blank\" href=\"https:\/\/cts.businesswire.com\/ct\/CT?id=smartlink&amp;url=https%3A%2F%2Fcerebras.net%2F&amp;esheet=52842062&amp;newsitemid=20220831005105&amp;lan=en-US&amp;anchor=Cerebras+Systems&amp;index=21&amp;md5=68d0c6311b2b1dd4262220da67ed747f\" rel=\"nofollow noopener\" shape=\"rect\">Cerebras Systems<\/a> is a team of pioneering computer architects, computer scientists, deep learning researchers, and engineers of all types. 
We have come together to build a new class of computer system, designed for the singular purpose of accelerating AI and changing the future of AI work forever. Our flagship product, the CS-2 system, which is powered by the world\u2019s largest processor \u2013 the 850,000-core Cerebras WSE-2 \u2013 enables customers to accelerate their deep learning work by orders of magnitude over graphics processing units.\n<\/p>\n<p> <b>Contacts<\/b> <\/p>\n<p>\nMedia Contact:<br \/>\n<br \/>Kim Ziesemer<br \/>\n<br \/>Email: <a target=\"_blank\" href=\"&#109;&#x61;&#105;&#x6c;&#116;&#x6f;:&#x70;r&#x40;z&#109;&#x63;&#111;&#x6d;&#109;&#x75;&#110;&#x69;c&#x61;t&#x69;o&#110;&#x73;&#46;&#x63;&#111;&#x6d;\" rel=\"nofollow noopener\" shape=\"rect\">pr&#64;&#122;&#109;&#99;&#x6f;&#x6d;&#x6d;&#x75;&#x6e;ic&#97;&#116;&#105;&#111;&#x6e;&#x73;&#x2e;&#x63;&#x6f;m<\/a>\n<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>CS-2 Sets Another Record: The Only Single System Capable of Training Transformer-Style NLP Models with 20x Longer Sequences SUNNYVALE, Calif.&#8211;(BUSINESS WIRE)&#8211;#AI&#8212;Cerebras Systems, the pioneer in accelerating artificial intelligence (AI) compute, released yet another industry-first capability today. 
Customers can now rapidly train Transformer-style natural language AI models with 20x longer sequences than is possible using traditional &#8230; <span class=\"more\"><a class=\"more-link\" href=\"https:\/\/pharma-trend.com\/en\/cerebras-systems-enables-gpu-impossible-long-sequence-lengths-improving-accuracy-in-natural-language-processing-models\/\">[Read more&#8230;]<\/a><\/span><\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[13],"tags":[],"class_list":{"0":"entry","1":"post","2":"publish","3":"author-business","4":"post-47849","6":"format-standard","7":"category-industry"}}