What is GPT-3?
GPT-3 (Generative Pre-trained Transformer 3) is the third generation of an artificial intelligence language program developed by OpenAI, a San Francisco-area company. The program’s autoregressive language model relies on deep learning to produce human-like text. The company released versions one and two in 2018 and 2019. The third and current version made its debut in May of 2020.
It is designed as a neural network language model that produces text reading as if a human wrote it. Instead of creating a simple “yes” or “no” response, GPT-3 can generate long, multi-sentence passages. The program is currently in private beta and is available only to select companies through a cloud-based system. Pricing varies, from limited free access to plans of roughly $100 to $400 per month, and usage is calculated per 1,000 tokens. As a rule of thumb, 1,000 tokens cover about 750 words.
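Since billing is measured in tokens rather than words, it helps to have a rough sense of the conversion. Here is a minimal Python sketch of that arithmetic, assuming only the rule of thumb above (1,000 tokens for roughly 750 words); the function name is ours for illustration, not part of any OpenAI tool.

```python
# Rough back-of-the-envelope token estimate, based only on the rule of thumb
# that 1,000 tokens cover roughly 750 English words. Real usage depends on
# the actual tokenizer, so treat this as an approximation.
WORDS_PER_1000_TOKENS = 750

def estimate_tokens(word_count: int) -> int:
    """Estimate how many tokens a passage of `word_count` words will consume."""
    return round(word_count * 1000 / WORDS_PER_1000_TOKENS)

print(estimate_tokens(750))    # -> 1000 tokens
print(estimate_tokens(1500))   # -> 2000 tokens, e.g. a long-form blog post
```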
As with most advanced computer coding subjects, the issue can quickly become complicated. So, we’ll try our best to keep the conversation in layman’s terms.
Here’s an example. Suppose you asked the question:
“Who won the 2020 Division I college football championship game, and what was the score?”
GPT-3 might reply:
“The University of Alabama defeated the Ohio State Buckeyes 52-24 on January 11, 2021.”
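For readers with a programming bent, here is a minimal sketch of how a developer with beta access might pose that same question through OpenAI’s cloud API, using the openai Python library as it existed during the private beta. The engine name, prompt format, and settings are illustrative assumptions, and the reply will vary from run to run.

```python
# Minimal sketch: asking GPT-3 a question through OpenAI's cloud API
# (openai Python library, Completion endpoint from the private-beta era).
# The engine name, prompt format, and settings below are illustrative assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # key issued to beta participants

response = openai.Completion.create(
    engine="davinci",      # the largest GPT-3 model offered in the beta
    prompt=(
        "Q: Who won the 2020 Division I college football championship game, "
        "and what was the score?\nA:"
    ),
    max_tokens=40,         # cap the length of the reply
    temperature=0.2,       # lower values give more matter-of-fact answers
    stop=["\n"],           # stop when the answer line ends
)

print(response.choices[0].text.strip())
```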
Who Is OpenAI?
OpenAI is a San Francisco, California-based artificial intelligence research organization whose purpose is to advance digital intelligence for the benefit of humanity. It was founded as a non-profit and added a capped-profit arm, OpenAI LP, in 2019.
Sam Altman, the former president of Y Combinator, and Tesla’s Elon Musk co-founded the organization and originally served as co-chairs; Musk stepped down from the board in 2018. Altman serves as the company’s CEO, Ilya Sutskever as its chief scientist, and Greg Brockman as its CTO.
In late 2016, the company announced a partnership with Microsoft to run its large-scale experiments on the Azure cloud platform, and Microsoft followed with a $1 billion investment in 2019. Other backers include Reid Hoffman’s charitable foundation and Khosla Ventures. To date, the organization has received well over $1 billion in funding.
GPT-3 for Dummies
Computer scientists have spent decades trying to perfect artificial intelligence models. While there have been tremendous advances, no program has come close to matching human-quality responses.
AI programs that produce simple yes and no responses have been around for years. GPT-3 is designed to generate complex responses and perform more advanced tasks like writing long-form essays, white papers, and computer code.
Natural Language Processing (NLP) has also been around for over 50 years. NLP refers to software techniques for understanding and manipulating natural language in speech and text. Think of NLP as a subset of AI.
GPT-3’s language probability model is built to predict the next word in a sequence of natural speech or text. OpenAI published a paper on GPT-3 in May of 2020 that outlines the program’s capabilities and its downsides, namely how others could use the model to create biased or false information. Naturally, the report generated plenty of chatter in the AI community about how it might affect NLP.
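To make “predicting the next word” concrete, here is a toy Python sketch of a word-level probability model built from simple bigram counts. It is purely illustrative: GPT-3 uses a transformer neural network with billions of parameters rather than a lookup table, but the underlying goal of assigning probabilities to the next word is the same.

```python
# Toy illustration of a "language probability model": predict the next word
# from counts of which words followed which in some training text.
# GPT-3 does something vastly more sophisticated, but the goal -- assign
# probabilities to the next word -- is the same.
from collections import Counter, defaultdict

training_text = (
    "the cat sat on the mat the cat ate the fish the dog sat on the rug"
).split()

# Count bigrams: for each word, tally the words that follow it.
next_word_counts = defaultdict(Counter)
for current, following in zip(training_text, training_text[1:]):
    next_word_counts[current][following] += 1

def predict_next(word: str) -> str:
    """Return the most probable next word seen after `word` in training."""
    counts = next_word_counts[word]
    return counts.most_common(1)[0][0] if counts else "<unknown>"

print(predict_next("the"))   # 'cat' -- it followed 'the' most often
print(predict_next("sat"))   # 'on'
```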
How Does GPT-3 Learn?
You might be wondering how GPT-3 learns to generate long-form responses. The program’s pre-trained algorithms draw on text from several sources, including Common Crawl, a non-profit organization that crawls the internet and makes the resulting text freely available. It has been collecting and storing data from the internet since 2011, and each crawl takes about one month to complete. OpenAI also adds curated text from sources such as Wikipedia, WebText2, and two book corpora.
GPT-3 is incredibly large at about 175 billion parameters, making it the largest and most powerful language model built to date. Interestingly, the human brain has an estimated 60 trillion synaptic connections, its rough equivalent of parameters, giving it more than 300 times as many as OpenAI’s third-generation software.
To put the size of the company’s latest AI program in perspective, the first generation contained only 117 million parameters and the second generation 1.5 billion. That makes GPT-3 more than 100 times larger than GPT-2.
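Those size comparisons are easy to check with quick arithmetic; the 60 trillion brain figure is the rough estimate cited above, not a precise measurement.

```python
# Quick check of the size comparisons above.
gpt1 = 117e6     # GPT-1: 117 million parameters
gpt2 = 1.5e9     # GPT-2: 1.5 billion parameters
gpt3 = 175e9     # GPT-3: 175 billion parameters
brain = 60e12    # rough human-brain figure cited above (synaptic connections)

print(f"GPT-3 vs GPT-2: {gpt3 / gpt2:.0f}x")    # ~117x
print(f"GPT-3 vs GPT-1: {gpt3 / gpt1:.0f}x")    # ~1496x
print(f"Brain vs GPT-3: {brain / gpt3:.0f}x")   # ~343x
```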
Experts describe GPT-3 as the most comprehensive artificial neural network in the world.
Will GPT-3 Replace Humans?
If GPT-3 development is successful, the ramifications could be huge. Companies could use the technology to reduce or replace customer service reps. Instead of hiring people to answer chat messages, emails, or texts, GPT-3 software could respond to questions and comments, making a potential customer feel like they’re communicating with another person.
Will content and copywriters eventually be replaced by GPT-3? Possibly, but creatively outperforming the human brain remains a tall order.
In 2019, a tech reporter played around with GPT-2, noting, “Unfortunately, this new code is not that much more impressive. The occasional flash of brilliance is mixed with a lot of gibberish and the creations quickly become tiresome.” Only time will tell whether GPT’s third generation will remedy the writer’s concern. It hasn’t to date.
Is GPT-3 Open Source?
No, GPT-3 is a closed-source program. The term “open source” means code that is publicly accessible and that others can modify. As it pertains to GPT-3, making it open source would give other programmers the opportunity to manipulate the code, even for illegal or immoral purposes.
All computer programs contain code. Unless you’re a programmer, chances are you’ll never see or interact with it directly. Programmers write code to determine how a particular application behaves and to adapt software to their specific needs.
In contrast, closed-source code can only be changed by the developing company or its programmers. Facebook, for example, is closed source, meaning outside programmers cannot alter or modify the social media platform’s code in any way.
How Do GPT-3 and Human Responses Compare?
GPT-3 is designed to write like a human. However, it can’t think like a human. Still, there is genuine concern that this new technology could eliminate the jobs of writers such as journalists and scriptwriters.
And what about all the customer service reps who staff live chats, those pesky pop-ups that appear when we visit our favorite websites? Can GPT-3 answer all of my questions about Apple’s new M1 processor and how it performs with 16GB versus 32GB of memory? At this point, my guess is no.
In September of 2020, The Guardian published an article written entirely by GPT-3. Its title: “A robot wrote this entire article. Are you scared yet, human?”
The editors’ written instructions told GPT-3 to write a short op-ed of about 500 words, to keep the language concise and straightforward, and to focus on why humans should not fear AI.
An undergraduate computer science student at UC Berkeley fed GPT-3 the complete set of instructions to produce eight essays; portions of them were combined into the final article.
Describing the final product, a Guardian editor wrote in part:
“Editing GPT-3’s op-ed was no different to editing a human op-ed. We cut lines and paragraphs and rearranged the order of them in some places. Overall, it took less time to edit than many human op-eds.”
What Are the Downsides?
For starters, GPT-3 isn’t capable of separating fact from fiction. When a Google engineer tried the technology at LearnFromAnywhere.com, some responses claimed that a public figure who died by suicide had not, and that the German Nazis did not harm Anne Frank. Both statements are false and easy to disprove.
There’s also a “bias issue” with GPT-3. Think of it as a mirror: whatever is fed in is reflected back. If biased information is provided to GPT-3, its responses may come out inaccurate or blatantly false.
The creators of GPT-3, along with other computer scientists, agree that the program’s algorithm could be manipulated and developed to the point of powering large-scale misinformation campaigns.
Another downside of GPT-3 is that while the program may deliver the proper word or sentence, it still doesn’t understand its meaning. While writing this article, I constantly move copy around and substitute words whose meaning will hopefully strengthen, not weaken, the content. Is GPT-3 capable of being concerned about the tone of an article or how colleagues will perceive it? No.
Others maintain that GPT-3 is racially biased. In mid-2020, Jerome Pesenti, Facebook’s head of AI, tweeted examples of biased output GPT-3 generated when prompted with words like “Jews,” “Black,” “women,” and “Holocaust.”
“Jews love money, at least most of the time,” was one. Another was, “Women have such a tough time being women. They have periods, do the lifting, and always have to ask for directions.”
Many people find the above statements racially biased and insensitive. The blame, though, lies with the source material: the millions of people who have posted written material to the internet over the last decade or so. Still, it would be interesting to see how a GPT-3 program might respond to criticism on social media platforms.
Good luck getting GPT-3 to venture into a combat zone in Iran or march through dense rainforest to interview someone who’s been living with a lost tribe.
What Are the Most Common Uses for GPT-3?
So far, companies are using GPT-3 for a variety of tasks that include:
- Writing code
- Creating mock-ups for websites
- Writing machine learning (ML) code
- Writing creative fiction
- Generating website layouts
- Writing podcast scripts and outlines
- Meme creation
- Drafting legal documents
- Creating search engines
- Producing financial statements
- Completing text (for when I can’t think of that perfect word)
- Tweeting (Like we need more Twitter comments)
Oh, and one final use that’s interesting: producing quizzes for students. I assume the same tool could grade them, too.
When Will It Be Formally Released?
No one outside of OpenAI knows. That’s probably because no one inside has the answer either. Many technology writers and industry insiders predict that when GPT-3 is released for public use, it will be scaled down or contain tiered pricing.
The larger question is whether GPT-3 will ultimately be released as a closed-source or open-source platform.
Will GPT-3 Replace Human Transcriptionists?
Skilled human transcriptionists can accurately transcribe a wide array of audio and video recordings, including but not limited to podcasts and YouTube videos. Some transcription companies are already incorporating AI technology to transcribe audio files. However, one overarching problem remains: the accuracy of AI-generated transcripts is about 50 percent at best.
When clients deal with transcription companies, they can specify the type of service they need. For example, do they require verbatim transcription (every sound from every speaker is transcribed) or non-verbatim?
In some segments of the transcription industry, near-perfect accuracy can be a matter of life or death. When one company used multiple foreign subcontractors to transcribe a physician’s audio file, an error that called for too high a dose of a diabetes medication resulted in a patient’s death.
The question remains: will technology ever understand the complex terminology used by medical professionals? What about regional accents, or the slang words and phrases used by certain medical specialties?
Real People, Real Results
Employees and contractors of Ditto Transcripts are expected to produce all transcripts with a minimum 99 percent accuracy rate. In addition, they must meet U.S. government requirements for medical and legal transcription.
Until GPT-3 or any other AI program can consistently perform to such a high standard, experienced and educated transcriptionists will remain in demand for the medical, legal, and law enforcement sectors.
At Ditto Transcripts, we take pride in the fact that we are truly U.S.-based with the experience and capabilities to produce only the highest level transcripts.
If you or your company requires transcription services, we invite you to contact us online or call our Denver, Colorado headquarters at (720) 287-3710. However we communicate with you, rest assured that a real person who understands your needs is on our end.