ChatGPT - The Good, The Bad & The Ugly
ChatGPT is a new artificial intelligence tool, created by OpenAI, that you interact with through a chat interface. You can ask it almost any kind of question, and it will reply the way a human would: with fully formed sentences.
It is also important to note that ChatGPT remembers its state. This means that when you ask a question, it interprets it in the context of your previous questions.
How does it do all this? The "GPT" in its name stands for Generative Pre-trained Transformer. It was trained on a large dataset of text, which it uses as the source for generating potentially long answers from short inputs.
The possibilities are endless, but here are some examples of what you can do with it.
- Ask general questions
"Tell me about the history of independence of Luxembourg"
- Tell it to view things from a certain point of view
"You are a robot from the future and came to earth today. You are observing Christmas for the first time. Tell me what you think"
- Tell it to write songs
- Come up with ideas for blog posts
"Give me 10 Christmas-themed ideas for blog posts about artificial intelligence"
- Write entire blog posts
"Write a blog post about Using AI to improve the accuracy of weather forecasting for Christmas"
- Write tweets
"Write 10 controversial tweets that relate well to web developers"
- Write code
"Write a Vue component that lets snowflakes fall down on the website"
- Convert code
"Convert this PHP code to Node.js code
<paste in code here>"
- Write an outline for a talk or article
"I want to give a conference talk about Tailwind versus Bootstrap. Can you give me an outline of that?"
- Write a story
"Write a Harry Potter short fan fiction where Harry turns evil"
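To make the code-conversion prompt above concrete, here is a sketch of the kind of transformation you might ask for. Both the PHP input and the Node.js output are my own illustration, not actual ChatGPT output:

```javascript
// Hypothetical PHP you might paste in:
//
//   function greet($name) {
//       return "Hello, " . strtoupper($name) . "!";
//   }
//   echo greet("world");
//
// A Node.js equivalent ChatGPT could plausibly produce:
function greet(name) {
  // PHP's strtoupper() becomes String.prototype.toUpperCase(),
  // and "." string concatenation becomes a template literal
  return `Hello, ${name.toUpperCase()}!`;
}

console.log(greet("world")); // prints "Hello, WORLD!"
```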
As you can see, your imagination is the only limit to what you can do with this tool.
There are many great things about ChatGPT, so I want to touch on them first.
When you need to be creative and you are just blanking, this is the perfect tool to get the creative juices flowing. You could, for example, ask ChatGPT for a few different viewpoints on a certain topic, or have it propose blog post topics in your area of expertise.
For storytellers: it can also help you write a story. I am not saying you should copy that story (actually, you shouldn't do that), but it might give you a few ideas you can work into your own.
Similar to assisting with the creative workflow, it can also help you outline a topic you want to write about, so you don't forget anything. You could, for example, ask it to outline an article for you, and you have something to work from.
Instead of googling things, you can ask ChatGPT and it will give you a precise answer. You could for example ask it about the difference between a framework and a library.
This could be of tremendous help for beginners in a certain topic (it doesn't have to be tech).
Also great for beginners: you can ask it to explain something. For example, paste in a piece of code and ask it what the code does. Depending on how complex the code is, it should give you a great answer.
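As an illustration (my own example, not from ChatGPT), here is the kind of snippet you might paste in with the question "What does this code do?":

```javascript
// A memoization helper: a classic snippet a beginner might
// want explained.
function memoize(fn) {
  const cache = new Map();
  return (arg) => {
    if (!cache.has(arg)) {
      cache.set(arg, fn(arg)); // compute once, reuse on later calls
    }
    return cache.get(arg);
  };
}

// A good explanation would point out that repeated calls with the
// same argument skip the underlying computation:
let calls = 0;
const slowSquare = (n) => { calls++; return n * n; };
const fastSquare = memoize(slowSquare);
fastSquare(4);
fastSquare(4);
console.log(calls); // the underlying function ran only once
```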
When you're stuck on a problem, it can be useful to ask ChatGPT what it thinks about it. It might come up with a point of view you haven't thought of.
But not everything is sunshine and rainbows. There are some downsides.
At the time of writing this article, my understanding is that ChatGPT has been trained on data up to 2021. So it might not be accurate if you ask it about new technologies, current events, and so on.
For certain types of questions, there might be contradicting points of view. This is something that ChatGPT might not always take into account. (Although, to be fair, for some things I tried it wasn't able to give a concrete answer.)
My problem here, though, is: where does the information it provides me with come from? I asked it where it got a certain piece of information from, and it wasn't able to tell me.
> I am a large language model trained by OpenAI. I do not have access to external information and I generate all of my responses based on the input I receive from users. My responses are generated using advanced machine learning algorithms that have been trained on a large dataset of text, allowing me to understand and generate natural language. I do not have access to the internet or any external sources of information.
In other AI tools, such as GitHub Copilot, you are able to correct information. It feeds back into the system and makes it better. This is not the case with ChatGPT.
> I'm sorry to hear that there is some inaccurate information. I am a large language model trained by OpenAI, so I don't have the ability to browse the internet or verify information. I can provide general information on a wide range of topics, but I don't have the ability to update or correct specific information. Is there something specific you would like to know more about? I would be happy to help answer any questions you may have to the best of my ability.
I am all for this tool helping students understand things better and assisting with their studies. But I know firsthand that some students use it to write entire essays outright or to answer exam questions.
They might get excellent grades on the work they turn in, but it doesn't help them learn to solve problems. People who cheat their way through school like this might never develop a healthy set of problem-solving skills.
This harms them more than anyone else, but I see it as an issue nonetheless.
We just saw how easy it is to come up with a list of topics to write about, and how easy it is to generate blog posts about almost any topic.
This makes it even easier for people to pretend they are experts when they are not.
Granted, this problem existed before ChatGPT as well, but this makes it even easier. And people pretending to be experts when they are not is not even the real issue I see.
The potential issues I see here are:
- People could give false "expert" advice to people (e.g. on social media or in conference talks)
- People could sell courses to others who believe they are experts, basically cheating people out of their money.
And will AI replace our jobs as developers? I don't think this will happen anytime soon.
As they always say: jobs that require creativity are not being replaced by AI. Although this has been partially debunked (e.g. DALL-E, Stable Diffusion), I still think that coding requires a certain kind of creative problem solving that, for the time being, I cannot imagine a computer doing.
Tools like ChatGPT will make our jobs as developers easier and replace chunks of work we used to do manually. However, they will not understand customer requirements, or the best way to implement them for that matter.
Not everything that ChatGPT spits out is gold.
It is a great tool to get started and get a basic understanding of things, but we must always question whether the solution it gives us for a problem is the best one.
It's up to us to decide whether the answer is correct, or not.
ChatGPT is an unbelievably useful tool if we know how to use it correctly. There are some downsides I see to it, but all in all I am excited to have it in our lives.
And I wouldn't worry too much about our jobs being replaced - as long as we adapt to the changing circumstances, we are going to be just fine.