
ChatGPT has been a big thing lately. It also makes a lot of mistakes. For example, it invents functions that don't exist in a package and presents them as if they worked for real. I was wondering how that works. Why does it create fake code functions instead of simply saying that it doesn't know how to solve the question it was asked?


Example taken from Stack Overflow.

Question

Link: https://stackoverflow.com/q/74662093/

Title:

App script accessing secret manager to retrieve api token

Body:

I am trying to access Secret Manager from a small script I built on Apps Script. I need to access an API with a key but do not want to hardcode it.

I have already assigned the app to a project. I am just not sure how to query the token.

Answer (deleted)

Link: https://stackoverflow.com/a/74682465/1595451

Body:

To access the Secret Manager from Google App Script, you can use the SecretManager.getSecret(secret) method, which takes a secret parameter specifying the name of the secret you want to retrieve. For example:

 // Get the secret named "my-api-token"
 var secret = SecretManager.getSecret('my-api-token');
 
 // Use the secret value in your script
 var apiToken = secret.getValue();

Note that this method will throw an error if the specified secret does not exist or if the user does not have permission to access it. You can use a try-catch block to handle these cases gracefully. For example:

 try {
   // Get the secret named "my-api-token"
   var secret = SecretManager.getSecret('my-api-token');
 
   // Use the secret value in your script
   var apiToken = secret.getValue();
 } catch (e) {
   Logger.log('Error getting secret: ' + e.message);
 }

Also note that the getValue() method returns the secret value as a string, so you may need to convert it to the appropriate data type (such as a number or boolean) depending on your use case.

For more information, see the Secret Manager documentation and the SecretManager class reference.

The SecretManager.getSecret(secret) method doesn't exist, and, obviously, the links return a page-not-found error.
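
For contrast, as far as I can tell the real route is to call the Secret Manager REST API through UrlFetchApp; there is no built-in SecretManager service in Apps Script. A minimal sketch, assuming the script's Cloud project has the Secret Manager API enabled and the manifest requests the https://www.googleapis.com/auth/cloud-platform scope (the project ID and secret name below are placeholders):

 // Sketch only: reads the latest version of a secret via the REST API.
 // Assumes the Secret Manager API is enabled on the script's Cloud project
 // and the manifest requests the cloud-platform OAuth scope.
 function getSecretValue() {
   var project = 'my-project-id'; // placeholder
   var secret = 'my-api-token';   // placeholder
   var url = 'https://secretmanager.googleapis.com/v1/projects/' + project +
       '/secrets/' + secret + '/versions/latest:access';

   // Authenticate with the script's own OAuth token.
   var response = UrlFetchApp.fetch(url, {
     headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() }
   });

   // The secret payload arrives base64-encoded in the JSON response.
   var data = JSON.parse(response.getContentText()).payload.data;
   return Utilities.newBlob(Utilities.base64Decode(data)).getDataAsString();
 }

None of this looks anything like the invented SecretManager.getSecret() method in the deleted answer.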

  • It would have been nice if you had given an example. When this happens, it mostly comes down to the training data; the output depends heavily on how the model was trained. Also, it is a generalised model and might not be good at specialised tasks; you might want to use Codex instead. Commented Dec 31, 2022 at 10:27
  • Experts don't like to say, "I don't know." Commented Jan 13 at 1:16
  • ChatGPT is trained to lie, and to do that passably well. It is simulating knowledge, not having it. All the source material it is trained on consists of human communications full of metaphors, generalizations, and other half-truths and rhetorical devices. When humans try to approximate unambiguous, truthful communication, they end up with mathematics or lawyerly language, and only a small part of the internet and other digitized content contains that. Commented Jan 15 at 11:40

1 Answer


ChatGPT is a generalised model. It does not understand any code. It does not know it is creating fake functions. It doesn't know that it isn't solving the question asked, because it doesn't know what the question means. All it knows is what sort of words fit together based on its training corpus.

This works very well for chat, but is less suited for anything technical, specialised, or in fact non-chat.
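
To make that concrete, here is a toy sketch of next-token prediction. It is purely illustrative (the corpus, the function names, and the bigram counting are all invented for this example; real models learn weights over subword tokens rather than counting words), but the failure mode is the same: the model emits whatever continuation fit its training text best, with no notion of whether that continuation is true.

 // Toy bigram "language model": count which word follows which
 // in a tiny corpus, then predict the most frequent continuation.
 var corpus = 'to access the secret use the secret manager use the api'.split(' ');

 var next = {};
 for (var i = 0; i < corpus.length - 1; i++) {
   var w = corpus[i];
   next[w] = next[w] || {};
   next[w][corpus[i + 1]] = (next[w][corpus[i + 1]] || 0) + 1;
 }

 // Pick the most common follower; "truth" never enters the picture.
 function predict(word) {
   var counts = next[word] || {};
   return Object.keys(counts).sort(function (a, b) {
     return counts[b] - counts[a];
   })[0];
 }

 console.log(predict('the')); // "secret": the most plausible continuation, not a verified fact

A fabricated SecretManager.getSecret() is exactly this kind of output at scale: it is the sort of thing that plausibly follows "To access the Secret Manager from Google Apps Script, you can use", whether or not it exists.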

  • I have used ChatGPT very few times. On my first attempt, I asked how to create a macro in Google Sheets; 80% of the answer was correct, but the 20% that was incorrect was very disappointing. A few days later, after researching the WordPress API docs, Stack Overflow, and WordPress SE without finding anything that helped me get unstuck, I made a new attempt, and surprisingly it created several Google Apps Script functions to create WordPress posts with different parameters, and they worked correctly.
    – Rubén
    Commented Jan 4, 2023 at 1:01
  • "Utterly unsuited" seems rather strong; it must be used with caution.
    – innisfree
    Commented Oct 26, 2023 at 1:32
  • @innisfree - I have updated. As time goes on your point becomes ever more correct.
    – Rory Alsop
    Commented Jan 12 at 15:51
  • I would like to say that humans are better at filtering out rubbish than a compiler, but that would not be correct. Compilers refuse to compile rubbish, but humans take it on and run with it. Failing would really be better. Commented Jan 13 at 1:19
  • Any computer of our time is utterly incapable of the activity "know", insofar as the general use of this verb implies a reflecting consciousness behind it. Commented Jan 15 at 11:35
