Arthur Besse@lemmy.ml to Programmer Humor@lemmy.ml · 1 year ago — I prefer to be the one who writes code (image, fed.dyne.org)
Lucidlethargy@sh.itjust.works · 1 year ago: Man, it’s great until it confidently feeds you incorrect information. I’ve been burned far too many times at this point…
stebo02@sopuli.xyz · 1 year ago: That’s why you should always verify the information with other sources, just as you would with information from any other website or person. It’s not any different.
finestnothing@lemmy.world · 1 year ago: I only verify information I get on the Internet if I don’t agree with it or need to use it in an argument, and I’m not about to change.
Oof, glad I’m not alone
Wait…
Gabu@lemmy.ml · 1 year ago: TBH, if you can’t almost instantly figure out why and how ChatGPT suggested bad code, you shouldn’t be using it at all - you’re out of your depth. It’s why I’ll gladly use it to suggest Markdown or C code, but never for a complex Python library.
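(For illustration only, a hypothetical sketch and not code from the thread: the kind of subtly wrong C an assistant might suggest, where you need to be able to spot the problem yourself.)

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical assistant-suggested snippet with a classic subtle bug:
 * strncpy() does NOT null-terminate when the source string fills the
 * buffer, so the printf() below can read past the end of `buf`. */
int main(void) {
    char buf[8];
    const char *src = "programmer humor!";  /* longer than buf */

    strncpy(buf, src, sizeof(buf));         /* bug: no guaranteed '\0' */
    /* fix: buf[sizeof(buf) - 1] = '\0';    */

    printf("%s\n", buf);
    return 0;
}
```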
R0cket_M00se@lemmy.world · 1 year ago: Blaming the AI for misinformation is like blaming Google for giving you bad search results. Learn how to parse the data and fact-check it. Usually you can get a hyperlink to the source to see whether it’s even reasonably trustworthy.