This is from the account that spread the image originally: https://x.com/ai_for_success/status/1793987884032385097
Alternate Bluesky link with screencaps (must be logged in): https://bsky.app/profile/joshuajfriedman.com/post/3ktarh3vgde2b
Apologies for being so sketchy on the details, but I really can’t remember too many of the specifics. I’m fairly certain it wasn’t just that his family name came first, because that’s fairly straightforward. I think the author might have been from an East or Southeast Asian culture? I think part of the essay might have been about how addressing him as Mr. Firstname is actually more formal than Mr. Lastname, even though Firstname is not his family name. I don’t want to keep guessing at more details about how the naming conventions were different because I’m probably going to get them wrong; I have fairly low confidence in what I remember from it.
Because I have been completely unable to find it again and this seems like a relevant place to ask: does anyone have a link to an article similar to this one, which I believe might have been titled ‘My First Name is My Last Name’? It’s extra hard to look up because I’ve forgotten the specific culture and details it discusses, but it’s about the same basic issue with cultural conventions around names.
Especially in context, where it’s contrasting QA testers and ‘normal’ people.
It would probably take longer to prompt ChatGPT to write this than it would to just write it. It’s two short paragraphs.
Is there a stair version of a Norman door? I feel like I’ve seen a story about every president stumbling or falling on those steps. They must be particularly wily, for stairs.
ChatGPT is OK at summarizing popular, low-specificity topics that tons of people have already written a lot about, but it’s terrible at anything else. When I tested its knowledge of the process behind a niche interest of mine (fabric dyeing), it skipped over certain important pieces of information entirely, and when I prompted it to include them it basically just mirrored my prompt back at me.
Which pretty much sums up my ChatGPT experience: it just regurgitates stuff I can find myself, but removes the ability to determine whether the source is reliable. And if it’s something I’m already having trouble finding detailed information about, it usually doesn’t help.
Not sure I understand why train games are on topic because they’re open source, but an open-source desktop icon isn’t.