• 0 Posts
  • 51 Comments
Joined 1 year ago
Cake day: October 1st, 2023


  • Electrical Engineering really is a no-frills field; you either can do it, or you can’t. Our only testing methodology is this: if they know what they’re doing, they’ll pass and do well in the major. If they don’t know what they’re doing, they’ll fail and rethink their major.

    Knowing what they’re doing is the important part. If genAI chatbots helped in that regard, we’d allow them, but none of us have observed any improvement; rather, they waste time they could be using to make progress on the assignment struggling instead to incorporate poorly optimized nonsense code they don’t understand. I can’t tell you how many times I’ve had conversations like:

    “Why doesn’t this work?”

    “Well I see you’re trying to do X, but as you know, you can’t do X so long as Y is true, and it is.”

    “Oh, I didn’t know that. I’ll rewrite my prompt.”

    “Actually, there’s a neat little trick you can do in situations like these. I strongly suggest you look up the documentation for function Z. It’s an example of a useful approach you can take for problems like these in the future.”

    But then instead of looking it up, they just open their chatgpt tab and type “How to use function Z to do X when Y is true.”

    I suppose after enough trial and error, they might get the program to work. But what then? Nothing is learned. The computer is as much a mystery to them after as it was before. They don’t know how to recognize when Y is true. They don’t know why Y prevents X. They don’t understand why function Z is the best approach to solving the problem, nor could they implement it again in a different situation. Those are the things they need to know in order to be engineers. Those are the things we test for. The why. The why is what matters. Without the why, there can be no engineering. From all that we’ve seen thus far, genAI chatbots take that why away from them.

    If they manage to pass the class without learning those things, they will have a much, much harder time with the more advanced classes, and all the more so when they get to the classes where chatgpt is just plain incapable of helping them. And if, even then, by some astronomical miracle, they manage to graduate, what then? What will they have learned? What good is an engineer who can only follow pre-digested instructions instead of making something nobody else has made?


  • I mean, they don’t generally keep their use of chatgpt a secret. Not for now, anyway. Meanwhile, the people who do well in the class write their code in a way that clearly shows they read the documentation, and have made use of the headers we’ve written for them to use.

    In the end, does it matter? This isn’t a CS major, where you can just BS your way through all your classes and get a well-paying career doing nothing but writing endpoints for some js framework. We’re trying to prepare them for when they’re writing their own architectures, their own compilers, their own OSes; things that have zero docs for chatgpt to chew up and spit out, because they literally don’t exist yet.


  • I TA for an electrical engineering class. It’s amusing to look at students’ code these days. Everything is so needlessly wrapped up in 3-line functions; students keep trying to do in 25 lines what can be done in 2, and it all becomes impossible to debug (there’s a sketch of the pattern at the end of this comment).

    When their code inevitably breaks, they ask me to tell them why it isn’t working. My response is to ask them what it’s meant to be doing, but they can’t answer, because they don’t know.

    The sad thing is we try to make it easy on them. Their assignment specs are filled with tips, tricks, hints, warnings, and even pseudo-code for the more confusing algorithms. But these days, students would rather prompt chatgpt than read docs.

    I’ve never seen chatgpt ever benefit a student. Either it misunderstands and just confuses the student with nonsense code and functions, or else in rare cases it does its job too well and the students don’t end up learning anything. The department has collectively decided to ban it and all other genAI chatbots starting next semester.
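
    To make the “25 lines for what can be done in 2” complaint concrete, here’s a made-up sketch of the pattern, a simple set-a-bit routine in C; it’s not from any actual submission, just the shape of what we keep seeing:

        #include <stdio.h>

        /* The over-wrapped version: every trivial step gets its own tiny function. */
        static unsigned int make_mask(unsigned int bit) { return 1u << bit; }
        static unsigned int apply_mask(unsigned int value, unsigned int mask) { return value | mask; }
        static void store_result(unsigned int *reg, unsigned int value) { *reg = value; }

        static void set_bit_wrapped(unsigned int *reg, unsigned int bit)
        {
            unsigned int mask = make_mask(bit);
            unsigned int result = apply_mask(*reg, mask);
            store_result(reg, result);
        }

        /* What the assignment actually calls for. */
        static void set_bit(unsigned int *reg, unsigned int bit)
        {
            *reg |= 1u << bit;
        }

        int main(void)
        {
            unsigned int reg = 0;
            set_bit_wrapped(&reg, 3); /* sets bit 3 */
            set_bit(&reg, 5);         /* sets bit 5 */
            printf("reg = 0x%02X\n", reg); /* prints reg = 0x28 */
            return 0;
        }

    Both versions produce the same result, but when the wrapped one breaks, there are three extra functions to trace through and no clear sense of what any of them is for.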




  • "Dear floss4life,

    Our developers have encountered an issue while using the open source framework you published on github. We have lost as many as 400 user accounts. The estimated cost of this error is $6800.

    This is unacceptable. Be a professional and fix it immediately.

    Chad Elkowitz, MBA, Gruvbert and Sons Finance Ltd"





  • “I need you to tell me how we can incorporate ai in our product.”

    “Ai? How could ai possibly benefit our product?”

    “Don’t ask me that. You’re the engineer, you should know.”

    “Well, then I’m telling you the product has nothing to gain from incorporating ai.”

    “Fine, I’ll keep looking until I can find someone with actual vision. See you at your performance review.”




  • It’s the things I hear from real software developers that concern me:

    • You will spend your entire career chasing trends.
    • The market is volatile. People are constantly getting abruptly laid off. SD has never been very stable, so you should plan on getting a new job every few years.
    • Software companies are constantly looking for ways to make SD easier. As a result, your value will decrease over time, in favor of bootcampers and 2-year degree graduates.
    • Nobody listens to developers. Your manager’s beliefs about SD come entirely from consultants, magazines, and Elon Musk tweets.
    • Nobody cares about quality software. If you take the time to make your code efficient and lightweight, all your manager sees is you taking longer to make something than your peers. After all, we can just raise hardware requirements if the software is slow.