I think the thing with C++ is they have tried to maintain backward compatibility from Day 1. You can take a C++ program from the 80s (or heck, even a straight up C program), and there’s a good chance it will compile as-is, which is rather astonishing considering modern C++ feels like a different language.
But I think this is what leads to a lot of the complexity as it stands? By contrast, I started Python in the Python 2 era, and when they switched to 3, I was like “Wow, did they just break hello world?” It’s a different philosophy and has its trade-offs. By reinventing itself, it can get rid of the legacy cruft that never worked well or required hacky workarounds, but old code will not simply run under the new interpreter. You have to hope your migration tools are up to the task.
C++ is not a superset of C.
There were breaking changes between C and C++ (and some divergent evolution since the initial split) as well as breaking changes between different releases of C++ itself. I am not saying these never happened, but the powers that be controlling the standard have worked hard to minimize these for better or worse.
If I took one of my earliest ANSI C programs from the 80s and ran it through a C++23 compiler, I would probably need to remove a bunch of register statements and maybe check if an assumption of 16-bit int is going to land me in some trouble, but otherwise, I think it would build as long as it’s not linking in any 3rd party libraries.
Do you have a minute for our lord and saviour TypeScript?
As long as it can distinguish between int and uint - yesss!
TypeScript is still built on JavaScript, all numbers are IEEE-754 doubles 🙃
Edit: Actually I lied, there are BigInts which are arbitrarily precise integers but I don’t think there’s a way to make them unsigned. There also might be a byte-array object that stores uint8 values but I’m not completely sure if I’m remembering that correctly.
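To make that concrete, here’s a quick sketch in plain TypeScript (standard APIs, assuming a Node runtime and an ES2020+ target so the BigInt literals compile; the values are just for illustration):

```ts
// Every ordinary number is an IEEE-754 double, so integers are only
// exact up to 2^53 - 1:
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991

const a: number = 9007199254740993; // this literal already rounds on parse
const b: number = 9007199254740992;
console.log(a === b); // true (!)

// BigInt is arbitrary-precision but signed-only; there is no
// unsigned variant:
const big: bigint = 2n ** 100n;
console.log(-big); // negative values are perfectly legal

// The closest you get to unsigned semantics is explicit wrapping:
console.log(BigInt.asUintN(64, -1n)); // 18446744073709551615n
```

Note that asUintN doesn’t give you an unsigned type, it just wraps the value modulo 2^64; the result is still an ordinary signed bigint.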
Not only is there a Uint8Array, there’s also a bunch of others: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray#typedarray_objects
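A small sketch of how they behave (again standard JavaScript APIs, written here as TypeScript; the element values are just for illustration):

```ts
// Typed arrays are real fixed-width integer storage over a raw buffer:
const bytes = new Uint8Array(4);
bytes[0] = 255;
bytes[1] = 256; // wraps around to 0: genuine uint8 overflow semantics
console.log(bytes); // Uint8Array(4) [ 255, 0, 0, 0 ]

// Signed and wider variants exist as well:
const words = new Int16Array([-1, 32768]); // 32768 wraps to -32768
console.log(words); // Int16Array(2) [ -1, -32768 ]

// The catch: reading an element hands you back a plain double again,
// so the integer semantics only apply at the storage boundary:
console.log(typeof bytes[0]); // "number"
```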
Yeah, JavaScript is a bit weird, semicolons being optional and compulsory at the same time: I remember trying to build an Electron example ~5yrs ago and it didn’t work until I put back the semicolons the developers had omitted.
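That sounds like the classic automatic-semicolon-insertion trap. A hypothetical TypeScript sketch of the kind of line that bites (not the actual Electron code, just the shape of the problem):

```ts
const nums = [1, 2, 3];

// Looks like two statements, but ASI does NOT insert a semicolon
// before a line starting with `[`; it parses as one expression,
// `nums.map(...)[(3, 4)].forEach(...)`, and throws at runtime:
//
//   const doubled = nums.map((n) => n * 2)
//   [3, 4].forEach((n) => console.log(n))
//
// The comma expression (3, 4) evaluates to 4, indexing the mapped
// array at 4 gives undefined, and calling .forEach on undefined
// is a TypeError.

// With explicit semicolons the two statements stay separate:
const doubled = nums.map((n) => n * 2);
[3, 4].forEach((n) => console.log(n));
```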
Python is just glorified shell scripting. Libraries like numpy are cool, but I don’t like the indentation crap; I’m getting used to it because my university likes it.
Absolutely not, python is an actual programming language with sane error handling and arbitrarily nestable data structures.
Don’t be so superficial. When learning something, go with the flow and try to work with the design choices, not against them.
Python is simply written a bit differently: you use more function definitions and list comprehensions, for example.
Do you not use indentation in other languages?
Jokes aside, I struggle more with abominations like JavaScript and even Python.
Python has its quirks, but it’s much much cleaner than js or c++, not fair to drag it down with them imo
The terse indexing and index manipulation gets a bit Perl-ish and write-only to me. But other than that I agree.