I’m starting to like this AI thing…
Sounds like a critical race condition or bad memory access (the latter only in languages with pointers).
Since it’s HTTP(S), and judging by the average level of multi-threading experience I’ve seen among developers, even those doing work that naturally involves multiple threads (such as serving multiple simultaneous network clients), my bet is the former.
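To make that first diagnosis concrete, here’s a minimal hypothetical sketch (all names and timings made up, nothing from the actual meme) of the check-then-act pattern that produces this kind of bug. In JavaScript-land the analogue of a thread race is two overlapping async requests both passing a check before either one updates the shared state:

```typescript
// Hypothetical sketch: a check-then-act race in an async request handler.
// `inventory` is shared mutable state; all names are made up for illustration.
let inventory = 1;

// Stands in for a slow payment call; while it's pending, the event loop
// is free to run other pending requests.
function chargeCustomer(): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, 100));
}

async function handlePurchase(): Promise<string> {
  if (inventory > 0) {          // check
    await chargeCustomer();     // suspension point: another request runs here
    inventory -= 1;             // act: can now drive inventory below zero
    return "purchased";
  }
  return "sold out";
}

// Two "simultaneous" requests: both see inventory === 1, both charge the
// customer, and inventory ends up at -1.
Promise.all([handlePurchase(), handlePurchase()]).then((results) => {
  console.log(results, "inventory:", inventory); // ["purchased","purchased"] -1
});
```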
PS: Yeah, I know it’s a joke, but I made the serious point anyway because it might be useful for somebody.
I suspect both variants indirectly come from the same source, or maybe La Haine itself is indirectly the source for my variant (though I learned this joke a long time ago, possibly before 1995).
By the way, that’s an excellent film intro.
Reminds me of the joke about the guy falling from the top of the Empire State Building who, halfway down, was heard saying: “Well, so far, so good.”
Who needs integration testing when we have users who will do it for us?!
That sounds like an error in the specification of the client-server API, or an erroneous implementation on the server side for the latest version. Nothing should be signalled via the presence or absence of fields when using JSON, precisely because, as I described in my last post, the convention with JSON is that stuff that is not present should be ignored (i.e. it has no meaning at all) for backwards compatibility, which breaks if all of a sudden presence or absence are treated as having meaning.
Frankly, the fact that there isn’t a specific field signalling authorized/not-authorized leads me to believe that whoever designed that API isn’t exactly experienced at that level of software design. Authorization information should be explicit, not implicit; otherwise you end up with people checking for not-in-spec side effects, like you did, exactly for that reason (i.e. “is no data being returned because the user is not authorized, or because there was indeed no data to return?”). That’s prone to break, since anything not properly part of the spec can be interpreted differently and/or changed at any moment by any of the teams working on it.
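As a minimal sketch of what “explicit, not implicit” means here (the shape and field names are hypothetical, not from any real API): the response itself distinguishes “not authorized” from “authorized but no data”, so no client ever has to guess from a missing field:

```typescript
// Hypothetical response shapes; field names are made up for illustration.
// "No data" and "not authorized" are distinguishable in-spec.
type AccountResponse =
  | { status: "ok"; transactions: string[] }   // list may legitimately be empty
  | { status: "not_authorized"; reason?: string };

function render(response: AccountResponse): string {
  switch (response.status) {
    case "ok":
      return response.transactions.length > 0
        ? response.transactions.join("\n")
        : "No transactions yet.";              // empty data, but authorized
    case "not_authorized":
      return "You don't have access to this account.";
  }
}
```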
If I remember it correctly, per the usual JSON interop convention, a key that is present but not expected should be ignored.
The reason for that is to maintain compatibility between versions: it should be possible to add more entries to the data while old versions of the software that consumes it continue to operate, as long as all the data they’re designed to handle is still there and still in the correct format.
Sure, that’s not a problem in the blessed world of web-based frontends, where the user’s browser just pulls the client code from the server so frontend and backend are always in sync. It is a problem for all the other kinds of frontend out there, where the life-cycles of the client application and the server differ: good luck getting all your users to update their mobile apps (or whatever) every time you want to add functionality (and hence data in the client-server comms) to that system.
(Comms API compatibility is actually one of the big problems in client-server systems development)
So it sounds like an issue with the way your JavaScript library handles JSON, or with your own implementation not handling, per spec, the presence of data you don’t use.
Granted, if the server-side dev only makes stuff for your frontend, then he or she needn’t be an asshole about it and can be more accommodating. If, however, that data also has to serve other clients, then I’m afraid you’re the one in the wrong: you’re demanding that nobody else rely on the backwards compatibility the unknown-keys-are-ignored convention provides (which, as I pointed out, is a massive problem when you can’t guarantee that all client apps get updated as soon as the server does) just because you couldn’t be arsed to do your implementation correctly.
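For illustration, a minimal sketch of that convention from the consumer side (the payload and field names are made up): the client reads only the fields it was designed for and ignores everything else, which is exactly what lets the server add fields without breaking old clients:

```typescript
// Hypothetical v1 client parsing a payload from a newer server.
// It reads only the fields it was designed for and ignores the rest,
// so a server-side addition (here, "nickname") doesn't break it.
interface UserV1 {
  id: number;
  name: string;
}

function parseUserV1(json: string): UserV1 {
  const raw = JSON.parse(json);
  const id = raw.id;
  const name = raw.name;
  if (typeof id !== "number" || typeof name !== "string") {
    throw new Error("payload missing fields this client requires");
  }
  // Unknown keys (e.g. "nickname") are simply never copied over.
  return { id, name };
}

// A newer server sends an extra field; the old client still works.
console.log(parseUserV1('{"id": 7, "name": "Ada", "nickname": "ada99"}'));
```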
It’s an Android development tool that runs on a PC and can connect to an Android device via USB to control it.
It lets you do way more than what you can do directly on the Android device itself.
Just make it an LJM (Large JSON Model) capable of predicting the next JSON token from the previous JSON tokens, and you’d have massive savings in file storage and network traffic from not having to store and transmit full JSON documents, all in exchange for an “acceptable” error rate.
Don’t take this badly, but it sounds like you’ve only seen a tiny slice of the software development done out there and had some really bad experiences with Agile in it.
It’s perfectly understandable: there are probably more bad uses of Agile out there than good ones and certain areas of software development tend to be dominated by environments which are big bloody “amateur hour every hour of the day, every day of the year” messes, Agile or no Agile.
That does not, however, mean that your experience stands for the entirety of what’s out there, trumping even the experience of other people who also work in QA in environments where Agile is used.
Agile was definitely taken up with the same irrationality as a fashion at some point.
It’s probably the best software development process philosophy for certain environments (for example: where there are fast-changing requirements and easy access to end users) whilst being pretty shit for others (good luck trying to fit it in at a process level when some of the software development is outsourced to independent teams, or using it for high-performance systems design), and it eventually came out of that fad period mostly being used more for the right things (even if, often, less than properly) and less for the wrong things.
That said, the Agile-as-fad phase was over a decade ago.
Agile made Management, who until then had actual Senior Designer-Developers and Technical Architects designing and adjusting development processes, think that they had a silver-bullet software development recipe that worked for everything, so they didn’t need those more senior (read: more expensive and unwilling to accept the same level of exploitation as the more junior types) people anymore.
It also drove the part of the Tech Industry that relies mainly on young and inexperienced techies and management (*cough* Startups *cough*) to think they didn’t need experienced techies.
As usual, it turned out that “there are no silver bullets” and things are more complex: Agile doesn’t work well for everything, and various individual practices of it only make sense in some cases (in some they’re even required for the rest to work properly) whilst in others they’re a massive waste of time (and sometimes the useful-wasteful balance depends on frequency and timing). Plus, in some situations (outsourced development) they’re extremely hard or even impossible to pull off at project scope.
That said, I bet that what you think of as “The Industry” is mainly Tech companies in the US, rather than where most software development actually occurs: large non-Tech companies with a high dependency on software for competitive advantage (such as Banks) and hence with more than enough specific software requirements to hire vast software development departments to develop custom solutions in-house for their specific needs.
Big companies whose success depends on their core business-side employees doing their work properly care a lot more about software not breaking or even delaying their business processes (and hence hire QA to find those problems in new software before it even gets to the business users) than Tech companies do. The latter provide software to non-paying retail users who aren’t even their customers (the customers are the advertisers they sell access to those users’ eyeballs to), and hence will shovel just about anything out and hopefully sort out the bugs and lousy UX/UI design through A/B testing and user bug reports.
Yeah.
Any good software developer is going to account for and even test all the weird situations they can think of… but not the ones they cannot think of, since they’re not even aware of those as a possibility (if they were, they would account for and test them).
Which is why you want somebody with a different mindset to independently come up with their own situations.
It’s not a value judgment on the quality of the developer; it’s just accounting, at the software development process level, for the fact that humans are not all-knowing, not even devs ;)
“Wrong way” for whom?
In Software Development it ultimately boils down to: are you making software for the end users, or are you making it for yourself?
Because in your example, that’s ultimately what defines whose notion of “wrong” the developer is supposed to guide him/herself by.
(So yeah, making software for fun or for your own personal use is going to follow quite different requirement criteria than making software for use by other people.)
I’m pretty sure that won’t stand in the way of somebody inventing a square bottle nipple and blaming the users for not using it properly.
I’ve actually worked with a genuine UX/UI designer (not a mere Graphics Designer but their version of a Senior Developer-Designer/Technical-Architect).
Let’s just say most developers aren’t at all good at user interface design.
I would even go as far as saying most Graphics Designers aren’t all that good at user interface design.
Certainly that explains a lot of the shit user interface design out there, the same as the “quality” of most common Frameworks and Libraries (such as those from the likes of Google) can be explained by them not actually having people with real-world Technical Architect or even Senior Designer-Developer experience overseeing the design of Frameworks and Libraries for 3rd-party use.
Actually, there are plenty of interpreted programming languages, for example Perl or Shell Script, so that definition is incorrect.
HTML is not a programming language because it only defines form (how things look) and does not control action (executing operations by itself).
The language in Web Development that controls the execution of operations (say: if the user fills in a certain field, fetch related data from a server and display it in certain page areas) is called JavaScript, and it is separate from HTML (which existed before JavaScript and can exist without it).
Modern Web standards have also moved a lot of the form stuff to yet another language, CSS (Cascading Style Sheets), which is more powerful and reusable, so HTML is now used more for the visual structure of the page and less for things like the fonts of the various pieces of text, though it still supports that stuff and you can still use it.
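As a minimal sketch of that separation (the element IDs and the endpoint are made up): the HTML only declares an input field and an empty area, while the script, in the separate language, reacts to the user filling the field, fetches the related data, and displays it:

```typescript
// Hypothetical page script; the element IDs and endpoint are made up.
// The HTML itself only declares <input id="user-field"> and <div id="details">;
// all the behaviour below lives in the separate script.
const field = document.getElementById("user-field") as HTMLInputElement;
const details = document.getElementById("details") as HTMLDivElement;

field.addEventListener("change", async () => {
  if (field.value === "") return;  // nothing to look up
  // Fetch related data from the server for whatever the user typed…
  const response = await fetch(`/api/details?q=${encodeURIComponent(field.value)}`);
  const data = await response.json();
  // …and display it in the designated page area.
  details.textContent = JSON.stringify(data);
});
```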
For the common folk, working with a markup language is programming.
LOL!
It’s been many years since I was young enough that this was all it took to reach that level of hilarity.
It’s the most boring part of the technical side of the job, especially at the more senior levels, because it’s so mind-numbingly simple, eats a significant proportion of development time, and is usually what ends up having to be redone when there are small changes in things like input or output interfaces (i.e. adding, removing or changing data fields). That’s probably one of the main reasons why maintaining and updating code already in Production is a far less pleasant side of the job than the actual creation of the application/system.