Programming

I know, right? :laughing: And the reason that would be so weird is because we, as a society, have agreed on the minimum standards that must be met in order to call yourself a doctor. You can’t practice medicine w/out a license, and you can’t get a license w/out a medical degree (although, as soon as I typed that I realized I don’t know if that’s strictly true; but I also can’t imagine an example where it wouldn’t be – what would “or equivalent experience” look like for a doctor?).

But we don’t have anything like that for software development (or IT in general). Did you ship some code? Congratulations, you’re a programmer! :partying_face: Or are you? :thinking: It depends on who you ask.

Same here. And again, I’m not knocking folks w/out credentials, and I totally agree that the web is the place to go to keep up with the current state of the art (such as it is). Just imagine what it’d be like if doctors had to deal with the same pace of change we see in technology:

Sure, you’re an established physician, well-versed in anatomy and how the various organs interact to keep us hale and hearty (and alive), but check this out: from now on, some portion of your patients are going to have more than one liver. They’ll have 2n (where n >= 1) livers, to be precise. Multiple livers greatly increase throughput for the processing of ethyl alcohol, thus allowing people to operate at “lush scale”.
[12-18 months later]
OK, I know you’re just starting to feel like you’re finally getting the hang of troubleshooting parallel metabolization issues, but never mind all that. The new thing is Liverless! You should plan to remove all your patients’ livers as soon as possible.


This is a really interesting question.

My gut reaction is “no.” Software is at least nominally intentionally designed, and who would deliberately build a system as convoluted as the human body? I mean it’s a mess: multiple incompatible interfaces, many many layers of abstraction, overloaded operators, functions with side effects, global state, too-clever-by-half encodings, and worst of all, no documentation.

OTOH, sufficiently long-lived codebases can, I think, accurately be described as having evolved over time (due to different environmental pressures, even). And if you spread out enough of those systems and let them interact according to a few simple rules or protocols, such as we have with the internet, then it’s not too much of a stretch to describe it as an ecosystem with its own emergent properties, similar to biological systems.

My money’s still on biology being hella more complicated than technology, but give it time, maybe. If you subscribe to some version of the singularity, then it’s only a matter of time before our tools become more complex than we’re even capable of understanding.

7 Likes

Good way to put it. It seems to me it’s evolving in some pretty unplanned ways too.

Until our technology really gets down to machines on the molecular level, I would agree.

7 Likes

…Oops.

8 Likes

Talk about blaming the user.

7 Likes

If only they’d done some research first…

(I could swear I’ve seen a similar license plate issue a few years ago, but haven’t been able to come up with anything on google so far.)

10 Likes

(Replying into Programming topic since it seems more appropriate.)

Oof. I could probably rant for a long time but my mind is tired so I’ll try to keep it short. A couple of examples:

  • EAV data. It seems like every single programmer who has to work with a database at some point comes up with this great idea: instead of creating domain-specific data models with proper constraints and validations, why not just store a table of Entity, Attribute, and Value fields? Then anything can have any data structure! So flexible, and the domain becomes irrelevant, so we can do anything with it! (See the sketch after this list.)
  • Similarly, the opposite. Why not put each individual field in its own table, so we can also store all kinds of metadata about it? So flexible, and we can do anything with it!
  • Then there’s the perennial hot thing: ORMs. Why not just treat the database as dumb storage and rewrite all of the indexing, querying, joining, and sorting stuff in application code? That way we don’t have to deal with SQL!
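
To make the EAV point concrete, here’s a minimal TypeScript sketch. It’s purely illustrative: the `User` domain and every field name are made up. The pivot-and-validate dance at the bottom is exactly the work a proper schema would do for you:

```typescript
// Purely illustrative: a made-up "user" domain.

// What a real schema would say: shape and types are enforced up front.
interface User {
  id: number;
  email: string;
  age: number;
}

// What EAV actually stores: three stringly-typed columns, no constraints.
interface EavRow {
  entity: number;    // which record
  attribute: string; // free-form: "email", "Email", "e-mail"...
  value: string;     // everything is a string now, even ages
}

const rows: EavRow[] = [
  { entity: 1, attribute: "email", value: "ada@example.com" },
  { entity: 1, attribute: "age", value: "36" },
  { entity: 1, attribute: "agee", value: "37" }, // typo? dupe? nothing stops it
];

// Every read becomes a pivot, re-implementing what the schema gave us for free.
function toUser(id: number, all: EavRow[]): User {
  const attrs = new Map<string, string>();
  for (const r of all) {
    if (r.entity === id) attrs.set(r.attribute, r.value);
  }
  return {
    id,
    email: attrs.get("email") ?? "",  // hand-rolled NOT NULL
    age: Number(attrs.get("age")),    // hand-rolled typing; NaN if missing
  };
}

console.log(toUser(1, rows)); // { id: 1, email: "ada@example.com", age: 36 }
```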

Of course, with any of those, the queries are horrific, performance is awful, and the bulk of everything the database would handle so much better and more elegantly has to be re-implemented in the application code (and additional dependencies/libraries) instead.

After you’ve seen it, you know you’re still gonna end up having to make properly-modeled tables and/or denormalized ones for the software to even begin to work decently, and then you have to keep all those copies synchronized, which is awfully error-prone and a resource hog. And, worse, it needs a ton more application code to handle that mess. Yet every hot new thing seems to take at least one of those approaches (and/or go NoSQL).

Then there’s OOP. Almost everyone who learns it discovers the famous Patterns, gets blinded by the light of revelation and then that’s all they can see. Let’s add abstraction layers to everything! Inheritance, composition, and indirection everywhere! It’ll be so flexible!

If I see one more AbstractSingletonEntityFactoryManagerFactoryInterface… :face_with_symbols_over_mouth: Ravioli code is just spaghetti code with a designer logo.
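
Here’s what that looks like in practice, as a parody sketch (every name below is hypothetical, and this is not a recommendation): five tidy little pieces of indirection to add two numbers.

```typescript
// Parody: layers of Patterns standing between you and `2 + 2`.
interface Adder {
  add(a: number, b: number): number;
}

class PlainAdder implements Adder {
  add(a: number, b: number): number {
    return a + b;
  }
}

interface AdderFactory {
  create(): Adder;
}

class DefaultAdderFactory implements AdderFactory {
  create(): Adder {
    return new PlainAdder();
  }
}

// The obligatory singleton manager of factories.
class AdderFactoryManager {
  private static instance: AdderFactoryManager | undefined;
  private constructor(private factory: AdderFactory) {}
  static getInstance(): AdderFactoryManager {
    this.instance ??= new AdderFactoryManager(new DefaultAdderFactory());
    return this.instance;
  }
  getFactory(): AdderFactory {
    return this.factory;
  }
}

// Each piece looks tidy in isolation (ravioli!), but the call chain is a tangle:
const sum = AdderFactoryManager.getInstance().getFactory().create().add(2, 2);
console.log(sum); // 4. Or, you know: 2 + 2
```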

But those are fairly recent things that have peeved me lately. Going back further, you see things like the switch from thin-client dumb terminals connected to mainframes, over to fully-capable networked PCs, now back to thin-client browsers connected to cloud services.

Or how about the back and forth between “Statically-typed languages are so constraining and have so much boilerplate, a more dynamic language would be so much more productive and flexible!” vs “This dynamic language needs so much inline type-checking and validation, if only the compiler could do this for us it would be so much more productive and robust!”
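
The JavaScript-to-TypeScript migration is that cycle playing out in real time. A tiny sketch (the `area` functions are hypothetical, just to show both halves side by side):

```typescript
// The dynamic half of the cycle: type checks written by hand, run on every call.
function areaDynamic(w: unknown, h: unknown): number {
  if (typeof w !== "number" || typeof h !== "number") {
    throw new TypeError("width and height must be numbers");
  }
  return w * h;
}

// The static half: the compiler does the same check once, before the code ships.
function areaStatic(w: number, h: number): number {
  return w * h;
}

console.log(areaStatic(3, 4)); // 12
// areaStatic(3, "4");         // rejected at compile time; no runtime check needed
```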

We keep cycling over the same stuff, just reinventing the cycle’s wheels. And adding epicycles with the tooling. Not that tooling is bad. It’s just that the fact that we now need so many complicated tools to deal with the complexity we’ve made for ourselves says something.

Guess I failed at keeping it short. :man_facepalming:

I never actually ‘broke in’, but at the right age, dressed the right way, with a backpack… You could just walk right in and stay in the computer room until they shut the lights out and the security guard came around. Of course, then you had to take the stairs because the elevators were already shut down. :joy:

Not all of that time was spent playing MUDs or on IRC. We also had gopher, telnet, ftp, etc. Also found an open university dialin which you could use from your own home to telnet to other non-local universities and dial-out to what would’ve been long-distance BBSes.

12 Likes

+1 just for AbstractSingletonEntityFactoryManagerFactoryInterface
:laughing:

But this is the one I always think about in these sorts of conversations:

And if you count JavaScript frameworks like React or Node.js and the like (assuming those are valid examples – I haven’t done much web dev since jQuery was the new hotness), a lot of the heavy lifting has moved back to the browser, and instead of DLL hell, cross-browser compatibility is the new (OK, that’s not really new, per se) cross to bear. Same thing with mobile: it seems like every web site wants you to install their app, effectively making native apps the new fat clients.

I can’t remember who said it (it’s often attributed to Mark Twain, probably apocryphally), but this adage rings true for tech: “history doesn’t necessarily repeat itself, but it definitely rhymes.”

What I find interesting is that the cycles repeat, but the motivations driving these trends change. The initial move to fat clients was at least partly to take advantage of all the spare processing power on these new-fangled PCs. Now I suspect the push for native apps is driven primarily by the need to slurp up as much user/device data as possible, which is more difficult in the browser, especially since the advent of ad-blockers.

8 Likes

What’s damnable about this to-and-fro is that type inference is a thing. It has been possible to design a language with static typing without a great deal of boilerplate since the inception of ML some 46 years ago (and indeed there have been a fair number of such languages since then).
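
You don’t even need an ML descendant to see it these days. TypeScript’s inference isn’t full Hindley–Milner, but even this (hypothetical) fragment carries no annotations at all and is still statically checked end to end:

```typescript
// No annotations anywhere; the compiler works out every type on its own.
const nums = [1, 2, 3, 4];                    // inferred: number[]
const doubled = nums.map((n) => n * 2);       // inferred: number[]
const labeled = doubled.map((n) => `n=${n}`); // inferred: string[]

// Misusing an inferred value is still a compile-time error:
// labeled[0] * 2; // error: a string is not a number

console.log(labeled.join(", ")); // n=2, n=4, n=6, n=8
```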

7 Likes

As a computational evolutionary biologist, I’m often bothered by the attitude of some computer scientists that they’re going to solve biology.

Then there is this dude:

But also look at his [wiki](https://en.m.wikipedia.org/wiki/David_Gelernter) page. Shitlord gonna shitlord.

8 Likes

Sounds to me like he’s making some of the same fatal mistakes a lot of evolution deniers make:

“Try to mutate your way from 150 links of gibberish to a working, useful protein and you are guaranteed to fail. Try it with ten mutations, a thousand, a million — you fail. The odds bury you. It can’t be done.”

How many tries does he imagine there are in a planet-wide ocean over hundreds of millions or billions of years? Only a million? And do we know all the possible useful proteins there are in 150 links? Getting to a specific protein is difficult. Getting some working protein is a lot easier.
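
Back-of-the-envelope, just to show the shape of the argument. The trial count and the functional fraction below are loud assumptions, not measurements:

```typescript
// All inputs are rough assumptions; only the shape of the arithmetic matters.
const AMINO_ACIDS = 20;
const CHAIN_LENGTH = 150;

// Size of the sequence space: 20^150, about 1.4e195.
const sequenceSpace = Math.pow(AMINO_ACIDS, CHAIN_LENGTH);

// ASSUMPTION: total random "tries" in a planet-wide ocean over deep time.
const trials = 1e40;

// ASSUMPTION: fraction of random sequences with *some* useful activity
// (screens of random protein libraries suggest small but nonzero).
const functionalFraction = 1e-11;

// Chance of hitting one pre-specified sequence: effectively zero.
console.log("P(specific target):", trials / sequenceSpace); // ~7e-156

// Expected hits on *any* functional sequence: astronomically many.
console.log("E(any functional):", trials * functionalFraction); // 1e29
```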

I think he’s a wacko.

11 Likes

Right. I work with probabilistic models for understanding the evolution of morphology. Any combination of changes has a low probability. Any one change might have a high one.

You getting out of bed, putting on your shoes, and taking the exact sequence of steps it takes to get to work is a pretty low-probability event, too.

People, even really clever ones, are bad at understanding individual probabilities and probabilities of multiple events, particularly when non-independence is involved.
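
A toy calculation makes the trap visible (all numbers invented for illustration): multiply the marginal probabilities as if the steps were independent and the morning commute looks impossible; condition each step on the previous ones succeeding and it’s routine.

```typescript
// Toy numbers, invented for illustration: 20 sequential steps to get to work.
const STEPS = 20;

// Treating each step as a rare, independent event makes the trip look impossible:
const naiveJoint = Math.pow(0.5, STEPS);
console.log("independent-steps estimate:", naiveJoint); // ~9.5e-7

// But the steps aren't independent. Given that you're already up and moving,
// each next step is nearly certain, and the joint probability is the product
// of those *conditional* probabilities:
const conditionalJoint = Math.pow(0.95, STEPS);
console.log("conditional estimate:", conditionalJoint); // ~0.36
```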

8 Likes

It’s amazing how some people who are really bright in one field seem to think they are always bright in other fields.

5 Likes

Bright, competent people aren’t immune to Dunning-Kruger: Linus Pauling comes to mind.

8 Likes

There are no examples of mutations that are not fatal.

One must assume that the person stating this is from another planet and has never before observed Earth life.

9 Likes

But I bet he’s fine with the idea of using machine learning.

7 Likes

…coupled with genetic programming, no doubt…

7 Likes

How many tries does he imagine there are in a planet-wide ocean over hundreds of millions or billions of years? Only a million? And do we know all the possible useful proteins there are in 150 links? Getting to a specific protein is difficult. Getting some working protein is a lot easier.

He might be a good programmer, but he’s not a very good AI researcher/engineer if he doesn’t know enough math (infinities, calculus, even a bit of metaphysics and relativity) to understand the implications of deep time, even in a limited universe. The Drake equation has a shitload of variables. Maybe it really is uncommon for a stable environment to persist over deep time.

When one does, though, it seems to pay off in bounty.
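
For what it’s worth, the Drake equation is just a product of seven factors, and this sketch (every value an illustrative guess, not a measurement) shows how wildly the answer swings with the guesses:

```typescript
// N = R* · fp · ne · fl · fi · fc · L. Every value below is an illustrative
// guess; swap in your own and watch N swing by orders of magnitude.
const rStar = 2;  // new stars per year in the galaxy
const fP = 0.5;   // fraction of stars with planets
const nE = 1;     // habitable planets per planet-bearing star
const fL = 0.1;   // fraction of those where life appears
const fI = 0.01;  // fraction of those that develop intelligence
const fC = 0.1;   // fraction of those that emit detectable signals
const L = 10_000; // years a civilization keeps signaling

const N = rStar * fP * nE * fL * fI * fC * L;
console.log("communicating civilizations:", N); // 1, with these guesses
```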

7 Likes

I’m kind of obsessed with this keyboard. I just have a hard time imagining forking over three-hundred-ish dollars for it.

4 Likes

For $300, I hope they also include labels for the keys. Less cool looking, but useful.

7 Likes