CanadaPlus

joined 2 years ago
[–] CanadaPlus 1 points 2 years ago

Wow, does that mean parity with FF cars? I seem to remember the (up-front) price gap was in the thousands last I checked. Of course, it's already a better long-term deal.

[–] CanadaPlus 6 points 2 years ago* (last edited 2 years ago)

The thing people always overlook is that these legacy systems are only still running because they're super important. Nobody's hiring a junior COBOL dev to maintain NORAD, and hopefully nobody's contemplating putting ChatGPT in charge either.

The move if you want this kind of job is to learn a language that's not quite a dinosaur yet, and have 20 years' experience in 20 years. Perl or PHP, maybe.

[–] CanadaPlus 2 points 2 years ago* (last edited 2 years ago) (3 children)

At the simplest, it takes in a vector of floating-point numbers, multiplies it elementwise with other, similar vectors (the "weights"), sums each product, applies a ReLU* to each sum, and then uses those values as the input vector for another layer with its own weights (or gives output). The magic is in the weights.

This operation is just a matrix-by-vector product followed by an elementwise ReLU, if you know what that means.

In Haskell, something like:

relu x = max 0 x  -- ReLU: clamp negatives to zero

layer layerInput layerWeights = map relu $ map sum $ map (zipWith (*) layerInput) layerWeights

foldl layer modelInput modelWeights  -- thread the input through every layer in turn

Where modelWeights is [[[Float]]], and so layer has type [Float] -> [[Float]] -> [Float].

* ReLU: if i > 0 then i else 0. It could also be another nonlinear function, but ReLU is obviously fast and works about as well as anything else. There's interesting theoretical work on certain really weird functions, though.


Less simple, it might have a set pattern of zero weights which can be ignored, allowing a fast implementation with a bunch of smaller vectors, or it might have pairwise multiplication steps, like in the Transformer. Aaand that's about it; all the rest, like the encoding and the math behind how to train the weights, is stuff that was figured out by trial and error. Now you know.
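
To sketch that pairwise step in the same Haskell style, reusing layer from above (hadamard and gatedLayer are just illustrative names, not from any library; this shows a gate-style use, where the input goes through two weight matrices and the results get multiplied elementwise):

hadamard :: [Float] -> [Float] -> [Float]
hadamard = zipWith (*)  -- pairwise (elementwise) product of two vectors

gatedLayer input weightsA weightsB = hadamard (layer input weightsA) (layer input weightsB)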

Assuming you write the 32-bit weights as hex values, you could fit 4 on a line no problem:

wgt35 = [0x1234FCAB, 0x1234FCAB, 0x1234FCAB, 0x1234FCAB];

And you can sometimes get away with half-precision floats.
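
At half precision you could even pack two 16-bit weights into each 32-bit word, so the same line holds 8. A made-up example in the same style (wgt36 is hypothetical; 0x3C00 and 0xB800 happen to be 1.0 and -0.5 in IEEE half precision):

wgt36 = [0x3C00B800, 0xB8003C00, 0x3C003C00, 0xB800B800];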

[–] CanadaPlus 0 points 2 years ago* (last edited 2 years ago) (6 children)

Off the top of my head, 2. One with no UN seat and one long gone, to be fair, but they still exist and are/were sovereign. You can't say either turned into totalitarianism.

Maybe you could say they would have or will, but that's just your guess. I could say the same thing about liberal democracy and be equally well supported.

[–] CanadaPlus 1 points 2 years ago

Alright, that's a weaker claim (that is, less of an extraordinary claim) than I was expecting. LLMs aren't quite as good as a human at conceptual originality yet, and I can't prove they will catch up, especially if thematic subtext is the measure.

I guess I'll just say my original point stands then. There's a difference between something made from a prompt by ChatGPT, and something produced from a roughly equivalent text by a translation tool.

[–] CanadaPlus 1 points 2 years ago* (last edited 2 years ago)

They left a cryptic note about free will and not mixing textiles a few millennia back and haven't been on the mammals-dev Slack since.

It's alright, we're pretty sure adoption is going to peak soon anyway.

[–] CanadaPlus 1 points 2 years ago

Oooh, that's a new one to me! Biology is a never ending source of these oddball examples.

[–] CanadaPlus 1 points 2 years ago

I think this guy might be an exec, not a programmer.

[–] CanadaPlus 1 points 2 years ago

Yeah, why waste time talking? Sneakernet me!

[–] CanadaPlus 4 points 2 years ago* (last edited 2 years ago)

Quite possibly. I'm no good as a politician or salesperson, but that would be the policy solution to a lack of reliability in the allocation process.

If the guys who are always drunk on working Saturdays win because they have a longer attention span, that's just unbelievable.

[–] CanadaPlus 32 points 2 years ago* (last edited 2 years ago) (2 children)

To be clear to anyone skimming, we're currently spending half of what Russia does each month.

It's kind of impressive how well it's been going in that light. Our system is truly much more efficient.

In the future, it would be good if there were a way to allocate budget to supporting foreign wars the way it's allocated for domestic militaries. Right now it sounds like it goes package by package, so spending is very difficult to sustain once the public gets bored.

[–] CanadaPlus 3 points 2 years ago (5 children)

Actually, I bet you could implement that in fewer lines. You should be able to legibly get several weights on one line.
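
For instance, even plain decimal literals read fine at several weights per line (made-up values):

w1 = [0.12, -0.07, 0.33, 0.91, -0.48, 0.05]  -- six weights, one readable line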
