Hacker News | gilgoomesh's comments

> Why use splines or polynomials or haphazardly chosen basis functions if you can just fit (gradient descent) your data or wave functions to the proper computational EML tree?

Same reason all boolean logic isn't performed with combinations of NAND – it's computationally inefficient. Polynomials are (for their expressivity) very quick to compute.
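For instance, with Horner's rule a degree-n polynomial costs only n multiply-adds. A quick sketch in Python (purely illustrative):

```python
def horner(coeffs, x):
    """Evaluate a polynomial with coefficients [c_n, ..., c_1, c_0] at x.

    Horner's rule needs only n multiplies and n adds for degree n.
    """
    acc = 0
    for c in coeffs:
        acc = acc * x + c
    return acc

# 3x^2 + 2x + 1 evaluated at x = 2 -> 17
assert horner([3, 2, 1], 2) == 17
```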


The Cray 1 was built 100% from NOR gates and SRAM.

50 years ago.

Correct. Your point being? Digital logic didn't change.

They are done with transistors though. Transistors form an efficient, single element, universal digital basis.

And are a much less arbitrary choice than NAND, vs. NOR, XOR, etc.

If you use transistors as conceptual digital logic primitives, where power dissipation isn't a thing, pass logic is "The Way".


Single transistors aren't yet logic gates by themselves; they are amplifiers with a very specific gain function that makes it possible to use them as switches. Logic gates usually consist of at least two transistors. See https://en.wikipedia.org/wiki/CMOS for an example of how it is done in CMOS technology.
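A toy model of the CMOS idea (Python, idealized switches; real transistors are analog devices, so this only captures the digital abstraction):

```python
def cmos_inverter(vin):
    """CMOS inverter: one PMOS pull-up plus one NMOS pull-down.

    The PMOS conducts when its gate is low, the NMOS when its gate is
    high, so exactly one network drives the output at a time -- two
    transistors for even the simplest gate.
    """
    pmos_on = (vin == 0)
    nmos_on = (vin == 1)
    assert pmos_on != nmos_on  # never both: no contention, no static current
    return 1 if pmos_on else 0

assert [cmos_inverter(v) for v in (0, 1)] == [1, 0]
```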

That is correct.

Moreover, the amplifying function must exist at least in some gates, to restore the logic levels, but there is no need for it to exist in all gates.

For instance, any logic circuit can be implemented using AND gates and/or OR gates made with diodes, which have no gain, i.e. no amplifying function, together with 1-transistor inverters, which provide both the NOT function and the gain needed to restore the logic levels.

The logic functions such as NAND can be expressed in several ways using simpler components, which correspond directly to the possible hardware implementations.

Nowadays, the most frequent method of logic implementation is by using parallel and series connections of switches, for which the MOS transistors are well suited.

Another way to express the logic functions is by using the minimum and maximum functions, which correspond directly with diode-based circuits.

All the logic functions can also be expressed using the 2 operations of the binary finite field, addition (a.k.a. XOR) and multiplication (a.k.a. AND).

This does not lead to simpler hardware implementations, but it can simplify some theoretical work by making algebraic results applicable. Actually, this way of expressing logic is the one that should properly have been named Boolean logic, as the contribution of George Boole was precisely replacing "false" and "true" with "0" and "1" and reinterpreting the classic logic operations as arithmetic operations.

It is very weird to see in some programming languages a data type named Boolean whose possible values are "false" and "true", instead of the historically correct Boolean values "0" and "1". The latter can also be much more useful in programming than "false" and "true", by simplifying expressions, especially in array operations (which is why array-oriented languages like APL use "0" and "1", not "false" and "true").
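The min/max and GF(2) views above can be sketched in a few lines of Python (illustrative only, not hardware):

```python
# Diode logic: OR follows the highest input, AND follows the lowest.
def or_minmax(a, b):
    return max(a, b)

def and_minmax(a, b):
    return min(a, b)

# GF(2) arithmetic: XOR is addition mod 2, AND is multiplication.
def xor_gf2(a, b):
    return (a + b) % 2

def and_gf2(a, b):
    return (a * b) % 2

# Any logic function can be built from the GF(2) pair; e.g. OR:
def or_gf2(a, b):
    # a OR b = a + b + a*b (mod 2)
    return (a + b + a * b) % 2

for a in (0, 1):
    for b in (0, 1):
        assert or_gf2(a, b) == or_minmax(a, b)
        assert xor_gf2(a, b) == (a != b)
```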


"Pass transistors" are transistors being operated in pass/impedance switch mode.

Pass logic. Digital. [0]

This is extremely basic digital circuit design. You can create digital circuits as compositions of gates. But you can often implement the same logic, with fewer transistors, using pass logic.

Pass logic is also great for asynchronous digital design.

[0] https://en.wikipedia.org/wiki/Pass_transistor_logic
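To illustrate the transistor-count argument, here's a rough behavioral sketch (Python, idealized switches; transistor counts are approximate and depend on the process/style):

```python
def pass_mux(a, b, sel):
    """2:1 mux in pass logic: each pass transistor conducts when its
    gate is high, routing a (sel low) or b (sel high) to the output.
    Roughly 2 pass transistors, plus an inverter for the select."""
    return a if sel == 0 else b

def nand(x, y):
    return 1 - (x & y)

def gate_mux(a, b, sel):
    """Same mux from four NAND gates, a typical gate-level form:
    (a AND NOT sel) OR (b AND sel). Four CMOS NANDs is ~16 transistors."""
    not_sel = nand(sel, sel)
    return nand(nand(a, not_sel), nand(b, sel))

# Both implementations agree on every input combination.
assert all(pass_mux(a, b, s) == gate_mux(a, b, s)
           for a in (0, 1) for b in (0, 1) for s in (0, 1))
```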


> Transistors form an efficient, single element, universal digital basis

But transistors can be N or P-channel, so it’s not a single logical primitive, like e.g. NAND-gates.


True.

NAND gates also use both N- and P-channel transistors, and need at least as many, but frequently many more, transistors for anything non-trivial.

But as you point out, as a computational basis, pass transistors have two variants.


I think you can look at Star Trek as a fairly grounded example of where current LLMs could go: the ship's computer is not autonomous in any way but it does accept fairly vague instructions and you can apparently vibe-code the holodeck.

I'm hoping more for Red Dwarf.

For 20-ish% of Alzheimer's patients, the shingles vaccine may be a treatment. This has been suspected for a few years now, and recent studies have provided confirmation.

https://www.alzheimers.org.uk/news/2025-11-18/promising-rese...


While the study was about the shingles vaccine, I wonder whether having gone through shingles itself influences the chances of later developing Alzheimer's, positively or negatively.


It seems to be back, now.


Do they define "Handmade"? I couldn't find a definition.


I assume you have absolutely no clue what it refers to.

Handmade Hero is a long-running YouTube series by Casey Muratori. He builds a game engine from scratch: no cheats, no shortcuts, straight to the metal (from a C-ish perspective). So you learn how to deal with computers to achieve things, fast and efficiently, by understanding computers.

At some point Casey thought it was a failure and a waste of time. But to his surprise quite a fanbase evolved around it, and it turned out that it really helped people go from zero to "hero". The handmade "movement" relates to this timeline and the aftermath of people thriving from it. My rough definition of the "Handmade" dev mentality would be: ignore the things that seem to make things "easy" (high-level software) and learn the actual thing. So you learn what a framebuffer is instead of looking for a drawing API, which is applicable across different contexts.

That being said, this foundation doesn't seem to be endorsed by Casey. Their mission goals seem quite shallow, if articulated at all.


> no cheats, no shortcuts, straight to the metal (from C-ish perspective)

Not the person you replied to but even when I stumbled over this (the network, not the game) for the first time, I was left wondering where the line is drawn.

> You can learn how computers actually work, so you can unleash the full potential of modern systems. You can dig deep into the tech stack and learn what others take for granted.

Just... no libraries? Are modern languages with batteries included OK? What makes a library for C worse than using Python? Is using Python too bloated already? Why is C OK, and why don't I have to bootstrap a compiler first? (E.g. building with Rust is a terrible experience from a performance perspective, but the resulting software can be really nice and small.)

I'm not even trying to be antagonistic, I simply don't understand. I'm just not willing to accept "you'll notice when you see it" as an example.


My immediate assumption was that this was a reaction against LLM-assisted or -written software, but I couldn’t see any mention of this on the front page.

So maybe ‘handmade’ refers to artisanal, high quality, made with care, etc.



That page seems like it's trying to define Handmade through a bunch of complaints about what it is not.

Still no idea what they actually do, other than maybe this is just some random site about building a community to "make better software".

Software isn't bad because engineers don't care. It's bad because eventually people need to eat food, so they need to get paid, which means you have to build something people will pay for. That involves tradeoffs and deadlines, so we take shortcuts and software ends up imperfect.


> Software isn't bad because engineers don't care.

Caring is certainly a wide spectrum. I see the Handmade stuff as being proudly on the far end of it.


I don't disagree; the field has become lucrative enough that it has attracted people who are interested in the money and not the craft.

I'd use "unrealistic" to describe Handmade; "proud" is also accurate and works too.


> the field has become lucrative enough that it has attracted people who are interested in the money and not the craft

Yup, exactly.

> I'd use "unrealistic" to describe Handmade; "proud" is also accurate and works too

In certain settings definitely. But even in those corporate settings where it's unrealistic I'd rather work with one than not. If not applied dogmatically, that corner of the corp has a good chance of being an oasis. But a fleeting one perhaps.


I read that but it doesn't define handmade. It gripes about large frameworks and rewriting in different languages but doesn't say what handmade is or how it addresses anything.


I agree it's vague.


You got Liquid Metal? I'm stuck with Liquid Glass :-(


Liquid metal / glass, what's the difference? It's all shiny and amorphous.


That 3B model is a local model that eventually got built into macOS 26. Gemini 3 Pro is a frontier model (cloud). They're very different things.


Sure. And the same paper describes a ‘larger’ cloud-served model.


Network operations are asynchrony. Asynchrony and parallelism are both kinds of concurrency, and Swift concurrency handles both.

Swift's "async let" is parallelism. As are Task groups.


Sure, but as soon as they released their first iteration, they immediately went back to the drawing board and just slapped @MainActor on everything they could because most people really do not care.


Well yes, but that’s because the iOS UI is single threaded, just like every other UI framework under the sun.

It doesn’t mean there isn’t good support for true parallelism in swift concurrency, it’s super useful to model interactions with isolated actors (e.g. the UI thread and the data it owns) as “asynchronous” from the perspective of other tasks… allowing you to spawn off CPU-heavy operations that can still “talk back” to the UI, but they simply have to “await” the calls to the UI actor in case it’s currently executing.

The model works well for both asynchronous tasks (you await the long IO operation, your executor can go back to doing other things) and concurrent processing (you await any synchronization primitives that require mutual exclusivity, etc.)

There’s a lot of gripes I have with swift concurrency but my memory is about 2 years old at this point and I know Swift 6 has changed a lot. Mainly around the complete breakage you get if you ever call ObjC code which is using GCD, and how ridiculously easy it is to shoot yourself in the foot with unsafe concurrency primitives (semaphores, etc) that you don’t even know the code you’re calling is using. But I digress…


Not really true; @MainActor was already part of the initial version of Swift Concurrency. That Apple has yet to complete the needed updates to their frameworks to properly mark up everything is a separate issue.


async let and TaskGroups are not parallelism, they're concurrency. They're usually parallel because the Swift concurrency runtime allows them to be, but there's no guarantee. If the runtime thread pool is heavily loaded and only one core is available, they will only be concurrent, not parallel.


> If the runtime thread pool is heavily loaded and only one core is available, they will only be concurrent, not parallel

Isn't that always true for thread pool-backed parallelism? If only one core is available for whatever reason, then you may have concurrency, but not parallelism.
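Not Swift, but the same distinction can be sketched with a Python thread pool (a hypothetical illustration, not how the Swift runtime works):

```python
# Tasks submitted to a pool are *concurrent* by construction, but only
# *parallel* when enough workers (and cores) are available.
from concurrent.futures import ThreadPoolExecutor
import threading

seen_threads = set()

def task(n):
    seen_threads.add(threading.current_thread().name)
    return n * n

# With one worker the tasks still interleave with the caller
# (concurrency), but can never run simultaneously with each other
# (no parallelism among them).
with ThreadPoolExecutor(max_workers=1) as pool:
    results = list(pool.map(task, range(4)))

assert results == [0, 1, 4, 9]
assert len(seen_threads) == 1  # everything ran on a single pool thread
```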


> Charlie Brown may have been as popular as any character in all of literature

Was he? Maybe this is true inside the US but from outside the US, I've always viewed the character as a peculiarly American artefact – something I was aware of but never really read or watched. This seemed to be reinforced by most major Charlie Brown titles seemingly tied to other American customs like Halloween and baseball.


Snoopy as a character is popular in Japan, but only as a character design - kind of like Hello Kitty. There is zero awareness of any of the shows or really Charlie Brown himself.


I'm Brazilian, in my middle 40s. When I was a little kid my best friend used to carry a blanket around. Neighbors called him "Linus" for years. But I'm confident it was because of the TV show, not the comic strips.


The BBC published this article. I agree with "all of literature" being hyperbolic though.


It was very popular in Australia. Serialised in newspapers for many years. As a kid, our family owned pretty much every Charlie Brown paperback.


People in eg Germany are mostly aware of the Peanuts, but it's nowhere near as central to the culture as in the US.


I'm an American and I've really never related to Charlie Brown myself, but I've heard Peanuts is huge in Japan and other Asian countries.


Mac sales are up 12%, year over year. It's Apple's fastest growing hardware category. They're just going to be lower next month (year over year), due to the release cycles being different.

