Hacker News | past | comments | ask | show | jobs | submit | efsavage's comments

bassbuzz.com isn't free, but it's really good. I'm about halfway through the course and feel like I've already gotten more than my money's worth out of it.

I'd argue that they're above average for the population, and below average for experts. Can they draw as well as an expert/professional illustrator? Probably not. Can they draw better than almost anyone who isn't an expert/professional illustrator? Probably.

I think the value we're losing is in people being bad at things, which is often where new ideas/approaches come from. But this is a macro metric, so it's a hard sell to the person struggling when there's an easy button available.


> I'd argue that they're above average for the population, and below average for experts. Can they draw as well as an expert/professional illustrator? Probably not. Can they draw better than almost anyone who isn't an expert/professional illustrator? Probably.

That's pretty much the definition of "average" (as most commonly used, to refer to "mean" rather than median or, much less commonly, mode), isn't it?


I don't think so. To put some made-up but illustrative numbers on it: I think AI is going to be worse than the 1% of people who do X professionally or at a high level, and better than the 99% who don't.

Can Suno make a better song than $YOUR_FAVORITE_ARTIST? Unlikely. Can it make a song better than 99% of a random selection of people? Probably.

I think this is actually a good thing in many ways. If I have a tool that elevates me at things I'm not very good at (like making songs), which far outnumber the things I am good at, that's a big win for me personally. It's just a loss for the population, since the people who would push music further aren't going to be encouraged to struggle through the curve and find their own path.
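The made-up numbers above can be sketched quickly. Everything here (the scores, the distributions, the AI's "skill" of 80) is hypothetical, just to show that "better than the 99% who don't" while "worse than the top 1%" is consistent with being well above the population mean:

```python
import random

random.seed(0)

# Hypothetical skill scores: 99% of the population are amateurs,
# 1% are professionals with much higher (and tighter) skill.
amateurs = [random.gauss(50, 10) for _ in range(9900)]
pros = [random.gauss(95, 3) for _ in range(100)]
population = amateurs + pros

mean = sum(population) / len(population)
ai_skill = 80  # better than nearly all amateurs, worse than the pros
above_ai = sum(s > ai_skill for s in population)

print(f"population mean skill: {mean:.1f}")
print(f"share of people better than the AI: {above_ai / len(population):.1%}")
```

With numbers like these the AI lands far above the mean (roughly 50) while still trailing essentially the entire professional 1%, which is the shape of the claim being made.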


> I'd argue that they're above average for the population, and below average for experts. Can they draw as well as an expert/professional illustrator? Probably not. Can they draw better than almost anyone who isn't an expert/professional illustrator? Probably.

This has always been true for any new technology. What's bizarre here, however, is that billions and potentially trillions are being dumped into learning this the hard way.

There's a reason why specialisation is a thing and has driven economies forward for the better part of the last century. This is not going away. 'Democratisation' is a pipe dream, and frankly it should be: equal opportunity, not equal outcome.


Great QA people are rarer than great developers, and potentially even more valuable.


Yes, QA and test engineers tend to have a better ability to specify the correct behavior of a system than anyone else. It's literally their job.

This is a big asset in the current paradigm, where an LLM agent can effectively implement behavior only when given a robust test suite to iterate against.
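That paradigm can be sketched as a small loop. The `generate_patch`/`apply_patch` callbacks here stand in for a hypothetical LLM backend, and `pytest -q` is just a placeholder test command; none of this is any particular agent's real API:

```python
import subprocess

def run_tests(test_cmd: str) -> tuple[bool, str]:
    """Run the suite in a shell; return pass/fail plus output for the model."""
    result = subprocess.run(test_cmd, shell=True, capture_output=True, text=True)
    return result.returncode == 0, result.stdout + result.stderr

def agent_loop(generate_patch, apply_patch, test_cmd="pytest -q", max_iters=5):
    """Iterate: run the tests, feed failures to the model, apply its patch."""
    for _ in range(max_iters):
        passed, output = run_tests(test_cmd)
        if passed:
            return True
        # The failing output is the feedback signal for the next attempt.
        apply_patch(generate_patch(output))
    return run_tests(test_cmd)[0]
```

The test suite is the real specification here: the loop only terminates successfully when the QA-authored tests pass, which is why people who can specify correct behavior precisely become so valuable.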


> “good enough” for 95% of use cases

Maybe, for current use cases. I'd argue that anyone who thinks they can do everything a 10kW server can do on their 10W device just isn't being creative enough :)


Yegge's list resonated a little more closely with my progression to a clumsy L8.

I think eventually 4-8 will be collapsed behind a more capable layer that can handle this stuff on its own, maybe I tinker with MCP settings and granular control to minmax the process, but for the most part I shouldn't have to worry about it any more than I worry about how many threads my compiler is using.


I was surprised the author didn’t mention Yegge’s list (or maybe I missed it in my skim).


>"Yegge's list resonated a little more closely with my progression to a clumsy L8."

I thought level 8 was a joke until Claude Code agent teams. Now I can't even imagine being limited to working with a single agent. We will be coordinating teams of hundreds by year's end.


Yep I was also surprised to see MCP & Skills as not only a distinct "level", but so high up.

In my mind, MCP & Skills are an inseparable part of chat interfaces for LLMs, not a distinct level.


Agreed a bit. I'm probably too paranoid for MCP, but also don't mind rolling my own CLI tools that do the exact minimum I need them to do. Will see where we're at in a year or so....


Those are all great things to do, but I don't think OP needs to do more things, they need to do different things. The biggest thing that jumped out was that they know they need to be around people, but they work remotely and with a huge time shift.

My top advice would be to get an in-person job, even if that means less money, moving, or pivoting to a new industry. Even better, find a job where people are your business, so you're not pinning everything on socializing with co-workers. The people I know who are like this do jobs where they have to meet/find customers, coordinate people and teams, do on-site projects, etc. They are energized and fulfilled by these interactions even if the job itself isn't that important to them.


I think there should be an option to assume I'm a child and proceed from there. If I want access to any mature content or real identity-related stuff, I'll verify, but if your service doesn't have or need that anyway, then there's no reason to prove I'm an adult.


Roblox: Hold my beer


In the earliest days of getting people to pay for cable TV when OTA was free, the pitch was that you'd see fewer/no commercials. That didn't last long...


> In the earliest days of getting people to pay for cable TV when OTA was free, the pitch was that you'd see fewer/no commercials.

No, it was quality of reception, especially for people who were farther from (or had inconvenient terrain between them and) broadcast stations; literally the only thing on early cable was exactly the normal broadcast feed from the covered stations, which naturally included all the normal ads.

Premium add-on channels that charged on top of cable, of which I think HBO was the first, had being ad free among their selling points, but that was never part of the basic cable deal.


That varied by region. When cable came to my town in the early 1980s, HBO and Cinemax were part of the local cable provider's basic package. That lasted until the next provider bought them out.


Oh, sure, definitely some providers did things like that early on to drive growth, especially when they were trying to push into areas less dissatisfied with existing broadcast quality than the initial cable markets. (And even once it stopped, it was common to bundle premium channels into the basic cost for a limited time for new customer acquisition.)


This doesn't ring true; TV has always been deeply linked with ads. It just seems that they moved to fractional ownership of a show via many advertisers vs. the (perhaps less intrusive) single show sponsor, where the advertising was woven into the plot.


I think I'm older than most HN commenters. I can't Google up a citation, but "no or fewer ads" was part of the pitch in the early-mid 1970s in my recollection. You are correct about TV and ads, so maybe I'm wrong.


Not really. Cable TV started as a better way for people to get OTA channels when they were in marginal reception areas. My family had cable TV in the 1970s, and it was maybe eight or ten OTA channels; except for the PBS station, they all had commercials, between shows and during shows.

HBO was the first offering that didn't have ads during the show.


CATV originally stood for 'community antenna television' and was for those who lived in a valley where TV signals couldn't reach. The community built one antenna at the top and ran a cable down to everyone. Of course, it was an obvious addition after that to add extra channels.


Interesting! That makes sense now. I thought it stood for CAble TV and always wondered why they used two letters instead of just CTV.


Interesting, I grew up in an area with good reception, so the pitch was definitely fewer commercials on the cable channels (HBO, Nickelodeon, MTV), I remember standing in the living room as the salesman said this. It was true for a while, but eventually they caught up to OTA ad loads.


HBO was always a premium ad free channel. MTV was never promoted as ad free.


and premium channels were ridiculously expensive back then too!


Yea, the no-ads theory of the history of cable seems to be pervasive. The only ad-free channels were the premium ones like HBO. It's like people think the OTA channels that were packaged together had some magic applied that eliminated ad breaks from the exact same feed as the OTA broadcast. The cable-only channels like USA had ads as well. I guess it's just another example that if you tell a lie often enough, people will accept it as truth.


It was "writing 90% of the code", which seems to be pretty accurate, if not conservative, for those keeping up with the latest tools.


> which seems to be pretty accurate

It's not, even by his own citing: https://www.youtube.com/watch?v=iWs71LtxpTE

He said that this applies to "many teams" rather than "uniformly across the whole company".


Yes, those using the tools use the tools, but I don't really see those developers absolutely outpacing the rest of developers who do it the old fashioned way still.


I think you're definitely right, for the moment. I've been forcing myself to use/learn the tools almost exclusively for the past 3-4 months and I was definitely not seeing any big wins early on, but improvement (of my skills and the tools) has been steady and positive, and right now I'd say I'm ahead of where I was the old-fashioned way, but on an uneven basis. Some things I'm probably still behind on, others I'm way ahead. My workflow is also evolving and my output is of higher quality (especially tests/docs). A year from now I'll be shocked if doing nearly anything without some kind of augmented tooling doesn't feel tremendously slow and/or low-quality.


it’s wild that engineers need months or years to properly learn programming languages but dismiss AI tooling after one bad interaction


I think inertia and determinism play roles here. If you invest months in learning an established programming language, it's not likely to change much during that time, nor in the months (and years) that follow. Your hard-earned knowledge is durable and easy to keep up to date.

In the AI coding and tooling space everything seems to be constantly changing: which models, what workflows, what tools are in favor are all in flux. My hesitancy to dive in and regularly include AI tooling in my own programming workflow is largely about that. I'd rather wait until the dust has settled some.


totally fair. I do think a lot of the learnings remain relevant (stuff I learned back in April is still roughly what I do now), and I am increasingly seeing people share the same learnings; tips & tricks that work and whatnot (i.e. I think we’re getting to the dust settling about now? maybe a few more months? definitely uneven distribution)

also FWIW I think healthy skepticism is great; but developers who outright deny that this technology will be useful going forward are in for a rude awakening IMO


Motivated reasoning combined with incomplete truths is the perfect recipe for this.

I kind of get it, especially if you are stuck on some shitty enterprise AI offering from 2024.

But overall it’s rather silly and immature.


That's not even close. The keyboard is writing 100% of my code. The keyboard is not replacing me anytime soon.


If you added up all the code written globally on Dec 3 2025, how much do you think was written by AI and how much was clacked out on a keyboard?

