Hacker News: QuiEgo's comments

I would not be surprised at all; a $1,000/mo tool that makes your $20,000/mo engineer a lot more productive is an easy sell.

I’m guessing we’re gonna have a world like working on cars: most people won’t have expensive tools (e.g. a full hydraulic lift) for personal stuff, they are gonna have to make do with lesser tools.


No engineer will cost $20,000 a month at this point in time. Offshoring is still happening aggressively.

No way.

I bought a $3k AMD 395+ during the Sam Altman price hike, and it's got a local model that readily accomplishes menial tasks.

There's a ceiling to these price hikes, because open-weights models will keep popping up as competitors try to advertise their wares.

Sure, they provide different capabilities, but there's definitely not that much cash in proprietary models given their indeterminacy.


What about when there is a $100/month tool that makes your engineer 90% as productive as they were on the $1000/mo tool?

What if that tool is something you can run on prem, and over time make the investment back?

It's not so simple.


If your company is making $1 mil per employee per year, then 10% is $100k. Even at $500k per employee or lesser numbers, it's almost always better to buy the $1000/month tool: the extra cost over the $100/month tool is only $10,800/year, so break-even on a 10% productivity gap is a measly $108k revenue per employee per year.
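A sketch of the arithmetic behind that break-even figure (the function name is hypothetical; it assumes the $1000/mo tool is compared against a $100/mo alternative with a 10% productivity gap, per the parent comment):

```c
/* Break-even revenue per employee: the extra annual cost of the
 * pricier tool divided by the share of revenue its productivity
 * edge recovers. */
double break_even_revenue(double pricey_annual, double cheap_annual,
                          double productivity_edge) {
    return (pricey_annual - cheap_annual) / productivity_edge;
}
```

With $12,000/yr vs $1,200/yr and a 0.10 edge, this comes out to the $108k mentioned above.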

It's not just about cost, it's about having the control, stability, and autonomy of on-prem. Plus you can probably repurpose that compute when employees are out of the office.

Anyways, I'm just saying it's not so simple ;)


The article was entertaining and made me smile. Thank you for that.

Real advice: national parks very much have seasons, be it weather, tourists (or lack thereof), wildlife, bugs, or all of the above. The same park can be a miserable experience or an incredible one a few months apart.


"Piracy is almost always a service problem and not a pricing problem"

I left when shows I enjoyed were a revolving door, and the UI felt hostile (constantly trying to shove terrible quality original content on me).


Is it really that hard to just hit cancel yourself?

When you're holding a hammer, everything looks like a nail, or something.

EDIT: Also, there's no way in hell I'd let any of the AIs near something with my credit card info saved in their current state.


I agree the wording is a bit alarmist, but a closer example to what they are saying is:

  bool silly_mistake = false;
  
  //... lots of lines of code

  free(x);

  //... lots of lines of code

  if (silly_mistake) { // silly_mistake shown to be false at this point in the program in all testing, so far
     free(x);
  }
A bug like the one above would still be something that gets patched, even if a way to exploit it has not yet been found, so I think it's fair to call out (perhaps with less sensationalism).

FWIW there's a whole boutique industry around finding these. People have built whole careers around farming bug bounties for bugs like this. I think they will be among the first set of software engineers really in trouble from AI.


That is something a good static analyser or even optimising compiler can find ("opaque predicate detection") without the need for AI, and belongs in the category of "warning" and nowhere near "exploitable". In fact a compiler might've actually removed the unreachable code completely.

Well yeah, it’s a toy example to illustrate a point in an HN discussion :).

Imagine “silly_mistake” is a parameter, rename it “error_code” (passed by reference), put a label named “cleanup” right before the if statement, and throw in a ton of “goto cleanup” statements until the control flow of the function is hard to follow, if you want it to model real code ever so slightly more.
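A minimal sketch of that more realistic shape (the `parse` name and the counting wrapper are hypothetical; the counter stands in for the second `free()` so the bug is observable without actual undefined behavior):

```c
#include <stdlib.h>

/* Count frees instead of freeing twice (a real double free is
 * undefined behavior); only the first call actually releases memory. */
static int free_count;

static void counted_free(void *p) {
    if (free_count++ == 0)
        free(p);
}

/* error_code plays the role of silly_mistake, passed by reference;
 * in real code, many "goto cleanup" sites obscure this control flow. */
int parse(int fail, int *error_code) {
    char *buf = malloc(64);
    if (!buf)
        return -1;
    free_count = 0;

    /* ... lots of code ... */
    counted_free(buf);      /* buffer released on the main path */

    /* ... lots of code, with many "goto cleanup" sites ... */
    if (fail) {
        *error_code = 1;
        goto cleanup;
    }
    *error_code = 0;
    return free_count;      /* happy path: freed exactly once */

cleanup:
    counted_free(buf);      /* error path: freed a second time */
    return free_count;
}
```

On the happy path `parse` reports one free; on the error path it reports two, which is exactly the double free a reviewer skimming past the `cleanup` label would miss.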

It will be interesting to see the bugs it’s actually finding.

It sounds like they will fall into the lower CVE scores - real problems but not critical.


That's what I'm saying; a static analyser will be able to determine whether the code and/or state is reachable without any AI, and it will be completely deterministic in its output.

You cannot tell if code is actually reachable if it depends on runtime input.

Those really evil bugs are the ones that exist in code paths that only trigger 0.001% of the time.

Often, the code path is not triggerable at all with regular input. But with malicious input, it is, so you can only find it through fuzzing or human analysis.


"You cannot tell if code is actually reachable if it depends on runtime input."

That is precisely what a static analyser can determine. E.g. if you are reading a 4-byte length from a file, and using that to allocate memory which involves adding that length to some other constant, it will assume (unless told otherwise) that the length can be all 4G values and complain about the range of values which will overflow.
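A sketch of the kind of range reasoning described (function names and the 16-byte header constant are hypothetical):

```c
#include <stdbool.h>
#include <stdint.h>

#define HEADER_SIZE 16u  /* hypothetical fixed overhead added to the length */

/* Naive: the 4-byte length comes straight from the file. An analyser
 * assumes len may be any of the ~4G possible values and flags the
 * addition, which wraps for len > UINT32_MAX - HEADER_SIZE. */
uint32_t naive_alloc_size(uint32_t len) {
    return len + HEADER_SIZE;   /* flagged: possible unsigned wraparound */
}

/* Checked: reject lengths that would overflow before doing the math. */
bool checked_alloc_size(uint32_t len, uint32_t *out) {
    if (len > UINT32_MAX - HEADER_SIZE)
        return false;
    *out = len + HEADER_SIZE;
    return true;
}
```

For example, `naive_alloc_size(UINT32_MAX)` wraps around to 15, the classic undersized-allocation setup, while the checked version rejects that input outright.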


Why hasn't it then? The Linux kernel must be among the most heavily-audited pieces of software in existence, and yet these bugs were still there.

People find and report bugs in the kernel using static analysers all the time.

Rust’s trait system and the embedded HAL say “hi there.”


It's also reasonable from a business point of view to say "we can't justify the investment to optimize our software in the current environment." I assume this is what's happening - people are trying to get their products in customers hands as quickly as possible, and everything else is secondary once it's "good enough." I suspect it's less about developers and more about business needs.

Perhaps the math will change if the hardware market stagnates and people are keeping computers and phones for 10 years. Perhaps it will even become a product differentiator again. Perhaps I'm delusional :).


There are upsides here as well! I think of things like the NUC or Mac Mini - ATX is from 1995, I'm hopeful computers will become nicer things as we trend away from the bucket-o-parts model.

I'm very excited about the Steam Machine for the reasons you mention - I want to buy a system, not a loose collection of parts that kind-of-sort-of implement some standard to the point that they probably work together.


What are the upsides? You only listed a few things that you like, but not why they should take over all parts of the PC market. The only factor I can think of is size, but those small all-in-one computers are already widely available now without the need to hollow out the custom PC market.

There's nothing wrong with ATX or having interchangeable components. An established standard means that small companies can start manufacturing components more easily and provide more competition. If you turn PCs into prepackaged proprietary monoliths, expect even fewer players on the market than we have now, in addition to a complete lack of repairability and upgradability. When you can't pick and choose the parts, you let the manufacturer dictate what you're allowed to buy in what bundles, what spare parts they may sell to you (if any) and what prices you will pay for any of these things. Even if you're not building custom PCs yourself, the availability of all these individual components is putting an intrinsic check on what all-in-one manufacturers can reasonably charge you.


The above post is making a case that the market will implode. I think there's a chance that's really gonna happen. I'm trying to find a silver lining. If the parts market survives that'd be awesome, but there's a real chance this is the beginning of the end.


That I agree with. I'm just also making the point that the silver lining had always existed, since similar fully-integrated products go back decades. The end seems inevitable to me now, and there's no good to be found there. We already had everything. Now is when that starts to be taken away.


I'm thinking of this like car radios. Most cars used to have a standard called DIN for mounting the radio. Most cars today don't have DIN mounts anymore. We've gotten way nicer, bigger touch screens in our infotainment now that cars are not locked into one form factor. On the other hand, it sucks in some ways because of vendor lock-in. I hope we at least get a tradeoff like that, where there's something in return for it.

There are systems like the NUC, but if I want a super-high-end 5090 and a top-end CPU, all of the options to cool them feel like... well, something kludged together from whatever parts I can find, not something that's designed as a total system. Maybe we'll get some interesting designs out of this.


I'm afraid the acceptance of mobile computers in the smart phone form factor being locked down and hostile to their owners (and, more troubling, the seeming desire for that among technical people I see as misguided) has moved the Overton window toward personal computers being equally owner-hostile. The bucket-of-parts PC ecosystem is less susceptible to efforts to lock down the platform and create walled gardens. If that market goes away, it gets easier to turn all of our personal computers into simply computer-shaped devices (like Chromebooks and iPads).

I'm really fearful that PCs are going down the road of locked bootloaders, running the user-facing OSs inside bare-metal hypervisors that "protect" the hardware from the owner, etc.

I'll accept that I'm likely under the influence of a bit of paranoia, too.

I'm strongly of the opinion several unaffiliated factions (oligarchs, cultural authoritarians, "intellectual property" maximalists, software-as-a-service providers, and intelligence agencies, to name a few) see unregulated general purpose computers in the hands of the public as dangerous.

I don't think there's an overt conspiracy to remove computing from the hands of the public. The process is happening because of an unrelated confluence of goals.

I don't see anybody even remotely comparable in lobbying power standing up for owner's rights, either.


Unfortunately, data center computers are not something you can just use as a consumer. They usually have custom connectors, and the parts are soldered down into rack-scale computers. They use custom water cooling that needs building-sized pumps, and so on. A Blackwell rack uses 140,000 watts and weighs 3,500 lbs. A typical house in the US has 40,000-50,000 watts of power max, and floors rated for only about 40 lbs per sq foot. These things are never going to be usable by consumers.

If the AI boom slows, it will free up manufacturing capacity for the consumer supply chain, but there is going to be a long drought of supply.


On the plus side, we've reached the end of Moore's law and are living in an amazing age of personal computing devices.

M1 Apple Silicon MacBook Airs are still good computers 5+ years after release.

Many games are still playable on (and being released for!) the PS4, which is almost 12 years old.

The iPhone 15 Pro has 8 GB of RAM, which will likely be sufficient for a long time.

Don't get me wrong, this whole parts shortage is exceptionally annoying, but we're living in a great time to weather the storm.

