throw234234234's comments | Hacker News

F# has since gained resumable state machines, which make many computation expressions more efficient (https://github.com/fsharp/fslang-design/blob/main/FSharp-6.0...). The feature has been there a while.

I actually think F# has received some "love" over recent years, contrary to some opinions on this forum; that feature is an example. My view, maybe unpopular but perhaps less so in the age of AI, is that language features hit diminishing returns with respect to the complexity they add and the use cases a new feature will actually serve. F#, in my mind, is pretty much there, as are many other languages now; the languages are converging. When I used F# I liked how it unified features and tried to keep things simple. Features mostly didn't feel "tacked on", with some later exceptions.

The last time I used F#, a few libraries had started adopting this for their CEs (e.g. the IcedTasks library).
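For context, a hedged sketch of what this looks like in use (the function name and delay value are illustrative, not from the thread): under F# 6+, a `task { }` computation expression compiles down to a resumable state machine rather than a chain of allocated continuation closures.

```fsharp
open System.Threading.Tasks

// Under F# 6+, this task { } block is compiled into a resumable state
// machine: each do!/let! becomes a state transition instead of an
// allocated continuation closure, which is what makes such CEs cheaper.
let fetchLength () : Task<int> =
    task {
        do! Task.Delay 10                  // suspension point -> one state
        let! s = Task.FromResult "hello"   // bind -> another state
        return s.Length
    }

// Blocking for demonstration only:
// (fetchLength ()).Result evaluates to 5
```

Libraries like IcedTasks build additional CE builders (e.g. cold/cancellable tasks) on the same resumable-code machinery.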


Agree. Anthropic in particular has been quite clear about what it is trying to do. Every blog post about every new model all but dismisses any use case other than coding; everything else seems almost a footnote in their communication.

The question really is what you think the long-term direction of SWE as a profession is. If we need juniors later and seniors become expensive, that's mostly a nice problem to have and can be fixed via training and knowledge transfer. Conversely, hiring and training people, especially young people, into a sinking industry isn't doing anyone any favors.

While I think both sides have an argument on eventual SWE career viability, there is a problem. The downsides of hiring now (costs, uncertainty of work velocity, dry backlogs, etc.) are certain; the risk of paying more later is not guaranteed and may not be as big an issue. Also, training juniors doesn't always benefit the person paying.

* If you think that long term we will need seniors again (the industry stays the same size or starts growing again), then given the usual high ROI on software, most can afford to defer that decision until later. It goes back to the pre-AI calculus: SWEs were expensive then, and people still paid for them.

* If you think that the industry shrinks, then it's better to hold off so you get more out of your current staff and don't "hire to fire". Hopefully the industry on average shrinks in proportion to natural retirement of staff - I've seen this happen, for example, in local manufacturing, where the plant lives on but slowly winds down over time, and as people retire they aren't replaced.


> The question really is what you think the long-term direction of SWE as a profession is. If we need juniors later and seniors become expensive, that's mostly a nice problem to have and can be fixed via training and knowledge transfer. Conversely, hiring and training people, especially young people, into a sinking industry isn't doing anyone any favors.

Yes exactly!

What will SWE look like in 1 year? 5 years? 10?

Hiring juniors implies you're building something that's going to last long enough that the cost of training them will pay off. And hiring now implies that there's some useful knowledge/skill you can impart to them to prepare them.

I think two things are true: there will be far fewer developer-type jobs, full stop. And I also think that whatever "developers" are and do day to day will be completely alien to what we do now.

If I "zoom out" and put my capitalist hat on, this is the time to stop hiring and figure out who you already have who is capable of adapting. People who don't adapt will not have a role.

> If you think that the industry shrinks, then it's better to hold off so you get more out of your current staff and don't "hire to fire". Hopefully the industry on average shrinks in proportion to natural retirement of staff - I've seen this happen, for example, in local manufacturing, where the plant lives on but slowly winds down over time, and as people retire they aren't replaced.

You can look even closer than that - look at some legacy techs like mainframe / COBOL / etc. Stuff that basically wound down but lasted long enough to keep seniors gainfully employed as they turned off the lights on the way out.


Domain knowledge and gatekeeping. We don't fully know what is required in their role, but we do know what is required in ours. We also know that we are the target of potentially trillions in capital aimed at disrupting our job, and that the best and brightest are being paid well just to disrupt "coding". A perfect storm of factors that makes this move faster than in other professions.

It also doesn't help that some people in this role believe the SWE career is a sinking ship, which creates an incentive to climb over others and profit before it tanks (i.e. build AI tools, automate it, and profit). This is the typical "it isn't AI that replaces you, but the person who automates your job using AI".


I think it's pretty clear that Anthropic was the main AI lab pushing code automation right from the start. Their blog posts, everything, just targeted code generation. Even their headings for new models in articles would be "code". My view is that if they weren't around, even if it would have happened eventually, code would have been solved at a cadence matching other use cases (i.e. gradually, as per general demand).

AI engineers aren't actually SWEs per se; they use code, but they see it as tedious non-core work, IMO. They are happy to automate their complement and rise in status vs SWEs, who, before all of this, typically had more employment opportunities and more practical ways to show value.


> disrupting others careers is why you have a career in the first place.

Not every software project has done this. In fact, I would argue many businesses exist now that didn't exist before software and computing, and people are doing things they didn't do beforehand. Especially around discovery of information: solving the "I don't know what I don't know" problem also expanded markets and demand to people who now know.

Whereas the current AI wave seems, to date, to be more about efficiency/industrialization/democratization of existing use cases rather than novel things. I would be more excited if I saw more "product-oriented" AI use cases rather than ones destroying jobs. While I'm hoping that the "vibing" of software will mean SWEs are still needed to productionise it, I'm not confident that AI won't be able to do that soon too, or take over any other knowledge profession.

I wouldn't be surprised if, with AI, there's mass unemployment in 20 years but we still, for example, haven't cured cancer.


> Not every software project has done this. In fact, I would argue many businesses exist now that didn't exist before software and computing, and people are doing things they didn't do beforehand.

That's exactly what I am hoping to see happen with AI.


All I can say to that is "I hope so too", but logic is telling me otherwise at this point. Because the alternative, as evidenced by this thread, isn't all that good. The fear/dread in people since the holidays has been sad to see; it's overwhelmed everything else in tech now.


They commoditized software, the complement to their hardware/infra. Good for them, and the value of tech will shift to what is still relatively scarce.


Because of point 3, most SWEs are also hesitant to pay for software. The positive feedback loop of "I did well out of this, so I will support others as well" is over.

When you think your days are numbered, any cost to develop software (even a token budget) is measured. As coding becomes commoditized, the ROI of that code will drop (capitalism rewards scarcity, not value delivered) and you suddenly become cost conscious. We are moving from a monopoly/moat-like market to a competitive, cost-based market in SWE as AI improves.


I think AI has arrived as the industry was somewhat maturing: most frameworks/software had previous incarnations that mostly did the same thing, or could be done ad hoc anyway. The need for libraries probably declines as the models get better, too.

Not all open source, but a lot of it, is fundamentally for humans to consume. If AI can, at its extreme (which still remains to be seen), just magic up the software, then the value of libraries and a lot of open source software will decline. In some ways it's a fundamentally different paradigm of computing, and we don't yet understand what that looks like.

As AI gets better, OSS contributes to it - but as source code feeding the training data, not as a direct framework dependency. If the LLMs continue to get better, I can see the whole concept of frameworks becoming less and less necessary.


In the face of LLMs it won't be rational for many people to open source their work. People don't want their work/effort being used against them.


I've considered no longer uploading work I do to GitHub.

