For me, there's a sharp binary:
If I ask AI to "own" a coding-problem solution — with me passing back the failure responses until resolved — my mind gets numb and I learn nothing.
If I insist on owning the solution — using AI in my effort to better understand the problem space — my mind is active and I get better at coding.
Sometimes I'm lazy and fall into the former. But mostly, so far, the latter.
That's an interesting thought. As AI takes over the "mundane", we humans will need to shift "up", to where work is supposed to be "deep". So, hopefully, being always busy and complaining about "back-to-back meetings" will stop being the flex it is today.
Wait.. is it expected to be a flex? On my last big project I had no time for actual work, precisely because of back-to-back meetings. Is it a flex because it signals that your presence is needed?
> This reliance on readily available solutions, particularly for familiar problems, creates a real risk: engineers may inadvertently atrophy their own problem-solving skills, hindering their ability to tackle truly novel challenges.
Yes, that will happen. But it also happens every time we move up the abstraction ladder. Most engineers go through their entire careers and never do anything TRULY novel.
It's helpful to distinguish problem solving from creative thinking. The main goal of problem solving is to make a problem go away. The main goal of creative thinking is to come up with new problems to solve. Some also call this convergent vs. divergent thinking.
When I want to think creatively but need to solve problems that feel more like housekeeping or toil, LLMs are a useful tool to stay in the right mindset. I have yet to successfully engage with an LLM to help with creative thought. All I've gotten is uninspiring brainstorming.
While it's difficult to define, using LLMs wisely can turn "LLMs are useless" into a "10x productivity boost". At the end of the day, though, it all comes down to products. Before LLMs stole the show, we had built beautiful system software over the course of decades (Linux, Git, k8s, Rust), and yet the products we use every day are mostly user-hostile: they incorporate dark patterns, offer a suboptimal UX, and (in my opinion) sometimes involve outright inhuman marketing practices. That being said, even if we get AGI, I don't think it will lead to any breakthroughs if we continue to do "software engineering" like this year after year.
Agreed. I am starting to think that the only sane way to approach most of it is to learn enough to be able to implement as much as possible yourself. It.. can suck hard, because you will spend a lot of time learning what true control really means, but in exchange you get exactly what you want, how you want it.
And this sucks, because I don't think I could reasonably apply this approach to anything else, like cars..
> Yes, that will happen. But it also happens every time we move up the abstraction ladder. Most engineers go through their entire careers and never do anything TRULY novel.
I think the new question here is: if the new status quo is to offload your creative thinking to LLMs, will any engineers do anything truly novel anymore?
If you're not engaging your mind to create and think on a day-to-day basis, will you be in a position to have new insights?
Nothing beats intuition + experience