I feel super fortunate to be a part of that generation where screwing around at home could lead directly to employment. I taught myself Atari BASIC on my 800 and took a 286 IBM compatible to college, where I was a music major. I dropped out and landed a job at an industrial automation company because I knew how to write dumb little programs. A couple of years later I was the sole guy programming robots for them in a structured BASIC dialect.
Same here. My first programming job was a "crossover" from a hardware technician job. It both got me into software and introduced me to the title of "Engineer." (I was originally a Technician, then an Electrical Engineer, even though I mostly did software; in those days, I also designed the hardware the software ran on.)
I got my first Apple programming job because I had a Mac Plus at home and learned to program it in ASM and Pascal.
I've only taken some non-matriculated math courses. All the rest was pretty much OJT and home study (and a lot of seminars and short classes). My original education was High School Dropout/GED.
So... the current generation? Between mobile devices, Raspberry Pis, web pages, Linux, and even Windows, there is plenty of stuff you can do just futzing around in your basement. Yeah, it might be impossible to create your own AAA game, but you can still create your own software. Plenty of open source opportunities out there as well.
>> might be impossible to create your own AAA game
Like Minecraft? Factorio? Modern tools allow a very small team to quickly produce near-AAA games. Eye candy is still an issue, but AI is quickly creeping into that space. I would not be surprised if, within the next decade, we have the tools for a single person to produce what we would today call a AAA game.
> When the need for juniors comes back around, I’m sure we’ll start to see it again.
Man, I'm skeptical, at least in the US. Since the pandemic, I've seen an absolute explosion in offshoring, which makes perfect sense when so many people are working remotely anyway. I've worked with lots of excellent engineers from Argentina to Poland and many places in between. It's tough for me to see how an American "tinkerer" will be able to find a job in that world if he wants an American-level salary.
Also, I know the adage about "this time it's different" being the most dangerous phrase in the language, but, at least in one example, something really is different. In the early 00s, after the dot-com bust, there was a ton of fear about outsourcing the bulk of software work to India. That turned out not to happen, of course, because (a) remote meeting software was nowhere close to where it is today, (b) remote work in general wasn't common, and (c) the timezone issues between the US and India were an absolute productivity killer. These days, though, everyone is used to remote work, and US companies have realized there are enough lower-cost locales with plenty of timezone overlap to make offshoring the norm.
I hope this is still true. There are certainly lots of opportunities for self-taught software and hardware development. And university lectures and course material (much of which is very good) that used to be locked inside physical campuses with expensive tuition fees are often freely available to anyone on the internet.
You can definitely build a nice portfolio of open source software (and even hardware) on github. I would hope that is enough to get a job, but it might not be, especially in the current era of AI-fueled employment pressure.
And he says on his about page "This art is primarily non-objective and abstract, focusing on complex shapes and colors. I use my math, programming, and digital manipulation knowledge to produce highly unique art." It's not AI generated.
Often networking is seen as this robot-like bleep-bloop "hello, here's my business card" thing, and at the dedicated events it very well could be. But networking in the most basic sense is just making friends and shooting the shit; the only difference is that you can leverage those friends for opportunities in the workplace, and vice versa.
If there's mutual interest, certainly, but in most cases networking feels shallow and forced. If the only thing in common between us is the weather, I tune out quickly. Networking is mainly for those who truly like people.
If you worked in a cubicle farm, you'd know. The cubicles were generally divided by low portable walls. There were different setups, but generally you don't see people when you're seated; if you stand, you can see your neighbors.
It's a program. "App" is a word, short for "Application Program," publicized by Apple for its handheld computers that masquerade as (and are euphemistically called) "telephones." "App" effectively means "proprietary closed-source program that talks to proprietary walled-garden programs running on someone else's computer, and acts as a spy sending all your sensitive data to who-knows-where."
No-one ever called a real program an "app" before that, did they?
I've been programming professionally for over 30 years and "app", "application", and "program" have been interchangeable for me and the people I worked with as far back as I can remember.
"Application" has been a common general term for an end-user program for a very long time, and "app" is just an obvious abbreviation that people and UIs have used to varying degrees all along. iOS apps merely mainstreamed the term, they didn't take ownership of it.
I don't recall seeing "app" on its own that often, but there was the idiom "killer app", meaning an application that was compelling enough to drive sales of its host platform (VisiCalc on Apple II being the go-to example).
> No-one ever called a real program an "app" before that, did they?
Yes. Apple called them apps in the 80s, at least on the Mac. This example is from the Apple II, but it's plausible they were also referred to as apps there?
For my part I read the title as "Taking over a wall changed my direction as a programmer" which had me really confused for a while. I'd like to read that article, I think.
Apple (App-le?) certainly popularized abbreviating "applications programs" or "application software" (vs. system software, systems programs etc.) to "applications" in the 1980s, and "apps" with the advent of the App Store in 2008, but Apple was unsuccessful in trying to obtain and enforce an App Store trademark given prior uses of app, store, and app store (including, perhaps ironically given Steve Jobs' return and Apple's acquisition of NeXT, a store for NeXTSTEP apps.) "Killer App(lication)" dates to the 1980s, applying to software like VisiCalc for the Apple II.
GEM on the Atari ST supported the .app (short for "application") extension for gui executables. One of its components was the AES, short for Application Environment Services. This stuff dates from the early to mid 1980s.
"Applications" was a very common term in the classic Mac days. "Programs" was a more Windows-y term. ("Applications" vs "Program Files" in ye olden 90s world of where to put things you installed.)
Still waiting for my breakthrough.
Up until the current hiring lull, it was very possible to get a programming position with just a self-taught background.
When the need for juniors comes back around, I’m sure we’ll start to see it again.
Is generative art just AI, or is there something else out there that was called that before the emergence of AI? Genuinely curious.
https://artasartist.com/what-is-generative-art/?ref=thecodis...
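To the question above: yes, "generative art" long predates AI. It traditionally means art produced by an algorithm, usually some math plus controlled randomness, with no neural networks involved. As a minimal sketch (every name here is my own invention, not from the linked article), this renders overlapping sine fields to a tiny PPM image file:

```python
import math
import random

# Classic (pre-AI) generative art: an algorithm plus a random seed
# produces the image. Changing the seed yields a new, unique piece.
random.seed(7)
W, H = 120, 120
# Random phase offsets are the "generative" ingredient.
phases = [random.uniform(0, 2 * math.pi) for _ in range(3)]

def shade(x, y):
    """Map a pixel coordinate to an RGB triple via sine interference."""
    v = sum(math.sin(0.08 * (x * (i + 1) + y * (3 - i)) + p)
            for i, p in enumerate(phases))
    t = (v + 3) / 6  # three sines sum to [-3, 3]; normalize to [0, 1]
    return int(255 * t), int(255 * (1 - t)), 128

# Plain-text PPM needs no image libraries; most viewers can open it.
with open("piece.ppm", "w") as f:
    f.write(f"P3 {W} {H} 255\n")
    for y in range(H):
        for x in range(W):
            f.write("%d %d %d\n" % shade(x, y))
```

The same idea scales up to plotter art, fractals, L-systems, and the kind of "complex shapes and colors" work the author describes.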
I do love the ways random events can change folks’ lives. Would the author have ended up doing art at all without this happening?