The point is that late Gen X/early millennials were the only ones “forced” to fix tech if we wanted to use it (obviously people older than that needed to as well, but they were less likely to be into tech). Shit rarely worked out of the box, plug and play was shit, nothing was standardized, etc. Around the late 90s into the 2000s things worked more reliably without tinkering, and then apps came along and shifted things even further from tech literacy.
I’m Gen Z and I was still “forced” to fix tech if I wanted to use it. I mean sure, I didn’t have to deal with IRQs, setting up autoexec.bat and config.sys, and so on, but if you’re not at least a little inclined, you won’t have the patience to fix things even when you’re “forced”. You’d just give up and move on; there’s always something else to do. Things have definitely gotten easier, which reduces the odds of falling down the rabbit hole, but one way or another interested people will find their way in.
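(For anyone who never saw that era: a representative config.sys/autoexec.bat pair looked roughly like the sketch below. The exact drivers, paths, and IRQ/DMA numbers here are illustrative only; every machine was a little different.)

    REM config.sys: load the memory managers and the CD-ROM driver
    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE NOEMS
    DOS=HIGH,UMB
    DEVICEHIGH=C:\CDROM\CDROM.SYS /D:MSCD001
    FILES=30

    REM autoexec.bat: hook up the CD-ROM and tell games where the sound card lives
    @ECHO OFF
    LH C:\DOS\MSCDEX.EXE /D:MSCD001
    SET BLASTER=A220 I5 D1
    REM A220 = port address, I5 = IRQ 5, D1 = DMA channel 1

Get one of those numbers wrong and your game had no sound, or the machine just hung at boot.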
It’s like how cars are getting simpler to use, but you still have car guys around. We don’t say only old people know how to drive stick.
In any case, there are better things to use as a generational boundary, like how a single G5 piano note will trigger a very specific group of people.
Edit: I went off on a tangent above and got argumentative. My original comment before this one was intended to be sarcastic but tone doesn’t carry well over text. This whole thing isn’t really something to argue about so I’ll leave it at that.
'72 Gen X here, I HEAR YOUR CALL!!!
The PC revolution started with the Apple II in 1977. In the early ’80s everyone had a Commodore 64. By the mid ’80s everyone had a PC. If you were born in the ’80s, you were not editing autoexec files in diapers.
Unless you were poor and your parents could never afford a PC. We still got to use computers some in school, at least. I once volunteered for a ‘computer camp’, which was basically summer school where they let you play on the computers.