It’s hard to believe digital computing has been around for less than a century.
Perhaps you are old enough to recall the annoying punch cards we were compelled to fill out in public school so that our standardized test results could be graded and printed without human intervention; that was probably my first experience with “computing”. In my teens, a friend’s father paid me to enter data on a Commodore PET and a first-generation TRS-80, staring at ASCII screens for hours on end and hoping I’d remember to save my work before the machine crashed, as it frequently did.
Back then, it was all shiny new tech. Now, less than a single human lifetime later, it all seems hilariously primitive.