Opinion by Richard Bleil
The year was 1985. I forget the circumstance, but something had to be written, or data had to be entered into a spreadsheet, or some such task. I was working as an analytical chemist at a small private company. Computers were still uncommon, but my first major purchase was an IBM PC with no hard drive but two (count ‘em, two) 5¼-inch floppy drives (one for the operating system and the other for a program disk; it wasn’t necessary, but it was a nice feature when you’re running a program that insists on swapping the program disk and the operating system disk multiple times) and a separate math co-processing chip. It was rockin’, with a cool amber monochrome monitor, its own printer, 512 K of memory, and a port for a tape-recorder drive. It was, however, before my first modem.
Okay, enough of walking down memory lane. For whatever reason, we needed another computer for me to work on, and I had one at home. So, I agreed to take a day to work away from the office. Great day, eh? Take it easy…sleep in…work at my own pace. Except it wasn’t like that. If anything, I got up earlier and worked harder than I would have in the office. Maybe it’s my workaholic nature, maybe it’s my generation, but they got their $12/hr for eight hours that day, that’s for sure.
But I was also young and dumb. See, they weren’t paying me for my time. They were paying me for the job. The document, or spreadsheet, or whatever it was that I was creating: that’s what they paid me for. I didn’t really think about it at the time, but it’s a dichotomy in thought, a paradigm shift between being paid for a certain number of hours at a job and being paid for a job that needs to be completed. The pay for these jobs is still based on education and/or skill level and the anticipated hours for completion, but if the job isn’t done, it really doesn’t matter how long you’ve spent on it.
In my opinion, remote work should be far more common today than it is. When I was working as a dean at a university (you know, one of those institutions that are supposed to be among the most intellectually and politically advanced places in the country), we had an adviser who was specifically slated to work with online students. She had a desk in a large, noisy, crowded room, stuck in a corner and somehow still isolated. They kept moving her desk every time somebody new came in (although why they couldn’t put the new person at the desk she was moved to has always eluded me), and she was terribly unhappy. Because of her home situation, she wanted to work remotely. She worked with remote students, so she was constantly online or on the phone, with nobody visiting her in person, and she lived close enough to come in if somebody made an appointment anyway. Any questions or concerns that came up, for her or from her, could easily have been handled over the computer. It seemed very reasonable to me, so I brought it up to my supervisor. Although the adviser was technically my supervisee, and it was a “progressive” environment, the request was denied.
The problem, in my humble opinion, is with the people “in charge” today. Sadly, I’m talking about my own generation. We were raised to work on site, eight hours a day. “Distractions” were largely restricted to water-cooler talk and the phone, and you just never made personal calls. Today, the people in charge, the bosses and supervisors, tend to be people about my age, and people fear change. Even the younger supervisors and bosses were raised in an environment where they were paid to stand or sit for eight hours and just work. But this, sadly, is stifling our workforce, stifling our productivity, and stifling our lives.
Today, there are plenty of distractions, whether in the office or at home. I frequently catch myself sitting in my (admittedly temporary and shared adjunct) office on a social media site, chatting on a chat engine, or, just because I’m kind of a news freak, reading online newspapers (BBC, NPR, CNN, etc.). I don’t have to be at work; I’m a part-time temporary employee, but I’m still old and used to spending eight hours a day (or more), five days a week (or more), in the office. This is just who I am, but does that make it the best way?
It’s time we start re-thinking how we do things. Some jobs, like working in a store or assembling products in a factory, do require a certain number of hours on location. Yes, maybe we’ll someday have robots for these jobs (that day is closer than we might think), but many jobs simply no longer require a person to be physically present, sitting at a particular desk so we can be sure they’re working. If we start paying for productivity rather than hours, we also have the potential to save money in great ways. We won’t need as much space for employees, so we can use smaller offices, which also means paying less for utilities. Talent can be drawn from, literally, anywhere in the world (or at least the country, if we are going to support our own economy), saving relocation expenses and getting the very best for our companies. Plus, people like my adviser could be available far longer than just eight hours a day. Yes, there might be days when they put in only a few hours, or take a “work day” off, but they’re also more likely to be available after hours, on weekends, and perhaps even holidays (at least for short periods of time) if all it takes is logging on to the necessary resources.
Maybe, just maybe, there will someday be a generation in charge that is not afraid of change.