A good reminder that everything we say/hear/write/read exists in the unseen context of all the things we believe we should not say.
Unlike OP, though, I cannot be as open about these companies, as we would definitely not have any clients left afterward.
Also, I am not sure how not touching computers after work is a bad thing; people can have families and other hobbies?
Not bad per se, but it did clearly show why, without on-the-job courses, all of them are stuck in the early 2000s tech-wise.
Some people start with a company, get lucky with early success, and then get restricted because of that success: you get new clients via existing ones, everyone likes it and asks for new features, and without noticing it you might find yourself 15 years down the road with ancient tech and no one understanding anything current. Then you can still thrive if your clients like it... we have similar clients: a 1980s factory, another 1980s factory, a logistics app from the 1990s, etc. Things deeply ingrained in some vertical, expensive but better priced than the SAPs of this world, so it keeps going and going.
(obviously it's not but it is super nice that main in Rust is just:)
fn main() {
}

void main() {
}

I don't know if I could tell you with confidence the proper way to get a string length in any language. Is it a global function or an object method or property? Is it length or count or size? I have to look it up or rely on intellisense every time. I do too much bouncing between languages.
Well, I know it in BASIC. Len().
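The confusion described above is easy to sketch concretely; here's a quick illustration in Python, with other languages' conventions noted in comments:

```python
s = "hello"

# Python: a global built-in function
print(len(s))  # 5

# For contrast (not runnable here, from memory):
#   JavaScript: s.length          -- a property
#   Java:       s.length()        -- a method
#   Ruby:       s.length / s.size -- both work
#   BASIC:      Len(s)            -- a global function, capitalized
```

Four different conventions for the same one-line idea, which is exactly why a frequent language-hopper ends up reaching for intellisense every time.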
They don't even have a main() any more, it's great
def main(): # code
The dunder syntax you see around isn't required.
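For anyone who hasn't seen it, the optional "dunder" guard being referred to looks like this (a minimal sketch):

```python
def main():
    return "hello"

# The __name__ check only matters if the file might also be imported
# as a module; for a plain script, a bare top-level call to main()
# (or no main() function at all) works just as well.
if __name__ == "__main__":
    main()
```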
A “Confessions of a Software Developer” website where devs can come in and make anonymous confessions.
Without making judgment on the actions of any involved party, I do wonder why the author would choose to bring up this incident and submit it as part of a story to a site where there is a significant overlap in readership.
Honestly, if there's any chance the content they posted on your profile before locking you out comes close to defamation, I'd consider talking to a lawyer about it. It could be that getting one to send them a cease-and-desist letter on your behalf could take care of the problem.
I've generally found conversation there to be more respectful than HN, rather than less, when discussions get heated - so I had high hopes it would be a different site, but alas.
This leaves a really bad taste in my mouth.
Edit: you know what, screw it. In the spirit of "no more self censorship", here's the link: https://lobste.rs/~7u026ne9se
Sadly, it seems like nothing was learned, since he settles only for diminishing his culpability in anti-social behaviour. He goes so far as to describe, in his blog post, his code as an "AI-assisted patch". When you profess that you don't even know the language of the code that the LLM generated, there is no "assistance" about it, you're at the deepest end of vibe coding. And in submitting it to an open-source project, you're making a maintainer spend more time and effort reviewing it than you did prompting it, which is not sustainable. Moreover, if the maintainer wanted a pure-LLM-generated solution, there was nothing stopping them from hopping over and typing in a prompt themselves.
In fact, most of the comments were purely a debate with no direct attacks at all. The extent of "not respectful comments" I see are something like...
So your original comment that you "didn't want to hype up AI" was a lie, you really just wanted to pretend it was your own work and didn't want the project to be able to make a choice about it. There are good reasons why projects may not want to accept code generated by AI. They may not care. But by consciously choosing not to disclose that, you took that choice away from the project. That's pretty lousy behavior if you ask me.
"Pretty lousy behaviour if you ask me" is incredibly tame. If that's what counts for toxicity, then you're advocating for a toxically positive carebear forum where nobody is allowed to criticise anybody else's decisions.

> you're advocating for a toxically positive carebear forum
Please stop putting words in my mouth.
Being bad at problem solving with people far away is just another problem you can solve with practice. Same as being bad at problem solving even when help is right next to you.
Yes, "remote work sucks" is reductive, but I elaborated beyond the heading. Also, I wouldn't disagree with "office work sucks." Remote work simply has its warts, too.
> just another problem you can solve with practice
Perhaps, but practice alone clearly isn't enough. I've been working remotely since 2020 and it hasn't gotten more enjoyable. I would love to solve that problem, though. I read Remote: Office Not Required by Jason Fried in the past, but that was written a long time ago. I've added more recent works (Effective Remote Work by James Stanier and The Async-First Playbook by Sumeet Gayathri Moghe) to my reading list.
Work sucks in general. Remote work is of course not perfect, but its problems need to be compared against non-remote work problems.
And this is my biggest complaint about arguments about remote working. People turn it into something that’s evidence-based when actually it’s a deeply subjective topic and thus different personality types thrive in different working environments.
Thus it is always going to be an emotional argument rather than a fact-based discussion.
It’s like asking someone what their favourite food is. They might be able to explain why they like the dish, but that doesn’t translate to that meal being everyone’s favourite.
But comparisons aside, some people will just argue over anything. Such as the meta debate we’re having right now over the stuff people will argue about. ;)
People not wanting others to indirectly force them to relocate, work additional unpaid commute hours and pay for it is not "emo". It's a critical fight.
The cost of my utilities? Listen, I don't know how much electricity costs where you are, but the cost of running an extra computer is pennies a day. The cost of internet is set for me. We might talk about the increased cost of heating and cooling, but I was never one of those people who turned their system off when gone, because that doesn't make any sense with my utility's time-based pricing. It's literally cheaper to let it run as it is than to do that.
As for space and confidential items, I'm not sure what to say. I don't have thieves coming in and out of my house, and I have a password good enough to defend against the casual nosy child or relative. I have an office now because I have a house, but I have worked remotely in smaller spaces and it was never any problem. At least not compared to a 1-hour commute, although I have commuted up to 2 hours on bad traffic days, which were not particularly rare occurrences. And this is just how it is in all the cities I have lived in. Perhaps not all cities, but the two metropolitan areas I have lived within and in the suburbs of. Living within the city didn't even guarantee me a reasonable commute.
If the trade-off is the company getting a corner of a room I wasn't using anyway plus a few dollars of electricity subsidy, and I get back several hundred dollars of my time (measured by my pay rate) by not commuting, plus a couple of dollars not spent on gas, I am happy to make that trade. I'm also capable of putting my computer away safely, like literally anything else I own.
I also don't worry about the isolation that people mention (although not you here) because I have a vibrant social life. As someone who was never the typical demographic of the field, I have never depended on socializing with coworkers in the office for social fulfillment. I still somehow maintain the correct level of social camaraderie via digital means. Remote work doesn't mean not interacting with your coworkers at all.
We were doing remote work effectively decades ago. Don't have hallway conversations to fix bugs? Easy, just post your problems on the team chat and someone (often one of several people) would love to drop by to help.
I'm not sure exactly all of the forces that have led to this changing so much, but I'm certain that merely blaming "remote work" isn't it.
Somehow we were better at using remote tools while literally in the same office than some teams are at using them now while fully remote.
Some people consider email something you MUST answer ASAP, others can have 9000 unanswered emails without an issue. They'll get around to it.
And with chat, some think it's real-time and expect an answer in seconds, others will check $chat_app a few times a day like email and reply.
I personally prefer chat, because you can use it asynchronously, BUT you can escalate to near real-time if everyone is present at the same time. You can have an actual "conversation" via chat - you can't do that with email.
Consider the effort to accommodate those preferences though. Accommodating a video call preference is easy. Same for chat. Accommodating a preference for face-to-face requires spending an hour (2x average US commute) traveling to meet you. That's quite a significant ask of the other person.
In electronic chat I can ask someone to explain their question and wait for it in writing. In person, I often have to listen to them stumble over the concept because they didn’t think about what they wanted to ask before asking it.
In a video call I can clearly see the other person’s screen and zoom in on what I’m trying to look at. In person I have to lean over their desk and squint at the right angle.
I couldn't agree more. I pushed to get the place I worked for to use Slack when it first launched, moving us off AIM (ha!). Our use of Slack when we shared an office in the twenty-teens was so much better than the use I've seen of Slack/competitors on fully-remote teams.
I wonder if it's because the failure mode was, as you said, to "drop by." Now the failure mode is... just failure.
My team rooms are pretty dead. I'll send stuff there but by and large the team simply doesn't use chat functions.
Arbitrary discretion in the exercise of power is the bedrock of our society.
I’ve managed to be invited or told of them after ingratiating myself to the teams, or more often, after quitting and getting invited as one of the “good ones”
They all know that every word on company shit is being monitored
It’d be like using Blind as your company chat - nobody goes on there to say how great their experience has been, and the tone infects everything else.
But maybe I’m just not very fun at parties…
This should be avoided at all costs by creating a culture that is receptive to people’s concerns and doesn’t do stupid things without explanation - but I get how difficult that is in reality and most orgs end up messing this up.
Moving all of your work-related chats off-platform so you can "say whatever you want" about work, and eventually making it into a de facto team chat, is what I'm talking about here. This isn't chatting with friends; this is creating team divisions and huge gaps in context for the rest of your team. This approach is being a poor colleague, in my opinion.
You can do both things - they’re not mutually exclusive.
Sometimes you just want to vent and call your boss a fuckhead. It only takes one time in a person's life seeing HR punish/fire/admonish someone for conduct on company communication channels that would have been perfectly fine in any other setting, for that person to never again trust the "company culture".
There is no environment where messy human beings fit into the perfect set of rules and behavior that companies demand
IME building up communication skills (including when and what to communicate) comes with experience.
Notification fatigue is a thing, and people are just used to ignoring notifications and messages nowadays, which ends up with slow responses and poor communication all around.
I'm currently in a feeling-things-out phase with my current team, and people seem really laid back about responding to messages - but it also seems like we're getting stuff done. Hard to figure.
This is sort of the point. Remote tools work great when you have spent a lot of time building relationships and rapport with the people involved. That's hard to do in professional settings, and extremely hard to do in remote professional settings.
Letting teams that know each other well work remotely works great. Building teams remotely is very hard.
I'm a diehard for remote work, but we have to be realistic about its limitations.
It requires being comfortable exposing a lack of knowledge or saying weird things to peers, and being confident it will be taken in good faith. As you point out, that requires a whole level of culture building.
The same people DMing however will also extol the virtues of posting in public and lament why there is not more conversation happening in the open.
There's no question! It absolutely is.
Obviously the help also came with you bonding and chit chatting about other stuff, I miss it.
Ok, gonna go read it.
(Hmm. The author says "Follow Scrum, Lean / Kanban, or eXtreme Programming to the letter, and let your team focus on the product."
I disagree - quite vehemently. I guess that's obvious, given that I'm calling out the various capital-A Agile methodologies, and the parasitical industry around them, as harmful/bloated/mostly pointless. But how did you figure out I hadn't read the blog post from that? Now I have, and I just think he's wrong).
The industry around it makes total sense - often bloated and misusing terms - but the process itself can work well as a starting point. I get the impression many engineers never attempt it because of bad experiences with things named Agile. Understandable, of course.
> as a starting point
Even you disagree with the author :) But yeah, a team with the power to change its own processes, rather than have Agile imposed on it, isn't a team that's cargo-culting an Agile Brand).
(Last couple of months I've been introduced to "retrospective story points"; we're supposed to fill in how many points the ticket actually took after we've done it. I haven't yet found the words to express how pointless I think this is).
That's the thing... Agile processes do position themselves as a starting point, and suggest that once the team understands it by living it (but not sooner!) they might adapt it and customize it.
From The Art of Agile Development, 2nd Edition by James Shore et al. (the most recent eXtreme Programming book, if you will):
> As a result, although it’s tempting to customize your Agile method from the beginning, it’s best to start with a by-the-book approach. The practices that are the least familiar are the ones that are most tempting to cut, but they’re the ones you need most, if you’re really going to be Agile. They’re the ones that involve the biggest change in philosophy.
> Mastering the art of Agile development requires real-world experience using a specific, well-defined Agile method. Start with a by-the-book approach. Put it into practice—the whole thing—and spend several months refining your usage and understanding why it works. Then customize. Choose one of the rough edges, make an educated guess about what happens, and repeat.
From The Scrum Book by Jeff Sutherland et al.:
> It’s important to understand the rules, and it’s even useful to follow them most of the time. But reading the rulebook of chess won’t make you a great chess player. After learning the rules, the player then learns about common strategies for the game; the player may also learn basic techniques at this level. Next is learning how to combine strategies you learn from others while maybe adding some of your own. Ultimately, one can transcend any formalism and proceed from the cues one receives from one’s center, from one’s instinct. [...]
> Some day, long from now, you may even outgrow these patterns as you evolve them and define your own. There are no points for doing Scrum, and these patterns are the gate through which a highly driven team passes on the road to the top echelons of performance.
That’s pretty weird and uncomfortable and I don’t know that I would want to work with someone like that in or out of office.
Sure it applies to things like random people on social media and such, but after a mutual exchange or two you should be over it.
Text requires correctness to some extent; in speech, bullshitters will just yap away for hours and nobody can point to one piece of text and say "Here, this is where you are objectively wrong, and/or misrepresenting things".
The unfortunate reality of remote work is there's a lot of zoom meetings where yappers in high places will BS away -- a lot more "important" zoom meetings than "important chats", especially in public.
Perhaps it’s useful to have these people in the office, in a room of mirrors, where they can listen to themselves talk all day. There’s a subset of people who have weaseled their way into tech coming from the world of hyper-anxious very public social media engagement that simply make life miserable for everyone else.
Chatrooms have evolved in a really interesting way. I think the first generation to have them didn't fully understand how "public" they were. Maybe there are more people in the more recent generations that have a more visceral understanding of online "publicness" as they have grown up with (and perhaps have been burned by) those concepts from the very beginning. Maybe they have a better understanding of the permanence of online utterances and therefore have a more conservative approach to interacting on what feels like the permanent public ledger.
Maybe it's because the concept of pseudonyms has devolved since the early days. Corporate social media has an interest in doxing its users to advertise to and control them but pre-corporate social media was filled with anonymous usernames. Posting in a large group under your permanent forever name is much scarier than posting under an anonymous, temporary identity. One of the things I advocate people do is post online anonymously, instead of with their real name. It alleviates a lot of the fear of speaking your truth, which we need more of!
There is something there. The ability to try on identities in a safe environment before you discover which one you really identify with. It's much harder to do this with your real name. Your past comes with a lot of baggage and people who know you don't want you to change because it makes them feel uncomfortable.
Use tools for what they are good for and create a culture that makes each tool work best for your organization.
They would be right: HR will get access to everything you ever posted in a company chat if they have a reason to check. Some people don’t care, some… do.
My comment was related more to the _overhead_ required before posting. Either way best not post anything that would be an HR issue in any format. (Also your private chats will be available in discovery as well if found out)
Maybe I’m not the best to opine on this as I’ve been wildly successful at building community at companies but I’ve also been burned by this. I suppose I’m privileged enough that I’d like to work somewhere that I can still collaborate with low friction remotely - and if the company doesn’t like it then I’m not a good fit.
Yes it absolutely is formal communication. Microsoft makes this painfully clear with how they market teams.
I agree, but this may mostly be pointing out they are not very good/qualified at whatever they are doing tbh.
Kind of a ship of theseus situation culture wise - when the original leaders are all gone, did they pick good successors to fill their spots? Very often not.
Have a senior leadership team and want them to not tell you bad news when you are the CEO/Leader? Then link their salary/performance to metrics like number of production incidents their team has. Suddenly the number of incidents that you know of decreases.
If that does not work to isolate you as the leader from the reality of your company, then link their salaries to a metric like the number of projects finished before or at deadline, and watch how tech debt multiplies and how estimates suddenly start increasing all over the place.
Want people not to ask meaningful hard questions in All Hands? Just make sure anyone that seems critical is labeled as not a culture fit, and done. All questions are positive and nice. Make sure to always ask for names and disable anonymous questions.
Not trying to say metrics are bad or they should not be used. But they are not pure functions :) they do have side effects and sometimes very large ones.
At first they said it was "great". But it soon turned sour and resulted in "it seems like you spend too much time answering questions", and I should "focus" and "free up" that time to work on my assigned tasks.
Well, I don't answer anything anymore. In fact nobody does. It used to be that you got precise technical answers from someone directly working on the tool or problem you asked about. The previous CEO would sometime even answer themself. Not anymore.
Now people ask, but nobody answers. The rest has devolved into LinkedIn style self-promotions and announcements.
It also has access to our internal wikis, GitHub, and other internal tools.
Smartphones changed that with Youtube and Facebook. Youtube incentivized you to use a Google account, and Facebook wouldn't let you use it anonymously without an account. Because you could use one account to log into multiple places people could track you across websites. People could make archives, screenshots, and transcriptions of anything you had done with those linked accounts. With this change there was no safe corner to hide if you said something stupid. And because so many people were foolish enough to tie their real identities to these online accounts with their real names or pictures of themselves, it gave a way for particularly unruly people to track these individuals even offline. There was now a real danger if you said something stupid, because instead of just getting your post deleted or starting a derailment in the thread people could harass you at your home, get you fired, and even send the police to terrorize you in the middle of the night via SWAT raids. It's no longer just one person calling you out. It's now hundreds, maybe even thousands, all armed with information.
And this is why I say it's stupid to require phone numbers and real names to sign up for insignificant things like being able to watch someone bake a duck-shaped cake live over the internet.
People weren’t assholes and/or snowflakes in those days. Implicit in being on the net was that you were fairly well behaved.
The main difference is that more spaces were quasi-professional and non-pseudonymous, in that one largely got one’s internet access and identity (IP address, email address, invitation) from the institution of higher learning one attended or worked for. So there were direct, two or three degrees separation consequences (my boss knows someone at your institution) in those spaces. I suppose this is what you are referring to.
(In my early era of commercial internet work I can remember a colleague shutting down an accidentally abusive scraping bot by working out who was likely to be the boss of the person running it and phoning them up)
But away from those spaces were many places that were just as bad as they are now.
The internet has always (in my time of using it, which is all of my adult life as someone who is over half a century old) demonstrated that a good culture is a question of starting conditions and quick maintenance actions.
A non-trivial amount of the worst behaviour I have personally witnessed on the internet happened before the year 2000.
That’s not to say there’s more vitriol today; it’s swung the opposite direction, where newbies expect to have answers handed to them, or worse, they’ll post AI slop and then be genuinely surprised when someone asks them to explain it, or to show their work.
I don’t think that people should be belittled, but I also think it’s unrealistic to expect that experienced people should patiently teach every newcomer the same things over and over, without them having put in the minimum effort of reading docs.
I’m reminded of something I saw once on Twitter from a hiring manager who said that the best signal they had was whether a candidate had a homelab. This was met with criticism from many, who bizarrely accused him of being classist, since “not everyone has time to do that for fun.”
For the 70s, I would agree with you. But the moment home users, and particularly kids, gained access to the internet, you started to see a subculture of trolling.
Source: I was one of those 80s kids. It's not something I'm proud of, but writing bots to troll message boards and scrapers for porn and warez played just as significant a role in my journey into my IT profession as writing games on 8-bit micros.
And everyone was in on it. We were all trolling, and being trolled, and perfectly well aware of what trolling was. But now people deliberately target and exploit the vulnerable on the internet.
I feel like the only thing you needed before was a fairly thick skin, but now you need a lawyer and a smorgasbord of security.
As for security, that was always an issue. Malware, denial of services attacks, etc aren’t a recent phenomena. And hacking was so prevalent that even Hollywood caught wind, hence the slew of hacker movies in the 80s and 90s (Wargames, TRON, Hackers, Anti Trust, Swordfish, Lawnmower Man, and so on and so forth).
The problem isn’t that internet etiquette has gotten worse. The problem is that there is so much more online these days that the attack surface has grown by several orders of magnitude. Like how there are more road accidents now than there were in the 70s despite driving tests progressively getting tougher (in most countries). People aren’t worse drivers; there are just more roads, busier with more vehicles.
Early 2000s, public channel on a LAN with ~3k people in a post soviet country – say something stupid to a wrong person and you'll find yourself with a broken nose, because the guy/gal is a friend of the admin.
I was just responding to the generalisation made by the GP.
I think this is merely the shift from doing this as a hobby to doing this for work. Random coding problems mixed with banter I posted or answered on IRC back in the day? Purely hobby stuff, things I did after school instead of doing my homework. No stakes beyond the community itself; I could disengage at any moment, nobody would care - there was no commitment of any kind involved.
Today? Even if we switched back from Slack/Teams/whatnot to IRC, the fact remains, the other people are my co-workers, and we're talking about work, and it's all made of commitments and I can't disengage, or else I starve.
That changes the dynamic quite a bit.
There's nothing new here, there's no problem to solve. Doesn't matter if you're anonymous or publicly identifiable. 90% of people don't contribute, they just consume. 9% contribute occasionally. And 1% are regular contributors.
The 1% or 90-9-1 rule is pretty well known.
I typically have busy but important channels muted with a carve out for @mentions, watercooler channels are just muted but I check on them a few times a day.
Living beings do it all kinda ways. Bees waggle their butts, crickets rub their legs, geese honk, snakes hiss, some fish detect electrical signals. And to collaborate, the bees' dance indicates a flight path, birds singing indicates interest in mating, the snake's hiss and the geese's honk tells you to watch out. You use the tools you have and develop collaboration with them. There's clearly no right way, there's just ways.
But tomorrow morning, would you wanna learn to honk at people, or rub your legs, or waggle your butt, to order a latte at Starbucks? It'd be awkward, weird, painful, and unnecessary. So if you were asked to, you'd probably not try very hard to adapt to it. And if everybody you knew were in the same boat, all being forced to change with no real guidance, kinda not trying that hard to make it work? It would suck for everybody.
People just don't like changing what they're used to. They probably don't even mean to fight it. But we do like culture we're already familiar with. Change is hard, not changing is easy. We like easy. So people who grew up with remote work (on IRC, mailing lists, etc) find it easy, even more productive. But a company that's thrown into it without a healthy established culture are going to be swimming upstream indefinitely.
IRC selects for people who like chatting and communicating via text.
I think the mistake made with remote work was assuming that everyone could easily work that way.
The best experiences I had with remote work were pre-COVID, when the teams working remote were carefully selected for having good remote work abilities and anyone who couldn’t handle it was kicked back to the office (or out of the company)
Then something changed during COVID and remote work was treated as something everyone could do equally well. The remote teams I worked with were now a mix of people who could work well remotely and people who wanted to work remote but tried to force communication to happen like we were back in the office: Meetings for everything. Demands to “jump on a quick call” when a few Slack messages would have done the job. Then there were the people who read “Four Hour Work Week” and thought they were going to do their jobs from their iPhone while traveling the world or at the ski resort.
I don’t know. Having seen the before and after it doesn’t feel so surprising that remote work faltered when applied indiscriminately to everyone. The best remote teams I work with to this day are still the ones who know how to communicate in that old school IRC style where communication flowed easily and everyone was on the same page, not trying to play office games through Slack.
Interacting in person and cooperating is something you start learning from a young age. Working in person in the office is a natural extension from years of schooling in person with your peers.
Working remote is a skill that must be learned. Many people have barely done it at all before their first WFH job. It doesn't come as naturally. It's not a symmetric comparison.
I'm not sure what your experiences were like in school, but during my early years, there were drastic differences between how much different classmates thrived or struggled in highly social environments. Just because everyone is forced to interact in a certain way doesn't mean that it works well for everyone equally.
> It doesn't come as naturally.
I'd argue that it doesn't come as naturally to a lot of people to work in largely dense social environments either. To your own point, this is something that people are actively conditioned for, not a naturally occurring phenomenon, and I'd argue that even despite that it still leads to a pretty wide variety of outcomes for people at an individual level not in small part because of how suitable an environment like that is for each of them.
To me, this seems like a pretty fundamental disagreement in how much uniformity should be imposed on a population based on how well that proposed norm fits with the members of the population. I imagine that to people who disagree with me, the idea that many people might work better in seclusion than in a larger shared environment probably seems radical, but I've yet to see a justification for it as the rule rather than the exception that doesn't end up coming from an assumption that people who don't prefer this are a small minority that aren't worth changing things for. I don't have any clue what the actual number of people who don't fit the assumed norm are, but I don't find it nearly as easy to accept that the threshold at which point it's worth reconsidering how we do things is comfortably higher based on any of the arguments I've seen presented. Maybe this is due to my perception of what a fair threshold would be being lower than average, but most of the disagreements I've encountered seem to already stem from an assumption that the number of people who prefer to work in an office-like environment is high enough to be the basis of how things get run in the first place, and then extrapolate the threshold from that fact.
that is a wiiiild wiiillllld take sir
I think the realization was more that some people are simply there, either at the office or at home. Of course the experiment worked fine. Those people were already not doing much. Not doing much in a different location makes no difference.
Note that these days, the Mozilla community has moved to Matrix, which also works very well for these things.
I believe the change is largely demographic, and I'm NOT referring to gender/nationality/age/race, but rather to the personality of people working in tech today. Long ago, tech people were almost exclusively sourced from the weird kids who couldn't or didn't read the room; they just said stuff and asked stupid questions because they wanted to know. Most people don't do that, so if tech is now made up of more "commonly adjusted" people, there will be less dexterity in navigating a less social, more (actually) productivity-focused medium (remote/async comms).
As much as I hate RTO, chat/video really does have lower social-cue information density.
And it wasn’t just working from home, it was lockdown. Total isolation. Of course people missed human contact and blamed it on remote work.
Even the "productivity" data people cite is skewed, because most of it compares lockdown productivity to normal-life office productivity, which isn’t a fair comparison at all.
I don't fear they'll deny others the opportunity for remote work. The company is "headquartered" in California, but I don't know if they even lease an office anymore. The CTO lives in the upper midwest, the architect lives on the east coast, my manager lives along the Mississippi River, and I live in the Ozarks.
> enjoy the benefits more than you suffer the drawbacks
Yes. I'm sorry, I thought I made that clear in the post. The benefits of remote work include, but are not limited to: no stress or time from commuting, an opportunity for geographic arbitrage, and the ability to build a better lifestyle around the lack of a commute. Beyond just the remote worker themself, a society that transitioned all office work to remote would also gain more benefits: more efficient use of real estate with entire office buildings rendered unnecessary, less chance of land value distortion due to centralization of workers, and less pollution due to fewer commutes.
I'm glad to also criticize in-office work for having other drawbacks. For example, I was rear-ended commuting to work more than once, the family needed the expense of two cars, we spent more on clothing, and the ambient level of noise being above 35 dBA was annoying.
Would upper management from other companies read your blog?
Sure, maybe not organically, but certainly via a google search, looking for support for motivated reasoning for why all the plebs should be brought back to the office.
> Yes. I'm sorry, I thought I made that clear in the post.
Maybe for someone doing a close read. Honestly, with the discussion about the cost and the mortgage, it reads more like you are stuck there.
Taking a switch statement and spreading it out over three classes is not a general improvement; it is very context-specific. It can make the code difficult to navigate, because what used to be in one spot and easy to read is now spread out who-knows-where, and there might be a special case lurking somewhere.
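A minimal sketch of the tradeoff being described (all type and method names here are invented for illustration): the same three-way decision, first as a single switch, then spread across three classes.

```java
// One switch: every case is visible in one place.
enum ShippingKind { STANDARD, EXPRESS, OVERNIGHT }

class SwitchVersion {
    static double cost(ShippingKind kind, double weight) {
        return switch (kind) {
            case STANDARD  -> 1.0 * weight;
            case EXPRESS   -> 2.5 * weight;
            case OVERNIGHT -> 5.0 * weight + 10.0; // the special case is right here
        };
    }
}

// The same decision spread over three classes: each branch lives in its
// own class, and the OVERNIGHT surcharge can now hide anywhere.
interface Shipping { double cost(double weight); }
class Standard  implements Shipping { public double cost(double w) { return 1.0 * w; } }
class Express   implements Shipping { public double cost(double w) { return 2.5 * w; } }
class Overnight implements Shipping { public double cost(double w) { return 5.0 * w + 10.0; } }

class Demo {
    public static void main(String[] args) {
        System.out.println(SwitchVersion.cost(ShippingKind.OVERNIGHT, 2.0)); // prints 20.0
        Shipping s = new Overnight();
        System.out.println(s.cost(2.0)); // prints 20.0
    }
}
```

Both versions compute the same thing; the difference is purely in where a reader has to look to see all the cases at once.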
While I do appreciate this joke (and I do hope this is a joke), I've recently had a project majorly held up because a lead dev didn't understand SQL. It's great to admit gaps but it's equally important to close those gaps.
> As a hiring manager I interviewed software engineers and tried to filter for object-oriented knowledge. Retroactively, it’s clear I was hypocritical.
As someone who has been on the other side of "rejected by an interviewer who didn't understand the thing they interviewed you about," I, again, appreciate the transparency, but I'm not entirely convinced that the lesson has been learned in this case.
There was a time in my life when I felt ashamed that I didn't know calculus... so I learned calculus, and my life has been better for it. While refusing to admit ignorance of a topic is a particular problem in tech, confessing that you don't know something and gleefully stopping there is not much better. Holding people to a standard you do not hold yourself to is a major problem in this field. The technical people I've learned the most from hold you to a high standard and hold themselves to an even higher one.
Of course not every engineer has to hold themselves to a high standard, but if you want to write a blog about a topic, then part of the requirements here is that you do hold yourself to a high standard. Yes, we all have gaps, and we shouldn't let shame get in the way of learning, but we shouldn't let shamelessness about what we don't know limit us either.
My suggestion is to use Khan Academy if you want to improve your math knowledge. It's really quite good for that sort of thing. It was just starting to take off when I finished my degree; I wish it had been available before then.
I learned way more reading Crafting Interpreters than I did in my compiler class, for example.
But I also do read textbooks for fun… Now that I have a few decades of experience in a lot of these subjects I get way more out of the books. And I can start to understand more of the meta information. Like, of all the things the author could’ve used as an example, why did they pick that. Also, it’s hugely interesting for me to look at the homework problems and theorize why this particular problem was picked. Especially fun for electrical engineering books. But ya, I’m weird like that.
For automated testing, I'm in the middle of reading Developer Testing by Alexander Tarlinder, with xUnit Test Patterns by Gerard Meszaros coming close behind. I'm also working through Test-Driven Development: By Example with my wife as we have time.
For SQL, I read Grokking Relational Database Design by Qiang Hao last winter, and I started SQL Queries for Mere Mortals by John Viescas this week. Sadly, my flub with "left inner join" was not a joke.
For OOP, I've been on a whirlwind tour: OOA&D With Applications by Booch et al., Object Thinking by David West, POODR and 99 Bottles of OOP by Sandi Metz, Domain-Driven Design by Eric Evans, IDDD and DDDD by Vaughn Vernon, Design Patterns in Ruby by Russ Olsen, Clean Architecture by Robert C. Martin, and Smalltalk Best Practice Patterns by Kent Beck. Still on the docket are Design Patterns by the Gang of Four, PoEAA by Martin Fowler, Smalltalk, Objects, and Design by Chamond Liu, and Object Design by Rebecca Wirfs-Brock.
> confessing that you don't know something and gleefully stopping there is not much better [...] we shouldn't let shamelessness about what we don't know limit us either
I promise you, this was not gleeful and this was not shameless. Shame and fear affected me for months on these issues. And I'm not stopping there... From the end of the article: "I’m going to continue to work on skill building, but now I feel free to write about it. If [...] you’d like to help me fill [my knowledge gaps], [...]"
> if you want to write a blog about a topic, then part of the requirements here is that you do hold yourself to a high standard
A high standard of writing, maybe. But plenty of great stories come from those who are striving for a high standard, not just those already in the upper echelon. It's what makes this place different from academic journals.
It's easy to feel dumb on the internet; around every corner there are people pointing out your mistakes, and they seem to know it all. But it's often just chance: of course there is someone out there who knows the exact thing you got wrong. You did the rare thing that no one does in tech, you said what you don't know; all the _experts_ out there simply hide that.
The other issue is also that people try to overwhelm you with their questionable knowledge. I find that quite problematic with OOP. I've smoked a lot of the OOP crack, but I feel more efficient without all the rituals and dogmas. Knowing all the bells and whistles of smart OOP stuff will just cause more shame, because with every piece of code you will think "oh I have to do it that way or people will hate me".
That being said, I usually prefer to know the essence of something. There are many approaches to testing, but if you know what testing is conceptually doing, you can write your own test lib. It's neither hard nor magic. But elaborate frameworks and ritualized concepts often lead you away from the gist of things. You are supposed to do things the way they've imagined for you; if you don't, feel shame.
I'm honestly so confused by this. Has the author never worked in an office before? Building a grudge against someone you are forced to work with and sit next to all day is one of the classic office dilemmas. Being forced to be around people all day can really build resentment toward them.
I much prefer working with people who can just be honest about what they don't know, it's way better than pretending to know or trying to save face, and generally people in the former camp seem to have higher EQ.
- I blog with my real name, which includes an uncommon first name. It's easy for hiring managers to search the web for.
- My blog is linked from the website I host on the domain name I use for my email address, including for job applications. Anybody I email is likely to follow that thread.
I think this is objectively mostly a silly and counter-productive worry. But I still feel it.
Kudos on publishing this piece!
Would it make more sense to ask how to apply git rebase in certain scenarios?
Dude, your employer is toxic AF. Look for a new job starting today.
The inflection point will be when business hires an AI to fill a managerial role. AI will discriminate against hiring human developers.
Since you’re so familiar with the process, I suggest you start positioning yourself as an AI development efficiency management consultant or similar.
Uh oh.. maybe it's a problem to grease the wheels of the least-skilled...
Ever heard of flu season? What if you have a family and don't want to bring diseases home?
> Attempts to represent ideas spatially get mutilated by online whiteboard and sticky note software.
Right... like, the Linux kernel team? Or any of the major open source key pieces of technology you use? Built by large teams that worked remotely for decades, even when tools were orders of magnitude worse than the current state of the art? Some of them never meeting each other in person?
---
Remote work DOESN'T suck. YOU make it suck.
Remote work is great if you care about shipping.
Want to go for coffee or want to talk about our weekends? No thanks.
Did you see the distracting thing outside the building? No, I didn't because I don't have to go there anymore.
Is the heat too high or too low? It's your own home, just adjust it to YOUR convenience.
Worried about your pets being alone? Just be next to them. I care more about my pets than some stranger from work.
Want to be loud and flex about random stuff? Log into LinkedIn and talk to the other geniuses like you while I focus on doing my job.
Most people SUCK at drawing, suck at calligraphy and their whiteboard diagrams SUCK. Therefore, whiteboards SUCK. Unless you have great calligraphy and drawing skills, whiteboards are not helpful. You are just sad because you are not getting attention by being in front of other people looking at you.
The problem with if-else chains is that it's easy for a programmer to forget to handle a case that another developer added in the called component. Unit tests can't catch a spec miscommunication. But the visitor pattern can, as it forces the handling logic to be complete.
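As a minimal sketch of that completeness property (all names here are invented for illustration): with a visitor, adding a new result type means adding a method to the visitor interface, which breaks every existing handler at compile time until the new case is covered.

```java
// Each concrete outcome accepts a visitor. Adding a new Outcome type
// requires a new method on Visitor, so every handler fails to compile
// until it covers the new case: the completeness check is structural.
interface Visitor<R> {
    R visitOk(Ok ok);
    R visitFailure(Failure failure);
}

interface Outcome {
    <R> R accept(Visitor<R> v);
}

record Ok(String value) implements Outcome {
    public <R> R accept(Visitor<R> v) { return v.visitOk(this); }
}

record Failure(String message) implements Outcome {
    public <R> R accept(Visitor<R> v) { return v.visitFailure(this); }
}

class VisitorDemo {
    public static void main(String[] args) {
        Visitor<String> describe = new Visitor<>() {
            public String visitOk(Ok ok) { return "ok: " + ok.value(); }
            public String visitFailure(Failure f) { return "failed: " + f.message(); }
        };
        Outcome o = new Ok("42");
        System.out.println(o.accept(describe)); // prints "ok: 42"
    }
}
```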
Hence my example at the end using discriminated unions and exhaustive pattern matching in F#. Much, much simpler with the same benefits.
sealed interface Result permits ValidationError, SearchQuery, UserProfile {}

Along with the specific implementations of those ValidationError, SearchQuery, and UserProfile classes, and then a switch statement like:

Result res = db.query(input);
return switch (res) {
    case ValidationError ve -> context.renderError(ve);
    case SearchQuery sq -> context.redirect("Search", Map.of("q", sq));
    case UserProfile up -> context.redirect("Profile", Map.of("user", up));
};

The sealed interface gives you compile-time checks that your switch statement is exhaustive wherever you use it.

Before that pattern matching existed, I might have used a Function<Context, R> instead in the Result interface. This is off the top of my head without the benefit of an IDE telling me if I've done a stupid with the generics and type erasure, but something like:
interface Result<R> {
    public R handleWithContext(Context c);
}

class ValidationError implements Result<RenderedError> {
    public RenderedError handleWithContext(Context c) {
        return c.renderError(this);
    }
}

class SearchQuery implements Result<Redirect> {
    public Redirect handleWithContext(Context c) {
        return c.redirect("Search", Map.of("q", this));
    }
}

etc. In either case, though, I think you're right that an empty interface is something that should be examined more closely.

When I was in school, I discovered that I studied more effectively and efficiently when I was surrounded by other students who were also studying.
Then at work, I found I worked much more productively if my coworkers were all doing their work too.
It's not simply peer pressure; it's an atmosphere effect. It tells you "hey, this place is for doing this thing, now you do it too," and it makes you concentrate. Sometimes being able to concentrate is a good thing.
I feel much the same as the article author in that
"this place is for doing this thing, now you do it too"
is somehow powerfully motivating. But at least for me it's about the place, not the other people in the place.
I had the same covid-related journey from an office worker to unexpectedly fully remote. But I'm also lucky/privileged enough to be able to dedicate a room in my house that's quite separate from the rest of the house, and for me that's "where I work". I had coworkers who started out having to work from their kitchen table, some with housemates or children around - pretty sure that would have completely killed my productivity.
I do sometimes resent losing that room, effectively subsidising my boss by relieving his office rent costs. It used to be my "workshop," where I used my 3D printers, built drones, tinkered with electronics, and repaired stuff that broke - and I just don't do those things much any more, because going into that room now feels way too much like "work," not "hobby or play."
Doors are a necessity in the workplace, and I hate open offices. One other person is okay, but I'd rather have a small room than no room.
A door lets me close out the rest of the world when I'm in the zone. There's a time for collaboration, but there's also a time for isolation. In a physical place I can turn off all notifications and close my door. I can make a space with few physical distractions like noise or people walking in front of my desk (or talking 5 feet away...). A Slack status of "away" is interpreted as either "eh, they'll probably answer" or "they forgot to turn it back off" (or they don't notice/care) [0,1]. But with a physical door, people are much more cautious about knocking on it when it's usually open. It's not the same thing as a busy sign.
But I also don't think a door should usually be closed. It should usually be open. Invite collaboration, but also respect your coworkers. A door is a great communication tool that you just can't get online.
[0] and for the love of god, do not hit me up with "hey". It's an asynchronous messaging system. I'll read the notification as it comes across my screen. Don't try to become synchronous with me that way. Call me, physically find me, or ask when I'm free for a call.
[1] seriously, my time is just as valuable as yours. To me it's even more valuable.
I would say it comes down to common practices. Commonly, engineers don't do coffee runs. Commonly, office jobs have always been done in an office, and this status quo has only changed fairly recently (let's say in the last 20 years). History could have played out such that it became common practice for coffee-run duty to cycle through everyone in the office, regardless of role, in which case asking you to do it would have been perfectly reasonable, and someone would be right to ask what makes you so special that you can't do something everyone else does. Likewise for remote work. Some places allow it, some don't, and some may transition from one to the other. Since this is known to be the case (no one will believe you if you try to feign ignorance of it), it is reasonable to be required to come to the office, no matter how many years you've been working remotely.
>having expectations change without my consent (e.g. being told after multiple years working remotely that I need to start going into an office or "voluntarily" leave).
When expectations change unilaterally, that usually calls for a renegotiation. The correct response to "we want you to start coming to the office every day" is "okay, then I want $x more every year to cover my additional expenses". Now, it could be that either or neither party is willing to negotiate on such terms, or even that they do negotiate but no consensus is reached, in which case you just have to dissolve the business relationship. What else can you do?
Being asked to do the exact thing you've signed up for isn't inherently dehumanizing (although it certainly can be; I don't have any trouble imagining that people agree to do jobs that are dehumanizing because they need income and don't have any stronger prospects, but that's an entirely different topic of discussion). I feel like you've missed the context I gave about the initial job one is hired for being different from what they're tasked with doing; I didn't say that having to do coffee runs is inherently going to be dehumanizing, but that it's dehumanizing when you're hired to do something entirely different. Treating people as interchangeable units of labor is pretty much a textbook example of dehumanization in my opinion; we're not cogs who should be freely reassigned by authorities based on their whims, but individuals deserving of some semblance of autonomy and self-determination.
> Some places allow it, some don't, and some may transition from one to the other. Since this is known to be the case (no one will believe you if you try to feign ignorance on this), it is reasonable to require to come to the office, no matter how many years you've been doing remote work.
This is honestly a pretty absurd conclusion. Because I'm aware that some companies have certain policies, I'm implicitly agreeing to literally any of those policies by agreeing to employment at any single one? Plenty of companies require their employees to be clean-shaven, but if someone who has worked for a company for years were told they need to either shave their beard or quit without severance, I can't imagine any argument I would find compelling about why that would be reasonable. I'm sure you'll be able to come up with plenty of arguments about why you also think this example is absurd, but so far everything you've described is extremely abstract, so it's not clear to me whether there are any real-world examples you wouldn't reject for not being an exact match to the hypothetical you've described.
> The correct response to "we want you to start coming to the office every day" is "okay, then I want $x more every year to cover my additional expenses". Now, it could be that either or neither party is willing to negotiate on such terms, or even that they do negotiate but no consensus is reached, in which case you just have to dissolve the business relationship. What else can you do?
In some societies (but not the United States), a company unilaterally trying to change the terms of employment in a way that the employee disagrees with is grounds for the employee to receive severance. There are examples even in American society of companies being forced to restore positions to people who were terminated for reasons found to be unlawful.
I fundamentally disagree with the presumption that I need to be willing to present a company with an amount of money for them to force me to change my circumstances; if they're the ones who want to change things, the onus should be on them to convince me, or else they should be required to compensate me for their unwillingness to continue with the previous agreement. This isn't how things work with "at-will" employment though, and the number of software companies in the United States that offer anything other than at-will employment beyond finite length contracts are at most a rounding error above zero. This doesn't mean I have to think this is fair or reasonable; quite a lot of things in life are unfair or unreasonable without being within our individual abilities to influence, and it's not hypocritical to be willing to point those out even if I'm not willing to risk the livelihood of myself or my family to make a point about it that will in all likelihood change nothing.
But you do realize that you're going to be treated that way regardless of whether it's overt or not, right? That's why you're paid by the hour, not by how much your effort contributes to the bottom line (supposing for a moment that that could be accurately quantified). When you become an employee you do agree to become a cog in a machine. You're not some independent artist making your own way in the world; you're working on someone else's project and following someone else's success criteria, along with a bunch of other people. An employee gives up a small amount of autonomy and self-determination in exchange for stability. If that's not what you want, perhaps you should become an entrepreneur.
I honestly don't understand how being asked to perform a wildly different task is much worse than the default state of affairs. If it were me, I'd think "hell yeah! You're paying me the same money to go fetch coffee? The hell do I care?"
>I'm implicitly agreeing to literally any of those policies by agreeing to employment to any single one?
No. But it does make those policies not unreasonable. It can't be unreasonable when so many other places have said policies. That the place you're at isn't currently one of them doesn't mean it can't be one in the future, nor does it mean that it changing would be unreasonable. You especially can't put on the surprised Pikachu face when so many companies are doing it. "Wha... What do you mean in this software company they're requiring people to return to the office like they're doing at all the other software companies? This is totally unexpected!"
>In some societies (but not the United States), a company unilaterally trying to change the terms of employment in a way that the employee disagrees with is grounds for the employee to receive severance.
I live in one such country, and most people would still rather negotiate than just be fired with severance, or even just bear with it and start looking for a new job. All severance does is make it so small and medium-sized companies can't fire a lot of people at once. It's still a bigger blow to the employee, even with severance.
>I fundamentally disagree with the presumption that I need to be willing to present a company with an amount of money for them to force me to change my circumstances; if they're the ones who want to change things, the onus should be on them to convince me, or else they should be required to compensate me for their unwillingness to continue with the previous agreement. This isn't how things work with "at-will" employment though, and the number of software companies in the United States that offer anything other than at-will employment beyond finite length contracts are at most a rounding error above zero. This doesn't mean I have to think this is fair or reasonable; quite a lot of things in life are unfair or unreasonable without being within our individual abilities to influence, and it's not hypocritical to be willing to point those out even if I'm not willing to risk the livelihood of myself or my family to make a point about it that will in all likelihood change nothing.
To be honest, I'm not sure what your point is anymore. All I said was that if circumstances change and you and the other party can't come to an agreement, all that's left is to dissolve the business relationship. Everything else around that simple fact, such as the particular terms of the business relationship, seem to me largely inconsequential.
I'm not paid by the hour. Yes, my salary is quantified in a unit of time, but if you're lumping hourly and yearly wage jobs in together to contrast them with working on commission, you're ignoring a lot of details that make a huge difference in the actual experience people have in their jobs, and my point is that I think even small details add up and make a difference in how fulfilled people feel in their jobs in the long run.
> When you become an employee you do agree to become a cog in a machine. You're not some independent artist making your own way in the world, you're working on someone else's project and following someone else's success criteria, along with a bunch of other people. An employee gives up a small amount of autonomy and self-determination in exchange for stability. If that's not what you want perhaps you should become an entrepreneur.
As far as I can tell, this is pretty much covered by the last part of my previous comment: just because the world works in a certain way that I can't change doesn't mean that I have to accept it as fair and not criticize it. I don't think it's hypocritical for me to make a choice based on the stability that it affords myself and my family but think it's unfair that people have to make choices like that in the first place. The fact that most companies unilaterally decide the terms of employment and employees have no actual power to negotiate is something I can call out as unfair even if I still end up accepting that it would cost me more to refuse to participate in it.
> No. But it does make those policies not unreasonable. It can't be unreasonable when so many other places have said policies. That the place you're at isn't currently one of them doesn't mean it can't be one in the future, nor does it mean that it changing would be unreasonable. You especially can't put on the surprised Pikachu face when so many companies are doing it. "Wha... What do you mean in this software company they're requiring people to return to the office like they're doing at all the other software companies? This is totally unexpected!"
I disagree that "everyone is doing it" makes it inherently reasonable. You're misconstruing my criticism as surprise. I'm not obligated to refrain from criticizing bad things because they're expected.
> I live in one such country, and most people would still rather negotiate than just be fired with severance, or even just bear with it and start looking for a new job. All severance does is make it so small and medium-sized companies can't fire a lot of people at once. It's still a bigger blow to the employee, even with severance.
You're certainly entitled to disagree with me about this. I don't find your claim that severance only has negative effects compelling though, and I'm entitled to disagree with your claim on this as well.
> To be honest, I'm not sure what your point is anymore. All I said was that if circumstances change and you and the other party can't come to an agreement, all that's left is to dissolve the business relationship. Everything else around that simple fact, such as the particular terms of the business relationship, seem to me largely inconsequential.
My point is that I think the way a lot of companies do things is unfair, and the fact that they do them doesn't inherently make them fair. If you think this is a pointless opinion, you're certainly welcome to ignore or criticize it, as you have been doing, but I don't happen to think your criticisms are particularly convincing.
I think this is one reason this topic is so touchy -- it's hard to even express an opinion without someone assuming you mean to impose that opinion on everyone else (e.g. mandatory RTO), and then taking offense to that imagined imposition.
Perhaps in the long run we can self-organize into companies or groups within companies that universally prefer in-office or remote work.
I prefer at least some % (e.g. 50%) of my work to be in person. But I also don't like working with people who don't want to be there, or for whom being there is a huge burden. So I personally really hate RTO.
Instead, I'll choose a team or company that is open about requiring in-office time (and has been open about it for years), and is therefore staffed by people who also like that environment. It would be ludicrous to join a remote-first or remote-only company and then try to start imposing my in-person preference on others.
> Remote work eliminates a lot of problems with office work: commutes, inefficient use of real estate, and land value distortion. But software development is better when you breathe the same air as the folks you work with. Even with a camera-on policy, video calls are a low-bandwidth medium. You lose ambient awareness of coworkers’ problems, and asking for help is a bigger burden. Pair programming is less fruitful. Attempts to represent ideas spatially get mutilated by online whiteboard and sticky note software. Even conflict gets worse: it’s easy to form an enemy image of somebody at the end of video call, but difficult to keep that image when you share a room with them and sense their pain.
None of that is phrased as personal opinion or the author's own subjective experience. I don't think it's hard to express a personal preference, but it is hard to express an opinion about other people's experiences without them having something to say about it if they disagree.
But I guess this is a good example of default assumptions. When I read a personal blog, unless I see words like "objectively", or see explicit arguments stating that others should/must do X, I by default read it as the author expressing their personal opinion on X.
But I can see it the other way too. I think it could be solved by the author using an "I" framing as they did in every other section.
Yeah, this is probably what really confused me most about it. The tone felt remarkably different in that section than the others, and when it's also the topic that, at least to me, seems the most controversial, I can't help but wonder if that's representative in some way. It's not at all uncommon for people to dig in their heels more when presented with disagreement, so it's hard for me not to suspect that the only reason this section is phrased differently is that this is the topic they've received the most pushback on, which would be exactly what I'd expect even before accounting for the actual phrasing of their opinion.
Except in most cases it isn't true.
Yes, my best school/university years were when surrounded by people interested in studying. That was the optimal scenario, and everyone's reward was passing grades.
But in work nowadays, the reward is getting paid and promoted. That's not achieved by work, but by socializing, playing politics, creating mutually-beneficial relationships, building empires, and using everyone else.
Which is exactly what happens in a moderately sized workplace today, and one of the main reasons everyone else wants to stay home.
Remote work only sucks if your goals are misaligned with doing the actual job.
However, sadly, my employer is now doing their best to make this as hard as possible. Despite not allowing employees to work from home, they are allowing the consultants to do it, and they are now slowly implementing a "clean desk" policy, which drives me up the wall. Sitting and working 8 hours a day at a "clean desk" with equipment you can't tailor to your needs just sucks.
And I want to offer some contrast—not as a rebuttal, but just as a reminder that there’s lots of different ways to navigate this strange field.
The _majority_ of the paid code delivery I’ve done for a decade+ has been in Ruby. (The balance has been a mix of mostly devops and some TS/JS and Elixir.)
Remote work has been an utter boon. Admittedly, I do feel like it’s got worse since Covid. But I’ve been able to work with people all across the globe without uprooting my family and leaving my community, and conversely can travel without having to leave my job or clientele.
And I do find that some places benefit from thinking hard about their process. Small senior teams do great with Shape Up. Projects where you have a non-negotiable scope (replatforms) and work streams that are more reactive than planned do better with kanban than something involving estimates.
That’s not to say the author’s wrong! Again, just that the world is wide and experiences differ.
Some context here: I’ve consulted full time almost continuously since 2018, which certainly colors my experience.
> And yet, my lack of awareness of polymorphism showed me I’ve been writing little more than structured programs. That I could replace conditionals and case statements with specialized classes had never crossed my mind.
> Polymorphism is covered in every college OO course.
Consider yourself blessed then, because you're in for one hell of a ride if you pursue that path to its extreme. For me, it was the opposite: having been taught OOP, I had to unlearn most of it to be able to better structure my mind and how I think about programs.
You should know what polymorphism is (also, there's static, dynamic, ad hoc, single dispatch, multiple dispatch), but I don't think it's a weakness if you have not been using it that much (the real weakness is over-using it and making a clusterfuck out of your code.)
Which is a long-winded way of saying that you could be doing much worse, if that makes you feel any better, lol.
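To make the "replace conditionals with specialized classes" idea from the quoted article concrete, here's a minimal Kotlin sketch (all names invented for illustration): instead of branching on a type tag in a `when`/`case`, each class carries its own behavior and dynamic dispatch picks the right one.

```kotlin
import kotlin.math.PI

// Each subtype supplies its own area(); no "kind" field, no conditional.
interface Shape {
    fun area(): Double
}

class Circle(private val radius: Double) : Shape {
    override fun area() = PI * radius * radius
}

class Square(private val side: Double) : Shape {
    override fun area() = side * side
}

fun main() {
    // Dynamic (single) dispatch selects the right implementation per element.
    val shapes: List<Shape> = listOf(Circle(1.0), Square(2.0))
    shapes.forEach { println(it.area()) }
}
```

Whether this is better than a plain conditional depends on how many variants and operations you have; as the parent says, over-applying it is its own failure mode.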
I would also not judge you for having your own preferences and opinions. I too prefer working in an office to remote work, but when I say this out loud other developers take it as advocating RTO or saying remote work is worse when it just doesn't suit my personality. I get that it's a touchy subject but there is no need to get up in my face about it.
You mention bullying and brigading and that seems to be an unfortunate reality of this industry. I suspect there is a lot of insecurity and imposter syndrome that causes people to write hyper-confident blog posts about why they are better without AI and how their tests have 100% coverage and how (unfashionable language which half the world uses) is garbage etc. Maybe if we all follow your example and be candid everyone could chill out a bit.
I'll go next: despite trying several times, I have never successfully written anything more complicated than Fibonacci in Lisp or Haskell. I know it's clean and pure and all that, but my brain just won't work that way.
I hadn't until I joined a Lisp-based project. Learned a ton. My brain didn't work that way at first either, but working with it every day I eventually got it.
This blog post does pretty much the opposite though; its analysis of remote work is pretty much entirely just generalizations of their own experience, but phrased as if they're objective truth. It was an especially weird editorial choice to make use of the "general" second person given how much outside of that one paragraph was written in the first person. In an article that's ostensibly trying to be humble and vulnerable like you mention, it just comes across as patronizing. I can't say I'm surprised that the author might have been judged for expressing this opinion because it's not about their personal preference, but a judgement of its own.
I think a lot of people genuinely struggle with the idea that sometimes how something is said can matter just as much as what's being said. Being correct and being respectful are orthogonal concepts even when talking about objective truth rather than opinions; if someone asks what 7 times 9 is, there's a difference between telling them "63" and "Well, obviously it's fucking 63, duh!". For a subjective topic like remote work that some people's lives have been quite significantly affected by, it's even more important to put some effort into understanding how one's words will come across, because if the phrasing is poor, people aren't necessarily going to feel the need to go out of their way to give it the benefit of the doubt. I can't know what exactly the author was thinking when writing that paragraph, but I also can't distinguish between whether they have the same viewpoint as you but communicated it poorly or whether they genuinely think there's some objective truth that I'm worse at my job working remotely than I would be working in person. Given the amount of care I've put into addressing many of the exact issues they've raised due to needing to work remotely because of a medical condition of an immediate family member, it was quite hard for me not to have an immediate strong angry reaction to how flippant they seem to be with what is, at best, their phrasing of their opinions. My point is that it's a lot more work to actually care about how one's point comes across than it is to claim that people are overreacting after the fact, and it's worth considering how much of the reaction the author mentions having gotten in the past is reflective of this.
The 7 times 9 analogy doesn't track at all. 7x9 = 63 is an objective fact by definition. His thoughts on remote work are an opinion by definition. If other people decide that what he says is dogmatic, blame it on their own lack of critical thinking skills.
The meta-point of the article is that we should express our thoughts without qualifiers and embellishments to manipulate other people's perceptions of us.
In my experience this is a common failure point among tech/analytical folks (myself included), which leads to their words and actions being generally misconstrued and effectively misunderstood by the larger segment of the population, which is rarely able or disposed to handle communications without embellishments.
Blaming the rest of the world for an inability to communicate effectively is not orienting the blame correctly.
Not prefacing what clearly is an opinion with "IMO" is not a jedi mind trick that makes others believe it as fact.
You're also demonstrating some hypocrisy by presenting your own point of view in the same manner. No qualifiers. You're simply stating something as truth.
A lot of folks flip to “it’s just my opinion” only after they get pushback, but if you present something as a fact, it’s fair game to question it.
Like if someone says “apples taste bitter and have no flavor” that reads like a universal claim, so yeah people will argue. If you say “I find apples bitter and lacking flavor” that’s obviously personal taste and nobody is going to demand proof.
Nobody is asking for IMO everywhere. Just don’t frame opinions as facts or the other way around.
I fundamentally disagree with this. In my experience, it's pretty much impossible for people to perfectly understand intent without a certain amount of effort from both the communicator to express it clearly and the listener to understand it. In practice, I don't think there's a good chance of successful communication on any nuanced topic without good-faith effort from both sides, and I can't differentiate between the language the author used and what I'd expect to hear from someone who reflexively dismisses any disagreement as in bad faith.
I say that because you (and everyone else who seems upset) clearly understand it's just his opinion. Therefore, why are you offended by his intent? Whatever his intent might be, I think it's irrelevant. It's simply a strongly held opinion.
I genuinely don't understand whether it's the case or not, and I've tried to be clear about that. I am not able to tell whether it's their opinion or if they actually feel like they're objective facts; both are plausible to me, and I'm arguing that if they want people to understand which they mean, they need to be more specific. Otherwise, people will draw conclusions that may not align with their intent, and that's something they could avoid if they put more care into how they expressed it.
His words read the same as any editorial I’ve seen.
"It is my opinion 7 x 9 = 63," wouldn't be an opinion in the sense that opinion was being used in the thread. Yes, we can question the veracity of a statement of fact, but that isn't the same sort of opinion as whether something is subjectively good or bad.
It's pretty hard to know where the opinion is.
The whole paragraph presents as though author is relating known symptoms of a disease. We're never really sure which they themself actually experienced. They look more like arguments in support of a cause.
Author is totally entitled to open that door, but then it also becomes fair game to attack the perspective.
Would you agree that whether something is an opinion or fact is itself objective, for most cases at least?
I ask because nobody is questioning whether or not what he states was actually an opinion. They seem to simply be upset with the manner in which he phrased it. He was simply too sure of himself and people found that offensive. Which seems a little ridiculous don't you think?
> Remote work eliminates a lot of problems with office work: commutes, inefficient use of real estate, and land value distortion. But software development is better when you breathe the same air as the folks you work with. Even with a camera-on policy, video calls are a low-bandwidth medium. You lose ambient awareness of coworkers’ problems, and asking for help is a bigger burden. Pair programming is less fruitful. Attempts to represent ideas spatially get mutilated by online whiteboard and sticky note software. Even conflict gets worse: it’s easy to form an enemy image of somebody at the end of video call, but difficult to keep that image when you share a room with them and sense their pain.
It's hard for me to read that as anything other than literally describing to me what the experience of working with me remotely is. OP has never worked with me as far as I'm aware, so they have no idea whether it's accurate or not. Charitably, they might not mean what they're saying literally, but I'm making the argument that for topics that are controversial because of how people have been burned by overly prescriptive policies in the past, the burden is on the speaker to avoid voicing opinions in a careless way that relies on the listener to glean that their intent isn't the same as what people have experienced in the past.
My meta-point is that people are free to express their opinions without spending effort trying to make their intent understood, but by the same token, people are free to react to those opinions with the exact same level of effort spent trying to understand their intent. In my experience, there are a lot of people who complain that they're treated unfairly for expressing their opinions without realizing that what people are actually reacting to is how they express their opinions, not the opinions themselves. I've personally struggled quite a lot over the years with understanding how other people will interpret my communications, so I have a lot of sympathy for people who also struggle with this, but if someone doesn't seem to even accept the premise that part of the responsibility for being understood lies with the speaker in expressing their intent clearly, I lose patience quickly. This is especially true when the "opinions" are expressed in a medium where the person communicating has an unbounded amount of time to work on clarifying their intent before the message is actually received by someone else; I don't expect everyone to be able to perfectly articulate things in real time when talking in person, but when the opinion is expressed via a blog post, they don't have the same constraints in working on how they convey what they're saying. The fact that the blog post seems to be overall taking the stance that it's better not to worry about how someone will interpret their intent makes it feel even more likely they might just not understand what people's actual issue with their communications has been in the past.
It genuinely seems like they might not have been able to distinguish between good-faith misunderstandings and bad-faith intentional misinterpretations of what they've said, and that's unfortunate if it's led them to the conclusion that they just don't need to care about what anyone thinks about their opinions rather than that they need to learn how to better communicate to those who are attempting to respond in good faith and ignore the ones who aren't. A lot of people understand that people can disagree with them in good faith in the abstract but fail to actually recognize when that's happening in the present, and quite a lot of what's expressed in this blog post resembles what I've seen from other people who struggle with that.
I mean you're describing 90% of blog and forum posts on the Internet here.
This (IMO - so it's not ironic) is the biggest leap most people need to make to become more self-aware and to better parse the world around them: recognizing there is rarely an objective truth in most matters, and the idea that "my truth is not your truth, both can be different yet equally valid" (again in most cases, not all cases).
You ever wonder why? Serfs finally got freedom after corona, but apparently some actually prefer to be in a serfdom instead of having the freedom to choose for themselves. You're being a useful idiot for managers; that's why you get backlash.
So I'd prefer to work in an office, so long as it was nearby and the commute was short and my officemates were fairly quiet. This does not mean that I'm "advocating for serfdom". Working for an employer is no more (and no less) serfdom in an office than it is at home.
Some people like having an office.
I could call someone like you a hyper-dramatic, agoraphobic, socially inept recluse over a simple posted opinion, but that wouldn't be kind, fair, or mature.
It's a huge open space filled with stale, stinky, dry office air, obnoxious people, and dirt. The conditions aren't a binary of "cotton field or not cotton field"; only serfs think like that.
> I could call someone like you a hyper-dramatic, agoraphobic, socially inept recluse over a simple posted opinion, but that wouldn't be kind, fair, or mature
Projecting much? My home setup costs more than half of that "office" combined, and that's just my room.
Do you seriously think I want to kill my back, my eyes and my attention for 40 hours a week when I can comfortably work at home and be much more productive instead of playing clown because some bubs can't tolerate working without distracting another person for 5 minutes?
Working in an office as a preference is one that naturally relies on the control of other people. The reason people like working in an office isn't because of the office. If you went to the office, by yourself, it would be worthless. The value of the office is the communal nature of it.
So, one position naturally requires forcing other people to work where they might not want to, and one doesn't. With WFH, you can work in an office, nothing is stopping you.
When you say you prefer working in an office, you aren't stating your preference. You're stating what you arbitrarily think everyone else's preference should be.
In a forum like this, stating your preference is just that: stating your preference.
If you were talking with your manager and stated your preference, you'd be stating your preference and, between the lines, asking to make it happen for yourself.
If you were talking with your manager and stated your preference and specified the reason is because you prefer working around people, only then, between the lines, you'd be asking to make it happen for your whole team.
But the buried lede so to speak is that RTO has literally nothing to do with the office. The office is just an empty box that happens to exist somewhere.
So the level of control for each preference is wildly different, and they can't just be compared like that. One is naturally 'closed', and the other naturally 'open'. That, to me, does speak to the intrinsic value of each preference.
Not at all. Working in an office as a preference is one that can instead rely on working with other people who also share that preference. No control is necessary.
With such a preference I can't help but wonder:
1. How genuine is it? Where is the "cutoff" point where in-office work no longer works? Do we need 100% compliance? What about 80%, is that good enough?
2. What, materially, do you gain from the preference and does that material gain actually rely on the preference? From what I've heard, 99% of the time it does not.
I find that I work better in an office, depending on the office. I'm in no position to enforce that position on anyone. (I'm currently unemployed and looking for work, in fact.) I find that I dislike giving up room in my small house for work. And I dislike having no separation between work and home.
These are all personal preferences. Nothing is being enforced on anyone. Your reaction is overblown.
This isn't a reaction on my end; I'm just explaining where the value judgment of the preference comes from. It's intrinsically a "closed" preference, and people generally don't like that.
You are not in a position of power to exercise said preference, you rely on the goodwill of your company. That's fine, but still, you exert some influence. People are listening, and some of them do have the power to exert that control. So when you say "I like that control", it makes people a little nervous.
And, onto my whole "does this actually require in office work" point:
> I find that I dislike giving up room in my small house for work. And I dislike having no separation between work and home.
This is that. None of these preferences require in office work, that's just a close enough proxy. I would argue these are more obtainable in a WFH environment, because the cost savings of WFH can easily afford you a dedicated office space away from home.
Because, again, one is open, and one is closed. So with the open one you can just do that.
I mean, it's better than the hyper-confident blog posts about how AI made them an ubermensch superdooper 100000000x developer but "no you can't see the code because it's for WORK". Nice subtle dig, though.
Except… science has shown this to be true. Even after a year-plus of work, less than 2% of devs work faster or more efficiently with AI than without it. And for almost 90% of the remainder, regression analysis strongly indicated that none of them would ever be better with AI than without it, regardless of how much practice they had with it.
These general results have come up with study after study over the last few years, with very consistent patterns. And with AI becoming more hallucinatory and downright wrong with every generation - about 60-80% of all responses with the latest models, depending on model being examined - the proportion of devs being able to wrestle AI into creating functionally viable work faster than they could to it themselves has also decreased slightly.
A few years ago I was the TL on a FAANG Android project, where for a few months I was doing more spreadsheet/TPM work than usual, and didn't have much time for coding. Once we had a meeting where I ended up coding in Kotlin live in front of a dozen younger devs to discuss the implementation of some feature. My work background is Android and Java/Kotlin, but at the time I was mostly coding in C on the side, and in the moment my brain just forgot what the syntax in Kotlin is for a "switch-case" statement, so I wrote "switch", "match", etc, struggling like a first year student, while everyone watched me fumble, until I just gave up and said: "oh my god, I'm forgetting Kotlin. What the hell is the switch keyword in Kotlin called?". Then someone said: "it's when".
I felt old and a little embarrassed, but mostly I was surprised at how quickly I could forget a programming language I used daily.
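For anyone who hasn't hit this: Kotlin really has no `switch` keyword at all; the equivalent is the `when` expression. A tiny sketch (toy function, names made up) of what the parent was reaching for:

```kotlin
// `when` with a subject can match literal values and types in one expression.
fun describe(x: Any): String = when (x) {
    0 -> "zero"             // equality check against a literal
    is Int -> "some int"    // smart-cast type check
    is String -> "a string"
    else -> "something else"
}

fun main() {
    println(describe(0))       // prints "zero"
    println(describe(42))      // prints "some int"
    println(describe("hello")) // prints "a string"
}
```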
I notice that general concepts usually stick better in my brain than specific things like your example with ‘when’. Even those are pruned down a bit after long enough though.
Every ounce of data proves this statement wrong. If you feel like you work better non-remote then do it. Don’t shill it as a panacea. I’ve been remote for 11 years now and if I wasn’t I wouldn’t have been able to take care of my family, go back to school part time, work on my health with better meals and reasonable gym hours, etc. even IF in office was better for the employer (even though all data says it’s not in terms of productivity) it is unequivocally better for the employees life to work remote as much as humanly possible.
This hot take is just simply insane. Humanity had no problem coordinating massive projects over IRC and mailing lists. It’s clear the author is a “nu-coder”.
> Every ounce of data proves this statement wrong
There is no need for hyperbole. Because one thing we can all agree on is that remote work eliminates commutes, by definition.
Humanity has plenty of problems coordinating over IRC and mailing lists. That we have succeeded some of the time does not imply we would succeed all the time or that there aren't significant downsides. These discussions often bring up Linux as an example and sure, the remote development of the Linux kernel is indeed a testament to what you can accomplish with remote teams and strong coordination. On the other hand, we could note that despite how successful that remote coordination has been, the (arguably) most successful Linux OS business (Red Hat) decided they needed offices and in person work long before they were owned by IBM. Likewise the SuSE Linux folks have offices around the world. There must be some benefit they're getting from that to have decided to take what was a fully remotely coordinated project and centralize some of it.
> even IF in office was better for the employer (even though all data says it’s not in terms of productivity) it is unequivocally better for the employees life to work remote as much as humanly possible.
That might be true for some people, but it is not true for all. If you'd asked me before COVID if I wanted a 100% remote job, I would have told you yes. I'd even applied for a number of (the far more limited at the time) remote jobs like Gitlab. And then COVID hit and I spent 2-3 years working from the single spare 10x10 space in my home. In that time I lost precious living and hobby space to having a dedicated working location (approximately 20% of my home). I increased my personal utility costs without compensation. My mental state deteriorated due to a lack of mental and physical separation from work and home. I found it far more difficult to accomplish my work due to a number of at home distractions. I struggled heavily to keep up with things happening across my team as it was difficult to both keep up with the async chat discussions without also burning massive amounts of time and energy context switching. I found that I personally need some form of a "commute" in order to switch my mental state from home to work and back again. I had to allow corporate devices filled with corporate spyware on my personal network. I had to isolate parts of my house from my spouse at various times. I had to allow strangers and colleagues a video view into my personal and private home. Working 100% remote was unequivocally worse for me as an employee.
By contrast, now that I'm back in the office most days, I have an employer provided dedicated working space. I have free coffee, tea and fruit. I have a space where I can be focused on working on something and still keep an ear on other things happening within my team, allowing me to context switch when my attention is needed without needing to switch just to find out if my attention is needed. I have a free gym on site that allows me to exercise with equipment that I don't have at home and wouldn't have the space for even if I could afford it. I don't have to allow corporate devices on my home network anymore. I have a cafeteria which serves reasonable and healthy food at reasonable prices when I don't feel like making my own lunches. I have access to high quality and private video conferencing systems when I need to coordinate with other remote individuals and I no longer have to allow strangers visibility into my home in order to conduct interviews. I get to eat meals with co-workers and colleagues and have social engagement during my breaks. I can get away from home distractions and more easily focus on the work I have at hand. I have a reasonable commute that's just long enough to allow me a mental switch without being oppressively long, and takes me past a number of locations that I would have needed to go to any way each week.
Which isn't to say it was 100% bad. To this day I have a hybrid situation which affords me benefits that I would not have with a 100% in office position, and for which I am eternally grateful and fortunate. I also recognize that I work for a very good company that provides a number of perks that aren't available to everyone who works in an office. But that's the point. In office work doesn't mean just one thing, and neither does remote work. Both are highly subjective experiences and to say that remote work is "unequivocally" better for everyone is just wrong.
The monolith to microservices trend was one great example of this.
I had the sweetest manager once. Someone started talking about the iPhone and she [1] casually asked "what is iPhone?" (this was 6-7 months after the iPhone was launched). Everyone's jaw dropped ... what? What world do you live in? ... to which she said with a wide smile and not an ounce of embarrassment ... "what? I don't know what iPhone is?"
But she was otherwise so good in every other aspect ...
[1] She is/was mother of 4 kids and that left her very little time for anything else.
Once I got over the embarrassment hurdle of asking “dumb” questions, I grew a lot in my early career. Then people saw me as highly engaged, and my questions and understanding got better over time.
In particular, I encourage all the new joiners on my team to play the “newbie” card to allow them to ask as many questions as they want.
“Hey, new guy here… what is X?”
I even tell them to set a goal of one question per day if it helps.
I think it’s so important not to be passive because you absorb / understand less that way.
Technology isn't intuitive and there are loads of things to remember. Sometimes the dumb question is needed by more than the person who asked.
I'm surprised at this statement. My team pair a lot - at least half of the time - and the majority is remote.
We find it much more comfortable to pair remotely on our own setups than crowded around a single desk and keyboard.
I wonder why our experiences are so different.
I work remote and wouldn't have it any other way, but I'll admit there are culture things I miss since I really love my team. As an employee? My employer gets way more value from me without the bullshit of going to an office.
I would try Tuple but to be honest we are fine on Code with Me.
In person, it's usually easy to see when a junior is struggling, and pull up a chair. That same junior, in a remote environment, might not proactively ask for help. And me sending a Slack message to them asking how they're doing might get an "all good" even when they are not. And sending a huddle request to them because I suspect they're struggling is just a VERY different thing than looking over at them to read their body language, or swinging by their desk to check in.
Maybe some would say that it's on that junior to know they need to ask for help. Sure, great. That does nothing to resolve the reality of this very real situation. Of course part of coaching this junior (in person or remote) would be encouraging them to be more proactive about asking for help, but if you have fewer opportunities to offer help and coaching in the first place, their growth is slower. Significantly slower in my opinion.
This is a hard difference to convey to someone who has never experienced a good in-person culture. But I know I would not be where I am if I had spent the first decade of my career working remotely.
I love software development, but recently I have been doing too much data analysis and nothing too exciting.
I get afraid of getting the basics wrong when I go long streaks like that. I appreciate the fact that you showed that to others. This is healthy for everyone. We are living in a society where faking it is so common that people just cannot bear to be themselves anymore.
This is from Uncle Bob. I hate the argument by people that 100% leads to "bad quality tests". Not doing it leads to bad quality code; people who don't care about the quality of code, and hence don't write tests, suddenly start to care about the quality of tests.
If you tell people that 95% is just as inadequate as 0%, they'll tend towards 0%.
1. The getters and setters are not called anywhere in application logic. In that case, delete the getters / setters and get to 100%.
2. The getters and setters are called somewhere in the application logic. In that case, they should have already been covered in the test for the application.
There is really no excuse to not write tests to get to 100%.
It's common that getters and setters will be called by a serialization library. The fact that they are "grey" in IntelliJ doesn't matter.
You can argue that it _shouldn't_ be this way, and I would agree, but it is that way. Perhaps in part because developers are humans, and even humans with the best initial intentions will game metrics with a large enough sample size over a long enough time horizon.
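The "getters called only by a serialization library" point is easy to demonstrate: reflection can invoke a generated getter even when no direct call site exists in the codebase, which is how such libraries typically reach them. A minimal Kotlin sketch (class and property names invented):

```kotlin
// A Kotlin `var` property generates getName()/setName() on the JVM.
class User {
    var name: String = "alice"
}

fun main() {
    val user = User()
    // A static analyzer sees getName() as uncalled ("grey" in the IDE),
    // yet it runs here via reflection, just like a serializer would call it:
    val value = User::class.java.getMethod("getName").invoke(user)
    println(value) // prints "alice"
}
```

This is why "the IDE says it's unused" isn't a reliable signal for deleting accessors, and why naive coverage metrics mislabel them as dead code.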
I get the feeling there are a lot of people who are developers by trade who are following what they feel they should do to progress their career rather than what they should do to work on something that interests them. Sure, the industry isn't what it used to be in terms of job market, but for a good long while there it was relatively easy to find something that interests you if you were competent. But if you're not interested in what you're working on, I find it strange to write a blog about the C#/.NET journey. I'm asking these questions genuinely: Was this a self imposed expectation for career reasons? To have a tech blog? Are you actually interested in programming or is this something you found that you're decent at and knew there were career opportunities?
About SQL knowledge: over many years in this profession, I've re-learned SQL three times. This is because I'm a generalist, and there were large portions of time where I just didn't need to know it. But surely, in all your years of experience, you must have realized: you will never know or recall everything you learn, or even scratch the surface of what you haven't. Even if you did remember everything about SQL when you learned it, you're going to have to relearn it anyway. Every time I touch Postgres there are loads of new features out. That is how any maintained software is. That is just the job, as far as I'm concerned. Very few people have the luxury of being able to hyper-specialize, and looking things up isn't just important, it is necessary, because being an encyclopedia of product development isn't why you're employed. The knowledge and expertise you are paid for isn't SQL syntax or how well you know C#; it is your ability to apply technology to solve problems and effectively work within a team to do so (a tangent, but this is why in the long run AI won't replace you).
I don't mean to preach or give a tome of a message. I am _genuinely_ interested in your perspective. I have bounced around a lot in my career but generally have to mentor folks and I'd love to chat more about this particular struggle because I hear it frequently. Especially from those who seem to have a bit of an identity crisis in their profession.
Reading what he wrote about remote work in particular is so strange to me, even though I am fully aware that I've got colleagues that seemingly mirror his views.
On this anonymous platform I feel comfortable pointing out that this isn't an artifact of remote work whatsoever.
If you weren't like this before, and only noticed yourself becoming like that... I'm afraid you've just changed. Accepting people/coworkers along with their flaws is a conscious decision you need to make. You likely just weren't able to see their shortcomings earlier in your life, likely because you hadn't been negatively affected by actions like the ones they're taking.
Let's say you have a colleague that always does as little as they can get away with. Everyone will know such a person. Eventually you get the ability to tell quickly when a person is like that. And then you will either start to see them as a negative or just accept this about them - because it doesn't make them a bad person, and you can still joke with them and learn from each other.
0. anti-excellence:
0.a. slobs/jerks who didn't care about system or code entropy, or the effects of their carelessness on other developers, users, or other stakeholders
0.b. uncurious people who lacked knowledge and didn't care about learning because it was "just a job to them"
1. "competitive" egotists who sought to exert force, control, or domination of "their way" rather than collaborative experimentation
Furthermore, environments that reward impact and achievement to maximize credit for performance review seldom reward refactoring, training, or any sort of deep problem solving or presentation of tools, methodology, or experience.
I’ve been doing this for years and I still have to look up basic SQL syntax or regex patterns if I haven't used them in a month. The skill isn't memorization; it’s knowing what is possible and how to find the solution quickly.
If they can get away with a query that takes 2s to return a single row, they will be quite content and will not be bothered to look at the query plan.
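A minimal sketch of what "looking at the query plan" actually buys you, using SQLite purely for illustration (the table, column, and index names here are invented, not from the article):

```python
import sqlite3

# In-memory database with a hypothetical orders table.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
con.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable detail in column 3.
    return [row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql)]

# Without an index, the planner falls back to scanning the whole table.
before = plan("SELECT * FROM orders WHERE customer_id = 42")

con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index in place, the plan searches via the index instead.
after = plan("SELECT * FROM orders WHERE customer_id = 42")

print(before)  # a SCAN of the table (exact wording varies by SQLite version)
print(after)   # a SEARCH using idx_orders_customer
```

Five minutes with `EXPLAIN` (or `EXPLAIN ANALYZE` in Postgres) is usually all it takes to turn that 2s query into a few milliseconds.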
It's a shame, because everything could be a little better with hardly any effort.
No, it's not. This goes against the whole thread and the article posted.
“Follow Scrum, Lean / Kanban, or eXtreme Programming to the letter” - there are plenty of failed projects and unhappy devs that have done just that. And these methodologies are not tuned for the LLM-generation age and, talking to lots of other devs around the world, I think it is showing.
In regards to remote work, I’ve worked for shops that have been fully remote since before the pandemic, and they were wonderful experiences. They’ve figured it out. The OP’s feelings on remote work, to me, say “the companies I’ve worked for are really bad at supporting remote work” - but if you believe your experiences are representative, you end up saying “remote work is bad”.
You can get to 100% by having tests that run the code, but have no assertions.
You can run tests that test unimportant code just as much as super critical code. Coverage doesn't differentiate between the two. Of course super critical code should have a number of different tests that exercise it. It's not the same as testing every path; it's testing different inputs and checking that you get the right results. Also see property testing.
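The point about assertion-free coverage can be made concrete with a toy example (the `discount` function and its "bug-prone branch" are invented for illustration):

```python
def discount(price, is_member):
    if is_member:
        return price * 0.9  # if this rate were wrong, only an assertion would notice
    return price

def test_discount_coverage_only():
    # Executes both branches, so a coverage tool reports 100% -
    # yet this test can never fail, no matter how broken discount() is.
    discount(100, True)
    discount(100, False)

def test_discount_with_assertions():
    # Identical coverage, but a wrong rate now actually fails the test.
    assert discount(100, False) == 100
    assert discount(100, True) == 90

test_discount_coverage_only()
test_discount_with_assertions()
```

Both tests produce the same coverage number; only one of them tests anything.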
Chasing 100% is like any metric that becomes a goal: it perverts the metric and drains it of meaning.
Why is that? Well, we don't really want tests at all - if only people could write perfect software the first time, we wouldn't need them. Stupid people!
What we want are reliable systems! So we use feedback loops between deployed systems and code to help us discover those places where we need more tests, or a different type of testing, and then we do that.
Of course, if your test coverage is 0%, that's probably bad, but 100% is a non-goal.
You'll also find that if there are no tests in a system, when you need to add them, it's really hard, because the system isn't designed in a way that makes it testable. So maybe the TDDs will help you! You end up with a system that you have high confidence in, and that is also testable... so when you find something that doesn't work how you thought, it's easy to add that test right in there.
I've always wanted to spend some more time on mutation testing, which can be used to improve test quality instead of just focusing on quantity. But I've found it to be completely irrelevant in the industry so far.
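For anyone unfamiliar with the idea: tools like mutmut or PIT automate this, but the core loop can be sketched by hand. The function names and the particular mutation below are made up for illustration:

```python
def is_adult(age):
    return age >= 18

def mutated_is_adult(age):
    # A typical machine-generated mutant: ">=" flipped to ">".
    return age > 18

def weak_suite(fn):
    """A plausible-looking suite that never probes the boundary."""
    return fn(30) is True and fn(10) is False

def stronger_suite(fn):
    """A suite that does probe the boundary at age == 18."""
    return fn(18) is True and fn(17) is False

# The weak suite passes for BOTH the original and the mutant:
# the mutant "survives", revealing that no test covers age == 18.
print(weak_suite(is_adult), weak_suite(mutated_is_adult))        # True True
# The stronger suite "kills" the mutant while still passing the original.
print(stronger_suite(is_adult), stronger_suite(mutated_is_adult))  # True False
```

A surviving mutant is the interesting signal: it points at behavior your tests execute but never actually check.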
mind sharing the challenge (or an analogous one so you don't give away the real one)? I'm curious how bad I'd do with it
Not remembering SQL or the details of different joins - after looking it up, how much time does it take to kick back in?
I don’t remember a lot of stuff by heart, but once I’ve worked with something I can get back up to speed rather quickly - if you need it, just brush up on it. If you don’t need it, who cares.
Polymorphism: I can do stuff with it, and yes, it hurts my brain when I see switch statements that should be objects - but on the other hand, switches or ifs are much easier for a lot of folks to understand.
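The trade-off in miniature - a sketch with invented shape classes, not code from the thread:

```python
# Version 1: a plain if/elif "switch" - all the logic is visible in one place,
# but every new shape means editing this central function.
def area_switch(shape):
    if shape["kind"] == "circle":
        return 3.14159 * shape["r"] ** 2
    elif shape["kind"] == "square":
        return shape["side"] ** 2
    raise ValueError(f"unknown shape: {shape['kind']}")

# Version 2: polymorphism - each shape owns its own behavior, so adding a
# new shape means adding a class, leaving existing code untouched.
class Circle:
    def __init__(self, r):
        self.r = r
    def area(self):
        return 3.14159 * self.r ** 2

class Square:
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

print(area_switch({"kind": "square", "side": 3}))  # 9
print(Square(3).area())                            # 9
```

Both give the same answer; the switch is easier to read top-to-bottom, the objects are easier to extend - which matches the "easier for a lot of folks" point above.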
> Remote work eliminates a lot of problems with office work: commutes, inefficient use of real estate, and land value distortion. But software development is better when you breathe the same air as the folks you work with. Even with a camera-on policy, video calls are a low-bandwidth medium. You lose ambient awareness of coworkers’ problems, and asking for help is a bigger burden. Pair programming is less fruitful. Attempts to represent ideas spatially get mutilated by online whiteboard and sticky note software. Even conflict gets worse: it’s easy to form an enemy image of somebody at the end of video call, but difficult to keep that image when you share a room with them and sense their pain.
Buddy, that's either a you problem or a problem of the software culture around you.
I've thoroughly enjoyed remote work. I thoroughly hate being in the office.
In the office, I can't type out notes nearly as quickly. I can't read people's inquiries two or three times to understand what they were asking in the first place.
Why should I form an "enemy image" at the end of a video call? I don't. But in a conference room? That is easy. I can see their faces, see when they're lying or hiding something, and it only ever builds drama.
Attempts to represent ideas spatially don't get mutilated by online whiteboard and sticky note software - unless you don't know how to use online whiteboard and sticky note software. You need integration, you need training to use the integration, and you need to keep focus on the topic. You need to identify the ancillary information and objects. You can do all of that online while someone else is leading or speaking. Everyone can. It's a hell of a lot harder to do that when everyone is sitting in a conference room staring at the speaker instead of typing on their computer screens.
Asking for help is only a bigger burden if your culture allows information silos. Encourage your team to speak up into a common chat room. Let anyone answer the question instead of just whoever was asked. It's easier to develop empathy and watch someone grow when everyone is given the shared burden of answering questions. It's easier to identify what processes are causing the biggest problems when everyone can see the amount of questions or frustration about it. It's harder when that information silo is real.
Remote work is way better than in-office work. If you disagree then you are a very different person than I am, and/or have very different experience than I have.
Remote work is different than in-office. And it's better for some people and worse for others. For example, while I personally find it useful to be able to type out an example when talking to someone about things, I also find that in order to get things done effectively at home, I have to ignore the notifications (or will ignore even if they're not explicitly ignored) when I'm "in the zone" as it were. The problem with this is that someone might have asked a question that I have the answer to, or even asked me a question directly, or someone else on my team may have been going down a rabbit hole that I could have stopped them from going down. But because using a chat system inherently requires context switching and your full attention, the only way to be in the loop is to be continually breaking out of where you are to go look at chat.
By comparison, in the office with my team around me, I can keep one ear open to the conversation that's happening in the air around me. My screen - literally my single focus within the computer - and my fingers can all be occupied working on something, and I can use my additional sense of hearing to keep up with other things going on. When someone needs me specifically, they can (with varying degrees of forcefulness) grab my attention, whereas online they have one and only one way, and it has the same priority as any other notification in both my conscious and unconscious mind unless I specifically read the notification (and again, context switch).
Video calls still to this day suffer from latency issues. We all, continually have the "What about - sorry - what if - sorry you go - do you want me to go?" conversation in video calls. That's objectively a worse experience than just having everyone in the same room. Even when people in the room start talking over each other, that can be resolved much faster in person than on the video call.
It's also really easy to get into the habit of not paying attention in conference calls/video calls. Because of the scheduling issues, remote work tends to include a lot more "just in case" invitations to meetings and discussions. Sometimes you really do need to be there, other times you don't. So you often get courtesy invites, and you might go, and while you're listening you might do that "identifying the ancillary information", or just keep trucking on whatever you were doing before the call started because you can just listen in. And slowly over time you and everyone else starts to build up the habit of not paying attention at all. It takes conscious effort and specific behaviors to not let yourself get distracted by the big distraction box sitting in front of you while you're having your meetings. There's a reason we generally consider it rude to be on your phone or computer in an in person meeting without specific need.
Perhaps more telling though is the fact that even remote work people acknowledge the importance of having dedicated working space. Even if you don't have other people in the space with you, almost everyone can agree that having dedicated space for working is important. But remote work puts the burden of paying for and subsidizing that space on each individual employee. For some of us, that's not a significant burden and for others, it's quite significant.
If 100% remote work were objectively better for all people and all things, I'd ask why co-working spaces exist. Why do remote workers congregate in coffee shops? Why, even though the internet and online communities are "remote first" groups, do we still have conferences, meet-ups and conventions? Why do we bother with these expensive and difficult-to-coordinate in-person gatherings if everything we would do with them we could do better remotely?
In the end, remote work isn't one thing, it's many different things for each individual person and how you experience it is highly subject to your personal circumstances and your work environment as a whole. It should be entirely unsurprising that people are different and experience remote work differently and that as a result, plenty of people will genuinely prefer working in office to working remotely.
When I was a remote worker (forced to be so during extended lockdown insanity in my state), I worried constantly about how I was seen by coworkers and my manager.
I felt very vulnerable to (and fell prey to) RIFs like I had never experienced in my whole corporate career stretching back to the '90s.
Slack is no replacement for confidential discussions with coworkers for understanding what is really going on with the business, with budgets, with whatever demons the brand folks are flirting with. And generally speaking, informal desk meetings and socializing during breaks was artificial when forced and non-existent otherwise.
It was sad to go to holiday parties and see some of the introverted young guys so hungry for in-person contact and chats.
When I joined a hybrid office, I found I was very short with people in-person and was very demanding. Remote work had left me sensitive to and irritated with faults I perceived in people, and I had to consciously work to soften up with them.
I know many remote workers have drunk the koolaid and are very passionate about it, God bless 'em. I personally think it's shortsighted and weakens our bargaining power.
When my grumbliest coworkers eventually self-exit as the RTO announcements inevitably roll in, it's partly a relief, but I worry about them and my industry.
Yet when I look at the conversation itself, I cannot find anything that seems directly hostile towards them:
I do not see personal attacks from either side. I see a normal debate with some snark now and then, which is hardly unusual in online discussions. I may be missing something that happened out of view, but from what is available, the tone does not appear toxic.
It looks like a normal debate - very prolonged, but a normal debate - until the author gets banned: https://lobste.rs/~7u026ne9se I don't understand why they were banned, though; I see no bad interaction from either side.
A moderator rebukes the OP for starting a fight: https://lobste.rs/s/gkpmli#c_mejg0v I don't understand this either. Both sides were equally forceful in the debate. But the debate was always civil. Yet only one side receives a rebuke. The whole incident seems strange to me.
More weirdness:
I wonder if I'm an anomaly, or if it's actually more common that one might assume?
Obviously pragmatism is always important and no advice applies to 100% of features/projects/people/companies. Sometimes the test is more trouble to write than it's worth and TDD never worked for me with the exception of specific types of work (good when writing parsers I find!).
If you make a logical error in the code, chances are you will make a logical error in the test that tests that segment too.
A big problem with tests is making sure they are correct.
Would having someone else write the tests catch more logical errors? Very possibly, I haven't tried it but that sounds reasonable. It also does seem like that (and the other things it implies) would be a pretty extreme change in the speed of development. I can see it being worth it in some situations but honestly I don't see it as something practical for many types of projects.
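One cheaper mitigation than a second author is property-based testing (mentioned upthread): instead of mirroring the implementation's logic, you assert invariants, so the same mistake is less likely to appear in both places. A hand-rolled sketch using only the standard library - the sort-under-test is a stand-in, not anyone's real code:

```python
import random
from collections import Counter

def my_sort(xs):
    # Stand-in for a hand-written implementation under test.
    return sorted(xs)

def check_sort_properties(fn, trials=200):
    """Check two invariants on random inputs, rather than specific
    hand-picked examples that might encode the same logical error."""
    for _ in range(trials):
        xs = [random.randint(-50, 50) for _ in range(random.randint(0, 20))]
        out = fn(xs)
        # Property 1: the output is ordered.
        assert all(a <= b for a, b in zip(out, out[1:])), "not ordered"
        # Property 2: the output is a permutation of the input.
        assert Counter(out) == Counter(xs), "not a permutation"

check_sort_properties(my_sort)
print("all properties held")
```

A test written this way can't silently share a "return price * 1.1 when I meant 0.9"-style error with the code, because it never restates the computation - it only states what must be true of any correct answer.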
What I don't understand is saying "well, we can't do the really extremely hard version, so let's not do the fairly easy version", which is how I took your original comment.
Where the costs are high, like say in safety critical software or large companies with highly paid engineers on-call where 9s of uptime matters, the amount of testing and development rigor naturally scale up.
This is why rigid stances like that from "Uncle Bob" are shortsighted: they have no awareness of the actual economics of things.
NikxDa•2mo ago
I wish we'd be more open about our flaws and knowledge gaps in general. I think we'd all benefit.
cortesoft•2mo ago
I used to also fear appearing incompetent if I admitted to not knowing too many things, so I would avoid showing my knowledge gaps whenever possible.
However, this colleague was the exact opposite. He would gleefully tell people he had no idea how to do certain things, would be a ready listener when the person he was talking to explained how it worked, and would heap praise on the person for their knowledge and teaching skills. He would always defer to other people as experts when he didn’t know, and would make sure our bosses and coworkers knew who had helped him and how much they knew about the topic.
What I saw and experienced was that this did NOT, in any way shape or form, make people think less of him. It did the exact opposite. First, it made people REALLY happy to help him with stuff; he made you feel so smart and capable when you explained things and helped him, everyone jumped at the opportunity to show him things. He learned so much because he made everyone excited to teach him, and made his coworkers feel smart and appreciated for their knowledge.
And then, when he did speak with confidence on a subject, everyone knew he wasn’t bullshitting, because we knew he never faked it. Since he gave everyone else the chances to be the expert and deferred all the time, you didn’t get the one-upmanship you often get when tech people are trying to prove their bonafides. People were happy to listen to him because he first listened to them.
I have really tried to emulate him in my career. I go out of my way to praise and thank people who help me, always try to immediately admit where my skills and experience lack, and don’t try to prove myself in subjects I don’t really know that well. It has worked well for me in my career, as well.