This is sort of surprising information, from my view.
The information about their business model and data practices which people seem to be so surprised by was/is common knowledge among people who work with these technology stacks. Going to work for Facebook seemed like it would have meant supporting and abetting those practices; they were very clear about what they were doing, and the roles were in no way ambiguous.
The cynic in me wonders if these employees are just responding to how others are starting to view their positions, more than ethical or moral quandaries.
If you read "Surely You're Joking, Mr. Feynman!", the author clearly states that while making the bomb, they had a lot of fun. Big budgets. Only smart people. Working on the coolest projects. He even plays with the censorship and security practices.
They never really thought too hard about the ethical part of it: they needed to end the war, and that was it. They only really said "oh, we fucked up" once the bomb exploded.
And we are talking about brilliant minds with very positive personalities.
While skilled, I doubt that more than a small fraction of FB's workforce is close to Feynman's IQ. And they have a very arrogant culture. Somehow I doubt FB is going to bleed talent anytime soon.
Feynman's own country, though, was at war; when he was recruited into the team that would eventually work on the Manhattan Project, the Pearl Harbour attack had already taken place and irrevocably wounded and scarred the American public consciousness. Many other members of the Manhattan Project were from countries or communities that had suffered tremendously at the hands of the Axis powers. War in Europe had been raging for more than two years. Fun or no fun, you can find motivation for working on weapons in a lot of things: the next Pearl Harbour attack could be against your home town, the next ones to be sent to labour camps back home could be your only surviving relatives, the next casualties of war could be your high school friends.
Barring a couple of teams that work on interesting projects, Facebook has little other than money and reputation to throw at people. The former isn't that hard to find in our field, though, and the latter is kinda fickle. Facebook may well keep its talent pool constant, but largely by having people for whom the latter (or, more generally, what they work on and what impact it has) is important replaced by people for whom it is not.
Fortunately (for most of us, at least), the kind of talent that money, disregard for ethics and public doublespeak attracts is pretty self-destructive in the long run.
You can replace Facebook with Google, Apple, MS, or Amazon and you have the same picture, though.
And they got the best projects and the most money.
Yet when I told my relatives, 5 years ago, that I had refused interviews from Google and Facebook, they questioned my sanity.
I don't pretend to have the moral high ground here, as I've done enough bad things in my life, but this kind of weighing of work opportunities doesn't happen enough, IMO.
> And they got the best projects and the most money.
I think what they don't have is their CEO testifying in front of Congress after building a platform whose hunger for money made it an exceptional weapon against society. Smart, young people start avoiding companies only after public perception turns against them. Facebook is closer to that than Amazon or even Google.
That's not exactly my recollection from the book. I believe they thought carefully about how the bomb was needed to end the war in Europe as quickly as possible.
What they failed to do was reevaluate the project after the Allies won the war in Europe. It was only after Hiroshima that they said, "oh, we fucked up" (to paraphrase).
Exactly. There is no war/enemy here. Only ideals, whose ethics are now openly questioned.
I think the public discussion probably hits FB most in hiring - I mean, if I were doing an interview there I'd at least ask the question and see how they responded (maybe negotiation point on what I'd be working on or increased comp).
Contrast Feynman's attitude to that of Wolfgang Pauli, who refused to work on the Manhattan Project, saying "it's not the business of science to engineer mass destruction."
There have been plenty of other highly intelligent people who acted ethically, often at great danger to themselves, rejecting many financial and social incentives to act unethically.
This. We rarely realize that we idolize the wrong historical actors, which contributes to our inaccurate evaluation of their deeds. In this case, pointing out Feynman or any other "brilliant minds" behind horrific events and technologies fails to acknowledge that they had detractors during their time who pointed out these things to them. And they ignored them.
I usually try to keep that in mind now. In the back of my mind I'm saying "they knew, someone told them" and lo and behold, if I go looking for said detractors I can usually find them in spades.
I'm not defending or excusing anyone. Just pointing out that it's unlikely these employees were unaware.
The bomb was initially built against the Nazis, and with the understanding that they would be building something similar.
Japan came later.
Also the group was sequestered to a remote lab (Los Alamos?) and working under strong secrecy. You either try to keep a good atmosphere or you go crazy.
If you are capable of building the most powerful weapon on earth, you are capable of guessing it will later be used on something other than your current designated target.
But they didn't think about that. He doesn't mention them realizing that they were working on something that could literally end humanity.
And the thing is, we didn't even use the bomb to win against the Nazis.
Whether the bomb was a good decision is a hard debate that I still think about sometimes.
But there was not that much debate reported from the people building it.
It brought to mind words from the Bhagavad Gita: "Now I am become Death, the destroyer of worlds."
I'm sure those scientists thought about it and discussed it among themselves, but they were in a super top secret lab in the middle of nowhere. Not much opportunity to leave an historical record of what they thought (letters highly censored, etc).
> If you are capable of building the most powerful weapon on earth, you are capable of guessing it will later be used on something other than your current designated target.
Yes, however the point is moot if your adversary hits you with it before you are finished.
You are creating a false dichotomy between pacifism and the use of a doomsday weapon. There are other options. And we actually won against the Nazis without using the bomb.
Now, given the Cold War that followed, I can't say for sure whether the bomb was a good or a bad thing.
But saying "we had no choice" is a really dangerous way of putting things.
Yep -- Los Alamos. You should read "Surely You're Joking," by the way. It's worth it just for the story about finding a hole in the fence, then walking in the guard gate and out the hole again and again until someone noticed.
It's one of those rare books where you laugh while reading it. The whole reason he ends up in Brazil is just so relatable to me, precisely because it's absurd and so realistic at the same time.
Thank you for wasting my time ;-). That story was in "Los Alamos from Below," which may not have been part of "Surely You're Joking." Feynman was a fascinating guy and an excellent writer, even if he was a bit too young to be a serious player in the Manhattan Project. By the way, if you're interested in physics, these are about as good as it gets: http://www.feynmanlectures.caltech.edu/
Unless I'm mistaken, I tested out at 140 a long time ago.
There is no friggin way I'm remotely as intelligent as Feynman was so there goes any faith in IQ tests being meaningful. Most of the people who have worked for me are demonstrably smarter than me and none of them are close to Feynman. Smart people but not at his level IMO.
Doesn't it depend a lot on the age at which you take the test, though? Like, I was very smart for my age as a child and I tested very high. Later in life I'm probably not much higher than my peers.
It seems like a goal of IQ is to assign a single, unchanging score to someone's cognitive ability, but it is a severely flawed concept. The test should not be called "intelligence quotient" but something like "cognitive battery" instead.
You don't have an IQ. It's a test that you take that can provide some useful information, but it isn't a fixed score, and "intelligence" (a word that is too ambiguous to use without careful definition) can't be reduced to a single number.
There's a difference, as it were, between, roughly speaking, stored knowledge and being able to figure things out on the fly, but factor analysis says there's only one factor for being good at IQ tests, oddly enough. And every test of doing mental tasks we know how to create correlates strongly with every other test. There's a good explanation of how the tests work here:
"Intelligence" is poorly defined in most conversations. The nature of the word and concepts around it lead to problems, even for people who can score high on IQ tests.
It will take me a long time to write out my thoughts completely, but I plan to do it at some point.
Mostly it seems to be quick and adaptable thinking, if I understand the test results right. The colloquial usage is all over the place, as usual, but the tests strongly correlate with things like being able to attain advanced education and ability to be successful. Some of it appears to be inherent, whether it's clock speed/RAM equivalent for our brains or what, because it survives twins raised in completely different homes (including large economic differences). That said, there are certainly environmental effects, too, especially in the negative direction (e.g. lead exposure).
Originally, IQ was calculated by dividing mental age by chronological age; now it's calculated by comparing with others in the same age group. So a random child outscoring a famous scientist doesn't seem that significant.
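The shift from ratio scoring to age-group scoring can be sketched as follows (a minimal illustration, not a real test's scoring procedure; the 15-point standard deviation is the common Wechsler convention, and the function names are just for this example):

```python
def ratio_iq(mental_age, chronological_age):
    # Original Stanford-Binet "ratio IQ":
    # mental age divided by chronological age, times 100
    return 100.0 * mental_age / chronological_age

def deviation_iq(raw_score, peer_mean, peer_sd):
    # Modern "deviation IQ": z-score relative to one's own age
    # group, rescaled to mean 100 and standard deviation 15
    return 100.0 + 15.0 * (raw_score - peer_mean) / peer_sd

# A 10-year-old performing like a typical 13-year-old:
print(ratio_iq(13, 10))            # 130.0
# An adult scoring one SD above their age group's mean:
print(deviation_iq(115, 100, 15))  # 115.0
```

Under the ratio scheme a precocious child can post a huge number simply because the age divisor is small, which is exactly why a child outscoring an adult scientist means little.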
I think I tested around 30 (apparently still insecure enough back then to care, these days I care what my dogs think more than I care what people think :)
> And we are talking about brilliant minds with very positive personalities.
Do you think "very positive personalities" were a plus in this case? Obviously you don't want homicidal sociopaths on the team, but someone whose first thought was "What's the worst possible thing that could happen here?" might have come in handy.
> They never really thought too hard about the ethic part of it
Feynman was only speaking for himself there. It's been 25 years since I read that book but I thought he described some of the efforts by Einstein, et al. to prevent the bomb from being used.
It’s also just one tweet. It’s an article based on a single sentence in the NYTimes piece. Maybe people are transferring, maybe this is a reason or an excuse, but the support in this article is pretty flimsy.
Having worked in Facebook games in 2012, I think some today forget how much worse for privacy Facebook used to be than it is today.
Also good points - plus, it's not very compassionate to let a knee-jerk negative opinion color my view of other peoples' actions without having heard their opinions.
> The cynic in me wonders if these employees are just responding to how others are starting to view their positions, more than ethical or moral quandaries.
I think you hit the nail on the head. I doubt it has to do with moral qualms as employees. More to do with your friends and family giving you crap for working at a company that's getting bad press.
The best way to summarize FB employees, from @numair from 8 months ago:
> Having known so many people involved with Facebook for so long, I have come up with a phrase to describe the cultural phenomenon I’ve witnessed among them – ladder kicking. Basically, people who get a leg up from others, and then do everything in their power to ensure nobody else manages to get there. No, it’s not “human nature” or “how it works.” Silicon Valley and the tech industry at large weren’t built by these sorts of people, and we need to be more active in preventing this mind-virus from spreading.
The issue is that people who in the past would have gone to Wall Street are now going to Silicon Valley solely for the money, and it has poisoned the culture. Tech companies shouldn't have pushed the coolness factor so much and instead let people continue to think tech was for nerds.
This is an exceptionally poor way to "summarize" the employees of any large company, including Facebook. Your personal ethics aside, whenever you're about to describe an entire company's staff using phrases like, "these sorts of people" and "mind-virus", you should stop and reconsider what you think you know about the world.
You are making an uninformed judgement based on a generalization of over 20,000 people.
I disagree. My basis is the conversations I had when I interviewed there. They did not extend an offer, so read what you will into my own biases. :)
But a repeated theme of the conversations was me trying to get inside their heads. I said: "Your job is to figure out when I'm going to be making a decision, purchase or whatever else, and make sure that your advertisers' message is in front of my eyes just as it's occurring to me to make a choice."
With creepy consistency, the replies were rejections of this idea. "No, we're connecting the world!".
My conclusion was an echo of the Upton Sinclair quote:
"It is difficult to get a man to understand something, when his salary depends upon his not understanding it!"
I hope that the quitting phenomenon is real, and that it represents the creme de la creme, with many other options, leaving because they can no longer avoid understanding. If this is happening, then FB's decline will just accelerate.
I'm not one to normally defend Facebook, but I disagree. Advertising was not prioritized for a huge chunk of Facebook's history, so I'd argue it's not part of their core ethos (it's a necessary evil, if you will). While a lot of their more recent product decisions have been made around revenue generation, I don't think that's what keeps their leadership up at night. Keeping a monopoly on user attention (euphemism is "connecting the world") is their biggest challenge, especially in the ever-evolving social industry.
> Advertising was not prioritized for a huge chunk of Facebook's history, so I'd argue it's not part of their core ethos (it's a necessary evil, if you will).
Perhaps it was not prioritized because a certain amount of scale, in terms of active members, is required to be very successful (to the extent of getting to "rule the world" kind of level)? Many companies start with fewer user hostile items until they reach a massive scale so as to protect their turf from competitors who may use that against them as well as to keep people coming back to the platform often and for long durations.
You're right about what keeps the leadership up at night: figuring out how to keep users glued to Facebook. That, however, is also core to their business model, in that without user attention, advertisers wouldn't pay to advertise on Facebook.
Given that FB is now a public company, it needs to prove to shareholders, beyond anything else, that it's growing its revenues and profits. Turn FB into a company that puts its users first instead of its profits, and its stock price would probably be 1/3 of what it is today.
This is just you trying to get an answer you knew you wouldn't get.
"Connecting the world!" is indeed the job of almost all employees at Facebook. You can have a large majority of the company focus on this one goal, and also know that the byproduct of that goal allows you to expand another part of a bigger goal: to make more money on ads.
A company in a capitalist economy must make money, of course. No one is arguing that and no one is saying that Facebook doesn't want - indeed must - make money.
It's like asking an Uber driver what his job is: "to transport people!" Of course, that much is obvious. If you ask someone in the "growth" department (or whatever) what their job is, the description will necessarily be much different.
> The cynic in me wonders if these employees are just responding to how others are starting to view their positions, more than ethical or moral quandaries.
Doesn't need to be cynical. "I never thought of it that way" is maybe one of the most important phrases you can say.
These people didn't see their jobs as unethical until someone really put a spotlight on what they do. It forced them to look at their roles in all of this and say "is this right?". To quote David Mitchell: "Are we the baddies?".
>These people didn't see their jobs as unethical until someone really put a spotlight on what they do.
That reflects a certain quality in the individual that I generally think of as bad, but I suppose it may have some place in the greater scheme of corporate America. However, I generally think that we have a moral imperative to constantly ask ourselves, "Is this right? ... Are we the baddies?"
I've understood that Facebook is essentially evil for a decade. People who work there who are jumping ship now either look like they aren't paying attention but are moralizing the company's action or are abandoning a sinking ship, neither are really positive traits in a potential employee.
I mean, that's good for you, and I tend to agree. But it's incredibly naive to think that any significant percentage of the working population uses those criteria to decide where and on what they should work. We tend to think of tech people as being incredibly bright, but at the end of the day they are human, and FB is a very cool place to work. If you don't take a minute to evaluate the ethics of the work you're doing, which might not on the surface be all that questionable, it's easy to fool yourself into thinking you're doing nothing wrong. It's a much easier realization that working for defense contractors making missile guidance systems is bad than it is to think, "hey, all this data I'm collecting for my social network features could be put to some very bad uses."
I've quit a job for ethical reasons before, in a similar vein to this issue with Facebook. I don't begrudge those who didn't. Not every issue is black and white.
We saw the same thing after Snowden: people were leaving the NSA en masse after the average person became aware of the situation.
Data collection is what enabled the Nazis to find Jews and homosexuals, and the next mass persecution could be more specific than race. Anti-vaxxers, pizzagaters, people who don't watch enough TV.
As the world becomes more aware of the dangers of algorithms and AI, Facebook engineers are rethinking their career path.
It's clear to me that Zuckerberg was entirely too willing to abuse his power from the start, he doesn't seem to have good intentions at all.
Facebook is a huge organization and although to the outside world it's perceived as one, internally it's basically hundreds of separated teams and companies. So to your point, yes probably some are aware of all what's going on around the company, but there's probably an equal amount that are not. I work for a Fortune 50 company and I get much of the information about the company from the news, despite having a relatively senior level role.
> "The cynic in me wonders if these employees are just responding to how others are starting to view their positions, "
Or just giving politically correct reasons that conveniently fit the current broader narrative.
Might this be the last straw for some that were already close to leaving? Sure. But it's not the only reason, just the one that's simple to explain and/or easy to spot.
That said, I have to wonder how often unintended consequences are discussed at FB, or at any of the other (tech) powerhouses. I mean, targeting ads based on personal info hardly sounds like a big moral / ethical problem, until someone finds a loophole and attempts to exploit it (in what should have been foreseeable ways?).
The cynic in you is obviously right about this one.
The best case scenario here is many people simply didn't ever consider the actual ethics of the business that was paying them until everyone else started talking about it.
Not thinking about it until someone makes you doesn't make you a terrible person of course, but I won't be throwing any admiration your way.
> The cynic in me wonders if these employees are just responding to how others are starting to view their positions
Employees rarely flip until the organization stops making them money, then they flip en masse. As long as Facebook's stock remains stagnant, the insider leaks will likely continue.
Yeah, but it wasn’t in the cultural zeitgeist to care about these things. Now that it’s popular, many people will want to virtue signal and leave the company so others see how much principle they have.
Or perhaps, over time, they have changed their minds. The principle of charity, it seems to me, requires that we take them at their word in the absence of evidence otherwise.
> Facebook engineers are quitting or trying to transfer to Instagram or WhatsApp
I find trying to transfer into another product hilarious: it seems to me that they just don't want to have the stigma of working for "Facebook, the product", without really solving the issues working for "Facebook, the company".
Silicon Valley is turning on Facebook. Employees are realizing this and do not want a tarnished resume. If you actually believe this is for ethical reasons then you're not equipped for the world.
related: does having uber on your resume tarnish it?
At $JOB-1, my boss (only good boss in the whole company, TBH) got fired soon after I left. As a knee-jerk reaction by upper mgmt. (poorly run company.) His replacement knows 1/10 of what he knows, and hails from Uber.
depends who's evaluating it and what their values are. If their values are aligned with Uber's values, then no. If their values are opposed to Uber's values, then yes. I'm squarely of the latter. Uber would be a major negative to me. But everyone is different. Some people might view it positively.
Why would working at Uber be a detractor on a resume? It's a major company with assuredly hefty tech; regardless of the rampant HR issues, shouldn't the hire be based on merit, not association?
If you work for someone that does a bad thing, you are helping them do the bad thing. You are actively making the bad thing happen, and by doing so, you are also doing a bad thing.
I think Uber is a bad thing. If you work for Uber, you have helped create a bad thing. At that point, I do not trust your judgement, and I do not want to work with people whose judgement I do not trust.
Following your logic, anyone who simply took an Uber to the interview should be off the list, because they're providing revenue and therefore helping Uber do more of the Bad Thing... Uber clearly still has an engineering staff, and you can't tell me they're all Bad People doing Bad Things; that's unreasonable.
Should we bar military veterans from these development jobs because the Military Industrial Complex has a toxic culture of killing millions around the world for profit?
If someone chooses to work there then clearly their cultural norms are questionable. Once it became obvious there are two choices - accept the culture or leave.
Please don't be flippant. You know what we mean here: the perception that those who work at Facebook are lacking in ethics or moral fiber. This isn't about politics.
It's not a flippant remark; it's argument to the extreme [1] (i.e., painting those who question FB as only Trump-haters, when many people questioned FB long before CA/Trump entered the mainstream discussion).
It doesn't help FB that Trump is associated with them and that he's in hot water already.
The reason why I called it out as flippant was because I felt that they might have had a valid point, that Facebook was being unethical with regards to influencing elections, but they did it in a really poor way which ended up having the effect you mentioned in your response.
You do realize that I likely responded to them because I live in California as well, right? This isn't a political issue in the traditional sense of "on no they got Trump elected"; the issue would still exist if it was Clinton that ended up winning the election due to Facebook. The problem is that Facebook just should not have control over the politics, more so because the control they have was shown to be gained without their user's trust and without regard to their privacy.
Yeah, but what drives that perception? It doesn't seem to be rooted in any coherent theory of ethics. After all, users happily agreed to their data being used; they weren't forced to sign up to social networks en masse, and they weren't forced to install tons of crappy apps or to grant them permission to their friends' data. They weren't denied any information or lied to.
So the normal rules of ethics that would lead to conclusions about immoral behaviour don't apply here. Privacy isn't something that should be forced on someone. It's their life, their decisions, their information. They can do what they want with it. In reality virtually no users care about privacy and will happily upload their lives or even broadcast it to the whole world given the slightest chance.
As such, it's hard to conclude what exactly Facebook employees are supposed to have done wrong here.
That leaves "OMG YOU GOT TRUMP ELECTED", which isn't going to convince the roughly 50% of Americans who voted for him. After all, they won't believe that Trump voters were mindlessly reprogrammed into voting for Trump against their will by social media. And a lot of Clinton voters won't believe that either.
I expect a whole lot of virtue signalling from the same Silicon Valley types who foam at the mouth if someone says there aren't many women coders because on average they choose to be teachers instead. And not much else.
> After all, users happily agreed to their data being used, they weren't forced to sign up to social networks en-masse, they weren't forced to install tons of crappy apps or agree to grant them permission to their friends data.
The modern understanding of consent is that it can be withdrawn at any time. Sometimes even retrospectively
In this case, yes. The issue here is the fact that Facebook had any effect on politics through what it was doing, not that Trump ended up getting elected.
Rest assured, it's still a developer's market, and the FB controversy doesn't erase the technological accomplishments of individuals. Those accomplishments are easily untied from the politics of the situation.
You may lose a small number of jobs that would otherwise come your way, but you'd still likely have dozens of offers to choose from, from companies that don't view you as "tainted".
Software engineering doesn't have the same ethical consequences and motivations as other fields (medicine, law, etc.). It doesn't make much sense to punish the engineers when you invent ethical standards after-the-fact.
> FB controversy doesn't erase technological accomplishments of individuals
It doesn't, but it doesn't look great if those accomplishments end up having a negative impact on society. Do you really want to have "I wrote a novel algorithm to track people" on your resume?
> Software engineering doesn't have the same ethical consequences and motivations as other fields (medicine, law, etc.). It doesn't make much sense to punish the engineers when you invent ethical standards after-the-fact.
I'll have to disagree with you there. Software engineers should have to deal with the ethical consequences of the work they've done. And these ethical standards are hardly "after-the-fact": the idea of a right to privacy has been around for a long time.
>Do you really want to have "I wrote a novel algorithm to track people" on your resume?
The great thing about writing resumes is that you can leave things off.
>Software engineers should have to deal with the ethical consequences of the work they've done.
Give us legal protections and resources to stand up to and report unethical behavior by companies, then. You're arguing correctly on moral grounds, but you don't get good industry ethics without good legal precedents.
Otherwise the best chance you've got is to get media attention and now you're fighting the company's PR.
When you can point to "Defendant can no longer work in software development because of an ethical failure" cases, you will have a much stronger argument to get engineers to follow ethical guidelines (we're assuming those have been written down on something other than a Github-feelgood repo for that case).
This is exactly right. In fact, engineers can't blow the whistle on a company that's violating privacy law without harming our careers.
I think engineers who carry water for unethical companies probably get paid more. Your next employer would much rather hire you if you kept your head down during the PR nightmare than if you made noise and created a problem for management.
> The great thing about writing resumes is that you can leave things off.
Sure, but then you're diminishing the value of what you did. If someone asks you what you did at Facebook, it would be odd to say "I'd rather not say", as it would be to have five years of blank on your resume.
> Give us legal protections and resources to stand up and to and report unethical behavior by companies then.
I do agree that this is an issue that should be addressed.
>Sure, but then you're diminishing the value of what you did. If someone asks you what you did at Facebook, it would be odd to say "I'd rather not say", as it would be to have five years of blank on your resume.
I guess if you only worked on creepy and unsavory things, you'd be in a tough spot. Assuming you can't weasel-word your way out of the creepiness and make it more palatable.
If someone ultimately thinks that, you're back to "move on to the next company". And the consequences are paid then and there. But again, you can lean on the market's health to make those consequences really small.
> it doesn't look great if those accomplishments end up having a negative impact on society
You're assuming all employers (a) believe in and (b) care about Cambridge Analytica. That likely isn't the case.
It requires quite a leap to begin with to believe that CA could actually "influence elections". But this is just the latest iteration of a very, very biased political viewpoint emanating from Dems-in-Denial. Back in Obama's time it was "elections can be bought with campaign spending", then it became "elections can be controlled with Twitter spam", and now it's "elections can be controlled through some vaguely specified process involving years-old data from Facebook profiles". The mechanism keeps changing, but the underlying belief that voters who don't vote left-wing are just marionettes on strings remains.
If you don't believe that, and believe / know that voters make up their own minds consciously, based on the information they have, then this entire Facebook story would make no impact on your hiring decisions because the idea that Facebook is some sort of sordid unethical power broker seems ridiculous to begin with.
You're assuming that I'm even referring to Cambridge Analytica. I'm talking about the general sentiment towards Facebook as an unethical company, of which the "leak" of data to Cambridge Analytica is a small part. There are other bad-faith things that Facebook has done, especially with regards to their users' privacy, that certainly have a negative impact on society.
The vast majority of people do not consider advertising unethical. Nor do they consider social networks to be unethical.
I always struggle to figure out what classical rules of ethics Facebook (or Google) are supposed to have violated, but nobody can ever give me one. Instead there's a lot of begging the question.
So where do you draw the line? How many of these would you put on your resume?
"I wrote software that took advantage of addictive behavior in new ways"
"My new flagging mechanism, tailored to prevent users from leaving the platform, led to someone committing suicide after being unable to block an abuser"
"My relevancy algorithm, based on what had the most engagement, led to a genocide"
I've made these purposefully extreme, but these are all based on reality. Making something "novel" isn't always a good thing.
> "I wrote software that took advantage of addictive behavior in new ways"
> "My new flagging mechanism, tailored to prevent users from leaving the platform, led to someone committing suicide after being unable to block an abuser"
> "My relevancy algorithm, based on what had the most engagement, led to a genocide"
I wouldn't put any of those on my resume, because they're ridiculous and don't represent the way people report their achievements. However, I would put the technology I developed on my resume that enabled any one of those things.
Personally, I'd be proud to have had significant involvement in the development of Facebook's data analysis and ingestion platform. I would not be proud of the Cambridge Analytica mistake in and of itself, but that's because I don't conflate the underlying technology with a particularly grievous mistake that happened because of the technology's existence.
Moreover, I contest your use of examples which are cartoonishly evil. I disagree with the fundamental premise that the technology is innately bad, or that your examples even remotely constitute an accurate portrayal of the internal culture at Facebook. I believe there are clear failures in Facebook's data analysis processes, but it's intellectually lazy and dishonest to misrepresent those failures using the obviously biased language in your examples. It's also unfair to mischaracterize the majority of people who work there as lacking in ethics.
> Software engineering doesn't have the same ethical consequences and motivations as other fields (medicine, law, etc.). It doesn't make much sense to punish the engineers when you invent ethical standards after-the-fact.
You are correct that it doesn't. But our field also carries a lot less prestige, with most of it tied to the employer as opposed to our profession and skillsets.
This board often laments this fact.
I think adopting an Eichmann defense here is a big mistake and hurts our profession as a whole. We'd be indirectly surrendering a huge amount of autonomy. How can we have ownership over the positive outcomes of what we create without also being willing to accept some of the responsibility for negative outcomes? When we wash our hands of these externalities, to an outsider, it just looks like developers are nothing more than cogs in a machine, following orders. It just moves the glory up the food chain.
Because there are plenty of sociopaths in positions of power at companies all over the world that would probably gladly hire skilled coders that either have no moral compass or are fine with their moral compass spinning wildly while they cash their paychecks?
I think that's too strong of a word. It doesn't take a "sociopath" for a company to get to a bad place; merely the efforts of many engineers each doing something slightly immoral. Hopefully there are fewer of these now that they see what can happen to a company where such engineers work.
Seems pretty unlikely that it would meaningfully stop engineers from finding work after Facebook. A lot of Arthur Andersen individuals were hired pretty easily after its dissolution following the Enron scandal.
> Working for FB itself during this controversy won't tarnish your resume anyway
Depends on what you’re doing. I wouldn’t want a former Facebook employee near anything remotely customer data related, or anything with political or PR sensitivity.
The people working on the political and PR areas at Facebook are badass. They come from places like the Obama administration. They now have a front-row seat to how governments around the world are negotiating with Facebook.
After the dust settles, these people will start charging $1k/hour to consult on "political or PR" sensitive business areas.
> these people will start charging $1k/hour to consult on "political or PR" sensitive business areas
I frequently see people try this when their employers start burning in political and criminal crosshairs. It’s a tough and treacherous strategy. It is likely just an easier thing to tell oneself than “I need this job for the pay cheque, the next 10 years be damned.”
Unless you leverage the crisis into deep relationships with regulators and politicos who remain relevant afterwards, you walk away weaker. Low-touch Facebook PR folks saw their best days when they worked in the last administration.
Not sure, personally. The scandal is that the company is exposing user data, not that employees are siphoning data and selling it on the side.
I can understand it if you're in a situation where the bottom can kind of dictate the company direction in a "hey, with this data, we could make the company a killing, our shareholders happy, etc." sort of way. Perhaps avoid the upper level employees, but engineering talent is probably safe.
> Be civil. Don't say things you wouldn't say face-to-face. Don't be snarky. Comments should get more civil and substantive, not less, as a topic gets more divisive.
That's a bit much, don't you think? This reminds me of that baseball coach who would refuse to recruit anybody from Colorado simply because marijuana is legal there.
Bias: I've refused to work on any adtech revenue business, and I've passed on opportunities at Uber multiple times due to ... moral and ethical concerns.
I wouldn't hold experience at Uber or FB against any applicant.
Yep. I once worked at a place that leveraged PII for profit. I won't work at a place like that again.
But I wouldn't hold working at Facebook against anyone, unless they knew and worked directly on monetizing PII and didn't see problems with it. That's for other reasons -- there are certain things you can't engineer without having a really good idea how, where, and why it'll be used. I don't want an engineer that is blindly solving technical problems.
Agreed. And besides that, I decided I didn't want to work for FB long ago, even if they somehow came calling... But it wasn't about ethics. Now it'd be about ethics, but I wasn't even aware they were doing anything unethical until now.
I'd expect people working there to know the deal a bit better, but even so they might not be directly involved with anything unethical, and even if they knew about unethical things, they might not have had an opportunity to switch jobs.
You'll pass on a potentially excellent hire simply because of the name on their resume? You won't even expend a modicum of effort to interview the candidate or see if they're at all involved in the specific work you despise? Are over 20,000 employees guilty in your view?
If this isn't irrational behavior, what would it take for you to be acting irrationally?
If they continue to work for Facebook after CA it shows that they are willing to be paid an income despite the clear and studied effects of their work (directly or indirectly). There are lots of businesses that don't have the luxury of hiring for culture on issues as granular as this, but is it ideal to be able to? I think so.
What are those "clear and studied effects"? More importantly, just how "indirectly" do one of those 20,000 employees need to be tied to the work you dislike before you'd acknowledge that rebuffing them is silly?
Are there other companies you'd indiscriminately never hire from? What about Google? How is their data mining materially different, in your view? What about a subcontractor for Facebook? Would you hire engineers from a startup which advertised on Facebook? If so, why are they not complicit?
> More importantly, just how "indirectly" do one of those 20,000 employees need to be tied to the work you dislike before you'd acknowledge that rebuffing them is silly?
It's not their work that is important, but rather their understanding of Facebook's impact. I think it would be kind to give conscientious quitters and H-1B employees extra points as their situation varies.
> Are there other companies you'd indiscriminately never hire from?
I never advocated for such a draconian policy. I was stating that having this information could help with culture fit in an ideal hiring situation (in either direction, depending on your business). Maybe these people are hired less or more in highly sensitive, privacy-related roles based on the company's prerogatives.
> What about Google? How is their data mining materially different, in your view?
Google's mission is "to organize the world's information and make it universally accessible and useful." Their first value is to "focus on the user". Facebook's mission is to "give people the power to build community and bring the world closer together." Their first value is to "be bold".
Facebook's stated mission is to be in the business of social information. Google's stated mission is to be in the business of information more generally. Materially, this general mission leads Google to lean less on people's personal information to achieve their mission.
> What about a subcontractor for Facebook?
Direct contractors are typically employees in all but name, but if they were working for a company hired by Facebook I would think it only reasonable to take that into consideration as their will may not have been aligned with the business decision.
> Would you hire engineers from a startup which advertised on Facebook? If so, why are they not complicit?
Again, to be clear, I would hire engineers directly from Facebook, so certainly I would hire their partners' employees. It is one of many data points to consider when hiring. They may be complicit in Facebook's misdeeds, but still can have value to a business.
For those that don't have time to read it, the article you linked debunks misconceptions about the CA privacy breach. I think the article goes to show the type of culture that allows for such a privacy breach to occur.
A quote from the article you linked:
> Asked what kind of control Facebook had over the data given to outside developers, he replied: “Zero. Absolutely none. Once the data left Facebook servers there was not any control, and there was no insight into what was going on.” Parakilas said he “always assumed there was something of a black market” for Facebook data that had been passed to external developers.
I doubt having Facebook on your resume would tarnish it. I'm sure Facebook engineers would not have trouble finding jobs at other companies like Apple, Google, etc if they have the right skillsets and technical smarts. I certainly would not pass up on a Facebook employee who checks the right competency boxes regardless of Facebook's image.
WhatsApp and Instagram seem to be much more respected than Facebook proper right now, so it kind of makes sense. I also haven't noticed a decline in usage of these apps among my friends, but posts on Facebook itself have really dried up.
> WhatsApp and Instagram seem to be much more respected than Facebook proper right now
Yes, that's my point. These people, in my head, are trying to minimize the damage done to their reputation, but without actually realizing that moving around inside Facebook doesn't really help. I seriously doubt that how Facebook, the product, is run is significantly different than how Instagram or WhatsApp is run. Anyone who just jumps around isn't trying to really address the root issue.
Agreed. The reason they "jumped ship" was not to address the root issues. It was to address the hit to their vanity from working for a company that is currently in a bad light.
It says more about the individuals than it does about the company.
From what I've heard, they're fairly independent of the mothership and have their own distinct identities, similar to how people at YouTube don't consider themselves to be Googlers as much as YouTubers.
I hear that but I don't believe it for a second. In fact I noticed something really interesting...
Install Facebook and Instagram on iOS. Launch FB and note how long it takes to start, now launch Instagram and do the same. Instagram starts up nearly instantly.
Now, force quit both apps and remove Facebook. Start Instagram and all of a sudden it takes much longer to start. I don't know why this is, but it certainly shows a deep connection between the apps in a way that users are probably not expecting.
Sure, but corporate culture generally always seeps into projects like these. Especially when looking for ways to monetize, since those decisions generally come from upper management at the "parent" company.
I wonder how many new hires pass up Facebook because of ethical concerns. What's scary is that if even internally people are moving departments because of ethics, that basically filters so that the least ethical people end up on the teams with the most ethical concerns.
If there's a real net effect on their hiring, they'll have to increase salaries to lure people in, which will take in even more unethical people (of course, not everyone that works for or wants to work for Facebook is unethical).
I find this issue much more pressing in the defense industry. I (thankfully) have the freedom to choose where I work. I will never willingly work on weapon systems. There are those that do though. Engineers who design things to kill other people. What the actual fuck.
That's great that you've got a set of principles that you live by, and are able to stick to them. However, your principles are not everyone's principles, and what you consider moral and ethical is not universal. Grellas had a great comment on working for the military industrial complex a few days ago, that I think is worth a read:
There are some things considered immoral or unethical that are fairly universal, especially in developed countries. We're against slavery, rape, murder, etc. Sure, there are perpetrators, but don't wave them off and say "Oh, they just thought differently, give them a break!" We punish them, which is all that can be hoped for.
Some things exist in a gray area (for now), so while some people might argue that it's righteous or necessary to do something, there's no problem with an opposition to such an ideal shoving it back.
The alternative is the military can insource these jobs and change their policies so they can hire the people they need to do defense work instead of coopting organizations that do other sensitive work for the public.
You just reduced the linked comment - a nuanced, neutrally presented argument that there are significant practical and philosophical difficulties inherent to refusing to develop technology for military application - to "lots of words", followed by what is effectively a rhetorical sneer.
You don't have to agree with every comment asserting an opposing view, but at least give them a modicum of respect if they put in as much intellectual effort as that one. Why even bother commenting if you're just going to issue a middle-brow dismissal without a substantive critique?
I grew up in Maryland, which probably has the most defense contractors in the country and I decided in college that I didn't want to work in defense for ethical reasons. But there is another side of it, some people genuinely believe that the work they are doing is helping to keep the country safe.
I also used to work in defense, and thankfully not on weapons systems. I would like to point out some mild hilarity in your comment though:
Engineer used to (hundreds of years ago) literally mean someone who worked on designing weapons of war and/or defensive structures. How times have changed, hah.
So the fact that Galkovskiy, the engineer who perfected the Katyusha rocket launcher, decided to work on a weapon that played a major role in defeating Nazi Germany offends you?
That argument is actually on strong footing in principle, because only one counterexample is required to demonstrate that something isn't universally true a priori.
On the other hand, your argument is weak because 1) you attack a strawman (the commenter did not indicate by their choice of example that it is the most recent), and 2) you did not substantiate why developing military technology then would be different to developing military technology now, in such a way that precludes beneficial use.
So let me get this straight: You expect the police and military to use their weapons to defend you. You're willing to pay taxes to pay for those weapons. But you're too ethical to get paid to make them? I just want to be clear where on the spectrum your outrage falls.
The existence of our nuclear arsenal does it without having to invade a country every decade or so. Are you saying we need more than 5K+ nukes to be secure?
> Engineers who design things to kill other people. What the actual fuck.
The popular rationalizations are "it will help save the lives of armed services members" or "prevent the big terrorist attack". There are dozens of reasons people make up to convince themselves and others that it's ok.
Judgement as to whether their reasons hold up to the light of day is left as an exercise to the reader.
which you seem to assume is automatically the exact position you believe :-)
So you have read Aquinas and have a reasoned argument against the concept of Jus ad bellum - please I am sure hn readers would like to see your argument.
There are companies like cigarette makers whose entire business is generally considered unethical, and others like big banks that could be ethical but often aren't. I would expect this to make a big difference in the type of workers they attract.
Could Facebook exist without selling as much personal data to advertisers as possible? It doesn't look like it currently, but it is at least hypothetically possible.
Count me as one. Who can say if I'd have passed their interview process, but I was recruited by them and immediately said "no thanks", and told their recruiter I didn't like that they subjected their users to psychological experiments without their knowledge[1].
There are a lot of reasons to dislike Facebook, but I don't think their (public) research on the effect of social media on their users is one.
You're exposed to psychological experiments on every large website you use, although typically the psychometrics are more limited to things like "did you click the button that gives us money?" instead of questions like "how does the emotional content of the news feed affect our users' well-being?"
Also the fact that they made this research public is, in my eyes, a sign of good faith (in this specific case), and an indication that at least some people in the company take the power they have over their users' health seriously.
I buy it was done in good faith, (and it’s unfortunate that the lesson they took away from it may have been just to keep their mouth shut). The lesson here though was that people at Facebook weren’t aware of the responsibility the data they have puts on them.
It showed a surprising lack of awareness and care for the users of their platform. Regardless of the intention, it was grossly irresponsible to do the experiment in that way. The kind of data that Facebook had and was manipulating is far more personal and completely different from getting people to click a buy button, a distinction lost on them. We're talking about people's relationships with their family and their old friends, and we're talking about cries for help or compassion that were suppressed to see how it would affect users' mood, not just engagement.
I hear you, but can't agree fully. They showed a subset of their users negative posts to see if it negatively impacted their lives. As this could have been expected, they knowingly hurt their own users.
I have lots of other reservations about working for Facebook, that was just the one I specifically cited to the recruiter.
I find a fairly large disconnect between everyone saying people may be uncomfortable working at Facebook and their Glassdoor ratings (probably the highest of any tech company).
I passed, twice. I went through the whole process, was offered an enormous compensation package (7 figures). I went through it because the entire time I was telling myself if they came back with a big enough offer I'd do it. Turns out I was wrong, I couldn't do it even for what definitely qualifies as "gets paid".
This was several years ago, before any specific scandal I can think of. I have constantly questioned whether I made the right move ... it was a lot of money. Now, finally, I have my answer.
That said, of course my story is anecdotal and for the most part, I guarantee you that the great majority of folks would jump for a job at FB. At the time I was going through the process, the work environment was very much kool-aid, with kool-aid posters everywhere, "work hard" posters and such. It went past my sad filter straight to laughable. I was on campus a few months ago and that kind of thing is gone or mostly gone. I hear it's a pretty good working environment these days, actually.
I am one. I didn't even reply to the many targeted requests I got from recruiters for FB positions.
I'd say at the time several years ago it had more to do with "I don't want to have to re-activate my closed FB just to join the company", never mind having to drink the koolaid.
I remember being contacted by the Oculus recruiter, but the association with a white supremacist founder, Palmer Luckey, felt really wrong to me personally. Even after he left it feels tainted.
I left YouTube about a year ago, in part due to cognitive dissonance between my values and the company's objectives. If you're a current employee of a similar company and you're considering your options, I understand the struggle and am happy to talk about it! Everyone's circumstances are different, but for me personally the TL;DR is that I'm much happier working somewhere where I can get 100% behind the mission. Anyway, email's in my profile.
As a parent, thank you. YT is like crack for kids and they take everything on there at face value. The sheer amount of fake or conspiracy videos that show up in the suggestions makes it beyond unusable. I've found myself in the seemingly backwards position of telling my kids to just go watch some normal TV if they want to watch something.
I used to work at facebook and I disagree. Even prior to the recent scandals, there are plenty of employees who have decided to leave facebook due to ethical concerns (myself included). However, most keep this to themselves as Facebook fosters an environment where dissent is not tolerated.
Of course, it's unclear from this article whether negative sentiments have increased substantially this year compared to previous years.
And to clarify, I'm not saying that I am innocent or that I have taken some sort of ethical high-road. I gladly spent many years cashing out my pre-IPO stock grants while turning a blind eye to numerous immoral business practices. But soon after going public, there wasn't much benefit working at Facebook compared to any other large silicon valley tech companies. Without the financial motivation, the ethical concerns made it hard to be excited about remaining.
I am well-connected to Facebook and I have no impression that employees are transferring or worried in any way. The media's exaggerations of employees' reactions are almost as bad as their exaggerations of what Facebook is doing with data.
I have a lot of respect for someone who chooses to not work at a place because that place is doing bad things.
There are so many things that go into that decision: how "important" is it to be employed, how "bad" is the business behavior, how "costly" to a reputation is it to leave or to stay. It is the kind of question that puts a person's character and self-image on trial, and in my experience few people emerge on the other side unchanged.
Faulkner said, "... the young man or woman writing today has forgotten the problems of the human heart in conflict with itself which alone can make good writing because only that is worth writing about, worth the agony and the sweat." This was one of my high school English teacher's favorite quotes, and I didn't really understand what he was talking about at the time. But in the late 90's I met people who were being ripped apart by working at a dot-com company, knowing it was all smoke and mirrors, but staying so that they could vest their stock and sell it before the rest of the world figured it out. Being in a position where your personal wealth will benefit because you are participating in, although not directly responsible for, a fraud that is being perpetuated on the investors. How tainted is that silver?
A couple of years ago I was having lunch with an ex-Googler who was coming to grips with their loss of innocence. They had worked in AdWords and were part of a group that was increasing the average 'spend' of an AdWords customer with targeted messaging that was designed to appeal to their use of the system. This person had recognized that the goal of the program was to essentially "trick" the customer into spending more money when all the statistics said that doing so would not give them a commensurate growth in business or traffic. And as bad as they felt about participating, they continued to do so as the program manager stressed that nobody was "forcing" them to spend more money, just offering them a way to think about where they would "want" to spend more. This person was dealing with the realization that a younger version of themselves would be disgusted with the older version of themselves for "giving in."
It seems that people will talk about honor, character, and integrity in the abstract with one set of views and then find their real character (or lack thereof) when confronted with a test of their values.
I expect everywhere is different. When I worked at Google I brought things up during open microphone time at the Friday meetings (usually hosted by one of Larry, Sergei, or Eric)
Too little, too late. Well, perhaps not, it's never too late, but still.
You built this behemoth, we told you so, you insisted because it was too hard to resist, deal with it.
For the record, the situations in which some people find themselves are complex and not everyone has the freedom a high salary or stocks or other perks grant some of us.
Nevertheless, for those of us who choose to make the world, our world, a nastier, less friendly, more neurotic place, I have zero sympathy. Reap what you sow, in my case it's disdain.
I apologize for the hard tone but I'm not exactly known for accepting "sorry not sorry" kind of apologies.
I was recently testing the water with some Facebook engineers about switching companies because of privacy related concerns. There was some sympathy wrt the company going in a bad direction, but by and large they were staying put because they enjoyed the technical challenges presented to them and pay. I think stories like this are mostly fluff. Most engineers simply don’t care.
We don't know if this is a cluster of concerned people, or a broad trend across the company. Given the vagueness, it's probably the former, because exaggerating the number lends to a more interesting narrative. Not much to see here, just a few people leaving FB.
Leaking corporate data or trade secrets to the general public is not the solution for a couple of reasons.
- Depending on the actual contents of the leak, people might panic; this is basically never useful.
- Depending on the outlet, the contents may get distorted before being published in order to fit some narrative -- potentially causing a panic. See above.
- You or the original leaker may be traced back even through editorialized data, even without the corporation ever seeing the published data. This would land you and/or the original leaker in a lot of legal hot water.
- A side-effect of that is that you'd become unemployable in many cases, and you might find yourself running from the law in other cases.
So, yeah -- quitting facebook probably isn't a bad idea as a user. Refusing work as a contractor or an employee may be difficult, but because absolute morality doesn't exist, it's really up to the person in that position to make the call about quitting.
Facebook is indeed in a crisis, but I get the feeling that the media is exaggerating anything with the slightest negative slant to pour more fuel on the fire.
The US is funny. We once had privacy laws that protected your video rental history. And yet very little has been done about protecting privacy except for COPA.
I have wondered for a long time why Facebook employees were even around. It's not like the Cambridge Analytica scandal was the first time an ethically problematic issue had happened because of the way Facebook was designed to work. The people working on the platform created all these issues in some way or another (due to not understanding the implications or management direction).
My guess is that employees of Facebook mostly stay for the money, with a second reason being for some kind of exposure on working on things on "planet scale". I find it difficult to understand anyone being fine with Facebook's ethics all these years and then changing their minds now. It is quite weird.
Facebook is a soulless company. If you're a Facebook employee and haven't understood that yet, I cannot even sympathize with you (while being frustrated and worried about the implications for billions of humans).
While data security is an issue Facebook must reckon with, it is not the major issue Facebook's existence poses to democracies around the world.
The real issue in my mind is that the platform can be used by bad actors to spread misinformation. It's as if during WWII the Nazis had the ability to publish editorials in leading newspapers throughout the US.
While this is a feature/bug of the internet in general, the vast scale of Facebook and its near ubiquitous use amongst the general voting public in the US and other democratic nations make it the most potent vector through which a bad actor like Russia could spread propaganda and misinformation.
I'm not sure the recent move to label political advertisements as such will do much to safeguard us from misinformation.
How many times have there been privacy settings changes which undermined user intent or resulted in surprising defaults? How many snafus have we witnessed since its inception? I am astounded that employees are just now deciding they've had enough.
Maybe over concerns that their department is in limbo due to the general public's concern, but I can't imagine they have been working on their projects and only now found ethical issues with them.
IDK, they may have had ethical misgivings before but now find they have to rationalize these with their friends/family outside of the tech bubble. I'll cut them some slack until I know otherwise.
Is there any kind of organization for engineers that supports these types of ethical issues? I recently resigned from a job because of similar privacy concerns and I found it really hard to talk to anyone about it. My situation was a very clear violation of California law, but everyone I talked to said the same thing: "if you blow the whistle, you'll never work again."
There are some professors/researchers at Stanford and other CS-heavy schools building curricula around ethical CS/engineering. Perhaps it could be worth reaching out to some of these professors since you're in California. I'm not aware of any sort of organization for this topic, but I imagine you could talk with researchers in a private capacity.
So now, FB employees all of a sudden have 'ethical concerns' over their job and what they were building. Let's be real - Facebook (the product) has always had the goal of collecting as much data from users as possible, and displaying as many relevant ads to users as possible from the data that users provide. The product, throughout all these years, has been tailored and designed to do just that.
Facebook isn't the billion-dollar business it is today because it takes steps to protect your data, ensure transparency, and avoid selling out to advertisers. Transferring to Instagram or Whatsapp isn't going to absolve you of ethical issues if the leadership is the same. It'll only be a matter of time before FB finds a way to monetize all the messages you send via Whatsapp.
Ethics only become important to the workforce when the company is shown to have bad ethics, and thus just having the company on your C.V. makes you look bad.
I saw the same things at school, studying for forensic IT.
Whole classes of people who think they learned to be an ethical person. The thing is, sure, they make ethical decisions at times. But in a pinch they might not consider ethics as much - like when they really need a job. I myself don't consider myself to be the most ethical, but my morals are strong... don't do evil!
Knowing what is and isn't evil is a big part of doing the right thing. You can't expect the whole workforce to know where the line is, or even that there is a line.
This article is arguably fake news. To support their claim they cite a few tweets and one line from a NYTimes article which is barely related. I love Hacker News but lately this place has had a bit of a singular focus... It seems to me that the ban on politics should perhaps be extended to a softban on Facebook news.
The real way to tell whether a substantial number of employees and potential employees feel this way is to look at Facebook compensation versus similar companies. If Facebook has to offer more money for similar work, that means a substantial number of people don't want to work there. If it offers similar money, then the people who don't want to work for Facebook are not a significant number.
The compensation will tell the real story, apart from all the publicity and attempts at spin.
This is probably FB people sitting on a few million bucks who have stayed due to inertia. Now being at FB looks icky to their friends and they have enough $ to say fuck it.
This is surprising given the recent articles on leaking inside of Facebook, and the response from other employees that such behavior lacks "integrity".
As a problem solver, Facebook looks very attractive to me right now. Imagine the shit they're going through at the moment. It's not just damage mitigation - they probably need to revamp their business strategy, which will turn into serious engineering work.
c'mon, wasn't it super obvious that they'd been doing that stuff for years already? The whole Cambridge Analytica thing isn't that surprising imho. It's like starting at Uber now and thinking they're just a "normal" company...
> As it became evident that Facebook's core product might be to blame, engineers working on it reportedly found it increasingly difficult to stand by what it built.
What are they referring to here? That the code base has gotten unwieldy and has become hard to change?
Can we stop feeding these accounts? The submitter is a new account whose submissions are mainly facebook articles. It's not contributing anything useful to HN. Why is this account posting so many FB related articles? Do they have some other motive that HN is falling for?
At this point I feel like HN has turned into a FB news dump since it makes the people who "quit facebook and never looked back" feel good. Where are the quality submissions?
What an incredibly first-world problem to have... Actually, it's worse than that: it's a top-1%-of-the-first-world problem. Most companies, in the broader world, are unethical and do shady shit on a regular basis, but most rank-and-file people aren't lucky enough to have been making the kind of salary Facebookers can, nor do they have in-demand skills that let them hit the ground running if they decide they don't like it for merely ethical, rather than existential, reasons.