Google is overrated. It's been obvious for over a decade, ever since they built an interview process that is heavily biased towards those who recently took an algorithms class. You had talking heads yammering about how hard it was to get into Google, which, to be honest, made the company even less appealing. Is this a software engineering company or some kind of hyped-up nightclub?
Yeah, maybe it's a hard interview if you've been out of college for a while, haven't written classical algorithms professionally for years, and don't want to spend weeks or months of free time bashing your head on leetcode. What isn't pressure tested by the Google interview process? Just about every other skill that is needed to be a good software engineer.
Obviously, Google has some good engineers, but my goodness was the hype around the company off-putting.
I wouldn't characterize Leetcode interviewing as being biased towards those who recently took an algorithms class; it biases in favor of those (of any level of experience) who are willing to spend a few months practicing Leetcode.
I thought that Google/etc had at least dialed this back a bit, or maybe just dialed back the "how many gas stations are in the US" type questions, after realizing this wasn't the best predictor of good performance.
> or maybe just dialed back the "how many gas stations are in the US" type questions, after realizing this wasn't the best predictor of good performance.
These problems (known as Fermi problems) have been out of vogue for over a decade now. Google is one of the companies that pioneered algorithm-centric leetcode problems as a replacement for Fermi problems.
Leetcode problems are not hugely useful beyond the data given by solving a fizzbuzz. Rather, they're just another excuse so interviewers can convince themselves a person is smart, call it signal, and justify a hire.
The last time Google gave me a job offer, one of my interviews was literally a souped-up fizzbuzz: straightforward imperative code with no trick, no complicated algorithms, and no fancy data structures. I suppose that may be the reason I got an offer: I didn't need fancy algorithms that I hadn't prepared for.
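For the curious, by "souped-up fizzbuzz" I mean something on this level (my own Python sketch, not the actual question I was asked):

```python
def fizzbuzz(n):
    """Classic fizzbuzz: plain imperative code, no tricks,
    no fancy data structures. Returns the sequence for 1..n."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

The real question had a little more plumbing around it, but the core was at this difficulty level.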
Ultimately it’s impossible to know if someone will be a good hire from an interview. Being a good engineer requires a bunch of traits that simply can’t be tested. The leetcode interview, as I see it, acknowledges this weakness and instead chooses to filter out low-effort candidates, as anyone persistent can practice leetcoding and interviewing (in theory).
This is pretty consistent with my experience. I had one hardcore algorithms question that I bombed, but the other ones hinged on things like "when should you use a map vs a list" that should be second nature to anyone who has been writing code long enough.
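For anyone who hasn't seen that kind of question: the expected answer boils down to lookup cost, which you can demonstrate in a few lines (illustrative Python of my own, not the interview's actual phrasing):

```python
import timeit

# Membership tests scan a list element by element (O(n)), but a
# dict/set uses hashing (O(1) on average) -- that tradeoff is the
# "second nature" answer the question is fishing for.
items = list(range(100_000))
as_list = items
as_set = set(items)

# Look up the worst-case element (the last one) in each structure.
list_time = timeit.timeit(lambda: 99_999 in as_list, number=100)
set_time = timeit.timeit(lambda: 99_999 in as_set, number=100)
```

On any machine, `set_time` comes out orders of magnitude smaller than `list_time` for inputs this size.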
I think there's a lot more that could be tested than what current implementation-centric interviews measure. At my company, for example, I feel like we've gotten a lot of use out of our debugging interviews.
Last year I went through a loop, and I mentioned a couple of the questions I got to friends who are SWEs at Google; they thought the questions were too hard for an interview, especially after checking the proposed solutions in the internal problem bank.
So it's definitely still happening.
No, I didn't get the job: the questions were too hard for me to even get to a brute-force solution.
It’s just sad, since of course those questions have nothing to do with the skills you actually need on the job: “the login authentication is failing once every 10,000 times, go fix it!” Oooh, shall I use a B-tree!?
> It's been obvious for over a decade after they built an interview process that is heavily biased towards those who recently took an algorithms class.
What would you do instead? Threads about tech interviews are always the same: we all complain about the process with no real alternative when you want to hire at that scale.
In particular about:
> Just about every other skill that is needed to be a good software engineer.
How would you assess them better than current common processes like leetcode interviews?
If one could do better hiring at the same cost or less, the whole industry would be interested, even if they'd have to delegate some of their hiring to an external company. The fact that leetcode interviews are still so common indicates that maybe something is missing from alternatives, be it scalability, fairness or even whether they actually provide more signal than leetcode interviews.
I've seen plenty of HN threads where good alternative processes were described. I'm not going to be exhaustive here: writing a program over a few hours that is reflective of the kind of work that the company actually does, coming up with and debating the architecture of a system, mock code reviews, etc.
You might say these don't scale as well as standardized testing of university classical algorithms knowledge. The general response to that would be that if you're optimizing for scale, then you're not optimizing for quality, so stop the hype around Google only hiring "the best."
It's telling that as Google has begun to regularly lay off engineers, they have also begun to deemphasize classical algorithms in their interview process.
At big tech companies, "writing a program over a few hours that is reflective of the kind of work that the company actually does" is not really a good, representative performance measure. Oftentimes, you will be solving problems across multiple domains, outside of your area of expertise. You have to take on the role of PM, data scientist, SWE, researcher, etc.
Internal restructuring of the company may even take you from working on backend web apis to distributed databases. It is expected that you re-acclimatize and learn quickly. Giving you a take-home test to write some CRUD app isn't necessarily sampling those same attributes.
p.s.: also not a fan of classical algorithm style interviews. Clearly, they also have a bias.
> Internal restructuring of the company may even take you from working on backend web apis to distributed databases.
This isn't true. I've worked in multiple FAANG companies' infrastructure orgs, including distributed KV stores, as a SWE. Anybody joining those kinds of teams is either a specialist, very junior, or someone with prior experience in the domain.
Recruiters that say stuff like “we want strong generalists… blah blah” are not the people who are sourcing candidates for these deep systems roles. It looks a lot like the ML roles.
> Internal restructuring of the company may even take you from working on backend web apis to distributed databases
This is a stretch. It's unlikely that an interview process would test such disparate skills directly, and a competent company will avoid moving an engineer into a role that requires a drastically different skillset without separate verification that they can handle the new role.
To your point, while I haven't interviewed with Google, don't they actually do a number of the things you're talking about (e.g. ask about system design or ask you to look at some existing code)? I'm guessing every interviewer isn't asking you some variation of "find a loop in a linked list".
I've been interviewing people for software engineering jobs for more than 10 years, and I haven't used leetcode-style problems for about 8 of those years.
Leetcode interviewing is lazy interviewing. It's for when you don't want to put effort into your interviewing process to check whether the candidates have the skills you need for the position you are filling.
For example, my interview process nowadays is one web API interaction challenge: https://challenge.bax-dev.com (it's in Spanish, so you may need Google Translate; use it with curl).
In that untimed challenge, the candidate submits their code and email. Submissions arrive in my inbox, and I check their code quality with my morning coffee.
Then, in the ONLY interview, 90 minutes with me, we talk 30 minutes about the experience on their resume, 30 minutes about our company and the position, and then we code the "server" side to the web client they created.
We do it pair-programming style. The questions I manage to answer are: would I pair program with this person? (Like, are they cool to work with?) And do they know what they're doing?
I've had very good results with this method.
And, from the code exercises, I am sure anyone can guess what skills I am looking for.
There was a great circlejerk on Reddit during one of the outages where people asked whether Google engineers had inverted enough binary trees to bring their systems back online. Basically saying that the skills they hire for are for algorithm competitions, not for building enterprise software.
And I say this as a CompSci PhD who did his good share of data structure and algorithms during my time in academia.
How many people have you interviewed with this technique? After how many applications will you need to change your exercise, such that it becomes expensive to put that much structure into it? How do you deal with applicants knowing each other and giving each other the challenge and tips on how to solve it / what your reactions and probes were?
The pair programming part is not that different to how algorithm interviews are conducted at least in some places. You can definitely be silent for 1h and expect the best solution, or you can collaborate with the applicant and see whether their communication is good, their thinking is structured, etc.
> In that untimed challenge the candidate submits their code and email.
I expect you can't do that in companies which are highly sought after because it's too easy to cheat, and the stakes are high enough that people will do it. Yes the pair programming part lowers the chances, but people already cheat on live coding through various means, it'd be even easier if parts of the challenge are untimed.
Overall I think the difference here is one of scale: if you hire a few people for specific positions, then fine. If you hire thousands (and you can rightly question whether that makes sense in the first place, but that's the assumption for many of these companies), you can't have a specific interview for each and every position, and you can't reuse the same interview question after having used it 500 times, such that you can't invest that much into it in the first place.
Has your company pioneered some great open source software (Kubernetes, Go, React, PyTorch, Cassandra)? No.
How do you even know that your method is correct or scalable? The answer is you don't.
It's mind-boggling that people who hire for software engineers who write CRUD apps used by 20 people, think they know how to hire for firms that serve 4 Billion+ users.
1. I would guess that many of the commenters in this subthread work for Silicon Valley big tech, including some apparent (ex-)Googlers who are openly skeptical of their company’s interview process.
2. By reputation, most big Silicon Valley companies, including those that operate “high scale” apps don’t put as much stock into leetcode-style interviews as Google.
3. Silicon Valley, obviously, does not have the monopoly on high scale applications.
No offense, but most (ex-) Googlers have zero business / customer value sense. So, they have no weight in their opinions about how to build a $1T Business
So, with your interview process, assuming two people could complete the (very simple) task of doing a simple backend web service, how do you determine who is worth 100k and who is worth 300k?
How can you tell who is going to come up with the innovative solution to a challenging problem and who is just merely competent?
Lol not at all. I am actually part of a group of CTOs in my area and I know of a couple of people that have "copied" the method one way or another. For all I care I am happy for people to copy it. That's the type of process I would like to have. So who knows, maybe later I'll be applying for your company, and nothing better than following a process that I enjoy (or at least is not as asinine as the leetcode/whiteboard algorithm crap)
In Denmark we rarely perform tech interviews the way you Americans do. Part of this is because of how available education is, so virtually everyone who’s applying for a tech job comes with some sort of academic education, meaning that they’ve proven their worth to get it. I’ve worked a side gig as an external examiner for a decade now, I’ve got good confidence in anyone I haven’t failed.
Another part is that tech interviews are often sort of useless. One of the most expensive mistakes you can make as a manager is to hire the wrong person. Even in Denmark, where it's relatively easy to let a new hire go within the first three months, you've still invested an enormous amount of resources into the process, as well as the impact a wrong hire will have on team morale. So what we look for first and foremost is cultural and personal fit within the team you're supposed to be working in. We're far more likely to do some personality test on you than a technical interview, not because we believe those are a science or accurate in any way, but because they are great talking points to get to know you in a high-stress situation. We do this because technical skills can also be taught, and regardless of your background, we're going to have to "adjust" you into the way we work.
We may ask you some technical questions, but they’ll typically be on a more theoretical level than practical because we want to know how you think about tech. Again to see if your “ideological” fit is good rather than to test your technical prowess.
In my anecdotal experience this is a far better process than technical interviews. But it is made possible by education, previous work experience and the fact that if you really turn out to be terrible then it's "free" to let you go. It also requires direct managers and often team members to handle the process with HR on the sidelines, rather than HR handling the entire hiring process… but I can't imagine working for someone I've never met.
> Part of this is because of how available education is, so virtually everyone who’s applying for a tech job comes with some sort of academic education, meaning that they’ve proven their worth to get it.
I interviewed hundreds of people in my career, 95%+ of whom had university degrees, and in my experience, this is a much weaker signal than you make it. The only consistent takeaway I have is that people with CS degrees from elite schools are unlikely to be completely terrible, but not much beyond that: the people with these elite degrees didn't even necessarily pass coding/algorithm interviews, and people with less elite degrees would pass or fail with no additional signal from the degree. I've seen many, many people with CS degrees fail at things that I wouldn't expect someone who passed an intro programming class to fail, things like keeping track of a running maximum and the index of the element at which the maximum occurs.
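For reference, the task I mean is as trivial as this (a minimal Python sketch of my own, not a question from any particular company's bank):

```python
def running_max_with_index(xs):
    """Track the running maximum and the index where it first occurs.
    Returns (max_value, index_of_max); intro-programming-class level."""
    if not xs:
        raise ValueError("empty input")
    best_val, best_idx = xs[0], 0
    for i, x in enumerate(xs[1:], start=1):
        if x > best_val:
            best_val, best_idx = x, i
    return best_val, best_idx
```

That's the whole exercise: one loop, two variables, and still a surprisingly reliable filter.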
I’m not the kind of person to say that an elite degree is end all be all. But the kind of skill a person exhibits to get into a good college and complete a degree is not necessarily well exercised within the sprint that is a tech interview. Obtaining a degree, especially at a good school, is much more akin to running a marathon. To the extent that you need hires who can be marathon runners for your team, I’d upweight an elite degree accordingly.
That means that from the start, you rule out people who don't have this academic education. Then there's the question of the relevance of this education: either you only take CS graduates, in which case your applicant pool is quite small if you want to hire thousands of employees (but this might work if you only need to hire a few people). Or you broaden it, and you then have no guarantee that people will actually be able to write or understand code because they didn't have to to get your degree. That's without even getting into grade inflation and exam cheating.
> We do this because technical skills can also be taught, and regardless of your background, we’re going to have to “adjust” you into the way we work.
The thing is, after your technical fit, if you add a leetcode easy question, you can quickly check that the person you're interviewing can at least write a for-loop. Isn't this a useful filter?
Well yes, but as education is freely available, you’ll even get paid to study by the government, almost every applicant is going to have a relevant education. Or in the case of some people who’ve changed careers an education to show that they are capable of getting a degree and then a work history to show how they’ve transitioned into CS.
That being said your point isn’t completely without merit. But think about it from the management perspective, as I said, the wrong hire is the most expensive mistake you’ll make in middle management, so it’s more about risk management. If you get 10 good applicants, and pick the top 4-5 of those, then you’re likely left with 5 people who will each be a good hire. It’s not so much about finding the perfect hire, you’re just looking for a good hire, and part of the process of selecting those 4-5 people will be choosing people with good educations, or very good work histories. That might sort some people that are excellent out of the pile, but it also lowers the risk significantly.
In America, there is a history of using "cultural fit" as an excuse to discriminate against socially marginalized groups (both those that have and haven't escaped economic marginalization in the intervening years). So that's another criterion that Americans will have trouble with: it's difficult to be certain that your lack of "cultural fit" is something fundamental to do with your personality or work style, or if it's some aspect of your identity that is unrelated to your work performance (something that an effective, if biased, employer and team might discover if you were given a chance).
Cultural, personal and ideological (oof, what a word... kind of a red flag) are just proxies for "people _I_ like". Go far enough down that path and that is how you end up discriminating people for how they look/think/express themselves.
I'll honestly take Leetcode every time over this. I can grind algorithms, I can't change the fact that I'm Indian/Hispanic/Whatever.
It can be; again, being Danish makes this a little hard to talk about on the global scale. We're very homogeneous, to the point that I once worked in a team where 33.3% weren't white men, and that was so far above the organisation's average that we were awarded… we had one woman and one Indian guy, so 2 out of 6, and that's how homogeneous our society is.
But I’ve worked in a little bit of both environments. Usually it depends more on the people than the hiring practices in my experience. It’s always a danger that you’ll want to hire someone like yourself.
Agreed. The implication that a lack of algorithms testing during software engineering interviews has much of anything to do with successful innovation, as opposed to the infinitude of other factors that go into commercializing a product, is unpersuasive.
And do these leetcode style questions test innovation at all? I don't think they're asking candidates to invent new algorithms.
My 2c is that culture around work (in general, not specific to tech) and pay in Europe are a much bigger factor than hiring. By no means am I trying to disparage European engineers or those who stay for a myriad of good reasons. But it's also clear that some of the best EU engineers would rather work in the US for much higher pay. Even companies like Google* pay much worse if you work on YouTube in Paris vs the US.
"Yeah, it sucks for everybody and it doesn't work very well, but it scales!"
Not picking on you, this is a very common refrain. The apparent advantages of scalable processes seem to trump the demonstrable disadvantages, especially in hiring. The end result is a system most people acknowledge flat-out sucks, it just flat-out sucks in bulk. We say "hiring the right people is the most important thing we do, so let's mechanize it and spend as little time-per-unit as possible". I find it more sad than funny.
I think absolutes like "it sucks" are rather unhelpful when you need to choose something anyway, so it's not like doing nothing is an option. What would help is a clear alternative and how it does better on certain metrics while satisfying requirements. Yes, scale does seem to be a constraint for big companies using this process, among other requirements making the whole leetcode process appealing. It's not only about costs.
Have you actually ever talked about code and architecture with another peer? If levels are roughly the same, it quickly becomes an effortless exercise where ideas flow quickly and topics are immediately picked up. Let's call it brainstorming.
Why can't we repeat that in an interview? I don't care if some dev can recite method names or detailed algorithms from their head when it takes 3 seconds to look up the exact solution. In other words, there is little real added value from such a skill. You get a much better peek into somebody's head with the above, but it requires significantly more effort on the interviewee's side, heck, maybe even some preparation.
You can, to a certain extent, hack the leetcode process by just doing leetcode. It's much harder to be verbally fluent about designs, concepts and libraries that you've never used. And for the rest, there is the trial period; you can't skip those 3 months of experience and cram them into interviews.
Are you referring to system design interviews, or are you describing something else? System design interviews are commonly used along with leetcode interviews, especially in senior roles.
> How would you assess them better than current common processes like leetcode interviews?
OP said "just about every other skill that is needed to be a good software engineer", for which leetcode is useless.
Communication skills, organizational skills, hell, even general knowledge isn't covered by leetcode-like interviews.
Personally, speaking as someone who doesn't like but has to interview potential hires, I have found that there is zero overlap between people doing well in leetcode interviews, and hires doing well in their first year.
> If one could do better hiring at the same cost or less, the whole industry would be interested
I think this is a naive perspective. If there is any takeaway from the recent wave of layoffs, it would be that company wide decisions may be driven by factors that have no relation with actual financial performance. Google could have kept all these engineers and still make an absurd profit.
The notion here is that some company, at some point in time, started putting their applicants through these kind of puzzles, and it became trendy.
AFAIK communication skills are routinely evaluated in those interviews: if you have the best code but can't explain how it works, you won't score highly on them. You should be able to discuss tradeoffs, how you reason, etc.
They clearly don't help to evaluate organizational skills, but no one said you should only have these interviews. I'm guessing no company does only leetcode-style interviews, at least I don't know of any.
> I have found that there is zero overlap between people doing well in leetcode interviews, and hires doing well in their first year.
You could only evaluate this correlation among people who did pass your leetcode interviews and were thus beyond a certain bar, unless I'm missing something. The question is whether selecting according to this bar is useful, but it doesn't look like you evaluated this. Or did you do leetcode interviews and just hired anyone regardless of how they performed in them? I agree with you that beyond a certain bar, the signal becomes lesser. But being able to tell whether a candidate can write a for loop is a pretty strong (anti-)indicator for many SWE jobs.
> The notion here is that some company, at some point in time, started putting their applicants through these kind of puzzles, and it became trendy.
I agree that there's probably some of that. But companies are made of people, and if so many are still doing that decades now after they started and don't see a competitive advantage in switching, maybe it's the best they've been able to come up with, given their constraints. Again, there's real money to be made for those coming up with an alternative that'd scale and perform better.
> AFAIK communication skills are routinely evaluated in those interviews: if you have the best code but can't explain how it works, you won't score highly on them. You should be able to discuss tradeoffs, how you reason, etc.
I want to specifically address this because communication can make or break teams. The kind of communication that you describe - conveying objective information - is not enough to build a well-functioning team. One of the most important questions, IMHO, is how do your teammates disagree with one another, especially on issues that don’t have a clear right answer? If you don’t probe for the answer to that question, then you’re hiring with a huge unknown.
There’s no way to know if Mrs. leetcode master X is a team player based on these kinds of interviews. She could come in and destroy a team with her personality. Granted a socially adept person can fake a pleasant personality during a few hours of interviewing, but the rare great interview processes will at least force a mini disagreement or few and gauge the candidate’s response.
This raises the question though: how do you evaluate communication skills better than this, to encompass what you miss in technical discussions? It's easy to look like a team player in an interview setting where disagreements are low-stakes and theoretical. People do prepare for these behavioral interviews; there are whole sections in the same old reference book used for leetcode interviews. I honestly doubt people can tell a well-prepared candidate from a genuine team player before actually working with them.
I also want to reiterate the point that I don't know of any company that solely uses leetcode-style interviews, so they do tend to evaluate some kind of communication skills in another one.
I would hope no company relies solely on leetcode questions, but the common understanding was that Google was, at least in the not-so-distant past, among big tech companies, the closest to that style.
The weight that a company allocates to communication skills can also be measured on a spectrum. What I suggested in the comment to which you responded may be considered extreme by engineering interview standards, but the more you can probe an engineer’s communication style during an interview, especially in an adversarial conversation, the higher confidence you can have in their interpersonal compatibility with your team.
It’s one thing to answer questions about behavioral characteristics, which can be prepped. It’s another to induce the behavioral characteristics you’d like to test and see them for yourself. That’s a lot harder to fake.
You are absolutely correct. This discussion is pointless. When you have a 10k org you cannot allow ad-hoc methods because there are unscrupulous actors: people will sell the job, there will be nepotism, and discrimination.
Hiring at scale with some bar of quality needs industrial processes. That's a real constraint of the system.
Any alpha gets ruthlessly optimized away. Referrals? Now sold for split. GitHub Open Source? Now produced for cash or undifferentiated slop.
People who don't run large orgs don't get this. But also many people who run small orgs don't get this: you don't need to run large org machinery for small orgs. Part of the advantage is agility. Part of the advantage is that everyone is still on the same page and that it's obvious when they're not.
So big orgs have no choice. Small orgs have no need.
> Hiring at scale with some bar of quality needs industrial processes.
Sounds kinda logical, but is it really?
It's not like BigCorp only has one guy doing the interviewing for 100's of positions. You're interviewing for a job on a team regardless of company size, and presumably what matters to them is whether you are a good fit for the need they have on their team, not whether you pass some industrial scale screening test.
The heart of the problem is that people hiring for software engineering roles think hiring is all about "what you know". It isn't. To hire a good candidate, hire for "who you are" and "who you can become". You want to hire learners not "experts" (read: someone versed in the methods that are about to be obsolete). Hiring is always more about adaptability and soft skills, no matter what role you're hiring for except in very specific, usually time-bound circumstances. This is why true specialists are either on contract or are consultants. Most of the tech industry does hiring completely wrong because, as usual, most software engineers are hyper literal, narrow minded, and lack the social insight required to be effective on a fundamentally social problem.
Encourage a competitive market environment where there aren't just a handful of polarized "make it" companies that everyone is applying to regardless of fit or career objective because landing a job there is a golden ticket?
I did over 400 interviews working there, and many hundreds more serving on hiring committees. I never gave a leetcode interview, and saw only a modest amount of them in committee.
“It’s all leetcode BS” makes a great offhand complaint, but it was not the actual truth.
I only interviewed once with them, but there wasn't anything I'd consider leetcode.
There was a rather aggressive interviewer who drilled me on an architectural web services question despite interviewing for a systems programming position on Fuchsia and having a resume solely in embedded/systems programming. From my limited experience I'd say the interview process is broken, but not in the way people on here think.
I’ve interviewed there multiple times over the years. It was full of typical leetcode shit. There’s a reason there’s a list of problems on leetcode tagged as Google having used them recently…
I watched this a while ago and thought I’m clearly too dumb to work at Google because I’m not going to come up with convex hull algorithms in an interview room.
Google reached out to me a few times to see if I'd interview with them. On the one or two phone calls I had with their recruiters, they explicitly advised me to practice algorithms (leetcode et al) if I wanted to move forward. Perhaps what you were doing was not the norm.
I personally did a lot of system design, debugging (systems or code), or deep technical dive (explain in more and more detail how something works and why it works that way).
I don’t think Google is necessarily overrated instead I think they have some issues being both an advertising company and something else.
I look at them from an EU enterprise perspective. Back 10-15 years ago, when the big move into the cloud started, Google was ahead of their competition. They had online office, and they still have some excellent services that are sort of unmatched by both Azure and AWS in terms of managed backends like Firebase, but today they make almost no enterprise sales. This is largely because they never managed to transition into a world where they could sell their products to enterprise. One part is their data and privacy policies, which are an obvious issue; the other part is support.

One of the most important things Microsoft sells to enterprise is support, and I don't mean the stuff you and I get as private persons. I mean how their headquarters will call your CTO with updates when something goes down, how you have direct channels to get things changed (like when Teams was turned on by default instead of being something your IT department controlled), or how you can even visit their Azure server centres and look at "your" server if you're a big enough customer. This is why AWS sort of "lost" in the EU: when they first entered the market, they had automated support similar to what Google has now, where you can talk to a useless chatbot and never get anywhere even if you're paying them millions of dollars. Unlike Google, Amazon quickly adjusted, and suddenly they had better support and EU compliance than Microsoft (who still can't guarantee that only EU citizens ever work on the maintenance of the data centres where your data is stored).
The one place Google was a little different was in education. They actually seemed to know how to sell there, but even here their advertising roots are now losing them deals. There is now a focus on how everything in Google education is shared with Google, and while that data might be valuable to Google, it’s losing them all their sales in education, which also means they lose the data…
Unless Google somehow changes course and becomes both an advertising company and a tech company, they are just never going to be relevant outside of advertising again. At least for enterprise; but even as a private customer, you’re probably thinking twice about their products considering how many of them they have shut down.
I think it’s a shame, considering a lot of their products are very good and affordable, but it is what it is.
Having worked there I agree on some points. I thought the interview process was bullshit.
But, people generally work there for two reasons:
Pay and benefits are top-notch. I made twice there what I'm making now, post-Google, and I'm still making 30% more than my peers who work in-office for local companies do (I work remote). Free food and other perks were also amazing.
Exposure to really large systems and scale. A lot of people really get off on building systems that scale as big as some of the Google stuff.
And honestly the internal engineering quality at Google is excellent. But conservative, and bespoke. They build their own everything, which they can do because they have buckets of cash. And what they build is mostly superior, and more consistently engineered. The internal code quality is generally meticulous.
I joined Google 11 years into my career. So since I hadn't taken an algorithms class recently, I just reread the algorithms book I used in college and spent a bunch of time studying all of them. I actually found it a lot of fun, as it was a good refresher for myself.
The problem is it only works for a certain personality and thinking type that can do that under pressure in front of an interviewer. Most of us who have been in the industry for a while solve algorithm problems by working in an editor or REPL, and do so in a solitary way, with time to pause and think.
The Google-style process (that so many people copied) acts as if the ability to do that kind of thinking under time pressure and in front of a generally-elitist interviewer is some kind of marker of the ability to work on the job. It isn't.
Plus, I worked at Google for a decade, and the number of times I had to do actual algorithm/data-structure fundamentals was about zero. 90% of Googlers are wiring existing crap together, and if they stray outside of that they'll get spanked in a code review anyway.
Most "leetcode" questions they ask are just easy level programming puzzles. The fact that many developers seem to hate it and even struggle with it says more about the developers applying than it does about Google.
The fact that there are whole industries built on training these puzzles is a big signal that you're wrong.
Also, it's funny that engineers leaving Google, Amazon, et al. NEED to practice these skills in order to get other roles; it just shows that these skills aren't needed (read: exercised, grown) in their day-to-day jobs.
> The fact that there are whole industries built on training these puzzles is a big signal that you're wrong.
I'm not sure that logic alone is compelling: it's conceivable that this secondary industry addresses a real gap in university curricula, or a need for ongoing training of experienced developers.
But I think your overall point still holds, because there's a consensus that a large number of Leetcode-like puzzles require familiarity with problems and solutions that hardly ever come up in real professional software development, and aren't even good proxies for the abilities that do matter.
>The fact that there are whole industries built on training these puzzles is a big signal that you're wrong.
Wrong about what? That they are easy? If you know binary search, sorting, sets, hash tables, and recursion, then it is easy. If you wanted to make it more aligned with what a developer does day to day, the alternative is doing a small project on your own time, which is more time-consuming for everyone.
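For reference, a typical "easy"-tier puzzle of the kind being described — does any pair in a list sum to a given target? — falls out of a single hash-set pass. (This is an illustrative example of the genre, not an actual question from any company's interviews.)

```python
def has_pair_with_sum(nums, target):
    # Track values seen so far; for each number, check whether its
    # complement (target - n) has already appeared.
    # O(n) time, O(n) space -- one pass, one set.
    seen = set()
    for n in nums:
        if target - n in seen:
            return True
        seen.add(n)
    return False
```

The whole trick is knowing that a set gives O(1) membership checks, which is exactly the "common sense application of everyday data structures" point.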
Leetcode "easy" questions are just common sense application of everyday algorithms and data structures. These might make sense for an automated screening test of "is this candidate lying about knowing the language".
But then there are the ones that are all about specific techniques, such as dynamic programming + memoization, or specific graph algorithms, etc. Any decent programmer can learn to do these harder problems under time pressure (interview) through practice, but that is really what Leetcode grinding prep is about... these are not problems you would likely encounter in most jobs, and in the real world you'd just Google for algorithms or ask a colleague if you needed help.
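To make the memoization point concrete, here is a minimal sketch using the classic climbing-stairs puzzle (chosen as a generic example of the technique, not any specific interview question): without the cache the recursion tree is exponential; with it there are only O(n) subproblems.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def climb_ways(n):
    # Number of distinct ways to climb n stairs taking 1 or 2 steps
    # at a time. lru_cache memoizes each subproblem, so each value
    # of n is computed exactly once.
    if n <= 1:
        return 1
    return climb_ways(n - 1) + climb_ways(n - 2)
```

If you've seen this pattern before it takes two minutes; if you haven't, rediscovering it live in front of an interviewer is a very different exercise, which is exactly the complaint.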
> Most "leetcode" questions they ask are just easy level programming puzzles.
As a senior developer currently looking for work, I have mixed feelings about leetcode-like questions. Here's my take:
Pro: They rule out most/all applicants who lack basic programming competence.
Con: Any timed or live programming test can make some applicants fail because of performance anxiety.
Pro/con: Some of the problems require (a) very high intelligence or (b) familiarity with how people have solved that specific problem in the past.
I think (b) is a major source of complaints, because for most people the only solution is Leetcode grinding, which outside of the interview process isn't a good use of one's professional development time.
Con: At least for timed problems on Hackerrank, you have a dilemma: A simple, straightforward solution takes little time to code, and is easy to explain. But it might also take too long to run on some of the test cases, which then requires you to guess at the source of slowness, and try to fix it.
But in multi-problem Hackerrank tests, you don't necessarily know whether you have enough time for that, because the other problems in the set might or might not require lots of time as well. And you can't revisit an earlier problem once you've submitted a solution.
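The simple-but-slow dilemma described above usually looks something like this (a made-up range-sum example, not from any real test): the obvious per-query loop is easy to write and explain but O(n) per query, which can blow the time limit on large hidden test cases, while a prefix-sum precomputation answers each query in O(1).

```python
from itertools import accumulate

def range_sums(nums, queries):
    # Naive version: sum(nums[l:r+1]) per (l, r) query -- trivial to
    # write, but O(n * q) overall and liable to time out.
    # Instead, precompute prefix sums once (O(n)), then answer each
    # inclusive-range query in O(1).
    prefix = [0] + list(accumulate(nums))
    return [prefix[r + 1] - prefix[l] for l, r in queries]
```

Guessing which hidden test case is the slow one, and whether the fix above is the one the graders expect, is exactly the time gamble being described.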
Your assertion is that Google asks “just easy level programming problems.” Assuming we accept this argument, that probably tells most of us more about Google than it does about some straw-man developer.
But I’ll give Google a bit more credit than you and guess that they ask questions that are at least above the “easy” level.
As I’ve grown more senior in age and rank, I’ve come to realise just how important it is to carry on the selection process after the interview, during probation, and by managing performance throughout your hire’s entire journey. It just feels so incredibly imprecise to rely on a few hours of meetings to gauge whether someone is good or not, and the probationary period is crucial for correcting mistakes. Frankly, we also can’t afford to accidentally say no to someone who would turn out to be a great hire.
I’m seeing this now through the lens of hiring at a 100-person org, but I’ve done hundreds of interviews as a FAANG employee too. The difference there, I would say, is that GOOG try to shoot for the moon and get hiring nailed at the interview stage. This comes at the expense of rejecting nine out of ten candidates, because the candidate firehose is free-flowing and plentiful: something I don’t have at my much more normal startup.
I don’t think I’ve ever seen engineers complain about the rejection rate of Google’s interview process. The more common criticism/meme regards the laziness of running candidates through a unidimensional leetcode gauntlet. Half joking, but why even have engineers run these interviews at all? A proctor could get the same job done at a fraction of the cost. And if a proctor can run the interview process, then how valuable is the signal?
Unless you're willing to spend more than 20~30% of your engineers' precious time on the hiring process (or have GPT-7 or whatever AGI, lol), it will always be BS. And that's not scalable: it could work while your hiring is mostly through employees' personal/professional networks, since the ROI will be better than average, but when you need to hire thousands of new employees it will quickly become a bottleneck.
The only thing you can do is design an acceptably good proxy that can be run efficiently. Unfortunately, we haven't really figured out anything more efficient there than the coding interview + system design interview yet. A good interviewer can still extract a surprising amount of valuable signal within 45 minutes, but not everyone is interested in being a good interviewer.
I think there is some truth to this for sure; it doesn't mean you're hiring practical/driven people, it means you're likely hiring people who are good at rote memorization.
>many of their software products are bloated and slow as fuck
Which? I regularly use Gmail, Maps, Drive, Docs, Sheets, Home, Voice, and Takeout. I don’t think any of those are bloated or slow, but maybe I’m missing something.
The ones I think are top class are not really end-user oriented; it's core stuff like the search engine, their JS engine, etc. (IMO) The bloated ones are their webapps that suck up gigabytes and gigabytes of RAM and CPU resources (client side) for basic stuff like showing you a couple of pages of emails or playing a video file.