
Mechanical engineering interviews seem to do the same as software: "Engineers always ask about beam bending, stress strain curves, and conservation of work. Know the theory and any technical questions are easy."

Basically an equivalent of simple algorithmic questions. Not "real" because it's impossible to share enough context of a real problem in an interview to make it practical. Short, testing principles, but most importantly basic thinking and problem-solving faculties.



> Mechanical engineering interviews seem to do the same as software:

I've been an engineer in the past (physics undergrad -> aerospace job -> grad school/ml). I have never seen or heard of an engineer being expected to solve math equations on a whiteboard during an interview. It is expected that you already know these things. Honestly, it is expected that you have a reference for these equations and that you'll have memorized the ones you use most.

As an example, when I was finishing my undergrad I got a call from Raytheon about a job. I was supposedly the only undergrad being interviewed, but the first round was a phone interview. I got asked an optics question and I said to the interviewer "you mind if I grab my book? I have it right next to me and I bookmarked that equation thinking you might ask, and I'm blanking on the coefficients" (explaining the form of the equation while opening the book). He was super cool with that and at the end of the interview said I was on his short list.

I see no problem with this method. We live in the age of the internet. You shouldn't be memorizing a bunch of stuff purposefully; you should be memorizing by accident (aka through routine usage). You should know the abstractions and core concepts, but the details are not worth knowing off the top of your head (obviously you should have known them at some point) unless you are actively using them.


I once failed a coding interview (screen, not whiteboard) where the main criticism was that one routine detail I took a while to get right could have been googled faster. In hindsight I still doubt that, given all the semi-related tangents you end up following from Google, but that was their expectation: look up the right piece of example code and recognize the missing bit (or get it right immediately).

For a proper engineering question (as in not software), I'd expect the right answer to be naming the reference book where you'd look up the formula. The last thing you want is someone overconfident in their from-memory version of physics.


> The last thing you want is someone overconfident in their from-memory version of physics.

Honestly, having been in both worlds, there's not too much of a difference. Physics is harder, but with coding you have more things to juggle in your brain. So I really do not think it is an issue to offload infrequent "equations"[0] to a book/google/whatever.

[0] And equations could be taken out of quotes considering that math and code are the same thing.


I had a senior engineer chastise me once for NOT using the lookup tables.

"How do you know your memory was infallible at that moment? Would you stake other people's lives on that memory?"

So what you did on that phone interview was probably the biggest green-flag they'd seen all day.


We live in the age of ChatGPT. It might actually be time to assess how candidates use it during interviews. What prompts they write, how they refine their prompts, how they use the answers, whether they take them at face value, etc.


Sure, and we live in the age of calculators. Just because we have calculators doesn't mean we ban them on math tests. It means you adapt and test for the more important stuff. You remove the rote, mundane aspects and focus on the abstract and the nuance.

You still can't get GPT to understand and give nuanced responses without significant prompt engineering (usually requiring someone who understands said nuance of the specific problem). So... I'm not concerned. If you're getting GPT to pass your interviews, then you should change your interviews. LLMs are useful tools, but compression machines aren't nuanced thinking machines, even if they can masquerade as such in fun examples.

Essentially ask yourself this: why in my example was the engineer not only okay with me grabbing my book but happy? Understand that and you'll understand my point.

Edit: I see you're the founder of Archipelago AI. I happen to be an ML researcher. We both know that there's lots of snakeoil in this field. Are you telling me you can't frequently sniff that out? Rabbit? Devin? Humane Pin? I have receipts for calling several of these out at launch. (I haven't looked beyond your profile; should I look at your company?)


I'm actually not talking about interviewees (ab)using ChatGPT to pass interviews and interviewers trying to catch that or work around that. I'm talking about testing candidates' use of ChatGPT as one of the skills they have.

> I see you're the founder of Archipelago AI.

I don't know where you got that from, but I'm not.


> I'm talking about testing candidates' use of ChatGPT as one of the skills they have.

The same way? I guess I'm confused why this is any different. You ask them. If you have expertise in this yourself, then you do that. If you don't, you ask them to demonstrate it and try to listen while they explain. I'll give you a strong hint here: people that know their shit talk about nuance. They might be shy and not give it to you right away, or might think they're "showing off" or something else, but it is not too hard to get experts to excitedly talk about things they're experts in. Look for that.

> I don't know where you got that from, but I'm not.

Oops, somehow I clicked esafak's profile instead. My bad.


You might as well ask how they use book libraries and web search.


I'm a chemist by education, so all my college friends are chemists.

Being asked a theoretical chemistry question at a job interview would be...odd.

You can be asked about your proficiency with some lab equipment, your experience with various procedures and what not.

But the very thought of being asked theoretical questions is beyond ridiculous.


Why, don't they get imposters? You sure run into people who can't code in coding interviews.


Because to be a chemist you need to graduate in chemistry.

What would be the point of asking theoretical questions?

There's just no way in hell people can remember even 10% of what they studied in college. Book knowledge isn't really the goal; the point is teaching you how to learn and master the topics.


Because to actually have those types of conversations you have to have legitimate experience. To be a bit flippant, here's a relevant xkcd[0]. To be less so, in-groups are pretty good at detecting others in their group. I mean, can you talk to another <insert anything where you have domain expertise, including hobbies> person and not figure out who's also a domain expert? It's because people in the in-group understand the nuance of the subject matter.

[0] https://xkcd.com/451/


Doesn’t that comic more closely hew to the idea that some fields are complete bullshit?


That's one interpretation. But that interpretation is still dependent upon intra-group recognition. The joke relies on the intra-group recognition *being* the act of bullshitting.


Hmm… I have a twist on this. Chemistry is a really big field.

My degree is in computational/theoretical chemistry. Even before I went into software engineering, it would have been really odd for me to be asked questions about wet chemistry.

Admittedly it would have been odd to be quizzed on theory out of the blue as well.

What would not have been odd was to give a job talk and be asked questions based on that talk; in my case this would have included aspects of theory relevant to the simulation work and analysis I presented.


And software and computing isn’t a big field? Ever heard of EE?



