Result and Reflection for “My Experiment to Crack the Coding Interview”
So… my last-ditch effort to pass the mock coding interview (written about in the previous blog here) was neither a great success nor a great failure. In short, I was unable to complete the coding challenge in time, but the interviewer suspected I’d have been able to complete it if given more time. However, in life, there aren’t many prizes for 2nd place, and none if you don’t finish the race. I didn’t finish, and that in itself is enough to disqualify me. NOTE: the actual review is included at the bottom of this post as a JPG, along with some resources to check out.
Perhaps the biggest mistake I made was switching gears from one possible solution to another. Though my second approach was more efficient, and I may not have finished with the first one even had I stuck with it, the interviewer pointed out that it’s incredibly rare for anyone to “get it right” the first time around… and that it’s better to get some solution, ANY SOLUTION, and then refactor it into something better.
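To make that advice concrete, here’s a hypothetical illustration (two-sum is a stand-in problem I picked for this post, not the challenge from my interview): first get ANY working solution on the screen, then refactor it.

```javascript
// Hypothetical stand-in problem (NOT the one from my interview): two-sum.
// Step 1: get ANY working solution — brute force, O(n^2).
function twoSumBruteForce(nums, target) {
  for (let i = 0; i < nums.length; i++) {
    for (let j = i + 1; j < nums.length; j++) {
      if (nums[i] + nums[j] === target) return [i, j];
    }
  }
  return [];
}

// Step 2: refactor into something better — one pass with a Map, O(n).
function twoSumRefactored(nums, target) {
  const seen = new Map(); // value -> index
  for (let i = 0; i < nums.length; i++) {
    const complement = target - nums[i];
    if (seen.has(complement)) return [seen.get(complement), i];
    seen.set(nums[i], i);
  }
  return [];
}
```

The brute-force version is what I should have put up first; the refactor is the kind of improvement that can come afterward, with the interviewer watching it evolve.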
As for the actual “experiment” (and using it as a reference point for refactoring my approach to learning this stuff and doing better next time)? That, I would like to think, was most definitely a success. Doubling down to familiarize myself with the most commonly used built-ins that appeared in the solution examples on AlgoExpert was a HUGE help. I’ve also heard from other students that they were advised to familiarize themselves with built-ins in order to save time. Moving forward, I will not only triple down on the built-ins initially covered in the previous blog post, but also expand my catalog to include the Udemy courses I have, as well as the more difficult questions on AlgoExpert. Looking further ahead, I’d like to build something of a parser that could automate the collection and counting of syntactical features, though I suspect my current skill level precludes me from building anything that robust.
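For what it’s worth, here’s a rough, assumption-laden sketch of what I imagine that “parser” looking like. It only greps a hypothetical folder of solution files for a hard-coded list of built-in names rather than doing any real parsing:

```javascript
// Minimal sketch (an assumption, not a real tool): count how often common
// JS built-ins show up across a folder of solution files.
const fs = require('fs');
const path = require('path');

// Hypothetical list — the built-ins I'd actually track would come from the
// solution examples I'm studying.
const BUILT_INS = ['map', 'filter', 'reduce', 'sort', 'slice', 'splice', 'push', 'pop', 'includes', 'indexOf'];

function countBuiltIns(dir) {
  const counts = Object.fromEntries(BUILT_INS.map((name) => [name, 0]));
  for (const file of fs.readdirSync(dir)) {
    if (!file.endsWith('.js')) continue;
    const source = fs.readFileSync(path.join(dir, file), 'utf8');
    for (const name of BUILT_INS) {
      // Crude match on ".name(" — a real parser would walk an AST instead.
      const matches = source.match(new RegExp(`\\.${name}\\(`, 'g'));
      counts[name] += matches ? matches.length : 0;
    }
  }
  return counts;
}

console.log(countBuiltIns('./solutions')); // hypothetical folder of solution files
```

A real version would lean on an AST rather than regexes, which is exactly the part I suspect is beyond me right now.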
Beyond familiarizing myself with syntactical features, I focused HEAVILY on explaining code, essentially planning out features in plain English. I would say this in and of itself was a huge success, but in the larger context of completing the interview… well, it was incomplete, a fail. It was an element, but not the compound. Reviewing “the review” (see below) and reflecting on my own performance, I agree that I failed to translate ideas into code, or at the very least failed to do it quickly. During my final prep for the mock technical, the day of and the day before, I had the impression that the number of challenges I’d be able to code from beginning to end wouldn’t be that many, AND that if I encountered a challenge that was totally unfamiliar, having dived deep into any one specific challenge would’ve been a waste of time. The one thing I thought was within my power to get decent at (and fast) was articulating code. I tried to work through as many of the problems on AlgoExpert as I could… but only to the extent of explaining a step-by-step approach in plain English. Upon completing a step-by-step explanation (without worrying too much about the code), I would then flip to the solution and attempt to read through the code, to see if my explanation essentially translated into the same thing.
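Here’s a toy example of what one round of that practice looked like, using a made-up easy problem (not an actual AlgoExpert question): the plain-English plan written first, then the code I’d compare against the published solution.

```javascript
// Made-up easy problem (not an actual AlgoExpert question): move every zero
// in an array to the end while keeping the other numbers in order.
//
// My plain-English plan, written before looking at any code:
//   1. Keep a pointer for "where the next non-zero number should go".
//   2. Walk the array; each time I see a non-zero number, put it at that
//      pointer and move the pointer forward.
//   3. Fill everything after the pointer with zeroes.
//
// The code I'd then check my explanation against:
function moveZeroesToEnd(nums) {
  let insertAt = 0;
  for (const num of nums) {
    if (num !== 0) {
      nums[insertAt] = num;
      insertAt++;
    }
  }
  while (insertAt < nums.length) {
    nums[insertAt] = 0;
    insertAt++;
  }
  return nums;
}

moveZeroesToEnd([0, 1, 0, 3, 12]); // [1, 3, 12, 0, 0]
```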
Reflecting on that approach, it seems I was able to develop the “muscle of explanation” on short notice. Going forward, I suspect I might refine my verbal problem solving by attempting to explain HARDS, VERY HARDS, and other such difficult problems in plain English, without much thought to the code… then compare the code solutions to my “expressed solutions,” so that I can refactor out unnecessary verbiage and more concisely articulate an efficient approach to a problem. The thinking here is that getting practice explaining complex ideas will make it easier to explain less complex ideas more quickly and accurately… that becoming proficient at articulating complex situations will allow me to more efficiently keep track of the bits and pieces in a smaller problem.
As an aside: The Goal is not to become good at any ONE problem, but to develop myself in such a way that any problem is within reach.
To bridge the gap from English to code… I think I need to find a more standardized way of talking about code. Though I’m an advocate for analogy, I think my analogy from the previous blog failed me. Not completely: it gave a decent foundation for identifying what needed to be done. However, when translating my own expressions back into code, I was at a loss. And ultimately, as per the review, I got lost in my own intentions.
Right now, I don’t think I have a solution… but I do think I’m now better able to articulate the problem (my problem): given that I can generally identify what the code is “supposed to do,” how can I express it in a way that makes it not only easier to refactor (in real time), but easier to code?
To unpack that question: an expert coder, a master of design patterns, will see a coding problem and express it in English differently than someone just starting out. But even more than that: when they express things in regular English, their definitions equate to code. To illustrate by analogy, it’s how some foreign words have no counterpart in English. Greek, for example, has four different words for different types of love. “Agape” translates loosely into “charitable/selfless love” in English. I can use “agape” in English, but the definition for it exists in Greek. Code-wise, it’d be like using some phrase in English while its definition exists purely in code.
To compare a beginner to an expert, I suspect having “code definitions” as substitutes for things expressed in English might make all the difference.
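For a concrete (and hypothetical) illustration of what I mean by a “code definition”: phrases like “frequency map” or “two-pointer sweep” are English on the surface, but to an experienced coder their definitions are essentially fixed chunks of code, something like:

```javascript
// Hypothetical examples of English phrases whose real definitions live in code.
// When I say "build a frequency map of the array", an experienced coder hears
// (more or less) exactly this:
function frequencyMap(items) {
  const counts = new Map();
  for (const item of items) {
    counts.set(item, (counts.get(item) || 0) + 1);
  }
  return counts;
}

// Likewise, "two-pointer sweep from both ends of a sorted array" has a
// near-fixed code shape:
function twoPointerSweep(sortedNums, target) {
  let left = 0;
  let right = sortedNums.length - 1;
  while (left < right) {
    const sum = sortedNums[left] + sortedNums[right];
    if (sum === target) return [left, right];
    if (sum < target) left++;
    else right--;
  }
  return [];
}
```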
TO SUMMARIZE: the mock technical interview wasn’t a disaster, but it could’ve been better. Being able to express coding concepts in regular English but not in code seems like winning a battle while losing the war. Having non-squishy definitions in code seems like it could close that gap. Having found that immersing myself in JS built-ins was very helpful for expressing ideas, I suspect that immersion in more complex code snippets might be the next level, and coding design patterns one more level beyond that. I look forward to testing, proving, or disproving that hypothesis by the time of my first real interview.
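As a small taste of why that built-in immersion paid off (a hypothetical before/after, not code from the actual interview), compare what a few built-ins buy you over hand-rolled loops:

```javascript
// A few of the built-ins that save the most typing under time pressure
// (hypothetical examples, not code from the actual interview):
const nums = [5, 3, 8, 1];

// Copy-and-sort in one expression instead of writing a sort by hand.
const sortedAsc = [...nums].sort((a, b) => a - b); // [1, 3, 5, 8]

// Sum with reduce instead of a for-loop and an accumulator variable.
const total = nums.reduce((sum, n) => sum + n, 0); // 17

// De-duplicate with Set instead of nested loops.
const unique = [...new Set([1, 1, 2, 3, 3])]; // [1, 2, 3]

console.log(sortedAsc, total, unique);
```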
——