I could be wrong, but I believe the gist of breakpoints is to pause an application's execution so you can review its previous/current state, then step from one "breakpoint" to the next to identify holes, bugs, and irregularities.
I was never taught to use them, and I believe I fall back on a much more primitive, yet common, approach to debugging: console.logs. Console.logs everywhere.
To me, a "breakpoint" is the equivalent of placing a console.log at the location where the breakpoint would go, to understand the state of a particular object/class/etc. at that moment.
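To make the comparison concrete, here is a minimal sketch of the two approaches side by side; the handleResponse function and its response argument are hypothetical, invented only for illustration:

```javascript
// Hypothetical AJAX success handler, used only for illustration.
function handleResponse(response) {
  // The "console logger" approach: print a snapshot and keep running.
  // You only see the fields you thought to log, after the fact.
  console.log('response state:', response);

  // The breakpoint approach: JavaScript's built-in `debugger` statement
  // pauses execution right here whenever DevTools is open, letting you
  // inspect `response`, the call stack, and every variable in scope,
  // then step through the lines that follow one at a time.
  debugger;

  return response.data;
}
```

(In practice you rarely even edit the source: in browser DevTools you can set the same breakpoint by clicking a line number in the Sources panel.)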
Something tells me, however, that this is far from ideal, and that I am missing what appears to be a very valuable debugging skill. Everyone I know who is a talented developer, professor, or manager seems to lean heavily on breakpoints, yet from googling I can't find any solid explanation of how to use them in a way that is superior to simple console logs.
This question is hoping for an answer that distinguishes the "console logger" mentality from what appears to be a more advanced form of debugging: breakpoints.
P.S. I tagged javascript and ajax because that is what I use, and my question is primarily geared toward those two categories; however, I am confident this applies to all languages and to a general culture of good debugging practice.