Whilst I expect that OG was joking, it's worth taking the suggestion seriously for long enough to explain why it wouldn't work.
Firstly, even in theory you can never reach the speed of light, c, assuming that you weren't already travelling at c. You can get arbitrarily close, it is true, and we do have to be careful about the limiting behaviour, but anyone timing you would always measure your speed as less than c.
Secondly, it's misleading to say that time stops at speed c anyhow. The quantity that equals zero, when moving at c, is the invariant interval (ct)^2 - x^2, where t is the time taken to travel between two points a distance x apart. For a given object moving from A to B, every observer measures the same value of (ct)^2 - x^2, even though each of them measures different values of x and t separately. When x/t is less than c (that is, for any physical object), this lets you define a proper time τ via (cτ)^2 = (ct)^2 - x^2, which basically records how much time passes on your own clock as you move around; but when x/t = c, the concept of proper time, rather than equalling zero, no longer makes sense. This is often referred to as "time stopping", but it isn't that: the other aspect of proper time is that it's "time in your own rest frame", and light is never at rest in any frame.
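If you'd rather see that invariance numerically than take it on trust, here's a quick Python sketch (the trip and the boost speeds are arbitrary numbers, purely for illustration):

```python
import math

c = 299_792_458.0  # speed of light in m/s

def boost(t, x, v):
    """Lorentz-boost the event (t, x) into a frame moving at speed v along x."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return gamma * (t - v * x / c**2), gamma * (x - v * t)

def interval(t, x):
    """The frame-independent combination (ct)^2 - x^2."""
    return (c * t) ** 2 - x**2

# An arbitrary sub-light trip: 400 m covered in 43.03 s.
t, x = 43.03, 400.0
for v in (0.0, 0.5 * c, 0.99 * c):
    print(interval(*boost(t, x, v)))  # same number each time, up to rounding
```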
Anyway, that's a long-winded way of saying that no, time doesn't stop when you move at speed c. Still, let's ignore this for a second and consider the practical case of accelerating arbitrarily hard -- as noted, you never reach c, but if you accelerate at a constant rate, you can work out what time is recorded using the fairly simple formulas described on, for example, this page:
https://en.wikipedia.org/wiki/Acceleration_(special_relativity)#Curved_world_lines
Specifically, equations (6a). In this notation, x and t are the length and time as measured by the audience, and τ is the proper time recorded by our superhuman athlete, while α is the "proper acceleration", the value of acceleration on which all observers can agree.
As a matter of pedantry, in this set-up our athlete is "convinced" that they aren't actually moving, so that the proper distance moved is always zero. However, x is still the relevant distance -- it's the length of the track the athlete perceives before starting the race.
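For concreteness, here's a small Python sketch of those constant-proper-acceleration relations, rearranged so that you feed in the track length x and the proper acceleration α (the function names are mine; this is just the standard hyperbolic-motion result for starting from rest, equivalent to the equations on that page):

```python
import math

c = 299_792_458.0  # speed of light in m/s

def audience_time(x, alpha):
    """Coordinate time t to cover a distance x, starting from rest, at
    constant proper acceleration alpha (what the audience's clock reads)."""
    return math.sqrt(2 * x / alpha + (x / c) ** 2)

def athlete_time(x, alpha):
    """Proper time tau for the same trip (what the athlete's own clock reads)."""
    return (c / alpha) * math.acosh(1 + alpha * x / c**2)
```

As a sanity check, when αx is much smaller than c^2 the audience formula reduces to the familiar Newtonian sqrt(2x/α).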
Just to get our bearings, the current 400m world record is 43.03 seconds, which, assuming constant acceleration from rest, corresponds to a proper acceleration of around 0.43 m/s^2 (NB in practice, athletes don't accelerate constantly through the race, but we will ignore this piffling little detail). Even if we suppose that our athlete were capable of accelerating at, say, 10^10 m/s^2 for the entire race (this is actually physical despite, on the surface, implying that the athlete can exceed the speed of light -- ask about this if you care), the athlete would still record a time of 0.000282842 seconds for the lap -- and, importantly, the audience would see a slightly longer time of 0.000282846 seconds. The distinction matters because it's evidently good practice to record the lap time according to somebody watching, to avoid bias (and also because of the physics!).
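If you want to reproduce those figures yourself, plugging the numbers from this paragraph into the relations sketched above gives (a rough check; the outputs should match the quoted values to within rounding):

```python
import math

c = 299_792_458.0  # speed of light in m/s
x = 400.0          # track length in metres

# Proper acceleration implied by covering 400 m from rest in 43.03 s:
print(2 * x / (43.03**2 - (x / c) ** 2))                 # ~0.43 m/s^2

# The hypothetical athlete with alpha = 1e10 m/s^2:
alpha = 1e10
print((c / alpha) * math.acosh(1 + alpha * x / c**2))    # athlete:  ~0.000282842 s
print(math.sqrt(2 * x / alpha + (x / c) ** 2))           # audience: ~0.000282846 s
```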
Finally, if you let the athlete's acceleration grow without bound, the time recorded by the athlete *does* approach zero. But the time recorded by the audience -- or, for that matter, the referee -- can still never drop below 400/c = 1.33*10^(-6) seconds. You can accelerate however hard you like, but you're never going to beat the speed of light.
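Cranking α up in the same relations makes both limits visible at once (the values below are arbitrary): the athlete's clock reading collapses towards zero, while the audience's time only creeps down towards 400/c and stops there.

```python
import math

c = 299_792_458.0
x = 400.0
for alpha in (1e10, 1e15, 1e20):  # ever more extreme proper accelerations
    athlete = (c / alpha) * math.acosh(1 + alpha * x / c**2)
    audience = math.sqrt(2 * x / alpha + (x / c) ** 2)
    print(f"{alpha:.0e} m/s^2: athlete {athlete:.3e} s, audience {audience:.3e} s")

print(x / c)  # the floor the audience's time can never beat: ~1.334e-6 s
```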
tldr, NJ is still correct :)