I'll respectfully share the following:
Most corded lawnmowers don't use AC motors; they use permanent-magnet DC (PMDC) motors driven from a full-wave bridge rectifier. (This is due to the torque requirements of a lawnmower application.)
The power company will only 'guarantee' (I use that word loosely!) a nominal 120 VAC +/- 5% (114 VAC to 126 VAC). The NEC allows for a 5% variation as well.
NEMA (National Electrical Manufacturers Association) recommends that motors be able to run satisfactorily (not necessarily optimally) at +/- 10% of nameplate voltage.
Given this, you can see that holding a 3% variance would be hard when the power company only commits to staying within 5%. I am also unaware of any substantive negative consequences that occur when you hit 3%. I've been an electrical engineer for 37 years, have worked in a lot of residential, commercial, and industrial locations with many kinds and sizes of motors (up to 5000 HP DC!), and I've seen supply voltages vary by even more than the power company's stated +/- 5% (I've measured close to 10%!). In my personal experience, and based on the NEMA specs, I typically don't get too concerned until voltages are under or over by 10% (obviously application dependent).
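To make those tolerance bands concrete, here's a quick sketch of the arithmetic. The 5% (utility/NEC) and 10% (NEMA) figures are from the discussion above; the rest is just multiplication.

```python
# Tolerance bands around a nominal 120 VAC supply.
# 5% and 10% figures are the utility/NEC and NEMA numbers discussed above.

def voltage_band(nominal, tolerance):
    """Return the (low, high) band for a given +/- fractional tolerance."""
    return (nominal * (1 - tolerance), nominal * (1 + tolerance))

for label, tol in [("Utility/NEC +/-5%", 0.05), ("NEMA motor +/-10%", 0.10)]:
    lo, hi = voltage_band(120, tol)
    print(f"{label}: {lo:.0f} VAC to {hi:.0f} VAC")
```

Running this prints the 114-126 VAC utility band and the wider 108-132 VAC band where NEMA says a motor should still run satisfactorily.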
I agree that adding a second cord introduces a potential failure point, but a little common sense when connecting them minimizes it. I suspect you'll hear the difference in performance if a problem develops at the connection that causes a significant voltage drop.
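If you want to estimate the drop across the two cords yourself, it's just I x R on the round-trip conductor length. The 12 A motor current and the 50 ft cord lengths below are assumed example numbers, not from the original post; the resistance figures are standard copper-wire values.

```python
# Rough voltage-drop estimate for two daisy-chained extension cords.
# ASSUMPTIONS (not from the original post): 12 A mower current,
# two 50 ft cords in series. Resistances are standard copper values
# in ohms per 1000 ft of a single conductor.

OHMS_PER_1000FT = {16: 4.016, 14: 2.525, 12: 1.588}  # by AWG size

def cord_drop(current_a, length_ft, awg):
    """Voltage drop across a cord run: current flows out and back,
    so the resistive path is twice the cord length."""
    resistance = OHMS_PER_1000FT[awg] * (2 * length_ft) / 1000.0
    return current_a * resistance

# Two 50 ft cords in series = 100 ft total run:
for awg in (16, 14, 12):
    drop = cord_drop(12, 100, awg)
    print(f"{awg} AWG: {drop:.1f} V drop ({drop / 120:.1%} of 120 VAC)")
```

Under these assumed numbers, 16 AWG cords drop roughly 8% of the supply, while 12 AWG cords stay near 3%, which is why heavier-gauge cords matter more than the number of cords in the chain.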
BOTTOM LINE stays the same. Of course, always try to minimize voltage loss, but given the specific facts here, IMHO the mower will be just fine and happy with the two cords. :smile: