I finished a little Air-app to stream complete jobs to Grbl and render the result. This is the first “full scale test”, running some 7000 lines of sample g-code I scrounged from this thread on CncForum. The Air-app feeds g-code to the microcontroller one line at a time, while the microcontroller generates the steps and feeds them back to be simulated in the Air-app. Everything seems to be working perfectly - the only thing missing now is a test with real steppers on a real machine.
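The line-at-a-time feed amounts to a simple send-and-acknowledge loop: the host sends one line, waits for the controller's reply, and only then sends the next. A minimal sketch in C, with a stub standing in for the serial link (the function names and the "ok"/"error" replies are illustrative, not the actual protocol):

```c
#include <string.h>

/* Stub: stands in for sending a line over serial and reading the
   controller's reply. A real controller would parse and buffer the
   motion before acknowledging. */
static const char *device_send_line(const char *line) {
  return (line && line[0]) ? "ok" : "error";
}

/* Stream a whole job one line at a time, stopping on the first line
   the controller rejects. Returns the number of acknowledged lines. */
int stream_job(const char *lines[], int count) {
  int sent = 0;
  for (int i = 0; i < count; i++) {
    const char *reply = device_send_line(lines[i]);
    if (strcmp(reply, "ok") != 0) break; /* controller rejected the line */
    sent++;
  }
  return sent;
}
```

The point of waiting for each acknowledgment is flow control: the host can never outrun the controller, no matter how fast it could push characters down the wire.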
After completing support for helical motion, the project has reached the point where it is theoretically feature complete. It supports all common g-code I have seen in the wild: lines, arcs, and helical motion, all at the designated feed rate.
Grbl now uses a simple buffer to decouple the generation of stepper pulses from the computation of steps. A couple of very precise timers control the generation of stepper pulses, and as long as the step computation keeps up with the steppers on average, the pulses will be generated at a rock-steady pace. Setting up for a new circular movement, especially, is a heavy task for a tiny microcontroller like the atmega168. Thanks to the step buffer it has up to a couple of milliseconds to complete the task without the motor controller missing a step.
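The usual shape for this kind of decoupling is a single-producer/single-consumer ring buffer: the main loop precomputes step events into the buffer while the timer interrupt drains them at a steady rate. A minimal sketch, assuming an 8-entry buffer and illustrative field names (not Grbl's actual data structures):

```c
#include <stdint.h>

#define STEP_BUFFER_SIZE 8  /* small power of two keeps the index math cheap */

typedef struct {
  uint8_t  step_bits; /* which axes step on this event */
  uint16_t delay;     /* timer ticks until the next event */
} step_event_t;

static step_event_t buffer[STEP_BUFFER_SIZE];
static volatile uint8_t head = 0; /* written only by the main loop */
static volatile uint8_t tail = 0; /* written only by the timer ISR */

/* Producer (main loop): returns 0 when the buffer is full so the
   caller can retry; it only needs to keep up on average. */
int buffer_put(step_event_t ev) {
  uint8_t next = (head + 1) % STEP_BUFFER_SIZE;
  if (next == tail) return 0;         /* full */
  buffer[head] = ev;
  head = next;                        /* publish after the write */
  return 1;
}

/* Consumer (would run inside the stepper timer ISR): returns 0 when
   there is nothing to step. */
int buffer_get(step_event_t *ev) {
  if (tail == head) return 0;         /* empty */
  *ev = buffer[tail];
  tail = (tail + 1) % STEP_BUFFER_SIZE;
  return 1;
}
```

Because each index is written by exactly one side, no locking is needed on an AVR: the ISR and the main loop can share the buffer safely as long as `head` and `tail` are single-byte (and therefore atomically updated) values.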
All this is slightly theoretical at this point as I’m only testing the timing with a scope, waiting for my first CNC to arrive from Lumenlab.
The next stage is to verify the design. I have written a test rig in Adobe Air/Flex that emulates the CNC machine and lets me see visually that the generated motions are correct. Now I need to check that all movements are optimally precise and that it is able to complete full, real-world machining sequences.
I will write a suite of tests to verify that it generates the precise, intended movements.
I am going to expand my test rig so it can feed full CAM-generated motion sequences and render the resulting tool-paths in some kind of isometric view.