Replying to ShadowX:

“Some of your responses do not make any sense. The response on the force needed on the dial indicator is absolutely ridiculous. The force required is not the problem with metal tool bits. The amount of force will not deform the tool bit. However, if you mean the internal mechanisms inside the indicator may have backlash, that is a different story. Every indicator has its accuracy levels, but the manufacturer states the accuracy on the spec. “
Which part of my response is “absolutely ridiculous”? Please be specific about what does not make sense.
I am not talking about the effect of the applied force on the metal tool bits, nor about backlash or the precision of the dial indicator.
The spring-loaded probe (plunger) of a dial indicator presses on the surface under test with a force of roughly 1-3 N. By Newton's third law, the surface pushes back on the indicator with an equal and opposite force. That reaction force can displace the indicator itself if it is not mounted rigidly, so you need a stiff mounting structure to prevent it. If these four sentences look “absolutely ridiculous” to you, please let me know if you need more clarification.
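To put rough numbers on that, here is a back-of-the-envelope sketch using the standard end-loaded cantilever formula δ = F·L³/(3·E·I). The rod dimensions are illustrative assumptions of mine, not figures from any datasheet:

```python
import math

# Rough estimate of how much a cantilevered indicator mount deflects
# under the plunger's 1-3 N reaction force. All dimensions below are
# illustrative assumptions, not measured values.

def cantilever_deflection(force_n, length_m, e_pa, i_m4):
    """Tip deflection of an end-loaded cantilever: F * L^3 / (3 * E * I)."""
    return force_n * length_m**3 / (3 * e_pa * i_m4)

d = 0.008                        # mounting rod diameter, m (assumption)
I = math.pi * d**4 / 64          # second moment of area of a round rod
E = 200e9                        # Young's modulus of steel, Pa

for F in (1.0, 3.0):             # typical plunger preload range, N
    delta = cantilever_deflection(F, 0.150, E, I)   # 150 mm rod (assumption)
    print(f"F = {F} N -> tip deflection ~ {delta * 1e6:.0f} um")
```

Even a solid 8 mm steel rod at 150 mm gives roughly 30-85 um of deflection from the preload alone, which is exactly why the mounting stiffness, not the tool bit, is the concern.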

“You still haven't answered how an ultrasonic device can detect the tip of a cutting bit in reference to the tool holder (ie tormach, R8, etc). The z plane is established to the tip of the tool to the top of a raw stock, not the bed on the mill. “
The ultrasonic sensor, alignG, is a distance measurement sensor, not a tip detector. It measures the distance to any target surface under test, much as a dial indicator does, and it does the same job as inductive proximity or IR sensors, but with much higher precision. If you want to use it for zeroing, you align it once, zero the sensor reading in software, and mount alignG in that position.
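In practice the zeroing is just an offset subtraction. Here is a minimal sketch of that workflow; read_distance() is a hypothetical placeholder for whatever interface the sensor exposes, not an actual alignG API call:

```python
# Minimal sketch of the zeroing workflow described above.
# read_distance() is a hypothetical placeholder, NOT a real alignG API.

def read_distance():
    """Return the raw sensor-to-surface distance in mm (simulated here)."""
    return 25.000                  # stand-in value for illustration

# 1. Align the sensor once, then capture the reference reading.
zero_offset = read_distance()      # reading at the desired Z = 0 position

# 2. Report every later measurement relative to that reference.
def height_above_zero():
    return read_distance() - zero_offset

print(height_above_zero())         # 0.0 at the zero position
```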

“Another factor not mentioned is if the sensor has been calibrated against NIST standard instruments to establish your accuracy claims. The problem I see is the angle of the ultrasonic sensor and detector are fixed, therefore, the arc angle varies in relation to the distance from the tip of the sensor to the surface. Is this a linear signal that is returned or is the signal a non-linear curve? If its non-linear, how is it calibrated? At a fixed location or distance? A range of distance to establish enough curve for the profile? Does that accuracy change as a function of distance from the measuring object? What is the distance range where this accuracy claim is spec'd for? “
It is not a simple narrow beam transmitted and reflected along a direct path. Ultrasound is a wave, and its distribution depends on the type of transducer we use. The transmitting transducer is a wave source whose output spreads according to diffraction theory. A portion of that wave is reflected from the target surface and returns to the receiving transducer. The received signal is not a single wave; it is the result of constructive and destructive interference among many waves. We know the coordinates of the wave source and the receiver, and from the amplitude, phase, and Doppler data we solve the diffraction equations for the distance. There is complicated physics behind this, and I really don't want to go through all of it here.
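For readers who want at least the flavor of it, the textbook single-path version below shows why phase carries such fine distance information even at a low carrier frequency. This is emphatically not the alignG algorithm (which involves the full diffraction problem over many interfering paths); it is only the simplest illustration:

```python
# Textbook single-path illustration of phase-based ranging. NOT the
# alignG algorithm (which reportedly solves diffraction equations over
# many interfering paths); this only shows why phase carries fine
# distance information even at a low carrier frequency.

c = 343.0                  # speed of sound in air at ~20 C, m/s
f = 40_000.0               # common ultrasonic carrier, Hz (assumption)
wavelength = c / f         # ~8.6 mm

def distance_change(phase_shift_deg):
    """Round-trip path: a phase shift maps to half that path change."""
    return (phase_shift_deg / 360.0) * wavelength / 2.0

# One degree of resolvable phase corresponds to about 12 um of distance:
print(f"1 deg of phase -> {distance_change(1.0) * 1e6:.1f} um")
```

The phase wraps every half wavelength, of course, which is one reason the multi-frequency techniques mentioned below exist.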

“Does the ultrasonic detector compensate for humidity, temperature, barometric pressure, etc that defines air density and compensates? This is an important factor since the speed of sound varies in relation to air density. If it doesn't compensate, measurements taken at hot environments with high humidity would not match the results from measurements at cold locations with low humidity. The advantage that IR sensor has is the speed of light is not affected (or the effects are negligible) by these environmental factors. The speed of sound, on the other hand, is definitely affected by the environment.”
alignG compensates for that. It does not mean there is no effect, but the effect is minimized, and there are many ways to do it. The parameters you mention directly affect transit-time-based ultrasonic sensors; we are not doing a transit-time measurement. Phase/Doppler measurement is a different story: our measurement is not as simple as sending a pulse and timing its return. Researchers have developed many techniques over the past decades to minimize the effect of sound speed on Doppler/phase measurements. A simple search on Google or the IEEE Xplore website will show you many of them, including the multi-frequency approach, T&R (transmitter/receiver) orientation, …
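To show the size of the effect you are worried about on an uncompensated transit-time sensor, here is a quick calculation using the standard dry-air approximation c ≈ 331.3·√(1 + T/273.15); humidity adds a further fraction of a percent:

```python
import math

# Why an UNCOMPENSATED transit-time sensor drifts with temperature.
# Dry-air approximation; humidity adds a further fraction of a percent.

def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air, m/s."""
    return 331.3 * math.sqrt(1.0 + temp_c / 273.15)

true_distance = 0.100                     # m, sensor-to-bed (assumption)
c_cal = speed_of_sound(20.0)              # sensor calibrated at 20 C
t_round = 2 * true_distance / speed_of_sound(35.0)  # echo timed at 35 C

measured = c_cal * t_round / 2            # distance the sensor would report
print(f"error for a 15 C shift: {(measured - true_distance) * 1e6:.0f} um")
# -> roughly -2500 um, i.e. it reads ~2.5 mm short at only 100 mm range
```

That millimeter-scale drift is what the compensation techniques above are fighting.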
Any measurement instrument is sensitive to temperature. We never claimed to have an ideal sensor that solves every problem for everybody under any condition. Which sensors can you use for auto bed leveling today? Please let me know the available choices. Which of them are not temperature dependent? Is an inductive proximity sensor or an IR sensor kept a few millimeters from a hot bed unaffected by its temperature? You can keep alignG 10 cm away from the bed and still take the measurement.


http://www.migatron.com/high-accuracy-sensor/
“The reason I mention this is the RPS-412A high-accuracy ultrasonic sensor compensates for all these parameters and cost over $2000, and yet, they can only claim an accuracy of 50um maximum.”
The RPS-412A sensor does not cost them $2000; they sell it for $2000!
However, the link you posted does not say which measurement technology this sensor uses, but its structure and the listed parameters strongly suggest (99%, though not 100%, sure) a transit-time transducer. A transit-time ultrasonic transducer with 50-micron precision must be a high-frequency transducer. And high frequency drives the cost: high-frequency transducers are expensive; high-frequency sound is strongly damped in air, so you need an expensive, high-power precision system to compensate; fabricating acoustic lenses at that frequency is far more complicated and costly; and designing a precision analog circuit at a 20x higher frequency is much more difficult. So it costs them a lot, and they sell it for $2000.
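A quick timing budget shows why (round numbers of mine, just to show the scale):

```python
# Rough timing budget for a transit-time sensor claiming 50 um precision.
# Round illustrative numbers, just to show the scale of the problem.

c = 343.0                            # m/s, speed of sound at ~20 C
delta_d = 50e-6                      # m, target precision
delta_t = 2 * delta_d / c            # round trip -> about 0.29 us
print(f"required timing resolution: {delta_t * 1e9:.0f} ns")

# Against the carrier period: timing an echo edge to a small fraction of
# one cycle is hard, so a slow 40 kHz carrier forces you to go higher.
for f in (40e3, 500e3):
    period = 1.0 / f
    print(f"{f / 1e3:.0f} kHz carrier: {delta_t / period * 100:.1f}% of one cycle")
```

Resolving a third of a microsecond on a 40 kHz echo envelope is what pushes transit-time designs toward high frequencies, with all the cost that brings.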
We developed a new measurement technology that achieves precision measurement without expensive high-frequency transducers and circuits. Should we sell it for $2000 for you to believe its precision??!! :-)

“ I like to see hard concrete accuracy results using established calibrated industry standards before I can accept what is claimed. I know your spec is for "precision" and not "accuracy". I understand the difference, but listing just the "precision" is not enough (repeatability of the measurement). The measurement might be precise in that moment when you measure, but the accuracy could be off. I would get getting different results on hot days vs cold days. Getting repeatable measurements is not good enough to level the bed. The measured distance needs to be accurate (and precise) since I need the sensor to be calibrated against my nozzle tip. “
The practical accuracy of alignG is about twice its precision value over the recommended calibrated range of 1-10 cm.

“I don't want to keep changing my delta z probe height firmware setting depending on the temperature and humidity of my environment. In case you didn't know, the IR sensor is spec'd to 0.01mm (10 um), so its more precise than your device.”
https://miscsolutions.wordpress.com/mini-height-sensor-board/
If you really get 10-micron precision from your IR sensor, please forget alignG!
The strong comments and scientific questions you are asking about our product show that you have a lot of knowledge in electrical engineering and measurement science. I am surprised you are so sure about the 10-micron precision of your IR sensor.
An IR emitter (a temperature-dependent P-N junction diode), probably in series with a current-limiting (also temperature-dependent) resistor, emits the light. At the surface of the bed you have refraction, scattering, and reflection issues, all of which depend on many parameters: the tilt angle of the bed, the surface roughness at that spot, the material, …. The reflected light is detected by a detector (another temperature-dependent P-N junction diode), probably followed by a temperature-dependent voltage-divider resistor, an op-amp to increase the slope, and a second op-amp to trigger on a threshold voltage. And it has 10-micron precision? If you look at the link you posted in your previous comment, you will see this sentence: “Reproducibility of repeated probing at same spot: approx. 0.01mm”
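Just to illustrate how that chain of temperature-dependent parts can move the trigger point, here is a back-of-the-envelope estimate; the signal-slope figure is an assumption of mine for illustration, not a measured spec of that board:

```python
# Illustrative estimate of how a temperature-drifting trigger threshold
# becomes a height error for a threshold-type optical probe.
# The signal slope is an assumed figure, not a measured spec.

vf_tempco = 2e-3       # V/degC, typical forward-voltage drift of a Si diode
delta_t = 15.0         # degC of ambient/bed-heat change (assumption)
signal_slope = 0.5     # V/mm of detected signal near trigger (assumption)

threshold_drift = vf_tempco * delta_t              # 0.03 V
height_error_um = threshold_drift / signal_slope * 1000
print(f"trigger-height shift: ~{height_error_um:.0f} um")   # ~60 um
```

And that is only the temperature term. The quoted reproducibility figure raises its own questions: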
What does “at the same spot” mean? Remember that we are talking about an optical device with a wavelength of about 800 nm. Does your XYZ stage have enough repeatability to keep the sensor at the same spot? What happens if, on the next measurement, your sensor is positioned 1 micron away from the previous point? Does the scattering of the light off the surface change? What if your bed is not perpendicular; does that affect the reflection and change the intensity of the detected light?
Do IR sensors measure the distance, or do you move your Z-stage to find the zero? What is the accuracy of the Z-stage in your 3D printer? Have you ever measured the backlash of the Z-stage? How can we have 10-micron precision when most of the 3D printers on the market have more than 50 microns of backlash?
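If you have never measured it, it is easy to do with the dial indicator you already trust: approach the same nominal Z target from above and from below and difference the readings. A sketch:

```python
from statistics import mean

# Estimate Z backlash: command the same nominal Z target approaching
# from above and from below, record the dial-indicator reading for each
# approach, and difference the averages.

def backlash_um(from_above_mm, from_below_mm):
    return abs(mean(from_above_mm) - mean(from_below_mm)) * 1000

# Made-up example readings (mm), purely for illustration:
above = [1.012, 1.010, 1.011]
below = [1.074, 1.072, 1.073]
print(f"estimated Z backlash: ~{backlash_um(above, below):.0f} um")  # ~62 um
```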

“I saw on one of the FAQ that claims the device is accurate to 0.0015 cm (15 um). Yet, the spec for precision is 15um to 25um. That absolutely makes no sense at all. Unless you mean the 15um only applies to the sensor spec'd at 15um and the 25um sensor's accuracy is 25um. Even then, its a stretch to say the accuracy is higher than the precision level for the 25 um. I can't be measuring something accurate to 15um when my measurements itself is varying up to 25um from one measurement to the next without moving the carriage. It just doesn't make any sense. If you said the sensor is precise to 15um, but only accurate to 25um, that is more believable. I hope this is more of an oversight in terms of spec'ing the device properly. A simple temperature cycling test with the device in a controlled chamber cycling between hot and cold (10F to 160F) can easily determine the accuracy claims. If the measurements are within 15um in that condition, then it would prove the accuracy claims.”
The accuracy of the device is not 15 microns. If that figure appears anywhere on our page, it is certainly wrong and needs to be corrected; I apologize for the mistake. I would highly appreciate it if you could tell me in which FAQ you saw it, because I could not find it.

“The autoleveling mechanism is the tool to keep the bed level. Not only is the parameters used to calibrate out the level but adjust for any imperfections of the bed. I don't understand your logic where you have 4 points in the corner (not clearly defied via Gcode) is more accurate than 40 measurements across the bed using precise and repeatable locations defined by the bed leveling algorithms that goes to the same spot every time you level. “
We never claimed that 4-point leveling is more accurate than a 40-point measurement; 40-point leveling is certainly much better.
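For reference, here is a minimal sketch of the core step such multi-point data feeds: a least-squares plane fit over the probed heights. Real delta calibration, as you describe next, fits more parameters (delta radius, endstop offsets, tower angles); this shows only the basic idea, using numpy:

```python
import numpy as np

# Least-squares plane fit z = a*x + b*y + c over probed bed heights.
# A sketch of the basic idea only; real delta calibration fits more
# parameters (delta radius, endstop offsets, tower angles, ...).

def fit_plane(points):
    """points: (N, 3) array of probed (x, y, z). Returns (a, b, c)."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs

# Example: a 5 x 5 probe grid with a slight tilt plus probe noise.
rng = np.random.default_rng(0)
xy = np.mgrid[0:200:5j, 0:200:5j].reshape(2, -1).T
z = 0.0004 * xy[:, 0] - 0.0002 * xy[:, 1] + 0.1 + rng.normal(0, 0.005, len(xy))
a, b, c = fit_plane(np.column_stack([xy, z]))
print(f"tilt: a={a:.5f}, b={b:.5f}, offset c={c:.3f} mm")
```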

“In terms of checking each time, the process is to first establish the endstop positions using fixed length of rod (in the case of a delta machine). At that point, the system will move the carriage to each point and directly read the Z height value on the computer or LCD screen. You can adjust the endstops as needed if you want a pre-set height. I don't see how your system is any different in terms of process other than a separate computer program to measure the height. You can also take the data from the bed leveling and do a least squares fit to adjust for imperfections of the build such as Delta frame angles, the bed tilt, bed warpage, and so on. The process has to go through several iterations if you want to get the height measurements within a narrow range. These values are automatically fed back to the firmware to control the delta radius, end stop adjustments. Every firmware has differences. I hope you understand this is not just a check four corners and you are ready to go situation.
Adding a analog/I2C/USART interface sounds like a great idea, but why should it be end users or 3rd party developers making tools for it to work on a 3D printer? If you are trying to sell this to 3D printer users, it should already have a pre-defined interface that works with the existing controller boards like Rumba, Duet, and firmware like Marlin, Reprap, Repetier, etc. “
As explained in my previous post, we have limited resources, time, and manpower. We cannot build software for everything and still release to end users in April 2016. We will consider that in future updates of our software.

“I absolutely like the idea of having M3 mounting holes. It would be great for users to mount the sensor in a rigid configuration.
I am not here to crap on your thread. I just want to get reasonable answers so other 3D users are informed.”
Thanks for all the fantastic questions and comments. We are here to get feedback and to learn from people who have experience working with 3D printers and CNC machines. I will be more than happy to receive more comments. Thanks for your time.