06-25-2014, 12:22 AM #34
Yes, which should be a constant PWM per layer. The only time the PWM should change is when a layer will take more or less time than the previous layer. The only time the PWM will be off is when the print is finished or paused. I think we are essentially saying the same thing.
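The "constant PWM per layer" idea above can be sketched numerically. This is a hypothetical illustration, not Peachy firmware: all the function names and the container, layer, and drip figures are assumptions chosen to show how the drip rate would only change when the layer time changes.

```python
# Hypothetical sketch: the drip (PWM) frequency needed so the resin
# level rises exactly one layer height during each layer's print time.
# All names and numbers here are illustrative assumptions.

def drips_per_second(container_area_mm2, layer_height_mm,
                     layer_time_s, drip_volume_mm3):
    """Drips per second so the level rises one layer in layer_time_s."""
    layer_volume = container_area_mm2 * layer_height_mm  # mm^3 to add
    n_drips = layer_volume / drip_volume_mm3             # drips needed
    return n_drips / layer_time_s                        # constant within the layer

# Example: 200x200 mm container, 0.1 mm layers, 60 s per layer, 20 mm^3 drips
rate = drips_per_second(200 * 200, 0.1, 60.0, 20.0)
print(rate)  # ~3.33 drips/s; changes only when layer_time_s changes
```

For a layer that takes twice as long, the same function returns half the rate, which is exactly the "PWM changes only when a layer takes more or less time" behaviour described above.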
These two issues are related. If we want lower cost, we want a smaller magnet, a smaller coil, and less current. A larger magnet with a stronger field will be less susceptible to fluctuations caused by the viscosity and pressure of the fluid, but shrink it all down and the viscous and surface-tension forces start to dominate the magnetic force. In either case, the solution is to create a guiding element with the epoxy/polyurethane. If you look at Rylan's original design concept in his video at the beginning of the thread, the magnet is "encased" in a right circular cone on each side. This will keep the magnet returning to where it should be regardless of minor fluctuations, and it will allow the magnet to be much smaller, require less energy to operate, and potentially increase the drip frequency.
Additionally, you are correct that the drip is not precise. However, it is accurate. That is to say, it does of course deviate a little, but this magnet can only come up and go down in so many ways. The drip sizes will all fall under a normal (Gaussian) probability density curve, and the drips are so small compared to the level (assuming a large enough print volume) that the error from calibration and calculation should be no worse than the error from measurement and calculation. You have to either assume the drip and container volume, or assume the resolution and zero position of the measurement device. Standard PLA printers do calibration tests. Inkjet and laser 2D printers do calibration tests and alignment prints. Doing an alignment print on the Peachy and assuming you are using a right rectangular prism for the print volume is no different from assuming you have assembled and properly calibrated a level sensor within the print area.
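The statistical claim above can be checked with a quick simulation. This is a hedged sketch, not a measurement: the 20 mm³ mean, the 5% per-drip spread, and the drip count are all assumed values, chosen only to show that when drip volumes vary randomly around a calibrated mean, the relative error in the accumulated level shrinks as drips pile up.

```python
# Sketch of the "accurate but not precise" argument: i.i.d. drips around
# a calibrated mean give a total volume whose *relative* error shrinks
# roughly as sigma / (mean * sqrt(n)). All numbers are assumptions.
import random

random.seed(0)
mean_drip, sigma = 20.0, 1.0     # mm^3; assumed 5% spread per drip
n = 10_000                       # drips over the course of one print

total = sum(random.gauss(mean_drip, sigma) for _ in range(n))
expected = mean_drip * n
rel_error = abs(total - expected) / expected
print(rel_error)  # on the order of 0.05% for these assumed values
```

In other words, even though any single drip is imprecise, the level after thousands of drips tracks the calibrated prediction closely, which is why a one-time calibration print can stand in for a live level sensor.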
In shrinking the magnet and speeding up the drip frequency, the constant surface tension and viscosity should hit a critical point that simply will not allow liquid to pass. If the valve is designed to sit relatively close to this critical point, the surface tension and viscosity should maintain a very consistent drip volume. Another approach would be some sort of telescoping system for the magnet that, by design, only allows a specific volume into a chamber: magnet goes up, chamber is released; magnet goes down, chamber is refilled. I'm not sure how to do that right off the top of my head, but I assume something like that could be done.
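One way to put numbers on that "critical point" is Tate's law, which says a pendant drop detaches when its weight balances the surface tension acting around the orifice rim, so the drip volume is set by geometry and fluid properties rather than by the exact magnet motion. The resin-like values below (orifice size, surface tension, density) are assumptions for illustration, not measured Peachy figures.

```python
# Rough worked example: Tate's law gives the detachment volume of a
# pendant drop as V = pi * d * gamma / (rho * g). The fluid values
# below are assumed, resin-like numbers for illustration only.
import math

def tate_drop_volume_mm3(orifice_diameter_mm, surface_tension_N_per_m,
                         density_kg_per_m3, g=9.81):
    """Detachment volume in mm^3 from orifice diameter and fluid properties."""
    d = orifice_diameter_mm / 1000.0                       # mm -> m
    v_m3 = (math.pi * d * surface_tension_N_per_m
            / (density_kg_per_m3 * g))                     # m^3
    return v_m3 * 1e9                                      # m^3 -> mm^3

# Assumed fluid: 2 mm orifice, gamma = 0.035 N/m, rho = 1100 kg/m^3
print(tate_drop_volume_mm3(2.0, 0.035, 1100.0))  # ~20 mm^3 per drip
```

Because every term in that formula is fixed by the hardware and the fluid, operating near this regime is what would keep the drip volume nearly constant from drip to drip.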
Calibration and assumption are used often in electronics for many purposes, but yes, there is usually some feedback mechanism to confirm that the calibration and assumptions still hold. I would agree that measuring is important, but it adds another layer of complexity for users who just want the $100 printer to print some really cool objects, or who don't care about it being accurate to a few microns. So, for simplicity's and cost's sake, I say why bother? On the other hand, measuring afterwards allows you to use basically any container shape for the print volume. Whichever solution serves the most basic $100 users best and costs the least to implement is the one we should pursue. Hell, why not just do it both ways?