On the other hand, when the motor inertia is larger than the load inertia, the motor needs more power than is otherwise necessary for the application. This raises costs in two ways: paying more up front for a motor that is larger than necessary, and paying higher operating costs for the increased power consumption. The solution is to use a gearhead to match the inertia of the motor to the inertia of the load.
Recall that inertia is a measure of an object’s resistance to a change in its motion and is a function of the object’s mass and shape. The greater an object’s inertia, the more torque is needed to accelerate or decelerate it. This means that when the load inertia is much larger than the motor inertia, the mismatch can cause excessive overshoot or long settling times. Either condition can reduce production line throughput.
Inertia matching: Today’s servo motors produce more torque relative to frame size, thanks to dense copper windings, lightweight materials, and high-energy magnets. This creates greater inertia mismatches between servo motors and the loads they are intended to move. Using a gearhead to better match the inertia of the motor to the inertia of the load allows a smaller motor to be used and results in a far more responsive system that is easier to tune. Again, this is achieved through the gearhead’s ratio: the inertia of the load reflected to the motor is reduced by 1/ratio².
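The 1/ratio² relationship can be sketched with a quick calculation. The motor and load inertia values below are illustrative assumptions, not figures from the article or any datasheet:

```python
# Sketch: how a gearhead ratio reduces the load inertia reflected
# to the motor shaft. The motor/load inertia values are assumed
# for illustration only.

def reflected_inertia(load_inertia, ratio):
    """Load inertia seen at the motor shaft through an ideal gearhead
    (the gearhead's own inertia is ignored here)."""
    return load_inertia / ratio**2

motor_inertia = 0.00012   # kg*m^2 (assumed)
load_inertia = 0.048      # kg*m^2 (assumed)

for ratio in (1, 5, 10):
    jr = reflected_inertia(load_inertia, ratio)
    mismatch = jr / motor_inertia
    print(f"{ratio:>2}:1  reflected = {jr:.6f} kg*m^2, mismatch = {mismatch:.0f}:1")
```

With these assumed numbers, a direct drive (1:1) gives a 400:1 mismatch, while a 10:1 gearhead brings it down to 4:1, which is why the gearhead makes the system easier to tune.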
As servo technology has evolved, with manufacturers producing smaller yet more powerful motors, gearheads have become increasingly essential partners in motion control. Finding the ideal pairing must take many engineering considerations into account.
So how does a gearhead provide the power required by today’s more demanding applications? It all goes back to the fundamentals of gears and their ability to change the magnitude or direction of an applied force.
The gears and the number of teeth on each gear create a ratio. If a motor can generate 20 in-lb of torque and a 10:1 gearhead is attached to its output, the resulting torque will be close to 200 in-lb. With the ongoing emphasis on smaller footprints for motors and the equipment they drive, the ability to pair a smaller motor with a gearhead to achieve the desired torque output is invaluable.
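The torque multiplication above is a one-line calculation. The efficiency figure below is an assumed typical value for a planetary gearhead, included only to show why the result is "close to" rather than exactly 200 in-lb:

```python
# Sketch of the torque relationship in the example above: motor
# torque times gearhead ratio, derated by an assumed gearhead
# efficiency (not a value given in the text).

def output_torque(motor_torque_inlb, ratio, efficiency=0.95):
    # efficiency=0.95 is an assumed typical planetary-gearhead value
    return motor_torque_inlb * ratio * efficiency

print(output_torque(20, 10))  # close to 200 in-lb
```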
A motor may be rated at 2,000 rpm, but your application may require only 50 rpm. Trying to run the motor at 50 rpm may not be optimal, for the following reasons:
If you are running at a very low speed, such as 50 rpm, and your motor feedback resolution is not high enough, the update rate of the electronic drive can cause a velocity ripple in the application. For instance, with a motor feedback resolution of 1,000 counts/rev, you have a measurable count every 0.36 degrees of shaft rotation. If the electronic drive controlling the motor has a velocity-loop update rate of 0.125 milliseconds, then at 50 rpm (300 deg/sec) it will look for a measurable count every 0.0375 degrees of shaft rotation. When it does not find that count, it speeds the motor up to find it. At the speed at which it finds the next measurable count, the rpm is too fast for the application, so the drive slows the motor back down to 50 rpm and the whole process starts again. This constant increase and decrease in rpm is what causes velocity ripple in an application.
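The numbers in this example can be checked with a quick back-of-envelope calculation:

```python
# Back-of-envelope check of the velocity-ripple example: shaft
# rotation per feedback count versus per velocity-loop update
# at 50 rpm, using the figures quoted in the text.

counts_per_rev = 1000
loop_period_s = 0.125e-3     # 0.125 ms velocity-loop update
rpm = 50

deg_per_count = 360 / counts_per_rev          # degrees between counts
deg_per_sec = rpm * 360 / 60                  # shaft speed in deg/s
deg_per_update = deg_per_sec * loop_period_s  # degrees per loop update

updates_per_count = deg_per_count / deg_per_update
print(f"{deg_per_count} deg/count, {deg_per_update} deg/update, "
      f"~{updates_per_count:.1f} loop updates between counts")
```

So the drive runs roughly ten velocity-loop updates between measurable counts, and it is during those "blind" updates that it hunts up and down in speed.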
A servo motor operating at low rpm runs inefficiently. Eddy currents are loops of electrical current induced within the motor during operation. These eddy currents produce a drag force within the motor and have a greater negative impact on motor efficiency at lower rpm.
An off-the-shelf motor’s parameters may not be well suited to running at low rpm. When an application runs such a motor at 50 rpm, it is essentially not using all of the motor’s available rpm. Because the voltage constant (V/krpm) of the motor is set for a higher rpm, the torque constant (Nm/A), which is directly related to it, is lower than it needs to be. As a result, the application needs more current to drive the load than it would with a motor designed specifically for 50 rpm.
A gearhead’s ratio reduces the motor rpm, which is why gearheads are sometimes called gear reducers. With a 40:1 gearhead, the motor rpm at the input of the gearhead is 2,000 rpm and the rpm at the output of the gearhead is 50 rpm. Running the motor at the higher rpm lets you avoid the first two issues described above. As for the third, the mechanical advantage of the gearhead allows the design to use far less torque and current from the motor.
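A minimal sketch tying the 40:1 example together: speed is divided by the ratio, while the torque (and hence current) demanded of the motor is divided by the same factor. The load torque and motor torque constant below are assumed values for illustration, not figures from the text, and gearhead losses are ignored:

```python
# Sketch of the 40:1 gearhead example: 2,000 rpm in, 50 rpm out,
# with motor torque and current reduced by the same ratio.
# load_torque and kt are assumed illustrative values; losses ignored.

ratio = 40
motor_rpm = 2000
load_torque = 8.0   # Nm required at the gearhead output (assumed)
kt = 0.5            # Nm/A motor torque constant (assumed)

output_rpm = motor_rpm / ratio       # speed divided by the ratio
motor_torque = load_torque / ratio   # torque demand on the motor (ideal)
motor_current = motor_torque / kt    # current follows torque via Kt

print(f"{output_rpm} rpm out, {motor_torque} Nm and "
      f"{motor_current} A at the motor")
```

Direct drive would require the full 8 Nm (16 A with this assumed Kt) from the motor; through the gearhead the motor supplies only 0.2 Nm and 0.4 A.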