Soldato · Joined: 4 Aug 2007 · Posts: 22,396 · Location: Wilds of Suffolk
I've seen voltage regulators in action (which is what they were called in the plant I worked at).
It was a multi acre manufacturing plant with hundreds of motors around it.
It was the motors they optimised, one by one, which is important. Every motor (they started with the largest, hungriest ones) was given a regulator, which was slowly dialled down to a lower voltage whilst watching the current; there's a rough sketch of that process below.
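For anyone curious what that dialling-down looks like in practice, here's a minimal sketch. Everything in it is assumed for illustration: the toy motor model, the tap voltages and the current limit are made up, and a real optimiser would be measuring actual current and power rather than simulating them.

```python
# Sketch of the "dial down and watch the current" procedure described above.
# The motor model and tap values are hypothetical; the point is just the loop:
# step the voltage down while input power keeps falling and current stays safe.

def motor_input_current(voltage_v, rated_v=400.0, rated_current_a=50.0):
    """Toy model: a lightly loaded motor draws a bit less current as voltage
    drops, until it starts compensating by drawing more."""
    if voltage_v >= 0.9 * rated_v:
        return rated_current_a * (0.7 + 0.3 * voltage_v / rated_v)
    # Below ~90% of rated voltage the current climbs back up (motor works harder)
    return rated_current_a * (1.0 + 2.0 * (0.9 - voltage_v / rated_v))

def optimise_tap(taps_v, current_limit_a=55.0):
    """Step through the available voltage taps, highest first, and keep the
    lowest tap whose input power is still falling and whose current is safe."""
    best_tap, best_power = None, float("inf")
    for v in sorted(taps_v, reverse=True):
        i = motor_input_current(v)
        p = v * i  # simple volt-amps as a stand-in for measured power
        if i > current_limit_a or p >= best_power:
            break  # current rising or no further saving: stop dialling down
        best_tap, best_power = v, p
    return best_tap, best_power

if __name__ == "__main__":
    taps = [400, 390, 380, 370, 360, 350]
    tap, power = optimise_tap(taps)
    print(f"Selected tap: {tap} V, approx input power: {power:.0f} VA")
```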
The regulators also reported back to a control room, so if (when) something broke, they knew about it straight away.
That faster response, and hence better uptime, is what funded the project.
It was throwing up decent savings on energy usage, around 10-15%, but it varied by motor. Some tolerated the lower voltage better than others, but all saw some energy reduction over time.
I believe part of it (IIRC) was that running at the lower voltage reduced heat, and a cooler motor runs more efficiently.
There was talk about it reducing breakdowns, but we never included anything from the finance side for that, just the (demonstrated) reduction in energy usage.
I believe (again IIRC, this was over a decade ago) that for some equipment, like control panels, consumption actually increased if you stopped feeding them full mains voltage. Many stepped it down to 12V internally anyway, so if you fed them a lower voltage you hit the same issue you see in the consumer space, where a PSU is less efficient at 120V than it is at 240V. There's a quick back-of-envelope example of that effect below.
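The load and efficiency figures here are invented purely to show why a less efficient PSU at a lower input voltage can wipe out the saving:

```python
# Back-of-envelope illustration of the control-panel point: if the internal
# PSU is slightly less efficient at a lower input voltage, reducing the supply
# voltage can actually increase the energy drawn from the wall.
# All numbers below are made-up examples, not measurements.

load_w = 60.0  # power the downstream 12 V electronics actually need

efficiency = {
    240: 0.92,  # assumed PSU efficiency at full mains voltage
    220: 0.90,  # assumed efficiency after the voltage "optimiser" drops it
}

for vin, eff in efficiency.items():
    input_w = load_w / eff  # power drawn from the supply at this input voltage
    print(f"{vin} V input: {input_w:.1f} W from the wall")
# 240 V input: 65.2 W from the wall
# 220 V input: 66.7 W from the wall  -> lower voltage, more energy used
```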
I think the problem with a house vs industry is that in a house you will have one unit, whereas in industry you will have many. A single unit affects everything downstream, so it may improve some things but will potentially worsen efficiency in others (like the PSU I mention above).
FWIW I believe the industrial units were basically the same thing, just slightly better in that they had more voltage-drop choices rather than just three. I don't think they had a fully variable dial, more a kind of swappable tap arrangement.
In fact, now I think back, they had a more expensive unit they used to work out what voltage to supply a motor with, and then a more basic version they would set to that voltage (or as close as possible) and swap in.