Algorithms tend to fall pretty squarely into one of two camps: "prevent your sibling from reading your diary" or "prevent the NSA and Mossad from reading your Internet traffic."
Computers get faster every year, so a cipher with a very narrow safety margin tends to go from marginal to completely broken quickly.
Some things must be encrypted well enough that even if the NSA records them now, it still won't be able to decipher them 10 or 20 years later.
Other things need to be encrypted only well enough that nobody can decipher them in close to real time. If an adversary brute-forces them after a week, the data will have become useless by then.
Lightweight cryptography is for this latter use case.
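To put rough numbers on how fast a narrow margin erodes, here's a minimal sketch assuming attacker compute doubles every two years (a Moore's-law-style assumption, not a guarantee, and it ignores cryptanalytic breakthroughs entirely):

```python
# Sketch: how long a brute-force safety margin survives if attacker
# compute doubles every DOUBLING_YEARS years. Pure compute growth
# only; real breaks usually come from cryptanalysis, not Moore's law.

DOUBLING_YEARS = 2.0  # assumed compute-doubling period

def years_until_feasible(margin_bits: float) -> float:
    """Years until an attack that is 2**margin_bits too expensive
    today becomes feasible, under steady compute doubling."""
    return margin_bits * DOUBLING_YEARS

for margin in (4, 8, 16, 32):
    print(f"{margin:2d}-bit margin: eaten in ~{years_until_feasible(margin):.0f} years")
```

A 4-bit margin is gone within a decade; that's the "narrow safety margin" failure mode.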
The hard part is that anything designed to be breakable in a week is only ~168x away in strength from being broken in an hour (a week is 168 hours). That means you need to be extremely confident about how strong your algorithm actually is. It's much easier to eat a little extra cost so that you believe it's invulnerable for thousands of years, because then you don't need to worry about a factor of 2 here and there.
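The week-to-hour gap is even smaller in bit terms, and the "thousands of years" hedge is surprisingly cheap; a back-of-the-envelope check (plain arithmetic, no assumptions beyond the calendar):

```python
import math

# The gap between break-in-a-week and break-in-an-hour, in bits.
week_over_hour = 7 * 24  # 168 hours in a week
print(f"week -> hour: {week_over_hour}x, "
      f"~{math.log2(week_over_hour):.1f} bits")

# Going the other way: stretching a one-week attack out to ~2000
# years costs the attacker comparatively few extra bits of work.
weeks_per_2000_years = 2000 * 365.25 * 24 / (7 * 24)
print(f"week -> 2000 years: ~{weeks_per_2000_years:.0f}x, "
      f"~{math.log2(weeks_per_2000_years):.1f} bits")
```

So roughly 7 bits of misjudgment separate "breakable in a week" from "breakable in an hour," while ~17 extra bits of cost buy you two millennia of margin.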
Right. Strength in a shadow contest like cryptography can, at best, be estimated to sixteen-bit orders of magnitude (±65536x). Just because you can't break it doesn't mean somebody else doesn't secretly know a game-changing way to break it. So you keep padding with huge exponential hedges, so that even if they can shave a dozen bits off the strength of the scheme, it's still secure against any finite adversary.
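To see what "a dozen bits shaved off" does to schemes with and without a hedge, here's a toy calculation; the attacker throughput of 1e15 guesses per second is a hypothetical figure, not a real adversary model:

```python
# Sketch: effect of a cryptanalytic advance that shaves bits off a
# scheme's effective strength. ATTACKER_OPS_PER_SEC is an assumed
# throughput for a large brute-force farm, chosen for illustration.

ATTACKER_OPS_PER_SEC = 1e15
SECONDS_PER_YEAR = 3.15e7

def brute_force_years(effective_bits: float) -> float:
    """Expected years to exhaust a 2**effective_bits keyspace."""
    return 2 ** effective_bits / ATTACKER_OPS_PER_SEC / SECONDS_PER_YEAR

for nominal in (64, 80, 128):
    for shaved in (0, 12, 24):
        yrs = brute_force_years(nominal - shaved)
        print(f"{nominal:3d}-bit scheme minus {shaved:2d} bits: ~{yrs:.1e} years")
```

The 128-bit scheme shrugs off even a 24-bit break with hundreds of millions of years to spare; the lightweight 64-bit scheme drops into near-real-time range from a 12-bit break.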
If "every little bit helps" is true for the environment, it's also true for cryptography, and vice versa.