Once in a while you probably stumble upon this one: "Strict error: Only variables should be passed by reference". The workaround is usually to add some temporary variable that you'll never use anyway.
But what bothers me is: why is this warning even there? Does it serve any purpose? If I know I won't need that damn variable, why not let me just skip it? It's most annoying when working with older APIs that aren't object-oriented and use reference arguments like crazy.
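For comparison, C++ enforces the same restriction at compile time: a non-const reference parameter won't bind to a literal, so you end up introducing the same kind of throwaway variable. A minimal sketch (the function names here are illustrative, not from any real API):

```cpp
// Hypothetical analog of the PHP situation: dbl() takes a non-const
// reference, the way many older PHP APIs take their arguments by reference.
void dbl(int &x) { x += x; }

int call_it() {
    // dbl(5);      // rejected: a literal is not a variable
    int tmp = 5;    // the throwaway variable the strictness forces on you
    dbl(tmp);       // fine: tmp is now 10
    return tmp;
}
```

The temporary variable feels like busywork, but it is the language making you acknowledge that the function will modify something.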
FORTRAN used to pass all parameters by reference. That could cause strange problems, as in:
      FUNCTION DUBL(X)
      X = X + X
      DUBL = X
      RETURN
      END

      Y = DUBL(5.0)
      Z = 5.0 + 1.0
The last line sets Z to 11.0. That's because the function call in the next to last line caused 5.0 to be doubled, so that wherever 5.0 appears in the source code it now means 10.0.
Partly this was because the compiler, trying to conserve memory, would store each literal constant value only once. If it had used a separate copy of 5.0 every time it was mentioned, the problem would still exist but have no effect. (If a tree doubles in the forest and there's no one to notice, did it really double?)
But mainly it was because you passed a non-const reference to a literal, effectively turning a constant into a variable. (The adage was: "Constants aren't. Variables don't.")
Most modern languages either don't allow references to constants or allow them only if you specifically const-qualify them.
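C++ is a convenient place to see both rules at once: a non-const reference refuses to bind to a literal, while a const-qualified reference accepts one (function names here are illustrative):

```cpp
// May modify its argument, so it will not bind to a literal.
void zero(int &x) { x = 0; }

// Read-only view; const-qualification lets it bind to a constant.
int succ(const int &x) { return x + 1; }

int demo() {
    // zero(5);       // compile error: 5 is not a modifiable lvalue
    int n = 5;
    zero(n);          // fine: n is a variable
    return succ(5);   // fine: the reference is const
}
```

The const-qualified version is safe precisely because the compiler guarantees the constant can't be turned into a variable, which is exactly what the FORTRAN example above did.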
A non-const reference to a function result is considered a no-no because you're essentially promising the function that it can modify the thing referred to, but that's misleading because the thing is about to disappear taking all modifications with it. Preventing the function from modifying it forces the function to realize that if it wants to make meaningful changes to the object referred to, it should copy the object first.
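In C++ terms, a sketch of why binding a non-const reference to a function result is disallowed, along with the copy-first workaround the paragraph describes (names are illustrative):

```cpp
#include <string>

std::string make_name() { return "temp"; }   // result is a temporary

void scribble(std::string &s) { s += "!"; }  // wants to modify its argument

// scribble(make_name());  // compile error: the result is about to disappear,
//                         // taking any modifications with it

std::string keep_it() {
    std::string copy = make_name();  // copy first so the changes are meaningful
    scribble(copy);
    return copy;                     // "temp!"
}
```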
It's easy to come up with scenarios where it would be useful to modify the object anyway (generally because the function is going to refer to it again before returning), but there are many more scenarios that result in subtle and very hard to diagnose bugs. Programmers chafe at the inconvenience of strictness until they're bitten by a bug caused by laxness, and then they're very OK with it.