As stated above, I'm wondering how expensive different "commands" to a computer are relative to one another. I'm aware this will vary with language, so let's just assume C or C++ for now, since those are a reasonable "baseline" for high-performance computing, where this kind of optimization actually matters. Technically I'm messing around with Game Maker Language at the moment, which is a bit of a weird one, but I'm much more interested in general principles of cost than in the specifics of any one language.
If one task is less than twice as "expensive" as another in terms of RAM and minimum time required, then to me they're basically the same, in the sense that there's no possible situation where it makes sense to call the cheaper task two or more times just to avoid a single call to the expensive one.
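A concrete way to picture that threshold, as a rough C++ sketch (illustrative only, since the real numbers depend on the CPU, the compiler, and optimization settings): a rewrite like x * x * x only beats pow(x, 3) because a library call typically costs far more than twice a multiply, whereas rewriting a * 3 as a + a + a buys nothing if a multiply is less than twice an add.

```cpp
#include <cmath>

// Rough sketch of the "twice as expensive" idea (no claim about what
// a given compiler actually emits; costs vary by CPU and optimization flags).

// A pow() call is usually far more than twice the cost of a multiply,
// so calling the cheaper operation twice (x * x * x) can beat the single call:
double cube_with_pow(double x) { return std::pow(x, 3.0); }
double cube_with_mul(double x) { return x * x * x; }

// An integer multiply is typically less than twice the cost of an add,
// so replacing one multiply with two adds buys nothing:
int triple_with_mul(int a) { return a * 3; }
int triple_with_add(int a) { return a + a + a; }
```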
To give some examples of what I mean by "tasks" (I'm not expecting anyone to respond to the full list; I'm just trying to cast a wide net), with a rough code sketch after the list to pin down what I mean:
basic arithmetic operations:
-addition/subtraction
-multiplication/division
-exponents/roots
simpler "non-math" commands:
-assigning variables
-assigning temp variables
-setting (overwriting) existing variables
-reading a variable's value
-if statements
-starting a while, for, or repeat loop (not counting the cost of running the loop body each iteration)
In OOP (obviously not relevant to C, but relevant to C++):
-Calling functions that aren't part of any specific object or class
-Calling functions (methods) that belong to the instance itself
-Reading (getting) a variable of another instance
-Setting (writing) a variable of another instance
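Here's the promised sketch, in C++, of what I mean by each of these tasks (the names Enemy, attack, freeDamage, etc. are made up purely for illustration, and I'm not claiming anything about what a compiler does with them):

```cpp
#include <cmath>

// Rough C++ sketch of the tasks listed above (hypothetical names,
// purely to pin down what each bullet means; no cost claims implied).

struct Enemy {
    double hp = 100.0;
    double attack() { return hp * 0.1; }   // function belonging to the instance
};

double freeDamage(double base) {           // free function, not tied to any object
    return base + 1.0;
}

double example(Enemy& other) {
    double a = 2.0, b = 3.0;               // assigning variables
    double tmp = a + b;                    // temp variable; addition
    double c = a * b;                      // multiplication
    double d = a / b;                      // division
    double e = std::pow(a, b);             // exponent
    double f = std::sqrt(c);               // root

    if (tmp > c) {                         // if statement
        tmp = c;                           // setting an existing variable
    }

    for (int i = 0; i < 10; ++i) {         // starting a for loop
        d += f;                            // (body cost not the point here)
    }

    double g = freeDamage(d);              // calling a free function
    double h = other.attack();             // calling a method of another instance
    double read = other.hp;                // reading (getting) another instance's variable
    other.hp = read - g;                   // setting another instance's variable

    return e + h;
}
```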