Using a “getter” as a default value of a Python function

I ran into some very weird behaviour with Python’s default argument values, and I’d appreciate any help understanding why I get two different behaviours from what looks like the same thing.

We have a pretty straightforward custom int class:

class CustomInt(object):
    def __init__(self, val=0):
        self._val = int(val)
    def increment(self, val=1):
        self._val += val
        return self._val
    def __str__(self):
        return str(self._val)
    def __repr__(self):
        return 'CustomInt(%s)' % self._val
    def get(self):
        return self._val

Class instantiation:

test = CustomInt()

Then we define a function that uses the getter’s result as a default argument:

def move_selected(file_i = test.get()):
    global test
    test.increment()
    print(file_i)
    print(test)

If we call move_selected() once, file_i gets a local copy of test’s value, and the global variable test is updated (the output is 0 then 1).

The second time we call move_selected(), the default value is still 0 (we get 0 then 2), even though test was updated. But if we explicitly write move_selected(test.get()), the outcome is different (we’d get 1 then 2).

Why? Are we not supposed to pass functions as default arguments?

Answer

Default values are evaluated when the function is defined, not when it is called. The value of test.get() was 0 when you defined the function, so that is what the default stays, no matter how test changes afterwards.
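A minimal sketch of this (using a hypothetical `counter` variable and `get_counter` helper, not your class) that shows the default is computed exactly once, at `def` time:

```python
counter = 0

def get_counter():
    return counter

# get_counter() runs right here, once, while the def statement executes
def show(value=get_counter()):
    return value

counter = 10

print(show())               # still 0: the default was captured at definition time
print(show(get_counter()))  # 10: an explicit argument is evaluated at call time
```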

If you want to call the getter every time the function runs, you can do exactly that in the function body:

def move_selected(file_i=None):
    if file_i is None:        # sentinel: fetch the current value at call time
        file_i = test.get()

    test.increment()

    print(file_i)
    print(test)
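As for “are we not supposed to pass functions as default arguments?”: you can, as long as you pass the function object itself rather than its result. The default is then a (bound) method evaluated once at definition time, but *calling* it happens on every invocation. A sketch of this alternative to the `None` sentinel, with the class trimmed down for self-containment:

```python
class CustomInt:
    def __init__(self, val=0):
        self._val = int(val)
    def increment(self, val=1):
        self._val += val
        return self._val
    def get(self):
        return self._val

test = CustomInt()

def move_selected(getter=test.get):
    # The default is the bound method test.get, not its return value;
    # calling getter() here reads test's current value on every call.
    file_i = getter()
    test.increment()
    return file_i

first = move_selected()   # 0
second = move_selected()  # 1
```

Note that this still captures the `test.get` bound method once, so it keeps pointing at the original object even if the name `test` is later rebound to a new `CustomInt`.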