import sys
class Vector:
    def __init__(self, x, y):
        self.x = x
        self.y = y

v1 = Vector(1, 2)
v2 = Vector(1, 2)
print(f"v1 __dict__ : {sys.getsizeof(v1.__dict__)} bytes")
print(f"v2 __dict__ : {sys.getsizeof(v2.__dict__)} bytes")
v3 = Vector(1, 2)
v4 = Vector(1, 2)
print(f"v3 __dict__ : {sys.getsizeof(v3.__dict__)} bytes")
print(f"v4 __dict__ : {sys.getsizeof(v4.__dict__)} bytes")
v5 = Vector(1, 2)
print(f"v5 __dict__ : {sys.getsizeof(v5.__dict__)} bytes")
v6 = Vector(1, 2)
print(f"v6 __dict__ : {sys.getsizeof(v6.__dict__)} bytes")
Can anyone tell me why the result of the code above is:
v1 __dict__ : 288 bytes
v2 __dict__ : 288 bytes
v3 __dict__ : 272 bytes
v4 __dict__ : 272 bytes
v5 __dict__ : 264 bytes
v6 __dict__ : 256 bytes
I tried asking an AI, but its explanation seemed strange and I couldn't understand it clearly.
Also, if I move the print statements up or down, the results change, so why does the position of the prints matter? A reordering like the one sketched below is the kind of change I mean.
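For example, this variant creates all six instances first and only then prints every size; the exact numbers it produces are not the point, just the shape of the reordering:

import sys

class Vector:
    def __init__(self, x, y):
        self.x = x
        self.y = y

# Create every instance before touching any __dict__, then print all the sizes.
vectors = [Vector(1, 2) for _ in range(6)]
for i, v in enumerate(vectors, start=1):
    print(f"v{i} __dict__ : {sys.getsizeof(v.__dict__)} bytes")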
Why do v2 and v3 differ?
I also found that the code behaves differently across Python versions; the results above are from Python 3.13.
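In case it matters, here is roughly how I tag each run when comparing interpreters (a minimal sketch, nothing beyond printing sys.version next to one measurement):

import sys

print(sys.version)  # the numbers above came from a 3.13 interpreter

class Vector:
    def __init__(self, x, y):
        self.x = x
        self.y = y

v = Vector(1, 2)
print(f"fresh instance __dict__ : {sys.getsizeof(v.__dict__)} bytes")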
This is a really strange phenomenon to me. Thank you all.