You may reason about the behavior of Java code involving immutable types exactly as if they were primitive types because:
1) "Immutable" and "primitive" are synonyms; there is no difference between them
2) computations involving immutable types are just as efficient as those involving primitive types
3) aliasing, which can happen with immutable types but not with primitive types, cannot cause trouble, because the value of an immutable object can never change (see the sketch after this list)
4) any code that uses an immutable type in a way that would not behave like a primitive type causes a Java compile-time error
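
To make the aliasing point in option 3 concrete, here is a minimal Java sketch (the class name AliasingDemo is just illustrative) contrasting an alias to an immutable String, which is harmless, with an alias to a mutable ArrayList, where a change made through one reference is observed through the other:

```java
import java.util.ArrayList;
import java.util.List;

public class AliasingDemo {
    public static void main(String[] args) {
        // Immutable type: two names for the same String object.
        String a = "hello";
        String b = a;                     // b aliases the same object as a
        String c = b.toUpperCase();       // builds a NEW String; the shared object is untouched
        System.out.println(a + " " + c);  // hello HELLO -- the alias caused no surprise

        // Mutable type: aliasing lets one name change what the other sees.
        List<Integer> xs = new ArrayList<>(List.of(1, 2, 3));
        List<Integer> ys = xs;            // ys aliases the same list object
        ys.add(4);                        // mutates the shared object
        System.out.println(xs);           // [1, 2, 3, 4] -- xs changed without being touched directly
    }
}
```

Running this prints "hello HELLO" and then [1, 2, 3, 4]: the immutable alias behaves like an independent primitive value, while the mutable alias allows action at a distance.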