incompatible types: int cannot be converted to boolean
I'm interested in why C allows this and Java does not. Therefore, I'm interested in the language's type system, specifically how strongly typed it is.
There are two parts to your question:
Why does Java not convert `int` to `boolean`?
This boils down to Java being intended to be as explicit as possible. It is very static, very "in your face" with its type system. Things that are automatically type-cast in other languages are not so in Java. You have to write `int a = (int) 0.5` as well. Converting `float` to `int` would lose information; so would converting `int` to `boolean`, and it would thus be error-prone. Also, they would have had to specify a lot of combinations. Sure, these things seem obvious, but they intended to err on the side of caution.
Oh, and compared to other languages, Java was hugely exact in its specification, since the bytecode was not just an internal implementation detail. They would have had to specify any and all interactions precisely. A huge undertaking.
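For illustration, here is a minimal sketch of what the Java compiler accepts and rejects (the commented-out lines are the ones that fail to compile with exactly this kind of error):

```java
int flags = 1;
double d = 0.5;

// if (flags) { ... }   // error: incompatible types: int cannot be converted to boolean
if (flags != 0) {        // the comparison has to be written out explicitly
    System.out.println("flag set");
}

// int a = d;            // error: incompatible types: possible lossy conversion from double to int
int a = (int) d;         // the narrowing cast has to be spelled out
```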
Why does `if` not accept other types than `boolean`?
`if` could perfectly well be defined to allow other types than `boolean`. It could have a definition that says the following are equivalent:
- `true`
- `int != 0`
- `String` with `.length > 0`
- Any other object reference that is non-`null` (and not a `Boolean` with value `false`).
- Or even: any other object reference that is non-`null` and whose method `Object.check_if` (invented by me just for this occasion) returns `true`.
They didn't; there was no real need to, and they wanted to have it as robust, static, transparent, easy to read etc. as possible. No implicit features. Also, the implementation would be pretty complex, I'm sure, having to test each value for all possible cases, so performance may have played a small factor as well (Java used to be sloooow on the computers of that day; remember there were no JIT compilers with the first releases, at least not on the computers I used then).
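Just as a sketch of the invented rules above (nothing like this exists in real Java, and `truthy` is a made-up name), such a check might have looked like this:

```java
// Hypothetical helper; no such method exists in the JDK.
static boolean truthy(Object value) {
    if (value == null) return false;                                 // null is falsey
    if (value instanceof Boolean) return (Boolean) value;            // Boolean keeps its own value
    if (value instanceof Integer) return (Integer) value != 0;       // int != 0
    if (value instanceof String) return ((String) value).length() > 0; // String with .length > 0
    return true;                                                      // any other non-null reference
}
```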
Deeper reason
A deeper reason could well be the fact that Java has its primitive types, hence its type system is torn between objects and primitives. Maybe, if they had avoided those, things would have turned out another way. With the rules given in the previous section, they would have had to define the truthiness of every single primitive explicitly (since the primitives don't share a super class, and there is no well-defined `null` for primitives). This would turn into a nightmare, quickly.
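To illustrate that point (again purely hypothetical code, not an actual Java API): since the primitives share no super class, each one would need its own explicitly defined rule, roughly one overload per type:

```java
// Invented overloads just to show the combinatorial burden; none of this exists in Java.
static boolean truthy(boolean b) { return b; }
static boolean truthy(byte b)    { return b != 0; }
static boolean truthy(short s)   { return s != 0; }
static boolean truthy(char c)    { return c != 0; }
static boolean truthy(int i)     { return i != 0; }
static boolean truthy(long l)    { return l != 0L; }
static boolean truthy(float f)   { return f != 0.0f; }  // and what about NaN?
static boolean truthy(double d)  { return d != 0.0; }   // and what about NaN?
static boolean truthy(Object o)  { return o != null; }  // plus a rule for every boxed type
```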
Outlook
Well, and in the end, maybe it's just a preference of the language designers. Each language seems to spin their own way there...
For example, Ruby has no primitive types. Everything, literally everything, is an object. They have a very easy time making sure that every object has a certain method.
Ruby does look for truthiness on all types of objects you can throw at it. Interestingly enough, it still has no `boolean` type (because it has no primitives), and it has no `Boolean` class either. If you ask what class the value `true` has (handily available with `true.class`), you get `TrueClass`. That class actually does have methods, namely the 4 operators for booleans (`|`, `&`, `^`, `==`). Here, `if` considers its value falsey if and only if it is either `false` or `nil` (the `null` of Ruby). Everything else is true. So, `0` or `""` are both true.
It would have been trivial for them to create a method `Object#truthy?` which could be implemented for any class and return an individual truthiness. For example, `String#truthy?` could have been implemented to be true for non-empty strings, or whatnot. They didn't, even though Ruby is the antithesis of Java in most departments (dynamic duck-typing with mixins, re-opening classes and all that).
Which might be surprising to a Perl programmer who is used to `$value <> 0 || length($value) > 0 || defined($value)` being the truthiness. And so on.
Enter SQL with its convention that `null` inside any expression automatically makes it false, no matter what. So `(null==null)` = false. In Ruby, `(nil==nil)` = true. Happy times.