Understand that in classes, in teaching materials, and especially on the internet, people make simplifications. Your teacher probably made a simplification. Or learned it wrong and is passing it along, I don't know :P
Every definition of a term needs a context. A term used in one context can mean one thing; in another context it may mean something else. The Java language is free to define what it considers primitive types; it is the language's prerogative to define this within its own context.
And even if it were not, Java could still say String is primitive if it wanted to.
What you learned in another discipline applies to that discipline, to that context. In that context, probably only the types that most processors have dedicated instructions to manipulate are considered primitive.
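For reference, Java defines exactly eight primitive types, and they all map to values that typical processors can manipulate with dedicated instructions:

```java
public class Primitives {
    public static void main(String[] args) {
        byte    b = 1;      // 8-bit signed integer
        short   s = 2;      // 16-bit signed integer
        int     i = 3;      // 32-bit signed integer
        long    l = 4L;     // 64-bit signed integer
        float   f = 5.0f;   // 32-bit IEEE 754 floating point
        double  d = 6.0;    // 64-bit IEEE 754 floating point
        char    c = 'x';    // 16-bit UTF-16 code unit
        boolean t = true;   // true or false
        System.out.println(b + " " + s + " " + i + " " + l + " "
                + f + " " + d + " " + c + " " + t);
    }
}
```

Notice that String is not on that list.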
In the Java documentation there is a definition that String is part of the language and supported directly by the JVM and the compiler; that's all. It is not defined as primitive. You can read more about this in Oracle's "official" tutorial. Materials often use the wrong terms, people reproduce them, and then others learn it wrong.
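To see what "supported directly by the language" means in practice, here is a minimal sketch: String gets literal syntax and the + operator from the compiler, yet it is still an ordinary class with methods, unlike a true primitive such as int:

```java
public class StringIsAClass {
    public static void main(String[] args) {
        // Language support: literal syntax and + concatenation.
        String greeting = "Hello, " + "world";

        // But String is a class, so it has methods and a runtime class...
        System.out.println(greeting.length());    // 12
        System.out.println(greeting.getClass());  // class java.lang.String

        // ...while a primitive like int has neither:
        int n = 42;
        // n.getClass();  // would not compile: int is not an object
        System.out.println(n);
    }
}
```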
If you know English, you can read more about these types on Wikipedia.
Some people do not like this terminology. C# and other languages do not even use it.
What can be considered a primitive type is a type with value semantics, that is, a type "by value".
The String type is not a value type per se. A String variable holds a reference (a pointer) to the string object, so it is a reference type. But for all practical purposes it works as if it were a value type: it has its own identity, and it is not possible to change just part of it (it is immutable).
Maybe the confusion comes from there.
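A small sketch of that value-like behavior: because String is immutable, "modifying" one always produces a new object, and other references to the original are unaffected:

```java
public class StringValueLike {
    public static void main(String[] args) {
        String a = "java";
        String b = a;           // b refers to the same object as a

        // "Changing" a does not touch the object; it rebinds a to a new String.
        a = a.toUpperCase();
        System.out.println(a);  // JAVA
        System.out.println(b);  // java (the original object is intact)

        // You cannot change part of a String in place; methods like
        // replace() return a brand-new String instead.
        String c = b.replace('j', 'J');
        System.out.println(b);  // java
        System.out.println(c);  // Java
    }
}
```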
I do not know if I should link this, but there is an answer that helped me understand the difference between value types and reference types, although it is in C#. Do not confuse the two: Java is a bit different. The basic idea is the same, but Java, at least up to version 8, does not let you create your own value types. This may also help (or confuse you more, I do not know).
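To make that last point concrete, here is a minimal sketch (the Point class is just a hypothetical example): any class you define in Java is a reference type, while primitives are copied by value, and there is no way to give your own class value semantics:

```java
public class ValueVsReference {
    // A hypothetical user-defined type; in Java every class
    // like this is a reference type, whether you want it or not.
    static class Point {
        int x;
        Point(int x) { this.x = x; }
    }

    public static void main(String[] args) {
        // Reference semantics: p and q refer to the same object,
        // so a change made through q is visible through p.
        Point p = new Point(1);
        Point q = p;
        q.x = 99;
        System.out.println(p.x);  // 99

        // Value semantics: primitives are copied on assignment,
        // so changing j leaves i alone.
        int i = 1;
        int j = i;
        j = 99;
        System.out.println(i);    // 1
    }
}
```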