Can Java primitives be considered light objects [closed] - java

As per this answer here, both Java objects and primitives go on the heap. So from the point of view of the JVM, are objects and primitives similar, except that objects take more space on the heap? In essence, are primitives nothing but 'light' objects?

Java primitives are not "light objects". They are primitives. They fail as objects in two very significant ways: they cannot go into Collection objects and they do not have methods.
They also do not go on the heap, except as fields of an actual Java object. You cannot do new int. Note also that when you declare a local variable of a primitive type, the variable itself comes into existence. When you declare a local variable of an object type, all you get is a reference variable; no object of the declared type is allocated simply by declaring it (a field of reference type defaults to null, and a local one must be assigned before use).
Note that autoboxing blurs the distinction somewhat, but the distinction is definitely there.
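To make those two failings concrete, here is a minimal sketch (the class name is mine, not from the answer). An int has no methods and cannot be used as a collection's type argument; autoboxing quietly wraps the value in an Integer, which is what blurs the distinction:

import java.util.ArrayList;
import java.util.List;

public class PrimitiveVsObject {
    public static void main(String[] args) {
        int count = 42;            // a primitive: just a value, no methods
        // count.toString();       // does not compile -- int has no methods

        Integer boxed = count;     // autoboxing wraps the value in an Integer object
        System.out.println(boxed.toString());

        List<Integer> values = new ArrayList<>();
        values.add(count);         // legal only because of autoboxing to Integer
        // List<int> wrong;        // does not compile -- type arguments must be reference types
    }
}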

There is a bit of confusion here. The answer you're linking to in your question says that primitives inside an object can be on the heap. Primitives can't be on the heap by themselves.
You can't reference an int the way you reference an object; primitive values are accessed directly, without being "dereferenced".

You're extrapolating from the fact that primitives can go on the heap (as part of other objects) to conclude that they could be lightweight objects. A set of primitives makes up the state of an object; they're not objects by themselves.
Primitives just have a value. They don't have state and behaviour the way objects do, and they don't exhibit inheritance, polymorphism, etc. They behave not like entities but like the attributes of entities.
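As a small illustration of that last point (the Point class here is hypothetical), the primitives are the attributes that make up an object's state, and they reach the heap only as part of that object:

public class Point {
    // these primitives live on the heap only as part of a Point instance
    private int x;
    private int y;

    public Point(int x, int y) {
        this.x = x;
        this.y = y;
    }

    public int getX() { return x; }   // the int is an attribute; the Point is the entity
    public int getY() { return y; }
}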

Related

Deep copying in getters for mutable object [closed]

Would it be overhead to always return a copy of a collection/object field?
Clearly, yes, it would be an overhead ... compared with returning a reference or a shallow copy.
But that's not really the point. The real point is whether the overhead is warranted / necessary, and whether it can be avoided by using some other data structure / design / technique. The answer to those questions depends on the context.
Here are some illustrations:
1. If a target object getter returns an immutable object, a copy is unnecessary. Example: any getter that returns a String.
2. If a target object getter returns an object that is not part of the target object's abstraction, a copy is undesirable. Example: List.get(int), Iterator.next().
3. If a target object getter returns a mutable object (or array) AND the returned object is part of the object's internal state AND the target doesn't necessarily trust the caller, then the getter should either copy it or wrap it ... or there may be a security problem. (See the sketch below.)
4. The same may apply in non-security-related contexts; e.g. ArrayList.toArray(...) copies the list into a separate array rather than returning the list's backing array. (Similarly for getChars() on a String, StringBuffer, etc.) This is all about maintaining the abstraction boundary so that one class won't "break" another.
5. If a target object getter returns a mutable object (or array) AND the returned object is part of the object's internal state BUT the target object's API / abstraction boundary is designed to be "porous" (e.g. for performance reasons), then copying may be self-defeating.
Of these, 3 is the only case where cloning is strictly mandatory. In 2, 4 and 5 you could argue that it is a matter of how you design the public (or internal) APIs for the classes, libraries, applications. And often there are many viable choices.
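As a sketch of case 3 (the class and field names are made up for illustration), a getter can defend internal mutable state either by copying it or by wrapping it:

import java.util.ArrayList;
import java.util.Collections;
import java.util.Date;
import java.util.List;

public class Order {
    private final Date created = new Date();           // mutable internal state
    private final List<String> items = new ArrayList<>();

    public Date getCreated() {
        return new Date(created.getTime());             // defensive copy
    }

    public List<String> getItems() {
        return Collections.unmodifiableList(items);     // wrap instead of copying
    }
}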
It is overhead for sure, but there are already some framework classes that do this. It is also described in the book Effective Java.
Remember:
"Classes should be immutable unless there's a very good reason to make them mutable....If a class cannot be made immutable, limit its mutability as much as possible."
When you want to create immutable classes, you can use a framework such as this one: http://immutables.github.io
For examples, check this Oracle documentation.
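As a rough sketch of that advice (the class name is invented), an immutable class is final, copies mutable inputs defensively, and never hands out modifiable state:

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public final class Route {                 // final: no mutable subclasses
    private final List<String> stops;      // never reassigned

    public Route(List<String> stops) {
        // defensive copy in, unmodifiable view out
        this.stops = Collections.unmodifiableList(new ArrayList<>(stops));
    }

    public List<String> getStops() {
        return stops;                      // already unmodifiable, so no copy needed here
    }
}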

What is the difference between the terms non-primitive type and object type? [closed]

When I read about Java, I generally see variables described as either primitive type or object type.
When I read about C#, I generally see variables described as either primitive type or non-primitive type.
What is the difference between the terms object type and non-primitive type?
Part of this confusion may be that, in C#, (mostly) everything inherits from Object. To refer to an "object type" in the same way would refer to every type in the language, and would essentially be useless.
In C#, the primitive types are Boolean, Byte, Char, Double, Int16, Int32, Int64, IntPtr, SByte, Single, UInt16, UInt32, UInt64, UIntPtr. These types still inherit from object, though they are treated differently by the language. There are a few types that don't inherit from object, but they are not what you would consider primitives (i.e. interfaces). The list of C# primitives can be acquired with this code, taken from here:
var primitives = typeof(int).Assembly.GetTypes().Where(type => type.IsPrimitive).ToArray();
A more appropriate dichotomy, if you wanted such a thing, would be value types versus reference types. When you begin considering that difference, you can include things such as Enum types and other value types, like structs.
In Java:
The primitive variables fall into 8 data types: boolean, byte, short, int, long, float, double, and char. Each primitive type has its own size in memory.
Reference variables refer to objects (Array, String, ArrayList, StringBuilder, ...), and the size of the reference does not depend on the object referred to.
Differences (see the sketch below):
1. Reference types can be assigned null; primitives cannot.
2. Reference types can be used to call methods (as long as they don't point to null); primitives only hold literal values.
3. Reference types all have the same size; the size of a primitive depends on its data type.
4. Primitive type names start with a lowercase letter; Java class names start with an uppercase letter.
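Here is a small sketch of differences 1 and 2 (the variable names are arbitrary):

public class ReferenceVsPrimitive {
    public static void main(String[] args) {
        String name = null;        // a reference type can be assigned null
        // int n = null;           // does not compile -- a primitive cannot be null

        name = "hello";
        System.out.println(name.length());   // methods are called through the reference

        int n = 42;                // a primitive just holds a literal value
        System.out.println(n);
    }
}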

Is Java a "completely" object-oriented language? [closed]

Recently at a job interview I was asked "Is Java a "completely" object-oriented language?"
As I was completely unable to answer that question and do not know the answer, could someone please help me understand the nature of this question?
I see the question is being closed as "opinion-based", but I'm not asking for an opinion; I'm asking whether there really is a "completely/incompletely object-oriented" distinction that I'm unaware of.
Kindly tell me if this is the wrong forum to ask this.
Java has primitives. Primitives are not Objects.
Depends on how you look at it, but yes, as @biziclop said, it's a matter of opinion.
To break it down: Java is, as you know, an object-oriented language, but it's still possible to write non-object-oriented, procedural-style code in it (for example, a static method that takes a primitive argument and returns a result).
Since primitives are not objects, you can do non-object programming with Java.
So technically, no, Java is not a completely object-oriented language.
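For instance, here is a sketch of that kind of non-object code (the utility class is hypothetical): a static method that takes primitives and returns a primitive, with no object created or used anywhere:

public final class MathUtil {
    // primitive in, primitive out -- no objects involved
    public static int max(int a, int b) {
        return a > b ? a : b;
    }

    public static void main(String[] args) {
        System.out.println(max(3, 7));   // prints 7
    }
}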
No. Java is not, because it supports primitive data types such as int, byte, long, etc., which are not objects.
There are seven qualities a programming language must satisfy to be purely object-oriented. They are:
Encapsulation/Data Hiding
Inheritance
Polymorphism
Abstraction
All predefined types are objects
All operations are performed by sending messages to objects
All user defined types are objects
Well, Java is not 100% object-oriented, because it still contains primitive data types.
For example:
int i = 0;
Here i is not an object but holds the actual value.
However, with
Set<String> set = new HashSet<String>();
set is a reference that refers to a HashSet object.

Is it bad to have many references to the same object? [closed]

Thinking about this question, I don't think it would be bad, since object references only take up 4 bytes of memory (in a 32-bit JVM), but intuitively I feel like I'm doing something wrong when I have many (100+) references to the same object. This usually happens when I create instances of a certain class 100+ times and each needs to hold the reference.
I know I can re-design my code in most cases to avoid this, but sometimes it's much easier to just keep the reference within each object.
Anyway, is it bad to have many references to the same object?
Having many references to the same object is unavoidable, but it has no disadvantage IMHO.
Every class has a reference to it from every instance of that class. Each class loader is referenced by every class it has loaded. The empty String is often the most-referenced object, with tens of thousands of references being common.
I know I can re-design my code in most cases to avoid this, but sometimes it's much easier to just keep the reference within each object.
I suggest you do what you believe is simplest and clearest; that will be easiest to maintain.
Thinking about this question, I don't think it would be bad, since object references only take up 4 bytes of memory (in a 32-bit JVM), but intuitively I feel like I'm doing something wrong when I have many (100+) references to the same object.
From a performance/resource-utilization standpoint, references are way more efficient than creating and destroying objects. Lots of itty-bitty objects can fragment the heap and tax the memory manager / garbage collector. This is why it's usually worth the effort to make immutable objects singletons. Construction of even small objects in Java is more expensive than using references.
Most programs won't notice any significant difference, but some will.
This usually happens when I create instances of a certain class 100+ times and each needs to hold the reference.
If every instance of a class references that object, use a static rather than instance variable to store the reference. You can use a static initializer to allocate it, or create a factory method to instantiate objects of the class and have the factory method allocate the referenced object the first time it is invoked.
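A rough sketch of the static-field approach (the class names are made up): the shared object is allocated once by the static initializer, and every instance sees the same one without storing its own reference:

public class Widget {
    // one shared helper object for all Widget instances, allocated once
    private static final Helper SHARED_HELPER = new Helper();

    Helper helper() {
        return SHARED_HELPER;    // same object for every Widget
    }

    static class Helper { }      // stand-in for the object being referenced 100+ times
}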

Why do we need to declare the variable before we use it in some languages, but not in others? [closed]

How is memory allocated for a variable declaration in Python vs. Java? Without a compilation step how does an interpreted language know how much memory is needed for a variable?
Before being usable, variables must be allocated a memory location and then initialized--whether in Java, Python, or even Logo.
Declaring means that you make that variable come to life with a specific snippet of code; using Java as an example, something like:
int i;
Person p;
These are declared, but not initialized. They are now assigned a location in memory--which, in some languages, may be ever-changing, both in location and size. But regardless, there is now some physical location in memory that the runtime environment can query, to retrieve the variable (either an indirect pointer to it, or the actual location itself).
Now that there is an empty "box" for the value to go into, it must be filled, which is to say the variable must be "initialized":
i = 3;
p = new Person();
Now there is something concrete in the box, ready for use. In Java, attempting to read a local variable before it is initialized is a compile-time error, and dereferencing a reference-type field that is still null results in a NullPointerException.
Some languages require you to declare variables, in order to explicitly allocate memory for it (location and/or size). Some languages do this memory-allocation for you. As stated in the comments to both your question and this answer, there's a lot of variation.
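A short Java sketch of the declare-then-initialize point above (the names are arbitrary): using a local variable before it is initialized is rejected at compile time, while dereferencing a reference-type field that is still null fails at run time:

public class DeclareThenInitialize {
    static String name;                    // a field of reference type defaults to null

    public static void main(String[] args) {
        int i;                             // declared, not initialized
        // System.out.println(i);          // does not compile: i might not have been initialized

        i = 3;                             // initialized, now usable
        System.out.println(i);

        System.out.println(name.length()); // compiles, but throws NullPointerException at run time
    }
}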
