I am looking into improving my programming skills (actually, I try to do my best to suck less each year, as our Jeff Atwood put it), so I was thinking of reading up on metaprogramming and self-explanatory code.
I am looking for something like an idiot's guide to this (free books for download, online resources). I also want more than the average wiki page, and something language-agnostic or, preferably, with Java examples.
Do you know of resources that would let me put all of this into practice efficiently? (I know experience counts for a lot here, but I would like to build that experience while avoiding the usual flow of bad decisions - experience - good decisions.)
EDIT:
Something along the lines of this example from The Pragmatic Programmer:
...implement a mini-language to control a simple drawing package... The language consists of single-letter commands. Some commands are followed by a single number. For example, the following input would draw a rectangle:
P 2 # select pen 2
D # pen down
W 2 # draw west 2cm
N 1 # then north 1
E 2 # then east 2
S 1 # then back south
U # pen up
Thank you!
Welcome to the wonderful world of meta-programming :) Meta-programming actually relates to many things. I will try to list what comes to mind:
Macro. The ability to extend the syntax and semantics of a programming language was first explored under the name macro. Several languages have constructs which resemble macros, but the language of choice is of course Lisp. If you are interested in meta-programming, understanding Lisp and its macro system (and the homoiconic nature of the language, where code and data have the same representation) is definitely a must. If you want a Lisp dialect that runs on the JVM, go for Clojure. A few resources:
Clojure mini language
Beating the Averages (why Lisp is a secret weapon)
There is otherwise plenty of material about Lisp.
DSL. The ability to extend a language's syntax and semantics has now been rebranded under the term "DSL". The easiest way to create a DSL is with the interpreter pattern (see the Java sketch just below). Then come internal DSLs with fluent interfaces, and external DSLs (per Fowler's terminology). Here is a nice video I watched recently:
DSL: what, why, how
The other answers already pointed to resources in this area.
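To make the interpreter-pattern idea concrete, here is a minimal Java sketch of the asker's single-letter drawing language. It is my own illustration rather than something from the linked resources, and the "drawing" is only printed:

import java.util.Scanner;

// Interpreter-pattern sketch for the mini-language in the question:
// P <n> selects a pen, D/U put the pen down/up, N/S/E/W <n> move by n cm.
public class DrawingInterpreter {

    private int pen;          // currently selected pen
    private boolean penDown;  // is the pen touching the paper?
    private int x, y;         // current position in cm

    public void run(String script) {
        for (String rawLine : script.split("\n")) {
            // Strip trailing "# ..." comments and whitespace before parsing.
            String line = rawLine.replaceAll("#.*", "").trim();
            if (line.isEmpty()) {
                continue;
            }
            Scanner tokens = new Scanner(line);
            char command = tokens.next().charAt(0);
            int argument = tokens.hasNextInt() ? tokens.nextInt() : 0;
            execute(command, argument);
        }
    }

    private void execute(char command, int argument) {
        switch (command) {
            case 'P': pen = argument; break;
            case 'D': penDown = true; break;
            case 'U': penDown = false; break;
            case 'N': move(0, argument); break;
            case 'S': move(0, -argument); break;
            case 'E': move(argument, 0); break;
            case 'W': move(-argument, 0); break;
            default: throw new IllegalArgumentException("Unknown command: " + command);
        }
    }

    private void move(int dx, int dy) {
        x += dx;
        y += dy;
        if (penDown) {
            System.out.printf("draw with pen %d to (%d, %d)%n", pen, x, y);
        }
    }

    public static void main(String[] args) {
        new DrawingInterpreter().run(
                "P 2 # select pen 2\n" +
                "D   # pen down\n" +
                "W 2 # draw west 2cm\n" +
                "N 1 # then north 1\n" +
                "E 2 # then east 2\n" +
                "S 1 # then back south\n" +
                "U   # pen up\n");
    }
}

Growing the execute method into a table of command objects is the usual next step once the command set stops being trivial.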
Reflection. Meta-programming is also inseparable from reflection. The ability to reflect on the program structure at run-time is immensely powerful, so it's important to understand what introspection, intercession and reification are. IMHO, reflection permits two broad categories of things: 1. the manipulation of data whose structure is not known at compile time (the structure of the data is provided at run-time and the program still works, reflectively); 2. powerful programming patterns such as dynamic proxies, factories, etc. Smalltalk is the language of choice for exploring reflection, since everything in it is reflective. But I think Ruby is also a good candidate, with a community that leverages meta-programming heavily (though I don't know much about Ruby myself).
Smalltalk: a reflective language
Magritte: a meta-driven approach to empower developers and end-users
There is also a rich literature on reflection.
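As a small taste of the dynamic proxy pattern mentioned above, here is a minimal Java sketch; the Greeter interface and the logging behaviour are invented for the example:

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public class ProxyDemo {

    interface Greeter {
        String greet(String name);
    }

    public static void main(String[] args) {
        Greeter real = name -> "Hello, " + name;

        // The handler intercepts every call made through the proxy, reflectively.
        InvocationHandler handler = (proxy, method, methodArgs) -> {
            System.out.println("about to call " + method.getName());
            return method.invoke(real, methodArgs);
        };

        Greeter proxied = (Greeter) Proxy.newProxyInstance(
                Greeter.class.getClassLoader(),
                new Class<?>[] {Greeter.class},
                handler);

        System.out.println(proxied.greet("world")); // logged, then delegated
    }
}

The same mechanism underlies many Java frameworks that add behaviour (transactions, logging, remoting) around plain interfaces.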
Annotations. Annotations could be seen as a subset of the reflective capabilities of a language, but I think they deserve their own category. I already answered once what annotations are and how they can be used. Annotations are meta-data that can be processed at compile-time or at run-time. Java has good support for them with the annotation processor tool, the Pluggable Annotation Processing API, and the mirror API.
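Here is a minimal Java sketch of the run-time side of this; the @Audited annotation and the Payments class are invented for the example:

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;

public class AnnotationDemo {

    // Meta-data we attach to methods; RUNTIME retention keeps it visible to reflection.
    @Retention(RetentionPolicy.RUNTIME)
    @interface Audited {
        String value() default "";
    }

    static class Payments {
        @Audited("records a money transfer")
        void transfer() { /* ... */ }

        void ping() { /* not annotated, ignored below */ }
    }

    public static void main(String[] args) {
        // Read the meta-data back reflectively and act on it.
        for (Method m : Payments.class.getDeclaredMethods()) {
            Audited audited = m.getAnnotation(Audited.class);
            if (audited != null) {
                System.out.println(m.getName() + " is audited: " + audited.value());
            }
        }
    }
}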
Byte-code or AST transformation. This can be done at compile-time or at run-time. It is a somewhat low-level approach, but it can also be considered a form of meta-programming (in a sense, it is the equivalent of macros for non-homoiconic languages).
DSL with Groovy (there is an example at the end that shows how you can plug in your own AST transformation with annotations).
Conclusion: meta-programming is the ability of a program to reason about itself or to modify itself, just like Meta Stack Overflow is the place to ask questions about Stack Overflow itself. Meta-programming is not one specific technique, but rather an ensemble of concepts and techniques.
Several things fall under the umbrella of meta-programming. From your question, you seem most interested in the macro/DSL part. But everything is ultimately related, so the other aspects of meta-programming are also definitely worth looking at.
PS: I know that most of the links I've provided are not tutorials or introductory articles. They are resources that I like which describe the concept and the advantages of meta-programming, which I think is the more interesting part.
I've mentioned C++ template metaprogramming in my comment above, so let me provide a brief example using it. I'm aware that you tagged your question with java, yet this may still be insightful. I hope you will be able to follow the C++ code.
Demonstration by example:
Consider the following recursive function, which generates the Fibonacci series (0, 1, 1, 2, 3, 5, 8, 13, ...):
unsigned int fib(unsigned int n)
{
    return n >= 2 ? fib(n-2) + fib(n-1) : n;
}
To get an item from the Fibonacci series, you call this function -- e.g. fib(5) --, and it will compute the value and return it to you. Nothing special so far.
But now, in C++ you can re-write this code using templates (somewhat similar to generics in Java) so that the Fibonacci series won't be generated at run-time, but during compile-time:
// fib(n) := fib(n-2) + fib(n-1)
template <unsigned int n>
struct fib // <-- this is the generic version fib<n>
{
    static const unsigned int value = fib<n-2>::value + fib<n-1>::value;
};

// fib(0) := 0
template <>
struct fib<0> // <-- this overrides the generic fib<n> for n = 0
{
    static const unsigned int value = 0;
};

// fib(1) := 1
template <>
struct fib<1> // <-- this overrides the generic fib<n> for n = 1
{
    static const unsigned int value = 1;
};
To get an item from the Fibonacci series using this template, simply retrieve the constant value -- e.g. fib<5>::value.
Conclusion ("What does this have to do with meta-programming?"):
In the template example, it is the C++ compiler that generates the Fibonacci series at compile-time, not your program while it runs. (This is obvious from the fact that in the first example, you call a function, while in the template example, you retrieve a constant value.) You get your Fibonacci numbers without writing a function that computes them! Instead of programming that function, you have programmed the compiler to do something for you that it wasn't explicitly designed for... which is quite remarkable.
This is therefore one form of meta-programming:
Metaprogramming is the writing of computer programs that write or manipulate other programs (or themselves) as their data, or that do part of the work at compile time that
would otherwise be done at runtime.
-- Definition from the Wikipedia article on metaprogramming, emphasis added by me.
(Note also the side-effects in the above template example: As you make the compiler pre-compute your Fibonacci numbers, they need to be stored somewhere. The size of your program's binary will increase proportionally to the highest n that's used in expressions containing the term fib<n>::value. On the upside, you save computation time at run-time.)
From your example, it seems you are talking about domain specific languages (DSLs), specifically, Internal DSLs.
Here is a large list of books about DSLs in general (including widely used DSLs like SQL).
Martin Fowler has a book that is a work in progress and is currently online.
Ayende wrote a book about DSLs in Boo.
Update: (following comments)
Metaprogramming is about creating programs that control other programs (or their data), sometimes using a DSL. In this respect, batch files and shell scripts can be considered to be metaprogramming as they invoke and control other programs.
The example you have shows a DSL that may be used by a metaprogram to control a painting program.
Tcl started out as a way of making domain-specific languages that didn't suck as they grew in complexity, to the point where they needed generic programming capabilities. Moreover, it remains very easy to add your own commands, precisely because that is still an important use case for the language.
If you want an implementation integrated with Java, Jacl is an implementation of Tcl in Java which provides scriptability focused towards DSLs as well as access to any Java object.
(Metaprogramming is writing programs that write programs. Some languages do it far more than others. To pick a few specific cases: Lisp is the classic example of a language that does a lot of metaprogramming; C++ tends to relegate it to templates rather than permitting it at runtime; scripting languages all tend to find metaprogramming easier because their implementations are written to be more flexible that way, though that's just a matter of degree.)
Well, in the Java ecosystem, I think the simplest way to implement a mini-language is to use a scripting language such as Groovy or Ruby (yes, I know, Ruby is not a native citizen of the Java ecosystem). Both offer rather good DSL specification mechanisms that will let you do this with far more simplicity than the Java language would:
Writing DSL in Groovy
Creating Ruby DSL
There are, however, pure Java alternatives, but I think they'll be a little harder to implement.
You can have a look at the Eclipse Modeling Project; they've got support for meta-models.
There's a course on Pluralsight about Metaprogramming which might be a good entry point https://app.pluralsight.com/library/courses/understanding-metaprogramming/table-of-contents
Hi, the following post says there is "built-in dependency injection" in Scala:
"As a Scala and Java developer, I am not even slightly tempted to
replace Scala as my main language for my next project with Java 8. If
I'm forced to write Java, it might better be Java 8, but if I have a
choice, there are so many things (as the OP correctly states) that
make Scala compelling for me beyond Lambdas that just adding that
feature to Java doesn't really mean anything to me. Ruby has Lambdas,
so does Python and JavaScript, Dart and I'm sure any other modern
language. I like Scala because of so many other things other than
lambdas that a single comment is not enough.
But to name a few (some were referenced by the OP)
Everything is an expression, For
comprehensions (especially with multiple futures, resolving the
callback triangle of death in a beautiful syntax IMHO), Implicit
conversions, Case classes, Pattern Matching, Tuples, The fact that
everything has equals and hashcode already correctly implemented (so I
can put a tuple, or even an Array as a key in a map), string
interpolation, multiline string, default parameters, named parameters,
built in dependency injection, most complex yet most powerful type
system in any language I know of, type inference (not as good as
Haskell, but better than the non existent in Java). The fact I always
get the right type returned from a set of "monadic" actions thanks to
infamous things like CanBuildFrom (which are pure genius). Let's not
forget pass by name arguments and the ability to construct a DSL.
Extractors (via pattern matching). And many more.
I think Scala is
here to stay, at least for Scala developers, I am 100% sure you will
not find a single Scala developer that will say: "Java 8 got lambdas?
great, goodbye scala forever!". Only reason I can think of is compile
time and binary compatibility. If we ignore those two, all I can say
is that this just proves how Scala is in the right direction (since
Java 8 lambdas and default interface methods and steams are so clearly
influenced)
I do wish however that Scala will improve Java 8
interoperability, e.g. support functional interfaces the same way. and
add new implicit conversions to Java 8 collections as well as take
advantage to improvements in the JVM.
I will replace Scala as soon as
I find a language that gives me what Scala does and does it better. So
far I didn't find such a language (examined Haskell, Clojure, Go,
Kotlin, Ceylon, Dart, TypeScript, Rust, Julia, D and Nimrod, Ruby
Python, JavaScript and C#, some of them were very promising but since
I need a JVM language, and preferably a statically typed one, it
narrowed down the choices pretty quickly)
Java 8 is by far not even
close, sorry. Great improvement, I'm very happy for Java developers
that will get "permission" to use it (might be easier to adopt than
Scala in an enterprise) but this is not a reason for a Scala shop to
consider moving back to Java." [1]
What exactly is the built-in dependency injection in Scala?
It's not a discrete language feature. I think the author was referring to the fact that Scala's feature set is flexible enough to support a number of techniques that could be said to accomplish DI:
the cake pattern, building on the trait system
the Reader monad, building on higher-kinded types
DI through currying, building on functional techniques
using implicit class parameters, building on Scala's concept of implicits
in my own project, we accomplish DI by requiring function values in the class constructor explicitly
This diversity is rather emblematic of Scala. The language was designed to implement a number of very powerful concepts, mostly orthogonally, resulting in multiple valid ways to solve many problems. The challenge as a Scala programmer is to understand this breadth and then make an intelligent choice for your project. A lot of times, that choice depends on what paradigms are being used internally to implement your components.
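For readers coming from Java, the last technique in that list, passing behaviour in as function values through the constructor, looks roughly like this in plain Java (the names are invented for illustration; the Scala version reads the same way with less ceremony):

import java.util.function.Supplier;

public class ReportService {

    // The dependency is a piece of behaviour, not a concrete collaborator class.
    private final Supplier<String> connectionString;

    public ReportService(Supplier<String> connectionString) {
        this.connectionString = connectionString;
    }

    public String describe() {
        return "reporting against " + connectionString.get();
    }

    public static void main(String[] args) {
        // Production wiring and test wiring differ only in the function passed in.
        ReportService prod = new ReportService(() -> System.getenv("DB_URL"));
        ReportService test = new ReportService(() -> "jdbc:h2:mem:test");
        System.out.println(test.describe());
    }
}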
Much of my programming background is in Java, and I'm still doing most of my programming in Java. However, I'm starting to learn Python for some side projects at work, and I'd like to learn it as independent of my Java background as possible - i.e. I don't want to just program Java in Python. What are some things I should look out for?
A quick example: when looking through the Python tutorial, I came across the fact that a mutable default parameter value of a function (such as a list) is evaluated only once and persists from call to call. This was counter-intuitive to me as a Java programmer and hard to get my head around. (See here and here if you don't understand the example.)
Someone also provided me with this list, which I found helpful, but short. Anyone have any other examples of how a Java programmer might tend to misuse Python...? Or things a Java programmer would falsely assume or have trouble understanding?
Edit: OK, to prevent duplicates in the answers (as suggested by Bill the Lizard), here is a brief overview of the points covered by the article I linked to. (Please let me know if I make a mistake in phrasing; I've only just started with Python, so I may not understand all the concepts fully. And a disclaimer: these are going to be very brief, so if you don't understand what one of them is getting at, check out the link.)
A static method in Java does not translate to a Python classmethod
A switch statement in Java translates to a hash table in Python
Don't use XML
Getters and setters are evil (hey, I'm just quoting :) )
Code duplication is often a necessary evil in Java (e.g. method overloading), but not in Python
(And if you find this question at all interesting, check out the link anyway. :) It's quite good.)
Don't put everything into classes. Python's built-in list and dictionaries will take you far.
Don't worry about keeping one class per module. Divide modules by purpose, not by class.
Use inheritance for behavior, not interfaces. Don't create an "Animal" class for "Dog" and "Cat" to inherit from, just so you can have a generic "make_sound" method.
Just do this:
class Dog(object):
    def make_sound(self):
        return "woof!"

class Cat(object):
    def make_sound(self):
        return "meow!"

class LolCat(object):
    def make_sound(self):
        return "i can has cheezburger?"
The referenced article has some good advice that can easily be misquoted and misunderstood. And some bad advice.
Leave Java behind. Start fresh. "do not trust your [Java-based] instincts". Saying things are "counter-intuitive" is a bad habit in any programming discipline. When learning a new language, start fresh, and drop your habits. Your intuition must be wrong.
Languages are different. Otherwise, they'd be the same language with different syntax, and there'd be simple translators. Because there are not simple translators, there's no simple mapping. That means that intuition is unhelpful and dangerous.
"A static method in Java does not translate to a Python classmethod." This kind of thing is really limited and unhelpful. Python has a staticmethod decorator. It also has a classmethod decorator, for which Java has no equivalent.
This point, BTW, also included the much more helpful advice on not needlessly wrapping everything in a class. "The idiomatic translation of a Java static method is usually a module-level function".
The Java switch statement can be translated to Python in several ways. First and foremost, it's usually an if/elif/elif construct. The article is unhelpful in this respect. If you're absolutely sure this is too slow (and can prove it), you can use a Python dictionary as a slightly faster mapping from value to block of code. Blindly translating switch to a dictionary (without thinking) is really bad advice.
Don't use XML. Doesn't make sense when taken out of context. In context it means don't rely on XML to add flexibility. Java relies on describing stuff in XML; WSDL files, for example, repeat information that's obvious from inspecting the code. Python relies on introspection instead of restating everything in XML.
But Python has excellent XML processing libraries. Several.
Getters and setters are not required in Python the way they're required in Java. First, you have better introspection in Python, so you don't need getters and setters to help make dynamic bean objects. (For that, you use collections.namedtuple.)
However, you have the property decorator which will bundle getters (and setters) into an attribute-like construct. The point is that Python prefers naked attributes; when necessary, we can bundle getters and setters to appear as if there's a simple attribute.
Also, Python has descriptor classes if properties aren't sophisticated enough.
Code duplication is often a necessary evil in Java (e.g. method overloading), but not in Python. Correct. Python uses optional arguments instead of method overloading.
The bullet point went on to talk about closures; that isn't as helpful as the simple advice to use default argument values wisely.
One thing you might be used to in Java that you won't find in Python is strict privacy. This is not so much something to look out for as it is something not to look for (I am embarrassed by how long I searched for a Python equivalent to 'private' when I started out!). Instead, Python has much more transparency and easier introspection than Java. This falls under what is sometimes described as the "we're all consenting adults here" philosophy. There are a few conventions and language mechanisms to help prevent accidental use of "unpublic" methods and so forth, but the whole mindset of information hiding is virtually absent in Python.
The biggest one I can think of is not understanding or not fully utilizing duck typing. In Java you're required to specify very explicit and detailed type information upfront. In Python typing is both dynamic and largely implicit. The philosophy is that you should be thinking about your program at a higher level than nominal types. For example, in Python, you don't use inheritance to model substitutability. Substitutability comes by default as a result of duck typing. Inheritance is only a programmer convenience for reusing implementation.
Similarly, the Pythonic idiom is "beg forgiveness, don't ask permission". Explicit typing is considered evil. Don't check whether a parameter is a certain type upfront. Just try to do whatever you need to do with the parameter. If it doesn't conform to the proper interface, it will throw a very clear exception and you will be able to find the problem very quickly. If someone passes a parameter of a type that was nominally unexpected but has the same interface as what you expected, then you've gained flexibility for free.
The most important thing, from a Java POV, is that it's perfectly ok to not make classes for everything. There are many situations where a procedural approach is simpler and shorter.
The next most important thing is that you will have to get over the notion that the type of an object controls what it may do; rather, the code controls what objects must be able to support at runtime (this is by virtue of duck-typing).
Oh, and use native lists and dicts (not customized descendants) as far as possible.
The way exceptions are treated in Python is different from how they are treated in Java. While in Java the advice is to use exceptions only for exceptional conditions, this is not so in Python.
In Python, things like iterators make use of the exception mechanism to signal that there are no more items. Such a design is not considered good practice in Java.
As Alex Martelli puts it in his book Python in a Nutshell, the approach common in other languages (and applicable to Java) is LBYL (Look Before You Leap):
to check in advance, before attempting an operation, for all circumstances that might make the operation invalid.
Whereas with Python the approach is EAFP (it's Easier to Ask Forgiveness than Permission).
A corollary to "Don't use classes for everything": callbacks.
The Java way of doing callbacks relies on passing objects that implement a callback interface (for example ActionListener with its actionPerformed() method). Nothing of this sort is necessary in Python; you can directly pass methods or even locally defined functions:
def handler():
    print("click!")

button.onclick(handler)
Or even lambdas:
button.onclick(lambda: print("click!\n"))
Can anyone help compare and contrast Java and COBOL in terms of technical differences as well as architectural design styles?
Similarities
Cobol and Java were going to change the world and solve the problem of programming.
Neither lived up to the initial hype.
There are now very large, bloated Cobol and Java programs that are used by banks and are "legacy" ... too large and critical to rewrite or throw away.
Cobol introduced the idea of having long, readable names in code. Java recommends long, readable names.
Differences
Cobol was invented by an American, Grace Murray Hopper, who received the highest award by the Department of Defense, the Defense Distinguished Service Medal.
Java was invented by a Canadian, James Gosling, who received Canada's highest civilian honor, an Officer of the Order of Canada.
COBOL convention uses a "-" to separate words in names; Java convention uses upper/lower CamelCase.
COBOL was popular for a simple reason: it was made to develop business applications.
Since the syntax was so clear and human-like, and written in a procedural style, adapting to changes in the business environment was much easier. For example, to assign the value of pi to a variable and then subtract zero from it (a simple example just to show actual COBOL statements/sentences; it is years since I last programmed in Cobol):
MOVE 3.14 TO VARPI.
SUBTRACT ZERO FROM VARPI GIVING VARPIRESULT.
IF VARPIRESULT EQUAL TO VARPI THEN DISPLAY 'Ok'.
If I remember correctly, COBOL sentences also have to sit within fixed columns on each line...
It is that which makes it easier to troubleshoot, because any potential business-logic error can be pinpointed quickly as a result. Not only that: since COBOL ran on mainframe systems, data transfer from files was shifted at a speed light-years ahead of other systems such as PCs, which is another reason data processing in COBOL was blindingly fast.
I worked on Y2K fixes on the mainframe (IBM MVS/360), and at the dawn of the 21st century it was incredible, praying that the fixes I put in wouldn't bring the business applications to their knees... that was the hype. Aside from that, COBOL is still used to this day because of the sheer speed of shuffling data around within mainframes and its ease of maintainability.
For starters, I don't know that Java would simply be able to match that; has Java even got a port available for these mainframes (IBM MVS/360, 390, AS/400)?
Businesses cannot afford to dump COBOL, as they would effectively be 'committing suicide': that is where their business applications reside, which is why upgrading, migrating or porting to a different language is too expensive and would cause a serious headache in the business world today...
Not only that: imagine having to rewrite legacy procedural code that could contain vital business logic to take advantage of Java's OOP style. The end result would be 'lost in translation', requiring a lot of patience, stress and pressure.
Imagine a healthcare company (I have worked for one, which ran on the system I mentioned above) ditching all of its claims processing, billing, etc. (written in COBOL) for Java, with the potential for glitches, not to mention the serious amount of money to invest, which would cost the healthcare company far more. The end result would be chaos and lost money, and customers (corporations that offer employee benefits) would end up dumping the company for a better one.
So to answer your question, I hope I have illustrated the differences - to summarize:
COBOL is:
Procedural language
Simple, human-like syntax
Very fast on mainframe systems
Easy to maintain code due to syntax
In contrast,
Java is:
Object Oriented
Syntax can get complicated
Requires a Java Virtual Machine to run and execute the compiled bytecode.
Hope this helps,
It is easier to point out what they have in common instead of listing their differences.
So here is the list:
You can use both to make the computer do things
They both get compiled to yet a different language (machine code, byte-code)
That is it!
Similarities:
Both extremely verbose and created with pointy-haired bosses, not programmers, in mind.
Both used primarily for boring business software.
Both have huge legacy and are going to be around a while.
Both languages target the "Write Once, Run Anywhere" idea. If vendor specific extensions are avoided, Cobol is very portable.
Cobol is very much a procedural language, while Java is very much an object-oriented language. That said, there have been vendor-specific OO extensions to Cobol for decades, and the newer Cobol standards specify object orientation formally. It is also possible to write procedural code in Java; you can easily make a program out of a single main() method.
Both are widely used in enterprise computing for their relative ease of use. Both languages are somewhat hard to shoot yourself in the foot with, compared with other common languages like C and C++.
The most significant difference is that Cobol supports native fixed-point arithmetic. This is very important when dealing with financials. Most languages, Java included, only support this through library classes rather than native types, so they are orders of magnitude slower when dealing with fixed-point data and prone to (potentially very expensive) errors in that library code.
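To illustrate for Java readers, here is a minimal sketch (the values are invented) contrasting binary floating point with java.math.BigDecimal, the class Java code typically uses for exact decimal amounts:

import java.math.BigDecimal;
import java.math.RoundingMode;

public class MoneyDemo {

    public static void main(String[] args) {
        // Binary floating point cannot represent 0.1 or 0.2 exactly.
        System.out.println(0.10 + 0.20);    // 0.30000000000000004

        // BigDecimal keeps the decimal digits exact, at a run-time cost.
        BigDecimal price = new BigDecimal("0.10");
        BigDecimal tax = new BigDecimal("0.20");
        System.out.println(price.add(tax)); // 0.30

        // Rounding must be requested explicitly, here to 2 decimal places.
        BigDecimal third = BigDecimal.ONE.divide(new BigDecimal("3"), 2, RoundingMode.HALF_UP);
        System.out.println(third);          // 0.33
    }
}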
Cobol is a purely procedural language; it does not even have functions (I used Cobol in the 90s, so this may have changed since).
Java is OO (although I have heard there is an OO version of Cobol too). Oh, and the syntax is different.
Excellent list of similarities and differences: http://www.jsrsys.com/jsrsys/s8383sra.htm
COBOL: COBOL Concept Description
Java: Java/OO Similar Concept
++: What Java/OO adds to Concept
When I began Java, I used to think OO (Object Orientation) was "just like" good programming practices, except that it was more formal and the compiler enforced certain restrictions.
I no longer think that way. However, when you are beginning I think certain "is similar to" examples will help you grasp the concepts.
COBOL: Load Module/Program
Java: Class
COBOL: PERFORM
Java: method
++: can pass parameters to method, more like FUNCTION
Other programs/classes can call methods in different classes if they are declared public; public/private gives the designer much control over what other classes can see inside a class.
COBOL: Working Storage, statically linked sub-routine
Java: instance variables
++: (see next)
COBOL: Working Storage, dynamically loaded sub-routine
Java: Class variables
++: Java can mix both class variables (called static, just the reverse of our COBOL example) and instance variables (the default).
Class variables (static) occur only once per Class (really in one JVM run-time environment).
Instance variables are unique to each instance of a class.
Here is an example from class JsrSysout. From my COBOL background I like to debug my code by DISPLAYing significant data to the SYSOUT data set. There is a Java method for this, System.out.println(...). The problem with that method is that the data you want just scrolls off the Java console, the equivalent of SYSOUT or perhaps DISPLAY UPON CONSOLE if you had your own stand-alone machine. I needed a way to easily do displays that would stop when the screen was full. Since there is only one Java console, the line count for the screen clearly needs to be a class variable, so all instances (each program/class that logs here has its own instance of JsrSysout) stop at the bottom of the screen.
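A stripped-down sketch of that idea (the class below is an illustration, not the actual JsrSysout code): the line count is static because there is one console, while the owner name is per instance:

import java.io.IOException;

public class Sysout {

    private static final int SCREEN_ROWS = 24; // assumed screen height
    private static int linesOnScreen = 0;      // class variable: one console, one count

    private final String owner;                // instance variable: unique per program

    public Sysout(String owner) {
        this.owner = owner;
    }

    public void display(String message) throws IOException {
        if (linesOnScreen >= SCREEN_ROWS) {
            System.out.println("-- press Enter to continue --");
            System.in.read();                  // pause when the screen is full
            linesOnScreen = 0;
        }
        System.out.println(owner + ": " + message);
        linesOnScreen++;
    }
}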
Multiple Instances of same class:
One (calling program) class can create multiple instances of the same class. Why would you want to do this? One good COBOL example is I/O routines. In COBOL you would need to code one I/O routine for each file you wish to access. If you want to open a particular file twice in one run-time environment you would need a different I/O routine with a different name, even if the logic was identical.
With Java you could code just one class for a particular logical file type. Then for each file you wish to read (or write) you simply create another instance of that class using the new operator. Here are some snippets of code from program IbfExtract that do exactly that. This program exploits the fact that I have written a class for Line Input, and another class for Line Output. These are called JsrLineIn and JsrLineOut.
This illustrates another dynamic feature of Java. When the output array is first created, it is an array of null pointers, so it takes very little space. Only when a new object is created, and the pointer to it is implicitly put in the array, does storage for the object get allocated. That object can be anything from a String to a very complex class.
COBOL: PICTURE
Java: No real equivalent.
I therefore invented a method to mimic a ZZZ,ZZZ,... mask for integer input. I have generally grouped my utility functions in JsrUtil. These are methods that really don't relate to any particular type of object. Here is an example of padLeft that implements this logic. padLeft is also a good example of polymorphism: in COBOL, if you have different parameter lists you need different entry points, while in Java the types of the parameters are part of the method definition. For example:
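A rough sketch of what such overloads could look like (the bodies below are illustrative guesses, not the actual JsrUtil code):

public final class PadUtil {

    // Pad the decimal representation of an integer on the left to the given width.
    public static String padLeft(int value, int width) {
        return padLeft(Integer.toString(value), width, ' ');
    }

    // Same name, different parameter list: pad an arbitrary string with a fill character.
    public static String padLeft(String text, int width, char fill) {
        StringBuilder sb = new StringBuilder();
        for (int i = text.length(); i < width; i++) {
            sb.append(fill);
        }
        return sb.append(text).toString();
    }

    public static void main(String[] args) {
        System.out.println(padLeft(42, 10));        // resolved by the int overload
        System.out.println(padLeft("42", 10, '*')); // resolved by the String overload
    }
}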
COBOL: Decimal arithmetic
Java: Not in the native types, but the java.math.BigDecimal classes (whose decimal arithmetic is based on IBM's work) cover it.
I consider this the major weakness of Java for accounting-type applications. I would have liked to see a packed-decimal data type as part of the native JVM byte architecture. I guess it is not there because it is not in C or C++. I have only read about the BigDecimal classes, so I can't really comment on their effectiveness.
COBOL: COPY or INCLUDE
Java: Inheritance
++: Much more powerful!
In COBOL, if you change a COPY or INCLUDE member, you must recompile all the programs that use it. In Java, if program B inherits from program A, a change in program A is automatically inherited by program B without recompiling! Yes, this really works, and lends great power to Java applications. I exploited this for my Read/Sort/Report system. Class IbfReport contains all the basic logic common to the report programs. It has appropriate defaults for all of its methods. Classes IbfRP#### extend IbfReport, and contain only those methods unique to a particular report. If a change is made in IbfReport, it is reflected in the IbfRP#### programs (classes) the next time they are run.
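A stripped-down sketch of that structure (the class names below are illustrative; the real IbfReport is not shown here):

abstract class Report {

    // Default heading, shared by every report unless overridden.
    protected String heading() {
        return "STANDARD REPORT";
    }

    // The common driver logic lives once, in the base class.
    public final void run() {
        System.out.println(heading());
        printBody();
    }

    protected abstract void printBody();
}

class SalesReport extends Report {

    @Override
    protected String heading() {
        return "MONTHLY SALES"; // only the parts unique to this report are overridden
    }

    @Override
    protected void printBody() {
        System.out.println("...sales detail lines...");
    }

    public static void main(String[] args) {
        new SalesReport().run(); // prints the overridden heading, then the body
    }
}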
COBOL: ON EXCEPTION
Java: try/throw/catch
++: can limit scope of error detection (see following)
COBOL: OPEN
Java: Input Streams
++: Automatic error detection, both a blessing and a curse.
COBOL: WRITE
Java: write (yes, really).
COBOL: CLOSE
Java: close method
COBOL: READ
Java: read...
I can perfectly see why Clojure is really good for concurrent programming. I can see the advantages of FP also in this regard.
But clearly, not every line of code we write is part of a thread or needs concurrent access. For those parts of the code (the simpler, sequential pieces), what is it that Java is really missing that Clojure provides?
Were features like Multimethods, Dynamic binding, Destructuring bind really missed in Java?
I suppose my question can also be framed as:
If Clojure did not have the concurrency features that it has, and the whole immutability/mutability issue was not a concern, what other features does Clojure provide that would make you use it instead of Java?
Were features like Multimethods, Dynamic binding, Destructuring bind really missed in Java?
Yes. Also...
First-class functions. Delicious first-class functions. This isn't just an FP thing. There's a good reason people are clamoring for closures in Java 7.
Code-is-data. This is the benefit of any Lisp. Lisp code isn't just blobs of text you feed into the mouth of a compiler and never see again, it's structures of lists and vectors and symbols and literals that you can manipulate programmatically. This leads to powerful macros and first-class Symbols and lots of other goodies. It leads to a highly extensible and powerful language.
Clojure has better control and looping constructs and the ability to create your own via macros and first-class functions. Java has for and foreach and while (and didn't even have foreach for years). Clojure has map, filter, reduce, mapcat, lots of do forms, lots of if and when forms, list comprehensions via for, and so on. If these didn't exist, you could write them yourself. In Java you get to wait a decade for a committee to (maybe) approve such features.
Minus those dealing with static typing, all of the features planned for Java 7, Clojure either already has or could add trivially. "Automatic resource management"? Clojure has with-open. "Language support for collections"? Clojure (and Ruby, Perl, Python...) has it already. "Strings in switch"? Clojure has more powerful case-like constructs such as condp, and whatever else you can think up. You could write any of these yourself in a dozen lines of Clojure.
Concise syntax for lists, maps, arrays, sets, sorted sets, sorted maps etc. and nearly interchangeable use of them all thanks to the seq abstraction. Literal support for regexes, characters, anonymous functions, etc.
Java has mandatory checked exceptions, which are annoying; Clojure doesn't.
Java syntax is verbose and irregular. Clojure syntax is concise and regular. Even Java written in Clojure is often more concise than Java written in Java thanks to macros like -> and doto, and constructs like proxy and (soon) reify.
Java code has too much mandatory boilerplate and endless repetition. public static void main(String[] args){...} etc. Clojure has next to none of this boilerplate, while sacrificing little to nothing in terms of expressiveness or power. Even other statically typed languages today seem to be going the way of type inference. There's good reason you need a bulky Java-centric IDE to write and endlessly "refactor" Java code; writing it by hand would drive you insane and wear your fingers down to nubs.
In Java everything is a class or interface, whether it should be or not, which is a cause of unnecessary complexity. There are many programs that have to be mangled beyond recognition to fit into an OOP style. Clojure lets you avoid this. A nice rant to this effect. Clojure focuses largely on verbs.
Interactive programming via REPL is fun. Compile/run/debug cycles are not. Clojure still compiles to .class files if you want it; in the meantime you can sit in the middle of your code and tinker freely while it's running.
Clojure's metadata and sane equality testing are enjoyable to work with. As are its auto-promotion of int to long to Bigint, native handling of rational numbers, and so on.
Dynamic typing leads to shorter, more generic thus more reusable thus more powerful code than static typing. (This is a highly debatable point, obviously, so I put it last.)
The popularity of Scala and Groovy and JRuby and Jython and endless other JVM-languages-that-aren't-Java should be seen as a good indication that while the JVM is good, Java-the-language is unpleasant for many people.
Brian has summarized it really well. Here is something that really impressed me. (From the book Programming Clojure by Stuart Halloway)
Java code, from the Apache Commons:
public class StringUtils {
    public static boolean isBlank(String str) {
        int strLen;
        if (str == null || (strLen = str.length()) == 0) {
            return true;
        }
        for (int i = 0; i < strLen; i++) {
            if ((Character.isWhitespace(str.charAt(i)) == false)) {
                return false;
            }
        }
        return true;
    }
}
Here is a similar implementation in Clojure:
(defn blank? [s] (every? #(Character/isWhitespace %) s))
Well for one there is generally a lot less "ceremony" in Clojure. Languages like Python and Ruby have this advantage over Java as well (thus the popularity of JRuby, Jython).
But note there are times when verbosity cannot be avoided in Java though there might be a clear pattern. Clojure's macros are a huge win here- even over other similarly dynamic languages.
Another thing to consider is that Clojure programs tend to be concurrency-safe, so if you decide down the road to make a particular application concurrent, it won't be too painful. Without having made a lot of decisions up front, that will be considerably more difficult in Java.
Also, wondering whether Clojure would have a clear advantage over Java if it lacked strong concurrency primitives and immutability is kind of like asking, "Well, what if Clojure wasn't Clojure?"
I've been using Java almost since it first came out but have over the last five years gotten burnt out with how complex it's become to get even the simplest things done. I'm starting to learn Ruby at the recommendation of my psychiatrist, uh, I mean my coworkers (younger, cooler coworkers - they use Macs!). Anyway, one of the things they keep repeating is that Ruby is a "flexible" language compared to older, more beaten-up languages like Java but I really have no idea what that means. Could someone explain what makes one language "more flexible" than another? Please. I kind of get the point about dynamic typing and can see how that could be of benefit for conciseness. And the Ruby syntax is, well, beautiful. What else? Is dynamic typing the main reason?
Dynamic typing doesn't come close to covering it. For one big example, Ruby makes metaprogramming easy in a lot of cases. In Java, metaprogramming is either painful or impossible.
For example, take Ruby's normal way of declaring properties:
class SoftDrink
  attr_accessor :name, :sugar_content
end

# Now we can do...
can = SoftDrink.new
can.name = 'Coke' # Not a direct ivar access — calls can.name=('Coke')
can.sugar_content = 9001 # Ditto
This isn't some special language syntax — it's a method on the Module class, and it's easy to implement. Here's a sample implementation of attr_accessor:
class Module
  def attr_accessor(*symbols)
    symbols.each do |symbol|
      define_method(symbol) { instance_variable_get("@#{symbol}") }
      define_method("#{symbol}=") { |val| instance_variable_set("@#{symbol}", val) }
    end
  end
end
This kind of functionality allows you a lot of, yes, flexibility in how you express your programs.
A lot of what seem like language features (and which would be language features in most languages) are just normal methods in Ruby. For another example, here we dynamically load dependencies whose names we store in an array:
dependencies = %w(yaml haml hpricot sinatra couchfoo)
block_list = %w(couchfoo) # Wait, we don't really want CouchDB!
dependencies.each {|mod| require mod unless block_list.include? mod}
It's also because Ruby is classless (in the Java sense) but totally object-oriented (the properties pattern), so you can call any method, even one that isn't defined, and you still get a last chance to respond to the call dynamically, for example creating methods on the fly as necessary. Also, Ruby doesn't need compilation, so you can update a running application easily if you want to. And an object can suddenly inherit from another class/object at any time during its lifetime through mixins, which is another point of flexibility. Anyway, I agree with the kids that this language called Ruby, which has actually been around about as long as Java, is very flexible and great in many ways, but I still haven't been able to agree that it's beautiful (syntax-wise); C is more beautiful IMHO (I'm a sucker for brackets). But beauty is subjective; the other qualities of Ruby are objective.
Blocks, closures, many things. I'm sure some much better answers will appear in the morning, but for one example here's some code I wrote ten minutes ago - I have an array of scheduled_collections, some of which have already happened, others which have been voided, canceled, etc. I want to return an array of only those that are pending. I'm not sure what the equivalent Java would be, but I imagine it's not this one-line method:
def get_all_pending
  scheduled_collections.select{ |sc| sc.is_pending? }
end
A simpler example of the same thing is:
[0,1,2,3].select{|x| x > 1}
Which will produce [2,3]
Things I like
less code to get your point across
passing around code blocks (Proc, lambdas) is fun and can result in tinier code. e.g. [1, 2, 3].each{|x| puts "Next element #{x}"}
has the scripting roots of Perl; very nice for slicing and dicing routine stuff like parsing files with regexps, et al.
the core data structure class API like Hash and Array is nicely done.
Metaprogramming (owing to its dynamic nature) - ability to create custom DSLs (e.g. Rails can be termed a DSL for WebApps written in Ruby)
the community that is spawning gems for just about anything.
Mixins. Altering a Ruby class to add new functionality is trivially easy.
Duck typing refers to the fact that types are considered equivalent based on which methods they implement, not on their declared type. To take a concrete example, many methods in Ruby take an IO-like object to operate on a stream. This means that the object has to implement enough functions to be able to pass as an IO-type object (it has to sound enough like a duck).
In the end it means that you have to write less code than in Java to do the same thing. Not everything about dynamic languages is great, though. You more or less give up all of the compile-time type checking that Java (and other strongly/statically typed languages) gives you. Ruby simply has no idea if you're about to pass the wrong object to a method; that will give you a runtime error. Also, it won't give you a runtime error until the code is actually called.
Just for laughs, a fairly nasty example of the flexibility of the language:
class Fixnum
  def +(other)
    self - other
  end
end

puts 5 + 3
# => 2