Testing ANTLR Grammar - java

So I've been making a grammar in Eclipse with ANTLR v3.4, and I've made one that works. I want to make sure everything still works whenever I edit it. I can go into the interpreter every time, but that seems like a huge waste of time.
Questions:
I've read about gUnit, but the download link it gives for gUnit
( http://antlr.org/hudson/job/gUnit/org.antlr$gunit/lastSuccessfulBuild/ ) doesn't work. How can I get gUnit?
What is the best way to test grammars? Is it actually gUnit, or should I just write plain JUnit tests?

The question is old, but I'm leaving a reference for completeness:
For me, gUnit was useless, so I worked out how to test the lexer alone, and then the parser alone.
I answered it here: https://stackoverflow.com/a/53884851/976948
Basically, it links to two articles on how to test each part:
Unit test for Lexer
Unit test for Parser
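For reference, the testing pattern those articles describe can be sketched as follows. This is a minimal sketch: the stub `tokenize` method below stands in for driving the ANTLR-generated lexer (in a real test you would construct something like `new MyLexer(new ANTLRStringStream(input))` and drain `nextToken()`; `MyLexer` is a hypothetical generated class name), but the assertion style is the same.

```java
import java.util.ArrayList;
import java.util.List;

public class LexerTestSketch {

    // Stand-in for draining lexer.nextToken(): splits on whitespace.
    // A real test would collect token types/texts from the generated lexer.
    static List<String> tokenize(String input) {
        List<String> tokens = new ArrayList<>();
        for (String t : input.trim().split("\\s+")) {
            if (!t.isEmpty()) tokens.add(t);
        }
        return tokens;
    }

    // Mirrors a JUnit-style assertion: feed input, drain tokens,
    // compare against the expected sequence.
    static void assertTokens(String input, String... expected) {
        List<String> actual = tokenize(input);
        if (!actual.equals(List.of(expected))) {
            throw new AssertionError(
                "expected " + List.of(expected) + " but got " + actual);
        }
    }

    public static void main(String[] args) {
        assertTokens("a b  c", "a", "b", "c");
        System.out.println("lexer sketch ok");
    }
}
```

The parser is tested the same way one level up: feed a token stream, invoke a rule method, and assert on the resulting tree or on the absence of syntax errors.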

I recently completed two ANTLR3 assignments (I'm working on my Master's in Computer Science) using Eclipse. I found no single document that covered the whole process of installing, configuring, writing, and debugging a grammar in Eclipse. So, after working through various issues, I found the easiest thing to do was to stay in Eclipse for testing.
To use the process I have come to use (outlined below), you must first have ANTLR IDE v2.1.2 installed. Add it right from inside Eclipse Indigo: http://antlrv3ide.sourceforge.net/updates. This site also has some useful documentation on using the ANTLR IDE. Once installed, the IDE has to be configured. The video tutorials are a bit out of date but helpful; see the detailed how-to guide on configuring ANTLR IDE in Eclipse. The main configuration item is the Java output folder. Set it in Eclipse by going to Window, Preferences, ANTLR, Code Generator, checking Project relative folder, and typing a folder name in the Output folder name box (mine is called "antlr-java"; others use "generated").
Test/Debug Process for ANTLR in Eclipse Indigo with ANTLR IDE
1) After a new project is created, right-click it and select Configure, Convert to ANTLR Project...
2) Create the grammar in a .g file and save it. Note: the filename has to match the grammar name.
3) If there are significant errors, debug the grammar. Eclipse shows the ANTLR error(s) and which line(s) are affected. At first these errors seem hard to understand, but they can be worked through by using various resources:
- The Definitive ANTLR Reference by Terence Parr, the author of ANTLR
- the ANTLR Reference Manual
- Google the error; many times you will end up here at Stack Overflow. In particular, Bart Kiers is both knowledgeable and helpful (Bart: thanks for the help you didn't know you gave me).
4) On the first save after the serious ANTLR errors are resolved, the Java output folder you configured in Eclipse is created, along with a Java file inside it.
5) Right-click the Java output folder and select Build Path, Use as Source Folder. This tells Eclipse where to look for the project's Java source.
6) There are likely to be errors in the new Java file. Select it and search through it for Java errors. Go back to your grammar or Java file(s), correct the errors, and re-save the grammar until both grammar and Java files are error-free, then run it.
7) From this point on, it's the usual modify-run-debug cycle.
The only other Eclipse change I needed was to create a few Run Configurations for testing command-line parameters.

You can download gUnit there, but I don't think it's the latest version...
Try Jarvana... the latest version there is 3.4: http://repo1.maven.org/maven2/org/antlr/gunit/3.4/gunit-3.4.jar
@Dave Newton is right. As of ANTLR v3.1, gUnit is included in the main ANTLR Tool jar, as stated there.
I didn't know about gUnit until now. It looks great for grammar testing, but I think JUnit tests will do the job too...

This is the first time I've heard of gUnit, so I read up on it. (I don't use ANTLR much.) It sounds interesting, but of limited use.
My approach to validating grammars is to validate the entire parser with "normal" unit tests. You should have unit tests in place anyway, and you just add the tests that check for grammar regressions there. In my experience, most errors come from semantic analysis and reduction, not from the grammar.

Related

Validate Gherkin Feature File

I used IntelliJ to write Cucumber feature files, i.e., *.feature files, and the corresponding step definition files in Java within the IDE. The user experience is great: the IDE opens up its IntelliSense, showing all valid options for which step definitions exist. If I write a new step in a scenario, or a step for which no corresponding step definition exists in the Java file, it highlights the step in a different colour, and similarly if the regex doesn't match. This highlighting tells me that something has to be written for this step in the Java file, or that the step is wrong and will not execute.
I need the same functionality on a non-developer machine, i.e., a Business Analyst's or Product Owner's machine, where there is no IDE installed, just an editor like Notepad++. I know about the Notepad++ plug-in for Gherkin, but it doesn't highlight steps whose step definitions are missing from the *.java step definition file. Please suggest any editor or free IDE that has a similar plug-in available.
Thanks,
Shany
If you use JIRA to manage your stories, you can use a JIRA plugin that has auto-complete and parameter-highlighting functionality. There are other similar plugins you can browse.

Penn Discourse Tree Bank (PDTB) Parser

I am currently trying to run the following project (https://github.com/ilija139/PDTB-Parser), so far without any success. The text file I used to run the project is wsj_2300.txt, which can be found in the "output" directory. The project is based on Stanford CoreNLP. What I have found out so far:
I can't use CoreNLP version 3.5.2 (the latest version) because the project is based on the older dependencies. With Universal Dependencies, I get the following error message: "No head rule defined for MWE using class edu.stanford.nlp.trees.SemanticHeadFinder in (MWE (JJ such) (IN as))". However, the following Stack Overflow answer about the same problem (PrintTree - No head rule defined for MWE - Bug with version 3.5.2) did not solve it. If anyone knows how to fix it, please let me know.
Nevertheless, since the PDTB parser was last updated a year ago, I simply tried two older versions (3.5.1 and then 3.4.1), expecting the project to run as described by Thematrixme (PrintTree - No head rule defined for MWE - Bug with version 3.5.2). Unfortunately, only the first problem was solved and another one appeared: a "String index out of range: -1 ()" in the function "buildDependencyTrees", because no "root" could be found in the dependencies.
I tried to work around the problem by simply skipping the node so it isn't built, but then I get an "indexOutOfBoundsException" at the next dependency, because no child could be found... Does anyone know what I need to do, or which CoreNLP version/model I need to use, to make this program run correctly as described in the manual?
Thank you very much
I am not sure how you are using the parser, but you can get Stanford CoreNLP 3.5.2 to create the older Stanford Dependencies by setting "parse.originalDependencies" to "true" in the Properties object you use to make a pipeline. If you're running from the command line, just include the flag "-parse.originalDependencies". If you're using the neural-net dependency parser, you would instead use the Stanford Dependencies model by setting the property "depparse.model" to "edu/stanford/nlp/models/parser/nndep/english_SD.gz".
If you let me know how specifically you are creating parses I can tell you exactly what setting to use to get the older dependencies.
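A sketch of the configuration this answer describes, using only java.util.Properties (the annotator list here is an illustrative assumption; the pipeline construction itself needs the CoreNLP jars, so it is shown as a comment):

```java
import java.util.Properties;

public class OldDepsConfig {
    // Build the Properties the answer describes: force CoreNLP 3.5.2
    // to emit the older Stanford Dependencies instead of Universal
    // Dependencies.
    static Properties oldStyleDependencyProps() {
        Properties props = new Properties();
        // Annotator list is an example; adjust to your pipeline.
        props.setProperty("annotators", "tokenize, ssplit, pos, lemma, parse");
        props.setProperty("parse.originalDependencies", "true");
        // If using the neural-net dependency parser instead, point it
        // at the Stanford Dependencies model:
        // props.setProperty("depparse.model",
        //     "edu/stanford/nlp/models/parser/nndep/english_SD.gz");
        return props;
    }

    public static void main(String[] args) {
        Properties props = oldStyleDependencyProps();
        // With the CoreNLP jars on the classpath you would then do:
        // StanfordCoreNLP pipeline = new StanfordCoreNLP(props);
        System.out.println(props.getProperty("parse.originalDependencies"));
    }
}
```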

Import EMX into Eclipse via command line

I hope my question will be clear enough, as I am not used to the Eclipse environment.
My goal is to create a project in Eclipse from an already existing EMX file, which I fetch from a Git repository.
I would like to create a command line (or a script, or anything I can run automatically from a remote machine) that imports that EMX file into Eclipse, so I can use another script (already written) to work with the EMX.
My problem is that I don't know how to create this script or command line.
I've tried solutions found here on Stack Overflow, without success. Maybe I'm just bad :p
As I am a beginner with Eclipse, if someone has a link to a clear tutorial or working example, that would be very nice.
Note: I just have the EMX file (no .project or anything). So I guess I need to create a project around this file, not just import an existing project.
Thank you very much for your help :)
Laurent
I finally managed to do what I wanted by creating a plugin which imports projects from a given path into my current workspace.
The main part of the code is from: http://code.google.com/p/headlesseclipse/source/browse/branches/JUnit/com.ind.eclipse.headlesseclipse/src/com/ind/eclipse/headlessworkspace/HeadlessProjectImport.java?r=88
I did not find any way to do this outside a plugin. The Groovy solution produced plenty of errors, and without CDT the first simple solution I looked at was not available.

Why does my ANTLR build Ant task fail with "Unable to determine generated class"?

I'm trying to use ANTLR3 task for Ant, but I get an "Unable to determine generated class" build failure message.
A quick search shows that many people have had the same problem, with no solution provided (see links below).
Can someone suggest a solution that doesn't resort to using a regular Java Ant task?
External links:
http://www.antlr.org/pipermail/antlr-interest/2009-November/036795.html
http://www.antlr.org/pipermail/antlr-interest/2006-July/016870.html
http://palove.kadeco.sk/itblog/posts/40
The antlr task included with Ant 1.8.2 (the latest version) seems to depend on ANTLR 2.7.2 (defined in $ANT_HOME/lib/ant-antlr.pom and using $ANT_HOME/lib/ant-antlr.jar).
The task scans the target file for a line matching ^class (.*) extends .*, where the first match group is used as the name of the generated file. This whole bit of syntax seems to have been dropped in ANTLR 3.x, or at least made optional, because I'm able to generate parsers without it using the regular java task work-around you mentioned.
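That scanning heuristic can be reproduced with a few lines of stdlib Java, which makes it obvious why an ANTLR 3 grammar (which has no `class ... extends ...` line) triggers the failure. The grammar snippets below are illustrative, not taken from any real build:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AntlrTaskScan {
    // The pattern the Ant antlr task uses (per the answer above) to
    // find the name of the class the grammar will generate.
    static final Pattern CLASS_LINE = Pattern.compile("^class (.*) extends .*");

    // Returns the generated-class name, or null if no line matches --
    // which is exactly what happens with an ANTLR 3 grammar, hence
    // "Unable to determine generated class".
    static String findGeneratedClass(String grammarText) {
        for (String line : grammarText.split("\n")) {
            Matcher m = CLASS_LINE.matcher(line);
            if (m.matches()) return m.group(1);
        }
        return null;
    }

    public static void main(String[] args) {
        // ANTLR 2-style grammar: the heuristic finds "MyParser".
        System.out.println(findGeneratedClass("class MyParser extends Parser;"));
        // ANTLR 3-style grammar header: no match, so the task fails.
        System.out.println(findGeneratedClass("grammar My;\nprog : 'x' ;"));
    }
}
```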
On the front page of http://antlr.org/, under the "File Sharing" heading, there is a link to an ANTLR v3 task for Ant, but unfortunately it doesn't appear to be the drop-in replacement I was hoping for. It actually seems rather convoluted, so I've stuck with the plain java task.

Java Reflection not working on my system - working for team members

I am working on a team project in Java. One requirement is that we dynamically populate a drop-down menu of all classes that implement a certain interface. New classes can be added after compile time. To accomplish this we are using reflection.
Problem: All of the drop-down menus are blank on my system. I cannot for the life of me figure out why they are not populating. All other 5 team members have it working on their system.
Things I tried that didn't work:
1) Installing most recent eclipse (galileo) because rest team was using it
2) Re-install most recent java release (jdk1.6.0-17 and jre6)
3) Check PATH and JAVA_HOME variables
Any thoughts as to what else I can try or if something I did should have solved it and didn't? It is driving me crazy.
Edit:
I should have been clearer that we are developing in a team. We are using SVN for version control and we are all running the exact same source code. I even tried checking out a fresh copy of the entire tree from SVN, but I had the same issue with reflection on my system while it worked for teammates.
The team created an executable jar and that ran on everyone's system fine except for mine. Everything worked for me except the reflection bit.
You need to debug your application. This means you have to systematically explore possible causes of the problem. Here are some things that come to mind:
Could your GUI be failing rather than reflection? What if you output with System.out.println() rather than your menu?
Is your reflection code throwing an exception, and are you ignoring it?
Is your reflection code actually being called? Toss a println() in there to be sure!
Is the test for the interface suffering from a typo or similar error that's causing it to fail? Try finding classes that implement Serializable instead!
Is your reflection test running on the main thread and trying to update your GUI? You need to use SwingUtilities.invokeAndWait (or invokeLater) so the update runs on the Swing event dispatch thread.
You're working with Eclipse; Eclipse has a fantastic debugger. Set a breakpoint near where your main action is and then single step through the code.
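The event-dispatch-thread point from the checklist can be sketched with stdlib Swing only. This is a minimal illustration (class and method names are made up): results computed on a worker thread are handed to the GUI via invokeAndWait, instead of touching Swing components directly from the main thread.

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicReference;
import javax.swing.SwingUtilities;

public class EdtSketch {
    // Hand a list of discovered class names to the GUI on the event
    // dispatch thread; here we just record the update instead of
    // calling e.g. comboBoxModel.addElement(...) for each name.
    static String populateOnEdt(List<String> classNames) throws Exception {
        AtomicReference<String> updated = new AtomicReference<>();
        SwingUtilities.invokeAndWait(() -> {
            // Runs on the EDT; safe place to mutate Swing models.
            updated.set(String.join(",", classNames));
        });
        return updated.get();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(populateOnEdt(List.of("Foo", "Bar")));
    }
}
```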
PATH and JAVA_HOME won't help. PATH only affects dynamically-linked libraries ("native code"). JAVA_HOME is a scripting variable that happens to be used by some Java-based utilities like Ant and Tomcat; it means nothing to the Java runtime itself.
You need to be investigating the classpath, which should be specified by the -classpath option to the java command, in the Build Path in your Eclipse project properties, or in the Class-Path attribute of the main section of a JAR file if you're launching java with the -jar option.
From within your code, you should be able to list the contents of your classpath by examining the system property "java.class.path":
System.out.println(System.getProperty("java.class.path"));
Problem solution:
The classpath leading to the source code must have no spaces in it.
I am running Windows XP and, for whatever reason, if the classpath leading to the jar file or source code that uses reflection contains any spaces, the reflection fails.
I took the jar file that works for the rest of my team and ran it from C:\ on my system, and the reflection worked perfectly.
I do not know why this is, so please comment if you know what is happening.
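One likely explanation (an assumption, since the asker never found the root cause) is class-discovery code that turns a classloader URL into a file path. URLs encode a space as %20, so the raw path never matches an on-disk directory that contains spaces, and the scan silently finds nothing. Decoding the path fixes it; the path below is hypothetical:

```java
import java.io.File;
import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;

public class SpacePathDemo {
    // Decode a URL-style path into a real filesystem path. Without
    // this step, "Documents%20and%20Settings" is handed to new File(...)
    // verbatim, the directory is never found, and reflection-based
    // class discovery returns an empty list.
    static String toFilePath(String urlPath) throws UnsupportedEncodingException {
        return URLDecoder.decode(urlPath, "UTF-8");
    }

    public static void main(String[] args) throws Exception {
        String raw = "/C:/Documents%20and%20Settings/me/project/classes";
        System.out.println(new File(toFilePath(raw)).getPath());
    }
}
```

This would also explain why the same jar worked when run from C:\, a path with no spaces in it.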
Might be a long shot, but look for differences in security settings between you and your teammates. This article describes more details: http://www.ibm.com/developerworks/library/j-dyn0603/ (see the heading "Security and reflection").
