When I tried to debug the stream in the code below via Stream Trace in IntelliJ, the debugger can't evaluate the forEach because the error below is thrown. I have no idea what it's about; the code by itself runs fine.
Fully updated IntelliJ community edition, JUnit 5, Spring Boot, Maven, Java 11.
The error that happens during Stream Trace debugging only:
java.lang.IncompatibleClassChangeError: Type
com.progonkpa.file.FileService$GeneratedEvaluationClass$5 is not a
nest member of com.progonkpa.file.FileService: types are in different
packages
The code that contains the stream:
import java.io.File;
import java.util.stream.Stream;

public class FileService {

    public void createDirs(File parentDir, String[] fileNames) {
        Stream.of(fileNames)
                .map(fileName -> new File(parentDir, fileName))
                .forEach(file -> {
                    if (file.mkdirs())
                        System.out.println("Created file: " + file);
                    else
                        System.err.println("Failed to create file: " + file);
                });
    }
}
The test that invokes the method above:
import static org.junit.jupiter.api.Assertions.assertTrue;

import java.io.File;

import org.junit.jupiter.api.Test;

public class FileServiceTest {

    private FileService fileService = new FileService();

    @Test
    public void generateDirs_createsList() {
        File tmpDir = new File("/tmp");
        String[] dirNamesList = {"dir1", "dir2"};
        File createdDir1 = new File(tmpDir, dirNamesList[0]);
        File createdDir2 = new File(tmpDir, dirNamesList[1]);

        fileService.createDirs(tmpDir, dirNamesList);

        assertTrue(createdDir1.exists());
        assertTrue(createdDir2.exists());
        assertTrue(createdDir1.delete());
        assertTrue(createdDir2.delete());
        assertTrue(tmpDir.delete());
    }
}
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.1.3.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.unknown.somefunction</groupId>
<artifactId>joske</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<java.version>11</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<!--<dependency>-->
<!--<groupId>org.springframework.boot</groupId>-->
<!--<artifactId>spring-boot-starter-test</artifactId>-->
<!--<scope>test</scope>-->
<!--</dependency>-->
<!--Data processing-->
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi</artifactId>
<version>4.0.1</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
<version>4.0.1</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-csv</artifactId>
<version>1.6</version>
</dependency>
<dependency>
<groupId>org.simplejavamail</groupId>
<artifactId>simple-java-mail</artifactId>
<version>5.1.3</version>
</dependency>
<dependency>
<groupId>org.simplejavamail</groupId>
<artifactId>outlook-message-parser</artifactId>
<version>1.1.17</version>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.6</version>
</dependency>
<dependency>
<groupId>com.github.vatbub</groupId>
<artifactId>mslinks</artifactId>
<version>1.0.5</version>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
<version>5.3.2</version>
<scope>test</scope>
</dependency>
<!--Testing-->
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
The Stream debugger apparently generates bytecode and defines classes on the fly to evaluate expressions.
Relevant source files are
CompilingEvaluator.java
CompilingEvaluatorImpl.java
And there is a currently open issue on YouTrack with the exact same exception:
Type some.Type$GeneratedEvaluationClass$1 is not a nest member of some.Type: types are in different packages
IDEA-204665
This manifests itself only on JDK versions greater than 10, and unfortunately you have
<java.version>11</java.version>
As the issue suggests, it happens because JDK 11 has the "Nest-Based Access Control" feature
(https://cr.openjdk.java.net/~dlsmith/nestmates.html).
JEP 181 says, under "Impact on Other Tools":

Any tool that operates on classfiles, or which generates or processes bytecodes is potentially impacted by these changes. At a minimum such tools must tolerate the presence of the new classfile attributes and allow for the change in bytecode rules. For example:
- The javap classfile inspection tool,
- The Pack200 implementation, and
- The ASM bytecode manipulation framework, which is also used internally in the JDK.
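To see what "nest member" means in practice, here is a small self-contained illustration of my own (not from the issue) using the reflection API that JEP 181 added in Java 11; a class defined at runtime, like the debugger's generated evaluation class, is not part of the nest recorded in the class file:

import java.util.Arrays;

public class NestDemo {

    private static class Inner {
    }

    public static void main(String[] args) {
        // Prints "class NestDemo": the outer class is the nest host.
        System.out.println(NestDemo.class.getNestHost());
        // Prints the nest recorded at compile time: NestDemo and NestDemo$Inner.
        System.out.println(Arrays.toString(NestDemo.class.getNestMembers()));
        // true: Inner was compiled as part of the nest. A class generated at runtime
        // under a name like NestDemo$GeneratedEvaluationClass$1 would not be,
        // which is roughly what the Stream debugger's generated class runs into.
        System.out.println(Inner.class.isNestmateOf(NestDemo.class));
    }
}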
Related
I have trained and saved a Keras model using TensorFlow 2.7.0; it's basically a predictive model that takes text and classifies it into categories. I saved the model using model.save(). I load the model in Java, but when I try to do a prediction with it I get the error you can see below. All details can be found below:
package com.example.deployml;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.tensorflow.*;

import java.nio.charset.StandardCharsets;

@SpringBootApplication
public class DeployMlApplication {

    public static void main(String[] args) {
        System.out.println(TensorFlow.version());
        try (SavedModelBundle b = SavedModelBundle.load("src/main/resources/model/", "serve")) {
            System.out.println("SavedModelBundle: " + b.metaGraphDef());
            System.out.println("Hello World! I'm using tensorflow version " + TensorFlow.version());
            Session s = b.session();
            String data = " providing support director admin manager assisting daily day day activities carry indoor sales network tele marketing exp willing learn also generate quotation invoicing etc sales coordination functions.";
            Tensor input = Tensor.create(data.getBytes(StandardCharsets.UTF_8));
            Tensor result = s.runner().feed("serving_default_text_vectorization_input:0", input).fetch("StatefulPartitionedcall_1:0").run().get(0);
            float[][] res = new float[1][24];
            res[0] = new float[1];
            /*result.copyTo(res);*/
            System.out.println("results: " + result);
            /*for (Iterator it = b.graph().operations(); it.hasNext();) {
                Operation op = (Operation) it.next();
                System.out.println("Operation name: " + op.name());
            }*/
        }
        SpringApplication.run(DeployMlApplication.class, args);
    }
}
Maven dependencies:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.7.3</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.example</groupId>
<artifactId>deployML</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>deployML</name>
<description>deployML</description>
<properties>
<java.version>8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-devtools</artifactId>
<scope>runtime</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.tensorflow</groupId>
<artifactId>tensorflow</artifactId>
<version>1.15.0</version>
</dependency>
<dependency>
<groupId>org.tensorflow</groupId>
<artifactId>tensorflow-java</artifactId>
<version>0.4.0</version>
<type>pom</type>
</dependency>
<dependency>
<groupId>org.tensorflow</groupId>
<artifactId>tensorflow-core-api</artifactId>
<version>0.4.0</version>
</dependency>
<dependency>
<groupId>org.tensorflow</groupId>
<artifactId>tensorflow-core-platform</artifactId>
<version>0.4.2</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<excludes>
<exclude>
<groupId>org.project-lombok</groupId>
<artifactId>lombok</artifactId>
</exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
</project>
Error I get when I run this:
Exception in thread "main" org.tensorflow.TensorFlowException: Op type not registered 'RaggedBincount' in binary running on XXXXXX. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
at org.tensorflow.SavedModelBundle.load(Native Method)
at org.tensorflow.SavedModelBundle.access$000(SavedModelBundle.java:27)
at org.tensorflow.SavedModelBundle$Loader.load(SavedModelBundle.java:32)
at org.tensorflow.SavedModelBundle.load(SavedModelBundle.java:95)
at com.example.deployml.DeployMlApplication.main(DeployMlApplication.java:18)
Any suggestion on how to solve this? Thanks!
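For what it's worth, the stack trace goes through the legacy org.tensorflow:tensorflow 1.15.0 binding, whose bundled TF 1.x runtime does not contain TF 2.x ops such as RaggedBincount, so a model saved with TensorFlow 2.7 cannot load there. One possible direction (my assumption, not a verified fix for this exact model) is to drop the 1.15.0 artifact, keep only the newer tensorflow-core binding already declared in the pom (aligning tensorflow-core-api and tensorflow-core-platform on the same version), and load the model through that API. A minimal sketch, reusing the feed/fetch names from the question:

import org.tensorflow.SavedModelBundle;
import org.tensorflow.Tensor;
import org.tensorflow.types.TString;

public class PredictSketch {

    public static void main(String[] args) {
        // Same model directory and tag as in the question.
        try (SavedModelBundle bundle = SavedModelBundle.load("src/main/resources/model/", "serve")) {
            String data = "providing support director admin manager ...";
            try (TString input = TString.scalarOf(data)) {
                // Feed/fetch names are copied from the question and may need adjusting;
                // they can be checked with `saved_model_cli show --dir <model> --all`.
                Tensor result = bundle.session().runner()
                        .feed("serving_default_text_vectorization_input", input)
                        .fetch("StatefulPartitionedcall_1")
                        .run()
                        .get(0);
                System.out.println("result shape: " + result.shape());
            }
        }
    }
}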
I have unit tests in my Spring Boot project in addition to the default application test that comes with the project bundle when I create the project from start.spring.io. When I run mvn test from the command line, I see that only the default application tests are run, but not the unit tests that I have written. However, I can run these tests from IntelliJ. I am using Maven version 3.6.2 and Maven Surefire plugin version 2.22.2. Can someone let me know what I am missing here? Thanks.
Here's my test class:
@RunWith(SpringRunner.class)
@SpringBootTest
public class BranchServiceUnitTest {

    @Autowired
    private BranchService branchService;

    @MockBean
    private BranchRepository branchRepository;

    @Test
    public void testAddNewBranch() {
        Branch testBranch = new Branch();
        testBranch.setBranchName("TestBranch");
        testBranch.setCity("TestCity");
        testBranch.setContactNumber("TestContactNumber");
        testBranch.setEmailId("TestEmailId");
        Mockito.when(branchRepository.save(testBranch)).thenReturn(testBranch);
        Branch addedBranch = branchService.addBranch(testBranch);
        assertThat(addedBranch.getCity()).isEqualTo("TestCity");
    }

    @Test
    public void findBranchById() {
        Branch testBranch = new Branch();
        testBranch.setId(1);
        testBranch.setBranchName("TestBranch");
        testBranch.setCity("TestCity");
        testBranch.setContactNumber("TestContactNumber");
        testBranch.setEmailId("TestEmailId");
        Mockito.when(branchRepository.findById(testBranch.getId())).thenReturn(java.util.Optional.of(testBranch));
        Branch foundBranch = branchService.getBranchById(1);
        assertThat(foundBranch.getId()).isEqualTo(1);
    }

    @Test
    public void testGetAllBranches() {
        Branch testBranch1 = new Branch();
        testBranch1.setId(1);
        testBranch1.setBranchName("TestBranch");
        testBranch1.setCity("TestCity");
        testBranch1.setContactNumber("TestContactNumber");
        testBranch1.setEmailId("TestEmailId");
        Branch testBranch2 = new Branch();
        testBranch2.setId(2);
        testBranch2.setBranchName("TestBranch");
        testBranch2.setCity("TestCity");
        testBranch2.setContactNumber("TestContactNumber");
        testBranch2.setEmailId("TestEmailId");
        List<Branch> branches = Arrays.asList(testBranch1, testBranch2);
        Mockito.when(branchRepository.findAll()).thenReturn(branches);
        assertThat(branches.size()).isEqualTo(2);
        assertThat(branches.get(0).getId()).isEqualTo(1);
    }
}
Following is my pom file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.2.6.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.rkasibha</groupId>
<artifactId>rentabook</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>rentabook</name>
<description>Rent a book service</description>
<properties>
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-devtools</artifactId>
<scope>runtime</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.modelmapper</groupId>
<artifactId>modelmapper</artifactId>
<version>2.3.6</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.junit.vintage</groupId>
<artifactId>junit-vintage-engine</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
I think it's a weird mix of JUnit 4 and JUnit 5 that causes the issue:
Spring Boot 2.2.6 (I've used start.spring.io to generate a sample application) uses JUnit 5.
On the other hand, your test is written with @RunWith, which means that it uses JUnit 4 under the hood.
The dependency:

<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <scope>test</scope>
</dependency>

also seems suspicious: spring-boot-starter-test already contains all the required JUnit 5 dependencies, so you don't need this one.
Now, in terms of resolution, check out the default test that comes with this sample application (the one you've described in the question). Chances are that it uses JUnit 5 itself, so you'd better migrate your test to JUnit 5 and rerun; a migrated sketch follows below.
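For illustration, here is roughly what the first test looks like after migrating to JUnit 5 (class and bean names taken from your question). With Spring Boot 2.2+, @SpringBootTest already registers the Spring extension, so @RunWith goes away and only the jupiter imports remain:

import static org.assertj.core.api.Assertions.assertThat;

import org.junit.jupiter.api.Test;
import org.mockito.Mockito;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.mock.mockito.MockBean;

@SpringBootTest
public class BranchServiceUnitTest {

    @Autowired
    private BranchService branchService;

    @MockBean
    private BranchRepository branchRepository;

    @Test
    public void testAddNewBranch() {
        Branch testBranch = new Branch();
        testBranch.setCity("TestCity");
        Mockito.when(branchRepository.save(testBranch)).thenReturn(testBranch);

        Branch addedBranch = branchService.addBranch(testBranch);

        assertThat(addedBranch.getCity()).isEqualTo("TestCity");
    }
}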
This looks spurious.
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <includes>
            <include>**/*Test*.java</include>
        </includes>
    </configuration>
</plugin>
What do you hope to gain from using <include>**/*Test*.java</include>? I'm pretty certain the trailing * does not mean zero-or-more characters. It's 1 or more. Documentation
Are there specific classes in your test directory that you want to exclude? If not, I would remove the whole plugin. Surefire is already declared in Maven's implicit parent POM, with sensible defaults that will cover all of your tests. Declaring it yourself is both needlessly verbose and has actively broken something which works out of the box.
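For reference, Surefire 2.22.x ships with default includes roughly equivalent to the following; note that an explicit <includes> block replaces these defaults rather than adding to them:

<!-- Surefire's built-in defaults; there is no need to declare them yourself. -->
<includes>
    <include>**/Test*.java</include>
    <include>**/*Test.java</include>
    <include>**/*Tests.java</include>
    <include>**/*TestCase.java</include>
</includes>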
If your tests run successfully when run alone but are not picked up during
mvn test
chances are that you are using an older JUnit package in your tests. This weird scenario usually happens when there's a mix-up in the JUnit versions.
If JUnit 5 is being used, please ensure that the package in the imports is
import org.junit.jupiter.api
I am trying to convert my JSON file to Parquet format.
Following is my pom file.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.mypackage</groupId>
<artifactId>JSONToParquet</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<repositories>
<repository>
<id>wso2</id>
<url>http://dist.wso2.org/maven2/</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>org.kitesdk</groupId>
<artifactId>kite-data-core</artifactId>
<version>1.1.0</version>
</dependency>
<dependency>
<groupId>org.kitesdk</groupId>
<artifactId>kite-morphlines-all</artifactId>
<version>1.0.0</version> <!-- or whatever the latest version is -->
<type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/ua_parser/ua-parser -->
<dependency>
<groupId>ua_parser</groupId>
<artifactId>ua-parser</artifactId>
<version>1.3.0</version>
<type>pom</type>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
</project>
Following is the code for conversion :
Schema jsonSchema = JsonUtil.inferSchema(inputstream, "Movie", 10);
try (JSONFileReader<Movie> reader = new JSONFileReader<>(
inputstream, jsonSchema, Movie.class)) {
reader.initialize();
ParquetWriter parquetWriter
= new AvroParquetWriter(outputPath, jsonSchema, compressionCodecName, ParquetWriter.DEFAULT_BLOCK_SIZE, ParquetWriter.DEFAULT_PAGE_SIZE);
for (Movie record : reader) {
parquetWriter.write(record);
}
In the above code Movie is my POJO class.
When I run the program I am facing the following exception :
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/RecordReader
at com.mypackage.jsontoparquet.JsonToParquet.main(JsonToParquet.java:34)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.RecordReader
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more
I am using JDK 8.
I don't have any background in Hadoop, so I am unable to understand its root cause.
What is the issue?
Based on the Kite SDK documentation, JSONFileReader, ParquetWriter and AvroParquetWriter need Hadoop libraries to work, so Hadoop dependencies have to be added to your pom. You need at least the dependencies below; add them to your pom.xml:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.6.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>2.6.0</version>
</dependency>
Your Kite setup is missing the Hadoop dependencies:
there are some cases where you may have to provide the relevant Hadoop component dependencies yourself, and Kite has grouping dependencies for this purpose.
For Hadoop 2 (the default), add to your pom:
<dependency>
    <groupId>org.kitesdk</groupId>
    <artifactId>kite-hadoop2-dependencies</artifactId>
    <version>1.0.0</version>
    <type>pom</type>
    <scope>compile</scope>
</dependency>
I am writing code for a spring program, and I need to convert a File object to a MultipartFile object. I am trying to do this by using the MockMultipartFile class:
FileInputStream input = new FileInputStream(toUpload);
MultipartFile multipartFile = new MockMultipartFile("file",
toUpload.getName(), "text/json", IOUtils.toByteArray(input));
However, I keep getting the error MockMultipartFile cannot be resolved to a type. Why is this? If it helps, I'm using Maven, so it may be a dependency error that I don't know about.
I also tried to do this using CommonsMultipartFile. That code is here (toUpload is the File object I am trying to convert):
DiskFileItem fileItem = new DiskFileItem("file", "text/plain", false, toUpload.getName(), (int) toUpload.length() , toUpload.getParentFile());
fileItem.getOutputStream();
MultipartFile multipartFile = new CommonsMultipartFile(fileItem);
When I try this method, I get the error The constructor CommonsMultipartFile(DiskFileItem) is undefined. Why is that?
EDIT: Here is our pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework</groupId>
<artifactId>gs-uploading-files</artifactId>
<version>0.1.0</version>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.3.5.RELEASE</version>
</parent>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-thymeleaf</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-devtools</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>commons-fileupload</groupId>
<artifactId>commons-fileupload</artifactId>
<version>1.3.1</version>
</dependency>
</dependencies>
<properties>
<java.version>1.8</java.version>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
To use Spring's MockMultipartFile you need to add spring-test as a compile dependency (which would be unusual, since it is generally used as a test dependency):
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
</dependency>
I was trying to convert a File to a MultipartFile; it can be converted successfully using the code below:
File uploadFile = new File("src/main/resources/static/images/icon/icon4.jpg");
FileInputStream is = new FileInputStream(uploadFile);
return new MockMultipartFile("icon4.jpg", "icon4.jpg", "image/jpeg", IOUtils.toByteArray(is));
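For reference, a self-contained version of that conversion with the imports the snippets above assume (Spring's spring-test MockMultipartFile plus commons-io's IOUtils); the helper name and content type are just placeholders:

import java.io.File;
import java.io.FileInputStream;

import org.apache.commons.io.IOUtils;
import org.springframework.mock.web.MockMultipartFile;
import org.springframework.web.multipart.MultipartFile;

public class MultipartFileConverter {

    // Hypothetical helper: wraps a File's bytes in a MockMultipartFile.
    public static MultipartFile toMultipartFile(File file) throws Exception {
        try (FileInputStream input = new FileInputStream(file)) {
            return new MockMultipartFile("file", file.getName(),
                    "application/octet-stream", IOUtils.toByteArray(input));
        }
    }
}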
I must admit this JUnit test has been a nightmare so far; I have invested some days searching the Internet, but everything I tried did not work.
The error I get is the following:
SCHWERWIEGEND: Exception while invoking class org.glassfish.persistence.jpa.JPADeployer prepare method
java.lang.RuntimeException: Invalid resource : jdbc/sample__pm
The jdbc/sample__pm resource does not exist, but the jdbc/sample in the persistence.xml does.
As I said, I have already searched for this problem, but I could not find a solution.
I guess the problem is with the pom.xml, but I don't know how to fix it.
Thanks for any help.
The pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.maggioni</groupId>
<artifactId>SampleApp</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>war</packaging>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>eclipselink</artifactId>
<version>2.5.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>org.eclipse.persistence.jpa.modelgen.processor</artifactId>
<version>2.5.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>javax</groupId>
<artifactId>javaee-api</artifactId>
<version>7.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.glassfish.main.extras</groupId>
<artifactId>glassfish-embedded-all</artifactId>
<version>4.0</version>
</dependency>
<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derbyclient</artifactId>
<version>10.10.2.0</version>
</dependency>
</dependencies>
<build>
<finalName>SampleApp</finalName>
</build>
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<failOnMissingWebXml>false</failOnMissingWebXml>
</properties>
</project>
The source can also be found here.
I had the same problem, and after losing lots of time I got it right. I can now run my tests with my connection pool.
This problem is caused by a series of small Maven code-generation bugs in NetBeans.
Below is a series of steps for you to check.
1 - Your pom.xml must have the path, plugin and dependencies configured for glassfish-embedded-static-shell, as follows:
Add to the properties at the top of your pom the path to your GlassFish installation folder (here is mine):
<properties>
    <endorsed.dir>${project.build.directory}/endorsed</endorsed.dir>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <glassfish.embedded-static-shell.jar>C:/Program Files/glassfish-4.0/glassfish/lib/embedded/glassfish-embedded-static-shell.jar</glassfish.embedded-static-shell.jar>
</properties>
Add/check the plugin definition below (substitute the mysql dependency with your own JDBC driver, Derby in your case):
<build>
    ...
    <plugins>
        ...
        <plugin>
            <groupId>org.glassfish.embedded</groupId>
            <artifactId>maven-embedded-glassfish-plugin</artifactId>
            <version>4.0</version>
            <dependencies>
                <dependency>
                    <groupId>org.glassfish.main.common</groupId>
                    <artifactId>simple-glassfish-api</artifactId>
                    <version>4.1</version>
                </dependency>
                <dependency>
                    <groupId>org.glassfish.main.extras</groupId>
                    <artifactId>glassfish-embedded-static-shell</artifactId>
                    <version>4.1</version>
                    <scope>system</scope>
                    <systemPath>${glassfish.embedded-static-shell.jar}</systemPath>
                </dependency>
                <dependency>
                    <groupId>mysql</groupId>
                    <artifactId>mysql-connector-java</artifactId>
                    <version>5.1.28</version>
                </dependency>
            </dependencies>
            <configuration>
                <app>target/${project.artifactId}-${project.version}</app>
                <port>8282</port>
                <contextRoot>${project.artifactId}</contextRoot>
                <foo>bar</foo>
            </configuration>
        </plugin>
        ...
    </plugins>
    ...
</build>
2 - Your glassfish-resources.xml: it's silly, but the jndi-name of your jdbc resource must have the prefix java:app/, so add it as in the sample below (just the resource; the connection pool can have any name).
Note: the name of your jdbc-resource as registered in GlassFish will not have the java:app prefix, e.g. my jdbc-resource in GlassFish is just ShrewdPCPool.
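A minimal sketch of such a glassfish-resources.xml entry, assuming the jdbc/sample resource from the question and an illustrative Derby connection pool (adjust the pool name, datasource class and connection properties to your setup):

<!-- glassfish-resources.xml (sketch): only the jdbc-resource carries the java:app/ prefix. -->
<resources>
    <jdbc-connection-pool name="SamplePool"
                          res-type="javax.sql.DataSource"
                          datasource-classname="org.apache.derby.jdbc.ClientDataSource"/>
    <jdbc-resource jndi-name="java:app/jdbc/sample" pool-name="SamplePool"/>
</resources>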
3 - Third, and the silliest: when deploying the embedded GlassFish, the auto-generated NetBeans script doesn't include /src/main/resources/setup in the classpath, so the simple solution is to copy your glassfish-resources.xml to /src/main/resources/META-INF/, or, if you are patient enough, change the script.
Don't forget to clean, and then build with dependencies, before you run your tests!