Inject between JARs - Java

I have one JAR, the Helper project, which contains a process manager that controls the execution of threads and related resources; it receives an interface whose implementation performs the work when fired.
I have another JAR, the Service project, which implements the interface from Helper and injects a factory defined in the Service project itself.
The error occurs when the line
processor = statementProcFactory.getProduct(modelMessage.getProcessType());
is executed via Helper. I tried calling processMessage of StatementToProcessDequeueable directly from the Application class of the Service, mocking the message, and it worked without problems. The error occurs only when processMessage is called by SQSQueueRunnable.
The structure of Helper:
├── src
│   ├── main
│   │   ├── java
│   │   │   └── br
│   │   │       └── com
│   │   │           └── aws
│   │   │               └── sqs
│   │   │                   ├── ISQSDequeueable.java
│   │   │                   ├── SQSDequeueableException.java
│   │   │                   ├── SQSQueueManager.java
│   │   │                   └── SQSQueueRunnable.java
│   │   └── resources
│   │       └── META-INF
│   │           └── beans.xml
And the structure of the Service:
├── src
│   ├── main
│   │   ├── java
│   │   │   └── br
│   │   │       └── com
│   │   │           └── reconciliation
│   │   │               ├── service
│   │   │               │   ├── Application.java
│   │   │               │   ├── MainService.java
│   │   │               │   └── statement
│   │   │               │       └── processor
│   │   │               │           ├── IStatementProcessor.java
│   │   │               │           ├── processors
│   │   │               │           │   ├── BanriProcessor.java
│   │   │               │           │   ├── CieloProcessor.java
│   │   │               │           │   ├── RedeCreditProcessor.java
│   │   │               │           │   └── RedeDebitProcessor.java
│   │   │               │           ├── StatementProcessorFactory.java
│   │   │               │           ├── StatementProcessorType.java
│   │   │               │           └── StatementProcessorTypeLiteral.java
│   │   │               └── sqsdequeueable
│   │   │                   ├── StatementToProcessDequeueable.java
│   │   │                   └── StatementToProcessModel.java
│   │   └── resources
│   │       ├── aws.properties
│   │       └── META-INF
│   │           └── beans.xml
The main method of MainService:
public static void main(String[] args) {
    timer = new Timer();
    timer.schedule(new EchoTask(), 0, 1000);

    Weld weld = new Weld();
    WeldContainer container = weld.initialize();
    Application application = container.instance().select(Application.class).get();
    application.run();
    weld.shutdown();
}
Class Application:
@Singleton
public class Application {

    @Inject
    private SQSQueueManager queueManager;

    @Inject
    private Provider<StatementToProcessDequeueable> statementProcDequeueProvider;

    public void run() {
        queueManager.addSQSDequeueable(statementProcDequeueProvider.get());
    }
}
The addSQSDequeueable method of SQSQueueManager:
public void addSQSDequeueable(ISQSDequeueable dequeueable) {
    SQSQueueRunnable runnable = runnableProvider.get();
    runnable.setDequeueable(dequeueable);
    Thread th = new Thread(runnable);
    dictionaryDequeueables.put(dequeueable, runnable);
    dequeueable.startRunning();
    th.start();
}
The run method of SQSQueueRunnable:
public void run() {
    ReceiveMessageRequest request = new ReceiveMessageRequest(dequeueable.getQueue());
    do {
        List<Message> messages = sqsAmazon.receiveMessage(request).getMessages();
        for (Message message : messages) {
            processed = true;
            Thread th = new Thread(new Runnable() {
                private Message message;

                public void run() {
                    try {
                        dequeueable.processMessage(message);
                    } catch (SQSDequeueableException e) {
                        // TODO Move the message to an error queue
                        e.printStackTrace();
                    }
                    sqsAmazon.deleteMessage(new DeleteMessageRequest(
                            dequeueable.getQueue(), message.getReceiptHandle()));
                }

                public Runnable setMessage(Message message) {
                    this.message = message;
                    return this;
                }
            }.setMessage(message));
            th.run();
        }
        try {
            Thread.sleep(this.getSleepThread());
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        if (isRunning == false)
            dequeueable.stopRunning();
    } while (isRunning);
}
The processMessage method of StatementToProcessDequeueable, which implements the ISQSDequeueable interface:
public void processMessage(Message message) throws SQSDequeueableException {
    logger.debug("Processando mensagem");
    StatementToProcessModel modelMessage = this.generateModel(message.getBody());
    if (modelMessage != null) {
        processor = statementProcFactory.getProduct(modelMessage.getProcessType());
        logger.debug("Processor gerado: " + processor);
        if (processor != null) {
            Statement statement = new Statement();
            Retail retail = repoRetail.getRetailByID(modelMessage.getRetailId());
            List<OperationCard> operations = processor.process(
                    statement.getInputStream(), retail);
            for (OperationCard operation : operations) {
            }
        } else {
            throw new SQSDequeueableException(String.format(
                    "O tipo de processador %s está inválido", processor));
        }
    } else {
        logger.debug("Atributos inválidos da mensagem");
        throw new SQSDequeueableException(
                "Mensagem não possuí todos atributos necessários");
    }
}
Class StatementProcessorFactory:
@Singleton
public class StatementProcessorFactory {

    @Inject
    @Any
    private Instance<IStatementProcessor> products;

    public IStatementProcessor getProduct(String type) {
        StatementProcessorTypeLiteral literal = new StatementProcessorTypeLiteral(type);
        Instance<IStatementProcessor> typeProducts = products.select(literal);
        return typeProducts.get();
    }
}
Class StatementProcessorTypeLiteral:
@SuppressWarnings("serial")
public class StatementProcessorTypeLiteral extends
        AnnotationLiteral<StatementProcessorType> implements StatementProcessorType {

    private String type;

    public StatementProcessorTypeLiteral(String type) {
        this.type = type;
    }

    public String value() {
        return type;
    }
}
The StatementProcessorType qualifier:
@Qualifier
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.FIELD, ElementType.TYPE})
public @interface StatementProcessorType {
    public String value();
}
A class that implements IStatementProcessor and carries the StatementProcessorType qualifier:
@StatementProcessorType("Banri")
@Stateless
public class BanriProcessor implements IStatementProcessor {

    public List<OperationCard> process(InputStream statementFile, Retail retail) {
        // TODO Auto-generated method stub
        return null;
    }
}
Error:
Exception in thread "Thread-2" org.jboss.weld.exceptions.UnsatisfiedResolutionException: WELD-001308: Unable to resolve any beans for Types: [interface br.com.agilebox.gestorfiscal.reconciliation.service.statement.processor.IStatementProcessor]; Bindings: [QualifierInstance{annotationClass=interface br.com.agilebox.gestorfiscal.reconciliation.service.statement.processor.StatementProcessorType, values={[BackedAnnotatedMethod] public abstract br.com.agilebox.gestorfiscal.reconciliation.service.statement.processor.StatementProcessorType.value()=Banri}, hashCode=-1507802905}, QualifierInstance{annotationClass=interface javax.enterprise.inject.Any, values={}, hashCode=-2093968819}]
at org.jboss.weld.manager.BeanManagerImpl.getBean(BeanManagerImpl.java:854)
at org.jboss.weld.bean.builtin.InstanceImpl.get(InstanceImpl.java:75)
at br.com.agilebox.gestorfiscal.reconciliation.service.statement.processor.StatementProcessorFactory.getProduct(StatementProcessorFactory.java:19)
at br.com.agilebox.gestorfiscal.reconciliation.sqsdequeueable.StatementToProcessDequeueable.processMessage(StatementToProcessDequeueable.java:31)
at br.com.agilebox.gestorfiscal.aws.sqs.SQSQueueRunnable.run(SQSQueueRunnable.java:64)
at java.lang.Thread.run(Thread.java:745)


IntelliJ is not recognizing annotation-generated class files

I have a custom annotation in Java, that uses ByteBuddy to generate a new class based on the annotated one. Since ByteBuddy is already compiling the class, I am outputting the bytecode directly, rather than source.
My gradle build compiles fine, with references to the generated class.
The problem is that in IntelliJ, the editor does not recognize the classes that are output directly. It does recognize classes generated as source.
My Annotation sub-project
build.gradle
plugins {
    id 'java-library'
    id 'maven-publish'
}

group 'org.example'
version 'unspecified'

repositories {
    mavenCentral()
}

dependencies {
    implementation 'net.bytebuddy:byte-buddy:1.12.23'
}
MyAnnotation.java
package org.example.annotation;

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Target(ElementType.TYPE)
@Retention(RetentionPolicy.CLASS)
public @interface MyAnnotation {}
MyAnnotationProcessor.java
package org.example.annotation.processor;

import net.bytebuddy.ByteBuddy;

import javax.annotation.processing.*;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.TypeElement;
import java.io.IOException;
import java.util.Set;

@SupportedAnnotationTypes({"org.example.annotation.MyAnnotation"})
@SupportedSourceVersion(SourceVersion.RELEASE_11)
public class MyAnnotationProcessor extends AbstractProcessor {

    Filer filer;

    @Override
    public synchronized void init(ProcessingEnvironment processingEnv) {
        super.init(processingEnv);
        filer = processingEnv.getFiler();
    }

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        annotations.forEach(a -> {
            try {
                // Generate a java source file
                var className = a.getSimpleName() + "_source";
                var fooFile = filer.createSourceFile(className);
                var writer = fooFile.openWriter();
                writer.write("package org.example.annotation; public class " + className + " {}");
                writer.close();

                // Generate a java class file
                var otherClassName = a.getSimpleName() + "_bytecode";
                var bb = new ByteBuddy();
                var dtoClass = bb.subclass(Object.class)
                        .name(otherClassName)
                        .make();
                var javaFileObject = filer.createClassFile(otherClassName);
                var dtoOutStr = javaFileObject.openOutputStream();
                dtoOutStr.write(dtoClass.getBytes());
                dtoOutStr.close();
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        });
        return false;
    }
}
Notice that I am outputting one class called MyAnnotation_source and one called MyAnnotation_bytecode
My sub-project that uses the annotation
build.gradle (NOTE that core is the name of the annotation sub-project)
plugins {
    id 'java'
}

group 'org.example'
version 'unspecified'

repositories {
    mavenCentral()
}

dependencies {
    implementation project(':core')
    annotationProcessor project(':core')
}
UserTest.java
import org.example.annotation.MyAnnotation;
import org.example.annotation.MyAnnotation_source;
import org.example.annotation.MyAnnotation_bytecode; // Cannot resolve symbol 'MyAnnotation_bytecode'

@MyAnnotation
public class UserTest {
    public static void main(String[] args) {
        MyAnnotation_source foo = new MyAnnotation_source();
        MyAnnotation_bytecode other = new MyAnnotation_bytecode(); // Cannot resolve symbol 'MyAnnotation_bytecode'
    }
}
The build directory ends up looking like this:
build/
├─ classes
│  └─ java
│     └─ main
│        ├─ MyAnnotation_bytecode.class   <-- Generated
│        ├─ UserTest.class
│        └─ org
│           └─ example
│              └─ annotation
│                 └─ MyAnnotation_source.class   <-- Compiled
└─ generated
   └─ sources
      └─ annotationProcessor
         └─ java
            └─ main
               └─ MyAnnotation_source.java   <-- Generated
I have already tried various configurations of the annotation processor settings in IntelliJ, but I think the answer is somewhere else.

mapstruct - Use mapper from another package

I use MapStruct in my projects and it works fine for the straightforward case (all mappers in one package).
Now I have the requirement to move one mapper to another package, but this doesn't work well.
working package structure (1):
de.zinnchen
├── dto
│   └── Car.java
├── entity
│   └── CarEntity.java
└── mapper
    └── a
        ├── CarMapper.java
        └── DateMapper.java
NOT working package structure (2):
de.zinnchen
├── dto
│   └── Car.java
├── entity
│   └── CarEntity.java
└── mapper
    ├── a
    │   └── CarMapper.java
    └── b
        └── DateMapper.java
my java files:
package de.zinnchen.dto;

import java.time.LocalDateTime;

public class Car {
    private String manufacturer;
    private String model;
    private String color;
    private LocalDateTime productionDate;
    ...

package de.zinnchen.entity;

import java.sql.Timestamp;
import java.time.Instant;
import java.time.LocalDateTime;

public class CarEntity {
    private String manufacturer;
    private String model;
    private String color;
    private Instant productionDate;
    ...

package de.zinnchen.mapper.a;

import de.zinnchen.dto.Car;
import de.zinnchen.entity.CarEntity;
import de.zinnchen.mapper.b.DateMapper;
import org.mapstruct.Mapper;

@Mapper(
    uses = DateMapper.class
)
public interface CarMapper {
    Car asDto(CarEntity entity);
    CarEntity asEntity(Car car);
}

package de.zinnchen.mapper.b;

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;

public class DateMapper {

    LocalDateTime instantToLocalDateTime(Instant instant) {
        return instant
                .atZone(ZoneId.of("UTC"))
                .toLocalDateTime();
    }

    Instant LocalDateTimeToInstant(LocalDateTime localDateTime) {
        return localDateTime.toInstant(ZoneOffset.UTC);
    }
}
Once I try to compile the variant with mappers in different packages I get the following error message:
Can't map property "Instant productionDate" to "LocalDateTime productionDate". Consider to declare/implement a mapping method: "LocalDateTime map(Instant value)".
Can you please help me solve this problem?
Edit
Resulting CarMapperImpl.java of package structure 1:
package de.zinnchen.mapper.a;

import de.zinnchen.dto.Car;
import de.zinnchen.entity.CarEntity;
import javax.annotation.processing.Generated;

@Generated(
    value = "org.mapstruct.ap.MappingProcessor",
    date = "2021-01-06T09:36:43+0100",
    comments = "version: 1.4.1.Final, compiler: javac, environment: Java 11.0.9.1 (AdoptOpenJDK)"
)
public class CarMapperImpl implements CarMapper {

    private final DateMapper dateMapper = new DateMapper();

    @Override
    public Car asDto(CarEntity entity) {
        if ( entity == null ) {
            return null;
        }

        Car car = new Car();

        car.setManufacturer( entity.getManufacturer() );
        car.setModel( entity.getModel() );
        car.setColor( entity.getColor() );
        car.setProductionDate( dateMapper.instantToLocalDateTime( entity.getProductionDate() ) );

        return car;
    }

    @Override
    public CarEntity asEntity(Car car) {
        if ( car == null ) {
            return null;
        }

        CarEntity carEntity = new CarEntity();

        carEntity.setManufacturer( car.getManufacturer() );
        carEntity.setModel( car.getModel() );
        carEntity.setColor( car.getColor() );
        carEntity.setProductionDate( dateMapper.LocalDateTimeToInstant( car.getProductionDate() ) );

        return carEntity;
    }
}
The reason it isn't working is that the methods in DateMapper are package-private, so they are not visible from other packages.
If you make the methods public, it will work.
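A minimal sketch of the adjusted DateMapper, assuming the packages and method names from the question (only the visibility of the two methods changes):
package de.zinnchen.mapper.b;

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;

public class DateMapper {

    // public, so the generated CarMapperImpl in de.zinnchen.mapper.a can call it
    public LocalDateTime instantToLocalDateTime(Instant instant) {
        return instant
                .atZone(ZoneId.of("UTC"))
                .toLocalDateTime();
    }

    // public for the same reason; the method name is kept as in the question
    public Instant LocalDateTimeToInstant(LocalDateTime localDateTime) {
        return localDateTime.toInstant(ZoneOffset.UTC);
    }
}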

How to implement a Rest endpoint along a Websocket endpoint

I want to expose an additional REST endpoint alongside a WebSocket endpoint.
The WebSocket endpoint is already implemented and works, but I have trouble hitting the REST endpoint.
Any idea what I'm missing here?
I expect a request to localhost:8080/hello to return a text/plain response of hello.
Btw, I'm using Quarkus if that matters.
Here is my project structure
│   │   ├── java
│   │   │   └── org
│   │   │       └── company
│   │   │           ├── chat
│   │   │           │   ├── boundary
│   │   │           │   ├── control
│   │   │           │   └── entity
│   │   │           ├── JAXRSConfiguration.java // Rest endpoint ???
│   │   │           └── websockets
│   │   │               └── ChatSocket.java // Websocket server endpoint (works)
│   │   └── resources
│   │       ├── application.properties
│   │       ├── application.properties.example
│   │       └── META-INF
│   │           └── resources
│   │               ├── index.html
JAXRSConfiguration.java
package org.company;

import javax.ws.rs.ApplicationPath;
import javax.ws.rs.GET;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Application;
import javax.ws.rs.core.MediaType;

@ApplicationPath("hello")
public class JAXRSConfiguration extends Application {

    @GET
    @Produces(MediaType.TEXT_PLAIN)
    public String hello() {
        return "hello";
    }
}
ChatSocket.java
package org.company.websockets;

import com.google.gson.Gson;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
import java.time.LocalDateTime;
import java.util.Dictionary;
import java.util.Enumeration;
import java.util.Hashtable;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;
import javax.websocket.OnClose;
import javax.websocket.OnError;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.PathParam;
import javax.websocket.server.ServerEndpoint;
import org.company.chat.boundary.ChatResource;
import org.company.chat.entity.Chat;
import org.company.chat.entity.ChatGroup;
import org.company.chat.entity.ChatMessage;
import org.company.time.Time;
import org.jboss.logging.Logger;

@ServerEndpoint("/chat/websocket/{username}/{chatRoom}")
@ApplicationScoped
public class ChatSocket {

    @Inject
    Time time;

    @Inject
    ChatResource chatResource;

    // Chatroom, user, session dictionary
    final Dictionary<String, Map<String, Session>> chatRooms = new Hashtable<>();

    @OnOpen
    public void onOpen(final Session session, @PathParam("username") final String username, @PathParam("chatRoom") final String chatRoom) {
        // ...
    }

    @OnClose
    public void onClose(final Session session, @PathParam("username") final String username, @PathParam("chatRoom") final String chatRoom)
            throws Exception {
        // ...
    }

    @OnError
    public void onError(final Session session, @PathParam("username") final String username, @PathParam("chatRoom") final String chatRoom,
            final Throwable throwable) throws Exception {
        // ...
    }

    @OnMessage
    public void onMessage(String json, @PathParam("username") final String username, @PathParam("chatRoom") final String chatRoom)
            throws Exception {
        // ...
    }

    private void broadcast(final String event, final String chatRoom) {
        // ...
    }
}
The solution was simply to add an additional @Path annotation.
I changed JAXRSConfiguration.java as shown below.
It works now when I call the localhost:8080/api/hello endpoint.
@ApplicationPath("api") // Changed the base route to 'api'
@Path("hello") // Added a new path
public class JAXRSConfiguration extends Application {

    @GET
    @Produces(MediaType.TEXT_PLAIN)
    public String hello() {
        return "hello";
    }
}
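If you prefer to keep the Application subclass purely as configuration, an equivalent layout is possible. This is a sketch, not part of the original answer; the class name HelloResource is hypothetical, and it still serves localhost:8080/api/hello:
// Imports (javax.ws.rs.*) as in the question; each class goes in its own file.
import javax.ws.rs.ApplicationPath;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Application;
import javax.ws.rs.core.MediaType;

// JAXRSConfiguration.java - configuration only, sets the base path to /api
@ApplicationPath("api")
public class JAXRSConfiguration extends Application {
}

// HelloResource.java - the actual resource, served at /api/hello
@Path("hello")
public class HelloResource {

    @GET
    @Produces(MediaType.TEXT_PLAIN)
    public String hello() {
        return "hello";
    }
}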

Java move sub directory with files and structure to parent directory

I am trying to move files from a sub-directory, along with their structure, to a parent directory, and I am not able to accomplish this using Files.move(). To illustrate the issue, please see the directory structure below.
$ tree
.
└── b
    ├── c
    │   ├── cfile.gtxgt
    │   └── d
    │       ├── dfile.txt
    │       └── e
    └── x
        └── y
            └── z
                ├── 2.txt
                └── p
                    ├── file1.txt
                    └── q
                        ├── file
                        ├── file2.txt
                        └── r
                            └── 123.txt
I want to emulate the below move command via Java.
$ mv b/x/y/z/* b/c
b/x/y/z/2.txt -> b/c/2.txt
b/x/y/z/p -> b/c/p
And the output should be something similar to
$ tree
.
└── b
    ├── c
    │   ├── 2.txt
    │   ├── cfile.gtxgt
    │   ├── d
    │   │   ├── dfile.txt
    │   │   └── e
    │   └── p
    │       ├── file1.txt
    │       └── q
    │           ├── file
    │           ├── file2.txt
    │           └── r
    │               └── 123.txt
    └── x
        └── y
            └── z
In this move all the files and directories under directory z have been moved to c.
I have tried to do this:
public static void main(String[] args) throws IOException {
    String aPath = "/tmp/test/a/";
    String relativePathTomove = "b/x/y/z/";
    String relativePathToMoveTo = "b/c";
    Files.move(Paths.get(aPath, relativePathTomove), Paths.get(aPath, relativePathToMoveTo), StandardCopyOption.REPLACE_EXISTING);
}
However, this causes a java.nio.file.DirectoryNotEmptyException: /tmp/test/a/b/c to be thrown, and if the REPLACE_EXISTING option is taken out the code throws a java.nio.file.FileAlreadyExistsException: /tmp/test/a/b/c.
This question has an answer that uses a recursive function to solve the problem. But in my case it involves further complexity, as I also need to re-create the sub-directory structure in the new location.
I have not tried the commons-io utility method org.apache.commons.io.FileUtils#moveDirectoryToDirectory, because that code seems to first copy the files and then delete them from the original location. In my case the files are huge, so this is not a preferred option.
How can I achieve the move functionality in Java without resorting to copying? Is moving each file individually my only option?
TL;DR: How can I emulate the mv functionality in Java for moving a sub-directory, with its files and structure, to a parent directory?
I ended up doing this:
Create a FileVisitor Implementation like so:
package com.test.files;

import org.apache.log4j.Logger;

import javax.validation.constraints.NotNull;
import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.FileVisitor;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.Objects;

import static java.nio.file.FileVisitResult.TERMINATE;

public class MoveFileVisitor implements FileVisitor<Path> {

    private static final Logger LOGGER = Logger.getLogger(MoveFileVisitor.class);

    private final Path target;
    private final Path source;

    public MoveFileVisitor(@NotNull Path source, @NotNull Path target) {
        this.target = Objects.requireNonNull(target);
        this.source = Objects.requireNonNull(source);
    }

    @Override
    public FileVisitResult preVisitDirectory(Path dir, BasicFileAttributes attrs) throws IOException {
        // Recreate the directory structure under the target before moving files into it.
        Path relativePath = source.relativize(dir);
        Path finalPath = target.resolve(relativePath);
        Files.createDirectories(finalPath);
        return FileVisitResult.CONTINUE;
    }

    @Override
    public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) throws IOException {
        Path relativePath = source.relativize(file);
        Path finalLocation = target.resolve(relativePath);
        Files.move(file, finalLocation);
        return FileVisitResult.CONTINUE;
    }

    @Override
    public FileVisitResult visitFileFailed(Path file, IOException exc) {
        LOGGER.error("Failed to visit file during move" + file.toAbsolutePath(), exc);
        return TERMINATE;
    }

    @Override
    public FileVisitResult postVisitDirectory(Path dir, IOException exc) throws IOException {
        // The directory is empty once its files have been moved, so it can be deleted.
        Files.delete(dir);
        return FileVisitResult.CONTINUE;
    }
}
And then walking the path with this Visitor like so:
String source = "/temp/test/a/b/x/y/z";
String target = "/temp/test/a/b/c";
MoveFileVisitor visitor = new MoveFileVisitor(Paths.get(source), Paths.get(target));
Files.walkFileTree(Paths.get(source), visitor);
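For completeness, a self-contained driver might look like the sketch below (the paths are the example paths from the snippet above; MoveDemo is a hypothetical class name):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class MoveDemo {

    public static void main(String[] args) throws IOException {
        Path source = Paths.get("/temp/test/a/b/x/y/z");
        Path target = Paths.get("/temp/test/a/b/c");

        // MoveFileVisitor recreates each directory under the target, moves every
        // file, and deletes the emptied source directories on the way back up.
        Files.walkFileTree(source, new MoveFileVisitor(source, target));
    }
}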

When CDI injection into POJO should work? (GlassFish v3)

When I inject EJB 3.1 beans into a POJO that is itself created via @Inject, injection works. When I construct the POJO on my own, it doesn't (GlassFish v3). Is this correct behavior?
My classes (in EJB module):
@Singleton
@LocalBean
@Startup
@Named
public class NewSingletonBean {

    @PostConstruct
    public void init() {
        System.out.println("NewSingletonBean INIT");
    }
}
_
public class MyPOJO {

    @Inject NewSingletonBean newSingletonBean;

    public void sth() {
        System.out.println("EJB injected into POJO: " + (newSingletonBean != null));
    }
}
This does not work:
@Singleton
@LocalBean
@Startup
@DependsOn(value = "NewSingletonBean")
public class NewSingletonBean2 {

    @Inject NewSingletonBean newSingletonBean;

    @PostConstruct
    public void init() {
        System.out.println("NewSingletonBean2 INIT");
        System.out.println("EJB injected into EJB: " + (newSingletonBean != null));
        MyPOJO p = new MyPOJO();
        p.sth();
    }
}
_
And this works OK:
@Singleton
@LocalBean
@Startup
@DependsOn(value = "NewSingletonBean")
public class NewSingletonBean2 {

    @Inject NewSingletonBean newSingletonBean;
    @Inject MyPOJO p;

    @PostConstruct
    public void init() {
        System.out.println("NewSingletonBean2 INIT");
        System.out.println("EJB injected into EJB: " + (newSingletonBean != null));
        p.sth();
    }
}
I'm using NetBeans 7.0.1.
dist directory structure:
│   CDITest.ear
│
└───gfdeploy
    └───CDITest
        ├───CDITest-ejb_jar
        │   │   .netbeans_automatic_build
        │   │   .netbeans_update_resources
        │   │
        │   ├───META-INF
        │   │       beans.xml
        │   │       MANIFEST.MF
        │   │
        │   └───tries
        │           MyPOJO.class
        │           NewSingletonBean.class
        │           NewSingletonBean2.class
        │
        ├───CDITest-war_war
        │   │   index.jsp
        │   │
        │   ├───META-INF
        │   │       MANIFEST.MF
        │   │
        │   └───WEB-INF
        │       └───classes
        │               .netbeans_automatic_build
        │               .netbeans_update_resources
        │
        └───META-INF
                MANIFEST.MF
Unpacked EAR structure:
│   CDITest-ejb.jar
│   CDITest-war.war
│
└───META-INF
        MANIFEST.MF
Unpacked EJB module jar structure:
├───META-INF
│       beans.xml
│       MANIFEST.MF
│
└───tries
        MyPOJO.class
        NewSingletonBean.class
        NewSingletonBean2.class
Is it correct behavior?
The following part might be an answer to your question.
As per the CDI 1.0 specification, section 3.7 ("Bean constructors"):
When the container instantiates a bean class, it calls the bean constructor. The bean constructor is a constructor of the bean class. The application may call bean constructors directly. However, if the application directly instantiates the bean, no parameters are passed to the constructor by the container; the returned object is not bound to any context; no dependencies are injected by the container; and the lifecycle of the new instance is not managed by the container.
HTH!
It is correct behaviour, because DI works only for container-managed beans, not for the ones you create yourself.
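If a container-managed MyPOJO is needed from code that cannot use field injection, a programmatic lookup through the BeanManager is one option. A minimal sketch, assuming CDI 1.0 as on GlassFish v3 (the helper method name lookupMyPojo is hypothetical):
// Needs javax.enterprise.inject.spi.Bean, javax.enterprise.inject.spi.BeanManager
// and javax.enterprise.context.spi.CreationalContext; place inside a managed bean or EJB.
@Inject
BeanManager beanManager;

private MyPOJO lookupMyPojo() {
    // Resolve the bean for MyPOJO and ask the container for a managed reference,
    // so its own @Inject points are satisfied (unlike "new MyPOJO()").
    Bean<?> bean = beanManager.resolve(beanManager.getBeans(MyPOJO.class));
    CreationalContext<?> ctx = beanManager.createCreationalContext(bean);
    return (MyPOJO) beanManager.getReference(bean, MyPOJO.class, ctx);
}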
