I have the following class,
public class Processor {
JobUpdate jobUpdate = new JobUpdate();
public void process(Job job){
try {
doProcess(job);
}catch (Exception e){
handleError(job);
System.out.println("This is error");
throw e; // rethrows the RuntimeException here
}
}
private void handleError(Job job) {
if (job.getFailure() > 0){
jobUpdate.updateJobStatus(job, JobStatus.SUBMITTED);
}else{
jobUpdate.updateJobStatus(job, JobStatus.FAILED);
}
}
private void doProcess(Job job){
String name = job.getName();
name += "added";
// ....
}
}
and my test case is below,
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.ArgumentCaptor;
import org.mockito.Captor;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.junit.jupiter.MockitoExtension;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.*;
@ExtendWith(MockitoExtension.class)
public class ProcessTest {
Processor processor = new Processor();
@Mock
private Job job;
@Mock
private JobUpdate jobUpdate;
@Captor
private ArgumentCaptor<Job> captorJob;
@Captor
private ArgumentCaptor<JobStatus> captorStatus;
@Test
public void shouldThrowException(){
Mockito.when(job.getName()).thenThrow(RuntimeException.class);
Mockito.when(job.getFailure()).thenReturn(2);
processor.process(job);
Mockito.verify(jobUpdate).updateJobStatus(captorJob.capture(),captorStatus.capture());
assertThat(captorJob.getValue().getFailure(), equalTo( 2));
assertThat(captorStatus.getValue(), equalTo( JobStatus.SUBMITTED));
}
}
The flow goes through the catch block fine, and right after the System.out.println("This is error"); statement it rethrows the exception. Control never comes back to the test class, so my ArgumentCaptor verifications are never reached.
Am I missing something, or does something need to be changed?
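A minimal sketch of how this is commonly handled (two assumptions on my part: the mocked JobUpdate is actually injected into Processor, e.g. via @InjectMocks or a constructor, and JUnit 5's Assertions is on the classpath): expect the rethrown exception around the call, so execution returns to the test before the verifications run.
import static org.junit.jupiter.api.Assertions.assertThrows;
@Test
public void shouldUpdateStatusAndRethrow() {
    Mockito.when(job.getName()).thenThrow(RuntimeException.class);
    Mockito.when(job.getFailure()).thenReturn(2);
    // the rethrown RuntimeException is expected here, so the test does not abort
    assertThrows(RuntimeException.class, () -> processor.process(job));
    Mockito.verify(jobUpdate).updateJobStatus(captorJob.capture(), captorStatus.capture());
    assertThat(captorJob.getValue().getFailure(), equalTo(2));
    assertThat(captorStatus.getValue(), equalTo(JobStatus.SUBMITTED));
}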
I first created a Page Object Model (POM) framework for test practice, with Log4j and listeners for screenshots.
Later I tried to add a Cucumber BDD framework into the same project. I'm able to run the tests as expected, but I'm facing two issues:
The initial POM framework tests are able to take screenshots, but the BDD tests fail with a NullPointerException because the listener is unable to get the driver object from the test methods.
Logs are not getting printed for the BDD tests, while logging works fine for the POM tests.
Code
TestRunner.java
package cucumberRunner;
import org.junit.runner.RunWith;
import io.cucumber.junit.Cucumber;
//import io.cucumber.junit.CucumberOptions;
import io.cucumber.testng.AbstractTestNGCucumberTests;
import io.cucumber.testng.CucumberOptions;
//@RunWith(Cucumber.class)
@CucumberOptions(
features="src/test/java/features",
glue="stepDefinitions")
public class TestRunner extends AbstractTestNGCucumberTests {
}
MyStepDefinitions.java
package stepDefinitions;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.junit.Assert;
import org.openqa.selenium.WebDriver;
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import e2E4.UserLogins;
import pageObjects.LandingPage;
import pageObjects.LoggedOnPage;
import resources.BaseClass;
public class MyStepDefinitions extends BaseClass {
public WebDriver driver;
Logger log=LogManager.getLogger(UserLogins.class.getName());
#Given("^Initialize browser with Chrome$")
public void initialize_browser_with_Chrome() throws Throwable {
driver = driverIni();
}
#Given("^Navigate to \"([^\"]*)\" website$")
public void navigate_to_saucelabs_website(String arg1) throws Throwable {
driver.get(arg1);
}
#When("^User enters \"([^\"]*)\" and \"([^\"]*)\" and Logs in$")
public void user_enters_and_and_Logs_in(String arg1, String arg2) throws Throwable {
LandingPage lp=new LandingPage(driver);
lp.sendUsername().sendKeys(arg1);
lp.sendPassword().sendKeys(arg2);
lp.sendLoginBtn().click();
log.info("Logging in");
}
#Then("^Verify if user successfully logged in$")
public void verify_if_user_successfully_logged_in() throws Throwable {
LoggedOnPage lop=new LoggedOnPage(driver);
Assert.assertTrue(lop.filterbtn().isDisplayed());
log.info("Logged in successfully");
}
}
BaseClass.java
package resources;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.time.Duration;
import java.util.Properties;
import org.apache.commons.io.FileUtils;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
public class BaseClass {
public WebDriver driver;
public Properties prop;
String dataPath=System.getProperty("user.dir");
public WebDriver driverIni() throws IOException {
FileInputStream fis=new FileInputStream(dataPath+"\\src\\main\\java\\resources\\data.properties");
prop=new Properties();
prop.load(fis);
String browserRequired=prop.getProperty("browser");
if(browserRequired.equalsIgnoreCase("chrome")) {
System.setProperty("webdriver.chrome.driver", dataPath+"\\chromedriver.exe");
driver=new ChromeDriver();
}
driver.manage().window().maximize();
driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(10));
return driver;
}
public String screenshot(String methodName, WebDriver driver) throws IOException {
TakesScreenshot ts=(TakesScreenshot) driver;
File source=ts.getScreenshotAs(OutputType.FILE);
String dest=dataPath+"\\reports\\"+methodName+".png";
FileUtils.copyFile(source, new File(dest));
return dest;
}
}
Listners.java
package e2E4;
import java.io.IOException;
import org.openqa.selenium.WebDriver;
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;
import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.Status;
import resources.BaseClass;
import resources.ExtendRep;
public class Listners extends BaseClass implements ITestListener {
WebDriver driver=null;
ExtendRep er=new ExtendRep();
ExtentReports extent=er.getReports();
ExtentTest test;
ThreadLocal<ExtentTest> et=new ThreadLocal<ExtentTest>();
public void onTestStart(ITestResult result) {
test=extent.createTest(result.getMethod().getMethodName());
et.set(test);
}
public void onTestSuccess(ITestResult result) {
String methodName=result.getMethod().getMethodName();
et.get().log(Status.PASS, "Test Passed Successfully");
try {
driver=(WebDriver)result.getTestClass().getRealClass().getDeclaredField("driver").get(result.getInstance());
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
try {
et.get().addScreenCaptureFromPath(screenshot(methodName,driver),result.getMethod().getMethodName());
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
public void onTestFailure(ITestResult result) {
String methodName=result.getMethod().getMethodName();
et.get().fail(result.getThrowable());
try {
driver=(WebDriver)result.getTestClass().getRealClass().getDeclaredField("driver").get(result.getInstance());
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
try {
et.get().addScreenCaptureFromPath(screenshot(methodName,driver),result.getMethod().getMethodName());
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
public void onTestSkipped(ITestResult result) {
}
public void onTestFailedButWithinSuccessPercentage(ITestResult result) {
}
public void onTestFailedWithTimeout(ITestResult result) {
}
public void onStart(ITestContext context) {
}
public void onFinish(ITestContext context) {
extent.flush();
}
}
P.S.: ExtentReports were working fine through the Listners class, though. Not able to figure this one out :(
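One workaround sketch (my own assumption, not from the original code): with the Cucumber TestNG runner, result.getTestClass().getRealClass() in the listener is most likely the TestRunner class, which has no driver field, so the reflection lookup fails and driver stays null. Keeping the current WebDriver in a small shared holder lets both the listener and the step definitions reach the same instance without reflection.
package resources;
import org.openqa.selenium.WebDriver;
// hypothetical helper, not part of the original framework
public class DriverHolder {
    private static WebDriver driver;
    public static void set(WebDriver d) { driver = d; } // call right after driverIni()
    public static WebDriver get() { return driver; }    // call from Listners instead of reflection
}
The step definitions would call DriverHolder.set(driver) right after initializing the browser, and Listners would use DriverHolder.get() instead of the reflective field access.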
I have to create singleton classes using DAO classes.
Following is a DAO reading class example:
package com.luiz.teste.dao;
import com.luiz.teste.exceptions.postgres.ReadSubjectDaoFindException;
import org.eclipse.microprofile.opentracing.Traced;
import javax.enterprise.context.RequestScoped;
import javax.persistence.EntityManager;
import javax.persistence.EntityTransaction;
import javax.persistence.NoResultException;
import javax.persistence.Persistence;
import javax.persistence.PersistenceException;
@Traced
@ApplicationScoped
public class ReadSubjectDao {
private static ReadSubjectDao instance = new ReadSubjectDao();
protected EntityManager em;
public static ReadSubjectDao getInstance() {
return instance;
}
private ReadSubjectDao() {
if (em == null) {
em = Persistence.createEntityManagerFactory("postgres").createEntityManager();
}
}
public ReadSubject findById(int id) throws ReadSubjectDaoFindException {
try {
return em.find(ReadSubject.class, id);
}
catch (NoResultException e) {
return null;
}
catch (PersistenceException e) {
throw new ReadSubjectDaoFindException(e);
}
}
}
Following is a DAO writing class example:
package com.luiz.teste.dao;
import com.luiz.teste.exceptions.mysql.WriteSubjectDaoFindException;
import com.luiz.teste.exceptions.mysql.WriteSubjectDaoPersistException;
import com.luiz.teste.exceptions.mysql.WriteSubjectDaoMergeException;
import org.eclipse.microprofile.opentracing.Traced;
import javax.enterprise.context.RequestScoped;
import javax.persistence.EntityManager;
import javax.persistence.EntityTransaction;
import javax.persistence.NoResultException;
import javax.persistence.Persistence;
import javax.persistence.PersistenceException;
@Traced
@ApplicationScoped
public class WriteSubjectDao {
private static WriteSubjectDao instance = new WriteSubjectDao();
protected EntityManager em;
public static WriteSubjectDao getInstance() {
return instance;
}
private WriteSubjectDao() {
if (em == null) {
em = Persistence.createEntityManagerFactory("mysql").createEntityManager();
}
}
public WriteSubject findById(int id) throws WriteSubjectDaoFindException {
try {
return em.find(WriteSubject.class, id);
}
catch (NoResultException e) {
return null;
}
catch (PersistenceException e) {
throw new WriteSubjectDaoFindException(e);
}
}
public void persist(WriteSubject writeSubject) throws WriteSubjectDaoPersistException {
EntityTransaction et = em.getTransaction();
try {
et.begin();
em.persist(writeSubject);
et.commit();
}
catch (Exception e) {
if (et.isActive())
et.rollback();
throw new WriteSubjectDaoPersistException(e);
}
}
public void merge(WriteSubject writeSubject) throws WriteSubjectDaoMergeException {
EntityTransaction et = em.getTransaction();
try {
et.begin();
em.merge(writeSubject);
et.commit();
}
catch (Exception e) {
if (et.isActive())
et.rollback();
throw new WriteSubjectDaoMergeException(e);
}
}
}
Following is application.properties:
# Configuration file
# key = value
quarkus.log.console.format=%d{HH:mm:ss} %-5p [%c{2.}] (%t) %X{requestID} %s%e%n
mp.metrics.tags=app=${quarkus.application.name},version=${quarkus.application.version}
%test.mp.metrics.tags=app=app-test,version=1.0.0
mp.openapi.filter=com.luiz.teste.dev.ext.filters.OpenApiFilter
quarkus.swagger-ui.path=/api-docs
quarkus.smallrye-openapi.path=/api-docs-json
quarkus.swagger-ui.always-include=true
quarkus.http.test-port=8083
quarkus.http.test-ssl-port=8446
quarkus.datasource.jdbc.enable-metrics=true
# Postgre - Build time
quarkus.datasource."postgres".db-kind=db2
quarkus.datasource."postgres".jdbc.url=${POSTGRE_JDBC}
quarkus.datasource."postgres".username=${POSTGRE_USER}
quarkus.datasource."postgres".password=${POSTGRE_PASSWORD}
quarkus.hibernate-orm."postgres".datasource=postgres
quarkus.hibernate-orm."postgres".packages=com.luiz.teste.models.postgres
quarkus.hibernate-orm."postgres".log.jdbc-warnings=false
quarkus.hibernate-orm."postgres".log.sql=true
# MySQL - Build time
quarkus.datasource."mysql".db-kind=mysql
quarkus.datasource."mysql".jdbc.url=${MYSQL_JDBC}
quarkus.datasource."mysql".username=${MYSQL_USER}
quarkus.datasource."mysql".password=${MYSQL_PASSWORD}
quarkus.hibernate-orm."mysql".datasource=mysql
quarkus.hibernate-orm."mysql".packages=com.luiz.teste.models.mysql
quarkus.hibernate-orm."mysql".log.jdbc-warnings=false
quarkus.hibernate-orm."mysql".log.sql=true
As far as I've searched on this site (and through https://www.google.com too), the only way I know of so far is by using persistence.xml.
How can I achieve the same result using only application.properties when calling createEntityManagerFactory?
UPDATE (2022-01-03): As requested, changed from @RequestScoped to @ApplicationScoped and changed from postgre to postgres.
I finally found a solution to this issue after a lot of searching here and on Google.
Instead of manually creating an instance field on those classes, use the @Singleton annotation to make singleton classes within Quarkus.
Fixed ReadSubjectDao.java:
package com.luiz.teste.dao.postgres;
import com.luiz.teste.exceptions.postgres.ReadSubjectDaoFindException;
import org.eclipse.microprofile.opentracing.Traced;
import javax.inject.Singleton;
import javax.inject.Inject;
import io.quarkus.hibernate.orm.PersistenceUnit;
import javax.persistence.EntityManager;
import javax.persistence.NoResultException;
import javax.persistence.PersistenceException;
@Traced
@Singleton
public class ReadSubjectDao {
@Inject
@PersistenceUnit("postgres")
EntityManager em;
public ReadSubject findById(int id) throws ReadSubjectDaoFindException {
try {
return em.find(ReadSubject.class, id);
}
catch (NoResultException e) {
return null;
}
catch (PersistenceException e) {
throw new ReadSubjectDaoFindException(e);
}
}
}
Fixed WriteSubjectDao.java:
package com.luiz.teste.dao.mysql;
import com.luiz.teste.exceptions.mysql.WriteSubjectDaoFindException;
import com.luiz.teste.exceptions.mysql.WriteSubjectDaoPersistException;
import com.luiz.teste.exceptions.mysql.WriteSubjectDaoMergeException;
import org.eclipse.microprofile.opentracing.Traced;
import javax.inject.Singleton;
import javax.inject.Inject;
import io.quarkus.hibernate.orm.PersistenceUnit;
import javax.persistence.EntityManager;
import javax.persistence.NoResultException;
import javax.persistence.PersistenceException;
import javax.transaction.Transactional;
@Traced
@Singleton
public class WriteSubjectDao {
@Inject
@PersistenceUnit("mysql")
EntityManager em;
public WriteSubject findById(int id) throws WriteSubjectDaoFindException {
try {
return em.find(WriteSubject.class, id);
}
catch (NoResultException e) {
return null;
}
catch (PersistenceException e) {
throw new WriteSubjectDaoFindException(e);
}
}
@Transactional
public void persist(WriteSubject writeSubject) throws WriteSubjectDaoPersistException {
try {
em.persist(writeSubject);
}
catch (Exception e) {
throw new WriteSubjectDaoPersistException(e);
}
}
@Transactional
public void merge(WriteSubject writeSubject) throws WriteSubjectDaoMergeException {
try {
em.merge(writeSubject);
}
catch (Exception e) {
throw new WriteSubjectDaoMergeException(e);
}
}
}
The application.properties remain unchanged.
To call either of those two singleton DAO classes, just use the @Inject annotation on the caller classes, as usual.
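For illustration, a minimal sketch of a hypothetical caller (the class and method names below are mine, not from the original code):
package com.luiz.teste.services;
import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;
import com.luiz.teste.dao.postgres.ReadSubjectDao;
import com.luiz.teste.exceptions.postgres.ReadSubjectDaoFindException;
@ApplicationScoped
public class ReadSubjectService {
    @Inject
    ReadSubjectDao readSubjectDao; // the singleton DAO is injected, never instantiated manually
    public boolean exists(int id) throws ReadSubjectDaoFindException {
        return readSubjectDao.findById(id) != null;
    }
}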
Your pattern is extremely weird.
If you want a singleton DAO, just use @ApplicationScoped for the scope of your DAO. Don't make it @RequestScoped and then keep a static instance; that is going to be broken.
Then, with Quarkus, you shouldn't create the EntityManagerFactory yourself. You can just inject an EntityManager with:
@Inject
@PersistenceUnit("postgre")
EntityManager entityManager;
and you're done.
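Putting the two points together, a short sketch of the read DAO under this suggestion (same fields as the accepted fix above, only the scope annotation differs; the unit name assumes the updated "postgres" naming):
@Traced
@ApplicationScoped
public class ReadSubjectDao {
    @Inject
    @PersistenceUnit("postgres")
    EntityManager em;
    // findById(...) stays exactly as in the fixed version above
}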
I have the following Spring Batch application.
SpringBatchApplication.java
package com.spbt.job.sample;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class SpringBatchApplication {
public static void main(String[] args) {
SpringApplication.run(SpringBatchApplication.class, args);
}
}
TraverseJob.java
package com.spbt.job.sample;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableBatchProcessing
public class TraverseJob {
@Autowired
protected JobBuilderFactory jobBuilderFactory;
@Autowired
protected StepBuilderFactory stepBuilderFactory;
private String inputFolderPath = "/tmp/inputFolder";
@Bean("TraverseJob")
public Job job() {
return jobBuilderFactory.get("TraverseJob")
.incrementer(new RunIdIncrementer())
.start(traverseStep())
.build();
}
#Bean("TraverseStep")
public Step traverseStep() {
return stepBuilderFactory.get("TraverseStep")
.tasklet(traverseJobTasklet(null))
.build();
}
#Bean("TraverseJobTasklet")
#StepScope
public TraverseJobTasklet traverseJobTasklet(#Value("#{jobParameters[date]}") String date) {
TraverseJobTasklet tasklet = new TraverseJobTasklet();
tasklet.setJobDate(date);
tasklet.setJobDirPath(inputFolderPath);
return tasklet;
}
}
TraverseJobTasklet.java
package com.spbt.job.sample;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;
import java.io.File;
public class TraverseJobTasklet implements Tasklet {
private String jobDirPath;
private String jobDate;
@Autowired
private RemoteFilePush remoteFilePush;
@Override
public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) throws Exception {
try {
traverseDir(new File(jobDirPath));
} catch (Exception ex) {
throw ex;
}
return RepeatStatus.FINISHED;
}
private void traverseDir(File filePath) throws Exception {
try {
File[] files = filePath.listFiles();
if (files != null) {
for (File file : files) {
String name = file.getName();
if (file.isDirectory()) {
if (remoteFilePush.isRemoteDirExist(name)) {
continue;
} else {
remoteFilePush.createRemoteDir(name);
traverseDir(file);
}
} else {
remoteFilePush.pushFile(file.getPath());
}
}
} else {
throw new Exception("empty/null dir -> " + filePath.getName());
}
} catch (Exception ex) {
throw ex;
}
}
public String getJobDirPath() {
return jobDirPath;
}
public void setJobDirPath(String jobDirPath) {
this.jobDirPath = jobDirPath;
}
public String getJobDate() {
return jobDate;
}
public void setJobDate(String jobDate) {
this.jobDate = jobDate;
}
}
RemoteFilePushLogic.java
package com.spbt.job.sample;
import org.springframework.stereotype.Component;
@Component
public class RemoteFilePush {
public boolean isRemoteDirExist(String name) throws InterruptedException {
boolean isRemoteDirExist = false;
// code to check dir on remote server
return isRemoteDirExist;
}
public void createRemoteDir(String name) throws InterruptedException {
// code to create dir on remote server
}
public void pushFile(String path) throws InterruptedException {
// code to push file on remote server
System.out.println("Pushed");
}
}
I want to parallelize the traversal and execution in the traverseDir method of TraverseJobTasklet while keeping my RemoteFilePush logic intact. My inputFolderPath can contain multiple child directories, each of which contains some files.
I have tried to follow the link for the Spring Batch version I am using, but it is XML-based and I don't see how I can create multiple steps out of the single traverseStep I have.
Passing a sub-folder path string to each worker step is where I'm hitting a wall with the Spring code. If you can point me to some reference it would be helpful; most of the examples on the net are XML-based.
Here is a quick self-contained example with Java config:
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;
@Configuration
@EnableBatchProcessing
public class PartitionJobSample {
private final JobBuilderFactory jobs;
private final StepBuilderFactory steps;
public PartitionJobSample(JobBuilderFactory jobs, StepBuilderFactory steps) {
this.jobs = jobs;
this.steps = steps;
}
@Bean
public Step managerStep() {
return steps.get("masterStep")
.partitioner(workerStep().getName(), partitioner(null))
.step(workerStep())
.gridSize(4)
.taskExecutor(taskExecutor())
.build();
}
@Bean
public SimpleAsyncTaskExecutor taskExecutor() {
return new SimpleAsyncTaskExecutor();// TODO useful for testing, use a more robust task executor in production
}
@Bean
@StepScope
public Partitioner partitioner(@Value("#{jobParameters['rootFolder']}") String rootFolder) {
List<String> subFolders = getSubFolders(rootFolder);
return new Partitioner() {
@Override
public Map<String, ExecutionContext> partition(int gridSize) {
Map<String, ExecutionContext> map = new HashMap<>(gridSize);
for (String folder : subFolders) {
ExecutionContext executionContext = new ExecutionContext();
executionContext.put("filePath", folder);
map.put("partition-for-" + folder, executionContext);
}
return map;
}
};
}
private List<String> getSubFolders(String rootFolder) {
// TODO implement this
return Arrays.asList("/data/folder1", "/data/folder2");
}
@Bean
public Step workerStep() {
return steps.get("workerStep")
.tasklet(getTasklet(null))
.build();
}
@Bean
@StepScope
public Tasklet getTasklet(@Value("#{stepExecutionContext['filePath']}") String filePath) {
return new TraverseJobTasklet(filePath);
}
@Bean
public Job job() {
return jobs.get("job")
.start(managerStep())
.build();
}
public static void main(String[] args) throws Exception {
ApplicationContext context = new AnnotationConfigApplicationContext(PartitionJobSample.class);
JobLauncher jobLauncher = context.getBean(JobLauncher.class);
Job job = context.getBean(Job.class);
JobParameters jobParameters = new JobParametersBuilder()
.addString("rootFolder", "/data")
.toJobParameters();
jobLauncher.run(job, jobParameters);
}
class TraverseJobTasklet implements Tasklet {
private String filePath;
public TraverseJobTasklet(String filePath) {
this.filePath = filePath;
}
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
// TODO call traversePath for filePath which is a sub-folder here
System.out.println(Thread.currentThread().getName() + " processing sub-folder " + filePath);
return RepeatStatus.FINISHED;
}
}
}
It passes the root directory as a job parameter and executes a partitioned step where each worker processes a sub-folder (calling your tasklet).
If you run it, you should see something like:
SimpleAsyncTaskExecutor-2 processing sub-folder /data/folder1
SimpleAsyncTaskExecutor-1 processing sub-folder /data/folder2
I will let you adapt it to your situation accordingly.
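If you prefer to keep the existing TraverseJobTasklet rather than the inline one above, one possible adaptation (my assumption, not part of the original answer) is to feed each worker its partition's sub-folder from the step execution context:
@Bean
@StepScope
public TraverseJobTasklet traverseJobTasklet(
        @Value("#{stepExecutionContext['filePath']}") String filePath,
        @Value("#{jobParameters[date]}") String date) {
    TraverseJobTasklet tasklet = new TraverseJobTasklet();
    tasklet.setJobDirPath(filePath); // each worker now traverses only its own sub-folder
    tasklet.setJobDate(date);
    return tasklet;
}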
i have tried to "wait 1 second" in minecraft forge and i use "Thread" to do it.
But i got this error.
Exception in thread "Thread-14" java.lang.IllegalMonitorStateException
at java.lang.Object.wait(Native Method)
at java.lang.Object.wait(Object.java:218)
at io.github.bloodnighttw.WaitAndReplaceBlock.run(WaitAndReplaceBlock.java:22)
I have also tried using world.scheduleBlockUpdate(......), but nothing happened after I broke the block.
This is my code; there are three classes.
ExampleMod.java
package io.github.bloodnighttw;
import net.minecraft.init.Blocks;
import net.minecraftforge.common.MinecraftForge;
import net.minecraftforge.fml.common.Mod;
import net.minecraftforge.fml.common.Mod.EventHandler;
import net.minecraftforge.fml.common.event.FMLInitializationEvent;
import net.minecraftforge.fml.common.event.FMLPreInitializationEvent;
import org.apache.logging.log4j.Logger;
import io.github.bloodnighttw.event.Event;
@Mod(modid = ExampleMod.MODID, name = ExampleMod.NAME, version = ExampleMod.VERSION)
public class ExampleMod
{
public static final String MODID = "nothing";
public static final String NAME = "Bang!";
public static final String VERSION = "3.0";
private static Logger logger;
@EventHandler
public void preInit(FMLPreInitializationEvent event)
{
logger = event.getModLog();
logger.info("Bang");
}
@EventHandler
public void init(FMLInitializationEvent event)
{
// some example code
logger.info("DIRT BLOCK >> {}", Blocks.DIRT.getRegistryName());
MinecraftForge.EVENT_BUS.register(new Event());
}
}
Event.java
package io.github.bloodnighttw.event;
import io.github.bloodnighttw.WaitAndReplaceBlock;
import net.minecraft.util.math.BlockPos;
import net.minecraft.util.text.TextComponentString;
import net.minecraftforge.event.world.BlockEvent.BreakEvent;
import net.minecraftforge.fml.common.eventhandler.SubscribeEvent;
public class Event {
@SubscribeEvent
public void breakBlockEvent(final BreakEvent e) {
e.getPlayer().sendMessage(new TextComponentString(e.getState().toString()+"222222"));
BlockPos a = e.getPos();
new WaitAndReplaceBlock(a,e).start();
}
}
WaitAndReplaceBlock.java
package io.github.bloodnighttw;
import net.minecraft.init.Blocks;
import net.minecraft.util.math.BlockPos;
import net.minecraft.world.World;
import net.minecraftforge.event.world.BlockEvent;
public class WaitAndReplaceBlock extends Thread{
BlockPos a;
BlockEvent.BreakEvent e ;
public WaitAndReplaceBlock(BlockPos a, BlockEvent.BreakEvent e){
this.a = a ;
this.e = e ;
}
@Override
public void run() {
try {
wait(10);
} catch (InterruptedException interruptedException) {
interruptedException.printStackTrace();
}
World world = e.getWorld();
world.setBlockState(a, Blocks.BEDROCK.getDefaultState());
}
}
I have tried to Google it, but everything I found is either too old to use in 1.12.2 or not a solution for me, so I decided to ask on Stack Overflow.
Edit:
This is the code where I used world.scheduleBlockUpdate(......).
Only Event.java has changed.
Event.java
package io.github.bloodnighttw.event;
import io.github.bloodnighttw.WaitAndReplaceBlock;
import net.minecraft.util.math.BlockPos;
import net.minecraft.util.text.TextComponentString;
import net.minecraftforge.event.world.BlockEvent.BreakEvent;
import net.minecraftforge.fml.common.eventhandler.SubscribeEvent;
public class Event {
@SubscribeEvent
public void breakBlockEvent(final BreakEvent e) {
e.getPlayer().sendMessage(new TextComponentString(e.getState().toString()+"222222"));
BlockPos a = e.getPos();
//new WaitAndReplaceBlock(a,e).start();
e.getWorld().scheduleBlockUpdate(a,e.getState().getBlock(),10,1000000);
}
}
And this is what I meant by "nothing happened".
Video: https://youtu.be/tZMIRHDUnV4
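For reference, one common approach (a sketch based on my understanding of Forge 1.12.2, not taken from the question): wait() throws IllegalMonitorStateException because it is called without owning the object's monitor, and changing the world from a separate thread is unsafe anyway. Counting server ticks on the main thread avoids both problems; 20 ticks is roughly one second.
package io.github.bloodnighttw;
import net.minecraft.init.Blocks;
import net.minecraft.util.math.BlockPos;
import net.minecraft.world.World;
import net.minecraftforge.common.MinecraftForge;
import net.minecraftforge.fml.common.eventhandler.SubscribeEvent;
import net.minecraftforge.fml.common.gameevent.TickEvent;
public class DelayedReplaceBlock {
    private final World world;
    private final BlockPos pos;
    private int ticksLeft = 20; // about one second at 20 ticks per second
    public DelayedReplaceBlock(World world, BlockPos pos) {
        this.world = world;
        this.pos = pos;
    }
    @SubscribeEvent
    public void onServerTick(TickEvent.ServerTickEvent event) {
        if (event.phase != TickEvent.Phase.END || --ticksLeft > 0) {
            return;
        }
        world.setBlockState(pos, Blocks.BEDROCK.getDefaultState()); // runs on the server thread
        MinecraftForge.EVENT_BUS.unregister(this); // one-shot: stop listening afterwards
    }
}
The break-event handler would then call MinecraftForge.EVENT_BUS.register(new DelayedReplaceBlock(e.getWorld(), a)); instead of starting a Thread.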
I have to create unit test cases for a method that acquires a lock with ZooKeeper and processes data with CompletableFuture.
Below is the high-level code:
import lombok.Data;
@Data
public class ConfigurationsIntegrationModel {
public enum InteractionType {
TEST,
DEV;
}
private InteractionType interactionType;
private String lockName;
}
import org.springframework.stereotype.Service;
import java.util.Arrays;
import java.util.List;
#Service("configurationsIntegrationService")
public interface ConfigurationsIntegrationService {
public default List<ConfigurationsIntegrationModel> getRecords(ConfigurationsIntegrationModel.InteractionType integrationType) {
return Arrays.asList(getDynamicIntegrationConfigurationMock(integrationType));
}
private static ConfigurationsIntegrationModel getDynamicIntegrationConfigurationMock(ConfigurationsIntegrationModel.InteractionType integrationType) {
ConfigurationsIntegrationModel configurationsIntegration = new ConfigurationsIntegrationModel();
configurationsIntegration.setLockName("Test_Lock");
configurationsIntegration.setInteractionType(integrationType);
return configurationsIntegration;
}
}
import org.apache.curator.RetryPolicy;
import org.apache.curator.framework.CuratorFramework;
import org.apache.curator.framework.CuratorFrameworkFactory;
import org.apache.curator.framework.recipes.locks.InterProcessLock;
import org.apache.curator.framework.recipes.locks.InterProcessSemaphoreMutex;
import org.apache.curator.retry.RetryNTimes;
import java.util.concurrent.TimeUnit;
public class DistributedLockProcessor {
private CuratorFramework client;
private String path;
public DistributedLockProcessor(String host, String path) {
RetryPolicy retryPolicy = new RetryNTimes(5, 90);
this.path = path;
client = CuratorFrameworkFactory.newClient(host, retryPolicy);
client.start();
}
public InterProcessLock acquire(String lockName) throws Exception {
InterProcessSemaphoreMutex sharedLock = new InterProcessSemaphoreMutex(client, path + "/" + lockName);
if (!sharedLock.acquire(0, TimeUnit.SECONDS)) {
return null;
}
return sharedLock;
}
public boolean release(InterProcessLock sharedLock) throws Exception {
sharedLock.release();
return true;
}
}
import org.apache.curator.framework.recipes.locks.InterProcessLock;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;
import java.util.function.Consumer;
public class LockingExecutorProcessor<T> {
private Executor executor = null;
private DistributedLockProcessor distributedLock = null;
public LockingExecutorProcessor(String host, String path, int executorCount) {
executor = Executors.newFixedThreadPool(executorCount);
distributedLock = new DistributedLockProcessor(host, path);
}
public void process(List<String> locks, List<T> items, Consumer<T> consumer) throws ExecutionException, InterruptedException {
final List<CompletableFuture<Void>> completableFutures = new ArrayList<>();
for (int i = 0; i < locks.size(); i++) {
final int record = i;
CompletableFuture<Void> future =
CompletableFuture.runAsync(
() -> {
InterProcessLock interProcessLock = null;
try {
interProcessLock = distributedLock.acquire(locks.get(record));
} catch (Exception e) {
e.printStackTrace();
}
if (interProcessLock != null) {
consumer.accept(items.get(record));
}
}, executor);
completableFutures.add(future);
}
CompletableFuture<Void> completable = CompletableFuture.allOf(completableFutures.toArray(new CompletableFuture[completableFutures.size()]));
completable.get();
}
}
import org.springframework.stereotype.Service;
import java.util.Arrays;
import java.util.List;
#Service("messageService")
public interface MessageService {
public default List<String> getMessagesList(ConfigurationsIntegrationModel.InteractionType integrationType) {
return Arrays.asList("Message1", "Message2", "Message3","Message4");
}
}
import com.google.common.util.concurrent.RateLimiter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executors;
#Component("sampleJob")
public class SampleJob {
@Autowired
private LockingExecutorProcessor<ConfigurationsIntegrationModel> lockingExecutorProcessor;
@Autowired
private ConfigurationsIntegrationService configurationsIntegrationService;
@Autowired
private RateLimiter rateLimiter;
@Autowired
private MessageService messageService;
private List<String> getLockNames(List<ConfigurationsIntegrationModel> integrationConfigurations) {
List<String> lockNames = new ArrayList<>();
for (ConfigurationsIntegrationModel integrationConfiguration : integrationConfigurations) {
lockNames.add(integrationConfiguration.getLockName());
}
return lockNames;
}
@Scheduled(fixedRateString = "100")
public void executeJob() throws ExecutionException, InterruptedException {
List<ConfigurationsIntegrationModel> testRecords = configurationsIntegrationService.getRecords(ConfigurationsIntegrationModel.InteractionType.TEST);
List<String> lockNames = getLockNames(testRecords);
lockingExecutorProcessor.process(
lockNames,
testRecords,
recordsConfig -> {
final List<CompletableFuture<Void>> completableFutures = new ArrayList<>();
List<String> msgList = messageService.getMessagesList(recordsConfig.getInteractionType());
for (String message : msgList) {
completableFutures.add(
CompletableFuture.runAsync(
() -> {
System.out.println("Message is #### "+ message);
}, Executors.newFixedThreadPool(10)));
}
});
}
}
Below is the test case I have tried so far:
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.MockitoAnnotations;
import org.mockito.junit.MockitoJUnitRunner;
import java.util.concurrent.ExecutionException;
@RunWith(MockitoJUnitRunner.class)
public class SampleJobTest {
@InjectMocks
private SampleJob sampleJob = new SampleJob();
@Mock
private ConfigurationsIntegrationService configurationsIntegrationService;
@Mock
private MessageService messageService;
@Mock
private LockingExecutorProcessor<ConfigurationsIntegrationModel> lockingExecutorProcessor;
@Test
public void testSampleJob() throws ExecutionException, InterruptedException {
Mockito.doCallRealMethod().when(lockingExecutorProcessor).process(Mockito.any(), Mockito.any(), Mockito.any());
Mockito.doCallRealMethod().when(configurationsIntegrationService).getRecords(Mockito.any());
Mockito.doCallRealMethod().when(messageService).getMessagesList(Mockito.any());
sampleJob.executeJob();
}
}
When I debug the code, it breaks at the CompletableFuture.runAsync line of LockingExecutorProcessor and throws a NullPointerException; the reason is that the DistributedLockProcessor object is null.
How can we mock it, and how can we connect to a test ZooKeeper server instead of the real one, to ensure the locking is working fine?
This is done. Now the test cases are working fine. I used the curator-test dependency and also used reflection to set the private objects.
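For anyone hitting the same problem, a rough sketch of that approach (the wiring below reflects my assumptions about the setup, not the exact code): curator-test provides an embedded TestingServer, and the private lockingExecutorProcessor field can be replaced with a real instance pointing at it, e.g. via Spring's ReflectionTestUtils, so the actual locking path is exercised against the test server.
import org.apache.curator.test.TestingServer;
import org.junit.After;
import org.junit.Before;
import org.springframework.test.util.ReflectionTestUtils;
// inside SampleJobTest (the other collaborators stay mocked as before)
private TestingServer zkServer;
@Before
public void startZookeeper() throws Exception {
    zkServer = new TestingServer(); // embedded ZooKeeper on a random free port
    LockingExecutorProcessor<ConfigurationsIntegrationModel> realProcessor =
            new LockingExecutorProcessor<>(zkServer.getConnectString(), "/locks", 2);
    // swap the real processor into the private field so locking runs against the test server
    ReflectionTestUtils.setField(sampleJob, "lockingExecutorProcessor", realProcessor);
}
@After
public void stopZookeeper() throws Exception {
    zkServer.close();
}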