I have tried to "wait 1 second" in Minecraft Forge using a Thread, but I got this error:
Exception in thread "Thread-14" java.lang.IllegalMonitorStateException
at java.lang.Object.wait(Native Method)
at java.lang.Object.wait(Object.java:218)
at io.github.bloodnighttw.WaitAndReplaceBlock.run(WaitAndReplaceBlock.java:22)
I have also tried world.scheduleBlockUpdate(......), but nothing happened after I broke the block.
Here is my code; there are three classes.
ExampleMod.java
package io.github.bloodnighttw;
import net.minecraft.init.Blocks;
import net.minecraftforge.common.MinecraftForge;
import net.minecraftforge.fml.common.Mod;
import net.minecraftforge.fml.common.Mod.EventHandler;
import net.minecraftforge.fml.common.event.FMLInitializationEvent;
import net.minecraftforge.fml.common.event.FMLPreInitializationEvent;
import org.apache.logging.log4j.Logger;
import io.github.bloodnighttw.event.Event;
@Mod(modid = ExampleMod.MODID, name = ExampleMod.NAME, version = ExampleMod.VERSION)
public class ExampleMod
{
public static final String MODID = "nothing";
public static final String NAME = "Bang!";
public static final String VERSION = "3.0";
private static Logger logger;
@EventHandler
public void preInit(FMLPreInitializationEvent event)
{
logger = event.getModLog();
logger.info("Bang");
}
@EventHandler
public void init(FMLInitializationEvent event)
{
// some example code
logger.info("DIRT BLOCK >> {}", Blocks.DIRT.getRegistryName());
MinecraftForge.EVENT_BUS.register(new Event());
}
}
Event.java
package io.github.bloodnighttw.event;
import io.github.bloodnighttw.WaitAndReplaceBlock;
import net.minecraft.util.math.BlockPos;
import net.minecraft.util.text.TextComponentString;
import net.minecraftforge.event.world.BlockEvent.BreakEvent;
import net.minecraftforge.fml.common.eventhandler.SubscribeEvent;
public class Event {
@SubscribeEvent
public void breakBlockEvent(final BreakEvent e) {
e.getPlayer().sendMessage(new TextComponentString(e.getState().toString()+"222222"));
BlockPos a = e.getPos();
new WaitAndReplaceBlock(a,e).start();
}
}
WaitAndReplaceBlock.java
package io.github.bloodnighttw;
import net.minecraft.init.Blocks;
import net.minecraft.util.math.BlockPos;
import net.minecraft.world.World;
import net.minecraftforge.event.world.BlockEvent;
public class WaitAndReplaceBlock extends Thread{
BlockPos a;
BlockEvent.BreakEvent e ;
public WaitAndReplaceBlock(BlockPos a, BlockEvent.BreakEvent e){
this.a = a ;
this.e = e ;
}
@Override
public void run() {
try {
wait(10);
} catch (InterruptedException interruptedException) {
interruptedException.printStackTrace();
}
World world = e.getWorld();
world.setBlockState(a, Blocks.BEDROCK.getDefaultState());
}
}
I have tried to Google this, but everything I found is either too old to be used in 1.12.2 or not a solution for me, so I decided to ask on Stack Overflow.
Edit:
This is the code where I used world.scheduleBlockUpdate(......).
Only Event.java has changed.
Event.java
package io.github.bloodnighttw.event;
import io.github.bloodnighttw.WaitAndReplaceBlock;
import net.minecraft.util.math.BlockPos;
import net.minecraft.util.text.TextComponentString;
import net.minecraftforge.event.world.BlockEvent.BreakEvent;
import net.minecraftforge.fml.common.eventhandler.SubscribeEvent;
public class Event {
@SubscribeEvent
public void breakBlockEvent(final BreakEvent e) {
e.getPlayer().sendMessage(new TextComponentString(e.getState().toString()+"222222"));
BlockPos a = e.getPos();
//new WaitAndReplaceBlock(a,e).start();
e.getWorld().scheduleBlockUpdate(a,e.getState().getBlock(),10,1000000);
}
}
And this is what I meant by "nothing happened".
Video: https://youtu.be/tZMIRHDUnV4
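For context on the threaded approach: Object.wait() throws IllegalMonitorStateException unless the calling thread holds the object's monitor (i.e. it is called inside a synchronized block), and even a correctly sleeping thread must not touch the World off the server thread. The usual Forge pattern is instead to count down in a tick handler (20 ticks is about 1 second) and perform the block change when the counter expires. Below is a minimal, Forge-free sketch of that counting logic; the class names are illustrative, and in a real mod onTick would be driven by something like TickEvent.ServerTickEvent:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class TickScheduler {
    // One pending task: run the action once the remaining tick count hits zero.
    private static class Delayed {
        int ticksLeft;
        final Runnable action;
        Delayed(int ticksLeft, Runnable action) { this.ticksLeft = ticksLeft; this.action = action; }
    }

    private final List<Delayed> pending = new ArrayList<>();

    public void schedule(int ticks, Runnable action) {
        pending.add(new Delayed(ticks, action));
    }

    // Call this once per server tick (e.g. from a server tick event handler),
    // so the actions run on the server thread, where world access is safe.
    public void onTick() {
        Iterator<Delayed> it = pending.iterator();
        while (it.hasNext()) {
            Delayed d = it.next();
            if (--d.ticksLeft <= 0) {
                d.action.run();
                it.remove();
            }
        }
    }

    public static void main(String[] args) {
        TickScheduler s = new TickScheduler();
        s.schedule(20, () -> System.out.println("one second later (20 ticks)"));
        for (int i = 0; i < 20; i++) s.onTick(); // the action fires on the 20th tick
    }
}
```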
I first created a POM-model framework for test practice, with Log4j and listeners for screenshots.
Later I tried to add a Cucumber BDD framework into the same project. I'm able to run the tests as expected, but I'm facing two issues:
The initial POM framework tests are able to take screenshots, but the BDD tests fail with a NullPointerException, unable to get the driver object from the methods.
Logs are not getting printed for the BDD tests, while they work fine for the POM tests.
Code
TestRunner.java
package cucumberRunner;
import org.junit.runner.RunWith;
import io.cucumber.junit.Cucumber;
//import io.cucumber.junit.CucumberOptions;
import io.cucumber.testng.AbstractTestNGCucumberTests;
import io.cucumber.testng.CucumberOptions;
//@RunWith(Cucumber.class)
@CucumberOptions(
features="src/test/java/features",
glue="stepDefinitions")
public class TestRunner extends AbstractTestNGCucumberTests {
}
MyStepDefinitions.java
package stepDefinitions;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.junit.Assert;
import org.openqa.selenium.WebDriver;
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import e2E4.UserLogins;
import pageObjects.LandingPage;
import pageObjects.LoggedOnPage;
import resources.BaseClass;
public class MyStepDefinitions extends BaseClass {
public WebDriver driver;
Logger log=LogManager.getLogger(UserLogins.class.getName());
@Given("^Initialize browser with Chrome$")
public void initialize_browser_with_Chrome() throws Throwable {
driver = driverIni();
}
@Given("^Navigate to \"([^\"]*)\" website$")
public void navigate_to_saucelabs_website(String arg1) throws Throwable {
driver.get(arg1);
}
@When("^User enters \"([^\"]*)\" and \"([^\"]*)\" and Logs in$")
public void user_enters_and_and_Logs_in(String arg1, String arg2) throws Throwable {
LandingPage lp=new LandingPage(driver);
lp.sendUsername().sendKeys(arg1);
lp.sendPassword().sendKeys(arg2);
lp.sendLoginBtn().click();
log.info("Logging in");
}
@Then("^Verify if user successfully logged in$")
public void verify_if_user_successfully_logged_in() throws Throwable {
LoggedOnPage lop=new LoggedOnPage(driver);
Assert.assertTrue(lop.filterbtn().isDisplayed());
log.info("Logged in successfully");
}
}
BaseClass.java
package resources;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.time.Duration;
import java.util.Properties;
import org.apache.commons.io.FileUtils;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
public class BaseClass {
public WebDriver driver;
public Properties prop;
String dataPath=System.getProperty("user.dir");
public WebDriver driverIni() throws IOException {
FileInputStream fis=new FileInputStream(dataPath+"\\src\\main\\java\\resources\\data.properties");
prop=new Properties();
prop.load(fis);
String browserRequired=prop.getProperty("browser");
if(browserRequired.equalsIgnoreCase("chrome")) {
System.setProperty("webdriver.chrome.driver", dataPath+"\\chromedriver.exe");
driver=new ChromeDriver();
}
driver.manage().window().maximize();
driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(10));
return driver;
}
public String screenshot(String methodName, WebDriver driver) throws IOException {
TakesScreenshot ts=(TakesScreenshot) driver;
File source=ts.getScreenshotAs(OutputType.FILE);
String dest=dataPath+"\\reports\\"+methodName+".png";
FileUtils.copyFile(source, new File(dest));
return dest;
}
}
Listners.java
package e2E4;
import java.io.IOException;
import org.openqa.selenium.WebDriver;
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;
import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.Status;
import resources.BaseClass;
import resources.ExtendRep;
public class Listners extends BaseClass implements ITestListener {
WebDriver driver=null;
ExtendRep er=new ExtendRep();
ExtentReports extent=er.getReports();
ExtentTest test;
ThreadLocal<ExtentTest> et=new ThreadLocal<ExtentTest>();
public void onTestStart(ITestResult result) {
test=extent.createTest(result.getMethod().getMethodName());
et.set(test);
}
public void onTestSuccess(ITestResult result) {
String methodName=result.getMethod().getMethodName();
et.get().log(Status.PASS, "Test Passed Successfully");
try {
driver=(WebDriver)result.getTestClass().getRealClass().getDeclaredField("driver").get(result.getInstance());
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
try {
et.get().addScreenCaptureFromPath(screenshot(methodName,driver),result.getMethod().getMethodName());
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
public void onTestFailure(ITestResult result) {
String methodName=result.getMethod().getMethodName();
et.get().fail(result.getThrowable());
try {
driver=(WebDriver)result.getTestClass().getRealClass().getDeclaredField("driver").get(result.getInstance());
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
try {
et.get().addScreenCaptureFromPath(screenshot(methodName,driver),result.getMethod().getMethodName());
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
public void onTestSkipped(ITestResult result) {
}
public void onTestFailedButWithinSuccessPercentage(ITestResult result) {
}
public void onTestFailedWithTimeout(ITestResult result) {
}
public void onStart(ITestContext context) {
}
public void onFinish(ITestContext context) {
extent.flush();
}
}
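One detail that can explain the null driver in the BDD runs: Class.getDeclaredField only returns fields declared directly on the class it is called on, never inherited ones, and with Cucumber on TestNG the "real class" of the result is the runner rather than the step-definition class, so the lookup throws and driver stays null. A minimal sketch of that getDeclaredField behaviour (class names here are illustrative):

```java
public class DeclaredFieldDemo {
    static class Base { public String driver = "chrome"; }
    static class Runner extends Base { } // declares no fields of its own

    public static void main(String[] args) {
        try {
            // Only fields declared directly on Runner are visible here;
            // the inherited "driver" from Base is NOT returned.
            Runner.class.getDeclaredField("driver");
            System.out.println("found");
        } catch (NoSuchFieldException e) {
            System.out.println("NoSuchFieldException -> driver stays null");
        }
    }
}
```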
P.S.: ExtentReports were working fine through the Listners, though. Not able to figure it out :(
I am developing a Minecraft mod for 1.8.9.
What I'm trying to create is a command that simply sends a message to the sender.
Here's the code for the command class and the main class.
command class:
package happyandjust.happymod.commands;
import java.util.HashMap;
import java.util.List;
import net.minecraft.command.CommandBase;
import net.minecraft.command.CommandException;
import net.minecraft.command.ICommandSender;
import net.minecraft.util.ChatComponentText;
import net.minecraft.util.EnumChatFormatting;
public class Command extends CommandBase {
private HashMap<String, String> collection = new HashMap<String, String>();
@Override
public String getCommandName() {
return "collection";
}
@Override
public String getCommandUsage(ICommandSender sender) {
return "collection <enchant name>";
}
@Override
public void processCommand(ICommandSender sender, String[] args) throws CommandException {
collection.put("harvesting", "Wheat Collection Level 2");
collection.put("cubism", "Pumpkin Collection Level 5");
if (args.length < 1) {
sender.addChatMessage(new ChatComponentText(EnumChatFormatting.RED + "Usage: /collection [Enchant Name]"));
return;
}
if (args.length == 1) {
String enchant_name = args[0].toLowerCase();
String collec = collection.get(enchant_name);
if (collec == null) {
sender.addChatMessage(new ChatComponentText(
EnumChatFormatting.RED + enchant_name.toUpperCase() + " is not valid Enchant Name"));
return;
}
sender.addChatMessage(new ChatComponentText(
EnumChatFormatting.GREEN + enchant_name.toUpperCase() + " is at " + collection.get(enchant_name)));
}
}
@Override
public boolean canCommandSenderUseCommand(ICommandSender sender) {
return true;
}
}
main class:
package happyandjust.happymod.main;
import happyandjust.happymod.commands.Command;
import happyandjust.happymod.proxy.CommonProxy;
import happyandjust.happymod.util.Reference;
import net.minecraftforge.client.ClientCommandHandler;
import net.minecraftforge.fml.common.Loader;
import net.minecraftforge.fml.common.Mod;
import net.minecraftforge.fml.common.ModContainer;
import net.minecraftforge.fml.common.Mod.EventHandler;
import net.minecraftforge.fml.common.Mod.Instance;
import net.minecraftforge.fml.common.SidedProxy;
import net.minecraftforge.fml.common.event.FMLInitializationEvent;
import net.minecraftforge.fml.common.event.FMLPostInitializationEvent;
import net.minecraftforge.fml.common.event.FMLServerStartingEvent;
@Mod(modid = Reference.MOD_ID, name = Reference.NAME, version = Reference.VERSION)
public class HappyMod {
@Instance
public static HappyMod instance;
@SidedProxy(clientSide = Reference.CLIENT_PROXY_CLASS, serverSide = Reference.COMMON_PROXY_CLASS)
public static CommonProxy proxy;
@EventHandler
public static void preInit(FMLPostInitializationEvent e) {
}
@EventHandler
public static void init(FMLInitializationEvent e) {
ClientCommandHandler.instance.registerCommand(new Command());
}
@EventHandler
public static void postInit(FMLPostInitializationEvent e) {
}
}
It works fine in single player, but if I go to a multiplayer server like Hypixel, it says "Unknown command".
I have no idea how to do this.
Can anyone help me make this command work on a multiplayer server?
You need to override the getRequiredPermissionLevel() method from CommandBase for the command to work in multiplayer:
@Override
public int getRequiredPermissionLevel() {
return 0;
}
I have to create unit test cases for a method that acquires a lock with ZooKeeper and processes data with CompletableFuture.
Below is the high-level code:
import lombok.Data;
@Data
public class ConfigurationsIntegrationModel {
public enum InteractionType {
TEST,
DEV;
}
private InteractionType interactionType;
private String lockName;
}
import org.springframework.stereotype.Service;
import java.util.Arrays;
import java.util.List;
@Service("configurationsIntegrationService")
public interface ConfigurationsIntegrationService {
public default List<ConfigurationsIntegrationModel> getRecords(ConfigurationsIntegrationModel.InteractionType integrationType) {
return Arrays.asList(getDynamicIntegrationConfigurationMock(integrationType));
}
private static ConfigurationsIntegrationModel getDynamicIntegrationConfigurationMock(ConfigurationsIntegrationModel.InteractionType integrationType) {
ConfigurationsIntegrationModel configurationsIntegration = new ConfigurationsIntegrationModel();
configurationsIntegration.setLockName("Test_Lock");
configurationsIntegration.setInteractionType(integrationType);
return configurationsIntegration;
}
}
import org.apache.curator.RetryPolicy;
import org.apache.curator.framework.CuratorFramework;
import org.apache.curator.framework.CuratorFrameworkFactory;
import org.apache.curator.framework.recipes.locks.InterProcessLock;
import org.apache.curator.framework.recipes.locks.InterProcessSemaphoreMutex;
import org.apache.curator.retry.RetryNTimes;
import java.util.concurrent.TimeUnit;
public class DistributedLockProcessor {
private CuratorFramework client;
private String path;
public DistributedLockProcessor(String host, String path) {
RetryPolicy retryPolicy = new RetryNTimes(5, 90);
this.path = path;
client = CuratorFrameworkFactory.newClient(host, retryPolicy);
client.start();
}
public InterProcessLock acquire(String lockName) throws Exception {
InterProcessSemaphoreMutex sharedLock = new InterProcessSemaphoreMutex(client, path + "/" + lockName);
if (!sharedLock.acquire(0, TimeUnit.SECONDS)) {
return null;
}
return sharedLock;
}
public boolean release(InterProcessLock sharedLock) throws Exception {
sharedLock.release();
return true;
}
}
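The acquire(...)/release(...) pair above follows the same try-then-release discipline as the JDK's own locks. Purely as a local illustration (no Curator or ZooKeeper involved), the non-blocking variant with a plain ReentrantLock looks like this:

```java
import java.util.concurrent.locks.ReentrantLock;

public class TryLockPattern {
    public static void main(String[] args) {
        ReentrantLock lock = new ReentrantLock();

        // Mirrors acquire(): give up immediately if the lock is already taken.
        if (lock.tryLock()) {
            try {
                System.out.println("lock held, doing work");
            } finally {
                lock.unlock(); // mirrors release(); always runs, even on exception
            }
        } else {
            System.out.println("lock busy, skipping");
        }
    }
}
```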
import org.apache.curator.framework.recipes.locks.InterProcessLock;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;
import java.util.function.Consumer;
public class LockingExecutorProcessor<T> {
private Executor executor = null;
private DistributedLockProcessor distributedLock = null;
public LockingExecutorProcessor(String host, String path, int executorCount) {
executor = Executors.newFixedThreadPool(executorCount);
distributedLock = new DistributedLockProcessor(host, path);
}
public void process(List<String> locks, List<T> items, Consumer<T> consumer) throws ExecutionException, InterruptedException {
final List<CompletableFuture<Void>> completableFutures = new ArrayList<>();
for (int i = 0; i < locks.size(); i++) {
final int record = i;
CompletableFuture<Void> future =
CompletableFuture.runAsync(
() -> {
InterProcessLock interProcessLock = null;
try {
interProcessLock = distributedLock.acquire(locks.get(record));
} catch (Exception e) {
e.printStackTrace();
}
if (interProcessLock != null) {
consumer.accept(items.get(record));
}
}, executor);
completableFutures.add(future);
}
CompletableFuture<Void> completable = CompletableFuture.allOf(completableFutures.toArray(new CompletableFuture[completableFutures.size()]));
completable.get();
}
}
import org.springframework.stereotype.Service;
import java.util.Arrays;
import java.util.List;
@Service("messageService")
public interface MessageService {
public default List<String> getMessagesList(ConfigurationsIntegrationModel.InteractionType integrationType) {
return Arrays.asList("Message1", "Message2", "Message3","Message4");
}
}
import com.google.common.util.concurrent.RateLimiter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executors;
@Component("sampleJob")
public class SampleJob {
@Autowired
private LockingExecutorProcessor<ConfigurationsIntegrationModel> lockingExecutorProcessor;
@Autowired
private ConfigurationsIntegrationService configurationsIntegrationService;
@Autowired
private RateLimiter rateLimiter;
@Autowired
private MessageService messageService;
private List<String> getLockNames(List<ConfigurationsIntegrationModel> integrationConfigurations) {
List<String> lockNames = new ArrayList<>();
for (ConfigurationsIntegrationModel integrationConfiguration : integrationConfigurations) {
lockNames.add(integrationConfiguration.getLockName());
}
return lockNames;
}
@Scheduled(fixedRateString = "100")
public void executeJob() throws ExecutionException, InterruptedException {
List<ConfigurationsIntegrationModel> testRecords = configurationsIntegrationService.getRecords(ConfigurationsIntegrationModel.InteractionType.TEST);
List<String> lockNames = getLockNames(testRecords);
lockingExecutorProcessor.process(
lockNames,
testRecords,
recordsConfig -> {
final List<CompletableFuture<Void>> completableFutures = new ArrayList<>();
List<String> msgList = messageService.getMessagesList(recordsConfig.getInteractionType());
for (String message : msgList) {
completableFutures.add(
CompletableFuture.runAsync(
() -> {
System.out.println("Message is #### "+ message);
}, Executors.newFixedThreadPool(10)));
}
});
}
}
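One thing to note about the consumer above: it creates a new fixed thread pool for every message and never joins the inner futures, so the job can return before the messages are printed. A minimal sketch of a shared-executor variant that waits for completion (the pool size and messages are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class FanOutJoin {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(4); // shared, not per-message
        List<String> messages = List.of("Message1", "Message2", "Message3", "Message4");

        List<CompletableFuture<Void>> futures = new ArrayList<>();
        for (String message : messages) {
            futures.add(CompletableFuture.runAsync(
                    () -> System.out.println("Message is #### " + message), pool));
        }
        // Join all inner futures before returning, so no work is silently dropped.
        CompletableFuture.allOf(futures.toArray(new CompletableFuture[0])).join();
        pool.shutdown();
    }
}
```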
Below is the test case I have tried so far:
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.MockitoAnnotations;
import org.mockito.junit.MockitoJUnitRunner;
import java.util.concurrent.ExecutionException;
@RunWith(MockitoJUnitRunner.class)
public class SampleJobTest {
@InjectMocks
private SampleJob sampleJob = new SampleJob();
@Mock
private ConfigurationsIntegrationService configurationsIntegrationService;
@Mock
private MessageService messageService;
@Mock
private LockingExecutorProcessor<ConfigurationsIntegrationModel> lockingExecutorProcessor;
@Test
public void testSampleJob() throws ExecutionException, InterruptedException {
Mockito.doCallRealMethod().when(lockingExecutorProcessor).process(Mockito.any(), Mockito.any(), Mockito.any());
Mockito.doCallRealMethod().when(configurationsIntegrationService).getRecords(Mockito.any());
Mockito.doCallRealMethod().when(messageService).getMessagesList(Mockito.any());
sampleJob.executeJob();
}
}
When I debug the code, it breaks at the CompletableFuture.runAsync line of LockingExecutorProcessor and throws a NullPointerException; the reason is that the distributedLock object is null.
How can we mock it, and how can we connect to a test ZooKeeper server instead of the original one to make sure the locking works correctly?
This is done. The test cases are working fine now. I used the curator-test dependency and also used reflection to mock the private objects.
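For what it's worth, the "reflection to mock the private objects" part can be sketched like this; the Worker class and its task field are illustrative stand-ins for LockingExecutorProcessor and its private distributedLock:

```java
import java.lang.reflect.Field;

public class ReflectionInject {
    static class Worker {
        private Runnable task = null; // normally built in the constructor
        void run() { task.run(); }    // NullPointerException if task was never set
    }

    // Overwrite a private field on an existing instance, as a test might do
    // to swap in a mock without changing the production constructor.
    static void setPrivateField(Object target, String name, Object value) throws Exception {
        Field f = target.getClass().getDeclaredField(name);
        f.setAccessible(true);
        f.set(target, value);
    }

    public static void main(String[] args) throws Exception {
        Worker w = new Worker();
        setPrivateField(w, "task", (Runnable) () -> System.out.println("stubbed"));
        w.run(); // prints "stubbed" instead of throwing
    }
}
```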
I have the following class,
public class Processor {
JobUpdate jobUpdate = new JobUpdate();
public void process(Job job){
try {
doProcess(job);
}catch (Exception e){
handleError(job);
System.out.println("This is error");
throw e; // Here throw Runtime Exception.
}
}
private void handleError(Job job) {
if (job.getFailure() > 0){
jobUpdate.updateJobStatus(job, JobStatus.SUBMITTED);
}else{
jobUpdate.updateJobStatus(job, JobStatus.FAILED);
}
}
private void doProcess(Job job){
String name = job.getName();
name += "added";
// ....
}
}
and my test case is below:
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.ArgumentCaptor;
import org.mockito.Captor;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.junit.jupiter.MockitoExtension;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.*;
@ExtendWith(MockitoExtension.class)
public class ProcessTest {
Processor processor = new Processor();
@Mock
private Job job;
@Mock
private JobUpdate jobUpdate;
@Captor
private ArgumentCaptor<Job> captorJob;
@Captor
private ArgumentCaptor<JobStatus> captorStatus;
@Test
public void shouldThrowException(){
Mockito.when(job.getName()).thenThrow(RuntimeException.class);
Mockito.when(job.getFailure()).thenReturn(2);
processor.process(job);
Mockito.verify(jobUpdate).updateJobStatus(captorJob.capture(),captorStatus.capture());
assertThat(captorJob.getValue().getFailure(), equalTo( 2));
assertThat(captorStatus.getValue(), equalTo( JobStatus.SUBMITTED));
}
}
The flow goes into the catch block as expected, and after executing the System.out.println("This is error"); statement it throws the exception. Control never comes back to the test class to verify my ArgumentCaptor.
Am I missing something, or does something need to be changed?
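A likely culprit: process() rethrows, so execution never reaches the Mockito.verify line; the test has to catch (or assert on) the exception first. A minimal, Mockito-free sketch of that control flow (the class and counter are illustrative):

```java
public class RethrowDemo {
    static int updates = 0;

    static void process() {
        try {
            throw new RuntimeException("boom"); // stands in for doProcess failing
        } catch (RuntimeException e) {
            updates++;   // side effect (the status update) happens first...
            throw e;     // ...then the exception propagates to the caller
        }
    }

    public static void main(String[] args) {
        boolean thrown = false;
        try {
            process(); // without this try/catch, nothing below would execute
        } catch (RuntimeException expected) {
            thrown = true;
        }
        // Only now can the test verify the side effect.
        System.out.println("thrown=" + thrown + ", updates=" + updates);
    }
}
```

In a JUnit 5 test the same idea is usually written with assertThrows wrapping the call, followed by the verify.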
I am developing a custom interpreter for a domain specific language. Based on the example given in the Apache Zeppelin documentation (https://zeppelin.incubator.apache.org/docs/latest/development/writingzeppelininterpreter.html), the interpreter works pretty well. Now I want to store some results in a new DataFrame.
I found code to create DataFrames (http://spark.apache.org/docs/latest/sql-programming-guide.html), but I can't use this in my interpreter because I basically don't find a way to access a valid runtime SparkContext (often called "sc") from within my custom interpreter.
I tried the (static) SparkContext.getOrCreate(), but this even led to a ClassNotFoundException. Then I added the whole zeppelin-spark-dependencies...jar to my interpreter folder, which solved the class-loading issue, but now I am getting a SparkException ("master url must be set...").
Any idea how I could get access to my Notebook's SparkContext from within the custom interpreter? Thanks a lot!
UPDATE
Thanks to Kangrok Lee's comment below, my code now looks as shown below. It runs and seems to create a DataFrame (at least it doesn't throw any exception any more). But I cannot consume the created DataFrame in a subsequent SQL paragraph (the first paragraph uses my "%opl" interpreter, as given below, which should create the "result" DataFrame):
%opl
1 2 3
> 1
> 2
> 3
%sql
select * from result
> Table not found: result; line 1 pos 14
So probably there is still something wrong with my way of dealing with the SparkContext. Any ideas? Thanks a lot!
package opl;
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;
import org.apache.zeppelin.interpreter.Interpreter;
import org.apache.zeppelin.interpreter.InterpreterContext;
import org.apache.zeppelin.interpreter.InterpreterPropertyBuilder;
import org.apache.zeppelin.interpreter.InterpreterResult;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class OplInterpreter2 extends Interpreter {
static {
Interpreter.register("opl","opl",OplInterpreter2.class.getName(),
new InterpreterPropertyBuilder()
.add("spark.master", "local[4]", "spark.master")
.add("spark.app.name", "Opl Interpreter", "spark.app.name")
.add("spark.serializer", "org.apache.spark.serializer.KryoSerializer", "spark.serializer")
.build());
}
private Logger logger = LoggerFactory.getLogger(OplInterpreter2.class);
private void log(Object o) {
if (logger != null)
logger.warn("OplInterpreter2 "+o);
}
public OplInterpreter2(Properties properties) {
super(properties);
log("CONSTRUCTOR");
}
@Override
public void open() {
log("open()");
}
@Override
public void cancel(InterpreterContext arg0) {
log("cancel()");
}
@Override
public void close() {
log("close()");
}
@Override
public List<String> completion(String arg0, int arg1) {
log("completion()");
return new ArrayList<String>();
}
@Override
public FormType getFormType() {
log("getFormType()");
return FormType.SIMPLE;
}
@Override
public int getProgress(InterpreterContext arg0) {
log("getProgress()");
return 100;
}
@Override
public InterpreterResult interpret(String string, InterpreterContext context) {
log("interpret() "+string);
PrintStream oldSys = System.out;
try {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
PrintStream ps = new PrintStream(baos);
System.setOut(ps);
execute(string);
System.out.flush();
System.setOut(oldSys);
return new InterpreterResult(
InterpreterResult.Code.SUCCESS,
InterpreterResult.Type.TEXT,
baos.toString());
} catch (Exception ex) {
System.out.flush();
System.setOut(oldSys);
return new InterpreterResult(
InterpreterResult.Code.ERROR,
InterpreterResult.Type.TEXT,
ex.toString());
}
}
private void execute(String code) throws Exception {
SparkContext sc = SparkContext.getOrCreate();
SQLContext sqlc = SQLContext.getOrCreate(sc);
StructType structType = new StructType().add("value",DataTypes.IntegerType);
ArrayList<Row> list = new ArrayList<Row>();
for (String s : code.trim().split("\\s+")) {
int value = Integer.parseInt(s);
System.out.println(value);
list.add(RowFactory.create(value));
}
DataFrame df = sqlc.createDataFrame(list,structType);
df.registerTempTable("result");
}
}
Finally I found a solution, although I don't think it is a very nice one. In the code below, I am using a function getSparkInterpreter() that I found in org.apache.zeppelin.spark.PySparkInterpreter.java.
This requires that I put my packaged code (jar) into the Spark interpreter folder instead of its own interpreter folder, which I believe should be the preferred way (according to https://zeppelin.incubator.apache.org/docs/latest/development/writingzeppelininterpreter.html). Also, my interpreter does not show up on Zeppelin's interpreter configuration page as an interpreter of its own, but it can nevertheless be used in a Zeppelin paragraph.
And: in the code I can create a DataFrame, and it is also consumable outside my paragraph, which is what I wanted to achieve.
package opl;
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;
import org.apache.zeppelin.interpreter.Interpreter;
import org.apache.zeppelin.interpreter.InterpreterContext;
import org.apache.zeppelin.interpreter.InterpreterPropertyBuilder;
import org.apache.zeppelin.interpreter.InterpreterResult;
import org.apache.zeppelin.interpreter.LazyOpenInterpreter;
import org.apache.zeppelin.interpreter.WrappedInterpreter;
import org.apache.zeppelin.spark.SparkInterpreter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class OplInterpreter2 extends Interpreter {
static {
Interpreter.register(
"opl",
"spark",//"opl",
OplInterpreter2.class.getName(),
new InterpreterPropertyBuilder()
.add("sth", "defaultSth", "some thing")
.build());
}
private Logger logger = LoggerFactory.getLogger(OplInterpreter2.class);
private void log(Object o) {
if (logger != null)
logger.warn("OplInterpreter2 "+o);
}
public OplInterpreter2(Properties properties) {
super(properties);
log("CONSTRUCTOR");
}
@Override
public void open() {
log("open()");
}
@Override
public void cancel(InterpreterContext arg0) {
log("cancel()");
}
@Override
public void close() {
log("close()");
}
@Override
public List<String> completion(String arg0, int arg1) {
log("completion()");
return new ArrayList<String>();
}
@Override
public FormType getFormType() {
log("getFormType()");
return FormType.SIMPLE;
}
@Override
public int getProgress(InterpreterContext arg0) {
log("getProgress()");
return 100;
}
@Override
public InterpreterResult interpret(String string, InterpreterContext context) {
log("interpret() "+string);
PrintStream oldSys = System.out;
try {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
PrintStream ps = new PrintStream(baos);
System.setOut(ps);
execute(string);
System.out.flush();
System.setOut(oldSys);
return new InterpreterResult(
InterpreterResult.Code.SUCCESS,
InterpreterResult.Type.TEXT,
baos.toString());
} catch (Exception ex) {
System.out.flush();
System.setOut(oldSys);
return new InterpreterResult(
InterpreterResult.Code.ERROR,
InterpreterResult.Type.TEXT,
ex.toString());
}
}
private void execute(String code) throws Exception {
SparkInterpreter sintp = getSparkInterpreter();
SQLContext sqlc = sintp.getSQLContext();
StructType structType = new StructType().add("value",DataTypes.IntegerType);
ArrayList<Row> list = new ArrayList<Row>();
for (String s : code.trim().split("\\s+")) {
int value = Integer.parseInt(s);
System.out.println(value);
list.add(RowFactory.create(value));
}
DataFrame df = sqlc.createDataFrame(list,structType);
df.registerTempTable("result");
}
private SparkInterpreter getSparkInterpreter() {
LazyOpenInterpreter lazy = null;
SparkInterpreter spark = null;
Interpreter p = getInterpreterInTheSameSessionByClassName(SparkInterpreter.class.getName());
while (p instanceof WrappedInterpreter) {
if (p instanceof LazyOpenInterpreter) {
lazy = (LazyOpenInterpreter) p;
}
p = ((WrappedInterpreter) p).getInnerInterpreter();
}
spark = (SparkInterpreter) p;
if (lazy != null) {
lazy.open();
}
return spark;
}
}
I think that you should configure your Spark cluster with settings such as the following:
spark.master = "local[4]"
spark.app.name = "My Spark App"
spark.serializer = "org.apache.spark.serializer.KryoSerializer"
Using SparkContext.getOrCreate() looks good to me.
Thanks,
Kangrok Lee