Pass parameter to lambda expression - Java

My program requires that I accept user input and, based on this input, carry out a particular method. My basic approach is described well by the following question/answer:
How to call a method stored in a HashMap? (Java)
To do this, I have created an array of lambda expressions:
public final Runnable[] userCommandMethods = {
    () -> userCommand1(),
    () -> userCommand2(),
};
And an array of keys:
public final String[] userCommandKeys = {
    commandKey1,
    commandKey2,
};
These are joined to create a HashMap using the following method:
public Map<String, Runnable> mapArrays(String[] array1, Runnable[] array2) {
    Map<String, Runnable> mappedArrays = new HashMap<String, Runnable>();
    for (int i = 0; i < array1.length; i++) {
        mappedArrays.put(array1[i], array2[i]);
    }
    return mappedArrays;
}
When I attempt to run a method by using myHashMap.get(userInput).run(); it works perfectly, provided none of the methods in userCommandMethods require input parameters.
My question:
How would I pass an input parameter (specifically a HashMap) into the methods contained within userCommandMethods?
When the userCommand1() method takes an input parameter, but the lambda expression does not, I get the following error:
The method userCommand1(Map<String, String>) in the type ProgramCommands is not applicable for the arguments ()
However, when I do pass a parameter to the method inside the lambda, the compiler states that it cannot be resolved to a variable.
Edit: to elaborate:
When the userCommand1() method takes no arguments:
public void userCommand1() {
    // Do some stuff
}
It works perfectly fine. However, I am unsure how to use the lambda expressions if the method does take an input parameter:
public void userCommand1(Map<String, String> myMap) {
    // Do some stuff
}

You just need to choose another functional interface (not Runnable).
For example, if your methods all take a String parameter, you should use Consumer<String>. If they take a String and an int, then you should use BiConsumer<String, Integer>. If your methods need more than 2 parameters, you need to create your own functional interface. For an example, see my answer here.
// use a list instead of an array, because arrays don't work well with generic types
public final List<Consumer<String>> userCommandMethods = List.of(
    x -> userCommand1(x),
    x -> userCommand2() // it's fine if the method takes fewer parameters
);
Instead of run, you would call accept, which is what the single abstract method of Consumer and BiConsumer is called.
Note that you can also use the method reference syntax. If userCommand1 is static, x -> userCommand1(x) can be rewritten as SomeClass::userCommand1 where SomeClass is the enclosing class of userCommand1. If userCommand1 is non static, it can be rewritten as this::userCommand1.
You don't need to build the map from two arrays. You can use ofEntries and entry to write the entries inline.
private final Map<String, Consumer<String>> someMap = Map.ofEntries(
    Map.entry("foo", SomeClass::userCommand1),
    Map.entry("bar", SomeClass::userCommand2),
    Map.entry("baz", SomeClass::userCommand3),
    // and so on
);
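Applied to the Map<String, String> case from the question, a rough sketch could look like the following. It uses the this:: method-reference form mentioned above and assumes the userCommand methods are instance methods taking a Map<String, String>; argumentMap is just a placeholder name, not from the original code.
// Sketch assuming instance methods such as:
//   public void userCommand1(Map<String, String> myMap) { ... }
Map<String, Consumer<Map<String, String>>> commands = Map.of(
        "commandKey1", this::userCommand1,
        "commandKey2", this::userCommand2);

Map<String, String> argumentMap = new HashMap<>();
commands.get(userInput).accept(argumentMap); // invokes e.g. userCommand1(argumentMap)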

You are using the Runnable interface, which takes no argument:
@FunctionalInterface
public interface Runnable {
    public abstract void run();
}
Instead, you can define your custom interface and consume it.
As a simple example:
@FunctionalInterface
public interface RunnableWithArg {
    void apply(String t) throws RuntimeException;
}
And an implementation may look like:
public class RunnableTest {
    // also fine:
    // public final RunnableWithArg[] userCommandMethods = { t -> this.userCommand1(t), t -> this.userCommand2(t) };
    public final RunnableWithArg[] userCommandMethods = { this::userCommand1, this::userCommand2 };

    public String commandKey1 = "commandKey1";
    public String commandKey2 = "commandKey2";
    public final String[] userCommandKeys = { commandKey1, commandKey2, };

    public Map<String, RunnableWithArg> mapArrays(String[] array1, RunnableWithArg[] array2) {
        Map<String, RunnableWithArg> mappedArrays = new HashMap<>();
        for (int i = 0; i < array1.length; i++) {
            mappedArrays.put(array1[i], array2[i]);
        }
        return mappedArrays;
    }

    public void userCommand1(String data) {
        System.out.println("userCommand1 called with " + data);
    }

    public void userCommand2(String data) {
        System.out.println("userCommand2 called with " + data);
    }

    public void test() {
        var fncMap = mapArrays(userCommandKeys, userCommandMethods);
        for (String key : fncMap.keySet()) {
            var fnc = fncMap.get(key);
            fnc.apply(key);
        }
    }
}
And of course you can also define a generic @FunctionalInterface like this, so it can both take an input and return an output of generic types:
@FunctionalInterface
public interface AbcFunction<T, R> {
    R apply(T t) throws AbcException;

    static <T> AbcFunction<T, T> identity() {
        return t -> t;
    }
}
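A minimal usage sketch, assuming AbcException is a checked exception defined elsewhere in your code (the interface above declares it on apply):
AbcFunction<String, Integer> parse = s -> Integer.parseInt(s);
try {
    System.out.println(parse.apply("42")); // prints 42
} catch (AbcException e) {                 // apply() declares this checked exception
    e.printStackTrace();
}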

Is this something you are thinking of?
interface Command<T> {
    public void run(T arg);
}

class SayHelloCommand implements Command<String> {
    public void run(String name) {
        System.out.println("hello " + name);
    }
}

class CountCommand implements Command<Integer> {
    public void run(Integer limit) {
        for (int i = 0; i <= limit; i++)
            System.out.println(i);
    }
}

public class Main {
    public static void main(String[] args) {
        Command[] commands = new Command[3];
        commands[0] = new SayHelloCommand();
        commands[1] = new CountCommand();
        commands[0].run("Joe");
        commands[1].run(5);
    }
}

Related

Java 8: Input a list of functional Interfaces and call them dynamically after .stream()

I have the following method:
public void caller() {
    List<Class1> data1 = Arrays.asList(new Class1(), new Class1() ...);
    List<Class2> data2 = Arrays.asList(new Class2(), new Class2() ...);
    // The following is what I'm trying to implement:
    List<BiConsumer<Class1, Double>> peeks1 = Arrays.asList(Class1::setOneNum, Class1::setAnotherNum, Class1::setMoreNum);
    List<BiConsumer<Class2, Double>> peeks2 = Arrays.asList(Class2::setSomeNum1, Class2::setSomeNum2);
    helper(data1, peeks1);
    helper(data2, peeks2);
    ...
}

private <T> List<T> helper(List<T> data, List<BiConsumer<T, Double>> peeks) {
    for (BiConsumer<T, Double> singlePeek : peeks) {
        data = data.stream()
                .peek(a -> singlePeek.accept(a, Math.random()))
                .collect(Collectors.toList());
    }
    return data;
}
There is other implementation in common between Class1 and Class2; the only difference is the methods called after .stream(), which is why I'm trying to "merge" the functions into one helper.
Each BiConsumer is a setter: I want to call a list of setters after stream(). But I cannot pass a list of functional interfaces into helper() (what I tried, Arrays.asList(Class1::setNum, Class1::setAnotherNum, Class1::setMoreNum), won't work as an input since Arrays.asList() only accepts Object). So is there any work-around? Thanks!
@user7 Thanks for pointing it out. I was careless, but I've fixed the "typo" and added the caller function.
You have to specify the target type when you call the .asList method:
Arrays.<BiConsumer<Class1, Double>>asList(Class1::setOneNum, ...)
Update:
According to the updated code in the question, the result of Arrays.asList is not directly handed over to the helper method, so no explicit typing is required.
The only possible reasons left why the code is not working are:
At least one of the methods (setOneNum, setSomeNum1, ...) has the wrong parameter types
At least one of the methods is not static
May I suggest trying to make it a little bit more functional?
For your code, consider the following helper; it treats functions as first-class citizens and uses some higher-order functions:
private <T, V> Function<Supplier<T>, Supplier<T>> helper(Supplier<V> v, BiConsumer<T, V> bc) {
    return (Supplier<T> r) -> {
        bc.accept(r.get(), v.get());
        return r;
    };
}
This helper function expects a Supplier of some value and a BiConsumer that will be your setter function. It returns a function over Suppliers of the class you are working with.
With that we can build something like the pipe operator of functional languages, whose premise is that data should be processed as a pipeline.
List<Class1> data1 = Arrays.asList(new Class1(), new Class1());
List<Class2> data2 = Arrays.asList(new Class2(), new Class2());
Supplier<Double> random = () -> Math.random();
This will be our data: you have the same lists as before, and now a Supplier for the random value you want.
Now let's compose our pipeline with andThen:
data1.stream()
    .forEach(data -> {
        helper(random, Class1::setOneNum)
            .andThen(helper(random, Class1::setAnotherNum))
            .andThen(helper(random, Class1::setMoreNum))
            .apply(() -> data);
        System.out.println(data.toString());
    });

data2.stream()
    .forEach(data -> {
        helper(random, Class2::setSomeNum1)
            .andThen(helper(random, Class2::setSomeNum2))
            .apply(() -> data);
        System.out.println(data.toString());
    });
As you can see, the helper results can be chained together with the andThen method of the Function interface. This makes Java execute each function and feed its return value to the next Function in the chain.
The data parameter holds each instance and is mutated at each step of the chain as we iterate over all the objects.
And the result:
Class1 [oneNum=0,047, anotherNum=0,482, moreNum=0,339]
Class1 [oneNum=0,131, anotherNum=0,889, moreNum=0,411]
Class2 [someNum1=0,18, someNum2=0,004]
Class2 [someNum1=0,497, someNum2=0,702]
I think this is the result you want. And as you can see, you don't need to specify any generic types explicitly, as Java infers them well.
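For reference, java.util.function.Function.andThen composes left to right; here is a tiny standalone illustration, not part of the code above:
Function<Integer, Integer> plusOne = x -> x + 1;
Function<Integer, Integer> timesTen = x -> x * 10;
int fifty = plusOne.andThen(timesTen).apply(4); // (4 + 1) * 10 = 50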
The classes that I made for reference:
class Class1 {
    double oneNum;
    double anotherNum;
    double moreNum;

    public double getOneNum() {
        return oneNum;
    }

    public void setOneNum(double oneNum) {
        this.oneNum = oneNum;
    }

    public double getAnotherNum() {
        return anotherNum;
    }

    public void setAnotherNum(double anotherNum) {
        this.anotherNum = anotherNum;
    }

    public double getMoreNum() {
        return moreNum;
    }

    public void setMoreNum(double moreNum) {
        this.moreNum = moreNum;
    }

    @Override
    public String toString() {
        return MessageFormat.format("Class1 [oneNum={0}, anotherNum={1}, moreNum={2}]", oneNum, anotherNum, moreNum);
    }
}
class Class2 {
    double someNum1;
    double someNum2;

    public double getSomeNum1() {
        return someNum1;
    }

    public void setSomeNum1(double someNum1) {
        this.someNum1 = someNum1;
    }

    public double getSomeNum2() {
        return someNum2;
    }

    public void setSomeNum2(double someNum2) {
        this.someNum2 = someNum2;
    }

    @Override
    public String toString() {
        return MessageFormat.format("Class2 [someNum1={0}, someNum2={1}]", someNum1, someNum2);
    }
}

function name as a string

I am trying to wrap my head around generics and functions... what I am trying to achieve: passing a function name as a string to get it executed:
I want to do Wrapper.useFunction("eleven") or Wrapper.useFunction("ten")
public class Wrapper<T> {
    public F useFunction(Function<F, F> function) {
        return function.apply(F);
    }

    Function<F, String> ten = s -> "10";
    Function<F, String> eleven = s -> "11";
}
But this code is not even close to compiling. Maybe it doesn't make any sense. Any suggestions?
If you have a finite set of functions that you would like to be able to call, I would recommend building a Map that maps Strings to instances of Runnable (or a similar functional interface). Your useFunction method can then look up the function implementation in the Map and call it if it exists.
Example:
public class SomeClass {
    private final Map<String, Runnable> methods = new HashMap<>();

    {
        methods.put("helloworld", () -> {
            System.out.println("Hello World!");
        });
        methods.put("test", () -> {
            System.out.println("test!");
        });
        methods.put("doStuff", () -> {
            System.out.println("doStuff!");
        });
    }

    public void perform(String code) {
        methods.getOrDefault(code, () -> {
            System.err.println("No such Method: " + code);
        }).run();
    }
}
If you want to call arbitrary methods, you should probably use reflection, as stated by others.
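For completeness, a minimal sketch of that reflective alternative; the class and hello method names here are made up for illustration:
import java.lang.reflect.Method;

public class ReflectiveCaller {
    public void hello() {
        System.out.println("Hello!");
    }

    public static void main(String[] args) throws Exception {
        ReflectiveCaller target = new ReflectiveCaller();
        // look up a zero-argument public method by name and invoke it on the instance
        Method m = ReflectiveCaller.class.getMethod("hello");
        m.invoke(target); // prints "Hello!"
    }
}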

Anonymous class does not have an argument

I am learning Apache Spark. Given the implementation of Spark using Java below, I am confused about some of its details.
public class JavaWordCount {
    public static void main(String[] args) throws Exception {
        if (args.length < 2) {
            System.err.println("Usage: JavaWordCount <master> <file>");
            System.exit(1);
        }
        JavaSparkContext ctx = new JavaSparkContext(args[0], "JavaWordCount",
                System.getenv("SPARK_HOME"), System.getenv("SPARK_EXAMPLES_JAR"));
        JavaRDD<String> lines = ctx.textFile(args[1], 1);
        JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
            public Iterable<String> call(String s) {
                return Arrays.asList(s.split(" "));
            }
        });
        JavaPairRDD<String, Integer> ones = words.map(new PairFunction<String, String, Integer>() {
            public Tuple2<String, Integer> call(String s) {
                return new Tuple2<String, Integer>(s, 1);
            }
        });
        JavaPairRDD<String, Integer> counts = ones.reduceByKey(new Function2<Integer, Integer, Integer>() {
            public Integer call(Integer i1, Integer i2) {
                return i1 + i2;
            }
        });
        List<Tuple2<String, Integer>> output = counts.collect();
        for (Tuple2 tuple : output) {
            System.out.println(tuple._1 + ": " + tuple._2);
        }
        System.exit(0);
    }
}
As I understand it, starting at the lines.flatMap() call, an anonymous FlatMapFunction class is passed into lines.flatMap() as an argument. Then what does the String s mean? It seems that no created String s is passed as an argument, so how will the FlatMapFunction<String, String>(){} class work when no specific arguments are passed in?
The anonymous class instance you're passing is overriding the call(String s) method. Whatever is receiving this anonymous class instance is something that wants to make use of that call() method during its execution: it will be (somehow) constructing strings and passing them (directly or indirectly) to the call() method of whatever you've passed in.
So the fact that you're not invoking the method you've defined isn't a worry: something else is doing so.
This is a common use case for anonymous inner classes. A method m() expects to be passed something that implements the Blah interface, and the Blah interface has a frobnicate(String s) method in it. So we call it with
m(new Blah() {
    public void frobnicate(String s) {
        // exciting code goes here to do something with s
    }
});
and the m method will now be able to take this instance that implements Blah, and invoke frobnicate() on it.
Perhaps m looks like this:
public void m(Blah b) {
    b.frobnicate("whatever");
}
Now the frobnicate() method that we wrote in our inner class is being invoked, and as it runs, the parameter s will be set to "whatever".
All you are doing here is passing a FlatMapFunction as an argument to the flatMap method; the FlatMapFunction you pass overrides call(String s):
JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
    public Iterable<String> call(String s) {
        return Arrays.asList(s.split(" "));
    }
});
The code implementing lines.flatMap could look like this for instance:
public JavaRDD<String> flatMap(FlatMapFunction<String, String> map) {
    String str = "some string";
    Iterable<String> it = map.call(str);
    // do stuff with 'it'
    // return a JavaRDD<String>
}

Looking for appropriate design pattern

Our code has several processors, each one having several API methods, where each method is also overloaded with a variant that accepts a collection.
For example:
public class Foo {
    public X foo(Y y) {...}
    public Collection<X> foo(Collection<Y> y) {... // iterate and execute foo(y) ... }
    public Z bar(W w) {...}
    public Collection<Z> bar(Collection<W> w) {... // iterate and execute bar(w) ... }
}

public class Other {
    // also a method and a method on a collection
}
Naturally, those collection methods are duplicated iteration code.
What we are looking for is some kind of pattern, or a way to use generics, so that the iteration over the collection is implemented once; for that we also need a way to somehow pass the method name.
I'd suggest the Strategy pattern, and something like:
public interface Transformer<X, Y> {
    Y transform(X input);
}

class Processor {
    public <X, Y> Collection<Y> process(Collection<X> input, Transformer<X, Y> transformer) {
        Collection<Y> ret = new LinkedList<Y>();
        // generic loop, delegating transformation to specific transformer
        for (X x : input) {
            ret.add(transformer.transform(x));
        }
        return ret;
    }
}
Example:
public static void main(String[] args) {
    List<String> strings = new LinkedList<String>();
    strings.add("1");
    strings.add("2");
    strings.add("3");

    Processor p = new Processor();
    Collection<Integer> numbers = p.process(strings, new Transformer<String, Integer>() {
        @Override
        public Integer transform(String input) {
            return Integer.parseInt(input);
        }
    });
}
I can't see how reflection could help here. You're trying to replace something as trivial as
public Collection<X> foo(Collection<Y> y) {
    List<X> result = Lists.newArrayList();
    for (Y e : y) result.add(foo(e));
    return result;
}
by something probably much slower. I don't think that saving those 3 lines (several times) is worth it, but you might want to try either annotation processing (possibly without using annotations) or dynamic code generation. In both cases you'd write the original class as is without the collection methods and use a different one containing both the scalar and the collection methods.
Or you might want to make it more functionally styled:
public class Foo {
    public final RichFunction<Y, X> foo = new RichFunction<Y, X>() {
        public X apply(Y y) {
            return foo(y);
        }
    };

    // after some refactoring the original method can be made private
    // or inlined into the RichFunction
    public X foo(Y y) {...}

    // instead of calling the original method like
    //   foo.foo(y)
    // you'd use
    //   foo.foo.apply(y)
    // which would work for both the scalar and collection methods
}

public abstract class RichFunction<K, V> implements com.google.common.base.Function<K, V> {
    Collection<V> apply(Collection<K> keys) {
        List<V> result = Lists.newArrayList();
        for (K k : keys) result.add(apply(k));
        return result;
    }
}
@RUAKH - I chose to implement your suggestion of reflection (although, I admit, I don't like reflection). So, I did something like the code below. THANKS :)
public class Resource {
    private static final int CLIENT_CODE_STACK_INDEX;

    static {
        // Finds out the index of "this code" in the returned stack trace - funny but it differs in JDK 1.5 and 1.6
        int i = 0;
        for (StackTraceElement ste : Thread.currentThread().getStackTrace()) {
            i++;
            if (ste.getClassName().equals(Resource.class.getName())) {
                break;
            }
        }
        CLIENT_CODE_STACK_INDEX = i;
    }

    public static String getCurrentMethodName() {
        return Thread.currentThread().getStackTrace()[CLIENT_CODE_STACK_INDEX].getMethodName();
    }

    protected <IN, OUT> Collection<OUT> doMultiple(String methodName, Collection<IN> inCol, Class<?>... parameterTypes) {
        Collection<OUT> result = new ArrayList<OUT>();
        try {
            Method m = this.getClass().getDeclaredMethod(methodName, parameterTypes);
            if (inCol == null || inCol.size() == 0) {
                return result;
            }
            for (IN in : inCol) {
                Object o = m.invoke(this, in);
                result.add((OUT) o);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return result;
    }
}
public class FirstResource extends Resource {
    public String doSomeThing(Integer i) {
        // LOTS OF LOGIC
        return i.toString();
    }

    public Collection<String> doSomeThing(Collection<Integer> ints) {
        return doMultiple(getCurrentMethodName(), ints, Integer.class);
    }
}
You should use the Strategy pattern. It lets you avoid the if/else chains that make code more complex, gives you less coupled and simpler code, and makes it easier to configure behavior dynamically.

Using Scala from Java: passing functions as parameters

Consider the following Scala code:
package scala_java

object MyScala {
  def setFunc(func: Int => String) {
    func(10)
  }
}
Now in Java, I would have liked to use MyScala as:
package scala_java;
public class MyJava {
public static void main(String [] args) {
MyScala.setFunc(myFunc); // This line gives an error
}
public static String myFunc(int someInt) {
return String.valueOf(someInt);
}
}
However, the above does not work (as expected, since Java does not support functional programming). What is the easiest workaround to pass a function in Java? I would like a generic solution that works with functions having an arbitrary number of parameters.
EDIT: Does Java 8 have any better syntax than the classic solutions discussed below?
In the scala.runtime package, there are abstract classes named AbstractFunction1 and so on for other arities. To use them from Java you only need to override apply, like this:
Function1<Integer, String> f = new AbstractFunction1<Integer, String>() {
    public String apply(Integer someInt) {
        return myFunc(someInt);
    }
};
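Then you pass the wrapper to the Scala side like any other argument (a usage sketch, assuming the MyScala object from the question):
MyScala.setFunc(f); // the Scala method receives the Java-side Function1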
If you're on Java 8 and want to use Java 8 lambda syntax for this, check out https://github.com/scala/scala-java8-compat.
You have to manually instantiate a Function1 in Java. Something like:
final Function1<Integer, String> f = new Function1<Integer, String>() {
    public int $tag() {
        return Function1$class.$tag(this);
    }

    public <A> Function1<A, String> compose(Function1<A, Integer> f) {
        return Function1$class.compose(this, f);
    }

    public String apply(Integer someInt) {
        return myFunc(someInt);
    }
};

MyScala.setFunc(f);
This is taken from Daniel Spiewak’s “Interop Between Java and Scala” article.
The easiest way for me is to define a Java interface like:
public interface JFunction<A, B> {
    public B compute(A a);
}
Then modify your Scala code, overloading setFunc to also accept JFunction objects, such as:
object MyScala {
  // API for scala
  def setFunc(func: Int => String) {
    func(10)
  }

  // API for java
  def setFunc(jFunc: JFunction[Int, String]) {
    setFunc((i: Int) => jFunc.compute(i))
  }
}
You will naturally use the first definition from Scala, but still be able to use the second one from Java:
public class MyJava {
    public static void main(String[] args) {
        MyScala.setFunc(myFunc); // This line gives an error
    }

    public static final JFunction<Integer, String> myFunc =
        new JFunction<Integer, String>() {
            public String compute(Integer a) {
                return String.valueOf(a);
            }
        };
}
Here's my attempt at a solution, a little library: https://github.com/eirslett/sc8
You wrap your Java 8 lambda in F(...) and then it's converted to a Scala function.
