Reduce a collection of objects by common field in Java 8

I have a list of objects of class Ob, defined as:
class Ob {
private String type;
private List<String> attire;
// standard getter and setters
public Ob (String type){
this.type=type;
}
public Ob addAttrire(String att){
if(attire == null){
attire = new ArrayList<>();
}
attire.add(att);
return this;
}
}
I receive objects as
[{
"type" : "upper",
"attires" : ["t1","t2"]
},
{
"type" : "upper",
"attires" : ["t3","t4"]
},
{
"type" : "lower",
"attires" : ["l1","l2"]
}]
which I have to combine as
[{
"type" : "upper",
"attires" : ["t1","t2","t3","t4"]
},{
"type" : "lower",
"attires" : ["l1","l2"]
}]
How can I use streams to do that? Does reduce help?
The stream pipeline one can use is:
List<Ob> coll = new ArrayList<>();
coll.add(new Ob("a").addAttrire("1").addAttrire("2").addAttrire("3"));
coll.add(new Ob("a").addAttrire("1").addAttrire("2").addAttrire("3"));
coll.add(new Ob("a").addAttrire("1").addAttrire("2").addAttrire("3"));
coll.add(new Ob("b").addAttrire("1").addAttrire("2").addAttrire("3"));
coll.add(new Ob("b").addAttrire("1").addAttrire("2").addAttrire("3"));
coll.add(new Ob("b").addAttrire("1").addAttrire("2").addAttrire("3"));
Collection<Ob> values = coll.stream()
.collect(toMap(Ob::getType, Function.identity(), (o1, o2) -> {
o1.getAttire().addAll(o2.getAttire());
return o1;
})).values();
Update: I've updated the question with Ruben's solution. There is no requirement to remove duplicates, but that could be done by using a Set for attire in Ob. The current solution worked flawlessly.

You could collect with toMap, using a merge function that merges the lists:
Collection<Ob> values = coll.stream()
.collect(toMap(Ob::getType, Function.identity(), (o1, o2) -> {
o1.getAttire().addAll(o2.getAttire());
return o1;
})).values();
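A note worth adding (not from the original answer): the merge function mutates o1's attire list, so every Ob must expose a non-null, mutable list, which the addAttrire helper in the question guarantees. The snippet also relies on a static import of toMap; the imports it assumes are roughly:
// Assumed imports for the snippet above (a sketch, not part of the original answer).
import java.util.Collection;
import java.util.function.Function;
import static java.util.stream.Collectors.toMap;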

This solution uses the groupingBy collector and then a separate step that creates a new Ob which is the result of merging all the Obs that have the same type.
I think Ruben's solution has an advantage over this answer because it's a little shorter and simpler. I think this answer has an advantage because it doesn't modify the original Obs and hence is more in a functional style.
public static void testTrickyStreamSet() {
Stream<Ob> s = Stream.of(
new Ob("a", "1", "2"),
new Ob("b", "1", "4"),
new Ob("a", "1", "3"),
new Ob("b", "1", "5"));
List<Ob> os = s.collect(groupingBy(o -> o.type))
.entrySet().stream()
.map(e -> new Ob(e.getKey(),
e.getValue().stream().flatMap(o -> o.attire.stream()).collect(toList())))
.collect(toList());
// Prints [<Ob type=a, attire=[1, 2, 1, 3]>, <Ob type=b, attire=[1, 4, 1, 5]>]
System.out.println(os);
}
public static class Ob {
public String type;
public List<String> attire;
public Ob(String type, String... attire) {
this.type = type;
this.attire = Arrays.asList(attire);
}
public Ob(String type, List<String> attire) {
this.type = type;
this.attire = new ArrayList<>(attire);
}
@Override
public String toString() {
return "<Ob type=" + type + ", attire=" + attire + ">";
}
}
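If Java 9 or later is available (an assumption; the question targets Java 8), the grouping and the flattening can be fused into a single collector with Collectors.flatMapping. A sketch against the same Ob class:
// Sketch, Java 9+ only: Collectors.flatMapping moves the flatMap step into the downstream collector.
// Uses a fresh stream, since a stream can only be consumed once.
List<Ob> merged = Stream.of(
                new Ob("a", "1", "2"),
                new Ob("b", "1", "4"),
                new Ob("a", "1", "3"),
                new Ob("b", "1", "5"))
        .collect(groupingBy(o -> o.type,
                Collectors.flatMapping(o -> o.attire.stream(), toList())))
        .entrySet().stream()
        .map(e -> new Ob(e.getKey(), e.getValue()))
        .collect(toList());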

Related

Transform Java Stream and return reduced values both

Assuming I consume a Stream of entities from a source which I do not want to materialize, and I want to both transform the elements and return some globally reduced value, what is the idiomatic way in Java 8?
This is essentially trying to perform both a reduce() and a collect().
Example:
class Person {
public String firstname;
public String lastname;
public int age;
}
class TeamSummary {
public List<String> fullnames; // firstname and lastname of all
public Person oldest;
}
public TeamSummary getSummary(Stream<Person> personStream) {
final TeamSummary summary = new TeamSummary();
summary.fullnames = personStream
.peek(p -> {
if (summary.oldest == null || summary.oldest.age < p.age) {
summary.oldest = p;
}
})
.map(p -> p.firstname + ' ' + p.lastname)
.collect(toList());
return summary;
}
It looks ugly to interact with a variable outside the stream from inside the peek method, but what good alternatives are there? It seems I need to combine collect() and reduce().
It gets worse if I want both a reduced value from the whole stream (like the average age) and a filtered list (like the persons above 18 years). It also gets worse if TeamSummary is an immutable class and additional mutable variables are required.
In such cases, is it more idiomatic to use a while loop over stream.iterator() to avoid coupling stream methods and outside variables? Or is it natural to reduce to a tuple like (oldest, accumulated)?
I am aware this question is a matter of opinion unless there is an obvious way (like a special collector) that solves this elegantly.
So you want to reduce your collection to a single value? That's where Collectors.reducing comes into play (an alternative: you could use Stream.reduce, with some other modifications). Furthermore, you want to aggregate your values in some way, and you already have the perfect accumulator: TeamSummary.
Now, in the code below I made the following adjustments:
TeamSummary has the merge/identity functions required for reduction, as it serves as the accumulator
I use a Null Object instead of null for a non-existing person, which makes the code much more readable without null checks (an NPE during conversion being one of the problems). Have you thought about your output if the stream is empty?
I added a Person constructor for my own convenience. But consider using getters and final fields (even if you think getters and the whole fake encapsulation are boilerplate: you can use method references, e.g. to pass to a comparator, but not field references)
Here is the code:
static class Person {
public String firstname;
public String lastname;
public int age;
public Person(String firstname, String lastname, int age) {
this.firstname = firstname;
this.lastname = lastname;
this.age = age;
}
public static Person getNullObjectYoung() {
return new Person("", "", 0);
}
}
static class TeamSummary {
public List<String> fullnames;
public Person oldest;
public static TeamSummary merge(TeamSummary lhs, TeamSummary rhs) {
TeamSummary result = new TeamSummary();
result.fullnames = new ArrayList<>();
result.fullnames.addAll(lhs.fullnames);
result.fullnames.addAll(rhs.fullnames);
result.oldest = Comparator.<Person, Integer>comparing(p -> p.age).reversed()
.compare(lhs.oldest, rhs.oldest) < 0
? lhs.oldest
: rhs.oldest;
return result;
}
public static TeamSummary of(Person person) {
TeamSummary result = new TeamSummary();
result.fullnames = new ArrayList<>();
result.fullnames.add(person.firstname + " " + person.lastname);
result.oldest = person;
return result;
}
public static TeamSummary identity() {
TeamSummary result = new TeamSummary();
result.fullnames = new ArrayList<>();
result.oldest = Person.getNullObjectYoung();
return result;
}
}
public static void main(String[] args) {
Stream<Person> personStream = Arrays.asList(
new Person("Tom", "T", 32),
new Person("Bob", "B", 40))
.stream();
TeamSummary result = personStream.collect(
Collectors.reducing(
TeamSummary.identity(),
TeamSummary::of,
TeamSummary::merge
));
System.out.println(result.fullnames + " " + result.oldest.age);
}
Note: You asked for a Java 8 version. From Java 12 on, you could also use Collectors.teeing, since you basically want to do two different reductions at the same time (for which we currently leverage the accumulator).
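For reference, a sketch of what that Java 12+ variant could look like, assuming the Person and TeamSummary classes above (my illustration, not part of the original answer):
// Sketch only: requires Java 12+ for Collectors.teeing, and a fresh personStream.
TeamSummary result = personStream.collect(Collectors.teeing(
        Collectors.mapping((Person p) -> p.firstname + " " + p.lastname, Collectors.toList()),
        Collectors.maxBy(Comparator.comparingInt((Person p) -> p.age)),
        (names, oldest) -> {
            TeamSummary summary = TeamSummary.identity();
            summary.fullnames = names;
            summary.oldest = oldest.orElse(Person.getNullObjectYoung());
            return summary;
        }));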
Edit: I also added a solution for Stream.reduce, which requires a BiFunction (summary, person) -> summary:
static class TeamSummary {
...
public TeamSummary include(final Person person) {
final TeamSummary result = new TeamSummary();
result.fullnames = new ArrayList<>(fullnames);
result.fullnames.add(person.firstname + " " + person.lastname);
result.oldest = Comparator.<Person, Integer> comparing(p -> p.age).reversed()
.compare(oldest, person) < 0
? oldest
: person;
return result;
}
}
public static void main(final String[] args) {
...
final TeamSummary reduced = personStream.reduce(
TeamSummary.identity(),
TeamSummary::include,
TeamSummary::merge);
}
Based on the requirements (a Stream as input, and the complete list of names inferred for the output TeamSummary), you can map each person and its name details to an entry and then reduce those entries, such as:
return personStream
.map(p -> new AbstractMap.SimpleEntry<>(p, Collections.singletonList(p.getFirstname() + ' ' + p.getLastname())))
.reduce((entry1, entry2) -> new AbstractMap.SimpleEntry<>(entry1.getKey().getAge() >= entry2.getKey().getAge() ?
entry1.getKey() : entry2.getKey(), Stream.of(entry1.getValue(), entry2.getValue()).flatMap(List::stream).collect(Collectors.toList())))
.map(entry -> new TeamSummary(entry.getKey(), entry.getValue()))
.orElseThrow(IllegalArgumentException::new);
For a more readable and simplified approach, though, I would rather suggest passing in the collection and using multiple stream operations to construct the TeamSummary:
public TeamSummary getSummary(List<Person> people) {
List<String> fullNames = people.stream()
.map(p -> p.getFirstname() + ' ' + p.getLastname())
.collect(Collectors.toList());
Person oldestPerson = people.stream()
.reduce(BinaryOperator.maxBy(Comparator.comparing(Person::getAge)))
.orElseThrow(IllegalArgumentException::new);
return new TeamSummary(oldestPerson, fullNames);
}
I don't know why you'd use Collectors.reducing() when you can call Stream.reduce() directly:
BinaryOperator<Player> older = (p1, p2) -> {
if (p1 == null) return p2;
if (p2 == null) return p1;
return Comparator.comparing(Player::getAge).compare(p1, p2) > 0 ? p1 : p2;
};
TeamSummary summary = stream.reduce(
new TeamSummary(), // identity
// accumulator
(ts, player) -> {
ts.addFullnames(String.format("%s %s", player.firstName, player.lastName));
ts.setOldest(older.apply(ts.getOldest(), player));
return ts;
},
// combiner
(ts1, ts2) -> {
// we can safely modify the given summaries, they were all created while reducing
ts1.setOldest(older.apply(ts1.getOldest(), ts2.getOldest()));
ts1.addFullnames(ts2.getFullnames().toArray(new String[0]));
return ts1;
});
TeamSummary would then look like this:
class TeamSummary {
private Player oldest;
public Player getOldest() { return oldest; }
public void setOldest(Player newOldest) { oldest = newOldest; }
private List<String> fullnames = new ArrayList<>();
public List<String> getFullnames() { return Collections.unmodifiableList(fullnames); }
public void addFullnames(String... names) {
fullnames.addAll(Arrays.asList(names));
}
}
Alternative
You could also extend TeamSummary with something like addPlayer(Player p) and merge() to allow it to maintain its consistency:
class TeamSummary {
@Getter
private Player oldest;
@Getter
private List<String> fullnames = new ArrayList<>();
public TeamSummary addPlayer(Player p) {
fullnames.add(String.format("%s %s", p.getFirstname(), p.getLastname()));
oldest = olderPlayer(oldest, p);
return this;
}
public TeamSummary merge(TeamSummary other) {
oldest = olderPlayer(oldest, other.oldest);
fullnames.addAll(other.fullnames);
return this;
}
final static Comparator<Player> BY_AGE = Comparator.comparing(Player::getAge);
private static Player olderPlayer(Player p1, Player p2) {
if (p1 == null) return p2;
if (p2 == null) return p1;
return BY_AGE.compare(p1, p2) > 0 ? p1 : p2;
}
}
which would make the reduction:
TeamSummary summary = stream.reduce(
new TeamSummary(),
TeamSummary::addPlayer,
TeamSummary::merge);
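One more remark (my addition, not from the original answer): since addPlayer mutates the summary, Stream.collect, which is designed for exactly this kind of mutable reduction, is arguably a better fit than reduce, whose identity value is expected to be left untouched:
// Sketch: mutable reduction with collect(supplier, accumulator, combiner).
// Reuses the addPlayer/merge methods above; the combiner's return value is simply ignored.
TeamSummary summary = stream.collect(
        TeamSummary::new,
        TeamSummary::addPlayer,
        TeamSummary::merge);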

Is there something besides Jackson to handle complex objects in Java?

I have to be able, on the backend side, to receive/send a JSON structure like the one below.
{
"firstObject":{
"type":"string",
"value":"productValue"
},
"secondObject":{
"type":"string",
"value":"statusValue"
},
"thirdObject":{
"type":"map",
"value":{
"customerName":{
"type":"string",
"value":"customerValue"
}
}
},
"fourthObject":{
"type":"map",
"value":{
"firstObject":{
"type":"map",
"value":{
"anotherObj1":{
"type":"string",
"value":"TEST"
},
"anotherObj2":{
"type":"date",
"value":"01/12/2018"
},
"anotherObj3":{
"type":"date",
"value":"31/01/2018"
}
}
}
}
}
}
What makes this a little bit tricky is that, for each object, I have to know what type it is. There could be 4 types:
int
string
boolean
map
If the value type for an object is map (selected by a customer), for example, another key/value structure will appear on the frontend side, so what I'm going to receive on the backend side is a dynamic structure. I will need to validate this structure, to check whether it complies with what I'm expected to receive.
I would appreciate an opinion on whether I should use just a Java class to model the objects I need, or whether, besides that, I should use Jackson for JSON validation and for mapping all those objects to JSON.
If I use Jackson, I'll have to write a custom serializer and deserializer.
From the Jackson library you can use the JsonTypeInfo and JsonSubTypes annotations. They handle polymorphic types:
@JsonTypeInfo is used to indicate details of what type information is included in serialization
@JsonSubTypes is used to indicate sub-types of the annotated type
@JsonTypeName is used to define the logical type name to use for the annotated class
Your example fits this solution, except for the root object, which looks more like a simple POJO class. In your case we should create a type structure that helps work with these three types: string, date, and map:
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "type")
@JsonSubTypes({
@JsonSubTypes.Type(value = StringValue.class, name = "string"),
@JsonSubTypes.Type(value = DateValue.class, name = "date"),
@JsonSubTypes.Type(value = MapValue.class, name = "map")
})
abstract class HasValue<T> {
protected T value;
public HasValue() {
this(null);
}
public HasValue(T value) {
this.value = value;
}
public T getValue() {
return value;
}
public void setValue(T value) {
this.value = value;
}
@Override
public String toString() {
return getClass().getSimpleName() + "{" +
"value=" + value +
"}";
}
}
class StringValue extends HasValue<String> {
public StringValue() {
this(null);
}
public StringValue(String value) {
super(value);
}
}
class DateValue extends HasValue<String> {
public DateValue(String value) {
super(value);
}
public DateValue() {
this(null);
}
}
class MapValue extends HasValue<Map<String, HasValue>> {
public MapValue(Map<String, HasValue> value) {
super(value);
}
public MapValue() {
this(new LinkedHashMap<>());
}
public void put(String key, HasValue hasValue) {
this.value.put(key, hasValue);
}
}
Now, we need to introduce a POJO for the root value. It could look like the class below; you can add getters/setters if you want, but for this example the code below will be enough:
class Root {
public HasValue firstObject;
public HasValue secondObject;
public HasValue thirdObject;
public HasValue fourthObject;
@Override
public String toString() {
return "Root{" +
"firstObject=" + firstObject +
", secondObject=" + secondObject +
", thirdObject=" + thirdObject +
", fourthObject=" + fourthObject +
'}';
}
}
Now, we can finally try to serialise and deserialise these objects:
MapValue customerName = new MapValue();
customerName.put("customerName", new StringValue("customerValue"));
MapValue innerMap = new MapValue();
innerMap.put("anotherObj1", new StringValue("TEST"));
innerMap.put("anotherObj2", new DateValue("01/12/2018"));
innerMap.put("anotherObj3", new DateValue("31/01/2018"));
MapValue fourthObject = new MapValue();
fourthObject.put("firstObject", innerMap);
Root root = new Root();
root.firstObject = new StringValue("productValue");
root.secondObject = new StringValue("statusValue");
root.thirdObject = customerName;
root.fourthObject = fourthObject;
ObjectMapper mapper = new ObjectMapper();
String json = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(root);
System.out.println(json);
System.out.println(mapper.readValue(json, Root.class));
The above code prints this JSON:
{
"firstObject" : {
"type" : "string",
"value" : "productValue"
},
"secondObject" : {
"type" : "string",
"value" : "statusValue"
},
"thirdObject" : {
"type" : "map",
"value" : {
"customerName" : {
"type" : "string",
"value" : "customerValue"
}
}
},
"fourthObject" : {
"type" : "map",
"value" : {
"firstObject" : {
"type" : "map",
"value" : {
"anotherObj1" : {
"type" : "string",
"value" : "TEST"
},
"anotherObj2" : {
"type" : "date",
"value" : "01/12/2018"
},
"anotherObj3" : {
"type" : "date",
"value" : "31/01/2018"
}
}
}
}
}
}
And the toString representation:
Root{firstObject=StringValue{value=productValue}, secondObject=StringValue{value=statusValue}, thirdObject=MapValue{value={customerName=StringValue{value=customerValue}}}, fourthObject=MapValue{value={firstObject=MapValue{value={anotherObj1=StringValue{value=TEST}, anotherObj2=DateValue{value=01/12/2018}, anotherObj3=DateValue{value=31/01/2018}}}}}}
You can easily manipulate output by adding/removing any kind of HasValue instance.
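The question also lists int and boolean among the possible types; a sketch of how those could be supported (my illustration, not part of the original answer):
// Sketch: two extra subtypes for the "int" and "boolean" types mentioned in the question.
class IntValue extends HasValue<Integer> {
    public IntValue() { this(null); }
    public IntValue(Integer value) { super(value); }
}
class BooleanValue extends HasValue<Boolean> {
    public BooleanValue() { this(null); }
    public BooleanValue(Boolean value) { super(value); }
}
// They also have to be registered in the @JsonSubTypes annotation on HasValue:
// @JsonSubTypes.Type(value = IntValue.class, name = "int"),
// @JsonSubTypes.Type(value = BooleanValue.class, name = "boolean")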
For more info see:
Jackson annotations
Jackson Maven
Download the JSON jar from here. On the client side, convert your JSON to a string using JSON.stringify (because JSONObject's constructor accepts only a string). After receiving the request from the client, do this:
public void doPost(HttpServletRequest request, HttpServletResponse response) throws ParseException, JSONException {
parseMapFromJSON(request.getParameter("JSONFromClient"));
}
private void parseMapFromJSON(String JSONParam) throws JSONException
{
JSONObject requestJSON = new JSONObject(JSONParam);
for(int i=0; i<requestJSON.length();i++)
{
String key = (String) requestJSON.names().get(i);
if(key.endsWith("Object"))
{
parseMapFromJSON(requestJSON.get(key).toString());
}
else if(key.startsWith("type") && (requestJSON.get(key).equals("date") || requestJSON.get(key).equals("string")))
{
System.out.println(requestJSON.get("value"));
break;
}
else if(key.startsWith("type") && requestJSON.get(key).equals("map"))
{
parseMapFromJSON(requestJSON.get("value").toString());
}
else if(!key.equals("value"))
{
parseMapFromJSON(requestJSON.get(key).toString());
}
}
}
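To try the parser outside a servlet, something like this should work, assuming the org.json library that the snippet above appears to use (my sketch, not part of the original answer):
// Sketch: exercising parseMapFromJSON directly with a small JSON string
// (assumes org.json on the classpath and that parseMapFromJSON is made static
// or is called on an instance of the servlet class).
public static void main(String[] args) throws JSONException {
    String json = "{\"firstObject\":{\"type\":\"string\",\"value\":\"productValue\"}}";
    parseMapFromJSON(json); // prints: productValue
}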

Java 8 List to Map conversion

I have a problem with converting a List<Car> to a Map<String, List<Car>>. I'm looking for a Map whose keys are the names of all components across the cars, and whose values are lists of the cars that contain that component.
public class Car {
private String model;
private List<String> components;
// getters and setters
}
I wrote a solution, but I'm looking for a better stream solution.
public Map<String, List<Car>> componentsInCar() {
HashSet<String> components = new HashSet<>();
cars.stream().forEach(x -> x.getComponents().stream().forEachOrdered(components::add));
Map<String, List<Car>> mapCarsComponents = new HashMap<>();
for (String keys : components) {
mapCarsComponents.put(keys,
cars.stream().filter(c -> c.getComponents().contains(keys)).collect(Collectors.toList()));
}
return mapCarsComponents;
}
You could do it with streams too, but I find this a bit more readable:
public static Map<String, List<Car>> componentsInCar(List<Car> cars) {
Map<String, List<Car>> result = new HashMap<>();
cars.forEach(car -> {
car.getComponents().forEach(comp -> {
result.computeIfAbsent(comp, ignoreMe -> new ArrayList<>()).add(car);
});
});
return result;
}
Or using a stream:
public static Map<String, List<Car>> componentsInCar(List<Car> cars) {
return cars.stream()
.flatMap(car -> car.getComponents().stream().distinct().map(comp -> new SimpleEntry<>(comp, car)))
.collect(Collectors.groupingBy(
Entry::getKey,
Collectors.mapping(Entry::getValue, Collectors.toList())
));
}
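A quick usage sketch (the Car constructor here is hypothetical, since the original class only mentions getters and setters):
// Sketch: hypothetical Car(String model, List<String> components) constructor, for illustration only.
List<Car> cars = Arrays.asList(
        new Car("ModelA", Arrays.asList("engine", "wheels")),
        new Car("ModelB", Arrays.asList("engine", "roof")));
Map<String, List<Car>> byComponent = componentsInCar(cars);
// byComponent.get("engine") contains both cars; "wheels" maps only to ModelA; "roof" only to ModelB.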
I know this is a Java question, and there is already a Java answer. However, I would like to add that in Kotlin, a JVM language that is fully interoperable with Java, you can do things like this very easily and cleanly:
val carsByComponent = cars
.flatMap { it.components }
.distinct()
.map { component -> component to cars.filter { car -> component in car.components } }
.toMap()
or even more concise, although less readable:
val carsByComponent = cars
.flatMap { car -> car.components.map { it to car } }
.groupBy { it.first }
.mapValues {it.value.map { it.second }}

What is the Java 8 equivalent to Guava's transformAndConcat?

Let's assume I have a class containing a List, e.g.
public static class ListHolder {
List<String> list = new ArrayList<>();
public ListHolder(final List<String> list) {
this.list = list;
}
public List<String> getList() {
return list;
}
}
Let's furthermore assume I have a whole list of instances of this class:
ListHolder listHolder1 = new ListHolder(Arrays.asList("String 1", "String 2"));
ListHolder listHolder2 = new ListHolder(Arrays.asList("String 3", "String 4"));
List<ListHolder> holders = Arrays.asList(listHolder1, listHolder2);
And now I need to extract all Strings to get a String List containing all Strings of all instances, e.g.:
[String 1, String 2, String 3, String 4]
With Guava this would look like this:
List<String> allString = FluentIterable
.from(holders)
.transformAndConcat(
new Function<ListHolder, List<String>>() {
@Override
public List<String> apply(final ListHolder listHolder) {
return listHolder.getList();
}
}
).toList();
My question is how can I achieve the same with the Java 8 stream API?
List<String> allString = holders.stream()
.flatMap(h -> h.getList().stream())
.collect(Collectors.toList());
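For completeness, a self-contained sketch of the Java 8 version with the imports it assumes (my wiring of the pieces above, assuming ListHolder is visible, e.g. declared in the same file):
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class FlatMapExample {
    public static void main(String[] args) {
        ListHolder listHolder1 = new ListHolder(Arrays.asList("String 1", "String 2"));
        ListHolder listHolder2 = new ListHolder(Arrays.asList("String 3", "String 4"));
        List<ListHolder> holders = Arrays.asList(listHolder1, listHolder2);
        // flatMap replaces Guava's transformAndConcat
        List<String> allStrings = holders.stream()
                .flatMap(h -> h.getList().stream())
                .collect(Collectors.toList());
        System.out.println(allStrings); // [String 1, String 2, String 3, String 4]
    }
}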
Here is an older question about collection flattening: (Flattening a collection)

Finding an object from list inside a map

This is my first question on Stack Overflow. I have come across an issue with a problem suggested to me by a colleague, who asked me to do some research on it.
My question is: I have a class
class Function {
String func;
String funcname;
boolean log;
}
I have created some objects:
obj1 : ("a", "b", true) // these values come from either the DB or the UI
obj2 : ("c", "x", true)
obj3 : ("a", "z", true)
I have a list:
List<Function> flist;
Now I want to put that list into a map:
Map<String, List<Function>> funcMap
and then display the following output:
a : [obj1, obj3]
c : [obj2]
Given that I have the list, how do I go about producing the output above?
Try this.
Add all the objects to flist.
Initialize the map:
Map<String, List<Function>> funcMap = new HashMap<String, List<Function>>();
Then add each object under the relevant key; based on its func value, the object will be added to that key's value list.
for (Function functionValue : flist)
{
List<Function> functionList = funcMap.get(functionValue.getFunc());
if (functionList != null && !functionList.isEmpty())
{
functionList.add(functionValue);
}
else
{
functionList = new ArrayList<Function>();
functionList.add(functionValue);
funcMap.put(functionValue.getFunc(), functionList);
}
}
At last, print the funcMap:
for (Map.Entry< String, List<Function>> entry : funcMap.entrySet())
{
System.out.println("Key : " + entry.getKey() + "Values : "+entry.getValue());
}
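Since the rest of this page is about Java 8: the whole loop above collapses to a single groupingBy call (my addition, assuming the getFunc() getter used above):
// Sketch: the same grouping with Java 8 streams.
Map<String, List<Function>> grouped = flist.stream()
        .collect(Collectors.groupingBy(Function::getFunc));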
Hmm... I think it's a case of parsing your list in a nested-loop kind of way. Here is the pseudo-code:
public void listToMap(List<Function> list)
{
Map<String, List<Function>> map := new Map
for every function in the list.
{
if(the current function's func value does not exist in the map)
{
func := current functions func value
List matchingFunctions := new list of Functions.
for(every function in the list.)
{
// Every Function with the same key gets added to a list.
if(function has the same func value as func)
{
add to matchingFunctions.
}
}
// That list and key get put into the HashMap.
map.put(func, matchingFunctions).
}
}
}
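For reference, a direct (if quadratic) Java translation of that pseudo-code could look roughly like this (my sketch, assuming a getFunc() getter):
// Sketch: nested-loop translation of the pseudo-code above.
public Map<String, List<Function>> listToMap(List<Function> list) {
    Map<String, List<Function>> map = new HashMap<>();
    for (Function current : list) {
        if (!map.containsKey(current.getFunc())) {
            String func = current.getFunc();
            List<Function> matchingFunctions = new ArrayList<>();
            for (Function candidate : list) {
                // every Function with the same func value gets added to the list
                if (candidate.getFunc().equals(func)) {
                    matchingFunctions.add(candidate);
                }
            }
            map.put(func, matchingFunctions);
        }
    }
    return map;
}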
A Note on your code design
Java convention states that you should wrap your member objects up in getters and setters, and that those members should be private.
What about this:
public class FuncTest {
public static void main(String[] args) {
new FuncTest().start();
}
private void start() {
List<Function> flist = new ArrayList<Function>();
flist.add(new Function("a", "b", true));
flist.add(new Function("c", "x", true));
flist.add(new Function("a", "z", true));
Map<String, List<Function>> funcMap = new HashMap<String, List<Function>>();
for (Function func : flist) {
this.add(func.func, func, funcMap);
this.add(func.funcname, func, funcMap);
}
}
private void add(String field, Function func, Map<String, List<Function>> funcMap) {
List<Function> subList = funcMap.get(field);
if (subList == null) {
subList = new ArrayList<Function>();
funcMap.put(field, subList);
}
subList.add(func);
}
}
Note
As already mentioned by Chris, you should think about your code design. Use getters and setters.
public class Stackoverflow {
public static void main(String[] args) {
Function obj1 = new Function("a" ,"b",true);
Function obj2 = new Function("c" ,"x",true);
Function obj3 = new Function("a" ,"z",true);
List<Function> functionsList1 = new ArrayList<Function>();
functionsList1.add(obj1);
functionsList1.add(obj3);
List<Function> functionsList2 = new ArrayList<Function>();
functionsList2.add(obj2);
Map<String, List<Function>> funcMap = new LinkedHashMap<String, List<Function>>();
funcMap.put("a", functionsList1);
funcMap.put("b", functionsList2);
Set<Entry<String,List<Function>>> entrySet = funcMap.entrySet();
for (Entry<String, List<Function>> entry : entrySet) {
System.out.println(entry.getKey() + " : " + entry.getValue());
}
}
}
class Function {
String func;
String funcname;
boolean log;
public Function(String func, String funcname, boolean log) {
super();
this.func = func;
this.funcname = funcname;
this.log = log;
}
@Override
public String toString() {
return "Function [func=" + func + ", funcname=" + funcname + ", log="
+ log + "]";
}
}
Write your own map.
Pass the list to the map and let the map decide what portion of the list to keep as the value.
I have added the put method here; in the same way, you would have to override the other methods.
class MyHashMap<K,V> extends HashMap<K,V>{
@SuppressWarnings("unchecked")
public V put(K k, V v) {
String key = (String) k;
List<Function> list = (List<Function>) v;
List<Function> list2 = new ArrayList<Function>();
for (Function function : list) {
if (key.equalsIgnoreCase(function.func)) {
list2.add(function);
}
}
// store the filtered list and return the previous value, as the Map contract expects
return super.put(k, (V) list2);
}
@Override
public boolean equals(Object o) {
// Your own code
return true;
}
// other methods goes here..
}
