I understand that the soft constraint in the TSP problem is to minimize the total travel distance, but the two Drools constraints in the example seem to express only the distance of each segment and the distance from the last visited city back to the starting point. It looks as if the total distance is never calculated. Can you explain these two constraints in detail? I'm a beginner and don't quite understand them.
rule "distanceToPreviousStandstill"
when
$visit : Visit(previousStandstill != null, $distanceFromPreviousStandstill : distanceFromPreviousStandstill)
then
scoreHolder.addConstraintMatch(kcontext, - $distanceFromPreviousStandstill);
end
rule "distanceFromLastVisitToDomicile"
when
$visit : Visit(previousStandstill != null)
not Visit(previousStandstill == $visit)
$domicile : Domicile()
then
scoreHolder.addConstraintMatch(kcontext, - $visit.getDistanceTo($domicile));
end
In DRL (unlike in Java), writing Visit(..., $distanceFromPreviousStandstill : distanceFromPreviousStandstill) means it is calling the getDistanceFromPreviousStandstill() getter:
public class Visit extends AbstractPersistable implements Standstill {
...
public long getDistanceFromPreviousStandstill() {
if (previousStandstill == null) {
return 0L;
}
return getDistanceFrom(previousStandstill);
}
}
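To see why nothing more is needed to "calculate the distance": every visit is penalized for the leg from its previous standstill, and the last visit is additionally penalized for the leg back to the domicile, so the per-leg penalties add up to the full tour length. A toy sketch (hypothetical coordinates and Manhattan distances, not OptaPlanner code):

```java
// Toy illustration: summing the per-leg penalties of the two constraints
// yields the total tour distance (domicile -> a -> b -> c -> domicile).
import java.util.List;

public class TourDistanceSketch {
    public static void main(String[] args) {
        int[] domicile = {0, 0};
        // visits in chain order
        List<int[]> visits = List.of(new int[]{0, 3}, new int[]{4, 3}, new int[]{4, 0});

        long total = 0;
        int[] previous = domicile;
        for (int[] visit : visits) {
            total += manhattan(previous, visit);  // "distanceToPreviousStandstill" per visit
            previous = visit;
        }
        total += manhattan(previous, domicile);   // "distanceFromLastVisitToDomicile"

        System.out.println(total); // 3 + 4 + 3 + 4 = 14
    }

    static long manhattan(int[] from, int[] to) {
        return Math.abs(from[0] - to[0]) + Math.abs(from[1] - to[1]);
    }
}
```

The solver never needs the "total" as a separate fact: as visits are re-chained, each per-leg constraint match is added or retracted incrementally, and the score is always the (negated) sum of the matched legs.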
If you prefer Java over DRL, take a look at the ConstraintProvider alternative implementation, which is written in Java, also uses incremental score calculation (= it can scale), is equally fast (8.3.0+), is easier to debug, and gets code highlighting and code completion in any Java IDE:
public final class TspConstraintProvider implements ConstraintProvider {
@Override
public Constraint[] defineConstraints(ConstraintFactory constraintFactory) {
return new Constraint[] {
distanceToPreviousStandstill(constraintFactory),
distanceFromLastVisitToDomicile(constraintFactory)
};
}
private Constraint distanceToPreviousStandstill(ConstraintFactory constraintFactory) {
return constraintFactory.from(Visit.class)
.penalizeLong("Distance to previous standstill",
SimpleLongScore.ONE,
Visit::getDistanceFromPreviousStandstill);
}
private Constraint distanceFromLastVisitToDomicile(ConstraintFactory constraintFactory) {
return constraintFactory.from(Visit.class)
.ifNotExists(Visit.class, Joiners.equal(visit -> visit, Visit::getPreviousStandstill))
.join(Domicile.class)
.penalizeLong("Distance from last visit to domicile",
SimpleLongScore.ONE,
Visit::getDistanceTo);
}
}
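In both versions, "the last visit" is identified the same way: not Visit(previousStandstill == $visit) in DRL and ifNotExists(...) in the ConstraintProvider both mean "no other visit has me as its previous standstill", i.e. I am the tail of the chain. A plain-Java sketch of that selection, using a hypothetical minimal Visit record:

```java
// Plain-Java sketch (hypothetical Visit record, not OptaPlanner's class) of what
// "not Visit(previousStandstill == $visit)" / ifNotExists(...) selects:
// the visit that no other visit points back to, i.e. the chain's tail.
import java.util.List;
import java.util.Optional;

public class LastVisitSketch {
    record Visit(String name, String previousStandstill) {}

    static Optional<Visit> lastVisit(List<Visit> visits) {
        return visits.stream()
                .filter(v -> visits.stream()
                        .noneMatch(other -> v.name().equals(other.previousStandstill())))
                .findFirst();
    }

    public static void main(String[] args) {
        // chain: domicile -> a -> b -> c
        List<Visit> visits = List.of(
                new Visit("a", "domicile"),
                new Visit("b", "a"),
                new Visit("c", "b"));
        System.out.println(lastVisit(visits).get().name()); // c
    }
}
```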
I want to use the "pin" feature of OptaPlanner for immovable operations. However, I get a "Bailing out of neverEnding selector (Filtering(FromSolutionEntitySelector(PersonAssignment))) to avoid infinite loop." error. I've tried both @PlanningPin and movableEntitySelectionFilter. The OptaPlanner version is 7.32.0.Final; I also tried version 7.44.0.Final. I've searched a lot and, as far as I can see, this problem was solved in version 7.31.0.Final.
I've generated my domain and it is working well. The idea of my problem: there is more than one hotel. People come to a hotel at different hours. Each hotel has a different capacity, and I want to assign people to a hotel if there is enough space. I also have many rules about assignment. My domain model consists of two @PlanningEntity classes, a custom shadow variable and also TimeGrain. The structure is below:
@PlanningEntity(movableEntitySelectionFilter = PersonAssignmentSelectionFilter.class)
public class PersonAssignment extends AbstractPersistable {
private Person person;
private TimeGrain startingTimeGrain;
private TimeGrain endTime; // This added
private HotelDomain hotelDomain;
public TimeGrain getStartingTimeGrain() {
return startingTimeGrain;
}
public void setStartingTimeGrain(TimeGrain startingTimeGrain) {
this.startingTimeGrain = startingTimeGrain;
}
public TimeGrain getEndTime() {
return endTime;
}
public void setEndTime(TimeGrain endTime) {
this.endTime = endTime;
}
@PlanningVariable(valueRangeProviderRefs = { "hotelDomainRange" }, nullable = true)
public HotelDomain getHotelDomain() {
return hotelDomain;
}
public void setHotelDomain(HotelDomain hotelDomain) {
this.hotelDomain = hotelDomain;
}
......
}
@DeepPlanningClone
public class HotelDomain extends AbstractPersistable {
private String hotelName;
@CustomShadowVariable(variableListenerRef = @PlanningVariableReference(variableName = "tightOccupancy"))
private Map<Integer, HotelOccupancyPerSlot> hotelOccupancyMap;
......
}
@PlanningEntity
public class HotelOccupancyPerSlot extends AbstractPersistable {
@CustomShadowVariable(variableListenerClass = HotelDomainVariableListener.class, sources = {
@PlanningVariableReference(entityClass = PersonAssignment.class, variableName = "hotelDomain") })
private Integer tightOccupancy; // days
@CustomShadowVariable(variableListenerRef = @PlanningVariableReference(variableName = "tightOccupancy"))
private List<PersonAssignment> personAssignments;
.......
}
public class HotelDomainVariableListener implements VariableListener<PersonAssignment> {
......
}
config.xml is:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE xml>
<solver>
<moveThreadCount>3</moveThreadCount> <!-- To solve faster by saturating multiple CPU cores -->
<solutionClass>com.domain.HotelAccomodation</solutionClass>
<entityClass>com.domain.PersonAssignment</entityClass>
<entityClass>com.domain.HotelOccupancyPerSlot</entityClass>
<scoreDirectorFactory>
<scoreDrl>solver/solverScoreRules.drl</scoreDrl>
</scoreDirectorFactory>
<termination>
<minutesSpentLimit>15</minutesSpentLimit>
</termination>
</solver>
If I use @PlanningPin in HotelOccupancyPerSlot there is no problem, but I want to use it (or the filter) only in the PersonAssignment class because it is my basic class. Is there any suggestion? Should I add something to the config?
Thank You :)
The "Bailing out" warning happens when a move selection filter filters out every move (it never returns true). It's a warning, not an error, because if you see it only once it could, in theory, be a harmless fluke in some cases.
Anyway, when does a filter return false for all selected moves? In these cases:
A) you configured a custom move filter in solverConfig.xml that never returns true
B) the solution value range has no values - OptaPlanner checks for that in 7.44+ and does the right thing automatically. I'm not sure it does the right thing if there are 2 planning variables and only 1 has no values...
C) the solution value range has a single value - that would mean every move's isDoable() returns false. I don't recall this being a problem though.
D) the solution value range has no non-pinned value. See B)
E) the solution value range has only 1 non-pinned value. See C)
Take a look at your input data. Are you dealing with case B, C, D or E by any chance?
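This is not OptaPlanner API, just a plain-Java sketch of the failure mode behind cases D and E: if the movable-entity filter (or pinning) rejects every entity, the entity selector has nothing left to propose, which is exactly when the "Bailing out of neverEnding selector" warning fires.

```java
// Plain-Java sketch (hypothetical PersonAssignment record) of the failure mode:
// a filter that accepts zero entities leaves the selector with nothing to select.
import java.util.List;
import java.util.function.Predicate;

public class FilterSketch {
    record PersonAssignment(String name, boolean pinned) {}

    static long movableCount(List<PersonAssignment> entities,
                             Predicate<PersonAssignment> movableFilter) {
        return entities.stream().filter(movableFilter).count();
    }

    public static void main(String[] args) {
        // Every entity pinned -> 0 movable entities -> every move gets filtered out.
        List<PersonAssignment> allPinned = List.of(
                new PersonAssignment("alice", true),
                new PersonAssignment("bob", true));
        System.out.println(movableCount(allPinned, p -> !p.pinned())); // 0

        // At least one non-pinned entity -> the selector has work to do.
        List<PersonAssignment> mixed = List.of(
                new PersonAssignment("alice", true),
                new PersonAssignment("bob", false));
        System.out.println(movableCount(mixed, p -> !p.pinned())); // 1
    }
}
```

So checking the input data amounts to counting, per planning entity class, how many entities the filter would actually accept.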
I'm trying to get into the basics of functional programming with Java 8. I have a simple task: set a property on an object and then persist it. The database column type is ltree, so persisting might fail if the value contains disallowed characters. I want to process items one by one and log exceptions/successes.
I chose the Vavr library because of its Try.of() exception handling, and I want to learn it as it seems very helpful.
Here is what I came up with, but I'm not satisfied with it:
public class PathHandler {
private final DocVersionDAO dao;
public void processWithHandling() {
Try.of(this::process)
.recover(x -> Match(x).of(
Case($(instanceOf(Exception.class)), this::logException)
));
}
private Stream<Try<DocVersion>> logException(Exception e) {
//log exception now but what to return? also I would like to have DocVersion here too..
return null;
}
public Stream<Try<DocVersion>> process() {
return dao.getAllForPathProcessing() //returns Stream<DocVersion>
.map(this::justSetIt)
.map(this::save);
}
public DocVersion justSetIt(DocVersion v) {
String path = Optional.ofNullable(v.getMetadata().getAdditionals().get(Vedantas.PATH))
.orElse(null);
log.info(String.format("document of uuid %s has metadata path %s; setting it", v.getDocument2().getUUID(), path));
v.getDocument2().setPath(path);
return v;
}
@Transactional(propagation = Propagation.REQUIRES_NEW)
public Try<DocVersion> save(DocVersion v) {
return Try.of(() -> dao.save(v));
}
}
The goal is quite simple, so could you teach me the proper way to do it?
I'm afraid this will become highly opinionated. Anyway, I'll try something.
... which happened before I realized what Vavr actually provides. It attempts to cover everything mentioned here, like immutable data structures and monad syntax sugaring (with the For statement), and goes beyond that by even coming up with pattern matching. It takes a comprehensive set of FP concepts and rebuilds them in Java, and it is no surprise that Scala comes to mind seeing this ("Vavr is greatly inspired by Scala").
Now the foundations of functional programming can't be covered by a single SO post. And it might be problematic to get familiar with them in a language like Java which isn't geared towards it. So perhaps it is better to approach them in their natural habitat like the Scala language, which is still in some proximity to Java, or Haskell, which is not.
Coming back from this detour: applying the features of Vavr may be more straightforward for the initiated, but likely not for the Java developer sitting next to you in the office, who is less willing to go the extra mile and comes up with arguments that can't just be dismissed, like this one: "If we wanted to do it that way, we would be a Scala shop". Therefore I'd say applying Vavr asks for a pragmatic attitude.
To corroborate the Vavr-Scala argument, let's take Vavr's For construct (all Lists mentioned are io.vavr.collection.List); it looks like this:
Iterator<Tuple2<Integer, String>> tuples =
For(List.of(1, 2, 3), i ->
For(List.of(4, 5, 6))
.yield(a -> Tuple.of(i, String.valueOf(a))));
In Scala you'd encounter for and yield this way:
val tuples = for {
i <- 1 to 3
a <- 4 to 6
} yield (i, String.valueOf(a))
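For comparison, the same cross product can be written with plain Java streams: flatMap plus map is the desugared shape that both the Scala for-expression and Vavr's For boil down to. A small sketch (using AbstractMap.SimpleEntry as a stand-in for Vavr's Tuple2):

```java
// Hand-desugared version of the For/yield example, with only the JDK:
// flatMap over the outer list, map over the inner one.
import java.util.AbstractMap.SimpleEntry;
import java.util.List;
import java.util.stream.Collectors;

public class ForYieldSketch {
    static List<SimpleEntry<Integer, String>> tuples() {
        return List.of(1, 2, 3).stream()
                .flatMap(i -> List.of(4, 5, 6).stream()
                        .map(a -> new SimpleEntry<>(i, String.valueOf(a))))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(tuples().size()); // 9
        System.out.println(tuples().get(0)); // 1=4
    }
}
```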
In Scala, all the monad machinery remains under the hood, whereas Vavr brings more of an approximation, necessarily leaking some internals. For the purpose of learning, it might be puzzling to start with Vavr's hybrid creatures.
So what remains of my post is a small-time treatment of some FP basics, using the OP's example, elaborating on immutability and Try at trench level, but omitting pattern matching. Here we go:
One of the defining characteristics of FP is functions free of side effects ("pure functions"), which naturally (so to speak) comes along with immutable data structures/objects - which may sound kind of weird. One obvious payoff is that you don't have to worry that your operations create unintended changes someplace else. But Java doesn't enforce any of that, and its immutable collections are immutable only on a superficial level. Of FP's signature characteristics, Java only offers higher-order functions, via lambdas.
I used the functional style quite a bit on the job manipulating complicated structures where I stuck to those 2 principles. E.g. load a tree T of objects from a db, do some transformations on it, which meant producing another tree of objects T', sort of one big map operation, place the changes in front of the user to accept or reject them. If accepted, apply the changes to the related JPA entities and persist them. So after the functional transformation two mutations were applied.
I'd propose to apply FP in this sense, and I tried to formulate a corresponding version of your code, using an immutable DocVersion class. I chose to simplify the Metadata part for the sake of the example.
I also tried to highlight how the "exception-free" Try approach (some of it poached from here) can be formulated and utilized some more. It's a small-time version of Vavr's Try, hopefully focusing on the essentials. Note its proximity to Java's Optional and the map and flatMap methods in there, which render it an incarnation of the FP concept called monad. Monads became notorious in a sweep of highly confusing blog posts some years ago, usually starting with "What is a monad?" (e.g. this one). They cost me some weeks of my life, while it is rather easy to get a good intuition of the concept just by using Java streams or Optionals. Miran Lipovača's "Learn You a Haskell for Great Good!" later made up for it to some extent, as did Martin Odersky's Scala language.
With of, map and flatMap on board, Try would, roughly speaking, qualify for the kind of syntax sugaring you find in C# (LINQ expressions) or Scala (for-expressions). In Java there is no equivalent, but some attempts to at least compensate a bit are listed here, and Vavr looks like another one. Personally I use the jOOL library occasionally.
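To get that monad intuition without any library at all, Optional's flatMap already shows the pattern: each step may fail (produce empty), and the chain short-circuits on the first failure - the same contract Try offers for exceptions. A self-contained sketch (the helper names are made up for illustration):

```java
// Each step may fail by returning Optional.empty(); flatMap short-circuits
// the rest of the chain on the first failure, just like Try does for exceptions.
import java.util.Optional;

public class OptionalChainSketch {
    static Optional<Integer> parse(String s) {
        try {
            return Optional.of(Integer.parseInt(s));
        } catch (NumberFormatException e) {
            return Optional.empty();
        }
    }

    static Optional<Integer> reciprocalPercent(int n) {
        // Fails (empty) instead of throwing ArithmeticException on division by zero.
        return n == 0 ? Optional.empty() : Optional.of(100 / n);
    }

    public static void main(String[] args) {
        System.out.println(parse("4").flatMap(OptionalChainSketch::reciprocalPercent)); // Optional[25]
        System.out.println(parse("0").flatMap(OptionalChainSketch::reciprocalPercent)); // Optional.empty
        System.out.println(parse("x").flatMap(OptionalChainSketch::reciprocalPercent)); // Optional.empty
    }
}
```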
Passing streams around as function results seems not quite canonical to me, since streams are not supposed to be reused. That's also the reason to create a List as an intermediate result in process().
public class PathHandler {
class DocVersionDAO {
public void save(DocVersion v) {
}
public DocVersion validate(DocVersion v) {
return v;
}
public Stream<DocVersion> getAllForPathProcessing() {
return null;
}
}
class Metadata {
@Id
private final Long id;
private final String value;
Metadata() {
this.id = null;
this.value = null;
}
Metadata(Long id, String value) {
this.id = id;
this.value = value;
}
public Optional<String> getValue() {
return Optional.ofNullable(value);
}
public Metadata withValue(String value) {
return new Metadata(id, value);
}
}
public @interface Id {
}
class DocVersion {
@Id
private Long id;
private final Metadata metadatata;
public Metadata getMetadatata() {
return metadatata;
}
public DocVersion(Long id) {
this.id = id;
this.metadatata = new Metadata();
}
public DocVersion(Long id, Metadata metadatata) {
this.id = id;
this.metadatata = metadatata;
}
public DocVersion withMetadatata(Metadata metadatata) {
return new DocVersion(id, metadatata);
}
public DocVersion withMetadatata(String metadatata) {
return new DocVersion(id, this.metadatata.withValue(metadatata));
}
}
private DocVersionDAO dao;
public List<DocVersion> process() {
List<Tuple2<DocVersion, Try<DocVersion>>> maybePersisted = dao.getAllForPathProcessing()
.map(d -> augmentMetadata(d, LocalDateTime.now().toString()))
.map(d -> Tuple.of(d, Try.of(() -> dao.validate(d))
.flatMap(this::trySave)))
.peek(i -> i._2.onException(this::logExceptionWithBadPracticeOfUsingPeek))
.collect(Collectors.toList());
maybePersisted.stream()
.filter(i -> i._2.getException().isPresent())
.map(e -> String.format("Item %s caused exception %s", e._1.toString(), fmtException(e._2.getException().get())))
.forEach(this::log);
return maybePersisted.stream()
.filter(i -> !i._2.getException().isPresent())
.map(i -> i._2.get())
.collect(Collectors.toList());
}
private void logExceptionWithBadPracticeOfUsingPeek(Exception exception) {
logException(exception);
}
private String fmtException(Exception e) {
return null;
}
private void logException(Exception e) {
log(fmtException(e));
}
public DocVersion augmentMetadata(DocVersion v, String augment) {
v.getMetadatata().getValue()
.ifPresent(m -> log(String.format("Doc %d has metadata %s, augmenting it with %s", v.id, m, augment)));
return v.withMetadatata(v.metadatata.withValue(v.getMetadatata().value + augment));
}
public Try<DocVersion> trySave(DocVersion v) {
return new Try<>(() -> {
dao.save(v);
return v;
});
}
private void log(String what) {
}
}
Try looks like this
public class Try<T> {
private T result;
private Exception exception;
private Try(T result, Exception exception) {
this.result = result;
this.exception = exception;
}
public static <T> Try<T> of(Supplier<T> f)
{
return new Try<>(f);
}
T get() {
if (result == null) {
throw new IllegalStateException();
}
return result;
}
public void onException(Consumer<Exception> handler)
{
if (exception != null)
{
handler.accept(exception);
}
}
public <U> Try<U> map(Function<T, U> mapper) {
return exception != null ? new Try<>(null, exception) : new Try<>(() -> mapper.apply(result));
}
public <U> Try<U> flatMap(Function<T, Try<U>> mapper) {
return exception != null ? new Try<>(null, exception) : mapper.apply(result);
}
public void onError(Consumer<Exception> exceptionHandler) {
if (exception != null) {
exceptionHandler.accept(exception);
}
}
public Optional<Exception> getException() {
return Optional.ofNullable(exception);
}
public Try(Supplier<T> r) {
try {
result = r.get();
} catch (Exception e) {
exception = e;
}
}
}
I am trying to specify a rule for a soft constraint in my .drl file. It is supposed to take two properties from the PlanningSolution (the Schedule class) and then execute a Java function with them. Sadly, the code seems not to be executed (no SoftScore is modified, even if I replace the function call getBlockNumberDifference with a plain -10). Can someone relate to this issue?
I have another rule that specifies a hard constraint, also using a function call, which works perfectly fine.
The Planning Solution:
@PlanningSolution
public class Schedule {
private Semester semester;
private List<Lecture> lectureList;
@PlanningEntityCollectionProperty
public List<Lecture> getLectureList() {
return lectureList;
}
public void setLectureList(List<Lecture> lectureList) {
this.lectureList = lectureList;
}
public Semester getSemester() {
return semester;
}
public void setSemester(Semester semester) {
this.semester = semester;
}
}
The rule:
import function (...).getBlockNumberDifference;
//...
rule "rule"
when
Schedule ( $s : semester != null && $l : lectureList != null)
then
scoreHolder.addSoftConstraintMatch(kcontext, getBlockNumberDifference($l, $s));
end
Test:
public static int getBlockNumberDifference(List<Lecture> lectureList, Semester semester) {
System.out.println("Calling Block number Difference " + lectureList.size() + " and " + semester.getBezeichnung());
return -1;
}
I am using OptaPlanner in Version 7.9.0 with Spring Boot and Java 8.
The planning solution isn't inserted into the working memory of Drools IIRC, so the LHS (the when side) of that rule never matches.
I might be wrong on this - to prove it, change the rule to when Schedule() then System.out.println("schedule is in working memory"); end and see whether that line ever shows up.
First, I am using Spring MVC.
I have a "Skill" model class, where I placed @JsonIgnoreProperties:
@JsonIgnoreProperties({"personSkills","berufsgruppes","skills"})
@JsonPropertyOrder({"idSkill", "name", "levelBezeichnung", "skill"})
I am using it because there are many-to-many, many-to-one and one-to-many relationships, and without this property a StackOverflowError (infinite recursion) is thrown. One skill can have many skills, so there is a kind of recursion.
I implemented a subclass of "Skill" named "SkillBean", which has one additional attribute, "checked", that is only relevant for the application, not for the database.
public class Skill implements java.io.Serializable {
private Set<Skill> skills = new HashSet<Skill>(0);
...
@OneToMany(fetch = FetchType.LAZY, mappedBy = "skill")
public Set<Skill> getSkills() {
return this.skills;
}
public void setSkills(Set<Skill> skills) {
this.skills = skills;
}
}
public class SkillBean extends Skill implements Serializable {
public boolean checked;
public SkillBean() {
}
public SkillBean(Skill skill, boolean checked) {
this.checked = checked;
}
public boolean isChecked() {
return checked;
}
public void setChecked(boolean checked) {
this.checked = checked;
}
}
I'm using BeanUtils.copyProperties() to copy a Skill object into a SkillBean object; this works fine. I need to reorder the skills because currently I get the lowest child skill first and not its parent. For this, I am trying to reorder the objects and build a tree in a list. Every skill has a Set of its children.
private ArrayList<SkillBean> formatSkillMap(HashMap<Integer, SkillBean> map) {
Map<Integer, SkillBean> tempSkills = (Map<Integer, SkillBean>) map.entrySet().stream().filter(p -> p.getValue().getSkill() == null)
.collect(Collectors.toMap(Entry::getKey, Entry::getValue));
ArrayList<SkillBean> list = new ArrayList<SkillBean>(tempSkills.values());
for (int i = 0; i < list.size(); i++) {
SkillBean sb = list.get(i);
tempSkills = (Map<Integer, SkillBean>) map.entrySet().stream().filter(p -> p.getValue().getSkill() != null)
.filter(p -> p.getValue().getSkill().getIdSkill() == sb.getIdSkill()).collect(Collectors.toMap(Entry::getKey, Entry::getValue));
Set<Skill> test = new HashSet<Skill>(tempSkills.values());
list.get(i).setSkills(test);
}
return list;
}
But the list doesn't contain the sub-skill set. Could anyone tell me why the subskills are not serialized? When I return the subset of this parent skill directly, their subskills are serialized.
0: {
"idSkill": 34,
"name": "Methodik",
"levelBezeichnung": {
"@id": 1,
"idLevelBezeichnung": 1,
"bezeichnung": "Standard",
"handler": {},
"hibernateLazyInitializer": {}
},
"checked": true
}
Without reordering it looks something like this, but the problem is that the skill with id=34 is the parent skill and 9 is the subskill. I want it exactly the other way around. There can be up to three levels.
9: {
"idSkill": 9,
"name": "Standards",
"levelBezeichnung": {
"@id": 1,
"idLevelBezeichnung": 1,
"bezeichnung": "Standard",
"handler": {},
"hibernateLazyInitializer": {}
},
"skill": {
"idSkill": 34,
"name": "Methodik",
"levelBezeichnung": 1
},
"checked": true
}
Finally, I ended up with this:
Ignore the parent of a skill or ignore the children of a skill to avoid infinite recursion. In some cases you don't need to ignore either of them; if you don't have much data it can work. I'm talking about 150 nodes, where each node knows its parent/children.
I query for the path from bottom to top of my lowest skill with a custom SQL query.
I put all my top-level skills in a map. That means I have access to all my skills, because (as I said) every node knows its children.
I search in my map from top to bottom and delete all references that I don't need, based on the path I already have.
The whole code is a bit complex, and I'm using recursion to make it less complex. In the end I am not that pleased with this solution because there are many loops in it, and so far I am having some performance issues.
I need to find out whether it is a database-query problem or a problem caused by the loops.
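For the looping part: building the parent→children tree does not need one map-filtering pass per parent. A single pass to index the nodes by id, plus a single pass to attach each node to its parent, does it in O(n). A hedged sketch with a simplified stand-in for SkillBean (id, parentId, children - not the real entity):

```java
// Simplified stand-in for SkillBean: id, parentId (null for roots), children.
// Build the whole tree in two linear passes instead of re-filtering the map per parent.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SkillTreeSketch {
    static class Node {
        final int id;
        final Integer parentId;
        final List<Node> children = new ArrayList<>();
        Node(int id, Integer parentId) { this.id = id; this.parentId = parentId; }
    }

    static List<Node> buildTree(List<Node> flat) {
        Map<Integer, Node> byId = new HashMap<>();
        for (Node n : flat) byId.put(n.id, n);   // pass 1: index by id
        List<Node> roots = new ArrayList<>();
        for (Node n : flat) {                     // pass 2: attach to parent
            if (n.parentId == null) roots.add(n);
            else byId.get(n.parentId).children.add(n);
        }
        return roots;
    }

    public static void main(String[] args) {
        // flat input: 34 is the root, 9 its child, 7 a grandchild
        List<Node> flat = List.of(new Node(34, null), new Node(9, 34), new Node(7, 9));
        List<Node> roots = buildTree(flat);
        System.out.println(roots.get(0).id);                 // 34
        System.out.println(roots.get(0).children.get(0).id); // 9
    }
}
```

This handles any number of levels, so the "three levels" constraint disappears, and the only remaining recursion is whatever the serializer does.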
I am looking for a data structure to represent a class hierarchy in Java.
For example, I have three classes, University, Major and Student, and their relationship looks like below.
Is there an efficient data structure that I can query with a path-like expression?
For example, if the expression is CMU/cs/jake, then I get the instance of the Student class whose name is jake. As far as I know, a Trie could do this; is there any other option?
If your data fits into memory then you can implement this by putting a Set of children in each node of the hierarchy and then walking the sets to determine if the path is valid, for example
class University {
private Set<Major> majors;
}
class Major {
private Set<Student> students;
}
class Main {
// true if the path is valid, else false
public boolean query(University university, Major major, Student student) {
return university.getMajors().contains(major) &&
major.getStudents().contains(student);
}
}
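A runnable toy version of the idea above, with records standing in for the real domain classes (the names are made up for illustration):

```java
// Toy version of the Set-of-children approach: a path is valid iff each
// element is contained in its parent's child set.
import java.util.Set;

public class PathQuerySketch {
    record Student(String name) {}
    record Major(String name, Set<Student> students) {}
    record University(String name, Set<Major> majors) {}

    static boolean query(University u, Major m, Student s) {
        return u.majors().contains(m) && m.students().contains(s);
    }

    public static void main(String[] args) {
        Student jake = new Student("jake");
        Major cs = new Major("cs", Set.of(jake));
        University cmu = new University("CMU", Set.of(cs));
        System.out.println(query(cmu, cs, jake));                   // true
        System.out.println(query(cmu, cs, new Student("notJake"))); // false
    }
}
```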
If you also need to walk the reverse path (i.e. if you need a bidirectional hierarchy) then you can put a Set of parents in each child.
This will run in average case O(d) where d is the depth of the hierarchy if you use HashSets, and in worst case O(d * lg(n)) where n is the size of the sets if you use TreeSets.
If your data doesn't fit into memory then you may want to consider using a graph database, e.g. Neo4j.
Edit: You can make the code more generic at the cost of type safety by using Map<String, E> at each level, assuming that each object has a unique name or some other string identifier.
abstract class Hierarchical<E extends Hierarchical> {
protected Map<String, E> children;
public boolean query(Queue<String> query) {
String key = query.poll();
if(key != null) {
E value = children.get(key);
if(value != null) {
return query.isEmpty() || value.query(query);
}
}
}
return false;
}
}
class University extends Hierarchical<Major> {}
class Major extends Hierarchical<Student> {}
// special case for the bottom of the hierarchy
class Student extends Hierarchical<Hierarchical> {
public Student() {
children = null;
}
#Override
public boolean query(Queue<String> query) {
throw new UnsupportedOperationException("query should never reach this depth");
}
}
class Main {
// true if the path is valid, else false
public boolean query(Hierarchical root, Queue<String> query) {
return root.query(query);
}
}
This has the same runtime, depending on whether you use a HashMap or a TreeMap. The query simply consists of a queue of strings: at each level of the hierarchy the first string is removed and the Map is queried; if a child node is found, the query proceeds on that child node until the queue is empty (return true) or a node isn't found (return false).
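For completeness, here is a self-contained, runnable variant of that queue-walking idea, simplified to a single generic node class rather than the subclass-per-level design sketched above:

```java
// One generic node class instead of University/Major/Student subclasses:
// consume one path segment per level; true iff the whole path resolves.
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Queue;

public class HierarchyQuerySketch {
    static class Node {
        final Map<String, Node> children = new HashMap<>();

        Node add(String name, Node child) {
            children.put(name, child);
            return this; // returns the parent, so trees can be built inline
        }

        boolean query(Queue<String> path) {
            String key = path.poll();
            if (key == null) return true; // path exhausted: this node is the target
            Node child = children.get(key);
            return child != null && child.query(path);
        }
    }

    public static void main(String[] args) {
        Node root = new Node().add("CMU",
                new Node().add("cs",
                        new Node().add("jake", new Node())));
        System.out.println(root.query(new ArrayDeque<>(List.of("CMU", "cs", "jake")))); // true
        System.out.println(root.query(new ArrayDeque<>(List.of("CMU", "math"))));       // false
    }
}
```

To resolve CMU/cs/jake, split the expression on "/" into a queue and call query on the root; each level consumes exactly one segment, giving the O(d) behavior discussed above.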