This is my code, where I am trying to access a flow variable named "question":
import org.mule.api.MuleEventContext;
import org.mule.api.MuleMessage;

public class Main {
    public Object onCall(MuleEventContext eventContext) throws Exception {
        MuleMessage msg = eventContext.getMessage();
        msg.getInvocationProperty("Question");
        return msg;
    }
}
but I am getting the following error:
Message : Failed to find entry point for component, the following resolvers tried but failed: [
CallableEntryPointResolver: Object "Main#2cad86ee" does not implement required interface "interface org.mule.api.lifecycle.Callable"
ReflectionEntryPointResolver: Could not find entry point on: "Main" with arguments: "{class [B}"
AnnotatedEntryPointResolver: Component: Main#2cad86ee doesn't have any annotated methods, skipping.
MethodHeaderPropertyEntryPointResolver: The required property "method" is not set on the event
]
Code : MULE_ERROR-321
--------------------------------------------------------------------------------
Exception stack is:
1. Failed to find entry point for component, the following resolvers tried but failed: [
CallableEntryPointResolver: Object "Main#2cad86ee" does not implement required interface "interface org.mule.api.lifecycle.Callable"
ReflectionEntryPointResolver: Could not find entry point on: "Main" with arguments: "{class [B}"
AnnotatedEntryPointResolver: Component: Main#2cad86ee doesn't have any annotated methods, skipping.
MethodHeaderPropertyEntryPointResolver: The required property "method" is not set on the event
] (org.mule.model.resolvers.EntryPointNotFoundException)
org.mule.model.resolvers.DefaultEntryPointResolverSet:49 (http://www.mulesoft.org/docs/site/current3/apidocs/org/mule/model/resolvers/EntryPointNotFoundException.html)
--------------------------------------------------------------------------------
Root Exception stack trace:
org.mule.model.resolvers.EntryPointNotFoundException: Failed to find entry point for component, the following resolvers tried but failed: [
CallableEntryPointResolver: Object "Main#2cad86ee" does not implement required interface "interface org.mule.api.lifecycle.Callable"
ReflectionEntryPointResolver: Could not find entry point on: "Main" with arguments: "{class [B}"
AnnotatedEntryPointResolver: Component: Main#2cad86ee doesn't have any annotated methods, skipping.
MethodHeaderPropertyEntryPointResolver: The required property "method" is not set on the event
]
at org.mule.model.resolvers.DefaultEntryPointResolverSet.invoke(DefaultEntryPointResolverSet.java:49)
at org.mule.component.DefaultComponentLifecycleAdapter.invoke(DefaultComponentLifecycleAdapter.java:339)
at org.mule.component.AbstractJavaComponent.invokeComponentInstance(AbstractJavaComponent.java:82)
+ 3 more (set debug level logging or '-Dmule.verbose.exceptions=true' for everything)
********************************************************************************
Well, the error message is pretty clear:
Object "Main#2cad86ee" does not implement required interface "interface org.mule.api.lifecycle.Callable"
Just implement this interface and life will be peachy.
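For reference, a minimal sketch of what the component could look like once it implements Callable (class and variable names are taken from the question; the Mule 3 API is assumed):

```java
import org.mule.api.MuleEventContext;
import org.mule.api.MuleMessage;
import org.mule.api.lifecycle.Callable;

public class Main implements Callable {

    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        MuleMessage msg = eventContext.getMessage();
        // Flow variables live in the invocation scope; note that the key is
        // case-sensitive ("question" and "Question" are different variables).
        Object question = msg.getInvocationProperty("question");
        return question;
    }
}
```

Once the class implements Callable, the CallableEntryPointResolver finds onCall and the "Failed to find entry point" error goes away.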
Since the Log4j2 documentation unfortunately doesn't contain any example of specifying a JDBC appender with a ConnectionFactory in a log4j2.yaml configuration, I'd like to ask for a proper example.
This is my current state of the log4j2.yml:
Configuration:
  # ...
  Appenders:
    JDBC:
      name: DataBase
      connectionSource:
        class: "com.example.config.Log4jConnectionFactory"
        method: "getDbConnection"
      tableName: "application_log"
      columnConfigs:
        - name: "log_date"
          isEventTimestamp: true
        - name: "log_level"
          pattern: "%level"
        - name: "logger"
          pattern: "%logger"
        - name: "user_id"
          pattern: "$${ctx:userId}"
        - name: "user_name"
          pattern: "$${ctx:username}"
        - name: "log_message"
          pattern: "%message"
        - name: "exception"
          pattern: "%exception"
  # ...
And here is the associated ConnectionFactory:

import java.sql.Connection;
import java.sql.SQLException;

import org.apache.commons.dbcp2.BasicDataSource; // Commons DBCP2 assumed

public class Log4jConnectionFactory {
    private static BasicDataSource dataSource;

    private Log4jConnectionFactory() { }

    public static Connection getDbConnection() throws SQLException {
        if (dataSource == null) {
            dataSource = new BasicDataSource();
            dataSource.setUrl("jdbc:h2:mem:example-db;DB_CLOSE_DELAY=-1");
            dataSource.setDriverClassName("org.h2.Driver");
            dataSource.setUsername("sa");
            dataSource.setPassword("");
        }
        return dataSource.getConnection();
    }
}
With the configuration above, Log4j throws the following error:
No factory method found for class org.apache.logging.log4j.core.appender.db.jdbc.JdbcAppender
The complete error log:
2023-02-03 09:38:08,479 main ERROR Unable to locate plugin type for connectionSource
2023-02-03 09:38:08,481 main ERROR Unable to locate plugin type for columnConfigs
2023-02-03 09:38:08,481 main ERROR Unable to locate plugin type for columnConfigs
2023-02-03 09:38:08,481 main ERROR Unable to locate plugin type for columnConfigs
2023-02-03 09:38:08,482 main ERROR Unable to locate plugin type for columnConfigs
2023-02-03 09:38:08,482 main ERROR Unable to locate plugin type for columnConfigs
2023-02-03 09:38:08,482 main ERROR Unable to locate plugin type for columnConfigs
2023-02-03 09:38:08,482 main ERROR Unable to locate plugin type for columnConfigs
2023-02-03 09:38:08,566 main ERROR Unable to locate plugin for connectionSource
2023-02-03 09:38:08,566 main ERROR Unable to locate plugin for columnConfigs
2023-02-03 09:38:08,566 main ERROR Unable to locate plugin for columnConfigs
2023-02-03 09:38:08,567 main ERROR Unable to locate plugin for columnConfigs
2023-02-03 09:38:08,567 main ERROR Unable to locate plugin for columnConfigs
2023-02-03 09:38:08,567 main ERROR Unable to locate plugin for columnConfigs
2023-02-03 09:38:08,567 main ERROR Unable to locate plugin for columnConfigs
2023-02-03 09:38:08,567 main ERROR Unable to locate plugin for columnConfigs
2023-02-03 09:38:08,569 main ERROR Could not create plugin of type class org.apache.logging.log4j.core.appender.db.jdbc.JdbcAppender for element JDBC: java.lang.NullPointerException: Cannot invoke "org.apache.logging.log4j.core.config.plugins.util.PluginType.getElementName()" because "childType" is null java.lang.NullPointerException: Cannot invoke "org.apache.logging.log4j.core.config.plugins.util.PluginType.getElementName()" because "childType" is null
at org.apache.logging.log4j.core.config.plugins.visitors.PluginElementVisitor.findNamedNode(PluginElementVisitor.java:104)
at org.apache.logging.log4j.core.config.plugins.visitors.PluginElementVisitor.visit(PluginElementVisitor.java:88)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.injectFields(PluginBuilder.java:187)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:123)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:1138)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:1063)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:1055)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:664)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:258)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:304)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:621)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:694)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:711)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:253)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:155)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:47)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:196)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:137)
at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:61)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:47)
at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:33)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:391)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:416)
at com.example.Application.<clinit>(Application.java:13)
2023-02-03 09:38:08,571 main ERROR Unable to invoke factory method in class org.apache.logging.log4j.core.appender.db.jdbc.JdbcAppender for element JDBC: java.lang.IllegalStateException: No factory method found for class org.apache.logging.log4j.core.appender.db.jdbc.JdbcAppender java.lang.IllegalStateException: No factory method found for class org.apache.logging.log4j.core.appender.db.jdbc.JdbcAppender
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.findFactoryMethod(PluginBuilder.java:260)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:136)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:1138)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:1063)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:1055)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:664)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:258)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:304)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:621)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:694)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:711)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:253)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:155)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:47)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:196)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:137)
at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:61)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:47)
at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:33)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:391)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:416)
at com.example.Application.<clinit>(Application.java:13)
Since I'm not sure whether the appender structure is correct, and since I haven't found anywhere how this should be done in a YAML config file (the Log4j2 documentation isn't any help either), I'd appreciate any help that can provide some clarification.
Many thanks in advance.
Since this didn't leave me any peace, I invested a few extra hours and tried out many possible combinations and setups. In the end I came up with the following working solution:
log4j2.yaml:
Configuration:
  # ...
  Appenders:
    JDBC:
      name: DataBase
      ConnectionFactory:
        class: "com.example.config.Log4jConnectionFactory"
        method: "getDbConnection"
      tableName: "application_log"
      Column:
        - name: "log_date"
          isEventTimestamp: true
        - name: "log_level"
          pattern: "%level"
        - name: "logger"
          pattern: "%logger"
        - name: "user_id"
          pattern: "$${ctx:userId}"
        - name: "user_name"
          pattern: "$${ctx:username}"
        - name: "log_message"
          pattern: "%message"
        - name: "exception"
          pattern: "%exception"
  # ...
ConnectionFactory:
import java.sql.Connection;
import java.sql.SQLException;

import com.zaxxer.hikari.HikariDataSource;

public class Log4jConnectionFactory {
    private static final HikariDataSource dataSource;

    static {
        System.out.println("====================== This is Log4jConnectionFactory speaking ======================");
        dataSource = new HikariDataSource();
        dataSource.setJdbcUrl("jdbc:h2:mem:example-db;DB_CLOSE_DELAY=-1;DATABASE_TO_UPPER=false;INIT=RUNSCRIPT FROM './src/db/h2/schema-log4j2.sql'");
        dataSource.setDriverClassName("org.h2.Driver");
        dataSource.setUsername("sa");
        dataSource.setPassword("");
    }

    private Log4jConnectionFactory() { }

    public static Connection getDbConnection() throws SQLException {
        return dataSource.getConnection();
    }
}
schema-log4j2.sql:
CREATE TABLE IF NOT EXISTS `application_log` (
application_log_id INTEGER GENERATED BY DEFAULT AS IDENTITY,
log_date TIMESTAMP,
log_level VARCHAR(10),
logger VARCHAR(100),
user_id BIGINT,
user_name VARCHAR(50),
log_message TEXT,
exception TEXT,
PRIMARY KEY (application_log_id)
);
Since we are in a Spring Boot environment and an H2 database is used, both the logger and the database have to be ready for use at a point where Spring Boot itself is still in its initialization phase. Therefore logger and database must be set up without the help of Spring Boot, i.e. without the application.yml configuration file.
Since schema creation can't be done via the config file, we delegate this task to the H2 connection string. However, this means that the schema-creation script runs every time a connection is requested; the CREATE TABLE IF NOT EXISTS guard is what keeps that from failing.
It should also be mentioned that the connection data is currently hardcoded. This is definitely not a great solution, and I'm still looking for a way to work around it. A lookup into the Spring context is not possible for the timing reasons already mentioned, and JNDI doesn't make much sense either, since the Tomcat configuration also happens too late. If I find a solution, I will update this post.
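One possible workaround, offered purely as a suggestion: JVM system properties are available long before the Spring context exists, so the hardcoded values could be read from -D flags with the current values as defaults. The log4j.db.* property names below are made up for illustration:

```java
public class Log4jConnectionSettings {
    // Property names (log4j.db.*) are hypothetical; pick whatever fits your setup.
    static String jdbcUrl() {
        return System.getProperty("log4j.db.url",
                "jdbc:h2:mem:example-db;DB_CLOSE_DELAY=-1");
    }

    static String username() {
        return System.getProperty("log4j.db.user", "sa");
    }

    static String password() {
        return System.getProperty("log4j.db.password", "");
    }
}
```

The static initializer in Log4jConnectionFactory could then call these instead of hardcoding, and the real values would be supplied at launch, e.g. java -Dlog4j.db.url=... -jar app.jar.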
I have a Java class whose name is the same as the name of one of its attributes.
Now I get an error saying "encountered unrecoverable cycle resolving import" when trying to create an object of that class in Scala.
public class linkId extends org.apache.avro.specific.SpecificRecordBase implements org.apache.avro.specific.SpecificRecord {
    // ...
    public linkId(java.lang.String linkId, java.lang.String linkIdScheme) {
        this.linkId = linkId;
        this.linkIdScheme = linkIdScheme;
    }
    // ...
}
[ERROR] [Error]
/Users/dnamburi/.../src/generated/avro/com/rbccm/TradeMessage/linkId.java:15:
encountered unrecoverable cycle resolving import. Note: this is often
due in part to a class depending on a definition nested within its
companion. If applicable, you may wish to try moving some members into
another object.
We are building a pipeline using Apache Beam and DirectRunner as the runner. We are currently attempting a simple pipeline whereby we:
Pull data from Google Cloud Pub/Sub (currently using the emulator to run locally)
Deserialize into a Java object
Window events using fixed windows of 1 minute
Combine the events in each window using a custom CombineFn that turns them into a list of events.
Pipeline code:
pipeline
    .apply(PubsubIO.<String>read().topic(options.getTopic()).withCoder(StringUtf8Coder.of()))
    .apply("ParseEvent", ParDo.of(new ParseEventFn()))
    .apply("WindowOneMinute", Window.<Event>into(FixedWindows.of(Duration.standardMinutes(1))))
    .apply("CombineEvents", Combine.globally(new CombineEventsFn()));
ParseEvent function:
static class ParseEventFn extends DoFn<String, Event> {
    @ProcessElement
    public void processElement(ProcessContext c) {
        String json = c.element();
        c.output(gson.fromJson(json, Event.class));
    }
}
CombineEvents function:
public static class CombineEventsFn extends CombineFn<Event, CombineEventsFn.Accum, EventListWrapper> {

    public static class Accum {
        EventListWrapper eventListWrapper = new EventListWrapper();
    }

    @Override
    public Accum createAccumulator() {
        return new Accum();
    }

    @Override
    public Accum addInput(Accum accumulator, Event event) {
        accumulator.eventListWrapper.events.add(event);
        return accumulator;
    }

    @Override
    public Accum mergeAccumulators(Iterable<Accum> accumulators) {
        Accum merged = createAccumulator();
        for (Accum accum : accumulators) {
            merged.eventListWrapper.events.addAll(accum.eventListWrapper.events);
        }
        return merged;
    }

    @Override
    public EventListWrapper extractOutput(Accum accumulator) {
        return accumulator.eventListWrapper;
    }
}
When attempting to run this locally using Maven and DirectRunner, we are getting the following error:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:293)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalStateException: Unable to return a default Coder for CombineEvents/Combine.perKey(CombineEvents)/Combine.GroupedValues/ParDo(Anonymous).out [PCollection]. Correct one of the following root causes:
No Coder has been manually specified; you may do so using .setCoder().
Inferring a Coder from the CoderRegistry failed: Unable to provide a default Coder for org.apache.beam.sdk.values.KV<K, OutputT>. Correct one of the following root causes:
Building a Coder using a registered CoderFactory failed: Cannot provide coder for parameterized type org.apache.beam.sdk.values.KV<K, OutputT>: Unable to provide a default Coder for java.lang.Object. Correct one of the following root causes:
Building a Coder using a registered CoderFactory failed: Cannot provide coder based on value with class java.lang.Object: No CoderFactory has been registered for the class.
Building a Coder from the @DefaultCoder annotation failed: Class java.lang.Object does not have a @DefaultCoder annotation.
Building a Coder from the fallback CoderProvider failed: Cannot provide coder for type java.lang.Object: org.apache.beam.sdk.coders.protobuf.ProtoCoder$2@6e610150 could not provide a Coder for type java.lang.Object: Cannot provide ProtoCoder because java.lang.Object is not a subclass of com.google.protobuf.Message; org.apache.beam.sdk.coders.SerializableCoder$1@7adc59c8 could not provide a Coder for type java.lang.Object: Cannot provide SerializableCoder because java.lang.Object does not implement Serializable.
Building a Coder from the @DefaultCoder annotation failed: Class org.apache.beam.sdk.values.KV does not have a @DefaultCoder annotation.
Using the default output Coder from the producing PTransform failed: Unable to provide a default Coder for org.apache.beam.sdk.values.KV<K, OutputT>. Correct one of the following root causes:
Building a Coder using a registered CoderFactory failed: Cannot provide coder for parameterized type org.apache.beam.sdk.values.KV<K, OutputT>: Unable to provide a default Coder for java.lang.Object. Correct one of the following root causes:
Building a Coder using a registered CoderFactory failed: Cannot provide coder based on value with class java.lang.Object: No CoderFactory has been registered for the class.
Building a Coder from the @DefaultCoder annotation failed: Class java.lang.Object does not have a @DefaultCoder annotation.
Building a Coder from the fallback CoderProvider failed: Cannot provide coder for type java.lang.Object: org.apache.beam.sdk.coders.protobuf.ProtoCoder$2@6e610150 could not provide a Coder for type java.lang.Object: Cannot provide ProtoCoder because java.lang.Object is not a subclass of com.google.protobuf.Message; org.apache.beam.sdk.coders.SerializableCoder$1@7adc59c8 could not provide a Coder for type java.lang.Object: Cannot provide SerializableCoder because java.lang.Object does not implement Serializable.
Building a Coder from the @DefaultCoder annotation failed: Class org.apache.beam.sdk.values.KV does not have a @DefaultCoder annotation.
at org.apache.beam.sdk.repackaged.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
at org.apache.beam.sdk.values.TypedPValue.getCoder(TypedPValue.java:51)
at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:130)
at org.apache.beam.sdk.values.TypedPValue.finishSpecifying(TypedPValue.java:90)
at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:143)
at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:418)
at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:334)
at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:154)
at org.apache.beam.sdk.transforms.Combine$Globally.expand(Combine.java:1459)
at org.apache.beam.sdk.transforms.Combine$Globally.expand(Combine.java:1336)
at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:420)
at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:350)
at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:167)
at ***************************.main(***************.java:231)
... 6 more
Apologies for the huge code dump - wanted to provide all the context.
I'm curious as to why it's complaining about no default coder for both java.lang.Object and org.apache.beam.sdk.values.KV<K, OutputT> - as far as I can tell our pipeline is changing types between String, Event, and EventListWrapper - the latter two classes have their default coders set on the class itself (AvroCoder in both cases).
The error is occurring on the line where we apply the CombineFn - can confirm that without this transform the pipeline works.
I suspect we've set up the combine transform incorrectly somehow, but as of yet have found nothing in the Beam documentation to point us in the right direction.
Any insight would be appreciated - thanks in advance!
The probable reason you are seeing java.lang.Object is because Beam is trying to infer a coder for an unresolved type variable, which will be resolved to Object. This may be a bug in how coder inference is done within Combine.
Separately, I would expect the Accum class to also cause a failure of coder inference. You can override getAccumulatorCoder in your CombineFn to provide one quite directly.
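A sketch of what that override could look like, assuming Accum is made Serializable (the CombineFn.getAccumulatorCoder signature from Beam's Java SDK is assumed):

```java
// Inside CombineEventsFn: tell Beam explicitly how to encode the accumulator,
// instead of relying on coder inference.
@Override
public Coder<Accum> getAccumulatorCoder(CoderRegistry registry, Coder<Event> inputCoder)
        throws CannotProvideCoderException {
    // Works once Accum (and everything it references) implements Serializable.
    return SerializableCoder.of(Accum.class);
}
```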
Did you check if adding Serializable to your accumulator works directly?
So add "implements Serializable" to the Accum class:

public static class Accum implements Serializable {
    EventListWrapper eventListWrapper = new EventListWrapper();
}
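To illustrate why SerializableCoder insists on this, here is a self-contained demonstration using plain JDK serialization (which is essentially what SerializableCoder delegates to): the untagged class fails with NotSerializableException, while the tagged one round-trips. The class names here are invented for the demo.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializableDemo {
    static class PlainAccum {            // no Serializable: JDK serialization rejects it
        int count = 1;
    }

    static class TaggedAccum implements Serializable {
        int count = 1;
    }

    // Serialize an object to bytes, the way SerializableCoder would.
    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(o);
        }
        return bytes.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        try {
            serialize(new PlainAccum());
            System.out.println("PlainAccum serialized (unexpected)");
        } catch (NotSerializableException e) {
            System.out.println("PlainAccum: NotSerializableException");
        }
        System.out.println("TaggedAccum: " + serialize(new TaggedAccum()).length + " bytes");
    }
}
```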
We've added a filter to our spring webapp that checks all incoming requests for anything that could cause an XSS vulnerability. However, when it tries to write to the log, we get the following stack trace:
com.blah.blah.web.controllers.ExceptionLoggingController - ERROR: Exception: code=500,uri=/post.html,servlet=dispatch,class=org.owasp.esapi.errors.ConfigurationException,from=1.2.3.4,message=Request processing failed; nested exception is org.owasp.esapi.errors.ConfigurationException: java.lang.IllegalArgumentException: Classname cannot be null or empty. HTTPUtilities type name cannot be null or empty.
org.owasp.esapi.errors.ConfigurationException: java.lang.IllegalArgumentException: Classname cannot be null or empty. HTTPUtilities type name cannot be null or empty.
at org.owasp.esapi.util.ObjFactory.make(ObjFactory.java:105)
at org.owasp.esapi.ESAPI.httpUtilities(ESAPI.java:121)
at org.owasp.esapi.ESAPI.currentRequest(ESAPI.java:70)
at org.owasp.esapi.reference.JavaLogFactory$JavaLogger.log(JavaLogFactory.java:308)
at org.owasp.esapi.reference.JavaLogFactory$JavaLogger.warning(JavaLogFactory.java:242)
at org.owasp.esapi.reference.DefaultEncoder.canonicalize(DefaultEncoder.java:181)
at org.owasp.esapi.reference.DefaultEncoder.canonicalize(DefaultEncoder.java:120)
at com.blah.blah.web.MyFilter.removeXSS(MyFilter.java:26)
I have ESAPI.properties on the classpath (it seems to be otherwise working), and it does have the "missing" class configured:
ESAPI.HTTPUtilities=org.owasp.esapi.reference.DefaultHTTPUtilities
And DefaultHTTPUtilities is on the classpath as well.
It turns out I was also importing a library called opensaml (as a dependency of some other dependency). This library has its own implementation of SecurityConfiguration, which is the interface ESAPI uses to load configuration. For some reason the opensaml implementation overrides nearly all the methods to just return null or 0:
package org.opensaml;

/**
 * Minimal implementation of OWASP ESAPI {@link SecurityConfiguration}, providing the support used within OpenSAML.
 */
public class ESAPISecurityConfig implements SecurityConfiguration {

    /** Constructor. */
    public ESAPISecurityConfig() {
    }

    // snip...

    /** {@inheritDoc} */
    public String getHTTPUtilitiesImplementation() {
        return null;
    }

    // snip...
}
In a class called DefaultBootstrap, this was getting executed somewhere during the startup of my application, which overrides ESAPI's default implementation:
protected static void initializeESAPI() {
    ESAPI.initialize("org.opensaml.ESAPISecurityConfig");
}
I couldn't get rid of the opensaml library, so I had to change my code so that before I invoke ESAPI, I override it back to the default implementation:
ESAPI.initialize("org.owasp.esapi.reference.DefaultSecurityConfiguration");
value = ESAPI.encoder().canonicalize(value);
Following up on Suresh's comment... Toward that end, look at wherever you captured stdout, search for "Attempting to load ESAPI.properties", and follow that trail. It should look something like this:
Attempting to load ESAPI.properties via file I/O.
Attempting to load ESAPI.properties as resource file via file I/O.
Not found in 'org.owasp.esapi.resources' directory or file not readable: /home/kww/Code/GitHub/kwwall/esapi-java-legacy/ESAPI.properties
Found in SystemResource Directory/resourceDirectory: /home/kww/Code/GitHub/kwwall/esapi-java-legacy/target/test-classes/esapi/ESAPI.properties
Loaded 'ESAPI.properties' properties file
And then make sure that it is loading ESAPI.properties from where you expected it to be loaded.
Google's compile-testing is a great tool when it comes to unit testing annotation processors. Unfortunately I'm currently facing the following error:
java.lang.AssertionError:
An expected source declared one or more top-level types that were not present.
Expected top-level types: <[test.LocalStorage]>
Declared by expected file: <LocalStorageImpl.java>
The top-level types that were present are as follows:
- [test.LocalStorageImpl] in </SOURCE_OUTPUT/test/LocalStorageImpl.java>
My processor generates the implementation of an annotated interface. So the generated code has a dependency to the interface (class LocalStorageImpl implements LocalStorage).
When I try to verify the generated code with
JavaFileObject source = /* interface definition */
JavaFileObject expectedSource = /* the generated implementation */

assertAbout(javaSource()).that(source)
        .processedWith(new MyProcessor())
        .compilesWithoutError()
        .and()
        .generatesSources(expectedSource);
...the generatesSources call causes the error. As the error message states, it cannot resolve the interface definition.
So my question is how to compile or write the source file of the interface defined in source, so that it is recognized by generatesSources()?
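One observation, offered as a guess: the assertion message says the expected file LocalStorageImpl.java declared the top-level type test.LocalStorage, which suggests the expected JavaFileObject was created under the interface's fully-qualified name rather than the implementation's. A sketch of building both file objects with JavaFileObjects.forSourceString (the class bodies are illustrative only):

```java
JavaFileObject source = JavaFileObjects.forSourceString(
        "test.LocalStorage",
        "package test;\n"
      + "public interface LocalStorage { }");

// The first argument must match the type the file actually declares:
// test.LocalStorageImpl, not test.LocalStorage.
JavaFileObject expectedSource = JavaFileObjects.forSourceString(
        "test.LocalStorageImpl",
        "package test;\n"
      + "public class LocalStorageImpl implements LocalStorage { }");
```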