I'm new to Guice and Shiro, and I'm trying to use them with my database (H2).
I've read this: click
but, as they said, it only works for the [users] and [roles] sections, which is useless for me.
My shiro.ini works: I managed to create a user, log in, and log out without the Guice part.
My MyShiroModule:
public class MyShiroModule extends ShiroModule {

    protected void configureShiro() {
        try {
            bindRealm().toConstructor(IniRealm.class.getConstructor(Ini.class));
        } catch (NoSuchMethodException e) {
            addError(e);
        }
    }

    @Provides
    Ini loadShiroIni() {
        return Ini.fromResourcePath("classpath:shiro.ini");
    }
}
and my Module:
public class Module extends AbstractModule {

    @Singleton
    protected void configure() {
        Injector injector = Guice.createInjector(new MyShiroModule());
        SecurityManager securityManager = injector.getInstance(SecurityManager.class);
        SecurityUtils.setSecurityManager(securityManager);
    }
}
They are exactly as described in the tutorial.
What do I have to add to use the [main] part of my shiro.ini?
I never got the JDBC realm to work with Guice either since, as you noted, it only reads the [users] and [roles] sections for whatever reason. I ended up not using shiro.ini at all and just creating the JdbcRealm myself, like this:
public class ShiroAuthModule extends ShiroModule {

    @Override
    public void configure() {
        super.configure();
        // Bind your data source however you need to - I use JNDI
        // but it would be easy to switch to a properties file.
        bind(Context.class).to(InitialContext.class);
        bind(DataSource.class).toProvider(JndiIntegration.fromJndi(DataSource.class, "java:/comp/env/jdbc/security"));
    }

    @Provides
    @Singleton
    JdbcRealm loadJdbcRealm(Ini ini, DataSource ds,
                            @Named("shiro.authenticationQuery") String authenticationQuery,
                            @Named("shiro.userRolesQuery") String roleQuery,
                            @Named("shiro.permissionsQuery") String permissionQuery) {
        JdbcRealm realm = new JdbcRealm();
        realm.setAuthenticationQuery(authenticationQuery);
        realm.setUserRolesQuery(roleQuery);
        realm.setPermissionsQuery(permissionQuery);
        realm.setPermissionsLookupEnabled(true);
        realm.setDataSource(ds);
        return realm;
    }

    @Override
    protected void configureShiro() {
        // shiro.properties should be on your classpath and
        // contain the named properties in loadJdbcRealm
        Properties properties = Module.loadProperties(this, "shiro.properties");
        Names.bindProperties(binder(), properties);
        try {
            bindRealm().to(JdbcRealm.class);
        } catch (SecurityException e) {
            addError(e);
        }
    }
}
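For reference, a shiro.properties satisfying those three @Named bindings could look roughly like the lines below. The SQL is only a sketch that mirrors JdbcRealm's default queries, so the table and column names are assumptions you would adapt to your own schema:

# Hypothetical queries - adjust table and column names to your schema
shiro.authenticationQuery = SELECT password FROM users WHERE username = ?
shiro.userRolesQuery = SELECT role_name FROM user_roles WHERE username = ?
shiro.permissionsQuery = SELECT permission FROM roles_permissions WHERE role_name = ?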
I am trying to use the com.google.inject.Injector class in AADL, which is a domain-specific language created using EMF and Xtext. When I call Aadl2StandaloneSetup.doSetup(), which provides initialization support for running Xtext languages without the Equinox extension registry, I get the error java.lang.ClassNotFoundException: org.osate.xtext.aadl2.Activator cannot be found by org.osate.xtext.aadl2_1.0.0.qualifier. How can this be fixed?
The code of Aadl2StandaloneSetup.doSetup() is below:
public Injector createInjectorAndDoEMFRegistration() {
    org.osate.xtext.aadl2.properties.PropertiesStandaloneSetup.doSetup();
    Injector injector = createInjector();
    register(injector);
    return injector;
}

public Injector createInjector() {
    return Guice.createInjector(new org.osate.xtext.aadl2.Aadl2RuntimeModule());
}

public void register(Injector injector) {
    org.eclipse.xtext.resource.IResourceFactory resourceFactory = injector.getInstance(org.eclipse.xtext.resource.IResourceFactory.class);
    org.eclipse.xtext.resource.IResourceServiceProvider serviceProvider = injector.getInstance(org.eclipse.xtext.resource.IResourceServiceProvider.class);
    Resource.Factory.Registry.INSTANCE.getExtensionToFactoryMap().put("aadl", resourceFactory);
    org.eclipse.xtext.resource.IResourceServiceProvider.Registry.INSTANCE.getExtensionToFactoryMap().put("aadl", serviceProvider);
    Resource.Factory.Registry.INSTANCE.getExtensionToFactoryMap().put("aadl2", resourceFactory);
    org.eclipse.xtext.resource.IResourceServiceProvider.Registry.INSTANCE.getExtensionToFactoryMap().put("aadl2", serviceProvider);
}
I want to make periodic REST requests from a Dropwizard backend. More concretely, I want to make a GET request to an external REST API every minute and process the result.
I use Quartz for the scheduling, and now I am trying to use the Jersey client to make the REST request. I use Guice for my dependency injection.
My application class has the following methods:
@Override
public void initialize(final Bootstrap<DockerwizardConfiguration> bootstrap) {
    Job everyJob = new EveryTestJob();
    bootstrap.addBundle(new JobsBundle(everyJob));
}

@Override
public void run(final DockerwizardConfiguration configuration,
                final Environment environment) {
    Injector injector = Guice.createInjector(new AbstractModule() {
        @Override
        protected void configure() {
            bind(HelloWorldParameter.class)
                .annotatedWith(Names.named("helloWorldParameter"))
                .toInstance(configuration.getHelloWorldParameter());
        }
    });

    JerseyClientConfiguration conf = configuration.getJerseyClientConfiguration();
    conf.setChunkedEncodingEnabled(false);
    final Client client = new JerseyClientBuilder(environment).using(conf).build(getName());
    environment.jersey().register(new ExternalServiceResource(client)); // How should that be implemented with Guice?
    environment.jersey().register(injector.getInstance(HelloWorldResource.class));
}
And my EveryTestJob class is implemented as follows:
#Every("1s")
public class EveryTestJob extends Job {
#Override
public void doJob(JobExecutionContext context) throws JobExecutionException {
// logic run every time and time again
}
}
I am unsure how this can be organized.
I've been trying to figure this out for a while, and this is what I have found out:
The JobsBundle is added before any resources, so the JobExecutionContext will not include the client (https://www.dropwizard.io/0.9.2/docs/manual/internals.html).
I tried using the injector, but that didn't work either (https://github.com/HubSpot/dropwizard-guice).
Finally I stumbled on Jersey 2.0: Create repeating job, which showed how to add the client into the context!
Here's my solution:
In the resource class,
#Path("/myPath")
public class myResource {
#Inject
public myResource() {
try {
Scheduler scheduler = new StdSchedulerFactory().getScheduler();
scheduler.getContext().put"myResource", this); // Inserts myResource into the context
} catch (SchedulerException e) {
// Handle exception
}
}
// Other stuff for api
}
Then in the job class (I'm using Dropwizard-jobs 2.0.1, where doJob doesn't take any arguments, so I used execute instead),
#Every("10s")
public class myJob extends Job {
#Override
public void execute(JobExecutionContext context) throws JobExecutionException {
try {
myResource res = (myResource) context.getScheduler().getContext().get("myResource");
// Do stuff with your resource
} catch (SchedulerException e) {
// Handle exception
}
}
}
Not sure if you have access to the ExternalServiceResource, but I hope this helps!
I have a method which every now and then generates a string. I would like to register the method as a URI and produce an exchange which will be used as input for a route.
The method is called by a different class:
SampleClass sc = new SampleClass();
sc.sampleMethod("Hello");
E.g.:
public class SampleClass {

    @Produce(uri = "direct:consumerMethod")
    ProducerTemplate producer;

    public void sampleMethod(Object obj) {
        producer.sendBody(obj);
    }
}
The route is defined as below:
@Override
public void configure() {
    from("direct:consumerMethod").process(new GenerateD());
}
But the route doesn't call the GenerateD processor when I produce using sampleMethod. Is this not feasible, or am I doing something wrong?
Finally, this is what worked for my use case.
Starting the CamelContext as below:
CamelContext camelContext = new DefaultCamelContext();
camelContext.addRoutes(new SampleRoute());
camelContext.start();
My RouteBuilder class:
class SampleRoute extends RouteBuilder {

    @Override
    public void configure() {
        try {
            from("direct:consumerMethod").process(new DDT());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
I then create an interface which has a sendMessage method:
public interface DDTConsumer {
    public String sendMessage(Object object);
}
Now I build a proxy of this interface for the endpoint and send a message to it:
DDTConsumer ddt;
try {
    ddt = new ProxyBuilder(camelContext).endpoint("direct:consumerMethod").build(DDTConsumer.class);
    ddt.sendMessage(msg.getValue());
} catch (Exception e) {
    e.printStackTrace();
}
This solved my problem and the route is working fine now. Hope it helps others as well.
In the class where you have sampleMethod(Object), add the following field:
@Produce(uri = "direct:consumerMethod")
ProducerTemplate template;
In your sampleMethod(Object) you can use the previously added template like this:
public void sampleMethod(Object obj) {
    template.sendBody(obj);
}
And it should send a Message to the direct:consumerMethod route.
Use something like this if you want to call some method:
@Override
public void configure() {
    from("direct:consumerMethod").log(simple("${bean:generateD?method=generateDMethod}"));
}
The above expression will call the generateDMethod of the generateD object (bean) and log the method's output to the console (the default log writer).
To make the above expression work, you have to store the generateD bean in the registry, which will then be associated with your application's CamelContext. You can do that as follows:
@Autowired
private GenerateD generateD;

@Override
protected CamelContext createCamelContext() throws Exception {
    SimpleRegistry registry = new SimpleRegistry();
    registry.put("generateD", generateD); // the generateD bean, which can be used anywhere in the CamelContext
    SpringCamelContext camelContext = new SpringCamelContext();
    camelContext.setRegistry(registry); // add the registry
    camelContext.setApplicationContext(getApplicationContext());
    camelContext.start();
    return camelContext;
}
This adds the bean to the CamelContext. Please check my answer at this link for a complete example.
How do I access the @Value machinery dynamically at run-time?
I thought that Environment might be what I was looking for, but it does not seem to be:
@Component
public class SpringConfiguration implements ConfigurationI {

    @Autowired
    private Provider<Environment> env;

    @Override
    public String get(String key) {
        try {
            return env.get().getRequiredProperty(key);
        } catch (IllegalStateException e) {
            return null;
        }
    }
}
Unfortunately, this does not access the values exposed by our PropertyPlaceholderConfigurer bean.
EDIT: To explain my use case: this is part of making a library with a lot of Spring-specific pieces (that a pile of older Spring applications depend on) usable from newer Guice applications, by switching Spring-specific annotations for JSR 330 (javax.inject) ones. I was hoping to avoid rewriting all the PropertyPlaceholderConfigurer stuff across all our Spring applications by providing a nice entry point like this. If there is a better way to do this (maybe with @Named?), then I am all ears.
EDIT 2: This is a (cleaned-up) example of the kind of PropertyPlaceholderConfigurer that exists in the apps calling into this library.
@Bean
public PropertyPlaceholderConfigurer placeholderConfigurer() {
    return new PropertyPlaceholderConfigurer() {
        @Override
        protected String resolvePlaceholder(String placeholder, Properties props) {
            // Some code to parse and clean up the key here
            String result = getPropertyFromLocalAppSpecificConfig(key);
            if (result == null) {
                result = super.resolvePlaceholder(placeholder, props);
            }
            // Some more random app-specific logic for missing defaults
            return result;
        }
    };
}
PropertyPlaceholderConfigurer and friends do not put the properties into your Environment (mainly for backward-compatibility reasons). Instead they use the Environment plus their own internal Properties object, generally gathered from property files on the classpath, to resolve @Value properties. Thus the properties loaded by a PropertyPlaceholderConfigurer cannot be fetched dynamically (i.e. there is no getProperty(String...)).
Some people create a custom PropertyPlaceholderConfigurer that stores the properties publicly (through a getter or whatever), but I think that completely defeats Spring's new unified environment configuration handling.
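For illustration only (the rest of this answer argues against it), that pattern boils down to a configurer roughly like the one below. The class name is made up, and processProperties is overridden just to keep a copy of the merged properties:

public class ExposedPropertyPlaceholderConfigurer extends PropertyPlaceholderConfigurer {

    // Copy of the merged properties, kept only so they can be read back later
    private final Properties resolvedProperties = new Properties();

    @Override
    protected void processProperties(ConfigurableListableBeanFactory beanFactory, Properties props) {
        super.processProperties(beanFactory, props);
        resolvedProperties.putAll(props);
    }

    public String getProperty(String key) {
        return resolvedProperties.getProperty(key);
    }
}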
What you really want is probably @PropertySource, which is still pretty crappy since it's not dynamic (it's an annotation, so you can't change where files get loaded from), but it will load properties into the Environment. I have been meaning to file issues with SpringSource about how confusing this is.
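As a minimal sketch of that approach, assuming a my.properties file on the classpath (file name and key invented for the example):

@Configuration
@PropertySource("classpath:my.properties")
public class AppConfig {

    @Autowired
    private Environment env;

    public String lookup(String key) {
        // Values from my.properties are now visible through the Environment
        return env.getProperty(key);
    }
}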
Anyway, you can look at my solution here: Manually add a @PropertySource: Configuring Environment before context is refreshed.
Basically you need to get hold of a ConfigurableEnvironment and load your properties into it by creating PropertySources. The API for this is very powerful but not very intuitive. You can use ApplicationContextInitializers to get the Environment, which has its own annoying issues (see the link), or you can do what I do below.
public class ConfigResourcesEnvironment implements
        ResourceLoaderAware, EnvironmentAware, BeanDefinitionRegistryPostProcessor, EnvironmentPropertiesMapSupplier {

    private Environment environment;
    private Map<String, String> environmentPropertiesMap;

    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
        if (environment instanceof ConfigurableEnvironment) {
            ConfigurableEnvironment env = ((ConfigurableEnvironment) this.environment);
            List<PropertySource> propertySources;
            try {
                propertySources = loadPropertySources(); // Your custom method for property sources
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
            // Spring prefers primacy ordering so we reverse the order of the sources... You may not need to do this.
            reverse(propertySources);
            for (PropertySource rp : propertySources) {
                env.getPropertySources().addLast(rp);
            }
            environmentPropertiesMap = ImmutableMap.copyOf(environmentPropertiesToMap(env));
        } else {
            environmentPropertiesMap = ImmutableMap.of();
        }
    }

    public static Map<String, String> environmentPropertiesToMap(ConfigurableEnvironment e) {
        Map<String, String> properties = newLinkedHashMap();
        for (String n : propertyNames(e.getPropertySources())) {
            String v = e.getProperty(n);
            if (v != null)
                properties.put(n, v);
        }
        return properties;
    }

    public static Iterable<String> propertyNames(PropertySources propertySources) {
        LinkedHashSet<String> propertyNames = new LinkedHashSet<String>();
        for (PropertySource<?> p : propertySources) {
            if (p instanceof EnumerablePropertySource) {
                EnumerablePropertySource<?> e = (EnumerablePropertySource<?>) p;
                propertyNames.addAll(asList(e.getPropertyNames()));
            }
        }
        return propertyNames;
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        // NOOP
    }

    @Override
    public void setEnvironment(Environment environment) {
        this.environment = environment;
    }

    public Map<String, String> getEnvironmentPropertiesMap() {
        return environmentPropertiesMap;
    }
}
Once you have the ConfigurableEnvironment loaded, you can use the EnvironmentAware interface for things that need the Environment, or create your own interface.
Here is a custom interface you can use for things that need dynamic properties (the above class implements it):
public interface EnvironmentPropertiesMapSupplier {
    public Map<String, String> getEnvironmentPropertiesMap();
}
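As a rough usage sketch (the bean and property names below are invented), anything that needs dynamic properties can then simply have the supplier injected:

@Component
public class NeedsDynamicProperties {

    @Autowired
    private EnvironmentPropertiesMapSupplier propertiesSupplier;

    public String databaseUrl() {
        // Look up any property that was loaded into the Environment at startup
        return propertiesSupplier.getEnvironmentPropertiesMap().get("db.url");
    }
}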
I have:
Some interface:
public interface ISomeObject {
    void someAction();
}
Some Groovy file (someObject.groovy):
public class SomeObject implements ISomeObject {

    @Autowired
    SomeOtherClass someField

    @Override
    void someAction() {}
}
I need Spring to populate the autowired fields automatically. How should I load this class?
Some code (for a start) that loads the class without Spring:
GroovyClassLoader gcl = new GroovyClassLoader();
Class clazz = null;
try {
    clazz = gcl.parseClass(new File("someObject.groovy"));
    ISomeObject groovyObject = (ISomeObject) clazz.newInstance();
    return Optional.of(groovyObject);
} catch (IOException | InstantiationException | IllegalAccessException e) {
    return Optional.empty();
}
Personally, I would use a plain old factory in this case and wire all the properties "manually".
I did a little research, though, and it looks like you have other options. I believe this question is what you are looking for:
Registering beans (prototype) at runtime in Spring
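If wiring manually is acceptable, one minimal sketch of that idea, assuming you can get hold of the ApplicationContext (e.g. via ApplicationContextAware), is to ask its AutowireCapableBeanFactory to inject the @Autowired fields on the instance you created yourself:

GroovyClassLoader gcl = new GroovyClassLoader();
try {
    Class<?> clazz = gcl.parseClass(new File("someObject.groovy"));
    ISomeObject groovyObject = (ISomeObject) clazz.newInstance();
    // Let Spring populate the @Autowired fields on the manually created instance;
    // applicationContext is assumed to be available in the surrounding class
    applicationContext.getAutowireCapableBeanFactory().autowireBean(groovyObject);
    return Optional.of(groovyObject);
} catch (IOException | InstantiationException | IllegalAccessException e) {
    return Optional.empty();
}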