I want to use Activiti in Spring MVC. My Java config for Activiti is below.
When I run the project, the exception "no processes deployed with key" is thrown. I put one-vacation-process.bpmn20.xml in the resources folder. What is my problem? Thanks for any help.
@Configuration
public class ActivitiConfig {
@Bean
public ProcessEngine processEngine(ProcessEngineConfigurationImpl pec, ApplicationContext applicationContext) throws Exception {
ProcessEngineFactoryBean pe = new ProcessEngineFactoryBean();
pe.setProcessEngineConfiguration(pec);
pe.setApplicationContext(applicationContext);
return pe.getObject();
}
@Bean
public ProcessEngineConfigurationImpl getProcessEngineConfiguration(
DataSource dataSource,
PlatformTransactionManager transactionManager,
ApplicationContext context) {
SpringProcessEngineConfiguration pec = new SpringProcessEngineConfiguration();
pec.setDataSource(dataSource);
pec.setDatabaseSchemaUpdate("true");
pec.setJobExecutorActivate(true);
pec.setHistory("full");
pec.setMailServerPort(2025);
pec.setDatabaseType("mysql");
pec.setTransactionManager(transactionManager);
pec.setApplicationContext(context);
return pec;
}
@Bean
public RuntimeService getRuntimeService(ProcessEngine processEngine) {
return processEngine.getRuntimeService();
}
@Bean
public TaskService taskService(ProcessEngine processEngine) throws Exception {
return processEngine.getTaskService();
}
}
You need to deploy your process first.
There is an API for different use cases; here I deploy a process where resourceName is the name of the process XML (e.g. one-vacation-process.bpmn20.xml) and content is the actual file content as a string.
RepositoryService repositoryService = processEngine.getRepositoryService();
DeploymentBuilder builder = repositoryService.createDeployment().addString(resourceName, content);
builder.enableDuplicateFiltering().deploy();
Have a look at org.activiti.engine.repository.DeploymentBuilder, which exposes methods such as:
DeploymentBuilder addInputStream(String resourceName, InputStream inputStream);
DeploymentBuilder addClasspathResource(String resource);
DeploymentBuilder addString(String resourceName, String text);
DeploymentBuilder addZipInputStream(ZipInputStream zipInputStream);
DeploymentBuilder addBpmnModel(String resourceName, BpmnModel bpmnModel);
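If you prefer not to build the deployment by hand on every run, here is a minimal sketch of two options, assuming the Spring setup from your question: let SpringProcessEngineConfiguration auto-deploy the classpath resource when the engine starts (it has a setDeploymentResources(...) setter for that, if I remember correctly), or deploy once through the RepositoryService and then start the process by key. Note that "vacationRequest" is only a placeholder; the real key is the id attribute of the <process> element inside one-vacation-process.bpmn20.xml.
import org.activiti.engine.ProcessEngine;
import org.activiti.spring.SpringProcessEngineConfiguration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.core.io.Resource;

public class DeploymentSketch {

    // Option 1: register the BPMN file so the engine deploys it when it starts.
    public static void configureAutoDeployment(SpringProcessEngineConfiguration pec) {
        pec.setDeploymentResources(new Resource[] {
                new ClassPathResource("one-vacation-process.bpmn20.xml")
        });
    }

    // Option 2: deploy explicitly once the engine is up, then start an instance by key.
    public static void deployAndStart(ProcessEngine processEngine) {
        processEngine.getRepositoryService()
                .createDeployment()
                .addClasspathResource("one-vacation-process.bpmn20.xml")
                .enableDuplicateFiltering()
                .deploy();

        // "vacationRequest" is a placeholder for the <process id="..."> in your BPMN file.
        processEngine.getRuntimeService().startProcessInstanceByKey("vacationRequest");
    }
}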
I am using Spring Boot 2.6.6 with MySQL and am quite new to Spring Batch. I was getting a "spring context is forming a cycle" error when I kept the DataSource in the same config file as the readers and writers (SampleJob.java). The suggested solution was to put the DataSource in another class, so I placed it in the same class as main().
Now I am getting this issue:
Error
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.boot.autoconfigure.batch.BatchDataSourceScriptDatabaseInitializer]: Factory method 'batchDataSourceInitializer' threw exception; nested exception is java.lang.IllegalStateException: Unable to detect database type
Caused by: java.lang.IllegalStateException: Unable to detect database type
SampleJob.java
@Configuration
public class SampleJob {
@Autowired
private JobBuilderFactory jobBuilderFactory;
@Autowired
private StepBuilderFactory stepBuilderFactory;
@Bean
public Job chunkJob() {
return jobBuilderFactory.get("Chunk Job")
.incrementer(new RunIdIncrementer())
.start(firstChunkStep())
.build();
}
private Step firstChunkStep() {
return stepBuilderFactory.get("First Chunk Step")
.<StudentJdbc, StudentJdbc>chunk(3)
.reader(jdbcItemReader())
.writer(flatFileItemWriter(null))
.build();
}
@Bean
@StepScope
public JdbcCursorItemReader<StudentJdbc> jdbcItemReader(){
JdbcCursorItemReader<StudentJdbc> jdbcReader = new JdbcCursorItemReader<StudentJdbc>();
jdbcReader.setSql("select id, first_name as firstName, last_name as lastName, email from students");
jdbcReader.setDataSource(universitydatasource());
jdbcReader.setRowMapper(new BeanPropertyRowMapper<StudentJdbc>() {
{
setMappedClass(StudentJdbc.class);
}
});
return jdbcReader;
}
@StepScope
@Bean
public FlatFileItemWriter<StudentJdbc> flatFileItemWriter(
@Value("#{jobParameters['outputFile']}") FileSystemResource fileSystemResource
){
FlatFileItemWriter<StudentJdbc> flatFileItemWriter = new FlatFileItemWriter<StudentJdbc>();
flatFileItemWriter.setResource(fileSystemResource);
flatFileItemWriter.setHeaderCallback(new FlatFileHeaderCallback() {
@Override
public void writeHeader(Writer writer) throws IOException {
writer.write("Id, First Name, Last Name, Email");
}
});
flatFileItemWriter.setLineAggregator(new DelimitedLineAggregator<StudentJdbc>(){
{
setFieldExtractor(new BeanWrapperFieldExtractor<StudentJdbc>() {
{
setNames(new String[] {"id","firstName","lastName","email"});
}
});
}
});
flatFileItemWriter.setFooterCallback(new FlatFileFooterCallback() {
@Override
public void writeFooter(Writer writer) throws IOException {
writer.write("Created # "+new Date());
}
});
return flatFileItemWriter;
}
}
Main class file
@SpringBootApplication
@EnableBatchProcessing
public class ChunksApplication {
public static void main(String[] args) {
SpringApplication.run(ChunksApplication.class, args);
}
@Bean
@Primary
@ConfigurationProperties(prefix="spring.datasource")
public DataSource datasource() {
return DataSourceBuilder.create().build();
}
@Bean
@ConfigurationProperties(prefix="spring.universitydatasource")
public DataSource universitydatasource() {
return DataSourceBuilder.create().build();
}
}
application.properties
spring.datasource.url=jdbc:mysql://localhost:3306/udemy-springbatch-chunks
spring.datasource.username=____
spring.datasource.password=____
spring.datasource.platform=mysql
spring.datasource.driverClassName=com.mysql.cj.jdbc.Driver
#alternate datasource for db input for reader
spring.universitydatasource.url=jdbc:mysql://localhost:3306/university?createDatabaseIfNotExist=true
spring.universitydatasource.username=____
spring.universitydatasource.password=____
spring.universitydatasource.platform=mysql
spring.universitydatasource.driverClassName=com.mysql.cj.jdbc.Driver
Found the solution. It seems that @ConfigurationProperties wasn't picking up the url, username and password from application.properties, so this worked in the main class file itself:
@Bean
@Primary
public DataSource datasource() {
DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
dataSourceBuilder.url("jdbc:mysql://localhost:3306/udemy-springbatch-chunks");
dataSourceBuilder.username("____");
dataSourceBuilder.password("____");
return dataSourceBuilder.build();
}
#Bean("universityDatasource")
public DataSource universitydatasource() {
DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
dataSourceBuilder.url("jdbc:mysql://localhost:3306/university?createDatabaseIfNotExist=true");
dataSourceBuilder.username("____");
dataSourceBuilder.password("____");
return dataSourceBuilder.build();
}
It works for me
spring.batch.schema=classpath:org/springframework/batch/core/schema-mysql.sql
My solution was this: you have to add the jdbcUrl property as well.
Example properties file:
spring.datasource.url=jdbc:mysql://localhost:3306/udemy-springbatch-chunks
spring.datasource.jdbcUrl=${spring.datasource.url}
spring.datasource.username=____
spring.datasource.password=____
spring.datasource.platform=mysql
spring.datasource.driverClassName=com.mysql.cj.jdbc.Driver
#alternate datasource for db input for reader
spring.universitydatasource.url=jdbc:mysql://localhost:3306/university
spring.universitydatasource.jdbcUrl=${spring.universitydatasource.url}
spring.universitydatasource.username=____
spring.universitydatasource.password=____
spring.universitydatasource.platform=mysql
spring.universitydatasource.driverClassName=com.mysql.cj.jdbc.Driver
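The reason duplicating the URL as jdbcUrl helps: DataSourceBuilder builds a HikariDataSource by default, and Hikari's setter is setJdbcUrl(), so binding spring.datasource.* straight onto it only matches jdbcUrl, not url. A hedged alternative sketch that avoids the duplication is to bind the properties into Spring Boot's DataSourceProperties first and build the DataSource from that (the bean names and prefixes below just mirror the ones used in this question):
import javax.sql.DataSource;
import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class DataSourceConfig {

    // spring.datasource.url/username/password/driver-class-name bind onto DataSourceProperties,
    // which knows how to hand the url over to Hikari as jdbcUrl.
    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource")
    public DataSourceProperties defaultDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    public DataSource dataSource() {
        return defaultDataSourceProperties().initializeDataSourceBuilder().build();
    }

    @Bean
    @ConfigurationProperties("spring.universitydatasource")
    public DataSourceProperties universityDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean("universityDatasource")
    public DataSource universityDatasource() {
        return universityDataSourceProperties().initializeDataSourceBuilder().build();
    }
}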
Regards.
My problem: I am stuck implementing a change of schema after user login, following a Stack Overflow answer.
Description: I'm using the class below. However, I have no idea how to use it. I'm reading every tutorial but I'm stuck. The results I'm expecting are:
1- Spring initializes with the default URL so the user can log in.
2- After a successful login, it switches to the schema based on the UserDetails class.
I'm following the Stack Overflow solution at: Change database schema during runtime based on logged in user
The Spring version I'm using is Spring Boot :: (v2.3.3.RELEASE)
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import java.sql.Connection;
import java.sql.ConnectionBuilder;
import java.sql.SQLException;
import java.util.concurrent.TimeUnit;
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.core.env.Environment;
import org.springframework.jdbc.datasource.AbstractDataSource;
public class UserSchemaAwareRoutingDataSource extends AbstractDataSource {
@Autowired
UsuarioProvider customUserDetails;
@Autowired
Environment env;
private LoadingCache<String, DataSource> dataSources = createCache();
public UserSchemaAwareRoutingDataSource() {
}
public UserSchemaAwareRoutingDataSource(UsuarioProvider customUserDetails, Environment env) {
this.customUserDetails = customUserDetails;
this.env = env;
}
private LoadingCache<String, DataSource> createCache() {
return CacheBuilder.newBuilder()
.maximumSize(100)
.expireAfterWrite(10, TimeUnit.MINUTES)
.build(
new CacheLoader<String, DataSource>() {
public DataSource load(String key) throws Exception {
return buildDataSourceForSchema(key);
}
});
}
private DataSource buildDataSourceForSchema(String schema) {
System.out.println("schema:" + schema);
String url = "jdbc:mysql://REDACTED.com/" + schema;
String username = env.getRequiredProperty("spring.datasource.username");
String password = env.getRequiredProperty("spring.datasource.password");
System.out.println("Flag A");
DataSource build = (DataSource) DataSourceBuilder.create()
.driverClassName(env.getRequiredProperty("spring.datasource.driverClassName"))
.username(username)
.password(password)
.url(url)
.build();
System.out.println("Flag B");
return build;
}
@Override
public Connection getConnection() throws SQLException {
return determineTargetDataSource().getConnection();
}
@Override
public Connection getConnection(String username, String password) throws SQLException {
return determineTargetDataSource().getConnection(username, password);
}
private DataSource determineTargetDataSource() {
try {
Usuario usuario = customUserDetails.customUserDetails();
//
String db_schema = usuario.getTunnel().getDb_schema();
//
String schema = db_schema;
return dataSources.get(schema);
} catch (Exception ex) {
ex.printStackTrace();
}
return null;
}
@Override
public ConnectionBuilder createConnectionBuilder() throws SQLException {
return super.createConnectionBuilder();
}
}
References:
https://spring.io/blog/2007/01/23/dynamic-datasource-routing/
How to create Dynamic connections (datasource) in spring using JDBC
Spring Boot Configure and Use Two DataSources
Edit (additional information requested in the comments):
I have 1 database.
This database has n schemas. Each schema pertains to one company, and one user pertains to one company. The login logic is as follows:
- The user inputs a username and password.
- When successful, the UserDetails will contain the name of the 'schema' of this user, i.e. which company/schema this user belongs to.
After that, it should connect directly to that schema, so the user can work with the data of his own company.
I hope this clarifies things as much as possible.
Edit 2:
@Component
public class UsuarioProvider {
@Bean
@Scope(value = WebApplicationContext.SCOPE_REQUEST, proxyMode = ScopedProxyMode.TARGET_CLASS) // or just @RequestScope
public Usuario customUserDetails() {
return (Usuario) SecurityContextHolder.getContext().getAuthentication().getPrincipal();
}
}
public class UserSchemaAwareRoutingDataSource extends AbstractDataSource {
@Autowired
private UsuarioProvider usuarioProvider;
@Autowired // This references the primary datasource, because no qualifier is given
private DataSource companyDependentDataSource;
@Autowired
@Qualifier(value = "loginDataSource")
private DataSource loginDataSource;
@Autowired
Environment env;
private LoadingCache<String, DataSource> dataSources = createCache();
public UserSchemaAwareRoutingDataSource() {
}
private LoadingCache<String, DataSource> createCache() {
return CacheBuilder.newBuilder()
.maximumSize(100)
.expireAfterWrite(10, TimeUnit.MINUTES)
.build(
new CacheLoader<String, DataSource>() {
public DataSource load(String key) throws Exception {
return buildDataSourceForSchema(key);
}
});
}
private DataSource buildDataSourceForSchema(String schema) {
System.out.println("schema:" + schema);
String url = "jdbc:mysql://REDACTED.com/" + schema;
String username = env.getRequiredProperty("spring.datasource.username");
String password = env.getRequiredProperty("spring.datasource.password");
System.out.println("Flag A");
DataSource build = (DataSource) DataSourceBuilder.create()
.driverClassName(env.getRequiredProperty("spring.datasource.driverClassName"))
.username(username)
.password(password)
.url(url)
.build();
System.out.println("Flag B");
return build;
}
@Override
public Connection getConnection() throws SQLException {
return determineTargetDataSource().getConnection();
}
@Override
public Connection getConnection(String username, String password) throws SQLException {
return determineTargetDataSource().getConnection(username, password);
}
private DataSource determineTargetDataSource() {
try {
System.out.println("Flag G");
Usuario usuario = usuarioProvider.customUserDetails(); // request scoped answer!
String db_schema = usuario.getTunnel().getDb_schema();
return dataSources.get(db_schema);
} catch (Exception ex) {
ex.printStackTrace();
}
return null;
}
@Override
public ConnectionBuilder createConnectionBuilder() throws SQLException {
return super.createConnectionBuilder();
}
}
Do I need to put @Configuration on top of this class? I'm not able to make Spring Boot aware of these settings. I'm also a bit confused about how to tell Spring Boot what the loginDataSource URL is; I was using the application.properties default values to log in.
Your setup seems like the classic situation for two different DataSources.
Here is a Baeldung blog post on how to configure Spring Data JPA with multiple data sources.
First thing to notice: they are using @Primary. This both helps and stands in your way. You can only have ONE primary bean of a certain type. This causes trouble for some people, since they try to "override" a Spring bean by making their testing beans primary, which results in two primary beans of the same type. So be careful when setting up your tests.
But it also eases things up if you mostly refer to one DataSource and only occasionally to the other. This seems to be your case, so let's adopt it.
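As an aside, here is the test pitfall above in concrete form - a hypothetical sketch, not part of your code: a test configuration that declares a second @Primary DataSource next to the primary one from the main configuration, which makes by-type injection fail with NoUniqueBeanDefinitionException ("more than one 'primary' bean found among candidates ...").
import javax.sql.DataSource;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Primary;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

@TestConfiguration
public class ClashingTestDataSourceConfig {

    // Second @Primary bean of type DataSource: together with the production
    // @Primary DataSource there is no unique primary candidate any more.
    @Bean
    @Primary
    public DataSource testDataSource() {
        return new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2).build();
    }
}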
Your DataSource configuration could look like
@Configuration
public class DataSourceConfiguration {
#Bean(name="loginDataSource")
public DataSource loginDataSource(Environment env) {
String url = env.getRequiredProperty("spring.logindatasource.url");
return DataSourceBuilder.create()
.driverClassName(env.getRequiredProperty("spring.logindatasource.driverClassName"))
[...]
.url(url)
.build();
}
#Bean(name="companyDependentDataSource")
#Primary // use with caution, I'd recommend to use name based autowiring. See #Qualifier
public DataSource companyDependentDataSource(Environment env) {
return new UserSchemaAwareRoutingDataSource(); // Autowiring is done afterwards by Spring
}
}
These two DataSources can now be used in your repositories/DAOs or how ever you structure your program
@Autowired // This references the primary datasource, because no qualifier is given. UserSchemaAwareRoutingDataSource is its implementation
// @Qualifier("companyDependentDataSource") if @Primary is omitted
private DataSource companyDependentDataSource;
@Autowired
@Qualifier("loginDataSource") // reference by bean name
private DataSource loginDataSource;
Here is an example how to configure Spring Data JPA with a DataSource referenced by name:
@Configuration
@EnableJpaRepositories(
basePackages = "<your entity package>",
entityManagerFactoryRef = "companyEntityManagerFactory",
transactionManagerRef = "companyTransactionManager"
)
public class CompanyPersistenceConfiguration {
@Autowired
@Qualifier("companyDependentDataSource")
private DataSource companyDependentDataSource;
#Bean(name="companyEntityManagerFactory")
public LocalContainerEntityManagerFactoryBean companyEntityManagerFactory() {
LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
emf.setDataSource(companyDependentDataSource);
// ... see Baeldung Blog Post
return emf;
}
#Bean(name="companyTransactionManager")
public PlatformTransactionManager companyTransactionManager() {
JpaTransactionManager tm = new JpaTransactionManager();
tm.setEntityManagerFactory(companyEntityManagerFactory().getObject());
return tm;
}
}
As described in my SO answer that you referred to, there is an important assumption:
The current schema name to be used for the current user is accessible through a Spring JSR-330 Provider like private javax.inject.Provider<User> user; String schema = user.get().getSchema();. This is ideally a ThreadLocal-based proxy.
This is the trick which makes the UserSchemaAwareRoutingDataSource implementation possible. Spring beans are mostly singletons and therefore stateless. This also applies to the normal usage of DataSources: they are treated as stateless singletons and references to them are passed around the whole program. So we need a way to provide a single instance of the companyDependentDataSource that nevertheless behaves differently per user. To get that behavior I suggest using a request-scoped bean.
In a web application, you can use @Scope(REQUEST_SCOPE) to create such objects. There is also a Baeldung post on that topic. As usual, @Bean annotated methods reside in @Configuration annotated classes.
@Configuration
public class UsuarioConfiguration {
@Bean
@Scope(value = WebApplicationContext.SCOPE_REQUEST,
proxyMode = ScopedProxyMode.TARGET_CLASS) // or just @RequestScope
public Usuario usuario() {
// based on your edit2
return (Usuario) SecurityContextHolder.getContext().getAuthentication().getPrincipal();
}
}
Now you can use this request-scoped object (via its scoped proxy) inside your singleton DataSource to behave differently according to the logged-in user:
@Autowired
private Usuario usuario; // this is now a request-scoped proxy which will create the corresponding bean (see UsuarioConfiguration.usuario())
private DataSource determineTargetDataSource() {
try {
String db_schema = this.usuario.getTunnel().getDb_schema();
return dataSources.get(db_schema);
} catch (Exception ex) {
ex.printStackTrace();
}
return null;
}
I hope this helps you understand the request scope concept of Spring.
So your login process would look something like
The user inputs a username and password
A normal Spring bean, referencing the loginDataSource by name, checks the login and puts the user information into the session/security context/cookie/...
When successful, during the next request the companyDependentDataSource is able to retrieve a properly set up Usuario object
You can now use this DataSource to do user-specific work.
To verify your DataSource is properly working you could create a small Spring MVC endpoint
@RestController
public class DataSourceVerificationController {
@Autowired
private Usuario usuario;
@Autowired
@Qualifier("companyDependentDataSource") // omit this annotation if you use @Primary
private DataSource companyDependentDataSource;
@GetMapping("/test")
public String test() throws Exception {
String schema = usuario.getTunnel().getDb_schema();
Connection con = companyDependentDataSource.getConnection();
Statement stmt = con.createStatement();
ResultSet rs = stmt.executeQuery("select name from Employee"); // just a random guess
rs.next();
String name = rs.getString("name");
rs.close();
stmt.close();
con.close();
return "name = '" + name + "', schema = '" + schema + "'";
}
}
Take your favorite browser, go to your login page, do a valid login, and call http://localhost:8080/test afterwards.
I have an application where I try to combine Spring MVC and Apache CXF (SOAP) web services. When I run just the app, everything seems fine and I can see the generated WSDL at this link (http://localhost:8080/services/customer?wsdl). But when I run tests, it throws WebServiceException: Could not send Message... Connection refused.
I've opened all ports for the public, private and domain profiles in Windows Defender Firewall. Maybe I've missed something.
In a desperate attempt to investigate, I checked the link with this command (wsimport -keep -verbose http://localhost:8080/services/customer?wsdl). As a result, it gave this:
[ERROR] Server returned HTTP response code: 403 for URL: http://localhost:8080/services/customer?wsdl
Failed to read the WSDL document: http://localhost:8080/services/customer?wsdl, because 1) could not find the document; 2) the document could not be read; 3) the root element of the document is not <wsdl:definitions>.
[ERROR] Could not find wsdl:service in the provided WSDL(s):
At least one WSDL with at least one service definition needs to be provided.
Now I do not know which way to dig.
WebServiceDispatcherServletInitializer
public class WebServiceDispatcherServletInitializer implements WebApplicationInitializer {
@Override
public void onStartup(ServletContext servletContext) throws ServletException {
AnnotationConfigWebApplicationContext context = new AnnotationConfigWebApplicationContext();
context.register(WebServiceConfig.class);
servletContext.addListener(new ContextLoaderListener(context));
ServletRegistration.Dynamic dispatcher = servletContext.addServlet("dispatcher", new CXFServlet());
dispatcher.addMapping("/services/*");
}
}
WebServiceConfig
@Configuration
public class WebServiceConfig {
@Bean(name = Bus.DEFAULT_BUS_ID)
public SpringBus springBus() {
return new SpringBus();
}
@Bean
public Endpoint endpoint() {
EndpointImpl endpoint = new EndpointImpl(springBus(), new CustomerWebServiceImpl() );
endpoint.publish("http://localhost:8080/services/customer");
return endpoint;
}
}
ClientConfig
@Configuration
public class ClientConfig {
#Bean(name = "client")
public Object generateProxy() {
return proxyFactoryBean().create();
}
@Bean
public JaxWsProxyFactoryBean proxyFactoryBean() {
JaxWsProxyFactoryBean proxyFactory = new JaxWsProxyFactoryBean();
proxyFactory.setServiceClass(CustomerWebService.class);
proxyFactory.setAddress("http://localhost:8080/services/customer");
return proxyFactory;
}
}
CustomerWebServiceImplTest
@ActiveProfiles(profiles = "test")
@ContextConfiguration(classes = {
PersistenceConfig.class,
RootConfig.class,
WebServiceConfig.class,
ClientConfig.class
})
@WebAppConfiguration
public class CustomerWebServiceImplTest {
private ApplicationContext context = new AnnotationConfigApplicationContext(ClientConfig.class);
private CustomerWebService customerWsProxy = (CustomerWebService) context.getBean("client");
@Test
public void addCustomer() {
CustomerDto customer = new CustomerDto();
customer.setName("John");
assertEquals("Hello " + customer.getName(), customerWsProxy.addCustomer(customer));
}
}
Could you give a hint where the error might be?
UPD: I checked this setup on a PC where I and my applications have full access rights, and it still throws the exception.
The solution was quite simple - just add @RunWith(SpringRunner.class). It is this annotation that actually bootstraps the Spring test context; @WebAppConfiguration and @ContextConfiguration on their own do not start any beans.
This is how it will look:
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = {
RootConfig.class,
WebServiceConfig.class,
ClientConfig.class
})
public class CustomerWebServiceImplTest {
...
}
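For completeness, once SpringRunner is in place you can also let the test context inject the client instead of building a separate AnnotationConfigApplicationContext by hand. A rough sketch along those lines (imports omitted for brevity, as in the snippets above; the bean name "client" comes from ClientConfig, and the assertion is the same as before):
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = {
        RootConfig.class,
        WebServiceConfig.class,
        ClientConfig.class
})
public class CustomerWebServiceImplTest {

    @Autowired
    private ApplicationContext context; // the context managed by SpringRunner

    private CustomerWebService customerWsProxy;

    @Before
    public void setUp() {
        // same lookup as before, but against the test-managed context
        customerWsProxy = (CustomerWebService) context.getBean("client");
    }

    @Test
    public void addCustomer() {
        CustomerDto customer = new CustomerDto();
        customer.setName("John");
        assertEquals("Hello " + customer.getName(), customerWsProxy.addCustomer(customer));
    }
}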
I want to create a project with a Spring Batch REST controller and a dynamic input file name.
My code: the REST controller
@RestController
public class FileNameController {
@Autowired
JobLauncher jobLauncher;
@Autowired
Job job;
@RequestMapping("/launchjob")
public String handle(#RequestParam("fileName") String fileName) throws Exception {
Logger logger = LoggerFactory.getLogger(this.getClass());
try {
JobParameters jobParameters = new JobParametersBuilder()
.addString("input.file.name", fileName)
.addLong("time", System.currentTimeMillis())
.toJobParameters();
jobLauncher.run(job, jobParameters);
} catch (Exception e) {
logger.info(e.getMessage());
}
return "Done";
}
}
The Job config:
@Configuration
@EnableBatchProcessing
public class JobConfig {
@Autowired
public JobBuilderFactory jobBuilderFactory;
@Autowired
public StepBuilderFactory stepBuilderFactory;
Filetasklet ft=new Filetasklet();
Logger log = LoggerFactory.getLogger(this.getClass().getName());
private String pathFile = urlCoffreFort + "\\" + ft.getFileName();
// => Configuration of Job
@Bean
public Job job() throws IOException {
return jobBuilderFactory.get("job")
.incrementer(new RunIdIncrementer())
.flow(step1())
.end()
.build();
}
//###### Steps
// => Step cecStep1
@Bean
public Step step1() throws IOException {
return stepBuilderFactory.get("fileDecrypt")
.<Person, String>chunk(100)
.reader(reader1())
.processor(processor1FileDecrypt())
.writer(writer1())
.faultTolerant()
.skip(Exception.class)
.skipLimit(100)
.build();
}
// ####### readers
// => reader1()
@Bean
public FlatFileItemReader<Person> reader1() throws IOException{
return new FlatFileItemReaderBuilder<CSCivique>().name("personItemReader")
.resource(new ClassPathResource(pathFile))
.delimited()
.delimiter(";")
.names(new String[] { "id", "nomNaissance", "prenom" })
.targetType(CSCivique.class)
.build();
}
// ######Processors
@Bean
public PersonItemProcessor1FileDecrypt processor1FileDecrypt() {
return new PersonItemProcessor1FileDecrypt();
}
// ######Writers
@Bean
public FlatFileItemWriter<String> writer1() {
return new FlatFileItemWriterBuilder<String>().name("greetingItemWriter")
.resource(new FileSystemResource("sav/greetings.csv"))
.lineAggregator(new PassThroughLineAggregator<>()).build();
}
}
When I call the URL:
http://localhost:8080/launchjob?fileName=djecc5cpt.csv
the console prints:
PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from BATCH_JOB_INSTANCE where JOB_NAME =? and JOB_KEY =?]; nested exception is org.postgresql.util.PSQLException: ERROR: the relation "batch_job_instance" does not exist
Position: 39
I did not write that query myself; I expected the framework to create its tables automatically.
Spring Batch won't make the decision to create tables on your behalf in your production database. You need to make that decision and do it manually up front. Otherwise, if you use Spring Boot, you can tell Spring Boot to do it for you by setting spring.batch.initialize-schema=always.
Please see https://stackoverflow.com/a/51891852/5019386 for a similar question/answer.
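If you go the manual route (or are not on Spring Boot), here is a hedged sketch of one way to run the Batch metadata script yourself at startup, using the schema file shipped inside spring-batch-core; pick the variant matching your database (the error above suggests PostgreSQL):
import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.jdbc.datasource.init.DataSourceInitializer;
import org.springframework.jdbc.datasource.init.ResourceDatabasePopulator;

@Configuration
public class BatchSchemaConfig {

    // Creates the BATCH_JOB_INSTANCE, BATCH_JOB_EXECUTION, ... tables on startup.
    // Run this once (or make the script idempotent); it is not meant to re-run
    // against an already initialized production database.
    @Bean
    public DataSourceInitializer batchSchemaInitializer(DataSource dataSource) {
        ResourceDatabasePopulator populator = new ResourceDatabasePopulator(
                new ClassPathResource("org/springframework/batch/core/schema-postgresql.sql"));
        DataSourceInitializer initializer = new DataSourceInitializer();
        initializer.setDataSource(dataSource);
        initializer.setDatabasePopulator(populator);
        return initializer;
    }
}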
We are using the latest Spring Boot for a Spring app and the latest Spring Integration for SFTP. I've been to the Spring Integration SFTP documentation site, and I took the Spring Boot configuration as is:
@Bean
public SessionFactory<LsEntry> sftpSessionFactory() {
DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
factory.setHost("localhost");
factory.setPort(port);
factory.setUser("foo");
factory.setPassword("foo");
factory.setAllowUnknownKeys(true);
return new CachingSessionFactory<LsEntry>(factory);
}
@Bean
public SftpInboundFileSynchronizer sftpInboundFileSynchronizer() {
SftpInboundFileSynchronizer fileSynchronizer = new SftpInboundFileSynchronizer(sftpSessionFactory());
fileSynchronizer.setDeleteRemoteFiles(false);
fileSynchronizer.setRemoteDirectory("/");
fileSynchronizer.setFilter(new SftpSimplePatternFileListFilter("*.xml"));
return fileSynchronizer;
}
@Bean
@InboundChannelAdapter(channel = "sftpChannel")
public MessageSource<File> sftpMessageSource() {
SftpInboundFileSynchronizingMessageSource source =
new SftpInboundFileSynchronizingMessageSource(sftpInboundFileSynchronizer());
source.setLocalDirectory(new File("ftp-inbound"));
source.setAutoCreateLocalDirectory(true);
source.setLocalFilter(new AcceptOnceFileListFilter<File>());
return source;
}
@Bean
@ServiceActivator(inputChannel = "sftpChannel")
public MessageHandler handler() {
return new MessageHandler() {
@Override
public void handleMessage(Message<?> message) throws MessagingException {
System.out.println(message.getPayload());
}
};
}
Let me be clear: after cutting and pasting, some unit tests do run. However, when loading the application context there was an error message because no poller was configured.
When I googled that error, other posts on Stack Overflow said I also had to add a default poller bean to remove this error message when loading the application context.
@Bean(name = PollerMetadata.DEFAULT_POLLER)
public PollerMetadata defaultPoller() {
PollerMetadata pollerMetadata = new PollerMetadata();
pollerMetadata.setTrigger(new PeriodicTrigger(60));
return pollerMetadata;
}
When I added this code, THEN at least my build would work and the tests would run because the application context was now being loaded correctly.
Now I am looking for a code sample on how to make this work and move files. The Spring Integration SFTP examples on GitHub are OK, but not great... far from it.
The basic Spring Integration example shows how to read files from an SFTP server when everything is configured with an application-context.xml file. Where is the example where a Spring Boot configuration is used, plus the code to read from that server and the code for the test?
I understand that regardless of whether you use a Java class for Spring Boot configuration or an application-context.xml file, the working code should be the same for autowired SFTP channels and the inbound channel adapter.
So here is the code, I am trying to make work:
@Component
@Profile("sftpInputFetch")
public class SFTPInputFetcher implements InputFetcher
{
// The PollableChannel seems fine
@Autowired
PollableChannel sftpChannel;
@Autowired
SourcePollingChannelAdapter sftpChannelAdapter;
@Override
public Stream<String> fetchLatest() throws FileNotFoundException
{
Stream<String> stream = null;
sftpChannelAdapter.start();
Message<?> received = sftpChannel.receive();
File file = (File)received.getPayload();
// get Stream<String> from file
return stream;
}
}
Currently, "sftpChannelAdapter.start();" is the part I am having trouble with.
This implementation does not find the "SourcePollingChannelAdapter" class.
If this was defined in the classic XML application context with an "id" then this code autowires just fine. With a Spring Boot configuration, it doesn't look like you can define an "id" for a bean.
This just stems from my lack of knowledge on how to convert from using a traditional application-context XML file WITH annotations in the code, to using a complete Spring Boot application context configuration file.
Any help with this is much appreciated. Thanks!
I don't understand the question; you said
I had to add ... to make it work
and then
Now I am looking for a code sample on how to make this work?
What is not working?
You can also use
@InboundChannelAdapter(value = "sftpChannel", poller = @Poller(fixedDelay = "5000"))
instead of adding a default poller definition.
We will fix the docs for the missing poller config.
EDIT
I just copied the code into a new boot app (with the poller config) and it works as expected.
@SpringBootApplication
public class SftpJavaApplication {
public static void main(String[] args) {
new SpringApplicationBuilder(SftpJavaApplication.class).web(false).run(args);
}
@Bean
public SessionFactory<LsEntry> sftpSessionFactory() {
DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
factory.setHost("...");
factory.setPort(22);
factory.setUser("...");
factory.setPassword("...");
factory.setAllowUnknownKeys(true);
return new CachingSessionFactory<LsEntry>(factory);
}
@Bean
public SftpInboundFileSynchronizer sftpInboundFileSynchronizer() {
SftpInboundFileSynchronizer fileSynchronizer = new SftpInboundFileSynchronizer(sftpSessionFactory());
fileSynchronizer.setDeleteRemoteFiles(false);
fileSynchronizer.setRemoteDirectory("foo");
fileSynchronizer.setFilter(new SftpSimplePatternFileListFilter("*.txt"));
return fileSynchronizer;
}
@Bean
@InboundChannelAdapter(channel = "sftpChannel", poller = @Poller(fixedDelay = "5000"))
public MessageSource<File> sftpMessageSource() {
SftpInboundFileSynchronizingMessageSource source = new SftpInboundFileSynchronizingMessageSource(
sftpInboundFileSynchronizer());
source.setLocalDirectory(new File("ftp-inbound"));
source.setAutoCreateLocalDirectory(true);
source.setLocalFilter(new AcceptOnceFileListFilter<File>());
return source;
}
@Bean
@ServiceActivator(inputChannel = "sftpChannel")
public MessageHandler handler() {
return new MessageHandler() {
@Override
public void handleMessage(Message<?> message) throws MessagingException {
System.out.println(message.getPayload());
}
};
}
}
Result:
16:57:59.697 [task-scheduler-1] WARN com.jcraft.jsch - Permanently added '10.0.0.3' (RSA) to the list of known hosts.
ftp-inbound/bar.txt
ftp-inbound/baz.txt