ERROR java.lang.NoClassDefFoundError: com/mongodb/MongoClient - java

So I got a very strange error message. I'm currently working on a Java web project with Maven and testing it with Eclipse and Tomcat. I imported all the necessary dependencies (mongo-java-driver, mongodb-driver, mongodb-driver-core, bson and the javax.servlet API), or so I thought. But I'm still getting this error over and over again.
If I run the code as part of a main method it works just fine, so I'm in the dark about what could have caused this problem.
This is my MongoDB connector:
public class Connector {

    final String HOST = "localhost";
    final int PORT = 27017;
    final String DBNAME = "mitfahrapp";

    public static Connector instance;
    public MongoClient connection;
    public MongoDatabase database;

    public Connector() {
        this.connection = new MongoClient(this.HOST, this.PORT);
        this.database = connection.getDatabase(DBNAME);
    }

    public MongoClient getClient() {
        return connection;
    }

    public static Connector createInstance() throws UnknownHostException {
        if (Connector.instance == null) {
            Connector.instance = new Connector();
        }
        return Connector.instance;
    }

    public MongoCollection<Document> getCollection(String name) {
        return this.database.getCollection(name);
    }

    public void CloseMongo() {
        connection.close();
    }
}
And this is part of my LoginServlet.java:
protected void doPost(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException {
    Connector c = Connector.createInstance();
    MongoCollection<Document> collection = c.getCollection("users");
    String username = request.getParameter("username");
    String password = request.getParameter("password");
    Bson filterUsername = Filters.eq("username", username);
    Bson filterPwd = Filters.eq("password", password);
    Bson bsonFilter = Filters.and(filterUsername, filterPwd);
    FindIterable<Document> doc = collection.find(bsonFilter);
    if (doc != null) {
        response.sendRedirect("welcome.jsp");
    } else {
        response.sendRedirect("login.jsp");
    }
}
Thanks for any answers in advance!

This means that the classes are not included in the deployed artifact. If you are using Maven, you should use the Maven Shade plugin to include them.
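For reference, a minimal sketch of the Shade plugin configuration the answer refers to (the version number is illustrative):
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.4.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>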

Can't load the jdbc driver class. ClassNotFoundException

I am using Eclipse and have to develop a JSP application that connects to a Derby DB.
After I set up the database I went into my project and added derbyclient.jar as well as derbyclient.jar as a reference.
However, each time I try to load the JDBC class I get a ClassNotFoundException.
Class.forName("org.apache.derby.jdbc.ClientDriver");
jakarta.servlet.ServletException: java.lang.ClassNotFoundException: org.apache.derby.jdbc.ClientDriver
org.apache.jasper.runtime.PageContextImpl.handlePageException(PageContextImpl.java:674)
org.apache.jsp.index_jsp._jspService(index_jsp.java:179)
org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:71)
jakarta.servlet.http.HttpServlet.service(HttpServlet.java:770)
org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:467)
org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:379)
org.apache.jasper.servlet.JspServlet.service(JspServlet.java:327)
jakarta.servlet.http.HttpServlet.service(HttpServlet.java:770)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53)
I have also tried it with a MySQL database and got the same result, so I thought maybe I did something wrong with the way I reference the jars.
I intend to use a business Java class which will then be used in the JSP, because I don't want to have the DB access code in the JSP code.
The current BusinessManager.java class looks like this (only mock data, which needs to be replaced with database code):
package com.forum.shared;

import java.sql.*;
import java.time.Instant;
import java.util.*;

public class BusinessManager {

    public List<ForumEntity> LoadFromDatabase() throws ClassNotFoundException, SQLException {
        //Class.forName("com.mysql.jdbc.Driver");
        //Class.forName("org.apache.derby.jdbc.ClientDriver");
        //java.sql.Connection connect = null;
        //String dbName = "D:/Apache/test/forum";
        //String connectionURL = "jdbc:derby://localhost:1527/" + dbName + ";create=true";
        //connect = DriverManager.getConnection(connectionURL);
        List<ForumEntity> result = new ArrayList<ForumEntity>();
        ForumEntity mockData = new ForumEntity();
        mockData.setID(1);
        mockData.setAntwortId(0);
        mockData.setAntwort(false);
        mockData.setInhalt("Willkommen im Forum!");
        mockData.setTitle("Willkommen");
        mockData.setName("Admin");
        mockData.setMail("admin@hszg.de");
        mockData.setZeit(java.util.Date.from(Instant.now()));
        ForumEntity answerData = new ForumEntity();
        answerData.setID(2);
        answerData.setAntwortId(1);
        answerData.setName("Max Mustermann");
        answerData.setMail("max_mustermann@gmx.com");
        answerData.setInhalt("Das ist eine Antwort");
        answerData.setZeit(java.util.Date.from(Instant.now()));
        mockData.addAnswer(answerData);
        result.add(mockData);
        return result;
    }

    public ForumEntity LoadFromDatabaseById(int id) {
        ForumEntity result = new ForumEntity();
        result.setID(1);
        result.setAntwortId(0);
        result.setAntwort(false);
        result.setInhalt("Willkommen im Forum!");
        result.setTitle("Willkommen");
        result.setName("Admin");
        result.setMail("admin@hszg.de");
        result.setZeit(java.util.Date.from(Instant.now()));
        ForumEntity answerData = new ForumEntity();
        answerData.setID(2);
        answerData.setAntwortId(1);
        answerData.setName("Max Mustermann");
        answerData.setMail("max_mustermann@gmx.com");
        answerData.setInhalt("Das ist eine Antwort");
        answerData.setZeit(java.util.Date.from(Instant.now()));
        result.addAnswer(answerData);
        return result;
    }

    public boolean CreateAndSafeForumEntity(int antwordId, boolean antwort, String titel, String inhalt, String name, String mail) {
        boolean result = false;
        return result;
    }
}
I am out of ideas. I have added the jar files for both DBMSs and neither can be loaded. Any ideas?
Edit: I used this Eclipse project template.
Edit: Update
Now it gets interesting. I decided to try a different approach: I kept the libs as references but decided to try loading them at runtime.
public List<ForumEntity> LoadFromDatabase() throws ClassNotFoundException, SQLException {
    List<File> jars = Arrays.asList(new File("D:\\Apache\\db-derby-10.15.2.0-bin\\lib").listFiles());
    URL[] urls = new URL[jars.size()];
    for (int i = 0; i < jars.size(); i++) {
        try {
            urls[i] = jars.get(i).toURI().toURL();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    URLClassLoader childClassLoader = new URLClassLoader(urls, ClassLoader.getSystemClassLoader());
    Class.forName("org.apache.derby.jdbc.ClientDriver", true, childClassLoader);
    //Class.forName("com.mysql.jdbc.Driver");
    //Class.forName("org.apache.derby.jdbc.ClientDriver");
    //java.sql.Connection connect = null;
    //String dbName = "D:/Apache/test/forum";
    //String connectionURL = "jdbc:derby://localhost:1527/" + dbName + ";create=true";
    //connect = DriverManager.getConnection(connectionURL);
    List<ForumEntity> result = new ArrayList<ForumEntity>();
    ForumEntity mockData = new ForumEntity();
    mockData.setID(1);
    mockData.setAntwortId(0);
    mockData.setAntwort(false);
    mockData.setInhalt("Willkommen im Forum!");
    mockData.setTitle("Willkommen");
    mockData.setName("Admin");
    mockData.setMail("admin@hszg.de");
    mockData.setZeit(java.util.Date.from(Instant.now()));
    ForumEntity answerData = new ForumEntity();
    answerData.setID(2);
    answerData.setAntwortId(1);
    answerData.setName("Max Mustermann");
    answerData.setMail("max_mustermann@gmx.com");
    answerData.setInhalt("Das ist eine Antwort");
    answerData.setZeit(java.util.Date.from(Instant.now()));
    mockData.addAnswer(answerData);
    result.add(mockData);
    return result;
}
Guess what: when I previously reached Class.forName("org.apache.derby.jdbc.ClientDriver") it always threw the ClassNotFoundException. With the runtime library loading code above, I no longer get the ClassNotFoundException when I execute Class.forName("org.apache.derby.jdbc.ClientDriver"). Very weird. I must have done something wrong when I set up the references.

How to spock integration test with standalone tomcat runner?

Our project is not currently using the Spring framework, so it is tested with a standalone Tomcat runner.
However, since integration tests such as @SpringBootTest are not possible, Tomcat is started in advance and the HTTP API tests are carried out using Spock.
Is there a way to make this work like @SpringBootTest?
TomcatRunner
private Tomcat tomcat = null;
private int port = 8080;
private String contextPath = null;
private String docBase = null;
private Context rootContext = null;

public Tomcat8Launcher() {
    init();
}

public Tomcat8Launcher(int port, String contextPath, String docBase) {
    this.port = port;
    this.contextPath = contextPath;
    this.docBase = docBase;
    init();
}

private void init() {
    tomcat = new Tomcat();
    tomcat.setPort(port);
    tomcat.enableNaming();
    if (contextPath == null) {
        contextPath = "";
    }
    if (docBase == null) {
        File base = new File(System.getProperty("java.io.tmpdir"));
        docBase = base.getAbsolutePath();
    }
    rootContext = tomcat.addContext(contextPath, docBase);
}

public void addServlet(String servletName, String uri, HttpServlet servlet) {
    Tomcat.addServlet(this.rootContext, servletName, servlet);
    rootContext.addServletMapping(uri, servletName);
}

public void addListenerServlet(ServletContextListener listener) {
    rootContext.addApplicationListener(listener.getClass().getName());
}

public void startServer() throws LifecycleException {
    tomcat.start();
    tomcat.getServer().await();
}

public void stopServer() throws LifecycleException {
    tomcat.stop();
}

public static void main(String[] args) throws Exception {
    System.setProperty("java.util.logging.manager", "org.apache.logging.log4j.jul.LogManager");
    System.setProperty(javax.naming.Context.INITIAL_CONTEXT_FACTORY, "org.apache.naming.java.javaURLContextFactory");
    System.setProperty(javax.naming.Context.URL_PKG_PREFIXES, "org.apache.naming");
    Tomcat8Launcher tomcatServer = new Tomcat8Launcher();
    tomcatServer.addListenerServlet(new ConfigInitBaseServlet());
    tomcatServer.addServlet("restServlet", "/rest/*", new RestServlet());
    tomcatServer.addServlet("jsonServlet", "/json/*", new JsonServlet());
    tomcatServer.startServer();
}
Spock API Test example
class apiTest extends Specification {

    //static final Tomcat8Launcher tomcat = new Tomcat8Launcher()
    static final String testURL = "http://localhost:8080/api/"

    @Shared
    def restClient

    def setupSpec() {
        // tomcat.main()
        restClient = new RESTClient(testURL)
    }

    def 'findAll user'() {
        when:
        def response = restClient.get([path: 'user/all'])

        then:
        with(response) {
            status == 200
            contentType == "application/json"
        }
    }
}
The test does not work unless the two lines below stay commented out.
// static final Tomcat8Launcher tomcat = new Tomcat8Launcher()
This line is declared at the top of the API test.
// tomcat.main()
This line is called in the API test's setupSpec() method.
I don't know why, but once Tomcat is running only its logs are recorded and the test method is never executed.
Is there a way to fix this?
I would suggest creating a Spock extension to encapsulate everything you need. See the "Writing Custom Extensions" chapter of the Spock docs as well as the built-in extensions for inspiration.
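For example, a minimal sketch of a global extension built around the Tomcat8Launcher from the question (assuming Spock 1.1+; the class name and the background thread are illustrative, and startServer() is run on its own thread because it blocks in tomcat.getServer().await()):
import org.spockframework.runtime.extension.AbstractGlobalExtension;

public class EmbeddedTomcatExtension extends AbstractGlobalExtension {

    private final Tomcat8Launcher launcher = new Tomcat8Launcher();

    @Override
    public void start() {
        // Register servlets/listeners here, as in the question's main() method.
        Thread serverThread = new Thread(() -> {
            try {
                launcher.startServer(); // blocks in await(), hence the separate thread
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        });
        serverThread.setDaemon(true);
        serverThread.start();
    }

    @Override
    public void stop() {
        try {
            launcher.stopServer();
        } catch (Exception e) {
            // ignore failures during shutdown
        }
    }
}
Register the extension by listing its fully qualified class name in META-INF/services/org.spockframework.runtime.extension.IGlobalExtension on the test classpath; Spock then calls start() before the first specification runs and stop() after the last one.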

HikariCP- How to use dataSource.logWriter

The HikariCP GitHub page has the following code:
props.put("dataSource.logWriter", new PrintWriter(System.out));
But I'm getting a NullPointerException because logWriter isn't supported.
The final DriverDataSource class in HikariCP:
@Override
public PrintWriter getLogWriter() throws SQLException
{
    throw new SQLFeatureNotSupportedException();
}

@Override
public void setLogWriter(PrintWriter logWriter) throws SQLException
{
    throw new SQLFeatureNotSupportedException();
}
Is this solution for updating HikariCP logging irrelevant?
I didn't get any answer in the group.
EDIT
The Hikari initialization code uses PoolBase, which initializes the datasource with DriverDataSource (which doesn't support logWriter):
else if (jdbcUrl != null && ds == null) {
    ds = new DriverDataSource(jdbcUrl, driverClassName, dataSourceProperties, username, password);
I have to pass a jdbcUrl for Oracle, and I could not get setDataSourceClassName to work in conjunction with setDriverClassName.
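For context, the props line quoted from the GitHub page comes from the README's dataSourceClassName-based example, where the dataSource.* entries are applied to the configured DataSource; a sketch of that style, using the README's own PostgreSQL placeholders rather than anything specific to this question:
Properties props = new Properties();
props.setProperty("dataSourceClassName", "org.postgresql.ds.PGSimpleDataSource");
props.setProperty("dataSource.user", "test");
props.setProperty("dataSource.password", "test");
props.setProperty("dataSource.databaseName", "mydb");
props.put("dataSource.logWriter", new PrintWriter(System.out));
HikariConfig config = new HikariConfig(props);
HikariDataSource ds = new HikariDataSource(config);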

failed to create a child event loop/failed to open a new selector/Too many open files

I am getting errors like "failed to create a child event loop / failed to open a new selector / Too many open files" when there are 30 or more concurrent requests. How do I solve these errors? Am I doing anything wrong? I am using Spring Boot and the Java Cassandra driver. Below is the connection file:
public class Connection {

    public static Session getConnection() {
        final Cluster cluster = Cluster.builder().addContactPoint(ConnectionBean.getCASSANDRA_DB_IP())
                .withQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.LOCAL_ONE))
                .withCredentials(ConnectionBean.getCASSANDRA_USER(), ConnectionBean.getCASSANDRA_PASSWORD())
                .withPoolingOptions(poolingOptions)
                .build();
        final Session session = cluster.connect(ConnectionBean.getCASSANDRA_DB_NAME());
        return session;
    }
}
Below is the ConnectionBean file that I use in the Connection class:
public class ConnectionBean {

    public static String CASSANDRA_DB_IP;
    public static String CASSANDRA_DB_NAME;
    public static String CASSANDRA_USER;
    public static String CASSANDRA_PASSWORD;

    public ConnectionBean() {
    }

    public ConnectionBean(String CASSANDRA_DB_IP, String CASSANDRA_DB_NAME, String CASSANDRA_USER, String CASSANDRA_PASSWORD) {
        this.CASSANDRA_DB_IP = CASSANDRA_DB_IP;
        this.CASSANDRA_DB_NAME = CASSANDRA_DB_NAME;
        this.CASSANDRA_USER = CASSANDRA_USER;
        this.CASSANDRA_PASSWORD = CASSANDRA_PASSWORD;
    }

    public static String getCASSANDRA_DB_IP() {
        return CASSANDRA_DB_IP;
    }

    public static void setCASSANDRA_DB_IP(String cASSANDRA_DB_IP) {
        CASSANDRA_DB_IP = cASSANDRA_DB_IP;
    }

    public static String getCASSANDRA_DB_NAME() {
        return CASSANDRA_DB_NAME;
    }

    public static void setCASSANDRA_DB_NAME(String cASSANDRA_DB_NAME) {
        CASSANDRA_DB_NAME = cASSANDRA_DB_NAME;
    }

    public static String getCASSANDRA_USER() {
        return CASSANDRA_USER;
    }

    public static void setCASSANDRA_USER(String cASSANDRA_USER) {
        CASSANDRA_USER = cASSANDRA_USER;
    }

    public static String getCASSANDRA_PASSWORD() {
        return CASSANDRA_PASSWORD;
    }

    public static void setCASSANDRA_PASSWORD(String cASSANDRA_PASSWORD) {
        CASSANDRA_PASSWORD = cASSANDRA_PASSWORD;
    }
}
Below is the class from which the ConnectionBean variables are initialized:
public class SecurityConfiguration extends WebSecurityConfigurerAdapter {

    private static final String LOGIN_PROCESSING_URL = "/login";
    private static final String LOGIN_FAILURE_URL = "/login?error";
    private static final String LOGIN_URL = "/login";

    @Autowired
    private BCryptPasswordEncoder bCryptPasswordEncoder;

    @Autowired
    private DataSource dataSource;

    @Value("${spring.queries.users-query}")
    private String usersQuery;

    @Value("${spring.queries.roles-query}")
    private String rolesQuery;

    @Value("${CASSANDRA_DB_IP}")
    public String CASSANDRA_DB_IP;

    @Value("${CASSANDRA_DB_NAME}")
    public String CASSANDRA_DB_NAME;

    @Value("${CASSANDRA_USER}")
    public String CASSANDRA_USER;

    @Value("${CASSANDRA_PASSWORD}")
    public String CASSANDRA_PASSWORD;

    @Override
    protected void configure(AuthenticationManagerBuilder auth) throws Exception {
        ConnectionBean cb = new ConnectionBean(CASSANDRA_DB_IP, CASSANDRA_DB_NAME, CASSANDRA_USER, CASSANDRA_PASSWORD);
        auth.jdbcAuthentication().usersByUsernameQuery(usersQuery).authoritiesByUsernameQuery(rolesQuery)
                .dataSource(dataSource).passwordEncoder(bCryptPasswordEncoder);
    }

    @Override
    protected void configure(HttpSecurity http) throws Exception {
        // Not using Spring CSRF here to be able to use plain HTML for the login page
        http.csrf().disable()
                // Register our CustomRequestCache, that saves unauthorized access attempts, so
                // the user is redirected after login.
                .requestCache().requestCache(new CustomRequestCache())
                // Restrict access to our application.
                .and().authorizeRequests()
                // Allow all flow internal requests.
                .requestMatchers(SecurityUtils::isFrameworkInternalRequest).permitAll()
                // Allow all requests by logged in users.
                .anyRequest().authenticated()
                // Configure the login page.
                .and().formLogin().loginPage(LOGIN_URL).permitAll().loginProcessingUrl(LOGIN_PROCESSING_URL)
                .failureUrl(LOGIN_FAILURE_URL)
                // Register the success handler that redirects users to the page they last tried
                // to access
                .successHandler(new SavedRequestAwareAuthenticationSuccessHandler())
                // Configure logout
                .and().logout().logoutSuccessUrl(LOGOUT_SUCCESS_URL);
    }

    /**
     * Allows access to static resources, bypassing Spring security.
     */
    @Override
    public void configure(WebSecurity web) throws Exception {
        web.ignoring().antMatchers(
                // Vaadin Flow static resources
                "/VAADIN/**",
                // the standard favicon URI
                "/favicon.ico",
                // web application manifest
                "/manifest.json", "/sw.js", "/offline-page.html",
                // icons and images
                "/icons/**", "/images/**",
                // (development mode) static resources
                "/frontend/**",
                // (development mode) webjars
                "/webjars/**",
                // (development mode) H2 debugging console
                "/h2-console/**",
                // (production mode) static resources
                "/frontend-es5/**", "/frontend-es6/**");
    }
}
And finally, below is the class through which I am querying the Cassandra data:
public class getData {

    Session session;

    public getData() {
        session = Connection.getConnection();
        getDataTable();
    }

    private void getDataTable() {
        try {
            String query = "SELECT * FROM tableName";
            ResultSet rs = session.execute(query);
            for (Row row : rs) {
                /* Do some stuff here using row */
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
If getConnection() is being invoked for every request, you are creating a new Cluster instance each time.
This is discouraged because one connection is created between your client and a C* node for each Cluster instance, and for each Session a connection pool of at least one connection is created for each C* node.
If you are not closing your Cluster instances after a request completes, these connections will remain open. After a number of requests, you'll have so many connections open that you will run out of file descriptors in your OS.
To resolve this issue, create only one Cluster and one Session instance and reuse them between requests. This strategy is outlined in "4 simple rules when using the DataStax drivers for Cassandra":
Use one Cluster instance per (physical) cluster (per application lifetime)
Use at most one Session per keyspace, or use a single Session and explicitly specify the keyspace in your queries
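A minimal sketch of that approach, reusing the Connection class from the question (the lazy-singleton style and the close() method are illustrative, and withPoolingOptions is omitted for brevity):
public class Connection {

    private static Cluster cluster;
    private static Session session;

    // Build the Cluster and Session once and hand the same Session to every request.
    public static synchronized Session getConnection() {
        if (session == null) {
            cluster = Cluster.builder()
                    .addContactPoint(ConnectionBean.getCASSANDRA_DB_IP())
                    .withQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.LOCAL_ONE))
                    .withCredentials(ConnectionBean.getCASSANDRA_USER(), ConnectionBean.getCASSANDRA_PASSWORD())
                    .build();
            session = cluster.connect(ConnectionBean.getCASSANDRA_DB_NAME());
        }
        return session;
    }

    // Call once on application shutdown so sockets and file descriptors are released.
    public static synchronized void close() {
        if (cluster != null) {
            cluster.close();
            cluster = null;
            session = null;
        }
    }
}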

Null pointer exception on cluster.connect when using Cassandra 2.2.7

I am trying to display all users' usernames from a Cassandra database using an AJAX script in the JSP page. This would display a list of the usernames when a "view all" button is clicked. However, the server throws a NullPointerException on Session session = cluster.connect("");
java.lang.NullPointerException
User.searchAll(User.java:87)
Search.doGet(Search.java:82)
javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
org.netbeans.modules.web.monitor.server.MonitorFilter.doFilter(MonitorFilter.java:393)
Model
public class User {

    Cluster cluster;

    public User() {
    }

    public java.util.LinkedList<ProfileBean> searchAll() {
        Session session = cluster.connect("instagrim");
        LinkedList<ProfileBean> profileBeanList = new LinkedList();
        String cqlQuery = "select * from userprofiles";
        PreparedStatement ps = session.prepare(cqlQuery);
        ResultSet rs;
        BoundStatement bs = new BoundStatement(ps);
        rs = session.execute(bs.bind());
        if (rs.isExhausted()) {
            System.out.println("Profile not found");
        } else {
            for (Row row : rs) {
                ProfileBean profile = new ProfileBean();
                profile.setLogin(row.getString("login"));
                profileBeanList.add(profile);
            }
        }
        session.close();
        return profileBeanList;
    }
}
Servlet
public class Search extends HttpServlet {

    Cluster cluster = null;

    public void init(ServletConfig config) {
        cluster = CassandraHosts.getCluster();
    }

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        User us = new User();
        String output = "";
        LinkedList<ProfileBean> profileBeanList = new LinkedList();
        profileBeanList = us.searchAll();
        for (int i = 0; i < profileBeanList.size(); i++) {
            output = "<p>" + profileBeanList.get(i).getLogin() + "</p>";
        }
        response.getWriter().write(output);
        RequestDispatcher rd = request.getRequestDispatcher("search.jsp");
        rd.forward(request, response);
    }
}
cluster is null, so there is no object to call connect() on; that is exactly what the NullPointerException is telling you.
The solution is to make sure that cluster is properly initialized before you try to connect.
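For example, a minimal sketch of one way to do that (the constructor parameter is an illustrative choice, not something the driver requires): pass the Cluster that the servlet already obtains from CassandraHosts into the model instead of leaving the field unassigned.
public class User {

    private final Cluster cluster;

    public User(Cluster cluster) {
        this.cluster = cluster; // set by the caller, so connect() has a real Cluster to work with
    }

    // searchAll() can now call cluster.connect("instagrim") without a NullPointerException
}
In the servlet's doGet(), create the model with User us = new User(cluster); so it uses the Cluster returned by CassandraHosts.getCluster() in init().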
