I am trying to run automated tests on a Swing-based Java application using Robot Framework's SwingLibrary. I think I have all the libraries installed. However, when I run the test it is unable to actually launch the application (i.e., the tests fail), and I get a whole bunch of warnings before the test actually starts. Also, I briefly see the Java icon appear in the dock, but no window ever opens.
I don't have any problem testing the demo application, though; that works fine.
Below is my robot script:
*** Settings ***
Library    SwingLibrary

*** Test Cases ***
Test Push Button
    Start Test Application

*** Keywords ***
Start Test Application
    Start Application    com.xyz.app.ui.MainFrame
    Select Main Window
Here is the output:
davids$ sudo CLASSPATH=lib/swinglibrary-1.5.1.jar:app.jar jybot app.robot
Password:
10:31:34.578 [MainThread] DEBUG o.p.n.u.i.l.InternalLoggerFactory - Using SLF4J as the default logging framework
10:31:34.613 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent0 - java.nio.Buffer.address: available
10:31:34.613 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent0 - sun.misc.Unsafe.theUnsafe: available
10:31:34.614 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent0 - sun.misc.Unsafe.copyMemory: available
10:31:34.615 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent0 - java.nio.Bits.unaligned: true
10:31:34.616 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent - Java version: 8
10:31:34.616 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent - -Dio.netty.noUnsafe: false
10:31:34.616 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent - sun.misc.Unsafe: available
10:31:34.617 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent - -Dio.netty.noJavassist: false
10:31:34.697 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent - Javassist: available
10:31:34.698 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent - -Dio.netty.tmpdir: /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T (java.io.tmpdir)
10:31:34.698 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent - -Dio.netty.bitMode: 64 (sun.arch.data.model)
10:31:34.698 [MainThread] DEBUG o.p.n.u.internal.PlatformDependent - -Dio.netty.noPreferDirect: false
10:31:34.698 [MainThread] DEBUG o.p.n.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numHeapArenas: 4
10:31:34.699 [MainThread] DEBUG o.p.n.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numDirectArenas: 4
10:31:34.699 [MainThread] DEBUG o.p.n.buffer.PooledByteBufAllocator - -Dio.netty.allocator.pageSize: 8192
10:31:34.699 [MainThread] DEBUG o.p.n.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxOrder: 11
10:31:34.699 [MainThread] DEBUG o.p.n.buffer.PooledByteBufAllocator - -Dio.netty.allocator.chunkSize: 16777216
10:31:34.699 [MainThread] DEBUG o.p.n.buffer.PooledByteBufAllocator - -Dio.netty.allocator.tinyCacheSize: 512
10:31:34.699 [MainThread] DEBUG o.p.n.buffer.PooledByteBufAllocator - -Dio.netty.allocator.smallCacheSize: 256
10:31:34.699 [MainThread] DEBUG o.p.n.buffer.PooledByteBufAllocator - -Dio.netty.allocator.normalCacheSize: 64
10:31:34.699 [MainThread] DEBUG o.p.n.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedBufferCapacity: 32768
10:31:34.699 [MainThread] DEBUG o.p.n.buffer.PooledByteBufAllocator - -Dio.netty.allocator.cacheTrimInterval: 8192
10:31:34.720 [MainThread] DEBUG o.p.n.c.MultithreadEventLoopGroup - -Dio.netty.eventLoopThreads: 8
10:31:34.768 [MainThread] DEBUG o.p.netty.channel.nio.NioEventLoop - -Dio.netty.noKeySetOptimization: false
10:31:34.768 [MainThread] DEBUG o.p.netty.channel.nio.NioEventLoop - -Dio.netty.selectorAutoRebuildThreshold: 512
==============================================================================
app
==============================================================================
Test Push Button | FAIL |
com.xyz.app.ui.MainFrame
------------------------------------------------------------------------------
app | FAIL |
1 critical test, 0 passed, 1 failed
1 test total, 0 passed, 1 failed
==============================================================================
Output: /Users/davids/swinglibrary-demo/output.xml
Log: /Users/davids/swinglibrary-demo/log.html
Report: /Users/davids/swinglibrary-demo/report.html
Also, I have verified that app.jar does in fact contain the MainFrame class at the expected path: com/xyz/app/ui/MainFrame.class
UPDATE:
Using the "Start Application In Separate Thread" keyword instead of "Start Application" gives a new error. It seems that it can't find the main class, even though I have manually confirmed the path of the main class by unzipping the jar file:
Exception in thread "Thread-14" java.lang.RuntimeException: java.lang.ClassNotFoundException: com.xyz.app.ui.MainFrame
at org.robotframework.swing.keyword.launch.ApplicationLaunchingKeywords$1.run(ApplicationLaunchingKeywords.java:53)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: com.xyz.app.ui.MainFrame
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.robotframework.swing.keyword.launch.ApplicationLaunchingKeywords.getMainMethod(ApplicationLaunchingKeywords.java:64)
at org.robotframework.swing.keyword.launch.ApplicationLaunchingKeywords.launchApplication(ApplicationLaunchingKeywords.java:32)
at org.robotframework.swing.keyword.launch.ApplicationLaunchingKeywords$1.run(ApplicationLaunchingKeywords.java:51)
... 1 more
The following solves the issue:
Setting CLASSPATH using export instead of defining it inline.
Using the full paths to the .jars.
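As a minimal sketch of the working invocation (the directory layout below is an assumption based on the output paths above; substitute your own locations):

```shell
# Export CLASSPATH with absolute jar paths instead of defining it inline
# on the sudo command line. BASE is an assumed location; adjust as needed.
BASE=/Users/davids/swinglibrary-demo
export CLASSPATH="$BASE/lib/swinglibrary-1.5.1.jar:$BASE/app.jar"
echo "$CLASSPATH"
# then run: jybot app.robot
```

With the inline form, the relative paths were resolved against whatever working directory the launched JVM saw, so the application jar was never actually on the classpath, which matches the ClassNotFoundException above.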
I'm trying to operate on HDFS via the Java Hadoop client. But when I call FileSystem::listFiles, the returned iterator gives me no entries.
Here is my java code:
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;
import java.net.URI;
import java.net.URISyntaxException;
import java.io.IOException;

class HadoopTest {
    public static void main(String[] args) throws IOException, URISyntaxException {
        String url = "hdfs://10.2.206.148";
        FileSystem fs = FileSystem.get(new URI(url), new Configuration());
        System.out.println("get fs success!");
        RemoteIterator<LocatedFileStatus> iterator = fs.listFiles(new Path("/"), false);
        while (iterator.hasNext()) {
            LocatedFileStatus lfs = iterator.next();
            System.out.println(lfs.getPath().toString());
        }
        System.out.println("iteration finished");
    }
}
And here is the outputs:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/admin/pengduo/hadoop_test/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/admin/pengduo/hadoop_test/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
10:49:26.019 [main] DEBUG org.apache.hadoop.util.Shell - setsid exited with exit code 0
10:49:26.064 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation #org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)])
10:49:26.069 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation #org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)])
10:49:26.069 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation #org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[GetGroups])
10:49:26.070 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation #org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Renewal failures since startup])
10:49:26.070 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation #org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Renewal failures since last successful login])
10:49:26.071 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl - UgiMetrics, User and group related metrics
10:49:26.084 [main] DEBUG org.apache.hadoop.security.SecurityUtil - Setting hadoop.security.token.service.use_ip to true
10:49:26.096 [main] DEBUG org.apache.hadoop.security.Groups - Creating new Groups object
10:49:26.097 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
10:49:26.097 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
10:49:26.097 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
10:49:26.097 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
10:49:26.098 [main] DEBUG org.apache.hadoop.util.PerformanceAdvisory - Falling back to shell based
10:49:26.098 [main] DEBUG org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
10:49:26.153 [main] DEBUG org.apache.hadoop.security.Groups - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
10:49:26.157 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login
10:49:26.158 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login commit
10:49:26.161 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - using local user:UnixPrincipal: admin
10:49:26.161 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Using user: "UnixPrincipal: admin" with name admin
10:49:26.161 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - User entry: "admin"
10:49:26.161 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - UGI loginUser:admin (auth:SIMPLE)
log4j:WARN No appenders could be found for logger (org.apache.htrace.core.Tracer).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
10:49:26.201 [main] DEBUG org.apache.hadoop.fs.FileSystem - Loading filesystems
10:49:26.211 [main] DEBUG org.apache.hadoop.fs.FileSystem - file:// = class org.apache.hadoop.fs.LocalFileSystem from /home/admin/pengduo/hadoop_test/lib/hadoop-common-3.2.1.jar
10:49:26.216 [main] DEBUG org.apache.hadoop.fs.FileSystem - viewfs:// = class org.apache.hadoop.fs.viewfs.ViewFileSystem from /home/admin/pengduo/hadoop_test/lib/hadoop-common-3.2.1.jar
10:49:26.218 [main] DEBUG org.apache.hadoop.fs.FileSystem - har:// = class org.apache.hadoop.fs.HarFileSystem from /home/admin/pengduo/hadoop_test/lib/hadoop-common-3.2.1.jar
10:49:26.219 [main] DEBUG org.apache.hadoop.fs.FileSystem - http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /home/admin/pengduo/hadoop_test/lib/hadoop-common-3.2.1.jar
10:49:26.219 [main] DEBUG org.apache.hadoop.fs.FileSystem - https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /home/admin/pengduo/hadoop_test/lib/hadoop-common-3.2.1.jar
10:49:26.226 [main] DEBUG org.apache.hadoop.fs.FileSystem - hdfs:// = class org.apache.hadoop.hdfs.DistributedFileSystem from /home/admin/pengduo/hadoop_test/lib/hadoop-hdfs-client-3.2.1.jar
10:49:26.233 [main] DEBUG org.apache.hadoop.fs.FileSystem - webhdfs:// = class org.apache.hadoop.hdfs.web.WebHdfsFileSystem from /home/admin/pengduo/hadoop_test/lib/hadoop-hdfs-client-3.2.1.jar
10:49:26.234 [main] DEBUG org.apache.hadoop.fs.FileSystem - swebhdfs:// = class org.apache.hadoop.hdfs.web.SWebHdfsFileSystem from /home/admin/pengduo/hadoop_test/lib/hadoop-hdfs-client-3.2.1.jar
10:49:26.234 [main] DEBUG org.apache.hadoop.fs.FileSystem - Looking for FS supporting hdfs
10:49:26.234 [main] DEBUG org.apache.hadoop.fs.FileSystem - looking for configuration option fs.hdfs.impl
10:49:26.251 [main] DEBUG org.apache.hadoop.fs.FileSystem - Looking in service filesystems for implementation class
10:49:26.251 [main] DEBUG org.apache.hadoop.fs.FileSystem - FS for hdfs is class org.apache.hadoop.hdfs.DistributedFileSystem
10:49:26.282 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.use.legacy.blockreader.local = false
10:49:26.282 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.read.shortcircuit = false
10:49:26.282 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.client.domain.socket.data.traffic = false
10:49:26.282 [main] DEBUG org.apache.hadoop.hdfs.client.impl.DfsClientConf - dfs.domain.socket.path =
10:49:26.291 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - Sets dfs.client.block.write.replace-datanode-on-failure.min-replication to 0
10:49:26.297 [main] DEBUG org.apache.hadoop.io.retry.RetryUtils - multipleLinearRandomRetry = null
10:49:26.312 [main] DEBUG org.apache.hadoop.ipc.Server - rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker#7c729a55
10:49:26.322 [main] DEBUG org.apache.hadoop.ipc.Client - getting client out of cache: org.apache.hadoop.ipc.Client#222545dc
10:49:26.587 [main] DEBUG org.apache.hadoop.util.PerformanceAdvisory - Both short-circuit local reads and UNIX domain socket are disabled.
10:49:26.593 [main] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
get fs success!
10:49:26.629 [main] DEBUG org.apache.hadoop.ipc.Client - The ping interval is 60000 ms.
10:49:26.631 [main] DEBUG org.apache.hadoop.ipc.Client - Connecting to /10.2.206.148:8020
10:49:26.658 [IPC Client (1923598304) connection to /10.2.206.148:8020 from admin] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1923598304) connection to /10.2.206.148:8020 from admin: starting, having connections 1
10:49:26.660 [IPC Parameter Sending Thread #0] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1923598304) connection to /10.2.206.148:8020 from admin sending #0 org.apache.hadoop.hdfs.protocol.ClientProtocol.getListing
10:49:26.666 [IPC Client (1923598304) connection to /10.2.206.148:8020 from admin] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1923598304) connection to /10.2.206.148:8020 from admin got value #0
10:49:26.666 [main] DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine - Call: getListing took 52ms
iteration finished
10:49:26.695 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - stopping client from cache: org.apache.hadoop.ipc.Client#222545dc
10:49:26.695 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - removing client from cache: org.apache.hadoop.ipc.Client#222545dc
10:49:26.695 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - stopping actual client because no more references remain: org.apache.hadoop.ipc.Client#222545dc
10:49:26.695 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - Stopping client
10:49:26.696 [IPC Client (1923598304) connection to /10.2.206.148:8020 from admin] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1923598304) connection to /10.2.206.148:8020 from admin: closed
10:49:26.696 [IPC Client (1923598304) connection to /10.2.206.148:8020 from admin] DEBUG org.apache.hadoop.ipc.Client - IPC Client (1923598304) connection to /10.2.206.148:8020 from admin: stopped, remaining connections 0
10:49:26.797 [Thread-4] DEBUG org.apache.hadoop.util.ShutdownHookManager - Completed shutdown in 0.102 seconds; Timeouts: 0
10:49:26.808 [Thread-4] DEBUG org.apache.hadoop.util.ShutdownHookManager - ShutdownHookManger completed shutdown.
Note that we can get the file system successfully, and the iteration executes with no errors.
However, listing the same directory with the hadoop fs command looks fine:
$ $HADOOP_HOME/bin/hadoop fs -ls hdfs://10.2.206.148/
Warning: fs.defaultFs is not set when running "ls" command.
Found 4 items
drwxr-x--x - hadoop hadoop 0 2020-09-21 20:29 hdfs://10.2.206.148/apps
drwxr-x--x - hadoop hadoop 0 2021-07-08 10:44 hdfs://10.2.206.148/spark-history
drwxrwxrwt - root hadoop 0 2021-07-08 10:43 hdfs://10.2.206.148/tmp
drwxr-x--t - hadoop hadoop 0 2020-11-20 11:31 hdfs://10.2.206.148/user
I have set HADOOP_HOME appropriately.
My Hadoop library versions are 3.2.1:
$ ll hadoop-*
-rw-r--r-- 1 admin admin 60258 Jul 8 10:42 hadoop-annotations-3.2.1.jar
-rw-r--r-- 1 admin admin 139109 Jul 8 10:42 hadoop-auth-3.2.1.jar
-rw-r--r-- 1 admin admin 44163 Jul 8 10:42 hadoop-client-3.2.1.jar
-rw-r--r-- 1 admin admin 4137520 Jul 8 10:42 hadoop-common-3.2.1.jar
-rw-r--r-- 1 admin admin 5959246 Jul 8 10:42 hadoop-hdfs-3.2.1.jar
-rw-r--r-- 1 admin admin 5094412 Jul 8 10:42 hadoop-hdfs-client-3.2.1.jar
-rw-r--r-- 1 admin admin 805845 Jul 8 10:42 hadoop-mapreduce-client-common-3.2.1.jar
-rw-r--r-- 1 admin admin 1657002 Jul 8 10:42 hadoop-mapreduce-client-core-3.2.1.jar
-rw-r--r-- 1 admin admin 85900 Jul 8 10:42 hadoop-mapreduce-client-jobclient-3.2.1.jar
-rw-r--r-- 1 admin admin 3287723 Jul 8 10:42 hadoop-yarn-api-3.2.1.jar
-rw-r--r-- 1 admin admin 322882 Jul 8 10:42 hadoop-yarn-client-3.2.1.jar
-rw-r--r-- 1 admin admin 2919779 Jul 8 10:42 hadoop-yarn-common-3.2.1.jar
I'm confused why the Java Hadoop client behaves differently from the Hadoop CLI, and how to make my Java program perform correctly. Can anyone help me? Many thanks!
I have figured this out myself. The problem is that I used FileSystem::listFiles. That method lists only the files (but not the directories) under the given path, while I have only 4 directories under the given path. To list all entries, including both files and directories, under the given path, I should use FileSystem::listLocatedStatus instead of FileSystem::listFiles.
// this will list only the files but not the directories under "/"
// RemoteIterator<LocatedFileStatus> iterator = fs.listFiles(new Path("/"), false);
// this will list all entries including the files and the directories
RemoteIterator<LocatedFileStatus> iterator = fs.listLocatedStatus(new Path("/"));
Environment
HikariCP version : 2.4.2
JDK version : 1.7.0_141
Database : MySQL
MySQL Driver version : 5.6.27
Hibernate : 4.3.4
We successfully migrated from C3P0 to HikariCP, and everything ran smoothly until recently, when we encountered the following exception.
org.hibernate.exception.JDBCConnectionException: Could not open connection
Caused by: java.sql.SQLTransientConnectionException: HikariPool-0 - Connection is not available, request timed out after 30006ms.
We deploy our webapp in AWS EC2 and database is hosted in RDS.
The web app is deployed on 4 Tomcats, with the maximum connection pool size configured at 250 per Tomcat web app.
A total of 4000 logins happened in about 2 hours.
The use case is as follows:
We have scheduled online tests which student users take within a given period of time (say 9:00 am to 12:30 pm). There are simultaneous logins (say 4000 logins).
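Those numbers are worth sanity-checking against the database's own connection limit. A back-of-the-envelope sketch (the comparison against your RDS instance's max_connections is an assumption; check the actual value with SHOW VARIABLES LIKE 'max_connections'):

```java
public class PoolMath {
    // Worst-case number of simultaneous DB connections the app tier can open:
    // every pool on every node grown to its configured maximum.
    static int worstCaseConnections(int poolSizePerNode, int nodes) {
        return poolSizePerNode * nodes;
    }

    public static void main(String[] args) {
        // 250 (maximumPoolSize from the config below) across 4 Tomcats
        System.out.println(worstCaseConnections(250, 4)); // prints 1000
    }
}
```

If the database allows fewer connections than this worst case, the pools cannot actually grow to their configured maximum, and getConnection() calls queue until connectionTimeout (30 s here) expires. That is one plausible explanation consistent with the pool stats further down showing total=100 despite maximumPoolSize=250.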
We are able to reproduce the issue using Locust, with log4j enabled for logging. Following are the logs:
DEBUG [http-apr-8080-exec-4] - HikariPool-0 - configuration:
DEBUG [http-apr-8080-exec-4] - allowPoolSuspension.............false *
DEBUG [http-apr-8080-exec-4] - autoCommit......................true *
DEBUG [http-apr-8080-exec-4] - catalog.........................
DEBUG [http-apr-8080-exec-4] - connectionInitSql...............
DEBUG [http-apr-8080-exec-4] - connectionTestQuery.............
DEBUG [http-apr-8080-exec-4] - connectionTimeout...............30000 *
DEBUG [http-apr-8080-exec-4] - dataSource......................
DEBUG [http-apr-8080-exec-4] - dataSourceClassName.............
DEBUG [http-apr-8080-exec-4] - dataSourceJNDI..................
DEBUG [http-apr-8080-exec-4] - dataSourceProperties............{useUnicode=true, password=, prepStmtCacheSqlLimit=2048, characterEncoding=utf8, cachePrepStmts=true, useServerPrepStmts=true, prepStmtCacheSize=250}
DEBUG [http-apr-8080-exec-4] - driverClassName.................
DEBUG [http-apr-8080-exec-4] - healthCheckProperties...........{}
DEBUG [http-apr-8080-exec-4] - healthCheckRegistry.............
DEBUG [http-apr-8080-exec-4] - idleTimeout.....................600000 *
DEBUG [http-apr-8080-exec-4] - initializationFailFast..........true *
DEBUG [http-apr-8080-exec-4] - isolateInternalQueries..........false *
DEBUG [http-apr-8080-exec-4] - jdbc4ConnectionTest.............false *
DEBUG [http-apr-8080-exec-4] - jdbcUrl.........................jdbc:mysql://rds url
DEBUG [http-apr-8080-exec-4] - leakDetectionThreshold..........0 *
DEBUG [http-apr-8080-exec-4] - maxLifetime.....................1800000 *
DEBUG [http-apr-8080-exec-4] - maximumPoolSize.................250
DEBUG [http-apr-8080-exec-4] - metricRegistry..................
DEBUG [http-apr-8080-exec-4] - metricsTrackerFactory...........
DEBUG [http-apr-8080-exec-4] - minimumIdle.....................5
DEBUG [http-apr-8080-exec-4] - password........................
DEBUG [http-apr-8080-exec-4] - poolName........................HikariPool-0
DEBUG [http-apr-8080-exec-4] - readOnly........................false *
DEBUG [http-apr-8080-exec-4] - registerMbeans..................false *
DEBUG [http-apr-8080-exec-4] - scheduledExecutorService........
DEBUG [http-apr-8080-exec-4] - threadFactory...................
DEBUG [http-apr-8080-exec-4] - transactionIsolation............
DEBUG [http-apr-8080-exec-4] - username........................username
DEBUG [http-apr-8080-exec-4] - validationTimeout...............5000 *
A few values, marked with *, are defaults, whereas the others are set explicitly.
HikariConfig config = new HikariConfig();
config.setJdbcUrl(JDBC_URL);
config.setUsername(USER_NAME);
config.setPassword(PASSWORD);
config.setMinimumIdle(5);
config.setMaximumPoolSize(250);
config.addDataSourceProperty("cachePrepStmts", true);
config.addDataSourceProperty("prepStmtCacheSize", 250);
config.addDataSourceProperty("prepStmtCacheSqlLimit", 2048);
config.addDataSourceProperty("useServerPrepStmts", true);
datasource = new HikariDataSource(config);
Within 20 seconds or so, the connection pool is exhausted:
DEBUG [http-apr-8080-exec-494] - Timeout failure pool HikariPool-0
stats
(total=100, active=100, idle=0, waiting=194)
The waiting count keeps increasing, and finally the aforementioned exceptions are thrown.
We are not able to pinpoint what causes the issue. We have tried variants of the configuration but have been unable to solve it yet.
I am using IntelliJ IDEA Community Edition, updated to the latest version around two months ago. I have this huge problem: whenever I try to import or create a Maven project, I get the unhelpful message "Unable to import maven project: See logs for details" from the editor.
What really stumps me, and those I have asked for help so far, is that this issue seems completely isolated to me: everyone else around me can import the same projects with ease. The logs haven't been helpful to me either, and when I tried following some other solutions for this on Stack Overflow, nothing happened.
I have also tried re-installing IntelliJ; that fixed it, but only for that particular session of using IntelliJ.
I have the logs, but the dump in them is far too long to put into a question.
So, do any of you have any idea what I should do?
I'm putting a partial dump here in any case, just to try. Just the first part.
2017-09-26 12:09:12,894 [ 47064] WARN - tectAndAdjustIndentOptionsTask - Indent detection is too long for: Main.java
2017-09-26 12:09:12,894 [ 47064] WARN - tectAndAdjustIndentOptionsTask - Indent detection is too long for: Vinir.json
2017-09-26 12:09:12,895 [ 47065] WARN - tectAndAdjustIndentOptionsTask - Indent detection is too long for: Friends.java
2017-09-26 12:09:12,895 [ 47065] WARN - tectAndAdjustIndentOptionsTask - Indent detection is too long for: Books.java
2017-09-26 12:09:12,895 [ 47065] WARN - tectAndAdjustIndentOptionsTask - Indent detection is too long for: Borrowers.java
2017-09-26 12:09:12,895 [ 47065] WARN - tectAndAdjustIndentOptionsTask - Indent detection is too long for: Bækur.json
2017-09-26 12:09:17,655 [ 51825] INFO - ellij.project.impl.ProjectImpl - 18 project components initialized in 52 ms
2017-09-26 12:55:48,767 [2842937] INFO - stubs.SerializationManagerImpl - START StubSerializationManager SHUTDOWN
2017-09-26 12:55:48,768 [2842938] INFO - stubs.SerializationManagerImpl - END StubSerializationManager SHUTDOWN
2017-09-26 12:55:48,768 [2842938] INFO - il.indexing.FileBasedIndexImpl - START INDEX SHUTDOWN
2017-09-26 12:55:48,788 [2842958] INFO - il.indexing.FileBasedIndexImpl - END INDEX SHUTDOWN
2017-09-26 12:55:48,789 [2842959] INFO - org.jetbrains.io.BuiltInServer - web server stopped
2017-09-26 12:55:48,801 [2842971] INFO - Types.impl.FileTypeManagerImpl - FileTypeManager: 0 auto-detected files
Elapsed time on auto-detect: 0 ms
2017-09-26 12:55:48,808 [2842978] INFO - pl.local.NativeFileWatcherImpl - Watcher terminated with exit code 0
2017-09-26 12:55:48,811 [2842981] INFO - newvfs.persistent.PersistentFS - VFS dispose started
2017-09-26 12:55:48,819 [2842989] INFO - newvfs.persistent.PersistentFS - VFS dispose completed
2017-09-26 12:55:48,822 [2842992] INFO - #com.intellij.idea.Main - ------------------------------------------------------ IDE SHUTDOWN ------------------------------------------------------
2017-09-26 12:55:48,824 [2842994] INFO - org.jetbrains.io.BuiltInServer - web server stopped
2017-09-27 12:13:07,909 [ 0] INFO - #com.intellij.idea.Main - ------------------------------------------------------ IDE STARTED ------------------------------------------------------
2017-09-27 12:13:07,919 [ 10] INFO - #com.intellij.util.ui.JBUI - User scale factor: 1.25
2017-09-27 12:13:07,919 [ 10] INFO - #com.intellij.util.ui.JBUI - System scale factor: 1.25 (IDE-managed HiDPI)
2017-09-27 12:13:07,953 [ 44] INFO - #com.intellij.idea.Main - IDE: IntelliJ IDEA (build #IC-172.4155.36, 11 Sep 2017 19:07)
2017-09-27 12:13:07,953 [ 44] INFO - #com.intellij.idea.Main - OS: Windows 10 (10.0, amd64)
2017-09-27 12:13:07,953 [ 44] INFO - #com.intellij.idea.Main - JRE: 1.8.0_152-release-915-b11 (JetBrains s.r.o)
2017-09-27 12:13:07,953 [ 44] INFO - #com.intellij.idea.Main - JVM: 25.152-b11 (OpenJDK 64-Bit Server VM)
2017-09-27 12:13:07,975 [ 66] INFO - #com.intellij.idea.Main - JVM Args: -Xms128m -Xmx750m -XX:ReservedCodeCacheSize=240m -XX:+UseConcMarkSweepGC -XX:SoftRefLRUPolicyMSPerMB=50 -ea -Dsun.io.useCanonCaches=false -Djava.net.preferIPv4Stack=true -XX:+HeapDumpOnOutOfMemoryError -XX:-OmitStackTraceInFastThrow -Djb.vmOptionsFile=D:\download\IntelliJ IDEA Community Edition 2017.2.4\bin\idea64.exe.vmoptions -Xbootclasspath/a:D:\download\IntelliJ IDEA Community Edition 2017.2.4\lib\boot.jar -Didea.platform.prefix=Idea -Didea.jre.check=true -Didea.paths.selector=IdeaIC2017.2 -XX:ErrorFile=C:\Users\Grimur\java_error_in_idea_%p.log -XX:HeapDumpPath=C:\Users\Grimur\java_error_in_idea.hprof
2017-09-27 12:13:07,975 [ 66] INFO - #com.intellij.idea.Main - ext: D:\download\IntelliJ IDEA Community Edition 2017.2.4\jre64\lib\ext: [access-bridge-64.jar, cldrdata.jar, dnsns.jar, jaccess.jar, jfxrt.jar, localedata.jar, meta-index, nashorn.jar, sunec.jar, sunjce_provider.jar, sunmscapi.jar, sunpkcs11.jar, zipfs.jar]
2017-09-27 12:13:07,975 [ 66] INFO - #com.intellij.idea.Main - JNU charset: Cp1252
2017-09-27 12:13:08,047 [ 138] INFO - #com.intellij.idea.Main - JNA library (64-bit) loaded in 71 ms
2017-09-27 12:13:08,070 [ 161] INFO - penapi.util.io.win32.IdeaWin32 - Native filesystem for Windows is operational
2017-09-27 12:13:08,071 [ 162] INFO - #com.intellij.idea.Main - Using "FocusKiller" library to prevent focus stealing.
2017-09-27 12:13:10,577 [ 2668] INFO - llij.ide.plugins.PluginManager - Cannot find optional descriptor duplicates-groovy.xml
2017-09-27 12:13:11,447 [ 3538] INFO - llij.ide.plugins.PluginManager - 32 plugins initialized in 1383 ms
2017-09-27 12:13:11,452 [ 3543] INFO - llij.ide.plugins.PluginManager - Loaded bundled plugins: Android Support (10.2.3), Ant Support (1.0), Bytecode Viewer (0.1), CVS Integration (11), Copyright (8.1), Coverage (172.4155.36), Eclipse Integration (3.0), EditorConfig (172.4155.36), Git Integration (8.1), GitHub (172.4155.36), Gradle (172.4155.36), Groovy (9.0), I18n for Java (172.4155.36), IDEA CORE (172.4155.36), IntelliLang (8.0), JUnit (1.0), Java Bytecode Decompiler (172.4155.36), JavaFX (1.0), Kotlin (1.1.4-release-IJ2017.2-3), Maven Integration (172.4155.36), Plugin DevKit (1.0), Properties Support (172.4155.36), Settings Repository (172.4155.36), Subversion Integration (1.1), Task Management (1.0), Terminal (0.1), TestNG-J (8.0), UI Designer (172.4155.36), XPathView + XSLT Support (4), XSLT-Debugger (1.4), YAML (172.4155.36), hg4idea (10.0)
2017-09-27 12:13:18,972 [ 11063] INFO - ellij.util.io.PagedFileStorage - lower=100; upper=500; buffer=10; max=705
2017-09-27 12:13:19,124 [ 11215] INFO - pl.local.NativeFileWatcherImpl - Starting file watcher: D:\download\IntelliJ IDEA Community Edition 2017.2.4\bin\fsnotifier64.exe
2017-09-27 12:13:19,172 [ 11263] INFO - pl.local.NativeFileWatcherImpl - Native file watcher is operational.
2017-09-27 12:13:23,116 [ 15207] INFO - til.net.ssl.CertificateManager - Default SSL context initialized
2017-09-27 12:13:23,180 [ 15271] INFO - figurations.GeneralCommandLine - Cannot run program "D:\download\IntelliJ IDEA Community Edition 2017.2.4\jre64\bin\java.exe": CreateProcess error=2, The system cannot find the file specified
java.io.IOException: Cannot run program "D:\download\IntelliJ IDEA Community Edition 2017.2.4\jre64\bin\java.exe": CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
at com.intellij.execution.configurations.GeneralCommandLine.startProcess(GeneralCommandLine.java:420)
at com.intellij.execution.configurations.GeneralCommandLine.createProcess(GeneralCommandLine.java:387)
at com.intellij.execution.process.OSProcessHandler.<init>(OSProcessHandler.java:45)
at com.intellij.execution.process.CapturingProcessHandler.<init>(CapturingProcessHandler.java:38)
at com.intellij.execution.util.ExecUtil.execAndGetOutput(ExecUtil.java:101)
at com.intellij.util.JdkBundle.getJDKNameArchVersionAndUpdate(JdkBundle.java:230)
at com.intellij.util.JdkBundle.createBundle(JdkBundle.java:103)
at com.intellij.util.JdkBundle.createBoot(JdkBundle.java:137)
at com.intellij.util.JdkBundle.createBoot(JdkBundle.java:124)
at com.intellij.ide.SystemHealthMonitor.checkRuntime(SystemHealthMonitor.java:101)
at com.intellij.ide.SystemHealthMonitor.initComponent(SystemHealthMonitor.java:79)
at com.intellij.openapi.components.impl.ComponentManagerImpl$ComponentConfigComponentAdapter.getComponentInstance(ComponentManagerImpl.java:492)
at com.intellij.openapi.components.impl.ComponentManagerImpl.createComponents(ComponentManagerImpl.java:118)
at com.intellij.openapi.application.impl.ApplicationImpl.lambda$createComponents$9(ApplicationImpl.java:474)
at com.intellij.openapi.progress.impl.CoreProgressManager.lambda$runProcess$1(CoreProgressManager.java:170)
at com.intellij.openapi.progress.impl.CoreProgressManager.registerIndicatorAndRun(CoreProgressManager.java:548)
at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:493)
at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:94)
at com.intellij.openapi.progress.impl.CoreProgressManager.runProcess(CoreProgressManager.java:157)
at com.intellij.openapi.application.impl.ApplicationImpl.createComponents(ApplicationImpl.java:481)
at com.intellij.openapi.components.impl.ComponentManagerImpl.init(ComponentManagerImpl.java:102)
at com.intellij.openapi.application.impl.ApplicationImpl.load(ApplicationImpl.java:433)
at com.intellij.openapi.application.impl.ApplicationImpl.load(ApplicationImpl.java:419)
at com.intellij.idea.IdeaApplication.run(IdeaApplication.java:203)
at com.intellij.idea.MainImpl$1.lambda$null$0(MainImpl.java:49)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:311)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:762)
at java.awt.EventQueue.access$500(EventQueue.java:98)
at java.awt.EventQueue$3.run(EventQueue.java:715)
at java.awt.EventQueue$3.run(EventQueue.java:709)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:732)
at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.java:345)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessImpl.create(Native Method)
at java.lang.ProcessImpl.<init>(ProcessImpl.java:386)
at java.lang.ProcessImpl.start(ProcessImpl.java:137)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
... 40 more
Thank you very much if you know what's going on. If this can't be resolved, I may fail a course simply because I'm unable to do the projects I need to do.
I had the same problem with my IntelliJ Ultimate 17. The cause was that the JDK for the importer (Maven, in my case) was set to IntelliJ's internal one.
Go to Settings (shortcut: Ctrl+Alt+S).
Then navigate to Build, Execution, Deployment -> Build Tools -> Maven -> Importing.
For "JDK for importer", choose a JDK other than the internal one.
Check whether it works; if necessary, restart IntelliJ.
(Screenshot of the relevant settings not included.)
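For context on the error itself: on Windows, CreateProcess error=2 means "file not found", i.e. the bundled jre64\bin\java.exe the IDE tried to launch does not exist at that path, so it is worth checking whether that file is actually present under the installation directory. The failure mode can be reproduced with a plain ProcessBuilder pointed at a nonexistent executable (the path below is purely illustrative):

```java
import java.io.IOException;

public class MissingExeDemo {
    public static void main(String[] args) {
        try {
            // Same call chain as in the stack trace: ProcessBuilder.start()
            // throws IOException when the executable path does not exist.
            new ProcessBuilder("/no/such/dir/java").start();
        } catch (IOException e) {
            System.out.println("Launch failed as expected: " + e.getMessage());
        }
    }
}
```
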
Selenium-server-standalone has a 5-second delay while starting. If I run Selenium with the debug option, the log shows that the delay occurs at "New random session seed".
I start the Selenium server with java -jar selenium-server-standalone-2.44.0.jar.
If I want to specify chromedriver, I run java -jar selenium-server-standalone-2.44.0.jar -Dwebdriver.chrome.driver=chromedriver_2.11
It happens with:
OSX 10.9.5 and OSX 10.10
Java 1.7.0_51 (update 67 and update 71)
selenium-server-standalone 2.42.0, 2.43.1, 2.44.0
with and without chromedriver (2.9, 2.10, 2.11)
running as sudo and as a non-privileged user
From what I know, other machines with the same configuration show no such delay.
Five seconds may seem like nothing, but I have another issue that is most probably caused by the same problem: it gives me a minute of waiting instead of a few seconds.
Here is the log:
11:22:50.445 INFO - Launching a standalone server
11:22:50.820 INFO - Java: Oracle Corporation 24.51-b03
11:22:50.820 INFO - OS: Mac OS X 10.10 x86_64
11:22:50.948 INFO - v2.44.0, with Core v2.44.0. Built from revision 76d78cf
11:22:50.948 INFO - Selenium server running in debug mode.
11:22:51.001 DEBUG - add component: SocketListener0#0.0.0.0:4444
11:22:51.022 DEBUG - add component: org.openqa.jetty.http.ResourceCache#50265e47
11:22:51.046 DEBUG - add component: org.openqa.selenium.server.ProxyHandler in HttpContext[/,/]
11:22:51.065 DEBUG - add component: HttpContext[/,/]
11:22:51.070 DEBUG - Added HttpContext[/,/] for host *
11:22:51.071 DEBUG - add component: org.openqa.jetty.http.ResourceCache#36264c17
11:22:51.073 DEBUG - added SC{BASIC,null,user,CONFIDENTIAL} at /org/openqa/selenium/tests/html/basicAuth/*
11:22:51.107 DEBUG - add component: org.openqa.jetty.http.handler.SecurityHandler in HttpContext[/selenium-server,/selenium-server]
11:22:51.111 DEBUG - add component: org.openqa.selenium.server.StaticContentHandler in HttpContext[/selenium-server,/selenium-server]
11:22:51.123 DEBUG - add component: org.openqa.selenium.server.SessionExtensionJsHandler in HttpContext[/selenium-server,/selenium-server]
11:22:51.125 DEBUG - add component: org.openqa.selenium.server.htmlrunner.SingleTestSuiteResourceHandler in HttpContext[/selenium-server,/selenium-server]
11:22:51.126 DEBUG - add component: org.openqa.selenium.server.htmlrunner.SeleniumHTMLRunnerResultsHandler#6d9d3901
11:22:51.132 DEBUG - add component: HttpContext[/selenium-server,/selenium-server]
11:22:51.132 DEBUG - Added HttpContext[/selenium-server,/selenium-server] for host *
11:22:51.242 INFO - Default driver org.openqa.selenium.ie.InternetExplorerDriver registration is skipped: registration capabilities Capabilities [{platform=WINDOWS, ensureCleanSession=true, browserName=internet explorer, version=}] does not match with current platform: MAC
11:22:51.446 DEBUG - add component: org.openqa.jetty.http.ResourceCache#61f3318a
11:22:51.483 DEBUG - add component: org.openqa.selenium.server.SeleniumDriverResourceHandler in HttpContext[/selenium-server,/selenium-server]
11:22:51.483 DEBUG - add component: HttpContext[/selenium-server/driver,/selenium-server/driver]
11:22:51.484 DEBUG - Added HttpContext[/selenium-server/driver,/selenium-server/driver] for host *
11:22:51.484 DEBUG - add component: org.openqa.jetty.http.ResourceCache#c3b626c
11:22:51.528 DEBUG - add component: WebDriver remote server
11:22:51.568 DEBUG - add component: org.openqa.jetty.jetty.servlet.HashSessionManager#5298d146
11:22:51.568 DEBUG - add component: org.openqa.jetty.jetty.servlet.ServletHandler#2ed37507
11:22:51.596 INFO - RemoteWebDriver instances should connect to: http://127.0.0.1:4444/wd/hub
11:22:51.596 DEBUG - add component: HttpContext[/wd,/wd]
11:22:51.597 DEBUG - Added HttpContext[/wd,/wd] for host *
11:22:51.597 DEBUG - Starting org.openqa.jetty.jetty.Server#a5bdce3
11:22:51.598 INFO - Version Jetty/5.1.x
11:22:51.598 DEBUG - LISTENERS: [SocketListener0#0.0.0.0:4444]
11:22:51.598 DEBUG - HANDLER: {null={/selenium-server/driver/*=[HttpContext[/selenium-server/driver,/selenium-server/driver]], /selenium-server/*=[HttpContext[/selenium-server,/selenium-server]], /=[HttpContext[/,/]], /wd/*=[HttpContext[/wd,/wd]]}}
11:22:51.599 DEBUG - Starting HttpContext[/selenium-server/driver,/selenium-server/driver]
11:22:51.612 DEBUG - Init classloader from null, sun.misc.Launcher$AppClassLoader#5eb1404f for HttpContext[/selenium-server/driver,/selenium-server/driver]
11:22:51.612 INFO - Started HttpContext[/selenium-server/driver,/selenium-server/driver]
11:22:51.612 DEBUG - Starting HttpContext[/selenium-server,/selenium-server]
11:22:51.612 DEBUG - Init classloader from null, sun.misc.Launcher$AppClassLoader#5eb1404f for HttpContext[/selenium-server,/selenium-server]
11:22:51.613 DEBUG - Started org.openqa.jetty.http.handler.SecurityHandler in HttpContext[/selenium-server,/selenium-server]
11:22:51.613 DEBUG - Started org.openqa.selenium.server.StaticContentHandler in HttpContext[/selenium-server,/selenium-server]
11:22:51.613 DEBUG - Started org.openqa.selenium.server.SessionExtensionJsHandler in HttpContext[/selenium-server,/selenium-server]
11:22:51.613 DEBUG - Started org.openqa.selenium.server.htmlrunner.SingleTestSuiteResourceHandler in HttpContext[/selenium-server,/selenium-server]
11:22:51.613 DEBUG - Started org.openqa.selenium.server.SeleniumDriverResourceHandler in HttpContext[/selenium-server,/selenium-server]
11:22:51.613 INFO - Started HttpContext[/selenium-server,/selenium-server]
11:22:51.614 DEBUG - Starting HttpContext[/,/]
11:22:51.614 DEBUG - Init classloader from null, sun.misc.Launcher$AppClassLoader#5eb1404f for HttpContext[/,/]
11:22:51.614 DEBUG - Started org.openqa.selenium.server.ProxyHandler in HttpContext[/,/]
11:22:51.614 INFO - Started HttpContext[/,/]
11:22:51.614 DEBUG - Starting HttpContext[/wd,/wd]
11:22:51.614 DEBUG - Init classloader from null, sun.misc.Launcher$AppClassLoader#5eb1404f for HttpContext[/wd,/wd]
11:22:51.614 DEBUG - Starting org.openqa.jetty.jetty.servlet.ServletHandler#2ed37507
11:22:51.614 DEBUG - New random session seed
11:22:56.882 DEBUG - Session scavenger period = 30s
11:22:56.890 DEBUG - Started holder of class org.openqa.selenium.remote.server.DriverServlet
11:22:56.891 INFO - Started org.openqa.jetty.jetty.servlet.ServletHandler#2ed37507
11:22:56.891 INFO - Started HttpContext[/wd,/wd]
11:22:56.928 INFO - Started SocketListener on 0.0.0.0:4444
11:22:56.928 INFO - Started org.openqa.jetty.jetty.Server#a5bdce3
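In the log above, the ~5-second gap sits exactly between "New random session seed" (11:22:51.614) and the next line (11:22:56.882), i.e. while Jetty's session manager is seeding its random-number generator. One possible cause (an assumption, not a confirmed diagnosis) is SecureRandom blocking while gathering system entropy. A minimal sketch to measure seeding time on a given machine:

```java
import java.security.SecureRandom;

public class SeedDelayProbe {
    public static void main(String[] args) {
        SecureRandom random = new SecureRandom();
        long start = System.nanoTime();
        // 20 bytes is a typical seed size; if this call takes seconds,
        // entropy gathering is the bottleneck on this machine.
        byte[] seed = random.generateSeed(20);
        long millis = (System.nanoTime() - start) / 1_000_000;
        System.out.println("generateSeed(20) took " + millis
                + " ms for " + seed.length + " bytes");
    }
}
```

If seeding is indeed slow, a commonly suggested JVM workaround is starting the server with -Djava.security.egd=file:/dev/./urandom (placed before -jar) so SecureRandom does not block on /dev/random.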