When I run my mod, it freezes once it hits the "Caused by: com.mojang.authlib.exceptions.MinecraftClientHttpException: Status: 401" error. I expect that error since I am not signed into a Minecraft account, but the freezing was odd. Once I stopped the process, Eclipse gave me this error:
Execution failed for task ':runClient'.
Process 'command 'C:\Program Files\Eclipse Foundation\jdk-16.0.2.7-hotspot\bin\java.exe'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace.
Run with --info or --debug option to get more log output.
Run with --scan to get full insights.
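For reference, this is roughly how I invoked the task with those options from a terminal (I normally launch runClient from Eclipse; the wrapper calls below are just the command-line equivalent):

gradlew runClient --stacktrace --info
gradlew runClient --scan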
None of those runs made any difference. I removed the code and .json files I had added right before seeing this issue for the first time; nothing changed. Even a simple restart did nothing. Here are some system specs and the log:
Windows 11 Home 21H2,
Intel(R) Core(TM) i7-10750H CPU @ 2.60GHz 2.59 GHz,
16.0 GB RAM
[07:48:50] [main/INFO] (FabricLoader/GameProvider) Loading Minecraft 1.17.1 with Fabric Loader 0.12.12
[07:48:50] [main/INFO] (FabricLoader) Loading 52 mods:
- fabric 0.45.2+1.17
- fabric-api-base 0.4.0+cf39a74318
- fabric-api-lookup-api-v1 1.5.0+e821752d18
- fabric-biome-api-v1 3.2.2+cf39a74318
- fabric-blockrenderlayer-v1 1.1.6+cf39a74318
- fabric-command-api-v1 1.1.4+cf39a74318
- fabric-commands-v0 0.2.3+cf39a74318
- fabric-containers-v0 0.1.13+cf39a74318
- fabric-content-registries-v0 0.4.1+3447790d18
- fabric-crash-report-info-v1 0.1.6+cf39a74318
- fabric-data-generation-api-v1 1.1.1+6f83e76518
- fabric-dimensions-v1 2.0.15+b556f28c18
- fabric-entity-events-v1 1.4.1+377137cc18
- fabric-events-interaction-v0 0.4.12+e99fbe1218
- fabric-events-lifecycle-v0 0.2.4+cf39a74318
- fabric-game-rule-api-v1 1.0.8+cf39a74318
- fabric-gametest-api-v1 1.0.5+234bdc2e18
- fabric-item-api-v1 1.3.0+6617390918
- fabric-item-groups-v0 0.3.2+cf39a74318
- fabric-key-binding-api-v1 1.0.6+2a2bb57318
- fabric-keybindings-v0 0.2.4+cf39a74318
- fabric-lifecycle-events-v1 1.4.6+0392f3a618
- fabric-loot-tables-v1 1.0.5+cf39a74318
- fabric-mining-level-api-v1 1.0.3+cf39a74318
- fabric-mining-levels-v0 0.1.7+cf39a74318
- fabric-models-v0 0.3.1+cf39a74318
- fabric-networking-api-v1 1.0.14+cf39a74318
- fabric-networking-blockentity-v0 0.2.12+cf39a74318
- fabric-networking-v0 0.3.3+cf39a74318
- fabric-object-builder-api-v1 1.11.1+f907116918
- fabric-object-builders-v0 0.7.8+cf39a74318
- fabric-particles-v1 0.2.5+cf39a74318
- fabric-registry-sync-v0 0.8.0+ea29b33318
- fabric-renderer-api-v1 0.4.5+cf39a74318
- fabric-renderer-indigo 0.4.9+cf39a74318
- fabric-renderer-registries-v1 3.2.6+cf39a74318
- fabric-rendering-data-attachment-v1 0.1.6+cf39a74318
- fabric-rendering-fluids-v1 0.2.1+cf39a74318
- fabric-rendering-v0 1.1.7+cf39a74318
- fabric-rendering-v1 1.10.1+377137cc18
- fabric-resource-loader-v0 0.4.10+f09604ce18
- fabric-screen-api-v1 1.0.5+cf39a74318
- fabric-screen-handler-api-v1 1.1.9+cf39a74318
- fabric-structure-api-v1 1.1.15+cf39a74318
- fabric-tag-extensions-v0 1.2.3+cf39a74318
- fabric-textures-v0 1.0.7+cf39a74318
- fabric-tool-attribute-api-v1 1.3.3+3b96517518
- fabric-transfer-api-v1 1.5.4+cf39a74318
- fabricloader 0.12.12
- java 16
- minecraft 1.17.1
- mythicislands 1.0-1.17.1
[07:48:50] [main/INFO] (FabricLoader/Mixin) SpongePowered MIXIN Subsystem Version=0.8.4 Source=file:/C:/Users/creep/.gradle/caches/modules-2/files-2.1/net.fabricmc/sponge-mixin/0.10.7+mixin.0.8.4/7a4ca9d54d9ae564dea0363d668036a8420ed9b8/sponge-mixin-0.10.7+mixin.0.8.4.jar Service=Knot/Fabric Env=CLIENT
[07:48:51] [main/INFO] (FabricLoader/Mixin) Loaded Fabric development mappings for mixin remapper!
[07:48:51] [main/INFO] (FabricLoader/Mixin) Compatibility level set to JAVA_16
[07:48:52] [main/WARN] (FileUtil) Configuration conflict: there is more than one oshi.properties file on the classpath
[07:48:52] [main/WARN] (FileUtil) Configuration conflict: there is more than one oshi.architecture.properties file on the classpath
[07:48:57] [Render thread/INFO] (Minecraft) Environment: authHost='https://authserver.mojang.com', accountsHost='https://api.mojang.com', sessionHost='https://sessionserver.mojang.com', servicesHost='https://api.minecraftservices.com', name='PROD'
[07:48:57] [Render thread/ERROR] (Minecraft) Failed to verify authentication
com.mojang.authlib.exceptions.InvalidCredentialsException: Status: 401
at com.mojang.authlib.exceptions.MinecraftClientHttpException.toAuthenticationException(MinecraftClientHttpException.java:56) ~[authlib-2.3.31.jar:?]
at com.mojang.authlib.yggdrasil.YggdrasilSocialInteractionsService.checkPrivileges(YggdrasilSocialInteractionsService.java:112) ~[authlib-2.3.31.jar:?]
at com.mojang.authlib.yggdrasil.YggdrasilSocialInteractionsService.<init>(YggdrasilSocialInteractionsService.java:42) ~[authlib-2.3.31.jar:?]
at com.mojang.authlib.yggdrasil.YggdrasilAuthenticationService.createSocialInteractionsService(YggdrasilAuthenticationService.java:151) ~[authlib-2.3.31.jar:?]
at net.minecraft.client.MinecraftClient.createSocialInteractionsService(MinecraftClient.java:670) [minecraft-project-#-mapped.jar:?]
at net.minecraft.client.MinecraftClient.<init>(MinecraftClient.java:429) [minecraft-project-#-mapped.jar:?]
at net.minecraft.client.main.Main.main(Main.java:179) [minecraft-project-#-mapped.jar:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:78) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:567) ~[?:?]
at net.fabricmc.loader.impl.game.minecraft.MinecraftGameProvider.launch(MinecraftGameProvider.java:608) [fabric-loader-0.12.12.jar:?]
at net.fabricmc.loader.impl.launch.knot.Knot.launch(Knot.java:77) [fabric-loader-0.12.12.jar:?]
at net.fabricmc.loader.impl.launch.knot.KnotClient.main(KnotClient.java:23) [fabric-loader-0.12.12.jar:?]
at net.fabricmc.devlaunchinjector.Main.main(Main.java:86) [dev-launch-injector-0.2.1+build.8.jar:?]
Caused by: com.mojang.authlib.exceptions.MinecraftClientHttpException: Status: 401
at com.mojang.authlib.minecraft.client.MinecraftClient.readInputStream(MinecraftClient.java:77) ~[authlib-2.3.31.jar:?]
at com.mojang.authlib.minecraft.client.MinecraftClient.get(MinecraftClient.java:47) ~[authlib-2.3.31.jar:?]
at com.mojang.authlib.yggdrasil.YggdrasilSocialInteractionsService.checkPrivileges(YggdrasilSocialInteractionsService.java:104) ~[authlib-2.3.31.jar:?]
... 13 more
[07:48:57] [Render thread/INFO] (Minecraft) Setting user: Player592
[07:48:57] [Render thread/WARN] (FabricLoader/Mixin) @Inject(@At("INVOKE")) Shift.BY=3 on fabric-lifecycle-events-v1.mixins.json:client.WorldChunkMixin::handler$zbf000$onLoadBlockEntity exceeds the maximum allowed value: 0. Increase the value of maxShiftBy to suppress this warning.
[07:48:57] [Render thread/INFO] (Minecraft) [STDOUT]: Registering Mod Items for mythicislands
[07:48:57] [Render thread/INFO] (Minecraft) [STDOUT]: Registering ModBlocks for mythicislands
[07:48:57] [Render thread/INFO] (Minecraft) [STDOUT]: Hello Fabric world!
[07:48:57] [Render thread/INFO] (Indigo) [Indigo] Registering Indigo renderer!
[07:48:58] [Render thread/INFO] (Minecraft) Backend library: LWJGL version 3.2.2 build 10
If there is any way to fix this, it would be greatly appreciated. I searched around the internet and while some solutions came up, none of them worked. This happened out of nowhere after adding some code, but as stated above, removing that code changes nothing.
Related
I'm asking for help building a project for Android (my knowledge in this area is close to zero).
It is a rather old project. When I try to build it in Android Studio, I get the following error:
Build command failed.
Error while executing process C:\Users\User\AppData\Local\Android\Sdk\cmake\3.18.1\bin\ninja.exe with arguments {-C C:\Users\User\IdeaProjects\1\stouch-task-hardware-upgrade-investigation\app.cxx\RelWithDebInfo\2a9512e2\armeabi-v7a stouch}
ninja: Entering directory `C:\Users\User\IdeaProjects\1\stouch-task-hardware-upgrade-investigation\app.cxx\RelWithDebInfo\2a9512e2\armeabi-v7a'
[1/1] Linking CXX shared library ........\build\intermediates\cxx\RelWithDebInfo\2a9512e2\obj\armeabi-v7a\libstouch.so
I tried different versions of Android Studio, Java, and Gradle, but the result is always the same.
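For context, switching Gradle versions was done by regenerating the wrapper, roughly like this (the version number below is only an example, not something the project requires):

gradlew wrapper --gradle-version 6.7.1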
How do I fix it?
Update:
2023-01-12 23:41:37,079 [e-1136-b06] WARN - ea.gradle.project.sync.SdkSync - Replacing invalid NDK path C:\Users\User\AppData\Local\Android\Sdk\ndk-bundle with C:\Users\User\AppData\Local\Android\Sdk\ndk-bundle
2023-01-12 23:41:37,079 [e-1136-b06] INFO - e.project.sync.GradleSyncState - Started sync with Gradle for project 'stouch'.
2023-01-12 23:41:37,080 [e-1136-b06] INFO - idea.project.IndexingSuspender - Consuming IndexingSuspender activation event: SYNC_STARTED
2023-01-12 23:41:37,080 [e-1136-b06] INFO - idea.project.IndexingSuspender - Starting batch update for project: Project 'C:\Users\User\IdeaProjects\AS\9\stouch' stouch
2023-01-12 23:41:37,087 [thread 116] INFO - s.plugins.gradle.GradleManager - Instructing gradle to use java from C:/Program Files/Android/Android Studio/jre
2023-01-12 23:41:37,088 [thread 116] INFO - s.plugins.gradle.GradleManager - Instructing gradle to use java from C:/Program Files/Android/Android Studio/jre
2023-01-12 23:41:37,101 [thread 116] INFO - xecution.GradleExecutionHelper - Passing command-line args to Gradle Tooling API: -Didea.version=3.2 -Djava.awt.headless=true -Pandroid.injected.build.model.only=true -Pandroid.injected.build.model.only.advanced=true -Pandroid.injected.invoked.from.ide=true -Pandroid.injected.build.model.only.versioned=3 -Pandroid.injected.studio.version=3.2.1.0 -Pandroid.builder.sdkDownload=false --init-script C:\Users\User\AppData\Local\Temp\ijinit17.gradle
2023-01-12 23:41:37,770 [thread 116] INFO - .project.GradleProjectResolver - Gradle project resolve error
org.gradle.tooling.BuildException: Could not run build action using Gradle distribution 'https://services.gradle.org/distributions/gradle-4.2-all.zip'.
at org.gradle.tooling.internal.consumer.ExceptionTransformer.transform(ExceptionTransformer.java:51)
at org.gradle.tooling.internal.consumer.ExceptionTransformer.transform(ExceptionTransformer.java:29)
at org.gradle.tooling.internal.consumer.ResultHandlerAdapter.onFailure(ResultHandlerAdapter.java:41)
I am trying to use the ideaIC inspect CLI to check local uncommitted changes on Ubuntu.
Following the website docs, a whole-project code inspection runs successfully with the command bin/inspect.sh $(pwd) $(pwd)/.idea/inspectionProfiles/Project_Default.xml temp -v2.
But when I add the -changes option, as in bin/inspect.sh $(pwd) $(pwd)/.idea/inspectionProfiles/Project_Default.xml temp -changes -v2, an unexpected error occurs. I found the error message in the IDE source code, but I couldn't find any solution for it.
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Starting up IntelliJ IDEA 2020.2.3 (build IC-202.7660.26) ...done.
Opening project...2021-02-26 16:41:48,634 [ 1629] WARN - Container.ComponentManagerImpl - Do not use constructor injection (requestorClass=com.alibaba.p3c.idea.component.AliProjectComponent)
2021-02-26 16:41:49,018 [ 2013] WARN - Container.ComponentManagerImpl - Do not use constructor injection (requestorClass=org.jetbrains.android.compose.AndroidComposeAutoDocumentation)
Feb 26, 2021 4:41:49 PM net.sourceforge.pmd.RuleSetFactory parseRuleSetNode
WARNING: RuleSet description is missing. Future versions of PMD will require it.
Feb 26, 2021 4:41:50 PM net.sourceforge.pmd.RuleSetFactory parseRuleSetNode
WARNING: RuleSet description is missing. Future versions of PMD will require it.
2021-02-26 16:41:50,181 [ 3176] WARN - tartup.impl.StartupManagerImpl - Activities registered via registerPostStartupActivity must be dumb-aware: org.jetbrains.kotlin.idea.configuration.ui.KotlinConfigurationCheckerComponent$projectOpened$1#48ae78
done.
Initializing project...Loaded profile 'Project Default' from file '/home/user/workspace/lab/mai/bigdata/home/.idea/inspectionProfiles/Project_Default.xml'
modified file/home/user/workspace/lab/mai/bigdata/home/src/main/java/com/maiscrm/bigdata/spark/app/etl/CommonIncrementalCollectionLoader.java
done.
Inspecting with profile 'Project Default'
Running first analysis stage...
Shelving changes...
2021-02-26 16:41:55,220 [ 8215] WARN - ion.impl.NotificationCollector - Notification group 'Heap Dump Analysis' is already registered in whitelist
2021-02-26 16:41:55,220 [ 8215] WARN - ion.impl.NotificationCollector - Notification group 'Low Memory' is already registered in whitelist
Running second analysis stage...
2021-02-26 16:41:55,370 [ 8365] ERROR - spection.InspectionApplication - java.lang.IllegalArgumentException: The code below uses the same GUI thread to complete operations.Running from EDT would deadlock
java.lang.RuntimeException: java.lang.IllegalArgumentException: The code below uses the same GUI thread to complete operations.Running from EDT would deadlock
at com.intellij.openapi.application.impl.LaterInvocator.invokeAndWait(LaterInvocator.java:149)
at com.intellij.openapi.application.impl.ApplicationImpl.invokeAndWait(ApplicationImpl.java:475)
at com.intellij.openapi.application.ex.ApplicationUtil.invokeAndWaitSomewhere(ApplicationUtil.java:160)
at com.intellij.openapi.progress.impl.CoreProgressManager.runProcessWithProgressInCurrentThread(CoreProgressManager.java:535)
at com.intellij.openapi.progress.impl.CoreProgressManager.run(CoreProgressManager.java:335)
at com.intellij.openapi.vcs.changes.CallbackData.lambda$createInteractive$3(CallbackData.java:85)
at com.intellij.openapi.vcs.changes.UpdateRequestsQueue.invokeAfterUpdate(UpdateRequestsQueue.java:177)
at com.intellij.openapi.vcs.changes.ChangeListManagerImpl.invokeAfterUpdate(ChangeListManagerImpl.java:356)
at com.intellij.openapi.vcs.changes.ChangeListManagerImpl.invokeAfterUpdate(ChangeListManagerImpl.java:344)
at com.intellij.codeInspection.InspectionApplication.run(InspectionApplication.java:235)
at com.intellij.codeInspection.InspectionApplication.execute(InspectionApplication.java:140)
at com.intellij.codeInspection.InspectionApplication.startup(InspectionApplication.java:107)
at com.intellij.codeInspection.InspectionMain.main(InspectionMain.java:99)
at com.intellij.openapi.application.ApplicationStarter.main(ApplicationStarter.java:62)
at com.intellij.idea.ApplicationLoader$startApp$8.run(ApplicationLoader.kt:231)
at java.base/java.util.concurrent.CompletableFuture$UniRun.tryFire(CompletableFuture.java:783)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:610)
at java.base/java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1085)
at java.base/java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:478)
at com.intellij.idea.ApplicationLoader$startApp$nonEdtExecutor$1.execute(ApplicationLoader.kt:131)
at java.base/java.util.concurrent.CompletableFuture$UniCompletion.claim(CompletableFuture.java:568)
at java.base/java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1069)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1742)
at com.intellij.util.concurrency.BoundedTaskExecutor.doRun(BoundedTaskExecutor.java:215)
at com.intellij.util.concurrency.BoundedTaskExecutor.access$200(BoundedTaskExecutor.java:26)
at com.intellij.util.concurrency.BoundedTaskExecutor$1.execute(BoundedTaskExecutor.java:194)
at com.intellij.util.concurrency.BoundedTaskExecutor$1.run(BoundedTaskExecutor.java:186)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1$1.run(Executors.java:668)
at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1$1.run(Executors.java:665)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1.run(Executors.java:665)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.IllegalArgumentException: The code below uses the same GUI thread to complete operations.Running from EDT would deadlock
at com.intellij.openapi.projectRoots.impl.UnknownSdkInspectionCommandLineConfigurator$configureProject$1.invokeSuspend(UnknownSdkInspectionCommandLineConfigurator.kt:38)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:56)
at kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:272)
at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:79)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:54)
at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt:36)
at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source)
at com.intellij.openapi.projectRoots.impl.UnknownSdkInspectionCommandLineConfigurator.configureProject(UnknownSdkInspectionCommandLineConfigurator.kt:37)
at com.intellij.codeInspection.InspectionApplication.configureProject(InspectionApplication.java:350)
at com.intellij.codeInspection.InspectionApplication.lambda$runUnderProgress$21(InspectionApplication.java:574)
at com.intellij.openapi.progress.impl.CoreProgressManager.lambda$runProcess$2(CoreProgressManager.java:170)
at com.intellij.openapi.progress.impl.CoreProgressManager.registerIndicatorAndRun(CoreProgressManager.java:629)
at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:581)
at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:60)
at com.intellij.openapi.progress.impl.CoreProgressManager.runProcess(CoreProgressManager.java:157)
at com.intellij.codeInspection.InspectionApplication.runUnderProgress(InspectionApplication.java:573)
at com.intellij.codeInspection.InspectionApplication.runAnalysis(InspectionApplication.java:373)
at com.intellij.codeInspection.InspectionApplication.runAnalysisOnScope(InspectionApplication.java:343)
at com.intellij.codeInspection.InspectionApplication.lambda$run$4(InspectionApplication.java:241)
at com.intellij.openapi.vcs.changes.Waiter.onSuccess(Waiter.java:51)
at com.intellij.openapi.progress.impl.CoreProgressManager.finishTask(CoreProgressManager.java:549)
at com.intellij.openapi.progress.impl.CoreProgressManager.lambda$runProcessWithProgressInCurrentThread$9(CoreProgressManager.java:535)
at com.intellij.openapi.application.TransactionGuardImpl$2.run(TransactionGuardImpl.java:201)
at com.intellij.openapi.application.impl.ApplicationImpl.runIntendedWriteActionOnCurrentThread(ApplicationImpl.java:802)
at com.intellij.openapi.application.impl.ApplicationImpl.lambda$invokeAndWait$8(ApplicationImpl.java:475)
at com.intellij.openapi.application.impl.LaterInvocator$1.run(LaterInvocator.java:126)
at com.intellij.openapi.application.impl.FlushQueue.doRun(FlushQueue.java:84)
at com.intellij.openapi.application.impl.FlushQueue.runNextEvent(FlushQueue.java:132)
at com.intellij.openapi.application.impl.FlushQueue.flushNow(FlushQueue.java:47)
at com.intellij.openapi.application.impl.FlushQueue$FlushNow.run(FlushQueue.java:188)
at java.desktop/java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:313)
at java.desktop/java.awt.EventQueue.dispatchEventImpl(EventQueue.java:776)
at java.desktop/java.awt.EventQueue$4.run(EventQueue.java:727)
at java.desktop/java.awt.EventQueue$4.run(EventQueue.java:721)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:85)
at java.desktop/java.awt.EventQueue.dispatchEvent(EventQueue.java:746)
at com.intellij.ide.IdeEventQueue.defaultDispatchEvent(IdeEventQueue.java:971)
at com.intellij.ide.IdeEventQueue._dispatchEvent(IdeEventQueue.java:841)
at com.intellij.ide.IdeEventQueue.lambda$dispatchEvent$8(IdeEventQueue.java:452)
at com.intellij.openapi.progress.impl.CoreProgressManager.computePrioritized(CoreProgressManager.java:744)
at com.intellij.ide.IdeEventQueue.lambda$dispatchEvent$9(IdeEventQueue.java:451)
at com.intellij.openapi.application.impl.ApplicationImpl.runIntendedWriteActionOnCurrentThread(ApplicationImpl.java:802)
at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.java:499)
at java.desktop/java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:203)
at java.desktop/java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:124)
at java.desktop/java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:113)
at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:109)
at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.desktop/java.awt.EventDispatchThread.run(EventDispatchThread.java:90)
2021-02-26 16:41:55,372 [ 8367] ERROR - spection.InspectionApplication - IntelliJ IDEA 2020.2.3 Build #IC-202.7660.26
2021-02-26 16:41:55,372 [ 8367] ERROR - spection.InspectionApplication - JDK: 11.0.8; VM: OpenJDK 64-Bit Server VM; Vendor: JetBrains s.r.o.
2021-02-26 16:41:55,372 [ 8367] ERROR - spection.InspectionApplication - OS: Linux
2021-02-26 16:41:55,372 [ 8367] ERROR - spection.InspectionApplication - Last Action:
java.lang.IllegalArgumentException: The code below uses the same GUI thread to complete operations.Running from EDT would deadlock
The issue is fixed in IntelliJ IDEA 2020.3.2. Related issue in the JetBrains bug tracker: https://youtrack.jetbrains.com/issue/IDEA-263195
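After upgrading, the same incremental command from the question should work unchanged, for example (same paths and profile as above):

bin/inspect.sh "$(pwd)" "$(pwd)/.idea/inspectionProfiles/Project_Default.xml" temp -changes -v2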
I'm trying to set up the Hive integration with Flink as shown here. I have everything configured as described, and all services (Hive, MySQL, Kafka) are running properly. However, when I start the Flink SQL Client on a standalone local Flink cluster with this command:
./bin/sql-client.sh embedded
I get a ClassNotFoundException: org.apache.hadoop.conf.Configuration...
This is the detailed log file with the exception trace:
2020-04-01 11:27:31,458 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: jobmanager.rpc.address, localhost
2020-04-01 11:27:31,459 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: jobmanager.rpc.port, 6123
2020-04-01 11:27:31,459 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: jobmanager.heap.size, 1024m
2020-04-01 11:27:31,460 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: taskmanager.memory.process.size, 1568m
2020-04-01 11:27:31,460 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: taskmanager.numberOfTaskSlots, 1
2020-04-01 11:27:31,460 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: parallelism.default, 1
2020-04-01 11:27:31,460 INFO org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: jobmanager.execution.failover-strategy, region
2020-04-01 11:27:31,507 INFO org.apache.flink.core.fs.FileSystem - Hadoop is not in the classpath/dependencies. The extended set of supported File Systems via Hadoop is not available.
2020-04-01 11:27:31,527 WARN org.apache.flink.client.cli.CliFrontend - Could not load CLI class org.apache.flink.yarn.cli.FlinkYarnSessionCli.
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/exceptions/YarnException
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.flink.client.cli.CliFrontend.loadCustomCommandLine(CliFrontend.java:1076)
at org.apache.flink.client.cli.CliFrontend.loadCustomCommandLines(CliFrontend.java:1030)
at org.apache.flink.table.client.gateway.local.LocalExecutor.<init>(LocalExecutor.java:135)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:85)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.exceptions.YarnException
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
... 7 more
2020-04-01 11:27:31,543 INFO org.apache.flink.table.client.gateway.local.LocalExecutor - Using default environment file: file:/home/rudip7/flink/conf/sql-client-defaults.yaml
2020-04-01 11:27:31,861 INFO org.apache.flink.table.client.config.entries.ExecutionEntry - Property 'execution.restart-strategy.type' not specified. Using default value: fallback
2020-04-01 11:27:32,776 ERROR org.apache.flink.table.client.SqlClient - SQL Client must stop. Unexpected exception. This is a bug. Please consider filing an issue.
org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:753)
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:228)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:98)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)
at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:371)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$4(ExecutionContext.java:547)
at java.util.HashMap.forEach(HashMap.java:1289)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$5(ExecutionContext.java:546)
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:240)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:545)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:494)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:159)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:118)
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:742)
... 3 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
... 14 more
Do you have any hint about what I'm missing? I have all my Hadoop libraries on my classpath, and I don't understand why this class is missing...
I'm using Flink 1.10 and Java 8.
Also, this line in the log file makes me wonder whether it is really my mistake:
2020-04-01 11:27:32,776 ERROR org.apache.flink.table.client.SqlClient - SQL Client must stop. Unexpected exception. This is a bug. Please consider filing an issue.
Thank you in advance!
If you have not set HADOOP_HOME in your environment, you can export HADOOP_CLASSPATH before running ./bin/sql-client.sh embedded:
export HADOOP_CLASSPATH=`hadoop classpath`
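A minimal sketch of the full sequence (assuming the hadoop command is on your PATH; the echo is just a sanity check that the variable is actually populated before launching the client):

export HADOOP_CLASSPATH=`hadoop classpath`
echo "$HADOOP_CLASSPATH"
./bin/sql-client.sh embedded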
I am trying to get some JUnit tests to run on Jenkins through JMeter for some GAE cost testing. It has been a real headache, to say the least. As far as what is going on right now, I have fought my way up to this issue, which I hope will be the last. Some key points:
- I have no issues whatsoever when running it locally through the GUI
- The following stack trace is taken from the log produced when running it on Jenkins through a shell script that calls a Java program, which in turn runs the test
Here is the stack trace:
2016/07/12 08:53:13 ERROR - jmeter.threads.JMeterThread: Test failed! java.lang.NoSuchFieldError: INSTANCE
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.setupClient(HTTPHC4Impl.java:774)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.sample(HTTPHC4Impl.java:327)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerProxy.sample(HTTPSamplerProxy.java:74)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1146)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1135)
at org.apache.jmeter.threads.JMeterThread.executeSamplePackage(JMeterThread.java:465)
at org.apache.jmeter.threads.JMeterThread.processSampler(JMeterThread.java:410)
at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:241)
at java.lang.Thread.run(Thread.java:745)
2016/07/12 08:53:13 INFO - jmeter.threads.JMeterThread: Thread finished: jp#gc - Ultimate Thread Group 1-1
2016/07/12 08:53:13 INFO - jmeter.engine.StandardJMeterEngine: Notifying test listeners of end of test
2016/07/12 08:53:13 INFO - jmeter.reporters.Summariser: summary = 0 in 00:00:00 = ******/s Avg: 0 Min: 9223372036854775807 Max: -9223372036854775808 Err: 0 (0.00%)
2016/07/12 08:53:23 ERROR - jmeter.JMeter: Uncaught exception: java.lang.NumberFormatException: For input string: "******"
at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:2043)
at sun.misc.FloatingDecimal.parseDouble(FloatingDecimal.java:110)
at java.lang.Double.parseDouble(Double.java:538)
at AutoTest.AutoTestUtil.CheckStats(AutoTestUtil.java:219)
at AutoTest.AutoTestUtil.main(AutoTestUtil.java:68)
This can happen when the classpath contains the same jar twice with different versions of httpcore. Keep one compatible jar and remove the other; the same applies to the httpclient jar.
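A quick way to spot the duplicates is to list the HttpClient/HttpCore jars that the Jenkins job actually sees (the path below is illustrative; adjust it to your JMeter install or job workspace):

find /path/to/apache-jmeter/lib -name 'httpcore*.jar' -o -name 'httpclient*.jar'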
I have a cluster running Hadoop 0.20.2 and Pig 0.10.
I'm interested in adding some logging to Pig's source code and running my own Pig build on the cluster.
What I did:
- built the project with the 'ant' command
- got pig.jar and pig-withouthadoop.jar
- copied the jars to the Pig home directory on the cluster's namenode
- ran a job
Then I got the following output:
2013-03-25 06:35:05,226 [main] WARN org.apache.pig.backend.hadoop20.PigJobControl - falling back to default JobControl (not using hadoop 0.20 ?)
java.lang.NoSuchFieldException: runnerState
at java.lang.Class.getDeclaredField(Class.java:1882)
at org.apache.pig.backend.hadoop20.PigJobControl.<clinit>(PigJobControl.java:51)
at org.apache.pig.backend.hadoop.executionengine.shims.HadoopShims.newJobControl(HadoopShims.java:97)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:287)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:177)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1320)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1305)
at org.apache.pig.PigServer.execute(PigServer.java:1295)
at org.apache.pig.PigServer.executeBatch(PigServer.java:375)
at org.apache.pig.PigServer.executeBatch(PigServer.java:353)
at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:137)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
at org.apache.pig.Main.run(Main.java:480)
at org.apache.pig.Main.main(Main.java:157)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
2013-03-25 06:35:05,229 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig script settings are added to the job
2013-03-25 06:35:05,260 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2013-03-25 06:35:05,272 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting Parallelism to 1
2013-03-25 06:35:06,041 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - creating jar file Job9091543475518322185.jar
2013-03-25 06:35:10,974 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - jar file Job9091543475518322185.jar created
2013-03-25 06:35:10,995 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2013-03-25 06:35:11,006 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Key [pig.schematuple] is false, will not generate code.
2013-03-25 06:35:11,006 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Starting process to move generated code to distributed cacche
2013-03-25 06:35:11,006 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Setting key [pig.schematuple.classes] with classes to deserialize []
2013-03-25 06:35:11,181 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2998: Unhandled internal error. org.apache.hadoop.mapred.jobcontrol.JobControl.addJob(Lorg/apache/hadoop/mapred/jobcontrol/Job;)Ljava/lang/String;
Pig Stack Trace:
ERROR 2998: Unhandled internal error. org.apache.hadoop.mapred.jobcontrol.JobControl.addJob(Lorg/apache/hadoop/mapred/jobcontrol/Job;)Ljava/lang/String;
java.lang.NoSuchMethodError: org.apache.hadoop.mapred.jobcontrol.JobControl.addJob(Lorg/apache/hadoop/mapred/jobcontrol/Job;)Ljava/lang/String;
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:298)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:177)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1320)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1305)
at org.apache.pig.PigServer.execute(PigServer.java:1295)
at org.apache.pig.PigServer.executeBatch(PigServer.java:375)
at org.apache.pig.PigServer.executeBatch(PigServer.java:353)
at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:137)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
at org.apache.pig.Main.run(Main.java:480)
at org.apache.pig.Main.main(Main.java:157)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
What went wrong? Should I do anything else besides replacing pig.jar and pig-withouthadoop.jar in the installation directory on the namenode? Any help is appreciated.
The point I missed: pig-withouthadoop.jar has to be compiled against the specific Hadoop version you run.
I compiled the jar in the following way and it worked:
% ant clean jar-withouthadoop -Dhadoopversion=23
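As a sanity check before building, you can confirm which Hadoop line the cluster actually runs, since the -Dhadoopversion value has to match it:

% hadoop version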