My IntelliJ IDEA plugin is based on the official JetBrains template (the Java one, not the Kotlin one).
While solving the error Failed to apply plugin 'org.jetbrains.intellij', I updated Gradle from 7.2 to 7.3. Then, when trying to build the project, I got the error
Val cannot be reassigned
(I have appended the detailed output at the end of this question.)
I checked the "Gradle 5.0 causes error Val cannot be reassigned when using Kotlin DSL in build.gradle.kts" issue on GitHub. There, the cause was the classDirectories field, which had become read-only. But when I searched for classDirectories in my project, there were no results.
Maybe the solution is obvious to advanced Gradle users, but Gradle/Java/Kotlin is not my specialization, although I need to create plugins for IntelliJ IDEA.
Appendix
Output with stack trace
11:58:14: Executing 'buildPlugin --stacktrace'...
> Configure project :
[gradle-intellij-plugin :] Gradle IntelliJ Plugin is outdated: 1.12.0. Update `org.jetbrains.intellij` to: 1.13.0
e: D:\IntelliJ IDEA\InHouseDevelopment\YamatoDaiwaES_Extensions\IDEsPlugins\IntelliJ_IDEA\build.gradle.kts:49:5: Val cannot be reassigned
e: D:\IntelliJ IDEA\InHouseDevelopment\YamatoDaiwaES_Extensions\IDEsPlugins\IntelliJ_IDEA\build.gradle.kts:49:15: Type mismatch: inferred type is String but Property<String> was expected
e: D:\IntelliJ IDEA\InHouseDevelopment\YamatoDaiwaES_Extensions\IDEsPlugins\IntelliJ_IDEA\build.gradle.kts:50:5: Val cannot be reassigned
e: D:\IntelliJ IDEA\InHouseDevelopment\YamatoDaiwaES_Extensions\IDEsPlugins\IntelliJ_IDEA\build.gradle.kts:50:14: Type mismatch: inferred type is List<???> but ListProperty<String> was expected
FAILURE: Build failed with an exception.
* Where:
Build file 'D:\IntelliJ IDEA\InHouseDevelopment\YamatoDaiwaES_Extensions\IDEsPlugins\IntelliJ_IDEA\build.gradle.kts' line: 49
* What went wrong:
Script compilation errors:
Line 49: version = properties("pluginVersion")
^ Val cannot be reassigned
Line 49: version = properties("pluginVersion")
^ Type mismatch: inferred type is String but Property<String> was expected
Line 50: groups = emptyList()
^ Val cannot be reassigned
Line 50: groups = emptyList()
^ Type mismatch: inferred type is List<???> but ListProperty<String> was expected
4 errors
* Try:
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Exception is:
ScriptCompilationException(errors=[ScriptCompilationError(message=Val cannot be reassigned, location=C:\Users\Takeshi Tokugawa\.gradle\.tmp\gradle-kotlin-dsl-2962856697602439688.tmp\build.gradle.kts (49:5)), ScriptCompilationError(message=Type mismatch: inferred type is String but Property<String> was expected, location=C:\Users\Takeshi Tokugawa\.gradle\.tmp\gradle-kotlin-dsl-2962856697602439688.tmp\build.gradle.kts (49:15)), ScriptCompilationError(message=Val cannot be reassigned, location=C:\Users\Takeshi Tokugawa\.gradle\.tmp\gradle-kotlin-dsl-2962856697602439688.tmp\build.gradle.kts (50:5)), ScriptCompilationError(message=Type mismatch: inferred type is List<???> but ListProperty<String> was expected, location=C:\Users\Takeshi Tokugawa\.gradle\.tmp\gradle-kotlin-dsl-2962856697602439688.tmp\build.gradle.kts (50:14))])
at org.gradle.kotlin.dsl.support.KotlinCompilerKt.compileKotlinScriptModuleTo(KotlinCompiler.kt:187)
at org.gradle.kotlin.dsl.support.KotlinCompilerKt.compileKotlinScriptToDirectory(KotlinCompiler.kt:148)
// ...
* Get more help at https://help.gradle.org
BUILD FAILED in 1s
11:58:15: Execution finished 'buildPlugin --stacktrace'.
build.gradle.kts
import io.gitlab.arturbosch.detekt.Detekt
import org.jetbrains.changelog.markdownToHTML
import org.jetbrains.kotlin.gradle.tasks.KotlinCompile

fun properties(key: String) = project.findProperty(key).toString()

plugins {
    // Java support
    id("java")
    // Kotlin support
    id("org.jetbrains.kotlin.jvm") version "1.7.10"
    // gradle-intellij-plugin - read more: https://github.com/JetBrains/gradle-intellij-plugin
    id("org.jetbrains.intellij") version "1.12.0"
    // gradle-changelog-plugin - read more: https://github.com/JetBrains/gradle-changelog-plugin
    id("org.jetbrains.changelog") version "1.3.0"
    // detekt linter - read more: https://detekt.github.io/detekt/gradle.html
    id("io.gitlab.arturbosch.detekt") version "1.17.1"
    // ktlint linter - read more: https://github.com/JLLeitschuh/ktlint-gradle
    id("org.jlleitschuh.gradle.ktlint") version "10.0.0"
}

group = properties("pluginGroup")
version = properties("pluginVersion")

// Configure project's dependencies
repositories {
    mavenCentral()
}

dependencies {
    detektPlugins("io.gitlab.arturbosch.detekt:detekt-formatting:1.17.1")
}

// Configure gradle-intellij-plugin plugin.
// Read more: https://github.com/JetBrains/gradle-intellij-plugin
intellij {
    pluginName.set(properties("pluginName"))
    version.set(properties("platformVersion"))
    type.set(properties("platformType"))
    downloadSources.set(properties("platformDownloadSources").toBoolean())
    updateSinceUntilBuild.set(true)

    // Plugin Dependencies. Uses `platformPlugins` property from the gradle.properties file.
    plugins.set(properties("platformPlugins").split(',').map(String::trim).filter(String::isNotEmpty))
}

// Configure gradle-changelog-plugin plugin.
// Read more: https://github.com/JetBrains/gradle-changelog-plugin
changelog {
    version = properties("pluginVersion")
    groups = emptyList()
}

// Configure detekt plugin.
// Read more: https://detekt.github.io/detekt/kotlindsl.html
detekt {
    config = files("./detekt-config.yml")
    buildUponDefaultConfig = true

    reports {
        html.enabled = false
        xml.enabled = false
        txt.enabled = false
    }
}

tasks {
    // Set the compatibility versions to 1.8
    withType<JavaCompile> {
        sourceCompatibility = "1.8"
        targetCompatibility = "1.8"
    }
    withType<KotlinCompile> {
        kotlinOptions.jvmTarget = "1.8"
    }

    withType<Detekt> {
        jvmTarget = "1.8"
    }

    patchPluginXml {
        version.set(properties("pluginVersion"))
        sinceBuild.set(properties("pluginSinceBuild"))
        untilBuild.set(properties("pluginUntilBuild"))

        // Extract the <!-- Plugin description --> section from README.md and provide for the plugin's manifest
        pluginDescription.set(
            File(projectDir, "README.md").readText().lines().run {
                val start = "<!-- Plugin description -->"
                val end = "<!-- Plugin description end -->"

                if (!containsAll(listOf(start, end))) {
                    throw GradleException("Plugin description section not found in README.md:\n$start ... $end")
                }
                subList(indexOf(start) + 1, indexOf(end))
            }.joinToString("\n").run { markdownToHTML(this) }
        )

        // Get the latest available change notes from the changelog file
        changeNotes.set(provider { changelog.getLatest().toHTML() })
    }

    runPluginVerifier {
        ideVersions.set(properties("pluginVerifierIdeVersions").split(',').map(String::trim).filter(String::isNotEmpty))
    }

    publishPlugin {
        dependsOn("patchChangelog")
        token.set(System.getenv("PUBLISH_TOKEN"))
        // pluginVersion is based on the SemVer (https://semver.org) and supports pre-release labels, like 2.1.7-alpha.3
        // Specify pre-release label to publish the plugin in a custom Release Channel automatically. Read more:
        // https://plugins.jetbrains.com/docs/intellij/deployment.html#specifying-a-release-channel
        channels.set(listOf(properties("pluginVersion").split('-').getOrElse(1) { "default" }.split('.').first()))
    }
}
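Judging by the compiler messages in the output above ("Type mismatch: inferred type is String but Property<String> was expected"), the changelog extension seems to expose version and groups through Gradle's lazy Property API, just like the intellij block already does. A sketch of what that block would look like in the lazy style (an assumption based on the error messages, not verified against the plugin's documentation):

```kotlin
// Sketch: call .set(...) on the lazy properties instead of using plain
// assignment, mirroring the style of the intellij { } block above.
changelog {
    version.set(properties("pluginVersion"))
    groups.set(emptyList())
}
```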
For the upcoming 2023 new year, I wanted to try moving my development environment to Vim or Neovim. I have gone through a bit of setup already and have Go and JS/TS set up and apparently working just fine: autocomplete, linting, and import management.
Trying to get lsp-zero and Java working, though, is turning out to be a nightmare (because of course Java would be the problem child). I opened a Java file, lsp-zero was baller and asked to install jdtls, which appears to have worked, and voilà... nothing. I just have syntax highlighting. No autocomplete or import management.
I added the following to test:
-- configure an individual server
lsp.configure('jdtls', {
    flags = {
        debounce_text_changes = 150,
    },
    on_attach = function(client, bufnr)
        print('lsp server (jdtls) attached')
    end
})

lsp.configure('gopls', {
    flags = {
        debounce_text_changes = 150,
    },
    on_attach = function(client, bufnr)
        print('lsp server (gopls) attached')
    end
})
Java is not picking up the LSP server; Go picks it up just fine.
Does anyone know of additional configuration that is needed? I am not seeing anything specifically called out.
--- Config edit ---
I updated the config to call the Windows version of the scripts. I also added a data path and root_dir. The LSP still never triggers.
require'lspconfig'.jdtls.setup{
    cmd = {
        'jdtls-win.cmd',
        "-configuration",
        "C:\\Users\\Coury\\AppData\\Local\\nvim-data\\mason\\packages\\jdtls\\config_win",
        "-jar",
        "C:\\Users\\Coury\\AppData\\Local\\nvim-data\\mason\\packages\\jdtls\\plugins\\org.eclipse.equinox.launcher_1.6.400.v20210924-0641.jar",
        "-data",
        "C:\\Users\\Coury\\Documents\\Code\\interviews\\truleo\\app",
    },
    single_file_support = true,
    root_dir = function()
        return "C:\\Users\\Coury\\Documents\\Code\\interviews\\truleo\\app"
    end,
    flags = {
        debounce_text_changes = 150,
    },
    on_attach = function(client, bufnr)
        print('lsp server (jdtls) attached')
    end
}
First, add the Java path to your .bashrc and retry the installation using mason.nvim.
Otherwise, do the following:
Install eclipse.jdt.ls by following its installation instructions.
Add the plugin:
vim-plug: Plug 'mfussenegger/nvim-jdtls'
packer.nvim: use 'mfussenegger/nvim-jdtls'
To solve this, create your personal jdtls config file in your plugins directory, like so:
-- Java.lua
local config = {
    cmd = {
        --
        "java", -- Or the absolute path '/path/to/java11_or_newer/bin/java'
        "-Declipse.application=org.eclipse.jdt.ls.core.id1",
        "-Dosgi.bundles.defaultStartLevel=4",
        "-Declipse.product=org.eclipse.jdt.ls.core.product",
        "-Dlog.protocol=true",
        "-Dlog.level=ALL",
        "-Xms1g",
        "--add-modules=ALL-SYSTEM",
        "--add-opens",
        "java.base/java.util=ALL-UNNAMED",
        "--add-opens",
        "java.base/java.lang=ALL-UNNAMED",
        --
        "-jar",
        "/path/to/jdtls_install_location/plugins/org.eclipse.equinox.launcher_VERSION_NUMBER.jar",
        "-configuration", "/path/to/jdtls_install_location/config_SYSTEM",
        "-data", "/Users/YOUR_MACHINE_NAME/local/share/nvim/java"
    },
    settings = {
        java = {
            signatureHelp = { enabled = true },
            import = { enabled = true },
            rename = { enabled = true }
        }
    },
    init_options = {
        bundles = {}
    }
}
Source the new config and open any java file.
I recommend using mfussenegger/nvim-jdtls to run and configure the language server.
It's simply a matter of setting up an ftplugin for Java that calls jdtls.start_or_attach(jdtls_config) whenever a Java file/repo is opened. This starts the language server and attaches it to your buffer, which you can verify with :LspInfo.
ftplugin/java.lua:
local jdtls_config = require("myconfig.lsp.jdtls").setup()
local pkg_status, jdtls = pcall(require, "jdtls")
if not pkg_status then
    vim.notify("unable to load nvim-jdtls", "error")
    return
end
jdtls.start_or_attach(jdtls_config)
And here is the corresponding config using jdtls (installed via Mason).
You may want to provide your own capabilities and on_attach functions, but otherwise this should give you a good nudge in the right direction.
myconfig/lsp/jdtls.lua
local opts = {
    cmd = {},
    settings = {
        java = {
            signatureHelp = { enabled = true },
            completion = {
                favoriteStaticMembers = {},
                filteredTypes = {
                    -- "com.sun.*",
                    -- "io.micrometer.shaded.*",
                    -- "java.awt.*",
                    -- "jdk.*",
                    -- "sun.*",
                },
            },
            sources = {
                organizeImports = {
                    starThreshold = 9999,
                    staticStarThreshold = 9999,
                },
            },
            codeGeneration = {
                toString = {
                    template = "${object.className}{${member.name()}=${member.value}, ${otherMembers}}",
                },
                useBlocks = true,
            },
            configuration = {
                runtimes = {
                    {
                        name = "JavaSE-1.8",
                        path = "/Library/Java/JavaVirtualMachines/amazon-corretto-8.jdk/Contents/Home",
                        default = true,
                    },
                    {
                        name = "JavaSE-17",
                        path = "/Library/Java/JavaVirtualMachines/jdk-17.jdk/Contents/Home",
                    },
                    {
                        name = "JavaSE-19",
                        path = "/Library/Java/JavaVirtualMachines/jdk-19.jdk/Contents/Home",
                    },
                },
            },
        },
    },
}

local function setup()
    local pkg_status, jdtls = pcall(require, "jdtls")
    if not pkg_status then
        vim.notify("unable to load nvim-jdtls", "error")
        return {}
    end

    -- local jdtls_path = vim.fn.stdpath("data") .. "/mason/packages/jdtls"
    local jdtls_bin = vim.fn.stdpath("data") .. "/mason/bin/jdtls"

    local root_markers = { ".gradle", "gradlew", ".git" }
    local root_dir = jdtls.setup.find_root(root_markers)
    local home = os.getenv("HOME")
    local project_name = vim.fn.fnamemodify(root_dir, ":p:h:t")
    local workspace_dir = home .. "/.cache/jdtls/workspace/" .. project_name

    opts.cmd = {
        jdtls_bin,
        "-data",
        workspace_dir,
    }

    local on_attach = function(client, bufnr)
        jdtls.setup.add_commands() -- important to ensure you can update configs when build is updated
        -- if you setup DAP according to https://github.com/mfussenegger/nvim-jdtls#nvim-dap-configuration you can uncomment below
        -- jdtls.setup_dap({ hotcodereplace = "auto" })
        -- jdtls.dap.setup_dap_main_class_configs()
        -- you may want to also run your generic on_attach() function used by your LSP config
    end
    opts.on_attach = on_attach
    opts.capabilities = vim.lsp.protocol.make_client_capabilities()

    return opts
end

return { setup = setup }
These examples are yanked from my personal Neovim config (jdtls config). Hope this helps get you rolling.
Also make sure you have JDK 17+ available for jdtls (I launch Neovim with JAVA_HOME set to my JDK 17 install).
(Your code can still be compiled by and run on JDK 8 -- I successfully work on Gradle projects that are built with JDK 8, no problems with this config.)
I'm developing a web application to run native Mongo queries through the Java driver, so that I can see the results in a good UI. I didn't find a straightforward way to do that, but running JS functions seems to be one way.
I can run the following script from the mongo shell:
rs1:PRIMARY> function showShortedItems() { return db.Items.find({});}
rs1:PRIMARY> showShortedItems()
But trying the same thing from the Java driver fails:
val db = connection.getDatabase(Database.Name)
val command = new BasicDBObject("eval", "function() { return db.Items.find(); }")
val result = db.runCommand(command)
Error:
Caused by: com.mongodb.MongoCommandException: Command failed with error 13: 'not authorized on shipping-db to execute command { eval: "function() { return db.Items.find(); }" }' on server localhost:27017.
The full response is { "ok" : 0.0, "errmsg" : "not authorized on shipping-db to execute command { eval: \"function() { return db.Items.find(); }\" }", "code" : 13 }
rs1:PRIMARY> db.system.users.find({}) is empty.
mongo.conf
storage:
  journal:
    enabled: false
I fetched the Spring Framework source code from GitHub using the git clone command. When I build the source code with gradle build in the source directory, it takes a long time to download dependencies and compile the Java code, and then it fails with an exception. The output is below.
:spring-webmvc-portlet:sourcesJar UP-TO-DATE
:spring-webmvc-tiles2:javadoc SKIPPED
:spring-webmvc-tiles2:javadocJar SKIPPED
:spring-webmvc-tiles2:sourcesJar SKIPPED
:spring-websocket:javadoc UP-TO-DATE
:spring-websocket:javadocJar UP-TO-DATE
:spring-websocket:sourcesJar UP-TO-DATE
:distZip
FAILURE: Build failed with an exception.
* What went wrong:
Failed to capture snapshot of input files for task 'distZip' during up-to-date c
heck. See stacktrace for details.
> java.io.FileNotFoundException: C:\Users\yuqing\workspace\spring-framework\buil
d\distributions\spring-framework-4.3.0.BUILD-SNAPSHOT-schema.zip
* Try:
Run with --info or --debug option to get more log output.
* Exception is:
org.gradle.api.UncheckedIOException: Failed to capture snapshot of input files f
or task 'distZip' during up-to-date check. See stacktrace for details.
at org.gradle.api.internal.changedetection.rules.TaskUpToDateState.<init
>(TaskUpToDateState.java:60)
at org.gradle.api.internal.changedetection.changes.DefaultTaskArtifactSt
ateRepository$TaskArtifactStateImpl.getStates(DefaultTaskArtifactStateRepository
.java:132)
at org.gradle.api.internal.changedetection.changes.DefaultTaskArtifactSt
ateRepository$TaskArtifactStateImpl.isUpToDate(DefaultTaskArtifactStateRepositor
y.java:70)
at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.exec
ute(SkipUpToDateTaskExecuter.java:52)
at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execut
e(ValidatingTaskExecuter.java:58)
at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecu
ter.execute(SkipEmptySourceFilesTaskExecuter.java:52)
at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter
.execute(SkipTaskWithNoActionsExecuter.java:52)
at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execut
e(SkipOnlyIfTaskExecuter.java:53)
at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter
.execute(ExecuteAtMostOnceTaskExecuter.java:43)
at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTa
skWorker.execute(DefaultTaskGraphExecuter.java:203)
at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTa
skWorker.execute(DefaultTaskGraphExecuter.java:185)
at org.gradle.execution.taskgraph.AbstractTaskPlanExecutor$TaskExecutorW
orker.processTask(AbstractTaskPlanExecutor.java:66)
at org.gradle.execution.taskgraph.AbstractTaskPlanExecutor$TaskExecutorW
orker.run(AbstractTaskPlanExecutor.java:50)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor.process(Defaul
tTaskPlanExecutor.java:25)
at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter.execute(Defau
ltTaskGraphExecuter.java:110)
at org.gradle.execution.SelectedTaskExecutionAction.execute(SelectedTask
ExecutionAction.java:37)
at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecute
r.java:37)
at org.gradle.execution.DefaultBuildExecuter.access$000(DefaultBuildExec
uter.java:23)
at org.gradle.execution.DefaultBuildExecuter$1.proceed(DefaultBuildExecu
ter.java:43)
at org.gradle.execution.DryRunBuildExecutionAction.execute(DryRunBuildEx
ecutionAction.java:32)
at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecute
r.java:37)
at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecute
r.java:30)
at org.gradle.initialization.DefaultGradleLauncher$4.run(DefaultGradleLa
uncher.java:154)
at org.gradle.internal.Factories$1.create(Factories.java:22)
at org.gradle.internal.progress.DefaultBuildOperationExecutor.run(Defaul
tBuildOperationExecutor.java:90)
at org.gradle.internal.progress.DefaultBuildOperationExecutor.run(Defaul
tBuildOperationExecutor.java:52)
at org.gradle.initialization.DefaultGradleLauncher.doBuildStages(Default
GradleLauncher.java:151)
at org.gradle.initialization.DefaultGradleLauncher.access$200(DefaultGra
dleLauncher.java:32)
at org.gradle.initialization.DefaultGradleLauncher$1.create(DefaultGradl
eLauncher.java:99)
at org.gradle.initialization.DefaultGradleLauncher$1.create(DefaultGradl
eLauncher.java:93)
at org.gradle.internal.progress.DefaultBuildOperationExecutor.run(Defaul
tBuildOperationExecutor.java:90)
at org.gradle.internal.progress.DefaultBuildOperationExecutor.run(Defaul
tBuildOperationExecutor.java:62)
at org.gradle.initialization.DefaultGradleLauncher.doBuild(DefaultGradle
Launcher.java:93)
at org.gradle.initialization.DefaultGradleLauncher.run(DefaultGradleLaun
cher.java:82)
at org.gradle.launcher.exec.InProcessBuildActionExecuter$DefaultBuildCon
....
Caused by: org.gradle.api.UncheckedIOException: java.io.FileNotFoundException: C
:\Users\yuqing\workspace\spring-framework\build\distributions\spring-framework-4
.3.0.BUILD-SNAPSHOT-schema.zip (系统找不到指定的文件。)
at org.gradle.internal.hash.HashUtil.createHash(HashUtil.java:39)
at org.gradle.api.internal.hash.DefaultHasher.hash(DefaultHasher.java:24
)
at org.gradle.api.internal.changedetection.state.CachingFileSnapshotter.
snapshot(CachingFileSnapshotter.java:57)
at org.gradle.api.internal.changedetection.state.CachingFileSnapshotter.
snapshot(CachingFileSnapshotter.java:46)
at org.gradle.api.internal.changedetection.state.CachingFileSnapshotter.
snapshot(CachingFileSnapshotter.java:29)
at org.gradle.api.internal.changedetection.state.DefaultFileCollectionSn
apshotter$1.run(DefaultFileCollectionSnapshotter.java:70)
at org.gradle.internal.Factories$1.create(Factories.java:22)
at org.gradle.cache.internal.DefaultCacheAccess.useCache(DefaultCacheAcc
ess.java:192)
at org.gradle.cache.internal.DefaultCacheAccess.useCache(DefaultCacheAcc
ess.java:175)
at org.gradle.cache.internal.DefaultPersistentDirectoryStore.useCache(De
faultPersistentDirectoryStore.java:106)
at org.gradle.cache.internal.DefaultCacheFactory$ReferenceTrackingCache.
useCache(DefaultCacheFactory.java:187)
at org.gradle.api.internal.changedetection.state.DefaultTaskArtifactStat
eCacheAccess.useCache(DefaultTaskArtifactStateCacheAccess.java:60)
at org.gradle.api.internal.changedetection.state.DefaultFileCollectionSn
apshotter.snapshot(DefaultFileCollectionSnapshotter.java:62)
at org.gradle.api.internal.changedetection.rules.TaskUpToDateState.<init
>(TaskUpToDateState.java:56)
... 57 more
Caused by: java.io.FileNotFoundException: C:\Users\yuqing\workspace\spring-frame
work\build\distributions\spring-framework-4.3.0.BUILD-SNAPSHOT-schema.zip
at org.gradle.internal.hash.HashUtil.createHash(HashUtil.java:34)
... 70 more
BUILD FAILED
Total time: 20.861 secs
The error indicates that the file spring-framework-4.3.0.BUILD-SNAPSHOT-schema.zip was not found. I wondered why: it should have been created by the schemaZip task before distZip runs, but it wasn't.
I googled this problem but did not find any question about it.
I have resolved this problem; the devil is the path separator, because the command runs on Windows.
Modify the schemaZip task definition from
task schemaZip(type: Zip) {
    group = "Distribution"
    baseName = "spring-framework"
    classifier = "schema"
    description = "Builds -${classifier} archive containing all " +
            "XSDs for deployment at http://springframework.org/schema."
    duplicatesStrategy 'exclude'
    moduleProjects.each { subproject ->
        def Properties schemas = new Properties();

        subproject.sourceSets.main.resources.find {
            it.path.endsWith("META-INF/spring.schemas")
        }?.withInputStream { schemas.load(it) }

        for (def key : schemas.keySet()) {
            def shortName = key.replaceAll(/http.*schema.(.*).spring-.*/, '$1')
            assert shortName != key
            File xsdFile = subproject.sourceSets.main.resources.find {
                it.path.endsWith(schemas.get(key))
            }
            assert xsdFile != null
            into (shortName) {
                from xsdFile.path
            }
        }
    }
}
to
task schemaZip(type: Zip) {
    group = "Distribution"
    baseName = "spring-framework"
    classifier = "schema"
    description = "Builds -${classifier} archive containing all " +
            "XSDs for deployment at http://springframework.org/schema."
    duplicatesStrategy 'exclude'
    moduleProjects.each { subproject ->
        def Properties schemas = new Properties();

        subproject.sourceSets.main.resources.find {
            it.path.endsWith("META-INF\\spring.schemas")
        }?.withInputStream { schemas.load(it) }

        for (def key : schemas.keySet()) {
            def shortName = key.replaceAll(/http.*schema.(.*).spring-.*/, '$1')
            assert shortName != key
            File xsdFile = subproject.sourceSets.main.resources.find {
                it.path.endsWith(schemas.get(key).replaceAll('\\/','\\\\'))
            }
            assert xsdFile != null
            into (shortName) {
                from xsdFile.path
            }
        }
    }
}
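An alternative to hard-coding backslashes into the match strings is to normalize separators once before the suffix check, which keeps the script portable across operating systems. A minimal sketch of the idea in plain code (the object and method names are mine, for illustration only):

```scala
object PathMatch {
  // Normalize Windows backslashes to forward slashes before matching,
  // so the same suffix check works regardless of the OS separator.
  def endsWithPortable(path: String, suffix: String): Boolean =
    path.replace('\\', '/').endsWith(suffix)

  def main(args: Array[String]): Unit = {
    val windowsPath = "build\\resources\\main\\META-INF\\spring.schemas"
    // A forward-slash suffix check silently misses on a Windows-style path:
    println(windowsPath.endsWith("META-INF/spring.schemas"))          // false
    // Normalizing first makes it match:
    println(endsWithPortable(windowsPath, "META-INF/spring.schemas")) // true
  }
}
```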
I've run into an issue while attempting to parse JSON in my Spark job. I'm using Spark 1.1.0, json4s, and the Cassandra Spark Connector, with DSE 4.6. The exception thrown is:
org.json4s.package$MappingException: Can't find constructor for BrowserData org.json4s.reflect.ScalaSigReader$.readConstructor(ScalaSigReader.scala:27)
org.json4s.reflect.Reflector$ClassDescriptorBuilder.ctorParamType(Reflector.scala:108)
org.json4s.reflect.Reflector$ClassDescriptorBuilder$$anonfun$6.apply(Reflector.scala:98)
org.json4s.reflect.Reflector$ClassDescriptorBuilder$$anonfun$6.apply(Reflector.scala:95)
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
My code looks like this:
case class BrowserData(navigatorObjectData: Option[NavigatorObjectData],
                       flash_version: Option[FlashVersion],
                       viewport: Option[Viewport],
                       performanceData: Option[PerformanceData])

.... other case classes

def parseJson(b: Option[String]): Option[String] = {
  implicit val formats = DefaultFormats
  for {
    browserDataStr <- b
    browserData = parse(browserDataStr).extract[BrowserData]
    navObject <- browserData.navigatorObjectData
    userAgent <- navObject.userAgent
  } yield (userAgent)
}

def getJavascriptUa(rows: Iterable[com.datastax.spark.connector.CassandraRow]): Option[String] = {
  implicit val formats = DefaultFormats
  rows.collectFirst { case r if r.getStringOption("browser_data").isDefined =>
    parseJson(r.getStringOption("browser_data"))
  }.flatten
}

def getRequestUa(rows: Iterable[com.datastax.spark.connector.CassandraRow]): Option[String] = {
  rows.collectFirst { case r if r.getStringOption("ua").isDefined =>
    r.getStringOption("ua")
  }.flatten
}

def checkUa(rows: Iterable[com.datastax.spark.connector.CassandraRow], sessionId: String): Option[Boolean] = {
  for {
    jsUa <- getJavascriptUa(rows)
    reqUa <- getRequestUa(rows)
  } yield (jsUa == reqUa)
}

def run(name: String) = {
  val rdd = sc.cassandraTable("beehive", name).groupBy(r => r.getString("session_id"))
  val counts = rdd.map(r => (checkUa(r._2, r._1)))
  counts
}
I use :load to load the file into the REPL and then call the run function. As far as I can tell, the failure happens in the parseJson function. I've tried a variety of things to get this working. Based on similar posts, I've made sure my case classes are at the top level of the file. I've tried compiling just the case class definitions into a jar and including it like this: /usr/bin/dse spark --jars case_classes.jar
I've also tried adding them to the conf like this: sc.getConf.setJars(Seq("/home/ubuntu/case_classes.jar"))
And still the same error. Should I compile all of my code into a jar? Is this a Spark issue or a json4s issue? Any help at all appreciated.
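One detail that may explain the constructor error: classes defined through the REPL (including via :load) are compiled as inner classes of the REPL's wrapper object, so their constructors take a hidden reference to the enclosing instance, and json4s's reflector then cannot find a constructor it can call with just the JSON fields. A small plain-Scala illustration of the difference (no json4s involved; Wrapper and the field names are made up for the demo):

```scala
// A top-level case class: its constructor takes only the declared fields.
case class TopLevel(a: Int)

// A case class nested inside another class behaves like a REPL-defined one:
// its constructor additionally needs the enclosing instance.
class Wrapper {
  case class Inner(a: Int)
}

object ConstructorDemo {
  def main(args: Array[String]): Unit = {
    println(classOf[TopLevel].getDeclaredConstructors.head.getParameterCount)      // 1
    println(classOf[Wrapper#Inner].getDeclaredConstructors.head.getParameterCount) // 2
  }
}
```

If that is what is happening here, it would be consistent with the fix already being attempted: compiling the case classes (and ideally the parsing code that calls extract) into a jar shipped with --jars, rather than defining them in the REPL session.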