No such DSL method in Jenkins Declarative pipeline - java
I have the following pipeline file, which is meant to create an instance in OpenStack.
@Library('dev-shared-lib@master') _
pipeline {
    agent any
    stages {
        stage('Create OpenStack Instance') {
            environment {
                NAME_OF_THE_INSTANCE = "dev-setup"
                IMAGE_REFERENCE = "8917nej3" // Ubuntu-18.04-LTS
                KEYNAME = "user"
                FLAVOR_REFERENCE = "eseeseee68fa1d2d5" // t2.large-80G
                MAX_INSTANCE_COUNT = 1
                MINIMUM_INSTANCE_COUNT = 1
                NETWORK_ID = "i2a4b7428506" // dev-private-network
                SECURITY_GROUP_NAME = "default"
                REQUEST_JSON = ""
            }
            steps {
                script {
                    try {
                        // Request to create an instance at OpenStack
                        REQUEST_JSON = openStackClient.createAnInstance "${NAME_OF_THE_INSTANCE}", "${IMAGE_REFERENCE}", "${KEYNAME}", "${FLAVOR_REFERENCE}", "${NETWORK_ID}", "${SECURITY_GROUP_NAME}", "${MAX_INSTANCE_COUNT}", "${MINIMUM_INSTANCE_COUNT}"
                        echo "${REQUEST_JSON}"
                    } catch (exception) {
                        log.error(exception)
                    }
                }
            }
        }
    }
}
In the dev-shared-lib@master library being used above, I have the openStackClient.groovy file located in the vars/ directory with the following content:
def createAnInstance(String nameOfTheInstance, String imageReference, String keyName, String flavorReference,
                     String networkId, String securityGroupName, int maxInstanceCount, int minimumInstanceCount) {
    return "Hi "
}
The shared library is correctly configured in Jenkins, and I have it just return "Hi " to check whether it is working. When the above pipeline is run, the following error is thrown. What am I doing wrong here?
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
java.lang.NoSuchMethodError: No such DSL method 'createAnInstance' found among steps [ansiColor, archive, bat, build, catchError, checkout, deleteDir, dir, dockerFingerprintFrom, dockerFingerprintRun, echo, emailext, emailextrecipients, envVarsForTool, error, fileExists, findBuildScans, getContext, git, input, isUnix, jiraComment, jiraIssueSelector, jiraSearch, junit, library, libraryResource, load, lock, mail, milestone, node, parallel, powershell, properties, publishHTML, pwd, readFile, readTrusted, resolveScm, retry, script, sh, sleep, stage, stash, step, svn, timeout, timestamps, tm, tool, unarchive, unstable, unstash, validateDeclarativePipeline, waitUntil, warnError, withContext, withCredentials, withDockerContainer, withDockerRegistry, withDockerServer, withEnv, wrap, writeFile, ws] or symbols [all, allOf, always, ant, antFromApache, antOutcome, antTarget, any, anyOf, apiToken, architecture, archiveArtifacts, artifactManager, authorizationMatrix, batchFile, bitbucket, booleanParam, branch, brokenBuildSuspects, brokenTestsSuspects, buildButton, buildDiscarder, buildingTag, caseInsensitive, caseSensitive, certificate, changeRequest, changelog, changeset, checkoutToSubdirectory, choice, choiceParam, cleanWs, clock, cloud, command, credentials, cron, crumb, culprits, defaultView, demand, developers, disableConcurrentBuilds, disableResume, docker, dockerCert, dockerfile, downloadSettings, downstream, dumb, durabilityHint, envVars, environment, equals, expression, file, fileParam, filePath, fingerprint, frameOptions, freeStyle, freeStyleJob, fromScm, fromSource, git, gitHubBranchDiscovery, gitHubBranchHeadAuthority, gitHubForkDiscovery, gitHubSshCheckout, gitHubTagDiscovery, gitHubTrustContributors, gitHubTrustEveryone, gitHubTrustNobody, gitHubTrustPermissions, github, githubPush, gradle, headRegexFilter, headWildcardFilter, hyperlink, hyperlinkToModels, inheriting, inheritingGlobal, installSource, isRestartedRun, jdk, jdkInstaller, jgit, jgitapache, jnlp, jobName, label, lastDuration, lastFailure, lastGrantedAuthorities, lastStable, lastSuccess, legacy, legacySCM, list, local, location, logRotator, loggedInUsersCanDoAnything, masterBuild, maven, maven3Mojos, mavenErrors, mavenMojos, mavenWarnings, modernSCM, myView, newContainerPerStage, node, nodeProperties, nonInheriting, none, not, overrideIndexTriggers, paneStatus, parallelsAlwaysFailFast, parameters, password, pattern, permanent, pipeline-model, pipelineTriggers, plainText, plugin, pollSCM, preserveStashes, projectNamingStrategy, proxy, queueItemAuthenticator, quietPeriod, rateLimitBuilds, recipients, requestor, run, runParam, sSHLauncher, schedule, scmRetryCount, scriptApprovalLink, search, security, shell, skipDefaultCheckout, skipStagesAfterUnstable, slave, sourceRegexFilter, sourceWildcardFilter, ssh, sshUserPrivateKey, stackTrace, standard, status, string, stringParam, swapSpace, tag, text, textParam, tmpSpace, toolLocation, triggeredBy, unsecured, upstream, upstreamDevelopers, userSeed, usernameColonPassword, usernamePassword, viewsTabBar, weather, withAnt, zfs, zip] or globals [currentBuild, docker, env, log, openStackClient, params, pipeline, scm]
at org.jenkinsci.plugins.workflow.cps.DSL.invokeMethod(DSL.java:202)
at org.jenkinsci.plugins.workflow.cps.CpsScript.invokeMethod(CpsScript.java:122)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1213)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:42)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:160)
at org.kohsuke.groovy.sandbox.GroovyInterceptor.onMethodCall(GroovyInterceptor.java:23)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onMethodCall(SandboxInterceptor.java:157)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onMethodCall(SandboxInterceptor.java:142)
at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:158)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:162)
at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.methodCall(SandboxInvoker.java:17)
at WorkflowScript.run(WorkflowScript:21)
at ___cps.transform___(Native Method)
at com.cloudbees.groovy.cps.impl.ContinuationGroup.methodCall(ContinuationGroup.java:84)
at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.dispatchOrArg(FunctionCallBlock.java:113)
at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.fixArg(FunctionCallBlock.java:83)
at sun.reflect.GeneratedMethodAccessor225.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.dispatchOrArg(FunctionCallBlock.java:107)
at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.fixArg(FunctionCallBlock.java:83)
at sun.reflect.GeneratedMethodAccessor225.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
at com.cloudbees.groovy.cps.impl.ContinuationGroup.methodCall(ContinuationGroup.java:87)
at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.dispatchOrArg(FunctionCallBlock.java:113)
at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.fixArg(FunctionCallBlock.java:83)
at sun.reflect.GeneratedMethodAccessor225.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
at com.cloudbees.groovy.cps.impl.ConstantBlock.eval(ConstantBlock.java:21)
at com.cloudbees.groovy.cps.Next.step(Next.java:83)
at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:174)
at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:163)
at org.codehaus.groovy.runtime.GroovyCategorySupport$ThreadCategoryInfo.use(GroovyCategorySupport.java:129)
at org.codehaus.groovy.runtime.GroovyCategorySupport.use(GroovyCategorySupport.java:268)
at com.cloudbees.groovy.cps.Continuable.run0(Continuable.java:163)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.access$001(SandboxContinuable.java:18)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.run0(SandboxContinuable.java:51)
at org.jenkinsci.plugins.workflow.cps.CpsThread.runNextChunk(CpsThread.java:186)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.run(CpsThreadGroup.java:370)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.access$200(CpsThreadGroup.java:93)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:282)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:270)
at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$2.call(CpsVmExecutorService.java:66)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:131)
at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Finished: FAILURE
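One mismatch is visible in the snippets themselves: createAnInstance declares its last two parameters as int, yet the pipeline passes all eight arguments as quoted GStrings and invokes the method in command syntax, without parentheses. When no overload of a vars global matches the arguments, the call falls through to Jenkins' generic DSL step lookup, which reports exactly this NoSuchMethodError even though openStackClient appears in the globals list. Below is a sketch of a call shaped to match the declared signature; it is an untested illustration against the code shown above, not a quote from the original library:

steps {
    script {
        // Parenthesized call with the two counts coerced to int, matching the
        // signature declared in vars/openStackClient.groovy. Values set in an
        // environment block are always strings, so "1" must be converted.
        def requestJson = openStackClient.createAnInstance(
            env.NAME_OF_THE_INSTANCE,
            env.IMAGE_REFERENCE,
            env.KEYNAME,
            env.FLAVOR_REFERENCE,
            env.NETWORK_ID,
            env.SECURITY_GROUP_NAME,
            env.MAX_INSTANCE_COUNT as int,
            env.MINIMUM_INSTANCE_COUNT as int
        )
        echo requestJson
    }
}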
Related
azure identity java sdk throwing MutableCoercionConfig error
I'm new to the Java library. I learned that the Azure Java SDK can be used in a Scala environment, so I have tried using the azure-identity library in the Databricks environment (DBR version 7.3 LTS, Scala 2.12, Maven coordinate com.azure:azure-identity:1.3.5). Please help me resolve the error. Here is the code:

import com.azure.identity._
import com.azure.identity.DefaultAzureCredentialBuilder

val clientID = dbutils.secrets.get(scope="****", key="****")
val ClientSecret = dbutils.secrets.get(scope="****", key="****")
val tenantID = dbutils.secrets.get(scope="****", key="****")
val endpoint = "****"

val clientSecretCredential: ClientSecretCredential = new ClientSecretCredentialBuilder()
  .tenantId(tenantID)
  .clientId(clientID)
  .clientSecret(ClientSecret)
  .build()

Here is the error I face:

java.lang.NoSuchMethodError: com.fasterxml.jackson.dataformat.xml.XmlMapper.coercionConfigDefaults()Lcom/fasterxml/jackson/databind/cfg/MutableCoercionConfig
	at com.fasterxml.jackson.dataformat.xml.XmlMapper.<init>(XmlMapper.java:145)
	at com.fasterxml.jackson.dataformat.xml.XmlMapper.<init>(XmlMapper.java:127)
	at com.fasterxml.jackson.dataformat.xml.XmlMapper.builder(XmlMapper.java:218)
	at com.azure.core.util.serializer.JacksonAdapter.<init>(JacksonAdapter.java:137)
	at com.azure.core.util.serializer.JacksonAdapter.createDefaultSerializerAdapter(JacksonAdapter.java:189)
	at com.azure.identity.implementation.IdentityClient.<clinit>(IdentityClient.java:96)
	at com.azure.identity.implementation.IdentityClientBuilder.build(IdentityClientBuilder.java:113)
	at com.azure.identity.ClientSecretCredential.<init>(ClientSecretCredential.java:50)
	at com.azure.identity.ClientSecretCredentialBuilder.build(ClientSecretCredentialBuilder.java:76)
	at linea34e0846b16045d09ff78af8b346393c25.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-626790621540495:18)
	at linea34e0846b16045d09ff78af8b346393c25.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-626790621540495:68)
	at linea34e0846b16045d09ff78af8b346393c25.$read$$iw$$iw$$iw$$iw.<init>(command-626790621540495:70)
	at linea34e0846b16045d09ff78af8b346393c25.$read$$iw$$iw$$iw.<init>(command-626790621540495:72)
	at linea34e0846b16045d09ff78af8b346393c25.$read$$iw$$iw.<init>(command-626790621540495:74)
	at linea34e0846b16045d09ff78af8b346393c25.$read$$iw.<init>(command-626790621540495:76)
	at linea34e0846b16045d09ff78af8b346393c25.$read.<init>(command-626790621540495:78)
	at linea34e0846b16045d09ff78af8b346393c25.$read$.<init>(command-626790621540495:82)
	at linea34e0846b16045d09ff78af8b346393c25.$read$.<clinit>(command-626790621540495)
	at linea34e0846b16045d09ff78af8b346393c25.$eval$.$print$lzycompute(<notebook>:7)
	at linea34e0846b16045d09ff78af8b346393c25.$eval$.$print(<notebook>:6)
	at linea34e0846b16045d09ff78af8b346393c25.$eval.$print(<notebook>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:745)
	at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1021)
	at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:574)
	at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
	at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
	at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
	at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:600)
	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:570)
	at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:219)
	at com.databricks.backend.daemon.driver.ScalaDriverLocal.$anonfun$repl$1(ScalaDriverLocal.scala:204)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:789)
	at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:742)
	at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:204)
	at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$10(DriverLocal.scala:431)
	at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:239)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:234)
	at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:231)
	at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:48)
	at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:276)
	at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:269)
	at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:48)
	at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:408)
	at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:653)
	at scala.util.Try$.apply(Try.scala:213)
	at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:645)
	at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:486)
	at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:598)
	at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:391)
	at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
	at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
	at java.lang.Thread.run(Thread.java:748)
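A note on the error itself: XmlMapper.coercionConfigDefaults() and MutableCoercionConfig were only introduced in Jackson 2.12, so this NoSuchMethodError means the runtime is resolving an older jackson-databind than the one the jackson-dataformat-xml/azure-core stack was compiled against. A quick diagnostic to see which Jackson version and jar actually load, sketched here in Groovy shorthand for consistency with the rest of this page (the equivalent reflection calls work the same way from the Scala notebook):

// Report the jackson-databind version and the jar it was loaded from.
def mapper = Class.forName('com.fasterxml.jackson.databind.ObjectMapper')
println "jackson-databind version: ${mapper.getPackage()?.getImplementationVersion()}"
println "loaded from: ${mapper.getProtectionDomain().getCodeSource()?.getLocation()}"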
Java Spark GroupByFailure
I'm attempting to use the Java Spark libraries with a cluster running Spark 2.3.0 over Hadoop 3.1.0 (and using those versions of the Java libraries). I've run into a problem where I simply cannot use groupByKey, and I am at a loss to explain why. Any attempted use of groupByKey, for any reason, in any circumstance, returns a java.lang.IllegalArgumentException. I've boiled this down to about the simplest test I can think of:

package com.failuretest;

import java.util.ArrayList;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class TestReport {

    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("TestReport").set("spark.executor.memory", "20G");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> test = sc.parallelize(generateTestData());
        test.saveAsTextFile("/TEST/testfile1");
        test.mapToPair(line -> {
            String[] testParts = line.split(" ");
            return new Tuple2<String, String>(testParts[0], testParts[1]);
        }).groupByKey().saveAsTextFile("/TEST/testfile2");
        sc.close();
    }

    private static List<String> generateTestData() {
        List<String> testList = new ArrayList<String>();
        int keyCount = 0;
        int valCount = 0;
        while (valCount++ < 2000000) {
            if (valCount % 10 == 0) {
                keyCount++;
            }
            testList.add("Key" + keyCount + " " + "Val" + valCount);
        }
        return testList;
    }
}

I'm just programmatically creating an RDD that produces 10 values per key, then creating my JavaPairRDD with a simple split, then attempting groupByKey. When it runs, I receive the following stack:

Exception in thread "main" java.lang.IllegalArgumentException
	at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
	at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
	at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
	at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
	at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
	at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
	at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
	at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
	at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
	at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)
	at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
	at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
	at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
	at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
	at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
	at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
	at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
	at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
	at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
	at org.apache.spark.SparkContext.clean(SparkContext.scala:2292)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKeyWithClassTag$1.apply(PairRDDFunctions.scala:88)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKeyWithClassTag$1.apply(PairRDDFunctions.scala:77)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
	at org.apache.spark.rdd.PairRDDFunctions.combineByKeyWithClassTag(PairRDDFunctions.scala:77)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$groupByKey$1.apply(PairRDDFunctions.scala:505)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$groupByKey$1.apply(PairRDDFunctions.scala:498)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
	at org.apache.spark.rdd.PairRDDFunctions.groupByKey(PairRDDFunctions.scala:498)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$groupByKey$3.apply(PairRDDFunctions.scala:641)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$groupByKey$3.apply(PairRDDFunctions.scala:641)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
	at org.apache.spark.rdd.PairRDDFunctions.groupByKey(PairRDDFunctions.scala:640)
	at org.apache.spark.api.java.JavaPairRDD.groupByKey(JavaPairRDD.scala:559)
	at com.failuretest.TestReport.main(TestReport.java:22)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:564)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

It doesn't get any further than the groupByKey (I'm writing a file above with the results, but it really doesn't matter, since it never gets there). I can run it all day long in my local dev instance, but running spark-submit with a jar containing the above fails every time in the cluster. I'm really not sure where to go from here: what I am trying to do is a bit of a challenge if I cannot group by key. Am I messing up? Is this a version conflict somewhere?

Dave
I actually figured this out before posting, but in the interest of helping others: I discovered that one of my colleagues had decided to have a play around with Java 10 on this particular cluster. Moving it back to Java 8 (sorry, didn't try 9) made the problem go away. Dave
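That fix matches the stack above: Spark 2.3 cleans closures by reading class files through its bundled ASM 5, which rejects the newer bytecode produced by Java 9 and later, and the java.base/ module prefixes in the trace only appear on Java 9+. Spark 2.3.x officially supports Java 8 only. A guard one could drop into a job to fail fast with a clearer message, sketched in Groovy shorthand (the Java equivalent uses the same System.getProperty call):

// Fail fast if the JVM is not Java 8, which Spark 2.3.x requires.
def javaVersion = System.getProperty('java.version')
if (!javaVersion.startsWith('1.8')) {
    throw new IllegalStateException("Spark 2.3 requires Java 8, but found ${javaVersion}")
}
println "Running on Java ${javaVersion}"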
Nullpointer Error in groovy script while trying to add xpath assertion
I am getting the error java.lang.NullPointerException: Cannot invoke method getAssertionByName() on null object at line 5. However, I am able to add the XPath assertion in the test case manually. As I am new to Groovy, I want to know:

What is the reason that I am getting this error?
How can I implement "select from current" for the XPath assertion, so that I can add a real XPath instead of printing some junk value (I have printed "hello" for now)?

log.info("Testing Start")

def project = context.testCase.testSuite.project
TSName = "ManagePostpayInsurance_1_0"
StepName = "getInsuranceDetails_FC_004"

project.getTestSuiteList().each {
    if (it.name == TSName) {
        TS = it.name
        it.getTestCaseList().each {
            TC = it.name
            def asserting = project.getTestSuiteByName(TS).getTestCaseByName(TC).getTestStepByName(StepName).getAssertionByName("XPath Match")
            log.info(asserting)
            if (asserting instanceof com.eviware.soapui.impl.wsdl.teststeps.assertions.basic.XPathContainsAssertion) {
                project.getTestSuiteByName(TS).getTestCaseByName(TC).getTestStepByName(StepName).removeAssertion(asserting)
            }
            def assertion = project.getTestSuiteByName(TS).getTestCaseByName(TC).getTestStepByName(StepName).addAssertion("XPath Match")
            assertion.path = "declare namespace cor='http://soa.o2.co.uk/coredata_1';\ndeclare namespace man='http://soa.o2.co.uk/managepostpayinsurancedata_1';\ndeclare namespace soapenv='http://schemas.xmlsoap.org/soap/envelope/';\n//man:getInsuranceDetails_1Response"
            assertion.expectedContent = "hello"
        }
    }
}

log.info("Testing Over")

I have attached the error log below.

Mon Nov 27 17:04:12 IST 2017: ERROR: java.lang.NullPointerException: Cannot invoke method getAssertionByName() on null object
java.lang.NullPointerException: Cannot invoke method getAssertionByName() on null object
	at org.codehaus.groovy.runtime.NullObject.invokeMethod(NullObject.java:77)
	at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:45)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
	at org.codehaus.groovy.runtime.callsite.NullCallSite.call(NullCallSite.java:32)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
	at com.eviware.soapui.model.testsuite.Assertable$getAssertionByName.call(Unknown Source)
	at Script10$_run_closure1_closure2.doCall(Script10.groovy:11)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
	at java.lang.reflect.Method.invoke(Unknown Source)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
	at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
	at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:272)
	at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:909)
	at groovy.lang.Closure.call(Closure.java:411)
	at groovy.lang.Closure.call(Closure.java:427)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:1325)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:1297)
	at org.codehaus.groovy.runtime.dgm$148.invoke(Unknown Source)
	at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:271)
	at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:53)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
	at Script10$_run_closure1.doCall(Script10.groovy:9)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
	at java.lang.reflect.Method.invoke(Unknown Source)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
	at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
	at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:272)
	at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:909)
	at groovy.lang.Closure.call(Closure.java:411)
	at groovy.lang.Closure.call(Closure.java:427)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:1325)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:1297)
	at org.codehaus.groovy.runtime.dgm$148.invoke(Unknown Source)
	at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:271)
	at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:53)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
	at Script10.run(Script10.groovy:5)
	at com.eviware.soapui.support.scripting.groovy.SoapUIGroovyScriptEngine.run(SoapUIGroovyScriptEngine.java:90)
	at com.eviware.soapui.impl.wsdl.teststeps.WsdlGroovyScriptTestStep.run(WsdlGroovyScriptTestStep.java:141)
	at com.eviware.soapui.impl.wsdl.panels.teststeps.GroovyScriptStepDesktopPanel$RunAction$1.run(GroovyScriptStepDesktopPanel.java:250)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.lang.Thread.run(Unknown Source)

I'm badly stuck with the above issue; quick help is really appreciated! Thank you very much.
Here you go. Since you are walking the hierarchy anyway, you do not need to refer to the full chain of methods starting from project; instead, you can access the step objects directly once you have browsed down to step level. This way the NPE can be avoided. Here is the fixed script; see the relevant inline comment.

import com.eviware.soapui.impl.wsdl.teststeps.assertions.basic.XPathContainsAssertion

log.info("Testing Start")

def project = context.testCase.testSuite.project
def suiteName = "ManagePostpayInsurance_1_0"
def stepName = "getInsuranceDetails_FC_004"

project.testSuiteList.each { suite ->
    if (suiteName == suite.name) {
        suite.testCaseList.each { kase ->
            kase.testStepList.each { step ->
                if (stepName == step.name) {
                    // Note the change here: getting the assertion directly from the step object
                    def asserting = step.getAssertionByName("XPath Match")
                    log.info(asserting)
                    if (asserting instanceof XPathContainsAssertion) {
                        step.removeAssertion(asserting)
                    }
                    def assertion = step.addAssertion("XPath Match")
                    assertion.path = "declare namespace cor='http://soa.o2.co.uk/coredata_1';\ndeclare namespace man='http://soa.o2.co.uk/managepostpayinsurancedata_1';\ndeclare namespace soapenv='http://schemas.xmlsoap.org/soap/envelope/';\n//man:getInsuranceDetails_1Response"
                    assertion.expectedContent = "hello"
                }
            }
        }
    }
}

log.info 'Testing Over'
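An alternative, if the chained lookup from the question is preferred: Groovy's safe-navigation operator (?.) turns any null link in the chain into a null result instead of a NullPointerException, so a single check covers every name mismatch. A sketch using the names from the question:

def step = project.getTestSuiteByName(TSName)
                  ?.getTestCaseByName(TC)
                  ?.getTestStepByName(StepName)
if (step == null) {
    // One of the suite/case/step names did not match anything in the project.
    log.warn("No test step named ${StepName} found - check TSName/TC/StepName")
} else {
    def asserting = step.getAssertionByName("XPath Match")
    log.info(asserting)
}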
ClassNotFoundException HadoopMapReduceCommitProtocol
I am trying to run a Spark sample in local mode, but am getting the following stack trace:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/internal/io/HadoopMapReduceCommitProtocol
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.spark.sql.internal.SQLConf$.<init>(SQLConf.scala:383)
	at org.apache.spark.sql.internal.SQLConf$.<clinit>(SQLConf.scala)
	at org.apache.spark.sql.internal.StaticSQLConf$$anonfun$buildConf$1.apply(SQLConf.scala:930)
	at org.apache.spark.sql.internal.StaticSQLConf$$anonfun$buildConf$1.apply(SQLConf.scala:928)
	at org.apache.spark.internal.config.TypedConfigBuilder$$anonfun$createWithDefault$1.apply(ConfigBuilder.scala:122)
	at org.apache.spark.internal.config.TypedConfigBuilder$$anonfun$createWithDefault$1.apply(ConfigBuilder.scala:122)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.internal.config.TypedConfigBuilder.createWithDefault(ConfigBuilder.scala:122)
	at org.apache.spark.sql.internal.StaticSQLConf$.<init>(SQLConf.scala:937)
	at org.apache.spark.sql.internal.StaticSQLConf$.<clinit>(SQLConf.scala)
	at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$sessionStateClassName(SparkSession.scala:962)
	at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:111)
	at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
	at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
	at com.megaport.PipelineExample$.main(PipelineExample.scala:37)
	at com.megaport.PipelineExample.main(PipelineExample.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.io.HadoopMapReduceCommitProtocol
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

I can see the class in the GitHub repo, but it is not in the Maven lib, or in the distro (I have the distro bundled with Hadoop) spark-core_2.11-2.0.2.jar. The code I am trying to run is taken from the examples in the Spark distro, and it fails at the getOrCreate stage...

// scalastyle:off println
package com.megaport

// $example on$
import org.apache.spark.ml.{Pipeline, PipelineModel}
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.{HashingTF, Tokenizer}
import org.apache.spark.ml.linalg.Vector
import org.apache.spark.sql.Row
// $example off$
import org.apache.spark.sql.SparkSession

object PipelineExample {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("My Spark Application") // optional and will be autogenerated if not specified
      .master("local[*]")              // avoid hardcoding the deployment environment
      // .enableHiveSupport()          // self-explanatory, isn't it?
      .getOrCreate

    // $example on$
    // Prepare training documents from a list of (id, text, label) tuples.
    val training = spark.createDataFrame(Seq(
      (0L, "a b c d e spark", 1.0),
      (1L, "b d", 0.0),
      (2L, "spark f g h", 1.0),
      (3L, "hadoop mapreduce", 0.0)
    )).toDF("id", "text", "label")

    // Configure an ML pipeline, which consists of three stages: tokenizer, hashingTF, and lr.
    val tokenizer = new Tokenizer()
      .setInputCol("text")
      .setOutputCol("words")
    val hashingTF = new HashingTF()
      .setNumFeatures(1000)
      .setInputCol(tokenizer.getOutputCol)
      .setOutputCol("features")
    val lr = new LogisticRegression()
      .setMaxIter(10)
      .setRegParam(0.01)
    val pipeline = new Pipeline()
      .setStages(Array(tokenizer, hashingTF, lr))

    // Fit the pipeline to training documents.
    val model = pipeline.fit(training)

    // Now we can optionally save the fitted pipeline to disk
    model.write.overwrite().save("/tmp/spark-logistic-regression-model")

    // We can also save this unfit pipeline to disk
    pipeline.write.overwrite().save("/tmp/unfit-lr-model")

    // And load it back in during production
    val sameModel = PipelineModel.load("/tmp/spark-logistic-regression-model")

    // Prepare test documents, which are unlabeled (id, text) tuples.
    val test = spark.createDataFrame(Seq(
      (4L, "spark i j k"),
      (5L, "l m n"),
      (6L, "mapreduce spark"),
      (7L, "apache hadoop")
    )).toDF("id", "text")

    // Make predictions on test documents.
    model.transform(test)
      .select("id", "text", "probability", "prediction")
      .collect()
      .foreach { case Row(id: Long, text: String, prob: Vector, prediction: Double) =>
        println(s"($id, $text) --> prob=$prob, prediction=$prediction")
      }
    // $example off$

    spark.stop()
  }
}
Well, if it's not in your Java library, then you should download the dependent jar and add it. Check this SO question for more details: How to import a jar in Eclipse.
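A more systematic cure than adding a single jar by hand: the trace shows a spark-sql build looking up a class that the spark-core_2.11-2.0.2.jar on the classpath does not contain, which is the classic symptom of mixed Spark module versions. Pinning every Spark artifact to one version avoids it. A Gradle sketch (hypothetical, since the question does not show its build file, and the version below is only an example):

// Keep all Spark modules on a single version and Scala suffix.
def sparkVersion = '2.1.0' // example; use whichever release the cluster runs
dependencies {
    implementation "org.apache.spark:spark-core_2.11:${sparkVersion}"
    implementation "org.apache.spark:spark-sql_2.11:${sparkVersion}"
    implementation "org.apache.spark:spark-mllib_2.11:${sparkVersion}" // for the ml.Pipeline example
}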
Saving List of Objects Throws Hibernate Exception
When I am saving a List of objects by calling saveListOfPageChooserElement, it throws the exception below, whereas when I save a single instance by calling saveOrUpdate, it works fine. To improve performance, however, I want to save the list as one batch rather than one object at a time. Can anyone suggest what the problem is with saving a whole list at once?

List<Abc> listabc = widgetCopyDAO.fetchabcByPageId(id);
for (Abc abc : listabc) {
    abc.setLastUpdatedBy(null);
    abc.setLastUpdatedOn(null);
    abc.setCreatedBy(widgetCopyDTO.getUserName());
    abc.setCreatedOn(new Date());
    abc.setPageChooser(new PageChooser(chooser.getId()));
    abc.setId(0l);
    issuePageWidgetDAO.saveOrUpdate(abc);
}
// widgetCopyDAO.saveListOfPageChooserElement(listabc);

public void saveOrUpdate(Abc abc) {
    if (abc.getId() == 0) {
        Long id = (Long) this.getHibernateTemplate().save(abc);
        abc.setId(id);
    } else {
        this.getHibernateTemplate().update(abc);
    }
}

public void saveListOfPageChooserElement(List<Abc> listabc) {
    this.getHibernateTemplate().saveOrUpdateAll(listabc);
}

The exception is:

org.springframework.orm.hibernate3.HibernateSystemException: identifier of an instance of com.mct.model.Abc was altered from 138 to 0; nested exception is org.hibernate.HibernateException: identifier of an instance of com.mct.model.Abc was altered from 138 to 0
	at org.springframework.orm.hibernate3.SessionFactoryUtils.convertHibernateAccessException(SessionFactoryUtils.java:676)
	at org.springframework.orm.hibernate3.HibernateAccessor.convertHibernateAccessException(HibernateAccessor.java:412)
	at org.springframework.orm.hibernate3.HibernateTemplate.doExecute(HibernateTemplate.java:424)
	at org.springframework.orm.hibernate3.HibernateTemplate.executeWithNativeSession(HibernateTemplate.java:374)
	at org.springframework.orm.hibernate3.HibernateTemplate.findByCriteria(HibernateTemplate.java:1055)
	at org.springframework.orm.hibernate3.HibernateTemplate.findByCriteria(HibernateTemplate.java:1048)
	at com.mct.dao.WidgetCopyDAO.fetchPageChooserWithImagesByChooser(WidgetCopyDAO.java:82)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
	at java.lang.reflect.Method.invoke(Unknown Source)
	at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
	at org.springframework.aop.framework.adapter.ThrowsAdviceInterceptor.invoke(ThrowsAdviceInterceptor.java:126)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
	at org.springframework.aop.framework.adapter.AfterReturningAdviceInterceptor.invoke(AfterReturningAdviceInterceptor.java:50)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
	at org.springframework.aop.framework.adapter.MethodBeforeAdviceInterceptor.invoke(MethodBeforeAdviceInterceptor.java:50)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
	at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:106)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
	at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
	at $Proxy58.fetchPageChooserWithImagesByChooser(Unknown Source)
	at com.mct.service.widgethelper.ChooserWidget.copyWidget(ChooserWidget.java:676)
	at com.mct.service.widgethelper.ChooserWidget.copyAllWidgets(ChooserWidget.java:634)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
You set the IDs of all objects in the list:

abc.setId(0l);

And that's what causes the error. You cannot change an auto-generated ID yourself. Remove this line.
In Hibernate, you can't set an auto-generated ID manually, as done below:

abc.setId(0l);

Remove the line above and try again.
The problem appears to be this line:

abc.setId(0l);

You are clearing the IDs of the entities you've loaded from the database.
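All three answers point at the same root cause: the fetched entities are still managed by the Hibernate session, so overwriting their identifiers makes the session see persistent rows whose IDs changed, hence the exception. To insert copies as a batch, build fresh transient instances and leave the identifier unset. A Groovy-shorthand sketch built from the setters shown in the question (it assumes Abc has a no-arg constructor, and any remaining business fields would still need to be copied from each source object):

// Build transient copies instead of mutating the managed entities.
def copies = listabc.collect { src ->
    def copy = new Abc()
    copy.lastUpdatedBy = null
    copy.lastUpdatedOn = null
    copy.createdBy = widgetCopyDTO.userName
    copy.createdOn = new Date()
    copy.pageChooser = new PageChooser(chooser.id)
    // copy the remaining business fields from src here
    copy // identifier left unset, so Hibernate treats each copy as a new row
}
widgetCopyDAO.saveListOfPageChooserElement(copies)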