Running Hadoop Examples Jar - java

Please help. I searched online and couldn't find anything; most of the similar questions are unanswered or unhelpful.
I am trying to run the Hadoop Pi example. My setup is all done and working: I ran bin/hadoop dfs -ls and got no error.
But this fails:
Sanjanas-MacBook-Pro:hadoop sanjanaagarwal$ /usr/local/Hadoop/bin/hadoop jar $HADOOP_HOME/hadoop-examples-*.jar pi 10 100
Number of Maps = 10
Samples per Map = 100
13/11/21 20:57:47 WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/sanjanaagarwal/PiEstimator_TMP_3_141592654/in/part0 could only be replicated to 0 nodes, instead of 1
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1591)
at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1439)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1435)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:394)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1433)
at org.apache.hadoop.ipc.Client.call(Client.java:1150)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
at com.sun.proxy.$Proxy0.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
at com.sun.proxy.$Proxy0.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3719)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3586)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2400(DFSClient.java:2792)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2987)
13/11/21 20:57:47 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
13/11/21 20:57:47 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/sanjanaagarwal/PiEstimator_TMP_3_141592654/in/part0" - Aborting...
org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/sanjanaagarwal/PiEstimator_TMP_3_141592654/in/part0 could only be replicated to 0 nodes, instead of 1
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1591)
at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1439)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1435)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:394)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1433)
at org.apache.hadoop.ipc.Client.call(Client.java:1150)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
at com.sun.proxy.$Proxy0.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
at com.sun.proxy.$Proxy0.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3719)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3586)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2400(DFSClient.java:2792)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2987)
13/11/21 20:57:47 ERROR hdfs.DFSClient: Exception closing file /user/sanjanaagarwal/PiEstimator_TMP_3_141592654/in/part0 : org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/sanjanaagarwal/PiEstimator_TMP_3_141592654/in/part0 could only be replicated to 0 nodes, instead of 1
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1591)
at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1439)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1435)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:394)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1433)
org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/sanjanaagarwal/PiEstimator_TMP_3_141592654/in/part0 could only be replicated to 0 nodes, instead of 1
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1591)
at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1439)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1435)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:394)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1433)
at org.apache.hadoop.ipc.Client.call(Client.java:1150)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
at com.sun.proxy.$Proxy0.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
at com.sun.proxy.$Proxy0.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3719)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3586)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2400(DFSClient.java:2792)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2987)
My hadoop-env.sh is all set, and I still don't know what the problem is. If there is something I should check, please mention the exact command, since I am pretty new to Linux. Thanks.

I think the DataNode of your cluster is not running. Check using the jps command, as sketched below.
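For example, assuming a Hadoop 1.x single-node setup started with start-all.sh (adjust paths to your install):
$ jps
If DataNode is missing from the output, stopping and restarting the daemons often brings it back:
$ $HADOOP_HOME/bin/stop-all.sh
$ $HADOOP_HOME/bin/start-all.sh
$ jps
A healthy pseudo-distributed node should list NameNode, SecondaryNameNode, DataNode, JobTracker, and TaskTracker. If DataNode still refuses to start, its log under $HADOOP_HOME/logs usually says why.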

I couldn't solve the error, so I found a workaround: I downloaded a Cloudera VM, loaded it into VirtualBox, and ran Cloudera on it.

Related

Titan with HBase on Windows 7: configuration

I am struggling to set up Titan with HBase. Below are the steps I followed:
1. Downloaded titan-hbase.
2. Downloaded Cygwin.
3. Installed HBase (following http://hbase.apache.org/cygwin.html).
4. HBase is running in Cygwin.
5. Titan is running on my Windows machine and the basic Gremlin console comes up.
Now I want to use HBase as the storage backend for Titan.
I don't understand how to configure Titan so that it maps to HBase (which is running in the Cygwin shell).
I am trying something like this in my Titan console:
gremlin> TitanGraph g = TitanFactory.open('conf\titan-hbase-es.properties');
I get the error below:
Could not instantiate implementation: com.thinkaurelius.titan.diskstorage.berkeleyje.BerkeleyJEStoreManager
Display stack trace? [yN]
java.lang.IllegalArgumentException: Could not instantiate implementation:
com.thinkaurelius.titan.diskstorage.berkeleyje.BerkeleyJEStoreManager
at com.thinkaurelius.titan.diskstorage.Backend.instantiate(Backend.java:355)
at com.thinkaurelius.titan.diskstorage.Backend.getImplementationClass(Backend.java:367)
at com.thinkaurelius.titan.diskstorage.Backend.getStorageManager(Backend.java:311)
at com.thinkaurelius.titan.diskstorage.Backend.<init>(Backend.java:121)
at com.thinkaurelius.titan.graphdb.configuration.GraphDatabaseConfiguration.getBackend(GraphDatabaseConfiguration.java:1173)
at com.thinkaurelius.titan.graphdb.database.StandardTitanGraph.<init>(StandardTitanGraph.java:75)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:40)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:29)
at com.thinkaurelius.titan.core.TitanFactory$open.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at groovysh_evaluate.run(groovysh_evaluate:56)
at groovysh_evaluate$run.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:112)
at org.codehaus.groovy.tools.shell.Interpreter.evaluate(Interpreter.groovy:67)
at org.codehaus.groovy.tools.shell.Interpreter$evaluate.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at org.codehaus.groovy.tools.shell.Groovysh.execute(Groovysh.groovy:152)
at org.codehaus.groovy.tools.shell.Shell.leftShift(Shell.groovy:114)
at org.codehaus.groovy.tools.shell.Shell$leftShift$0.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at org.codehaus.groovy.tools.shell.ShellRunner.work(ShellRunner.groovy:88)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$work(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1079)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:128)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:148)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.work(InteractiveShellRunner.groovy:100)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:272)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:52)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:46)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:133)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:137)
at org.codehaus.groovy.tools.shell.ShellRunner.run(ShellRunner.groovy:57)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$run(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1079)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:128)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:148)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.run(InteractiveShellRunner.groovy:66)
at com.thinkaurelius.titan.tinkerpop.gremlin.Console.<init>(Console.java:57)
at com.thinkaurelius.titan.tinkerpop.gremlin.Console.<init>(Console.java:70)
at com.thinkaurelius.titan.tinkerpop.gremlin.Console.main(Console.java:96)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at com.thinkaurelius.titan.diskstorage.Backend.instantiate(Backend.java:344)
... 63 more
Caused by: com.thinkaurelius.titan.diskstorage.PermanentStorageException: Failed to create directory D:\GraphDBTitan\titan-all-0.4.4\bin\conf
itan-hbase-es.properties for local storage.
at com.thinkaurelius.titan.diskstorage.util.DirectoryUtil.getOrCreateDataDirectory(DirectoryUtil.java:24)
at com.thinkaurelius.titan.diskstorage.common.LocalStoreManager.<init>(LocalStoreManager.java:29)
at com.thinkaurelius.titan.diskstorage.berkeleyje.BerkeleyJEStoreManager.<init>(BerkeleyJEStoreManager.java:36)
... 68 more
Please, can someone help me fix this?
Like remis said in the comments, installing on a Linux machine would reduce the compatibility issues. If you decide to give it a go, this article contains the steps to get you started with Titan on HBase: http://blog.trackerbird.com/content/setting-up-titan-1-0-apache-hbase/.
I found that the main issue under Cygwin is with the $(pwd) expression; change it to $(cygpath -aw $(pwd)).
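Separately, note the garbled directory name in the stack trace ("...\bin\conf" continuing with "itan-hbase-es.properties"): Groovy interprets the \t in the single-quoted path literal as a tab character, so Titan never finds the properties file, which is why the trace shows BerkeleyJEStoreManager trying to create that mangled path as a local storage directory. A minimal sketch of a safer open call (forward slashes are valid in paths on a Windows JVM):
gremlin> g = TitanFactory.open('conf/titan-hbase-es.properties');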

When I compile an AspectJ file with an Axis2 service, I get this error. Does anybody know what I should do?

This Web axisService has deployment faults
Error: java.lang.NoClassDefFoundError: org/aspectj/runtime/internal/AroundClosure
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2427)
at java.lang.Class.getDeclaredMethods(Class.java:1791)
at org.apache.axis2.description.java2wsdl.DefaultSchemaGenerator.generateSchema(DefaultSchemaGenerator.java:273)
at org.apache.axis2.deployment.util.Utils.fillAxisService(Utils.java:468)
at org.apache.axis2.deployment.ServiceBuilder.populateService(ServiceBuilder.java:388)
at org.apache.axis2.deployment.ServiceGroupBuilder.populateServiceGroup(ServiceGroupBuilder.java:101)
at org.apache.axis2.deployment.repository.util.ArchiveReader.buildServiceGroup(ArchiveReader.java:109)
at org.apache.axis2.deployment.repository.util.ArchiveReader.processServiceGroup(ArchiveReader.java:143)
at org.apache.axis2.deployment.ServiceDeployer.deploy(ServiceDeployer.java:82)
at org.apache.axis2.deployment.repository.util.DeploymentFileData.deploy(DeploymentFileData.java:136)
at org.apache.axis2.deployment.DeploymentEngine.doDeploy(DeploymentEngine.java:813)
at org.apache.axis2.deployment.repository.util.WSInfoList.update(WSInfoList.java:144)
at org.apache.axis2.deployment.RepositoryListener.update(RepositoryListener.java:377)
at org.apache.axis2.deployment.RepositoryListener.checkServices(RepositoryListener.java:254)
at org.apache.axis2.deployment.DeploymentEngine.loadServices(DeploymentEngine.java:142)
at org.apache.axis2.deployment.WarBasedAxisConfigurator.loadServices(WarBasedAxisConfigurator.java:283)
at org.apache.axis2.context.ConfigurationContextFactory.createConfigurationContext(ConfigurationContextFactory.java:95)
at org.apache.axis2.transport.http.AxisServlet.initConfigContext(AxisServlet.java:584)
at org.apache.axis2.transport.http.AxisServlet.init(AxisServlet.java:454)
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1194)
at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:1023)
at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4917)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:5324)
at com.sun.enterprise.web.WebModule.start(WebModule.java:353)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:989)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:973)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:704)
at com.sun.enterprise.web.WebContainer.loadWebModule(WebContainer.java:1627)
at com.sun.enterprise.web.WebContainer.loadWebModule(WebContainer.java:1232)
at com.sun.enterprise.server.WebModuleDeployEventListener.moduleDeployed(WebModuleDeployEventListener.java:182)
at com.sun.enterprise.server.WebModuleDeployEventListener.moduleDeployed(WebModuleDeployEventListener.java:278)
at com.sun.enterprise.admin.event.AdminEventMulticaster.invokeModuleDeployEventListener(AdminEventMulticaster.java:1005)
at com.sun.enterprise.admin.event.AdminEventMulticaster.handleModuleDeployEvent(AdminEventMulticaster.java:992)
at com.sun.enterprise.admin.event.AdminEventMulticaster.processEvent(AdminEventMulticaster.java:470)
at com.sun.enterprise.admin.event.AdminEventMulticaster.multicastEvent(AdminEventMulticaster.java:182)
at com.sun.enterprise.admin.server.core.DeploymentNotificationHelper.multicastEvent(DeploymentNotificationHelper.java:308)
at com.sun.enterprise.deployment.phasing.DeploymentServiceUtils.multicastEvent(DeploymentServiceUtils.java:231)
at com.sun.enterprise.deployment.phasing.ServerDeploymentTarget.sendStartEvent(ServerDeploymentTarget.java:298)
at com.sun.enterprise.deployment.phasing.ApplicationStartPhase.runPhase(ApplicationStartPhase.java:132)
at com.sun.enterprise.deployment.phasing.DeploymentPhase.executePhase(DeploymentPhase.java:108)
at com.sun.enterprise.deployment.phasing.PEDeploymentService.executePhases(PEDeploymentService.java:966)
at com.sun.enterprise.deployment.phasing.PEDeploymentService.deploy(PEDeploymentService.java:280)
at com.sun.enterprise.deployment.phasing.PEDeploymentService.deploy(PEDeploymentService.java:298)
at com.sun.enterprise.admin.mbeans.ApplicationsConfigMBean.deploy(ApplicationsConfigMBean.java:584)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.sun.enterprise.admin.MBeanHelper.invokeOperationInBean(MBeanHelper.java:381)
at com.sun.enterprise.admin.MBeanHelper.invokeOperationInBean(MBeanHelper.java:364)
at com.sun.enterprise.admin.config.BaseConfigMBean.invoke(BaseConfigMBean.java:477)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:836)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:761)
at sun.reflect.GeneratedMethodAccessor17.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.sun.enterprise.admin.util.proxy.ProxyClass.invoke(ProxyClass.java:90)
at $Proxy1.invoke(Unknown Source)
at com.sun.enterprise.admin.server.core.jmx.SunoneInterceptor.invoke(SunoneInterceptor.java:304)
at com.sun.enterprise.interceptor.DynamicInterceptor.invoke(DynamicInterceptor.java:174)
at com.sun.enterprise.deployment.autodeploy.AutoDeployer.invokeDeploymentService(AutoDeployer.java:583)
at com.sun.enterprise.deployment.autodeploy.AutoDeployer.deployJavaEEArchive(AutoDeployer.java:564)
at com.sun.enterprise.deployment.autodeploy.AutoDeployer.deploy(AutoDeployer.java:495)
at com.sun.enterprise.deployment.autodeploy.AutoDeployer.deployAll(AutoDeployer.java:270)
at com.sun.enterprise.deployment.autodeploy.AutoDeployControllerImpl$AutoDeployTask.run(AutoDeployControllerImpl.java:374)
at java.util.TimerThread.mainLoop(Timer.java:512)
at java.util.TimerThread.run(Timer.java:462)
Caused by: java.lang.ClassNotFoundException: Class Not found : org.aspectj.runtime.internal.AroundClosure
at org.apache.axis2.deployment.DeploymentClassLoader.findClass(DeploymentClassLoader.java:96)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at org.apache.axis2.deployment.DeploymentClassLoader.loadClass(DeploymentClassLoader.java:277)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 68 more
It is "java.lang.NoClassDefFoundError". That means particular class can not be found in the classpath. Did you put the required libraries in the "lib" folder?.
Check, the required jars are at axis2 classpath or not
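For example (a hedged sketch; the archive name MyService.aar and the jar paths are assumptions, so adjust them to your layout), you could check whether the AspectJ runtime is packaged with the service and copy it somewhere the deployment can see it:
$ jar tf MyService.aar | grep -i aspectj
$ cp /path/to/aspectjrt.jar axis2/WEB-INF/lib/
The missing class org.aspectj.runtime.internal.AroundClosure ships in aspectjrt.jar, so that jar must end up either in the service archive's lib/ folder or on the server's classpath; restart the server after copying it.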

Hadoop word count example without root permissions

I am trying to run this Hadoop word count example on a Linux machine without root permissions, but I keep getting this error:
13/08/27 16:00:08 ERROR security.UserGroupInformation: PriviledgedActionException as:priyankara cause:org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.mapred.JobTrackerNotYetInitializedException: JobTracker is not yet RUNNING
at org.apache.hadoop.mapred.JobTracker.checkJobTrackerState(JobTracker.java:5199)
at org.apache.hadoop.mapred.JobTracker.getNewJobId(JobTracker.java:3543)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.mapred.JobTrackerNotYetInitializedException: JobTracker is not yet RUNNING
at org.apache.hadoop.mapred.JobTracker.checkJobTrackerState(JobTracker.java:5199)
at org.apache.hadoop.mapred.JobTracker.getNewJobId(JobTracker.java:3543)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
at org.apache.hadoop.ipc.Client.call(Client.java:1113)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
at org.apache.hadoop.mapred.$Proxy2.getNewJobId(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
at org.apache.hadoop.mapred.$Proxy2.getNewJobId(Unknown Source)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:944)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:82)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
What is happening? How can I fix this?
I think the problem is that I don't have root permission on that machine. Is there a way to run this example without root permission?
The problem was solved by editing the /etc/hosts file.
You have to put your IP address and hostname in the /etc/hosts file.
For example, if your IP is 192.255.50.1 and your hostname is 'user', you have to put them both in your hosts file as:
192.255.50.1 user
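To see what is currently configured before editing anything (changing /etc/hosts itself requires root, so you may need to ask an administrator), a quick check looks like this:
$ hostname
$ cat /etc/hosts
The name printed by the first command should appear next to your IP in the second command's output.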
Please check your hostname: is it "localhost" or a different name?
You can try changing the hostname to "localhost" and then restarting Hadoop. For instance:
$ hostname localhost
$ $HADOOP_HOME/bin/stop-all.sh
$ $HADOOP_HOME/bin/start-all.sh
This worked in my case. Good luck!
I was trying to use the minicluster on Windows, and the "192.255.50.1 user" suggestion fixed my issue: I put "127.0.0.1 myloginname" in my hosts file, and it resolved my "Datanode denied communication with namenode" error.
Thanks DilanG

Setting up Titan and Cassandra - Could not instantiate storage manager class

I've set up a Cassandra server (DataStax Community Edition) and it's running fine.
My cassandra.yaml has the following settings:
seeds: "127.0.0.1"
storage_port: 7000
rpc_port: 9160
When I do this in the Gremlin shell:
conf = new BaseConfiguration();
conf.setProperty("storage.backend","cassandra");
conf.setProperty("storage.hostname","127.0.0.1");
conf.setProperty("storage.port",9160);
and then:
g= TitanFactory.open(conf);
I get this error:
java.lang.IllegalArgumentException: Could not instantiate storage manager class com.thinkaurelius.titan.diskstorage.cassandra.astyanax.AstyanaxStoreManager
at com.thinkaurelius.titan.diskstorage.Backend.getStorageManager(Backend.java:215)
at com.thinkaurelius.titan.diskstorage.Backend.<init>(Backend.java:97)
at com.thinkaurelius.titan.graphdb.configuration.GraphDatabaseConfiguration.getBackend(GraphDatabaseConfiguration.java:398)
at com.thinkaurelius.titan.graphdb.database.StandardTitanGraph.<init>(StandardTitanGraph.java:78)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:60)
at com.thinkaurelius.titan.core.TitanFactory$open.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at groovysh_evaluate.run(groovysh_evaluate:46)
at groovysh_evaluate$run.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
at groovysh_evaluate$run.call(Unknown Source)
at org.codehaus.groovy.tools.shell.Interpreter.evaluate(Interpreter.groovy:67)
at org.codehaus.groovy.tools.shell.Interpreter$evaluate.call(Unknown Source)
at org.codehaus.groovy.tools.shell.Groovysh.execute(Groovysh.groovy:152)
at org.codehaus.groovy.tools.shell.Shell.leftShift(Shell.groovy:114)
at org.codehaus.groovy.tools.shell.Shell$leftShift$0.call(Unknown Source)
at org.codehaus.groovy.tools.shell.ShellRunner.work(ShellRunner.groovy:88)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$work(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1071)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:128)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:148)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.work(InteractiveShellRunner.groovy:100)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:272)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:137)
at org.codehaus.groovy.tools.shell.ShellRunner.run(ShellRunner.groovy:57)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$run(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1071)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:128)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:148)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.run(InteractiveShellRunner.groovy:66)
at com.thinkaurelius.titan.tinkerpop.gremlin.Console.<init>(Console.java:60)
at com.thinkaurelius.titan.tinkerpop.gremlin.Console.<init>(Console.java:67)
at com.thinkaurelius.titan.tinkerpop.gremlin.Console.main(Console.java:72)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:532)
at com.thinkaurelius.titan.diskstorage.Backend.getStorageManager(Backend.java:204)
... 51 more
Caused by: com.thinkaurelius.titan.diskstorage.TemporaryStorageException: Temporary failure in storage backend
at com.thinkaurelius.titan.diskstorage.cassandra.astyanax.AstyanaxStoreManager.ensureKeyspaceExists(AstyanaxStoreManager.java:328)
at com.thinkaurelius.titan.diskstorage.cassandra.astyanax.AstyanaxStoreManager.<init>(AstyanaxStoreManager.java:136)
... 56 more
Caused by: com.netflix.astyanax.connectionpool.exceptions.NoAvailableHostsException: NoAvailableHostsException: [host=None(0.0.0.0):0, latency=0(0), attempts=0] No hosts to borrow from
at com.netflix.astyanax.connectionpool.impl.RoundRobinExecuteWithFailover.<init>(RoundRobinExecuteWithFailover.java:31)
at com.netflix.astyanax.connectionpool.impl.TokenAwareConnectionPoolImpl.newExecuteWithFailover(TokenAwareConnectionPoolImpl.java:74)
at com.netflix.astyanax.connectionpool.impl.AbstractHostPartitionConnectionPool.executeWithFailover(AbstractHostPartitionConnectionPool.java:229)
at com.netflix.astyanax.thrift.ThriftClusterImpl.executeSchemaChangeOperation(ThriftClusterImpl.java:131)
at com.netflix.astyanax.thrift.ThriftClusterImpl.addKeyspace(ThriftClusterImpl.java:252)
at com.thinkaurelius.titan.diskstorage.cassandra.astyanax.AstyanaxStoreManager.ensureKeyspaceExists(AstyanaxStoreManager.java:323)
... 57 more
What am I doing wrong?
I faced the same issue. The solution is to check version compatibility: I downloaded the latest version of Cassandra and the error was gone. These versions worked for me:
apache-cassandra-1.2.5
titan-all-0.3.1
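Before swapping versions, it may also be worth confirming that Cassandra's Thrift interface is actually reachable on the configured host and port, since the NoAvailableHostsException in the trace generally means the client could not connect to any node. A minimal check, assuming the settings from the question:
$ nodetool status
$ telnet 127.0.0.1 9160
If telnet cannot connect, Titan will not be able to either, regardless of the versions in play.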

Hadoop Program cannot find installed binary

I have Hadoop set up and running on my machine. I am trying to run the PEGASUS peta-scale mining program, but when I do make, there are a lot of warnings; the following is part of the output:
13/03/21 12:58:42 WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hadoop/hadi_edge/catepillar_star.edge could only be replicated to 0 nodes, instead of 1
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
at org.apache.hadoop.ipc.Client.call(Client.java:1066)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at $Proxy1.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
at $Proxy1.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3507)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3370)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2700(DFSClient.java:2586)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2826)
13/03/21 12:58:42 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
13/03/21 12:58:42 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/hadoop/hadi_edge/catepillar_star.edge" - Aborting...
put: java.io.IOException: File /user/hadoop/hadi_edge/catepillar_star.edge could only be replicated to 0 nodes, instead of 1
13/03/21 12:58:42 ERROR hdfs.DFSClient: Exception closing file /user/hadoop/hadi_edge/catepillar_star.edge : org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hadoop/hadi_edge/catepillar_star.edge could only be replicated to 0 nodes, instead of 1
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hadoop/hadi_edge/catepillar_star.edge could only be replicated to 0 nodes, instead of 1
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
at org.apache.hadoop.ipc.Client.call(Client.java:1066)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at $Proxy1.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
at $Proxy1.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3507)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3370)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2700(DFSClient.java:2586)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2826)
I thought this was a permission issue, so I tried sudo make, and then I got the following error:
Hadoop is not installed in the system.
Please install Hadoop and make sure the hadoop binary is accessible.
make: *** [demo_hadi] Error 127
Hadoop is already installed, as my jps output shows:
hadoop#polaris:~/PEGASUS$ jps
7814 JobTracker
8061 TaskTracker
3799 FsShell
7718 SecondaryNameNode
9155 FsShell
8881 RunJar
7235 NameNode
6339 RunJar
9236 Jps
Thanks for your time. I really want to know what is going wrong!
As the jps output shows, the DataNode is not up on your system (which is also the probable reason for "could only be replicated to 0 nodes, instead of 1").
It happens sometimes.
Solution: stop all the Hadoop daemons and start them again, then check that all five daemons are running (NameNode, SecondaryNameNode, DataNode, JobTracker, TaskTracker); the commands are sketched below.
Try running some sample code to confirm that the framework is working.
Restart the machine if the above doesn't work.
If the issue persists, reformat the NameNode (not very good advice, though; keep it as a last resort).
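A minimal sketch of that sequence, assuming a Hadoop 1.x layout with $HADOOP_HOME set as elsewhere on this page (note that reformatting the NameNode erases everything in HDFS, which is why it is the last resort):
$ $HADOOP_HOME/bin/stop-all.sh
$ $HADOOP_HOME/bin/start-all.sh
$ jps
Last resort only:
$ $HADOOP_HOME/bin/stop-all.sh
$ $HADOOP_HOME/bin/hadoop namenode -format
$ $HADOOP_HOME/bin/start-all.sh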
