I am writing a MapReduce project and want to send an array from the mapper to the reducer, but it fails with an error I can't fix.
I import these classes:
import java.io.DataInput;
import java.io.DataOutput;
import java.io.EOFException;
import java.io.IOException;
import java.net.Socket;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import org.apache.hadoop.conf.Configured;
import java.io.BufferedInputStream;
import java.io.BufferedReader;
import java.util.*;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.OutputStream;
import java.util.Iterator;
import hadoop.DENCLUE;
//import javafx.scene.text.Text;
import sun.security.krb5.Config;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.viewfs.Constants;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;
//import org.apache.hadoop.mapred.jobcontrol.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
import org.omg.CORBA.PUBLIC_MEMBER;
import com.sun.org.apache.bcel.internal.generic.NEW;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.*;
import org.apache.hadoop.mapreduce.lib.output.*;
import org.apache.hadoop.mapreduce.Mapper.Context;
import org.apache.hadoop.mapreduce.task.JobContextImpl;
import org.apache.hadoop.mapreduce.lib.input.*;
import org.apache.hadoop.mapreduce.lib.output.*;
import org.apache.hadoop.util.*;
import java.io.DataOutput;
import java.io.DataInput;
import java.io.IOException;
This is my Map class:
public static class Mapn extends MapReduceBase implements Mapper<LongWritable, Text, Text, Text> {
    @SuppressWarnings("rawtypes")
    Context con;

    @SuppressWarnings("unchecked")
    public void map(LongWritable key, Text value, OutputCollector<Text, Text> output, Reporter reporter) throws IOException {
        String line = value.toString();
        String[] words = line.split(",");
        for (String word : words) {
            Text outputKey = new Text(word.toUpperCase().trim());
            try {
                con.write(outputKey, words);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}
This is the job:
public static void main(String[] args) throws Exception {
    Configuration c = new Configuration();
    String[] files = new GenericOptionsParser(c, args).getRemainingArgs();
    Path input = new Path(files[0]);
    Path output = new Path(files[1]);
    Job j = new Job(c, "wnt");
    j.setJarByClass(projectmr.class);
    j.setMapperClass(Mapn.class);
    j.setReducerClass(Reduce.class);
    j.setOutputKeyClass(Text.class);
    j.setOutputValueClass(Text.class);
    FileInputFormat.addInputPaths(j, input);
    FileOutputFormat.setOutputPath(j, output);
    System.exit(j.waitForCompletion(true) ? 0 : 1);
}
and this is the error I get:
Exception in thread "main" java.lang.RuntimeException: class hadoop.projectmr$Mapn not org.apache.hadoop.mapreduce.Mapper
at org.apache.hadoop.conf.Configuration.setClass(Configuration.java:1969)
at org.apache.hadoop.mapreduce.Job.setMapperClass(Job.java:891)
at hadoop.projectmr.main(projectmr.java:191)
This is the old, Hadoop 1 API
import org.apache.hadoop.mapred.*;
You should instead be importing classes from
org.apache.hadoop.mapreduce.*;
As the error says
not org.apache.hadoop.mapreduce.Mapper
So, basically, you don't need MapReduceBase anymore, and Mapper is now a class rather than an interface.
So you would now have
public static class MyMapper extends Mapper<Kin, Vin, Kout, Vout>
Look at the WordCount example for reference.
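As a minimal sketch (assuming org.apache.hadoop.mapreduce.Mapper and the org.apache.hadoop.io types are imported), the mapper from the question could become something like this; since Text only carries a String, one simple way to "send an array" to the reducer is to re-join the split words into a single comma-separated value that the reducer splits again:
public static class Mapn extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] words = value.toString().split(",");
        // Text can only hold a String, so re-join the array; the reducer can split it on "," again
        Text joined = new Text(String.join(",", words));
        for (String word : words) {
            context.write(new Text(word.toUpperCase().trim()), joined);
        }
    }
}
With the new API, the driver should likewise use org.apache.hadoop.mapreduce.lib.input.FileInputFormat and org.apache.hadoop.mapreduce.lib.output.FileOutputFormat instead of the org.apache.hadoop.mapred versions.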
Related
I have the following Spring Boot Controller:
import java.io.InputStream;
import javax.servlet.http.HttpServletRequest;
import javax.ws.rs.Consumes;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import com.att.aft.dme2.internal.javaxwsrs.PUT;
import com.att.bttw.model.HelloWorld;
import io.swagger.annotations.Api;
import io.swagger.annotations.ApiOperation;
import io.swagger.annotations.ApiResponse;
import io.swagger.annotations.ApiResponses;
@Api
@Path("/service")
@Produces({ MediaType.APPLICATION_JSON })
public interface RestService {
    @PUT
    @Path("/datarouter/{filename}")
    @Consumes("application/gzip")
    @Produces(MediaType.APPLICATION_JSON)
    Response testEndpoint(@Context HttpServletRequest request, @PathParam("filename") String filename, InputStream inputStream);
}
The implementation of this method is simply:
import javax.servlet.http.HttpServletRequest;
import javax.ws.rs.core.Response;
import org.springframework.stereotype.Component;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.client.RestTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import com.att.bttw.model.ApplicationResults;
import com.att.bttw.model.CountsDataModel;
import com.att.bttw.model.CountsDataResults;
import com.att.bttw.model.DisplayFieldRecord;
import com.att.bttw.model.DisplayRecord;
import com.att.bttw.model.WatermarkEventObject;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.IOException;
import java.io.InputStream;
import java.sql.Timestamp;
import java.time.Instant;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
@Override
public Response testEndpoint(HttpServletRequest request, String filename, InputStream inputStream) {
    System.out.println("File name consumed: " + filename);
    return null;
}
When I try to hit this endpoint, I get the following error:
"Following issues have been detected: Response testEndpoint(javax.servlet.http.HttpServletRequest, java.lang.String, java.io.InputStream), can not have an entity parameter. Try to move the parameter to the corresponding resource method".
When I remove the InputStream and HttpServletRequest parameters, the endpoint works just fine; I just don't understand the error message I'm getting.
I am trying to upload a file to S3 and am getting an error like:
Target exception: java.lang.NoSuchMethodError:
com.amazonaws.SDKGlobalConfiguration.isInRegionOptimizedModeEnabled()Z
The code is:
String accessKey="accesskey";
String secretKey="mysecretkey";
AWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
AmazonS3 conn = new AmazonS3Client(credentials);
The target exception is thrown on the line AmazonS3 conn = new AmazonS3Client(credentials);
I imported all of the Java packages shown below, and I am still getting the same error.
import java.io.File;
import java.io.IOException;
import java.io.ByteArrayInputStream;
import java.io.File;
import java.util.List;
import com.amazonaws.auth.*;
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.util.StringUtils;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.SDKGlobalConfiguration;
import com.amazonaws.services.s3.model.Bucket;
import com.amazonaws.services.s3.model.CannedAccessControlList;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.S3ObjectSummary;
import com.amazonaws.services.s3.*;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.AmazonS3Builder;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.ClientConfigurationFactory;
import com.amazonaws.annotation.NotThreadSafe;
import com.amazonaws.annotation.SdkTestInternalApi;
import com.amazonaws.client.AwsSyncClientParams;
import com.amazonaws.internal.SdkFunction;
import com.amazonaws.regions.AwsRegionProvider;
import com.amazonaws.ClientConfiguration;
import com.amazonaws.Protocol;
import com.amazonaws.AmazonS3Client;
import com.amazonaws.AmazonClientException;
import com.amazonaws.AmazonServiceException;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.PutObjectRequest;
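As a side note, a NoSuchMethodError on SDKGlobalConfiguration usually points at mismatched aws-java-sdk-core and aws-java-sdk-s3 jar versions on the classpath rather than at this code. With matching jars, the 1.x SDK also offers a builder-based construction; a minimal sketch (the region, bucket, key and file path are placeholders, and com.amazonaws.auth.AWSStaticCredentialsProvider plus com.amazonaws.regions.Regions would need to be imported):
BasicAWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
AmazonS3 s3 = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(credentials)) // static credentials from the question
        .withRegion(Regions.US_EAST_1)                                  // placeholder region
        .build();
s3.putObject("my-bucket", "my-key", new File("/path/to/file"));         // placeholder bucket, key and file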
org.springframework.beans.factory.BeanDefinitionStoreException: Failed to parse configuration class [com.gaian.adwize.ssp.main.SSPStarter]; nested exception is java.io.FileNotFoundException: Could not open ServletContext resource [/application.properties]
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:187)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.processConfigBeanDefinitions(ConfigurationClassPostProcessor.java:324)
at
Test class
----------
package com.gaian.adwize.swagger;
import com.gaian.adwize.ssp.main.SSPStarter;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.restdocs.AutoConfigureRestDocs;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.http.MediaType;
import org.springframework.mock.web.MockHttpServletResponse;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.context.web.WebAppConfiguration;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.MvcResult;
import java.io.BufferedWriter;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import org.springframework.context.annotation.ImportResource;
import static org.springframework.restdocs.mockmvc.MockMvcRestDocumentation.document;
import static org.springframework.restdocs.operation.preprocess.Preprocessors.preprocessResponse;
import static org.springframework.restdocs.operation.preprocess.Preprocessors.prettyPrint;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;
@WebAppConfiguration
@RunWith(SpringJUnit4ClassRunner.class)
@AutoConfigureRestDocs(outputDir = "build/asciidoc/snippets")
@SpringBootTest(classes = {SSPStarter.class, SwaggerConfig.class})
@AutoConfigureMockMvc
public class Swagger2MarkupTest {
private static final Logger LOG = LoggerFactory.getLogger(Swagger2MarkupTest.class);
@Autowired
private MockMvc mockMvc;
@Test
public void createSpringfoxSwaggerJson() throws Exception {
//String designFirstSwaggerLocation = Swagger2MarkupTest.class.getResource("/swagger.yaml").getPath();
String outputDir = System.getProperty("io.springfox.staticdocs.outputDir");
MvcResult mvcResult = this.mockMvc.perform(get("/v2/api-docs")
.accept(MediaType.APPLICATION_JSON))
.andExpect(status().isOk())
.andReturn();
MockHttpServletResponse response = mvcResult.getResponse();
String swaggerJson = response.getContentAsString();
Files.createDirectories(Paths.get(outputDir));
try (BufferedWriter writer = Files.newBufferedWriter(Paths.get(outputDir, "swagger.json"), StandardCharsets.UTF_8)){
writer.write(swaggerJson);
}
}
}
The application context fails to load, and it cannot find the properties file that is on the classpath.
Can anyone please help?
I'm trying to import vk.core.api in Java, but when I compile I only get errors like:
"error: package vk.core.api does not exist
import vk.core.api.*;"
package main.java.tddt;
import javafx.scene.control.Label;
import main.java.tddt.data.Log;
import main.java.tddt.data.LogList;
import main.java.tddt.data.Timer;
import main.java.tddt.gui.Controller;
import vk.core.api.*;
import vk.core.api.CompilerResult;
import vk.core.api.TestResult;
import vk.core.api.CompileError;
import vk.core.api.CompilerFactory;
import vk.core.api.JavaStringCompiler;
import vk.core.internal.*;
import vk.core.internal.InternalResult;
import java.util.TreeSet;
import javax.xml.bind.JAXBException;
import java.io.File;
import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Collection;
import java.util.List;
You need to add something like the following to the dependencies of your build.gradle:
compile group: 'de.hhu.stups', name: 'virtual-kata-lib', version: '1.0.0'
I am trying to create a new Excel file using the code below. I have added all the necessary JAR files, as you can see from the import lines, but I am still seeing the errors "WritableWorkbook cannot be resolved to a type" and "Workbook cannot be resolved to a type".
package myPackage;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.ie.InternetExplorerDriver;
import org.openqa.selenium.WebElement;
import java.util.List;
import java.io.File;
import java.io.OutputStream;
import org.apache.poi.xssf.usermodel.XSSFCell;
import org.apache.poi.xssf.usermodel.XSSFRow;
import org.apache.poi.xssf.usermodel.XSSFSheet;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import org.apache.poi.ss.usermodel.Workbook;
public class myClass {
    public static void main(String[] args) {
        File eFile = new File("C:\\Users\\bondrea\\AutomationRelatedFiles\\TrialExcel.xlsx");
        WritableWorkbook FinalExcel = Workbook.createWorkbook(eFile);
    }
}
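For what it's worth, WritableWorkbook and Workbook.createWorkbook(...) come from the JExcelAPI (jxl) library, not from the Apache POI classes imported above. A minimal sketch using the already-imported POI XSSFWorkbook instead (the sheet name is illustrative, java.io.FileOutputStream and java.io.IOException would also need to be imported, and a reasonably recent POI version is assumed):
File eFile = new File("C:\\Users\\bondrea\\AutomationRelatedFiles\\TrialExcel.xlsx");
try (XSSFWorkbook workbook = new XSSFWorkbook();
     FileOutputStream out = new FileOutputStream(eFile)) {
    workbook.createSheet("Sheet1"); // sheet name is just an illustration
    workbook.write(out);            // writes the .xlsx file to disk
} catch (IOException e) {
    e.printStackTrace();
}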