TargetingIdeaSelector Google Adwords - INVALID_SEARCH_PARAMETERS - java

Currently I am using the TargetingIdeaService to provide a subset of the Keyword Planner functionality: I pass a list of locations, a language, and a RelatedToQuerySearchParameter query to TargetingIdeaService.get to retrieve a list of keywords and their associated data.
My API version: v201806
But I am getting the following exception:
com.google.api.ads.adwords.axis.v201806.cm.ApiException) ApiException{applicationExceptionType=ApiException, errors=[TargetingIdeaError{apiErrorType=TargetingIdeaError, errorString=TargetingIdeaError.INVALID_SEARCH_PARAMETERS, fieldPath=selector.searchParameters.searchParameters[0], fieldPathElements=[FieldPathElement{field=selector}, FieldPathElement{field=searchParameters}, FieldPathElement{field=searchParameters, index=0}], reason=INVALID_SEARCH_PARAMETERS, trigger=LanguageSearchParameter}, TargetingIdeaError{apiErrorType=TargetingIdeaError, errorString=TargetingIdeaError.INVALID_SEARCH_PARAMETERS, fieldPath=selector.searchParameters.searchParameters[1], fieldPathElements=[FieldPathElement{field=selector}, FieldPathElement{field=searchParameters}, FieldPathElement{field=searchParameters, index=1}], reason=INVALID_SEARCH_PARAMETERS, trigger=LocationSearchParameter}, TargetingIdeaError{apiErrorType=TargetingIdeaError, errorString=TargetingIdeaError.INVALID_SEARCH_PARAMETERS, fieldPath=selector.searchParameters.searchParameters[2], fieldPathElements=[FieldPathElement{field=selector}, FieldPathElement{field=searchParameters}, FieldPathElement{field=searchParameters, index=2}], reason=INVALID_SEARCH_PARAMETERS, trigger=RelatedToQuerySearchParameter}]}
Java:
public TargetingIdea[] getClusterCol(ClientAccounts campInfo, String phrase, AccountDetails accDetails) throws ApiException, RemoteException, OAuthException {
    ClientAccounts keyCol = new ClientAccounts();
    GleServices gleServices = new GleServices();
    TargetingIdea[] targetingIdea = null;
    GoogleTokenResponse tokenResp = new GoogleTokenResponse();
    tokenResp.setAccessToken(accDetails.getAccessToken());
    tokenResp.setRefreshToken(accDetails.getRefreshToken());
    gleServices.setAccountDetails(accDetails);
    TargetingIdeaPage page = new TargetingIdeaPage();
    try {
        adWordsSession = gleServices.createAdWordsSession(clientId, clientSecret, tokenResp.getRefreshToken(), developerToken, userAgent);
    } catch (ValidationException ex) {
        Logger.getLogger(TenMinCampaignService.class.getName()).log(Level.SEVERE, null, ex);
    }
    TargetingIdeaServiceInterface targetingIdeaService = (TargetingIdeaServiceInterface) gleServices.getService(gleServices.TARGETING_IDEA_SERVICE, adWordsSession);
    TargetingIdeaSelector selector = new TargetingIdeaSelector();
    try {
        selector.setRequestType(RequestType.IDEAS);
        selector.setIdeaType(IdeaType.KEYWORD);
        selector.setRequestedAttributeTypes(new AttributeType[]{AttributeType.KEYWORD_TEXT});
        Paging paging = new Paging();
        paging.setStartIndex(0);
        paging.setNumberResults(200);
        selector.setPaging(paging);
        Location loc = new Location();
        loc.setId(Long.valueOf(2356));
        LocationSearchParameter countryTargetSearchParameter = new LocationSearchParameter();
        countryTargetSearchParameter.setLocations(new Location[]{loc});
        Language lang = new Language();
        lang.setId(Long.valueOf(1000));
        LanguageSearchParameter langTargetSearchParameter = new LanguageSearchParameter();
        langTargetSearchParameter.setLanguages(new Language[]{lang});
        RelatedToQuerySearchParameter relatedToQuerySearchParameter = new RelatedToQuerySearchParameter();
        relatedToQuerySearchParameter.setQueries(new String[]{"bakery"});
        selector.setSearchParameters(new SearchParameter[]{relatedToQuerySearchParameter, countryTargetSearchParameter, langTargetSearchParameter});
        page = targetingIdeaService.get(selector);
        targetingIdea = page.getEntries();
    } catch (ApiException e) {
        for (ApiError er : e.getErrors()) {
            if (er instanceof RateExceededError) {
                try {
                    // getRetryAfterSeconds() is in seconds; Thread.sleep expects milliseconds
                    Thread.sleep(((RateExceededError) er).getRetryAfterSeconds() * 1000L);
                    page = targetingIdeaService.get(selector);
                    targetingIdea = page.getEntries();
                } catch (Exception th) {
                    e.printStackTrace();
                    throw e;
                }
            }
        }
        e.printStackTrace();
    }
    return targetingIdea;
}
Any help would be appreciated.
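While narrowing this down, it can help to log exactly which search parameter the backend rejected. The sketch below is only a diagnostic aid, not a fix: it reuses the ApiException/ApiError types already caught in the code above and the getters corresponding to the errorString, fieldPath and trigger fields visible in the exception dump.
try {
    page = targetingIdeaService.get(selector);
} catch (ApiException e) {
    // Diagnostic only: show which selector.searchParameters entry was rejected and why.
    for (ApiError er : e.getErrors()) {
        System.err.println("error=" + er.getErrorString()
                + " fieldPath=" + er.getFieldPath()
                + " trigger=" + er.getTrigger());
    }
    throw e;
}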

Related

Synchronization of two workspaces in same RTC repository

I am working currently on a project using RTC (Rational Team Concert) through Plain Java API.
I have a online-repository on which I have a main stream named Sample_ROOT. From this stream I want to create another one named Sample_Client_Application. The Sample_Client_Application should have all components of Sample_ROOT and whenever changes are committed in Sample_ROOT, they have to be pushed into Sample_Client_Application.
I tried to implement all the process described above in eclipse with RTC Plain Java API with the code below.
/*
parentStream : represents Sample_ROOT
appStream : represents DTO of Sample_Client_Application
projectStreams is a map of projects with their workspaces : HashMap<String, List<IWorkspaceHandle>>
*/
public IWorkspaceConnection createAppInTeamAreas(String parentStream, StreamDTO appStream)
throws TeamRepositoryException {
monitor.println("Getting Workspace");
IWorkspaceManager workspaceManager = SCMPlatform.getWorkspaceManager(repository);
IWorkspaceConnection workspaceConnection = null;
Optional<IWorkspaceHandle> handle = projectStreams.get(appStream.getProjectName()).stream().filter(xHandle -> {
IWorkspace workspace = null;
try {
workspace = (IWorkspace) repository.itemManager().fetchCompleteItem(xHandle, IItemManager.DEFAULT,
null);
} catch (TeamRepositoryException e) {
e.printStackTrace();
}
return workspace != null && workspace.getName() == parentStream;
}).findFirst();
if (handle.isPresent()) {
IWorkspace parentWorkSpace = null;
try {
parentWorkSpace = (IWorkspace) repository.itemManager().fetchCompleteItem(handle.get(),
IItemManager.DEFAULT, null);
} catch (TeamRepositoryException e) {
e.printStackTrace();
return null;
}
IWorkspaceConnection parentConn = workspaceManager.getWorkspaceConnection(handle.get(), monitor);
List<IComponentHandle> parentComps = (List<IComponentHandle>) parentConn.getComponents();
IWorkspaceConnection childConn = null;
IItemManager itemManager = repository.itemManager();
for (IComponentHandle parentCompHandle : parentComps) {
IItemHandle itemHandle = (IItemHandle) parentCompHandle;
IComponent parentComp = (IComponent) itemManager.fetchCompleteItem(itemHandle, IItemManager.DEFAULT,
monitor);
monitor.println(String.format("Component name : %s", parentComp.getName()));
childConn = createAndAddSycrhronizedComponents(parentWorkSpace, parentComp, parentConn, appStream,
childConn);
}
workspaceConnection = childConn;
}
return workspaceConnection;
}
private IWorkspaceConnection createAndAddSycrhronizedComponents(IWorkspace parentStream, IComponent parentComp,
IWorkspaceConnection parentConnection, StreamDTO childStream, IWorkspaceConnection childConnection) {
IWorkspaceManager workspaceManager = SCMPlatform.getWorkspaceManager(repository);
IWorkspaceConnection workspaceConnection = null;
try {
if (childConnection == null) {
workspaceConnection = workspaceManager.createWorkspace(repository.loggedInContributor(),
childStream.getName(), childStream.getDescription(), monitor);
} else {
workspaceConnection = childConnection;
//Set flow for the newly created repository
IFlowTable flowTarget=workspaceConnection.getFlowTable().getWorkingCopy();
flowTarget.addDeliverFlow(parentStream, repository.getId(), repository.getRepositoryURI(), parentConnection.getComponents(), "Scoped");
flowTarget.addAcceptFlow(parentStream, repository.getId(), repository.getRepositoryURI(), parentConnection.getComponents(), "Scoped");
flowTarget.setCurrent(flowTarget.getDeliverFlow(parentStream));
flowTarget.setCurrent(flowTarget.getAcceptFlow(parentStream));
flowTarget.setDefault(flowTarget.getDeliverFlow(parentStream));
flowTarget.setDefault(flowTarget.getAcceptFlow(parentStream));
workspaceConnection.setFlowTable(flowTarget, monitor);
//Add the components to the workspace
List<Object> componentsToBeAdded=new ArrayList<>();
componentsToBeAdded.add(workspaceConnection.componentOpFactory().addComponent(parentComp, true));
workspaceConnection.applyComponentOperations(componentsToBeAdded, monitor);
//Accept stream changes to local repository
IChangeHistorySyncReport streamChangeHistorySyncReport=workspaceConnection.compareTo(parentConnection, WorkspaceComparisonFlags.INCLUDE_BASELINE_INFO, Collections.EMPTY_LIST, null);
workspaceConnection.accept(AcceptFlags.DEFAULT, parentConnection, streamChangeHistorySyncReport, streamChangeHistorySyncReport.incomingBaselines(parentComp), streamChangeHistorySyncReport.incomingChangeSets(parentComp), monitor);
}
} catch (TeamRepositoryException e) {
e.printStackTrace();
}
return workspaceConnection;
}
Sample_Client_Application is created successfully, but it has no components. Can anyone help me, please?
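One way to double-check what actually ended up in the new stream is to list its components right after createAppInTeamAreas returns. This is only a diagnostic sketch: it assumes the same class context as the code above (the repository and monitor fields) and reuses only calls that already appear there (getComponents(), itemManager().fetchCompleteItem(...)).
// Diagnostic sketch: print the components of the returned workspace connection.
private void printComponents(IWorkspaceConnection connection) throws TeamRepositoryException {
    @SuppressWarnings("unchecked")
    List<IComponentHandle> handles = (List<IComponentHandle>) connection.getComponents();
    monitor.println(String.format("Component count : %d", handles.size()));
    for (IComponentHandle handle : handles) {
        IComponent component = (IComponent) repository.itemManager()
                .fetchCompleteItem(handle, IItemManager.DEFAULT, monitor);
        monitor.println(String.format("Component name : %s", component.getName()));
    }
}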

DCM4CHE, Network operations, Handling a C-MOVE call

Hi, I'm trying to make a PACS server using Java. dcm4che appears to be quite popular, but I'm unable to find any good examples for it.
As a starting point I inspected dcmqrscp, and it successfully stores a DICOM image. But I cannot manage to handle a C-MOVE call. Here's my C-MOVE handler. It finds the requested DICOM file, adds a URL and some other details, and it doesn't throw any exception, yet the client doesn't receive any files.
private final class CMoveSCPImpl extends BasicCMoveSCP {
private final String[] qrLevels;
private final QueryRetrieveLevel rootLevel;
public CMoveSCPImpl(String sopClass, String... qrLevels) {
super(sopClass);
this.qrLevels = qrLevels;
this.rootLevel = QueryRetrieveLevel.valueOf(qrLevels[0]);
}
@Override
protected RetrieveTask calculateMatches(Association as, PresentationContext pc, final Attributes rq, Attributes keys) throws DicomServiceException {
QueryRetrieveLevel level = QueryRetrieveLevel.valueOf(keys, qrLevels);
try {
level.validateRetrieveKeys(keys, rootLevel, relational(as, rq));
} catch (Exception e) {
e.printStackTrace();
}
String moveDest = rq.getString(Tag.MoveDestination);
final Connection remote = new Connection("reciverAE",as.getSocket().getInetAddress().getHostAddress(), 11113);
if (remote == null)
throw new DicomServiceException(Status.MoveDestinationUnknown, "Move Destination: " + moveDest + " unknown");
List<T> matches = DcmQRSCP.this.calculateMatches(keys);
if (matches.isEmpty())
return null;
AAssociateRQ aarq;
Association storeas = null;
try {
aarq = makeAAssociateRQ(as.getLocalAET(), moveDest, matches);
storeas = openStoreAssociation(as, remote, aarq);
} catch (Exception e) {
e.printStackTrace();
}
BasicRetrieveTask<T> retrieveTask = null;
retrieveTask = new BasicRetrieveTask<T>(Dimse.C_MOVE_RQ, as, pc, rq, matches, storeas, new BasicCStoreSCU<T>());
retrieveTask.setSendPendingRSPInterval(getSendPendingCMoveInterval());
return retrieveTask;
}
private Association openStoreAssociation(Association as, Connection remote, AAssociateRQ aarq)
throws DicomServiceException {
try {
return as.getApplicationEntity().connect(as.getConnection(),
remote, aarq);
} catch (Exception e) {
throw new DicomServiceException(
Status.UnableToPerformSubOperations, e);
}
}
private AAssociateRQ makeAAssociateRQ(String callingAET,
String calledAET, List<T> matches) {
AAssociateRQ aarq = new AAssociateRQ();
aarq.setCalledAET(calledAET);
aarq.setCallingAET(callingAET);
for (InstanceLocator match : matches) {
if (aarq.addPresentationContextFor(match.cuid, match.tsuid)) {
if (!UID.ExplicitVRLittleEndian.equals(match.tsuid))
aarq.addPresentationContextFor(match.cuid,
UID.ExplicitVRLittleEndian);
if (!UID.ImplicitVRLittleEndian.equals(match.tsuid))
aarq.addPresentationContextFor(match.cuid,
UID.ImplicitVRLittleEndian);
}
}
return aarq;
}
private boolean relational(Association as, Attributes rq) {
String cuid = rq.getString(Tag.AffectedSOPClassUID);
ExtendedNegotiation extNeg = as.getAAssociateAC().getExtNegotiationFor(cuid);
return QueryOption.toOptions(extNeg).contains(
QueryOption.RELATIONAL);
}
}
I added the code below to send a DICOM file as a response:
String cuid = rq.getString(Tag.AffectedSOPClassUID);
String iuid = rq.getString(Tag.AffectedSOPInstanceUID);
String tsuid = pc.getTransferSyntax();
try {
DcmQRSCP.this.as=as;
File f = new File("D:\\dcmqrscpTestDCMDir\\1.2.840.113619.2.30.1.1762295590.1623.978668949.886\\1.2.840.113619.2.30.1.1762295590.1623.978668949.887\\1.2.840.113619.2.30.1.1762295590.1623.978668949.888");
FileInputStream in = new FileInputStream(f);
InputStreamDataWriter data = new InputStreamDataWriter(in);
// !1! as.cmove(cuid,1,keys,tsuid,"STORESCU");
as.cstore(cuid,iuid,1,data,tsuid,rspHandlerFactory.createDimseRSPHandler(f));
} catch (Exception e) {
e.printStackTrace();
}
This throws the following exception:
org.dcm4che3.net.NoRoleSelectionException: No Role Selection for SOP Class 1.2.840.10008.5.1.4.1.2.2.2 - Study Root Query/Retrieve Information Model - MOVE as SCU negotiated
You should add a role to the application entity, like this:
applicationEntity.addTransferCapability(
new TransferCapability(null, "*", TransferCapability.Role.SCP, "*"));
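For context, in a dcmqrscp-style setup the transfer capabilities are registered on the ApplicationEntity before the Device is started. A minimal sketch of that pattern follows; the AE title is a placeholder, the "*" wildcards accept any SOP class and transfer syntax, and whether the additional SCU role entry is needed for the C-STORE sub-operations is an assumption, not a confirmed requirement.
// Sketch: register wildcard transfer capabilities on the AE before starting the Device.
ApplicationEntity ae = new ApplicationEntity("DCMQRSCP"); // placeholder AE title
ae.addTransferCapability(
        new TransferCapability(null, "*", TransferCapability.Role.SCP, "*"));
// Possibly also needed when the same AE sends C-STORE sub-operations as an SCU:
ae.addTransferCapability(
        new TransferCapability(null, "*", TransferCapability.Role.SCU, "*"));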

Get PayPal account history using REST

I want to use the Java PayPal SDK to get account history. I tried this simple code:
public void randomDatabaseData() throws SQLException, FileNotFoundException, IOException, PayPalRESTException {
String clientID = "test";
String clientSecret = "test";
String accessToken = null;
try {
Map<String, String> map = new HashMap<String, String>();
map.put("mode", "live");
try {
accessToken = new OAuthTokenCredential(clientID, clientSecret, map).getAccessToken();
} catch (Exception e) {
e.printStackTrace();
}
System.out.println(accessToken);
transactionSearch(accessToken);
} catch (Exception e) {
e.printStackTrace();
}
}
public TransactionSearchResponseType transactionSearch(String accessToken) {
TransactionSearchReq transactionSearchReq = new TransactionSearchReq();
TransactionSearchRequestType transactionSearchRequest = new TransactionSearchRequestType(
"2012-12-25T00:00:00+0530");
transactionSearchReq.setTransactionSearchRequest(transactionSearchRequest);
PayPalAPIInterfaceServiceService service = new PayPalAPIInterfaceServiceService();
service.setTokenSecret(accessToken);
TransactionSearchResponseType transactionSearchResponse = null;
try {
transactionSearchResponse = service.transactionSearch(transactionSearchReq);
} catch (Exception e) {
System.out.println("Error Message : " + e.getMessage());
}
if (transactionSearchResponse.getAck().getValue().equalsIgnoreCase("success")) {
Iterator<PaymentTransactionSearchResultType> iterator = transactionSearchResponse
.getPaymentTransactions().iterator();
while (iterator.hasNext()) {
PaymentTransactionSearchResultType searchResult = iterator.next();
System.out.println("Transaction ID : " + searchResult.getTransactionID());
}
} else {
List<ErrorType> errorList = transactionSearchResponse.getErrors();
System.out.println("API Error Message : " + errorList.get(0).getLongMessage());
}
return transactionSearchResponse;
}
But I get this error stack when I run the code:
Error Message : configurationMap cannot be null
java.lang.NullPointerException
at com.crm.web.tickets.GenearateTicketsTest.transactionSearch(GenearateTicketsTest.java:161)
at com.crm.web.tickets.GenearateTicketsTest.randomDatabaseData(GenearateTicketsTest.java:139)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
How can I fix this code? I configured the client ID and secret key on the PayPal website, but I still get the error.
I recommend using the TransactionSearch API to get the payment history for your search date range.
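The "configurationMap cannot be null" message suggests that PayPalAPIInterfaceServiceService was created without any SDK configuration. TransactionSearch belongs to the Classic (Merchant) API, which is normally driven by API username/password/signature credentials rather than the REST client ID/secret. The sketch below is one hedged way to wire it up, assuming a merchant-sdk-java version whose service constructor accepts a configuration Map; the acct1.* values are placeholders.
// Sketch, assuming the PayPal merchant-sdk-java (Classic API) with a Map-based configuration.
Map<String, String> sdkConfig = new HashMap<String, String>();
sdkConfig.put("mode", "live");
sdkConfig.put("acct1.UserName", "YOUR_API_USERNAME");   // placeholder Classic API credentials,
sdkConfig.put("acct1.Password", "YOUR_API_PASSWORD");   // not the REST client ID/secret
sdkConfig.put("acct1.Signature", "YOUR_API_SIGNATURE");

PayPalAPIInterfaceServiceService service = new PayPalAPIInterfaceServiceService(sdkConfig);

TransactionSearchReq transactionSearchReq = new TransactionSearchReq();
transactionSearchReq.setTransactionSearchRequest(
        new TransactionSearchRequestType("2012-12-25T00:00:00+0530"));
TransactionSearchResponseType response = service.transactionSearch(transactionSearchReq);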

BeanSerializer/BeanDeserializer Axis Generated Object

I have a large number of Axis-generated objects from several WSDLs. I need a generic solution to store the objects in XML format in the database, but also to load them back into Java when needed.
This is what I've made so far:
private String serializeAxisObject(Object obj) throws Exception {
if (obj == null) {
return null;
}
StringWriter outStr = new StringWriter();
TypeDesc typeDesc = getAxisTypeDesc(obj);
QName qname = typeDesc.getXmlType();
String lname = qname.getLocalPart();
if (lname.startsWith(">") && lname.length() > 1)
lname = lname.substring(1);
qname = new QName(qname.getNamespaceURI(), lname);
AxisServer server = new AxisServer();
BeanSerializer ser = new BeanSerializer(obj.getClass(), qname, typeDesc);
SerializationContext ctx = new SerializationContext(outStr,
new MessageContext(server));
ctx.setSendDecl(false);
ctx.setDoMultiRefs(false);
ctx.setPretty(true);
try {
ser.serialize(qname, new AttributesImpl(), obj, ctx);
} catch (final Exception e) {
throw new Exception("Unable to serialize object "
+ obj.getClass().getName(), e);
}
String xml = outStr.toString();
return xml;
}
private Object deserializeAxisObject(Class<?> cls, String xml)
throws Exception {
final String SOAP_START = "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\"><soapenv:Header /><soapenv:Body>";
final String SOAP_START_XSI = "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"><soapenv:Header /><soapenv:Body>";
final String SOAP_END = "</soapenv:Body></soapenv:Envelope>";
Object result = null;
try {
Message message = new Message(SOAP_START + xml + SOAP_END);
result = message.getSOAPEnvelope().getFirstBody()
.getObjectValue(cls);
} catch (Exception e) {
try {
Message message = new Message(SOAP_START_XSI + xml + SOAP_END);
result = message.getSOAPEnvelope().getFirstBody()
.getObjectValue(cls);
} catch (Exception e1) {
throw new Exception(e1);
}
}
return result;
}
private TypeDesc getAxisTypeDesc(Object obj) throws Exception {
final Class<? extends Object> objClass = obj.getClass();
try {
final Method methodGetTypeDesc = objClass.getMethod("getTypeDesc",
new Class[] {});
final TypeDesc typeDesc = (TypeDesc) methodGetTypeDesc.invoke(obj,
new Object[] {});
return (typeDesc);
} catch (final Exception e) {
throw new Exception("Unable to get Axis TypeDesc for "
+ objClass.getName(), e);
}
}
I have fixed it; I will leave this here in case anyone else needs to use it. See the updated version.
Have fun, and thank you.
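For anyone reusing these helpers, here is a short usage sketch. SomeAxisGeneratedType is a hypothetical placeholder for any Axis-generated bean (i.e. a class exposing the static getTypeDesc() method the helpers rely on), and the XML string stands in for whatever the database returns.
// Usage sketch for serializeAxisObject / deserializeAxisObject above.
SomeAxisGeneratedType original = new SomeAxisGeneratedType();   // hypothetical Axis-generated bean
String xml = serializeAxisObject(original);                     // persist this string in the database
SomeAxisGeneratedType restored =
        (SomeAxisGeneratedType) deserializeAxisObject(SomeAxisGeneratedType.class, xml);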

Elasticsearch: Adding manual mapping using Java

I can't change the mapping. Can anybody help me find the bug in my code?
I have found this standard way to change the mapping in several tutorials, but when I try to retrieve the mapping structure, only a blank mapping appears after the manual mapping creation.
However, after inserting some data the mapping specification does appear, because ES is of course using the default one. To be more specific, see the code below.
public class ElasticTest {
private String dbname = "ElasticSearch";
private String index = "indextest";
private String type = "table";
private Client client = null;
private Node node = null;
public ElasticTest(){
this.node = nodeBuilder().local(true).node();
this.client = node.client();
if(isIndexExist(index)){
deleteIndex(this.client, index);
createIndex(index);
}
else{
createIndex(index);
}
System.out.println("mapping structure before data insertion");
getMappings();
System.out.println("----------------------------------------");
createData();
System.out.println("mapping structure after data insertion");
getMappings();
}
public void getMappings() {
ClusterState clusterState = client.admin().cluster().prepareState()
.setFilterIndices(index).execute().actionGet().getState();
IndexMetaData inMetaData = clusterState.getMetaData().index(index);
MappingMetaData metad = inMetaData.mapping(type);
if (metad != null) {
try {
String structure = metad.getSourceAsMap().toString();
System.out.println(structure);
} catch (IOException e) {
e.printStackTrace();
}
}
}
private void createIndex(String index) {
XContentBuilder typemapping = buildJsonMappings();
String mappingstring = null;
try {
mappingstring = buildJsonMappings().string();
} catch (IOException e1) {
e1.printStackTrace();
}
client.admin().indices().create(new CreateIndexRequest(index)
.mapping(type, typemapping)).actionGet();
//try put mapping after index creation
/*
* PutMappingResponse response = null; try { response =
* client.admin().indices() .preparePutMapping(index) .setType(type)
* .setSource(typemapping.string()) .execute().actionGet(); } catch
* (ElasticSearchException e) { e.printStackTrace(); } catch
* (IOException e) { e.printStackTrace(); }
*/
}
private void deleteIndex(Client client, String index) {
try {
DeleteIndexResponse delete = client.admin().indices()
.delete(new DeleteIndexRequest(index)).actionGet();
if (!delete.isAcknowledged()) {
} else {
}
} catch (Exception e) {
}
}
private XContentBuilder buildJsonMappings(){
XContentBuilder builder = null;
try {
builder = XContentFactory.jsonBuilder();
builder.startObject()
.startObject("properties")
.startObject("ATTR1")
.field("type", "string")
.field("store", "yes")
.field("index", "analyzed")
.endObject()
.endObject()
.endObject();
} catch (IOException e) {
e.printStackTrace();
}
return builder;
}
private boolean isIndexExist(String index) {
ActionFuture<IndicesExistsResponse> exists = client.admin().indices()
.exists(new IndicesExistsRequest(index));
IndicesExistsResponse actionGet = exists.actionGet();
return actionGet.isExists();
}
private void createData(){
System.out.println("Data creation");
IndexResponse response=null;
for (int i=0;i<10;i++){
Map<String, Object> json = new HashMap<String, Object>();
json.put("ATTR1", "new value" + i);
response = this.client.prepareIndex(index, type)
.setSource(json)
.setOperationThreaded(false)
.execute()
.actionGet();
}
String _index = response.getIndex();
String _type = response.getType();
long _version = response.getVersion();
System.out.println("Index : "+_index+" Type : "+_type+" Version : "+_version);
System.out.println("----------------------------------");
}
public static void main(String[] args)
{
new ElasticTest();
}
}
I just want to change the property of the ATTR1 field to analyzed to ensure fast queries.
What am I doing wrong? I also tried to create the mapping after index creation, but it leads to the same effect.
OK, I found the answer on my own. At the type level I had to wrap the "properties" with the type name, e.g.:
"type1" : {
"properties" : {
.....
}
}
See the following code:
private XContentBuilder getMappingsByJson(){
XContentBuilder builder = null;
try {
builder = XContentFactory.jsonBuilder().startObject().startObject(type).startObject("properties");
for(int i = 1; i<5; i++){
builder.startObject("ATTR" + i)
.field("type", "integer")
.field("store", "yes")
.field("index", "analyzed")
.endObject();
}
builder.endObject().endObject().endObject();
}
catch (IOException e) {
e.printStackTrace();
}
return builder;
}
It creates mappings for the attributes ATTR1 - ATTR4. Now it is possible to define the mapping dynamically, for example for a list of different attributes. Hope it helps someone else.
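To tie this back to the question's createIndex method, the corrected builder can be passed to the CreateIndexRequest in exactly the same way as before; a minimal sketch:
// Sketch: create the index with the type-wrapped mapping from getMappingsByJson().
private void createIndex(String index) {
    XContentBuilder typemapping = getMappingsByJson();
    client.admin().indices()
            .create(new CreateIndexRequest(index).mapping(type, typemapping))
            .actionGet();
}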
