I am developing a find-the-differences style puzzle game on images in Android (Java).
How can I find out where the user touched the image, and whether they touched the right place?
Make a custom view that extends View and override its onTouchEvent() method, or handle all touch events in the Activity by overriding dispatchTouchEvent(). You can obtain the precise point of the touch/action with the getX() and getY() methods of the MotionEvent passed to both of those methods; then it's your turn to do some math and work out which item was touched. (By default, the Activity dispatches touch events to its layout, e.g. the one inflated from XML.)
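For a concrete starting point, here is a minimal sketch of such a custom view; the DifferenceImageView name, the Rect coordinates, and onDifferenceFound() are made-up placeholders for your own logic:
import android.content.Context;
import android.graphics.Rect;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.View;

// Minimal sketch: a custom View that checks whether a touch lands inside a
// "difference" region. The Rect coordinates are purely illustrative.
public class DifferenceImageView extends View {

    // Hypothetical target area for one difference, in this view's coordinates.
    private final Rect targetArea = new Rect(120, 80, 220, 180);

    public DifferenceImageView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            int x = (int) event.getX();
            int y = (int) event.getY();
            if (targetArea.contains(x, y)) {
                onDifferenceFound(); // the user hit the right spot
            }
            return true; // consume the touch
        }
        return super.onTouchEvent(event);
    }

    private void onDifferenceFound() {
        // Placeholder: mark the difference as found, update the score, etc.
    }
}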
Related
I have a custom calendar layout built from multiple LinearLayouts. When I set an OnTouchListener on the parent view, it doesn't work because the parent is filled with child views, and I can't add the listener to each child view because there are far too many to set. Is there a solution to this problem?
Turns out intercepting touch events is not as simple and straightforward as the guide would have us believe. To properly handle touch events in the ViewGroup, you'll need to override both onInterceptTouchEvent() and onTouchEvent() (if you don't want your child views to get the touch events as well).
The reference docs for onInterceptTouchEvent() explain this in a much cleaner manner than the guide. Here's an article that has the relevant code along with descriptions.
TL;DR - Once you intercept the touch event in onInterceptTouchEvent() and return true, the following touch events are sent to the onTouchEvent() of your parent ViewGroup. If you return false in onInterceptTouchEvent(), the following touch events are sent to the onTouchEvent() of the child views, and you can continue to intercept them in onInterceptTouchEvent().
You need to override onInterceptTouchEvent() in your custom calendar layout and do your touch handling there. Refer to the official guide for further reference.
The onInterceptTouchEvent() method is called whenever a touch event is detected on the surface of a ViewGroup, including on the surface of its children. If onInterceptTouchEvent() returns true, the MotionEvent is intercepted, meaning it is not passed on to the child, but rather to the onTouchEvent() method of the parent.
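A rough sketch of what that could look like, assuming your calendar layout extends LinearLayout; the CalendarLayout and handleCalendarTouch names are placeholders:
import android.content.Context;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.widget.LinearLayout;

// Rough sketch: the parent steals every touch so the child views never see them.
public class CalendarLayout extends LinearLayout {

    public CalendarLayout(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public boolean onInterceptTouchEvent(MotionEvent ev) {
        // Returning true intercepts the event stream; all following events
        // are delivered to this view's onTouchEvent() instead of the children.
        return true;
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Placeholder: map event.getX()/getY() to a calendar cell here.
        handleCalendarTouch(event.getX(), event.getY());
        return true;
    }

    private void handleCalendarTouch(float x, float y) {
        // Work out which day/cell was touched.
    }
}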
I added a spaceView that doesn't cover the whole layout, and I want gesture detection to work only for that view.
I couldn't do much, but here is my code:
space = findViewById(R.id.space);
this.gDetector = new GestureDetectorCompat(this, this);
gDetector.setOnDoubleTapListener(this);
The onTouchEvent() method in Android lets you do this; you can add whatever you want to do in that event.
The official Android documentation describes onTouchEvent() as follows:
When a user places one or more fingers on the screen, this triggers the callback onTouchEvent() on the View that received the touch events. For each sequence of touch events (position, pressure, size, addition of another finger, etc.) that is ultimately identified as a gesture, onTouchEvent() is fired several times.
Here's the official link for further reference.
Hope this helps...
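To keep gesture detection limited to the space view, one option (a sketch that continues the snippet above and assumes your Activity implements the gesture listener interfaces and keeps gDetector as a field) is to feed the detector only from that view's touch events:
space = findViewById(R.id.space);
gDetector = new GestureDetectorCompat(this, this);
gDetector.setOnDoubleTapListener(this);

// Only touches that land on the space view reach this listener,
// so gestures are recognized just for that view.
space.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        return gDetector.onTouchEvent(event);
    }
});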
My (OpenGL ES 2.0) app has a splashscreen which is basically an Android View displayed over my GLSurfaceView.
I do all my initial setup on an AsyncTask and then (out of necessity) load my bitmaps/textures on the GL rendering thread in onSurfaceCreated. Once everything has loaded, I dismiss my splashscreen.
Everything works as expected apart from one thing: the splashscreen still accepts input, and if the screen is touched just after the app is launched, I (obviously) get a NullPointerException.
I simply create my view like so:
splashscreen = new SplashScreen(getApplication(), width, height); //Where SplashScreen is a custom class that extends View
I then add it to my layout like so:
layout.addView(myView); //My GLSurfaceView
layout.addView(splashscreen); //The splashscreen View
setContentView(layout);
Once everything has loaded, I simply remove the (splashscreen) View to reveal my GLSurfaceView underneath:
layout.removeView(splashscreen);
splashscreen=null; //Null it
Currently, in my onTouch method, to get around the issue I have put a check in place like so:
public boolean onTouchEvent(MotionEvent event) {
    if (splashscreen == null) {
        // Do touch related stuff here
    }
    return true;
}
This does work; however, I'm sure it's not the cleanest way to achieve what I'm after.
I've tried this:
splashscreen.setClickable(false);
and
layout.setClickable(false);
as well as setEnabled(false)
Nothing seems to work, but I'm sure there must be a way of doing this.
Generally, you will want the splash screen to prevent the user from doing anything while it is present. You can actually do this, because views in Android, as in other UI frameworks, form a hierarchy through which touch events cascade.
Even though you don't want the splash screen to be interactive, you can still steal all of the touch events that occur on it, and just do nothing with them. The answer itself is to override onTouchEvent in your SplashScreen class with:
@Override
public boolean onTouchEvent(MotionEvent event) {
    return true; // Splash screen is consuming all touch events on it.
}
Returning true here basically says that nothing under this view should receive a touch event. As long as the splash screen is visible and on top of the surface view, your surface view will not get any touch events. See here for details.
In an app I'm working on I have a class that extends SurfaceView and another that handles all the calculations to figure out where and when objects should be drawn on screen. Right now I have a somewhat hacky way of registering screen clicks: on each click, a list of objects is parsed and the closest object takes the click. This works when there are only 3 or 4 objects on screen, but with more of them the clicks register on the wrong objects.
What I'm trying to do is make the on-screen objects extend the Button class so I can just pass a MotionEvent to them when they are clicked. But because of the way my classes are set up, it's a bit confusing to me at the moment how this could be done.
Does anybody know of any tutorials that deal with extending the Button class for use with moving on-screen objects?
If I understand correctly, you're trying to determine which object drawn to the screen has been clicked?
I know this isn't exactly what you asked for, but when I've implemented a custom view in the past, I've typically handled click detection like this:
Get the x and y coordinates of the objects you want to be clickable. Then, in the onTouchEvent(MotionEvent event) method of your SurfaceView class, use an if statement that checks whether the x and y of the click fall within the bounds of your object.
Here's some code to better illustrate what I'm saying:
@Override
public boolean onTouchEvent(MotionEvent event)
{
    // If the user touches the space occupied by object1
    if (event.getAction() == MotionEvent.ACTION_DOWN
            && event.getX() <= object1.xPosition + object1.width
            && event.getX() >= object1.xPosition
            && event.getY() >= object1.yPosition
            && event.getY() < object1.yPosition + object1.height)
    {
        // The click was in the bounds of object1's current location
        object1.doSomething();
    }
    // ...check any other objects the same way
    return true;
}
Hopefully this makes sense and is somewhat helpful.
I think you're mixing things up a bit. Button is usually used in an XML layout; when you use a SurfaceView, you (most likely) need to implement your own "button".
However, it is probably possible to just overlay a FrameLayout on top of your SurfaceView and place the button in that overlay.
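For illustration, a sketch of that overlay idea built in code; GameActivity and MyGameSurfaceView are placeholder names for your own classes:
import android.app.Activity;
import android.os.Bundle;
import android.view.Gravity;
import android.view.SurfaceView;
import android.widget.Button;
import android.widget.FrameLayout;

// Sketch: FrameLayout stacks its children, so the Button ends up on top of the SurfaceView.
public class GameActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        FrameLayout root = new FrameLayout(this);

        SurfaceView gameView = new MyGameSurfaceView(this); // your existing SurfaceView subclass
        root.addView(gameView);

        Button overlayButton = new Button(this);
        overlayButton.setText("Pause");
        root.addView(overlayButton, new FrameLayout.LayoutParams(
                FrameLayout.LayoutParams.WRAP_CONTENT,
                FrameLayout.LayoutParams.WRAP_CONTENT,
                Gravity.TOP | Gravity.END));

        setContentView(root);
    }
}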
In plain Java there is MouseInputListener, which I can use to handle both of those events.
How do I do this in Android?
If I implement both listeners, only one of them is fired (OnClickListener) and not the other.
Updated:
The question is not about detecting finger movement.
I have a View (an ImageView, for example). I need to detect both a click on this view (OnClickListener) and finger movement on this view (i.e. press, move, then release the finger).
The problem is that only OnClickListener is called and the MotionEvent handler is never reached.
I need to be able to differentiate these two events, since the main event should be the finger movement and OnClickListener should just say "Don't click this view. Spin this view."
Hopefully this is clearer.
OnClickListener and OnTouchListener somewhat obstruct each other, since they both consume the MotionEvents that reach the View.
Basically, you can write a single OnTouchListener that checks for both things. You'll be supplied with the MotionEvent as an argument. Check its action via MotionEvent.getAction(), e.g. whether it equals MotionEvent.ACTION_DOWN (the user put a finger on the display). If the user releases the finger at approximately the same position (ACTION_UP), you may want to interpret that as a click; otherwise, interpret the positions you get from ACTION_MOVE events as a gesture.
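A sketch of that manual approach; the imageView variable, the spinView() call, and the 10-pixel slop threshold are assumptions for illustration:
// Distinguish a tap from a drag in a single OnTouchListener.
imageView.setOnTouchListener(new View.OnTouchListener() {
    private float downX, downY;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                downX = event.getX();
                downY = event.getY();
                return true;
            case MotionEvent.ACTION_MOVE:
                // Finger is moving: treat this as the "spin" gesture.
                spinView(event.getX() - downX, event.getY() - downY);
                return true;
            case MotionEvent.ACTION_UP:
                if (Math.abs(event.getX() - downX) < 10
                        && Math.abs(event.getY() - downY) < 10) {
                    v.performClick(); // finger barely moved: treat it as a click
                }
                return true;
            default:
                return false;
        }
    }
});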
But the framework already has a class that does this interpreting work for you: check out GestureDetector and its GestureDetector.SimpleOnGestureListener. It has callbacks for common events, e.g. onSingleTapConfirmed() and onFling(). All you need to do is supply the MotionEvents from your OnTouchListener to the GestureDetector.
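A short sketch of that approach, assuming it runs inside an Activity; imageView and spinView() are placeholders as before:
// Let a GestureDetector classify the raw touch events.
final GestureDetector detector = new GestureDetector(this,
        new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onDown(MotionEvent e) {
                return true; // must return true so the rest of the gesture is delivered
            }

            @Override
            public boolean onSingleTapConfirmed(MotionEvent e) {
                // This is the "click".
                return true;
            }

            @Override
            public boolean onScroll(MotionEvent e1, MotionEvent e2,
                                    float distanceX, float distanceY) {
                // Continuous movement: treat it as the "spin" gesture.
                spinView(distanceX, distanceY);
                return true;
            }
        });

imageView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        return detector.onTouchEvent(event);
    }
});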