I have an activity, let's say a PetDetailActivity, that shows a carousel with some Bitmaps in it (I use com.synnapps:carouselview:0.1.5 to handle the carousel). The problem is that PetDetailActivity comes up with a zero-sized carousel, presumably because the images are still being processed on another thread. How can I wait for Picasso to finish processing the URLs, and only then show the images in the new Activity?
This is the code of PetDetailActivity:
import ...

public class PetDetailActivity extends AppCompatActivity {

    @Override
    protected void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_pet_detail);

        Intent i = getIntent();
        Pet targetPet = (Pet) i.getSerializableExtra("PetObject");

        ActionBar actionBar = getSupportActionBar();
        if (actionBar != null) actionBar.setDisplayHomeAsUpEnabled(true);

        // Creating a BitmapHandler object to download images from URLs into Bitmap objects using Picasso.
        BitmapHandler bitmapHandler = new BitmapHandler(targetPet.getCarouselImageUrl());
        final ArrayList<Bitmap> petCarouselBitmaps = bitmapHandler.getProcessedBitmap();

        // The bitmaps are downloaded on another thread, so the activity comes up while
        // the CarouselView is still empty (petCarouselBitmaps.size() == 0).
        // So how do I wait until the bitmaps are processed, e.g. show a loading screen on the UI?
        CarouselView petCarousel = findViewById(R.id.petCarousel);
        petCarousel.setPageCount(petCarouselBitmaps.size());
        petCarousel.setImageListener(new ImageListener() {
            @Override
            public void setImageForPosition(int position, ImageView imageView) {
                imageView.setImageBitmap(petCarouselBitmaps.get(position));
            }
        });
    }
    ...
}
And here is the BitmapHandler class that downloads images from URLs into Bitmaps using Picasso:
public class BitmapHandler extends Thread {

    ArrayList<String> urlList;
    private ArrayList<Bitmap> loadedBitmap;

    public BitmapHandler(ArrayList<String> list) {
        this.urlList = list;
        this.loadedBitmap = new ArrayList<>();
    }

    public ArrayList<Bitmap> getProcessedBitmap() {
        this.run();
        // Returning the loaded bitmaps as an ArrayList<Bitmap> object.
        return loadedBitmap;
    }

    @Override
    public void run() {
        Target target = new Target() {
            @Override
            public void onBitmapLoaded(Bitmap bitmap, Picasso.LoadedFrom from) {
                loadedBitmap.add(bitmap);
            }

            @Override
            public void onBitmapFailed(Exception e, Drawable errorDrawable) {}

            @Override
            public void onPrepareLoad(Drawable placeHolderDrawable) {}
        };
        for (String url : urlList) {
            Picasso.get().load(url).into(target);
        }
    }
}
Thank you for any help!
Problem: How to wait for Picasso to finish processing URLs
Solution:
I think you can go with Target callback:
private Target target = new Target() {
    @Override
    public void onBitmapLoaded(Bitmap bitmap, Picasso.LoadedFrom from) {
    }

    @Override
    public void onBitmapFailed(Exception e, Drawable errorDrawable) {
    }

    @Override
    public void onPrepareLoad(Drawable placeHolderDrawable) {
    }
};
And while loading the image, you need to write:
Picasso.with(this).load(myURL).into(target);
Just for information:
onBitmapLoaded() is mostly used to perform image operations before we actually load the image into the view.
Picasso.LoadedFrom describes where the image was loaded from, whether it's MEMORY, DISK or NETWORK.
I think you can use a placeholder, and once the image is loaded it will show in the ImageView.
And if you want a delay, you can use Thread.sleep(5000).
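To tie this back to the question's carousel, here is a minimal sketch of one way to wait for every URL (my own code, assuming the Picasso 2.71828 API and the CarouselView used in the question): keep strong references to the Targets, count the completed loads, and only bind the carousel once every request has finished.

// Sketch only: Picasso holds Targets weakly, so keep strong references in a field,
// and bind the carousel after the last callback (Picasso callbacks arrive on the main thread).
private final List<Target> targets = new ArrayList<>();

private void loadCarousel(final List<String> urls, final CarouselView carousel) {
    final List<Bitmap> bitmaps = new ArrayList<>();
    final AtomicInteger remaining = new AtomicInteger(urls.size());

    for (String url : urls) {
        Target target = new Target() {
            @Override
            public void onBitmapLoaded(Bitmap bitmap, Picasso.LoadedFrom from) {
                bitmaps.add(bitmap);
                if (remaining.decrementAndGet() == 0) bindCarousel(carousel, bitmaps);
            }

            @Override
            public void onBitmapFailed(Exception e, Drawable errorDrawable) {
                if (remaining.decrementAndGet() == 0) bindCarousel(carousel, bitmaps);
            }

            @Override
            public void onPrepareLoad(Drawable placeHolderDrawable) { }
        };
        targets.add(target);
        Picasso.get().load(url).into(target);
    }
}

private void bindCarousel(CarouselView carousel, final List<Bitmap> bitmaps) {
    carousel.setImageListener(new ImageListener() {
        @Override
        public void setImageForPosition(int position, ImageView imageView) {
            imageView.setImageBitmap(bitmaps.get(position));
        }
    });
    carousel.setPageCount(bitmaps.size()); // the carousel only gets its real size here
}

bindCarousel() only runs after the last onBitmapLoaded()/onBitmapFailed() callback, so that is also the place to hide a loading indicator.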
Related
I am working on applying a live wallpaper (a GIF image). When I click the apply button, the default GIF is set as the wallpaper. I'm getting all the images from Firebase, and I want to set one of those images as the wallpaper instead. I don't know how to pass the GIF image from LiveViewActivity to GIFWallpaperService so that it is set as the live wallpaper instead of the default image. (Sorry for my bad English, I hope you understand.)
LiveWallpaperActivity.java   // main activity (where I get all the images from Firebase)
  |
  |  // pass the .gif image URL by intent to the next activity
  |
LiveViewActivity.java
  |
  |  // here I receive the image by intent and load it into an ImageView with Glide
  |
  |  // added a button to apply the live wallpaper (.gif image)
  |  // pass the .gif image to the GIFWallpaperService service (I don't know how to do this)
  |
GIFWallpaperService
LiveViewActivity
This is where I add the button to apply the live wallpaper, and from here I want to pass the .gif image to GIFWallpaperService:
public class LiveViewActivity extends AppCompatActivity {

    ImageView imageView;
    Button setLiveWallpaper;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_live_view);

        imageView = findViewById(R.id.viewImage);
        setLiveWallpaper = findViewById(R.id.setLiveWallpaper); // note: the button must be initialized before use (id assumed)
        Glide.with(this).load(getIntent().getStringExtra("images")).into(imageView);

        setLiveWallpaper.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                applyLiveWallpaper();
            }
        });
    }

    private void applyLiveWallpaper() {
        Intent intent = new Intent(
                WallpaperManager.ACTION_CHANGE_LIVE_WALLPAPER);
        intent.putExtra(WallpaperManager.EXTRA_LIVE_WALLPAPER_COMPONENT,
                new ComponentName(this, GIFWallpaperService.class));
        startActivity(intent);
    }
}
GIFWallpaperService
Here I want to receive the .gif image that I send from LiveViewActivity and set it as the live wallpaper:
public class GIFWallpaperService extends WallpaperService {

    @Override
    public WallpaperService.Engine onCreateEngine() {
        try {
            Movie movie = Movie.decodeStream(getResources().getAssets().open("sea_gif.gif")); // Here is the default gif image
            return new GIFWallpaperEngine(movie);
        } catch (IOException e) {
            Log.d("GIFWallpaperService", "Could not load live wallpaper");
            return null;
        }
    }

    private class GIFWallpaperEngine extends WallpaperService.Engine {

        private final int frameDuration = 20;
        private SurfaceHolder holder;
        private Movie movie;
        private boolean visible;
        private Handler handler;

        public GIFWallpaperEngine(Movie movie) {
            this.movie = movie;
            handler = new Handler();
        }

        @Override
        public void onCreate(SurfaceHolder surfaceHolder) {
            super.onCreate(surfaceHolder);
            this.holder = surfaceHolder;
        }

        private Runnable drawGIF = new Runnable() {
            @Override
            public void run() {
                draw();
            }
        };

        private void draw() {
            if (visible) {
                Canvas canvas = holder.lockCanvas();
                canvas.save();
                canvas.scale(4f, 4f);
                movie.draw(canvas, -100, 0);
                canvas.restore();
                holder.unlockCanvasAndPost(canvas);
                movie.setTime((int) (System.currentTimeMillis() % movie.duration()));
                handler.removeCallbacks(drawGIF);
                handler.postDelayed(drawGIF, frameDuration);
            }
        }

        @Override
        public void onVisibilityChanged(boolean visible) {
            //super.onVisibilityChanged(visible);
            this.visible = visible;
            if (visible) {
                handler.post(drawGIF);
            } else {
                handler.removeCallbacks(drawGIF);
            }
        }

        @Override
        public void onDestroy() {
            super.onDestroy();
            handler.removeCallbacks(drawGIF);
        }
    }
}
I don't know how to send and receive the .gif image between LiveViewActivity and GIFWallpaperService.
Your question is a little unclear to me, but if I understand you correctly, you can easily get the picture in LiveWallpaperActivity through the adapter via the context, and then from that activity you can pass the image to whatever activity you want by intent.
To send data from an activity to a service, you need to override onStartCommand, where you have direct access to the intent:
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
Then from LiveViewActivity you create the intent object that starts the service and put your image name inside it:
Intent serviceIntent = new Intent(context, YourService.class); // explicit intent; implicit service intents are not allowed on API 21+
serviceIntent.putExtra("IMAGE_KEY", "image");
context.startService(serviceIntent);
When the service is started, its onStartCommand() method will be called, and there you can get the image:
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
    String image = intent.getStringExtra("IMAGE_KEY");
    return START_STICKY;
}
Try using greenrobot's EventBus, which helps in passing objects, images, and in fact any data type across the whole application:
https://greenrobot.org/eventbus/
With this, you need to register the event bus when the service is created, and it will then be able to receive the image from any component of the application.
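For illustration, a sketch of what that wiring could look like (the GifWallpaperEvent class and method names below are my own assumptions, not part of the library or the question; the activity/service names follow the question):

// A simple event class carrying the image reference.
public class GifWallpaperEvent {
    public final String gifUrl;
    public GifWallpaperEvent(String gifUrl) { this.gifUrl = gifUrl; }
}

// In LiveViewActivity, when the apply button is clicked.
// postSticky() is used so the event is still delivered even if the
// wallpaper service registers after the event was posted.
EventBus.getDefault().postSticky(new GifWallpaperEvent(getIntent().getStringExtra("images")));

// In GIFWallpaperService:
@Override
public void onCreate() {
    super.onCreate();
    EventBus.getDefault().register(this);   // start receiving events
}

@Subscribe(sticky = true, threadMode = ThreadMode.MAIN)
public void onGifWallpaperEvent(GifWallpaperEvent event) {
    // event.gifUrl is the image reference sent from the activity;
    // decode this instead of the hard-coded asset.
}

@Override
public void onDestroy() {
    EventBus.getDefault().unregister(this); // avoid leaking the service
    super.onDestroy();
}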
I was looking for an answer for hours but without result.
My app uses cameraCaptureSessions.setRepeatingRequest and saves/refreshes the preview into a TextureView, privateTextureView.
I want to take a picture from the TextureView -> apply a filter using an AsyncTask -> save it to an ImageView, but I'm getting the warning "unchecked call to execute(Params...) as a member of the raw type AsyncTask".
The application crashes on the button click (when the AsyncTask executes). Where am I wrong?
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    ...
    privateTextureView = (TextureView) findViewById(R.id.textureView);
    privateTextureView.setSurfaceTextureListener(textureListener);

    myTask = new ApplyFilterTask();

    privateBtnGetCapture.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            pictureBitmap = privateTextureView.getBitmap();
            // Start MyTask
            myTask.execute(pictureBitmap);
        }
    });
}
protected class ApplyFilterTask extends AsyncTask<Bitmap, Bitmap, Bitmap> {

    @Override
    protected void onPreExecute() {
        int color = Color.parseColor("#ffa500");
        privateImageView.setColorFilter(color);
    }

    @Override
    protected Bitmap doInBackground(Bitmap... bitmaps) {
        Bitmap bmp = bitmaps[0];
        try {
            ImageFilters filter = new ImageFilters();
            bmp = filter.applyContrastEffect(bmp, 5);
            publishProgress(bmp);
        } catch (Exception e) {
            //System.out.println("Error!!!! " + e);
        }
        return bmp;
    }

    @Override
    protected void onProgressUpdate(Bitmap... bm) {
        System.out.println("TestProgress");
        privateImageView.setImageBitmap(bm[0]);
    }

    /*
    protected void onPostExecute(Bitmap... bm) {
        System.out.println("TestExecude");
        privateImageView.setImageBitmap(bm[0]);
    }*/
}
The application failure you mention seems to be caused by reusing the same AsyncTask. An AsyncTask can be executed only once. As the official documentation states:
The task can be executed only once (an exception will be thrown if a second execution is attempted.)
Your code creates a single myTask object that is reused every time in the onClick listener. Instantiate a new AsyncTask every time to avoid the crash, as shown below.
privateBtnGetCapture.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        pictureBitmap = privateTextureView.getBitmap();
        AsyncTask<Bitmap, Bitmap, Bitmap> myTask = new ApplyFilterTask();
        // Start MyTask
        myTask.execute(pictureBitmap);
    }
});
The warning from the IDE is, however, a separate issue that has to do with the declaration of myTask. You seem to have declared myTask as a raw AsyncTask and not as AsyncTask<Bitmap, Bitmap, Bitmap> (or ApplyFilterTask). Change the declaration and the warning will disappear.
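For illustration only (the original declaration is not shown in the question, so this is an assumption about what it could look like after the change):

// Typed declaration: the compiler now knows the Params type of execute(),
// so the "unchecked call to execute(Params...)" warning goes away.
private ApplyFilterTask myTask;
// or equivalently:
// private AsyncTask<Bitmap, Bitmap, Bitmap> myTask;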
Instead of using a single instance of ApplyFilterTask for all your tasks, you can create a new instance every time you click privateBtnGetCapture, which can be done easily this way:
privateBtnGetCapture.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        pictureBitmap = privateTextureView.getBitmap();
        // apply the filter this way
        new ApplyFilterTask().execute(pictureBitmap);
    }
});
Hi guys, I'm trying to implement an AsyncTask in my project. Android Studio shows no errors, and I have debugged to check whether the bitmaps are downloaded, and they are. I don't know what the problem is, so here is my code.
My Activity:
public class MainActivity extends AppCompatActivity {

    public ImageView imageView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        imageView = findViewById(R.id.justAnImage);

        String[] strings = new String[2];
        strings[0] = "blah.com/sds.jpg";
        strings[1] = "blah.com/sds2.jpg";

        new AsyncDownloader(this, imageView).execute(strings);
    }

    public void StartAnimation(ImageView imageView, Bitmap[] bitmaps) {
        AnimationDrawable animation = new AnimationDrawable();
        for (int i = 0; i < bitmaps.length; i++) // replaced erroneous code -> (int i=0; i>=bitmaps.length; i++)
        {
            animation.addFrame(new BitmapDrawable(getApplicationContext().getResources(), bitmaps[i]), 1000);
        }
        animation.setOneShot(false);
        imageView.setBackground(animation);
        // start the animation!
        animation.start();
    }
}
My AsyncDownloader class:
public class AsyncDownloader extends AsyncTask<String[], Void, Bitmap[]> {

    private Bitmap[] bitmapArray;
    private ImageView imageView;
    MainActivity mainActivity;

    public AsyncDownloader(MainActivity mainActivity, ImageView imageView) {
        this.imageView = imageView;
        bitmapArray = new Bitmap[2];
        this.mainActivity = mainActivity;
    }

    @Override
    protected Bitmap[] doInBackground(String[]... strings) {
        for (int i = 1; i >= 0; i--) {
            try {
                URL url = new URL(strings[0][i]);
                bitmapArray[i] = BitmapFactory.decodeStream(url.openConnection().getInputStream());
            } catch (Exception e) {
                Log.e("DOWNLOAD ASYNC", e.getMessage());
            }
        }
        return bitmapArray;
    }

    @Override
    protected void onPostExecute(Bitmap[] bitmapArray) {
        super.onPostExecute(bitmapArray);
        //imageView.setImageBitmap(bitmapArray[0]);
        mainActivity.StartAnimation(imageView, bitmapArray);
    }
}
I can't find the problem here; debugging is so terrible in Android Studio. If anyone could help, I would be thankful.
EDIT 1: After suggestions from fellow Stack Overflow users, it is clear that the AnimationDrawable is not working as desired, and no image shows up in the ImageView control.
EDIT 2: The code works after correcting the loop mentioned in the answer. I also found an alternative that displays the images at a fixed interval. The code goes like this:
public void StartAnimation(final ImageView imageView, final Bitmap[] bitmaps) {
    handler = new Handler();
    Runnable runnable = new Runnable() {
        int i = 0;

        public void run() {
            imageView.setImageBitmap(bitmaps[i]);
            i++;
            if (i > bitmaps.length - 1) {
                i = 0;
            }
            handler.postDelayed(this, 1200);
        }
    };
    handler.postDelayed(runnable, 1200);
}
Your code is fine, except for the loop condition inside the StartAnimation method.
The issue is that the loop condition is always false.
To fix the issue, change
From
for (int i = 0; i >= bitmaps.length; i++)
To
for (int i = 0; i < bitmaps.length; i++)
Use the Fresco, Picasso, or Glide library.
These libraries handle caching and memory management automatically.
It's easy to work with Picasso and Glide, but Fresco offers more features to developers.
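For example, a minimal Glide call could look like this (the URL and the placeholder drawable are my own placeholders, not from the question):

// Glide downloads, decodes and caches the image off the main thread,
// then sets it on the ImageView once it is ready.
Glide.with(this)
        .load("https://example.com/image.jpg")   // hypothetical URL
        .placeholder(R.drawable.placeholder)     // assumed drawable resource shown while loading
        .into(imageView);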
I have made an ImageView animate from one side of the screen to the other. Here is the Java code:
public class MainActivity extends AppCompatActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        final ImageView imageView = findViewById(R.id.imageView);
        Button button = findViewById(R.id.button);
        button.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                handleAnimation(imageView);
            }
        });
    }

    public void handleAnimation(View view) {
        ObjectAnimator animatorX = ObjectAnimator.ofFloat(view, "x", 1000f);
        animatorX.setDuration(2000);
        animatorX.start();
    }
}
And this is what we see when the user clicks the ANIMATE button:
Now my question is: how can I make a video file by capturing the animated ImageView?
EDIT:
What I need is: I want to make an app that takes some photos from the user, applies some animations and effects to them, mixes them with a chosen sound, and at the end exports a video clip. And, if possible, I would rather keep all of this hidden from the user.
You have to record your screen and then crop the video using your view's x/y coordinates. You can record the screen using the MediaProjection API on Android 5 (Lollipop) and above.
private VirtualDisplay mVirtualDisplay;
private MediaRecorder mMediaRecorder;
private MediaProjection mMediaProjection;
private MediaProjectionCallback callback;

MediaProjectionManager projectionManager = (MediaProjectionManager)
        context.getSystemService(Context.MEDIA_PROJECTION_SERVICE);

mMediaProjection.registerCallback(callback, null);
initRecorder();
mMediaRecorder.prepare();
mVirtualDisplay = createVirtualDisplay();
mMediaRecorder.start();

public void initRecorder() {
    path = "/sdcard/Record/video" + ".mp4";
    recId = "capture-" + System.currentTimeMillis() + ".mp4";
    File myDirectory = new File(Environment.getExternalStorageDirectory(), "Record");

    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
    mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    mMediaRecorder.setVideoEncodingBitRate(MainFragment.bitRate);
    mMediaRecorder.setVideoFrameRate(30);
    mMediaRecorder.setVideoSize(MainFragment.DISPLAY_WIDTH,
            MainFragment.DISPLAY_HEIGHT);
    mMediaRecorder.setOutputFile(path);
}

private VirtualDisplay createVirtualDisplay() {
    return mMediaProjection.createVirtualDisplay("MainActivity",
            MainFragment.DISPLAY_WIDTH, MainFragment.DISPLAY_HEIGHT, MainFragment.screenDensity,
            DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
            mMediaRecorder.getSurface(), null /*Callbacks*/, null /*Handler*/);
}

public class MediaProjectionCallback extends MediaProjection.Callback {
    @Override
    public void onStop() {
        mMediaRecorder.stop();
        // mMediaRecorder.reset();
        mMediaRecorder.release();
        mMediaProjection.unregisterCallback(callback);
        mMediaProjection = null;
        mMediaRecorder = null;
    }
}
Once done, simply call mMediaProjection.stop() to finish the recording and save the video as a temporary file.
After that you can crop the video at the x/y coordinates where your view is positioned, using FFmpeg:
ffmpeg -i in.mp4 -filter:v "crop=out_w:out_h:x:y" out.mp4
Where the options are as follows:
out_w is the width of the output rectangle
out_h is the height of the output rectangle
x and y specify the top left corner of the output rectangle
So in your case:
String cmd = "-i '" + tmpVideoPath + "' -filter:v " + "'crop=" + view.getWidth() + ":" + view.getHeight() + ":" + view.getX() + ":" + view.getY() + "'" + " -c:a copy " + outVideoPath;
FFmpeg ffmpeg = FFmpeg.getInstance(context);
// to execute "ffmpeg -version" command you just need to pass "-version"
ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
    @Override
    public void onStart() {}

    @Override
    public void onProgress(String message) {}

    @Override
    public void onFailure(String message) {}

    @Override
    public void onSuccess(String message) {}

    @Override
    public void onFinish() {}
});
There are two possible approaches to achieve this.
1- You can achieve this by using the javacv library (FFmpeg) to combine a set of bitmaps taken from the view:
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("/sdcard/test.mp4", 256, 256);
try {
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
    recorder.setFormat("mp4");
    recorder.setFrameRate(30);
    recorder.setPixelFormat(avutil.PIX_FMT_YUV420P10);
    recorder.setVideoBitrate(1200);
    recorder.startUnsafe();

    AndroidFrameConverter converter = new AndroidFrameConverter(); // record() expects a Frame, not a Bitmap
    for (int i = 0; i < 5; i++) {
        view.setDrawingCacheEnabled(true);
        Bitmap bitmap = Bitmap.createBitmap(view.getDrawingCache());
        view.setDrawingCacheEnabled(false);
        recorder.record(converter.convert(bitmap));
    }
    recorder.stop();
} catch (Exception e) {
    e.printStackTrace();
}
All the code for using this library is here.
2- You can use this link to record the screen and adapt it to your needs:
Screen Recorder
I have a fragment that I call again and again, and it uses Picasso to load an image from a URL. When I pop back to the same fragment with a different URL to load the image from, the previous image is shown until the new image has loaded. How can I solve this problem? Here is my code:
Picasso.with(context)
        .load(url)
        .networkPolicy(NetworkPolicy.NO_CACHE)
        .into(ivParallex, new com.squareup.picasso.Callback() {
            @Override
            public void onSuccess() {
                getProductDetail(productId);
            }

            @Override
            public void onError() {
            }
        });
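One possible approach, sketched under assumptions (the drawable ids below are hypothetical, and this is not necessarily the only fix): clear the ImageView and give Picasso a placeholder before starting the new request, so the stale image is never shown while the new one loads.

// Sketch: drop the previously loaded bitmap immediately, then show a placeholder
// while the new image loads (R.drawable.loading_placeholder / R.drawable.load_error are assumed ids).
ivParallex.setImageDrawable(null);

Picasso.with(context)
        .load(url)
        .networkPolicy(NetworkPolicy.NO_CACHE)
        .placeholder(R.drawable.loading_placeholder)
        .error(R.drawable.load_error)
        .into(ivParallex, new com.squareup.picasso.Callback() {
            @Override
            public void onSuccess() {
                getProductDetail(productId);
            }

            @Override
            public void onError() {
            }
        });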