I'm trying to open a gallery with images/videos from a specific folder. I'm using this solution, but I'm getting the error below and nothing happens. I guess it's something about the Uri, but I can't find a solution. Does anyone have an idea how to solve this? I also included "my" code.
03-15 16:30:53.733 21902-22775/de.comidos.fotoapp D/onScanCompleted: Scan completed: content://media/external/images/media/1730
03-15 16:30:53.752 21902-22775/de.comidos.fotoapp D/Instrumentation: checkStartActivityResult() : Intent { act=android.intent.action.VIEW dat=content://media/external/images/media/1730 launchParam=MultiScreenLaunchParams { mDisplayId=0 mFlags=0 } }
03-15 16:30:53.773 21902-22775/de.comidos.fotoapp W/Binder: Binder call failed.
android.content.ActivityNotFoundException: No Activity found to handle Intent { act=android.intent.action.VIEW dat=content://media/external/images/media/1730 launchParam=MultiScreenLaunchParams { mDisplayId=0 mFlags=0 } }
at android.app.Instrumentation.checkStartActivityResult(Instrumentation.java:1839)
at android.app.Instrumentation.execStartActivity(Instrumentation.java:1531)
at android.app.Activity.startActivityForResult(Activity.java:4389)
at android.app.Activity.startActivityForResult(Activity.java:4348)
at android.app.Activity.startActivity(Activity.java:4672)
at android.app.Activity.startActivity(Activity.java:4640)
at de.comidos.fotoapp.GalleryViewActivity.onScanCompleted(GalleryViewActivity.java:59)
at android.media.MediaScannerConnection$1.scanCompleted(MediaScannerConnection.java:55)
at android.media.IMediaScannerListener$Stub.onTransact(IMediaScannerListener.java:60)
at android.os.Binder.execTransact(Binder.java:573)
package de.comidos.fotoapp;
import android.app.Activity;
import android.content.Intent;
import android.media.MediaScannerConnection;
import android.net.Uri;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import java.io.File;
public class GalleryViewActivity extends Activity implements MediaScannerConnection.MediaScannerConnectionClient {
public String[] allFiles;
private String SCAN_PATH ;
private static final String FILE_TYPE = "*/*";
private MediaScannerConnection conn;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_gallery);
File folder = new File(Environment.getExternalStorageDirectory().toString()+"/comidos/sent/");
allFiles = folder.list();
// uriAllFiles= new Uri[allFiles.length];
for(int i=0;i<allFiles.length;i++)
{
Log.d("all file path"+i, allFiles[i]+allFiles.length);
}
// Uri uri= Uri.fromFile(new File(Environment.getExternalStorageDirectory().toString()+"/yourfoldername/"+allFiles[0]));
SCAN_PATH= Environment.getExternalStorageDirectory().toString()+"/comidos/sent/"+allFiles[0];
Log.d("SCAN PATH", "Scan Path " + SCAN_PATH);
}
private void startScan()
{
Log.d("Connected","success"+conn);
if(conn!=null)
{
conn.disconnect();
}
conn = new MediaScannerConnection(this,this);
conn.connect();
}
@Override
public void onMediaScannerConnected() {
Log.d("onMediaScannerConnected","success"+conn);
conn.scanFile(SCAN_PATH, FILE_TYPE);
}
@Override
public void onScanCompleted(String path, Uri uri) {
try {
Log.d("onScanCompleted","Scan completed: "+uri );
if (uri != null)
{
Intent intent = new Intent(Intent.ACTION_VIEW);
intent.setData(uri);
startActivity(intent);
}
} finally
{
conn.disconnect();
conn = null;
}
}
@Override
public void onResume(){
super.onResume();
startScan();
}
}
There is no activity on your device that supports ACTION_VIEW of a content Uri for whatever MIME type that content is. There is no requirement for an Android device to have an ACTION_VIEW activity for every possible piece of content.
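One way to avoid the crash (a minimal sketch; the MIME type and log tag here are assumptions, not taken from the question) is to give the intent an explicit type and check that some activity can resolve it before calling startActivity():
Intent viewIntent = new Intent(Intent.ACTION_VIEW);
viewIntent.setDataAndType(uri, "image/*"); // assumption: the scanned file is an image; use "video/*" for videos
if (viewIntent.resolveActivity(getPackageManager()) != null) {
    startActivity(viewIntent);
} else {
    Log.w("GalleryView", "No activity can display " + uri);
}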
I am getting a failure when trying to open an Android activity while the application is closed. As you can see in the code below, when I receive a data notification from Firebase while the app is in the background, I should open an activity using a MethodChannel to reach the Java side, but I get this error:
No implementation found for method openActivity on channel com.example.service/start
Application.java
package com.example.mobile;
import io.flutter.app.FlutterApplication;
import io.flutter.plugin.common.PluginRegistry;
import io.flutter.plugin.common.PluginRegistry.PluginRegistrantCallback;
import io.flutter.plugins.firebasemessaging.FirebaseMessagingPlugin;
import io.flutter.plugins.firebasemessaging.FlutterFirebaseMessagingService;
public class Application extends FlutterApplication implements PluginRegistrantCallback {
@Override
public void onCreate() {
super.onCreate();
FlutterFirebaseMessagingService.setPluginRegistrant(this);
}
@Override
public void registerWith(PluginRegistry registry) {
FirebaseMessagingPlugin.registerWith(registry.registrarFor("io.flutter.plugins.firebasemessaging.FirebaseMessagingPlugin"));
}
}
AndroidManifest.xml
<application
android:name="com.example.mobile.Application"
android:label="mobile"
android:icon="#mipmap/ic_launcher">
MainActivity.java
package com.example.mobile;
import android.content.Intent;
import androidx.annotation.NonNull;
import io.flutter.embedding.engine.FlutterEngine;
import io.flutter.plugin.common.MethodChannel;
import io.flutter.plugins.GeneratedPluginRegistrant;
import io.flutter.embedding.android.FlutterActivity;
import io.flutter.plugins.firebasemessaging.FirebaseMessagingPlugin;
public class MainActivity extends FlutterActivity {
private static final String CHANNEL = "com.example.service/start";
@Override
public void configureFlutterEngine(@NonNull FlutterEngine flutterEngine) {
GeneratedPluginRegistrant.registerWith(flutterEngine);
new MethodChannel(flutterEngine.getDartExecutor(), CHANNEL)
.setMethodCallHandler(
(call, result) -> {
if(call.method.equals("openActivity")){
openActivity();
result.success("open activity");
}
}
);
}
void openActivity(){
Intent i = new Intent(this, SecondActivity.class);
startActivity(i);
}
}
main.dart
_firebaseMessaging.configure(
onMessage: (message) async {
//
},
onLaunch: (message) {
//
},
onResume: (message) {
//
},
onBackgroundMessage: myBackgroundMessageHandler,
);
Future<dynamic> myBackgroundMessageHandler(Map<String, dynamic> message) async {
MethodChannel channel = new MethodChannel("com.example.service/start");
if (message.containsKey('data')) {
final dynamic data = message['data'];
var open = await channel.invokeMethod("openActivity");
}
}
Where am I going wrong, and how can I make it work?
In your AndroidManifest.xml file the android:name must be android:name=".Application". Also, make sure that MainActivity.java and Application.java are in the same folder.
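If it helps, a minimal sketch of what that application entry would look like (based on the manifest shown above, with only the name changed):
<application
    android:name=".Application"
    android:label="mobile"
    android:icon="@mipmap/ic_launcher">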
I have a problem sending an image from an Android application to an API. I am getting a "java.lang.RuntimeException: An error occurred while executing doInBackground()" error. Please advise me on what to do. Thanks.
RecognizeConceptsActivity.java
package com.example.statistic.api.v2.activity;
import android.content.Intent;
import android.graphics.BitmapFactory;
import android.os.AsyncTask;
import android.os.Bundle;
import android.support.annotation.NonNull;
import android.support.annotation.Nullable;
import android.support.annotation.StringRes;
import android.support.design.widget.Snackbar;
import android.support.v7.widget.LinearLayoutManager;
import android.support.v7.widget.RecyclerView;
import android.view.View;
import android.widget.ImageView;
import android.widget.ViewSwitcher;
import com.example.statistic.R;
import com.example.statistic.api.v2.App;
import com.example.statistic.api.v2.ClarifaiUtil;
import com.example.statistic.api.v2.adapter.RecognizeConceptsAdapter;
import java.util.Collections;
import java.util.List;
import butterknife.BindView;
import butterknife.OnClick;
import clarifai2.api.ClarifaiResponse;
import clarifai2.dto.input.ClarifaiImage;
import clarifai2.dto.input.ClarifaiInput;
import clarifai2.dto.model.ConceptModel;
import clarifai2.dto.model.output.ClarifaiOutput;
import clarifai2.dto.prediction.Concept;
import static android.view.View.GONE;
import static android.view.View.VISIBLE;
public final class RecognizeConceptsActivity extends BaseActivity {
public static final int PICK_IMAGE = 100;
// the list of results that were returned from the API
@BindView(R.id.resultsList)
RecyclerView resultsList;
// the view where the image the user selected is displayed
@BindView(R.id.image2)
ImageView imageView;
// switches between the text prompting the user to hit the FAB, and the loading spinner
@BindView(R.id.switcher)
ViewSwitcher switcher;
// the FAB that the user clicks to select an image
@BindView(R.id.fab)
View fab;
@NonNull
private final RecognizeConceptsAdapter adapter = new RecognizeConceptsAdapter();
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
}
@Override
protected void onStart() {
super.onStart();
resultsList.setLayoutManager(new LinearLayoutManager(this));
resultsList.setAdapter(adapter);
}
@OnClick(R.id.fab)
void pickImage() {
startActivityForResult(new Intent(Intent.ACTION_PICK).setType("image/*"), PICK_IMAGE);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (resultCode != RESULT_OK) {
return;
}
switch(requestCode) {
case PICK_IMAGE:
final byte[] imageBytes = ClarifaiUtil.retrieveSelectedImage(this, data);
if (imageBytes != null) {
onImagePicked(imageBytes);
}
break;
}
}
private void onImagePicked(@NonNull final byte[] imageBytes) {
// Now we will upload our image to the Clarifai API
setBusy(true);
// Make sure we don't show a list of old concepts while the image is being uploaded
adapter.setData(Collections.<Concept>emptyList());
new AsyncTask<Void, Void, ClarifaiResponse<List<ClarifaiOutput<Concept>>>>() {
@Override
protected ClarifaiResponse<List<ClarifaiOutput<Concept>>> doInBackground(Void... params) {
// The default Clarifai model that identifies concepts in images
final ConceptModel generalModel = App.get().clarifaiClient().getDefaultModels().generalModel();
// Use this model to predict, with the image that the user just selected as the input
return generalModel.predict()
.withInputs(ClarifaiInput.forImage(ClarifaiImage.of(imageBytes)))
.executeSync();
}
@Override
protected void onPostExecute(ClarifaiResponse<List<ClarifaiOutput<Concept>>> response) {
setBusy(false);
if (!response.isSuccessful()) {
showErrorSnackbar(R.string.error_while_contacting_api);
return;
}
final List<ClarifaiOutput<Concept>> predictions = response.get();
if (predictions.isEmpty()) {
showErrorSnackbar(R.string.no_results_from_api);
return;
}
adapter.setData(predictions.get(0).data());
imageView.setImageBitmap(BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length));
}
private void showErrorSnackbar(@StringRes int errorString) {
Snackbar.make(
root,
errorString,
Snackbar.LENGTH_INDEFINITE
).show();
}
}.execute();
}
#Override
protected int layoutRes() { return R.layout.activity_recognize; }
private void setBusy(final boolean busy) {
runOnUiThread(new Runnable() {
@Override
public void run() {
switcher.setDisplayedChild(busy ? 1 : 0);
imageView.setVisibility(busy ? GONE : VISIBLE);
fab.setEnabled(!busy);
}
});
}
}
App.java
import android.app.Application;
import android.support.annotation.NonNull;
import android.support.annotation.Nullable;
import java.util.concurrent.TimeUnit;
import clarifai2.api.ClarifaiBuilder;
import clarifai2.api.ClarifaiClient;
import okhttp3.OkHttpClient;
import okhttp3.logging.HttpLoggingInterceptor;
import timber.log.Timber;
public class App extends Application {
// In a real app, rather than attaching singletons (such as the API client instance) to your Application instance,
// it's recommended that you use something like Dagger 2, and inject your client instance.
// Since that would be a distraction here, we will just use a regular singleton.
private static App INSTANCE;
@NonNull
public static App get() {
final App instance = INSTANCE;
if (instance == null) {
throw new IllegalStateException("App has not been created yet!");
}
return instance;
}
@Nullable
private ClarifaiClient client;
@Override
public void onCreate() {
INSTANCE = this;
client = new ClarifaiBuilder(getString(R.string.clarifai_api_key))
// Optionally customize HTTP client via a custom OkHttp instance
.client(new OkHttpClient.Builder()
.readTimeout(30, TimeUnit.SECONDS) // Increase timeout for poor mobile networks
// Log all incoming and outgoing data
// NOTE: You will not want to use the BODY log-level in production, as it will leak your API request details
// to the (publicly-viewable) Android log
.addInterceptor(new HttpLoggingInterceptor(new HttpLoggingInterceptor.Logger() {
@Override
public void log(String logString) {
Timber.e(logString);
}
}).setLevel(HttpLoggingInterceptor.Level.BODY))
.build()
)
.buildSync(); // use build() instead to get a Future<ClarifaiClient>, if you don't want to block this thread
super.onCreate();
// Initialize our logging
Timber.plant(new Timber.DebugTree());
}
@NonNull
public ClarifaiClient clarifaiClient() {
final ClarifaiClient client = this.client;
if (client == null) {
throw new IllegalStateException("Cannot use Clarifai client before initialized");
}
return client;
}
}
This is the logcat error
E/AndroidRuntime: FATAL EXCEPTION: AsyncTask #1
Process: com.example.statistic, PID: 3451
java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:325)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
at java.lang.Thread.run(Thread.java:761)
Caused by: java.lang.IllegalStateException: App has not been created yet!
at com.example.statistic.api.v2.App.get(App.java:28)
at com.example.statistic.api.v2.activity.RecognizeConceptsActivity$1.doInBackground(RecognizeConceptsActivity.java:105)
at com.example.statistic.api.v2.activity.RecognizeConceptsActivity$1.doInBackground(RecognizeConceptsActivity.java:101)
at android.os.AsyncTask$2.call(AsyncTask.java:305)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
at java.lang.Thread.run(Thread.java:761)
Please advise me on what to do. Thanks.
On most Android devices, the RecognitionService will be supplied by Google's native 'Now/Assistant' application.
Up until Android Oreo, I was able to query the languages supported by the Google Recognizer with the following simple code:
final Intent vrIntent = new Intent(RecognizerIntent.ACTION_GET_LANGUAGE_DETAILS);
// vrIntent.setPackage("com.google.android.googlequicksearchbox");
getContext().sendOrderedBroadcast(vrIntent, null, new BroadcastReceiver() {
@Override
public void onReceive(final Context context, final Intent intent) {
// final Bundle bundle = intent.getExtras();
final Bundle bundle = getResultExtras(true);
if (bundle != null) {
if (bundle.containsKey(RecognizerIntent.EXTRA_SUPPORTED_LANGUAGES)) {
Log.i("TAG", "onReceive: EXTRA_SUPPORTED_LANGUAGES present");
final ArrayList<String> vrStringLocales = bundle.getStringArrayList(
RecognizerIntent.EXTRA_SUPPORTED_LANGUAGES);
Log.i("TAG", "onReceive: EXTRA_SUPPORTED_LANGUAGES size: " + vrStringLocales.size());
} else {
Log.w("TAG", "onReceive: missing EXTRA_SUPPORTED_LANGUAGES");
}
} else {
Log.w("TAG", "onReceive: Bundle null");
}
}
}, null, 1234, null, null);
However, on 8.0+ the extra RecognizerIntent.EXTRA_SUPPORTED_LANGUAGES is no longer contained in the response.
Before I attempt to file this as a bug, I wanted to first see if others could replicate it, and also to check whether there has been an ordered-broadcast behavioural change in API 26 that I've somehow overlooked, which could be the cause of this.
Thanks in advance.
So, I couldn't replicate it, but further to the comments: if you don't set the package name
vrIntent.setPackage("com.google.android.googlequicksearchbox");
then it fails; otherwise it all works fine for me.
This is the basic activity I used to test it.
package it.versionestabile.stackover001;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageInfo;
import android.content.pm.PackageManager;
import android.speech.RecognizerIntent;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import java.util.ArrayList;
import static java.security.AccessController.getContext;
/**
* https://stackoverflow.com/questions/48500077/recognizerintent-action-get-language-details-in-oreo
*/
public class MainActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
final Intent vrIntent = new Intent(RecognizerIntent.ACTION_GET_LANGUAGE_DETAILS);
vrIntent.setPackage("com.google.android.googlequicksearchbox");
PackageManager packageManager = getPackageManager();
for (PackageInfo packageInfo: packageManager.getInstalledPackages(0)) {
if (packageInfo.packageName.contains("com.google.android.googlequicksearchbox"))
Log.d("AAA", packageInfo.packageName + ", " + packageInfo.versionName);
}
this.sendOrderedBroadcast(vrIntent, null, new BroadcastReceiver() {
@Override
public void onReceive(final Context context, final Intent intent) {
// final Bundle bundle = intent.getExtras();
final Bundle bundle = getResultExtras(true);
if (bundle != null) {
if (bundle.containsKey(RecognizerIntent.EXTRA_SUPPORTED_LANGUAGES)) {
Log.i("TAG", "onReceive: EXTRA_SUPPORTED_LANGUAGES present");
final ArrayList<String> vrStringLocales = bundle.getStringArrayList(
RecognizerIntent.EXTRA_SUPPORTED_LANGUAGES);
Log.i("TAG", "onReceive: EXTRA_SUPPORTED_LANGUAGES size: " + vrStringLocales.size());
} else {
Log.w("TAG", "onReceive: missing EXTRA_SUPPORTED_LANGUAGES");
}
} else {
Log.w("TAG", "onReceive: Bundle null");
}
}
}, null, 1234, null, null);
}
}
I've tested it on both Android Studio 2.3 and 3.0.1, and on emulators with API 26 and 27.
Everything works fine with the above code.
But if you comment out this line:
vrIntent.setPackage("com.google.android.googlequicksearchbox");
on Oreo it doesn't work.
And I still suggest checking for the presence of Google Now with the PackageManager, in a way like this, in order to decide whether you have the right version of Google Now:
PackageManager packageManager = getPackageManager();
for (PackageInfo packageInfo: packageManager.getInstalledPackages(0)) {
if (packageInfo.packageName.contains("com.google.android.googlequicksearchbox"))
Log.d("AAA", packageInfo.packageName + ", " + packageInfo.versionName);
// TODO - set a boolean value to record the presence of Google Now
}
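A minimal sketch of that boolean check (the helper name is made up here; it uses PackageManager.getPackageInfo instead of looping over every installed package):
private boolean isGoogleSearchAppInstalled() {
    try {
        // Throws NameNotFoundException if the Google app is not installed
        getPackageManager().getPackageInfo("com.google.android.googlequicksearchbox", 0);
        return true;
    } catch (PackageManager.NameNotFoundException e) {
        return false;
    }
}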
Hope it helps!
This is my first app using a Service. I'm getting java.lang.RuntimeException: Unable to start service: java.lang.NullPointerException. My intention is to send an SMS while the device is in a suspended state.
SENDSMS.java
package com.qualcomm.sendsms;
import android.os.Bundle;
import android.app.Activity;
import android.content.Intent;
import android.util.Log;
import android.view.Menu;
public class SENDSMS extends Activity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_sendsms);
String phone_num = null;
String sms = null;
String sleep_time = null;
Bundle extras = getIntent().getExtras();
if (extras != null) {
phone_num = extras.getString("Phone_Number");
Log.e("????????????????SEND_SMS", "phno : "+phone_num);
sms = extras.getString("SMS_Body");
Log.e("????????????????SEND_SMS", "sms : "+sms);
sleep_time = extras.getString("Sleep_Time");
Log.e("????????????????Sleep_Time", "sleep_time : "+sleep_time);
Intent myIntent = new Intent(this, sendservicesms.class);
myIntent.putExtra("Phone_Number",phone_num);
myIntent.putExtra("SMS_Body",sms);
myIntent.putExtra("Sleep_Time",sleep_time);
startService(myIntent);
}
finish();
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.sendsm, menu);
return true;
}
}
Service : sendservicesms.java
package com.qualcomm.sendsms;
import android.app.IntentService;
import android.content.Intent;
import android.os.Bundle;
import android.os.IBinder;
import android.telephony.SmsManager;
import android.util.Log;
import android.widget.Toast;
public class sendservicesms extends IntentService {
int mStartMode; // indicates how to behave if the service is killed
IBinder mBinder; // interface for clients that bind
boolean mAllowRebind; // indicates whether onRebind should be used
public sendservicesms() {
super("sendservicesms");
}
public void onCreate() {
// The service is being created
}
@Override
protected void onHandleIntent(Intent intent) {
// The service is starting, due to a call to startService()
if(intent!=null) {
Bundle param = intent.getExtras();
if (param != null) {
String phone_no = (String)param.get("Phone_Number");
String sms_body = (String)param.get("SMS_Body");
String sleeptime = (String)param.get("Sleep_Time");
Log.e("????????????????SEND_SMS", "phno : "+phone_no);
Log.e("????????????????SEND_SMS", "sms : "+sms_body);
Log.e("????????????????Sleep_Time", "sleep_time : "+sleeptime);
Long time = Long.parseLong(sleeptime);
Log.e("????????????????time long", "Long_time : "+time);
try {
if(sleeptime!=null && sleeptime.length() > 0){
Thread.sleep(Long.parseLong(sleeptime));
}
Log.e("????????????????Sleep happened well", "sleep_time : "+sleeptime);
SmsManager smsManager = SmsManager.getDefault();
smsManager.sendTextMessage(phone_no, null, sms_body, null, null);
Toast.makeText(getApplicationContext(), "SMS Sent!",
Toast.LENGTH_LONG).show();
} catch (Exception e) {
Toast.makeText(getApplicationContext(),
"SMS faild, please try again later!",
Toast.LENGTH_LONG).show();
e.printStackTrace();
}
}
}
}
@Override
public IBinder onBind(Intent intent) {
// A client is binding to the service with bindService()
return mBinder;
}
@Override
public boolean onUnbind(Intent intent) {
// All clients have unbound with unbindService()
return mAllowRebind;
}
@Override
public void onRebind(Intent intent) {
// A client is binding to the service with bindService(),
// after onUnbind() has already been called
}
@Override
public void onDestroy() {
// The service is no longer used and is being destroyed
}
}
LOGCAT:
E/AndroidRuntime(10276): FATAL EXCEPTION: main
E/AndroidRuntime(10276): java.lang.RuntimeException: Unable to start service com.qualcomm.sendsms.sendservicesms@418dd170 with Intent { cmp=com.qualcomm.sendsms/.sendservicesms (has extras) }: java.lang.NullPointerException
E/AndroidRuntime(10276): at android.app.ActivityThread.handleServiceArgs(ActivityThread.java:2676)
E/AndroidRuntime(10276): at android.app.ActivityThread.access$1900(ActivityThread.java:144)
E/AndroidRuntime(10276): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1334)
E/AndroidRuntime(10276): at android.os.Handler.dispatchMessage(Handler.java:99)
E/AndroidRuntime(10276): at android.os.Looper.loop(Looper.java:137)
E/AndroidRuntime(10276): at android.app.ActivityThread.main(ActivityThread.java:5074)
E/AndroidRuntime(10276): at java.lang.reflect.Method.invokeNative(Native Method)
E/AndroidRuntime(10276): at java.lang.reflect.Method.invoke(Method.java:511)
E/AndroidRuntime(10276): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:793)
E/AndroidRuntime(10276): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:560)
E/AndroidRuntime(10276): at dalvik.system.NativeStart.main(Native Method)
E/AndroidRuntime(10276): Caused by: java.lang.NullPointerException
E/AndroidRuntime(10276): at android.app.IntentService.onStart(IntentService.java:116)
E/AndroidRuntime(10276): at android.app.IntentService.onStartCommand(IntentService.java:130)
E/AndroidRuntime(10276): at android.app.ActivityThread.handleServiceArgs(ActivityThread.java:2659)
You must call super.onCreate() as the first line inside onCreate() if you override it, because the system needs to be prepared before your own onCreate() implementation runs.
Your program will run fine if you do that:
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// Your rest of codes
}
When you remove onCreate() from your Activity, the system will do that itself for you. :)
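Applied to the IntentService in the question, a minimal sketch of the fix would be:
@Override
public void onCreate() {
    // Without this call, IntentService never creates its worker thread and handler,
    // which is why onStart() throws a NullPointerException.
    super.onCreate();
}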
Whenever there is a NullPointerException it is often due to XML: some element may not be matching in your XML file.
To send an SMS you may try the following code:
Intent sendIntent = new Intent(Intent.ACTION_VIEW);
sendIntent.putExtra("sms_body", sms_text_string);
sendIntent.setType("vnd.android-dir/mms-sms");
startActivity(sendIntent);
I noticed the default camera activity I call on a Droid X looks different from the one on my Droid and Nexus One. After selecting "OK" on the Droid and Nexus One, the activity would finish; the Droid X has a "Done" button (which takes you back to the camera instead of finishing the activity), and the only way to get to the screen I want is to hit the "Back" button.
Here is the class that works on Android 2.2/2.3, but not on Droid X:
package com.android.xxx;
import java.io.File;
import android.content.Intent;
import android.net.Uri;
import android.os.Bundle;
import android.os.Environment;
import android.provider.MediaStore;
import android.view.Window;
public class CameraView extends MenusHolder {
protected String _path;
protected boolean _taken;
protected static final String PHOTO_TAKEN = "photo_taken";
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
requestWindowFeature(Window.FEATURE_NO_TITLE);
setContentView(R.layout.create_event_view);
/*
* save to sd
*/
File imageDirectory = new File(
Environment.getExternalStorageDirectory() + "/MyPath/");
imageDirectory.mkdirs();
/*
* temp image overwrites each time for space
*/
_path = Environment.getExternalStorageDirectory()
+ "/MyPath/temporary_image.jpg";
startCameraActivity();
}
protected void startCameraActivity() {
File file = new File(_path);
Uri outputFileUri = Uri.fromFile(file);
Intent intent = new Intent(
android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
intent.putExtra(MediaStore.EXTRA_OUTPUT, outputFileUri);
startActivityForResult(intent, 0);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
switch (resultCode) {
case 0:
setResult(5);
finish();
break;
case -1:
onPhotoTaken();
break;
}
}
protected void onPhotoTaken() {
_taken = true;
setResult(0);
finish();
}
@Override
protected void onSaveInstanceState(Bundle outState) {
outState.putBoolean(CameraView.PHOTO_TAKEN, _taken);
}
@Override
protected void onRestoreInstanceState(Bundle savedInstanceState) {
if (savedInstanceState.getBoolean(CameraView.PHOTO_TAKEN)) {
onPhotoTaken();
}
}
}
I solved this with a really, really ugly workaround. I coded two functions to read and write files on the SD card (taken from here: http://www.sgoliver.net/blog/?p=2035).
private boolean readFile() {
try
{
File sd_path = Environment.getExternalStorageDirectory();
File f = new File(sd_path.getAbsolutePath(), "lock_camera_oncreate");
BufferedReader fin =
new BufferedReader(
new InputStreamReader(
new FileInputStream(f)));
String text = fin.readLine();
fin.close();
Log.e("Files", "Reading file");
return true;
}
catch (Exception ex)
{
Log.e("Files", "Error reading file from SD Card");
return false;
}
}
private void createFile() {
try
{
File sd_path = Environment.getExternalStorageDirectory();
File f = new File(sd_path.getAbsolutePath(), "lock_camera_oncreate");
OutputStreamWriter fout =
new OutputStreamWriter(
new FileOutputStream(f));
fout.write("Semaphore test.");
fout.close();
Log.e("Files", "File writed");
}
catch (Exception ex)
{
Log.e("Files", "Error reading file from SD Card");
}
}
Then, in the onCreate function, I do this:
public void onCreate(Bundle savedInstanceState) {
this.setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
super.onCreate(savedInstanceState);
if(readFile() == true)
{
File sd_path = Environment.getExternalStorageDirectory();
File f = new File(sd_path.getAbsolutePath(), "lock_camera_oncreate");
f.delete();
Intent intent = this.getIntent();
this.setResult(RESULT_OK, intent);
return;
}
createFile();
Intent cameraIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
cameraIntent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(new File(mCurrentImagePath)));
startActivityForResult(cameraIntent, TAKE_PHOTO_CODE);
}
The setRequestedOrientation call solves the issue where you use your app in portrait mode but, when the camera is launched, you turn the phone to landscape and then shoot the photo.
Then, the ugly readFile check looks for a lock_camera_oncreate file; if it exists, an additional onCreate call has happened, so the file is deleted and we return from this activity.
If the activity gets past that check, it means the file hasn't been created and there is only one camera activity running.
Hope it helps; it's ugly, but it worked for me :D
Dude... it's just a bug. I had the same problem and there's no way to work around it. It sometimes works, and sometimes it doesn't. I asked the Motorola guys for help and they said that there's no support for those Android images.