How to limit speed with BMW JSDK on 116i programmatically from Java?

I'm experimenting with the BMW Java SDK on the new BMW 116i Innovation Package. Basic things like turning the lights on and off and starting and stopping the motor work fine. What I'm trying to do now is write a carlet that limits the speed to the maximum configured in the driver profile. Driver identity is detected as usual via the RFID reader.
My problem is that although I can read the speed from the tachometer, I can't actually limit it. Here's what I've got working so far:
public class SpeedControllingCarlet extends GenericCarlet {
    public void start(final VehicleModel model) throws CarletException {
        // Identify the driver via the RFID reader and look up their profile
        RfidReader rfidReader = (RfidReader) model
                .getDevice(Devices.DRIVER_RFID_READER);
        Rfid rfid = rfidReader.getRfid();
        DriverProfile driverProfile = model.getDriverProfileRegistry()
                .getDriverProfile(rfid.toString());
        if (driverProfile == null) {
            return;
        }
        // Fall back to 190 if the profile has no explicit limit
        final Double maxAllowedSpeed = Double.valueOf(driverProfile
                .getCustomAttribute("maxAllowedSpeed", "190"));
        Tachometer tachometer = (Tachometer) model.getDevice(Devices.TACHOMETER);
        tachometer.addSpeedListener(new SpeedListener() {
            public void onSpeedChanged(SpeedChangedEvent speedChangedEvent) {
                if (speedChangedEvent.getCurrentSpeed() > maxAllowedSpeed) {
                    // Beep at 440 Hz for two seconds
                    Horn horn = (Horn) model.getDevice(Devices.HORN);
                    horn.beep(440, 2000);
                }
            }
        });
    }
}
This just beeps for two seconds if the driver goes faster than the driver profile allows.
My question is: is there a way to actually limit the speed (not just silly beeping)?

How does an imperfect human slow down? By braking! Same with the BMW SDK:
Brake brake = (Brake) model.getDevice(Devices.BRAKE);
brake.apply(Brake.TO_THE_METAL);

Wrench wrench = (Wrench) Toolkit.getToolkit().get(Instruments.WRENCH);
wrench.hit(driver);

I think (and hope) that this is very likely not possible, and the reason is that car manufacturers would be in a lot of legal trouble if they allowed "non-core" gadgets like a JVM built into the entertainment/navigation system to interfere with the motor or steering controls. That is a much worse security risk than your average browser exploit.
Fly-by-wire cars are scary enough as it is without end-user/hacker accessible parts.

Your big problem is that you're not taking the current gear ratio into account when you read the engine speed. You're looking for a road speed of around 190, while the tach is going to return somewhere between 700 and 7000 RPM. You need a function that takes engine RPM, gear ratio, and tire diameter, and returns actual road speed.
Or you could get the car's speed from the speedometer or GPS.
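A minimal sketch of such a conversion (the gear and final-drive ratios and the tire size below are illustrative values, not anything read from the SDK):
static double speedKmh(double engineRpm, double gearRatio,
                       double finalDriveRatio, double tireDiameterM) {
    // The engine turns gearRatio * finalDriveRatio times per wheel revolution
    double wheelRpm = engineRpm / (gearRatio * finalDriveRatio);
    double circumferenceM = Math.PI * tireDiameterM;
    return wheelRpm * circumferenceM * 60.0 / 1000.0; // m/min -> km/h
}
For example, 3000 RPM in a 1:1 gear with a 3.5 final drive and 0.63 m tires comes out to roughly 100 km/h.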

Related

Android sampling rates variation of hardware Sensors on Nexus 6P

I'm developing an Android app for a research project, and I'm reading data from several sensors (accelerometer, gyroscope, barometer, etc.).
I have 4 Nexus 6P devices, all with the newest factory image and freshly set up, with no apps installed other than the pre-installed standard ones.
The problem is that one of the phones is constantly lagging behind. For example, I record the accelerometer for half an hour at 105 Hz (the maximum possible rate for the accelerometer is 400 Hz), just to make sure I get at least about the number of samples I would expect at 100 Hz. The expected counts are:
Sampling for half an hour at 100 Hz -> 180000 samples
Sampling for half an hour at 105 Hz -> 189000 samples
(This is just an example for the accelerometer, but it is the same for every other sensor on each device: devices 1, 3 and 4 get about the same good results for the other sensors, while device 2 gets the same bad results on all of them.) The actual results were:
Device 1: 180000 Samples
Device 2: 177273 Samples <- the phone that is lagging behind
Device 3: 181800 Samples
Device 4: 179412 Samples
So the problem is device number 2, where I'm missing almost 3000 samples (I know this is complaining at a high level), and my guess is that it is probably hardware related. I can probably rule out a performance issue, since it does not matter how many sensors I'm reading, and reading them at 400 Hz works as expected (if wanted, I can provide those samples too). I also tried setting the sampling rate to 400 Hz, i.e. the fastest, and then keeping readings according to the timestamp, which led to the same result.
So just in case, here is how I register the sensor listener:
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    unaccDataSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER_UNCALIBRATED);
}
...
private void Start() {
    // The third argument is the desired sampling period in microseconds (10000 us = 100 Hz)
    sensorManager.registerListener(unaccDataListener, unaccDataSensor, 10000);
}
What I want is to get at least roughly the number of samples I should expect; more than that is no problem, and just a bit below is also acceptable.
If anyone has an idea what else I can try, or what could cause the problem, I would be really thankful.
This is my first post, so if anything is missing or badly explained, I'm sorry, and I'll try my best to fix it.
I work with Android sensors a lot, and I can tell you the hardware is of variable quality. I usually use a filter if I need the results to be consistent across phones:
// Filter to remove readings that come too often
// (TS is the current event timestamp, LAST_TS_ACC the last accepted one)
if (TS < LAST_TS_ACC + 100) {
    //Log.d(TAG, "onSensorChanged: skipping");
    return;
}
However, this means you can only set the phones to match the lowest common denominator. If it helps, I find that anything above 25 Hz is overkill for most applications, even medical ones.
It also helps to make sure any file writes are done off the UI thread, and in batches, as writing to a file is an expensive operation.
accelBuffer = new StringBuilder(); // created once during setup, not per event
accelBuffer.append(LAST_TS_ACC + "," + event.values[0] + "," + event.values[1] + "," + event.values[2] + "\n");
if ((accelBuffer.length() > 500000) && (writingAccelToFile == false)) {
    writingAccelToFile = true;
    AccelFile = new File(path2 + "/Acc/" + LAST_TS_ACC + "_Service.txt");
    Log.d(TAG, "onSensorChanged: accel file created at: " + AccelFile.getPath());
    File parent = AccelFile.getParentFile();
    if (!parent.exists() && !parent.mkdirs()) {
        throw new IllegalStateException("Couldn't create directory: " + parent);
    }
    // Do the write on a background thread to keep it off the UI thread
    new Thread(new Runnable() {
        @Override
        public void run() {
            writeStringBuilderToFile(AccelFile, accelBuffer);
            accelBuffer.setLength(0);
            writingAccelToFile = false;
        }
    }).start();
}
Doing all of the above has got me reasonably good results, but it will never be perfect due to differences in the hardware.
Good luck!

Detecting the device presence in a Pocket

My application needs to know whether the phone is in a pocket or in hand; based on that, a few parameters specific to the individual are set before moving on to the next tasks.
I have read various blogs and the SensorManager Android developer docs, but none helped me out. The only related link I found on Stack Overflow is this one, with no solution, though one comment on that question suggests using the Awareness API. I am going through it; my understanding is that User Activity is the context used to detect this (I may be wrong). If someone has worked on this or is doing R&D on it, please share your observations; they may help me get further.
Is there any way to find out whether the phone is in a pocket? If yes, can somebody tell me how to do it?
Any guidance/links to the concepts are helpful.
Thanks.
I implemented this in my project. I take readings from the light sensor, the accelerometer and the proximity sensor. Keep in mind that it only approximately detects whether the device is in a pocket.
Getting the current parameters from the sensors (accelerometer, proximity and light sensors):
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        g = event.values.clone();
        // Normalize the gravity vector and derive the inclination in degrees
        double norm_Of_g = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2]);
        g[0] = (float) (g[0] / norm_Of_g);
        g[1] = (float) (g[1] / norm_Of_g);
        g[2] = (float) (g[2] / norm_Of_g);
        inclination = (int) Math.round(Math.toDegrees(Math.acos(g[2])));
        accReading.setText("XYZ: " + round(g[0]) + ", " + round(g[1]) + ", " + round(g[2]) + " inc: " + inclination);
    }
    if (event.sensor.getType() == Sensor.TYPE_PROXIMITY) {
        proximityReading.setText("Proximity Sensor Reading: " + String.valueOf(event.values[0]));
        rp = event.values[0];
    }
    if (event.sensor.getType() == Sensor.TYPE_LIGHT) {
        lightReading.setText("LIGHT: " + event.values[0]);
        rl = event.values[0];
    }
    if ((rp != -1) && (rl != -1) && (inclination != -1)) {
        main.detect(rp, rl, g, inclination);
    }
}
Then based on this data I decide whether or not the device is in a pocket:
public void detect(float prox, float light, float[] g, int inc) {
    // In pocket: proximity near zero, very little light, y-axis pointing down,
    // inclination roughly between 75 and 100 degrees
    if ((prox < 1) && (light < 2) && (g[1] < -0.6) && ((inc > 75) && (inc < 100))) {
        pocket = 1;
        // IN POCKET
    }
    if ((prox >= 1) && (light >= 2) && (g[1] >= -0.7)) {
        if (pocket == 1) {
            playSound();
            pocket = 0;
        }
        // OUT OF POCKET
    }
}
Keep in mind that it's not fully accurate.
Code: https://github.com/IvanLudvig/PocketSword
Blog post: https://ivanludvig.github.io/blog/2019/06/21/detecting-device-in-a-pocket-android.html
The only way we can come somewhat near to a solution is by using the following two sensors (the Google Awareness API won't solve the problem, as it has an entirely different usage):
Light sensor (environment sensor)
Proximity sensor (position sensor)
The Android platform provides four sensors that let you monitor various environmental properties. You can use these sensors to monitor:
relative ambient humidity
illuminance
ambient pressure
ambient temperature
All four environment sensors are hardware-based and are available only if a device manufacturer has built them into a device. With the exception of the light sensor, which most device manufacturers use to control screen brightness, environment sensors are not always available on devices. Because of this, it's particularly important that you verify at run time whether an environment sensor exists before you attempt to acquire data from it.
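For instance, a minimal runtime check along those lines (assuming a SensorEventListener field named listener is defined elsewhere):
// Verify the light sensor exists before trying to read from it
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor light = sm.getDefaultSensor(Sensor.TYPE_LIGHT);
if (light != null) {
    sm.registerListener(listener, light, SensorManager.SENSOR_DELAY_NORMAL);
} else {
    // No light sensor on this device; fall back to the other sensors only
}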
The light sensor can be used to measure light intensity. For example, many phones have an auto-brightness mode; this works off the light sensor, adjusting screen brightness according to light intensity.
There are many units, such as lux, candela and lumen, used to measure light intensity.
Considering this, there will be a considerable difference in light intensity when your phone is in a pocket versus outside it.
However, the same will happen when you are operating the phone in a dark room, or anywhere the light intensity is quite low; distinguishing among such cases is the real challenge. You can use the other environment sensors in combination with the light sensor to reach a more effective outcome, but I assume a fully accurate solution is dicey.
To learn more about these sensors, refer to the following links:
https://developer.android.com/guide/topics/sensors/sensors_environment.html
https://developer.android.com/guide/topics/sensors/sensors_position.html
The Google Awareness API won't work for this case, as it provides an entirely different kind of solution.
It provides two APIs:
Fence API
Snapshot API
You can use the Snapshot API to get information about the user's current environment. Using the Snapshot API, you can access a variety of context signals:
Detected user activity, such as walking or driving.
Nearby beacons that you have registered.
Headphone state (plugged in or not).
Location, including latitude and longitude.
Place where the user is currently located.
Weather conditions in the user's current location.
Using the Fence API, you can define fences based on context signals such as:
The user's current location (lat/lng).
The user's current activity (walking, driving, etc.).
Device-specific conditions, such as whether the headphones are plugged in.
Proximity to nearby beacons.
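For illustration, a minimal Fence API sketch; note that pocket detection is not among the available signals, and the fence key and PendingIntent here are placeholders you would define yourself:
// Fire a fence whenever headphones are plugged in
AwarenessFence headphoneFence = HeadphoneFence.during(HeadphoneState.PLUGGED_IN);
Awareness.getFenceClient(this).updateFences(
        new FenceUpdateRequest.Builder()
                .addFence("headphoneFenceKey", headphoneFence, myPendingIntent)
                .build());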
For a cross-platform solution, you can now use the NumberEight SDK for this task.
It performs a wide variety of context recognition tasks on both iOS and Android including:
Real-time physical activity detection
Device position detection (i.e. presence in pocket)
Motion detection
Reachability
Local weather
It can also record user context for reports and analysis via the online portal.
How to detect whether a phone is in a pocket:
For example, to detect device position in Kotlin, you would do:
val ne = NumberEight()
ne.onDevicePositionUpdated { glimpse ->
    if (glimpse.mostProbable.state == State.InPocket) {
        Log.d("MyApp", "Phone is in a pocket!")
    }
}
or in Java:
NumberEight ne = new NumberEight();
ne.onDevicePositionUpdated(
        new NumberEight.SubscriptionCallback<NEDevicePosition>() {
            @Override
            public void onUpdated(@NonNull Glimpse<NEDevicePosition> glimpse) {
                if (glimpse.mostProbable.state == State.InPocket) {
                    Log.d("MyApp", "Phone is in a pocket!");
                }
            }
        });
Here are some iOS and Android example projects.
Disclosure: I'm one of the developers.

Controllers that are not Gamepads in LWJGL

I'm having trouble with gamepad support.
try {
    Controllers.create(); // create the Controllers
} catch (Exception exep) {
    // ignored
}
int allControllers = Controllers.getControllerCount(); // find out how many we have
It says that I have 3 controllers, but the gamepad is controller number 0: when I poll controller 1 or 2, the game just crashes.
Does anyone know how to automatically pick the working gamepad from this list and avoid the crash?
Looks like nobody else can do it. I've been working on it for a while, and there is only one solution so far. Here it is:
for (int co = 0; co < allControllers; co++) {
    gamepad = Controllers.getController(co);
    GamePadName = gamepad.getName();
    if (GamePadName.charAt(0) != 'H' && GamePadName.charAt(0) != 'U')
        Keys = checkGamepad(Keys);
}
There are two controllers that can't be polled. On some PCs they are called "HID something"; on others, "USB Keyboard" or "USB Mouse". Maybe on other PCs they will be named differently. So we don't poll those controllers, and the game doesn't crash... it seems like a bad solution, but I see no better one.
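A possibly more robust variant (an untested sketch) is to skip devices that don't look like gamepads by checking button and axis counts on LWJGL's Controller, instead of matching name prefixes:
// Pick the first device that plausibly is a gamepad: at least one button
// and two axes. Keyboards and mice enumerated as controllers usually fail this.
for (int co = 0; co < Controllers.getControllerCount(); co++) {
    Controller candidate = Controllers.getController(co);
    if (candidate.getButtonCount() > 0 && candidate.getAxisCount() >= 2) {
        gamepad = candidate;
        break;
    }
}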

SplashScreen/Performance: Proper way to initialize application data during startup, especially using a SplashScreen

I am writing a Java desktop application (I am a student).
It has to deal with four kinds of data during startup:
1) Project tree view data (like the Eclipse project tree). Currently I use XMLEncoder/XMLDecoder to save to and reload from an XML file.
2) User preference data, such as fonts, recently used files and so on. Currently I use java.util.prefs.Preferences.
3) Class data. Some factory classes like MenuFactory and utility classes like DatabaseUtil and FileUtil have static data. Currently I use static initializers in these classes to set up default data.
4) Database-related information, such as connection configuration and frequently used database and table names. Currently I use java.util.Properties.
What I want to improve:
1) Is the way I save my application data, in the four forms mentioned above, the right one?
2) Since there is so much data to load, what should I do during the splash screen: load everything at startup, or delay loading until the data is used?
At the very least, I do not want to deceive the user with code like the following (which does not update the progress bar at a meaningful time):
SwingWorker<Void, Integer> worker = new SwingWorker<Void, Integer>() {
    @Override
    protected Void doInBackground() throws Exception {
        for (int i = 0; i < 50; i++) {
            Thread.sleep(100); // Simulate loading
            publish(i * 2);    // Notify progress
        }
        return null;
    }
};
3) I think too many static initializers may slow down program startup; any suggestions?
If your app is targeted at high-end devices, then you can certainly load most of the data during the startup/splash screen.
But the issue arises when low-end devices are among your targets as well. On some low-end devices, loading too much data at startup leaves very little memory for other processing, sometimes even leading to crashes.
So it is a decision you have to take wisely: if the data is small, it is better to load it on demand, or while the screen that needs it is loading.
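As a sketch of the on-demand approach for the static-initializer concern in point 3 (the class name is borrowed from the question; this is the standard initialization-on-demand holder idiom, not the asker's actual code):
// Nothing heavy runs when MenuFactory is merely referenced; the nested Holder
// class (and thus the expensive constructor) is initialized on first use only.
public final class MenuFactory {
    private MenuFactory() {
        // load menu definitions, icons, etc.
    }

    private static final class Holder {
        static final MenuFactory INSTANCE = new MenuFactory();
    }

    public static MenuFactory getInstance() {
        return Holder.INSTANCE;
    }
}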

How to use the accelerometer to detect vibration above or equal to 2Gs?

I'm building an Android application, something like a car "black box" that records the car's journey.
The problem I'm facing is how to integrate an accelerometer capable of detecting movement (probably >= 2G) when an accident occurs; that should trigger the video recording to stop and save to the archive, so the file isn't lost as a result of the accident. Does anyone know how to do the above? I need urgent help here, please! I've read the Android developer docs on the accelerometer and they're not helping in my situation: first, I'm rather bad at physics, and second, I'm new to Android/Java and this is my first attempt at working with the accelerometer. Any simple solution? Thanks in advance :)
This is part of the video recording section, but how am I going to incorporate the accelerometer for "auto-archiving" purposes?
A couple of points:
The Bosch BMA150 used in many smartphones has 2g set as its maximum acceleration value (so you might never see > 2g).
With SENSOR_DELAY_FASTEST you can take readings about every 20 milliseconds on an HTC Desire. However, since the phone runs a multi-tasking operating system, you cannot guarantee this timing (delays of a couple of seconds might occur when the operating system is busy).
Hence a smartphone is currently not really suitable for this application. If Android allows smarter use of accelerometers in the future, this could change: if onSensorChanged accepted a threshold parameter, accelerations exceeding that threshold could be buffered in the accelerometer chip's memory and read out when appropriate.
Put your startRecording() method in the Activity below; it's called when acceleration exceeds 2G. You can change this by changing the value of CRASH_G_MULTIPLIER.
public class YourActivity extends Activity {
    private double G = 9.81;
    private double CRASH_G_MULTIPLIER = 2;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        startDropListening();
    }

    private void startRecording() {
        // your code that does the recording here
    }

    private void startDropListening() {
        SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
        sm.registerListener(
                new SensorEventListener() {
                    @Override
                    public void onAccuracyChanged(Sensor arg0, int arg1) {}

                    @Override
                    public void onSensorChanged(SensorEvent arg0) {
                        // Magnitude of the acceleration vector across all three axes
                        double accel = Math.sqrt(
                                Math.pow(arg0.values[0], 2) +
                                Math.pow(arg0.values[1], 2) +
                                Math.pow(arg0.values[2], 2));
                        if (accel > G * CRASH_G_MULTIPLIER) {
                            startRecording();
                        }
                    }
                },
                sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_NORMAL);
    }
}
There are some things you should be aware of:
As mentioned by others, the threshold value of 2 needs to be increased quite substantially, but you can fine-tune it yourself by experimentation.
You will want to acquire a wake lock. To create one, do this:
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
wl = pm.newWakeLock(PowerManager.FULL_WAKE_LOCK, "ATag"); // wl is a PowerManager.WakeLock field
wl.acquire();
And the following when you are finished:
wl.release();
(You may want to change FULL_WAKE_LOCK to a lower-priority wake lock. You will also need to add the WAKE_LOCK permission to your manifest file.)
You may wish to increase the sample rate, though this will drain the battery significantly. There are different values you can assign to it; in the code above, replace:
SensorManager.SENSOR_DELAY_NORMAL
with any of these:
SensorManager.SENSOR_DELAY_UI
SensorManager.SENSOR_DELAY_GAME
SensorManager.SENSOR_DELAY_FASTEST
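If you need finer control than these constants, the same argument also accepts a raw sampling period in microseconds on Android 3.0 and later (a hedged sketch, assuming the sm from above and a SensorEventListener named listener):
// ~50 Hz: one sample every 20,000 microseconds
sm.registerListener(listener, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), 20000);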
