I'm currently prototyping a multimedia editing application in Java (pretty much like Sony Vegas or Adobe After Effects) geared towards a slightly different end.
Now, before reinventing the wheel, I'd like to ask if there's any library out there geared towards time simulation/manipulation.
Specifically, an ideal solution would be a library that can:
Schedule and generate events based on an elastic time factor. For example, real time would have a factor of 1.0, slow motion any lower value, and a speed-up any higher value.
Provide configurable granularity. In other words, a way to specify how frequently time-based events fire (30 frames per second, 60 fps, etc.).
Provide an event execution mechanism, of course: a way to define that an event starts and terminates at a certain point in time, and to get notified accordingly.
Is there any Java framework out there that can do this?
Thank you for your time and help!
Well, it seems that no such thing exists for Java. However, I found out that this is a specific case of a more general problem.
http://gafferongames.com/game-physics/fix-your-timestep/
Using fixed time stepping, my application gets frame skipping for free (i.e. when doing live preview rendering) and can render with no time constraints when in offline mode, which is pretty much what Vegas and other multimedia programs do.
Also, by applying a scale factor to the delta between frames, the whole simulation can be sped up or slowed down at will. So yeah, fixed time stepping pretty much nails it for me.
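To make this concrete, here is a minimal sketch of the loop I ended up with. All the names (FixedStepClock, timeScale, STEP_NANOS) are mine, not from any library; it's just the "Fix Your Timestep" pattern with an elastic time factor bolted on:

```java
public class FixedStepClock {

    private static final long STEP_NANOS = 1_000_000_000L / 60; // 60 logic steps per second

    private double timeScale = 1.0;  // 1.0 = real time, <1.0 = slow motion, >1.0 = speed-up
    private long accumulator = 0;
    private long lastTime = System.nanoTime();
    private double simulatedSeconds = 0;

    public void setTimeScale(double scale) {
        timeScale = scale; // e.g. 0.5 for half-speed slow motion
    }

    public void tick() {
        long now = System.nanoTime();
        // Scale the elapsed wall-clock time by the elastic time factor.
        accumulator += (long) ((now - lastTime) * timeScale);
        lastTime = now;

        // Run as many fixed logic steps as the scaled elapsed time allows;
        // this is where frame skipping falls out for free.
        while (accumulator >= STEP_NANOS) {
            update(STEP_NANOS / 1_000_000_000.0);
            accumulator -= STEP_NANOS;
        }
        render();
    }

    private void update(double dtSeconds) {
        simulatedSeconds += dtSeconds; // advance the simulation by one fixed step
    }

    private void render() {
        // Draw the current state; in offline mode this can run with no time constraints.
    }
}
```

The article also describes interpolating the render state between the previous and current logic steps; that can be layered on top, but the skeleton above already covers the elastic factor and the fixed granularity I originally asked about.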
Related
I'm programming a 60 FPS game and need to find the code sections that are hurting the game loop's performance. I'm currently storing a nanosecond timestamp at the start of the frame and splitting my code into logical sections, for example "Rendering particles" or "Iterate through enemy AIs". After each section finishes, I store how long it took to execute since the previous section finished. At the end of the frame I take the total execution time ([nanoseconds now] minus [first nanosecond timestamp]) and calculate the percentage of it spent in each section. This lets me display the percentage taken by each section, but it doesn't seem like the perfect solution; I suspect GC pauses randomly distort individual section timings.
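In stripped-down form it looks roughly like this (class and method names are mine, just to illustrate the scheme):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SectionTimer {

    private final Map<String, Long> sectionNanos = new LinkedHashMap<>();
    private long frameStart;
    private long lastMark;

    public void beginFrame() {
        frameStart = lastMark = System.nanoTime();
        sectionNanos.clear();
    }

    // Called after each logical section, e.g. endSection("Rendering particles").
    public void endSection(String name) {
        long now = System.nanoTime();
        sectionNanos.merge(name, now - lastMark, Long::sum);
        lastMark = now;
    }

    // Called at the end of the frame to print each section's share.
    public void endFrame() {
        long total = System.nanoTime() - frameStart;
        for (Map.Entry<String, Long> e : sectionNanos.entrySet()) {
            System.out.printf("%-25s %5.1f%%%n", e.getKey(),
                    100.0 * e.getValue() / total);
        }
    }
}
```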
Is there a better way to implement this, or even an API / analytics tool for doing exactly this?
One of the commenters recommended traceview, which is generally useful for instrumenting code written in Java to figure out what's slow. It's less useful for a mix of Java and native (because the instrumentation disproportionately slows the Java code), and it lacks useful context when you're trying to figure out if you're meeting or missing deadlines when generating frames with OpenGL.
The preferred way to do this sort of analysis is with systrace (doc, explanation). As you can see from the second link, you can compare your app timeline directly with the vsync event and surface composition activity.
If you're running Android 4.3 (API 18) or later, you can add your own events. Adding these to strategic places in your app makes it much easier to visualize your bottlenecks. A simple example can be found here. (As of when I'm writing this, the official docs don't yet describe the 4.3 command-line usage, so it's easiest to just follow the example.)
Android 4.3 also added the "dalvik" tag; if included, your traces will show where the GC pauses start and end, as well as full details on which threads actually got suspended.
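For the custom events, the calls are android.os.Trace.beginSection() and Trace.endSection() (API 18+). A minimal sketch; updateEnemyAi() is just a hypothetical stand-in for your own code:

```java
import android.os.Trace;

public class AiUpdater {

    public void updateFrame() {
        Trace.beginSection("Iterate through enemy AIs"); // label shown on the trace timeline
        try {
            updateEnemyAi(); // hypothetical stand-in for your game logic
        } finally {
            Trace.endSection(); // must run on the same thread as beginSection()
        }
    }

    private void updateEnemyAi() { /* ... */ }
}
```

Sections can be nested, so you can wrap both a whole frame and the individual parts inside it.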
I have a program written in Python that accurately shows the time evolution of a quantum particle in both 1 and 2 dimensional wells. I am too lazy to post the entire thing online, but I will be happy to email the source to anyone willing to take a look.
My question is this: Is there a faster way? This thing should look like it's going crazy in its box, not calmly gliding around. When you run the program, choose "yes" on the realtime option to get a diagnostic of the performance. It runs at about 3 dt steps (on the order of 10^-6 to 10^-18 seconds) per actual real second. Needless to say, by the time this program shows me what has happened to the particle after 1 second of real time, I will be old and grey. Any suggestions?
If you are lucky, you might get a factor of 10 to 100 speed-up by changing language implementations or languages. But it sounds like you want many orders of magnitude faster performance. For that you would need:
a fundamental change in the algorithms you are using, and / or
using a computation platform with lots of hardware parallelism.
This kind of computational problem doesn't have simple solutions.
I realize this is not strictly a code question, but I guess it belongs here anyway. If not, my apologies in advance.
Since there's no built-in way to change the vibration intensity on the Droid in code, I'm using a kind of PWM control: switching the vibrator on and off at high frequency gives me rough control over the vibration intensity. Right now I'm using a 20 ms period; for example, with a 50% duty cycle the vibrator is on for 10 ms and off for 10 ms, and it feels roughly like half power.
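In code it's nothing more exotic than the standard vibration pattern API; a sketch (assuming you hold a Context, and with the VIBRATE permission in the manifest):

```java
import android.content.Context;
import android.os.Vibrator;

public class PwmVibration {

    // 20 ms period at 50% duty cycle: start immediately, 10 ms on, 10 ms off, repeat.
    public static void startHalfPower(Context context) {
        Vibrator vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
        long[] pattern = {0, 10, 10};  // delay, on, off (milliseconds)
        vibrator.vibrate(pattern, 0);  // repeat from index 0 until cancel()
    }

    public static void stop(Context context) {
        ((Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE)).cancel();
    }
}
```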
My question is: can this kind of control damage the vibrator motor?
I'm no engineer, but we're in luck because there is one sitting next to me. Apparently there's a kind of life cycle to these things that relates partly to how often you change state and partly to the total duration of use. So yes, what you're describing will stress the device in one way, by driving it from 0% to 100% and back again very rapidly, but it relieves stress in another, by only having it on half the time. Overall, it shouldn't do any harm that would shorten the device's life span, as long as the pattern isn't run for very long. I would still suggest getting in touch with someone who knows the mechanical side of the device more intimately, because every device is different, and general knowledge doesn't always translate into spot-on specific knowledge.
All I know is that delta somehow relates to adapting to different frame rates, but I'm not sure exactly what it stands for or how to use it in the math that calculates speeds and so on.
Where is delta declared? initialized?
How is it used? How are its values (min,max) set?
It's the number of milliseconds between frames. Rather than building your game around a fixed number of milliseconds between frames, you should move/update/adjust each element/sprite/AI based on how much time has passed since the last call to the update method. This is how pretty much all game engines work, and it means you don't have to change your game logic based on the power of the hardware you're running on.
Slick also has a mechanism for setting a minimum update time, so you can guarantee that the delta won't be smaller than a certain amount. This lets your game say, in effect, "Don't update more often than every x milliseconds," because if you're running on powerful hardware with a very tight game loop, it's theoretically possible to get sub-millisecond deltas, which can produce strange side effects such as slow movement or collision detection that doesn't behave the way you expect.
Setting a minimum update time also allows you to minimize recalculating unnecessarily, when only a very, very small amount of time has passed.
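Putting the two together, a minimal Slick-style sketch might look like this. SPEED and the 10 ms minimum are arbitrary example values, and the minimum-interval method name is from memory, so check it against the Slick javadocs:

```java
import org.newdawn.slick.BasicGame;
import org.newdawn.slick.GameContainer;
import org.newdawn.slick.Graphics;
import org.newdawn.slick.SlickException;

public class DeltaExample extends BasicGame {

    private static final float SPEED = 0.2f; // pixels per millisecond
    private float x = 0;

    public DeltaExample() {
        super("Delta example");
    }

    @Override
    public void init(GameContainer container) throws SlickException {
        // Guarantee the delta never drops below 10 ms.
        container.setMinimumLogicUpdateInterval(10);
    }

    @Override
    public void update(GameContainer container, int delta) throws SlickException {
        // delta = milliseconds since the last update; scale all movement by it
        // so the sprite covers the same distance per second on any hardware.
        x += SPEED * delta;
    }

    @Override
    public void render(GameContainer container, Graphics g) throws SlickException {
        g.fillRect(x, 100, 20, 20);
    }
}
```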
Have a read of the LWJGL timing tutorial found here. It's not strictly Slick, but it will explain what the delta value is and how to use it.
I want to write a program in Java that uses fast Fourier transformation.
The program reads data from sensors every 5 milliseconds, and every 200 milliseconds it is supposed to do something based on the data from the last five seconds.
Is there a good library in Java that provides a way to do the Fourier transform without recalculating all five seconds of data every time?
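For scale: the window is only 1000 samples (five seconds at 5 ms), with a new result needed every 40 samples (200 ms). A sketch of the brute-force version, using JTransforms' DoubleFFT_1D (one commonly used Java FFT library; the package name has moved between versions, so check your jar):

```java
import org.jtransforms.fft.DoubleFFT_1D;

public class SlidingSpectrum {

    private static final int WINDOW = 1000; // 5 s / 5 ms per sample
    private static final int HOP = 40;      // 200 ms / 5 ms per sample

    private final double[] ring = new double[WINDOW];
    private final DoubleFFT_1D fft = new DoubleFFT_1D(WINDOW);
    private int writeIndex = 0;
    private int samplesSinceFft = 0;

    // Call once per 5 ms sample.
    public void addSample(double sample) {
        ring[writeIndex] = sample;
        writeIndex = (writeIndex + 1) % WINDOW;

        if (++samplesSinceFft >= HOP) {
            samplesSinceFft = 0;
            onSpectrum(transform());
        }
    }

    private double[] transform() {
        // Unroll the ring buffer into time order, then transform in place.
        double[] data = new double[WINDOW];
        for (int i = 0; i < WINDOW; i++) {
            data[i] = ring[(writeIndex + i) % WINDOW];
        }
        fft.realForward(data); // packed real/imaginary output
        return data;
    }

    protected void onSpectrum(double[] spectrum) {
        // React to the new spectrum every 200 ms.
    }
}
```

A 1000-point FFT every 200 ms is cheap, so recomputing the whole window may be fine in practice. The genuinely incremental alternative is a sliding DFT, which updates each frequency bin in O(1) per new sample, at the cost of accumulated rounding error.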
Hard real-time problems are not a good application of Java. There are too many variables, such as garbage collection and thread scheduling that is not guaranteed to happen within a given interval, to make this possible. If "close enough" is acceptable, it will work. The timing behaviour of your software will also depend on the OS and hardware you are using, and on what other programs are running on the same box.
There is a Real-Time Java (the Real-Time Specification for Java) that does have a special API for the issues I mention above. You do not indicate that you are using it. It is also a different animal in many respects from plain Java.