Best way to synchronize playback over 3G networks - Java

I am trying to figure out a way to synchronize rhythm playback on two or more Android devices.
Achieving good precision over WiFi / LAN is simple (very low latency), but I need a good solution for 3G networks with variable, high latency.
One idea I came up with is sending and timing messages, then using the average round-trip time to compensate for latency, but this seems crude and I'm fairly sure there are better ways to solve it.
Care to help?

I would first of all try to create a clock that is synchronized as closely as possible across all devices, which you can then use as a reference.
Whenever devices communicate, they always include their local synchronized time with the message. That way you can always work out the difference between when a message was sent and when it was received, and a statement like "play a beat at time T" means the same moment on every device.
The real difficulty here is synchronizing the clocks. I would start by reading this article: http://en.wikipedia.org/wiki/Network_Time_Protocol
There is a Java-based NTP client here:
http://commons.apache.org/net/examples/ntp/NTPClient.java
If you get that to work, there are a number of NTP servers across the world.
http://www.pool.ntp.org/en/use.html
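To make that concrete, here is a minimal sketch of how the Commons Net client linked above could be used to derive a shared clock; the pool host, the 10-second timeout and the class name are placeholders, and error handling is kept to a minimum:

import java.net.InetAddress;
import org.apache.commons.net.ntp.NTPUDPClient;
import org.apache.commons.net.ntp.TimeInfo;

public class ClockSync {
    // Offset (ms) to add to System.currentTimeMillis() to approximate NTP time.
    private static volatile long offsetMs = 0;

    public static void syncWithNtp(String host) throws Exception {
        NTPUDPClient client = new NTPUDPClient();
        client.setDefaultTimeout(10000); // give a slow 3G link plenty of time
        try {
            TimeInfo info = client.getTime(InetAddress.getByName(host));
            info.computeDetails();       // fills in the offset/delay fields
            if (info.getOffset() != null) {
                offsetMs = info.getOffset();
            }
        } finally {
            client.close();
        }
    }

    // The "synchronized" timestamp every device includes in its messages.
    public static long sharedTimeMillis() {
        return System.currentTimeMillis() + offsetMs;
    }
}

Each device could then schedule a beat for sharedTimeMillis() plus some agreed delay and tell the others to do the same, so the actual 3G latency only affects how far in advance the message has to be sent.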

I have actually recently tried to solve this very same problem myself. The best solution I found was NTP (Network Time Protocol).
The main drawback is that it can take a long time to synchronise over a high-latency network, so the app needs to have been running for quite a while before the devices are synchronized.
The application I'm working on has not yet been tested with tight timings, so I can't promise that this is a viable solution, but it is one worth trying.

If your devices are close enough together, you could see whether you are able to use Bluetooth to speed up peer-to-peer synchronisation, while using NTP to get the global time -- probably by adding Bluetooth-linked phones as extra remotes in your NTP config. This would entail pairing end-user devices, which may be an issue for you.

Related

Time synchronization

I am creating a web application in Java in which I need to run a reverse timer (countdown) in the client's browser. I plan to send the remaining time from the server to the client and then tick the timer down using JavaScript.
My questions are:
1. Does the clock tick rate vary between systems?
2. Is there any better way to do this?
Does the clock tick rate vary between systems?
Yes. It's the result of really, really small differences in the frequencies of the quartz crystals used in the chipsets, so if you do not synchronize your clocks now and then, they will diverge.
However, if you're not designing a satellite, a remote control for ballistic missiles, or a life-support device, you really should not care.
Is there any better way to do this?
Yes, if:
your reverse timer counts down from a year, or at least a month, or
you are running your client on a device with a broken or really inaccurate clock,
then you can use NTP to make sure the client and server clocks stay synchronized. There are NTP libraries available for both JavaScript and Java.
@npe's solution with NTP will do, but it is theoretically incorrect:
Even if the clocks are perfectly synced, you will be sending the client the remaining time. That message needs to travel over the network, so by the time the client receives it, it is no longer correct.
A better approach is to send the client the end time, which is an absolute value and therefore not affected by network lag, and do the countdown on the client, calculating the remaining time there.
That said, the other answers about NTP are of course still necessary.
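A minimal sketch of that idea in Java (the class and method names are made up for illustration; clockOffsetMillis would come from NTP or whatever synchronization you use, or be zero if you trust the clocks):

public class CountdownExample {

    // Server side: turn a remaining duration into an absolute end time and send that.
    static long endTimeFor(long remainingMillis) {
        return System.currentTimeMillis() + remainingMillis;
    }

    // Client side: recompute the remaining time locally on every tick,
    // so network lag only delays when you learn the end time, not its value.
    static long remainingMillis(long endTimeMillis, long clockOffsetMillis) {
        long remaining = endTimeMillis - (System.currentTimeMillis() + clockOffsetMillis);
        return Math.max(remaining, 0);
    }
}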

Is Java fast enough to do live screensharing?

For the past few months, a developer and I have been working on a screensharing applet that streams to a media server like Wowza or Red5, but no matter what we do, we have about 5 seconds of latency, which is too long for a live application where people are interacting with each other. We've tried Xuggle, different encoders, different players, different networks, different media servers, and even streaming locally; there's still significant latency.
So, I'm beginning to wonder…
Is Java fast enough to do live screensharing?
I've seen lots of screen recording applets written in Java, but none of them are streaming live. Everything that's done live, such as GoToMeeting, seems to use C++. I'm thinking maybe there's a reason.
It's not a compression problem. Using ScreenVideo, we've compressed an hour-long stream down to about 100 MB, and we have plenty of bandwidth. The processor isn't overloaded doing the compression, either, but it seems to be taking too much time. We are getting the best results from some code pulled out of BigBlueButton, but still, the latency is terrible.
Streaming the WebCam, on the other hand, is nice and snappy. Almost no latency at all. So, the problem is the applet.
The only other idea I can think of is somehow emulating a WebCam with Java. Not sure if that would be faster or not.
Ideas? Or should I just give up on Java and do this in C++? I would hate to do that, because then I would have to create different versions for different platforms, but if it's the only way, it's the only way.
Many video streaming subsystems deliberately buffer so that a blip in connectivity doesn't impact the video, but that makes more sense in a recorded media scenario.
Make sure these systems have buffering turned off or turned down.
Also, while this isn't exactly scientific, you could run an app like Wireshark on the sending and receiving computers and try to see how long the traffic actually takes in transit. If it's very fast, then I'd more seriously consider that buffering is the issue.
If you're on Windows, the Network tab in Task Manager may be enough to prove this one way or the other (rather than installing something like Wireshark, which isn't difficult either; I'm just suggesting a quick way to check).
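One crude but effective trick, assuming you can get at the captured frames as BufferedImages before they are encoded: stamp the capture machine's wall-clock time onto each frame, then compare the number shown in the played-back stream with the source machine's clock (or with the viewer's clock, if the two are synchronized). The difference is your end-to-end latency, buffering included. A sketch:

import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class LatencyStamp {
    // Draw the capture time onto a frame just before it is handed to the encoder.
    public static void stamp(BufferedImage frame) {
        Graphics2D g = frame.createGraphics();
        g.setColor(Color.RED);
        g.drawString("captured at " + System.currentTimeMillis(), 10, 20);
        g.dispose();
    }
}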

Is my Android App Draining Battery?

I'm developing a game for Android. It uses a surface view and uses the sort of standard 2D drawing APIs provided. When I first released the game, I was doing all sorts of daft things like re-drawing 9-patches on each frame and likewise with text. I have since optimised much of this by drawing to Bitmap objects and drawing them each frame, only re-drawing onto the Bitmap objects when required.
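For reference, the kind of caching I mean is roughly this (a simplified sketch; the class and method names are made up):

import android.graphics.Bitmap;
import android.graphics.Canvas;

class CachedLayer {
    private Bitmap cache;          // pre-rendered content, e.g. 9-patch background and text
    private boolean dirty = true;  // only re-render the cache when something actually changed

    void invalidateCache() { dirty = true; }

    void draw(Canvas canvas, int width, int height) {
        if (cache == null || dirty) {
            cache = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
            renderExpensiveContent(new Canvas(cache)); // the slow drawing happens here, rarely
            dirty = false;
        }
        canvas.drawBitmap(cache, 0, 0, null);          // cheap per-frame blit
    }

    private void renderExpensiveContent(Canvas c) { /* draw the static parts here */ }
}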
I've received complaints about battery drain before, and following my modifications I'd like to know (scientifically) if I've made any improvements. Unfortunately, I don't have any prior data to go by, so it would be most useful to compare the performance to some other game.
I've been running Traceview, and using the results of it mostly for the purposes of identifying CPU-time-consuming methods.
So -- what's the best way of determining my app's battery performance, and what's a good benchmark?
I know I can look at the %s of different apps through the settings, but this is again unscientific, as the figure I get from this also depends on what's happening in all of the other apps. I've looked through (most of) Google's documentation, and although the message is clear that you should be saving battery (and it gives the occasional tip as to how), there is little indication of how I can measure how well my app is performing. The last thing I want is more complaints of battery drain in the Android Market!
Thanks in advance.
EDIT
Thanks for all your helpful advice/suggestions. What I really want to know is how I can use the data coming from Traceview (i.e. CPU time in ms spent on each frame of the game) to determine battery usage (if this is at all possible). Reading back over my original question, I can see I was a bit vague. Thanks again.
Here is my suggestion:
I watch power consumption while developing my apps (some of which poll the sensors at intervals of <25 ns) using PowerTutor. Check it out; it sounds like this may be what you are looking for. The app tells you what you are using in mW, J, or relative to the rest of the system, and results are broken down by CPU, WiFi, display, and any other radios installed. The only catch is that it is written for a specific phone model, but I use it with great success on my EVO 4G, Galaxy S (Sprint Epic), and Hero.
Good luck,
-Steve
There is a possibility that your game is draining the battery. I believe this can come down to several things:
Your application is a game, and games drain the battery quickly.
You're iterating with the help of a Thread. Have you limited the FPS so the CPU can skip unnecessary iterations? Since you're working in 2D I assume you're using a SurfaceView; 60 FPS is plenty for a real-time game (see the sketch after this list).
You don't stop the Thread when your application terminates, so your loop keeps running even when your application isn't alive.
Do you have a lock on the loop that calls wait() during onPause()?
The people commenting that your game is draining the battery probably mean while your application isn't in use. Otherwise it would be odd, because every game on the Android Market drains the battery, more or less.
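To illustrate the FPS-limiting and pausing points above, here is a rough sketch of a game loop thread (the names and the 60 FPS cap are illustrative, not prescriptive):

class GameLoop extends Thread {
    private static final long FRAME_MS = 1000 / 60;  // cap at roughly 60 FPS
    private volatile boolean running = true;
    private final Object pauseLock = new Object();
    private boolean paused = false;

    @Override
    public void run() {
        while (running) {
            synchronized (pauseLock) {
                while (paused && running) {
                    try { pauseLock.wait(); }         // sleep instead of spinning while paused
                    catch (InterruptedException ignored) { }
                }
            }
            long start = System.currentTimeMillis();
            updateAndDraw();                          // your update + draw on the SurfaceView
            long elapsed = System.currentTimeMillis() - start;
            if (elapsed < FRAME_MS) {
                try { Thread.sleep(FRAME_MS - elapsed); }  // skip unnecessary iterations
                catch (InterruptedException ignored) { }
            }
        }
    }

    void onPause()  { synchronized (pauseLock) { paused = true; } }
    void onResume() { synchronized (pauseLock) { paused = false; pauseLock.notifyAll(); } }
    void shutdown() { running = false; onResume(); }  // wake the thread so it can exit

    private void updateAndDraw() { /* lockCanvas(), draw, unlockCanvasAndPost() */ }
}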
If you're trying to gauge the improvement over your previous version, I don't think it makes sense to compare against another game! Unless the two games do exactly the same thing, that is as unscientific as it gets.
Instead, I would grab the previous version of your app from source control, run it, measure it, and then run it with the latest code and compare it again.
To compare, you could for example use the command line tool "top" (definitely available in BusyBox if your phone is rooted; not sure whether it comes with a stock phone, probably not). That shows you the CPU usage of your process as a percentage.
There is a battery profiler developed by Qualcomm called Trepn Profiler: https://developer.qualcomm.com/mobile-development/increase-app-performance/trepn-profiler
how I can use the data coming from Traceview (i.e. CPU time in ms spent on each frame of the game) to determine battery usage (if this is at all possible)
In theory it would be possible to extrapolate your app's battery usage by looking at power consumption on a frame-by-frame basis. The best way to accomplish this would be to measure the power consumption of the CPU (only) over a given period (say two seconds) while your app is running its most CPU-intensive operation (GPU power usage could be gleaned the same way), while simultaneously recording Traceview data (such as frames per second), which gives you the traffic across the CPU/GPU for a given millisecond. Using this data you could calculate the average peak power consumption for your app fairly accurately by running the above test a few times (a rough worked example follows this list).
Here is why I say it is theory only: there are many variables to consider:
The number and nature of the other processes running at the time of the above test (processor intensive).
The method of evaluating the power draw across the CPU/GPU. While tools such as PowerTutor are effective for evaluating power consumption, they are less effective here because of the need to collect time-stamped power usage data. Additionally, just about any method of collecting power data introduces some extra overhead (a Schrödinger's cat problem), though how much that matters depends on the level of accuracy you require/desire.
The reason you want the power consumption information. If you are looking to characterise your app's power consumption for testing or beta evaluation, it is a feasible task with some determination and the proper tools. If you are looking to gather usable power consumption data "in the wild", on users' devices, I would say it is plausible but not realistic: the variables involved would make even the most determined and dedicated researcher faint. You would have to test every combination of device and Android version in the wild, and the combinations of running processes/threads and installed apps are likely incalculable.
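As a purely illustrative back-of-the-envelope example (all numbers assumed, not measured): if Traceview reports about 8 ms of CPU time per frame and the game runs at 60 FPS, the CPU is busy roughly 480 ms out of every second, i.e. about a 48% duty cycle. If the CPU draws on the order of 400 mW when active, that works out to roughly 0.19 W, or about 11-12 joules per minute attributable to the CPU alone; the display and radios come on top of that.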
I hope this provides some insight to your question, although I may have gone deeper into it than needed.
-Steve
For anyone looking, one resource we've been using that is extremely helpful is a free app from AT&T called ARO.
Give it a look: ARO
It has helped me before and I don't see it mentioned very often so thought I'd drop it here for anyone looking.
"I know I can look at the %s of
different apps through the settings,
but this is again unscientific, as the
figure I get from this also depends on
what's happening in all of the other
apps."
The first thing I'd do is hunt for an app already out there that has a known, consistent battery usage, and then you can just use that as a reference to determine your app's usage.
If there is no such app, you will have to hope for an answer from someone else... and if you are successful making such an app, I would suggest selling your new "battery usage reference" app so that other programmers could use it. :)
I know this question is old and it's late, but for anyone who comes here looking for a solution, I suggest you take a look at JouleUnit:
http://dnlkntt.wordpress.com/2013/09/28/how-to-test-energy-consumption-on-android-devices/
It integrates into Eclipse and gives you a great amount of detail about how much battery your app is consuming.
I know of three options that can give you a scientific measurement:
Use hardware built specifically for this: the Monsoon High Voltage Power Monitor.
https://msoon.github.io/powermonitor/PowerTool/doc/Power%20Monitor%20Manual.pdf
Download and install Trepn Profiler (a tool from Qualcomm) on your phone. You won't need a computer for reporting; reports are live and real-time on the phone. You can download Trepn Profiler from the following link: https://play.google.com/store/apps/details?id=com.quicinc.trepn&hl=en_US
Please note that on recent phones (Android 6+) it works in estimation mode. If you need accurate numbers, you need one of a select list of supported devices. Check the following link for the list:
https://developer.qualcomm.com/software/trepn-power-profiler/faq
You can profile apps separately, or the whole system.
Use batterystats and Battery Historian from Google.
https://developer.android.com/studio/profile/battery-historian

Java: Anyone know of a library that detects the quality of an internet connection?

I know a simple URLConnection to Google can detect whether I am connected to the internet; after all, if I can't connect to Google, the internet might as well be down. But what I am looking for at this juncture is a library that can measure how effective my connection to the internet is in terms of BOTH responsiveness and available bandwidth. BUT, I do not want to measure how much bandwidth is potentially available, as that is too resource-intensive. I really just need to be able to test whether or not I can receive something like X KB in Y amount of time. Does such a library already exist?
It's not really possible to judge this. In today's world of ADSL 2+ with 20-odd Mb/s download speeds, you're largely governed by the speed of everything upstream of you. So if you're connecting to a site in another country, for example, the main bottleneck is probably the international link. If you're connected to a site in the same city as you are, then you're probably limited by that server's uplink speed (e.g. it might be 10MB/s and serving lots of people at once).
So the answer to the question "can I receive X KB in at most Y seconds" depends entirely on where you're downloading from. Therefore, the best way to answer it is to actually start downloading from wherever it is you plan to download from, and time it.
In terms of responsiveness, it's basically the same question. You can do an ICMP ping to the server in question, but many servers sit behind firewalls that drop ICMP packets without replying, so it's not exactly accurate (besides, if the ping is much less than ~100ms, the biggest contribution to latency probably comes from the server's internal processing rather than the network itself, which makes an ICMP ping fairly useless anyway).
This is true of network characteristics in general, and of the internet in particular (because it's so complex): you can't reliably measure anything about site X and infer anything about site Y. If you want to know how fast site Y will respond, you just have to connect to site Y and start downloading.
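If you do want to wrap "can I receive X KB in at most Y seconds from my actual server" in code, it only takes a few lines of plain java.net; the URL, buffer size and timeouts below are placeholders:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ThroughputProbe {
    // Returns true if at least maxBytes could be read from url within maxMillis.
    public static boolean canReceive(String url, int maxBytes, long maxMillis) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(5000);
        long start = System.currentTimeMillis();
        int total = 0;
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[8192];
            int n;
            while (total < maxBytes && (n = in.read(buf)) != -1) {
                total += n;
                if (System.currentTimeMillis() - start > maxMillis) {
                    return false;                     // took too long
                }
            }
        } finally {
            conn.disconnect();
        }
        return total >= maxBytes;                     // got enough bytes in time
    }
}

Note that, as the answers above point out, the result only tells you about the path to that particular server at that particular moment.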
Calculating the user's ability to reliably download a given number of bits in a given period of time might be complex -- but you could start with some of the code found at http://commons.apache.org/net/. That can tell you latency and bandwidth, anyway.
The answer may be wrong a millisecond (substitute any other period) after you've measured it.
Look at any application that gives you a "download time remaining" figure. Notice that it's generally incorrect and/or continually updating, and only becomes accurate at the last second.
Basically, so much change is inevitable over any reasonably complex network, such as the internet, that the only real measure is only available after the fact.

Sanity Check - Is a Multiplayer Game Server in Java using TCP (ServerSocket) viable?

Please stop me before I make a big mistake :) - I'm trying to write a simple multi-player quiz game for Android phones to get some experience writing server code.
I have never written server code before.
I have experience in Java and using Sockets seems like the easiest option for me. A browser game would mean platform independence, but I don't know how to get around the lack of server-to-browser push over HTTP.
This is how the game would play out, it should give some idea of what I require;
A user starts the App and it connects using a Socket to my server.
The server waits for 4 players, groups them into a game and then broadcasts the first question for the quiz.
After all the players have submitted their answers (Or 5 seconds has elapsed) the Server distributes the correct answer with the next question.
That's the basics, you can probably fill in the finer details, it's just a toy project really.
MY QUESTION IS:
What are the pitfalls of using a simple JAR on the server to handle client requests? The server code registers a ServerSocket when it is first run and creates a thread pool for dealing with incoming client connections. Is there an option that is inherently better for connection to multiple clients in real time with two way communication?
A simple example is in the Sun tutorials; at the bottom you can see the source for a multithreaded server. The difference is that I create a pool of threads up front to reduce overhead, but my server is largely the same.
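In case it helps, here is roughly what that design looks like (a stripped-down sketch; the port, pool size and handler body are placeholders):

import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class QuizServer {
    public static void main(String[] args) throws IOException {
        ExecutorService pool = Executors.newFixedThreadPool(32); // plenty for ~20 players
        try (ServerSocket server = new ServerSocket(5000)) {
            while (true) {
                Socket client = server.accept();                 // blocks until a player connects
                pool.execute(() -> handle(client));              // hand the socket to a pooled thread
            }
        }
    }

    private static void handle(Socket client) {
        try (Socket s = client) {
            // read answers / write questions over s.getInputStream() / s.getOutputStream()
        } catch (IOException e) {
            // player dropped; remove them from their game
        }
    }
}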
How many clients would you expect such a system to be able to handle? If there is a new thread for each client, I can see that being a limit, as well as the number of free sockets for concurrent players. Threads seem to top out at around 6,500, with the number of available sockets nearly ten times that.
To be honest If my game could handle 20 concurrent players that would be fine but I'm trying to learn if this approach is inherently stupid. Any articles on setting up a simple chess server or something would be amazing, I just can't find any.
Thanks in advance oh knowledgeable ones,
Gav
You can handle 20 concurrent players fine with a Java server. The biggest thing to make sure you do is avoid any kind of blocking I/O like it was the devil itself.
As a bonus, if you stick with non-blocking I/O you can probably do the whole thing single-threaded.
Scaling much past 100 users or so may require multiple processes/servers, depending on how much load each user places on your server.
It should be able to do it without an issue as long as you code it properly.
Project Darkstar
You can get around the "push from server to client over HTTP" problem by using the Long Poll method.
However, using TCP sockets for this will be fine too. Plenty of games have been written this way.
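For completeness, the client half of long polling is only a few lines; the server just has to hold each request open until it has something to say. The URL and timeout here are placeholders:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.SocketTimeoutException;
import java.net.URL;

public class LongPollClient {
    public static void poll(String url) throws Exception {
        while (true) {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setReadTimeout(60000);  // the server holds the request open until it has news
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println("push from server: " + line);
                }
            } catch (SocketTimeoutException e) {
                // no news within the timeout; just reconnect and wait again
            } finally {
                conn.disconnect();
            }
        }
    }
}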
