Make GWT application work even on low bandwidth - java

I am working on a GWT application where I want to choose which files to upload / download based on the internet bandwidth.
Is there any way to detect the internet bandwidth / speed in GWT, so I can decide which files should be loaded?
Thanks.

You may split your GWT application into parts, so that a part is loaded only when it is required. This is fully documented, see here. Very powerful stuff.
It makes the initial load smaller and faster, no matter what your connection speed is. Each time a new "feature" is used, a small additional package is downloaded (only the first time it is used). This works transparently if you follow the documentation.
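For reference, a minimal sketch of a split point, roughly following the documented pattern (the FeatureView widget is just an illustrative placeholder):
GWT.runAsync(new RunAsyncCallback() {
  public void onFailure(Throwable reason) {
    Window.alert("Could not load this part of the app: " + reason.getMessage());
  }
  public void onSuccess() {
    // Everything reachable only from here ends up in a separate fragment,
    // downloaded the first time this code path runs.
    RootPanel.get().add(new FeatureView());   // FeatureView is a placeholder widget
  }
});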
Also, make sure you configure the web server properly, so that it makes full use of the cacheability of the GWT *.cache.js parts.

Upload speed can change a few seconds after you have measured it, and then change again and again. And if you try to measure it before loading each file, the overhead of doing it may outweigh any benefits.
If this is critical to your app, you can offer "full" and "simplified" versions of your app and let users choose. This is the approach used by Chess.com, for example, where connection speed is very important for the player experience. Many websites offer a lighter "mobile" version of their apps compared to a more complete "desktop" version, which often means different image resolutions as well.
Another approach is to allow users to choose between "standard" and "HD" resolutions for media files, but it can be applied to other file types too. This is an approach used by YouTube, iTunes (for movies) and many other websites.

Well, there isn't anything providing this info, unless you try to download or upload something and measure the time on the client side.
Measuring the download bandwidth could be as simple as computing the time your app took to load, or better, sending a request to your server for a response of known size.
1.- Case 1 (I use gquery for simplicity)
Set this in your index.html
<script>
window._app_start_loading = new Date().getTime();
</script>
Then compute the time your app took to load:
public void onModuleLoad() {
  double start = $(window).prop("_app_start_loading");
  // Elapsed time since index.html started loading.
  double millisecs = Duration.currentTimeMillis() - start;
}
Based on the size of your app's JS and experience, you can estimate whether the user is on a poor-quality network.
2.- Case 2: The example above has a problem: the browser will cache the request, so it only works the first time the user runs your app. It is much more accurate and reliable to do something like:
public void onModuleLoad() {
  final double start = Duration.currentTimeMillis();
  // The timestamp in the URL prevents the browser from caching the request.
  GQuery.post("my_app/a_fixed_sized_request?" + Duration.currentTimeMillis(), null, new Function() {
    public void f() {
      double millisecs = Duration.currentTimeMillis() - start;
    }
  });
}
Now you know the size is always the same, and the request is not being cached.
3.- Measuring upload. You can do the same as in case 2, but sending some POST data of a known size in the request; replace the second parameter.
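If you prefer not to depend on the exact data format gquery expects for that second parameter, the same idea can be sketched with plain GWT's RequestBuilder; the probe URL and the 100 KB payload size below are illustrative assumptions:
// Build a payload of known size (100 KB of dummy data here).
StringBuilder payload = new StringBuilder();
for (int i = 0; i < 100 * 1024; i++) {
  payload.append('x');
}
final double start = Duration.currentTimeMillis();
RequestBuilder rb = new RequestBuilder(RequestBuilder.POST,
    "my_app/upload_probe?" + Duration.currentTimeMillis());
try {
  rb.sendRequest(payload.toString(), new RequestCallback() {
    public void onResponseReceived(Request request, Response response) {
      // Approximate upload rate: (100 * 1024) bytes / millisecs.
      double millisecs = Duration.currentTimeMillis() - start;
    }
    public void onError(Request request, Throwable exception) {
      // Treat the bandwidth as unknown.
    }
  });
} catch (RequestException e) {
  // Ignore for this sketch.
}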

Related

Selenium WebDriver / Java - Simulate Human Like Cursor Movement

I want to automate a simple task inside Facebook Ads Manager. This task involves setting up a campaign and uploading some ads. It can take a human 30 minutes to do this. However, they're doing the same thing every single time. Often with mistakes. It's something that should be automated. Done without human emotion or mistakes.
Facebook is very sensitive and I don't want it to ban me for the wrong reasons, so the automation needs to feel human. I can take my time between clicks; however, the cursor movement itself needs to feel human. I only need to simulate a real human click for ethical purposes.
Say I get an element I want to move my cursor towards:
// Set file path of chrome driver
System.setProperty("webdriver.chrome.driver", "C:\\chromedriver.exe");
// Create the driver object
WebDriver driver = new ChromeDriver();
// Go to URL
driver.get("FACEBOOK URL");
// Get element
WebElement element = driver.findElement(By.xpath("//span[contains(text(), 'Setup Campaign')]"));
What is the best way to move my cursor towards this element as a real human would?
A real human would first move the mouse. Not just click the element
They would move the mouse/cursor slowly. It could take up to say 500-1000 milliseconds. Certainly not instantly.
They would move the mouse/cursor in a curved fashion. Not just in a 100% straight line. Possibly, in a random fashion? So some elements of randomness may be needed?
I'm quite new to Selenium, so any input would be greatly appreciated.
I am writing my code in Java :)
WebDriver doesn't use operating system input; it communicates directly with the browser via the HTTP protocol. If you want to simulate input like a 'real' mouse you have to use an automation solution based on operating system frameworks. In the case of Windows you can use e.g.:
https://github.com/FlaUI/FlaUI (read https://github.com/FlaUI/FlaUI/wiki/FAQ to learn how to configure Chrome to expose web controls for FlaUI)
https://github.com/microsoft/WinAppDriver
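If you want to stay in plain Java, one OS-level option outside WebDriver is java.awt.Robot, which moves the real cursor. A rough sketch of slow, slightly curved movement toward a target point; the step count, timings and jitter are arbitrary choices, and mapping a WebElement's page coordinates to screen coordinates (browser window offset, toolbar height) is left out:
import java.awt.Robot;
import java.util.Random;

public class HumanMove {
    // Moves the cursor from (startX, startY) to (targetX, targetY) in ~50 small steps,
    // taking roughly 500-1000 ms, with a slight curve and random jitter.
    public static void moveTo(int startX, int startY, int targetX, int targetY) throws Exception {
        Robot robot = new Robot();
        Random random = new Random();
        int steps = 50;
        for (int i = 1; i <= steps; i++) {
            double t = (double) i / steps;
            int x = (int) (startX + (targetX - startX) * t
                    + Math.sin(t * Math.PI) * 20 + random.nextInt(3) - 1);
            int y = (int) (startY + (targetY - startY) * t
                    + Math.sin(t * Math.PI) * 10 + random.nextInt(3) - 1);
            robot.mouseMove(x, y);
            Thread.sleep(10 + random.nextInt(10));
        }
        robot.mouseMove(targetX, targetY);
    }
}
Note that Robot only works when the browser window is visible on a real desktop, so this does not apply to headless runs.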
I understand that this is not exactly what you asked, but in this case I recommend using the Facebook API instead of Selenium.
It's more stable than your approach and comes without the risk of getting banned.
https://developers.facebook.com/docs/marketing-api/reference/v12.0

Space Restriction for Client side caching

I have a search result page that returns around 40 images. I use mongohq to store my images.
Now these image will never change. They will either be removed or left as is.
So my Spring servlet streams the images after reading them from MongoHQ, based on the image id:
/app/download/{uniqueImageId}
All works well, except for the load times when streaming the images. These images will remain constant for their unique ids, so why not cache them? I can add a filter that applies to the above URL pattern and adds a caching header, which I plan to give a really long value, maybe caching the images for a week.
My question is, if I start telling the client's browser to cache all these 40+ images, will it cache all of them?
Aren't there any space restrictions on the client side?
Do you see any better option to handle such scenario?
My question is, if i start telling the client's browser to cache all these 40+ images, will it cache all these images? Aren't there any space restrictions from the client side?
Of course, there are space restrictions on the client side (also, the storage space of the whole world is limited... uhm, sorry for that...). The user may restrict the caching space, and/or the browser automatically takes the free space available for caching.
Typically I would expect the browser cache to be at least some megabytes (let's say 100+), so frequently needed images, like icons, transferred in a session will be cached. Whether an image is still in the cache when the user visits your site three days later depends on the cache size and the user's activity in between. So you never know.
What the client or any intermediate proxies do is out of your direct control. The only thing you do by setting the caching headers is say that it is legal not to refresh this resource for a while. Make sure you understand the HTTP/1.1 headers correctly if you do set them in your application.
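For example, a minimal sketch of such a filter, assuming the standard Servlet API and the /app/download/* mapping from the question (one week = 604800 seconds):
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

// Mapped to /app/download/* in web.xml (or via @WebFilter).
public class ImageCacheFilter implements Filter {
    public void init(FilterConfig config) { }
    public void destroy() { }

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        // The images never change for a given id, so allow caching for a week.
        ((HttpServletResponse) res).setHeader("Cache-Control", "public, max-age=604800");
        chain.doFilter(req, res);
    }
}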
Do you see any better option to handle such scenario?
The term "better" isn't very exact here. What exactly do you need to optimize?
If you have a lot of requests on the same image set, you can reduce server and database load by putting an edge server, like nginx, in front of your application, configured as a caching reverse proxy. In this case, your own edge server interprets the caching headers. In general, I consider it good design if an application has no significant load from serving static resources.

apps, remote scripts and security/obfuscation

I will construct a fictional app in order to construct my question.
I write a kind of treasure hunt app where the user gets a prize if they visit several locations around town. In effect, the app gets their current lat/lon and checks its proximity to the list of "treasure locations"; if they are within 10 meters of any treasure location they get a notification.
The app will then do an HTTP POST to a remote script which basically inserts into a database. The POST parameters will be the uuid of the device and the location they visited.
An attacker could easily watch Wireshark and get the name of the script along with the parameters. They could go further, decompile the apk and get other things such as any hashing/obfuscation. They could then just use curl to post willy-nilly as they pleased and the game would be ruined for non-cheaters. This is a problem I have never had to really address, since in all the apps I have written there is always data which isn't sensitive and I don't mind it being exposed to the public.
What do I do?
The best thing you could do is to send the data in a secure manner. Using HTTPS would be a much better choice, regardless of method. This effectively prevents eavesdroppers; it is the fundamental technology behind any secure communication on the internet.
Aside from the protocol to communicate with the server, there are still insecurities. Essentially, there are three methods that could work to overcome these.
The location of the player could be sent to the server at some periodic interval. The server responds back if they are close enough to one of the areas. Perhaps the server could include enough smarts to know that it takes time to get from point A to point B.
A single location could be sent at a time to the app. The track of the user could also be uploaded, to verify that the location is correct.
The locations could be sent through a one-way function to the program. The real answer could then be sent to the server. The problem with this is that the exact location would need to be discovered in order for the same hash to result. However, as GPS coordinates tend to only be accurate to a few meters, and don't tend to give many extra digits, multiple values could be tested near the current location. The one-way function would have to require some time to calculate to be effective, as otherwise it would be trivial for a bad guy to simply test every square meter in the city to figure out what would work.
The best method from a security standpoint would be the first, as at no time does the application know where it is supposed to go, until it reaches that location. Of course, this pings the server a large number of times needlessly.
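For the third method, a minimal sketch of a deliberately slow one-way check over rounded coordinates (the 4-decimal rounding grid, the iteration count and the class/method names are assumptions for illustration):
import java.util.Arrays;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

public class TreasureCheck {
    // Returns true if the rounded (lat, lon) matches the stored slow hash of one treasure location.
    // Rounding to 4 decimal places (roughly 11 m) lets nearby GPS fixes map to the same input.
    public static boolean isAtTreasure(double lat, double lon, byte[] storedHash, byte[] salt)
            throws Exception {
        String rounded = String.format("%.4f:%.4f", lat, lon);
        // PBKDF2 with a high iteration count makes each guess expensive on purpose.
        PBEKeySpec spec = new PBEKeySpec(rounded.toCharArray(), salt, 100000, 256);
        SecretKeyFactory factory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1");
        byte[] hash = factory.generateSecret(spec).getEncoded();
        return Arrays.equals(hash, storedHash);
    }
}
The client only needs to try the handful of grid cells around its current GPS fix, while the high iteration count is what makes brute-forcing every cell in the city impractical.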

Aside from RPC calls, what could be taking my App Engine Program so long

I'm trying to performance optimize a page load in GAE, and I'm a bit stumped what is taking so long to serve the page.
When I first got Appstats running I found the page was making about 500-600 RPC calls. I've now got that down to 3.
However, I'm still seeing a massive load of extra time in App Stats. Another page on my site (using the same django framework + templating) loads in about 60ms, doing a small query to a small data set.
Question is, what is this overhead, and where should I be looking for trouble points?
The data in the request has about 350 records, and about 30 properties per record. I'm cool with the data call itself taking the datastore api time, but it's the other time I'm confused about. The data does get stepped through a biggish iterator, and I've now used fetch on most of these requests to keep the RPC call down, and make sure things are in memory rather than being queried as they go.
Slow request (screenshot): look at all the extra blue.
Fast request (screenshot): RPC blue is matched against the overall blue.
EDIT
OK, so I have created a new model called FastModel, and copied the bare minimum items needed for the page to it, so it can load as quickly as possible, and it does make a big difference. Seems there are things on the Model that slow it all down. Will investigate further.
Deserializing 350 records, especially large ones, takes a long time. That's probably what's taking up the bulk of your execution time.

Retrieve multiple images from server quickly

For my BlackBerry application, I am using a single thread to retrieve images from the server one at a time. My application has a number of images and it takes too long to load all the images. How can I speed this up?
If these are static images, you can also do something like CSS sprites - stitch them all together into one big image, then in code you display the portion of the large image that corresponds to the original image you want.
The last two arguments to Graphics.drawImage(...) indicate where to start drawing from the original image, and that's how you would select the part you want.
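On BlackBerry the same thing can be done with the drawBitmap variant; a rough sketch inside a Field's paint() method, where the 48x48 tile size and the spriteSheet bitmap are illustrative assumptions:
// spriteSheet is a Bitmap holding all the icons side by side, loaded once.
protected void paint(Graphics graphics) {
    int tile = 48;      // width/height of one sub-image
    int index = 3;      // which sub-image to draw
    // The last two arguments are the left/top offset into the big source bitmap.
    graphics.drawBitmap(0, 0, tile, tile, spriteSheet, index * tile, 0);
}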
Use multiple threads instead of one. Also, if this is a server that you control, consider pre-sizing the images for the target devices or having the device send its size to the server to generate and cache device specific images.
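A rough sketch of fanning the downloads out over a few worker threads; downloadImage() and showImage() stand in for whatever single-image fetch and display code you already have, and three workers is an arbitrary choice:
final String[] urls = imageUrls;   // the list of image URLs you already have
final int workers = 3;
for (int w = 0; w < workers; w++) {
    final int offset = w;
    new Thread(new Runnable() {
        public void run() {
            // Simple static partitioning: each thread takes every 3rd URL.
            for (int i = offset; i < urls.length; i += workers) {
                Bitmap image = downloadImage(urls[i]);   // hypothetical: your existing fetch
                showImage(i, image);                      // hypothetical: hand the result to the UI
            }
        }
    }).start();
}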
It's too late, but sorry for that.
I have used the observer pattern for this.
Link: http://en.wikipedia.org/wiki/Observer_pattern
Thanks
@Peter
Threads on a mobile phone are a bad idea. Firstly, threading on phones is poor; secondly, phones can't really handle more than one HTTP connection at a time, and things bomb out.
@Userbb
You can do sneaky things like streaming them via a socket connection OR including multiple images in a single HTTP request (creating a connection and HTTP headers have overhead).
Also, definitely do what @Peter suggested about resizing server-side.
