Determine a YouTube channel's upload rate using YouTube Data API v3 - java

I am writing a Java application that uses YouTube Data API v3. I want to be able to determine a channel's upload rate. For example, if a channel is one week old, and has published 2 videos, I want some way to determine that the channel's upload rate is 2 videos/week. How would I do this using the YouTube API?
import com.google.api.client.googleapis.json.GoogleJsonResponseException;
import com.google.api.client.http.HttpRequest;
import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.youtube.YouTube;
import com.google.api.services.youtube.model.Channel;
import com.google.api.services.youtube.model.ChannelListResponse;
import java.io.IOException;
import java.io.InputStream;
import java.security.GeneralSecurityException;
import java.util.Collection;
import java.util.Collections;
import java.util.Properties;
public class ApiExample {
    public static void main(String[] args)
            throws GeneralSecurityException, IOException, GoogleJsonResponseException {
        Properties properties = new Properties();
        try {
            InputStream in = ApiExample.class.getResourceAsStream("/youtube.properties");
            properties.load(in);
        } catch (IOException e) {
            System.err.println("There was an error reading youtube.properties: "
                    + e.getCause() + " : " + e.getMessage());
            System.exit(1);
        }
        YouTube youtubeService = new YouTube.Builder(new NetHttpTransport(), new JacksonFactory(),
                new HttpRequestInitializer() {
                    public void initialize(HttpRequest request) throws IOException {
                    }
                }).setApplicationName("API Demo").build();
        // Define and execute the API request
        YouTube.Channels.List request = youtubeService.channels()
                .list("snippet,contentDetails,statistics");
        String apiKey = properties.getProperty("youtube.apikey");
        request.setKey(apiKey);
        ChannelListResponse response = request.setId("UC_x5XG1OV2P6uZZ5FSM9Ttw").execute();
        for (Channel channel : response.getItems()) {
            /* What do I do here to get the individual channel's upload rate? */
        }
    }
}
The above example uses the YouTube Developers channel, but I want to be able to do this with any channel.

According to the official docs, once you invoke the Channels.list API endpoint -- which returns the specified channel's metadata as a Channels resource -- you have the following property at your disposal:
statistics.videoCount (unsigned long)
The number of public videos uploaded to the channel.
Therefore, things are almost obvious: persist the value returned by this property (e.g. save it into a file) and arrange for your program to run weekly so that it can compute your desired upload rate.
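The rate computation itself is just arithmetic on two persisted readings. A minimal sketch (how you persist the previous count and timestamp is up to you; the counts used in `main` are made-up illustration values):

```java
import java.time.Duration;
import java.time.Instant;

public class UploadRate {
    // Videos per week between two observations of statistics.videoCount.
    static double weeklyRate(long previousCount, Instant previousCheck,
                             long currentCount, Instant now) {
        double days = Duration.between(previousCheck, now).toMillis() / 86_400_000.0;
        return (currentCount - previousCount) / days * 7.0;
    }

    public static void main(String[] args) {
        Instant weekAgo = Instant.now().minus(Duration.ofDays(7));
        // A channel whose videoCount went from 10 to 12 over one week
        // has an upload rate of about 2 videos/week.
        System.out.println(weeklyRate(10, weekAgo, 12, Instant.now()));
    }
}
```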
Now, as far as your code above is concerned, you should first get rid of:
for (Channel channel : response.getItems()) {
    /* What do I do here to get the individual channel's upload rate? */
}
since the items property will contain at most one item. A good practice would be to assert this condition:
assert response.getItems().size() <= 1;
The value of the needed videoCount property is accessible via the getVideoCount method of the ChannelStatistics class:
response.getItems().get(0).getStatistics().getVideoCount()
Of course, since it is always good to ask the API for only the info that is really of use, I would also recommend using the fields parameter (the setFields method) in the form:
request.setFields("items(statistics(videoCount))"),
inserted, for example, after request.setKey(apiKey).
This way the API will send back to you only the property that you need.
Addendum
I also have to mention that the assertion above is correct only when you pass a single channel ID to the API endpoint (as you currently do within your code above). If in the future you want to compute the upload rate of N channels in one go (with N <= 50), then the condition above will become size() <= N.
Calling Channels.list on multiple channels in one go is possible, since this endpoint's id parameter may be specified as a comma-separated list of channel IDs.
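A sketch of building such a request (the second channel ID is a made-up placeholder; only the comma-joining of IDs is shown here):

```java
import java.util.List;

public class MultiChannelIds {
    // Channels.list accepts up to 50 channel IDs joined by commas.
    static String joinIds(List<String> channelIds) {
        return String.join(",", channelIds);
    }

    public static void main(String[] args) {
        String ids = joinIds(List.of("UC_x5XG1OV2P6uZZ5FSM9Ttw", "UCanotherChannelId"));
        System.out.println(ids);
        // then: request.setId(ids).execute();
    }
}
```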

Tensorflow 2.0 & Java API

(note, I've resolved my problem and posted the code at the bottom)
I'm playing around with TensorFlow, and the backend processing must take place in Java. I've taken one of the models from https://developers.google.com/machine-learning/crash-course and saved it with tf.saved_model.save(my_model, "house_price_median_income") (using a docker container). I copied the model off and loaded it into Java (using the 2.0 stuff built from source, because I'm on Windows).
I can load the model and run it:
try (SavedModelBundle model = SavedModelBundle.load("./house_price_median_income", "serve")) {
    try (Session session = model.session()) {
        Session.Runner runner = session.runner();
        float[][] in = new float[][]{ {2.1518f} };
        Tensor<?> jack = Tensor.create(in);
        runner.feed("serving_default_layer1_input", jack);
        float[][] probabilities = runner.fetch("StatefulPartitionedCall").run().get(0).copyTo(new float[1][1]);
        for (int i = 0; i < probabilities.length; ++i) {
            System.out.println(String.format("-- Input #%d", i));
            for (int j = 0; j < probabilities[i].length; ++j) {
                System.out.println(String.format("Class %d - %f", i, probabilities[i][j]));
            }
        }
    }
}
The above is hardcoded to an input and output but I want to be able to read the model and provide some information so the end-user can select the input and output, etc.
I can get the inputs and outputs with the python command: saved_model_cli show --dir ./house_price_median_income --all
What I want to do is get the inputs and outputs via Java so my code doesn't need to execute a Python script to get them. I can get operations via:
Graph graph = model.graph();
Iterator<Operation> itr = graph.operations();
while (itr.hasNext()) {
    GraphOperation e = (GraphOperation) itr.next();
    System.out.println(e);
}
And this outputs both the inputs and outputs as "operations", BUT how do I know whether one is an input and/or an output? The Python tool uses the SignatureDef, but that doesn't seem to appear in the TensorFlow 2.0 Java stuff at all. Am I missing something obvious, or is it just missing from the TensorFlow 2.0 Java library?
NOTE: I've sorted my issue with the help of the answer below. Here is my full bit of code in case somebody would like it in the future. Note this is TF 2.0 and uses the SNAPSHOT mentioned below. I make a few assumptions, but it shows how to pull the input and output and then use them to run a model.
import org.tensorflow.SavedModelBundle;
import org.tensorflow.Session;
import org.tensorflow.Tensor;
import org.tensorflow.exceptions.TensorFlowException;
import org.tensorflow.Graph;
import org.tensorflow.Operation;
import org.tensorflow.Output;
import org.tensorflow.GraphOperation;
import org.tensorflow.proto.framework.SignatureDef;
import org.tensorflow.proto.framework.MetaGraphDef;
import org.tensorflow.proto.framework.TensorInfo;
import org.tensorflow.types.TFloat32;
import org.tensorflow.tools.Shape;
import org.tensorflow.tools.buffer.DataBuffers;
import org.tensorflow.tools.ndarray.FloatNdArray;
import org.tensorflow.tools.ndarray.StdArrays;
import java.nio.FloatBuffer;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Map;

public class v2tensor {
    public static void main(String[] args) {
        try (SavedModelBundle savedModel = SavedModelBundle.load("./house_price_median_income", "serve")) {
            SignatureDef modelInfo = savedModel.metaGraphDef().getSignatureDefMap().get("serving_default");
            TensorInfo input1 = null;
            TensorInfo output1 = null;
            Map<String, TensorInfo> inputs = modelInfo.getInputsMap();
            for (Map.Entry<String, TensorInfo> input : inputs.entrySet()) {
                if (input1 == null) {
                    input1 = input.getValue();
                    System.out.println(input1.getName());
                }
                System.out.println(input);
            }
            Map<String, TensorInfo> outputs = modelInfo.getOutputsMap();
            for (Map.Entry<String, TensorInfo> output : outputs.entrySet()) {
                if (output1 == null) {
                    output1 = output.getValue();
                }
                System.out.println(output);
            }
            try (Session session = savedModel.session()) {
                Session.Runner runner = session.runner();
                FloatNdArray matrix = StdArrays.ndCopyOf(new float[][]{ { 2.1518f } });
                try (Tensor<TFloat32> jack = TFloat32.tensorOf(matrix)) {
                    runner.feed(input1.getName(), jack);
                    try (Tensor<TFloat32> rezz = runner.fetch(output1.getName()).run().get(0).expect(TFloat32.DTYPE)) {
                        TFloat32 data = rezz.data();
                        data.scalars().forEachIndexed((i, s) -> {
                            System.out.println(s.getFloat());
                        });
                    }
                }
            }
        } catch (TensorFlowException ex) {
            ex.printStackTrace();
        }
    }
}
What you need to do is to read the SavedModelBundle metadata as a MetaGraphDef, from there you can retrieve input and output names from the SignatureDef, like in Python.
In TF Java 1.* (i.e. the client you are using in your example), the proto definitions are not available out-of-the-box from the tensorflow artifact, you need to add a dependency to org.tensorflow:proto as well and deserialize the result of SavedModelBundle.metaGraphDef() into a MetaGraphDef proto.
In TF Java 2.* (the new client, currently only available as snapshots from here), the protos are present right away, so you can simply call this line to retrieve the right SignatureDef:
savedModel.metaGraphDef().getSignatureDefMap().get("serving_default")

Data from Arduino to Processing isn't affecting Processing 'If Statement'

I'm trying to get a specific user's tweets into Processing and then have them spoken out using the TTS library, but only have them spoken when a specific value (491310) is detected from Arduino over serial.
I've got the tweets coming into Processing and can have them printed and spoken, and the value 491310 is picked up by Processing, BUT it's the placement of the if statement (if (sensor == 491310) {) that I'm struggling with, as it currently has no effect. Can anyone solve this one?
Absolute novice here, any help would be great. Thanks.
import twitter4j.util.*;
import twitter4j.*;
import twitter4j.management.*;
import twitter4j.api.*;
import twitter4j.conf.*;
import twitter4j.json.*;
import twitter4j.auth.*;
import guru.ttslib.*;
import processing.serial.*;
TTS tts;
Serial myPort;
int sensor = 0;

void setup() {
    tts = new TTS();
    myPort = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
    ConfigurationBuilder cb = new ConfigurationBuilder();
    cb.setOAuthConsumerKey("XXXX");
    cb.setOAuthConsumerSecret("XXXX");
    cb.setOAuthAccessToken("XXXX");
    cb.setOAuthAccessTokenSecret("XXXX");
    java.util.List statuses = null;
    Twitter twitter = new TwitterFactory(cb.build()).getInstance();
    String userName = "TWITTER HANDLE";
    int numTweets = 19;
    String[] twArray = new String[numTweets];
    try {
        statuses = twitter.getUserTimeline(userName);
    }
    catch (TwitterException e) {
    }
    if (statuses != null) {
        for (int i = 0; i < statuses.size(); i++) {
            Status status = (Status) statuses.get(i);
            if (sensor == 491310) {
                println(status.getUser().getName() + ": " + status.getText());
                tts.speak(status.getUser().getName() + ": " + status.getText());
            }
        }
    }
}

void serialEvent(Serial myPort) {
    int inByte = myPort.read();
    sensor = inByte;
    print(sensor);
}
Reading from a serial port returns a byte (8 bits), not a 16-bit integer. The value of sensor can never exceed 255, so it never matches 491310. You'll have to combine multiple reads to reconstruct the full value; note that 491310 doesn't even fit in 16 bits, so you need at least three bytes (or send the number as text and parse it).
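The byte assembly can be sketched as below (an assumption: the Arduino sends the most significant byte first; match the order to whatever your Arduino sketch actually writes):

```java
public class SerialBytes {
    // Combine three consecutive serial reads (MSB first) into one int.
    static int fromBytes(int b2, int b1, int b0) {
        return ((b2 & 0xFF) << 16) | ((b1 & 0xFF) << 8) | (b0 & 0xFF);
    }

    public static void main(String[] args) {
        // 491310 == 0x077F2E, i.e. the three bytes 0x07, 0x7F, 0x2E.
        System.out.println(fromBytes(0x07, 0x7F, 0x2E)); // 491310
    }
}
```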
My guess is that you're hitting twitter's rate limit. Twitter only allows a certain amount of API calls in a given 15 minute window. And since you're calling getUserTimeline() in the draw() function (which happens 60 times per second), you're going to hit that limit pretty fast.
So you're probably getting a TwitterException, but you're just ignoring it. Never use an empty catch block! At least put a call to e.printStackTrace() in there:
catch(TwitterException e) {
e.printStackTrace();
}
To fix the problem, you're going to have to modify your code to only check for tweets once at the beginning of the program. Move all of your logic for fetching the tweets into the setup() function, and then move the logic for printing them out into the serialEvent() function.
If you still can't get it working, then you're going to have to do some debugging: what is the value of every single variable in your sketch? Use the println() function to help figure that out. Is statuses == null? What is the value of statuses.size()? What is the value of sensor? Once you know that, you'll be able to figure out exactly what's going wrong with your code. But my bet would be it's the twitter rate limit, so check that first.

App Engine Backend with Google Cloud Messaging sending message to more than 1000 users

I want to send a message (e.g. "Update available") to all users (~15,000). I have implemented an App Engine backend with Google Cloud Messaging to send messages.
I have tested on 2 devices and got the message on both. But the Google docs say "GCM supports up to 1,000 recipients for a single message."
My question is: how do I send the same message to the remaining 14,000 users in my case? Or will the code below take care of it?
Below is the code which sends the message:
import com.google.android.gcm.server.Constants;
import com.google.android.gcm.server.Message;
import com.google.android.gcm.server.Result;
import com.google.android.gcm.server.Sender;
import com.google.api.server.spi.config.Api;
import com.google.api.server.spi.config.ApiNamespace;
import java.io.IOException;
import java.util.List;
import java.util.logging.Logger;
import javax.inject.Named;
import static com.example.shani.myapplication.backend.OfyService.ofy;
/**
 * An endpoint to send messages to devices registered with the backend
 * <p/>
 * For more information, see
 * https://developers.google.com/appengine/docs/java/endpoints/
 * <p/>
 * NOTE: This endpoint does not use any form of authorization or
 * authentication! If this app is deployed, anyone can access this endpoint! If
 * you'd like to add authentication, take a look at the documentation.
 */
@Api(name = "messaging", version = "v1", namespace = @ApiNamespace(ownerDomain = "backend.myapplication.shani.example.com", ownerName = "backend.myapplication.shani.example.com", packagePath = ""))
public class MessagingEndpoint {
    private static final Logger log = Logger.getLogger(MessagingEndpoint.class.getName());

    /**
     * Api Keys can be obtained from the google cloud console
     */
    private static final String API_KEY = System.getProperty("gcm.api.key");

    /**
     * Send to the first 10 devices (You can modify this to send to any number of devices or a specific device)
     *
     * @param message The message to send
     */
    public void sendMessage(@Named("message") String message) throws IOException {
        if (message == null || message.trim().length() == 0) {
            log.warning("Not sending message because it is empty");
            return;
        }
        // crop longer messages
        if (message.length() > 1000) {
            message = message.substring(0, 1000) + "[...]";
        }
        Sender sender = new Sender(API_KEY);
        Message msg = new Message.Builder().addData("message", message).build();
        List<RegistrationRecord> records = ofy().load().type(RegistrationRecord.class).limit(1000).list();
        for (RegistrationRecord record : records) {
            Result result = sender.send(msg, record.getRegId(), 5);
            if (result.getMessageId() != null) {
                log.info("Message sent to " + record.getRegId());
                String canonicalRegId = result.getCanonicalRegistrationId();
                if (canonicalRegId != null) {
                    // if the regId changed, we have to update the datastore
                    log.info("Registration Id changed for " + record.getRegId() + " updating to " + canonicalRegId);
                    record.setRegId(canonicalRegId);
                    ofy().save().entity(record).now();
                }
            } else {
                String error = result.getErrorCodeName();
                if (error.equals(Constants.ERROR_NOT_REGISTERED)) {
                    log.warning("Registration Id " + record.getRegId() + " no longer registered with GCM, removing from datastore");
                    // if the device is no longer registered with Gcm, remove it from the datastore
                    ofy().delete().entity(record).now();
                } else {
                    log.warning("Error when sending message : " + error);
                }
            }
        }
    }
}
I know there are similar questions, but I am using Java. The questions I found use PHP on the backend, so they're not helpful to me:
Google Cloud Messaging: Send message to "all" users
Sending Push Notification on multiple devices
Has anyone successfully implemented App Engine + Google Cloud Messaging in Java?
In the code line below, if I replace 1000 with 15,000, will it solve my problem?
List<RegistrationRecord> records = ofy().load().type(RegistrationRecord.class).limit(1000).list();
Please help as soon as possible, and sorry for my English. If anyone needs other details, you are welcome to ask.
Thanks for your time.
A few considerations:
1) Sending notifications to a possibly huge number of users might take significant time; consider using Task Queues to queue that work to be done "offline", outside the 60-second request limit.
2) As for the GCM limit: if you need to reach all your users but GCM allows only 1,000 recipients at a time, just split them into batches of 1,000 and send each batch a message separately.
If you combine both recommendations you should have a fairly scalable process: query for all your users in one request, split that list, and queue sending the message to those users 1,000 at a time.
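The batching step is plain list partitioning. A sketch (the token list here is synthetic; each resulting batch would then be passed to the actual Sender.send(...) call or enqueued as a task):

```java
import java.util.ArrayList;
import java.util.List;

public class GcmBatcher {
    static final int BATCH_SIZE = 1000;   // GCM's per-request recipient limit

    // Split a token list into sub-lists of at most BATCH_SIZE entries.
    static List<List<String>> partition(List<String> tokens) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < tokens.size(); i += BATCH_SIZE) {
            batches.add(tokens.subList(i, Math.min(i + BATCH_SIZE, tokens.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<String> tokens = new ArrayList<>();
        for (int i = 0; i < 15000; i++) tokens.add("token-" + i);
        System.out.println(partition(tokens).size()); // 15 batches of 1000
    }
}
```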
Extending @jirungaray's answer, below is code for sending GCM messages to all registered users.
Here I assume that from Android you are registering each mobile device for GCM services and storing those device tokens in a database.
public class GCM {
    private final static Logger LOGGER = Logger.getLogger(GCM.class.getName());
    private static final String API_KEY = ConstantUtil.GCM_API_KEY;

    public static void doSendViaGcm(List<String> tokens, String message) throws IOException {
        Sender sender = new Sender(API_KEY);
        // Trim message if needed.
        if (message.length() > 1000) {
            message = message.substring(0, 1000) + "[...]";
        }
        Message msg = new Message.Builder().addData("message", message).build();
        try {
            MulticastResult result = sender.send(msg, tokens, 5);
        } catch (Exception ex) {
            LOGGER.severe("error is " + ex.getMessage());
            ex.printStackTrace();
        }
    }
}
In the above code snippet, API_KEY can be obtained from a Google console project. Here I assume that you have already created a Google console project and enabled the GCM API. You can generate the API_KEY as follows:
your_google_console_project >> Credentials >> Create New Key >> Server key >> enter the IP addresses you want to allow to access the GCM API [I used 0.0.0.0/0]
Now doSendViaGcm(List<String> tokens, String message) of the GCM class performs the task of sending messages to all registered Android mobile devices.
Here List<String> tokens is the list of all device tokens to which messages will be delivered; remember this list must not contain more than 1,000 entries, or else the HTTP call will fail.
hope this will help you
thanks

Protocol message contained an invalid tag (zero)

I'm working with .pbf files from OpenStreetMap.
I want to parse nodes, relations, and ways, but when I try to parse nodes I get this message.
The code looks like:
package myCode;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.List;
import protpbufCode.OsmPbf;
import protpbufCode.OsmPbf.Node;
import protpbufCode.OsmPbf.PrimitiveGroup;
public class ReadingPBF
{
    public static void print(PrimitiveGroup node)
    {
        for (Node m : node.getNodesList())
        {
            System.out.print("Person ID: " + m.getId() + " ");
            System.out.print(" Lat: " + m.getLat() + " ");
            System.out.print(" Long: " + m.getLon() + " ");
            System.out.println("");
        }
    }

    public static void main(String args[])
    {
        try
        {
            PrimitiveGroup newNode = PrimitiveGroup.parseFrom(new FileInputStream(new File("isle.pbf")));
            print(newNode);
        }
        catch (IOException e)
        {
            // TODO Auto-generated catch block
            e.printStackTrace();
            System.out.println(e.getMessage());
            System.out.println(e.getCause());
        }
    }
}
OsmPbf is a Java class that was generated by the protobuf compiler.
This is what gets printed:
com.google.protobuf.InvalidProtocolBufferException: Protocol message contained an invalid tag (zero).
at com.google.protobuf.InvalidProtocolBufferException.invalidTag(InvalidProtocolBufferException.java:89)
at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:158)
at protpbufCode.OsmPbf$PrimitiveGroup.<init>(OsmPbf.java:5230)
at protpbufCode.OsmPbf$PrimitiveGroup.<init>(OsmPbf.java:5219)
at protpbufCode.OsmPbf$PrimitiveGroup$1.parsePartialFrom(OsmPbf.java:5329)
at protpbufCode.OsmPbf$PrimitiveGroup$1.parsePartialFrom(OsmPbf.java:1)
at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:192)
at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:209)
at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:215)
at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
at protpbufCode.OsmPbf$PrimitiveGroup.parseFrom(OsmPbf.java:5627)
at myCode.ReadingPBF.main(ReadingPBF.java:33)
Protocol message contained an invalid tag (zero).
null
OpenStreetMap .pbf files are not simple protobufs. See the documentation here:
http://wiki.openstreetmap.org/wiki/PBF_Format
Under the section "File format", you'll see this:
The format is a repeating sequence of:
int4: length of the BlobHeader message in network byte order
serialized BlobHeader message
serialized Blob message (size is given in the header)
That is, the file starts out with a 4-byte integer before the first protobuf message. Since this integer is probably smaller than 2^24, the first byte will of course be zero, which explains the exact exception you are seeing.
You will need to read this 4-byte value manually, then make sure to read only that many bytes and parse them as a BlobHeader, and so on.
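The framing read can be sketched as below. DataInputStream.readInt() reads a 4-byte big-endian (network order) int, which matches the format description; the demonstration uses an in-memory stream so the logic is self-contained, and the BlobHeader parsing step is left as a comment since it depends on your generated protobuf classes:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class PbfFraming {
    // Read one length-prefixed message: a 4-byte big-endian length
    // followed by that many bytes of serialized protobuf.
    static byte[] readFramed(DataInputStream in) throws IOException {
        int len = in.readInt();   // network byte order, hence the leading zero byte
        byte[] buf = new byte[len];
        in.readFully(buf);
        return buf;
    }

    public static void main(String[] args) throws IOException {
        // Demonstration on an in-memory stream: length 3, then bytes {1, 2, 3}.
        byte[] framed = {0, 0, 0, 3, 1, 2, 3};
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(framed));
        byte[] msg = readFramed(in);
        System.out.println(msg.length); // 3
        // On a real file, msg would be parsed as a BlobHeader
        // (e.g. OsmPbf.BlobHeader.parseFrom(msg)), which in turn gives the
        // size of the Blob that follows; repeat until EOF.
    }
}
```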
Personally I'd recommend looking for a PBF decoder library that already handles this for you. There must be a few out there.

Checking domain name age through Java

I am making a simple phishing scanner tool for a university project. One of my detection methods involves checking whether the DNS entries within the email are valid, and I also want to check their age. This is example code of how I check whether they exist:
import javax.naming.NamingException;
import javax.naming.directory.Attribute;
import javax.naming.directory.Attributes;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;
import java.util.Hashtable;
public class DNSExample {
    static int doLookup(String hostName) throws NamingException {
        Hashtable env = new Hashtable();
        env.put("java.naming.factory.initial",
                "com.sun.jndi.dns.DnsContextFactory");
        DirContext ictx = new InitialDirContext(env);
        Attributes attrs =
                ictx.getAttributes(hostName, new String[] { "MX" });
        Attribute attr = attrs.get("MX");
        if (attr == null) return 0;
        return attr.size();
    }

    public static void main(String args[]) {
        String[] array = { "google.com", "dsad33114sssaxzx.com" };
        for (int i = 0; i < array.length; i++) {
            try {
                System.out.println(array[i] + " has " +
                        doLookup(array[i]) + " mail servers");
            }
            catch (Exception e) {
                System.out.println(array[i] + " : " + e.getMessage());
            }
        }
    }
}
How would I need to modify the above code to include a check of age for servers that exist?
I think you've chosen a problem that cannot be solved in the general case ... using current generation internet standards:
The information you need cannot be obtained from DNS itself.
In some cases information about DNS registrations can be obtained from WHOIS. However, the information returned by WHOIS servers is not standardised:
There is no standard information model.
There is no standard format.
There are no guarantees as to the accuracy of the information.
It is not even clear if "age of server" is going to be available. (For instance, the closest that APNIC's WHOIS provides to that is the last modification timestamp for the DNS record. And that is NOT a good proxy for server age.)
There is a set of RFC's that define something called CRISP, but as far as I can make out the purpose of that standard is for registrar to registrar exchange of information. (I couldn't find any public-facing services based on CRISP.)
There is also an IETF working group called WEIRDS which I think is intended to define a web-enabled replacement for WHOIS. (Don't confuse WEIRDS with the IETF WEIRD WG!) But that was formed very recently, and it is too soon to make any predictions of the outcome. (Or how long it will take for the NICs to implement any specs that come out of the WG.)
Summary: your chances of implementing something in this space that really works are currently low. Probably the best you can hope to achieve is something based on screen-scraping one or two WHOIS services.
This might change in a few years, but that is of no help for your current project.
It seems, based on your description and comments above, that you are trying to gather WHOIS information.
Download the Commons Net API from http://commons.apache.org/proper/commons-net/, change the nameToQuery below, and run it:
import org.apache.commons.net.whois.WhoisClient;

public class WhoisIt {
    public static final String WHOIS_SERVER = "whois.internic.net";
    public static final int WHOIS_PORT = 43;

    public static void main(String[] args) throws Exception {
        String nameToQuery = "avajava.com";
        WhoisClient whoisClient = new WhoisClient();
        whoisClient.connect(WHOIS_SERVER, WHOIS_PORT);
        String results = whoisClient.query(nameToQuery);
        System.out.println(results);
    }
}
good luck
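If the query succeeds, the domain's age can then be estimated by pulling the registration date out of the raw WHOIS text. A sketch (the "Creation Date:" label and its date format vary between registries, so this regex is an assumption that happens to match internic-style output, not a universal parser):

```java
import java.time.LocalDate;
import java.time.Period;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DomainAge {
    // Best-effort match for internic-style "Creation Date: YYYY-MM-DD..." lines.
    static final Pattern CREATED =
            Pattern.compile("Creation Date:\\s*(\\d{4}-\\d{2}-\\d{2})");

    static LocalDate creationDate(String whoisOutput) {
        Matcher m = CREATED.matcher(whoisOutput);
        return m.find() ? LocalDate.parse(m.group(1)) : null;
    }

    public static void main(String[] args) {
        // Hypothetical WHOIS output for demonstration.
        String sample = "Domain Name: EXAMPLE.COM\n"
                + "Creation Date: 1995-08-14T04:00:00Z\n";
        LocalDate created = creationDate(sample);
        System.out.println("Registered " + created + ", age "
                + Period.between(created, LocalDate.now()).getYears() + " years");
    }
}
```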
