I want to fetch various metrics like read/write latency, disk utilisation, etc. for each of my Cassandra nodes (without using JMX) as a JSON object. It seems to me that MetricsServlet can do exactly that. However, I'm still not able to figure out what I need to do in order to use it (metrics-servlets does not ship with Cassandra). I'd appreciate any advice or sample code for fetching a metric.
Cassandra is not a Java web server, so it doesn't support servlets. You would need to start a Java web server in the same JVM as Cassandra and load those servlets. While possible, it's probably a lot less work to just query the metrics via JMX and convert them to JSON with an external application, or to expose JMX over HTTP with something like MX4J (which is what I would recommend).
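For the "external application" route, here is a minimal sketch of a plain JMX client that reads one Cassandra metric and prints it as JSON. It assumes the default JMX port 7199 and the read-latency MBean name used by Cassandra's metrics subsystem; verify the object and attribute names against your Cassandra version.

```java
import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class CassandraMetricsToJson {
    public static void main(String[] args) throws Exception {
        // Cassandra exposes JMX on port 7199 by default
        JMXServiceURL url = new JMXServiceURL(
                "service:jmx:rmi:///jndi/rmi://localhost:7199/jmxrmi");
        JMXConnector connector = JMXConnectorFactory.connect(url);
        try {
            MBeanServerConnection mbs = connector.getMBeanServerConnection();
            // Read-latency timer published by Cassandra's metrics subsystem
            ObjectName readLatency = new ObjectName(
                    "org.apache.cassandra.metrics:type=ClientRequest,scope=Read,name=Latency");
            Object mean = mbs.getAttribute(readLatency, "Mean");
            Object p99  = mbs.getAttribute(readLatency, "99thPercentile");
            // JSON built by hand for brevity; use Jackson/Gson in a real application
            System.out.printf("{\"readLatency\":{\"mean\":%s,\"p99\":%s}}%n", mean, p99);
        } finally {
            connector.close();
        }
    }
}
```

The same loop can be extended over any MBean under org.apache.cassandra.metrics (disk usage, write latency, etc.) and served over HTTP by whatever small web framework you prefer.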
We have one Redis instance for our company and multiple teams are using it. We are getting a surge of requests and nobody seems to know which application is causing it. We have only one password that goes around the whole company, and our Redis is secured behind a VPN, so we know it's not coming from the outside.
Is there a way to know who's using Redis? Maybe we can pass in some identifier with the connection from every app so we can tell who makes the most requests, etc.
We use Spring Data Redis for our communication.
This question is fairly broad, since different strategies can be used here:
Use the Redis MONITOR command. This is basically a built-in debugging tool that streams every command executed against Redis.
Use some kind of intermediate proxy. Instead of routing all the commands directly to Redis, route everything through a proxy that does some processing, such as counting commands per calling host or per command type, depending on what you want.
This is still an infrastructure-level solution, so you won't need any changes at the application level.
Since you have Spring Boot, you can use the Micrometer metering integration. This way you could create a counter/gauge that gets updated on each request to Redis. If you also stream the metering data to a tool like Prometheus, you'll be able to build a dashboard, say in Grafana, to see the whole picture. Micrometer can integrate with other products as well; Prometheus/Grafana is only an example, and you can choose any other solution (maybe your organization already has something like that).
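For the Micrometer option, a minimal sketch might look like the following. The wrapper class, metric name, and the "app" tag value are illustrative; the assumption is that each application routes its Redis calls through something like this so the count can be broken down per application.

```java
import io.micrometer.core.instrument.Counter;
import io.micrometer.core.instrument.MeterRegistry;
import org.springframework.stereotype.Component;

// Hypothetical wrapper: assumes each app funnels its Redis calls through it
@Component
public class MeteredRedisCalls {

    private final Counter redisCommands;

    public MeteredRedisCalls(MeterRegistry registry) {
        // The "app" tag lets Prometheus/Grafana break the count down per application
        this.redisCommands = Counter.builder("redis.commands")
                .tag("app", "inventory-service") // hypothetical application name
                .register(registry);
    }

    public void recordCommand() {
        redisCommands.increment();
    }
}
```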
We are trying to get live configuration data from our Kubernetes cluster. Therefore we would like to read the ConfigMaps of each of our services.
Is there a way to extract this data with a Spring microservice which runs alongside the rest of the services?
Or are there other (better?) ways / tools to get this information?
Using the Kubernetes API you can get the ConfigMaps you need. I am not familiar with the Java client, but here it is:
https://github.com/kubernetes-client/java
You can retrieve a list of ConfigMaps and their contents using these APIs. If you're using RBAC, your application will need a cluster role and a cluster role binding that allow it to read ConfigMap resources.
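A minimal sketch with the official Java client could look like this. It assumes a recent client version where list calls use a fluent request builder; older versions take a long positional parameter list instead, and the "default" namespace here is just a placeholder.

```java
import io.kubernetes.client.openapi.ApiClient;
import io.kubernetes.client.openapi.Configuration;
import io.kubernetes.client.openapi.apis.CoreV1Api;
import io.kubernetes.client.openapi.models.V1ConfigMap;
import io.kubernetes.client.openapi.models.V1ConfigMapList;
import io.kubernetes.client.util.Config;

public class ConfigMapReader {
    public static void main(String[] args) throws Exception {
        // Inside the cluster, Config.defaultClient() picks up the pod's
        // service-account token and API server address automatically.
        ApiClient client = Config.defaultClient();
        Configuration.setDefaultApiClient(client);

        CoreV1Api api = new CoreV1Api();
        // Recent client versions use a fluent request builder; older ones
        // expect a long positional argument list (mostly nulls) instead.
        V1ConfigMapList configMaps = api.listNamespacedConfigMap("default").execute();

        for (V1ConfigMap cm : configMaps.getItems()) {
            System.out.println(cm.getMetadata().getName() + " -> " + cm.getData());
        }
    }
}
```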
To extract the information you can just query the Kubernetes API, in your case most likely via the Java Kubernetes client. The biggest issue you will probably face is ensuring you have read access for the namespace(s) the ConfigMaps live in.
The bigger question about a 'better way' is trying to understand why you want to read all of the ConfigMaps for your applications. The goal you are trying to accomplish will guide the solution.
I am using the com.sparkjava library for writing an API. I want to monitor metrics for this API such as the average, min, and max time taken to return a response, the throughput, and the count of requests sent to the API, etc.
I was looking for a suitable library which provides all this data. I want these metrics to be registered in the JVM using JMX. I know of the codahale.metrics library for registering them; apart from that, are there any other, better libraries? I don't want to write MBean objects and register them in the MBeanRegistry unless there is no other alternative. I am looking for a library which gives me the above metrics data once I run the application.
Have you tried New Relic? It's very easy to set up and gives you a lot of data out of the box :)
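If you'd rather stay with the codahale (Dropwizard) Metrics library mentioned in the question, its JmxReporter can expose timers as MBeans without any hand-written MBean classes. A rough sketch wrapping a Spark route, assuming Metrics 3.x (in 4.x the reporter moved to the metrics-jmx module) and an illustrative metric name:

```java
import com.codahale.metrics.JmxReporter; // in Metrics 4.x: com.codahale.metrics.jmx.JmxReporter
import com.codahale.metrics.MetricRegistry;
import com.codahale.metrics.Timer;

import static spark.Spark.get;

public class MeteredSparkApi {
    public static void main(String[] args) {
        MetricRegistry registry = new MetricRegistry();
        // Exposes every metric in the registry as an MBean; no hand-written MBeans required
        JmxReporter.forRegistry(registry).build().start();

        Timer requests = registry.timer("api.hello.requests");

        get("/hello", (req, res) -> {
            // The timer records min/max/mean latency plus request counts and rates
            try (Timer.Context ignored = requests.time()) {
                return "Hello World";
            }
        });
    }
}
```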
I want to create a webpage where a user sets up a profile via a form; the form data is sent to my server and creates the requisite nodes in Neo4j. I want to do this in a way that does as much as possible to prevent people from arbitrarily sending commands to my server outside of the form, such as via Chrome or any other injection method.
I expect that I will need to use the REST API to connect to Neo4j from Java. It also seems like I will need to use Jersey to allow the site to communicate with the Neo4j REST API. I am new to securing data transferred from client to server, and to validating data received by the server to ensure I am not sending commands to Neo4j that shouldn't be sent and which could cause all sorts of damage to my members. I am also new to graph databases and Neo4j in general.
Can someone give me a step by step example of how to basically accomplish this task? I am looking to find out what tools I need to install, and what types of commands I should include both on the client and on the server side to ensure that I am only passing correct data to neo4j when creating/deleting/modifying nodes and relationships.
Thanks for any help that anyone is willing to provide - getting past this hump will allow me to move so much more quickly with the rest of my development.
I guess the easiest way to prevent others from accessing your Neo4j instance is to use the Neo4j authentication extension; just follow the docs on its start page. Additionally, you might set up some IP address filtering, e.g. with iptables on Linux, to restrict network access to your Java client machine.
With the authentication extension installed, you need to supply a username/password with each request. The easiest way to communicate with Neo4j from a Java client these days is the Neo4j JDBC driver.
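With the JDBC driver, the key protection against injection is to bind form input as parameters rather than concatenating it into the Cypher string. A minimal sketch, with placeholder URL, credentials, and values (the exact JDBC URL scheme and parameter syntax depend on the driver version):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class CreateProfile {
    public static void main(String[] args) throws Exception {
        // URL and credentials are placeholders; the JDBC URL scheme
        // (http vs bolt) varies with the driver version you use.
        try (Connection con = DriverManager.getConnection(
                "jdbc:neo4j:bolt://localhost:7687", "neo4j", "secret")) {
            // Parameterized Cypher: form input is bound as a parameter,
            // never concatenated into the query string.
            String cypher = "CREATE (p:Profile {name: ?, email: ?})";
            try (PreparedStatement stmt = con.prepareStatement(cypher)) {
                stmt.setString(1, "Alice");             // value from the validated form
                stmt.setString(2, "alice@example.com"); // value from the validated form
                stmt.executeUpdate();
            }
        }
    }
}
```

Server-side validation of the form fields still belongs in front of this, but parameter binding ensures user input can never change the shape of the query itself.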
I know Thrift has its own thread-pool server, but I'm not sure it will be able to handle heavy load. Would you recommend putting it behind Tomcat?
In addition, if you wanted to use the socket transport implementation, could you still use Tomcat, or would you need some other solution?
I would really love to hear about your experience deploying Thrift Java services.
Consider putting it inside an application server (Tomcat, Jetty, etc.) and accessing it over HTTP using TServlet (see the sketch after the list below). From the server you get:
Threads management
Connections management
You get to use standard Filters to maybe rate limit the requests, or manage access based on cookies
Probably readily available access logs
You can easily add JSON Protocol for debugging
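As a rough illustration of the TServlet approach, a servlet for a hypothetical Thrift service deployed in Tomcat/Jetty might look like this. CalculatorService and CalculatorHandler are placeholder names for your Thrift-generated service class and your implementation of its Iface.

```java
import javax.servlet.annotation.WebServlet;

import org.apache.thrift.protocol.TJSONProtocol;
import org.apache.thrift.server.TServlet;

// CalculatorService / CalculatorHandler are hypothetical: the generated
// service class and your implementation of CalculatorService.Iface.
@WebServlet("/thrift/calculator")
public class CalculatorServlet extends TServlet {
    public CalculatorServlet() {
        // TJSONProtocol keeps the wire format human-readable for debugging;
        // swap in TBinaryProtocol.Factory() for production traffic.
        super(new CalculatorService.Processor<>(new CalculatorHandler()),
              new TJSONProtocol.Factory());
    }
}
```

Clients then speak Thrift-over-HTTP (THttpClient on the Java side), and the container handles threading, connections, filters, and access logging as described above.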