Java - how to analyze a function code

We are working with the MVC design pattern, where all the data is stored in a map.
I want to iterate over all the classes in the system and, for each method, check what it puts on the map and what it gets from the map.
For example, for the following code:
private void myFunc()
{
Object obj = model.get("mykey");
Object obj2 = model.get("mykey2");
.....
model.put("mykey3", "aaa");
}
I want to know that in this function we have two gets (mykey and mykey2) and one put (mykey3).
How can I do this in code?
Thanks.

You tagged this with "reflection", but that will not work. Reflection only allows you to inspect "signatures": you can use it to identify the methods of a class and the arguments of those methods.
It absolutely doesn't help you identify what each method is doing.
In order to find that out, you would need to parse either the Java source code or the compiled byte code. As in: write code that reads that content and understands "enough" of it to find such places. That is a very challenging effort. And of course, it is very easy to bypass all such "scanner" code by doing things such as:
List<String> keysToUpdate = Arrays.asList("key1", "key2");
for (String key : keysToUpdate) {
... // does something with each key
}
Bang. How would you ever write code that reliably finds the keys for that?
And once you have found that code, imagine that the list isn't instantiated there, but far away, and passed in as an argument. When you have figured out how to solve that, consider code that uses reflection to acquire the model object and calls methods on it. See? For any "scanner" that you write, there will be ways to make it fail.
Thus the real answer is that you are already going down the wrong rabbit hole:
You should never have written:
Object obj = model.get("mykey");
but something like
Object obj = model.get(SOME_CONSTANT_FOR_KEY_X);
Meaning: there is no good way to control such things after the fact. The best you can do is make sure that all keys are constants coming from a central place, because then you can at least go through that list of constants and have your IDE tell you about the usages of each key.
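For illustration, such a central place could be as simple as a constants class (the name ModelKeys is made up here):

// Hypothetical central registry of model keys; the names are illustrative only.
public final class ModelKeys {
    public static final String MY_KEY = "mykey";
    public static final String MY_KEY2 = "mykey2";
    public static final String MY_KEY3 = "mykey3";

    private ModelKeys() {
    }
}

With model.get(ModelKeys.MY_KEY) instead of a string literal, "Find Usages" on the constant immediately lists every read and write of that key.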

NOTES
I assumed that your situation is complicated enough that a simple or advanced text search over the codebase doesn't help you.
This is a hack, not a generic solution; it is intended only for testing and diagnosis.
To use this hack, you must be able to change your code and replace the actual model with the proxy instance while you're testing/diagnosing. If you can't do this, then you have to use an even more advanced hack, i.e. byte-code engineering with BCEL, ASM, etc.
Dynamic proxies have a performance cost, so they are not an ideal choice for production.
Using a map to store the model is not a good idea in the first place. A well-defined type system, i.e. dedicated Java classes, should be used instead.
A general design pattern for a problem like this is the Proxy pattern: an intermediate object between your actual model and the caller that can intercept the calls, collect statistics, or even interfere with the original call, and that ultimately forwards everything to the actual model.
An obvious proxy is to simply wrap the actual model in another map, e.g.
public class MapProxy<K, V> implements Map<K, V> {
public MapProxy(final Map<K, V> actual) {
}
// implement ALL methods and redirect them to the actual model
}
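If you don't want to hand-write every Map method, one option is to extend Guava's ForwardingMap, which only requires a delegate() method. This is a sketch under the assumption that Guava is available; the println calls are just placeholders for whatever statistics you collect:

// import com.google.common.collect.ForwardingMap;
public class MapProxy<K, V> extends ForwardingMap<K, V> {
    private final Map<K, V> actual;

    public MapProxy(final Map<K, V> actual) {
        this.actual = actual;
    }

    @Override
    protected Map<K, V> delegate() {
        return actual;
    }

    @Override
    public V get(Object key) {
        System.out.println("get(" + key + ")"); // record the read here
        return super.get(key);
    }

    @Override
    public V put(K key, V value) {
        System.out.println("put(" + key + ")"); // record the write here
        return super.put(key, value);
    }
}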
Now, reflection doesn't help you with this directly, but it can help you implement a proxy faster by using dynamic proxies (see Dynamic Proxy Classes), e.g.
@SuppressWarnings("unchecked")
private Map<String, Object> proxy(final Map<String, Object> model) {
final InvocationHandler handler = new InvocationHandler() {
public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
// Collect usage stats or intervene
return method.invoke(model, args);
}
};
return (Map<String, Object>) Proxy.newProxyInstance(Map.class.getClassLoader(),
new Class<?>[] { Map.class }, handler);
}
NOTE: In either case you need to be able to replace the actual model with the proxied model, at least for the duration of your test.
With another trick you can find out who called which method of your model: simply access Thread.currentThread().getStackTrace() and retrieve the appropriate element.
Now, putting all the pieces together:
InvocationLog.java
public final class InvocationLog {
private Method method;
private Object[] arguments;
private StackTraceElement caller;
public InvocationLog(Method method, Object[] arguments, StackTraceElement caller) {
this.method = method;
this.arguments = arguments;
this.caller = caller;
}
public Method getMethod() { return this.method; }
public Object[] getArguments() { return this.arguments; }
public StackTraceElement getCaller() { return this.caller; }
@Override
public String toString() {
return String.format("%s (%s): %s",
method == null ? "<init>" : method.getName(),
arguments == null ? "" : Arrays.toString(arguments),
caller == null ? "" : caller.toString());
}
}
ModelWatch.java
public final class ModelWatch {
private final Map<String, Object> modelProxy;
private final List<InvocationLog> logs = new ArrayList<>();
public ModelWatch(final Map<String, Object> model) {
modelProxy = proxy(model);
}
@SuppressWarnings("unchecked")
private Map<String, Object> proxy(final Map<String, Object> model) {
final InvocationHandler handler = new InvocationHandler() {
public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
log(method, args, Thread.currentThread().getStackTrace());
return method.invoke(model, args);
}
};
return (Map<String, Object>) Proxy.newProxyInstance(Map.class.getClassLoader(),
new Class<?>[] { Map.class }, handler);
}
private void log(Method method, Object[] arguments, StackTraceElement[] stack) {
logs.add(new InvocationLog(method, arguments, stack[3]));
// 0: Thread.getStackTrace
// 1: InvocationHandler.invoke
// 2: <Proxy>
// 3: <Caller>
}
public Map<String, Object> getModelProxy() { return modelProxy; }
public List<InvocationLog> getLogs() { return logs; }
}
To put it in use:
private Map<String, Object> actualModel = new HashMap<String, Object>();
private ModelWatch modelWatch = new ModelWatch(actualModel);
private Map<String, Object> model = modelWatch.getModelProxy();
// Calls to model ...
modelWatch.getLogs() // Retrieve model activity

Related

avoiding if conditions for similar type of checks

Is there any way to avoid these if conditions? There may be different types of objects coming in.
if ("OpenOrder".equals(order.getClass().getSimpleName())) {
return OpenOrderBuilder.createOFSMessage((OpenOrder) order); //Returns String
}
if ("ExecutionOrder".equals(order.getClass().getSimpleName())) {
return ExecutionOrderBuilder.createOFSMessage((ExecutionOrder) order); //Returns String
}
You can use a Router pattern to do this. Simply add the computations to a Map like this:
Map<String, Function<Object, String>> router = new HashMap<>();
router.put("OpenOrder", value -> OpenOrderBuilder.createOFSMessage((OpenOrder) value));
router.put("ExecutionOrder", value -> ExecutionOrderBuilder.createOFSMessage((ExecutionOrder) value));
And you can route the order using the String key. Here is an "OpenOrder" example:
String result = router.get("OpenOrder").apply(order);
There are many ways to do it. Which one to choose, depends on your needs and in this case in particular on how many different types of objects you will have.
I suggest looking at concepts like interfaces and inheritance and on specific design patterns.
One approach I tend to like, although still not perfect, works as follows:
interface Order {
}
interface OrderBuilder<T extends Order> {
Class<T> forType();
Object createOFSMessage(T order);
}
class OpenOrderBuilder implements OrderBuilder<OpenOrder> {
@Override
public Class<OpenOrder> forType() {
return OpenOrder.class;
}
...
}
class ExecutionOrderBuilder implements OrderBuilder<ExecutionOrder> {
@Override
public Class<ExecutionOrder> forType() {
return ExecutionOrder.class;
}
...
}
class MyProcessor {
Map<Class, OrderBuilder> obs;
public void initialize() {
List<OrderBuilder> builders = new ArrayList<>();
builders.add(new OpenOrderBuilder());
builders.add(new ExecutionOrderBuilder());
obs = new HashMap<Class, OrderBuilder>();
for(OrderBuilder b : builders) {
obs.put(b.forType(), b);
}
}
public Object createOFSMessage(Order order) {
return obs.get(order.getClass()).createOFSMessage(order);
}
}
In the above example, adding a new implementation just consists of adding an entry to the builders collection. While in the example above it's done manually, normally this is done through dependency injection and frameworks like Spring (in which case, the initialize method may turn into a constructor with builders as an @Autowired argument).
There are of course other ways, some simpler and some more complicated. The best one really depends on what you have to do, and on one key rule: the less code you have, the better.
First, one should not forget the switch-on-string option:
switch (order.getClass().getSimpleName()) {
case "OpenOrder":
return OpenOrderBuilder.createOFSMessage((OpenOrder) order); //Returns String
case "ExecutionOrder":
return ExecutionOrderBuilder.createOFSMessage((ExecutionOrder) order); //Returns String
}
The code, however, shows inheritance being used in combination with static child-class factories. Evidently a createOFSMessage method is not desired in the Order base class.
In that case, use a non-static "builder" - a factory - and follow the strategy pattern.
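A rough sketch of what such a strategy-based factory could look like. All names here are made up, the existing static builders are just reused internally, and a common Order supertype is assumed:

// One strategy per order type; the factory picks the first one that applies.
interface OfsMessageStrategy {
    boolean supports(Order order);
    String createOFSMessage(Order order);
}

class OpenOrderStrategy implements OfsMessageStrategy {
    public boolean supports(Order order) {
        return order instanceof OpenOrder;
    }
    public String createOFSMessage(Order order) {
        return OpenOrderBuilder.createOFSMessage((OpenOrder) order);
    }
}

class OfsMessageFactory {
    private final List<OfsMessageStrategy> strategies =
            Arrays.asList(new OpenOrderStrategy() /*, new ExecutionOrderStrategy(), ... */);

    String createOFSMessage(Order order) {
        for (OfsMessageStrategy strategy : strategies) {
            if (strategy.supports(order)) {
                return strategy.createOFSMessage(order);
            }
        }
        throw new IllegalArgumentException("Unsupported order type: " + order.getClass());
    }
}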
If you already know the type when calling the method, this code can help you:
private String createOFSMessage(Class<?> classOrder, Order order) {
if ("OpenOrder".equals(classOrder.getSimpleName())) {
return OpenOrderBuilder.createOFSMessage((OpenOrder) order);
} else if ("ExecutionOrder".equals(classOrder.getSimpleName())) {
return ExecutionOrderBuilder.createOFSMessage((ExecutionOrder) order);
}
return null;
}

Using decorator pattern without adding "different" behaviour

I have a facade interface where users can ask for information about, let's say, Engineers. That information should be transferred as JSON, for which we made a DTO. Now keep in mind that I have multiple datasources that can each provide an item to this list of DTOs.
So I believe right now that I can use the Decorator pattern by adding a handler for each datasource to myEngineerListDTO of type List<EngineerDTO>. By that I mean all the datasources share the same DTO.
The picture below shows that VerticalScrollbar and HorizontalScrollBar have different behaviours added, which means they add behaviour to the WindowDecorator interface.
My question: does my situation fit the decorator pattern? Do I specifically need to add a behaviour to use this pattern? And is there another pattern that does fit my situation? I have already considered the Chain of Responsibility pattern, but because I don't need to terminate my chain at any given moment, I thought maybe the Decorator pattern would be better.
Edit:
My end result should be a List<EngineerDTO> from all datasources. The reason I want to add this pattern is so that I can easily add another datasource behind the rest of the "pipeline". This datasource, just like the others, will have an addEngineersDTOToList method.
To further illustrate how you can use the Chain of Responsibility pattern, I put together a small example. I believe you should be able to adapt this solution to suit the needs of your real-world problem.
Problem Space
We have an unknown set of user requests which contain the names of properties to be retrieved. There are multiple datasources which each have varying numbers of properties. We want to search through all possible data sources until all of the properties from the request have been discovered. Some data types and data sources might look like below (note I am using Lombok for brevity):
@lombok.Data
class FooBarData {
private final String foo;
private final String bar;
}
@lombok.Data
class FizzBuzzData {
private final String fizz;
private final String buzz;
}
class FooBarService {
public FooBarData invoke() {
System.out.println("This is an expensive FooBar call");
return new FooBarData("FOO", "BAR");
}
}
class FizzBuzzService {
public FizzBuzzData invoke() {
System.out.println("This is an expensive FizzBuzz call");
return new FizzBuzzData("FIZZ", "BUZZ");
}
}
Our end user might require multiple ways to resolve the data. The following could be a valid user input and expected response:
// Input
"foobar", "foo", "fizz"
// Output
{
"foobar" : {
"foo" : "FOO",
"bar" : "BAR"
},
"foo" : "FOO",
"fizz" : "FIZZ"
}
A basic interface and a simple concrete implementation for our property resolver might look like this:
interface PropertyResolver {
Map<String, Object> resolve(List<String> properties);
}
class UnknownResolver implements PropertyResolver {
@Override
public Map<String, Object> resolve(List<String> properties) {
Map<String, Object> result = new HashMap<>();
for (String property : properties) {
result.put(property, "Unknown");
}
return result;
}
}
Solution Space
Rather than using a normal "Decorator pattern", a better solution may be a "Chain-of-responsibility pattern". This pattern is similar to the decorator pattern; however, each link in the chain is allowed to either work on the item, ignore the item, or end the execution. This is helpful for deciding if a call needs to be made, or for terminating the chain if the work is complete for the request. Another difference from the decorator pattern is that resolve will not be overridden by each of the concrete classes; our abstract class can call out to the subclass when required using abstract methods.
Back to the problem at hand... For each resolver we need two components: a way to fetch data from our remote service, and a way to extract all the required properties from the data retrieved. For fetching the data we can provide an abstract method. For extracting a property from the fetched data we can make a small interface and maintain a map of these extractors, seeing as multiple properties can be pulled from a single piece of data:
interface PropertyExtractor<Data> {
Object extract(Data data);
}
abstract class PropertyResolverChain<Data> implements PropertyResolver {
private final Map<String, PropertyExtractor<Data>> extractors = new HashMap<>();
private final PropertyResolver successor;
protected PropertyResolverChain(PropertyResolver successor) {
this.successor = successor;
}
protected abstract Data getData();
protected final void setBinding(String property, PropertyExtractor<Data> extractor) {
extractors.put(property, extractor);
}
@Override
public Map<String, Object> resolve(List<String> properties) {
...
}
}
The basic idea for the resolve method is to first evaluate which properties can be fulfilled by this PropertyResolver instance. If there are eligible properties then we fetch the data using getData. For each eligible property we extract the property value and add it to a result map. For each property which cannot be resolved, the successor is asked to resolve it. If all properties are resolved, the chain of execution ends.
@Override
public Map<String, Object> resolve(List<String> properties) {
Map<String, Object> result = new HashMap<>();
List<String> eligibleProperties = new ArrayList<>(properties);
eligibleProperties.retainAll(extractors.keySet());
if (!eligibleProperties.isEmpty()) {
Data data = getData();
for (String property : eligibleProperties) {
result.put(property, extractors.get(property).extract(data));
}
}
List<String> remainingProperties = new ArrayList<>(properties);
remainingProperties.removeAll(eligibleProperties);
if (!remainingProperties.isEmpty()) {
result.putAll(successor.resolve(remainingProperties));
}
return result;
}
Implementing Resolvers
When we go to implement a concrete class for PropertyResolverChain we will need to implement the getData method and also bind PropertyExtractor instances. These bindings can act as an adapter for the data returned by each service. This data can follow the same structure as the data returned by the service, or have a custom schema. Using the FooBarService from earlier as an example, our class could be implemented as below (note that we can have many bindings which result in the same data being returned).
class FooBarResolver extends PropertyResolverChain<FooBarData> {
private final FooBarService remoteService;
FooBarResolver(PropertyResolver successor, FooBarService remoteService) {
super(successor);
this.remoteService = remoteService;
// return the whole object
setBinding("foobar", data -> data);
// accept different spellings
setBinding("foo", data -> data.getFoo());
setBinding("bar", data -> data.getBar());
setBinding("FOO", data -> data.getFoo());
setBinding("__bar", data -> data.getBar());
// create new properties all together!!
setBinding("barfoo", data -> data.getBar() + data.getFoo());
}
@Override
protected FooBarData getData() {
return remoteService.invoke();
}
}
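The FizzBuzzResolver used in the example below is not shown above; it would be implemented analogously, roughly like this:

class FizzBuzzResolver extends PropertyResolverChain<FizzBuzzData> {
    private final FizzBuzzService remoteService;

    FizzBuzzResolver(PropertyResolver successor, FizzBuzzService remoteService) {
        super(successor);
        this.remoteService = remoteService;
        setBinding("fizz", data -> data.getFizz());
        setBinding("buzz", data -> data.getBuzz());
    }

    @Override
    protected FizzBuzzData getData() {
        return remoteService.invoke();
    }
}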
Example Usage
Putting it all together, we can invoke the resolver chain as shown below. We can observe that the expensive getData call is performed at most once per resolver, and only if one of the requested properties is bound to that resolver, and that the user gets only the exact fields they require:
PropertyResolver resolver =
new FizzBuzzResolver(
new FooBarResolver(
new UnknownResolver(),
new FooBarService()),
new FizzBuzzService());
Map<String, Object> result = resolver.resolve(Arrays.asList(
"foobar", "foo", "__bar", "barfoo", "invalid", "fizz"));
ObjectMapper mapper = new ObjectMapper();
mapper.enable(SerializationFeature.INDENT_OUTPUT);
System.out.println(mapper
.writerWithDefaultPrettyPrinter()
.writeValueAsString(result));
Output
This is an expensive FizzBuzz call
This is an expensive FooBar call
{
"foobar" : {
"foo" : "FOO",
"bar" : "BAR"
},
"__bar" : "BAR",
"barfoo" : "BARFOO",
"foo" : "FOO",
"invalid" : "Unknown",
"fizz" : "FIZZ"
}
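Mapping this back to the original List<EngineerDTO> question, a chain of datasources could look roughly like the sketch below. EngineerDataSource and the concrete source names are hypothetical; only EngineerDTO and addEngineersDTOToList come from the question:

// Each datasource adds its own engineers and then delegates to its successor.
abstract class EngineerDataSource {
    private final EngineerDataSource successor;

    protected EngineerDataSource(EngineerDataSource successor) {
        this.successor = successor;
    }

    public final List<EngineerDTO> collect() {
        List<EngineerDTO> result = new ArrayList<>();
        addEngineersDTOToList(result);
        if (successor != null) {
            result.addAll(successor.collect());
        }
        return result;
    }

    protected abstract void addEngineersDTOToList(List<EngineerDTO> engineers);
}

// Usage (hypothetical concrete sources):
// List<EngineerDTO> all = new DatabaseEngineerSource(new RestEngineerSource(null)).collect();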

Using Jackson to implement toString() without annotations

I want to use Jackson to implement toString() to return the JSON representation of an object, but I do not want to use any Jackson annotation in my code.
I tried an implementation along the lines of:
public String toString()
{
Map<String,Object> ordered = ImmutableMap.<String, Object>builder().
put("createdAt", createdAt.toString()).
put("address", address.toString()).
build();
ObjectMapper om = new ObjectMapper();
om.enable(SerializationFeature.INDENT_OUTPUT);
try
{
return om.writeValueAsString(ordered);
}
catch (JsonProcessingException e)
{
// Unexpected
throw new AssertionError(e);
}
}
This works well for simple fields but if "address" has its own fields then instead of getting this:
{
"address" : {
"value" : "AZ4RPBb1kSkH4RNewi4NXNkBu7BX9DmecJ",
"tag" : null
}
}
I get this output instead:
{
"address" : "{\n\"value\" : \"AZ4RPBb1kSkH4RNewi4NXNkBu7BX9DmecJ\",\n \"tag\" : null"
}
In other words, the address value is being treated like a String as opposed to a JsonNode.
To clarify:
On the one hand, I want to control how simple class fields are converted to String. I don't want to use Jackson's built-in converter.
On the other hand, for complex fields, returning a String value to Jackson leads to the wrong behavior.
I believe that I could solve this problem by adding a public toJson() method to all my classes. That method would return a Map<String, JsonNode>, where the value is a string node for simple fields and the output of toJson() for complex fields. Unfortunately, this would pollute my public API with implementation details.
How can I achieve the desired behavior without polluting the class's public API?
UPDATE: I just saw an interesting answer at https://stackoverflow.com/a/9599585/14731 ... Perhaps I could convert the String value of complex fields back to JsonNode before passing them on to Jackson.
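That idea could look roughly like this (a sketch, assuming address.toString() already returns valid JSON as in the snippets above):

@Override
public String toString() {
    ObjectMapper om = new ObjectMapper();
    om.enable(SerializationFeature.INDENT_OUTPUT);
    try {
        Map<String, Object> ordered = ImmutableMap.<String, Object>builder().
            put("createdAt", createdAt.toString()).
            // re-parse the nested JSON string into a JsonNode so it is embedded
            // as an object instead of an escaped string
            put("address", om.readTree(address.toString())).
            build();
        return om.writeValueAsString(ordered);
    } catch (IOException e) {
        // Unexpected
        throw new AssertionError(e);
    }
}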
I think you should implement two methods in each class: one to dump the data as a raw structure, and a second to build JSON out of that raw structure. You need to separate these, otherwise you will nest the escaping deeper and deeper every time you encapsulate nested toString() calls.
An example:
class Address {
private BigDecimal yourField;
/* …cut… */
public Map<String, Object> toMap() {
Map<String, Object> raw = new HashMap<>();
raw.put("yourField", this.yourField.toPlainString());
/* more fields */
return raw;
}
@Override
public String toString() {
// add JSON processing exception handling, dropped for readability
return new ObjectMapper().writeValueAsString(this.toMap());
}
}
class Employee {
private Address address;
/* …cut… */
public Map<String, Object> toMap() {
Map<String, Object> raw = new HashMap<>();
raw.put("address", this.address.toMap());
/* more fields */
return raw;
}
@Override
public String toString() {
// add JSON processing exception handling, dropped for readability
return new ObjectMapper().writeValueAsString(this.toMap());
}
}
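As a small refinement (not part of the original answer), the ObjectMapper can be shared rather than created on every toString() call; a configured ObjectMapper is thread-safe for serialization:

// Shared, reusable mapper instead of new ObjectMapper() per call.
private static final ObjectMapper MAPPER = new ObjectMapper();

@Override
public String toString() {
    try {
        return MAPPER.writeValueAsString(this.toMap());
    } catch (JsonProcessingException e) {
        throw new AssertionError(e);
    }
}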

How to write junit for this code?

How do I call the populateMapWithFormattedDates method from JUnit, and how do I write a JUnit test for it? I don't know how to write JUnit tests for nested method calls, so kindly help.
protected Map<String, String> populateDispatch(final RequestDispatchData requestDispatchData)
{
final Map<String, String> map = getDispatchFieldMapper().populateMapper(requestDispatchData);
populateMapWithFormattedDates(requestDispatchData, map);
}
private void populateMapWithFormattedDates(final RequestDispatchData requestDispatchData, final Map<String, String> map)
{
String dateFormatted = map.get("ticket_date");
Date date = null;
try
{
date = new SimpleDateFormat("MM/dd/yy").parse(dateFormatted);
}
catch (ParseException parseException)
{
customLogger.logMessage(diagnosticMethodSignature, DiagnosticType.EXCEPTION,
"Exception in parsing start date of ticket " + parseException);
}
map.put("startDateDDMMYY", DateEnum.DDMMYY.getFormattor().format(date));
map.put("startDateDDMMMYY", DateEnum.DDMMMYY.getFormattor().format(date));
map.put("startDateDMY", DateEnum.DMY.getFormattor().format(date));
map.put("startDateYYMMDD", DateEnum.YYMMDD.getFormattor().format(date));
}
Simple: you don't test private methods directly.
Instead, you focus on the "public contract" of those methods that get invoked "from the outside". In your case, that would be:
Map<String, String> populateDispatch(...
Thus you want to write tests like:
@Test
public void populateDispatchForValidDate() {
RequestDispatchData request = ...
Map<String, String> actualOutput = underTest.populateDispatch(request);
assertThat(actualOutput.size(), is(5));
}
The above is just meant as an example. What it does:
create a "request" object. This could be a mock or a real object - it depends on what exactly your various methods do with this object, and on how easy it is to create a "real" RequestDispatchData object with "test data"
it invokes that method under test
it asserts one/several properties of the result coming back
Looking at your production code, it is doing way too many things within that single method. You might want to read about "clean code" and improve it. That would probably lead to the creation of some helper classes, which would then be easier to test.
There is no such thing as a nested method in Java; what you have is a nested method call. Also, you cannot call the private methods of a class through its object, so testing them individually by calling them directly is not possible.
You can, however, have a public or protected method that performs the call, somewhat like a getter.
I believe your code is something like this:
protected Map<String, String> populateDispatch(final RequestDispatchData requestDispatchData)
{
final Map<String, String> map = getDispatchFieldMapper().populateMapper(requestDispatchData);
populateMapWithFormattedDates(requestDispatchData, map);
return map;
}
Note that you have missed the return statement, and that the map is updated under certain conditions by:
private void populateMapWithFormattedDates(final RequestDispatchData requestDispatchData, final Map<String, String> map)
{
// Map manipulation here
}
So if you have minimal dependency on getDispatchFieldMapper().populateMapper(), you can directly invoke populateDispatch() from your test code; otherwise you may have to find a way to inject a custom implementation of DispatchFieldMapper to prepare the map for testing your target method.
Injection of the DispatchFieldMapper can be done by overriding getDispatchFieldMapper() or by adding a setDispatchFieldMapper() to your class.
While preparing your custom DispatchFieldMapper, make sure its populateMapper() returns a map with all the data required for your test.
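A minimal sketch of the overriding approach might look like this; the enclosing class name, the test value, and the exact shape of DispatchFieldMapper are assumptions, since they are not shown in the question:

// Test-only subclass that swaps in a stubbed field mapper.
class TestableDispatcher extends DispatcherUnderTest { // stands in for your real class
    @Override
    protected DispatchFieldMapper getDispatchFieldMapper() {
        return new DispatchFieldMapper() {
            @Override
            public Map<String, String> populateMapper(RequestDispatchData requestDispatchData) {
                Map<String, String> map = new HashMap<>();
                map.put("ticket_date", "01/15/17"); // the value the formatted dates are derived from
                return map;
            }
        };
    }
}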
It is not a good idea to call an inaccessible method directly from the test class.
Secondly, an inaccessible method is always called from some accessible method or scope; otherwise that code is dead code and should simply be removed.
Because the method is private, if it is in use it is called somewhere from the code of the current class. In your code it is called from populateDispatch, so the proper way to write test cases for the populateMapWithFormattedDates method is to cover all the scenarios for the populateDispatch method; and since populateDispatch is also used from a subclass of the current class, you can call it from there as well.
But you can call a private method in JUnit, for example with JMockit's Deencapsulation:
Deencapsulation.invoke(<object of the class containing the method>, "populateMapWithFormattedDates", <RequestDispatchData instance>, <Map<String, String> instance>);
Again, this is a way to call a private method, but you should not use it...
You should decouple the populateMapWithFormattedDates method like this:
// I created a utility class, but that is just a suggestion.
// I'm using a utility class because you don't use requestDispatchData for
// anything. But if you do, maybe it's a good idea to implement this code
// on the RequestDispatchData class.
class DispatchMapUtils {
// Note that I took of the requestDispatchData
public static Map<String, String> populateMapWithFormattedDates(final Map<String, String> map) throws ParseException {
// Your code without try-catch.
// Throw the exception to the caller of this method
// and try-catch there to use the customLogger
}
}
With this code, your test would be something like this:
@Test
public void shouldFormatTicketDateInVariousFormat() throws ParseException {
Map<String, String> map;
// Instantiate and put some initial data
map = new HashMap<>();
map.put("ticket_date", "01/15/17"); // any valid MM/dd/yy value
// Call the method!
DispatchMapUtils.populateMapWithFormattedDates(map);
// Do the assertions!
Assert.assertTrue(map.get("startDateDDMMYY").equals(...));
}
@Test
public void shouldThrowExceptionWhenTicketDateIsInvalid() {
// More testing code
}

How to populate map of string and another map in a thread safe way?

I am working on measuring my application metrics using the class below, in which I increment and decrement metrics.
public class AppMetrics {
private final AtomicLongMap<String> metricCounter = AtomicLongMap.create();
private static class Holder {
private static final AppMetrics INSTANCE = new AppMetrics();
}
public static AppMetrics getInstance() {
return Holder.INSTANCE;
}
private AppMetrics() {}
public void increment(String name) {
metricCounter.getAndIncrement(name);
}
public AtomicLongMap<String> getMetricCounter() {
return metricCounter;
}
}
I am calling the increment method of the AppMetrics class from multithreaded code to increment metrics by passing the metric name.
Problem Statement:
Now I want to have a metricCounter for each clientId, which is a String. That means we can get the same clientId multiple times, and sometimes it will be a new clientId, so somehow I need to extract the metricCounter map for that clientId and increment metrics on that particular map (which is the part I am not sure how to do).
What is the right way to do that, keeping in mind it has to be thread-safe and perform atomic operations? I was thinking of making a map like this instead:
private final Map<String, AtomicLongMap<String>> clientIdMetricCounterHolder = Maps.newConcurrentMap();
Is this the right way? If yes, then how can I populate this map, with the clientId as its key and the per-metric counter map as its value?
I am on Java 7.
If you use a map then you'll need to synchronize on the creation of new AtomicLongMap instances. I would recommend using a Guava LoadingCache instead. You might not end up using any of the actual "caching" features, but the "loading" feature is extremely helpful as it will synchronize the creation of AtomicLongMap instances for you, e.g.:
LoadingCache<String, AtomicLongMap<String>> clientIdMetricCounterCache =
CacheBuilder.newBuilder().build(new CacheLoader<String, AtomicLongMap<String>>() {
@Override
public AtomicLongMap<String> load(String key) throws Exception {
return AtomicLongMap.create();
}
});
Now you can safely start updating metric counts for any client without worrying about whether the client is new or not, e.g.
clientIdMetricCounterCache.get(clientId).incrementAndGet(metricName);
A Map<String, Map<String, T>> is just a Map<Pair<String, String>, T> in disguise. Create a MultiKey class:
class MultiKey {
public String clientId;
public String name;
// be sure to add hashCode and equals
}
Then just use an AtomicLongMap<MultiKey>.
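A sketch of what that could look like, including the equals/hashCode mentioned in the comment above:

final class MultiKey {
    public final String clientId;
    public final String name;

    MultiKey(String clientId, String name) {
        this.clientId = clientId;
        this.name = name;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof MultiKey)) return false;
        MultiKey other = (MultiKey) o;
        return clientId.equals(other.clientId) && name.equals(other.name);
    }

    @Override
    public int hashCode() {
        return 31 * clientId.hashCode() + name.hashCode();
    }
}

// Usage: one shared AtomicLongMap keyed by (clientId, metric name)
AtomicLongMap<MultiKey> metrics = AtomicLongMap.create();
metrics.incrementAndGet(new MultiKey("client-42", "requests"));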
Edited:
Provided the set of metrics is well defined, it wouldn't be too hard to use this data structure to view metrics for one client:
Set<String> possibleMetrics = // all the possible values for "name"
Map<String, Long> getMetricsForClient(String client) {
return Maps.asMap(possibleMetrics, m -> metrics.get(new MultiKey(client, m)));
}
The returned map will be a live unmodifiable view. It might be a bit more verbose if you're using an older Java version, but it's still possible.
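Since the question mentions Java 7, the same live view can be built without lambdas using Guava's Function (a sketch under that assumption):

Map<String, Long> getMetricsForClient(final String client) {
    return Maps.asMap(possibleMetrics, new com.google.common.base.Function<String, Long>() {
        @Override
        public Long apply(String metricName) {
            return metrics.get(new MultiKey(client, metricName));
        }
    });
}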
