I've switched (or am in the process of switching) to using Bazel, though I'm doing so on Windows.
I'm interested in calling into my Go code from Java, so I started with this tutorial.
I was able to make that work using the same code as in their GitHub example, and everything works fine. I then tried adapting that to my Bazel build. If I take the awesome.so file generated by go build -o awesome.so -buildmode=c-shared awesome.go and include it as a resource in my java_library, I can make everything work.
The relevant files are shown below.
Ideally, however, I'd like to have everything generated through Bazel, but despite all my attempts so far, my go_binary rule always outputs awesome.a (and awesome.x). If I switch to using //go:awesome as the resource from java:client_lib, I can see the awesome.a output as a resource, which suggests that getting go_binary to output awesome.so is the last piece of the puzzle, but the correct combination of flags has eluded me so far.
Basically, I just want my go_binary rule to have the same behavior as running go build -o awesome.so --buildmode=c-shared awesome.go.
In theory I'm OK with needing another rule to bridge the gap, but since I'm on Windows and Bash has been hit or miss so far, using a genrule as the intermediate doesn't currently look promising.
Please advise, and thanks!
WORKSPACE
...
# bazelbuild/rules_go for golang support.
http_archive(
    name = "io_bazel_rules_go",
    sha256 = "b725e6497741d7fc2d55fcc29a276627d10e43fa5d0bb692692890ae30d98d00",
    urls = [
        "https://mirror.bazel.build/github.com/bazelbuild/rules_go/releases/download/v0.24.3/rules_go-v0.24.3.tar.gz",
        "https://github.com/bazelbuild/rules_go/releases/download/v0.24.3/rules_go-v0.24.3.tar.gz",
    ],
)

load("@io_bazel_rules_go//go:deps.bzl", "go_register_toolchains", "go_rules_dependencies")

go_rules_dependencies()

go_register_toolchains()
...
go/awesome.go is copied from the article.
go/BUILD
load("#io_bazel_rules_go//go:def.bzl", "go_binary", "go_library")
package(default_visibility = ["//visibility:public"])
go_binary(
name = "awesome",
srcs = glob(["*.go"]),
cgo = True,
copts = [
"-fPIC", # I tried adding this after some other reading about .a->.so
],
gc_linkopts = [
"-shared", # I think this is equivalent to the linkmode=c-shared below, but... <shrug>
],
linkmode = "c-shared",
static = "off",
)
# This one uses the pre-built awesome.so, and this works.
filegroup(
name = "prebuilt_awesome_resource",
srcs = ["awesome.so"],
)
java/Client.java is copied from the GitHub repo linked in the article (with slight tweaks to the library's location).
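For reference, a minimal JNA client along those lines looks roughly like this (the Add function and the "awesome" library name are just illustrative stand-ins based on the tutorial; the real Client.java matches whatever awesome.go exports, and the library has to be findable via jna.library.path or wherever the extracted resource lands):

import com.sun.jna.Library;
import com.sun.jna.Native;

public class Client {

    // Maps the C symbols exported from the Go library; the signatures must match awesome.go.
    public interface Awesome extends Library {
        long Add(long a, long b);
    }

    public static void main(String[] args) {
        // JNA 5+ uses Native.load; older versions use Native.loadLibrary.
        Awesome awesome = Native.load("awesome", Awesome.class);
        System.out.println("Add(12, 99) = " + awesome.Add(12, 99));
    }
}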
java/BUILD
package(default_visibility = ["//visibility:public"])
java_import(
    name = "jna",
    jars = ["jna.jar"],
)

java_library(
    name = "client_lib",
    srcs = glob(["*.java"]),
    resources = [
        # "//go:awesome",  # I'd rather use this one.
        "//go:prebuilt_awesome_resource",
    ],
    deps = [
        ":jna",
    ],
)

java_binary(
    name = "client",
    main_class = "Client",
    runtime_deps = [
        ":client_lib",
    ],
)
And also, since it was important for getting the Go stuff to run:
%programdata%/bazel.bazelrc
startup --output_user_root="C:/_bazel_out"
build --compiler=mingw-gcc
Well, I guess I need to go sit in the shame cube for a while.
Of all the options I was looking at for the compiler, I missed checking the other attributes on go_binary. Specifically, the obvious one: out, the attribute that actually corresponds to the -o flag on go build.
I added out = "awesome.so" to my go_binary rule and, sure enough, everything works.
Well that's a few hours wasted. Thanks Jay for trying to help and sorry for asking a dumb question.
This might not answer your question exactly, but I can give an example of calling a Go shared library from a C program on macOS. Hopefully that gets you most of the way there.
For the go_binary, you just need linkmode = "c-shared". You'll also need cgo = True for each package that either contains cgo code or has exported definitions. You don't need -shared, -fPIC, or static = "off".
Exported definitions should be marked with an //export comment.
There's an implicitly declared target with the suffix .c_hdrs that builds a header file for the Go library. It's :go_hello.c_hdrs in the example below. The actual header file name is go_hello.h, matching the target name.
You need to wrap the generated files with a cc_import rule to make them usable as a C/C++ dependency. #2433 is an open issue to streamline that process, but even this approach has only recently become possible in Bazel.
Anything that can consume a cc_library can consume the cc_import target the same way. So you should be able to call Go functions via JNI, though I've never tried that out.
BUILD.bazel
load("#io_bazel_rules_go//go:def.bzl", "go_binary")
go_binary(
name = "go_hello",
srcs = ["hello.go"],
cgo = True,
linkmode = "c-shared",
)
cc_import(
name = "c_hello",
hdrs = [":go_hello.c_hdrs"],
shared_library = ":go_hello",
)
cc_binary(
name = "use",
srcs = ["use.c"],
deps = [":c_hello"],
)
hello.go
package main

import "fmt"
import "C"

//export SayHello
func SayHello() {
    fmt.Println("hello")
}

func main() {}
use.c
#include "go_hello.h"
int main() {
SayHello();
return 0;
}
Related
My CustomTest.java has this import:
com.google.protobuf.Timestamp
I'm using java_test_suite to run tests in my BUILD file like so:
java_test_suite(
    name = "all-tests",
    srcs = glob(["src/test/java/**/*.java"]),
    runner = "junit5",
    test_suffixes = ["Test.java"],
    runtime_deps = JUNIT5_DEPS,
    deps = [
        ":mylib",
        "@com_google_protobuf//:timestamp_proto",
        artifact("org.junit.jupiter:junit-jupiter-api"),
        artifact("org.junit.jupiter:junit-jupiter-params"),
    ] + deps,
)
However when I run tests on it using:
bazel test //:all-tests
I'm getting this error:
src/test/java/com/x/CustomTest.java:75: error: [strict] Using type com.google.protobuf.Timestamp from an indirect dependency (TOOL_INFO: "@com_google_protobuf//:timestamp_proto wrapped in java_proto_library"). See command below **
private static Timestamp timestampFromMilli(long milli) {
^
** Please add the following dependencies:
@com_google_protobuf//:timestamp_proto to //:src/test/java/com/x/CustomTest
** You can use the following buildozer command:
buildozer 'add deps @com_google_protobuf//:timestamp_proto' //:src/test/java/com/x/CustomTest
What do I need to do exactly? I tried using the buildozer command but all I got was:
rule 'src/test/java/com/x/CustomTest' not found
Where do I need to add this @com_google_protobuf//:timestamp_proto?
Looking at protobuf's build files, it looks like timestamp_proto is a plain proto_library:
https://github.com/protocolbuffers/protobuf/blob/main/BUILD.bazel#L70-L74
https://github.com/protocolbuffers/protobuf/blob/main/src/google/protobuf/BUILD.bazel#L64-L68
and so per the advice here:
https://github.com/protocolbuffers/protobuf/blob/main/BUILD.bazel#L19-L25
you might just need to use java_proto_library to make the java version of the proto:
java_proto_library(
    name = "timestamp_java_proto",
    deps = ["@com_google_protobuf//:timestamp_proto"],
)
and then use that in the deps of your java_test_suite instead of the timestamp_proto.
Just a guess, but the error message may be unhelpful because there happens to be a Timestamp Java class in the deps of the plain proto library, and strict deps is finding that one among the test's indirect dependencies. It might be worth filing a bug about it at https://github.com/bazelbuild/bazel/issues
I want to use a specific Azul Zulu JDK for my Java builds. Therefore I have stored it in my repository locally e.g. under tools/zulu19.30.11-ca-jdk19.0.1-macosx_x64. Now I want to configure a java toolchain target such that I can pass it via --java_toolchain=//tools:my_custom_java_toolchain. I don't want to depend on some remote repositories.
What are the required steps to achieve this?
I have found this repository: https://github.com/salesforce/bazel-jdt-java-toolchain/blob/main/jdt/defs.bzl which defines a target of type default_java_toolchain, but I can't derive anything useful for my use case from it. I don't know, for example, what the header_compiler field means. My naive assumption is that I just have to pass some paths to the required tools (such as bin/javac) for Java compilation.
My current approach uses the rules java_toolchain and java_runtime. My BUILD file looks like this:
java_runtime(
    name = "zulu19.30.11-ca-jdk19.0.1-macosx_x64",
    srcs = glob(["zulu19.30.11-ca-jdk19.0.1-macosx_x64/**"]),
    java_home = "zulu19.30.11-ca-jdk19.0.1-macosx_x64",
)

java_toolchain(
    name = "zulu-19",
    source_version = "19",
    target_version = "19",
    java_runtime = ":zulu19.30.11-ca-jdk19.0.1-macosx_x64",
    javabuilder = "",
    ijar = "",
    singlejar = "",
    genclass = "",
)
I am trying to execute the command bazel build --extra_toolchains="//tools:zulu-19" //:ProjectRunner, and it complains about the missing mandatory attributes javabuilder, ijar, singlejar, and genclass, but I have no idea what the correct paths or values are.
I'm also wondering how Bazel knows how to compile the Java code with so little information. Why don't I have to specify javac, for example?
Basically, I came up with the following solution, which seems to solve my problem and is inspired by the default_java_toolchain rule. I used the existing targets for javabuilder, ijar, singlejar, and genclass, which are defined in @bazel_tools//tools/jdk. Note that I had to use BASE_JDK9_JVM_OPTS for a successful build.
load("#bazel_tools//tools/jdk:default_java_toolchain.bzl", "BASE_JDK9_JVM_OPTS")
java_runtime(
name = "zulu19.30.11-ca-jdk19.0.1-macosx_x64",
srcs = glob(["zulu19.30.11-ca-jdk19.0.1-macosx_x64/**"]),
java_home = "zulu19.30.11-ca-jdk19.0.1-macosx_x64",
)
config_setting(
name = "zulu-19-runtime_version_setting",
values = {"java_runtime_version": "19"},
visibility = ["//visibility:private"],
)
toolchain(
name = "zulu-19_runtime_toolchain_definition",
target_settings = ["zulu-19-runtime_version_setting"],
toolchain_type = "#bazel_tools//tools/jdk:runtime_toolchain_type",
toolchain = "zulu19.30.11-ca-jdk19.0.1-macosx_x64",
)
java_toolchain(
name = "zulu-19",
source_version = "19",
target_version = "19",
java_runtime = ":zulu19.30.11-ca-jdk19.0.1-macosx_x64",
jvm_opts = BASE_JDK9_JVM_OPTS,
javabuilder = ["#bazel_tools//tools/jdk:javabuilder"],
ijar = ["#bazel_tools//tools/jdk:ijar"],
singlejar = ["#bazel_tools//tools/jdk:singlejar"],
genclass = ["#bazel_tools//tools/jdk:genclass"],
)
config_setting(
name = "zulu-19_version_setting",
values = {"java_language_version": "19"},
visibility = ["//visibility:private"],
)
toolchain(
name = "zulu-19_toolchain_definition",
toolchain_type = "#bazel_tools//tools/jdk:toolchain_type",
target_settings = ["zulu-19_version_setting"],
toolchain = "zulu-19",
)
The following command now successfully runs without any errors. System.out.println(Runtime.version()); also prints the correct version.
bazel run --extra_toolchains="//tools:zulu-19_toolchain_definition,//tools:zulu-19_runtime_toolchain_definition" --java_language_version="19" --java_runtime_version="19" //:ProjectRunner
But I still have a couple of questions: why are javabuilder, ijar, singlejar, and genclass needed, and what is the purpose of those targets? How does the toolchain target know about my javac binary for compiling the Java code? Why do I have to use BASE_JDK9_JVM_OPTS for a successful build, and can I omit some of these settings?
I have a script that works perfectly when I'm not using Renv. However, when running it in a project with Renv enabled, the last command line returns the following message:
> r5r_core <- setup_r5(data_path = data_path, verbose = FALSE)
Error in rJava::.jinit() : Unable to create a Java class loader.
Just run the code below inside a renv project to have a reproducible example:
options(java.parameters = "-Xmx2G")
library(r5r)
library(rJava)
library(data.table)  # fread() below comes from data.table
data_path <- system.file("extdata/poa", package = "r5r")
list.files(data_path)
poi <- fread(file.path(data_path, "poa_points_of_interest.csv"))
head(poi)
points <- fread(file.path(data_path, "poa_hexgrid.csv"))
points <- points[ c(sample(1:nrow(points), 10, replace=TRUE)), ]
head(points)
# Indicate the path where OSM and GTFS data are stored
r5r_core <- setup_r5(data_path = data_path, verbose = FALSE)
My Java version is compatible with the one used by this package, but it looks like R is having a hard time communicating with Java inside renv. Does anyone know what's going on?
I want to include Java source code from multiple directories (which are shared between projects) in a Qt for Android project. On http://imaginativethinking.ca/what-the-heck-how-do-i-share-java-code-between-qt-android-projects/ an approach is described which copies the Java source files:
# This line makes sure my custom manifest file and project specific java code is copied to the android-build folder
ANDROID_PACKAGE_SOURCE_DIR = $$PWD/android
# This is a custom variable which holds the path to my common Java code
# I use the $$system_path() qMake function to make sure that my directory separators are correct for the platform I'm compiling on as you need to use the correct separator in the Make file (i.e. \ for Windows and / for Linux)
commonAndroidFilesPath = $$system_path( $$PWD/../CommonLib/android-sources/src )
# This is a custom variable which holds the path to the src folder in the output directory. That is where they need to go for the ANT script to compile them.
androidBuildOutputDir = $$system_path( $$OUT_PWD/../android-build/src )
# Here is the magic, this is the actual copy command I want to run.
# Make has a platform agnostic copy command macro you can use which substitutes the correct copy command for the platform you are on: $(COPY_DIR)
copyCommonJavaFiles.commands = $(COPY_DIR) $${commonAndroidFilesPath} $${androidBuildOutputDir}
# I tack it on to the 'first' target which exists by default just because I know this will happen before the ANT script gets run.
first.depends = $(first) copyCommonJavaFiles
export(first.depends)
export(copyCommonJavaFiles.commands)
QMAKE_EXTRA_TARGETS += first copyCommonJavaFiles
With later Qt versions the code has to be changed to this:
commonAndroidFilesPath = $$system_path($$PWD/android/src)
androidBuildOutputDir = $$system_path($$OUT_PWD/../android-build)
createCommonJavaFilesDir.commands = $(MKDIR) $${androidBuildOutputDir}
copyCommonJavaFiles.commands = $(COPY_DIR) $${commonAndroidFilesPath} $${androidBuildOutputDir}
first.depends = $(first) createCommonJavaFilesDir copyCommonJavaFiles
export(first.depends)
export(createCommonJavaFilesDir.commands)
export(copyCommonJavaFiles.commands)
QMAKE_EXTRA_TARGETS += first createCommonJavaFilesDir copyCommonJavaFiles
Is this the standard way to go, or is there some built-in functionality for including multiple Java source directories in Qt for Android projects?
A much cleaner solution is this one:
CONFIG += file_copies
COPIES += commonJavaFilesCopy
commonJavaFilesCopy.files = $$files($$system_path($$PWD/android/src))
commonJavaFilesCopy.path = $$OUT_PWD/android-build
I researched the PlayN game framework and liked it a lot. I program in Scala and actually don't know Java, but that's not usually a problem since they work together great.
I've set up a basic project in Eclipse and imported all the libraries and dependencies. I even translated over the base Maven project code. Here are the two files:
Zeitgeist.scala
package iris.zeit.core

import playn.core.PlayN._
import playn.core.Game
import playn.core.Image
import playn.core.ImageLayer

class Zeitgeist extends Game {
  override def init() {
    var bgImage: Image = assets().getImage("images/bg.png")
    var bgLayer: ImageLayer = graphics().createImageLayer(bgImage)
    graphics().rootLayer().add(bgLayer)
  }

  override def paint(alpha: Float) {
    // painting stuffs
  }

  override def update(delta: Float) {
  }

  override def updateRate(): Int = {
    25
  }
}
Main.scala
package iris.zeit.desktop

import playn.core.PlayN
import playn.java.JavaPlatform

import iris.zeit.core.Zeitgeist

object Main {
  def main(args: Array[String]) {
    var platform: JavaPlatform = JavaPlatform.register()
    platform.assets().setPathPrefix("resources")
    PlayN.run(new Zeitgeist())
  }
}
The cool thing is, it works! A window comes up perfectly. The only problem is I can't seem to load images. With the line above, assets().getImage("images/bg.png"), it spits out:
Could not load image: resources/images/bg.png [error=java.io.FileNotFoundException: resources/images/bg.png]
I've played around with the location of my resources to no avail. I was even able to find bg.png myself with java.io.File. Am I doing something wrong? Is there something I'm forgetting?
Looking at the code of JavaAssetsManager, it looks like it is trying to load a resource and not a file. So you should check that your images are actually on the classpath and at the path you give ("resources/images/bg.png").
Alternatively, you can use getRemoteImage and pass a file URL. Since you succeeded in using a java.io.File, you can get the URL by going through File's toURI() (File.toURL() itself is deprecated).
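Roughly, in plain Java (nothing PlayN-specific here; the path is just an example):

import java.io.File;
import java.net.URL;

public class FileUrlDemo {
    public static void main(String[] args) throws Exception {
        File image = new File("resources/images/bg.png");  // wherever the image really lives
        URL url = image.toURI().toURL();  // File.toURL() is deprecated, so go via toURI()
        System.out.println(url);          // something like file:/.../resources/images/bg.png
    }
}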
This almost certainly doesn't work because you're doing this:
platform.assets().setPathPrefix("resources")
That means you're saying your source folder looks like this:
src/main/resources/resources/images/bg.png
src/main/resources/resources/images/pea.png
src/main/resources/resources/images
I imagine it actually looks like one of these:
src/main/resources/assets/images/bg.png <-- 'assets' is the default prefix
src/main/resources/assets/images/pea.png
src/main/resources/assets/images
or:
src/main/resources/images/bg.png <-- You have failed to put a subfolder prefix in
src/main/resources/images/pea.png
src/main/resources/images
You can either do this, if you have no prefix:
plat.assets().setPathPrefix("")
Or just put your files in the assets sub-folder inside the resources folder.
It's worth noting that the current implementation calls:
getClass().getClassLoader().getResource(...)
Not:
getClass().getResource(...)
The difference is covered elsewhere, but the TL;DR is that plat.assets.getImage("images/pea.png") will work, but plat.assets.getImage("/images/pea.png") will not.
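A small standalone illustration of that difference (plain Java, not PlayN code; it assumes images/pea.png is on the classpath):

public class ResourceLookupDemo {
    public static void main(String[] args) {
        ClassLoader cl = ResourceLookupDemo.class.getClassLoader();

        // ClassLoader paths are always resolved from the classpath root; no leading slash.
        System.out.println(cl.getResource("images/pea.png"));   // a URL if the file is on the classpath
        System.out.println(cl.getResource("/images/pea.png"));  // typically null; the slash is not stripped

        // Class paths are package-relative unless they start with '/'.
        System.out.println(ResourceLookupDemo.class.getResource("/images/pea.png"));  // from the classpath root
        System.out.println(ResourceLookupDemo.class.getResource("images/pea.png"));   // relative to this class's package
    }
}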