I want to generate a captcha by myself without using any other API, but I have run into some issues. In my code I have treated each character as an image and performed some operations on it, but now I want to rotate the whole image with respect to its midpoint.
Here is my sample code:
for (int i = 0; i < charsToPrint; i++)
{
double randomValue = Math.random();
int randomIndex = (int) Math.round(randomValue * (chars.length - 1));
char characterToShow = chars[randomIndex];
finalString.append(characterToShow);
int charWidth = fontMetrics.charWidth(characterToShow);
int charDim = Math.max(maxAdvance, fontHeight);
//int halfCharDim = (int) (charDim / 2);
charImage = new BufferedImage(charDim, charDim, BufferedImage.TYPE_INT_ARGB);
charGraphics = charImage.createGraphics();
charGraphics.setColor(textColor);
charGraphics.setFont(textFont);
int rn = ran.nextInt((20 - 10) + 1) + 10;
x1=charDim/rn;
int rn1 = ran.nextInt((20 - 10) + 1) + 10;
y1=charDim/rn1;
int charX = (int) ((0.5 * charDim - 0.5 * charWidth)+x1);
int charY=(int) ((charDim - fontMetrics.getAscent()) / 2 + fontMetrics.getAscent()+y1);
charGraphics.drawString("" + characterToShow, charX, charY);
float x = horizMargin + spacePerChar * (i) - charDim / 2.0f;
int y = (int) ((height - charDim) / 2);
g.setColor(Color.GREEN);
g.drawImage(charImage, (int) (x + x1), (int) (y + y1), charDim, charDim, null, null);
charGraphics.dispose();
}
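One way to rotate the assembled image around its midpoint is to rotate the Graphics2D coordinate system about the image center and then redraw it. Here is a minimal sketch (the method name, the source/angleDegrees parameters and the extra target image are assumptions for illustration, not taken from the code above):
// Minimal sketch: rotate a finished BufferedImage about its center point.
BufferedImage rotateAboutCenter(BufferedImage source, double angleDegrees) {
    int w = source.getWidth();
    int h = source.getHeight();
    BufferedImage rotated = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
    Graphics2D g2 = rotated.createGraphics();
    g2.rotate(Math.toRadians(angleDegrees), w / 2.0, h / 2.0); // pivot at the midpoint
    g2.drawImage(source, 0, 0, null);
    g2.dispose();
    return rotated;
}
The same call on the per-character graphics, charGraphics.rotate(angle, charDim / 2.0, charDim / 2.0) before drawString, would instead rotate each character tile about its own midpoint.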
I've been trying to implement a BC1 (DXT1) decompression algorithm in Java. Everything seems to work pretty precisely, but I've run into a problem with some blocks around transparent ones. I've been trying to resolve it for a few hours without success.
In short, after decompressing all blocks everything looks good except for the blocks that are adjacent to transparent ones. During development I've been checking my results against those from DirectXTex (texconv), which is written in C++.
This is my result compared to the DirectXTex one:
Here is the code I'm using:
BufferedImage decompress(byte[] buffer, int width, int height)
and implementation:
BufferedImage result = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
int[] scanline = new int[4 * width]; //stores 4 horizontal lines (width/4 blocks)
RGBA[] blockPalette = new RGBA[4]; //stores RGBA values of current block
int bufferOffset = 0;
for (int row = 0; row < height / 4; row++) {
for (int col = 0; col < width / 4; col++) {
short rgb0 = Short.reverseBytes(Bytes.getShort(buffer, bufferOffset));
short rgb1 = Short.reverseBytes(Bytes.getShort(buffer, bufferOffset + 2));
int bitmap = Integer.reverseBytes(Bytes.getInt(buffer, bufferOffset + 4));
bufferOffset += 8;
blockPalette[0] = R5G6B5.decode(rgb0);
blockPalette[1] = R5G6B5.decode(rgb1);
if(rgb0 <= rgb1) {
int c2r = (blockPalette[0].getRed() + blockPalette[1].getRed()) / 2;
int c2g = (blockPalette[0].getGreen() + blockPalette[1].getGreen()) / 2;
int c2b = (blockPalette[0].getBlue() + blockPalette[1].getBlue()) / 2;
blockPalette[2] = new RGBA(c2r, c2g, c2b, 255);
blockPalette[3] = new RGBA(0, 0, 0, 0);
} else {
int c2r = (2 * blockPalette[0].getRed() + blockPalette[1].getRed()) / 3;
int c2g = (2 * blockPalette[0].getGreen() + blockPalette[1].getGreen()) / 3;
int c2b = (2 * blockPalette[0].getBlue() + blockPalette[1].getBlue()) / 3;
int c3r = (blockPalette[0].getRed() + 2 * blockPalette[1].getRed()) / 3;
int c3g = (blockPalette[0].getGreen() + 2 * blockPalette[1].getGreen()) / 3;
int c3b = (blockPalette[0].getBlue() + 2 * blockPalette[1].getBlue()) / 3;
blockPalette[2] = new RGBA(c2r, c2g, c2b, 255);
blockPalette[3] = new RGBA(c3r, c3g, c3b, 255);
}
for (int i = 0; i < 16; i++, bitmap >>= 2) {
int pi = (i / 4) * width + (col * 4 + i % 4);
int index = bitmap & 3;
scanline[pi] = A8R8G8B8.encode(blockPalette[index]);
}
}
//copy scanline to buffered image
result.setRGB(0, row * 4, width, 4, scanline, 0, width);
}
return result;
Does anyone have an idea where the problem is? I've been following exactly the steps the specification describes: Block Compression (Direct3D 10)
Should blockPalette[2].set(c2r, c2g, c2b); be blockPalette[2].set(c2r, c2g, c2b, 255); instead? (in two locations)
For those who are interested, I've found that the problem was in comparing short values.
I've just changed:
if(rgb0 <= rgb1) {
to either
if(Short.compareUnsigned(rgb0, rgb1) <= 0) {
or
if((rgb0 & 0xffff) <= (rgb1 & 0xffff)) {
and this ensures that the color values are compared as unsigned shorts (non-negative integers).
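To see why the signed comparison misfires, consider a color endpoint whose 5-6-5 encoding has the high bit set: as a Java short it is negative, which flips the branch selection. A small illustration (the values are made up purely for demonstration):
short rgb0 = (short) 0xF800; // pure red in R5G6B5; equals -2048 as a signed short
short rgb1 = (short) 0x001F; // pure blue; equals 31
System.out.println(rgb0 <= rgb1);                           // true  -> wrongly picks the 3-color + transparent branch
System.out.println(Short.compareUnsigned(rgb0, rgb1) <= 0); // false -> correctly picks the 4-color branch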
I need to implement Gaussian blur in Java for 3x3, 5x5 and 7x7 matrices. Can you correct me if I'm wrong:
I have a 3x3 matrix (M) (the middle value is M(0, 0)):
1 2 1
2 4 2
1 2 1
I take one pixel (P) from the image, and for each neighboring pixel:
s = M(-1, -1) * P(-1, -1) + M(-1, 0) * P(-1, 0) + ... + M(1, 1) * P(1, 1)
And then divide it by the sum of all the matrix values (16 for the 3x3 mask above):
P'(i, j) = s / (M(-1, -1) + M(-1, 0) + ... + M(1, 1))
That's all my program does. I leave the border pixels unchanged.
My program:
for(int i = 1; i < height - 1; i++){
for(int j = 1; j < width - 1; j++){
int sum = 0, l = 0;
for(int m = -1; m <= 1; m++){
for(int n = -1; n <= 1; n++){
try{
System.out.print(l + " ");
sum += mask3[l++] * Byte.toUnsignedInt((byte) source[(i + m) * height + j + n]);
} catch(ArrayIndexOutOfBoundsException e){
int ii = (i + m) * height, jj = j + n;
System.out.println("Pixels[" + ii + "][" + jj + "] " + i + ", " + j);
System.exit(0);
}
}
System.out.println();
}
System.out.println();
output[i * width + j] = sum / maskSum[0];
}
}
I get source from a BufferedImage like this:
int[] source = image.getRGB(0, 0, width, height, null, 0, width);
So for this image:
The result is this:
Can you tell me what is wrong with my program?
First of all, your formula for calculating the index in the source array is wrong. The image data is stored in the array one pixel row after the other. Therefore the index given x and y is calculated like this:
index = x + y * width
Furthermore, the color channels are stored in different bits of the int, so you cannot simply do the calculations on the whole int, since that allows one channel to spill into another.
The following solution should work (even though it just leaves the pixels at the bounds transparent):
public static BufferedImage blur(BufferedImage image, int[] filter, int filterWidth) {
if (filter.length % filterWidth != 0) {
throw new IllegalArgumentException("filter contains an incomplete row");
}
final int width = image.getWidth();
final int height = image.getHeight();
final int sum = IntStream.of(filter).sum();
int[] input = image.getRGB(0, 0, width, height, null, 0, width);
int[] output = new int[input.length];
final int pixelIndexOffset = width - filterWidth;
final int centerOffsetX = filterWidth / 2;
final int centerOffsetY = filter.length / filterWidth / 2;
// apply filter
for (int h = height - filter.length / filterWidth + 1, w = width - filterWidth + 1, y = 0; y < h; y++) {
for (int x = 0; x < w; x++) {
int r = 0;
int g = 0;
int b = 0;
for (int filterIndex = 0, pixelIndex = y * width + x;
filterIndex < filter.length;
pixelIndex += pixelIndexOffset) {
for (int fx = 0; fx < filterWidth; fx++, pixelIndex++, filterIndex++) {
int col = input[pixelIndex];
int factor = filter[filterIndex];
// sum up color channels separately
r += ((col >>> 16) & 0xFF) * factor;
g += ((col >>> 8) & 0xFF) * factor;
b += (col & 0xFF) * factor;
}
}
r /= sum;
g /= sum;
b /= sum;
// combine channels with full opacity
output[x + centerOffsetX + (y + centerOffsetY) * width] = (r << 16) | (g << 8) | b | 0xFF000000;
}
}
BufferedImage result = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
result.setRGB(0, 0, width, height, output, 0, width);
return result;
}
int[] filter = {1, 2, 1, 2, 4, 2, 1, 2, 1};
int filterWidth = 3;
BufferedImage blurred = blur(img, filter, filterWidth);
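The question also mentions 5x5 and 7x7 masks. One common way to build them (an assumption here, not the only choice) is from binomial coefficients, which approximate a Gaussian: take a row of Pascal's triangle and form its outer product with itself. The resulting filter plugs straight into the blur method above.
// Sketch: build an n x n binomial (approximately Gaussian) filter, e.g. n = 5 or n = 7.
static int[] binomialFilter(int n) {
    int[] row = new int[n];
    row[0] = 1;
    for (int i = 1; i < n; i++) {      // build a row of Pascal's triangle in place
        for (int j = i; j > 0; j--) {
            row[j] += row[j - 1];
        }
    }
    int[] filter = new int[n * n];
    for (int y = 0; y < n; y++) {      // outer product of the row with itself
        for (int x = 0; x < n; x++) {
            filter[y * n + x] = row[y] * row[x];
        }
    }
    return filter;
}
For n = 3 this reproduces the {1, 2, 1, 2, 4, 2, 1, 2, 1} mask above; blur(img, binomialFilter(5), 5) covers the 5x5 case.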
I am trying to add some texture to my game. I am running into some problems getting the image to display properly.
This is what the texture should look like, just a boring black square:
And this is what I get. A little bit of black with blue lines.
This is the code I used to import the image. The BufferedImage is set to TYPE_INT_RGB:
package com.mime.minefront.graphics;
import java.awt.image.BufferedImage;
import javax.imageio.ImageIO;
public class Texture {
public static Render floor = loadBitmap("/textures/floorb.png");
public static Render loadBitmap(String fileName) {
try {
BufferedImage image = ImageIO.read(Texture.class.getResource(fileName));
int width = image.getWidth();
int height = image.getHeight();
Render result = new Render(width, height);
image.getRGB(0, 0, width, height, result.pixels, 0, width);
return result;
} catch (Exception e) {
System.out.println("CRASH!");
throw new RuntimeException(e);
}
}
}
Any help or advice would be great. I have tried searching for the answer, but with no luck.
This is my Render class.
package com.mime.minefront.graphics;
public class Render {
public final int width;
public final int height;
public final int[] pixels;
public Render(int width, int height) {
this.width = width;
this.height = height;
pixels = new int[width * height];
}
public void draw(Render render, int xOffset, int yOffset) {
for (int y = 0; y < render.height; y++) {
int yPix = y + yOffset;
if (yPix < 0 || yPix >= height) {
continue;
}
for (int x = 0; x < render.width; x++) {
int xPix = x + xOffset;
if (xPix < 0 || xPix >= width) {
continue;
}
int alpha = render.pixels[x + y * render.width];
if (alpha > 0) {
pixels[xPix + yPix * width] = alpha;
}
}
}
}
}
and this is my Render3D class
package com.mime.minefront.graphics;
import com.mime.minefront.Game;
import com.mimi.minefront.input.Controller;
import com.mimi.minefront.input.InputHandler;
import java.awt.Robot;
import java.util.Random;
public class Render3D extends Render {
public double[] zBuffer;
private double renderDistance = 5000;
private double forward, right, up, cosine, sine;
public Render3D(int width, int height) {
super(width, height);
zBuffer = new double[width * height];
}
public void floor(Game game) {
double floorPosition = 8;
double cellingPosition = 8;
forward = game.controls.z;
right = game.controls.x;
up = game.controls.y;
double walking = Math.sin(game.time / 6.0) * 0.5;
if (Controller.crouchWalk) {
walking = Math.sin(game.time / 6.0) * 0.25;
}
if (Controller.runWalk) {
walking = Math.sin(game.time / 6.0) * 0.8;
}
double rotation = 0;//Math.sin(game.time / 20) * 0.5; //game.controls.rotation;
cosine = Math.cos(rotation);
sine = Math.sin(rotation);
for (int y = 0; y < height; y++) {
double celling = (y - height / 2.0) / height;
double z = (floorPosition + up) / celling;
if (Controller.walk) {
z = (floorPosition + up + walking) / celling;
}
if (celling < 0) {
z = (cellingPosition - up) / -celling;
if (Controller.walk) {
z = (cellingPosition - up - walking) / -celling;
}
}
for (int x = 0; x < width; x++) {
double depth = (x - width / 2.0) / height;
depth *= z;
double xx = depth * cosine + z * sine;
double yy = z * cosine - depth * sine;
int xPix = (int) (xx + right);
int yPix = (int) (yy + forward);
zBuffer[x + y * width] = z;
pixels[x + y * width] = //((xPix & 15) * 16 | ((yPix % 15) * 16) << 8);
Texture.floor.pixels[xPix & 7] + (yPix & 7) * 8;
if (z > 500) {
pixels[x + y * width] = 0;
}
}
}
}
public void renderWall(double xLeft, double xRight, double zDistance, double yHeight) {
double xcLeft = ((xLeft) - right) * 2;
double zcLeft = ((zDistance) - forward) * 2;
double rotLeftSideX = xcLeft * cosine - zcLeft * sine;
double yCornerTL = ((-yHeight) - up) * 2;
double yCornerBL = ((+0.5 - yHeight) - up) * 2;
double rotLeftSideZ = zcLeft * cosine + xcLeft * sine;
double xcRight = ((xRight) - right) * 2;
double zcRight = ((zDistance) - forward) * 2;
double rotRightSideX = xcRight * cosine - zcLeft * sine;
double yCornerTR = ((-yHeight) - up) * 2;
double yCornerBR = ((+0.5 - yHeight) - up) * 2;
double rotRightSideZ = zcRight * cosine + xcRight * sine;
double xPixelLeft = (rotLeftSideX / rotLeftSideZ * height + width / 2);
double xPixelRight = (rotRightSideX / rotRightSideZ * height + width / 2);
if (xPixelLeft >= xPixelRight) {
return;
}
int xPixelLeftInt = (int) (xPixelLeft);
int xPixelRightInt = (int) (xPixelRight);
if (xPixelLeftInt < 0) {
xPixelLeftInt = 0;
}
if (xPixelRightInt > width) {
xPixelRightInt = width;
}
double yPixelLeftTop = (yCornerTL / rotLeftSideZ * height + height / 2);
double yPixelLeftBottom = (yCornerBL / rotLeftSideZ * height + height / 2);
double yPixelRightTop = (yCornerTR / rotRightSideZ * height + height / 2);
double yPixelRightBottom = (yCornerBR / rotRightSideZ * height + height / 2);
double tex1 = 1 / rotLeftSideZ;
double tex2 = 1 / rotRightSideZ;
double tex3 = 0 / rotLeftSideZ;
double tex4 = 8 / rotRightSideZ - tex3;
for (int x = xPixelLeftInt; x < xPixelRightInt; x++) {
double pixelRotation = (x - xPixelLeft) / (xPixelRight - xPixelLeft);
double xTexture= (int) ((tex3+tex4*pixelRotation)/tex1+(tex2-tex1)*pixelRotation);
double yPixelTop = yPixelLeftTop + (yPixelRightTop - yPixelLeftTop) * pixelRotation;
double yPixelBottom = yPixelLeftBottom + (yPixelRightBottom - yPixelLeftBottom) * pixelRotation;
int yPixelTopInt = (int) (yPixelTop);
int yPixelBottomInt = (int) (yPixelBottom);
if (yPixelTopInt < 0) {
yPixelTopInt = 0;
}
if (yPixelBottomInt > height) {
yPixelBottomInt = height;
}
for (int y = yPixelTopInt; y < yPixelBottomInt; y++) {
pixels[x + y * width] = (int) xTexture*100;
zBuffer[x + y * width] = 0;
}
}
}
public void renderDistanceLimiter() {
for (int i = 0; i < width * height; i++) {
int colour = pixels[i];
int brightness = (int) (renderDistance / (zBuffer[i]));
if (brightness < 0) {
brightness = 0;
}
if (brightness > 255) {
brightness = 255;
}
int r = (colour >> 16) & 0xff;
int g = (colour >> 8) & 0xff;
int b = (colour) & 0xff;
r = r * brightness / 255;
g = g * brightness / 255;
b = b * brightness / 255;
pixels[i] = r << 16 | g << 8 | b;
}
}
}
From getRGB():
Returns an array of integer pixels in the default RGB color model
(TYPE_INT_ARGB) and default sRGB color space, from a portion of the
image data. Color conversion takes place if the default model does not
match the image ColorModel.
See if using TYPE_INT_ARGB instead of TYPE_INT_RGB works.
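If the image type really is the culprit, a minimal sketch of that idea inside loadBitmap (assuming an extra java.awt.Graphics2D import) is to redraw whatever ImageIO returns into a TYPE_INT_ARGB image before grabbing the pixels:
BufferedImage raw = ImageIO.read(Texture.class.getResource(fileName));
// redraw into a known ARGB layout so getRGB fills result.pixels the way Render expects
BufferedImage image = new BufferedImage(raw.getWidth(), raw.getHeight(), BufferedImage.TYPE_INT_ARGB);
Graphics2D g = image.createGraphics();
g.drawImage(raw, 0, 0, null);
g.dispose();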
I'm new to OpenCV on Android. I'm trying to convert some C++ code into Java. I am stuck at a point and cannot continue.
std::vector<cv::Vec4i> lines;
cv::HoughLinesP(bw, lines, 1, CV_PI/180, 70, 30, 10);
// Expand the lines
for (int i = 0; i < lines.size(); i++)
{
cv::Vec4i v = lines[i];
lines[i][0] = 0;
lines[i][1] = ((float)v[1] - v[3]) / (v[0] - v[2]) * -v[0] + v[1];
lines[i][2] = src.cols;
lines[i][3] = ((float)v[1] - v[3]) / (v[0] - v[2]) * (src.cols - v[2]) + v[3];
}
I converted it halfway, up to the TODO:
MatOfInt4 lines= new MatOfInt4();
Imgproc.HoughLinesP(bw, lines, 1, Math.PI/180, 70, 30, 10);
int[] lineArray = lines.toArray();
// Expand the lines
//TODO
for (int i = 0; i < lineArray.length; i++)
{
int v = lineArray[i];
lines.[i][0] = 0;
lines[i][1] = ((float)v[1] - v[3]) / (v[0] - v[2]) * -v[0] + v[1];
lines[i][2] = src.cols();
lines[i][3] = ((float)v[1] - v[3]) / (v[0] - v[2]) * (src.cols() - v[2]) + v[3];
}
What confuses me is the inside of the for loop. When lines is converted to an array it gives an int array, but inside the loop v is indexed as if it were itself an array. I don't understand this point. Can anybody please help me get through this? Thank you in advance.
Finally I managed to write working code, with the help of this link:
Mat lines = new Mat();
int threshold = 70;
int minLineSize = 30;
int lineGap = 10;
Imgproc.HoughLinesP(thresholdImage, lines, 1, Math.PI / 180, threshold,
minLineSize, lineGap);
for (int x = 0; x < lines.cols(); x++) {
double[] vec = lines.get(0, x);
double[] val = new double[4];
val[0] = 0;
val[1] = ((float) vec[1] - vec[3]) / (vec[0] - vec[2]) * -vec[0] + vec[1];
val[2] = src.cols();
val[3] = ((float) vec[1] - vec[3]) / (vec[0] - vec[2]) * (src.cols() - vec[2]) + vec[3];
lines.put(0, x, val);
}
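For what it's worth, the earlier MatOfInt4 attempt was close: toArray() flattens the result into an int[] holding four consecutive values (x1, y1, x2, y2) per detected line, so it has to be walked with a stride of 4 rather than indexed line by line. A sketch of that variant (assuming lines is still a MatOfInt4):
int[] flat = lines.toArray(); // four ints per line: x1, y1, x2, y2
for (int i = 0; i < flat.length; i += 4) {
    float x1 = flat[i], y1 = flat[i + 1], x2 = flat[i + 2], y2 = flat[i + 3];
    float slope = (y1 - y2) / (x1 - x2);
    flat[i] = 0;
    flat[i + 1] = (int) (slope * -x1 + y1);
    flat[i + 2] = src.cols();
    flat[i + 3] = (int) (slope * (src.cols() - x2) + y2);
}
lines.fromArray(flat); // write the expanded lines back into the Mat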
I'm trying to rotate a bitmap whose pixels are stored in an array, int[] pixels. I have the following method:
public void rotate(double angle) {
double radians = Math.toRadians(angle);
double cos, sin;
cos = Math.cos(radians);
sin = Math.sin(radians);
int[] pixels2 = pixels;
for (int x = 0; x < width; x++)
for (int y = 0; y < height; y++) {
int centerx = this.width / 2, centery = this.height / 2;
int m = x - centerx;
int n = y - centery;
int j = (int) (m * cos + n * sin);
int k = (int) (n * cos - m * sin);
j += centerx;
k += centery;
if (!((j < 0) || (j > this.width - 1) || (k < 0) || (k > this.height - 1)))
try {
pixels2[(x * this.width + y)] = pixels[(k * this.width + j)];
} catch (Exception e) {
e.printStackTrace();
}
}
pixels = pixels2;
}
But it just gives me crazy results. Does anyone know where the error is?
The line
int[] pixels2 = pixels;
is supposed to copy the array, but you are just copying the reference to it. Use pixels.clone(). In fact, you just need a new, empty array, so new int[pixels.length] is enough. In the end you need System.arraycopy to copy the new content into the old array.
There are other problems in your code -- you are mixing up rows and columns. Some expressions are written as though the image is stored row by row, others as if column by column. If row-by-row (my assumption), then this doesn't make sense: x*width + y. It should read y*width + x -- you are skipping y rows down and then moving x columns to the right. All in all, I have this code that works OK:
import static java.lang.System.arraycopy;
public class Test
{
private final int width = 5, height = 5;
private int[] pixels = {0,0,1,0,0,
0,0,1,0,0,
0,0,1,0,0,
0,0,1,0,0,
0,0,1,0,0};
public Test rotate(double angle) {
final double radians = Math.toRadians(angle),
cos = Math.cos(radians), sin = Math.sin(radians);
final int[] pixels2 = new int[pixels.length];
for (int x = 0; x < width; x++)
for (int y = 0; y < height; y++) {
final int
centerx = this.width / 2, centery = this.height / 2,
m = x - centerx,
n = y - centery,
j = ((int) (m * cos + n * sin)) + centerx,
k = ((int) (n * cos - m * sin)) + centery;
if (j >= 0 && j < width && k >= 0 && k < this.height)
pixels2[(y * width + x)] = pixels[(k * width + j)];
}
arraycopy(pixels2, 0, pixels, 0, pixels.length);
return this;
}
public Test print() {
for (int y = 0; y < height; y++) {
for (int x = 0; x < width; x++)
System.out.print(pixels[width*y + x]);
System.out.println();
}
System.out.println();
return this;
}
public static void main(String[] args) {
new Test().print().rotate(-45).print();
}
}
public void render(float nx, float ny, float nz, float size, float rotate) {
int wid = (int) ((width - nz) * size);
int hgt = (int) ((height - nz) * size);
if (wid < 0 || hgt < 0) {
wid = 0;
hgt = 0;
}
for (int x = 0; x < wid; x++) {
for (int y = 0; y < hgt; y++) {
double simple = Math.PI;
int xp = (int) (nx +
Math.cos(rotate) * ((x / simple) - (wid / simple) / 2) + Math
.cos(rotate + Math.PI / 2)
* ((y / simple) - (hgt / simple) / 2));
int yp = (int) (ny + Math.sin(rotate)
* ((x / simple) - (wid / simple) / 2) + Math.sin(rotate
+ Math.PI / 2)
* ((y / simple) - (hgt / simple) / 2));
if (xp + width < 0 || yp + height < 0 || xp >= Main.width
|| yp >= Main.height) {
break;
}
if (xp < 0
|| yp < 0
|| pixels[(width / wid) * x + ((height / hgt) * y)
* width] == 0xFFFF00DC) {
continue;
}
Main.pixels[xp + yp * Main.width] = pixels[(width / wid) * x
+ ((height / hgt) * y) * width];
}
}
}
Rotating is still new to me, but the process here is that of a normal rotation. It still needs much fixing -- it's inefficient and slow -- but in a small program this code works. I'm posting it so you can take it and make it better. :)