I found this interesting post about water ripple simulation. It includes an Xcode project with a little Cocoa app for OS X. I tried to get the app running on iOS using Swift, but I couldn't find a solution to the following problem:
The OS X app uses NSBitmapImageRep to draw bitmap data from an array onto the screen (this happens in BBRippleView.m). Unfortunately, NSBitmapImageRep is not available on iOS. I tried using a CGImage instead, which is "kind of working": I can see some kind of water ripples, but they are split up in a strange way and they don't move the way they are supposed to. I guess the CGImage expects the bitmap data in a different format than it currently receives.
Edit: Here is my iOS RippleView class: RippleView.swift
Edit 2: Modified RippleView.swift
What would be the right way to implement the following OS X code segments on iOS using Swift?
image = [[NSImage alloc] initWithSize:NSMakeSize(cols, rows)];
bufrep = [[NSBitmapImageRep alloc]
    initWithBitmapDataPlanes:NULL
                  pixelsWide:cols
                  pixelsHigh:rows
               bitsPerSample:8
             samplesPerPixel:1
                    hasAlpha:NO
                    isPlanar:YES
              colorSpaceName:NSDeviceWhiteColorSpace
                 bytesPerRow:0
                bitsPerPixel:0];
[image addRepresentation:bufrep];
And
-(void)applyBuffer
{
    unsigned char *data = [bufrep bitmapData];
    int x = 0;
    int y = 0;
    int position = 0;
    for (y = 0; y < rows; y++) {
        for (x = 0; x < cols; x++) {
            position = (y * cols) + x;
            data[position] = (char)buffer2[position] << 1;
        }
    }
}
You are telling CGImageCreate that your image is greyscale, with 8 bits per pixel. But your buffers are arrays of Ints, which are either 32 or 64 bits (4 or 8 bytes) each. Switch to UInt8; that should help a lot.
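Here is a minimal sketch of what that could look like in Swift. It assumes the same cols, rows, and Int ripple buffer buffer2 as the Objective-C code above; the function name makeGrayscaleImage is just for illustration. The idea is to truncate each sample to a single UInt8 byte first, so the buffer layout actually matches the "greyscale, 8 bits per pixel" description handed to CGImage:

import UIKit

// Sketch only: cols, rows, and buffer2 mirror the original code;
// makeGrayscaleImage is a hypothetical helper name.
func makeGrayscaleImage(buffer2: [Int], cols: Int, rows: Int) -> UIImage? {
    // One byte per pixel, matching bitsPerComponent/bitsPerPixel = 8 below.
    var pixels = [UInt8](repeating: 0, count: cols * rows)
    for y in 0..<rows {
        for x in 0..<cols {
            let position = y * cols + x
            // Same brightness shift as applyBuffer, truncated to 8 bits.
            pixels[position] = UInt8(truncatingIfNeeded: buffer2[position] << 1)
        }
    }

    guard let provider = CGDataProvider(data: Data(pixels) as CFData) else {
        return nil
    }
    // Greyscale, no alpha: 8 bits per component and per pixel,
    // so bytesPerRow is simply cols.
    let cgImage = CGImage(width: cols,
                          height: rows,
                          bitsPerComponent: 8,
                          bitsPerPixel: 8,
                          bytesPerRow: cols,
                          space: CGColorSpaceCreateDeviceGray(),
                          bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue),
                          provider: provider,
                          decode: nil,
                          shouldInterpolate: false,
                          intent: .defaultIntent)
    return cgImage.map { UIImage(cgImage: $0) }
}

The invariant to keep in mind is that bytesPerRow * rows must equal the byte count of the data you hand to CGImage; with Int samples it was 4 or 8 times too large, which would explain the ripples appearing split up.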