I am using Core Data in my project. I can save a UIImage by using a UIImageToDataTransformer class. I am also using the OpenCV library, and I need to save some CvMat structs (they are structs, right?). I have read some posts about transformable attributes but still can't figure out how to save one in Core Data and fetch it back.
I have an entity called "Human" with an attribute called projVal; its type is Transformable, and it holds the data of a CvMat.
What should my next step be?
Below is the transformer class for UIImage, but how can I transform a CvMat's data? Thanks for your advice.
    #import "UIImageToDataTransformer.h"

    @implementation UIImageToDataTransformer

    + (BOOL)allowsReverseTransformation {
        return YES;
    }

    + (Class)transformedValueClass {
        return [NSData class];
    }

    // UIImage -> NSData (what Core Data stores on save)
    - (id)transformedValue:(id)value {
        return UIImagePNGRepresentation(value);
    }

    // NSData -> UIImage (what a fetch gives back)
    - (id)reverseTransformedValue:(id)value {
        return [[[UIImage alloc] initWithData:value] autorelease];
    }

    @end
Edit: Would it be better to use the Binary Data type instead of the Transformable type? I'm using auto-generated classes for my entities.
I use this code while saving:
    CvMat *mat = cvCreateMat(3, 3, CV_32FC1);
    for (int i = 0; i < 9; i++) {
        mat->data.fl[i] = 1.0f;
    }
    NSData *projValData = [[NSData alloc] initWithBytes:&mat length:sizeof(mat)];
    human.projVal = projValData;
and while reading it back:
    CvMat *mat;
    NSData *data = [[humansArray objectAtIndex:0] projVal];
    memcpy(&mat, [data bytes], [data length]);
    if (mat) {
        NSLog(@"if mat works");
        for (int i = 0; i < 9; i++) {
            NSLog(@"%f", mat->data.fl[i]);
        }
    }
It doesn't work well and crashes in the for loop. Where is my mistake?
Can you get a pointer to (and the size of) the raw bytes of your CvMat data? If so, you can use the dataWithBytes:length: class method of NSData in your transformedValue: implementation, and the bytes (instance) method in reverseTransformedValue:.