I'm working on a Spring Boot application and connecting it to an Amazon S3 bucket. It is a simple application for uploading videos to AWS.
VideoService Class
@Service
@RequiredArgsConstructor
public class VideoService {

    private final S3Service s3Service;
    private final VideoRepository videoRepository;

    public void uploadFile(MultipartFile file) {
        String videoURL = s3Service.uploadFile(file);
        var video = new Video();
        video.setVideoUrl(videoURL);
        videoRepository.save(video);
    }
}
S3Service class:
@Service
@RequiredArgsConstructor
public class S3Service implements FileService {

    public static final String BUCKET_NAME = "****";

    private final AmazonS3Client amazonS3Client;

    @Override
    public String uploadFile(MultipartFile file) {
        var filenameExtension = StringUtils.getFilenameExtension(file.getOriginalFilename());
        var key = UUID.randomUUID() + "." + filenameExtension;
        var metadata = new ObjectMetadata();
        metadata.setContentLength(file.getSize());
        metadata.setContentType(file.getContentType());
        try {
            amazonS3Client.putObject(BUCKET_NAME, key, file.getInputStream(), metadata);
        } catch (IOException ioException) {
            throw new ResponseStatusException(HttpStatus.INTERNAL_SERVER_ERROR,
                    "An exception occurred while uploading the file");
        }
        amazonS3Client.setObjectAcl(BUCKET_NAME, key, CannedAccessControlList.PublicRead);
        return amazonS3Client.getResourceUrl(BUCKET_NAME, key);
    }
}
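The key-generation step in uploadFile (a random UUID plus the original file's extension) is easy to pull out and unit-test on its own. S3Keys.uniqueKey below is a hypothetical helper for illustration, not part of the original service:

```java
import java.util.UUID;

class S3Keys {
    // Mirrors the key construction in uploadFile: a random UUID
    // followed by the original file's extension, so repeated uploads
    // of identically named files never collide in the bucket.
    static String uniqueKey(String extension) {
        return UUID.randomUUID() + "." + extension;
    }
}
```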
Controller class:
@RestController
@RequestMapping("/api/videos")
@RequiredArgsConstructor
public class VideoController {

    private final VideoService videoService;

    @PostMapping
    @ResponseStatus(HttpStatus.CREATED)
    public void uploadVideo(@RequestParam("file") MultipartFile file) {
        videoService.uploadFile(file);
    }
}
I have already created the S3 bucket and an access key on AWS. The access key and secret key are stored in VM options: -Dcloud.aws.credentials.access-key=**** -Dcloud.aws.credentials.secret-key=*****
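For reference, if Spring Cloud AWS auto-configuration is not picking those properties up, the client can be wired by hand. A minimal sketch, assuming the same AWS SDK v1 classes used above; the class name and region are placeholders you would replace:

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AwsClientConfig {

    // Reads the same -D system properties set in the VM options above.
    @Bean
    public AmazonS3 amazonS3() {
        var credentials = new BasicAWSCredentials(
                System.getProperty("cloud.aws.credentials.access-key"),
                System.getProperty("cloud.aws.credentials.secret-key"));
        return AmazonS3ClientBuilder.standard()
                .withRegion(Regions.EU_WEST_1) // placeholder region
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .build();
    }
}
```

Note that the builder returns the AmazonS3 interface; injecting that interface rather than the concrete AmazonS3Client is the usual pattern.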
This is what I get in Postman: [Postman screenshot]
This is what is shown in the browser: [browser screenshot of localhost]
Did I miss something? I was following a tutorial and checked every line of code against the completed repo; everything was the same. Instead of a 200 status, it is giving me a 401 status.
I have the exact same use case. One difference is that I am using the AWS SDK for Java V2, which is the recommended SDK version to use.
My controller:
@RequestMapping(value = "/upload", method = RequestMethod.POST)
@ResponseBody
public String singleFileUpload(@RequestParam("file") MultipartFile file) {
    try {
        byte[] bytes = file.getBytes();
        String fileName = file.getOriginalFilename();
        // Prefix the name with a random UUID so repeated uploads do not collide.
        String uniqueFileName = UUID.randomUUID() + "-" + fileName;
        DynamoDBService dbService = new DynamoDBService();
        S3Service s3Service = new S3Service();
        AnalyzePhotos analyzePhotos = new AnalyzePhotos();
        UploadEndpoint endpoint = new UploadEndpoint(analyzePhotos, dbService, s3Service);
        endpoint.upload(bytes, uniqueFileName);
        return "You have uploaded " + uniqueFileName;
    } catch (Exception e) {
        e.printStackTrace();
    }
    return "File was not uploaded";
}
Notice that I am getting the byte[] of the uploaded file.
The endpoint.upload() method is here:
public void upload(byte[] bytes, String name) {
    // Put the file into the bucket.
    s3Service.putObject(bytes, PhotoApplicationResources.STORAGE_BUCKET, name);
    this.tagAfterUpload(name);
}
The s3Service.putObject() is here:
// Places an image into an S3 bucket.
public void putObject(byte[] data, String bucketName, String objectKey) {
    S3Client s3 = getClient();
    try {
        s3.putObject(PutObjectRequest.builder()
                        .bucket(bucketName)
                        .key(objectKey)
                        .build(),
                RequestBody.fromBytes(data));
    } catch (S3Exception e) {
        System.err.println(e.getMessage());
        e.printStackTrace();
        throw e;
    }
}
The getClient() method is:
private S3Client getClient() {
    return S3Client.builder()
            .region(PhotoApplicationResources.REGION)
            .build();
}
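Note that getClient() sets no credentials provider, so the SDK falls back to the default credentials provider chain (environment variables, Java system properties, ~/.aws/credentials, and so on). If you want to pin the source explicitly, a sketch with a placeholder class name and region:

```java
import software.amazon.awssdk.auth.credentials.EnvironmentVariableCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;

class S3ClientFactory {
    // Variant of getClient() that reads credentials only from the
    // AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables.
    static S3Client getClient() {
        return S3Client.builder()
                .region(Region.US_EAST_1) // placeholder region
                .credentialsProvider(EnvironmentVariableCredentialsProvider.create())
                .build();
    }
}
```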
Now all of this works and can be invoked from Postman.
It looks like you have Spring Security dependencies in your POM file (based on your screenshot of a forced login); by default, Spring Security rejects unauthenticated requests with 401. Remove those dependencies, set a breakpoint in the controller, and check whether the file now arrives.
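If you want to keep Spring Security on the classpath instead, another option is to permit this endpoint explicitly. A minimal sketch, assuming Spring Security 6 / Spring Boot 3; the class name is hypothetical and the matcher path follows the controller's /api/videos mapping:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.web.SecurityFilterChain;

@Configuration
public class SecurityConfig {

    // Permit unauthenticated requests to the upload endpoint; everything
    // else still requires authentication. CSRF is disabled here because
    // the API is called from Postman, not a browser form.
    @Bean
    public SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
        http.csrf(csrf -> csrf.disable())
            .authorizeHttpRequests(auth -> auth
                .requestMatchers("/api/videos/**").permitAll()
                .anyRequest().authenticated());
        return http.build();
    }
}
```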