I am using CodeBuild to deploy a React website to S3. When I run npm run build
locally, it works perfectly. However, when the build runs in CodeBuild, it fails with the following error:
[Container] 2021/01/26 00:29:23 Running command npm run build
> gci-web-app@0.1.0 build /codebuild/output/src705738256/src
> react-scripts build
/codebuild/output/src705738256/src/node_modules/fs-extra/lib/mkdirs/make-dir.js:85
} catch {
^
SyntaxError: Unexpected token {
at createScript (vm.js:80:10)
at Object.runInThisContext (vm.js:139:10)
at Module._compile (module.js:616:28)
at Object.Module._extensions..js (module.js:663:10)
at Module.load (module.js:565:32)
at tryModuleLoad (module.js:505:12)
at Function.Module._load (module.js:497:3)
at Module.require (module.js:596:17)
at require (internal/module.js:11:18)
at Object.<anonymous> (/codebuild/output/src705738256/src/node_modules/fs-extra/lib/mkdirs/index.js:3:44)
I have tried installing fs-extra in the package, but I'm still hitting the same error. I'm really scratching my head on this, as it is an out-of-the-box create-react-app application. Any help would be greatly appreciated!
In case it's needed, this is what my CodeBuild buildspec looks like:
phases:
  pre_build:
    commands:
      - echo Installing source NPM dependencies...
      - npm install
  build:
    commands:
      - echo Build started on `date`
      - npm run build
  post_build:
    commands:
      # copy the contents of /build to S3
      - aws s3 cp --recursive --acl public-read ./build s3://${GciWebAppBucket}/
      # set the cache-control headers for service-worker.js to prevent
      # browser caching
      - >
        aws s3 cp --acl public-read
        --cache-control="max-age=0, no-cache, no-store, must-revalidate"
        ./build/service-worker.js s3://${GciWebAppBucket}/
      # set the cache-control headers for index.html to prevent
      # browser caching
      - >
        aws s3 cp --acl public-read
        --cache-control="max-age=0, no-cache, no-store, must-revalidate"
        ./build/index.html s3://${GciWebAppBucket}/
      # invalidate the CloudFront cache for index.html and service-worker.js
      # to force CloudFront to update its edge locations with the new versions
      - >
        aws cloudfront create-invalidation --distribution-id ${Distribution}
        --paths /index.html /service-worker.js
artifacts:
  files:
    - '**/*'
  base-directory: build
The issue was due to the version of Node.js the Docker image was using. fs-extra's "} catch {" is ES2019 optional catch binding, which Node.js only supports from version 10 onward, and the module.js frames in the stack trace are consistent with Node.js 8, so the build container's runtime was simply too old to parse the dependency. Changing the image to aws/codebuild/standard:5.0 and setting runtime-versions to nodejs 14 in the install phase did the job. A short illustration of the syntax issue is below, followed by the full CodeBuild CloudFormation resource that worked.
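For reference, here is a minimal sketch of the two catch forms (a hypothetical snippet, not fs-extra's actual code; riskyOperation is a made-up helper):

// Hypothetical helper, only here to give catch something to catch.
function riskyOperation() {
  throw new Error("boom");
}

// ES2019 optional catch binding: parses on Node.js 10+,
// but Node.js 8 rejects it with "SyntaxError: Unexpected token {".
try {
  riskyOperation();
} catch {
  console.log("caught without binding the error");
}

// Pre-ES2019 equivalent that every Node.js version accepts:
try {
  riskyOperation();
} catch (err) {
  console.log("caught:", err.message);
}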
CodeBuild:
  Type: 'AWS::CodeBuild::Project'
  Properties:
    Name: !Sub ${AWS::StackName}-CodeBuild
    ServiceRole: !GetAtt CodeBuildRole.Arn
    Artifacts:
      # The downloaded source code for the build will come from CodePipeline
      Type: CODEPIPELINE
      Name: GCIWebApp
    Environment:
      # Linux container with node installed
      ComputeType: BUILD_GENERAL1_SMALL
      Type: LINUX_CONTAINER
      Image: "aws/codebuild/standard:5.0"
    Source:
      Type: CODEPIPELINE
      BuildSpec: !Sub |
        version: 0.2
        phases:
          install:
            runtime-versions:
              nodejs: 14
            commands:
              - npm i npm@latest -g
              - npm cache clean --force
              - rm -rf node_modules package-lock.json
              - npm install
          build:
            commands:
              - echo Build started on `date`
              - npm run build
          post_build:
            commands:
              # copy the contents of /build to S3
              - aws s3 cp --recursive --acl public-read ./build s3://${GciWebAppBucket}/
              # set the cache-control headers for service-worker.js to prevent
              # browser caching
              - >
                aws s3 cp --acl public-read
                --cache-control="max-age=0, no-cache, no-store, must-revalidate"
                ./build/service-worker.js s3://${GciWebAppBucket}/
              # set the cache-control headers for index.html to prevent
              # browser caching
              - >
                aws s3 cp --acl public-read
                --cache-control="max-age=0, no-cache, no-store, must-revalidate"
                ./build/index.html s3://${GciWebAppBucket}/
              # invalidate the CloudFront cache for index.html and service-worker.js
              # to force CloudFront to update its edge locations with the new versions
              - >
                aws cloudfront create-invalidation --distribution-id ${Distribution}
                --paths /index.html /service-worker.js
        artifacts:
          files:
            - '**/*'
          base-directory: build
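One extra sanity check worth adding (my suggestion, not part of the original template): echo the tool versions at the top of the install phase so the build log shows which runtime the container actually resolved. A trimmed install phase, for illustration:

install:
  runtime-versions:
    nodejs: 14
  commands:
    - node --version   # should print v14.x on aws/codebuild/standard:5.0
    - npm --version
    - npm install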