Behind the Veil is currently hosted at: https://behind-the-veil-1077153890194.australia-southeast2.run.app/
- Dependencies
- Running the app
- Testing
- Errors
- Importing Mock Data
- Pull Request Strategy
- Git Help
- Versioning Strategy
- Deployment
- Team Composition
Reference: https://docs.meteor.com/install.html
Meteor requires a specific version of Node.js to be installed: version >= 10 and <= 14.
Meteor recommends using nvm to install the correct version: https://github.com/nvm-sh/nvm
Install NVM:
- Use curl (works on macOS or Linux):
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.35.3/install.sh | bash
- Use Brew on macOS:
brew install nvm
- On Windows, follow the instructions at: https://www.freecodecamp.org/news/nvm-for-windows-how-to-download-and-install-node-version-manager-in-windows-10/
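Once nvm is installed (macOS or Linux), you can install and switch to a compatible Node version, for example Node 14:
nvm install 14
nvm use 14
node -v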
If you already installed Node.js and need to downgrade:
npm install -g n
n 14
To check if the version installed successfully run:
node -v
For Windows, Mac or Linux run the following:
npm install -g meteor
Installing Meteor also installs React, so you don't need to do anything else!
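To confirm the install worked, you can check the Meteor version from any terminal:
meteor --version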
Certain features of the application send emails, so the third-party dependencies that support this must be set up.
The application requires a simple mail transfer protocol (SMTP) server for the functionalities that send emails. The project currently uses an SMTP server provided by Twilio SendGrid, but the SMTP server can be from any provider.
You must already have a Gmail account for the application to send emails FROM. (Some organisational accounts cannot be used, including Monash emails.)
The current email that the project is using is: [email protected]
This will form one of the required environment variables:
FROM_USER:
"Nickname<email_address>"
e.g.:
FROM_USER:
"Behind the Veil<[email protected]>"
The following steps will take you through setting up an SMTP server on Twilio SendGrid:
- Create and verify your account.
Registration at: https://sendgrid.com/en-us
The Twilio team may reach out for manual verification of identity. You will need to provide details about yourself, your company/organisation, what you will be using the email service for, etc.
- Once your account is verified, log in and add a Single Sender at: https://app.sendgrid.com/settings/sender_auth
  - Add the email address of the Gmail account you prepared during the prerequisite step.
  - This email address is used for the application to send emails FROM.
- Verify the email address by accessing the inbox and clicking the link sent by Twilio.
- Go to the setup guide for integration and select "SMTP Relay": https://app.sendgrid.com/guide/integrate
- Continue on to the next steps and obtain the full SMTP connection string. It should look something like this:
MAIL_URL:
"smtps://apikey:<YOUR_API_KEY>@smtp.sendgrid.net:465"
This is one of the required environment variables.
The application uses a cloud MongoDB database for storing and syncing data across devices. The following instructions will help you create a cluster for the project and set up the cloud MongoDB connection.
- Create and verify your account for MongoDB Atlas at: https://account.mongodb.com/account/register
- Log in and create a new project. (Optional: invite other team members to the project.)
- Create a new (free) M0 cluster and name it anything you want.
- Follow the prompts to create a first admin user for the cluster.
- In the "Network Access" tab, press "+ Add IP Address" and add "0.0.0.0/0" (this will allow access to the database from anywhere).
- In the "Database Access" tab, press "+ Add New Database User".
  - Create a new user with read-write permissions.
  - Make sure you record the username and password.
- In the "Database" tab, press "Connect" on the cluster.
- Select "Drivers" and then copy the connection string (URI).
  - Replace <db_username> and <db_password> with the username and password of the user created in step 6.
- You now have the other required environment variable for connecting to the cloud database. It should look something like:
MONGO_URL:
"mongodb+srv://<db_username>:<db_password>@cluster0.d0qmc.mongodb.net/meteor?retryWrites=true&w=majority&appName=Cluster0"
Once all your dependencies have been installed:
- Make sure you are in the behind-the-veil-siteroot directory:
cd behind-the-veil-siteroot
- Install the Meteor dependencies inside this directory. This will create your node_modules folder; do not skip this step.
meteor npm install
- To start the app with the LOCAL database, run:
meteor npm start
- To start the app with the email server AND cloud database, export the environment variables created earlier, then start the app. (The values below are placeholders; make sure to replace them with the actual values.)
export FROM_USER="Nickname<email_address>"
export MAIL_URL="smtps://apikey:<YOUR_API_KEY>@smtp.sendgrid.net:465"
export MONGO_URL="mongodb+srv://<db_username>:<db_password>@cluster0.d0qmc.mongodb.net/meteor?retryWrites=true&w=majority&appName=Cluster0"
meteor npm start
Your app should automatically open in the browser. If it doesn't, the console will print the link the app is running on, so open that link.
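To avoid retyping the exports each time, you could keep them in a small shell script. This is only a sketch; the filename start-cloud.sh is not part of the repo, and the placeholder values must be replaced with your own:
#!/usr/bin/env bash
# Hypothetical helper script: exports the email and cloud database variables, then starts the app.
export FROM_USER="Nickname<email_address>"
export MAIL_URL="smtps://apikey:<YOUR_API_KEY>@smtp.sendgrid.net:465"
export MONGO_URL="mongodb+srv://<db_username>:<db_password>@cluster0.d0qmc.mongodb.net/meteor?retryWrites=true&w=majority&appName=Cluster0"
meteor npm start
Run it from behind-the-veil-siteroot with: bash start-cloud.sh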
meteor test --driver-package meteortesting:mocha
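By default this runs in watch mode and re-runs the tests when files change. If you want the suite to run once and exit (for example in CI), Meteor's test command also accepts the --once flag:
meteor test --once --driver-package meteortesting:mocha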
In the event your app is crashing or failing to run, scroll up on the error message and it will tell you if you are missing any dependencies.
The most common missing dependency is Babel, so run the following if the error says a Babel package is missing:
meteor npm install --save @babel/runtime react react-dom
In the event that the app will not start due to "Unexpected mongo exit code 14":
- Try running meteor reset and then starting meteor again. (Credit: https://stackoverflow.com/questions/38988365/meteor-unexpected-mongo-exit-code-14-restarting-cant-start-mongo-server)
- Download the "MongoDB Command Line Database Tools" from: https://www.mongodb.com/try/download/database-tools
- Unzip and move the folder to anywhere you want (NOT inside the project folder).
- Get the path of the /bin folder inside the unzipped folder. The full path should be something like: C:\Program Files\MongoDB\mongodb-database-tools\bin
On Windows, add the above folder path to the PATH environment variable:
- In the Start menu, type "path" and go into "Edit system environment variables".
- Press "Environment Variables".
- Go into the Path variable for your user (not the System variable). Double click to expand the view.
- Recommended/optional: back up your Path variable before you alter it to prevent any loss that may occur. Copy the string to a text file or anywhere else on your computer for future reference.
- Press the "New" button (or any empty row) and paste the /bin folder path from above into it.
On macOS, edit the file at /etc/paths and add the above folder path to the end of the file:
- Run this command:
sudo vim /etc/paths
- Press i to go into insert mode, then paste the path you want to add on a new line at the end of the file.
- Press escape and then type :wq to save the file.
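Whichever platform you are on, open a new terminal afterwards and confirm the tools are on your PATH:
mongoimport --version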
- Run the meteor project (make sure it is started).
- Open another terminal and navigate to the site root:
\Behind-The-Veil\behind-the-veil-siteroot
- Run the command:
mongoimport -h localhost:<PORT> --db meteor --collection <COLLECTION_NAME> --file <FILEPATH/FILENAME.json> --jsonArray
In the above command, replace:
- <PORT> with the port that MongoDB is running on. If your meteor project is on port 3000, then it is most likely 3001. (Note: if you've run meteor on a different port, then MongoDB is probably running on that port + 1, e.g. meteor running on 4000 means MongoDB running on 4001.)
- <COLLECTION_NAME> with the database entity name (such as user, service, image, etc.)
- <FILEPATH/FILENAME.json> with the file path of the file to import
Note: do not include the angled brackets (< and >).
Example commands to import everything (with port 3001)
mongoimport -h localhost:3001 --db meteor --collection services --file mockdata/services15.json --jsonArray
mongoimport -h localhost:3001 --db meteor --collection services --file mockdata/services50.json --jsonArray
mongoimport -h localhost:3001 --db meteor --collection users --file mockdata/artists50.json --jsonArray
mongoimport -h localhost:3001 --db meteor --collection users --file mockdata/brides50.json --jsonArray
mongoimport -h localhost:3001 --db meteor --collection bookings --file mockdata/bookings12.json --jsonArray
mongoimport -h localhost:3001 --db meteor --collection images --file mockdata/user_images50.json --jsonArray
mongoimport -h localhost:3001 --db meteor --collection images --file mockdata/service_images100.json --jsonArray
mongoimport -h localhost:3001 --db meteor --collection images --file mockdata/galleryImages50_abc.json --jsonArray
mongoimport -h localhost:3001 --db meteor --collection posts --file mockdata/post50_abc.json --jsonArray
mongoimport -h localhost:3001 --db meteor --collection reviews --file mockdata/reviews12.json --jsonArray
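If you want to import the whole set in one go, a small shell loop like the one below would also work. This is only a convenience sketch; it assumes port 3001 and the mockdata filenames used in the commands above:
# Sketch: import each mock data file into its matching collection (assumes port 3001).
for spec in \
  "services:services15.json" "services:services50.json" \
  "users:artists50.json" "users:brides50.json" \
  "bookings:bookings12.json" \
  "images:user_images50.json" "images:service_images100.json" "images:galleryImages50_abc.json" \
  "posts:post50_abc.json" "reviews:reviews12.json"; do
  mongoimport -h localhost:3001 --db meteor --collection "${spec%%:*}" --file "mockdata/${spec#*:}" --jsonArray
done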
- Run the meteor project (make sure it is started) and remember to export the MONGO_URL to connect to the cloud database.
- Open another terminal and navigate to the site root:
\Behind-The-Veil\behind-the-veil-siteroot
- Run the command:
mongoimport --username <CLOUD_USERNAME> --password <CLOUD_PASSWORD> <HOST_URI> --db meteor --collection <COLLECTION_NAME> --file <FILEPATH> --jsonArray
In the above command, replace:
- <CLOUD_USERNAME> with the username of the cloud MongoDB database user (refer to step 6 of the cloud database setup above)
- <CLOUD_PASSWORD> with the password of the cloud MongoDB database user (refer to step 6 of the cloud database setup above)
- <HOST_URI> with the first part of the connection string; it should be something like: mongodb+srv://cluster0.xxxxxxx.mongodb.net
- <COLLECTION_NAME> with the database entity name (such as user, service, image, etc.)
- <FILEPATH> with the file path of the file to import
Note: do not include the angled brackets (< and >).
Example:
mongoimport --username meteor-main --password xxxxxx mongodb+srv://cluster0.d99ecyx.mongodb.net --db meteor --collection test --file mockdata\presentation\artists50.json --jsonArray
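If your version of the MongoDB Database Tools supports it, mongoimport can also take the whole connection string through --uri instead of separate username/password/host arguments; the placeholders below are the same as above:
mongoimport --uri "mongodb+srv://<CLOUD_USERNAME>:<CLOUD_PASSWORD>@cluster0.d99ecyx.mongodb.net/meteor" --collection test --file mockdata/presentation/artists50.json --jsonArray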
The Pull Request (PR) strategy outlines the process for contributing and reviewing code within the project. Having a PR process ensures that all changes are reviewed, tested, and documented properly before being merged into the main branch. The strategy is designed to maintain high-quality code, encourage team collaboration, and keep an agreed-upon standard of code within the team.
The 10 Bridesmaids have elected to use the Gitflow workflow for this git repository. Instead of a single main branch, this workflow uses two branches to record the history of the project. The main branch stores the official release history, which is then used to deploy the application, and the develop branch serves as an integration branch for features.
The following are the guidelines for the workflow of the 10 Bridesmaids codebase:
- Branch Creation:
  - Each developer works on a separate feature or bugfix branch, forked from develop.
  - Branch naming convention: <bugfix/feature>/<ticket-id>-brief-name-of-ticket
    For example: feature/86cuw4kj1-unresponded-booking-alert
- Development:
  - Code changes are made and tested locally on the new branch.
- Commit and Push:
  - Developers commit their changes and push them to the repository.
  - Commit message convention: <bugfix/feature>/<ticket-id> brief message about the changes made
    For example: feature/86cuw4kj1 added the email api to send user for each unresponded booking
- Open a PR:
  - Once a feature or task is complete, a pull request is opened against the develop branch.
  - The PR should include a detailed description of the changes and a link to the ticket it addresses. This includes any mock data that needs to be added.
- Code Review:
  - The code reviewers for that sprint are assigned to review the code, check for quality, and provide feedback.
  - A detailed guide to the code review standards and process steps is linked here for developer use. Please refer to: Code Review Checklist
- Feedback and Changes:
  - The developer addresses the feedback and pushes any necessary changes.
- Approval and Merge:
  - Once approved, the branch is squash merged into the develop branch. The PR is then closed and the branch is deleted remotely.
✏️ Note: Squash merge was chosen as our merge method because some of our team members are relatively new to versioning and duplicate commits sometimes occurred from rebasing/merging. A squash merge ensures that even if the developer or the reviewer misses a duplicate-commit issue, the main branch keeps a clean commit log history.
This file is for all the git issues and their resolutions that I've come across before. You are more than welcome to add to it if you find a solution online to an issue you've had.
Reference: https://git-scm.com/book/en/v2/Git-Branching-Rebasing
✅ Step by Step video: https://youtu.be/RGtwxYqkkas?si=nvmMetHWNMegc0v_&t=129 (This guy does it all in his main terminal, but I recommend using the IDE terminal so you can see your conflict files.)
🐛 There is also a screenshotted guide specific to Behind The Veil codebase if you prefer that: Guide to Rebasing
When to use rebasing: when a new ticket has been merged into develop and your branch is not up to date with the develop branch.
Why do we rebase? We want to limit the number of merge conflicts happening in develop, so we pull all new changes from develop into our own branches and resolve merge conflicts there instead of on develop.
! IMPORTANT ! We will be prioritising the develop branch over yours, so if a merge conflict occurs in develop, it will be rolled back and your changes will be lost, so please make sure you rebase!
First, commit and push any outstanding work on your branch:
git add <filename>
or
git add .
then
git commit -m "your message"
git push
Then update develop and rebase your working branch onto it:
git checkout develop
git pull
git checkout <your working branch>
git rebase develop
When rebasing you may run into merge conflicts. In this scenario, your terminal will say something like this:
error: could not apply fa39187... something to add to patch A
When you have resolved this problem, run "git rebase --continue".
If you prefer to skip this patch, run "git rebase --skip" instead.
To check out the original branch and stop rebasing, run "git rebase --abort".
Could not apply fa39187f3c3dfd2ab5faa38ac01cf3de7ce2e841... Change fake file
This occurs because one or more of your files would be overwritten by the rebase. Your terminal will tell you which files have a conflict, but you can also do cmd/ctrl + shift + f and search for '<<<<<<< HEAD'.
A merge conflict typically looks like this (during a rebase, the HEAD side holds the changes from develop and the other side holds the changes from your working branch):
<<<<<<< HEAD
/** This comment is from the develop branch */
=======
/** This comment is from your working branch */
>>>>>>> <your commit>
The way to resolve this is to pick one side of the middle '=======' line and delete the other (everything above it or below it). Make sure you also delete the remaining >>>>>>> or <<<<<<< marker lines.
Once you resolve the conflict in the file, remember to stage your changes:
git add <changed filename>
then continue rebasing to see if any other files need to be resolved:
git rebase --continue
✅ You will know your rebase was successful if your terminal says rebased successfully.
If everything in your branch is working to your expectations and you are happy with your current branch, then do
git push -f
Keep in mind git force push will rewrite your remote branch with everything in local regardless of merge conflicts. This is not a bad thing if you are happy with everything on your local and need your remote to match!
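A slightly safer alternative, if you want git to refuse the push when the remote branch has commits you haven't fetched yet, is:
git push --force-with-lease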
Reference: https://stackoverflow.com/questions/30590083/git-how-to-rename-a-branch-both-local-and-remote
Rename the local branch to the new name
git branch -m <old_name> <new_name>
Delete the old branch on the remote, where <remote> is, for example, origin:
git push <remote> --delete <old_name>
Or, a shorter way to delete the remote branch is to push an empty ref to it:
git push <remote> :<old_name>
Prevent git from using the old name when pushing in the next step. Otherwise, git will use the old upstream name instead of <new_name>.
git branch --unset-upstream <new_name>
Push the new branch to remote
git push <remote> <new_name>
Reset the upstream branch for the new_name local branch
git push <remote> -u <new_name>
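Putting the rename steps together with hypothetical names (old branch feature/old-name, new branch feature/new-name, remote origin):
git branch -m feature/old-name feature/new-name
git push origin --delete feature/old-name
git branch --unset-upstream feature/new-name
git push -u origin feature/new-name
The last command pushes the renamed branch and sets its upstream in one step.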
error: src refspec feature/86cv1cexu-git-repo-setup does not match any
error: failed to push some refs to 'https://github.com/Monash-FIT3170/Behind-The-Veil.git'
This error means the branch exists locally but not on the remote. To resolve this, do a git status to see what uncommitted files you have and commit them. Then do a git push and the branch should be pushed to the remote and created for you.
This means the branch you are trying to push has a detached HEAD, i.e. it was never branched off the develop branch.
To resolve this go into your current working branch and run
git rebase develop
Resolve any conflicts that may occur, then add all changed files using
git add <filename> # recommended
or
git add . # not recommended unless you are confident you want ALL files staged
Finally, to attach your branch back to HEAD with all changes, do
git push -f
Note: if you are unsure about any of this, please reach out to team members via General.
This strategy outlines how we manage versions in our project to ensure consistency and cohesion across the team for major releases, minor fixes and patching bugs.
We use Semantic Versioning to indicate changes in the project. The format follows v[MAJOR].[MINOR].[PATCH].
- MAJOR: Incremented when significant new features are added to the application. This release is also used for major API changes and features that may require the end user to learn or relearn how to use a feature, e.g. a new settings page or a new side bar is added.
For example: v2.0.0
- MINOR: Incremented when less significant new features or improvements are added. These do not break existing functionality but introduce enhancements, e.g. theme changes, button changes, and other minor features.
For example: v1.1.0
- PATCH/BUG: Incremented when bug fixes are made. These changes do not add new functionality but resolve issues existing in the code, e.g. bookings being duplicated or multiple copies of the same service being added at a time.
For example: v1.0.1
- Initial Development:
  - During the early stages of the project (pre-launch), the code was predominantly developed on the develop branch, hence no versioning was used. Once the project is stable and ready for release, we begin with 1.0.0.
- Branch-Based Development:
- All development is done on develop branches, and versions are assigned when merging back into the main branch (typically main or release).
- Each version release includes a tag in the repository with the version number, which serves as a snapshot of that point in development.
- Changelog:
- A changelog must be maintained to document significant changes between versions. It should outline:
- New features: new code added for functionality
- Bug fixes: Fixes on existing code
- Chores: maintenance work particularly for tech debt, cleaning up or documentation
- Release Management:
- All version releases are tagged and documented with a description of the changes in the repository. Tagging allows easy reference to specific versions for debugging or comparison (see the example tag commands after this list).
- Post-Release Process:
- After each release, increment the version number according to the next expected update. For example, after releasing 1.2.0, the next patch might be 1.2.1 or the next minor update could be 1.3.0.
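As a concrete sketch of the tagging step (the version number here is only an example):
git checkout main
git tag -a v1.2.0 -m "Release v1.2.0"
git push origin v1.2.0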
This section outlines how to manually deploy the app using Docker and Google Cloud Platform (GCP). The steps outlined are specific to our app and workflow; for general advice and further details on how to deploy using GCP, refer to https://cloud.google.com/run/docs/deploying
If you modify the Dockerfile, keep in mind that GCP requires containers to follow this contract https://cloud.google.com/run/docs/container-contract (“Executables in the container image must be compiled for Linux 64-bit. Cloud Run specifically supports the Linux x86_64 ABI format.”)
Note that these steps could probably be optimised, e.g. by pushing the Docker image straight to Artifact Registry instead of to Dockerhub first.
- Install Docker and create a Docker account, then have the Docker engine running when attempting to build the Docker image
- Create a Dockerhub repo
- Create Google Cloud project and enable billing
- Set up Artifact Registry
- Enable Artifact Registry API
- Go to Artifact Registry > Repositories
- Create a docker repository
- In the form, select docker format and Australia region
- Configure secrets using Secret Manager
- Enable Secret Manager API
- Create secrets for each of our sensitive environment variables (see the gcloud sketch after this list), e.g.
- MONGO_URL
- MAIL_URL
- FROM_USER
- Grant the ‘Secret Manager Secret Accessor’ role to the service account being used to deploy
- Go to Secret Manager page
- ‘Show info panel’
- ‘Add principal’
- Select the service account used to deploy as new principal
- Select ‘Secret Manager Secret Accessor’ as the role
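A command-line sketch for creating those secrets from Cloud Shell, if you prefer it over the console UI (the values below are placeholders and must be replaced with your real ones):
echo -n "mongodb+srv://<db_username>:<db_password>@cluster0.d0qmc.mongodb.net/meteor?retryWrites=true&w=majority" | gcloud secrets create MONGO_URL --data-file=-
echo -n "smtps://apikey:<YOUR_API_KEY>@smtp.sendgrid.net:465" | gcloud secrets create MAIL_URL --data-file=-
echo -n "Nickname<email_address>" | gcloud secrets create FROM_USER --data-file=-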
Assuming you have completed the prerequisites, here is the process when you’re ready to do a release and deploy:
- Create a branch off of ‘develop’ called ‘release/<version_number>’ (replace with your desired version number)
- Merge ‘release/<version_number>’ into ‘main’
- Checkout ‘main’ branch
- Open terminal. Go to behind-the-veil-siteroot, where the Dockerfile is
cd behind-the-veil-siteroot
- Log in
docker login
- Enter login credentials
- Build the Docker image
docker buildx build -t behind-the-veil --platform linux/amd64 .
- Test that the image runs locally
docker run -e MONGO_URL=<mongo_url> -e ROOT_URL=<root_url> -e MAIL_URL=<mail_url> -e FROM_USER=<from_user_email> -p 8080:8080 behind-the-veil
- Replace the environment variables above with our potentially sensitive values, or omit them if you know you don’t need them for testing
- If it fails, then debug, rebuild, and test again, else proceed
Note: the following example code snippets assume the following:
- Docker username = joshualoongwy
- Dockerhub repo name = behind-the-veil
- Version number = 2.0.1
- GCP project location = australia-southeast2
- GCP project id = behind-the-veil
- GCP Artifact Registry repo name = behind-the-veil-docker
Adjust each command according to your needs
- Tag Docker image to target Dockerhub repo
docker tag behind-the-veil joshualoongwy/behind-the-veil:2.0.1
- Push Docker image to Dockerhub repo
docker push joshualoongwy/behind-the-veil:2.0.1
- Go to GCP and open Cloud Shell. The following steps are performed in a Cloud Shell terminal. Configure authentication if necessary
gcloud auth configure-docker australia-southeast2-docker.pkg.dev
- Pull Docker image from Dockerhub
docker pull joshualoongwy/behind-the-veil:2.0.1
- Tag Docker image to target Google Artifact Registry repo
docker tag joshualoongwy/behind-the-veil:2.0.1 australia-southeast2-docker.pkg.dev/behind-the-veil/behind-the-veil-docker/behind-the-veil:2.0.1
- Push Docker image to Google Artifact Registry
docker push australia-southeast2-docker.pkg.dev/behind-the-veil/behind-the-veil-docker/behind-the-veil:2.0.1
- Go to Google Cloud Run and fill out service form
- If first time deploying, go to ‘Create service’ and fill out Google Cloud Run form fields:
- Container Image URL: Select the image you just pushed to Artifact Registry
- Region: Select ‘australia-southeast2’
- Authentication: Select ‘Allow unauthenticated invocations’
- Environment variables: Use the necessary environment variables (some should be configured as secrets in Secret Manager), e.g.
- ROOT_URL
- MONGO_URL
- MAIL_URL
- FROM_USER
- Increase/adjust resources (e.g. memory) as necessary
- If redeploying, select your existing service and go to ‘Edit & deploy new revision’, then make necessary changes in the form
- If first time deploying, go to ‘Create service’ and fill out Google Cloud Run form fields:
- Press ‘Deploy’, wait for GCP to deploy, then go to the supplied URL to check for a successful deployment
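If you prefer the CLI over the console form, the same deployment can also be done from Cloud Shell with gcloud run deploy. This is only a sketch using the example names above; environment variables and secrets still need to be configured (via flags or the console):
gcloud run deploy behind-the-veil \
  --image=australia-southeast2-docker.pkg.dev/behind-the-veil/behind-the-veil-docker/behind-the-veil:2.0.1 \
  --region=australia-southeast2 \
  --allow-unauthenticated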
- Phillip (Kefei) Li
- Glenn Eric
- Lucas Sharp
- Laura Zhakupova
- Vicky Huang
- Anusha Yadav
- Katie (Kathryn) Mitchell
- Joshua Loong
- Trung Nguyen
- Hirun Hettigoda
- Nishan Chakma
- Nikki Li Shao
- Ryan Hicks
- Neth Botheju
- Kyle Bibby
- Nhu Nguyen
Written & tested by Neth, Josh and Nikki
Mongo error help info added by Phillip