From ba2617e929219423a929432f676fc23b669b0d5b Mon Sep 17 00:00:00 2001
From: Liam Stevens
Date: Thu, 19 May 2022 14:20:45 +0700
Subject: [PATCH] Release - 0.2.0 (#37)

* Update README.md
* Updated gettext errors
* Removed unused variables
* Fixed linting/formatting issues
* Fixed linting/formatting issues
* Update deploy_heroku.yml
* Rename deploy_heroku.yml to deploy_heroku_staging.yml
* Create deploy_heroku_prod.yaml
* Update deploy_heroku_staging.yml
* As a user I can sign up and sign in with a valid e-mail and password (#28)
* [#6 #16] As a user I can sign in with a valid e-mail and password
* Remove unused files
* Remove unused update function for User
* Add session for user after log in
* Add unique email constraint on Users table
* Add user sign in
* Add current sign in status for user
* Add user sign out functionality (not in backlog)
* Remove / refactored code
* Remove coverage check for currently unused plug
* Add controller tests
* Prepare ExMachina for testing
* User sign out now displays a message
* Change session deletion method to ensure persistence of message to user upon sign out
* Refactored password hashing function so it can be used in future tests
* Refactor fixture to use ExMachina and Faker for data generation
* Modify ExUnit tests to conform to standards
* Tidy template pages with correct formatting
* Remove comments and cleaned up code
* Merge migrations into single file for User schema
* Remove comments and cleaned up code
* Correct English used in ExUnit test case
* Move secret_key_base file to environment variable for production
* Change multiple alias identifiers from one line to multiple to satisfy codebase
* Make blank line separation more consistent in the tests
* Add feature test case for User log in
* [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting
* [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting
* [#6 #7 #16 #22] Refactor code according to feedback
* [#6 #7 #16 #22] Refactor code according to feedback
* [#6 #7 #16 #22] Remove code from AuthController to AuthHelper to reflect the functionality
* [#6 #7 #16 #22] Remove auto-generated function spec
* [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure
* [#6 #7 #16 #22] Add tests for user changesets for negative paths
* [#6 #7 #16 #22] Add tests for user changesets for negative paths
* [#6 #7 #16 #22] Change refute to assert false for testing outcomes
* [#6 #7 #16 #22] Changed from pattern matching to double equals to match exact output when required. Move value being tested against to the right side
* [#3] [UI] As a user, I can upload a CSV file containing keywords which will then be used to search on Google (#31)
* [#6 #16] As a user I can sign in with a valid e-mail and password
* Remove unused files
* Remove unused update function for User
* Add session for user after log in
* Add unique email constraint on Users table
* Add user sign in
* Add current sign in status for user
* Add user sign out functionality (not in backlog)
* Remove / refactored code
* Remove coverage check for currently unused plug
* Add controller tests
* Prepare ExMachina for testing
* User sign out now displays a message
* Change session deletion method to ensure persistence of message to user upon sign out
* Refactored password hashing function so it can be used in future tests
* Refactor fixture to use ExMachina and Faker for data generation
* Modify ExUnit tests to conform to standards
* Tidy template pages with correct formatting
* Remove comments and cleaned up code
* Merge migrations into single file for User schema
* Remove comments and cleaned up code
* Correct English used in ExUnit test case
* Move secret_key_base file to environment variable for production
* Change multiple alias identifiers from one line to multiple to satisfy codebase
* Make blank line separation more consistent in the tests
* Add feature test case for User log in
* [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting
* [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting
* [#6 #7 #16 #22] Refactor code according to feedback
* [#6 #7 #16 #22] Refactor code according to feedback
* [#6 #7 #16 #22] Remove code from AuthController to AuthHelper to reflect the functionality
* [#6 #7 #16 #22] Remove auto-generated function spec
* [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure
* [#6 #7 #16 #22] Add tests for user changesets for negative paths
* [#6 #7 #16 #22] Add tests for user changesets for negative paths
* [#3] Add endpoint and controller for handling keyword upload
* [#3] Add template files for uploading files, including upload form
* [#3] Add link to keywords page in navigation
* [#3] Remove coverall and comments from authenticated plug to prepare for tests
* [#3] Add tests for keyword controller and test csv file
* [#3] Add tests for ensure_authenticated plug
* Resolved merge conflict
* [#3] Format code
* [#3] Change keywords fixture file name and changed template to show 1000 keywords limit
* [#3] Remove blank line and re-order assert tests for ensure_authenticated plug tests
* [#3] Add an additional test to ensure unauthenticated users are unable to upload a keywords file
* [#3] Format test
* [#18] [Backend] As a user, I can upload a CSV file containing keywords which will be stored (#33)
* [#6 #16] As a user I can sign in with a valid e-mail and password
* Remove unused files
* Remove unused update function for User
* Add session for user after log in
* Add unique email constraint on Users table
* Add user sign in
* Add current sign in status for user
* Add user sign out functionality (not in backlog)
* Remove / refactored code
* Remove coverage check for currently unused plug
* Add controller tests
* Prepare ExMachina for testing
* User sign out now displays a message
* Change session deletion method to ensure persistence of message to user upon sign out
* Refactored password hashing function so it can be used in future tests
* Refactor fixture to use ExMachina and Faker for data generation
* Modify ExUnit tests to conform to standards
* Tidy template pages with correct formatting
* Remove comments and cleaned up code
* Merge migrations into single file for User schema
* Remove comments and cleaned up code
* Correct English used in ExUnit test case
* Move secret_key_base file to environment variable for production
* Change multiple alias identifiers from one line to multiple to satisfy codebase
* Make blank line separation more consistent in the tests
* Add feature test case for User log in
* [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting
* [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting
* [#6 #7 #16 #22] Refactor code according to feedback
* [#6 #7 #16 #22] Refactor code according to feedback
* [#6 #7 #16 #22] Remove code from AuthController to AuthHelper to reflect the functionality
* [#6 #7 #16 #22] Remove auto-generated function spec
* [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure
* [#6 #7 #16 #22] Add tests for user changesets for negative paths
* [#6 #7 #16 #22] Add tests for user changesets for negative paths
* [#3] Add endpoint and controller for handling keyword upload
* [#3] Add template files for uploading files, including upload form
* [#3] Add link to keywords page in navigation
* [#3] Remove coverall and comments from authenticated plug to prepare for tests
* [#3] Add tests for keyword controller and test csv file
* [#3] Add tests for ensure_authenticated plug
* Resolved merge conflict
* [#3] Format code
* [#3] Change keywords fixture file name and changed template to show 1000 keywords limit
* [#3] Remove blank line and re-order assert tests for ensure_authenticated plug tests
* [#3] Add an additional test to ensure unauthenticated users are unable to upload a keywords file
* [#3] Format test
* [#18] Add NimbleCSV
* [#18] Add Keyword Controller and helper function for validate/parse csv
* [#18] Add Keyword Controller tests and additional test files for invalid cases
* [#18] Change invalid file format to invalid file extension
* [#18] Initial KeywordUpload Schema, associations and tests
* [#18] Initial code to carry out the mass insertions of keywords into the table for the User. Needs refactor wip
* [#18] Refactor KeywordUpload changeset to use __MODULE__ as default argument
* [#18] Slight refactor of keyword saving for user wip
* Refactor Keyword context name and add one test
* [#18] Change alias to fix formatting errors
* [#18] Add test to Keyword Controller to verify an upload of two keywords returns the correct count to the user
* [#18] Add additional empty line for csv files
* [#18] Change from using length to Enum.count() for counting list size
* [#18] Change from using string field to text for keyword html storage to remove character limit
* [#18] Add positive test result for KeywordUpload changeset
* [#18] Remove unnecessary conn.halts from keyword controller
* [#18] Remove comments and changed grammar in test cases for Keywords
* [#18] Change name and status fields of KeywordUpload to not accept null
* [#18] Refactor parsing of keywords into correct structure for bulk inserts wip
* [#18] Add two further KeywordUpload changeset tests to ensure a KeywordUpload has to have an existing user
* [#1] [#20] As a user, I can view a list of my previously uploaded keywords (#35)
* [#1] Add Context function to retrieve list of uploaded keywords for a particular user
* [#1] Add Controller and template to show the list of uploaded keywords for the user
* [#1] Change github action trigger from Pull Request to Push to allow staging and prod
* [#1] Change github action trigger from Pull Request to Push to allow staging and prod
* [#1] Add Uploaded field to display for each KeywordUpload and format using Calendar module
* [#1] Add test for KeywordView for formatting timestamp
* [#1] Change Repo.list_all to return the inserted Keywords
* [#1] Write tests for fetching KeywordUploads for a particular user
* [#1] Add KeywordUpload Factory to tests for listing KeywordUploads for a User
* [#1] Refactor keyword test using pipe operator to make it cleaner
* [#1] Fix formatting on keywords index template file
* [#1] Clean up keyword template file
* [#1] Remove external Calendar library dependencies due to built-in functionality in Elixir
* [#1] Made keyword test title more explicit
* Remove prod.secret.exs config import to allow deployment (#36)
---
 ...eroku_prod.yaml => deploy_heroku_prod.yml} | 1 +
 .github/workflows/deploy_heroku_staging.yml | 1 +
 .github/workflows/test.yml | 7 +-
 config/prod.exs | 2 -
 .../accounts/schemas/user.ex | 3 +
 .../keywords/keyword.ex | 56 +++++++++
 .../keywords/schemas/keyword_upload.ex | 27 +++++
 .../controllers/keyword_controller.ex | 33 ++++++
 .../helpers/keyword_helper.ex | 34 ++++++
 .../plugs/ensure_authenticated.ex | 7 +-
 lib/google_search_data_viewer_web/router.ex | 8 ++
 .../templates/keyword/form.html.heex | 9 ++
 .../templates/keyword/index.html.heex | 25 ++++
 .../templates/layout/app.html.heex | 1 +
 .../views/keyword_view.ex | 7 ++
 mix.exs | 1 +
 mix.lock | 1 +
 .../20220511095327_create_keyword_uploads.exs | 15 +++
 ...11101610_keywordupload_belongs_to_user.exs | 9 ++
 test/factories/keyword_upload_factory.ex | 18 +++
 .../keywords/keyword_test.exs | 43 +++++++
 .../keywords/schemas/keyword_upload_test.exs | 39 +++++++
 .../controllers/keyword_controller_test.exs | 108 ++++++++++++++++++
 .../plugs/ensure_authenticated_test.exs | 26 +++++
 .../views/keyword_view_test.exs | 13 +++
 test/support/factory.ex | 1 +
 .../fixtures/keywords/empty_keywords.csv | 0
 .../keywords/invalid_extension_keywords.txt | 3 +
 .../fixtures/keywords/valid_keywords.csv | 3 +
 .../fixtures/keywords/valid_two_keywords.csv | 2 +
 30 files changed, 494 insertions(+), 9
deletions(-) rename .github/workflows/{deploy_heroku_prod.yaml => deploy_heroku_prod.yml} (97%) create mode 100644 lib/google_search_data_viewer/keywords/keyword.ex create mode 100644 lib/google_search_data_viewer/keywords/schemas/keyword_upload.ex create mode 100644 lib/google_search_data_viewer_web/controllers/keyword_controller.ex create mode 100644 lib/google_search_data_viewer_web/helpers/keyword_helper.ex create mode 100644 lib/google_search_data_viewer_web/templates/keyword/form.html.heex create mode 100644 lib/google_search_data_viewer_web/templates/keyword/index.html.heex create mode 100644 lib/google_search_data_viewer_web/views/keyword_view.ex create mode 100644 priv/repo/migrations/20220511095327_create_keyword_uploads.exs create mode 100644 priv/repo/migrations/20220511101610_keywordupload_belongs_to_user.exs create mode 100644 test/factories/keyword_upload_factory.ex create mode 100644 test/google_search_data_viewer/keywords/keyword_test.exs create mode 100644 test/google_search_data_viewer/keywords/schemas/keyword_upload_test.exs create mode 100644 test/google_search_data_viewer_web/controllers/keyword_controller_test.exs create mode 100644 test/google_search_data_viewer_web/plugs/ensure_authenticated_test.exs create mode 100644 test/google_search_data_viewer_web/views/keyword_view_test.exs create mode 100644 test/support/fixtures/keywords/empty_keywords.csv create mode 100644 test/support/fixtures/keywords/invalid_extension_keywords.txt create mode 100644 test/support/fixtures/keywords/valid_keywords.csv create mode 100644 test/support/fixtures/keywords/valid_two_keywords.csv diff --git a/.github/workflows/deploy_heroku_prod.yaml b/.github/workflows/deploy_heroku_prod.yml similarity index 97% rename from .github/workflows/deploy_heroku_prod.yaml rename to .github/workflows/deploy_heroku_prod.yml index db359f1..d159440 100644 --- a/.github/workflows/deploy_heroku_prod.yaml +++ b/.github/workflows/deploy_heroku_prod.yml @@ -18,6 +18,7 @@ env: jobs: deploy: name: Deploy to Heroku production + permissions: write-all runs-on: ubuntu-latest steps: diff --git a/.github/workflows/deploy_heroku_staging.yml b/.github/workflows/deploy_heroku_staging.yml index 89f1eed..78d84f3 100644 --- a/.github/workflows/deploy_heroku_staging.yml +++ b/.github/workflows/deploy_heroku_staging.yml @@ -18,6 +18,7 @@ env: jobs: deploy: name: Deploy to Heroku staging + permissions: write-all runs-on: ubuntu-latest steps: diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml index 951e6ab..271513f 100644 --- a/.github/workflows/test.yml +++ b/.github/workflows/test.yml @@ -1,6 +1,6 @@ name: Test -on: pull_request +on: push env: OTP_VERSION: "24.2.2" @@ -12,6 +12,7 @@ env: jobs: install_and_compile_dependencies: name: Install and Compile Dependencies + permissions: write-all runs-on: ubuntu-latest @@ -69,6 +70,7 @@ jobs: lint_codebase: name: Linting + permissions: write-all needs: install_and_compile_dependencies @@ -153,6 +155,7 @@ jobs: test_database_seeds: name: Test database seeds + permissions: write-all needs: lint_codebase @@ -231,6 +234,7 @@ jobs: unit_test: name: Unit test + permissions: write-all needs: lint_codebase @@ -313,6 +317,7 @@ jobs: feature_test: name: Feature test + permissions: write-all needs: lint_codebase diff --git a/config/prod.exs b/config/prod.exs index c2a2a5f..96fa5b0 100644 --- a/config/prod.exs +++ b/config/prod.exs @@ -50,5 +50,3 @@ config :logger, level: :info # Check `Plug.SSL` for all available options in `force_ssl`. 
config :google_search_data_viewer, GoogleSearchDataViewerWeb.Endpoint, force_ssl: [rewrite_on: [:x_forwarded_proto]] - -import_config "prod.secret.exs" diff --git a/lib/google_search_data_viewer/accounts/schemas/user.ex b/lib/google_search_data_viewer/accounts/schemas/user.ex index 87713ad..4005456 100644 --- a/lib/google_search_data_viewer/accounts/schemas/user.ex +++ b/lib/google_search_data_viewer/accounts/schemas/user.ex @@ -4,12 +4,15 @@ defmodule GoogleSearchDataViewer.Accounts.Schemas.User do import Ecto.Changeset alias GoogleSearchDataViewer.Accounts.Passwords + alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload schema "users" do field :email, :string field :password, :string, virtual: true field :hashed_password, :string + has_many :keyword_uploads, KeywordUpload + timestamps() end diff --git a/lib/google_search_data_viewer/keywords/keyword.ex b/lib/google_search_data_viewer/keywords/keyword.ex new file mode 100644 index 0000000..12dced4 --- /dev/null +++ b/lib/google_search_data_viewer/keywords/keyword.ex @@ -0,0 +1,56 @@ +defmodule GoogleSearchDataViewer.Keywords.Keyword do + import Ecto.Query, warn: false + + alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload + alias GoogleSearchDataViewer.Repo + + def get_keyword_uploads_for_user(user) do + KeywordUpload + |> where(user_id: ^user.id) + |> order_by(desc: :inserted_at) + |> select([:id, :user_id, :name, :status, :updated_at, :inserted_at]) + |> Repo.all() + end + + def insert_keyword_uploads(attrs) do + Repo.insert_all(KeywordUpload, attrs, returning: true) + end + + def create_keyword_uploads(keywords, user) do + keywords + |> process_keyword_params(user) + |> insert_keyword_uploads() + end + + defp process_keyword_params(keywords, user) do + keywords + |> Enum.map(fn keyword -> + create_params_for_keyword_and_user(keyword, user.id) + end) + |> Enum.map(fn params -> create_changeset_and_parse(params) end) + |> Enum.map(&Map.from_struct/1) + |> Enum.map(fn params -> Map.drop(params, [:__meta__, :user, :id]) end) + |> Enum.map(fn params -> insert_timestamps(params) end) + end + + defp create_params_for_keyword_and_user(keyword, user_id) do + %{ + name: keyword, + user_id: user_id + } + end + + defp create_changeset_and_parse(params) do + params + |> KeywordUpload.changeset() + |> Ecto.Changeset.apply_changes() + end + + defp insert_timestamps(params) do + current_date_time = NaiveDateTime.truncate(NaiveDateTime.utc_now(), :second) + + params + |> Map.put(:inserted_at, current_date_time) + |> Map.put(:updated_at, current_date_time) + end +end diff --git a/lib/google_search_data_viewer/keywords/schemas/keyword_upload.ex b/lib/google_search_data_viewer/keywords/schemas/keyword_upload.ex new file mode 100644 index 0000000..18dcd5f --- /dev/null +++ b/lib/google_search_data_viewer/keywords/schemas/keyword_upload.ex @@ -0,0 +1,27 @@ +defmodule GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload do + use Ecto.Schema + + import Ecto.Changeset + + alias GoogleSearchDataViewer.Accounts.Schemas.User + + schema "keyword_uploads" do + field :name, :string + field :html, :string + + field :status, Ecto.Enum, + values: [:pending, :inprogress, :completed, :failed], + default: :pending + + belongs_to :user, User + + timestamps() + end + + def changeset(keyword_upload \\ %__MODULE__{}, attrs) do + keyword_upload + |> cast(attrs, [:name, :user_id]) + |> validate_required([:name, :user_id]) + |> assoc_constraint(:user) + end +end diff --git a/lib/google_search_data_viewer_web/controllers/keyword_controller.ex 
b/lib/google_search_data_viewer_web/controllers/keyword_controller.ex new file mode 100644 index 0000000..55df0f7 --- /dev/null +++ b/lib/google_search_data_viewer_web/controllers/keyword_controller.ex @@ -0,0 +1,33 @@ +defmodule GoogleSearchDataViewerWeb.KeywordController do + use GoogleSearchDataViewerWeb, :controller + + alias GoogleSearchDataViewer.Keywords.Keyword + alias GoogleSearchDataViewerWeb.KeywordHelper + + def index(conn, _params) do + keywords = Keyword.get_keyword_uploads_for_user(conn.assigns.current_user) + render(conn, "index.html", keywords: keywords) + end + + def upload(conn, %{"file" => file}) do + case KeywordHelper.validate_and_parse_keyword_file(file) do + {:ok, keywords} -> + {keyword_count, _keywords} = + Keyword.create_keyword_uploads(keywords, conn.assigns.current_user) + + conn + |> put_flash(:info, "File successfully uploaded. #{keyword_count} keywords uploaded.") + |> redirect(to: Routes.keyword_path(conn, :index)) + + {:error, :invalid_extension} -> + conn + |> put_flash(:error, "File extension invalid, csv only") + |> redirect(to: Routes.keyword_path(conn, :index)) + + {:error, :invalid_length} -> + conn + |> put_flash(:error, "Length invalid. 1-1000 keywords only") + |> redirect(to: Routes.keyword_path(conn, :index)) + end + end +end diff --git a/lib/google_search_data_viewer_web/helpers/keyword_helper.ex b/lib/google_search_data_viewer_web/helpers/keyword_helper.ex new file mode 100644 index 0000000..1d6c241 --- /dev/null +++ b/lib/google_search_data_viewer_web/helpers/keyword_helper.ex @@ -0,0 +1,34 @@ +defmodule GoogleSearchDataViewerWeb.KeywordHelper do + alias NimbleCSV.RFC4180, as: CSV + + @max_keyword_upload_count 1000 + + def validate_and_parse_keyword_file(file) do + with true <- file_valid?(file), + {:ok, keywords} <- parse_keyword_file(file) do + {:ok, keywords} + else + false -> {:error, :invalid_extension} + :error -> {:error, :invalid_length} + end + end + + defp file_valid?(file), do: file.content_type == "text/csv" + + defp parse_keyword_file(file) do + keywords = + file.path + |> File.stream!() + |> CSV.parse_stream(skip_headers: false) + |> Enum.to_list() + |> List.flatten() + + keywords_length = Enum.count(keywords) + + if keywords_length > 0 && keywords_length <= @max_keyword_upload_count do + {:ok, keywords} + else + :error + end + end +end diff --git a/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex b/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex index bd59941..737c123 100644 --- a/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex +++ b/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex @@ -1,6 +1,3 @@ -# TODO: Remove # coveralls-ignore-stop - -# coveralls-ignore-start defmodule GoogleSearchDataViewerWeb.EnsureAuthenticatedPlug do import Plug.Conn import Phoenix.Controller @@ -22,7 +19,7 @@ defmodule GoogleSearchDataViewerWeb.EnsureAuthenticatedPlug do |> redirect(to: Routes.page_path(conn, :index)) |> halt() - true -> + _user -> conn end end @@ -31,5 +28,3 @@ defmodule GoogleSearchDataViewerWeb.EnsureAuthenticatedPlug do defp get_user(user_id), do: Account.get_user(user_id) end - -# coveralls-ignore-stop diff --git a/lib/google_search_data_viewer_web/router.ex b/lib/google_search_data_viewer_web/router.ex index 9cf02b7..f2792b5 100644 --- a/lib/google_search_data_viewer_web/router.ex +++ b/lib/google_search_data_viewer_web/router.ex @@ -36,6 +36,14 @@ defmodule GoogleSearchDataViewerWeb.Router do resources "/sessions", SessionController, only: [:create, :new, :delete] end 
+ scope "/keywords", GoogleSearchDataViewerWeb do + pipe_through [:browser, :authorized] + + get "/", KeywordController, :index + + post "/upload", KeywordController, :upload + end + # Other scopes may use custom stacks. # scope "/api", GoogleSearchDataViewerWeb do # pipe_through :api diff --git a/lib/google_search_data_viewer_web/templates/keyword/form.html.heex b/lib/google_search_data_viewer_web/templates/keyword/form.html.heex new file mode 100644 index 0000000..650efbe --- /dev/null +++ b/lib/google_search_data_viewer_web/templates/keyword/form.html.heex @@ -0,0 +1,9 @@ +<.form id="keyword-upload-form" let={f} for={@conn} action={@action} multipart={true}> + <%= label f, :File %> + <%= file_input f, :file, accept: ".csv", required: true %> + <%= error_tag f, :file %> + +
+ <%= submit "Upload" %> +
+ diff --git a/lib/google_search_data_viewer_web/templates/keyword/index.html.heex b/lib/google_search_data_viewer_web/templates/keyword/index.html.heex new file mode 100644 index 0000000..22662e4 --- /dev/null +++ b/lib/google_search_data_viewer_web/templates/keyword/index.html.heex @@ -0,0 +1,25 @@ +
+

Keywords

+ +

Select a csv file to upload keywords.

+

Maximum keywords per file: 1000

+ <%= render "form.html", Map.put(assigns, :action, Routes.keyword_path(@conn, :upload)) %> + +

Keywords

+
+ + + + + + + <%= for keyword <- @keywords do %> + + + + + + <% end %> +
KeywordStatusUploaded at
<%= keyword.name %><%= keyword.status %><%= format_date_time(keyword.inserted_at) %>
+
+
diff --git a/lib/google_search_data_viewer_web/templates/layout/app.html.heex b/lib/google_search_data_viewer_web/templates/layout/app.html.heex index 95724f0..85166bf 100644 --- a/lib/google_search_data_viewer_web/templates/layout/app.html.heex +++ b/lib/google_search_data_viewer_web/templates/layout/app.html.heex @@ -4,6 +4,7 @@ <%= if @current_user do %>
  • Signed in as <%= @current_user.email %>
  • <%= link "Sign out", to: Routes.session_path(@conn, :delete, @current_user), method: :delete %>
  • +
  • <%= link "Keywords", to: Routes.keyword_path(@conn, :index), method: :get %>
  • <% else %>
  • <%= link "Sign up", to: Routes.user_path(@conn, :new), method: :get %>
  • <%= link "Sign in", to: Routes.session_path(@conn, :new), method: :get %>
  • diff --git a/lib/google_search_data_viewer_web/views/keyword_view.ex b/lib/google_search_data_viewer_web/views/keyword_view.ex new file mode 100644 index 0000000..a7b85a1 --- /dev/null +++ b/lib/google_search_data_viewer_web/views/keyword_view.ex @@ -0,0 +1,7 @@ +defmodule GoogleSearchDataViewerWeb.KeywordView do + use GoogleSearchDataViewerWeb, :view + + def format_date_time(datetime) do + Calendar.strftime(datetime, "%d.%m.%y %H:%M:%S") + end +end diff --git a/mix.exs b/mix.exs index 0cec4c8..82379f5 100644 --- a/mix.exs +++ b/mix.exs @@ -53,6 +53,7 @@ defmodule GoogleSearchDataViewer.MixProject do {:gettext, "~> 0.18"}, {:jason, "~> 1.2"}, {:mimic, "~> 1.7.2", [only: :test]}, + {:nimble_csv, "~> 1.1"}, {:nimble_template, "~> 4.1", only: :dev, runtime: false}, {:oban, "~> 2.12.0"}, {:phoenix, "~> 1.6.6"}, diff --git a/mix.lock b/mix.lock index 63392e7..2f01661 100644 --- a/mix.lock +++ b/mix.lock @@ -37,6 +37,7 @@ "mime": {:hex, :mime, "2.0.2", "0b9e1a4c840eafb68d820b0e2158ef5c49385d17fb36855ac6e7e087d4b1dcc5", [:mix], [], "hexpm", "e6a3f76b4c277739e36c2e21a2c640778ba4c3846189d5ab19f97f126df5f9b7"}, "mimerl": {:hex, :mimerl, "1.2.0", "67e2d3f571088d5cfd3e550c383094b47159f3eee8ffa08e64106cdf5e981be3", [:rebar3], [], "hexpm", "f278585650aa581986264638ebf698f8bb19df297f66ad91b18910dfc6e19323"}, "mimic": {:hex, :mimic, "1.7.2", "27007e4e0c746ddb6d56a386c40585088b35621ae2d7167160e8c3283e8cd585", [:mix], [], "hexpm", "e4d40550523841055aa469f5125d124ab89ce8b2d3686cab908b98dff5e6111b"}, + "nimble_csv": {:hex, :nimble_csv, "1.2.0", "4e26385d260c61eba9d4412c71cea34421f296d5353f914afe3f2e71cce97722", [:mix], [], "hexpm", "d0628117fcc2148178b034044c55359b26966c6eaa8e2ce15777be3bbc91b12a"}, "nimble_template": {:hex, :nimble_template, "4.1.1", "ed9b223fe0cf03f07de76cdaed49b25ba5d9e884232fe6995c1f807cc4e935c7", [:make, :mix], [{:httpoison, "~> 1.7", [hex: :httpoison, repo: "hexpm", optional: false]}, {:jason, "~> 1.2", [hex: :jason, repo: "hexpm", optional: false]}, {:phoenix, "~> 1.6.6", [hex: :phoenix, repo: "hexpm", optional: false]}], "hexpm", "5ddb640de99bbf8046afa17ab0fd784173d3dff51f1d9d441f0c76e28bb96378"}, "oban": {:hex, :oban, "2.12.0", "bd5a283770c6ab1284aad81e5566cfb89f4119b08f52508d92d73551283c8789", [:mix], [{:ecto_sql, "~> 3.6", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16", [hex: :postgrex, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "1557b7b046b13c0b5360f55a9fb7e56975f6b5f8247e56f2c54575bd95435ca0"}, "parse_trans": {:hex, :parse_trans, "3.3.1", "16328ab840cc09919bd10dab29e431da3af9e9e7e7e6f0089dd5a2d2820011d8", [:rebar3], [], "hexpm", "07cd9577885f56362d414e8c4c4e6bdf10d43a8767abb92d24cbe8b24c54888b"}, diff --git a/priv/repo/migrations/20220511095327_create_keyword_uploads.exs b/priv/repo/migrations/20220511095327_create_keyword_uploads.exs new file mode 100644 index 0000000..49fd76c --- /dev/null +++ b/priv/repo/migrations/20220511095327_create_keyword_uploads.exs @@ -0,0 +1,15 @@ +defmodule GoogleSearchDataViewer.Repo.Migrations.CreateKeywordUploads do + use Ecto.Migration + + def change do + create table(:keyword_uploads) do + add :name, :string, null: false + add :html, :text + add :status, :string, null: false + + timestamps() + end + + create index(:keyword_uploads, [:name]) + end +end diff --git a/priv/repo/migrations/20220511101610_keywordupload_belongs_to_user.exs 
b/priv/repo/migrations/20220511101610_keywordupload_belongs_to_user.exs new file mode 100644 index 0000000..4200726 --- /dev/null +++ b/priv/repo/migrations/20220511101610_keywordupload_belongs_to_user.exs @@ -0,0 +1,9 @@ +defmodule GoogleSearchDataViewer.Repo.Migrations.KeywordUploadBelongsToUser do + use Ecto.Migration + + def change do + alter table(:keyword_uploads) do + add :user_id, references(:users) + end + end +end diff --git a/test/factories/keyword_upload_factory.ex b/test/factories/keyword_upload_factory.ex new file mode 100644 index 0000000..0a15797 --- /dev/null +++ b/test/factories/keyword_upload_factory.ex @@ -0,0 +1,18 @@ +defmodule GoogleSearchDataViewer.KeywordUploadFactory do + alias Faker.Food.En + alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload + + defmacro __using__(_opts) do + quote do + def keyword_upload_factory(attrs) do + name = attrs[:name] || En.dish() + user = attrs[:user] + + %KeywordUpload{ + name: name, + user: user + } + end + end + end +end diff --git a/test/google_search_data_viewer/keywords/keyword_test.exs b/test/google_search_data_viewer/keywords/keyword_test.exs new file mode 100644 index 0000000..794604a --- /dev/null +++ b/test/google_search_data_viewer/keywords/keyword_test.exs @@ -0,0 +1,43 @@ +defmodule GoogleSearchDataViewer.Keywords.KeywordTest do + use GoogleSearchDataViewer.DataCase, async: true + + alias GoogleSearchDataViewer.Keywords.Keyword + + describe "create_keyword_uploads/2" do + test "given a valid list of keywords and a user, creates keywords for the user" do + user = insert(:user) + keywords = ["dog", "cat", "fish"] + + {keyword_count, _keywords} = Keyword.create_keyword_uploads(keywords, user) + + assert keyword_count == Enum.count(keywords) + end + end + + describe "get_keyword_uploads_for_user/1" do + test "given an existing user with uploaded keywords, lists keywords for the user" do + user = insert(:user) + + keywords = ["dog", "cat", "fish"] + + keyword_uploads = + Enum.map(keywords, fn keyword -> insert(:keyword_upload, name: keyword, user: user) end) + + assert keyword_uploads == + user + |> Keyword.get_keyword_uploads_for_user() + |> Repo.preload(:user) + end + + test "given an existing user with no uploaded keywords, returns an empty list" do + user1 = insert(:user) + user2 = insert(:user) + + keywords = ["dog", "cat", "fish"] + + Enum.each(keywords, fn keyword -> insert(:keyword_upload, name: keyword, user: user1) end) + + assert Keyword.get_keyword_uploads_for_user(user2) == [] + end + end +end diff --git a/test/google_search_data_viewer/keywords/schemas/keyword_upload_test.exs b/test/google_search_data_viewer/keywords/schemas/keyword_upload_test.exs new file mode 100644 index 0000000..1588146 --- /dev/null +++ b/test/google_search_data_viewer/keywords/schemas/keyword_upload_test.exs @@ -0,0 +1,39 @@ +defmodule GoogleSearchDataViewer.Keywords.Schemas.KeywordUploadTest do + use GoogleSearchDataViewer.DataCase, async: true + + alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload + + describe "changeset/2" do + test "given a changeset with keyword name and user id, returns valid changeset" do + keyword_upload_changeset = + KeywordUpload.changeset(%KeywordUpload{}, %{name: "cat", user_id: 1}) + + assert keyword_upload_changeset.valid? == true + end + + test "given an empty changeset with empty fields, fails to validate" do + keyword_upload_changeset = KeywordUpload.changeset(%KeywordUpload{}, %{}) + + assert keyword_upload_changeset.valid? 
== false + end + + test "given a changeset with keyword name and existing user id, validates" do + %{id: user_id} = insert(:user) + + keyword_upload_changeset = + KeywordUpload.changeset(%KeywordUpload{}, %{name: "cat", user_id: user_id}) + + assert keyword_upload_changeset.valid? == true + assert keyword_upload_changeset.changes == %{name: "cat", user_id: user_id} + end + + test "given an invalid user, return invalid changeset" do + keyword_upload_changeset = + KeywordUpload.changeset(%KeywordUpload{}, %{name: "cat", user_id: -1}) + + assert {:error, changeset} = Repo.insert(keyword_upload_changeset) + + assert errors_on(changeset) == %{user: ["does not exist"]} + end + end +end diff --git a/test/google_search_data_viewer_web/controllers/keyword_controller_test.exs b/test/google_search_data_viewer_web/controllers/keyword_controller_test.exs new file mode 100644 index 0000000..b060ebf --- /dev/null +++ b/test/google_search_data_viewer_web/controllers/keyword_controller_test.exs @@ -0,0 +1,108 @@ +defmodule GoogleSearchDataViewerWeb.KeywordControllerTest do + use GoogleSearchDataViewerWeb.ConnCase, async: true + + describe "GET /keywords" do + test "renders keywords page", %{conn: conn} do + user = insert(:user) + + conn = + conn + |> init_test_session(user_id: user.id) + |> get(Routes.keyword_path(conn, :index)) + + assert html_response(conn, 200) =~ "Select a csv file to upload keywords" + end + end + + describe "POST /keywords/upload" do + test "given a valid csv file extension, uploads the file", %{conn: conn} do + file = %Plug.Upload{ + path: "test/support/fixtures/keywords/valid_keywords.csv", + filename: "valid_keywords.csv", + content_type: "text/csv" + } + + user = insert(:user) + + conn = + conn + |> init_test_session(user_id: user.id) + |> post("/keywords/upload", %{:file => file}) + + assert get_flash(conn, :info) =~ "File successfully uploaded" + assert redirected_to(conn, 302) =~ "/keywords" + end + + test "given a valid csv file extension with two keywords, returns an uploaded count of two", %{ + conn: conn + } do + file = %Plug.Upload{ + path: "test/support/fixtures/keywords/valid_two_keywords.csv", + filename: "valid_two_keywords.csv", + content_type: "text/csv" + } + + user = insert(:user) + + conn = + conn + |> init_test_session(user_id: user.id) + |> post("/keywords/upload", %{:file => file}) + + assert get_flash(conn, :info) =~ "File successfully uploaded. 2 keywords uploaded" + assert redirected_to(conn, 302) =~ "/keywords" + end + + test "given an empty keywords file, fails to upload the file", %{conn: conn} do + file = %Plug.Upload{ + path: "test/support/fixtures/keywords/empty_keywords.csv", + filename: "empty_keywords.csv", + content_type: "text/csv" + } + + user = insert(:user) + + conn = + conn + |> init_test_session(user_id: user.id) + |> post("/keywords/upload", %{:file => file}) + + assert get_flash(conn, :error) =~ "Length invalid. 
1-1000 keywords only" + assert redirected_to(conn, 302) =~ "/keywords" + end + + test "given an invalid file extension, fails to upload the file", %{conn: conn} do + file = %Plug.Upload{ + path: "test/support/fixtures/keywords/invalid_extension_keywords.txt", + filename: "invalid_extension_keywords.txt", + content_type: "text/plain" + } + + user = insert(:user) + + conn = + conn + |> init_test_session(user_id: user.id) + |> post("/keywords/upload", %{:file => file}) + + assert get_flash(conn, :error) =~ "File extension invalid, csv only" + assert redirected_to(conn, 302) =~ "/keywords" + end + + test "given an unauthenticated user, fails to upload and redirects to the home page", %{ + conn: conn + } do + file = %Plug.Upload{ + path: "test/support/fixtures/keywords/valid_keywords.csv", + filename: "valid_keywords.csv", + content_type: "text/csv" + } + + conn = post(conn, "/keywords/upload", %{:file => file}) + + assert conn.halted == true + assert get_flash(conn, :error) =~ "Please sign in to use this service" + assert redirected_to(conn, 302) =~ "/" + end + end +end diff --git a/test/google_search_data_viewer_web/plugs/ensure_authenticated_test.exs b/test/google_search_data_viewer_web/plugs/ensure_authenticated_test.exs new file mode 100644 index 0000000..8bb28a2 --- /dev/null +++ b/test/google_search_data_viewer_web/plugs/ensure_authenticated_test.exs @@ -0,0 +1,26 @@ +defmodule GoogleSearchDataViewerWeb.EnsureAuthenticatedPlugTest do + use GoogleSearchDataViewerWeb.ConnCase, async: true + + alias GoogleSearchDataViewerWeb.EnsureAuthenticatedPlug + + describe "init/1" do + test "returns given options" do + assert EnsureAuthenticatedPlug.init([]) == [] + end + end + + describe "call/2" do + test "given an unauthenticated user, renders home page", %{conn: conn} do + conn = + conn + |> init_test_session(%{}) + |> fetch_flash() + |> EnsureAuthenticatedPlug.call([]) + + assert conn.assigns.current_user == nil + assert conn.halted == true + assert get_flash(conn, :error) =~ "Please sign in to use this service" + assert redirected_to(conn, 302) =~ "/" + end + end +end diff --git a/test/google_search_data_viewer_web/views/keyword_view_test.exs b/test/google_search_data_viewer_web/views/keyword_view_test.exs new file mode 100644 index 0000000..c7adb70 --- /dev/null +++ b/test/google_search_data_viewer_web/views/keyword_view_test.exs @@ -0,0 +1,13 @@ +defmodule GoogleSearchDataViewerWeb.KeywordViewTest do + use GoogleSearchDataViewerWeb.ConnCase, async: true + + alias GoogleSearchDataViewerWeb.KeywordView + + describe "format_date_time/1" do + test "given a date and time, returns it in the format of d.m.y H:M:S" do + {_, current_date_time} = NaiveDateTime.new(1970, 1, 1, 0, 0, 0) + + assert KeywordView.format_date_time(current_date_time) == "01.01.70 00:00:00" + end + end +end diff --git a/test/support/factory.ex b/test/support/factory.ex index 49fb49d..b06c67a 100644 --- a/test/support/factory.ex +++ b/test/support/factory.ex @@ -1,4 +1,5 @@ defmodule GoogleSearchDataViewer.Factory do use ExMachina.Ecto, repo: GoogleSearchDataViewer.Repo + use GoogleSearchDataViewer.KeywordUploadFactory use GoogleSearchDataViewer.UserFactory end diff --git a/test/support/fixtures/keywords/empty_keywords.csv b/test/support/fixtures/keywords/empty_keywords.csv new file mode 100644 index 0000000..e69de29 diff --git a/test/support/fixtures/keywords/invalid_extension_keywords.txt b/test/support/fixtures/keywords/invalid_extension_keywords.txt new file mode 100644 index 0000000..6da7038 --- /dev/null +++ 
b/test/support/fixtures/keywords/invalid_extension_keywords.txt @@ -0,0 +1,3 @@ +dog +cow +sheep diff --git a/test/support/fixtures/keywords/valid_keywords.csv b/test/support/fixtures/keywords/valid_keywords.csv new file mode 100644 index 0000000..6da7038 --- /dev/null +++ b/test/support/fixtures/keywords/valid_keywords.csv @@ -0,0 +1,3 @@ +dog +cow +sheep diff --git a/test/support/fixtures/keywords/valid_two_keywords.csv b/test/support/fixtures/keywords/valid_two_keywords.csv new file mode 100644 index 0000000..a8ef6a9 --- /dev/null +++ b/test/support/fixtures/keywords/valid_two_keywords.csv @@ -0,0 +1,2 @@ +dog +cow
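
Reviewer note, not part of the patch: a minimal IEx-style sketch of how the pieces added above fit together. It assumes the application is running with at least one user row in the database (the email below is only a placeholder) and reuses one of the CSV fixtures introduced in this change; every function called is defined in this diff or in Ecto/Plug.

    alias GoogleSearchDataViewer.Accounts.Schemas.User
    alias GoogleSearchDataViewer.Keywords.Keyword
    alias GoogleSearchDataViewer.Repo
    alias GoogleSearchDataViewerWeb.KeywordHelper

    # Any existing user; the email is a placeholder for whatever is seeded locally.
    user = Repo.get_by!(User, email: "someone@example.com")

    # The same %Plug.Upload{} shape the controller receives from the multipart form.
    upload = %Plug.Upload{
      path: "test/support/fixtures/keywords/valid_keywords.csv",
      filename: "valid_keywords.csv",
      content_type: "text/csv"
    }

    # Returns {:ok, keywords} for a csv file with 1..1000 rows, otherwise
    # {:error, :invalid_extension} or {:error, :invalid_length}.
    {:ok, keywords} = KeywordHelper.validate_and_parse_keyword_file(upload)

    # Builds a changeset per keyword, applies it, adds inserted_at/updated_at by hand
    # (Repo.insert_all does not fill timestamps), then bulk-inserts; the result is the
    # {count, records} tuple from Repo.insert_all, so count is 3 for this fixture.
    {3, _keyword_uploads} = Keyword.create_keyword_uploads(keywords, user)

    # The index action then lists the user's uploads newest-first.
    Keyword.get_keyword_uploads_for_user(user)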