From ba2617e929219423a929432f676fc23b669b0d5b Mon Sep 17 00:00:00 2001 From: Liam Stevens Date: Thu, 19 May 2022 14:20:45 +0700 Subject: [PATCH 1/2] Release - 0.2.0 (#37) * Update README.md * Updated gettext errors * Removed unused variables * Fixed linting/formatting issues * Fixed linting/formatting issues * Update deploy_heroku.yml * Rename deploy_heroku.yml to deploy_heroku_staging.yml * Create deploy_heroku_prod.yaml * Update deploy_heroku_staging.yml * As a user I can sign up and sign in with a valid e-mail and password (#28) * [#6 #16] As a user I can sign in with a valid e-mail and password * Remove unused files * Remove unsused update function for User * Add session for user after log in * Add unique email constraint on Users table * Add user sign in * Add current sign in status for user * Add user sign out functionality (not in backlog) * Remove / refactored code * Remove coverage check for currently unused plug * Add controller tests * Prepare ExMachina for testing * User sign out now displays a message * Change session deletion method to ensure persistence of message to user upon sign out * Refactored password hashing function so it can be used in future tests * Refactor fixture to use ExMachina and Faker for data generation * Modify ExUnit tests to conform to standards * Tidy template pages with correct formatting * Remove comments and cleaned up code * Merge migrations into single file for User schema * Remove comments and cleaned up code * Correct English used in ExUnit test case * Move secret_key-base file to environment variable for production * Change multiple alias identifiers from one line to multiple to satisy codebase * Make blank line seperation more consistent in the tests * Add feature test case for User log in * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Remove code form AuthController to AuthHelper to reflect the functionality * [#6 #7 #16 #22] Remove auto-generated function spec * [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Change refute to assert false for testing outcomese * [#6 #7 #16 #22] Changed from pattern matching to double equals to match exact output when required. 
Move value being tested against to the right side * [#3] [UI] As a user, I can upload a CSV file containing keywords which will then be used to search on Google (#31) * [#6 #16] As a user I can sign in with a valid e-mail and password * Remove unused files * Remove unsused update function for User * Add session for user after log in * Add unique email constraint on Users table * Add user sign in * Add current sign in status for user * Add user sign out functionality (not in backlog) * Remove / refactored code * Remove coverage check for currently unused plug * Add controller tests * Prepare ExMachina for testing * User sign out now displays a message * Change session deletion method to ensure persistence of message to user upon sign out * Refactored password hashing function so it can be used in future tests * Refactor fixture to use ExMachina and Faker for data generation * Modify ExUnit tests to conform to standards * Tidy template pages with correct formatting * Remove comments and cleaned up code * Merge migrations into single file for User schema * Remove comments and cleaned up code * Correct English used in ExUnit test case * Move secret_key-base file to environment variable for production * Change multiple alias identifiers from one line to multiple to satisy codebase * Make blank line seperation more consistent in the tests * Add feature test case for User log in * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Remove code form AuthController to AuthHelper to reflect the functionality * [#6 #7 #16 #22] Remove auto-generated function spec * [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#3] Add endpoint and controller for handling keyword upload * [#3] Add template files for uploading files, including upload form * [#3] Add link to keywords page in navigation * [#3] Remove coverall and comments from authenticated plug to prepare for tests * [#3] Add tests for keyword controller and test csv file * [#3] Add tests for ensure_authenticated plug * Resolved merge conflict * [#3] Format code * [#3] Change keywords fixture file name and changed template to show 1000 keywords limit * [#3] Remove blank line and re-order assert tests for ensure_authenticated plug tests * [#3] Add an additional test to ensure unauthenticated users are unable to upload a keywords file * [#3] Format test * [#18] [Backend] As a user, I can upload a CSV file containing keywords which will be stored (#33) * [#6 #16] As a user I can sign in with a valid e-mail and password * Remove unused files * Remove unsused update function for User * Add session for user after log in * Add unique email constraint on Users table * Add user sign in * Add current sign in status for user * Add user sign out functionality (not in backlog) * Remove / refactored code * Remove coverage check for currently unused plug * Add controller tests * Prepare ExMachina for testing * User sign out now displays a message * Change session deletion method to ensure persistence of message to user upon sign out * Refactored password hashing function so it can be used in future tests * Refactor fixture to use ExMachina and 
Faker for data generation * Modify ExUnit tests to conform to standards * Tidy template pages with correct formatting * Remove comments and cleaned up code * Merge migrations into single file for User schema * Remove comments and cleaned up code * Correct English used in ExUnit test case * Move secret_key-base file to environment variable for production * Change multiple alias identifiers from one line to multiple to satisy codebase * Make blank line seperation more consistent in the tests * Add feature test case for User log in * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Remove code form AuthController to AuthHelper to reflect the functionality * [#6 #7 #16 #22] Remove auto-generated function spec * [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#3] Add endpoint and controller for handling keyword upload * [#3] Add template files for uploading files, including upload form * [#3] Add link to keywords page in navigation * [#3] Remove coverall and comments from authenticated plug to prepare for tests * [#3] Add tests for keyword controller and test csv file * [#3] Add tests for ensure_authenticated plug * Resolved merge conflict * [#3] Format code * [#3] Change keywords fixture file name and changed template to show 1000 keywords limit * [#3] Remove blank line and re-order assert tests for ensure_authenticated plug tests * [#3] Add an additional test to ensure unauthenticated users are unable to upload a keywords file * [#3] Format test * [#18] Add NimbleCSV * [#18] Add Keyword Controller and helper function for validate/parse csv * [#18] Add Keyword Controller tests and additional test files for invalid cases * [#18] Change invalid file format to invalid file extension * [#18] Initial KeywordUpload Schema, associations and tests * [#18] Initial code to carry out the mass insertions of keywords into the table for the User. 
Needs refactor wip * [#18] Refactor KeywordUpload changeset to Use __MODULE__ as default argument * #[18] Slight refactor of keyword saving for user wip * Refactor Keyword context name and add one test * [#18] Change alias to fix formatting errors * [#18] Add test to Keyword Controller to verify an uplaod of two keywords returns the correct count to the user * [#18] Add additional empty line for csv files * [#18] Change from using length to Enum.count() for counting list size * [#18] Change from using string field to text for keyword html storage to remove character limit * [#18] Add positive test result for KeywordUpload changeset * [#18] Remove unnecessary conn.halts from keyword controller * [#18] Remove comments and changed grammar in test cases for Keywords * [#18] Change name and status fields of KeywordUpload to be to not accept null * [#18] Refactor parsing of keywords into correct structure for bulk inserts wip * [#18] Add two further KeywordUpload changeset tests to ensure a KeywordUpload has to have an existing user * [#1] [#20] As a user, I can view a list of my previously uploaded keywords (#35) * #[1] Add Context function to retreive list of uploaded keywords for a particular user * #[1] Add Controller and template to show the list of uploaded keywords for the user * #[1] Change github action trigger from Pull Request to Push to allow staging and prod * #[1] Change github action trigger from Pull Request to Push to allow staging and prod * [#1] Add Uploaded field to display for each KeywordUpload and format using Calendar module * #[1] Add test for KeywordView for formatting timestamp * #[1] Change Repo.list_all to return the inserted Keywords * #[1] Write tests for fetching KeywordUploads for a particular user * #[1] Add KeywordUpload Factory to tests for listing KeywordUploads for a User * [#1] Refactor keyword test using pipe operator to make it cleaner * [#1] Fix formatting on keywords index template file * [#1] Clean up keyword template file * [#1] Remove external Calendar library dependencies due to built-in functionality in Elxiir * [#1] Made keyword test title more explicit * Remove prod.secret.exe config import to allow deployment (#36) --- ...eroku_prod.yaml => deploy_heroku_prod.yml} | 1 + .github/workflows/deploy_heroku_staging.yml | 1 + .github/workflows/test.yml | 7 +- config/prod.exs | 2 - .../accounts/schemas/user.ex | 3 + .../keywords/keyword.ex | 56 +++++++++ .../keywords/schemas/keyword_upload.ex | 27 +++++ .../controllers/keyword_controller.ex | 33 ++++++ .../helpers/keyword_helper.ex | 34 ++++++ .../plugs/ensure_authenticated.ex | 7 +- lib/google_search_data_viewer_web/router.ex | 8 ++ .../templates/keyword/form.html.heex | 9 ++ .../templates/keyword/index.html.heex | 25 ++++ .../templates/layout/app.html.heex | 1 + .../views/keyword_view.ex | 7 ++ mix.exs | 1 + mix.lock | 1 + .../20220511095327_create_keyword_uploads.exs | 15 +++ ...11101610_keywordupload_belongs_to_user.exs | 9 ++ test/factories/keyword_upload_factory.ex | 18 +++ .../keywords/keyword_test.exs | 43 +++++++ .../keywords/schemas/keyword_upload_test.exs | 39 +++++++ .../controllers/keyword_controller_test.exs | 108 ++++++++++++++++++ .../plugs/ensure_authenticated_test.exs | 26 +++++ .../views/keyword_view_test.exs | 13 +++ test/support/factory.ex | 1 + .../fixtures/keywords/empty_keywords.csv | 0 .../keywords/invalid_extension_keywords.txt | 3 + .../fixtures/keywords/valid_keywords.csv | 3 + .../fixtures/keywords/valid_two_keywords.csv | 2 + 30 files changed, 494 insertions(+), 9 
deletions(-) rename .github/workflows/{deploy_heroku_prod.yaml => deploy_heroku_prod.yml} (97%) create mode 100644 lib/google_search_data_viewer/keywords/keyword.ex create mode 100644 lib/google_search_data_viewer/keywords/schemas/keyword_upload.ex create mode 100644 lib/google_search_data_viewer_web/controllers/keyword_controller.ex create mode 100644 lib/google_search_data_viewer_web/helpers/keyword_helper.ex create mode 100644 lib/google_search_data_viewer_web/templates/keyword/form.html.heex create mode 100644 lib/google_search_data_viewer_web/templates/keyword/index.html.heex create mode 100644 lib/google_search_data_viewer_web/views/keyword_view.ex create mode 100644 priv/repo/migrations/20220511095327_create_keyword_uploads.exs create mode 100644 priv/repo/migrations/20220511101610_keywordupload_belongs_to_user.exs create mode 100644 test/factories/keyword_upload_factory.ex create mode 100644 test/google_search_data_viewer/keywords/keyword_test.exs create mode 100644 test/google_search_data_viewer/keywords/schemas/keyword_upload_test.exs create mode 100644 test/google_search_data_viewer_web/controllers/keyword_controller_test.exs create mode 100644 test/google_search_data_viewer_web/plugs/ensure_authenticated_test.exs create mode 100644 test/google_search_data_viewer_web/views/keyword_view_test.exs create mode 100644 test/support/fixtures/keywords/empty_keywords.csv create mode 100644 test/support/fixtures/keywords/invalid_extension_keywords.txt create mode 100644 test/support/fixtures/keywords/valid_keywords.csv create mode 100644 test/support/fixtures/keywords/valid_two_keywords.csv diff --git a/.github/workflows/deploy_heroku_prod.yaml b/.github/workflows/deploy_heroku_prod.yml similarity index 97% rename from .github/workflows/deploy_heroku_prod.yaml rename to .github/workflows/deploy_heroku_prod.yml index db359f1..d159440 100644 --- a/.github/workflows/deploy_heroku_prod.yaml +++ b/.github/workflows/deploy_heroku_prod.yml @@ -18,6 +18,7 @@ env: jobs: deploy: name: Deploy to Heroku production + permissions: write-all runs-on: ubuntu-latest steps: diff --git a/.github/workflows/deploy_heroku_staging.yml b/.github/workflows/deploy_heroku_staging.yml index 89f1eed..78d84f3 100644 --- a/.github/workflows/deploy_heroku_staging.yml +++ b/.github/workflows/deploy_heroku_staging.yml @@ -18,6 +18,7 @@ env: jobs: deploy: name: Deploy to Heroku staging + permissions: write-all runs-on: ubuntu-latest steps: diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml index 951e6ab..271513f 100644 --- a/.github/workflows/test.yml +++ b/.github/workflows/test.yml @@ -1,6 +1,6 @@ name: Test -on: pull_request +on: push env: OTP_VERSION: "24.2.2" @@ -12,6 +12,7 @@ env: jobs: install_and_compile_dependencies: name: Install and Compile Dependencies + permissions: write-all runs-on: ubuntu-latest @@ -69,6 +70,7 @@ jobs: lint_codebase: name: Linting + permissions: write-all needs: install_and_compile_dependencies @@ -153,6 +155,7 @@ jobs: test_database_seeds: name: Test database seeds + permissions: write-all needs: lint_codebase @@ -231,6 +234,7 @@ jobs: unit_test: name: Unit test + permissions: write-all needs: lint_codebase @@ -313,6 +317,7 @@ jobs: feature_test: name: Feature test + permissions: write-all needs: lint_codebase diff --git a/config/prod.exs b/config/prod.exs index c2a2a5f..96fa5b0 100644 --- a/config/prod.exs +++ b/config/prod.exs @@ -50,5 +50,3 @@ config :logger, level: :info # Check `Plug.SSL` for all available options in `force_ssl`. 
config :google_search_data_viewer, GoogleSearchDataViewerWeb.Endpoint, force_ssl: [rewrite_on: [:x_forwarded_proto]] - -import_config "prod.secret.exs" diff --git a/lib/google_search_data_viewer/accounts/schemas/user.ex b/lib/google_search_data_viewer/accounts/schemas/user.ex index 87713ad..4005456 100644 --- a/lib/google_search_data_viewer/accounts/schemas/user.ex +++ b/lib/google_search_data_viewer/accounts/schemas/user.ex @@ -4,12 +4,15 @@ defmodule GoogleSearchDataViewer.Accounts.Schemas.User do import Ecto.Changeset alias GoogleSearchDataViewer.Accounts.Passwords + alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload schema "users" do field :email, :string field :password, :string, virtual: true field :hashed_password, :string + has_many :keyword_uploads, KeywordUpload + timestamps() end diff --git a/lib/google_search_data_viewer/keywords/keyword.ex b/lib/google_search_data_viewer/keywords/keyword.ex new file mode 100644 index 0000000..12dced4 --- /dev/null +++ b/lib/google_search_data_viewer/keywords/keyword.ex @@ -0,0 +1,56 @@ +defmodule GoogleSearchDataViewer.Keywords.Keyword do + import Ecto.Query, warn: false + + alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload + alias GoogleSearchDataViewer.Repo + + def get_keyword_uploads_for_user(user) do + KeywordUpload + |> where(user_id: ^user.id) + |> order_by(desc: :inserted_at) + |> select([:id, :user_id, :name, :status, :updated_at, :inserted_at]) + |> Repo.all() + end + + def insert_keyword_uploads(attrs) do + Repo.insert_all(KeywordUpload, attrs, returning: true) + end + + def create_keyword_uploads(keywords, user) do + keywords + |> process_keyword_params(user) + |> insert_keyword_uploads() + end + + defp process_keyword_params(keywords, user) do + keywords + |> Enum.map(fn keyword -> + create_params_for_keyword_and_user(keyword, user.id) + end) + |> Enum.map(fn params -> create_changeset_and_parse(params) end) + |> Enum.map(&Map.from_struct/1) + |> Enum.map(fn params -> Map.drop(params, [:__meta__, :user, :id]) end) + |> Enum.map(fn params -> insert_timestamps(params) end) + end + + defp create_params_for_keyword_and_user(keyword, user_id) do + %{ + name: keyword, + user_id: user_id + } + end + + defp create_changeset_and_parse(params) do + params + |> KeywordUpload.changeset() + |> Ecto.Changeset.apply_changes() + end + + defp insert_timestamps(params) do + current_date_time = NaiveDateTime.truncate(NaiveDateTime.utc_now(), :second) + + params + |> Map.put(:inserted_at, current_date_time) + |> Map.put(:updated_at, current_date_time) + end +end diff --git a/lib/google_search_data_viewer/keywords/schemas/keyword_upload.ex b/lib/google_search_data_viewer/keywords/schemas/keyword_upload.ex new file mode 100644 index 0000000..18dcd5f --- /dev/null +++ b/lib/google_search_data_viewer/keywords/schemas/keyword_upload.ex @@ -0,0 +1,27 @@ +defmodule GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload do + use Ecto.Schema + + import Ecto.Changeset + + alias GoogleSearchDataViewer.Accounts.Schemas.User + + schema "keyword_uploads" do + field :name, :string + field :html, :string + + field :status, Ecto.Enum, + values: [:pending, :inprogress, :completed, :failed], + default: :pending + + belongs_to :user, User + + timestamps() + end + + def changeset(keyword_upload \\ %__MODULE__{}, attrs) do + keyword_upload + |> cast(attrs, [:name, :user_id]) + |> validate_required([:name, :user_id]) + |> assoc_constraint(:user) + end +end diff --git a/lib/google_search_data_viewer_web/controllers/keyword_controller.ex 
b/lib/google_search_data_viewer_web/controllers/keyword_controller.ex new file mode 100644 index 0000000..55df0f7 --- /dev/null +++ b/lib/google_search_data_viewer_web/controllers/keyword_controller.ex @@ -0,0 +1,33 @@ +defmodule GoogleSearchDataViewerWeb.KeywordController do + use GoogleSearchDataViewerWeb, :controller + + alias GoogleSearchDataViewer.Keywords.Keyword + alias GoogleSearchDataViewerWeb.KeywordHelper + + def index(conn, _params) do + keywords = Keyword.get_keyword_uploads_for_user(conn.assigns.current_user) + render(conn, "index.html", keywords: keywords) + end + + def upload(conn, %{"file" => file}) do + case KeywordHelper.validate_and_parse_keyword_file(file) do + {:ok, keywords} -> + {keyword_count, _keywords} = + Keyword.create_keyword_uploads(keywords, conn.assigns.current_user) + + conn + |> put_flash(:info, "File successfully uploaded. #{keyword_count} keywords uploaded.") + |> redirect(to: Routes.keyword_path(conn, :index)) + + {:error, :invalid_extension} -> + conn + |> put_flash(:error, "File extension invalid, csv only") + |> redirect(to: Routes.keyword_path(conn, :index)) + + {:error, :invalid_length} -> + conn + |> put_flash(:error, "Length invalid. 1-1000 keywords only") + |> redirect(to: Routes.keyword_path(conn, :index)) + end + end +end diff --git a/lib/google_search_data_viewer_web/helpers/keyword_helper.ex b/lib/google_search_data_viewer_web/helpers/keyword_helper.ex new file mode 100644 index 0000000..1d6c241 --- /dev/null +++ b/lib/google_search_data_viewer_web/helpers/keyword_helper.ex @@ -0,0 +1,34 @@ +defmodule GoogleSearchDataViewerWeb.KeywordHelper do + alias NimbleCSV.RFC4180, as: CSV + + @max_keyword_upload_count 1000 + + def validate_and_parse_keyword_file(file) do + with true <- file_valid?(file), + {:ok, keywords} <- parse_keyword_file(file) do + {:ok, keywords} + else + false -> {:error, :invalid_extension} + :error -> {:error, :invalid_length} + end + end + + defp file_valid?(file), do: file.content_type == "text/csv" + + defp parse_keyword_file(file) do + keywords = + file.path + |> File.stream!() + |> CSV.parse_stream(skip_headers: false) + |> Enum.to_list() + |> List.flatten() + + keywords_length = Enum.count(keywords) + + if keywords_length > 0 && keywords_length <= @max_keyword_upload_count do + {:ok, keywords} + else + :error + end + end +end diff --git a/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex b/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex index bd59941..737c123 100644 --- a/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex +++ b/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex @@ -1,6 +1,3 @@ -# TODO: Remove # coveralls-ignore-stop - -# coveralls-ignore-start defmodule GoogleSearchDataViewerWeb.EnsureAuthenticatedPlug do import Plug.Conn import Phoenix.Controller @@ -22,7 +19,7 @@ defmodule GoogleSearchDataViewerWeb.EnsureAuthenticatedPlug do |> redirect(to: Routes.page_path(conn, :index)) |> halt() - true -> + _user -> conn end end @@ -31,5 +28,3 @@ defmodule GoogleSearchDataViewerWeb.EnsureAuthenticatedPlug do defp get_user(user_id), do: Account.get_user(user_id) end - -# coveralls-ignore-stop diff --git a/lib/google_search_data_viewer_web/router.ex b/lib/google_search_data_viewer_web/router.ex index 9cf02b7..f2792b5 100644 --- a/lib/google_search_data_viewer_web/router.ex +++ b/lib/google_search_data_viewer_web/router.ex @@ -36,6 +36,14 @@ defmodule GoogleSearchDataViewerWeb.Router do resources "/sessions", SessionController, only: [:create, :new, :delete] end 
+ scope "/keywords", GoogleSearchDataViewerWeb do + pipe_through [:browser, :authorized] + + get "/", KeywordController, :index + + post "/upload", KeywordController, :upload + end + # Other scopes may use custom stacks. # scope "/api", GoogleSearchDataViewerWeb do # pipe_through :api diff --git a/lib/google_search_data_viewer_web/templates/keyword/form.html.heex b/lib/google_search_data_viewer_web/templates/keyword/form.html.heex new file mode 100644 index 0000000..650efbe --- /dev/null +++ b/lib/google_search_data_viewer_web/templates/keyword/form.html.heex @@ -0,0 +1,9 @@ +<.form id="keyword-upload-form" let={f} for={@conn} action={@action} multipart={true}> + <%= label f, :File %> + <%= file_input f, :file, accept: ".csv", required: true %> + <%= error_tag f, :file %> + +
+ <%= submit "Upload" %> +
+ diff --git a/lib/google_search_data_viewer_web/templates/keyword/index.html.heex b/lib/google_search_data_viewer_web/templates/keyword/index.html.heex new file mode 100644 index 0000000..22662e4 --- /dev/null +++ b/lib/google_search_data_viewer_web/templates/keyword/index.html.heex @@ -0,0 +1,25 @@ +
+

Keywords

+ +

Select a csv file to upload keywords.

+

Maximum keywords per file: 1000

+ <%= render "form.html", Map.put(assigns, :action, Routes.keyword_path(@conn, :upload)) %> + +

Keywords

+
+ + + + + + + <%= for keyword <- @keywords do %> + + + + + + <% end %> +
KeywordStatusUploaded at
<%= keyword.name %><%= keyword.status %><%= format_date_time(keyword.inserted_at) %>
+
+
diff --git a/lib/google_search_data_viewer_web/templates/layout/app.html.heex b/lib/google_search_data_viewer_web/templates/layout/app.html.heex index 95724f0..85166bf 100644 --- a/lib/google_search_data_viewer_web/templates/layout/app.html.heex +++ b/lib/google_search_data_viewer_web/templates/layout/app.html.heex @@ -4,6 +4,7 @@ <%= if @current_user do %> <li>Signed in as <%= @current_user.email %></li> <li><%= link "Sign out", to: Routes.session_path(@conn, :delete, @current_user), method: :delete %></li> + <li><%= link "Keywords", to: Routes.keyword_path(@conn, :index), method: :get %></li> <% else %> <li><%= link "Sign up", to: Routes.user_path(@conn, :new), method: :get %></li> <li><%= link "Sign in", to: Routes.session_path(@conn, :new), method: :get %></li>
  • diff --git a/lib/google_search_data_viewer_web/views/keyword_view.ex b/lib/google_search_data_viewer_web/views/keyword_view.ex new file mode 100644 index 0000000..a7b85a1 --- /dev/null +++ b/lib/google_search_data_viewer_web/views/keyword_view.ex @@ -0,0 +1,7 @@ +defmodule GoogleSearchDataViewerWeb.KeywordView do + use GoogleSearchDataViewerWeb, :view + + def format_date_time(datetime) do + Calendar.strftime(datetime, "%d.%m.%y %H:%M:%S") + end +end diff --git a/mix.exs b/mix.exs index 0cec4c8..82379f5 100644 --- a/mix.exs +++ b/mix.exs @@ -53,6 +53,7 @@ defmodule GoogleSearchDataViewer.MixProject do {:gettext, "~> 0.18"}, {:jason, "~> 1.2"}, {:mimic, "~> 1.7.2", [only: :test]}, + {:nimble_csv, "~> 1.1"}, {:nimble_template, "~> 4.1", only: :dev, runtime: false}, {:oban, "~> 2.12.0"}, {:phoenix, "~> 1.6.6"}, diff --git a/mix.lock b/mix.lock index 63392e7..2f01661 100644 --- a/mix.lock +++ b/mix.lock @@ -37,6 +37,7 @@ "mime": {:hex, :mime, "2.0.2", "0b9e1a4c840eafb68d820b0e2158ef5c49385d17fb36855ac6e7e087d4b1dcc5", [:mix], [], "hexpm", "e6a3f76b4c277739e36c2e21a2c640778ba4c3846189d5ab19f97f126df5f9b7"}, "mimerl": {:hex, :mimerl, "1.2.0", "67e2d3f571088d5cfd3e550c383094b47159f3eee8ffa08e64106cdf5e981be3", [:rebar3], [], "hexpm", "f278585650aa581986264638ebf698f8bb19df297f66ad91b18910dfc6e19323"}, "mimic": {:hex, :mimic, "1.7.2", "27007e4e0c746ddb6d56a386c40585088b35621ae2d7167160e8c3283e8cd585", [:mix], [], "hexpm", "e4d40550523841055aa469f5125d124ab89ce8b2d3686cab908b98dff5e6111b"}, + "nimble_csv": {:hex, :nimble_csv, "1.2.0", "4e26385d260c61eba9d4412c71cea34421f296d5353f914afe3f2e71cce97722", [:mix], [], "hexpm", "d0628117fcc2148178b034044c55359b26966c6eaa8e2ce15777be3bbc91b12a"}, "nimble_template": {:hex, :nimble_template, "4.1.1", "ed9b223fe0cf03f07de76cdaed49b25ba5d9e884232fe6995c1f807cc4e935c7", [:make, :mix], [{:httpoison, "~> 1.7", [hex: :httpoison, repo: "hexpm", optional: false]}, {:jason, "~> 1.2", [hex: :jason, repo: "hexpm", optional: false]}, {:phoenix, "~> 1.6.6", [hex: :phoenix, repo: "hexpm", optional: false]}], "hexpm", "5ddb640de99bbf8046afa17ab0fd784173d3dff51f1d9d441f0c76e28bb96378"}, "oban": {:hex, :oban, "2.12.0", "bd5a283770c6ab1284aad81e5566cfb89f4119b08f52508d92d73551283c8789", [:mix], [{:ecto_sql, "~> 3.6", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16", [hex: :postgrex, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "1557b7b046b13c0b5360f55a9fb7e56975f6b5f8247e56f2c54575bd95435ca0"}, "parse_trans": {:hex, :parse_trans, "3.3.1", "16328ab840cc09919bd10dab29e431da3af9e9e7e7e6f0089dd5a2d2820011d8", [:rebar3], [], "hexpm", "07cd9577885f56362d414e8c4c4e6bdf10d43a8767abb92d24cbe8b24c54888b"}, diff --git a/priv/repo/migrations/20220511095327_create_keyword_uploads.exs b/priv/repo/migrations/20220511095327_create_keyword_uploads.exs new file mode 100644 index 0000000..49fd76c --- /dev/null +++ b/priv/repo/migrations/20220511095327_create_keyword_uploads.exs @@ -0,0 +1,15 @@ +defmodule GoogleSearchDataViewer.Repo.Migrations.CreateKeywordUploads do + use Ecto.Migration + + def change do + create table(:keyword_uploads) do + add :name, :string, null: false + add :html, :text + add :status, :string, null: false + + timestamps() + end + + create index(:keyword_uploads, [:name]) + end +end diff --git a/priv/repo/migrations/20220511101610_keywordupload_belongs_to_user.exs 
b/priv/repo/migrations/20220511101610_keywordupload_belongs_to_user.exs new file mode 100644 index 0000000..4200726 --- /dev/null +++ b/priv/repo/migrations/20220511101610_keywordupload_belongs_to_user.exs @@ -0,0 +1,9 @@ +defmodule GoogleSearchDataViewer.Repo.Migrations.KeywordUploadBelongsToUser do + use Ecto.Migration + + def change do + alter table(:keyword_uploads) do + add :user_id, references(:users) + end + end +end diff --git a/test/factories/keyword_upload_factory.ex b/test/factories/keyword_upload_factory.ex new file mode 100644 index 0000000..0a15797 --- /dev/null +++ b/test/factories/keyword_upload_factory.ex @@ -0,0 +1,18 @@ +defmodule GoogleSearchDataViewer.KeywordUploadFactory do + alias Faker.Food.En + alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload + + defmacro __using__(_opts) do + quote do + def keyword_upload_factory(attrs) do + name = attrs[:name] || En.dish() + user = attrs[:user] + + %KeywordUpload{ + name: name, + user: user + } + end + end + end +end diff --git a/test/google_search_data_viewer/keywords/keyword_test.exs b/test/google_search_data_viewer/keywords/keyword_test.exs new file mode 100644 index 0000000..794604a --- /dev/null +++ b/test/google_search_data_viewer/keywords/keyword_test.exs @@ -0,0 +1,43 @@ +defmodule GoogleSearchDataViewer.Keywords.KeywordTest do + use GoogleSearchDataViewer.DataCase, async: true + + alias GoogleSearchDataViewer.Keywords.Keyword + + describe "create_keyword_uploads/2" do + test "given a valid list of keywords and a user, creates keywords for the user" do + user = insert(:user) + keywords = ["dog", "cat", "fish"] + + {keyword_count, _keywords} = Keyword.create_keyword_uploads(keywords, user) + + assert keyword_count == Enum.count(keywords) + end + end + + describe "get_keyword_uploads_for_user/1" do + test "given an existing user with uploaded keywords, lists keywords for the user" do + user = insert(:user) + + keywords = ["dog", "cat", "fish"] + + keyword_uploads = + Enum.map(keywords, fn keyword -> insert(:keyword_upload, name: keyword, user: user) end) + + assert keyword_uploads == + user + |> Keyword.get_keyword_uploads_for_user() + |> Repo.preload(:user) + end + + test "given an existing user with no uploaded keywords, returns an empty list" do + user1 = insert(:user) + user2 = insert(:user) + + keywords = ["dog", "cat", "fish"] + + Enum.each(keywords, fn keyword -> insert(:keyword_upload, name: keyword, user: user1) end) + + assert Keyword.get_keyword_uploads_for_user(user2) == [] + end + end +end diff --git a/test/google_search_data_viewer/keywords/schemas/keyword_upload_test.exs b/test/google_search_data_viewer/keywords/schemas/keyword_upload_test.exs new file mode 100644 index 0000000..1588146 --- /dev/null +++ b/test/google_search_data_viewer/keywords/schemas/keyword_upload_test.exs @@ -0,0 +1,39 @@ +defmodule GoogleSearchDataViewer.Keywords.Schemas.KeywordUploadTest do + use GoogleSearchDataViewer.DataCase, async: true + + alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload + + describe "changeset/2" do + test "given a changeset with keyword name and user id, returns valid changeset" do + keyword_upload_changeset = + KeywordUpload.changeset(%KeywordUpload{}, %{name: "cat", user_id: 1}) + + assert keyword_upload_changeset.valid? == true + end + + test "given an empty changeset with empty fields, fails to validate" do + keyword_upload_changeset = KeywordUpload.changeset(%KeywordUpload{}, %{}) + + assert keyword_upload_changeset.valid? 
== false + end + + test "given a changeset with keyword name and existing user id, validates" do + %{id: user_id} = insert(:user) + + keyword_upload_changeset = + KeywordUpload.changeset(%KeywordUpload{}, %{name: "cat", user_id: user_id}) + + assert keyword_upload_changeset.valid? == true + assert keyword_upload_changeset.changes == %{name: "cat", user_id: user_id} + end + + test "given an invalid user, return invalid changeset" do + keyword_upload_changeset = + KeywordUpload.changeset(%KeywordUpload{}, %{name: "cat", user_id: -1}) + + assert {:error, changeset} = Repo.insert(keyword_upload_changeset) + + assert errors_on(changeset) == %{user: ["does not exist"]} + end + end +end diff --git a/test/google_search_data_viewer_web/controllers/keyword_controller_test.exs b/test/google_search_data_viewer_web/controllers/keyword_controller_test.exs new file mode 100644 index 0000000..b060ebf --- /dev/null +++ b/test/google_search_data_viewer_web/controllers/keyword_controller_test.exs @@ -0,0 +1,108 @@ +defmodule GoogleSearchDataViewerWeb.KeywordControllerTest do + use GoogleSearchDataViewerWeb.ConnCase, async: true + + describe "GET /keywords" do + test "renders keywords page", %{conn: conn} do + user = insert(:user) + + conn = + conn + |> init_test_session(user_id: user.id) + |> get(Routes.keyword_path(conn, :index)) + + assert html_response(conn, 200) =~ "Select a csv file to upload keywords" + end + end + + describe "POST /keywords/upload" do + test "given a valid csv file extension, uploads the file", %{conn: conn} do + file = %Plug.Upload{ + path: "test/support/fixtures/keywords/valid_keywords.csv", + filename: "valid_keywords.csv", + content_type: "text/csv" + } + + user = insert(:user) + + conn = + conn + |> init_test_session(user_id: user.id) + |> post("/keywords/upload", %{:file => file}) + + assert get_flash(conn, :info) =~ "File successfully uploaded" + assert redirected_to(conn, 302) =~ "/keywords" + end + + test "given a valid csv file extension with two keywords, returns an uploaded count of two", %{ + conn: conn + } do + file = %Plug.Upload{ + path: "test/support/fixtures/keywords/valid_two_keywords.csv", + filename: "valid_two_keywords.csv", + content_type: "text/csv" + } + + user = insert(:user) + + conn = + conn + |> init_test_session(user_id: user.id) + |> post("/keywords/upload", %{:file => file}) + + assert get_flash(conn, :info) =~ "File successfully uploaded. 2 keywords uploaded" + assert redirected_to(conn, 302) =~ "/keywords" + end + + test "given an empty keywords file, fails to upload the file", %{conn: conn} do + file = %Plug.Upload{ + path: "test/support/fixtures/keywords/empty_keywords.csv", + filename: "empty_keywords.csv", + content_type: "text/csv" + } + + user = insert(:user) + + conn = + conn + |> init_test_session(user_id: user.id) + |> post("/keywords/upload", %{:file => file}) + + assert get_flash(conn, :error) =~ "Length invalid. 
1-1000 keywords only" + assert redirected_to(conn, 302) =~ "/keywords" + end + + test "given an invalid file extension, fails to upload the file", %{conn: conn} do + file = %Plug.Upload{ + path: "test/support/fixtures/keywords/invalid_extension_keywords.txt", + filename: "invalid_extension_keywords.txt", + content_type: "text/plain" + } + + user = insert(:user) + + conn = + conn + |> init_test_session(user_id: user.id) + |> post("/keywords/upload", %{:file => file}) + + assert get_flash(conn, :error) =~ "File extension invalid, csv only" + assert redirected_to(conn, 302) =~ "/keywords" + end + + test "given an unauthenticated user, fails to upload and redirects to the home page", %{ + conn: conn + } do + file = %Plug.Upload{ + path: "test/support/fixtures/keywords/valid_keywords.csv", + filename: "valid_keywords.csv", + content_type: "text/csv" + } + + conn = post(conn, "/keywords/upload", %{:file => file}) + + assert conn.halted == true + assert get_flash(conn, :error) =~ "Please sign in to use this service" + assert redirected_to(conn, 302) =~ "/" + end + end +end diff --git a/test/google_search_data_viewer_web/plugs/ensure_authenticated_test.exs b/test/google_search_data_viewer_web/plugs/ensure_authenticated_test.exs new file mode 100644 index 0000000..8bb28a2 --- /dev/null +++ b/test/google_search_data_viewer_web/plugs/ensure_authenticated_test.exs @@ -0,0 +1,26 @@ +defmodule GoogleSearchDataViewerWeb.EnsureAuthenticatedPlugTest do + use GoogleSearchDataViewerWeb.ConnCase, async: true + + alias GoogleSearchDataViewerWeb.EnsureAuthenticatedPlug + + describe "init/1" do + test "returns given options" do + assert EnsureAuthenticatedPlug.init([]) == [] + end + end + + describe "call/2" do + test "given an unauthenticated user, renders home page", %{conn: conn} do + conn = + conn + |> init_test_session(%{}) + |> fetch_flash() + |> EnsureAuthenticatedPlug.call([]) + + assert conn.assigns.current_user == nil + assert conn.halted == true + assert get_flash(conn, :error) =~ "Please sign in to use this service" + assert redirected_to(conn, 302) =~ "/" + end + end +end diff --git a/test/google_search_data_viewer_web/views/keyword_view_test.exs b/test/google_search_data_viewer_web/views/keyword_view_test.exs new file mode 100644 index 0000000..c7adb70 --- /dev/null +++ b/test/google_search_data_viewer_web/views/keyword_view_test.exs @@ -0,0 +1,13 @@ +defmodule GoogleSearchDataViewerWeb.KeywordViewTest do + use GoogleSearchDataViewerWeb.ConnCase, async: true + + alias GoogleSearchDataViewerWeb.KeywordView + + describe "format_date_time/1" do + test "given a date and time, returns it in the format of d.m.y H:M:S" do + {_, current_date_time} = NaiveDateTime.new(1970, 1, 1, 0, 0, 0) + + assert KeywordView.format_date_time(current_date_time) == "01.01.70 00:00:00" + end + end +end diff --git a/test/support/factory.ex b/test/support/factory.ex index 49fb49d..b06c67a 100644 --- a/test/support/factory.ex +++ b/test/support/factory.ex @@ -1,4 +1,5 @@ defmodule GoogleSearchDataViewer.Factory do use ExMachina.Ecto, repo: GoogleSearchDataViewer.Repo + use GoogleSearchDataViewer.KeywordUploadFactory use GoogleSearchDataViewer.UserFactory end diff --git a/test/support/fixtures/keywords/empty_keywords.csv b/test/support/fixtures/keywords/empty_keywords.csv new file mode 100644 index 0000000..e69de29 diff --git a/test/support/fixtures/keywords/invalid_extension_keywords.txt b/test/support/fixtures/keywords/invalid_extension_keywords.txt new file mode 100644 index 0000000..6da7038 --- /dev/null +++ 
b/test/support/fixtures/keywords/invalid_extension_keywords.txt @@ -0,0 +1,3 @@ +dog +cow +sheep diff --git a/test/support/fixtures/keywords/valid_keywords.csv b/test/support/fixtures/keywords/valid_keywords.csv new file mode 100644 index 0000000..6da7038 --- /dev/null +++ b/test/support/fixtures/keywords/valid_keywords.csv @@ -0,0 +1,3 @@ +dog +cow +sheep diff --git a/test/support/fixtures/keywords/valid_two_keywords.csv b/test/support/fixtures/keywords/valid_two_keywords.csv new file mode 100644 index 0000000..a8ef6a9 --- /dev/null +++ b/test/support/fixtures/keywords/valid_two_keywords.csv @@ -0,0 +1,2 @@ +dog +cow From 4bed13895744abc873784389f9230ee085499e12 Mon Sep 17 00:00:00 2001 From: Liam Stevens Date: Thu, 2 Jun 2022 18:17:27 +0700 Subject: [PATCH 2/2] Release - 0.3.0 (#45) * Update README.md * Updated gettext errors * Removed unused variables * Fixed linting/formatting issues * Fixed linting/formatting issues * Update deploy_heroku.yml * Rename deploy_heroku.yml to deploy_heroku_staging.yml * Create deploy_heroku_prod.yaml * Update deploy_heroku_staging.yml * As a user I can sign up and sign in with a valid e-mail and password (#28) * [#6 #16] As a user I can sign in with a valid e-mail and password * Remove unused files * Remove unsused update function for User * Add session for user after log in * Add unique email constraint on Users table * Add user sign in * Add current sign in status for user * Add user sign out functionality (not in backlog) * Remove / refactored code * Remove coverage check for currently unused plug * Add controller tests * Prepare ExMachina for testing * User sign out now displays a message * Change session deletion method to ensure persistence of message to user upon sign out * Refactored password hashing function so it can be used in future tests * Refactor fixture to use ExMachina and Faker for data generation * Modify ExUnit tests to conform to standards * Tidy template pages with correct formatting * Remove comments and cleaned up code * Merge migrations into single file for User schema * Remove comments and cleaned up code * Correct English used in ExUnit test case * Move secret_key-base file to environment variable for production * Change multiple alias identifiers from one line to multiple to satisy codebase * Make blank line seperation more consistent in the tests * Add feature test case for User log in * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Remove code form AuthController to AuthHelper to reflect the functionality * [#6 #7 #16 #22] Remove auto-generated function spec * [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Change refute to assert false for testing outcomese * [#6 #7 #16 #22] Changed from pattern matching to double equals to match exact output when required. 
Move value being tested against to the right side * [#3] [UI] As a user, I can upload a CSV file containing keywords which will then be used to search on Google (#31) * [#6 #16] As a user I can sign in with a valid e-mail and password * Remove unused files * Remove unsused update function for User * Add session for user after log in * Add unique email constraint on Users table * Add user sign in * Add current sign in status for user * Add user sign out functionality (not in backlog) * Remove / refactored code * Remove coverage check for currently unused plug * Add controller tests * Prepare ExMachina for testing * User sign out now displays a message * Change session deletion method to ensure persistence of message to user upon sign out * Refactored password hashing function so it can be used in future tests * Refactor fixture to use ExMachina and Faker for data generation * Modify ExUnit tests to conform to standards * Tidy template pages with correct formatting * Remove comments and cleaned up code * Merge migrations into single file for User schema * Remove comments and cleaned up code * Correct English used in ExUnit test case * Move secret_key-base file to environment variable for production * Change multiple alias identifiers from one line to multiple to satisy codebase * Make blank line seperation more consistent in the tests * Add feature test case for User log in * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Remove code form AuthController to AuthHelper to reflect the functionality * [#6 #7 #16 #22] Remove auto-generated function spec * [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#3] Add endpoint and controller for handling keyword upload * [#3] Add template files for uploading files, including upload form * [#3] Add link to keywords page in navigation * [#3] Remove coverall and comments from authenticated plug to prepare for tests * [#3] Add tests for keyword controller and test csv file * [#3] Add tests for ensure_authenticated plug * Resolved merge conflict * [#3] Format code * [#3] Change keywords fixture file name and changed template to show 1000 keywords limit * [#3] Remove blank line and re-order assert tests for ensure_authenticated plug tests * [#3] Add an additional test to ensure unauthenticated users are unable to upload a keywords file * [#3] Format test * [#18] [Backend] As a user, I can upload a CSV file containing keywords which will be stored (#33) * [#6 #16] As a user I can sign in with a valid e-mail and password * Remove unused files * Remove unsused update function for User * Add session for user after log in * Add unique email constraint on Users table * Add user sign in * Add current sign in status for user * Add user sign out functionality (not in backlog) * Remove / refactored code * Remove coverage check for currently unused plug * Add controller tests * Prepare ExMachina for testing * User sign out now displays a message * Change session deletion method to ensure persistence of message to user upon sign out * Refactored password hashing function so it can be used in future tests * Refactor fixture to use ExMachina and 
Faker for data generation * Modify ExUnit tests to conform to standards * Tidy template pages with correct formatting * Remove comments and cleaned up code * Merge migrations into single file for User schema * Remove comments and cleaned up code * Correct English used in ExUnit test case * Move secret_key-base file to environment variable for production * Change multiple alias identifiers from one line to multiple to satisy codebase * Make blank line seperation more consistent in the tests * Add feature test case for User log in * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Remove code form AuthController to AuthHelper to reflect the functionality * [#6 #7 #16 #22] Remove auto-generated function spec * [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#3] Add endpoint and controller for handling keyword upload * [#3] Add template files for uploading files, including upload form * [#3] Add link to keywords page in navigation * [#3] Remove coverall and comments from authenticated plug to prepare for tests * [#3] Add tests for keyword controller and test csv file * [#3] Add tests for ensure_authenticated plug * Resolved merge conflict * [#3] Format code * [#3] Change keywords fixture file name and changed template to show 1000 keywords limit * [#3] Remove blank line and re-order assert tests for ensure_authenticated plug tests * [#3] Add an additional test to ensure unauthenticated users are unable to upload a keywords file * [#3] Format test * [#18] Add NimbleCSV * [#18] Add Keyword Controller and helper function for validate/parse csv * [#18] Add Keyword Controller tests and additional test files for invalid cases * [#18] Change invalid file format to invalid file extension * [#18] Initial KeywordUpload Schema, associations and tests * [#18] Initial code to carry out the mass insertions of keywords into the table for the User. 
Needs refactor wip * [#18] Refactor KeywordUpload changeset to Use __MODULE__ as default argument * #[18] Slight refactor of keyword saving for user wip * Refactor Keyword context name and add one test * [#18] Change alias to fix formatting errors * [#18] Add test to Keyword Controller to verify an uplaod of two keywords returns the correct count to the user * [#18] Add additional empty line for csv files * [#18] Change from using length to Enum.count() for counting list size * [#18] Change from using string field to text for keyword html storage to remove character limit * [#18] Add positive test result for KeywordUpload changeset * [#18] Remove unnecessary conn.halts from keyword controller * [#18] Remove comments and changed grammar in test cases for Keywords * [#18] Change name and status fields of KeywordUpload to be to not accept null * [#18] Refactor parsing of keywords into correct structure for bulk inserts wip * [#18] Add two further KeywordUpload changeset tests to ensure a KeywordUpload has to have an existing user * [#1] [#20] As a user, I can view a list of my previously uploaded keywords (#35) * #[1] Add Context function to retreive list of uploaded keywords for a particular user * #[1] Add Controller and template to show the list of uploaded keywords for the user * #[1] Change github action trigger from Pull Request to Push to allow staging and prod * #[1] Change github action trigger from Pull Request to Push to allow staging and prod * [#1] Add Uploaded field to display for each KeywordUpload and format using Calendar module * #[1] Add test for KeywordView for formatting timestamp * #[1] Change Repo.list_all to return the inserted Keywords * #[1] Write tests for fetching KeywordUploads for a particular user * #[1] Add KeywordUpload Factory to tests for listing KeywordUploads for a User * [#1] Refactor keyword test using pipe operator to make it cleaner * [#1] Fix formatting on keywords index template file * [#1] Clean up keyword template file * [#1] Remove external Calendar library dependencies due to built-in functionality in Elxiir * [#1] Made keyword test title more explicit * Remove prod.secret.exe config import to allow deployment (#36) * [#24] [Backend] Retrieve and store HTML response using background job on Google Search with uploaded keywords (#39) * Release - 0.2.0 (#37) * Update README.md * Updated gettext errors * Removed unused variables * Fixed linting/formatting issues * Fixed linting/formatting issues * Update deploy_heroku.yml * Rename deploy_heroku.yml to deploy_heroku_staging.yml * Create deploy_heroku_prod.yaml * Update deploy_heroku_staging.yml * As a user I can sign up and sign in with a valid e-mail and password (#28) * [#6 #16] As a user I can sign in with a valid e-mail and password * Remove unused files * Remove unsused update function for User * Add session for user after log in * Add unique email constraint on Users table * Add user sign in * Add current sign in status for user * Add user sign out functionality (not in backlog) * Remove / refactored code * Remove coverage check for currently unused plug * Add controller tests * Prepare ExMachina for testing * User sign out now displays a message * Change session deletion method to ensure persistence of message to user upon sign out * Refactored password hashing function so it can be used in future tests * Refactor fixture to use ExMachina and Faker for data generation * Modify ExUnit tests to conform to standards * Tidy template pages with correct formatting * Remove comments and cleaned up 
code * Merge migrations into single file for User schema * Remove comments and cleaned up code * Correct English used in ExUnit test case * Move secret_key-base file to environment variable for production * Change multiple alias identifiers from one line to multiple to satisy codebase * Make blank line seperation more consistent in the tests * Add feature test case for User log in * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Remove code form AuthController to AuthHelper to reflect the functionality * [#6 #7 #16 #22] Remove auto-generated function spec * [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Change refute to assert false for testing outcomese * [#6 #7 #16 #22] Changed from pattern matching to double equals to match exact output when required. Move value being tested against to the right side * [#3] [UI] As a user, I can upload a CSV file containing keywords which will then be used to search on Google (#31) * [#6 #16] As a user I can sign in with a valid e-mail and password * Remove unused files * Remove unsused update function for User * Add session for user after log in * Add unique email constraint on Users table * Add user sign in * Add current sign in status for user * Add user sign out functionality (not in backlog) * Remove / refactored code * Remove coverage check for currently unused plug * Add controller tests * Prepare ExMachina for testing * User sign out now displays a message * Change session deletion method to ensure persistence of message to user upon sign out * Refactored password hashing function so it can be used in future tests * Refactor fixture to use ExMachina and Faker for data generation * Modify ExUnit tests to conform to standards * Tidy template pages with correct formatting * Remove comments and cleaned up code * Merge migrations into single file for User schema * Remove comments and cleaned up code * Correct English used in ExUnit test case * Move secret_key-base file to environment variable for production * Change multiple alias identifiers from one line to multiple to satisy codebase * Make blank line seperation more consistent in the tests * Add feature test case for User log in * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Remove code form AuthController to AuthHelper to reflect the functionality * [#6 #7 #16 #22] Remove auto-generated function spec * [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#3] Add endpoint and controller for handling keyword upload * [#3] Add template files for uploading files, including upload form * [#3] Add link to keywords page in navigation * [#3] Remove coverall and comments from authenticated plug to prepare for tests * [#3] Add tests for 
keyword controller and test csv file * [#3] Add tests for ensure_authenticated plug * Resolved merge conflict * [#3] Format code * [#3] Change keywords fixture file name and changed template to show 1000 keywords limit * [#3] Remove blank line and re-order assert tests for ensure_authenticated plug tests * [#3] Add an additional test to ensure unauthenticated users are unable to upload a keywords file * [#3] Format test * [#18] [Backend] As a user, I can upload a CSV file containing keywords which will be stored (#33) * [#6 #16] As a user I can sign in with a valid e-mail and password * Remove unused files * Remove unsused update function for User * Add session for user after log in * Add unique email constraint on Users table * Add user sign in * Add current sign in status for user * Add user sign out functionality (not in backlog) * Remove / refactored code * Remove coverage check for currently unused plug * Add controller tests * Prepare ExMachina for testing * User sign out now displays a message * Change session deletion method to ensure persistence of message to user upon sign out * Refactored password hashing function so it can be used in future tests * Refactor fixture to use ExMachina and Faker for data generation * Modify ExUnit tests to conform to standards * Tidy template pages with correct formatting * Remove comments and cleaned up code * Merge migrations into single file for User schema * Remove comments and cleaned up code * Correct English used in ExUnit test case * Move secret_key-base file to environment variable for production * Change multiple alias identifiers from one line to multiple to satisy codebase * Make blank line seperation more consistent in the tests * Add feature test case for User log in * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Add account tests for invalid cases and fixed formatting * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Refactor code according to feedback * [#6 #7 #16 #22] Remove code form AuthController to AuthHelper to reflect the functionality * [#6 #7 #16 #22] Remove auto-generated function spec * [#6 #7 #16 #22] Moved Account context into accounts folder and account schema into its own folder to improve structure * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#6 #7 #16 #22] Add tests for user changesets for negative paths * [#3] Add endpoint and controller for handling keyword upload * [#3] Add template files for uploading files, including upload form * [#3] Add link to keywords page in navigation * [#3] Remove coverall and comments from authenticated plug to prepare for tests * [#3] Add tests for keyword controller and test csv file * [#3] Add tests for ensure_authenticated plug * Resolved merge conflict * [#3] Format code * [#3] Change keywords fixture file name and changed template to show 1000 keywords limit * [#3] Remove blank line and re-order assert tests for ensure_authenticated plug tests * [#3] Add an additional test to ensure unauthenticated users are unable to upload a keywords file * [#3] Format test * [#18] Add NimbleCSV * [#18] Add Keyword Controller and helper function for validate/parse csv * [#18] Add Keyword Controller tests and additional test files for invalid cases * [#18] Change invalid file format to invalid file extension * [#18] Initial KeywordUpload Schema, associations and tests * [#18] Initial code to carry out the mass insertions of keywords into the table for the User. 
Needs refactor wip * [#18] Refactor KeywordUpload changeset to use __MODULE__ as default argument * [#18] Slight refactor of keyword saving for user wip * Refactor Keyword context name and add one test * [#18] Change alias to fix formatting errors * [#18] Add test to Keyword Controller to verify an upload of two keywords returns the correct count to the user * [#18] Add additional empty line for csv files * [#18] Change from using length to Enum.count() for counting list size * [#18] Change from using string field to text for keyword html storage to remove character limit * [#18] Add positive test result for KeywordUpload changeset * [#18] Remove unnecessary conn.halts from keyword controller * [#18] Remove comments and changed grammar in test cases for Keywords * [#18] Change name and status fields of KeywordUpload to not accept null * [#18] Refactor parsing of keywords into correct structure for bulk inserts wip * [#18] Add two further KeywordUpload changeset tests to ensure a KeywordUpload has to have an existing user * [#1] [#20] As a user, I can view a list of my previously uploaded keywords (#35) * [#1] Add Context function to retrieve list of uploaded keywords for a particular user * [#1] Add Controller and template to show the list of uploaded keywords for the user * [#1] Change github action trigger from Pull Request to Push to allow staging and prod * [#1] Change github action trigger from Pull Request to Push to allow staging and prod * [#1] Add Uploaded field to display for each KeywordUpload and format using Calendar module * [#1] Add test for KeywordView for formatting timestamp * [#1] Change Repo.list_all to return the inserted Keywords * [#1] Write tests for fetching KeywordUploads for a particular user * [#1] Add KeywordUpload Factory to tests for listing KeywordUploads for a User * [#1] Refactor keyword test using pipe operator to make it cleaner * [#1] Fix formatting on keywords index template file * [#1] Clean up keyword template file * [#1] Remove external Calendar library dependencies due to built-in functionality in Elixir * [#1] Made keyword test title more explicit * Remove prod.secret.exe config import to allow deployment (#36) * [#24] Add client to interface with Google Search to carry out queries and receive HTML response * [#24] Add HTTPoison dependency for Google Search Client * [#24] Add KeywordUpload functions and changesets for updating the status and HTML * [#24] Add Oban job and worker for fetching HTML for KeywordUpload and updating its status * [#24] Modify KeywordController to handle creation of KeywordUpload jobs via helper function * [#24] Refactor GoogleSearchClient function name * [#24] Fix oban job not scheduling with delay * [#24] Fix oban job not scheduling with delay * [#24] Refactor delay time to function argument * [#24] Add test using ExVCR to ensure GoogleSearchClient returns valid response * [#24] Add test using ExVCR to ensure GoogleSearchClient returns valid response * [#24] Add tests for the worker that performs the retrieving of the HTML and updating the status * [#24] Remove pattern matching on HTTPoison error result due to current testability issues * [#24] Add additional tests for KeywordUpload changesets for updating the status and html * [#24] Add additional tests for KeywordUpload context functions for updating the status and html * [#24] Change from Enum.zip_with function to Enum.with_index to generate delays for each job to improve readability * [#24] Add additional assertion for the status prior to change to make it more
explicit that the status gets changed successfully * [#24] Make test title for the valid html changeset result more explicit * [#24] Change the KeywordUpload status to failed when max attempts have been reached * [#24] Create test for the job creation helper function to ensure jobs are inserted with delay * [#24] Reverted one line multi alias to conform to formatting warning * [#24] Change function guard to simpler pattern match for keyword upload worker attempts * [#24] Clean up search worker using pipes * [#24] Update job creation helper file name to include the suffix of helper * [#24] Update job creation helper file name to include the suffix of helper * [#24] Rename module SearchWorker to KeywordSearchWorker to reflect file name * [#24] Explicitly set uploaded keyword status during status update tests * [#24] Change function name insert_keyword_upload_html to update_keyword_upload_html * [#24] Clean up formatting of job_creation_helper_test * [#24] Clean up pattern matching to make code cleaner for checking keyword upload status * [#24] Add error response to GoogleSearchClient for 500 server errors with associated stub cassette test * [#24] Fix coding style on keyword search worker * [#24] Add additional error response and test for GoogleSearchClient to handle unhandled responses * [#24] Add additional error response and test for GoogleSearchClient to handle unhandled responses * [Chore] [#40] Update project structure and naming according to Nimble standards (#41) * Release - 0.2.0 (#37)
* [#40] Update .gitignore * [#40] Changed context naming to plural form, and updated folder and module naming consistency with tests * [#40] Add newline for end of .gitignore file * [#40] Refactored tests and ExMachina to build the User through the Keyword factory instead of separately in the tests * [#24] [Backend] Parse the HTML and store URL data for the Keyword Upload search for the User (#43) * [#24] Initial schema and migrations for storing URL data for Keyword Uploads * [#24] Add initial tests for validating url data into changeset * [#24] Clean up Keyword Upload factory * [#24] Change to realistic uploaded search data to ensure adwords appear on page * [#24] Remove doc comment * [#24] Initial adword parsing wip * [#24] Initial insertion of url data for keyword upload * [#24] Refactor keyword parsing * [#24] Refactor keyword parsing * [#24] Modified test to check for completed status * [#24] Add custom cassettes to test parser * [#24] Add factory for creating url data for search results * [#24] Add tests for parsing html links * [#24] Add test for creating search results from url data * [#24] Add tests for parsing html links * [#24] Add additional test data to ensure bottom ads are displayed for cassettes * [#24] Add additional test for bottom adwords parsing * [#24] Add additional tests to ensure errors are created for invalid
search url data * [#24] Refactored search_result_url_date to search_result_url * [#24] Refactor search_result_url_data to search_result_url * [#24] Refactor naming of variables and functions related to url data * [#24] Add blank line to avoid warnings for csv * [#24] Move HTTPoison and cassette config out of test files to DataCse * [#24] Move cassete files to correct location * [#24] Refactor keyword parser for readability * [#24] Clean up keyword URL parsing further * [#24] Remove custom cassettes as not required * [#24] Refactor google search result parser * [24] Change name of google url parsing function * [#24] Change vcr cassette names to clarify its usage * [#24] Change name of functions used to parse individual urls from adwords * Remove files that I accidently accepted back in merge conflict? --- .gitignore | 4 + config/config.exs | 2 +- .../account.ex => account/accounts.ex} | 6 +- .../{accounts => account}/passwords.ex | 2 +- .../{accounts => account}/schemas/user.ex | 6 +- .../keyword/google_search_client.ex | 22 ++++++ .../keyword/google_search_parser.ex | 50 +++++++++++++ .../keyword.ex => keyword/keywords.ex} | 20 ++++- .../schemas/keyword_upload.ex | 16 +++- .../keyword/schemas/search_result_url.ex | 24 ++++++ .../keyword/search_results.ex | 10 +++ .../controllers/keyword_controller.ex | 13 ++-- .../controllers/session_controller.ex | 4 +- .../controllers/user_controller.ex | 6 +- .../plugs/ensure_authenticated.ex | 4 +- .../plugs/put_current_user.ex | 4 +- lib/google_search_data_viewer_worker/.keep | 0 .../keyword/keyword_search_worker.ex | 46 ++++++++++++ .../keyword/keywords.ex | 12 +++ mix.exs | 3 +- ...220531083622_create_search_result_urls.exs | 18 +++++ test/factories/keyword_upload_factory.ex | 14 ++-- test/factories/search_result_url_factory.ex | 17 +++++ test/factories/user_factory.ex | 4 +- .../accounts_test.exs} | 26 +++---- .../schemas/user_test.exs | 4 +- .../keyword/google_search_client_test.exs | 25 +++++++ .../keyword/google_search_parser_test.exs | 48 ++++++++++++ .../keyword/keywords_test.exs | 75 +++++++++++++++++++ .../schemas/keyword_upload_test.exs | 37 ++++++++- .../schemas/search_result_url_test.exs | 48 ++++++++++++ .../keyword/search_results_test.exs | 57 ++++++++++++++ .../keywords/keyword_test.exs | 43 ----------- .../keyword/keyword_search_worker_test.exs | 52 +++++++++++++ .../keyword/keywords_test.exs | 37 +++++++++ test/support/data_case.ex | 4 + test/support/factory.ex | 1 + .../fixtures/keywords/valid_keywords.csv | 6 +- .../vcr_cassettes/keyword_with_adword.json | 41 ++++++++++ .../keyword_with_adword_and_top_adword.json | 41 ++++++++++ .../vcr_cassettes/keyword_without_adword.json | 41 ++++++++++ 41 files changed, 792 insertions(+), 101 deletions(-) rename lib/google_search_data_viewer/{accounts/account.ex => account/accounts.ex} (82%) rename lib/google_search_data_viewer/{accounts => account}/passwords.ex (75%) rename lib/google_search_data_viewer/{accounts => account}/schemas/user.ex (84%) create mode 100644 lib/google_search_data_viewer/keyword/google_search_client.ex create mode 100644 lib/google_search_data_viewer/keyword/google_search_parser.ex rename lib/google_search_data_viewer/{keywords/keyword.ex => keyword/keywords.ex} (73%) rename lib/google_search_data_viewer/{keywords => keyword}/schemas/keyword_upload.ex (50%) create mode 100644 lib/google_search_data_viewer/keyword/schemas/search_result_url.ex create mode 100644 lib/google_search_data_viewer/keyword/search_results.ex delete mode 100644 
lib/google_search_data_viewer_worker/.keep create mode 100644 lib/google_search_data_viewer_worker/keyword/keyword_search_worker.ex create mode 100644 lib/google_search_data_viewer_worker/keyword/keywords.ex create mode 100644 priv/repo/migrations/20220531083622_create_search_result_urls.exs create mode 100644 test/factories/search_result_url_factory.ex rename test/google_search_data_viewer/{accounts/account_test.exs => account/accounts_test.exs} (67%) rename test/google_search_data_viewer/{accounts => account}/schemas/user_test.exs (93%) create mode 100644 test/google_search_data_viewer/keyword/google_search_client_test.exs create mode 100644 test/google_search_data_viewer/keyword/google_search_parser_test.exs create mode 100644 test/google_search_data_viewer/keyword/keywords_test.exs rename test/google_search_data_viewer/{keywords => keyword}/schemas/keyword_upload_test.exs (51%) create mode 100644 test/google_search_data_viewer/keyword/schemas/search_result_url_test.exs create mode 100644 test/google_search_data_viewer/keyword/search_results_test.exs delete mode 100644 test/google_search_data_viewer/keywords/keyword_test.exs create mode 100644 test/google_search_data_viewer_worker/keyword/keyword_search_worker_test.exs create mode 100644 test/google_search_data_viewer_worker/keyword/keywords_test.exs create mode 100644 test/support/fixtures/vcr_cassettes/keyword_with_adword.json create mode 100644 test/support/fixtures/vcr_cassettes/keyword_with_adword_and_top_adword.json create mode 100644 test/support/fixtures/vcr_cassettes/keyword_without_adword.json diff --git a/.gitignore b/.gitignore index 3128e92..9c1342d 100644 --- a/.gitignore +++ b/.gitignore @@ -38,3 +38,7 @@ google_search_data_viewer-*.tar npm-debug.log /assets/node_modules/ +.DS_Store + +# Ignore .iex.exs files in case you like to edit your preload commands locally. +.iex.exs diff --git a/config/config.exs b/config/config.exs index 1d71f5b..9b62932 100644 --- a/config/config.exs +++ b/config/config.exs @@ -52,7 +52,7 @@ config :phoenix, :json_library, Jason config :google_search_data_viewer, Oban, repo: GoogleSearchDataViewer.Repo, plugins: [Oban.Plugins.Pruner], - queues: [default: 10] + queues: [default: 10, keyword_search: 10] # Import environment specific config. This must remain at the bottom # of this file so it overrides the configuration defined above. 
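For context on the config change above: the new `keyword_search: 10` entry tells Oban to poll a queue named `keyword_search` and run up to 10 of its jobs concurrently. Below is a minimal, illustrative sketch of how a worker targets that queue and how a job is enqueued with a delay; the module and argument names here are hypothetical, and the actual worker added by this patch appears further down in the diff.

defmodule MyApp.ExampleKeywordWorker do
  # Illustrative only: any worker declaring this queue is served by the
  # `keyword_search: 10` setting added to config/config.exs above.
  use Oban.Worker, queue: :keyword_search, max_attempts: 3

  @impl Oban.Worker
  def perform(%Oban.Job{args: %{"keyword_id" => keyword_id}}) do
    # Fetch and process the keyword here; returning :ok marks the job complete.
    IO.inspect(keyword_id, label: "processing keyword")
    :ok
  end
end

# Enqueueing a job into the keyword_search queue, delayed by 3 seconds:
# MyApp.ExampleKeywordWorker.new(%{keyword_id: 42}, schedule_in: 3) |> Oban.insert()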
diff --git a/lib/google_search_data_viewer/accounts/account.ex b/lib/google_search_data_viewer/account/accounts.ex similarity index 82% rename from lib/google_search_data_viewer/accounts/account.ex rename to lib/google_search_data_viewer/account/accounts.ex index 5cfb23c..e4147ff 100644 --- a/lib/google_search_data_viewer/accounts/account.ex +++ b/lib/google_search_data_viewer/account/accounts.ex @@ -1,8 +1,8 @@ -defmodule GoogleSearchDataViewer.Accounts.Account do +defmodule GoogleSearchDataViewer.Account.Accounts do import Ecto.Query, warn: false - alias GoogleSearchDataViewer.Accounts.Passwords - alias GoogleSearchDataViewer.Accounts.Schemas.User + alias GoogleSearchDataViewer.Account.Passwords + alias GoogleSearchDataViewer.Account.Schemas.User alias GoogleSearchDataViewer.Repo def list_users, do: Repo.all(User) diff --git a/lib/google_search_data_viewer/accounts/passwords.ex b/lib/google_search_data_viewer/account/passwords.ex similarity index 75% rename from lib/google_search_data_viewer/accounts/passwords.ex rename to lib/google_search_data_viewer/account/passwords.ex index a0f799d..27e7ebe 100644 --- a/lib/google_search_data_viewer/accounts/passwords.ex +++ b/lib/google_search_data_viewer/account/passwords.ex @@ -1,4 +1,4 @@ -defmodule GoogleSearchDataViewer.Accounts.Passwords do +defmodule GoogleSearchDataViewer.Account.Passwords do def hash_password(password), do: Bcrypt.hash_pwd_salt(password) def verify_password(password, hashed_password), do: Bcrypt.verify_pass(password, hashed_password) diff --git a/lib/google_search_data_viewer/accounts/schemas/user.ex b/lib/google_search_data_viewer/account/schemas/user.ex similarity index 84% rename from lib/google_search_data_viewer/accounts/schemas/user.ex rename to lib/google_search_data_viewer/account/schemas/user.ex index 4005456..33db93d 100644 --- a/lib/google_search_data_viewer/accounts/schemas/user.ex +++ b/lib/google_search_data_viewer/account/schemas/user.ex @@ -1,10 +1,10 @@ -defmodule GoogleSearchDataViewer.Accounts.Schemas.User do +defmodule GoogleSearchDataViewer.Account.Schemas.User do use Ecto.Schema import Ecto.Changeset - alias GoogleSearchDataViewer.Accounts.Passwords - alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload + alias GoogleSearchDataViewer.Account.Passwords + alias GoogleSearchDataViewer.Keyword.Schemas.KeywordUpload schema "users" do field :email, :string diff --git a/lib/google_search_data_viewer/keyword/google_search_client.ex b/lib/google_search_data_viewer/keyword/google_search_client.ex new file mode 100644 index 0000000..c5d6088 --- /dev/null +++ b/lib/google_search_data_viewer/keyword/google_search_client.ex @@ -0,0 +1,22 @@ +defmodule GoogleSearchDataViewer.Keyword.GoogleSearchClient do + @base_url "https://www.google.com/search?q=" + @headers [ + {"User-Agent", + "Mozilla/5.0 (Macintosh; Intel Mac OS X 12_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.67 Safari/537.36"} + ] + + def get_html(keyword) do + search_url = @base_url <> URI.encode(keyword) + + case HTTPoison.get(search_url, @headers) do + {:ok, %HTTPoison.Response{status_code: 200, body: body}} -> + {:ok, body} + + {:ok, %HTTPoison.Response{status_code: 500}} -> + {:error, "Internal server error"} + + {:ok, response = %HTTPoison.Response{status_code: _}} -> + {:error, response} + end + end +end diff --git a/lib/google_search_data_viewer/keyword/google_search_parser.ex b/lib/google_search_data_viewer/keyword/google_search_parser.ex new file mode 100644 index 0000000..008216b --- /dev/null +++ 
b/lib/google_search_data_viewer/keyword/google_search_parser.ex @@ -0,0 +1,50 @@ +defmodule GoogleSearchDataViewer.Keyword.GoogleSearchParser do + @css_search_selectors %{ + top_adwords: "#tads > .uEierd a.sVXRqc", + top_non_adwords: ".MhgNwc a", + non_adwords: ".yuRUbf a", + bottom_adwords: "#bottomads .uEierd a.sVXRqc" + } + + def parse_html_urls(html) do + {_, parsed_html} = Floki.parse_document(html) + + [] + |> parse_top_adwords(parsed_html) + |> parse_top_non_adwords(parsed_html) + |> parse_non_adwords(parsed_html) + |> parse_bottom_adwords(parsed_html) + end + + defp parse_top_adwords(url_stats, parsed_html) do + parsed_html + |> Floki.find(@css_search_selectors.top_adwords) + |> Floki.attribute("href") + |> Enum.map(fn url -> %{url: url, is_adword: true, is_top_adword: true} end) + |> Enum.concat(url_stats) + end + + defp parse_top_non_adwords(url_stats, parsed_html) do + parsed_html + |> Floki.find(@css_search_selectors.top_non_adwords) + |> Floki.attribute("href") + |> Enum.map(fn url -> %{url: url, is_adword: false, is_top_adword: false} end) + |> Enum.concat(url_stats) + end + + defp parse_non_adwords(url_stats, parsed_html) do + parsed_html + |> Floki.find(@css_search_selectors.non_adwords) + |> Floki.attribute("href") + |> Enum.map(fn url -> %{url: url, is_adword: false, is_top_adword: false} end) + |> Enum.concat(url_stats) + end + + defp parse_bottom_adwords(url_stats, parsed_html) do + parsed_html + |> Floki.find(@css_search_selectors.bottom_adwords) + |> Floki.attribute("href") + |> Enum.map(fn url -> %{url: url, is_adword: true, is_top_adword: false} end) + |> Enum.concat(url_stats) + end +end diff --git a/lib/google_search_data_viewer/keywords/keyword.ex b/lib/google_search_data_viewer/keyword/keywords.ex similarity index 73% rename from lib/google_search_data_viewer/keywords/keyword.ex rename to lib/google_search_data_viewer/keyword/keywords.ex index 12dced4..b2365ff 100644 --- a/lib/google_search_data_viewer/keywords/keyword.ex +++ b/lib/google_search_data_viewer/keyword/keywords.ex @@ -1,9 +1,11 @@ -defmodule GoogleSearchDataViewer.Keywords.Keyword do +defmodule GoogleSearchDataViewer.Keyword.Keywords do import Ecto.Query, warn: false - alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload + alias GoogleSearchDataViewer.Keyword.Schemas.KeywordUpload alias GoogleSearchDataViewer.Repo + def get_keyword_upload(id), do: Repo.get(KeywordUpload, id) + def get_keyword_uploads_for_user(user) do KeywordUpload |> where(user_id: ^user.id) @@ -12,6 +14,18 @@ defmodule GoogleSearchDataViewer.Keywords.Keyword do |> Repo.all() end + def update_keyword_upload_status(keyword_upload, status) do + keyword_upload + |> KeywordUpload.status_changeset(status) + |> Repo.update() + end + + def update_keyword_upload_html(keyword_upload, html) do + keyword_upload + |> KeywordUpload.html_changeset(%{html: html}) + |> Repo.update() + end + def insert_keyword_uploads(attrs) do Repo.insert_all(KeywordUpload, attrs, returning: true) end @@ -29,7 +43,7 @@ defmodule GoogleSearchDataViewer.Keywords.Keyword do end) |> Enum.map(fn params -> create_changeset_and_parse(params) end) |> Enum.map(&Map.from_struct/1) - |> Enum.map(fn params -> Map.drop(params, [:__meta__, :user, :id]) end) + |> Enum.map(fn params -> Map.drop(params, [:__meta__, :user, :id, :search_result_urls]) end) |> Enum.map(fn params -> insert_timestamps(params) end) end diff --git a/lib/google_search_data_viewer/keywords/schemas/keyword_upload.ex b/lib/google_search_data_viewer/keyword/schemas/keyword_upload.ex similarity index 50% 
rename from lib/google_search_data_viewer/keywords/schemas/keyword_upload.ex rename to lib/google_search_data_viewer/keyword/schemas/keyword_upload.ex index 18dcd5f..bcd3c88 100644 --- a/lib/google_search_data_viewer/keywords/schemas/keyword_upload.ex +++ b/lib/google_search_data_viewer/keyword/schemas/keyword_upload.ex @@ -1,9 +1,10 @@ -defmodule GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload do +defmodule GoogleSearchDataViewer.Keyword.Schemas.KeywordUpload do use Ecto.Schema import Ecto.Changeset - alias GoogleSearchDataViewer.Accounts.Schemas.User + alias GoogleSearchDataViewer.Account.Schemas.User + alias GoogleSearchDataViewer.Keyword.Schemas.SearchResultUrl schema "keyword_uploads" do field :name, :string @@ -13,6 +14,7 @@ defmodule GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload do values: [:pending, :inprogress, :completed, :failed], default: :pending + has_many :search_result_urls, SearchResultUrl belongs_to :user, User timestamps() @@ -24,4 +26,14 @@ defmodule GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload do |> validate_required([:name, :user_id]) |> assoc_constraint(:user) end + + def html_changeset(keyword_upload \\ %__MODULE__{}, html) do + keyword_upload + |> cast(html, [:html]) + |> validate_required([:html]) + end + + def status_changeset(keyword_upload \\ %__MODULE__{}, status) do + change(keyword_upload, status: status) + end end diff --git a/lib/google_search_data_viewer/keyword/schemas/search_result_url.ex b/lib/google_search_data_viewer/keyword/schemas/search_result_url.ex new file mode 100644 index 0000000..c2d7d40 --- /dev/null +++ b/lib/google_search_data_viewer/keyword/schemas/search_result_url.ex @@ -0,0 +1,24 @@ +defmodule GoogleSearchDataViewer.Keyword.Schemas.SearchResultUrl do + use Ecto.Schema + + import Ecto.Changeset + + alias GoogleSearchDataViewer.Keyword.Schemas.KeywordUpload + + schema "search_result_urls" do + field :url, :string + field :is_adword, :boolean, default: false + field :is_top_adword, :boolean, default: false + + belongs_to :keyword_upload, KeywordUpload + + timestamps() + end + + def changeset(search_result_url \\ %__MODULE__{}, attrs) do + search_result_url + |> cast(attrs, [:url, :is_adword, :is_top_adword, :keyword_upload_id]) + |> validate_required([:url, :keyword_upload_id]) + |> assoc_constraint(:keyword_upload) + end +end diff --git a/lib/google_search_data_viewer/keyword/search_results.ex b/lib/google_search_data_viewer/keyword/search_results.ex new file mode 100644 index 0000000..c088797 --- /dev/null +++ b/lib/google_search_data_viewer/keyword/search_results.ex @@ -0,0 +1,10 @@ +defmodule GoogleSearchDataViewer.Keyword.SearchResults do + alias GoogleSearchDataViewer.Keyword.Schemas.SearchResultUrl + alias GoogleSearchDataViewer.Repo + + def create_search_results(url_stats) do + url_stats + |> SearchResultUrl.changeset() + |> Repo.insert() + end +end diff --git a/lib/google_search_data_viewer_web/controllers/keyword_controller.ex b/lib/google_search_data_viewer_web/controllers/keyword_controller.ex index 55df0f7..3d98f6e 100644 --- a/lib/google_search_data_viewer_web/controllers/keyword_controller.ex +++ b/lib/google_search_data_viewer_web/controllers/keyword_controller.ex @@ -1,19 +1,22 @@ defmodule GoogleSearchDataViewerWeb.KeywordController do use GoogleSearchDataViewerWeb, :controller - alias GoogleSearchDataViewer.Keywords.Keyword + alias GoogleSearchDataViewer.Keyword.Keywords alias GoogleSearchDataViewerWeb.KeywordHelper + alias GoogleSearchDataViewerWorker.Keyword.Keywords, as: WorkerKeywords def 
index(conn, _params) do - keywords = Keyword.get_keyword_uploads_for_user(conn.assigns.current_user) + keywords = Keywords.get_keyword_uploads_for_user(conn.assigns.current_user) render(conn, "index.html", keywords: keywords) end def upload(conn, %{"file" => file}) do case KeywordHelper.validate_and_parse_keyword_file(file) do - {:ok, keywords} -> - {keyword_count, _keywords} = - Keyword.create_keyword_uploads(keywords, conn.assigns.current_user) + {:ok, parsed_keywords} -> + {keyword_count, keyword_uploads} = + Keywords.create_keyword_uploads(parsed_keywords, conn.assigns.current_user) + + WorkerKeywords.create_keyword_upload_jobs_with_delay(keyword_uploads) conn |> put_flash(:info, "File successfully uploaded. #{keyword_count} keywords uploaded.") diff --git a/lib/google_search_data_viewer_web/controllers/session_controller.ex b/lib/google_search_data_viewer_web/controllers/session_controller.ex index ae4afda..840e163 100644 --- a/lib/google_search_data_viewer_web/controllers/session_controller.ex +++ b/lib/google_search_data_viewer_web/controllers/session_controller.ex @@ -1,12 +1,12 @@ defmodule GoogleSearchDataViewerWeb.SessionController do use GoogleSearchDataViewerWeb, :controller - alias GoogleSearchDataViewer.Accounts.Account + alias GoogleSearchDataViewer.Account.Accounts def new(conn, _params), do: render(conn, "new.html") def create(conn, %{"email" => email, "password" => password}) do - case Account.validate_email_and_password(email, password) do + case Accounts.validate_email_and_password(email, password) do {:ok, user} -> conn |> GoogleSearchDataViewerWeb.AuthHelper.sign_in(user) diff --git a/lib/google_search_data_viewer_web/controllers/user_controller.ex b/lib/google_search_data_viewer_web/controllers/user_controller.ex index ffe557b..2f41539 100644 --- a/lib/google_search_data_viewer_web/controllers/user_controller.ex +++ b/lib/google_search_data_viewer_web/controllers/user_controller.ex @@ -1,8 +1,8 @@ defmodule GoogleSearchDataViewerWeb.UserController do use GoogleSearchDataViewerWeb, :controller - alias GoogleSearchDataViewer.Accounts.Account - alias GoogleSearchDataViewer.Accounts.Schemas.User + alias GoogleSearchDataViewer.Account.Accounts + alias GoogleSearchDataViewer.Account.Schemas.User def new(conn, _params) do changeset = User.changeset(%User{}, %{}) @@ -11,7 +11,7 @@ defmodule GoogleSearchDataViewerWeb.UserController do end def create(conn, %{"user" => user_params}) do - case Account.create_user(user_params) do + case Accounts.create_user(user_params) do {:ok, user} -> conn |> GoogleSearchDataViewerWeb.AuthHelper.sign_in(user) diff --git a/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex b/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex index 737c123..5318d6e 100644 --- a/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex +++ b/lib/google_search_data_viewer_web/plugs/ensure_authenticated.ex @@ -2,7 +2,7 @@ defmodule GoogleSearchDataViewerWeb.EnsureAuthenticatedPlug do import Plug.Conn import Phoenix.Controller - alias GoogleSearchDataViewer.Accounts.Account + alias GoogleSearchDataViewer.Account.Accounts alias GoogleSearchDataViewerWeb.Router.Helpers, as: Routes def init(opts), do: opts @@ -26,5 +26,5 @@ defmodule GoogleSearchDataViewerWeb.EnsureAuthenticatedPlug do defp get_user(nil), do: nil - defp get_user(user_id), do: Account.get_user(user_id) + defp get_user(user_id), do: Accounts.get_user(user_id) end diff --git a/lib/google_search_data_viewer_web/plugs/put_current_user.ex 
b/lib/google_search_data_viewer_web/plugs/put_current_user.ex index b67a365..f3a6d48 100644 --- a/lib/google_search_data_viewer_web/plugs/put_current_user.ex +++ b/lib/google_search_data_viewer_web/plugs/put_current_user.ex @@ -1,14 +1,14 @@ defmodule GoogleSearchDataViewerWeb.PutCurrentUserPlug do import Plug.Conn - alias GoogleSearchDataViewer.Accounts.Account + alias GoogleSearchDataViewer.Account.Accounts def init(opts), do: opts def call(conn, _opts) do user_id = get_session(conn, :user_id) - user = user_id && Account.get_user(user_id) + user = user_id && Accounts.get_user(user_id) assign(conn, :current_user, user) end diff --git a/lib/google_search_data_viewer_worker/.keep b/lib/google_search_data_viewer_worker/.keep deleted file mode 100644 index e69de29..0000000 diff --git a/lib/google_search_data_viewer_worker/keyword/keyword_search_worker.ex b/lib/google_search_data_viewer_worker/keyword/keyword_search_worker.ex new file mode 100644 index 0000000..ec5e878 --- /dev/null +++ b/lib/google_search_data_viewer_worker/keyword/keyword_search_worker.ex @@ -0,0 +1,46 @@ +defmodule GoogleSearchDataViewerWorker.Keyword.KeywordSearchWorker do + use Oban.Worker, + queue: :keyword_search, + max_attempts: 3, + unique: [period: 30] + + alias GoogleSearchDataViewer.Keyword.GoogleSearchClient + alias GoogleSearchDataViewer.Keyword.GoogleSearchParser + alias GoogleSearchDataViewer.Keyword.Keywords + alias GoogleSearchDataViewer.Keyword.SearchResults + + @max_attempts 3 + + @impl Oban.Worker + def perform(%Oban.Job{args: %{"keyword_id" => keyword_id}, attempt: @max_attempts}) do + {:ok, _} = + keyword_id + |> Keywords.get_keyword_upload() + |> Keywords.update_keyword_upload_status(:failed) + + {:error, "max attempts reached, attempt: #{@max_attempts}"} + end + + @impl Oban.Worker + def perform(%Oban.Job{args: %{"keyword_id" => keyword_id}}) do + {_, keyword_upload} = + keyword_id + |> Keywords.get_keyword_upload() + |> Keywords.update_keyword_upload_status(:inprogress) + + {:ok, html_response} = GoogleSearchClient.get_html(keyword_upload.name) + + {:ok, _} = Keywords.update_keyword_upload_html(keyword_upload, html_response) + + html_response + |> GoogleSearchParser.parse_html_urls() + |> Enum.map(fn url_stats -> Map.put(url_stats, :keyword_upload_id, keyword_upload.id) end) + |> Enum.each(fn search_result_url -> + SearchResults.create_search_results(search_result_url) + end) + + Keywords.update_keyword_upload_status(keyword_upload, :completed) + + :ok + end +end diff --git a/lib/google_search_data_viewer_worker/keyword/keywords.ex b/lib/google_search_data_viewer_worker/keyword/keywords.ex new file mode 100644 index 0000000..2d61d57 --- /dev/null +++ b/lib/google_search_data_viewer_worker/keyword/keywords.ex @@ -0,0 +1,12 @@ +defmodule GoogleSearchDataViewerWorker.Keyword.Keywords do + alias GoogleSearchDataViewerWorker.Keyword.KeywordSearchWorker + + def create_keyword_upload_jobs_with_delay(keyword_uploads, delay \\ 3) do + keyword_uploads + |> Enum.with_index() + |> Enum.map(fn {keyword_upload, index} -> + KeywordSearchWorker.new(%{keyword_id: keyword_upload.id}, schedule_in: index * delay) + end) + |> Oban.insert_all() + end +end diff --git a/mix.exs b/mix.exs index 82379f5..9037b96 100644 --- a/mix.exs +++ b/mix.exs @@ -49,8 +49,9 @@ defmodule GoogleSearchDataViewer.MixProject do {:excoveralls, "~> 0.14.4", [only: :test]}, {:exvcr, "~> 0.13.3", [only: :test]}, {:faker, "~> 0.17.0", [only: [:dev, :test], runtime: false]}, - {:floki, ">= 0.30.0", only: :test}, + {:floki, ">= 0.30.0"}, {:gettext, 
"~> 0.18"}, + {:httpoison, "~> 1.8"}, {:jason, "~> 1.2"}, {:mimic, "~> 1.7.2", [only: :test]}, {:nimble_csv, "~> 1.1"}, diff --git a/priv/repo/migrations/20220531083622_create_search_result_urls.exs b/priv/repo/migrations/20220531083622_create_search_result_urls.exs new file mode 100644 index 0000000..01b08e8 --- /dev/null +++ b/priv/repo/migrations/20220531083622_create_search_result_urls.exs @@ -0,0 +1,18 @@ +defmodule GoogleSearchDataViewer.Repo.Migrations.CreateSearchResultUrls do + use Ecto.Migration + + def change do + create table(:search_result_urls) do + add :url, :string + add :is_adword, :boolean, default: false, null: false + add :is_top_adword, :boolean, default: false, null: false + + add :keyword_upload_id, references(:keyword_uploads, on_delete: :delete_all) + + timestamps() + end + + create index(:search_result_urls, [:keyword_upload_id]) + create index(:search_result_urls, [:url]) + end +end diff --git a/test/factories/keyword_upload_factory.ex b/test/factories/keyword_upload_factory.ex index 0a15797..ad89230 100644 --- a/test/factories/keyword_upload_factory.ex +++ b/test/factories/keyword_upload_factory.ex @@ -1,16 +1,14 @@ defmodule GoogleSearchDataViewer.KeywordUploadFactory do - alias Faker.Food.En - alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload + alias Faker.Food + alias GoogleSearchDataViewer.Keyword.Schemas.KeywordUpload defmacro __using__(_opts) do quote do - def keyword_upload_factory(attrs) do - name = attrs[:name] || En.dish() - user = attrs[:user] - + def keyword_upload_factory do %KeywordUpload{ - name: name, - user: user + name: Food.En.dish(), + status: :pending, + user: build(:user) } end end diff --git a/test/factories/search_result_url_factory.ex b/test/factories/search_result_url_factory.ex new file mode 100644 index 0000000..1bb68e1 --- /dev/null +++ b/test/factories/search_result_url_factory.ex @@ -0,0 +1,17 @@ +defmodule GoogleSearchDataViewer.SearchResultUrlFactory do + alias Faker + alias GoogleSearchDataViewer.Keyword.Schemas.SearchResultUrl + + defmacro __using__(_opts) do + quote do + def search_result_url_factory do + %SearchResultUrl{ + url: Faker.Internet.url(), + is_adword: false, + is_top_adword: false, + keyword_upload: build(:keyword_upload) + } + end + end + end +end diff --git a/test/factories/user_factory.ex b/test/factories/user_factory.ex index 90cfe74..a0e0186 100644 --- a/test/factories/user_factory.ex +++ b/test/factories/user_factory.ex @@ -1,9 +1,9 @@ defmodule GoogleSearchDataViewer.UserFactory do - alias GoogleSearchDataViewer.Accounts.Schemas.User + alias GoogleSearchDataViewer.Account.Schemas.User defmacro __using__(_opts) do quote do - alias GoogleSearchDataViewer.Accounts.Passwords + alias GoogleSearchDataViewer.Account.Passwords def user_factory(attrs) do email = attrs[:email] || Faker.Internet.email() diff --git a/test/google_search_data_viewer/accounts/account_test.exs b/test/google_search_data_viewer/account/accounts_test.exs similarity index 67% rename from test/google_search_data_viewer/accounts/account_test.exs rename to test/google_search_data_viewer/account/accounts_test.exs index 9bea768..1b418fd 100644 --- a/test/google_search_data_viewer/accounts/account_test.exs +++ b/test/google_search_data_viewer/account/accounts_test.exs @@ -1,8 +1,8 @@ -defmodule GoogleSearchDataViewer.Accounts.AccountTest do +defmodule GoogleSearchDataViewer.Account.AccountsTest do use GoogleSearchDataViewer.DataCase, async: true - alias GoogleSearchDataViewer.Accounts.Account - alias 
GoogleSearchDataViewer.Accounts.Schemas.User + alias GoogleSearchDataViewer.Account.Accounts + alias GoogleSearchDataViewer.Account.Schemas.User @valid_attrs %{email: "test@gmail.com", password: "aValidPasswordEntered"} @invalid_attrs %{email: "someemail", password: "somepassword"} @@ -11,7 +11,7 @@ defmodule GoogleSearchDataViewer.Accounts.AccountTest do test "returns all users" do user = insert(:user) - assert Account.list_users() == [user] + assert Accounts.list_users() == [user] end end @@ -19,12 +19,12 @@ defmodule GoogleSearchDataViewer.Accounts.AccountTest do test "given a valid id, returns the existing user with the given id" do user = insert(:user) - assert Account.get_user!(user.id) == user + assert Accounts.get_user!(user.id) == user end test "given an invalid id, returns an error indicating it doesn't exist" do assert_raise Ecto.NoResultsError, fn -> - Account.get_user!(-1) + Accounts.get_user!(-1) end end end @@ -33,11 +33,11 @@ defmodule GoogleSearchDataViewer.Accounts.AccountTest do test "given an id, returns the existing user with the given id" do user = insert(:user) - assert Account.get_user(user.id) == user + assert Accounts.get_user(user.id) == user end test "given an invalid id, returns nil indicating it doesn't exist" do - assert Account.get_user(-1) == nil + assert Accounts.get_user(-1) == nil end end @@ -45,19 +45,19 @@ defmodule GoogleSearchDataViewer.Accounts.AccountTest do test "given an email and password, validates correctly for existing email and password" do user = insert(:user, password: @valid_attrs[:password]) - assert Account.validate_email_and_password(user.email, @valid_attrs[:password]) == + assert Accounts.validate_email_and_password(user.email, @valid_attrs[:password]) == {:ok, user} end test "given an incorrect email and password, fails to validate for existing password and email" do user = insert(:user) - assert Account.validate_email_and_password(user.email, "wrongpassword") == + assert Accounts.validate_email_and_password(user.email, "wrongpassword") == {:error, :unauthorized} end test "given a non-existant email and password, fails to validate and returns an error" do - assert Account.validate_email_and_password("invalid@gmail.com", "wrongpassword") == + assert Accounts.validate_email_and_password("invalid@gmail.com", "wrongpassword") == {:error, :not_found} end end @@ -66,12 +66,12 @@ defmodule GoogleSearchDataViewer.Accounts.AccountTest do test "with valid data, creates a user" do valid_attrs = %{email: "some@validemail.com", password: "somepassword"} - assert {:ok, %User{} = user} = Account.create_user(valid_attrs) + assert {:ok, %User{} = user} = Accounts.create_user(valid_attrs) assert user.email == "some@validemail.com" end test "with invalid data, returns an error" do - assert {:error, %Ecto.Changeset{}} = Account.create_user(@invalid_attrs) + assert {:error, %Ecto.Changeset{}} = Accounts.create_user(@invalid_attrs) end end end diff --git a/test/google_search_data_viewer/accounts/schemas/user_test.exs b/test/google_search_data_viewer/account/schemas/user_test.exs similarity index 93% rename from test/google_search_data_viewer/accounts/schemas/user_test.exs rename to test/google_search_data_viewer/account/schemas/user_test.exs index c16d404..8f8ed39 100644 --- a/test/google_search_data_viewer/accounts/schemas/user_test.exs +++ b/test/google_search_data_viewer/account/schemas/user_test.exs @@ -1,7 +1,7 @@ -defmodule GoogleSearchDataViewer.Accounts.Schemas.UserTest do +defmodule GoogleSearchDataViewer.Account.Schemas.UserTest do use 
GoogleSearchDataViewer.DataCase, async: true - alias GoogleSearchDataViewer.Accounts.Schemas.User + alias GoogleSearchDataViewer.Account.Schemas.User describe "changeset/2" do test "given an empty changeset with empty fields, fails to validate" do diff --git a/test/google_search_data_viewer/keyword/google_search_client_test.exs b/test/google_search_data_viewer/keyword/google_search_client_test.exs new file mode 100644 index 0000000..b1e94df --- /dev/null +++ b/test/google_search_data_viewer/keyword/google_search_client_test.exs @@ -0,0 +1,25 @@ +defmodule GoogleSearchDataViewer.Keyword.GoogleSearchClientTest do + use GoogleSearchDataViewer.DataCase, async: false + + alias GoogleSearchDataViewer.Keyword.GoogleSearchClient + + describe "get_html/1" do + test "given a keyword, returns ok and body" do + use_cassette "keyword_without_adword" do + assert {:ok, _html_response} = GoogleSearchClient.get_html("dog") + end + end + + test "given a keyword and a server response with status code 500, returns error and description" do + use_cassette :stub, url: "https://www.google.com/search?q=dog", status_code: 500 do + assert {:error, "Internal server error"} = GoogleSearchClient.get_html("dog") + end + end + + test "given a keyword and a server response with an unhandled status code 504, returns error and HTTPoison.Response" do + use_cassette :stub, url: "https://www.google.com/search?q=dog", status_code: 504 do + assert {:error, %HTTPoison.Response{}} = GoogleSearchClient.get_html("dog") + end + end + end +end diff --git a/test/google_search_data_viewer/keyword/google_search_parser_test.exs b/test/google_search_data_viewer/keyword/google_search_parser_test.exs new file mode 100644 index 0000000..80dfbd5 --- /dev/null +++ b/test/google_search_data_viewer/keyword/google_search_parser_test.exs @@ -0,0 +1,48 @@ +defmodule GoogleSearchDataViewer.Keyword.GoogleSearchParserTest do + use GoogleSearchDataViewer.DataCase, async: false + + alias GoogleSearchDataViewer.Keyword.GoogleSearchClient + alias GoogleSearchDataViewer.Keyword.GoogleSearchParser + + describe "parse_html_urls/1" do + test "given a HTML response that contains a top adword, returns the URL" do + use_cassette "keyword_with_adword_and_top_adword" do + {:ok, html_response} = GoogleSearchClient.get_html("buy nike shoes") + + url_stats_results = GoogleSearchParser.parse_html_urls(html_response) + + assert Enum.member?(url_stats_results, %{ + is_adword: true, + is_top_adword: true, + url: "https://www.nike.com/th/" + }) + end + end + + test "given a HTML response that contains a bottom adword, returns the URL" do + use_cassette "keyword_with_adword" do + {:ok, html_response} = GoogleSearchClient.get_html("samsung galaxy s21") + + url_stats_results = GoogleSearchParser.parse_html_urls(html_response) + + assert Enum.member?(url_stats_results, %{ + is_adword: true, + is_top_adword: false, + url: "https://www.powerbuy.co.th/th/promotion/brand-fair/android-fair?brand=OPPO" + }) + end + end + + test "given a HTML response that contains no adwords, returns list of URL data with no adwords" do + use_cassette "keyword_without_adword" do + {:ok, html_response} = GoogleSearchClient.get_html("dog") + + url_stats_results = GoogleSearchParser.parse_html_urls(html_response) + + assert Enum.all?(url_stats_results, fn url_stats -> + url_stats.is_adword == false or url_stats.is_top_adword == false + end) + end + end + end +end diff --git a/test/google_search_data_viewer/keyword/keywords_test.exs b/test/google_search_data_viewer/keyword/keywords_test.exs new file 
mode 100644
index 0000000..0fb1f96
--- /dev/null
+++ b/test/google_search_data_viewer/keyword/keywords_test.exs
@@ -0,0 +1,75 @@
+defmodule GoogleSearchDataViewer.Keyword.KeywordsTest do
+  use GoogleSearchDataViewer.DataCase, async: true
+
+  alias GoogleSearchDataViewer.Keyword.Keywords
+
+  describe "create_keyword_uploads/2" do
+    test "given a valid list of keywords and a user, creates keywords for the user" do
+      user = insert(:user)
+      keywords = ["dog", "cat", "fish"]
+
+      {keyword_count, created_keywords} = Keywords.create_keyword_uploads(keywords, user)
+
+      assert keyword_count == Enum.count(keywords)
+      assert keywords == Enum.map(created_keywords, fn keyword -> keyword.name end)
+    end
+  end
+
+  describe "get_keyword_uploads_for_user/1" do
+    test "given an existing user with uploaded keywords, lists keywords for the user" do
+      user = insert(:user)
+
+      insert_list(3, :keyword_upload, user: user)
+
+      assert Enum.count(Keywords.get_keyword_uploads_for_user(user)) == 3
+    end
+
+    test "given an existing user with no uploaded keywords, returns an empty list" do
+      user_with_keywords = insert(:user)
+      user_without_keywords = insert(:user)
+
+      insert_list(3, :keyword_upload, user: user_with_keywords)
+
+      assert Keywords.get_keyword_uploads_for_user(user_without_keywords) == []
+    end
+  end
+
+  describe "get_keyword_upload/1" do
+    test "with an existing uploaded keyword, returns the keyword upload for the given id" do
+      keyword_upload = insert(:keyword_upload, name: "dog")
+
+      assert keyword_upload ==
+               keyword_upload.id |> Keywords.get_keyword_upload() |> Repo.preload(:user)
+    end
+
+    test "with a non-existing uploaded keyword, returns nil" do
+      user = insert(:user)
+      insert(:keyword_upload, name: "dog", user: user)
+
+      assert Keywords.get_keyword_upload(-1) == nil
+    end
+  end
+
+  describe "update_keyword_upload_status/2" do
+    test "given an existing uploaded keyword and a new status of inprogress, updates the status to inprogress" do
+      keyword_upload = insert(:keyword_upload, name: "dog", status: :pending)
+
+      {_, keyword_upload_result} =
+        Keywords.update_keyword_upload_status(keyword_upload, :inprogress)
+
+      assert keyword_upload_result.status == :inprogress
+    end
+  end
+
+  describe "update_keyword_upload_html/2" do
+    test "given an existing uploaded keyword and a non-empty html value, updates the html value" do
+      keyword_upload = insert(:keyword_upload, name: "dog")
+
+      html = " "
+
+      {_, keyword_upload_result} = Keywords.update_keyword_upload_html(keyword_upload, html)
+
+      assert keyword_upload_result.html == html
+    end
+  end
+end
diff --git a/test/google_search_data_viewer/keywords/schemas/keyword_upload_test.exs b/test/google_search_data_viewer/keyword/schemas/keyword_upload_test.exs
similarity index 51%
rename from test/google_search_data_viewer/keywords/schemas/keyword_upload_test.exs
rename to test/google_search_data_viewer/keyword/schemas/keyword_upload_test.exs
index 1588146..ed8e471 100644
--- a/test/google_search_data_viewer/keywords/schemas/keyword_upload_test.exs
+++ b/test/google_search_data_viewer/keyword/schemas/keyword_upload_test.exs
@@ -1,7 +1,7 @@
-defmodule GoogleSearchDataViewer.Keywords.Schemas.KeywordUploadTest do
+defmodule GoogleSearchDataViewer.Keyword.Schemas.KeywordUploadTest do
   use GoogleSearchDataViewer.DataCase, async: true
 
-  alias GoogleSearchDataViewer.Keywords.Schemas.KeywordUpload
+  alias GoogleSearchDataViewer.Keyword.Schemas.KeywordUpload
 
   describe "changeset/2" do
     test "given a changeset with keyword name and user id, returns valid changeset" do
@@ -36,4 +36,37 @@ defmodule GoogleSearchDataViewer.Keywords.Schemas.KeywordUploadTest do
       assert errors_on(changeset) == %{user: ["does not exist"]}
     end
   end
+
+  describe "html_changeset/2" do
+    test "given a changeset with html, returns valid changeset" do
+      keyword_upload = build(:keyword_upload, name: "dog")
+
+      changes = %{html: " "}
+
+      html_changeset = KeywordUpload.html_changeset(keyword_upload, changes)
+
+      assert %Ecto.Changeset{valid?: true, changes: ^changes} = html_changeset
+    end
+
+    test "given a changeset with empty html, fails to validate" do
+      keyword_upload = build(:keyword_upload, name: "dog")
+
+      changes = %{html: ""}
+
+      html_changeset = KeywordUpload.html_changeset(keyword_upload, changes)
+
+      assert html_changeset.valid? == false
+      assert errors_on(html_changeset) == %{html: ["can't be blank"]}
+    end
+  end
+
+  describe "status_changeset/2" do
+    test "given a changeset with a valid status, changes the status" do
+      keyword_upload = build(:keyword_upload, name: "dog")
+
+      status_changeset = KeywordUpload.status_changeset(keyword_upload, :inprogress)
+
+      assert status_changeset.changes == %{status: :inprogress}
+    end
+  end
 end
diff --git a/test/google_search_data_viewer/keyword/schemas/search_result_url_test.exs b/test/google_search_data_viewer/keyword/schemas/search_result_url_test.exs
new file mode 100644
index 0000000..323f60f
--- /dev/null
+++ b/test/google_search_data_viewer/keyword/schemas/search_result_url_test.exs
@@ -0,0 +1,48 @@
+defmodule GoogleSearchDataViewer.Keyword.Schemas.SearchResultUrlTest do
+  use GoogleSearchDataViewer.DataCase, async: true
+
+  alias GoogleSearchDataViewer.Keyword.Schemas.SearchResultUrl
+
+  describe "changeset/2" do
+    test "given a changeset with url, adword data and an existing keyword upload, validates" do
+      %{id: keyword_upload_id} = insert(:keyword_upload)
+
+      search_result_url_changeset =
+        SearchResultUrl.changeset(%{
+          url: "someurl@something.com",
+          is_adword: true,
+          is_top_adword: true,
+          keyword_upload_id: keyword_upload_id
+        })
+
+      assert search_result_url_changeset.valid? == true
+
+      assert search_result_url_changeset.changes == %{
+               url: "someurl@something.com",
+               is_adword: true,
+               is_top_adword: true,
+               keyword_upload_id: keyword_upload_id
+             }
+    end
+
+    test "given an empty changeset with empty fields, fails to validate" do
+      search_result_url_changeset = SearchResultUrl.changeset(%{})
+
+      assert search_result_url_changeset.valid? == false
+    end
+
+    test "given an invalid keyword upload, returns an invalid changeset" do
+      search_result_url_changeset =
+        SearchResultUrl.changeset(%{
+          url: "someurl@something.com",
+          is_adword: true,
+          is_top_adword: true,
+          keyword_upload_id: -1
+        })
+
+      assert {:error, changeset} = Repo.insert(search_result_url_changeset)
+
+      assert errors_on(changeset) == %{keyword_upload: ["does not exist"]}
+    end
+  end
+end
diff --git a/test/google_search_data_viewer/keyword/search_results_test.exs b/test/google_search_data_viewer/keyword/search_results_test.exs
new file mode 100644
index 0000000..b316713
--- /dev/null
+++ b/test/google_search_data_viewer/keyword/search_results_test.exs
@@ -0,0 +1,57 @@
+defmodule GoogleSearchDataViewer.Keyword.SearchResultsTest do
+  use GoogleSearchDataViewer.DataCase, async: true
+
+  alias GoogleSearchDataViewer.Keyword.SearchResults
+
+  describe "create_search_results/1" do
+    test "given valid search result data, creates search result data for a keyword upload" do
+      keyword_upload = insert(:keyword_upload)
+
+      search_result_url = %{
+        url: "www.google.com",
+        is_top_adword: false,
+        is_adword: true,
+        keyword_upload_id: keyword_upload.id
+      }
+
+      {:ok, search_result} = SearchResults.create_search_results(search_result_url)
+
+      assert search_result.url == "www.google.com"
+      assert search_result.is_top_adword == false
+      assert search_result.is_adword == true
+      assert search_result.keyword_upload_id == keyword_upload.id
+    end
+
+    test "given an empty url, returns an error" do
+      keyword_upload = insert(:keyword_upload)
+
+      search_result_url = %{
+        url: "",
+        is_top_adword: false,
+        is_adword: true,
+        keyword_upload_id: keyword_upload.id
+      }
+
+      {:error, changeset} = SearchResults.create_search_results(search_result_url)
+
+      assert errors_on(changeset) == %{
+               url: ["can't be blank"]
+             }
+    end
+
+    test "given an invalid keyword upload, returns an error" do
+      search_result_url = %{
+        url: "www.google.com",
+        is_top_adword: false,
+        is_adword: true,
+        keyword_upload_id: -1
+      }
+
+      {:error, changeset} = SearchResults.create_search_results(search_result_url)
+
+      assert errors_on(changeset) == %{
+               keyword_upload: ["does not exist"]
+             }
+    end
+  end
+end
diff --git a/test/google_search_data_viewer/keywords/keyword_test.exs b/test/google_search_data_viewer/keywords/keyword_test.exs
deleted file mode 100644
index 794604a..0000000
--- a/test/google_search_data_viewer/keywords/keyword_test.exs
+++ /dev/null
@@ -1,43 +0,0 @@
-defmodule GoogleSearchDataViewer.Keywords.KeywordTest do
-  use GoogleSearchDataViewer.DataCase, async: true
-
-  alias GoogleSearchDataViewer.Keywords.Keyword
-
-  describe "create_keyword_uploads/2" do
-    test "given a valid list of keywords and a user, creates keywords for the user" do
-      user = insert(:user)
-      keywords = ["dog", "cat", "fish"]
-
-      {keyword_count, _keywords} = Keyword.create_keyword_uploads(keywords, user)
-
-      assert keyword_count == Enum.count(keywords)
-    end
-  end
-
-  describe "get_keyword_uploads_for_user/1" do
-    test "given an existing user with uploaded keywords, lists keywords for the user" do
-      user = insert(:user)
-
-      keywords = ["dog", "cat", "fish"]
-
-      keyword_uploads =
-        Enum.map(keywords, fn keyword -> insert(:keyword_upload, name: keyword, user: user) end)
-
-      assert keyword_uploads ==
-               user
-               |> Keyword.get_keyword_uploads_for_user()
-               |> Repo.preload(:user)
-    end
-
-    test "given an existing user with no uploaded keywords, returns an empty list" do
-      user1 = insert(:user)
-      user2 = insert(:user)
-
-      keywords = ["dog", "cat", "fish"]
-
-      Enum.each(keywords, fn keyword -> insert(:keyword_upload, name: keyword, user: user1) end)
-
-      assert Keyword.get_keyword_uploads_for_user(user2) == []
-    end
-  end
-end
diff --git a/test/google_search_data_viewer_worker/keyword/keyword_search_worker_test.exs b/test/google_search_data_viewer_worker/keyword/keyword_search_worker_test.exs
new file mode 100644
index 0000000..7a7cc36
--- /dev/null
+++ b/test/google_search_data_viewer_worker/keyword/keyword_search_worker_test.exs
@@ -0,0 +1,52 @@
+defmodule GoogleSearchDataViewerWorker.Keyword.KeywordSearchWorkerTest do
+  use GoogleSearchDataViewer.DataCase, async: false
+
+  alias GoogleSearchDataViewerWorker.Keyword.KeywordSearchWorker
+
+  @max_attempts 3
+
+  setup_all do
+    HTTPoison.start()
+  end
+
+  describe "perform/1" do
+    test "updates status from pending to completed for the uploaded keyword" do
+      use_cassette "keyword_with_adword" do
+        keyword_upload = insert(:keyword_upload, name: "samsung galaxy s21")
+
+        KeywordSearchWorker.perform(%Oban.Job{args: %{"keyword_id" => keyword_upload.id}})
+
+        keyword_upload_result = Repo.reload(keyword_upload)
+
+        assert keyword_upload_result.status == :completed
+      end
+    end
+
+    test "inserts html into the uploaded keyword" do
+      use_cassette "keyword_without_adword" do
+        keyword_upload = insert(:keyword_upload, name: "dog")
+
+        KeywordSearchWorker.perform(%Oban.Job{args: %{"keyword_id" => keyword_upload.id}})
+
+        keyword_upload_result = Repo.reload(keyword_upload)
+
+        assert keyword_upload_result.html =~ ""
+      end
+    end
+
+    test "updates status to failed when max attempts have been reached" do
+      use_cassette "keyword_without_adword" do
+        keyword_upload = insert(:keyword_upload, name: "dog")
+
+        KeywordSearchWorker.perform(%Oban.Job{
+          args: %{"keyword_id" => keyword_upload.id},
+          attempt: @max_attempts
+        })
+
+        %{status: keyword_status} = Repo.reload(keyword_upload)
+
+        assert keyword_status == :failed
+      end
+    end
+  end
+end
diff --git a/test/google_search_data_viewer_worker/keyword/keywords_test.exs b/test/google_search_data_viewer_worker/keyword/keywords_test.exs
new file mode 100644
index 0000000..41d1522
--- /dev/null
+++ b/test/google_search_data_viewer_worker/keyword/keywords_test.exs
@@ -0,0 +1,37 @@
+defmodule GoogleSearchDataViewerWorker.Keyword.KeywordsTest do
+  use Oban.Testing, repo: GoogleSearchDataViewer.Repo
+  use GoogleSearchDataViewer.DataCase, async: true
+
+  alias GoogleSearchDataViewerWorker.Keyword.Keywords
+  alias GoogleSearchDataViewerWorker.Keyword.KeywordSearchWorker
+
+  @job_delay_in_seconds 3
+
+  describe "create_keyword_upload_jobs_with_delay/1" do
+    test "given two uploaded keywords and a delay, jobs are created in the oban_jobs table scheduled for the correct times for two keyword ids" do
+      first_keyword_upload = insert(:keyword_upload, name: "dog")
+      second_keyword_upload = insert(:keyword_upload, name: "cat")
+
+      Keywords.create_keyword_upload_jobs_with_delay(
+        [
+          first_keyword_upload,
+          second_keyword_upload
+        ],
+        @job_delay_in_seconds
+      )
+
+      delayed_time = DateTime.add(DateTime.utc_now(), @job_delay_in_seconds, :second)
+
+      assert_enqueued(
+        worker: KeywordSearchWorker,
+        args: %{keyword_id: first_keyword_upload.id}
+      )
+
+      assert_enqueued(
+        worker: KeywordSearchWorker,
+        scheduled_at: delayed_time,
+        args: %{keyword_id: second_keyword_upload.id}
+      )
+    end
+  end
+end
diff --git a/test/support/data_case.ex b/test/support/data_case.ex
index 9d33abd..7063745 100644
--- a/test/support/data_case.ex
+++ b/test/support/data_case.ex
@@ -35,6 +35,10 @@ defmodule GoogleSearchDataViewer.DataCase do
   end
 
   setup tags do
+    HTTPoison.start()
+
+    ExVCR.Config.cassette_library_dir("test/support/fixtures/vcr_cassettes/")
+
     pid = Sandbox.start_owner!(GoogleSearchDataViewer.Repo, shared: not tags[:async])
     on_exit(fn -> Sandbox.stop_owner(pid) end)
     :ok
diff --git a/test/support/factory.ex b/test/support/factory.ex
index b06c67a..da71b3a 100644
--- a/test/support/factory.ex
+++ b/test/support/factory.ex
@@ -1,5 +1,6 @@
 defmodule GoogleSearchDataViewer.Factory do
   use ExMachina.Ecto, repo: GoogleSearchDataViewer.Repo
   use GoogleSearchDataViewer.KeywordUploadFactory
+  use GoogleSearchDataViewer.SearchResultUrlFactory
   use GoogleSearchDataViewer.UserFactory
 end
diff --git a/test/support/fixtures/keywords/valid_keywords.csv b/test/support/fixtures/keywords/valid_keywords.csv
index 6da7038..08115f7 100644
--- a/test/support/fixtures/keywords/valid_keywords.csv
+++ b/test/support/fixtures/keywords/valid_keywords.csv
@@ -1,3 +1,3 @@
-dog
-cow
-sheep
+iphone 12
+samsung galaxy s21
+buy nike shoes
diff --git a/test/support/fixtures/vcr_cassettes/keyword_with_adword.json b/test/support/fixtures/vcr_cassettes/keyword_with_adword.json
new file mode 100644
index 0000000..e3a8d9a
--- /dev/null
+++ b/test/support/fixtures/vcr_cassettes/keyword_with_adword.json
@@ -0,0 +1,41 @@
+[
+  {
+    "request": {
+      "body": "",
+      "headers": {
+        "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 12_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.67 Safari/537.36"
+      },
+      "method": "get",
+      "options": [],
+      "request_body": "",
+      "url": "https://www.google.com/search?q=samsung%20galaxy%20s21"
+    },
+    "response": {
+      "binary": false,
+      "body": "[body omitted: recorded Google search results page (Thai locale) for samsung galaxy s21, including ad results for Samsung devices and retailers]",
+      "headers": {
+        "Content-Type": "text/html; charset=UTF-8",
+        "Date": "Tue, 31 May 2022 06:52:21 GMT",
+        "Expires": "-1",
+        "Cache-Control": "private, max-age=0",
+        "Strict-Transport-Security": "max-age=31536000",
+        "Content-Security-Policy": "object-src 'none';base-uri 'self';script-src 'nonce-xqMf_vcipK1e26qSMNI_xw' 'strict-dynamic' 'report-sample' 'unsafe-eval' 'unsafe-inline' https: http:;report-uri https://csp.withgoogle.com/csp/gws/cdt1",
+        "Cross-Origin-Opener-Policy": "same-origin-allow-popups; report-to=\"gws\"",
+        "Report-To": "{\"group\":\"gws\",\"max_age\":2592000,\"endpoints\":[{\"url\":\"https://csp.withgoogle.com/csp/report-to/gws/cdt1\"}]}",
+        "Accept-CH": "Sec-CH-Viewport-Width",
+        "BFCache-Opt-In": "unload",
+        "P3P": "CP=\"This is not a P3P policy! See g.co/p3phelp for more info.\"",
+        "Server": "gws",
+        "X-XSS-Protection": "0",
+        "X-Frame-Options": "SAMEORIGIN",
+        "Set-Cookie": "1P_JAR=2022-05-31-06; expires=Thu, 30-Jun-2022 06:52:21 GMT; path=/; domain=.google.com; Secure; SameSite=none",
+        "Alt-Svc": "h3=\":443\"; ma=2592000,h3-29=\":443\"; ma=2592000,h3-Q050=\":443\"; ma=2592000,h3-Q046=\":443\"; ma=2592000,h3-Q043=\":443\"; ma=2592000,quic=\":443\"; ma=2592000; v=\"46,43\"",
+        "Accept-Ranges": "none",
+        "Vary": "Accept-Encoding",
+        "Transfer-Encoding": "chunked"
+      },
+      "status_code": 200,
+      "type": "ok"
+    }
+  }
+]
diff --git a/test/support/fixtures/vcr_cassettes/keyword_with_adword_and_top_adword.json b/test/support/fixtures/vcr_cassettes/keyword_with_adword_and_top_adword.json
new file mode 100644
index 0000000..97397a5
--- /dev/null
+++ b/test/support/fixtures/vcr_cassettes/keyword_with_adword_and_top_adword.json
@@ -0,0 +1,41 @@
+[
+  {
+    "request": {
+      "body": "",
+      "headers": {
+        "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 12_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.67 Safari/537.36"
+      },
+      "method": "get",
+      "options": [],
+      "request_body": "",
+      "url": "https://www.google.com/search?q=buy%20nike%20shoes"
+    },
+    "response": {
+      "binary": false,
+      "body": "[body omitted: recorded Google search results page (Thai locale) for buy nike shoes, including top ad and ad results for Nike and other shoe retailers]",
+      "headers": {
+        "Content-Type": "text/html; charset=UTF-8",
+        "Date": "Tue, 31 May 2022 05:21:04 GMT",
+        "Expires": "-1",
+        "Cache-Control": "private, max-age=0",
+        "Strict-Transport-Security": "max-age=31536000",
+        "Content-Security-Policy": "object-src 'none';base-uri 'self';script-src 'nonce-R03q2L0kHqx9CAvFwWc5Pw' 'strict-dynamic' 'report-sample' 'unsafe-eval' 'unsafe-inline' https: http:;report-uri https://csp.withgoogle.com/csp/gws/cdt1",
+        "Cross-Origin-Opener-Policy": "same-origin-allow-popups; report-to=\"gws\"",
+        "Report-To": "{\"group\":\"gws\",\"max_age\":2592000,\"endpoints\":[{\"url\":\"https://csp.withgoogle.com/csp/report-to/gws/cdt1\"}]}",
+        "Accept-CH": "Sec-CH-Viewport-Width",
+        "BFCache-Opt-In": "unload",
+        "P3P": "CP=\"This is not a P3P policy! See g.co/p3phelp for more info.\"",
+        "Server": "gws",
+        "X-XSS-Protection": "0",
+        "X-Frame-Options": "SAMEORIGIN",
+        "Set-Cookie": "1P_JAR=2022-05-31-05; expires=Thu, 30-Jun-2022 05:21:04 GMT; path=/; domain=.google.com; Secure; SameSite=none",
+        "Alt-Svc": "h3=\":443\"; ma=2592000,h3-29=\":443\"; ma=2592000,h3-Q050=\":443\"; ma=2592000,h3-Q046=\":443\"; ma=2592000,h3-Q043=\":443\"; ma=2592000,quic=\":443\"; ma=2592000; v=\"46,43\"",
+        "Accept-Ranges": "none",
+        "Vary": "Accept-Encoding",
+        "Transfer-Encoding": "chunked"
+      },
+      "status_code": 200,
+      "type": "ok"
+    }
+  }
+]
diff --git a/test/support/fixtures/vcr_cassettes/keyword_without_adword.json b/test/support/fixtures/vcr_cassettes/keyword_without_adword.json
new file mode 100644
index 0000000..752d351
--- /dev/null
+++ b/test/support/fixtures/vcr_cassettes/keyword_without_adword.json
@@ -0,0 +1,41 @@
+[
+  {
+    "request": {
+      "body": "",
+      "headers": {
+        "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 12_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.67 Safari/537.36"
+      },
+      "method": "get",
+      "options": [],
+      "request_body": "",
+      "url": "https://www.google.com/search?q=dog"
+    },
+    "response": {
+      "binary": false,
+      "body": "[body omitted: recorded Google search results page (Thai locale) for dog, with no ad results]",
+      "headers": {
+        "Content-Type": "text/html; charset=UTF-8",
+        "Date": "Mon, 23 May 2022 08:49:40 GMT",
+        "Expires": "-1",
+        "Cache-Control": "private, max-age=0",
+        "Strict-Transport-Security": "max-age=31536000",
+        "Content-Security-Policy": "object-src 'none';base-uri 'self';script-src 'nonce-ClgQWHWDisks5c48c1g0jw' 'strict-dynamic' 'report-sample' 'unsafe-eval' 'unsafe-inline' https: http:;report-uri https://csp.withgoogle.com/csp/gws/cdt1",
+        "Cross-Origin-Opener-Policy": "same-origin-allow-popups; report-to=\"gws\"",
+        "Report-To": "{\"group\":\"gws\",\"max_age\":2592000,\"endpoints\":[{\"url\":\"https://csp.withgoogle.com/csp/report-to/gws/cdt1\"}]}",
+        "Accept-CH": "Sec-CH-Viewport-Width",
+        "BFCache-Opt-In": "unload",
+        "P3P": "CP=\"This is not a P3P policy! See g.co/p3phelp for more info.\"",
+        "Server": "gws",
+        "X-XSS-Protection": "0",
+        "X-Frame-Options": "SAMEORIGIN",
+        "Set-Cookie": "1P_JAR=2022-05-23-08; expires=Wed, 22-Jun-2022 08:49:40 GMT; path=/; domain=.google.com; Secure; SameSite=none",
+        "Alt-Svc": "h3=\":443\"; ma=2592000,h3-29=\":443\"; ma=2592000,h3-Q050=\":443\"; ma=2592000,h3-Q046=\":443\"; ma=2592000,h3-Q043=\":443\"; ma=2592000,quic=\":443\"; ma=2592000; v=\"46,43\"",
+        "Accept-Ranges": "none",
+        "Vary": "Accept-Encoding",
+        "Transfer-Encoding": "chunked"
+      },
+      "status_code": 200,
+      "type": "ok"
+    }
+  }
+]