
CodingIsOurPassion Group

This is the mono repository for our group at the University of Texas at San Antonio in the Applications Programming course (CS-3443).

Our group consists of the following members:

We chose to build an application named LakeWatch for our project. The intent is for it to present data about Canyon Lake in an Android application, applying principles learned in the course.

Organization of this Repository

  • LakeWatchAPI contains the backend service behind our Android application
  • LakeWatch contains the main Android application for our project
  • LakeWatchScraper contains the scraper service that pulls the data the API needs from various external sites
  • Meetings contains our meeting notes

How to run the Android application

Note that the Android application requires internet access, as it calls out to the self-hosted API we wrote for this project.

Open the LakeWatch directory as an Android project and run it on a Pixel 6 Pro (API 28).
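If you prefer the command line over Android Studio, and assuming the project includes the standard Gradle wrapper, something like the following should build and install the app onto a running emulator or connected device (the APK output path is the usual Android Gradle default, not something we verified for this repo):

```shell
# Build a debug APK with the Gradle wrapper (assumes ./gradlew exists in LakeWatch/)
cd LakeWatch
./gradlew assembleDebug

# Install onto a connected emulator/device via adb
# (app/build/outputs/apk/debug/ is the default output path for an `app` module)
adb install -r app/build/outputs/apk/debug/app-debug.apk
```
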

We are hosting the API and scraper on Price Hiller's server at https://lakewatch.orion-technologies.io.

How the API and Scraper are Hosted

The LakeWatchScraper and LakeWatchAPI projects provide a flake.nix file at their top levels that defines reproducible builds and configuration options for deployment on NixOS.

Both are hosted on my (Price's) Luna server; the service definition lives at https://github.com/PriceHiller/dots/blob/Development/hosts/luna/modules/services/lakewatch.nix.

I (Price Hiller) will keep the API online on my server until the start of the next semester on the 26th (or longer if contacted).

Known Issues

  1. If the initial data fetch within the Android application fails and the cache file has not yet been populated, the affected views will be empty; this is out of our control at that point. Once a single data fetch succeeds, the cache is populated and the problem becomes less severe.

  2. Android requires SSL-secured endpoints for web requests, which makes local development against the API without certificates quite difficult. Our development process involved running the production API on a real endpoint while developing. Some improvement could be found here.

Self-hosting the API and Scraper

Initialization

The preferred way of doing all of this is with https://nixos.org/, but if that's too experimental, a simple docker-compose.yaml and .envrc are provided at the top level of the repository.

Simply source the .envrc file and run docker compose up -d to get started.
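For reference, a minimal .envrc might look like the sketch below. The variable names and values here are hypothetical — the actual .envrc at the repository root is authoritative:

```shell
# Hypothetical .envrc sketch -- see the real .envrc at the repo root.
# These export the database credentials docker-compose.yaml and the API expect.
export POSTGRES_USER=lakewatch
export POSTGRES_PASSWORD=change-me
export POSTGRES_DB=lakewatch
export DATABASE_URL="postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@localhost:5432/${POSTGRES_DB}"
```

After sourcing it, `docker compose up -d` brings up the database in the background.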

Running the API

The first thing you'll want to do is head over to the LakeWatchAPI directory. You'll need a relatively recent version of the Rust toolchain (rustc >= 1.78). Ensure the environment variables defined in the .envrc are exported in your environment; then you can run the API in one of two ways:

  1. Build the application via cargo build --release and run the created binary at LakeWatchAPI/target/release/lakewatch

  2. Or, run the application in debug mode via cargo run

A quick aside: if there are any issues with table creation in the database, you can install the sqlx CLI via cargo install sqlx-cli and run sqlx database reset --force to refresh the API's database. Do note that this is a destructive operation.
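Put together, a from-scratch run of the API might look like the session below (a sketch assuming the docker compose database is up and the .envrc lives at the repository root):

```shell
cd LakeWatchAPI
source ../.envrc            # export DATABASE_URL and friends

cargo build --release       # compile the API
./target/release/lakewatch  # run the release binary

# If table creation misbehaves, reset the database (destructive!):
cargo install sqlx-cli       # provides the `sqlx` command
sqlx database reset --force  # drop, recreate, and re-migrate the database
```
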

Running the scraper (AKA, actually getting data loaded into the database)

Now head on over to the LakeWatchScraper directory and ensure you have the following installed:

  • Python 3.12 or later, from any source or directly from https://www.python.org/
  • The poetry CLI for managing packages; it can be installed via pipx install poetry
  • gcc, which is required for our PostgreSQL adapter to talk to the database

With those required tools installed (a problem which, by the way, Nix solves entirely 😉), run poetry install && poetry shell within the LakeWatchScraper directory.

At this point, ensure the database brought up via Docker earlier is still running and that the environment variables from the .envrc are still exported in your environment.

You can then run python lakewatchscraper from the top level of the LakeWatchScraper directory, which will start the scraper and push data into the database.
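The scraper steps above, condensed into one scriptable session (again assuming the .envrc sits at the repository root):

```shell
cd LakeWatchScraper
source ../.envrc                    # database connection variables
poetry install                      # install dependencies into poetry's virtualenv
poetry run python lakewatchscraper  # run the scraper inside that virtualenv
```

Note that poetry run is used here instead of poetry shell: poetry shell spawns an interactive subshell, which doesn't compose well in a script. Interactively, either approach works.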

Checking out the API's swag(ger)

With the data now populated, you can peruse the API's endpoints on its Swagger page, hosted by default at localhost:8000.
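A quick way to confirm the API is actually up before opening the Swagger page is to poke it with curl (the root path here is an assumption — any route the API serves will do):

```shell
# Print the status line and headers from the locally hosted API
curl -sS -i http://localhost:8000/ | head
```
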

Get In Contact

If you have any additional questions or are interested (especially in Nix), feel free to contact us.

Thanks for taking a look at our project 🙂