Downloads previously summarized WorldFish survey data from cloud storage, incorporates modeled aggregated estimates, and exports everything to MongoDB collections for use in data portals. The function also generates geographic regional summaries.
Usage
export_portal(log_threshold = logger::DEBUG, package = "coasts")
Arguments
- log_threshold
The logging level threshold for the logger package (e.g., DEBUG, INFO). See `logger::log_levels` for available options.
- package
Name of the package whose `inst/conf.yml` to read. Defaults to `"coasts"`. Pass your own package name when calling from a downstream package with a compatible configuration.
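As a hedged illustration of the `package` argument, a downstream package that ships a compatible `inst/conf.yml` could call the function like this (the package name `"mypackage"` is hypothetical):

```r
# Read configuration from a downstream package's inst/conf.yml
# instead of the default "coasts" configuration
export_portal(log_threshold = logger::INFO, package = "mypackage")
```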
Details
The function performs the following operations:
- Downloads five summary datasets from cloud storage:
  - Monthly summaries: aggregated catch metrics by district and month
  - Taxa summaries: species-specific metrics in long format
  - Districts summaries: district-level indicators over time
  - Gear summaries: performance metrics by gear type
  - Grid summaries: spatial grid data from vessel tracking
- Downloads aggregated catch estimates from the modeling step
- Creates geographic regional summaries from the monthly data
- Joins aggregated estimates (fishing trips, catch tonnage, revenue) to the monthly summaries
- Transforms monthly summaries to long format for portal consumption
- Uploads all datasets to the specified MongoDB collections
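The join-and-reshape steps above can be sketched as follows. This is a minimal illustration using hypothetical column names (`district`, `date`, `catch_kg`, `n_trips`, `revenue`), not the package's actual schema:

```r
library(dplyr)
library(tidyr)

# Hypothetical monthly summaries by district and month
monthly <- tibble::tibble(
  district = c("A", "A", "B"),
  date     = as.Date(c("2023-01-01", "2023-02-01", "2023-01-01")),
  catch_kg = c(120, 90, 75)
)

# Hypothetical aggregated estimates from the modeling step
estimates <- tibble::tibble(
  district = c("A", "B"),
  date     = as.Date(c("2023-01-01", "2023-01-01")),
  n_trips  = c(10, 6),
  revenue  = c(500, 300)
)

# Join estimates onto the monthly summaries, then pivot to long
# format for portal consumption
portal_long <- monthly |>
  dplyr::left_join(estimates, by = c("district", "date")) |>
  tidyr::pivot_longer(
    cols      = c(catch_kg, n_trips, revenue),
    names_to  = "indicator",
    values_to = "value"
  )
```

The long format (one `indicator`/`value` pair per row) is a common shape for portal front-ends that filter by metric name.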
The function expects the summary files to be named with the pattern: `file_prefix_table_name.parquet` where table_name is one of: monthly_summaries, taxa_summaries, districts_summaries, gear_summaries, grid_summaries
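The expected file names can be reconstructed from the pattern like so (the prefix `"wf_surveys"` is purely illustrative, not taken from the package configuration):

```r
table_names <- c("monthly_summaries", "taxa_summaries", "districts_summaries",
                 "gear_summaries", "grid_summaries")
file_prefix <- "wf_surveys"  # illustrative prefix only
files <- paste0(file_prefix, "_", table_names, ".parquet")
# e.g. "wf_surveys_monthly_summaries.parquet"
```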
See also
* [summarize_data()] for generating the summary datasets
* [download_parquet_from_cloud()] for retrieving data from cloud storage
* [mdb_collection_push()] for uploading data to MongoDB
Examples
if (FALSE) { # \dontrun{
# Export WF summary data with default debug logging
export_portal()
# Export with info-level logging only
export_portal(logger::INFO)
} # }