# Changelog

# peskas.zanzibar.data.pipeline 2.6.0

# peskas.zanzibar.data.pipeline 2.5.0

## Major Changes

- Enhanced validation workflow with KoboToolbox integration:
  - Added `update_validation_status()` to update submission status via the API
  - Added `sync_validation_submissions()` for parallel processing of validation flags
  - Updated the Kobo URL endpoint from kf.kobotoolbox.org to eu.kobotoolbox.org
## New Features
- Implemented parallel processing for validation operations using future/furrr packages
- Added progress reporting during validation operations via progressr package
- Enhanced validation status synchronization between local system and KoboToolbox
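The parallel pattern described above can be sketched with the future/furrr/progressr stack. This is an illustration of the technique, not the package's actual code; `update_one()` is a hypothetical stand-in for a per-submission API call.

```r
library(future)
library(furrr)
library(progressr)

plan(multisession, workers = 2)

# `update_one()` is a hypothetical stand-in for a per-submission API call
update_one <- function(id) {
  Sys.sleep(0.01)  # simulate network latency
  list(id = id, status = "validation_status_approved")
}

sync_all <- function(ids) {
  with_progress({
    p <- progressor(along = ids)
    future_map(ids, function(id) {
      p()  # tick the progress bar from the worker
      update_one(id)
    })
  })
}

results <- sync_all(1:20)
length(results)  # 20
```

`future_map()` distributes the per-submission calls across the workers set by `plan()`, while `progressor()` relays progress signals back to the main session.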
## Improvements
- Updated data preprocessing to handle flying fish estimates and taxa corrections (TUN→TUS, SKH→CVX)
- Updated export workflow to use validation status instead of flags for data filtering
- Added taxa information to catch export data
- Added Zanzibar SSF report template with visualization examples
- Improved package documentation structure with better categorization
# peskas.zanzibar.data.pipeline 2.4.0

## Major Changes

- Implemented support for multiple survey data sources:
  - Refactored `get_validated_surveys()` to handle WCS, WF, and BA sources
  - Added a source parameter to specify which datasets to retrieve
  - Improved handling of data sources with different column structures
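The multi-source dispatch can be sketched as below. This is an assumption-laden illustration, not the package's implementation: the reader functions and column names are invented, and the key point is that `dplyr::bind_rows()` reconciles sources with different column structures by filling missing columns with `NA`.

```r
library(dplyr)

get_validated_surveys_sketch <- function(sources = c("wcs", "wf", "ba")) {
  # hypothetical per-source readers returning data frames
  readers <- list(
    wcs = function() data.frame(id = 1, catch_kg = 4.2, source = "wcs"),
    wf  = function() data.frame(id = 2, gear = "net", source = "wf"),
    ba  = function() data.frame(id = 3, source = "ba")
  )
  sources <- match.arg(sources, several.ok = TRUE)
  # bind_rows() fills columns missing from a source with NA
  dplyr::bind_rows(lapply(readers[sources], function(f) f()))
}

surveys <- get_validated_surveys_sketch(c("wcs", "wf"))
names(surveys)  # "id" "catch_kg" "source" "gear"
```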
## New Features

- Added `export_wf_data()` for WorldFish-specific data export
- Enhanced validation with additional composite metrics:
  - Price per kg validation
  - CPUE (Catch Per Unit Effort) validation
  - RPUE (Revenue Per Unit Effort) validation
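The composite metrics named above reduce to simple ratios; the sketch below shows the arithmetic, with column names (`catch_kg`, `catch_price`, `n_fishers`, `trip_hours`) and the fisher-hour effort unit as assumptions, not the package's actual schema.

```r
catch <- data.frame(
  catch_kg    = c(12, 30),
  catch_price = c(48, 90),   # total revenue for the trip
  n_fishers   = c(2, 3),
  trip_hours  = c(4, 5)
)

effort       <- catch$n_fishers * catch$trip_hours  # fisher-hours
price_per_kg <- catch$catch_price / catch$catch_kg
cpue         <- catch$catch_kg / effort             # kg per fisher-hour
rpue         <- catch$catch_price / effort          # revenue per fisher-hour

cpue  # 1.5 2.0
```

Validation then flags trips whose ratios fall outside plausible bounds, which catches inconsistent weight/price/effort combinations that look fine in isolation.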
## Improvements
- Added min_length parameter for better length validation thresholds
- Updated LW coefficient filtering logic in model-taxa.R
- Enhanced alert flag handling with combined flags from different validation steps
- Improved catch price and catch weight handling for zero-catch outcomes
- Enhanced data preprocessing with better field type conversion
# peskas.zanzibar.data.pipeline 2.3.0

## Major Changes

- Enhanced KoboToolbox integration:
  - Implemented new validation status retrieval from the KoboToolbox API
  - Updated validation workflow to incorporate submission validation status
  - Improved data validation process through direct API integration
## New Features

- New KoboToolbox interaction functions:
  - `get_validation_status()`: Retrieves submission validation status from the KoboToolbox API
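A hedged sketch of such a retrieval call using httr2. The endpoint path and token-based authorization follow the KoboToolbox v2 API conventions but are assumptions here, not the package's actual code.

```r
library(httr2)

# Hypothetical helper: fetch one submission's validation status.
# asset_uid, submission_id, and token are caller-supplied.
get_validation_status_sketch <- function(asset_uid, submission_id, token) {
  request("https://eu.kobotoolbox.org") |>
    req_url_path(
      "api/v2/assets", asset_uid, "data", submission_id, "validation_status/"
    ) |>
    req_headers(Authorization = paste("Token", token)) |>
    req_perform() |>
    resp_body_json()
}
```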
# peskas.zanzibar.data.pipeline 2.2.0

## Major Changes

- Completely restructured taxonomic data processing:
  - Introduced new modular functions for taxa handling in model-taxa.R
  - Added efficient batch processing for species matching
  - Implemented an optimized FAO area retrieval system
  - Streamlined length-weight coefficient calculations
  - Enhanced integration with FishBase and SeaLifeBase
## New Features

- New taxonomic processing functions:
  - `load_taxa_databases()`: Unified database loading from FishBase and SeaLifeBase
  - `process_species_list()`: Enhanced species list processing with taxonomic ranks
  - `match_species_from_taxa()`: Improved species matching across databases
  - `get_species_areas_batch()`: Efficient FAO area retrieval
  - `get_length_weight_batch()`: Optimized length-weight parameter retrieval
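The batching idea behind these functions can be sketched as: deduplicate species names, query once per unique species, then join results back onto every row. This is an illustration, not the package's code; `fetch_lw()` is a hypothetical stand-in for a FishBase/SeaLifeBase call.

```r
library(dplyr)

# fetch_lw() is a hypothetical stand-in for a per-species database call
fetch_lw <- function(species) {
  data.frame(species = species, a = 0.01, b = 3.04)
}

get_length_weight_batch_sketch <- function(species_vec) {
  unique_sp <- unique(species_vec)  # one query per species, not per row
  lw <- do.call(rbind, lapply(unique_sp, fetch_lw))
  dplyr::left_join(data.frame(species = species_vec), lw, by = "species")
}

res <- get_length_weight_batch_sketch(
  c("Lutjanus fulvus", "Lutjanus fulvus", "Siganus sutor")
)
nrow(res)  # 3, but only 2 external lookups were made
```

Deduplicating before querying is what cuts the API calls to external databases mentioned under Improvements.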
## Improvements
- Enhanced performance through batch processing
- Reduced API calls to external databases
- Better error handling and input validation
- More comprehensive documentation
- Improved code organization and modularity
# peskas.zanzibar.data.pipeline 2.1.0

## Major Changes

- Enhanced taxonomic and catch data processing capabilities:
  - Added comprehensive functions for species and catch data processing
  - Implemented length-weight coefficient retrieval from FishBase and SeaLifeBase
  - Created functions for calculating catch weights using multiple methods
  - Added new data reshaping utilities for species and catch information
- Extended WorldFish (WF) survey validation with detailed quality checks
- Updated cloud storage and data download/upload functions
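Length-weight coefficient retrieval feeds the standard allometric relationship W = a * L^b (FishBase convention: length in cm, weight in g). A minimal worked example, with illustrative coefficient values rather than real database output:

```r
# Standard length-weight relationship: weight in grams from length in cm
lw_weight_g <- function(length_cm, a, b) a * length_cm^b

# e.g. 10 fish of 25 cm with illustrative coefficients a = 0.01, b = 3
total_kg <- 10 * lw_weight_g(25, a = 0.01, b = 3) / 1000
total_kg  # 1.5625
```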
# peskas.zanzibar.data.pipeline 2.0.0

## Major Changes

- Complete overhaul of the data pipeline architecture
- Added PDS (Pelagic Data Systems) integration:
  - New trip ingestion and preprocessing functionality
  - GPS track data processing capabilities
- Implemented MongoDB export and storage functions
- Removed renv dependency management for improved reliability
- Updated Docker configuration for more robust builds
# peskas.zanzibar.data.pipeline 1.0.0

# peskas.zanzibar.data.pipeline 0.2.0

## New features
Added the validation step and updated the preprocessing step for WCS Kobo survey data; see the `preprocess_wcs_surveys()` and `validate_wcs_surveys()` functions. Currently, validation flags for catch weight, length, and market values are obtained using the median absolute deviation (MAD) method, leveraging the `k` parameter of the `univOutl::LocScaleB()` function. To spot outliers accurately, validation is performed by gear type and species.

N.B. VALIDATION PARAMETERS ARE NOT YET TUNED
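The per-group MAD approach can be sketched in base R as below. This mirrors the idea rather than reproducing the package's code (which uses `univOutl::LocScaleB()`, whose `k` parameter scales equivalent bounds); the data and grouping column are invented for illustration.

```r
# Flag values more than k scaled MADs from the group median
flag_outliers <- function(x, k = 3) {
  med <- stats::median(x, na.rm = TRUE)
  dev <- stats::mad(x, na.rm = TRUE)  # scaled MAD (consistent with sd)
  x < med - k * dev | x > med + k * dev
}

catches <- data.frame(
  gear   = c("net", "net", "net", "net", "line"),
  weight = c(10, 12, 11, 60, 5)
)

# applied per gear group, as the validation does per gear type and species
catches$flag <- stats::ave(catches$weight, catches$gear, FUN = flag_outliers)
catches$flag == 1  # only the 60 kg net catch is flagged
```

Grouping matters because a weight that is extreme for one gear type or species may be perfectly ordinary for another.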
# peskas.zanzibar.data.pipeline 0.1.0

Dropped the parent repository code (peskas.timor.pipeline); added infrastructure to download WCS survey data and upload it to cloud storage providers.