ApoAlly Insights Setup#

Debian/Ubuntu#

Bash Upload Script
apoally-insights-upload.sh
#!/bin/bash
# apoally-insights-upload.sh
#
# This script automatically searches for an optional schema file (.json or .csvs)
# with the same base name as the CSV/TSV file. JSON takes priority over CSVS if both exist.
# If no schema file is found, only the CSV/TSV file is uploaded. If a directory is provided,
# the script processes all .csv and .tsv files within it (recursively), each with its optional schema file.
#
# Usage:
#   ./apoally-insights-upload.sh <path_to_csv_or_tsv_file_or_directory>
#
# Environment Variables:
#   API_URL - The URL of the CSV upload endpoint (default: https://api.apoally.de/insights/upload)
#   API_KEY - The API key used for authentication (default: YOUR-API-KEY)

# Fall back to the documented defaults when the environment variables are unset
API_URL="${API_URL:-https://api.apoally.de/insights/upload}"
API_KEY="${API_KEY:-YOUR-API-KEY}"

# Check if exactly one argument is provided
if [ "$#" -ne 1 ]; then
    echo "Usage: $0 <path_to_csv_or_tsv_file_or_directory>"
    exit 1
fi

TARGET="$1"

upload_single_file() {
    local file_path="$1"
    local base="${file_path%.*}"

    # Detect optional schema file with the same base name
    local schema_file=""
    if [ -f "${base}.json" ]; then
        schema_file="${base}.json"
    elif [ -f "${base}.csvs" ]; then
        schema_file="${base}.csvs"
    fi

    echo "Uploading file: $file_path"
    if [ -n "$schema_file" ]; then
        echo "Found optional schema file: $schema_file"
        curl -X POST \
             -H "X-API-Key: $API_KEY" \
             -F "files=@${file_path}" \
             -F "files=@${schema_file}" \
             "$API_URL"
    else
        # No schema file, upload only the CSV/TSV
        curl -X POST \
             -H "X-API-Key: $API_KEY" \
             -F "files=@${file_path}" \
             "$API_URL"
    fi
    echo
}

# If TARGET is a file, upload it and its optional schema
if [ -f "$TARGET" ]; then

    # Extract the extension and allow both .csv and .tsv (case-insensitive,
    # matching the -iname checks used in directory mode below)
    extension="${TARGET##*.}"
    extension="${extension,,}"
    if [ "$extension" != "csv" ] && [ "$extension" != "tsv" ]; then
        echo "Error: '$TARGET' is neither a .csv nor a .tsv file."
        exit 1
    fi

    upload_single_file "$TARGET"

# If TARGET is a directory, process all .csv and .tsv files within it recursively
elif [ -d "$TARGET" ]; then

    echo "Processing directory: $TARGET"
    while IFS= read -r -d '' file; do
        upload_single_file "$file"
    done < <(find "$TARGET" -type f \( -iname '*.csv' -o -iname '*.tsv' \) -print0)

else
    echo "Error: '$TARGET' is not a valid file or directory"
    exit 1
fi
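
For example, given the illustrative directory layout below (all file names are hypothetical), sales.csv is uploaded together with sales.json, the sales.csvs schema is ignored because a .json schema with the same base name exists, and inventory.tsv is uploaded on its own:

data/
├── sales.csv
├── sales.json
├── sales.csvs
└── inventory.tsv

./apoally-insights-upload.sh data/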

Installation and Configuration Guide#

  1. Create and Save the Script
    Save the Bash script apoally-insights-upload.sh in a directory of your choice, e.g., /usr/local/bin/:
sudo nano /usr/local/bin/apoally-insights-upload.sh

Paste the script content and save the file.

  2. Make it Executable
    Ensure the script is executable:
sudo chmod +x /usr/local/bin/apoally-insights-upload.sh
  3. Configure API Key
    Open the script and replace YOUR-API-KEY with your actual API key.
sudo nano /usr/local/bin/apoally-insights-upload.sh

Modify the line:

API_KEY="YOUR-API-KEY"

Save and close the file.
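
Alternatively, because the script only uses the hard-coded value as a fallback, you can leave the file unchanged and pass the key via the API_KEY environment variable when running the script manually (the key shown is a placeholder):

API_KEY="your-actual-api-key" /usr/local/bin/apoally-insights-upload.sh /path/to/your/csv_or_tsv_files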

  4. Set Up a Cronjob
    To run the script regularly (no more often than every 15 minutes), create a cronjob:
crontab -e

Add the following line to execute the script every 15 minutes:

*/15 * * * * /usr/local/bin/apoally-insights-upload.sh /path/to/your/csv_or_tsv_files >> /var/log/apoally-upload.log 2>&1

Replace /path/to/your/csv_or_tsv_files with the path to your file or directory.
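
If you use the environment-variable approach from step 3, note that cron does not inherit your shell environment; on Debian/Ubuntu's cron you can typically set the variable at the top of the crontab instead (placeholder key shown):

API_KEY=your-actual-api-key
*/15 * * * * /usr/local/bin/apoally-insights-upload.sh /path/to/your/csv_or_tsv_files >> /var/log/apoally-upload.log 2>&1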

  5. Check the Log File
    The script’s output is saved in /var/log/apoally-upload.log. Check the log file regularly to ensure the script is functioning correctly:
tail -f /var/log/apoally-upload.log
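
Since the cronjob appends to this file indefinitely, consider rotating it. A minimal logrotate sketch (the retention values are assumptions; adjust them to your policy), saved for example as /etc/logrotate.d/apoally-upload:

/var/log/apoally-upload.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}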
  6. Test Run
    Run the script manually to ensure everything is set up correctly:
/usr/local/bin/apoally-insights-upload.sh /path/to/your/csv_or_tsv_files

Check the output and the log file for errors.
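
To double-check that the cronjob from step 4 is registered for the current user, list the crontab:

crontab -l | grep apoally-insights-upload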

Note#

  • Ensure your system has a stable internet connection so the data can be uploaded successfully (a quick connectivity check is sketched below).
  • Check the cronjob execution and the log file regularly to catch issues early.
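
A quick way to verify that the upload endpoint is reachable from the machine is to request it with curl; the exact HTTP status returned for an unauthenticated GET request depends on the API, but receiving any status code at all confirms basic connectivity:

curl -sS -o /dev/null -w "%{http_code}\n" "https://api.apoally.de/insights/upload"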