I’m running a Laravel application that generates regular database and file backups. Right now every backup sits on the server, gradually eating up disk space. I need the backup flow tightened so that:

• Only the most recent backup is retained locally.
• Every new backup is pushed automatically to a Google Cloud Storage bucket.
• The process runs hands-off through Laravel’s scheduler/cron, with clear logging and failure alerts.

You will create or configure the GCS bucket (including service account keys and IAM roles), adjust my backup configuration files (.env and config/backup.php, or the package you prefer), and test the full cycle: local prune, remote upload, and restore validation.

Please share examples of similar Laravel + Google Cloud Storage work you’ve completed; that practical experience will guide my hiring decision. Target completion is within a month, but sooner is welcome if quality is maintained.
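For reference, the bucket and service-account setup could look roughly like the following gcloud sketch. This is an infrastructure outline, not a finished script: the project ID, bucket name, region, and service-account name are all placeholders, and `roles/storage.objectAdmin` is one reasonable choice of role for an account that only writes and prunes backup objects.

```
# Sketch of the GCS side, using the gcloud CLI; MY_PROJECT, the bucket
# name, and the region are placeholders to be replaced with real values.
gcloud storage buckets create gs://my-app-backups --location=us-central1

# Dedicated service account for the backup job.
gcloud iam service-accounts create laravel-backup

# Grant object read/write/delete on storage (objectAdmin is an assumption;
# a narrower custom role could be used instead).
gcloud projects add-iam-policy-binding MY_PROJECT \
  --member="serviceAccount:laravel-backup@MY_PROJECT.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"

# Key file that the Laravel app will reference from .env.
gcloud iam service-accounts keys create storage-key.json \
  --iam-account=laravel-backup@MY_PROJECT.iam.gserviceaccount.com
```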
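On the Laravel side, the GCS bucket would be exposed as a filesystem disk. A minimal sketch, assuming the spatie/laravel-google-cloud-storage driver (the `gcs` driver name and the env keys below come from that package; the actual package choice is up to the freelancer):

```php
// config/filesystems.php — sketch only; assumes a package providing a
// 'gcs' filesystem driver, e.g. spatie/laravel-google-cloud-storage.
'disks' => [
    'gcs' => [
        'driver'        => 'gcs',
        'key_file_path' => env('GOOGLE_CLOUD_KEY_FILE'),      // path to the service account JSON
        'project_id'    => env('GOOGLE_CLOUD_PROJECT_ID'),
        'bucket'        => env('GOOGLE_CLOUD_STORAGE_BUCKET'),
    ],
],
```

The matching `.env` entries would then hold the key file path, project ID, and bucket name, keeping credentials out of version control.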
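Since the brief mentions config/backup.php, the backup package is presumably spatie/laravel-backup (which ships that file). A hedged sketch of the relevant sections, under that assumption; the mail address is a placeholder, and note the caveat in the cleanup comment:

```php
// config/backup.php — partial sketch, assuming spatie/laravel-backup.
return [
    'backup' => [
        'name' => env('APP_NAME', 'laravel-backup'),
        'destination' => [
            // Each new backup is written to both the local disk and GCS.
            'disks' => ['local', 'gcs'],
        ],
    ],

    'notifications' => [
        // Failure alerts; the package also supports Slack and Discord.
        'mail' => ['to' => 'admin@example.com'], // placeholder address
    ],

    'cleanup' => [
        'strategy' => \Spatie\Backup\Tasks\Cleanup\Strategies\DefaultStrategy::class,
        'default_strategy' => [
            // Caveat: the built-in strategy prunes every destination disk
            // the same way, so "only the newest backup locally, more on
            // GCS" likely needs a custom CleanupStrategy or a separate
            // local prune step — worth confirming with the freelancer.
            'keep_all_backups_for_days' => 1,
        ],
    ],
];
```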
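The hands-off part would come from Laravel's scheduler. A sketch of the schedule definition, again assuming spatie/laravel-backup's `backup:run` and `backup:clean` artisan commands; times and the log message are illustrative only:

```php
// app/Console/Kernel.php — scheduling sketch; backup:run and backup:clean
// are spatie/laravel-backup commands, assuming that package is used.
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Support\Facades\Log;

protected function schedule(Schedule $schedule): void
{
    $schedule->command('backup:run')
        ->dailyAt('01:30')
        ->onFailure(fn () => Log::error('Nightly backup run failed'));

    // Prune old backups after the new one has been taken.
    $schedule->command('backup:clean')
        ->dailyAt('02:30');
}
```

The single server cron entry `* * * * * php artisan schedule:run` then drives both commands, and the `onFailure` hook is one place to attach the requested failure alerts.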