Laravel Log File Backups to S3

Liza Shulyayeva


SnailLife does a lot of logging for debugging purposes. Aside from the general laravel.log I have separate loggers for each snail, and I also write logs for things like deleted items to their own files.

The problem is all the space this takes up in Laravel's storage folder on my Digital Ocean droplet. If I leave it for a week or two the disk fills up and I'll suddenly find the droplet inaccessible, or recurring commands failing to finish because there's no space left.

Instead of culling the logs more aggressively I decided to set up a backup to Amazon S3. With Laravel 5's filesystems this ended up being a pretty simple process.

First I set up an S3 bucket called snaillife-storage with a user that has getObject, createObject, and deleteObject permissions.
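In IAM terms those roughly correspond to the s3:GetObject, s3:PutObject, and s3:DeleteObject actions, so a minimal policy for the backup user could look something like this (a sketch rather than the exact policy):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:DeleteObject"
            ],
            "Resource": "arn:aws:s3:::snaillife-storage/*"
        }
    ]
}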

I set the S3 credentials in the .env configuration file:
S3_KEY=blah
S3_SECRET=blah
S3_REGION=website-us-east-1
S3_BUCKET=snaillife-storage
Note that I set the region here just in case, but in reality I don't use it. In config/filesystems.php I set up the S3 disk using these credentials (with the region setting removed; I also changed the local disk root to storage_path()):
'local' => [
    'driver' => 'local',
    'root'   => storage_path(),
],

's3' => [
    'driver' => 's3',
    'key'    => env('S3_KEY'),
    'secret' => env('S3_SECRET'),
    'bucket' => env('S3_BUCKET'),
],
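One caveat: the s3 driver relies on the Flysystem S3 adapter, which Laravel 5 doesn't ship with by default. On Laravel 5.1 and later that means pulling in the v3 adapter via Composer:

composer require league/flysystem-aws-s3-v3 ~1.0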
Then I made a new artisan command called BackUpLogs:
<?php

namespace App\Console\Commands;

use App;
use Carbon\Carbon;
use Illuminate\Console\Command;
use Log;
use Storage;

class BackUpLogs extends Command
{
    // Name used to call the command from artisan and the scheduler
    protected $signature = 'BackUpLogs';

    protected $description = 'Back up log files to S3 and delete the local copies.';

    public function handle()
    {
        // Only back up outside the local environment
        if (!App::environment('local')) {
            $localDisk = Storage::disk('local');
            $localFiles = $localDisk->allFiles('logs');
            $cloudDisk = Storage::disk('s3');
            $pathPrefix = 'snailLogs' . DIRECTORY_SEPARATOR . Carbon::now() . DIRECTORY_SEPARATOR;
            foreach ($localFiles as $file) {
                $contents = $localDisk->get($file);
                $cloudLocation = $pathPrefix . $file;
                $cloudDisk->put($cloudLocation, $contents);
                $localDisk->delete($file);
            }
        } else {
            Log::info('BackUpLogs not backing up in local env');
        }
    }
}
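Since the command's signature is BackUpLogs, it can be run manually first to check that files actually land in the bucket:

php artisan BackUpLogs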
Note that the directory you specify in $localDisk->allFiles($dir) should be relative to the root path of the local disk - an absolute path does not work.

In Kernel.php I set this to run every hour:
$schedule->command('BackUpLogs')->cron('5 * * * *');
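In context that means registering the command and adding the schedule entry in app/Console/Kernel.php, roughly like this (a sketch assuming the default Laravel 5 Kernel layout):

// app/Console/Kernel.php
protected $commands = [
    \App\Console\Commands\BackUpLogs::class,
];

protected function schedule(Schedule $schedule)
{
    // Five minutes past every hour
    $schedule->command('BackUpLogs')->cron('5 * * * *');
}

The scheduler itself only fires if the standard Laravel cron entry is set up on the server:

* * * * * php /path/to/artisan schedule:run >> /dev/null 2>&1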
So now every hour all the files in my storage/logs directory are backed up to my S3 bucket and deleted from the local disk.