Laravel Excel 3.0 was released a while ago and completely redesigned the architecture behind exports. Our next target was to apply the same paradigm shift to imports. Development took some time because we wanted to get it right from the start.
We are now ready to introduce Laravel Excel 3.1!
Just as 3.0 introduced export classes, 3.1 introduces import classes. Let’s start by creating our first import class. As an example we’ll use a users import. We’ll assume that the package is already installed; installation instructions can be found at: https://laravel-excel.maatwebsite.nl/3.1/getting-started/installation.html
The easiest way to create an import class is to use the make:import artisan command.
php artisan make:import UsersImport --model=User
It should have created a UsersImport class in app/Imports. Because we used the --model option, the import class already implements the ToModel concern. We can now implement the model method as follows:
<?php

namespace App\Imports;

use App\User;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\ToModel;

class UsersImport implements ToModel
{
    /**
     * @param array $row
     *
     * @return User|null
     */
    public function model(array $row)
    {
        return new User([
            'name'     => $row[0],
            'email'    => $row[1],
            'password' => Hash::make($row[2]),
        ]);
    }
}
For each row that gets imported, we create a new user. We simply return a new User instance; the saving is handled by the package.
In your UsersController you can now add an import method that calls our UsersImport. Make sure to also register a route for this method.
<?php

namespace App\Http\Controllers;

use App\Imports\UsersImport;
use Maatwebsite\Excel\Facades\Excel;

class UsersController extends Controller
{
    public function import()
    {
        Excel::import(new UsersImport, 'users.xlsx');

        return redirect('/')->with('success', 'All good!');
    }
}
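The route registration for this method could, for example, look like this (the URI is purely illustrative; pick whatever fits your app):

```php
<?php

// routes/web.php — illustrative URI pointing at the import method above
Route::get('/users/import', 'UsersController@import');
```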
Passing the UsersImport object to the Excel::import() method tells the package how to import the file that is passed as the second parameter. The file is expected to be located on your default filesystem disk (see config/filesystems.php).
You can specify another disk, such as your Amazon S3 disk, with the third parameter.
Excel::import(new UsersImport, 'users.xlsx', 's3');
If you let your users upload the document, you can also pass the uploaded file directly.
Excel::import(new UsersImport, request()->file('your_file'));
The import class can be supercharged by adding the Importable trait.
<?php

namespace App\Imports;

use App\User;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\Importable;
use Maatwebsite\Excel\Concerns\ToModel;

class UsersImport implements ToModel
{
    use Importable;

    /**
     * @param array $row
     *
     * @return User|null
     */
    public function model(array $row)
    {
        return new User([
            'name'     => $row[0],
            'email'    => $row[1],
            'password' => Hash::make($row[2]),
        ]);
    }
}
We can now call the import method directly on the import class; no need for a facade.
(new UsersImport)->import('users.xlsx');
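Assuming the trait mirrors the facade’s signature, the same optional disk parameter is available here as well:

```php
// Read the file from the s3 disk instead of the default one.
(new UsersImport)->import('users.xlsx', 's3');
```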
One thing we really wanted to get right from the start was providing the tools to make imports more performant.
Usually, one of the bottlenecks of an import is the database. By default, the package performs one database insert per row. On larger files, however, this can slow down your import considerably.
By implementing the WithBatchInserts concern you can limit the number of queries needed to import the entire file. The batch size determines how many models are inserted at once, which can drastically reduce the import duration.
<?php

namespace App\Imports;

use App\User;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;

class UsersImport implements ToModel, WithBatchInserts
{
    public function model(array $row)
    {
        return new User([
            'name'     => $row[0],
            'email'    => $row[1],
            'password' => Hash::make($row[2]),
        ]);
    }

    public function batchSize(): int
    {
        return 1000;
    }
}
The package will now insert 1000 models in a single query. You’ll have to experiment with the batch size to find the sweet spot for your import.
Importing a large file can have a huge impact on memory usage, as the library will try to load the entire sheet into memory.
To mitigate this increase in memory usage, you can use the WithChunkReading concern. This will read the spreadsheet in chunks and keep the memory usage under control.
<?php

namespace App\Imports;

use App\User;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class UsersImport implements ToModel, WithBatchInserts, WithChunkReading
{
    public function model(array $row)
    {
        return new User([
            'name'     => $row[0],
            'email'    => $row[1],
            'password' => Hash::make($row[2]),
        ]);
    }

    public function batchSize(): int
    {
        return 1000;
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}
The file will now be read in chunks of 1000. In combination with batch inserts this should improve the speed and memory usage of your imports!
If you add the ShouldQueue contract to your import class, each chunk will be queued as a separate job.
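A minimal sketch of such a queued import, assuming chunk reading is enabled (the queued jobs process the file chunk by chunk):

```php
<?php

namespace App\Imports;

use App\User;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class UsersImport implements ToModel, WithChunkReading, ShouldQueue
{
    public function model(array $row)
    {
        return new User([
            'name'     => $row[0],
            'email'    => $row[1],
            'password' => Hash::make($row[2]),
        ]);
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}
```

Each chunk of 1000 rows will then be pushed onto your queue as its own job.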
Laravel Excel 3.1 provides many more goodies to make importing Excel files a breeze. You can read more about them in the official documentation: https://laravel-excel.maatwebsite.nl/3.1/imports/
Originally posted at: https://medium.com/maatwebsite/introducing-laravel-excel-3-1-e478502bf92e
I work at Maatwebsite on weekdays and I am a passionate filmmaker in my free time. Find me on Twitter & Github.