
NAME

bulkmarcimport.pl - Import bibliographic/authority records into Koha

USAGE

 $ export KOHA_CONF=/etc/koha.conf
 $ perl misc/migration_tools/bulkmarcimport.pl -d --commit 1000 \
    --file /home/jmf/koha.mrc -n 3000
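
A similar invocation works for authority records; the file name here is only illustrative:

 $ perl misc/migration_tools/bulkmarcimport.pl -a --commit 1000 \
    --file /path/to/authorities.mrc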

WARNING

Don't use this script before you've entered and checked your MARC parameters tables twice (or more!). Otherwise, the import won't work correctly and you will get invalid data.

DESCRIPTION

-h, --help

Print this version/help screen

-b, --biblios

Type of import: bibliographic records

-a, --authorities

Type of import: authority records

--file=FILE

The FILE to import

-v, --verbose

Verbose mode. 1 prints some information, 2 also dumps each MARC record.

--fk

Turn off foreign key checks during import.

-n=NUMBER

The NUMBER of records to import. If missing, the whole file is imported.

-o, --offset=NUMBER

File offset before importing, i.e. the NUMBER of records to skip.

--commit=NUMBER

The NUMBER of records to import between 'commit' operations.

-l

Write the actions taken for each record, and their status, to the given log file.

--append

If specified, data will be appended to the logfile. If not, the logfile will be erased on each execution.

-t, --test

Test mode: parses the file and reports what it would do, without making any changes.

-s

Skip automatic conversion of MARC-8 to UTF-8. This option is provided for debugging.

-c=CHARACTERISTIC

The CHARACTERISTIC MARC flavour. At the moment, only MARC21 and UNIMARC are supported; MARC21 is the default.

-d, --delete

Delete EVERYTHING related to biblios in the Koha database before importing. Tables affected: biblio, biblioitems, items.

-m=FORMAT

Input file FORMAT: MARCXML or ISO2709 (defaults to ISO2709)

--authtypes

A YAML file listing the authority types and the record field that distinguishes them, so that the correct authtype can be stored for each record.

--yaml

A YAML file in which to store the ids of the imported records.

--filter

A list of fields that will not be imported. Can be any tag from 000 to 999, or a field, subfield and matching subfield value, such as 200avalue.
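
For example, to leave a local field out of the import (the tag and file name here are only illustrative):

 $ perl misc/migration_tools/bulkmarcimport.pl -b --file records.mrc \
    --filter=995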

--insert

If set, only insert records when possible.

--update

If set, only update records (every biblio should have a matching record).

--all

If set, insert or update records as required.

-k, --keepids=<FIELD>

Store the original record ids in FIELD. Useful for authorities, where 001 contains the Koha authid but the original id can hold very valuable information for authorities coming from LOC or BNF; probably useless for biblios.

--match=<FIELD>

The matchpoint to use for deduplication, in the form matchindex,fieldtomatch. fieldtomatch can be either a tag from 001 to 999, or a field plus a list of subfields, such as 100abcde.
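
For example, to deduplicate on the record control number (the index name depends on your search engine configuration and, like the file name, is only illustrative):

 $ perl misc/migration_tools/bulkmarcimport.pl -b --file records.mrc \
    --match=Control-number,001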

-i, --isbn

If set, a search is done on the ISBN and, if the same ISBN is found, the biblio is not added. This is another way to deduplicate. --match and --isbn can both be set.

--cleanisbn

Clean the ISBN fields of incoming biblio records, i.e. remove hyphens. By default, ISBNs are cleaned. Use --nocleanisbn to keep them unchanged.

-x=TAG

Source bib TAG for reporting the source bib number

-y=SUBFIELD

Source SUBFIELD for reporting the source bib number

--idmap=FILE

FILE in which to record the Koha bib id and the source id.
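
For example, to keep track of where each Koha bib came from when the source system stores its bib number in 090$a (tag, subfield and file names are only illustrative):

 $ perl misc/migration_tools/bulkmarcimport.pl -b --file records.mrc \
    -x 090 -y a --idmap /tmp/idmap.txt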

--dedupbarcode

If set, whenever a duplicate barcode is detected, it is removed and the attempt to add the record is retried, thereby giving the record a blank barcode. This is useful when something (usually other software) has set barcodes to be a biblio ID or similar.

--framework

This is the code for the framework that the requested records will have attached to them when they are created. If not specified, then the default framework will be used.
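
For example (the framework code is only illustrative and must exist in your installation):

 $ perl misc/migration_tools/bulkmarcimport.pl -b --file records.mrc \
    --framework=BKS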

--custom=MODULE

This parameter allows you to use a local module with a customize subroutine that is called for each MARC record. If no filename is passed, LocalChanges.pm is assumed to be in the migration_tools subdirectory. You may pass an absolute file name or a file name from the migration_tools directory.
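
A minimal sketch of such a module is shown below. It assumes that customize receives the MARC::Record for each record and that its return value is used as the record to import; check the script in your Koha version for the exact calling convention. The field added here is purely illustrative:

 # LocalChanges.pm (illustrative sketch)
 use strict;
 use warnings;
 use MARC::Field;

 sub customize {
     my ($record) = @_;    # a MARC::Record object

     # Illustrative change: add a local note to every imported record.
     $record->append_fields(
         MARC::Field->new( '590', ' ', ' ', a => 'Migrated with bulkmarcimport' )
     );

     return $record;
 }

 1;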

--marcmodtemplate=TEMPLATE

This parameter allows you to specify the name of an existing MARC modification template to apply as the MARC records are imported (these templates are created in the "MARC modification templates" tool in Koha). If not specified, no MARC modification templates are used (default).

-si, --skip_indexing

If set, do not index the imported records with Zebra or Elasticsearch. Use this when you plan to do a complete reindex of your data after running bulkmarcimport. This can increase performance and avoid unnecessary load.

-sk, --skip_bad_records

If set, check the validity of records before adding them. If a record is invalid, the output of MARC::Lint->check_record is printed and the record is skipped during the import. Without this option, bad records may kill the job.
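
For a large import it is common to combine this with --skip_indexing and rebuild the search index afterwards; on a Zebra installation that might look like the following (file path illustrative):

 $ perl misc/migration_tools/bulkmarcimport.pl -b -si -sk --commit 1000 \
    --file /path/to/records.mrc
 $ perl misc/migration_tools/rebuild_zebra.pl -b -r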
