[OPEN-ILS-DEV] Importing marc records from Sirsi

Robert glibrarysystem at gmail.com
Tue Jun 3 11:29:41 EDT 2008


Well, I guess when all else fails, reboot. That's what I did, and everything
came up fine. Only now the import process is painfully slow: every form of
conversion crawls, and 1,000 records have taken 10 minutes. At this rate,
even if it does import them correctly, it will take a lifetime to get all of
our records into Evergreen. Is there any way to speed it back up? It was
going about 1,000 times faster when I first started the conversion and
import process last week.
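
One thing worth ruling out for the slowdown: OpenSRF logging left at a
verbose level will both throttle an import and balloon osrfsys.log (which
would also fit the 120 GB log mentioned below). A sketch of the relevant
stanza in opensrf_core.xml follows; the element names and path are from
memory of the stock OpenSRF configuration, so verify them against your own
install before editing:

```xml
<!-- opensrf_core.xml (sketch; confirm element names on your install) -->
<logfile>/openils/var/log/osrfsys.log</logfile>
<!-- 1 = ERROR, 2 = WARN, 3 = INFO, 4 = DEBUG.
     Dropping from DEBUG/INFO to WARN cuts log volume dramatically. -->
<loglevel>2</loglevel>
```

Restart the OpenSRF services after changing the level so the new setting
takes effect.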

On Tue, Jun 3, 2008 at 10:42 AM, Robert <glibrarysystem at gmail.com> wrote:

> I checked the osrfsys.log file after the router user couldn't connect, and
> it shows a jabber exception: "could not connect to jabber server:
> Inappropriate ioctl for device" at line 496 of
> OpenSRF::Transport::SlimJabber::Client
> (/openils/lib/perl5/OpenSRF/Transport/SlimJabber/Client.pm).
>
>
> On Tue, Jun 3, 2008 at 10:03 AM, Robert <glibrarysystem at gmail.com> wrote:
>
>> OK, something weird is going on with this import. I started it on Friday
>> and it was still running on Monday. When I came in this morning it had
>> finished, but with an error: Can't locate object method "class_name" via
>> package "HU" (perhaps you forgot to load "HU"?) at pg_loader.pl line 48, <>
>> line 1648134328. Once I saw that, I decided to try a smaller batch; that
>> file had 3,500 records, so I made one with 1,000 records to try. It
>> started off really slowly, so I stopped it and tried to restart all of the
>> Evergreen services, only to be told by the services that there wasn't
>> enough free space to start up. So I started checking around and found that
>> the osrfsys.log file was almost 120 GB in size, full of entries from the
>> previous import. So I erased the log file. It automatically created
>> another osrfsys.log and kept writing entries into it, again from the last
>> import. I finally had to kill the perl process so that I could completely
>> erase the log. Now that I have it erased, I can't get the router user to
>> connect to the jabber server. Can someone give me some insight into why
>> this has happened and what I might be doing wrong to cause it?
>>
>>
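[A likely explanation for the log that "kept coming back" and the space that
was only freed after killing the perl process: on POSIX systems, deleting a
file only removes its directory entry. Any process that still holds the file
open keeps writing to the unlinked inode, and the disk blocks are not
released until that process closes the descriptor or exits. Truncating the
log in place (e.g. `: > osrfsys.log`) releases the space without restarting
anything. A minimal sketch of the unlink behaviour, using throwaway paths:]

```python
import os
import tempfile

# Reproduce the "deleted the log but the disk is still full" behaviour:
# removing a file's directory entry does not free its blocks while some
# process still holds the file open.
path = os.path.join(tempfile.mkdtemp(), "osrfsys.log")
log = open(path, "wb")
log.write(b"x" * 1024)
log.flush()

os.remove(path)             # the name is gone from the directory...
log.write(b"y" * 1024)      # ...but the writer keeps appending happily
log.flush()

# The open descriptor still points at the unlinked inode, which keeps
# growing (and holding disk space) until the process closes it or exits.
size_after_unlink = os.fstat(log.fileno()).st_size
log.close()
```

[This is why killing the perl process was what finally freed the space.]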
>> On Mon, Jun 2, 2008 at 12:22 PM, Dan Scott <denials at gmail.com> wrote:
>>
>>> 2008/6/2 Robert <glibrarysystem at gmail.com>:
>>> > Hey guys, any news on why the copies or volumes might not have copied
>>> > over? Also, can someone tell me, from your experience importing
>>> > records, what is the maximum you imported at once? I tried to import a
>>> > file with 3,500 records in it over the weekend, and it is still running
>>> > and looks to be hung up. Just out of curiosity.
>>>
>>> 1) The steps listed for the Gutenberg records get bibliographic
>>> records into the system, but no call numbers or copies. That's what
>>> the import_demo package tries to demonstrate:
>>> http://svn.open-ils.org/trac/ILS-Contrib/wiki/ImportDemo
>>> The import_demo approach takes you through the steps for getting bib
>>> records into the system, then goes beyond that to parse holdings
>>> statements directly from the MARC21XML of the bib records and generate
>>> call numbers and copies to load into the system. This isn't necessarily
>>> the best approach for getting call numbers and copies into your system;
>>> you're going to have to tailor your approach to the system you're
>>> working with.
>>>
>>> 2) The most bib records I have imported in a single file is somewhere
>>> around 2 million. This weekend I was importing approximately 360,000
>>> bib records from a single file. Note that you really want to be using
>>> the parallel_pg_loader.pl approach (as demonstrated in import_demo) if
>>> you're working on a system with memory constraints.
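
[If smaller batches turn out to help, the safe place to split a binary
MARC21 file is on the record boundary: each record ends with the record
terminator byte 0x1D, which does not occur inside record data (field and
subfield delimiters are 0x1E and 0x1F). Tools such as yaz-marcdump can also
split files; the sketch below is a hand-rolled illustration, with all names
invented for the example:]

```python
# Split binary MARC21 data into batches of N whole records.
# MARC21 records end with the record terminator 0x1D, so splitting on
# that byte keeps every record intact.
RT = b"\x1d"  # MARC21 record terminator

def split_marc(data: bytes, batch_size: int) -> list[bytes]:
    """Group the records in `data` into batches of `batch_size` records."""
    records = [r + RT for r in data.split(RT) if r]
    return [b"".join(records[i:i + batch_size])
            for i in range(0, len(records), batch_size)]
```

[Each batch can then be written to its own file and fed through the
conversion/load pipeline separately.]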
>>>
>>> --
>>> Dan Scott
>>> Laurentian University
>>>
>>
>>
>

