Problem with local.lookup validation in entity.documentTemplate

I am trying to load Document Templates using the Kettle OpenVPMS Loader, and I am getting "Failed to validate Type of Document Template: errorMessage" errors on the Type/archetype field.

This has a local.lookup assertion.  If I set <entity.documentTemplate>archetype to "Report" or "Work in Progress Charges" there is no error, but "Reminder Report", "Patient Letter", and "Patient Form" fail.

Looking at the local.lookup values, I expect that the first two work because upper-casing them and replacing any spaces with underscores matches the set values. However, the values for the last three are act.patientReminder, act.patientDocumentLetter and act.patientDocumentForm respectively.
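A sketch of the naming convention the first two values appear to follow - lookup codes derived from the display name by upper-casing and replacing spaces with underscores. (This is an assumption inferred from the observations above, not taken from the OpenVPMS source.)

```python
def to_lookup_code(name: str) -> str:
    """Derive a lookup code from a display name by upper-casing
    and replacing spaces with underscores (assumed convention)."""
    return name.upper().replace(" ", "_")

print(to_lookup_code("Report"))                    # REPORT
print(to_lookup_code("Work in Progress Charges"))  # WORK_IN_PROGRESS_CHARGES
```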

What value/recipe should I use?

Regards, Tim G

Re: Problem with local.lookup validation in ...

You should be setting the archetype node to the "name" part of the lookup, rather than the "value".

So use "act.patientDocumentForm" instead of "Patient Form".
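For the three failing values, the mapping between the Type display names and the lookup names would be as follows (the right-hand values are taken from the original post above):

```python
# Display name -> lookup name, as listed earlier in the thread.
# Use the right-hand values in the archetype column of the data file.
TYPE_TO_ARCHETYPE = {
    "Reminder Report": "act.patientReminder",
    "Patient Letter": "act.patientDocumentLetter",
    "Patient Form": "act.patientDocumentForm",
}

print(TYPE_TO_ARCHETYPE["Patient Form"])  # act.patientDocumentForm
```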

-Tim

Re: Problem with local.lookup validation in ...

Tim - this is what I initially did. However, when it didn't work, I started experimenting and found that "Report" and "Work in progress charges" did work, BUT the archetype names do get the validation error. Exhibits 1, 2 & 3 below [snippets showing the transform and error log, the OPV Loader mapping, and the content of the data file]. Regards, Tim G

Re: Problem with local.lookup validation in ...

Can you attach your transforms so I can try and reproduce locally?

Re: Problem with local.lookup validation in ...

Stupid me - I thought about doing that, but concluded 'Tim A will look at the pictures and immediately point out where I have gone wrong'. Attached are the transform and the csv data file. (The csv has been renamed to .txt to allow it to be attached - but the transform will look for .csv, so you need to rename it back.) You will also find some extra columns in the csv file where I was 'parking' data as I tested what worked and what did not. You will also want to uncheck the 'skip processed' box in the Loader step so that you can repeat the load without clearing the database.

Regards, Tim G

Attachments:
- docTemplate.txt (1.93 KB)
- load_DocTemplates_057.ktr (8.56 KB)

Re: Problem with local.lookup validation in ...

The docTemplate.txt above doesn't appear to correspond with your screenshot. Here's a data preview:

[data preview screenshot not reproduced]

Re: Problem with local.lookup validation in ...

Sorry Tim A - I sent you the version that uses "Report" as the type and thus works.  Herewith (again as .txt) the version corresponding to my screen preview.

Apologies, Tim G

Attachments:
- docTemplate.txt (1.26 KB)

Re: Problem with local.lookup validation in ...

OK, thanks. This is a bug in the plugin's handling of lookup.local assertions.

Raised as https://openvpms.atlassian.net/browse/OVPMS-1290

-Tim A

Re: Problem with local.lookup validation in ...

Bit of a thread hijack here... but is it even possible to load documents like this?

I looked at the loader step you used for loading patientDocumentAttachments, Tim: you specified the filename, with full path, against <act.patientDocumentAttachment>fileName. I have attempted this in the past and, while the act gets created, I have not been able to get the attached file to actually load. I ended up assuming that we need to use docload for this.

Re: Problem with local.lookup validation in ...

OpenVPMS compresses documents on storage, and decompresses them on retrieval.

This isn't supported by the Kettle plugin, so you need to use the docloader to load the content.
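A minimal round-trip sketch of why the plugin can't simply insert the raw content, using zlib/DEFLATE purely as a stand-in (the thread doesn't state which compression scheme OpenVPMS actually uses):

```python
import zlib

def store(content: bytes) -> bytes:
    # Compress on storage (stand-in for whatever OpenVPMS does internally).
    return zlib.compress(content)

def retrieve(stored: bytes) -> bytes:
    # Decompress on retrieval.
    return zlib.decompress(stored)

original = b"report body " * 200
assert retrieve(store(original)) == original
assert len(store(original)) < len(original)  # repetitive content compresses well
```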

Support for this could be added to the plugin; if there's any interest, I'll set up a project.

-Tim A

Re: Problem with local.lookup validation in ...

Ben - some time ago I tried a test load of 5 attachments - worked like a train. I brought back 5GB of attachments when I went to Hong Kong at Christmas. I will try loading all of these in the next couple of days and let you know how it goes. [I want to know how much the database grows and how long the load takes. Note that I will probably try to run multiple loads in parallel - I already run genbalance as a set of 4 jobs running in parallel and it's significantly faster than running just one.]

Note however, that I have yet to use templateload to load my OO and jrxml files into the Document Templates that I am kettle loading above.  Again this is on the 'must do tomorrow' list.

Regards, TimG

Re: Problem with local.lookup validation in ...

I am assuming you're using docload, Tim G.

Here is my new theory: use Kettle to generate investigations for each set of results.

Batch the files using accession numbers, retrievable by parsing each file with a PDF-renaming application that can extract content and rename the file.

Generate the investigation for each set of files, then retrieve the investigation's ID using a SQL select based on the ETL loader ID, and rename the files using a batch script step, injecting variables for the rename.

Then allow docload --byid to do its thing. Does anyone know how docload assesses whether a file is newer or older - does it check the file's creation date, or just the order in which the files are presented to the loader?
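The rename step in this pipeline could be sketched roughly as below. The loader-id-to-investigation-id mapping would come from the SQL select against the ETL log; all names here are hypothetical placeholders, not the real OpenVPMS schema or docload's code.

```python
import os

def rename_for_docload(directory: str, loader_to_act: dict) -> None:
    """Rename source files from their loader id to the investigation
    act id, so that docload --byid can match them. Illustrative only."""
    for name in os.listdir(directory):
        stem, ext = os.path.splitext(name)
        act_id = loader_to_act.get(stem)
        if act_id is not None:
            os.rename(os.path.join(directory, name),
                      os.path.join(directory, str(act_id) + ext))
```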

Re: Problem with local.lookup validation in ...

You can load the documents by name; in this case you populate the fileName node and run docload with the --byname argument. This will look for all acts without attachments, and match the fileName with documents in the source directory.
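A rough sketch of that by-name matching (the function and structure names are illustrative, not docload's actual implementation):

```python
import os

def match_by_name(filenames_without_docs: list, source_dir: str) -> dict:
    """For each act's fileName with no attachment yet, find a file of
    the same name in the source directory. Illustrative only."""
    available = set(os.listdir(source_dir))
    return {name: os.path.join(source_dir, name)
            for name in filenames_without_docs if name in available}
```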

The --byid --overwrite option supports versioning and duplicate checking. If a source file has already been loaded, it will be skipped. If not, and a document has already been loaded, the new document will replace it, and the old one will be versioned.

The --byid option determines if a source file has been loaded by comparing the contents (via checksum) and filename with any existing document linked to the document act. The file timestamp is not used.
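That check could be sketched as follows; CRC32 is used here purely as a stand-in, since the thread doesn't say which checksum docload computes:

```python
import zlib

def already_loaded(src_name: str, src_content: bytes,
                   doc_name, doc_checksum) -> bool:
    """True if the source file matches the document already linked to
    the act, by filename and content checksum. Illustrative only."""
    if doc_name is None:
        return False  # no document linked to the act yet
    return src_name == doc_name and zlib.crc32(src_content) == doc_checksum
```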

-Tim

Re: Problem with local.lookup validation in ...

Ben - yes, I am using docload in its 'by name' mode. In my situation the RxWorks database has all the attachments specified by name, and the RxWorks/Attachments folder contains some 19,250 attachments. It took docload 53 minutes to load these, and the database size grew by some 5GB. [A large fraction of the attachment volume is jpgs, and hence compression does not help.]

I have also now run templateload to load the OO and jrxml templates and attach them to the Document Templates. I had problems until I realised that the help text was coming from the Java application, and that templateload.bat just needed the file name.

Regards, Tim G

Re: Problem with local.lookup validation in ...

It's actually pretty neat what you can accomplish with PDI 3.2. I ended up using my theory above to generate investigations, then retrieved the investigation IDs and renamed all the files with the IDs. I simply ordered the files by requisition # then date, and used a Unique Rows step to remove dated "versions". In terms of historical data, you don't really need to load partial reports; you just want the final one.

I then used a job script to run docload, passing the source and destination as variables.

Side Note: Wish List:

I wish the Loader would send a "completion call" (or release its rows) back to Pentaho only when it actually completes. I attempted to use blocking steps at one stage so that I could run 3 Loader steps sequentially; however, it appears the rows are passed out of the Loaders almost immediately, hence the blocking step receives them all and releases them before the first Loader has finished processing.
