Intro
We have some (in my eyes) old software at the place I work at the moment.
Some tasks still need to be done manually; for instance, we need to insert an XML file into an Oracle table, which then triggers further processing.
I got annoyed by having to do this by hand all the time,
so I decided to automate the process a bit with PowerShell and Oracle's sqlldr.
My knowledge of Oracle is limited, hence the request for a review.
What I did
- I have created a separate table, to which I am inserting my file.
- That table has a trigger that extracts the specified information and inserts it into the correct table (which is not up for review)
- Created a PowerShell script to be able to load files in bulk into the table
Code
The SQL script for creating the table
CREATE TABLE mckdev.ogd_xml_table (
id NUMBER(10),
xml XMLTYPE,
"IN_CON_LOG" NUMBER(1,0),
"ZIS_EVENT" VARCHAR2(35 BYTE),
);
ALTER TABLE mckdev.ogd_xml_table ADD (
CONSTRAINT ogd_xml_table_pk PRIMARY KEY (id)
);
CREATE SEQUENCE mckdev.ogd_xml_table_seq;
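Nothing here populates the id column, so presumably the sequence feeds it via a trigger. The actual trigger isn't shown (and isn't up for review); a minimal sketch of how the sequence and primary key could be wired together, assuming only the table and sequence above, might look like:

```sql
-- Hypothetical BEFORE INSERT trigger: fills id from the sequence
-- when the caller supplies no id. The real trigger additionally
-- copies data into the target table, which is out of scope here.
CREATE OR REPLACE TRIGGER mckdev.ogd_xml_table_bir
BEFORE INSERT ON mckdev.ogd_xml_table
FOR EACH ROW
WHEN (new.id IS NULL)
BEGIN
  :new.id := mckdev.ogd_xml_table_seq.NEXTVAL;
END;
/
```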
The load_data.ctl
load data
infile ''
append
into table ogd_xml_table
fields
(
filename FILLER CHAR(100),
xml lobfile( filename) terminated by eof
)
The bulkloader.ps1
$dataFiles = @(
"filelist21.dat"
"filelist22.dat",
"filelist23.dat",
"filelist25.dat",
"filelist121a.dat",
"filelist121b.dat",
"filelist122a.dat",
"filelist122b.dat"
)
$ctlFile = "$PSScriptRoot\load_data.ctl"
foreach ($f in $dataFiles) {
(Get-Content $ctlFile) -Replace "infile '[^']*'", "infile '$($PSScriptRoot)\$($f)'" | Out-File $ctlFile -Encoding ASCII
sqlldr mckdev/$SecretPassword@$SecretDatabase control=$ctlFile
}
An example filelist.dat only holds a reference to another XML file.
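To make that concrete: the control file declares `filename FILLER CHAR(100)` and `xml lobfile(filename) terminated by eof`, so each line of the .dat file is a path, and sqlldr loads the *contents* of that file into the XMLTYPE column. A hypothetical filelist.dat (the real paths are not shown here) would contain a single line such as:

```
C:\data\xml\some_event.xml
```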
Questions
Currently I need multiple .dat files to be able to load them in bulk, one for each XML file I want to load into the table. Is there any way around this?
Is this a correct approach or would you have done things differently?
2 Answers
As the ctl file can't be parameterized, you'll have to create one for each infile.
As it is quite small, I'd create it on the fly from a here-string with the format operator.
This sample script just echoes to the screen; the commands to write the ctl file and execute sqlldr are commented out.
## Q:\Test2019円01円24円\CR_212133.ps1
$dataFiles = @(
"filelist21.dat"
"filelist22.dat",
"filelist23.dat",
"filelist25.dat",
"filelist121a.dat",
"filelist121b.dat",
"filelist122a.dat",
"filelist122b.dat"
)
$ctlFile = "$PSScriptRoot\load_data.ctl"
foreach ($dataFile in $dataFiles) {
@'
load data
infile '{0}'
append
into table ogd_xml_table
fields
(
filename FILLER CHAR(100),
xml lobfile( filename) terminated by eof
)
'@ -f $dataFile #| Out-File $ctlFile -Encoding ASCII
#sqlldr mckdev/$SecretPassword@$SecretDatabase control=$ctlFile
}
> Q:\Test2019円01円24円\CR_212133.ps1
load data
infile 'filelist21.dat'
append
into table ogd_xml_table
fields
(
filename FILLER CHAR(100),
xml lobfile( filename) terminated by eof
)
----------------------------------------------------------------------
load data
infile 'filelist22.dat'
append
into table ogd_xml_table
fields
(
filename FILLER CHAR(100),
xml lobfile( filename) terminated by eof
)
----------------------------------------------------------------------
%<...snip...>%
After looking back at it again, I realized I didn't need all those different dataFiles, as each dataFile holds only a single reference with the full path to an XML file.
Instead, I can get the path for each XML file and overwrite a single dataFile with the path of the file I want to load.
- This makes adding another XML file to load as easy as dropping it into one of the directories the files are read from
- And it removes the need for multiple dataFiles
$xmlDirectories = @(
"directory1",
"directory2"
)
$dataFile = "$PSScriptRoot\filelist.dat"
$ctlFile = "$PSScriptRoot\load_data.ctl"
foreach ($dir in $xmlDirectories) {
foreach ($f in Get-ChildItem -Path "$PSScriptRoot\$dir") {
$f.FullName | Out-File $dataFile -Encoding ASCII
sqlldr mckdev/$SecretPassword@$SecretDatabase control=$ctlFile
}
}
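Going one step further (an untested sketch, assuming the files all have a .xml extension and the same placeholders as above): since the control file reads one path per line and loads each referenced file as a LOB, every path can be written into a single filelist.dat and sqlldr invoked just once, instead of once per file:

```powershell
# Collect the full path of every XML file across all directories
# and write them into one data file, one path per line.
$xmlDirectories = @(
    "directory1",
    "directory2"
)
$dataFile = "$PSScriptRoot\filelist.dat"
$ctlFile  = "$PSScriptRoot\load_data.ctl"

$xmlDirectories |
    ForEach-Object { Get-ChildItem -Path "$PSScriptRoot\$_" -Filter *.xml } |
    Select-Object -ExpandProperty FullName |
    Out-File $dataFile -Encoding ASCII

# A single sqlldr run now loads every referenced XML file.
sqlldr mckdev/$SecretPassword@$SecretDatabase control=$ctlFile
```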