I prepared a script to create a table with 800 columns, but SQL*Plus and SQL Developer are not reading the whole script; they seem to read only 7499/2499 characters at most. As an alternative, I am thinking of dividing the script into an initial CREATE TABLE with 200 columns, followed by ALTER TABLE statements that add the remaining columns in chunks of 200, roughly as sketched below. My question is: what is the best practice to achieve this? Are there any cleaner alternatives? I searched Google but could not find relevant answers; most similar questions were closed with the suggestion that the table should not have this many columns in the first place and that it is a design issue.
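A scaled-down sketch of that split (4-column chunks instead of 200, with placeholder names, just to show the shape I have in mind):

create table t1 (c1 number, c2 number, c3 number, c4 number);
alter table t1 add (c5 number, c6 number, c7 number, c8 number);
alter table t1 add (c9 number, c10 number, c11 number, c12 number);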
Note: I am trying to replicate a scenario from a legacy system by creating a local database, so I don't have the option to normalize the design.
What is the best way to create a table with 800 columns in a single script?
-
I doubt you'll find "best practices" for this, since creating such a table in the first place is clearly not advised at all. Your split approach should work, though, so why are you looking for alternatives? – Mat, Aug 13, 2015 at 10:22
-
It seemed to me like I was going down the wrong path and that there would be a cleaner approach. If I want to replicate my schema in another database, I won't be able to run a single script to create a replica. That seemed unnatural to me. – Khalid Ansari, Aug 13, 2015 at 11:09
-
Re: "prepared a script to create a table with 800 columns but SQL*Plus and SQL Developer are not reading the whole script" - what version of both tools did you try? You should be able to run your script: @my_very_bad_script.sql – thatjeffsmith, Aug 13, 2015 at 14:06
-
The most columns I have ever seen in a table was 225 columns with 25 indexes. For some reason there were some performance issues... :-) While I can't imagine wanting 800 columns in a table, your approach seems to be the best one that I can think of. What does the SELECT statement look like? – Gandolf989, Aug 13, 2015 at 14:09
-
@thatjeffsmith - I downloaded them last week from the Oracle website, so they should be the latest versions. I will recheck in case I made some other mistake, but I doubt it. How many characters can I put in my CREATE TABLE script? – Khalid Ansari, Aug 13, 2015 at 15:25
4 Answers
The best way to run a very long script is not to load it into the editor (any editor) at all.
Just reference it via @ and run it: so @my_script.sql, and execute that, as in the example below.
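For example, assuming the file is called my_script.sql and sits in the directory you started from (file name and credentials are placeholders), from inside a SQL*Plus session:

SQL> @my_script.sql

or straight from the shell:

$ sqlplus scott/tiger @my_script.sql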
I had to try it myself, because I did not believe SQL*Plus could not handle this. Apparently, it can.
Script for generating the create statement:
[oracle@ora71 ~]$ cat generate.sql
-- suppress all decoration so the spooled file contains only the generated DDL;
-- the query emits 'create table t1 (' + 799 number columns + a closing date column = 800 columns
set echo off feedback off heading off pages 0
spool create.sql
select text from
(
select 'create table t1 (' as text, 0 as position from dual union
select 'columnxxxxxxxxxxxxxxxx' || rownum || ' number, ' as text, rownum as position from dual connect by level < 800 union
select 't date);' as text, null as position from dual
) order by position nulls last;
spool off
exit
Generating the DDL:
[oracle@ora71 ~]$ sqlplus -s bp/bp @generate.sql > /dev/null
[oracle@ora71 ~]$ wc -c create.sql
64881 create.sql
[oracle@ora71 ~]$ tr -d '[:space:]' < create.sql | wc -c
25481
64881 characters, or 25481 without spaces.
Creating the table without any problem:
[oracle@ora71 ~]$ sqlplus -s bp/bp @create.sql
Table created.
exit
-
Your answer is correct and it is exactly what I was looking for, but thatjeffsmith answered it first, so I will mark his comment as the correct answer. Thanks a lot for your help. I had asked this question in the first place for the very same reason: it seemed strange to me that Oracle can't handle a script larger than 7.5k. – Khalid Ansari, Aug 14, 2015 at 7:47
-
If you found Balazs' post helpful, you can upvote it! – Vérace, Aug 14, 2015 at 16:14
Been there, done that, my friend! I had to do it with a system that had 35K (not a typo - that's 35,000) fields - basically arrays of arrays.
I didn't even attempt to recreate the system on my SQL Server database. I connected using my development environment (Delphi at the time, ah... those were the days...) and chopped up the table record by record.
I designed incrementally. I outlined a normalised, multi-table approach (a rough draft) and then, in the first phase, I took some of the major fields and put them into my normalised structure.
Gradually, I slotted the 35K fields into more and more tables (there weren't 35K fields, or anything like it, in my new database across all tables) until I had put all of the data into the new system.
I was really (really) proud of the fact that I wrote a function of 650,000 lines of code (actually, I wrote a programme to write the code) and it worked the first time!
What's your dev. environment? I would suggest that the above approach is the best one. Failing that, can you take a .csv (or similar) dump of the data and then work with that text file? I had a Delphi driver for my old system, so I could communicate directly, but I realise that with legacy systems, this isn't always possible. Final question: is this ETL process a one-off, or do you have to do it on a regular basis?
-
Thanks for sharing the detailed feedback. This is actually a one-time job. Let me explain: I mainly work on Salesforce, where we have objects which correspond to Oracle tables in the backend. One of our objects, because of bad design, has hit the Salesforce limit of 800 fields max. Now we are thinking of redesigning it, but to do that we need to analyse the current data in the table, which is around 900k records. We can't do that in Excel, and we would face other issues doing it in Salesforce, so I tried putting the data into an Oracle table to perform the analysis there. – Khalid Ansari, Aug 13, 2015 at 15:32
-
If @mustaccio's solution works, great. Otherwise, you might need some custom development. Maybe a .csv (or similar) dump? Maybe you can read in subsets using Oracle's external tables (something like the sketch below) and then recombine those subsets into some sort of normalised structure? Can you put a couple of sample records up on the web? – Vérace, Aug 13, 2015 at 16:13
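Something along these lines, off the top of my head (the directory path, file name and column list are all placeholders):

create directory data_dir as '/tmp/data';

create table staging_subset (
id number,
field1 varchar2(100),
field2 varchar2(100)
)
organization external (
type oracle_loader
default directory data_dir
access parameters (
records delimited by newline
fields terminated by ','
)
location ('dump.csv')
);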
-
Thanks for the suggestion. I can't give a sample though, as it is production data. – Khalid Ansari, Aug 14, 2015 at 14:51
I'm not sure what your definition of "best method" is, but here's an option: Oracle 11g and later allow CLOBs as input to EXECUTE IMMEDIATE, so you could write an anonymous block to read your DDL file using DBMS_LOB and pass the contents to EXECUTE IMMEDIATE.
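A minimal sketch of that idea (DDL_DIR and create_table.sql are assumptions: a directory object you would have to create, pointing at the folder holding a file that contains a single CREATE TABLE statement with no trailing semicolon):

declare
 l_file bfile := bfilename('DDL_DIR', 'create_table.sql');
 l_ddl clob;
 l_dest_offset integer := 1;
 l_src_offset integer := 1;
 l_lang_ctx integer := dbms_lob.default_lang_ctx;
 l_warning integer;
begin
 dbms_lob.createtemporary(l_ddl, true);
 dbms_lob.fileopen(l_file, dbms_lob.file_readonly);
 -- load the whole DDL file into the temporary CLOB
 dbms_lob.loadclobfromfile(l_ddl, l_file, dbms_lob.getlength(l_file),
 l_dest_offset, l_src_offset,
 dbms_lob.default_csid, l_lang_ctx, l_warning);
 dbms_lob.fileclose(l_file);
 execute immediate l_ddl; -- 11g+ accepts a CLOB here
 dbms_lob.freetemporary(l_ddl);
end;
/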
-
I need to try this method, and if it works I will mark this as the correct answer. Do you think this DDL file can take 30k characters? – Khalid Ansari, Aug 13, 2015 at 15:29