XML File processing

on 16.01.2008 21:59:03 by Umeshnath

Hi,
I have a very complex and huge XML file whose contents I have to store in a
number of different database tables. I want to know which approach will give
the best performance.

1) Loading it into a DataSet and then storing it into the DB tables

2) Processing the XML file using the XML classes (XmlTextReader, XmlDocument)
and storing it into the DB tables.

Please advise me which has better performance.

Also let me know if there are any alternatives.

Re: XML File processing

on 17.01.2008 13:34:54 by Kevin Spencer

You have to process the XML file using XML classes in either case, since you
can't put the XML data into a DataSet without parsing it, so the obvious
solution is number 2.
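
A minimal sketch of that approach, assuming the file boils down to a flat
list of <item> elements with <name> and <value> children going into a single
table - all of those names (and the connection string) are placeholders, not
something from your post:

using System.Data;
using System.Data.SqlClient;
using System.Xml;

class ImportSketch
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection(
            "Server=.;Database=Test;Integrated Security=SSPI"))
        using (XmlReader reader = XmlReader.Create("big.xml"))
        {
            conn.Open();
            using (SqlCommand cmd = new SqlCommand(
                "INSERT INTO SomeTable (Name, Value) VALUES (@name, @value)",
                conn))
            {
                cmd.Parameters.Add("@name", SqlDbType.NVarChar, 100);
                cmd.Parameters.Add("@value", SqlDbType.NVarChar, 400);

                // Forward-only streaming: only one <item> is in memory at a
                // time, unlike loading everything into a DataSet/XmlDocument.
                while (reader.ReadToFollowing("item"))
                {
                    using (XmlReader item = reader.ReadSubtree())
                    {
                        item.ReadToFollowing("name");
                        cmd.Parameters["@name"].Value =
                            item.ReadElementContentAsString();
                        item.ReadToFollowing("value");
                        cmd.Parameters["@value"].Value =
                            item.ReadElementContentAsString();
                    }
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}

For a file of any size you would normally also wrap the loop in a
SqlTransaction and/or batch the inserts.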

--
HTH,

Kevin Spencer
Chicken Salad Surgeon
Microsoft MVP

"Umeshnath" wrote in message
news:99A86CB1-CED7-4D0B-B901-4CE1F43E7D8E@microsoft.com...

Re: XML File processing

on 17.01.2008 14:02:17 by Marc Gravell

Actually, there are lots of options here if you are using an xml-enabled
database (like SQL Server 2005):

3: (if the first-level children all represent similar entities - i.e.
.........)
Use SqlBulkCopy to get the individual rows into the database as xml
columns, then split from there

Use XmlReader to parse the data as far as each child, and spoof an
IDataReader - i.e. each element in the xml gets brought back as a row
for the spoof data-reader. I have posted something similar before:

http://groups.google.co.uk/group/microsoft.public.dotnet.languages.csharp/browse_thread/thread/84d08dd9778efd77/91c7a20056ffe8e1
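
A rough sketch of that, buffering batches into a DataTable rather than
spoofing IDataReader (the link above covers the fully streaming version);
the element name, staging table and batch size are just placeholders:

using System.Data;
using System.Data.SqlClient;
using System.Xml;

class BulkCopySketch
{
    static void Main()
    {
        DataTable batch = new DataTable();
        batch.Columns.Add("ItemXml", typeof(string));

        using (SqlConnection conn = new SqlConnection(
            "Server=.;Database=Test;Integrated Security=SSPI"))
        using (XmlReader reader = XmlReader.Create("big.xml"))
        {
            conn.Open();
            using (SqlBulkCopy bcp = new SqlBulkCopy(conn))
            {
                // Staging table with a single xml column; the real tables
                // get populated server-side afterwards ("split from there").
                bcp.DestinationTableName = "ItemStaging";
                bcp.ColumnMappings.Add("ItemXml", "ItemXml");

                reader.MoveToContent();               // the document element
                if (reader.ReadToDescendant("item"))  // first child entity
                {
                    while (reader.NodeType == XmlNodeType.Element
                           && reader.Name == "item")
                    {
                        // One row per child element; ReadOuterXml advances
                        // the reader past the element it returns.
                        batch.Rows.Add(reader.ReadOuterXml());
                        reader.MoveToContent();   // skip whitespace between items

                        if (batch.Rows.Count == 10000) // flush to bound memory use
                        {
                            bcp.WriteToServer(batch);
                            batch.Clear();
                        }
                    }
                }
                if (batch.Rows.Count > 0)
                {
                    bcp.WriteToServer(batch);
                }
            }
        }
    }
}

Splitting the xml column out into the real tables then happens in the
database, which keeps the client side simple.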

4: upload the entire xml file as a single CLOB (using streaming
methods), and then use the database to shred it into tables
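
On SQL Server 2005 that could look something like this - the xml parameter
is shredded with nodes()/value() inside one INSERT, and the table, column
and element names are again placeholders:

using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.Xml;

class ShredOnServerSketch
{
    static void Main()
    {
        const string sql = @"
INSERT INTO SomeTable (Name, Value)
SELECT  x.item.value('(name)[1]',  'nvarchar(100)'),
        x.item.value('(value)[1]', 'nvarchar(400)')
FROM    @doc.nodes('/root/item') AS x(item);";

        using (SqlConnection conn = new SqlConnection(
            "Server=.;Database=Test;Integrated Security=SSPI"))
        using (XmlReader reader = XmlReader.Create("big.xml"))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            // SqlXml can be built straight from an XmlReader, so the client
            // never has to hold the document as one giant string.
            cmd.Parameters.Add("@doc", SqlDbType.Xml).Value = new SqlXml(reader);
            conn.Open();
            cmd.CommandTimeout = 0;   // shredding a big document can take a while
            cmd.ExecuteNonQuery();
        }
    }
}

The same shredding SQL can of course live in a stored procedure, so the
client only streams the document up and makes a single call.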

(unrelated to the database knowing about xml)
5: use xslt to process the xml into a series of simpler files (1 per
table) - perhaps CSV or TSV? Then use your database's bulk load
function to import the data
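
The client-side half of that is tiny - XslCompiledTransform does the
flattening; the stylesheet and output file names below are placeholders,
and the XSLT itself just needs method="text" output emitting one line per
row:

using System.Xml.Xsl;

class XsltFlattenSketch
{
    static void Main()
    {
        XslCompiledTransform xslt = new XslCompiledTransform();
        xslt.Load("items-to-tsv.xslt");          // stylesheet producing tab-separated text
        xslt.Transform("big.xml", "items.tsv");  // one output file per target table
    }
}

The resulting file then goes through bcp, BULK INSERT or whatever bulk
loader your database provides.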

---

I guess the main point here is that, much as I like them, sometimes an
object model (of any kind) simply isn't the right way to handle large
data imports.

Marc