Wednesday, March 28, 2012

How to enforce precedence at the data flow level?

I wish to use the same data to update 2 different tables.

There is no green arrow (success) output from the OLE DB Destination, so I can't attach another component after the first insert.

This means I have to use the Multicast to 'copy' the data prior to the first table insert.

I can then use the data to perform inserts to both tables.

However, there is an FK constraint between these two tables, so I need to wait until the first table insert has finished before performing the second table insert.

How can I do this? How can I make the second insert dependent on the first?
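To make the ordering requirement concrete, here is a minimal sketch using Python's sqlite3 with hypothetical `parent`/`child` tables standing in for the two destination tables (the table and column names are illustrative, not from the original post):

```python
import sqlite3

# Hypothetical parent/child tables standing in for the two destination tables.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite leaves FK enforcement off by default
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE child (id INTEGER PRIMARY KEY, "
             "parent_id INTEGER REFERENCES parent(id))")

# Inserting the child row before its parent exists violates the FK...
try:
    conn.execute("INSERT INTO child (id, parent_id) VALUES (10, 1)")
except sqlite3.IntegrityError as e:
    print("child-first insert fails:", e)

# ...so the parent insert has to complete before the child insert starts.
conn.execute("INSERT INTO parent (id) VALUES (1)")
conn.execute("INSERT INTO child (id, parent_id) VALUES (10, 1)")
print("ordered inserts succeed")
```

This is exactly the dependency a Multicast inside a single data flow cannot guarantee: both outputs run concurrently, so the child rows can arrive before their parents.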

Hi ya,

Unfortunately there is no precedence control at the data flow level. Your best bet would be to split the work into two data flows and connect them with an On Success precedence constraint.

If you do want to run it in a transaction, put both data flows in a Sequence Container, set the container's TransactionOption property to Required, and leave both data flows set to Supported.
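The transactional behaviour this gives you can be sketched in miniature with Python's sqlite3 (hypothetical tables; the connection context manager stands in for the Sequence Container's transaction):

```python
import sqlite3

# If the second (child) load fails, the first (parent) load is rolled back too.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE child (id INTEGER PRIMARY KEY, "
             "parent_id INTEGER REFERENCES parent(id))")

try:
    with conn:  # one transaction spanning both "data flows"
        conn.executemany("INSERT INTO parent (id) VALUES (?)", [(1,), (2,)])
        # parent_id 99 has no parent row, so this second load fails...
        conn.executemany("INSERT INTO child (id, parent_id) VALUES (?, ?)",
                         [(10, 1), (11, 99)])
except sqlite3.IntegrityError:
    pass  # ...and the context manager rolls the whole transaction back

print(conn.execute("SELECT COUNT(*) FROM parent").fetchone()[0])  # 0: parent load undone
```

The point of Required on the container is the same: either both loads commit, or neither does.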

Sorry, but I couldn't find anything else... perhaps the big guys will answer if there is anything else?

Hope that helps

Cheers


Rizwan

|||That's a pain.

That means I'll have to build the dataset up all over again... and no, I daren't cut and paste, because of the endless XML/serialization errors I always seem to get at random that completely blow up the IDE.

I'm surprised there is no 'rendezvous' component in the toolbox.

Do you know if anyone has written a custom component like this?|||

Sure it is, but then again you always have the option of using a Script Task and programmatically doing the importing and saving of the data... though this is just an idea, as I haven't tried it myself yet.


Leave the thread open and we'll see if the moderators or MVPs have something else in mind.

Cheers

Rizwan

|||

One way to do this would be to have the data flow drop the data for the child table to a RAW file, then in a second data flow read the RAW file and populate the table.|||

I agree that not having precedence constraints at the data flow level is a nuisance; DataStage used to allow it. Sorry, I am not offering any useful advice here, but I want to let you know I share your pain, as I recently had to do the same.

desibull

|||

desibull wrote:

I agree that not having precedence constraints at the data flow level is a nuisance; DataStage used to allow it. Sorry, I am not offering any useful advice here, but I want to let you know I share your pain, as I recently had to do the same.

desibull

I just want to let my opinion be known on this topic:

There is no reason to have precedence constraints in a data flow. A data flow is designed to move data in buffers -- as fast as it can. Enforcing ordering inside one is nonsense. Just break the work up into two or more data flows. I do not support the notion of precedence constraints in a data flow.|||

I agree with Paul - the easiest way to do this without having to reprocess the dataset is to drop it to a RAW destination. It's also extremely fast.
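The staging pattern described above can be sketched with Python's sqlite3 and a CSV file standing in for SSIS's binary RAW format (all table, column, and file names here are illustrative):

```python
import csv
import os
import sqlite3
import tempfile

# First pass loads the parent table and stages the child rows to a file;
# a second pass, which only starts after the first finishes, replays the file.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE child (id INTEGER PRIMARY KEY, "
             "parent_id INTEGER REFERENCES parent(id))")

source_rows = [(1, 10), (2, 20)]  # (parent_id, child_id) pairs from the source
staging = os.path.join(tempfile.mkdtemp(), "child_rows.csv")

# First "data flow": multicast each row -- parent insert plus staging file.
with open(staging, "w", newline="") as f:
    writer = csv.writer(f)
    for parent_id, child_id in source_rows:
        conn.execute("INSERT INTO parent (id) VALUES (?)", (parent_id,))
        writer.writerow([child_id, parent_id])
conn.commit()

# Second "data flow": replay the staged rows into the child table,
# now that every parent row is guaranteed to exist.
with open(staging, newline="") as f:
    for child_id, parent_id in csv.reader(f):
        conn.execute("INSERT INTO child (id, parent_id) VALUES (?, ?)",
                     (int(child_id), int(parent_id)))
conn.commit()
```

The same shape applies in SSIS: RAW File Destination in the first data flow, RAW File Source in the second, with an On Success constraint between them.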

|||

Hi,


That means I was right. Great!

Cheers

Rizwan
