Hi
I am working with SQL Server Integration Services and facing a few problems with it.
I am supposed to create a package that automatically picks up Excel spreadsheets in a specific format and imports them into SQL Server.
(Let's say there is a company named AB with various products named CD, EF and GH, and each product has its own spreadsheet in which its monthly sales (JAN, FEB, ... NOV, DEC) are given. I have to build a generic package for each product (say CD) so that I don't have to import every month's spreadsheet separately.)
To summarize, I just need one package that I can run against the various spreadsheets again and again, instead of building a separate package for every month's spreadsheet.
I have tried lots of combinations, like Loop containers etc., but still haven't been able to find a solution.
Please help me out with this.
If the metadata (i.e. number of columns, column names, column types) of each file is different, then you cannot load everything using the same data-flow. You need a separate data-flow for each one.
You can still put everything into one package. You need to decide, at execution-time, which data-flow(s) to execute based on which file you want to process. In other words, you need conditional workflow: http://www.sqlis.com/default.aspx?306
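As a rough sketch of the conditional-workflow idea in the linked article (the variable name and file name here are just examples, not from the thread): set each precedence constraint's evaluation operation to "Expression" so that only the matching data-flow runs, with an expression along these lines:

```
@[User::FileName] == "CD.xls"
```

Each data-flow task gets its own constraint comparing the variable against a different value, and only the branch whose expression evaluates to TRUE executes.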
-Jamie
|||
But the format of all the monthly spreadsheets is the same. Do I still have to consider conditional workflow, or can it be done some other way?
Cheers
|||Oh OK, well you're all right then. You just need to set the name of the file that you want to process dynamically. You do this by putting a property expression on the Excel Connection Manager's ConnectionString property.
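For illustration, a property expression on an Excel Connection Manager's ConnectionString typically looks something like the following (the variable name User::ExcelFilePath is just an example; the Jet provider shown here is what the Excel connection manager uses in SSIS 2005):

```
"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::ExcelFilePath]
    + ";Extended Properties=\"Excel 8.0;HDR=YES\";"
```

Wrap the data-flow in a Foreach Loop with the File enumerator, map each enumerated file path into User::ExcelFilePath, and the same data-flow then runs once per spreadsheet.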
-Jamie
|||
Hi
I am really sorry to bother you again, but as I am new to these services, could you please guide me on how to do it, or maybe point me to a useful article?
Thanks a lot. It's been really helpful so far.
Cheers
|||Sure, this: http://blogs.conchango.com/jamiethomson/archive/2005/05/30/1489.aspx provides all that you need to know.
There's probably more in there than you need but you can pick out the bits that are important. Make sure you download the demo which will show you how all this works.
-Jamie
|||
Thanks a lot, I really appreciate it. It's working fine now.
Actually, I have twisted the previous question a bit. I don't know why, but I am just keen to learn whatever comes my way.
Let's say we have a database table (e.g. XYZ) with certain attributes (A, B, C, D, E, F), and we want to build a package that takes an Excel file (e.g. XUV) as a source and exports the data into the table that has already been created in the database (i.e. XYZ). But in the Excel file we only have a few attributes that match the destination columns (let's say A, C, F). To get it working we have to map the columns manually, then perform the normal procedure and the data will be loaded.
This package would process only one source file (i.e. XUV). What I am curious about is this: if there are hundreds of Excel source files (with the same attributes as the former Excel source XUV) waiting to be loaded, am I supposed to do the mapping for each and every file?
Thanks in advance.
|||Not quite sure I fully understand what you're saying, but the important thing is that you have to build a data-flow for each type of source file. By "type" I mean that all files with the same metadata are of the same type.
-Jamie
|||
Hi
Actually I am still facing a problem with my last query.
The connection manager does not show up in the property listing for expressions.
|||Where are you looking? What object are you looking at the properties of?
You have to change the ConnectionString property of the connection manager.
-Jamie
|||Really sorry, that was my mistake.