
Friday, March 30, 2012

Scripting component.

Hi Guys,

I need to generate a file using a script component. I'm taking some data from an OLE DB source; I then need to modify the fields, generate new fields, and write out the flat file.

I don't have any problems with this, but the file I need to replicate has a header record and a trailer record, which must not be duplicated.

When I generate the file and put anything in "Public Overrides Sub Input0_ProcessInputRow", it works, but it generates the header record again and again as well.

Any quick help would be really appreciated.


TA

Gemma

Gemma,

Any reason for using the script component to create the file? Can you just use the script to populate the pipeline and use a Flat File destination for the file? If so, I recently described how you would go about creating a flat file with header and trailing records: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2168163&SiteID=1

|||

Hi Eric,

I'm just combining fields, padding fields with zeros, and doing some cleaning of the data as well.

Is there any other way? I've checked your post, but I don't know how to add something from the database to the header.

Yes, you can enter expressions there, but I need the value from a table.

Can you help me do that?


TA
Gemma

|||

Add a Boolean flag at the class level and default it to True. Then in ProcessInputRow, check the flag: if it is True, write the header line and set the flag to False.

There's an example of that approach in the script I posted here - http://agilebi.com/cs/blogs/jwelch/archive/2007/09/14/dynamically-pivoting-rows-to-columns.aspx. The example is not what you were asking about, but the script illustrates the technique.
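As a rough sketch of that flag approach (the names here are illustrative: swOutput is assumed to be a StreamWriter opened elsewhere in the component, LineText an output column, and Input0Buffer the class SSIS generates for your input):

```vb
' Class-level flag, defaulted to True so the header is written exactly once.
Private blnWriteHeader As Boolean = True

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    If blnWriteHeader Then
        ' First row only: emit the header record, then flip the flag.
        swOutput.WriteLine("HDR" & Date.Now.ToString("yyyyMMdd"))
        blnWriteHeader = False
    End If
    ' Every row: emit the detail record.
    swOutput.WriteLine(Row.LineText)
End Sub
```

A trailer record can be handled the same way in reverse: write it once in PostExecute, after all rows have been processed.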

Scripting changes to the data source view

I need to script out the creation of some new cubes and dimensions on an SSAS server while leaving the rest of the OLAP database unchanged. Handling the cubes and dimensions themselves seems pretty straightforward using the scripting tools in SQL Management Studio. However, I'm uncertain about scripting the changes to the data source view (for the fact and dimension tables backing the new cubes and dimensions).

In SQL Management Studio, only the entire DSV is scriptable - I don't see a way to script out changes to just some of the tables. I could, in theory, manually edit out the parts of the DSV that aren't related to my new entities, but it's not clear from the XMLA Alter command documentation whether that would work. By my reading, the command would replace the entire Major Object, which is the entire DSV in this case. Am I reading that right?

The only other option I can think of is to programmatically add the new entities to the DSV through explicit code - doable, but a royal pain, and I'd rather avoid it if possible.

Am I understanding things correctly? Is there some other option I'm not aware of?

You are right, the XMLA alter command would replace the entire DSV, so if you only scripted out changed objects that is all you would end up with.

You could programmatically add new entities to the DSV through code, but if you trace the resulting interaction with SSAS you will find that it is still doing an XMLA Alter and sending through the definition for the entire DSV. But at least if you programmatically add a table to the DSV, AMO will handle the scripting of the rest of the DSV for you.
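As an illustrative AMO sketch of that (the server, database, and table names are all hypothetical), adding one table to an existing DSV and letting AMO script the full Alter on Update:

```vb
Imports System.Data
Imports Microsoft.AnalysisServices

Module AddDsvTable
    Sub Main()
        Dim svr As New Server()
        svr.Connect("Data Source=localhost")
        Dim db As Database = svr.Databases.GetByName("MyOlapDb")
        Dim dsv As DataSourceView = db.DataSourceViews(0)

        ' Describe the new relational table in the DSV schema (a DataSet).
        Dim tbl As New DataTable("FactNewSales")
        tbl.ExtendedProperties("TableType") = "Table"
        tbl.ExtendedProperties("DbSchemaName") = "dbo"
        tbl.ExtendedProperties("DbTableName") = "FactNewSales"
        dsv.Schema.Tables.Add(tbl)

        ' AMO sends an XMLA Alter containing the entire DSV definition.
        dsv.Update()
        svr.Disconnect()
    End Sub
End Module
```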

|||

Thanks for the confirmation, Darren. Fortunately, I came up with a slightly simpler way to do it than manually creating all of the new DSV entities in code. I'm loading up the full cube deployment script, plucking out the new DSV items, and copying them into the existing DSV. In the end, it's not too painful. It works well for new cubes and dimensions as well.

|||

Glad you have found a solution. I am just curious as to what you mean by "copying them into the existing DSV" - are you doing an XML or text "diff" between your new DSV and the existing one?

|||

By that I meant using AMO to get the DSV on the existing database instance. Basically I'm getting the existing DSV schema (using AMO), parsing out the new DSV schema from the XML/A deployment script for the modified database, and merging the new elements into the existing DSV.

I haven't gotten as fancy as an automated diff process (yet). Currently I'm just manually keeping track of what the new items are. Hopefully I'll be able to automate that more soon.
|||

I thought you must have been doing something like this. It does seem like a lot of manual work. I'm not sure if you really need to go to all that effort. If you basically pass SSAS an alter script with the entire DSV from the deployment script it *should* be able to figure things out. (assuming that your development project is just an updated version of the project that was originally deployed to your server)

Would I be correct in assuming that the situation you are facing is that you have made multiple changes to your solution, but you only want to deploy some of them - hence the need to manually script things out? I have been working with a friend on a project to build an add-in to BIDS (www.codeplex.com/bidshelper) to enhance BIDS, and one feature we were thinking about was the option to deploy just a single object from a project. I have a subset of this working, where I can deploy just the calculation script from a cube. If you are interested, I don't think it would be too hard for us to add some basic "deploy single object" functionality that would handle other object types.

|||>assuming that your development project is just an updated version of
>the project that was originally deployed to your server

And that's the rub - it's not. We are an ISV that includes a set of SSAS cubes in our product. We provide a UI for customers to extend the cubes with custom measures and dimensions. When we ship new versions of our software, we need to be able to deploy our changes (new cubes, dimensions, attributes, etc) without overwriting the customer's customizations.

The BidsHelper project looks pretty cool - I'll check it out. Ultimately, it doesn't sound like something we'd be able to take advantage of to solve our problem - but it may come in handy during development.

|||

Aah, now I see where you are coming from.

I am not really aware of any good way of handling this "out of the box". It sounds like you will either need to write custom AMO code to attach to the client databases and modify them, or you would need to store the customisations that customers apply so that they could be re-applied after an upgrade.

It sounds like a reasonable feature. You could log it as a suggestion on the connect.microsoft.com site under SQL Server if you would like to see it considered for a future version, but in the short/medium term this sounds like something that will need custom code.

|||

Hi Kevin,

I desperately need your help.

As I read in your message, you are updating the data source view via AMO - please let me know how to do that.

I am working on an SSAS project and we need to add measures to the cube dynamically (it can be SSIS or AMO, anything, as long as it is automatic).

I can't find a way to do that. Please send me the details - it would be a big help.

Thanks

|||

There is a product sample that shows how to build a lite version of the Adventure Works database from scratch using AMO.

http://www.codeplex.com/MSFTASProdSamples/Wiki/View.aspx?title=SS2005%21Readme_AMOAdventureWorks&referringTitle=Home

If you are dynamically adding measures you would need to update the DSV, add the measures to the appropriate measure group and reprocess all your data.
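A hedged AMO sketch of those steps (the cube, measure group, table, and column names are all hypothetical, and it assumes the bound column already exists in the DSV and that db is an open AMO Database object):

```vb
Imports Microsoft.AnalysisServices

' Assumes db is a connected AMO Database.
Dim cube As Cube = db.Cubes.GetByName("Sales")
Dim mg As MeasureGroup = cube.MeasureGroups.GetByName("Fact Sales")

' Add the new measure, bound to a DSV column.
Dim m As Measure = mg.Measures.Add("New Amount")
m.AggregateFunction = AggregationFunction.Sum
m.Source = New DataItem(New ColumnBinding("FactSales", "NewAmount"))

' Push the change to the server, then reprocess the data.
cube.Update(UpdateOptions.ExpandFull)
cube.Process(ProcessType.ProcessFull)
```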

Monday, March 26, 2012

Script to Populate DB with Sample Data?

All:
Is there a tool out there that will construct DB creation scripts that
include taking the source data from the tables and carrying the info
over to the script? The goal here is for developers to be able to check
out the DB creation script (the DDL that creates tables, stored procs,
etc.) AND have their DB filled with sample data.
The tools that come w/ Enterprise Manager, etc. create the DDL scripts
fine - but how can I help my guys get the proper sample data?
Thanks,
John

http://vyaskn.tripod.com/code/generate_inserts.txt
David Portas
SQL Server MVP
--
<jpuopolo@.mvisiontechnology.com> wrote in message
news:1126468738.427835.243580@.z14g2000cwz.googlegroups.com...
> All:
> Is there a tool out there that will construct DB creation scripts that
> include taking the source data from the tables can carrying the info
> over to the script? The goal here is for developers to be able to check
> out the DB creation script (the DDL that creates tables, stored proc,
> etc.) AND have their DB filled with sample data.
> The tools that come w/ Enterprise Manager, etc. create the DDL scripts
> fine - but how can I help my guys get the proper sample data?
> Thanks,
> John
>

|||

Hi,
Another (quicker) approach would be to build a template database and
populate it with static data (which should really also be held in your
source code control system - see David's link), and then give the
developers a backup (which you may also want to hold in your source code
control system).
John
<jpuopolo@.mvisiontechnology.com> wrote in message
news:1126468738.427835.243580@.z14g2000cwz.googlegroups.com...
> All:
> Is there a tool out there that will construct DB creation scripts that
> include taking the source data from the tables can carrying the info
> over to the script? The goal here is for developers to be able to check
> out the DB creation script (the DDL that creates tables, stored proc,
> etc.) AND have their DB filled with sample data.
> The tools that come w/ Enterprise Manager, etc. create the DDL scripts
> fine - but how can I help my guys get the proper sample data?
> Thanks,
> John
>

|||

Wow - Thanks!
John

Wednesday, March 21, 2012

Script task: Bindingsource

I used a BindingSource in my Script Task code to filter a DataTable. Even though I declare it (Dim bs As BindingSource), I'm still getting the error "Type 'BindingSource' is not defined".

cherrie

Hi, BindingSource is part of the Windows Forms namespace. Does your code include

Imports System.Windows.Forms

Script task User defined variable and Data Reader Source SQL Statement

Hello,

Please can anyone tell me whether, or how, I have to cross-reference a user-defined variable from a Script Task in the SQL of a DataReader source?

The Script Task creates a variable holding the last Sales Ledger session. The SQL in the DataReader source updates the DW sales invoice lines file based on those invoice lines whose session number is greater than the value from the Script Task.

The SQL command in the custom properties doesn't cross-reference back to the variable.

If anyone can help, or needs further detail in order to help, please post a response.

Thanks,

Marcus Simpson

The SqlCommand property can be overridden with a property expression. For data flow components, expressions are exposed on the parent Data Flow task rather than at the individual component level.

If you are not already familiar with expressions, look them up in Books Online.

To quickly set an expression, select the Data Flow task and hit F4. Then find the Expressions property in the property grid (as shown by the F4 key) and select the property for your DataReader source's SqlCommand. The name will be something like [DataReader Source].[SqlCommand]. Now add an expression that uses the variable as part of the command.
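For example (the table and variable names here are illustrative, and the cast assumes the variable is an integer), the expression for [DataReader Source].[SqlCommand] could concatenate the variable into the statement:

    "SELECT * FROM SalesInvoiceLine WHERE SessionNo > " + (DT_WSTR, 10) @[User::LastSession]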

|||

Darren,

Thank you very much for your response. It was correct.

Marcus.

|||

Marcus Simpson wrote:

Darren,

Thankyou very much for your response. It was correct.

Marcus.

Please mark Darren's post as answered.

Tuesday, March 20, 2012

Script Source: Error on truncation etc

I have a script source to deal with a source file that has different "record types" (the first 3 columns are the same; the remaining 2 to 30 columns differ based on the record type).

The script source was working fine... then one of the columns that I had set to String with a length of 2 came in with a length of 3 (which is not per spec). Instead of failing, all the columns after the one with the bad value were null, the script stopped as soon as it hit that, AND it reported success. This means it imported the data incorrectly, and since the script says it succeeded you'd never know anything went wrong - it imported only 30 rows instead of 10k+.

Any ideas on how to capture this error?

Code (shortened with "...." but it should be enough; sorry, the forum butchers the code formatting - if someone has a tip for pasting code from VS, let me know):

Imports System
Imports System.Data
Imports System.Math
Imports System.IO
Imports System.Convert
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub CreateNewOutputRows()
        Dim oStreamReader As StreamReader
        Dim sCurrentLine As String
        Dim aCurrentLine() As String

        Try
            oStreamReader = File.OpenText(Me.Variables.SourceName)
            sCurrentLine = oStreamReader.ReadLine()
            Do While sCurrentLine IsNot Nothing
                aCurrentLine = sCurrentLine.Split(Chr(44)) ' Chr(44) is a comma
                ' The 3rd column, "Record Type", tells us what type of
                ' record it is, how many columns it has, etc.
                Select Case aCurrentLine(2)
                    Case "BF"
                        BFRecordsBuffer.AddRow()
                        BFRecordsBuffer.TimeStamp = aCurrentLine(0)
                        BFRecordsBuffer.EyeCatcher = aCurrentLine(1)
                        BFRecordsBuffer.RecordType = aCurrentLine(2)
                        BFRecordsBuffer.Sym = aCurrentLine(3)
                        ....
                        BFRecordsBuffer.RecordCount = Convert.ToInt32(aCurrentLine(24))
                        ...
                        If aCurrentLine.GetLength(0) >= 30 Then
                            BFRecordsBuffer.SeqNo = aCurrentLine(29)
                        End If
                    Case "QF"
                        QFRecordsBuffer.AddRow()
                        QFRecordsBuffer.Timestamp = aCurrentLine(0)
                        QFRecordsBuffer.EyeCatcher = aCurrentLine(1)
                        QFRecordsBuffer.RecordType = aCurrentLine(2)
                        ...
                    ....
                End Select
                sCurrentLine = oStreamReader.ReadLine()
            Loop

            BFRecordsBuffer.SetEndOfRowset()
            QFRecordsBuffer.SetEndOfRowset()
            ....
        Catch ex As Exception
            Me.ComponentMetaData.FireError(1, "Script Component", ex.Message, "", 0, True)
        Finally
            oStreamReader.Close()
        End Try
    End Sub
End Class

Chris Honcoop wrote:

Code (shortened with .... but should be enough - sorry the forum butchers the code formatting - if someone has a tip for pasting code from VS let me know):

I can help with that. Go here: http://www.jtleigh.com/people/colin/software/CopySourceAsHtml/

You can see an example of the result here:

http://blogs.conchango.com/jamiethomson/archive/2006/12/19/SSIS-Nugget_3A00_-Extract-last-word-from-a-sentance-using-regular-expressions.aspx

-Jamie

|||

Thanks Jamie. I installed it but don't see it as an option under add-ins, nor where it says it will be in the menus - am I missing something? In re: your blog, welcome to the wonderful world of regex ;)

Any ideas on the script source truncation/data type mismatch "non-error"?

|||

Chris Honcoop wrote:

Thanks Jamie, I installed it and don't see it as an option under add-ons.. nor where it mentions it will be in the menus - am I missing something? In re: to your blog welcome to the wonderful world of regex ;)

Any ideas on the script source truncation/data type mismatch "non error"?

It doesn't work in VSA, unfortunately, so copy and paste the code into a regular VB.NET project and copy it as HTML from there.

Not sure about the error you're getting i'm afraid.

-Jamie

|||

It's something with the way I do the error trapping: if I comment out the Try/Catch, the script source will fail when the columns are bad. Of course, getting a useful error message about which column is experiencing the problem is an entirely different headache.

|||

To have the standard SSIS options of ignore, redirect, and fail component in your own custom script component, there are a couple of changes to consider.

1. Add a Try/Catch inside your loop. As written, when the first error happens (e.g. "ABC" rather than "AB"), the exception is caught outside of the loop, and no rows beyond that are processed. This change allows the implementation of the ignore and/or redirect options and, of course, continued processing.

2. In the newly added Try/Catch, add a catch for the DoesNotFitBufferException. This will "catch" the error. You can then simply ignore this particular exception type (and the string will be truncated), re-throw the exception (and the component will fail), or redirect it (essentially adding a row to an "error" output on your component).

3. Reporting which column name/index is experiencing the problem will require a cache of column information (names, types, lengths, etc.) built by overriding the PreExecute function, and then grabbing a reference to the base buffers by overriding the PrimeOutput function. The column cache and base buffers can then be referenced in your CreateNewOutputRows implementation.

Hopefully this will help you along your way. If not, I'll find some script component code and post that as well.
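A sketch of points 1 and 2 applied to the loop from the original post (ErrorRecordsBuffer and its RawLine column are a hypothetical error output; DoesNotFitBufferException lives in Microsoft.SqlServer.Dts.Pipeline):

```vb
Do While sCurrentLine IsNot Nothing
    Try
        aCurrentLine = sCurrentLine.Split(Chr(44))
        ' ... assign aCurrentLine values to the output buffer columns ...
    Catch ex As Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException
        ' A value was too long for its column: redirect the raw line
        ' to an error output (or ignore to truncate, or re-throw to fail).
        ErrorRecordsBuffer.AddRow()
        ErrorRecordsBuffer.RawLine = sCurrentLine
    Catch ex As Exception
        ' Any other bad row: report it without cancelling, and keep going.
        Me.ComponentMetaData.FireError(1, "Script Component", ex.Message, "", 0, False)
    End Try
    sCurrentLine = oStreamReader.ReadLine()
Loop
```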

Monday, March 12, 2012

Script for login transfer from one physical server to another

Hi,

Do you have any idea about the procedure for transferring logins and passwords from a source server to a destination server?

The scenario is as follows:

Server A (old server, running SQL Server 6.5) ---> Server B (new server, will be SQL Server 2000)

I would appreciate any web link containing a checklist.

with best regards

ranjan

|||

I have a script that generates SQL for recreating logins, users, roles, and permissions on SQL 2000 databases. It may run on your 6.5 database. It will not, however, restore passwords, and I'm not sure you can do that.

Have you thought about running the upgrade on your 6.5 databases (or a copy of them)? It's been a while, but I think this retains the old passwords.

blindman

|||

Thanks for the reply.

Details of my upgrade plan:

- I am planning to upgrade my DB server (MSSQL 6.5) to SQL Server 2000 on the same machine.

- I have prepared the checklist for that. The only thing I still want is a script which will take care of transferring my login names and passwords from the 6.5 server to the 2000 server.

- I would appreciate it if you could help me write the script or forward a ready-made script.

Thanks and regards

Ranjan

|||

Sorry, but I don't have a script for that. Are you sure that the upgrade process won't do this?
As a matter of fact, I think Microsoft changed its encryption algorithm between versions 6.5 and 2000, so even if you copied the encrypted value to the new server it might not recognize your password. If you are looking for a method of decrypting the password, I don't have one, but I know it has been cracked (the algorithm was not very strong). You can probably find code for decrypting it on the web.

blindman

|||

Refer to this KBA (http://support.microsoft.com/default.aspx?scid=http://support.microsoft.com:80/support/kb/articles/Q246/1/33.ASP&NoWebContent=1) to accomplish the task.

Wednesday, March 7, 2012

Script Component as Source

I want to use a Script Component as a source, but I don't know how to code the output rows. Can someone give me a clue or some sample code? Thanks in advance.

Here's a sample

http://agilebi.com/cs/blogs/jwelch/archive/2007/03/22/writing-a-resultset-to-a-flat-file.aspx

|||Thanks jwelch, that's very helpful.

But, I have a one more question which is the following:
Actually, I have many tables to transfer from one database into another. So my logic is to use a Foreach Loop to read each table name into a variable; inside the loop, I use a data flow which contains two script components: one for the source, which gets output rows from the table named in the variable, and one for the destination, which inserts the data into the other database. What do you think about my solution? Do you have a better one?

Thanks for your big help.

|||

jwelch wrote:

Here's a sample

http://agilebi.com/cs/blogs/jwelch/archive/2007/03/22/writing-a-resultset-to-a-flat-file.aspx

Sort of... He's doing some other stuff in there than a basic script component source...

For a basic sample:
http://www.ssistalk.com/2007/04/04/ssis-using-a-script-component-as-a-source/|||

SSIS_NewMan wrote:

Thanks jwelch, that's very helpful.

But I have one more question:
Actually, I have many tables I want to transfer from one database to another. So my logic is to use a Foreach Loop to read each table name into a variable; inside the loop, I use a data flow that contains two script components: one as the source, which gets output rows from the table named in the variable, and one as the destination, which inserts the data into the other database. What do you think of this solution? Do you have a better one?

Thanks for your big help.

The control flow has a Transfer SQL Server Objects task. Does that not work for you?|||But I only need to transfer some of the tables, not all, and before inserting into the other database I also need to do some data conversion. Can the Transfer Objects task handle that? Thanks a lot.
|||

SSIS_NewMan wrote:

But I only need to transfer some of the tables, not all, and before inserting into the other database I also need to do some data conversion. Can the Transfer Objects task handle that? Thanks a lot.

Well, you should be using the OLE DB source and destination, not scripts. But either way you have a big problem: using one data flow to handle this won't work well unless all of your source tables have exactly the same format and are going to tables that have exactly the same format (among ALL of them).

If SourceTableA has 3 columns, all varchars, and SourceTableB has 4 columns, all integers, SSIS will not be able to handle that.
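To make the constraint concrete, here is a hedged sketch (plain Python + sqlite3, not SSIS) of a "generic" table copier. It only works because the column list is discovered at run time; an SSIS data flow fixes its column metadata at design time, which is exactly why differently-shaped tables each need their own data flow:

```python
import sqlite3

def copy_table(src, dst, table):
    """Copy one table by discovering its columns at run time.
    An SSIS data flow cannot do this: its column metadata is
    fixed when the package is designed."""
    cols = [row[1] for row in src.execute(f"PRAGMA table_info({table})")]
    col_list = ", ".join(cols)
    marks = ", ".join("?" for _ in cols)
    dst.execute(f"CREATE TABLE {table} ({col_list})")
    rows = src.execute(f"SELECT {col_list} FROM {table}").fetchall()
    dst.executemany(f"INSERT INTO {table} VALUES ({marks})", rows)

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE t1 (a TEXT, b TEXT, c TEXT)")
src.execute("INSERT INTO t1 VALUES ('x', 'y', 'z')")
copy_table(src, dst, "t1")
copied = dst.execute("SELECT a, b, c FROM t1").fetchall()
```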

Changing metadata is not within the scope of what SSIS can handle.|||Yes, you are right. That is really the problem I am trying to figure out. So in your experience, what should I do? If I have 10 tables or views (and all of the source tables' formats are different) to transfer, do I need to create 10 data flows, one for each source table? What is your best solution for this problem?

Thanks for your any advice and help.
|||Hi Phil Brammer,
Could you give me some advice about this problem?

Thanks,
|||

SSIS_NewMan wrote:

If now I have 10 tables or views (of course, all of the source table's format are different) need to transfer, do I need to create 10 SSIS for each source table?

I am afraid the short answer is YES. There has been some success reported on this forum with building the packages programmatically. You may want to research that.

|||

SSIS_NewMan wrote:

Yes, you are right. That is really the problem I am trying to figure out. So in your experience, what should I do? If I have 10 tables or views (and all of the source tables' formats are different) to transfer, do I need to create 10 data flows, one for each source table? What is your best solution for this problem?

Thanks for your any advice and help.

To echo Rafael, if you're performing data checks/edits when transferring the data, then yes, I believe your best option is to build a data flow for each source table.|||Thanks for your reply. I will try to work out the best solution.

script component

I have 2 source columns, testsource and testsource1, and 2 output columns, test and test2.

I am passing them through a script component to check whether the columns are numeric, along with some more logic. I am able to get the first column evaluated based on the logic, but not the second column.

Is it that the script component can only look at one column?

This is the code I wrote:

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        If IsNumeric(Row.testsource) Then
            Row.test = Left(Row.testsource, 4) & "/" & Mid(Row.testsource, 5, 2) & "/" & Right(Row.testsource, 2)
        Else
            ' Note: this branch always ends in "NULL" -- the inner
            ' "N.A" check is redundant as written.
            If Row.testsource = "N.A" Then
                Row.test = "NULL"
            End If
            Row.test = "NULL"
        End If

        If IsNumeric(Row.testsource1) Then
            Row.test2 = Left(Row.testsource1, 4) & "/" & Mid(Row.testsource1, 5, 2) & "/" & Right(Row.testsource1, 2)
        Else
            If Row.testsource1 = "N.A" Then
                Row.test2 = "NULL"
            End If
            Row.test2 = "NULL"
        End If
    End Sub

End Class

Example run:

INPUT

,20070930
,20080331
,20070930
,20071130
,20070930
,
,
,
20070504,20070503
20080331,
,20070930
N.A. ,N.A.
N.A. ,N.A.

OUTPUT

NULL,20070930
NULL,20080331
NULL,20070930
NULL,20071130
NULL,20070930
NULL,
NULL,
NULL,
2007/05/04,20070503
2008/03/31,
NULL,20070930
NULL,N.A.
NULL,N.A.

As you can see, the first column gets evaluated based on the logic, but the second column just gets passed through.

Thanks for any help in advance

smathew

Put a data viewer in the path right before the script component and ensure you're getting the results you expect (for instance, that you don't have that comma in there). Also, you might want to add a derived column before the script component to trim each field and get rid of any extra spaces.|||

I do have a derived column that does the trim, but it still does not evaluate the second column based on the logic.

smathew

|||

smathew wrote:

I do have a derived column that does the trim, but it still does not evaluate the second column based on the logic.

smathew

The other thing you can do after you've looked at the data viewer is to add a MsgBox(Row.testsource1.ToString) to see what happens.
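For reference, here is the transform the script is trying to perform, sketched in Python (the function name and the 8-digit length guard are my additions, not part of the original VB code). Trimming first keeps space-padded values such as " N.A. " from slipping past the comparison, which is exactly what the derived-column trim is meant to guarantee:

```python
def to_slashed_date(value: str) -> str:
    """Reformat 'YYYYMMDD' to 'YYYY/MM/DD'; anything else becomes 'NULL'."""
    v = value.strip()
    if v.isdigit() and len(v) == 8:
        return f"{v[:4]}/{v[4:6]}/{v[6:]}"
    return "NULL"
```

Running the sample inputs through this logic reproduces the expected first-column output ("20070504" becomes "2007/05/04", "N.A." and blanks become "NULL"), so if the second output column still shows the raw value, the script's assignment to that output column is not taking effect for that row.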