Showing posts with label file. Show all posts

Friday, March 30, 2012

Not loading data from Script Component

Dear all,

I've created a Data Flow scenario as follows:

First I have a Flat File Source, then a Script Component, and then an OLE DB Destination, linked together by data flow paths, of course. When I run the SSIS package, everything executes successfully except the last task. Why? I don't know, but it doesn't seem to receive anything.

649 rows are passed from the file to the Script Component, but they never reach my SQL table.

Let me know any advice or thoughts regarding this.

Thanks a lot,

We need far, far more information than you have provided here.

Is it a synchronous or asynchronous component?

|||

Hi Jamie,

Thanks for your quick response.

Here are the full contents of the .NET script:

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        Dim valorColumna As String
        valorColumna = Row.Column9.Substring(1, 1)

        If valorColumna = "N" Then
            Me.Output0Buffer.IMPBASE = -1 * CDbl(Row.Column10 / 100)
        Else
            Me.Output0Buffer.IMPBASE = CDbl(Row.Column10 / 100)
        End If

        Me.Output0Buffer.PORCRETEN = CDbl(Row.Column11 / 100)
        Me.Output0Buffer.IMPRETEN = CDbl(Row.Column12 / 100)
        Me.Output0Buffer.EJERCICIO = CInt(Row.Column2)
        Me.Output0Buffer.CODPROV = CInt(Row.Column7)
        Me.Output0Buffer.MODALIDAD = CInt(Row.Column8)
        Me.Output0Buffer.NIFPERC = CStr(Row.Column3)
        Me.Output0Buffer.NIFREP = CStr(Row.Column4)
        Me.Output0Buffer.NOMBRE = CStr(Row.Column6)
        Me.Output0Buffer.EJERDEV = CDbl(Row.Column13)
    End Sub

End Class
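For reference: if Output0 is configured as an asynchronous output (SynchronousInputID set to None), the script must create each output row explicitly before assigning its columns, or the buffer stays empty and nothing reaches the destination. A minimal sketch, not part of the original post, reusing the column names above:

```vb
' Sketch only: applies if Output0 is an ASYNCHRONOUS output.
' Without AddRow() no output rows exist, so the destination receives nothing.
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    Me.Output0Buffer.AddRow()          ' create the output row first
    Me.Output0Buffer.IMPBASE = CDbl(Row.Column10 / 100)
    ' ... assign the remaining columns as in the post ...
End Sub
```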

I have no idea whether it's asynchronous or synchronous.

I think there must be some very stupid mistake somewhere in all of this.

Thanks again,

Enric

No ldf file but have mdf file

Hi there,

I am using Sql Server Express and have an mdf file I want to attach but I don't have the ldf file. Any suggestions?

Thanks,

Eric

Hi Eric,

To add an existing database .mdf file to SQL Server, you can use Attach. Here are the steps.

1. Copy the .mdf file and the .ldf file (if it exists) to another folder.

2. Right click on Databases, and select Attach from the popup menu.

3. In the Attach Databases dialog box, click the Add button and select the .mdf file.

4. Click OK to finish, and the database will be attached to your database list in Management Studio.
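The same attach can also be done in T-SQL; a minimal sketch, with hypothetical file paths:

```sql
-- Attach an existing database from its data and log files
CREATE DATABASE MyDb
ON (FILENAME = 'C:\Data\MyDb.mdf'),
   (FILENAME = 'C:\Data\MyDb_log.ldf')
FOR ATTACH;
```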

Thanks.

|||

Thanks Nai-Dong.

But I get the message below when clicking OK. I think there are two problems: one, I have the Express edition and the database has a job script (and Express doesn't have SQL Server Agent); two, the database did not close properly. I will try to find another version of the database. However, thank you for your help.

TITLE: Microsoft SQL Server Management Studio Express
----------

Attach database failed for Server 'INDIVIDU-B3MSFN\SQLEXPRESS'. (Microsoft.SqlServer.Express.Smo)

For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=9.00.3042.00&EvtSrc=Microsoft.SqlServer.Management.Smo.ExceptionTemplates.FailedOperationExceptionText&EvtID=Attach+database+Server&LinkId=20476

----------
ADDITIONAL INFORMATION:

An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.Express.ConnectionInfo)

----------

Could not open new database 'LittleItalyVineyard'. CREATE DATABASE is aborted.
File activation failure. The physical file name "C:\Program Files\Microsoft SQL Server\MSSQL.2\MSSQL\DATA\LittleItalyVineyard_log.ldf" may be incorrect.
The log cannot be rebuilt because the database was not cleanly shut down. (Microsoft SQL Server, Error: 1813)

For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=09.00.3042&EvtSrc=MSSQLServer&EvtID=1813&LinkId=20476

----------
BUTTONS:

OK
----------

|||

Hi Eric,

Just as you said, you'd better find another copy of the database and retry the steps above.
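For reference: when only the .mdf is available and the database was shut down cleanly, SQL Server 2005 can rebuild a new log at attach time. A hedged T-SQL sketch with a hypothetical path; this is exactly the step that fails with error 1813 (as in the message above) when the shutdown was not clean:

```sql
-- Attach using only the .mdf; a new log file is rebuilt
CREATE DATABASE LittleItalyVineyard
ON (FILENAME = 'C:\Data\LittleItalyVineyard.mdf')
FOR ATTACH_REBUILD_LOG;
```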

Good Luck!

Wednesday, March 28, 2012

No field delimiters using bcp command

Hi,

I am using a bcp command to load data into a text file. The command is below:

C:\>bcp "select ltrim(rtrim(char25))+replicate('X',25-len(char25)), CONVERT(varchar(8),dateg,112) as [yyyymmdd], flag1, replace(replicate('0',19-len(amount))+ltrim(rtrim(amount)),'.',','), replace(replicate('0',9-len(dperc))+ltrim(rtrim(dperc)),'.',',') from Bank_Info.dbo.ddd" queryout c:\xxxx\replicate_replace.txt -c -U sax -S KARAFOKAS -C 1252 -P passsax

The command runs fine; the problem is that the output in the text file is tab-delimited. I don't want tab delimiters; I want the values to run together, with nothing splitting one value from the next.

This is the output with tab delimited format.

vvvXXXXXXXXXXXXXXXXXXXXXX 20071112 h 0000000000005555,70 066,50000
abcXXXXXXXXXXXXXXXXXXXXXX 19000101 y 0454545454523456,45 077,30000
xyzcccXXXXXXXXXXXXXXXXXXX 19000101 x 0000000000003456,00 077,99865
fXXXXXXXXXXXXXXXXXXXXXXXX 20030302 6 0000000000232323,45 005,00000

I want the output to have no tabs, as shown below:

vvvXXXXXXXXXXXXXXXXXXXXXX20071112h0000000000005555 ,70066,50000
abcXXXXXXXXXXXXXXXXXXXXXX19000101y0454545454523456 ,45077,30000
xyzcccXXXXXXXXXXXXXXXXXXX19000101x0000000000003456 ,00077,99865
fXXXXXXXXXXXXXXXXXXXXXXXX2003030260000000000232323 ,45005,00000

Column values should not be separated by tabs. Any thoughts?

Thank you
George

Quote:

Originally Posted by karafokas

> [original post quoted above snipped]

Looks like you want fixed-length output. Have you tried passing -t "", or something like that? -t is the bcp parameter for the field terminator.
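A sketch of that suggestion applied to the command above (paths and credentials as in the post; whether bcp accepts an empty terminator this way can depend on shell quoting, so treat it as something to try rather than a guaranteed fix):

```
C:\>bcp "select ... from Bank_Info.dbo.ddd" queryout c:\xxxx\replicate_replace.txt -c -t "" -U sax -S KARAFOKAS -C 1252 -P passsax
```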

no e-mail subscription option

Hi,

Does anybody know why the "Email subscription" option is not shown in the
listbox? I only get the option for File Delivery. Also, how can I
activate it (make it show up in the option list)?

Thanks!
Koen

Hi Koen -
To show up as a subscription option, the SMTP information in the
RSReportServer.config file must be set correctly. Make sure you have
the SMTPServer and From elements defined.
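A hedged sketch of the relevant fragment of RSReportServer.config (the server name and From address are hypothetical placeholders; the elements live in the Report Server Email delivery extension's RSEmailDPConfiguration section):

```xml
<RSEmailDPConfiguration>
  <SMTPServer>smtp.example.com</SMTPServer>
  <From>reports@example.com</From>
</RSEmailDPConfiguration>
```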
HTH...
--
Joe Webb
SQL Server MVP
~~~
Get up to speed quickly with SQLNS
http://www.amazon.com/exec/obidos/tg/detail/-/0972688811
I support PASS, the Professional Association for SQL Server.
(www.sqlpass.org)
On Thu, 15 Sep 2005 00:12:04 -0700, "Koen"
<Koen@discussions.microsoft.com> wrote:
> [original question snipped]

Monday, March 26, 2012

No Decimal trailing sign

I have a field defined as decimal(15,2), and the recipient of a conversion
.txt file wants to see just the meaningful digits: no decimal point, no zero
fill, but with a trailing minus sign for negative numbers. So they want to see
550.45 as 55045 and -45.25 as 4525-.
Stan Gosselin
Stan,
You'll have to use string functions to do this.
Here's one solution:
declare @t table (
    d decimal(15,2)
)
insert into @t values (550.45)
insert into @t values (10000)
insert into @t values (-45.25)
insert into @t values (0)
insert into @t values (0.01)
insert into @t values (1)
select
    case when d >= 0
        then ltrim(cast(100*d as int))
        else ltrim(cast(-100*d as int)) + '-' end
from @t
Be sure you and the client agree on how to represent everything.
This solution represents 0 as '0', not '000', for example, which may
or may not be right.
Steve Kass
Drew University
Stan wrote:

> [original question quoted above snipped]

Friday, March 23, 2012

No data file, damaged SQL Server, any chance to backup transaction log?

Hi,
No data file, damaged SQL Server, any chance to backup the transaction log?
(MS SQL Server 2000 SP3)
-- Many thanks, Oskar.
Hi,
no data file, no backup, no database. You cannot restore the database
from the transaction log files.
HTH, Jens K. Suessmeyer.
http://www.sqlserver2005.de
|||

Hi Oskar

"Oskar" wrote:

> [original question snipped]

You will have to go back to the last good full backup. Do you have scheduled
backups or backups on tape?

John
|||

Thanks John. I'm not actually experiencing this problem at the moment. I was
just wondering if, in theory, it is possible to make a last transaction log
backup of a database if its data file and the server to which it was attached
are gone. And I came to the conclusion that it probably is - one could try to
restore the master database on a (virtual) test server, copy the salvaged
transaction log file onto the same drive letter it occupied on the original
server, and then make a last backup of the transaction log so that no
transactions are lost during recovery.

-- Oskar

"John Bell" wrote:

> [previous reply snipped]
|||

Hi Oskar

"Oskar" wrote:

> [previous post snipped]

To back up the transaction log you will need the database to be working, in
which case not having the mdf file (as per your original post) will not allow
this. There is no substitute for having a proper backup regime and storing
the backups safely.

John
|||

John, but how about the NO_TRUNCATE clause of the BACKUP LOG command? BOL
says it allows a final log backup to be made even when the data files are lost,
so that one can recover up to the point of failure (of the data files). The
case I described differs only in that the server itself is also gone.

-- Oskar

"John Bell" wrote:

> [previous reply snipped]
|||

Hi Oskar

The log can still be backed up when the database is suspect; see
http://www.karaszi.com/sqlserver/info_corrupt_suspect_db.asp. I guess you
would need to try it.

John
"Oskar" wrote:
[vbcol=seagreen]
> John, but how about the NO_TRUNCATE clause of the BACKUP LOG command. BOL
> says it allows the final log backup to be made even when data files are lost
> so that one can recover up to the point of failure (of data files). The case
> I described differs only in that the server itself is also gone.
> -- Oskar
> "John Bell" wrote:


No current backup?

Hello All,
The hourly scheduled appends to the transaction log backup started generating
the following in the output log:

There is no current database backup. This log backup cannot be used to roll
forward a preceding database backup. [SQLSTATE 01000].

This is happening on one database on a SQL 2000 machine running three
instances of SQL Server. The database is on the default instance. Does anyone
know why this is happening? Other than that, there is one transaction log
device and four data devices, with the same [Primary] filegroup defined for
each data device in the database. Since I took over as the admin for this
particular database, I have not been able to truncate the transaction log.
DBCC produces the following:

Cannot shrink log file 2 ('') because total number of logical log files
cannot be fewer than 2. [SQLSTATE 01000].

I have tried following the instructions in MSKB 324432 but haven't had any
luck. The only thing that I have not tried is recreating the database with
only one log and one data device and transferring all the data. Actually I
did start that once and ran out of disk space when the indexes were being
generated, so I had to scrap that idea. If you have any clue as to what's
going on, please help.
Thanks.
Bilal Abbasi
Chadbourne & Parke LLP
30 Rockefeller Plaza
New York, NY 10112

Hi

It looks like you don't have a full database backup. Backing up the log will
allow the log space to be re-used, and it won't change the file size. Shrinking
the log file can make it fragmented on the disk and should only be carried out
after significant abnormal growth.

John
"Bilal Abbasi" wrote:

> Hello All,
> The hourly sheduled appends to the trasnaction log file start genereating
> the following in the output log.
> There is no current database backup. This log backup cannot be used to rol
l
> forward a preceding database backup. [SQLSTATE 01000].
> This is happening on one database on SQL 2000 machine running three
> instances of SQL Server. This happening to a database that is on the defa
ult
> instance. Does anyone know why this is happening? Other than that, there
is
> one transaction log device and four data devices with the same [Primar
y]
> filegroup defined for each data device in the database. Since I took over
as
> the admin for this particular database, I have not been able to trucnate t
he
> transaction log. The Dbcc produces the following:
> Cannot shrink log file 2 ('') because total number of logical log files
> cannot be fewer than 2. [SQLSTATE 01000].
> I have tried following the instructions as per MSKB 324432 but have'nt had
> any luck. The only thing that I have not tried is recreate the database w
ith
> only one log and one data device and transfer all the data. Actually I di
d
> start it once and ran out of diskspace when the indexes were being generat
ed,
> so I had to scrap that idea. If you have any clue as to what's going on,
> please help.
> Thanks.
>
> --
> Bilal Abbasi
> Chadbourne & Parke LLP
> 30 Rockefeller Plaza
> New York, NY 10112|||see this..from Books Online
Virtual Log Files
Each transaction log file is divided logically into smaller segments called
virtual log files. Virtual log files are the unit of truncation for the
transaction log. When a virtual log file no longer contains log records for
active transactions, it can be truncated and the space becomes available to
log new transactions.
The smallest size for a virtual log file is 256 kilobytes (KB). The minimum
size for a transaction log is 512 KB, which provides two 256-KB virtual log
files. The number and size of the virtual log files in a transaction log
increase as the size of the log file increases. A small log file can have a
small number of small virtual log files (for example, a 5-MB log file that
comprises five 1-MB virtual log files). A large log file can have larger
virtual log files (for example, a 500-MB log file that comprises ten 50-MB
virtual log files).
Microsoft® SQL Server™ 2000 tries to avoid having many small virtual log
files. The number of virtual log files grows much more slowly than the size.
If a log file grows in small increments, it tends to have many small virtual
log files. If the log file grows in larger increments, SQL Server creates a
smaller number of larger virtual log files. For example, if the transaction
log is growing by 1-MB increments, the virtual log files are smaller and
more numerous compared to a transaction log growing at 50-MB increments. A
large number of virtual log files can increase the time taken to perform
database recovery.
As records are written to the log, the end of the log grows from one virtual
log file to the next. If there is more than one physical log file for a
database, the end of the log grows through each virtual log file in each
physical file before circling back to the first virtual log file in the
first physical file. Only when all log files are full will the log begin to
grow automatically.
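As an aside, the number of virtual log files in a database's log can be inspected directly; a minimal sketch with a hypothetical database name (each returned row describes one VLF, so the row count is the VLF count):

```sql
USE MyDatabase;  -- hypothetical database name
DBCC LOGINFO;    -- one row per virtual log file
```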
"Bilal Abbasi" <BilalAbbasi@.discussions.microsoft.com> wrote in message
news:A8FE0793-E2B4-47E6-B176-9F78E78D6BFB@.microsoft.com...
> Hello All,
> The hourly sheduled appends to the trasnaction log file start genereating
> the following in the output log.
> There is no current database backup. This log backup cannot be used to
> roll
> forward a preceding database backup. [SQLSTATE 01000].
> This is happening on one database on SQL 2000 machine running three
> instances of SQL Server. This happening to a database that is on the
> default
> instance. Does anyone know why this is happening? Other than that, there
> is
> one transaction log device and four data devices with the same [Primar
y]
> filegroup defined for each data device in the database. Since I took over
> as
> the admin for this particular database, I have not been able to trucnate
> the
> transaction log. The Dbcc produces the following:
> Cannot shrink log file 2 ('') because total number of logical log files
> cannot be fewer than 2. [SQLSTATE 01000].
> I have tried following the instructions as per MSKB 324432 but have'nt had
> any luck. The only thing that I have not tried is recreate the database
> with
> only one log and one data device and transfer all the data. Actually I
> did
> start it once and ran out of diskspace when the indexes were being
> generated,
> so I had to scrap that idea. If you have any clue as to what's going on,
> please help.
> Thanks.
>
> --
> Bilal Abbasi
> Chadbourne & Parke LLP
> 30 Rockefeller Plaza
> New York, NY 10112|||That is exactly the point. The full dump is done and transaction log is
initialized as a scheduled process. Subsequent appends work so the log file
s
increase to around 15 when the output from the append job starts writing
"There is no current backup", just out of the blue. And it's not something
that I can predict either as to at what point this will start to fail.
Another thing that I noticed is that there are 5 rows in the sysfiles table
and 6 in sysfiles1. The sixth row in sysfiles1 is pointing to a physical LD
F
file that does not exist. I have not found a way to get rid of this
obviously orphanned record. It's a mess I need to cleanup somehow.
--
Bilal Abbasi
Chadbourne & Parke LLP
30 Rockefeller Plaza
New York, NY 10112
"John Bell" wrote:
> [previous reply snipped]

|||

Hi
You could code around this by issuing a full backup whenever you hit this error,
to restart the log backup sequence, but it would probably be better to use SQL
Profiler to get to the root of this and find out what is issuing the statement
that is breaking the log backup sequence.
What does sp_helpfile return?
Can you take a full backup and restore it on a different machine? If so you
could try ALTER DATABASE <db> REMOVE FILE <logical_file>
John
"Bilal Abbasi" wrote:
> [previous post snipped]

No current backup?

Hello All,
The hourly sheduled appends to the trasnaction log file start genereating
the following in the output log.
There is no current database backup. This log backup cannot be used to roll
forward a preceding database backup. [SQLSTATE 01000].
This is happening on one database on SQL 2000 machine running three
instances of SQL Server. This happening to a database that is on the default
instance. Does anyone know why this is happening? Other than that, there is
one transaction log device and four data devices with the same [Primary]
filegroup defined for each data device in the database. Since I took over as
the admin for this particular database, I have not been able to trucnate the
transaction log. The Dbcc produces the following:
Cannot shrink log file 2 ('') because total number of logical log files
cannot be fewer than 2. [SQLSTATE 01000].
I have tried following the instructions as per MSKB 324432 but have'nt had
any luck. The only thing that I have not tried is recreate the database with
only one log and one data device and transfer all the data. Actually I did
start it once and ran out of diskspace when the indexes were being generated,
so I had to scrap that idea. If you have any clue as to what's going on,
please help.
Thanks.
--
Bilal Abbasi
Chadbourne & Parke LLP
30 Rockefeller Plaza
New York, NY 10112Hi
It looks like you don't have a full database backup? Backing up the log will
allow the log to be re-used and it won't change the size. Shrinking the log
file can make it fragmented on the disc and should only be carried out after
significant abnormal growth.
John
"Bilal Abbasi" wrote:
> Hello All,
> The hourly sheduled appends to the trasnaction log file start genereating
> the following in the output log.
> There is no current database backup. This log backup cannot be used to roll
> forward a preceding database backup. [SQLSTATE 01000].
> This is happening on one database on SQL 2000 machine running three
> instances of SQL Server. This happening to a database that is on the default
> instance. Does anyone know why this is happening? Other than that, there is
> one transaction log device and four data devices with the same [Primary]
> filegroup defined for each data device in the database. Since I took over as
> the admin for this particular database, I have not been able to trucnate the
> transaction log. The Dbcc produces the following:
> Cannot shrink log file 2 ('') because total number of logical log files
> cannot be fewer than 2. [SQLSTATE 01000].
> I have tried following the instructions as per MSKB 324432 but have'nt had
> any luck. The only thing that I have not tried is recreate the database with
> only one log and one data device and transfer all the data. Actually I did
> start it once and ran out of diskspace when the indexes were being generated,
> so I had to scrap that idea. If you have any clue as to what's going on,
> please help.
> Thanks.
>
> --
> Bilal Abbasi
> Chadbourne & Parke LLP
> 30 Rockefeller Plaza
> New York, NY 10112|||see this..from Books Online
Virtual Log Files
Each transaction log file is divided logically into smaller segments called
virtual log files. Virtual log files are the unit of truncation for the
transaction log. When a virtual log file no longer contains log records for
active transactions, it can be truncated and the space becomes available to
log new transactions.
The smallest size for a virtual log file is 256 kilobytes (KB). The minimum
size for a transaction log is 512 KB, which provides two 256-KB virtual log
files. The number and size of the virtual log files in a transaction log
increase as the size of the log file increases. A small log file can have a
small number of small virtual log files (for example, a 5-MB log file that
comprises five 1-MB virtual log files). A large log file can have larger
virtual log files (for example, a 500-MB log file that comprises ten 50-MB
virtual log files).
Microsoft® SQL ServerT 2000 tries to avoid having many small virtual log
files. The number of virtual log files grows much more slowly than the size.
If a log file grows in small increments, it tends to have many small virtual
log files. If the log file grows in larger increments, SQL Server creates a
smaller number of larger virtual log files. For example, if the transaction
log is growing by 1-MB increments, the virtual log files are smaller and
more numerous compared to a transaction log growing at 50-MB increments. A
large number of virtual log files can increase the time taken to perform
database recovery.
As records are written to the log, the end of the log grows from one virtual
log file to the next. If there is more than one physical log file for a
database, the end of the log grows through each virtual log file in each
physical file before circling back to the first virtual log file in the
first physical file. Only when all log files are full will the log begin to
grow automatically.
"Bilal Abbasi" <BilalAbbasi@.discussions.microsoft.com> wrote in message
news:A8FE0793-E2B4-47E6-B176-9F78E78D6BFB@.microsoft.com...
> Hello All,
> The hourly sheduled appends to the trasnaction log file start genereating
> the following in the output log.
> There is no current database backup. This log backup cannot be used to
> roll
> forward a preceding database backup. [SQLSTATE 01000].
> This is happening on one database on SQL 2000 machine running three
> instances of SQL Server. This happening to a database that is on the
> default
> instance. Does anyone know why this is happening? Other than that, there
> is
> one transaction log device and four data devices with the same [Primary]
> filegroup defined for each data device in the database. Since I took over
> as
> the admin for this particular database, I have not been able to trucnate
> the
> transaction log. The Dbcc produces the following:
> Cannot shrink log file 2 ('') because total number of logical log files
> cannot be fewer than 2. [SQLSTATE 01000].
> I have tried following the instructions as per MSKB 324432 but have'nt had
> any luck. The only thing that I have not tried is recreate the database
> with
> only one log and one data device and transfer all the data. Actually I
> did
> start it once and ran out of diskspace when the indexes were being
> generated,
> so I had to scrap that idea. If you have any clue as to what's going on,
> please help.
> Thanks.
>
> --
> Bilal Abbasi
> Chadbourne & Parke LLP
> 30 Rockefeller Plaza
> New York, NY 10112|||That is exactly the point. The full dump is done and transaction log is
initialized as a scheduled process. Subsequent appends work so the log files
increase to around 15 when the output from the append job starts writing
"There is no current backup", just out of the blue. And it's not something
that I can predict either as to at what point this will start to fail.
Another thing that I noticed is that there are 5 rows in the sysfiles table
and 6 in sysfiles1. The sixth row in sysfiles1 is pointing to a physical LDF
file that does not exist. I have not found a way to get rid of this
obviously orphanned record. It's a mess I need to cleanup somehow.
--
Bilal Abbasi
Chadbourne & Parke LLP
30 Rockefeller Plaza
New York, NY 10112
"John Bell" wrote:
> Hi
> It looks like you don't have a full database backup? Backing up the log will
> allow the log to be re-used and it won't change the size. Shrinking the log
> file can make it fragmented on the disc and should only be carried out after
> significant abnormal growth.
> John
> "Bilal Abbasi" wrote:
> > Hello All,
> >
> > The hourly sheduled appends to the trasnaction log file start genereating
> > the following in the output log.
> > There is no current database backup. This log backup cannot be used to roll
> > forward a preceding database backup. [SQLSTATE 01000].
> >
> > This is happening on one database on SQL 2000 machine running three
> > instances of SQL Server. This happening to a database that is on the default
> > instance. Does anyone know why this is happening? Other than that, there is
> > one transaction log device and four data devices with the same [Primary]
> > filegroup defined for each data device in the database. Since I took over as
> > the admin for this particular database, I have not been able to trucnate the
> > transaction log. The Dbcc produces the following:
> >
> > Cannot shrink log file 2 ('') because total number of logical log files
> > cannot be fewer than 2. [SQLSTATE 01000].
> >
> > I have tried following the instructions as per MSKB 324432 but have'nt had
> > any luck. The only thing that I have not tried is recreate the database with
> > only one log and one data device and transfer all the data. Actually I did
> > start it once and ran out of diskspace when the indexes were being generated,
> > so I had to scrap that idea. If you have any clue as to what's going on,
> > please help.
> >
> > Thanks.
> >
> >
> > --
> > Bilal Abbasi
> > Chadbourne & Parke LLP
> > 30 Rockefeller Plaza
> > New York, NY 10112|||Hi
You could code around this by issuing a full backups if you hit this error
to restart the log backup sequence, or it would probably better to use SQL
profiler to get to the root of this and find what is issuing the statement
that is breaking the log backup sequence.
What does sp_helpfiles return?
Can you take a full backup and restore it on a different machine? If so you
could try ALTER DATABASE <db> REMOVE FILE <logical_file>
John
"Bilal Abbasi" wrote:
> That is exactly the point. The full dump is done and transaction log is
> initialized as a scheduled process. Subsequent appends work so the log files
> increase to around 15 when the output from the append job starts writing
> "There is no current backup", just out of the blue. And it's not something
> that I can predict either as to at what point this will start to fail.
> Another thing that I noticed is that there are 5 rows in the sysfiles table
> and 6 in sysfiles1. The sixth row in sysfiles1 is pointing to a physical LDF
> file that does not exist. I have not found a way to get rid of this
> obviously orphaned record. It's a mess I need to clean up somehow.
> --
> Bilal Abbasi
> Chadbourne & Parke LLP
> 30 Rockefeller Plaza
> New York, NY 10112
>
> "John Bell" wrote:
> > Hi
> >
> > It looks like you don't have a full database backup? Backing up the log will
> > allow the log to be re-used and it won't change the size. Shrinking the log
> > file can make it fragmented on the disc and should only be carried out after
> > significant abnormal growth.
> >
> > John
> >
> > "Bilal Abbasi" wrote:
> >
> > > Hello All,
> > >
> > > The hourly scheduled appends to the transaction log file start generating
> > > the following in the output log.
> > > There is no current database backup. This log backup cannot be used to roll
> > > forward a preceding database backup. [SQLSTATE 01000].
> > >
> > > This is happening on one database on SQL 2000 machine running three
> > > instances of SQL Server. This is happening to a database that is on the default
> > > instance. Does anyone know why this is happening? Other than that, there is
> > > one transaction log device and four data devices with the same [Primary]
> > > filegroup defined for each data device in the database. Since I took over as
> > > the admin for this particular database, I have not been able to truncate the
> > > transaction log. DBCC produces the following:
> > >
> > > Cannot shrink log file 2 ('') because total number of logical log files
> > > cannot be fewer than 2. [SQLSTATE 01000].
> > >
> > > I have tried following the instructions as per MSKB 324432 but haven't had
> > > any luck. The only thing that I have not tried is recreate the database with
> > > only one log and one data device and transfer all the data. Actually I did
> > > start it once and ran out of diskspace when the indexes were being generated,
> > > so I had to scrap that idea. If you have any clue as to what's going on,
> > > please help.
> > >
> > > Thanks.
> > >
> > >
> > > --
> > > Bilal Abbasi
> > > Chadbourne & Parke LLP
> > > 30 Rockefeller Plaza
> > > New York, NY 10112

No column was specified to allow the component to advance through the file.

Hello,

I apologize in advance if this seems like a relatively easy question; however, I can't find the answer anywhere and I can't figure it out myself. I am relatively new to SSIS and C#.

I am attempting to write a simple program in C# that takes a table from an OLE DB source and transfers it to a comma-delimited flat file.

I have been trying to work through samples and other methods of help, but I continue to get stuck, and this is my latest problem.

When trying to execute, I get the error "No column was specified to allow the component to advance through the file" for my flat file destination.

I know what this means, I just do not know how to fix it.

Below is my code so far. I highly doubt my code is the best it can get for this type of example, so if you see anything that I do not need in the code for it to perform what I want it to perform, please let me know.

(The code to execute this package is in a different file.)

using System;

using Microsoft.SqlServer.Dts.Runtime;

using Microsoft.SqlServer.Dts.Pipeline;

using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

namespace Microsoft.SqlServer.Dts.Samples

{

class Program

{

static void Main(string[] args)

{

// Create a package and add a Data Flow task.

Package package = new Package();

Executable e = package.Executables.Add("DTS.Pipeline.1");

TaskHost thMainPipe = e as TaskHost;

MainPipe dataFlowTask = thMainPipe.InnerObject as MainPipe;

// Create Application

Application app = new Application();

// Add an OLE DB connection manager to the package.

ConnectionManager conMgr = package.Connections.Add("OLEDB");

conMgr.ConnectionString = "Data Source=ROSIE\\ROSIE2005;" +

"Initial Catalog=AdventureWorks;Provider=SQLNCLI;" +

"Integrated Security=SSPI;Auto Translate=false;";

conMgr.Name = "SSIS Connection Manager for OLE DB";

conMgr.Description = "OLE DB connection to the " +

"AdventureWorks database.";

// Create and configure an OLE DB source component.

IDTSComponentMetaData90 source =

dataFlowTask.ComponentMetaDataCollection.New();

source.ComponentClassID = "DTSAdapter.OLEDBSource.1";

// Create the design-time instance of the source.

CManagedComponentWrapper srcDesignTime = source.Instantiate();

// The ProvideComponentProperties method creates a default output.

srcDesignTime.ProvideComponentProperties();

// Assign the connection manager.

source.RuntimeConnectionCollection[0].ConnectionManager =

DtsConvert.ToConnectionManager90(conMgr);

// Set the custom properties of the source.

srcDesignTime.SetComponentProperty("AccessMode", 2);

srcDesignTime.SetComponentProperty("SqlCommand",

"Select * from HumanResources.EmployeePayHistory");

srcDesignTime.SetComponentProperty("OpenRowset", "[AdventureWorks].[HumanResources].[EmployeePayHistory]");

// Need to set the ConnectionManagerID

if (source.RuntimeConnectionCollection.Count > 0)

{

source.RuntimeConnectionCollection[0].ConnectionManagerID =

conMgr.ID;

source.RuntimeConnectionCollection[0].ConnectionManager =

DtsConvert.ToConnectionManager90(conMgr);

}

// Connect to the data source,

// and then update the metadata for the source.

srcDesignTime.AcquireConnections(null);

srcDesignTime.ReinitializeMetaData();

srcDesignTime.ReleaseConnections();

// Add a flat file connection manager for the destination to the package.

ConnectionManager conMgr2 = package.Connections.Add("FlatFile");

conMgr2.ConnectionString = "C:\\Documents and Settings\\ddoorn" +

"\\My Documents\\Visual Studio 2005\\Projects\\" +

"DennisSampleProgram1\\EmployeePayHistory.txt";

conMgr2.Name = "SSIS Connection Manager for Flat File";

conMgr2.Description = "Flat File Destination Connection";

// Create Destination Component

IDTSComponentMetaData90 destination =

dataFlowTask.ComponentMetaDataCollection.New();

destination.Name = "Flat File Destination";

destination.ComponentClassID =

"DTSAdapter.FlatFileDestination.1";

CManagedComponentWrapper destDesignTime = destination.Instantiate();

destDesignTime.ProvideComponentProperties();

// Assign the connection manager.

destination.RuntimeConnectionCollection[0].ConnectionManager =

DtsConvert.ToConnectionManager90(conMgr2);

// Set Custom Properties

destDesignTime.SetComponentProperty("Overwrite", true);

// Assign an ID to the ConnectionManager

if (destination.RuntimeConnectionCollection.Count > 0)

{

destination.RuntimeConnectionCollection[0].ConnectionManagerID =

conMgr2.ID;

destination.RuntimeConnectionCollection[0].ConnectionManager =

DtsConvert.ToConnectionManager90(conMgr2);

}

// Create the path from source to destination.

IDTSPath90 path = dataFlowTask.PathCollection.New();

path.AttachPathAndPropagateNotifications(source.OutputCollection[0],

destination.InputCollection[0]);

// Get the destination's default input and virtual input.

IDTSInput90 input = destination.InputCollection[0];

IDTSVirtualInput90 vInput = input.GetVirtualInput();

// Iterate through the virtual input column collection.

foreach (IDTSVirtualInputColumn90 vColumn

in vInput.VirtualInputColumnCollection)

{

// Call the SetUsageType method of the destination

// to add each available virtual input column as an input column.

destDesignTime.SetUsageType(

input.ID, vInput, vColumn.LineageID,

DTSUsageType.UT_READONLY);

}

//map external metadata to the inputcolumn

//int index = 0;

foreach (IDTSInputColumn90 inputColumn in input.InputColumnCollection)

{

IDTSExternalMetadataColumn90 exMetaColumn =

input.ExternalMetadataColumnCollection.New();

//(IDTSExternalMetadataColumn90)input.ExternalMetadataColumnCollection[index++];

exMetaColumn.CodePage = inputColumn.CodePage;

exMetaColumn.DataType = inputColumn.DataType;

exMetaColumn.Length = inputColumn.Length;

exMetaColumn.Name = inputColumn.Name;

inputColumn.ExternalMetadataColumnID = exMetaColumn.ID;

destDesignTime.MapInputColumn(input.ID, inputColumn.ID, exMetaColumn.ID);

}

// Verify that the columns have been added to the input.

// This is only really required for debugging purposes

Console.WriteLine("Below are the columns that have been added " +

"to the input. Press Enter to Verify");

foreach (IDTSInputColumn90 inputColumn in

destination.InputCollection[0].InputColumnCollection)

{

Console.WriteLine(inputColumn.Name);

}

Console.Read();

// Connect to the flat file destination,

// and then update the metadata for the destination.

destDesignTime.AcquireConnections(null);

destDesignTime.ReinitializeMetaData();

destDesignTime.ReleaseConnections();

// Save Package to XML

app.SaveToXml("C:\\Documents and Settings\\ddoorn\\My Documents\\" +

"Visual Studio 2005\\Projects\\DennisSampleProgram1\\" +

"DennisSampleProgram1\\DennisSampleProject1.xml",

package, null);

} // main

} // program

} // namespace

Nevermind, I found an example that comes with SQL Server 2005/Visual Studio and I found my mistake.

Wednesday, March 21, 2012

No clue about this error

No clue what's causing this error, please help.

Server Error in '/learn' Application.

Unable to open the physical file "g:\inetpub\wwwroot\learn\App_Data\Personal.mdf". Operating system error 32: "32(The process cannot access the file because it is being used by another process.)".
An attempt to attach an auto-named database for file g:\inetpub\wwwroot\learn\App_Data\Personal.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.

If you have Personal.mdf opened in Server Explorer, try disconnecting and then running your page again.|||

Thanks a lot, it worked for me !!

hoopslife:

If you have Personal.mdf opened in Server Explorer, try disconnecting and then running your page again.
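If the auto-attach keeps failing, one workaround is to attach the file explicitly on the server and connect to the database by name instead of by AttachDbFilename, so only the server process ever owns the file. A sketch (the database name here is just inferred from the file name):

```sql
-- Attach the .mdf explicitly; SQL Server reuses the existing log,
-- or rebuilds one if the database was shut down cleanly.
CREATE DATABASE Personal
ON (FILENAME = 'g:\inetpub\wwwroot\learn\App_Data\Personal.mdf')
FOR ATTACH;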


no blocking with bulk insert

I am running in QA:
Bulk insert tableA from 'C:\file.tab'
In another window ,
select * from tableA gives me no records. I would assume some kind of
blocking while the bulk insert does its job. So what exactly does bulk
insert do? The job is running and the tab file has 50 million records, but
still in the second window I can query, and in sp_lock all I can see for the
table is an IX table lock and there's no blocking either. I am using SQL 2000.
So what's going on?|||In addition, after like 15 mins, it started blocking. So my question is: what
was it doing till then? Does it try to read the file first or something?
"Hassan" <fatima_ja@.hotmail.com> wrote in message
news:e#4f7ykUDHA.2112@.TK2MSFTNGP10.phx.gbl...
> I am running in QA:
> Bulk insert tableA from 'C:\file.tab'
> In another window ,
> select * from tableA gives me no records .. I would assume some kind of
> blocking while the bulk insert does it jobs. So what exactly does bulk
> insert to ? The job is running and the tab file has 50 million records but
> still on the second window i can query and in sp_lock all I can see for
the
> table is an IX tab lock and theres no blocking too. I am using SQL 2000
> So whats going on ?
>
>|||No, you are experiencing the effect of lock escalation. When you start the
bulk insert with no batch size or a batch size of 0 (zero), the complete file
will be inserted as one batch, so the locks accumulate and at a certain point
in time escalate to a table lock.
In order to prevent this you can take two approaches:
1) Use a batch size of 2499 to prevent the lock escalation from happening
2) Before you start the bulk insert, take a row lock on an entry in the table.
I always do this by inserting a dummy record at the end of the table. Keep
the lock. In another session do the bulk insert; because there is a lock
present, it cannot escalate to a table lock.
I prefer option 1.
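A sketch of option 1, reusing the statement from the original post (2499 is the figure given above; the exact escalation threshold can vary by server):

```sql
-- Commit in small batches so the accumulated row locks never
-- reach the point where they escalate to a table lock.
BULK INSERT tableA
FROM 'C:\file.tab'
WITH (BATCHSIZE = 2499);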
GertD@.SQLDev.Net
Please reply only to the newsgroups.
This posting is provided "AS IS" with no warranties, and confers no rights.
You assume all risk for your use.
Copyright © SQLDev.Net 1991-2003 All rights reserved.
"Hassan" <fatima_ja@.hotmail.com> wrote in message
news:uBzxW5kUDHA.2012@.TK2MSFTNGP10.phx.gbl...
> In addition after like 15 mins, it started blocking.. So my question is
what
> was it doing till then.. Does it try to read the file first or something ?
>
> "Hassan" <fatima_ja@.hotmail.com> wrote in message
> news:e#4f7ykUDHA.2112@.TK2MSFTNGP10.phx.gbl...
> > I am running in QA:
> >
> > Bulk insert tableA from 'C:\file.tab'
> >
> > In another window ,
> >
> > select * from tableA gives me no records .. I would assume some kind of
> > blocking while the bulk insert does it jobs. So what exactly does bulk
> > insert to ? The job is running and the tab file has 50 million records
but
> > still on the second window i can query and in sp_lock all I can see for
> the
> > table is an IX tab lock and theres no blocking too. I am using SQL 2000
> >
> > So whats going on ?
> >
> >
> >
>

no available input columns

Hi ,

I'm trying to build a package that will copy data from Excel to SQL Server

in a program

Unfortunately, when I open the package XML file and drill into the

OLE DB destination, I see that I have no available input columns.

What could be the problem?

thanks ahead

Eran

p.s

the script:

Dim p As Package = New Package()

Dim e As Executable = p.Executables.Add("DTS.Pipeline.1")

Dim thMainPipe As TaskHost = CType(e, TaskHost)

thMainPipe.Properties("Name").SetValue(thMainPipe, "Data Flow")

Dim dataFlowTask As MainPipe = CType(thMainPipe.InnerObject, MainPipe)

' Create excel connection MANAGER

Dim excelCon As ConnectionManager = p.Connections.Add("EXCEL")

excelCon.Name = "ExcelSourceConn"

excelCon.ConnectionString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=e:\try\try\try.XLS;Extended Properties=""Excel 8.0;HDR=YES"""

' Create sqldev connection Manager

Dim sqlCon As ConnectionManager = p.Connections.Add("OLEDB")

sqlCon.Name = "sqldevConn"

sqlCon.ConnectionString = "Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;User ID=sa;Initial Catalog=InsFocus_Admin_Eran;Data Source=SQLDEV\SQLDEV"

''create source component

Dim excelSource As IDTSComponentMetaData90 = dataFlowTask.ComponentMetaDataCollection.New()

excelSource.Name = "ExcelSource"

excelSource.ComponentClassID = "DTSAdapter.ExcelSource.1"

Dim excelInstance As CManagedComponentWrapper = excelSource.Instantiate()

excelInstance.ProvideComponentProperties()

excelSource.RuntimeConnectionCollection(0).ConnectionManager = DtsConvert.ToConnectionManager90(p.Connections(0))

excelInstance.SetComponentProperty("AccessMode", 0)

excelInstance.SetComponentProperty("OpenRowset", "business_codes$")

excelCon.AcquireConnection(Nothing)

'excelInstance.ReinitializeMetaData()

excelInstance.ReleaseConnections()

Dim sqldev As IDTSComponentMetaData90 = dataFlowTask.ComponentMetaDataCollection.New()

sqldev.Name = "sqldev"

sqldev.ComponentClassID = "DTSAdapter.OLEDBDestination.1"

Dim sqldevInstance As CManagedComponentWrapper = sqldev.Instantiate()

sqldevInstance.ProvideComponentProperties()

sqldev.RuntimeConnectionCollection(0).ConnectionManager = DtsConvert.ToConnectionManager90(p.Connections(1))

sqldevInstance.SetComponentProperty("AccessMode", 0)

sqldevInstance.SetComponentProperty("OpenRowset", "business_codes")

sqldevInstance.AcquireConnections(Nothing)

sqldevInstance.ReinitializeMetaData()

sqldevInstance.ReleaseConnections()

Dim path As IDTSPath90 = dataFlowTask.PathCollection.New()

path.AttachPathAndPropagateNotifications(excelSource.OutputCollection(0), sqldev.InputCollection(0))

MsgBox(excelSource.OutputCollection.Count)

'For Each input As IDTSInput90 In sqldev.InputCollection

' Dim vInput As IDTSVirtualInput90 = input.GetVirtualInput

' For Each vColumn As IDTSVirtualInputColumn90 In vInput.VirtualInputColumnCollection

' ' Call the SetUsageType method of the design time instance of the component.

' sqldevInstance.SetUsageType(input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY)

' Next

'Next

Dim app As Application = New Application()

app.SaveToXml("c:\myXMLPackage.dtsx", p, Nothing)

Hi,

I found your post and was wondering if you ever got a response to this? I'm having the same problem

Thanks

Cat

|||

When you open the package in the designer do you see any columns published on the source adapter?

Also to clarify: is the problem that the virtual column collection is empty in your program or the input column collection is empty when you look at your destination adapter afterwards?

The commented piece of code is supposed to add columns to the input column collection.

Thanks.


Friday, March 9, 2012

Newsgroup for attachments?

Is there a newsgroup where we can post file attachments (for code examples,
rdl files, etc.)?
Thank you,
--
Alain Quesnel
alainsansspam@.logiquel.com
www.logiquel.com|||On Nov 5, 10:16 pm, "Alain Quesnel" <alainsanss...@.logiquel.com>
wrote:
> Is there a newsgroup where we can post file attachements (for code examples,
> rdl files, etc.)?
> Thank you,
> --
> Alain Quesnel
> alainsanss...@.logiquel.com
> www.logiquel.com
This is by far the largest, highest traffic Reporting Services
newsgroup (and has the most responses and participation); however, 3
others have the option to upload files (and I'm sure there are others
as well):
http://groups.google.com/group/SQL-SERVER-REPORTING-SERVICES
http://groups.google.com/group/RS2005
http://groups.google.com/group/ReportingServices
Hope this helps.
Regards,
Enrique Martinez
Sr. Software Consultant|||I take it Microsoft doesn't have any newsgroups that accept attachments?
Thank you,
--
Alain Quesnel
alainsansspam@.logiquel.com
www.logiquel.com
"EMartinez" <emartinez.pr1@.gmail.com> wrote in message
news:1194323747.519601.245000@.57g2000hsv.googlegroups.com...
> On Nov 5, 10:16 pm, "Alain Quesnel" <alainsanss...@.logiquel.com>
> wrote:
>> Is there a newsgroup where we can post file attachements (for code
>> examples,
>> rdl files, etc.)?
>> Thank you,
>> --
>> Alain Quesnel
>> alainsanss...@.logiquel.com
>> www.logiquel.com
>
> This is by far the largest, highest traffic Reporting Services
> newsgroup (and has the most responses and participation); however, 3
> others have the option to upload files (and I'm sure there are others
> as well):
> http://groups.google.com/group/SQL-SERVER-REPORTING-SERVICES
> http://groups.google.com/group/RS2005
> http://groups.google.com/group/ReportingServices
> Hope this helps.
> Regards,
> Enrique Martinez
> Sr. Software Consultant
>

Saturday, February 25, 2012

Newbie: XML as datasource

Hi everyone,
I have a Web Form with data fields. What I want to do is to save these
fields into an XML file (dynamically), then hook this XML (as a data source)
to a SQL Report so that I don't have to save these data into a SQL Server
database. Is it possible? If it is then can someone show me how.
Any suggestion is greatly appreciated.
Many thanks in advance.|||There is currently no support for accessing an XML data source. You either
have to write your own custom XML data source extension or look on the market
for an XML OLEDB provider. Or search MSDN for sample code that implements an
XML data source extension. My current understanding is that it will be
included with the SQL Server 2005 release of Reporting Services.
HTH
Charles Kangai, MCT, MCDBA
"Calvin KD" wrote:
> Hi everyone,
> I have a Web Form with data fields. What I want to do is to save these
> fields into an XML file (dynamically), then hook this XML (as a data source)
> to a SQL Report so that I don't have to save these data into a SQL Server
> database. Is it possible? If it is then can someone show me how.
> Any suggestion is greatly appreciated.
> Many thanks in advance.|||Hi
I used the "Microsoft OLE DB Simple Provider"
1.) Create new Shared Data Source
General: "Provider=MSDAOSP.1;Data Source=MSXML2.DSOControl.2.6"
Credentials: "Use Windows Authentication (Integrated Security)"
2.) Create new Report using this Shared Data Source
Query: path to the xml-file "c:\temp\test.xml"
Kind Regards
Daniel
"Charles Kangai" <CharlesKangai@.discussions.microsoft.com> wrote in message
news:4A0288C1-9C16-4CDF-981A-88B89475616C@.microsoft.com...
> There is currently no support for accessing an XML data source. You either
> have to write your own custom XML data source extension or look on the
> market
> for an XML OLEDB provider. Or search MSDN for sample code that implements
> an
> XML data source extension. My current understanding is that it will be
> included with the SQL Server 2005 release of Reporting Services.
> HTH
> Charles Kangai, MCT, MCDBA
> "Calvin KD" wrote:
>> Hi everyone,
>> I have a Web Form with data fields. What I want to do is to save these
>> fields into an XML file (dynamically), then hook this XML (as a data
>> source)
>> to a SQL Report so that I don't have to save these data into a SQL Server
>> database. Is it possible? If it is then can someone show me how.
>> Any suggestion is greatly appreciated.
>> Many thanks in advance.