Saturday, February 25, 2012

Problem with delete, lock and transaction log

We have a problem with SQL Server 2000 SP3 / Windows 2003 EE.
We have 10 huge tables, and I have a procedure which deletes data from
these tables each day.
1. The deletes are very slow; it takes 20 minutes to delete 400,000
records. It does use an index when it deletes; I have seen the plan.
2. I am running the database in simple recovery mode, but the transaction
log still grows bigger and bigger during the delete process. I have issued a
checkpoint after every table delete but it didn't help. Is there a way
we can avoid writing to the transaction log, so that I can solve this
problem and improve the delete performance too?
3. If I stop the job, then everything that has been done by the procedure
gets rolled back. Does it not commit table by table within the procedure?
Is it waiting for the procedure to finish before it commits?
Can anyone please help me ASAP.
Thanks
Raghu|||Answers to your questions:
1. Disable the foreign keys if you can do the housekeeping after shutting down
the application [see the ALTER TABLE command].
2. No, you cannot avoid logging the transaction. The easy way is to shrink the
transaction log after the delete [see DBCC SHRINKFILE].
3. That depends on the way you handle the transaction. If it is a single
delete statement inside the procedure and you kill the job,
then everything pertaining to that delete will automatically be rolled back.
Thanks
Hari
SQL Server MVP
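To make Hari's two suggestions concrete, here is a minimal T-SQL sketch; the table, constraint, and log file names are placeholders invented for the example, not Raghu's actual objects:

-- Hypothetical names, for illustration only.
-- 1. Disable a foreign key before the housekeeping delete, then re-enable and re-check it afterwards.
ALTER TABLE dbo.OrderDetail NOCHECK CONSTRAINT FK_OrderDetail_Order
-- ... run the delete procedure here ...
ALTER TABLE dbo.OrderDetail WITH CHECK CHECK CONSTRAINT FK_OrderDetail_Order

-- 2. Shrink the transaction log file after the delete has committed.
--    'MyDB_Log' is the logical log file name (see sysfiles); 1000 is the target size in MB.
DBCC SHRINKFILE ('MyDB_Log', 1000)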
"raghu" <raghu.burju@.gmail.com> wrote in message
news:1154641875.483366.120510@.75g2000cwc.googlegroups.com...
> we have problem with SQL Server 2000 SP3 / Windows 2003 EE
> we have 10 huge tables, I have a procedure which deletes data from
> these tables each day.
> 1. The deletes are very slow, it takes 20 min to delete 400,000
> records. It does use index when it deletes, i have seen the plan
> 2. i am running the database in simple mode but still the transaction
> log grows bigger and bigger during the delete process. I have used
> checkpoint after every table delete but it didnt help. is there a way
> we can avoid writing to transaction log so that i can solve this
> problem and improve the delete performance too
> 3. If i stop the job then everything gets rolled back that is been done
> by procedure. does it not commit table by table within the procedure?
> is it waiting for the procedure to finish before it commits?
> Can any one please help me ASAP
> Thanks
> Raghu
>|||Thanks, Hari, for the quick response.
I have no foreign keys on these tables. All I have are indexes, two of
them: one is for the delete based on a timestamp, and the other is for a select which
includes 3 more columns.
DBCC SHRINKFILE only helps after I do the delete; even before the delete
is finished, my 100 GB transaction log is getting filled and stopping
all other applications from either inserting or deleting data.
I have 10 delete statements, so does this mean that everything that has been
attempted up to that point is rolled back? Can I do some kind of commit which
will clear the log as well as prevent the rollback if any problem
occurs? Using BEGIN and COMMIT TRANSACTION locks the table and won't allow
selects until it is committed; I want selects to keep working while the
delete is going on.
Raghu
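On the commit question: if each table's DELETE runs as its own transaction (or is simply left to autocommit, with no outer BEGIN TRANSACTION around the whole procedure), stopping the job only rolls back the statement that is in flight, and earlier tables stay deleted. A rough sketch with placeholder table and column names:

-- Placeholder names; one independent transaction per table.
BEGIN TRAN
    DELETE FROM dbo.Table1 WHERE LogDate < DATEADD(day, -15, GETDATE())
COMMIT TRAN

BEGIN TRAN
    DELETE FROM dbo.Table2 WHERE LogDate < DATEADD(day, -15, GETDATE())
COMMIT TRAN
-- ... repeat for the remaining tables ...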
Hari Prasad wrote:
> Answer for you questions:-
> 1. Disable the foreign key if you can do the house keeping after shutdown
> the application [ See Alter table command]
> 2. No you can not avoid to log the transaction. The easy way is shrink the
> transaction log fter delete [See DBCC SHRINKFILE]
> 3. That depends up the way you handle the transaction.. If its a single
> delete statement inside the procedure and if you kill the job;
> then automatically every thing pertaning to that delete will be rolled back
> Thanks
> Hari
> SQL Server MVP
>
>
> "raghu" <raghu.burju@.gmail.com> wrote in message
> news:1154641875.483366.120510@.75g2000cwc.googlegroups.com...
> > we have problem with SQL Server 2000 SP3 / Windows 2003 EE
> > we have 10 huge tables, I have a procedure which deletes data from
> > these tables each day.
> >
> > 1. The deletes are very slow, it takes 20 min to delete 400,000
> > records. It does use index when it deletes, i have seen the plan
> >
> > 2. i am running the database in simple mode but still the transaction
> > log grows bigger and bigger during the delete process. I have used
> > checkpoint after every table delete but it didnt help. is there a way
> > we can avoid writing to transaction log so that i can solve this
> > problem and improve the delete performance too
> >
> > 3. If i stop the job then everything gets rolled back that is been done
> > by procedure. does it not commit table by table within the procedure?
> > is it waiting for the procedure to finish before it commits?
> >
> > Can any one please help me ASAP
> >
> > Thanks
> > Raghu
> >|||raghu wrote:
> we have problem with SQL Server 2000 SP3 / Windows 2003 EE
> we have 10 huge tables, I have a procedure which deletes data from
> these tables each day.
> 1. The deletes are very slow, it takes 20 min to delete 400,000
> records. It does use index when it deletes, i have seen the plan
> 2. i am running the database in simple mode but still the transaction
> log grows bigger and bigger during the delete process. I have used
> checkpoint after every table delete but it didnt help. is there a way
> we can avoid writing to transaction log so that i can solve this
> problem and improve the delete performance too
> 3. If i stop the job then everything gets rolled back that is been done
> by procedure. does it not commit table by table within the procedure?
> is it waiting for the procedure to finish before it commits?
> Can any one please help me ASAP
> Thanks
> Raghu
>
You could use TRUNCATE TABLE, which is minimally logged. Be sure to read the
Books Online entry first; there are some caveats to using it vs. DELETE.
Tracy McKibben
MCDBA
http://www.realsqlguy.com|||Tracy
TRUNCATE TABLE would empty the entire table. I am just trying to clean out
data that is more than 15 days old, and this is a job which does this each day.
Raghu
Tracy McKibben wrote:
> raghu wrote:
> > we have problem with SQL Server 2000 SP3 / Windows 2003 EE
> > we have 10 huge tables, I have a procedure which deletes data from
> > these tables each day.
> >
> > 1. The deletes are very slow, it takes 20 min to delete 400,000
> > records. It does use index when it deletes, i have seen the plan
> >
> > 2. i am running the database in simple mode but still the transaction
> > log grows bigger and bigger during the delete process. I have used
> > checkpoint after every table delete but it didnt help. is there a way
> > we can avoid writing to transaction log so that i can solve this
> > problem and improve the delete performance too
> >
> > 3. If i stop the job then everything gets rolled back that is been done
> > by procedure. does it not commit table by table within the procedure?
> > is it waiting for the procedure to finish before it commits?
> >
> > Can any one please help me ASAP
> >
> > Thanks
> > Raghu
> >
> You could use TRUNCATE TABLE, which is non-logged. Be sure to read the
> Books Online entry first, there are some caveats to using this vs. DELETE.
>
> --
> Tracy McKibben
> MCDBA
> http://www.realsqlguy.com|||raghu wrote:
> Tracy
> Truncate table would clean the entire table. I am just trying to clean
> old data that is 15 days old and this is a job which does this each day
>
I guess I missed that in your original post...
Tracy McKibben
MCDBA
http://www.realsqlguy.com|||I have resolved one problem by splitting the delete job into 2
different procedures, so the transaction log doesn't grow big now.
The performance of the delete is still not improved; it takes the same
amount of time as mentioned. Does anyone have any idea? FYI,
the deletes are too slow: it takes 20 minutes to delete 400,000 records.
Raghu
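A common way on SQL Server 2000 to speed this up and keep both locking and log growth under control is to delete in small batches, so each batch commits quickly and, in simple recovery, the checkpoint can reuse log space between batches. This is only a sketch under assumed names; the table, the timestamp column, and the batch size are placeholders:

-- Delete rows older than 15 days in batches of 10,000 (hypothetical table and column names).
SET ROWCOUNT 10000            -- SQL 2000 way to limit each DELETE to 10,000 rows

WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable
    WHERE  LogDate < DATEADD(day, -15, GETDATE())

    IF @@ROWCOUNT = 0 BREAK   -- nothing left to delete

    CHECKPOINT                -- in simple recovery, lets inactive log space be reused
END

SET ROWCOUNT 0                -- always reset ROWCOUNT afterwards

Because each batch is its own short transaction, other sessions are blocked for at most one batch at a time, and a cancelled job loses only the batch that was running. Also make sure the timestamp column used in the WHERE clause is the leading column of an index, otherwise every batch still scans the table.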
raghu wrote:
> Thanks hari for the quick response.
> I have no foreign keys on these tables. All i have is indexes to of
> them one is for delete based on timestamp and other is for select which
> includes 3 more columns.
> DBCC SHRINKFILE is only after i do the delete, even before the delete
> is finished my 100 GB transaction log is getting filled and stopping
> all other application to either insert or delete data.
> I have 10 delete statements so this mean that everything that it has
> tried till that time is rolledback. I can do some kind of commit which
> will clear the log as well as prevent the rollback if any problem
> occurs. The begin and end transaction locks the table and wont allow to
> make selects till it is commited, i want where selects can work when
> delete is going on
> Raghu
> Hari Prasad wrote:
> > Answer for you questions:-
> >
> > 1. Disable the foreign key if you can do the house keeping after shutdown
> > the application [ See Alter table command]
> >
> > 2. No you can not avoid to log the transaction. The easy way is shrink the
> > transaction log fter delete [See DBCC SHRINKFILE]
> >
> > 3. That depends up the way you handle the transaction.. If its a single
> > delete statement inside the procedure and if you kill the job;
> > then automatically every thing pertaning to that delete will be rolled back
> >
> > Thanks
> > Hari
> > SQL Server MVP
> >
> >
> >
> >
> > "raghu" <raghu.burju@.gmail.com> wrote in message
> > news:1154641875.483366.120510@.75g2000cwc.googlegroups.com...
> > > we have problem with SQL Server 2000 SP3 / Windows 2003 EE
> > > we have 10 huge tables, I have a procedure which deletes data from
> > > these tables each day.
> > >
> > > 1. The deletes are very slow, it takes 20 min to delete 400,000
> > > records. It does use index when it deletes, i have seen the plan
> > >
> > > 2. i am running the database in simple mode but still the transaction
> > > log grows bigger and bigger during the delete process. I have used
> > > checkpoint after every table delete but it didnt help. is there a way
> > > we can avoid writing to transaction log so that i can solve this
> > > problem and improve the delete performance too
> > >
> > > 3. If i stop the job then everything gets rolled back that is been done
> > > by procedure. does it not commit table by table within the procedure?
> > > is it waiting for the procedure to finish before it commits?
> > >
> > > Can any one please help me ASAP
> > >
> > > Thanks
> > > Raghu
> > >

Problem with delete trigger

I am trying to write a delete trigger that fires when the user deletes the only record for a given entity in the table. In this situation I have an entity PMA which can have one or many uses. I want the trigger to fire if the user tries to delete the last use record for that PMA_NUM from the PMA_USE table.

Here is my code:

if (SELECT Count(pma_use.pma_num) FROM dbo.PMA_USE
inner join dbo.deleted as D1 on dbo.PMA_USE.PMA_NUM= d1.PMA_NUM) = 0
BEGIN
RAISERROR ('Cannot delete only Use for PMA!', 16, 1)
ROLLBACK TRANSACTION
END

I seem to be getting inconsistent results, both when I create the trigger and when I try to test it, with respect to the deleted table.

Sometimes when I run the trigger script in Management Studio Express it has trouble with my use of dbo.deleted, and sometimes it doesn't.

When I try to test the trigger by using MSE to view the table and delete the last use record for a PMA, I get an invalid object error: cannot find 'dbo.deleted' or cannot find 'deleted'.

What am I doing wrong?

Roberta

1 - Use just [deleted]

2 - You do not need COUNT to check whether any matching rows remain. Use the EXISTS operator (negated here, since the error should fire when no use rows are left):

if not exists(select * from dbo.PMA_USE as a inner join deleted as b on a.PMA_NUM = b.PMA_NUM)

BEGIN
RAISERROR ('Cannot delete only Use for PMA!', 16, 1)
ROLLBACK TRANSACTION
END

AMB
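Putting the pieces together, a complete trigger might look roughly like the sketch below. The table and column names come from the question; the trigger name is invented for the example, and the inner NOT EXISTS makes it behave sensibly even when one DELETE statement touches several PMA_NUM values:

CREATE TRIGGER trg_PMA_USE_KeepLastUse   -- hypothetical name
ON dbo.PMA_USE
AFTER DELETE
AS
BEGIN
    -- If any deleted PMA_NUM now has no remaining rows in PMA_USE,
    -- the statement removed the last use for that PMA: block it.
    IF EXISTS (SELECT *
               FROM deleted AS d
               WHERE NOT EXISTS (SELECT *
                                 FROM dbo.PMA_USE AS u
                                 WHERE u.PMA_NUM = d.PMA_NUM))
    BEGIN
        RAISERROR ('Cannot delete only Use for PMA!', 16, 1)
        ROLLBACK TRANSACTION
    END
END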

problem with delete statement

Hi guys,

I need some help on this issue. I have a table with an nvarchar-type column that has a primary key constraint. I have made a procedure which deletes records from this table using the primary key. My primary key is TxnID, and when I ran the following statement:

DELETE from mytable where TxnID = '119DA-117440520'

it deleted the record with the ID '119DA-1174405208'.

I don't want this to happen; I only want to delete records with the exact IDs that I provide. Can anyone tell me how I should do that?

Thanks in advance.

Sorry! I managed to fix it... actually, the width was not defined properly, which was causing the problem.

sorry again
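For anyone hitting the same symptom: a frequent cause is declaring an nvarchar variable or parameter without a length, which T-SQL silently treats as nvarchar(1) (or 30 characters inside CAST/CONVERT), so values get truncated before the comparison. A small sketch of the pitfall; the procedure name and the length of 50 are made up:

DECLARE @WithLength nvarchar(20), @NoLength nvarchar
SET @WithLength = N'119DA-1174405208'   -- keeps all 16 characters
SET @NoLength   = N'119DA-1174405208'   -- nvarchar with no length = nvarchar(1): silently becomes N'1'
SELECT @WithLength AS WithLength, @NoLength AS NoLength

-- In the delete procedure, give the parameter the full column width, for example:
-- CREATE PROCEDURE dbo.DeleteTxn @TxnID nvarchar(50) AS
--     DELETE FROM mytable WHERE TxnID = @TxnID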

Problem with Delete On Linked Server

I am getting time-out errors when I try to perform a simple delete on a
linked server. The command is:

Delete From MyTable Where PKID=12345

I have tried executing this command directly from Query Analyzer, but it
just times out. However, when I run it from QA against the server itself
(rather than from my local server against a linked server), it executes
immediately. Similarly, if I run the same SQL command through an ODBC linked
table in an Access 2000 MDB file (tweaking the syntax slightly), it also
executes immediately. Only if I run it from SQL 7 as a stored procedure or a
QA command against a linked server does it time-out.

I have no problems inserting rows using any of the three methods noted
above. It is only with deleting rows through the linked server that it times
out.

Thanks for any assistance.

Neil|||Correction: The syntax used against the linked server is the four-part
syntax:

Delete From svr.db.dbo.MyTable Where PKID=12345

Also, the same problem occurs with updates as with deletes (but not
inserts).

Thanks,

Neil

"Neil" <nospam@.nospam.net> wrote in message
news:_Jqag.5464$u4.5402@.newsread1.news.pas.earthli nk.net...
>I am getting time-out errors when I try to perform a simple delete on a
>linked server. The command is:
> Delete From MyTable Where PKID=12345
> I have tried executing this command directly from Query Analyzer, but it
> just times out. However, when I run it from QA against the server itself
> (rather than from my local server against a linked server), it executes
> immediately. Similarly, if I run the same SQL command through an ODBC
> linked table in an Access 2000 MDB file (tweaking the syntax slightly), it
> also executes immediately. Only if I run it from SQL 7 as a stored
> procedure or a QA command against a linked server does it time-out.
> I have no problems inserting rows using any of the three methods noted
> above. It is only with deleting rows through the linked server that it
> times out.
> Thanks for any assistance.
> Neil|||I ran a profiler trace while inserting and then deleting a row from the
table on the linked server. I added all events to the trace. However, for
both the insert and delete, it only showed a single SQL:BatchCompleted
event -- except that after the line for the delete command, there was a
second, blank, line in the trace. The two lines for the delete didn't show
in the profiler until the batch was canceled. Below are the lines from
Profiler. Any assistance is appreciated.

Thanks,

Neil

Event Class Text Application Name NT User Name SQL User Name CPU Reads
Writes Duration Connection ID SPID Start Time
+SQL:BatchCompleted insert into abcweb.abc.dbo.images (ImageID) values
(99986) MS SQL Query Analyzer neil sa 0 13 0 640 5264 12 02:23:31.450
+SQL:BatchCompleted Delete From abcweb.abc.dbo.images Where ImageID=99986
MS SQL Query Analyzer neil sa 0 20 1 14440 5264 12 02:23:42.387
SQL:BatchCompleted MS SQL Query Analyzer neil sa 0 0 0 0 5264 12
02:23:56.827

"Neil" <nospam@.nospam.net> wrote in message
news:_Jqag.5464$u4.5402@.newsread1.news.pas.earthli nk.net...
>I am getting time-out errors when I try to perform a simple delete on a
>linked server. The command is:
> Delete From MyTable Where PKID=12345
> I have tried executing this command directly from Query Analyzer, but it
> just times out. However, when I run it from QA against the server itself
> (rather than from my local server against a linked server), it executes
> immediately. Similarly, if I run the same SQL command through an ODBC
> linked table in an Access 2000 MDB file (tweaking the syntax slightly), it
> also executes immediately. Only if I run it from SQL 7 as a stored
> procedure or a QA command against a linked server does it time-out.
> I have no problems inserting rows using any of the three methods noted
> above. It is only with deleting rows through the linked server that it
> times out.
> Thanks for any assistance.
> Neil|||Which box were you running the trace against? Try running it
against the destination (the linked server) instead of the
server that you are executing the command from.
Another thing to see if it works - try executing the
statement using OpenQuery instead of the 4 part name and see
if that makes a difference.

-Sue

On Wed, 17 May 2006 06:23:30 GMT, "Neil" <nospam@.nospam.net>
wrote:

>I ran a profiler trace while inserting and then deleting a row from the
>table on the linked server. I added all events to the trace. However, for
>both the insert and delete, it only showed a single SQL:BatchCompleted
>event -- except that after the line for the delete command, there was a
>second, blank, line in the trace. The two lines for the delete didn't show
>in the profiler until the batch was canceled. Below are the lines from
>Profiler. Any assistance is appreciated.
>Thanks,
>Neil
>Event Class Text Application Name NT User Name SQL User Name CPU Reads
>Writes Duration Connection ID SPID Start Time
>+SQL:BatchCompleted insert into abcweb.abc.dbo.images (ImageID) values
>(99986) MS SQL Query Analyzer neil sa 0 13 0 640 5264 12 02:23:31.450
>+SQL:BatchCompleted Delete From abcweb.abc.dbo.images Where ImageID=99986
>MS SQL Query Analyzer neil sa 0 20 1 14440 5264 12 02:23:42.387
> SQL:BatchCompleted MS SQL Query Analyzer neil sa 0 0 0 0 5264 12
>02:23:56.827
>
>"Neil" <nospam@.nospam.net> wrote in message
>news:_Jqag.5464$u4.5402@.newsread1.news.pas.earthli nk.net...
>>I am getting time-out errors when I try to perform a simple delete on a
>>linked server. The command is:
>>
>> Delete From MyTable Where PKID=12345
>>
>> I have tried executing this command directly from Query Analyzer, but it
>> just times out. However, when I run it from QA against the server itself
>> (rather than from my local server against a linked server), it executes
>> immediately. Similarly, if I run the same SQL command through an ODBC
>> linked table in an Access 2000 MDB file (tweaking the syntax slightly), it
>> also executes immediately. Only if I run it from SQL 7 as a stored
>> procedure or a QA command against a linked server does it time-out.
>>
>> I have no problems inserting rows using any of the three methods noted
>> above. It is only with deleting rows through the linked server that it
>> times out.
>>
>> Thanks for any assistance.
>>
>> Neil
>|||Couldn't run a trace against the linked server, as I don't have permissions
to execute the trace sp. Sent a note to the web admin to give me
permissions.

In the meantime, I used OpenQuery as you suggested. This time at least I got
an error message (yea!), though somewhat cryptic:

"Could not process object 'Delete From images Where ImageID=99986'. The
OLE DB provider 'SQLOLEDB' indicates that the object has no columns."

Any ideas about what that means?

Thanks!

Neil

"Sue Hoegemeier" <Sue_H@.nomail.please> wrote in message
news:bh9m62hksvsaa8fdoob0sphsd32jgqc36h@.4ax.com...
> Which box were you running the trace against? Try running it
> against the destination (the linked server) instead of the
> server that you are executing the command from.
> Another thing to see if it works - try executing the
> statement using OpenQuery instead of the 4 part name and see
> if that makes a difference.
> -Sue
> On Wed, 17 May 2006 06:23:30 GMT, "Neil" <nospam@.nospam.net>
> wrote:
>>I ran a profiler trace while inserting and then deleting a row from the
>>table on the linked server. I added all events to the trace. However, for
>>both the insert and delete, it only showed a single SQL:BatchCompleted
>>event -- except that after the line for the delete command, there was a
>>second, blank, line in the trace. The two lines for the delete didn't show
>>in the profiler until the batch was canceled. Below are the lines from
>>Profiler. Any assistance is appreciated.
>>
>>Thanks,
>>
>>Neil
>>
>>Event Class Text Application Name NT User Name SQL User Name CPU Reads
>>Writes Duration Connection ID SPID Start Time
>>+SQL:BatchCompleted insert into abcweb.abc.dbo.images (ImageID) values
>>(99986) MS SQL Query Analyzer neil sa 0 13 0 640 5264 12 02:23:31.450
>>+SQL:BatchCompleted Delete From abcweb.abc.dbo.images Where ImageID=99986
>>MS SQL Query Analyzer neil sa 0 20 1 14440 5264 12 02:23:42.387
>> SQL:BatchCompleted MS SQL Query Analyzer neil sa 0 0 0 0 5264 12
>>02:23:56.827
>>
>>
>>
>>"Neil" <nospam@.nospam.net> wrote in message
>>news:_Jqag.5464$u4.5402@.newsread1.news.pas.earthli nk.net...
>>>I am getting time-out errors when I try to perform a simple delete on a
>>>linked server. The command is:
>>>
>>> Delete From MyTable Where PKID=12345
>>>
>>> I have tried executing this command directly from Query Analyzer, but it
>>> just times out. However, when I run it from QA against the server itself
>>> (rather than from my local server against a linked server), it executes
>>> immediately. Similarly, if I run the same SQL command through an ODBC
>>> linked table in an Access 2000 MDB file (tweaking the syntax slightly),
>>> it
>>> also executes immediately. Only if I run it from SQL 7 as a stored
>>> procedure or a QA command against a linked server does it time-out.
>>>
>>> I have no problems inserting rows using any of the three methods noted
>>> above. It is only with deleting rows through the linked server that it
>>> times out.
>>>
>>> Thanks for any assistance.
>>>
>>> Neil
>>>
>|||OK, I found out that the error I got originally (per other message) was due
to the fact that Delete doesn't return rows, as OpenQuery is looking for.
So, per http://support.microsoft.com/defaul...b;en-us;Q270119, I
changed it to:

Delete OPENQUERY(abcweb, 'Select ImageID From images Where ImageID=99987')

and it worked! Would still be good to find out why the original method
wouldn't work for deletes or updates, but did work for inserts. But at least
this works.

Thanks!

Neil
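Since updates through the four-part name were timing out in the same way, the KB 270119 pattern can be applied to them too. A sketch against the same linked server and table; the ImageName column is made up purely for illustration:

-- Update rows by modifying the rowset that OPENQUERY returns (ImageName is a hypothetical column).
UPDATE OPENQUERY(abcweb, 'SELECT ImageID, ImageName FROM images WHERE ImageID = 99987')
SET ImageName = 'updated.jpg'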

"Sue Hoegemeier" <Sue_H@.nomail.please> wrote in message
news:bh9m62hksvsaa8fdoob0sphsd32jgqc36h@.4ax.com...
> Which box were you running the trace against? Try running it
> against the destination (the linked server) instead of the
> server that you are executing the command from.
> Another thing to see if it works - try executing the
> statement using OpenQuery instead of the 4 part name and see
> if that makes a difference.
> -Sue
> On Wed, 17 May 2006 06:23:30 GMT, "Neil" <nospam@.nospam.net>
> wrote:
>>I ran a profiler trace while inserting and then deleting a row from the
>>table on the linked server. I added all events to the trace. However, for
>>both the insert and delete, it only showed a single SQL:BatchCompleted
>>event -- except that after the line for the delete command, there was a
>>second, blank, line in the trace. The two lines for the delete didn't show
>>in the profiler until the batch was canceled. Below are the lines from
>>Profiler. Any assistance is appreciated.
>>
>>Thanks,
>>
>>Neil
>>
>>Event Class Text Application Name NT User Name SQL User Name CPU Reads
>>Writes Duration Connection ID SPID Start Time
>>+SQL:BatchCompleted insert into abcweb.abc.dbo.images (ImageID) values
>>(99986) MS SQL Query Analyzer neil sa 0 13 0 640 5264 12 02:23:31.450
>>+SQL:BatchCompleted Delete From abcweb.abc.dbo.images Where ImageID=99986
>>MS SQL Query Analyzer neil sa 0 20 1 14440 5264 12 02:23:42.387
>> SQL:BatchCompleted MS SQL Query Analyzer neil sa 0 0 0 0 5264 12
>>02:23:56.827
>>
>>
>>
>>"Neil" <nospam@.nospam.net> wrote in message
>>news:_Jqag.5464$u4.5402@.newsread1.news.pas.earthli nk.net...
>>>I am getting time-out errors when I try to perform a simple delete on a
>>>linked server. The command is:
>>>
>>> Delete From MyTable Where PKID=12345
>>>
>>> I have tried executing this command directly from Query Analyzer, but it
>>> just times out. However, when I run it from QA against the server itself
>>> (rather than from my local server against a linked server), it executes
>>> immediately. Similarly, if I run the same SQL command through an ODBC
>>> linked table in an Access 2000 MDB file (tweaking the syntax slightly),
>>> it
>>> also executes immediately. Only if I run it from SQL 7 as a stored
>>> procedure or a QA command against a linked server does it time-out.
>>>
>>> I have no problems inserting rows using any of the three methods noted
>>> above. It is only with deleting rows through the linked server that it
>>> times out.
>>>
>>> Thanks for any assistance.
>>>
>>> Neil
>>>
>

Problem with Delete function within a Gridview

I've got an issue: when I update a record in the gridview it works fine, but when I click the delete link to remove the record from the database, I get the following error: "System.FormatException: Input string was not in a correct format". Part of the stack trace refers to "String oldValuesParameterFormatString". This parameter is in my SqlDataSource; it was dynamically created when I originally created the SqlDataSource with VWD 2005 Express Edition. The delete function will work if I remove OldValuesParameterFormatString="original_{0}" ProviderName="System.Data.SqlClient" and any reference to "original_" in the DeleteCommand of the SqlDataSource, but if I do, the update function doesn't work. Anyway, here's the SqlDataSource. Any help would be greatly appreciated!!!!!

<asp:SqlDataSource ID="SqlDataSource1" runat="server" ConflictDetection="CompareAllValues"

ConnectionString="<%$ ConnectionStrings:LocalSqlServer %>"

DeleteCommand="DELETE FROM [houses] WHERE [intID] = @.original_intID AND [street] = @.original_street AND [city] = @.original_city AND [state] = @.original_state AND [zip] = @.original_zip AND [status] = @.original_status AND [pDate] = @.original_pDate AND [sPrice] = @.original_sPrice AND [asPrice] = @.original_asPrice AND [actSalePrice] = @.original_actSalePrice AND [cToDate] = @.original_cToDate AND [rehabBudget] = @.original_rehabBudget AND [tDay] = @.original_tDay AND [eDate] = @.original_eDate AND [loDate] = @.original_loDate AND [rsDate] = @.original_rsDate AND [flooringDate] = @.original_flooringDate AND [estCompDate] = @.original_estCompDate AND [actCompDate] = @.original_actCompDate AND [coe] = @.original_coe AND [lDate] = @.original_lDate AND [credits] = @.original_credits AND [agent] = @.original_agent AND [insComplete] = @.original_insComplete AND [cEscrowDate] = @.original_cEscrowDate AND [bidValue] = @.original_bidValue AND [thomGuideNumber] = @.original_thomGuideNumber AND [locksmith] = @.original_locksmith AND [notes] = @.original_notes AND [hoa] = @.original_hoa"

InsertCommand="INSERT INTO [houses] ([street], [city], [state], [zip], [status], [pDate], [sPrice], [asPrice], [actSalePrice], [cToDate], [rehabBudget], [tDay], [eDate], [loDate], [rsDate], [flooringDate], [estCompDate], [actCompDate], [coe], [lDate], [credits], [agent], [insComplete], [cEscrowDate], [bidValue], [thomGuideNumber], [locksmith], [notes], [hoa]) VALUES (@.street, @.city, @.state, @.zip, @.status, @.pDate, @.sPrice, @.asPrice, @.actSalePrice, @.cToDate, @.rehabBudget, @.tDay, @.eDate, @.loDate, @.rsDate, @.flooringDate, @.estCompDate, @.actCompDate, @.coe, @.lDate, @.credits, @.agent, @.insComplete, @.cEscrowDate, @.bidValue, @.thomGuideNumber, @.locksmith, @.notes, @.hoa)"

OldValuesParameterFormatString="original_{0}" ProviderName="System.Data.SqlClient"

SelectCommand="SELECT * FROM [houses] WHERE ([intID] = @.intID)"

UpdateCommand="UPDATE [houses] SET [street] = @.street, [city] = @.city, [state] = @.state, [zip] = @.zip, [status] = @.status, [pDate] = @.pDate, [sPrice] = @.sPrice, [asPrice] = @.asPrice, [actSalePrice] = @.actSalePrice, [cToDate] = @.cToDate, [rehabBudget] = @.rehabBudget, [tDay] = @.tDay, [eDate] = @.eDate, [loDate] = @.loDate, [rsDate] = @.rsDate, [flooringDate] = @.flooringDate, [estCompDate] = @.estCompDate, [actCompDate] = @.actCompDate, [coe] = @.coe, [lDate] = @.lDate, [credits] = @.credits, [agent] = @.agent, [insComplete] = @.insComplete, [cEscrowDate] = @.cEscrowDate, [bidValue] = @.bidValue, [thomGuideNumber] = @.thomGuideNumber, [locksmith] = @.locksmith, [notes] = @.notes, [hoa] = @.hoa WHERE [intID] = @.original_intID">

<DeleteParameters>
<asp:ParameterName="original_intID"Type="Int32"/>
<asp:ParameterName="original_street"Type="String"/>
<asp:ParameterName="original_city"Type="String"/>
<asp:ParameterName="original_state"Type="String"/>
<asp:ParameterName="original_zip"Type="String"/>
<asp:ParameterName="original_status"Type="String"/>
<asp:ParameterName="original_pDate"Type="DateTime"/>
<asp:ParameterName="original_sPrice"Type="Decimal"/>
<asp:ParameterName="original_asPrice"Type="Decimal"/>
<asp:ParameterName="original_actSalePrice"Type="Decimal"/>
<asp:ParameterName="original_cToDate"Type="Decimal"/>
<asp:ParameterName="original_rehabBudget"Type="Decimal"/>
<asp:ParameterName="original_tDay"Type="DateTime"/>
<asp:ParameterName="original_eDate"Type="DateTime"/>
<asp:ParameterName="original_loDate"Type="DateTime"/>
<asp:ParameterName="original_rsDate"Type="DateTime"/>
<asp:ParameterName="original_flooringDate"Type="DateTime"/>
<asp:ParameterName="original_estCompDate"Type="DateTime"/>
<asp:ParameterName="original_actCompDate"Type="DateTime"/>
<asp:ParameterName="original_coe"Type="DateTime"/>
<asp:ParameterName="original_lDate"Type="DateTime"/>
<asp:ParameterName="original_credits"Type="String"/>
<asp:ParameterName="original_agent"Type="String"/>
<asp:ParameterName="original_insComplete"Type="String"/>
<asp:ParameterName="original_cEscrowDate"Type="DateTime"/>
<asp:ParameterName="original_bidValue"Type="Decimal"/>
<asp:ParameterName="original_thomGuideNumber"Type="String"/>
<asp:ParameterName="original_locksmith"Type="String"/>
<asp:ParameterName="original_notes"Type="String"/>
<asp:ParameterName="original_hoa"Type="String"/>
</DeleteParameters><UpdateParameters>
<asp:ParameterName="street"Type="String"/>
<asp:ParameterName="city"Type="String"/>
<asp:ParameterName="state"Type="String"/>
<asp:ParameterName="zip"Type="String"/>
<asp:ParameterName="status"Type="String"/>
<asp:ParameterName="pDate"Type="DateTime"/>
<asp:ParameterName="sPrice"Type="Decimal"/>
<asp:ParameterName="asPrice"Type="Decimal"/>
<asp:ParameterName="actSalePrice"Type="Decimal"/>
<asp:ParameterName="cToDate"Type="Decimal"/>
<asp:ParameterName="rehabBudget"Type="Decimal"/>
<asp:ParameterName="tDay"Type="DateTime"/>
<asp:ParameterName="eDate"Type="DateTime"/>
<asp:ParameterName="loDate"Type="DateTime"/>
<asp:ParameterName="rsDate"Type="DateTime"/>
<asp:ParameterName="flooringDate"Type="DateTime"/>
<asp:ParameterName="estCompDate"Type="DateTime"/>
<asp:ParameterName="actCompDate"Type="DateTime"/>
<asp:ParameterName="coe"Type="DateTime"/>
<asp:ParameterName="lDate"Type="DateTime"/>
<asp:ParameterName="credits"Type="String"/>
<asp:ParameterName="agent"Type="String"/>
<asp:ParameterName="insComplete"Type="String"/>
<asp:ParameterName="cEscrowDate"Type="DateTime"/>
<asp:ParameterName="bidValue"Type="Decimal"/>
<asp:ParameterName="thomGuideNumber"Type="String"/>
<asp:ParameterName="locksmith"Type="String"/>
<asp:ParameterName="notes"Type="String"/>
<asp:ParameterName="hoa"Type="String"/>
<asp:ParameterName="original_intID"Type="Int32"/>
<asp:ParameterName="original_street"Type="String"/>
<asp:ParameterName="original_city"Type="String"/>
<asp:ParameterName="original_state"Type="String"/>
<asp:ParameterName="original_zip"Type="String"/>
<asp:ParameterName="original_status"Type="String"/>
<asp:ParameterName="original_pDate"Type="DateTime"/>
<asp:ParameterName="original_sPrice"Type="Decimal"/>
<asp:ParameterName="original_asPrice"Type="Decimal"/>
<asp:ParameterName="original_actSalePrice"Type="Decimal"/>
<asp:ParameterName="original_cToDate"Type="Decimal"/>
<asp:ParameterName="original_rehabBudget"Type="Decimal"/>
<asp:ParameterName="original_tDay"Type="DateTime"/>
<asp:ParameterName="original_eDate"Type="DateTime"/>
<asp:ParameterName="original_loDate"Type="DateTime"/>
<asp:ParameterName="original_rsDate"Type="DateTime"/>
<asp:ParameterName="original_flooringDate"Type="DateTime"/>
<asp:ParameterName="original_estCompDate"Type="DateTime"/>
<asp:ParameterName="original_actCompDate"Type="DateTime"/>
<asp:ParameterName="original_coe"Type="DateTime"/>
<asp:ParameterName="original_lDate"Type="DateTime"/>
<asp:ParameterName="original_credits"Type="String"/>
<asp:ParameterName="original_agent"Type="String"/>
<asp:ParameterName="original_insComplete"Type="String"/>
<asp:ParameterName="original_cEscrowDate"Type="DateTime"/>
<asp:ParameterName="original_bidValue"Type="Decimal"/>
<asp:ParameterName="original_thomGuideNumber"Type="String"/>
<asp:ParameterName="original_locksmith"Type="String"/>
<asp:ParameterName="original_notes"Type="String"/>
<asp:ParameterName="original_hoa"Type="String"/>
</UpdateParameters><SelectParameters>
<asp:QueryStringParameterName="intID"QueryStringField="intID"Type="Int32"/>
</SelectParameters><InsertParameters>
<!-- removed to save space -->
</InsertParameters></asp:SqlDataSource>

If intID is the primary key of the table, you can do it in a simple way. Please set DataKeyNames="intID" in your gridview and try:

DeleteCommand="DELETE FROM [houses] WHERE [intID] = @.intID"
<DeleteParameters>
<asp:Parameter Name="intID" Type="Int32" />

</DeleteParameters>
 
|||

limno:

If intID is your primary key fo the table, you can do it in a simple way. Please set DataKeyNames="intID" in your gridview and try:

DeleteCommand="DELETE FROM [houses] WHERE [intID] = @.intID"
<DeleteParameters>
<asp:ParameterName="intID"Type="Int32"/>

</DeleteParameters>

Hey limno, thanks for the help. Unfortunately, it doesn't work. The record is not deleted. When I click the delete link, the page simply refreshes with the same data fields.
|||

Something strange happened to the reply above. Here it is again...

Hey limno, thanks for the help. Unfortunately, it doesn't work. The record is not deleted. When I click the delete link, the page simply refreshes with the same data fields.

|||

Hi tobias,

Please check your Page_Load event handler to see if you are reloading the data source by calling the DataBind() method again. You need to check whether the request is a postback by using the IsPostBack property.

If DataBind() is called on every postback, the delete operation will have no effect.

Problem with defining 2 datasets against OLAP cube

Hello all,

I built a report (RS 2005) that includes several charts on the same layout.
Every chart is based on a different dataset, and each dataset is based on the same data source, an OLAP cube.
The datasets were created using the new query builder in RS 2005 (from an SSAS 2005 cube). In both of them, I have set the Quarter dimension as a filter and checked the Parameter checkbox to expose it as a parameter.

After creating the first dataset and defining the parameter, a new parameter was defined in the report. Now, after creating the second dataset, adding the same filter, and defining it as a parameter, I have found that no other parameter was defined for me; the first one was overwritten…

Is it a bug?

Thanks,

Liran

Hi Liran,

I don't think this is a bug. When you keep adding the same filter for various datasets, the datasets will point to the same filter, and you will have only one parameter if you select the check box for the filter in the filter pane.

I implement the same kind of reports and they work fine.

Sincerely,

--Amde

|||

Yes, but what if I want a different parameter defined for the second dataset?
I have tried doing it manually: I defined a new dataset similar to the one already created for me, and switched out of graphic design mode in the second dataset to reference the second parameter in the MDX.
I got an error saying no such parameter is defined…

Problem with default schema

Hi,
under SQL Server 2005 I created a user with:
USE MYDB
CREATE USER TEST FOR LOGIN TEST WITH DEFAULT_SCHEMA = TEST
ALTER LOGIN TEST WITH DEFAULT_DATABASE = MYDB
and a schema:
CREATE SCHEMA TEST AUTHORIZATION TEST;
Connecting to the database using SQL Native Client (2005.9.1399.0) and ODBC
with the user "TEST" leads to the default schema "DBO".
That means selecting a table using the unqualified name:
SELECT * FROM MYTABLE
leads to an "object not found" error if the table "MYTABLE" is part of the
schema "TEST".
After executing the statement
EXECUTE AS USER='TEST'
the statement
SELECT * FROM MYTABLE
succeeds.
Why is the default schema not initially set to "TEST" after connecting?
Greetings
Michael|||Hi, Michael,
I understand that you would like to know why the default schema was not
initially set to "TEST" after you log on your SQL Server with the login
TEST.
If I have misunderstood, please let me know.
Unfortunately per my test in SQL Server Management Studio, I could not
reproduce your issue. My test process was as follows:
1. Logged on my SQL Server instance with a sysadmin account and created a
login named "globaltest" with default master database;
2. Executed the statement:
USE TestDB
CREATE USER TEST FOR LOGIN globaltest WITH DEFAULT_SCHEMA = TEST
ALTER LOGIN globaltest WITH DEFAULT_DATABASE = TestDB
3. Executed the statement:
CREATE SCHEMA TEST AUTHORIZATION TEST;
4. Changed a table dbo.T1 to TEST.T1;
5. Closed the current connection and connected to the SQL Server instance
with the login 'globaltest';
6. Executed the query:
SELECT * FROM T1;
Everything worked fine. Are there some differences between our tests?
I recommend that you run "SELECT CURRENT_USER" to check if the user TEST is
the current user after you log on your SQL Server with the TEST login. You
may also perform a test according to my steps to see if the issue occurs.
If you have any other questions or concerns, please feel free to let me
know.
Have a good day!
Charles Wang
Microsoft Online Community Support
|||underprocessable|||Hi, Michael,
Thanks for posting your script file here.
I reproduced your issue on my side. After carefully checking the script, I
found that the issue was caused by the sysadmin role assigned to the login.
If a login has the sysadmin server role, it is mapped to the dbo user
automatically.
That is why you got dbo when you executed the statement "SELECT
CURRENT_USER".
If the sysadmin role is removed, the issue will not occur.
Please refer to:
Database Owner (dbo)
http://msdn2.microsoft.com/en-us/library/aa905208(sql.80).aspx
Hope this helps. Please feel free to let me know if you have any other
questions or concerns.
Have a good day!
Charles Wang
Microsoft Online Community Support
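A quick way to see this effect from the problem connection is to check what user and default schema the session actually resolves to. This is a generic check, not tied to Michael's script:

-- Run while connected as the TEST login.
SELECT SUSER_SNAME()                AS login_name,
       CURRENT_USER                 AS database_user,   -- comes back as dbo for a sysadmin login
       SCHEMA_NAME()                AS default_schema,  -- the schema unqualified names resolve to
       IS_SRVROLEMEMBER('sysadmin') AS is_sysadmin      -- 1 means DEFAULT_SCHEMA is effectively ignored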
|||Hi Charles,
many thanks for your help. In the execution environment it's not required
that the login be assigned to the sysadmin role,
so I'll change the script!
Greetings
Michael|||Hi, Michael,
Thanks for your update and response.
I am very glad to hear that the issue has been resolved. Please feel free
to let me know if you have any other questions or concerns.
Have a good day!
Charles Wang
Microsoft Online Community Support

Problem with default schema

Hi

I have a user database, say Solumina. There is a login defined on the instance of SQL 2005 Server, say XYZ. There is also a user XYZ defined in the database, mapped to the login 'XYZ', with the default schema 'sfmfg' (assume that the 'sfmfg' schema already exists).

My question: when I try to create a table with the following syntax, it gets created in the dbo schema instead of the sfmfg schema (I am using the sfmfg login credentials).

create table t1(id int)

Can someone please help out?

Regards

Imtiaz

Is sfmfg a member of the db_owner role? It is always safer to use two-part names for your objects so that you know explicitly which schema an object belongs to.
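To illustrate the suggestion with the names from the question (the db_owner check is just one way to see how the user is mapped):

-- Create the table with an explicit two-part name so the target schema is never ambiguous.
CREATE TABLE sfmfg.t1 (id int)

-- Check how the current connection is mapped inside the database.
SELECT CURRENT_USER AS database_user, SCHEMA_NAME() AS default_schema, IS_MEMBER('db_owner') AS is_db_owner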

problem with default path for osql

Hi!
I have the following problem:
When I run osql from a cmd prompt it runs fine; however, if I run
osql from another osql session by using xp_cmdshell 'osql ...', it cannot
find the osql file.
I've checked that both the PATH variable and the registry key
HKEY_LOCAL_MACHINE\Software...\80\Tools\ClientSetup\SqlPath are set to the proper
value. What is missing?
I've seen this on machines that didn't reboot since installation. I worked around it by checking
registry entries for SQL Server and building the path based on that.
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
Blog: http://solidqualitylearning.com/blogs/tibor/
"kimi" <kimi@.discussions.microsoft.com> wrote in message
news:84B0A2D8-17CA-449A-A012-B939B672465A@.microsoft.com...
> Hi!
> I have a following problem:
> When I run osql from cmd prompt it is running fine, however if I have to run
> osql from another osql session by using xp_cmdshell 'osql ...'. It can not
> find the osql file.
> I 've checked both PATH variable as well as registry
> HKEY_LOCAL_MACHINE\Software...\80\Tools\ClientSetup\SqlPath are set to proper
> value. What is missing
|||Are you suggesting that I should hardcode the path to osql if the reboot does not help?
"Tibor Karaszi" wrote:

> I've seen this on machines that didn't reboot since installation. I worked around it by checking
> registry entries for SQL server and build the path based on that.
> --
> Tibor Karaszi, SQL Server MVP
> http://www.karaszi.com/sqlserver/default.asp
> http://www.solidqualitylearning.com/
> Blog: http://solidqualitylearning.com/blogs/tibor/
>
> "kimi" <kimi@.discussions.microsoft.com> wrote in message
> news:84B0A2D8-17CA-449A-A012-B939B672465A@.microsoft.com...
>
|||Reboot did not help. Any ideas?
"kimi" wrote:
> Are you suggesting that I should hardcode path to osql if reboot does not help?
> "Tibor Karaszi" wrote:

problem with DEFAULT (getdate())

Hi all

I create a table and set the default value getdate(). But after inserting a record the date displays ‘1900-01-01 00:00:00’.

For example:

CREATE TABLE [tblTemp1] (

ItemUserDate [smalldatetime] NOT NULL CONSTRAINT [DF_tblTemp1_ItemUserDate] DEFAULT (getdate())

)

GO

INSERT INTO dbo.tblTemp1 values(0)

select ItemUserDate from dbo.tblTemp1   -- returns ‘1900-01-01 00:00:00’

select getdate()   -- returns ’2006-07-20 15:53:27.820’; I need this answer

Hi,

try this:

CREATE TABLE [tblTemp1] (

part1 char,

ItemUserDate [smalldatetime] NOT NULL CONSTRAINT [DF_tblTemp1_ItemUserDate] DEFAULT (getdate())

)

GO

INSERT INTO dbo.tblTemp1 (part1) VALUES ('a')

And then do your select. You'll receive the correct answer.

You get that result because you're putting 0 into your default column, so the default has no effect anymore...

Greeting.

|||

Hi

To make the INSERT work for a table structure with a single datetime column you can use the DEFAULT VALUES option: INSERT INTO dbo.tblTemp1 DEFAULT VALUES

|||

thanks Stefan Haeck

|||thanks NB2006
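Putting the two answers together, a minimal end-to-end sketch (column names taken from the thread):
CREATE TABLE dbo.tblTemp1 (
    part1 char(1) NULL,
    ItemUserDate smalldatetime NOT NULL
        CONSTRAINT DF_tblTemp1_ItemUserDate DEFAULT (getdate())
)
GO
-- List only the other columns so ItemUserDate picks up getdate()
INSERT INTO dbo.tblTemp1 (part1) VALUES ('a')
-- Or let every column use its default
INSERT INTO dbo.tblTemp1 DEFAULT VALUES
-- Explicitly inserting 0 into ItemUserDate converts to 1900-01-01, which was the original symptom
SELECT part1, ItemUserDate FROM dbo.tblTemp1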

Problem with decimal point

I have got a txt file and I am trying to import this file into the database. For that I use the SQL Server Import and Export Wizard, with Locale Czech and CodePage 1250. In the text file the column has this format: 18152.65 - a number with a decimal point. When I use the BCP utility for importing the data, I use datatype decimal(10,2) and everything is OK. But when I try to use the Import and Export Wizard and choose for that column the datatype numeric (DT_NUMERIC - precision 10, scale 2), the import doesn't start and the following error occurs:

- Executing (Error)

Messages

Error 0xc02020a1: Data Flow Task: Data conversion failed. The data conversion for column "PRED_CEL " returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
(SQL Server Import and Export Wizard)

Is it always a precision of 10 and a scale of 2? You may have some inconsistent data. There are two things you could do.

1) Increase the precision and scale.

or

2) Create a custom SSIS package, and redirect the error rows to see where the data inconsistency is. The Import/Export Wizard is great for straightforward tasks, but you'll find that with the smallest blip you'll need to create a custom package.

|||The import is very picky about number formats when importing from flat files. It will NOT import zero-padded numbers into a decimal field, even though it will implicitly convert them internally.

For example:
3.99 - OK
003.99 - FAILS
+003.99 - OK
-003.00 - OK

If you import "003.99" into a varchar(12) and then update a decimal(10,2), with the value of teh varchar, it will work fine.

Problem with debugging SSIS 2005 (32bit) component under Windows Vista ultimate

Hi,

I have a problem when I want to debug an SSIS component under Visual Studio 2005.

I'm using SQL 2005 with Service Pack 2 and I have already installed the patch for Visual Studio under Vista.

When I put a breakpoint, for example in the method "public override void PreExecute()" or in the other method "public override void ProcessInput(int inputID, PipelineBuffer buffer)", and I execute the program,

the debugger always skips the breakpoint.

However, when I put a breakpoint in the Validate method "public override DTSValidationStatus Validate()", it works.

When I execute the package itself, it works fine.

I use these debug parameters:

Start external program :

C:\Program Files\Microsoft SQL Server\90\DTS\Binn\DTExec.exe

Command line arguments

/FILE "D:\Projects\Integration Services Project1\Integration Services Project1\TestXmlParserPipe.dtsx" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING EWCDI

I execute visual studio 2005 as Administrator

Please, can someone help me?

It must be one of those trivial problems. Are you sure the execution goes through those two methods? Try to check using some kind of checkpoint logging (e.g. a MessageBox).

A few additional troubleshooting tips:

- Make sure you are building the debug version of your component.

- Check if the proper module gets loaded.

- Double-check the function signatures.

HTH.

|||

Hi,

thanks for your answer.

I'm sure I was building the debug version.

I noticed that if I put a message box in the script component and run the package,

the message box appears. I then go to my SSIS component code in Visual Studio 2005 and put a breakpoint in the

PreExecute() method (for example); after this I attach to the process DtsDebugHost.exe (this process is only visible in the process list while the SSIS package is running). I come back to the message box, click OK, and finally it works: I can see the yellow line highlight the method.

What can I do to make the debugger work properly under Vista Ultimate? At work we have Windows XP Professional and I don't have to go through a complicated procedure each time I want to debug my custom component.

So it would be nice if someone knows the solution for that.

Thanks in advance.

|||

Can you confirm one thing? Are you debugging your custom component using DTExec or BIDS?

Thanks,

Bob

|||

Hi,

I debug my custom component using DTExec:

On Start action \start external program I use :

C:\Program Files\Microsoft SQL Server\90\DTS\Binn\DTExec.exe

I'm using the following command line arguments on start option:

/FILE "..\TestXmlParserPipe.dtsx" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING EWCDI

Please help with Windows Vista Ultimate ?

|||

Always start investigating debugging/breakpoint problems by checking whether VS finds the symbols for your assembly. Go to the Modules window, find your assembly, and check if the symbols are loaded. If not, try to load the symbols manually.

Do you install the custom component (run gacutil /if ...) as part of a VS post-build step?

What often happens is that if you don't do this, VS builds a new version of the component before starting the debugging session; but the binary used by DTEXEC comes from the GAC and is different from the new version built by VS, so VS can't match the symbols and can't use the breakpoints.

Problem with deadlocks

Hi list, I am very new to SQL. I have a developer that is making a very
complex query over a SQL 2000 + SP3 database, with a lot of union queries.
He receives this error:
Transaction (Process ID 97) was deadlocked on lock | communication buffer
resources with another process and has been chosen as the deadlock victim.
Rerun the transaction.
I don't know how to proceed with this; how can I try to discover the problem?
Could it be a resources problem?
Thanks for the advice...Answers in microsoft.public.sqlserver.server
--
--
Mike Epprecht, Microsoft SQL Server MVP
Zurich, Switzerland
MVP Program: http://www.microsoft.com/mvp
Blog: http://www.msmvps.com/epprecht/
"Tinchos" wrote:

> Hi list, iam very new to SQL. i have a developer that is making a very
> complex query over a SQL2000 + SP3 database, with a lots of union queries.
> He recieves this error:
> Transaction (Process ID 97) was deadlocked on lock | communication buffer
> resources with another process and has been chosen as the deadlock victim.
> Rerun the transaction.
> I dont know how to proceed with this, how can try to discover the problem?
> Could be a resources problem?
> Thanks for the advices...|||http://www.microsoft.com/communitie...80-e67d8812bd2d
AMB
"Tinchos" wrote:

> Hi list, iam very new to SQL. i have a developer that is making a very
> complex query over a SQL2000 + SP3 database, with a lots of union queries.
> He recieves this error:
> Transaction (Process ID 97) was deadlocked on lock | communication buffer
> resources with another process and has been chosen as the deadlock victim.
> Rerun the transaction.
> I dont know how to proceed with this, how can try to discover the problem?
> Could be a resources problem?
> Thanks for the advices...|||In your FROM clause, use WITH (NOLOCK) when you are selecting data.
Rick
"Tinchos" <Tinchos@.discussions.microsoft.com> wrote in message
news:B5CBFB12-BF8F-4BA8-B71A-8D9A893C66F6@.microsoft.com...
> Hi list, iam very new to SQL. i have a developer that is making a very
> complex query over a SQL2000 + SP3 database, with a lots of union queries.
> He recieves this error:
> Transaction (Process ID 97) was deadlocked on lock | communication buffer
> resources with another process and has been chosen as the deadlock victim.
> Rerun the transaction.
> I dont know how to proceed with this, how can try to discover the problem?
> Could be a resources problem?
> Thanks for the advices...
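For illustration, a minimal form of that hint (table and column names are placeholders); note that NOLOCK means dirty reads, so the query may see uncommitted or inconsistent data:
SELECT OrderID, OrderDate
FROM dbo.Orders WITH (NOLOCK)  -- read without taking shared locks
WHERE OrderDate >= '20060101'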

Problem with dead-locks

We are experiencing some problems with deadlocks in our system. We call
SQL Server 2000 from a .NET application layer.
The flow is
Create transaction object in .Net
FOR each table in a list (e.g. 11 tables in total)
DELETE FROM Table WHERE SomeID IN (x,y) // SomeID is foreign key
(done in stored procedure with no cursors)
// END FOR
FOR each table in a list (same list as before, but in opposite sequence)
BULK INSERT MyTable ..
(SQL-statement created in .Net and sent to Sql svr)
// END FOR
transactionsobject.Commit();
We receive data from an external system once every week. The external
system cannot give us delta information, i.e. only new data, so they send
us their entire data set. This is the reason why we delete all data in
our system and insert it again.
The problem occurs at BULK INSERT, but oddly not always. The database in
question is offline for the rest of the system, so nothing takes place on
the server apart from this job. The transaction object is sent to the
server using Microsoft Practices Enterprise Library's data block to
enable us to roll back the transaction if one of the statements fails.
We are kind of blank with regard to how to solve this - do you guys have
an idea for a solution?
Thanks,
:o)
Jesper Stocholm
http://stocholm.dk
Is your kiosk on the net? See http://ekiosk.dk
Have you followed the basic recommendations on deadlocks in Books Online?
Start here, if you haven't:
http://msdn.microsoft.com/library/d...a_3hdf.asp
Tracing deadlocks might also be of much help:
http://msdn.microsoft.com/library/d...tabse_5xrn.asp
ML|||I have two suggestions. (1) change the mechanism so that the deletes are
committed prior to starting the bulk inserts, or (2) generate a temporary
stored procedure or dynamic SQL to execute the entire job in a single batch,
complete with error handling, etc.
You could save off the rows you're deleting into a separate database, and
reinsert them if a failure occurs during the bulk inserts.
My preference is to migrate transaction processing to the data tier whenever
possible. It's a lot easier to modify a stored procedure than to recompile
and redeploy a middle-tier component or God forbid, a client application.
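A rough T-SQL sketch of suggestion (1), with placeholder table, key, and file names: commit the deletes in their own transaction (keeping a copy of the removed rows) and run the bulk inserts afterwards, so a failure there no longer has to roll the deletes back.
-- Error handling is simplified; on SQL Server 2000 a failed BULK INSERT can abort the
-- whole batch, so the restore step may need to run from the caller instead
BEGIN TRAN
    SELECT * INTO dbo.MyTable_backup FROM dbo.MyTable WHERE SomeID IN (1, 2)
    DELETE FROM dbo.MyTable WHERE SomeID IN (1, 2)
COMMIT TRAN  -- the deletes are durable and their locks are released here
BULK INSERT dbo.MyTable FROM 'C:\feed\MyTable.dat' WITH (FIELDTERMINATOR = ';')
IF @@ERROR <> 0
    INSERT INTO dbo.MyTable SELECT * FROM dbo.MyTable_backup  -- reinsert the saved rows on failure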
"Jesper Stocholm" <j@.stocholm.invalid> wrote in message
news:Xns97105AA34EC40stocholmdk@.207.46.248.16...
> We are experiencing some problems with deadlocks in our system. We call
> Sql server 2k from a .Net applicationlayer.
> The flow is
> Create transaction object in .Net
> FOR each table in a list (e.g. 11 tables in total)
> DELETE FROM Table WHERE SomeID IN (x,y) // SomeID is foreign key
> (done in stored procedure with no cursors)
> // END FOR
> FOR each table in a list (same list af before, but in opposite sequence)
> BULK INSERT MyTable ..
> (SQL-statement created in .Net and sent to Sql svr)
> // END FOR
> transactionsobject.Commit();
> We receive data from an external system once every w. The external
> system cannot give os delta-information, e.g. only new data, s they send
> us their entire data. This is the reason for why we delete all data in
> our system and insert them again.
> The problem occurs at BULK INSERT, but odly not always. The database in
> question is offline for the rest of the system, so nothing takes place on
> the server apart from this job. The transaction object is sent to the
> server using Microsoft Practices Enterprise Library's data-block to
> enable us to roll-back the transaction if one of the statements fail.
> We are kind of blank with regards to how to solve this - do you guys have
> an idea to a solution?
> Thanks,
> :o)
> --
> Jesper Stocholm
> http://stocholm.dk
> Is your kiosk on the net? See http://ekiosk.dk|||"Brian Selzer" <brian@.selzer-software.com> wrote in
news:epOZG6w6FHA.3416@.TK2MSFTNGP15.phx.gbl:

> I have two suggestions. (1) change the mechanism so that the deletes
> are committed prior to starting the bulk inserts, or (2) generate a
> temporary stored procedure or dynamic SQL to execute the entire job in
> a single batch, complete with error handling, etc.
I will look into this, thanks.

> My preference is to migrate transaction processing to the data tier
> whenever possible. It's a lot easier to modify a stored procedure
> than to recompile and redeploy a middle-tier component or God forbid,
> a client application.
We have since my first post moved the SQL to the database in a stored
procedure that creates some dynamic sql and executes it, so it is no longer
in the application layer. The problem, however, did not go away.
Also, we cannot do it all in the database layer. The entire process is an
ETL process, where the data is (heavily) transformed before being loaded
into the (other) database. It is not possible to put some of the key parts
in the database-layer - for various purposes we need to do the
transformation in the application layer.
I talked to one of our DBAs, and he suggested that another benefit of
moving the SQL to a stored procedure would be that it would complete the
current batch after each "GO" statement at the end of executing the stored
procedure. We moved the SQL code and the problem seems to have disappeared
... for now at least. We are keeping our fingers crossed. If this works, we
are happy ... a bit nervous that we didn't find the cause of the error ...
but happy, nonetheless.
:o)
Jesper Stocholm
http://stocholm.dk