DebugDiag dump analysis help

on 23.01.2008 02:47:06 by Strago

Hi there, I am analyzing a periodic IIS crash on one of my servers.
This server hosts a document management product and has an Oracle client
installation, among other things. During the last outage, I was able to
take a hang dump with the Microsoft Debug Diagnostic tool, and have pasted a
summary below. All I can tell is that it appears to be caused by a database
call, as the hanging process looks to involve the Oracle client.

Is it possible to glean any additional info from this dump? Let me know if
you need the full log (it exceeded the posting limit).

Are there any additional tools I can use to get more precise info, such as
which SQL statement is hanging?

Thanks in advance...



Analysis Summary
Type Description Recommendation
Warning The following threads in
IIS_COM+__Date__01_04_2008__Time_08_39_09AM__w3wp.exe__DefaultAppPool__PID__5848__637__Manual
Dump.dmp are making a COM call to a multi-threaded apartment (MTA) in another
COM server process via the local RpcSs service



( 16 17 18 19 20 21 22 24 25 26 27 28 29 30 )



42.42% of threads blocked


The thread(s) in question is/are waiting on a CoCreateInstance call to
return. Further analysis of the process hosting the particular component (not
the RpcSs service) should be performed to determine why these calls have not
completed. More information for the particular component(s) that the
thread(s) in question is/are attempting to instantiate can be found in the
thread detail for each thread listed in the Description pane to the left.
Warning Detected possible blocking or leaked critical section at
0x023f4c58 owned by thread 19 in
IIS_COM+__Date__01_04_2008__Time_08_39_06AM__dllhost.exe__RDCMS_XMLServer__PID__5192__121__Manual Dump.dmp



Impact of this lock



1 critical section indirectly blocked



(Critical Sections ntdll!LdrpLoaderLock)



19.05% of threads blocked



(Threads 1 6 10 20)



The following functions are trying to enter this critical section

oracore10!sltsmna+f



The following module(s) are involved with this critical section

d:\oracle\product\10.2.0\client_1\BIN\oracore10.dll from Oracle Corporation
The following vendors were identified for follow up based on root cause
analysis



Oracle Corporation

Please follow up with the vendors identified above. Consider the following
approach to determine the root cause of this critical section problem:
Enable 'lock checks' in Application Verifier
Download Application Verifier from the following URL:

Microsoft Application Verifier
Enable 'lock checks' for this process by running the following command:

Appverif.exe -enable locks -for dllhost.exe
See the following document for more information on Application Verifier:

Testing Applications with AppVerifier
Use a DebugDiag crash rule to monitor the application for exceptions
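Concretely, the Application Verifier steps recommended above look roughly like the following (a sketch only; the flag syntax can differ between AppVerifier versions, so check `appverif /?` on the server first):

```
rem Enable lock checking for the COM+ host process named in the report
appverif -enable locks -for dllhost.exe

rem ...reproduce the hang and capture a fresh dump with DebugDiag...

rem Turn verification back off when finished, or dllhost.exe stays instrumented
appverif -disable * -for dllhost.exe
```

Note that verifier settings take effect the next time the target process starts, so the dllhost.exe instances will need to be recycled after enabling.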

Information
DebugDiag did not detect any known problems in the following dump files:


IIS_COM+__Date__01_04_2008__Time_08_39_03AM__inetinfo.exe__PID__1360__902__Manual Dump.dmp

IIS_COM+__Date__01_04_2008__Time_08_39_00AM__dllhost.exe__System
Application__PID__280__934__Manual Dump.dmp



Warning 14 client connection(s) in
IIS_COM+__Date__01_04_2008__Time_08_39_09AM__w3wp.exe__DefaultAppPool__PID__5848__637__Manual
Dump.dmp have been executing a request for more than 90 seconds. Please see
the Client Connections section of this report for more detailed information
about the connection(s).

Re: DebugDiag dump analysis help

on 23.01.2008 12:47:09 by Ken Schaefer

Is the worker process actually crashing (terminating) or just appears to be
in a hung state (not responding to requests)?

From what you have posted below, it appears that the process is hung, and
that the problem lies either in the Oracle client components -or- in some
issue with the backend Oracle server.
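If it does turn out to be the backend, you can check the Oracle side directly. A rough sketch (this assumes local SYSDBA access via OS authentication and the 10g v$ views; run it on a machine with SQL*Plus installed):

```
rem Show the SQL text currently being executed by active user sessions
echo select s.sid, s.status, q.sql_text from v$session s, v$sql q where s.sql_id = q.sql_id and s.status = 'ACTIVE' and s.username is not null; | sqlplus -s / as sysdba
```

A long-running statement that lines up with the hang window would point at the server rather than the client.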

Cheers
Ken

"Strago" wrote in message
news:6D744529-E303-4399-9BC6-46D20A3265FB@microsoft.com...
> Hi there, I am analyzing a periodic IIS crash I receive on one of my
> servers.
> [...]

Re: DebugDiag dump analysis help

on 23.01.2008 16:10:04 by Strago

Only a hung state. I suppose I will try to look at this more from the DB
side of things. Thanks.

"Ken Schaefer" wrote:

> Is the worker process actually crashing (terminating) or just appears to be
> in a hung state (not responding to requests)?
>
> From what you have posted below, it would appear that the process is hung,
> and that the problem appears to be either in the Oracle client
> components -or- some issue with the backend Oracle server.
>
> Cheers
> Ken
>
> "Strago" wrote in message
> news:6D744529-E303-4399-9BC6-46D20A3265FB@microsoft.com...
> > [...]

Re: DebugDiag dump analysis help

on 23.01.2008 19:20:58 by patfilot

Can you post the threads?


Pat

"Strago" wrote in message
news:61BF349C-56E4-49B5-B702-0E21771CDD61@microsoft.com...
> Only a hung state. I suppose I will try and look at this more from the DB
> side of things. Thanks.
>
> "Ken Schaefer" wrote:
>
>> [...]

Re: DebugDiag dump analysis help

on 24.01.2008 15:04:03 by Strago

Listed below are threads 1, 6, 10, and 20. I have also pasted thread 16,
which is identical to all of the other threads in that first group of the
debug report.



Thread 1 - System ID 2608
Entry point ntdll!RtlpTimerThread
Create time 1/4/2008 7:50:16 AM
Time spent in user mode 0 Days 00:00:00.00
Time spent in kernel mode 0 Days 00:00:00.00


Function Source
ntdll!KiFastSystemCallRet
ntdll!NtDelayExecution+c
ntdll!RtlpTimerThread+47
kernel32!BaseThreadStart+34


Thread 6 - System ID 4720
Entry point w3tp!THREAD_MANAGER::ThreadManagerThread
Create time 1/4/2008 7:50:16 AM
Time spent in user mode 0 Days 00:00:00.046
Time spent in kernel mode 0 Days 00:00:00.00


Function Source
ntdll!KiFastSystemCallRet
ntdll!ZwRemoveIoCompletion+c
kernel32!GetQueuedCompletionStatus+29
w3tp!THREAD_POOL_DATA::ThreadPoolThread+33
w3tp!THREAD_POOL_DATA::ThreadPoolThread+24
w3tp!THREAD_MANAGER::ThreadManagerThread+39
kernel32!BaseThreadStart+34


Thread 10 - System ID 5924
Entry point ntdll!RtlpIOWorkerThread
Create time 1/4/2008 7:50:16 AM
Time spent in user mode 0 Days 00:00:00.00
Time spent in kernel mode 0 Days 00:00:00.00


Function Source
ntdll!KiFastSystemCallRet
ntdll!NtDelayExecution+c
ntdll!RtlpIOWorkerThread+3f
kernel32!BaseThreadStart+34

Thread 20 - System ID 5224
Entry point msvcrt!_endthreadex+2f
Create time 1/4/2008 8:06:21 AM
Time spent in user mode 0 Days 00:00:00.015
Time spent in kernel mode 0 Days 00:00:00.00


This thread is making a COM call to multi-threaded apartment (MTA) in
another COM server process via the local RpcSs service

Function Source
ntdll!KiFastSystemCallRet
ntdll!NtWaitForMultipleObjects+c
kernel32!WaitForMultipleObjectsEx+11a
user32!RealMsgWaitForMultipleObjectsEx+141
ole32!CCliModalLoop::BlockFn+7d
ole32!ModalLoop+5b
ole32!ThreadSendReceive+e3
ole32!CRpcChannelBuffer::SwitchAptAndDispatchCall+112
ole32!CRpcChannelBuffer::SendReceive2+d3
ole32!CCliModalLoop::SendReceive+1e
ole32!CAptRpcChnl::SendReceive+6f
ole32!CCtxComChnl::SendReceive+1a9
rpcrt4!NdrProxySendReceive+43
rpcrt4!NdrClientCall2+206
rpcrt4!ObjectStublessClient+8b
rpcrt4!ObjectStubless+f
ole32!CRpcResolver::CreateInstance+14e
ole32!CClientContextActivator::CreateInstance+fa
ole32!ActivationPropertiesIn::DelegateCreateInstance+f7
ole32!ICoCreateInstanceEx+3f8
ole32!CComActivator::DoCreateInstance+6a
ole32!CoCreateInstanceEx+23
ole32!CoCreateInstance+3c
msvbvm60!CEcProjTypeComp::LookupMember+3ab
RDCMSAsp!DllCanUnloadNow+2aff1
oleaut32!DispCallFunc+16a
msvbvm60!VBStrToLong+cf
msvbvm60!rtFindFirstFile+185
vbscript!CatchIDispatchInvoke+46
vbscript!IDispatchInvoke2+af
vbscript!IDispatchInvoke+59
vbscript!InvokeDispatch+13a
vbscript!InvokeByName+42
vbscript!CScriptRuntime::Run+2587
vbscript!CScriptEntryPoint::Call+5c
vbscript!CSession::Execute+b4
vbscript!COleScript::ExecutePendingScripts+13e
vbscript!COleScript::SetScriptState+150
asp!CActiveScriptEngine::TryCall+19
asp!CActiveScriptEngine::Call+31
asp!CallScriptFunctionOfEngine+5b
asp!ExecuteRequest+17e
asp!Execute+249
asp!CHitObj::ViperAsyncCallback+3f3
asp!CViperAsyncRequest::OnCall+92
comsvcs!CSTAActivityWork::STAActivityWorkHelper+32
ole32!EnterForCallback+c4
ole32!SwitchForCallback+1a3
ole32!PerformCallback+54
ole32!CObjectContext::InternalContextCallback+159
ole32!CObjectContext::DoCallback+1c
comsvcs!CSTAActivityWork::DoWork+12d
comsvcs!CSTAThread::DoWork+18
comsvcs!CSTAThread::ProcessQueueWork+37
comsvcs!CSTAThread::WorkerLoop+190
msvcrt!_endthreadex+a3
kernel32!BaseThreadStart+34


This thread is calling CoCreateInstance to create a component with CLSID =
"{46C972B8-D393-43C6-92D3-8F24AA9B0EDB}"


Thread 16 - System ID 5620
Entry point msvcrt!_endthreadex+2f
Create time 1/4/2008 7:50:16 AM
Time spent in user mode 0 Days 00:00:00.125
Time spent in kernel mode 0 Days 00:00:00.031


This thread is making a COM call to multi-threaded apartment (MTA) in
another COM server process via the local RpcSs service

Function Source
ntdll!KiFastSystemCallRet
ntdll!NtWaitForMultipleObjects+c
kernel32!WaitForMultipleObjectsEx+11a
user32!RealMsgWaitForMultipleObjectsEx+141
ole32!CCliModalLoop::BlockFn+7d
ole32!ModalLoop+5b
ole32!ThreadSendReceive+e3
ole32!CRpcChannelBuffer::SwitchAptAndDispatchCall+112
ole32!CRpcChannelBuffer::SendReceive2+d3
ole32!CCliModalLoop::SendReceive+1e
ole32!CAptRpcChnl::SendReceive+6f
ole32!CCtxComChnl::SendReceive+1a9
rpcrt4!NdrProxySendReceive+43
rpcrt4!NdrClientCall2+206
rpcrt4!ObjectStublessClient+8b
rpcrt4!ObjectStubless+f
ole32!CRpcResolver::CreateInstance+14e
ole32!CClientContextActivator::CreateInstance+fa
ole32!ActivationPropertiesIn::DelegateCreateInstance+f7
ole32!ICoCreateInstanceEx+3f8
ole32!CComActivator::DoCreateInstance+6a
ole32!CoCreateInstanceEx+23
ole32!CoCreateInstance+3c
msvbvm60!CEcProjTypeComp::LookupMember+3ab
RDCMSAsp!DllCanUnloadNow+2aff1
oleaut32!DispCallFunc+16a
msvbvm60!VBStrToLong+cf
msvbvm60!rtFindFirstFile+185
vbscript!CatchIDispatchInvoke+46
vbscript!IDispatchInvoke2+af
vbscript!IDispatchInvoke+59
vbscript!InvokeDispatch+13a
vbscript!InvokeByName+42
vbscript!CScriptRuntime::Run+2587
vbscript!CScriptEntryPoint::Call+5c
vbscript!CSession::Execute+b4
vbscript!COleScript::ExecutePendingScripts+13e
vbscript!COleScript::SetScriptState+150
asp!CActiveScriptEngine::TryCall+19
asp!CActiveScriptEngine::Call+31
asp!CallScriptFunctionOfEngine+5b
asp!ExecuteRequest+17e
asp!Execute+249
asp!CHitObj::ViperAsyncCallback+3f3
asp!CViperAsyncRequest::OnCall+92
comsvcs!CSTAActivityWork::STAActivityWorkHelper+32
ole32!EnterForCallback+c4
ole32!SwitchForCallback+1a3
ole32!PerformCallback+54
ole32!CObjectContext::InternalContextCallback+159
ole32!CObjectContext::DoCallback+1c
comsvcs!CSTAActivityWork::DoWork+12d
comsvcs!CSTAThread::DoWork+18
comsvcs!CSTAThread::ProcessQueueWork+37
comsvcs!CSTAThread::WorkerLoop+190
msvcrt!_endthreadex+a3
kernel32!BaseThreadStart+34


This thread is calling CoCreateInstance to create a component with CLSID =
"{46C972B8-D393-43C6-92D3-8F24AA9B0EDB}"

Re: DebugDiag dump analysis help

on 24.01.2008 17:16:01 by patfilot

Threads 16 and 20 are the only ones of interest. What is happening is that
RDCMSAsp.dll (a VB DLL that you are missing symbols for) is making a
cross-apartment/cross-process call and is waiting for a response from the
remote end.

If you can get the symbols for RDCMSAsp, you can find the function where
CreateObject() is being called and identify the culprit. Or you can look up
the CLSID GUID in the registry, which will tell you which DLL is not
returning.
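For the registry route, the dump already gives the GUID, so something like this (run on the affected server; the paths it prints will be site-specific) shows which COM server it maps to:

```
rem The InprocServer32 or LocalServer32 value under the CLSID names the
rem DLL or EXE that is failing to return
reg query "HKCR\CLSID\{46C972B8-D393-43C6-92D3-8F24AA9B0EDB}" /s
```

The ProgID subkey in the same output usually gives the human-readable name used in the CreateObject() call.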


Pat



"Strago" wrote in message
news:1183F836-8AA4-46EC-83C6-CC642AEB1123@microsoft.com...
> [...]

Re: DebugDiag dump analysis help

on 24.01.2008 20:05:02 by Strago

Ah, fascinating! OK, I found the DLL being called; both it and RDCMSAsp are
indeed part of the document management suite that is causing the problem. So
which is the culprit, the calling DLL or the remote one? Or is it not
possible to tell?

I am also not sure how to obtain symbols for this, nor how to set them up;
please forgive my inexperience. Is this something I would have to get from
the vendor of the document management app?


Thanks,
Jaime

Re: DebugDiag dump analysis help

on 25.01.2008 21:22:29 by patfilot

The calling DLL is trying to call the remote one and is waiting on a
response. The remote one is not coming back.


Pat

"Strago" wrote in message
news:8A08E1E4-E2B7-444A-8EA2-86FC83586DB8@microsoft.com...
> [...]