Threading Traps



We've seen the two main situations where it can be a good idea to use threading in your applications. However, there are some circumstances in which spawning a new thread would be a bad idea. Obviously, this isn't going to be a complete listing of inappropriate times to create new threads, but it is meant to give you an idea of what constitutes a bad threading decision. There are two main areas we'll look at here: the first is an instance where execution order is extremely important, and the second is a mistake seen quite often in code - creating new threads in a loop.

Execution Order Revisited

Recall the example do_something_thread.cs from earlier in the chapter, where we wrote some code demonstrating that execution jumps unpredictably from one thread to another. One thread might execute and show 10 lines in the console, then the next thread would show 15, and then control would return to the original thread to execute 8 more. A common mistake when deciding whether to use threads is to assume that you know exactly how much code will execute in a thread's given time slice.

Here's an example that demonstrates the problem. It looks as if the thread t1 will finish first because it is started first, but that assumption is a mistake. Create a console application called ExecutionOrder and set its startup object to Main. Build and run this example a few times - you'll get differing results:

using System;
using System.Threading;

namespace Chapter02
{
    public class ExecutionOrder
    {
        public static void Main()
        {
            Thread t1 = new Thread(new ThreadStart(Increment));
            Thread t2 = new Thread(new ThreadStart(Increment));
            t1.Name = "t1";
            t2.Name = "t2";

            // t1 is started first, but there is no guarantee it finishes first
            t1.Start();
            t2.Start();
        }

        public static void Increment()
        {
            for (int i = 1; i <= 1000; i++)
            {
                Console.WriteLine("{0}: {1}", Thread.CurrentThread.Name, i);
            }
            Console.WriteLine("{0} finished", Thread.CurrentThread.Name);
        }
    }
}
Sometimes t1 will finish, then t2 will execute some more code and finish. Other times t2 will finish completely and then t1 will execute to completion. The point is that you can't count on threads completing in the order they were started. Later in this book we will discuss how you can synchronize threads to execute in a specified order; it's important to note that this synchronization doesn't happen by default.

This isn't the only problem associated with execution order. The next piece of example code, ExecutionOrder2, shows that shared data can also be adversely affected by unsynchronized threads:

using System;
using System.Threading;

public class ExecutionOrder2
{
    // Shared counter - both threads increment it without synchronization
    public static int iIncr = 0;

    public static void Main()
    {
        Thread t1 = new Thread(new ThreadStart(Increment));
        Thread t2 = new Thread(new ThreadStart(Increment));
        t1.Name = "t1";
        t2.Name = "t2";
        t1.Start();
        t2.Start();
    }

    public static void Increment()
    {
        for (int i = 0; i < 1000; i++)
        {
            iIncr++;
        }
        WriteFinished(Thread.CurrentThread.Name);
    }

    public static void WriteFinished(string sName)
    {
        // The value printed here varies from run to run
        Console.WriteLine("{0} finished; iIncr = {1}", sName, iIncr);
    }
}

This class is very similar to ExecutionOrder. This time, however, we created a shared incrementing counter called iIncr. Each thread increments the variable before moving on to the WriteFinished() method. If you execute this application a few times, you will notice that the value of the counter printed at each point changes from run to run. These two examples should act as warnings that, by default, threads do not execute in the order you want. In cases like these you can use synchronization tactics such as the Join() method we discussed earlier; thread synchronization will be covered in more depth later in this book.
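As a concrete illustration of the Join() tactic mentioned above, here is a minimal sketch. The class name JoinedOrder and the loop counts are our own, not from the chapter; the point is only that joining t1 before starting t2 makes both the completion order and the final counter value deterministic:

```csharp
using System;
using System.Threading;

public class JoinedOrder
{
    private static int iIncr = 0;

    public static void Main()
    {
        Thread t1 = new Thread(new ThreadStart(Increment));
        Thread t2 = new Thread(new ThreadStart(Increment));

        t1.Start();
        t1.Join();   // block until t1 has completely finished
        t2.Start();
        t2.Join();   // block until t2 has completely finished

        // With the joins in place this always prints 2000
        Console.WriteLine("Final value: {0}", iIncr);
    }

    public static void Increment()
    {
        for (int i = 0; i < 1000; i++)
        {
            iIncr++;
        }
    }
}
```

The price of this determinism is that the two threads no longer run concurrently at all, which is why Join() is a tactic to apply selectively rather than a general fix.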

Threads in a Loop

One other common mistake, often made when someone first discovers the joys of threading, is to create and start threads inside a loop. The code example that follows demonstrates this; it is a pattern frequently implemented by programmers who are new to threading, typically by developers or system administrators who want to send notifications when an event occurs. The idea itself is not bad, but implementing it with a new thread on every loop iteration can cause many problems.

Please be aware that running this code may well disable your system. Don't run it unless you don't mind rebooting your machine to reclaim the resources the program will waste.

using System;
using System.Threading;
using System.Web.Mail;
using System.Collections;

// Delegate type for the method that actually sends one e-mail
public delegate void SendMail(string EmailTo, string EmailFrom,
    string EmailBody, string EmailSubject);

public class LoopingThreads
{
    // Proxy method: packages the parameters into a DoMail object and
    // returns a Thread bound to its parameterless Send() method
    public static Thread CreateEmail(SendMail oSendEmail,
        string EmailTo, string EmailFrom,
        string EmailBody, string EmailSubject)
    {
        DoMail oDoMail = new DoMail(oSendEmail,
            EmailTo, EmailFrom, EmailBody, EmailSubject);
        return new Thread(new ThreadStart(oDoMail.Send));
    }

    public static void SendAllEmail()
    {
        // Loops through the recipient list - shown in full below
    }
}

class Mailer
{
    // The real work: hand one message to SmtpMail
    public void SendMessage(string EmailTo, string EmailFrom,
        string EmailBody, string EmailSubject)
    {
        SmtpMail.Send(EmailFrom, EmailTo, EmailSubject, EmailBody);
    }
}

public class DoMail
{
    private SendMail oSendEmail;
    private string sTo, sFrom, sBody, sSubject;

    public DoMail(SendMail oSendEmail, string EmailTo,
        string EmailFrom, string EmailBody, string EmailSubject)
    {
        this.oSendEmail = oSendEmail;
        sTo = EmailTo;
        sFrom = EmailFrom;
        sBody = EmailBody;
        sSubject = EmailSubject;
    }

    // Parameterless entry point suitable for a ThreadStart delegate
    public void Send()
    {
        oSendEmail(sTo, sFrom, sBody, sSubject);
    }
}
The code may be a little more complex than you expected because it also demonstrates how to use a delegate and a small set of helper classes to start a thread with parameters. This is necessary because a ThreadStart delegate can only point to a method that takes no parameters. As such, it is up to the programmer to create a proxy method that packages the parameters for another method and returns a Thread object (we'll see more of this in later chapters). The calling method can then use the returned Thread reference to start execution.

Let's concentrate on the SendAllEmail method. This is where we loop through the ArrayList and send our parameters to the proxy method. We start a new thread for each and every e-mail we want to send:

public static void SendAllEmail()
{
    // Hypothetical recipient list - in a real application this
    // might come from a database
    ArrayList oEmails = new ArrayList();
    oEmails.Add("John@mail.com");
    oEmails.Add("Jane@mail.com");

    Mailer oMailer = new Mailer();
    SendMail oSendMail = new SendMail(oMailer.SendMessage);

    foreach (string sTo in oEmails)
    {
        // A brand new thread for every single e-mail - this is the trap
        Thread oThread = CreateEmail(oSendMail, sTo,
            "me@mail.com", "Hello!", "Greetings");
        oThread.Start();
    }
}

At first glance, this sounds like a good idea. Why not send each e-mail on another thread? Sending mail can take a long time, after all. This is true, but the problem is that we are now tying up the processor by switching between a large number of threads. By the time this process is done, the time slice allocated to each thread is spent mainly packing and unpacking thread local storage; very little time is spent executing the instructions in the thread. The system may even lock up completely, leaving poor John without any mail from us. What may make more sense is to create a single thread and execute the SendAllEmail method on that thread. Alternatively, you could use a thread pool with a fixed number of threads; when one thread in the pool finishes sending its e-mail, it becomes available to send the next one.
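To make the thread-pool alternative concrete, here is a minimal sketch using the framework's built-in ThreadPool class. The recipient list, the SendOneEmail method, and the Sleep-based wait at the end are illustrative placeholders, not the chapter's code:

```csharp
using System;
using System.Threading;
using System.Collections;

public class PooledMail
{
    public static void Main()
    {
        ArrayList oEmails = new ArrayList();
        oEmails.Add("John@mail.com");
        oEmails.Add("Jane@mail.com");

        foreach (string sTo in oEmails)
        {
            // The pool reuses a small, fixed set of worker threads
            // instead of creating one thread per message
            ThreadPool.QueueUserWorkItem(new WaitCallback(SendOneEmail), sTo);
        }

        // Crude wait so the console app doesn't exit before the pool
        // threads run; real code would signal completion instead
        Thread.Sleep(2000);
    }

    private static void SendOneEmail(object state)
    {
        // Placeholder for the real SmtpMail.Send call
        Console.WriteLine("Sending mail to {0}", (string)state);
    }
}
```

Because work items wait in the pool's queue until a worker is free, a burst of 10,000 e-mails queues 10,000 small work items rather than creating 10,000 threads.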

One common programming practice is to place work into a queue to be processed by a service. For instance, a bank may place an XML-based file in a network directory to be picked up by a service running on another server. The service scans the directory for new files and processes them one at a time; if more than one file arrives, it works through them one by one. In a typical environment, new files would be placed in this directory infrequently, so at first glance this might seem like a good time to start a new thread whenever a file is found. In the normal case that can work, but think about what would happen if the service that processes these files were stopped, or if a network problem prevented the service from accessing the directory for a long period of time. The files would pile up in the directory. When the service finally started again, or regained access to the directory, each backlogged file would spawn a new thread in the service. Anyone who has used this model can tell you that this situation can bring a server to its knees.

The file model is just one example. Another similar model may be to use Microsoft BizTalk Server or Microsoft Message Queue with a service that processes items in a queue. All of these implementations have the same basic structure. The actual implementation isn't the important thing to note here. The point to walk away with is that if your work is being placed into a queue and you feel that multithreading is the answer, you might want to consider using thread pooling.
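One way to sketch the queued-work pattern safely: a single dedicated worker thread drains a synchronized queue, so a backlog of five files (or five thousand) still costs only one thread. The class name, file names, and the Console placeholder for the real processing are invented for illustration:

```csharp
using System;
using System.Collections;
using System.Threading;

public class QueueWorker
{
    // Queue.Synchronized returns a thread-safe wrapper around the queue
    private static Queue oWork = Queue.Synchronized(new Queue());

    public static void Main()
    {
        // Simulate a backlog that piled up while the service was down...
        for (int i = 1; i <= 5; i++)
        {
            oWork.Enqueue("file" + i + ".xml");
        }

        // ...but one dedicated thread drains it item by item
        Thread oWorker = new Thread(new ThreadStart(ProcessQueue));
        oWorker.Start();
        oWorker.Join();
    }

    private static void ProcessQueue()
    {
        while (oWork.Count > 0)
        {
            string sFile = (string)oWork.Dequeue();
            // Placeholder for the real per-file processing
            Console.WriteLine("Processing {0}", sFile);
        }
    }
}
```

The size of the backlog now affects only how long the worker runs, not how many threads exist, which is exactly the property the file-drop scenario above is missing.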
