
The following code should (at least in my opinion) create 100 Tasks, which all wait in parallel (that's the point of concurrency, right? :D) and finish at almost the same time. I guess a Timer object is created internally for every Task.Delay.

public static async Task MainAsync() {

    var tasks = new List<Task>();
    for (var i = 0; i < 100; i++) {
        Func<Task> func = async () => {
            await Task.Delay(1000);
            Console.WriteLine("Instant");
        };
        tasks.Add(func());
    }
    await Task.WhenAll(tasks);
}

public static void Main(string[] args) {
    MainAsync().Wait();
}

But! When I run this on Mono I get very strange behavior:

  • The Tasks do not finish at the same time; there are huge gaps between them (roughly 500-600 ms)
  • In the console, Mono shows a lot of created threads:

Loaded assembly: /Users/xxxxx/Programming/xxxxx/xxxxxxxxxx/bin/Release/xxxxx.exe
Thread started: #2
Thread started: #3
Thread started: #4
Thread started: #5
Thread started: #6
Thread started: #7
Thread finished: #3 <-- Obviously the 1000 ms delay finished?
Thread finished: #2 <-- Obviously the 1000 ms delay finished?
Thread started: #8
Thread started: #9
Thread started: #10
Thread started: #11
Thread started: #12
Thread started: #13

... you get it.

Is this actually a bug? Or am I using the library incorrectly?

[EDIT] I tested a custom sleep method using a Timer:

    public static async Task MainAsync() {
        Console.WriteLine("Started");
        var tasks = new List<Task>();
        for (var i = 0; i < 100; i++) {
            Func<Task> func = async () => {
                await SleepFast(1000);
                Console.WriteLine("Instant");
            };
            tasks.Add(func());
        }
        await Task.WhenAll(tasks);
        Console.WriteLine("Ready");
    }

    public static Task SleepFast(int amount) {
        var source = new TaskCompletionSource<object>();
        new Timer(state => {
            var oldSrc = (TaskCompletionSource<object>)state;
            oldSrc.SetResult(null);
        }, source, amount, 0);
        return source.Task;
    }

This time, all the tasks completed at almost exactly the same time. So I think it's either a really bad implementation or a bug.
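
For anyone who wants to quantify "at the same time": a small measuring harness along these lines (the class and helper names here are purely illustrative, not part of my original test) records each completion against a shared Stopwatch, so you can print the spread between the first and the last completion:

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;

public static class DelaySpreadTest {

    public static async Task MainAsync() {
        var completions = new ConcurrentBag<long>();
        var watch = Stopwatch.StartNew();

        var tasks = new List<Task>();
        for (var i = 0; i < 100; i++) {
            tasks.Add(DelayAndRecord(watch, completions));
        }
        await Task.WhenAll(tasks);

        // With a well-behaved timer implementation the spread should be a few
        // milliseconds, not hundreds of milliseconds.
        Console.WriteLine("First completion: {0} ms", completions.Min());
        Console.WriteLine("Last completion:  {0} ms", completions.Max());
    }

    private static async Task DelayAndRecord(Stopwatch watch, ConcurrentBag<long> completions) {
        await Task.Delay(1000);          // swap in SleepFast(1000) to compare
        completions.Add(watch.ElapsedMilliseconds);
    }

    public static void Main(string[] args) {
        MainAsync().Wait();
    }
}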

[Edit2] Just FYI: I've now tested the original code (using Task.Delay) on .NET under Windows 8.1, and it ran as expected (1000 Tasks waiting for 1 second in parallel and finishing together).

So the answer is: Mono's implementation of (some) methods is not perfect. In general, Task.Delay does not start a thread, and even a large number of concurrent delays should not create multiple threads.

  • Task.Delay internally uses a timer, which internally uses the thread pool. As you're creating 100 tasks very quickly, I'm guessing you're running out of threads in the thread pool, so new threads are created automatically. (A sketch for testing this follows these comments.)
    – ken2k
    Commented Feb 19, 2014 at 10:47
  • So the timers are not implemented using a scalable technique? I was hoping for something very efficient. For example, node.js uses epoll (on Linux), I believe.
    – Kr0e
    Commented Feb 19, 2014 at 10:52
  • 500-600 ms between each task? Could you post the exact timings? Commented Feb 19, 2014 at 11:14
  • The Task.Delay operation itself won't create a new thread, but the callback method that executes your Console.WriteLine after the delay is over will certainly need to execute on a thread. If all delays are the same, all of these callbacks will occur roughly at the same time. Try randomizing the delay in your example and see if it uses fewer threads. Commented Feb 19, 2014 at 12:14
  • Have you reported this as a bug to the Mono project? I strongly disagree that this is a bad concept. "May create a thread" is the wrong wording; there's always an I/O completion thread, which runs the code after a hardware interrupt. In the case of node.js, it will just marshal the completion callback to node's main thread, where it will be picked up by the event loop. You can easily implement this model in .NET using a custom task scheduler: stackoverflow.com/q/20993007/1768303
    – noseratio
    Commented Feb 19, 2014 at 22:11
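
Following up on ken2k's thread-pool suggestion above: a quick experiment is to raise the thread pool's minimum thread counts before running the test and see whether the gaps disappear. A minimal sketch, assuming only the standard ThreadPool.GetMinThreads/SetMinThreads APIs (the class and method names are invented for illustration):

using System;
using System.Threading;

public static class ThreadPoolTweak {

    public static void RaiseMinimums(int minWorkers, int minIocp) {
        // Read the current minimums so they are never lowered by accident.
        int workers, iocp;
        ThreadPool.GetMinThreads(out workers, out iocp);
        Console.WriteLine("Current minimums: {0} workers, {1} IOCP", workers, iocp);

        // Raising the minimums lets the pool inject threads immediately instead
        // of ramping up slowly once it is saturated, which is one plausible
        // source of the 500-600 ms gaps reported in the question.
        ThreadPool.SetMinThreads(Math.Max(workers, minWorkers), Math.Max(iocp, minIocp));
    }
}

Calling ThreadPoolTweak.RaiseMinimums(100, 100) at the top of Main, before the loop that creates the tasks, should show whether slow thread-pool ramp-up is what spreads the completions out.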

2 Answers


On the desktop .NET Framework:

In short, there is a special VM thread which periodically checks the queue of timers and runs the timers' delegates on the thread-pool queue. Task.Delay does not create a new Thread, but it can still be heavy, and there are no guarantees about the order of execution or about hitting deadlines precisely. And as I understand it, cancelling a Task.Delay may end up as nothing more than removing an item from a collection, with no thread-pool work queued at all.

Task.Delay is scheduled as a DelayPromise by creating a new System.Threading.Timer. All timers are stored in an AppDomain-wide TimerQueue singleton. A native VM timer is used to call back into .NET to check whether any timers in the queue need to fire. Timer delegates are scheduled for execution via ThreadPool.UnsafeQueueUserWorkItem.
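
As a rough illustration of that mechanism (a simplified sketch, not the actual DelayPromise source), a delay can be approximated with a TaskCompletionSource plus a System.Threading.Timer; cancelling it then just tears the timer down without queuing any thread-pool work:

using System;
using System.Threading;
using System.Threading.Tasks;

public static class DelaySketch {

    // Simplified approximation of Task.Delay: no thread is created; the timer
    // callback is queued to the thread pool only when the due time elapses.
    public static Task Delay(int milliseconds, CancellationToken token = default(CancellationToken)) {
        var tcs = new TaskCompletionSource<object>();

        var timer = new Timer(_ => tcs.TrySetResult(null), null, milliseconds, Timeout.Infinite);

        if (token.CanBeCanceled) {
            // Cancellation disposes the timer and cancels the task; nothing is
            // ever queued to the thread pool for this delay.
            token.Register(() => {
                timer.Dispose();
                tcs.TrySetCanceled();
            });
        }

        // Keep the timer alive until the task completes, then clean it up.
        tcs.Task.ContinueWith(_ => timer.Dispose(), TaskContinuationOptions.ExecuteSynchronously);

        return tcs.Task;
    }
}

The real implementation routes this bookkeeping through the TimerQueue singleton described above rather than managing each Timer on its own.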

From a performance point of view, it seems better to cancel a delay that is no longer needed instead of letting it expire:

open System.Threading
open System.Threading.Tasks

// Letting every delay expire on its own: takes 0.8% CPU
while true do
  Thread.Sleep(10)
  Task.Delay(50) |> ignore

// Cancelling the pending delay before creating the next one: takes 0.4% CPU
let mutable a = new CancellationTokenSource()
while true do
  Thread.Sleep(10)
  a.Cancel()
  a.Dispose()
  a <- new CancellationTokenSource()
  let token = a.Token
  Task.Delay(50, token) |> ignore

The Task library is designed more for managing blocking tasks without blocking an entire workflow (task asynchrony, confusingly called "task parallelism" by Microsoft), and not for doing large blocks of concurrent computation (parallel execution).

The Task library uses a scheduler, which queues jobs that are ready for execution. When jobs run, they do so on thread-pool threads, and these are very limited in number. There is logic to expand the thread count, but unless you have hundreds of CPU cores it's going to stay a low number.

So, to answer the question: some of your tasks are queued up waiting for a thread from the pool, while the other delayed tasks have already been issued by the scheduler.

The scheduler and thread-pool logic can be changed at runtime, but if you are trying to get lots of computation done quickly, Task isn't the right tool for the job. If you want to deal with lots of slow resources (like disk, database, or network resources), Task can help keep an app responsive.
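
One way to see this in practice (an illustrative sketch, not part of the answer above) is to record which managed thread each continuation runs on. The delays themselves occupy no thread at all, and the continuations land on a handful of pool threads rather than one thread per task:

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public static class ThreadIdDemo {

    public static async Task MainAsync() {
        var continuationThreads = new ConcurrentDictionary<int, int>();

        var tasks = new List<Task>();
        for (var i = 0; i < 100; i++) {
            tasks.Add(DelayAndRecord(continuationThreads));
        }
        await Task.WhenAll(tasks);

        // Typically a small number (a handful of pool threads), not 100.
        Console.WriteLine("Distinct continuation threads: {0}", continuationThreads.Count);
    }

    private static async Task DelayAndRecord(ConcurrentDictionary<int, int> seen) {
        await Task.Delay(1000);
        seen.TryAdd(Thread.CurrentThread.ManagedThreadId, 0);
    }

    public static void Main(string[] args) {
        MainAsync().Wait();
    }
}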

If you just want to learn about Task try these:

  • So writing a high-performance server is not possible using the Task library? (Because in a server scenario you definitely have lots of small/fast tasks.)
    – Kr0e
    Commented Feb 20, 2014 at 12:41
  • More like: you can use the Task library to ensure your server is not being blocked by IO. Tasks won't solve performance problems, but they are a useful tool. They are more like 'green threads' or Erlang threads -- they won't make your hardware go faster, but they can help you use the available power better. As always with optimisation, measure first. (A small sketch follows these comments.) Commented Feb 20, 2014 at 13:30
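
To make the "not blocked by IO" point concrete, here is a small sketch (the names are invented and Task.Delay merely stands in for a real database or network call): while a handler is awaiting, no thread is tied up, so a small pool can serve many concurrent requests.

using System;
using System.Threading.Tasks;

public static class FakeServer {

    // Simulated request handler: the await releases the thread for the whole
    // "IO" wait, so 1000 concurrent requests do not need 1000 threads.
    public static async Task<string> HandleRequestAsync(int id) {
        await Task.Delay(200);           // stand-in for a database or network call
        return "response " + id;
    }

    public static async Task MainAsync() {
        var requests = new Task<string>[1000];
        for (var i = 0; i < requests.Length; i++) {
            requests[i] = HandleRequestAsync(i);
        }

        string[] responses = await Task.WhenAll(requests);
        Console.WriteLine("Handled {0} requests", responses.Length);
    }

    public static void Main(string[] args) {
        MainAsync().Wait();
    }
}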
